WO2015070243A1 - Methods and systems for auto-segmenting quantitatively derived disease objects in digital medical images - Google Patents


Info

Publication number
WO2015070243A1
Authority
WO
WIPO (PCT)
Prior art keywords
voxels
voxel
eroded
valued
objects
Prior art date
Application number
PCT/US2014/065073
Other languages
French (fr)
Inventor
Jeffrey Leal
Richard L. Wahl
Original Assignee
The Johns Hopkins University
Application filed by The Johns Hopkins University filed Critical The Johns Hopkins University
Publication of WO2015070243A1 publication Critical patent/WO2015070243A1/en


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • G06T7/0012Biomedical image inspection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10072Tomographic images
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30004Biomedical image processing
    • G06T2207/30096Tumor; Lesion

Definitions

  • the current patent application generally relates to medical image processing.
  • The typical development pathway for cancer therapeutic drugs includes an evolution from phase I to phase II and to phase III clinical trials.
  • in phase I trials, toxicity of the agent is typically assessed to determine what dose is appropriate for subsequent trials.
  • the statistical power of phase I drug trials is inadequate to assess antitumor efficacy.
  • in phase II trials, evidence of antitumor activity is obtained.
  • Phase II trials can be done in several ways. One approach is to examine tumor response rate versus a historical control population treated with an established drug. New drugs with a low response rate are typically not moved forward to advanced clinical testing under such a paradigm. In such trials, tumor response has nearly always been determined anatomically.
  • phase III trials are typically performed. Phase III trials are larger and typically have a control arm treated with a standard therapy. Not all phase III trials are successful, but all are costly.
  • a computer-implemented method for image processing comprising receiving an image comprising a plurality of image voxels representing a plurality of tissue regions of a subject; determining a first set of quantitatively derived objects from the image; for each of the quantitatively derived objects, eroding the object by removing a plurality of least contributory-valued edge voxels until either a single voxel is remaining in the object or the object is separated into multiple objects; and dilating resulting objects using eroded voxels to produce a second plurality of tissue regions of the subject.
  • a system for image processing comprising a memory storing an image comprising a plurality of image voxels representing a plurality of tissue regions of a subject; and a processor coupled to the memory, the processor being configured to: receive the image; determine a first set of quantitatively derived objects from the image; for each of the quantitatively derived objects, erode the object by removing a plurality of least contributory-valued edge voxels until either there is a single voxel remaining in the object or the object is separated into multiple objects; and dilate resulting objects using eroded voxels to produce a second plurality of tissue regions of the subject.
  • a non-transitory computer readable storage medium comprising instructions that if executed enables a computing system to receive an image comprising a plurality of image voxels representing a plurality of tissue regions of a subject; determine a first set of quantitatively derived objects from the image; for each of the quantitatively derived objects, erode the object by removing a plurality of least contributory-valued edge voxels until either there is a single voxel remaining in the object or the object is separated into multiple objects; and dilate resulting objects using eroded voxels to produce a second plurality of tissue regions of the subject.
  • FIG. 1 shows a flow chart according to an embodiment of the invention.
  • Figs. 2A-2C compare original and auto-segmented Metabolic Tumor Volumes according to some embodiments of the current invention.
  • Fig. 3 illustrates a Receiver Operator Characteristic analysis at the median survival of 246 days according to some embodiments of the current invention.
  • Fig. 4 analyzes the area under the curve for each of the various measurements illustrated in Fig. 3.
  • Figs. 5A-5G illustrate the likelihood of survival according to some embodiments of the current invention.
  • FIG. 6 illustrates an example of a computer system that may be configured to practice an illustrative embodiment of the invention.
  • the present invention may be implemented on a computer system operating as discussed herein.
  • the computer system may include, e.g., but is not limited to, a main memory, random access memory (RAM), and a secondary memory, etc.
  • Main memory, random access memory (RAM), and a secondary memory, etc. may be a computer-readable medium that may be configured to store instructions configured to implement one or more embodiments and may comprise a random-access memory (RAM) that may include RAM devices, such as Dynamic RAM (DRAM) devices, flash memory devices, Static RAM (SRAM) devices, etc.
  • DRAM Dynamic RAM
  • SRAM Static RAM
  • the secondary memory may include, for example, (but is not limited to) a hard disk drive and/or a removable storage drive, representing a floppy diskette drive, a magnetic tape drive, an optical disk drive, a compact disk drive (CD-ROM), flash memory, a cloud instance, etc.
  • the removable storage drive may, e.g., but is not limited to, read from and/or write to a removable storage unit in a well-known manner.
  • the removable storage unit also called a program storage device or a computer program product, may represent, e.g., but is not limited to, a floppy disk, magnetic tape, optical disk, compact disk, etc. which may be read from and written to the removable storage drive.
  • the removable storage unit may include a computer usable storage medium having stored therein computer software and/or data.
  • the secondary memory may include other similar devices for allowing computer programs or other instructions to be loaded into the computer system.
  • Such devices may include, for example, a removable storage unit and an interface. Examples of such may include a program cartridge and cartridge interface (such as, e.g., but not limited to, those found in video game devices), a removable memory chip (such as, e.g., but not limited to, an erasable programmable read only memory (EPROM), or programmable read only memory (PROM)) and associated socket, and other removable storage units and interfaces, which may allow software and data to be transferred from the removable storage unit to the computer system.
  • a program cartridge and cartridge interface such as, e.g., but not limited to, those found in video game devices
  • EPROM erasable programmable read only memory
  • PROM programmable read only memory
  • the computer may also include an input device including any mechanism or combination of mechanisms that may permit information to be input into the computer system from, e.g., a user.
  • the input device may include logic configured to receive information for the computer system from, e.g. a user. Examples of the input device may include, e.g., but are not limited to include, a mouse, pen-based pointing device, or other pointing device such as a digitizer, a touch sensitive display device, and/or a keyboard or other data entry device (none of which are labeled).
  • Other input devices may include, e.g., but are not limited to include, a biometric input device, a video source, an audio source, a microphone, a web cam, a video camera, and/or other camera.
  • the input device may communicate with a processor either wired or wirelessly.
  • the computer may also include output devices which may include any mechanism or combination of mechanisms that may output information from a computer system.
  • An output device may include logic configured to output information from the computer system.
  • Embodiments of an output device may include, e.g., but are not limited to, a display and display interface, including printers, speakers, cathode ray tubes (CRTs), plasma displays, light-emitting diode (LED) displays, liquid crystal displays (LCDs), vacuum fluorescent displays (VFDs), surface-conduction electron-emitter displays (SEDs), field emission displays (FEDs), etc.
  • the computer may include input/output (I/O) devices such as, e.g., (but not limited to) communications interface, cable and communications path, etc. These devices may include, e.g., but are not limited to, a network interface card, and/or modems.
  • the output device may communicate with processor either wired or wirelessly.
  • a communications interface may allow software and data to be transferred between the computer system and external devices.
  • the term "data processor” is intended to have a broad meaning that includes, e.g., but is not limited to include, one or more central processing units that are connected to a communication infrastructure (e.g., but not limited to, a communications bus, cross-over bar, interconnect, or network, etc.).
  • the term data processor may include any type of processor, microprocessor and/or processing logic that may interpret and execute instructions (e.g., for example, a field programmable gate array (FPGA)).
  • the data processor may comprise a single device (e.g., for example, a single core) and/or a group of devices (e.g., multi-core).
  • the data processor may include logic configured to execute computer-executable instructions configured to implement one or more embodiments.
  • the instructions may reside in main memory or secondary memory.
  • the data processor may also include multiple independent cores, such as a dual-core processor or a multi-core processor.
  • the data processors may also include one or more graphics processing units (GPU) which may be in the form of a dedicated graphics card, an integrated graphics solution, and/or a hybrid graphics solution.
  • GPU graphics processing units
  • data storage device is intended to have a broad meaning that includes a removable storage drive, a hard disk installed in hard disk drive, flash memories, removable discs, non-removable discs, Cloud storage such as Amazon, Apple, Dell, Google, etc., and other storage implementations.
  • various electromagnetic radiation such as wireless communication, electrical communication carried over an electrically conductive wire (e.g., but not limited to, twisted pair, CAT5, etc.) or an optical medium (e.g., but not limited to, optical fiber) and the like may be encoded to carry computer-executable instructions and/or computer data embodying the invention on, e.g., a communication network.
  • These computer program products may provide software to the computer system.
  • a computer-readable medium that comprises computer-executable instructions for execution in a processor may be configured to store various embodiments of the present invention.
  • Fig. 1 shows a flow chart according to an embodiment of the current invention.
  • Block 101 may represent quantitative cross-sectional image data of a medical image of a subject under observation over a course of time.
  • the subject may be a human patient, an animal, etc.
  • the medical image may be of any region of the subject, such as, for example, the chest, the leg, the arm, or the head of the subject.
  • the medical image may be, for example, a Positron Emission Tomography/Computed Tomography (PET/CT) image, a Magnetic Resonance Imaging (MRI) image, etc.
  • PET/CT Positron Emission Tomography/Computed Tomography
  • MRI Magnetic Resonance Imaging
  • the medical image may be one of or a combination of two- and three-dimensional data.
  • the medical image may be taken from a follow-up exam of the same subject, for example, after commencement of treatment. Alternatively, the medical image may be taken of the subject before commencement of treatment.
  • the quantitative cross-sectional image data at block 101 comprises a plurality of image voxels representing multiple tissue regions of the subject.
  • a single image voxel may represent multiple tissue regions.
  • Each voxel may have a different signal classification representing normal or non-normal tissue.
  • Non-normal tissue may include, for example, a tumor, neoplasm, hyperplasia, or dysplasia or any other type of diseased tissue.
  • an object may be quantitatively derived based on the image data acquired at block 101.
  • the derivation of the object may be based on a Metabolic Tumor Volumes (MTV) of the voxels detected in the medical image at Block 101.
  • MTV of the objects are those that assist an image analyst in classifying the object as normal or non-normal tissue.
  • the MTV are constructed by filtering the image data and using the spatial coordinates of the voxel having the highest radioactive concentration as a "seed" for determining whether the tissue is normal or non-normal.
  • the voxels that are "connected” to the "seed” and have a value greater than or equal to a threshold-based constraint, such as a PERCIST minimum threshold, are used in deriving the object.
  • the voxel is "connected” to the "seed” if there are any points of contact between two neighboring voxels, such as, for example, a face, an edge or a point touching between the two neighboring voxels.
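The face/edge/corner adjacency described above corresponds to 26-connectivity in three dimensions. A minimal sketch of such a check (the helper name `is_connected` is ours, not the patent's):

```python
import numpy as np

def is_connected(a, b):
    """True if voxels a and b are 26-connected, i.e. they share a
    face, an edge, or a corner: each coordinate differs by at most
    one, and the two voxels are not the same voxel."""
    d = np.abs(np.asarray(a) - np.asarray(b))
    return bool(d.max() <= 1 and d.sum() > 0)
```

Face contact corresponds to exactly one coordinate differing by one, edge contact to two, and corner (point) contact to all three.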
  • the PERCIST minimum threshold for measurement is defined as 1.5*Liver_mean + 2.0*Liver_SD, where the reference liver measurements are calculated using a 3 cm diameter spherical Volume of Interest (VOI).
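Under that definition the threshold is a simple statistic of the liver reference VOI. A hedged sketch (the function name and the use of the sample standard deviation are our assumptions; PERCIST 1.0 sets the minimum measurable-tumor threshold at 1.5 times the mean liver SUL plus twice its standard deviation):

```python
import numpy as np

def percist_min_threshold(liver_voi_sul):
    """PERCIST minimum threshold of measurement:
    1.5 * mean(liver SUL) + 2.0 * SD(liver SUL), where the liver
    SUL samples come from a 3 cm diameter spherical VOI."""
    v = np.asarray(liver_voi_sul, dtype=float)
    return 1.5 * v.mean() + 2.0 * v.std(ddof=1)
```

For a toy VOI such as `[0.9, 1.0, 1.1, 1.0]` (mean SUL 1.0, small spread) this evaluates to roughly 1.66.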
  • the edge voxels of the MTV are identified at block 103.
  • the edge voxels of the MTV are determined by the original delineation of the object.
  • the edge voxels are those that interface between known normal and non-normal tissue.
  • edge voxels are then sorted by value in either ascending or descending order.
  • erosion of the object occurs by removing the least contributory valued voxels one at a time from the edge of the object.
  • the least contributory valued voxels may be user-selectable before this process begins or predefined.
  • the least contributory valued voxel may be any statistical or volumetric criteria available for analyzing a voxel, such as, for example, a minimum target volume of voxels, a minimum number of voxels required to acquire a target object, a number of lowest valued voxels for data with high value signals, or a number of highest valued voxels for data with low value signals.
  • the eroded (removed) voxel is then stored into a buffer at block 106. The buffer maintains the spatial relationship of the eroded voxels.
  • the buffer, at block 106, may sort the eroded voxels in either ascending or descending order as they are removed from the edge of the object.
  • each of the resulting objects is then dilated at block 108.
  • the dilation of the resulting objects may be in order of a rank of the resulting objects.
  • the ranking of resulting objects may, for example, take the form of sorting, from the highest value to the lowest value, of candidates based on the chosen criterion for candidacy of the voxel, including, for example, an intensity value of the voxels.
  • a non-normal object may be grown around the target voxel's spatial location in the first medical image according to a user-selectable criterion.
  • the criterion may include, for example, an absolute or fixed threshold as in PERCIST, a relative or variable threshold (e.g., percentage of max or peak value), or multi-variable threshold, etc.
  • the buffer is then searched in either descending or ascending order for the highest valued eroded voxel that is "connected" to the highest ranked voxel of the object.
  • the highest valued eroded voxel may be based on a chosen criterion for candidacy of the voxel, including, for example, an intensity value of the voxels.
  • the highest valued eroded voxel may be the most contributory-valued voxel remaining of the eroded voxels that is on a surface of the particular resulting object.
  • the highest valued eroded voxel is then removed from the buffer and inserted into the resulting object.
  • This process repeats with the next highest ranked object and continues to do so until the buffer is empty of eroded voxels.
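Taken together, blocks 103 through 108 amount to an erode-then-regrow separation pass. The following is a simplified sketch under our own data layout (objects as sets of integer voxel coordinates, intensities supplied by a callable); the patent's ranking of the resulting objects is approximated here by handing the highest-valued buffered voxel to the first object it touches:

```python
from collections import deque
from itertools import product

# the 26 neighbor offsets of a voxel (face, edge, and corner contact)
NEIGHBORS = [d for d in product((-1, 0, 1), repeat=3) if d != (0, 0, 0)]

def components(voxels):
    """26-connected components of a set of (x, y, z) coordinates."""
    voxels, comps = set(voxels), []
    while voxels:
        seed = voxels.pop()
        comp, queue = {seed}, deque([seed])
        while queue:
            x, y, z = queue.popleft()
            for dx, dy, dz in NEIGHBORS:
                n = (x + dx, y + dy, z + dz)
                if n in voxels:
                    voxels.remove(n)
                    comp.add(n)
                    queue.append(n)
        comps.append(comp)
    return comps

def auto_segment(obj, value):
    """Erode `obj` by stripping the least-valued edge voxel one at a
    time until the object splits (or shrinks to a single voxel), then
    grow each fragment back by re-attaching buffered voxels, highest
    value first.  `value` maps a voxel coordinate to its intensity."""
    remaining, buffer = set(obj), []
    while len(remaining) > 1:
        # an edge voxel has at least one 26-neighbor outside the object
        edge = [v for v in remaining
                if any((v[0] + dx, v[1] + dy, v[2] + dz) not in remaining
                       for dx, dy, dz in NEIGHBORS)]
        v = min(edge, key=value)           # least contributory edge voxel
        remaining.remove(v)
        buffer.append(v)                   # buffer preserves eroded voxels
        if len(components(remaining)) > 1:
            break                          # object separated: stop eroding
    parts = components(remaining)
    # dilate: return buffered voxels, highest value first, to whichever
    # resulting object they are 26-connected to
    buffer.sort(key=value, reverse=True)
    while buffer:
        for i, v in enumerate(buffer):
            touching = [p for p in parts
                        if any((v[0] + dx, v[1] + dy, v[2] + dz) in p
                               for dx, dy, dz in NEIGHBORS)]
            if touching:
                touching[0].add(v)
                del buffer[i]
                break
        else:
            break  # no buffered voxel touches any object
    return parts
```

On a line of five voxels whose middle voxel has the lowest value (two bright regions joined by a dim bridge), the erosion removes the bridge voxel, the object splits in two, and dilation hands the bridge voxel back to one of the two resulting objects.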
  • the resulting objects 109, 110... N may then be presented to a user without signal change.
  • the resulting images may be also presented in different colors to discern between them.
  • information of the spatial locations of the non-normal object may be subsequently extracted.
  • Data encoding the spatial location of the identified non-normal objects may then be entered into a database.
  • the database may be reviewed, for example, manually for false positives in the medical image.
  • Some normal tissue regions, for example the heart, tend to have high metabolic activity and can be mistakenly identified as disease objects. If false positives are identified, they can be easily removed.
  • a disease assessment report can be generated.
  • a disease assessment report may be generated at a later observation point. The disease assessment reports at different observation times may be compared to generate a report on disease progression or treatment response.
  • Figs. 2A-2B compare an original Metabolic Tumor Volume (MTV) and an auto-segmented MTV according to some embodiments of the current invention.
  • MTV Metabolic Tumor Volumes
  • the objects in the auto-segmented MTV are more easily identified than those in the original MTV.
  • Figure 2A compares an original MTV and auto-segmented MTV of a heart relative to a tumor.
  • Fig. 2B compares an original MTV and auto-segmented MTV of a kidney relative to a tumor.
  • the diseased tissue may be further analyzed to determine a prognostic value, such as a length of survival, of an individual.
  • the diseased tissue may be measured according to a Peak Value SUL, MAX-SUL, Whole Body MTV, Whole body TLG, Object Count or Global HIPP.
  • the Peak Value SUL being the largest possible mean value of a 1 cm³ spherical VOI positioned within all diseased objects.
  • the MAX-SUL being the single-voxel maximum SUL over all diseased objects.
  • the Whole Body MTV being the total volume of voxels classified as diseased that have SUL values greater than or equal to the PERCIST minimum threshold of measurement.
  • the Whole Body TLG being the product of MTV and the diseased SUL, summed over all detected diseased objects.
  • the Object Count being the number of discrete tumors detected and classified as disease at the PERCIST minimum threshold of measurement.
  • the Global HIPP being the sum of the HIPP indices for all discrete objects detected and classified as disease at the PERCIST minimum threshold of measurement. If the HIPP indices for the discrete objects were not initially identified in the first run of the process described in Fig. 1, a second run of the process described in Fig. 1 may then be performed.
  • [0039] Baseline FDG-PET studies of 86 randomly selected patients with Ewing's sarcoma were analyzed.
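Given objects already segmented and thresholded as described above, the whole-body summary measurements (other than the HIPP index, whose computation is not detailed here) reduce to simple reductions over per-object SUL values. A sketch under assumed inputs (per-object 1-D SUL arrays and a known voxel volume; function and argument names are ours):

```python
import numpy as np

def whole_body_metrics(objects, voxel_volume_ml):
    """`objects`: list of 1-D arrays of SUL values, one array per
    discrete diseased object (voxels already at or above the PERCIST
    minimum threshold).  Returns whole-body MTV (mL), whole-body TLG,
    MAX-SUL, and the Object Count."""
    # MTV: total volume of all voxels classified as diseased
    mtv = sum(o.size for o in objects) * voxel_volume_ml
    # TLG: each object's volume times its mean SUL, summed over objects
    tlg = sum(o.size * voxel_volume_ml * o.mean() for o in objects)
    # MAX-SUL: single-voxel maximum over all diseased objects
    max_sul = max(o.max() for o in objects)
    return mtv, tlg, max_sul, len(objects)
```

For two toy objects with SULs `[2.0, 4.0]` and `[3.0]` at 1 mL per voxel, this gives MTV 3 mL, TLG 9, MAX-SUL 4, and an Object Count of 2.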
  • Fig. 3 illustrates a Receiver Operator Characteristic analysis at the median survival of 246 days according to some embodiments of the current invention.
  • Fig. 3 provides a Receiver Operator Characteristic analysis based on Peak Value SUL, MAX-SUL, Whole Body MTV, Whole body TLG, Object Count and HIPP of the patients.
  • Fig. 3 shows the sensitivity in detecting diseased tissue versus the specificity of locating the diseased tissue for each of the various measurements.
  • Fig. 4 analyzes the area under the curve for each of these various measurements illustrated in Fig. 3. As shown, the study found that the Global HIPP and Object Count are the most accurate in determining the likelihood of survival, whereas MAX-SUL and Peak Value SUL are the least accurate.
  • the Global HIPP was found to be a better indicator than the Object Count, as it extends the information contained in the Object Count with additional information regarding the pattern of metabolic activity distribution within each distinct object.
  • Figs. 5A-5G illustrate a likelihood of survival according to some embodiments of the current invention.
  • the study found that the difference between the Global HIPP and the Object Count were indicative of a prognostic value regardless of the type of diseased tissue. More specifically, the study found that the greater the difference between the Global HIPP and the Object Count the fewer the days that the patient was likely to survive.
  • FIG. 6 illustrates an example of a computer system 600 that may be configured to practice an embodiment of the invention.
  • computer system 600 may be used to implement client 610, service provider 650, target environment 660, etc.
  • Computer system 600 may include processor 620, memory 670, storage device 640, input device 610, output device 660, and network interface 680.
  • Processor 620 may include logic configured to execute computer-executable instructions that implement embodiments of the invention.
  • An example of a processor that may be used with the invention includes the Pentium® processor, Core i7® processor, or Xeon® processor, all available from Intel Corporation, Santa Clara, California.
  • the instructions may reside in memory 670 and may include instructions associated with TCE.
  • Memory 670 may be a computer-readable medium that may be configured to store instructions configured to implement embodiments of the invention.
  • Memory 670 may be a primary storage accessible to processor 620 and can include a random-access memory (RAM) that may include RAM devices, such as, for example, Dynamic RAM (DRAM) devices, flash memory devices, Static RAM (SRAM) devices, etc.
  • RAM random-access memory
  • Storage device 640 may include a magnetic disk and/or optical disk and its corresponding drive for storing information and/or instructions. Memory 670 and/or storage device 640 may store class definitions.
  • Interconnect 650 may include logic that operatively couples components of computer system 600 together.
  • interconnect 650 may allow components to communicate with each other, may provide power to components of computer system 600, etc.
  • interconnect 650 may be implemented as a bus.
  • Input device 610 may include logic configured to receive information for computer system 600 from, e.g., a user.
  • Embodiments of input device 610 may include keyboards, touch sensitive displays, biometric sensing devices, computer mice, trackballs, pen-based pointing devices, etc.
  • Output device 660 may include logic configured to output information from computer system 600.
  • Embodiments of output device 660 may include cathode ray tubes (CRTs), plasma displays, light-emitting diode (LED) displays, liquid crystal displays (LCDs), printers, vacuum fluorescent displays (VFDs), surface-conduction electron-emitter displays (SEDs), field emission displays (FEDs), etc.
  • CRTs cathode ray tubes
  • LED light-emitting diode
  • LCDs liquid crystal displays
  • VFDs vacuum fluorescent displays
  • SEDs surface-conduction electron-emitter displays
  • FEDs field emission displays
  • Network interface 680 may include logic configured to interface computer system 600 with a network, e.g., network 640, and may enable computer system 600 to exchange information with other entities connected to the network, such as, for example, service provider, target environment and cluster.
  • Network interface 680 may be implemented as a built-in network adapter, network interface card (NIC), Personal Computer Memory Card International Association (PCMCIA) network card, card bus network adapter, wireless network adapter, Universal Serial Bus (USB) network adapter, modem or any other device suitable for interfacing computer system 600 to any type of network.
  • NIC network interface card
  • PCMCIA Personal Computer Memory Card International Association
  • USB Universal Serial Bus
  • embodiments may be implemented using some combination of hardware and/or software.
  • a computer-readable medium that includes computer-executable instructions for execution in a processor may be configured to store embodiments of the invention.
  • the computer-readable medium may include volatile memories, non-volatile memories, flash memories, removable discs, non-removable discs and so on.
  • various electromagnetic signals such as wireless signals, electrical signals carried over a wire, optical signals carried over optical fiber and the like may be encoded to carry computer-executable instructions and/or computer data on e.g., a communication network for an embodiment of the invention.
  • a hardware unit of execution may include a device (e.g., a hardware resource) that performs and/or participates in parallel programming activities.
  • a hardware unit of execution may perform and/or participate in parallel programming activities in response to a request and/or a task it has received (e.g., received directly or via a proxy).
  • a hardware unit of execution may perform and/or participate in substantially any type of parallel programming (e.g., task, data, stream processing, etc.) using one or more devices.
  • a hardware unit of execution may include a single processing device that includes multiple cores, and in another implementation, the hardware unit of execution may include a number of processors 620.
  • a hardware unit of execution may also be a programmable device, such as a field programmable gate array (FPGA), an application specific integrated circuit (ASIC), a digital signal processor (DSP), etc.
  • FPGA field programmable gate array
  • ASIC application specific integrated circuit
  • DSP digital signal processor
  • Devices used in a hardware unit of execution may be arranged in substantially any configuration (or topology), such as a grid, ring, star, etc.
  • a hardware unit of execution may support one or more threads (or processes) when performing processing operations.

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Radiology & Medical Imaging (AREA)
  • Quality & Reliability (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Apparatus For Radiation Diagnosis (AREA)

Abstract

An embodiment of the current invention includes a computer-implemented method for image processing. The method includes receiving an image comprising a plurality of image voxels representing a plurality of tissue regions of a subject, determining a first set of quantitatively derived objects from the image, eroding each of the quantitatively derived objects by removing least contributory-valued edge voxels until either a single voxel remains in the object or the object is separated into multiple objects, and dilating the resulting objects using eroded voxels to produce a second plurality of tissue regions of the subject.

Description

METHODS AND SYSTEMS FOR AUTO-SEGMENTING QUANTITATIVELY DERIVED DISEASE OBJECTS IN DIGITAL MEDICAL IMAGES
CROSS-REFERENCE OF RELATED APPLICATION
[0001] This application claims priority to U.S. Provisional Application No. 61/902,670 filed on November 11, 2013, the entire contents of which are relied upon and incorporated by reference.
BACKGROUND
1. Field of Invention
[0002] The current patent application generally relates to medical image processing.
2. Discussion of Related Art
[0003] The ultimate goal of new cancer therapies is cure. A good cancer treatment should ideally prolong survival while preserving a high quality of life cost-effectively. To demonstrate prolonged survival in a clinical trial in some more slowly progressing cancers can take 5-10 years or longer. Such trials are expensive, not only in cost but in time.
[0004] The typical development pathway for cancer therapeutic drugs includes an evolution from phase I to phase II and to phase III clinical trials. In phase I trials, toxicity of the agent is typically assessed to determine what dose is appropriate for subsequent trials. Typically, the statistical power of phase I drug trials is inadequate to assess antitumor efficacy. In phase II trials, evidence of antitumor activity is obtained. Phase II trials can be done in several ways. One approach is to examine tumor response rate versus a historical control population treated with an established drug. New drugs with a low response rate are typically not moved forward to advanced clinical testing under such a paradigm. In such trials, tumor response has nearly always been determined anatomically. An alternative approach is to use a typically larger sample size and have a randomized phase II trial, in which the new treatment is given in one treatment arm and compared with a standard treatment. Once drug activity is shown— or suggested— in phase II, phase III trials are typically performed. Phase III trials are larger and typically have a control arm treated with a standard therapy. Not all phase III trials are successful, but all are costly.
[0005] Determining which innovative cancer therapeutics should be advanced to pivotal large phase III trials can be unacceptably delayed if survival is the sole endpoint for efficacy. Survival trials can also be complicated by deaths due to nonmalignant causes, especially in older patients in whom comorbidities are common. Additional complexities can include patients who progress on a clinical trial but who go on to have one of several non- randomly distributed follow-up therapies— which can confound survival outcomes.
[0006] Therefore, there is great interest in surrogate metrics for survival after investigational cancer treatments, such as response rate, time to tumor progression, or progression-free survival. Changes in tumor size after treatment are often, but not invariably, related to duration of survival. To this end, a variety of approaches to measuring response rate have been developed, beginning with the original reports by Moertel on physical examination in 1976 and continuing to the subsequent World Health Organization (WHO) criteria (1979), Response Evaluation Criteria in Solid Tumors (RECIST) (2000), and RECIST 1.1 (2009). These approaches typically focus on how often a tumor shrinks anatomically and define such response in several ways, including, for example, complete response, partial response, stable disease, and progressive disease. This type of classification divides intrinsically continuous data (tumor size) into 4 bins, losing statistical power for ease of nomenclature and convenience.
[0007] Thus, intrinsic limitations of currently applied anatomic tumor response metrics, including the WHO, RECIST, and new RECIST 1.1 criteria, have led to an ongoing pursuit of quantitative and qualitative approaches to investigate surrogate endpoints based on functional imaging such as Positron Emission Tomography / Computed Tomography (PET/CT). In particular, a framework for PET Response Criteria in Solid Tumors (PERCIST, version 1.0) has recently been proposed. These functional surrogate endpoints may be useful in future multicenter trials and may serve as a starting point for further refinements of quantitative PET response. They may also provide some guidance for clinical quantitative structured reporting on individual patients. Thus, there is a need in the art for a method that can be objectively implemented by different users in a multi-center environment to monitor a patient's condition over time (i.e., in a longitudinal manner).
SUMMARY
[0008] In an illustrative embodiment of the present invention, a method, a system and a non-transitory computer readable storage medium are disclosed.
[0009] A computer-implemented method for image processing, the method comprising receiving an image comprising a plurality of image voxels representing a plurality of tissue regions of a subject; determining a first set of quantitatively derived objects from the image; for each of the quantitatively derived objects, eroding the object by removing a plurality of least contributory-valued edge voxels until either a single voxel is remaining in the object or the object is separated into multiple objects; and dilating resulting objects using eroded voxels to produce a second plurality of tissue regions of the subject.
[0010] A system for image processing, the system comprising a memory storing an image comprising a plurality of image voxels representing a plurality of tissue regions of a subject; and a processor coupled to the memory, the processor being configured to: receive the image; determine a first set of quantitatively derived objects from the image; for each of the quantitatively derived objects, erode the object by removing a plurality of least contributory-valued edge voxels until either there is a single voxel remaining in the object or the object is separated into multiple objects; and dilate resulting objects using eroded voxels to produce a second plurality of tissue regions of the subject.

[0011] A non-transitory computer readable storage medium comprising instructions that if executed enable a computing system to receive an image comprising a plurality of image voxels representing a plurality of tissue regions of a subject; determine a first set of quantitatively derived objects from the image; for each of the quantitatively derived objects, erode the object by removing a plurality of least contributory-valued edge voxels until either there is a single voxel remaining in the object or the object is separated into multiple objects; and dilate resulting objects using eroded voxels to produce a second plurality of tissue regions of the subject.
[0012] Additional features, advantages, and embodiments of the invention are set forth or apparent from consideration of the following detailed description, drawings and claims. Moreover, it is to be understood that both the foregoing summary of the invention and the following detailed description are exemplary and intended to provide further explanation without limiting the scope of the invention as claimed.
BRIEF DESCRIPTION OF THE DRAWINGS
[0013] Further objectives and advantages will become apparent from a consideration of the description, drawings, and examples.
[0014] Fig. 1 shows a flow chart according to an embodiment of the invention.
[0015] Figs. 2A-2C compare original and auto-segmented Metabolic Tumor Volumes according to some embodiments of the current invention.
[0016] Fig. 3 illustrates a Receiver Operator Characteristic analysis at the median survival of 246 days according to some embodiments of the current invention.
[0017] Fig. 4 analyzes the area under the curve for each of the various measurements illustrated in Fig. 3.

[0018] Figs. 5A-5G illustrate the likelihood of survival according to some embodiments of the current invention.
[0019] Fig. 6 illustrates an example of a computer system that may be configured to practice an illustrative embodiment of the invention.
DETAILED DESCRIPTION
[0020] Some embodiments of the current invention are discussed in detail below. In describing the embodiments, specific terminology is employed for the sake of clarity. However, the invention is not intended to be limited to the specific terminology so selected. A person skilled in the relevant art will recognize that other equivalent components can be employed and other methods developed without departing from the broad concepts of the current invention. All references cited herein are incorporated by reference as if each had been individually incorporated.
[0021] In one illustrative embodiment, the present invention may be implemented on a computer system operating as discussed herein. The computer system may include, e.g., but is not limited to, a main memory, random access memory (RAM), and a secondary memory, etc. Main memory, random access memory (RAM), and a secondary memory, etc., may be a computer-readable medium that may be configured to store instructions configured to implement one or more embodiments and may comprise a random-access memory (RAM) that may include RAM devices, such as Dynamic RAM (DRAM) devices, flash memory devices, Static RAM (SRAM) devices, etc.
[0022] The secondary memory may include, for example, (but is not limited to) a hard disk drive and/or a removable storage drive, representing a floppy diskette drive, a magnetic tape drive, an optical disk drive, a compact disk drive CD-ROM, flash memory, cloud instance, etc. The removable storage drive may, e.g., but is not limited to, read from and/or write to a removable storage unit in a well-known manner. The removable storage unit, also called a program storage device or a computer program product, may represent, e.g., but is not limited to, a floppy disk, magnetic tape, optical disk, compact disk, etc. which may be read from and written to the removable storage drive. As will be appreciated, the removable storage unit may include a computer usable storage medium having stored therein computer software and/or data.
[0023] In alternative illustrative embodiments, the secondary memory may include other similar devices for allowing computer programs or other instructions to be loaded into the computer system. Such devices may include, for example, a removable storage unit and an interface. Examples of such may include a program cartridge and cartridge interface (such as, e.g., but not limited to, those found in video game devices), a removable memory chip (such as, e.g., but not limited to, an erasable programmable read only memory (EPROM), or programmable read only memory (PROM)) and associated socket, and other removable storage units and interfaces, which may allow software and data to be transferred from the removable storage unit to the computer system.
[0024] The computer may also include an input device including any mechanism or combination of mechanisms that may permit information to be input into the computer system from, e.g., a user. The input device may include logic configured to receive information for the computer system from, e.g., a user. Examples of the input device may include, e.g., but are not limited to, a mouse, pen-based pointing device, or other pointing device such as a digitizer, a touch sensitive display device, and/or a keyboard or other data entry device (none of which are labeled). Other input devices may include, e.g., but are not limited to, a biometric input device, a video source, an audio source, a microphone, a web cam, a video camera, and/or other camera. The input device may communicate with a processor either wired or wirelessly.

[0025] The computer may also include output devices which may include any mechanism or combination of mechanisms that may output information from a computer system. An output device may include logic configured to output information from the computer system. Embodiments of an output device may include, e.g., but are not limited to, a display and display interface, including cathode ray tube (CRT) displays, plasma displays, light-emitting diode (LED) displays, liquid crystal displays (LCDs), vacuum fluorescent displays (VFDs), surface-conduction electron-emitter displays (SEDs), field emission displays (FEDs), printers, speakers, etc. The computer may include input/output (I/O) devices such as, e.g., (but not limited to) a communications interface, cable and communications path, etc. These devices may include, e.g., but are not limited to, a network interface card and/or modems. The output device may communicate with the processor either wired or wirelessly. A communications interface may allow software and data to be transferred between the computer system and external devices.
[0026] The term "data processor" is intended to have a broad meaning that includes, e.g., but is not limited to include, one or more central processing units that are connected to a communication infrastructure (e.g., but not limited to, a communications bus, cross-over bar, interconnect, or network, etc.). The term data processor may include any type of processor, microprocessor and/or processing logic that may interpret and execute instructions (e.g., for example, a field programmable gate array (FPGA)). The data processor may comprise a single device (e.g., for example, a single core) and/or a group of devices (e.g., multi-core). The data processor may include logic configured to execute computer-executable instructions configured to implement one or more embodiments. The instructions may reside in main memory or secondary memory. The data processor may also include multiple independent cores, such as a dual-core processor or a multi-core processor. The data processors may also include one or more graphics processing units (GPU) which may be in the form of a dedicated graphics card, an integrated graphics solution, and/or a hybrid graphics solution. Various illustrative software embodiments may be described in terms of this illustrative computer system. After reading this description, it will become apparent to a person skilled in the relevant art(s) how to implement the invention using other computer systems and/or architectures.
[0027] The term "data storage device" is intended to have a broad meaning that includes a removable storage drive, a hard disk installed in a hard disk drive, flash memories, removable discs, non-removable discs, cloud storage such as Amazon, Apple, Dell, Google, etc., and other storage implementations. In addition, it should be noted that various electromagnetic radiation, such as wireless communication, electrical communication carried over an electrically conductive wire (e.g., but not limited to, twisted pair, CAT5, etc.) or an optical medium (e.g., but not limited to, optical fiber) and the like may be encoded to carry computer-executable instructions and/or computer data embodying embodiments of the invention on, e.g., a communication network. These computer program products may provide software to the computer system. It should be noted that a computer-readable medium that comprises computer-executable instructions for execution in a processor may be configured to store various embodiments of the present invention.
[0028] Fig. 1 shows a flow chart according to an embodiment of the current invention.
The flow chart of Fig. 1 describes a technique for automatically and objectively identifying a surface for separating objects in a medical image that share a spatially connected area of signal but belong to different signal classifications including, for example, normal and non-normal tissue. Block 101 may represent quantitative cross-sectional image data of a medical image of a subject under observation over a course of time. The subject may be a human patient, an animal, etc. The medical image may be of any region of the subject, such as, for example, the chest, the leg, the arm, or the head of the subject. The medical image may be, for example, a Positron Emission Tomography/Computed Tomography (PET/CT) image, a Magnetic Resonance Imaging (MRI) image, etc. The medical image may be one of or a combination of two- and three-dimensional data. The medical image may be taken from a follow-up exam of the same subject, for example, after commencement of treatment. Alternatively, the medical image may be taken of the subject before commencement of treatment.
[0029] The quantitative cross-sectional image data at block 101 comprises a plurality of image voxels representing multiple tissue regions of the subject. A single image voxel may represent multiple tissue regions. Each voxel may have a different signal classification representing normal or non-normal tissue. Non-normal tissue may include, for example, a tumor, neoplasm, hyperplasia, or dysplasia or any other type of diseased tissue.
[0030] Proceeding to block 102, an object may be quantitatively derived based on the image data acquired at block 101. The derivation of the object may be based on the Metabolic Tumor Volume (MTV) of the voxels detected in the medical image at block 101. The MTVs of the objects are those that assist an image analyst in classifying the object as normal or non-normal tissue. The MTVs are constructed by filtering the image data and using the spatial coordinates of the voxel having the highest radioactive concentration as a "seed" for determining whether the tissue is normal or non-normal. The voxels that are "connected" to the "seed" and have a value greater than or equal to a threshold-based constraint, such as the PERCIST minimum threshold, are used in deriving the object. A voxel is "connected" to the "seed" if there is any point of contact between the two neighboring voxels, such as, for example, a face, an edge or a point touching between the two neighboring voxels. The PERCIST minimum threshold for measurement is defined as 1.5*Livermean + 2.0*Liversd, where the reference liver measurements are calculated using a 3 cm diameter spherical Volume of Interest (VOI).

[0031] Upon deriving the object based on the threshold-based constraint, the edge voxels of the MTV are identified at block 103. The edge voxels of the MTV are determined by the original delineation of the object. The edge voxels are those that interface between known normal and non-normal tissue.
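The object derivation of block 102 can be sketched in code. The following is a minimal illustration rather than the patented implementation: it assumes the image is available as a NumPy array of SUL values, uses SciPy's connected-component labeling to realize the face/edge/point ("26-connected") neighbor test, and the function names `percist_threshold` and `grow_object` are hypothetical.

```python
import numpy as np
from scipy import ndimage

def percist_threshold(liver_mean, liver_sd):
    """PERCIST minimum measurable-activity threshold:
    1.5 * mean liver SUL + 2.0 * liver SUL standard deviation,
    with the liver reference measured in a 3 cm spherical VOI."""
    return 1.5 * liver_mean + 2.0 * liver_sd

def grow_object(volume, seed, threshold):
    """Grow a quantitatively derived object from a seed voxel:
    keep voxels >= threshold that are 26-connected (face, edge,
    or point contact) to the seed."""
    mask = volume >= threshold
    structure = np.ones((3, 3, 3), dtype=bool)  # 26-connectivity
    labels, _ = ndimage.label(mask, structure=structure)
    return labels == labels[seed]
```

Per the description above, the seed passed to `grow_object` would be the coordinates of the voxel with the highest radioactive concentration.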
[0032] Thereafter, at block 104, the edge voxels are sorted by value in either ascending or descending order. At block 105, the object is eroded by removing its least contributory-valued voxels one at a time from the edge of the object. The criterion defining the least contributory-valued voxel may be user-selectable before this process begins or predefined, and may be any statistical or volumetric criterion available for analyzing a voxel, such as, for example, a minimum target volume of voxels, a minimum number of voxels required to acquire a target object, a number of lowest valued voxels for data with high value signals, or a number of highest valued voxels for data with low value signals. The eroded (removed) voxel is then stored into a buffer at block 106. The buffer maintains the spatial relationship of the eroded voxels.
[0033] After each least contributory-valued voxel is removed from the edge of the object, a determination is made, at block 107, whether the resulting object has been eroded into a single voxel or separated into two or more distinct objects. If neither has occurred, then the process returns to block 103 and repeats. This sequence continues until the resulting object has been eroded into a single voxel or separated into two or more distinct objects. The buffer, at block 106, may sort the eroded voxels in either ascending or descending order as they are removed from the edge of the object.
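The erosion loop of blocks 103-107 can be sketched under the same assumptions (a NumPy SUL volume, 26-connectivity, and the lowest value taken as "least contributory"); the names `erode_object` and `STRUCT` are hypothetical and this is an illustration only, not the patented implementation.

```python
import numpy as np
from scipy import ndimage

STRUCT = np.ones((3, 3, 3), dtype=bool)  # 26-connectivity

def erode_object(volume, mask):
    """Peel the lowest-valued edge voxel one at a time (blocks
    103-107), recording each eroded voxel's value and coordinates
    in a buffer, until either a single voxel remains or the object
    separates into two or more distinct objects."""
    mask = mask.copy()
    buffer = []  # (value, coordinate) pairs, in erosion order
    while True:
        if mask.sum() == 1:
            break
        # edge voxels: in the object but adjacent to the background
        interior = ndimage.binary_erosion(mask, structure=STRUCT)
        edges = np.argwhere(mask & ~interior)
        # remove the least contributory (lowest-valued) edge voxel
        values = [volume[tuple(c)] for c in edges]
        coord = tuple(edges[int(np.argmin(values))])
        mask[coord] = False
        buffer.append((volume[coord], coord))
        # stop if the removal separated the object
        if ndimage.label(mask, structure=STRUCT)[1] > 1:
            break
    return mask, buffer
```

For example, a line of bright voxels joined by a single dim voxel erodes the dim voxel first, which immediately splits the object in two and ends the loop.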
[0034] Once the resulting object has been eroded into a single voxel or separated into two or more distinct objects, each of the resulting objects is then dilated at block 108. The dilation of the resulting objects may be in order of a rank of the resulting objects. The ranking of resulting objects may, for example, take the form of sorting, from the highest value to the lowest value, of candidates based on the chosen criterion for candidacy of the voxel, including, for example, an intensity value of the voxels. If the voxel intensity value of the highest ranking target voxel is greater than a non-normal threshold value, a non-normal object may be grown around the target voxel's spatial location in the first medical image according to a user-selectable criterion. The criterion may include, for example, an absolute or fixed threshold as in PERCIST, a relative or variable threshold (e.g., percentage of max or peak value), or a multi-variable threshold, etc.
[0035] Once the highest ranked object is determined, the buffer is then searched in either descending or ascending order for the highest valued eroded voxel that is "connected" to the highest ranked voxel of the object. The highest valued eroded voxel may be based on a chosen criterion for candidacy of the voxel, including, for example, an intensity value of the voxels. The highest valued eroded voxel may be the most contributory-valued voxel remaining of the eroded voxels that is on a surface of the particular resulting object. The highest valued eroded voxel is then removed from the buffer and inserted into the resulting object. This process repeats with the next highest ranked object and continues to do so until the buffer is empty of eroded voxels. Upon completing dilation, the resulting objects 109, 110 ... N may then be presented to a user without signal change. The resulting images may also be presented in different colors to discern between them.
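The dilation step of block 108 can be sketched as follows, assuming each resulting object is represented as a set of voxel coordinates, the objects list is ranked highest first, and the buffer holds (value, coordinate) pairs from the erosion step. The names are hypothetical and the sketch is illustrative only.

```python
def neighbors(c):
    """26-connected neighbors of a 3-D coordinate."""
    x, y, z = c
    return {(x + i, y + j, z + k)
            for i in (-1, 0, 1) for j in (-1, 0, 1) for k in (-1, 0, 1)
            if (i, j, k) != (0, 0, 0)}

def dilate_objects(objects, buffer):
    """Reassign eroded voxels (block 108): for each object in rank
    order, reacquire the highest-valued buffered voxel touching the
    object's surface, and repeat until the buffer is exhausted."""
    remaining = sorted(buffer, key=lambda v: v[0], reverse=True)
    while remaining:
        progress = False
        for obj in objects:  # in order of rank, highest first
            for idx, (val, coord) in enumerate(remaining):
                if neighbors(coord) & obj:  # connected to this object
                    obj.add(coord)
                    del remaining[idx]
                    progress = True
                    break
        if not progress:  # no buffered voxel touches any object
            break
    return objects
```

Because each object takes only the highest-valued voxel still touching its surface, the buffered voxels accrete back onto the nearest surviving object without signal change, partitioning the original MTV between the separated objects.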
[0036] In an embodiment of the invention, once the non-normal object is identified in the medical image, information on the spatial locations of the non-normal object may be subsequently extracted. Data encoding the spatial locations of the identified non-normal objects may then be entered into a database. The database may be reviewed, for example, manually for false positives in the medical image. Some normal tissue regions, for example the heart, tend to have high metabolic activity and can be mistakenly identified as a disease object. If false positives are identified, they can be easily removed. Thereafter, a disease assessment report can be generated. In a similar fashion, a disease assessment report may be generated at a later observation point. The disease assessment reports at different observation times may be compared to generate a report on disease progression or treatment response.
[0037] Figs. 2A and 2B compare an original Metabolic Tumor Volume (MTV) and an auto-segmented MTV according to some embodiments of the current invention. The objects in the auto-segmented MTV are more easily identified than those in the original MTV. Fig. 2A compares an original MTV and auto-segmented MTV of a heart relative to a tumor. Fig. 2B compares an original MTV and auto-segmented MTV of a kidney relative to a tumor.
[0038] In an additional embodiment of the invention, after separating normal and diseased tissue as described in Fig. 1, the diseased tissue may be further analyzed to determine a prognostic value, such as a length of survival, of an individual. The diseased tissue may be measured according to a Peak Value SUL, MAX-SUL, Whole Body MTV, Whole Body TLG, Object Count or Global HIPP. The Peak Value SUL is the largest possible mean value of a 1 cm3 spherical VOI positioned within all diseased objects. The MAX-SUL is the single-voxel maximum SUL over all diseased objects. The Whole Body MTV is the total volume of voxels classified as diseased that have SUL values greater than or equal to the PERCIST minimum threshold of measurement. The Whole Body TLG is the product of MTV and the diseased SUL, summed over all detected diseased objects. The Object Count is the number of discrete tumors detected and classified as disease at the PERCIST minimum threshold of measurement. The Global HIPP is the sum of the HIPP indices for all discrete objects detected and classified as disease at the PERCIST minimum threshold of measurement. If the HIPP indices for the discrete objects were not initially identified in the first run of the process described in Fig. 1, a second run of the process described in Fig. 1 may then be performed.

[0039] Baseline FDG-PET studies of 86 randomly selected patients with Ewing's Sarcoma were conducted and analyzed to demonstrate the feasibility of the proposed computer-assisted method. The median overall survival of these 86 patients was 246 days. Using a Receiver Operating Characteristic (ROC) analysis, multiple measurements were then taken of the voxels of the diseased tissue to determine the ability to predict a prognostic value at the median overall survival.
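Several of the summary measures defined in paragraph [0038] (Whole Body MTV, Whole Body TLG, and Object Count) reduce to simple sums over per-object masks. A minimal sketch, assuming boolean disease masks over a NumPy SUL volume; the function name is illustrative:

```python
import numpy as np

def whole_body_metrics(volume, disease_masks, voxel_volume_ml):
    """Summary measures over all detected disease objects:
      MTV_ml       - total volume of disease voxels (mL)
      TLG          - sum over objects of (object MTV * object mean SUL)
      object_count - number of discrete disease objects"""
    mtv = sum(m.sum() for m in disease_masks) * voxel_volume_ml
    tlg = sum(m.sum() * voxel_volume_ml * volume[m].mean()
              for m in disease_masks)
    return {"MTV_ml": mtv, "TLG": tlg, "object_count": len(disease_masks)}
```

The Peak Value SUL, MAX-SUL, and Global HIPP would require, respectively, a spherical-VOI search, a global maximum, and the per-object HIPP indices, which are outside this sketch.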
[0040] Fig. 3 illustrates a Receiver Operator Characteristic analysis at the median survival of 246 days according to some embodiments of the current invention. Fig. 3 provides a Receiver Operator Characteristic analysis based on the Peak Value SUL, MAX-SUL, Whole Body MTV, Whole Body TLG, Object Count and Global HIPP of the patients. Fig. 3 shows the sensitivity in detecting diseased tissue versus the specificity of locating the diseased tissue for each of the various measurements. Fig. 4 analyzes the area under the curve for each of these various measurements illustrated in Fig. 3. As shown, the study found that the Global HIPP and Object Count are the most accurate in determining the likelihood of survival, whereas MAX-SUL and Peak Value SUL are the least accurate. The Global HIPP was found to be a better indicator than the Object Count, as it extended the information contained in the Object Count with additional information regarding the pattern of metabolic activity distribution within each distinct object.
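The dichotomized analysis described above (patients split at the median overall survival, with each measurement scored by its area under the ROC curve) can be sketched with the rank-sum identity for the empirical AUC. The function name is illustrative and this is not the study's actual code:

```python
import numpy as np

def roc_auc(scores, labels):
    """Empirical AUC via the rank-sum (Mann-Whitney) identity:
    the probability that a randomly chosen positive case scores
    higher than a randomly chosen negative case (ties count 0.5)."""
    scores = np.asarray(scores, dtype=float)
    labels = np.asarray(labels, dtype=bool)
    pos, neg = scores[labels], scores[~labels]
    wins = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (wins + 0.5 * ties) / (len(pos) * len(neg))
```

Applied per metric with, e.g., the label "survived beyond 246 days", this yields the per-measurement AUC values that Fig. 4 compares.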
[0041] Figs. 5A-5G illustrate a likelihood of survival according to some embodiments of the current invention. As illustrated, the study found that the difference between the Global HIPP and the Object Count was indicative of a prognostic value regardless of the type of diseased tissue. More specifically, the study found that the greater the difference between the Global HIPP and the Object Count, the fewer the days that the patient was likely to survive.
[0042] FIG. 6 illustrates an example of a computer system 600 that may be configured to practice an embodiment of the invention. For example, computer system 600 may be used to implement client 610, service provider 650, target environment 660, etc. Computer system 600 may include processor 620, memory 670, storage device 640, input device 610, output device 660, and network interface 680. Processor 620 may include logic configured to execute computer-executable instructions that implement embodiments of the invention. An example of a processor that may be used with the invention includes the Pentium® processor, Core i7® processor, or Xeon® processor all available from Intel Corporation, Santa Clara, California. The instructions may reside in memory 670 and may include instructions associated with TCE.
[0043] Memory 670 may be a computer-readable medium that may be configured to store instructions configured to implement embodiments of the invention. Memory 670 may be a primary storage accessible to processor 620 and can include a random-access memory (RAM) that may include RAM devices, such as, for example, Dynamic RAM (DRAM) devices, flash memory devices, Static RAM (SRAM) devices, etc. Storage device 640 may include a magnetic disk and/or optical disk and its corresponding drive for storing information and/or instructions. Memory 670 and/or storage device 640 may store class definitions.
[0044] Interconnect 650 may include logic that operatively couples components of computer system 600 together. For example, interconnect 650 may allow components to communicate with each other, may provide power to components of computer system 600, etc. In an embodiment of computer system 600, interconnect 650 may be implemented as a bus.
[0045] Input device 610 may include logic configured to receive information for computer system 600 from, e.g., a user. Embodiments of input device 610 may include keyboards, touch sensitive displays, biometric sensing devices, computer mice, trackballs, pen-based pointing devices, etc. Output device 660 may include logic configured to output information from the computer system. Embodiments of output device 660 may include cathode ray tubes (CRTs), plasma displays, light-emitting diode (LED) displays, liquid crystal displays (LCDs), printers, vacuum fluorescent displays (VFDs), surface-conduction electron-emitter displays (SEDs), field emission displays (FEDs), etc.
[0046] Network interface 680 may include logic configured to interface computer system 600 with a network, e.g., network 640, and may enable computer system 600 to exchange information with other entities connected to the network, such as, for example, a service provider, target environment and cluster. Network interface 680 may be implemented as a built-in network adapter, network interface card (NIC), Personal Computer Memory Card International Association (PCMCIA) network card, card bus network adapter, wireless network adapter, Universal Serial Bus (USB) network adapter, modem or any other device suitable for interfacing computer system 600 to any type of network.
[0047] It should be noted that embodiments may be implemented using some combination of hardware and/or software. It should be further noted that a computer-readable medium that includes computer-executable instructions for execution in a processor may be configured to store embodiments of the invention. The computer-readable medium may include volatile memories, non-volatile memories, flash memories, removable discs, non-removable discs and so on. In addition, it should be noted that various electromagnetic signals such as wireless signals, electrical signals carried over a wire, optical signals carried over optical fiber and the like may be encoded to carry computer-executable instructions and/or computer data on e.g., a communication network for an embodiment of the invention.
[0048] A hardware unit of execution may include a device (e.g., a hardware resource) that performs and/or participates in parallel programming activities. For example, a hardware unit of execution may perform and/or participate in parallel programming activities in response to a request and/or a task it has received (e.g., received directly or via a proxy). A hardware unit of execution may perform and/or participate in substantially any type of parallel programming (e.g., task, data, stream processing, etc.) using one or more devices. For example, in one implementation, a hardware unit of execution may include a single processing device that includes multiple cores, and in another implementation, the hardware unit of execution may include a number of processors 620. A hardware unit of execution may also be a programmable device, such as a field programmable gate array (FPGA), an application specific integrated circuit (ASIC), a digital signal processor (DSP), etc. Devices used in a hardware unit of execution may be arranged in substantially any configuration (or topology), such as a grid, ring, star, etc. A hardware unit of execution may support one or more threads (or processes) when performing processing operations.
[0049] Although the foregoing description is directed to the preferred embodiments of the invention, it is noted that other variations and modifications will be apparent to those skilled in the art, and may be made without departing from the spirit or scope of the invention. Moreover, features described in connection with one embodiment of the invention may be used in conjunction with other embodiments, even if not explicitly stated above.

Claims

WE CLAIM:
1. A computer-implemented method for image processing, the method comprising:
receiving an image comprising a plurality of image voxels representing a plurality of tissue regions of a subject;
determining a first set of quantitatively derived objects from the image;
for each of the quantitatively derived objects, eroding the object by removing a plurality of least contributory-valued edge voxels until either a single voxel is remaining in the object or the object is separated into multiple objects; and
dilating resulting objects using eroded voxels to produce a second plurality of tissue regions of the subject.
2. The computer-implemented method of claim 1, wherein the eroding of the object by removing least contributory-valued edge voxels comprises:
identifying the edge voxels of the object;
removing a least contributory-valued voxel remaining in the edge voxels; and placing the removed least contributory-valued voxel in a buffer.
3. The computer-implemented method of claim 2, wherein the least contributory-valued voxel is a lowest-valued voxel or a highest-valued voxel.
4. The computer-implemented method of claim 1, wherein the dilating of the resulting objects further comprises:
for each resulting object of the resulting objects and in order of rank: reacquire an eroded voxel from the eroded voxels, wherein the reacquired eroded voxel is a most contributory-valued voxel remaining of the eroded voxels that are on a surface of the object, and
removing the eroded voxel from the eroded voxels; and
repeating the reacquiring and removing for each resulting object and in order of rank until all the eroded voxels are reacquired.
5. The computer-implemented method of claim 4, wherein the rank is at least one of a mean voxel intensity of the object, a highest-valued voxel of the object, or a lowest-valued voxel of the object.
6. The computer-implemented method of claim 1, further comprising:
classifying the second plurality of tissue regions.
7. The computer-implemented method of claim 1, further comprising:
creating an index value based on a number of the second plurality of tissue regions.
8. The computer-implemented method of claim 1, further comprising:
creating an index value by normalizing a number of the second plurality of tissue regions with a number of the plurality of tissue regions.
9. The computer-implemented method of claim 1, wherein spatial coordinate mappings of the eroded voxels are maintained.
10. The computer-implemented method of claim 1, wherein the determining the first set of quantitatively derived objects from the image further comprises: grouping continuous voxels that meet a threshold into objects.
11. The computer-implemented method of claim 10, wherein the threshold is a voxel intensity of about 70 or greater.
12. The computer-implemented method of claim 1, additionally comprising:
determining a prognostic value based on the resulting objects.
13. A system for image processing, the system comprising:
a memory storing an image comprising a plurality of image voxels representing a plurality of tissue regions of a subject; and
a processor coupled to the memory, the processor being configured to:
receive the image;
determine a first set of quantitatively derived objects from the image;
for each of the quantitatively derived objects, erode the object by removing a plurality of least contributory-valued edge voxels until either there is a single voxel remaining in the object or the object is separated into multiple objects; and
dilate resulting objects using eroded voxels to produce a second plurality of tissue regions of the subject.
14. The system of claim 13, wherein the eroding of the object by removing least contributory-valued edge voxels comprises:
identifying the edge voxels of the object;
removing a least contributory-valued voxel remaining in the edge voxels; and
placing the removed least contributory-valued voxel in a buffer.
15. The system of claim 14, wherein the least contributory-valued voxel is a lowest-valued voxel or a highest-valued voxel.
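A minimal sketch of the erosion loop of claims 14-15 follows. It is the editor's illustration, not the filed implementation: "least contributory-valued" is taken as lowest-valued (one of the two options claim 15 allows), the helper names are hypothetical, and the 1-D neighborhood in the test is for brevity only.

```python
def components(obj, neighbors):
    """Split a voxel set into its connected components."""
    remaining, comps = set(obj), []
    while remaining:
        seed = remaining.pop()
        comp, frontier = {seed}, [seed]
        while frontier:
            c = frontier.pop()
            for n in neighbors(c):
                if n in remaining:
                    remaining.remove(n)
                    comp.add(n)
                    frontier.append(n)
        comps.append(comp)
    return comps

def erode_object(obj, img, neighbors):
    """Claims 14-15: repeatedly remove the least contributory-valued edge voxel
    into a buffer until one voxel remains or the object separates."""
    buffer = []
    while len(obj) > 1:
        # edge voxels have at least one neighbor outside the object
        edge = [c for c in obj if any(n not in obj for n in neighbors(c))]
        least = min(edge, key=lambda c: img[c])  # lowest-valued edge voxel
        obj = obj - {least}
        buffer.append(least)  # spatial coordinates preserved (claim 21)
        if len(components(obj, neighbors)) > 1:
            break  # object separated into multiple objects
    return obj, buffer
```

The returned buffer preserves removal order and coordinates, which is what the dilation step later consumes when reacquiring voxels.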
16. The system of claim 13, wherein the dilating of the resulting objects using eroded voxels to produce the second plurality of tissue regions of the subject further comprises:
for each resulting object of the resulting objects and in order of rank:
reacquiring an eroded voxel from the eroded voxels, wherein the reacquired eroded voxel is a most contributory-valued voxel remaining of the eroded voxels that are on a surface of the object, and
removing the eroded voxel from the eroded voxels; and
repeating the reacquiring and removing for each resulting object and in order of rank until all the eroded voxels are reacquired.
17. The system of claim 16, wherein the rank is at least one of a mean voxel intensity of the object, a highest-valued voxel of the object, or a lowest-valued voxel of the object.
18. The system of claim 13, further comprising:
classifying the second plurality of tissue regions.
19. The system of claim 13, further comprising:
creating an index value based on a number of the second plurality of tissue regions.
20. The system of claim 13, further comprising:
creating an index value by normalizing a number of the second plurality of tissue regions with a number of the plurality of tissue regions.
21. The system of claim 13, wherein spatial coordinate mappings of the eroded voxels are maintained.
22. The system of claim 13, wherein the determining the first set of quantitatively derived objects from the image further comprises:
grouping continuous voxels that meet a threshold into objects.
23. The system of claim 22, wherein the threshold is a voxel intensity of about 70 or greater.
24. The system of claim 13, additionally comprising:
determining a prognostic value based on the resulting objects.
25. A non-transitory computer readable storage medium comprising instructions that, when executed, enable a computing system to:
receive an image comprising a plurality of image voxels representing a plurality of tissue regions of a subject;
determine a first set of quantitatively derived objects from the image;
for each of the quantitatively derived objects, erode the object by removing a plurality of least contributory-valued edge voxels until either there is a single voxel remaining in the object or the object is separated into multiple objects; and
dilate resulting objects using eroded voxels to produce a second plurality of tissue regions of the subject.
26. The non-transitory computer readable storage medium of claim 25, wherein the eroding of the object by removing least contributory-valued edge voxels comprises:
identifying the edge voxels of the object;
removing a least contributory-valued voxel remaining in the edge voxels; and
placing the removed least contributory-valued voxel in a buffer.
27. The non-transitory computer readable storage medium of claim 25, wherein the least contributory-valued voxel is a lowest-valued voxel or a highest-valued voxel.
28. The non-transitory computer readable storage medium of claim 25, wherein the dilating of the resulting objects using eroded voxels to produce the second plurality of tissue regions of the subject further comprises:
for each resulting object of the resulting objects and in order of rank:
reacquiring an eroded voxel from the eroded voxels, wherein the reacquired eroded voxel is a most contributory-valued voxel remaining of the eroded voxels that are on a surface of the object, and
removing the eroded voxel from the eroded voxels; and
repeating the reacquiring and removing for each resulting object and in order of rank until all the eroded voxels are reacquired.
29. The non-transitory computer readable storage medium of claim 28, wherein the rank is at least one of a mean voxel intensity of the object, a highest-valued voxel of the object, or a lowest-valued voxel of the object.
30. The non-transitory computer readable storage medium of claim 25, further comprising:
classifying the second plurality of tissue regions.
31. The non-transitory computer readable storage medium of claim 25, further comprising:
creating an index value based on a number of the second plurality of tissue regions.
32. The non-transitory computer readable storage medium of claim 25, further comprising:
creating an index value by normalizing a number of the second plurality of tissue regions with a number of the plurality of tissue regions.
33. The non-transitory computer readable storage medium of claim 25, wherein spatial coordinate mappings of the eroded voxels are maintained.
34. The non-transitory computer readable storage medium of claim 25, wherein the determining the first set of quantitatively derived objects from the image further comprises:
grouping continuous voxels that meet a threshold into objects.
35. The non-transitory computer readable storage medium of claim 34, wherein the threshold is a voxel intensity of about 70 or greater.
36. The non-transitory computer readable storage medium of claim 25, further comprising:
determining a prognostic value based on the resulting objects.
PCT/US2014/065073 2013-11-11 2014-11-11 Methods and systems for auto-segmenting quantitatively derived disease objects in digital medical images WO2015070243A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201361902670P 2013-11-11 2013-11-11
US61/902,670 2013-11-11

Publications (1)

Publication Number Publication Date
WO2015070243A1 true WO2015070243A1 (en) 2015-05-14

Family

ID=53042251

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2014/065073 WO2015070243A1 (en) 2013-11-11 2014-11-11 Methods and systems for auto-segmenting quantitatively derived disease objects in digital medical images

Country Status (1)

Country Link
WO (1) WO2015070243A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7643663B2 (en) * 2003-04-04 2010-01-05 Koninklijke Philips Electronics N.V. Volume measurement in 3D datasets
US20120201445A1 (en) * 2011-02-08 2012-08-09 University Of Louisville Research Foundation, Inc. Computer aided diagnostic system incorporating appearance analysis for diagnosing malignant lung nodules
US20130129168A1 (en) * 2011-11-23 2013-05-23 The Regents Of The University Of Michigan Voxel-Based Approach for Disease Detection and Evolution

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
BAI ET AL.: "Tumor quantification in clinical positron emission tomography", THERANOSTICS, vol. 3, no. 10, 2013, pages 787 - 801 *
ZAIDI ET AL.: "PET-guided delineation of radiation therapy treatment volumes: a survey of image segmentation techniques", EUROPEAN JOURNAL OF NUCLEAR MEDICINE AND MOLECULAR IMAGING, vol. 37, no. 11, 2010, pages 2165 - 2187 *

Similar Documents

Publication Publication Date Title
Kickingereder et al. Automated quantitative tumour response assessment of MRI in neuro-oncology with artificial neural networks: a multicentre, retrospective study
Lou et al. An image-based deep learning framework for individualising radiotherapy dose: a retrospective analysis of outcome prediction
US10282588B2 (en) Image-based tumor phenotyping with machine learning from synthetic data
US10575774B2 (en) Predicting immunotherapy response in non-small cell lung cancer with serial radiomics
Tunali et al. Application of radiomics and artificial intelligence for lung cancer precision medicine
US20230148980A1 (en) Systems and methods for automated and interactive analysis of bone scan images for detection of metastases
US9171369B2 (en) Computer-aided detection (CAD) system for personalized disease detection, assessment, and tracking, in medical imaging based on user selectable criteria
Zhao et al. Evaluating variability in tumor measurements from same-day repeat CT scans of patients with non–small cell lung cancer
Gardin et al. Radiomics: principles and radiotherapy applications
Lambin et al. Predicting outcomes in radiation oncology—multifactorial decision support systems
US7606405B2 (en) Dynamic tumor diagnostic and treatment system
EP2510468B1 (en) Diagnostic techniques for continuous storage and joint analysis of both image and non-image medical data
US20130329973A1 (en) Subvolume identification for prediction of treatment outcome
Desbordes et al. Predictive value of initial FDG-PET features for treatment response and survival in esophageal cancer patients treated with chemo-radiation therapy using a random forest classifier
US20110124976A1 (en) Model enhanced imaging
Astaraki et al. Early survival prediction in non-small cell lung cancer from PET/CT images using an intra-tumor partitioning method
US20100142774A1 (en) method, a system, and an apparatus for using and processing multidimensional data
AU2019449346B2 (en) Radiotherapy plan parameters with privacy guarantees
US11090508B2 (en) System and method for biological treatment planning and decision support
Mu et al. 18F-FDG PET/CT habitat radiomics predicts outcome of patients with cervical cancer treated with chemoradiotherapy
JP2019518288A (en) Change detection in medical image
WO2006119340A2 (en) Dynamic tumor diagnostic and treatment system
El‐Galaly et al. Pre‐treatment total metabolic tumour volumes in lymphoma: Does quantity matter?
US11574404B2 (en) Predicting recurrence and overall survival using radiomic features correlated with PD-L1 expression in early stage non-small cell lung cancer (ES-NSCLC)
Compter et al. Deciphering the glioblastoma phenotype by computed tomography radiomics

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14859455

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 14859455

Country of ref document: EP

Kind code of ref document: A1