WO2022043143A1 - System and method for providing near real-time two dimensional projection images - Google Patents

System and method for providing near real-time two dimensional projection images Download PDF

Info

Publication number
WO2022043143A1
WO2022043143A1 · PCT/EP2021/072875 · EP2021072875W
Authority
WO
WIPO (PCT)
Prior art keywords
image
image space
dimensional
seed
current
Prior art date
Application number
PCT/EP2021/072875
Other languages
French (fr)
Inventor
Charles CARMAN
Original Assignee
Koninklijke Philips N.V.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Koninklijke Philips N.V. filed Critical Koninklijke Philips N.V.
Publication of WO2022043143A1 publication Critical patent/WO2022043143A1/en

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B6/00 Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B6/02 Arrangements for diagnosis sequentially in different planes; Stereoscopic radiation diagnosis
    • A61B6/03 Computed tomography [CT]
    • A61B6/037 Emission tomography
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B6/00 Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B6/04 Positioning of patients; Tiltable beds or the like
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B6/00 Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B6/46 Arrangements for interfacing with the operator or the patient
    • A61B6/461 Displaying means of special interest
    • A61B6/466 Displaying means of special interest adapted to display 3D data
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B6/00 Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B6/52 Devices using data or image processing specially adapted for radiation diagnosis
    • A61B6/5205 Devices using data or image processing specially adapted for radiation diagnosis involving processing of raw data to produce diagnostic data
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/00 2D [Two Dimensional] image generation
    • G06T11/003 Reconstruction from projections, e.g. tomography
    • G06T11/006 Inverse problem, transformation from projection-space into object-space, e.g. transform methods, back-projection, algebraic methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2211/00 Image generation
    • G06T2211/40 Computed tomography
    • G06T2211/424 Iterative

Definitions

  • This invention pertains to imaging systems, and in particular systems and methods of providing near real-time two dimensional projection images from three-dimensional imaging systems.
  • PET positron emission tomography
  • anatomical imaging modality such as computed tomography (CT) or magnetic resonance imaging (MRI)
  • CT computed tomography
  • MRI magnetic resonance imaging
  • sinograms comprise histograms of detected events which may be sorted by the angle of each view and tilt (for 3D images) and grouped into projection images.
  • Such sinogram images are analogous to the projections captured by CT scanners, and can be reconstructed in a similar way.
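  • To make the histogram structure concrete, below is a minimal sketch, in Python with NumPy, of binning detected events into a sinogram. The bin counts and offset range are illustrative assumptions, not values from the patent.

```python
import numpy as np

def bin_events_to_sinogram(view_angles_rad, radial_offsets_mm,
                           n_angle_bins=180, n_offset_bins=128,
                           offset_range_mm=(-300.0, 300.0)):
    """Histogram detected events by view angle and radial offset."""
    sinogram, _, _ = np.histogram2d(
        view_angles_rad, radial_offsets_mm,
        bins=(n_angle_bins, n_offset_bins),
        range=((0.0, np.pi), offset_range_mm))
    return sinogram  # rows: view angles, columns: radial offsets
```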
  • SPECT and PET systems reconstruct the 3-D patient volume images using a variety of algorithms, the most common of which are iterative.
  • the raw data already may be binned into sinograms, which can be reformatted to create 2-D projection views, but these views do not make use of all of the detected events, and thus take a relatively long time to acquire / refresh.
  • Use of sinograms is generally not available for 3-D SPECT-only systems which do not include an anatomical imaging modality.
  • typical combination PET/CT systems do not have a means for properly positioning a patient, or planning a scan, to ensure that a region of interest is within the image space prior to starting the scan, without exposing the patient to x-rays as part of a scout (2-D planar projection) acquisition.
  • ring-based SPECT-only systems that are not integrated with a CT system typically do not have a natural, near real-time display/view for positioning a patient to ensure that a region of interest is within the image space prior to starting the scan. Accordingly, patient positioning for ring-based SPECT-only systems that are not integrated with a CT system must be done using external patient and system landmarks/aids, or, again, with relatively lengthy image acquisitions to check the patient position.
  • a system comprises: an imaging apparatus, a display device, and a processor.
  • the imaging apparatus is configured to detect image-related events in three dimensions from a subject, but lacks an anatomical reference for the subject.
  • the processor is configured to: establish a seed three-dimensional image for an image space comprising at least a portion of the subject, as a current three-dimensional image of the image space, and seed projection data for the image space as current projection data for the image space; and perform a series of first iterations for a corresponding series of epochs.
  • the processor receives, from the imaging apparatus, new event data corresponding to newly detected image-related events from the subject; updates the current projection data for the image space based on the new event data; and performs a series of second iterations within each epoch.
  • Each second iteration includes: determining computed projection data for the image space from the current three-dimensional image for the image space; determining a difference between the computed projection data for the image space and the current projection data for the image space; determining a quality metric for the computed projection data for the image space, based on the difference; and comparing the quality metric to a threshold.
  • When the quality metric is greater than the threshold: the difference is backprojected to the current three-dimensional image for the image space; the current three-dimensional image for the image space is updated based on the backprojection; and the series of second iterations is continued. When the quality metric is less than or equal to the threshold, the series of second iterations is ended.
  • the processor outputs the current three-dimensional image for the image space; computes a two-dimensional projection of the image space from the current three-dimensional image for the image space; and provides to the display device the two-dimensional projection of the image space.
  • the processor continues the series of first iterations until the processor receives a request to stop.
  • the processor is further configured to: compute a second two-dimensional projection of the image space from the current three-dimensional image for the image space, and display the second two-dimensional projection of the image space.
  • the processor is further configured to: receive from a user an indication of a period corresponding to a specified number of epochs for periodically resetting the event data, the seed three-dimensional image, and the seed projection data; and reset the event data, the seed three-dimensional image, and the seed projection data periodically for each period.
  • the processor is further configured to: receive from a user an indication of a decay time constant for exponentially decaying the event data, the seed three-dimensional image, and the seed projection data; and exponentially decay the event data, the seed three-dimensional image, and the seed projection data at a decay rate corresponding to the decay time constant.
  • the imaging apparatus comprises a three-dimensional (3-D) single photon emission computed tomography (SPECT) imaging apparatus.
  • 3-D three-dimensional single photon emission computed tomography
  • the imaging apparatus includes a Positron Emission Tomography (PET) imaging apparatus.
  • a method comprises: establishing a seed three-dimensional image for a subject as a current three-dimensional image for the image space, and seed projection data for the image space as current projection data for the image space; and performing a series of iterations for a corresponding series of epochs.
  • Each iteration includes: receiving new event data corresponding to newly detected image-related events generated from the subject; updating the current projection data for the image space based on the new event data; performing an iterative reconstruction of the current three-dimensional image for the image space using the updated current projection data for the image space; computing a two-dimensional projection of the image space from the reconstructed current three-dimensional image for the image space; and displaying the two-dimensional projection of the image space.
  • the method further comprises: computing a second two-dimensional projection of the image space from the current three-dimensional image for the image space; and displaying the second two-dimensional projection of the image space.
  • performing an iterative reconstruction of the current three-dimensional image for the image space using the updated current projection data for the image space comprises performing a series of second iterations within each epoch, wherein each second iteration comprises: determining computed projection data for the image space from the current three-dimensional image for the image space, determining a difference between the computed projection data for the image space and the current projection data for the image space, determining a quality metric for the computed projection data for the image space, based on the difference; and comparing the quality metric to a threshold.
  • the quality metric is greater than the threshold: the difference is backprojected to the current three-dimensional image for the image space; the current three-dimensional image for the image space is updated based on the backprojection; and the series of second iterations is continued.
  • the quality metric is less than or equal to the threshold, the iterative reconstruction is stopped.
  • the method further comprises: receiving from a user an indication of a period corresponding to a specified number of epochs for periodically resetting the event data, the seed three-dimensional image, and the seed projection data; and resetting the event data, the seed three-dimensional image, and the seed projection data periodically for each period.
  • the method further comprises: receiving from a user an indication of a decay time constant for exponentially decaying the event data, the seed three-dimensional image, and the seed projection data; and exponentially decaying the event data, the seed three-dimensional image, and the seed projection data at a decay rate corresponding to the decay time constant.
  • the new event data is received from an imaging apparatus which lacks an anatomical reference for the subject.
  • the new event data is received from a three-dimensional (3-D) single photon emission computed tomography (SPECT) imaging apparatus.
  • the new event data is received from a Positron Emission Tomography (PET) imaging apparatus.
  • a system comprises: an imaging apparatus, wherein the imaging apparatus is configured to detect image-related events in three dimensions from a subject, and wherein the imaging apparatus lacks an anatomical reference for the subject; and a processor.
  • the processor is configured to: establish a seed three-dimensional image of an image space comprising at least a portion of the subject as a current three-dimensional image for the image space, and seed projection data for the image space as current projection data for the image space; and perform a series of first iterations for a corresponding series of epochs.
  • Each first iteration includes: receiving, from the imaging apparatus, new event data corresponding to newly detected image-related events; updating the current projection data for the image space based on the new event data; performing an iterative reconstruction of the current three-dimensional image for the image space using the updated current projection data for the image space; and computing a two-dimensional projection of the image space from the current three-dimensional image for the image space.
  • the processor is further configured to: compute a second two-dimensional projection of the image space from the current three-dimensional image for the image space; and display the second two-dimensional projection of the image space.
  • the processor is further configured to: receive from a user an indication of a decay time constant for exponentially decaying the event data, the seed three-dimensional image, and the seed projection data; and exponentially decay the event data, the seed three-dimensional image, and the seed projection data at a decay rate corresponding to the decay time constant.
  • the processor is further configured to: receive from a user an indication of a period corresponding to a specified number of epochs for periodically resetting the event data, the seed three-dimensional image, and the seed projection data; and reset the event data, the seed three-dimensional image, and the seed projection data periodically for each period.
  • the imaging apparatus comprises a three-dimensional (3-D) single photon emission computed tomography (SPECT) imaging apparatus.
  • the imaging apparatus includes a Positron Emission Tomography (PET) imaging apparatus.
  • FIG. 1 illustrates a high level functional block diagram of one example of an imaging system, in particular a single photon emission computed tomography (SPECT) imaging system.
  • SPECT single photon emission computed tomography
  • FIG. 2 illustrates a block diagram of another example of an imaging system, in particular a combination positron emission tomography (PET) / computed tomography (CT) imaging system.
  • FIG. 3 illustrates an example of assignments of imaging system components between positron emission tomography (PET) functionality and computed tomography (CT) functionality in a combined PET/CT system.
  • FIG. 4 is a block diagram illustrating an example embodiment of a processor and associated memory according to embodiments of the disclosure.
  • FIG. 5 illustrates a flowchart of a method of providing near real-time two dimensional projection images from data obtained by a three-dimensional imaging apparatus.
  • FIG. 6 illustrates a flowchart of a method of iterative reconstruction.
  • FIG. 1 illustrates a high level functional block diagram of one example of an imaging system, in particular a SPECT imaging system 10.
  • SPECT imaging system 10 is designed to produce useful images (e.g., three-dimensional images) of a region of interest 20 of a patient or subject 14, using suitable detector components (such as pin-hole gamma cameras or collimated scintillating detectors) as described in detail below.
  • SPECT imaging system 10 includes a scanner 16, a patient support 18, system control and processing circuitry 28 and an operator interface 40.
  • Support 18 may be movable with respect to scanner 16 to allow for imaging of different tissues or anatomies in a region of interest 20 within subject 14.
  • a radioisotope such as a radiopharmaceutical substance (sometimes referred to as a radiotracer), is administered to subject 14, and may be bound or taken up by particular tissues or organs in region of interest 20.
  • Typical radioisotopes include various radioactive forms of elements that emit gamma radiation during decay.
  • Various additional substances may be selectively combined with such radioisotopes to target specific areas or tissues in region of interest 20 of the body of subject 14.
  • SPECT imaging system 10 produces three-dimensional volume images of an image space in response to detection of image-related events.
  • detection of an image-related event may be detection of an individual photon from the decay of an element of the radioisotope which has been administered to subject 14.
  • Gamma radiation emitted by the radioisotope is detected by a detector component 22, such as a digital detector or gamma cameras.
  • the detector structure(s) 22 may be positioned about subject 14, such as in an arc or ring about subject 14, or may be attached to a positioner (e.g., a C-arm, gantry, or other movable arm) that allows the detector structure(s) 22 to be moved in such an arc or orbit about the patient during data acquisition to produce a three-dimensional image of region of interest 20.
  • the detector structure(s) 22 typically include one or more components or elements capable of sensing gamma radiation or otherwise generating a detectable signal in response to such radiation.
  • the detector structures comprise one or more collimators and a scintillator, together represented generally as reference numeral 24.
  • the collimator may be formed from parallel or non-parallel elements that allow gamma radiation emitted only in certain directions to impact the detecting components.
  • the scintillator may be made of a crystalline material, such as sodium iodide (NaI), that converts the received gamma radiation to lower-energy light (e.g., in the ultraviolet range).
  • Photomultiplier tubes 26 then receive this light and generate event data corresponding to photons impacting specific discrete picture element (pixel) regions.
  • the detector structure 22 may not be collimated but may instead use other gamma radiation sensing technologies, such as one or more pin-hole gamma cameras, as also discussed herein.
  • system control and processing circuitry 28 may include a number of physical and/or software components that cooperate to allow the collection and processing of image data to create the desired images.
  • system control and processing circuitry 28 may include raw data processing circuitry 30 that initially receives the data from the detector structure(s) 22, and that may perform various filtering, value adjustments, and so forth.
  • Processing circuitry 32 allows for the overall control of imaging system 10, and for manipulation and/or reconstruction of image data 12. Processing circuitry 32 may also perform calibration functions, correction functions, and so forth on the data. Processing circuitry 32 may also perform image reconstruction functions, such as based on various algorithms (e.g., back projection, iterative reconstruction, and so forth). Such functions may also be performed in post-processing on local or remote equipment.
  • the various image reconstruction and artifact correction algorithms discussed herein may be implemented in part or in their entirety using one or both of raw data processing circuitry 30 and/or processing circuitry 32.
  • processing circuitry 32 interacts with control circuitry/interface 34 that allows for control of the scanner and its components, including patient support 18, a camera, and so forth.
  • processing circuitry 32 may be supported by various circuits, such as memory circuitry 36 that may be used to store image data, calibration or correction values, routines performed by processing circuitry 32 (such as the motion artifact correction algorithms disclosed herein), and so forth.
  • processing circuitry 32 executes one or more iterative reconstruction algorithms that may utilize approaches for creating near real-time two dimensional projection images for facilitating positioning of subject 14 and/or patient support 18, as discussed herein. Such iterative reconstruction approaches may generally utilize iterated comparisons between expected or reference images and observed or measured event data. In such an iterative reconstruction approach, the convergence process or loop may be repeated or iterated until some completion criterion is met, such as minimization of a cost function.
  • processing circuitry 32 may interact with interface circuitry 38, designed to support operator interface 40.
  • Operator interface 40 allows for imaging sequences to be commanded, scanner and system settings to be viewed and adjusted, images to be viewed, and so forth.
  • reconstructed images 12 may be viewed by an operator via display 42.
  • the imaging system 10 may be coupled to one or more networks to allow for the transfer of scheduling data to the imaging system, as well as to permit transmission and storage of image data and processed images.
  • networks for example, local area networks, wide area networks, wireless networks, and so forth may allow for storage of image data on radiology department information systems and/or on hospital information systems.
  • Such network connections further allow for transmission of image data to remote post-processing systems, physician offices, and so forth.
  • a detector may be collimated with an arrangement of parallel structures such that the resulting acquisition of gamma rays is not divergent, for example with a collimated detector assembly or collimated camera.
  • the employed collimation is non-parallel such as a pinhole collimator, fan-beam collimator, or cone-beam collimator.
  • SPECT system 10 may not have an anatomical reference for subject 14 to facilitate this positioning. Accordingly, described below with respect to FIGs. 5 and 6 are methods of providing near real-time two dimensional projection images from three-dimensional imaging systems, such as SPECT system 10, which otherwise do not provide such near real-time two dimensional projection images.
  • Such images may be provided to and viewed by an operator via display 42 and operator interface 40, allowing the operator to ensure that subject 14 is properly positioned for an imaging session prior to acquiring image data.
  • the operator may provide input via operator interface 40 to cause system control and processing circuitry 28 to move patient support 18 and/or scanner 16 to ensure that a region of interest is within the image space of scanner 16 prior to starting the scan of subject 14.
  • FIG. 2 illustrates a block diagram of another example of an imaging system 2000.
  • Imaging system 2000 includes as an imaging apparatus a positron emission tomography (PET) subsystem in combination with an X-ray computed tomography (CT) subsystem.
  • the imaging apparatus is configured to detect image-related events for a subject 14 in three dimensions, but the imaging apparatus lacks an anatomical reference for subject 14.
  • detection of an image-related event may be detection of a photon pair from the decay of an element of the radioisotope which has been administered to subject 14.
  • Imaging system 2000 includes: gantry 15; patient support 18; gamma ray detector/sensor 2022; first digitization and readout electronics 2030; X-ray source 2120; X-ray detector/sensor 2122; second digitization and readout electronics 2130; processing system 2028; user interface 2040 and display 2042.
  • Processing system 2028 includes modules for processing data from first digitization and readout electronics 2030 and second digitization and readout electronics 2130, and application packages for utilizing reconstructed image data. Also illustrated in dashed lines in FIG. 2 is a radiopharm source 2020. Radiopharm source 2020 is not strictly part of PET/CT imaging system 2000, but it is essential for creating useful images.
  • combined PET/CT imaging system 2000 uses radioactive materials (also known as a tracer or radio-tracer) for imaging, and is generally categorized within the field of nuclear medicine (NM).
  • Subject 14 resides on patient support 18, and one or more detectors 2022 may be provided in gantry 15 for detecting image-related events which can be processed to obtain a three-dimensional volume image of region of interest 20 in subject 14.
  • a radioisotope such as a radiopharmaceutical substance (sometimes referred to as a radiotracer), is administered to subject 14, which gets trapped within the tissues in region of interest 20.
  • the unstable nuclei of radioisotopes within subject 14 emit positrons, which combine with neighboring electrons to produce a pair of gamma rays moving at 180 degrees to each other.
  • gamma rays are detected by detector(s) 2022 disposed within the donut-shaped body of gantry 15. The energy and location of these gamma rays are recorded; preprocessing and sinogram binning are performed; and the binned data are used by processing system 2028 to reconstruct three-dimensional (3D) images of tracer concentration within region of interest 20 in the body of subject 14, using algorithms as are known in the art.
  • the PET subsystem may share components with the CT subsystem.
  • FIG. 3 illustrates an example of assignments of imaging system components between positron emission tomography (PET) functionality 3200 and computed tomography (CT) functionality 3100 in a combined PET/CT system, such as PET/CT imaging system 2000.
  • CT modality 3100 and PET modality 3200 share patient support 18, gantry 15, a console subsystem 3400 supporting a graphical user interface (GUI) for an operator (e.g., user interface 2040 and display 2042 of FIG. 2), and a service subsystem supporting software applications used by service personnel for installation, configuration, repair, and maintenance of combined PET/CT imaging system 2000.
  • GUI graphical user interface
  • CT modality 3100 employs a Data Measurement System (DMS) 3130 (e.g., detector/sensor 2122 and readout/digitization electronics 2130 of FIG. 2) to acquire CT data (e.g., from an X-ray source and corresponding detector), and Common Image Reconstruction System (CIRS) 3132 to reconstruct CT image data.
  • DMS Data Measurement System
  • CIRS Common Image Reconstruction System
  • PET modality 3200 includes a PET Detector and Acquisition Subsystem to acquire PET data (e.g., detector/sensor 2022, readout/digitization electronics 2030 and the preprocessing and sinogram binning module of FIG. 2), and a Nuclear Medicine Reconstruction System (NMRS) subsystem 3132 to reconstruct PET image data.
  • NMRS Nuclear Medicine Reconstruction System
  • imaging system 2000 may not have an anatomical reference for subject 14 to facilitate this positioning. Accordingly, described below with respect to FIGs. 5 and 6 are methods of providing near real-time two dimensional projection images from three-dimensional imaging systems, such as imaging system 2000, which otherwise do not provide such near real-time two dimensional projection images.
  • Such images may be provided to and viewed by an operator via display 2042 and user interface 2040, allowing the operator to ensure that subject 14 is properly positioned for an imaging session prior to acquiring image data.
  • the operator may provide input via user interface 2040 to cause processing system 2028 to move patient support 18 and/or gantry 15 to ensure that a region of interest is within the image space of imaging system 2000 prior to starting the scan of subject 14.
  • FIG. 4 is a block diagram illustrating an example embodiment of a processing circuit 4000 according to embodiments of the disclosure.
  • Processing circuit 4000 includes processor 400 and associated memory circuit 450.
  • Processing circuit 4000 may be used to implement one or more processors described herein, for example, system control and processing circuitry 28 in FIG. 1, or processing system 2028 in FIG. 2.
  • system control and processing circuitry 28 in FIG. 1, or processing system 2028 in FIG. 2, may include more than one processor 400, and/or more than one memory circuit 450.
  • Processor 400 may be any suitable processor type including, but not limited to, a microprocessor, a microcontroller, a digital signal processor (DSP), a field programmable gate array (FPGA) where the FPGA has been programmed to form a processor, a graphical processing unit (GPU), an application specific integrated circuit (ASIC) where the ASIC has been designed to form a processor, or a combination thereof.
  • DSP digital signal processor
  • FPGA field programmable gate array
  • GPU graphical processing unit
  • ASIC application specific integrated circuit
  • Processor 400 may include one or more cores 402.
  • Core 402 may include one or more arithmetic logic units (ALU) 404.
  • core 402 may include a floating point logic unit (FPLU) 406 and/or a digital signal processing unit (DSPU) 408 in addition to or instead of ALU 404.
  • FPLU floating point logic unit
  • DSPU digital signal processing unit
  • Processor 400 may include one or more registers 412 communicatively coupled to core 402.
  • Registers 412 may be implemented using dedicated logic gate circuits (e.g., flip-flops) and/or any memory technology. In some embodiments registers 412 may be implemented using static memory. Registers 412 may provide data, instructions and addresses to core 402.
  • processor 400 may include one or more levels of cache memory 410 communicatively coupled to core 402.
  • Cache memory 410 may provide computer-readable instructions to core 402 for execution.
  • Cache memory 410 may provide data for processing by core 402.
  • the computer-readable instructions may have been provided to cache memory 410 by a local memory, for example, local memory attached to the external bus 416.
  • Cache memory 410 may be implemented with any suitable cache memory type, for example, metal-oxide semiconductor (MOS) memory such as static random access memory (SRAM), dynamic random access memory (DRAM), and/or any other suitable memory technology.
  • MOS metal-oxide semiconductor
  • Processor 400 may include a controller 414, which may control input to the processor 400 from other processors and/or components included in a system (e.g., patient support 18, scanner 16, gantry 15, readout and digitization electronics 2030/2130, etc.) and/or outputs from processor 400 to other processors and/or components included in the system (e.g., patient support 18, scanner 16, gantry 15, etc.). Controller 414 may control the data paths in the ALU 404, FPLU 406 and/or DSPU 408. Controller 414 may be implemented as one or more state machines, data paths and/or dedicated control logic. The gates of controller 414 may be implemented as standalone gates, FPGA, ASIC or any other suitable technology.
  • Registers 412 and cache 410 may communicate with controller 414 and core 402 via internal connections 420A, 420B, 420C and 420D.
  • Internal connections may be implemented as a bus, multiplexor, crossbar switch, and/or any other suitable connection technology.
  • Bus 416 may be communicatively coupled to one or more components of processor 400, for example controller 414, cache 410, and/or register 412. Bus 416 may be coupled to one or more components of the system, such as patient support 18, scanner 16, gantry 15, etc. mentioned previously.
  • the external memories may include Read Only Memory (ROM) 432.
  • ROM 432 may be a masked ROM, Erasable Programmable Read Only Memory (EPROM) or any other suitable technology.
  • the external memory(ies) may include Random Access Memory (RAM) 433.
  • RAM 433 may be a static RAM, battery backed up static RAM, Dynamic RAM (DRAM) or any other suitable technology.
  • the external memory(ies) may include Electrically Erasable Programmable Read Only Memory (EEPROM) 435.
  • the external memory(ies) may include Flash memory 434.
  • the external memory(ies) may include a magnetic storage device such as disc 436, and/or optical storage devices such as compact discs (CDs), digital versatile discs (DVDs), etc.
  • processing circuit 4000, and in particular processor 400 may control operations of, and process data obtained by, a SPECT imaging system, such as SPECT imaging system 10 illustrated in FIG. 1.
  • processing circuit 4000, and in particular processor 400 may control operations of, and process data obtained by, a combination PET/CT system, such as the combination PET/CT imaging system 2000 illustrated in FIG. 2.
  • Processing circuit 4000 may perform algorithms or operations described below with respect to FIGs. 5 and 6 to provide near real-time two dimensional projection images which may be displayed to an operator via a display device and used by the operator to properly position region of interest 20 in subject 14 to be within a desired location for proper imaging.
  • FIG. 5 illustrates a flowchart of a method 500 of providing near real-time two dimensional projection images from data obtained by a three-dimensional imaging apparatus, such as SPECT imaging system 10 or PET/CT imaging system 2000.
  • a processing system of the imaging system initializes volume imaging data and projection data for an image space for processing by the imaging system.
  • projection data may refer to a collection of input event data.
  • An initial input data space and a reconstruction volume are created by the imaging system (or, more precisely, by a processor of the imaging system).
  • At operation 520, it is determined whether to stop execution of method 500, for example in response to a command or instruction from a user, or after method 500 has executed for a predetermined period of time or number of cycles, which may be stored in memory. If it is determined in operation 520 that method 500 should not end, then the next of a series of first iterations comprising operations 530 through 560 is performed. The time period during which each of these first iterations occurs is referred to herein as an epoch. Thus execution of method 500 may occur over one or more epochs. Beneficially, each epoch lasts less than one second. In some embodiments, each epoch may be at least 0.1 second. A minimal sketch of this outer loop appears below.
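  • As a rough Python illustration of this outer loop, the sketch below strings operations 520 through 560 together. The callables (acquire_events, update_projections, iterative_reconstruction, project_2d, display, should_stop) are hypothetical stand-ins, not interfaces defined by the disclosure.

```python
def run_positioning_loop(acquire_events, update_projections,
                         iterative_reconstruction, project_2d, display,
                         seed_volume, seed_projections, should_stop,
                         epoch_seconds=0.5):
    """One pass through the loop body corresponds to one epoch."""
    volume, projections = seed_volume, seed_projections
    while not should_stop():                                        # operation 520
        new_events = acquire_events(epoch_seconds)                  # operation 530
        projections = update_projections(projections, new_events)   # operation 540
        volume = iterative_reconstruction(volume, projections)      # operation 550
        display(project_2d(volume))                                 # operation 560
    return volume
```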
  • the imaging system acquires more event data for the set of projection data.
  • the processor receives, from the imaging apparatus, new event data corresponding to newly detected image-related events from subject 14.
  • the imaging system updates the set of projection data.
  • the processor updates the current set of projection data for the image space based on the new event data.
  • iterative image reconstruction is performed using a series of second iterations. Typically, the reconstruction runs for a small number of second iterations (possibly as small as one).
  • An example embodiment of iterative image reconstruction operation 550 will be described below with respect to FIG. 6.
  • one or more two-dimensional projection(s) of the image space are computed from the current three-dimensional image for the image space.
  • one two-dimensional projection of subject 14 may lie in a plane defined by a craniocaudal (head-to-foot) direction of subject 14 and a lateral (left-to-right) direction of subject 14, sometimes referred to as a coronal or frontal plane.
  • Another two-dimensional projection of subject 14 may lie in a plane defined by a craniocaudal (head-to- foot) direction of subject 14 and an anterior/posterior (top-to-bottom) direction of subject 14, sometimes referred to as a sagittal plane.
  • Operation 560 may also include outputting the current three-dimensional image for the image space, computing the computed two-dimensional projection(s) of the image space from the current three-dimensional image for the image space, and providing to a display device the computed two-dimensional projection(s) of the image space.
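  • Below is a minimal sketch of computing such coronal and sagittal views from the current volume, assuming (as an illustration, not a requirement of the patent) a NumPy array indexed (craniocaudal, anterior-posterior, lateral). An integration (sum) projection is used, which, as noted later for low-count data, tends to be less noisy than a maximum intensity projection.

```python
import numpy as np

def coronal_and_sagittal_projections(volume):
    """Volume axes assumed: 0 = craniocaudal, 1 = anterior-posterior, 2 = lateral."""
    coronal = volume.sum(axis=1)   # collapse anterior-posterior axis: coronal/frontal view
    sagittal = volume.sum(axis=2)  # collapse lateral axis: sagittal view
    return coronal, sagittal
```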
  • FIG. 6 illustrates a flowchart of a method 600 of iterative reconstruction.
  • the method 600 may comprise one embodiment of operation 550 in method 500 of FIG. 5.
  • a three-dimensional reconstructed volume of the image space is initialized.
  • a processor computes a set of projection data for the image space from the current reconstructed three-dimensional volume.
  • a processor computes or otherwise determines a difference between the computed projection data for the image space, obtained in operation 620, and the current projection data for the image space.
  • a processor computes or otherwise determines a quality or cost metric for the computed projection data for the image space, based on the difference obtained in operation 630.
  • a processor determines whether the second iterations of method 600 are done. In particular, the processor compares the quality metric to a threshold, and when the quality metric is less than or equal to the threshold, ends the series of second iterations.
  • When the series of second iterations is done, the method proceeds to operation 660, which outputs an updated three-dimensional volume of the image space.
  • a processor backprojects the difference in the projection data for the image space to the current three-dimensional image for the image space.
  • the current three-dimensional image for the image space is updated in response to the backprojection of operation 670. Then the method returns to operation 620 for another second iteration.
  • Method 600 of FIG. 6 is an example of an iterative reconstruction algorithm.
  • An iterative reconstruction starts with an assumed image, computes projection data from the image, compares the computed projection data with the original projection data, and updates the image based upon the difference between the calculated and the actual projection data. Implied in this algorithm description are two collections of data: (1) the “assumed image” or 3D reconstructed volume; and (2) the “original projection data.” Also implied are two projection functions: (1) a forward projection, which creates projection data from the current 3D reconstructed volume; and (2) a backward projection, which takes the projection data differences back to the 3D reconstructed volume.
  • In addition to these data collections and projection functions, the iterative reconstruction of method 600 involves: (1) a comparison operation to calculate or determine a difference or differences between the original projection data and the projection data which are obtained from the 3D reconstruction volume; (2) a quality or cost metric calculated based on the difference(s), used to determine when to stop the (second) iterations of method 600; and (3) an operation to update or correct the 3D reconstruction volume using the backprojected difference(s), as sketched below.
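  • A minimal sketch of this inner loop, assuming generic forward_project and back_project operators, a simple sum-of-squared-differences cost, and an additive volume update; the disclosure does not fix these particular choices.

```python
import numpy as np

def reconstruct(volume, measured_projections, forward_project, back_project,
                threshold, step=1.0, max_iters=3):
    """Run a small number of second iterations (possibly as small as one)."""
    for _ in range(max_iters):
        computed = forward_project(volume)            # operation 620
        diff = measured_projections - computed        # operation 630
        quality = np.sum(diff ** 2)                   # operation 640 (cost metric)
        if quality <= threshold:                      # operation 650: done
            break
        volume = volume + step * back_project(diff)   # operations 670/680
    return volume                                     # operation 660
```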
  • the method 500 of FIG. 5 wraps the iterative reconstruction algorithm 600 of FIG. 6 within another (first) iterative loop that updates the acquired projection data as additional input is acquired from the imaging apparatus, and updates the displayed output images from the re-reconstructed 3D volume.
  • the goal / purpose of the “outer” loop [method 500] is to provide relatively quick 2D projection updates to the operator so they can position the patient efficiently, without waiting significantly between each positioning movement of the patient 14 or support 18.
  • Method 500 provides an adaptation of the generic iterative reconstruction algorithm 600 of FIG. 6 to include input data updates between invocations of reconstruction, and the use of different stopping criteria and volume update functions, and to create 2-D projections from the reconstructed volume following each reconstruction.
  • the resulting images may be quite noisy, and so in some embodiments, some low-pass or band-pass filtering may be included to make it easier for the operator to see the patient’s gross anatomy.
  • method 600 is tuned for speed in processing low-resolution, noisy data / images / volumes. Accordingly, the quality metrics and updating functions in method 600 may differ from the corresponding functions used for iterative reconstruction of diagnostic images from the full acquisition image event data: they are tuned for processing noisier, lower-resolution projections without most of the time-consuming corrections and image processing typically included in the diagnostic image reconstruction implementation.
  • During each epoch (a single first iteration, e.g., one pass through the “outer” loop), the imaging system acquires a new sample of detected image-related events for some period of time [the detection of image-related events for the next epoch can start during the reconstruction processing in this epoch]. Then:
  • the reconstruction runs for a small number of iterations (possibly as small as one).
  • one or more two-dimensional projections of the reconstructed volume are calculated, representing anterior / posterior, or lateral views of the patient / object, which are then displayed to the user. While a maximum intensity projection may be fastest, it may also result in very noisy images, and since these reconstructions are expected to have a low signal-to-noise ratio, an integration projection would provide more useful images.
  • a new epoch begins and the method returns to acquiring a new sample of detected image-related events, as described above, and continues the series of first iterations.
  • method 500 may include aging of the event data at the end of each epoch, or first iteration, or after a predetermined number of epochs or first iterations. In some embodiments, this may include fully resetting the input data and the three-dimensional reconstruction volume after a user selectable number of epochs (i.e., the previous input data and reconstruction volume are initialized to [empty] defaults). In other embodiments, this may entail applying an exponential decay of the event set and seed spaces with a user selectable strength of decay (i.e., the previous events, reconstruction, and imaging spaces are reduced in number or magnitude as appropriate for that space). For the event data, a fraction of the events in the event set may be removed before adding the new events. For summary data and the reconstruction and image spaces, the magnitudes may be decreased by dividing each entry by a user-selected value. Both strategies are sketched below.
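  • The Python sketch below illustrates both aging strategies under assumed data structures (an event list plus NumPy arrays for the volume and projection spaces); the reset period and decay factor stand in for the user-selectable parameters described above.

```python
import numpy as np

def age_data(epoch, events, volume, projections,
             reset_period=None, decay_factor=None, rng=None):
    """Apply either a periodic full reset or an exponential decay."""
    if reset_period is not None and epoch % reset_period == 0:
        # Full reset: previous input data and reconstruction volume are
        # re-initialized to empty defaults.
        return [], np.zeros_like(volume), np.zeros_like(projections)
    if decay_factor is not None:
        rng = rng or np.random.default_rng()
        # Exponential decay: remove a fraction of stored events, and scale
        # down the reconstruction and projection spaces before new data
        # are added.
        keep = rng.random(len(events)) < decay_factor
        events = [e for e, k in zip(events, keep) if k]
        volume = volume * decay_factor
        projections = projections * decay_factor
    return events, volume, projections
```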

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Medical Informatics (AREA)
  • Physics & Mathematics (AREA)
  • Molecular Biology (AREA)
  • Public Health (AREA)
  • Biophysics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Optics & Photonics (AREA)
  • Pathology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Veterinary Medicine (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • High Energy & Nuclear Physics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Algebra (AREA)
  • Mathematical Analysis (AREA)
  • Mathematical Optimization (AREA)
  • Mathematical Physics (AREA)
  • Pure & Applied Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Nuclear Medicine (AREA)
  • Apparatus For Radiation Diagnosis (AREA)

Abstract

A system and method: establish a seed three-dimensional image for a subject as a current three-dimensional image for the image space, and seed projection data for the image space as current projection data for the image space; and perform a series of iterations for a corresponding series of epochs. Each iteration includes: receiving new event data corresponding to newly detected image-related events generated from the subject, updating the current projection data for the image space based on the new event data, performing an iterative reconstruction of the current three-dimensional image for the image space using the updated current projection data for the image space, computing a two-dimensional projection of the image space from the reconstructed current three-dimensional image for the image space, and displaying the two-dimensional projection of the image space.

Description

SYSTEM AND METHOD FOR PROVIDING NEAR REAL-TIME TWO DIMENSIONAL PROJECTION IMAGES
TECHNICAL FIELD
This invention pertains to imaging systems, and in particular systems and methods of providing near real-time two dimensional projection images from three-dimensional imaging systems.
BACKGROUND AND SUMMARY
For many years near real-time two-dimensional (2-D) projection images have been used for positioning patients and phantoms on general nuclear medicine (NM) systems with large area, planar detectors. Positioning the patient or phantom, and/or planning the appropriate scan range, is necessary to ensure that a region (e.g., an organ or organs) of interest is within the detector field of view (FOV) and/or scanned region (referred to hereafter as “the image space”) prior to starting the scan.
Early positron emission tomography (PET) systems, which had not [yet] been integrated with an anatomical imaging modality, such as computed tomography (CT) or magnetic resonance imaging (MRI), also had some form of positioning display, based on re-formatting / re-binning of sinograms. Here, it is understood that sinograms comprise histograms of detected events which may be sorted by the angle of each view and tilt (for 3D images) and grouped into projection images. Such sinogram images are analogous to the projections captured by CT scanners, and can be reconstructed in a similar way.
Full-ring detector systems have significant advantages for three-dimensional (3-D) single photon emission computed tomography (SPECT) imaging and coincidence pair-based PET imaging. SPECT and PET systems reconstruct the 3-D patient volume images using a variety of algorithms, the most common of which are iterative.
However, these systems lost or removed the capability to create near real-time 2-D projection images of emission data, as their raw data does not directly represent planar projections.
For combination PET/CT systems, the raw data already may be binned into sinograms, which can be reformatted to create 2-D projection views, but these views do not make use of all of the detected events, and thus take a relatively long time to acquire / refresh. This use of sinograms is generally not available for 3-D SPECT-only systems which do not include an anatomical imaging modality.
Meanwhile, without the ability to produce near real-time 2-D projection images, typical combination PET/CT systems do not have a means for properly positioning a patient, or planning a scan, to ensure that a region of interest is within the image space prior to starting the scan, without exposing the patient to x-rays as part of a scout (2-D planar projection) acquisition.
Also, ring-based SPECT-only systems that are not integrated with a CT system typically do not have a natural, near real-time display/view for positioning a patient to ensure that a region of interest is within the image space prior to starting the scan. Accordingly, patient positioning for ring-based SPECT-only systems that are not integrated with a CT system must be done using external patient and system landmarks/aids, or, again, with relatively lengthy image acquisitions to check the patient position.
However, many physicians and patients would like to reduce the radiation exposure to patients, either in general or due to patient-specific considerations.
In addition to concerns about patient positioning, these systems typically have no way to monitor the image acquisition in progress to check whether the patient has not moved significantly during the scan. It would be desirable to allow a technologist to monitor the position of the patient and identify significant patient motions shortly after the motions have occurred, so the technologist can decide early whether to continue or re-start the scan.
Accordingly, it would be desirable to provide systems and methods of providing near real-time two dimensional projection images from three-dimensional imaging systems which otherwise do not provide such near real-time two dimensional projection images. Providing a near real-time two- dimensional display to system operators would enable shorter clinical workflows and increase the operator’s confidence during scan setup that they have included the organ(s) of interest in the scan.
In one aspect of the invention, a system comprises: an imaging apparatus, a display device, and a processor. The imaging apparatus is configured to detect image-related events in three dimensions from a subject, but lacks an anatomical reference for the subject. The processor is configured to: establish a seed three-dimensional image for an image space comprising at least a portion of the subject, as a current three-dimensional image of the image space, and seed projection data for the image space as current projection data for the image space; and perform a series of first iterations for a corresponding series of epochs. In each first iteration, the processor: receives, from the imaging apparatus, new event data corresponding to newly detected image-related events from the subject; updates the current projection data for the image space based on the new event data; and performs a series of second iterations within each epoch. Each second iteration includes: determining computed projection data for the image space from the current three-dimensional image for the image space; determining a difference between the computed projection data for the image space and the current projection data for the image space; determining a quality metric for the computed projection data for the image space, based on the difference; and comparing the quality metric to a threshold. When the quality metric is greater than the threshold: the difference is backprojected to the current three-dimensional image for the image space; the current three-dimensional image for the image space is updated based on the backprojection; and the series of second iterations is continued. When the quality metric is less than or equal to the threshold, the series of second iterations is ended. At the end of each first iteration, the processor: outputs the current three-dimensional image for the image space; computes a two-dimensional projection of the image space from the current three-dimensional image for the image space; and provides to the display device the two-dimensional projection of the image space. The processor continues the series of first iterations until the processor receives a request to stop. In some embodiments, the processor is further configured to: compute a second two-dimensional projection of the image space from the current three-dimensional image for the image space, and display the second two-dimensional projection of the image space.
In some embodiments, the processor is further configured to: receive from a user an indication of a period corresponding to a specified number of epochs for periodically resetting the event data, the seed three-dimensional image, and the seed projection data; and reset the event data, the seed three-dimensional image, and the seed projection data periodically for each period.
In some embodiments, the processor is further configured to: receive from a user an indication of a decay time constant for exponentially decaying the event data, the seed three-dimensional image, and the seed projection data; and exponentially decay the event data, the seed three-dimensional image, and the seed projection data at a decay rate corresponding to the decay time constant.
In some embodiments, the imaging apparatus comprises a three-dimensional (3-D) single photon emission computed tomography (SPECT) imaging apparatus.
In some embodiments, the imaging apparatus includes a Positron Emission Tomography (PET) imaging apparatus.
In another aspect of the invention, a method comprises: establishing a seed three-dimensional image for a subject as a current three-dimensional image for the image space, and seed projection data for the image space as current projection data for the image space; and performing a series of iterations for a corresponding series of epochs. Each iteration includes: receiving new event data corresponding to newly detected image-related events generated from the subject; updating the current projection data for the image space based on the new event data; performing an iterative reconstruction of the current three-dimensional image for the image space using the updated current projection data for the image space; computing a two-dimensional projection of the image space from the reconstructed current three-dimensional image for the image space; and displaying the two-dimensional projection of the image space.
In some embodiments, the method further comprises: computing a second two-dimensional projection of the image space from the current three-dimensional image for the image space; and displaying the second two-dimensional projection of the image space.
In some embodiments, performing an iterative reconstruction of the current three-dimensional image for the image space using the updated current projection data for the image space comprises performing a series of second iterations within each epoch, wherein each second iteration comprises: determining computed projection data for the image space from the current three-dimensional image for the image space, determining a difference between the computed projection data for the image space and the current projection data for the image space, determining a quality metric for the computed projection data for the image space, based on the difference; and comparing the quality metric to a threshold. When the quality metric is greater than the threshold: the difference is backprojected to the current three-dimensional image for the image space; the current three-dimensional image for the image space is updated based on the backprojection; and the series of second iterations is continued. When the quality metric is less than or equal to the threshold, the iterative reconstruction is stopped.
In some embodiments, the method further comprises: receiving from a user an indication of a period corresponding to a specified number of epochs for periodically resetting the event data, the seed three-dimensional image, and the seed projection data; and resetting the event data, the seed three-dimensional image, and the seed projection data periodically for each period.
In some embodiments, the method further comprises: receiving from a user an indication of a decay time constant for exponentially decaying the event data, the seed three-dimensional image, and the seed projection data; and exponentially decaying the event data, the seed three-dimensional image, and the seed projection data at a decay rate corresponding to the decay time constant.
In some embodiments, the new event data is received from an imaging apparatus which lacks an anatomical reference for the subject.
In some embodiments, the new event data is received from a three-dimensional (3-D) single photon emission computed tomography (SPECT) imaging apparatus.
In some embodiments, the new event data is received from a Positron Emission Tomography (PET) imaging apparatus.
In yet another aspect of the invention, a system comprises: an imaging apparatus, wherein the imaging apparatus is configured to detect image-related events in three dimensions from a subject, and wherein the imaging apparatus lacks an anatomical reference for the subject; and a processor. The processor is configured to: establish a seed three-dimensional image of an image space comprising at least a portion of the subject as a current three-dimensional image for the image space, and seed projection data for the image space as current projection data for the image space; and perform a series of first iterations for a corresponding series of epochs. Each first iteration includes: receiving, from the imaging apparatus, new event data corresponding to newly detected image-related events; updating the current projection data for the image space based on the new event data; performing an iterative reconstruction of the current three-dimensional image for the image space using the updated current projection data for the image space; and computing a two-dimensional projection of the image space from the current three-dimensional image for the image space.
In some embodiments, the processor is further configured to: compute a second two-dimensional projection of the image space from the current three-dimensional image for the image space; and display the second two-dimensional projection of the image space.
In some embodiments, the processor is further configured to: receive from a user an indication of a decay time constant for exponentially decaying the event data, the seed three-dimensional image, and the seed projection data; and exponentially decay the event data, the seed three-dimensional image, and the seed projection data at a decay rate corresponding to the decay time constant.
In some embodiments, the processor is further configured to: receive from a user an indication of a period corresponding to a specified number of epochs for periodically resetting the event data, the seed three-dimensional image, and the seed projection data; and reset the event data, the seed three-dimensional image, and the seed projection data periodically for each period.
In some embodiments, the imaging apparatus comprises a three-dimensional (3-D) single photon emission computed tomography (SPECT) imaging apparatus.
In some embodiments, the imaging apparatus includes a Positron Emission Tomography (PET) imaging apparatus.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 illustrates a high level functional block diagram of one example of an imaging system, in particular a single photon emission computed tomography (SPECT) imaging system.
FIG. 2 illustrates a block diagram of another example of an imaging system, in particular a combination positron emission tomography (PET) / computed tomography (CT) imaging system.
FIG. 3 illustrates an example of assignments of imaging system components between positron emission tomography (PET) functionality and computed tomography (CT) functionality in a combined PET/CT system.
FIG. 4 is a block diagram illustrating an example embodiment of a processor and associated memory according to embodiments of the disclosure.
FIG. 5 illustrates a flowchart of a method of providing near real-time two dimensional projection images from data obtained by a three-dimensional imaging apparatus.
FIG. 6 illustrates a flowchart of a method of iterative reconstruction.
DETAILED DESCRIPTION
The present invention will now be described more fully hereinafter with reference to the accompanying drawings, in which preferred embodiments of the invention are shown. This invention may, however, be embodied in different forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided as teaching examples of the invention.
In particular, in order to illustrate the principles of the present invention, various systems and methods are described in the context of patient positioning for performing a full imaging scan. More broadly, however, principles which are utilized in the disclosed systems and methods may also be employed in other contexts, for example in luggage, package, and shipping container full-ring gamma scanners intended to create images of the scanned objects. Accordingly, the invention is to be understood to be defined by the claims and not limited by details of specific embodiments described herein, unless those details are recited in the claims themselves.
With the foregoing discussion in mind, FIG. 1 illustrates a high level functional block diagram of one example of an imaging system, in particular a SPECT imaging system 10. SPECT imaging system 10 is designed to produce useful images (e.g., three-dimensional images) of a region of interest 20 of a patient or subject 14, using suitable detector components (such as pin-hole gamma cameras or collimated scintillating detectors) as described in detail below.
SPECT imaging system 10 includes a scanner 16, a patient support 18, system control and processing circuitry 28 and an operator interface 40.
Support 18 may be movable with respect to scanner 16 to allow for imaging of different tissues or anatomies in a region of interest 20 within subject 14.
Prior to image data collection, a radioisotope, such as a radiopharmaceutical substance (sometimes referred to as a radiotracer), is administered to subject 14, and may be bound or taken up by particular tissues or organs in region of interest 20. Typical radioisotopes include various radioactive forms of elements that emit gamma radiation during decay. Various additional substances may be selectively combined with such radioisotopes to target specific areas or tissues in region of interest 20 of the body of subject 14. SPECT imaging system 10 produces three-dimensional volume images of an image space in response to detection of image-related events. Here, detection of an image-related event may be detection of an individual photon from the decay of an element of the radioisotope which has been administered to subject 14.
Gamma radiation emitted by the radioisotope is detected by a detector component 22, such as a digital detector or gamma camera. Although illustrated in the figure as a planar device positioned above the patient to simplify illustration, in practice the detector structure(s) 22 may be positioned about subject 14, such as in an arc or ring about subject 14, or may be attached to a positioner (e.g., a C-arm, gantry, or other movable arm) that allows the detector structure(s) 22 to be moved in such an arc or orbit about the patient during data acquisition to produce a three-dimensional image of region of interest 20. In general, the detector structure(s) 22 typically include one or more components or elements capable of sensing gamma radiation or otherwise generating a detectable signal in response to such radiation. In the illustrated embodiment, the detector structures comprise one or more collimators and a scintillator, together represented generally as reference numeral 24. The collimator may be formed from parallel or non-parallel elements that allow gamma radiation emitted only in certain directions to impact the detecting components. In detector embodiments employing a scintillator, the scintillator may be made of a crystalline material, such as sodium iodide (NaI), that converts the received gamma radiation to lower-energy light (e.g., in an ultraviolet range). Photomultiplier tubes 26 then receive this light and generate event data corresponding to photons impacting specific discrete picture element (pixel) regions. In other embodiments, the detector structure 22 may not be collimated but may instead use other gamma radiation sensing technologies, such as one or more pin-hole gamma cameras, as also discussed herein.
In the depicted embodiment, the detector structure(s) 22 is coupled to system control and processing circuitry 28. System control and processing circuitry 28 may include a number of physical and/or software components that cooperate to allow the collection and processing of image data to create the desired images. For example, system control and processing circuitry 28 may include raw data processing circuitry 30 that initially receives the data from the detector structure(s) 22, and that may perform various filtering, value adjustments, and so forth. Processing circuitry 32 allows for the overall control of imaging system 10, and for manipulation and/or reconstruction of image data 12. Processing circuitry 32 may also perform calibration functions, correction functions, and so forth on the data. Processing circuitry 32 may also perform image reconstruction functions, such as based on various algorithms (e.g., back projection, iterative reconstruction, and so forth). Such functions may also be performed in post-processing on local or remote equipment. As will be appreciated, the various image reconstruction and artifact correction algorithms discussed herein may be implemented in part or in their entirety using one or both of raw data processing circuitry 30 and processing circuitry 32.
In the depicted embodiment, processing circuitry 32 interacts with control circuitry/interface 34 that allows for control of the scanner and its components, including patient support 18, a camera, and so forth. Moreover, processing circuitry 32 may be supported by various circuits, such as memory circuitry 36 that may be used to store image data, calibration or correction values, routines performed by processing circuitry 32 (such as the motion artifact correction algorithms disclosed herein), and so forth. In some embodiments, processing circuitry 32 executes one or more iterative reconstruction algorithms that may utilize approaches for creating near real-time two dimensional projection images for facilitating positioning of subject 14 and/or patient support 18, as discussed herein. Such iterative reconstruction approaches may generally utilize iterated comparisons between expected or reference images and observed or measured event data. In such an iterative reconstruction approach, the convergence process or loop may be repeated or iterated until some completion criterion is met, such as minimization of a cost function.
Finally, processing circuitry 32 may interact with interface circuitry 38, designed to support operator interface 40. Operator interface 40 allows for imaging sequences to be commanded, scanner and system settings to be viewed and adjusted, images to be viewed, and so forth. In the illustrated embodiment, reconstructed images 12 may be viewed by an operator via display 42.
In an institutional setting, the imaging system 10 may be coupled to one or more networks to allow for the transfer of scheduling data to the imaging system, as well as to permit transmission and storage of image data and processed images. For example, local area networks, wide area networks, wireless networks, and so forth may allow for storage of image data on radiology department information systems and/or on hospital information systems. Such network connections further allow for transmission of image data to remote post-processing systems, physician offices, and so forth.
With respect to the gamma ray detection components 22 of the SPECT imaging system 10, two arrangements may be employed: parallel and non-parallel. In an example of a parallel arrangement, a detector may be collimated with an arrangement of parallel structures such that the resulting acquisition of gamma rays is not divergent, for example with a collimated detector assembly or collimated camera. In an example of a non-parallel arrangement, the employed collimation is non-parallel such as a pinhole collimator, fan-beam collimator, or cone-beam collimator.
To produce images of region of interest 20, subject 14 must be properly positioned with respect to a scanner, designated by reference numeral 16, by manipulating a position of subject 14 and/or patient support 18 such that region of interest 20 is within the imaging space of SPECT system 10. However, in general, SPECT system 10 may not have an anatomical reference for subject 14 to facilitate this positioning. Accordingly, described below with respect to FIGs. 5 and 6 are methods of providing near real-time two dimensional projection images from three-dimensional imaging systems, such as SPECT system 10, which otherwise do not provide such near real-time two dimensional projection images. Such images may be provided to and viewed by an operator via display 42 and operator interface 40, allowing the operator to ensure that subject 14 is properly positioned for an imaging session prior to acquiring image data. In some embodiments, the operator may provide input via operator interface 40 to cause system control and processing circuitry 28 to move patient support 18 and/or scanner 16 to ensure that a region of interest is within the image space of scanner 16 prior to starting the scan of subject 14.
FIG. 2 illustrates a block diagram of another example of an imaging system 2000. Imaging system 2000 includes as an imaging apparatus a positron emission tomography (PET) subsystem in combination with an X-ray computed tomography (CT) subsystem. It should be understood that in general the PET subsystem may share components with the CT subsystem, as will be discussed below with respect to FIG. 3. The imaging apparatus is configured to detect image-related events for a subject 14 in three dimensions, but the imaging apparatus lacks an anatomical reference for subject 14. Here, detection of an image-related event may be detection of a photon pair from the decay of an element of the radioisotope which has been administered to subject 14.
Imaging system 2000 includes: gantry 15; patient support 18; gamma ray detector/sensor 2022; first digitization and readout electronics 2030; X-ray source 2120; X-ray detector/sensor 2122; second digitization and readout electronics 2130; processing system 2028; user interface 2040; and display 2042. Processing system 2028 includes modules for processing data from first digitization and readout electronics 2030 and second digitization and readout electronics 2130, and application packages for utilizing reconstructed image data. Also illustrated in dashed lines in FIG. 2 is a radiopharm source 2020. Radiopharm source 2020 is not strictly part of PET/CT imaging system 2000, but it is essential for creating useful images.
In operation, combined PET/CT imaging system 2000 uses radioactive materials (also known as a tracer or radio-tracer) for imaging, and is generally categorized within the field of nuclear medicine (NM). Subject 14 resides on patient support 18, and one or more detectors 2022 may be provided in gantry 15 for detecting image-related events which can be processed to obtain a three-dimensional volume image of region of interest 20 in subject 14. A radioisotope, such as a radiopharmaceutical substance (sometimes referred to as a radiotracer), is administered to subject 14, and is trapped within the tissues in region of interest 20. The unstable nuclei of radioisotopes within subject 14 emit positrons, which combine with neighboring electrons to produce a pair of gamma rays moving at 180 degrees to each other. These gamma rays are detected by detector(s) 2022 disposed within the donut-shaped body of gantry 15. The energy and location of these gamma rays are recorded, preprocessing and sinogram binning are performed, and the resulting data are used by processing system 2028 to reconstruct three-dimensional (3D) images of tracer concentration within region of interest 20 in the body of subject 14, using algorithms as are known in the art.
As noted above, in combined PET/CT imaging system 2000, the PET subsystem may share components with the CT subsystem.
FIG. 3 illustrates an example of assignments of imaging system components between positron emission tomography (PET) functionality 3200 and computed tomography (CT) functionality 3100 in a combined PET/CT system, such as PET/CT imaging system 2000.
As shown in FIG. 3, CT modality 3100 and PET modality 3200 share patient support 18, gantry 15, a console subsystem 3400 supporting a graphical user interface (GUI) for an operator (e.g., user interface 2040 and display 2042 of FIG. 2), and a service subsystem supporting software applications used by service personnel for installation, configuration, repair, and maintenance of combined PET/CT imaging system 2000.
CT modality 3100 employs a Data Measurement System (DMS) 3130 (e.g., detector/sensor 2122 and readout/digitization electronics 2130 of FIG. 2) to acquire CT data (e.g., from an X-ray source and corresponding detector), and a Common Image Reconstruction System (CIRS) 3132 to reconstruct CT image data. PET modality 3200 includes a PET Detector and Acquisition Subsystem to acquire PET data (e.g., detector/sensor 2022, digitization and readout electronics 2030, and the preprocessing and sinogram binning module of FIG. 2), and a Nuclear Medicine Reconstruction System (NMRS) subsystem 3132 to reconstruct PET image data.
The general operations of a combined PET/CT imaging system are known, and will not be repeated in detail here. Instead, only details related to providing near real-time two dimensional projections from an imaging system such as imaging system 2000 will be described in detail below, in particular with respect to FIGs. 5 and 6.
To produce images of region of interest 20, subject 14 must be properly positioned with respect to gantry 15, by manipulating the position of subject 14 and/or patient support 18 such that region of interest 20 is within the imaging space of imaging system 2000. However, in general, imaging system 2000 may not have an anatomical reference for subject 14 to facilitate this positioning. Accordingly, described below with respect to FIGs. 5 and 6 are methods of providing near real-time two dimensional projection images from three-dimensional imaging systems, such as imaging system 2000, which otherwise do not provide such near real-time two dimensional projection images. Such images may be provided to and viewed by an operator via display 2042 and user interface 2040, allowing the operator to ensure that subject 14 is properly positioned for an imaging session prior to acquiring image data. In some embodiments, the operator may provide input via user interface 2040 to cause processing system 2028 to move patient support 18 and/or gantry 15 to ensure that a region of interest is within the image space of imaging system 2000 prior to starting the scan of subject 14.
FIG. 4 is a block diagram illustrating an example embodiment of a processing circuit 4000 according to embodiments of the disclosure. Processing circuit 4000 includes processor 400 and associated memory circuit 450.
Processing circuit 4000 may be used to implement one or more processors described herein, for example, system control and processing circuitry 28 in FIG. 1, or processing system 2028 in FIG. 2. In some embodiments, system control and processing circuitry 28 in FIG. 1, or processing system 2028 in FIG. 2, may include more than one processor 400, and/or more than one memory circuit 450.
Processor 400 may be any suitable processor type including, but not limited to, a microprocessor, a microcontroller, a digital signal processor (DSP), a field programmable gate array (FPGA) where the FPGA has been programmed to form a processor, a graphical processing unit (GPU), an application specific integrated circuit (ASIC) where the ASIC has been designed to form a processor, or a combination thereof.
Processor 400 may include one or more cores 402. Core 402 may include one or more arithmetic logic units (ALU) 404. In some embodiments, core 402 may include a floating point logic unit (FPLU) 406 and/or a digital signal processing unit (DSPU) 408 in addition to or instead of ALU 404.
Processor 400 may include one or more registers 412 communicatively coupled to core 402. Registers 412 may be implemented using dedicated logic gate circuits (e.g., flip-flops) and/or any memory technology. In some embodiments registers 412 may be implemented using static memory. Registers 412 may provide data, instructions and addresses to core 402.
In some embodiments, processor 400 may include one or more levels of cache memory 410 communicatively coupled to core 402. Cache memory 410 may provide computer-readable instructions to core 402 for execution. Cache memory 410 may provide data for processing by core 402. In some embodiments, the computer-readable instructions may have been provided to cache memory 410 by a local memory, for example, local memory attached to the external bus 416. Cache memory 410 may be implemented with any suitable cache memory type, for example, metal-oxide semiconductor (MOS) memory such as static random access memory (SRAM), dynamic random access memory (DRAM), and/or any other suitable memory technology.
Processor 400 may include a controller 414, which may control input to the processor 400 from other processors and/or components included in a system (e.g., patient support 18, scanner 16, gantry 15, readout and digitization electronics 2030/2130, etc.) and/or outputs from processor 400 to other processors and/or components included in the system (e.g., patient support 18, scanner 16, gantry 15, etc.). Controller 414 may control the data paths in the ALU 404, FPLU 406 and/or DSPU 408. Controller 414 may be implemented as one or more state machines, data paths and/or dedicated control logic. The gates of controller 414 may be implemented as standalone gates, FPGA, ASIC or any other suitable technology. Registers 412 and cache 410 may communicate with controller 414 and core 402 via internal connections 420A, 420B, 420C and 420D. Internal connections may be implemented as a bus, multiplexor, crossbar switch, and/or any other suitable connection technology.
Inputs and outputs for processor 400 may be provided via bus 416, which may include one or more conductive lines. Bus 416 may be communicatively coupled to one or more components of processor 400, for example controller 414, cache 410, and/or register 412. Bus 416 may be coupled to one or more components of the system, such as patient support 18, scanner 16, gantry 15, etc. mentioned previously.
Bus 416 may be coupled to one or more external memories which may be included in memory circuit 450. The external memories may include Read Only Memory (ROM) 432. ROM 432 may be a masked ROM, Erasable Programmable Read Only Memory (EPROM) or any other suitable technology. The external memory(ies) may include Random Access Memory (RAM) 433. RAM 433 may be a static RAM, battery backed up static RAM, Dynamic RAM (DRAM) or any other suitable technology. The external memory(ies) may include Electrically Erasable Programmable Read Only Memory (EEPROM) 435. The external memory(ies) may include Flash memory 434. The external memory(ies) may include a magnetic storage device such as disc 436, and/or optical storage devices such as compact discs (CDs), digital versatile discs (DVDs), etc.
In some embodiments, processing circuit 4000, and in particular processor 400, may control operations of, and process data obtained by, a SPECT imaging system, such as SPECT imaging system 10 illustrated in FIG. 1. In some embodiments, processing circuit 4000, and in particular processor 400, may control operations of, and process data obtained by, a combination PET/CT system, such as the combination PET/CT imaging system 2000 illustrated in FIG. 2.
Processing circuit 4000, and in particular processor 400, may perform algorithms or operations described below with respect to FIGs. 5 and 6 to provide near real-time two dimensional projection images which may be displayed to an operator via a display device and used by the operator to properly position region of interest 20 of subject 14 within a desired location for proper imaging.
FIG. 5 illustrates a flowchart of a method 500 of providing near real-time two dimensional projection images from data obtained by a three-dimensional imaging apparatus, such as SPECT imaging system 10 or PET/CT imaging system 2000.
In an operation 510, a processing system of the imaging system initializes volume imaging data and projection data for an image space for processing by the imaging system. Here, it should be understood that in the context of PET/CT imaging system 2000, projection data may refer to a collection of input event data. An initial input data space and a reconstruction volume are created. In other words, the imaging system (or, more precisely, a processor of the imaging system) establishes a seed three-dimensional image for the image space (which becomes the current three-dimensional image), and a set of seed projection data for the image space (which becomes the current projection data for the image space). In an operation 520, it is determined whether to stop execution of method 500, for example in response to a command or instruction from a user, or after method 500 has executed for a predetermined period of time or number of cycles, which may be stored in memory. If it is determined in operation 520 that method 500 should not end, then the next of a series of first iterations comprising operations 530 through 560 is performed. The time period during which each of these first iterations occurs is referred to herein as an epoch. Thus execution of method 500 may occur over one or more epochs. Beneficially, each epoch lasts less than one second. In some embodiments, each epoch may be at least 0.1 second.
In an operation 530, the imaging system acquires more event data for the set of projection data. In particular, the processor receives, from the imaging apparatus, new event data corresponding to newly detected image-related events from subject 14.
In an operation 540, the imaging system updates the set of projection data. In particular, the processor updates the current set of projection data for the image space based on the new event data.
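By way of illustration only, the following minimal sketch shows one way operations 530 and 540 could fold newly received events into the current projection data. The (angle_bin, radial_bin) event format and the sinogram dimensions are assumptions made for this sketch; actual event records (e.g., list-mode lines of response with energies and timestamps) are device-specific and are not prescribed by this disclosure.

```python
import numpy as np

# Hypothetical sinogram dimensions; real dimensions depend on the scanner.
N_ANGLES, N_BINS = 64, 128

def update_projection_data(current_proj, new_events):
    """Operation 540 (sketch): accumulate newly detected events into the projection data."""
    for angle_bin, radial_bin in new_events:
        current_proj[angle_bin, radial_bin] += 1.0
    return current_proj

# Seed projection data starts empty; each epoch folds in the newly acquired events.
proj = np.zeros((N_ANGLES, N_BINS))
proj = update_projection_data(proj, [(3, 40), (3, 41), (17, 90)])
```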
In an operation 550, iterative image reconstruction is performed using a series of second iterations. Typically, the reconstruction runs for a small number of second iterations (possibly as small as one). An example embodiment of iterative image reconstruction operation 550 will be described below with respect to FIG. 6.
In an operation 560, one or more two-dimensional projection(s) of the image space are computed from the current three-dimensional image for the image space. For example, one two-dimensional projection of subject 14 may lie in a plane defined by a craniocaudal (head-to-foot) direction of subject 14 and a lateral (left-to-right) direction of subject 14, sometimes referred to as a coronal or frontal plane. Another two-dimensional projection of subject 14 may lie in a plane defined by a craniocaudal (head-to-foot) direction of subject 14 and an anterior/posterior (top-to-bottom) direction of subject 14, sometimes referred to as a sagittal plane. In various embodiments, other two-dimensional projections in any of various other planes (e.g., a transverse plane) may be produced. Operation 560 may also include outputting the current three-dimensional image for the image space, computing the two-dimensional projection(s) of the image space from the current three-dimensional image for the image space, and providing the computed two-dimensional projection(s) of the image space to a display device.
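A short numpy sketch of operation 560 follows; the axis ordering (craniocaudal, anterior/posterior, lateral) is an assumption of the sketch rather than a requirement of the disclosure. Integrating along one axis yields the coronal and sagittal views described above; a maximum-intensity alternative is included for comparison with the noise trade-off discussed later.

```python
import numpy as np

# Stand-in reconstructed volume, indexed (craniocaudal, anterior/posterior, lateral).
vol = np.random.rand(96, 64, 64)

coronal = vol.sum(axis=1)   # collapse anterior/posterior -> coronal (frontal) view
sagittal = vol.sum(axis=2)  # collapse lateral -> sagittal view
mip = vol.max(axis=1)       # maximum-intensity projection: faster to read, but noisier
```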
Upon completion of operation 560, the process returns to operation 520 where it is determined whether the first iterations should continue at operation 530, or whether they should stop, as discussed above.
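Tying operations 510 through 560 together, the sketch below outlines one possible shape for the outer loop of method 500. The acquire_events, reconstruct, and display callables are placeholders for the device-specific components described herein; all names, shapes, and limits are illustrative assumptions.

```python
import numpy as np

def run_positioning_preview(acquire_events, reconstruct, display,
                            vol_shape=(96, 64, 64), proj_shape=(64, 128),
                            max_epochs=100):
    # Operation 510: seed three-dimensional image and seed projection data.
    volume = np.zeros(vol_shape)
    proj = np.zeros(proj_shape)
    for epoch in range(max_epochs):           # operation 520: one first iteration per epoch
        events = acquire_events()             # operation 530: new event data
        for angle_bin, radial_bin in events:  # operation 540: update projection data
            proj[angle_bin, radial_bin] += 1.0
        volume = reconstruct(volume, proj)    # operation 550: iterative reconstruction (FIG. 6)
        display(volume.sum(axis=1), volume.sum(axis=2))  # operation 560: coronal/sagittal views
    return volume
```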
FIG. 6 illustrates a flowchart of a method 600 of iterative reconstruction. In particular, method 600 may comprise one embodiment of operation 550 in method 500 of FIG. 5.
In an operation 610, a three-dimensional reconstructed volume of the image space is initialized.
In an operation 620, a processor computes a set of projection data for the image space from the current reconstructed three-dimensional volume.
In an operation 630, a processor computes or otherwise determines a difference between the computed projection data for the image space, obtained in operation 620, and the current projection data for the image space.
In an operation 640, a processor computes or otherwise determines a quality or cost metric for the computed projection data for the image space, based on the difference obtained in operation 630.
In an operation 650, a processor determines whether the second iterations of method 600 are done. In particular, the processor compares the quality metric to a threshold, and when the quality metric is less than or equal to the threshold, ends the series of second iterations.
In that case, the method proceeds to operation 660 in which the method outputs an updated three-dimensional volume of the image space.
On the other hand, when the quality metric is greater than the threshold in operation 650, then the method proceeds to operation 670. In operation 670, a processor backprojects the difference in the projection data for the image space to the current three-dimensional image for the image space.
In an operation 680, the current three-dimensional image for the image space is updated in response to the backprojection of operation 670. The method then returns to operation 620 for another second iteration.
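The following sketch condenses method 600 into a single routine, with the forward and backward projections reduced to a system matrix A and its transpose (a Landweber-style additive update). An actual implementation would use the scanner's projector operators, together with a quality metric and update step tuned as discussed below; the RMS metric, step size, and iteration limit here are assumptions of the sketch.

```python
import numpy as np

def iterative_reconstruct(volume, measured_proj, A, threshold=1e-3,
                          step=0.1, max_iters=5):
    """Operations 610-680 (sketch): one invocation of the inner (second) iterations."""
    v = volume.ravel().copy()                  # operation 610: initialize the volume
    for _ in range(max_iters):
        computed = A @ v                       # operation 620: forward projection
        diff = measured_proj - computed        # operation 630: difference
        quality = np.sqrt(np.mean(diff ** 2))  # operation 640: RMS cost metric (assumed)
        if quality <= threshold:               # operation 650: stop when good enough
            break
        v += step * (A.T @ diff)               # operations 670-680: backproject and update
    return v.reshape(volume.shape)             # operation 660: updated 3-D volume
```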
The methods 500 and 600 described above may be more clearly understood from the following explanation.
Method 600 of FIG. 6 is an example of an iterative reconstruction algorithm.
An iterative reconstruction, such as method 600, starts with an assumed image, computes projection data from the image, compares the computed projection data with the original projection data, and updates the image based upon the difference between the calculated and the actual projection data. Implied in this algorithm description are two collections of data: (1) the "assumed image" or 3D reconstructed volume; and (2) the "original projection data." Also implied are two projection functions: (1) a forward projection which creates projection data from the current 3D reconstructed volume; and (2) a backward projection that takes the projection data differences back to the 3D reconstructed volume. Lastly, there are three operations: (1) a comparison operation to calculate or determine a difference or differences between the original projection data and the projection data which are obtained from the 3D reconstruction volume; (2) a quality or cost metric calculated based on the difference(s), used to determine when to stop the (second) iterations of method 600; and (3) the operation to update or correct the 3D reconstruction volume using the back projected difference(s).
In other words, the iterative reconstruction of method 600 involves:
• an initial input data space and a reconstruction volume,
• forward projection of the reconstruction volume,
• calculation of a difference / correction factor between the projection data and the input data space,
• calculation of a “goodness metric” from the correction factor,
• back-projection of the correction factor into the reconstruction volume and updating of the volume, and
• a decision of whether to perform another iteration, based on the goodness metric.
The method 500 of FIG. 5 wraps the iterative reconstruction algorithm 600 of FIG. 6 within another (first) iterative loop that updates the acquired projection data as additional input is acquired from the imaging apparatus, and updates the displayed output images from the re-reconstructed 3D volume. The purpose of the “outer” loop (method 500) is to provide relatively quick 2D projection updates to the operator, so that subject 14 can be positioned efficiently, without significant waiting between each positioning movement of subject 14 or support 18.
Method 500 adapts the generic iterative reconstruction algorithm 600 of FIG. 6 to include input data updates between invocations of reconstruction, to use different stopping criteria and volume update functions, and to create 2-D projections from the reconstructed volume following each reconstruction. The resulting images may be quite noisy, and so in some embodiments, some low-pass or band-pass filtering may be included to make it easier for the operator to see the patient’s gross anatomy.
Here, method 600 is tuned for speed in processing low-resolution, noisy data/images/volumes. Accordingly, the quality metrics and updating functions in method 600 may be different from the corresponding functions used for iterative reconstruction of diagnostic images from the full acquisition image event data: they are tuned for processing noisier, lower-resolution projections without most of the time-consuming corrections and image processing typically included in the diagnostic image reconstruction implementation.
In method 500, at the beginning of each epoch (a single first iteration, i.e., one pass through the “outer” loop) the imaging system acquires a new sample of detected image-related events for some period of time (the detection of image-related events for the next epoch can start during the reconstruction processing in this epoch). Then:
(a) the input data is updated with the newly acquired events (with possible aging, discussed below), and
(b) the previous epoch’s input data and reconstruction volume are used to initialize the next reconstruction.
After the input data is updated, the reconstruction runs for a small number of iterations (possibly as small as one).
When the reconstruction is complete, one or more two-dimensional projections of the reconstructed volume are calculated, representing anterior/posterior or lateral views of the patient/object, which are then displayed to the user. While a maximum intensity projection may be fastest, it may also result in very noisy images, and since these reconstructions are expected to have a low signal-to-noise ratio, an integration projection would provide more useful images. At the end of each epoch, if the operation has not been requested to stop, a new epoch begins and the method returns to acquiring a new sample of detected image-related events, as described above, and continues the series of first iterations.
In some embodiments, method 500 may include aging of the event data at the end of each epoch, or first iteration, or after a predetermined number of epochs or first iterations. In some embodiments, this may include fully resetting the input data and the three-dimensional reconstruction volume after a user-selectable number of epochs (i.e., the previous input data and reconstruction volume are initialized to empty defaults). In other embodiments, this may entail applying an exponential decay of the event set and seed spaces with a user-selectable strength of decay (i.e., the previous events, reconstruction, and imaging spaces are reduced in number or magnitude as appropriate for that space). For the event data, a fraction of the events in the event set may be removed before adding the new events. For summary data and the reconstruction and image spaces, the magnitudes may be decreased by dividing each entry by a user-selected value.
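Both aging strategies, as well as the event-set thinning just described, can be sketched in a few lines; the reset period, decay divisor, and keep fraction stand in for the user-selected values and are illustrative names only.

```python
import numpy as np

def age_by_reset(proj, volume, epoch, reset_period):
    """Fully reset the seed spaces every `reset_period` epochs (in place)."""
    if epoch % reset_period == 0:
        proj[...] = 0.0
        volume[...] = 0.0

def age_by_decay(proj, volume, decay_divisor):
    """Exponentially decay magnitudes (in place) before new events are added."""
    proj /= decay_divisor
    volume /= decay_divisor

def age_event_set(events, keep_fraction, rng=None):
    """For list-mode data: drop a fraction of stored events before appending new ones."""
    rng = rng or np.random.default_rng()
    mask = rng.random(len(events)) < keep_fraction
    return [e for e, keep in zip(events, mask) if keep]
```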
Various embodiments may combine the variations described above. While preferred embodiments are disclosed in detail herein, many other variations are possible which remain within the concept and scope of the invention. Such variations would become clear to one of ordinary skill in the art after inspection of the specification, drawings and claims herein. The invention therefore is not to be restricted except within the scope of the appended claims.

Claims

Claim 1. A system, comprising: an imaging apparatus, wherein the imaging apparatus is configured to detect image-related events in three dimensions from a subject, and wherein the imaging apparatus lacks an anatomical reference for the subject; a display device; and a processor configured to: establish a seed three-dimensional image for an image space comprising at least a portion of the subject, as a current three-dimensional image for the image space, and seed projection data for the image space as current projection data for the image space; in a series of first iterations for a corresponding series of epochs: receive, from the imaging apparatus, new event data corresponding to newly detected image-related events from the subject, update the current projection data for the image space based on the new event data, in a series of second iterations, within each epoch: determine computed projection data for the image space from the current three-dimensional image for the image space, determine a difference between the computed projection data for the image space and the current projection data for the image space, determine a quality metric for the computed projection data for the image space, based on the difference, compare the quality metric to a threshold, when the quality metric is greater than the threshold: backproject the difference to the current three-dimensional image for the image space, update the current three-dimensional image for the image space based on the backprojection, and continue the series of second iterations, and when the quality metric is less than or equal to the threshold, end the series of second iterations, output the current three-dimensional image for the image space, compute a two-dimensional projection of the image space from the current three-dimensional image for the image space, provide to the display device the computed two-dimensional projection of the image space; and continue the series of first iterations until the processor receives a request to stop.
Claim 2. The system of claim 1, wherein the processor is further configured to: compute a second two-dimensional projection of the image space from the current three-dimensional image for the image space; and display the second two-dimensional projection of the image space.
Claim 3. The system of claim 1, wherein the processor is further configured to: receive from a user an indication of a period corresponding to a specified number of epochs for periodically resetting the event data, the seed three-dimensional image, and the seed projection data; and reset the event data, the seed three-dimensional image, and the seed projection data periodically for each period.
Claim 4. The system of claim 1, wherein the processor is further configured to: receive from a user an indication of a decay time constant for exponentially decaying the event data, the seed three-dimensional image, and the seed projection data; and exponentially decay the event data, the seed three-dimensional image, and the seed projection data at a decay rate corresponding to the decay time constant.
Claim 5. The system of claim 1, wherein the imaging apparatus comprises a three-dimensional (3-D) single photon emission computed tomography (SPECT) imaging apparatus.
Claim 6. The system of claim 1, wherein the imaging apparatus comprises a Positron Emission Tomography (PET) imaging apparatus.
Claim 7. A method, comprising: establishing a seed three-dimensional image for an image space comprising at least a portion of a subject as a current three-dimensional image for the image space, and seed projection data for the image space as current projection data for the image space; and in a series of iterations for a corresponding series of epochs: receiving new event data corresponding to newly detected image-related events generated from the subject, updating the current projection data for the image space based on the new event data, performing an iterative reconstruction of the current three-dimensional image for the image space using the updated current projection data for the image space, computing a two-dimensional projection of the image space from the reconstructed current three-dimensional image for the image space, and displaying the two-dimensional projection of the image space.
Claim 8. The method of claim 7, further comprising: computing a second two-dimensional projection of the image space from the current three-dimensional image for the image space; and displaying the second two-dimensional projection of the image space.
Claim 9. The method of claim 7, wherein performing an iterative reconstruction of the current three-dimensional image for the image space using the updated current projection data for the image space comprises performing a series of second iterations within each epoch, wherein each second iteration comprises: determining computed projection data for the image space from the current three-dimensional image for the image space, determining a difference between the computed projection data for the image space and the current projection data for the image space, determining a quality metric for the computed projection data for the image space, based on the difference, comparing the quality metric to a threshold, when the quality metric is greater than the threshold: backprojecting the difference to the current three-dimensional image for the image space, updating the current three-dimensional image for the image space based on the backprojection, and continuing the series of second iterations, and when the quality metric is less than or equal to the threshold, stopping the iterative reconstruction.
Claim 10. The method of claim 7, further comprising: receiving from a user an indication of a period corresponding to a specified number of epochs for periodically resetting the event data, the seed three-dimensional image, and the seed projection data; and resetting the event data, the seed three-dimensional image, and the seed projection data periodically for each period.
Claim 11. The method of claim 7, further comprising: receiving from a user an indication of a decay time constant for exponentially decaying the event data, the seed three-dimensional image, and the seed projection data; and exponentially decaying the event data, the seed three-dimensional image, and the seed projection data at a decay rate corresponding to the decay time constant.
Claim 12. The method of claim 7, wherein the new event data is received from an imaging apparatus which lacks an anatomical reference for the subject.
Claim 13. The method of claim 12, wherein the new event data is received from a three-dimensional (3-D) single photon emission computed tomography (SPECT) imaging apparatus.
Claim 14. The method of claim 12, wherein the new event data is received from a Positron Emission Tomography (PET) imaging apparatus.
Claim 15. A system, comprising: an imaging apparatus, wherein the imaging apparatus is configured to detect image-related events in three dimensions from a subject, and wherein the imaging apparatus lacks an anatomical reference for the subject; and a processor configured to: establish a seed three-dimensional image for an image space comprising at least a portion of the subject, as a current three-dimensional image for the image space, and seed projection data for the image space as current projection data for the image space; in a series of first iterations for a corresponding series of epochs: receive, from the imaging apparatus, new event data corresponding to newly detected image-related events, update the current projection data for the image space based on the new event data, perform an iterative reconstruction of the current three-dimensional image for the image space using the updated current projection data for the image space, and compute a two-dimensional projection of the image space from the current three-dimensional image of the image space.
Claim 16. The system of claim 15, wherein the processor is further configured to: compute a second two-dimensional projection of the image space from the current three-dimensional image for the image space; and display the second two-dimensional projection of the image space.
Claim 17. The system of claim 15, wherein the processor is further configured to: receive from a user an indication of a decay time constant for exponentially decaying the event data, the seed three-dimensional image, and the seed projection data; and exponentially decay the event data, the seed three-dimensional image, and the seed projection data at a decay rate corresponding to the decay time constant.
Claim 18. The system of claim 15, wherein the processor is further configured to: receive from a user an indication of a period corresponding to a specified number of epochs for periodically resetting the event data, the seed three-dimensional image, and the seed projection data; and reset the event data, the seed three-dimensional image, and the seed projection data periodically for each period.
Claim 19. The system of claim 15, wherein the imaging apparatus comprises a three-dimensional (3-D) single photon emission computed tomography (SPECT) imaging apparatus.
Claim 20. The system of claim 15, wherein the imaging apparatus comprises a Positron Emission Tomography (PET) imaging apparatus.
PCT/EP2021/072875 2020-08-26 2021-08-18 System and method for providing near real-time two dimensional projection images WO2022043143A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202063070309P 2020-08-26 2020-08-26
US63/070,309 2020-08-26

Publications (1)

Publication Number Publication Date
WO2022043143A1 true WO2022043143A1 (en) 2022-03-03

Family

ID=77520762

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2021/072875 WO2022043143A1 (en) 2020-08-26 2021-08-18 System and method for providing near real-time two dimensional projection images

Country Status (1)

Country Link
WO (1) WO2022043143A1 (en)


Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120070057A1 (en) * 2009-06-08 2012-03-22 Koninklijke Philips Electronics N.V. Time-of-flight positron emission tomography reconstruction using image content generated event-by-event based on time-of-flight information
US10380735B2 (en) * 2010-04-16 2019-08-13 Koninklijke Philips N.V. Image data segmentation

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
JI XIAOYING ET AL: "Fast PET Preview Image Reconstruction, Streaming, and Visualization During Data Acquisition: A Preliminary Study", JOURNAL OF NUCLEAR MEDICINE TECHNOLOGY., vol. 47, no. 3, 15 February 2019 (2019-02-15), US, pages 243 - 248, XP055863245, ISSN: 0091-4916, DOI: 10.2967/jnmt.118.218511 *

Similar Documents

Publication Publication Date Title
JP6192542B2 (en) Truncation correction for iterative cone beam CT reconstruction for SPECT / CT systems
EP3224801B1 (en) Multi-modality imaging system and method
US7991450B2 (en) Methods and systems for volume fusion in diagnostic imaging
US9619905B2 (en) Apparatus and method for generation of attenuation map
US8421021B2 (en) Motion correction of SPECT images
JP5172347B2 (en) Reconstruction of 2D planar images of nuclear medicine by iterative constraint deconvolution
JP3800101B2 (en) Tomographic image creating apparatus, tomographic image creating method and radiation inspection apparatus
US11309072B2 (en) Systems and methods for functional imaging
JP2011507640A (en) Image Restoration Method Using Dilution Constraint Correction
US9905044B1 (en) Systems and methods for functional imaging
US20080187094A1 (en) Method and system for performing local tomography
CN101842806A (en) Dirty isotope pet reconstruction
JP2018505390A (en) Radiation emission imaging system, storage medium, and imaging method
US20070297575A1 (en) Systems and methods for determining object position
EP3804625A1 (en) Internal dose tomography
JP2015504515A (en) SPECT system without gantry
WO2022043143A1 (en) System and method for providing near real-time two dimensional projection images
WO2022096335A1 (en) System and method for nuclear medicine imaging with adaptive stopping criteria
WO2022073744A1 (en) System and method for automated patient and phantom positioning for nuclear medicine imaging
JP7209496B2 (en) nuclear medicine diagnostic equipment
Atkins et al. Positron emission computed tomography using large area detectors
JP2023141790A (en) Nuclear medicine diagnosis device and adsorption coefficient image estimation method
Weiss Cohen et al. An automatic system for analyzing phantom images to determine the reliability of PET/SPECT cameras
JP2003232854A (en) Device for nuclear medicine diagnosis, image processor, and method for image reconstruction

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21762489

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21762489

Country of ref document: EP

Kind code of ref document: A1