CN115222599A - PET image reconstruction method, system and storage medium - Google Patents

PET image reconstruction method, system and storage medium

Info

Publication number
CN115222599A
CN115222599A
Authority
CN
China
Prior art keywords
bed
image
beds
raw data
initial
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210878458.1A
Other languages
Chinese (zh)
Inventor
吕杨
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai United Imaging Healthcare Co Ltd
Original Assignee
Shanghai United Imaging Healthcare Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai United Imaging Healthcare Co Ltd filed Critical Shanghai United Imaging Healthcare Co Ltd
Priority to CN202210878458.1A priority Critical patent/CN115222599A/en
Publication of CN115222599A publication Critical patent/CN115222599A/en
Pending legal-status Critical Current


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00Geometric image transformations in the plane of the image
    • G06T3/40Scaling of whole images or parts thereof, e.g. expanding or contracting
    • G06T3/4038Image mosaicing, e.g. composing plane images from plane sub-images
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/70Denoising; Smoothing
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2200/00Indexing scheme for image data processing or generation, in general
    • G06T2200/32Indexing scheme for image data processing or generation, in general involving image mosaicing
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10072Tomographic images
    • G06T2207/10104Positron emission tomography [PET]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20084Artificial neural networks [ANN]

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • Molecular Biology (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Nuclear Medicine (AREA)

Abstract

This specification provides a PET image reconstruction method, system, and storage medium. The method may include acquiring raw data and correction coefficients for at least two beds, and performing iterative reconstruction on the raw data of the at least two beds based on the correction coefficients to obtain a target image spliced from the at least two beds. At least one iteration of the iterative reconstruction includes: updating the initial image of each of the at least two beds, splicing the updated images of the beds to obtain a spliced image, processing the spliced image to obtain a processed image, and splitting the processed image to obtain the initial image of each of the at least two beds for the next iteration.

Description

PET image reconstruction method, system and storage medium
Technical Field
The present disclosure relates to the field of image processing technologies, and in particular, to a positron emission tomography (PET) image reconstruction method, system, and storage medium.
Background
A Positron Emission Tomography (PET) system is a radionuclide-based imaging system. A conventional clinical PET system uses a cylindrical detector geometry whose axial imaging field of view is generally 15-30 cm. Because the axial field of view is limited, a single scan (one scan position is also referred to as a bed) covers only part of the subject (e.g., a human body), so multiple scans (i.e., multiple beds) are required to cover the whole subject. During a single-bed scan, the geometry of the PET system causes its sensitivity to follow a triangular distribution along the axial direction, so the signal-to-noise ratio of a single-bed PET image is non-uniform axially: low at both ends and high in the middle. To alleviate this problem, an overlap region is set between adjacent beds during multi-bed scanning, and the images of adjacent beds are weighted-averaged in the overlap region, which reduces the noise in the splicing region and improves the axial signal-to-noise ratio of the PET image. Ideally, a 50% overlap region would yield a uniform axial signal-to-noise ratio. However, an overly large overlap region means that more beds are needed to cover the whole scanning range, which reduces the scanning efficiency of the PET system.
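The weighted-average splicing of the overlap region described above can be sketched in a few lines. This is an illustrative sketch only, not the patent's method: the linear ramp weights (which mimic the triangular axial sensitivity), the one-dimensional axial profiles, and the function name are all assumptions made for illustration.

```python
import numpy as np

def splice_with_weights(img_a, img_b, overlap):
    """Splice two adjacent bed profiles along the axial direction, blending
    the overlap region with linear ramp weights (illustrative choice)."""
    w = np.linspace(0.0, 1.0, overlap)        # ramp up toward the later bed
    blended = (1.0 - w) * img_a[-overlap:] + w * img_b[:overlap]
    return np.concatenate([img_a[:-overlap], blended, img_b[overlap:]])
```

With two 6-sample beds and a 2-sample overlap, the spliced profile has 6 + 6 - 2 = 10 samples, and the blend transitions from the first bed's values to the second's across the overlap.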
Therefore, it is desirable to provide a PET image reconstruction method and system that can reduce the noise in the splicing region of the reconstructed image, obtain a PET image with a more uniform axial signal-to-noise ratio, relax the constraint on the length of the overlap region, and improve the working efficiency of the PET system.
Disclosure of Invention
One aspect of the present specification provides a PET image reconstruction method. The method may include obtaining raw data and correction coefficients for at least two beds; and performing iterative reconstruction on the raw data of the at least two beds based on the correction coefficients to obtain a target image spliced from the at least two beds. At least one iteration of the iterative reconstruction includes: updating the initial image of each of the at least two beds, splicing the updated images of the beds to obtain a spliced image, processing the spliced image to obtain a processed image, and splitting the processed image to obtain the initial image of each of the at least two beds for the next iteration.
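The update-splice-process-split loop claimed above can be illustrated with a toy one-dimensional sketch. Everything here is a stand-in and an assumption, not the patent's actual operators: the "raw data" are treated as per-bed target profiles, the update is a trivial multiplicative step, and the "processing" is a 3-point smoothing.

```python
import numpy as np

def stitch(images, overlap):
    """Splice per-bed axial profiles, averaging the overlap region."""
    out = images[0].copy()
    for img in images[1:]:
        out[-overlap:] = 0.5 * (out[-overlap:] + img[:overlap])
        out = np.concatenate([out, img[overlap:]])
    return out

def split(whole, bed_len, n_beds, overlap):
    """Cut a spliced profile back into overlapping per-bed pieces."""
    step = bed_len - overlap
    return [whole[i * step : i * step + bed_len] for i in range(n_beds)]

def reconstruct_multibed(raw, overlap=4, n_iter=5):
    """Toy rendition of the claimed loop: update, splice, process, split."""
    bed_len = raw[0].size
    images = [np.ones(bed_len) for _ in raw]       # initial images
    for _ in range(n_iter):
        # 1) update each bed's image toward its data (toy multiplicative step)
        images = [img * raw_i / np.maximum(img, 1e-9)
                  for img, raw_i in zip(images, raw)]
        # 2) splice the updated bed images into one whole image
        whole = stitch(images, overlap)
        # 3) "process" the spliced image (stand-in: 3-point smoothing)
        whole = np.convolve(whole, [0.25, 0.5, 0.25], mode="same")
        # 4) split it back into the next iteration's initial images
        images = split(whole, bed_len, len(raw), overlap)
    return stitch(images, overlap)
```

The point of the structure is that the denoising step sees the whole spliced image each iteration, so the overlap region is processed jointly rather than per bed.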
In some embodiments, obtaining the raw data of the at least two beds may comprise: acquiring a scan range of a scanning object; determining the number of beds and the overlap region of the scanning bed according to the scan range; determining the position of the scanning bed for each of the at least two beds based on the number of beds and the overlap region; acquiring the scan time of each bed; and, based on the scan time of each bed, controlling an imaging device to perform data acquisition on the scanning object when the scanning bed moves to the position of that bed, so as to obtain the raw data of the at least two beds.
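The bed-count and bed-position planning above might be sketched as follows. The formula, the millimeter units, and the function and parameter names are illustrative assumptions, not taken from the patent.

```python
import math

def plan_beds(scan_range_mm, axial_fov_mm, overlap_frac):
    """Derive a bed count and per-bed couch positions from the requested
    scan range, the axial field of view, and an overlap fraction."""
    step = axial_fov_mm * (1.0 - overlap_frac)       # axial advance per bed
    extra = max(scan_range_mm - axial_fov_mm, 0.0)   # range beyond one bed
    n_beds = 1 + math.ceil(extra / step)
    positions = [i * step for i in range(n_beds)]    # start of each bed's FOV
    return n_beds, positions
```

For a 600 mm scan range, a 250 mm axial field of view, and 20% overlap, the advance per bed is 200 mm, so three beds starting at 0, 200, and 400 mm cover 650 mm.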
In some embodiments, obtaining the raw data of the at least two beds may further comprise: pre-processing the raw data of the at least two beds to extract the positioning information and time-of-flight (TOF) information of coincidence events.
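A toy rendition of this pre-processing step: pair up single detection events into coincidence events, keeping their positioning information (the LOR endpoints) and TOF information (the arrival-time difference). The event record, the energy window, and the timing window are illustrative assumptions, not values from the patent.

```python
from dataclasses import dataclass

@dataclass
class SingleEvent:
    detector_id: int
    energy_kev: float
    time_ps: float

def screen_coincidences(events, energy_window=(435.0, 585.0), tau_ps=4000.0):
    """Keep pairs of singles inside the energy window and within the
    coincidence timing window; record LOR endpoints and the TOF delta."""
    kept = [e for e in sorted(events, key=lambda e: e.time_ps)
            if energy_window[0] <= e.energy_kev <= energy_window[1]]
    pairs = []
    for a, b in zip(kept, kept[1:]):
        if b.time_ps - a.time_ps <= tau_ps and a.detector_id != b.detector_id:
            # positioning info = the LOR endpoints; TOF info = time difference
            pairs.append(((a.detector_id, b.detector_id), b.time_ps - a.time_ps))
    return pairs
```

A real coincidence processor is considerably more involved (randoms, multiples, delayed windows); this only shows where the positioning and TOF information come from.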
In some embodiments, obtaining the correction coefficients may include: performing physical correction on the raw data of the at least two beds to obtain the correction coefficients.
In some embodiments, updating the initial image of each of the at least two beds may include: initializing the initial image of each of the at least two beds, or acquiring the initial image of each of the at least two beds obtained from the previous iteration; forward projecting the initial image of each of the at least two beds; comparing the forward projection data of each of the at least two beds with the raw data of that bed to obtain a deviation value; and back projecting the deviation value and accumulating it into the initial image of the corresponding bed, so as to update the initial image of each bed.
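The per-bed update described above (forward project, compare, back project, accumulate) can be sketched as a single function over a linear system matrix. The relaxation factor and sensitivity normalization are illustrative assumptions, not the patent's exact update formula.

```python
import numpy as np

def update_bed_image(image, raw, A, relax=0.5):
    """One toy additive update for a single bed: forward project the current
    image, take its deviation from the raw data, back project the deviation,
    and accumulate it into the image."""
    forward = A @ image                       # forward projection
    deviation = raw - forward                 # compare with raw data
    sens = A.T @ np.ones(A.shape[0])          # per-voxel sensitivity
    image = image + relax * (A.T @ deviation) / np.maximum(sens, 1e-9)
    return np.maximum(image, 0.0)             # activity is non-negative
```

With an identity system matrix, repeated updates converge geometrically to the measured data, which is the behavior the accumulate-the-deviation step is meant to produce.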
In some embodiments, the at least two beds may include N beds, where N ≥ 3 and N is a positive integer. Performing iterative reconstruction on the raw data of the at least two beds to obtain a target image spliced from the at least two beds may include: performing iterative reconstruction on the raw data of the first bed and the second bed to obtain a spliced image M1; performing iterative reconstruction on the raw data of the (N-i)-th bed and the (N-i+1)-th bed, where 2 ≤ i < N and i is a positive integer, to obtain a spliced image M(N-i); performing iterative reconstruction on the raw data of the (N-1)-th bed and the N-th bed to obtain a spliced image M(N-1); and splicing the images M1, ..., M(N-i), ..., M(N-1) to obtain the target image.
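The pairwise N-bed scheme above can be expressed as a small driver. Here `reconstruct_pair` and `combine` are placeholders standing in for the iterative two-bed reconstruction and the final splicing step; the function names are assumptions for illustration.

```python
def stitch_n_beds(raw_beds, reconstruct_pair, combine):
    """Reconstruct each adjacent pair of beds (k, k+1) into M_k,
    then combine M_1 ... M_{N-1} into the target image."""
    n = len(raw_beds)
    assert n >= 3, "the scheme is described for N >= 3 beds"
    pair_images = [reconstruct_pair(raw_beds[k], raw_beds[k + 1])
                   for k in range(n - 1)]       # M_1 ... M_{N-1}
    return combine(pair_images)
```

With toy list-valued "beds", concatenating pairwise results makes the structure visible: every interior bed contributes to two pair images.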
In some embodiments, the physical correction may include a scatter correction, and the correction coefficients may comprise scatter correction coefficients. Performing the scatter correction on the raw data of the at least two beds may include: acquiring the raw data of a previous bed and a current bed; and performing iterative correction on the raw data of the previous bed and the current bed to obtain at least a scatter correction coefficient of the previous bed, wherein at least one iteration of the iterative correction includes: acquiring a previous initial activity distribution image of the previous bed and a current initial activity distribution image of the current bed obtained from the previous iteration; splicing the previous initial activity distribution image and the current initial activity distribution image to obtain an intermediate activity distribution image; acquiring at least one scattering point in the intermediate activity distribution image; determining an intermediate scatter distribution estimate based on the intermediate activity distribution image and the at least one scattering point; splitting the intermediate scatter distribution estimate to obtain a candidate scatter distribution estimate for each of the previous bed and the current bed; and performing image reconstruction based on the candidate scatter distribution estimate of each of the previous bed and the current bed and the raw data of the corresponding bed, so as to update the previous initial activity distribution image and the current initial activity distribution image.
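The two-bed scatter-correction loop above can be mimicked with a toy model. The scatter estimate here (a smoothed fraction of the local activity) is a crude placeholder for a real single-scatter simulation, and every name, constant, and the subtraction-style "reconstruction" are assumptions; only the splice → estimate → split → update structure follows the text.

```python
import numpy as np

def scatter_correct_two_beds(raw_prev, raw_cur, n_iter=3, overlap=2):
    """Toy two-bed scatter-correction iteration: splice the activity images,
    estimate scatter on the spliced image, split the estimate per bed, and
    re-reconstruct each bed with its scatter estimate."""
    act_prev = np.ones_like(raw_prev)
    act_cur = np.ones_like(raw_cur)
    for _ in range(n_iter):
        # splice the two activity images into one intermediate image
        mid = np.concatenate([act_prev[:-overlap],
                              0.5 * (act_prev[-overlap:] + act_cur[:overlap]),
                              act_cur[overlap:]])
        # scatter estimate over the spliced image (placeholder model)
        scatter = 0.1 * np.convolve(mid, np.ones(3) / 3.0, mode="same")
        # split the estimate back into per-bed candidate estimates
        s_prev, s_cur = scatter[:raw_prev.size], scatter[-raw_cur.size:]
        # re-reconstruct each bed with its scatter estimate (toy subtraction)
        act_prev = np.maximum(raw_prev - s_prev, 0.0)
        act_cur = np.maximum(raw_cur - s_cur, 0.0)
    return s_prev, s_cur
```

The structural point survives the toy model: the scatter estimate is formed on the spliced image, so scatter originating in one bed can contribute to the estimate of its neighbor.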
In some embodiments, the physical correction comprises a scatter correction, the correction coefficients comprise scatter correction coefficients, and performing the scatter correction on the raw data of the at least two beds comprises: acquiring the raw data of a first bed, a second bed, and a third bed that are consecutive; and iteratively correcting the raw data of the first bed, the second bed, and the third bed to obtain at least a scatter correction coefficient of the second bed, wherein at least one iteration of the iterative correction includes: obtaining a first initial activity distribution image of the first bed, a second initial activity distribution image of the second bed, and a third initial activity distribution image of the third bed from the previous iteration; splicing the first, second, and third initial activity distribution images to obtain an intermediate activity distribution image; acquiring at least one scattering point in the intermediate activity distribution image; determining an intermediate scatter distribution estimate based on the intermediate activity distribution image and the at least one scattering point; splitting the intermediate scatter distribution estimate to obtain a candidate scatter distribution estimate for each of the first, second, and third beds; and performing image reconstruction based on the candidate scatter distribution estimate of each of the first, second, and third beds and the raw data of the corresponding bed, so as to update the first, second, and third initial activity distribution images.
Another aspect of the present specification provides a PET image reconstruction system. The system comprises an acquisition module and a reconstruction module. The acquisition module may be configured to acquire the raw data and correction coefficients of at least two beds. The reconstruction module may be configured to perform iterative reconstruction on the raw data of the at least two beds based on the correction coefficients to obtain a target image spliced from the at least two beds, wherein at least one iteration of the iterative reconstruction may include updating the initial image of each of the at least two beds, splicing the updated images of the beds to obtain a spliced image, processing the spliced image to obtain a processed image, and splitting the processed image to obtain the initial image of each of the at least two beds for the next iteration.
Yet another aspect of the present specification provides a computer-readable storage medium storing computer instructions which, when read by a computer, cause the computer to perform the operations of the PET image reconstruction method as described above.
Additional features will be set forth in part in the description which follows and in part will become apparent to those having ordinary skill in the art upon examination of the following and the accompanying drawings or may be learned by production or operation of the examples. The features of the present invention may be realized and obtained by means of the instruments and methods set forth in the detailed description below.
Drawings
This description may be further illustrated by way of exemplary embodiments, which are described in detail with reference to the accompanying drawings. These embodiments are non-limiting exemplary embodiments, in which like reference numerals represent similar structures throughout the several views of the drawings, and wherein:
FIG. 1 is a schematic diagram of an application scenario of an image reconstruction system according to some embodiments of the present description;
FIG. 2 is an exemplary block diagram of an image reconstruction system, shown in accordance with some embodiments of the present description;
FIG. 3 is an exemplary flow diagram of a PET image reconstruction method shown in accordance with some embodiments of the present description;
FIG. 4 is an exemplary flow diagram of a PET image reconstruction method shown in accordance with some embodiments of the present description;
FIG. 5 is an exemplary flow diagram of a PET image reconstruction method according to some embodiments of the present description;
FIG. 6 is an exemplary flow diagram of a scatter correction coefficient determination method, shown in accordance with some embodiments of the present description; and
fig. 7 is an exemplary flow diagram of a scatter correction coefficient determination method, shown in accordance with some embodiments of the present description.
Detailed Description
In order to more clearly illustrate the technical solutions of the embodiments of the present specification, the drawings used in the description of the embodiments are briefly introduced below. It is obvious that the drawings in the following description are only examples or embodiments of the present description, and that a person skilled in the art can, without inventive effort, apply the present description to other similar scenarios on the basis of these drawings. Unless otherwise apparent from the context or otherwise indicated, like reference numbers in the figures refer to the same structure or operation.
As used in this specification and the appended claims, the singular forms "a," "an," and "the" include plural referents unless the context clearly dictates otherwise. In general, the terms "comprise" and "include" merely indicate that the explicitly identified steps and elements are included; the steps and elements do not constitute an exclusive list, and a method or apparatus may also include other steps or elements.
It should be understood that the terms "system," "device," "unit," "component," "module," and/or "block" as used herein are one way to distinguish between components, elements, parts, or assemblies at different levels. However, these terms may be replaced by other expressions that achieve the same purpose.
Flow charts are used in this description to illustrate operations performed by a system according to embodiments of the present description. It should be understood that the operations are not necessarily performed in the exact order shown. Rather, the operations may be processed in reverse order or concurrently. Moreover, other operations may be added to, or removed from, these processes.
FIG. 1 is a schematic diagram of an application scenario of an exemplary image reconstruction system, shown in accordance with some embodiments of the present description. As shown in fig. 1, the image reconstruction system 100 may include an imaging device 110, a network 120, a terminal device 130, a processing device 140, and a storage device 150. The components of the image reconstruction system 100 may be connected in various ways. By way of example only, as shown in fig. 1, processing device 140 may be connected to imaging device 110 via network 120. As another example, processing device 140 may be directly connected to imaging device 110 (as indicated by the dashed double-headed arrow connecting processing device 140 and imaging device 110). As another example, the terminal devices (e.g., 131, 132, 133, etc.) may be directly connected to the processing device 140 (as indicated by the dashed double-headed arrow connecting the terminal device 130 and the processing device 140), or may be connected to the processing device 140 via the network 120.
The imaging device 110 may be used to acquire image data relating to a scan object or a portion thereof. In some embodiments, the scan object may include a human, an animal (e.g., a laboratory mouse), a phantom, etc., or any combination thereof. In some embodiments, the scan object may include a particular portion of a human body, such as the head, chest, abdomen, etc., or any combination thereof. In some embodiments, the scan object may include a particular organ, such as the heart, thyroid, esophagus, trachea, stomach, gallbladder, small intestine, colon, bladder, ureter, uterus, fallopian tube, and the like. In some embodiments, imaging device 110 may include a positron emission tomography (PET) device, a PET-CT device, a PET/MRI device, a single-photon emission computed tomography (SPECT) device, or the like. For convenience of description, this specification uses a PET device as the example of imaging device 110, which does not limit the scope of the present specification.
Network 120 may facilitate the exchange of information and/or data. In some embodiments, one or more components of image reconstruction system 100 (e.g., imaging device 110, terminal device 130, processing device 140, storage device 150, etc.) may exchange information and/or data with other components in image reconstruction system 100 via network 120. For example, the processing device 140 may obtain raw data for at least two beds from the imaging device 110 via the network 120.
Terminal device 130 may enable a user to interact with other components in image reconstruction system 100. For example, the user may acquire the target image from the processing device 140 through the terminal device 130. As another example, terminal device 130 may also retrieve data/information stored in storage device 150 via network 120. In some embodiments, the terminal device 130 may include a mobile device 131, a tablet computer 132, a laptop computer 133, or the like, or any combination thereof.
Processing device 140 may process information and/or data obtained from imaging device 110, terminal device 130, and/or storage device 150. For example, the processing device 140 may obtain the raw data of the at least two beds and the correction coefficients to iteratively reconstruct the raw data of the at least two beds based on the correction coefficients. For another example, the processing device 140 may perform a physical correction on the raw data of at least two beds to obtain the correction coefficients. In some embodiments, the processing device 140 may be a single server or a group of servers. The server group may be centralized or distributed. In some embodiments, the processing device 140 may be local or remote. For example, processing device 140 may access information and/or data from imaging device 110, terminal device 130, and/or storage device 150 via network 120. As another example, processing device 140 may be directly connected to imaging device 110, terminal device 130, and/or storage device 150 to access information and/or data. In some embodiments, processing device 140 may include one or more processing units (e.g., single-core processing engines or multi-core processing engines). By way of example only, the processing device 140 may include a Central Processing Unit (CPU), an Application-Specific Integrated Circuit (ASIC), an Application-Specific Instruction-set Processor (ASIP), a Graphics Processing Unit (GPU), a Physics Processing Unit (PPU), a Digital Signal Processor (DSP), a Field-Programmable Gate Array (FPGA), a Programmable Logic Device (PLD), a controller, a microcontroller unit, a Reduced Instruction Set Computer (RISC), a microprocessor, or the like, or any combination thereof. In some embodiments, the processing device 140 may be implemented on a cloud platform.
For example, the cloud platform may include one or a combination of private cloud, public cloud, hybrid cloud, community cloud, distributed cloud, cross-cloud, multi-cloud, and the like. In some embodiments, processing device 140 may be part of imaging device 110 or terminal device 130.
Storage device 150 may store data, instructions, and/or any other information. In some embodiments, storage device 150 may store data obtained from imaging device 110, terminal device 130, and/or processing device 140. For example, the storage device 150 may store parameters (e.g., a reconstruction protocol) associated with the imaging device 110. For another example, the storage device 150 may store raw data (or image data) of the scan object obtained from the imaging device 110. In some embodiments, storage device 150 may store data and/or instructions that processing device 140 may execute or use to perform the example methods described herein. In some embodiments, the storage device 150 may include one or a combination of mass storage, removable storage, volatile read-write memory, read-only memory (ROM), and the like. In some embodiments, the storage device 150 may be implemented by a cloud platform as described herein.
In some embodiments, storage device 150 may be connected to network 120 to enable communication with one or more components (e.g., imaging device 110, processing device 140, terminal device 130, etc.) in image reconstruction system 100. One or more components in the image reconstruction system 100 may read data or instructions in the storage device 150 over the network 120. In some embodiments, the storage device 150 may be part of the processing device 140 or may be separate and directly or indirectly coupled to the processing device 140.
It should be noted that the above description of the image reconstruction system 100 is for illustrative purposes only and is not intended to limit the scope of the present description. It will be understood by those skilled in the art that, given the teachings of this system, various modifications and changes in form and detail may be made to the system and its applications without departing from those teachings. However, such changes and modifications do not depart from the scope of the present specification. For example, the imaging device 110, the processing device 140, and the terminal device 130 may share one storage device 150, or may each have their own storage device.
FIG. 2 is an exemplary block diagram of an image reconstruction system shown in accordance with some embodiments of the present description. In some embodiments, the image reconstruction system 200 may be implemented by the processing device 140. As shown in fig. 2, the image reconstruction system 200 may include an acquisition module 210 and a reconstruction module 220.
The acquisition module 210 may be configured to obtain the raw data and correction coefficients of at least two beds. In some embodiments, the acquisition module 210 may pre-process the raw data of the at least two beds prior to iterative reconstruction. For example, the acquisition module 210 may screen the raw data of the at least two beds to extract the positioning information and time-of-flight (TOF) information of coincidence events. As another example, the acquisition module 210 may truncate the raw data of the at least two beds (e.g., take only the data of the first few minutes of each scan) to obtain the target raw data desired by the user.
The reconstruction module 220 may be configured to perform iterative reconstruction on the raw data of the at least two beds based on the correction coefficients, so as to obtain a target image spliced from the at least two beds. For example, the reconstruction module 220 may initialize an image for each bed to obtain the initial image of each bed. The reconstruction module 220 may update the initial image of each of the at least two beds and splice the updated images of the beds to obtain a spliced image. Further, the reconstruction module 220 may process the spliced image to obtain a processed image, and split the processed image to obtain the initial image of each of the at least two beds for the next iteration. When an iteration stop condition is satisfied, the reconstruction module 220 may take the spliced image of the current iteration as the target image. As another example, when the number of beds is N, where N ≥ 3 and N is an integer, the reconstruction module 220 may perform iterative reconstruction on the raw data of the first bed and the second bed to obtain a spliced image M1, perform iterative reconstruction on the raw data of the second bed and the third bed to obtain a spliced image M2, and so on, performing iterative reconstruction on the raw data of the (N-i)-th bed and the (N-i+1)-th bed (2 ≤ i < N, i an integer) to obtain a spliced image M(N-i), and finally perform iterative reconstruction on the raw data of the (N-1)-th bed and the N-th bed to obtain a spliced image M(N-1). The reconstruction module 220 may then splice the images M1, M2, ..., M(N-i), ..., M(N-1) to obtain the target image. For more description of the iterative reconstruction, reference may be made to FIGS. 4 and 5 and the description thereof.
It should be understood that the system and its modules shown in FIG. 2 may be implemented in a variety of ways. For example, in some embodiments, the system and its modules may be implemented in hardware, software, or a combination of software and hardware. Wherein the hardware portion may be implemented using dedicated logic; the software portions may be stored in a memory and executed by a suitable instruction execution system, such as a microprocessor or specially designed hardware. Those skilled in the art will appreciate that the methods and systems described above may be implemented using computer executable instructions and/or embodied in processor control code, such code being provided, for example, on a carrier medium such as a diskette, CD-or DVD-ROM, a programmable memory such as read-only memory (firmware), or a data carrier such as an optical or electronic signal carrier. The system and its modules in this specification may be implemented not only by hardware circuits such as very large scale integrated circuits or gate arrays, semiconductors such as logic chips, transistors, or programmable hardware devices such as field programmable gate arrays, programmable logic devices, etc., but also by software executed by various types of processors, for example, or by a combination of the above hardware circuits and software (e.g., firmware).
It should be noted that the above description of the system and its modules is for convenience of description only and should not limit the present disclosure to the illustrated embodiments. It will be appreciated by those skilled in the art that, given the teachings of the present system, any combination of modules or sub-system configurations may be used to connect to other modules without departing from such teachings. For example, in some embodiments, the acquisition module 210 may include two units, e.g., a raw data acquisition unit and a correction coefficient acquisition unit, to acquire raw data and a correction coefficient, respectively. For another example, the modules may share one storage device, and each module may have its own storage device. Such variations are within the scope of the present disclosure.
Fig. 3 is an exemplary flow chart of a PET image reconstruction method shown in accordance with some embodiments of the present description. In some embodiments, flow 300 may be performed by processing logic that may comprise hardware (e.g., circuitry, dedicated logic, programmable logic, microcode, etc.), software (instructions run on a processing device to perform hardware simulation), etc., or any combination thereof. In some embodiments, the process 300 may be implemented as a set of instructions (e.g., an application program) stored in the storage device 150. The set of instructions may be executed by processing device 140 and/or the modules in fig. 2, and when executing the instructions, processing device 140 and/or the modules may be configured to perform flow 300. The operations of the illustrated process presented below are intended to be illustrative. In some embodiments, flow 300 may be accomplished with one or more additional operations not described and/or without one or more operations discussed herein. Additionally, the order of the operations in the process as shown in FIG. 3 and described below is not intended to be limiting.
In step 310, the processing device 140 can obtain raw data and correction coefficients for at least two beds. In some embodiments, step 310 may be performed by acquisition module 210 in system 200.
In this specification, the raw data of a certain bed refers to the image data (which may also be referred to as projection data or a sinogram) corresponding to a scanning object (for example, a patient, a phantom, or the like) placed on the scanning bed, acquired by an imaging device (for example, a PET device) when the scanning bed is located at the position corresponding to that bed. In some embodiments, the raw data of the at least two beds may include respiratory gating information, position information of a line of response (LOR), energy information of the line of response, and/or time information of the line of response, etc.
In some embodiments, the processing device 140 may obtain raw data and/or correction coefficients for at least two beds from the storage device 150 and/or the imaging device 110. For example, when performing a multi-bed scan, the imaging device 110 may transmit the acquired raw data for each bed to the storage device 150 for storage. After the scanning by imaging device 110 is complete, processing device 140 may retrieve raw data for at least two beds from storage device 150. For another example, the imaging device 110 may transmit the acquired raw data for each bed to the processing device 140 in real time for subsequent processing (e.g., image reconstruction, correction coefficient calculation, etc.).
In some embodiments, when performing a multi-bed scan on a scan object, the processing device 140 may acquire a scan range of the scan object and determine the number of beds and the overlap region of the scanning bed according to the scan range. Illustratively, the doctor may input a scanning range of the scanning object through the terminal device 130 according to the lesion size of the patient or the scanning purpose (e.g., a whole-body examination). The processing device 140 can receive the scan range and determine the number of beds (N, N being a positive integer greater than 1) and the overlap region of the scanning bed (e.g., the overlap range of two adjacent beds is 10%, 20%, 50%, etc. of a single bed) in conjunction with the PET device's own structural parameters (e.g., the detector axial range). For example, if the axial scan range of the PET device is L0 and the axial extent of the scanned object is L1, the processing device 140 may determine the number of beds as N = [L1/L0] + 1 (with [·] denoting rounding down), and determine the overlap of two adjacent beds as (N·L0 − L1)/(N − 1), i.e., the N beds of axial length L0 cover the range L1 with N − 1 equal overlaps.
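As a minimal sketch of this bed-count bookkeeping: the overlap formula below is an assumption (N beds of axial length `axial_fov` tiling the scan range with N − 1 equal overlaps), and the names `plan_beds`, `axial_fov`, `scan_range` are illustrative, not taken from the patent.

```python
import math

def plan_beds(axial_fov: float, scan_range: float):
    """Determine the number of beds N and the per-junction overlap.

    Assumes N = floor(scan_range / axial_fov) + 1 and that the N beds,
    each of axial length axial_fov, tile the scan range with N - 1
    equal overlaps: N * axial_fov - (N - 1) * overlap = scan_range.
    """
    n_beds = math.floor(scan_range / axial_fov) + 1
    if n_beds < 2:
        n_beds = 2  # a multi-bed scan needs at least two beds
    overlap = (n_beds * axial_fov - scan_range) / (n_beds - 1)
    return n_beds, overlap
```

For instance, a 30 cm axial field of view covering a 75 cm range yields three beds with a 7.5 cm overlap at each junction.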
In some embodiments, the overlap region may be a default value of the image reconstruction system 100, or may be set by the user through the terminal device 130 (e.g., through a physical button, a touch screen, a mouse, voice, etc.). The processing device 140 may further determine the number of beds (N) based on the overlap region and the scan range. In some embodiments, the user (e.g., a doctor) may directly set the number of beds (N) and/or the overlap area through the terminal device 130.
Further, the processing device 140 may determine the position of the scanning bed at each of the at least two beds (e.g., the coordinate value (or bed code value) of the head or tail of the bed corresponding to each bed, the center bed code value, etc.) based on the number of beds (N) and the overlap region. Further, the processing device 140 may acquire the scan times of the respective beds. In some embodiments, the scan time for each bed may be the same or different. For example, for a whole-body scan of a human body, the scan time of the bed corresponding to the feet may be shorter than the scan time of the bed corresponding to the head. In some embodiments, the scan time of each bed may be a default value of the image reconstruction system 100 or may be set by the user through the terminal device 130. Further, the processing device 140 may control the imaging device (e.g., a PET device) to scan the scanning object based on the scan time and position of each bed to obtain the raw data of each bed. Specifically, the processing device 140 may issue a scanning-bed movement control command to move the scanning bed to a first position corresponding to the first bed. After the scanning bed reaches the first position, the processing device 140 may issue a scan control command to control the imaging device to scan (i.e., acquire data on) the scanning object. After the scanning object has been scanned for the first scan time corresponding to the first bed, the processing device 140 may issue the scanning-bed control command again to move the scanning bed to a second position corresponding to the second bed, and perform the scan of the second bed. By analogy, the processing device 140 may sequentially move the scanning bed to the position of each bed according to the position order of the beds and scan the scanning object, so as to obtain the raw data of the at least two beds.
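The per-bed couch positions described above can be sketched as follows; `bed_positions` and its arguments are hypothetical names, and using the head coordinate of each bed is an illustrative convention (the patent equally allows tail or center bed code values).

```python
def bed_positions(start: float, axial_fov: float, overlap: float, n_beds: int):
    """Axial start coordinate of each bed: consecutive beds advance by
    (axial_fov - overlap), so adjacent beds share `overlap` of coverage."""
    step = axial_fov - overlap
    return [start + i * step for i in range(n_beds)]
```

With a 30 cm field of view and 7.5 cm overlap, three beds starting at 0 would sit at 0, 22.5, and 45 cm; the couch is then driven to each position in turn.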
In some embodiments, when the data acquisition is completed, the processing device 140 may save all the acquired data (i.e., the raw data of each bed) in a list mode (e.g., in the storage device 150) for retrieval and extraction during subsequent use (e.g., image reconstruction, correction coefficient calculation, etc.).
In some embodiments, the correction coefficients may include attenuation correction coefficients, scatter correction coefficients, random correction coefficients, normalized correction coefficients, and the like, or any combination thereof. In some embodiments, the scan object may be scanned by a CT device, an MR device, or the like to obtain a CT image or an MR image of the scan object. The processing device 140 may determine attenuation correction coefficients for the respective beds based on the CT image or the MR image of the scanned subject. In some embodiments, the processing device 140 may derive the correction factor by physically correcting the raw data for at least two beds. For example, the physical correction may include a scatter correction. The processing device 140 may perform scatter correction on the raw data of at least two beds to obtain scatter correction coefficients corresponding to the beds. Exemplary scatter correction methods may include Single Scatter Simulation (SSS), monte-carlo scatter simulation (MCS), and the like. For more description of determining the scatter correction coefficients, reference may be made to fig. 6 and 7 and their description.
In step 320, the processing device 140 may perform iterative reconstruction on the raw data of the at least two beds based on the correction coefficient, so as to obtain a target image after splicing the at least two beds. In some embodiments, step 320 may be performed by the reconstruction module 220 in the system 200.
In some embodiments, the processing device 140 may pre-process the raw data of the at least two beds prior to iteratively reconstructing the raw data. In some embodiments, the preprocessing may include one or more of screening, reordering, downsampling, truncation, and the like. For example, the processing device 140 may screen the raw data of the at least two beds to extract the positioning information and time-of-flight (TOF) information corresponding to each coincidence event. For another example, the processing device 140 may truncate the raw data of the at least two beds (e.g., keep only the data of the first few minutes of each scan) to obtain the target raw data desired by the user.
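A minimal sketch of the time-window truncation mentioned above, assuming list-mode events carry a per-bed timestamp field `t` in seconds (an illustrative representation, not the patent's actual data format):

```python
def truncate_listmode(events, t_start: float, t_end: float):
    """Keep only coincidence events whose timestamp falls in
    [t_start, t_end); each event is assumed to be a dict with a
    't' field giving seconds elapsed since the bed's scan started."""
    return [e for e in events if t_start <= e["t"] < t_end]
```

Keeping, say, only the first two minutes of each bed is then a single call with `t_start=0` and `t_end=120`.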
In some embodiments, the processing device 140 may perform iterative reconstruction on the raw data (or the preprocessed raw data (e.g., the target raw data)) based on the correction coefficients to obtain a target image in which the at least two beds are stitched. For example, the processing device 140 may acquire an initial image of each bed. The processing device 140 may update the initial image of each of the at least two beds and stitch the updated images of the beds to obtain a stitched image. Further, the processing device 140 may process the stitched image to obtain a processed image. The processing device 140 may split the processed image to obtain the initial image of each of the at least two beds for the next iteration. When the iteration stop condition is satisfied, the processing device 140 may determine the stitched image in that iteration as the target image. For another example, when the number of beds N is greater than or equal to 3, the processing device 140 may perform iterative reconstruction on the raw data of the first bed and the second bed to obtain a stitched image M1, and perform iterative reconstruction on the raw data of the second bed and the third bed to obtain a stitched image M2. By analogy, the processing device 140 may perform iterative reconstruction on the raw data of the (N−i)th bed and the (N−i+1)th bed (2 ≤ i < N, i being an integer) to obtain a stitched image M(N−i), and finally perform iterative reconstruction on the raw data of the (N−1)th bed and the Nth bed to obtain a stitched image M(N−1). The processing device 140 may then stitch the images M1, M2, ..., M(N−i), ..., M(N−1) to obtain the target image. More description of the iterative reconstruction can be found in FIG. 4 and FIG. 5 and their descriptions.
It should be noted that the above description is for convenience only and should not be taken as limiting the scope of the embodiments. It will be understood by those skilled in the art that, having the benefit of this disclosure, numerous modifications and variations in form and detail can be made without departing from the principles of the system and the application of the method and system described above.
Fig. 4 is an exemplary flow chart of a PET image reconstruction method shown in accordance with some embodiments of the present description. In some embodiments, flow 400 may be performed by processing logic that may comprise hardware (e.g., circuitry, dedicated logic, programmable logic, microcode, etc.), software (instructions run on a processing device to perform hardware simulation), etc., or any combination thereof. In some embodiments, the flow 400 may be implemented as a set of instructions (e.g., an application program) stored in the storage device 150. Processing device 140 and/or the modules in fig. 2 may execute the set of instructions and, when executing the instructions, processing device 140 and/or the modules may be configured to perform flow 400. The operations of the illustrated process presented below are intended to be illustrative. In some embodiments, flow 400 may be accomplished with one or more additional operations not described and/or without one or more operations discussed herein. Additionally, the order of the operations in the process as shown in FIG. 4 and described below is not intended to be limiting. According to some embodiments of the present description, the flow 400 may be used for offline reconstruction.
In step 410, the processing device 140 can obtain raw data and correction coefficients for at least two beds. In some embodiments, step 410 may be performed by acquisition module 210 in system 200. In some embodiments, the correction coefficients may include attenuation correction coefficients, scatter correction coefficients, random correction coefficients, or the like, or any combination thereof.
In some embodiments, the processing device 140 may obtain raw data and/or correction coefficients for at least two beds from the storage device 150 and/or the imaging device 110. For example, when performing a multi-bed scan, the imaging device 110 may transmit the acquired raw data for each bed to the storage device 150 for storage. After the scanning by imaging device 110 is complete, processing device 140 may retrieve raw data for at least two beds from storage device 150. For another example, after the imaging device 110 acquires the raw data of all beds, the imaging device 110 may transmit the raw data of all beds to the processing device 140. Further description of step 410 may refer to step 310, which is not repeated herein.
In step 420, the processing device 140 may update the initial image of each of the at least two beds. In some embodiments, step 420 may be performed by the reconstruction module 220 in the system 200.
In some embodiments, for the first iteration, the initial image may be a default image or a random initial image of the image reconstruction system 100. Each bed may correspond to an initial image. The processing device 140 may directly call up the initial image of each bed. In some embodiments, for a non-first iteration, the processing device 140 may acquire an initial image of each of the at least two beds from a previous iteration.
The processing device 140 may reconstruct the raw data of the at least two beds using a reconstruction algorithm based on the correction coefficients to update the initial image of each bed. In some embodiments, the reconstruction algorithm may include an iterative reconstruction (IR) algorithm, a simultaneous iterative reconstruction technique (SIRT), a neural network model, an adaptive statistical iterative reconstruction (ASIR) algorithm, and the like, or any combination thereof. Exemplary iterative reconstruction algorithms may include the ordered-subset expectation maximization (OSEM) algorithm, the maximum-likelihood expectation maximization (MLEM) algorithm, and the like.
Specifically, in some embodiments, the processing device 140 may forward-project the initial image of each of the at least two beds. The forward-projection operation converts an image (e.g., an initial image or an updated initial image) into the data domain (e.g., forward-projection data). In some embodiments, the forward-projection operation may be implemented by a distance-driven method, a Monte Carlo simulation, or the like. Further, the processing device 140 can compare the forward-projection data of each of the at least two beds with the raw data of that bed to obtain deviation values. The processing device 140 may back-project and accumulate the deviation values into the initial image of the corresponding bed to update the initial image of each bed. The back-projection operation converts data in the data domain into data in the image domain. Through the deviation values, the processing device 140 brings the reconstructed image (i.e., the updated initial image) of each bed closer to its raw data, thereby making the reconstruction more accurate.
In some embodiments, the processing device 140 may update the initial image of each of the at least two beds based on equation (1):

f_j^(n+1) = [ f_j^(n) / Σ_i (M_ij · a_i · n_i) ] · Σ_i [ M_ij · a_i · n_i · P_i / (a_i · n_i · Σ_k M_ik · f_k^(n) + r_i + s_i) ]    (1)

where f_j^(n) represents the value of the pixel labeled j in the reconstructed image at the nth iteration, M_ij represents the system matrix, a_i denotes the attenuation correction coefficient, n_i the normalization correction coefficient, P_i the measured projection value of the ith line of response, r_i the random correction coefficient, and s_i the scatter correction coefficient.
It should be understood that the description may also update the initial image of each of the at least two beds by other means, which is not limited in the description.
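A sketch of one update pass of corrected MLEM in the spirit of the update rule above, assuming a dense numpy system matrix purely for illustration (real scanners use matrix-free projectors); variable names mirror the symbols in the text and are not any particular vendor's implementation.

```python
import numpy as np

def mlem_update(f, M, a, n, P, r, s):
    """One correction-aware MLEM pass: f is the current image (length J),
    M the system matrix (I x J), a/n the attenuation and normalization
    factors, P the measured projections, r/s the randoms and scatter
    estimates (all length I)."""
    an = a * n                                    # per-LOR multiplicative factors
    expected = an * (M @ f) + r + s               # forward model with corrections
    ratio = np.divide(P, expected, out=np.zeros_like(P, dtype=float),
                      where=expected > 0)         # measured / expected
    sens = M.T @ an                               # sensitivity image (denominator)
    backproj = M.T @ (an * ratio)                 # back-projected ratio
    return np.divide(f * backproj, sens, out=np.zeros_like(f, dtype=float),
                     where=sens > 0)
```

With an identity system matrix, unit correction factors, and no randoms or scatter, a single pass starting from a uniform image reproduces the measured projections, which is a quick sanity check of the update.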
In step 430, the processing device 140 may stitch the updated images of the beds to obtain a stitched image. In some embodiments, step 430 may be performed by the reconstruction module 220 in the system 200.
In some embodiments, the processing device 140 may perform weighted stitching on the updated initial images of two adjacent beds based on the overlap region and the bed order to obtain a stitched image. For example, the processing device 140 may perform weighted summation on the overlapped areas of the updated initial images of two adjacent beds to stitch the updated initial images of two adjacent beds together to obtain a stitched image (i.e., a multi-bed merged image). In some embodiments, the processing device 140 may zoom in or out on the updated initial images of the two adjacent beds. The processing device 140 may perform a weighted summation on the corresponding enlarged or reduced images, and then reduce or enlarge the stitched image.
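A minimal sketch of the weighted stitching of two adjacent bed images, assuming the beds abut along axis 0 of the volume and using a complementary linear ramp as one illustrative choice of weights (the patent only requires a weighted summation over the overlap):

```python
import numpy as np

def stitch_pair(img_a, img_b, overlap: int):
    """Stitch two bed volumes along axis 0, blending the `overlap`
    slices they share with complementary linear weights that sum to 1."""
    w = np.linspace(1.0, 0.0, overlap)[:, None, None]   # ramp favoring bed A
    blended = w * img_a[-overlap:] + (1.0 - w) * img_b[:overlap]
    return np.concatenate([img_a[:-overlap], blended, img_b[overlap:]], axis=0)
```

The ramp down-weights each bed toward its axial edge, where PET sensitivity is lowest, which is why a weighted sum rather than a hard cut is used at the junction.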
In step 440, the processing device 140 may process the stitched image to obtain a processed image. In some embodiments, step 440 may be performed by the reconstruction module 220 in the system 200.
In some embodiments, the processing device 140 may perform noise reduction, contrast enhancement, etc. on the stitched image to obtain a processed image. In some embodiments, step 440 may be omitted. In other words, the processing device 140 may perform subsequent steps directly after obtaining the stitched image without performing post-processing.
In step 450, the processing device 140 may determine whether an iteration stop condition is satisfied. In some embodiments, step 450 may be performed by the reconstruction module 220 in the system 200.
In some embodiments, the iteration stop condition may include whether the number of iterations reaches a preset number, whether a difference between images (e.g., a stitched image or a processed image) obtained in two adjacent iterations is less than a predetermined threshold, and the like.
In response to the iteration stop condition being met, the processing device 140 may designate the processed image (or stitched image) as the target image (e.g., a complete PET image of the scanned object) in step 470. In response to the iteration stop condition not being satisfied, the processing device 140 may split the processed image (or the stitched image) in step 460 to obtain an initial image of each of the at least two beds for the next iteration. For example, the processing device 140 may take the portion of the processed image (or stitched image) corresponding to the overlap region directly as the portion corresponding to the overlap region in each split image. In other words, the portions of the split images corresponding to the overlap region include the same image information or data as the corresponding portion of the processed image (or stitched image). Further, the processing device 140 may return to step 420 to update the initial image of each of the at least two beds (i.e., the images obtained by splitting the processed image (or stitched image) in the previous iteration), and perform steps 430 to 440 until the iteration stop condition is satisfied.
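The split in step 460 can be sketched as follows for a two-bed stitched volume along axis 0: the overlap slices are copied verbatim into both halves, so the two next-iteration initial images share the same data in that region.

```python
import numpy as np

def split_pair(stitched, bed_len: int, overlap: int):
    """Split a two-bed stitched volume (axis 0, total length
    2 * bed_len - overlap) back into per-bed initial images for the
    next iteration; the overlap region appears in both outputs."""
    img_a = stitched[:bed_len].copy()
    img_b = stitched[bed_len - overlap:].copy()
    return img_a, img_b
```

Splitting is the exact inverse of the stitch in terms of geometry, which is what lets the stitch/split pair alternate inside the iteration loop.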
Fig. 5 is an exemplary flow diagram of a PET image reconstruction method shown in accordance with some embodiments of the present description. In some embodiments, flow 500 may be performed by processing logic that may comprise hardware (e.g., circuitry, dedicated logic, programmable logic, microcode, etc.), software (instructions run on a processing device to perform hardware simulation), etc., or any combination thereof. In some embodiments, flow 500 may be implemented as a set of instructions (e.g., an application program) stored in storage device 150. Processing device 140 and/or the modules in fig. 2 may execute the set of instructions and, when executing the instructions, processing device 140 and/or the modules may be configured to perform flow 500. The operations of the illustrated process presented below are intended to be illustrative. In some embodiments, flow 500 may be accomplished with one or more additional operations not described and/or without one or more operations discussed herein. Additionally, the order of the operations in the process as shown in FIG. 5 and described below is not intended to be limiting. According to some embodiments of the present description, the flow 500 may be used for online reconstruction.
In step 510, the processing device 140 may obtain raw data and correction factors for the first and second beds. In some embodiments, step 510 may be performed by acquisition module 210 in system 200.
The imaging device 110 may transmit the acquired raw data for each bed to the processing device 140 in real time. In other words, after the imaging device 110 completes scanning of a bed, the raw data of the bed is transmitted to the processing device 140 for subsequent processing. Alternatively, the processing device 140 may acquire raw data for each bed from the imaging device 110 in real-time. For example, when the imaging device 110 completes the scan of the first bed, the processing device 140 may immediately acquire and pre-process raw data for the first bed from the imaging device 110. Further, the processing device 140 may determine an attenuation correction factor for the first bed based on the CT image of the scanned object. When the imaging device 110 completes the scan of the second bed, the processing device 140 may immediately acquire and pre-process the raw data of the second bed from the imaging device 110. Furthermore, the processing device 140 may also determine an attenuation correction factor for the second bed based on the CT image of the scanned object. In some embodiments, the processing device 140 may further perform a physical correction on the raw data of the first and second beds to obtain first and second bed correction coefficients (e.g., scatter correction coefficients). The correction coefficient of the second bed obtained at this time is a relatively rough correction coefficient because the influence of the third bed is not considered.
In step 520, the processing device 140 may perform iterative reconstruction on the raw data of the first bed and the second bed to obtain a stitched image M1. In some embodiments, step 520 may be performed by the reconstruction module 220 in the system 200.
The processing device 140 may perform iterative reconstruction on the raw data of the first bed and the second bed in a manner similar to the iterative reconstruction in the process 400 (e.g., steps 420 to 470), which is not described herein again. In some embodiments, after determining the accurate correction coefficient of the second bed in step 530, the processing device 140 may perform iterative reconstruction on the raw data of the first bed and the second bed based on the correction coefficient of the first bed and the accurate correction coefficient of the second bed to obtain the stitched image M1.
In step 530, the processing apparatus 140 may obtain the raw data and the correction factor for the third bed. In some embodiments, step 530 may be performed by acquisition module 210 in system 200.
The processing device 140 may obtain and pre-process the raw data of the third bed from the imaging device 110 on the fly. Furthermore, the processing device 140 may also determine an attenuation correction factor for the third bed based on the CT image of the scanned object. In some embodiments, the processing device 140 may further perform a physical correction on the raw data of the first, second, and third beds to obtain an accurate correction coefficient (e.g., a scatter correction coefficient) for the second bed and a coarse correction coefficient for the third bed.
In step 540, the processing device 140 may perform iterative reconstruction on the raw data of the second bed and the third bed to obtain a stitched image M2. In some embodiments, step 540 may be performed by the reconstruction module 220 in the system 200. In some embodiments, the processing device 140 may determine the accurate correction coefficient of the third bed in the next step (not shown), and then iteratively reconstruct the raw data of the second bed and the third bed based on the accurate correction coefficients of the second bed and the third bed to obtain the stitched image M2.
By analogy, in step 550, the processing device 140 may obtain the raw data and correction coefficients of the (N−1)th bed (N ≥ 3, N being an integer). In some embodiments, step 550 may be performed by the acquisition module 210 in the system 200.
The processing device 140 can instantly acquire the raw data of the (N−1)th bed from the imaging device 110 and preprocess the raw data. In addition, the processing device 140 may also determine an attenuation correction coefficient for the (N−1)th bed based on the CT image of the scanned object. In some embodiments, the processing device 140 can further perform physical correction on the raw data of the (N−3)th bed (not shown), the (N−2)th bed (not shown), and the (N−1)th bed to obtain the accurate correction coefficient of the (N−2)th bed and the rough correction coefficient of the (N−1)th bed.
In step 560, the processing device 140 may perform iterative reconstruction on the raw data of the (N−2)th bed and the (N−1)th bed to obtain a stitched image M(N−2). In some embodiments, step 560 may be performed by the reconstruction module 220 in the system 200. In some embodiments, the processing device 140 may determine the accurate correction coefficient of the (N−1)th bed in step 570, and then perform iterative reconstruction on the raw data of the (N−2)th bed and the (N−1)th bed based on the accurate correction coefficients of the (N−2)th bed and the (N−1)th bed to obtain the stitched image M(N−2).
In step 570, the processing device 140 may obtain raw data and correction coefficients for the nth bed. In some embodiments, step 570 may be performed by acquisition module 210 in system 200.
The processing device 140 may instantly obtain and pre-process the raw data of the Nth bed from the imaging device 110. Furthermore, the processing device 140 may also determine an attenuation correction coefficient for the Nth bed based on the CT image of the scanned object. In some embodiments, the processing device 140 can further perform physical correction on the raw data of the (N−2)th bed (not shown), the (N−1)th bed, and the Nth bed to obtain the accurate correction coefficient of the (N−1)th bed and the correction coefficient of the Nth bed.
In step 580, the processing device 140 may perform iterative reconstruction on the raw data of the (N−1)th bed and the raw data of the Nth bed to obtain a stitched image M(N−1). In some embodiments, step 580 may be performed by the reconstruction module 220 in the system 200.
In step 590, the processing device 140 may stitch the images M1, M2, ..., M(N−2), M(N−1) to obtain the target image (i.e., the complete PET image of the scanned object). In some embodiments, step 590 may be performed by the reconstruction module 220 in the system 200. In some embodiments, when part of the stitched images (e.g., M1, M2, ..., M(N−x), where 1 ≤ x < N and x is an integer) have been obtained, the processing device 140 may stitch these partial stitched images first.
It should be appreciated that the imaging device 110 may continue scanning the remaining beds while the processing device 140 iteratively reconstructs the acquired raw data. In other words, the scanning by the imaging device 110 and the image reconstruction by the processing device 140 are performed simultaneously. By performing image reconstruction and scanning simultaneously, it is convenient to quickly judge whether the scan is proceeding successfully, whether the system state is normal, and whether the quality of the reconstructed images meets the requirements. If the image quality is abnormal, the problem can be investigated in time, avoiding the situation in which the scanning object (e.g., a patient) has been injected with the tracer but cannot complete the examination. Furthermore, in some embodiments, the reconstruction steps described above (e.g., step 520, step 540, step 560, step 580) may be performed in parallel, for example by different graphics processors (GPUs).
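The interleaved scan-and-reconstruct schedule can be sketched as follows, using worker threads as stand-ins for the per-GPU reconstructions; `acquire_bed` and `reconstruct_pair` are hypothetical callables supplied by the surrounding system, not patent-defined interfaces.

```python
from concurrent.futures import ThreadPoolExecutor

def online_reconstruction(acquire_bed, reconstruct_pair, n_beds: int):
    """Interleave scanning and reconstruction: while bed i is being
    acquired, the (i-1, i) pair reconstruction runs on a worker thread.
    acquire_bed(i) returns the raw data of bed i; reconstruct_pair(d1, d2)
    returns the stitched image of two adjacent beds."""
    futures = []
    with ThreadPoolExecutor() as pool:
        prev = acquire_bed(0)
        for i in range(1, n_beds):
            cur = acquire_bed(i)                        # scanning continues...
            futures.append(pool.submit(reconstruct_pair, prev, cur))
            prev = cur
        return [f.result() for f in futures]            # M1 ... M(N-1), in order
```

Because each pair reconstruction is submitted as soon as its second bed finishes, image quality problems surface while the patient is still on the table rather than after the whole scan.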
Fig. 6 is an exemplary flow diagram of a scatter correction coefficient determination method shown in accordance with some embodiments of the present description. In some embodiments, flow 600 may be performed by processing logic that may comprise hardware (e.g., circuitry, dedicated logic, programmable logic, microcode, etc.), software (instructions run on a processing device to perform hardware simulation), etc., or any combination thereof. In some embodiments, flow 600 may be implemented as a set of instructions (e.g., an application program) stored in storage device 150. Processing device 140 and/or the modules in fig. 2 may execute the set of instructions and, when executing the instructions, processing device 140 and/or the modules may be configured to perform flow 600. The operations of the illustrated processes presented below are intended to be illustrative. In some embodiments, flow 600 may be accomplished with one or more additional operations not described and/or without one or more operations discussed herein. Additionally, the order of the operations in the process as shown in FIG. 6 and described below is not intended to be limiting.
In step 610, the processing device 140 may obtain raw data for the previous bed and the current bed. In some embodiments, step 610 may be performed by acquisition module 210 in system 200.
In some embodiments, the previous bed and the current bed may be any two adjacent (or consecutive) beds.
In step 620, the processing device 140 may reconstruct the raw data of the previous bed and the current bed respectively to obtain a previous initial activity distribution image of the previous bed and a current initial activity distribution image of the current bed. In some embodiments, step 620 may be performed by the reconstruction module 220 in the system 200.
In some embodiments, the PET image may reflect the activity distribution of the radiopharmaceutical; thus, the activity distribution image may also be a PET image. The processing device 140 may reconstruct the raw data of the previous bed and the current bed using a reconstruction algorithm as described in step 420 to obtain a previous initial activity distribution image of the previous bed and a current initial activity distribution image of the current bed.
In step 630, the processing device 140 may stitch the previous initial activity distribution image and the current initial activity distribution image to obtain an intermediate activity distribution image. In some embodiments, step 630 may be performed by the reconstruction module 220 in the system 200.
The processing device 140 may perform a weighted summation on the overlapping regions of the previous initial activity distribution image and the current initial activity distribution image, so as to stitch the two images together to obtain an intermediate activity distribution image.
In step 640, the processing device 140 may acquire at least one scattering point in the intermediate activity distribution image. In some embodiments, step 640 may be performed by acquisition module 210 in system 200.
The scatter point may be any point (or pixel) in the intermediate activity distribution image. In some embodiments, the scattering point may be set according to a default value of the image reconstruction system 100, or may be set by a user (e.g., a doctor or an engineer) through the terminal device 130. For example, the scattering point may be located in a region having an activity distribution in the intermediate activity distribution image. In some embodiments, the processing device 140 may obtain an attenuation map based on CT images, MR images, etc. of the scanned object. The processing device 140 may obtain the spatial distribution information of the scattering points based on the attenuation in the attenuation map, thereby determining the positions of the scattering points. For example, the processing device 140 may determine at least one scattering point based on a random sampling of the spatial distribution information of the scattering points.
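One plausible reading of the random sampling of scatter-point positions is to draw voxels with probability proportional to the attenuation value; this is an assumption for illustration (the patent does not fix the sampling scheme), and `sample_scatter_points` is a hypothetical name.

```python
import numpy as np

def sample_scatter_points(mu_map, n_points: int, rng=None):
    """Randomly draw voxel indices to act as scatter points, with
    probability proportional to the attenuation value in mu_map
    (denser tissue is more likely to scatter photons)."""
    if rng is None:
        rng = np.random.default_rng(0)
    p = mu_map.ravel() / mu_map.sum()
    flat = rng.choice(mu_map.size, size=n_points, p=p)
    return np.unravel_index(flat, mu_map.shape)
```

Sampling from the attenuation map rather than uniformly concentrates the simulated scatter events where scattering is physically likely, which keeps the Monte Carlo estimate efficient.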
In step 650, the processing device 140 may determine an intermediate scatter distribution estimate based on the intermediate activity distribution image and the at least one scattering point. In some embodiments, step 650 may be performed by the reconstruction module 220 in the system 200.
In some embodiments, a scatter correction coefficient for each line of response may be derived from the scatter distribution estimate. In some embodiments, the scatter distribution estimate may include image-domain data and/or projection-domain data (e.g., a sinogram). In some embodiments, the processing device 140 may simulate the photon scattering process (e.g., using a Monte Carlo method) to determine the intermediate scatter distribution estimate.
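As a toy illustration of producing an image-domain scatter distribution estimate, one can deposit the local activity at each sampled scattering point and low-pass filter the result, since scattered coincidences form a smooth, low-frequency background. A real system would run a single-scatter simulation or Monte Carlo photon transport as the text notes; everything below, including the crude periodic blur, is a hand-rolled sketch.

```python
import numpy as np

def image_domain_scatter_estimate(activity, scatter_points, passes=2):
    """Toy image-domain scatter estimate: deposit the local activity at each
    sampled scattering point, then blur. The blur is a crude periodic 3-tap
    average standing in for a proper low-pass filter; it is not the patent's
    algorithm, only an illustration of the step's inputs and output.
    """
    est = np.zeros_like(activity)
    for z, y, x in scatter_points:
        est[z, y, x] += activity[z, y, x]
    for _ in range(passes):
        for ax in range(est.ndim):
            est = (np.roll(est, 1, axis=ax) + est + np.roll(est, -1, axis=ax)) / 3.0
    return est

activity = np.ones((8, 8, 8))
points = [(2, 2, 2), (4, 4, 4), (5, 3, 6)]
est = image_domain_scatter_estimate(activity, points)
print(round(est.sum(), 6))  # 3.0: blurring redistributes but preserves the deposited counts
```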
In step 660, the processing device 140 may split the intermediate scatter distribution estimate to obtain candidate scatter distribution estimates for each of the previous and current beds. In some embodiments, step 660 may be performed by acquisition module 210 in system 200.
The processing device 140 may use the portion of the intermediate scatter distribution estimate corresponding to the overlap region directly as the corresponding overlap-region portion of each split image. In other words, the portions of the candidate scatter distribution estimates of the previous bed and the current bed that correspond to the overlap region contain the same image information or data.
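The split can be sketched as slicing the stitched volume back into two per-bed volumes that share the overlap slices verbatim, as described above. The axial slice counts and the function name `split_estimate` are illustrative parameters, not from the patent.

```python
import numpy as np

def split_estimate(intermediate, n_prev, n_curr, overlap):
    """Split a stitched (intermediate) scatter-distribution volume back into
    per-bed candidates. The slices in the overlap region are copied into both
    beds, so the two candidates carry identical data there.
    """
    total = n_prev + n_curr - overlap
    assert intermediate.shape[0] == total
    prev_candidate = intermediate[:n_prev]
    curr_candidate = intermediate[total - n_curr:]
    return prev_candidate, curr_candidate

# Stitched volume whose slice index is encoded in the voxel values:
vol = np.arange(16, dtype=float)[:, None, None] * np.ones((16, 4, 4))
prev_c, curr_c = split_estimate(vol, n_prev=10, n_curr=10, overlap=4)
print(prev_c.shape, curr_c.shape)  # (10, 4, 4) (10, 4, 4)
```

Note that `prev_c[-4:]` and `curr_c[:4]` are identical, reflecting the shared overlap region.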
In step 670, the processing device 140 may determine whether the iteration stop condition is satisfied. In some embodiments, step 670 may be performed by the reconstruction module 220 in the system 200.
In some embodiments, the iteration stop condition may include the number of iterations reaching a preset number, and/or the difference between two adjacent iterations' images (e.g., each initial activity distribution image or the intermediate activity distribution image) and/or scatter distribution estimates (e.g., the candidate scatter distribution estimate of the previous bed) being less than a predetermined threshold, and the like.
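A minimal sketch of a stop test combining the two criteria listed above — an iteration-count cap and a change threshold between adjacent iterations. The function name, the relative-change metric, and the default tolerance are this sketch's own choices.

```python
import numpy as np

def stop_iteration(it, max_iters, prev_estimate, curr_estimate, tol=1e-3):
    """Return True when iteration should stop: either the preset number of
    iterations has been reached, or the relative change between the estimates
    of two adjacent iterations falls below a predetermined threshold.
    """
    if it >= max_iters:
        return True
    change = np.abs(curr_estimate - prev_estimate).sum()
    scale = np.abs(prev_estimate).sum() + 1e-12   # avoid division by zero
    return change / scale < tol

a = np.ones((4, 4))
print(stop_iteration(1, 10, a, a * 1.5))        # False: 50% change, under the cap
print(stop_iteration(1, 10, a, a * 1.0000001))  # True: change below tolerance
print(stop_iteration(10, 10, a, a * 1.5))       # True: iteration cap reached
```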
In some embodiments, in response to the iteration stop condition being satisfied and the current bed not being the last bed, the processing device 140 may designate the candidate scatter distribution estimate of the previous bed as the scatter correction coefficient of the previous bed in step 690. In some embodiments, in response to the iteration stop condition being satisfied and the current bed being the last bed, the processing device 140 may designate the candidate scatter distribution estimate of the previous bed as the scatter correction coefficient of the previous bed, and the candidate scatter distribution estimate of the current bed as the scatter correction coefficient of the current bed, in step 690. The processing device 140 may store the scatter correction coefficients of the previous bed and/or the current bed in the storage device 150 for later retrieval. For the previous bed, the scattering contribution of the next bed (i.e., the current bed) is taken into account in the calculation, so a more accurate scatter correction coefficient of the previous bed can be obtained. For the current bed, the scattering contribution of its own next bed has not yet been taken into account; therefore, whether to output the current bed's scatter correction coefficient is decided according to whether the current bed is the last bed, which ensures that only sufficiently accurate coefficients are output.
In response to the iteration stop condition not being satisfied, the processing device 140 may, in step 680, perform image reconstruction based on the candidate scatter distribution estimate of each of the previous bed and the current bed together with the raw data of the corresponding bed, and update the previous and current initial activity distribution images. In other words, the processing device 140 may reconstruct the raw data of each bed using that bed's candidate scatter distribution estimate; the resulting initial activity distribution images are scatter-corrected. The processing device 140 may then return to step 630, splice the updated previous and current initial activity distribution images into an updated intermediate activity distribution image, and repeat steps 640 to 660 until the iteration stop condition is satisfied.
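The iterative loop of steps 630-690 for one pair of beds can be summarized as control flow. Every callable argument below is a placeholder for an operation described in the text (reconstruction, splicing, scatter estimation, splitting, the stop test); only the loop structure itself is taken from the flow.

```python
def two_bed_scatter_loop(prev_raw, curr_raw, reconstruct, stitch,
                         estimate_scatter, split, stopped, max_iters=5):
    """Control-flow sketch of the two-bed iterative scatter correction."""
    # Initial (uncorrected) reconstructions of the two beds.
    prev_img = reconstruct(prev_raw, scatter=None)
    curr_img = reconstruct(curr_raw, scatter=None)
    prev_sc = curr_sc = None
    for it in range(max_iters):
        intermediate = stitch(prev_img, curr_img)          # step 630
        scatter = estimate_scatter(intermediate)           # steps 640-650
        prev_sc, curr_sc = split(scatter)                  # step 660
        if stopped(it):                                    # step 670
            break                                          # -> step 690
        # Step 680: scatter-corrected re-reconstruction of each bed.
        prev_img = reconstruct(prev_raw, scatter=prev_sc)
        curr_img = reconstruct(curr_raw, scatter=curr_sc)
    return prev_sc, curr_sc

# Dummy stand-ins that exercise the control flow only:
prev_sc, curr_sc = two_bed_scatter_loop(
    prev_raw=10.0, curr_raw=10.0,
    reconstruct=lambda raw, scatter=None: raw,
    stitch=lambda a, b: a + b,
    estimate_scatter=lambda x: x / 10,
    split=lambda s: (s, s),
    stopped=lambda it: it >= 2,
)
print(prev_sc, curr_sc)  # 2.0 2.0
```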
Fig. 7 is an exemplary flow diagram of a scatter correction coefficient determination method, shown in accordance with some embodiments of the present description. In some embodiments, flow 700 may be performed by processing logic that may comprise hardware (e.g., circuitry, dedicated logic, programmable logic, microcode, etc.), software (instructions run on a processing device to perform hardware simulation), etc., or any combination thereof. In some embodiments, flow 700 may be implemented as a set of instructions (e.g., an application program) stored in storage device 150. Processing device 140 and/or the modules in fig. 2 may execute the set of instructions and, when executing the instructions, processing device 140 and/or the modules may be configured to perform flow 700. The operations of the illustrated process presented below are intended to be illustrative. In some embodiments, flow 700 may be accomplished with one or more additional operations not described and/or without one or more operations discussed herein. Additionally, the order of the operations in the process as shown in FIG. 7 and described below is not intended to be limiting.
In step 710, the processing apparatus 140 may obtain raw data for a first bed, a second bed, and a third bed that are consecutive beds. In some embodiments, step 710 may be performed by acquisition module 210 in system 200.
In step 720, the processing apparatus 140 may obtain a first initial activity distribution image of the first bed, a second initial activity distribution image of the second bed, and a third initial activity distribution image of the third bed from the previous iteration. In some embodiments, step 720 may be performed by the reconstruction module 220 in the system 200.
In step 730, the processing apparatus 140 may stitch the first initial activity distribution image, the second initial activity distribution image, and the third initial activity distribution image to obtain an intermediate activity distribution image. In some embodiments, step 730 may be performed by the reconstruction module 220 in the system 200.
In step 740, the processing device 140 may acquire at least one scattering point in the intermediate activity distribution image. In some embodiments, step 740 may be performed by acquisition module 210 in system 200.
In step 750, the processing device 140 may determine an intermediate scatter distribution estimate based on the intermediate activity distribution image and the at least one scattering point. In some embodiments, step 750 may be performed by the reconstruction module 220 in the system 200.
In step 760, the processing device 140 may split the intermediate scatter distribution estimate to obtain candidate scatter distribution estimates for each of the first, second, and third beds. In some embodiments, step 760 may be performed by the reconstruction module 220 in the system 200.
In step 770, the processing device 140 may determine whether an iteration stop condition is satisfied. In some embodiments, step 770 may be performed by reconstruction module 220 in system 200.
In some embodiments, the iteration stop condition may include the number of iterations reaching a preset number, and/or the difference between two adjacent iterations' images (e.g., each initial activity distribution image or the intermediate activity distribution image) and/or scatter distribution estimates (e.g., the candidate scatter distribution estimate of the second bed) being less than a predetermined threshold, and the like.
In some embodiments, in response to the iteration stop condition being satisfied and the third bed not being the last bed, the processing device 140 may designate the candidate scatter distribution estimate of the second bed as the scatter correction coefficient of the second bed in step 790. In some embodiments, in response to the iteration stop condition being satisfied and the third bed being the last bed, the processing device 140 may designate the candidate scatter distribution estimate of the second bed as the scatter correction coefficient of the second bed, and the candidate scatter distribution estimate of the third bed as the scatter correction coefficient of the third bed, in step 790. The processing device 140 may store the scatter correction coefficients of the second bed and/or the third bed in the storage device 150 for later retrieval. For the second bed, the scattering contributions of both the previous bed (i.e., the first bed) and the next bed (i.e., the third bed) are taken into account in the calculation, so a more accurate scatter correction coefficient of the second bed can be obtained.
In response to the iteration stop condition not being satisfied, the processing device 140 may, in step 780, perform image reconstruction based on the candidate scatter distribution estimate of each of the first, second, and third beds together with the raw data of the corresponding bed, and update the first, second, and third initial activity distribution images. In other words, the processing device 140 may reconstruct the raw data of each bed using that bed's candidate scatter distribution estimate; the resulting initial activity distribution images are scatter-corrected. The processing device 140 may then return to step 730, splice the updated first, second, and third initial activity distribution images into an updated intermediate activity distribution image, and repeat steps 740 to 760 until the iteration stop condition is satisfied.
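Applied along a whole multi-bed scan, the three-bed procedure above can be pictured as a sliding window: each interior bed obtains its scatter correction coefficient from the window (previous bed, current bed, next bed). How the first and last beds are finalized, and the `correct_triple` callable itself, are placeholders for illustration rather than the patent's exact scheme.

```python
def sliding_scatter_coefficients(bed_raw, correct_triple):
    """Sketch of sliding the three-bed window over an N-bed scan. For each
    interior bed k, `correct_triple` stands in for the iterative loop of
    steps 720-790 and returns the middle bed's converged coefficient.
    """
    coeffs = {}
    for k in range(1, len(bed_raw) - 1):
        coeffs[k] = correct_triple(bed_raw[k - 1], bed_raw[k], bed_raw[k + 1])
    return coeffs

# Dummy correction that just records which beds fed each window:
coeffs = sliding_scatter_coefficients(
    bed_raw=["b0", "b1", "b2", "b3"],
    correct_triple=lambda prev, mid, nxt: (prev, mid, nxt),
)
print(coeffs)  # {1: ('b0', 'b1', 'b2'), 2: ('b1', 'b2', 'b3')}
```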
The beneficial effects that may be brought by the embodiments of the present description include, but are not limited to: (1) the PET image reconstruction method provided herein uses a parallel multi-bed reconstruction schedule in multi-bed scanning mode, so that the influence of the preceding and following beds' scan data on the current bed is taken into account during iterative reconstruction; this reduces noise in the splicing region of the reconstructed image, yields a PET image with a more uniform axial signal-to-noise ratio, relaxes the constraint on the length of the splicing region, and improves system throughput; (2) by reconstructing the scanned raw data pairwise in real time, scanning and image reconstruction can proceed simultaneously, making it easy to spot problems during the scan; (3) the scatter correction calculation accounts for the influence of the preceding and following beds' scan data on the current bed, improving the accuracy of the scatter correction; (4) the PET reconstruction method provided by this specification requires no additional hardware upgrade and introduces no new errors. It is to be noted that different embodiments may yield different advantages; in different embodiments, any one or a combination of the above advantages, or any other advantages, may be obtained.
Having thus described the basic concept, it will be apparent to those skilled in the art that the foregoing disclosure is only illustrative and not limiting of the present specification. Various modifications, improvements and adaptations to the present description may occur to those skilled in the art, though not explicitly described herein. Such modifications, improvements and adaptations are proposed in the present specification and thus fall within the spirit and scope of the exemplary embodiments of the present specification.

Claims (10)

1. A PET image reconstruction method, comprising:
acquiring raw data and correction coefficients of at least two beds; and
performing iterative reconstruction on the raw data of the at least two beds based on the correction coefficients to obtain a target image spliced from the at least two beds, wherein at least one iteration of the iterative reconstruction includes:
updating the initial image of each of the at least two beds,
splicing the updated images of each bed to obtain a spliced image,
processing the spliced image to obtain a processed image, and
splitting the processed image to obtain an initial image of each of the at least two beds for the next iteration.
2. The method of claim 1, wherein acquiring the raw data for the at least two beds comprises:
acquiring a scanning range of a scanning object;
determining the number of bed bits and an overlapping area of the scanning bed according to the scanning range;
determining a position of the scanning bed at each of the at least two beds based on the number of beds and the overlap region;
acquiring the scanning time of each bed; and
based on the scanning time of each bed, controlling an imaging device to perform data acquisition on the scanning object when the scanning bed moves to the position of each bed so as to obtain the raw data of the at least two beds.
3. The method of claim 2, wherein obtaining the raw data for the at least two beds further comprises:
preprocessing the raw data of the at least two beds to screen out positioning information and time-of-flight information of coincidence events.
4. The method of claim 1, wherein obtaining the correction factor comprises:
performing physical correction on the raw data of the at least two beds to obtain the correction coefficients.
5. The method of claim 1, wherein updating the initial image of each of the at least two beds comprises:
initializing and reconstructing the initial image of each of the at least two beds or acquiring the initial image of each of the at least two beds obtained from a previous iteration;
forward projecting the initial image of each of the at least two beds;
comparing the forward projection data of each of the at least two beds with the raw data of each of the at least two beds to obtain a deviation value; and
back-projecting the deviation value and accumulating it into the initial image of the corresponding bed, so as to update the initial image of each bed.
6. The method of claim 1, wherein the at least two beds include N beds, N being a positive integer greater than or equal to 3, and iteratively reconstructing the raw data of the at least two beds to obtain the target image spliced from the at least two beds comprises:
performing iterative reconstruction on the raw data of two adjacent beds in the N beds to obtain N-1 spliced images; and
sequentially splicing the N-1 spliced images to obtain the target image.
7. The method of claim 4, wherein the physical correction comprises a scatter correction, the correction coefficients comprise scatter correction coefficients, and performing the scatter correction on the raw data for the at least two beds comprises:
acquiring raw data of a previous bed and a current bed;
iteratively correcting the raw data of the previous bed and the current bed to obtain at least a scatter correction coefficient of the previous bed, wherein at least one iteration of the iterative correction comprises:
acquiring a previous initial activity distribution image of the previous bed and a current initial activity distribution image of the current bed, which are acquired in the previous iteration;
splicing the previous initial activity distribution image and the current initial activity distribution image to obtain an intermediate activity distribution image;
determining an intermediate scatter distribution estimate based at least on the intermediate activity distribution image;
splitting the intermediate scattering distribution estimate to obtain candidate scattering distribution estimates for each of the previous bed and the current bed; and
performing image reconstruction based on the candidate scatter distribution estimate of each of the previous bed and the current bed and the raw data of the corresponding bed, respectively, and updating the previous initial activity distribution image and the current initial activity distribution image.
8. The method of claim 4, wherein the physical correction comprises a scatter correction, the correction coefficients comprise scatter correction coefficients, and performing the scatter correction on the raw data for the at least two beds comprises:
acquiring raw data of a first bed, a second bed and a third bed which are continuous in bed;
iteratively correcting the raw data of the first, second and third beds to obtain at least a scatter correction coefficient of the second bed, wherein at least one iteration of the iterative correction comprises:
obtaining a first initial activity distribution image of the first bed, a second initial activity distribution image of the second bed and a third initial activity distribution image of the third bed from previous iteration;
splicing the first initial activity distribution image, the second initial activity distribution image and the third initial activity distribution image to obtain an intermediate activity distribution image;
determining an intermediate scatter distribution estimate based at least on the intermediate activity distribution image;
splitting the intermediate scattering distribution estimate to obtain candidate scattering distribution estimates for each of the first, second, and third beds; and
performing image reconstruction based on the candidate scatter distribution estimate of each of the first bed, the second bed, and the third bed and the raw data of the corresponding bed, respectively, and updating the first, second, and third initial activity distribution images.
9. A PET image reconstruction system, comprising:
the acquisition module is used for acquiring raw data and correction coefficients of at least two beds; and
a reconstruction module, configured to perform iterative reconstruction on the raw data of the at least two beds based on the correction coefficient to obtain a target image obtained by stitching the at least two beds, where at least one iteration in the iterative reconstruction includes:
updating the initial image of each of the at least two beds,
splicing the updated images of each bed to obtain a spliced image,
processing the spliced image to obtain a processed image, and
splitting the processed image to obtain an initial image of each of the at least two beds for the next iteration.
10. A computer-readable storage medium, wherein the storage medium stores computer instructions, and when the computer instructions in the storage medium are read by a computer, the computer performs the method of any one of claims 1 to 8.
CN202210878458.1A 2022-07-25 2022-07-25 PET image reconstruction method, system and storage medium Pending CN115222599A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210878458.1A CN115222599A (en) 2022-07-25 2022-07-25 PET image reconstruction method, system and storage medium

Publications (1)

Publication Number Publication Date
CN115222599A true CN115222599A (en) 2022-10-21

Family

ID=83614746



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination