CN113902823A - Projection method and system for PET image reconstruction


Info

Publication number
CN113902823A
Authority
CN
China
Prior art keywords
response
lors
coincidence
projection data
lor
Prior art date
Legal status
Granted
Application number
CN202111182705.6A
Other languages
Chinese (zh)
Other versions
CN113902823B (en)
Inventor
何鎏春
胡德斌
徐涵聪
吕杨
董筠
Current Assignee
Shanghai United Imaging Healthcare Co Ltd
Original Assignee
Shanghai United Imaging Healthcare Co Ltd
Priority date
Filing date
Publication date
Application filed by Shanghai United Imaging Healthcare Co Ltd filed Critical Shanghai United Imaging Healthcare Co Ltd
Priority to CN202111182705.6A priority Critical patent/CN113902823B/en
Publication of CN113902823A publication Critical patent/CN113902823A/en
Application granted granted Critical
Publication of CN113902823B publication Critical patent/CN113902823B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/00 2D [Two Dimensional] image generation
    • G06T11/003 Reconstruction from projections, e.g. tomography
    • G06T11/005 Specific pre-processing for tomographic reconstruction, e.g. calibration, source positioning, rebinning, scatter correction, retrospective gating

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Nuclear Medicine (AREA)

Abstract

The embodiments of the present specification disclose a projection method and a projection system for PET image reconstruction. The PET detector includes a plurality of detector rings arranged along an axial direction, and each detector ring includes a plurality of detection units arranged along a ring direction. The method comprises the following steps: acquiring actual projection data of an object, the actual projection data being acquired by detection units on a plurality of lines of response (LORs); acquiring an initial three-dimensional image of the object; acquiring a first coincidence response model and a second coincidence response model, wherein the first coincidence response model is associated with the ring direction of the detector and the second coincidence response model is associated with the axial direction of the detector; determining predicted projection data corresponding to each LOR of the plurality of LORs based on the first and second coincidence response models; and generating a target three-dimensional image of the object based on the predicted projection data and the actual projection data.

Description

Projection method and system for PET image reconstruction
Technical Field
The present application relates to the field of medical imaging, and in particular, to a projection method and system for PET image reconstruction.
Background
Positron Emission Tomography (PET) is a relatively advanced clinical examination imaging technique in the medical field. The PET reconstruction process requires projection (also called forward projection) and/or back projection using the detector system response matrix. The detector system response matrix maps the radioactivity intensities at different coordinate positions in three-dimensional space onto the detector coincidence lines (lines of response, LORs). Due to the physical nature of radiation detection in PET systems, an LOR responds not only to the pixels it passes through in three-dimensional space, but also, with different weights, to pixels near its periphery. Because of these complex physical characteristics, the response function of each LOR to its surrounding pixels differs at different positions along the LOR, so the system response matrix is expensive to compute, occupies a large amount of memory, and is slow to operate on, which slows image reconstruction; moreover, if the system response matrix deviates from the true system response, operation efficiency and image resolution are degraded. How to increase the speed of PET image reconstruction while improving image resolution is therefore an urgent problem to be solved.
Disclosure of Invention
One of the embodiments of the present specification provides a projection method for PET image reconstruction. The PET detector includes a plurality of detector rings arranged along an axial direction, each detector ring including a plurality of detection units arranged along a ring direction, and the method is performed by a computing device including a processor and a memory device. The method comprises the following steps: acquiring actual projection data of an object, the actual projection data being acquired by detection units on a plurality of lines of response (LORs); acquiring an initial three-dimensional image of the object; acquiring a first coincidence response model and a second coincidence response model, wherein the first coincidence response model is associated with the ring direction of the detector and the second coincidence response model is associated with the axial direction of the detector; determining predicted projection data corresponding to each LOR of the plurality of LORs based on the first and second coincidence response models; and generating a target three-dimensional image of the object based on the predicted projection data and the actual projection data.
One of the embodiments of the present specification provides a projection system for PET image reconstruction. The PET detector includes a plurality of detector rings arranged along an axial direction, and each detector ring includes a plurality of detection units arranged along a ring direction. The system comprises: an actual projection data acquisition module for acquiring actual projection data of an object, the actual projection data being acquired by detection units on a plurality of lines of response (LORs); an initial three-dimensional image acquisition module for acquiring an initial three-dimensional image of the object; a coincidence response model acquisition module configured to acquire a first coincidence response model and a second coincidence response model, where the first coincidence response model is associated with the ring direction of the detector and the second coincidence response model is associated with the axial direction of the detector; a predicted projection data determination module for determining predicted projection data corresponding to each LOR of the plurality of LORs based on the first and second coincidence response models; and an image update module for generating a target three-dimensional image of the object based on the predicted projection data and the actual projection data.
One of the embodiments of the present specification provides a projection apparatus for PET image reconstruction, including a processor and a storage device, the storage device being configured to store instructions, wherein when the processor executes the instructions, the projection method for PET image reconstruction as described in any one of the embodiments of the present specification is implemented.
One of the embodiments of the present specification provides a computer-readable storage medium, which stores instructions, wherein when a processor executes the instructions in the storage medium, the projection method for PET image reconstruction according to any one of the embodiments of the present specification is implemented.
Drawings
The present description will be further explained by way of exemplary embodiments, which will be described in detail by way of the accompanying drawings. These embodiments are not intended to be limiting, and in these embodiments like numerals are used to indicate like structures, wherein:
FIG. 1 is a schematic diagram of a PET scanner according to some embodiments of the present description;
FIG. 2 is a side view of a PET scanner according to some embodiments of the present description;
FIGS. 3A-3D are schematic structural diagrams of exemplary detector rings according to some embodiments of the present description;
FIG. 4 is an exemplary block diagram of a projection system for PET image reconstruction according to some embodiments of the present description;
FIG. 5 is an exemplary flow diagram of a projection method for PET image reconstruction according to some embodiments of the present description;
FIG. 6 is a schematic diagram illustrating the identification of LORs in a three-dimensional coordinate system according to some embodiments of the present description;
FIG. 7 is a graph comparing an axial coincidence response function obtained by parametric fitting with an axial coincidence response function obtained by direct analytical calculation or measurement, according to some embodiments of the present description;
FIG. 8 is an exemplary flow diagram for determining predicted projection data according to some embodiments of the present description;
FIG. 9 is a schematic diagram of circumferential coincidence response functions at different t positions on an LOR, according to some embodiments of the present description;
FIG. 10 is a schematic illustration of image rotation and image resampling according to some embodiments of the present description;
FIG. 11 illustrates an example distribution of the circumferential coincidence response functions of the four LORs a, b, c, and d in FIG. 10;
FIG. 12 is a comparison of a brain image reconstructed using the projection method provided in the embodiments of the present description with a reference image.
Detailed Description
In order to more clearly illustrate the technical solutions of the embodiments of the present disclosure, the drawings used in the description of the embodiments will be briefly described below. It is obvious that the drawings in the following description are only examples or embodiments of the present description, and that for a person skilled in the art, the present description can also be applied to other similar scenarios on the basis of these drawings without inventive effort. Unless otherwise apparent from the context, or otherwise indicated, like reference numbers in the figures refer to the same structure or operation.
It should be understood that "system", "device", "unit" and/or "module" as used herein is a method for distinguishing different components, elements, parts, portions or assemblies at different levels. However, other words may be substituted by other expressions if they accomplish the same purpose.
As used in this specification, the terms "a", "an", and/or "the" do not refer exclusively to the singular and may include the plural, unless the context clearly indicates otherwise. In general, the terms "comprise" and "include" merely indicate that the explicitly identified steps and elements are included; the steps and elements do not form an exclusive list, and a method or apparatus may include other steps or elements.
Flow charts are used in this description to illustrate the operations performed by a system according to embodiments of the present description. It should be understood that the preceding or following operations are not necessarily performed in the exact order presented. Rather, various steps may be processed in reverse order or simultaneously. Other operations may also be added to these processes, or one or more steps may be removed from them.
FIG. 1 is a schematic diagram of a PET scanner according to some embodiments herein.
The PET scanner 100 may scan the object 110 and/or generate one or more data (e.g., scan data, image data, etc.) about the object 110. As shown in fig. 1, the scanner 100 may include a support assembly 102, a detector 104, and a scanning bed 106.
The support assembly 102 (which may alternatively be referred to as a gantry) may support other components in the scanner 100, such as the detector 104, a cooling assembly (not shown in fig. 1), and the like. For example, the support assembly 102 may support the detector 104 and/or drive the detector 104 to move, such as rotate, translate, rock, and the like. In some embodiments, support assembly 102 may include an aperture to form a detection zone (e.g., a detection zone in which subject 110 is located).
The detector 104 may detect radiation events (e.g., gamma photons) emitted from the object 110. In some embodiments, the detector 104 may receive radiation (e.g., gamma rays) and generate an electrical signal. The detector 104 may comprise one or more detection units. In some embodiments, a detection unit may include a crystal (e.g., with a photoelectric counter) and/or a photomultiplier (e.g., a silicon photomultiplier, a photomultiplier tube). A plurality of detection units may be packaged into one detector ring, i.e., one detector ring may comprise a plurality of detection units arranged in the ring direction (around the Z axis). A plurality of detector rings may be mounted on the inner wall of the support assembly 102 and arranged along the axial direction of the detector 104 (i.e., the Z-axis direction in FIG. 1). For more details of the detector rings, reference may be made to FIGS. 2 and 3A-3D and their associated descriptions.
In some embodiments, the detector 104 may have an axial length (also referred to as the axial length of the detector), defined as the length along the Z-axis direction, i.e., the distance from one end of the detector 104 to the opposite end. In some embodiments, the axial length of the detector 104 may range from 0.75 meters to 2 meters. In some embodiments, the axial length of the detector 104 may exceed 2 meters.
To aid understanding of the embodiments in the present specification, the detection principle of PET is described. During examination, a tracer (made from a positron-emitting radionuclide) is injected into the human body and carried to the various tissues and organs with the blood flow. Because different tissues and organs take up the tracer to different degrees, the concentration of tracer accumulated in each tissue or organ differs, and so does the intensity of the photons emitted there. The positron-emitting radionuclide decays in vivo: as the nucleus passes from an unstable state to the ground state, it emits a positron. The positron combines with a nearby electron and annihilates. During annihilation, the mass of the electron-positron pair is converted into a pair of oppositely directed gamma (γ) photons of 511 keV energy. The flight (propagation) directions of the two photons are at, or approximately at, 180 degrees to each other; that is, the two photons can be considered to fly along a straight line in opposite directions. When two detection units in a PET detector simultaneously receive a pair of 511 keV gamma photons (a simultaneously detected photon pair is also called a coincidence event), the corresponding positron is counted, and the line integrals of the distribution of radioactive-substance intensity (radioactivity intensity for short) inside the living body (e.g., a human body) can be obtained by accumulating the counts of positrons/coincidence events along various given directions. The line connecting the two detection units that detect a photon pair (on which the position where the photon pair was generated lies) is called a line of response (LOR), also called a coincidence line. It will be appreciated that any two detection units lie on the LOR to which they correspond.
FIG. 2 is a side view of a PET scanner according to some embodiments of the present description.
As shown in fig. 2, a plurality of detector boxes (or blocks) 204 may be arranged in a ring-like configuration (also referred to as a detector ring) in a cross-section perpendicular to the Z-axis (i.e., a circumferential plane). The circumferential plane may be defined by an X-axis as well as a Y-axis. The detector box 204 may include one or more detection units 206, which may be distributed and packaged in a circumferential direction to form a detector box or detector block. The detector box 204 may be covered and protected by the housing 202. In some embodiments, the housing 202 may be a hollow cylinder. The area surrounded by the detector box 204 may be a detection area where the object 208 is located. The subject 208 may be supported by the scanning couch 106. In some embodiments, radiation emitted from the object 208 may be detected by the detector box 204 if the object 208 is within a lateral field of view. As described herein, the detection unit may be the smallest unit (e.g., may include a photoelectric counter) of the PET detector that is capable of photon detection and signal readout processing.
FIGS. 3A-3D are schematic structural views of exemplary detector rings according to some embodiments of the present description.
In some embodiments, the plurality of detector blocks (or detector units) may be distributed in a full-ring array or a partial-ring array. A detector ring may include one or more detector boxes. In some embodiments, a detector ring with a full-ring configuration may be circular (see FIG. 3A), hexagonal (see FIG. 3D), elliptical, or another polygonal form. In some embodiments, a detector ring with a partial-ring configuration may be implemented based on two or more detector blocks. The detector blocks may be curved or flat. The partial-ring detector ring shown in FIG. 3B has two curved detector blocks with an angular separation of 15° between them. The partial-ring detector ring shown in FIG. 3C has six evenly spaced curved detector blocks. In some embodiments, multiple detector rings may be arranged in axial succession to form a detector with a large axial length (e.g., 0.75 to 2 meters). Within one detector, at least one detector ring may have a full-ring configuration and/or at least one detector ring may have a partial-ring configuration. A detector with a large axial length may have a large axial field of view (e.g., 0.75 m to 2 m). In some embodiments, a whole-body scan may be achieved with a detector having a large axial length. The plurality of detector boxes or blocks may be symmetrically distributed along the circumferential plane; for example, in FIG. 3D, six detector blocks form a regular hexagonal distribution, and each detector block includes a plurality of detection units distributed on one of the six sides of the regular hexagon.
FIG. 4 is an exemplary block diagram of a projection system for PET image reconstruction in accordance with some embodiments described herein.
As shown in fig. 4, the system 400 may include an actual projection data acquisition module 410, an initial three-dimensional image acquisition module 420, a coincidence response model acquisition module 430, a predicted projection data determination module 440, and an image update module 450.
The actual projection data acquisition module 410 may be used to acquire actual projection data of the object, which is acquired by the detection units on the plurality of LORs.
The initial three-dimensional image acquisition module 420 may be used to acquire an initial three-dimensional image of the object.
The coincidence response model acquisition module 430 may be used to acquire a first coincidence response model associated with the ring direction of the detector and a second coincidence response model associated with the axial direction of the detector.
The predicted projection data determination module 440 may be configured to determine the predicted projection data corresponding to each LOR of the plurality of LORs based on the first and second coincidence response models.
The image update module 450 may be configured to generate a target three-dimensional image of the object based on the predicted projection data and the actual projection data.
For more details on the system 400 and its modules, reference may be made to fig. 5 and its associated description.
It should be understood that the system and its modules shown in FIG. 4 may be implemented in a variety of ways. For example, in some embodiments, the system and its modules may be implemented in hardware, software, or a combination of software and hardware. The hardware portion may be implemented using dedicated logic; the software portion may be stored in a memory and executed by a suitable instruction execution system, such as a microprocessor or specially designed hardware. Those skilled in the art will appreciate that the methods and systems described above may be implemented using computer-executable instructions and/or embodied in processor control code, such code being provided, for example, on a carrier medium such as a disk, CD- or DVD-ROM, a programmable memory such as read-only memory (firmware), or a data carrier such as an optical or electronic signal carrier. The system and its modules in this specification may be implemented not only by hardware circuits such as very-large-scale integrated circuits or gate arrays, semiconductors such as logic chips and transistors, or programmable hardware devices such as field-programmable gate arrays and programmable logic devices, but also by software executed by various types of processors, or by a combination of the above hardware circuits and software (e.g., firmware).
It should be noted that the above description of the system and its modules is for convenience only and should not limit the present disclosure to the illustrated embodiments. It will be appreciated by those skilled in the art that, given the teachings of the system, any combination of modules or sub-system configurations may be used to connect to other modules without departing from such teachings. For example, in some embodiments, the actual projection data acquisition module 410 and the initial three-dimensional image acquisition module 420 may be two modules or may be combined into one module. Such variations are within the scope of the present disclosure.
Fig. 5 is an exemplary flow diagram of a projection method for PET image reconstruction in accordance with some embodiments presented herein. Flow 500 may be performed by a computing device, which may include a processor and a memory. As shown in fig. 5, the process 500 may include the following steps.
Actual projection data of the object is acquired, step 510. In some embodiments, step 510 may be performed by the actual projection data acquisition module 410.
The actual projection data are acquired by detection units on a plurality of LORs in the PET detector. In some embodiments, the plurality of LORs may be all of the detector's LORs or only part of them. The object may be the patient being examined. The actual projection data of the object may be acquired based on the counts of the plurality of LORs. As described above, during examination the PET detectors (e.g., the multiple detector rings) surround the object. The positron tracer decays in the body and continuously releases positrons; each positron annihilates with an electron in the body, simultaneously releasing two gamma photons of 511 keV whose directions are 180 degrees apart. PET captures the paired photons with the detector and records the data per line of response (LOR): the gamma photon pair corresponding to each coincidence event is recorded and stored as a valid count, and these counts form the actual projection data. The actual projection data may include a value for each LOR (the actual total number of photon pairs detected by the detection units on that LOR), which may also be referred to as the actual response intensity or the actual projection value of that LOR. In some embodiments, the actual projection data may be represented as a sinogram or a matrix.
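For illustration only, the following sketch shows one way coincidence-event counts could be accumulated into per-LOR actual projection data; the LOR indexing, array sizes, and event format here are assumptions made for the example rather than requirements of the embodiments above.

```python
import numpy as np

# Hypothetical sizes: n_lor LORs indexed 0..n_lor-1 (an assumption for this sketch).
n_lor = 1_000_000

# Each simulated coincidence event is reduced to the index of the LOR joining
# the two detection units that fired simultaneously.
rng = np.random.default_rng(0)
event_lor_ids = rng.integers(0, n_lor, size=5_000_000)

# Actual projection data: one total count per LOR (a flattened sinogram).
actual_projection = np.bincount(event_lor_ids, minlength=n_lor).astype(np.float64)
print(actual_projection.shape, actual_projection.sum())
```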
Step 520, an initial three-dimensional image of the object is acquired. In some embodiments, step 520 may be performed by initial three-dimensional image acquisition module 420.
The pixel value of each pixel point of the initial three-dimensional image may represent the radioactivity intensity or the concentration of radioactive elements at that pixel point. In some embodiments, the initial three-dimensional image may be stored in a three-dimensional matrix, where each element in the three-dimensional matrix corresponds to a pixel point, and a value of each matrix element is a pixel value of the pixel point corresponding to the matrix element. In some embodiments, the initial three-dimensional image may be determined by initializing pixel values of pixel points in the three-dimensional image. For example, the pixel values of the pixel points of the initial three-dimensional image may be specified as the same value or arbitrary values to acquire the initial three-dimensional image. In some embodiments, the initial three-dimensional image may be determined based on a detector system response model. For example, the actual projection data may be backprojected using a system response model to determine an initial three-dimensional image.
Step 530, obtain the first and second conforming response models. In some embodiments, step 530 may be performed by the conforming response model acquisition module 430.
The first coincidence response model is associated with the ring direction of the detector, and the second coincidence response model is associated with the axial direction of the detector.
To facilitate understanding of the role of the first and second coincidence response models, a concept of a system coincidence response model is introduced herein, which can be used to map or project the radioactivity intensity (i.e., pixel values) of different pixel locations (i.e., different pixel points) in three-dimensional space to each of a plurality of LORs to obtain the response intensity (i.e., projection value) of each LOR. For example, the system coincidence response model may include a coincidence response matrix or function (also referred to as a projection matrix or function) for each LOR, each element in the projection matrix for each LOR corresponding to a pixel in three-dimensional space representing the contribution of the pixel in three-dimensional space to the response intensity of the LOR or the probability (also referred to as a response weight) that a photon emitted at the pixel is detected by a detection unit on the LOR.
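As a minimal sketch of the idea that an LOR's projection value is a weighted sum over the pixels near it (the array shapes, pixel indices, and weights below are illustrative assumptions, not values from the embodiments):

```python
import numpy as np

# The coincidence response "row" of one LOR stores, for a few pixels near the
# LOR, the probability (response weight) that a photon pair emitted in that
# pixel is detected by the detection units on this LOR.
image = np.random.default_rng(1).random((64, 64, 32))                 # radioactivity per pixel
lor_pixel_index = np.array([[10, 12, 5], [11, 12, 5], [12, 12, 5]])   # pixels near the LOR
lor_weights = np.array([0.2, 0.6, 0.2])                               # response weights

# Projection value of this LOR = weighted sum of the pixel values it "sees".
ix, iy, iz = lor_pixel_index.T
projection_value = float(np.sum(lor_weights * image[ix, iy, iz]))
print(projection_value)
```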
To better understand the system coincidence response model, referring to FIG. 6, a pixel can be represented by coordinates (x, y, z) in a three-dimensional coordinate system X-Y-Z. It is understood that the X, Y, and Z axes of the three-dimensional coordinate system are mutually perpendicular. In addition, the Z axis may be taken as the central axis of the detector; accordingly, the direction of the Z axis (Z direction for short) is the axial direction of the detector, and the X-Y plane is the ring plane of the detector, i.e., the plane of a detector ring. In the three-dimensional coordinate system X-Y-Z, an LOR can be uniquely identified by the coordinates (s, φ, z, θ), where s represents the (radial) distance from the central axis of the detector (i.e., the Z axis) to the projection of the LOR onto the ring plane (i.e., the X-Y plane); φ represents the angle between the projection of the LOR in the ring plane and the Y axis (also referred to as the ring angle or circumferential angle of the LOR), for example the circumferential angle φ of the LOR on which detection units i_a and i_b in FIG. 6 are located; z represents the axial position (i.e., the z coordinate) of the intersection of the LOR with the ring plane; and θ represents the angle between the LOR and its projection in the ring plane (also referred to as the axial angle of the LOR). Based on the above description, the system coincidence response model can be represented as a matrix, hereinafter referred to as the system coincidence response matrix. The system coincidence response matrix includes a coincidence response matrix corresponding to each of the plurality of LORs, and the coincidence response matrix corresponding to each LOR may be a portion (e.g., one row or one column) of the system response matrix. The coincidence response matrix corresponding to each LOR can be used to perform a weighted summation over the pixel values of all pixel points of the three-dimensional image, so as to obtain the response intensity of the LOR to all the pixel points (i.e., the projection value of the LOR).
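For illustration, the sketch below computes the (s, φ, z, θ) parameters of an LOR from the positions of its two detection units; the sign conventions and the choice of evaluating z at the point of closest approach to the detector axis are assumptions made for the example.

```python
import numpy as np

def lor_parameters(p1, p2):
    """Return (s, phi, z, theta) for the LOR joining detection units at p1 and p2.

    Conventions here are illustrative: phi is measured from the Y axis in the
    ring (X-Y) plane, s is the signed in-plane distance of the LOR from the
    detector axis (Z axis), theta is the angle between the LOR and its in-plane
    projection, and z is evaluated where the LOR is closest to the axis.
    """
    p1, p2 = np.asarray(p1, dtype=float), np.asarray(p2, dtype=float)
    d = p2 - p1
    dxy = d[:2]
    lxy = np.hypot(dxy[0], dxy[1])
    phi = np.arctan2(dxy[0], dxy[1]) % np.pi          # in-plane angle relative to the Y axis
    s = p1[0] * np.cos(phi) - p1[1] * np.sin(phi)     # signed distance of the line from the Z axis
    theta = np.arctan2(d[2], lxy)                     # axial angle of the LOR
    lam = -np.dot(p1[:2], dxy) / (lxy ** 2)           # closest approach of the projection to the axis
    z = p1[2] + lam * d[2]
    return s, phi, z, theta

print(lor_parameters((300.0, 0.0, 10.0), (-300.0, 50.0, 40.0)))
```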
In some embodiments, the first coincidence response model is the circumferential component of the system coincidence response model and the second coincidence response model is its axial component. The system coincidence response model can be determined based on the first coincidence response model and the second coincidence response model. For example, the first coincidence response model can be represented as a first matrix, the second coincidence response model can be represented as a second matrix, and the system coincidence response matrix (i.e., the system coincidence response model) can be obtained by multiplying the first matrix and the second matrix.
Because each LOR corresponds to its own coincidence response matrix representing the response weights of the pixel points in the three-dimensional image to that LOR, both the number of LORs and the number of pixels contained in the three-dimensional image affect the storage space required to store the system coincidence response model. Based on the above discussion, the storage space required to store the system coincidence response model is often impractical to provide, and even if it could be provided, the cost would be unacceptable. In view of this, the embodiments of the present disclosure split the response in three-dimensional space into a circumferential component and an axial component; that is, the system coincidence response model is decomposed into a first coincidence response model associated with the detector ring and a second coincidence response model associated with the detector axis, and the first and second coincidence response models are stored directly. This saves the storage space needed to store the coincidence response models and reduces the projection computation (i.e., it accelerates the projection calculation).
It should be understood that the term "decomposing" is used to describe the relationship between the system coincidence response model and the first and second coincidence response models; in practice, the first and second coincidence response models need not be obtained by actually decomposing the system coincidence response model. In some embodiments, the first and/or second coincidence response models may be obtained by one or more of the following methods: (1) analytical calculation based on the geometric information of the detector; (2) simulation based on a Monte Carlo model of the detector structure; (3) direct measurement using specific experimental protocols. The calculated or measured first and/or second coincidence response models can be stored in a storage device, and when image reconstruction needs to be performed using them, the first and/or second coincidence response models can be retrieved directly from the storage device.
Several complex physical characteristics cause each LOR to respond differently to the surrounding pixels at different positions along the LOR. These physical characteristics include, but are not limited to: the free path of the positron before annihilation, the non-collinearity of the annihilation gamma rays, the attenuation and scattering of gamma photons in the detector crystals, and the sensitivity variation among different detector crystals. It will be appreciated that different methods may be used to estimate the effect of different physical processes on the coincidence response when obtaining the first and/or second coincidence response models. For example, analytical calculation can be used to describe the attenuation and scattering of gamma photons in the detector crystals, while the effect of the other physical characteristics on the coincidence response can be measured experimentally.
In some embodiments, the first and/or second coincidence response models stored in the memory may take any one, or a combination, of the following forms: a response function (or matrix), a lookup table, a fitting function of a response function, fitting parameters of a fitting function, and the like.
In some embodiments, the first and/or second coincidence response models may each include a coincidence response matrix (e.g., a circumferential coincidence response matrix or an axial coincidence response matrix) corresponding to each LOR. In some embodiments, each coincidence response matrix (e.g., the circumferential coincidence response matrix or the axial coincidence response matrix) includes a plurality of elements, each representing the response weight of a particular LOR to the surrounding pixels at a particular location, or to all pixels. For example, the first coincidence response model may be represented as a first matrix whose rows correspond to different pixels and whose columns correspond to different LORs; the second coincidence response model may be represented as a second matrix whose rows correspond to different pixels and whose columns correspond to different LORs.
In some embodiments, the first and/or second coincidence response models may each include a coincidence response function (e.g., a circumferential coincidence response function or an axial coincidence response function) corresponding to each LOR. In some embodiments, each coincidence response function (e.g., a circumferential or axial coincidence response function) can represent how the response weight of a particular LOR to the surrounding pixels (or to all pixels) at a particular location varies with the position of the pixel relative to that LOR.
In some embodiments, the first and/or second coincidence response models may each be represented as one or more lookup tables. A lookup table may contain the response weights of a particular LOR to the pixels at particular locations around it (e.g., the response weights in the circumferential coincidence response matrix and in the axial coincidence response matrix), or a coincidence response function describing how the response weights vary with the position of the pixel relative to the particular LOR. In some embodiments, each LOR may correspond to one lookup table, and an element of the lookup table may represent the response weight of that LOR to a given pixel, or the response weight of a given location on the LOR to a pixel at a given relative position along a certain direction.
In some embodiments, a simplified, compressed lookup table may be built by exploiting the symmetry of the detector for coincidence response functions or matrices obtained by any of the above methods. As an example, a detector ring in the detector may have a regular-polygon structure, with the plurality of detection units arranged on the respective sides of the regular polygon. The same number (denoted N) of detection units may be disposed on each side of the regular polygon, and the detection units on each side may be arranged at equal intervals along that side. The detector structure therefore has M-fold rotational symmetry in the circumferential direction, where M is the number of sides of the regular polygon. Accordingly, the memory may store only the circumferential coincidence response functions corresponding to the LORs of N circumferential-angle (refer to φ) groups, and the first coincidence response model is determined based on the circumferential coincidence response functions corresponding to the LORs of these N circumferential-angle groups. Within each circumferential-angle group, the LOR of each circumferential angle may correspond, at its different positions in the t direction (i.e., different t positions), to a respective circumferential coincidence response function along the s direction.
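As a sketch of how such a compressed table could be indexed (the number of sides M, the number of stored angle groups N, the binning rule, and the stand-in response curves are all assumptions made for the example):

```python
import numpy as np

M = 6                      # regular hexagon: rotating by 2*pi/M maps the detector onto itself
N = 48                     # number of stored circumferential-angle groups (assumed)
sector = 2 * np.pi / M

# Stand-in stored circumferential response curves (one per angle group, sampled along s).
stored_profiles = np.random.default_rng(2).random((N, 31))

def circumferential_profile(phi):
    """Look up the stored circumferential response curve for ring angle phi."""
    phi_reduced = phi % sector                 # fold the angle into one symmetric sector
    group = int(phi_reduced / sector * N) % N  # quantize to one of the N stored groups
    return stored_profiles[group]

print(circumferential_profile(1.234).shape)
```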
In some embodiments, a calculation function for the coincidence response functions of different LORs may be established by parametrically fitting existing coincidence response functions (e.g., circumferential coincidence response functions and/or axial coincidence response functions) corresponding to the different LORs. In some embodiments, a lookup table of the fitting parameters of the fitting functions of the coincidence response functions of different LORs may be established, and the coincidence response functions corresponding to the different LORs can then be determined from the fitting parameters corresponding to the different LORs and the calculation function. For example, a first fitting function may represent the correspondence between the first fitting parameters corresponding to different positions on an LOR and the circumferential coincidence response function, and a second fitting function may represent the correspondence between the second fitting parameters corresponding to different positions on an LOR and the axial coincidence response function. In some embodiments, the first fitting parameters and/or the second fitting parameters corresponding to different positions on each LOR may be stored in the memory in the form of a lookup table. Accordingly, the coincidence response model acquisition module 430 can obtain the first and second fitting parameters corresponding to different positions on each LOR from the memory. Furthermore, the coincidence response model acquisition module 430 may determine the circumferential coincidence response functions corresponding to different positions on each LOR using the first fitting function based on the first fitting parameters, and determine the axial coincidence response functions corresponding to different positions on each LOR using the second fitting function based on the second fitting parameters. Storing the fitting parameters instead of the full response functions greatly reduces the storage space required for the first and/or second coincidence response models.
In some embodiments, the first fitting function or the second fitting function may be built from a combination of two half-Gaussian functions with different widths, and the first fitting parameters or the second fitting parameters may include the critical point of the two half-Gaussians and their width parameters.
Taking the axial coincidence response function as an example, the parametric fit can be performed according to the following equation (1):

I_ACAF(Δz) = exp(-(Δz - c)² / (2σ_L²)) for Δz ≤ c;  I_ACAF(Δz) = exp(-(Δz - c)² / (2σ_R²)) for Δz > c    (1)

where the response intensity I_ACAF of the axial coincidence response function at different Δz (the axial distance of a pixel relative to the LOR) is described as a combination of two different half-Gaussian functions with critical point c and left and right widths σ_L and σ_R. The fitting parameters c, σ_L, and σ_R of the axial coincidence response functions of different LORs at different t positions can themselves be fitted as functions of t and of the circumferential angle of the LOR (or of the distance between the two detection units on the LOR). After the fitting is completed, when the axial coincidence response function at a specific t position of a specific LOR is needed, the fitting parameters c, σ_L, and σ_R corresponding to that position are first retrieved and then substituted into the above expression.
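A minimal sketch of equation (1) as a function of the stored fitting parameters (the unit peak amplitude and the parameter values below are assumptions for the example):

```python
import numpy as np

def axial_response(dz, c, sigma_l, sigma_r):
    """Two-half-Gaussian axial response at axial offset dz (pixel position relative to the LOR)."""
    dz = np.asarray(dz, dtype=float)
    sigma = np.where(dz <= c, sigma_l, sigma_r)        # the left/right half uses its own width
    return np.exp(-((dz - c) ** 2) / (2.0 * sigma ** 2))

dz = np.linspace(-10.0, 10.0, 201)
profile = axial_response(dz, c=0.5, sigma_l=1.2, sigma_r=2.0)
print(profile.max(), profile[0], profile[-1])
```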
FIG. 7 compares axial coincidence response functions obtained with the parametric fitting method (i.e., from the second fitting function; the curves in the figure) with axial coincidence response functions obtained by direct analytical calculation or measurement (the star markers in the figure). The LOR and the t position corresponding to each of the axial coincidence response functions (a), (b), (c), and (d) are indicated in (e). As shown in the figure, the coincidence response functions obtained by parametric fitting are close to, or substantially the same as, those obtained by analytical calculation or measurement; compared with storing the coincidence response functions themselves, storing their fitting parameters saves storage space and speeds up retrieval of the coincidence response functions, thereby further accelerating image reconstruction.
In some embodiments, separate sets of circumferential coincidence response functions are established for different pixel sizes to maintain higher accuracy of the first coincidence response model. That is, the memory may store a plurality of sets of circumferential coincidence response functions in one-to-one correspondence with a plurality of pixel sizes, and the first coincidence response model may be determined based on these sets. Each circumferential coincidence response function within a set may correspond to one LOR. The pixel size refers to the area of each pixel in the reconstructed three-dimensional image. For the same LOR, the circumferential coincidence response functions corresponding to different pixel sizes are different.
For more details on the first and/or second coincidence response models, reference may be made to the related description below.
Step 540, determining the predicted projection data corresponding to each LOR of the plurality of LORs based on the first and second coincidence response models. In some embodiments, step 540 may be performed by the predicted projection data determination module 440.
In some embodiments, the predicted projection data determination module 440 may divide the plurality of LORs into a plurality of sets of LORs, where the LORs of the same set have the same circumferential angle (refer to φ) and the projections of the LORs of the same set onto the circumferential plane are at the same distance (refer to s) from the center of the circumferential plane. With reference to the foregoing, LORs with the same s and φ can be treated as one set of LORs (also referred to as an LOR group). It is understood that the LORs within the same LOR group lie in the same plane.
After grouping the LORs, for each set of LORs (i.e., each LOR group) among the plurality of LOR groups, the predicted projection data determination module 440 may determine intermediate predicted projection data corresponding to the LOR group based on the first coincidence response model and the initial three-dimensional image; the intermediate predicted projection data corresponding to the LOR group comprise the response intensities to the initial three-dimensional image in a plurality of ring planes at different axial positions (refer to z) of the detector. Furthermore, the predicted projection data determination module 440 may determine the predicted projection data corresponding to each LOR in the LOR group based on the second coincidence response model and the intermediate predicted projection data corresponding to the LOR group. For a detailed description of the determination of the predicted projection data, reference may be made to FIG. 8.
Step 550, generating a target three-dimensional image of the object based on the predicted projection data and the actual projection data. In some embodiments, step 550 may be performed by image update module 450.
It should be appreciated that the process 500 may be iteratively/repeatedly performed until a reconstructed image is obtained that meets the requirements. That is, the target three-dimensional image generated in any iteration can be used as the initial three-dimensional image in the next iteration. In each iteration, the initial three-dimensional image may be updated by comparing the error between the predicted projection data and the actual projection data. In some embodiments, the error between the predicted projection data and the actual projection data may be minimized over multiple iterations, and when the iterations terminate, the initial three-dimensional image in the last iteration or the updated initial three-dimensional image may be designated as the target three-dimensional image.
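The embodiments above do not prescribe a particular update rule; purely as an example, the sketch below uses an MLEM-style multiplicative update with a small dense stand-in for the system coincidence response model (all sizes and data are synthetic and chosen only for the illustration):

```python
import numpy as np

rng = np.random.default_rng(3)
n_lor, n_pix = 500, 200
# Sparse-ish stand-in for the system coincidence response model (LOR x pixel weights).
system = rng.random((n_lor, n_pix)) * (rng.random((n_lor, n_pix)) < 0.05)
truth = rng.random(n_pix)
actual = system @ truth                                   # stand-in actual projection data

image = np.ones(n_pix)                                    # initial three-dimensional image (flattened)
sensitivity = system.sum(axis=0) + 1e-12                  # per-pixel normalization
for _ in range(20):
    predicted = system @ image + 1e-12                    # forward projection: predicted projection data
    ratio = actual / predicted                            # compare predicted with actual data
    image *= (system.T @ ratio) / sensitivity             # back-project the ratio and update the image

print(float(np.abs(system @ image - actual).mean()))      # residual shrinks over the iterations
```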
Fig. 8 is an exemplary flow diagram for determining predicted projection data according to some embodiments of the present description. Flow 800 may be performed by a computing device, which may include a processor and a memory. The 540 step in fig. 5 may be performed based on the process 800. As shown in fig. 8, the process 800 may include the following steps.
Step 810, dividing the plurality of LORs into a plurality of sets of LORs.
In some embodiments, the LORs of the same set have the same circumferential angle (refer to φ), and the projections of the LORs of the same set onto the circumferential plane are at the same distance (i.e., s) from the center of the circumferential plane. With reference to the foregoing, LORs with the same s and φ can be treated as one set of LORs (also referred to as an LOR group). It will be appreciated that the projections of the LORs within the same LOR group onto the same ring plane coincide or overlap.
And 820, determining intermediate prediction projection data corresponding to each group of LORs in the plurality of groups of LORs based on the first coincidence response model and the initial three-dimensional image.
In some embodiments, the intermediate predicted projection data corresponding to an LOR group may be represented as a two-dimensional data set. The first dimension of the two-dimensional data set represents the axial positions (e.g., z coordinates) of the plurality of ring planes, and the second dimension represents the different positions of the LOR group along a second direction (denoted the t direction) within the ring planes. Each element of the two-dimensional data set represents, at one position in the t direction within one of the ring planes, the sum of the response intensities, along a first direction (denoted the s direction) in that ring plane, of all pixels of the initial three-dimensional image (or of the pixels the LOR group passes through and their surrounding pixels). The t direction is perpendicular to the s direction, and the s and t directions may follow the pixel arrangement of the initial three-dimensional image within the ring plane. For example only, the elements of the two-dimensional data set may be calculated as follows:

D_φ(t, z) = Σ_s w_φ(t, s) · Pix(t, s)

where D_φ(t, z) denotes an element of the two-dimensional data set (i.e., the intermediate predicted projection data) corresponding to the LOR group with circumferential angle φ; w_φ(t, s) denotes the normalized response weight (with values in the interval [0, 1]), i.e., the circumferential coincidence response matrix or function corresponding to the LOR, which can be extracted from the first coincidence response model; and Pix(t, s) denotes the pixel value of the pixel (point) of the initial three-dimensional image whose relative position to the LOR is t in the t direction and s in the s direction, within the ring plane at axial position z.
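As a sketch of this step (the array shapes, the (t, s, z) axis ordering, and the stand-in weight profile are assumptions for the example):

```python
import numpy as np

rng = np.random.default_rng(4)
n_t, n_s, n_z = 64, 64, 32
image_tsz = rng.random((n_t, n_s, n_z))    # initial image arranged so its axes run along t, s, z for this group

# Normalized circumferential response weights of this LOR group along s, one row per t position.
w_1d = np.exp(-0.5 * ((np.arange(n_s) - n_s / 2) / 1.5) ** 2)   # stand-in 1-D profile along s
w_ts = np.tile(w_1d / w_1d.sum(), (n_t, 1))                     # here the same profile at every t

# Element (t, z) of the intermediate data set = sum over s of w(t, s) * Pix(t, s) in the ring plane at z.
intermediate = np.einsum('ts,tsz->tz', w_ts, image_tsz)
print(intermediate.shape)   # (n_t, n_z)
```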
In some embodiments, the first coincidence response model may include a circumferential coincidence response matrix corresponding to each LOR of the plurality of LORs.
In some embodiments, the first coincidence response model may include a circumferential coincidence response matrix corresponding to each LOR of the plurality of LORs at each of its different positions in the t direction (i.e., different t positions). Determining the intermediate predicted projection data corresponding to an LOR group based on the first coincidence response model and the initial three-dimensional image may mean determining the intermediate predicted projection data corresponding to the LOR group based on the circumferential coincidence response matrix corresponding to that LOR group in the first coincidence response model.
In some embodiments, the first coincidence response model may be a two-dimensional function of s and t, e.g., w(s, t), where s denotes the position of the pixel in the s direction and t denotes the position of the pixel in the t direction. When lower-precision calculation is acceptable, the circumferential coincidence response function can be simplified to a one-dimensional function of s, i.e., the circumferential coincidence response functions at different t positions of the same LOR are treated as identical.
In some embodiments, for a short axis system, the circumferential coincidence response function corresponding to the same position in the t direction (i.e., the same t position) of each LOR in at least one of the plurality of sets of LORs may be the same.
In some embodiments, for a long-axis system, the circumferential coincidence response functions of the LORs in at least one of the plurality of LOR sets corresponding to the same position in the t direction (i.e., the same t position) may be different. In some embodiments, the differences among the circumferential coincidence response functions of the LORs within an LOR group at the same t position can be handled by further dividing the LOR group into subsets. Specifically, for each LOR group of the plurality of LOR groups, the predicted projection data determination module 440 may divide the LOR group into a plurality of subsets, where the LORs in different subsets correspond to different axial-angle (refer to θ) intervals and the subsets correspond to different circumferential coincidence response functions. The division of the axial-angle intervals is not particularly limited in this specification. For example only, the value interval [0, 1] of tan θ may be divided into a plurality of sub-intervals of equal length, and the subsets within the LOR group may be defined according to the θ interval corresponding to each sub-interval. For each of the plurality of subsets, the predicted projection data determination module 440 may determine the intermediate predicted projection data corresponding to that subset using the circumferential coincidence response function corresponding to the subset, and then determine the intermediate predicted projection data corresponding to the LOR group based on the intermediate predicted projection data corresponding to the subsets.
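As a sketch of one possible subset assignment (the number of sub-intervals and the binning rule are assumptions for the example):

```python
import numpy as np

n_sub = 4
edges = np.linspace(0.0, 1.0, n_sub + 1)      # equal-length sub-intervals of tan(theta) over [0, 1]

def subset_index(theta):
    """Assign an LOR with axial angle theta to one of the subsets of its LOR group."""
    t = np.clip(np.tan(theta), 0.0, 1.0)
    return min(int(np.searchsorted(edges, t, side='right')) - 1, n_sub - 1)

print([subset_index(th) for th in (0.0, 0.2, 0.5, 0.78)])
```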
It is to be understood that a short axis system may refer to the axial length of the detector being less than a threshold value and a long axis system may refer to the axial length of the detector being greater than a threshold value. In some embodiments, the length threshold may be equal to 100 centimeters, 90 centimeters, 80 centimeters, 70 centimeters, or 60 centimeters, or 50 centimeters, or the like.
In some embodiments, the first coincidence response model may include a plurality of circumferential coincidence response functions corresponding to the plurality of LORs, and these circumferential coincidence response functions may be modified to obtain the first coincidence response model. The modification may include adjusting, according to the difference between the pixel arrangement of the initial three-dimensional image and the circumferential angle of the LOR, the width of a preset part of the waveform of the circumferential coincidence response function in the s direction and its response intensity (the corresponding function value). Here the initial three-dimensional image may be unrotated or rotated by a preset angle φ_r along the circumferential direction of the detector. It will be appreciated that the width of the function and its response intensity can be modified without changing the magnitude of the integral of the circumferential coincidence response function along the s direction. Therefore, by shifting the circumferential coincidence response function along the s direction at different t positions, the deviation caused by the LOR not being parallel to the pixel columns can be corrected.
Step 830, determining the predicted projection data corresponding to each LOR in the LOR group based on the second coincidence response model and the intermediate predicted projection data corresponding to the LOR group.
In some embodiments, the predicted projection data determination module 440 may determine, through the second coincidence response model, the response intensity of each LOR in the LOR group at each position in the t direction (i.e., each t position) based on the intermediate predicted projection data corresponding to the LOR group. Furthermore, the predicted projection data determination module 440 may sum the response intensities of each LOR over all t positions to obtain the predicted projection data corresponding to that LOR.
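A sketch of this axial step for a single LOR of the group (the plane spacing, the simplified geometry, and the two-half-Gaussian weights are assumptions for the example):

```python
import numpy as np

rng = np.random.default_rng(5)
n_t, n_z = 64, 32
dz_plane = 2.0                                    # axial spacing of the ring planes (assumed)
intermediate = rng.random((n_t, n_z))             # (t, z) data set of the LOR group from the previous step

def axial_weights(z_lor, c=0.0, sigma_l=1.0, sigma_r=1.5):
    """Two-half-Gaussian axial response weights over the ring-plane axial positions."""
    z_planes = np.arange(n_z) * dz_plane
    dz = z_planes - z_lor
    sigma = np.where(dz <= c, sigma_l, sigma_r)
    w = np.exp(-((dz - c) ** 2) / (2 * sigma ** 2))
    return w / w.sum()

def predict_lor(z0, theta):
    """Predicted projection value of the LOR with axial position z0 and axial angle theta."""
    t_coords = (np.arange(n_t) - n_t / 2) * dz_plane
    total = 0.0
    for it, t in enumerate(t_coords):
        z_lor = z0 + t * np.tan(theta)            # axial position of the LOR at this t position
        total += float(axial_weights(z_lor) @ intermediate[it])   # weight over z, then accumulate over t
    return total

print(predict_lor(z0=30.0, theta=0.1))
```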
In some embodiments, the axial coincidence response function may be a two-dimensional function of z and t. When lower-precision calculation is acceptable, the axial coincidence response function can be simplified to a one-dimensional function of z, i.e., the axial coincidence response functions at different t positions of the same LOR are treated as identical.
In some embodiments, the second coincidence response model may include a plurality of axial coincidence response functions respectively corresponding to each LOR of the plurality of LORs at different t positions.
In some embodiments, the predicted projection data determination module 440 may rotate the initial three-dimensional image along the circumferential direction of the detector by a preset angle (denoted as φr) based on an image rotation model (e.g., an image rotation matrix). The predicted projection data determination module 440 may then determine, through the first coincidence response model, the response intensities of the set of LORs to the rotated three-dimensional image in a plurality of ring planes at different axial positions (e.g., different z positions) of the detector, so as to determine the intermediate predicted projection data corresponding to the set of LORs. The closer the rotated pixel arrangement (reference t direction) is to the circumferential angle of the LORs (reference φ), the narrower the response range described by the first coincidence response model associated with the circumferential direction of the detector (e.g., the interval along the s direction in which the response weight is greater than 0), i.e., the fewer pixels are involved in the calculation; the amount of projection calculation can thus be reduced and the projection calculation efficiency improved. Notably, in some embodiments, the rotation angle of the image (e.g., φr) need not be exactly equal to the circumferential angle of the LORs (e.g., φ), i.e., the pixel columns of the rotated three-dimensional image along the t direction need not be exactly parallel to the projections of the LORs onto the ring plane.
It is understood that, in some embodiments, each of the plurality of sets of LORs may correspond to a rotation angle φr. Taking the corresponding projection matrix, denoted as Mφr, as an example, Mφr may be expressed in the following form:
Mφr = T · A · Rφr
where Rφr is a circumferential image rotation matrix for rotating the initial three-dimensional image along the circumferential direction of the detector by the preset angle φr; accordingly, the original coordinates (x, y, z) of a pixel (point) in the initial three-dimensional image may be transformed/mapped to (xr, yr, z). A is a coincidence response matrix associated with the circumferential direction of the detector (i.e., the first coincidence response model), hereinafter referred to as the circumferential coincidence response matrix. T is a coincidence response matrix associated with the axial direction of the detector (reference z direction) (i.e., the second coincidence response model), hereinafter referred to as the axial coincidence response matrix. The axial and circumferential coincidence response matrices T and A may each comprise a series of (axial/circumferential) coincidence response functions describing the response weights of different LORs to their surrounding pixels at a particular position (e.g., a t position). For any set of LORs, the rotated three-dimensional image (whose pixel coordinates are (xr, yr, z)) can be mapped, by applying the circumferential coincidence response matrix, to a particular two-dimensional data set (with dimensions t and z). Each element in the two-dimensional data set represents the sum of the response intensities of the set of LORs to the peripheral pixels of the rotated three-dimensional image along a first direction (denoted as the s direction) within one of the ring planes, at a position along the t direction in that ring plane. For more details on the two-dimensional data set, reference may be made to the preceding embodiments. Further, by applying the axial coincidence response matrix, these two-dimensional data sets can be mapped or projected onto the LORs having the same s and φ but different z and θ, respectively, to obtain the predicted response intensity (i.e., the predicted projection data) of each LOR.
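A compact Python sketch of this three-step decomposition (rotation, circumferential coincidence response, axial coincidence response) is given below. It is an illustration under assumed array shapes and weight layouts: `scipy.ndimage.rotate` stands in for the image rotation matrix Rφr, a shared (t, s) weight map stands in for the circumferential matrix A of the set, and a per-LOR (t, z) weight array stands in for the axial matrix T; it is not the implementation of the embodiments.

```python
import numpy as np
from scipy.ndimage import rotate

def forward_project_lor_set(volume_stz, angle_deg, circ_weights, axial_weights):
    """Illustrative forward projection for one set of LORs: rotate -> circumferential -> axial.

    volume_stz    : (n_s, n_t, n_z) image volume, with s/t axes chosen for this set.
    angle_deg     : preset rotation angle of the set (rotate() stands in for R).
    circ_weights  : (n_t, n_s) assumed circumferential response weights; collapsing the
                    s axis per ring plane yields the (n_t, n_z) two-dimensional data set (A).
    axial_weights : (n_lor, n_t, n_z) assumed axial response weights per LOR (T).
    """
    # R: rotate the volume within the ring plane (s-t plane), leaving z untouched
    rotated = rotate(volume_stz, angle_deg, axes=(0, 1), reshape=False, order=1)

    # A: weighted sum over s for every t position and every ring plane z -> (n_t, n_z)
    intermediate_tz = np.einsum('ts,stz->tz', circ_weights, rotated)

    # T: weighted sum over (t, z) per LOR -> predicted projection value of each LOR
    return np.einsum('ltz,tz->l', axial_weights, intermediate_tz)

vol = np.random.default_rng(1).random((32, 32, 16))
cw = np.random.default_rng(2).random((32, 32))
aw = np.random.default_rng(3).random((10, 32, 16))
print(forward_project_lor_set(vol, 7.5, cw, aw).shape)   # (10,) -- one value per LOR
```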
In some embodiments, image resampling may also be performed on the basis of the image rotation. Specifically, the predicted projection data determination module 440 may upsample the rotated three-dimensional image along the s direction and/or downsample it along the t direction to obtain a resampled three-dimensional image. The predicted projection data determination module 440 may then determine, through the first coincidence response model, the response intensities of the set of LORs to the resampled three-dimensional image in the plurality of ring planes at different axial positions (reference z direction) of the detector. On the one hand, upsampling along the s direction can reduce the resolution degradation caused by image rotation and improve the sampling accuracy of the coincidence response model. On the other hand, considering that the coincidence response function varies relatively gently along the t direction once the image is rotated until the pixel arrangement is close to the circumferential angle of the LOR (for example, as shown in FIG. 9, the circumferential coincidence response functions at different t positions on the same LOR, such as a, c, f; b, d, g; or e, h, are approximately the same in distribution and shape), downsampling can be performed along the t direction to reduce the amount of projection calculation. In some embodiments, the upsampling along the s direction and the downsampling along the t direction may be performed simultaneously in one rotation operation to reduce the amount of calculation and the corresponding memory accesses.
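A hedged illustration of the resampling step is sketched below; `scipy.ndimage.zoom` is used only as a stand-in for the resampling, and the factors `up_s` and `down_t` are assumed values. As noted above, in practice the resampling can be folded into the interpolation of the rotation itself so that the data are traversed only once.

```python
import numpy as np
from scipy.ndimage import zoom

def resample_rotated_volume(rotated_stz, up_s=2.0, down_t=0.5, order=1):
    """Illustrative resampling of a rotated volume: upsample along s, downsample along t.

    rotated_stz : (n_s, n_t, n_z) rotated image. `up_s` and `down_t` are assumed
                  zoom factors; the z axis is left unchanged.
    """
    # finer sampling along s, coarser sampling along t, z unchanged
    return zoom(rotated_stz, zoom=(up_s, down_t, 1.0), order=order)

rotated = np.random.default_rng(4).random((32, 32, 16))
print(resample_rotated_volume(rotated).shape)   # (64, 16, 16)
```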
Of course, in some embodiments, the initial three-dimensional image may not be rotated. For example, the circumferential angle of some LORs inherently coincides, or nearly coincides, with the pixel arrangement before rotation (reference t direction); for such LORs, the initial three-dimensional image need not be rotated.
In some embodiments, the predicted projection data determination module 440 may also apply the second coincidence response model associated with the axial direction of the detector before applying the first coincidence response model associated with the circumferential direction of the detector to determine the predicted projection data corresponding to each LOR of the plurality of LORs.
FIG. 10 is a schematic illustration of image rotation and image resampling, shown in accordance with some embodiments of the present description. It should be understood that fig. 10 shows the pixels (dots) of the three-dimensional image, the detection units, and the LORs (e.g., four LORs labeled a, b, c, d) connecting the detection units in a view from the ring plane.
As shown in FIG. 10, the circumferential coincidence response functions of the different LORs may act directly on the initial three-dimensional image shown in (1), i.e., the two-dimensional data set required for the subsequent projection is generated by the circumferential coincidence response functions alone, without rotating the initial three-dimensional image. Rotating the initial three-dimensional image from (1) to (2) reduces the angular deviation between the pixel arrangement that the circumferential coincidence response function needs to consider and the coincidence response lines a, b, c, and d, which benefits calculation efficiency. Upsampling the image along the s direction (approximately perpendicular to the LORs) and downsampling it along the t direction (approximately parallel to the LORs), as shown in (3), can improve the model accuracy while speeding up the projection calculation. FIG. 11 shows an example of the distributions of the circumferential coincidence response functions of the four LORs a, b, c, and d in FIG. 10, where a darker color represents a higher response weight at that position. As shown, the two-dimensional circumferential coincidence response function accounts for the angular deviation between the pixel arrangement and the LOR. The axial coincidence response function is applied in a manner similar to the circumferential coincidence response function: a lookup table may be applied directly, or a calculation function may be applied; no resampling of the (t, z) two-dimensional data is required, and the coincidence response intensities of different LORs at different t positions can be calculated directly through the axial coincidence response function alone.
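As an illustration of the parametric alternative to a full lookup table (cf. claim 16, which combines two half-Gaussian functions of different widths joined at a critical point), the sketch below rebuilds a response profile on demand from a small table of fitting parameters. The parameter values and the table keys are hypothetical.

```python
import numpy as np

def half_gaussian_pair(z, center, sigma_left, sigma_right):
    """Illustrative parametric coincidence response: two half-Gaussians of different
    widths joined at the critical point `center`.
    """
    sigma = np.where(z < center, sigma_left, sigma_right)
    return np.exp(-0.5 * ((z - center) / sigma) ** 2)

# Hypothetical parameter lookup table: one (center, sigma_left, sigma_right) triple
# per (LOR index, t position); only the parameters are stored, not full profiles.
param_table = {
    (0, 0): (0.0, 0.8, 1.4),
    (0, 1): (0.1, 0.8, 1.5),
}

z = np.linspace(-4.0, 4.0, 81)
weights = half_gaussian_pair(z, *param_table[(0, 1)])   # profile rebuilt on demand
print(weights.shape, float(weights.max()))
```

Storing only a few fitting parameters per LOR and position, rather than sampled profiles, is what allows the lookup table to be simplified and compressed.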
FIG. 12 is a comparison between brain images reconstructed using the projection method provided in the embodiments of the present disclosure and a reference image. In FIG. 12, column a is the reference image, with a gray-matter-to-white-matter contrast of 4:1, which is close to the actual gray-to-white contrast; column b is the reconstruction result from noise-free data obtained using the reconstruction method provided in the embodiments of the present disclosure, where the projection data required for the reconstruction of column b is obtained by forward-projecting the reference image of column a; column c is the reconstruction result from noisy data obtained using the reconstruction method provided in the embodiments of the present disclosure, where the projection data required for the reconstruction of column c is likewise obtained by forward-projecting the reference image of column a. As shown in the figure, the images reconstructed by the image reconstruction method provided in the embodiments of the present disclosure are close to the reference image and exhibit good resolution and fidelity.
It should be noted that the above description of the flow is for illustration and description only and does not limit the scope of application of the present specification. Various modifications and alterations to the flow may occur to those skilled in the art, given the benefit of this description; such modifications and alterations nevertheless remain within the scope of the present specification.
The beneficial effects that may be brought by the embodiments of the present description include, but are not limited to:
(1) By adopting the first coincidence response model associated with the circumferential direction of the detector and the second coincidence response model associated with the axial direction of the detector, the pixel values of the image are mapped directly onto the LORs with different weights, without resampling the pixels within the ring plane so that they are parallel to the LORs and without shearing the pixels distributed along the axial direction; the precision error caused by resampling within the ring plane is thereby reduced, and the error caused by pixel shearing in the axial plane is eliminated.
(2) Decomposing the system coincidence response model into a first coincidence response model associated with the circumferential direction of the detector and a second coincidence response model associated with the axial direction of the detector greatly reduces the storage space required for storing the response model and shortens the projection calculation time.
(3) For a long-axis system, subsets may be further divided within an LOR group, i.e., different circumferential coincidence response functions are adopted for different θ intervals, so as to improve the model accuracy.
(4) Image rotation can make the pixel columns parallel or approximately parallel to the LORs, reducing the number of pixels to be calculated along the s direction and thereby accelerating the calculation.
(5) The deviation caused by the LORs not being parallel to the pixel columns can be corrected by shifting the circumferential coincidence response function along the s direction at different t positions; in addition, the LORs within the same angular data set (i.e., an LOR group) of a polygonal detector are not strictly parallel, and the relative shifts within the coincidence response functions can also correct the error introduced by this part of the detector geometry.
(6) During the image rotation, the pixels in the ring plane approximately perpendicular to the LORs (along the s direction) can be upsampled, which improves the sampling accuracy of the coincidence response model and, when reconstructing with large pixels, avoids an excessive resampling error being introduced by overly large pixels under the action of the rotation matrix.
(7) The pixels approximately parallel to the LORs (along the t direction) can be downsampled to reduce the sampling rate and thus the amount of calculation.
(8) The upsampling and downsampling can be completed simultaneously in one rotation operation, reducing the amount of calculation and the corresponding memory accesses.
(9) A simplified and compressed lookup table can be established by exploiting the symmetry of the detector; the existing coincidence response functions can be fitted parametrically to establish calculation formulas of the coincidence response functions for different LORs, and a lookup table of the fitting parameters of the coincidence response functions for different LORs can also be established, thereby saving the storage space required for storing the response model.
It should be noted that different embodiments may produce different beneficial effects; in different embodiments, any one or a combination of the above effects, or any other beneficial effect, may be obtained.
Having thus described the basic concept, it will be apparent to those skilled in the art that the foregoing detailed disclosure is to be considered merely illustrative and not restrictive of the embodiments herein. Various modifications, improvements, and adaptations to the embodiments described herein may occur to those skilled in the art, although not explicitly described here. Such modifications, improvements, and adaptations are suggested by this specification and thus fall within the spirit and scope of the exemplary embodiments of this specification.
Also, this specification uses specific words to describe embodiments of the specification. Reference to "one embodiment," "an embodiment," and/or "some embodiments" means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the specification. Therefore, it is emphasized and should be appreciated that two or more references to "an embodiment" or "one embodiment" or "an alternative embodiment" in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, particular features, structures, or characteristics of one or more embodiments of the specification may be combined as appropriate.
Moreover, those skilled in the art will appreciate that aspects of the embodiments of the present description may be illustrated and described in terms of several patentable species or situations, including any new and useful combination of processes, machines, manufacture, or materials, or any new and useful improvement thereof. Accordingly, aspects of embodiments of the present description may be carried out entirely by hardware, entirely by software (including firmware, resident software, micro-code, etc.), or by a combination of hardware and software. The above hardware or software may be referred to as a "data block," "module," "engine," "unit," "component," or "system." Furthermore, aspects of the embodiments of the present specification may be represented as a computer product, including computer readable program code, embodied in one or more computer readable media.
The computer storage medium may comprise a propagated data signal with the computer program code embodied therewith, for example, on baseband or as part of a carrier wave. The propagated signal may take any of a variety of forms, including electromagnetic, optical, etc., or any suitable combination. A computer storage medium may be any computer-readable medium that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code located on a computer storage medium may be propagated over any suitable medium, including radio, cable, fiber optic cable, RF, or the like, or any combination of the preceding.
Computer program code required for operation of various portions of the embodiments of the present description may be written in any one or more programming languages, including an object oriented programming language such as Java, Scala, Smalltalk, Eiffel, JADE, Emerald, C++, C#, VB.NET, Python, and the like; a conventional programming language such as C, Visual Basic, Fortran 2003, Perl, COBOL 2002, PHP, and ABAP; a dynamic programming language such as Python, Ruby, and Groovy; or other programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or processing device. In the latter scenario, the remote computer may be connected to the user's computer through any form of network, such as a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet), or in a cloud computing environment, or as a service, such as software as a service (SaaS).
In addition, unless explicitly stated in the claims, the order of processing elements and sequences, use of numbers and letters, or use of other names in the embodiments of the present specification are not intended to limit the order of the processes and methods in the embodiments of the present specification. While various presently contemplated embodiments of the invention have been discussed in the foregoing disclosure by way of example, it is to be understood that such detail is solely for that purpose and that the appended claims are not limited to the disclosed embodiments, but, on the contrary, are intended to cover all modifications and equivalent arrangements that are within the spirit and scope of the embodiments herein. For example, although the system components described above may be implemented by hardware devices, they may also be implemented by software-only solutions, such as installing the described system on an existing processing device or mobile device.
Similarly, it should be noted that in the preceding description of embodiments of the specification, various features are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of one or more embodiments of the invention. This method of disclosure, however, is not intended to imply that more features are required than are expressly recited in the claims. Indeed, the embodiments may be characterized as having less than all of the features of a single embodiment disclosed above.
For each patent, patent application publication, and other material, such as articles, books, specifications, publications, documents, etc., cited in this specification, the entire contents of each are hereby incorporated by reference into this specification, except for any application history document that is inconsistent with or in conflict with the content of this specification, and except for any document (whether present in or later appended to this application) that limits the broadest scope of the claims of this application. It is to be understood that if the descriptions, definitions, and/or use of terms in the materials accompanying this specification are inconsistent with or contrary to the content of this specification, the descriptions, definitions, and/or use of terms in this specification shall control.
Finally, it should be understood that the embodiments described herein are merely illustrative of the principles of the embodiments of the present disclosure. Other variations are possible within the scope of the embodiments of the present description. Thus, by way of example, and not limitation, alternative configurations of the embodiments of the specification can be considered consistent with the teachings of the specification. Accordingly, the embodiments of the present description are not limited to only those embodiments explicitly described and depicted herein.

Claims (19)

1. A projection method for PET image reconstruction, the PET detectors comprising a plurality of detector rings arranged along an axial direction, each detector ring comprising a plurality of detection units arranged along a ring direction, the method being performed by a computing device comprising a processor and a storage device, the method comprising:
acquiring actual projection data of an object, the actual projection data being acquired by detection units on a plurality of lines of coincidence response (LORs);
acquiring an initial three-dimensional image of the object;
acquiring a first coincidence response model and a second coincidence response model, wherein the first coincidence response model is annularly associated with the detector, and the second coincidence response model is axially associated with the detector;
determining predicted projection data corresponding to each LOR of the plurality of LORs based on the first and second coincident response models; and
and generating a target three-dimensional image of the object based on the predicted projection data and the actual projection data.
2. The method of claim 1, wherein determining the predicted projection data corresponding to each of the plurality of LORs based on the first and second coincident response models comprises:
dividing the plurality of LORs into a plurality of sets of LORs, wherein the circumferential angles of the LORs in the same set and the distances from projections in the circumferential plane to the center of the circumferential plane are the same;
for each of the plurality of sets of LORs,
determining intermediate predicted projection data corresponding to the set of LORs based on the first coincidence response model and the initial three-dimensional image, the intermediate predicted projection data corresponding to the set of LORs including response intensities of the set of LORs to the initial three-dimensional image at a plurality of circumferential planes at different positions in the axial direction of the detector; and
based on the second coincidence response model and the intermediate predicted projection data corresponding to the set of LORs, the predicted projection data corresponding to each LOR in the set of LORs is determined.
3. The method of claim 2, wherein the set of LOR's corresponding intermediate predicted projection data comprises a two-dimensional data set; a first dimension of the two-dimensional data set represents axial positions of the plurality of circumferential planes, and a second dimension of the two-dimensional data set represents different positions of the set of LORs in a second direction within the plurality of circumferential planes; each element in the two-dimensional data set represents a sum of response intensities of the set of LORs for a peripheral pixel on the initial three-dimensional image along a first direction in one of the plurality of annular planes at a location in the second direction in the one of the plurality of annular planes; wherein the second direction is perpendicular to the first direction.
4. The method of claim 3, wherein determining the predicted projection data for each LOR in the set of LORs based on the second fitting response model and the intermediate predicted projection data for the set of LORs comprises:
determining, by the second coincidence response model, a response intensity of each LOR in the set of LORs at each position in the second direction based on the intermediate predicted projection data corresponding to the set of LORs; and
and adding the response intensity of each LOR at each position in the second direction to obtain the predicted projection data corresponding to the LOR.
5. The method of claim 3, wherein determining intermediate predicted projection data corresponding to the set of LORs based on the first fitted response model and the initial three-dimensional image comprises:
rotating the initial three-dimensional image along the annular direction of the detector by a preset angle based on an image rotation model; and
determining the response intensity of the set of LORs in the annular planes at different axial positions of the detector to the three-dimensional image after rotation through the first coincidence response model so as to determine the intermediate prediction projection data corresponding to the set of LORs.
6. The method of claim 5, wherein the set of intermediate predictive projection data corresponding to the LOR includes a two-dimensional data set; a first dimension of the two-dimensional data set represents axial positions of the plurality of circumferential planes, and a second dimension of the two-dimensional data set represents different positions of the set of LORs in a second direction within the plurality of circumferential planes; each element in the two-dimensional data set represents a sum of response intensities of the set of LORs to peripheral pixels on the rotated three-dimensional image along a first direction in one of the plurality of annular planes at a location in the second direction in the annular plane; wherein the second direction is perpendicular to the first direction, and the first direction and the second direction are consistent with the arrangement of the rotated three-dimensional image on the ring plane.
7. The method of claim 5, wherein said determining, through the first coincidence response model, the response intensity of the set of LORs in the plurality of annular planes at different axial positions of the detector to the rotated three-dimensional image comprises:
upsampling the rotated three-dimensional image in the first direction and/or downsampling the rotated three-dimensional image in the second direction to obtain a resampled three-dimensional image; and
determining the response intensity of the set of LORs of the plurality of annular planes at different axial positions of the detector to the resampled three-dimensional image through the first coincidence response model.
8. The method of claim 3, wherein said first fitting response model includes a plurality of circumferential fitting response functions corresponding to different locations in said second direction for each of said plurality of LORs, said second fitting response model includes a plurality of axial response functions corresponding to different locations in said second direction for each of said plurality of LORs, an axial length of said detector is less than a threshold, and fitting response functions corresponding to the same locations in said second direction for each of the LORs in at least one of said plurality of sets of LORs are identical.
9. The method of claim 3, wherein said first coincidence response model includes a plurality of circumferential coincidence response functions corresponding to different locations in said second direction for each of said plurality of LORs, said axial length of said probe being greater than a threshold, each LOR in at least one of said plurality of sets of LORs having a different circumferential coincidence response function corresponding to the same location in said second direction.
10. The method of claim 9, wherein determining intermediate predicted projection data corresponding to the set of LORs based on the first fitted response model and the initial three-dimensional image comprises:
dividing the group of LORs into a plurality of subsets, wherein the LORs in the plurality of subsets correspond to different axial angle intervals, and the plurality of subsets respectively correspond to different circumferential coincidence response functions;
for each subset of the plurality of subsets, determining intermediate predicted projection data corresponding to each subset by a circular fit response function corresponding to the subset; and
intermediate predicted projection data corresponding to the set of LORs is determined based on the intermediate predicted projection data corresponding to each subset.
11. The method of claim 3, wherein the first conforming response model includes a plurality of circumferential response functions corresponding to the plurality of LORs, and the circumferential conforming response function is modified to obtain the first conforming response model; the correction includes:
and according to the difference between the pixel arrangement of the initial three-dimensional image and the annular angle of the LOR, correcting the width of the preset part of the waveform of the annular coincidence response function along the first direction and the response intensity of the annular coincidence response function.
12. The method of claim 3, wherein the detector ring of the detector comprises a ring structure of a regular polygon, the plurality of detection units are arranged on each side of the regular polygon, and N detection units are arranged on each side of the regular polygon at equal intervals; the memory stores the hoop coincidence response functions corresponding to the LORs of the N hoop angle groups, the LORs of each hoop angle in each hoop angle group correspond to one hoop coincidence response function at different positions on the LORs, and the first coincidence response model is determined based on the hoop coincidence response functions corresponding to the LORs of the N hoop angle groups.
13. The method of claim 1, wherein the memory stores a plurality of sets of circular conforming response functions in one-to-one correspondence with a plurality of pixel sizes, the first conforming response model being determined based on the plurality of sets of circular conforming response functions.
14. The method of claim 1, wherein
the first coincidence response model is obtained based on a first fitting function and the second coincidence response model is obtained based on a second fitting function,
the first fitting function represents a correspondence between first fitting parameters corresponding to different positions on the LOR and the circumferential coincidence response function, and
the second fitting function represents a correspondence between second fitting parameters corresponding to different positions on the LOR and the axial coincidence response function.
15. The method of claim 14, wherein the first fitting parameters and/or the second fitting parameters corresponding to different locations on each LOR are stored in the memory in the form of a look-up table, and obtaining the first and second fitted response models comprises:
acquiring a first fitting parameter and a second fitting parameter corresponding to different positions on each LOR from the memory;
determining, based on the first fitting parameters and using the first fitting function, a circumferential coincidence response function corresponding to different positions on each LOR; and
based on the second fitting parameters, determining an axial coincidence response function corresponding to different locations on each LOR using the second fitting function.
16. The method of claim 14, wherein the first fitting function or the second fitting function is obtained based on a combination of two half-gaussian functions having different widths, and the first fitting parameter or the second fitting parameter comprises a critical point of the two half-gaussian functions and a width parameter of the two half-gaussian functions.
17. A projection system for PET image reconstruction, the PET detectors including a plurality of detector rings arranged in an axial direction, each detector ring including a plurality of detection units arranged in a ring direction, the system comprising:
an actual projection data acquisition module for acquiring actual projection data of an object, the actual projection data being acquired by detection units on a plurality of coincidence response Lines (LORs);
an initial three-dimensional image acquisition module for acquiring an initial three-dimensional image of the object;
a coincidence response model obtaining module, configured to obtain a first coincidence response model and a second coincidence response model, where the first coincidence response model is circumferentially associated with the probe, and the second coincidence response model is axially associated with the probe;
a predicted projection data determination module for determining predicted projection data corresponding to each LOR of the plurality of LORs based on the first and second coincident response models; and
and the image updating module is used for generating a target three-dimensional image of the object based on the predicted projection data and the actual projection data.
18. A projection apparatus for PET image reconstruction comprising a processor and a storage device for storing instructions, characterized in that the processor, when executing the instructions, implements the method of any one of claims 1-16.
19. A computer readable storage medium storing instructions, wherein when the instructions in the storage medium are executed by a processor, the method according to any one of claims 1 to 16 is implemented.
CN202111182705.6A 2021-10-11 2021-10-11 Projection method and system for PET image reconstruction Active CN113902823B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111182705.6A CN113902823B (en) 2021-10-11 2021-10-11 Projection method and system for PET image reconstruction

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111182705.6A CN113902823B (en) 2021-10-11 2021-10-11 Projection method and system for PET image reconstruction

Publications (2)

Publication Number Publication Date
CN113902823A true CN113902823A (en) 2022-01-07
CN113902823B CN113902823B (en) 2024-07-09

Family

ID=79191441

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111182705.6A Active CN113902823B (en) 2021-10-11 2021-10-11 Projection method and system for PET image reconstruction

Country Status (1)

Country Link
CN (1) CN113902823B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117237451A (en) * 2023-09-15 2023-12-15 南京航空航天大学 Industrial part 6D pose estimation method based on contour reconstruction and geometric guidance

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107978002A (en) * 2016-10-25 2018-05-01 上海东软医疗科技有限公司 A kind of PET image reconstruction method and device
US20190287275A1 (en) * 2016-08-03 2019-09-19 Koninklijke Philips N.V. Time-of-flight (tof) pet image reconstruction using locally modified tof kernels
CN110269638A (en) * 2019-06-25 2019-09-24 上海联影医疗科技有限公司 Image rebuilding method, system, readable storage medium storing program for executing and equipment
CN110866959A (en) * 2019-11-12 2020-03-06 上海联影医疗科技有限公司 Image reconstruction method, system, device and storage medium

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20190287275A1 (en) * 2016-08-03 2019-09-19 Koninklijke Philips N.V. Time-of-flight (tof) pet image reconstruction using locally modified tof kernels
CN107978002A (en) * 2016-10-25 2018-05-01 上海东软医疗科技有限公司 A kind of PET image reconstruction method and device
CN110269638A (en) * 2019-06-25 2019-09-24 上海联影医疗科技有限公司 Image rebuilding method, system, readable storage medium storing program for executing and equipment
CN110866959A (en) * 2019-11-12 2020-03-06 上海联影医疗科技有限公司 Image reconstruction method, system, device and storage medium

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
张斌;王李栓;赵书俊;: "Implementation of a dynamic ray-tracing algorithm in PET tomographic reconstruction", 郑州大学学报(理学版), no. 03, 15 September 2012 (2012-09-15) *
郭金霞;曹孝卿;谢庆国;: "A preliminary study of the relationship between PET detector geometry and the system response matrix", 核电子学与探测技术, no. 01, 20 January 2011 (2011-01-20) *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117237451A (en) * 2023-09-15 2023-12-15 南京航空航天大学 Industrial part 6D pose estimation method based on contour reconstruction and geometric guidance
CN117237451B (en) * 2023-09-15 2024-04-02 南京航空航天大学 Industrial part 6D pose estimation method based on contour reconstruction and geometric guidance

Also Published As

Publication number Publication date
CN113902823B (en) 2024-07-09

Similar Documents

Publication Publication Date Title
CN106466188B (en) System and method for emission tomography quantification
US8000513B2 (en) System and method for 3D time of flight PET forward projection based on an exact axial inverse rebinning relation in fourier space
US11704846B2 (en) System and method for image reconstruction
US9958559B1 (en) Method and apparatus for automatic detection and correction of patient bed shift using intrinsic scintillation crystal radiations
EP3399346B1 (en) Normalization crystal efficiencies estimation for continuous motion bed acquisition
US8885906B2 (en) Alignment of positron emission tomographs by virtual tomographs
US8509504B2 (en) Point spread function radial component implementation in Joseph's forward projector
Zeng et al. A GPU-accelerated fully 3D OSEM image reconstruction for a high-resolution small animal PET scanner using dual-ended readout detectors
US7769217B2 (en) Fast iterative 3D PET image reconstruction using a set of 2D linogram transformations
US8755586B2 (en) Image reconstruction including shift-variant blur compensation
CA2957469A1 (en) Detector assemblies and methods for helical ct scanning
CN113902823B (en) Projection method and system for PET image reconstruction
EP3503810A1 (en) Method and system for calibrating an imaging system
US11961164B2 (en) Continuous bed motion acquisition with axially short phantom for PET imaging system setup and quality control
Strzelecki Image reconstruction and simulation of strip positron emission tomography scanner using computational accelerators
JP2011002306A (en) Iterative image reconstruction method for pet system
US20240144553A1 (en) Systems and methods for image reconstruction
US20230206516A1 (en) Scatter estimation for pet from image-based convolutional neural network
US20240265593A1 (en) Method and system for calibrating an imaging system
US20230237638A1 (en) Apparatus and methods for unsupervised image denoising using double over-parameterization
Son et al. Feasibility study of a concurrent image reconstruction algorithm for proton therapy with in-beam TOF-PET
Boukhal et al. Positron-based attenuation correction for Positron Emission Tomography data using MCNP6 code

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant