CN112998735B - System and method for reconstructing an image by a scanning device
- Publication number: CN112998735B (application CN202110216943.8A)
- Authority: CN (China)
- Prior art keywords: detector, time of flight, ray, response line
- Legal status: Active (the legal status is an assumption and is not a legal conclusion)
Classifications
- A61B6/037 — Emission tomography (under A61B6/00 Apparatus or devices for radiation diagnosis; A61B6/02 Arrangements for diagnosis sequentially in different planes; A61B6/03 Computed tomography [CT])
- A61B6/52 — Devices using data or image processing specially adapted for radiation diagnosis
- A61B6/582 — Calibration (under A61B6/58 Testing, adjusting or calibrating thereof)
- A61B6/583 — Calibration using calibration phantoms
- A61B6/585 — Calibration of detector units
Abstract
The invention provides a system and a method for reconstructing an image by a scanning device, comprising the following steps: placing a phantom in a preset region of a scanning field of view, and obtaining the position of the phantom relative to the center of the scanning field of view; defining a response line, and calculating the length of the intersection of the response line with the phantom; collecting coincidence events, and counting the number of coincidence events on the response line in order to calculate the mean time-of-flight difference of the response line; correcting the mean time-of-flight difference to obtain a corrected mean time-of-flight difference; establishing a calculation model; using the calculation model to obtain the time resolution of each detector; and using the time resolution of each detector to reconstruct the image. The system and method provided by the invention can improve the quality of the reconstructed image.
Description
Technical Field
The present invention relates to the field of medical imaging technologies, and in particular to a system and a method for reconstructing an image by a scanning device.
Background
The time-of-flight positron emission tomography (TOF-PET) scanner is an advanced functional imaging tool in nuclear medicine, and its application prospects are highly valued by researchers and manufacturers of nuclear-medicine imaging equipment. Its imaging principle is as follows: a tracer containing a radionuclide is injected into an organism before scanning; the tracer undergoes β+ decay in the organism and produces positrons; each positron annihilates when it meets an electron in the organism, producing a pair of gamma photons with opposite directions and equal energy; the detectors surrounding the organism detect these photons and store the information in the form of coincidence events; and, through a series of electronic responses, the signals are input to a computer, which generates an image reflecting the distribution of the tracer in the organism by means of a corresponding image reconstruction algorithm. TOF-PET has a time measurement function that can constrain the position and intensity of the radionuclide distribution within a coincidence time window: using the difference in the arrival times at the detectors of the two 511 keV gamma photons produced by positron annihilation, the probable position of the annihilation event on the line of response (LOR) is located from the speed of light. This can improve the imaging quality of a PET scanner, reduce the injected dose, and shorten the scanning time.
In the conventional technology, a fixed system time resolution is used in the image reconstruction algorithm of TOF-PET; that is, all coincidence events use the same time resolution to estimate the position where annihilation may have occurred, ignoring factors such as differences in scintillation crystal material or size, circuit design differences, system installation differences, and the clock characteristics of the components in the TOF-PET system. If the time resolutions of all detectors are assumed to be the same, the actual time-of-flight measurement accuracy of a coincidence event is overestimated or underestimated, which degrades the image reconstruction quality.
In the reconstruction algorithm of TOF-PET, the probability density of the possible annihilation positions is estimated from the time-of-flight difference of a coincidence event and is generally assumed to follow a normal distribution: the point on the line of response obtained from the time-of-flight difference is taken as the mean of the normal distribution, and the time resolution of the line of response (considered the same for all lines of response in the conventional technology) is taken as its full width at half maximum (i.e., 2.355 times the standard deviation). Unlike conventional (non-TOF) PET reconstruction algorithms, in which all positions along the line of response are considered equally likely, TOF-PET reconstruction can improve the imaging quality of the PET scanner, reduce drug usage, and shorten scan times. Obtaining an accurate time resolution for each line of response is therefore the basis for accurate TOF-PET reconstruction.
Furthermore, the time resolution of the system is regarded by practitioners and researchers as an important indicator for evaluating the performance of a TOF-PET system, for example in the NEMA (National Electrical Manufacturers Association) standard of the United States. In the conventional technology, the overall (system-level) time resolution of TOF-PET is usually obtained by experimental means. This has limitations; for example, a small number of detectors with significant time measurement errors cannot be evaluated accurately. In addition, the time measurement accuracy of different detectors differs to some extent, and evaluating the differences between detectors is also an important aspect of measuring the time measurement performance of the system.
Disclosure of Invention
In view of the above-mentioned drawbacks of the prior art, the present invention provides a system and a method for reconstructing an image by a scanning device, which calculate the time resolution of each detector and then reconstruct the image according to these time resolutions, thereby reducing the influence of detector time resolution on the reconstructed image and improving the quality of the reconstructed image.
To achieve the above and other objects, the present invention provides a method for reconstructing an image by a scanning device, comprising:
placing a phantom in a preset region of a scanning field of view, and obtaining the position of the phantom relative to the center of the scanning field of view, wherein the phantom is located inside a detector ring in the scanning field of view, the detector ring comprises a plurality of detectors at different positions, and the detectors have the same or different time resolutions;
defining a response line, and calculating the intersection length of the response line and the phantom, wherein a connecting line between a first detector and a second detector is defined as the response line, the response line passes through the phantom, and the time resolution of the response line is related to the time resolution of the first detector and the time resolution of the second detector;
collecting coincidence events, and counting the number of coincidence events on the response line in order to calculate the mean time-of-flight difference of the response line, wherein a coincidence event is defined when the phantom emits a first ray and a second ray in opposite directions and the first ray and the second ray are detected within a preset coincidence time window;
correcting the mean time-of-flight difference according to the distance between the first detector and the phantom and the distance between the second detector and the phantom to obtain a corrected mean time-of-flight difference;
establishing a calculation model according to the corrected mean time-of-flight difference, the number of coincidence events, the intersection length, and the time resolution of the response line;
using said computational model to obtain a temporal resolution for each of said detectors;
obtaining the temporal resolution of the lines of response using the temporal resolution of each of the detectors to reconstruct an image.
Further, the temporal resolution of any two of the detectors may be the same or different.
Further, the time resolution of the response line satisfies the following formula:

$$\sigma = \sqrt{\sigma_1^2 + \sigma_2^2}$$

where σ represents the time standard deviation of the line of response, and σ1 and σ2 represent the time standard deviation of the first detector and the time standard deviation of the second detector, respectively.
Further, the step of correcting the mean time-of-flight difference according to the distance between the first detector and the phantom and the distance between the second detector and the phantom to obtain a corrected mean time-of-flight difference includes:
correcting the time at which the first ray is detected by the first detector according to the distance between the first detector and the phantom, to obtain a corrected detection time of the first ray at the first detector;
correcting the time at which the second ray is detected by the second detector according to the distance between the second detector and the phantom, to obtain a corrected detection time of the second ray at the second detector;
calculating a corrected time-of-flight difference for each coincidence event;
and obtaining the corrected mean time-of-flight difference from the corrected time-of-flight differences and the number of coincidence events.
Further, the standard deviation of the time-of-flight differences is positively correlated with the intersection length and with the time resolution of the line of response, and the mean of the time-of-flight differences tends toward the expectation of the time-of-flight difference.
Further, the time at which the first ray is detected by the first detector is corrected according to the distance from the first detector to the phantom, giving the corrected detection time, defined as T1c:

$$T_{1c} = T_1 - d_{13}/c$$

where T1c represents the corrected time at which the first ray is detected by the first detector, T1 represents the time at which the first ray is detected by the first detector, d13 represents the distance from the first detector to the phantom, and c is the speed of light.
Further, the calculation model is

$$\bar{T}_c \sim \mathrm{Norm}\!\left(0,\ \frac{\sigma_1^2 + \sigma_2^2 + R^2/(3c^2)}{N}\right)$$

where ~ indicates that the corrected mean time-of-flight difference satisfies the normal distribution (denoted Norm), N represents the number of coincidence events, R represents the intersection length, σ1 and σ2 represent the time standard deviation of the first detector and the time standard deviation of the second detector, and c represents the speed of light.
Further, using the calculation model to obtain the time resolution of each detector is formulated as

$$\{\hat{\sigma}\} = \arg\min_{\{\sigma\}} \sum_i \left[\ln\!\left(\frac{\sigma_{1,i}^2 + \sigma_{2,i}^2 + R_i^2/(3c^2)}{N_i}\right) + \frac{N_i\,\bar{T}_{c,i}^{\,2}}{\sigma_{1,i}^2 + \sigma_{2,i}^2 + R_i^2/(3c^2)}\right]$$

where the corrected mean time-of-flight difference of response line i is denoted as above, Σi represents summation over all response lines, Ni represents the number of coincidence events, Ri represents the intersection length, σ1,i and σ2,i represent the time standard deviations of the first detector and the second detector of response line i, c represents the speed of light, and σ represents the time standard deviation of each detector;
Further, the step of obtaining the time resolution of the response lines using the time resolution of each detector to reconstruct an image comprises:
calculating the time resolution of the response line from the time resolution of the first detector and the time resolution of the second detector, the time resolution of the response line being expressed by the following formula:

$$\sigma = \sqrt{\sigma_1^2 + \sigma_2^2}$$

where σ represents the time standard deviation of the line of response, and σ1 and σ2 represent the time standard deviation of the first detector and the time standard deviation of the second detector, respectively;
estimating the position of the annihilation event according to the time-of-flight difference T of each coincidence event, the position satisfying the following probability distribution:

$$p(\vec{x}) \propto \exp\!\left(-\frac{\lVert \vec{x} - \vec{x}_0 \rVert^2}{2\,(c\sigma/2)^2}\right)$$

where x is a point on the line of response, p(x) denotes the probability density of annihilation occurring there, and x0 denotes the center of the possible annihilation positions obtained from the time-of-flight difference T, the first detector position r1 and the second detector position r2, calculated by the following formula:

$$\vec{x}_0 = \frac{\vec{r}_1 + \vec{r}_2}{2} + \frac{cT}{2}\cdot\frac{\vec{r}_2 - \vec{r}_1}{\lVert \vec{r}_2 - \vec{r}_1 \rVert}$$

where (r2 − r1)/‖r2 − r1‖ represents the unit vector pointing from the first detector position to the second detector position, ‖·‖ represents the vector length, and c represents the speed of light;
and obtaining the probability distribution of the position of the occurrence of the annihilation event according to the time-of-flight difference T of each coincidence event so as to reconstruct the image.
Further, the present invention also provides a system for reconstructing an image by a scanning device, comprising:
the phantom position acquisition unit, configured to acquire the position of the phantom relative to the center of a scanning field of view, wherein the phantom is located inside a detector ring in the scanning field of view, and the detector ring comprises a plurality of detectors at different positions;
the response line acquisition unit, configured to acquire a response line and calculate the intersection length of the response line and the phantom, wherein a connecting line between the first detector and the second detector is defined as the response line, the response line passes through the phantom, and the time resolution of the response line is related to the time resolution of the first detector and the time resolution of the second detector;
the acquisition unit, configured to collect coincidence events and count the number of coincidence events on the response line in order to calculate the mean time-of-flight difference of the response line, wherein a coincidence event is defined when the phantom emits a first ray and a second ray in opposite directions and the first ray and the second ray are detected within a preset coincidence time window;
the correction unit, configured to correct the mean time-of-flight difference according to the distance between the first detector and the phantom and the distance between the second detector and the phantom to obtain a corrected mean time-of-flight difference;
the model establishing unit, configured to establish a calculation model according to the corrected mean time-of-flight difference, the number of coincidence events, the intersection length and the time resolution of the response line;
a model processing unit for using the computational model to obtain a temporal resolution of each of the detectors;
a reconstruction unit for obtaining the temporal resolution of the lines of response according to the temporal resolution of each of the detectors to reconstruct an image.
In summary, the present invention provides a system and a method for reconstructing an image by a scanning device. A phantom is first placed in a preset region of the scanning field of view and the position of the phantom relative to the center of the scanning field of view is obtained; a response line and the coincidence events are then defined, the intersection length of the response line with the phantom is calculated, and the number of coincidence events on the response line is counted. In the invention, the intersection of the first ray with the first detector is defined as a first intersection point and the intersection of the first ray with the phantom is defined as a third intersection point, so that the time at which the first detector detects the first ray can be corrected according to the distance between the first intersection point and the third intersection point; the time at which the second detector detects the second ray can be corrected in the same way, so that the time-of-flight difference of the coincidence event can be corrected and the corrected mean time-of-flight difference obtained. A calculation model is then established from the time resolution of the response line, the number of coincidence events, the intersection length and the corrected mean time-of-flight difference; the calculation model is processed to obtain the time resolution of each detector, and the reconstruction unit then reconstructs the image from the time resolution of each detector. By measuring the time resolution of each detector, a first detector and/or a second detector whose time resolution is poor can be avoided, so that the quality of the reconstructed image, that is, the accuracy of the reconstructed TOF-PET image, can be improved.
Drawings
FIG. 1: structural diagram of a TOF-PET apparatus according to the invention.
FIG. 2: schematic cross-sectional view of a detector ring mounted on the gantry of FIG. 1.
FIG. 3: schematic longitudinal cross-sectional view of the detector ring of FIG. 2.
FIG. 4: diagram explaining the principle of the TOF-PET reconstruction method used by the reconstruction unit of FIG. 1.
FIG. 5: flowchart of the method for reconstructing an image by a scanning device according to the invention.
FIG. 6: simplified schematic diagram of the phantom and detector ring according to the invention.
FIG. 7: schematic diagram of obtaining the response lines according to the invention.
FIG. 8: schematic diagram of collecting coincidence events according to the invention.
FIG. 9: diagram of the system for reconstructing an image by a scanning device according to the invention.
FIG. 10: schematic diagram of an electronic device according to the invention.
FIG. 11: schematic diagram of a computer storage medium according to the invention.
Detailed Description
The embodiments of the present invention are described below by way of specific examples, and other advantages and effects of the present invention will be readily apparent to those skilled in the art from the disclosure herein. The invention may also be implemented or applied through other different specific embodiments, and the details in this specification may be modified or changed in various ways based on different viewpoints and applications, all without departing from the spirit and scope of the present invention.
It should be noted that the drawings provided in this embodiment only illustrate the basic idea of the invention in a schematic way; the drawings show only the components related to the invention and are not drawn according to the number, shape and size of the components in an actual implementation. The form, quantity and proportion of the components in an actual implementation may be changed freely, and the layout of the components may be more complicated.
As shown in fig. 1, the present embodiment proposes a positron emission tomography (PET) device, which may be a TOF (time-of-flight)-PET device. The TOF-PET apparatus 10 includes a control unit 11, a gantry 12, a signal processing unit 13, a coincidence counting unit 14, a storage unit 15, a reconstruction unit 16, a display unit 17, and an operation unit 18.
As shown in fig. 2-3, fig. 2 is a schematic cross-sectional view of a detector ring 100 mounted on the gantry 12, and fig. 3 is a cross-sectional view taken along line A-A of fig. 2. The gantry 12 has a plurality of detector rings 100 arranged along a circumferential central axis Z. Each detector ring 100 has a plurality of detectors 110 arranged on a circumference around the central axis Z. An image field of view (FOV) is formed in the opening of the detector ring 100. The bed plate 140 on which the phantom 200 is placed is inserted into the opening of the detector ring 100 so that the relevant portion of the phantom 200 enters the FOV. The phantom 200 is placed on the bed plate 140 so that its axis coincides with the central axis Z. A drug labeled with a radioisotope is injected into the phantom 200 for PET imaging. The detector 110 detects the pairs of annihilation gamma rays emitted from the inside of the phantom 200 and generates a pulse-like electric signal according to the amount of light detected.
As shown in FIG. 2, the detector 110 has a plurality of scintillators 120 and a plurality of photomultiplier tubes 130. The scintillator 120 receives gamma rays resulting from pair-wise annihilation by the radioisotopes within the phantom 200, producing scintillation light. Each scintillator is configured such that a long axis direction of each scintillator substantially coincides with a radial direction of the detector ring. The photomultiplier tube 130 is provided at one end portion of the scintillator 120 with respect to a radial direction orthogonal to the central axis Z. A Light Guide (not shown) may or may not be provided between the scintillator 120 and the photomultiplier tube 130. The plurality of scintillators 120 and the plurality of photomultiplier tubes 130 included in the detector ring 100 are arranged in concentric circles (concentric cylinders). Scintillation light generated in the scintillator 120 propagates within the scintillator 120 and toward the photomultiplier tube 130. The photomultiplier tube 130 generates a pulse-like electric signal according to the amount of scintillation light. The generated electric signal is supplied to the signal processing section 13 as shown in fig. 1.
As shown in fig. 1, the signal processing unit 13 generates a single photon event from the electric signal from the photomultiplier tube 130. Specifically, the signal processing unit 13 performs detection time measurement processing, position calculation processing, and energy calculation processing. In the detection timing measurement process, the signal processing section 13 measures the detection timing of the gamma ray by the detector 110. Specifically, the signal processing unit 13 monitors the peak value of the electric signal from the photomultiplier tube 130. Then, the signal processing unit 13 measures, as the detection time, a time at which the peak value of the electric signal exceeds a preset threshold value. That is, the signal processing unit 13 electrically detects annihilation γ -rays by detecting that the intensity of the electric signal exceeds a threshold value. In the position calculation process, the signal processing unit 13 calculates the incident position of annihilation gamma-rays based on the electric signals from the photomultiplier tube 130. The incidence position of the annihilation gamma-ray corresponds to the position coordinates of the scintillator 120 on which the annihilation gamma-ray is incident. In the energy calculation process, the signal processing unit 13 calculates an energy value of annihilation gamma-rays incident on the scintillator 120 based on the electric signal from the photomultiplier tube 130. Detection time instant data, position coordinate data and energy value data relating to the single photon events are associated together. The combination of the energy value data, the position coordinate data, and the detection time instant data relating to the single photon event is referred to as single photon event data. The single photon event data is generated in turn each time an annihilation gamma-ray is detected. The generated single photon event data is supplied to the coincidence counting section 14, and the time data of the single photon event data is corrected by the time shift amount of the corresponding detector.
As shown in fig. 1, the coincidence counting unit 14 performs coincidence counting processing on the single-photon event data. Specifically, the coincidence counting unit 14 repeatedly identifies, from the continuously supplied single-photon event data, event data concerning two single-photon events contained within a predetermined time range. The time range is set to, for example, about 1 ns to 15 ns. The pair of single-photon events is presumed to originate from a pair of annihilation gamma rays generated at the same pair-annihilation point. Paired single-photon events are collectively referred to as a coincidence event. The line connecting the pair of detectors 110 (more specifically, the scintillators 120) that detected the pair of annihilation gamma rays is called the line of response (LOR). The coincidence counting section 14 counts coincidence events for each LOR, and event data relating to the pair of events constituting the LOR (hereinafter referred to as coincidence event data) is stored in the storage unit 15.
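By way of illustration, a minimal Python sketch of such coincidence sorting is given below; the event fields, the energy gate and the 10 ns window are assumptions for the example rather than the exact processing of the coincidence counting unit 14.

```python
from dataclasses import dataclass

@dataclass
class SingleEvent:
    time_ps: float      # detection time of the single-photon event
    detector_id: int    # which detector (scintillator) fired
    energy_kev: float   # deposited energy

def sort_coincidences(events, window_ps=10_000.0, e_low=425.0, e_high=650.0):
    """Pair single-photon events whose detection times differ by less than the
    coincidence window; an energy gate keeps events near 511 keV."""
    kept = sorted((e for e in events if e_low <= e.energy_kev <= e_high),
                  key=lambda e: e.time_ps)
    pairs = []
    i = 0
    while i + 1 < len(kept):
        a, b = kept[i], kept[i + 1]
        if b.time_ps - a.time_ps <= window_ps and a.detector_id != b.detector_id:
            pairs.append((a, b))   # one coincidence event on the LOR (a, b)
            i += 2
        else:
            i += 1
    return pairs
```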
As shown in fig. 1, the reconstruction unit 16 reconstructs image data representing the spatial distribution of the concentration of the radioisotope in the subject from coincidence event data on a plurality of coincidence events. The reconstruction unit 16 executes a reconstruction method (hereinafter referred to as a TOF-PET reconstruction method) using a difference in detection times of a pair of annihilation gamma rays. In the TOF-PET reconstruction method, the probability of existence of paired annihilation points in each pixel on the LOR differs depending on the difference in detection time of a coincidence event.
Fig. 4 is a diagram for explaining the principle of the TOF-PET reconstruction method. As shown in fig. 4, when one of a pair of annihilation gamma rays is detected by the first detector 111 at detection time t1 and the other is detected by the second detector 112 at detection time t2, the difference between the distance from the annihilation position to the second detector 112 and the distance from the annihilation position to the first detector 111 can be calculated by the following equation:

$$d_2 - d_1 = c\,(t_2 - t_1) \qquad (1)$$
the reconstruction unit 16 calculates the position of the pair of annihilation points on the LOR for each coincidence event using equation (1). The position of the LOR is calculated by the reconstruction unit 16 from 2 detected positions of 2 events constituting the LOR. When the position of the pair annihilation point is calculated, the reconstruction unit 16 sets a weight corresponding to the existence probability of the pair annihilation point for each pixel on the LOR. The weight of the target pixel is set to become smaller as the distance from the pair of annihilation points becomes longer. The better the temporal resolution, the higher the accuracy of the calculated distribution of the pair annihilation points. Therefore, the better the temporal resolution, the higher the weight of the pixels of the calculated pair annihilation point relative to the other pixels. The reconstruction unit 16 reconstructs image data from the coincidence event data using the weights set in this manner. For example, the reconstruction unit 16 generates PET projection data representing the position and the count number of LORs from the coincidence event data. Then, the reconstruction unit 16 generates image data by TOF-PET reconstruction from the generated projection data. The reconstructed image data is supplied to the storage unit 15. In this way, TOF-PET reconstruction methods make use of the difference in detection times of coincidence events and can improve the signal-to-noise ratio compared to reconstruction methods that do not make use of the difference in detection times. That is, in the TOF-PET reconstruction method, the time resolution is an important parameter.
As shown in fig. 1, the display section 17 displays an image corresponding to the image data on the display device. As the display device, a CRT display, a liquid crystal display, an organic EL display, a plasma display, or the like can be suitably used. The operation unit 18 receives various commands and information inputs from an operator via an input device. As the input device, a keyboard, a mouse, various buttons, a touch panel, and the like can be suitably used.
As shown in fig. 3, gantry 12 is provided with a plurality of detector rings 100 arranged along the Z-axis. In fig. 3, 3 detector rings 100 are shown for illustration. Each detector ring 100 has a plurality of scintillator rings 121 arranged along the Z-axis. The scintillator ring 121 is constituted by a plurality of scintillators 120 arrayed in a substantially circumferential direction around the Z-axis. In fig. 3, 3 scintillator rings 121 are shown for each detector ring 100 for illustration. Hereinafter, the number of all scintillator rings 121 included in the plurality of detector rings arranged along the central axis Z is referred to as a column number. In fig. 3, the number of columns of the scintillator ring 121, that is, the number of columns of the scintillators 120 is 9. In addition, all of the scintillators 120 contained within the gantry 12 are referred to generally as a scintillator pack 360. The number of columns of the scintillator ring 121 (the number of columns of the scintillators 120), the number of detectors 110 in the detector ring 100, and the number of scintillators 120 in the detectors 110 are not limited to those shown in fig. 3.
In the detector 110, the plurality of scintillators 120 are arranged, for example, in a two-dimensional array. The scintillator 120 according to the present embodiment may be formed of any type of scintillator material. For example, the scintillator 120 may be made of NaI (sodium iodide), BGO (bismuth germanate), LSO (lutetium oxyorthosilicate, to which a certain amount of cerium may be added), LaBr3:Ce, LYSO (a mixed crystal of LSO and yttrium orthosilicate), or the like. Lutetium-based crystals are often used as the material of the scintillator 120. In addition to the above materials, the scintillator 120 may also be formed of, for example, a gallium-based crystal or a garnet-based crystal.
Before describing particular embodiments, specific terms or concepts related to embodiments of the present invention are explained herein:
Line of response: the line between the two crystal bars in which the two gamma photons are detected is called a line of response (LOR).
Coincidence event: a pair of coincidence events is considered to have occurred when two 511 keV gamma photons are detected within a predetermined coincidence time window (e.g., about 1-15 ns).
Coincidence time window: the time period set for the difference in the arrival times of the two gamma photons at the detectors.
Scatter coincidence: if one of the two gamma photons produced by annihilation radiation is scattered by tissue before arrival but the pair is still detected within the coincidence time window, the event is referred to as a scatter coincidence.
Random coincidence: a false coincidence, i.e., a coincidence event in which two gamma photons have no temporal or spatial correlation but are erroneously detected within the coincidence time window.
In some embodiments, a line-source phantom is used to estimate the time resolution of the system. For example, in the NEMA standard, the time resolution of the system is measured as the full width at half maximum of the errors between the measured time-of-flight differences of all coincidence events from a nearby line-source phantom and their theoretical values. In other words, the time resolution obtained in this way has only one value and ignores the variability of the time measurement performance of different detectors. Furthermore, such a single time resolution cannot accurately describe the time-of-flight measurement accuracy of the coincidence events on one response line, which in turn reduces the quality of the reconstructed image.
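By way of illustration, a minimal Python sketch of this conventional single-value estimate (the full width at half maximum of the histogram of time-of-flight-difference errors) is given below; the names, units and bin width are assumptions.

```python
import numpy as np

def system_fwhm_ps(tof_diff_ps, tof_diff_theory_ps, bin_ps=5.0):
    """Single system-level time resolution: FWHM of the histogram of the
    errors between measured and theoretical time-of-flight differences."""
    err = np.asarray(tof_diff_ps, float) - np.asarray(tof_diff_theory_ps, float)
    edges = np.arange(err.min(), err.max() + 2.0 * bin_ps, bin_ps)
    hist, edges = np.histogram(err, bins=edges)
    centers = 0.5 * (edges[:-1] + edges[1:])
    half = hist.max() / 2.0
    above = centers[hist >= half]          # bins at or above half maximum
    return above.max() - above.min() if above.size else 0.0
```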
The present embodiment provides a system and a method for reconstructing an image by a scanning device, which obtain the time resolution of each detector in a TOF-PET device, so that detectors with poor time resolution can be screened out and the accuracy of the reconstructed TOF-PET image improved.
As shown in fig. 5, the present embodiment provides a method for reconstructing an image by a scanning device, including:
S1: placing a phantom in a preset region of a scanning field of view, and obtaining the position of the phantom relative to the center of the scanning field of view, wherein the phantom is located inside a detector ring in the scanning field of view, and the detector ring comprises a plurality of detectors at different positions;
S2: defining a response line, and calculating the intersection length of the response line and the phantom, wherein a connecting line between a first detector and a second detector is defined as the response line, the response line passes through the phantom, and the time resolution of the response line is related to the time resolution of the first detector and the time resolution of the second detector;
S3: collecting coincidence events, and counting the number of coincidence events on the response line in order to calculate the mean time-of-flight difference of the response line, wherein a coincidence event is defined when the phantom emits a first ray and a second ray in opposite directions and the first ray and the second ray are detected within a preset coincidence time window;
S4: correcting the mean time-of-flight difference according to the distance between the first detector and the phantom and the distance between the second detector and the phantom to obtain a corrected mean time-of-flight difference;
S5: establishing a calculation model according to the corrected mean time-of-flight difference, the number of coincidence events, the intersection length and the time resolution of the response line;
S6: using the calculation model to obtain the time resolution of each detector;
S7: obtaining the time resolution of the response lines using the time resolution of each detector, and reconstructing an image.
As shown in fig. 2 and 6, before the scanning device 10 is used, the scanning device 10 is calibrated in step S1. For example, the phantom 200 is first placed in the detector ring 100; the phantom 200 may be any one of a line radiation source, a uniform barrel radiation source, and a uniform hollow-barrel radiation source. The phantom 200 is positioned within the scan field of view and parallel to the central axis of the cylindrical gantry. In this embodiment, the axial field-of-view length of the detector ring 100 is greater than the axial length of the phantom 200, so data can be acquired at axial positions in a continuous or discrete manner. Likewise, when the cross-section (or axial length) of the phantom 200 is significantly smaller than the cross-section (or axial length) of the detector ring 100, data may be collected at transverse positions in a continuous or discrete manner. In the present embodiment, the detector rings 100 are used to detect radiation such that each detector in each detector ring 100 collects no fewer than 10 coincidence events; when the cross-sectional size of a detector is enlarged, the number of coincidence events collected by that detector increases accordingly. As can be seen in fig. 6, the detector ring 100 includes a plurality of detectors at different positions. In addition, the time measurement capabilities (expressed as the time resolutions of the lines of response) of the detectors included in the detector ring 100 also differ.
As shown in fig. 6, in the present embodiment the phantom 200 is located at a predetermined position in the detector ring 100; the predetermined position may coincide with the center of the detector ring 100, or may be at a certain distance from it. After the phantom 200 is placed in the detector ring 100, the position of the phantom 200 relative to the detector ring 100, including the tilt angle of the phantom 200, may also be determined by image reconstruction or by external measurement. In this embodiment, the size of the phantom 200 and the position of the phantom 200 within the detector ring 100 may be used as parameters for subsequent modeling. It should be noted that in the present embodiment the phantom 200 is, for example, a barrel-shaped solid uniform radiation source whose radiation dose is uniformly distributed.
As shown in fig. 7, in step S2 the present embodiment defines a connecting line between the first detector 111 and the second detector 112 as a response line; for example, a first response line 101, a second response line 102 and a third response line 103 are shown in fig. 7. The first response line 101 does not intersect the phantom 200, while the second response line 102 and the third response line 103 intersect the phantom 200. That is, the second response line 102 and the third response line 103 can be obtained from true coincidence events, whereas coincidence events on the first response line 101 can only be random or scatter coincidences; therefore the second response line 102 and the third response line 103 are taken as objects of study and the first response line 101 is not, which improves the validity of the modeling. It should be noted that when a phantom with a certain volume, such as a barrel source, is used, this embodiment also rejects response lines that are tangent to the phantom 200 or whose intersection with it is very short (e.g., less than 2 cm in length). Since the position of the phantom 200 relative to the detector ring 100 has been obtained, the lengths of the intersections of the second response line 102 and the third response line 103 with the phantom 200 can be calculated.
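By way of illustration, a minimal Python sketch of such an intersection-length calculation is given below, modelling the barrel phantom as a cylinder aligned with the central axis Z; the names, units and the rejection threshold are assumptions, and the finite axial extent of the phantom is ignored.

```python
import numpy as np

def chord_length(p1, p2, center_xy, radius_mm):
    """Length of the intersection of the LOR p1-p2 with an infinitely long
    cylinder of the given radius aligned with the Z axis (0 if they miss)."""
    p1, p2 = np.asarray(p1, float), np.asarray(p2, float)
    d = p2 - p1
    d2, o2 = d[:2], p1[:2] - np.asarray(center_xy, float)   # transverse plane
    a = np.dot(d2, d2)
    b = 2.0 * np.dot(o2, d2)
    c = np.dot(o2, o2) - radius_mm ** 2
    disc = b * b - 4.0 * a * c
    if disc <= 0.0:
        return 0.0                           # LOR misses or only grazes the phantom
    t1 = (-b - disc ** 0.5) / (2.0 * a)
    t2 = (-b + disc ** 0.5) / (2.0 * a)
    return abs(t2 - t1) * np.linalg.norm(d)  # 3-D length of the chord

# Example: reject LORs that miss the phantom or intersect it over < 20 mm.
keep = chord_length([-400, -50, 0], [400, 60, 5], (0, 0), 100.0) >= 20.0
```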
As shown in fig. 8, in step S3 the present embodiment takes the second response line 102 as an example; that is, the number of coincidence events included on the second response line 102 is obtained. When a positron is emitted from a first radiation point 201 in the phantom 200, the positron meets a nearby electron and annihilates with it, producing a pair of oppositely directed gamma photons of equal energy that can be detected by the detector ring 100. In the present embodiment, the first radiation point 201 emits a pair consisting of a first ray L1 and a second ray L2 with opposite directions and the same energy. Within the preset coincidence time window, the first ray L1 is detected by the first detector 111 and the second ray L2 is detected by the second detector 112; for example, if the first ray L1 and the second ray L2 are detected within 10 ns, a pair of coincidence events is considered to have occurred. Similarly, the second radiation point 202 also emits a pair of oppositely directed rays L1 and L2 with the same energy, and when the first ray L1 is detected by the first detector 111 and the second ray L2 is detected by the second detector 112 within the preset coincidence time window, another coincidence event is defined. Since a plurality of radiation points may lie on the segment A3A4, the number of coincidence events included on a response line can be obtained and is defined as the number of coincidence events.
As shown in fig. 8, the present embodiment is described by taking the first ray L1 and the second ray L2 emitted from the first radiation point 201 as an example. When the first detector 111 detects the first ray L1 and the second detector 112 detects the second ray L2, the time at which the first ray L1 reaches the first detector 111 and the time at which the second ray L2 reaches the second detector 112 are obtained. In this embodiment, the time at which the first ray L1 reaches the first detector 111 is defined as the first time T1 and the time at which the second ray L2 reaches the second detector 112 is defined as the second time T2, so that the time-of-flight difference of the coincidence event can be obtained as the difference between the first time T1 and the second time T2. According to the above description, the time-of-flight difference of each coincidence event included on a response line can be obtained, and, together with the number of coincidence events, the mean time-of-flight difference of the response line. For example, if the number of coincidence events is N and the time-of-flight differences of the individual coincidence events are ΔT1, ΔT2, ΔT3, ..., ΔTN, then the mean time-of-flight difference of the response line is (ΔT1 + ΔT2 + ΔT3 + ... + ΔTN)/N. In this embodiment, the standard deviation of the time-of-flight differences is positively correlated with the intersection length and with the time resolution of the response line, and the mean of the time-of-flight differences tends toward the expectation of the time-of-flight difference.
As shown in fig. 8, in step S4 the first ray L1 intersects the first detector 111 at a first intersection point A1, and the first ray L1 intersects the boundary of the phantom 200 at a third intersection point A3. The second ray L2 intersects the second detector 112 at a second intersection point A2, and the second ray L2 intersects the boundary of the phantom 200 at a fourth intersection point A4. The mean time-of-flight difference can therefore be corrected based on the distance of the first detector 111 from the phantom 200 and the distance of the second detector 112 from the phantom 200, so that a corrected mean time-of-flight difference is obtained. The correction of the first time T1 according to the distance between the first detector 111 and the phantom 200 is described as an example. Since the position of the phantom 200 relative to the detector ring 100 has been obtained, the length of the segment A1A3 can be obtained, and the first time T1 can therefore be corrected according to the length of the segment A1A3.
In this embodiment, the correction formula for the first time T1 is:

$$T_{1c} = T_1 - d_{13}/c$$

where T1c represents the corrected time at which the first detector 111 detects the first ray L1 (the first time correction value), T1 represents the time at which the first ray L1 is detected by the first detector 111, d13 represents the distance from the first detector 111 to the phantom 200 (the length of the segment A1A3), and c is the speed of light.
Similarly, a corrected time T2c at which the second ray L2 is detected by the second detector 112 (the second time correction value) can be obtained, so that the corrected time-of-flight difference of the coincidence event, equal to the difference between the first time correction value and the second time correction value, is:

$$T_c = T_{1c} - T_{2c}$$

where Tc is the corrected time-of-flight difference of the coincidence event.
The corrected mean time-of-flight difference of the coincidence events on a response line can then be obtained from the number of coincidence events and the corrected time-of-flight difference of each coincidence event:

$$\bar{T}_c = \frac{1}{N}\sum_{j=1}^{N} T_{c,j}$$

where the left-hand side is the corrected mean time-of-flight difference, Tc,j is the corrected time-of-flight difference of the j-th coincidence event on the response line, N is the number of coincidence events, and Σ denotes summation of the corrected time-of-flight differences of the coincidence events on the response line.
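By way of illustration, a minimal Python sketch of this per-event correction and per-line mean is given below; the argument names and units are assumptions.

```python
C_MM_PER_PS = 0.299792458  # speed of light in mm/ps

def corrected_tof_difference(t1_ps, t2_ps, d13_mm, d24_mm):
    """Corrected time-of-flight difference of one coincidence event:
    T_c = (T_1 - d13/c) - (T_2 - d24/c)."""
    t1c = t1_ps - d13_mm / C_MM_PER_PS
    t2c = t2_ps - d24_mm / C_MM_PER_PS
    return t1c - t2c

def mean_corrected_tof(events):
    """Corrected mean time-of-flight difference of one response line, where
    each event is a tuple (t1_ps, t2_ps, d13_mm, d24_mm)."""
    tcs = [corrected_tof_difference(*e) for e in events]
    return sum(tcs) / len(tcs)
```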
It should be noted that the corrected time-of-flight difference is distributed as the convolution of a normal distribution with a uniform distribution: the standard deviation of the normal distribution is proportional to the time resolution of the response line, and the length of the uniform distribution is proportional to the intersection length.
As shown in fig. 8, in this embodiment the response line passes through the phantom 200, i.e., the response line intersects the phantom 200; for example, the segment connecting the third intersection point A3 and the fourth intersection point A4 is the intersection of the response line with the phantom 200. The intersection length may be used as a parameter for subsequent modeling. It should be noted that the intersection length is related to the position of the phantom 200, and the intersection length may also change when the phantom 200 is tilted. In this embodiment, the length of the segment A3A4 is the intersection length of the phantom 200 and the response line.
As shown in fig. 6, after the response lines have been determined, in steps S5-S6 the present embodiment takes the time standard deviation σ1 of the first detector 111, the time standard deviation σ2 of the second detector 112, the corrected mean time-of-flight difference and the intersection length R as modeling parameters, and establishes the calculation model:

$$\bar{T}_c \sim \mathrm{Norm}\!\left(0,\ \frac{\sigma_1^2 + \sigma_2^2 + R^2/(3c^2)}{N}\right)$$

where ~ indicates that the corrected mean time-of-flight difference satisfies the normal distribution (denoted Norm), N represents the number of coincidence events, R represents the intersection length, σ1 and σ2 represent the time standard deviation of the first detector and the time standard deviation of the second detector, and c represents the speed of light. It should be noted that in the present embodiment the time resolution of the response line is related to the time resolution of the first detector 111 and the time resolution of the second detector 112.
the temporal resolution of each of the detectors is then obtained from the computational model by solving an optimization problem
WhereinRepresenting said time of flightMean correction value of difference, Σ i Represents summing all the response lines, N represents the number of coincidence events, R represents the length of intersection, σ represents 1 And σ 2 Represents the standard deviation in time of said first detector and the standard deviation in time of said second detector, c represents the speed of light, and σ represents the standard deviation in time of each of said detectors. The solving process can be given by the following iterative process:
a) The time standard deviation of each detector is initialized to a common value derived from the preset system time standard deviation, for example

$$\sigma_1^{(0)} = \sigma_2^{(0)} = \cdots = \sigma_n^{(0)} = \sigma_{sys}/\sqrt{2}$$

where k = 0 is the iteration number, σ1(k), ..., σn(k) are the time standard deviations of the detectors to be calculated, n is the number of detectors, and σsys is the preset system time standard deviation.
b) For iteration k+1, the update can be performed in batches; one batching method is to group the detectors according to the PET system modules:
a. Update the time standard deviations of the detectors in module i: in the optimization expression, select the terms involving the time standard deviations of these detectors; for modules j smaller than i, use the results of the (k+1)-th iteration, and for modules j larger than i, use the results of the k-th iteration;
b. Take the derivative of the optimization expression with respect to the time standard deviation of each detector and take the time standard deviation for which the derivative is 0 as the result of the (k+1)-th iteration. The following two points should be noted:
i. The optimization expression is convex with respect to the time standard deviation of each detector only if a corresponding inequality is satisfied; to ensure that the optimization expression is convex with respect to the time standard deviation of each detector, it is sufficient to keep, when screening the response lines, only the response lines for which this inequality holds;
ii. The two detectors in each term of the optimization expression belong to different modules. In fact, when screening coincidence events, a coincidence event can be deemed invalid if the detectors at its two ends belong to the same module.
c. i = i + 1
d. Repeat a.–c. until the time standard deviations of the detectors in all modules have been updated.
c) k = k + 1
d) Repeat b)–c) until the change in each detector's time standard deviation is less than a threshold, or the maximum number of iterations is reached.
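A minimal Python sketch of this block-coordinate procedure is given below. It assumes the Gaussian model sketched earlier (per-event variance σ_a² + σ_b² + R²/(3c²), averaged over the N events of a response line); the function names, the data layout, and the use of SciPy's bounded scalar minimizer for each one-dimensional sub-problem are illustrative assumptions, not the embodiment's actual implementation:

import numpy as np
from scipy.optimize import minimize_scalar

C = 299_792_458.0  # speed of light in m/s

def lor_variance(sig_a, sig_b, R):
    """Per-event variance of the corrected time-of-flight difference on one response line,
    assuming annihilation positions uniform over the intersection segment of length R."""
    return sig_a ** 2 + sig_b ** 2 + R ** 2 / (3.0 * C ** 2)

def objective(sig, lors):
    """Negative log-likelihood up to constants. Each LOR is (a, b, N, R, dt_mean), where
    a and b are detector indices and dt_mean is the mean corrected time-of-flight difference."""
    total = 0.0
    for a, b, N, R, dt_mean in lors:
        v = lor_variance(sig[a], sig[b], R)
        total += np.log(v) + N * dt_mean ** 2 / v
    return total

def update_detector(d, sig, lors, upper=5e-9):
    """One-dimensional update of detector d's time standard deviation, all others held fixed."""
    sub = [lor for lor in lors if d in (lor[0], lor[1])]
    if not sub:
        return sig[d]

    def f(x):
        trial = sig.copy()
        trial[d] = x
        return objective(trial, sub)

    return minimize_scalar(f, bounds=(1e-12, upper), method="bounded").x

def calibrate(lors, modules, n_det, sigma_sys=0.3e-9, max_iter=20, tol=1e-12):
    """Block-coordinate descent over detectors grouped by module, as in steps a)-d)."""
    sig = np.full(n_det, sigma_sys, dtype=float)
    for _ in range(max_iter):                  # outer loop: k = k + 1
        prev = sig.copy()
        for module in modules:                 # one batch per PET system module
            for d in module:                   # update every detector in the module
                sig[d] = update_detector(d, sig, lors)
        if np.max(np.abs(sig - prev)) < tol:   # stop when the change falls below a threshold
            break
    return sig

The bounded scalar search is a simple stand-in for step b.; when the derivative equation admits a closed form, that root can be used directly instead.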
After obtaining the time standard deviation σ of each of the detectors, the time resolution of each of the detectors can be further obtained (for a Gaussian timing response, the resolution expressed as a full width at half maximum is 2·sqrt(2·ln 2)·σ ≈ 2.355·σ).
The time resolution of the response line is then calculated from the time resolution of the first detector and the time resolution of the second detector, and is expressed by the following formula:
wherein the left-hand side denotes the time standard deviation of the line of response, and σ₁ and σ₂ respectively denote the time standard deviation of the first detector 111 and the time standard deviation of the second detector 112, each obtained from the corresponding detector's time resolution as described above.
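Assuming independent Gaussian timing errors in the two detectors, which is consistent with the model above but stated here as an assumption rather than a quotation of the embodiment's formula, the relation would take the form

σ_LOR = sqrt(σ₁² + σ₂²),  FWHM_LOR = 2·sqrt(2·ln 2)·σ_LOR ≈ 2.355·σ_LOR

so that the response line's timing uncertainty combines the two detectors' standard deviations in quadrature.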
As shown in fig. 5 and fig. 6, in step S7, after the time standard deviation σ of each response line has been obtained, the image may be reconstructed by the following steps; the Maximum Likelihood Expectation Maximization (MLEM) method is taken as an example here, and the same per-line time resolutions may also be used in other TOF-PET reconstruction methods.
a) Initialize the image f^(0) as an all-ones vector, where f denotes the image, k = 0 denotes the iteration number, A(·, σ) denotes the forward-projection operation, and A^T(·, σ) denotes the back-projection operation. It should be noted that in the projection and back-projection operations of the present embodiment the time resolution of each line of response is different, which means that the standard deviation of the probability distribution (usually a normal distribution) used to estimate the position where annihilation may have occurred differs from one line of response to another and is related to the time resolution of the detectors at both ends of that line. A plausible form of the multiplicative update of step b) is sketched after this list.
c) k = k + 1
d) Repeat steps b)–c) until the change in the image is less than a threshold, or the maximum number of iterations is reached.
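A plausible form of the multiplicative update in step b), assuming the standard TOF-MLEM update with the per-line TOF kernels parameterized by σ (the embodiment only names MLEM, so this is an assumption), is

f^(k+1) = ( f^(k) / A^T(1, σ) ) · A^T( y / A(f^(k), σ), σ )

where y denotes the measured TOF coincidence data, 1 is an all-ones data vector, and the division and multiplication are element-wise.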
In addition, the time resolution of each detector can be used as an index for judging the time measurement accuracy of a TOF-PET system, which is specifically as follows:
b) Take the mean of the per-detector time standard deviations, together with a measure of their dispersion, as evaluation indices, where the mean is used to evaluate the time-measurement accuracy of the TOF-PET system and the dispersion is an index of time-measurement consistency; both should be as small as possible.
c) Identify outliers among the per-detector values, for example values lying outside a range of plus or minus three standard deviations around the mean. The time measurements of the detectors corresponding to such outliers may be anomalous.
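A small sketch of this check, assuming the per-detector time standard deviations have been collected in a NumPy array (the three-standard-deviation threshold follows the example above; everything else is illustrative):

import numpy as np

def timing_quality(sig):
    """Summary indices and 3-sigma outlier flags for per-detector time standard deviations."""
    sig = np.asarray(sig, dtype=float)
    mean, spread = sig.mean(), sig.std()
    outliers = np.flatnonzero(np.abs(sig - mean) > 3.0 * spread)
    return mean, spread, outliers

sig = np.concatenate([np.full(11, 0.21e-9), [1.0e-9]])   # one detector with much worse timing
mean, spread, bad = timing_quality(sig)
print(f"mean={mean:.2e} s, spread={spread:.2e} s, suspect detectors={bad}")   # flags index 11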
As shown in fig. 7 to 8, the present embodiment obtains the time resolution of the response line by calculating the time resolution of the first detector 111 and the time resolution of the second detector 112, and then reconstructs an image using the time resolution of the response line. Moreover, after the time resolutions of the first detector 111 and the second detector 112 have been calculated, different detectors may be selected, so that degradation of the reconstructed image caused by poor time resolution is mitigated, that is, the accuracy of the reconstructed image is improved.
As shown in fig. 9, the present embodiment further provides a system 300 for reconstructing an image by a scanning device. The system 300 includes a phantom position acquiring unit 301, a response line acquiring unit 302, an acquiring unit 303, a model establishing unit 304, a model processing unit 305, and a reconstructing unit 306. The time correction performed by the system 300 follows the method described above.
As shown in fig. 8 to 9, in the present embodiment the phantom position acquiring unit 301 is configured to acquire the position of the phantom 200 in the scanning device; specifically, it acquires the positional relationship of the phantom 200 within the scanning field of view, where the phantom 200 is located in the detector ring 100. The response line acquiring unit 302 is configured to acquire a response line and calculate the intersection length of the response line with the phantom 200, where the connection line between the first detector 111 and the second detector 112 is defined as the response line and the response line passes through the phantom 200; the temporal resolution of the line of response is related to the temporal resolution of the first detector 111 and the temporal resolution of the second detector 112. The acquisition unit 303 is configured to acquire coincidence events and count the number of coincidence events included in the response line so as to calculate the mean value of the time-of-flight differences of the response line, where a coincidence event is defined when the phantom emits a first ray and a second ray in opposite directions and the first ray and the second ray are detected within a preset time coincidence window.
As shown in fig. 7 to 9, the model establishing unit 304 establishes a calculation model according to the mean correction value of the time-of-flight difference, the number of coincidence events, the intersection length, and the time resolution of the response line; the model processing unit 305 uses the calculation model to obtain the temporal resolution of each of the detectors; and the reconstruction unit 306 obtains the temporal resolution of the response lines from the temporal resolution of each detector in order to reconstruct an image. The system for reconstructing an image by a scanning device provided by this embodiment can improve the accuracy of the reconstructed TOF-PET image.
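A minimal sketch of how these units could be chained in code, reusing the illustrative calibrate helper above (the unit boundaries, names, and signatures are assumptions for illustration only):

from dataclasses import dataclass

@dataclass
class ResponseLine:
    det_a: int        # index of the first detector
    det_b: int        # index of the second detector
    n_events: int     # number of coincidence events on this line
    length: float     # intersection length with the phantom, in meters
    dt_mean: float    # mean corrected time-of-flight difference, in seconds

def reconstruct_image(lines, modules, n_det, reconstructor):
    """Pipeline mirroring units 301-306: calibrate per-detector timing, derive per-line
    timing, then hand the per-line standard deviations to a TOF reconstruction routine."""
    lors = [(l.det_a, l.det_b, l.n_events, l.length, l.dt_mean) for l in lines]
    sig = calibrate(lors, modules, n_det)                                   # model units 304-305
    sigma_lor = [(sig[l.det_a] ** 2 + sig[l.det_b] ** 2) ** 0.5 for l in lines]
    return reconstructor(lines, sigma_lor)                                  # reconstruction unit 306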
As shown in fig. 10, the present embodiment further provides an electronic device, which includes a processor 50 and a memory 60, where the memory 60 stores program instructions, and the processor 50 executes the program instructions to implement the method for reconstructing an image by a scanning apparatus. The Processor 50 may be a general-purpose Processor, and includes a Central Processing Unit (CPU), a Network Processor (NP), and the like; but also a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC) or other programmable logic device, discrete gate or transistor logic device, discrete hardware component; the Memory 60 may include a Random Access Memory (RAM), and may also include a Non-Volatile Memory (Non-Volatile Memory), such as at least one disk Memory. The Memory 60 may also be an internal Memory of Random Access Memory (RAM) type, and the processor 50 and the Memory 60 may be integrated into one or more independent circuits or hardware, such as: application Specific Integrated Circuit (ASIC). It should be noted that the computer program in the memory 60 can be implemented in the form of software functional units and stored in a computer readable storage medium when the computer program is sold or used as a stand-alone product. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, an electronic device, or a network device) to perform all or part of the steps of the method according to the embodiments of the present invention.
As shown in fig. 11, this embodiment also proposes a computer-readable storage medium 701, where the computer-readable storage medium 701 stores computer instructions 70, and the computer instructions 70 are used for causing the computer to execute the above-mentioned method for implementing the image reconstruction by the scanning device. The computer-readable storage medium 701 may be an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system or a propagation medium. The computer-readable storage medium 701 may also include a semiconductor or solid state memory, magnetic tape, a removable computer diskette, a Random Access Memory (RAM), a read-only memory (ROM), a rigid magnetic disk and an optical disk. Optical disks may include compact disk-read only memory (CD-ROM), compact disk-read/write (CD-RW), and DVD.
In summary, the present invention provides a system and a method for reconstructing an image by a scanning device, in which a phantom is first placed in a predetermined region of a scanning field of view and the relative positional relationship between the phantom and the center of the scanning field of view is obtained; a response line and coincidence events are then defined, the intersection length between the response line and the phantom is calculated, and the number of coincidence events included in the response line is counted. The intersection of the first ray with the first detector is defined as a first intersection and the intersection of the first ray with the phantom as a third intersection, so that the time at which the first ray is detected by the first detector can be corrected according to the distance between the first intersection and the third intersection; the time at which the second ray is detected by the second detector is corrected in the same way, so that the time-of-flight difference of each coincidence event, and hence the mean correction value of the time-of-flight difference, can be obtained. A calculation model is then established according to the time resolution of the response line, the number of coincidence events, the intersection length, and the mean correction value of the time-of-flight difference; the calculation model is processed to obtain the time resolution of each detector, and an image is then reconstructed from the time resolution of each detector by the reconstruction unit. By measuring the time resolution of each detector, the first detector and/or the second detector can be avoided when its time resolution is poor, so that the quality of the reconstructed image, that is, the accuracy of the reconstructed TOF-PET image, can be improved.
Reference throughout this specification to "one embodiment", "an embodiment", or "a specific embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment, and not necessarily all embodiments, of the present invention. Thus, respective appearances of the phrases "in one embodiment", "in an embodiment", or "in a specific embodiment" in various places throughout this specification are not necessarily referring to the same embodiment. Furthermore, the particular features, structures, or characteristics of any specific embodiment of the present invention may be combined in any suitable manner with one or more other embodiments. It is to be understood that other variations and modifications of the embodiments of the invention described and illustrated herein are possible in light of the teachings herein and are to be considered as part of the spirit and scope of the present invention.
It will also be appreciated that one or more of the elements shown in the figures can be implemented in a more separated or more integrated manner, or even removed or rendered inoperable in certain circumstances, as may be useful in accordance with a particular application.
Additionally, any reference arrows in the drawings/figures should be considered only as exemplary, and not limiting, unless otherwise expressly specified. Further, as used herein, the term "or" is generally intended to mean "and/or" unless otherwise indicated. Combinations of components or steps will also be considered to have been described where terminology foresees that the ability to separate or combine them might otherwise be unclear.
As used in the description herein and throughout the claims that follow, "a", "an", and "the" include plural references unless otherwise indicated. Also, as used in the description herein and throughout the claims that follow, unless otherwise indicated, the meaning of "in" includes "in" and "on".
The above description of illustrated embodiments of the invention, including what is described in the abstract of the specification, is not intended to be exhaustive or to limit the invention to the precise forms disclosed herein. While specific embodiments of, and examples for, the invention are described herein for illustrative purposes only, various equivalent modifications are possible within the spirit and scope of the present invention, as those skilled in the relevant art will recognize and appreciate. As indicated, these modifications may be made to the present invention in light of the foregoing description of illustrated embodiments of the present invention and are to be included within the spirit and scope of the present invention.
The systems and methods have been described herein in general terms as the details aid in understanding the invention. Furthermore, various specific details have been given to provide a general understanding of the embodiments of the invention. One skilled in the relevant art will recognize, however, that an embodiment of the invention can be practiced without one or more of the specific details, or with other apparatus, systems, assemblies, methods, components, materials, parts, and/or the like. In other instances, well-known structures, materials, and/or operations are not specifically shown or described in detail to avoid obscuring aspects of embodiments of the invention.
Claims (9)
1. A method for reconstructing an image by a scanning device, comprising:
placing a phantom in a preset area of a scanning field of view, and obtaining a relative positional relationship between the phantom and the center of the scanning field of view, wherein the phantom is positioned in a detector ring in the scanning field of view, the detector ring comprises a plurality of detectors at different positions, and the phantom is a linear radioactive source or a uniform barrel-shaped radioactive source;
defining a response line, and calculating an intersection length of the response line and the phantom, wherein a connecting line between a first detector and a second detector is defined as the response line, the response line passes through the phantom, and the time resolution of the response line is related to the time resolution of the first detector and the time resolution of the second detector;
collecting coincidence events, and counting the number of the coincidence events included in the response line to calculate a mean value of the time-of-flight differences of the response line, wherein a coincidence event is defined when the phantom emits a first ray and a second ray in opposite directions and the first ray and the second ray are detected within a preset time coincidence window;
correcting the mean value of the time-of-flight differences according to the distance between the first detector and the phantom and the distance between the second detector and the phantom to obtain a mean correction value of the time-of-flight difference;
establishing a calculation model according to the mean correction value of the time-of-flight difference, the number of the coincidence events, the intersection length, and the time resolution of the response line;
using the computational model to obtain a temporal resolution of each of the detectors;
obtaining the temporal resolution of the lines of response using the temporal resolution of each of the detectors to reconstruct an image.
2. A method of reconstructing an image by a scanning device according to claim 1, wherein said temporal resolution of any two of said detectors is the same or different.
3. The method of reconstructing an image by a scanning device according to claim 1, wherein the temporal resolution of the response line satisfies the following formula:
4. The method of reconstructing an image by a scanning device according to claim 1, wherein the step of correcting the mean value of the time-of-flight differences according to the distance between the first detector and the phantom and the distance between the second detector and the phantom to obtain a mean correction value of the time-of-flight difference comprises:
correcting the time at which the first ray is detected by the first detector according to the distance between the first detector and the phantom, to obtain a time correction value of the first ray detected by the first detector;
correcting the time at which the second ray is detected by the second detector according to the distance between the second detector and the phantom, to obtain a time correction value of the second ray detected by the second detector;
calculating a corrected value of the time-of-flight difference for each of the coincidence events;
and obtaining the mean correction value of the time-of-flight difference according to the corrected values of the time-of-flight difference and the number of coincidence events.
5. The method of reconstructing an image by a scanning device according to claim 4, wherein the standard deviation of the time-of-flight differences is positively correlated with the intersection length and positively correlated with the temporal resolution of the response line, and the mean of the time-of-flight differences tends toward the expectation of the time-of-flight difference.
6. The method of reconstructing an image according to claim 4, wherein the time at which the first ray is detected by the first detector is corrected according to the distance from the first detector to the phantom to obtain a time correction value of the first ray detected by the first detector, the correction value being defined as T1c;
T1c = T1 - d13/c
wherein T1c represents the correction value of the time at which the first ray is detected by the first detector, T1 represents the time at which the first ray is detected by the first detector, d13 represents the distance between the first intersection of the first ray with the first detector and the third intersection of the first ray with the phantom, and c is the speed of light.
7. The method of claim 1, wherein the computational model is
wherein "~" indicates that the left-hand quantity satisfies the normal distribution Norm, Δt̄ represents the mean correction value of the time-of-flight difference, N represents the number of coincidence events, R represents the intersection length, σ₁ and σ₂ represent the time standard deviation of the first detector and the time standard deviation of the second detector, and c represents the speed of light.
8. The method of claim 7, wherein using the computational model to obtain the temporal resolution of each of the detectors comprises:
the time standard deviation of each of the detectors is obtained according to the following formula,
wherein Δt̄ represents the mean correction value of the time-of-flight difference, Σ_i represents summation over all the response lines, N represents the number of coincidence events, R represents the intersection length, σ₁ and σ₂ represent the time standard deviation of the first detector and the time standard deviation of the second detector, c represents the speed of light, and σ represents the time standard deviation of each of the detectors;
9. A system for reconstructing an image by a scanning device, comprising:
a phantom position acquisition unit, used for acquiring the relative positional relationship of the phantom with respect to the center of a scanning field of view, wherein the phantom is positioned in a detector ring in the scanning field of view, the detector ring comprises a plurality of detectors at different positions, and the phantom is a linear radioactive source or a uniform barrel-shaped radioactive source;
a response line acquisition unit, used for acquiring a response line and calculating an intersection length of the response line and the phantom, wherein a connecting line between the first detector and the second detector is defined as the response line, and the response line passes through the phantom; the temporal resolution of the response line is related to the temporal resolution of the first detector and the temporal resolution of the second detector;
an acquisition unit, used for acquiring coincidence events and counting the number of the coincidence events included in the response line so as to calculate a mean value of the time-of-flight differences of the response line, wherein a coincidence event is defined when the phantom emits a first ray and a second ray in opposite directions and the first ray and the second ray are detected within a preset time coincidence window;
a correction unit, used for correcting the mean value of the time-of-flight differences according to the distance between the first detector and the phantom and the distance between the second detector and the phantom to obtain a mean correction value of the time-of-flight difference;
a model establishing unit, used for establishing a calculation model according to the mean correction value of the time-of-flight difference, the number of the coincidence events, the intersection length, and the time resolution of the response line;
a model processing unit for using the computational model to obtain a temporal resolution of each of the detectors;
a reconstruction unit for obtaining the temporal resolution of the lines of response according to the temporal resolution of each of the detectors to reconstruct an image.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110216943.8A CN112998735B (en) | 2021-02-26 | 2021-02-26 | System and method for reconstructing image by scanning device |
Publications (2)
Publication Number | Publication Date |
---|---|
CN112998735A CN112998735A (en) | 2021-06-22 |
CN112998735B true CN112998735B (en) | 2022-09-02 |
Family
ID=76386333
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110216943.8A Active CN112998735B (en) | 2021-02-26 | 2021-02-26 | System and method for reconstructing image by scanning device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112998735B (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114594107B (en) * | 2022-05-09 | 2022-08-16 | 武汉精立电子技术有限公司 | Optimization method and application of scanning path and detection method of surface of semiconductor material |
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2006020678A (en) * | 2004-07-06 | 2006-01-26 | Hitachi Ltd | X-ray ct apparatus |
CN104183012A (en) * | 2013-10-31 | 2014-12-03 | 上海联影医疗科技有限公司 | PET (Polyethylene terephthalate) three-dimensional image reconstruction method and device |
CN106539591A (en) * | 2015-09-21 | 2017-03-29 | 上海联影医疗科技有限公司 | PET flight time state quality detection methods and PET scan device |
CN105496436A (en) * | 2015-11-28 | 2016-04-20 | 上海联影医疗科技有限公司 | Time correction method and device used for PET device |
CN107137101A (en) * | 2017-04-24 | 2017-09-08 | 沈阳东软医疗系统有限公司 | A kind of time calibrating method and device |
Non-Patent Citations (1)
Title |
---|
A METHOD FOR TIME CALIBRATION OF PET SYSTEMS USING FIXED β+ RADIOACTIVE SOURCE;K.Dulski,et al;《Acta Physica Polonica B》;20200131;第51卷(第1期);第195-200页 * |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN106539591B (en) | PET flight time state quality detection method and PET scanning device | |
US10215864B2 (en) | System and method to improve image quality of emission tomography when using advanced radionuclides | |
US11243312B2 (en) | Imaging detector system for gamma radiation using unidirectional and bidirectional Compton scattering processes | |
US20110309252A1 (en) | Nuclear medicine imaging apparatus, and nuclear medicine imaging method | |
US11510636B2 (en) | System and method for positron emission tomography | |
JP5845487B2 (en) | Method for absolute measurement of radioactivity of positron decay nuclides that emit gamma rays, method for determining the detection efficiency of a radiation detector assembly, and method for calibrating a radiation measurement apparatus | |
JP2005315887A (en) | Method and system for normalizing positron emitting tomography system | |
CN101842806A (en) | Dirty isotope pet reconstruction | |
US10245002B2 (en) | Isotope specific calibration of a dose calibrator for quantitative functional imaging | |
JP6125309B2 (en) | Random coincidence counting estimation method and random coincidence counting estimation apparatus | |
US7129497B2 (en) | Method and system for normalization of a positron emission tomography system | |
JP2012233881A (en) | Nuclear medicine imaging method, nuclear medicine imaging device, and storage medium | |
CN112998735B (en) | System and method for reconstructing image by scanning device | |
US11231508B2 (en) | Gamma camera dead time determination in real time using long lived radioisotopes | |
Guérin et al. | Realistic PET Monte Carlo simulation with pixelated block detectors, light sharing, random coincidences and dead-time modeling | |
EP2902806B1 (en) | Nuclear medicine diagnostic device and medical data processing device | |
CN112998737B (en) | Time offset correction system and method for scanning device | |
Moore | ModPET: Novel applications of scintillation cameras to preclinical PET | |
Surti et al. | PET instrumentation | |
CN112998736B (en) | Time correction system and time correction method of scanning device | |
Kijewski | Positron emission tomography (PET) and single-photon emission computed tomography (SPECT) physics | |
JP7001176B2 (en) | Data processing methods, programs, data processing equipment and positron emission tomographic imaging equipment | |
Turkington | PET imaging basics | |
US20220343566A1 (en) | Methods and systems for reconstructing a positron emission tomography image | |
Kijewski | Positron emission tomography and single-photon emission computed tomography physics |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||