CN106859686B - Imaging method and imaging system - Google Patents

Imaging method and imaging system

Info

Publication number
CN106859686B
CN106859686B (application number CN201710039748.6A)
Authority
CN
China
Prior art keywords
projection
code
projection position
encoding
coincidence
Prior art date
Legal status
Active
Application number
CN201710039748.6A
Other languages
Chinese (zh)
Other versions
CN106859686A (en)
Inventor
韩冬
马锐兵
Current Assignee
Neusoft Medical Systems Co Ltd
Original Assignee
Neusoft Medical Systems Co Ltd
Priority date
Filing date
Publication date
Application filed by Neusoft Medical Systems Co Ltd filed Critical Neusoft Medical Systems Co Ltd
Priority to CN201710039748.6A priority Critical patent/CN106859686B/en
Publication of CN106859686A publication Critical patent/CN106859686A/en
Application granted granted Critical
Publication of CN106859686B publication Critical patent/CN106859686B/en

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 6/00: Apparatus for radiation diagnosis, e.g. combined with radiation therapy equipment
    • A61B 6/52: Devices using data or image processing specially adapted for radiation diagnosis
    • A61B 6/02: Devices for diagnosis sequentially in different planes; Stereoscopic radiation diagnosis
    • A61B 6/03: Computerised tomographs
    • A61B 6/037: Emission tomography

Abstract

The present application provides an imaging method. The imaging method includes: encoding a plurality of projection positions in a projection space formed by a detector device according to a spatial sequence to obtain projection position codes; detecting a coincidence event; determining a projection position code corresponding to the detected coincidence event, and accumulating a coincidence event count corresponding to the projection position code; sorting the projection position codes and the coincidence event counts corresponding to the detected coincidence events; and reconstructing an image using the ordered projection position codes and the corresponding coincidence event counts. The application also discloses an imaging system.

Description

Imaging method and imaging system
Technical Field
The present application relates to medical imaging methods and systems, and more particularly to methods and systems for positron emission tomography.
Background
Positron Emission Tomography (PET) is a technique in which a radionuclide-labeled tracer is injected into a subject; the tracer is carried by the circulatory system and accumulates in tissues of the subject where metabolism is vigorous. Meanwhile, the nuclide in the tracer decays and releases positrons, which annihilate with nearby electrons, releasing gamma photon pairs that fly in opposite directions. A gamma photon pair is registered as a coincidence event when it is received by a pair of detector units. When enough gamma photon pairs have been received by the PET detector, a reconstruction algorithm can compute the distribution of the tracer in the subject, yielding the subject's metabolic distribution information. When the PET detector receives a gamma photon pair, information about the pair, such as position, energy, and time, needs to be stored in a particular data format. Existing storage formats take two forms: sinogram data and list-mode data.
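For illustration only (not from the patent), the two storage formats can be contrasted as in the following sketch; the toy dimensions and field choices are assumptions:

```python
import numpy as np

# Sinogram: a dense array indexed by projection position; most bins may be 0.
n_views, n_radial = 4, 8                # toy dimensions, assumed
sinogram = np.zeros((n_views, n_radial), dtype=np.uint32)
sinogram[1, 3] += 1                     # one coincidence at view 1, radial bin 3

# List-mode: one record per coincidence event, stored in arrival order.
list_mode = [(1, 3), (1, 3), (2, 5)]    # repeated positions are stored repeatedly
```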
Disclosure of Invention
In view of this, one aspect of the present application provides an imaging method. The imaging method includes: encoding a plurality of projection positions in a projection space formed by a detector device according to a spatial sequence to obtain projection position codes; detecting a coincidence event; determining a projection position code corresponding to the detected coincidence event, and accumulating a coincidence event count corresponding to the projection position code; sorting the projection position codes and the coincidence event counts corresponding to the detected coincidence events; and reconstructing an image using the ordered projection position codes and the corresponding coincidence event counts.
Another aspect of the present application provides an imaging system. The imaging system includes: a detection device forming a projection space and configured to detect gamma photons; an encoding unit configured to encode a plurality of projection positions in the projection space in a spatial order to obtain projection position codes; a coincidence processor configured to detect coincidence events; a projection data generating unit configured to determine the projection position code corresponding to a detected coincidence event and to accumulate the coincidence event count corresponding to the projection position code; a sorting unit configured to sort the projection position codes and the coincidence event counts corresponding to the detected coincidence events; and an image reconstruction unit configured to reconstruct an image using the sorted projection position codes and the corresponding coincidence event counts.
Drawings
FIG. 1 is a schematic diagram of a positron annihilation event;
FIG. 2 is a schematic view of one embodiment of a detection device of the PET system;
FIG. 3 is a flow chart illustrating one embodiment of an imaging method of the present application;
FIG. 4 is a schematic diagram illustrating one embodiment of projection direction encoding and cross-sectional in-plane projection encoding;
FIG. 5 is a schematic diagram illustrating one embodiment of axial encoding;
FIG. 6 is a schematic diagram illustrating one embodiment of ring difference encoding;
FIG. 7 is a diagram illustrating one embodiment of time interval encoding;
FIG. 8 is a graph of reconstruction time versus coincidence event count for list-mode data reconstruction, conventional sinogram data reconstruction, and the method of the present application;
FIG. 9 is a schematic view of an embodiment of an imaging system of the present application.
Detailed Description
Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings. Where the following description refers to the drawings, the same numbers in different drawings denote the same or similar elements unless otherwise indicated. The implementations described in the following exemplary embodiments do not represent all embodiments consistent with the present application; rather, they are merely examples of apparatus and methods consistent with certain aspects of the present application, as recited in the appended claims.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to limit the application. Unless defined otherwise, technical or scientific terms used herein have the ordinary meaning understood by one of ordinary skill in the art to which this application belongs. The terms "a" or "an" and the like in the description and claims of this application do not denote a limitation of quantity but rather the presence of at least one. The words "comprising" or "comprises" and the like mean that the elements or items preceding the word encompass the elements or items listed after the word and their equivalents, without excluding other elements or items. The terms "connected" or "coupled" and the like are not restricted to physical or mechanical connections and may include electrical connections, whether direct or indirect. As used in this specification and the appended claims, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. The term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items.
FIG. 1 is a schematic diagram of a positron annihilation event. A fluorodeoxyglucose (FDG) tracer labeled with a positron-emitting nuclide is injected into a subject, generally a human body. The tracer enters the subject, diffuses into various tissues with the blood, and participates in the subject's metabolic activity. In this process, the positron-emitting nuclide in the tracer releases a positron e+, which travels a short distance in the subject and annihilates with an electron e- in the surrounding tissue, generating a pair of gamma photons of equal energy (511 keV) propagating in opposite directions (about 180 degrees apart). The gamma photon pair can be detected by the detection device of the PET system to localize the annihilation and obtain the concentration distribution of the tracer in the subject.
FIG. 2 is a schematic diagram of one embodiment of a detection device 10 of a PET system. The detection device 10 includes a plurality of detector rings 12 arranged along an axis (the Z direction). Each detector ring 12 includes a plurality of detector modules 14 assembled together. The detector rings 12 enclose an interior space 16 (which may be referred to as a "projection space"); a gamma photon pair generated by a positron annihilation event within the interior space 16 is detected when its two photons are incident, from opposite directions, on a pair of detector modules 14.
Each detector module 14 includes a number of detector units (not shown), each comprising scintillation crystals and a photodetector. A scintillation crystal absorbs a gamma photon and produces a number of visible-light photons according to the energy of the gamma photon. The photodetector, typically a photomultiplier tube, converts the visible-light signal produced by the scintillation crystal into an electrical signal for output. The electrical signal may be used for coincidence determination, e.g., to determine whether two gamma photons struck two detector units within a preset coincidence time window. If they did, the two strikes together constitute a coincidence event. The event of one gamma photon striking one detector unit is a single event; a matched pair of single events is a coincidence event.
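As a concrete illustration of this pairing logic, the following sketch (not part of the patent; the window value, data layout, and function name are assumptions) pairs time-sorted single events that fall within a coincidence window:

```python
# A minimal sketch, assuming singles arrive as (timestamp_ns, detector_id)
# tuples sorted by time; the 4 ns window is illustrative, not from the patent.
COINCIDENCE_WINDOW_NS = 4.0

def find_coincidences(singles):
    """Pair consecutive single events whose time difference falls
    within the coincidence window."""
    coincidences = []
    i = 0
    while i + 1 < len(singles):
        (t1, d1), (t2, d2) = singles[i], singles[i + 1]
        if t2 - t1 <= COINCIDENCE_WINDOW_NS and d1 != d2:
            coincidences.append((d1, d2, t2 - t1))
            i += 2          # both singles are consumed by this coincidence
        else:
            i += 1          # the first single remains unpaired
    return coincidences

events = find_coincidences([(0.0, 7), (1.5, 120), (80.0, 33)])
# -> [(7, 120, 1.5)]; the single at t=80.0 has no partner in the window
```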
FIG. 3 is a flow chart illustrating one embodiment of an imaging method 30. The imaging method 30 may be used with a PET system and includes steps 31-35.
In step 31, a number of projection positions in the projection space formed by the detection device are encoded in a spatial order to obtain projection position codes.
As shown in FIG. 4, the line between an opposing pair of detector units 140 of the PET system is a projection line 41, and the region between the opposing pair of detector units 140 is a projection region 42. Only one projection line 41 is schematically indicated in FIG. 4, although many projection lines exist; one projection region 42 is indicated by shading. In one embodiment, the projection positions represent projection lines, which are encoded. In another embodiment, the projection positions represent projection regions, which are encoded.
In one embodiment, the projection positions may be encoded along the projection direction, the in-plane projection arrangement direction, and the axial direction. Accordingly, the projection position code includes a projection direction code, an in-plane (transverse) projection code, and an axial code. The "projection direction" is the direction of a projection line or projection region within the transverse plane and may also be called the "projection angle". The "transverse plane" is the plane of the annular region formed by a detector ring, i.e., the X-Y plane, and may also be called the "slice plane".
The encoding of the projection direction is explained with reference to FIG. 4. FIGS. 4(a), (b), and (c) illustrate different projection directions within the transverse plane. An arbitrary projection direction is taken as the starting direction and coded 0, so that all projection data in that direction carry projection direction code 0. The next projection direction, rotating counterclockwise (or clockwise), is coded 1, and all projection data in that direction carry code 1. Rotation continues until the last projection direction, which is coded j, where j is a positive integer; all projection data in that direction carry code j. This yields j+1 projection direction codes, 0 through j.
In the embodiment shown in FIG. 4, the projection direction in FIG. 4(a) is taken as the starting projection direction and coded 0. The projection direction in FIG. 4(b) is the next projection direction counterclockwise from that of FIG. 4(a) and is coded 1. By analogy, the projection direction in FIG. 4(c) is the last projection direction and is coded j. FIG. 4 shows the projection direction encoding of only one embodiment and is not limiting: in other embodiments, encoding may start from any other projection direction, or the projection directions may be encoded in clockwise order. FIG. 4 is only schematic; not all projection directions are shown.
In the embodiment of FIG. 4, each projection direction is encoded separately. In another embodiment, at least some adjacent projection directions may share the same code; that is, two adjacent projection directions are merged and encoded jointly. For example, the two adjacent projection directions of FIGS. 4(a) and 4(b) share code 0; the next two adjacent projection directions after that of FIG. 4(b) share code 1; and so on through the last two adjacent directions. This realizes down-sampling, reduces storage use, and speeds up the reconstruction algorithm.
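As a sketch of this direction encoding and its merged (down-sampled) variant, consider the following illustrative Python; the function name and merge factor are assumptions, not from the patent:

```python
# Map a projection direction index (0..j, counter-clockwise from an arbitrary
# start) to its code. merge=1 encodes each direction separately; merge=2 gives
# each pair of adjacent directions one shared code (down-sampling).
def direction_code(direction_index, merge=1):
    return direction_index // merge

codes = [direction_code(d, merge=2) for d in range(6)]
# -> [0, 0, 1, 1, 2, 2]: adjacent directions share a code, halving storage
```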
With continued reference to FIG. 4, in-plane projection encoding within the transverse plane is illustrated. Under a given projection direction, the transverse plane contains several groups of projection lines or projection regions arranged in parallel or nearly in parallel. Each group of projection lines comprises a plurality of projection lines arranged along the Z direction, and each group of projection regions comprises a plurality of projection regions arranged along the Z direction. As shown in FIG. 5, a group of projection regions 42 lies in a plane extending in the Z direction (i.e., a plane perpendicular to the transverse plane). The groups are arranged along a direction perpendicular to (or intersecting) the projection direction; this arrangement direction within the transverse plane is referred to as the "in-plane projection arrangement direction". For example, the projection direction in FIG. 4(a) is opposite the Y direction, and the in-plane projection arrangement direction is along or opposite the X direction.
The groups of projection lines or projection regions within the transverse plane are encoded to obtain the in-plane projection codes. The groups may be encoded from one side to the other along the in-plane projection arrangement direction. For example, in FIG. 4(a) the groups may be encoded from left to right as 0, 1, …, i, where i is a positive integer; they may equally be encoded from right to left.
Similarly, the in-plane projection groups (groups of projection lines or groups of projection regions) under the other projection directions are encoded, as shown, for example, in FIGS. 4(b) and (c).
Referring to FIG. 5, axial encoding is explained. Under the same projection direction and the same in-plane projection, the axial (Z-direction) positions are encoded. Encoding the projection lines or projection regions sequentially along (or against) the Z direction yields axial codes 0, 1, …, k, where k is a positive integer. FIG. 5 is only schematic; not all axial positions are illustrated.
The encoding of the projection lines or projection regions is thus complete, and the projection position code indicates the position of a projection line or projection region in the projection space. Since a projection line or projection region is determined by a pair of detector units 140, the projection position code is also a code for that pair of detector units.
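The mapping from a detector pair to a projection position code can be illustrated as follows. This is a geometry-based sketch under an assumed toy ring of 64 crystals, restricted to a same-ring (direct-plane) pair; a real scanner would use its exact integer crystal-index mapping instead:

```python
import math

N_CRYSTALS = 64        # crystals per detector ring (toy geometry, assumed)

def lor_codes(c1, c2, ring, n_views=32, n_radial=33):
    """Map a same-ring detector pair (crystal indices c1, c2 on ring `ring`)
    to the (direction, in-plane, axial) codes described above."""
    a1 = 2 * math.pi * c1 / N_CRYSTALS
    a2 = 2 * math.pi * c2 / N_CRYSTALS
    x1, y1 = math.cos(a1), math.sin(a1)
    x2, y2 = math.cos(a2), math.sin(a2)
    phi = math.atan2(y2 - y1, x2 - x1) % math.pi    # line angle in [0, pi)
    s = x1 * math.sin(phi) - y1 * math.cos(phi)     # signed radial offset
    direction = min(int(phi / math.pi * n_views), n_views - 1)
    radial = min(int((s + 1.0) / 2.0 * n_radial), n_radial - 1)
    axial = ring                                     # direct plane: ring index
    return direction, radial, axial

print(lor_codes(0, 32, ring=5))   # an LOR through the center of the ring
```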
The above is one embodiment of projection position coding. In another embodiment, the ring difference is also encoded, as explained with reference to FIG. 6. The ring difference is encoded under the same projection direction and the same in-plane projection. The ring difference of FIG. 6(a) is coded 0, so all projection data with that ring difference carry ring-difference code 0. FIGS. 6(b) and (c) show the pair of ring differences of magnitude 1, coded 1 and 2, respectively. And so on, until the pair of largest ring differences is coded l-1 and l, as shown in FIGS. 6(d) and (e).
FIG. 6 shows only one embodiment of ring-difference encoding; in other embodiments the ring differences may be encoded in any other order. For example, the ring differences may be encoded in the order from FIG. 6(e) to FIG. 6(a), or the ring differences in one direction may be encoded sequentially followed by those in the other direction. The encoding order is not limited to the above.
In the embodiment of FIG. 6, adjacent ring differences are encoded sequentially. In another embodiment, at least some adjacent ring differences share the same code; that is, adjacent ring differences may be merged into one code. For example, the pair of ring differences in FIGS. 6(b) and (c) may share one code, and the pair in FIGS. 6(d) and (e) may share another. Compared with sequential encoding of adjacent ring differences, image reconstruction based on merged codes computes faster, at a slight cost in image quality. In practice, the required image quality and reconstruction speed can be weighed to choose the ring-difference encoding. Note that FIG. 6 is only schematic; not all ring differences are illustrated.
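A sketch of the ring-difference encoding of FIG. 6, with the merged variant as an option; the function name and layout are assumptions:

```python
# Code 0 for ring difference 0, then the +1/-1 pair as 1 and 2, the +2/-2
# pair as 3 and 4, and so on. With merge_signs=True, +d and -d share a code.
def ring_difference_code(r1, r2, merge_signs=False):
    d = r1 - r2
    if merge_signs:
        return abs(d)                     # +d and -d merged into one code
    if d == 0:
        return 0
    return 2 * abs(d) - 1 if d > 0 else 2 * abs(d)

assert [ring_difference_code(r, 0) for r in (0, 1, 2)] == [0, 1, 3]
assert ring_difference_code(0, 1) == 2   # the -1 member of the magnitude-1 pair
```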
In this embodiment, the axial position is encoded after the ring-difference encoding is complete. Similar to the axial encoding of the embodiment of FIG. 5, in the embodiment of FIG. 6 the axial position is encoded under the same projection direction, the same in-plane projection, and the same ring difference, yielding the axial code.
In yet another embodiment, for systems using time-of-flight (TOF) techniques, the time interval (time-bin) is also encoded. In one embodiment, the projection-space positions corresponding to the time-bins are encoded: the emission position of a gamma photon pair along the projection line can be determined from the difference in arrival times of the two photons at the pair of detector units, so each time-bin corresponds to a position in projection space. Referring to FIG. 7, under the same projection direction, the same in-plane projection, and the same ring difference, the region along the projection line is divided into a plurality of projection-space intervals corresponding to time-bins, and these intervals are encoded. Encoding may proceed from the position nearest one detector unit toward the opposite side, giving codes 0, 1, …, m, where m is a positive integer. Time-bins at different ring differences are encoded in the same way. In another embodiment, the time intervals are divided and encoded directly in time.
The time-bin division may be chosen according to the application. If the number of time-bins is large, i.e., the time interval is small, the reconstructed image quality is high but more computation time is required, slowing the reconstruction. If the number of time-bins is small, i.e., the time interval is large, computation is faster but the reconstructed image quality degrades. In practice, time-bins can be divided by balancing image quality against computation speed.
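For illustration, a time-bin code might be derived from the arrival-time difference as in the following sketch; the field-of-view size, bin count, and function name are assumed values, not from the patent:

```python
C_MM_PER_NS = 299.792458          # speed of light in mm/ns

def time_bin(dt_ns, fov_mm=600.0, n_bins=9):
    """Map the arrival-time difference dt (t1 - t2, in ns) of a coincidence
    to a time-bin code 0..n_bins-1 along the projection line."""
    # dt corresponds to a displacement of dt * c / 2 from the line's midpoint
    offset_mm = dt_ns * C_MM_PER_NS / 2.0
    frac = (offset_mm + fov_mm / 2.0) / fov_mm   # 0 at one end, 1 at the other
    return max(0, min(n_bins - 1, int(frac * n_bins)))

print(time_bin(0.0))   # -> 4, the central bin for a mid-line annihilation
```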
In some embodiments, the encoding may be performed in an order different from the embodiments above. For example, the ring difference may be encoded after the axial encoding is complete, or the time-bin may be encoded before the axial position. In some embodiments, some projection positions in the projection space are not encoded. For example, certain detector unit pairs in the detector rings are determined never to receive coincidence events, and the corresponding projection positions need not be encoded; likewise, projection positions whose coincidence events need not be considered for reconstruction may be left unencoded.
In step 32, a coincidence event is detected.
Single events are detected and collected by the detection device, and a pair of single events is identified among them as a coincidence event; that is, a coincidence event is detected. In embodiments using TOF techniques, the time difference between the pair of single events of a coincidence event is also detected.
In step 33, the projection position code corresponding to the detected coincidence event is determined, and the coincidence event count corresponding to the projection position code is accumulated.
The projection position code corresponding to a detected coincidence event is determined, according to the encoding, from the positions of the pair of detector units that detected the event. Since a projection line or projection region is defined by a pair of detector units, that pair corresponds to a projection position code; obtaining the positions of the pair of detector units therefore determines the projection position code. In embodiments using TOF techniques, the time-bin code can be determined from the detected time difference between the pair of single events.
When the projection position code corresponding to a coincidence event is obtained, the coincidence event count for that projection position is incremented by 1. The initial count for each projection position code may be set to 0, and counts accumulate in increments of 1. This continues until all coincidence events have been processed, yielding the projection data: projection position codes and their corresponding coincidence event counts. The projection data reflect where gamma photon pairs were generated and how many were generated at each location.
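Step 33 amounts to sparse accumulation, which the following minimal sketch illustrates; the tuple layout of the position code is an assumption:

```python
from collections import defaultdict

counts = defaultdict(int)            # projection position code -> count

def record_coincidence(position_code):
    counts[position_code] += 1       # implicit initial value of 0

for code in [(3, 10, 2), (3, 10, 2), (7, 1, 0)]:   # (direction, in-plane, axial)
    record_coincidence(code)
# counts == {(3, 10, 2): 2, (7, 1, 0): 1}; zero-count positions are absent
```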
In step 34, the projection position codes and coincidence event counts corresponding to the detected coincidence events are sorted.
The projection data whose coincidence event counts are not 0 are sorted: they are ordered by projection position code, and the coincidence event counts corresponding to the codes are stored in that order. The ordering may follow the spatial order in which the projection positions were encoded.
In one embodiment, the projection direction codes of the projection data are compared, and the projection data are arranged in ascending or descending order of projection direction code. Under the same projection direction code, the in-plane projection codes are compared, and the projection data are arranged in ascending or descending order of in-plane projection code. Under the same projection direction code and the same in-plane projection code, the axial codes are compared, and the projection data are arranged in ascending or descending order of axial code. The projection position codes and corresponding coincidence event counts are thus ordered.
In another embodiment, after the projection data are sorted by projection direction code and in-plane projection code as above, the ring-difference codes are compared under the same projection direction code and in-plane projection code, and the projection data are arranged in ascending or descending order of ring-difference code. Under the same projection direction code, in-plane projection code, and ring-difference code, the axial codes are compared and the projection data arranged in ascending or descending order of axial code.
In yet another embodiment, under the same projection direction code, in-plane projection code, ring-difference code, and axial code, the time-bin codes are compared and the projection data arranged in ascending or descending order of time-bin code.
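Since the composite code is compared field by field, the whole ordering is a lexicographic sort, as in this sketch; the code tuples and counts are illustrative:

```python
# Projection data as (code, count) pairs, with code =
# (direction, in-plane, ring-difference, axial, time-bin).
# Python tuples compare lexicographically, which is exactly this ordering.
projection_data = [
    ((1, 0, 2, 3, 4), 7),
    ((0, 5, 0, 1, 2), 3),
    ((1, 0, 1, 0, 0), 9),
]
projection_data.sort(key=lambda item: item[0])
# -> codes in order (0,5,0,1,2), (1,0,1,0,0), (1,0,2,3,4)
```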
The sorting step may be performed after all projection data are acquired; that is, after all coincidence events have been detected and all projection data obtained in steps 32 and 33, the projection data are sorted as a whole. Alternatively, sorting may proceed during steps 32 (detecting coincidence events) and 33 (determining projection position codes), i.e., during acquisition: each time a coincidence event is received, the resulting projection data are sorted into place. If the projection position code is not yet in the sorted sequence, the code and its coincidence event count are inserted into the sequence; if the code already exists in the sequence, its coincidence event count is accumulated, i.e., incremented by 1. Sorting the projection data speeds up the computation of the subsequent image reconstruction.
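The online variant described above can be sketched with a binary-search insertion; the parallel-array layout and function name are assumptions:

```python
import bisect

codes, counts_list = [], []          # parallel arrays kept in sorted-code order

def insert_or_accumulate(code):
    i = bisect.bisect_left(codes, code)
    if i < len(codes) and codes[i] == code:
        counts_list[i] += 1          # code already in the sorted sequence
    else:
        codes.insert(i, code)        # sort the new code into place
        counts_list.insert(i, 1)

for c in [(1, 2), (0, 9), (1, 2)]:
    insert_or_accumulate(c)
# codes == [(0, 9), (1, 2)], counts_list == [1, 2]
```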
In step 35, an image is reconstructed using the sorted projection position codes and the corresponding coincidence event counts.
The image is reconstructed from the projection data by an iterative reconstruction algorithm. Iterative computation is performed only on projection data whose coincidence event count is not 0; data with count 0 are skipped. This realizes sparse encoding and storage of the sinogram data, and, combined with an iterative reconstruction algorithm, enables fast computation. During reconstruction, the iterative calculations follow the order in which the projection data were arranged in step 34. Because coincidence event counts under the same projection position code have been accumulated, coincidence data at the same position are not computed repeatedly, as happens in list-mode data reconstruction. The reconstruction of method 30 is therefore faster than reconstruction based on list-mode data.
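As an illustration of iterating over the sorted nonzero bins only, the following heavily simplified MLEM-style sketch assumes a toy system matrix supplied as per-bin rows; a real implementation would use the scanner's projector and a sensitivity image computed over all lines of response, not only the measured ones:

```python
import numpy as np

def mlem_sparse(sorted_bins, system_rows, n_voxels, n_iters=10):
    """sorted_bins: list of (position_code, count), count > 0, in code order.
    system_rows: dict mapping position_code -> (voxel_index_array, weight_array),
    with unique voxel indices per row (assumed toy system matrix)."""
    image = np.ones(n_voxels)
    sens = np.zeros(n_voxels)
    for code, _ in sorted_bins:                    # simplified sensitivity image
        vox, w = system_rows[code]
        sens[vox] += w
    for _ in range(n_iters):
        update = np.zeros(n_voxels)
        for code, count in sorted_bins:            # nonzero bins only
            vox, w = system_rows[code]
            forward = float(np.dot(w, image[vox]))  # forward projection
            if forward > 0:
                update[vox] += w * (count / forward)
        image *= update / np.maximum(sens, 1e-12)   # multiplicative MLEM update
    return image
```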
FIG. 8 is a graph of reconstruction time versus coincidence event count for list-mode data reconstruction, conventional sinogram data reconstruction, and the method 30 of the present application. Conventional sinogram reconstruction must traverse all projection data, including bins with a coincidence event count of 0, in a fixed spatial order to complete the iterative computation. As FIG. 8 shows, the reconstruction time of the present application is significantly less than that of both list-mode data reconstruction and conventional sinogram data reconstruction.
The acts of the imaging method 30 are illustrated in block form, and the sequencing of blocks and the division of acts among blocks shown in FIG. 3 are not limited to the illustrated embodiment. For example, the blocks may be performed in a different order; acts in one block may be combined with acts in another block or split into multiple blocks. In some embodiments, there may be additional steps before, after, or between the steps of the imaging method 30.
In correspondence with the foregoing embodiments of the imaging method 30, the present application also provides embodiments of an imaging system. FIG. 9 is a schematic block diagram illustrating an imaging system 90 of an embodiment. The imaging system 90 comprises a detection apparatus 10, an encoding unit 91, a coincidence processor 92, a projection data generation unit 93, a sorting unit 94 and an image reconstruction unit 95.
The detection device 10 forms a projection space 16 and detects gamma photons: it detects the gamma photons to generate optical signals and converts the optical signals into electrical signals for output.
The encoding unit 91 is configured to encode a number of projection positions in the projection space 16 in a spatial order to obtain a projection position code. The encoding unit 91 may be used to perform step 31 of the imaging method 30.
In one embodiment, the encoding unit 91 is configured to encode the projection position in the projection direction, the transverse in-plane projection arrangement direction, and the axial direction to obtain a projection position code including a projection direction code, a transverse in-plane projection code, and an axial code.
In another embodiment, the encoding unit 91 is configured to encode the ring difference, so that the projection position code includes a ring-difference code. The encoding unit 91 may encode the ring difference after encoding the projection positions in the projection direction and the in-plane projection arrangement direction, and then encode the axial position; the resulting projection position code includes a projection direction code, an in-plane projection code, a ring-difference code, and an axial code. However, the encoding unit 91 may also encode in other orders.
In a further embodiment, the encoding unit 91 is configured to encode the time interval, so that the projection position code includes a time-bin code. The encoding unit 91 may encode the time interval in addition to encoding the projection positions in the projection direction, the in-plane projection arrangement direction, and the axial direction, giving a projection position code that includes a projection direction code, an in-plane projection code, an axial code, and a time-bin code. The encoding unit 91 may also encode the time interval in addition to the projection direction, the in-plane projection arrangement direction, the ring difference, and the axial direction, giving a projection position code that includes a projection direction code, an in-plane projection code, a ring-difference code, an axial code, and a time-bin code.
Coincidence processor 92 is used to detect coincidence events. Coincidence processor 92 receives the electrical signal output by detection device 10 and determines a coincidence event based on the electrical signal.
The projection data generating unit 93 is configured to determine a projection position code corresponding to the detected coincidence event, and accumulate a coincidence event count corresponding to the projection position code, so as to generate projection data. The projection data includes a projection position code and a corresponding coincidence event count.
The sorting unit 94 sorts the projection position codes and coincidence event counts corresponding to the detected coincidence events; projection data with nonzero coincidence event counts are sorted by projection position code. In one embodiment, the sorting unit 94 sorts the projection data after the coincidence processor 92 has detected all coincidence events and the projection data generating unit 93 has generated the projection data for all of them. In another embodiment, the sorting unit 94 sorts while the coincidence processor 92 detects coincidence events and the projection data generating unit 93 determines projection position codes: if a projection position code is not yet in the sorted sequence, the sorting unit 94 inserts the code and its corresponding coincidence event count into the sequence; if the code already exists in the sequence, the sorting unit 94 accumulates its coincidence event count. In this mode the coincidence processor 92, the projection data generating unit 93, and the sorting unit 94 operate in synchronization: a coincidence event is detected, its projection data are determined, and the data are sorted into place or the count of the existing projection position code is incremented by 1.
The image reconstruction unit 95 reconstructs an image using the sorted projection position codes and the corresponding coincidence event counts. The sorted projection data are exactly those whose coincidence event counts are not 0, and the image reconstruction unit 95 processes them in their sorted order.
The encoding unit 91, the coincidence processor 92, the projection data generation unit 93, the sorting unit 94, and/or the image reconstruction unit 95 of the imaging system 90 may be implemented by software, by hardware, or by a combination of hardware and software. They may be separate units or may be integrated into one unit.
In some embodiments, the imaging system 90 may also include other devices not shown in FIG. 9. For example, and without limitation: a data acquisition unit for acquiring the electrical signals of the detection device 10 and providing them to the coincidence processor 92; a memory for storing the projection position codes generated by the encoding unit 91, the coincidence event counts, the sorted projection data, intermediate data of the image reconstruction process, and/or the reconstructed image; a display for displaying the reconstructed image and/or parameters; and input devices, such as a keyboard, mouse, or touch screen, for inputting control commands, operating parameters, and the like.
Since the device embodiments substantially correspond to the method embodiments, the description of the method embodiments applies to them; the two sets of embodiments complement each other. The device embodiments described above are merely illustrative: units described as separate parts may or may not be physically separate, and parts shown as units may or may not be physical units; they may be located in one place or distributed over a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purposes of the present scheme. One of ordinary skill in the art can understand and implement this without inventive effort.
The above description is only exemplary of the present application and should not be taken as limiting the present application, as any modification, equivalent replacement, or improvement made within the spirit and principle of the present application should be included in the scope of protection of the present application.

Claims (10)

1. An imaging method, characterized in that it comprises:
encoding a plurality of projection positions in a projection space formed by a detector device according to a spatial sequence to obtain projection position codes;
detecting a coincidence event;
determining a projection position code corresponding to the detected coincidence event, and accumulating a coincidence event count corresponding to the projection position code;
sorting the projection position codes and the coincidence event counts corresponding to the detected coincidence events; and
reconstructing an image using the ordered projection position codes and the corresponding coincidence event counts.
2. The imaging method of claim 1, wherein: the encoding step comprises encoding the projection position in the projection direction, the projection arrangement direction in the transverse plane and the axial direction to obtain a projection position code comprising a projection direction code, a projection code in the transverse plane and an axial code.
3. The imaging method of claim 1, wherein: the step of encoding comprises encoding the ring difference to obtain a projection position code comprising a ring difference code.
4. The imaging method of claim 1, wherein: the step of encoding comprises encoding the time interval to obtain a projected position code comprising the time interval code.
5. The imaging method of claim 1, wherein: the step of ordering comprises ordering during the steps of detecting coincidence events and determining projected position codes; if the projection position code does not exist in the sequenced sequence, the projection position code and the corresponding coincidence event count are sequenced into the sequenced sequence; if the projection position code already exists in the sorted sequence, the coincidence event counts corresponding to the projection position code are accumulated.
6. An imaging system, characterized in that it comprises:
a detection means forming a projection space and for detecting gamma photons;
the encoding unit is used for encoding a plurality of projection positions in the projection space according to a spatial sequence to obtain projection position codes;
a coincidence processor for detecting a coincidence event;
the projection data generating unit is used for determining the projection position code corresponding to the detected coincidence event and accumulating the coincidence event count corresponding to the projection position code;
the sorting unit is used for sorting the projection position codes and the coincidence event counts corresponding to the detected coincidence events; and
an image reconstruction unit for reconstructing an image using the ordered projection position codes and the corresponding coincidence event counts.
7. The imaging system of claim 6, wherein: the encoding unit is used for encoding the projection position in the projection direction, the transverse in-plane projection arrangement direction and the axial direction to obtain a projection position code comprising the projection direction code, the transverse in-plane projection code and the axial code.
8. The imaging system of claim 6, wherein: the encoding unit is configured to encode the ring difference to obtain a projection position code comprising a ring difference code.
9. The imaging system of claim 6, wherein: the encoding unit is used for encoding the time interval to obtain a projection position code comprising the time interval code.
10. The imaging system of claim 6, wherein: the sorting unit is used for sorting in the process that the coincidence processor detects the coincidence event and the projection data generating unit determines the projection position code; if the projection position code does not exist in the sorted sequence, the sorting unit is used for sorting the projection position code and the corresponding coincidence event count into the sorted sequence; if the projection position code already exists in the sorted sequence, the sorting unit is used for accumulating the coincidence event count corresponding to the projection position code.
CN201710039748.6A 2017-01-19 2017-01-19 Imaging method and imaging system Active CN106859686B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710039748.6A CN106859686B (en) 2017-01-19 2017-01-19 Imaging method and imaging system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710039748.6A CN106859686B (en) 2017-01-19 2017-01-19 Imaging method and imaging system

Publications (2)

Publication Number Publication Date
CN106859686A CN106859686A (en) 2017-06-20
CN106859686B true CN106859686B (en) 2020-01-03

Family

ID=59158261

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710039748.6A Active CN106859686B (en) 2017-01-19 2017-01-19 Imaging method and imaging system

Country Status (1)

Country Link
CN (1) CN106859686B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107799175B (en) * 2017-11-24 2020-11-03 中国科学院高能物理研究所 Virtual detector data organization method and device, storage medium and electronic equipment
CN108013895B (en) * 2017-11-28 2021-08-27 上海联影医疗科技股份有限公司 Coincidence judgment method, device, equipment and medium
CN110415311B (en) * 2019-07-29 2024-04-16 上海联影医疗科技股份有限公司 PET image reconstruction method, system, readable storage medium and apparatus

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1186246A (en) * 1997-11-22 1998-07-01 深圳奥沃国际科技发展有限公司 Photoelectric detector for tomography
CN1405081A (en) * 2002-11-01 2003-03-26 中国科学院上海微系统与信息技术研究所 Electric-static driven large-displacement micro structure
CN101084831A (en) * 2006-06-06 2007-12-12 通用电气公司 Methods and apparatus for PET time of flight
CN104408763A (en) * 2014-10-29 2015-03-11 沈阳东软医疗系统有限公司 Image reconstruction method and apparatus
CN104408756A (en) * 2014-10-30 2015-03-11 东软集团股份有限公司 PET image reconstruction method and apparatus

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4649348B2 (en) * 2006-02-28 2011-03-09 株式会社日立製作所 Nuclear medicine diagnostic equipment
US10107766B2 (en) * 2015-01-15 2018-10-23 Analogic Corporation Photon counting imaging modes


Also Published As

Publication number Publication date
CN106859686A (en) 2017-06-20


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: 110167 No. 177-1 Innovation Road, Hunnan District, Shenyang City, Liaoning Province

Applicant after: Neusoft Medical Systems Co., Ltd.

Address before: No. 16, Hunnan New Century Road, Shenyang, Liaoning Province, 110179

Applicant before: Shenyang Neusoft Medical Systems Co., Ltd.

GR01 Patent grant