Summary of the invention
The technical problem solved by the present invention is to provide a new PET three-dimensional image reconstruction method and apparatus, so as to improve the efficiency of three-dimensional reconstruction.
Embodiments of the invention provide a PET three-dimensional reconstruction method, the method comprising:
1) initializing a three-dimensional image, such that the pixel value of each pixel on the three-dimensional matrix in the first coordinate system (X, Y, Z) of the initial three-dimensional image is f1(x, y, z);
2) selecting one line-of-response subset from a plurality of line-of-response subsets obtained from a detector, each line-of-response subset comprising a plurality of line-of-response groups, and each line-of-response group comprising a plurality of mutually parallel lines of response;
3) selecting line-of-response groups in turn from the selected subset, and performing a predetermined operation on each selected group, the predetermined operation comprising:
3.1) obtaining, from the pixel values f1(x, y, z) of the pixels in the first coordinate system (X, Y, Z), the pixel value g1(s, t, z) of each pixel of a first pixel group located in a second coordinate system (S, T, Z) corresponding to the selected line-of-response group, the pixels of the first pixel group lying on the lines of response of the selected group;
3.2) performing an orthogonal projection calculation on the pixel values g1(s, t, z) of the pixels of the first pixel group located in the second coordinate system corresponding to the selected group, to obtain the orthogonal projection value p1(s, t, z) of each projection point corresponding to the selected group;
3.3) comparing the orthogonal projection value p1(s, t, z) of each projection point corresponding to the selected group with the measured projection value p0(s, t, z), to obtain a correction coefficient R(s, t, z), the measured projection value p0(s, t, z) being calculated from the line-of-response data acquired by the detector;
3.4) performing a back-projection calculation according to the correction coefficient R(s, t, z), to obtain an updated pixel value g2(s, t, z) of each pixel of the first pixel group located in the second coordinate system corresponding to the selected group;
3.5) obtaining, from the pixel values g2(s, t, z) of the pixels of the first pixel group located in the second coordinate system corresponding to the selected group, the pixel value f2(x, y, z) of each pixel of a second pixel group located in the first coordinate system corresponding to the selected group;
4) updating the initial three-dimensional image according to the results of the predetermined operation for the plurality of line-of-response groups of the selected subset;
5) terminating the iteration when a termination rule is met; otherwise continuing from step 2) with the updated initial three-dimensional image.
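The overall flow of steps 1) to 5) can be sketched as an iteration over subsets. This is a minimal sketch for illustration only: the multiplicative merge of per-group results, the image size, and the `update_image` callback are assumptions, since the summary does not fix the update formula here.

```python
# Sketch of the iterative loop in steps 1)-5); per-group processing is
# delegated to a caller-supplied function and merged multiplicatively
# (an illustrative assumption, not the patent's fixed update rule).
import numpy as np

def reconstruct(subsets, update_image, n_iter=1):
    """Iterate over LOR subsets; each subset triggers one image update."""
    f = np.ones((4, 4, 4))           # step 1): initialised 3-D image f1(x, y, z)
    for _ in range(n_iter):
        for subset in subsets:       # step 2): select one LOR subset
            corrections = []
            for group in subset:     # step 3): process each parallel-LOR group
                corrections.append(update_image(f, group))
            # step 4): merge per-group results into the image
            f = f * np.mean(corrections, axis=0)
        # step 5): here the termination rule is "all subsets used once"
    return f
```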
Optionally, in step 1), the direction of the z-axis of the first coordinate system (X, Y, Z) is the axial direction of the detector; in step 3.2), the z-axis direction of the second coordinate system (S, T, Z) is also the axial direction of the detector, the t-axis direction of the second coordinate system is the line-of-response direction of the selected group, and the s-axis direction is perpendicular to the line-of-response direction of the selected group.
Optionally, the method further comprises: before the orthogonal projection calculation, applying a forward point spread function, in the s-t plane, to the pixel values g1(s, t, z) of the pixels of the first pixel group located in the second coordinate system corresponding to the selected line-of-response group; and after the back-projection calculation, applying a reverse point spread function, in the s-t plane, to the pixel values g2(s, t, z) of those pixels.
Optionally, the method further comprises: before the orthogonal projection calculation, applying a forward point spread function, along the s-axis direction, to the pixel values g1(s, t, z) of the pixels of the first pixel group located in the second coordinate system corresponding to the selected line-of-response group; and after the back-projection calculation, applying a reverse point spread function, along the s-axis direction, to the pixel values g2(s, t, z) of those pixels.
Optionally, the orthogonal projection calculation in step 3.2) comprises: performing a weighted summation, within a first preset range, of the pixel values of the pixels related to a projection point; the back-projection calculation in step 3.4) comprises: performing a weighted summation, within a second preset range, of the projection values of the projection points related to a pixel.
Optionally, the first preset range is determined from the positional relationship between the projection point and the end points of the cross-section, at the projection point, of a detector ring of the detector parallel to the t-z plane; the second preset range is determined from the positional relationship between the pixel and the end points of the cross-section, at the pixel, of a detector ring of the detector parallel to the t-z plane.
Optionally, the weight coefficient of each pixel related to the projection point is determined from the distance between that pixel and the projection point; the weight coefficient of each projection point related to the pixel is determined from the distance between that projection point and the pixel.
Optionally, before the orthogonal projection calculation, the method further comprises: merging the pixel values of every m pixels of the first pixel group distributed along the t-axis in the second coordinate system corresponding to the selected line-of-response group, where m is an integer greater than 1; in step 3.3), the method of obtaining the measured projection values further comprises: merging the measured projection values of every m projection points along the t-axis, in the second coordinate system, of the response lines corresponding to the selected group.
Optionally, the termination rule is: all of the plurality of line-of-response subsets have been selected.
Optionally, the termination rule is: the updated initial three-dimensional image reaches a predetermined threshold.
Optionally, in step 3.1), the method of obtaining, from the pixel values f1(x, y, z) of the pixels in the first coordinate system, the pixel value g1(s, t, z) of each pixel of the first pixel group located in the second coordinate system corresponding to the selected line-of-response group comprises:
a1) for a pixel (s1, t1, z1) of the first pixel group located in the second coordinate system corresponding to any selected line-of-response group, finding the coordinates (x1, y1, z1) of the corresponding pixel in the first coordinate system;
a2) in the first coordinate system, obtaining the pixel value of the pixel (x1, y1, z1) from the pixel values f1(x, y, z) of the pixels adjacent to the pixel (x1, y1, z1); the pixel value g1(s1, t1, z1) of the pixel (s1, t1, z1) of the first pixel group located in the second coordinate system corresponding to the selected group is identical to the pixel value of the pixel (x1, y1, z1) located in the first coordinate system.
Optionally, step a1) comprises: calculating, from the rotation angle θ from the first coordinate system to the second coordinate system, the coordinates (x1, y1, z1) of the pixel located in the first coordinate system corresponding to any pixel (s1, t1, z1) of the first pixel group located in the second coordinate system corresponding to the selected line-of-response group, the calculation being: x1 = s1·cos θ − t1·sin θ, y1 = s1·sin θ + t1·cos θ, with the z coordinate unchanged.
Optionally, in step a2), the method of obtaining the pixel value of the pixel (x1, y1, z1) from the pixel values of the pixels adjacent to the pixel (x1, y1, z1) comprises: performing bilinear interpolation over the four pixels adjacent to the pixel (x1, y1, z1) to obtain the pixel value of the pixel (x1, y1, z1).
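The bilinear interpolation of step a2) over the four neighbouring pixels can be sketched as follows; the 2-D slice layout `img[y, x]` is an assumption for illustration:

```python
import numpy as np

def bilinear(img, x, y):
    """Bilinear interpolation of a 2-D slice at fractional (x, y),
    using the four surrounding grid pixels, as in step a2)."""
    x0, y0 = int(np.floor(x)), int(np.floor(y))
    dx, dy = x - x0, y - y0
    return ((1 - dx) * (1 - dy) * img[y0, x0]
            + dx * (1 - dy) * img[y0, x0 + 1]
            + (1 - dx) * dy * img[y0 + 1, x0]
            + dx * dy * img[y0 + 1, x0 + 1])
```

For example, at the centre of a 2x2 neighbourhood the result is the mean of the four pixel values.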
Optionally, in step 3.5), the method of obtaining, from the pixel values g2(s, t, z) of the pixels of the first pixel group located in the second coordinate system corresponding to the selected line-of-response group, the pixel value f2(x, y, z) of each pixel of the second pixel group located in the first coordinate system corresponding to the selected group comprises:
b1) for a pixel (x2, y2, z2) of the second pixel group located in the first coordinate system corresponding to any selected line-of-response group, finding the coordinates (s2, t2, z2) of the corresponding pixel in the second coordinate system;
b2) in the second coordinate system, obtaining the pixel value of the pixel (s2, t2, z2) from the pixel values g2(s, t, z) of the adjacent pixels of the first pixel group; the pixel value f2(x2, y2, z2) of the pixel (x2, y2, z2) of the second pixel group located in the first coordinate system corresponding to the selected group is identical to the pixel value of the pixel (s2, t2, z2) located in the second coordinate system.
Optionally, step b1) comprises: calculating, from the rotation angle θ from the second coordinate system back to the first coordinate system, the coordinates (s2, t2, z2) of the pixel located in the second coordinate system corresponding to any pixel (x2, y2, z2) of the second pixel group located in the first coordinate system corresponding to the selected line-of-response group, the calculation being: s2 = x2·cos θ + y2·sin θ, t2 = −x2·sin θ + y2·cos θ, with the z coordinate unchanged.
Optionally, in step b2), the method of obtaining the pixel value of the pixel (s2, t2, z2) from the pixel values g2(s, t, z) of the adjacent pixels of the first pixel group comprises: performing bilinear interpolation over the four pixels adjacent to the pixel (s2, t2, z2) to obtain the pixel value of the pixel (s2, t2, z2).
Embodiments of the invention also provide a PET three-dimensional reconstruction device, the device comprising:
an initialization unit, configured to initialize a three-dimensional image such that the pixel value of each pixel on the three-dimensional matrix in the first coordinate system (X, Y, Z) of the initial three-dimensional image is f1(x, y, z);
a line-of-response subset selection unit, configured to select one line-of-response subset from a plurality of line-of-response subsets obtained from the detector, each line-of-response subset comprising a plurality of line-of-response groups, and each line-of-response group comprising a plurality of mutually parallel lines of response;
a line-of-response group processing unit, configured to select line-of-response groups in turn from the selected subset and to perform the predetermined operation on each selected group.
The line-of-response group processing unit comprises:
a first coordinate transformation subunit, configured to obtain, from the pixel values f1(x, y, z) of the pixels in the first coordinate system (X, Y, Z), the pixel value g1(s, t, z) of each pixel of the first pixel group located in the second coordinate system (S, T, Z) corresponding to the selected line-of-response group, the pixels of the first pixel group lying on the lines of response of the selected group;
an orthogonal projection subunit, configured to perform an orthogonal projection calculation on the pixel values g1(s, t, z) of the pixels of the first pixel group located in the second coordinate system corresponding to the selected group, to obtain the orthogonal projection value p1(s, t, z) of each projection point corresponding to the selected group;
a correction coefficient acquisition subunit, configured to compare the orthogonal projection value p1(s, t, z) of each projection point corresponding to the selected group with the measured projection value p0(s, t, z), to obtain the correction coefficient R(s, t, z), the measured projection value p0(s, t, z) being calculated from the line-of-response data acquired by the detector;
a back-projection subunit, configured to perform a back-projection calculation according to the correction coefficient R(s, t, z), to obtain the pixel value g2(s, t, z) of each pixel of the first pixel group located in the second coordinate system corresponding to the selected group;
a second coordinate transformation subunit, configured to obtain, from the pixel values g2(s, t, z) of the pixels of the first pixel group located in the second coordinate system corresponding to the selected group, the pixel value f2(x, y, z) of each pixel of the second pixel group located in the first coordinate system corresponding to the selected group;
an image update unit, configured to update the initial three-dimensional image according to the results of the predetermined operation for the plurality of line-of-response groups of the selected subset;
a control unit, configured to judge whether the termination rule is met, to terminate the iteration when the termination rule is met, and otherwise to continue the iterative computation.
Compared with the prior art, the above technical solution has the following advantage: by transforming coordinates, a group of pixels of the reconstructed image is aligned with the response lines at a given angle, so that the three-dimensional ray-tracing algorithm can be simplified to a two-dimensional ray-tracing algorithm, greatly reducing the amount of computation.
Embodiment
In order to make the above objects, features and advantages of the present invention more apparent, specific embodiments of the present invention are described in detail below with reference to the accompanying drawings.
Many specific details are set forth in the following description to facilitate a full understanding of the present invention; however, the present invention can also be implemented in other ways than those described here, so the invention is not limited by the specific embodiments disclosed below.
Fig. 1 is a flow chart of the PET three-dimensional image reconstruction method in an embodiment of the present invention.
Referring to Fig. 1, the PET three-dimensional image reconstruction method comprises steps S100 to S150.
Step S100: initializing a three-dimensional image, such that the pixel value of each pixel on the three-dimensional matrix in the first coordinate system (X, Y, Z) of the initial three-dimensional image is f1(x, y, z).
Fig. 2 is a schematic diagram of the first coordinate system in an embodiment of the present invention. Referring to Fig. 2, the direction of the z-axis of the first coordinate system is the axial direction of the detector (the circle centred on point O in Fig. 2 is a cross-section of the detector; the axial direction of the detector is perpendicular to this cross-section).
The plane defined by the x-axis and the y-axis of the first coordinate system is parallel to the cross-section of the detector. The detector is a cylindrical structure comprising a plurality of detector rings, each of thickness H.
It should be noted that, in an embodiment of the present invention, the three-dimensional matrix in the first coordinate system can be a voxel lattice of N*N*N pixels; for simplicity of illustration, only one face of the three-dimensional matrix is shown in Fig. 2.
The detector can acquire a large number of lines of response (a large amount of line-of-response information is obtained in an actual measurement; only 5 response lines are shown in Fig. 3 as a simple illustration). These lines of response are divided into different line-of-response subsets, each subset comprising a plurality of line-of-response groups, and each group comprising a plurality of mutually parallel lines of response.
For example, one line-of-response subset comprises 12 line-of-response groups, the angles between the line-of-response extension directions of these groups and the y-axis being 0 degrees, 40 degrees, 80 degrees, ..., 320 degrees and 360 degrees respectively.
As another example, another line-of-response subset comprises 11 line-of-response groups, the angles between the line-of-response extension directions of these groups and the y-axis being 10 degrees, 50 degrees, 90 degrees, ..., 290 degrees and 330 degrees respectively.
As mentioned above, each line-of-response group comprises a plurality of mutually parallel lines of response. For example, when the angle between the line-of-response direction of a group and the y-axis is 20 degrees, the group comprises all mutually parallel lines of response whose angle with the y-axis is 20 degrees.
For example, in Fig. 2, response lines L1, L2 and L3 are parallel to each other, so they belong to the same line-of-response group, while the mutually parallel response lines L4 and L5 belong to another line-of-response group.
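The grouping of response lines into parallel groups (L1/L2/L3 versus L4/L5 above) can be sketched as follows; representing each line by a pair of endpoints and the angle tolerance `tol` are assumptions for illustration:

```python
import math

def group_by_angle(lors, tol=1e-6):
    """Group response lines by direction: parallel LORs (equal angle
    with the y-axis, modulo pi) fall into the same group."""
    groups = {}
    for (p1, p2) in lors:
        dx, dy = p2[0] - p1[0], p2[1] - p1[1]
        angle = math.atan2(dx, dy) % math.pi   # angle with the y-axis
        key = round(angle / tol) * tol         # quantise to tolerance
        groups.setdefault(key, []).append((p1, p2))
    return list(groups.values())
```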
Step S110: selecting one line-of-response subset from the plurality of line-of-response subsets obtained from the detector.
In an embodiment of the present invention, each time a line-of-response subset is selected and the computation is performed on the selected subset, one iteration is completed.
Step S120: selecting line-of-response groups in turn from the selected subset, and performing the predetermined operation on each selected group.
In an embodiment of the present invention, to obtain the image merging result of the selected line-of-response subset, the plurality of line-of-response groups in the selected subset need to be selected in turn for projection correction; the corrected image information corresponding to the groups is obtained from the corrected projection data, and the image information is merged according to the corrected image information corresponding to the plurality of groups.
Fig. 4 is a flow chart of the predetermined operation in an embodiment of the present invention; the predetermined operation comprises steps S111 to S115.
The predetermined operation for one selected line-of-response group in the selected subset is described in detail below with reference to steps S111 to S115.
Step S111: obtaining, from the pixel values f1(x, y, z) of the pixels of the initial three-dimensional image located in the first coordinate system (X, Y, Z), the pixel value g1(s, t, z) of each pixel of the first pixel group in the second coordinate system (S, T, Z) corresponding to the selected line-of-response group.
Step S111 is described in detail below with reference to Fig. 5.
The left side of Fig. 5 is a schematic diagram of the pixels in the first coordinate system, and the right side of Fig. 5 is a schematic diagram of the first pixel group in the second coordinate system.
The pixels of the initial three-dimensional image are the dots in the rectangle on the left side of Fig. 5; the distance between adjacent pixels is d, and the pixel value of each pixel is f1(x, y, z).
Upon study, the inventors found that, in the schematic on the left side of Fig. 5, the pixels of the initial three-dimensional image do not lie on the response lines of the selected group; computing the correction coefficient of every response line for every pixel with the prior art therefore costs considerable time.
For this reason, the inventors propose rotating the first coordinate system to produce the second coordinate system, and computing the pixel values of the first pixel group from the pixel values of the pixels of the initial three-dimensional image: if the angle between response line L and the y-axis is θ, the first coordinate system must be rotated by the same angle θ.
The first pixel group lies on the response lines of the selected group. Therefore, by rotating the coordinate system, a group of pixels of the reconstructed image can be aligned with the response lines at a given angle, the three-dimensional ray-tracing algorithm can be simplified to a two-dimensional ray-tracing algorithm, and the amount of computation is greatly reduced.
As shown in Fig. 5, the s-axis direction of the rotated second coordinate system is perpendicular to the line-of-response direction of the selected group, the t-axis direction of the second coordinate system is the line-of-response direction of the selected group, and the direction of the z-axis of the second coordinate system is the axial direction of the detector. That is, the t-axis direction of the second coordinate system coincides with the extension direction of the response line L.
The pixels of the first pixel group are distributed on the plurality of parallel response lines L corresponding to the selected group. The pixel value of a pixel of the first pixel group is g1(s, t, z), which can be obtained from the pixel values f1(x, y, z) of the pixels of the initial three-dimensional image located in the first coordinate system (X, Y, Z).
Specifically, in an embodiment of the present invention, the coordinates in the first coordinate system corresponding to the pixels of the first pixel group can first be obtained, and the pixel values of the pixels of the first pixel group are then determined from the pixel values of the pixels at the obtained coordinates.
Pixel A of the first pixel group in Fig. 5 is taken as an example below.
Referring to Fig. 5, the coordinates of pixel A in the second coordinate system are (s1, t1, z1); from the rotation angle θ from the first coordinate system to the second coordinate system, the coordinates (x1, y1, z1) of pixel A in the first coordinate system can be calculated as: x1 = s1·cos θ − t1·sin θ, y1 = s1·sin θ + t1·cos θ, with the z coordinate unchanged.
Then, from the coordinates (x1, y1, z1), the pixels adjacent to (x1, y1, z1) in the first coordinate system can be found, and the pixel value of the pixel at (x1, y1, z1) can be determined from the pixel values of those neighbouring pixels; the pixel value g1(s1, t1, z1) of pixel A is equal to the pixel value of the pixel (x1, y1, z1).
Specifically, in an embodiment of the present invention, bilinear interpolation over the pixel values of the four pixels adjacent to (x1, y1, z1) in the first coordinate system can determine the pixel value g1(s1, t1, z1) of pixel A in the second coordinate system.
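The coordinate mapping from a pixel (s1, t1) of the second coordinate system back to (x1, y1) in the first coordinate system is a plane rotation, with the z coordinate unchanged. A minimal sketch follows; the sign convention of the rotation is an assumption:

```python
import math

def rotate_to_first(s, t, theta):
    """Map a pixel (s, t) of the rotated second coordinate system back
    to (x, y) in the first coordinate system; theta is the rotation
    angle between the response line and the y-axis (sign convention
    assumed here)."""
    x = s * math.cos(theta) - t * math.sin(theta)
    y = s * math.sin(theta) + t * math.cos(theta)
    return x, y
```

The resulting (x, y) is in general fractional, which is why the bilinear interpolation over the four neighbouring grid pixels is needed afterwards.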
After the pixel values g1(s, t, z) of the pixels of the first pixel group in the second coordinate system have been obtained by step S111, the orthogonal projection calculation can be performed on these pixel values.
In an embodiment of the present invention, in order to further reduce the amount of computation and improve the efficiency of three-dimensional image reconstruction, the pixels of the first pixel group may also be merged before the orthogonal projection calculation.
Specifically, the pixel values of every m pixels of the first pixel group along the t-axis in the second coordinate system corresponding to the selected line-of-response group are merged; that is, m pixels are combined into one pixel, whose pixel value is the sum of the pixel values of the m pixels. Here m = 0.15·Δt/d, where Δt is the time interval at which the time of flight is recorded by the time-of-flight information processing system, and d is the distance between pixels.
After the pixels are merged, the image size is unchanged, the pixel spacing along the s-axis is unchanged, and the spacing along the t-axis becomes m·d. Clearly, after this merging operation the number of pixels to be computed is reduced by a factor of m, which significantly improves the image reconstruction efficiency.
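The merging of every m pixels along the t-axis can be sketched as follows; the array layout (s, t, z) and the handling of a ragged tail are assumptions for illustration:

```python
import numpy as np

def merge_along_t(g, m):
    """Merge every m pixels along the t axis by summing their values;
    the image extent is unchanged but the t-spacing becomes m*d."""
    s_dim, t_dim, z_dim = g.shape
    t_keep = (t_dim // m) * m                # drop a ragged tail, if any
    g = g[:, :t_keep, :].reshape(s_dim, t_keep // m, m, z_dim)
    return g.sum(axis=2)
```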
In one embodiment of the invention, before the orthogonal projection calculation, a forward point spread function may also be applied, in the s-t plane, to the pixel values g1(s, t, z) of the pixels of the first pixel group in the second coordinate system (S, T, Z) corresponding to the selected line-of-response group.
In another embodiment of the present invention, before the orthogonal projection calculation, the forward point spread function is applied along the s-axis direction to the pixel values g1(s, t, z) of the pixels of the first pixel group in the second coordinate system (S, T, Z) corresponding to the selected group.
Specifically, in an embodiment of the present invention, the point spread function can be generated by simulation with the nuclear-medicine simulation software GATE. This software can simulate, in a realistic environment, the entire physical process from positron annihilation to the absorption of the photons by the detector, and record intermediate results.
Specifically, according to the position of the annihilation and its distance from the line of response, coincidence events falling in the same interval are merged at a certain spacing (such as the crystal size); the number of coincidence events in each interval and the total number of coincidence events under the same Δt are counted, and their quotient gives the normalised point spread function, denoted PSF(θ, s, T, I), where the angle θ and the index s determine the position of the response line, the index T determines the time-of-flight difference, and the index I determines the distance of a pixel perpendicular to the response line.
After the forward point spread function is applied along the s-axis direction to the pixel values g1(s, t, z) of the pixels of the first pixel group in the corresponding second coordinate system (S, T, Z), the resulting pixel values g1'(s, t, z) are computed as a one-dimensional convolution along s: g1'(s, t, z) = Σ_i PSF(θ, s, T, i) · g1(s − i, t, z).
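Applying the point spread function along the s-axis amounts to a one-dimensional convolution of each s-profile with the normalised kernel. A minimal sketch, assuming for simplicity a spatially invariant kernel (the full point spread function also depends on the response-line position and the time-of-flight bin):

```python
import numpy as np

def apply_psf_along_s(g, psf):
    """Forward application of a 1-D point spread function along the
    s axis: each s-profile of the (s, t, z) volume is convolved with
    the normalised kernel psf."""
    out = np.empty_like(g)
    for t in range(g.shape[1]):
        for z in range(g.shape[2]):
            out[:, t, z] = np.convolve(g[:, t, z], psf, mode="same")
    return out
```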
It should be noted that, in embodiments of the invention, applying the two-dimensional point spread function is in fact a simplified form of the three-dimensional point spread function; relative to the three-dimensional point spread function it reduces the amount of computation and improves the efficiency of three-dimensional image reconstruction.
Step S112: performing the orthogonal projection calculation on the pixel values g1(s, t, z) of the pixels of the first pixel group in the second coordinate system (S, T, Z) corresponding to the selected line-of-response group, to obtain the projection value p1(s, t, z) of each projection point, in the corresponding second coordinate system (S, T, Z), of the first pixel group corresponding to the selected group.
Specifically, in an embodiment of the present invention, the orthogonal projection calculation comprises: performing a weighted summation, within the first preset range, of the pixel values of the pixels related to a projection point; the first preset range is determined from the positional relationship between the projection point and the end points of the cross-section, at the projection point, of the detector ring parallel to the t-z plane.
Projection point B in the second coordinate system in Fig. 6 is taken as an example below.
Referring to Fig. 6, projection point B lies in the cross-section PQRS obtained by cutting the detector ring along line C1C2; the length of cross-section PQRS is the length of C1C2, and the width of cross-section PQRS is the thickness H of the detector ring. The projection value of projection point B is related to the pixels in the triangular regions PSB and RBQ (the pixels in triangular regions PSB and RBQ contribute to the projection value of projection point B), so the projection value of projection point B is the weighted sum of the pixel values of the pixels in triangular regions PSB and RBQ.
In an embodiment of the present invention, the weight coefficient of each pixel related to the projection point is determined from the distance between that pixel and the projection point, and the weight coefficient of each projection point related to a pixel is determined from the distance between that projection point and the pixel.
For example, if the distance between a pixel and the projection point is L1, the weight coefficient w1(s, t, z) of this pixel is computed as a Gaussian of that distance: w1(s, t, z) = exp(−4·ln2·L1²/FWHM²).
Here FWHM is the full width at half maximum of the Gaussian function, whose value is the product of the temporal resolution of the system and the speed of light.
The value p1(s, t, z) obtained by the weighted summation, within the first preset range, of the pixel values of the pixels related to the projection point is then computed as:
p1(s, t, z) = Σ g1(s, t, z)·w1(s, t, z).
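The Gaussian weighting and the weighted summation p1 = Σ g1·w1 can be sketched as follows; representing the pixels in the preset range as (value, distance) pairs is an assumption for illustration:

```python
import math

def gaussian_weight(dist, fwhm):
    """Weight of a pixel at distance `dist` from the projection point:
    a Gaussian whose full width at half maximum is fwhm (so the weight
    is exactly 0.5 at dist = fwhm / 2)."""
    sigma = fwhm / (2.0 * math.sqrt(2.0 * math.log(2.0)))
    return math.exp(-dist * dist / (2.0 * sigma * sigma))

def project_point(pixels, fwhm):
    """p1(s,t,z) = sum of g1 * w1 over the pixels in the preset range;
    `pixels` is a list of (pixel_value, distance_to_projection_point)."""
    return sum(v * gaussian_weight(d, fwhm) for v, d in pixels)
```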
Step S113: comparing the orthogonal projection value p1(s, t, z) of each projection point corresponding to the selected line-of-response group with the measured projection value p0(s, t, z), to obtain the correction coefficient R(s, t, z); the measured projection value p0(s, t, z) is calculated from the line-of-response data acquired by the detector.
Referring to Fig. 7, in an embodiment of the present invention, correcting the projection data of the first pixel group of the selected line-of-response group in the second coordinate system to obtain the correction coefficient comprises steps S113a and S113b.
Step S113a: obtaining, from the line-of-response data acquired by the detector for the selected group, the measured projection value p0(s, t, z) of each projection point, in the second coordinate system, of the response lines corresponding to the selected group.
Particularly, the line-of-response data (ia, ib, ra, rb, Δt) under list mode can first be obtained, wherein (ia, ib) is the circumferential numbering, on the detector ring, of the pair of crystals joined by the response line, (ra, rb) is the axial numbering of this pair of crystals, and Δt is the flight-time difference recorded by the electronics. According to the temporal resolution t_res and the spatial resolution s_res of the system, the maximum classification angle θ_res is calculated, generally 1 degree to 10 degrees, and the interval [0, π) is divided into N parts accordingly, designated θ_1, θ_2, ..., θ_N. All coincidence events under list mode are then traversed, and according to the geometry of the system each event (ia, ib, ra, rb, Δt) is converted to coordinates (s, θ, z_mlp) of the second coordinate system, wherein s is the distance from the line joining the pair of crystals of the coincidence event to the central axis, θ is the angle of the projection of this line on the transverse plane, and z_mlp is the axial coordinate of the maximum-probability position of the coincidence event as determined by the time-of-flight technique. The computing methods of t_bin and z_mlp are:
t_bin = c·Δt / 2
z_mlp = (z1 + z2)/2 + t_bin·(z2 − z1) / L
Wherein (x1, y1, z1) and (x2, y2, z2) are the spatial coordinates of the pair of crystals that received the gamma photons, L is the distance between the two crystals, and c is the speed of light.
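A minimal sketch of the time-of-flight coordinate conversion described above; the sign convention for Δt and the helper name are assumptions, not taken from the patent.

```python
import math

def tof_coordinates(p1, p2, dt, c=299792458.0):
    # p1, p2: (x, y, z) coordinates of the two crystals that received the
    # gamma photons; dt: flight-time difference between them (seconds).
    # Returns (t_bin, z_mlp): the signed offset of the estimated event
    # position from the midpoint along the response line, and the axial
    # coordinate of that position.
    (x1, y1, z1), (x2, y2, z2) = p1, p2
    length = math.dist(p1, p2)      # crystal-to-crystal distance L
    t_bin = c * dt / 2.0            # offset along the line of response
    z_mlp = (z1 + z2) / 2.0 + t_bin * (z2 - z1) / length
    return t_bin, z_mlp
```

With dt = 0 the estimated position is the midpoint of the line of response, as expected.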
The measurement projection value of each subpoint of a response line in the second coordinate system can then be obtained by accumulating the coincidence events converted in the above step into the corresponding coordinate bins, which yields the measurement projection value p0(s, t, z) of each subpoint, in the second coordinate system, of the plurality of response lines corresponding to each first pixel group in the selected line-of-response subset.
If, before the orthogonal projection is carried out, the pixel values of every m pixels of the first pixel group along the t axis in the second coordinate system corresponding to the selected line-of-response group are merged, then correspondingly the measurement projection values of every m subpoints along the t axis, in the corresponding second coordinate system, of the plurality of response lines corresponding to the selected line-of-response group can also be merged.
Particularly, every m subpoints are merged into one subpoint, the projection value of the merged subpoint is the sum of the projection values of the m subpoints, and the distance between adjacent subpoints after merging is increased by a factor of m.
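The merging of subpoints can be sketched as follows (an illustrative helper, assuming the number of subpoints is divisible by m):

```python
def merge_subpoints(values, m):
    # Merge every m consecutive projection values along the t axis into
    # one; the merged value is the sum of the m values, so the number of
    # subpoints shrinks by a factor of m while their spacing grows m-fold.
    assert len(values) % m == 0
    return [sum(values[i:i + m]) for i in range(0, len(values), m)]
```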
Step S113b: obtain the ratio R(s, t, z) between the measurement projection value p0(s, t, z) of each subpoint, in the second coordinate system, of the plurality of response lines corresponding to the selected line-of-response group and the orthogonal projection value p1(s, t, z) of each subpoint of the corresponding first pixel group in the second coordinate system (S, T, Z).
Particularly, for each coordinate in the second coordinate system, the ratio between the corresponding measurement projection value and the projection value calculated by the orthogonal projection is computed.
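The elementwise ratio can be sketched as follows; the `eps` guard against division by zero is an assumption added for the example, not part of the original description.

```python
def correction_coefficients(p0, p1, eps=1e-12):
    # R = p0 / p1 elementwise: ratio of measured projection values to
    # forward-projected values at matching (s, t, z) coordinates.
    return [meas / max(proj, eps) for meas, proj in zip(p0, p1)]
```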
After the correction coefficient corresponding to the selected line-of-response group is obtained, step S114 is executed.
Step S114: carry out back-projection calculation according to the correction coefficient R(s, t, z) corresponding to the selected line-of-response group, to obtain the pixel value g2(s, t, z) of each pixel, in the corresponding second coordinate system, of the second pixel group corresponding to the selected line-of-response group.
In an embodiment of the present invention, the pixels of the second pixel group are distributed on the plurality of parallel lines of response corresponding to the selected line-of-response group, and the projection value of each subpoint of the second pixel group in the second coordinate system corresponding to the selected line-of-response group is the correction coefficient value R(s, t, z).
It should be noted that the back projection is the reverse process of the orthogonal projection: the orthogonal projection calculates projection values from pixel values, while the back projection calculates pixel values from projection values.
Particularly, in an embodiment of the present invention, the back-projection calculation can comprise: within a second preset range, performing a weighted summation of the projection values of the subpoints relevant to a pixel, the second preset range being determined according to the positional relationship between the pixel and the end points of the section of the detector ring cut by the plane, parallel to the t-z plane, in which the pixel lies.
The weight coefficient corresponding to each subpoint relevant to the pixel is determined according to the distance between that subpoint and the pixel.
Particularly, the determination of the second preset range and the computing method of the weight coefficients are similar to the determination of the first preset range and the computing method of the weight coefficients corresponding to the pixels described above; please refer to the description of the orthogonal projection calculation.
For example, if the distance between a subpoint and the pixel is L2, the weight coefficient w2(s, t, z) corresponding to this subpoint is computed as:
w2(s, t, z) = exp(−4·ln2·L2² / FWHM²)
Wherein FWHM is the full width at half maximum of the Gaussian function; its value is the product of the temporal resolution of the system and the speed of light.
The value g2(s, t, z) obtained by the weighted summation, within the second preset range, of the projection values of the subpoints relevant to the pixel is then computed as:
g2(s, t, z) = ∑ R(s, t, z)·w2(s, t, z).
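The back-projection sum mirrors the forward step; a minimal sketch, using the same Gaussian weight (function name is hypothetical):

```python
import math

def back_project(coefs, fwhm):
    # g2 = sum of R * w2 over (correction coefficient, distance-to-pixel)
    # pairs of the subpoints within the second preset range of one pixel.
    sigma = fwhm / (2.0 * math.sqrt(2.0 * math.log(2.0)))
    return sum(r * math.exp(-d * d / (2.0 * sigma * sigma))
               for r, d in coefs)
```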
In an embodiment of the present invention, after the back-projection calculation, an inverse point spread function can further be applied, in the s-t plane, to the pixel value g2(s, t, z) of each pixel, in the corresponding second coordinate system, of the second pixel group corresponding to the selected line-of-response group.
In another embodiment of the present invention, after the back-projection calculation, an inverse point spread function can instead be applied, along the direction of the s axis, to the pixel value g2(s, t, z) of each pixel, in the corresponding second coordinate system, of the second pixel group of the selected line-of-response group.
After the inverse point spread function is applied along the direction of the s axis to the pixel values g2(s, t, z) of the second pixel group of the selected line-of-response group, the resulting pixel value of each pixel in the corresponding second coordinate system is denoted g2'(s, t, z).
It should likewise be noted that, in embodiments of the invention, applying a two-dimensional point spread function is in effect a reduced form of the three-dimensional point spread function; compared with the three-dimensional point spread function it reduces the amount of computation and improves the efficiency of three-dimensional image reconstruction.
Step S115: according to the pixel value g2(s, t, z) of each pixel of the second pixel group in the second coordinate system corresponding to the selected line-of-response group, obtain the pixel value f2(x, y, z) of each pixel, in the first coordinate system, of the third pixel group corresponding to each second pixel group.
In an embodiment of the present invention, the pixels of the third pixel group are distributed on the three-dimensional matrix in the first coordinate system.
It should be noted that step S115 is the reverse process of step S111: in step S111 the pixel values of the pixels of the first pixel group in the second coordinate system are obtained from the pixel values of the pixels in the first coordinate system, while in step S115 the pixel values of the pixels of the third pixel group in the first coordinate system are obtained from the pixel values of the pixels of the second pixel group in the second coordinate system.
In an embodiment of the present invention, the coordinates of the pixels of the third pixel group in the second coordinate system can first be obtained, and the pixel value of each pixel of the third pixel group is then determined according to the pixel values of the pixels of the second pixel group whose coordinates in the second coordinate system are adjacent to it.
The pixel D of the third pixel group in Fig. 8 is taken as an example below.
Please refer to Fig. 8. In the first coordinate system, the coordinates of the pixel D are (x2, y2, z2). With θ denoting the angle by which the first coordinate system is rotated to the second coordinate system, the coordinates (s2, t2, z2) of the pixel D in the second coordinate system can be calculated as:
s2 = x2·cosθ + y2·sinθ
t2 = −x2·sinθ + y2·cosθ
with the z coordinate unchanged.
Then, according to the coordinates (s2, t2, z2), the pixels of the second pixel group adjacent to the coordinates (s2, t2, z2) in the second coordinate system can be found, and the pixel value f2(x, y, z) of the pixel D in the first coordinate system can be determined according to the pixel values of these adjacent pixels.
Particularly, in an embodiment of the present invention, bilinear interpolation can be carried out according to the pixel values of the four adjacent pixels to determine the pixel value f2(x, y, z) of the pixel D in the first coordinate system.
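The coordinate rotation and the bilinear interpolation can be sketched as follows; the rotation direction and the unit-spaced grid indexing are assumptions made for the example.

```python
import math

def rotate_to_second(x, y, theta):
    # Rotate a first-coordinate-system point (x, y) by the group angle
    # theta to obtain (s, t) in the second coordinate system.
    s = x * math.cos(theta) + y * math.sin(theta)
    t = -x * math.sin(theta) + y * math.cos(theta)
    return s, t

def bilinear(img, s, t):
    # Bilinear interpolation of a 2-D grid `img` (indexed img[row][col]
    # with unit spacing) at fractional coordinates (col=s, row=t), using
    # the four adjacent grid pixels.
    r0, c0 = int(math.floor(t)), int(math.floor(s))
    dr, dc = t - r0, s - c0
    return ((1 - dr) * (1 - dc) * img[r0][c0]
            + (1 - dr) * dc * img[r0][c0 + 1]
            + dr * (1 - dc) * img[r0 + 1][c0]
            + dr * dc * img[r0 + 1][c0 + 1])
```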
After the pixel values f2(x, y, z) of each pixel, in the first coordinate system, of the third pixel groups corresponding to the plurality of line-of-response groups of the selected line-of-response subset have all been calculated by steps S111 to S115, these pixel values f2(x, y, z) can be merged to obtain the image information corresponding to the selected line-of-response subset.
According to steps S111 to S115, projection correction is carried out on each line-of-response group in the selected line-of-response subset to obtain the corresponding corrected image information, and in step S130 the calculation results corresponding to the line-of-response groups are merged.
Step S130: update the initial three-dimensional image according to the results of the predetermined registration operation corresponding to the plurality of line-of-response groups of the selected line-of-response subset.
Particularly, in an embodiment of the present invention, in step S130 the pixel values f2(x, y, z) of each pixel, in the first coordinate system, of the third pixel groups corresponding to the plurality of line-of-response groups of the selected line-of-response subset are merged by superposition.
Particularly, in an embodiment of the present invention, in step S130 updating the pixel values of the initial three-dimensional image according to the superposed and merged result can comprise: multiplying the pixel values of the initial three-dimensional image by the superposed and merged result divided by the value of a normalization result, and replacing the pixel values of the initial three-dimensional image with this result. The normalization result is the result obtained by setting the correction coefficient R(s, t, z) corresponding to the selected line-of-response group to 1 and then carrying out the back-projection calculation and the superposition-and-merging calculation.
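The multiplicative update rule can be sketched as follows; the flat-list representation of the image and the `eps` guard are assumptions made for the example.

```python
def update_image(f_old, merged, norm, eps=1e-12):
    # New pixel value = old pixel value * (superposed merged result /
    # normalization result), elementwise over corresponding pixels.
    return [f * s / max(n, eps) for f, s, n in zip(f_old, merged, norm)]
```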
Step S140: judge whether a stopping rule is satisfied.
In an embodiment of the present invention, the stopping rule can be that all of the plurality of line-of-response subsets have been selected.
In other embodiments of the invention, the stopping rule can also be that the updated initial three-dimensional image reaches a predetermined threshold value, specifically that the result of multiplying the pixel values of the initial three-dimensional image by the superposed and merged result divided by the value of the normalization result reaches a predetermined threshold value.
Step S150: when the stopping rule is satisfied, output the pixel values of the updated initial three-dimensional image.
When the stopping rule is not satisfied, return to step S110, select the next line-of-response subset, and continue the execution.
Embodiments of the invention also provide a PET three-dimensional reconstruction apparatus. Please refer to Fig. 9; the PET three-dimensional reconstruction apparatus 200 comprises:
an initialization unit 210, configured to initialize a three-dimensional image such that the pixel value of each pixel on the three-dimensional matrix of the initial three-dimensional image in the first coordinate system (X, Y, Z) is f1(x, y, z);
a line-of-response subset selection unit 220, configured to select a line-of-response subset from the plurality of line-of-response subsets obtained from the detector, each line-of-response subset comprising a plurality of line-of-response groups, and each line-of-response group comprising a plurality of mutually parallel lines of response;
a line-of-response group processing unit 230, configured to select line-of-response groups from the selected line-of-response subset successively and to carry out the predetermined registration operation on each selected line-of-response group respectively;
an image update unit 240, configured to update the initial three-dimensional image according to the results of the predetermined registration operation corresponding to the plurality of line-of-response groups of the selected line-of-response subset; and
a control unit 250, configured to judge whether the stopping rule is satisfied, to terminate the iteration when the stopping rule is satisfied, and otherwise to continue the iterative computation.
Particularly, please refer to Fig. 10. In an embodiment of the present invention, the line-of-response group processing unit 230 can comprise:
a first coordinate system conversion subunit 231, configured to obtain, according to the pixel values f1(x, y, z) of the pixels in the first coordinate system (X, Y, Z), the pixel value g1(s, t, z) of each pixel of the first pixel group, located in the second coordinate system (S, T, Z), corresponding to the selected line-of-response group, the pixels of the first pixel group being located on the plurality of response lines corresponding to the selected line-of-response group;
an orthogonal projection subunit 232, configured to carry out orthogonal projection calculation according to the pixel value g1(s, t, z) of each pixel of the first pixel group located in the second coordinate system corresponding to the selected line-of-response group, to obtain the orthogonal projection value p1(s, t, z) of each subpoint corresponding to the selected line-of-response group;
a correction coefficient obtaining subunit 233, configured to compare the orthogonal projection value p1(s, t, z) of each subpoint corresponding to the selected line-of-response group with the measurement projection value p0(s, t, z) to obtain the correction coefficient R(s, t, z), the measurement projection value p0(s, t, z) being calculated from the line-of-response data acquired by the detector;
a back projection subunit 234, configured to carry out back-projection calculation according to the correction coefficient R(s, t, z), to obtain the pixel value g2(s, t, z) of each pixel of the second pixel group located in the second coordinate system corresponding to the selected line-of-response group; and
a second coordinate system conversion subunit 235, configured to obtain, according to the pixel value g2(s, t, z) of each pixel of the second pixel group located in the second coordinate system corresponding to the selected line-of-response group, the pixel value f2(x, y, z) of each pixel of the third pixel group located in the first coordinate system corresponding to the selected line-of-response group.
In an embodiment of the present invention, the direction of the z axis of the first coordinate system (X, Y, Z) is the axial direction of the detector; the direction of the z axis of the second coordinate system (S, T, Z) is likewise the axial direction of the detector, the direction of the t axis of the second coordinate system is the line-of-response direction corresponding to the selected line-of-response group, and the direction of the s axis is perpendicular to the line-of-response direction corresponding to the selected line-of-response group.
One of ordinary skill in the art will appreciate that all or part of the steps in the various methods of the above embodiments can be completed by a program instructing the relevant hardware; the program can be stored in a computer-readable storage medium, and the storage medium can comprise a ROM, a RAM, a magnetic disk, an optical disc, or the like.
Although the present invention is disclosed as above, the present invention is not limited thereto. Any person skilled in the art can make various changes and modifications without departing from the spirit and scope of the present invention, so the protection scope of the present invention shall be defined by the scope of the claims.