CN104183012A - PET (Positron Emission Tomography) three-dimensional image reconstruction method and device - Google Patents

PET (Positron Emission Tomography) three-dimensional image reconstruction method and device

Info

Publication number: CN104183012A (granted as CN104183012B)
Application number: CN201310534444.9A
Authority: CN (China)
Prior art keywords: pixel, coordinate system, line of response, value
Legal status: Granted; Active
Other languages: Chinese (zh)
Inventors: 吕杨, 丁喻, 董筠
Assignee (current and original): Shanghai United Imaging Healthcare Co Ltd

Landscapes

  • Image Processing (AREA)

Abstract

The invention provides a PET (Positron Emission Tomography) three-dimensional image reconstruction method and device. The method comprises the following steps: initializing a three-dimensional image so that the pixel value of each pixel of the initialized three-dimensional image located on a three-dimensional matrix in a first coordinate system (X, Y, Z) is f1(x, y, z); selecting one line-of-response subset from a plurality of line-of-response subsets obtained from a detector; successively selecting line-of-response groups from the selected line-of-response subset and independently performing a preset operation on each selected line-of-response group, wherein the preset operation comprises the step of obtaining, according to the pixel values f1(x, y, z) of the pixels in the first coordinate system (X, Y, Z), the pixel value g1(s, t, z) of each pixel of a first pixel group which is located in a second coordinate system (S, T, Z) and corresponds to the selected line-of-response group, the pixels of the first pixel group being located on the plurality of lines of response corresponding to the selected line-of-response group. The efficiency of three-dimensional image reconstruction can thereby be improved.

Description

PET three-dimensional image reconstruction method and apparatus
Technical field
The present invention relates to the field of medical image processing technology, and in particular to a PET three-dimensional image reconstruction method and apparatus.
Background technology
The ordered subset expectation maximization method (Ordered Subset Expectation Maximization, OSEM) is currently the main iterative three-dimensional image reconstruction algorithm in clinical PET. This algorithm divides the projection data into n subsets; during each reconstruction update, only one subset is used to correct the projection data and the reconstructed image is updated once. When all subsets have each corrected the projection data once, this is called one iteration. When the termination rule is met, the iteration terminates; otherwise the new reconstructed image is used as the initial image for the next iteration.
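For illustration only, the iteration structure just described can be written as the following sketch; the projector interface (forward_project, back_project, subset geometry) and array types are assumed placeholders, not the patent's implementation:

```python
import numpy as np

def osem(initial_image, subsets, forward_project, back_project, n_iterations=1):
    """Generic OSEM loop.  `subsets` is a list of (measured_projections, subset_geometry)
    pairs; forward_project / back_project stand in for the system model."""
    image = np.array(initial_image, dtype=float)
    for _ in range(n_iterations):                      # one iteration = every subset used once
        for measured, geometry in subsets:             # each subset corrects the data once
            estimated = forward_project(image, geometry)
            ratio = measured / np.maximum(estimated, 1e-12)        # correction coefficients
            sensitivity = back_project(np.ones_like(ratio), geometry)
            image *= back_project(ratio, geometry) / np.maximum(sensitivity, 1e-12)
    return image
```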
However, the defect of the OSEM algorithm is that the amount of calculation is large and the computation speed is slow. In particular, every time the image is updated, the correction coefficient of every line of response for every pixel needs to be calculated. When the reconstructed image is small and the number of lines of response in the system is small (as in two-dimensional reconstruction), the correction coefficients of all lines of response for all pixels can be precomputed and then looked up in a table during reconstruction; but for large reconstructions (as in three-dimensional reconstruction), the reconstructed image is very large and the system contains a massive number of lines of response, so existing data storage media cannot meet the demand, and the correction coefficient of every line of response for every pixel can only be calculated in real time during reconstruction.
Therefore, a new PET three-dimensional image reconstruction method and apparatus need to be proposed to improve the efficiency of three-dimensional image reconstruction.
Summary of the invention
The technical problem solved by the present invention is to provide a new PET three-dimensional image reconstruction method and apparatus, so as to improve the efficiency of three-dimensional reconstruction.
Embodiments of the invention provide a PET three-dimensional image reconstruction method, the method comprising:
1) initializing a three-dimensional image so that the pixel value of each pixel of the initial three-dimensional image located on a three-dimensional matrix in a first coordinate system (X, Y, Z) is f1(x, y, z);
2) selecting one line-of-response subset from a plurality of line-of-response subsets obtained from a detector, wherein each line-of-response subset comprises a plurality of line-of-response groups and each line-of-response group comprises a plurality of mutually parallel lines of response;
3) successively selecting line-of-response groups from the selected line-of-response subset and performing a preset operation on each selected line-of-response group respectively, wherein the preset operation comprises:
3.1) according to the pixel values f1(x, y, z) of the pixels in the first coordinate system (X, Y, Z), obtaining the pixel value g1(s, t, z) of each pixel of a first pixel group which corresponds to the selected line-of-response group and is located in a second coordinate system (S, T, Z), the pixels of the first pixel group being located on the plurality of lines of response corresponding to the selected line-of-response group;
3.2) performing a forward projection calculation according to the pixel values g1(s, t, z) of the pixels of the first pixel group located in the second coordinate system and corresponding to the selected line-of-response group, so as to obtain the forward projection value p1(s, t, z) of each projection point corresponding to the selected line-of-response group;
3.3) comparing the forward projection value p1(s, t, z) of each projection point corresponding to the selected line-of-response group with the measurement projection value p0(s, t, z) to obtain a correction coefficient R(s, t, z), the measurement projection value p0(s, t, z) being calculated according to the line-of-response data acquired by the detector;
3.4) performing a back projection calculation according to the correction coefficient R(s, t, z) to obtain the pixel value g2(s, t, z) of each pixel of the first pixel group located in the second coordinate system and corresponding to the selected line-of-response group;
3.5) according to the pixel values g2(s, t, z) of the pixels of the first pixel group located in the second coordinate system and corresponding to the selected line-of-response group, obtaining the pixel value f2(x, y, z) of each pixel of a second pixel group located in the first coordinate system and corresponding to the selected line-of-response group;
4) updating the initial three-dimensional image according to the results of the preset operations corresponding to the plurality of line-of-response groups of the selected line-of-response subset;
5) terminating the iteration when the termination rule is met; otherwise continuing from step 2) with the updated initial three-dimensional image.
Optionally, in step 1) the direction of the z axis of the first coordinate system (X, Y, Z) is the axial direction of the detector; in step 3.2) the z-axis direction of the second coordinate system (S, T, Z) is the axial direction of the detector, the t-axis direction of the second coordinate system is the line-of-response direction corresponding to the selected line-of-response group, and the s-axis direction is perpendicular to the line-of-response direction corresponding to the selected line-of-response group.
Optionally, the method further comprises: before performing the forward projection calculation, applying a forward point spread function in the s-t plane to the pixel value g1(s, t, z) of each pixel of the first pixel group located in the second coordinate system and corresponding to the selected line-of-response group; and after performing the back projection calculation, applying a reverse point spread function in the s-t plane to the pixel value g2(s, t, z) of each pixel of the first pixel group located in the second coordinate system and corresponding to the selected line-of-response group.
Optionally, the method further comprises: before performing the forward projection calculation, applying a forward point spread function along the direction of the s axis to the pixel value g1(s, t, z) of each pixel of the first pixel group located in the second coordinate system and corresponding to the selected line-of-response group; and after performing the back projection calculation, applying a reverse point spread function along the direction of the s axis to the pixel value g2(s, t, z) of each pixel of the first pixel group located in the second coordinate system and corresponding to the selected line-of-response group.
Optionally, the forward projection calculation in step 3.2) comprises: within a first preset range, performing a weighted summation of the pixel values of the pixels relevant to a projection point; and the back projection calculation in step 3.4) comprises: within a second preset range, performing a weighted summation of the projection values of the projection points relevant to a pixel.
Optionally, the first preset range is determined according to the positional relationship between the projection point and the end points of the section, parallel to the t-z plane, of the detector ring of the detector at which the projection point is located; and the second preset range is determined according to the positional relationship between the pixel and the end points of the section, parallel to the t-z plane, of the detector ring of the detector at which the pixel is located.
Optionally, the weight coefficient corresponding to each pixel relevant to the projection point is determined according to the distance between the pixel and the projection point; and the weight coefficient corresponding to each projection point relevant to the pixel is determined according to the distance between the projection point and the pixel.
Optionally, before the forward projection calculation, the method further comprises: merging the pixel values of every m pixels, distributed along the t-axis direction, of the first pixel group located in the second coordinate system and corresponding to the selected line-of-response group, wherein m is an integer greater than 1; and in step 3.3) the method of obtaining the measurement projection values further comprises: merging the measurement projection values of every m projection points, along the t axis in the second coordinate system, of the plurality of lines of response corresponding to the selected line-of-response group.
Optionally, the termination rule is: all of the plurality of line-of-response subsets have been selected.
Optionally, the termination rule is: the updated initial three-dimensional image reaches a predetermined threshold value.
Optionally, in step 3.1) the method of obtaining the pixel value g1(s, t, z) of each pixel of the first pixel group located in the second coordinate system and corresponding to the selected line-of-response group, according to the pixel values f1(x, y, z) of the pixels in the first coordinate system, comprises:
a1) for a pixel (s1, t1, z1) of the first pixel group located in the second coordinate system and corresponding to any selected line-of-response group, finding the coordinates (x1, y1, z1) of the corresponding pixel in the first coordinate system;
a2) in the first coordinate system, obtaining the pixel value of the pixel (x1, y1, z1) from the pixel values f1(x, y, z) of the pixels adjacent to the pixel (x1, y1, z1), the pixel value g1(s1, t1, z1) of the pixel (s1, t1, z1) of the first pixel group located in the second coordinate system and corresponding to the selected line-of-response group being identical to the pixel value of the pixel (x1, y1, z1) located in the first coordinate system.
Optionally, step a1) comprises: calculating, according to the angle by which the first coordinate system is rotated to the second coordinate system, the coordinates (x1, y1, z1) of the pixel in the first coordinate system corresponding to any pixel (s1, t1, z1) of the first pixel group located in the second coordinate system and corresponding to the selected line-of-response group.
Optionally, in step a2) the method of obtaining the pixel value of the pixel (x1, y1, z1) from the pixel values of the pixels adjacent to the pixel (x1, y1, z1) comprises: performing bilinear interpolation with the four pixels adjacent to the pixel (x1, y1, z1) to obtain the pixel value of the pixel (x1, y1, z1).
Optionally, in step 3.5) the method of obtaining the pixel value f2(x, y, z) of each pixel of the second pixel group located in the first coordinate system and corresponding to the selected line-of-response group, according to the pixel values g2(s, t, z) of the pixels of the first pixel group located in the second coordinate system and corresponding to the selected line-of-response group, comprises:
b1) for a pixel (x2, y2, z2) of the second pixel group located in the first coordinate system and corresponding to any selected line-of-response group, finding the coordinates (s2, t2, z2) of the corresponding pixel in the second coordinate system;
b2) in the second coordinate system, obtaining the pixel value of the pixel (s2, t2, z2) from the pixel values g2(s, t, z) of the adjacent pixels of the first pixel group, the pixel value f2(x2, y2, z2) of the pixel (x2, y2, z2) of the second pixel group located in the first coordinate system and corresponding to the selected line-of-response group being identical to the pixel value of the pixel (s2, t2, z2) located in the second coordinate system.
Optionally, step b1) comprises: calculating, according to the angle by which the second coordinate system is rotated to the first coordinate system, the coordinates (s2, t2, z2) of the pixel in the second coordinate system corresponding to any pixel (x2, y2, z2) of the second pixel group located in the first coordinate system and corresponding to the selected line-of-response group.
Optionally, in step b2) the method of obtaining the pixel value of the pixel (s2, t2, z2) from the pixel values g2(s, t, z) of the adjacent pixels of the first pixel group comprises: performing bilinear interpolation with the four adjacent pixels to obtain the pixel value of the pixel (s2, t2, z2).
Embodiments of the invention also provide a PET three-dimensional image reconstruction device, the device comprising:
an initialization unit, for initializing a three-dimensional image so that the pixel value of each pixel of the initial three-dimensional image located on a three-dimensional matrix in a first coordinate system (X, Y, Z) is f1(x, y, z);
a line-of-response subset selection unit, for selecting one line-of-response subset from a plurality of line-of-response subsets obtained from a detector, wherein each line-of-response subset comprises a plurality of line-of-response groups and each line-of-response group comprises a plurality of mutually parallel lines of response;
a line-of-response group processing unit, for successively selecting line-of-response groups from the selected line-of-response subset and performing a preset operation on each selected line-of-response group respectively.
The line-of-response group processing unit comprises:
a first coordinate-system conversion subunit, for obtaining, according to the pixel values f1(x, y, z) of the pixels in the first coordinate system (X, Y, Z), the pixel value g1(s, t, z) of each pixel of a first pixel group which corresponds to the selected line-of-response group and is located in a second coordinate system (S, T, Z), the pixels of the first pixel group being located on the plurality of lines of response corresponding to the selected line-of-response group;
a forward projection subunit, for performing a forward projection calculation according to the pixel values g1(s, t, z) of the pixels of the first pixel group located in the second coordinate system and corresponding to the selected line-of-response group, so as to obtain the forward projection value p1(s, t, z) of each projection point corresponding to the selected line-of-response group;
a correction coefficient obtaining subunit, for comparing the forward projection value p1(s, t, z) of each projection point corresponding to the selected line-of-response group with the measurement projection value p0(s, t, z) to obtain a correction coefficient R(s, t, z), the measurement projection value p0(s, t, z) being calculated according to the line-of-response data acquired by the detector;
a back projection subunit, for performing a back projection calculation according to the correction coefficient R(s, t, z) to obtain the pixel value g2(s, t, z) of each pixel of the first pixel group located in the second coordinate system and corresponding to the selected line-of-response group;
a second coordinate-system conversion subunit, for obtaining, according to the pixel values g2(s, t, z) of the pixels of the first pixel group located in the second coordinate system and corresponding to the selected line-of-response group, the pixel value f2(x, y, z) of each pixel of a second pixel group located in the first coordinate system and corresponding to the selected line-of-response group;
an image update unit, for updating the initial three-dimensional image according to the results of the preset operations corresponding to the plurality of line-of-response groups of the selected line-of-response subset;
a control unit, for judging whether the termination rule is met, terminating the iteration when the termination rule is met, and otherwise continuing the iterative computation.
Compared with the prior art, the above technical scheme has the following advantage: by converting the coordinate system, one group of pixels in the reconstructed image is aligned with the lines of response at a certain angle, so that the three-dimensional ray-tracing algorithm can be simplified into a two-dimensional ray-tracing algorithm, thereby greatly reducing the amount of computation.
Brief description of the drawings
Fig. 1 is a flow chart of the PET three-dimensional image reconstruction method in an embodiment of the present invention;
Fig. 2 is a schematic diagram of the first coordinate system and the initial three-dimensional image in an embodiment of the present invention;
Fig. 3 is a schematic diagram of lines of response in an embodiment of the present invention;
Fig. 4 is a flow chart of the preset operation in an embodiment of the present invention;
Fig. 5 is a comparison diagram of the first coordinate system and the second coordinate system in an embodiment of the present invention;
Fig. 6 is a schematic diagram of the first preset range in an embodiment of the present invention;
Fig. 7 is a flow chart of obtaining the correction coefficient in an embodiment of the present invention;
Fig. 8 is a comparison diagram of the second pixel group in the second coordinate system and the third pixel group in the first coordinate system in an embodiment of the present invention;
Fig. 9 is a schematic structural diagram of the PET three-dimensional image reconstruction device in one embodiment of the present invention;
Fig. 10 is a schematic structural diagram of the line-of-response group processing unit of the PET three-dimensional image reconstruction device in one embodiment of the present invention.
Detailed description of the embodiments
In order to make the above objects, features and advantages of the present invention more apparent, specific embodiments of the present invention are described in detail below with reference to the accompanying drawings.
Many specific details are set forth in the following description for a full understanding of the present invention, but the present invention can also be implemented in other ways different from those described here, so the present invention is not limited by the specific embodiments disclosed below.
Fig. 1 is a flow chart of the PET three-dimensional image reconstruction method in an embodiment of the present invention.
Referring to Fig. 1, the PET three-dimensional image reconstruction method comprises steps S100 to S150.
Step S100: initialize a three-dimensional image so that the pixel value of each pixel of the initial three-dimensional image located on a three-dimensional matrix in a first coordinate system (X, Y, Z) is f1(x, y, z).
Fig. 2 is a schematic diagram of the first coordinate system in an embodiment of the present invention. Referring to Fig. 2, the direction of the z axis of the first coordinate system is the axial direction of the detector (the circle centered at point O in Fig. 2 is the cross-section of the detector, and the axial direction of the detector is perpendicular to this cross-section).
The plane defined by the x axis and the y axis of the first coordinate system is parallel to the cross-section of the detector. The detector is a cylindrical structure comprising a plurality of detector rings, and the thickness of each detector ring is H.
It should be noted that, in embodiments of the present invention, the three-dimensional matrix in the first coordinate system can be a voxel lattice comprising N*N*N pixels; for simplicity of description, only one plane of the three-dimensional matrix is shown in Fig. 2.
The detector can acquire a large number of lines of response (a large amount of line-of-response information is obtained in the actual measurement process; only 5 lines of response are shown in Fig. 3 as a simple illustration). These lines of response are divided into different line-of-response subsets, wherein each line-of-response subset comprises a plurality of line-of-response groups and each line-of-response group comprises a plurality of mutually parallel lines of response.
For example, one line-of-response subset comprises 12 line-of-response groups, and the angles between the line-of-response extension directions corresponding to these 12 line-of-response groups and the y axis are respectively 0 degrees, 40 degrees, 80 degrees, ..., 320 degrees and 360 degrees.
As another example, another line-of-response subset comprises 11 line-of-response groups, and the angles between the line-of-response extension directions corresponding to these 11 line-of-response groups and the y axis are respectively 10 degrees, 50 degrees, 90 degrees, ..., 290 degrees and 330 degrees.
As mentioned above, each line-of-response group comprises a plurality of mutually parallel lines of response. For example, when the angle between the line-of-response direction corresponding to a line-of-response group and the y axis is 20 degrees, the line-of-response group comprises all mutually parallel lines of response whose angle with the y axis is 20 degrees.
For example, in Fig. 2 the lines of response L1, L2 and L3 are parallel to each other, so L1, L2 and L3 belong to the same line-of-response group, while the mutually parallel lines of response L4 and L5 belong to another line-of-response group.
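As a rough illustration of this grouping, parallel lines of response can be gathered by angle and the angle groups spread across ordered subsets; the list-of-tuples input and the interleaving scheme below are assumptions for illustration only:

```python
from collections import defaultdict

def split_into_subsets(lors, n_subsets):
    """lors: iterable of (angle_deg, lor_record); angles assumed already discretized.
    Returns n_subsets lists, each holding (angle, parallel LORs at that angle) groups."""
    groups = defaultdict(list)                  # one line-of-response group per angle
    for angle, lor in lors:
        groups[angle].append(lor)
    subsets = [[] for _ in range(n_subsets)]
    for i, angle in enumerate(sorted(groups)):  # spread the angle groups across the subsets
        subsets[i % n_subsets].append((angle, groups[angle]))
    return subsets
```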
Step S110: select one line-of-response subset from the plurality of line-of-response subsets obtained from the detector.
In embodiments of the present invention, each time a line-of-response subset is selected and the computation is performed according to the selected line-of-response subset, one iterative computation is completed.
Step S120: successively select line-of-response groups from the selected line-of-response subset, and perform the preset operation on each selected line-of-response group respectively.
In embodiments of the present invention, to obtain the image merging result of the selected line-of-response subset, the plurality of line-of-response groups in the selected line-of-response subset need to be selected in turn and each subjected to projection correction; the corrected image information corresponding to each line-of-response group is obtained from the corrected projection data, and the image information is merged according to the corrected image information corresponding to the plurality of line-of-response groups.
Fig. 4 is a flow chart of the preset operation in an embodiment of the present invention; the preset operation comprises steps S111 to S115.
The preset operation process corresponding to one selected line-of-response group in the selected line-of-response subset is described in detail below with reference to steps S111 to S115.
Step S111: according to the pixel values f1(x, y, z) of the pixels of the initial three-dimensional image located in the first coordinate system (X, Y, Z), obtain the pixel value g1(s, t, z) of each pixel of the first pixel group, corresponding to the selected line-of-response group, in the second coordinate system (S, T, Z).
Step S111 is described in detail below with reference to Fig. 5.
The left side of Fig. 5 is a schematic diagram of the pixels in the first coordinate system, and the right side of Fig. 5 is a schematic diagram of the first pixel group in the second coordinate system.
The pixels of the initial three-dimensional image are the dots in the rectangle on the left side of Fig. 5; the distance between adjacent pixels is d, and the pixel value corresponding to each pixel is f1(x, y, z).
Through research, the inventors found that, in the schematic diagram on the left side of Fig. 5, the pixels of the initial three-dimensional image are not located on the lines of response corresponding to the selected line-of-response group; if the correction coefficient of every line of response for every pixel is calculated by the prior art, a considerable time cost is consumed.
For this reason, the inventors propose rotating the first coordinate system to produce the second coordinate system, and calculating the pixel values of the first pixel group from the pixel values of the pixels of the initial three-dimensional image. If the angle between the line of response L and the y axis is φ, the first coordinate system needs to be rotated by the angle φ.
The first pixel group is located on the plurality of lines of response corresponding to the selected line-of-response group. Therefore, by rotating the coordinate system, one group of pixels in the reconstructed image can be aligned with the lines of response at a certain angle, so that the three-dimensional ray-tracing algorithm can be simplified into a two-dimensional ray-tracing algorithm, thereby greatly reducing the amount of computation.
As shown in Fig. 5, the s-axis direction of the rotated second coordinate system is perpendicular to the line-of-response direction corresponding to the selected line-of-response group, the t-axis direction of the second coordinate system is the line-of-response direction corresponding to the selected line-of-response group, and the direction of the z axis of the second coordinate system is the axial direction of the detector. That is, the t-axis direction of the second coordinate system is consistent with the extension direction of the line of response L.
The pixels of the first pixel group are distributed on the plurality of parallel lines of response L corresponding to the selected line-of-response group. The pixel value of a pixel of the first pixel group is g1(s, t, z), which can be obtained from the pixel values f1(x, y, z) of the pixels of the initial three-dimensional image located in the first coordinate system (X, Y, Z).
Specifically, in embodiments of the present invention, the coordinates in the first coordinate system corresponding to a pixel of the first pixel group can first be obtained, and the pixel value of the pixel of the first pixel group is then determined according to the pixel values of the pixels at the obtained coordinates.
The pixel A of the first pixel group in Fig. 5 is taken as an example below.
Referring to Fig. 5, the coordinates of the pixel A in the second coordinate system are (s1, t1, z1); from the angle φ by which the first coordinate system is rotated to the second coordinate system, the coordinates (x1, y1, z1) of the pixel A in the first coordinate system can be calculated.
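Assuming the second coordinate system is obtained by rotating the first coordinate system about the shared z axis by the angle φ (the sign convention is an assumption here), the relation would read:

```latex
x_1 = s_1\cos\varphi - t_1\sin\varphi, \qquad
y_1 = s_1\sin\varphi + t_1\cos\varphi, \qquad
z_1 \text{ unchanged}.
```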
Then, according to the coordinates (x1, y1, z1), the pixels adjacent to the coordinates (x1, y1, z1) in the first coordinate system can be found, and the pixel value of the pixel located at the coordinates (x1, y1, z1) can be determined according to the pixel values of these adjacent pixels; the pixel value g1(s, t, z) of the pixel A is equal to the pixel value of the pixel (x1, y1, z1).
Specifically, in embodiments of the present invention, bilinear interpolation can be performed according to the pixel values of the four pixels adjacent to the coordinates (x1, y1, z1) in the first coordinate system to determine the pixel value g1(s, t, z) of the pixel A in the second coordinate system.
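A minimal NumPy sketch of this rotate-and-resample step for a single z-slice; the rotation convention, the choice of the image centre as rotation centre, and the handling of points outside the image are assumptions:

```python
import numpy as np

def resample_to_rotated_grid(f1_slice, s_coords, t_coords, phi, d=1.0):
    """Bilinear interpolation of one z-slice f1(x, y) onto the rotated (s, t) grid."""
    s, t = np.meshgrid(s_coords, t_coords, indexing="ij")
    # coordinates of the rotated-grid points expressed in the first coordinate system
    x = s * np.cos(phi) - t * np.sin(phi)
    y = s * np.sin(phi) + t * np.cos(phi)
    # convert physical coordinates (pixel pitch d) to fractional indices about the image centre
    cx = (f1_slice.shape[0] - 1) / 2.0
    cy = (f1_slice.shape[1] - 1) / 2.0
    xi, yi = x / d + cx, y / d + cy
    x0 = np.clip(np.floor(xi).astype(int), 0, f1_slice.shape[0] - 2)
    y0 = np.clip(np.floor(yi).astype(int), 0, f1_slice.shape[1] - 2)
    wx, wy = xi - x0, yi - y0
    # bilinear interpolation from the four neighbouring pixels
    g1 = ((1 - wx) * (1 - wy) * f1_slice[x0, y0]
          + wx * (1 - wy) * f1_slice[x0 + 1, y0]
          + (1 - wx) * wy * f1_slice[x0, y0 + 1]
          + wx * wy * f1_slice[x0 + 1, y0 + 1])
    return g1
```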
After the pixel values g1(s, t, z) of the pixels of the first pixel group in the second coordinate system have been obtained through step S111, a forward projection calculation can be performed according to the pixel values g1(s, t, z).
In embodiments of the present invention, in order to further reduce the amount of calculation and improve the efficiency of three-dimensional image reconstruction, the pixels of the first pixel group can also be merged before the forward projection calculation.
Specifically, the pixel values of every m pixels of the first pixel group along the t axis in the second coordinate system corresponding to the selected line-of-response group are merged; that is, m pixels are combined into one pixel, and the pixel value of the newly combined pixel is the sum of the pixel values of the m pixels. Here m = 0.15Δt/d, where Δt is the time interval at which the time of flight is recorded in the time-of-flight information processing system, and d is the distance between pixels.
After the pixels are merged, the image size is unchanged, the distance between pixels in the s-axis direction is unchanged, and the distance in the t-axis direction becomes md. Obviously, after the above merging operation, the number of pixels that need to be calculated is reduced by a factor of m, which significantly improves the image reconstruction efficiency.
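A sketch of this merging of m neighbouring pixels along the t axis (NumPy; the (s, t, z) axis layout is an assumption, and a t dimension that is not a multiple of m is simply truncated here):

```python
import numpy as np

def merge_along_t(g1, m):
    """Sum groups of m neighbouring pixels along the t axis (axis 1 by assumption)."""
    s_dim, t_dim, z_dim = g1.shape
    t_merged = t_dim // m
    return g1[:, :t_merged * m, :].reshape(s_dim, t_merged, m, z_dim).sum(axis=2)
```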
In one embodiment of the present invention, before the forward projection calculation, a forward point spread function can also be applied, in the s-t plane, to the pixel value g1(s, t, z) of each pixel of the first pixel group in the second coordinate system (S, T, Z) corresponding to the selected line-of-response group.
In another embodiment of the present invention, before the forward projection calculation, a forward point spread function is applied, along the direction of the s axis, to the pixel value g1(s, t, z) of each pixel of the first pixel group in the second coordinate system (S, T, Z) corresponding to the selected line-of-response group.
Specifically, in embodiments of the present invention, the point spread function can be generated by simulation with the nuclear medicine simulation software GATE. This software can simulate, under realistic conditions, the whole physical process from positron annihilation to the photons being absorbed by the detector, and record intermediate results.
Specifically, according to the distance between the positron annihilation position and the line of response, coincidence events falling within the same interval are merged at a certain interval (such as the crystal size); the number of coincidence events in each interval and the total number of coincidence events under the same Δt are counted, and the two are divided to obtain the normalized point spread function, where the angle φ and the number s determine the position of the line of response, the number T determines the time-of-flight difference, and the number I determines the distance of the pixel perpendicular to the line of response.
After the forward point spread function is applied, along the direction of the s axis, to the pixel value g1(s, t, z) of each pixel of the first pixel group in the second coordinate system (S, T, Z) corresponding to the selected line-of-response group, the pixel value g1'(s, t, z) of each pixel of the first pixel group in the second coordinate system (S, T, Z) is obtained.
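If the forward application along s amounts to a discrete convolution with the normalized point spread function (written here as PSF(φ, s, T, i); both the notation and the convolution form are assumptions), the calculation would read:

```latex
g_1'(s, t, z) \;=\; \sum_{i} \mathrm{PSF}(\varphi, s, T, i)\, g_1(s - i, t, z)
```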
It should be noted that, in embodiments of the present invention, applying a two-dimensional point spread function is in fact a simplified form of the three-dimensional point spread function; compared with a three-dimensional point spread function, the amount of computation is reduced and the efficiency of three-dimensional image reconstruction is improved.
Step S112: perform a forward projection calculation according to the pixel values g1(s, t, z) of the pixels of the first pixel group in the second coordinate system (S, T, Z) corresponding to the selected line-of-response group, so as to obtain the projection value p1(s, t, z) of each projection point, in the corresponding second coordinate system (S, T, Z), of the first pixel group corresponding to the selected line-of-response group.
Specifically, in embodiments of the present invention, the forward projection calculation comprises: within a first preset range, performing a weighted summation of the pixel values of the pixels relevant to a projection point, the first preset range being determined according to the positional relationship between the projection point and the end points of the section, parallel to the t-z plane, of the detector ring at which the projection point is located.
The projection point B in the second coordinate system shown in Fig. 6 is taken as an example below.
Referring to Fig. 6, the projection point B is located in the profile PQRS obtained by cutting the detector ring along the line C1C2; the length of the profile PQRS is the length of C1C2, and the width of the profile PQRS is the height H of the detector ring. The projection value of the projection point B is related to the pixels in the triangular region PSB and the triangular region RBQ (the pixels in the triangular regions PSB and RBQ contribute to the projection value of the projection point B), so the projection value of the projection point B is the weighted sum of the pixel values of the pixels in the triangular regions PSB and RBQ.
In embodiments of the present invention, the weight coefficient corresponding to each pixel relevant to the projection point is determined according to the distance between the pixel and the projection point, and the weight coefficient corresponding to each projection point relevant to a pixel is determined according to the distance between the projection point and the pixel.
For example, if the distance between a pixel and the projection point is L1, the weight coefficient w1(s, t, z) corresponding to this pixel is calculated as:
w_1(s,t,z) = \frac{1}{c\sqrt{2\pi}} \exp\!\left(-\frac{L_1^2}{2c^2}\right), \qquad c = \frac{\mathrm{FWHM}}{2\sqrt{2\ln 2}}.
Here FWHM is the full width at half maximum of the Gaussian function, and its value is the product of the temporal resolution of the system and the speed of light.
Then, within the first preset range, the value p1(s, t, z) obtained by the weighted summation of the pixel values of the pixels relevant to the projection point is calculated as:
p1(s, t, z) = ∑ g1(s, t, z) · w1(s, t, z).
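A sketch of this weighted forward projection; the flat-array interface is illustrative, and the triangular first preset range of Fig. 6 is approximated here by a simple 3σ cutoff of the Gaussian weight (an assumption, not the patent's range):

```python
import numpy as np

def forward_project_group(g1, pix_coords, proj_coords, fwhm):
    """g1: pixel values of the first pixel group (1-D array);
    pix_coords / proj_coords: (n, 3) arrays of (s, t, z) coordinates."""
    c = fwhm / (2.0 * np.sqrt(2.0 * np.log(2.0)))
    p1 = np.zeros(len(proj_coords))
    for k, q in enumerate(proj_coords):
        dist = np.linalg.norm(pix_coords - q, axis=1)     # distance pixel -> projection point
        w1 = np.exp(-dist**2 / (2.0 * c**2)) / (c * np.sqrt(2.0 * np.pi))
        near = dist < 3.0 * c                             # stand-in for the first preset range
        p1[k] = np.sum(g1[near] * w1[near])
    return p1
```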
Step S113: compare the forward projection value p1(s, t, z) of each projection point corresponding to the selected line-of-response group with the measurement projection value p0(s, t, z) to obtain the correction coefficient R(s, t, z), the measurement projection value p0(s, t, z) being calculated according to the line-of-response data acquired by the detector.
Referring to Fig. 7, in embodiments of the present invention, correcting the projection data of the first pixel group, in the second coordinate system, of the selected line-of-response group to obtain the correction coefficient comprises steps S113a and S113b.
Step S113a: according to the line-of-response data corresponding to the selected line-of-response group acquired by the detector, obtain the measurement projection value p0(s, t, z) of each projection point, in the second coordinate system, of the plurality of lines of response corresponding to the selected line-of-response group.
Specifically, the line-of-response data (ia, ib, ra, rb, Δt) in list mode can first be obtained, where (ia, ib) are the numbers, on the circumference of the detector ring, of the pair of crystals connected by the line of response, (ra, rb) are the numbers of this pair of crystals in the axial direction, and Δt is the time-of-flight interval recorded by the electronics. According to the temporal resolution t_res and the spatial resolution s_res of the system, the maximum sorting angle is calculated, generally 1 degree to 10 degrees, and [0, π) is divided into N parts accordingly. All coincidence events in list mode are then traversed, and according to the geometry of the system, (ia, ib, ra, rb, Δt) is converted into coordinates of the second coordinate system, where s is the distance from the line connecting the crystals of the coincidence event to the central axis, φ is the angle of the projection of this line on the sagittal plane, and z_mlp is the most probable position coordinate of the coincidence event in the axial direction, determined by the time-of-flight technique. z_mlp, x_mlp and y_mlp are calculated as:
z_{\mathrm{mlp}} = \frac{z_1 + z_2}{2} - \frac{c\,\Delta t}{2}\,\frac{z_2 - z_1}{d_{\mathrm{lor}}};
x_{\mathrm{mlp}} = \frac{x_1 + x_2}{2} - \frac{c\,\Delta t}{2}\,\frac{x_2 - x_1}{d_{\mathrm{lor}}};
y_{\mathrm{mlp}} = \frac{y_1 + y_2}{2} - \frac{c\,\Delta t}{2}\,\frac{y_2 - y_1}{d_{\mathrm{lor}}},
where d_{\mathrm{lor}} = \sqrt{(x_2 - x_1)^2 + (y_2 - y_1)^2 + (z_2 - z_1)^2}, (x1, y1, z1) and (x2, y2, z2) are the spatial coordinates of the pair of crystals that received the gamma photons, and c is the speed of light.
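These formulas translate directly into code; a small sketch (crystal coordinates assumed in millimetres and Δt in seconds, so the speed of light is given in mm/s):

```python
import math

C_MM_PER_S = 2.99792458e11  # speed of light in mm/s (units assumed)

def most_likely_point(crystal_a, crystal_b, dt):
    """Most likely annihilation position on the LOR joining two crystals, from the TOF difference dt."""
    (x1, y1, z1), (x2, y2, z2) = crystal_a, crystal_b
    d_lor = math.dist(crystal_a, crystal_b)
    shift = C_MM_PER_S * dt / (2.0 * d_lor)
    return ((x1 + x2) / 2.0 - shift * (x2 - x1),
            (y1 + y2) / 2.0 - shift * (y2 - y1),
            (z1 + z2) / 2.0 - shift * (z2 - z1))
```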
Through the above steps, the measurement projection value of each projection point of the lines of response in the second coordinate system can be obtained, i.e., the measurement projection value p0(s, t, z), in the second coordinate system, of each projection point of the plurality of lines of response corresponding to each line-of-response group in the selected line-of-response subset.
If, before the forward projection, the pixel values of every m pixels of the first pixel group along the t axis in the second coordinate system corresponding to the selected line-of-response group were merged, then correspondingly the measurement projection values of every m projection points along the t axis, in the corresponding second coordinate system, of the plurality of lines of response corresponding to the selected line-of-response group can be merged.
Specifically, m projection points are merged into one projection point, the projection value of the merged projection point is the sum of the projection values of the m projection points, and the distance between projection points after merging is m times that before merging.
Step S113b: obtain the ratio R(s, t, z) between the measurement projection value p0(s, t, z) of each projection point, in the second coordinate system, of the plurality of lines of response corresponding to the selected line-of-response group and the projection value p1(s, t, z) of each projection point, in the corresponding second coordinate system (S, T, Z), of the first pixel group corresponding to the selected line-of-response group.
Specifically, for each coordinate in the second coordinate system, the ratio between the measurement projection value and the projection value obtained by the forward projection calculation is calculated.
After the correction coefficients corresponding to the selected line-of-response group are obtained, step S114 is executed.
Step S114: perform a back projection calculation according to the correction coefficients R(s, t, z) corresponding to the selected line-of-response group, so as to obtain the pixel value g2(s, t, z) of each pixel, in the corresponding second coordinate system, of the second pixel group corresponding to the selected line-of-response group.
In embodiments of the present invention, the pixels of the second pixel group are distributed on the plurality of parallel lines of response corresponding to the selected line-of-response group, and the projection value of each projection point of the second pixel group in the second coordinate system corresponding to the selected line-of-response group is the correction coefficient value R(s, t, z).
It should be noted that the back projection calculation is the reverse process of the forward projection calculation: the forward projection calculates projection values from pixel values, while the back projection calculates pixel values from projection values.
Specifically, in embodiments of the present invention, the back projection calculation can comprise: within a second preset range, performing a weighted summation of the projection values of the projection points relevant to a pixel, the second preset range being determined according to the positional relationship between the pixel and the end points of the section, parallel to the t-z plane, of the detector ring at which the pixel is located.
The weight coefficient corresponding to each projection point relevant to the pixel is determined according to the distance between the projection point and the pixel.
Specifically, the determination of the second preset range and the calculation of the weight coefficients are similar to the determination of the first preset range of the projection point and the calculation of the weight coefficients corresponding to the pixels described above; please refer to the description of the forward projection calculation.
For example, if the distance between a projection point and the pixel is L2, the weight coefficient w2(s, t, z) corresponding to this projection point is calculated as:
w_2(s,t,z) = \frac{1}{c\sqrt{2\pi}} \exp\!\left(-\frac{L_2^2}{2c^2}\right), \qquad c = \frac{\mathrm{FWHM}}{2\sqrt{2\ln 2}}.
Here FWHM is the full width at half maximum of the Gaussian function, and its value is the product of the temporal resolution of the system and the speed of light.
Then, within the second preset range, the value g2(s, t, z) obtained by the weighted summation of the projection values of the projection points relevant to the pixel is calculated as:
g2(s, t, z) = ∑ R(s, t, z) · w2(s, t, z).
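The back projection mirrors the forward projection sketch given earlier, with the roles of pixels and projection points exchanged (the second preset range is again approximated by a 3σ cutoff, an assumption for illustration):

```python
import numpy as np

def back_project_group(R, proj_coords, pix_coords, fwhm):
    """R: correction coefficients at the projection points; returns g2 at the pixel coordinates."""
    c = fwhm / (2.0 * np.sqrt(2.0 * np.log(2.0)))
    g2 = np.zeros(len(pix_coords))
    for k, p in enumerate(pix_coords):
        dist = np.linalg.norm(proj_coords - p, axis=1)    # distance projection point -> pixel
        w2 = np.exp(-dist**2 / (2.0 * c**2)) / (c * np.sqrt(2.0 * np.pi))
        near = dist < 3.0 * c                             # stand-in for the second preset range
        g2[k] = np.sum(R[near] * w2[near])
    return g2
```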
In one embodiment of the present invention, after the back projection calculation, a reverse point spread function can also be applied, in the s-t plane, to the pixel value g2(s, t, z) of each pixel, in the corresponding second coordinate system, of the second pixel group corresponding to the selected line-of-response group.
In another embodiment of the present invention, after the back projection calculation, a reverse point spread function can also be applied, along the direction of the s axis, to the pixel value g2(s, t, z) of each pixel, in the corresponding second coordinate system, of the second pixel group corresponding to the selected line-of-response group.
After the reverse point spread function is applied, along the direction of the s axis, to the pixel value g2(s, t, z) of each pixel, in the corresponding second coordinate system, of the second pixel group corresponding to the selected line-of-response group, the pixel value g2'(s, t, z) of each pixel of the second pixel group in the corresponding second coordinate system is obtained analogously to the forward application of the point spread function described above.
It should likewise be noted that, in embodiments of the present invention, applying a two-dimensional point spread function is in fact a simplified form of the three-dimensional point spread function; compared with a three-dimensional point spread function, the amount of computation is reduced and the efficiency of three-dimensional image reconstruction is improved.
Step S115: according to the pixel values g2(s, t, z) of the pixels of the second pixel group in the second coordinate system corresponding to the selected line-of-response group, obtain the pixel value f2(x, y, z) of each pixel, in the first coordinate system, of the third pixel group corresponding to the second pixel group.
In embodiments of the present invention, the pixels of the third pixel group are distributed on the three-dimensional matrix in the first coordinate system.
It should be noted that step S115 is the reverse process of step S111:
in step S111 the pixel values of the pixels of the first pixel group in the second coordinate system are obtained from the pixel values of the pixels in the first coordinate system, whereas in step S115 the pixel values of the pixels of the third pixel group in the first coordinate system are obtained from the pixel values of the pixels of the second pixel group in the second coordinate system.
In embodiments of the present invention, the coordinates of a pixel of the third pixel group in the second coordinate system can first be obtained, and the pixel value of the pixel of the third pixel group is then determined from the pixel values of those pixels of the second pixel group whose coordinates in the second coordinate system are adjacent to it.
The pixel D of the third pixel group in Fig. 8 is taken as an example below.
Referring to Fig. 8, the coordinates of the pixel D in the first coordinate system are (x2, y2, z2); from the angle φ by which the first coordinate system is rotated to the second coordinate system, the coordinates (s2, t2, z2) of the pixel D in the second coordinate system can be calculated.
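Under the rotation convention assumed earlier, the inverse mapping from the first to the second coordinate system would be:

```latex
s_2 = x_2\cos\varphi + y_2\sin\varphi, \qquad
t_2 = -x_2\sin\varphi + y_2\cos\varphi, \qquad
z_2 \text{ unchanged}.
```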
Then, according to the coordinates (s2, t2, z2), the pixels of the second pixel group adjacent to the coordinates (s2, t2, z2) in the second coordinate system can be found, and the pixel value f2(x, y, z) of the pixel D can be determined according to the pixel values of these adjacent pixels.
Specifically, in embodiments of the present invention, bilinear interpolation can be performed according to the pixel values of the four adjacent pixels to determine the pixel value f2(x, y, z) of the pixel D in the first coordinate system.
After the pixel values f2(x, y, z) of the pixels, in the first coordinate system, of the third pixel groups corresponding to the plurality of line-of-response groups of the selected line-of-response subset have all been calculated through steps S111 to S115, these pixel values f2(x, y, z) can be merged to obtain the image information corresponding to the selected line-of-response subset.
According to steps S111 to S115, each line-of-response group in the selected line-of-response subset is subjected to projection correction to obtain the corresponding corrected image information, and the calculation results corresponding to the line-of-response groups are merged in step S130.
Step S130: update the initial three-dimensional image according to the results of the preset operations corresponding to the plurality of line-of-response groups of the selected line-of-response subset.
Specifically, in embodiments of the present invention, in step S130 the pixel values f2(x, y, z) of the pixels, in the first coordinate system, of the third pixel groups corresponding to the plurality of line-of-response groups of the selected line-of-response subset are superimposed and merged.
Specifically, in embodiments of the present invention, in step S130 updating the pixel values of the initial three-dimensional image according to the superimposed and merged result comprises: multiplying the superimposed and merged result by the pixel values of the initial three-dimensional image, dividing by the value of the normalization result, and replacing the pixel values of the initial three-dimensional image with the result. The normalization result is the result obtained by setting the correction coefficients R(s, t, z) corresponding to the selected line-of-response group to 1 and then performing the back projection calculation and the superposition and merging calculation.
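Written out, the update of step S130 has the usual multiplicative form; the symbol N for the normalization result is introduced here only for readability:

```latex
f_{\text{new}}(x,y,z) \;=\; f_1(x,y,z)\cdot
\frac{\sum_{\text{groups}} f_2(x,y,z)}{N(x,y,z)},
```

where N(x, y, z) is the result obtained by repeating the back projection and the superposition/merging with all correction coefficients R(s, t, z) set to 1.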
Step S140: judge whether the termination rule is met.
In embodiments of the present invention, the termination rule can be that all of the plurality of line-of-response subsets have been selected.
In other embodiments of the present invention, the termination rule can also be that the updated initial three-dimensional image reaches a predetermined threshold value; specifically, the result of multiplying the superimposed and merged result by the pixel values of the initial three-dimensional image and dividing by the value of the normalization result reaches a predetermined threshold value.
Step S150: when the termination rule is met, output the pixel values of the updated initial three-dimensional image.
When the termination rule is not met, return to step S110, select the next line-of-response subset, and continue the execution.
Embodiments of the invention also provide a PET three-dimensional image reconstruction device. Referring to Fig. 9, the PET three-dimensional image reconstruction device 200 comprises:
an initialization unit 210, for initializing a three-dimensional image so that the pixel value of each pixel of the initial three-dimensional image located on a three-dimensional matrix in a first coordinate system (X, Y, Z) is f1(x, y, z);
a line-of-response subset selection unit 220, for selecting one line-of-response subset from a plurality of line-of-response subsets obtained from a detector, wherein each line-of-response subset comprises a plurality of line-of-response groups and each line-of-response group comprises a plurality of mutually parallel lines of response;
a line-of-response group processing unit 230, for successively selecting line-of-response groups from the selected line-of-response subset and performing a preset operation on each selected line-of-response group respectively;
an image update unit 240, for updating the initial three-dimensional image according to the results of the preset operations corresponding to the plurality of line-of-response groups of the selected line-of-response subset;
a control unit 250, for judging whether the termination rule is met, terminating the iteration when the termination rule is met, and otherwise continuing the iterative computation.
Specifically, referring to Fig. 10, in embodiments of the present invention the line-of-response group processing unit 230 can comprise:
a first coordinate-system conversion subunit 231, for obtaining, according to the pixel values f1(x, y, z) of the pixels in the first coordinate system (X, Y, Z), the pixel value g1(s, t, z) of each pixel of a first pixel group which corresponds to the selected line-of-response group and is located in a second coordinate system (S, T, Z), the pixels of the first pixel group being located on the plurality of lines of response corresponding to the selected line-of-response group;
a forward projection subunit 232, for performing a forward projection calculation according to the pixel values g1(s, t, z) of the pixels of the first pixel group located in the second coordinate system and corresponding to the selected line-of-response group, so as to obtain the forward projection value p1(s, t, z) of each projection point corresponding to the selected line-of-response group;
a correction coefficient obtaining subunit 233, for comparing the forward projection value p1(s, t, z) of each projection point corresponding to the selected line-of-response group with the measurement projection value p0(s, t, z) to obtain a correction coefficient R(s, t, z), the measurement projection value p0(s, t, z) being calculated according to the line-of-response data acquired by the detector;
a back projection subunit 234, for performing a back projection calculation according to the correction coefficient R(s, t, z) to obtain the pixel value g2(s, t, z) of each pixel of the first pixel group located in the second coordinate system and corresponding to the selected line-of-response group;
a second coordinate-system conversion subunit 235, for obtaining, according to the pixel values g2(s, t, z) of the pixels of the first pixel group located in the second coordinate system and corresponding to the selected line-of-response group, the pixel value f2(x, y, z) of each pixel of a second pixel group located in the first coordinate system and corresponding to the selected line-of-response group.
In embodiments of the present invention, the direction of the z axis of the first coordinate system (X, Y, Z) is the axial direction of the detector; the z-axis direction of the second coordinate system (S, T, Z) is the axial direction of the detector, the t-axis direction of the second coordinate system is the line-of-response direction corresponding to the selected line-of-response group, and the s-axis direction is perpendicular to the line-of-response direction corresponding to the selected line-of-response group.
One of ordinary skill in the art will appreciate that all or part of the steps in the various methods of the above embodiments can be completed by a program instructing the relevant hardware; this program can be stored in a computer-readable storage medium, and the storage medium can comprise: ROM, RAM, a magnetic disk, an optical disc, or the like.
Although the present invention is disclosed as above, the present invention is not limited thereto. Any person skilled in the art can make various changes and modifications without departing from the spirit and scope of the present invention, so the protection scope of the present invention should be defined by the claims.

Claims (17)

1. a PET 3-dimensional reconstruction method, is characterized in that, comprising:
1) initialization 3-D view, the pixel value that makes initial three-dimensional image be positioned at pixel on the middle three-dimensional matrice of the first coordinate system (X, Y, Z) is f1(x, y, z);
2) a plurality of line of response subsets that obtain from detector, select a line of response subset, described in each, line of response subset comprises a plurality of line of response groups, and each line of response group comprises many line of response that are parallel to each other;
3) from selected line of response subset, select line of response group successively, and respectively each selected line of response group is carried out to predetermined registration operation, wherein said predetermined registration operation comprises:
3.1) according to the first coordinate system (X, Y, the pixel value f1(x of the pixel Z), y, z), obtain selected line of response group corresponding be positioned at the second coordinate system (S, T, Z) in the pixel value g1(s of each pixel of the first pixel groups, t, z), the pixel of described the first pixel groups is positioned on many bar responses line corresponding to selected line of response group;
3.2) performing forward projection calculation according to the pixel value g1(s, t, z) of each pixel of the first pixel group located in the second coordinate system corresponding to the selected line-of-response group, to obtain the forward projection value p1(s, t, z) of each projection point corresponding to the selected line-of-response group;
3.3) comparing the forward projection value p1(s, t, z) of each projection point corresponding to the selected line-of-response group with the measured projection value p0(s, t, z), to obtain a correction coefficient R(s, t, z), the measured projection value p0(s, t, z) being calculated from the line-of-response data acquired by the detector;
3.4) performing back projection calculation according to the correction coefficient R(s, t, z), to obtain the pixel value g2(s, t, z) of each pixel of a second pixel group located in the second coordinate system corresponding to the selected line-of-response group;
3.5) obtaining, according to the pixel value g2(s, t, z) of each pixel of the second pixel group located in the second coordinate system corresponding to the selected line-of-response group, the pixel value f2(x, y, z) of each pixel of a third pixel group located in the first coordinate system corresponding to the selected line-of-response group;
4) updating the initial three-dimensional image according to the results of the preset operations corresponding to the plurality of line-of-response groups of the selected line-of-response subset;
5) terminating the iteration when a stopping rule is satisfied; otherwise, continuing from step 2) with the updated initial three-dimensional image.
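For illustration only (not part of the claims): taken together, steps 1) to 5) describe an OSEM-style update in which each line-of-response group is processed in its own rotated frame. The sketch below shows that control flow for a single 2-D slice and a single group; it is not the patented implementation, it omits normalization and attenuation handling, and using `scipy.ndimage.rotate` as a stand-in for steps 3.1) and 3.5) is an assumption made for the example.

```python
import numpy as np
from scipy.ndimage import rotate  # stands in for the coordinate conversions of steps 3.1) and 3.5)

def osem_group_update(image, measured, angle_deg, eps=1e-10):
    """One 'preset operation' of claim 1 for a single line-of-response group, sketched on
    a single 2-D slice. `image` is the current estimate f1 on the (x, y) grid, `measured`
    is the measured projection p0 of this group (one value per s position) and `angle_deg`
    is the in-plane angle of the group. Returns a multiplicative correction for the slice."""
    # 3.1) resample the image onto the rotated (s, t) grid -> g1(s, t)
    g1 = rotate(image, angle_deg, reshape=False, order=1)
    # 3.2) forward projection: integrate along the line-of-response (t) axis -> p1(s)
    p1 = g1.sum(axis=1)
    # 3.3) correction coefficients R(s) from the measured projection p0(s)
    ratio = measured / np.maximum(p1, eps)
    # 3.4) back projection: spread R(s) uniformly along each line of response -> g2(s, t)
    g2 = np.repeat(ratio[:, None], g1.shape[1], axis=1)
    # 3.5) resample the correction back onto the (x, y) grid -> f2(x, y)
    return rotate(g2, -angle_deg, reshape=False, order=1)

# Example: one group at 45 degrees applied to a uniform 64x64 slice.
img = np.ones((64, 64))
meas = np.full(64, 64.0)
img *= osem_group_update(img, meas, 45.0)  # step 4): update the image with the group result
```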
2. The method of claim 1, characterized in that the direction of the z axis of the first coordinate system (X, Y, Z) in step 1) is the axial direction of the detector; the z-axis direction of the second coordinate system (S, T, Z) in step 3.2) is the axial direction of the detector, the t-axis direction of the second coordinate system is the direction of the lines of response corresponding to the selected line-of-response group, and the s-axis direction is perpendicular to that direction.
3. The method of claim 1, characterized in that it further comprises: before performing the forward projection calculation, applying a forward point spread function in the s-t plane to the pixel value g1(s, t, z) of each pixel of the first pixel group located in the second coordinate system corresponding to the selected line-of-response group; and after performing the back projection calculation, applying a reverse point spread function in the s-t plane to the pixel value g2(s, t, z) of each pixel of the second pixel group located in the second coordinate system corresponding to the selected line-of-response group.
4. The method of claim 1, characterized in that it further comprises: before performing the forward projection calculation, applying a forward point spread function along the direction of the s axis to the pixel value g1(s, t, z) of each pixel of the first pixel group located in the second coordinate system corresponding to the selected line-of-response group; and after performing the back projection calculation, applying a reverse point spread function along the direction of the s axis to the pixel value g2(s, t, z) of each pixel of the second pixel group located in the second coordinate system corresponding to the selected line-of-response group.
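For illustration only: claims 3 and 4 insert a point spread function before the forward projection and its reverse after the back projection. The sketch below assumes the PSF is modelled as a symmetric 1-D Gaussian applied along the s axis; for a symmetric kernel the adjoint ("reverse") operation is the same convolution. The kernel shape and width are assumptions for the example.

```python
import numpy as np
from scipy.ndimage import convolve1d

def gaussian_kernel(sigma, radius=3):
    """Symmetric 1-D Gaussian kernel, normalised to sum to 1."""
    x = np.arange(-radius, radius + 1, dtype=float)
    k = np.exp(-0.5 * (x / sigma) ** 2)
    return k / k.sum()

def apply_psf_along_s(g, sigma=1.0):
    """Blur g(s, t) along the s axis (axis 0), as in the forward PSF step of claim 4."""
    return convolve1d(g, gaussian_kernel(sigma), axis=0, mode='nearest')

# For a symmetric kernel the adjoint ("reverse") operation is the same convolution,
# so the same routine can stand in for the step applied after back projection.
g1 = np.random.rand(32, 32)
blurred = apply_psf_along_s(g1)
```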
5. The method of claim 1, characterized in that the forward projection calculation in step 3.2) comprises: within a first preset range, performing a weighted summation of the pixel values of the pixels related to a projection point; and the back projection calculation in step 3.4) comprises: within a second preset range, performing a weighted summation of the projection values of the projection points related to a pixel.
6. The method of claim 5, characterized in that the first preset range is determined according to the positional relationship between the projection point and the end points of the section, parallel to the t-z plane, of the detector ring comprised in the detector where the projection point is located; and the second preset range is determined according to the positional relationship between the pixel and the end points of the section, parallel to the t-z plane, of the detector ring comprised in the detector where the pixel is located.
7. The method of claim 5, characterized in that the weight coefficient corresponding to each pixel related to the projection point is determined according to the distance between the pixel and the projection point; and the weight coefficient corresponding to each projection point related to the pixel is determined according to the distance between the projection point and the pixel.
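For illustration only: claims 5 to 7 compute the forward and back projections as weighted sums over a preset range, with weights that decrease with the pixel-to-projection-point distance. The sketch below assumes a 1-D geometry along s, a fixed half-width for the preset range and triangular distance weights; the detector-ring-dependent range of claim 6 is not modelled here.

```python
import numpy as np

def weighted_forward_projection(pixel_s, pixel_values, proj_s, half_width=1.0):
    """Forward projection value at each projection point proj_s as a weighted sum of the
    pixels whose s coordinate lies within `half_width` (the 'preset range'); the weight
    of each pixel decreases linearly with its distance to the projection point."""
    p = np.zeros_like(proj_s, dtype=float)
    for k, s0 in enumerate(proj_s):
        d = np.abs(pixel_s - s0)
        in_range = d < half_width
        w = 1.0 - d[in_range] / half_width
        p[k] = np.sum(w * pixel_values[in_range])
    return p

# Back projection is the adjoint: each pixel accumulates the projection values of the
# points within its own preset range, with the same distance-based weights.
pixel_s = np.linspace(0.0, 10.0, 21)
values = np.ones_like(pixel_s)
proj_s = np.linspace(0.0, 10.0, 11)
print(weighted_forward_projection(pixel_s, values, proj_s))
```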
8. The method of claim 1, characterized in that, before performing the forward projection calculation, it further comprises: merging the pixel values of every m pixels, distributed along the t-axis direction, of the first pixel group located in the second coordinate system corresponding to the selected line-of-response group, where m is an integer greater than 1; and in step 3.3), the method of obtaining the measured projection value further comprises: merging the measured projection values, of the plurality of lines of response corresponding to the selected line-of-response group, of every m projection points along the t-axis direction in the second coordinate system.
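For illustration only: claim 8 merges every m samples along the t axis in both the resampled image and the measured projections before projecting. A minimal sketch, assuming the number of t samples is a multiple of m and that "merging" means summing the m values:

```python
import numpy as np

def merge_along_t(g, m):
    """Merge (sum) every m consecutive samples along the t axis (axis 1) of g(s, t).
    Assumes g.shape[1] is a multiple of m, with m > 1."""
    s_bins, t_bins = g.shape
    return g.reshape(s_bins, t_bins // m, m).sum(axis=2)

g1 = np.arange(4 * 6, dtype=float).reshape(4, 6)
print(merge_along_t(g1, m=2).shape)  # (4, 3)
```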
9. The method of claim 1, characterized in that the stopping rule is: all of the plurality of line-of-response subsets have been selected.
10. The method of claim 1, characterized in that the stopping rule is: the updated initial three-dimensional image reaches a predetermined threshold.
11. The method of claim 1, characterized in that, in step 3.1), the method of obtaining the pixel value g1(s, t, z) of each pixel of the first pixel group located in the second coordinate system corresponding to the selected line-of-response group according to the pixel value f1(x, y, z) of the pixels in the first coordinate system comprises: a1) for a pixel (s1, t1, z1) of the first pixel group located in the second coordinate system corresponding to any selected line-of-response group, finding the coordinates (x1, y1, z1) of the corresponding pixel in the first coordinate system;
a2) in the first coordinate system, obtaining the pixel value of the pixel (x1, y1, z1) from the pixel values f1(x, y, z) of the pixels adjacent to the pixel (x1, y1, z1), the pixel value g1(s1, t1, z1) of the pixel (s1, t1, z1) of the first pixel group located in the second coordinate system corresponding to the selected line-of-response group being identical to the pixel value of the pixel (x1, y1, z1) located in the first coordinate system.
12. The method of claim 11, characterized in that step a1) comprises: calculating, according to the rotation angle from the first coordinate system to the second coordinate system, the coordinates (x1, y1, z1) of the pixel located in the first coordinate system corresponding to any pixel (s1, t1, z1) of the first pixel group located in the second coordinate system corresponding to the selected line-of-response group, the calculation method being:
13. The method of claim 11, characterized in that, in step a2), the method of obtaining the pixel value of the pixel (x1, y1, z1) from the pixel values of the pixels adjacent to the pixel (x1, y1, z1) comprises: performing bilinear interpolation with the four pixels adjacent to the pixel (x1, y1, z1) to obtain the pixel value of the pixel (x1, y1, z1).
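For illustration only: claims 11 to 13 map each pixel of the (s, t, z) grid to the (x, y, z) grid and interpolate its value bilinearly from the four neighbouring pixels. The rotation formula of claim 12 is not reproduced in the text above, so the sketch below assumes a standard in-plane rotation by an angle phi about the shared z axis; that assumption and the grid handling are illustrative only.

```python
import math

def to_first_coordinates(s1, t1, z1, phi):
    """Coordinates (x1, y1, z1) in the first coordinate system of a pixel (s1, t1, z1) of
    the second coordinate system, assuming a plain in-plane rotation by phi about the
    common z axis (the actual formula of claim 12 is not reproduced here)."""
    x1 = s1 * math.cos(phi) - t1 * math.sin(phi)
    y1 = s1 * math.sin(phi) + t1 * math.cos(phi)
    return x1, y1, z1

def bilinear_sample(f1_slice, x1, y1):
    """Bilinear interpolation of the slice f1_slice[y][x] at the (generally non-integer)
    position (x1, y1), using its four neighbouring pixels as in claim 13."""
    x0, y0 = int(math.floor(x1)), int(math.floor(y1))
    dx, dy = x1 - x0, y1 - y0
    return ((1 - dx) * (1 - dy) * f1_slice[y0][x0] +
            dx * (1 - dy) * f1_slice[y0][x0 + 1] +
            (1 - dx) * dy * f1_slice[y0 + 1][x0] +
            dx * dy * f1_slice[y0 + 1][x0 + 1])

# Example on a tiny 3x3 slice (both mapped coordinates fall inside the grid here).
f1_slice = [[0.0, 1.0, 2.0], [3.0, 4.0, 5.0], [6.0, 7.0, 8.0]]
x1, y1, _ = to_first_coordinates(0.5, 0.5, 0, math.radians(30.0))
print(bilinear_sample(f1_slice, x1, y1))
```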
14. The method of claim 1, characterized in that, in step 3.5), the method of obtaining the pixel value f2(x, y, z) of each pixel of the third pixel group located in the first coordinate system corresponding to the selected line-of-response group according to the pixel value g2(s, t, z) of each pixel of the second pixel group located in the second coordinate system corresponding to the selected line-of-response group comprises:
b1) for a pixel (x2, y2, z2) of the third pixel group located in the first coordinate system corresponding to any selected line-of-response group, finding the coordinates (s2, t2, z2) of the corresponding pixel in the second coordinate system;
b2) in the second coordinate system, obtaining the pixel value of the pixel (s2, t2, z2) from the pixel values g2(s, t, z) of the pixels of the second pixel group adjacent to the pixel (s2, t2, z2), the pixel value f2(x2, y2, z2) of the pixel (x2, y2, z2) of the third pixel group located in the first coordinate system corresponding to the selected line-of-response group being identical to the pixel value of the pixel (s2, t2, z2) located in the second coordinate system.
15. The method of claim 14, characterized in that step b1) comprises: calculating, according to the rotation angle from the second coordinate system to the first coordinate system, the coordinates (s2, t2, z2) of the pixel located in the second coordinate system corresponding to any pixel (x2, y2, z2) of the third pixel group located in the first coordinate system corresponding to the selected line-of-response group, the calculation method being:
16. The method of claim 14, characterized in that, in step b2), the method of obtaining the pixel value of the pixel (s2, t2, z2) from the pixel values g2(s, t, z) of the pixels of the second pixel group adjacent to the pixel (s2, t2, z2) comprises: performing bilinear interpolation with the four pixels adjacent to the pixel (s2, t2, z2) to obtain the pixel value of the pixel (s2, t2, z2).
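For illustration only: claims 14 to 16 are the reverse mapping of claims 11 to 13. Under the same assumed in-plane rotation as in the previous sketch, a pixel of the first coordinate system maps to the second coordinate system by the inverse rotation, and the bilinear interpolation of claim 16 has the same form as shown above.

```python
import math

def to_second_coordinates(x2, y2, z2, phi):
    """Coordinates (s2, t2, z2) in the second coordinate system of a pixel (x2, y2, z2)
    of the first coordinate system: the inverse of the rotation assumed in the previous
    sketch (the actual formula of claim 15 is not reproduced here)."""
    s2 = x2 * math.cos(phi) + y2 * math.sin(phi)
    t2 = -x2 * math.sin(phi) + y2 * math.cos(phi)
    return s2, t2, z2
```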
17. A PET three-dimensional image reconstruction device, characterized in that it comprises:
An initialization unit, configured to initialize a three-dimensional image so that the pixel value of a pixel of the initial three-dimensional image located on a three-dimensional matrix in a first coordinate system (X, Y, Z) is f1(x, y, z);
A line-of-response subset selection unit, configured to select one line-of-response subset from a plurality of line-of-response subsets obtained from a detector, each of the line-of-response subsets comprising a plurality of line-of-response groups, and each line-of-response group comprising a plurality of mutually parallel lines of response;
A line-of-response group processing unit, configured to successively select line-of-response groups from the selected line-of-response subset and perform a preset operation on each selected line-of-response group respectively, the line-of-response group processing unit comprising:
A first coordinate system conversion subunit, configured to obtain, according to the pixel value f1(x, y, z) of the pixels in the first coordinate system (X, Y, Z), the pixel value g1(s, t, z) of each pixel of a first pixel group located in a second coordinate system (S, T, Z) corresponding to the selected line-of-response group, the pixels of the first pixel group being located on the plurality of lines of response corresponding to the selected line-of-response group;
A forward projection subunit, configured to perform forward projection calculation according to the pixel value g1(s, t, z) of each pixel of the first pixel group located in the second coordinate system corresponding to the selected line-of-response group, to obtain the forward projection value p1(s, t, z) of each projection point corresponding to the selected line-of-response group;
A correction coefficient obtaining subunit, configured to compare the forward projection value p1(s, t, z) of each projection point corresponding to the selected line-of-response group with the measured projection value p0(s, t, z), so as to obtain a correction coefficient R(s, t, z), the measured projection value p0(s, t, z) being calculated from the line-of-response data acquired by the detector;
A back projection subunit, configured to perform back projection calculation according to the correction coefficient R(s, t, z), to obtain the pixel value g2(s, t, z) of each pixel of a second pixel group located in the second coordinate system corresponding to the selected line-of-response group;
A second coordinate system conversion subunit, configured to obtain, according to the pixel value g2(s, t, z) of each pixel of the second pixel group located in the second coordinate system corresponding to the selected line-of-response group, the pixel value f2(x, y, z) of each pixel of a third pixel group located in the first coordinate system corresponding to the selected line-of-response group;
An image update unit, configured to update the initial three-dimensional image according to the results of the preset operations corresponding to the plurality of line-of-response groups of the selected line-of-response subset;
A control unit, configured to judge whether the stopping rule is satisfied, terminate the iteration when the stopping rule is satisfied, and otherwise continue the iterative computation.
CN201310534444.9A 2013-10-31 2013-10-31 A kind of PET three-dimensional image reconstruction method and device Active CN104183012B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201310534444.9A CN104183012B (en) 2013-10-31 2013-10-31 A kind of PET three-dimensional image reconstruction method and device

Publications (2)

Publication Number Publication Date
CN104183012A true CN104183012A (en) 2014-12-03
CN104183012B CN104183012B (en) 2015-12-09

Family

ID=51964026

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201310534444.9A Active CN104183012B (en) 2013-10-31 2013-10-31 A kind of PET three-dimensional image reconstruction method and device

Country Status (1)

Country Link
CN (1) CN104183012B (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101449983A (en) * 2007-11-29 2009-06-10 深圳迈瑞生物医疗电子股份有限公司 Scanning changing interpolation method and device of ultrasonic linear-array deflexion imaging
WO2010125917A1 (en) * 2009-04-27 2010-11-04 株式会社日立製作所 Image reconstruction method, x-ray ct device, and program

Cited By (37)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11557067B2 (en) 2016-02-29 2023-01-17 Shanghai United Imaging Healthcare Co., Ltd. System and method for reconstructing ECT image
US10535166B2 (en) 2016-02-29 2020-01-14 Shanghai United Imaging Healthcare Co., Ltd. System and method for reconstructing ECT image
CN105913380B (en) * 2016-03-31 2019-10-08 上海联影医疗科技有限公司 The method and apparatus of medical image conversion
CN105913380A (en) * 2016-03-31 2016-08-31 上海联影医疗科技有限公司 Medical image conversion method and equipment
GB2577662B (en) * 2016-04-20 2020-09-23 Shanghai United Imaging Healthcare Co Ltd System and method for image reconstruction
WO2017181357A1 (en) * 2016-04-20 2017-10-26 Shanghai United Imaging Healthcare Co., Ltd. System and method for image reconstruction
US11049292B2 (en) 2016-04-20 2021-06-29 Shanghai United Imaging Healthcare Co., Ltd. System and method for image reconstruction
GB2577662A (en) * 2016-04-20 2020-04-08 Shanghai United Imaging Healthcare Co Ltd System and method for image reconstruction
US11704846B2 (en) 2016-04-20 2023-07-18 Shanghai United Imaging Healthcare Co., Ltd. System and method for image reconstruction
US10176604B2 (en) 2016-04-20 2019-01-08 Shanghai United Imaging Healthcare Co., Ltd. System and method for image reconstruction
CN106175807A (en) * 2016-06-30 2016-12-07 上海联影医疗科技有限公司 Crystal positions bearing calibration for imaging system
CN106175807B (en) * 2016-06-30 2019-04-23 上海联影医疗科技有限公司 Crystal positions bearing calibration for imaging system
CN106251380B (en) * 2016-07-29 2022-07-15 上海联影医疗科技股份有限公司 Image reconstruction method
CN106251380A (en) * 2016-07-29 2016-12-21 上海联影医疗科技有限公司 Image rebuilding method
US10429525B2 (en) 2016-08-15 2019-10-01 Shenzhen United Imaging Healthcare Co., Ltd. System and method for PET data correction
US11874412B2 (en) 2016-08-15 2024-01-16 Shanghai United Imaging Healthcare Co., Ltd. System and method for PET data correction
CN106251313B (en) * 2016-08-15 2020-06-26 上海联影医疗科技有限公司 Medical imaging method and system
US11105939B2 (en) 2016-08-15 2021-08-31 Shanghai United Imaging Healthcare. Co., Ltd. System and method for PET data correction
CN106251313A (en) * 2016-08-15 2016-12-21 上海联影医疗科技有限公司 Medical imaging procedure and system
CN107392876A (en) * 2017-08-02 2017-11-24 上海联影医疗科技有限公司 Obtain the method, apparatus and medical imaging procedure of the normalization factor
CN107392876B (en) * 2017-08-02 2020-10-02 上海联影医疗科技有限公司 Method and device for acquiring normalized correction factor and medical imaging method
CN108717716B (en) * 2018-04-03 2022-05-17 江苏赛诺格兰医疗科技有限公司 Data correction method, device and computer storage medium
CN108717716A (en) * 2018-04-03 2018-10-30 江苏赛诺格兰医疗科技有限公司 Data correcting method, device and computer storage media
CN108664723A (en) * 2018-05-08 2018-10-16 湖南大学 Complex model is digitized implementation method, device and the electronic equipment of modeling
CN109712213A (en) * 2018-12-24 2019-05-03 上海联影医疗科技有限公司 PET image reconstruction method, system, readable storage medium storing program for executing and equipment
CN109712213B (en) * 2018-12-24 2023-10-20 上海联影医疗科技股份有限公司 PET image reconstruction method, system, readable storage medium and apparatus
WO2021102614A1 (en) * 2019-11-25 2021-06-03 中国科学院深圳先进技术研究院 Method and terminal for processing positron emission tomography (pet) data
CN111080737B (en) * 2019-12-20 2023-05-26 沈阳智核医疗科技有限公司 Image reconstruction method, device and PET scanning system
CN111080737A (en) * 2019-12-20 2020-04-28 东软医疗系统股份有限公司 Image reconstruction method and device and PET scanning system
CN111325686A (en) * 2020-02-11 2020-06-23 之江实验室 Low-dose PET three-dimensional reconstruction method based on deep learning
CN111325686B (en) * 2020-02-11 2021-03-30 之江实验室 Low-dose PET three-dimensional reconstruction method based on deep learning
CN111445552A (en) * 2020-03-27 2020-07-24 上海联影医疗科技有限公司 Subset division method of PET system, image reconstruction method and device
CN111445552B (en) * 2020-03-27 2023-11-21 上海联影医疗科技股份有限公司 Subset partitioning method, image reconstruction method and device of PET (positron emission tomography) system
CN112998735B (en) * 2021-02-26 2022-09-02 中派科技(深圳)有限责任公司 System and method for reconstructing image by scanning device
CN112998735A (en) * 2021-02-26 2021-06-22 中派科技(深圳)有限责任公司 System and method for reconstructing image by scanning device
CN113112558A (en) * 2021-03-26 2021-07-13 徐州医科大学 High-definition PET image reconstruction method
CN113112558B (en) * 2021-03-26 2024-03-15 江苏医药职业学院 High-definition PET image reconstruction method

Also Published As

Publication number Publication date
CN104183012B (en) 2015-12-09

Similar Documents

Publication Publication Date Title
CN104183012B (en) A kind of PET three-dimensional image reconstruction method and device
CN104751499A (en) Method and device for reconstructing PET two-dimensional image
CN101454801B (en) Local motion compensation based on list mode data
CN102842141B (en) Rotary X-ray contrastographic picture iteration reconstruction method
CN101278317B (en) Distributed iterative image reconstruction
US20100074500A1 (en) System and method for 3d time of flight pet forward projection based on an exact axial inverse rebinning relation in fourier space
CN101496061B (en) For calculating method, device and the computer-readable medium of mutual yardstick with the visual attribute of amendment image data set
CN103153192A (en) X-ray CT apparatus and image reconstruction method
CN105682559A (en) Methods and systems for estimating scatter
TWI517093B (en) Computer tomography reconstruction method
CN101490712A (en) Image reconstruction using data ordering
CN103996213A (en) PET image rebuilding method and system
CN104732541A (en) Method and device for acquiring PET crystal position lookup table
CN112233246B (en) Satellite image dense matching method and system based on SRTM constraint
US7769217B2 (en) Fast iterative 3D PET image reconstruction using a set of 2D linogram transformations
US7616798B2 (en) Method for faster iterative reconstruction for converging collimation spect with depth dependent collimator response modeling
CN106056646B (en) CT image rebuilding methods and system
Popescu et al. Ray tracing through a grid of blobs
CN117745537A (en) Tunnel equipment temperature detection method, device, computer equipment and storage medium
JP6014738B2 (en) 3D image projection method
Kim et al. Ultra-fast hybrid CPU–GPU multiple scatter simulation for 3-D PET
CN105832358B (en) A kind of imaging method of the rotation double flat plate PET system based on system calibration
US20140056499A1 (en) Apparatus and method for generating image using correction model
Mesquita et al. Choosing the ART relaxation parameter for Clear-PEM 2D image reconstruction
Sampson et al. Investigating multi-threaded SIMD for helical CT reconstruction on a CPU

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CP01 Change in the name or title of a patent holder

Address after: 201815 Building 8, No. 1180 Xingxian Road, Jiading Industrial Zone, Jiading District, Shanghai

Patentee after: Shanghai Lianying Medical Technology Co., Ltd

Address before: 201815 Building 8, No. 1180 Xingxian Road, Jiading Industrial Zone, Jiading District, Shanghai

Patentee before: SHANGHAI UNITED IMAGING HEALTHCARE Co.,Ltd.

CP02 Change in the address of a patent holder

Address after: 201807 2258 Chengbei Road, Jiading District, Shanghai

Patentee after: Shanghai Lianying Medical Technology Co.,Ltd.

Address before: 201815 No. 1180 Xingxian Road, Jiading Industrial Zone, Jiading District, Shanghai

Patentee before: Shanghai Lianying Medical Technology Co.,Ltd.