WO1997005574A1 - Raw data segmentation and analysis in image tomography - Google Patents

Raw data segmentation and analysis in image tomography

Info

Publication number
WO1997005574A1
WO1997005574A1 (also published as WO9705574A1) · Application PCT/GB1996/001814 (GB9601814W)
Authority
WO
WIPO (PCT)
Prior art keywords
data set
data
sinogram
selected object
theta
Prior art date
Application number
PCT/GB1996/001814
Other languages
French (fr)
Inventor
Vaseem Unnabi Chengazi
Keith Eric Britton
Cyril Carson Nimmon
Original Assignee
Imperial Cancer Research Technology Limited
Priority date
Filing date
Publication date
Priority claimed from GBGB9515458.9A external-priority patent/GB9515458D0/en
Priority claimed from GBGB9517044.5A external-priority patent/GB9517044D0/en
Application filed by Imperial Cancer Research Technology Limited filed Critical Imperial Cancer Research Technology Limited
Priority to AU66226/96A priority Critical patent/AU6622696A/en
Priority to EP96925859A priority patent/EP0843868A1/en
Publication of WO1997005574A1 publication Critical patent/WO1997005574A1/en

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/00 - 2D [Two Dimensional] image generation
    • G06T11/003 - Reconstruction from projections, e.g. tomography
    • G06T11/005 - Specific pre-processing for tomographic reconstruction, e.g. calibration, source positioning, rebinning, scatter correction, retrospective gating

Definitions

  • the present invention relates to the field of image tomography, and in particular to methods of identifying, locating or analysing objects of interest in single-photon emission tomography (SPET) or PET studies.
  • SPET single-photon emission tomography
  • the technique is used to construct a three-dimensional map of radioactivity sources within an entity or target body, providing relative levels of radioactivity emanating from each of a plurality of volume elements making up the entity.
  • the radioactivity sources are gamma radiation emitters which have been injected into the body of a patient, and these are detected by a gamma camera which images a plurality of views of the body as illustrated in figures 1 and 2.
  • three dimensional mapping of other entities is also achieved with this technique.
  • the imaging system typically comprises a detector 10 and a collimator 15 adapted to image emissions parallel to the axis of the collimator, ie. as depicted in figure 1a, in the x-direction.
  • images are taken every 2-6°.
  • the resulting set of images each provide total radiation counts for, or projections of, a plurality of parallel columns 22 passing through the body.
  • back-projection techniques are well-known in the art, but all of them make a substantial number of approximations and assumptions about the data which result in an effective filtering of the data when reconstructing the voxel data.
  • back-projection techniques necessarily use an averaging process to determine the values of individual voxels, and thus introduce a smoothing effect, ie. a high-frequency filtration of the data.
  • the voxel map produced in the reconstructed image cannot be considered as "raw data" and further processing (ie. filtering for image enhancement or quantitative analysis) of this data can cause problems with unwanted filtering artefacts such as aliasing and the like.
  • a digital data processing system must, in creating a reconstructed image, back-project to a predetermined back-projection matrix 35 or grid, such as that shown in figure 1d. Because of this, a substantial amount of interpolation of the data is required when a projection such as that shown in figure 1b is not aligned with the back-projection matrix (figure 1d). In other words, the quantized x' axis of the camera must be mapped to the quantized x-y axes of the matrix 35.
  • Various schemes exist for such interpolation, varying from "nearest-pixel" mapping to linear and even more complex interpolation methods, all of which introduce a further element of data filtration. Filtering is required, subsequent to this interpolation, in particular to remove the "star" artefact, which also introduces interdependence of voxel values that prevents derivation of quantitative parameters.
  • a further example is attenuation and scatter correction.
  • Each projection image provides a total radiation count for a given column 22.
  • a simple, but unsophisticated technique for back-projection is to assume, in the first instance, that the radiation count derives from sources distributed evenly throughout the length of the column, before weighting the data with counts from other images transverse thereto in the reconstruction process.
  • this simplistic approach ignores known attenuation and scatter factors for radiation passing through the body 20, for which approximate corrections can be made during the back-projection process.
  • such corrections also introduce filtering artefacts which can cause problems with later data processing.
  • the ability to provide quantitative measurement of radiation sourced from a region of interest within a body is a highly desirable goal in a number of fields.
  • there are many clinical benefits such as enabling a clinician to more accurately locate and analyse a disease site in a scanned patient.
  • the ability to use time as an additional factor in deriving quantitative data further enhances the ability to make dosimetry measurements for radionuclide therapy of cancer and other diseases.
  • such techniques have a far wider applicability beyond aiding diagnosis and therapy of the living body.
  • the present invention provides a method of deriving quantitative data from a raw projection data set of an entity in a tomographic image by the steps of:
  • the present invention provides a method of enhancing portions of an entity in a tomographic image by elimination of contributions thereto from a selected object within the entity by the steps of: (a) segmenting out the selected object in the projected images; (b) modifying the raw data set to form a second data set in which contributions from the selected object have been eliminated; and
  • the present invention provides a method of segmenting out at least one selected object from a raw projection data set of an entity in a tomographic image and modifying the raw projection data set to form a second data set comprising data relating only to the at least one selected object, comprising the steps of: forming a sinogram for a given transverse plane; identifying the edges of a selected object for a selected theta value; tracking the edges of the object through its sinusoidal path on the sinogram for all theta to determine an object extent; modifying the raw data set to form the second data set by using only data corresponding to the object extent.
  • the methods are useful in imaging the body of a patient.
  • a still further aspect of the invention provides a method of locating a disease site in a patient, the method comprising the method of the first or second aspects of the invention wherein the entity is the body of the patient and the at least one selected object is either: (a) the disease site, or (b) a site which partially obscures the disease site.
  • disease site we include any site in the body which exhibits abnormal characteristics.
  • a disease site includes a tumour which may be a primary tumour or may be a secondary tumour or metastasis.
  • the disease site may be a prostate tumour and a typical site which obscures this is the bladder of the patient.
  • Figures 1(a) to 1(d) show schematic diagrams in the x-y plane useful in the explanation of back-projection tomography techniques
  • Figure 2 shows a perspective schematic view of a plurality of images or projections formed from a single transaxial plane or "slice" through a body
  • Figure 3 shows a schematic view of a sinogram presentation of the images in figure 2;
  • Figures 4(a) and 4(b) show a further schematic view of a sinogram presentation of a more complex set of images together with a corresponding plot of total counts per angle of the sinogram for an object therein;
  • Figure 5 shows a further schematic view of the sinogram presentation of figure 4(a);
  • Figure 6 shows a flow diagram of the principal steps of the present invention
  • Figure 7 shows a flowchart for the initialization of a forward projection image matrix
  • Figures 8(a) to 8(f) show a flowchart indicating the steps taken during automated segmentation of data in each sinogram;
  • Figure 9 shows a flowchart indicating the steps taken during manual segmentation of data in each sinogram;
  • Figures 10 and 11 show the steps taken during the forward projection of the segmented data to estimate counts contributed from segmented objects
  • Figures 12 to 17 show exemplary images illustrating use of the techniques of the present invention, in which:
  • Figure 12 is a conventionally back-projected reconstruction of a transverse section at human kidney level
  • Figure 13 is a hybrid image with both kidneys and spleen in an estimated body outline
  • Figure 14 shows the result of the first iteration of an algorithm according to the present invention to remove the contribution of the right kidney to the raw data
  • Figure 15 shows the result of the final iteration of the algorithm according to the present invention having removed the right kidney without affecting the left kidney or spleen;
  • Figure 16 shows a transverse section of a conventionally reconstructed image at bladder level
  • Figure 17 shows a transverse section corresponding to figure 16, but with pre-processing of the raw data set according to the present invention prior to reconstruction;
  • Figure 18 shows a diagram illustrating a sinogram row presentation of raw data according to one aspect of the present invention
  • Figures 19(a), 19(b) and 19(c) show actual image raw data in the sinogram row presentation of figure 18;
  • Figure 20 shows a diagrammatic three dimensional view of the sinogram row presentation of data of figure 18;
  • Figures 21(a), 21(b) and 21(c) are diagrams useful in explaining data processing techniques according to one aspect of the present invention.
  • Figure 22 shows a diagram of a panspectral version of the sinogram row presentation of figures 18 to 20.
  • a body 20 located at the centre of rotation of a gamma camera is imaged in a transaxial or transverse (x-y) plane to produce a plurality of projections 30₀ to 30ₙ, each at an angle theta to the back-projection matrix 35 in the x-y plane.
  • the projections 30 are shown as having finite dimension in the z-direction.
  • a gamma camera will comprise a two-dimensional array collimator thereby providing simultaneous data collection for several transaxial planes at once.
  • An object of interest 40 located within the body 20 at position i,j,k is imaged on the projections 30 at varying positions as indicated, representing an increased or decreased number of counts usually displayed as intensity of the image.
  • In figure 3 there is shown a sinogram presentation 50 of the data collected in figure 2.
  • the sinogram trace 45 can be predicted in both phase and amplitude (which correspond to the location of the object in the x-y plane) and in variation of the number of counts (or intensity) along the trace (which corresponds to the effects of physical factors such as attenuation and scatter).
  • the change in counts between the first few projections 30₀, 30₁, etc. and the last few projections 30ₙ₋₁, 30ₙ represents any changes in activity within the period of acquisition.
  • the shape of the object of interest will manifest itself as a variation in the thickness a,b of the trace 45 with varying theta, the edges of the object of interest typically being discernible by changes in intensity.
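
As an illustration of the sinogram presentation described above, the following Python sketch (not part of the patent; the array layout, function name and the synthetic point source are assumptions made here for illustration only) re-forms a stack of raw camera projections into a sinogram for one transaxial slice and shows how a point source traces a sinusoid across it.

```python
import numpy as np

def sinogram_for_slice(projections: np.ndarray, k: int) -> np.ndarray:
    """Re-form raw projection data into a sinogram for transaxial slice k.

    `projections` is assumed to be shaped (n_theta, n_z, n_xprime):
    one 2-D camera image per acquisition angle theta.  The sinogram is
    the k-th detector row taken from every view, stacked so that each
    sinogram row corresponds to one value of theta.
    """
    return projections[:, k, :]                    # shape (n_theta, n_xprime)

# Synthetic example: a single point source traces a sinusoid (cf. trace 45).
n_theta, n_x = 64, 128
thetas = np.linspace(0.0, 2 * np.pi, n_theta, endpoint=False)
sino = np.zeros((n_theta, n_x))
radius, phase, centre = 30.0, 0.7, n_x // 2        # location of the source in x-y
for i, th in enumerate(thetas):
    xprime = int(round(centre + radius * np.sin(th + phase)))
    sino[i, xprime] += 100.0                       # 100 counts per view from the source
```
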
  • a sinogram 60 is shown for two objects spatially separated within the body being scanned.
  • Trace 62 represents an object for which quantitative data is required.
  • Trace 64 represents an interfering object in which we have no interest. For a large part of the sinogram, the two traces do not meet.
  • the selected object for segmentation may take one of two forms. In the first scenario, it is only the selected object for which the user requires detailed or quantitative count information, or a detailed three dimensional image. In this case, all other data in a sinogram is discarded, leaving only data corresponding to the selected object as an object of interest.
  • the selected object may be an obscuring object, the data from which is masking or interfering with the data required during the reconstruction process. For example, there may be an accumulation of radioactive material in a particular organ in a patient's body which is not of interest, but the high counts generally obscure other regions in which smaller, but nevertheless crucial quantities of radioactive material are accumulated. In this case, data corresponding to the selected object are discarded, leaving all other raw data intact and more clearly showing any objects of interest.
  • an image may be formed from the remaining data, and quantitative measurements made.
  • the raw image data from projections 30₁ to 30ₙ are loaded into the computer system (step 101).
  • a sinogram representation 60 of the data is formed (step 102) and, based on operator analysis of the sinogram, the sinogram trace 62 relating to a selected object is identified (steps 103, 104).
  • the edges of the selected object are defined (step 105), and the count data from the object isolated from the raw data (step 106) to form a second data set (step 111), either by segmenting out the selected object to leave the selected object data only (steps 107, 108), or by segmenting out the selected object to leave all other data (steps 109, 110) more clearly showing any objects of interest within the body.
  • This modified data set may then be used to form an image (step 112), or to provide quantitative data such as volume of the object, activity within the object and activity variation with time (steps 114-116), providing output either on a suitable display device or print out (step 117).
  • In step A1, parameters for the forward projection of matrix 35 are specified for each transverse (x-y) image plane, including matrix size (resolution) and the number of projections 30 comprising the raw data set.
  • Other display parameters may be set, and the method of interpolation to be used for back-projection of the projections 30 onto the non-aligned matrix 35 is specified.
  • In step A2, a matrix is constructed which gives the mapping of a point (i,j) in the x-y plane of the transverse image onto the sinogram at values (x',k).
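
A minimal sketch of such a mapping, assuming an ideal parallel-beam geometry with the camera rotating about the matrix centre (the function name and arguments are illustrative, not taken from the patent), is:

```python
import numpy as np

def forward_map(i: float, j: float, thetas: np.ndarray, centre: tuple) -> np.ndarray:
    """Map a transverse-plane point (i, j) onto its x' position in each
    projection (one entry per camera angle theta).

    Assumes parallel-beam geometry rotating about `centre`:
        x' = (i - cx) * cos(theta) + (j - cy) * sin(theta)
    """
    cx, cy = centre
    return (i - cx) * np.cos(thetas) + (j - cy) * np.sin(thetas)
```
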
  • In step B1, the raw data from the gamma camera 10 is loaded into the computer system.
  • the data is re-formed into a set of sinograms as illustrated in figures 3 and 4, each corresponding to a transaxial slice.
  • For the purposes of segmenting out the data, the user first selects the sinogram to display (step B2). The selection depends upon the location of the object relative to the z-axis, and the relative positions of other traces. For example, the projection may be clearer in some sinograms than others.
  • the user then opts (step B3) for either automatic object definition (step B5) or manual object definition (step B4).
  • Manual object definition is necessarily substantially more labour intensive, but may offer advantages in respect of the skill of the user in assessing the shape and extent of objects and will be discussed in greater detail later with reference to figure 9.
  • In step B6, the user first selects a designation number to define a selected object (Object Number), which object will comprise a number of segments, ie. a number of trace portions S1, S2 and S3 relating to the same selected object within the sinogram.
  • the user displays the chosen sinogram which will best show the object (step B7), and identifies, on the sinogram, a selected trace portion or segment S1 of the object, by marking the upper and lower limits of the segment (ie. at the maximum and minimum theta values thereof) and a seed position P within the selected segment S1 of trace 64 (step B9A). Other segments S2 and S3 of the trace 64 are similarly marked (step B9B).
  • the system then allows the user to select (step B10) one of a number of predetermined global (e.g. Sobel) or direction-biased (e.g. Edge Compass) edge-detection algorithms well known in the art to locate the edges 80, 81 of the object segment in the positive and negative x' directions from point P and over the selected range of theta values for the segment on the sinogram.
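
By way of illustration only, a global (Sobel) and a direction-biased derivative image of a sinogram could be produced along the following lines; this is a sketch using scipy, and the Edge Compass operator itself is not reproduced here.

```python
import numpy as np
from scipy import ndimage

def edge_image(sino: np.ndarray, direction_biased: bool = False) -> np.ndarray:
    """Return a first-derivative edge image of a sinogram (theta rows, x' columns).

    With direction_biased=False the Sobel gradient magnitude is returned
    (a global operator); otherwise only the derivative along x' is kept,
    which favours the left/right edges of a trace.
    """
    gx = ndimage.sobel(sino, axis=1)          # derivative along x'
    if direction_biased:
        return gx
    gt = ndimage.sobel(sino, axis=0)          # derivative along theta
    return np.hypot(gx, gt)
```
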
  • In steps B12 to B22, the system uses the edge-finding algorithms to determine an outline shape or two-dimensional "edge map" 85, 86, 87 (on the sinogram, ie. in x'-θ space) of the various segments S1, S2 and S3 identified with the object trace 64. From the sinogram view, the system generates either a first or second derivative image (steps B12-B15). This image may be displayed to the user (step B16) and saved for further use during edge tracing. In steps B19 to B22, the edge tracing procedure is carried out from the derivative edge map. Starting from the seed position P identified by the user, the algorithm hunts for the nearest edge moving across the spatial derivative at that particular angle θ.
  • Upon finding the nearest edge, the system then attempts to locate a new, corresponding edge point for the adjacent θ value above or below in the sinogram. This procedure is carried out subject to the constraints that the new edge point should be within a user-specified target range of the previous edge point detected and that the change in the counts at the detected edge should exceed a user-specified "edge sharpness" expressed as a percentage of the maximum counts in the object at that angle. The process is terminated if the constraints are not met and results in a series of discrete points mapping the edge (eg. 85, 86 or 87, figure 5) in the defined segment. The procedure is repeated for both left and right edges of the sinogram trace, and for all segments.
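
A simplified sketch of one pass of this tracing procedure is given below; it is not the patent's implementation, and the target-range and sharpness defaults, the use of the row maximum as the object maximum, and the single upward pass are simplifying assumptions made here.

```python
import numpy as np

def trace_edge(edge_img: np.ndarray, sino: np.ndarray, seed_theta: int, seed_x: int,
               theta_max: int, target_range: int = 3, sharpness: float = 0.10) -> dict:
    """Trace one edge of a sinogram trace upwards in theta from a seed point.

    At each successive theta row the strongest derivative value within
    `target_range` pixels of the previous edge point is accepted only if
    it exceeds `sharpness` times the maximum counts in that row; otherwise
    tracing terminates.  A symmetric downward pass and a second pass for
    the opposite edge would complete the segment.
    """
    edges, x = {}, seed_x
    for theta in range(seed_theta, theta_max + 1):
        lo = max(0, x - target_range)
        hi = min(edge_img.shape[1], x + target_range + 1)
        cand = lo + int(edge_img[theta, lo:hi].argmax())   # nearest/strongest edge
        if edge_img[theta, cand] < sharpness * sino[theta].max():
            break                                          # edge-sharpness constraint failed
        edges[theta] = x = cand
    return edges                                           # discrete points mapping the edge
```
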
  • the user may view these points either on the original sinogram image, or on the edge image (step B21) and can toggle the display between the two (step B22).
  • a curve fitting algorithm is carried out to map the closest possible pair of sinusoidal waveforms respectively to the left-hand edges and the right-hand edges of individual segments S1-S3 which form the object trace 64. These waveforms are then drawn onto the displayed sinogram for confirmation by the user (steps B23 to B28), which finalises the object definition.
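
Because x'(θ) for any fixed point in the transverse plane has the form C + A·sin(θ + φ), the fit can be performed as a linear least-squares problem. The following sketch (function name and return convention are assumptions for illustration) fits one such waveform to a set of discrete edge points:

```python
import numpy as np

def fit_sinusoid(thetas_rad: np.ndarray, x_edges: np.ndarray):
    """Least-squares fit of x'(theta) = C + A*sin(theta + phi) to edge points,
    using the equivalent linear form C + a*sin(theta) + b*cos(theta)."""
    design = np.column_stack([np.ones_like(thetas_rad),
                              np.sin(thetas_rad),
                              np.cos(thetas_rad)])
    C, a, b = np.linalg.lstsq(design, np.asarray(x_edges, dtype=float), rcond=None)[0]
    amplitude = np.hypot(a, b)              # A
    phase = np.arctan2(b, a)                # phi
    return C, amplitude, phase
```
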
  • the separation of the sinusoidal traces for each theta value is used to calculate the object width and shape, and to calculate the total number of pixel counts attributable to the object (step B30). This is preferably carried out by plotting the number of counts along the sinusoidal trace against angle θ (see figure 4(b)).
  • This plot yields a trace having a relatively slow variation in count as a function of theta, representing variations arising from differing levels of attenuation through the body (eg. at segments S1, S2 and S3), upon which are superimposed relatively sharp deviations from the slowly varying trace where the counts of interfering traces combine with those of the selected object (eg. at areas 70, 71).
  • Known interpolation techniques are used to eliminate these sharp deviations, resulting in a slowly varying trace which corresponds to the counts deriving from the selected object only.
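
One simple way to realise this (a sketch, not the patent's method; the baseline filter width and deviation threshold are arbitrary illustrative values) is to flag points that depart sharply from a smoothed baseline and replace them by interpolation from neighbouring angles:

```python
import numpy as np
from scipy.ndimage import median_filter

def clean_count_profile(counts: np.ndarray, deviation: float = 0.25) -> np.ndarray:
    """Remove sharp deviations (interfering traces) from a counts-vs-theta profile.

    Points differing from a median-filtered baseline by more than `deviation`
    times the baseline are treated as interference and replaced by linear
    interpolation from the neighbouring, slowly varying angles.
    """
    counts = np.asarray(counts, dtype=float)
    baseline = median_filter(counts, size=9, mode='wrap')   # slowly varying component
    bad = np.abs(counts - baseline) > deviation * baseline
    idx = np.arange(counts.size)
    cleaned = counts.copy()
    cleaned[bad] = np.interp(idx[bad], idx[~bad], counts[~bad])
    return cleaned
```
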
  • the system may also use the trace 64 width to assume an ellipsoidal shape of the object and represent it as such, or to represent the object as irregular in shape (step B31), as appropriate.
  • once the edges of an object have been delineated in the sinogram, it is possible to represent them in the transverse plane via a reverse Radon transform. Since the detected edge 85, 86, 87 has a binary form, it does not require any filtering for moving between the planar (x-y space) and sinogram (x'-θ space) representations of the raw data set. This means that there is now an enclosed area that represents the location of the object whose edges have been traced. In order to estimate the activity within this region, the forward projection procedure as shown in figure 10 is used.
  • the body 20 outline is defined in the transverse section and the object of interest 40 outlined within it (step Dl). It will be understood that additional objects of interest may also be added as ellipsoidal or irregular shapes (step D2).
  • the total number of counts attributable to the object of interest is distributed into the transverse plane as a number of counts per voxel within the area of the object of interest (step D2). This is based upon the fact that radiation emission is isotropic. As an example, this means that if there are 6400 gamma rays emitted from the object of interest 40 (derived from the count vs. θ plot), we are likely to get on average 100 events detected in the object's projection in each of the 64 views acquired under ideal conditions.
  • an array of expected attenuation factors for the body 20 outline is generated and stored, or retrieved from memory (steps D3 to D8). These attenuation factors provide an estimate of the expected attenuation occurring at each angle θ for the segmented object of interest given its position within the body outline.
  • In step D9, account is taken of the observed phenomenon of line spread by setting a suitable line spread function. This compensates for a number of effects, such as scattering and equipment and acquisition parameters, which diffuse the image from, for example, a point source into an approximately Gaussian distribution.
  • In step D10, account is taken of known background noise effects by defining a suitable noise function, eg. a Poisson distribution.
  • the counts within the object are forward projected (steps D11, D12) and the distribution of counts obtained in each projection is compared with the counts that were actually obtained for the selected object in the raw data set (step B35).
  • the simulation parameters may then be adjusted to minimise the Chi-squared statistic (step B36) in order to get a forward projection count distribution result as close as possible to the actual counts obtained from the selected object 40 in each of the projections.
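
The comparison and adjustment loop can be pictured with the following much-simplified sketch, in which the only free simulation parameter is the total object activity and a proportional rescale stands in for a full Chi-squared minimiser (all names and the convergence tolerance are illustrative assumptions, not the patent's implementation):

```python
import numpy as np

def refine_activity(measured: np.ndarray, attenuation: np.ndarray,
                    activity0: float, n_iter: int = 10):
    """Adjust a simulated object activity until forward-projected counts
    match the measured per-view counts for the selected object.

    Simulated counts per view are modelled as (activity / n_theta) * attenuation.
    """
    n_theta = measured.size
    activity = float(activity0)
    for _ in range(n_iter):
        simulated = (activity / n_theta) * attenuation
        scale = measured.sum() / max(simulated.sum(), 1e-9)  # proportional update
        activity *= scale
        if abs(scale - 1.0) < 1e-3:                          # convergence criterion
            break
    simulated = (activity / n_theta) * attenuation
    chi2 = np.sum((measured - simulated) ** 2 / np.maximum(simulated, 1.0))
    return activity, chi2
```
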
  • the cycle of steps B34-B37 is repeated until a convergence criterion is satisfied (step B37), although in practice it is found that very few iterations, if any, are required.
  • the convergence criterion may be chosen according to a clinical situation governing the desired accuracy.
  • the operation of steps B2 to B39 must be repeated for each transverse slice. Since adjacent slices will be substantially similar to one another, with slowly varying values of the sinusoidal traces for successive slices, it is possible to carry out a repetitive process on the succeeding slice substantially automatically. Provided that the axial distance between transverse slices is not large compared with the dimensions of the object of interest, the seed position P will still be contained within the object of interest, the upper and lower limits for theta will vary only slightly and the object edges will have varied only slightly. Thus the edge-finding algorithm at steps B10 to B21 can operate automatically on the new slice, with confirmation from the user if necessary.
  • By step B39 we have, by addition, both the location of, and the counts within, the selected object for all the transverse planes that the object occupies, using only the raw data without any reconstruction.
  • In step C1, the left or right hand side of the sinogram trace is first selected and a sinusoidal waveform is displayed superimposed thereon. Phase and amplitude coefficients are adjusted by the user until a close match with the edge of the object trace is obtained (steps C2, C3). The exercise is repeated for the opposite trace edge (step C4).
  • a raw data set contains all the information necessary to characterise the distribution of radioactivity in three dimensions, and, for a given data set, it is possible to describe the relationships between the entire set of projections as a set of mathematical functions. Once this description is made, it is possible to manipulate the data set to predict clinically advantageous "what if" scenarios that maintain the relationships and provide quantitative parameters. The steps have been described in connection with figure 6.
  • the algorithm starts off with user identification of the object that is to be segmented and quantitated in the raw projection data, which is usually best done in the projection with the maximum counts from the object itself, and with minimal interference from any over- or underlying structures.
  • a user defined seed pixel within this object starts off a three dimensional edge detector that produces a series of discrete points defining the boundaries that satisfy a preset target range and edge sharpness, and terminates when all such points have been identified.
  • a least squares fit to this set of edge pixels defines the boundary of the object according to an assumed ellipsoid or irregular shape selected by the user.
  • the algorithm then forms an estimate of the outline of the patient's body according to a preset threshold from the limits as seen in all the projections, and also the mean background counts free from all other major objects.
  • a copy of the delineated object as well as the estimated body outline is produced in a new data set to form the basis of the forward projection simulation module.
  • An attenuation map is generated by associating the path length from each pixel located within the object of interest to the edge of the body outline, stored in a lookup table, with the attenuation coefficient.
  • the pixels within the body outline are given an initial count value based on the estimate of the mean background and the pixels within the object of interest are given an arbitrary initial count value by the user.
  • a Monte Carlo subroutine then isotropically distributes these initial estimates of counts per voxel for each projection angle.
  • This subroutine takes into consideration the aforementioned attenuation maps (and any additional attenuation corrections if required), noise, Modulation Transfer Function and time variance of activity within the segmented organ due to pharmacokinetic redistribution or radionuclidic decay.
  • a Chi-squared statistic is calculated to compare the simulated data with the actual data based on the projections with the majority of the counts arising from the object of interest, and used to revise the initial estimates iteratively. This procedure converges to a point when the simulation mirrors the original data closely for only the delineated object independent of all others.
  • the algorithm can branch one of two ways by either deleting the segmented object from the raw data set, or keeping the object but deleting out everything else, ie. image surgery.
  • This decision is made by the user based on the clinical situation for which the study was performed.
  • the quantitative data about the object, namely volume, activity and time variance during the period of acquisition, are inferred from the values of these parameters used during the simulation to get the minimum Chi-squared statistic. All the above steps and their resultant output can be overridden or modified by the user should the need be felt.
  • the entire sequence is repeated until all objects of interest have been segmented and quantitated independently of each other using the raw data set only, and a new data set is generated that includes the appropriate objects of interest only, in any combination dictated by the clinical situation.
  • the algorithm may then terminate at that point without attempting to form images.
  • the new data set which contains the quantitated object can be reconstructed using back projection with no prefiltering and a simple ramp filter to obtain images for comparison with conventionally filtered and reconstructed images.
  • Table 1 gives quantitative data for a phantom with six spheres, ranging in volume from 0.6 to 24 cL and with activities from 7.8 to 312 MBq, placed in water. The r values for both volume and activity are >0.99, showing good concordance between actual volumes and activities and those measured from regression lines fitted to the program output.
  • Figure 12 shows a conventionally backprojected reconstruction of a transverse section at the level of the kidneys of an Indium-111 Octreotide study using no prefiltering and a simple ramp filter.
  • Figure 13 shows a hybrid image with both the kidneys and spleen from the original data placed in the estimated body outline, while Figure 14 shows the first iteration of the algorithm to remove the contribution of the right kidney
  • Figure 15 shows the final iteration, showing the same section with the right kidney completely removed from the raw data set without affecting the left kidney or the spleen. It can be seen that the artefactual cold area between the kidneys is reduced.
  • Figure 16 shows a transverse section of a conventional postreconstruction image at the level of the bladder in a Tc-99m labelled CYT-351 study of prostate cancer at 24 hours with Wiener prefiltering and attenuation correction
  • Figure 17 shows the result of processing of the raw data set to reduce selectively the counts originating from the bladder prior to similar reconstruction.
  • the conventional image shows the effects of a wide range of contrast values, accumulation of activity in the bladder over the hour-long acquisition, and poor count statistics.
  • the two external iliac vessels at approximately 10 and 2 o'clock position (with respect to the bladder), the two internal iliac vessels at 5 and 7 o'clock and the extraprostatic extension of the carcinoma at 6 o'clock are visualised.
  • Two wedge shaped cold artefacts are present on either side of the bladder.
  • the iliac vessels are seen in essentially the same places as before, but the extension due to the prostate cancer is now separated from the bladder, and without the artefactual addition of counts originating from the bladder.
  • the wedge artefacts have disappeared and the contrast range is now balanced over the entire image.
  • the processing illustrated above is possible only after accurately reproducible derivation of both the size and activity within the object by the algorithm.
  • the program returns a series of numbers in terms of counts, pixels and percent kinetic variation, which can be calibrated easily to yield MBq·cm⁻³.
  • Any method that aims to provide clinically useful quantitative parameters must satisfactorily take into account all the variability in the entire expanse of factors affecting the acquisition of data in the routine environment of a Nuclear Medicine department if it is to achieve widespread acceptance. It is preferable if the incorporation of the corrections involves as little processing and user interaction as is technically possible so that the accuracy, reproducibility and confidence in the quantitative parameters are increased.
  • the method described here begins the processing, using the raw projection data only, before any artefacts are introduced by any reconstruction process. It then segments organs of interest and provides a simulated data set for each, independent of all others, that is capable of taking into account all the major factors affecting acquisition eg. the Modulation Transfer Function, noise, attenuation, large contrast values and time variance of the activity distribution due to pharmacokinetics.
  • the simulated data sets are forward projected to assess the accuracy of the simulation, and the parameters used for the simulation are used to manipulate the original data set to compensate for acquisition limitations and quantify volume and activity. The compensation and quantitation are done before the reconstruction introduces interdependence of voxel values.
  • the processing can be tailored to the pharmacokinetics and biodistribution of any particular agent. This is important in cases of those radiopharmaceuticals which offer low normal: abnormal ratios, have rapid redistribution kinetics or with wide variations in uptake in adjacent organs. It may be possible to have less stringent goals for new radiopharmaceuticals in terms of pharmacokinetics and biodistribution.
  • the ability to quantify objects independent of any process of image formation means that more rigorous comparisons can be made between studies carried out at different times, centres or protocols. Such an exacting basis is a prerequisite for the development of standardised databases of images and protocols, and in the assessment of equipment and software performance.
  • the mathematically defined relationships used by the algorithm may be able to form the basis of optimised acquisition protocols for new types of studies, for example, a reduced number of views but with a longer time per image over an arc or set of arcs that offer the best visualisation of a particular organ, or studies using rapidly distributing tracers that offer a combination of functional information and three dimensional visualisation.
  • the technique previously described herein uses an implementation which segments out a part of the raw projection data and then generates a simulated data set that closely resembles the contribution from the object of interest within this part of the data. Once this condition is met, it is inferred that the parameters used during the simulation are directly comparable to the object of interest. This provides the clinically important estimates of volume, activity and time variance.
  • Avoidance of this simulation process can be useful as simulation processes can be quite elaborate and complex in order to achieve optimal results. This requires considerable computing power to implement and can be time consuming to program, debug, optimise and check for accuracy under varying conditions.
  • the simplification of the edge detection / object delineation process can be desirable because, although the approach described above can utilize a number of well-developed edge-detection techniques, no single method can be applied to the wide range of tomographic data found in nuclear medicine. This means that a preferred implementation will include several different edge detectors, and it may therefore be necessary to select the type of detector used, and its operating parameters, dependent upon the particular clinical study. Whilst there is no problem with this, there are circumstances in which it is also desirable to offer a standardized analysis technique.
  • the alternative embodiment now to be described makes further use of a number of periodicities found in the sinogram representations of the raw data.
  • the alternative embodiment has particular applicability for camera systems in which, for each value of θ, information is collected simultaneously for all z. That is to say, the camera includes a detector 10 and collimator 15 which extend in two dimensions for simultaneous collection of x' data in all transaxial planes (z) at once, as previously discussed.
  • the data acquired and represented in the sinograms of figures 3, 4 and 5 essentially relate to the following dimensions: a) the three physical dimensions identified previously in figure 1 as x, y and z; b) the time during acquisition and therefore rotation of the camera head, ie. θ; c) the energy of the counts arising from the volume imaged, hereinafter referred to as E; d) the change in counts within objects during a period of acquisition; and e) the time between sets of acquisitions.
  • the sinograms 50 of a raw data set as depicted in figure 3 are each rotated by 90° anticlockwise, and laid end to end in order of the successive transaxial slices, ie. for increasing, or decreasing values of z.
  • a sinogram row 200 is thereby formed comprising successive sinograms 50₁ to 50ₙ representing successive z values.
  • the vertical axis represents x'
  • the horizontal axis represents, within each sinogram, θ from 0 to 360°, and from sinogram to sinogram, z from z₁ to zₙ.
  • the horizontal axis also represents time t in a saw-tooth function 210 depicted above the sinogram row 200, from time tb (beginning of acquisition period) to te (end of acquisition period).
  • the number of counts, or intensity I is represented by a height of trace perpendicular to the diagram of figure 18.
  • an object of interest will be represented on the sinogram row 200 as a sinusoidal trace of varying thickness according to the x-y dimensions of the object, with a frequency corresponding to one sinogram period. Variations in the dimensions of the object over the z-direction will manifest themselves as small discontinuities 202 in width and/or position of the trace at each sinogram boundary. The extent of the object in the z-direction will be manifested as the appearance of the object through a number of adjacent sinograms. Figure 18 shows this in part as a diminishing size of trace over sinogram views 50₁ to 50ₙ, and it is more clearly visible in figures 19(a), (b) and (c).
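
The construction of the sinogram row can be sketched as follows, under the same assumed (n_theta, n_z, n_xprime) array layout as the earlier sketch; np.rot90 performs the 90° anticlockwise rotation.

```python
import numpy as np

def sinogram_row(projections: np.ndarray) -> np.ndarray:
    """Build the sinogram row presentation of figure 18.

    `projections` is assumed shaped (n_theta, n_z, n_xprime).  Each
    transaxial sinogram is rotated 90 degrees anticlockwise so that x'
    runs vertically and theta runs horizontally, and successive slices
    are then laid end to end along the horizontal (theta, z) axis.
    """
    n_theta, n_z, n_x = projections.shape
    slices = [np.rot90(projections[:, k, :]) for k in range(n_z)]  # each (n_x, n_theta)
    return np.hstack(slices)                                       # (n_x, n_theta * n_z)
```
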
  • Figures 19(a), (b) and (c) show actual data taken from human patients showing the portion of the body encompassing the liver, prostate and parathyroid respectively. In each figure, the entire length of the sinogram row 200 comprises all seven rows 200₁ to 200₇ in a continuous chain.
  • the figures 19(a), (b) and (c) clearly show the plurality of interfering traces which are normally found in a sinogram view. The figures also clearly show the extent of the objects in the z direction.
  • the three physical dimensions of the object of interest and its location within the volume being imaged are displayed in the phase, amplitude and length of the sinogram trace, while the counts arising from it are displayed in the height of the trace.
  • the time period of the acquisition is the saw-tooth function from sinogram view 50₁ to 50ₙ in figure 18.
  • a step change in the height (I) at the beginning and end of each sinogram view and from one sinogram view to the next relates to the time variance of activity over the data acquisition period te - tb.
  • a drop between the end of one view and the beginning of the next indicates an accumulation of activity, and a rise indicates a decrease in activity over the period of acquisition; comparing the beginning and end of the same view, the interpretation is reversed, ie. a drop indicates a decrease.
  • Step changes in the phase and amplitude of the sinogram trace relate to the shape and orientation of the object.
  • the characteristics of the sinogram trace relate directly to the properties of the object producing it.
  • An essential aspect of this is the periodicity of these characteristics which can be exploited to segment out an object of interest.
  • Figure 20 shows a schematic three dimensional visualization of the intensity data represented by just one object trace 210 within a sinogram row, with intensity I (number of counts) represented by the orthogonal axis.
  • the trace 210 has finite width in x' (x-y) space as it snakes along the sinogram row, although this width is not shown in the diagram.
  • the trace 210 will, of course, be typically interfered with by other traces 212 shown in dotted outline interlacing with trace 210 as previously described with reference to figure 4.
  • the trace height (figure 4(b) and figure 20, lower inset showing I vs. θ,z), ie. the I vs. θ (and also I vs. θ,z) plot for a selected object, will vary sinusoidally over a period of one sinogram view.
  • trace height variations from interference from interlacing traces will have a periodicity of twice that of the selected object, and these variations can be identified and eliminated accordingly.
  • Local attenuation by objects, eg. bone structures, will cause dips in the intensity vs. z trace which may also be detected since they force a departure from the ideal sinusoidal trace. Thus these local dips may also be eliminated by interpolation.
  • the function of intensity I versus x' (shown in figure 21(b)) is used to determine the number and location of peaks 230, 232 in the un-normalized raw data set.
  • a preprogrammed suitable threshold value may be used to determine which peaks to examine on a first pass. Alternatively, this may be determined by reference to a clinical database which enables control of the system by defining approximate locations of major peaks for a particular clinical study.
  • Each maximum point x' for any given value of θ is related to the nearest maximum point x' for the next value of θ, forming a succession of points (hereinafter a "set") along the trace 210 in figure 21(a), and similarly along the trace 212.
  • the two sets of points corresponding to traces 210 and 212 will intersect twice (220, 221) per sinogram view such that, for example, only one I vs. x' peak is located at θB. These intersections are discounted for the time being, leaving blocks of points between the intersections.
  • the extent of the object is now determined within these blocks by determining an object width for each θ, ie. determining a value on each side of the peak which delineates the object.
  • a pair of minimum points x'₁, x'₂ is derived by computing the magnitude of the first derivative, dI/dx' (figure 21(c)). These points determine the edges of the traces 210, 212 at that value of θ.
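
One possible reading of this step is sketched below, under the assumption that the edge on each side of a peak is taken where |dI/dx'| falls back towards zero; the threshold value and function name are illustrative, not taken from the patent.

```python
import numpy as np

def object_extent(profile: np.ndarray, peak: int, grad_floor: float = 0.1):
    """Estimate the edge points x'1 and x'2 of one object in an I-vs-x' profile.

    Starting at the peak, walk outwards in each direction and stop where
    |dI/dx'| drops below `grad_floor` times the largest gradient seen on
    that side, ie. where the profile levels off into background.
    """
    grad = np.abs(np.gradient(np.asarray(profile, dtype=float)))

    def walk(step: int) -> int:
        x, side_max = peak, 1e-9
        while 0 < x + step < len(profile) - 1:
            x += step
            side_max = max(side_max, grad[x])
            if grad[x] < grad_floor * side_max:
                break
        return x

    return walk(-1), walk(+1)      # (x'1, x'2) at this value of theta
```
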
  • a least squares curve-fitting algorithm is then used to define the edges 215, 216 of each trace 210, 212. This may be smoothed using the first harmonic of the Fourier expansion of the curve.
  • Various refinements may be used to optimize the position of the curves defined by the x'₁ and x'₂ points, for example taking into account factors such as the attenuation coefficient Z, the thickness of attenuating material and the energy E of the γ radiation. These factors may be determined empirically or by reference to a priori knowledge of the organs being imaged.
  • the number of unique sets of maximum points derived is the number of objects identified, and it is now possible to relate the phase and amplitude of each set of maximum points 230 and the length of the entire trace 210, 212 into the x, y and z dimensions of each object.
  • the next task is to determine the number of counts originating from each object identified thus far.
  • the count profile I vs. 0,z is plotted along the sinogram trace (figure 20, lower inset). This is smoothed to eliminate contributions from interlaced traces and to compensate for local attenuation as previously described.
  • the first harmonic of the Fourier expansion of this count profile may be taken to eliminate noise from the curve fitting.
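
Keeping only the mean term and the first harmonic of the profile's Fourier expansion can be done directly with an FFT, as in the following sketch (function name assumed for illustration):

```python
import numpy as np

def first_harmonic(profile: np.ndarray) -> np.ndarray:
    """Smooth a counts-vs-theta profile over one sinogram period by keeping
    only the DC term and the first harmonic of its Fourier expansion."""
    coeffs = np.fft.rfft(np.asarray(profile, dtype=float))
    kept = np.zeros_like(coeffs)
    kept[:2] = coeffs[:2]                      # DC component and first harmonic
    return np.fft.irfft(kept, n=len(profile))
```
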
  • the volume integral of this count profile relates to the number of counts originating from the object (including the background contribution at this point in the analysis). This realization makes the elimination of the Monte Carlo simulation possible, as discussed earlier. It is now possible to convert the parameters obtained above into true estimates of the physical dimensions, radioactivity and time variance for each object, providing that information relating to the characteristics of the gamma camera, the body outline and the background contribution are provided.
  • the gamma camera characteristics are obtained to a required degree of thoroughness from a series of experiments calibrating the technique for a particular gamma camera set up.
  • the body outline and background contributions can be assessed and calibrated using techniques described hereinbefore.
  • the second sinogram data set may be back-projected for study or normalized to enable identification of a new set of peaks 230, 232 relating to objects of lower activity. A further pass can then be made to eliminate contributions from these second order objects, or to examine them in detail.
  • panspectral gamma cameras may be used to acquire and assign radioactivity counts over a range of photon energies into separate energy bins which can further refine and improve the raw data segmentation process described above.
  • a number of sinogram row data sets 200A to 200X are laid out to form a page 250 as shown in figure 22.
  • Each page 250 relates to a data set acquired at a time t over an acquisition period t + Δt, ie. from tb to te as shown in figure 18.
  • Each page 250 comprises a number of sinogram rows in which each row corresponds to bin counts for a given photon energy (E) range. The highest energy bin corresponds to the top row (row 1) and the lowest energy bin corresponds to the bottom row (row n).
  • Each row therefore comprises data from one part of the energy spectrum as indicated on the right hand side of the figure. Organising the data set in this manner enables refinements to correction methods for attenuation and scatter by exploiting information contained in the sinogram rows 2 to n.
  • for a count detected at a given location in the top (photopeak) row, the probability that it will give rise to a certain pattern of counts in all of the other rows can be estimated as a probability map.
  • the spread of this location further down in the energy spectrum may be estimated by a series of expanding windows 261, 262 ... 265 (shown greatly exaggerated in the diagram) of, eg. 3×3 data elements for rows 200B to 200D, 4×4 data elements for the next row 200E, 5×5 data elements for the next row and 9×9 for the remaining rows.
  • the number of counts within each window is, respectively, A·f₂, A·f₃, ... A·fₙ, where fₙ represents a parameter which is dependent upon the energy of the photon and the attenuation factor (Z value) of the attenuating medium.
  • S1 counts are the result of interactions of photons arising from object 1
  • S2 counts are the result of interactions of photons arising from object 2
  • the probability maps also have periodic properties. For example, the spread of counts arising from a non-central object varies periodically, narrowing for θ where the object is close to the camera and widening for θ where the object is furthest from the camera. This means that if the counts originating from a particular point lower down in the energy range are considered, then the contributions S1 and S2 of each object can be refined still further according to the location of the object within the three-dimensional volume being imaged. This is because there is a higher probability that an object further away from the camera will give rise to an interaction because of the depth of the intervening medium than for an object closer to the camera, in proportion to the activity inside each object.
  • the realisation that the variation of probabilities has periodic characteristics simplifies the implementation of the algorithm on a computer.
  • the present invention therefore organizes raw acquisition data into sinogram-based data structures and exploits various periodicities in the data structure, which periodicities facilitate the segmentation from, and quantitative analysis of, object data from the raw data set without the need for data reconstruction using back projection techniques.
  • the data segmented out can be the object of interest or can be an object which is obscuring remaining data in the raw data set, ie. the remaining data set essentially comprises one or more objects of interest for which quantitative data is sought without the effects of the initially segmented out object. Successive objects may be segmented to gradually leave a data set which includes the object(s) of interest.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Apparatus For Radiation Diagnosis (AREA)

Abstract

In order to avoid the problems inherent in data reconstruction and back-projection in the field of image tomography, a sinogram-based technique is used to segment out selected portions of the raw projection data set prior to data reconstruction. The technique enables provision of quantitative parameters in respect of designated objects of interest within a scan field. The technique uses aspects of geometrical predictability in the sinogram presentation to automatically segment out selected data from the raw data set.

Description

RAW DATA SEGMENTATION AND ANALYSIS IN IMAGE TOMOGRAPHY
The present invention relates to the field of image tomography, and in particular to methods of identifying, locating or analysing objects of interest in single-photon emission tomography (SPET) or PET studies.
The methodology of SPET is described in detail in several standard textbooks on Nuclear Medicine. Broadly speaking, the technique is used to construct a three-dimensional map of radioactivity sources within an entity or target body, providing relative levels of radioactivity emanating from each of a plurality of volume elements making up the entity. Typically, the radioactivity sources are gamma radiation emitters which have been injected into the body of a patient, and these are detected by a gamma camera which images a plurality of views of the body as illustrated in figures 1 and 2. However, three dimensional mapping of other entities is also achieved with this technique.
The imaging system typically comprises a detector 10 and a collimator 15 adapted to image emissions parallel to the axis of the collimator, ie. as depicted in figure 1a, in the x-direction. The camera is adapted to rotate about an axis of rotation, R, which is preferably coincident with the centre of the target body 20, and successive images are produced at varying angles of theta degrees, for example, as shown at figure 1b (θ = 45°) and figure 1c (θ = 90°). Typically, images are taken every 2-6°. The resulting set of images each provide total radiation counts for, or projections of, a plurality of parallel columns 22 passing through the body. These images are then back-projected using known techniques to compute the relative number of counts sourced from each pixel ij (figure 1d) of a matrix 35 representing a transaxial or transverse slice through the body 20. Each pixel will have finite thickness in the z-direction which is a function of the equipment and acquisition parameters being used, and thus corresponds to a volume element or "voxel" in the transaxial or transverse (x-y) plane. Further transaxial slices are imaged at various positions along the z-axis, and from this further information, relative numbers of radiation counts from all voxels in the body are calculated to reconstruct a three dimensional "image" of radioactivity within the target body.
The back-projection techniques used are well-known in the art, but all of them make a substantial number of approximations and assumptions about the data which result in an effective filtering of the data when reconstructing the voxel data. In particular, back-projection techniques necessarily use an averaging process to determine the values of individual voxels, and thus introduce a smoothing effect, ie. a high-frequency filtration of the data. Thus, the voxel map produced in the reconstructed image cannot be considered as "raw data" and further processing (ie. filtering for image enhancement or quantitative analysis) of this data can cause problems with unwanted filtering artefacts such as aliasing and the like.
As a further example, a digital data processing system must, in creating a reconstructed image, back-project to a predetermined back-projection matrix 35 or grid, such as that shown in figure 1d. Because of this, a substantial amount of interpolation of the data is required when a projection such as that shown in figure 1b is not aligned with the back-projection matrix (figure 1d). In other words, the quantized x' axis of the camera must be mapped to the quantized x-y axes of the matrix 35. Various schemes exist for such interpolation, varying from "nearest-pixel" mapping to linear and even more complex interpolation methods, all of which introduce a further element of data filtration. Filtering is required, subsequent to this interpolation, in particular to remove the "star" artefact, which also introduces interdependence of voxel values that prevents derivation of quantitative parameters.
A further example is attenuation and scatter correction. Each projection image provides a total radiation count for a given column 22. A simple, but unsophisticated technique for back-projection is to assume, in the first instance, that the radiation count derives from sources distributed evenly throughout the length of the column, before weighting the data with counts from other images transverse thereto in the reconstruction process. However, this simplistic approach ignores known attenuation and scatter factors for radiation passing through the body 20, for which approximate corrections can be made during the back-projection process. Although resulting in a more accurate final image, such corrections also introduce filtering artefacts which can cause problems with later data processing.
All of the above factors severely limit the scope for quantitation of the data derived from SPET studies. In summary, the accuracy of quantitative data is affected by factors such as noise, scatter, attenuation, sampling errors, detector characteristics, time variance of the radioactivity distribution during the period of acquisition and the type of reconstruction algorithm and filtering used. Much work has been done towards estimating the effects of each of these factors and towards providing methods attempting to deal with the inaccuracies caused, such as those described in European Journal of Nuclear Medicine 19(1), 1992, pp. 47-61; K A Blokland et al: "Quantitative analysis in single photon emission tomography". However, none of the approaches are in widespread use, and they fail to tackle a fundamental problem, namely that of maintaining statistical independence of the sets of pixels or voxels of the reconstructed images.
Because of these filtering effects and the resulting interdependence of voxel values, statistically valid quantitative parameters cannot be derived from the reconstructed data set (voxel map). Thus, it has not previously been possible to provide an accurate absolute measure of the radiation emitted from a given volume or region of interest within a body 20. This aspect of the reconstruction techniques further prevents time dependent studies of radioactivity changes between different scans, since there is no accurate absolute measure of radioactivity in one scan to compare with later scans.
The ability to provide quantitative measurement of radiation sourced from a region of interest within a body is a highly desirable goal in a number of fields. In particular, there are many clinical benefits such as enabling a clinician to more accurately locate and analyse a disease site in a scanned patient. The ability to use time as an additional factor in deriving quantitative data further enhances the ability to make dosimetry measurements for radionuclide therapy of cancer and other diseases. However, it will be understood that such techniques have a far wider applicability beyond aiding diagnosis and therapy of the living body.
In the present invention, it has been recognized that statistical independence of volumes of regions of interest within a tomographic image can be maintained by removing, or segmenting out selected parts of the data prior to carrying out any data reconstruction, such as back projection methods. It is an object of the present invention to provide a method of data reconstruction which enables the provision of quantitative data relating to specific regions of interest within a body scanned using single photon emission tomographic techniques.
It is a further object of the invention to enable the determination of volume, activity and chronological changes in photon emission tomography data prior to, or without any, data reconstruction.
It is a further object of the present invention to provide a method of producing tomographic images in which contributions from predetermined regions of the imaged body are removed prior to processing the data by back-projection or other methods.
According to a first aspect, the present invention provides a method of deriving quantitative data from a raw projection data set of an entity in a tomographic image by the steps of:
(a) segmenting out at least one selected object within the entity in the projected images; (b) modifying the raw projection data set to form a second data set comprising data relating only to the at least one selected object; and (c) using the second data set to determine quantitative information on the volume defined by the selected object by comparison with the raw data set.
According to a second aspect, the present invention provides a method of enhancing portions of an entity in a tomographic image by elimination of contributions thereto from a selected object within the entity by the steps of: (a) segmenting out the selected object in the projected images; (b) modifying the raw data set to form a second data set in which contributions from the selected object have been eliminated; and
(c) using the second data set to create a back-projected three-dimensional image of the entity.
According to a further aspect, the present invention provides a method of segmenting out at least one selected object from a raw projection data set of an entity in a tomographic image and modifying the raw projection data set to form a second data set comprising data relating only to the at least one selected object, comprising the steps of: forming a sinogram for a given transverse plane; identifying the edges of a selected object for a selected theta value; tracking the edges of the object through its sinusoidal path on the sinogram for all theta to determine an object extent; modifying the raw data set to form the second data set by using only data corresponding to the object extent.
The methods are useful in imaging the body of a patient.
A still further aspect of the invention provides a method of locating a disease site in a patient, the method comprising the method of the first or second aspects of the invention wherein the entity is the body of the patient and the at least one selected object is either: (a) the disease site, or (b) a site which partially obscures the disease site.
By disease site we include any site in the body which exhibits abnormal characteristics. In particular, a disease site includes a tumour which may be a primary tumour or may be a secondary tumour or metastasis. In particular, the disease site may be a prostate tumour and a typical site which obscures this is the bladder of the patient. The present invention will now be described in detail by way of example, and with reference to the accompanying drawings in which:
Figures 1(a) to 1(d) show schematic diagrams in the x-y plane useful in the explanation of back-projection tomography techniques;
Figure 2 shows a perspective schematic view of a plurality of images or projections formed from a single transaxial plane or "slice" through a body;
Figure 3 shows a schematic view of a sinogram presentation of the images in figure 2;
Figures 4(a) and 4(b) show a further schematic view of a sinogram presentation of a more complex set of images together with a corresponding plot of total counts per angle of the sinogram for an object therein;
Figure 5 shows a further schematic view of the sinogram presentation of figure 4(a);
Figure 6 shows a flow diagram of the principal steps of the present invention;
Figure 7 shows a flowchart for the initialization of a forward projection image matrix;
Figures 8(a) to 8(f) show a flowchart indicating the steps taken during automated segmentation of data in each sinogram;
Figure 9 shows a flowchart indicating the steps taken during manual segmentation of data in each sinogram;
Figures 10 and 11 show the steps taken during the forward projection of the segmented data to estimate counts contributed from segmented objects;
Figures 12 to 17 show exemplary images illustrating use of the techniques of the present invention, in which:
Figure 12 is a conventionally back-projected reconstruction of a transverse section at human kidney level;
Figure 13 is a hybrid image with both kidneys and spleen in an estimated body outline;
Figure 14 shows the result of the first iteration of an algorithm according to the present invention to remove the contribution of the right kidney to the raw data;
Figure 15 shows the result of the final iteration of the algorithm according to the present invention having removed the right kidney without affecting the left kidney or spleen;
Figure 16 shows a transverse section of a conventionally reconstructed image at bladder level;
Figure 17 shows a transverse section corresponding to figure 16, but with pre-processing of the raw data set according to the present invention prior to reconstruction;
Figure 18 shows a diagram illustrating a sinogram row presentation of raw data according to one aspect of the present invention;
Figures 19(a), 19(b) and 19(c) show actual image raw data in the sinogram row presentation of figure 18;
Figure 20 shows a diagrammatic three dimensional view of the sinogram row presentation of data of figure 18;
Figures 21(a), 21(b) and 21(c) are diagrams useful in explaining data processing techniques according to one aspect of the present invention; and
Figure 22 shows a diagram of a panspectral version of the sinogram row presentation of figures 18 to 20.
With reference to figure 2, a body 20 located at the centre of rotation of a gamma camera is imaged in a transaxial or transverse (x-y) plane to produce a plurality of projections 30₀ to 30ₙ, each at an angle theta to the back-projection matrix 35 in the x-y plane. For the purposes of illustration, the projections 30 are shown as having finite dimension in the z-direction. Typically, a gamma camera will comprise a two-dimensional array collimator thereby providing simultaneous data collection for several transaxial planes at once. However, for the purposes of the present invention, whether data is collected by a two- or one-dimensional array is not relevant; the user can be presented with a set of one-dimensional projections in x'-z space for each angle θ. For simplicity of illustration, projections 30 are shown only for theta multiples of 15°, although more typically there would be up to 128 images per 360° rotation.
An object of interest 40 located within the body 20 at position i,j,k is imaged on the projections 30 at varying positions as indicated, representing an increased or decreased number of counts usually displayed as intensity of the image.
With reference to figure 3, there is shown a sinogram presentation 50 of the data collected in figure 2. In the sinogram presentation, for a single transverse plane, the x' data from projection 30₀ (θ=0) are displayed along the horizontal axis, and projections 30₁, 30₂ ... 30ₙ corresponding to successively increasing values of θ are located thereover at increasing or decreasing ordinates.
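Purely by way of illustration, the re-ordering of acquired projection data into per-slice sinograms might be sketched as follows; the array shapes and the NumPy-based implementation are assumptions made for the example, not part of any particular camera system:

```python
import numpy as np

def projections_to_sinograms(projections):
    """Re-order raw projection data into one sinogram per transaxial slice.

    projections: array of shape (n_theta, n_z, n_x) -- one 2-D image per
    camera angle.  Returns shape (n_z, n_theta, n_x): for each slice z, the
    rows are the 1-D x' profiles at successive theta values."""
    return np.transpose(np.asarray(projections), (1, 0, 2))

# Example with simulated data: 64 views, 32 slices, 64 transaxial bins.
raw = np.random.poisson(5.0, size=(64, 32, 64))
print(projections_to_sinograms(raw).shape)   # (32, 64, 64)
```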
If we now consider an object at the centre of rotation, R, of the gamma camera, this will form a trace on the sinogram 50 which is a vertical straight line because of its central location. Assuming uniform attenuation throughout a body with cylindrical geometry being imaged, the intensity (ie. number of counts) will remain constant along the line. The signal to noise ratio for each projection will depend upon the time per projection. This feature of the sinogram presentation is already in use in the process of using a point source positioned slightly off-centre for centre-of-rotation corrections.
As shown in figure 3, for any object located anywhere within the field of view, the sinogram trace 45 can be predicted in both phase and amplitude (which correspond to the location of the object in the x-y plane) and in variation of the number of counts (or intensity) along the trace (which corresponds to the effects of physical factors such as attenuation and scatter). The change in counts between the first few projections 30₀, 30₁, etc. and the last few projections 30ₙ₋₁, 30ₙ represents any changes in activity within the period of acquisition.
It is also noted that the shape of the object of interest will manifest itself as a variation in the thickness a,b of the trace 45 with varying theta, the edges of the object of interest typically being discernible by changes in intensity.
It has been recognised that aspects of the predictability of the sinogram trace 45 can be used as a valuable and powerful tool in segmenting out data sets of interest from the raw data of the projections, prior to carrying out image reconstruction using back-projection techniques and the like. It is therefore possible to isolate data from regions of interest before their voxel values become contaminated with dependencies from other regions of the image arising during data reconstruction. It is therefore possible to isolate the effects of other parts of the image before reconstruction. In figure 4(a), a sinogram 60 is shown for two objects spatially separated within the body being scanned. Trace 62 represents an object for which quantitative data is required. Trace 64 represents an interfering object in which we have no interest. For a large part of the sinogram, the traces do not meet, eg. trace portions S1, S2 and S3. However, there are two areas 70,71 where they intersect, ie. for those values of theta, the objects lie one in front of the other. Because of the predictable nature of the sinogram, it is possible to interpolate, for each trace, the expected contribution to the counts within the two intersections, as clearly shown in figure 4(b). Figure 4(b) represents the number of counts associated with the trace 62 for each angle θ.
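As an illustrative sketch only, the interpolation of the count-versus-theta profile of a selected trace across angles at which an interfering trace crosses it might take the following form; the linear interpolation and the array layout are assumptions for the example:

```python
import numpy as np

def excise_intersections(counts_per_theta, intersect_mask):
    """Replace the counts of a selected trace at angles where an interfering
    trace crosses it by values interpolated from the unobstructed angles on
    either side (cf. areas 70, 71 of figure 4).  Linear interpolation is an
    assumption; the slowly varying expected curve could be fitted instead."""
    counts = np.asarray(counts_per_theta, dtype=float)
    mask = np.asarray(intersect_mask, dtype=bool)
    thetas = np.arange(counts.size)
    counts[mask] = np.interp(thetas[mask], thetas[~mask], counts[~mask])
    return counts

profile = np.array([100, 102, 105, 180, 175, 108, 110, 109], dtype=float)
mask = np.array([0, 0, 0, 1, 1, 0, 0, 0], dtype=bool)
print(excise_intersections(profile, mask))   # spikes at the crossing are interpolated away
```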
It is possible to use the data at positions 73,74 to deduce the contribution of trace 64 to intersection 70, and to use the data at 75,76 to deduce the contribution of trace 64 to intersection 71. Thus, the overall contribution to the sinogram from trace 64 may be readily excised from the entire sinogram, leaving the trace 62 (and any other traces not shown) unaffected by the object creating trace 64. The exercise is repeated for sinograms of each transverse slice. Alternatively, it is possible to extract from the sinogram data, only the counts relating to trace 62, discarding all other data.
Because segmentation takes place before reconstruction, statistical independence from any subsequent manipulation or reconstruction of the data is assured, and calculation of quantitative parameters prior to the introduction of reconstruction compromises and artefacts becomes possible.
The selected object for segmentation may take one of two forms. In the first scenario, it is only the selected object for which the user requires detailed or quantitative count information, or a detailed three dimensional image. In this case, all other data in a sinogram is discarded, leaving only data corresponding to the selected object as an object of interest. In a second scenario, the selected object may be an obscuring object, the data from which is masking or interfering with the data required during the reconstruction process. For example, there may be an accumulation of radioactive material in a particular organ in a patient's body which is not of interest, but the high counts generally obscure other regions in which smaller, but nevertheless crucial quantities of radioactive material are accumulated. In this case, data corresponding to the selected object are discarded, leaving all other raw data intact and more clearly showing any objects of interest.
Once data from the selected object have been removed or extracted, an image may be formed from the remaining data, and quantitative measurements made.
The general procedural steps of a method according to the present invention are shown in figure 6, and will be described later in more detail with reference to figures 7 to 10.
With reference to figure 6, the raw image data from projections 30₁ to 30ₙ are loaded into the computer system (step 101). A sinogram representation 60 of the data is formed (step 102) and, based on operator analysis of the sinogram, the sinogram trace 62 relating to a selected object is identified (steps 103, 104). Using edge detection algorithms, in combination with certain predictable aspects of the sinogram presentation, the edges of the selected object are defined (step 105), and the count data from the object isolated from the raw data (step 106) to form a second data set (step 111), either by segmenting out the selected object to leave the selected object data only (steps 107, 108), or by segmenting out the selected object to leave all other data (steps 109, 110) more clearly showing any objects of interest within the body. This modified data set may then be used to form an image (step 112), or to provide quantitative data such as volume of the object, activity within the object and activity variation with time (steps 114-116), providing output either on a suitable display device or print out (step 117).
With reference to figure 7, overall system parameters are first set up. In step A1, parameters for the forward projection of matrix 35 are specified for each transverse (x-y) image plane, including matrix size (resolution) and number of projections 30 comprising the raw data set. Other display parameters may be set, and the method of interpolation to be used for back-projection of the projections 30 onto the non-aligned matrix 35 is specified.
In step A2, a matrix is constructed which gives the mapping of a point (i,j) in the x-y plane of the transverse image onto the sinogram at values (x',k).
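A minimal sketch of such a mapping is given below, assuming a simple x' = x·cos θ + y·sin θ rotation convention and nearest-bin rounding; a practical system would match the camera geometry and the chosen interpolation method:

```python
import numpy as np

def build_forward_map(n_pix, n_theta):
    """Map each transverse-plane pixel (i, j) onto its sinogram abscissa x'
    for every projection angle (cf. step A2).  The rotation/sign convention
    and nearest-bin rounding are illustrative assumptions."""
    centre = (n_pix - 1) / 2.0
    thetas = np.deg2rad(np.arange(n_theta) * 360.0 / n_theta)
    jj, ii = np.meshgrid(np.arange(n_pix) - centre, np.arange(n_pix) - centre)
    # x'[k, i, j] = detector position of pixel (i, j) at projection angle k
    x_prime = (jj[None, :, :] * np.cos(thetas)[:, None, None]
               + ii[None, :, :] * np.sin(thetas)[:, None, None]) + centre
    # clip to the detector range for pixels near the matrix corners
    return np.clip(np.rint(x_prime).astype(int), 0, n_pix - 1)

fmap = build_forward_map(n_pix=64, n_theta=64)
print(fmap.shape)   # (64, 64, 64): angle x row x column
```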
With reference to figure 8, the data segmentation is then carried out. In step B1, the raw data from gamma camera 10 is loaded into the computer system. The data is re-formed into a set of sinograms as illustrated in figures 3 and 4, each corresponding to a transaxial slice.
For the purposes of segmenting out the data, the user first selects the sinogram to display (step B2). The selection depends upon the location of the object relative to the z-axis, and the relative positions of other traces. For example, the projection may be clearer in some sinograms than others. The user then opts (step B3) for either automatic object definition (step B5) or manual object definition (step B4). Manual object definition is necessarily substantially more labour intensive, but may offer advantages in respect of the skill of the user in assessing the shape and extent of objects and will be discussed in greater detail later with reference to figure 9.
In step B6, the user first selects a designation number to define a selected object (Object Number), which object will comprise a number of segments, ie. a number of trace portions S1, S2 and S3 relating to the same selected object within the sinogram. The user displays the chosen sinogram which will best show the object (step B7), and identifies, on the sinogram, a selected trace portion or segment S1 of the object, by marking the upper and lower limits of the segment (ie. at maximum and minimum theta values thereof) and a seed position P within the selected segment S1 of trace 64 (step B9A). Other segments S2 and S3 of the trace 64 are similarly marked (step B9B).
The system then allows the user to select (step B10) one of a number of predetermined global (e.g. Sobel) or direction biased (e.g. Edge Compass) edge-detection algorithms well known in the art to locate the edges 80,81 of the object segment in the positive and negative x' directions from point P and over the selected range of theta values for the segment on the sinogram. Typical edge finding algorithms are described in "Digital Image Processing", W K Pratt, J Wiley & Sons.
In steps B12 to B22 the system uses the edge-finding algorithms to determine an outline shape or two-dimensional "edge map" 85,86,87 (on the sinogram, ie. in x'-θ space) of the various segments S1, S2 and S3 identified with the object trace 64. From the sinogram view, the system generates either a first or second derivative image (steps B12-B15). This image may be displayed to the user (step B16) and saved for further use during edge tracing. In steps B19 to B22, the edge tracing procedure is carried out from the derivative edge map. Starting from the seed position P identified by the user, the algorithm hunts for the nearest edge moving across the spatial derivative at that particular angle θ. Upon finding the nearest edge, the system then attempts to locate a new and corresponding edge point for the adjacent θ value above or below in the sinogram. This procedure is carried out subject to the constraints that the new edge point should be within a user specified target range of the previous edge point detected and that the changes in the counts at the edge detected should exceed a user specified "edge sharpness" expressed as a percentage of the maximum counts in the object at that angle. The process is terminated if the constraints are not met and results in a series of discrete points mapping the edge (eg. 85, 86 or 87, figure 5) in the defined segment. The procedure is repeated for both left and right edges of the sinogram trace, and for all segments.
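The edge-tracing loop of steps B19 to B22 might be sketched as follows; the search strategy and the default parameter values are illustrative assumptions rather than a prescribed implementation:

```python
import numpy as np

def trace_edge(sinogram, seed_theta, seed_x, theta_max,
               target_range=3, edge_sharpness=0.2, direction=+1):
    """Follow one edge of an object trace from a user seed (cf. steps B19-B22).

    sinogram:       2-D array indexed [theta, x'].
    target_range:   maximum x' jump allowed between adjacent theta rows.
    edge_sharpness: minimum |dI/dx'| at an edge, as a fraction of the
                    maximum counts in that row.
    direction:      +1 hunts in the positive x' direction, -1 in the negative.
    Returns a list of (theta, x') edge points; tracing stops as soon as the
    constraints can no longer be satisfied."""
    edges = []
    x = seed_x
    for theta in range(seed_theta, theta_max + 1):
        row = sinogram[theta].astype(float)
        grad = np.abs(np.gradient(row))
        threshold = edge_sharpness * row.max()
        lo, hi = max(0, x - target_range), min(row.size - 1, x + target_range)
        candidates = [i for i in range(lo, hi + 1) if grad[i] >= threshold]
        if not candidates:
            break                       # constraint violated: terminate
        x = max(candidates) if direction > 0 else min(candidates)
        edges.append((theta, x))
    return edges

demo = np.random.poisson(3.0, size=(64, 64)).astype(float)
demo[:, 30:34] += 50.0                  # a crude vertical "trace"
print(trace_edge(demo, seed_theta=10, seed_x=32, theta_max=20)[:3])
```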
The user may view these points either on the original sinogram image, or on the edge image (step B21) and can toggle the display between the two (step B22).
Once the segments S1, S2 and S3 have been defined by the system on the sinogram, a curve fitting algorithm is carried out to map the closest possible pair of sinusoidal waveforms respectively to the left-hand edges and the right-hand edges of individual segments S1-S3 which form the object trace 64. These waveforms are then drawn onto the displayed sinogram for confirmation by the user (steps B23 to B28), which finalises the object definition. Upon satisfactory definition of the object, the separation of the sinusoidal traces for each theta value is used to calculate the object width and shape, and to calculate the total number of pixel counts attributable to the object (step B30). This is preferably carried out by plotting the number of counts along the sinusoidal trace, against angle θ (see figure 4(b)). This plot yields a trace having a relatively slow variation in count as a function of theta, representing variations arising from differing levels of attenuation through the body (eg. at segments S1, S2 and S3), upon which are superimposed relatively sharp deviations from the slowly varying trace where the counts of interfering traces combine with those of the selected object (eg. at areas 70,71). Known interpolation techniques are used to eliminate these sharp deviations, resulting in a slowly varying trace which corresponds to the counts deriving from the selected object only.
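For illustration, the sinusoidal curve fitting of steps B23 to B28 can be expressed as a linear least-squares problem by writing x'(θ) = c + A·sin θ + B·cos θ, from which amplitude and phase follow; the sketch below assumes this parameterisation:

```python
import numpy as np

def fit_sinusoid(thetas_deg, x_points):
    """Least-squares fit of x'(theta) = c + A*sin(theta) + B*cos(theta) to a
    set of detected edge points (cf. steps B23-B28).  Amplitude and phase,
    which encode the object position in the x-y plane, follow from A and B.
    This particular parameterisation is an assumption made for the sketch."""
    t = np.deg2rad(np.asarray(thetas_deg, dtype=float))
    x = np.asarray(x_points, dtype=float)
    design = np.column_stack([np.ones_like(t), np.sin(t), np.cos(t)])
    (c, a, b), *_ = np.linalg.lstsq(design, x, rcond=None)
    return c, np.hypot(a, b), np.arctan2(b, a)      # offset, amplitude, phase

# Edge points gathered from several segments (S1-S3) of the same trace:
theta = np.array([0, 20, 40, 150, 170, 300, 330])
x_edge = 32 + 10 * np.sin(np.deg2rad(theta) + 0.5)
print(fit_sinusoid(theta, x_edge))                  # recovers offset 32, amplitude 10
```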
The system may also use the trace 64 width to assume an ellipsoidal shape of the object and represent it as such, or to represent the object as irregular in shape (step B31), as appropriate.
Once the edges of an object have been delineated in the sinogram, it is possible to represent them in the transverse plane via a reverse Radon transform. Since the detected edge 85,86,87 has a binary form, it does not require any filtering for moving between the planar (x-y space) and sinogram (x'-θ space) representations of the raw data set. This means that there is now an enclosed area that represents the location of the object whose edges have been traced. In order to estimate the activity within this region, the forward projection procedure as shown in figure 10 is used.
The body 20 outline is defined in the transverse section and the object of interest 40 outlined within it (step D1). It will be understood that additional objects of interest may also be added as ellipsoidal or irregular shapes (step D2).
From the sinogram, or rather the smoothed count versus θ plot of figure 4(b), the total number of counts attributable to the object of interest is distributed into the transverse plane as a number of counts per voxel within the area of the object of interest (step D2). This is based upon the fact that radiation emission is isotropic. As an example, this means that if there are 6400 gamma rays emitted from the object of interest 40 (derived from the count vs. θ plot), we are likely to get on average 100 events detected in the object's projection in each of the 64 views acquired under ideal conditions.
However, the conditions for acquiring actual image data are far from ideal, and this simulation can be made more accurate by putting in other factors that prevail in practice: eg. attenuation, pharmacokinetic redistribution, the modulation transfer function of the equipment being used, radionuclidic decay, statistical noise, etc.
Therefore, an array of expected attenuation factors for the body 20 outline are generated and stored, or retrieved from memory (steps D3 to D8). These attenuation factors provide an estimate of the expected attenuation occurring for each angle θ for the segmented object of interest given its position within the body outline.
In step D9, account is taken of the observed phenomenon of line spread by setting a suitable line spread function. This compensates for a number of effects such as scattering, equipment and acquisition parameters which diffuse the image from, for example, a point source into an approximate Gaussian distribution. In step D10, account is taken of known background noise effects by defining a suitable noise function, eg Poisson distribution.
Now, using these realistic simulation functions, the counts within the object are forward projected (steps D11, D12) and the distribution of counts obtained in each projection is compared with the counts that were actually obtained for the selected object in the raw data set (step B35).
The simulation parameters may then be adjusted to minimise the Chi-squared statistic (step B36) in order to get a forward projection count distribution result as close as possible to the actual counts obtained from the selected object 40 in each of the projections. The cycle of steps B34-B37 is repeated until a convergence criterion is satisfied (step B37), although in practice it is found that very few iterations, if any, are required. The convergence criterion may be chosen according to a clinical situation governing the desired accuracy.
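A highly simplified sketch of the forward-projection simulation and Chi-squared adjustment (steps D11, D12 and B34 to B37) is given below; it assumes SciPy is available, uses a fixed attenuation weighting and a single activity parameter, and omits Poisson noise, so it is an outline of the idea rather than the full simulation described above:

```python
import numpy as np
from scipy.ndimage import rotate, gaussian_filter1d

def forward_project(activity_map, attenuation, thetas_deg, sigma=1.5):
    """Simplified forward projection of a transverse-plane activity map:
    weight by a fixed attenuation map, rotate to each angle, sum along
    columns and blur with a Gaussian line-spread function.  Angle-dependent
    attenuation and statistical noise are omitted for brevity."""
    weighted = np.asarray(activity_map, float) * np.asarray(attenuation, float)
    projections = []
    for th in thetas_deg:
        rotated = rotate(weighted, th, reshape=False, order=1)
        projections.append(gaussian_filter1d(rotated.sum(axis=0), sigma))
    return np.array(projections)

def fit_activity(measured, object_mask, attenuation, thetas_deg,
                 initial=100.0, n_iter=10):
    """Adjust a single activity-per-voxel estimate to minimise the
    Chi-squared statistic between simulated and measured projections
    (cf. steps B34-B37).  A real implementation would vary more parameters."""
    estimate = initial
    for _ in range(n_iter):
        expected = np.clip(forward_project(estimate * object_mask,
                                           attenuation, thetas_deg), 1e-6, None)
        chi2 = np.sum((measured - expected) ** 2 / expected)
        estimate *= measured.sum() / expected.sum()   # crude multiplicative update
    return estimate, chi2

mask = np.zeros((32, 32)); mask[12:18, 14:20] = 1.0
atten = np.ones((32, 32))
thetas = np.arange(0, 360, 360 / 32)
measured = forward_project(80.0 * mask, atten, thetas)
print(fit_activity(measured, mask, atten, thetas)[0])    # converges near 80
```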
Once the segmentation and simulation processes have been completed for the current sinogram, ie. for the current transverse slice, the operation steps B2 to B39 must be repeated for each transverse slice. Since adjacent slices will be substantially similar to one another, with slowly varying values of the sinusoidal traces for successive slices, it is possible to carry out a repetitive process on the succeeding slice substantially automatically. Providing that the axial distance between transverse slices is not large compared with the dimensions of the object of interest, the seed position P will still be contained within the object of interest, the upper and lower limits for theta will vary only slightly and the object edges will have varied only slightly. Thus the edge finding algorithm at steps B10 to B21 can operate automatically on the new slice, with confirmation from the user if necessary. It will also be understood that, due to geometrical considerations, in a succession of transaxial slices the seed position P and trace edges 85,86,87 will vary slowly in a somewhat predictable manner, with interpolation or extrapolation being possible over a number of successive sinograms. Thus, where geometrical assumptions of ellipsoidal geometry of the object have been exploited in x-y space (step B31), similar predictions and exploitation in x'-z space are possible.
Thus, at the end of this process step B39, we have, by addition, both the location of, and the counts within, the selected object for all the transverse planes that the object occupies by using only the raw data without any reconstruction.
It is thus possible to modify the raw data set to derive quantitative data for the segmented out object, or of the data set without the segmented out object. This operation can be carried out because both the raw and simulated data sets are only arrays of numbers (each number associated with a voxel location (i,j,k)). The values may be represented by a colour scale in an image for inteφretation by the user. Once the convergence criteria have been met (in step B39), a subtraction of the simulated from the raw data set removes the segmented object and gives the desired data set. Alternatively, subtraction of the inverse of the simulated data set from the raw data set removes everything except the segmented object.
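By way of example, this final subtraction step might be sketched as follows; clipping at zero and the array-based representation are assumptions made for the illustration:

```python
import numpy as np

def image_surgery(raw_sinograms, simulated_object, keep_object=False):
    """Modify the raw data set once the simulation has converged.

    keep_object=False removes the segmented object from the raw data;
    keep_object=True removes everything except the segmented object.
    Clipping at zero (counts cannot be negative) is an assumption made here."""
    raw = np.asarray(raw_sinograms, dtype=float)
    sim = np.asarray(simulated_object, dtype=float)
    without_object = np.clip(raw - sim, 0.0, None)
    return np.clip(raw - without_object, 0.0, None) if keep_object else without_object

print(image_surgery(np.full((4, 4), 10.0), np.eye(4) * 6.0).sum())   # 136.0
```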
These procedures can be used in various combinations, if there is more than one segmented object of interest 40, in such a way as to maximise clinical advantage, for example. Since the objects have been segmented out using the raw data set prior to reconstruction, any values and ratios thereof are statistically valid because their interdependence due to filtering has been totally by-passed. In addition, since the simulation takes into account all factors affecting the acquisition of the raw data set, the parameters can be used to estimate the form the data would have taken for clinically advantageous scenarios.
With reference now to figure 9, the alternative to automatic object definition, as selected at steps B3 and B4, is to carry out a manual procedure. In this procedure, the left or right hand side of the sinogram trace is first selected (step C1) and a sinusoidal waveform is displayed superimposed thereon. Phase and amplitude coefficients are adjusted by the user until a close match with the edge of the object trace edge is obtained (steps C2, C3). The exercise is repeated for the opposite trace edge (step C4).
It will be understood that while the invention has been described, with reference to figures 4 and 5, in terms of only two interfering traces, in practice a larger number of traces will be observed, and thus the number of portions S will increase, each one being of shorter length. However, the curve fitting algorithms are readily able to accommodate curve matching to a larger number of shorter curve portions.
It will also be understood that while the invention has been described in relation to single photon emission tomography, the technique is readily applicable to tomographic imaging of any collimated emission of radiation from or through an appropriate entity.
Examples
Using a standard Jaszczak SPET phantom, phantom and patient data were acquired according to standardised protocols using a Siemens Orbiter gamma camera and transferred to Nuclear Diagnostics workstations for analysis. The technique described here has been implemented using the X Window System (™ Massachusetts Institute of Technology) running on a Sun Workstation (Sun Microsystems, Inc.). It makes use of software library routines "NUCLIB" supplied by Nuclear Diagnostics Ltd. These library routines provide structures to facilitate the input/output, memory storage, and display of nuclear medicine image data.
The basic premises of this method are that a raw data set contains all the information necessary to characterise the distribution of radioactivity in three dimensions, and that, for a given data set, it is possible to describe the relationships between the entire set of projections as a set of mathematical functions. Once this description is made, it is possible to manipulate the data set to predict clinically advantageous "what if" scenarios that maintain the relationships and provide quantitative parameters. The steps have been described in connection with Figure 6.
In summary, the algorithm starts off with user identification of the object that is to be segmented and quantitated in the raw projection data, which is usually best done in the projection with the maximum counts from the object itself, and with minimal interference from any over- or underlying structures. A user defined seed pixel within this object starts off a three dimensional edge detector that produces a series of discrete points defining the boundaries that satisfy a preset target range and edge sharpness, and terminates when all such points have been identified. A least squares fit to this set of edge pixels defines the boundary of the object according to an assumed ellipsoid or irregular shape selected by the user. The algorithm then forms an estimate of the outline of the patient's body according to a preset threshold from the limits as seen in all the projections, and also the mean background counts free from all other major objects. Next, a copy of the delineated object as well as the estimated body outline is produced in a new data set to form the basis of the forward projection simulation module. An attenuation map is generated by associating the path lengths stored in a lookup table for each of the pixels located within the object of interest to the edge of the body outline with the attenuation coefficient. The pixels within the body outline are given an initial count value based on the estimate of the mean background and the pixels within the object of interest are given an arbitrary initial count value by the user. These counts are then forward projected by a Monte Carlo subroutine that isotropically distributes these initial estimates of counts per voxel for each projection angle. This subroutine takes into consideration the aforementioned attenuation maps (and any additional attenuation corrections if required), noise, Modulation Transfer Function and time variance of activity within the segmented organ due to pharmacokinetic redistribution or radionuclidic decay. A Chi-squared statistic is calculated to compare the simulated data with the actual data based on the projections with the majority of the counts arising from the object of interest, and used to revise the initial estimates iteratively. This procedure converges to a point when the simulation mirrors the original data closely for only the delineated object independent of all others.
At this point, the algorithm can branch one of two ways by either deleting the segmented object from the raw data set, or keeping the object but deleting everything else, ie. image surgery. This decision is made by the user based on the clinical situation for which the study was performed. The quantitative data about the object, namely, volume, activity and time variance during the period of acquisition are inferred from the values of these parameters used during the simulation to get the minimum Chi-squared statistic. All the above steps and their resultant output can be overridden or modified by the user should the need be felt. The entire sequence is repeated several times until all objects of interest have been segmented and quantitated independent of each other using the raw data set only, and a new data set is generated that includes the appropriate objects of interest only, in any combination dictated by the clinical situation. The algorithm may then terminate at that point without attempting to form images. The new data set which contains the quantitated object can be reconstructed using back projection with no prefiltering and a simple ramp filter to obtain images for comparison with conventionally filtered and reconstructed images.
A small selection of results and images can be presented here.
Please note that the colour table shown is cyclic due to conversion from Sun workstation format colour images to PC format black and white images. In all images, the patient's anterior is at the top, and the patient's right is on the left of the image.
Table 1 gives quantitative data for a phantom with six spheres, ranging in volume from 0.6 to 24 cL and with activities from 7.8 to 312 MBq, placed in water. The r values for both volume and activity are >0.99, showing good concordance between actual volumes and activities and those measured from regression lines fitted to the program output.
Figure 12 shows a conventionally backprojected reconstruction of a transverse section at the level of the kidneys of an Indium-111 Octreotide study using no prefiltering and a simple ramp filter. Figure 13 shows a hybrid image with both the kidneys and spleen from the original data placed in the estimated body outline, while Figure 14 shows the first iteration of the algorithm to remove the contribution of the right kidney
to the raw data. Figure 15 shows the final iteration, showing the same section with the right kidney completely removed from the raw data set without affecting the left kidney or the spleen. It can be seen that the artefactual cold area between the kidneys is reduced.
Figure 16 shows a transverse section of a conventional postreconstruction image at the level of the bladder in a Tc-99m labelled CYT-351 study of prostate cancer at 24 hours with Wiener prefiltering and attenuation correction, while Figure 17 shows the result of processing of the raw data set to reduce selectively the counts originating from the bladder prior to similar reconstruction.
The conventional image shows the effects of a wide range of contrast values, accumulation of activity in the bladder over the hour long acquisition, and poor count statistics. The two external iliac vessels at approximately 10 and 2 o'clock position (with respect to the bladder), the two internal iliac vessels at 5 and 7 o'clock and the extraprostatic extension of the carcinoma at 6 o'clock are visualised. Two wedge shaped cold artefacts are present on either side of the bladder. In the processed image, the iliac vessels are seen in essentially the same places as before, but the extension due to the prostate cancer is now separated from the bladder, and without the artefactual addition of counts originating from the bladder. The wedge artefacts have disappeared and the contrast range is now balanced over the entire image.
The processing illustrated above is possible only after accurately reproducible derivation of both the size and activity within the object by the algorithm. The program returns a series of numbers in terms of counts, pixels and percent kinetic variation, which can be calibrated easily to yield MBq.cm⁻³. Any method that aims to provide clinically useful quantitative parameters must satisfactorily take into account all the variability in the entire expanse of factors affecting the acquisition of data in the routine environment of a Nuclear Medicine department if it is to achieve widespread acceptance. It is preferable if the incorporation of the corrections involves as little processing and user interaction as is technically possible so that the accuracy, reproducibility and confidence in the quantitative parameters are increased.
The method described here begins the processing, using the raw projection data only, before any artefacts are introduced by any reconstruction process. It then segments organs of interest and provides a simulated data set for each, independent of all others, that is capable of taking into account all the major factors affecting acquisition, eg. the Modulation Transfer Function, noise, attenuation, large contrast values and time variance of the activity distribution due to pharmacokinetics. The simulated data sets are forward projected to assess the accuracy of the simulation and the parameters used for the simulation are used to manipulate the original data set to compensate for acquisition limitations and quantify volume and activity. The compensation and quantitation are done before the reconstruction introduces interdependence of voxel values.
At present the simulation process has been simplified in order to prove the feasibility of the proposed method to cater for routine clinical data, and identification of specific areas has been concentrated upon where refinements will improve further the accuracy and applicability of the method. For example, uniformly distributed attenuation maps have been used. It will be understood that a next step would be to conform these maps to provide nonuniformities based on anatomical knowledge. The approach described here is distinct from all other reconstruction strategies in the sense that it is based on the raw projection data with incorporation of both knowledge and requirements of the user, to form a processed data set that is selectively enhanceable while providing object images and quantitation independently. These facilities represent a powerful set of tools with great potential that can be realised. Firstly, the processing can be tailored to the pharmacokinetics and biodistribution of any particular agent. This is important in cases of those radiopharmaceuticals which offer low normal:abnormal ratios, have rapid redistribution kinetics or which show wide variations in uptake in adjacent organs. It may be possible to have less stringent goals for new radiopharmaceuticals in terms of pharmacokinetics and biodistribution. Secondly, the ability to quantify objects independent of any process of image formation means that more rigorous comparisons can be made between studies carried out at different times, centres or protocols. Such an exacting basis is a prerequisite for the development of standardised databases of images and protocols, and in the assessment of equipment and software performance. Thirdly, the mathematically defined relationships used by the algorithm may be able to form the basis of optimised acquisition protocols for new types of studies, for example, a reduced number of views but with a longer time per image over an arc or set of arcs that offer the best visualisation of a particular organ, or studies using rapidly distributing tracers that offer a combination of functional information and three dimensional visualisation.
Object segmentation
An alternative technique of data segmentation of at least one selected object within the entity in the projected images will now be described. The alternative technique offers some advantages over the previously described techniques, enabling a reduction in the operator input into the edge detection process (previously described with reference to figure 8) and eliminating the need for a Monte Carlo simulation process (previously described with reference to figure 10).
In summary, the technique previously described herein uses an implementation which segments out a part of the raw projection data and then generates a simulated data set that closely resembles the contribution from the object of interest within this part of the data. Once this condition is met, it is inferred that the parameters used during the simulation are directly comparable to the object of interest. This provides the clinically important estimates of volume, activity and time variance.
Avoidance of this simulation process can be useful as simulation processes can be quite elaborate and complex in order to achieve optimal results. This requires considerable computing power to implement and can be time consuming to program, debug, optimise and check for accuracy under varying conditions.
The simplification of the edge detection / object delineation process can be desirable because, although the techniques described above can utilize a number of well developed edge-detection methods, no single method can be applied to the wide range of tomographic data which can be found in nuclear medicine. This means that a preferred implementation will include several different edge detectors. It may therefore be necessary to select the type of detector used and operating parameters thereof dependent upon a particular clinical study. Whilst there is no problem with this, there are circumstances in which it is also desirable to offer a standardized analysis technique. The alternative embodiment now to be described makes further use of a number of periodicities found in the sinogram representations of the raw data. The alternative embodiment has particular applicability for camera systems in which, for each value of θ, information is collected simultaneously for all z. That is to say, the camera includes a detector 10 and collimator 15 which extend in two dimensions for simultaneous collection of x' data in all transaxial planes (z) at once, as previously discussed.
The data acquired and represented in the sinograms of figures 3, 4 and 5 essentially relate to the following dimensions: a) the three physical dimensions identified previously in figure 1 as x, y and z; b) the time during acquisition and therefore rotation of the camera head, ie. θ; c) the energy of the counts arising from the volume imaged, hereinafter referred to as E; d) the change in counts within objects during a period of acquisition; and e) the time between sets of acquisitions.
With reference to figure 18, the sinograms 50 of a raw data set as depicted in figure 3 are each rotated by 90° anticlockwise, and laid end to end in order of the successive transaxial slices, ie. for increasing, or decreasing, values of z. A sinogram row 200 is thereby formed comprising successive sinograms 50₁ to 50ₙ representing successive z values. Thus, as shown, the vertical axis represents x', and the horizontal axis represents, within each sinogram, θ from 0° to 360°, and from sinogram to sinogram, z from z₁ to zₙ. Because all z axis data for each θ are imaged simultaneously, the horizontal axis also represents time t in a saw tooth function 210 depicted above the sinogram row 200, from time tb (beginning of the acquisition period) to te (end of the acquisition period). As discussed in relation to figure 3, the number of counts, or intensity I, is represented by a height of trace perpendicular to the diagram of figure 18.
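The construction of the sinogram row from the per-slice sinograms might be sketched as below; the use of a NumPy 90° rotation and concatenation is an illustrative assumption:

```python
import numpy as np

def build_sinogram_row(sinograms):
    """Form the sinogram row of figure 18: each per-slice sinogram
    (shape (n_theta, n_x)) is rotated 90 degrees anticlockwise so that x'
    runs vertically, and the slices are laid end to end along the
    horizontal (theta / z / time) axis."""
    return np.concatenate([np.rot90(s) for s in np.asarray(sinograms)], axis=1)

row = build_sinogram_row(np.random.poisson(4.0, size=(8, 64, 64)))
print(row.shape)   # (64, 512): x' vertical, eight sinogram views end to end
```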
It will be understood that an object of interest will be represented on the sinogram row 200 as a sinusoidal trace of varying thickness according to the x-y dimensions of the object, of frequency corresponding to the one sinogram period. Variations in the dimensions of the object over the z-direction will manifest themselves as small discontinuities 202 in width and/or position of the trace at each sinogram boundary. The extent of the object in the z-direction will be manifested as the appearance of the object through a number of adjacent sinograms. Figure 18 shows this in part as a diminishing size of trace over sinogram views 50₁ to 50ₙ, and is more clearly visible in figures 19(a), (b) and (c).
Figures 19(a), (b) and (c) show actual data taken from human patients showing the portion of the body encompassing the liver, prostate and parathyroid respectively. In each figure, the entire length of the sinogram row 200 comprises all seven rows 200₁ to 200₇ in a continuous chain. The figures 19(a), (b) and (c) clearly show the plurality of interfering traces which are normally found in a sinogram view. The figures also clearly show the extent of the objects in the z direction.
It can be seen that the three physical dimensions of the object of interest and its location within the volume being imaged are displayed in the phase, amplitude and length of the sinogram trace, while the counts arising from it are displayed in the height of the trace. The time period of the acquisition is the saw tooth function from sinogram view 50₁ to 50ₙ in figure 18. For each object's trace discernible in the sinogram, a step change in the height (I) at the beginning and end of each sinogram view and from one sinogram view to the next relates to the time variance of activity over the data acquisition period te - tb. A drop between the end of one view and the next means an accumulation of activity and a rise means a decrease in activity over the period of acquisition; comparing the beginning and end of the same view, the relationship is reversed, ie. a drop indicates a decrease. Step changes in the phase and amplitude of the sinogram trace relate to the shape and orientation of the object.
In other words, the characteristics of the sinogram trace relate directly to the properties of the object producing it. An essential aspect of this is the periodicity of these characteristics which can be exploited to segment out an object of interest.
Figure 20 shows a schematic three dimensional visualization of the intensity data represented by just one object trace 210 within a sinogram row, with intensity I (number of counts) represented by the orthogonal axis. It will be understood that the trace 210 has finite width in x' (x-y) space as it snakes along the sinogram row, although this width is not shown in the diagram. As best shown in the x'-θ "plan view" upper inset of the figure, the trace 210 will, of course, be typically interfered with by other traces 212 shown in dotted outline interlacing with trace 210 as previously described with reference to figure 4. It will be understood that the trace height (figure 4b and figure 20, lower inset showing I vs. z,θ) will also vary periodically according to: a) interference from interlacing traces of other radioactive objects (which may locally increase the trace height) and b) attenuation of the signal from shielding objects in front of the object of interest (which will locally decrease the trace height). As has been previously described with reference to figure 4(b), the I vs. θ (and also I vs. θ,z) plot for a selected object will vary sinusoidally over a period of one sinogram view. In the I vs. z plot, trace height variations from interference from interlacing traces will have a periodicity of twice that of the selected object, and these variations can be identified and eliminated accordingly. Local attenuation by objects (eg. bone structures) will cause dips in the intensity vs. z trace which may also be detected since they force a departure from the ideal sinusoidal trace. Thus these local dips may also be eliminated by interpolation.
With reference to figure 21, to determine the number and location of objects readily visible, the following procedure is carried out. For each value of θ in the sinogram row (eg. line θA on figure 21(a)), the function of intensity I versus x' (shown in figure 21(b)) is used to determine the number and location of peaks 230, 232 in the un-normalized raw data set. A preprogrammed suitable threshold value may be used to determine which peaks to examine on a first pass. Alternatively, this may be determined by reference to a clinical database which enables control of the system by defining approximate locations of major peaks for a particular clinical study.
This intensity peak location is repeated for each value of θ. Each maximum point x' for any given value of θ is related to the nearest maximum point x' for the next value of θ, forming a succession of points (hereinafter a "set") along the trace 210 in figure 21a, and similarly along the trace 212. It will be understood that the two sets of points corresponding to traces 210 and 212 will intersect twice (220, 221) per sinogram view such that, for example, only one I vs. x' peak is located at θB. These intersections are discounted for the time being leaving blocks of points between the intersections. The extent of the object is now determined within these blocks by determining an object width for each θ, ie. determining a value on each side of the peak which delineates the object. For each maximum point 230, 232, a pair of minimum points x'₁, x'₂ are derived by computing the magnitude of the first derivative, dI/dx' (figure 21(c)). These points determine the edges of the traces 210, 212 at that value of θ. A least squares curve-fitting algorithm is then used to define the edges 215, 216 of each trace 210, 212. This may be smoothed using the first harmonic of the Fourier expansion of the curve.
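An illustrative sketch of the per-theta peak and edge location of figure 21 follows; the fixed search window and the simple local-maximum test are assumptions made for the example:

```python
import numpy as np

def object_edges_at_theta(profile, threshold, window=6):
    """For one column of the sinogram row (fixed theta), locate intensity
    peaks above a threshold and, for each peak, take the positions of the
    largest |dI/dx'| on either side (within a small window) as the object
    edges.  Returns (peak_x, left_edge_x, right_edge_x) tuples."""
    I = np.asarray(profile, dtype=float)
    grad = np.abs(np.gradient(I))
    peaks = [i for i in range(1, I.size - 1)
             if I[i] >= threshold and I[i] >= I[i - 1] and I[i] >= I[i + 1]]
    edges = []
    for p in peaks:
        lo, hi = max(0, p - window), min(I.size, p + window + 1)
        left = lo + int(np.argmax(grad[lo:p])) if p > lo else p
        right = p + int(np.argmax(grad[p:hi]))
        edges.append((p, left, right))
    return edges

profile = np.concatenate([np.zeros(10), [2, 8, 20, 9, 3],
                          np.zeros(10), [1, 6, 15, 5], np.zeros(8)])
print(object_edges_at_theta(profile, threshold=10))
```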
Various refinements may be used to optimize the position of the curves defined by the x'₁ and x'₂ points, for example taking into account factors such as the attenuation coefficient Z, the thickness of attenuating material and energy E of the γ radiation. These factors may be determined empirically or by reference to a priori knowledge of the organs being imaged.
The number of unique sets of maximum points derived is the number of objects identified, and it is now possible to relate the phase and amplitude of each set of maximum points 230 and the length of the entire trace 210, 212 into the x, y and z dimensions of each object.
The next task is to determine the number of counts originating from each object identified thus far. For each object (trace), the count profile I vs. θ,z is plotted along the sinogram trace (figure 20, lower inset). This is smoothed to eliminate contributions from interlaced traces and to compensate for local attenuation as previously described. The first harmonic of the Fourier expansion of this count profile may be taken to eliminate noise from the curve fitting. The volume integral of this count profile relates to the number of counts originating from the object (including the background contribution at this point in the analysis). This realization makes the elimination of the Monte Carlo simulation possible, as discussed earlier. It is now possible to convert the parameters obtained above into true estimates of the physical dimensions, radioactivity and time variance for each object, providing that information relating to the characteristics of the gamma camera, the body outline and the background contribution are provided.
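A sketch of extracting and smoothing the count profile along a delineated trace is given below; for simplicity it treats the whole profile as one period when retaining the first Fourier harmonic, whereas in the row presentation the relevant period is one sinogram view:

```python
import numpy as np

def trace_counts(sinogram_row, lower_edge, upper_edge):
    """Sum the counts between the fitted lower and upper edges of a trace for
    every column of the sinogram row, giving the I vs. (theta, z) count
    profile; smooth by keeping only the mean and the first Fourier harmonic,
    then integrate to estimate the total counts originating from the object
    (background still included at this stage)."""
    row = np.asarray(sinogram_row, dtype=float)
    n_cols = row.shape[1]
    profile = np.array([row[lower_edge[c]:upper_edge[c] + 1, c].sum()
                        for c in range(n_cols)])
    spectrum = np.fft.rfft(profile)
    spectrum[2:] = 0.0                   # keep the mean and first harmonic only
    smoothed = np.fft.irfft(spectrum, n=n_cols)
    return smoothed, smoothed.sum()

row = np.random.poisson(3.0, size=(64, 128)).astype(float)
lower = np.full(128, 20); upper = np.full(128, 40)
profile, total = trace_counts(row, lower, upper)
print(round(total))
```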
The gamma camera characteristics are obtained to a required degree of thoroughness from a series of experiments calibrating the technique for a particular gamma camera set up. The body outline and background contributions can be assessed and calibrated using techniques described hereinbefore.
Once the contribution of the objects defined by traces 210, 212 have been established, it is then possible to excise this data from the sinogram raw data set, leaving a second data set in sinogram form which may be used in similar manner to examine further features of the body under study. Typically, upon removal of data relating to large, highly active organs such as the kidneys or liver, the second sinogram data set may be back-projected for study or normalized to enable identification of a new set of peaks 230, 232 relating to objects of lower activity. A further pass can then be made to eliminate contributions from these second order objects, or to examine them in detail.
With reference to the figures 19(a), 19(b) and 19(c), it can be readily observed that traces relating to selected objects only give rise to counts over a part of the range of θ values because of substantial or complete attenuation of the radiation by objects in front of the object of interest. It will be understood that where this is so, data in these portions of the sinogram are effectively neither used nor required in segmenting out the object because the curve fitting algorithms can still adequately operate on the available segments, ie. using limited ranges of θ, because of the highly predictable periodicity of data.
This being so, in order to segment out certain objects of interest, it becomes completely unnecessary to actually gather data relating to the excluded (non-relevant) ranges of θ. Thus, valuable scanning time can be saved by conducting scans only over limited ranges of θ known to be of use in the data segmentation operation. For example, where substantial attenuation occurs on one side of a patient, a 180° scan on the opposite side may provide optimum data. The ranges of θ of use to the analysis can be determined on a clinical basis. For example, a clinical database may be provided to determine the scan conditions required for a particular clinical study. The particular type of clinical study being performed would be entered into the gamma camera control system, which would then automatically determine the range or ranges of θ over which the camera should scan to obtain useful data. Although the absence of data for certain ranges of θ will prevent full 360° tomographic imaging, it will be understood that, using the techniques of the present invention, full 360° imaging is no longer necessary to define the three dimensional boundaries of an object of interest in order to obtain quantitative data from that object.
Such a limited scan technique offers substantial increases in throughput of patients on a typical tomographic gamma imaging system thereby offering substantial cost savings. Furthermore, where substantial attenuation gives rise to inherent inaccuracy or lower quality of the data, the use of only optimum data from limited ranges of 0 may optimize the final accuracy. In a further embodiment, recently developed panspectral gamma cameras may be used to acquire and assign radioactivity counts over a range of photon energies into separate energy bins which can further refine and improve the raw data segmentation process described above.
A number of sinogram row data sets 200A...200X are laid out to form a page 250 as shown in figure 22. Each page 250 relates to a data set acquired at a time t, over an acquisition period t+Δt, ie. from tb to te as shown in figure 18. Each page 250 comprises a number of sinogram rows in which each row corresponds to bin counts for a given photon energy (E) range. The highest energy bin corresponds to the top row (row 1) and the lowest energy bin corresponds to the bottom row (row n). Each row therefore comprises data from one part of the energy spectrum as indicated on the right hand side of the figure. Organising the data set in this manner enables refinements to correction methods for attenuation and scatter by exploiting information contained in the sinogram rows 2 to n.
Since the radiation interaction gives rise to the Compton edge and continuum, and the type and amount of such interactions (ie. both attenuation and scatter) will determine the shape of the energy spectrum, it is possible to decompose the information content for each row according to probability maps.
If the activity at a certain location or data element 260 is determined to be A counts in row 1 (200A), then the probability that it will give rise to a certain pattern of counts in all of the other rows can be estimated as a probability map. For example, for a given type of radiotracer in a patient, the spread of this location further down in the energy spectrum may be estimated by a series of expanding windows 261, 262 ... 265 (shown greatly exaggerated in the diagram) of, eg. 3×3 data elements for rows 200B to 200D, 4×4 data elements for the next row 200E, 5×5 data elements for the next row and 9×9 for the remaining rows. The number of counts within each window are, respectively, A·f2, A·f3, ... A·fn where fn represents a parameter which is dependent upon the energy of the photon and the attenuation factor (Z value) of the attenuating medium.
This means that, for two points P1 and P2 in the first row 200A located respectively within the traces (eg. 210, 212 in figure 21(a)) of two different objects, the probability map can be used to calculate that, for a given point P3 in row n with counts S falling within the overlapping W×W windows of both the points in a certain row, S1 counts are the result of interactions of photons arising from object 1 and S2 counts are the result of interactions of photons arising from object 2, such that S1 + S2 = S, taking into account the relative amounts of radioactivity in each object. This can be extended in a similar fashion for all other points and sinogram traces.
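Purely as an illustration of apportioning counts at an overlap point, a proportional split such as the following might be used; the weighting by an energy- and attenuation-dependent factor f is an assumption standing in for the probability maps described above:

```python
def split_overlap_counts(total_counts, activity_1, activity_2, f1, f2):
    """Apportion S counts observed at a lower-energy point between two source
    objects whose windows overlap there, such that S1 + S2 = S.  The split in
    proportion to each object's activity weighted by an energy- and
    attenuation-dependent factor f is an illustrative assumption."""
    w1, w2 = activity_1 * f1, activity_2 * f2
    s1 = total_counts * w1 / (w1 + w2)
    return s1, total_counts - s1

print(split_overlap_counts(200, activity_1=500, activity_2=250, f1=0.12, f2=0.20))
```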
The probability maps also have periodic properties. For example, the spread of counts arising from a non-central object varies periodically, narrowing for θ where the object is close to the camera and widening for θ where the object is furthest from the camera. This means that if the counts originating from a particular point lower down in the energy range are considered, then the contributions S1 and S2 of each object can be refined still further according to the location of the object within the three-dimensional volume being imaged. This is because there is a higher probability that an object further away from the camera will give rise to an interaction because of the depth of the intervening medium than for an object closer to the camera, in proportion to the activity inside each object. The realisation that the variation of probabilities has periodic characteristics simplifies the implementation of the algorithm on a computer.
It will be understood that this concept is extendible to other modalities as well, where the energy bands can be considered analogous to some other property relevant to the modality under consideration.
The present invention therefore organizes raw acquisition data into sinogram-based data structures and exploits various periodicities in the data structure, which periodicities facilitate the segmentation from, and quantitative analysis of, object data from the raw data set without the need for data reconstruction using back projection techniques. As has been explained herein, the data segmented out can be the object of interest or can be an object which is obscuring remaining data in the raw data set, ie. the remaining data set essentially comprises one or more objects of interest for which quantitative data is sought without the effects of the initially segmented out object. Successive objects may be segmented to gradually leave a data set which includes the object(s) of interest.
Table 1: Phantom data.

Sr. No.   Actual Volume (cL)   Estimated Volume (cL) (i)   Actual Activity (MBq)   Estimated Activity (MBq) (ii)
1.            24                     23.9                        312                       309.8
2.            12                     11.24                       156                       147
3.             6                      6.97                        78                        98.8
4.             2.4                    2.86                        31.2                      31.3
5.             1.2                    1.98                        15.6                      13.5
6.             0.6                   -0.77                         7.8                       0.36

(i): Regression line is [-2.128 + (0.644A) + (0.382B)], where A and B are parameters used by the edge detector; r = 0.995.
(ii): Regression line is [-3.96 + ((4.56e-3) × C)], where C is the counts estimated by the forward-projection Monte Carlo module; r = 0.999.
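The regression lines in footnotes (i) and (ii) can be applied directly. In the sketch below, A, B and C are placeholders for the edge-detector parameters and the forward-projected Monte Carlo count produced elsewhere in the processing chain:

```python
def estimated_volume_cl(A, B):
    # Regression (i) of Table 1: estimated volume in cL from edge-detector parameters A and B
    return -2.128 + 0.644 * A + 0.382 * B

def estimated_activity_mbq(C):
    # Regression (ii) of Table 1: estimated activity in MBq from the Monte Carlo count C
    return -3.96 + 4.56e-3 * C
```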

Claims

1. A method of deriving quantitative data from a raw projection data set of an entity in a tomographic image by the steps of:
(a) segmenting out at least one selected object within the entity in the projected images;
(b) modifying the raw projection data set to form a second data set comprising data relating only to the at least one selected object; and
(c) using the second data set to determine quantitative information on the volume defined by the selected object by comparison with the raw data set.
2. A method according to claim 1 wherein the step of segmenting out a selected object comprises the steps of:
(i) forming a sinogram for a given transverse plane;
(ii) identifying the edges of an object of interest for a selected theta value;
(iii) tracking the edges of the object through its sinusoidal path on the sinogram for all theta to determine an object extent;
(iv) modifying the raw data set to form the second data set by using only data corresponding to the object extent.
3. A method according to claim 1 wherein the step of segmenting out a selected object comprises the steps of:
(i) forming a sinogram for a given transverse plane;
(ii) identifying the edges of a selected object, which is not of interest, for a selected theta value;
(iii) tracking the edges of the object through its sinusoidal path on the sinogram for all theta to determine an object extent;
(iv) modifying the raw data set to form the second data set by removing all data corresponding to the object extent.
4. A method according to claim 2 or claim 3 further including the steps of: determining to which transverse planes the object extends; and repeating the steps (i) to (iv) for all successive transverse planes in which the object appears.
5. A method according to claim 2 or claim 3 wherein: step (iii) includes carrying out an inverse Radon transform to delineate the object extent in the x-y space of the transverse plane; and step (iv) includes the steps of: deriving, from the sinogram, a total count for all theta for the selected object; based on the total count, iteratively simulating a forward projection of the delineated object in the transverse plane to predict counts for each θ projection until the simulated forward projection counts match the raw data projection counts for the selected object, to a desired degree of accuracy.
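By way of illustration only, and not by way of limitation of the claim, the iterative matching recited above may be sketched as follows. The uniform activity model inside the delineated extent, the global scaling update and the tolerance are assumptions made purely for this example; a stochastic (eg. Monte Carlo) forward projector is what makes the iteration non-trivial.

```python
def match_forward_projection(object_mask, measured_counts, forward_project,
                             tol=0.01, max_iter=50):
    """Iteratively scale a trial activity map (uniform inside the delineated object
    extent - an assumption for this sketch) until its simulated forward projection
    matches the measured per-theta counts for the selected object.

    object_mask     : boolean x-y map of the delineated object in the transverse plane
    measured_counts : per-theta counts for the selected object taken from the sinogram
    forward_project : callable returning simulated per-theta counts for an activity map
                      (eg. a forward-projection Monte Carlo module with corrections)
    """
    activity = object_mask.astype(float)
    for _ in range(max_iter):
        simulated = forward_project(activity)
        ratio = measured_counts.sum() / max(simulated.sum(), 1e-12)
        if abs(ratio - 1.0) < tol:
            break
        activity *= ratio        # rescale the trial activity and re-simulate
    return activity
```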
6. A method according to claim 5 further including the step of applying at least one correction factor to the simulated forward projection to compensate for predictable variations.
7. A method according to claim 6 wherein the correction factors applied are chosen to compensate for any or all of the known phenomena of attenuation, scatter, pharmacokinetic redistribution, the modulation transfer function of the equipment being used, radionuclidic decay or statistical noise.
8. A method according to claim 2 or claim 3 wherein step (ii) further includes the step of choosing the selected value of theta by determining for which theta the selected object is optimally distinguished from all other objects within the entity.
9. A method according to claim 2 or claim 3 wherein the selected object forms a trace in the sinogram which is comprised of a plurality of segments extending between intersections with interfering traces from other objects, and wherein step (iii) further includes tracking the edges of the selected object only in said plurality of segments, and interpolating said edges across the intersections to determine said object extent.
10. A method according to claim 9 wherein step (iv) further includes the step of generating a plot of count versus theta for all theta included within said plurality of segments, and interpolating said counts for all theta between said plurality of segments.
11. A method of enhancing portions of an entity in a tomographic image by elimination of contributions thereto from a selected object within the entity by the steps of:
(a) segmenting out the selected object in the projected images;
(b) modifying the raw data set to form a second data set in which contributions from the selected object have been eliminated; and
(c) using the second data set to create a back-projected three-dimensional image of the entity.
12. Apparatus for determining quantitative data relating to a part of a tomographic image, from a raw projection data set of an entity in the tomographic image, including:
(a) means for segmenting out at least one selected object within the entity in each of the projected images;
(b) means for modifying the raw data set to form a second data set comprising data relating only to the at least one selected object; and
(c) means for using the second data set to determine quantitative information on the volume defined by the selected object by comparison with the raw data set.
13. A method of locating a disease site in a patient, the method comprising the method of claim 1 wherein the entity is the body of the patient and the at least one selected object is either: (a) the disease site, or (b) a site which partially obscures the disease site.
14. A method according to claim 13 wherein the site which obscures the disease site is the bladder.
15. A method of segmenting out at least one selected object from a raw projection data set of an entity in a tomographic image and modifying the raw projection data set to form a second data set comprising data relating only to the at least one selected object, comprising the steps of: forming a sinogram for a given transverse plane; identifying the edges of a selected object for a selected theta value; tracking the edges of the object through its sinusoidal path on the sinogram for all theta to determine an object extent; modifying the raw data set to form the second data set by using only data corresponding to the object extent.
16. A method according to claim 15 wherein the tracking step further includes the step of tracking the object through a plurality of transverse planes.
17. A method according to claim 16, in which said data set represents a plurality of values of a variable, I, as a function of position (x', θ, z) in said tomographic image, the method further comprising the steps of:
(a) forming a contiguous row of sinograms for each of a succession of transverse planes (z) through said entity;
(b) identifying positions of said selected object in a plurality of said sinograms for each of a plurality of selected θ values;
(c) interpolating between said positions to track the object along its path in the sinogram row for all theta, and determining the object extent for each θ and z;
(d) defining a set of points in I-θ space which lie within the object extent and interpolating between said set of points to track the contributions of the selected object to the raw data set to thereby form a second data set relating only to the selected object.
18. A method according to claim 17 wherein step (b) includes the step of selecting theta values for which data relating to said selected object is not substantially interfered with by other objects and wherein said interpolation steps are made only on the basis of data for those selected theta values.
19. A method according to claim 18 wherein step (c) includes the steps of:
(i) for each said selected value of theta, identifying at least one maximum point in I-x' space;
(ii) for each maximum point, identifying the object extent on either side thereof using an edge detection algorithm; and
(iii) using said object extents to form said interpolated set of points in x-y space.
20. A method according to claim 19 wherein said edge detection algorithm comprises locating the nearest maximum value of the first derivative dI/dx'.
21. A method according to claim 17 wherein step (d) includes the step of determining a function having a periodicity equal to the sinogram periodicity and which provides an optimal match to the I-θ data for selected θ.
22. A method according to claim 21 wherein said selected θ comprise portions of the sinograms for which data relating to said selected object is not substantially interfered with by other objects.
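By way of illustration only of claims 21 and 22, one possible choice of periodic function (an assumption, not a requirement of the claims) is a low-order Fourier series with the sinogram period, fitted by least squares to the count-versus-theta data for the uncontaminated portions and then evaluated for all theta:

```python
import numpy as np

def fit_periodic_counts(theta_sel, counts_sel, theta_all, n_harmonics=2):
    """Fit c0 + sum_k (a_k cos(k*theta) + b_k sin(k*theta)) to the count-versus-theta
    data at the selected (uncontaminated) theta values, then evaluate the fitted
    function for all theta.  The harmonic order is an illustrative assumption."""
    def design(theta):
        cols = [np.ones_like(theta)]
        for k in range(1, n_harmonics + 1):
            cols += [np.cos(k * theta), np.sin(k * theta)]
        return np.column_stack(cols)
    coeffs, *_ = np.linalg.lstsq(design(np.asarray(theta_sel, float)),
                                 np.asarray(counts_sel, float), rcond=None)
    return design(np.asarray(theta_all, float)) @ coeffs
```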
23. A method according to claim 17 wherein step (d) includes the step of performing a volume integral over the data in said second data set to compute the contribution of the selected object to the raw data set.
24. A method according to claim 17 further including dividing said raw projection data set into rows of sinograms each corresponding to a different range of a modality related to the data and exploiting a predetermined relationship between said different ranges of the modality to refine the accuracy of said second data set.
25. A method according to claim 24 in which the projection data set quantities are photon counts and the modality is photon energy.
26. A method according to claim 25 in which the predetermined relationship is the shape of the expected energy spectrum as a result of radiation interactions with the entity under study.
PCT/GB1996/001814 1995-07-27 1996-07-29 Raw data segmentation and analysis in image tomography WO1997005574A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
AU66226/96A AU6622696A (en) 1995-07-27 1996-07-29 Raw data segmentation and analysis in image tomography
EP96925859A EP0843868A1 (en) 1995-07-27 1996-07-29 Raw data segmentation and analysis in image tomography

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
GBGB9515458.9A GB9515458D0 (en) 1995-07-27 1995-07-27 Raw data segmentation and analysis in image tomography
GB9515458.9 1995-07-27
GBGB9517044.5A GB9517044D0 (en) 1995-08-19 1995-08-19 Raw data segmentation and analysis in image tomography
GB9517044.5 1995-08-19

Publications (1)

Publication Number Publication Date
WO1997005574A1 true WO1997005574A1 (en) 1997-02-13

Family

ID=26307473

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/GB1996/001814 WO1997005574A1 (en) 1995-07-27 1996-07-29 Raw data segmentation and analysis in image tomography

Country Status (3)

Country Link
EP (1) EP0843868A1 (en)
AU (1) AU6622696A (en)
WO (1) WO1997005574A1 (en)

Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1204373A1 (en) * 1999-06-23 2002-05-15 The Board Of Trustees Of The University Of Illinois Fast hierarchical backprojection method for imaging
WO2002075662A2 (en) * 2001-03-15 2002-09-26 Koninklijke Philips Electronics Nv Fast transform for reconstruction of rotating-slat data
DE10307331A1 (en) * 2003-02-17 2004-09-02 BAM Bundesanstalt für Materialforschung und -prüfung Imaging method and device for computer-aided evaluation of computed tomographic measurements by direct iterative reconstruction
WO2005020154A1 (en) * 2003-08-25 2005-03-03 Ulla Ruotsalainen Method, system and software product for motion correction of tomographic images
WO2005109346A1 (en) * 2004-05-06 2005-11-17 UNIVERSITé LAVAL 3d localization of objects from tomography data
EP1612734A2 (en) * 2004-06-30 2006-01-04 General Electric Company System and method for boundary estimation using CT metrology
EP1665125A2 (en) * 2003-09-02 2006-06-07 Ludwig Institute For Cancer Research Data driven motion correction for nuclear imaging
US7176916B2 (en) 2005-04-15 2007-02-13 T.I.E.S., Inc. Object identifying system for segmenting unreconstructed data in image tomography
WO2007054843A1 (en) 2005-11-10 2007-05-18 Koninklijke Philips Electronics, N.V. Pet imaging using anatomic list mode mask
DE102008047840A1 (en) 2008-09-18 2010-04-01 Siemens Aktiengesellschaft Method for the at least partial determination and / or adaptation of an attenuation map used for the attenuation correction of positron emission tomography image data sets in a combined magnetic resonance positron emission tomography device
JP2010528312A (en) * 2007-05-30 2010-08-19 コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ PET local tomography
US8217937B2 (en) * 2007-03-28 2012-07-10 The Aerospace Corporation Isosurfacial three-dimensional imaging system and method
CN101454801B (en) * 2006-02-28 2012-12-05 皇家飞利浦电子股份有限公司 Local motion compensation based on list mode data
CN107945203A (en) * 2017-11-24 2018-04-20 中国科学院高能物理研究所 PET image processing method and processing device, electronic equipment, storage medium
CN110007068A (en) * 2019-03-25 2019-07-12 桂林优利特医疗电子有限公司 A kind of urine drip detection method

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11367227B2 (en) 2020-07-28 2022-06-21 Canon Medical Systems Corporation Method and apparatus for computer vision based attenuation map generation

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0354763A2 (en) * 1988-08-09 1990-02-14 General Electric Company Method and apparatus for estimating body contours in fan beam computed tomographic systems
WO1992020032A1 (en) * 1991-04-25 1992-11-12 Inria Institut National De Recherche En Informatique Et En Automatique Method and device for examining a body, particularly for tomography

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0354763A2 (en) * 1988-08-09 1990-02-14 General Electric Company Method and apparatus for estimating body contours in fan beam computed tomographic systems
WO1992020032A1 (en) * 1991-04-25 1992-11-12 Inria Institut National De Recherche En Informatique Et En Automatique Method and device for examining a body, particularly for tomography

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
THIRION J -P: "Segmentation of tomographic data without image reconstruction", IEEE TRANSACTIONS ON MEDICAL IMAGING, MARCH 1992, USA, vol. 11, no. 1, ISSN 0278-0062, pages 102 - 110, XP000281930 *

Cited By (25)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003502093A (en) * 1999-06-23 2003-01-21 ザ、ボード、オブ、トラスティーズ、オブ、ザ、ユニバシティー、オブ、イリノイ A method for fast hierarchical backprojection of images
EP1204373A1 (en) * 1999-06-23 2002-05-15 The Board Of Trustees Of The University Of Illinois Fast hierarchical backprojection method for imaging
JP4958354B2 (en) * 1999-06-23 2012-06-20 ザ、ボード、オブ、トラスティーズ、オブ、ザ、ユニバシティー、オブ、イリノイ A method for fast hierarchical backprojection of images.
EP1204373A4 (en) * 1999-06-23 2009-05-27 Univ Illinois Fast hierarchical backprojection method for imaging
WO2002075662A2 (en) * 2001-03-15 2002-09-26 Koninklijke Philips Electronics Nv Fast transform for reconstruction of rotating-slat data
WO2002075662A3 (en) * 2001-03-15 2003-09-12 Koninkl Philips Electronics Nv Fast transform for reconstruction of rotating-slat data
DE10307331B4 (en) * 2003-02-17 2009-03-05 BAM Bundesanstalt für Materialforschung und -prüfung Imaging method for the computer aided evaluation of computer-tomographic measurements by direct iterative reconstruction
DE10307331A1 (en) * 2003-02-17 2004-09-02 BAM Bundesanstalt für Materialforschung und -prüfung Imaging method and device for computer-aided evaluation of computed tomographic measurements by direct iterative reconstruction
US7702180B2 (en) 2003-02-17 2010-04-20 Bam Bundesanstalt Fur Materialforschung Und -Prufung Imaging method and device for the computer-assisted evaluation of computer-tomographic measurements by means of direct iterative reconstruction
WO2005020154A1 (en) * 2003-08-25 2005-03-03 Ulla Ruotsalainen Method, system and software product for motion correction of tomographic images
EP1665125A2 (en) * 2003-09-02 2006-06-07 Ludwig Institute For Cancer Research Data driven motion correction for nuclear imaging
EP1665125A4 (en) * 2003-09-02 2007-10-03 Ludwig Inst Cancer Res Data driven motion correction for nuclear imaging
WO2005109346A1 (en) * 2004-05-06 2005-11-17 UNIVERSITé LAVAL 3d localization of objects from tomography data
EP1612734A2 (en) * 2004-06-30 2006-01-04 General Electric Company System and method for boundary estimation using CT metrology
EP1612734A3 (en) * 2004-06-30 2013-01-23 General Electric Company System and method for boundary estimation using CT metrology
US7176916B2 (en) 2005-04-15 2007-02-13 T.I.E.S., Inc. Object identifying system for segmenting unreconstructed data in image tomography
WO2007054843A1 (en) 2005-11-10 2007-05-18 Koninklijke Philips Electronics, N.V. Pet imaging using anatomic list mode mask
CN101454801B (en) * 2006-02-28 2012-12-05 皇家飞利浦电子股份有限公司 Local motion compensation based on list mode data
US8217937B2 (en) * 2007-03-28 2012-07-10 The Aerospace Corporation Isosurfacial three-dimensional imaging system and method
JP2010528312A (en) * 2007-05-30 2010-08-19 コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ PET local tomography
DE102008047840A1 (en) 2008-09-18 2010-04-01 Siemens Aktiengesellschaft Method for the at least partial determination and / or adaptation of an attenuation map used for the attenuation correction of positron emission tomography image data sets in a combined magnetic resonance positron emission tomography device
US8314617B2 (en) 2008-09-18 2012-11-20 Siemens Aktiengesellschaft Method for at least partly determining and/or adapting an attenuation map used for correcting attenuation of positron emission tomography image data sets in a combined magnetic resonance-positron emission tomography device
DE102008047840B4 (en) 2008-09-18 2018-08-30 Siemens Healthcare Gmbh Method for the at least partial determination and / or adaptation of an attenuation map used for the attenuation correction of positron emission tomography image data sets in a combined magnetic resonance positron emission tomography device
CN107945203A (en) * 2017-11-24 2018-04-20 中国科学院高能物理研究所 PET image processing method and processing device, electronic equipment, storage medium
CN110007068A (en) * 2019-03-25 2019-07-12 桂林优利特医疗电子有限公司 A kind of urine drip detection method

Also Published As

Publication number Publication date
EP0843868A1 (en) 1998-05-27
AU6622696A (en) 1997-02-26

Similar Documents

Publication Publication Date Title
Wang et al. Iterative X-ray cone-beam tomography for metal artifact reduction and local region reconstruction
US8175115B2 (en) Method and system for iterative reconstruction
US8150112B2 (en) Regional reconstruction of spatially distributed functions
WO1997005574A1 (en) Raw data segmentation and analysis in image tomography
US7176916B2 (en) Object identifying system for segmenting unreconstructed data in image tomography
Xu et al. An algorithm for efficient metal artifact reductions in permanent seed implants
Gifford et al. LROC analysis of detector-response compensation in SPECT
CN101111758A (en) Apparatus and method for correction or extension of x-ray projections
JPH11514446A (en) Multi-slice restricted projection PET
Berthon et al. PETSTEP: generation of synthetic PET lesions for fast evaluation of segmentation methods
CN107635469A (en) The estimation of the decay pattern met based on the scattering in PET system
Matej et al. Analytic TOF PET reconstruction algorithm within DIRECT data partitioning framework
McKee et al. A deconvolution scatter correction for a 3-D PET system
US10102650B2 (en) Model-based scatter correction for non-parallel-hole collimators
Stute et al. Realistic and efficient modeling of radiotracer heterogeneity in Monte Carlo simulations of PET images with tumors
Hebert et al. The GEM MAP algorithm with 3-D SPECT system response
Szlávecz et al. GPU-based acceleration of the MLEM algorithm for SPECT parallel imaging with attenuation correction and compensation for detector response
Khalil Emission tomography and image reconstruction
Zeng et al. Iterative reconstruction with attenuation compensation from cone-beam projections acquired via nonplanar orbits
KR101493683B1 (en) Super-resolution Apparatus and Method using LOR reconstruction based cone-beam in PET image
Hsieh et al. Projection space image reconstruction using strip functions to calculate pixels more" natural" for modeling the geometric response of the SPECT collimator
Reader et al. Attenuation and scatter correction of list-mode data driven iterative and analytic image reconstruction algorithms for rotating 3D PET systems
Bai et al. Postinjection single photon transmission tomography with ordered-subset algorithms for whole-body PET imaging
Hebert et al. A fully automated optimization algorithm for determining the 3-D patient contour from photo-peak projection data in SPECT
Kudrolli et al. SS3D-Fast fully 3D PET iterative reconstruction using stochastic sampling

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AL AM AT AU AZ BB BG BR BY CA CH CN CU CZ DE DK EE ES FI GB GE HU IL IS JP KE KG KP KR KZ LK LR LS LT LU LV MD MG MK MN MW MX NO NZ PL PT RO RU SD SE SG SI SK TJ TM TR TT UA UG US UZ VN AM AZ BY KG KZ MD RU TJ TM

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): KE LS MW SD SZ UG AT BE CH DE DK ES FI FR GB GR IE IT LU MC NL PT SE BF BJ CF CG CI CM

DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
121 Ep: the epo has been informed by wipo that ep was designated in this application
WWE Wipo information: entry into national phase

Ref document number: 1996925859

Country of ref document: EP

WWP Wipo information: published in national office

Ref document number: 1996925859

Country of ref document: EP

REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

NENP Non-entry into the national phase

Ref country code: CA

WWW Wipo information: withdrawn in national office

Ref document number: 1996925859

Country of ref document: EP