US20110038452A1 - Image domain based noise reduction for low dose computed tomography fluoroscopy - Google Patents
- Publication number
- US20110038452A1 (application Ser. No. 12/539,674)
- Authority
- US
- United States
- Prior art keywords
- images
- nta
- spr
- given
- recited
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/02—Arrangements for diagnosis sequentially in different planes; Stereoscopic radiation diagnosis
- A61B6/03—Computed tomography [CT]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
- G06T11/003—Reconstruction from projections, e.g. tomography
- G06T11/006—Inverse problem, transformation from projection-space into object-space, e.g. transform methods, back-projection, algebraic methods
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y10—TECHNICAL SUBJECTS COVERED BY FORMER USPC
- Y10S—TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y10S378/00—X-ray or gamma ray systems or devices
- Y10S378/901—Computer tomography program or processor
Definitions
- The blending curve may also be chosen automatically. Using statistics of the gradient image (mean, median and standard deviation), the soft-tissue region and the high-gradient regions are segregated as shown in FIG. 9 , and a blending curve is selected accordingly.
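A sketch of one such automatic choice, using the gradient-image statistics just mentioned. Setting the curve center to the mean gradient value is stated elsewhere in the description; deriving the width from the standard deviation, and the function name itself, are assumptions for illustration.

```python
import numpy as np

def auto_curve_parameters(grad_img, w_scale=1.0):
    """Pick blending-curve parameters from the gradient image itself.

    x0 = mean gradient separates low-gradient (soft tissue) pixels from
    high-gradient (edge) pixels; deriving w from the standard deviation
    is an illustrative assumption, not taken from the patent.
    """
    x0 = float(np.mean(grad_img))
    w = float(w_scale * np.std(grad_img)) or 1.0  # guard against w == 0
    return x0, w
```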
- An example of an image obtained according to the invention is illustrated in FIG. 11 .
- The top image is the FS image, the middle image is the NTA image, and the bottom image is the blended image obtained from the above equation.
- Three regions are indicated in the image. Region 91 shows the sharp tip of a needle; the same needle in the NTA image is blurred. Region 92 shows that the edges in the blended image are much sharper than in the NTA image. Region 93 shows how noise is reduced in the blended image compared to the FS image.
- Thus, noise can be reduced while maintaining sharpness.
- A more detailed view of processing unit 12 is shown in FIG. 1A .
- The projection data is collected, and the data for each of a desired number of views is processed in processing unit 12 by processor 16 to create the partial images PI(n), which are stored in registers or memory portions 15 - 1 to 15 - n of memory 15 .
- Processing unit 12 also generates the FS(n) images, OTA images, NTA images, and gradient images and stores them in other registers or portions 15 - o , 15 - p , . . . of memory 15 .
- Processing unit 12 performs the blending using blending curves stored in register or memory portion 15 - m , selecting the blending curve as described above to create the weighted images.
- The weighted images are also stored in 15 - o , 15 - p , . . . as needed.
- The images generated in processing unit 12 are sent to display 14 for display to the user.
- The images created and stored may also be transferred to other users or systems using a network such as a LAN, wireless LAN or the Internet connected to the CT apparatus.
- The invention may also be embodied in the form of a computer-readable medium containing a stored program to cause a computer to carry out the various operations and functions described above.
Abstract
A method of computed-tomography and a computed-tomography apparatus in which x-ray projection data is acquired at a number of views for a scan of an object. Partial images are created from data for a desired number of said views. Full scan images are created from plural ones of the partial images. Non-overlapping time images are created from the full-scan images. Gradient images are also created. An improved image is created by weighting respective ones of the full scan and non-overlapping time images using the gradient image. The improved image has increased sharpness with reduced noise.
Description
- 1. Field of the Invention
- The present invention relates to reducing noise in computed tomography (CT) images during CT-fluoroscopy.
- 2. Discussion of the Background
- CT-fluoroscopy involves continuous scanning of a slice or volume of a subject for monitoring in real time, such as monitoring interventions. If a regular dose of x-rays is used, the subject will be exposed to a large x-ray dose. If a lower dose is used, then image noise is increased. In CT, image noise is inversely proportional to the square root of the x-ray tube current. As the tube current is decreased to reduce dose, the image noise increases, resulting in poor image quality. One method used to reduce image noise is to average the image slices at the same location, but this produces blurring of the edges since there is bound to be movement of the subject, voluntary or involuntary, during the scan. For example, involuntary motion can be due to breathing or beating of the heart.
- One aspect of the present invention is a computed-tomography method including exposing an object with x-rays at a plurality of scans at a position of the object to obtain projection data at a plurality of views, defining a group of views, where each scan includes a first number of the groups, generating first images respectively using projection data from each group of views, generating second images from plural ones of the first images, generating third images by averaging respective pluralities of the second images, generating a gradient image using at least one of the second and third images, and generating a display image by weighting one of the second images and one of the third images using the gradient image.
- In another aspect of the invention, a computed-tomography apparatus includes an x-ray source to expose an object with x-rays at a plurality of scans at a position of the object to obtain projection data at a plurality of views, an x-ray detector, a data collection unit, a data processing unit connected to the data collection unit, and a display. The data processing unit includes a memory storing x-ray projection data for a plurality of scans at a position of an object to obtain projection data at a plurality of views, and the data processing unit generates first images respectively using projection data from each group of views, generates second images from plural ones of the first images, generates third images by averaging respective pluralities of the second images, generates a gradient image using at least one of the second and third images, and generates a display image on the display by weighting one of the second images and one of the third images using the gradient image.
- A more complete appreciation of the invention and many of the attendant advantages thereof will be readily obtained as the same becomes better understood by reference to the following detailed description when considered in connection with the accompanying drawings, wherein:
-
FIG. 1 is a diagram of a system according to the invention; -
FIG. 1A is a diagram of the processing unit of FIG. 1 ; -
FIG. 2 is a matrix of views collected over one rotation of the x-ray source; -
FIG. 3 is a diagram of view blocks and image reconstruction over the view blocks; -
FIG. 4 is a diagram illustrating partial images; -
FIG. 5 is a diagram illustrating full-scan images; -
FIG. 6 is a diagram illustrating non-overlapping time images; -
FIG. 7 is a diagram illustrating combining images; -
FIG. 8 is a graph illustrating a blending curve; -
FIG. 9 is a graph illustrating gradient values in an image; -
FIG. 10 is a graph of blending factor as a function of gradient value; and -
FIG. 11 shows full-scan, non-overlapping and blended images. -
FIG. 1 shows an x-ray computed tomographic imaging device according to the present invention. The device may be operated at different x-ray doses to carry out different types of scanning, such as CT fluoroscopy. The projection data measurement system constituted by gantry 1 accommodates an x-ray source 3 that generates an approximately cone-shaped beam of x-ray flux, and a two-dimensional array type x-ray detector 5 consisting of a plurality of detector elements arranged in two-dimensional fashion, i.e., a plurality of elements arranged in one dimension stacked in a plurality of rows. X-ray source 3 and two-dimensional array type x-ray detector 5 are installed on a rotating ring 2 facing each other across a subject, who is laid on a sliding sheet of a bed 6 . Two-dimensional array type x-ray detector 5 is mounted on rotating ring 2 . Each detector element corresponds to one channel. X-rays from x-ray source 3 are directed onto the subject through an x-ray filter 4 . X-rays that have passed through the subject are detected as an electrical signal by two-dimensional array type x-ray detector 5 . -
X-ray controller 8 supplies a trigger signal to high voltage generator 7 . High voltage generator 7 applies high voltage to x-ray source 3 at the timing at which the trigger signal is received, causing x-rays to be emitted from x-ray source 3 . Gantry/bed controller 9 synchronously controls the revolution of rotating ring 2 of gantry 1 and the sliding of the sliding sheet of bed 6 . System controller 10 constitutes the control center of the entire system and controls x-ray controller 8 and gantry/bed controller 9 such that, as seen from the subject, x-ray source 3 executes so-called helical scanning, in which it moves along a helical path. Specifically, rotating ring 2 is continuously rotated with fixed angular speed while the sliding plate is displaced with fixed speed, and x-rays are emitted continuously or intermittently at fixed angular intervals from x-ray source 3 . The source may also be scanned circularly. - The output signal of two-dimensional array
type x-ray detector 5 is amplified by data collection unit 11 for each channel and converted to a digital signal, to produce projection data. The projection data output from data collection unit 11 is fed to processing unit 12 . Processing unit 12 performs various processing using the projection data, including interpolation, backprojection and reconstruction. Unit 12 determines backprojection data reflecting the x-ray absorption in each voxel. In the helical scanning system using a cone-beam of x-rays, the imaging region (effective field of view) is of cylindrical shape centered on the axis of revolution. Unit 12 defines a plurality of voxels (three-dimensional pixels) in this imaging region, and finds the backprojection data for each voxel. The three-dimensional image data or tomographic image data compiled by using this backprojection data is sent to display device 14 , where it is displayed visually as a three-dimensional image or tomographic image. - In typical CT operation, projection data is collected over one rotation of the x-ray source (full scan). The number of views collected per rotation in time (TRot) is NVPR, and during each view, data is collected from a set of detectors Nd. There may be one or more rows of detectors. For ease of explanation, a detector with one row is considered. The views collected over one rotation can be represented as a matrix shown in
FIG. 2 . Each cell in the matrix represents a sample of the data collected at any given view (y-axis) and any given channel (x-axis). - A more detailed view of
collection unit 11 and processing unit 12 is shown in FIG. 1A . The projection data is collected and the data for each of a desired number of views is stored in a register or portion of memory unit 11 - 1 to 11 - n . FIG. 1A will be described in more detail below. - For CT fluoroscopy, the same slice position is scanned repeatedly for more than one rotation (NRot). The total number of views collected is given by NRot·NVPR, compared with just NVPR in the case of typical CT operation. Since there is a continuous feed of the views, it is not necessary to wait until the end of an integral number of TRot to reconstruct an image. A real-time image may be reconstructed using data views equal to NVPR at any given time (the views are counted backwards from any point in time). Preferably, real-time images are reconstructed at a desired fraction of the rotation, such as every ¼ or ⅙ rotation.
-
FIG. 3 illustrates an example with the number of sections per rotation NSPR=4. An image may be reconstructed every TRot/NSPR. As an example, for a rotation time TRot=1 sec, an image may be produced every 0.25 sec. This provides an effect similar to real-time image production or CT fluoroscopy. In FIG. 3 , NSPR=4, but it can take on other values such as 6 or 8; the higher the number, the more the image appears to be real-time. - The upper limit is determined by the hardware speed and memory needed to reconstruct images. For example, having four partial images per second implies four displayed images per second. In an extreme limit, in a mathematical sense, a partial image may be created after every view, that is, 900 partial images per second or 900 displayed images per second (in this example). However, for the human eye, anything beyond 25-30 images per second is not significant. Hence, in practice no more than about 20 or 25 partial images per second (900 views) may be computed to provide good quality partial images. Note that 900 views per second is used for example purposes, but this number can take on other values as needed.
- As an example, assume that a total of 1800 views are collected and 900 views are required to reconstruct 1 image (Full-Scan). Then, in theory an image can be reconstructed using the view ranges (1 . . . 900), (2 . . . 901), (3 . . . 902) & so on. However, in practice, the ability of the hardware to keep up with the pace of reconstruction may be limited.
- In another example, if NVPR=900, each view block contains 225 (900/4) views. There will be a significant overlap in terms of views when reconstructing consecutive images. It is therefore not necessary to backproject NVPR views to reconstruct every single image. Partial images, shown in
FIG. 4 , may be used. Each partial image PI is formed by backprojecting only those views within the block. For example, PI(0) is a partial image formed from view block n=0, etc. Full scan images (FS) are formed by: -
- In the example of
FIG. 4 , -
First image=PI(0)+PI(1)+PI(2)+PI(3)=FS(3) -
Second image=first image−PI(0)+PI(4)=FS(4) -
Third image=second image−PI(1)+PI(5)=FS(5) - Using one adding and one subtracting operation to create the images reduces the number of operations as opposed to three additions. Here, a partial image (PI) can be computed from as small as one view. In the example, 900 consecutive (in time) partial images may be added added to give one full scan image. Computationally, using larger number of views (such as 225 in the example) to create partial images is more practical. Further, partial images may be computed using a partial scan, such as a half-scan image.
- According to the invention, the images may be averaged before being displayed. This is illustrated in
FIG. 5 . In FIG. 5 , OTA denotes Overlapping Time Average. The displayed images OTA are computed in unit 12 by:
- In this example:
-
First Display Image OTA(5)=average (FS(3)+FS(4)+FS(5)) -
Second Display Image OTA(6)=average (FS(4)+FS(5)+FS(6)) - The above OTA approach works ideally when the object being scanned is stationary. However, when there is voluntary or involuntary motion, edges in the displayed image may be blurred. In a second approach to noise reduction, non-overlapping time images (NTA) are averaged. These images are smooth (less noise). This is illustrated in
FIG. 6 . The NTA images are computed by: -
- NNTA is defined as the number of non-overlapping time average images. For example,
-
NTA(11)=FS(3)+FS(7)+FS(11). -
FIG. 7 illustrates a further approach to producing an improved imaged. At the end of any view block, there are two different images that may be displayed, the NTA and FS images. The NTA image (smooth image) is combined with the FS image (sharp image) to produce an image with sharp edges without degrading the image smoothness. Here, smoothed image(11)=FS(11)++NTA(11). The symbol “++” is used to denote a blend of the images, and not an addition of corresponding voxels in the 2 images. The FS or NTA image may be defined by the newest collected view block which is 11 in the schematic ofFIG. 6 . Since this is ‘real-time’ the views inview block 12 are not being used for computation as yet, although they might be getting collected as the hardware computes FS(11), NTA(11) and FS(11)++NTA(11). - A gradient image, described in more detail below, is used to determine the contribution to each pixel in the display image from the NTA image and from the FS image. For pixels in the gradient image that have a high value (indicating an edge), the pixels in the display image will have a significantly larger contribution from the FS image (sharp image) and pixels in the display image that have a low value (indicating smooth regions) will have a larger contribution from the NTA image (smooth image).
- The gradient image may be obtained as follows:
- In a first approach, a difference of consecutive FS images is found, and there is (NSPR−1)/NSPR rotation overlap.
-
Grad1k = abs(FS(k)−FS(k−1)), where k ≥ NSPR. - In a second approach, a difference of FS images is found, with no overlap between the images.
-
Grad2m = abs(FS(m)−FS(m−NSPR)), where m ≥ 2·NSPR−1. - In a third approach, a difference between FS and NTA images is found
-
Grad3p = abs(FSp−NTAp), where p ≥ NNTA·NSPR−1. - If there is object motion (as is usually the case),
scheme 1 is a better approach than scheme 2. - Once the gradient image is obtained, the gradient, FS and NTA images are blended.
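The three gradient schemes can be sketched as below. The list `fs` and the value NSPR = 4 are illustrative assumptions:

```python
import numpy as np

N_SPR = 4  # sections per rotation (assumed illustrative value)

def grad1(fs, k):
    """Scheme 1: absolute difference of consecutive FS images (k >= N_SPR);
    the two images share (N_SPR - 1)/N_SPR of a rotation's views."""
    return np.abs(fs[k] - fs[k - 1])

def grad2(fs, m):
    """Scheme 2: absolute difference of FS images with no overlapping views
    (m >= 2*N_SPR - 1)."""
    return np.abs(fs[m] - fs[m - N_SPR])

def grad3(fs_img, nta_img):
    """Scheme 3: absolute difference between the FS and NTA images."""
    return np.abs(fs_img - nta_img)

fs = [np.full((2, 2), float(i)) for i in range(9)]
print(grad1(fs, 5)[0, 0])  # 1.0
print(grad2(fs, 8)[0, 0])  # 4.0
```

With object motion, scheme 1's one-view-block gap keeps the difference image from being dominated by motion, which is why it is preferred over scheme 2.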
FIG. 8 shows one blending curve, which is represented by the following equation: -
- Here, x0 and w are parameters, where x0 represents the “center” of the curve and w controls the “width” of the curve.
- The parameters may be chosen by an operator or can be set automatically depending on the scan conditions and the slice position in the object being imaged.
FIG. 8 shows typical values of x0=40 and w=15. In FIG. 8, x0 and w were selected and plugged into the above equation to obtain the curve. These values are just an example. In general, x0 may be automatically selected by computing the average value of voxels in the gradient image, and w is set based on image quality. As shown in FIGS. 8 and 10, w can take on a range of values, such as between 15 and 30. - The gradient curve remains fixed for every pixel. In other words, the ‘shape’ of the curve does not depend on the ‘x’ value, which is the gradient value at any voxel. Therefore, going from one voxel to another amounts to moving along the x-axis, which in turn yields a corresponding value (α) on the y-axis. However, the value of α for each pixel is different; it is determined by the value of the gradient at that pixel, and is given by:
-
- For each pixel in the gradient image, a new value of α is determined based on the gradient value.
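The blending-curve equation itself appears only as a figure in the source. A logistic (sigmoid) curve is one common form consistent with the description of x0 as the curve's center and w as its width, so the sketch below is an assumption rather than the patented formula:

```python
import math

def alpha(x, x0=40.0, w=15.0):
    """Hypothetical blending curve: alpha near 0 for low gradient values
    (smooth regions, favor NTA) and near 1 for high gradient values
    (edges, favor FS). x0 = 40 and w = 15 follow FIG. 8's example."""
    return 1.0 / (1.0 + math.exp(-(x - x0) / w))

print(alpha(40.0))   # 0.5 at the curve's center x0
print(alpha(100.0))  # close to 1: a strong edge draws mostly from FS
```

A smaller w steepens the transition between the NTA-dominated and FS-dominated regimes, matching the family of curves in FIG. 10.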
- At any given pixel, if the gradient value is high, a higher value of α is used so that a larger contribution to the displayed image comes from the FS (sharp) image; if the gradient value is low, the pixel belongs to a low-frequency region and a larger contribution comes from the NTA (smooth) image. The following equation describes the blending to obtain the blended image BI.
-
BIp(n) = (1−α)·NTAp(n) + α·FSp(n), - where 0 < p < number of pixels.
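The per-pixel blend can be sketched as one vectorized step. The sigmoid weight below is an assumed curve shape (the source gives the curve only as a figure), and all array names are illustrative:

```python
import numpy as np

def blend(fs_img, nta_img, grad_img, x0=40.0, w=15.0):
    """Blended image BI = (1 - alpha)*NTA + alpha*FS, with alpha computed
    per pixel from the gradient image via an assumed sigmoid curve."""
    a = 1.0 / (1.0 + np.exp(-(grad_img - x0) / w))
    return (1.0 - a) * nta_img + a * fs_img

fs_img = np.full((2, 2), 10.0)               # sharp but noisy stand-in
nta_img = np.zeros((2, 2))                   # smooth stand-in
grad_img = np.array([[0.0, 100.0], [0.0, 100.0]])
bi = blend(fs_img, nta_img, grad_img)
print(bi[0, 1] > 9.5)   # True: edge pixel draws almost entirely from FS
print(bi[0, 0] < 1.0)   # True: smooth pixel draws almost entirely from NTA
```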
- The gradient curve may be automatically selected. When the gradient image is computed, the statistics (mean, median and standard deviation) of the noise values in a soft-tissue region may be calculated. On the x-axis, which represents the gradient value, the soft-tissue region and the high-gradient regions will be separated as shown in
FIG. 9. A point on the gradient axis (x-axis) that results in weight=0.75 is termed the pivot point; that is, the pivot point is the gradient value that gives a fixed blending weight of 0.75. Thus, using the location of the pivot point with respect to the soft-tissue region statistics, the blending curve can be automatically chosen. FIG. 10 illustrates different curves, for w=1, 5, 10 and 20. Curves for other values of w may be generated and used, as needed. - An example of an image obtained according to the invention is illustrated in
FIG. 11. The top image is the FS image, the middle image is the NTA image, and the bottom image is the blended image obtained from the above equation. Three regions are indicated in the image. Region 91 shows the sharp tip of a needle; the same needle in the NTA image is blurred. Region 92 shows that the edges in the blended image are much sharper than in the NTA image. Region 93 shows how noise is reduced in the blended image compared to the FS image. Thus, according to the invention, noise can be reduced while sharpness is maintained. - A more detailed view of
processing unit 12 is shown in FIG. 1A. The projection data is collected, and the data for each of a desired number of views is processed in processing unit 12 by processor 16 to create the partial images PI(n), which are stored in registers or memory portions 15-1 to 15-n of memory 15. Processing unit 12 also generates the FS(n) images, OTA images, NTA images, and gradient images and stores them in other registers or portions 15-o, 15-p, . . . of memory 15. Processing unit 12 performs the blending using blending curves stored in register or memory portion 15-m, and selects the blending curve, as described above, to create the weighted images. The weighted images are also stored in portions 15-o, 15-p, . . . as needed. The images generated in processing unit 12 are sent to display 14 for display to the user. The images created and stored may also be transferred to other users or systems over a network, such as a LAN, wireless LAN or the Internet, connected to the CT apparatus. - The invention may also be embodied in the form of a computer-readable medium containing a stored program to cause a computer to carry out the various operations and functions described above.
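The display path through processing unit 12 can be summarized in one sketch. The scheme-1 gradient choice, the sigmoid curve shape, and all parameter values here are illustrative assumptions:

```python
import numpy as np

def display_image(fs_images, k, n_nta=3, n_spr=4, x0=40.0, w=15.0):
    """End-to-end sketch: an NTA image from non-overlapping FS images, a
    gradient image from consecutive FS images, a per-pixel sigmoid weight,
    and the final blend sent to the display."""
    fs = fs_images[k]
    nta = np.mean([fs_images[k - i * n_spr] for i in range(n_nta)], axis=0)
    grad = np.abs(fs - fs_images[k - 1])
    a = 1.0 / (1.0 + np.exp(-(grad - x0) / w))
    return (1.0 - a) * nta + a * fs

fs_images = [np.full((2, 2), float(i)) for i in range(12)]
out = display_image(fs_images, 11)  # blend of FS(11) and NTA(11)
```

In a real system each step would run per view block as projection data arrives, with the intermediate images held in the memory portions described above.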
- Numerous other modifications and variations of the present invention are possible in light of the above teachings. It is therefore to be understood that within the scope of the appended claims, the invention may be practiced otherwise than as specifically described herein.
Claims (21)
1. A method for generating a computed-tomography image, comprising:
exposing an object with x-rays at a plurality of scans at a position of said object to obtain projection data at a plurality of views;
generating first images serially in time using the projection data;
generating second images by smoothing respective pluralities of said first images;
generating a gradient image based on at least one of said first images; and
generating a display image by weight blending one of said first images and one of said second images using said gradient image.
2. A method as recited in claim 1 , comprising:
defining a plurality of said views as a view block;
generating consecutive third images from respective consecutive view blocks; and
generating each of said first images from a plurality of consecutive third images.
3. A method as recited in claim 2 , comprising:
using projection data from said view blocks to produce a respective plurality of said third images.
4. A method as recited in claim 2 , comprising:
generating a first one of said first images using a first plurality of said third images; and
generating a second one of said first images by subtracting a first one of said third images from said first one of said first images and adding a next third image subsequent to said plurality of third images to said first one of said first images.
5. A method as recited in claim 1 , wherein said second images comprise non-overlapping time images (NTA(k)), said first images are given by FS(k), and generating said second images comprises:
where: NNTA is a number of NTA images, and
NSPR is a number of sections per rotation of said x-ray source.
6. A method as recited in claim 1 , wherein said first images are given by FS(k) and said gradient images are given by:
Gradk = abs(FS(k)−FS(k−1)), where k ≥ NSPR.
where NSPR is a number of sections per rotation of said x-ray source.
7. A method as recited in claim 1 , wherein said first images are given by FS(m) and said gradient images are given by:
Gradm = abs(FS(m)−FS(m−NSPR)), where m ≥ 2·NSPR−1
where NSPR is a number of sections per rotation of said x-ray source.
8. A method as recited in claim 1 , wherein said first images are given by FS(k), said second images are given by NTAp and said gradient images are given by:
Gradp = abs(FSp−NTAp), where p ≥ NNTA·NSPR−1
where: NNTA is a number of said second images
NSPR is a number of sections per rotation of said x-ray source.
9. A method as recited in claim 1 , wherein said first images are FSp(n), said second images are NTAp(n) and said display images are given as:
BIp(n) = (1−α)·NTAp(n) + α·FSp(n)
where α is a weighting factor.
10. A method as recited in claim 9 , wherein α is given as:
11. A method as recited in claim 9 , comprising:
using a blending curve to weight said first and second images.
12. A method as recited in claim 11 , comprising:
automatically selecting said blending curve.
13. A computed-tomography apparatus, comprising:
an x-ray source to expose an object with x-rays at a plurality of scans at a position of said object to obtain projection data at a plurality of views;
an x-ray detector;
a data collection unit;
a data processing unit connected to said data collection unit; and
a display,
wherein:
said data processing unit includes a memory storing x-ray projection data for a plurality of scans at a position of the object to obtain projection data at a plurality of views; and
said data processing unit generates first images serially in time using the projection data, generates second images by smoothing respective pluralities of said first images, generates a gradient image based on at least one of said first images, and generates a display image by weight blending one of said first images and one of said second images using said gradient image.
14. An apparatus as recited in claim 13 , wherein said second images comprise non-overlapping time images (NTA(k)), said first images are given by FS(k), and said second images are generated by said data processing unit as:
where: NNTA is a number of NTA images, and
NSPR is a number of sections per rotation of said x-ray source.
15. An apparatus as recited in claim 13 , wherein said first images are given by FS(k) and said gradient images are given by:
Gradk = abs(FS(k)−FS(k−1)), where k ≥ NSPR.
where NSPR is a number of sections per rotation of said x-ray source.
16. An apparatus as recited in claim 13 , wherein said first images are given by FS(m) and said gradient images are given by:
Gradm = abs(FS(m)−FS(m−NSPR)), where m ≥ 2·NSPR−1
where NSPR is a number of sections per rotation of said x-ray source.
17. An apparatus as recited in claim 13 , wherein said first images are given by FS(k), said second images are given by NTAp and said gradient images are given by:
Gradp = abs(FSp−NTAp), where p ≥ NNTA·NSPR−1
where: NNTA is a number of said second images
NSPR is a number of sections per rotation of said x-ray source.
18. An apparatus as recited in claim 13 , wherein said first images are FSp(n), said second images are NTAp(n) and said display images are given as:
BIp(n) = (1−α)·NTAp(n) + α·FSp(n)
where α is a weighting factor.
19. An apparatus as recited in claim 18 , wherein α is given as:
20. An apparatus as recited in claim 18 , comprising:
said data processing unit using a blending curve to weight said first and second images.
21. An apparatus as recited in claim 20 , comprising said data processing unit automatically selecting said blending curve.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US12/539,674 US20110038452A1 (en) | 2009-08-12 | 2009-08-12 | Image domain based noise reduction for low dose computed tomography fluoroscopy |
JP2010181051A JP5637768B2 (en) | 2009-08-12 | 2010-08-12 | Method for generating computer tomography image and computer tomography apparatus |
US13/557,467 US8687871B2 (en) | 2009-08-12 | 2012-07-25 | Image domain based noise reduction for low dose computed tomography fluoroscopy |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US12/539,674 US20110038452A1 (en) | 2009-08-12 | 2009-08-12 | Image domain based noise reduction for low dose computed tomography fluoroscopy |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/557,467 Continuation US8687871B2 (en) | 2009-08-12 | 2012-07-25 | Image domain based noise reduction for low dose computed tomography fluoroscopy |
Publications (1)
Publication Number | Publication Date |
---|---|
US20110038452A1 true US20110038452A1 (en) | 2011-02-17 |
Family
ID=43588590
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/539,674 Abandoned US20110038452A1 (en) | 2009-08-12 | 2009-08-12 | Image domain based noise reduction for low dose computed tomography fluoroscopy |
US13/557,467 Active US8687871B2 (en) | 2009-08-12 | 2012-07-25 | Image domain based noise reduction for low dose computed tomography fluoroscopy |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/557,467 Active US8687871B2 (en) | 2009-08-12 | 2012-07-25 | Image domain based noise reduction for low dose computed tomography fluoroscopy |
Country Status (2)
Country | Link |
---|---|
US (2) | US20110038452A1 (en) |
JP (1) | JP5637768B2 (en) |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6743396B2 (en) * | 2016-01-25 | 2020-08-19 | Tdk株式会社 | Bandpass filters and duplexers |
US20170215818A1 (en) * | 2016-02-03 | 2017-08-03 | General Electric Company | High-resolution computed tomography or c-arm imaging |
Family Cites Families (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS61125332A (en) * | 1984-11-20 | 1986-06-13 | コニカ株式会社 | X-ray image treatment method |
JPH01147677A (en) * | 1987-12-02 | 1989-06-09 | Toshiba Corp | Image processor |
JP3730282B2 (en) * | 1995-05-02 | 2005-12-21 | 株式会社東芝 | Computed tomography equipment |
JPH0969157A (en) * | 1995-09-01 | 1997-03-11 | Konica Corp | Method for processing radiation image |
JPH119589A (en) * | 1997-04-30 | 1999-01-19 | Hitachi Medical Corp | X-ray ct tomograph and image recomposing method |
US5907593A (en) | 1997-11-26 | 1999-05-25 | General Electric Company | Image reconstruction in a CT fluoroscopy system |
US6801594B1 (en) | 1997-11-26 | 2004-10-05 | General Electric Company | Computed tomography fluoroscopy system |
JP2000271113A (en) * | 1999-01-20 | 2000-10-03 | Toshiba Corp | Computerized tomograph |
JP4233079B2 (en) * | 2001-10-26 | 2009-03-04 | 株式会社東芝 | CT equipment |
CN1589741B (en) | 2003-08-25 | 2010-04-21 | 株式会社东芝 | X-ray CT apparatus |
JP5260036B2 (en) * | 2007-12-17 | 2013-08-14 | ジーイー・メディカル・システムズ・グローバル・テクノロジー・カンパニー・エルエルシー | X-ray CT system |
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5594767A (en) * | 1995-11-02 | 1997-01-14 | General Electric Company | Methods and apparatus for enhancing image sharpness |
US6400789B1 (en) * | 1997-02-20 | 2002-06-04 | Philips Medical Systems Technologies Ltd. | On-line image reconstruction in helical CT scanners |
US6970585B1 (en) * | 1997-02-20 | 2005-11-29 | Koninklijke Philips Electronics N.V. | Real-time dynamic image reconstruction |
US20090052727A1 (en) * | 2007-08-24 | 2009-02-26 | Christian Eusemann | Methods for non-linear image blending, adjustment and display |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20110135184A1 (en) * | 2009-12-03 | 2011-06-09 | Canon Kabushiki Kaisha | X-ray image combining apparatus and x-ray image combining method |
US20130303898A1 (en) * | 2012-05-09 | 2013-11-14 | Paul E. Kinahan | Respiratory motion correction with internal-external motion correlation, and associated systems and methods |
US9451926B2 (en) * | 2012-05-09 | 2016-09-27 | University Of Washington Through Its Center For Commercialization | Respiratory motion correction with internal-external motion correlation, and associated systems and methods |
US20170103551A1 (en) * | 2015-10-13 | 2017-04-13 | Shenyang Neusoft Medical Systems Co., Ltd. | Reconstruction and combination of pet multi-bed image |
US10043295B2 (en) * | 2015-10-13 | 2018-08-07 | Shenyang Neusoft Medical Systems Co., Ltd. | Reconstruction and combination of pet multi-bed image |
US10839573B2 (en) * | 2016-03-22 | 2020-11-17 | Adobe Inc. | Apparatus, systems, and methods for integrating digital media content into other digital media content |
CN117243627A (en) * | 2023-11-16 | 2023-12-19 | 有方(合肥)医疗科技有限公司 | CBCT image processing method and device |
Also Published As
Publication number | Publication date |
---|---|
JP5637768B2 (en) | 2014-12-10 |
US20120294417A1 (en) | 2012-11-22 |
US8687871B2 (en) | 2014-04-01 |
JP2011036671A (en) | 2011-02-24 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US8687871B2 (en) | Image domain based noise reduction for low dose computed tomography fluoroscopy | |
US9662084B2 (en) | Method and apparatus for iteratively reconstructing tomographic images from electrocardiographic-gated projection data | |
US6665370B2 (en) | Computed tomography method and apparatus for acquiring images dependent on a time curve of a periodic motion of the subject | |
JP5248648B2 (en) | Computer tomography system and method | |
JP5180181B2 (en) | Computer tomography data collection apparatus and method | |
EP1716809B1 (en) | Tomogram reconstruction method and tomograph | |
Abella et al. | Software architecture for multi-bed FDK-based reconstruction in X-ray CT scanners | |
JP5199081B2 (en) | Suppression of band artifact in cardiac CT scan | |
EP1800264B1 (en) | Image reconstruction with voxel dependent interpolation | |
Kachelrieß et al. | Extended parallel backprojection for standard three‐dimensional and phase‐correlated four‐dimensional axial and spiral cone‐beam CT with arbitrary pitch, arbitrary cone‐angle, and 100% dose usage | |
US20070195925A1 (en) | Streak artifact reduction in cardiac cone beam ct reconstruction | |
CN103608839A (en) | Contrast-dependent resolution image | |
CN102947861A (en) | Method and system for noise reduction in low dose computed tomography | |
JP2013085960A (en) | Image reconstitution method and image reconstitution system | |
JP2010158512A (en) | X-ray computerized tomographic apparatus, medical image processor, and medical image processing program | |
JP2008519636A (en) | Computed tomography method for inspection of periodic moving objects | |
EP0969414A2 (en) | Computerized tomographic multi-frame image reconstruction method and apparatus for helical scanning | |
US8494111B2 (en) | System and method for image reconstruction for helical cone beam computed tomography with factorized redundancy weighting | |
JP2002034970A (en) | Method and device for spiral reconstitution in multi-slice ct scan | |
WO2007004196A2 (en) | Exact fbp type algorithm for arbitrary trajectories | |
Shechter et al. | The frequency split method for helical cone‐beam reconstruction | |
US8315351B2 (en) | System and method for tomographic reconstruction utilizing circular trajectory and scanogram to reduce artifacts | |
US6999550B2 (en) | Method and apparatus for obtaining data for reconstructing images of an object | |
CN102656609A (en) | Motion compensation with tissue density retention | |
Slagowski et al. | Feasibility of CT-based 3D anatomic mapping with a scanning-beam digital x-ray (SBDX) system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |