US20110038452A1 - Image domain based noise reduction for low dose computed tomography fluoroscopy - Google Patents

Image domain based noise reduction for low dose computed tomography fluoroscopy Download PDF

Info

Publication number
US20110038452A1
US20110038452A1 (application US12/539,674; serial US53967409A)
Authority
US
United States
Prior art keywords
images
nta
spr
given
recited
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US12/539,674
Other languages
English (en)
Inventor
Sachin Moghe
Ilmar Hein
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Toshiba Corp
Canon Medical Systems Corp
Original Assignee
Toshiba Corp
Toshiba Medical Systems Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Toshiba Corp, Toshiba Medical Systems Corp filed Critical Toshiba Corp
Priority to US12/539,674 priority Critical patent/US20110038452A1/en
Priority to JP2010181051A priority patent/JP5637768B2/ja
Publication of US20110038452A1 publication Critical patent/US20110038452A1/en
Priority to US13/557,467 priority patent/US8687871B2/en
Abandoned legal-status Critical Current

Links

Images

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B6/00 Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B6/02 Arrangements for diagnosis sequentially in different planes; Stereoscopic radiation diagnosis
    • A61B6/03 Computed tomography [CT]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/00 2D [Two Dimensional] image generation
    • G06T11/003 Reconstruction from projections, e.g. tomography
    • G06T11/006 Inverse problem, transformation from projection-space into object-space, e.g. transform methods, back-projection, algebraic methods
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y10 TECHNICAL SUBJECTS COVERED BY FORMER USPC
    • Y10S TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y10S378/00 X-ray or gamma ray systems or devices
    • Y10S378/901 Computer tomography program or processor

Definitions

  • the present invention relates to reducing noise in computed tomography (CT) images during CT-fluoroscopy.
  • CT-fluoroscopy involves continuous scanning of a slice or volume of a subject for monitoring in real time, such as monitoring interventions. If a regular dose of x-rays is used, the subject will be exposed to a large x-ray dose. If a lower dose is used, then image noise is increased. In CT, image noise is inversely proportional to the square root of the x-ray tube current. As the tube current is decreased to reduce dose, the image noise increases, resulting in poor image quality.
  • One method used to reduce image noise is to average the image slices at the same location, but this produces blurring of the edges since there is bound to be movement of the subject, voluntary or involuntary, during the scan. For example, involuntary motion can be due to breathing or beating of the heart.
  • One aspect of the present invention is a computed-tomography method including exposing an object with x-rays at a plurality of scans at a position of the object to obtain projection data at a plurality of views, defining a group of views, where each scan includes a first number of the groups, generating first images respectively using projection data from each group of views, generating second images from plural ones of the first images, generating third images by averaging respective pluralities of the second images, generating a gradient image using at least one of the second and third images, and generating a display image by weighting one of the second images and one of the third images using the gradient image.
  • In another aspect of the invention, a computed-tomography apparatus includes an x-ray source to expose an object with x-rays at a plurality of scans at a position of the object to obtain projection data at a plurality of views, an x-ray detector, a data collection unit, a data processing unit connected to the data collection unit, and a display.
  • the data processing unit includes a memory storing x-ray projection data for a plurality of scans at a position of an object to obtain projection data at a plurality of views, and the data processing unit generates first images respectively using projection data from each group of views, generates second images from plural ones of the first images, generates third images by averaging respective pluralities of the second images, generates a gradient image using at least one of the second and third images, and generates a display image on the display by weighting one of the second images and one of the third images using the gradient image.
  • FIG. 1 is a diagram of a system according to the invention;
  • FIG. 1A is a diagram of the processing unit of FIG. 1;
  • FIG. 2 is a matrix of views collected over one rotation of the x-ray source;
  • FIG. 3 is a diagram of view blocks and image reconstruction over the view blocks;
  • FIG. 4 is a diagram illustrating partial images;
  • FIG. 5 is a diagram illustrating full-scan images;
  • FIG. 6 is a diagram illustrating non-overlapping time images;
  • FIG. 7 is a diagram illustrating combining images;
  • FIG. 8 is a graph illustrating a blending curve;
  • FIG. 9 is a graph illustrating gradient values in an image;
  • FIG. 10 is a graph of blending factor as a function of gradient value; and
  • FIG. 11 shows full-scan, non-overlapping and blended images.
  • FIG. 1 shows an x-ray computed tomographic imaging device according to the present invention.
  • the device may be operated at different x-ray doses to carry out different types of scanning, such as CT fluoroscopy.
  • the projection data measurement system constituted by gantry 1 accommodates an x-ray source 3 that generates an approximately cone-shaped beam of x-ray flux, and a two-dimensional array type x-ray detector 5 consisting of a plurality of detector elements arranged in two-dimensional fashion, i.e., a plurality of elements arranged in one dimension stacked in a plurality of rows.
  • X-ray source 3 and two-dimensional array type x-ray detector 5 are installed on a rotating ring 2, facing each other across a subject, who is laid on the sliding sheet of a bed 6 .
  • Two-dimensional array type x-ray detector 5 is mounted on rotating ring 2 . Each detector element will correspond with one channel.
  • X-rays from x-ray source 3 are directed on to subject through an x-ray filter 4 .
  • X-rays that have passed through the subject are detected as an electrical signal by two-dimensional array type x-ray detector 5 .
  • X-ray controller 8 supplies a trigger signal to high voltage generator 7 .
  • High voltage generator 7 applies high voltage to x-ray source 3 with the timing with which the trigger signal is received. This causes x-rays to be emitted from x-ray source 3 .
  • Gantry/bed controller 9 synchronously controls the revolution of rotating ring 2 of gantry 1 and the sliding of the sliding sheet of bed 6 .
  • System controller 10 constitutes the control center of the entire system and controls x-ray controller 8 and gantry/bed controller 9 such that, as seen from the subject, x-ray source 3 executes so-called helical scanning, in which it moves along a helical path.
  • rotating ring 2 is continuously rotated with fixed angular speed while the sliding plate is displaced with fixed speed, and x-rays are emitted continuously or intermittently at fixed angular intervals from x-ray source 3 .
  • the source may also be scanned circularly.
  • the output signal of two-dimensional array type x-ray detector 5 is amplified by a data collection unit 11 for each channel and converted to a digital signal, to produce projection data.
  • the projection data output from data collection unit 11 is fed to processing unit 12 .
  • Processing unit 12 performs various processing using the projection data.
  • Unit 12 performs interpolation, backprojection and reconstruction.
  • Unit 12 determines backprojection data reflecting the x-ray absorption in each voxel.
  • the imaging region is of cylindrical shape with a given radius, centered on the axis of rotation.
  • Unit 12 defines a plurality of voxels (three-dimensional pixels) in this imaging region, and finds the backprojection data for each voxel.
  • the three-dimensional image data or tomographic image data compiled by using this backprojection data is sent to display device 14 , where it is displayed visually as a three-dimensional image or tomographic image.
  • projection data is collected over one rotation of the x-ray source (full scan).
  • the number of views collected per rotation (of duration T Rot ) is N VPR , and during each view, data is collected from a set of N d detectors.
  • the views collected over one rotation can be represented as a matrix shown in FIG. 2 . Each cell in the matrix represents a sample of the data collected at any given view (y-axis) and any given channel (x-axis).
  • A more detailed view of collection unit 11 and processing unit 12 is shown in FIG. 1A .
  • the projection data is collected and the data for each of a desired number of views is stored in a register or memory portion 11-1 to 11-n .
  • FIG. 1A will be described in more detail below.
  • For CT fluoroscopy, the same slice position is scanned repeatedly for more than one rotation (N Rot ). The total number of views collected is N Rot × N VPR .
  • a real-time image may be reconstructed at any given time using the most recent N VPR views (the views are counted backwards from that point in time).
  • real-time images are reconstructed at a desired fraction of the rotation, such as every 1/4 or 1/6 rotation.
  • An image may be reconstructed every T Rot /N SPR . For example, with T Rot = 1 sec and N SPR = 4, an image may be produced every 0.25 sec. This provides an effect similar to real-time image production or CT fluoroscopy.
  • the upper limit is determined by the hardware speed and memory needed to reconstruct images. For example, having four partial images per second implies four displayed images per second. In the mathematical extreme, a partial image could be created after every view, that is, 900 partial images per second and 900 displayed images per second in this example. However, for the human eye, anything beyond 25-30 images per second is not significant. Hence, in practice no more than about 20 or 25 partial images per second (of the 900 views) are computed, to provide good-quality partial images. Note that 900 views per second is used for example purposes; this number can take on other values as needed.
  • a partial image can be computed from as small as one view.
  • 900 consecutive (in time) partial images may be added to give one full-scan image.
  • Computationally, using a larger number of views (such as 225 in the example) to create partial images is more practical.
  • partial images may be computed using a partial scan, such as a half-scan image.
  • the images may be averaged before being displayed. This is illustrated in FIG. 5 .
  • OTA denotes Overlapping Time Average.
  • the displayed images OTA are computed in unit 12 by:
  • Second Display Image: OTA(6) = average(FS(4) + FS(5) + FS(6))
  • NTA denotes Non-overlapping Time Average images.
  • N NTA is defined as the number of non-overlapping time average images. For example,
  • NTA(11) = FS(3) + FS(7) + FS(11) (a sketch of forming FS, OTA and NTA images from partial images appears after this list).
  • FIG. 7 illustrates a further approach to producing an improved image.
  • the NTA image is the smooth image;
  • the FS image is the sharp image;
  • smoothed image(11) = FS(11) ++ NTA(11).
  • the symbol “++” is used to denote a blend of the images, and not an addition of corresponding voxels in the 2 images.
  • the FS or NTA image may be defined by the newest collected view block, which is block 11 in the schematic of FIG. 6 . Since this is ‘real-time’, the views in view block 12 are not yet used for computation, although they may be in the process of being collected while the hardware computes FS(11), NTA(11) and FS(11)++NTA(11).
  • a gradient image is used to determine the contribution to each pixel in the display image from the NTA image and from the FS image. Where the gradient image has a high value (indicating an edge), the corresponding pixel in the display image has a significantly larger contribution from the FS image (sharp image); where the gradient image has a low value (indicating a smooth region), the corresponding pixel in the display image has a larger contribution from the NTA image (smooth image).
  • the gradient image may be obtained by one of two schemes, of which scheme 1 is a better approach than scheme 2 (a generic gradient-magnitude sketch appears after this list).
  • FIG. 8 represents one blending curve, defined by an equation with parameters x 0 and w, where x 0 represents the “center” of the curve and w controls the “width” of the curve.
  • the parameters may be chosen by an operator or can be set automatically depending on the scan conditions and the slice position in the object being imaged.
  • values of x 0 and w were selected and plugged into the equation to obtain the curve shown. These values are just an example.
  • x 0 may be automatically selected by computing the average value of voxels in the gradient image, and w is set based on image quality. As shown in FIGS. 8 and 10 , w can take on a range of values, such as between 15 and 30.
  • the gradient curve remains fixed for each pixel.
  • the ‘shape’ of the curve does NOT depend on the ‘x’ value, which is the gradient value at any voxel. Therefore, going from one voxel to another is tantamount to moving along the x-axis, which in turn yields a corresponding value (alpha) on the y-axis.
  • the value of α for each pixel is different; it is determined by the value of the gradient at that pixel.
  • a new value of α is determined based on the gradient value.
  • if the gradient value is high, a higher value of α is used, so that a higher contribution to the displayed image comes from the FS (sharp) image; on the other hand, if the gradient value is low, the pixel belongs to a low-frequency region and a higher contribution to the displayed image comes from the NTA (smooth) image.
  • the blending of the FS and NTA images according to α yields the blended image BI (a blending sketch appears after this list).
  • the gradient curve may be automatically selected. Using the statistics of the gradient image (mean, median and standard deviation), the soft-tissue region and the high-gradient regions are segregated as shown in FIG. 9 , and the blending curve can then be automatically chosen.
  • An example of an image obtained according to the invention is illustrated in FIG. 11 .
  • the top image is the FS image
  • the middle image is the NTA image
  • the bottom image is the blended image obtained by the blending described above.
  • Three regions are indicated in the image.
  • Region 91 shows the sharp tip of a needle.
  • the same needle in the NTA image is blurred.
  • Region 92 shows that the edges in the blended image are much sharper than the NTA image.
  • Region 93 shows how noise is reduced in the blended image compared to the FS image.
  • noise can be reduced while maintaining sharpness.
  • A more detailed view of processing unit 12 is shown in FIG. 1A .
  • the projection data is collected, and the data for each of a desired number of views is processed in processing unit 12 by processor 16 to create the partial images PI(n), which are stored in registers or memory portions 15-1 to 15-n of memory 15 .
  • Processing unit 12 also generates the FS(n) images, OTA images, NTA images, and gradient images and stores them in other registers or portions 15-o, 15-p, . . . of memory 15 .
  • The processing unit performs the blending using blending curves stored in register or memory portion 15-m, and selects the blending curve, as described above, to create the weighted images.
  • the weighted images are also stored in 15-o, 15-p, . . . as needed.
  • the images generated in processing unit 12 are sent to display 14 for display to the user.
  • the images created and stored may also be transferred to other users or systems using a network such as a LAN, wireless LAN or the internet connected to the CT apparatus.
  • the invention may also be embodied in the form of a computer-readable medium containing a stored program to cause a computer to carry out the various operations and functions described above.
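The view-block bookkeeping described above (partial images summed into full-scan FS images, overlapping time averages OTA, and non-overlapping time averages NTA) can be illustrated with a short Python/NumPy sketch. The function names, the choice of N_SPR = 4 view blocks per rotation, the use of three FS images per average, and the use of a mean for the NTA image are illustrative assumptions consistent with the examples above, not text taken from the patent.

```python
import numpy as np

N_SPR = 4  # partial images (view blocks) per rotation -- example value from the description
N_AVG = 3  # number of FS images combined per OTA/NTA image -- assumption for illustration

def fs_image(partials, n):
    """FS(n): full-scan image ending at view block n, i.e. the sum of the
    N_SPR most recent partial images."""
    return np.sum(partials[n - N_SPR + 1:n + 1], axis=0)

def ota_image(partials, n):
    """OTA(n): overlapping time average, e.g. OTA(6) = average(FS(4), FS(5), FS(6))."""
    return np.mean([fs_image(partials, n - k) for k in range(N_AVG)], axis=0)

def nta_image(partials, n):
    """NTA(n): non-overlapping time average built from FS images spaced N_SPR
    view blocks apart, e.g. NTA(11) uses FS(3), FS(7) and FS(11)."""
    return np.mean([fs_image(partials, n - k * N_SPR) for k in range(N_AVG)], axis=0)

# Synthetic stand-ins for partial images reconstructed from view blocks 0..11.
rng = np.random.default_rng(0)
partials = rng.normal(size=(12, 64, 64))

print(fs_image(partials, 11).shape)   # FS(11)
print(ota_image(partials, 6).shape)   # OTA(6)
print(nta_image(partials, 11).shape)  # NTA(11)
```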
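The two schemes for obtaining the gradient image are referred to above but not reproduced in this text, so the following sketch uses a generic central-difference gradient magnitude purely as a stand-in; it is not claimed to be either of the patent's schemes. Per the description, the gradient image may be computed from the FS image, the NTA image, or both.

```python
import numpy as np

def gradient_magnitude(img):
    """Per-pixel gradient magnitude from central differences.
    High values indicate edges; low values indicate smooth regions."""
    gy, gx = np.gradient(img.astype(float))
    return np.hypot(gx, gy)

# Toy example: a flat image with one vertical edge.
test = np.zeros((64, 64))
test[:, 32:] = 100.0
grad = gradient_magnitude(test)
print(grad.max(), grad.mean())  # large values appear only along the edge
```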
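Neither the blending-curve equation behind FIGS. 8 and 10 nor the final equation for the blended image BI is reproduced in this text. The sketch below therefore assumes a logistic curve with center x0 and width w, and a per-pixel blend BI = alpha*FS + (1 - alpha)*NTA, which matches the qualitative description (alpha near 1 at edges favors the sharp FS image; alpha near 0 in smooth regions favors the smooth NTA image). Setting x0 to the mean of the gradient image mirrors the automatic selection mentioned above, and w = 20 is simply a value inside the 15-30 range given as an example.

```python
import numpy as np

def blending_factor(grad, x0, w):
    """Assumed logistic blending curve: alpha rises from ~0 (smooth regions)
    toward ~1 (edges), centered at x0 with width controlled by w."""
    return 1.0 / (1.0 + np.exp(-(grad - x0) / w))

def blended_image(fs, nta, grad, x0=None, w=20.0):
    """Per-pixel blend of the sharp FS image and the smooth NTA image.
    x0 defaults to the mean gradient value, mirroring the automatic
    selection described above."""
    if x0 is None:
        x0 = float(grad.mean())
    alpha = blending_factor(grad, x0, w)
    return alpha * fs + (1.0 - alpha) * nta

# Toy example with synthetic FS/NTA images sharing a common edge.
rng = np.random.default_rng(1)
base = np.zeros((64, 64)); base[:, 32:] = 100.0
fs = base + rng.normal(scale=10.0, size=base.shape)   # sharp but noisy
nta = base + rng.normal(scale=3.0, size=base.shape)   # smoother (time-averaged)
gy, gx = np.gradient(fs); grad = np.hypot(gx, gy)
bi = blended_image(fs, nta, grad)
print(bi.shape)
```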

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Medical Informatics (AREA)
  • Algebra (AREA)
  • Mathematical Analysis (AREA)
  • Mathematical Optimization (AREA)
  • Mathematical Physics (AREA)
  • Pure & Applied Mathematics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Biophysics (AREA)
  • Optics & Photonics (AREA)
  • Pathology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Biomedical Technology (AREA)
  • High Energy & Nuclear Physics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Apparatus For Radiation Diagnosis (AREA)
  • Image Processing (AREA)

Priority Applications (3)

Application Number Priority Date Filing Date Title
US12/539,674 US20110038452A1 (en) 2009-08-12 2009-08-12 Image domain based noise reduction for low dose computed tomography fluoroscopy
JP2010181051A JP5637768B2 (ja) 2009-08-12 2010-08-12 Computed tomography image generation method and computed tomography apparatus
US13/557,467 US8687871B2 (en) 2009-08-12 2012-07-25 Image domain based noise reduction for low dose computed tomography fluoroscopy

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US12/539,674 US20110038452A1 (en) 2009-08-12 2009-08-12 Image domain based noise reduction for low dose computed tomography fluoroscopy

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US13/557,467 Continuation US8687871B2 (en) 2009-08-12 2012-07-25 Image domain based noise reduction for low dose computed tomography fluoroscopy

Publications (1)

Publication Number Publication Date
US20110038452A1 (en) 2011-02-17

Family

ID=43588590

Family Applications (2)

Application Number Title Priority Date Filing Date
US12/539,674 Abandoned US20110038452A1 (en) 2009-08-12 2009-08-12 Image domain based noise reduction for low dose computed tomography fluoroscopy
US13/557,467 Active US8687871B2 (en) 2009-08-12 2012-07-25 Image domain based noise reduction for low dose computed tomography fluoroscopy

Family Applications After (1)

Application Number Title Priority Date Filing Date
US13/557,467 Active US8687871B2 (en) 2009-08-12 2012-07-25 Image domain based noise reduction for low dose computed tomography fluoroscopy

Country Status (2)

Country Link
US (2) US20110038452A1 (ja)
JP (1) JP5637768B2 (ja)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110135184A1 (en) * 2009-12-03 2011-06-09 Canon Kabushiki Kaisha X-ray image combining apparatus and x-ray image combining method
US20130303898A1 (en) * 2012-05-09 2013-11-14 Paul E. Kinahan Respiratory motion correction with internal-external motion correlation, and associated systems and methods
US20170103551A1 (en) * 2015-10-13 2017-04-13 Shenyang Neusoft Medical Systems Co., Ltd. Reconstruction and combination of pet multi-bed image
US10839573B2 (en) * 2016-03-22 2020-11-17 Adobe Inc. Apparatus, systems, and methods for integrating digital media content into other digital media content
CN117243627A (zh) * 2023-11-16 2023-12-19 有方(合肥)医疗科技有限公司 CBCT image processing method and apparatus

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6743396B2 (ja) * 2016-01-25 2020-08-19 TDK Corp Bandpass filter and demultiplexer
US20170215818A1 (en) * 2016-02-03 2017-08-03 General Electric Company High-resolution computed tomography or c-arm imaging

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5594767A (en) * 1995-11-02 1997-01-14 General Electric Company Methods and apparatus for enhancing image sharpness
US6400789B1 (en) * 1997-02-20 2002-06-04 Philips Medical Systems Technologies Ltd. On-line image reconstruction in helical CT scanners
US20090052727A1 (en) * 2007-08-24 2009-02-26 Christian Eusemann Methods for non-linear image blending, adjustment and display

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS61125332A (ja) * 1984-11-20 1986-06-13 Konica Corp X-ray image processing method
JPH01147677A (ja) * 1987-12-02 1989-06-09 Toshiba Corp Image processing apparatus
JP3730282B2 (ja) * 1995-05-02 2005-12-21 Toshiba Corp Computed tomography apparatus
JPH0969157A (ja) * 1995-09-01 1997-03-11 Konica Corp Radiation image processing method
JPH119589A (ja) * 1997-04-30 1999-01-19 Hitachi Medical Corp X-ray CT apparatus and image reconstruction method
US5907593A (en) 1997-11-26 1999-05-25 General Electric Company Image reconstruction in a CT fluoroscopy system
US6801594B1 (en) 1997-11-26 2004-10-05 General Electric Company Computed tomography fluoroscopy system
JP2000271113A (ja) * 1999-01-20 2000-10-03 Toshiba Corp Computed tomography imaging apparatus
JP4233079B2 (ja) * 2001-10-26 2009-03-04 Toshiba Corp CT apparatus
CN1589741B (zh) 2003-08-25 2010-04-21 Toshiba Corp X-ray CT apparatus
JP5260036B2 (ja) * 2007-12-17 2013-08-14 GE Medical Systems Global Technology Co LLC X-ray CT apparatus

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5594767A (en) * 1995-11-02 1997-01-14 General Electric Company Methods and apparatus for enhancing image sharpness
US6400789B1 (en) * 1997-02-20 2002-06-04 Philips Medical Systems Technologies Ltd. On-line image reconstruction in helical CT scanners
US6970585B1 (en) * 1997-02-20 2005-11-29 Koninklijke Philips Electronics N.V. Real-time dynamic image reconstruction
US20090052727A1 (en) * 2007-08-24 2009-02-26 Christian Eusemann Methods for non-linear image blending, adjustment and display

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110135184A1 (en) * 2009-12-03 2011-06-09 Canon Kabushiki Kaisha X-ray image combining apparatus and x-ray image combining method
US20130303898A1 (en) * 2012-05-09 2013-11-14 Paul E. Kinahan Respiratory motion correction with internal-external motion correlation, and associated systems and methods
US9451926B2 (en) * 2012-05-09 2016-09-27 University Of Washington Through Its Center For Commercialization Respiratory motion correction with internal-external motion correlation, and associated systems and methods
US20170103551A1 (en) * 2015-10-13 2017-04-13 Shenyang Neusoft Medical Systems Co., Ltd. Reconstruction and combination of pet multi-bed image
US10043295B2 (en) * 2015-10-13 2018-08-07 Shenyang Neusoft Medical Systems Co., Ltd. Reconstruction and combination of pet multi-bed image
US10839573B2 (en) * 2016-03-22 2020-11-17 Adobe Inc. Apparatus, systems, and methods for integrating digital media content into other digital media content
CN117243627A (zh) * 2023-11-16 2023-12-19 有方(合肥)医疗科技有限公司 CBCT image processing method and apparatus

Also Published As

Publication number Publication date
JP5637768B2 (ja) 2014-12-10
US8687871B2 (en) 2014-04-01
US20120294417A1 (en) 2012-11-22
JP2011036671A (ja) 2011-02-24

Similar Documents

Publication Publication Date Title
US8687871B2 (en) Image domain based noise reduction for low dose computed tomography fluoroscopy
US9662084B2 (en) Method and apparatus for iteratively reconstructing tomographic images from electrocardiographic-gated projection data
US6665370B2 (en) Computed tomography method and apparatus for acquiring images dependent on a time curve of a periodic motion of the subject
JP5248648B2 (ja) コンピュータ断層撮影システムおよび方法
JP5180181B2 (ja) コンピュータ断層撮影データ収集装置及び方法
EP1716809B1 (en) Tomogram reconstruction method and tomograph
Abella et al. Software architecture for multi-bed FDK-based reconstruction in X-ray CT scanners
JP5199081B2 (ja) 心臓ct撮影のバンドアーチファクトの抑制
EP1800264B1 (en) Image reconstruction with voxel dependent interpolation
Kachelrieß et al. Extended parallel backprojection for standard three‐dimensional and phase‐correlated four‐dimensional axial and spiral cone‐beam CT with arbitrary pitch, arbitrary cone‐angle, and 100% dose usage
US20070195925A1 (en) Streak artifact reduction in cardiac cone beam ct reconstruction
CN102947861A (zh) 用于在低剂量计算机断层摄影中降低噪声的方法和系统
CN103608839A (zh) 对比度相关分辨率图像
CN102270350A (zh) 结合四维噪声滤波器的迭代ct图像重建
JP2013085960A (ja) 画像再構成方法及び画像再構成システム
JP2010158512A (ja) X線コンピュータ断層撮影装置、医用画像処理装置、及び医用画像処理プログラム
JP2008519636A (ja) 周期運動物体の検査のためのコンピュータ断層撮影方法
US8494111B2 (en) System and method for image reconstruction for helical cone beam computed tomography with factorized redundancy weighting
EP0969414A2 (en) Computerized tomographic multi-frame image reconstruction method and apparatus for helical scanning
JP2002034970A (ja) マルチ・スライスct走査の螺旋再構成の方法及び装置
WO2007004196A2 (en) Exact fbp type algorithm for arbitrary trajectories
Shechter et al. The frequency split method for helical cone‐beam reconstruction
US8315351B2 (en) System and method for tomographic reconstruction utilizing circular trajectory and scanogram to reduce artifacts
US6999550B2 (en) Method and apparatus for obtaining data for reconstructing images of an object
CN102656609A (zh) 组织密度保留的运动补偿

Legal Events

Date Code Title Description
STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION