WO2005107597A1 - Radiation imaging device for correcting body movement, image processing method, and computer program - Google Patents

Radiation imaging device for correcting body movement, image processing method, and computer program Download PDF

Info

Publication number
WO2005107597A1
WO2005107597A1 (PCT/JP2005/008817, JP2005008817W)
Authority
WO
WIPO (PCT)
Prior art keywords
geometric transformation
projected
body movement
radiation
correction
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2005/008817
Other languages
English (en)
French (fr)
Inventor
Hiroyuki Urushiya
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Canon Inc
Original Assignee
Canon Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Canon Inc filed Critical Canon Inc
Priority to US10/599,028 priority Critical patent/US7787670B2/en
Publication of WO2005107597A1 publication Critical patent/WO2005107597A1/en
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Links

Classifications

    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/00 2D [Two Dimensional] image generation
    • G06T11/003 Reconstruction from projections, e.g. tomography
    • G06T11/005 Specific pre-processing for tomographic reconstruction, e.g. calibration, source positioning, rebinning, scatter correction, retrospective gating
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B6/00 Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B6/02 Arrangements for diagnosis sequentially in different planes; Stereoscopic radiation diagnosis
    • A61B6/03 Computed tomography [CT]
    • A61B6/032 Transmission computed tomography [CT]
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B6/00 Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B6/52 Devices using data or image processing specially adapted for radiation diagnosis
    • A61B6/5258 Devices using data or image processing specially adapted for radiation diagnosis involving detection or reduction of artifacts or noise
    • A61B6/5264 Devices using data or image processing specially adapted for radiation diagnosis involving detection or reduction of artifacts or noise due to motion
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/30 Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T7/33 Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10072 Tomographic images
    • G06T2207/10081 Computed x-ray tomography [CT]
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20021 Dividing image into blocks, subimages or windows
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20112 Image segmentation details
    • G06T2207/20164 Salient point detection; Corner detection
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30004 Biomedical image processing
    • G06T2207/30061 Lung
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2211/00 Image generation
    • G06T2211/40 Computed tomography
    • G06T2211/412 Dynamic

Definitions

  • the present invention relates to a radiation imaging device, an image processing method for the radiation imaging device, and a computer program for causing the radiation imaging device to execute the image processing method.
  • the radiation imaging device and the image processing method according to the present invention can be suitably used in radiography to reduce an artifact which occurs in a tomographic image due to movement of the body to be tested (this movement is hereinafter called body movement).
  • FIG. 9 is a diagram showing a concept of a cone beam X-ray CT (computed tomography) device in which a plane sensor is used.
  • an X-ray which is radiated from an X-ray source 901 is transmitted through a body 903 to be tested, and the transmitted X-ray is detected by a plane sensor 902 provided at the position opposite to the X-ray source 901.
  • the body 903 to be tested is also called a subject 903 hereinafter.
  • the cone beam X-ray CT device executes control so that the X-ray source 901 and the plane sensor 902 together rotate around the subject 903.
  • after the rotation of 360°, each of the X-ray source 901 and the plane sensor 902 stands again at its initial position.
  • similarly, after the rotation from 0° to 360°, the subject 903 faces the initial direction at the initial position.
  • therefore, ideally, the projected image initially acquired by the plane sensor 902 at the angle 0° has to be the same as the projected image acquired after the rotation of 360°.
  • however, if body movement of the subject 903 occurs during the scan, the projected image initially acquired by the plane sensor 902 at the angle 0° ends up being different from the projected image acquired after the rotation of 360°. Therefore, if the tomographic image of the subject 903 is created by using such different projected images, a streak artifact appears on the created tomographic image.
  • a method of correcting deviation or misregistration of the subject occurring due to its body movement by executing interpolation within a range of certain angles is known (for example, see Japanese Patent Application Laid-Open No. H06-114052). Subsequently, this method will be explained with reference to Figs. 10A, 10B and 10C.
  • Figs. 10A to 10C show the method in which the body movement is corrected by using a sinogram, in which the channels of the detector are plotted on the axis of abscissa and the rotation angles are plotted on the axis of ordinate.
  • each of curved lines 1001 to 1003 on this sinogram indicates the trajectory of a certain point within the subject 903. If the subject 903 does not move during the radiography (that is, there is no body movement of the subject 903), as shown in Fig. 10A, a detector channel position A of the point at the time of a start of the scan (0°) conforms to a detector channel position B of the point at the time of an end of the scan (360°). On the other hand, if the body movement of the subject 903 occurs, as shown in Fig. 10B, the detector channel position of the point at the time of the end of the scan (360°), which should essentially be the position B, shifts to a position B'.
  • in this case, as shown in Fig. 10C, the detector channel position A of the point at the time of the start of the scan (0°) is shifted to a position A', and the detector channel position B' of the point at the time of the end of the scan (360°) is likewise shifted to a position B''. Each position is shifted by half of the deviation amount between the position A at the start of the scan (0°) and the position B' at the end of the scan (360°), so that the position A' and the position B'' conform to each other.
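  • The cited prior art is described here only in prose. As a rough illustration of that kind of sinogram-based correction (an assumption for a single, global channel deviation, not the exact method of the cited document), each row of the sinogram can be shifted by a linearly interpolated fraction of the measured deviation, plus half at 0° and minus half at 360°:

```python
import numpy as np

def half_shift_sinogram(sinogram, deviation_channels):
    """Illustrative prior-art-style correction (assumed form): rows are rotation
    angles from 0 deg to 360 deg, columns are detector channels, and
    `deviation_channels` is the channel offset measured between the first and
    last rows (e.g. B' - B in the figure)."""
    n_angles, n_channels = sinogram.shape
    x = np.arange(n_channels)
    corrected = np.empty_like(sinogram, dtype=float)
    for i in range(n_angles):
        frac = i / (n_angles - 1)                      # 0 at 0 deg, 1 at 360 deg
        row_shift = (0.5 - frac) * deviation_channels  # +half at 0 deg, -half at 360 deg
        # resample the row shifted by `row_shift` channels (linear interpolation)
        corrected[i] = np.interp(x, x + row_shift, sinogram[i])
    return corrected
```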
  • a radiation imaging device is characterized by comprising: a geometric transformation parameter solving unit adapted to acquire, from among plural projected images of which projected angles of a radiation are different from others, geometric transformation parameters between at least the two projected images of which the projected angles of the radiation overlap each other; a changing unit adapted to gradually change the geometric transformation parameters acquired by the geometric transformation parameter solving unit, within a predetermined range of the projected angles of the radiation; and a body movement correction unit adapted to execute a correction of a body movement by executing geometric transformation to the plural projected images of which the projected angles of the radiation are different, by using the respective changed geometric transformation parameters.
  • an image processing method is characterized by comprising: a geometric transformation parameter solving step of acquiring, from among plural projected images of which projected angles of a radiation are different from others, geometric transformation parameters between at least the two projected images of which the projected angles of the radiation overlap each other; a changing step of gradually changing the geometric transformation parameters acquired in the geometric transformation parameter solving step, within a predetermined range of the projected angles of the radiation; and a body movement correction step of executing a correction of a body movement by executing geometric transformation to the plural projected images of which the projected angles of the radiation are different, by using the respective changed geometric transformation parameters.
  • FIG. 1 is a block diagram showing one example of the whole constitution of an X-ray CT system according to the embodiment of the present invention
  • Fig. 2 is a flow chart for explaining one example of the operation to be executed by the X-ray CT system in case of radiographing a subject and thus acquiring a tomographic image, according to the embodiment of the present invention
  • Fig. 3 is a flow chart for explaining one example of the operation to be executed in a pre-process and a reconstruction process of the X-ray CT system, according to the embodiment of the present invention
  • Fig. 4 is a conceptual diagram for explaining the corresponding points between two projected images of which the projected angles overlap each other, according to the embodiment of the present invention
  • FIG. 5 is a conceptual diagram for explaining the fixed points on one of the two projected images of which the projected angles overlap each other, according to the embodiment of the present invention
  • Figs. 6A and 6B are conceptual diagrams for explaining how to acquire the corresponding points by means of matching, according to the embodiment of the present invention
  • Fig. 7 is a conceptual diagram for explaining geometric transformation parameters for correcting body movement, according to the embodiment of the present invention
  • Fig. 8 is a conceptual diagram for explaining the coordinates which show the status before the geometric transformation is executed and the coordinates which show the status after the geometric transformation was executed
  • Fig. 9 is a conceptual diagram for explaining a conventional cone beam X-ray CT device
  • Figs. 10A, 10B and 10C are conceptual diagrams for explaining conventional geometric transformation parameters for correcting body movement.
  • Fig. 1 is a block diagram showing one example of the whole constitution of an X-ray CT system according to the embodiment of the present invention.
  • an X-ray is generated from an X-ray source (or X-ray generator) 103 which is controlled by an X-ray generator control unit 104, the generated X-ray is transmitted through a patient 102 who is treated as the subject, and the transmitted X-ray is then detected by an X-ray detector 101.
  • the case where the X-ray is used will be explained by way of example.
  • the radiation to be processed in the present embodiment is not limited to the X-ray; that is, other types of radiation such as an α-ray, a β-ray, a γ-ray and the like may also be used.
  • the X-ray detected by the X-ray detector 101 is then input as a projected image to an image input unit 105.
  • the X-ray source 103 and the X-ray detector 101 such as a plane sensor or the like collect the projected image at each of predetermined rotation angles while rotating together around the patient 102.
  • an image processing unit 107 executes a correction concerning the X-ray detector 101, executes a pre-process including logarithmic transformation on the input projected image of each of the rotation angles, and further executes image processes such as a reconstruction process and the like on the input projected image.
  • a group of tomographic images (also called a tomographic image group) is created through the above processes. Thereafter, the created tomographic image group can be displayed on a diagnostic monitor 109, and also stored in an image storage unit 108.
  • a tester or a diagnostician handles an operation unit 110 to execute various operations such as a display window operation on the diagnostic monitor 109 or the like, a switching display operation of the tomographic image in the body axis of the patient 102, a cross section transformation operation of the tomographic image, a three-dimensional surface display operation, and the like.
  • the above operations are totally managed by an X-ray radiography system control unit 106 which consists of a microcomputer and the like.
  • in a step S201, radiography conditions such as an X-ray tube voltage, an X-ray tube current, an exposure time and the like, patient information such as a patient's name, a patient's age, a patient's gender and the like, and test information such as a test ID (identification) and the like are set through the operation by the tester on the operation unit 110.
  • the X-ray generator control unit 104 controls the X-ray source 103 in accordance with the radiography conditions set in the step S201.
  • the X-ray is generated from the X-ray source 103 to radiograph the patient 102, and the projected image acquired by the radiography is input to the image input unit 105.
  • the image processing unit 107 executes the pre-process to the projected image acquired in the radiography.
  • the image processing unit 107 further executes the reconstruction process to the projected image subjected to the pre-process in the step S203.
  • the tomographic image of the patient 102 is created.
  • the tomographic image created in the step S204 is displayed on the diagnostic monitor 109.
  • the tester confirms the tomographic image displayed on the diagnostic monitor 109, and handles the operation unit 110 in accordance with the confirmed result.
  • the X-ray radiography system control unit 106 judges whether or not it is indicated that the X-ray radiography succeeds, based on the content of the handling by the tester. Then, when the X-ray radiography system control unit 106 judges that it is indicated that the X-ray radiography does not succeed, the flow returns to the step S202 to again radiograph the patient 102. That is, the processes in the steps S202 to S206 are repeated until the X-ray radiography system control unit 106 judges that it is indicated that the X-ray radiography succeeds.
  • in a step S207, the X-ray radiography system control unit 106 transfers the tomographic image created in the step S204 to the image storage unit 108, the image database 114 and the like.
  • in a step S208, the X-ray radiography system control unit 106 judges whether or not next radiography is indicated, on the basis of the handling by the tester on the operation unit 110. Then, when the X-ray radiography system control unit 106 judges that the next radiography is indicated, the flow returns to the step S201 to execute the next radiography.
  • Fig. 3 is a flow chart for explaining one example of the operation to be executed in the pre-process and the reconstruction process of the X-ray CT system, according to the present embodiment.
  • the coordinates of the corresponding points between the two projected images of which the projected angles overlap each other are acquired (step S301) .
  • Fig. 4 is a conceptual diagram for explaining the corresponding points between the two projected images of which the projected angles overlap each other. That is, as shown in Fig. 4, the coordinates of the corresponding points (the small black square points in Fig. 4) between a projected image 401 (for example, an image at a scan angle of 0°) and a projected image 402 (for example, an image at a scan angle of 360°), of which the respective projected angles overlap each other, are acquired.
  • the number of corresponding points has to be equal to or larger than the number of geometric transformation parameters, and the accuracy of the geometric transformation parameters improves as the number of corresponding points increases.
  • however, the necessary calculation time also becomes longer as the number of corresponding points increases. Therefore, in consideration of this trade-off, the number of corresponding points has only to be determined appropriately in accordance with the system itself.
  • in this way, as many sets of the coordinates of the respective corresponding points of the projected images 401 and 402 as the number of corresponding points are acquired.
  • plural fixed points are set on one (e.g., projected image 401) of the two projected images.
  • the most characteristic points are set as the fixed points.
  • the end points of a rib, the branch points of lung blood vessels, and the like are suitable for the most characteristic points. Therefore, the end points of the rib, the branch points of the lung blood vessels or the like are set as the fixed points.
  • even in a case where the characteristic points are not used as the fixed points, the fixed points can be set as shown in Fig. 5, for example.
  • a template image 603 of a predetermined size is cut out on the basis of the fixed point on the projected image 401.
  • a search area 604 of a predetermined size larger than the size of the template image 603 is set on the projected image 402 of which the projected angle overlaps the projected angle of the projected image 401.
  • the coordinates of the central point of the search area 604 are the same as the coordinates of the fixed point on the projected image 401.
  • the position of the central point is shifted sequentially from the upper left to the lower right within the search area 604.
  • a reference image of the same size as the template image 603 is cut out from the projected image 402.
  • an estimated value est concerning the matching between the cut-out reference image and the template image 603 is acquired.
  • the estimated value est of the matching can be acquired by the following equation (1) .
  • the estimated value est of the matching may be acquired simply by the following equation (2) .
  • the position of the reference image at which the estimated value est of the matching becomes minimum gives the coordinates of the corresponding point.
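  • Equations (1) and (2) appear only as images in the original publication and are not reproduced in this text. A common pair of matching measures consistent with the description is a sum of squared differences for equation (1) and a simpler sum of absolute differences for equation (2); the sketch below assumes those forms and scans the search area 604 for the position where est is minimum (the function name and window sizes are illustrative, not from the patent):

```python
import numpy as np

def find_corresponding_point(image0, image360, fixed_xy, tpl_half=16, search_half=32):
    """Find, on image360, the point corresponding to fixed_xy on image0 by
    minimising the matching estimate est over the search area 604."""
    fx, fy = fixed_xy
    tpl = image0[fy - tpl_half:fy + tpl_half + 1,
                 fx - tpl_half:fx + tpl_half + 1].astype(float)
    best_est, best_xy = np.inf, fixed_xy
    # the search area is centred on the same coordinates as the fixed point
    for dy in range(-(search_half - tpl_half), search_half - tpl_half + 1):
        for dx in range(-(search_half - tpl_half), search_half - tpl_half + 1):
            cx, cy = fx + dx, fy + dy
            ref = image360[cy - tpl_half:cy + tpl_half + 1,
                           cx - tpl_half:cx + tpl_half + 1].astype(float)
            est = np.sum((ref - tpl) ** 2)      # assumed eq. (1): sum of squared differences
            # est = np.sum(np.abs(ref - tpl))   # assumed eq. (2): sum of absolute differences
            if est < best_est:
                best_est, best_xy = est, (cx, cy)
    return best_xy
```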
  • the geometric transformation parameter is acquired from the set of the coordinates of the corresponding points (step S302) .
  • affine transformation may be used in such geometric transformation. It should be noted that the affine transformation can be represented by the following equation (3) .
  • the set of the corresponding points is defined as the set of the point of the coordinates (x_n, y_n) and the point of the coordinates (X_n, Y_n) (n = 1, ..., N).
  • the geometric transformation parameters a01, a02, a11, a12, a21, a22 for the affine transformation, by which a least squares error err between the coordinates (x_n', y_n') acquired by transforming the coordinates (x_n, y_n) with use of the above equation (3) and the coordinates (X_n, Y_n) is minimized, are acquired.
  • the least squares error err is represented by the following equation (4).
  • the coordinates x_n' and y_n' are functions which use the geometric transformation parameters a01, a02, a11, a12, a21, a22 for the affine transformation as variables. Therefore, the least squares error err given by the equation (4) is also a function which uses these geometric transformation parameters as variables.
  • when the least squares error err is partially differentiated with respect to each of the geometric transformation parameters a01, a02, a11, a12, a21, a22 for the affine transformation, each derivative is set to 0, and the resulting simultaneous equations are solved, it is possible to acquire the geometric transformation parameters for the affine transformation by which the least squares error err becomes minimum.
  • in this way, the geometric transformation parameters a01, a02, a11, a12, a21, a22 for the affine transformation are acquired.
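  • Equations (3) and (4) are likewise shown only as images. A standard affine form consistent with the six parameters named above is x' = a11·x + a12·y + a01 and y' = a21·x + a22·y + a02 (this exact layout is an assumption); under it, minimising the least squares error err decouples into two linear systems that can be solved directly, for example:

```python
import numpy as np

def fit_affine(src_pts, dst_pts):
    """Least squares fit of affine parameters mapping (x_n, y_n) to (X_n, Y_n).
    src_pts and dst_pts are arrays of shape (N, 2) with N >= 3."""
    src = np.asarray(src_pts, dtype=float)
    dst = np.asarray(dst_pts, dtype=float)
    A = np.column_stack([src, np.ones(len(src))])   # rows (x_n, y_n, 1)
    # err = sum_n (x_n' - X_n)^2 + (y_n' - Y_n)^2 splits into two independent fits
    (a11, a12, a01), *_ = np.linalg.lstsq(A, dst[:, 0], rcond=None)
    (a21, a22, a02), *_ = np.linalg.lstsq(A, dst[:, 1], rcond=None)
    return a11, a12, a01, a21, a22, a02
```

  • Solving the normal equations obtained by setting the partial derivatives of err to zero gives the same result as the lstsq call above.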
  • the affine transformation is a geometric transformation in which enlargement of the image and shearing of the image are also admitted. Consequently, in a case of executing the geometric transformation only for rotation of the image and movement of the image, without permitting the enlargement of the image and the shearing of the image, Helmert transformation is used.
  • Helmert transformation can be represented by the following equation (5).
  • the geometric transformation parameters in the case of the Helmert transformation are three, i.e., θ, a01 and a02. Therefore, these parameters can be acquired in the same manner as above.
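  • Equation (5) is not reproduced here either. If the Helmert transformation is taken, as the three parameters θ, a01 and a02 suggest, to be a rotation by θ followed by a translation (a01, a02), one well-known way to obtain the least squares solution is the SVD-based Procrustes construction sketched below; this is an assumed equivalent, not the derivation used in the patent:

```python
import numpy as np

def fit_rigid(src_pts, dst_pts):
    """Least squares rotation angle theta and translation (a01, a02) mapping
    src_pts onto dst_pts (rotation and translation only, no scaling/shearing)."""
    src = np.asarray(src_pts, dtype=float)
    dst = np.asarray(dst_pts, dtype=float)
    src_mean, dst_mean = src.mean(axis=0), dst.mean(axis=0)
    # cross-covariance of the centred point sets
    H = (src - src_mean).T @ (dst - dst_mean)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:        # guard against picking a reflection
        Vt[-1] *= -1
        R = Vt.T @ U.T
    theta = np.arctan2(R[1, 0], R[0, 0])
    a01, a02 = dst_mean - R @ src_mean
    return theta, a01, a02
```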
  • besides the affine transformation and the Helmert transformation, even if secondary projective transformation, high-order polynomial transformation or the like is used, it is possible to acquire the geometric transformation parameters in the same manner as in the case where the affine transformation, the Helmert transformation or the like is used.
  • after the geometric transformation parameter is acquired in the step S302, it is then judged whether or not an estimated amount of the acquired geometric transformation parameter is larger than a predetermined amount (step S303).
  • the estimated amount of the geometric transformation parameter is the amount for estimating the magnitude of the body movement while the scan is being executed. More specifically, the estimated amount of the geometric transformation parameter is used to estimate how far the acquired parameter deviates from the geometric transformation parameter in the case of no body movement.
  • the geometric transformation parameters are set as θ1, θ2, ..., θn, ..., θN
  • the geometric transformation parameters in case of no body movement are set as θ1', θ2', ..., θn', ..., θN'.
  • an estimated amount est2 of the geometric transformation parameter can be acquired by the following equation (6) .
  • the right side of the above equation (6) may not necessarily be the square root. That is, instead of the above equation (6), it is possible to more simply acquire the estimated amount est2 of the geometric transformation parameter by the following equation (7).
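  • Equations (6) and (7) are also not reproduced in this text. The description above (a square-root form and a simpler variant without the square root) is consistent with, for example, a root-sum-of-squares and a sum of absolute deviations of the parameters; the sketch below assumes those forms:

```python
import numpy as np

def estimate_body_movement(theta, theta_no_movement, use_sqrt=True):
    """est2: distance of the fitted parameters theta_1..theta_N from the values
    theta_1'..theta_N' expected when there is no body movement."""
    d = np.asarray(theta, dtype=float) - np.asarray(theta_no_movement, dtype=float)
    if use_sqrt:
        return float(np.sqrt(np.sum(d ** 2)))   # assumed form of equation (6)
    return float(np.sum(np.abs(d)))             # assumed simpler form of equation (7)
```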
  • when it is judged in the step S303 that the estimated amount of the geometric transformation parameter acquired as above is equal to or smaller than a predetermined amount, it is possible to judge that there is no body movement of the level for which the correction is necessary. Therefore, in this case, the flow advances to a step S306 without executing the correction of the body movement so as to reconstruct the projected image and thus create the tomographic image. On the other hand, when it is judged in the step S303 that the estimated amount of the geometric transformation parameter is larger than the predetermined amount, the correction of the body movement is executed.
  • in the correction of the body movement, the geometric transformation parameter for the geometric correction is first determined so that the projected images of which the projected angles overlap each other conform to each other, and so that the parameter gradually changes within a predetermined angle range (step S304).
  • in the correction of the body movement, only half of the correction amount is gradually applied according to the angles within the range, e.g., from 0° to 180°.
  • then, within the range, e.g., from 180° to 360°, the correction is gradually executed according to the angles in the direction opposite to the direction of the correction within the range from 0° to 180°, whereby it is possible to achieve a smooth correction of the body movement.
  • the geometric transformation parameters for executing the above correction of the body movement may be determined so as to gradually change within the range, e.g., from 0° to 180°.
  • only minus half of the geometric transformation parameters acquired in the step S302 may be determined so as to gradually change within the range, e.g., from 180° to 360°.
  • the body movement is gradually corrected according to the angles within the range from 0° to 90°, by half of the correction amount of the body movement. Within the range from 90° to 270°, the body movement is not corrected.
  • then, within the range from 270° to 360°, the body movement is gradually corrected according to the angles, by half of the correction amount of the body movement, in the direction opposite to the direction of the correction within the range from 0° to 90°.
  • the meaning that the projected image gradually changes within the predetermined angle and the body movement is gradually corrected according to the angles is as follows. That is, the geometric transformation parameter is divided by the predetermined angle to acquire a change amount of the geometric transformation parameter with respect to each unit angle, and the change amounts for the respective unit angles are accumulated gradually.
  • subsequently, one example of the method of determining the geometric transformation parameters for executing the correction of the body movement (geometric correction) will be explained with reference to Fig. 7. That is, Fig. 7 shows the projected angles in the direction along the axis of abscissas and the geometric transformation parameters in the direction along the axis of ordinates, with respect to each of the respective geometric transformation parameters.
  • the angles from 0° to 360° are given as the projected angles.
  • the projected angles may also be set, for example, from -a° to (360 + a)° (a > 0).
  • here, the case of the n-th parameter is taken as an example.
  • the geometric transformation is executed by half of the body movement with respect to the projected image of 0°, whereby its parameter θn(0) is given by the following equation (10).
  • θn(0) = θn' + (θn - θn')/2 ... (10)
  • here, it is assumed that the above predetermined angle is β°.
  • also, the parameter for the projected angles within the range from β° to (360 - β)° is assumed as θn' for the sake of convenience.
  • within the range from 0° to β°, it is controlled to gradually change the parameter from θn(0) to θn'. That is, the parameter θn(φ) in the case of a projected angle of φ° (0 ≤ φ ≤ β) is given by the following equation (11).
  • θn(φ) = θn' + {(θn - θn')/2} × (β - φ)/β ... (11)
  • similarly, the parameter θn(360) for the projected image of 360° is given by the following equation (12). θn(360) = θn - (θn - θn')/2 ... (12) Then, within the range from (360 - β)° to 360°, it is controlled to gradually change the parameter from θn' to θn(360).
  • the parameter θn(φ) within the range from (360 - β)° to 360° is given by the following equation (13).
  • θn(φ) = θn' + {(θn - θn')/2} × {φ - (360 - β)}/β ... (13)
  • alternatively, the parameter may be determined so that the whole transformation is executed within the predetermined angle range on either the side of 0° or the side of 360° only.
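  • Putting the reconstructed equations (10) to (13) together, the per-parameter schedule over the projected angles can be written as a small helper; φ is the projected angle in degrees and β the predetermined blending angle (the symbol names follow the reconstruction above, and the default value of β is illustrative, not from the patent):

```python
def scheduled_parameter(theta_n, theta_n_prime, phi, beta=30.0):
    """Geometric transformation parameter theta_n(phi) for a projected angle phi
    in [0, 360] deg: half of the body-movement correction is blended in over
    [0, beta] and [360 - beta, 360], and no correction is applied in between."""
    half = (theta_n - theta_n_prime) / 2.0
    if phi <= beta:                            # equation (11); phi = 0 gives (10)
        return theta_n_prime + half * (beta - phi) / beta
    if phi >= 360.0 - beta:                    # equation (13); phi = 360 gives (12)
        return theta_n_prime + half * (phi - (360.0 - beta)) / beta
    return theta_n_prime                       # between beta and 360 - beta
```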
  • the geometric transformation is executed to the projected image of each of the projected angles by using the acquired geometric transformation parameter (step S305) .
  • the geometric transformation is not executed within the range from β° to (360 - β)°.
  • the point (x', y') indicates the coordinates of the integer value corresponding to each pixel on the after-geometric-transformation image 802. Then, by using the inverse transformation of the geometric transformation with respect to these coordinates, the point (x, y) corresponding to the point (x', y') can be acquired. However, since the point (x, y) does not generally have integer coordinates, interpolation is executed by using the pixel values of the four points closest to the point (x, y), and the interpolated value is set as the pixel value of the point (x', y').
  • This process is executed to all the pixels, whereby the image can be created through the geometric transformation.
  • the geometric transformation parameter may be set in advance as the parameter for the inverse transformation of the geometric transformation in the step S304.
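  • A minimal sketch of this backward-mapping step, assuming the affine form used earlier: for each integer pixel (x', y') of the after-transformation image 802, the inverse transformation gives a generally non-integer point (x, y), whose value is obtained by bilinear interpolation of the four closest pixels (as noted above, the inverse parameters could equally be prepared in advance in the step S304):

```python
import numpy as np

def warp_affine_backward(image, a11, a12, a01, a21, a22, a02):
    """Apply the geometric transformation by backward mapping with bilinear
    interpolation; `image` is a 2-D array indexed as image[y, x]."""
    h, w = image.shape
    out = np.zeros((h, w), dtype=float)
    M_inv = np.linalg.inv(np.array([[a11, a12], [a21, a22]], dtype=float))
    t = np.array([a01, a02], dtype=float)
    for yp in range(h):
        for xp in range(w):
            # inverse transformation: (x, y) = M^-1 ((x', y') - t)
            x, y = M_inv @ (np.array([xp, yp], dtype=float) - t)
            x0, y0 = int(np.floor(x)), int(np.floor(y))
            if 0 <= x0 < w - 1 and 0 <= y0 < h - 1:
                fx, fy = x - x0, y - y0
                out[yp, xp] = ((1 - fx) * (1 - fy) * image[y0, x0]
                               + fx * (1 - fy) * image[y0, x0 + 1]
                               + (1 - fx) * fy * image[y0 + 1, x0]
                               + fx * fy * image[y0 + 1, x0 + 1])
    return out
```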
  • the reconstruction is executed by using the corrected projected images, and the tomographic image is thus created (step S306) .
  • the coordinates of the respective corresponding points between the projected images 401 and 402 of which the respective projected angles (for example, 0° and 360°) overlap each other are first acquired.
  • the geometric transformation parameters θn(0) and θn(360) are acquired through the affine transformation or the like by using the sets of the coordinates of the corresponding points. Subsequently, when the estimated amount of the geometric transformation parameter is larger than the predetermined amount, the correction of the body movement is executed, and the tomographic image is then created by using the projected image 802 which has been subjected to the correction of the body movement. More specifically, when the correction of the body movement is executed, as described above, the geometric transformation parameter for executing the geometric correction with respect to the projected angles from β° to (360 - β)° is assumed as θn' for the sake of convenience.
  • the geometric transformation parameter θn(φ) for executing the geometric correction is determined so that the parameter gradually changes from θn(0) to θn' (as indicated by the equation (11)) within the range of the projected angles from 0° to β°.
  • the geometric transformation parameter θn(φ) for executing the geometric correction is determined so that the parameter gradually changes from θn' to θn(360) (as indicated by the equation (13)) within the range of the projected angles from (360 - β)° to 360°.
  • in the geometric transformation (i.e., the body movement correction) according to the present embodiment, the body movement is not grasped with respect to each horizontal (landscape) line, but is grasped as a whole image.
  • the present invention includes in its category a case where the program codes of software for realizing the functions of the above embodiment are supplied to a computer provided in an apparatus or a system connected to various devices so as to actually operate these devices for realizing the functions of the above embodiment, and thus the computer (CPU or MPU) in the system or the apparatus reads and executes the supplied and stored program codes and operates these devices.
  • the program codes themselves of the software realize the functions of the above embodiment. Therefore, the program codes themselves and a means, such as a recording medium storing these program codes, for supplying the program codes to the computer constitute the present invention.
  • as a recording medium for storing these program codes, for example, a flexible disk, a hard disk, an optical disk, a magneto-optical disk, a CD-ROM, a magnetic tape, a nonvolatile memory card, a ROM or the like can be used.
  • the present invention includes not only a case where the functions of the above embodiment are realized by executing the supplied program codes with the computer, but also a case where the program codes cooperate with an OS (operating system) running on the computer or another application software to realize the functions of the above embodiment. Furthermore, it is needless to say that the present invention also includes a case where, after the supplied program codes are written into a function expansion board inserted in the computer or a memory of a function expansion unit connected to the computer, the CPU or the like provided in the function expansion board or the function expansion unit executes a part or all of the actual processes on the basis of the instructions of the program codes, and thus the functions of the above embodiment are realized by such processes.
  • the body movement can be grasped as the image.
  • accordingly, the body movement can be accurately grasped.
  • the geometric transformation parameters are smoothly changed between the projected angles, whereby it is possible to achieve correction of the body movement with as high precision as possible. In consequence, it is possible to create a tomographic image in which artifacts are reduced as much as possible.
  • the magnitude of the body movement is estimated by using the acquired geometric transformation parameters, and the correction of the body movement is executed only in the case where the magnitude of the body movement is equal to or larger than the predetermined magnitude. Thus, useless correction of the body movement can be avoided, whereby it is possible to shorten the time necessary for executing the correction of the body movement .

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Medical Informatics (AREA)
  • Theoretical Computer Science (AREA)
  • Pathology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • High Energy & Nuclear Physics (AREA)
  • General Physics & Mathematics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Optics & Photonics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Radiology & Medical Imaging (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Pulmonology (AREA)
  • Apparatus For Radiation Diagnosis (AREA)
PCT/JP2005/008817 2004-05-11 2005-05-09 Radiation imaging device for correcting body movement, image processing method, and computer program Ceased WO2005107597A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US10/599,028 US7787670B2 (en) 2004-05-11 2005-05-09 Radiation imaging device for correcting body movement, image processing method, and computer program

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2004-141490 2004-05-11
JP2004141490A JP4438053B2 (ja) 2004-05-11 2004-05-11 放射線撮像装置、画像処理方法及びコンピュータプログラム

Publications (1)

Publication Number Publication Date
WO2005107597A1 true WO2005107597A1 (en) 2005-11-17

Family

ID=35320002

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2005/008817 Ceased WO2005107597A1 (en) 2004-05-11 2005-05-09 Radiation imaging device for correcting body movement, image processing method, and computer program

Country Status (3)

Country Link
US (1) US7787670B2 (en)
JP (1) JP4438053B2 (ja)
WO (1) WO2005107597A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2008145161A1 (en) * 2007-05-31 2008-12-04 Elekta Ab (Publ) Motion artefact reduction in ct scanning
US8391580B2 (en) 2007-08-01 2013-03-05 Depuy Orthopaedie Gmbh Image processing

Families Citing this family (38)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102006031580A1 (de) 2006-07-03 2008-01-17 Faro Technologies, Inc., Lake Mary Verfahren und Vorrichtung zum dreidimensionalen Erfassen eines Raumbereichs
WO2008141293A2 (en) * 2007-05-11 2008-11-20 The Board Of Regents Of The University Of Oklahoma One Partner's Place Image segmentation system and method
JP5387018B2 (ja) * 2008-06-02 2014-01-15 株式会社島津製作所 X線断層像撮影装置
US9551575B2 (en) 2009-03-25 2017-01-24 Faro Technologies, Inc. Laser scanner having a multi-color light source and real-time color receiver
DE102009015920B4 (de) 2009-03-25 2014-11-20 Faro Technologies, Inc. Vorrichtung zum optischen Abtasten und Vermessen einer Umgebung
DE102009038964A1 (de) * 2009-08-20 2011-02-24 Faro Technologies, Inc., Lake Mary Verfahren zum optischen Abtasten und Vermessen einer Umgebung
US9210288B2 (en) 2009-11-20 2015-12-08 Faro Technologies, Inc. Three-dimensional scanner with dichroic beam splitters to capture a variety of signals
US9113023B2 (en) 2009-11-20 2015-08-18 Faro Technologies, Inc. Three-dimensional scanner with spectroscopic energy detector
US9529083B2 (en) 2009-11-20 2016-12-27 Faro Technologies, Inc. Three-dimensional scanner with enhanced spectroscopic energy detector
DE102009057101A1 (de) 2009-11-20 2011-05-26 Faro Technologies, Inc., Lake Mary Vorrichtung zum optischen Abtasten und Vermessen einer Umgebung
DE102009055989B4 (de) 2009-11-20 2017-02-16 Faro Technologies, Inc. Vorrichtung zum optischen Abtasten und Vermessen einer Umgebung
US9826942B2 (en) 2009-11-25 2017-11-28 Dental Imaging Technologies Corporation Correcting and reconstructing x-ray images using patient motion vectors extracted from marker positions in x-ray images
US9082036B2 (en) 2009-11-25 2015-07-14 Dental Imaging Technologies Corporation Method for accurate sub-pixel localization of markers on X-ray images
US9082177B2 (en) 2009-11-25 2015-07-14 Dental Imaging Technologies Corporation Method for tracking X-ray markers in serial CT projection images
US9082182B2 (en) 2009-11-25 2015-07-14 Dental Imaging Technologies Corporation Extracting patient motion vectors from marker positions in x-ray images
US9628775B2 (en) 2010-01-20 2017-04-18 Faro Technologies, Inc. Articulated arm coordinate measurement machine having a 2D camera and method of obtaining 3D representations
US9879976B2 (en) 2010-01-20 2018-01-30 Faro Technologies, Inc. Articulated arm coordinate measurement machine that uses a 2D camera to determine 3D coordinates of smoothly continuous edge features
US9163922B2 (en) 2010-01-20 2015-10-20 Faro Technologies, Inc. Coordinate measurement machine with distance meter and camera to determine dimensions within camera images
WO2011090891A1 (en) 2010-01-20 2011-07-28 Faro Technologies, Inc. Display for coordinate measuring machine
US9607239B2 (en) 2010-01-20 2017-03-28 Faro Technologies, Inc. Articulated arm coordinate measurement machine having a 2D camera and method of obtaining 3D representations
DE102010020925B4 (de) 2010-05-10 2014-02-27 Faro Technologies, Inc. Verfahren zum optischen Abtasten und Vermessen einer Umgebung
US9168654B2 (en) 2010-11-16 2015-10-27 Faro Technologies, Inc. Coordinate measuring machines with dual layer arm
DE102012100609A1 (de) 2012-01-25 2013-07-25 Faro Technologies, Inc. Vorrichtung zum optischen Abtasten und Vermessen einer Umgebung
US8997362B2 (en) 2012-07-17 2015-04-07 Faro Technologies, Inc. Portable articulated arm coordinate measuring machine with optical communications bus
DE102012107544B3 (de) 2012-08-17 2013-05-23 Faro Technologies, Inc. Vorrichtung zum optischen Abtasten und Vermessen einer Umgebung
US10067231B2 (en) 2012-10-05 2018-09-04 Faro Technologies, Inc. Registration calculation of three-dimensional scanner data performed between scans based on measurements by two-dimensional scanner
US9513107B2 (en) 2012-10-05 2016-12-06 Faro Technologies, Inc. Registration calculation between three-dimensional (3D) scans based on two-dimensional (2D) scan data from a 3D scanner
DE102012109481A1 (de) 2012-10-05 2014-04-10 Faro Technologies, Inc. Vorrichtung zum optischen Abtasten und Vermessen einer Umgebung
JP6292826B2 (ja) * 2013-11-06 2018-03-14 キヤノン株式会社 画像処理装置、画像処理方法及びプログラム
US9526468B2 (en) * 2014-09-09 2016-12-27 General Electric Company Multiple frame acquisition for exposure control in X-ray medical imagers
EP3331449B1 (en) * 2015-10-28 2018-12-19 Koninklijke Philips N.V. Computed tomography image generation apparatus
DE102015122844A1 (de) 2015-12-27 2017-06-29 Faro Technologies, Inc. 3D-Messvorrichtung mit Batteriepack
JP6682373B2 (ja) * 2016-06-14 2020-04-15 キヤノンメディカルシステムズ株式会社 Spect装置
FI20175244A7 (fi) * 2017-03-17 2018-09-18 Planmeca Oy Itsekalibroiva lääketieteellinen kuvannuslaite
JP7167564B2 (ja) * 2018-09-05 2022-11-09 株式会社島津製作所 X線撮影装置およびx線撮影装置の作動方法
CN113892960B (zh) * 2021-10-09 2024-05-28 清华大学 X射线自成像几何标定方法及装置
JP2024005570A (ja) * 2022-06-30 2024-01-17 株式会社リガク 補正装置、システム、方法およびプログラム
JP2024013652A (ja) * 2022-07-20 2024-02-01 キヤノン株式会社 画像処理方法、画像処理装置、プログラム

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005137472A (ja) * 2003-11-05 2005-06-02 Canon Inc 放射線画像処理装置、放射線画像処理方法、プログラム及びコンピュータ可読媒体

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4858128A (en) * 1986-08-11 1989-08-15 General Electric Company View-to-view image correction for object motion
JPH06114052A (ja) 1992-10-07 1994-04-26 Hitachi Medical Corp X線ct装置
JP2006000225A (ja) 2004-06-15 2006-01-05 Canon Inc X線ct装置
JP4498023B2 (ja) 2004-06-15 2010-07-07 キヤノン株式会社 X線ct装置
JP4585815B2 (ja) 2004-09-03 2010-11-24 キヤノン株式会社 情報処理装置、撮影システム、吸収係数補正方法、及びコンピュータプログラム
US7782998B2 (en) * 2004-12-21 2010-08-24 General Electric Company Method and apparatus for correcting motion in image reconstruction

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005137472A (ja) * 2003-11-05 2005-06-02 Canon Inc 放射線画像処理装置、放射線画像処理方法、プログラム及びコンピュータ可読媒体

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2008145161A1 (en) * 2007-05-31 2008-12-04 Elekta Ab (Publ) Motion artefact reduction in ct scanning
US8442294B2 (en) 2007-05-31 2013-05-14 Elekta Ab (Publ) Motion artefact reduction in CT scanning
US8391580B2 (en) 2007-08-01 2013-03-05 Depuy Orthopaedie Gmbh Image processing

Also Published As

Publication number Publication date
JP4438053B2 (ja) 2010-03-24
JP2005319207A (ja) 2005-11-17
US20070195091A1 (en) 2007-08-23
US7787670B2 (en) 2010-08-31

Similar Documents

Publication Publication Date Title
US7787670B2 (en) Radiation imaging device for correcting body movement, image processing method, and computer program
US6445761B1 (en) X-ray computerized tomograph including collimator that restricts irradiation range of X-ray fan beam
US6795522B2 (en) Backprojection method and X-ray CT apparatus
EP1530162B1 (en) Radiation image processing apparatus, radiation image processing method, program, and computer-readable medium
JP2006025868A (ja) 画像処理装置及び画像処理方法並びにx線ctシステム
CN102667856B (zh) 成像数据的多部分对准
JP2008307184A (ja) 画像処理装置および画像処理プログラム
JP6074450B2 (ja) Ctシステム
JP5576631B2 (ja) 放射線撮影装置、放射線撮影方法、及びプログラム
KR20060061249A (ko) 선량 평가 방법 및 x선 ct 장치
CN100490747C (zh) 三维图像处理装置
US6418183B1 (en) Methods and apparatus for two-pass CT imaging
KR100923094B1 (ko) 트렁케이션 아티팩트를 보정하는 방법
JP6167841B2 (ja) 医用画像処理装置及びプログラム
JP4752468B2 (ja) 断面像再構成装置およびそれを用いたx線撮影装置
EP3272288B1 (en) Apparatus and method for ct data reconstruction based on motion compensation
JP2004065706A (ja) 投影データ補正方法、画像生成方法およびx線ct装置
CN118154767A (zh) 一种锥束ct的图像重建方法、系统、装置和介质
JP5574002B2 (ja) X線検査装置
JP2008012027A (ja) 断層撮影装置および断層撮影方法
JP2009050413A (ja) 記憶媒体及びx線ctシステム
JP2006187511A (ja) X線撮像装置
JP2003102718A (ja) X線ctシステム、操作コンソール、及びその制御方法、ファントム

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS KE KG KM KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NG NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SM SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
WWE Wipo information: entry into national phase

Ref document number: 10599028

Country of ref document: US

Ref document number: 2007195091

Country of ref document: US

NENP Non-entry into the national phase

Ref country code: DE

WWW Wipo information: withdrawn in national office

Ref document number: DE

122 Ep: pct application non-entry in european phase
WWP Wipo information: published in national office

Ref document number: 10599028

Country of ref document: US