US20160125627A1 - Image processing apparatus, image processing method, and image processing system - Google Patents
- Publication number
- US20160125627A1
- Authority
- US
- United States
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
- G06T11/003—Reconstruction from projections, e.g. tomography
- G06T11/006—Inverse problem, transformation from projection-space into object-space, e.g. transform methods, back-projection, algebraic methods
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- A61B6/02—Arrangements for diagnosis sequentially in different planes; Stereoscopic radiation diagnosis
- A61B6/03—Computed tomography [CT]
- A61B6/032—Transmission computed tomography [CT]
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N23/00—Investigating or analysing materials by the use of wave or particle radiation, e.g. X-rays or neutrons, not covered by groups G01N3/00 – G01N17/00, G01N21/00 or G01N22/00
- G01N23/02—Investigating or analysing materials by the use of wave or particle radiation, e.g. X-rays or neutrons, not covered by groups G01N3/00 – G01N17/00, G01N21/00 or G01N22/00 by transmitting the radiation through the material
- G01N23/04—Investigating or analysing materials by the use of wave or particle radiation, e.g. X-rays or neutrons, not covered by groups G01N3/00 – G01N17/00, G01N21/00 or G01N22/00 by transmitting the radiation through the material and forming images of the material
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N23/00—Investigating or analysing materials by the use of wave or particle radiation, e.g. X-rays or neutrons, not covered by groups G01N3/00 – G01N17/00, G01N21/00 or G01N22/00
- G01N23/02—Investigating or analysing materials by the use of wave or particle radiation, e.g. X-rays or neutrons, not covered by groups G01N3/00 – G01N17/00, G01N21/00 or G01N22/00 by transmitting the radiation through the material
- G01N23/04—Investigating or analysing materials by the use of wave or particle radiation, e.g. X-rays or neutrons, not covered by groups G01N3/00 – G01N17/00, G01N21/00 or G01N22/00 by transmitting the radiation through the material and forming images of the material
- G01N23/044—Investigating or analysing materials by the use of wave or particle radiation, e.g. X-rays or neutrons, not covered by groups G01N3/00 – G01N17/00, G01N21/00 or G01N22/00 by transmitting the radiation through the material and forming images of the material using laminography or tomosynthesis
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N23/00—Investigating or analysing materials by the use of wave or particle radiation, e.g. X-rays or neutrons, not covered by groups G01N3/00 – G01N17/00, G01N21/00 or G01N22/00
- G01N23/02—Investigating or analysing materials by the use of wave or particle radiation, e.g. X-rays or neutrons, not covered by groups G01N3/00 – G01N17/00, G01N21/00 or G01N22/00 by transmitting the radiation through the material
- G01N23/04—Investigating or analysing materials by the use of wave or particle radiation, e.g. X-rays or neutrons, not covered by groups G01N3/00 – G01N17/00, G01N21/00 or G01N22/00 by transmitting the radiation through the material and forming images of the material
- G01N23/046—Investigating or analysing materials by the use of wave or particle radiation, e.g. X-rays or neutrons, not covered by groups G01N3/00 – G01N17/00, G01N21/00 or G01N22/00 by transmitting the radiation through the material and forming images of the material using tomography, e.g. computed tomography [CT]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0012—Biomedical image inspection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10116—X-ray image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2211/00—Image generation
- G06T2211/40—Computed tomography
- G06T2211/424—Iterative
Definitions
- the present disclosure relates to an image processing apparatus, an image processing method, and an image processing system.
- a CT (computed tomography) apparatus (or a CT system; hereinafter, the same) that utilizes X-rays output from an X-ray source or an apparatus (or a system; hereinafter, the same) having a tomosynthesis function that utilizes X-rays is widely used in the medical field, for example.
- JP-A-2004-329784 describes a technology relating to a CT apparatus that utilizes X-rays.
- a CT apparatus and the like that utilizes X-rays forms an X-ray image by processing X-ray detection data representing a detection result of X-rays. More specifically, in a CT apparatus and the like that utilizes X-rays, an X-ray image is formed by, for example, converting X-ray detection data into projection data and reconstituting three-dimensional data from the projection data.
- an X-ray source that outputs cone beam X-rays is used, or an X-ray source that outputs fan beam X-rays like the technology described in JP-A-2004-329784 is used, for example.
- Examples of methods for performing the calculations for forming an X-ray image more rapidly include ignoring the effects caused by a cone beam or a fan beam, or converting a cone beam or a fan beam into a parallel beam in the projection data and dividing the processing for forming the X-ray image.
- it is desirable to provide a novel and improved image processing apparatus, image processing method, and image processing system that can achieve a higher quality X-ray image while reducing the calculation costs for reconstituting an X-ray image even further.
- an image processing apparatus including a processing unit configured to process projection data in which X-ray detection data representing a detection result of parallel beam X-rays output from an X-ray source has been converted by projection, and form an X-ray image based on the X-ray detection data.
- an image processing method including processing projection data in which X-ray detection data representing a detection result of parallel beam X-rays output from an X-ray source has been converted by projection, and forming an X-ray image based on the X-ray detection data.
- an image processing system including an X-ray output apparatus that includes an X-ray source for outputting parallel beam X-rays, a detection apparatus configured to detect the parallel beam X-rays, generate X-ray detection data representing a detection result of the parallel beam X-rays, and convert the generated X-ray detection data into projection data by projection, and an image processing apparatus that includes a processing unit configured to process projection data in which the X-ray detection data has been converted, and form an X-ray image based on the X-ray detection data.
- a higher quality X-ray image can be achieved while reducing the calculation costs for reconstituting an X-ray image even further.
- FIG. 1 is a flow diagram illustrating a first example of the processing performed in an image processing method according to an embodiment of the present disclosure performed by an image processing apparatus according to an embodiment of the present disclosure;
- FIG. 2 is a flow diagram illustrating an example of reconstitution processing performed by the image processing apparatus according to an embodiment of the present disclosure
- FIG. 3 is an explanatory diagram illustrating the processing performed in the image processing method according to an embodiment of the present disclosure
- FIG. 4 is an explanatory diagram illustrating the processing performed in the image processing method according to an embodiment of the present disclosure
- FIG. 5 is an explanatory diagram illustrating the processing performed in the image processing method according to an embodiment of the present disclosure
- FIG. 6 is a flow diagram illustrating a second example of the processing performed in the image processing method according to an embodiment of the present disclosure performed by the image processing apparatus according to an embodiment of the present disclosure;
- FIG. 7 is an explanatory diagram illustrating processing performed in the image processing method according to an embodiment of the present disclosure.
- FIG. 8 is a flow diagram illustrating a third example of the processing performed in the image processing method according to an embodiment of the present disclosure performed by the image processing apparatus according to an embodiment of the present disclosure;
- FIG. 9 is an explanatory diagram illustrating an example of an image processing system according to an embodiment of the present disclosure.
- FIG. 10 is a block diagram illustrating an example of a configuration of the image processing apparatus according to an embodiment of the present disclosure.
- FIG. 11 is a diagram illustrating an example of a hardware configuration of the image processing apparatus according to an embodiment of the present disclosure.
- the image processing method according to the present embodiment will be described.
- the image processing method according to the present embodiment will be described based on an example in which the image processing apparatus according to the present embodiment performs the processing performed in the image processing method according to the present embodiment.
- the image processing apparatus forms an X-ray image based on X-ray detection data by processing projection data in which X-ray detection data representing a detection result of parallel beam X-rays output from an X-ray source has been converted.
- the X-ray detection data according to the present embodiment is, for example, data representing a detection intensity of parallel beam X-rays that have passed through a target and have been detected by a detector, such as the detector that is included in the below-described detection apparatus according to the present embodiment.
- the X-ray detection data according to the present embodiment is sometimes referred to as “parallel X-ray detection data”.
- the parallel X-ray detection data representing the detection intensity of the parallel beam X-rays detected by the above-described detector is converted into projection data (two-dimensional projection data) by being projected in two dimensions as an X-ray projection image. More specifically, the parallel X-ray detection data is converted into projection data by a Radon transform, for example.
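The excerpt does not spell out the conversion step, so the following Python sketch is one common interpretation, not the patent's implementation: it converts detected parallel-beam intensities into projection (line-integral) data via the Beer-Lambert relation; the function name and the `incident` parameter are illustrative.

```python
import numpy as np

def intensities_to_projection(detected, incident=1.0):
    """Convert detected parallel-beam X-ray intensities I into projection
    (line-integral) data p via the Beer-Lambert relation p = -ln(I / I0).
    Stacking p over all view angles yields the two-dimensional projection
    data (sinogram) that the reconstitution processing operates on."""
    detected = np.asarray(detected, dtype=float)
    return -np.log(detected / incident)
```

With no attenuation (detected intensity equal to the incident intensity) the projection value is zero, and the value grows with the line integral of the attenuation along each ray.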
- although the conversion processing for converting the parallel X-ray detection data into projection data is performed by an external device, such as the detection apparatus (described below) according to the present embodiment that generates the X-ray detection data according to the present embodiment, for example, the conversion processing according to the present embodiment is not limited to being performed by an external device.
- the image processing apparatus according to the present embodiment may perform the conversion processing according to the present embodiment.
- the processing performed in the image processing method according to the present embodiment will be described mainly based on an example in which the conversion processing according to the present embodiment is performed by an external device, such as the detection apparatus according to the present embodiment, that generates parallel X-ray detection data, namely, a case in which the image processing apparatus according to the present embodiment processes projection data in which the parallel X-ray detection data has already been converted by the external device.
- the image processing apparatus processes projection data in which parallel X-ray detection data has been converted, or parallel X-ray detection data, acquired from an external device such as the below-described detection apparatus according to the present embodiment
- the projection data in which parallel X-ray detection data has been converted or the parallel X-ray detection data processed by the image processing apparatus according to the present embodiment is not limited to that described above.
- the image processing apparatus according to the present embodiment can also perform the processing by reading projection data in which parallel X-ray detection data has been converted, or parallel X-ray detection data, from a storage unit (described below) included in the apparatus (the image processing apparatus according to the present embodiment) or from an external storage medium, for example.
- the image processing apparatus forms an X-ray image by reconstituting three-dimensional data from the projection data.
- examples of the processing used by the image processing apparatus according to the present embodiment to reconstitute three-dimensional data from the projection data include successive approximation methods, such as ML-EM (maximum likelihood-expectation maximization), OS-EM (ordered subsets-expectation maximization), and MAP-EM (maximum a posteriori-expectation maximization). It is noted that the processing performed to reconstitute three-dimensional data from the projection data according to the present embodiment is obviously not limited to processing that uses the successive approximation methods mentioned above.
- the image processing apparatus can form a highly accurate X-ray image by performing processing that uses a successive approximation method like those mentioned above as the processing to reconstitute three-dimensional data from projection data.
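As an illustration of the successive approximation methods named above, a minimal ML-EM sketch in Python (NumPy) is shown below. The matrix `C` stands in for the detection rates C ij, `p` for the projection data, and the iteration count is an arbitrary choice; this is a sketch of the general technique, not the patent's implementation.

```python
import numpy as np

def ml_em(C, p, n_iter=50, eps=1e-12):
    """Minimal ML-EM (maximum likelihood-expectation maximization) sketch.
    C : (num_detectors, num_voxels) matrix of detection rates C_ij
    p : measured projection data, one value per detector
    Starts from an initial reconstituted image whose pixel values are all
    positive (here, all ones) and applies multiplicative updates."""
    x = np.ones(C.shape[1])           # initial reconstituted image I0
    sens = C.sum(axis=0) + eps        # per-voxel sensitivity, sum_j C_ij
    for _ in range(n_iter):
        reproj = C @ x + eps          # reprojection of the current estimate
        x *= (C.T @ (p / reproj)) / sens
    return x
```

OS-EM applies the same multiplicative update over ordered subsets of the projection data, and MAP-EM adds a prior term; both reuse this basic structure.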
- the image processing apparatus can achieve a higher quality X-ray image.
- the image processing apparatus forms an X-ray image based on parallel X-ray detection data by processing projection data in which parallel X-ray detection data has been converted.
- in the projection data in which parallel X-ray detection data has been converted, there is no mixing of the data of the plurality of layers of the target, as is the case in the above-described projection data in which X-ray detection data representing a detection result of cone beam or fan beam X-rays has been converted. This is because projection data in which parallel X-ray detection data has been converted is not affected by the above-described widening and unevenness of the detection intensity resulting from a cone beam or a fan beam.
- the image processing apparatus does not have to perform processing to reduce the effects of a cone beam or a fan beam, such as geometric correction, distortion correction, noise removal and the like.
- the image processing apparatus can reduce the calculation costs for forming an X-ray image more than when processing the above-described X-ray detection data that represents a detection result of cone beam or fan beam X-rays.
- since the image processing apparatus does not have to perform processing to reduce the effects of a cone beam or a fan beam, such as geometric correction, distortion correction, noise removal and the like, approximation error or image deterioration resulting from such processing is prevented. Therefore, the image processing apparatus according to the present embodiment can achieve a higher quality X-ray image.
- the image processing apparatus is capable of performing processing (e.g., processing for dividing the projection data and forming an X-ray image for each piece of divided projection data (described below)) in parallel. Further, since the image processing apparatus according to the present embodiment can form successive X-ray images each time projection data is acquired (described below), for example, the amount of memory that is used in one calculation is substantially reduced.
- the processing performed in the image processing method according to the present embodiment will be described based on an example in which the image processing apparatus according to the present embodiment processes projection data in which parallel X-ray detection data has been converted by an external device. It is noted that when the image processing apparatus according to the present embodiment processes parallel X-ray detection data, the image processing apparatus according to the present embodiment converts the parallel X-ray detection data into projection data, for example, and processes the converted projection data.
- FIG. 1 is a flow diagram illustrating a first example of the processing performed in an image processing method according to the present embodiment performed by an image processing apparatus according to the present embodiment.
- the image processing apparatus determines whether projection data has been acquired (S 100 ).
- the image processing apparatus determines that projection data has been acquired if all of the projection data corresponding to the target has been read into a RAM (random-access memory), for example.
- If it is determined in step S 100 that projection data has not been acquired, the image processing apparatus according to the present embodiment does not proceed to the next processing step until projection data is acquired.
- if it is determined in step S 100 that projection data has been acquired, the image processing apparatus according to the present embodiment sets the image represented by the projection data as projection image P, and sets an initial reconstituted image I 0 (S 102 ).
- although the image processing apparatus sets the initial reconstituted image I 0 by generating, as the initial reconstituted image I 0 , an image in which the pixel values of all of the pixels are positive, such as an image in which the pixel values of all of the pixels are indicated as “1”, for example, the processing for setting the initial reconstituted image I 0 that is performed in the present embodiment is not limited to this.
- the image processing apparatus may set an image reconstituted by a FBP (filtered back-projection) method as the initial reconstituted image I 0 .
- the image processing apparatus according to the present embodiment can also, for example, set an arbitrary image in which the pixel values are positive as the initial reconstituted image I 0 .
- the image processing apparatus then performs reconstitution processing for forming an X-ray image (S 104 ).
- FIG. 2 is a flow diagram illustrating an example of reconstitution processing performed by the image processing apparatus according to the present embodiment.
- the image processing apparatus projects a reconstituted image I n (wherein n denotes an integer of 0 or more), and generates a reprojection image P′ (S 200 ).
- the reconstituted image I n that is initially projected by the image processing apparatus according to the present embodiment is the initial reconstituted image I 0 set in step S 102 of FIG. 1 .
- the image processing apparatus compares the reprojection image P′ and the projection image P, for example, and calculates a ratio between the reprojection image P′ and the projection image P (S 202 ).
- the image processing apparatus determines whether to finish the reconstitution processing (S 204 ).
- the image processing apparatus according to the present embodiment determines that the reconstitution processing is to be finished when, for example, the ratio between the reprojection image P′ and the projection image P calculated in step S 202 is equal to or less than a predetermined set value (or less than a predetermined value).
- if it is determined in step S 204 to finish the reconstitution processing, the image processing apparatus according to the present embodiment sets the reconstituted image I n as the X-ray image (a so-called reconstituted layer image) (S 208 ). Then, the image processing apparatus according to the present embodiment finishes the reconstitution processing.
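The finish test of steps S 202 and S 204 can be sketched as follows. The excerpt does not define how the ratio between the reprojection image P′ and the projection image P is aggregated, so the mean relative deviation and the default set value used here are assumptions.

```python
import numpy as np

def should_finish(reproj, p, set_value=1e-3, eps=1e-12):
    """Sketch of steps S 202/S 204: compare the reprojection image P' with
    the projection image P and decide whether to finish the reconstitution
    processing. The per-pixel relative deviation is averaged and compared
    with a predetermined set value (both choices are assumed here)."""
    reproj, p = np.asarray(reproj, dtype=float), np.asarray(p, dtype=float)
    ratio = np.mean(np.abs(reproj - p) / (p + eps))
    return ratio <= set_value
```

When the reprojection agrees with the measured projection to within the set value, the current reconstituted image I n is taken as the X-ray image (S 208 ).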
- the image processing apparatus performs the processing illustrated in FIG. 2 , for example, as the reconstitution processing according to the present embodiment.
- the reconstitution processing according to the present embodiment is represented by the following formula 1, for example:

$$I_i^{(k+1)} = \frac{I_i^{(k)}}{\sum_j C_{ij}} \sum_j C_{ij}\,\frac{P_j}{\sum_{i'} C_{i'j}\, I_{i'}^{(k)}} \qquad \text{(Formula 1)}$$
- the “i” in formula 1 represents the coordinates of the reconstituted image I n (the coordinates corresponding to the position of the target), the “j” in formula 1 represents the coordinates of the projection image P, and the “k” in formula 1 represents the number of repetitions. Further, the “C ij ” in formula 1 represents the “detection rate”, which is the probability of the voxel of coordinate i in the target being detected by a detector corresponding to coordinate j of the projection image P.
- in the case of forming an X-ray image of a single layer, one calculation cycle can be carried out by plugging in all of the voxels of that layer and the values of the detection positions corresponding to those voxels. Accordingly, when reconstituting an X-ray image corresponding to a three-dimensional object (target), the reconstituted image I n and the projection image P based on the projection data are three-dimensional, so that the calculation represented in formula 1 can be carried out on all of the voxels in three dimensions.
- FIG. 3 is an explanatory diagram illustrating the processing performed in the image processing method according to the present embodiment.
- FIG. 3 illustrates an example in a typical CT of the correspondence among the position of the X-ray source, the position of the target through which the X-rays pass, and the position of the detector.
- Symbol A in FIG. 3 illustrates an example in a single-slice type CT apparatus of the correspondence among the position of the X-ray source, the position of the target through which the X-rays pass, and the position of the detector.
- symbol B in FIG. 3 illustrates an example in a multi-slice type CT apparatus of the correspondence among the position of the X-ray source, the position of the target through which the X-rays pass, and the position of the detector.
- when reconstituting an X-ray image using the single-slice type CT apparatus illustrated by A in FIG. 3 , the calculation costs to form the X-ray image are smaller than when reconstituting an X-ray image using the multi-slice type CT apparatus employing an X-ray source that outputs cone beam X-rays like that illustrated by B in FIG. 3 .
- in the single-slice type CT apparatus illustrated by A in FIG. 3 , however, the larger the detection area of the target, the longer it takes to generate the X-ray image.
- a multi-slice type like that illustrated by B in FIG. 3 is mainstream.
- an X-ray source that outputs cone beam X-rays is used, as illustrated by B of FIG. 3 .
- FIG. 4 , which is an explanatory diagram illustrating the processing performed in the image processing method according to the present embodiment, illustrates the outline of the processing that is performed when reconstituting an X-ray image using a cone beam X-ray source.
- FIG. 5 , which is an explanatory diagram illustrating processing performed in the image processing method according to the present embodiment, illustrates an example of a multi-slice type CT apparatus in which an X-ray source that outputs parallel beam X-rays is used.
- the image processing apparatus forms an X-ray image based on X-ray detection data by processing projection data in which X-ray detection data representing a detection result of parallel beam X-rays output from an X-ray source has been converted.
- the image processing apparatus can perform the successive calculations for reconstitution on a per layer basis in a closed state. Further, since the reconstitution calculations are also two-dimensional-to-two-dimensional, the calculation amount that is performed in one go can be substantially reduced. In addition, the calculation amount of the detection probability C ij in formula 1 can be reduced far more than when reconstituting an X-ray image using an X-ray source that outputs cone beam X-rays as illustrated by B of FIG. 3 , for example.
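To illustrate why the detection rates C ij are cheap to obtain for a parallel beam, the sketch below builds C for one layer at a single 0-degree view under a deliberately simplified, hypothetical geometry: each ray is a straight column sum, and because the geometry does not change from layer to layer, the same matrix can be computed once and reused for every layer.

```python
import numpy as np

def parallel_beam_matrix_0deg(n):
    """Detection-rate matrix C_ij for one n x n layer at a single 0-degree
    parallel-beam view. Detector j sees the straight-line sum of column j,
    so C_ij is 1 where pixel i lies on ray j and 0 elsewhere. For a parallel
    beam this matrix is identical for every layer, so it is computed once."""
    C = np.zeros((n, n * n))
    for j in range(n):                # detector (ray) index
        for row in range(n):          # pixels along the ray
            C[j, row * n + j] = 1.0
    return C
```

By contrast, a cone-beam geometry requires ray-tracing a different set of oblique intersections for every layer and every detector row, which is where the extra C ij calculation cost arises.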
- the image processing apparatus performs the processing illustrated in FIG. 1 , for example, as the processing performed in the image processing method according to the present embodiment.
- in the projection data in which parallel X-ray detection data has been converted that is processed by the image processing apparatus according to the present embodiment in the processing illustrated in FIG. 1 , there is no mixing of data of the plurality of layers of the target, as is the case with the above-described projection data in which X-ray detection data representing a detection result of cone beam or fan beam X-rays has been converted.
- the image processing apparatus according to the present embodiment does not have to perform processing to reduce the effects of a cone beam or a fan beam, such as geometric correction, distortion correction, noise removal and the like.
- the calculation amount relating to the reconstitution processing performed by the image processing apparatus according to the present embodiment is reduced far more than when processing the above-described projection data in which X-ray detection data representing a detection result of cone beam or fan beam X-rays has been converted. Therefore, when performing the processing illustrated in FIG. 1 , for example, deterioration resulting from the processing to reduce the effects of a cone beam or a fan beam is prevented. Further, the calculation costs for forming an X-ray image can be reduced more than when processing the above-described X-ray detection data that represents a detection result of cone beam or fan beam X-rays. Therefore, by performing the processing illustrated in FIG. 1 , for example, the image processing apparatus according to the present embodiment can achieve a higher quality X-ray image while reducing the calculation costs for forming an X-ray image even further.
- the image processing apparatus performs processing that uses a successive approximation method, such as processing using the ML-EM method represented in formula 1, for example, as the reconstitution processing for forming an X-ray image. Therefore, since a highly accurate X-ray image can be formed by performing the processing illustrated in FIG. 1 , for example, a higher quality X-ray image can be achieved.
- processing performed in the image processing method according to the present embodiment that is performed by the image processing apparatus according to the present embodiment is not limited to the processing according to the first example illustrated in FIG. 1 .
- the image processing apparatus can, for example, divide the projection data and form an X-ray image for each piece of divided projection data (the parallel processing according to the present embodiment). Further, the image processing apparatus according to the present embodiment can also form successive X-ray images each time projection data is acquired (the successive processing according to the present embodiment).
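The parallel processing described above (divide the projection data on a per-layer basis and form an X-ray image for each piece) can be sketched as below. The per-layer ML-EM style update and the thread-pool executor are illustrative choices for the sketch, not the patent's implementation.

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def reconstitute_layer(C, p, n_iter=50, eps=1e-12):
    """Reconstitute one layer from its piece of divided projection data
    using an ML-EM style multiplicative update (illustrative)."""
    x = np.ones(C.shape[1])           # initial reconstituted image I0
    sens = C.sum(axis=0) + eps
    for _ in range(n_iter):
        x *= (C.T @ (p / (C @ x + eps))) / sens
    return x

def reconstitute_all_layers(layers, n_iter=50):
    """Sketch of steps S 302-S 306: each (C, p) pair is one layer's divided
    projection data; the layers are independent, so their reconstitution
    processing is submitted to a pool and runs in parallel."""
    with ThreadPoolExecutor() as pool:
        futures = [pool.submit(reconstitute_layer, C, p, n_iter)
                   for C, p in layers]
        return [f.result() for f in futures]
```

Because each layer's reconstitution is closed with respect to the other layers, the same structure also supports the successive processing: a layer can be submitted as soon as its projection data is acquired, keeping the memory used in any one calculation small.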
- FIG. 6 is a flow diagram illustrating a second example of the processing performed in the image processing method according to the present embodiment by the image processing apparatus according to the present embodiment.
- FIG. 6 illustrates an example of the parallel processing according to the present embodiment.
- the image processing apparatus determines whether projection data has been acquired (S 300 ). If it is determined in step S 300 that projection data has not been acquired, the image processing apparatus according to the present embodiment does not proceed to the next processing step until projection data is acquired.
- the image processing apparatus divides the projection data (S 302 ).
- the image processing apparatus divides the projection data on a per layer basis
- the units that the image processing apparatus according to the present embodiment divides the projection data into are not especially limited.
- the units into which the image processing apparatus according to the present embodiment divides the projection data may be, for example, a single unit, or a mixture of a plurality of units.
- the image processing apparatus sets an image represented by each piece of divided projection data as the projection image P.
- the image processing apparatus sets an initial reconstituted image I 0 corresponding to each piece of divided projection data (S 304 ).
- the image processing apparatus according to the present embodiment sets the initial reconstituted image I 0 in the same manner as in step S 102 of FIG. 1 .
- although step S 304 can be carried out after the processing of step S 302 , the processing of step S 302 and the processing of step S 304 can be performed independently. Therefore, the image processing apparatus according to the present embodiment can perform the processing of step S 302 and the processing of step S 304 in synchronization, for example.
- when the processing of steps S 302 and S 304 has been performed, the image processing apparatus according to the present embodiment performs in parallel the reconstitution processing for forming an X-ray image for each piece of divided projection data (S 306 ).
- the image processing apparatus according to the present embodiment performs the reconstitution processing for forming X-ray images in the same manner as in step S 104 of FIG. 1 , for example.
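As a rough illustration of steps S 302 and S 306 , dividing projection data on a per-layer basis and reconstituting each piece in parallel could be sketched as below. The array layout, the names, and the thread-based executor are assumptions for illustration, and the per-layer reconstruction is a simple averaging stand-in rather than the patent's reconstitution formula:

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def reconstruct_layer(layer_projection):
    """Stand-in for the per-layer reconstitution of step S306: here it
    simply averages the projections over angles (illustrative only)."""
    return layer_projection.mean(axis=0)

def parallel_reconstruct(projection, n_layers):
    """Divide projection data on a per-layer basis (step S302) and run the
    reconstitution for each divided piece in parallel (step S306).

    projection: assumed (n_angles, n_layers, n_detectors) array
    returns   : (n_layers, n_detectors) stack of per-layer images
    """
    layers = [projection[:, z, :] for z in range(n_layers)]  # per-layer split
    with ThreadPoolExecutor() as pool:
        images = list(pool.map(reconstruct_layer, layers))
    return np.stack(images)
```

Because parallel beam X-rays keep the layers independent, each divided piece can be reconstituted without reference to the others, which is what makes this kind of executor-based parallelism applicable.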
- the image processing apparatus performs the processing illustrated in FIG. 6 , for example, as the processing performed in the image processing method according to the present embodiment.
- the image processing apparatus according to the present embodiment does not have to perform processing to reduce the effects of a cone beam or a fan beam, such as geometric correction, distortion correction, noise removal and the like. Further, similar to the processing according to the first example illustrated in FIG. 1 , the calculation amount relating to the reconstitution processing performed by the image processing apparatus according to the present embodiment is reduced substantially compared with the case of processing the above-described projection data in which X-ray detection data representing a detection result of cone beam or fan beam X-rays has been converted.
- the image processing apparatus can achieve a higher quality X-ray image while reducing the calculation costs for forming an X-ray image even further.
- the dividing of the projection data into respective layers and the reconstitution processing corresponding to each piece of projection data are performed in parallel.
- the image processing apparatus can shorten the processing time (calculation time) taken for the reconstitution processing more than when an X-ray image is reconstituted using an X-ray source that outputs cone beam or fan beam X-rays, for example.
- since there is no mixing of the data of the plurality of layers of the target in the projection data processed by the image processing apparatus according to the present embodiment, deterioration in the accuracy of the X-ray image is prevented even if the image processing apparatus according to the present embodiment processes only the parallel X-ray detection data corresponding to a specific layer of the target.
- FIG. 7 is an explanatory diagram illustrating the processing performed in the image processing method according to the present embodiment.
- FIG. 7 illustrates an example in a helical scanning type CT apparatus, or in a non-helical scanning type CT apparatus, for example, of the correspondence among the position of the X-ray source, the position of the target through which the X-rays pass, and the position of the detector. Further, a person is shown in FIG. 7 as the target.
- FIG. 8 is a flow diagram illustrating a third example of the processing performed in the image processing method according to the present embodiment by the image processing apparatus according to the present embodiment.
- FIG. 8 illustrates an example of the successive processing according to the present embodiment.
- the image processing apparatus determines whether projection data has been acquired (S 400 ).
- the image processing apparatus determines that projection data has been acquired if projection data transmitted from an external device has been received and the received projection data is read into the RAM or the like, or if projection data stored in a storage unit (described below) has been read from the storage unit (described below) and the read projection data is read into the RAM or the like, for example.
- if it is determined in step S 400 that projection data has not been acquired, the image processing apparatus according to the present embodiment does not proceed to the next processing step until projection data is acquired.
- if it is determined in step S 400 that projection data has been acquired, the image processing apparatus according to the present embodiment sets an initial reconstituted image I 0 in the same manner as in step S 102 of FIG. 1 (S 402 ).
- when the processing of step S 402 has been performed, the image processing apparatus according to the present embodiment performs the reconstitution processing for forming an X-ray image in the same manner as in step S 104 of FIG. 1 (S 404 ).
- the image processing apparatus determines whether to finish the processing performed in the image processing method according to the present embodiment (S 406 ).
- the image processing apparatus according to the present embodiment determines, for example, to finish the processing performed in the image processing method according to the present embodiment when a signal indicating that scanning has finished is received from an external device, such as the CT apparatus.
- the image processing apparatus determines to finish the processing performed in the image processing method according to the present embodiment when, for example, all of the projection data (e.g., projection data formed into groups based on metadata and the like) corresponding to a given target has been read from the storage unit (described below).
- if it is determined in step S 406 not to finish the processing performed in the image processing method according to the present embodiment, the image processing apparatus according to the present embodiment repeats the processing from step S 400 . Further, if it is determined in step S 406 to finish the processing performed in the image processing method according to the present embodiment, the image processing apparatus according to the present embodiment finishes the processing performed in the image processing method according to the present embodiment.
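The acquire-and-reconstitute loop of FIG. 8 (steps S 400 through S 406 ) can be sketched as follows; the callables `acquire`, `reconstruct`, and `finished` are hypothetical stand-ins for the patent's steps, not names taken from the document:

```python
def successive_reconstruct(acquire, reconstruct, finished):
    """Sketch of the successive processing of FIG. 8: each time projection
    data is acquired (S400), an X-ray image is reconstituted from it
    (S402-S404), repeating until the finish condition holds (S406)."""
    images = []
    while not finished():
        data = acquire()           # waits for / fetches projection data
        if data is None:           # nothing acquired yet; keep waiting
            continue
        images.append(reconstruct(data))  # one X-ray image per acquisition
    return images
```

Because each acquisition yields an independently reconstitutable piece of projection data, only the current piece needs to reside in memory during a calculation, which is the memory-reduction point made below.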
- the image processing apparatus performs the processing illustrated in FIG. 8 , for example, as the processing performed in the image processing method according to the present embodiment.
- the image processing apparatus according to the present embodiment does not have to perform processing to reduce the effects of a cone beam or a fan beam, such as geometric correction, distortion correction, noise removal and the like. Further, similar to the processing according to the first example illustrated in FIG. 1 , the calculation amount relating to the reconstitution processing performed by the image processing apparatus according to the present embodiment is reduced substantially compared with the case of processing the above-described projection data in which X-ray detection data representing a detection result of cone beam or fan beam X-rays has been converted.
- the image processing apparatus can achieve a higher quality X-ray image while reducing the calculation costs for forming an X-ray image even further.
- since there is no mixing of the data of the plurality of layers of the target in the projection data processed by the image processing apparatus according to the present embodiment, deterioration in the accuracy of the X-ray image is prevented even if only the parallel X-ray detection data corresponding to a specific layer of the target is processed. Namely, since the image processing apparatus according to the present embodiment can independently perform processing on each layer surface, in the processing illustrated in FIG. 8 the image processing apparatus according to the present embodiment forms successive X-ray images each time projection data is acquired.
- the image processing apparatus according to the present embodiment performs the calculations relating to the formation of the X-ray image, such as a reconstitution calculation, in order from the portions for which scanning has finished.
- the calculations relating to the formation of the X-ray image can thus be completed as the CT scanning finishes.
- further, since the image processing apparatus forms successive X-ray images each time projection data is acquired, the amount of memory that is used in one calculation is substantially reduced.
- FIG. 9 is an explanatory diagram illustrating an example of an image processing system 1000 according to the present embodiment.
- the image processing system 1000 has, for example, an image processing apparatus 100 , an X-ray output apparatus 200 , and a detection apparatus 300 .
- the X-ray output apparatus 200 includes, for example, an X-ray source (not illustrated), for outputting parallel beam X-rays.
- examples of the X-ray source included in the X-ray output apparatus 200 include an X-ray tube, which is an electron tube for generating X-rays, a collimator that forms parallel beam X-rays from X-rays generated by an X-ray tube, and a planar source in which a plurality of X-ray tubes are arranged on a flat face.
- the configuration of the X-ray output apparatus 200 is not limited to that described above.
- the X-ray output apparatus 200 is configured from an MPU (micro-processing unit), various processing circuits and the like.
- the X-ray output apparatus 200 may also include a control unit (not illustrated) for controlling the generation of X-rays by the X-ray source, a ROM (read-only memory, not illustrated), a RAM (not illustrated) and the like.
- the ROM (not illustrated) included in the X-ray output apparatus 200 stores control data, such as programs and calculation parameters used by the control unit (not illustrated) included in the X-ray output apparatus 200 .
- the RAM included in the X-ray output apparatus 200 temporarily stores programs, for example, that are executed by the control unit (not illustrated) included in the X-ray output apparatus 200 .
- the detection apparatus 300 , which includes a detection unit (not illustrated) having a detector for detecting X-rays, for example, detects parallel beam X-rays and generates parallel X-ray detection data.
- the configuration of the detection apparatus 300 is not limited to that described above.
- the detection apparatus 300 is configured from an MPU, various processing circuits and the like.
- the detection apparatus 300 may also include a processing unit (not illustrated) for converting parallel X-ray detection data into projection data, a ROM (read-only memory, not illustrated), a RAM (not illustrated), a communication unit and the like.
- the ROM (not illustrated) included in the detection apparatus 300 stores control data, such as programs and calculation parameters used by the control unit (not illustrated) included in the detection apparatus 300 .
- the RAM included in the detection apparatus 300 temporarily stores programs, for example, that are executed by the control unit (not illustrated) included in the detection apparatus 300 .
- the communication unit (not illustrated) included in the detection apparatus 300 is a communication device included in the detection apparatus 300 , which has the role of performing wireless/wired communication with an external device, such as the image processing apparatus 100 , via a network (or directly).
- examples of the communication unit (not illustrated) included in the detection apparatus 300 include a communication antenna and an RF (radio frequency) circuit (wireless communication), an IEEE 802.15.1 port and a transmitting/receiving circuit (wireless communication), an IEEE 802.11b port and a transmitting/receiving circuit (wireless communication), or a LAN (local area network) terminal and a transmitting/receiving circuit (wired communication) and the like.
- the communication unit (not illustrated) included in the detection apparatus 300 may have a configuration that supports an arbitrary standard capable of performing communication, such as a USB (universal serial bus) terminal and a transmitting/receiving circuit, or an arbitrary configuration capable of communicating with an external device via a network.
- Examples of the network according to the embodiment of the present disclosure include a wired network such as a LAN or a WAN (wide area network), a wireless network such as a wireless LAN (wireless local area network), and wireless WAN (wireless wide area network) via a base station, or the Internet using a communication protocol such as TCP/IP (transmission control protocol/internet protocol) and the like.
- the detection apparatus 300 transmits to the image processing apparatus 100 , for example, the generated parallel X-ray detection data or projection data in which parallel X-ray detection data has been converted.
- the image processing apparatus 100 forms an X-ray image based on parallel X-ray detection data by performing the above-described processing performed in the image processing method according to the present embodiment, and processing parallel X-ray detection data or projection data in which parallel X-ray detection data has been converted.
- the image processing apparatus 100 processes, for example, parallel X-ray detection data transmitted from the detection apparatus 300 , or projection data transmitted from the detection apparatus 300 in which parallel X-ray detection data has been converted. It is noted that the image processing apparatus 100 can process, for example, parallel X-ray detection data stored in a storage unit (described below) or the like, or projection data stored in the storage unit (described below) or the like in which parallel X-ray detection data has been converted. Examples of parallel X-ray detection data stored in the storage unit (described below) or the like include parallel X-ray detection data generated by the detection apparatus 300 and parallel X-ray detection data generated by an external device other than the detection apparatus 300 . Further, examples of projection data stored in the storage unit (described below) or the like in which parallel X-ray detection data has been converted include projection data converted by the detection apparatus 300 and projection data generated by an external device other than the detection apparatus 300 .
- the image processing system 1000 has, for example, the configuration illustrated in FIG. 9 .
- the image processing apparatus 100 forms an X-ray image based on parallel X-ray detection data by performing the above-described processing performed in the image processing method according to the present embodiment. Therefore, based on the configuration illustrated in FIG. 9 , for example, an image processing system is realized that can achieve a higher quality X-ray image while reducing the calculation costs for forming an X-ray image even further.
- the image processing system according to the present embodiment is not limited to the configuration illustrated in FIG. 9 .
- the X-ray output apparatus 200 and the detection apparatus 300 may be an integrated apparatus, like a CT apparatus that utilizes X-rays or an apparatus having a tomosynthesis function in which X-rays are utilized.
- when the X-ray output apparatus 200 and the detection apparatus 300 are an integrated apparatus, such an apparatus may include a gantry that has a rotary motor, for example.
- FIG. 10 is a block diagram illustrating an example of a configuration of an image processing apparatus 100 according to an embodiment of the present disclosure.
- the image processing apparatus 100 includes, for example, a communication unit 102 and a control unit 104 .
- the image processing apparatus 100 may also include, for example, a ROM (not illustrated), a RAM (not illustrated), a storage unit (not illustrated), a user-operable operation unit (not illustrated), a display unit (not illustrated) that displays various screens on a display screen and the like.
- the image processing apparatus 100 connects these constituent elements to each other with a bus that serves as a data transmission path.
- the ROM (not illustrated) stores control data, such as programs and calculation parameters used by the control unit 104 .
- the RAM (not illustrated) temporarily stores programs and the like that are executed by the control unit 104 .
- the storage unit (not illustrated) is a storage device included in the image processing apparatus 100 , which stores, for example, various data such as X-ray detection data, projection data in which X-ray detection data has been converted, and applications.
- examples of the storage unit (not illustrated) include magnetic recording media such as a hard disk, non-volatile memory such as flash memory and the like. Further, the storage unit (not illustrated) may be detachable from the image processing apparatus 100 .
- examples of the operation unit include the below-described operation input device.
- examples of the display unit may include the below-described display device.
- FIG. 11 is an explanatory diagram illustrating an example of a hardware configuration of the image processing apparatus 100 according to an embodiment of the present disclosure.
- the image processing apparatus 100 includes, for example, a MPU 150 , a ROM 152 , a RAM 154 , a recording medium 156 , an input/output interface 158 , an operation input device 160 , a display device 162 , and a communication interface 164 . Further, the image processing apparatus 100 connects these constituent elements to each other with a bus 166 that serves as a data transmission path.
- the MPU 150 is configured from, for example, an MPU, various processing circuits and the like.
- the MPU 150 functions as the control unit 104 for controlling the whole image processing apparatus 100 . Further, in the image processing apparatus 100 , the MPU 150 plays the role of, for example, the below-described processing unit 110 .
- the ROM 152 stores control data, such as programs and calculation parameters used by the MPU 150 .
- the RAM 154 temporarily stores programs and the like, for example, that are executed by the MPU 150 .
- the recording medium 156 functions as a storage unit, which stores, for example, various data such as X-ray detection data, projection data in which X-ray detection data has been converted, and applications.
- examples of the recording medium 156 include magnetic recording media such as a hard disk, non-volatile memory such as flash memory and the like. Further, the recording medium 156 may be detachable from the image processing apparatus 100 .
- the input/output interface 158 connects the operation input device 160 and the display device 162 .
- the operation input device 160 functions as an operation unit (not illustrated), and the display device 162 functions as a display unit (not illustrated).
- examples of the input/output interface 158 include a USB terminal, a DVI (digital visual interface) terminal, a HDMI (high-definition multimedia interface) terminal, various processing circuits and the like.
- the operation input device 160 is, for example, included on the image processing apparatus 100 , and is connected with the input/output interface 158 in the image processing apparatus 100 .
- Examples of the operation input device 160 include a button, a direction key, a rotating-type selector such as a jog dial, or a combination of these.
- the display device 162 is, for example, included on the image processing apparatus 100 , and is connected with the input/output interface 158 in the image processing apparatus 100 .
- examples of the display device 162 include a liquid crystal display (LCD), an organic EL display (organic electroluminescence display, also called an OLED (organic light emitting diode display)) and the like.
- the input/output interface 158 can obviously also be connected to an external device, such as an operation input device (e.g., a keyboard, a mouse etc.) or a display device, as an external device of the image processing apparatus 100 .
- the display device 162 may also be a device that can perform a display and user operations.
- the communication interface 164 is a communication unit included in the image processing apparatus 100 , which functions as the communication unit 102 for performing wireless/wired communication with the detection apparatus 300 or an external device, such as a server, via a network (or directly).
- examples of the communication interface 164 include a communication antenna and an RF circuit (wireless communication), an IEEE 802.15.1 port and a transmitting/receiving circuit (wireless communication), an IEEE 802.11b port and a transmitting/receiving circuit (wireless communication), or a LAN (local area network) terminal and a transmitting/receiving circuit (wired communication) and the like.
- the image processing apparatus 100 performs the processing performed in the image processing method according to the present embodiment based on the configuration illustrated in FIG. 11 , for example.
- the hardware configuration of the image processing apparatus 100 according to the present embodiment is not limited to the configuration illustrated in FIG. 11 .
- the image processing apparatus 100 may be configured without the communication interface 164 .
- the image processing apparatus 100 may also be configured without the operation input device 160 or the display device 162 .
- the communication unit 102 is a communication unit included in the image processing apparatus 100 , which performs wireless/wired communication with the detection apparatus 300 or an external device, such as a server, via a network (or directly). Further, communication by the communication unit 102 is controlled by the control unit 104 , for example.
- examples of the communication unit 102 include a communication antenna and an RF (radio frequency) circuit, a LAN terminal, a transmitting/receiving circuit and the like.
- the configuration of the communication unit 102 is not limited to these examples.
- the communication unit 102 may have a configuration that supports an arbitrary standard that is capable of performing communication, such as a USB terminal and a transmitting/receiving circuit, or an arbitrary configuration that is capable of communicating with an external device via a network.
- the control unit 104 is configured from an MPU, for example, which plays the role of controlling the whole image processing apparatus 100 . Further, the control unit 104 , which includes, for example, the processing unit 110 , plays the lead role in the processing performed in the image processing method according to the present embodiment.
- the processing unit 110 which plays the lead role in the processing performed in the image processing method according to the present embodiment, forms an X-ray image based on X-ray detection data by processing projection data in which parallel X-ray detection data (X-ray detection data representing a detection result of parallel beam X-rays output from an X-ray source) has been converted by projection. More specifically, the processing unit 110 forms an X-ray image based on X-ray detection data by performing the processing according to the above-described first example, the processing according to the above-described second example, or the processing according to the above-described third example.
- in the case of processing projection data, such as when the communication unit 102 has received projection data, the processing unit 110 processes that projection data. Further, for example, in the case of processing the parallel X-ray detection data, such as when the communication unit 102 has received parallel X-ray detection data, the processing unit 110 converts the parallel X-ray detection data into projection data and processes the converted projection data. It is noted that the processing unit 110 can also process parallel X-ray detection data stored in a storage unit (described below) or the like, or projection data stored in the storage unit (described below) or the like in which parallel X-ray detection data has been converted.
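The branching described above, processing received projection data directly versus first converting received parallel X-ray detection data, can be sketched as a simple dispatch. All names, the flag, and the callables here are hypothetical illustrations, not the patent's API:

```python
def form_xray_image(data, is_projection, to_projection, reconstruct):
    """Sketch of the processing unit 110's dispatch.

    data          : received projection data, or raw parallel X-ray detection data
    is_projection : True if `data` is already projection data
    to_projection : callable converting detection data into projection data
    reconstruct   : callable performing the reconstitution processing
    """
    if not is_projection:
        data = to_projection(data)   # convert detection data by projection first
    return reconstruct(data)         # form the X-ray image from projection data
```

Structuring the unit this way lets the same reconstitution path serve both data sources (the communication unit and the storage unit), matching the equivalence the text describes.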
- the control unit 104 plays the lead role in the processing performed in the image processing method according to the present embodiment due to its inclusion of the processing unit 110 , for example.
- the image processing apparatus 100 performs the processing performed in the image processing method according to the present embodiment based on the configuration illustrated in FIG. 10 , for example.
- the image processing apparatus 100 can achieve a higher quality X-ray image while reducing the calculation costs for forming an X-ray image even further. Further, the image processing apparatus 100 achieves the effects gained from performing the processing according to the above-described first example when performing the processing according to the above-described first example, achieves the effects gained from performing the processing according to the above-described second example when performing the processing according to the above-described second example, and achieves the effects gained from performing the processing according to the above-described third example when performing the processing according to the above-described third example.
- the configuration of the image processing apparatus according to the present embodiment is not limited to the configuration illustrated in FIG. 10 .
- the image processing apparatus may further include a detection unit (not illustrated) having a similar function and configuration to the detection apparatus 300 illustrated in FIG. 9 .
- the detection unit (not illustrated) detects parallel beam X-rays and generates parallel X-ray detection data, for example.
- the detection unit (not illustrated) may also have a function for, for example, detecting parallel beam X-rays, generating parallel X-ray detection data, and converting the generated parallel X-ray detection data into projection data by projection.
- the processing unit 110 converts the X-ray detection data generated by the detection unit (not illustrated) into projection data by projection, and processes the converted projection data. Further, in the case of the detection unit (not illustrated) detecting parallel beam X-rays, generating parallel X-ray detection data, and converting the generated parallel X-ray detection data into projection data by projection, the processing unit 110 processes the projection data converted by the detection unit (not illustrated).
- the processing unit 110 may also process parallel X-ray detection data stored in a storage unit (described below), or projection data in which parallel X-ray detection data stored in a storage unit (described below) has been converted.
- the image processing apparatus according to the first modified example of the present embodiment can perform the processing performed in the image processing method according to the present embodiment. Therefore, the image processing apparatus according to the first modified example of the present embodiment can obtain the same effects as the image processing apparatus 100 illustrated in FIG. 10 .
- the image processing apparatus may further include, in addition to the configuration of the image processing apparatus according to the first modified example of the present embodiment, an X-ray output unit (not illustrated) having a similar function and configuration to the X-ray output apparatus 200 illustrated in FIG. 9 .
- the X-ray output unit (not illustrated) has an X-ray source that outputs parallel beam X-rays, for example. Further, the generation of the X-rays in the X-ray output unit (not illustrated) is controlled by the control unit 104 , for example.
- in the image processing apparatus according to the second modified example of the present embodiment, which in addition to the configuration illustrated in FIG. 10 further includes an X-ray output unit (not illustrated) and a detection unit (not illustrated), the detection unit (not illustrated) can detect parallel beam X-rays output from the X-ray output unit (not illustrated), for example, and the processing unit 110 can process projection data in which parallel X-ray detection data representing a detection result has been converted by the detection unit (not illustrated). It is noted that the processing unit 110 according to the second modified example of the present embodiment can also process parallel X-ray detection data stored in a storage unit (described below) or the like, or projection data stored in the storage unit (described below) or the like in which parallel X-ray detection data has been converted.
- the image processing apparatus can perform the processing performed in the image processing method according to the present embodiment.
- the image processing apparatus can obtain the same effects as the image processing apparatus 100 illustrated in FIG. 10 .
- the image processing apparatus according to the present embodiment may be configured without the communication unit 102 .
- the present embodiment is not limited to this example.
- the present embodiment can also be used in various devices that are capable of processing an image, such as a computer like a PC (personal computer) or a server, a CT apparatus (an apparatus that uses 360° direction projection data), an apparatus having a tomosynthesis function (an apparatus that uses projection data of a controlled angle direction, such as 180° direction projection data), a communications device such as a smartphone and the like.
- the present embodiment can also be realized as, for example, a processing IC (integrated circuit) capable of being incorporated into devices such as those described above.
- by a program that makes a computer function as the image processing apparatus according to the present embodiment (e.g., a program capable of executing the processing performed in the image processing method according to the present embodiment, such as a program that makes a computer function as the processing unit 110 illustrated in FIG. 10 ) being executed in a computer, a higher quality X-ray image can be achieved while reducing the calculation costs for forming an X-ray image even further.
- the present embodiment can further provide a recording medium in which this program is stored.
- present technology may also be configured as below.
- An image processing apparatus including:
- a processing unit configured to process projection data in which X-ray detection data representing a detection result of parallel beam X-rays output from an X-ray source has been converted by projection, and form an X-ray image based on the X-ray detection data.
- a detection unit configured to detect the parallel beam X-rays, generate the X-ray detection data, and convert the generated X-ray detection data into the projection data
- wherein the processing unit is configured to process the projection data converted by the detection unit.
- a detection unit configured to detect the parallel beam X-rays and generate the X-ray detection data
- wherein the processing unit is configured to convert the X-ray detection data generated by the detection unit into the projection data, and process the converted projection data.
- an X-ray output unit that includes the X-ray source for outputting the parallel beam X-rays.
- An image processing method including:
- an X-ray output apparatus that includes an X-ray source for outputting parallel beam X-rays
- a detection apparatus configured to detect the parallel beam X-rays, generate X-ray detection data representing a detection result of the parallel beam X-rays, and convert the generated X-ray detection data into projection data by projection;
- an image processing apparatus that includes a processing unit configured to process projection data in which the X-ray detection data has been converted, and form an X-ray image based on the X-ray detection data.
Abstract
There is provided an image processing apparatus including a processing unit configured to process projection data in which X-ray detection data representing a detection result of parallel beam X-rays output from an X-ray source has been converted by projection, and form an X-ray image based on the X-ray detection data.
Description
- The present application is a Continuation application of U.S. patent application Ser. No. 13/968,469 filed on Aug. 16, 2013 which claims priority to Japanese Patent Application No. 2012-185292 filed on Aug. 24, 2012, the disclosures of which are incorporated herein by reference.
- The present disclosure relates to an image processing apparatus, an image processing method, and an image processing system.
- A CT (computed tomography) apparatus (or a CT system; hereinafter, the same) that utilizes X-rays output from an X-ray source, or an apparatus (or a system; hereinafter, the same) having a tomosynthesis function that utilizes X-rays, is widely used in the medical field, for example. JP-A-2004-329784 describes a technology relating to a CT apparatus that utilizes X-rays.
- A CT apparatus or the like that utilizes X-rays forms an X-ray image by processing X-ray detection data representing a detection result of X-rays. More specifically, in a CT apparatus or the like that utilizes X-rays, an X-ray image is formed by converting the X-ray detection data into projection data and reconstituting three-dimensional data from the projection data, for example.
- Here, in a CT apparatus and the like that utilizes X-rays, an X-ray source that outputs cone beam X-rays is used, or an X-ray source that outputs fan beam X-rays like the technology described in JP-A-2004-329784 is used, for example.
- However, when an X-ray image is formed by processing projection data obtained by converting X-ray detection data that represents a detection result of X-rays output from an X-ray source that outputs cone beam X-rays or fan beam X-rays, data of a plurality of layers of the target that is hit by the X-rays becomes mixed in the projection data, due to widening and unevenness of the detection intensity resulting from the cone beam or the fan beam, for example. Further, in order to strictly carry out the reconstitution of three-dimensional data from projection data in which data from various layers is mixed like this, calculations must be performed that repeatedly use all of the projection data and all of the reconstitution data.
- Therefore, to form a more accurate X-ray image by processing projection data obtained by converting X-ray detection data that represents a detection result of cone beam or fan beam X-rays, the calculation costs for forming the X-ray image become very large.
- Examples of methods for performing the calculations for forming an X-ray image more rapidly include ignoring the effects caused by a cone beam or a fan beam, or converting a cone beam or a fan beam into a parallel beam in the projection data and dividing the processing for forming the X-ray image.
- However, when using such a method for performing the calculations for forming an X-ray image more rapidly, since approximate processing is included when forming the X-ray image, the calculation accuracy deteriorates, so that the accuracy of the obtained X-ray image deteriorates.
- According to an embodiment of the present disclosure, there are provided a novel and improved image processing apparatus, image processing method, and image processing system that can achieve a higher quality X-ray image while reducing the calculation costs for reconstituting an X-ray image even further.
- According to an embodiment of the present disclosure, there is provided an image processing apparatus including a processing unit configured to process projection data in which X-ray detection data representing a detection result of parallel beam X-rays output from an X-ray source has been converted by projection, and form an X-ray image based on the X-ray detection data.
- Further, according to an embodiment of the present disclosure, there is provided an image processing method including processing projection data in which X-ray detection data representing a detection result of parallel beam X-rays output from an X-ray source has been converted by projection, and forming an X-ray image based on the X-ray detection data.
- Further, according to an embodiment of the present disclosure, there is provided an image processing system including an X-ray output apparatus that includes an X-ray source for outputting parallel beam X-rays, a detection apparatus configured to detect the parallel beam X-rays, generate X-ray detection data representing a detection result of the parallel beam X-rays, and convert the generated X-ray detection data into projection data by projection, and an image processing apparatus that includes a processing unit configured to process projection data in which the X-ray detection data has been converted, and form an X-ray image based on the X-ray detection data.
- According to the embodiments of the present disclosure described above, a higher quality X-ray image can be achieved while reducing the calculation costs for reconstituting an X-ray image even further.
-
FIG. 1 is a flow diagram illustrating a first example of the processing performed in an image processing method according to an embodiment of the present disclosure performed by an image processing apparatus according to an embodiment of the present disclosure; -
FIG. 2 is a flow diagram illustrating an example of reconstitution processing performed by the image processing apparatus according to an embodiment of the present disclosure; -
FIG. 3 is an explanatory diagram illustrating the processing performed in the image processing method according to an embodiment of the present disclosure; -
FIG. 4 is an explanatory diagram illustrating the processing performed in the image processing method according to an embodiment of the present disclosure; -
FIG. 5 is an explanatory diagram illustrating the processing performed in the image processing method according to an embodiment of the present disclosure; -
FIG. 6 is a flow diagram illustrating a second example of the processing performed in the image processing method according to an embodiment of the present disclosure performed by the image processing apparatus according to an embodiment of the present disclosure; -
FIG. 7 is an explanatory diagram illustrating processing performed in the image processing method according to an embodiment of the present disclosure; -
FIG. 8 is a flow diagram illustrating a third example of the processing performed in the image processing method according to an embodiment of the present disclosure performed by the image processing apparatus according to an embodiment of the present disclosure; -
FIG. 9 is an explanatory diagram illustrating an example of an image processing system according to an embodiment of the present disclosure; -
FIG. 10 is a block diagram illustrating an example of a configuration of the image processing apparatus according to an embodiment of the present disclosure; and -
FIG. 11 is a diagram illustrating an example of a hardware configuration of the image processing apparatus according to an embodiment of the present disclosure. - Hereinafter, preferred embodiments of the present disclosure will be described in detail with reference to the appended drawings. Note that, in this specification and the appended drawings, structural elements that have substantially the same function and structure are denoted with the same reference numerals, and repeated explanation of these structural elements is omitted.
- Further, the description will be made in the following order.
- 1. Image processing method according to the present embodiment
2. Image processing apparatus according to the present embodiment
3. Program according to the present embodiment - Before describing the configuration of the image processing apparatus according to the present embodiment, first, the image processing method according to the present embodiment will be described. In the following, the image processing method according to the present embodiment will be described based on an example in which the image processing apparatus according to the present embodiment performs the processing performed in the image processing method according to the present embodiment.
- As described above, when forming a more accurate X-ray image by processing projection data obtained by converting X-ray detection data that represents a detection result of cone beam or fan beam X-rays, the calculation costs for forming the X-ray image become very large. Further, when using the above-described method for performing the calculations for forming an X-ray image more rapidly, approximate processing is included when forming the X-ray image, so the calculation accuracy deteriorates and, with it, the accuracy of the obtained X-ray image.
- Accordingly, the image processing apparatus according to the present embodiment forms an X-ray image based on X-ray detection data by processing projection data in which X-ray detection data representing a detection result of parallel beam X-rays output from an X-ray source has been converted.
- Here, the X-ray detection data according to the present embodiment is, for example, data representing a detection intensity of parallel beam X-rays that have passed through a target and have been detected by a detector, such as the detector that is included in the below-described detection apparatus according to the present embodiment. In the following, to differentiate between the X-ray detection data according to the present embodiment and the X-ray detection data representing a detection result of cone beam X-rays or fan beam X-rays, the X-ray detection data according to the present embodiment is sometimes referred to as “parallel X-ray detection data”.
- The parallel X-ray detection data representing the detection intensity of the parallel beam X-rays detected by the above-described detector is converted into projection data (two-dimensional projection data) by being projected in two dimensions as an X-ray projection image. More specifically, the parallel X-ray detection data is converted into projection data by the Radon transform, for example.
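This two-dimensional projection can be sketched concretely (an illustrative sketch, not the disclosed implementation; the nearest-neighbour rotation helper and all names are assumptions made to keep the example self-contained): for each projection angle, the slice is rotated so the rays align with the image columns and then summed along each ray, producing one detector-array row of the projection data.

```python
import numpy as np

def _rotate_nn(img, theta_deg):
    """Nearest-neighbour rotation about the image centre (outside maps to 0)."""
    h, w = img.shape
    t = np.deg2rad(theta_deg)
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    ys, xs = np.mgrid[0:h, 0:w]
    # Inverse mapping: find which source pixel lands on output pixel (y, x).
    sy = np.rint(cy + (ys - cy) * np.cos(t) - (xs - cx) * np.sin(t)).astype(int)
    sx = np.rint(cx + (ys - cy) * np.sin(t) + (xs - cx) * np.cos(t)).astype(int)
    ok = (sy >= 0) & (sy < h) & (sx >= 0) & (sx < w)
    out = np.zeros_like(img, dtype=float)
    out[ok] = img[sy[ok], sx[ok]]
    return out

def radon_parallel(slice_2d, angles_deg):
    """Discrete Radon transform: one row of parallel-ray sums per angle."""
    return np.stack([_rotate_nn(slice_2d, a).sum(axis=0) for a in angles_deg])
```

At angle 0 the ray sums are simply the column sums of the slice; each further angle adds one row to the sinogram, which plays the role of the projection data processed below.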
- Here, although the conversion processing for converting the parallel X-ray detection data into projection data is performed by an external device, such as the detection apparatus (described below) according to the present embodiment, that generates the X-ray detection data according to the present embodiment, for example, the conversion processing according to the present embodiment is not limited to being performed by an external device. For example, the image processing apparatus according to the present embodiment may perform the conversion processing according to the present embodiment. In the following, the processing performed in the image processing method according to the present embodiment will be described mainly based on an example in which the conversion processing according to the present embodiment is performed by an external device, such as the detection apparatus according to the present embodiment, that generates parallel X-ray detection data, namely, a case in which the image processing apparatus according to the present embodiment processes projection data in which the parallel X-ray detection data has already been converted by the external device.
- Further, although the image processing apparatus according to the present embodiment processes projection data in which parallel X-ray detection data has been converted, or parallel X-ray detection data, acquired from an external device such as the below-described detection apparatus according to the present embodiment, the projection data in which parallel X-ray detection data has been converted or the parallel X-ray detection data processed by the image processing apparatus according to the present embodiment is not limited to that described above. For example, the image processing apparatus according to the present embodiment can also perform the processing by reading from a storage unit (described below) projection data in which parallel X-ray detection data has been converted, or parallel X-ray detection data, that is stored in a storage unit (described below) included in the apparatus (the image processing apparatus according to the present embodiment) or stored on an external storage medium, for example.
- More specifically, the image processing apparatus according to the present embodiment forms an X-ray image by reconstituting three-dimensional data from the projection data.
- Here, examples of the processing used by the image processing apparatus according to the present embodiment to reconstitute three-dimensional data from the projection data include successive approximation methods, such as ML-EM (maximum likelihood-expectation maximization), OS-EM (ordered subsets-expectation maximization), and MAP-EM (maximum a posteriori-expectation maximization). It is noted that the processing performed to reconstitute three-dimensional data from the projection data according to the present embodiment is obviously not limited to processing that uses the successive approximation methods mentioned above.
- In processing using a successive approximation method like those mentioned above, a value close to the correct value is obtained by repeated projection of a reconstituted image and reverse-projection of an image (corrected image) in which the reconstituted image has been corrected. Therefore, the image processing apparatus according to the present embodiment can form a highly accurate X-ray image by performing processing that uses a successive approximation method like those mentioned above as the processing to reconstitute three-dimensional data from projection data.
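Among the successive approximation methods mentioned above, the ordered-subsets idea of OS-EM can be sketched as follows (a hedged toy example, not the disclosed implementation: `C` is an assumed detection-rate matrix with `C[j, i]` the probability that voxel i is detected at detector position j, and the subset schedule is the simplest interleaved one):

```python
import numpy as np

def osem(C, p, n_subsets=2, n_iter=20, eps=1e-12):
    """OS-EM sketch: one EM-style multiplicative update per ordered subset."""
    x = np.ones(C.shape[1])                     # positive initial image
    subsets = [np.arange(s, C.shape[0], n_subsets) for s in range(n_subsets)]
    for _ in range(n_iter):
        for rows in subsets:
            Cs, ps = C[rows], p[rows]
            sens = Cs.sum(axis=0)               # subset sensitivity per voxel
            ratio = ps / np.maximum(Cs @ x, eps)
            # Update only voxels actually seen by this subset's detectors.
            x = np.where(sens > 0, x * (Cs.T @ ratio) / np.maximum(sens, eps), x)
    return x
```

Cycling the EM-style update over subsets of the projections typically reaches a usable image in fewer passes over the full data than plain ML-EM, which is why OS-EM is listed above as a successive approximation variant.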
- Therefore, the image processing apparatus according to the present embodiment can achieve a higher quality X-ray image.
- Further, as described above, the image processing apparatus according to the present embodiment forms an X-ray image based on parallel X-ray detection data by processing projection data in which parallel X-ray detection data has been converted. Here, in the projection data in which parallel X-ray detection data has been converted, there is no mixing of the data of the plurality of layers of the target, as is the case in the above-described projection data in which X-ray detection data representing a detection result of cone beam or fan beam X-rays has been converted. This is because projection data in which parallel X-ray detection data has been converted is not affected by the above-described widening and unevenness of the detection intensity resulting from a cone beam or a fan beam.
- Therefore, by processing projection data in which parallel X-ray detection data has been converted, the image processing apparatus according to the present embodiment does not have to perform processing to reduce the effects of a cone beam or a fan beam, such as geometric correction, distortion correction, noise removal and the like.
- Accordingly, the image processing apparatus according to the present embodiment can reduce the calculation costs for forming an X-ray image more than when processing the above-described X-ray detection data that represents a detection result of cone beam or fan beam X-rays.
- Further, since the image processing apparatus according to the present embodiment does not have to perform processing to reduce the effects of a cone beam or a fan beam, such as geometric correction, distortion correction, noise removal and the like, approximation or image deterioration resulting from processing to reduce the effects of a cone beam or a fan beam is prevented. Therefore, the image processing apparatus according to the present embodiment can achieve a higher quality X-ray image.
- In addition, by processing projection data in which parallel X-ray detection data has been converted, since there is no mixing of the data of the respective layers of the target in the projection data, deterioration in the accuracy of the X-ray image is prevented even if the image processing apparatus according to the present embodiment processes only the parallel X-ray detection data corresponding to a specific layer of the target. Namely, by processing projection data in which parallel X-ray detection data has been converted, with the image processing apparatus according to the present embodiment it is possible to process only the parallel X-ray detection data corresponding to a specific layer of the target.
- Therefore, the image processing apparatus according to the present embodiment is capable of performing processing (e.g., processing for dividing the projection data and forming an X-ray image for each piece of divided projection data (described below)) in parallel. Further, since the image processing apparatus according to the present embodiment can form successive X-ray images each time projection data is acquired (described below), for example, the amount of memory that is used in one calculation is substantially reduced.
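The layer-by-layer division described above can be sketched as follows (illustrative names, not the disclosed implementation; `C_layer` is an assumed per-layer detection-rate matrix shared by all layers, with `C_layer[j, i]` the contribution of voxel i to detector position j):

```python
import numpy as np

def reconstruct_layer(C_layer, p_layer, n_iter=50, eps=1e-12):
    """One closed two-dimensional problem for a single layer of the target."""
    x = np.ones(C_layer.shape[1])
    sens = np.maximum(C_layer.sum(axis=0), eps)
    for _ in range(n_iter):
        ratio = p_layer / np.maximum(C_layer @ x, eps)
        x = x * (C_layer.T @ ratio) / sens
    return x

def reconstruct_volume(C_layer, sinograms):
    # Layers never exchange data under parallel-beam geometry, so each
    # iteration of this loop is independent of the others.
    return np.stack([reconstruct_layer(C_layer, p) for p in sinograms])
```

Because no layer reads another layer's data, the loop over `sinograms` can be dispatched to separate processes, and each call holds only one layer in memory rather than the whole volume.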
- Next, the processing performed in the image processing method according to the present embodiment will be described in more detail.
- In the following, the processing performed in the image processing method according to the present embodiment will be described based on an example in which the image processing apparatus according to the present embodiment processes projection data in which parallel X-ray detection data has been converted by an external device. It is noted that when the image processing apparatus according to the present embodiment processes parallel X-ray detection data, the image processing apparatus according to the present embodiment converts the parallel X-ray detection data into projection data, for example, and processes the converted projection data.
-
FIG. 1 is a flow diagram illustrating a first example of the processing performed in an image processing method according to the present embodiment performed by an image processing apparatus according to the present embodiment. - The image processing apparatus according to the present embodiment determines whether projection data has been acquired (S100). The image processing apparatus according to the present embodiment determines that projection data has been acquired if all of the projection data corresponding to the target has been read into a RAM (random-access memory), for example.
- If it is determined in step S100 that projection data has not been acquired, the image processing apparatus according to the present embodiment does not proceed to the next processing step until projection data is acquired.
- Further, if it is determined in step S100 that projection data has been acquired, the image processing apparatus according to the present embodiment sets the image represented by the projection data as projection image P, and sets an initial reconstituted image I0 (S102).
- Here, although the image processing apparatus according to the present embodiment sets the initial reconstituted image I0 by generating as the initial reconstituted image I0 an image in which the pixel values of all of the pixels are positive, such as an image in which the pixel values of all of the pixels are indicated as “1”, for example, the processing for setting the initial reconstituted image I0 that is performed in the present embodiment is not limited to this. For example, in order to complete the below-described reconstitution processing more rapidly, the image processing apparatus according to the present embodiment may set an image reconstituted by a FBP (filtered back-projection) method as the initial reconstituted image I0. Further, the image processing apparatus according to the present embodiment can also, for example, set an arbitrary image in which the pixel values are positive as the initial reconstituted image I0.
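The positivity requirement on the initial reconstituted image I0 can be seen in a short sketch (illustrative values, not from the disclosure): the successive approximation update is multiplicative, so any pixel initialized to zero remains zero no matter what correction is reverse-projected onto it.

```python
import numpy as np

# Step S102 sets every pixel of I0 to a positive value such as "1".
i0_ones = np.ones((4, 4))
i0_zero = np.zeros((4, 4))          # a zero start would be a poor choice

# The reconstitution update multiplies the image by a correction image.
correction = np.full((4, 4), 2.5)   # hypothetical reverse-projected correction
assert (i0_ones * correction > 0).all()   # positive pixels keep evolving
assert (i0_zero * correction == 0).all()  # zero pixels are stuck at zero
```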
- When the processing of step S102 has been performed, the image processing apparatus according to the present embodiment performs reconstitution processing for forming an X-ray image (S104).
-
FIG. 2 is a flow diagram illustrating an example of reconstitution processing performed by the image processing apparatus according to the present embodiment. - The image processing apparatus according to the present embodiment projects a reconstituted image In (wherein n denotes an integer of 0 or more), and generates a reprojection image P′ (S200). Here, the reconstituted image In that is initially projected by the image processing apparatus according to the present embodiment is the initial reconstituted image I0 set in step S102 of
FIG. 1 . - When the processing of step S200 has been performed, the image processing apparatus according to the present embodiment compares the reprojection image P′ and the projection image P, for example, and calculates a ratio between the reprojection image P′ and the projection image P (S202).
- When the ratio between the reprojection image P′ and the projection image P has been calculated in step S202, the image processing apparatus according to the present embodiment determines whether to finish the reconstitution processing (S204). The image processing apparatus according to the present embodiment determines that the reconstitution processing is to be finished when, for example, the ratio between the reprojection image P′ and the projection image P calculated in step S202 is equal to or less than a predetermined set value (or less than a predetermined value).
- If it is not determined in step S204 to finish the reconstitution processing, the image processing apparatus according to the present embodiment reverse-projects the reprojection image P′ (correction value) using the ratio between the reprojection image P′ and the projection image P to generate a new reconstituted image In (n=n+1) (S206). Then, the image processing apparatus according to the present embodiment repeats the processing from step S200.
- Further, if it is determined in step S204 to finish the reconstitution processing, the image processing apparatus according to the present embodiment sets the reconstituted image In as the X-ray image (a so-called reconstituted layer image) (S208). Then, the image processing apparatus according to the present embodiment finishes the reconstitution processing.
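The loop of steps S200, S202, and S206 can be sketched as a multiplicative ML-EM style update (a minimal sketch under the assumption that the detection rates are gathered into a system matrix `C`, where `C[j, i]` is the contribution of voxel i to detector position j; all names are illustrative, not from the disclosure):

```python
import numpy as np

def reconstitute(C, p, n_iter=50, eps=1e-12):
    """Iterate: project In (S200), ratio P/P' (S202), reverse-project (S206)."""
    x = np.ones(C.shape[1])        # initial reconstituted image I0 (step S102)
    sens = np.maximum(C.sum(axis=0), eps)    # per-voxel sensitivity
    for _ in range(n_iter):
        reproj = C @ x                           # S200: reprojection image P'
        ratio = p / np.maximum(reproj, eps)      # S202: compare P' with P
        x = x * (C.T @ ratio) / sens             # S206: multiplicative update
    return x                       # S208: the reconstituted layer image
```

With consistent data the ratio P/P' approaches an image of ones as the reprojection converges to the measured projection, which matches the stopping criterion of step S204.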
- The image processing apparatus according to the present embodiment performs the processing illustrated in
FIG. 2 , for example, as the reconstitution processing according to the present embodiment. - Here, when the image processing apparatus according to the present embodiment uses ML-EM, which is a basic successive approximation method that uses maximum likelihood estimation, the reconstitution processing according to the present embodiment is represented by the following
formula 1, for example. -
$$I_i^{(k+1)} = \frac{I_i^{(k)}}{\sum_{j} C_{ij}} \sum_{j} C_{ij}\,\frac{P_j}{\sum_{i'} C_{i'j}\, I_{i'}^{(k)}} \qquad \text{(formula 1)}$$
- The “i” in
formula 1 represents the coordinates of the reconstituted image In (the coordinates corresponding to the position of the target), the “j” in formula 1 represents the coordinates of the projection image P, and the “k” in formula 1 represents the number of repetitions. Further, the “Cij” in formula 1 represents the “detection rate”, which is the probability of the voxel of coordinate i in the target being detected by a detector corresponding to coordinate j of the projection image P. - Therefore, for example, to reconstitute a given layer by a successive approximation method, one calculation cycle can be carried out by plugging in all of the voxels of that layer and the values of the detection positions corresponding to those voxels. Accordingly, when reconstituting an X-ray image corresponding to a three-dimensional object (target), the reconstituted image In and the projection image P based on the projection data are three-dimensional, so that the calculation represented in
formula 1 can be carried out on all of the voxels in three dimensions. -
FIG. 3 is an explanatory diagram illustrating the processing performed in the image processing method according to the present embodiment.FIG. 3 illustrates an example in a typical CT of the correspondence among the position of the X-ray source, the position of the target through which the X-rays pass, and the position of the detector. Symbol A inFIG. 3 illustrates an example in a single-slice type CT apparatus of the correspondence among the position of the X-ray source, the position of the target through which the X-rays pass, and the position of the detector. Further, symbol B inFIG. 3 illustrates an example in a multi-slice type CT apparatus of the correspondence among the position of the X-ray source, the position of the target through which the X-rays pass, and the position of the detector. - For the single-slice type CT apparatus illustrated by A in
FIG. 3 , since one layer and one array of detectors correspond to each other on a one-to-one basis, the successive calculations performed in the above reconstitution processing can be performed layer by layer. Namely, in the case of the single-slice type CT apparatus illustrated by A inFIG. 3 , in formula 1 a two-dimensional×two-dimensional calculation can be performed. Therefore, when reconstituting the X-ray image using the single-slice type CT apparatus illustrated by A inFIG. 3 , the calculation costs to form the X-ray image are smaller than the calculation costs to form the X-ray image when reconstituting an X-ray image using the multi-slice type CT apparatus employing an X-ray source that outputs cone beam X-rays like that illustrated by B inFIG. 3 . However, with the single-slice type CT apparatus illustrated by A inFIG. 3 , the larger the detection area of the target, the longer it takes to generate the X-ray image. - Further, for recent CT apparatuses, in order to increase the detection area and shorten the time, a multi-slice type like that illustrated by B in
FIG. 3 is mainstream. Here, for a multi-slice type CT apparatus, an X-ray source that outputs cone beam X-rays is used, as illustrated by B ofFIG. 3 . - When an X-ray source that outputs cone beam X-rays is used as illustrated by B of
FIG. 3 , when an attempt is made to reconstitute the X-ray image corresponding to a given layer, the width of the detector corresponding to the X-rays that pass through that layer surface increases. In addition, when the X-ray source and the detector are rotated by a gantry and the like configuring the CT apparatus, the position of the detector that passes through a given voxel of the target changes depending on the angle of rotation. - Therefore, when an X-ray source that outputs cone beam X-rays is used as illustrated by B of
FIG. 3 , when reconstituting the X-ray image corresponding to the layer surface, all of the relevant voxels and the detectors are used in the calculation. Namely, when reconstituting an X-ray image using an X-ray source that outputs cone beam X-rays as illustrated by B ofFIG. 3 , all of the voxels and the detectors that have an effect on each other are used in all the calculations even when trying to reconstitute the X-ray image corresponding to a given specific layer surface. - Therefore, when reconstituting an X-ray image using an X-ray source that outputs cone beam X-rays as illustrated by B of
FIG. 3 , the value of all the voxels in the three-dimensional information about the target and the value of all the detectors are used in a given single calculation. Namely, when reconstituting an X-ray image using an X-ray source that outputs cone beam X-rays as illustrated by B ofFIG. 3 , the calculation amount of the detection probability Cij informula 1 increases. -
FIG. 4 , which is an explanatory diagram illustrating the processing performed in the image processing method according to the present embodiment, illustrates the outline of the processing that is performed when reconstituting an X-ray image using a cone beam X-ray source. - As illustrated in
FIG. 4 , when reconstituting an X-ray image using an X-ray source that outputs cone beam X-rays as illustrated by B ofFIG. 3 , correspondence between three-dimensional data×three-dimensional data is repeatedly calculated based onformula 1. Therefore, when reconstituting an X-ray image using an X-ray source that outputs cone beam X-rays as illustrated by B ofFIG. 3 , the calculation amount is very large. Further, even when reconstituting an X-ray image using a fan beam X-ray source, similar to when reconstituting an X-ray image using an X-ray source that outputs cone beam X-rays as illustrated by B ofFIG. 3 , the calculation amount relating to the reconstitution of the X-ray image is very large. - Therefore, as described above, when forming an X-ray image having greater accuracy by processing projection data in which X-ray detection data representing a result that X-rays output from an X-ray source that outputs cone beam X-rays or an X-ray source that outputs fan beam X-rays have been detected has been converted, the calculation costs for forming the X-ray image become very large.
-
FIG. 5 , which is an explanatory diagram illustrating processing performed in the image processing method according to the present embodiment, illustrates an example of a multi-slice type CT apparatus in which an X-ray source that outputs parallel beam X-rays is used. - As illustrated in
FIG. 5 , when an X-ray source that outputs parallel beam X-rays is used, since X-rays are irradiated from an X-ray source parallel to the array of detectors, mixing of the layer data among the layers is eliminated, so that each cross-section and each detector array are independent, and the correspondence between the cross-sections and the detectors is on a one-to-one basis. Further, as described above, the image processing apparatus according to the present embodiment forms an X-ray image based on X-ray detection data by processing projection data in which X-ray detection data representing a detection result of parallel beam X-rays output from an X-ray source has been converted. - Therefore, in the reconstitution processing according to the present embodiment illustrated in
FIG. 2 , the image processing apparatus according to the present embodiment can perform the successive calculations for reconstitution on a per layer basis in a closed state. Further, since the reconstitution calculations are also two-dimensional×two-dimensional, the calculation amount that is performed in one go can be substantially reduced. In addition, the calculation amount of the detection probability Cij informula 1 can be reduced by a lot more than when reconstituting an X-ray image using an X-ray source that outputs cone beam X-rays as illustrated by B ofFIG. 3 , for example. - The image processing apparatus according to the present embodiment performs the processing illustrated in
FIG. 1 , for example, as the processing performed in the image processing method according to the present embodiment. - Here, in the projection data in which parallel X-ray detection data has been converted that is processed by the image processing apparatus according to the present embodiment in the processing illustrated in
FIG. 1 , there is no mixing of data of the plurality of layers of the target, unlike the above-described projection data in which X-ray detection data representing a detection result of cone beam or fan beam X-rays has been converted. Namely, when performing the processing illustrated in FIG. 1, for example, the image processing apparatus according to the present embodiment does not have to perform processing to reduce the effects of a cone beam or a fan beam, such as geometric correction, distortion correction, noise removal and the like. Further, the calculation amount relating to the reconstitution processing performed by the image processing apparatus according to the present embodiment is reduced far more than when processing the above-described projection data in which X-ray detection data representing a detection result of cone beam or fan beam X-rays has been converted. Therefore, when performing the processing illustrated in FIG. 1, for example, deterioration resulting from the processing to reduce the effects of a cone beam or a fan beam is prevented. Further, the calculation costs for forming an X-ray image can be reduced more than when processing the above-described X-ray detection data that represents a detection result of cone beam or fan beam X-rays. Therefore, by performing the processing illustrated in FIG. 1, for example, the image processing apparatus according to the present embodiment can achieve a higher quality X-ray image while reducing the calculation costs for forming an X-ray image even further. - Further, for example, in the processing illustrated in
FIG. 1 , the image processing apparatus according to the present embodiment performs processing that uses a successive approximation method, such as processing using the ML-EM method represented in formula 1, for example, as the reconstitution processing for forming an X-ray image. Therefore, since a highly accurate X-ray image can be formed by performing the processing illustrated in FIG. 1, for example, a higher quality X-ray image can be achieved. - It is noted that the processing performed in the image processing method according to the present embodiment that is performed by the image processing apparatus according to the present embodiment is not limited to the processing according to the first example illustrated in
FIG. 1 . - As described above, in projection data in which parallel X-ray detection data has been converted, there is no mixing of the data from the respective layers in the target. Therefore, a deterioration in the accuracy of the X-ray image is prevented even if the image processing apparatus according to the present embodiment processes only the parallel X-ray detection data corresponding to a specific layer of the target. Namely, by processing projection data in which parallel X-ray detection data has been converted, the image processing apparatus according to the present embodiment can process only the parallel X-ray detection data corresponding to a specific layer of the target.
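As one way to picture the per-layer reconstitution, the ML-EM update of formula 1 restricted to a single layer can be sketched as below. This is a minimal NumPy sketch, not the embodiment's implementation: the system matrix C, the iteration count, and the uniform initial reconstituted image I0 are illustrative assumptions.

```python
import numpy as np

def ml_em_step(C, P, I):
    """One ML-EM update I^(k) -> I^(k+1) for a single layer.

    C : (num_rays, num_pixels) detection probabilities Cij (assumed given)
    P : (num_rays,) projection image for this layer
    I : (num_pixels,) current reconstituted image I^(k)
    """
    forward = C @ I                          # expected projection: sum_j Cij * Ij
    ratio = P / np.maximum(forward, 1e-12)   # guard against division by zero
    # Back-project the ratio and normalize by sum_i Cij.
    return I * (C.T @ ratio) / np.maximum(C.sum(axis=0), 1e-12)

def reconstruct_layer(C, P, num_iter=50):
    I = np.ones(C.shape[1])                  # initial reconstituted image I0
    for _ in range(num_iter):
        I = ml_em_step(C, P, I)
    return I
```

Because every layer closes on itself with parallel beams, C here stays a small two-dimensional × two-dimensional matrix for each layer, rather than the full three-dimensional coupling a cone beam would require.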
- Therefore, the image processing apparatus according to the present embodiment can, for example, divide the projection data and form an X-ray image for each piece of divided projection data (the parallel processing according to the present embodiment). Further, the image processing apparatus according to the present embodiment can also form successive X-ray images each time projection data is acquired (the successive processing according to the present embodiment).
-
FIG. 6 is a flow diagram illustrating a second example of the processing performed in the image processing method according to the present embodiment by the image processing apparatus according to the present embodiment. Here, FIG. 6 illustrates an example of the parallel processing according to the present embodiment. - Similar to step S100 of
FIG. 1 , the image processing apparatus according to the present embodiment determines whether projection data has been acquired (S300). If it is determined in step S300 that projection data has not been acquired, the image processing apparatus according to the present embodiment does not proceed to the next processing step until projection data is acquired. - Further, if it is determined in step S300 that projection data has been acquired, the image processing apparatus according to the present embodiment divides the projection data (S302). Here, although the image processing apparatus according to the present embodiment divides the projection data on a per layer basis, for example, the units into which the image processing apparatus according to the present embodiment divides the projection data are not especially limited. Further, the units into which the image processing apparatus according to the present embodiment divides the projection data may be, for example, a single unit, or a mixture of a plurality of units. The image processing apparatus according to the present embodiment sets an image represented by each piece of divided projection data as the projection image P.
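The division of step S302 can be pictured as slicing the layer axis out of the acquired projection data. The sketch below assumes a hypothetical array layout and hypothetical sizes for illustration only.

```python
import numpy as np

# Hypothetical layout of multi-slice parallel-beam projection data:
# (angles, layers, detectors). With parallel beams each detector row
# corresponds to exactly one cross-section, so the layer axis can be
# split apart without any geometric correction.
angles, layers, detectors = 360, 64, 256
projection_data = np.zeros((angles, layers, detectors))

# Step S302: divide on a per layer basis; each piece is an independent
# two-dimensional projection image P.
projection_images = [projection_data[:, k, :] for k in range(layers)]

print(len(projection_images), projection_images[0].shape)  # 64 (360, 256)
```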
- Further, if it is determined in step S300 that projection data has been acquired, the image processing apparatus according to the present embodiment sets an initial reconstituted image I0 corresponding to each piece of divided projection data (S304). Here, the image processing apparatus according to the present embodiment sets the initial reconstituted image I0 in the same manner as in step S102 of
FIG. 1 . - It is noted that although in
FIG. 6 an example is illustrated in which the processing of step S304 is carried out after the processing of step S302, the processing of step S302 and the processing of step S304 can be performed independently. Therefore, the image processing apparatus according to the present embodiment can perform the processing of step S302 and the processing of step S304 in synchronization, for example. - When the processing of steps S302 and S304 has been performed, the image processing apparatus according to the present embodiment performs in parallel the reconstitution processing for forming an X-ray image for each piece of divided projection data (S306). Here, the image processing apparatus according to the present embodiment performs the reconstitution processing for forming X-ray images in the same manner as in step S104 of
FIG. 1 , for example. - The image processing apparatus according to the present embodiment performs the processing illustrated in
FIG. 6 , for example, as the processing performed in the image processing method according to the present embodiment. - Here, in the projection data that is processed by the image processing apparatus according to the present embodiment in the processing illustrated in
FIG. 6 , similar to the processing according to the first example illustrated in FIG. 1, there is no mixing of data of the plurality of layers of the target. Namely, similar to the processing according to the first example illustrated in FIG. 1, when performing the processing illustrated in FIG. 6, for example, the image processing apparatus according to the present embodiment does not have to perform processing to reduce the effects of a cone beam or a fan beam, such as geometric correction, distortion correction, noise removal and the like. Further, similar to the processing according to the first example illustrated in FIG. 1, the calculation amount relating to the reconstitution processing performed by the image processing apparatus according to the present embodiment is reduced far more than when processing the above-described projection data in which X-ray detection data representing a detection result of cone beam or fan beam X-rays has been converted. - Therefore, similar to the processing according to the first example illustrated in
FIG. 1 , by performing the processing illustrated in FIG. 6, for example, the image processing apparatus according to the present embodiment can achieve a higher quality X-ray image while reducing the calculation costs for forming an X-ray image even further. - Further, in the processing illustrated in
FIG. 6 , the dividing of the projection data into respective layers and the reconstitution processing corresponding to each piece of projection data are performed in parallel. - Therefore, the image processing apparatus according to the present embodiment can shorten the processing time (calculation time) taken for the reconstitution processing more than when an X-ray image is reconstituted using an X-ray source that outputs cone beam or fan beam X-rays, for example. In addition, similar to the processing according to the first example illustrated in
FIG. 1 , since there is no mixing of the data of the plurality of layers of the target in the projection data processed by the image processing apparatus according to the present embodiment, deterioration in the accuracy of the X-ray image is prevented even if the image processing apparatus according to the present embodiment processes only the parallel X-ray detection data corresponding to a specific layer of the target. -
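Putting steps S302 and S306 together, the divided pieces can be reconstituted concurrently. The sketch below uses a thread pool for brevity; the per-layer `reconstruct_layer` function is a placeholder for the successive-approximation reconstitution, and all names and sizes are illustrative assumptions rather than details of the embodiment.

```python
from concurrent.futures import ThreadPoolExecutor

import numpy as np

def reconstruct_layer(P):
    # Placeholder for the per-layer reconstitution of step S306; a real
    # implementation would iterate formula 1 (ML-EM) on this piece alone.
    return P.sum()

def parallel_reconstitution(projection_data):
    # S302: divide the projection data on a per layer basis ...
    pieces = [projection_data[:, k, :] for k in range(projection_data.shape[1])]
    # S306: ... and run the reconstitution for every piece in parallel;
    # the layers never exchange data, so no synchronization is needed.
    with ThreadPoolExecutor() as pool:
        return list(pool.map(reconstruct_layer, pieces))

images = parallel_reconstitution(np.ones((360, 8, 16)))
print(len(images))  # prints 8, one result per layer
```

A process pool or per-layer hardware could be substituted without changing the structure, precisely because each piece of divided projection data is independent.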
FIG. 7 is an explanatory diagram illustrating the processing performed in the image processing method according to the present embodiment. FIG. 7 illustrates an example in a helical scanning type CT apparatus, or in a non-helical scanning type CT apparatus, for example, of the correspondence among the position of the X-ray source, the position of the target through which the X-rays pass, and the position of the detector. Further, a person is shown in FIG. 7 as the target. - In a helical scanning type CT apparatus or a non-helical scanning type CT apparatus like that illustrated in
FIG. 7 , when an X-ray source that outputs cone beam X-rays is used as illustrated by B of FIG. 3, reconstitution processing like that illustrated in FIG. 2 is not performed until all of the scans of the target are finished. This is because, as described above, when reconstituting an X-ray image using an X-ray source that outputs cone beam X-rays, all of the voxels and the detectors that have an effect on each other are used in all the calculations even when trying to reconstitute the X-ray image corresponding to a given specific layer surface. - In contrast, in the projection data in which parallel X-ray detection data has been converted that is processed by the image processing apparatus according to the present embodiment, there is no mixing of data of the plurality of layers of the target, unlike the above-described projection data in which X-ray detection data representing a detection result of cone beam or fan beam X-rays has been converted. Therefore, as described above, a deterioration in the accuracy of the X-ray image is prevented even if the image processing apparatus according to the present embodiment processes only the parallel X-ray detection data corresponding to a specific layer of the target. Namely, a deterioration in the accuracy of the X-ray image to be formed does not occur even if successive X-ray images are formed each time projection data is acquired, for example.
- Accordingly, as the processing according to the third example of the image processing method according to the present embodiment, an example will be described of processing that can realize the successive processing according to the present embodiment.
-
FIG. 8 is a flow diagram illustrating a third example of the processing performed in the image processing method according to the present embodiment performed by the image processing apparatus according to the present embodiment. Here, FIG. 8 illustrates an example of the successive processing according to the present embodiment. - The image processing apparatus according to the present embodiment determines whether projection data has been acquired (S400). The image processing apparatus according to the present embodiment determines that projection data has been acquired if projection data transmitted from an external device has been received and the received projection data is read into the RAM or the like, or if projection data stored in a storage unit (described below) has been read from the storage unit (described below) and the read projection data is read into the RAM or the like, for example.
- If it is determined in step S400 that projection data has not been acquired, the image processing apparatus according to the present embodiment does not proceed to the next processing step until projection data is acquired.
- Further, if it is determined in step S400 that projection data has been acquired, the image processing apparatus according to the present embodiment sets an initial reconstituted image I0 in the same manner as in step S102 of
FIG. 1 (S402). - When the processing of step S402 has been performed, the image processing apparatus according to the present embodiment performs the reconstitution processing for forming an X-ray image in the same manner as in step S104 of
FIG. 1 (S404). - When the processing of step S404 has been performed, the image processing apparatus according to the present embodiment determines whether to finish the processing performed in the image processing method according to the present embodiment (S406). Here, when processing projection data based on parallel X-ray detection data representing a detection result in a helical scanning type CT apparatus or a non-helical scanning type CT apparatus like that illustrated in
FIG. 7 , the image processing apparatus according to the present embodiment determines, for example, to finish the processing performed in the image processing method according to the present embodiment when a signal indicating that scanning has finished, transmitted from an external device such as the CT apparatus, is received. Further, if projection data stored in a storage unit (described below) or the like is processed, the image processing apparatus according to the present embodiment determines to finish the processing performed in the image processing method according to the present embodiment when, for example, all of the projection data (e.g., projection data formed into groups based on metadata and the like) corresponding to a given target has been read from the storage unit (described below).
- The image processing apparatus according to the present embodiment performs the processing illustrated in
FIG. 8 , for example, as the processing performed in the image processing method according to the present embodiment. - Here, in the projection data that is processed by the image processing apparatus according to the present embodiment in the processing illustrated in
FIG. 8 , similar to the processing according to the first example illustrated in FIG. 1, there is no mixing of data of the plurality of layers of the target. Namely, similar to the processing according to the first example illustrated in FIG. 1, when performing the processing illustrated in FIG. 8, for example, the image processing apparatus according to the present embodiment does not have to perform processing to reduce the effects of a cone beam or a fan beam, such as geometric correction, distortion correction, noise removal and the like. Further, similar to the processing according to the first example illustrated in FIG. 1, the calculation amount relating to the reconstitution processing performed by the image processing apparatus according to the present embodiment is reduced far more than when processing the above-described projection data in which X-ray detection data representing a detection result of cone beam or fan beam X-rays has been converted. - Therefore, similar to the processing according to the first example illustrated in
FIG. 1 , by performing the processing illustrated in FIG. 8, for example, the image processing apparatus according to the present embodiment can achieve a higher quality X-ray image while reducing the calculation costs for forming an X-ray image even further. - In addition, similar to the processing according to the first example illustrated in
FIG. 1 , since there is no mixing of the data of the plurality of layers of the target in the projection data processed by the image processing apparatus according to the present embodiment, deterioration in the accuracy of the X-ray image is prevented even if only the parallel X-ray detection data corresponding to a specific layer of the target is processed. Namely, since the image processing apparatus according to the present embodiment can independently perform processing on each layer surface, in the processing illustrated in FIG. 8, the image processing apparatus according to the present embodiment forms successive X-ray images each time projection data is acquired. - Therefore, in a helical scanning type CT apparatus or a non-helical scanning type CT apparatus like that illustrated in
FIG. 7 , the image processing apparatus according to the present embodiment performs the calculations relating to the formation of the X-ray image, such as a reconstitution calculation, in order from the portions for which scanning has finished. For example, the calculations relating to the formation of the X-ray image can be completed substantially as the CT scanning finishes.
- Next, an example of the configuration of an image processing apparatus according to the present embodiment that is capable of performing the processing performed in the above-described image processing method according to the present embodiment will be described.
- Before describing an example of the configuration of the image processing apparatus according to the present embodiment, an example of the image processing system according to the present embodiment that has the image processing apparatus according to the present embodiment will be described.
FIG. 9 is an explanatory diagram illustrating an example of an image processing system 1000 according to the present embodiment. The image processing system 1000 has, for example, an image processing apparatus 100, an X-ray output apparatus 200, and a detection apparatus 300. - The
X-ray output apparatus 200 includes, for example, an X-ray source (not illustrated) for outputting parallel beam X-rays. Here, examples of the X-ray source included in the X-ray output apparatus 200 include an X-ray tube, which is an electron tube for generating X-rays, a collimator that forms parallel beam X-rays from X-rays generated by an X-ray tube, and a planar source in which a plurality of X-ray tubes are arranged on a flat face. - It is noted that the configuration of the
X-ray output apparatus 200 is not limited to that described above. For example, the X-ray output apparatus 200 is configured from an MPU (micro-processing unit), various processing circuits and the like. Further, the X-ray output apparatus 200 may also include a control unit (not illustrated) for controlling the generation of X-rays by the X-ray source, a ROM (read-only memory, not illustrated), a RAM (not illustrated) and the like. - Here, the ROM (not illustrated) included in the
X-ray output apparatus 200 stores control data, such as programs and calculation parameters used by the control unit (not illustrated) included in the X-ray output apparatus 200. The RAM (not illustrated) included in the X-ray output apparatus 200 temporarily stores programs, for example, that are executed by the control unit (not illustrated) included in the X-ray output apparatus 200. - The
detection apparatus 300, which includes a detection unit (not illustrated) that has a detector for detecting X-rays, for example, detects parallel beam X-rays and generates parallel X-ray detection data. - It is noted that the configuration of the
detection apparatus 300 is not limited to that described above. For example, the detection apparatus 300 is configured from an MPU, various processing circuits and the like. Further, the detection apparatus 300 may also include a processing unit (not illustrated) for converting parallel X-ray detection data into projection data, a ROM (read-only memory, not illustrated), a RAM (not illustrated), a communication unit and the like. - Here, the ROM (not illustrated) included in the
detection apparatus 300 stores control data, such as programs and calculation parameters used by the control unit (not illustrated) included in the detection apparatus 300. The RAM (not illustrated) included in the detection apparatus 300 temporarily stores programs, for example, that are executed by the control unit (not illustrated) included in the detection apparatus 300. - The communication unit (not illustrated) included in the
detection apparatus 300 is a communication device included in the detection apparatus 300, which has the role of performing wireless/wired communication with an external device, such as the image processing apparatus 100, via a network (or directly). Here, examples of the communication unit (not illustrated) included in the detection apparatus 300 include a communication antenna and an RF (radio frequency) circuit (wireless communication), an IEEE 802.15.1 port and a transmitting/receiving circuit (wireless communication), an IEEE 802.11b port and a transmitting/receiving circuit (wireless communication), or a LAN (local area network) terminal and a transmitting/receiving circuit (wired communication) and the like. Further examples of the communication unit (not illustrated) included in the detection apparatus 300 include a configuration that supports an arbitrary standard capable of performing communication, such as a USB (universal serial bus) terminal and a transmitting/receiving circuit, and an arbitrary configuration capable of communicating with an external device via a network. Examples of the network according to the embodiment of the present disclosure include a wired network such as a LAN or a WAN (wide area network), a wireless network such as a wireless LAN (wireless local area network) or a wireless WAN (wireless wide area network) via a base station, and the Internet using a communication protocol such as TCP/IP (transmission control protocol/internet protocol) and the like. - The
detection apparatus 300 transmits to the image processing apparatus 100, for example, the generated parallel X-ray detection data or projection data in which parallel X-ray detection data has been converted. - The
image processing apparatus 100 forms an X-ray image based on parallel X-ray detection data by performing the above-described processing performed in the image processing method according to the present embodiment, and processing parallel X-ray detection data or projection data in which parallel X-ray detection data has been converted. - Here, the
image processing apparatus 100 processes, for example, parallel X-ray detection data transmitted from the detection apparatus 300, or projection data transmitted from the detection apparatus 300 in which parallel X-ray detection data has been converted. It is noted that the image processing apparatus 100 can process, for example, parallel X-ray detection data stored in a storage unit (described below) or the like, or projection data stored in the storage unit (described below) or the like in which parallel X-ray detection data has been converted. Examples of parallel X-ray detection data stored in the storage unit (described below) or the like include parallel X-ray detection data generated by the detection apparatus 300 and parallel X-ray detection data generated by an external device other than the detection apparatus 300. Further, examples of projection data stored in the storage unit (described below) or the like in which parallel X-ray detection data has been converted include projection data converted by the detection apparatus 300 and projection data generated by an external device other than the detection apparatus 300. - The
image processing system 1000 has, for example, the configuration illustrated in FIG. 9. In the image processing system 1000 illustrated in FIG. 9, the image processing apparatus 100 forms an X-ray image based on parallel X-ray detection data by performing the above-described processing performed in the image processing method according to the present embodiment. Therefore, based on the configuration illustrated in FIG. 9, for example, an image processing system is realized that can achieve a higher quality X-ray image while reducing the calculation costs for forming an X-ray image even further. - It is noted that the image processing system according to the present embodiment is not limited to the configuration illustrated in
FIG. 9 . For example, in the image processing system according to the present embodiment, the X-ray output apparatus 200 and the detection apparatus 300 may be an integrated apparatus, like a CT apparatus that utilizes X-rays or an apparatus having a tomosynthesis function in which X-rays are utilized. Further, if the X-ray output apparatus 200 and the detection apparatus 300 are an integrated apparatus, such an apparatus may include a gantry that has a rotary motor, for example. - Next, an example of the configuration of the image processing apparatus according to the present embodiment will be described using the
image processing apparatus 100 configuring the image processing system 1000 illustrated in FIG. 9 as an example. -
FIG. 10 is a block diagram illustrating an example of a configuration of an image processing apparatus 100 according to an embodiment of the present disclosure. The image processing apparatus 100 includes, for example, a communication unit 102 and a control unit 104. - Further, the
image processing apparatus 100 may also include, for example, a ROM (not illustrated), a RAM (not illustrated), a storage unit (not illustrated), a user-operable operation unit (not illustrated), a display unit (not illustrated) that displays various screens on a display screen and the like. The image processing apparatus 100 connects these constituent elements to each other with a bus that serves as a data transmission path. - Here, the ROM (not illustrated) stores control data, such as programs and calculation parameters used by the
control unit 104. The RAM (not illustrated) temporarily stores programs and the like that are executed by thecontrol unit 104. - The storage unit (not illustrated) is a storage device included in the
image processing apparatus 100, which stores, for example, various data such as X-ray detection data, projection data in which X-ray detection data has been converted, and applications. Here, examples of the storage unit (not illustrated) include magnetic recording media such as a hard disk, non-volatile memory such as flash memory and the like. Further, the storage unit (not illustrated) may be detachable from theimage processing apparatus 100. - In addition, examples of the operation unit (not illustrated) include the below-described operation input device. Examples of the display unit (not illustrated) may include the below-described display device.
-
FIG. 11 is an explanatory diagram illustrating an example of a hardware configuration of the image processing apparatus 100 according to an embodiment of the present disclosure. The image processing apparatus 100 includes, for example, an MPU 150, a ROM 152, a RAM 154, a recording medium 156, an input/output interface 158, an operation input device 160, a display device 162, and a communication interface 164. Further, the image processing apparatus 100 connects these constituent elements to each other with a bus 166 that serves as a data transmission path. - The
MPU 150 is configured from, for example, an MPU, various processing circuits and the like. The MPU 150 functions as the control unit 104 for controlling the whole image processing apparatus 100. Further, in the image processing apparatus 100, the MPU 150 plays the role of, for example, the below-described processing unit 110. - The
ROM 152 stores control data, such as programs and calculation parameters used by the MPU 150. The RAM 154 temporarily stores programs and the like, for example, that are executed by the MPU 150. - The
recording medium 156 functions as a storage unit, which stores, for example, various data such as X-ray detection data, projection data in which X-ray detection data has been converted, and applications. Here, examples of the recording medium 156 include magnetic recording media such as a hard disk, non-volatile memory such as flash memory and the like. Further, the recording medium 156 may be detachable from the image processing apparatus 100. - The input/
output interface 158, for example, connects theoperation input device 160 and thedisplay device 162. Theoperation input device 160 functions as an operation unit (not illustrated), and thedisplay device 162 functions as a display unit (not illustrated). Here, examples of the input/output interface 158 includes a USB terminal, a DVI (digital visual interface) terminal, a HDMI (high-definition multimedia interface) terminal, various processing circuits and the like. Further, theoperation input device 160 is, for example, included on theimage processing apparatus 100, and is connected with the input/output interface 158 in theimage processing apparatus 100. Examples of theoperation input device 160 include a button, a direction key, a rotating-type selector such as a jog dial, or a combination of these. Further, thedisplay device 162 is, for example, included on theimage processing apparatus 100, and is connected with the input/output interface 158 in theimage processing apparatus 100. Examples of the input/output interface 158 include a liquid crystal display (LCD), an organic EL display (organic electroluminescence display, also called an OLED (organic light emitting diode display)) and the like. - It is noted that the input/
output interface 158 can obviously also be connected to an external device, such as an operation input device (e.g., a keyboard, a mouse etc.) or a display device, as an external device of the image processing apparatus 100. Further, the display device 162 may also be a device on which both display and user operation are possible. - The
communication interface 164 is a communication unit included in the image processing apparatus 100, which functions as the communication unit 102 for performing wireless/wired communication with the detection apparatus 300 or an external device, such as a server, via a network (or directly). Here, examples of the communication interface 164 include a communication antenna and an RF circuit (wireless communication), an IEEE 802.15.1 port and a transmitting/receiving circuit (wireless communication), an IEEE 802.11b port and a transmitting/receiving circuit (wireless communication), or a LAN (local area network) terminal and a transmitting/receiving circuit (wired communication) and the like. - The
image processing apparatus 100 performs the processing performed in the image processing method according to the present embodiment based on the configuration illustrated in FIG. 11, for example. However, the hardware configuration of the image processing apparatus 100 according to the present embodiment is not limited to the configuration illustrated in FIG. 11. For example, if the image processing apparatus 100 performs the processing as a standalone configuration, the image processing apparatus 100 may be configured without the communication interface 164. In addition, the image processing apparatus 100 may also be configured without the operation input device 160 or the display device 162. - An example of the configuration of the
image processing apparatus 100 will be described again with reference to FIG. 10. The communication unit 102 is a communication unit included in the image processing apparatus 100, which performs wireless/wired communication with the detection apparatus 300 or an external device, such as a server, via a network (or directly). Further, communication by the communication unit 102 is controlled by the control unit 104, for example. Here, examples of the communication unit 102 include a communication antenna and an RF (radio frequency) circuit, a LAN terminal, a transmitting/receiving circuit, and the like. However, the configuration of the communication unit 102 is not limited to these examples. For example, the communication unit 102 may have a configuration that supports an arbitrary standard that is capable of performing communication, such as a USB terminal and a transmitting/receiving circuit, or an arbitrary configuration that is capable of communicating with an external device via a network. - The
control unit 104 is configured from an MPU, for example, and plays the role of controlling the whole image processing apparatus 100. Further, the control unit 104, which includes, for example, the processing unit 110, plays the lead role in the processing performed in the image processing method according to the present embodiment. - The
processing unit 110, which plays the lead role in the processing performed in the image processing method according to the present embodiment, forms an X-ray image based on X-ray detection data by processing projection data in which parallel X-ray detection data (X-ray detection data representing a detection result of parallel beam X-rays output from an X-ray source) has been converted by projection. More specifically, the processing unit 110 forms an X-ray image based on X-ray detection data by performing the processing according to the above-described first example, the processing according to the above-described second example, or the processing according to the above-described third example. - Here, for example, if it is possible to directly process the projection data in which parallel X-ray detection data has been converted, such as when the
communication unit 102 has received projection data in which parallel X-ray detection data has been converted, the processing unit 110 processes that projection data. Further, for example, in the case of processing the parallel X-ray detection data, such as when the communication unit 102 has received parallel X-ray detection data, the processing unit 110 converts the parallel X-ray detection data into projection data and processes the converted projection data. It is noted that the processing unit 110 can also process parallel X-ray detection data stored in a storage unit (described below) or the like, or projection data stored in the storage unit (described below) or the like in which parallel X-ray detection data has been converted. - The
control unit 104 plays the lead role in the processing performed in the image processing method according to the present embodiment because it includes the processing unit 110, for example. - The
image processing apparatus 100 performs the processing performed in the image processing method according to the present embodiment based on the configuration illustrated in FIG. 10, for example. - Therefore, the
image processing apparatus 100 can achieve a higher quality X-ray image while further reducing the calculation costs for forming an X-ray image. Further, when performing the processing according to the above-described first, second, or third example, the image processing apparatus 100 achieves the effects gained from performing the processing according to that example. - It is noted that the configuration of the image processing apparatus according to the present embodiment is not limited to the configuration illustrated in
FIG. 10. - For example, the image processing apparatus according to the present embodiment may further include a detection unit (not illustrated) having a similar function and configuration to the
detection apparatus 300 illustrated in FIG. 9. The detection unit (not illustrated) detects parallel beam X-rays and generates parallel X-ray detection data, for example. Further, the detection unit (not illustrated) may also have a function for, for example, detecting parallel beam X-rays, generating parallel X-ray detection data, and converting the generated parallel X-ray detection data into projection data by projection. - When the detection unit (not illustrated) detects parallel beam X-rays and generates parallel X-ray detection data, for example, the
processing unit 110 converts the X-ray detection data generated by the detection unit (not illustrated) into projection data by projection, and processes the converted projection data. Further, in the case of the detection unit (not illustrated) detecting parallel beam X-rays, generating parallel X-ray detection data, and converting the generated parallel X-ray detection data into projection data by projection, the processing unit 110 processes the projection data converted by the detection unit (not illustrated). It is noted that the processing unit 110 according to the first modified example of the present embodiment may also process parallel X-ray detection data stored in a storage unit (described below), or projection data in which parallel X-ray detection data stored in a storage unit (described below) has been converted. - Similar to the
image processing apparatus 100 illustrated in FIG. 10, even if it further includes a detection unit (not illustrated), the image processing apparatus according to the first modified example of the present embodiment can perform the processing performed in the image processing method according to the present embodiment. Therefore, the image processing apparatus according to the first modified example of the present embodiment can obtain the same effects as the image processing apparatus 100 illustrated in FIG. 10. - The image processing apparatus according to the present embodiment may further include, in addition to the configuration of the image processing apparatus according to the first modified example of the present embodiment, an X-ray output unit (not illustrated) having a similar function and configuration to the
X-ray output apparatus 200 illustrated in FIG. 9. The X-ray output unit (not illustrated) has an X-ray source that outputs parallel beam X-rays, for example. Further, the generation of the X-rays in the X-ray output unit (not illustrated) is controlled by the control unit 104, for example. - With the image processing apparatus according to the second modified example of the present embodiment, which in addition to the configuration illustrated in
FIG. 10, further includes an X-ray output unit (not illustrated) and a detection unit (not illustrated), the detection unit (not illustrated) can detect parallel beam X-rays output from the X-ray output unit (not illustrated), for example, and the processing unit 110 can process projection data in which parallel X-ray detection data representing a detection result has been converted by the detection unit (not illustrated). It is noted that the processing unit 110 according to the second modified example of the present embodiment can also process parallel X-ray detection data stored in a storage unit (described below) or the like, or projection data stored in the storage unit (described below) or the like in which parallel X-ray detection data has been converted. - Therefore, even with a configuration that additionally includes an X-ray output unit (not illustrated) and a detection unit (not illustrated), similar to the
image processing apparatus 100 illustrated in FIG. 10, the image processing apparatus according to the second modified example of the present embodiment can perform the processing performed in the image processing method according to the present embodiment. - Therefore, the image processing apparatus according to the second modified example of the present embodiment can obtain the same effects as the
image processing apparatus 100 illustrated in FIG. 10. - When the image processing apparatus according to the present embodiment performs processing as a standalone configuration, for example, the image processing apparatus according to the present embodiment may be configured without the
communication unit 102. - Although an image processing apparatus was described above as an embodiment of the present disclosure, the present embodiment is not limited to this example. The present embodiment can also be used in various devices that are capable of processing an image, such as a computer like a PC (personal computer) or a server, a CT apparatus (an apparatus that uses 360° direction projection data), an apparatus having a tomosynthesis function (an apparatus that uses projection data of a controlled angle direction, such as 180° direction projection data), a communications device such as a smartphone and the like. Further, the present embodiment can also be applied in a processing IC (integrated circuit) that can be incorporated in such devices.
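The description above repeatedly refers to projection data "in which X-ray detection data has been converted by projection". The patent does not specify that conversion, but as an illustrative sketch only, a conventional way to turn detected intensities into line-integral projection values assumes the Beer-Lambert law; the function name and array shapes below are hypothetical:

```python
import numpy as np

def detection_to_projection(intensity, i0):
    """Convert detected X-ray intensities into projection (line-integral) values.

    Under the Beer-Lambert law, I = I0 * exp(-p) along each ray, so the
    projection value recovered per detector bin is p = ln(I0 / I).
    Intensities are clipped to avoid taking the log of zero on fully
    attenuated rays.
    """
    intensity = np.clip(np.asarray(intensity, dtype=float), 1e-12, None)
    return np.log(i0 / intensity)
```

One row of such values per projection angle yields the sinogram-style projection data that a processing unit could then operate on.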
- By executing, on a computer, a program that causes the computer to function as the image processing apparatus according to the present embodiment (e.g., a program capable of executing the processing performed in the image processing method according to the present embodiment, such as a program that causes the computer to function as the processing unit 110 illustrated in FIG. 10), a higher quality X-ray image can be achieved while further reducing the calculation costs for forming an X-ray image. - It should be understood by those skilled in the art that various modifications, combinations, sub-combinations and alterations may occur depending on design requirements and other factors insofar as they are within the scope of the appended claims or the equivalents thereof.
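As a concrete, purely illustrative sketch of forming an X-ray image from parallel-beam projection data, the classical filtered back-projection algorithm can be written as below. This is a textbook baseline, not the patent's actual processing, and the function name and conventions are assumptions:

```python
import numpy as np

def filtered_back_projection(sinogram, angles_deg):
    """Reconstruct an image from parallel-beam projection data (a sinogram).

    sinogram: 2-D array, one row per projection angle, one column per detector bin.
    angles_deg: projection angles in degrees, one per sinogram row.
    """
    n_angles, n_bins = sinogram.shape

    # Ramp (Ram-Lak) filter applied per projection in the frequency domain.
    freqs = np.fft.fftfreq(n_bins)
    ramp = np.abs(freqs)
    filtered = np.real(np.fft.ifft(np.fft.fft(sinogram, axis=1) * ramp, axis=1))

    # Back-project each filtered projection over the image grid.
    image = np.zeros((n_bins, n_bins))
    center = n_bins // 2
    ys, xs = np.mgrid[0:n_bins, 0:n_bins] - center
    for row, theta in zip(filtered, np.deg2rad(angles_deg)):
        # Detector coordinate of each pixel for this angle (parallel rays).
        t = xs * np.cos(theta) + ys * np.sin(theta) + center
        t = np.clip(np.round(t).astype(int), 0, n_bins - 1)
        image += row[t]
    return image * np.pi / (2 * n_angles)
```

With 360-degree-direction projection data this corresponds to the CT case mentioned above; restricting `angles_deg` to a controlled range corresponds loosely to the tomosynthesis case.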
- For example, although a program (computer program) that makes a computer function as the image processing apparatus according to the present embodiment was described above, the present embodiment can further provide a recording medium in which this program is stored.
- The above-described configuration illustrates one example of the present embodiment, and naturally comes under the technical scope of an embodiment according to the present disclosure.
- Additionally, the present technology may also be configured as below.
- (1) An image processing apparatus including:
- a processing unit configured to process projection data in which X-ray detection data representing a detection result of parallel beam X-rays output from an X-ray source has been converted by projection, and form an X-ray image based on the X-ray detection data.
- (2) The image processing apparatus according to (1), wherein the processing unit is configured to divide the projection data and form the X-ray image per piece of divided projection data.
- (3) The image processing apparatus according to (1), wherein the processing unit is configured to form the X-ray image each time the projection data is acquired.
- (4) The image processing apparatus according to any one of (1) to (3), further including: - a detection unit configured to detect the parallel beam X-rays, generate the X-ray detection data, and convert the generated X-ray detection data into the projection data,
- wherein the processing unit is configured to process the projection data converted by the detection unit.
- (5) The image processing apparatus according to any one of (1) to (3), further including:
- a detection unit configured to detect the parallel beam X-rays and generate the X-ray detection data,
- wherein the processing unit is configured to convert the X-ray detection data generated by the detection unit into the projection data, and process the converted projection data.
- (6) The image processing apparatus according to (4) or (5), further including:
- an X-ray output unit that includes the X-ray source for outputting the parallel beam X-rays.
- (7) An image processing method including:
- processing projection data in which X-ray detection data representing a detection result of parallel beam X-rays output from an X-ray source has been converted by projection, and forming an X-ray image based on the X-ray detection data.
- (8) An image processing system including:
- an X-ray output apparatus that includes an X-ray source for outputting parallel beam X-rays;
- a detection apparatus configured to detect the parallel beam X-rays, generate X-ray detection data representing a detection result of the parallel beam X-rays, and convert the generated X-ray detection data into projection data by projection; and
- an image processing apparatus that includes a processing unit configured to process projection data in which the X-ray detection data has been converted, and form an X-ray image based on the X-ray detection data.
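Statements (2) and (3) above describe dividing the projection data and forming the image per piece of divided data, or forming the image each time projection data is acquired. A minimal sketch of that per-piece formation, using plain unfiltered back-projection as a stand-in for the actual processing; the class and method names are hypothetical:

```python
import numpy as np

class IncrementalBackProjector:
    """Accumulate a reconstruction as pieces of projection data arrive.

    A toy sketch of statements (2) and (3): the image is updated per piece of
    (projection row, angle) data instead of waiting for the full data set, so
    an intermediate image is available at any time.
    """

    def __init__(self, size):
        self.size = size
        self.image = np.zeros((size, size))
        center = size // 2
        # Precompute pixel coordinates relative to the image center.
        self._ys, self._xs = np.mgrid[0:size, 0:size] - center
        self._center = center

    def add_projection(self, row, angle_deg):
        theta = np.deg2rad(angle_deg)
        # Detector bin hit by each pixel for this angle (parallel rays).
        t = self._xs * np.cos(theta) + self._ys * np.sin(theta) + self._center
        t = np.clip(np.round(t).astype(int), 0, self.size - 1)
        self.image += row[t]  # image now reflects all data received so far
        return self.image
```

Each call to `add_projection` smears one acquired projection back across the grid, so the reconstruction sharpens progressively as more pieces of projection data are processed.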
Claims (8)
1. An information processing apparatus comprising:
circuitry configured to:
process projection data in which X-ray detection data representing a detection result of X-ray beam output from an X-ray source through a layer of a target has been converted by projection; and
form reconstruction data from the processed projection data corresponding to the layer.
2. The information processing apparatus according to claim 1 , wherein the circuitry is configured to divide the projection data and form the reconstruction data for each piece of the divided projection data.
3. The information processing apparatus according to claim 1 , wherein the circuitry is configured to form an X-ray image based on the reconstruction data each time the projection data is acquired.
4. The information processing apparatus according to claim 1 , wherein the circuitry is configured to:
detect the X-ray beam, generate the X-ray detection data, and convert the generated X-ray detection data into the projection data by projection, and
process the converted projection data.
5. The information processing apparatus according to claim 4 , wherein the circuitry is configured to:
detect parallel X-ray beam and generate the X-ray detection data based on the detected parallel X-ray beam.
6. The information processing apparatus according to claim 4 , further comprising:
an X-ray output unit that includes the X-ray source for outputting the X-ray beam.
7. An information processing method comprising:
processing projection data in which X-ray detection data representing a detection result of X-ray beam output from an X-ray source through a layer of a target has been converted by projection and forming reconstruction data from the projection data corresponding to the layer.
8. An information processing system comprising:
an X-ray output apparatus that includes an X-ray source for outputting X-ray beam; and
circuitry configured to:
detect X-ray beam, generate X-ray detection data representing a detection result of the X-ray beam through a layer of a target, and convert the generated X-ray detection data corresponding to the layer into projection data by projection; and
process the projection data and form reconstruction data from the projection data corresponding to the layer.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US14/995,133 US20160125627A1 (en) | 2012-08-24 | 2016-01-13 | Image processing apparatus, image processing method, and image processing system |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2012-185292 | 2012-08-24 | ||
JP2012185292A JP2014042564A (en) | 2012-08-24 | 2012-08-24 | Image processing apparatus, image processing method, and image processing system |
US13/968,469 US9239301B2 (en) | 2012-08-24 | 2013-08-16 | Image processing apparatus, image processing method, and image processing system |
US14/995,133 US20160125627A1 (en) | 2012-08-24 | 2016-01-13 | Image processing apparatus, image processing method, and image processing system |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/968,469 Continuation US9239301B2 (en) | 2012-08-24 | 2013-08-16 | Image processing apparatus, image processing method, and image processing system |
Publications (1)
Publication Number | Publication Date |
---|---|
US20160125627A1 true US20160125627A1 (en) | 2016-05-05 |
Family
ID=50147991
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/968,469 Expired - Fee Related US9239301B2 (en) | 2012-08-24 | 2013-08-16 | Image processing apparatus, image processing method, and image processing system |
US14/995,133 Abandoned US20160125627A1 (en) | 2012-08-24 | 2016-01-13 | Image processing apparatus, image processing method, and image processing system |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/968,469 Expired - Fee Related US9239301B2 (en) | 2012-08-24 | 2013-08-16 | Image processing apparatus, image processing method, and image processing system |
Country Status (3)
Country | Link |
---|---|
US (2) | US9239301B2 (en) |
JP (1) | JP2014042564A (en) |
CN (1) | CN103622716A (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109523517A (en) * | 2018-10-23 | 2019-03-26 | 北京深睿博联科技有限责任公司 | Heterogeneous characteristic processing method and device for neurological disease image |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2014042564A (en) * | 2012-08-24 | 2014-03-13 | Sony Corp | Image processing apparatus, image processing method, and image processing system |
WO2014192831A1 (en) * | 2013-05-28 | 2014-12-04 | 株式会社東芝 | Medical diagnostic imaging equipment and control method |
CN204392310U (en) * | 2015-02-13 | 2015-06-10 | 武汉数字派特科技有限公司 | A kind of PET communication system based on all-IP |
US10628736B2 (en) * | 2015-09-24 | 2020-04-21 | Huron Technologies International Inc. | Systems and methods for barcode annotations for digital images |
CA3118014A1 (en) | 2018-11-05 | 2020-05-14 | Hamid Reza Tizhoosh | Systems and methods of managing medical images |
US11610395B2 (en) | 2020-11-24 | 2023-03-21 | Huron Technologies International Inc. | Systems and methods for generating encoded representations for multiple magnifications of image data |
Citations (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US3971948A (en) * | 1973-08-06 | 1976-07-27 | Siemens Aktiengesellschaft | X-ray diagnostic apparatus for producing a transverse layer image |
US4282438A (en) * | 1977-02-14 | 1981-08-04 | Tokyo Shibaura Electric Co., Ltd. | Computed tomography apparatus and method using penetrating radiation |
US4481650A (en) * | 1981-04-24 | 1984-11-06 | Instrumentarium Oy | Installation for producing radiographic layer images |
US4891829A (en) * | 1986-11-19 | 1990-01-02 | Exxon Research And Engineering Company | Method and apparatus for utilizing an electro-optic detector in a microtomography system |
US5109276A (en) * | 1988-05-27 | 1992-04-28 | The University Of Connecticut | Multi-dimensional multi-spectral imaging system |
USRE34160E (en) * | 1977-06-29 | 1993-01-12 | Emi Limited | Medical radiographic apparatus |
US20040264634A1 (en) * | 2003-06-25 | 2004-12-30 | General Electric Company | Fourier based method, apparatus, and medium for optimal reconstruction in digital tomosynthesis |
US20090022264A1 (en) * | 2007-07-19 | 2009-01-22 | Zhou Otto Z | Stationary x-ray digital breast tomosynthesis systems and related methods |
US20100255213A1 (en) * | 2009-04-02 | 2010-10-07 | Fei Company | Method for Forming Microscopic 3D Structures |
US20110019799A1 (en) * | 2009-07-24 | 2011-01-27 | Nucsafe, Inc. | Spatial sequenced backscatter portal |
US20140056402A1 (en) * | 2012-08-24 | 2014-02-27 | Sony Corporation | Image processing apparatus, image processing method, and image processing system |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002125966A (en) * | 2000-10-25 | 2002-05-08 | Hitachi Medical Corp | X-ray ct apparatus |
DE10337935A1 (en) * | 2003-08-18 | 2005-03-17 | Siemens Ag | Device for recording structural data of an object |
JP4434698B2 (en) * | 2003-11-13 | 2010-03-17 | 株式会社東芝 | X-ray CT system |
JP4840876B2 (en) * | 2005-02-28 | 2011-12-21 | 大学共同利用機関法人 高エネルギー加速器研究機構 | Three-dimensional image synthesis method and apparatus |
JP2006255089A (en) * | 2005-03-16 | 2006-09-28 | Toshiba Corp | X-ray computer tomography apparatus |
KR101110712B1 (en) * | 2006-11-09 | 2012-02-24 | 캐논 가부시끼가이샤 | Radiographic imaging control apparatus using multi radiation generating apparatus |
JP5461803B2 (en) * | 2008-08-22 | 2014-04-02 | 株式会社東芝 | X-ray CT system |
JP5661483B2 (en) * | 2011-01-17 | 2015-01-28 | 株式会社東芝 | Medical diagnostic imaging equipment |
-
2012
- 2012-08-24 JP JP2012185292A patent/JP2014042564A/en active Pending
-
2013
- 2013-08-16 CN CN201310359747.1A patent/CN103622716A/en active Pending
- 2013-08-16 US US13/968,469 patent/US9239301B2/en not_active Expired - Fee Related
-
2016
- 2016-01-13 US US14/995,133 patent/US20160125627A1/en not_active Abandoned
Patent Citations (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US3971948A (en) * | 1973-08-06 | 1976-07-27 | Siemens Aktiengesellschaft | X-ray diagnostic apparatus for producing a transverse layer image |
US4282438A (en) * | 1977-02-14 | 1981-08-04 | Tokyo Shibaura Electric Co., Ltd. | Computed tomography apparatus and method using penetrating radiation |
USRE34160E (en) * | 1977-06-29 | 1993-01-12 | Emi Limited | Medical radiographic apparatus |
US4481650A (en) * | 1981-04-24 | 1984-11-06 | Instrumentarium Oy | Installation for producing radiographic layer images |
US4891829A (en) * | 1986-11-19 | 1990-01-02 | Exxon Research And Engineering Company | Method and apparatus for utilizing an electro-optic detector in a microtomography system |
US5109276A (en) * | 1988-05-27 | 1992-04-28 | The University Of Connecticut | Multi-dimensional multi-spectral imaging system |
US20040264634A1 (en) * | 2003-06-25 | 2004-12-30 | General Electric Company | Fourier based method, apparatus, and medium for optimal reconstruction in digital tomosynthesis |
US20090022264A1 (en) * | 2007-07-19 | 2009-01-22 | Zhou Otto Z | Stationary x-ray digital breast tomosynthesis systems and related methods |
US20100255213A1 (en) * | 2009-04-02 | 2010-10-07 | Fei Company | Method for Forming Microscopic 3D Structures |
US20110019799A1 (en) * | 2009-07-24 | 2011-01-27 | Nucsafe, Inc. | Spatial sequenced backscatter portal |
US20140056402A1 (en) * | 2012-08-24 | 2014-02-27 | Sony Corporation | Image processing apparatus, image processing method, and image processing system |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109523517A (en) * | 2018-10-23 | 2019-03-26 | 北京深睿博联科技有限责任公司 | Heterogeneous characteristic processing method and device for neurological disease image |
Also Published As
Publication number | Publication date |
---|---|
US9239301B2 (en) | 2016-01-19 |
JP2014042564A (en) | 2014-03-13 |
US20140056402A1 (en) | 2014-02-27 |
CN103622716A (en) | 2014-03-12 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20160125627A1 (en) | Image processing apparatus, image processing method, and image processing system | |
US9536336B2 (en) | Image processing apparatus, image processing method, and program | |
US10839483B2 (en) | Method of converting low-resolution image to high-resolution image and image conversion device performing method | |
US10152951B2 (en) | Method and system for interactive control of window/level parameters of multi-image displays | |
US20180061097A1 (en) | Image generation apparatus, image generation method, and x-ray ct apparatus | |
US11615270B2 (en) | Medical image processing apparatus, learning method, X-ray diagnostic apparatus, and medical image processing method | |
US11557071B2 (en) | Systems and methods for determining at least one artifact calibration coefficient | |
US9208540B2 (en) | Image producing method, image producing apparatus and radiation tomographic imaging apparatus, and program | |
US11875434B2 (en) | Systems and methods for correcting projection images in computed tomography image reconstruction | |
US11361480B2 (en) | System and method for 3D image reconstruction from axial step-and-shoot CT | |
US9870638B2 (en) | Appearance transfer techniques | |
CN103310471B (en) | CT video generation device and method, CT image generation system | |
Xie et al. | High through-plane resolution CT imaging with self-supervised deep learning | |
CN104182932A (en) | CT (Computed Tomography) device, CT image system and CT image generation method | |
US20150335306A1 (en) | System and method for ultra-high resolution tomographic imaging | |
US20150154757A1 (en) | Image processor, treatment system, and image processing method | |
US10242440B2 (en) | Doseless emission tomography attenuation correction | |
US11521336B2 (en) | Systems and methods for correcting projection images in computed tomography image reconstruction | |
CN105809723B (en) | CBCT method for reconstructing and system | |
CN111080523A (en) | Infrared panoramic search system and infrared panoramic image splicing method based on angle information | |
Wei et al. | A neighborhood standard deviation based algorithm for generating PET crystal position maps | |
CN111080734B (en) | Method and terminal for processing Positron Emission Tomography (PET) data | |
US9472001B2 (en) | Image processor, image reconstruction method, and radiation imaging apparatus | |
CN110084866B (en) | Computed tomography method and device | |
US20190307411A1 (en) | Radiographic image processing apparatus, radiographic image processing method, and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO PAY ISSUE FEE |