JP5563018B2 - Radiographic imaging apparatus


Publication number
JP5563018B2
Authority
JP
Japan
Prior art keywords
image
image signal
projection
converted
processing unit
Prior art date
Legal status
Active
Application number
JP2012129947A
Other languages
Japanese (ja)
Other versions
JP2013031641A (en)
Inventor
Sadato Akahori
Original Assignee
FUJIFILM Corporation
Priority date
Filing date
Publication date
Priority to JP2011149411
Application filed by FUJIFILM Corporation
Priority to JP2012129947A
Priority claimed from EP12172019A (EP2535872A1)
Publication of JP2013031641A
Publication of JP5563018B2
Application granted
Legal status: Active


Description

  The present invention relates to a radiation imaging apparatus that captures a plurality of projection images (radiation images) by tomosynthesis imaging, and reconstructs a tomographic image at a predetermined cross section of a subject from the plurality of projection images thus captured.

  In tomosynthesis imaging, for example, the subject is irradiated with radiation from different angles while the radiation source is moved in one direction, and the radiation that has passed through the subject is detected by a radiation detector, so that a plurality of projection images is captured in succession. The tomographic image in a predetermined cross section of the subject is then reconstructed by shifting the captured projection images so that the positions of the structure of interest coincide and superimposing the corresponding pixels.

  Tomosynthesis imaging thus exploits the fact that the apparent positions of structures in the projection images depend on the imaging angle: by adding the projection images while shifting them so that the target structure on the desired tomographic plane is aligned, a tomographic image in which that structure is emphasized is obtained.

  However, because the range of imaging angles in tomosynthesis is limited, the separation capability in the depth direction is also limited, and artifacts caused by structures other than the target structure may appear in the reconstructed tomographic image.

  Here, as prior art documents considered relevant to the present invention, Patent Document 1 describes, for measuring the amount of movement between two subject images, dividing one image into small regions (for example, of 4 × 6 pixels each) and calculating the similarity between the two images for each small region. Patent Document 2 describes determining a sub-region within a search region that has high similarity to the image data in a template, using a normalized cross-correlation method.

  In addition, in order to improve diagnostic performance, it has been proposed to apply image processing to the reconstructed tomographic image, such as frequency enhancement processing that emphasizes high-frequency components and frequency suppression processing that suppresses low-frequency components such as flow images contained in the tomographic image (see Patent Document 3).

  Here, the frequency enhancement process subtracts the unsharp mask image signal (the image signal of a locally averaged version of the tomographic image) from the image signal of the tomographic image to create the image signal of a frequency image containing the high-frequency component; this frequency image signal is multiplied by an enhancement coefficient and added back to the image signal of the tomographic image (see Patent Documents 4 and 5). The high-frequency component of the tomographic image is thereby emphasized.
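As an illustrative sketch (not part of the disclosure), the enhancement step described above can be written with a simple box-filter average standing in for the unsharp mask; the function name, kernel size, and enhancement coefficient are assumptions:

```python
import numpy as np

def enhance_high_freq(image, kernel_size=5, beta=1.5):
    """Unsharp-mask frequency enhancement: subtract a local-mean (unsharp
    mask) image to isolate the high-frequency component, scale it by an
    enhancement coefficient beta, and add it back to the original."""
    pad = kernel_size // 2
    padded = np.pad(image, pad, mode="edge")
    # Local mean via a simple box filter (the "unsharp mask" image).
    unsharp = np.zeros_like(image, dtype=float)
    for dy in range(kernel_size):
        for dx in range(kernel_size):
            unsharp += padded[dy:dy + image.shape[0], dx:dx + image.shape[1]]
    unsharp /= kernel_size ** 2
    high_freq = image - unsharp        # frequency image (high-frequency component)
    return image + beta * high_freq    # tomographic image with emphasized detail

flat = np.full((8, 8), 100.0)          # uniform region: nothing to enhance
step = np.zeros((8, 8))
step[:, 4:] = 10.0                     # vertical edge: gets steepened
out_flat = enhance_high_freq(flat)
out_step = enhance_high_freq(step)
```

On the uniform image the output is unchanged; at the edge the output overshoots above the original maximum and undershoots below zero, which illustrates why over-emphasis of high-contrast signals becomes a problem for linear high-frequency filters.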

  The flow image is an obstructing shadow that appears in the tomographic image along the moving direction of the radiation source, caused by a portion outside the focal section to be imaged where the transmitted radiation dose changes greatly (see Patent Document 6). In the frequency suppression process, for example, the low-frequency component corresponding to the flow image is removed from the reconstructed tomographic image to generate an image from which the flow image has been removed.

  As a way of performing image processing such as the frequency enhancement and frequency suppression described above, the filtered back-projection method, which applies filter correction processing at the time of tomographic image reconstruction, is known. In this filter correction processing, a ramp filter having the linear frequency characteristic shown in FIG. 17(A), or the product of this ramp filter and a window function such as the Hanning window shown in FIG. 17(B), is commonly used.
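A sketch of such a filter characteristic, assuming a discrete frequency grid and the usual form of the Hanning window (the function names and sampling are illustrative, not taken from the patent):

```python
import numpy as np

def ramp_hanning_filter(n):
    """Frequency response of a ramp filter (|f|, the linear characteristic)
    multiplied by a Hanning window that is 1 at DC and 0 at Nyquist."""
    freqs = np.fft.fftfreq(n)                          # cycles/sample in [-0.5, 0.5)
    ramp = np.abs(freqs)                               # linear high-frequency emphasis
    hanning = 0.5 + 0.5 * np.cos(2.0 * np.pi * freqs)  # tapers the high end
    return ramp * hanning

def filter_projection_row(row):
    """1-D filter correction of one detector row in the frequency domain."""
    response = ramp_hanning_filter(len(row))
    return np.real(np.fft.ifft(np.fft.fft(row) * response))

filt = ramp_hanning_filter(8)
dc_only = filter_projection_row(np.ones(8))   # a constant row has only a DC term
```

The pure ramp filter keeps growing toward the Nyquist frequency; multiplying by the window suppresses that high end, which is the trade-off against noise emphasis discussed below.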

  Filter correction processing is usually performed with a linear filter having high-frequency emphasis characteristics; however, over-emphasis of high-contrast signals causes overshoot and undershoot around metal and the like, noise components are emphasized, and artifacts occur in the tomographic image. As another reconstruction method for tomosynthesis, successive approximation methods such as the algebraic reconstruction technique are known, but these have the problem that the computation time increases.

Patent Document 1: Japanese Patent Laid-Open No. 5-49631
Patent Document 2: JP-A-7-37074
Patent Document 3: Japanese Patent No. 3816151
Patent Document 4: JP 55-163472 A
Patent Document 5: JP 55-87953 A
Patent Document 6: JP-A-3-276265

  A first object of the present invention is to provide a radiographic imaging apparatus capable of performing filter correction processing that does not generate artifacts even in high-contrast portions of a tomographic image, without increasing the computation time.

  In addition to the first object, a second object of the present invention is to provide a radiographic imaging apparatus that can prevent, in the reconstructed tomographic image, the occurrence of artifacts caused by structures other than the target structure.

In order to achieve the above objects, the present invention provides a radiographic imaging apparatus that reconstructs a tomographic image in a predetermined cross section of a subject from a plurality of projection images of the subject captured by tomosynthesis imaging, the apparatus comprising:
a frequency filter processing unit that creates, from a projection image signal corresponding to each projection image, a plurality of band-limited image signals having mutually different frequency response characteristics;
a nonlinear conversion processing unit that nonlinearly converts the band-limited image signals;
an integration processing unit that integrates the plurality of band-limited image signals nonlinearly converted by the nonlinear conversion processing unit to create a converted image signal; and
a back projection processing unit that reconstructs a tomographic image in a predetermined cross section of the subject from the plurality of converted image signals corresponding to the plurality of projection images,
wherein the nonlinear conversion processing unit nonlinearly converts each band-limited image signal so that components of the band-limited image signal exceeding a first predetermined value are reduced in the converted image signal, and further so that components of the band-limited image signal on the high-frequency side of the projection image signal are reduced in the converted image signal more than components on the low-frequency side.
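The band-decomposition, nonlinear-conversion, and integration steps performed by these units can be sketched in one dimension as follows. The difference-of-blurs decomposition, the clipping nonlinearity, the sizes and threshold, and the choice to pass the residual low-frequency component through unchanged are all illustrative assumptions, not the patent's concrete implementation:

```python
import numpy as np

def box_blur_1d(signal, size):
    """Moving-average low-pass along the source-movement direction."""
    pad = size // 2
    padded = np.pad(signal, pad, mode="edge")
    return np.convolve(padded, np.ones(size) / size, mode="valid")

def band_limited_signals(signal, sizes=(3, 9, 27)):
    """Band-limited components as differences of successively stronger
    blurs, plus the final low-pass residual (the parts sum back to the input)."""
    bands, current = [], signal.astype(float)
    for size in sizes:
        low = box_blur_1d(current, size)
        bands.append(current - low)    # one band-limited image signal
        current = low
    bands.append(current)              # residual low-frequency component
    return bands

def nonlinear_compress(band, threshold):
    """Nonlinear conversion: components whose magnitude exceeds the first
    predetermined value (threshold) are clipped, so high-contrast signals
    (e.g. around metal) are not over-emphasized."""
    return np.clip(band, -threshold, threshold)

def converted_signal(signal, threshold):
    """Integrate the nonlinearly converted band-limited signals into the
    converted image signal (residual passed through unchanged here)."""
    *bands, residual = band_limited_signals(signal)
    return residual + sum(nonlinear_compress(b, threshold) for b in bands)

sig = np.zeros(32)
sig[16] = 100.0                                  # a high-contrast spike
exact = converted_signal(sig, threshold=1e9)     # no clipping: exact reconstruction
compressed = converted_signal(sig, threshold=5.0)
```

With a very large threshold the decomposition is exactly invertible, so the converted signal equals the projection signal; with a small threshold the spike's high-frequency components are compressed, which is the intended suppression of over-emphasis.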

Further, it is preferable that the nonlinear conversion processing unit performs the nonlinear conversion so that, for pixels whose projection image signal is smaller than a third predetermined value, the components of the band-limited image signal included in the converted image signal become smaller the closer the projection image was captured to the end of the scan, relative to the projection image captured from the front of the subject.

Furthermore, it is preferable that the nonlinear conversion processing unit performs the nonlinear conversion so that, for pixels whose projection image signal is smaller than a second predetermined value, the components of the band-limited image signal included in the converted image signal are made smaller.

  Moreover, it is preferable that the frequency filter processing unit creates the plurality of band-limited image signals by band-limiting the projection image signal one-dimensionally along the moving direction of the radiation source.

Further, it is preferable that the apparatus includes: a similarity calculation unit that takes one converted image corresponding to one converted image signal as a reference converted image and calculates, for pixels that are cumulatively added to the same position on the tomographic image, the similarity between pixels on the reference converted image and pixels on each converted image; and
a weighting coefficient calculation unit that calculates, for each pixel of the plurality of converted images, a weighting coefficient that increases as the similarity increases,
the back projection processing unit then reconstructing the tomographic image by cumulatively adding, at the same position on the tomographic image, the products of the pixel values of the plurality of converted images and the corresponding weighting coefficients.

  Here, it is preferable that the similarity calculation unit calculates the similarity between a first region on the reference converted image and a second region on each converted image, the regions being cumulatively added to the same position on the tomographic image.

  Moreover, it is preferable that the similarity calculation unit uses, as the reference converted image, the converted image captured from the front of the subject among the plurality of converted images.

Alternatively, the back projection processing unit may reconstruct the tomographic image from the plurality of converted images, and it is preferable that the apparatus further includes: a similarity calculation unit that calculates the similarity between pixels on the plurality of converted images that are cumulatively added to the same position on the tomographic image;
a weighting coefficient calculation unit that calculates, for each pixel of the tomographic image, a weighting coefficient that increases as the similarity increases; and
a multiplication processing unit that generates a multiplication-processed image by multiplying the pixel value of each pixel of the tomographic image by the corresponding weighting coefficient.

  Moreover, it is preferable that the similarity calculation unit calculates a similarity between predetermined regions on a plurality of converted images that are cumulatively added to the same position on the tomographic image.

  Moreover, it is preferable that the similarity calculation unit calculates the similarity by normalized cross-correlation.

  Moreover, it is preferable that the weighting coefficient calculation unit uses the similarity itself as the weighting coefficient.

  According to the present invention, it is possible, without increasing the computation time, to suppress the over-emphasis of noise components and the occurrence of artifacts in portions of the tomographic image that are inherently high in contrast, such as metal.

  Further, according to the present invention, since pixel values are weighted according to the similarity between projection images, the target structure is emphasized in the reconstructed tomographic image and the occurrence of artifacts caused by structures other than the target structure can be prevented.

FIG. 1 is a block diagram of an embodiment showing the configuration of the radiographic imaging apparatus of the present invention.
FIGS. 2(A) and 2(B) are conceptual diagrams showing an example of how a tomographic image is reconstructed by tomosynthesis imaging.
FIG. 3 is a block diagram of an embodiment showing the configuration of the image processing device of the radiographic imaging apparatus of the first aspect.
FIG. 4 is a conceptual diagram showing an example of the positional relationship between a structure imaged by tomosynthesis imaging and the structure in the projection images.
FIG. 5 is a conceptual diagram showing an example of how a tomographic image is reconstructed.
FIG. 6 is a conceptual diagram showing an example of the relationship between the position of a structure, the position of the radiation source, and the projection position on a projection image.
FIG. 7 is a block diagram of an embodiment showing the configuration of the image processing device of the radiographic imaging apparatus of the second aspect.
FIG. 8 is a block diagram of a first embodiment showing the configuration of the image processing device of the radiographic imaging apparatus of the third aspect.
FIG. 9 is a graph showing an example of the filter characteristic of a nonlinear conversion process.
FIG. 10 is a block diagram of a second embodiment showing the configuration of the image processing device of the radiographic imaging apparatus of the third aspect.
FIG. 11 is a graph showing an example of the filter characteristic of a nonlinear conversion process.
FIG. 12 is a block diagram of a third embodiment showing the configuration of the image processing device of the radiographic imaging apparatus of the third aspect.
FIGS. 13 to 15 are graphs showing examples of the filter characteristic of a nonlinear conversion process.
FIG. 16 is a block diagram showing an example of the configuration of the frequency filter processing unit.
FIGS. 17(A) and 17(B) are graphs showing an example of the filter characteristic of a linear conversion process.

  Hereinafter, a radiation imaging apparatus of the present invention will be described in detail based on preferred embodiments shown in the accompanying drawings.

  FIG. 1 is a block diagram of an embodiment showing the configuration of the radiographic imaging apparatus of the present invention. The radiography apparatus 10 shown in FIG. 1 performs tomosynthesis imaging of a subject 34 to capture a plurality of projection images (radiation images) at different imaging angles, and reconstructs, from the captured projection images, a tomographic image in a cross section of the subject 34 at a predetermined height. The radiography apparatus 10 includes an imaging apparatus 12 and a console 14.

  The imaging device 12 performs tomosynthesis imaging of the subject 34 to capture a plurality of projection images with different imaging angles, and includes a radiation source 16, a radiation control device 18, and an imaging table 20.

  Under the control of the radiation control device 18, the radiation source 16 emits radiation of a predetermined intensity for a predetermined time when a projection image of the subject 34 is captured; that is, it delivers a predetermined dose of radiation.

  The radiation control device 18 controls the operation of the radiation source 16 (irradiation, irradiation position, irradiation angle, etc.) according to the imaging conditions by control of the control device 26 of the console 14 described later.

  The imaging table 20 in the illustrated example is a supine-position imaging table on which the subject 34 is positioned when a projection image is captured. A standing-position imaging table can also be used as the imaging table 20. A radiation detector 22 is disposed below the imaging table 20.

  The radiation detector 22 is, for example, a flat panel detector (FPD); it detects the radiation emitted from the radiation source 16 and transmitted through the subject 34, and outputs the image signal (image data) of the captured projection image.

  The console 14 controls the overall operation of the radiography apparatus 10 and includes an input device 24, a control device 26, an image processing device 28, a recording device 30, and a display device 32.

  The input device 24 is for inputting various instructions such as shooting instructions and various information such as shooting conditions, and can be exemplified by a keyboard and a mouse.

  The control device 26 controls the operations of the radiation control device 18, the image processing device 28, the recording device 30, and the display device 32 based on various instructions and various information input via the input device 24.

  Under the control of the control device 26, the image processing device 28 performs various types of image processing, including image composition processing and filter correction processing, on the plurality of projection images (their image signals) input from the radiation detector 22, and reconstructs and outputs a tomographic image (image signal) in a cross section of the subject 34 at a predetermined height.

  The recording device 30 records various types of information, including the tomographic images (image signals) output from the image processing device 28, under the control of the control device 26; a hard disk, CD-R, DVD-R, printer, or the like can be exemplified.

  The display device 32 displays various types of information including a tomographic image output from the image processing device 28 under the control of the control device 26, and may be a liquid crystal display or the like.

  Next, the operation of the radiation imaging apparatus 10 during tomosynthesis imaging will be described.

  In the radiography apparatus 10, when performing tomosynthesis imaging, the subject 34 is positioned on the imaging surface of the imaging table 20. Thereafter, when an instruction to start photographing is given from the input device 24, tomosynthesis photographing is started under the control of the control device 26.

  When imaging is started, the imaging apparatus 12, via the radiation control device 18, moves the radiation source 16 in one direction while changing its irradiation angle toward the subject 34, so that the subject 34 is irradiated from different angles and a plurality of projection images with different imaging angles is captured sequentially in a single imaging operation. Each time a projection image of the subject 34 is captured, the radiation detector 22 outputs the image signal of the captured projection image.

  At this time, in the console 14, image signals of a plurality of projection images input from the imaging device 12 are sequentially stored in the recording device 30 under the control of the control device 26.

  When imaging is finished, under the control of the control device 26, various kinds of image processing, including image composition processing for aligning and superimposing the plurality of projection images and filter correction processing, are performed on the image signals of the plurality of projection images with different imaging angles stored in the recording device 30, and a tomographic image at a predetermined cross section of the subject 34 is reconstructed. The reconstructed tomographic image is displayed on the display device 32, and its image signal is recorded in the recording device 30 as necessary.

  Hereinafter, the operation at the time of tomographic image reconstruction will be described.

  A shift addition method is typically used as the tomographic image reconstruction method. In the shift addition method, based on the position of the radiation source 16 at the time each projection image was captured, the captured projection images are shifted so that corresponding structures are aligned, and the corresponding pixels are then added.

  FIGS. 2(A) and 2(B) are conceptual diagrams illustrating how a tomographic image is reconstructed by tomosynthesis imaging. As shown in FIG. 2(A), during tomosynthesis imaging the radiation source 16 moves from position S1 to position S3, the subject 34 is irradiated at each of the source positions S1, S2, and S3, and projection images P1, P2, and P3 of the subject 34 are obtained.

  Here, it is assumed that structures A and B exist at two different heights in the subject 34, as shown in FIG. 2(A). At each imaging position (position of the radiation source 16 at the time of imaging) S1, S2, and S3, the radiation emitted from the radiation source 16 passes through the subject 34 and enters the radiation detector 22. As a result, in the projection images P1, P2, and P3 corresponding to the imaging positions S1, S2, and S3, the two structures A and B are projected with different positional relationships.

  For example, in the case of projection image P1, since the position S1 of the radiation source 16 is to the left of structures A and B with respect to the moving direction of the radiation source 16, structures A and B are projected at positions P1A and P1B, shifted to the right of A and B, respectively. Similarly, in projection image P2 they are projected at positions P2A and P2B almost directly below, and in projection image P3 at positions P3A and P3B, shifted to the left.

  When reconstructing a tomographic image in the cross section at the height where structure A exists, the projection positions P1A, P2A, and P3A of the target structure A are aligned based on the positions of the radiation source 16; for example, as shown in FIG. 2(B), projection image P1 is shifted to the left and projection image P3 to the right, and the corresponding pixels are combined. A tomographic image at the height where the target structure A exists is thereby reconstructed. A tomographic image in a cross section at any height can be reconstructed in the same way.
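A minimal sketch of this shift-and-add step, assuming the per-projection shifts have already been derived from the source geometry (here they are simply given, and `np.roll` stands in for a proper shift with padding):

```python
import numpy as np

def shift_add_reconstruct(projections, shifts):
    """Shift-and-add: shift each projection so structures in the target
    plane come into registration, then average the corresponding pixels."""
    accum = np.zeros_like(projections[0], dtype=float)
    for proj, shift in zip(projections, shifts):
        accum += np.roll(proj, shift, axis=1)  # shift along the source-motion axis
    return accum / len(projections)

# Structure A projected at columns 3, 4, 5 of P1, P2, P3 (cf. P1A, P2A, P3A):
# shifting P1 right by 1 and P3 left by 1 brings A into registration.
p1, p2, p3 = (np.zeros((1, 8)) for _ in range(3))
p1[0, 3] = p2[0, 4] = p3[0, 5] = 1.0
tomo = shift_add_reconstruct([p1, p2, p3], shifts=[1, 0, -1])
```

After realignment, all three contributions accumulate at column 4, so the structure in the chosen plane is reinforced while structures at other heights would be smeared out.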

  Next, image processing during tomographic image reconstruction in the image processing device 28 will be described.

  FIG. 3 is a block diagram of an embodiment showing the configuration of the image processing device of the radiographic imaging apparatus according to the first aspect. The image processing device 28A shown in the figure includes a similarity calculation unit 136, a weighting coefficient calculation unit 138, and a back projection processing unit 140.

  The similarity calculation unit 136 takes one of the plurality of projection images captured by tomosynthesis imaging, for example the projection image captured from directly in front of the subject, as the reference projection image, and calculates, by normalized cross-correlation, the similarity between a predetermined region near a pixel on the reference projection image (an M × N pixel region containing the pixel of interest) and the predetermined region near the corresponding pixel on each projection image, these pixels being cumulatively added to the same position on the tomographic image.

  The weighting coefficient calculation unit 138 calculates, for each pixel of the plurality of projection images, a weighting coefficient that increases as the similarity increases. Note that the weighting coefficient calculation unit 138 may also use the similarity itself as the weighting coefficient.

  The back projection processing unit 140 then reconstructs the tomographic image by cumulatively adding, at the same position on the tomographic image, the products of the pixel values of the corresponding pixels on the plurality of projection images and their weighting coefficients.

  In other words, at the time of tomographic image reconstruction, the image processing device 28A weights the pixel value of the corresponding pixel of each projection image according to the similarity (correlation) between the predetermined region near the pixel on the reference projection image and the predetermined region near the corresponding pixel on each projection image, these regions being cumulatively added to the same position on the tomographic image, thereby selectively emphasizing the target structure in the focal plane (suppressing structures outside the focal plane).

  Here, as shown in FIG. 4, suppose that a plurality of projection images (three in the example of FIG. 4) of a subject containing a star-shaped structure and a circular structure as target structures has been captured by tomosynthesis imaging.

  As shown in FIG. 6, let (x, y, z) be the coordinates of the structure of interest in a predetermined cross section of the subject 34, (sxi, syi, szi) the position of the radiation source when the i-th projection image Pi was captured (i is an integer from −I to I, with P0 the projection image captured from the front of the subject 34), (ti, si, 0) the projection position of the structure of interest on Pi, and Pi(ti, si) the pixel value of each pixel of Pi. The pixel value Tz(x, y) of each pixel of the tomographic image Tz reconstructed by the conventional method is then expressed by the following equation:

  Tz(x, y) = Σ (i = −I to I) Pi(ti, si)

  As a result of the above calculation, the pixel information of each projection image Pi that passed through the point (x, y, z) in space is cumulatively added to the pixel value Tz(x, y) of the corresponding pixel on the tomographic image Tz, so a tomographic image Tz in which the structure at the point (x, y, z) is emphasized is obtained. For example, if the point (x, y, z) lies on the star-shaped structure shown in FIG. 5, the projection information of the corresponding pixels of each projection image Pi is superimposed, and a tomographic image Tz in which the star-shaped structure is emphasized is obtained.

  On the other hand, in the image processing device 28A, if wi(ti, si) denotes the weighting coefficient for each pixel (ti, si) of each projection image Pi, the pixel value Tz(x, y) of each pixel of the tomographic image Tz reconstructed by the back projection processing unit 140 is expressed by the following equation:

  Tz(x, y) = Σ (i = −I to I) wi(ti, si) · Pi(ti, si)

  Here, in the image processing apparatus 28A, the weighting coefficient wi (ti, si) is calculated as follows.

  First, the similarity calculation unit 136 takes, among the plurality of projection images Pi, the projection image P0 captured from the front of the subject 34 as the reference projection image, and calculates, by normalized cross-correlation, the similarity between the rectangular region near the projection position P0(t0, s0) of a pixel on the reference projection image P0 and the rectangular region near the projection position Pi(ti, si) of the corresponding pixel on each projection image Pi, these pixels being cumulatively added to the same position on the tomographic image Tz.

  Then, for each pixel of the plurality of projection images, the weighting coefficient calculation unit 138 calculates the weighting coefficient wi(ti, si) so that it increases as the similarity increases. Note that the weighting coefficient w0(t0, s0) of the reference projection image P0 is 1, since it is the normalized autocorrelation.

  Weighting by the coefficients wi acts to vary the contribution of each projection image Pi to the pixel value Tz(x, y) of each pixel of the tomographic image Tz according to the similarity between the corresponding rectangular regions that are cumulatively added to that pixel. In the reconstructed tomographic image Tz, therefore, the target structure is emphasized and the occurrence of artifacts caused by structures other than the target structure can be prevented.

  Note that the similarity calculation unit 136 may calculate the similarity not between predetermined regions but between the pixel at the projection position P0(t0, s0) on the reference projection image P0 and the pixel at the projection position Pi(ti, si) on each projection image Pi, these being cumulatively added to the same position on the tomographic image Tz. It is also not essential to use the projection image captured from directly in front of the subject as the reference projection image; any one of the plurality of projection images Pi may serve as the reference.

  Further, it is not essential for the similarity calculation unit 136 to obtain the similarity between predetermined regions or pixels of two projection images by normalized cross-correlation; the similarity can be calculated by various template matching (pattern matching) methods.
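As an illustration of the normalized cross-correlation similarity and its use as a weighting coefficient (the function names and the clamping of negative similarities are assumptions, not from the patent):

```python
import numpy as np

def normalized_cross_correlation(a, b):
    """Similarity of two equally sized regions: +1 for patterns identical up
    to brightness and offset, near 0 for unrelated patterns, -1 for inverted."""
    a = a.astype(float) - a.mean()
    b = b.astype(float) - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom else 0.0

def weight_from_similarity(similarity):
    """Weighting coefficient that grows with similarity; the simplest choice
    is the similarity itself, clamped so dissimilar pixels contribute zero."""
    return max(similarity, 0.0)

ref_patch = np.array([[1.0, 2.0], [3.0, 4.0]])
same = normalized_cross_correlation(ref_patch, ref_patch)        # autocorrelation
inverted = normalized_cross_correlation(ref_patch, -ref_patch)   # inverted pattern
```

The autocorrelation of the reference patch is 1, consistent with the weighting coefficient of the reference projection image being 1 as described above.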

  The image processing device 28A reconstructs the tomographic image after weighting each pixel of each projection image, but the invention is not limited to this; the same effect can be obtained by weighting each pixel of the tomographic image after reconstruction. This case is described below.

  FIG. 7 is a block diagram of an embodiment showing the configuration of the image processing apparatus of the radiation imaging apparatus according to the second aspect. The image processing apparatus 28B shown in the figure includes a back projection processing unit 142, a similarity calculation unit 144, a weighting coefficient calculation unit 146, and a multiplication processing unit 148.

  The back projection processing unit 142 reconstructs a tomographic image from a plurality of projection images.

  The similarity calculation unit 144 calculates, by normalized cross-correlation, the similarity between the predetermined regions near the pixels on the plurality of projection images (an M × N pixel region containing the pixel of interest) that are cumulatively added to the same position on the tomographic image.

  The weighting coefficient calculation unit 146 calculates, for each pixel of the tomographic image, a weighting coefficient that increases as the similarity increases.

  The multiplication processing unit 148 then creates the multiplication-processed image by multiplying the pixel value of each pixel of the tomographic image by the corresponding weighting coefficient. This multiplication-processed image is displayed on the display device 32.

  In other words, after reconstruction of the tomographic image, the image processing device 28B weights the pixel value of each pixel of the tomographic image according to the similarity between the predetermined regions near the corresponding pixels on the plurality of projection images that are cumulatively added to that position, thereby selectively emphasizing the target structure in the focal section (suppressing structures outside the focal section).

  In the image processing device 28B, assuming that the weighting coefficient for each pixel Tz(x, y) of the reconstructed tomographic image Tz is wi(x, y), the pixel value of each pixel of the multiplication-processed image is given by the product of wi(x, y) and Tz(x, y).

  In the image processing device 28B, the weighting coefficient wi (x, y) is calculated as follows.

  First, the similarity calculation unit 144 calculates the similarity between rectangular regions in the vicinity of the projection positions Pi(ti, si) of the pixels on the plurality of projection images Pi that are cumulatively added to the same position on the tomographic image Tz, as the average value of the normalized cross-correlations between pairs of those regions.

  Then, the weighting coefficient calculation unit 146 calculates the weighting coefficient wi(x, y) for each pixel of the tomographic image Tz so that it increases as the degree of similarity increases.

  The weighting by the weighting coefficient wi acts to vary the pixel value Tz(x, y) of the reconstructed tomographic image. Therefore, it is possible to emphasize the structure of interest in the reconstructed tomographic image Tz with a smaller amount of calculation, and to prevent the occurrence of artifacts due to the influence of structures other than the structure of interest.
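The post-reconstruction weighting can be sketched as follows (a hypothetical illustration: the linear mapping from similarity to weight is our own choice; the patent only requires that the weight increase monotonically with similarity):

```python
import numpy as np

def weight_from_similarity(similarity):
    """Map a similarity in [-1, 1] to a weight in [0, 1], increasing monotonically."""
    return (np.clip(similarity, -1.0, 1.0) + 1.0) / 2.0

def apply_weighting(tomogram, similarity_map):
    """Multiply each tomographic pixel Tz(x, y) by its weighting coefficient w(x, y)."""
    return tomogram * weight_from_similarity(similarity_map)
```

Pixels whose contributing projection patches agree (similarity near 1) are kept at full strength, while pixels assembled from dissimilar patches, typically out-of-plane structures, are attenuated.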

  The similarity calculation unit 144 may likewise calculate the similarity between the pixels themselves at the projection positions Pi(ti, si) on the plurality of projection images Pi that are cumulatively added to the same position on the tomographic image Tz. Further, the similarity calculation unit 144 can calculate the similarity between predetermined regions or pixels of two projection images not only by normalized cross-correlation but also by various template matching (pattern matching) methods.

  Next, a radiation imaging apparatus according to the third aspect of the present invention will be described.

  The configuration of the radiation imaging apparatus according to the third aspect of the present invention is the same as that of the radiation imaging apparatus according to the first and second aspects of the present invention. That is, the radiation imaging apparatus 10 according to the third aspect of the present invention includes the imaging apparatus 12 and the console 14. The operation of the radiation imaging apparatus 10 according to the third aspect at the time of tomosynthesis imaging is the same as that of the radiation imaging apparatus 10 according to the first and second aspects.

  Next, image processing during tomographic image reconstruction in the image processing device 28 will be described.

  FIG. 8 is a block diagram of the first embodiment showing the configuration of the image processing apparatus of the radiation imaging apparatus according to the third aspect. The image processing apparatus 28C illustrated in FIG. 8 includes a filter processing unit 236, a nonlinear conversion processing unit 238, and a back projection processing unit 240.

  The filter processing unit 236 performs image processing such as frequency emphasis processing and frequency suppression processing on the projection image signal Sorg corresponding to the projection image using a filter, and creates a filtered image signal g(Sorg) corresponding to the filtered image.

  The nonlinear conversion processing unit 238 performs nonlinear conversion depending on contrast and transmitted dose on the filtered image signal g(Sorg), and outputs a converted image signal Sproc corresponding to the converted image after the nonlinear conversion processing.

  As already described, an image signal corresponding to a portion with originally high contrast, such as metal, may be overemphasized by the above image processing. In that case, overshoot or undershoot occurs in the filtered image signal g(Sorg) between the high-contrast pixels of the filtered image corresponding to metal or the like and their peripheral portions, and artifacts may occur in the corresponding part of the tomographic image.

  In the contrast-dependent nonlinear conversion processing, when the filtered image signal g(Sorg) exceeds a predetermined value as a result of the above image processing, that is, when overshoot or undershoot occurs in g(Sorg) between high-contrast pixels of the filtered image and their peripheral portions, the filtered image signal g(Sorg) is nonlinearly converted so that the component of g(Sorg) exceeding the predetermined value that is included in the converted image signal Sproc becomes small, in other words, so that overshoot and undershoot do not occur.

  In radiography, for example, if the subject 34 is thick, the radiation transmission dose decreases, the projection image signal Sorg decreases, and the noise component included in the projection image increases. Conversely, if the subject 34 is thin, the transmitted dose increases, the projection image signal Sorg increases, and the noise component included in the projection image decreases. In other words, when the thickness of the subject 34 is the same, if the radiation dose is small, the transmitted dose decreases and the noise component increases, and if the radiation dose is large, the transmitted dose increases and the noise component decreases.

  In the transmitted-dose-dependent nonlinear conversion processing, when the transmitted dose corresponding to a pixel of the projection image is smaller than a predetermined value, that is, when the projection image signal Sorg is small and the noise component included in that pixel is large, the filtered image signal g(Sorg) is nonlinearly converted so that the component of g(Sorg) included in the converted image signal Sproc becomes small, that is, so that the noise component included in each pixel of the converted image is reduced.

As described above, the nonlinear conversion processing unit 238 of this embodiment performs contrast-dependent and transmitted dose-dependent nonlinear conversion. The processing performed by the nonlinear conversion processing unit 238 is shown in the following equation.
Sproc = β (Sorg) · f (g (Sorg))
Here, g is a function that performs image processing such as frequency enhancement processing and frequency suppression processing, f is a function that performs contrast-dependent nonlinear conversion processing, and β is a function that performs transmitted-dose-dependent nonlinear conversion processing.
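Per pixel, the composition Sproc = β(Sorg) · f(g(Sorg)) might look like the following sketch (the specific shapes of f, g and β, and the threshold values, are illustrative assumptions; the patent only constrains their qualitative behavior):

```python
import numpy as np

TH1 = 100.0       # contrast threshold (assumed value)
DOSE_TH = 50.0    # transmitted-dose threshold (assumed value)

def g(sorg, blurred):
    """Toy frequency-emphasis filter: boost the detail (high-frequency) component."""
    return sorg + 2.0 * (sorg - blurred)

def f(signal):
    """Contrast-dependent conversion: pass |s| <= TH1 unchanged, compress the excess."""
    excess = np.maximum(np.abs(signal) - TH1, 0.0)
    return np.sign(signal) * (np.minimum(np.abs(signal), TH1) + 0.1 * excess)

def beta(sorg):
    """Dose-dependent gain: attenuate pixels whose projection signal is small (noisy)."""
    return np.where(sorg < DOSE_TH, sorg / DOSE_TH, 1.0)

def filter_correction(sorg, blurred):
    """Sproc = beta(Sorg) * f(g(Sorg)); `blurred` stands in for the unsharp input to g."""
    return beta(sorg) * f(g(sorg, blurred))
```

Strongly boosted edges are compressed above TH1 (limiting overshoot), and low-signal pixels are attenuated by β (limiting noise amplification).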

  Note that it is not essential for the nonlinear conversion processing unit 238 to perform transmitted-dose-dependent nonlinear conversion; it may perform only contrast-dependent nonlinear conversion.

  The filter processing unit 236 and the nonlinear conversion processing unit 238 constitute a filter correction processing unit that performs filter correction processing. That is, the filter correction processing unit performs image processing on the projection image signal Sorg with the filter processing unit 236 to create the filtered image signal g(Sorg), and then performs nonlinear conversion on g(Sorg) with the nonlinear conversion processing unit 238 to create the converted image signal Sproc.

  Finally, the back projection processing unit 240 reconstructs a tomographic image of a cross section at a predetermined height of the subject 34 from the converted image signals Sproc of the plurality of converted images corresponding to the plurality of projection images subjected to the filter correction processing.

  Hereinafter, the operation of the image processing device 28C will be described.

  In the image processing apparatus 28C, for example, the projection image signal Sorg is first Fourier-transformed by the filter processing unit 236 into a frequency image signal composed of a plurality of frequency components. Subsequently, the frequency image signal is subjected to image processing using, for example, a filter that emphasizes high-frequency components as shown in the graph of FIG. 17A, and is then inverse-Fourier-transformed to create a filtered image signal g(Sorg) corresponding to the filtered image.

  Note that the filter for enhancing the high-frequency components of the projection image is not limited to that shown in FIG. 17A; various filters realizing the same function can be used. The filter processing unit 236 may also be one that suppresses low-frequency components of the projection image.
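The Fourier-domain filtering step can be sketched in one dimension as follows (a toy example; the actual emphasis curve of FIG. 17A is not reproduced here, only a generic linear high-frequency boost):

```python
import numpy as np

def high_frequency_emphasis(sorg, boost=1.5):
    """Fourier-transform a 1-D signal, amplify high frequencies, invert the transform."""
    spectrum = np.fft.rfft(sorg)
    freqs = np.fft.rfftfreq(len(sorg))          # normalized frequencies in [0, 0.5]
    gain = 1.0 + (boost - 1.0) * (freqs / 0.5)  # 1.0 at DC, `boost` at Nyquist
    return np.fft.irfft(spectrum * gain, n=len(sorg))
```

A constant (DC) signal passes unchanged, while the highest-frequency components are amplified by the chosen boost factor; a 2-D implementation would apply the same idea along each axis.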

  Subsequently, the nonlinear conversion processing unit 238 performs contrast-dependent and transmission dose-dependent nonlinear conversion on the filtered image signal g (Sorg).

For example, as shown in the graph of FIG. 9, when the absolute value of the filtered image signal g(Sorg) exceeds a predetermined value Th1, the function f applies contrast-dependent nonlinear conversion to g(Sorg) with a conversion characteristic under which the component of g(Sorg) exceeding Th1 that is included in the converted image signal Sproc becomes smaller than the portion that does not exceed Th1. FIG. 9 shows an example in which the filtered image signal g(Sorg) takes both positive and negative values, but this is not essential.

  Further, when the transmitted dose corresponding to each pixel of the projection image is smaller than a predetermined value, nonlinear conversion is performed so that the component of the filtered image signal g (Sorg) included in the converted image signal Sproc is small.

  As described above, the projection image signal Sorg is subjected to filter correction processing that combines image processing by the filter processing unit 236 with nonlinear conversion by the nonlinear conversion processing unit 238, and a converted image signal Sproc is created as a result.

  This reduces the occurrence of artifacts on the reconstructed tomographic image caused by overemphasis of originally high-contrast parts such as metal or of noise components, without increasing the computation time.

  The filter correction processing is performed sequentially on the projection image signals Sorg corresponding to all the projection images, and the results are recorded in the recording device 30. When the converted image signals Sproc have been generated for all the projection images, the back projection processing unit 240 reconstructs a tomographic image of a cross section at a predetermined height of the subject 34 from the converted image signals Sproc of the plurality of converted images, and the tomographic image is displayed on the display device 32.

  Next, FIG. 10 is a block diagram of the second embodiment showing the configuration of the image processing apparatus of the radiation imaging apparatus according to the third aspect. The image processing device 28D shown in the figure includes a frequency filter processing unit 242, a nonlinear conversion processing unit 244B, an integration processing unit 246B, and a back projection processing unit 248.

  The frequency filter processing unit 242 performs frequency filter processing that creates, from the projection image signal Sorg corresponding to the projection image, a plurality of band-limited image signals Sbi (i = 1 to k, where k is an integer of 2 or more) corresponding to a plurality of band-limited images having different frequency response characteristics. As shown in the figure, the frequency filter processing unit 242 includes an unsharp mask image signal creation unit 250 and a band-limited image signal creation unit 252.

  The unsharp mask image signal creation unit 250 creates, from the projection image signal Sorg, a plurality of unsharp mask image signals Susi (i = 1 to k, where k is an integer of 2 or more) corresponding to a plurality of unsharp mask images, each an averaged image of the projection image, having different frequency response characteristics. In this embodiment, the unsharp mask image signal Sus1 is the unsharp mask image signal on the highest-frequency side, and Susk is the unsharp mask image signal on the lowest-frequency side.

  Further, the band-limited image signal creation unit 252 creates, from the projection image signal Sorg and the unsharp mask image signals Susi, a plurality of band-limited image signals Sbi corresponding to a plurality of band-limited images having different frequency response characteristics. Each band-limited image is a difference image between two images having adjacent frequency components (frequency bands) among the projection image and the plurality of unsharp mask images.
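The decomposition into band-limited signals can be sketched as below (a minimal sketch using a simple 1-D moving average as the unsharp mask; the mask shape and cascade depth are our assumptions). A useful sanity check is the telescoping identity Sorg = Sb1 + Sb2 + ... + Sbk + Susk:

```python
import numpy as np

def box_blur(signal, radius=1):
    """Simple moving-average unsharp mask with edge replication."""
    padded = np.pad(signal, radius, mode="edge")
    kernel = np.ones(2 * radius + 1) / (2 * radius + 1)
    return np.convolve(padded, kernel, mode="valid")

def band_decompose(sorg, k=3):
    """Create unsharp masks Sus1..Susk and band-limited signals Sb1..Sbk.

    Sb1 = Sorg - Sus1, and Sbi = Sus(i-1) - Susi for i >= 2.
    Returns the k band signals and the lowest-frequency mask Susk.
    """
    sus = []
    current = sorg
    for _ in range(k):
        current = box_blur(current)
        sus.append(current)
    bands = [sorg - sus[0]]
    bands += [sus[i - 1] - sus[i] for i in range(1, k)]
    return bands, sus[-1]
```

Because each band is a difference of adjacent cascade outputs, summing all bands with the final mask exactly recovers the original signal, which is what lets the later integration step rebuild a converted image from modified bands.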

  The frequency filter processing unit 242 creates the plurality of band-limited image signals Sbi by band-limiting the projection image signal Sorg one-dimensionally along the moving direction of the radiation source. It is not necessary to band-limit the direction orthogonal to the moving direction of the radiation source, but that direction may also be band-limited.

  Subsequently, the non-linear conversion processing unit 244B performs contrast-dependent non-linear conversion processing (fi (i = 1 to k, k is an integer of 2 or more)) on the band limited image signal Sbi.

  As shown in FIG. 11, the contrast-dependent nonlinear conversion process performs conversion for each pixel to suppress contrast when the band limited image signal Sbi exceeds a predetermined value. This prevents the band-limited image signal Sbi from having an excessive contrast and suppresses the occurrence of overshoot and undershoot.

  Furthermore, in addition to the contrast-dependent nonlinear conversion, the nonlinear conversion processing unit 244B performs transmitted-dose-dependent nonlinear conversion processing gi (i = 1 to k, where k is an integer of 2 or more) in which each band-limited image signal Sbi is multiplied by a gain gi(Sorg) that depends on the projection image signal Sorg. gi has a function shape such that a small gain value is applied when the projection image signal Sorg is smaller than a predetermined value. This acts to suppress the contrast of the noise component for pixels with a small transmitted dose and a large noise component, and prevents the noise from being emphasized. It also acts to suppress the contrast caused by high absorbers such as metal.

Subsequently, the integration processing unit 246B integrates the plurality of band limited image signals Sbi output from the nonlinear conversion processing unit 244B to create a converted image signal Sproc. The converted image signal Sproc can be expressed by the following equation.
Sproc = Σgi (Sorg) · fi (Sbi)
Here, gi is a nonlinear function that defines a gain depending on the projection image signal Sorg, and fi is a function that defines a contrast-dependent nonlinear conversion process.
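The integration step Sproc = Σ gi(Sorg) · fi(Sbi) might look like the following (the particular gain and contrast curves, and their thresholds, are hypothetical; only their qualitative shape follows the text):

```python
import numpy as np

def fi(sb, th=5.0, slope=0.2):
    """Contrast-dependent conversion: compress band components beyond the threshold."""
    excess = np.maximum(np.abs(sb) - th, 0.0)
    return np.sign(sb) * (np.minimum(np.abs(sb), th) + slope * excess)

def gi(sorg, dose_th=50.0):
    """Dose-dependent gain: small gain for pixels with a small projection signal."""
    return np.where(sorg < dose_th, sorg / dose_th, 1.0)

def integrate_bands(sorg, bands):
    """Sproc = sum over i of gi(Sorg) * fi(Sbi)."""
    return sum(gi(sorg) * fi(sb) for sb in bands)
```

Here every band of a dark (high-absorption or noisy) pixel is attenuated by the same dose gain, while over-strong band components are compressed regardless of dose.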

  The frequency filter processing unit 242, the nonlinear conversion processing unit 244B, and the integration processing unit 246B constitute a filter correction processing unit that performs filter correction processing. That is, the filter correction processing unit creates the band-limited image signals Sbi from the projection image signal Sorg with the frequency filter processing unit 242, performs nonlinear conversion processing on the band-limited image signals Sbi with the nonlinear conversion processing unit 244B, and integrates the nonlinearly converted band-limited image signals Sbi with the integration processing unit 246B to create the converted image signal Sproc.

  The back projection processing unit 248 reconstructs a tomographic image of a cross section at a predetermined height of the subject 34 from the converted image signals Sproc of the plurality of converted images corresponding to the plurality of projection images subjected to the filter correction processing.

  Hereinafter, the operation of the image processing device 28D will be described.

  In the image processing device 28D, the frequency filter processing unit 242 performs the following frequency filter processing on the projection image signal Sorg corresponding to each projection image.

  First, the non-sharp mask image signal generating unit 250 generates a plurality of non-sharp mask image signals Susi corresponding to a plurality of non-sharp mask images having different frequency response characteristics from the projection image signal Sorg.

  The unsharp mask image signal creation unit 250 creates the unsharp mask image signal Sus1, for example, by filtering the projection image signal Sorg of all the pixels constituting the projection image with an unsharp mask. By repeatedly applying this filtering process to the filtered unsharp mask image signal Susi, a plurality of unsharp mask image signals Sus1, Sus2, Sus3, ..., Susk having different frequency response characteristics are created.

  Subsequently, the band limited image signal generation unit 252 generates a plurality of band limited image signals Sbi corresponding to band limited images of a plurality of frequency components from the projection image signal Sorg and the unsharp mask image signal Susi.

  The band-limited image signal creation unit 252 creates the band-limited image signal Sb1 by, for example, subtracting the unsharp mask image signal Sus1 from the projection image signal Sorg with a subtracter. In the same manner, by performing the subtractions Sus1 - Sus2, Sus2 - Sus3, Sus3 - Sus4, ..., Sus(k-1) - Susk between the image signals of unsharp mask images of adjacent frequency components, the plurality of band-limited image signals Sb1, Sb2, Sb3, ..., Sbk corresponding to the band-limited images of a plurality of frequency components are created.

  That is, each band-limited image signal Sbi is a signal containing only the frequency components of a band-limited image limited to a predetermined frequency band.

  Subsequently, the nonlinear conversion processing unit 244B performs contrast-dependent nonlinear conversion processing on the band limited image signal Sbi.

  As shown in the graph of FIG. 11, when the absolute value of the band-limited image signal Sbi exceeds a predetermined value, the nonlinear conversion processing unit 244B converts each pixel by the function fi so as to suppress the contrast of Sbi, such that the component of Sbi exceeding the predetermined value that is included in the converted image signal Sproc becomes small. FIG. 11 shows an example in which the band-limited image signal Sbi takes both positive and negative values.

  Further, in the nonlinear conversion processing unit 244B, in addition to the contrast-dependent nonlinear conversion, each band-limited image signal Sbi is multiplied by a gain gi(Sorg) depending on the projection image signal Sorg, so that, by the function gi, the band-limited image signal component included in the converted image signal Sproc becomes small for pixels whose projection image signal is smaller than a predetermined value. The nonlinear conversion processing unit 244B outputs the nonlinearly converted band-limited image signals Sbi.

  Subsequently, the integration processing unit 246B integrates the plurality of band limited image signals Sbi output from the nonlinear conversion processing unit 244B, and creates a converted image signal Sproc corresponding to the converted image.

  As described above, the band-limited image signal Sbi created by the frequency filter processing unit 242 is subjected to the filter correction processing that combines the nonlinear conversion by the nonlinear conversion processing unit 244B and the integration processing by the integration processing unit 246B. As a result, a converted image signal Sproc is created.

  As a result, in the tomographic image, it is possible to reduce the artifacts caused by overemphasis of originally high-contrast portions such as metal, and the enhancement of noise components.

  The filter correction processing is performed sequentially on the projection image signals Sorg corresponding to all the projection images, and the results are recorded in the recording device 30. When the converted image signals Sproc have been generated for all the projection images, the back projection processing unit 248 reconstructs a tomographic image of a cross section at a predetermined height of the subject from the converted image signals Sproc of the plurality of converted images, and the tomographic image is displayed on the display device 32.

  Next, FIG. 12 is a block diagram of the third embodiment showing the configuration of the image processing apparatus of the radiation imaging apparatus according to the third aspect. Like the image processing device 28D shown in FIG. 10, the image processing device 28E shown in FIG. 12 includes a frequency filter processing unit 242, a nonlinear conversion processing unit 244C, an integration processing unit 246C, and a back projection processing unit 248. Since the differences between the image processing device 28E and the image processing device 28D are the nonlinear conversion processing unit 244C and the integration processing unit 246C, these will be mainly described below.

Like the nonlinear conversion processing unit 244B, the nonlinear conversion processing unit 244C performs contrast-dependent nonlinear conversion processing on the band-limited image signals Sbi. Furthermore, in addition to the contrast-dependent nonlinear conversion, the nonlinear conversion processing unit 244C performs transmitted-dose-dependent nonlinear conversion processing in which each band-limited image signal Sbi is multiplied by a gain gi(Sorg) that depends on the projection image signal Sorg.

  As shown in FIG. 12, the nonlinear conversion processing unit 244C also applies filter correction processing, using a function fus and a function gus, to the unsharp mask image signal Susk on the lowest-frequency side that is output from the final stage of the unsharp mask image signal creation unit 250, but this is not essential.

Subsequently, the integration processing unit 246C integrates the plurality of band limited image signals Sbi output from the nonlinear conversion processing unit 244C, and creates a converted image signal Sproc corresponding to the converted image after the integration processing. The converted image signal Sproc can be expressed by the following equation.
Sproc = Σgi (Sorg) · fi (Sbi)
Here, gi is a nonlinear function that defines a gain depending on the projection image signal Sorg, and fi is a function that defines a contrast-dependent nonlinear conversion process.

  Similarly, the frequency filter processing unit 242, the nonlinear conversion processing unit 244C, and the integration processing unit 246C constitute a filter correction processing unit that performs filter correction processing. That is, the filter correction processing unit creates the band-limited image signals Sbi from the projection image signal Sorg with the frequency filter processing unit 242, performs nonlinear conversion processing on the band-limited image signals Sbi with the nonlinear conversion processing unit 244C, and integrates the nonlinearly converted band-limited image signals Sbi with the integration processing unit 246C to create the converted image signal Sproc.

  Hereinafter, the operation of the image processing device 28E will be described.

  The operations up to the creation by the frequency filter processing unit 242 of the plurality of band-limited image signals Sb1, Sb2, Sb3, ..., Sbk corresponding to the band-limited images of a plurality of frequency components are the same as in the image processing device 28D.

  Subsequently, the nonlinear conversion processing unit 244C performs contrast-dependent nonlinear conversion processing and transmitted dose-dependent nonlinear conversion processing on the band limited image signal Sbi.

  In the nonlinear conversion processing unit 244C, as shown in the graph of FIG. 13, the function gi multiplies each band-limited image signal Sbi by a gain gi(Sorg) depending on the projection image signal Sorg, so that the component of Sbi included in the converted image signal Sproc becomes smaller for pixels whose projection image signal Sorg is smaller than a predetermined value, that is, pixels with high radiation absorption and high brightness (low density). In particular, the nonlinear conversion processing is performed so that, for such pixels, the lower-frequency-side components of Sbi included in the converted image signal Sproc become smaller than the higher-frequency-side components.

  In this way, by giving pixels whose projection image signal Sorg is smaller than a predetermined value a characteristic that suppresses, in particular, the low-frequency components, the processing acts in the direction of suppressing the contrast caused by metals with large absorption, and the artifacts generated in the reconstructed image can be reduced.
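The band-dependent behavior described here, stronger suppression of the lower-frequency bands for pixels with a small projection signal, can be sketched as follows (an illustrative gain family; the exact curves of FIG. 13 are not reproduced, and the threshold is an assumed value):

```python
import numpy as np

def band_gain(sorg, band_index, num_bands, dose_th=50.0):
    """Gain for band i (1 = highest frequency, num_bands = lowest frequency).

    Below the dose threshold the gain shrinks, and it shrinks more for
    lower-frequency bands (larger band_index), suppressing metal-induced contrast.
    """
    base = np.clip(sorg / dose_th, 0.0, 1.0)   # 1.0 at or above the threshold
    low_freq_factor = band_index / num_bands   # larger for lower-frequency bands
    return 1.0 - (1.0 - base) * low_freq_factor
```

Above the threshold every band passes at full gain; below it, the lowest-frequency band, where broad metal shadows live, is attenuated hardest.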

  Further, as shown in the graph of FIG. 14, nonlinear conversion processing may be performed by the function gi in which each band-limited image signal Sbi is multiplied by a gain gi(Sorg) depending on the projection image signal Sorg, such that the component of Sbi included in the converted image signal Sproc becomes smaller for pixels whose projection image signal Sorg is smaller than a predetermined value, the more the projection image is captured toward the end side (with a larger imaging angle) relative to the projection image captured from the front of the subject 34 (imaging angle = 0 degrees).

  Further, in the nonlinear conversion processing unit 244C, as shown in the graph of FIG. 15, when the absolute value of the band-limited image signal Sbi exceeds a predetermined value Th1, each pixel is converted by the function fi so as to suppress the contrast of Sbi, such that the component of Sbi exceeding Th1 that is included in the converted image signal Sproc becomes small. FIG. 15 shows an example in which the band-limited image signal Sbi takes both positive and negative values. The nonlinearly converted band-limited image signals Sbi are output from the nonlinear conversion processing unit 244C.

  In addition, the nonlinear conversion processing may be performed so that the gradient of the conversion curve of the function fi that nonlinearly converts the band-limited image signal Sbi becomes gentler, in other words, so that the band-limited image signal Sbi becomes smaller, the more the projection image is captured toward the end side relative to the projection image captured from the front of the subject 34.

  Subsequently, the integration processing unit 246C integrates the plurality of band-limited image signals Sbi output from the nonlinear conversion processing unit 244C, and creates a converted image signal Sproc corresponding to the converted image. The subsequent operations are the same as those of the image processing device 28D.

  As described above, the band-limited image signal Sbi created by the frequency filter processing unit 242 is subjected to the filter correction processing that combines the nonlinear conversion by the nonlinear conversion processing unit 244C and the integration processing by the integration processing unit 246C. As a result, a converted image signal Sproc is created.

  As a result, in the tomographic image after reconstruction, originally high-contrast portions such as metals with large absorption can be suppressed, so that the occurrence of artifacts and the enhancement of noise components can be reduced.

  In addition, as described in Patent Document 3, a plurality of unsharp mask image signals Susi having different frequency response characteristics can also be created by sequentially thinning out the pixels of the projection image to create a plurality of unsharp mask images having different resolutions.

  In this method, as shown in FIG. 16, the unsharp mask image signal creation unit 250B performs unsharp mask processing on the projection image signal Sorg and sequentially thins out the pixels in the vertical direction in the drawing of the projection image (the moving direction of the radiation source) by half, creating a plurality of unsharp mask images (a Gaussian pyramid) having different sizes (resolutions). As a result, unsharp mask image signals Sus1, Sus2, Sus3, ..., Susk are created whose vertical sizes are 1/2, 1/4, 1/8, ... of the projection image and whose horizontal size is the same as that of the projection image.

  As shown in the figure, the plurality of unsharp mask images have different vertical sizes. For this reason, the subsequent band-limited image signal creation unit 252B cannot directly perform the subtractions Sorg - Sus1, Sus1 - Sus2, Sus2 - Sus3, Sus3 - Sus4, ..., Sus(k-1) - Susk between the image signal Sorg of the projection image and the image signal Sus1 of the adjacent unsharp mask image, and between the image signals of the unsharp mask images of adjacent frequency components.

  Therefore, the band-limited image signal creation unit 252B upsamples the vertical size of each unsharp mask image by a factor of 2 with interpolation, and then performs the subtractions Sorg - Sus1, Sus1 - Sus2, Sus2 - Sus3, Sus3 - Sus4, ..., Sus(k-1) - Susk between the image signal Sorg of the projection image and the image signal Sus1 of the adjacent unsharp mask image, and between the image signals of the unsharp mask images of adjacent frequency components, creating a plurality of band-limited image signals Sb1, Sb2, Sb3, ..., Sbk corresponding to the band-limited images of a plurality of frequency components.
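The vertical-only decimation and the up-sampling needed before subtraction can be sketched as follows (nearest-neighbour up-sampling via `np.repeat` is our simplification; the patent only requires up-sampling by a factor of 2 with interpolation):

```python
import numpy as np

def downsample_vertical(img):
    """Halve the vertical size by keeping every other row."""
    return img[::2, :]

def upsample_vertical(img, target_rows):
    """Double the vertical size by row repetition, cropped to target_rows."""
    return np.repeat(img, 2, axis=0)[:target_rows, :]

def vertical_band(sorg):
    """One pyramid level: decimate vertically, re-expand, subtract from the input."""
    sus = upsample_vertical(downsample_vertical(sorg), sorg.shape[0])
    return sorg - sus
```

The horizontal size is untouched throughout, matching the description that only the source-motion direction is decimated.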

  Further, as shown in the figure, the band-limited images after the subtraction processing also have different vertical sizes. Therefore, although not shown in FIG. 16, the integration processing unit 246B, like the band-limited image signal creation unit 252B, upsamples the vertical size of each band-limited image by a factor of 2 with interpolation, and then integrates the plurality of band-limited image signals to create the converted image signal Sproc.

  Note that not only the pixels in the vertical direction of the image but also those in the horizontal direction may be thinned out simultaneously to create a plurality of unsharp mask images having different resolutions.

  The radiographic apparatus of the above embodiments captures projection images of a subject by tomosynthesis imaging while moving the radiation source along a linear trajectory. However, the present invention is not limited to this; it is also applicable to radiation imaging apparatuses that perform tomosynthesis imaging by moving the radiation source along a trajectory other than a linear one, for example along an arc-shaped trajectory centered on the subject, to capture projection images of the subject. In that case, the present invention can be applied by changing the calculation formula for obtaining the corresponding pixels according to the imaging geometry.

  In addition, the radiographic apparatus of the above embodiment reconstructs a tomographic image at a predetermined cross section of the subject by the shift-and-add method. However, the present invention is not limited to this, and can also be applied to radiographic apparatuses that reconstruct a tomographic image by a reconstruction method other than the shift-and-add method, such as the filtered back projection method, in which the corresponding pixels are accumulated to reconstruct the tomographic image. For example, in the case of a radiographic apparatus that reconstructs a tomographic image by the filtered back projection method, the present invention can be applied by using the projection images after filter processing.
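A minimal sketch of that pre-filtering step: a one-dimensional ramp (|f|) filter applied to each detector row before the corresponding pixels are accumulated exactly as in shift-and-add. The FFT-based implementation and the function name are assumptions; a practical system would use an apodized ramp.

```python
import numpy as np

def ramp_filter_rows(projection):
    """Apply a 1-D ramp filter along the last axis of each projection row,
    the pre-filtering step of filtered back projection. The subsequent
    accumulation of corresponding pixels is unchanged."""
    n = projection.shape[-1]
    ramp = np.abs(np.fft.fftfreq(n))        # |f| response, zero at DC
    spectrum = np.fft.fft(projection, axis=-1) * ramp
    return np.real(np.fft.ifft(spectrum, axis=-1))
```

Because the ramp is zero at DC, a constant row filters to zero, which is the expected behaviour of this high-pass weighting.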

  It is also possible to use the radiographic apparatus of the first or second aspect in combination with the radiographic apparatus of any one of the first to third embodiments of the third aspect. That is, by supplying the plurality of converted image signals Sproc, corresponding to the plurality of converted images created by the radiographic apparatus of the third aspect, to the radiographic apparatus of the first or second aspect in place of the plurality of projection image signals corresponding to the plurality of projection images captured by tomosynthesis imaging, the tomographic image can be reconstructed from the plurality of converted images.

  In this case, the converted image signal Sproc corresponding to the converted image after the non-linear conversion processing created by the non-linear conversion processing unit 238 of the first embodiment of the third aspect, or the converted image signal Sproc corresponding to the converted image after the integration processing created by the integration processing unit 246B of the second embodiment or the integration processing unit 246C of the third embodiment, is supplied to the similarity calculation unit 136 of the radiographic apparatus of the first aspect or to the back projection processing unit 142 of the radiographic apparatus of the second aspect.

  Thereby, both the effects of the radiographic apparatus of the third aspect and the effects of the radiographic apparatuses of the first and second aspects can be obtained. That is, in the reconstructed tomographic image, it is possible, without increasing the calculation time, to reduce the occurrence of artifacts and the emphasis of noise components caused by over-emphasizing originally high-contrast portions such as metal. Furthermore, it is possible to emphasize the target structure and to prevent the occurrence of artifacts due to the influence of structures other than the target structure.
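The similarity-weighted accumulation underlying the first and second aspects can be illustrated with normalized cross-correlation used directly as the weighting coefficient: a converted image's contribution to a tomographic-image pixel grows with its similarity to the reference converted image. The names, the patch-based formulation, and the clipping of negative correlations to zero are assumptions made for this sketch.

```python
import numpy as np

def ncc(a, b, eps=1e-12):
    """Normalized cross-correlation between two equally sized patches."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum()) + eps
    return float((a * b).sum() / denom)

def weighted_accumulate(ref_patch, patches, values):
    """Accumulate one tomographic-image pixel: each converted image's
    pixel value is multiplied by a weight that grows with its NCC
    similarity to the reference converted image (here the clipped NCC
    itself), then the weighted sum is normalized."""
    total, wsum = 0.0, 0.0
    for patch, value in zip(patches, values):
        w = max(ncc(ref_patch, patch), 0.0)   # similarity used as weight
        total += w * value
        wsum += w
    return total / wsum if wsum else 0.0
```

Dissimilar (e.g. anti-correlated) patches thus contribute nothing, which is how structures other than the target structure are suppressed in the accumulation.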

The present invention is basically as described above.
Although the present invention has been described in detail above, the present invention is not limited to the above-described embodiments, and it goes without saying that various improvements and modifications may be made without departing from the gist of the present invention.

DESCRIPTION OF SYMBOLS 10 Radiographic apparatus 12 Imaging apparatus 14 Console 16 Radiation source 18 Radiation control device 20 Imaging stand 24 Input device 26 Control device 28, 28A, 28B, 28C, 28D, 28E Image processing device 30 Recording device 32 Display device 34 Subject 136, 144 Similarity calculation unit 138, 146 Weighting coefficient calculation unit 140, 142 Back projection processing unit 148 Multiplication processing unit 236 Filter processing unit 238, 244B, 244C Non-linear conversion processing unit 240, 248 Back projection processing unit 242 Frequency filter processing unit 246B, 246C Integration processing unit 250 Unsharp mask image signal creation unit 252 Band-limited image signal creation unit

Claims (11)

  1. A radiographic apparatus for reconstructing a tomographic image at a predetermined cross section of a subject from a plurality of projection images of the subject captured by tomosynthesis imaging, comprising:
    a frequency filter processing unit that creates a plurality of band-limited image signals having different frequency response characteristics from the projection image signal corresponding to each projection image;
    a non-linear conversion processing unit that non-linearly converts the band-limited image signals;
    an integration processing unit that integrates the plurality of band-limited image signals non-linearly converted by the non-linear conversion processing unit to create a converted image signal; and
    a back projection processing unit that reconstructs a tomographic image at a predetermined cross section of the subject from the plurality of converted image signals corresponding to the plurality of projection images,
    wherein the non-linear conversion processing unit non-linearly converts the band-limited image signals so that components of the band-limited image signals exceeding a first predetermined value are reduced in the converted image signal, and further non-linearly converts the band-limited image signals so that components of the band-limited image signals included in the converted image signal become smaller the more they lie on the low-frequency side of the projection image signal rather than on the high-frequency side.
  2. The radiographic apparatus according to claim 1, wherein the non-linear conversion processing unit further non-linearly converts the band-limited image signals so that, for pixels whose projection image signal is smaller than a third predetermined value, the components of the band-limited image signals included in the converted image signal become smaller the farther toward the end side the projection image was captured relative to a projection image captured from directly in front of the subject.
  3. The radiographic apparatus according to claim 1 or 2, wherein the non-linear conversion processing unit further non-linearly converts the band-limited image signals so that, for pixels whose projection image signal is smaller than a second predetermined value, the components of the band-limited image signals included in the converted image signal become smaller.
  4. The radiographic apparatus according to any one of claims 1 to 3, wherein the frequency filter processing unit creates the plurality of band-limited image signals by one-dimensionally band-limiting the projection image signal in the movement direction of the radiation source.
  5. The radiographic apparatus according to any one of claims 1 to 4, further comprising:
    a similarity calculation unit that, using the converted image corresponding to one of the converted image signals as a reference converted image, calculates the similarity between the pixels on the reference converted image and the pixels on each converted image that are cumulatively added to the same position on the tomographic image; and
    a weighting coefficient calculation unit that calculates, for each pixel of the plurality of converted images, a weighting coefficient that increases as the similarity increases,
    wherein the back projection processing unit reconstructs the tomographic image by cumulatively adding the products of the pixel values of the pixels of the plurality of converted images cumulatively added to the same position on the tomographic image and the corresponding weighting coefficients.
  6. The radiographic apparatus according to claim 5, wherein the similarity calculation unit calculates the similarity between a first region on the reference converted image and a second region on each converted image that are cumulatively added to the same position on the tomographic image.
  7. The radiographic apparatus according to claim 5, wherein the similarity calculation unit uses, as the reference converted image, the converted image captured from directly in front of the subject among the plurality of converted images.
  8. The radiographic apparatus according to any one of claims 1 to 4, wherein the back projection processing unit reconstructs the tomographic image from the plurality of converted images, the apparatus further comprising:
    a similarity calculation unit that calculates the similarity between the pixels on the plurality of converted images cumulatively added to the same position on the tomographic image;
    a weighting coefficient calculation unit that calculates, for each pixel of the tomographic image, a weighting coefficient that increases as the similarity increases; and
    a multiplication processing unit that creates a multiplication-processed image by multiplying the pixel value of each pixel of the tomographic image by the corresponding weighting coefficient.
  9. The radiographic apparatus according to claim 8, wherein the similarity calculation unit calculates the similarity between predetermined regions on the plurality of converted images that are cumulatively added to the same position on the tomographic image.
  10. The radiographic apparatus according to any one of claims 5 to 9, wherein the similarity calculation unit calculates the similarity by normalized cross-correlation.
  11. The radiographic apparatus according to any one of claims 5 to 10, wherein the weighting coefficient calculation unit uses the similarity as the weighting coefficient.
JP2012129947A 2011-07-05 2012-06-07 Radiography equipment Active JP5563018B2 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
JP2011149411 2011-07-05
JP2012129947A JP5563018B2 (en) 2011-07-05 2012-06-07 Radiography equipment

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
JP2012129947A JP5563018B2 (en) 2011-07-05 2012-06-07 Radiography equipment
EP12172019A EP2535872A1 (en) 2011-06-15 2012-06-14 Radiographic imaging system
US13/523,543 US9286702B2 (en) 2011-06-15 2012-06-14 Radiographic imaging system

Publications (2)

Publication Number Publication Date
JP2013031641A JP2013031641A (en) 2013-02-14
JP5563018B2 true JP5563018B2 (en) 2014-07-30

Family

ID=47788045

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2012129947A Active JP5563018B2 (en) 2011-07-05 2012-06-07 Radiography equipment

Country Status (1)

Country Link
JP (1) JP5563018B2 (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5952251B2 (en) * 2012-12-14 2016-07-13 富士フイルム株式会社 Image processing apparatus, radiographic imaging system, image processing program, and image processing method
WO2015045165A1 (en) * 2013-09-30 2015-04-02 株式会社島津製作所 Radiation tomographic image processing method and radiation tomographic imaging apparatus
US10335107B2 (en) 2014-09-19 2019-07-02 Fujifilm Corporation Tomographic image generation device and method, and recording medium
JP6185023B2 (en) * 2014-09-19 2017-08-23 富士フイルム株式会社 Tomographic image generating apparatus, method and program
JP6370280B2 (en) * 2015-09-16 2018-08-08 富士フイルム株式会社 Tomographic image generating apparatus, method and program

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4903204A (en) * 1987-12-01 1990-02-20 Duke University Matrix inversion tomosynthesis improvements in longitudinal X-ray slice imaging
JP2689176B2 (en) * 1990-03-26 1997-12-10 富士写真フイルム株式会社 Tomography image processor
JPH0549631A (en) * 1991-08-28 1993-03-02 Shimadzu Corp X-ray image photographing device
JP3816151B2 (en) * 1995-09-29 2006-08-30 富士写真フイルム株式会社 Image processing method and apparatus
JP3783116B2 (en) * 1996-09-25 2006-06-07 富士写真フイルム株式会社 Radiation image enhancement processing method and apparatus
JP4054402B2 (en) * 1997-04-25 2008-02-27 株式会社東芝 X-ray tomography equipment
JP2004073449A (en) * 2002-08-16 2004-03-11 Canon Inc Radiograph
US6751284B1 (en) * 2002-12-03 2004-06-15 General Electric Company Method and system for tomosynthesis image enhancement using transverse filtering
JP2005052295A (en) * 2003-08-01 2005-03-03 Fuji Photo Film Co Ltd Image processing apparatus and program
US7522755B2 (en) * 2005-03-01 2009-04-21 General Electric Company Systems, methods and apparatus for filtered back-projection reconstruction in digital tomosynthesis
JP2008245999A (en) * 2007-03-30 2008-10-16 Fujifilm Corp Radiographic equipment

Also Published As

Publication number Publication date
JP2013031641A (en) 2013-02-14


Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20130118

A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20130425

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20130430

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20130528

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20140212

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20140410

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20140603

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20140611

R150 Certificate of patent or registration of utility model

Ref document number: 5563018

Country of ref document: JP

Free format text: JAPANESE INTERMEDIATE CODE: R150

R250 Receipt of annual fees

Free format text: JAPANESE INTERMEDIATE CODE: R250
