JP2005058760A - Image data processing apparatus and image data processing method - Google Patents

Image data processing apparatus and image data processing method

Info

Publication number
JP2005058760A
Authority
JP
Japan
Prior art keywords
image data
parameter
blur
reconstruction
data processing
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
JP2004226115A
Other languages
Japanese (ja)
Other versions
JP4686147B2 (en)
Inventor
Toshihiro Rifu
俊裕 利府
Original Assignee
Toshiba Corp
株式会社東芝
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority to JP2003205025
Application filed by Toshiba Corp
Priority to JP2004226115A
Publication of JP2005058760A
Application granted
Publication of JP4686147B2
Expired - Fee Related
Anticipated expiration

Abstract

An object of the present invention is to reliably reduce, for each pixel of captured 3D image data, the blur components caused by the basic performance of the image capturing apparatus and by the image capturing conditions, and thereby to improve the accuracy and quality of image processing that uses the 3D image data.
A blur function for reducing blur for each pixel of real-space three-dimensional image data, obtained by photographing the internal structure of an object under given photographing parameters, is set according to at least one of a parameter representing the basic performance of the photographing apparatus and the photographing parameters, and the image data is subjected to correction processing for blur reduction using the set blur function. For example, the blur function is a PSF (point spread function) and the correction process is a deconvolution. This correction process can be executed in each of the X-, Y-, and Z-axis directions.
[Selection] Figure 4

Description

  The present invention relates to an image data processing apparatus and an image data processing method, and more particularly to an image data processing apparatus and an image data processing method that perform, as post-processing, blur-reduction correction on 3D image data collected by a modality such as an X-ray CT (computed tomography) scanner.

  In recent years, X-ray CT apparatuses have become widespread as computed tomography (CT) apparatuses, and their use has expanded not only to medical purposes but also to industrial purposes.

  In recent years, in response to the strong demand from the medical field to capture images with higher definition (higher resolution) and over a wider range, multi-slice X-ray CT apparatuses have been developed and have become quite popular.

  A multi-slice X-ray CT apparatus includes an X-ray source that radiates a fan-beam X-ray having a spread in the slice direction (the longitudinal direction of the bed) and a two-dimensional detector in which a plurality of detection element rows (for example, 4 rows, 8 rows, etc.) are arranged in the slice direction, and it operates in a multi-scan or helical-scan mode. Thereby, compared with a single-slice X-ray CT apparatus, three-dimensional image data over a wide range of a subject can be obtained with high accuracy and in a short time.

  The three-dimensional image data obtained in this way is not only displayed and observed, but has various uses in recent years.

  For example, in the medical field there is the measurement of the stenosis rate of a blood vessel and of an aneurysm. Specifically, by administering an X-ray contrast agent to the subject and performing imaging with the X-ray CT apparatus, three-dimensional image data in which the distribution of the contrast agent flowing through the blood vessel is imaged can be obtained. The stenosis rate of the blood vessel and the size of the aneurysm are then measured from the distribution of the CT values of the contrast agent reflected in the three-dimensional image data. For example, when measuring the stenosis rate, the inner width of the blood vessel (the range occupied by the contrast agent) is measured from the three-dimensional image data (also called volume data) and compared with the width of the blood vessel in a portion considered normal. In measuring the width of the blood vessel, a threshold for the CT value is usually set.

On the other hand, Patent Documents 1 and 2 show other examples of processing three-dimensional image data obtained from various medical apparatuses, including not only an X-ray CT apparatus but also an ultrasonic diagnostic apparatus and a magnetic resonance imaging apparatus. Patent Document 1 aims to perform reliable blood vessel measurement based on a displayed image: a region of interest crossing the blood vessel wall perpendicularly is set on a tomographic image of the blood vessel, and dimension values related to the blood vessel are measured based on the profile of pixel values in this region. Patent Document 2, in turn, aims to accurately measure, using an MIP image, the length of an object of interest (a blood vessel, the intestine, etc.) that is curved in a direction not parallel to the projection plane.
Patent Document 1: JP-A-11-342132; Patent Document 2: JP-A-2000-350726

  However, when measuring the stenosis rate or an aneurysm of a blood vessel as described above, there is the problem that the three-dimensional image data itself suffers a reduction in spatial resolution (specifically, a loss of spatial resolution due to a "blur" component in the pixel values) that depends on the basic performance of the X-ray scanner itself and on the imaging (scanning) conditions.

  For example, the resolution of the subject in the body axis direction (usually the longitudinal direction of the bed: the Z-axis direction) varies depending on the slice thickness. When the slice thickness is increased, "blur" occurs in the body axis direction, the resolution in that direction decreases, and the image quality deteriorates. The "blur" can be characterized by scanning small particles having a high X-ray absorption rate and examining the amplitude of the spatial frequency components, that is, by a PSF (point spread function). In the measurement of the stenosis rate, if a "blur" component is present in the image data, the blood vessel is measured as thicker than it actually is, so the error in the measured stenosis rate increases and the reliability of the measurement falls.

  In addition, there is the problem that the spatial resolution affected by the above-described blur component further differs between the Z-axis direction and the X- and Y-axis directions orthogonal to it. That is, since the three-dimensional image data obtained by an X-ray CT apparatus has directional spatial resolution, the error component varies with direction and the image quality becomes unstable.

  In Patent Documents 1 and 2 described above, no consideration is given to the above-mentioned “blur” component, even though three-dimensional image data obtained by a medical apparatus including an X-ray CT apparatus is used.

  The present invention has been made in view of the above circumstances. Its purpose is to reliably reduce, for each pixel, the blur component contained in three-dimensional image data, even when the data was photographed with different photographing apparatuses and under various photographing conditions, and thereby to improve the accuracy and quality of the results of image processing performed on the three-dimensional image data.

  In order to achieve the above object, an image data processing apparatus is provided as one aspect of the present invention. This apparatus processes real-space three-dimensional image data obtained by photographing the internal structure of an object under given photographing parameters, and comprises blur function setting means for setting a blur function for reducing blur for each pixel of the image data according to at least one of a parameter representing the basic performance of the photographing apparatus and the photographing parameters, and correction means for performing correction processing for blur reduction on the image data using the blur function set by the blur function setting means.

  For example, the 3D image data is obtained by reconstructing data collected by scanning a subject, as the object, using a radiation CT apparatus, and the photographing parameters include parameters relating to the basic performance of the radiation CT apparatus and parameters relating to imaging that are arbitrarily set at the time of data collection.

  The imaging parameters preferably include at least the slice thickness during scanning and reconstruction, and the type of reconstruction algorithm.

  The imaging parameters include, for example, a slice thickness at the time of scanning and reconstruction, a type of reconstruction algorithm, a reconstruction condition, a reconstruction function, a pixel size, and a helical pitch at the time of helical scanning.

  For example, the blur function setting unit is configured to set the blur function at least in the body axis direction of the subject in the three-dimensional image data.

  An apparatus according to another aspect of the present invention substantially minimizes blurring of image data obtained by scanning. The apparatus comprises a scan unit that scans a known object to generate first image data and scans an object of interest to generate second image data, and a data processing unit, connected to the scan unit, that determines a PSF (point spread function) based on the first image data of the known object, corrects the PSF according to a combination of parameters to generate an improved PSF, and subjects the second image data to deconvolution with the improved PSF to substantially minimize the blur of the second image data.

  Furthermore, an apparatus according to another aspect of the present invention substantially minimizes blurring of image data, and comprises a scan unit that scans a known object to generate first three-dimensional image data and scans an object of interest to generate second three-dimensional image data, and a data processing unit, connected to the scan unit, that determines a PSF (point spread function) based on the first three-dimensional image data of the known object, corrects the PSF according to a combination of parameters to generate an improved PSF, and subjects the second three-dimensional image data to a deconvolution process with the improved PSF to substantially minimize the blur of the second three-dimensional image data.

  On the other hand, in the image data processing method according to the present invention, a blur function for reducing blur for each pixel of real-space three-dimensional image data, obtained by photographing the internal structure of an object under given photographing parameters, is set in accordance with at least one of a parameter representing the basic performance of the photographing apparatus and the photographing parameters, and the image data is subjected to correction processing for blur reduction using the set blur function.

  The present invention has a typical configuration and basic functions and effects as described above, but other functions and effects will become apparent through the description of the accompanying drawings and the embodiments of the invention described below.

  According to the present invention, an image data processing apparatus and an image data processing method can be provided in which the blur component due to the basic performance of the photographing apparatus and the photographing conditions is reliably reduced for each pixel of the photographed three-dimensional image data, and in which the accuracy and quality of the results of image processing using that three-dimensional image data are thereby improved.

  Hereinafter, one embodiment of the present invention will be described with reference to FIGS.

  FIG. 1 shows the configuration of a multi-slice CT apparatus as an embodiment of an X-ray CT apparatus as a radiation CT apparatus according to this embodiment. This multi-slice CT apparatus can perform not only a multi-slice helical scan but also a conventional scan (single-slice scan and multi-slice scan).

  As shown in FIG. 1, the multi-slice CT apparatus 10 has a bed (not shown) on which a subject (for example, a patient) P is placed, a gantry G that has a diagnostic opening OP into which the subject P is inserted for diagnosis and that collects projection data of the subject P, and a data processing unit U that controls the overall operation of the gantry G, collects the projection data, and performs image reconstruction processing, image display, and the like.

  The bed has a top plate that is slidable in the longitudinal direction by driving a bed driving unit (not shown). Usually, the subject P is placed so that the body axis direction coincides with the longitudinal direction.

  The gantry G includes an X-ray tube 11 as a radiation source disposed opposite to the subject P inserted into the diagnostic opening OP, an X-ray detector system 14 comprising a two-dimensional X-ray detector 12 as a radiation detector and a data acquisition device (DAS) 13, a non-contact data transmission device 15, a gantry driving unit 16, and a slip ring 17.

  The X-ray tube 11 and the X-ray detector system 14 (including the X-ray detector 12 and the DAS 13) are mounted on a rotating ring 21 that can rotate within the gantry G; the ring 21 is rotated under drive control from the gantry driving unit 16. As a result, the X-ray tube 11 and the X-ray detector system 14 rotate together around a rotation center axis parallel to the body axis direction of the subject P inserted into the diagnostic opening OP of the gantry G. The rotating ring 21 is driven to rotate at a high speed of one second or less per rotation.

  The X-ray tube 11 generates a cone-beam (quadrangular-pyramid-shaped) or fan-beam X-ray toward the subject P placed in the effective field of view FOV. The X-ray tube 11 is supplied with the electric power (tube voltage, tube current) necessary for X-ray exposure from the high voltage generator 18 through the slip ring 17. As a result, the X-ray tube 11 can generate so-called cone-beam or fan-beam X-rays that spread in two directions: a slice direction parallel to the rotation center axis and a channel direction orthogonal to the slice direction. In normal diagnosis, the subject P is placed on the top plate along the longitudinal direction of the bed, so the slice direction coincides with the body axis direction of the subject P.

  In the gantry G, a collimator 19 is provided between the X-ray tube 11 and the subject P to shape the cone-shaped or fan-shaped X-ray beam emitted from the X-ray focal spot of the X-ray tube 11 to the required size.

  In the data processing unit U, with a host controller 20 at its center, a preprocessing device 21 that performs preprocessing such as data correction, a storage device 22, an auxiliary storage device 23, a data processing device 24, a reconstruction device 25, an input device 26, and a display device 27 are connected to one another via a data/control bus 28.

  Further, the bus 28 is connected to an external image processing device 30. The image processing device 30 includes an auxiliary storage device 31, a data processing device 32, a reconstruction device 33, an input device 34, and a display device 35.

  The detection operation by the X-ray detector 12 is repeated, for example, about 1000 times during one rotation (about one second). As a result, an enormous amount of two-dimensional projection data for M × N channels is generated, for example, 1000 times per second (per rotation). The DAS 13 and the non-contact data transmission device 15 therefore operate at extremely high speed so as to transmit this enormous, rapidly generated two-dimensional projection data with almost no time delay.

  The digitized projection data sent to the data processing unit U is passed to the preprocessing device 21, which performs sensitivity correction, X-ray intensity correction, and the like on it. For example, 1000 sets (1000 views) of two-dimensional projection data over 360°, having undergone sensitivity correction, X-ray intensity correction, and the like in the preprocessing device 21, are temporarily stored in the auxiliary storage device 23. The reconstruction device 25 generates (reconstructs) tomographic image data of each slice by applying a fan-beam or cone-beam reconstruction method to the projection data stored in the auxiliary storage device 23. Three-dimensional image data in real space is thereby obtained and stored in the storage device 22.

  In the present embodiment, the data processing device 24 performs the image data processing for blur reduction according to the present invention as post-processing on the three-dimensional image data reconstructed as described above.

  The contents of this image data processing will be described below. In the present description, this image data processing is executed by the data processing device 24 integrated with the multi-slice CT apparatus 10, but it may instead be executed by the reconstruction device 25 integrated with the multi-slice CT apparatus 10, or by the data processing device 32 or the reconstruction device 33 in the external image processing device 30. Further, such image data processing may be executed by a computer device separate from the multi-slice CT apparatus 10. Therefore, the data processing devices 24 and 32, the reconstruction devices 25 and 33, and a separate computer device can each functionally realize the image data processing apparatus according to the present invention.

  As shown in FIG. 2, the data processing device 24, acting as an image data processing apparatus, first reads from the storage device 22 the three-dimensional image data collected and stored as described above, for example data taken under administration of a contrast medium (step S1).

  Next, the data processing device 24 reads from the storage device 22 basic performance information indicating the basic performance of the multi-slice CT apparatus 10 as a medical apparatus (for example, the X-ray focal spot size, the detector aperture width, the focus-to-rotation-center distance, etc.) and the imaging conditions planned at the time of imaging (for example, parameters including the slice thickness at the time of scanning and reconstruction, the type of reconstruction algorithm, the reconstruction condition, the reconstruction function, the pixel size, the helical pitch for a helical scan, the blood vessel angle when the imaging target is a blood vessel, etc.) (step S2). It is preferable that the imaging conditions include at least information indicating the slice thickness at the time of scanning and reconstruction, and more preferable that they additionally include the type of reconstruction algorithm.
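
Purely as an illustration of the two groups of information read in step S2, the sketch below collects them into two simple Python structures; all field names and example values are assumptions introduced here for readability and are not taken from the patent.

```python
from dataclasses import dataclass

@dataclass
class DevicePerformance:
    """Basic-performance information of the CT scanner (hypothetical field names)."""
    focus_size_x_mm: float         # X-ray focal spot size in the X direction
    focus_size_z_mm: float         # X-ray focal spot size in the Z (slice) direction
    detector_aperture_x_mm: float  # detector aperture width in the X direction
    focus_to_center_mm: float      # distance from the focal spot to the rotation center

@dataclass
class ImagingConditions:
    """Imaging conditions set at data-collection time (hypothetical field names)."""
    scan_slice_thickness_mm: float
    recon_slice_thickness_mm: float
    recon_algorithm: str           # e.g. fan-beam or cone-beam reconstruction
    recon_filter: str              # convolution (reconstruction) filter type
    pixel_size_mm: float
    helical_pitch_mm: float

device = DevicePerformance(1.0, 1.0, 1.0, 600.0)
conditions = ImagingConditions(1.0, 1.0, "cone_beam", "standard", 0.5, 4.0)
```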

  When this preparation is completed, the data processing device 24 sets a blur correction function for each pixel, reflecting the basic performance information and the imaging conditions read as described above (step S3).

  Here, as described above, "blur" represents a degradation of spatial resolution whose degree can be characterized by a PSF (point spread function).

  Now, assuming true three-dimensional image data D1 with no blur (see FIG. 3(a)), it can be considered that a blur component, determined by the basic performance of the apparatus and the state of the imaging conditions, is superimposed (convolved) on the collected data, so that three-dimensional image data D2 including the blur component is obtained (see FIG. 3(c) and arrows A1 and A2 in FIG. 3). Therefore, if some correction process for removing the blur component can be applied to the three-dimensional image data D2, three-dimensional image data D1′ with no blur (or only a small blur component) can be obtained. The problem is how to perform such correction processing.

  The present inventor focused on the facts that the spatial resolution of the three-dimensional data collected by an X-ray CT apparatus differs between the body axis direction (Z-axis direction) and the axial directions (X- and Y-axis directions), and that it varies depending on the basic performance of the apparatus and the imaging conditions; accordingly, the blur correction function is set for each pixel so as to reflect the basic performance information of the apparatus and the imaging conditions.

  This blur correction function is, as it were, a kind of filter function; it is set so that, by correcting (specifically, deconvolving) the captured three-dimensional image data with it, an image is obtained in which the blur and/or smear caused by the basic performance of the apparatus and the imaging conditions is reduced or largely eliminated.
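
To make the idea of "correcting by deconvolution" concrete, the following minimal sketch assumes the blur correction function is an anisotropic Gaussian PSF (one width per axis) and applies a regularized frequency-domain (Wiener-style) deconvolution; the Gaussian model, the function names, and the regularization constant are illustrative assumptions, not the patent's prescribed implementation.

```python
import numpy as np

def gaussian_psf(shape, sigmas):
    """Anisotropic Gaussian PSF: one standard deviation (in voxels) per axis."""
    grids = np.meshgrid(*[np.arange(n) - n // 2 for n in shape], indexing="ij")
    r2 = sum((g / s) ** 2 for g, s in zip(grids, sigmas))
    psf = np.exp(-0.5 * r2)
    return psf / psf.sum()

def deconvolve(volume, psf, eps=1e-2):
    """Wiener-style deconvolution: regularized division in the frequency domain."""
    H = np.fft.fftn(np.fft.ifftshift(psf))
    V = np.fft.fftn(volume)
    restored = np.fft.ifftn(V * np.conj(H) / (np.abs(H) ** 2 + eps))
    return np.real(restored)

# Blur along Z (the slice direction) is assumed stronger than in X and Y.
volume = np.random.rand(64, 64, 64)                        # stands in for reconstructed 3D data
psf = gaussian_psf(volume.shape, sigmas=(1.0, 1.0, 2.5))   # (X, Y, Z) widths, illustrative values
restored = deconvolve(volume, psf)
```

The wider width along Z mirrors the slice-thickness blur discussed above.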

  This blur correction function covers at least the Z-axis direction, which coincides with the body axis direction of the subject, and is preferably set in each of the three X-, Y-, and Z-axis directions. If appropriate, it may be set only in the Z-axis direction.

  Conceptually, this blur correction function applies a high weight to each pixel, for example, as shown in FIG.

  Therefore, returning to FIG. 2 and continuing the description, the data processing device 24 performs deconvolution processing on the reconstructed three-dimensional image data using the blur function defined as described above (step S4). As a result, three-dimensional image data D1′, from which the blur component has been removed or greatly reduced, is obtained from the three-dimensional data D2 that contained the blur component caused by the basic performance of the apparatus and the imaging conditions (the flow of arrows B1 and B2 in FIG. 3). The blur-corrected three-dimensional image data D1′ is stored, for example, in the storage device 22.

  Further, the data processing device 24 reads, in response to a command from the user, the three-dimensional data D1′ stored in the storage device 22, and executes on this D1′ measurements such as the stenosis rate of a blood vessel, an aneurysm, a bone (ossicle), and the like (step S5).

  Therefore, according to the present embodiment, when measuring the stenosis rate of a blood vessel or an aneurysm, at least the blur component due to the basic performance of the apparatus and the imaging conditions has been greatly reduced in the image data to be measured, so the accuracy of the measurement is remarkably improved and its reliability can be increased.

  For example, even when the thicknesses of the slices constituting the three-dimensional image data differ along the Z-axis direction (the body axis direction of the subject), if a blur correction function that takes each pixel position in the Z-axis direction into account is set, the blur correction greatly reduces and equalizes the Z-axis blur component due, at least, to the influence of the slice thickness. That is, variations of the error factors in the Z-axis direction are eliminated together with the removal or reduction of the blur component. For this reason, even for a structure running at three-dimensionally complex angles, such as a blood vessel of the heart, a cross-sectional image obtained by curved MPR (cross-section transformation) from the three-dimensional image data renders the blood vessel with high precision at least in the Z-axis direction. Naturally, by also performing such blur correction along the X- and Y-axes (that is, in the XY cross section), blood vessels with complicated courses are rendered with high accuracy regardless of their three-dimensional direction.

  As a result, even when various measurements are performed as post-processing on image data captured by a medical apparatus or the like, the measurement accuracy is greatly improved, contributing to more accurate and quicker diagnosis, and the reliability of such measurements can be increased. Furthermore, because such correction processing avoids having to repeat the image data collection, the burden on the patient can be reduced.

(Another embodiment)
With reference to FIGS. 2 and 4, another embodiment of the present invention will be described. This embodiment will describe the above-described embodiment in more detail.

  Here, although this partially overlaps the above description, the terms used in the following description are organized. First, the "blur minimization function" is defined as a "correction function" (i.e., an "improved PSF") used to effectively minimize the blur, smear, and fuzziness of the 3D image obtained by reconstructing the acquired CT image data. Further, the "blur minimization process" (that is, the "deconvolution process") is defined as the process of applying the blur minimization function. Furthermore, "blur function" and "PSF" are used as synonyms referring to the same conventional notion, while "correction function", "blur correction function", "blur minimization function", and "improved PSF" are, in the present invention, substantially interchangeable terms referring to the same technique.

  In step S3, the PSF is improved on the basis of the following set of parameters to determine the correction function. This set of parameters is broadly divided into two groups, basic information and imaging conditions, both associated with a particular 3D reconstructed image. These parameters are usually independent of one another, although each parameter does not necessarily have to be independent.

  The basic information described above is device-dependent information, that is, information essentially tied to the specific CT apparatus that collects the image data. As an example, this basic information includes the X-ray focal spot size, the detector aperture width, and the focus-to-rotation-center distance. The imaging conditions are the conditions under which a specific set of three-dimensional reconstructed image data is collected. As an example, the imaging conditions include the slice thickness at the time of scanning and reconstruction, the type of reconstruction algorithm to be used, the reconstruction condition, the reconstruction function, the pixel size, the helical pitch of a helical scan, and, when the object is a tubular body, its angle. In one preferable aspect, the imaging conditions include at least information on the slice thickness at the time of scanning and reconstruction. In another aspect, the imaging conditions include, at a minimum, information on the slice thickness at the time of scanning and reconstruction together with the type of reconstruction algorithm. These parameters are used to improve a known PSF. This known PSF is information set in advance during a training process in which known objects are scanned.

Examples of the device-dependent parameters described above are summarized in Table 1, which gives detailed information about each parameter. Each parameter is listed in Table 1 under a parameter name, which functions as a variable. Each parameter has a range of values set for a typical CT scanner; these ranges are merely examples. The individual weight index in Table 1 takes an integer value indicating an individual sub-range of the corresponding range, with an associated weight. The known PSF is weighted using the associated weight, and the weighted known PSF is then used to substantially minimize the blur, smear, or fuzziness of the three-dimensional image data reconstructed from the captured CT image data. The numerical ranges depend on the particular scanner, but in one preferred embodiment the number of sub-ranges, i.e., the number of individual weight indices, is kept fixed.

Examples of the above-described imaging-condition parameters are summarized in Table 2, which gives detailed information about each parameter. Each parameter is listed in Table 2 under a parameter name, which functions as a variable. Each parameter has a range of values set for a typical CT scanner or processing environment. The individual weight index takes an integer representing an individual sub-range of the corresponding range, a specific algorithm, or a specific reconstruction mode, and each individual weight index is associated with a specific weight value. This associated weight value is used to weight the known PSF, and the weighted known PSF is used to substantially minimize the blur, smear, or fuzziness of the 3D image data reconstructed from the captured CT image data. The numerical ranges depend on the particular scanner, but in one preferred embodiment the number of sub-ranges, i.e., the number of individual weight indices, is kept fixed.
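
Since Tables 1 and 2 themselves are not reproduced here, the following sketch only illustrates the mechanism they describe: a parameter value is mapped to a sub-range (its individual weight index), and each index carries an associated weight used later to weight the known PSF. All numeric ranges, indices, and weights below are made-up placeholders.

```python
# Hypothetical weight table for one device-dependent parameter (e.g. focal spot size in X).
# Each entry: (sub-range lower bound, upper bound) -> (individual weight index, weight value).
FOCUS_SIZE_X_TABLE = [
    ((0.0, 0.7), (0, 0.90)),
    ((0.7, 1.2), (1, 1.00)),
    ((1.2, 2.0), (2, 1.15)),
]

def weight_for(value, table):
    """Return the (index, weight) pair whose sub-range contains the parameter value."""
    for (lo, hi), (idx, w) in table:
        if lo <= value < hi:
            return idx, w
    raise ValueError(f"value {value} falls outside all sub-ranges")

idx, w = weight_for(1.0, FOCUS_SIZE_X_TABLE)   # -> (1, 1.0)
```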

The type of convolution filter is represented by the parameter type_conv.filter. About 40 algorithms are known for this parameter, each being a method of reconstructing a 3D image from the scanned CT image data. These algorithms function as filters that adjust the image quality during three-dimensional reconstruction; for example, a different filter can be used for a specific part of the scanned body image data.

The type of reconstruction is represented by the parameter type_reconstruction. This parameter distinguishes four conditions under which the predetermined three-dimensional reconstruction is executed: as an example, a half-reconstruction mode, a full-reconstruction mode, a fan-beam reconstruction mode, and a cone-beam reconstruction mode.

The parameter representing the helical pitch is P_helical. This parameter is set on the assumption that the helical pitch equals the distance the bed moves per rotation. For example, when four slices are scanned helically at the same time, it is assumed that the X-ray beam is 4 mm wide, the beam pitch is in the range 0.5 to 2, and the helical pitch is in the range 2 mm to 8 mm.
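
As a quick numerical check of the ranges quoted above, assuming the common conventions that the helical pitch equals the table feed per rotation and that the beam pitch is the table feed divided by the total beam collimation:

```python
total_collimation_mm = 4 * 1.0                 # four simultaneous 1 mm slices -> 4 mm beam
for beam_pitch in (0.5, 2.0):
    helical_pitch_mm = beam_pitch * total_collimation_mm
    print(beam_pitch, helical_pitch_mm)        # 0.5 -> 2 mm, 2.0 -> 8 mm (matches the text)
```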

For this reason, the improved PSF is determined by combining the above-described parameters. That is, when the improved PSF is written as PSF_improved, PSF_improved is approximately defined by the following equation (1).

[Equation 1]
PSF_improved = PSF(d_focus-x, d_focus-z, a_dct-x, d_focus-to-focus, t_detector-collimation, t_reconstruction, type_conv.filter, type_reconstruction, size_pixel, P_helical) …… (1)

  In obtaining this improved PSF, the combination of parameters used to weight the PSF is arbitrary, but it is desirable that the imaging conditions include at least the parameter t_reconstruction, representing the slice thickness at reconstruction. In another suitable example, the imaging conditions include at least the parameter type_conv.filter, indicating the convolution filter type, and the parameter t_reconstruction, indicating the slice thickness at reconstruction.

  The actual weighting can be performed in various ways. As one example, it is preferable to introduce each of the above parameters through a separate table describing its individual weight indices and the corresponding weight values used to improve the known PSF. The total number of combinations of these weight values equals the product of the numbers of individual weight indices of all the above parameters.
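
One simple way to picture how the looked-up weights "improve" the known PSF, assuming the known PSF is Gaussian and the combined weight scales its width per axis, is sketched below; the multiplicative combination is an illustrative choice and is not specified by the patent.

```python
import numpy as np

def improved_sigmas(base_sigmas, weights):
    """Scale the known PSF widths (one per axis) by the combined parameter weight."""
    w = np.prod(weights)                       # one illustrative way to combine the weights
    return tuple(s * w for s in base_sigmas)

weights = [1.0, 0.95, 1.1, 1.0]                # values looked up from the per-parameter tables
sigmas = improved_sigmas((1.0, 1.0, 2.5), weights)   # per-axis widths of the improved PSF
```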

  By repeatedly executing this method, the target image O that minimizes the error E is obtained. From the gradient of the error E, the correction vector dE/dO is determined. Using this correction vector dE/dO, the target image O can be made to converge as the iteration expressed by the following equation (5) is performed.

[Equation 5]
O_(N+1) = O_N − a · dE/dO   (5)

  In equation (5), "a" is a constant. To maximize the convergence speed, it is preferable to choose optimum values of the constant a and the correction vector using methods such as the steepest descent method or the conjugate gradient method. On the other hand, as the convergence speed is increased, the convergence, that is, the deconvolution, becomes more susceptible to image noise, so the convergence speed must be balanced against image quality.
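
Equation (5) is an ordinary gradient-descent update. Assuming the error is the usual least-squares term E = ||PSF ⊗ O − V||², its gradient is the PSF correlated with the residual, which gives the minimal sketch below (fixed step size a; as noted above, a steepest-descent or conjugate-gradient step size would converge faster).

```python
import numpy as np
from scipy.signal import fftconvolve

def iterative_deconvolve(V, psf, a=0.5, n_iter=50):
    """Minimize E = ||psf (*) O - V||^2 by gradient descent: O <- O - a * dE/dO."""
    psf_flipped = psf[::-1, ::-1, ::-1]      # correlation = convolution with the flipped PSF
    O = V.copy()                             # start the iteration from the measured data
    for _ in range(n_iter):
        residual = fftconvolve(O, psf, mode="same") - V
        grad = fftconvolve(residual, psf_flipped, mode="same")   # dE/dO up to a constant factor
        O = O - a * grad
    return O

V = np.random.rand(32, 32, 32)               # blurred 3D measurement (stand-in data)
psf = np.ones((3, 3, 3)) / 27.0              # simple normalized box blur as the assumed PSF
O_hat = iterative_deconvolve(V, psf)
```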

As shown in the preferred example of FIG. 2, in the above process, which substantially removes the blur, smear, and fuzziness of the 3D reconstructed image data according to the present invention, the improved PSF, i.e., the blur minimization function, is used in place of "PSF" in equations (3) and (4). By using this PSF_improved, that is, the blur minimization function, the unnecessary blur in the image O′, which can be regarded as approximating the true image O, can be minimized.

The above description dealt with a PSF_improved applied uniformly in all three directions of the captured image data. However, this PSF_improved may also be set independently for each of the X-, Y-, and Z-axis directions, because the amount of blur is usually direction dependent with respect to the scan. In that case, a further parameter indicating the direction is added to PSF_improved, and a directional PSF_improved(x_i, y_i, z_i) is set as shown in the following equation (6).

[Equation 6]
PSF_improved(x_i, y_i, z_i) = PSF(d_focus-x, d_focus-z, a_dct-x, d_focus-to-focus, t_detector-collimation, t_reconstruction, type_conv.filter, type_reconstruction, size_pixel, P_helical) …… (6)

In equation (6), x_i, y_i, and z_i correspond to the respective coordinate positions, in other words, to the scan directions. Furthermore, the above-described method of improving the known PSF can also be implemented taking into account an arbitrary direction, such as a predetermined diagonal direction. That is, a true image of the object of interest cannot be collected ideally, owing to various inaccuracies including mechanical and optical factors. When PSF_improved is used, this is expressed by the following equation (7).

[Equation 7]
V(x, y, z) = PSF_improved(x_i, y_i, z_i) ⊗ v(x, y, z)   (7)

  In equation (7), v(x, y, z) is the ideal, that is, true, three-dimensional image data, and V(x, y, z) is the collected three-dimensional image data. The three-dimensional image data v′(x, y, z), corrected to be nearly ideal, can therefore be determined from the acquired CT image data V(x, y, z); v′(x, y, z) is expressed using a deconvolution process as shown in the following equation (8).

[Equation 8]
v′(x, y, z) = PSF_improved^(−1)(x_i, y_i, z_i) ⊗ V(x, y, z)   (8)

  Equation (8) is susceptible to noise, as described above, but by using a suitable iterative method the influence of noise can be made substantially negligible and the accuracy improved. As one example, and as described above, it is preferable to execute the processing based on equation (8) for each pixel.
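
One way to read the direction dependence of equation (6) is as a separable correction: a 1-D deconvolution is applied along each axis with that axis's own blur width. The sketch below assumes a separable Gaussian blur per axis and Wiener-style regularization; a genuinely shift-variant PSF, differing at every voxel, would instead require applying equation (8) voxel by voxel as the text describes.

```python
import numpy as np

def gaussian_1d(n, sigma):
    x = np.arange(n) - n // 2
    g = np.exp(-0.5 * (x / sigma) ** 2)
    return g / g.sum()

def deconvolve_axis(volume, sigma, axis, eps=1e-2):
    """Regularized 1-D deconvolution of a Gaussian blur along a single axis."""
    n = volume.shape[axis]
    H = np.fft.fft(np.fft.ifftshift(gaussian_1d(n, sigma)))
    shape = [1] * volume.ndim
    shape[axis] = n
    H = H.reshape(shape)                         # broadcast the 1-D transfer function along `axis`
    V = np.fft.fft(volume, axis=axis)
    return np.real(np.fft.ifft(V * np.conj(H) / (np.abs(H) ** 2 + eps), axis=axis))

volume = np.random.rand(64, 64, 64)
corrected = volume
for axis, sigma in enumerate((1.0, 1.0, 2.5)):   # weaker blur in X and Y, stronger along Z
    corrected = deconvolve_axis(corrected, sigma, axis)
```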

  With reference to FIG. 4, the preferred process, together with its basic effect of minimizing the blur of the collected CT image data, is summarized visually.

  FIG. 4A shows a sub-process for determining a known PSF from a known object such as a wire phantom. For example, the diameter of the two wires is smaller than the spatial resolution of the detector mounted on the specific CT scanner used. As shown in FIG. 4B, a set of three PSFs, one each in the X, Y, and Z directions, is thereby obtained. Each of the three PSFs traces a bell-shaped curve with a fairly wide point distribution.
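
The PSF determination from the wire phantom in FIG. 4A can be pictured as taking a 1-D profile through the wire image along each axis and estimating the width of the bell-shaped curve; the moment-based width estimate and the synthetic test volume below are illustrative assumptions.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def profile_sigma(profile):
    """Width of a 1-D point-response profile, estimated from its second moment."""
    profile = profile - profile.min()
    profile = profile / profile.sum()
    x = np.arange(profile.size)
    mean = np.sum(x * profile)
    return np.sqrt(np.sum(((x - mean) ** 2) * profile))

def psf_widths(volume, center):
    """Point-response widths along X, Y, and Z, taken through the wire position."""
    cx, cy, cz = center
    return (profile_sigma(volume[:, cy, cz]),
            profile_sigma(volume[cx, :, cz]),
            profile_sigma(volume[cx, cy, :]))

# Synthetic stand-in for the wire-phantom scan: a blurred point at the volume center.
vol = np.zeros((65, 65, 65))
vol[32, 32, 32] = 1.0
vol = gaussian_filter(vol, sigma=(1.0, 1.0, 2.5))
print(psf_widths(vol, (32, 32, 32)))             # roughly recovers the widths (1.0, 1.0, 2.5)
```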

  Next, an object of interest such as a blood vessel is scanned with the same CT scanner, and a three-dimensional image as shown in FIG. 4E is reconstructed. The three-dimensional reconstructed image of the blood vessel contains blurred or smeared portions that tend to cause inaccuracy when the physical characteristics of the blood vessel are measured. The image containing the blur and smear is regarded as the result of convolving the blur and smear with the original, ideal image of the object.

A set of information is collected and stored for the specific image data captured as described above. This information includes information on the device characteristics of the CT scanner used and information on the scanning conditions (imaging conditions) under which the specific image data was collected. As shown in FIG. 4C, both the apparatus information and the imaging-condition information are used as parameters, and the PSF is improved so as to substantially eliminate the blur and smear in the CT image of the blood vessel. The parameters include the combination of d_focus-x, d_focus-z, a_dct-x, d_focus-to-focus, t_detector-collimation, t_reconstruction, type_conv.filter, type_reconstruction, size_pixel, and P_helical.

  As a result of this improvement processing, three PSFs narrower than those shown in FIG. 4B are obtained, as shown schematically in FIG. 4D. The three PSFs exhibit a narrower (more limited) point distribution, which further improves the spatial resolution. The reconstructed three-dimensional image shown in FIG. 4E is then subjected to deconvolution processing using this improved PSF, and as a result the blur and smear are substantially eliminated.

  Owing to this substantial removal (minimization) of the blur and smear, the three-dimensional reconstructed image of the blood vessel is visualized with clear contours, as shown in FIG. 4F. As a result, when physical features are measured from the blood vessel image, the measurement accuracy is greatly improved. That is, far better measurement accuracy is obtained by using the clearly improved blood vessel image shown in FIG. 4F than by using the blurred or smeared blood vessel image shown in FIG. 4E.

  The present invention is not limited to the above-described embodiments, and various modifications can be made in practicing it without departing from the scope of the invention. In the embodiments above, the case where the image data processing method according to the present invention is applied to a medical apparatus was described, but it can also be applied to industrial fields such as reverse engineering. The medical apparatus is not limited to an X-ray CT scanner and also includes magnetic resonance imaging apparatuses and ultrasonic diagnostic apparatuses; image data collected from any such medical modality can be the target of the processing of the present invention. Further, the improved PSF described above is applicable not only to 3D image data and 3D reconstructed image data but also to 2D image data, and to image data processing methods combining 3D and 2D.

FIG. 1 is a block diagram showing the schematic configuration of an X-ray CT apparatus according to one embodiment of the present invention. FIG. 2 is a schematic flowchart showing the flow of the blur component correction processing of three-dimensional image data executed in the X-ray CT apparatus of the embodiment. FIG. 3 is a diagram explaining the principle of the blur component correction processing of three-dimensional image data carried out by the present invention in the embodiment. FIG. 4 is a diagram explaining the principle of the blur component correction processing of three-dimensional image data carried out by the present invention in another embodiment.

Explanation of symbols

11 X-ray tube  12 X-ray detector  13 DAS (data acquisition device)
14 X-ray detector system  22 Storage device  24, 32 Data processing device  25, 33 Reconstruction device

Claims (31)

  1. In an image data processing apparatus for processing 3D image data of real space obtained by photographing the internal structure of an object under given photographing parameters,
    A blur function setting means for setting a blur function for reducing blur for each pixel of the image data according to at least one of a parameter representing basic performance of the photographing apparatus and the photographing parameter;
    An image data processing apparatus comprising: correction means for performing correction processing for blur reduction on the image data using the blur function set by the blur function setting means.
  2. The three-dimensional image data is three-dimensional image data obtained by reconstructing data collected by scanning a subject as the object using a radiation CT apparatus,
    The image data processing apparatus according to claim 1, wherein the imaging parameters include a parameter relating to basic performance of the radiation CT apparatus and a parameter relating to imaging arbitrarily set at the time of data collection.
  3. The image data processing apparatus according to claim 2, wherein the blur function setting unit is configured to set the blur function in at least a body axis direction of the subject in the three-dimensional image data. .
  4. The blur function setting means separately sets the blur function in each of a total of three directions including a body axis direction of the subject in the three-dimensional image data and two orthogonal directions along a plane orthogonal to the body axis direction. The image data processing apparatus according to claim 3, wherein the image data processing apparatus is configured to be set.
  5. The image data processing apparatus according to any one of claims 2 to 4, wherein the imaging parameter includes at least a slice thickness during scanning and reconstruction.
  6. The image data processing apparatus according to any one of claims 2 to 4, wherein the parameters relating to imaging include at least a slice thickness at the time of scanning and reconstruction, and a type of reconstruction algorithm.
  7. The imaging parameters include a slice thickness at the time of scanning and reconstruction, a type of reconstruction algorithm, a reconstruction condition, a reconstruction function, a pixel size, and a helical pitch at the time of helical scanning. 5. The image data processing device according to any one of 4.
  8. The image processing apparatus according to claim 1, wherein the correction unit is a unit that performs a deconvolution process with a characteristic corresponding to the blur function on the image data.
  9. An image data processing method wherein a blur function for reducing blur for each pixel of real-space three-dimensional image data, obtained by imaging the internal structure of an object under given imaging parameters, is set according to at least one of a parameter indicating the basic performance of the imaging apparatus and the imaging parameters,
    and correction processing for blur reduction is performed on the image data using the set blur function.
  10. In an apparatus that substantially minimizes blurring of image data obtained by scanning,
    A scan unit that scans a known object to generate first image data and scans an object of interest to generate second image data;
    and a data processing unit, connected to the scan unit, that determines a PSF (point spread function) based on the first image data of the known object, corrects the PSF according to a combination of parameters to generate an improved PSF, and subjects the second image data to a deconvolution process with the improved PSF to substantially minimize the blur of the second image data.
  11. The apparatus according to claim 10, further comprising a memory unit for storing the parameters, the first image data, and the second image data, the memory unit being connected to the scan unit and the data processing unit.
  12. The apparatus according to claim 10, wherein the data processing unit is configured to substantially minimize blur in the X-axis direction during the scan of the second image data, based on the improved PSF.
  13. The apparatus according to claim 10, wherein the data processing unit is configured to substantially minimize blur in the Y-axis direction during the scan of the second image data, based on the improved PSF.
  14. The apparatus according to claim 10, wherein the data processing unit is configured to substantially minimize blur in the Z-axis direction during the scan of the second image data, based on the improved PSF.
  15. The apparatus of claim 10, wherein the data processing unit is configured to apply the improved PSF to each pixel of the second image data.
  16. 12. The apparatus according to claim 11, wherein the memory unit stores, as the parameter, a parameter including a combination of apparatus-dependent parameters depending on an apparatus that performs the scan and an imaging condition parameter.
  17. The apparatus of claim 16, wherein the memory unit stores a predetermined set of weight values for weighting each of the parameters.
  18. The apparatus according to claim 16, wherein the device-dependent parameters include a parameter d_focus-x indicating the focal spot size in the X-axis direction, a parameter d_focus-z indicating the focal spot size in the Z-axis direction, a parameter a_dct-x indicating the detector aperture in the X-axis direction, and a parameter d_focus-to-focus indicating the distance between the focal spot and the center of rotation.
  19. The apparatus according to claim 16, wherein the imaging condition parameters include a parameter t_detector-collimation indicating the slice thickness defined by the collimator of the detector, a parameter t_reconstruction indicating the slice thickness at the time of reconstruction, a parameter type_conv.filter indicating the type of convolution filter used for the reconstruction, a parameter type_reconstruction indicating the type of reconstruction mode, a parameter size_pixel indicating the pixel size, and a parameter P_helical indicating the helical pitch.
  20. The apparatus according to claim 10, wherein the data processing unit is a unit that reconstructs the first image data and the second image data in two dimensions.
  21. The apparatus according to claim 10, wherein the data processing unit is a unit that reconstructs the first image data and the second image data in three dimensions.
  22. The apparatus according to claim 10, wherein the data processing unit is configured to apply a deconvolution process to each pixel of the second image data.
  23. The apparatus according to claim 10, wherein the scanning unit is an X-ray CT scanner.
  24. The apparatus according to claim 23, wherein the object of interest is a human body, the X-ray CT scanner scans the human body in the body axis direction along the Z-axis direction, and the data processing unit is configured to substantially minimize the blur in the Z-axis direction based on the improved PSF.
  25. 25. The apparatus of claim 24, wherein the data processing unit is a unit that substantially minimizes blur in each of the X-axis and Y-axis directions orthogonal to the Z-axis direction based on the improved PSF.
  26. The apparatus according to claim 25, wherein the parameter includes a parameter t_reconstruction indicating a slice thickness at the time of the reconstruction.
  27. The apparatus according to claim 26, wherein the parameter includes a parameter type_reconstruction indicating a type of the reconstruction mode.
  28. 28. The apparatus according to claim 27, wherein the data processing unit is a unit that performs a deconvolution process on each pixel of the second image data.
  29. In an apparatus that substantially minimizes blurring of image data,
    A scan unit that scans a known object to generate first 3D image data and scans an object of interest to generate second 3D image data;
    and a data processing unit, connected to the scan unit, that determines a PSF (point spread function) based on the first three-dimensional image data of the known object, corrects the PSF according to a combination of parameters to generate an improved PSF, and subjects the second three-dimensional image data to deconvolution processing with the improved PSF to substantially minimize the blur of the second three-dimensional image data.
  30. The apparatus according to claim 29, wherein the parameter includes a parameter t_reconstruction indicating a slice thickness at the time of the reconstruction.
  31. The apparatus according to claim 29, wherein the parameter includes a parameter type_reconstruction indicating a type of the reconstruction mode.
JP2004226115A 2003-07-31 2004-08-02 Image data processing device Expired - Fee Related JP4686147B2 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2003205025 2003-07-31
JP2004226115A JP4686147B2 (en) 2003-07-31 2004-08-02 Image data processing device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2004226115A JP4686147B2 (en) 2003-07-31 2004-08-02 Image data processing device

Publications (2)

Publication Number Publication Date
JP2005058760A true JP2005058760A (en) 2005-03-10
JP4686147B2 JP4686147B2 (en) 2011-05-18

Family

ID=34379955

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2004226115A Expired - Fee Related JP4686147B2 (en) 2003-07-31 2004-08-02 Image data processing device

Country Status (1)

Country Link
JP (1) JP4686147B2 (en)


Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102759538B (en) * 2012-08-02 2014-04-16 西北工业大学 Method for measuring and modeling point spread function of cone beam CT system

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS61290573A (en) * 1985-06-19 1986-12-20 Hitachi Medical Corp X-ray ct device
JP2000157529A (en) * 1991-03-15 2000-06-13 Hitachi Medical Corp X-ray ct device
JPH10314152A (en) * 1997-05-19 1998-12-02 Hitachi Medical Corp X-ray photographing device
JP2001149364A (en) * 1999-11-30 2001-06-05 Shimadzu Corp Cone beam x-ray ct system
JP2003190100A (en) * 2001-12-27 2003-07-08 Konica Corp Medical image processor, medical image processing method, program and recording medium having program recorded thereon

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008530517A (en) * 2004-11-17 2008-08-07 コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ Reconstruction of 2D planar images of nuclear medicine by iterative constraint deconvolution
JP2013166033A (en) * 2006-02-27 2013-08-29 Toshiba Corp Image display apparatus and x-ray ct apparatus
CN101336828B (en) * 2007-07-06 2010-10-13 Ge医疗系统环球技术有限公司 Acquisition method and device of CT value correction paper
EP2015560A2 (en) 2007-07-13 2009-01-14 Morpho Inc. Image data processing method and imaging apparatus
US8155467B2 (en) 2007-07-13 2012-04-10 Morpho, Inc. Image data processing method and imaging apparatus
CN101576505B (en) * 2008-04-21 2011-11-16 株式会社林创研 Three-dimensional image obtaining device and processing apparatus using the same
WO2013089155A1 (en) * 2011-12-12 2013-06-20 株式会社 日立メディコ X-ray ct device and method for correcting scattered x-rays
US9307949B2 (en) 2011-12-12 2016-04-12 Hitachi Medical Corporation X-ray CT device and method for correcting scattered X-rays
JP2016517789A (en) * 2013-05-14 2016-06-20 コーニンクレッカ フィリップス エヌ ヴェKoninklijke Philips N.V. Artifact reduction in X-ray image reconstruction using coordinate grid matched to geometry

Also Published As

Publication number Publication date
JP4686147B2 (en) 2011-05-18

Legal Events

Date        Code  Title and description
2007-08-01  A621  Written request for application examination
2010-08-10  A131  Notification of reasons for refusal
2010-10-08  A521  Written amendment (intermediate code: A523)
2010-11-09  A131  Notification of reasons for refusal
2010-12-15  A521  Written amendment (intermediate code: A523)
            TRDD  Decision of grant or rejection written
2011-01-18  A01   Written decision to grant a patent or to grant a registration (utility model)
2011-02-14  A61   First payment of annual fees (during grant procedure)
            FPAY  Renewal fee payment (payment until: 2014-02-18; year of fee payment: 3)
            R151  Written notification of patent or utility model registration (ref document number: 4686147; country of ref document: JP)
            FPAY  Renewal fee payment (payment until: 2014-02-18; year of fee payment: 3)
            S111  Request for change of ownership or part of ownership (codes: R313111, R313114, R313117)
            R371  Transfer withdrawn (code: R371)
            S111  Request for change of ownership or part of ownership (codes: R313117, R313113, R313114)
            R350  Written notification of registration of transfer (code: R350)
            S533  Written request for registration of change of name (code: R313533)
            R350  Written notification of registration of transfer (code: R350)
            LAPS  Cancellation because of no payment of annual fees