CN106651809B - Method and device for removing artifacts in image - Google Patents

Method and device for removing artifacts in an image

Info

Publication number
CN106651809B
CN106651809B CN201611256592.9A CN201611256592A
Authority
CN
China
Prior art keywords
projection
function
filament
point
diffusion function
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201611256592.9A
Other languages
Chinese (zh)
Other versions
CN106651809A (en)
Inventor
陈鸣之
姬长胜
杨宏成
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai United Imaging Healthcare Co Ltd
Original Assignee
Shanghai United Imaging Healthcare Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai United Imaging Healthcare Co Ltd filed Critical Shanghai United Imaging Healthcare Co Ltd
Priority to CN201611256592.9A priority Critical patent/CN106651809B/en
Publication of CN106651809A publication Critical patent/CN106651809A/en
Application granted granted Critical
Publication of CN106651809B publication Critical patent/CN106651809B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/73 Deblurring; Sharpening
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20172 Image enhancement details
    • G06T2207/20201 Motion blur correction

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Apparatus For Radiation Diagnosis (AREA)

Abstract

A method, a device, and a medical imaging system for removing artifacts from an image are provided. The method for removing artifacts from an image comprises the following steps: acquiring a system matrix of an imaging system, wherein, for the ith ray passing through the jth voxel, the system matrix gives the blurring factors corresponding to the ith pixel imaged on the imaging unit and to the pixels in a predetermined neighborhood centered on that ith pixel; and using the system matrix in an iterative reconstruction method to remove artifacts from the image. Images reconstructed with the technical solution of the invention have better quality, meet actual clinical requirements, and help avoid missed diagnosis or misdiagnosis.

Description

Method and device for removing artifacts in image
Technical Field
The present invention relates to the field of image processing, and in particular, to a method and an apparatus for removing artifacts in an image.
Background
Motion blur is the smearing produced in an image when the scene moves relative to the imaging device during acquisition. The phenomenon is more pronounced for long exposures or for rapidly moving objects. In Cone Beam Computed Tomography (CBCT) imaging systems, such as C-arm imaging systems, the radiation source moves relative to the imaged object because the gantry rotates during acquisition, which introduces motion artifacts.
At present, motion blur can be removed by deconvolution, which is classified into blind and non-blind deconvolution according to whether the blur kernel is known. Both blind and non-blind deconvolution require an accurate estimate of the blur kernel. The blur kernel characterizes the degree of blurring of an image and can be divided into two types: spatially shift-invariant (the blur kernel is independent of the pixel position) and spatially shift-variant (the blur kernel is a function of the pixel position). For medical imaging systems with high gantry rotation speeds, such as CBCT systems, the blur kernel is not only spatially shift-variant but is also not uniquely determined at a given pixel position: the degree of blur depends on where the imaged object lies between the radiation source and the flat-panel detector.
In the prior art, motion blur in a medical imaging system with a high-speed rotating gantry is removed on the two-dimensional projection images: the blur kernel is estimated as spatially shift-invariant and the object is assumed to be a point object placed at the isocenter. Because this simplified model does not match the actual situation, motion artifacts are poorly removed from the projection images; when those projection images are then reconstructed, the reconstructed image is of poor quality, does not meet actual clinical requirements, and easily leads to missed diagnosis or misdiagnosis.
Therefore, how to remove the motion artifact caused by the gantry rotation is one of the problems to be solved.
Disclosure of Invention
The invention provides a method for removing artifacts in an image, which comprises the following steps:
acquiring a system matrix of an imaging system, wherein, for the ith ray passing through the jth voxel, the system matrix gives the blurring factors corresponding to the ith pixel imaged on the imaging unit and to the pixels in a predetermined neighborhood centered on that ith pixel;
and using the system matrix in an iterative reconstruction method to remove artifacts from the image.
Optionally, the acquiring of the system matrix of the imaging system includes: acquiring system matrices of the imaging system at different projection angles, and superposing the system matrices at the different projection angles to obtain the system matrix of the imaging system.
Optionally, obtaining the system matrix at a given projection angle includes:
dividing the angle rotated within the exposure integration time of the imaging system into sub-angles of a preset size, wherein the preset angle is smaller than the projection angle and the projection angle is half of the angle rotated within the exposure integration time of the imaging system;
and superposing the system matrices corresponding to the divided sub-angles to obtain the system matrix of the imaging system at that projection angle.
Optionally, the acquiring of the system matrix of the imaging system includes:
acquiring a projection image of a phantom, and acquiring diffusion functions of the voxels in the phantom based on the projection image, wherein markers are distributed in the phantom and the size of the projection image of each marker in at least one dimension is smaller than the detector pixel size;
acquiring the system matrix of the imaging system based on an initial system matrix of the imaging system and the diffusion functions of the voxels, wherein the initial system matrix of the imaging system is obtained based on a point model, a line model, or an area model.
Optionally, the marker is a small sphere, and acquiring the projection image of the phantom and acquiring the diffusion functions of the voxels in the phantom based on the projection image includes:
acquiring the projection center point of the small sphere in the projection image;
selecting a predetermined neighborhood centered on the projection center point of the small sphere in the projection image, and normalizing the gray values of the pixels in the neighborhood with respect to the gray value of the projection center point of the small sphere;
generating the diffusion function of the small sphere based on the distances from the pixels in the predetermined neighborhood to the projection center point of the small sphere and the normalized gray values of the pixels; and
generating the diffusion functions of the voxels in the phantom based on the diffusion function of the small sphere and an interpolation function.
Optionally, generating the diffusion functions of the voxels in the phantom based on the diffusion function of the small sphere and the interpolation function includes:
establishing a basis function corresponding to the diffusion function of the small sphere;
interpolating the basis-function feature representations corresponding to the diffusion functions of the small spheres to obtain the basis-function feature representations of the voxels in the phantom;
and converting the basis-function feature representations of the voxels in the phantom into the diffusion functions corresponding to the voxels.
Optionally, the marker is a filament, and acquiring the projection image of the phantom and acquiring the diffusion functions of the voxels in the phantom based on the projection image includes:
acquiring a first diffusion function, in the direction perpendicular to the filament, of each point on the filament in the projection image at a first projection angle;
acquiring a second diffusion function, in the direction perpendicular to the filament, of each point on the filament in the projection image at a second projection angle, the first projection angle being perpendicular to the second projection angle;
multiplying the first diffusion function and the second diffusion function corresponding to each point on the filament to obtain the diffusion function of the filament;
and generating the diffusion functions of the voxels in the phantom based on the diffusion function of the filament and an interpolation function.
Optionally, acquiring the spread function, in the direction perpendicular to the filament, of a point on the filament in the projection image includes:
acquiring the projection center point of the point on the filament in the projection image;
selecting a predetermined neighborhood centered on the projection center point of the point on the filament in the projection image, and normalizing the gray values of the pixels in the neighborhood with respect to the gray value of the projection center point of the point on the filament;
and generating the diffusion function of the point on the filament, in the direction perpendicular to the filament, based on the distances from the pixels in the predetermined neighborhood to the projection center point of the point on the filament and the normalized gray values of the pixels.
Optionally, generating the diffusion functions of the voxels in the phantom based on the diffusion function of the filament and the interpolation function includes:
establishing a basis-function feature representation corresponding to the diffusion function of the filament;
interpolating the basis-function feature representations corresponding to the diffusion function of the filament to obtain the basis-function feature representations of the voxels in the phantom;
and converting the basis-function feature representations of the voxels in the phantom into the diffusion functions corresponding to the voxels.
Optionally, the interpolation function is one of a nearest-neighbor interpolation function, a trilinear interpolation function, and a B-spline interpolation function.
Optionally, using the system matrix in the iterative reconstruction method for reconstruction includes: using voxels obtained by a filtered back projection method as the initial values of the iterative reconstruction.
In order to solve the above problem, the present invention further provides a device for removing artifacts in an image, including:
a system matrix obtaining unit, configured to obtain a system matrix of the imaging system, wherein, for the ith ray passing through the jth voxel, the system matrix gives the blurring factors corresponding to the ith pixel imaged on the imaging unit and to the pixels in a predetermined neighborhood centered on that ith pixel;
and a reconstruction unit, configured to use the system matrix in an iterative reconstruction method to remove artifacts from the image.
In order to solve the above problem, the present invention further provides a medical imaging system, which includes the above apparatus for removing artifacts in images.
Optionally, the medical imaging system is one of a CBCT system, a CT system or a PET system.
Compared with the prior art, the technical scheme of the invention has the following advantages:
When the system matrix of the imaging system is obtained, the influence of motion blur is incorporated into it, so that, for the ith ray passing through the jth voxel, the system matrix gives the blurring factors corresponding to the ith pixel imaged on the imaging unit and to the pixels in a predetermined neighborhood centered on that ith pixel. The system matrix is obtained from the actual state of the imaging system rather than from a simplified model of it, so the resulting system matrix reflects the characteristics of the system more faithfully, that is, it is more accurate. When iterative reconstruction is then performed with this system matrix, the forward projection and back projection steps of each iteration use a matrix that reflects the actual characteristics of the imaging system, including the motion blur, so the influence of motion blur on the reconstructed image is eliminated during iterative reconstruction. The reconstructed image has better quality, meets actual clinical requirements, and missed diagnosis or misdiagnosis is avoided.
Further, when the system matrix at a projection angle is obtained, the angle rotated within the exposure integration time of the imaging system is divided into sub-angles of a preset size, and the system matrices corresponding to the sub-angles are superposed to obtain the system matrix of the imaging system at that projection angle; the system matrices at the other projection angles are obtained in the same way, and the system matrices at the different projection angles are superposed to obtain the system matrix of the imaging system. This method is simple and yields a system matrix of high precision. Iterative reconstruction with this system matrix removes artifacts from the reconstructed image well and produces an image that meets actual clinical requirements.
Furthermore, a projection image of a phantom is actually acquired, the diffusion functions of the voxels in the phantom are obtained from that projection image, and the system matrix is then obtained from the diffusion functions of the voxels and the initial system matrix of the imaging system. Because the diffusion functions are derived from an actually acquired projection image of the phantom, the system matrix obtained in this way has higher precision.
Drawings
Fig. 1 is a schematic flowchart of a method for removing artifacts in an image according to an embodiment of the present invention.
Detailed Description
In order to make the aforementioned objects, features and advantages of the present invention comprehensible, embodiments accompanied with figures are described in detail below. In the following description, specific details are set forth in order to provide a thorough understanding of the present invention. The invention can be implemented in a number of ways different from those described herein and similar generalizations can be made by those skilled in the art without departing from the spirit of the invention. Therefore, the present invention is not limited to the specific embodiments disclosed below.
As described in the Background, when removing the motion blur of medical imaging systems with high gantry rotation speeds, such as CT, CBCT, and PET systems, the prior art simplifies the imaging system while removing motion artifacts on the two-dimensional projection images, so the artifact removal effect is poor. For medical imaging systems with high gantry rotation speeds, the blur kernel is spatially shift-variant and depends on the pixel position, and the degree of blur depends on where the imaged object lies between the radiation source and the detector. The inventors therefore introduce the motion blur generated during the high-speed rotation of the gantry into the system matrix, that is, they obtain a system matrix that can represent the motion blur, and then remove the motion blur, or motion artifacts, during iterative reconstruction.
Referring to Fig. 1, which is a schematic flowchart of a method for removing artifacts in an image according to an embodiment of the present invention, the method includes:
S101: acquiring a system matrix of an imaging system, wherein, for the ith ray passing through the jth voxel, the system matrix gives the blurring factors corresponding to the ith pixel imaged on the imaging unit and to the pixels in a predetermined neighborhood centered on that ith pixel;
S102: using the system matrix in an iterative reconstruction method to remove artifacts from the image.
The technical solution of the present invention is described in detail below with reference to specific embodiments. In the embodiments a CBCT system is taken as an example, but the technical solution is not limited thereto: any motion blur caused by the high-speed motion of a gantry can be removed by the methods provided in the embodiments.
Example one
Step S101 is executed to obtain the system matrix of the CBCT system. In this embodiment, the diffusion functions of the voxels in a phantom are obtained by actually measuring the phantom, and the system matrix is then obtained from these diffusion functions. A plurality of markers is arranged in the phantom; the markers can be placed at different positions in a substrate made of a uniform material, and the absorption coefficient of the markers differs from that of the substrate so that the markers are easily distinguished from the substrate in the acquired projection image. The phantom can be a cylinder, a hexahedron, or the like. The substrate can be made of a low-atomic-number material such as PMMA, and the markers can be made of a high-atomic-number material such as tungsten, aluminum, or iron.
The size of the projection image of a marker in at least one dimension is smaller than the detector pixel size; in other words, the marker is sufficiently small in at least one dimension. For example, if the marker is a small sphere, its diameter should be sufficiently small. The marker in this embodiment may be a small sphere, a filament, a thin strip, or the like. If the marker is a sphere, its diameter needs to be less than a fraction of a millimeter so that the size of its projection image in one dimension is smaller than the detector pixel size. When the markers are filaments or strips, they may be arranged in the substrate in layers and oriented perpendicular to the scanning plane (the plane swept during rotation of the gantry), and the filament diameter may be less than one tenth of a millimeter. In addition, in this embodiment, the markers should be arranged so that their positions do not coincide at the same projection angle; the arrangement of the markers in the phantom may be helical, linear, curved, polyline-shaped, and so on. The distribution density of the markers in the phantom, or the spacing between markers in each dimension, can be chosen according to actual requirements: the denser the markers, the more accurate the resulting system matrix, but the computation and memory cost increase.
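As a concrete illustration of such a marker layout, the following sketch (not from the patent; all dimensions and counts are illustrative assumptions) places bead markers on a helix inside a cylindrical phantom so that the beads are spread along the rotation axis and do not coincide in a projection:

```python
import numpy as np

def helical_bead_positions(n_beads=24, radius_mm=60.0, height_mm=120.0, turns=2.0):
    """Bead centres on a helix inside a cylindrical phantom (all values illustrative)."""
    t = np.linspace(0.0, 1.0, n_beads)
    angle = 2.0 * np.pi * turns * t
    x = radius_mm * np.cos(angle)
    y = radius_mm * np.sin(angle)
    z = height_mm * (t - 0.5)            # spread the beads along the rotation axis
    return np.column_stack([x, y, z])    # (n_beads, 3) coordinates in mm

print(helical_bead_positions().shape)    # (24, 3)
```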
In this embodiment, the system matrix of the CBCT system is obtained as follows: first, a projection image of the phantom is acquired with the CBCT system; then the diffusion functions of the voxels in the phantom are obtained from the acquired projection image; next, the initial system matrix of the CBCT system is obtained by an analytic method based on a point model, a line model, or an area model; finally, the initial system matrix and the diffusion functions of the voxels in the phantom are multiplied to obtain the system matrix of the CBCT system.
In the analytic method, the focal spot and the detector pixel are regarded as ideal points, and the contribution of a ray beam passing through a voxel is computed with a line-integral model. In this embodiment the initial system matrix is obtained from the point model, the line model, or the area model as follows:
point model, regarding the ray as a thick line with width 0 and interval τ, aijIndicating whether the ith ray intersects with the jth pixel
Figure BDA0001198944170000091
Line model, regarding the ray as a thick line with a width of 0 and a spacing of τ, aijThe length of the intersection line of the ith ray and the jth pixel is shown.
Area model, regarding the ray as a thick line with width of tau and interval of 0, the area delta of the i-th ray intersected with the j-th pixel and the pixel area delta2The ratio of (A) to (B) is as follows: a isij=Δ/δ2
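For illustration, the sketch below (not part of the patent; the geometry and numerical values are assumed) computes a single initial system-matrix entry a_ij under the line model by clipping a ray against one square pixel; the point model is simply the indicator that this intersection length is non-zero:

```python
import numpy as np

def intersection_length(p0, p1, pixel_min, pixel_max):
    """Length of the segment p0->p1 clipped to the axis-aligned square pixel
    [pixel_min, pixel_max] (2-D), via Liang-Barsky clipping."""
    p0, p1 = np.asarray(p0, float), np.asarray(p1, float)
    d = p1 - p0
    t0, t1 = 0.0, 1.0
    for axis in range(2):
        if abs(d[axis]) < 1e-12:
            # ray runs parallel to this axis: either always inside the slab or never
            if not (pixel_min[axis] <= p0[axis] <= pixel_max[axis]):
                return 0.0
        else:
            ta = (pixel_min[axis] - p0[axis]) / d[axis]
            tb = (pixel_max[axis] - p0[axis]) / d[axis]
            t0, t1 = max(t0, min(ta, tb)), min(t1, max(ta, tb))
    return 0.0 if t0 >= t1 else (t1 - t0) * np.linalg.norm(d)

# line model: a_ij is the clipped length; point model: a_ij = 1 if that length > 0
a_ij_line = intersection_length((-2.0, 0.3), (2.0, 0.7), (0.0, 0.0), (1.0, 1.0))
a_ij_point = 1.0 if a_ij_line > 0.0 else 0.0
print(a_ij_line, a_ij_point)
```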
In this embodiment, the way the diffusion functions of the voxels in the phantom are obtained depends on the markers arranged in the phantom. The following describes how the diffusion functions of the voxels in the phantom are obtained when the markers are small spheres and when they are filaments, respectively.
When the markers in the phantom are small spheres, the diffusion functions of the voxels in the phantom are obtained as follows:
First, the projection center point of each small sphere in the projection image is acquired. In this embodiment, the three-dimensional coordinates of each small sphere in space can be obtained by industrial CT, and the projection center point of the small sphere in the projection image can then be calculated from the three-dimensional coordinates of the sphere, the position of the radiation source, and the position of the detector.
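As an illustration of this geometric step, the following sketch (an idealized flat-panel geometry; the coordinates, the `project_point` helper, and the parameter values are assumptions for illustration, not the patent's notation) projects a bead's 3-D position onto the detector plane to obtain its projection center point:

```python
import numpy as np

def project_point(p, source, det_center, det_u, det_v):
    """Intersect the ray source->p with the detector plane spanned by the unit
    vectors det_u, det_v at det_center; returns (u, v) detector coordinates."""
    p, source, det_center = (np.asarray(a, float) for a in (p, source, det_center))
    normal = np.cross(det_u, det_v)                 # detector plane normal
    d = p - source
    t = np.dot(det_center - source, normal) / np.dot(d, normal)
    hit = source + t * d                            # intersection with the detector plane
    return float(np.dot(hit - det_center, det_u)), float(np.dot(hit - det_center, det_v))

# source on the -y axis, flat panel at +y, bead slightly off the isocentre
uv = project_point(p=(5.0, 0.0, 2.0), source=(0.0, -1000.0, 0.0),
                   det_center=(0.0, 500.0, 0.0),
                   det_u=(1.0, 0.0, 0.0), det_v=(0.0, 0.0, 1.0))
print(uv)   # (7.5, 3.0): magnified by the 1500/1000 geometry factor
```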
Then a predetermined neighborhood is selected, centered on the projection center point of the small sphere in the projection image, and the gray values of the pixels in the neighborhood are normalized with respect to the gray value of the projection center point of the small sphere. The size of the predetermined neighborhood in this embodiment can be chosen according to the required calculation accuracy and the available storage; the neighborhood may be square, for example a 5 × 5 region, i.e. 5 pixel units on a side. The gray value of each pixel in the predetermined neighborhood is divided by the gray value of the projection center point of the small sphere in the projection image, which normalizes the pixels of the neighborhood.
Next, the diffusion function of the small sphere is generated from the distance of each pixel in the predetermined neighborhood to the projection center point of the small sphere and the normalized gray value of that pixel. The diffusion function of the small sphere is expressed as a matrix of the same size as the predetermined neighborhood, and the position of each element relative to the projection center point reflects the distance between the projection center point and the corresponding pixel in the neighborhood. For example, if the projection center point is the element in row 2, column 2 of the matrix, the element in row 2, column 1 is one pixel unit away from the projection center point.
After the diffusion function of each small sphere in the phantom is obtained in this way, the diffusion functions of the voxels in the phantom are generated from the diffusion functions of the small spheres and an interpolation function. In this embodiment, to reduce the amount of data when generating the diffusion functions of the voxels, to speed up the generation of the system matrix, and thus to speed up artifact removal, the diffusion functions of the small spheres are not interpolated directly. Instead, a basis function corresponding to the diffusion function of each small sphere is first established, the diffusion function being known in matrix form. Taking fitting with a Gaussian function as an example: the data points whose abscissa is the distance from the projection center point to a pixel in the predetermined neighborhood and whose ordinate is the normalized gray value of that pixel are fitted with a Gaussian function, the fitted parameters form the feature representation, and the diffusion function of the small sphere is then represented by this Gaussian feature representation. Different spheres correspond to different diffusion functions and therefore to different Gaussian feature representations. The basis-function feature representations corresponding to the diffusion functions of the small spheres are then interpolated to obtain the basis-function feature representations of the voxels in the phantom; in this embodiment a nearest-neighbor interpolation function, a trilinear interpolation function, a B-spline interpolation function, or the like can be used to obtain the feature representation of each voxel. Finally, the basis-function feature representation of each voxel is converted into the diffusion function of that voxel: taking the Gaussian function as an example, the Gaussian feature representation of each voxel is converted into the corresponding Gaussian function, and for each pixel of the voxel's predetermined neighborhood the ordinate of the Gaussian function at the abscissa equal to the distance between that pixel and the voxel's projection center point gives the gray value of that pixel, which is the diffusion function of the voxel.
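A minimal sketch of the Gaussian-fitting step described above is given below (assuming a 5 × 5 neighborhood of grey values around a sphere's projection center is already available; the function names and the toy patch are illustrative, not the patent's notation). It normalizes the patch by the center grey value and fits a single isotropic Gaussian width, which then serves as the sphere's basis-function feature to be interpolated to the voxels:

```python
import numpy as np
from scipy.optimize import curve_fit

def gaussian(r, sigma):
    return np.exp(-r**2 / (2.0 * sigma**2))

def sphere_psf_sigma(patch, center=(2, 2)):
    """Fit one isotropic Gaussian width to the normalized 5x5 neighborhood of a bead."""
    patch = np.asarray(patch, float)
    patch = patch / patch[center]                   # normalize by the centre grey value
    yy, xx = np.indices(patch.shape)
    r = np.hypot(yy - center[0], xx - center[1])    # distance to the projection centre (pixels)
    popt, _ = curve_fit(gaussian, r.ravel(), patch.ravel(), p0=[1.0])
    return float(popt[0])

# toy patch: a blurred point with a width of about 1.2 pixels
yy, xx = np.indices((5, 5))
toy = 800.0 * np.exp(-((yy - 2)**2 + (xx - 2)**2) / (2.0 * 1.2**2))
print(sphere_psf_sigma(toy))                        # ~1.2
```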
The diffusion functions of the voxels in the phantom obtained through the above steps give, for the ith ray passing through the jth voxel, the normalized gray values of the ith pixel imaged on the imaging unit and of the pixels in the predetermined neighborhood centered on that ith pixel. Multiplying these values by the entry of the initial system matrix for the ith ray and the jth voxel yields the system matrix of the CBCT system.
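The multiplication of the initial system matrix with the voxel diffusion functions can be sketched as follows (a simplified illustration with a 1-D detector, not the patent's exact formulation; `psf` holds assumed offset/weight pairs per voxel): each non-zero initial entry is spread over the predetermined neighborhood of the corresponding detector pixel.

```python
import numpy as np

def blur_aware_matrix(A0, psf):
    """Spread each non-zero entry of the initial matrix A0 (detector pixels x voxels)
    over a neighborhood of detector pixels using per-voxel (offset, weight) pairs."""
    n_pix, n_vox = A0.shape
    A = np.zeros_like(A0)
    for i in range(n_pix):
        for j in range(n_vox):
            if A0[i, j] == 0.0:
                continue
            for offset, w in psf[j]:          # blurring factors of voxel j
                k = i + offset
                if 0 <= k < n_pix:            # stay on the detector
                    A[k, j] += w * A0[i, j]
    return A

A0 = np.eye(4)                                # toy initial matrix: 4 detector pixels, 4 voxels
psf = {j: [(-1, 0.2), (0, 0.6), (1, 0.2)] for j in range(4)}   # illustrative weights
print(blur_aware_matrix(A0, psf))
```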
When the markers in the phantom are filaments, the diffusion functions of the voxels in the phantom are obtained as follows:
First, a first diffusion function, in the direction perpendicular to the filament, of each point on the filament is acquired from the projection image at a first projection angle, and a second diffusion function, in the direction perpendicular to the filament, of each point on the filament is acquired from the projection image at a second projection angle; the first projection angle is perpendicular to the second projection angle. Specifically, the phantom containing the filament is imaged at different projection angles. For a projection image at a given projection angle, obtaining the diffusion function of each point on the filament in the direction perpendicular to the filament is similar to obtaining the diffusion function of a small sphere, and in this embodiment it is done as follows:
First, the projection center point of each point on the filament in the projection image is acquired; for example, the three-dimensional coordinates of the points on each filament are obtained by industrial CT, and the projection center point of each point on the filament in the projection image is then calculated from its three-dimensional coordinates, the position of the radiation source, and the position of the detector. Then a predetermined neighborhood is selected, centered on the projection center point of the point on the filament in the projection image, and the gray values of the pixels in the neighborhood are normalized with respect to the gray value of the projection center point; the size of the predetermined neighborhood is determined by the required calculation accuracy and the available storage. Finally, the diffusion function of the point on the filament in the direction perpendicular to the filament is generated from the distances of the pixels in the predetermined neighborhood to the projection center point of the point and the normalized gray values of those pixels. As with the spheres, the diffusion function of a point on the filament is expressed as a matrix of the same size as the predetermined neighborhood, and the position of each element relative to the projection center point reflects the distance between the projection center point and the corresponding pixel in the neighborhood. In this way a spread function of each point on the filament in the direction perpendicular to the filament is obtained: the first spread function for the first projection angle and, similarly, the second spread function for the second projection angle.
The diffusion function of the filament is then obtained by multiplying the first and second diffusion functions corresponding to each point on the filament. As before, in order to reduce the amount of data when generating the diffusion functions of the voxels in the phantom, to speed up the generation of the system matrix, and thus to speed up artifact removal, the diffusion functions of the voxels are not generated directly from the diffusion function of the filament and the interpolation function. As in the case of spherical markers, a basis-function feature representation corresponding to the diffusion function of the filament is established first; the basis function may be a Gaussian function, a cubic spline function, a B-spline function, or the like. The basis-function feature representations corresponding to the diffusion function of the filament are then interpolated to obtain the basis-function feature representations of the voxels in the phantom; the interpolation function may be a nearest-neighbor interpolation function, a trilinear interpolation function, a B-spline interpolation function, or the like. After the basis-function feature representations of all voxels in the phantom are obtained by interpolation, they are converted into the diffusion functions of the corresponding voxels. Establishing the basis-function feature representation for the diffusion function of the filament is similar to doing so for the diffusion function of a small sphere, and converting the feature representations of the voxels into diffusion functions follows the procedure described for spherical markers, so the details are not repeated here.
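The per-point product of the two perpendicular 1-D spread functions can be sketched as an outer product (a minimal illustration; the profile values are assumed measurements and the normalization choice is an assumption, not stated in the patent):

```python
import numpy as np

def filament_psf_2d(profile_u, profile_v):
    """profile_u / profile_v: normalized 1-D spread functions measured perpendicular
    to the filament at two mutually perpendicular projection angles."""
    psf = np.outer(np.asarray(profile_u, float), np.asarray(profile_v, float))
    return psf / psf.sum()                     # keep the blurring weights normalized

u = np.array([0.05, 0.25, 1.00, 0.25, 0.05])  # hypothetical measured profiles
v = np.array([0.10, 0.60, 1.00, 0.60, 0.10])
print(filament_psf_2d(u, v).shape)             # (5, 5) diffusion function for one point
```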
The diffusion functions of all voxels in the phantom obtained through the above process are then multiplied with the initial system matrix to obtain the system matrix of the CBCT system.
The above describes how the system matrix of the system is obtained when different markers are distributed in the phantom; in practice, the method corresponding to the phantom actually used can be adopted to obtain the system matrix of the system.
Next, artifacts in the reconstructed image are removed based on the system matrix. The system matrix in this embodiment includes blurring factors that reflect the motion blur, so artifacts in the reconstructed image can be removed by an iterative reconstruction method. Iterative reconstruction methods include the Algebraic Reconstruction Technique (ART), the Maximum-Likelihood Expectation Maximization (ML-EM) algorithm, and the like. In this embodiment, the ART algorithm is taken as an example to describe how artifacts are removed during iterative reconstruction. The ART update is as follows:
x^(k+1) = x^k - (A_i x^k - b_i) / ||A_i||^2 * A_i^T
where x^(k+1) is the updated voxel value, x^k is the current voxel value, A is the system matrix, A_i x^k is the projection value along the ith ray, b_i is the projection data detected on the ith detector pixel, A_i x^k - b_i is the projection residual, which is back-projected along the ith ray, and ||A_i||^2 is a normalization factor.
In this embodiment, the initial voxel values x^0 of the iterative reconstruction can be obtained by a filtered back projection method; obtaining x^0 in this way increases the convergence speed of the iterative reconstruction algorithm.
As the above formula shows, the system matrix that includes the motion blur is used to compute the predicted projection values; their difference from the measured projection values gives the projection residual, the residual is back-projected, and the current voxel values are updated by subtracting the back-projected residual. As the number of iterations increases, the difference between x^(k+1) and x^k converges and the result x*, the image closest to the real image, is reconstructed. By introducing a system matrix containing the blurring factors and using it in iterative reconstruction, the motion artifacts are removed from the reconstructed image during the reconstruction itself. Because the motion artifacts are removed during reconstruction, this method is simpler than first removing motion artifacts from the projection images and then reconstructing. In addition, because the system matrix reflects the characteristics of the system more faithfully, the iteratively reconstructed image has good quality and meets actual clinical requirements.
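A minimal sketch of the ART iteration described above is shown below (assuming `A` is the dense blur-aware system matrix with one row per ray, `b` the measured projections, and `x0` an FBP image used as the initial value; the relaxation factor `relax` and the toy data are illustrative additions, not named in the patent):

```python
import numpy as np

def art_sweep(A, b, x, relax=1.0):
    """One pass of ART: for each ray i, back-project the projection residual
    A_i x - b_i along that ray and subtract it from the current voxel values."""
    for i in range(A.shape[0]):
        a_i = A[i]
        norm = a_i @ a_i                      # ||A_i||^2 normalization factor
        if norm > 0.0:
            residual = a_i @ x - b[i]         # predicted minus measured projection
            x = x - relax * (residual / norm) * a_i
    return x

def reconstruct(A, b, x0, n_iter=20):
    x = x0.copy()
    for _ in range(n_iter):                   # stop when x^(k+1) - x^k is small enough
        x = art_sweep(A, b, x)
    return x

A = np.array([[1.0, 0.5], [0.2, 1.0], [0.7, 0.7]])   # toy blur-aware matrix: 3 rays, 2 voxels
x_true = np.array([2.0, 1.0])
b = A @ x_true                                        # noiseless measured projections
print(reconstruct(A, b, np.zeros(2)))                 # converges toward x_true
```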
The present embodiment further provides an apparatus for removing artifacts in an image, including:
a system matrix obtaining unit, configured to obtain a system matrix of the imaging system, wherein, for the ith ray passing through the jth voxel, the system matrix gives the blurring factors corresponding to the ith pixel imaged on the imaging unit and to the pixels in a predetermined neighborhood centered on that ith pixel;
and a reconstruction unit, configured to use the system matrix in an iterative reconstruction method to remove artifacts from the image.
This embodiment also provides a medical imaging system comprising the above device for removing artifacts in an image. The medical imaging system may be a CBCT system, a CT system, or a PET system.
Example two
This embodiment differs from the first embodiment in the way the system matrix is obtained. In the first embodiment, the system matrix of the system is obtained by actually measuring a phantom; in this embodiment it is obtained analytically. A CBCT system is again taken as the example. In a CBCT system, half of the angle rotated within the exposure integration time of the imaging system is usually defined as the projection angle. To obtain a system matrix that includes motion blur, this embodiment subdivides the angle rotated within the exposure integration time, that is, divides it into sub-angles of a preset size, where the preset angle is smaller than the projection angle. After the angle rotated within the exposure integration time of the CBCT imaging system is subdivided by the preset angle, the system matrix corresponding to each sub-angle is calculated, and the system matrices corresponding to the sub-angles are then superposed to obtain the system matrix of the imaging system at that projection angle. For example, if the CBCT imaging system rotates 2 degrees within the exposure integration time, the projection angle is 1 degree; if the preset angle is 0.2 degrees, the system matrices corresponding to the sub-intervals 0 to 0.2, 0.2 to 0.4, 0.4 to 0.6, 0.6 to 0.8, 0.8 to 1, 1 to 1.2, 1.2 to 1.4, 1.4 to 1.6, 1.6 to 1.8, and 1.8 to 2 degrees are calculated. The system matrix at each sub-angle can be obtained analytically, specifically from the point model, the line model, or the area model.
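A minimal sketch of this angle-subdivision step is given below (the helper `system_matrix_at`, which would build the analytic point/line/area-model matrix for one instantaneous gantry angle, and the toy stand-in are assumptions for illustration, not defined in the patent):

```python
import numpy as np

def view_system_matrix(start_deg, sweep_deg, preset_deg, system_matrix_at):
    """Superpose the matrices of the sub-angles swept during one exposure, e.g.
    sweep_deg=2.0 and preset_deg=0.2 gives sub-intervals 0-0.2, 0.2-0.4, ..., 1.8-2.0."""
    n_sub = int(round(sweep_deg / preset_deg))
    starts = start_deg + preset_deg * np.arange(n_sub)
    # evaluate each sub-interval at its midpoint and superpose the matrices
    return sum(system_matrix_at(s + 0.5 * preset_deg) for s in starts)

# toy stand-in for the analytic per-angle matrix, just to run the sketch
toy = lambda ang: np.array([[np.cos(np.radians(ang)), 0.0],
                            [0.0, 1.0 + np.sin(np.radians(ang))]])
print(view_system_matrix(0.0, 2.0, 0.2, toy))
# the full system matrix then superposes the per-view matrices over all projection angles
```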
After the system matrices of the imaging system at the different projection angles are obtained in this way, they are superposed to obtain the system matrix of the CBCT imaging system.
Once the system matrix of the CBCT system is obtained, reconstruction with this system matrix removes the artifacts from the reconstructed image; how the artifacts are removed during reconstruction is described in the first embodiment and is not repeated here.
In summary, the method and apparatus for removing artifacts in an image according to the embodiments of the present invention at least have the following advantages:
When the system matrix of the imaging system is obtained, the influence of motion blur is incorporated into it, so that, for the ith ray passing through the jth voxel, the system matrix gives the blurring factors corresponding to the ith pixel imaged on the imaging unit and to the pixels in a predetermined neighborhood centered on that ith pixel. The system matrix is obtained from the actual state of the imaging system rather than from a simplified model of it, so the resulting system matrix reflects the characteristics of the system more faithfully, that is, it is more accurate. When iterative reconstruction is then performed with this system matrix, the forward projection and back projection steps of each iteration use a matrix that reflects the actual characteristics of the imaging system, including the motion blur, so the influence of motion blur on the reconstructed image is eliminated during iterative reconstruction. The reconstructed image has better quality, meets actual clinical requirements, and missed diagnosis or misdiagnosis is avoided.
Further, when the system matrix at a projection angle is obtained, the angle rotated within the exposure integration time of the imaging system is divided into sub-angles of a preset size, and the system matrices corresponding to the sub-angles are superposed to obtain the system matrix of the imaging system at that projection angle; the system matrices at the other projection angles are obtained in the same way, and the system matrices at the different projection angles are superposed to obtain the system matrix of the imaging system. This method is simple and yields a system matrix of high precision. Iterative reconstruction with this system matrix removes artifacts from the reconstructed image well and produces an image that meets actual clinical requirements.
Furthermore, a projection image of a phantom is actually acquired, the diffusion functions of the voxels in the phantom are obtained from that projection image, and the system matrix is then obtained from the diffusion functions of the voxels and the initial system matrix of the imaging system. Because the diffusion functions are derived from an actually acquired projection image of the phantom, the system matrix obtained in this way has higher precision.
Although the present invention has been described with reference to preferred embodiments, these are not intended to limit it. Those skilled in the art can make variations and modifications using the methods and technical content disclosed above without departing from the spirit and scope of the invention; therefore, any simple modification, equivalent change, or refinement made to the above embodiments according to the technical essence of the present invention falls within the protection scope of the present invention.

Claims (13)

1. A method for removing artifacts in an image, said artifacts being motion artifacts, comprising:
acquiring a system matrix of an imaging system, wherein the system matrix is capable of representing motion blur and, for the ith ray passing through the jth voxel, gives the blurring factors corresponding to the ith pixel imaged on the imaging unit and to the pixels in a predetermined neighborhood centered on that ith pixel;
using the system matrix in an iterative reconstruction method to remove artifacts from the image;
wherein using the system matrix in the iterative reconstruction method comprises: using voxels obtained by a filtered back projection method as initial values of the iterative reconstruction.
2. The method for removing artifacts in an image of claim 1, wherein the acquiring of the system matrix of the imaging system comprises: acquiring system matrices of the imaging system at different projection angles, and superposing the system matrices at the different projection angles to obtain the system matrix of the imaging system.
3. The method of claim 2, wherein obtaining a system matrix at a projection angle comprises:
dividing the angle rotated within the exposure integration time of the imaging system into sub-angles of a preset size, wherein the preset angle is smaller than the projection angle and the projection angle is half of the angle rotated within the exposure integration time of the imaging system;
and superposing the system matrices corresponding to the divided sub-angles to obtain the system matrix of the imaging system at that projection angle.
4. The method for removing artifacts in an image of claim 1, wherein the acquiring of the system matrix of the imaging system comprises:
acquiring a projection image of a phantom, and acquiring diffusion functions of the voxels in the phantom based on the projection image, wherein markers are distributed in the phantom and the size of the projection image of each marker in at least one dimension is smaller than the detector pixel size;
acquiring the system matrix of the imaging system based on an initial system matrix of the imaging system and the diffusion functions of the voxels, wherein the initial system matrix of the imaging system is obtained based on a point model, a line model, or an area model.
5. The method of claim 4, wherein the marker is a small sphere, and the acquiring of the projection image of the phantom and the acquiring of the diffusion functions of the voxels in the phantom based on the projection image comprises:
acquiring the projection center point of the small sphere in the projection image;
selecting a predetermined neighborhood centered on the projection center point of the small sphere in the projection image, and normalizing the gray values of the pixels in the neighborhood with respect to the gray value of the projection center point of the small sphere;
generating the diffusion function of the small sphere based on the distances from the pixels in the predetermined neighborhood to the projection center point of the small sphere and the normalized gray values of the pixels; and
generating the diffusion functions of the voxels in the phantom based on the diffusion function of the small sphere and an interpolation function.
6. The method for removing artifacts in an image of claim 5, wherein generating the diffusion functions of the voxels in the phantom based on the diffusion function of the small sphere and the interpolation function comprises:
establishing a basis function corresponding to the diffusion function of the small sphere;
interpolating the basis-function feature representations corresponding to the diffusion functions of the small spheres to obtain the basis-function feature representations of the voxels in the phantom;
and converting the basis-function feature representations of the voxels in the phantom into the diffusion functions corresponding to the voxels.
7. The method of claim 4, wherein the marker is a filament, and the acquiring of the projection image of the phantom and the acquiring of the diffusion functions of the voxels in the phantom based on the projection image comprises:
acquiring a first diffusion function, in the direction perpendicular to the filament, of each point on the filament in the projection image at a first projection angle;
acquiring a second diffusion function, in the direction perpendicular to the filament, of each point on the filament in the projection image at a second projection angle, the first projection angle being perpendicular to the second projection angle;
multiplying the first diffusion function and the second diffusion function corresponding to each point on the filament to obtain the diffusion function of the filament;
and generating the diffusion functions of the voxels in the phantom based on the diffusion function of the filament and an interpolation function.
8. The method of claim 7, wherein acquiring the spread function, in the direction perpendicular to the filament, of a point on the filament in the projection image comprises:
acquiring the projection center point of the point on the filament in the projection image;
selecting a predetermined neighborhood centered on the projection center point of the point on the filament in the projection image, and normalizing the gray values of the pixels in the neighborhood with respect to the gray value of the projection center point of the point on the filament;
and generating the diffusion function of the point on the filament, in the direction perpendicular to the filament, based on the distances from the pixels in the predetermined neighborhood to the projection center point of the point on the filament and the normalized gray values of the pixels.
9. The method for removing artifacts in an image of claim 7, wherein generating the diffusion functions of the voxels in the phantom based on the diffusion function of the filament and the interpolation function comprises:
establishing a basis-function feature representation corresponding to the diffusion function of the filament;
interpolating the basis-function feature representations corresponding to the diffusion function of the filament to obtain the basis-function feature representations of the voxels in the phantom;
and converting the basis-function feature representations of the voxels in the phantom into the diffusion functions corresponding to the voxels.
10. The method for removing artifacts in images according to claim 5 or 7, wherein the interpolation function is: one of a nearest neighbor interpolation function, a trilinear interpolation function, and a B-spline interpolation function.
11. An apparatus for removing artifacts in an image, the artifacts being motion artifacts, comprising:
a system matrix obtaining unit, configured to obtain a system matrix of the imaging system, wherein the system matrix is capable of representing motion blur and, for the ith ray passing through the jth voxel, gives the blurring factors corresponding to the ith pixel imaged on the imaging unit and to the pixels in a predetermined neighborhood centered on that ith pixel;
and a reconstruction unit, configured to use the system matrix in an iterative reconstruction method to remove artifacts from the image, wherein using the system matrix in the iterative reconstruction method comprises: using voxels obtained by a filtered back projection method as initial values of the iterative reconstruction.
12. A medical imaging system comprising the apparatus for removing artifacts in images of claim 11.
13. The medical imaging system of claim 12, wherein the medical imaging system is one of a CBCT system, a CT system, or a PET system.
CN201611256592.9A 2016-12-30 2016-12-30 Method and device for removing artifacts in image Active CN106651809B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201611256592.9A CN106651809B (en) 2016-12-30 2016-12-30 Method and device for removing artifacts in image

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201611256592.9A CN106651809B (en) 2016-12-30 2016-12-30 Method and device for removing artifacts in image

Publications (2)

Publication Number Publication Date
CN106651809A CN106651809A (en) 2017-05-10
CN106651809B true CN106651809B (en) 2020-03-31

Family

ID=58837285

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201611256592.9A Active CN106651809B (en) 2016-12-30 2016-12-30 Method and device for removing artifacts in image

Country Status (1)

Country Link
CN (1) CN106651809B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113610716B (en) * 2021-07-02 2023-10-03 中铁二十局集团有限公司 Image artifact eliminating method, device and equipment

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101520499A (en) * 2008-02-29 2009-09-02 西门子(中国)有限公司 Method and device for removing artifacts in magnetic resonance imaging
CN103315739A (en) * 2013-05-22 2013-09-25 华东师范大学 Magnetic resonance imaging method and system for eliminating motion artifact based on dynamic tracking technology

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7991240B2 (en) * 2007-09-17 2011-08-02 Aptina Imaging Corporation Methods, systems and apparatuses for modeling optical images

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101520499A (en) * 2008-02-29 2009-09-02 西门子(中国)有限公司 Method and device for removing artifacts in magnetic resonance imaging
CN103315739A (en) * 2013-05-22 2013-09-25 华东师范大学 Magnetic resonance imaging method and system for eliminating motion artifact based on dynamic tracking technology

Also Published As

Publication number Publication date
CN106651809A (en) 2017-05-10

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CP01 Change in the name or title of a patent holder

Address after: 201807 No. 2258 Chengbei Road, Jiading Industrial Zone, Jiading District, Shanghai.

Patentee after: Shanghai Lianying Medical Technology Co., Ltd

Address before: 201807 No. 2258 Chengbei Road, Jiading Industrial Zone, Jiading District, Shanghai.

Patentee before: SHANGHAI UNITED IMAGING HEALTHCARE Co.,Ltd.