CN108876730B - Method, device and equipment for correcting motion artifact and storage medium - Google Patents


Publication number
CN108876730B
CN108876730B · CN201810509389.0A
Authority
CN
China
Prior art keywords
centroid
projection data
detected object
scanning
motion
Prior art date
Legal status
Active
Application number
CN201810509389.0A
Other languages
Chinese (zh)
Other versions
CN108876730A (en)
Inventor
楼珊珊
佟丽霞
Current Assignee
Neusoft Medical Systems Co Ltd
Original Assignee
Neusoft Medical Systems Co Ltd
Priority date
Filing date
Publication date
Application filed by Neusoft Medical Systems Co Ltd filed Critical Neusoft Medical Systems Co Ltd
Priority to CN201810509389.0A priority Critical patent/CN108876730B/en
Publication of CN108876730A publication Critical patent/CN108876730A/en
Application granted granted Critical
Publication of CN108876730B publication Critical patent/CN108876730B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G06T5/77
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10072 Tomographic images
    • G06T2207/10081 Computed x-ray tomography [CT]
    • G06T2207/10104 Positron emission tomography [PET]
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30196 Human being; Person

Abstract

The application discloses a method, an apparatus, a device, and a storage medium for correcting motion artifacts. The method comprises: acquiring projection data of a detected object; mapping out the motion characteristic of the centroid of the detected object as it changes with scanning time by tracking the centroid of the acquired projection data; correcting, based on the motion characteristic, the projection data having the motion characteristic among the acquired projection data, where the projection data having the motion characteristic comprises data obtained by scanning the detected object while its centroid is moving; and performing image reconstruction on the corrected projection data to obtain a reconstructed image of the detected object. By correcting the projection data having the motion characteristic, the influence of data acquired while the detected object is moving on the reconstructed image can be effectively reduced, thereby alleviating motion artifacts in the reconstructed image.

Description

Method, device and equipment for correcting motion artifact and storage medium
Technical Field
The present application relates to the field of image processing technologies, and in particular, to a method, an apparatus, and a device for correcting a motion artifact, and a storage medium.
Background
CT is a technique in which an X-ray beam scans a certain part of a subject and a CT image composed of pixels is obtained through a series of processing steps applied to the scanning result. The CT image is a reconstructed image, also called a CT reconstructed image. Taking a human body as the subject as an example, when a CT scan is performed on a certain part of the body, motion artifacts (structural blurring or ghosting of the scanned part) may appear in the reconstructed image due to autonomous or physiological motion of the body.
The occurrence of motion artifacts can reduce the sharpness of the CT reconstructed image, thereby affecting the accuracy of the diagnostic result obtained from the CT reconstructed image.
Disclosure of Invention
The application provides a method, an apparatus, a device, and a storage medium for correcting motion artifacts, so as to address the problem of poor sharpness in existing reconstructed images.
According to a first aspect of embodiments herein, there is provided a method of correcting motion artifacts, the method comprising:
acquiring projection data of a detected object;
mapping the motion characteristics of the centroid of the detected object changing along with the scanning time by tracking the centroid of the acquired projection data;
correcting projection data with a motion feature in the acquired projection data based on the motion feature, wherein the projection data with the motion feature comprises: the data obtained by scanning the detected object when the center of mass of the detected object moves;
And carrying out image reconstruction according to the corrected projection data to obtain a reconstructed image of the detected object.
In one embodiment, the acquired projection data includes a plurality of sets of projection data, one set of projection data is data obtained by scanning the object once, and the motion characteristics of the centroid of the object varied with scanning time are mapped by tracking the centroid of the acquired projection data, including:
determining position parameters and scanning time corresponding to each group of projection data;
converting each set of projection data and the corresponding position parameters into the centroid of the set of projection data at the corresponding scanning time;
mapping the scanning time corresponding to each set of projection data and the centroid thereof at the scanning time to the centroid of the detected object at the corresponding scanning time;
and performing data fitting on the centroid of the detected object at each scanning time to obtain the variation relation of the centroid of the detected object relative to the scanning time.
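The four steps above can be sketched in NumPy as follows. This is a minimal illustration under stated assumptions: the centroid of one set of projection data is taken as the intensity-weighted mean of the channel position parameters, the mapping from the projection-data centroid to the object centroid is taken as the identity, and a quadratic polynomial fit stands in for the unspecified data-fitting step. All names and values are illustrative, not part of the patent.

```python
import numpy as np

def projection_centroid(projection, channel_positions):
    """Centroid of one set of projection data: the intensity-weighted
    mean of the channel position parameters (an assumed definition)."""
    return float((projection * channel_positions).sum() / projection.sum())

# One set of projection data per scan, each tagged with its scanning time.
scan_times = np.array([0.0, 0.1, 0.2, 0.3])
channel_positions = np.array([-1.5, -0.5, 0.5, 1.5])  # relative to the central channel
projections = [np.array([0., 1., 3., 1.]),
               np.array([0., 2., 2., 1.]),
               np.array([1., 3., 1., 0.]),
               np.array([2., 2., 1., 0.])]

# Steps 2-3: centroid of each set at its scanning time, mapped to the
# object centroid at that time (identity mapping assumed here).
centroids = np.array([projection_centroid(p, channel_positions) for p in projections])

# Step 4: fit the centroid-vs-time change relation (a quadratic fit is an
# illustrative choice; the text does not specify the fitting model).
coeffs = np.polyfit(scan_times, centroids, deg=2)
```

As the beam intensity shifts toward lower channel positions over the four scans, the fitted relation captures the centroid drifting from 0.5 to -0.7.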
In one embodiment, the position parameter is a position parameter of a channel in which projection data is acquired.
In one embodiment, the mapping out the motion characteristics of the centroid of the detected object with the change of the scanning time by tracking the centroid of the acquired projection data further comprises:
calculating a time difference between the determined maximum scanning time and the determined minimum scanning time if a device carrying the detected object is in motion while the detected object is being scanned;
acquiring an accumulation result of the centroid of the detected object at each scanning time;
calculating the ratio of the accumulated result to the time difference as an equipment error;
and eliminating the equipment error from the change relation to obtain a corrected change relation.
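A sketch of the device-error elimination above, assuming the "accumulation result" is a plain sum of the centroid values (the text leaves the accumulation unspecified); all names and values are illustrative.

```python
import numpy as np

def remove_device_error(scan_times, centroids):
    """Estimate the device error as (accumulated centroids) / (max scan
    time - min scan time) and subtract it from the centroid-vs-time
    relation, per the steps described in the text."""
    time_diff = scan_times.max() - scan_times.min()
    device_error = centroids.sum() / time_diff
    return centroids - device_error

times = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
cents = np.array([0.1, 0.2, 0.1, 0.2, 0.1])
corrected = remove_device_error(times, cents)  # device error = 0.7 / 2.0 = 0.35
```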
In one embodiment, the mapping out the motion characteristics of the centroid of the detected object with the change of the scanning time by tracking the centroid of the acquired projection data further comprises:
based on the corrected change relationship, determining the value of the included angle between the main motion axis of the centroid of the detected object and the abscissa axis of the current coordinate system;
and carrying out coordinate rotation on the corrected change relationship according to the included angle value to obtain the rotated change relationship.
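The rotation step can be sketched as below. The text does not specify how the main motion axis is found; this illustration estimates it as the leading eigenvector of the trajectory covariance and then rotates the trajectory by the negative of the resulting included angle so the main motion axis aligns with the abscissa.

```python
import numpy as np

def rotate_centroid_trajectory(xs, ys):
    """Rotate a centroid trajectory so that its main motion axis aligns
    with the x-axis (axis estimation via covariance is an illustrative
    choice; the text only specifies using the included angle)."""
    pts = np.stack([xs - xs.mean(), ys - ys.mean()])
    cov = pts @ pts.T
    # Leading eigenvector of the 2x2 covariance gives the main motion axis.
    eigvals, eigvecs = np.linalg.eigh(cov)
    main_axis = eigvecs[:, np.argmax(eigvals)]
    angle = np.arctan2(main_axis[1], main_axis[0])
    c, s = np.cos(-angle), np.sin(-angle)
    rot = np.array([[c, -s], [s, c]])
    return rot @ pts

xs = np.array([0.0, 1.0, 2.0, 3.0])
ys = np.array([0.0, 1.0, 2.0, 3.0])  # motion along a 45-degree axis
rotated = rotate_centroid_trajectory(xs, ys)
```

After rotation the motion lies entirely along the new abscissa, so the ordinate components of the rotated trajectory are approximately zero.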
In one embodiment, the correcting the projection data having the motion feature in the acquired projection data based on the motion feature includes:
Determining the movement intensity of the centroid of the detected object at each scanning position based on the corresponding relation between the scanning time and the scanning position and the movement characteristics;
determining reconstruction weights of projection data corresponding to the detected object at each scanning position according to the movement intensity of the centroid of the detected object at each scanning position, wherein the reconstruction weights of the corresponding projection data are smaller for the scanning position with the more intense centroid movement;
and acquiring the product of the projection data of each scanning position and the corresponding reconstruction weight to obtain corrected projection data.
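A sketch of the weighting scheme above. The inverse mapping from motion intensity to weight is an assumption; the text only requires that scan positions with stronger centroid motion receive smaller reconstruction weights. Names and values are illustrative.

```python
import numpy as np

def motion_weights(intensity):
    """Map per-position motion intensity to a reconstruction weight so
    that stronger centroid motion yields a smaller weight (an inverse
    mapping is assumed here)."""
    return 1.0 / (1.0 + intensity)

# Motion intensity of the centroid at each scan position (illustrative).
intensity = np.array([0.0, 0.5, 2.0, 0.1])
weights = motion_weights(intensity)

# Corrected projection data = projection data x reconstruction weight.
projections = np.array([10.0, 10.0, 10.0, 10.0])
corrected = projections * weights
```

Data acquired at the position with the strongest motion (intensity 2.0) is down-weighted the most, so it contributes least to the reconstruction.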
In one embodiment, the determining the intensity of the motion of the centroid of the detected object at each scanning position based on the correspondence between the scanning time and the scanning position and the motion feature includes:
converting the motion characteristics into the motion relation of the centroid of the detected object relative to each scanning position according to the corresponding relation between the scanning time and the scanning position;
and calculating the movement intensity of the center of mass of the detected object at each scanning position based on the movement relation.
In one embodiment, the determining the intensity of the motion of the centroid of the detected object at each scanning position based on the correspondence between the scanning time and the scanning position and the motion feature includes:
Calculating the intensity of the motion of the center of mass of the detected object at each scanning time based on the motion characteristics;
and converting the movement intensity of the center of mass of the detected object at each scanning time into the movement intensity of the center of mass of the detected object at each scanning position based on the corresponding relation between the scanning time and the scanning position.
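Under either ordering of the two embodiments above, the time-to-position conversion amounts to re-indexing intensity values through the scanning-time/scanning-position correspondence; the values below are illustrative.

```python
import numpy as np

# Each scanning time corresponds to one scanning position
# (e.g. a gantry angle in degrees; values are illustrative).
scan_times = np.array([0.0, 0.1, 0.2, 0.3])
scan_positions = np.array([0, 90, 180, 270])
intensity_by_time = np.array([0.0, 0.4, 1.2, 0.3])

# Re-index motion intensity from scan time to scan position.
intensity_by_position = dict(zip(scan_positions.tolist(), intensity_by_time.tolist()))
```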
According to a second aspect of embodiments of the present application, there is provided an apparatus for correcting motion artifacts, comprising:
the data acquisition module is used for acquiring projection data of the detected object;
the characteristic extraction module is used for mapping the motion characteristic of the centroid of the detected object changing along with the scanning time by tracking the centroid of the acquired projection data;
a data correction module, configured to correct projection data with a motion feature in the acquired projection data based on the motion feature, where the projection data with the motion feature includes: the data obtained by scanning the detected object when the center of mass of the detected object moves;
and the image reconstruction module is used for reconstructing an image according to the corrected projection data to obtain a reconstructed image of the detected object.
In one embodiment, the acquired projection data includes a plurality of sets of projection data, and one set of projection data is data obtained by scanning the object once, and the feature extraction module includes:
The parameter determining module is used for determining position parameters and scanning time corresponding to each group of projection data;
the centroid calculation module is used for converting each group of projection data and the corresponding position parameters into the centroid of the group of projection data at the corresponding scanning time;
the centroid mapping module is used for mapping the scanning time corresponding to each group of projection data and the centroid of the projection data at the scanning time to the centroid of the detected object at the corresponding scanning time;
and the first fitting module is used for performing data fitting on the centroid of the detected object at each scanning time to obtain the change relation of the centroid of the detected object relative to the scanning time.
In one embodiment, the position parameter is a position parameter of a channel in which projection data is acquired.
In one embodiment, the feature extraction module further comprises:
the characteristic determining module is used for representing the motion characteristic of the center of mass of the detected object at each scanning time according to the variation relation of the center of mass of the detected object relative to the scanning time when the device bearing the detected object is in a static state during the period that the detected object is scanned;
the time difference determining module is used for calculating the time difference between the determined maximum scanning time and the determined minimum scanning time when the equipment bearing the detected object is in a motion state during the period that the detected object is scanned;
The centroid accumulation module is used for acquiring the accumulation result of the centroid of the detected object at each scanning time;
the error calculation module is used for calculating the ratio of the accumulated result to the time difference as an equipment error;
and the error correction module is used for eliminating the equipment error from the change relation to obtain a corrected change relation.
In one embodiment, the feature extraction module further comprises:
the included angle determining module is used for determining, based on the corrected change relationship, the value of the included angle between the main motion axis of the centroid of the detected object and the abscissa axis of the current coordinate system;
and the coordinate rotation module is used for rotating the coordinates of the corrected change relationship according to the included angle value to obtain the rotated change relationship.
In one embodiment, the data correction module comprises:
the motion degree determining module is used for determining the motion intensity degree of the centroid of the detected object at each scanning position based on the corresponding relation between the scanning time and the scanning position and the motion characteristics;
the reconstruction weight determining module is used for determining the reconstruction weight of the projection data corresponding to the detected object at each scanning position according to the movement intensity of the centroid of the detected object at each scanning position, wherein the reconstruction weight of the corresponding projection data is smaller for the scanning position with more intense centroid movement;
And the data correction submodule is used for acquiring the product of the projection data of each scanning position and the corresponding reconstruction weight to obtain the corrected projection data.
In one embodiment, the degree of motion determination module is configured to:
converting the motion characteristics into the motion relation of the centroid of the detected object relative to each scanning position according to the corresponding relation between the scanning time and the scanning position;
and calculating the movement intensity of the center of mass of the detected object at each scanning position based on the movement relation.
In one embodiment, the degree of motion determination module is configured to:
calculating the intensity of the motion of the center of mass of the detected object at each scanning time based on the motion characteristics;
and converting the movement intensity of the center of mass of the detected object at each scanning time into the movement intensity of the center of mass of the detected object at each scanning position based on the corresponding relation between the scanning time and the scanning position.
According to a third aspect of embodiments of the present application, there is provided a computer apparatus comprising:
a processor;
a memory storing processor-executable instructions;
wherein the processor is coupled to the memory for reading program instructions stored by the memory and, in response, performing operations in the method as described above.
According to a fourth aspect of embodiments herein, there is provided one or more machine-readable storage media having instructions stored thereon, which when executed by one or more processors, cause a computer device to perform operations in a method as described above.
By applying the embodiment of the application, the movement characteristics of the centroid of the detected object changing along with the scanning time can be mapped by tracking the centroid of the projection data of the detected object, then the data with the movement characteristics in the acquired projection data is corrected based on the movement characteristics, and then the image reconstruction is performed based on the corrected projection data. Because the projection data obtained when the centroid moves is corrected, the influence of the projection data obtained when the detected object moves on the reconstructed image can be effectively reduced, the image is reconstructed based on the corrected projection data, the motion artifact in the reconstructed image can be improved, the quality of the reconstructed image is improved, and an accurate basis is provided for subsequent diagnosis based on the reconstructed image.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the application.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present application and together with the description, serve to explain the principles of the application.
FIG. 1A is a reconstructed image with motion artifacts shown in an exemplary embodiment of the present application;
FIG. 1B is a flow chart illustrating a method of correcting motion artifact according to an exemplary embodiment of the present application;
FIG. 1C is an architectural diagram of a scanning device shown in an exemplary embodiment of the present application;
FIG. 1D is a schematic diagram illustrating a process of two adjacent scans according to an exemplary embodiment of the present application;
FIG. 1E is a schematic view of a parallel beam of radiation shown in an exemplary embodiment of the present application;
FIG. 2 is a flow chart illustrating a method of correcting motion artifact according to another exemplary embodiment of the present application;
FIG. 3 is a flow chart illustrating a method of correcting motion artifact according to another exemplary embodiment of the present application;
fig. 4A is a schematic diagram illustrating a motion characteristic of a change in a centroid of an inspected object with respect to a scanning time according to an exemplary embodiment of the present application;
fig. 4B is a schematic diagram illustrating a motion relationship of a centroid of a detected object with respect to a scanning position according to an exemplary embodiment of the present application;
FIG. 4C is a schematic diagram illustrating the determination of reconstruction weights in accordance with an exemplary embodiment of the present application;
FIG. 5 is a diagram illustrating a hardware configuration of a computer device according to an exemplary embodiment of the present application;
fig. 6 is a block diagram illustrating an apparatus for correcting motion artifact according to an exemplary embodiment of the present application.
Detailed Description
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in this application and the appended claims, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items.
The motion artifact related to the embodiment of the present application refers to a motion artifact caused by a macroscopic motion of a detected object in a process of scanning the detected object by an imaging system, and can be divided into an autonomous motion artifact and a physiological motion artifact, where the autonomous motion artifact refers to an artifact caused by a motion that does not have periodicity and can be autonomously controlled by the detected object, such as: artifacts caused by swallowing, eyeball rotation, limb movement, and the like; physiological motion artifacts refer to artifacts caused by involuntary motion that is not controlled by the subject, such as: heart beats, periodic motion such as blood vessel pulsation, or non-periodic motion such as gastrointestinal peristalsis.
In practical applications, when an object is scanned by an imaging system such as CT (Computed Tomography), PET (Positron Emission Tomography), or PET-CT, the motion artifacts described above may appear in the reconstructed image formed by the imaging system due to macroscopic motion of the object. For example, fig. 1A shows a CT reconstructed image of the abdomen with blurred structure and poor definition, which may affect the accuracy of the diagnosis result.
In the process of reconstructing an image, the imaging system performs image reconstruction based on the combined result of multiple sets of projection data. Therefore, the more projection data with motion characteristics there are among the projection data used for reconstruction, the more likely motion artifacts are to appear in the reconstructed image, and the more severe those artifacts are. In view of this, the present application provides a method for correcting motion artifacts. Data obtained by scanning the detected object while its centroid is moving (the object is in motion when its centroid moves) is referred to herein as projection data with motion characteristics.
In practical application, the motion characteristic of the center of mass of the detected object changing along with the scanning time can be mapped by tracking the center of mass of the projection data of the detected object, then the data with the motion characteristic in the acquired projection data is corrected based on the motion characteristic, and then the image reconstruction is performed based on the corrected projection data. Because the projection data obtained when the centroid moves is corrected, the influence of the projection data obtained when the detected object moves on the reconstructed image can be effectively reduced, the image is reconstructed based on the corrected projection data, the motion artifact in the reconstructed image can be improved, the quality of the reconstructed image is improved, and an accurate basis is provided for subsequent diagnosis based on the reconstructed image. The method for correcting motion artifacts provided by the present application is described in detail below with reference to the accompanying drawings.
Referring to fig. 1B, fig. 1B is a flowchart illustrating a method for correcting motion artifacts according to an exemplary embodiment of the present application, which may include the following steps S101-S104:
step S101, projection data of the object to be inspected is acquired.
And step S102, mapping the motion characteristics of the centroid of the detected object changing along with the scanning time by tracking the centroid of the acquired projection data.
Step S103, correcting projection data with motion features in the acquired projection data based on the motion features, wherein the projection data with motion features includes: the detected object is scanned when the center of mass of the detected object moves.
And S104, carrying out image reconstruction according to the corrected projection data to obtain a reconstructed image of the detected object.
The projection data in the embodiments of the present application may also be referred to as raw data: after a signal source of an imaging system transmits a corresponding signal to the detected object, a detector of the imaging system receives the signal released by the object and converts the received signal to generate the projection data. The imaging system here may be a CT imaging system, a PET-CT imaging system, or any other imaging system that reconstructs images from the combined results of multiple sets of projection data. Taking a CT system as an example, the generation of projection data is described below in detail with reference to fig. 1C:
the CT imaging system may include a scanning device as shown in fig. 1C, which may include a gantry 11, an X-ray emitter and X-ray detector 16 mounted on the gantry 11, and a device 15 carrying an object to be examined. Wherein the X-ray emitter may comprise an X-ray source 12 and a collimator device 13, the device 15 may be a scanning bed, and the X-ray detector 16 may comprise a plurality of channels for detecting X-rays.
In practical applications, the apparatus 15 may travel along an axis R parallel to the z direction to bring an object to be inspected (not shown in fig. 1C) to a corresponding inspection area, and the gantry 11 may rotate around the axis R to drive the X-ray emitter to a plurality of positions to scan the object to be inspected at the inspection area for a plurality of times.
In each scan, the collimator device 13 shapes the X-rays generated by the X-ray source 12 into a parallel beam, a fan beam, or a cone beam, and the X-ray emitter emits the resulting X-ray beam 14 toward the object to be inspected at the detection area.
The emitted X-ray beam 14 passes through the object to be inspected and reaches the X-ray detector 16, which receives the X-rays, converts them into electrical signals by photoelectric conversion, and converts the electrical signals into digital signals through an ADC (Analog-to-Digital Converter). These digital signals may be referred to as raw data, or as projection data.
After the X-ray detector 16 outputs the projection data, in order to avoid confusing the projection data obtained in different scans, the embodiments of the present application may group the projection data: one set of projection data is the data obtained by scanning the object once, and it corresponds to at least one of the scanning time, the scanning position, and the scanning angle of that scan. The scanning angle can be determined from the current scan count and the rotation angle between two adjacent scans. Fig. 1D shows two adjacent scans at positions W1 and W2 separated by a rotation angle θ; in each scan, the X-ray emitter 21 projects a parallel X-ray beam toward the object to be inspected, and the X-ray detector 22 receives the beam and generates projection data.
In addition, different channels in the X-ray detector may be used to distinguish the same set of projection data, and each piece of projection data in the same set may be stored corresponding to each channel, for example, the projection data output by each channel may be stored corresponding to the position parameter of each channel.
When the detected object is actually scanned, the gantry can drive the X-ray emitter and the X-ray detector to rotate while advancing along the R-axis direction shown in fig. 1C, so the position parameters of each channel change with the scanning time and the scan count. At each scan, three-dimensional coordinates may be assigned to each channel as its position parameter, with reference to the three-dimensional coordinate system shown in fig. 1C.
In another example, considering that the position of each channel changes as the X-ray detector rotates while traveling along the R-axis direction shown in fig. 1C, this change can be decomposed into linear travel along the z direction and angular rotation within the xy plane. Therefore, at each scan, an x coordinate and a y coordinate may be configured as the position parameter of the central channel with reference to the x-y coordinate system, and the position parameters of the other channels are determined from their positions relative to the central channel together with the central channel's position parameter. For example, when the X-ray detector has 700 channels, the position parameter of the central channel is defined as 349.5, and the position parameter of each other channel is defined as its channel index minus the position parameter of the central channel.
As shown in fig. 1E, when the X-ray emitter projects a parallel ray beam to the object to be inspected, the central channel may refer to: the channel in the X-ray detector that receives the rays 31 in the middle of the beam.
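The 700-channel example can be written out directly; the formula "channel index minus the central channel's position parameter" is taken from the text, and the helper name is illustrative.

```python
def channel_position(channel_index, n_channels=700):
    """Position parameter of a detector channel relative to the central
    channel. For 700 channels the central channel's position parameter
    is (700 - 1) / 2 = 349.5, as in the text's example."""
    center = (n_channels - 1) / 2
    return channel_index - center

# First, the two middle, and the last of the 700 channels (indices 0..699).
positions = [channel_position(i) for i in (0, 349, 350, 699)]
```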
When correcting the motion artifact, the embodiment of the present application may receive projection data output by a detector of the imaging system, may also retrieve the projection data from a storage area of the imaging system, and may also retrieve the projection data in other manners, which is not limited in this application. Which projection data of the object to be examined are specifically acquired can be determined according to at least one of the required correction accuracy, the reconstruction position of the reconstructed image, and the motion cycle of the object to be examined. For example: if the object to be examined includes at least a part of the heart, projection data in time periods corresponding to N cardiac motion cycles may be acquired, and further, if the requirement for correction accuracy is high, N may be configured to be a large value, such as an integer of 10 or more, and if the requirement for correction accuracy is low, N may be configured to be a small value, such as a positive integer of less than 10.
Since the acquired projection data has motion characteristic data, which easily causes motion artifacts in the reconstructed image, in the embodiment of the present application, before reconstructing an image based on the acquired projection data, in order to correct the motion artifacts, the projection data obtained when the detected object moves needs to be corrected, so as to reduce the influence of the data on the reconstructed image.
Further, in an actual imaging scene it is difficult to obtain the motion characteristics of the detected object by directly monitoring its change in position, whether the object undergoes voluntary motion such as swallowing, eye movement, and limb movement, or physiological motion such as heart beats, blood vessel pulsation, and gastrointestinal peristalsis. Therefore, in the embodiments of the present application, after the projection data of the detected object is acquired, the motion characteristic of the centroid of the detected object changing with scanning time is mapped out by tracking the centroid of the acquired projection data.
In some examples, after the object to be detected is scanned by the imaging system each time, if the projection data of the object to be detected is acquired, the centroid of the projection data may be tracked first, and then the tracked centroid is mapped according to the mapping relationship between the centroid of the object to be detected and the centroid of the projection data, so as to obtain the motion characteristic of the centroid of the object to be detected changing along with the scanning time. According to the embodiment of the application, the center of mass of the detected object at each scanning time can reflect the change relation of the center of mass of the detected object relative to the scanning time, and therefore, the motion characteristics can be represented by the center of mass of the detected object at each scanning time. Wherein the centroid can be identified by its location parameters, such as centroid coordinates.
In other embodiments, data fitting may be performed on a plurality of scanning times and the centroid of the detected object at each of the plurality of scanning times to obtain a variation relationship between the centroid of the detected object and the scanning time, so as to represent the motion characteristic of the centroid of the detected object varying with time. When performing data fitting, the number of scan times to be fitted may be determined based on at least one of the required correction accuracy, the range of scan positions corresponding to the reconstruction position of the reconstructed image, and the motion cycle of the object to be examined. For example: the higher the accuracy requirement, the greater the number of scan times to fit.
In this example, the acquired data may include multiple sets of projection data of the object to be detected, where one set of projection data is data obtained by scanning the object to be detected once, and when tracking the centroid of the acquired projection data, the centroids of the sets of projection data may be tracked, so as to obtain the centroid of each set of projection data at the corresponding scanning time. When mapping, the centroids of the sets of projection data at the corresponding scanning times may be mapped to the centroids of the detected object at the corresponding scanning times, respectively. Specific implementation can be seen in fig. 2, and the method shown in fig. 2 may include the following steps S201 to S204:
Step S201, determining the position parameters and the scanning time corresponding to each set of projection data.
Step S202, each set of projection data and the corresponding position parameter are converted into the centroid of the set of projection data in the corresponding scanning time.
Step S203, mapping the scanning time corresponding to each set of projection data and the centroid thereof at the scanning time to the centroid of the detected object at each scanning time.
Step S204, performing data fitting on the centroid of the detected object at each scanning time to obtain the variation relation of the centroid of the detected object relative to the scanning time.
In step S201, the position parameter and the scanning time may be determined simultaneously or separately. The position parameters and the scanning time can be specifically referred to the relevant contents in the embodiments related to fig. 1B to 1E.
In addition, the position parameter may also be other parameters that can represent a relative position relationship between projection data, which is not limited in this embodiment of the present application.
In step S202, each set of projection data and its corresponding position parameter may be converted into a centroid of the set of projection data at its corresponding scan time according to a predetermined correspondence between the centroid and the projection data and its corresponding position parameter.
In other examples, each set of projection data and the corresponding position parameter may be substituted into a predetermined centroid calculation formula to obtain a centroid of the set of projection data at the corresponding scan time.
When the imaging system is a CT imaging system and the emitted X-ray beam is a parallel beam, the centroid calculation formula may be as in formula (1):
p_com(θ_i) = ( Σ_{j=0}^{nChannelNum−1} t_j · p(θ_i, t_j) ) / ( Σ_{j=0}^{nChannelNum−1} p(θ_i, t_j) )    (1);

wherein nChannelNum represents the number of channels contained in the detector, p(θ_i, t_j) represents the projection data corresponding to the scan angle θ_i and the position parameter t_j, p_com(θ_i) represents the centroid of the projection data whose corresponding scan angle is θ_i, i ∈ [0, nHalfViewPerRot), j ∈ [0, nChannelNum), and nHalfViewPerRot represents the number of times the detected object is scanned by the scanning device within a half turn of the gantry rotation.
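For illustration, formula (1) is the intensity-weighted mean of the channel positions within one view; a minimal sketch in Python follows (function and variable names are illustrative, not part of the original disclosure):

```python
import numpy as np

def projection_centroid(view, t):
    """Centroid of one projection view as in formula (1):
    sum_j t_j * p(theta_i, t_j) / sum_j p(theta_i, t_j)."""
    view = np.asarray(view, dtype=float)
    t = np.asarray(t, dtype=float)
    total = view.sum()
    if total == 0.0:                 # guard: an all-zero view has no centroid
        return 0.0
    return float((t * view).sum() / total)

# A profile symmetric about t = 0 yields a centroid of 0.
t = np.linspace(-1.0, 1.0, 5)        # 5 detector-channel positions t_j
view = np.array([0.0, 1.0, 2.0, 1.0, 0.0])
print(projection_centroid(view, t))  # 0.0
```

Shifting the profile toward larger channel positions shifts the centroid accordingly, which is exactly the effect the tracking step relies on.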
In step S203, according to the mapping relationship between the centroid of the detected object and the projection centroid thereof, the scanning time corresponding to each set of projection data and the centroid thereof at the scanning time are mapped as the centroid of the detected object at the corresponding scanning time.
In other examples, each set of projection data and the corresponding position parameter may be substituted into a predetermined mapping formula to obtain a centroid of the set of projection data at the corresponding scan time.
When the imaging system is a CT imaging system and the emitted X-ray beam is a parallel beam, the mapping formula may be as in formula (2):
p_com(θ_i) = x_com(t) · cosθ_i + y_com(t) · sinθ_i    (2);

wherein (x_com(t), y_com(t)) represents the centroid coordinates of the detected object at time t.
Substituting each group of projection data and the corresponding position parameters into the mapping formula to obtain a result shown in formula (3):
p_com(θ_0) = x_com(t) · cosθ_0 + y_com(t) · sinθ_0
p_com(θ_1) = x_com(t) · cosθ_1 + y_com(t) · sinθ_1
…
p_com(θ_n) = x_com(t) · cosθ_n + y_com(t) · sinθ_n    (3);

wherein t − Δt < t_i < t + Δt, i ∈ [0, n]. Δt is greater than 0, and its specific value can be set by the designer according to the actual scene; if the imaging system is a CT system and the detected object includes the heart, averaging out the law of the heart motion needs to be avoided as much as possible when setting Δt, and the value may be set between 7 s and 10 s.
In the embodiment of the present application, formula (3) can also reflect the variation relationship of the centroid of the detected object relative to the scanning time, and therefore, the motion characteristic of the centroid of the detected object relative to the scanning time variation can be represented by formula (3). The data fitting can also be performed on the above formula (3), so as to obtain the variation relation of the centroid of the detected object relative to the scanning time, and represent the motion characteristic of the centroid of the detected object relative to the scanning time variation. The data fitting method can be various, and the application embodiment does not limit the data fitting method.
In some examples, the variation relationship shown in the following formula can be obtained by the least squares method:
X(t) = (A(t)^T A(t))^(−1) A(t)^T Y(t)    (4);
wherein, A (t), Y (t), X (t) are respectively shown as the following formulas:
A(t) = [cosθ_0, sinθ_0; cosθ_1, sinθ_1; … ; cosθ_n, sinθ_n]    (5);

Y(t) = [p_com(θ_0); p_com(θ_1); … ; p_com(θ_n)]    (6);

X(t) = [x_com(t); y_com(t)]    (7);

wherein a semicolon separates matrix rows.
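The least-squares solution of formula (4) can be checked numerically; the following sketch is illustrative only, with hypothetical angle and window choices:

```python
import numpy as np

def fit_centroid(thetas, p_com):
    """Solve p_com(theta_i) = x_com*cos(theta_i) + y_com*sin(theta_i)
    in the least-squares sense, i.e. X = (A^T A)^-1 A^T Y of formula (4)."""
    A = np.column_stack([np.cos(thetas), np.sin(thetas)])  # A(t)
    Y = np.asarray(p_com, dtype=float)                     # Y(t)
    X, *_ = np.linalg.lstsq(A, Y, rcond=None)              # X(t) = (x_com, y_com)
    return X

# Synthetic check: projection centroids generated from the point (2, -1)
# are mapped back to (2, -1) by the fit.
thetas = np.linspace(0.0, np.pi, 50, endpoint=False)
p_com = 2.0 * np.cos(thetas) - 1.0 * np.sin(thetas)
x_com, y_com = fit_centroid(thetas, p_com)
print(round(float(x_com), 6), round(float(y_com), 6))  # 2.0 -1.0
```

In practice the views inside the window (t − Δt, t + Δt) contribute the rows of A(t), so the recovered (x_com, y_com) is the centroid averaged over that window.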
in other embodiments, when the motion characteristic of the centroid of the detected object changing with the scanning time is mapped by tracking the centroid of the acquired projection data, the acquired projection data may be directly substituted into the above formula (4) after the detected object is scanned each time, so as to obtain the change relationship of the centroid of the detected object with respect to the scanning time.
With regard to the variation relationship obtained in any of the above embodiments, if the device carrying the detected object is in a stationary state during the period when the detected object is scanned, the variation relationship of the centroid of the detected object with respect to the scanning time may be used to represent the motion characteristics of the centroid of the detected object at each scanning time. The apparatus for carrying the object to be examined, which is mentioned here, may vary from imaging system to imaging system, e.g. a scanning bed if the imaging system is a CT system.
If the equipment carrying the detected object is in a motion state while the detected object is scanned, the deviation introduced by the motion of the equipment into the centroid calculation result needs to be corrected. In one example, the centroid of the detected object at each scanning time may be reduced by the average value of the centroid within a certain time range around that scanning time; the error may be corrected by:
And determining the accumulated value of the center of mass of the detected object in a preset time range before and after each scanning time.
And calculating half of the ratio of the centroid accumulated value to the preset time range as the equipment error.
And obtaining the corrected centroid of the detected object at the scanning time according to the difference value between the centroid of the detected object at the scanning time and the equipment error. The predetermined time range may be Δ t, and the corrected centroid is the centroid after the device error is eliminated.
In this example, the motion characteristics of the centroid of the object to be detected at each scanning time may be represented by the corrected centroid of the object to be detected at each scanning time. The modified centroid of the detected object at each scanning time can also be subjected to data fitting to obtain the variation relation of the modified centroid of the detected object relative to the scanning time, which represents the motion characteristics of the centroid of the detected object at each scanning time.
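A minimal sketch of the per-scanning-time correction described above, using a window mean in place of the "accumulated value over the predetermined time range" (names and data are illustrative assumptions):

```python
import numpy as np

def remove_couch_drift(times, centroids, dt):
    """Subtract from each centroid the mean centroid within the window
    (t - dt, t + dt), which removes the slow trend introduced by a
    uniformly moving couch while keeping the physiological motion."""
    times = np.asarray(times, dtype=float)
    centroids = np.asarray(centroids, dtype=float)
    corrected = np.empty_like(centroids)
    for k, t in enumerate(times):
        mask = (times > t - dt) & (times < t + dt)
        corrected[k] = centroids[k] - centroids[mask].mean()
    return corrected

# A uniformly moving couch adds a linear drift to the centroid track;
# away from the edges the correction removes it almost completely.
t = np.arange(21, dtype=float)               # scan times 0..20 s
corrected = remove_couch_drift(t, 0.1 * t, dt=8.0)
print(abs(corrected[10]) < 1e-9)             # True (window fully inside)
```

Near the ends of the scan the window is asymmetric, so a residual bias remains there; this matches the text's restriction to a predetermined time range around each scanning time.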
In another example, for the variation relationship of the centroid of the detected object relative to the scanning time obtained by data fitting, as shown in formula (4), the deviation of the motion of the apparatus to the centroid calculation result can be corrected by:
the time difference between the determined maximum scan time and the determined minimum scan time is calculated, and the maximum scan time and the minimum scan time may refer to the maximum value and the minimum value of the scan time corresponding to each projection data.
And acquiring an accumulation result of the centroid of the detected object at each scanning time.
And calculating the ratio of the accumulation result to the time difference as the equipment error.
And eliminating the equipment error from the change relation to obtain a corrected change relation.
In other examples, when the deviation of the motion of the apparatus to the centroid calculation result is removed, the centroid of the detected object at each scanning time may be directly substituted into the following formula to obtain the modified variation relationship:
X_u(t) = X(t) − ( Σ_{i=0}^{n} X(t_i) ) / (t_max − t_min)    (8);

wherein X(t_i) is the centroid of the detected object at time t_i, and t_max and t_min are the maximum and minimum scanning times corresponding to the projection data.
In practical application, considering that taking the main motion axis as a coordinate axis facilitates analysis of the motion characteristics of the centroid, in the embodiment of the present application, for the change relationship obtained in any of the above embodiments, when the main motion axis of the detected object is a coordinate axis of the current coordinate system, the change relationship of the corrected centroid of the detected object with respect to the scanning time may be used to represent the motion characteristics of the centroid of the detected object at each scanning time. The current coordinate system mentioned here may be set by the designer of the present solution according to the practical application scenario, such as the X-Y coordinate system in the three-dimensional coordinate system shown in fig. 1C.
When the main motion axis of the detected object is not any coordinate axis of the current coordinate system, in an example, the main motion axis of the centroid of the detected object and the included angle value of the abscissa axis of the current coordinate system may be determined based on the modified variation relationship; and carrying out coordinate rotation on the corrected change relationship according to the included angle value to obtain the rotated change relationship.
In another example, the modified variation relationship may be substituted into the following equation to obtain an inertia matrix:
I = [I_xx, I_xy; I_xy, I_yy]    (9);

wherein I_xx, I_xy, I_yy are respectively as follows:

I_xx = Σ_{i=0}^{n} y_ucom(t_i)²    (10);

I_xy = −Σ_{i=0}^{n} x_ucom(t_i) · y_ucom(t_i)    (11);

I_yy = Σ_{i=0}^{n} x_ucom(t_i)²    (12);

wherein x_ucom(t_i) represents the transverse coordinate of the centroid of the detected object in equation (8), and y_ucom(t_i) represents the longitudinal coordinate of the centroid of the detected object in equation (8).
Performing an orthogonal transformation on equation (9) can obtain a diagonal matrix as shown in the following equation:
I′ = [I′_xx, 0; 0, I′_yy]    (13);

wherein I′_xx ≥ I′_yy.
The rotation angle corresponding to the above orthogonal transformation is α, which satisfies tan(2α) = 2 · I_xy / (I_xx − I_yy).
Rotating the transverse coordinate and the longitudinal coordinate of the centroid of the detected object in the formula (8) according to the rotation angle can obtain the variation relation of the centroid of the detected object with respect to the scanning time in the coordinate system of the main motion axis, as shown in the following formula:
x′_ucom(t_i) = x_ucom(t_i) · cosα + y_ucom(t_i) · sinα
y′_ucom(t_i) = −x_ucom(t_i) · sinα + y_ucom(t_i) · cosα    (14);
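The orthogonal transformation described above amounts to a principal-axis rotation of the corrected centroid track; a sketch using an eigen-decomposition in place of the explicit angle formula (function names and test data are assumptions for illustration):

```python
import numpy as np

def rotate_to_main_axis(x, y):
    """Rotate the corrected centroid track so that its main motion axis
    lies along a coordinate axis, by diagonalising the inertia matrix
    I = [[sum(y^2), -sum(x*y)], [-sum(x*y), sum(x^2)]]."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    I = np.array([[np.sum(y * y), -np.sum(x * y)],
                  [-np.sum(x * y), np.sum(x * x)]])
    _, vecs = np.linalg.eigh(I)              # orthogonal transformation
    # The eigenvector of the smallest moment points along the main motion axis.
    alpha = np.arctan2(vecs[1, 0], vecs[0, 0])
    x_rot = x * np.cos(alpha) + y * np.sin(alpha)
    y_rot = -x * np.sin(alpha) + y * np.cos(alpha)
    return x_rot, y_rot

# Motion along the 45-degree line ends up on a single axis after rotation.
s = np.linspace(-1.0, 1.0, 11)
x_rot, y_rot = rotate_to_main_axis(s, s)
print(np.allclose(y_rot, 0.0))  # True
```

After the rotation the second coordinate of the track is (numerically) zero, so the subsequent intensity analysis can be carried out along a single axis.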
As can be seen from the foregoing embodiments, through the operations of any of the above embodiments, the centroid of the detected object at each scanning time, the corrected centroid at each scanning time, the change relationship of the centroid with respect to the scanning time, the corrected change relationship, or the rotated change relationship can be obtained, and whichever of them is used to represent the motion characteristic of the centroid of the detected object relative to the scanning time may be determined according to the actual application scenario.
After the motion characteristic of the centroid of the detected object relative to the scanning time is determined, in one example, considering that the more intense the motion of the detected object is while it is scanned, the more serious the motion artifact in the reconstructed image is, the motion intensity of the centroid of the detected object at each scanning time may be determined based on the motion characteristic, and the projection data obtained in the time periods in which the motion intensity is higher than a predetermined value may be eliminated from the acquired projection data; the remaining projection data are the corrected projection data. The motion intensity can be represented by the speed of the centroid. The predetermined value may be set according to the actual correction requirement: the smaller the predetermined value, the higher the correction accuracy.
When the movement intensity of the center of mass of the detected object at each scanning time is determined, if the movement feature is represented by the center of mass or the corrected center of mass of the detected object at each scanning time, the movement intensity can be determined by the following operations:
the difference between adjacent scan times is calculated.
And calculating the difference value of the centroids or the corrected centroids of the two adjacent scanning times.
And then calculating the ratio of the two differences to obtain the average speed of the centroid or the corrected centroid in the adjacent scanning time, and indicating the movement intensity of the centroid of the detected object between the adjacent scanning times.
If the motion characteristics are represented by the change relationship obtained by fitting, the change relationship after correction or the change relationship after rotation, the derivative of the change relationship can be directly calculated to obtain the change rule of the centroid speed of the detected object relative to the scanning time so as to represent the motion intensity of the centroid of the detected object at each scanning time.
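The finite-difference speed computation and the elimination of violently moving intervals described above can be sketched as follows (threshold, data, and names are illustrative):

```python
import numpy as np

def motion_intensity(times, centroids):
    """Average centroid speed between adjacent scan times:
    |c_{k+1} - c_k| / (t_{k+1} - t_k)."""
    return np.abs(np.diff(centroids)) / np.diff(times)

def drop_violent_views(projections, intensity, limit):
    """Eliminate the projection acquired in each interval whose motion
    intensity exceeds the predetermined value; the rest is kept."""
    return [p for p, v in zip(projections, intensity) if v <= limit]

t = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
c = np.array([0.00, 0.10, 1.50, 1.60, 1.65])  # a jump between t=1 and t=2
v = motion_intensity(t, c)                    # ~[0.1, 1.4, 0.1, 0.05]
print(drop_violent_views(["view0", "view1", "view2", "view3"], v, limit=0.5))
# ['view0', 'view2', 'view3']
```

The view spanning the violent motion (intensity 1.4 > 0.5) is discarded; a smaller limit would discard more views and, as the text notes, yield a higher correction accuracy at the cost of fewer data.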
In other embodiments, the projection data with a moving characteristic may also be corrected by configuring a reconstruction weight for each acquired projection data, and specifically, referring to fig. 3, the method shown in fig. 3 may include the following steps S301 to S303:
step S301, determining the movement intensity of the centroid of the detected object at each scanning position based on the corresponding relation between the scanning time and the scanning position and the movement characteristics. The scan position referred to herein may vary with the predetermined coordinate system, and if the imaging system is a CT system, the scan position may be represented by the Z coordinate shown in FIG. 1C.
Step S302, according to the intensity of movement of the centroid of the detected object at each scanning position, determining the reconstruction weight of the projection data corresponding to the detected object at each scanning position, wherein the reconstruction weight of the corresponding projection data is smaller for the scanning position with the more intense centroid movement. The specific correspondence between the two can be set according to the actual correction requirement, such as linear relationship, Cos weight, trapezoidal relationship, etc.
Step S303, acquiring the product of the projection data of each scanning position and the corresponding reconstruction weight to obtain the corrected projection data.
In one example, determining the intensity of the motion of the centroid of the detected object at each scanning position based on the correspondence between the scanning time and the scanning position and the motion characteristics may include:
and converting the motion characteristics into the motion relation of the centroid of the detected object relative to each scanning position according to the corresponding relation between the scanning time and the scanning position.
And calculating the movement intensity of the center of mass of the detected object at each scanning position based on the movement relation.
In this example, if the motion characteristic is represented by a variation relationship obtained by fitting, a modified variation relationship, or a variation relationship after rotation, the variation relationship may be converted into a motion relationship of the centroid of the detected object with respect to each scanning position, and then the motion relationship is derived to obtain a motion speed of the centroid of the detected object at each scanning position, which represents a degree of intensity of motion of the centroid at each scanning position.
If the motion characteristics are represented by the centroid or the corrected centroid of the detected object at each scanning time, the centroid or the corrected centroid of the detected object at each scanning time can be converted into the centroid or the corrected centroid of the detected object at each scanning position based on the corresponding relationship between the scanning time and the scanning position, and then the difference value of the adjacent scanning positions is calculated; calculating the difference of the centroids or the corrected centroids of the adjacent scanning positions; and then calculating the ratio of the two differences to obtain the average speed of the centroid or the corrected centroid at the adjacent scanning positions, and representing the movement intensity of the centroid at each scanning position.
In another example, the intensity of the motion of the centroid of the detected object at each scanning position can be determined based on the correspondence between the scanning time and the scanning position and the motion characteristics by the following operations:
and calculating the movement intensity of the center of mass of the detected object at each scanning time based on the movement characteristics.
And converting the movement intensity of the center of mass of the detected object at each scanning time into the movement intensity of the center of mass of the detected object at each scanning position based on the corresponding relation between the scanning time and the scanning position.
In this example, if the motion characteristic is represented by a variation relationship obtained by fitting, a modified variation relationship, or a variation relationship after rotation, the variation relationship may be derived to obtain a motion speed of the detected object at each scanning time; and converting the movement speed of the detected object at each scanning time into the movement speed of the centroid of the detected object at each scanning position according to the corresponding relation between the scanning time and the scanning position, and indicating the movement intensity of the centroid of the detected object at each scanning position.
If the motion characteristics are represented by the center of mass or the corrected center of mass of the detected object at each scanning time, the difference value of the adjacent scanning time can be calculated; calculating the difference value of the centroids or the corrected centroids of two adjacent scanning times; and then calculating the ratio of the two differences to obtain the average speed of the centroid or the corrected centroid in the adjacent scanning time, converting the average speed of the centroid or the corrected centroid in the adjacent scanning time into the average speed of the centroid or the corrected centroid in the adjacent scanning position of the detected object based on the corresponding relation between the scanning time and the scanning position, and representing the motion intensity degree of the centroid of the detected object in each scanning position.
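Steps S301 to S303 can be sketched with a cosine-shaped weight, one of the correspondences named in step S302; the cut-off speed v_max and the data layout (one row of projection data per scan position) are assumptions for illustration:

```python
import numpy as np

def reconstruction_weights(intensity, v_max):
    """Step S302: map motion intensity per scan position to a weight in
    [0, 1]; the more violent the centroid motion, the smaller the weight
    (cosine-shaped falloff)."""
    v = np.clip(np.asarray(intensity, dtype=float), 0.0, v_max)
    return 0.5 * (1.0 + np.cos(np.pi * v / v_max))

def weight_projections(projections, intensity, v_max):
    """Step S303: corrected data = projection data * reconstruction weight."""
    w = reconstruction_weights(intensity, v_max)
    return np.asarray(projections, dtype=float) * w[:, np.newaxis]

w = reconstruction_weights(np.array([0.0, 0.5, 1.0]), v_max=1.0)
print(w[0], w[2])  # 1.0 0.0
```

A stationary position keeps its projection data at full weight, the most violently moving position is weighted to zero, and intermediate positions fall off smoothly, which avoids the hard cut of the elimination approach.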
As can be seen from the foregoing embodiments, based on the principle of reducing the influence of projection data with motion characteristics on a reconstructed image, after the projection data with motion characteristics are corrected, an image of the detected object may be reconstructed based on the corrected projection data. The reconstruction process may differ according to the imaging system and the reconstruction method; for details, refer to the image reconstruction technique of the corresponding imaging system. The image reconstruction process is briefly described below, taking a CT system as an example only:
an X-ray attenuation coefficient or absorption coefficient for each voxel is obtained based on the corrected projection data. The X-ray attenuation coefficients are arranged in a digital matrix (digital matrix), in which each number can be represented as an image value, e.g. a grey scale, of each pixel (pixel). All pixels generated from the digital matrix constitute a CT image.
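A toy illustration of turning the digital matrix of attenuation coefficients into per-pixel image values; the display window [mu_min, mu_max] is an arbitrary choice made for this sketch:

```python
import numpy as np

def to_grey_levels(mu, mu_min, mu_max):
    """Map each number of the digital matrix linearly to an 8-bit image
    value (grey scale); values outside the chosen window are clipped."""
    mu = np.clip(np.asarray(mu, dtype=float), mu_min, mu_max)
    scaled = 255.0 * (mu - mu_min) / (mu_max - mu_min)
    return scaled.astype(np.uint8)

mu = np.array([[0.0, 0.2], [0.4, 0.6]])      # attenuation coefficients
img = to_grey_levels(mu, 0.0, 0.4)
print(img[0, 0], img[1, 0], img[1, 1])       # 0 255 255
```

All pixels produced this way form the CT image; narrowing the window increases the displayed contrast for the tissue of interest.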
The following description, with reference to fig. 4A to 4C and specific application scenarios, schematically describes the process of correcting motion artifacts according to the present application:
the imaging system adopted in the application scenario is a CT system, the lung of the human body is scanned, and the detected object may include the lung, the heart and other human tissues because the lung is close to the heart. The object to be examined is scanned by the scanning device shown in fig. 1C.
When the gantry 11 rotates around an axis R parallel to the z direction, the X-ray emitter projects parallel ray beams to the object to be detected multiple times, and the X-ray detector 16 receives one group of X-rays each time, converts the group of X-rays into an electrical signal through photoelectric conversion, converts the electrical signal into a group of projection data through an ADC (Analog/Digital Converter), and inputs the projection data into an image reconstruction device of the CT system.
If the image reconstruction device performs image reconstruction directly on the basis of sets of projection data, it is possible to obtain a reconstructed image containing motion artifacts as shown in fig. 1A due to the beating heart.
In order to correct the motion artifact, the image reconstruction device may map a motion characteristic of the centroid of the detected object changing with the scanning time by tracking the centroid of each set of projection data, and may specifically obtain the motion characteristic through formulas (1) to (14).
In this scenario, if the human body does not move during scanning, the obtained motion characteristic can reflect the motion characteristic of the heart, as shown in fig. 4A, where the abscissa is the scanning time, the ordinate is the coordinate of the centroid on the main motion axis, and the curve represents the motion characteristic of the centroid of the detected object relative to the scanning time. The times at which the derivative on the rising edges of the curve is 0 are marked as Rn; this process is equivalent to the R-peak detection of an ECG signal.
In order to reduce the motion artifact of the reconstructed image, the projection data with the motion feature in the acquired projection data needs to be corrected based on the motion feature; and then, image reconstruction is carried out according to the corrected projection data to obtain a reconstructed image of the detected object.
When the projection data is corrected, the motion relationship of the centroid of the detected object relative to the scanning position can be obtained according to the motion characteristics and the corresponding relationship between the scanning time and the scanning position, as shown in fig. 4B, the abscissa is the scanning position, the ordinate is the coordinate of the centroid on the main motion axis, and the curve represents the motion characteristics of the centroid of the detected object relative to the scanning position. Where z0-z through z0+ z represent the range of scan positions corresponding to the projection data required to reconstruct the image at z 0.
Based on the motion relationship of the centroid of the detected object relative to the scanning position, a reconstruction weight may be configured for the projection data corresponding to each scanning position in the range from z0−z to z0+z. During configuration, the reconstruction weight may be configured according to the correspondence shown in fig. 4C, where phase_D represents the relative phase position of the R peak, the abscissa is the scanning position, curve 1 is the motion relationship of the centroid of the detected object relative to the scanning position, and curve 2 is the correspondence between the reconstruction weight of the projection data and the scanning position.
After the projection data corresponding to each scanning position in the range from z0−z to z0+z are assigned reconstruction weights according to the relationship shown in fig. 4C, and the product of each projection datum and its reconstruction weight is taken as the corrected projection data, image reconstruction is performed, and the motion artifact in the reconstructed image can be improved well.
Corresponding to examples of the method of correcting motion artifacts of the present application, examples of an apparatus for correcting motion artifacts are also provided. The apparatus for correcting motion artifacts can be applied to various computer devices, such as an image reconstruction device for performing image processing and/or image reconstruction in an imaging system such as CT or PET-CT, or other devices with data processing capability outside the imaging system. Fig. 5 schematically illustrates a hardware structure of a computer device to which the apparatus for correcting motion artifacts of the present application is applied; the computer device may include a processor 510, a memory 520, and a non-volatile storage 530. The memory 520 and the non-volatile storage 530 are machine-readable storage media, and the processor 510 and the machine-readable storage media 520 and 530 may be connected to each other via an internal bus 540. In other possible implementations, the computer device may also include a network interface 550 to enable communication with other devices or components. In addition to the processor 510, the memory 520, the network interface 550, and the non-volatile storage 530 shown in fig. 5, the device may also include other hardware according to actual functional requirements, which is not shown in fig. 5.
In different examples, the machine-readable storage media 520 and 530 may be a ROM (Read-Only Memory), a volatile memory, a non-volatile memory, a flash memory, a storage drive (e.g., a hard drive), a solid state drive, any type of storage disk (e.g., a compact disc, a DVD, etc.), or the like, or a combination thereof.
Further, a machine-readable storage medium, which may be embodied as memory 520, has stored thereon machine-executable instructions corresponding to the apparatus 600 for correcting motion artifact. Functionally partitioned, as shown in fig. 6, the apparatus 600 for correcting motion artifact may include a data acquisition module 610, a feature extraction module 620, a data correction module 630, and an image reconstruction module 640.
The data acquiring module 610 is configured to acquire projection data of the object to be detected.
And the feature extraction module 620 is configured to map a motion feature of the centroid of the detected object changing with the scanning time by tracking the centroid of the acquired projection data.
A data correcting module 630, configured to correct projection data with a motion feature in the acquired projection data based on the motion feature, where the projection data with the motion feature includes: the detected object is scanned when the center of mass of the detected object moves.
And an image reconstruction module 640, configured to perform image reconstruction according to the corrected projection data to obtain a reconstructed image of the detected object.
In some examples, the acquired projection data includes a plurality of sets of projection data, and one set of projection data is data obtained by scanning the object once, and the feature extraction module 620 may include:
and the parameter determining module is used for determining the position parameters and the scanning time corresponding to each group of projection data.
And the centroid calculation module is used for converting each group of projection data and the corresponding position parameter into a centroid of the group of projection data at the corresponding scanning time.
And the centroid mapping module is used for mapping the scanning time corresponding to each group of projection data and the centroid of the projection data at the scanning time to the centroid of the detected object at the corresponding scanning time.
And the first fitting module is used for performing data fitting on the centroid of the detected object at each scanning time to obtain the change relation of the centroid of the detected object relative to the scanning time.
As an example, the position parameter is a position parameter of a channel in which projection data is acquired.
As an example, the feature extraction module 620 may further include:
a feature determining module, configured to, when the device carrying the detected object is stationary while the detected object is scanned, take the variation relation of the centroid of the detected object with respect to scanning time as the motion feature of the centroid at each scanning time;
a time difference determining module, configured to, when the device carrying the detected object is in motion while the detected object is scanned, calculate the time difference between the determined maximum scanning time and the determined minimum scanning time;
a centroid accumulation module, configured to obtain the accumulation result of the centroid of the detected object over all scanning times;
an error calculation module, configured to calculate the ratio of the accumulation result to the time difference as the equipment error; and
an error correction module, configured to eliminate the equipment error from the variation relation to obtain a corrected variation relation.
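A minimal sketch of the equipment-error elimination described above, under the assumption that "eliminating the error from the variation relation" means subtracting the mean drift rate (accumulated centroid divided by the scan-time span) as a linear trend; the patent leaves the exact elimination step open.

```python
import numpy as np

def remove_equipment_error(times, centroids):
    """Compute the equipment error as (accumulation of centroids) /
    (max scan time - min scan time) and subtract it as a linear drift.
    The linear-drift reading is an assumption, not stated in the patent."""
    times = np.asarray(times, dtype=float)
    centroids = np.asarray(centroids, dtype=float)
    time_diff = times.max() - times.min()        # max minus min scan time
    equipment_error = centroids.sum() / time_diff  # accumulation / time difference
    corrected = centroids - equipment_error * (times - times.min())
    return corrected, equipment_error
```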
As an example, the feature extraction module 620 may further include:
an included angle determining module, configured to determine, based on the corrected variation relation, the included angle value between the main motion axis of the centroid of the detected object and the abscissa axis of the current coordinate system; and
a coordinate rotation module, configured to rotate the coordinates of the corrected variation relation by the included angle value to obtain the rotated variation relation.
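One common way to realize the two modules above is to take the main motion axis as the leading eigenvector of the 2-D covariance of the centroid trajectory, measure its angle to the x-axis, and rotate the trajectory by that angle; the eigen-decomposition choice is an assumption, since the patent does not specify how the main axis is found.

```python
import numpy as np

def rotate_to_main_axis(xs, ys):
    """Angle between the centroid's main motion axis and the abscissa axis
    (via the largest-eigenvalue eigenvector of the covariance), followed by
    a rotation that aligns that axis with the x-axis."""
    pts = np.stack([np.asarray(xs, float), np.asarray(ys, float)])  # (2, N)
    centered = pts - pts.mean(axis=1, keepdims=True)
    cov = centered @ centered.T
    eigvals, eigvecs = np.linalg.eigh(cov)       # eigenvalues in ascending order
    main_axis = eigvecs[:, np.argmax(eigvals)]
    angle = np.arctan2(main_axis[1], main_axis[0])
    c, s = np.cos(-angle), np.sin(-angle)        # rotate by -angle onto x-axis
    rot = np.array([[c, -s], [s, c]])
    return rot @ pts, angle
```

For a trajectory along the 45-degree diagonal, the rotated y-coordinates collapse to zero.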
In other examples, the data correction module 630 may include:
a motion degree determining module, configured to determine the motion intensity of the centroid of the detected object at each scanning position based on the correspondence between scanning time and scanning position and on the motion feature;
a reconstruction weight determining module, configured to determine the reconstruction weight of the projection data corresponding to the detected object at each scanning position according to the motion intensity of the centroid at that position, where a scanning position with more intense centroid motion is assigned a smaller reconstruction weight; and
a data correction submodule, configured to multiply the projection data of each scanning position by the corresponding reconstruction weight to obtain the corrected projection data.
As an example, the motion degree determining module is configured to:
convert the motion feature into the motion relation of the centroid of the detected object with respect to each scanning position according to the correspondence between scanning time and scanning position; and
calculate the motion intensity of the centroid of the detected object at each scanning position based on that motion relation.
As another example, the motion degree determining module is configured to:
calculate the motion intensity of the centroid of the detected object at each scanning time based on the motion feature; and
convert the motion intensity of the centroid of the detected object at each scanning time into the motion intensity at each scanning position based on the correspondence between scanning time and scanning position.
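The intensity-to-weight-to-correction chain described by the data correction module can be sketched as follows. The intensity measure (magnitude of the centroid's rate of change) and the inverse weight form are illustrative assumptions; the patent only requires that more intense motion receive a smaller weight.

```python
import numpy as np

def motion_intensity(times, centroids):
    """Motion intensity per scanning time as |d(centroid)/dt|
    (an assumed measure; the patent leaves it open)."""
    return np.abs(np.gradient(np.asarray(centroids, float),
                              np.asarray(times, float)))

def reconstruction_weights(intensity):
    """Smaller weight where the centroid moves more intensely;
    this inverse form is one illustrative choice."""
    return 1.0 / (1.0 + intensity)

def correct_projections(projections, weights):
    """Multiply each scanning position's projection data by its weight."""
    return np.asarray(projections, float) * weights[:, None]
```

With a stationary centroid the intensity is zero everywhere, the weights are all one, and the projection data passes through unchanged; with increasing motion the weights decrease monotonically.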
The above device embodiments correspond to the method embodiments described earlier, so their details are not repeated here.
Other embodiments of the present application will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure herein. This application is intended to cover any variations, uses, or adaptations of the invention that follow, in general, the principles of the application, including such departures from the present disclosure as come within known or customary practice in the art to which the invention pertains. It is intended that the specification and examples be considered as exemplary only, with the true scope and spirit of the application being indicated by the following claims.
It will be understood that the present application is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the application is limited only by the appended claims.

Claims (16)

1. A method of correcting motion artifacts, the method comprising:
acquiring projection data of a detected object;
mapping the motion characteristics of the centroid of the detected object changing along with the scanning time by tracking the centroid of the acquired projection data;
correcting projection data with a motion feature in the acquired projection data based on the motion feature, wherein the projection data with the motion feature comprises: the data obtained by scanning the detected object when the center of mass of the detected object moves;
and performing image reconstruction according to the corrected projection data to obtain a reconstructed image of the detected object;
wherein the correcting of the projection data with the motion feature in the acquired projection data based on the motion feature comprises:
determining the movement intensity of the centroid of the detected object at each scanning position based on the corresponding relation between the scanning time and the scanning position and the movement characteristics;
determining reconstruction weights of projection data corresponding to the detected object at each scanning position according to the movement intensity of the centroid of the detected object at each scanning position, wherein the reconstruction weights of the corresponding projection data are smaller for the scanning position with the more intense centroid movement;
and acquiring the product of the projection data of each scanning position and the corresponding reconstruction weight to obtain corrected projection data.
2. The method of claim 1, wherein the acquired projection data comprises a plurality of sets of projection data, one set of projection data is data obtained by scanning the object once, and the mapping the motion characteristics of the centroid of the object to be detected, which changes with scanning time, by tracking the centroid of the acquired projection data comprises:
determining position parameters and scanning time corresponding to each set of projection data;
converting each set of projection data and the corresponding position parameters into the centroid of the set of projection data at the corresponding scanning time;
mapping the scanning time corresponding to each set of projection data and the centroid thereof at the scanning time to the centroid of the detected object at the corresponding scanning time;
and performing data fitting on the centroid of the detected object at each scanning time to obtain the variation relation of the centroid of the detected object relative to the scanning time.
3. The method of claim 2, wherein the location parameter is a location parameter of a channel acquiring projection data.
4. The method of claim 2, wherein mapping motion characteristics of the centroid of the inspected object as a function of scan time by tracking the centroid of the acquired projection data further comprises:
calculating a time difference between the determined maximum scan time and the determined minimum scan time if a device carrying the object is in motion during the object being scanned;
acquiring an accumulation result of the centroid of the detected object at each scanning time;
calculating the ratio of the accumulation result to the time difference as an equipment error;
and eliminating the equipment error from the change relation to obtain a corrected change relation.
5. The method of claim 4, wherein mapping motion characteristics of the centroid of the inspected object as a function of scan time by tracking the centroid of the acquired projection data, further comprises:
determining, based on the corrected change relationship, the included angle value between the main motion axis of the centroid of the detected object and the abscissa axis of the current coordinate system;
and carrying out coordinate rotation on the corrected change relationship according to the included angle value to obtain the rotated change relationship.
6. The method according to claim 1, wherein the determining the intensity of the motion of the centroid of the detected object at each scanning position based on the correspondence between the scanning time and the scanning position and the motion feature comprises:
converting the motion characteristics into the motion relation of the centroid of the detected object relative to each scanning position according to the corresponding relation between the scanning time and the scanning position;
and calculating the movement intensity of the center of mass of the detected object at each scanning position based on the movement relation.
7. The method according to claim 1, wherein the determining the intensity of the motion of the centroid of the detected object at each scanning position based on the correspondence between the scanning time and the scanning position and the motion feature comprises:
calculating the intensity of the motion of the center of mass of the detected object at each scanning time based on the motion characteristics;
and converting the movement intensity of the center of mass of the detected object at each scanning time into the movement intensity of the center of mass of the detected object at each scanning position based on the corresponding relation between the scanning time and the scanning position.
8. An apparatus for correcting motion artifacts, comprising:
the data acquisition module is used for acquiring projection data of the detected object;
the characteristic extraction module is used for mapping the motion characteristic of the centroid of the detected object changing along with the scanning time by tracking the centroid of the acquired projection data;
a data correction module, configured to correct projection data with a motion feature in the acquired projection data based on the motion feature, where the projection data with the motion feature includes: the data obtained by scanning the detected object when the center of mass of the detected object moves;
the image reconstruction module is used for reconstructing an image according to the corrected projection data to obtain a reconstructed image of the detected object;
the data correction module includes:
the motion degree determining module is used for determining the motion intensity degree of the centroid of the detected object at each scanning position based on the corresponding relation between the scanning time and the scanning position and the motion characteristics;
the reconstruction weight determining module is used for determining the reconstruction weight of the projection data corresponding to the detected object at each scanning position according to the movement intensity of the centroid of the detected object at each scanning position, wherein the reconstruction weight of the corresponding projection data is smaller for the scanning position with more intense centroid movement;
and the data correction submodule is used for acquiring the product of the projection data of each scanning position and the corresponding reconstruction weight to obtain the corrected projection data.
9. The apparatus of claim 8, wherein the acquired projection data comprises a plurality of sets of projection data, and one set of projection data is data obtained by scanning the object once, and the feature extraction module comprises:
the parameter determining module is used for determining position parameters and scanning time corresponding to each group of projection data;
the centroid calculation module is used for converting each group of projection data and the corresponding position parameters into the centroid of the group of projection data at the corresponding scanning time;
the centroid mapping module is used for mapping the scanning time corresponding to each group of projection data and the centroid of the projection data at the scanning time to the centroid of the detected object at the corresponding scanning time;
and the first fitting module is used for performing data fitting on the centroid of the detected object at each scanning time to obtain the change relation of the centroid of the detected object relative to the scanning time.
10. The apparatus of claim 9, wherein the position parameter is a position parameter of a channel in which projection data is acquired.
11. The apparatus of claim 9, wherein the feature extraction module further comprises:
the time difference determining module is used for calculating the time difference between the determined maximum scanning time and the determined minimum scanning time when the equipment bearing the detected object is in a motion state during the period that the detected object is scanned;
the centroid accumulation module is used for acquiring the accumulation result of the centroid of the detected object at each scanning time;
the error calculation module is used for calculating the ratio of the accumulated result to the time difference as an equipment error;
and the error correction module is used for eliminating the equipment error from the change relation to obtain a corrected change relation.
12. The apparatus of claim 11, wherein the feature extraction module further comprises:
the included angle determining module is used for determining, based on the corrected change relationship, the included angle value between the main motion axis of the centroid of the detected object and the abscissa axis of the current coordinate system;
and the coordinate rotation module is used for rotating the coordinates of the corrected change relationship according to the included angle value to obtain the rotated change relationship.
13. The apparatus of claim 8, wherein the degree of motion determination module is configured to:
converting the motion characteristics into the motion relation of the centroid of the detected object relative to each scanning position according to the corresponding relation between the scanning time and the scanning position;
and calculating the movement intensity of the center of mass of the detected object at each scanning position based on the movement relation.
14. The apparatus of claim 8, wherein the degree of motion determination module is configured to:
calculating the intensity of the motion of the center of mass of the detected object at each scanning time based on the motion characteristics;
and converting the movement intensity of the center of mass of the detected object at each scanning time into the movement intensity of the center of mass of the detected object at each scanning position based on the corresponding relation between the scanning time and the scanning position.
15. A computer device, comprising:
a processor;
a memory storing processor-executable instructions;
wherein the processor is coupled to the memory for reading program instructions stored by the memory and, in response, performing operations in the method of any of claims 1-7.
16. One or more machine-readable storage media having instructions stored thereon, which when executed by one or more processors, cause a computer device to perform operations in a method as recited in any of claims 1-7.
CN201810509389.0A 2018-05-24 2018-05-24 Method, device and equipment for correcting motion artifact and storage medium Active CN108876730B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810509389.0A CN108876730B (en) 2018-05-24 2018-05-24 Method, device and equipment for correcting motion artifact and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810509389.0A CN108876730B (en) 2018-05-24 2018-05-24 Method, device and equipment for correcting motion artifact and storage medium

Publications (2)

Publication Number Publication Date
CN108876730A CN108876730A (en) 2018-11-23
CN108876730B true CN108876730B (en) 2022-03-04

Family

ID=64333818

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810509389.0A Active CN108876730B (en) 2018-05-24 2018-05-24 Method, device and equipment for correcting motion artifact and storage medium

Country Status (1)

Country Link
CN (1) CN108876730B (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110443866B (en) * 2019-07-31 2023-05-30 沈阳智核医疗科技有限公司 Image reconstruction method, device, terminal equipment and PET system
CN110473269B (en) * 2019-08-08 2023-05-26 上海联影医疗科技股份有限公司 Image reconstruction method, system, equipment and storage medium
EP4111418A4 (en) * 2020-02-28 2023-09-13 Shanghai United Imaging Healthcare Co., Ltd. Systems and methods for correcting motion artifacts in images
CN111544020B (en) * 2020-04-17 2023-08-01 北京东软医疗设备有限公司 Geometric correction method and device for X-ray imaging equipment
CN111462168B (en) * 2020-04-22 2023-09-19 上海联影医疗科技股份有限公司 Motion parameter estimation method and motion artifact correction method
CN111528825A (en) * 2020-05-14 2020-08-14 浙江大学 Photoelectric volume pulse wave signal optimization method

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101454801A (en) * 2006-02-28 2009-06-10 皇家飞利浦电子股份有限公司 Local motion compensation based on list mode data
CN101936720A (en) * 2010-07-30 2011-01-05 北京航空航天大学 Method for calibrating detector torsion angle applied to cone-beam XCT system
CN202049120U (en) * 2011-03-04 2011-11-23 首都师范大学 System for eliminating geometric artifacts in CT (computed tomography) image
CN102317975A (en) * 2009-02-17 2012-01-11 皇家飞利浦电子股份有限公司 Functional imaging
CN102426696A (en) * 2011-10-24 2012-04-25 西安电子科技大学 Optical projection tomography motion artifact correction method
CN102652674A (en) * 2011-03-04 2012-09-05 首都师范大学 Method and system for eliminating geometrical artifacts in CT (Computerized Tomography) image
CN203776924U (en) * 2014-03-06 2014-08-20 北京锐视康科技发展有限公司 Calibration device for geometric position of cone-beam CT (computed tomography) system
CN106251380A (en) * 2016-07-29 2016-12-21 上海联影医疗科技有限公司 Image rebuilding method

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9482732B2 (en) * 2012-11-08 2016-11-01 Nicolas Chesneau MRI reconstruction with motion-dependent regularization
JP6273241B2 (en) * 2015-09-24 2018-01-31 ジーイー・メディカル・システムズ・グローバル・テクノロジー・カンパニー・エルエルシー Radiation tomography method, apparatus and program
CN107303184B (en) * 2016-04-22 2020-09-15 上海联影医疗科技有限公司 CT scanning X-ray source tube current modulation method and computed tomography device


Also Published As

Publication number Publication date
CN108876730A (en) 2018-11-23

Similar Documents

Publication Publication Date Title
CN108876730B (en) Method, device and equipment for correcting motion artifact and storage medium
US7391844B2 (en) Method and apparatus for correcting for beam hardening in CT images
US7747057B2 (en) Methods and apparatus for BIS correction
US7751524B2 (en) X-ray computed tomography apparatus
US8666137B2 (en) Apparatus and method for processing projection data
CN110533597B (en) Artifact processing method, artifact processing device, rotation center determining device, artifact processing equipment and storage medium
US9420986B2 (en) X-ray CT apparatus and X-ray CT image processing method
JP2010527741A (en) Method and system for facilitating correction of gain variation in image reconstruction
IL158197A (en) Methods and apparatus for truncation compensation
US9592021B2 (en) X-ray CT device, and method
CN107945850B (en) Method and device for processing medical images
CN110751702A (en) Image reconstruction method, system, device and storage medium
KR20170088681A (en) Tomography apparatus and method for reconstructing a tomography image thereof
AU2019271915A1 (en) Method and system for motion correction in CT imaging
US11341638B2 (en) Medical image diagnostic system and method for generating trained model
CN110866959B (en) Image reconstruction method, system, device and storage medium
US9858688B2 (en) Methods and systems for computed tomography motion compensation
US20200240934A1 (en) Tomography apparatus and controlling method for the same
US20190180481A1 (en) Tomographic reconstruction with weights
US20060126779A1 (en) Method and system for efficient helical cone-beam reconstruction
US8526757B2 (en) Imaging system and imaging method for imaging a region of interest
CN113520432A (en) Gating method suitable for tomography system
JP5171474B2 (en) Tomographic image processing apparatus, X-ray CT apparatus, and program
US20230145523A1 (en) Medical image processing apparatus, x-ray ct apparatus, medical image processing method and non-volatile storage medium storing program
CN113229840B (en) Oral CBCT (cone beam computed tomography) shot image motion compensation reconstruction method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: 110167 No. 177-1 Innovation Road, Hunnan District, Shenyang City, Liaoning Province

Applicant after: Neusoft Medical Systems Co., Ltd.

Address before: 110167 No. 177-1 Innovation Road, Hunnan District, Shenyang City, Liaoning Province

Applicant before: Shenyang Neusoft Medical Systems Co., Ltd.

GR01 Patent grant