CN110751702B - Image reconstruction method, system, device and storage medium - Google Patents

Image reconstruction method, system, device and storage medium

Info

Publication number
CN110751702B
Authority
CN
China
Prior art keywords
image
projection data
initial
reconstruction
value
Prior art date
Legal status
Active
Application number
CN201911040232.9A
Other languages
Chinese (zh)
Other versions
CN110751702A (en)
Inventor
曹文静
Current Assignee
Shanghai United Imaging Healthcare Co Ltd
Original Assignee
Shanghai United Imaging Healthcare Co Ltd
Priority date
Filing date
Publication date
Application filed by Shanghai United Imaging Healthcare Co Ltd filed Critical Shanghai United Imaging Healthcare Co Ltd
Priority to CN201911040232.9A priority Critical patent/CN110751702B/en
Publication of CN110751702A publication Critical patent/CN110751702A/en
Priority to US17/082,128 priority patent/US11776170B2/en
Application granted granted Critical
Publication of CN110751702B publication Critical patent/CN110751702B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 11/00 2D [Two Dimensional] image generation
    • G06T 11/003 Reconstruction from projections, e.g. tomography
    • G06T 11/005 Specific pre-processing for tomographic reconstruction, e.g. calibration, source positioning, rebinning, scatter correction, retrospective gating
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 11/00 2D [Two Dimensional] image generation
    • G06T 11/003 Reconstruction from projections, e.g. tomography
    • G06T 11/006 Inverse problem, transformation from projection-space into object-space, e.g. transform methods, back-projection, algebraic methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/10 Segmentation; Edge detection
    • G06T 7/11 Region-based segmentation
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/20 Analysis of motion
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10072 Tomographic images
    • G06T 2207/10081 Computed x-ray tomography [CT]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2210/00 Indexing scheme for image generation or computer graphics
    • G06T 2210/41 Medical
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2211/00 Image generation
    • G06T 2211/40 Computed tomography
    • G06T 2211/412 Dynamic
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2211/00 Image generation
    • G06T 2211/40 Computed tomography
    • G06T 2211/424 Iterative
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2211/00 Image generation
    • G06T 2211/40 Computed tomography
    • G06T 2211/436 Limited angle

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Mathematical Optimization (AREA)
  • Mathematical Physics (AREA)
  • Pure & Applied Mathematics (AREA)
  • Mathematical Analysis (AREA)
  • Algebra (AREA)
  • Image Processing (AREA)
  • Apparatus For Radiation Diagnosis (AREA)

Abstract

The embodiments of this application disclose an image reconstruction method. The method may include at least one of the following operations: acquiring a set of raw projection data and at least one prior image associated with the set of raw projection data; determining, based on the set of raw projection data, a constraint term coefficient in a target reconstruction function corresponding to each prior image; and obtaining a target reconstructed image corresponding to the raw projection data, based on the set of raw projection data and the at least one prior image, using the target reconstruction function with the determined constraint term coefficients. The image reconstruction method disclosed in the embodiments of this application can improve the quality of images reconstructed at a limited angle.

Description

Image reconstruction method, system, device and storage medium
Technical Field
The embodiments of this application relate to the field of data processing, and in particular to an image reconstruction method, system, device, and storage medium.
Background
Image reconstruction techniques are widely used today for disease diagnosis. For the reconstruction of certain scan objects, such as CT image reconstruction of the heart, temporal resolution is an important indicator. Because the heart is in motion, obtaining a relatively stationary cardiac tomogram requires limiting the scan data used for reconstruction to a short time period, which corresponds to a small scan angle range. The shorter the time period, the higher the temporal resolution and the fewer the motion artifacts in the reconstructed image; that is, a higher temporal resolution is desirable. With a traditional image reconstruction algorithm, reconstructing over a large scan angle range greatly reduces the temporal resolution of the image; using scan data corresponding to a smaller scan angle range can improve the temporal resolution, but the insufficient scan data degrades the overall image quality. Therefore, it is desirable to provide an image reconstruction method and system that can reconstruct from scan data over a small angle range, improving temporal resolution while overcoming the problems caused by insufficient data.
Disclosure of Invention
One aspect of the embodiments of this application provides an image reconstruction method. The method may include at least one of the following operations. A set of raw projection data may be acquired, together with at least one prior image associated with the set of raw projection data. Based on the set of raw projection data, a constraint term coefficient corresponding to each prior image in a target reconstruction function may be determined. Based on the set of raw projection data and the at least one prior image, a target reconstructed image corresponding to the raw projection data may be obtained using the target reconstruction function with the determined constraint term coefficients.
Another aspect of the embodiments of this application provides an image reconstruction system. The system comprises an acquisition module, a determination module, and a reconstruction module. The acquisition module may be configured to acquire a set of raw projection data and at least one prior image, the prior image being related to the set of raw projection data. The determination module may be configured to determine the constraint term coefficient in the target reconstruction function corresponding to each prior image, based on the set of raw projection data. The reconstruction module may be configured to obtain a target reconstructed image corresponding to the raw projection data, based on the set of raw projection data and the at least one prior image, using the target reconstruction function with the determined constraint term coefficients.
Another aspect of an embodiment of the present application provides an image reconstruction apparatus. The apparatus includes a processor and a memory. The memory may be used to store instructions. The instructions, when executed by the processor, may cause the apparatus to implement the image reconstruction method as described above.
Another aspect of the embodiments of this application provides a computer-readable storage medium. The storage medium may store computer instructions that, when read by a computer, cause the computer to perform the image reconstruction method described above.
Drawings
The present application is further described by way of exemplary embodiments, which are described in detail with reference to the accompanying drawings. These embodiments are not limiting; in the drawings, like numerals represent like structures:
FIG. 1 is a schematic diagram of an exemplary image reconstruction system shown in accordance with some embodiments of the present application;
FIG. 2 is an exemplary flow chart of an image reconstruction method shown in accordance with some embodiments of the present application;
FIG. 3 is an exemplary flow chart for determining constraint term coefficients according to some embodiments of the present application;
FIG. 4 is an exemplary flow chart for determining a mask matrix according to some embodiments of the present application;
FIG. 5 is an exemplary flowchart illustrating another determination of a mask matrix according to some embodiments of the present application;
FIG. 6 is a block diagram of an exemplary processing device shown in accordance with some embodiments of the present application.
Detailed Description
To more clearly illustrate the technical solutions of the embodiments of this application, the drawings used in the description of the embodiments are briefly introduced below. Evidently, the drawings described below are merely some examples or embodiments of this application, and those of ordinary skill in the art may apply this application to other similar situations according to these drawings without inventive effort. Unless otherwise apparent from the language context or otherwise specified, like reference numerals in the figures refer to like structures or operations.
It should be appreciated that "system," "apparatus," "unit," and/or "module" as used in this application are ways of distinguishing different components, elements, parts, portions, or assemblies at different levels. However, these words may be replaced by other expressions that serve the same purpose.
As used in this application and in the claims, the singular forms "a," "an," and "the" may include plural referents unless the context clearly dictates otherwise. In general, the terms "comprise" and "include" merely indicate that explicitly identified steps and elements are included; they do not constitute an exclusive list, and a method or apparatus may also include other steps or elements.
Flowcharts are used in this application to describe the operations performed by systems according to embodiments of this application. It should be appreciated that these operations are not necessarily performed precisely in the order shown; steps may instead be processed in reverse order or simultaneously, and other operations may be added to or removed from these processes.
The image reconstruction methods disclosed herein may be applied to a variety of medical scanning imaging devices, including but not limited to a computed radiography (CR) device, a digital radiography (DR) device, a computed tomography (CT) scanner, a screen X-ray machine, a mobile X-ray device (such as a mobile C-arm machine), a digital subtraction angiography (DSA) scanner, a linear accelerator (linac), an emission computed tomography (ECT) device, and the like, or any combination thereof. For illustrative purposes only, this application takes a CT system as an example to describe the disclosed technical solutions in detail, without intending to limit the scope of the invention. In this application, the term "angle" may refer to a scan angle range and/or the scan data corresponding to that range; the terms "angle" and "angular range" may be used interchangeably.
CT (computed tomography) technology is widely used in the medical field and serves as an important reference for disease diagnosis and treatment. Among CT reconstruction algorithms at limited angles, conventional analytical algorithms such as the filtered back projection (FBP) algorithm place very high requirements on data completeness, so it is difficult for analytical algorithms to reconstruct high-quality images at limited angles; here, iterative reconstruction (IR) algorithms have a clear advantage. An iterative algorithm is a numerical approximation algorithm: starting from an initial value of the tomographic image, it repeatedly corrects an estimate of the image so that the estimate gradually approaches the true value of the tomographic image. Common iterative reconstruction algorithms include the algebraic reconstruction technique (ART), the simultaneous algebraic reconstruction technique (SART), the maximum likelihood expectation maximization (ML-EM) algorithm, etc. In the prior art, improved ART algorithms have achieved better reconstruction results under sparse (wider) angle conditions, but the images reconstructed at a limited (smaller) angle are still unsatisfactory, and as the scan angle decreases, the reconstruction results exhibit artifacts.
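As a concrete illustration of the iterative family mentioned above, the following sketch implements a minimal ART (Kaczmarz-style) update on a toy linear system AX = Y. The matrix sizes, relaxation factor, and iteration count are illustrative assumptions, not parameters from this patent.

```python
import numpy as np

def art_reconstruct(A, y, n_iters=200, lam=1.0):
    """Minimal ART (Kaczmarz) iteration: sweep the rows of A,
    correcting the estimate x so that each ray sum a_i . x moves
    toward its observed projection value y_i."""
    x = np.zeros(A.shape[1])
    for _ in range(n_iters):
        for i in range(A.shape[0]):
            a_i = A[i]
            denom = a_i @ a_i
            if denom > 0:
                x += lam * (y[i] - a_i @ x) / denom * a_i
    return x

# Toy example: 4 "rays" through a 3-pixel object.
A = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 0.0, 1.0],
              [1.0, 1.0, 1.0]])
x_true = np.array([1.0, 2.0, 3.0])
y = A @ x_true                      # simulated (consistent) projections
x_rec = art_reconstruct(A, y)
```

With consistent data and enough sweeps, the estimate converges to the true object; with limited-angle (incomplete) rows, this is exactly where artifacts appear and prior information becomes useful.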
In the related art, a large-angle image (for example, an image reconstructed using scan projection data over a larger scan angle range such as 360°) may be used as a prior image for a small-angle iterative reconstruction (for example, an iterative reconstruction using scan projection data over a smaller scan angle range such as 120°, 150°, or 180°), the prior image compensating for the insufficient data of the small-angle reconstruction. However, experiments suggest that this way of improving temporal resolution via prior image constrained compressed sensing (PICCS) is not ideal.
As an example, an objective function for cardiac image reconstruction may be expressed by the following formula:
$$\hat{X} = \underset{X}{\arg\min}\; \|AX - Y\|_{W}^{2} + \alpha\, R\!\left(X - X_{prior}\right)$$
where X is the reconstructed image to be solved, X_prior is the prior image, Y is the raw data acquired by the CT device, A is the system matrix, W is the weighting in the raw data domain, R is a regularization function, and α is a parameter controlling the degree of regularization.
For cardiac reconstruction, a small-angle reconstruction is required to obtain a high temporal resolution image. To address the insufficient angular coverage when reconstructing at a smaller angle, a large-angle image can be used as the prior image for the small-angle iterative reconstruction. However, the temporal resolution of the large-angle reconstructed image is low, and if the entire prior image is introduced into the reconstruction, the image blurring caused by its low temporal resolution carries the regions of intense motion into the final reconstructed image. As can also be seen from the above formula, α is a global coefficient: it takes the same value for all regions to be reconstructed. The prior image is reconstructed over a wide time window (which can be understood as a large angular range); its CT values are relatively accurate and free of artifacts, but its temporal resolution is poor, and moving regions, such as blood vessels, are prone to blurring. Since α is a global coefficient, its value cannot be too small, so the effects of the prior image's poor temporal resolution are carried into the final reconstructed image, and the temporal resolution of PICCS is therefore not ideal. To reconstruct a good-quality image at a limited angle, this application proposes an image reconstruction algorithm that improves the temporal resolution of the reconstructed image.
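To make the role of the global coefficient α concrete, the sketch below evaluates a simplified form of the objective above, assuming a diagonal W and taking R as a squared L2 norm; these simplifications, and the toy operator A, are illustrative assumptions rather than the patent's exact formulation.

```python
import numpy as np

def objective_global_alpha(x, x_prior, A, y, w, alpha):
    """Simplified PICCS-style objective with one global alpha:
    weighted data fidelity plus a prior-image constraint applied
    with identical strength at every pixel (R = squared L2 norm)."""
    residual = A @ x - y
    fidelity = residual @ (w * residual)          # (AX - Y)^T W (AX - Y), W diagonal
    prior_term = alpha * np.sum((x - x_prior) ** 2)
    return fidelity + prior_term

A = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
y = np.array([1.0, 2.0, 3.0])
w = np.ones(3)
x_prior = np.array([1.0, 2.0])

# A global alpha pulls pixels in moving and static regions toward
# the prior with the same strength -- the limitation discussed above.
val_match = objective_global_alpha(np.array([1.0, 2.0]), x_prior, A, y, w, alpha=0.5)
val_moved = objective_global_alpha(np.array([1.5, 2.0]), x_prior, A, y, w, alpha=0.5)
```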
When a prior image is used in iterative reconstruction: in regions of intense motion of the target object, such as the heart, the small-angle reconstruction itself performs well while the prior image has poor temporal resolution, so the prior image is not used there; in regions of smooth motion, the small-angle reconstruction performs poorly due to the insufficient amount of data, so the prior image is used there. This effectively improves both the temporal resolution and the reconstruction quality of small-angle image reconstruction. The technical solutions disclosed in this application are set forth below with reference to the drawings.
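The region-dependent idea described above can be sketched with a per-pixel coefficient map replacing the single global α; the binary mask, its values, and the squared-difference penalty below are illustrative assumptions.

```python
import numpy as np

def prior_constraint(x, x_prior, alpha_map):
    """Per-pixel prior constraint: alpha_map is large where the prior
    image is trusted (smooth motion) and zero where it is not
    (intense motion), instead of one global alpha for all pixels."""
    return np.sum(alpha_map * (x - x_prior) ** 2)

x = np.array([[0.0, 0.0], [0.0, 0.0]])
x_prior = np.array([[1.0, 1.0], [1.0, 1.0]])
# Top row: smooth-motion region, use the prior; bottom row: intense
# motion, exclude the prior entirely.
alpha_map = np.array([[0.5, 0.5], [0.0, 0.0]])
penalty = prior_constraint(x, x_prior, alpha_map)  # only the top row contributes
```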
FIG. 1 is a schematic diagram of an exemplary image reconstruction system shown in accordance with some embodiments of the present application. In some embodiments, the image reconstruction system 100 may acquire scan projection data of a scan object and determine a reconstructed image of the scan object based on the acquired scan projection data. As shown in fig. 1, image reconstruction system 100 includes a scanning device 110, a network 120, one or more terminals 130, a processing device 140, and a storage device 150.
The scanning device 110 may include a gantry 111, a detector 112, a detection region 113, a scanning couch 114, and a radiation scanning source 115. The gantry 111 may support the detector 112 and the radiation scanning source 115, and may rotate, for example clockwise or counterclockwise, about a gantry rotation axis; the radiation scanning source 115 may rotate with the gantry 111. The target object may be placed on the scanning couch 114 to receive the scan. The radiation scanning source 115 may emit a radiation beam toward the target object. The detector 112 may detect the radiation beam (e.g., gamma photons) emitted from the detection region 113; after receiving the radiation beam that has passed through the target object, the detector 112 may convert it into visible light, then via photoelectric conversion into an electrical signal, and then via an analog-to-digital converter into digital information, which is input to a computing device (e.g., a computer) for processing, or transmitted to a storage device for storage. In some embodiments, the detector 112 may include one or more detection units. The detection units may include scintillation detectors (e.g., cesium iodide detectors) and other detectors, and may be and/or include a single row of detectors and/or multiple rows of detectors.
Processing device 140 may process data and/or information obtained from scanning device 110, terminal 130, and/or storage device 150. For example, the processing device 140 may determine scan data required for image reconstruction. In some embodiments, the processing device 140 may be used to process the read data, e.g., data preprocessing, image reconstruction, post-reconstruction processing, etc. In some embodiments, the processing device 140 may be a single server or a group of servers. The server group may be centralized or distributed. In some embodiments, the processing device 140 may be local or remote. For example, processing device 140 may access information and/or data from scanning device 110, terminal 130, and/or storage device 150 via network 120. As another example, processing device 140 may be directly connected to scanning device 110, terminal 130, and/or storage device 150 to access information and/or data. In some embodiments, the processing device 140 may be implemented on a cloud platform. For example, the cloud platform may include one or a combination of several of private cloud, public cloud, hybrid cloud, community cloud, distributed cloud, cross-cloud, multi-cloud, and the like.
The terminal 130 may include a mobile device 131, a tablet 132, a notebook 133, or the like, or any combination thereof. In some embodiments, the terminal 130 may interact with other components in the image reconstruction system 100 via the network. For example, the terminal 130 may send one or more control instructions to the scanning device 110 to control the scanning device 110 to scan as instructed. For another example, the terminal 130 may also receive processing results of the processing device 140, such as reconstructed images. In some embodiments, the mobile device 131 may include a smart home device, a wearable device, a mobile device, a virtual reality device, an augmented reality device, or the like, or any combination thereof. In some embodiments, the smart home devices may include smart lighting devices, smart appliance control devices, smart monitoring devices, smart televisions, smart cameras, intercoms, and the like, or any combination thereof. In some embodiments, the wearable device may include a bracelet, footwear, glasses, a helmet, a watch, clothing, a backpack, a smart accessory, or the like, or any combination thereof. In some embodiments, the mobile device may include a mobile phone, a personal digital assistant (PDA), a gaming device, a navigation device, a point-of-sale (POS) device, a notebook, a tablet, a desktop, etc., or any combination thereof. In some embodiments, the virtual reality device and/or augmented reality device may include a virtual reality helmet, virtual reality glasses, a virtual reality patch, an augmented reality helmet, augmented reality glasses, an augmented reality patch, or the like, or any combination thereof. For example, the virtual reality device and/or augmented reality device may include Google Glass™, Oculus Rift™, HoloLens™, Gear VR™, etc. In some embodiments, the terminal 130 may be part of the processing device 140. In some embodiments, the terminal 130 may be integrated with the processing device 140 as an operator console for the scanning device 110.
A user/operator (e.g., a physician) of the image reconstruction system 100 may control the operation of the scanning device 110 via the console and acquire the reconstructed images of the scan.
Storage device 150 may store data (e.g., scan data of a target object), instructions, and/or any other information. In some embodiments, the storage device 150 may store data obtained from the scanning device 110, the terminal 130, and/or the processing device 140; for example, the storage device 150 may store scan data of a scan object obtained from the scanning device 110. In some embodiments, the storage device 150 may store data and/or instructions that the processing device 140 may execute or use to perform the exemplary methods described herein. In some embodiments, the storage device 150 may include one or a combination of mass storage, removable storage, volatile read-write memory, read-only memory (ROM), and the like. Mass storage may include magnetic disks, optical disks, solid-state disks, removable storage, and the like. Removable storage may include flash drives, floppy disks, optical disks, memory cards, ZIP disks, tape, and the like. Volatile read-write memory may include random access memory (RAM). The RAM may include dynamic RAM (DRAM), double data rate synchronous dynamic RAM (DDR SDRAM), static RAM (SRAM), thyristor RAM (T-RAM), zero-capacitor RAM (Z-RAM), etc. The ROM may include mask ROM (MROM), programmable ROM (PROM), erasable programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), compact disc ROM (CD-ROM), digital versatile discs, and the like. In some embodiments, the storage device 150 may be implemented on a cloud platform as described herein. For example, the cloud platform may include one or a combination of a private cloud, a public cloud, a hybrid cloud, a community cloud, a distributed cloud, a cross-cloud, a multi-cloud, and the like.
In some embodiments, the storage device 150 may be connected to the network 120 to enable communication with one or more components (e.g., the processing device 140, the terminal 130, etc.) in the image reconstruction system 100. One or more components in image reconstruction system 100 may read data or instructions in storage device 150 through network 120. In some embodiments, the storage device 150 may be part of the processing device 140 or may be separate and directly or indirectly connected to the processing device.
Network 120 may include any suitable network capable of facilitating the exchange of information and/or data for the image reconstruction system 100. In some embodiments, one or more components of the image reconstruction system 100 (e.g., the scanning device 110, the terminal 130, the processing device 140, the storage device 150, etc.) may exchange information and/or data with one or more other components of the system via the network 120. For example, the processing device 140 may obtain planning data from a data processing planning system via the network 120. The network 120 may include one or a combination of a public network (e.g., the Internet), a private network (e.g., a local area network (LAN) or a wide area network (WAN)), a wired network (e.g., Ethernet), a wireless network (e.g., an 802.11 network or a Wi-Fi network), a cellular network (e.g., a Long Term Evolution (LTE) network), a frame relay network, a virtual private network (VPN), a satellite network, a telephone network, routers, hubs, server computers, and the like. For example, the network 120 may include a wired network, a fiber-optic network, a telecommunications network, a local area network, a wireless local area network (WLAN), a metropolitan area network (MAN), a public switched telephone network (PSTN), a Bluetooth™ network, a ZigBee™ network, a near field communication (NFC) network, or the like. In some embodiments, the network 120 may include one or more network access points. For example, the network 120 may include wired and/or wireless network access points, such as base stations and/or Internet exchange points, through which one or more components of the image reconstruction system 100 may connect to the network 120 to exchange data and/or information.
FIG. 2 is an exemplary flow chart of an image reconstruction method according to some embodiments of the present application. In some embodiments, one or more steps of method 200 may be implemented in the system 100 shown in FIG. 1. For example, one or more steps of method 200 may be stored in the storage device 150 as instructions and invoked and/or executed by the processing device 140.
Step 210: a set of raw projection data and at least one prior image are acquired, the prior image being related to the set of raw projection data. Step 210 may be performed by the acquisition module 610.
In some embodiments, the raw projection data may be scan projection data acquired by a detection device, such as the detector 112 shown in FIG. 1, which may be used for image reconstruction. For example, the scanning device 110 scans the target object using a collimated X-ray beam; the detector 112 receives the X-rays transmitted through the object plane, converts them into visible light, converts the visible light into an electrical signal by photoelectric conversion, and converts the resulting signal into digital data by an analog-to-digital converter. The target object may include a patient or another medical subject (e.g., another animal such as a laboratory mouse), or a part of a patient or other medical subject, including organs and/or tissues, e.g., the heart, lungs, ribs, abdominal cavity, etc. Referring to FIG. 1, it will be appreciated that during a scan the radiation scanning source 115 rotates about the scan object; based on the scanning protocol, the radiation scanning source 115 may emit a radiation beam at a particular instant in time, while the detector 112 detects the beam passing through the target object, such as a patient, to obtain scan projection data. Each such instant is referred to as a data acquisition instant. Because the radiation scanning source 115 is moving, each data acquisition instant may correspond to one data acquisition angle, which may be understood as the rotation angle of the radiation scanning source 115 (or of the gantry 111) at the time of data acquisition, such as 10°, 15°, 90°, 180°, or the like. Thus, there may be one piece of scan projection data for each data acquisition angle. It follows from the above that the scan angle range may be the range of angles through which the radiation scanning source 115 rotates during a scan.
The set of raw projection data may refer to the set of scan projection data corresponding to all data acquisition angles within a scan angle range. In some embodiments, the selection of the set of raw projection data may be arbitrary. For example, the radiation scanning source 115 rotates 360° while scanning a target object, and the set of raw projection data may be the scan projection data corresponding to any angular range within those 360°, such as 90° to 240°. In some embodiments, the selection of the set of raw projection data may be determined based on a particular target object. For example, if the target object is a moving part such as the patient's heart, lungs, ribs, or abdominal cavity, the data acquisition instant corresponding to a target phase of the target object (for example, when the heart is in a relatively gentle state of motion) may first be determined, and the corresponding data acquisition angle then obtained from that instant. The scan projection data over a scan angle range centered at that data acquisition angle is selected as the set of raw projection data. In some embodiments, the set of raw projection data may be acquired directly from a scanning device that scans the target object, such as the scanning device 110, or may be acquired from a storage device (e.g., the storage device 150) or a medical database via a network (e.g., the network 120).
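The selection of raw projection data centered on a target-phase acquisition angle can be sketched as follows; the one-projection-per-degree layout, the window width, and the detector channel count are illustrative assumptions.

```python
import numpy as np

def select_projections(angles_deg, projections, center_deg, half_width_deg):
    """Keep only the projections whose data acquisition angle lies in
    [center - half_width, center + half_width], i.e. a scan angle
    range centered on the target-phase acquisition angle."""
    angles_deg = np.asarray(angles_deg)
    mask = np.abs(angles_deg - center_deg) <= half_width_deg
    return angles_deg[mask], projections[mask]

# Toy data: one projection per degree over a full 360° rotation.
angles = np.arange(0.0, 360.0)
projections = np.random.rand(360, 16)      # 16 detector channels per view
sel_angles, sel_proj = select_projections(
    angles, projections, center_deg=165.0, half_width_deg=75.0)
# sel_angles spans 90° to 240°, matching the example in the text.
```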
In some embodiments, the prior image is associated with the set of raw projection data and may be a reconstructed image of the target object derived using a larger data set that includes the set of raw projection data. Continuing the example above, assume the radiation scanning source 115 rotates 360° while scanning a target object and the set of raw projection data corresponds to the angular range of 90° to 240°. The prior image may then be an image reconstructed using the scan projection data of a larger range that includes 90° to 240°, for example 90° to 270°, 60° to 300°, or 0° to 360°. Because it uses scan projection data over a larger angular range, the prior image may carry more information about the target object. In some embodiments, the prior image may be obtained using existing image reconstruction algorithms. These may include an iterative reconstruction algorithm (e.g., a statistical reconstruction algorithm), a Fourier slice theorem algorithm, a fan-beam reconstruction algorithm, an analytical reconstruction algorithm (e.g., a filtered back projection (FBP) algorithm), an algebraic reconstruction technique (ART), a simultaneous algebraic reconstruction technique (SART), a Feldkamp-Davis-Kress (FDK) reconstruction algorithm, or the like, or any combination thereof.
Step 220, determining constraint term coefficients corresponding to each prior image in the target reconstruction function based on the set of raw projection data. Step 220 may be performed by determination module 620.
In some embodiments, the target reconstruction function may be expressed as the following function:
X̂ = argmin_X [ (AX - Y)ᵀ W (AX - Y) + β·R(X) + α·‖X - X_p‖² ]

where X_p denotes the prior image.
where X is the reconstructed image to be solved, Y is the raw data acquired by the system (e.g., the set of raw projection data), A is the system matrix, W is the weight data of the raw data domain added according to a noise statistical model, R is a regularization function that suppresses the overall noise of the reconstructed image during the iterative process, β is the regularization coefficient of the image, a preset value that can also be adjusted for different conditions and that balances projection deviation against image smoothness, and α is the constraint term coefficient of the prior image, used to control the degree of regularization and the portion of the prior image introduced into the iterative reconstruction. In some embodiments, the W weight data may be determined from actual noise conditions. For example, the W weight data may be calculated from the quantum noise under the scanning parameters currently set on the scanning device 110. The regularization coefficient β can be a constant and can be selected within a set value range; for example, any one of the constants may be selected in a traversal manner. The constraint term coefficient α may be a constant or a matrix. In some embodiments, the constraint term coefficient α may be a matrix whose determination is related to the motion of the target object.
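As an illustration of the terms just described, a minimal numerical sketch of evaluating such an objective is given below. It assumes a diagonal weight matrix W (stored as a vector), a simple finite-difference quadratic standing in for R(X), and a 1-D image for brevity; all names (objective, x_prior, etc.) are illustrative and not from the patent.

```python
import numpy as np

# Sketch of the target reconstruction function: weighted data fidelity
# + regularization + prior-image constraint term. A, W, R here are
# simplified stand-ins for the patent's system matrix, noise weights,
# and regularization function.
def objective(x, A, y, w, beta, alpha, x_prior):
    r = A @ x - y                                     # projection residual AX - Y
    fidelity = r @ (w * r)                            # (AX - Y)^T W (AX - Y), W diagonal
    smoothness = beta * np.sum(np.diff(x) ** 2)       # simple quadratic R(X)
    prior_term = np.sum(alpha * (x - x_prior) ** 2)   # alpha scalar or per-voxel
    return fidelity + smoothness + prior_term
```

α may be passed either as a scalar (a global coefficient) or as a per-element array (mask-derived constraint term coefficients), matching the two cases described in the text.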
It will be appreciated that the prior image is introduced into the iterative reconstruction so that it can help produce the final desired reconstructed image; in other words, the superior portions of the prior image supplement the final reconstructed image. When the target object is moving, for example the heart, its coronary arteries are the region of intense motion, and the lower temporal resolution of the prior image means that motion artifacts in the coronary portions can be very severe. If the prior image is introduced directly into the iterative reconstruction, the quality of the final reconstructed image in the coronary portion is affected. Therefore, the present application uses the constraint term coefficient α to control and distinguish the portion of the prior image that is finally introduced into the iterative reconstruction, so that the superior portions of the prior image supplement the final reconstructed image while the influence of the inferior portions is removed.
In some embodiments, the constraint term coefficients may be a matrix similar to a mask matrix. The set of raw projection data may be used to obtain an initial image; for example, the initial image is acquired based on the set of raw projection data using an existing image reconstruction algorithm. Thereafter, the initial image may be segmented into one or more regions. In a mask matrix of the same size as the initial image, the values of the elements corresponding to different regions differ. After the values of all elements in the mask matrix are adjusted, the adjusted mask matrix may be designated as the constraint term coefficients.
In some embodiments, the set of raw projection data, together with one or more associated sets of raw projection data corresponding to phases adjacent to the phase of the set of raw projection data, may be used to estimate the motion of the target object. The result of the motion estimation can be used to set the constraint term coefficients for regions of different motion amplitude in the prior image. For a specific description of determining the constraint term coefficients corresponding to each prior image in the target reconstruction function, reference may be made to other parts of the present application, for example, FIGS. 3-5, which are not repeated here.
In some embodiments, when the number of prior images is greater than 1, the target reconstruction function may also be expressed as the following function:
X̂ = argmin_X [ (AX - Y)ᵀ W (AX - Y) + β·R(X) + Σᵢ₌₁ⁿ αᵢ·‖X - X_p,i‖² ]

where n is the number of prior images and X_p,i denotes the i-th prior image.
where α₁ to αₙ are the constraint term coefficients corresponding to each prior image, respectively; each may be a single global coefficient or a matrix.
In some embodiments, each of the one or more prior images has a different temporal resolution from the others, which may also be understood as images reconstructed from scanned projection data corresponding to different scan angle ranges. For example, assuming 3 prior images, the scan angle ranges for the 3 prior images may be 90°-270°, 60°-300°, and 0°-360°, respectively. Because scanned projection data corresponding to different scan angle ranges are used for reconstruction, each prior image has its own characteristics. For example, if the target object is the heart, the myocardial portion of prior image A may be reconstructed with better quality, the coronary artery portion of prior image B with better quality, and the overall quality of prior image C may be better. When the corresponding constraint term coefficients are determined, those of prior images A and B may be matrices that distinguish the portions ultimately introduced into the iterative reconstruction, so that the regions of better quality, such as the myocardial and coronary portions, are introduced. The constraint term coefficient corresponding to prior image C may be a global coefficient that introduces the entire prior image into the iterative reconstruction. For the method of determining the constraint term coefficients, reference may likewise be made to other parts of the present application, for example, FIGS. 3-5.
Step 230, based on the set of raw projection data and the at least one prior image, acquiring a target reconstructed image corresponding to the raw projection data using the target reconstruction function with the determined constraint term coefficients. Step 230 may be performed by reconstruction module 630.
The objective of the iterative reconstruction is to determine the reconstructed image that minimizes the function value of the target reconstruction function, i.e., the target reconstructed image. In some embodiments, an initial image may first be acquired based on the set of raw projection data. For example, after air correction, convolution, back projection, and the like, the initial image is obtained from the processed raw data using an existing reconstruction algorithm. The initial image is then substituted into the target function, and the function value of the target reconstruction function reflects the difference between the projection value of the initial image (i.e., the reconstructed image in the first iteration of the reconstruction) and the raw data (e.g., the raw projection data). Based on this difference, the regularization function R(X) and the regularization coefficient β in the target function are adjusted, and the reconstructed image is corrected. The corrected reconstructed image is substituted into the target reconstruction function again for a second round of iterative reconstruction; after one or more further iterations, once the difference meets the accuracy requirement or the maximum number of iterations is reached, the finally obtained reconstructed image is taken as the target reconstructed image.
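The iteration loop described above can be sketched, under simplifying assumptions, as plain gradient descent on a quadratic version of the target function. The fixed step size, the squared-norm stand-in for R(X), and all names (reconstruct, x_prior, etc.) are assumptions for illustration, not the patent's actual update rule.

```python
import numpy as np

# Minimal gradient-descent sketch of the iterative reconstruction loop.
# Objective assumed: (AX - Y)^T W (AX - Y) + beta*||X||^2 + alpha*||X - Xp||^2,
# with W diagonal (passed as a vector w).
def reconstruct(A, y, w, beta, alpha, x_prior, x0, step=0.1, max_iter=200, tol=1e-8):
    x = x0.copy()
    for _ in range(max_iter):
        # gradient: 2 A^T W (AX - Y) + 2 beta X + 2 alpha (X - Xp)
        grad = 2 * A.T @ (w * (A @ x - y)) + 2 * beta * x + 2 * alpha * (x - x_prior)
        x_new = x - step * grad
        if np.linalg.norm(x_new - x) < tol:   # stop when the correction is tiny
            x = x_new
            break
        x = x_new
    return x
```

With α = 0 the loop simply fits the projection data; a nonzero α pulls the solution toward the prior image, mirroring how the constraint term coefficient controls how much of the prior image enters the iteration.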
In the prior art, the constraint term coefficient is a global coefficient in the target function of the iterative reconstruction, so that when image reconstruction is carried out, the influence of the prior image's poor temporal resolution is also introduced into the final iterative image. In the present method, the constraint term coefficient of the prior image is refined to distinguish the regions of the prior image that need to be introduced from those that do not, preventing the influence of the prior image's poor temporal resolution from entering the iterative image and improving the temporal resolution of the final reconstructed image.
It should be noted that the above description of the process 200 is for purposes of illustration and description only and is not intended to limit the scope of applicability of the application. Various modifications and changes to flow 200 may be made by those skilled in the art in light of the present application. However, such modifications and variations are still within the scope of the present application.
FIG. 3 is an exemplary flow chart for determining constraint term coefficients according to some embodiments of the present application. In some embodiments, one or more steps of method 300 may be implemented in system 100 shown in fig. 1. For example, one or more steps of method 300 may be stored in storage device 150 as instructions and invoked and/or executed by processing device 140. In some embodiments, the method 300 may be performed by the determination module 620.
Step 310, acquiring an initial image based on the set of raw projection data.
In some embodiments, the initial value image may be an initial iterative reconstructed image of the target reconstruction function. The image may be reconstructed based on the set of raw projection data. The image reconstruction algorithm used may be an existing applicable image reconstruction algorithm, and may include an iterative reconstruction algorithm (e.g., a statistical reconstruction algorithm), a fourier slice theorem algorithm, a fan-beam reconstruction algorithm, an analytical reconstruction algorithm (e.g., a Filtered Back Projection (FBP) algorithm), an Algebraic Reconstruction Technique (ART), a Simultaneous Algebraic Reconstruction Technique (SART), a Feldkamp-Davis-Kress (FDK) reconstruction algorithm, or the like, or any combination thereof.
Step 320, based on the initial value image, obtaining a mask matrix corresponding to the initial value image.
In some embodiments, a mask operation may be understood as controlling the area or process of image processing by globally or locally occluding the image to be processed with selected parameters (e.g., a matrix or multidimensional array). A mask matrix is the matrix selected to globally or locally occlude the image to be processed. It acts as a mask: by multiplying the image to be processed, certain areas of that image can be masked out so that they do not participate in the processing, or only the masked areas are processed. The mask matrix corresponding to the initial image can thus distinguish and shield regions of the initial image. As described earlier in this application, the initial image is the initial iterative reconstructed image of the target reconstruction function, which is gradually corrected over the iteration process to yield the final target image. Its temporal resolution is high because of the small-angle iterative reconstruction, and the resulting problem of insufficient data is solved by introducing prior images. However, the prior image need not be introduced in its entirety; rather, it must first be determined from the reconstructed image (e.g., the initial image) which regions need the prior image introduced, e.g., regions where the motion of the target object is gentle, and which regions do not, e.g., regions where the motion of the target object is intense. Based on this determination, a corresponding mask matrix may be obtained. The mask matrix may serve as a basis: for example, after the values of its elements are adjusted, it may be used as the constraint term coefficient of the prior image.
With the constraint term coefficients thus determined, constraint adjustments can be made when the portions of the prior image that need to be introduced (e.g., regions corresponding to parts of the reconstructed image requiring supplementary data) participate in the iterative reconstruction.
In some embodiments, the initial image may be segmented into a plurality of regions, such as a region that does not require the introduction of a prior image as a supplement, a region that does, and a buffer region between the two. Elements corresponding to the different regions in an initial mask matrix of the same size as the initial image are then given different values, for example, 0, 1, or any value between 0 and 1, to obtain the mask matrix corresponding to the initial image. A specific description of acquiring the mask matrix corresponding to the initial image may be found in other parts of the present application, for example, FIG. 4, and is not repeated here.
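The three-valued assignment described above can be sketched as follows. The label encoding (1 for the region needing no prior image, 2 for the region needing it, 3 for the buffer) and the function name are assumptions for illustration only.

```python
import numpy as np

# Hypothetical sketch: build a mask matrix from a segmentation label map.
# Labels are assumed: 1 = region where no prior image is introduced,
# 2 = region where the prior image is introduced, 3 = buffer region.
def build_mask(labels):
    mask = np.zeros(labels.shape, dtype=float)
    mask[labels == 1] = 1.0   # e.g. coronary region: keep high temporal resolution
    mask[labels == 2] = 0.0   # gently moving region: introduce the prior image
    mask[labels == 3] = 0.5   # buffer region: a value between 0 and 1
    return mask
```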
Step 330, determining constraint term coefficients corresponding to the prior image based on the mask matrix.
In some embodiments, after the mask matrix is obtained, the values of the elements in the mask matrix may be calculated and adjusted, and the adjusted mask matrix is used as the constraint term coefficient corresponding to the prior image. In some embodiments, the values of the elements in the mask matrix may be adjusted for the first time based on a preset calculation process; then obtaining an initial constraint value, and carrying out second adjustment on the values of the elements in the mask matrix based on the initial constraint value; and finally, designating the mask matrix after the adjustment is finished as the constraint term coefficient.
Assume that the initial image is a three-dimensional image; the mask matrix is then a three-dimensional array, denoted Mask. Mask(x, y, z) represents the value of the element in the Mask matrix corresponding to the voxel at coordinates (x, y, z) in the initial image. An exemplary preset calculation process may use the following formula:
cos(Mask(x, y, z) · π/2)
where π is the circumference ratio and cos is the cosine function. Substituting the values of the elements in the mask matrix into this formula in turn and recording the outputs completes the first adjustment of the values of the elements in the mask matrix.
In some embodiments, the initial constraint value may be a preset global constraint term coefficient α₀ for the prior image. An exemplary second adjustment of the elements in the mask matrix based on the initial constraint value may use the following formula:
α(x, y, z) = α₀ · (cos(Mask(x, y, z) · π/2))^k
where α(x, y, z) is the constraint value, within the adjusted constraint term coefficient matrix, corresponding to the voxel at coordinates (x, y, z) in the initial image; α₀ is the global constraint term coefficient that, used alone, constrains the prior image when the prior image is introduced into the iterative reconstruction in its entirety; and k is a preset configurable parameter for controlling the transition between regions. The k value may be any number greater than 0. For example, if the target object is a heart, the region not requiring the introduction of the prior image during image reconstruction is the intensely moving region, such as the coronary arteries, the region requiring its introduction is the gently moving region, and a transition region lies between the two. If the degree of movement in the intensely moving region is relatively small, a smaller k value may be used, for example, 0.1; if it is relatively large, a larger k value may be used, for example, 5. The magnitude of the resulting α(x, y, z) is bounded by α₀, i.e., α(x, y, z) ≤ α₀. For example, when α₀ is set to 1, since cos(Mask(x, y, z) · π/2) is a number in [0, 1], the resulting constraint coefficients α(x, y, z) will also lie in [0, 1].
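A direct, vectorized transcription of this formula over a whole mask array might look as follows (the function name is an assumption; α₀, k, and Mask are the symbols from the formula above):

```python
import numpy as np

# alpha(x, y, z) = alpha0 * (cos(Mask(x, y, z) * pi/2))^k,
# applied elementwise to the entire mask array.
def constraint_coefficients(mask, alpha0, k):
    return alpha0 * np.cos(mask * np.pi / 2) ** k
```

With α₀ = 1 and k = 2, a mask value of 0 yields α = 1, a buffer value of 0.5 yields α = 0.5, and a mask value of 1 yields α = 0, so the prior image is fully introduced, partially introduced, and shielded, respectively.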
It should be noted that the above description of the process 300 is for purposes of illustration and description only and is not intended to limit the scope of applicability of the application. Various modifications and changes to flow 300 will be apparent to those skilled in the art in light of the teachings of this application. However, such modifications and variations are still within the scope of the present application.
FIG. 4 is an exemplary flow chart for determining a mask matrix according to some embodiments of the present application. In some embodiments, one or more steps of method 400 may be implemented in system 100 shown in fig. 1. For example, one or more steps in method 400 may be stored as instructions on storage device 150 and invoked and/or executed by processing device 140. In some embodiments, the method 400 may be performed by the determination module 620.
Step 410, segmenting the initial image to obtain a first region and a second region.
In some embodiments, the first region may be a region into which no prior image needs to be introduced during image reconstruction, and the second region may be a region into which a prior image needs to be introduced. Taking cardiac reconstruction as an example, the intensely moving region of the heart may be the coronary region. With small-angle iterative reconstruction, the temporal resolution of the coronary region is high, so the negative influence of the prior image's lower temporal resolution should not be introduced there. In the other regions, the problem of insufficient data caused by the small-angle reconstruction can be solved by introducing prior images. Thus, the first region may be the coronary region in the reconstructed image, and the second region may be the other regions excluding the coronary arteries.
It will be appreciated that each iteration of the iterative reconstruction yields a reconstructed image that differs from the initial image only in the image values of the pixels/voxels. Thus, the initial image may be segmented to obtain the first region and the second region. Existing image segmentation algorithms can be applied to the segmentation of the initial image, such as the Fast Marching segmentation algorithm, Live Wire segmentation algorithm, region growing segmentation algorithm, and threshold segmentation algorithm.
Step 420, an initial mask matrix is obtained.
In some embodiments, for the description of the mask matrix, reference may be made to other parts of the present application, for example, the details of step 320, which are not repeated here. The initial mask matrix may be a precursor of the mask matrix, whose elements hold default initial values; after transformation, it can serve as the mask matrix. Because of this relationship with the mask matrix, the determination module 620 may directly generate a matrix of the same size as the matrix representing the initial image as the initial mask matrix.
Step 430, determining that in the initial mask matrix, the value of the element corresponding to the first region is a first value, and the value of the element corresponding to the second region is a second value.
In some embodiments, pixels belonging to different regions (e.g., the first region or the second region) in the initial image will have the values of their corresponding elements (e.g., at the same coordinate points) in the initial mask matrix directly assigned. Based on the nature of the mask, the values in the matrix used for the masking operation, typically 0 and 1, are used to distinguish areas of the image that are to undergo different operations. Taking cardiac image reconstruction as an example, an exemplary assignment may be as follows:
Mask_vessel(x, y, z) = 1, if Pixel(x, y, z) belongs to the first region (e.g., the coronary region);
Mask_vessel(x, y, z) = 0, if Pixel(x, y, z) belongs to the second region.
where Pixel(x, y, z) is the pixel/voxel at coordinate point (x, y, z) in the initial image, and Mask_vessel(x, y, z) is the value of the element in the initial mask matrix corresponding to coordinate point (x, y, z), e.g., the first value or the second value. For cardiac image reconstruction, the first region may be the coronary region in the reconstructed image, which requires high temporal resolution, so no prior image is introduced to guide its reconstruction; a pixel/voxel at coordinate point (x, y, z) may be assigned a value of 1 when it belongs to the coronary region. The second region may be the region other than the coronary arteries, where the motion is gentle and, because the data is sufficient, image quality is pursued rather than high temporal resolution; a prior image therefore needs to be introduced to guide its reconstruction, and a pixel/voxel at coordinate point (x, y, z) may be assigned a value of 0 when it does not belong to the coronary region.
Step 440, determining a transition region from the second region, and determining a value of an element in the transition region as a third value.
In some embodiments, the vicinity of the intensely moving region may be considered a region greatly affected by motion, i.e., the portion of the second region adjoining the first region, or greatly affected by it. This portion may therefore be treated as a transition region, so that the reconstructed image transitions smoothly from the intensely moving region to the gently moving region. The transition region can also be understood as the section connecting the intensely moving region with the gently moving region.
In some embodiments, the transition region may be determined by a gray-scale dilation operation. Dilation is the process of incorporating all background points in contact with an object into the object, expanding the boundary outward to fill voids in the object. The first region can thus be expanded into the second region through the dilation operation, and the expanded portion is the transition region. When the gray-scale dilation operation is performed, the value in the initial mask matrix corresponding to each pixel/voxel in the transition region, that is, the third value, may also be determined. For example, the average of the values in the initial mask matrix corresponding to all pixels/voxels adjacent to the pixel/voxel may be used as its value in the initial mask matrix. Since the first and second values are 1 and 0, the third value calculated from them lies between 0 and 1.
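One way to realize this dilation-plus-averaging step is sketched below on a 2-D mask with a 3×3 neighbourhood for brevity (the patent works on 3-D volumes); the function name and neighbourhood choice are assumptions.

```python
import numpy as np

# Sketch: expand the first region (value 1) into the second region (value 0);
# each newly covered element receives the mean of its 3x3 neighbourhood in the
# ORIGINAL mask, giving a third value strictly between 0 and 1.
def add_transition(mask):
    out = mask.astype(float).copy()
    h, w = mask.shape
    for i in range(h):
        for j in range(w):
            if mask[i, j] == 0:  # candidate second-region element
                ni, nj = np.meshgrid(range(max(i - 1, 0), min(i + 2, h)),
                                     range(max(j - 1, 0), min(j + 2, w)),
                                     indexing="ij")
                neigh = mask[ni, nj]
                if neigh.max() == 1:          # touches the first region: dilate
                    out[i, j] = neigh.mean()  # third value in (0, 1)
    return out
```

Reading neighbourhoods from the original mask (not `out`) keeps the dilation to a single step, so the transition region is exactly one band wide, as in a single gray-scale dilation.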
Step 450, designating the initial mask matrix with the determined values of the elements as the mask matrix.
In some embodiments, after the values of the elements in the initial mask matrix are all determined, the initial mask matrix may be used as a mask matrix corresponding to the initial value image.
It should be noted that the above description of the process 400 is for purposes of illustration and description only and is not intended to limit the scope of applicability of the application. Various modifications and changes to flow 400 may be made by those skilled in the art in light of the present application. However, such modifications and variations are still within the scope of the present application. For example, the first value corresponding to the first region may also be assigned 0, and the second value corresponding to the second region assigned 1.
FIG. 5 is an exemplary flowchart illustrating another determination of a mask matrix according to some embodiments of the present application. In some embodiments, one or more steps of method 500 may be implemented in system 100 shown in fig. 1. For example, one or more steps in method 500 may be stored as instructions on storage device 150 and invoked and/or executed by processing device 140. In some embodiments, the method 500 may be performed by the determination module 620.
Step 510 obtains another set or sets of raw projection data associated with the set of raw projection data.
In some embodiments, the associated other set or sets of raw projection data may be raw projection data corresponding to phases adjacent to the phase of the target object to which the set of raw projection data corresponds. A phase may refer to a state at a certain moment in a movement cycle of a target object (e.g., a cardiac cycle of a heart), which may be represented by that moment. For example, assuming the cardiac cycle of the heart is T, a certain phase may be represented by 0.4T, indicating the state of the heart at 40% of the cardiac cycle. Phase adjacency may refer to temporal adjacency, including direct adjacency and indirect adjacency. For example, of the three phases 0.2T, 0.4T, and 0.8T, phases 0.2T and 0.4T are directly adjacent, phases 0.4T and 0.8T are directly adjacent, and phases 0.2T and 0.8T are indirectly adjacent.
In some embodiments, the other set or sets of raw projection data may be obtained by expanding left and right from the data acquisition angle corresponding to the phase, so as to obtain scanned projection data over a range of scan angles. The scanning of the target object may be completed in one pass, and the other set or sets of raw projection data associated with the set of raw projection data may be selected from the scanned projection data obtained from that scan.
In some embodiments, the above raw projection data may be obtained directly from a scanning device that scans the target object, such as scanning device 110, or may be obtained from a storage device (e.g., storage device 150), or a medical database, via a network (e.g., network 120).
Step 520, determining a motion vector field based on the set of raw projection data and the other set or sets of raw projection data.
It will be appreciated that each set of raw projection data, after processing, results in a reconstructed image. Since one set of raw projection data corresponds to one state of the target object in the motion cycle, reconstructed images obtained from adjacent multiple sets of raw projection data corresponding to multiple phases may exhibit one motion trend of the target object. For example, assuming that the target object is a heart, for a coronary region, a coronary centerline may be extracted from a plurality of reconstructed images obtained based on different sets of raw projection data, respectively. Subsequently, motion vectors representing the coronary region, including size (for representing the intensity of motion) and direction (for representing the direction of motion) may be determined based on the change in coordinates of the coronary centerline in the different reconstructed images. A vector field consisting of motion vectors corresponding to all regions can be understood as the motion vector field.
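The centreline-based motion vectors described above can be sketched as follows, assuming the coronary centreline has already been extracted from two adjacent-phase reconstructions as matched point lists (function and variable names are illustrative):

```python
import numpy as np

# Motion vectors from the change in centreline coordinates between two
# adjacent-phase reconstructions: direction gives the direction of motion,
# magnitude (vector modulus) gives the intensity of motion.
def centerline_motion(pts_a, pts_b):
    vec = pts_b - pts_a                   # per-point motion vector
    mag = np.linalg.norm(vec, axis=1)     # per-point motion amplitude
    return vec, mag
```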
Step 530, determining constraint term coefficients of the prior image based on the motion vector field.
In some embodiments, since the motion vector field already includes the motion vectors corresponding to all regions, the motion amplitude of the whole or of individual regions of the target object (which may also be referred to as the reconstruction target) may be determined based on the motion vector field. The motion amplitude represents the motion intensity of the reconstruction target: the greater the motion amplitude, the stronger the motion of the reconstruction target; the smaller the motion amplitude, the weaker its motion. The size of a motion vector contained in the motion vector field (the length, or modulus, of the vector) can be used to evaluate the motion amplitude. For example, when the size of the motion vector is greater than an amplitude threshold, the region corresponding to that motion vector may be considered to be moving intensely; when it is smaller than the amplitude threshold, the region's motion may be considered gentle. Based on the sizes of the motion vectors, the reconstruction target may be divided into a plurality of regions according to motion amplitude.
Likewise, in small-angle iterative reconstruction, the need for prior images differs between regions of different motion amplitude. For example, the intensely moving region does not need a prior image introduced for reconstruction guidance, because of its requirement for high temporal resolution, while the gently moving region does. Thus, after the regions of different motion amplitude of the reconstruction target are determined, the portions of the reconstructed image representing different motion amplitudes can be partitioned. In addition, the initial image corresponding to the set of raw projection data is the initial reconstructed image of the iterative reconstruction, in which the image values of pixels/voxels, e.g., CT values, are corrected. Therefore, the initial image can be directly divided to obtain the regions of different motion amplitudes.
Meanwhile, the determination module 620 may also obtain the initial constraint term coefficients. The initial constraint term coefficients may be a matrix or array of the same size as the matrix representing the initial image, the elements of which correspond to each pixel/voxel in the initial image. After the division is completed, the determination module 620 may assign values to the elements in the initial constraint term coefficients that correspond to the regions of different motion amplitude in the initial image. Continuing the previous example, in cardiac reconstruction the intensely moving portion, such as the coronary arteries, does not need a prior image introduced for reconstruction guidance, so the values of the elements corresponding to the intensely moving portion in the initial constraint term coefficients may be assigned 0, shielding the introduction of the corresponding portion of the prior image into the reconstruction. The gently moving portion does need a prior image introduced for reconstruction guidance, so the values of the elements corresponding to the gently moving region in the initial constraint term coefficients may be assigned α₀, e.g., a global constraint value similar to that of the prior image.
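The amplitude-threshold assignment just described can be sketched as follows; the threshold value, alpha0, and all names are assumptions for illustration:

```python
import numpy as np

# Per-voxel constraint coefficients from motion vectors: regions whose motion
# amplitude exceeds the threshold get 0 (prior image shielded), the rest get
# the global coefficient alpha0 (prior image introduced).
def coefficients_from_motion(vectors, threshold, alpha0):
    magnitude = np.linalg.norm(vectors, axis=-1)  # modulus of each motion vector
    return np.where(magnitude > threshold, 0.0, alpha0)
```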
It should be noted that the above description of the process 500 is for purposes of illustration and description only and is not intended to limit the scope of applicability of the application. Various modifications and changes to flow 500 may be made by those skilled in the art in light of the present application. However, such modifications and variations are still within the scope of the present application.
Fig. 6 is a block diagram of an exemplary processing device 140, shown in accordance with some embodiments of the present application. The processing device 140 may acquire the raw data and reconstruct an image using the objective function after determining the weight parameters of the objective function based on the raw data. As shown in fig. 6, the processing device 140 may include an acquisition module 610, a determination module 620, and a reconstruction module 630.
The acquisition module 610 may acquire data.
In some embodiments, the acquisition module 610 may acquire a set of raw projection data, as well as at least one prior image. The raw projection data may be scanned projection data acquired by a detection device, such as the detector 112 shown in FIG. 1, which may be used for image reconstruction. The prior image is related to the set of raw projection data and may be a reconstructed image of the target object derived using a larger data set that includes the set of raw projection data. In some embodiments, the acquisition module 610 may acquire the data directly from a scanning device that scans the target object, such as the scanning device 110, or from a storage device (e.g., the storage device 150) or a medical database via a network (e.g., the network 120).
The determination module 620 may determine one or more results based on the acquired data.
In some embodiments, the determination module 620 may determine the constraint term coefficients in the target reconstruction function corresponding to each prior image based on the set of raw projection data. In some embodiments, the determination module 620 may obtain an initial image based on the set of raw projection data, acquire a mask matrix corresponding to the initial image based on the initial image, and determine the constraint term coefficients corresponding to the prior image based on the mask matrix. In some embodiments, the determination module 620 may segment the initial image to obtain a first region and a second region. The first region may be a region in which the motion of the target object in the initial image is severe, and the second region may be a region in which the motion of the target object in the initial image is gentle. The determination module 620 may then acquire an initial mask matrix; determine that, in the initial mask matrix, the value of the elements corresponding to the first region is a first value and the value of the elements corresponding to the second region is a second value; determine a transition region from the second region and determine that the value of the elements within the transition region is a third value; and finally designate the initial mask matrix with the determined element values as the mask matrix. In some embodiments, the determination module 620 may obtain one or more other sets of raw projection data related to the set of raw projection data. The related raw projection data may be adjacent-phase projection data corresponding to the field of view from which the raw projection data were derived. A motion vector field may then be determined based on the set of raw projection data and the one or more other sets of raw projection data. The motion state of the body tissue of the target object can be represented by the motion vector field.
Finally, constraint term coefficients of the prior image may be determined based on the motion vector field.
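As a non-limiting sketch of this embodiment, one plausible way to derive per-pixel coefficients from a motion vector field is a monotone decreasing mapping of the local motion magnitude. The field values, `alpha0`, `tau`, and the exponential form are all illustrative assumptions, since the embodiment does not fix a formula:

```python
import numpy as np

# Assumed motion vector field (dy, dx) per pixel, e.g., estimated by
# registering images reconstructed from adjacent-phase projection data.
mvf = np.zeros((4, 4, 2))
mvf[1:3, 1:3] = [3.0, 4.0]  # assumed large displacement in the center

alpha0 = 0.5  # assumed maximum constraint value (gentle motion)
tau = 5.0     # assumed motion scale; both parameters are illustrative

# One plausible monotone mapping: the larger the local motion magnitude,
# the smaller the prior-image constraint coefficient.
magnitude = np.linalg.norm(mvf, axis=-1)   # 5.0 inside the moving block
coeff = alpha0 * np.exp(-magnitude / tau)
```

Any mapping that shrinks the coefficient where motion is strong would serve the same purpose; the exponential is chosen here only for smoothness.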
The reconstruction module 630 may reconstruct an image.
In some embodiments, the reconstruction module 630 may acquire a target reconstructed image corresponding to the raw projection data based on the set of raw projection data and the at least one prior image, using the target reconstruction function with the determined constraint term coefficients.
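A typical form of such a prior-constrained target reconstruction function is a data-fidelity term plus a weighted prior-image penalty. The following toy sketch assumes this quadratic form and a scalar coefficient `lam` (the embodiment uses per-pixel coefficients); the operator `A`, the sizes, and the gradient-descent solver are illustrative assumptions, not the patented method:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy forward model: A plays the role of the projection operator and
# p the raw projection data; sizes are illustrative.
A = rng.normal(size=(8, 6))
x_true = rng.normal(size=6)
p = A @ x_true

prior = x_true + 0.1 * rng.normal(size=6)  # assumed prior image
lam = 0.3  # assumed scalar constraint term coefficient

# One common form of a prior-constrained reconstruction objective:
#   f(x) = ||A x - p||^2 + lam * ||x - prior||^2
# minimized here by plain gradient descent.
x = np.zeros(6)
step = 0.01
for _ in range(3000):
    grad = 2 * A.T @ (A @ x - p) + 2 * lam * (x - prior)
    x -= step * grad

# The iterate should approach the closed-form regularized solution.
x_star = np.linalg.solve(A.T @ A + lam * np.eye(6), A.T @ p + lam * prior)
```

With a per-pixel coefficient matrix instead of a scalar, the term `lam * (x - prior)` becomes an element-wise product, so the pull toward the prior image is reduced exactly in the severely moving regions.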
For a detailed description of the various modules of the processing device 140, reference may be made to the flow chart portion of the present application.
It should be understood that the system shown in fig. 6 and its modules may be implemented in a variety of ways. For example, in some embodiments, the system and its modules may be implemented in hardware, software, or a combination of the two. The hardware portion may be implemented using dedicated logic; the software portion may be stored in a memory and executed by a suitable instruction execution system, such as a microprocessor or purpose-designed hardware. Those skilled in the art will appreciate that the methods and systems described above may be implemented using computer-executable instructions and/or embodied in processor control code, provided, for example, on a carrier medium such as a magnetic disk, CD, or DVD-ROM, on a programmable memory such as read-only memory (firmware), or on a data carrier such as an optical or electronic signal carrier. The system and its modules of the present application may be implemented not only with hardware circuitry, such as very-large-scale integrated circuits or gate arrays, semiconductors such as logic chips and transistors, or programmable hardware devices such as field-programmable gate arrays and programmable logic devices, but also with software executed by various types of processors, or with a combination of such hardware circuitry and software (e.g., firmware).
It should be noted that the above description of the processing device 140 and its modules is for descriptive convenience only and is not intended to limit the application to the scope of the illustrated embodiments. It will be appreciated by those skilled in the art that, given the principles of the system, various modules may be combined arbitrarily, or a subsystem may be constructed in connection with other modules, without departing from such principles. For example, in some embodiments, the acquisition module, the determination module, and the reconstruction module disclosed in fig. 6 may be different modules in one system, or one module may implement the functions of two or more of these modules. As another example, the determining module may be subdivided into an initial image acquiring unit, a mask matrix acquiring unit, and a constraint term coefficient determining unit, which respectively acquire an initial image based on the set of raw projection data, acquire a mask matrix corresponding to the initial image based on the initial image, and determine the constraint term coefficients corresponding to the prior image based on the mask matrix. As a further example, the modules in the processing device 140 may share one storage module, or each module may have its own storage module. Such variations are within the scope of the present application.
Possible beneficial effects of embodiments of the present application include, but are not limited to, the following. By adapting the constraint term coefficients applied to the prior image during image reconstruction, a smaller constraint term coefficient is set for the prior image in regions where the scanned target object moves severely and a limited-angle scan must be reconstructed to obtain high temporal resolution; in regions where the scanned target object moves gently and the data volume of a limited-angle reconstruction is insufficient, a larger constraint term coefficient is set for the prior image. This effectively avoids carrying the poor temporal resolution of the prior image into the final iterative image and improves the temporal resolution of limited-angle reconstructed images. It should be noted that different embodiments may produce different advantages, and that in any particular embodiment the advantage produced may be any one or a combination of those above, or any other advantage that may be obtained.
While the basic concepts have been described above, it will be apparent to those skilled in the art that the foregoing detailed disclosure is by way of example only and is not intended to be limiting. Although not explicitly described herein, various modifications, improvements, and adaptations of the present application may occur to those skilled in the art. Such modifications, improvements, and adaptations are intended to be suggested by this application, and are therefore within the spirit and scope of the exemplary embodiments of this application.
Meanwhile, the present application uses specific words to describe embodiments of the present application. Reference to "one embodiment," "an embodiment," and/or "some embodiments" means that a particular feature, structure, or characteristic is associated with at least one embodiment of the present application. Thus, it is emphasized and should be appreciated that two or more references to "an embodiment" or "one embodiment" or "an alternative embodiment" in various positions in this application are not necessarily referring to the same embodiment. Furthermore, certain features, structures, or characteristics of one or more embodiments of the present application may be combined as suitable.
Furthermore, those skilled in the art will appreciate that aspects of the present application may be illustrated and described in any of a number of patentable classes or contexts, including any new and useful process, machine, product, or composition of matter, or any new and useful improvement thereof. Accordingly, aspects of the present application may be implemented entirely in hardware, entirely in software (including firmware, resident software, micro-code, etc.), or in a combination of hardware and software. The above hardware or software may be referred to as a "data block," "module," "engine," "unit," "component," or "system." Furthermore, aspects of the present application may take the form of a computer program product embodied in one or more computer-readable media containing computer-readable program code.
The computer storage medium may contain a propagated data signal with the computer program code embodied therein, for example, on a baseband or as part of a carrier wave. The propagated signal may take on a variety of forms, including electro-magnetic, optical, etc., or any suitable combination thereof. A computer storage medium may be any computer readable medium that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code located on a computer storage medium may be propagated through any suitable medium, including radio, cable, fiber optic cable, RF, or the like, or a combination of any of the foregoing.
The computer program code necessary for the operation of portions of the present application may be written in any one or more programming languages, including object-oriented programming languages such as Java, Scala, Smalltalk, Eiffel, JADE, Emerald, C++, C#, VB.NET, and Python; conventional procedural programming languages such as C, Visual Basic, Fortran 2003, Perl, COBOL 2002, PHP, and ABAP; dynamic programming languages such as Python, Ruby, and Groovy; or other programming languages. The program code may execute entirely on the user's computer, on the user's computer as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or processing device. In the latter scenario, the remote computer may be connected to the user's computer through any form of network, such as a local area network (LAN) or a wide area network (WAN), or to an external computer (for example, through the Internet), or the connection may use a service such as software as a service (SaaS) in a cloud computing environment.
Furthermore, the order in which the elements and sequences are presented, the use of numbers or letters, or the use of other designations in this application is not intended to limit the order of the processes and methods of this application unless explicitly recited in the claims. While the foregoing disclosure discusses, by way of various examples, certain embodiments presently considered useful, it is to be understood that such details are illustrative only and that the appended claims are not limited to the disclosed embodiments; on the contrary, the claims are intended to cover all modifications and equivalent arrangements falling within the spirit and scope of the embodiments of the present application. For example, while the system components described above may be implemented by hardware devices, they may also be implemented solely by software solutions, such as installing the described system on an existing processing device or mobile device.
Likewise, it should be noted that, in order to simplify the presentation of the disclosure and thereby aid in the understanding of one or more inventive embodiments, various features are sometimes grouped together in a single embodiment, figure, or description thereof. This method of disclosure, however, is not to be interpreted as implying that the claimed subject matter requires more features than are expressly recited in the claims. Indeed, the claimed subject matter may lie in less than all features of a single embodiment disclosed above.
Each patent, patent application publication, and other material, such as articles, books, specifications, publications, and documents, cited in this application is hereby incorporated by reference in its entirety, except for any application history documents that are inconsistent with or conflict with the content of this application, and except for any documents, currently or later attached to this application, that limit the broadest scope of the claims of this application. If the descriptions, definitions, and/or use of terms in the materials accompanying this application are inconsistent with or conflict with those of this application, the descriptions, definitions, and/or use of terms of this application shall prevail.
Finally, it should be understood that the embodiments described herein are merely illustrative of the principles of the embodiments of the present application. Other variations are also possible within the scope of this application. Thus, by way of example, and not limitation, alternative configurations of embodiments of the present application may be considered in keeping with the teachings of the present application. Accordingly, embodiments of the present application are not limited to only the embodiments explicitly described and depicted herein.

Claims (16)

1. A method of image reconstruction, the method comprising:
acquiring a set of raw projection data and at least one prior image, the prior image being related to the set of raw projection data;
determining constraint term coefficients in the target reconstruction function corresponding to each prior image based on the set of raw projection data;
acquiring a target reconstruction image corresponding to the raw projection data based on the set of raw projection data and the at least one prior image, using the target reconstruction function determined with the constraint term coefficients corresponding to each prior image, wherein,
the at least one prior image comprises two or more prior images, each of the two or more prior images is reconstructed based on scan projection data corresponding to a different scan angle range, and the constraint term coefficients of the two or more prior images are determined based on the motion of a target object.
2. The method of claim 1, wherein determining constraint term coefficients in a target reconstruction function corresponding to each prior image based on the set of raw projection data comprises:
acquiring an initial image based on the set of original projection data;
acquiring a mask matrix corresponding to the initial image based on the initial image;
and determining constraint term coefficients corresponding to the prior image based on the mask matrix.
3. The method of claim 2, wherein the acquiring a mask matrix corresponding to the initial image based on the initial image comprises:
dividing the initial image to obtain a first region and a second region;
acquiring an initial mask matrix, wherein the initial mask matrix corresponds in size to the matrix representing the initial image;
determining that in the initial mask matrix, the value of the element corresponding to the first region is a first numerical value, and the value of the element corresponding to the second region is a second numerical value;
determining a transition region from the second region, and determining that the value of an element in the transition region is a third numerical value;
and taking the initial mask matrix with the determined element values as the mask matrix.
4. A method according to claim 3, wherein the first value, the second value, and the third value are selected from the value interval [0,1].
5. A method according to claim 3, wherein said determining constraint term coefficients corresponding to said a priori image based on said mask matrix comprises:
performing a first adjustment on the values of the elements in the mask matrix based on a preset calculation process;
acquiring an initial constraint value, and performing a second adjustment on the values of the elements in the mask matrix based on the initial constraint value;
and designating the adjusted mask matrix as the constraint term coefficients.
6. The method of claim 1, wherein determining constraint term coefficients in a target reconstruction function corresponding to each prior image based on the set of raw projection data comprises:
acquiring one or more other sets of raw projection data related to the set of raw projection data;
determining a motion vector field based on the set of raw projection data and the one or more other sets of raw projection data;
and determining constraint term coefficients of the prior image based on the motion vector field.
7. The method of claim 1, wherein the at least one prior image has a different temporal resolution.
8. An image reconstruction system, wherein the system comprises an acquisition module, a determination module and a reconstruction module;
the acquisition module is configured to acquire a set of raw projection data and at least one prior image, wherein the prior image is related to the set of raw projection data, the at least one prior image comprises two or more prior images, and each of the two or more prior images is reconstructed based on scan projection data corresponding to a different scan angle range;
the determining module is configured to determine, based on the set of raw projection data, the constraint term coefficients corresponding to each prior image in the target reconstruction function, wherein the constraint term coefficients of the two or more prior images are determined based on the motion of the target object;
the reconstruction module is configured to acquire a target reconstruction image corresponding to the raw projection data based on the set of raw projection data and the at least one prior image, using the target reconstruction function determined with the constraint term coefficients corresponding to each prior image.
9. The system of claim 8, wherein to determine the constraint term coefficients in the target reconstruction function corresponding to each prior image based on the set of raw projection data, the determination module is further configured to:
acquiring an initial image based on the set of original projection data;
acquiring a mask matrix corresponding to the initial image based on the initial image;
and determining constraint term coefficients corresponding to the prior image based on the mask matrix.
10. The system of claim 9, wherein to obtain a mask matrix corresponding to the initial image based on the initial image, the determining module is further configured to:
dividing the initial image to obtain a first region and a second region;
acquiring an initial mask matrix, wherein the initial mask matrix corresponds in size to the matrix representing the initial image;
determining that in the initial mask matrix, the value of the element corresponding to the first region is a first numerical value, and the value of the element corresponding to the second region is a second numerical value;
determining a transition region from the second region, and determining that the value of an element in the transition region is a third numerical value;
and taking the initial mask matrix with the determined element values as the mask matrix.
11. The system of claim 10, wherein the first value, the second value, and the third value are selected from the value interval [0,1].
12. The system of claim 10, wherein to determine the constraint term coefficients corresponding to the prior image based on the mask matrix, the determining module is further configured to:
performing a first adjustment on the values of the elements in the mask matrix based on a preset calculation process;
acquiring an initial constraint value, and performing a second adjustment on the values of the elements in the mask matrix based on the initial constraint value;
and designating the adjusted mask matrix as the constraint term coefficients.
13. The system of claim 8, wherein to determine the constraint term coefficients in the target reconstruction function corresponding to each prior image based on the set of raw projection data, the determination module is further configured to:
acquiring one or more other sets of raw projection data related to the set of raw projection data;
determining a motion vector field based on the set of raw projection data and the one or more other sets of raw projection data;
and determining constraint term coefficients of the prior image based on the motion vector field.
14. The system of claim 8, wherein the at least one prior image has a different temporal resolution.
15. An image reconstruction apparatus comprising a processor, wherein the processor is configured to perform the image reconstruction method according to any one of claims 1-7.
16. A computer readable storage medium storing computer instructions, wherein when the computer instructions in the storage medium are read by a computer, the computer performs the image reconstruction method according to any one of claims 1-7.
CN201911040232.9A 2019-10-29 2019-10-29 Image reconstruction method, system, device and storage medium Active CN110751702B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201911040232.9A CN110751702B (en) 2019-10-29 2019-10-29 Image reconstruction method, system, device and storage medium
US17/082,128 US11776170B2 (en) 2019-10-29 2020-10-28 Systems and methods for medical imaging


Publications (2)

Publication Number Publication Date
CN110751702A CN110751702A (en) 2020-02-04
CN110751702B true CN110751702B (en) 2023-06-16

Family

ID=69280965

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911040232.9A Active CN110751702B (en) 2019-10-29 2019-10-29 Image reconstruction method, system, device and storage medium

Country Status (2)

Country Link
US (1) US11776170B2 (en)
CN (1) CN110751702B (en)

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111784794A (en) * 2020-07-05 2020-10-16 高健 CT image reconstruction method, device, computer equipment and readable storage medium
CN112001979B (en) * 2020-07-31 2024-04-26 上海联影医疗科技股份有限公司 Motion artifact processing method, system, readable storage medium and apparatus
TWI749676B (en) * 2020-08-03 2021-12-11 緯創資通股份有限公司 Image quality assessment apparatus and image quality assessment method thereof
CN112288762B (en) * 2020-10-15 2023-05-09 西北工业大学 Discrete iteration reconstruction method for limited angle CT scanning
CN112669405B (en) * 2020-12-30 2023-01-20 上海联影医疗科技股份有限公司 Image reconstruction method, system, readable storage medium and device
TWI800814B (en) * 2021-04-12 2023-05-01 宏碁股份有限公司 Image splicing method
US11328209B1 (en) 2021-06-02 2022-05-10 SambaNova Systems, Inc. Dual cycle tensor dropout in a neural network
US11256987B1 (en) 2021-06-02 2022-02-22 SambaNova Systems, Inc. Memory efficient dropout, with reordering of dropout mask elements
WO2022265875A1 (en) * 2021-06-18 2022-12-22 Subtle Medical, Inc. Systems and methods for real-time video denoising
CN117078686B (en) * 2023-10-17 2024-01-23 万里云医疗信息科技(北京)有限公司 Method, apparatus and storage medium for constructing samples for training bone suppression model
CN117409016A (en) * 2023-12-15 2024-01-16 华中科技大学同济医学院附属同济医院 Automatic segmentation method for magnetic resonance image

Family Cites Families (11)

Publication number Priority date Publication date Assignee Title
WO2010084389A1 (en) * 2009-01-21 2010-07-29 Koninklijke Philips Electronics N.V. Method and apparatus for large field of view imaging and detection and compensation of motion artifacts
US8204172B1 (en) * 2010-03-17 2012-06-19 General Electric Company System and method of prior image constrained image reconstruction using short scan image data and objective function minimization
CN102737392B (en) * 2012-06-07 2013-11-06 南方医科大学 Non-partial regularization prior reconstruction method for low-dosage X-ray captive test (CT) image
US9373159B2 (en) * 2012-06-18 2016-06-21 University Health Network Method and system for compressed sensing image reconstruction
CN103810733B (en) * 2014-02-28 2017-04-05 南方医科大学 A kind of statistics iterative reconstruction approach of sparse angular X ray CT image
US10147207B2 (en) * 2016-07-15 2018-12-04 Wisconsin Alumni Research Foundation System and method for high-temporal resolution, time-resolved cone beam CT angiography
US10249064B2 (en) * 2016-08-02 2019-04-02 Toshiba Medical Systems Corporation Motion estimation method and apparatus
CN109640824B (en) * 2016-08-22 2023-06-30 皇家飞利浦有限公司 Model regularized motion compensated medical image reconstruction
CN109791701B (en) * 2016-09-30 2023-10-03 皇家飞利浦有限公司 Iterative image reconstruction with dynamic suppression of the formation of noise-induced artifacts
CN109523602B (en) * 2018-10-22 2023-02-17 上海联影医疗科技股份有限公司 Iterative reconstruction optimization method, device and computer readable medium
CN110490881A (en) * 2019-08-19 2019-11-22 腾讯科技(深圳)有限公司 Medical image dividing method, device, computer equipment and readable storage medium storing program for executing

Also Published As

Publication number Publication date
US11776170B2 (en) 2023-10-03
US20210049795A1 (en) 2021-02-18
CN110751702A (en) 2020-02-04

Similar Documents

Publication Publication Date Title
CN110751702B (en) Image reconstruction method, system, device and storage medium
CN109741284B (en) System and method for correcting respiratory motion-induced mismatches in PET imaging
JP5580833B2 (en) A priori image-restricted image reconstruction method in heart rate cone-beam computed tomography
US9576391B2 (en) Tomography apparatus and method of reconstructing a tomography image by the tomography apparatus
JP6824633B2 (en) X-ray computed tomography equipment, successive approximation reconstruction method and medical image processing equipment
US7221728B2 (en) Method and apparatus for correcting motion in image reconstruction
CN109060849B (en) Method, system and device for determining radiation dose modulation line
US10937209B2 (en) Tomography imaging apparatus and method of reconstructing tomography image
EP3107457B1 (en) Tomography apparatus and method of reconstructing a tomography image by the tomography apparatus
US9235907B2 (en) System and method for partial scan artifact reduction in myocardial CT perfusion
CN115605915A (en) Image reconstruction system and method
CN108876730B (en) Method, device and equipment for correcting motion artifact and storage medium
CN110866959B (en) Image reconstruction method, system, device and storage medium
CN111462020A (en) Method, system, storage medium and device for correcting motion artifact of heart image
US10657621B2 (en) Moving structure motion compensation in imaging
JP2016152916A (en) X-ray computer tomographic apparatus and medical image processing apparatus
US9984476B2 (en) Methods and systems for automatic segmentation
US9858688B2 (en) Methods and systems for computed tomography motion compensation
CN110853742B (en) Image reconstruction method, system, device and storage medium
JP2018020120A (en) Medical image processor and medical image processing program
CN109077746B (en) Method, system and device for determining radiation dose modulation line
CN111369636B (en) Image reconstruction method, system, device and storage medium
US20220375038A1 (en) Systems and methods for computed tomography image denoising with a bias-reducing loss function
US20190180481A1 (en) Tomographic reconstruction with weights
US11908044B2 (en) Systems and methods for computed tomography image reconstruction

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: 201807 Shanghai City, north of the city of Jiading District Road No. 2258

Applicant after: Shanghai Lianying Medical Technology Co.,Ltd.

Address before: 201807 Shanghai City, north of the city of Jiading District Road No. 2258

Applicant before: SHANGHAI UNITED IMAGING HEALTHCARE Co.,Ltd.

GR01 Patent grant
GR01 Patent grant