CN113160346A - PET image reconstruction method based on wavelet fusion - Google Patents

PET image reconstruction method based on wavelet fusion

Info

Publication number
CN113160346A
Authority
CN
China
Prior art keywords
image
pet
frequency
wavelet
low
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110378077.2A
Other languages
Chinese (zh)
Inventor
万丽雯
李彦明
郑海荣
刘新
任溥弦
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen National Research Institute of High Performance Medical Devices Co Ltd
Original Assignee
Shenzhen National Research Institute of High Performance Medical Devices Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen National Research Institute of High Performance Medical Devices Co Ltd filed Critical Shenzhen National Research Institute of High Performance Medical Devices Co Ltd
Priority to CN202110378077.2A priority Critical patent/CN113160346A/en
Publication of CN113160346A publication Critical patent/CN113160346A/en
Pending legal-status Critical Current

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 11/00: 2D [Two Dimensional] image generation
    • G06T 11/003: Reconstruction from projections, e.g. tomography
    • G06T 11/005: Specific pre-processing for tomographic reconstruction, e.g. calibration, source positioning, rebinning, scatter correction, retrospective gating
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00: Pattern recognition
    • G06F 18/20: Analysing
    • G06F 18/25: Fusion techniques
    • G06F 18/253: Fusion techniques of extracted features

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • General Physics & Mathematics (AREA)
  • Artificial Intelligence (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • General Engineering & Computer Science (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Image Processing (AREA)
  • Nuclear Medicine (AREA)

Abstract

The invention discloses a PET image reconstruction method based on wavelet fusion. The method comprises the following steps: performing wavelet decomposition on a first PET image to obtain a corresponding first high-frequency image and first low-frequency image, and performing wavelet decomposition on a second PET image to obtain a corresponding second high-frequency image and second low-frequency image, wherein the first PET image and the second PET image are PET reconstructed images obtained in different manners; in each frequency channel, processing the wavelet coefficients according to a set fusion criterion, and performing an inverse wavelet transform on the processed new wavelet coefficients to obtain a fused image; and evaluating the fused image and iterating to obtain an optimal solution, taking the fused image that meets the set evaluation criterion as the reconstructed image. The reconstructed image obtained by the invention preserves image contours and details well.

Description

PET image reconstruction method based on wavelet fusion
Technical Field
The invention relates to the technical field of medical image processing, in particular to a PET image reconstruction method based on wavelet fusion.
Background
Positron Emission Tomography (PET) is a highly sensitive molecular imaging method for multifunctional metabolic imaging and is widely used in the clinic. PET images can reveal early-stage tumors and can be used to detect cardiovascular and nervous system diseases. However, PET scanning requires the patient to be injected with a radiotracer, and long dynamic scans and high doses, while making PET imaging clearer, place a burden on the patient. It is therefore essential to reduce the radiation dose to the patient as much as possible, following the ALARA (As Low As Reasonably Achievable) principle of clinical diagnosis. However, as the dose decreases, the quality of the reconstructed PET image deteriorates, which also affects the accuracy of the Patlak plot and other derived data. Therefore, developing a low-dose PET image reconstruction method that reduces the injected dose while preserving the quality of the PET reconstruction and the Patlak plot has clear application prospects: it is more easily accepted by patients and reduces the risks of high doses in clinical practice.
In the prior art there are many PET image reconstruction algorithms, each with advantages and disadvantages. For the reconstruction of parametric images, EM (expectation maximization) reconstructs images well but is noisy and requires high counts, whereas MR-guided Kernel EM can suppress noise and reconstruct images at reduced counts (the count being the number of photon pairs emitted in opposite directions by positron-electron annihilation), but lacks detail, and in the region of interest of the parametric image the net influx rate Ki falls below the true value.
Disclosure of Invention
The invention aims to overcome the defects of the prior art and to provide a PET image reconstruction method based on wavelet fusion, which decomposes the images reconstructed by Kernel EM and by EM using the wavelet transform and extracts features for frame-by-frame fusion, so that a relatively clear image can be obtained at reduced dose when reconstructing the PET image.
The technical scheme of the invention is to provide a PET image reconstruction method based on wavelet fusion. The method comprises the following steps:
performing wavelet decomposition on the first PET image to obtain a corresponding first high-frequency image and first low-frequency image, and performing wavelet decomposition on the second PET image to obtain a corresponding second high-frequency image and second low-frequency image, wherein the first PET image and the second PET image are PET reconstructed images obtained in different manners;
in each frequency channel, processing the wavelet coefficients according to a set fusion criterion, and performing an inverse wavelet transform on the processed new wavelet coefficients to obtain a fused image;
and evaluating the fused image and iterating to obtain an optimal solution, taking the fused image that meets the set evaluation criterion as the reconstructed image.
Compared with the prior art, the present invention builds on existing reconstruction algorithms: wavelet decomposition carries the images from the spatial domain into the frequency domain, where high-frequency and low-frequency features are extracted and fused by weighting, so that the strengths of the results of different reconstruction methods are combined and the quality of the reconstructed image is improved.
Other features of the present invention and advantages thereof will become apparent from the following detailed description of exemplary embodiments thereof, which proceeds with reference to the accompanying drawings.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description, serve to explain the principles of the invention.
FIG. 1 is a flow chart of a PET image reconstruction method based on wavelet fusion according to one embodiment of the present invention;
FIG. 2 is a process diagram of a PET image reconstruction method based on wavelet fusion according to an embodiment of the present invention;
FIG. 3 shows the frequency-band layout of an image after a three-level wavelet transform according to one embodiment of the present invention;
FIG. 4 is a process diagram of a PET image reconstruction method based on wavelet fusion according to another embodiment of the present invention.
Detailed Description
Various exemplary embodiments of the present invention will now be described in detail with reference to the accompanying drawings. It should be noted that: the relative arrangement of the components and steps, the numerical expressions and numerical values set forth in these embodiments do not limit the scope of the present invention unless specifically stated otherwise.
The following description of at least one exemplary embodiment is merely illustrative in nature and is in no way intended to limit the invention, its application, or uses.
Techniques, methods, and apparatus known to those of ordinary skill in the relevant art may not be discussed in detail but are intended to be part of the specification where appropriate.
In all examples shown and discussed herein, any particular value should be construed as merely illustrative, and not limiting. Thus, other examples of the exemplary embodiments may have different values.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, further discussion thereof is not required in subsequent figures.
Referring to fig. 1 and 2, the PET image reconstruction method based on wavelet fusion provided by the present invention includes the following steps.
Step S1: reconstruct the PET image using the expectation-maximization (EM) algorithm to obtain a first reconstructed image.
Specifically, let A_ij denote the number of photons emitted by pixel j and detected along the i-th line of response (LOR), and let Y_i denote the measured count on LOR i. The likelihood function of the data set A is expressed as:

L(A \mid X) = \prod_i \prod_j p(A_{ij} \mid \beta_{ij} = H_{ij} X_j)    (1)

where A_ij follows a Poisson distribution whose expectation beta_ij is

E(A_{ij}) = \beta_{ij} = H_{ij} X_j    (2)

Here H_ij is a constant element of the system response matrix, namely the probability that a gamma photon emitted from the j-th pixel is detected by the i-th LOR, and X is a one-dimensional array representing the two-dimensional image.

The probability of A_ij is then the Poisson probability:

p(A_{ij}) = \frac{(H_{ij} X_j)^{A_{ij}}}{A_{ij}!} \, e^{-H_{ij} X_j}    (3)

Taking the logarithm and simplifying yields

\ln L(A \mid X) = \sum_i \sum_j \left( A_{ij} \ln(H_{ij} X_j) - H_{ij} X_j \right) + \mathrm{const}    (4)

where const denotes terms unrelated to X.

To eliminate the random variable A_ij from the above formula, its conditional expectation given the measured data is used in place of the original random variable; the maximum likelihood can then be computed as if the expected number of photons emitted by every pixel in every direction were known, which enables the iterative update.

Since iterative updating is required to reconstruct the image, the current estimate \hat{A}_ij serves as A_ij in the next iteration, and X^c (c for "current") denotes the current image. That is,

\hat{A}_{ij} = E(A_{ij} \mid Y_i, X^c)    (5)

Y_i = \sum_j A_{ij}    (6)

The expectation follows from the binomial distribution: conditioned on the total count Y_i, each A_ij satisfies r ~ B(n, p) with E(r) = np, where here n = Y_i and

p = \frac{H_{ij} X_j^c}{\sum_k H_{ik} X_k^c}    (7)

and beta_ij = H_ij X_j. Thus

\hat{A}_{ij} = Y_i \, \frac{H_{ij} X_j^c}{\sum_k H_{ik} X_k^c}    (8)

Using the simplified log-likelihood L(A \mid X) = \sum_{ij} \left( A_{ij} \ln(H_{ij} X_j) - H_{ij} X_j \right) and taking the partial derivative with respect to X_j gives

\frac{\partial L}{\partial X_j} = \sum_i \left( \frac{A_{ij}}{X_j} - H_{ij} \right)    (9)

Substituting the computed \hat{A}_ij and setting the partial derivative to 0 gives the extreme point:

X_j = \frac{\sum_i \hat{A}_{ij}}{\sum_i H_{ij}}    (10)

Each resulting X_j is used as X^c in the next iteration to obtain the updated X_j, yielding the iterative formula:

X_j^{new} = \frac{X_j^c}{\sum_i H_{ij}} \sum_i H_{ij} \frac{Y_i}{\sum_k H_{ik} X_k^c}    (11)
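For illustration only, the following is a minimal NumPy sketch of the iterative update of equation (11); the function name, array shapes, and iteration count are assumptions of this sketch, not part of the original disclosure:

    import numpy as np

    def mlem(H, Y, n_iter=50):
        """Basic MLEM update of equation (11). H: (num_LORs, num_pixels)
        system matrix; Y: (num_LORs,) measured counts; returns the image X."""
        X = np.ones(H.shape[1])                  # uniform initial image X^c
        sens = H.sum(axis=0)                     # sensitivity term sum_i H_ij
        for _ in range(n_iter):
            proj = H @ X                         # forward projection sum_k H_ik X_k^c
            ratio = Y / np.maximum(proj, 1e-12)  # guard against division by zero
            X = X / np.maximum(sens, 1e-12) * (H.T @ ratio)  # equation (11)
        return X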
and step S2, fusing the dynamic frame in the dynamic PET as prior information into a forward projection model to obtain a reconstructed image.
Specifically, the raw data forward projection model
y=PKα+r (12)
Where α represents the coefficient vector of K, P represents the system matrix, and r represents lightScattering and random number of the sub-pixels, K being the key, K representing the KNN method used to find each pixel fiOf (a) is close tojThe relationship (2) of (c).
Kij=K(fi,fj) (13)
The PET image reconstruction for low count scans is altered by finding KNN (k-nearest neighbor algorithm) based on its euclidean distance, fusing the dynamic frames in dynamic PET as prior information into a forward projection model. The step S1 is used as a basis to reconstruct the image, and the time frame is divided into three time segments, which are respectively accumulated to obtain features, which are used as prior information to reconstruct the PET image.
Similarly to step S1, the projection model

y = PK\alpha + r    (14)

is combined with the Poisson likelihood function to estimate the coefficient image of the kernel:

\hat{\alpha} = \arg\max_{\alpha} \left( L(y \mid \alpha) - b\,P(\alpha) \right)    (15)

In the above formula, bP(\alpha) is a penalty term with regularization parameter b. After the kernel-based coefficient image is obtained, the reconstructed image follows from the projection model:

\hat{x} = K \hat{\alpha}    (16)

The reconstructed image x is already regularized by K, so when estimating the maximum likelihood function the regularization parameter can be set to b = 0, and the above simplifies to:

\hat{\alpha} = \arg\max_{\alpha} L(y \mid \alpha)    (17)

After carrying out the expectation calculation and logarithmic simplification, the K-based iterative formula is obtained, analogously to step S1:

\alpha^{n+1} = \frac{\alpha^n}{K^T P^T 1_M} \odot \left( K^T P^T \frac{y}{P K \alpha^n + r} \right)    (18)

where 1_M is an all-ones vector of length equal to the number of LORs and the multiplications and divisions between vectors are element-wise.
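For illustration only, a minimal sketch of building the kernel matrix K via KNN and running the iterative update of equation (18), using NumPy and scikit-learn's NearestNeighbors; the Gaussian kernel, the neighbor count, and all names are assumptions of this sketch, not part of the original disclosure (a practical implementation would use a sparse K):

    import numpy as np
    from sklearn.neighbors import NearestNeighbors

    def build_kernel(features, k=48, sigma=1.0):
        """Kernel matrix K_ij = K(f_i, f_j) over the k nearest neighbors (by
        Euclidean distance) of each pixel's feature vector, equation (13)."""
        n = features.shape[0]
        nn = NearestNeighbors(n_neighbors=k).fit(features)
        dist, idx = nn.kneighbors(features)       # per-pixel Euclidean KNN
        K = np.zeros((n, n))
        rows = np.repeat(np.arange(n), k)
        K[rows, idx.ravel()] = np.exp(-dist.ravel() ** 2 / (2 * sigma ** 2))
        return K / K.sum(axis=1, keepdims=True)   # row-normalize

    def kernel_em(P, K, y, r, n_iter=50):
        """Kernel EM update of equation (18); returns x = K @ alpha."""
        alpha = np.ones(K.shape[1])
        sens = K.T @ (P.T @ np.ones(P.shape[0]))  # K^T P^T 1_M
        for _ in range(n_iter):
            proj = P @ (K @ alpha) + r            # forward model y = PK*alpha + r
            alpha = alpha / np.maximum(sens, 1e-12) * (
                K.T @ (P.T @ (y / np.maximum(proj, 1e-12))))
        return K @ alpha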
In summary, the present invention reconstructs the dynamic PET image to obtain slice images corresponding to a first group of time frames (step S1), and reconstructs the PET image based on the kernel method to obtain slice images corresponding to a second group of time frames (step S2); the two reconstruction results are PET reconstructed images obtained in different manners.
In step S3, the images obtained in step S1 and step S2 are decomposed frame by frame using the wavelet transform and fused.

The wavelet transform decomposes an image, in the frequency domain, into sub-images occupying different frequency bands, each representing characteristic components of the original image. After wavelet decomposition the image is split into a low-frequency image and high-frequency images, with corresponding low-frequency and high-frequency coefficients: the low-frequency component carries the contours of the image, and the high-frequency components carry its details. Then, in each frequency channel, the wavelet coefficients are processed with a fusion criterion suited to that channel, so that the new, processed wavelet coefficients preserve more of the features of each band. Finally, an inverse wavelet transform of the new wavelet coefficients yields the fused image.
Specifically, after each wavelet transform the image is decomposed into four bands LL, HL, LH and HH. With h and g denoting the low-pass and high-pass decomposition filters, the four bands at level j are:

the LL band, which holds the original image content information and where the image energy is concentrated:

f_j^{LL}(m, n) = \sum_k \sum_l h(k - 2m)\, h(l - 2n)\, f_{j-1}(k, l)    (19)

the HL band, which holds the high-frequency information in the horizontal direction of the image:

f_j^{HL}(m, n) = \sum_k \sum_l g(k - 2m)\, h(l - 2n)\, f_{j-1}(k, l)    (20)

the LH band, which holds the high-frequency information in the vertical direction of the image:

f_j^{LH}(m, n) = \sum_k \sum_l h(k - 2m)\, g(l - 2n)\, f_{j-1}(k, l)    (21)

the HH band, which holds the high-frequency information in the diagonal direction of the image:

f_j^{HH}(m, n) = \sum_k \sum_l g(k - 2m)\, g(l - 2n)\, f_{j-1}(k, l)    (22)

where f(k, l) denotes the image detail projected at each scale, (m, n) denotes a pixel of the image, and j denotes the level of the low-pass and high-pass decomposition.
For example, when the image is subjected to a three-level wavelet transform, the resulting frequency-band layout is shown in fig. 3. It should be understood that more levels of wavelet transform may also be employed.

In one embodiment, taking the three-level wavelet transform as an example, for the high-frequency sub-bands H1(j, k) and H2(j, k) of the two images, the maximum of the two is taken as the high-frequency information H(j, k) of the fused image, where j = 1, 2, 3 indexes the decomposition level. For the low-frequency information L1(j, k) and L2(j, k) of the two images, the low-frequency information L(j, k) of the fused image is obtained by averaging. A code sketch of this fusion rule is given below.
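For illustration only, a minimal sketch of this fusion rule using NumPy and the PyWavelets package; the wavelet name ("db2"), the decomposition level, and all array and function names are assumptions of this sketch, not part of the original disclosure, and the "maximum value" rule is interpreted here as keeping the coefficient of larger magnitude:

    import numpy as np
    import pywt

    def wavelet_fuse(img1, img2, wavelet="db2", level=3):
        """Fuse two reconstructions: average the low-frequency band, keep the
        larger-magnitude coefficient in each high-frequency band."""
        c1 = pywt.wavedec2(img1, wavelet, level=level)
        c2 = pywt.wavedec2(img2, wavelet, level=level)
        fused = [(c1[0] + c2[0]) / 2.0]           # low frequency: average
        for d1, d2 in zip(c1[1:], c2[1:]):        # detail sub-bands per level
            fused.append(tuple(np.where(np.abs(a) >= np.abs(b), a, b)
                               for a, b in zip(d1, d2)))
        return pywt.waverec2(fused, wavelet)      # inverse transform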
Step S4: evaluate the fusion result and iterate to obtain the optimal solution.

For example, as shown in fig. 4, the fusion result of the obtained fused image is assessed using the mean and the standard deviation. The mean represents the average brightness of the image, and the standard deviation reflects how far the image gray levels are dispersed from the mean: a large standard deviation means most values differ greatly from their mean, while a small standard deviation means the values lie close to the mean, i.e. the smaller the standard deviation, the more consistent the data.
For example, the fusion result is evaluated using the information entropy of the image, defined as the average amount of information obtained from observing a single source-symbol output. The entropy is maximal when all source symbols occur with equal probability, in which case the source provides the largest possible average amount of information. The larger the information entropy, the more information the image contains, i.e. the better the fusion effect.
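For reference, the information entropy used here is the standard Shannon entropy; writing p_i for the proportion of pixels at gray level i (estimated from the normalized image histogram, an assumption of this note), it reads:

H = -\sum_{i=0}^{255} p_i \log_2 p_i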
For example, the improvement in image sharpness can be represented by the average gradient; the image gradient also reflects fine detail contrast and texture variation in the image:

\bar{G} = \frac{1}{A \cdot B} \sum_{i=1}^{A} \sum_{j=1}^{B} \sqrt{\frac{\Delta_x f(i, j)^2 + \Delta_y f(i, j)^2}{2}}    (23)

where A and B denote the size of the image, and \Delta_x f(i, j) and \Delta_y f(i, j) are the first-order differences of pixel (i, j) in the x and y directions:

\Delta_x f(i, j) = f(i + 1, j) - f(i, j)    (24)

\Delta_y f(i, j) = f(i, j + 1) - f(i, j)    (25)
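For illustration only, a minimal NumPy sketch of these evaluation measures (mean/standard deviation, information entropy over 256 gray-level bins, and the average gradient of equations (23)-(25)); the histogram binning and function name are assumptions of this sketch, not part of the original disclosure:

    import numpy as np

    def evaluate_fusion(img):
        """Mean, standard deviation, Shannon entropy (256 gray-level bins),
        and average gradient (equations (23)-(25)) of a fused image."""
        mean, std = float(img.mean()), float(img.std())
        hist, _ = np.histogram(img, bins=256)
        p = hist / hist.sum()
        p = p[p > 0]                              # drop empty bins before log
        entropy = float(-(p * np.log2(p)).sum())
        dx = img[1:, :-1] - img[:-1, :-1]         # Delta_x f(i, j)
        dy = img[:-1, 1:] - img[:-1, :-1]         # Delta_y f(i, j)
        avg_grad = float(np.mean(np.sqrt((dx ** 2 + dy ** 2) / 2.0)))
        return mean, std, entropy, avg_grad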
Each frame of the dynamic PET acquisition is fused through this iteration, yielding a wavelet-fused dynamic PET image.
Besides fusing the images obtained in step S1 and step S2, other reconstruction algorithms may, after suitable adaptation of the invention, be used for the fusion depending on the reconstruction effect, for example filtered back-projection reconstruction or least-squares-based PET image reconstruction.
In conclusion, by exploiting the differences between the results of different PET image reconstruction algorithms, the invention fuses the details of the high-frequency components and the contours of the low-frequency components frame by frame, thereby reducing the noise of the reconstructed image and enhancing its details. Moreover, even with a reduced reconstruction count, the image reconstructed from low-dose data preserves contours and details well and improves the fit between the net influx rate of the tumor region and its true value.
The present invention may be a system, method and/or computer program product. The computer program product may include a computer-readable storage medium having computer-readable program instructions embodied therewith for causing a processor to implement various aspects of the present invention.
The computer readable storage medium may be a tangible device that can hold and store the instructions for use by the instruction execution device. The computer readable storage medium may be, for example, but not limited to, an electronic memory device, a magnetic memory device, an optical memory device, an electromagnetic memory device, a semiconductor memory device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a Static Random Access Memory (SRAM), a portable compact disc read-only memory (CD-ROM), a Digital Versatile Disc (DVD), a memory stick, a floppy disk, a mechanical coding device, such as punch cards or in-groove projection structures having instructions stored thereon, and any suitable combination of the foregoing. Computer-readable storage media as used herein is not to be construed as transitory signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through a waveguide or other transmission medium (e.g., optical pulses through a fiber optic cable), or electrical signals transmitted through electrical wires.
The computer-readable program instructions described herein may be downloaded from a computer-readable storage medium to a respective computing/processing device, or to an external computer or external storage device via a network, such as the internet, a local area network, a wide area network, and/or a wireless network. The network may include copper transmission cables, fiber optic transmission, wireless transmission, routers, firewalls, switches, gateway computers and/or edge servers. The network adapter card or network interface in each computing/processing device receives computer-readable program instructions from the network and forwards the computer-readable program instructions for storage in a computer-readable storage medium in the respective computing/processing device.
The computer program instructions for carrying out operations of the present invention may be assembler instructions, Instruction Set Architecture (ISA) instructions, machine-related instructions, microcode, firmware instructions, state-setting data, or source code or object code written in any combination of one or more programming languages, including an object-oriented programming language such as Smalltalk, C++, or Python, and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The computer-readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider). In some embodiments, aspects of the present invention are implemented by personalizing an electronic circuit, such as a programmable logic circuit, a field-programmable gate array (FPGA), or a programmable logic array (PLA), with state information of computer-readable program instructions, which can execute the computer-readable program instructions.
Aspects of the present invention are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the invention. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer-readable program instructions.
These computer-readable program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. These computer-readable program instructions may also be stored in a computer-readable storage medium that can direct a computer, programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer-readable medium storing the instructions comprises an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
The computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer, other programmable apparatus or other devices implement the functions/acts specified in the flowchart and/or block diagram block or blocks.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions. It is well known to those skilled in the art that implementation by hardware, by software, and by a combination of software and hardware are equivalent.
Having described embodiments of the present invention, the foregoing description is intended to be exemplary, not exhaustive, and not limited to the embodiments disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the described embodiments. The terminology used herein is chosen in order to best explain the principles of the embodiments, the practical application, or improvements made to the technology in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein. The scope of the invention is defined by the appended claims.

Claims (10)

1. A PET image reconstruction method based on wavelet fusion comprises the following steps:
step S11, performing wavelet decomposition on the first PET image to obtain a corresponding first high-frequency image and first low-frequency image, and performing wavelet decomposition on the second PET image to obtain a corresponding second high-frequency image and second low-frequency image, wherein the first PET image and the second PET image are PET reconstructed images obtained in different manners;
step S12, in each frequency channel, processing the wavelet coefficients according to a set fusion criterion, and performing an inverse wavelet transform on the processed new wavelet coefficients to obtain a fused image;
and step S13, evaluating the fused image and iterating to obtain an optimal solution, taking the fused image that meets the set evaluation criterion as the reconstructed image.
2. The method according to claim 1, wherein step S11 includes:
performing multi-level wavelet decomposition on the first PET image and the second PET image respectively, the obtained first high-frequency image being represented as H1(j, k), the first low-frequency image as L1(j, k), the second high-frequency image as H2(j, k), and the second low-frequency image as L2(j, k), wherein j denotes the decomposition-level index of the wavelet decomposition.
3. The method according to claim 2, wherein step S12 includes:
taking the maximum value of the first high-frequency image H1(j, k) and the second high-frequency image H2(j, k) as the high-frequency image information H(j, k) of the fused image;
for the first low-frequency image L1(j, k) and the second low-frequency image L2(j, k), obtaining the low-frequency image information L(j, k) of the fused image by averaging.
4. The method of claim 1, wherein the fused image is evaluated using one or more of the mean, the standard deviation, and the average gradient, wherein the mean represents the average brightness of the fused image, the standard deviation reflects the dispersion of the gray levels of the fused image about the mean, and the average gradient reflects the fine detail contrast and texture variation in the fused image.
5. The method of claim 4, wherein the average gradient is expressed as:

\bar{G} = \frac{1}{A \cdot B} \sum_{i=1}^{A} \sum_{j=1}^{B} \sqrt{\frac{\Delta_x f(i, j)^2 + \Delta_y f(i, j)^2}{2}}

wherein A and B denote the size of the image, and \Delta_x f(i, j) and \Delta_y f(i, j) are the first-order differences of pixel (i, j) in the x and y directions, respectively.
6. The method of claim 1, wherein the first PET image is a PET reconstructed image obtained according to the Kernel EM algorithm.
7. The method of claim 1, wherein the second PET image is obtained according to the following steps:

estimating the coefficient image of the kernel by combining the projection model y = PK\alpha + r with the Poisson likelihood function:

\hat{\alpha} = \arg\max_{\alpha} \left( L(y \mid \alpha) - b\,P(\alpha) \right)

after obtaining the kernel-based coefficient image, obtaining the reconstructed image from the projection model, expressed as:

\hat{x} = K \hat{\alpha}

setting the regularization parameter b = 0 when estimating the maximum likelihood function, which is then expressed as:

\hat{\alpha} = \arg\max_{\alpha} L(y \mid \alpha)

and, after carrying out the expectation calculation and logarithmic simplification, obtaining the K-based iterative formula:

\alpha^{n+1} = \frac{\alpha^n}{K^T P^T 1_M} \odot \left( K^T P^T \frac{y}{P K \alpha^n + r} \right)

wherein \alpha denotes the coefficient vector of K, P denotes the system matrix, r denotes the scattered and random photon events, K denotes the kernel matrix expressing the relationship between each pixel's feature vector f_i and its neighbors f_j, bP(\alpha) is a penalty function with regularization parameter b, and \alpha^n denotes the coefficients of the current image X^current = K\alpha^n.
8. The method of claim 2, wherein the first PET image and the second PET image are each subjected to a three-level wavelet decomposition.
9. A computer-readable storage medium, on which a computer program is stored, which program, when executed by a processor, carries out the steps of the method according to any one of claims 1 to 8.
10. A computer device comprising a memory and a processor, the memory storing a computer program executable on the processor, characterized in that the processor, when executing the program, implements the steps of the method of any one of claims 1 to 8.
CN202110378077.2A 2021-04-08 2021-04-08 PET image reconstruction method based on wavelet fusion Pending CN113160346A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110378077.2A CN113160346A (en) 2021-04-08 2021-04-08 PET image reconstruction method based on wavelet fusion

Publications (1)

Publication Number Publication Date
CN113160346A true CN113160346A (en) 2021-07-23

Family

ID=76889091

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110378077.2A Pending CN113160346A (en) 2021-04-08 2021-04-08 PET image reconstruction method based on wavelet fusion

Country Status (1)

Country Link
CN (1) CN113160346A (en)

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105844606A (en) * 2016-03-22 2016-08-10 博康智能网络科技股份有限公司 Wavelet transform-based image fusion method and system thereof
CN107527359A (en) * 2017-08-07 2017-12-29 沈阳东软医疗系统有限公司 A kind of PET image reconstruction method and PET imaging devices
CN109300096A (en) * 2018-08-07 2019-02-01 北京智脉识别科技有限公司 A kind of multi-focus image fusing method and device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Guobao Wang et al., "PET Image Reconstruction Using Kernel Method", IEEE Transactions on Medical Imaging *

Similar Documents

Publication Publication Date Title
Ghani et al. Fast enhanced CT metal artifact reduction using data domain deep learning
US11080897B2 (en) Systems and methods for a PET image reconstruction device
US7831097B2 (en) System and method for image reconstruction
US9159122B2 (en) Image domain de-noising
CN107123095B (en) PET image reconstruction method and imaging system
EP3501006B1 (en) Feature-based image processing using feature images extracted from different iterations
CN111709897B (en) Domain transformation-based positron emission tomography image reconstruction method
Budhiraja et al. An efficient image denoising scheme for higher noise levels using spatial domain filters
CN105164725A (en) De-noised reconstructed image data edge improvement
Ma et al. Generalized Gibbs priors based positron emission tomography reconstruction
US20170172534A1 (en) Thoracic imaging for cone beam computed tomography
Lee et al. A metal artifact reduction method using a fully convolutional network in the sinogram and image domains for dental computed tomography
Rai et al. Augmented noise learning framework for enhancing medical image denoising
Yin et al. Unpaired low-dose CT denoising via an improved cycle-consistent adversarial network with attention ensemble
Li et al. Neural KEM: A kernel method with deep coefficient prior for PET image reconstruction
Karimi et al. Reducing streak artifacts in computed tomography via sparse representation in coupled dictionaries
Lepcha et al. A constructive non-local means algorithm for low-dose computed tomography denoising with morphological residual processing
Du et al. DRGAN: a deep residual generative adversarial network for PET image reconstruction
Zhu et al. Bayesian algorithms for PET image reconstruction with mean curvature and Gauss curvature diffusion regularizations
CN113160346A (en) PET image reconstruction method based on wavelet fusion
Narasimha et al. An effective tumor detection approach using denoised MRI based on fuzzy bayesian segmentation approach
CN113129296B (en) Image processing method based on denoising automatic encoder under transform domain as prior
WO2022213321A1 (en) Wavelet fusion-based pet image reconstruction method
Pandey et al. 99m-Tc MDP bone scan image enhancement using pipeline application of dynamic stochastic resonance algorithm and block-matching 3D filter
Rodríguez-Gallo et al. Inpainting-filtering for metal artifact reduction (IMIF-MAR) in computed tomography

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20210723)