CN112614200A - PET image reconstruction method, device and equipment - Google Patents

PET image reconstruction method, device and equipment

Info

Publication number
CN112614200A
CN112614200A (application CN202011507583.9A)
Authority
CN
China
Prior art keywords
image
pixel
gray level
prior
pet
Prior art date
Legal status
Pending
Application number
CN202011507583.9A
Other languages
Chinese (zh)
Inventor
高东芳
胡战利
杨永峰
曾天翼
杨茜
梁栋
刘新
郑海荣
Current Assignee
Shenzhen Institute of Advanced Technology of CAS
Original Assignee
Shenzhen Institute of Advanced Technology of CAS
Priority date
Filing date
Publication date
Application filed by Shenzhen Institute of Advanced Technology of CAS filed Critical Shenzhen Institute of Advanced Technology of CAS
Priority to CN202011507583.9A
Publication of CN112614200A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 11/00 2D [Two Dimensional] image generation
    • G06T 11/003 Reconstruction from projections, e.g. tomography
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/40 Analysis of texture
    • G06T 7/41 Analysis of texture based on statistical description of texture
    • G06T 7/45 Analysis of texture based on statistical description of texture using co-occurrence matrix computation

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Computing Systems (AREA)
  • Mathematical Physics (AREA)
  • Probability & Statistics with Applications (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Nuclear Medicine (AREA)

Abstract

The invention discloses a PET image reconstruction method, device and equipment, wherein the method comprises the following steps: acquiring a prior image, wherein the prior image comprises an anatomical image and an autocorrelation feature image, and the autocorrelation feature image is determined according to a gray level co-occurrence matrix of the anatomical image; acquiring a feature value of the prior image; and reconstructing the PET image according to the feature value and an iterative algorithm.

Description

PET image reconstruction method, device and equipment
Technical Field
The invention relates to the technical field of PET imaging, in particular to a method, a device and equipment for reconstructing a PET image.
Background
PET (Positron Emission Tomography) reflects metabolic processes in vivo by noninvasive imaging with a radioactive tracer, and can be used for early clinical tumor detection, tumor treatment evaluation, pharmacokinetic analysis and the like. Reconstructing images from low-count PET projection data can not only reduce the injection dose of the radioactive tracer, the scanning time and the economic cost, but also improve the temporal resolution of dynamic PET, and is therefore of great significance in clinical application. However, since low-count PET reconstruction is an ill-conditioned inverse estimation problem, it is difficult to solve, which results in high noise and a low signal-to-noise ratio in the reconstructed image. Therefore, the study of low-count PET reconstruction algorithms has important scientific significance for clinical PET applications.
In the related art, the quality of a PET reconstructed image can be improved based on prior information from an MRI (Magnetic Resonance Imaging) anatomical image. There are two main implementations in the related art: the first introduces the prior information into the target likelihood function through a penalty term, using a penalized likelihood framework, to reconstruct the PET image; the second introduces the similarity of MRI gray values around the voxels of the MRI anatomical image into the forward projection process of PET reconstruction as anatomical prior information, and then performs iterative reconstruction with the classical expectation maximization algorithm to obtain the PET reconstructed image.
Although both implementations in the related art can improve the quality of the PET reconstructed image based on the prior information of the MRI anatomical image, the first is difficult to implement and is not suitable for reconstructing PET images in a general environment. The second is easier to implement than the first, but it only uses the gray value feature of the MRI anatomical image and ignores its other features, so the acquired prior information is insufficient and incomplete, and the PET image reconstructed based on this prior information has poor quality and high noise.
Disclosure of Invention
The embodiment of the invention provides a PET (positron emission tomography) image reconstruction method, device and equipment, which are used to solve the problems of poor quality and high noise of PET images reconstructed based on prior information in the related art.
In order to solve the technical problem, the invention is realized as follows:
in a first aspect, a method for reconstructing a PET image is provided, the method comprising:
acquiring a prior image; the prior image comprises an anatomical image and an autocorrelation feature image; the autocorrelation feature images are determined according to a gray level co-occurrence matrix of the anatomical images;
acquiring a characteristic value of the prior image;
and reconstructing the PET image according to the characteristic value and an iterative algorithm.
In a second aspect, an apparatus for reconstructing a PET image is provided, the apparatus comprising:
the first acquisition module is used for acquiring a prior image; the prior image comprises an anatomical image and an autocorrelation feature image; the autocorrelation feature images are determined according to a gray level co-occurrence matrix of the anatomical images;
the second acquisition module is used for acquiring the characteristic value of the prior image;
and the reconstruction module is used for reconstructing the PET image according to the characteristic value and the iterative algorithm.
In a third aspect, an apparatus is provided, comprising: a memory, a processor and a computer program stored on the memory and executable on the processor, the computer program, when executed by the processor, implementing the steps of the method according to the first aspect as described above.
In a fourth aspect, a computer-readable storage medium is provided, having stored thereon a computer program which, when executed by a processor, carries out the steps of the method according to the first aspect as described above.
The at least one technical scheme provided by the embodiment of the invention can achieve the following technical effects:
because the gray level co-occurrence matrix is combined with the gray level characteristics and the spatial texture characteristics of the MRI image when the PET image is reconstructed based on the prior image, more accurate texture characteristics can be obtained by calculating fewer quantized gray level numbers for a small image area, thereby effectively improving the quality of the reconstructed image and reducing the noise of the reconstructed image.
In addition, the present embodiment combines the spatial texture features and the gray level features of the MRI anatomical image, and applies this prior information to the PET image reconstruction. Therefore, for the reconstruction of low-count PET data, the signal-to-noise ratio of the reconstructed PET image and the accuracy of tumor reconstruction are improved, so that a PET image better meeting the requirements of clinical diagnosis is obtained.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description serve to explain the invention and not to limit the invention. In the drawings:
fig. 1 is a schematic flowchart of a PET image reconstruction method according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of pixel pair directions of a gray level co-occurrence matrix according to an embodiment of the present invention;
FIG. 3 is a schematic flow chart of determining an autocorrelation feature image of an MRI image according to an embodiment of the present invention;
FIG. 4 is one of application scenarios of a PET image reconstruction method according to an embodiment of the present invention;
fig. 5 is a second application scenario diagram of the PET image reconstruction method according to an embodiment of the present invention;
fig. 6 is a schematic block diagram of a PET image reconstruction apparatus 600 according to an embodiment of the present invention;
fig. 7 is a schematic hardware structure diagram of a PET image reconstruction device according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the technical solutions of the present invention will be clearly and completely described below with reference to the specific embodiments of the present invention and the accompanying drawings. It is to be understood that the described embodiments are merely exemplary of the invention, and not restrictive of the full scope of the invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The technical solutions provided by the embodiments of the present invention are described in detail below with reference to the accompanying drawings.
Referring to fig. 1, fig. 1 is a schematic flowchart of a PET image reconstruction method according to an embodiment of the present invention. As shown in fig. 1, the method includes the following steps:
step 102: acquiring a prior image; the prior image comprises an anatomical image and an autocorrelation characteristic image; the autocorrelation feature images are determined from a gray level co-occurrence matrix of the anatomical images.
Step 104: and acquiring the characteristic value of the prior image.
Step 106: and reconstructing the PET image according to the characteristic value and an iterative algorithm.
In this embodiment, the anatomical image may specifically be an MRI anatomical image.
After the acquisition of the anatomical image, the present embodiment may acquire a prior image from the anatomical image. Unlike the prior information in the related art, the prior image in the present embodiment includes an anatomical image and an autocorrelation feature image, where the autocorrelation feature image is determined according to a gray level co-occurrence matrix of the anatomical image.
In one embodiment, after the anatomical image is acquired, a gray level co-occurrence matrix of the anatomical image may be determined, and an auto-correlation feature image of the anatomical image may be acquired according to the gray level co-occurrence matrix of the anatomical image.
When determining the gray level co-occurrence matrix of the anatomical image, the gray levels of the pixels in the anatomical image may first be quantized. Then, an initial gray level co-occurrence matrix of the gray-level-quantized anatomical image may be obtained in each set direction according to the set step length and directions. Normalization processing may then be performed on each initial gray level co-occurrence matrix to obtain the normalized gray level co-occurrence matrix, and the autocorrelation feature image of the anatomical image may be determined according to the autocorrelation feature values of the normalized gray level co-occurrence matrices.
The determination process of the above-mentioned autocorrelation feature image of the anatomical image is described below with reference to a specific example:
In one example, the gray level co-occurrence matrix of the anatomical image may be represented as P(v_g, v_k), which may be determined by calculating, for a set step length d, the probability that a pixel pair with image gray values v_g and v_k appears in a certain direction.
When determining the autocorrelation feature image of the anatomical image, the anatomical image (or a specific or designated region of the anatomical image) may be quantized to a designated number of gray levels, e.g. to N_g gray levels, and the pixel values of the anatomical image are then mapped to the range [1, N_g]. In this case, the gray level co-occurrence matrix P(v_g, v_k) has size [N_g, N_g]. It follows that the number of gray levels used for image quantization determines the size of the corresponding gray level co-occurrence matrix.
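The quantization step can be sketched as follows (an illustrative sketch, not the patent's implementation; the function name and the linear binning scheme are assumptions):

```python
import numpy as np

def quantize(img, n_levels):
    """Linearly bin pixel values into integer gray levels 1..n_levels,
    so that a gray level co-occurrence matrix of size [n_levels, n_levels]
    can be built over the quantized image."""
    lo, hi = float(img.min()), float(img.max())
    if hi == lo:  # flat region: everything maps to level 1
        return np.ones(img.shape, dtype=int)
    q = np.floor((img - lo) / (hi - lo) * n_levels).astype(int) + 1
    return np.clip(q, 1, n_levels)
```

For example, quantizing a region to N_g = 8 levels yields an 8x8 gray level co-occurrence matrix.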
After the pixel values of the anatomical image are quantized as described above, the step length and the directions of the pixel pairs may be set. The step length d may be set to 1. When setting the directions, 8 directions of the pixel pair may be considered, namely horizontal (i.e., 0°), vertical (i.e., 90°), diagonal (i.e., 45° and 135°), and the reverse of each of these directions, as shown in fig. 2. Since the pixel pair (v_g, v_k) occurs in a direction the same number of times as (v_k, v_g) occurs in the opposite direction, only the gray level co-occurrence matrices in the four directions shown in fig. 2, i.e., 0°, 45°, 90° and 135°, need to be calculated; they are denoted P_0°(v_g, v_k), P_45°(v_g, v_k), P_90°(v_g, v_k) and P_135°(v_g, v_k), as shown in fig. 3.
In this example, the element values of P_0°(v_g, v_k) may be calculated by counting the number of occurrences of the pixel pair (v_g, v_k) in the 0° direction; correspondingly, the element values of P_45°(v_g, v_k), P_90°(v_g, v_k) and P_135°(v_g, v_k) may be calculated by counting the number of occurrences of the pixel pair (v_g, v_k) in the 45°, 90° and 135° directions, respectively.
After P_0°(v_g, v_k), P_45°(v_g, v_k), P_90°(v_g, v_k) and P_135°(v_g, v_k) are obtained, normalization processing may be performed on the four gray level co-occurrence matrices to obtain the normalized gray level co-occurrence matrices.

In one example, the normalization formula may be:

p(v_g, v_k) = P(v_g, v_k) / Σ_{g=1..N_g} Σ_{k=1..N_g} P(v_g, v_k)

where P may be a gray level co-occurrence matrix before normalization, such as P_0°, P_45°, P_90° or P_135°, and p may be the corresponding gray level co-occurrence matrix after normalization processing.
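The four-direction co-occurrence counting and normalization described above can be sketched as follows (illustrative only, not the patent's code; the offset convention for the angles, with image rows indexed top to bottom, is an assumption):

```python
import numpy as np

def glcm(img, dx, dy, n_levels):
    """Count pixel pairs (v_g, v_k) for one offset (dx, dy) at step d = 1;
    img holds quantized gray levels in 1..n_levels."""
    h, w = img.shape
    P = np.zeros((n_levels, n_levels))
    for y in range(h):
        for x in range(w):
            y2, x2 = y + dy, x + dx
            if 0 <= y2 < h and 0 <= x2 < w:
                P[img[y, x] - 1, img[y2, x2] - 1] += 1
    return P

def normalized_glcms(img, n_levels):
    """Normalized GLCMs p = P / sum(P) for the four directions
    0, 45, 90 and 135 degrees (offsets in image coordinates)."""
    offsets = {0: (1, 0), 45: (1, -1), 90: (0, -1), 135: (-1, -1)}
    out = {}
    for ang, (dx, dy) in offsets.items():
        P = glcm(img, dx, dy, n_levels)
        out[ang] = P / P.sum()
    return out
```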
As shown in fig. 3, after obtaining the normalized gray level co-occurrence matrix, the autocorrelation feature value of the normalized gray level co-occurrence matrix may be obtained, so as to determine the autocorrelation feature image of the anatomical image according to the obtained autocorrelation feature value.
In an embodiment, after the normalized gray level co-occurrence matrix is obtained, its autocorrelation feature value may be calculated according to a preset autocorrelation feature value calculation formula.
In one example, the preset autocorrelation feature value calculation formula may be:

f_a = Σ_{g=1..N_g} Σ_{k=1..N_g} v_g · v_k · p(v_g, v_k)

where f_a represents the autocorrelation feature value; N_g represents the number of gray levels; v_g and v_k represent image gray values; and p(v_g, v_k) represents the entry of the normalized gray level co-occurrence matrix for the pixel pair (v_g, v_k).
After the autocorrelation feature values of the four directions, i.e., 0 °, 45 °, 90 °, and 135 °, are calculated according to the preset autocorrelation feature value calculation formula, as shown in fig. 3, the average value of the autocorrelation feature values of the four directions may be used as the autocorrelation texture feature of the anatomical image, and an autocorrelation feature image may be generated based on the autocorrelation texture feature.
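The autocorrelation feature value and its four-direction average can be sketched as follows (function names are illustrative assumptions; p is a normalized gray level co-occurrence matrix as above):

```python
import numpy as np

def autocorrelation_feature(p):
    """f_a = sum_g sum_k v_g * v_k * p(v_g, v_k), with quantized gray
    levels v = 1..N_g and p a normalized gray level co-occurrence matrix."""
    v = np.arange(1, p.shape[0] + 1)
    return float((np.outer(v, v) * p).sum())

def autocorrelation_texture(glcms_by_direction):
    """Average of f_a over the four directions (0, 45, 90, 135 degrees),
    used as the autocorrelation texture feature of the region."""
    vals = [autocorrelation_feature(p) for p in glcms_by_direction]
    return sum(vals) / len(vals)
```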
After the autocorrelation feature image of the anatomical image is acquired, a prior image may be acquired from the anatomical image and the autocorrelation feature image of the anatomical image.
In this embodiment, after the prior image is acquired, the feature value of the prior image may be acquired.
In one example, the eigenvalues of the prior image may be a matrix.
In this embodiment, the feature values of the prior image may be determined according to a radial Gaussian kernel function. In one example, the radial Gaussian kernel function may be:

K(f_j, f_j') = exp( −‖f_j − f_j'‖² / (2σ²) )

where K represents the feature value of the prior image; pixel j represents any pixel in the prior image; j' represents a pixel within the neighborhood of pixel j; f_j and f_j' represent the values of pixels j and j' in the prior image; and σ represents the width parameter of the radial Gaussian kernel.
In this embodiment, the eigenvalue K of the prior image may be a kernel matrix, and may have a variety of representations.
In this embodiment, after the feature value K of the prior image is obtained, the k nearest neighbor pixels of each pixel may also be obtained using the k-nearest-neighbor method, so as to obtain a sparse kernel matrix K.
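A minimal sketch of the sparse kernel matrix construction, assuming the radial Gaussian kernel above and treating the k largest kernel values in each row as the k nearest neighbors (the names and the dense distance computation are illustrative assumptions):

```python
import numpy as np

def sparse_kernel_matrix(features, sigma, k):
    """Radial Gaussian kernel K(f_j, f_j') = exp(-||f_j - f_j'||^2 / (2 sigma^2)),
    keeping for each pixel j only its k nearest neighbors (largest kernel
    values), which yields a sparse kernel matrix."""
    n = features.shape[0]
    # squared Euclidean distances between all feature vectors
    d2 = ((features[:, None, :] - features[None, :, :]) ** 2).sum(axis=-1)
    K = np.exp(-d2 / (2.0 * sigma ** 2))
    keep = np.argsort(-K, axis=1)[:, :k]   # indices of k nearest neighbors
    S = np.zeros_like(K)
    rows = np.arange(n)[:, None]
    S[rows, keep] = K[rows, keep]          # zero out all other entries
    return S
```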
After the feature values of the prior image are obtained, the PET image can be reconstructed according to the feature values and an iterative algorithm.
In one embodiment, the feature value of the prior image may be obtained by mapping an initial low-dimensional feature value to a high-dimensional space, wherein the initial low-dimensional feature value may be extracted from the prior image according to a kernel method, and the initial low-dimensional feature value may be used to describe each pixel in the prior image. In this embodiment, the feature value of the prior image may be used to linearly describe the pixel value of each pixel in the prior image.
In one embodiment, the target formula may be determined according to a preset pixel gray value calculation formula and the above-mentioned obtained characteristic values.
In one example, the pixel gray value calculation formula may be:

x_j = Σ_{j'=1..n} α_{j'} · K(f_j, f_j')

In the preset pixel gray value calculation formula, j represents any pixel in the prior image; j' represents a pixel within the neighborhood of pixel j; x_j represents the image gray value at pixel j; n represents the number of neighborhood pixels; f_j and f_j' represent the values of pixels j and j' in the prior image; α_j' represents the specified parameter value at pixel j'; and K represents the feature value of the prior image.
As can be seen from the pixel gray value calculation formula and the feature value, the target formula may be x = Kα. In the target formula, the feature value K is obtained through the above steps and is therefore known, while the image x and the specified parameter α are unknown.
In the embodiment of the present invention, the specified parameter may be a coefficient image parameter; α_j' may then represent the coefficient image value at pixel j', where the coefficient image may be the image to which a preset coefficient corresponds.
In this embodiment, when the PET image is reconstructed according to the feature value and the iterative algorithm, the obtained feature value may be added to the forward projection model of the PET image to obtain a target forward projection model and the corresponding target log-likelihood function. Then, according to the expectation maximization method, an estimated value of the specified parameter in the target log-likelihood function may be obtained through an iterative formula, and the reconstructed PET image may be determined according to the target formula, the obtained feature value, and the estimated value of the specified parameter.
In one embodiment, the PET projection data y may obey a Poisson distribution with expectation ȳ. In this case, the following log-likelihood function can be obtained by modeling:

L(y | x) = Σ_{i=1..n_i} ( y_i · log ȳ_i − ȳ_i )

where x may represent the image to be reconstructed. ȳ and x may have the following mapping relationship:

ȳ = G · x + r,  i.e.  ȳ_i = Σ_{j=1..n_j} G_ij · x_j + r_i

where n_i and n_j may represent the number of detector pairs and the number of pixels, respectively. The element G_ij of the PET system matrix G may represent the probability that a photon pair emitted from pixel j is detected by detector pair i, and r may represent random events and scatter events.
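For concreteness, the log-likelihood under the mapping ȳ = Gx + r can be evaluated as in the following sketch (constant terms depending only on y are dropped; names are illustrative assumptions):

```python
import numpy as np

def log_likelihood(y, G, x, r):
    """L(y|x) = sum_i ( y_i * log(ybar_i) - ybar_i ) with ybar = G x + r.
    Constant terms depending only on y are omitted."""
    ybar = G @ x + r
    return float((y * np.log(ybar) - ybar).sum())
```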
In one implementation, after the acquired feature value K is added to the forward projection model of the PET image, the obtained target forward projection model may be:

ȳ = G · K · α + r

Correspondingly, the target log-likelihood function corresponding to the target forward projection model may be:

L(y | α) = Σ_{i=1..n_i} ( y_i · log ȳ_i − ȳ_i ),  where  ȳ = G · K · α + r
then, the coefficient image estimation value can be obtained by an iterative formula using the expectation maximization method widely used at present.
In one example, the iterative formula may be:

α^(n+1) = ( α^n / ( (G·K)^T · 1 ) ) ⊙ ( (G·K)^T · ( y / ( G·K·α^n + r ) ) )

where the multiplications and divisions between vectors are element-wise and 1 denotes an all-ones vector. After the estimated value α̂ of the coefficient image in the target log-likelihood function is obtained through the above iterative formula according to the expectation maximization method, the obtained coefficient image estimate α̂ and the obtained feature value K may be substituted into the target formula to obtain the reconstructed image.

In this case, the reconstructed PET image may be:

x̂ = K · α̂
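The iterative reconstruction described above can be sketched end to end as follows (a minimal illustration, not the patent's implementation; all names are assumptions, and the element-wise operations follow the update formula):

```python
import numpy as np

def kem_reconstruct(y, G, K, r, n_iter=50):
    """Kernelized EM sketch: forward model ybar = G K alpha + r,
    multiplicative EM update on the coefficient image alpha,
    final reconstruction x = K alpha."""
    GK = G @ K
    sens = GK.T @ np.ones_like(y)        # sensitivity term (GK)^T 1
    alpha = np.ones(K.shape[1])          # uniform initial coefficient image
    for _ in range(n_iter):
        ybar = GK @ alpha + r            # current forward projection
        alpha *= (GK.T @ (y / ybar)) / sens
    return K @ alpha                     # reconstructed PET image
```

With G = K = identity and r = 0 the update reduces to the classical MLEM fixed point, which is how the sketch is exercised below.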
from the above, since the present embodiment combines the spatial texture feature and the grayscale feature of the anatomical image MRI, and applies these a priori information to the PET image reconstruction. Therefore, for the reconstruction of the low-count PET data, the signal to noise ratio of the reconstructed PET image and the accuracy of the reconstruction of the tumor image are improved, so that the PET image which meets the requirement of clinical diagnosis better is obtained.
Referring to fig. 4, the present embodiment and the two related-art technologies were verified on a simulated phantom data set, yielding results under count conditions of 50K, 100K, 500K and 5M. As can be seen from fig. 4, the present embodiment can improve the signal-to-noise ratio of the image in low-count reconstruction.
Fig. 5 shows enlarged views of the region of interest of the PET images reconstructed by the present embodiment and by the two related-art technologies under the 500K count condition. As can be seen from fig. 5, the present embodiment reconstructs both the large tumor and the small tumor more accurately, and the recovery of image details is improved.
Tables 1 and 2 respectively show the bias data and variance data of the images reconstructed under different count conditions by the present embodiment and by the two related technologies described in the background art. As can be seen from tables 1 and 2, when the present embodiment is applied to PET data reconstruction at each count level, the bias of the reconstructed PET images is small; in particular, as the count decreases, the advantage in the bias and variance of the reconstructed images becomes more obvious, indicating that the reconstruction result is better for low counts.
TABLE 1
(bias of the images reconstructed under different count conditions; the table contents are only available as an image in the source)
TABLE 2
(variance of the images reconstructed under different count conditions; the table contents are only available as an image in the source)
As can be seen from the above, in the embodiment of the present invention, when the PET image is reconstructed based on the prior image, the gray level co-occurrence matrix combines the gray level features and the spatial texture features of the MRI image, so that more accurate texture features can be obtained for a small image region with a smaller number of quantized gray levels, thereby effectively improving the quality of the reconstructed image and reducing its noise.
In correspondence to the above PET image reconstruction method, an embodiment of the present invention further provides a PET image reconstruction device, as shown in fig. 6, the PET image reconstruction device 600 includes:
a first obtaining module 601, configured to obtain a priori image; the prior image comprises an anatomical image and an autocorrelation feature image; the autocorrelation feature images are determined according to a gray level co-occurrence matrix of the anatomical images;
a second obtaining module 602, configured to obtain a feature value of the prior image;
and a reconstruction module 603 configured to reconstruct the PET image according to the feature value and an iterative algorithm.
Optionally, the first obtaining module 601 is configured to:
quantifying a gray level of a pixel in the anatomical image;
acquiring an initial gray level co-occurrence matrix of the anatomical image subjected to gray level quantization in each setting direction according to the set step length and direction;
carrying out normalization processing on the initial gray level co-occurrence matrix to obtain a gray level co-occurrence matrix after normalization processing;
determining an autocorrelation characteristic image of the anatomical image according to the autocorrelation characteristic value of the normalized gray level co-occurrence matrix;
and acquiring a prior image according to the anatomical image and the self-correlation characteristic image of the anatomical image.
Optionally, the autocorrelation feature value of the normalized gray level co-occurrence matrix is calculated according to a preset autocorrelation feature value calculation formula; the preset autocorrelation feature value calculation formula is:

f_a = Σ_{g=1..N_g} Σ_{k=1..N_g} v_g · v_k · p(v_g, v_k)

where f_a represents the autocorrelation feature value; N_g represents the number of gray levels; v_g and v_k represent image gray values; and p(v_g, v_k) represents the entry of the normalized gray level co-occurrence matrix for the pixel pair (v_g, v_k).
Optionally, the feature value of the prior image is obtained by mapping the initial low-dimensional feature value to a high-dimensional space; the initial low-dimensional feature values are extracted from the prior image according to a kernel method; the feature values of the prior image are used to linearly describe the pixel value of each pixel in the prior image.
Optionally, the second obtaining module 602 is configured to:
determining the feature value of the prior image according to a radial Gaussian kernel function, wherein the radial Gaussian kernel function is:

K(f_j, f_j') = exp( −‖f_j − f_j'‖² / (2σ²) )

where K represents the feature value of the prior image; pixel j represents any pixel in the prior image; j' represents a pixel within the neighborhood of pixel j; f_j and f_j' represent the values of pixels j and j' in the prior image; and σ represents the width parameter of the radial Gaussian kernel.
Optionally, the reconstruction module 603 is configured to:
adding the characteristic value into an orthographic projection model of the PET image to obtain a target orthographic projection model and a target log-likelihood function corresponding to the target orthographic projection model;
obtaining an estimated value of a designated parameter in the target log-likelihood function through an iterative formula according to an expectation maximization method;
determining a reconstructed PET image according to a target formula, the characteristic value and the estimated value of the specified parameter; wherein the target formula is determined by the feature value and pixel gray value calculation formula.
Optionally, the pixel gray value calculation formula is:

x_j = Σ_{j'=1..n} α_{j'} · K(f_j, f_j')

where j represents any pixel in the prior image; j' represents a pixel within the neighborhood of pixel j; x_j represents the image gray value at pixel j; n represents the number of neighborhood pixels; f_j and f_j' represent the values of pixels j and j' in the prior image; α_j' represents the specified parameter value at pixel j'; and K represents the feature value of the prior image.

Correspondingly, the target formula is: x = Kα.
Because the gray level co-occurrence matrix combines the gray level features and the spatial texture features of the MRI image when the PET image is reconstructed based on the prior image, more accurate texture features can be obtained for a small image region with a smaller number of quantized gray levels, thereby effectively improving the quality of the reconstructed image and reducing its noise.
Corresponding to the above PET image reconstruction method, an embodiment of the present invention further provides a PET image reconstruction device, and fig. 7 is a schematic diagram of a hardware structure of the PET image reconstruction device according to an embodiment of the present invention.
The PET image reconstruction device may be a terminal device or a server or the like for reconstructing a PET image provided in the above embodiments.
PET image reconstruction devices, which may vary considerably in configuration or performance, may include one or more processors 701 and a memory 702; the memory 702 may store one or more applications or data. The memory 702 may be transient storage or persistent storage. The application program stored in the memory 702 may include one or more modules (not shown), and each module may include a series of computer-executable instructions for the PET image reconstruction device. Further, the processor 701 may be configured to communicate with the memory 702 and to execute, on the PET image reconstruction device, the series of computer-executable instructions in the memory 702. The PET image reconstruction device may also include one or more power supplies 703, one or more wired or wireless network interfaces 704, one or more input-output interfaces 705, and one or more keyboards 706.
In particular, in this embodiment, the PET image reconstruction device includes a memory and one or more programs, wherein the one or more programs are stored in the memory and may include one or more modules, each module may include a series of computer-executable instructions for the PET image reconstruction device, and the one or more programs are configured to be executed by one or more processors so as to perform the PET image reconstruction method of the above embodiments.
In the 1990s, an improvement in a technology could clearly be distinguished as either an improvement in hardware (for example, an improvement in a circuit structure such as a diode, transistor, or switch) or an improvement in software (an improvement in a method flow). With the development of technology, however, many of today's improvements to method flows can be regarded as direct improvements to hardware circuit structures. Designers almost always obtain a corresponding hardware circuit structure by programming an improved method flow into a hardware circuit. Therefore, it cannot be said that an improvement in a method flow cannot be realized by a hardware entity module. For example, a Programmable Logic Device (PLD), such as a Field Programmable Gate Array (FPGA), is an integrated circuit whose logic functions are determined by the user's programming of the device. A designer "integrates" a digital system onto a single PLD by programming it, without asking a chip manufacturer to design and fabricate an application-specific integrated circuit chip. Moreover, instead of manually making integrated circuit chips, this programming is now mostly implemented with "logic compiler" software, which is similar to the software compiler used in program development, while the source code to be compiled must be written in a specific programming language called a Hardware Description Language (HDL). There is not just one HDL but many, such as ABEL (Advanced Boolean Expression Language), AHDL (Altera Hardware Description Language), Confluence, CUPL (Cornell University Programming Language), HDCal, JHDL (Java Hardware Description Language), Lava, Lola, MyHDL, PALASM, and RHDL (Ruby Hardware Description Language); VHDL (Very-High-Speed Integrated Circuit Hardware Description Language) and Verilog are currently the most commonly used.
It will also be apparent to those skilled in the art that a hardware circuit implementing a given logical method flow can readily be obtained simply by programming the method flow into an integrated circuit using one of the hardware description languages described above.
The controller may be implemented in any suitable manner. For example, the controller may take the form of a microprocessor or processor together with a computer-readable medium storing computer-readable program code (e.g., software or firmware) executable by the (micro)processor, or of logic gates, switches, an Application Specific Integrated Circuit (ASIC), a programmable logic controller, or an embedded microcontroller; examples of such controllers include, but are not limited to, the ARC 625D, Atmel AT91SAM, Microchip PIC18F26K20, and Silicon Labs C8051F320 microcontrollers. A memory controller may also be implemented as part of the control logic of a memory. Those skilled in the art will also appreciate that, in addition to implementing the controller as pure computer-readable program code, the same functionality can be achieved by logically programming the method steps so that the controller takes the form of logic gates, switches, application-specific integrated circuits, programmable logic controllers, embedded microcontrollers, and the like. Such a controller may therefore be regarded as a hardware component, and the means included in it for performing various functions may also be regarded as structures within the hardware component. Indeed, means for performing the various functions may be regarded both as software modules implementing the method and as structures within the hardware component.
The systems, devices, modules or units illustrated in the above embodiments may be implemented by a computer chip or an entity, or by a product with certain functions. One typical implementation device is a computer. In particular, the computer may be, for example, a personal computer, a laptop computer, a cellular telephone, a camera phone, a smartphone, a personal digital assistant, a media player, a navigation device, an email device, a game console, a tablet computer, a wearable device, or a combination of any of these devices.
For convenience of description, the above devices are described as being divided into various units by function, and are described separately. Of course, the functions of the units may be implemented in the same software and/or hardware or in a plurality of software and/or hardware when implementing the invention.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include volatile memory in a computer-readable medium, such as Random Access Memory (RAM), and/or non-volatile memory, such as Read-Only Memory (ROM) or flash memory (flash RAM). Memory is an example of a computer-readable medium.
Computer-readable media, including both permanent and non-permanent, removable and non-removable media, may store information by any method or technology. The information may be computer-readable instructions, data structures, program modules, or other data. Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory media such as modulated data signals and carrier waves.
It should also be noted that the terms "comprises," "comprising," and any other variations thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but also other elements not expressly listed, or elements inherent to such a process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a(n) ..." does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The invention may be described in the general context of computer-executable instructions, such as program modules, being executed by a computer. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. The invention may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote computer storage media including memory storage devices.
The embodiments in the present specification are described in a progressive manner, and the same and similar parts among the embodiments are referred to each other, and each embodiment focuses on the differences from the other embodiments. In particular, for the system embodiment, since it is substantially similar to the method embodiment, the description is simple, and for the relevant points, reference may be made to the partial description of the method embodiment.
The above description is only an example of the present invention, and is not intended to limit the present invention. Various modifications and alterations to this invention will become apparent to those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present invention should be included in the scope of the claims of the present invention.

Claims (10)

1. A method of reconstructing a positron emission tomography, PET, image, the method comprising:
acquiring a prior image; the prior image comprises an anatomical image and an autocorrelation feature image; the autocorrelation feature image is determined according to a gray level co-occurrence matrix of the anatomical image;
acquiring a characteristic value of the prior image;
and reconstructing the PET image according to the characteristic value and an iterative algorithm.
2. The method of claim 1, wherein said acquiring a prior image comprises:
quantizing the gray levels of pixels in the anatomical image;
acquiring an initial gray level co-occurrence matrix of the gray-level-quantized anatomical image in each set direction, according to a set step length and set directions;
normalizing the initial gray level co-occurrence matrix to obtain a normalized gray level co-occurrence matrix;
determining the autocorrelation feature image of the anatomical image according to the autocorrelation feature values of the normalized gray level co-occurrence matrix;
and acquiring the prior image according to the anatomical image and the autocorrelation feature image of the anatomical image.
3. The method according to claim 2, wherein the autocorrelation feature value of the normalized gray level co-occurrence matrix is calculated according to a preset autocorrelation feature value calculation formula; the preset autocorrelation feature value calculation formula is:
f_a = Σ_{g=1}^{N_g} Σ_{k=1}^{N_g} v_g · v_k · p(v_g, v_k)
wherein f_a represents the autocorrelation feature value; N_g represents the number of gray levels; v_g and v_k represent image gray values; and p(v_g, v_k) represents the value of the normalized gray level co-occurrence matrix at the pixel pair (v_g, v_k).
4. The method of claim 1, wherein the feature values of the prior image are obtained by mapping initial low-dimensional feature values into a high-dimensional space; the initial low-dimensional feature values are extracted from the prior image according to a kernel method; and the feature values of the prior image are used to linearly describe the pixel value of each pixel in the prior image.
5. The method of claim 4, wherein the obtaining the feature values of the prior images comprises:
determining a characteristic value of the prior image according to a radial Gaussian kernel function; wherein the radial Gaussian kernel function is:
K(f_j, f_j') = exp(−‖f_j − f_j'‖² / (2σ²))
wherein K represents a characteristic value of the prior image; j' represents a pixel within the neighborhood of pixel j; the pixel j represents any pixel in the prior image; σ represents a width parameter of the radial gaussian kernel.
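Claim 5's radial Gaussian kernel is conventionally written K(f_j, f_j') = exp(−‖f_j − f_j'‖² / (2σ²)); since the claim's formula appears only as an image in the filing, the following sketch assumes that conventional form (the function name and the dense pairwise construction are illustrative assumptions):

```python
import numpy as np

def gaussian_kernel_matrix(features, sigma=1.0):
    """Dense radial Gaussian kernel matrix over prior-image features.

    features: (N, d) array with one feature vector f_j per pixel.
    Returns K with K[j, j'] = exp(-||f_j - f_j'||^2 / (2 * sigma^2)).
    """
    sq = np.sum(features ** 2, axis=1)
    # Pairwise squared distances via ||a - b||^2 = ||a||^2 + ||b||^2 - 2 a.b
    d2 = sq[:, None] + sq[None, :] - 2.0 * features @ features.T
    d2 = np.maximum(d2, 0.0)  # clamp tiny negatives caused by rounding
    return np.exp(-d2 / (2.0 * sigma ** 2))
```

In kernel-method reconstructions K is typically restricted to each pixel's neighborhood (matching the neighborhood sum in claim 7) rather than kept fully dense.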
6. The method of claim 1, wherein the reconstructing the PET image from the feature values and an iterative algorithm comprises:
adding the feature values to a forward projection model of the PET image to obtain a target forward projection model and a target log-likelihood function corresponding to the target forward projection model;
obtaining an estimated value of a specified parameter in the target log-likelihood function through an iterative formula according to an expectation-maximization method;
and determining the reconstructed PET image according to a target formula, the feature values, and the estimated value of the specified parameter; wherein the target formula is determined by the feature values and a pixel gray value calculation formula.
7. The method of claim 6, wherein the pixel gray scale value calculation formula is:
x_j = Σ_{j'=1}^{n} α_j' · K(f_j, f_j')
wherein j represents any pixel in the prior image; j' represents a pixel within the neighborhood of pixel j; x_j represents the image gray value at pixel j; n represents the number of neighborhood pixels; f_j and f_j' respectively represent the values of pixels j and j' in the prior image; α_j' represents the specified parameter value at pixel j'; and K represents the feature value of the prior image;
correspondingly, the target formula is: x = Kα.
8. An apparatus for reconstructing Positron Emission Tomography (PET) images, the apparatus comprising:
the first acquisition module is used for acquiring a prior image; the prior image comprises an anatomical image and an autocorrelation feature image; the autocorrelation feature image is determined according to a gray level co-occurrence matrix of the anatomical image;
the second acquisition module is used for acquiring the characteristic value of the prior image;
and the reconstruction module is used for reconstructing the PET image according to the characteristic value and the iterative algorithm.
9. An apparatus, comprising: a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the computer program, when executed by the processor, implements the steps of the method according to any one of claims 1 to 7.
10. A computer-readable storage medium, characterized in that a computer program is stored on the computer-readable storage medium, which computer program, when being executed by a processor, carries out the steps of the method according to any one of claims 1 to 7.
CN202011507583.9A 2020-12-18 2020-12-18 PET image reconstruction method, device and equipment Pending CN112614200A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011507583.9A CN112614200A (en) 2020-12-18 2020-12-18 PET image reconstruction method, device and equipment


Publications (1)

Publication Number Publication Date
CN112614200A true CN112614200A (en) 2021-04-06

Family

ID=75240648

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011507583.9A Pending CN112614200A (en) 2020-12-18 2020-12-18 PET image reconstruction method, device and equipment

Country Status (1)

Country Link
CN (1) CN112614200A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023056634A1 (en) * 2021-10-09 2023-04-13 深圳先进技术研究院 Pet parameter imaging method and apparatus, and electronic device and readable storage medium



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination