CN114897861A - Image processing method and system - Google Patents

Image processing method and system

Info

Publication number
CN114897861A
CN114897861A
Authority
CN
China
Prior art keywords
imaging data
image
image processing
processing
target
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210584974.3A
Other languages
Chinese (zh)
Inventor
谢慧芳
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai United Imaging Healthcare Co Ltd
Original Assignee
Shanghai United Imaging Healthcare Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai United Imaging Healthcare Co Ltd filed Critical Shanghai United Imaging Healthcare Co Ltd
Priority to CN202210584974.3A priority Critical patent/CN114897861A/en
Publication of CN114897861A publication Critical patent/CN114897861A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/0002 Inspection of images, e.g. flaw detection
    • G06T 7/0012 Biomedical image inspection
    • G06T 11/00 2D [Two Dimensional] image generation
    • G06T 11/003 Reconstruction from projections, e.g. tomography
    • G06T 11/005 Specific pre-processing for tomographic reconstruction, e.g. calibration, source positioning, rebinning, scatter correction, retrospective gating
    • G06T 11/006 Inverse problem, transformation from projection-space into object-space, e.g. transform methods, back-projection, algebraic methods
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10072 Tomographic images
    • G06T 2207/10104 Positron emission tomography [PET]

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Mathematical Analysis (AREA)
  • Mathematical Optimization (AREA)
  • Mathematical Physics (AREA)
  • Pure & Applied Mathematics (AREA)
  • Algebra (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Radiology & Medical Imaging (AREA)
  • Quality & Reliability (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Nuclear Medicine (AREA)

Abstract

Embodiments of the present specification provide an image processing method and system. The method includes: acquiring original imaging data, where the original imaging data is obtained by scanning a target object with a medical device; determining scan information related to the scan of the target object; determining an image processing parameter based on the scan information, the image processing parameter being a variable of a position in a target direction; and processing the raw imaging data based on the image processing parameter to generate target imaging data.

Description

Image processing method and system
Technical Field
The present disclosure relates to the field of image processing technologies, and in particular, to a method and a system for determining image processing parameters.
Background
Medical imaging may be applied to various medical treatments and/or diagnoses. In some medical imaging procedures, multiple body-part scans or whole-body scans of a target object (e.g., a patient) along an axial direction of a medical device (e.g., a long-axis PET device) are required. Because some factors (e.g., the sensitivity of the detector of the medical device, the tracer activity of the target object, the physical characteristics of the target object) vary along the axial direction of the medical device, the quality of the imaging data (e.g., PET scan data) acquired in different axial regions differs markedly, and the reconstructed image therefore shows large quality differences along the axial direction of the medical device, which is unfavorable for subsequent disease diagnosis and treatment. It is therefore desirable to provide an image processing method and system that allow flexible control of image quality along the axial direction of the medical device.
Disclosure of Invention
One of the embodiments of the present specification provides an image processing method. The image processing method comprises the following steps: acquiring original imaging data, wherein the original imaging data is acquired by scanning a target object through medical equipment; determining scan information related to a scan of the target object; determining an image processing parameter based on the scanning information, wherein the image processing parameter is a variable of a position in a target direction; and processing the raw imaging data based on the image processing parameters to generate target imaging data.
In some embodiments, the scan information comprises at least one of: information related to the medical device, information related to the raw imaging data, or information related to the target object.
In some embodiments, the medical device comprises a positron emission tomography device, the information related to the medical device comprises at least one of sensitivity of a detector of the medical device, a gap between adjacent detectors, and detection efficiency of the detector, the information related to the raw imaging data comprises at least one of coincidence event count information and tracer activity, and the information related to the target object comprises at least one of characteristic information of the target object and historical imaging data.
In some embodiments, processing the raw imaging data based on the image processing parameters comprises a plurality of iterations, wherein at least one iteration comprises: acquiring updated imaging data generated in the previous iteration; processing the updated imaging data based on the image processing parameters to generate processed imaging data; judging whether the iteration meets a termination condition; and in response to determining that the iteration satisfies the termination condition, determining the processed imaging data as the target imaging data.
In some embodiments, processing the updated imaging data based on the image processing parameters to generate processed imaging data comprises: updating the image processing parameters based on the updated imaging data and the scan information to generate updated image processing parameters; and processing the updated imaging data based on the updated image processing parameters to generate processed imaging data.
In some embodiments, the processing comprises image reconstruction processing, and the processing the updated imaging data based on the image processing parameters to generate processed imaging data comprises: obtaining an iteration updating factor; and reconstructing the updated imaging data based on the iterative update factor and the image processing parameters to generate a reconstructed image.
In some embodiments, obtaining the iterative update factor comprises: performing a forward projection operation on the updated imaging data to determine first projection data; determining second projection data based on the raw imaging data; determining third projection data based on the first projection data and the second projection data; performing a back projection operation on the third projection data to determine back projection data; and determining the iterative update factor according to the first projection data, the back projection data, and a normalization matrix.
In some embodiments, the processing comprises at least one of: image smoothing processing, image enhancement processing, image fusion processing, and image beautification processing.
One of the embodiments of the present specification provides an image processing system. The image processing system includes at least one processor and at least one memory. The at least one memory is for storing computer instructions. The at least one processor is configured to execute at least a portion of the computer instructions to implement the image processing method described herein.
One of the embodiments of the present specification provides an image processing system. The image processing system comprises an acquisition module, a scanning information determination module, a processing parameter determination module and a processing module. The acquisition module is used for acquiring original imaging data, and the original imaging data is acquired by scanning a target object through medical equipment. The scan information determination module is configured to determine scan information related to a scan of the target object. And the processing parameter determining module is used for determining an image processing parameter based on the scanning information, wherein the image processing parameter is a variable of a position in a target direction. The processing module is used for processing the original imaging data based on the image processing parameters to generate target imaging data.
Drawings
The present description will be further explained by way of exemplary embodiments, which will be described in detail by way of the accompanying drawings. These embodiments are not intended to be limiting, and in these embodiments like numerals are used to indicate like structures, wherein:
FIG. 1 is a schematic diagram of an application scenario of an image processing system according to some embodiments of the present description;
FIG. 2 is an exemplary flow diagram illustrating the generation of target imaging data according to some embodiments of the present description;
FIG. 3 is an exemplary flow diagram illustrating the generation of target imaging data according to some embodiments of the present description;
FIG. 4 is an exemplary flow diagram illustrating the generation of a reconstructed image of a target according to some embodiments of the present description;
FIG. 5 is an exemplary flow diagram illustrating obtaining an iterative update factor according to some embodiments of the present description;
FIG. 6 is an exemplary block diagram of a processing device shown in accordance with some embodiments of the present description;
FIG. 7 is a graph of image smoothing parameters versus instantaneous coincident event count, shown in accordance with some embodiments of the present description;
FIG. 8 is a graph of image smoothing parameters versus instantaneous coincident event count, shown in accordance with some embodiments of the present description;
FIG. 9 is a schematic diagram of a reconstructed image according to some embodiments described herein.
Detailed Description
In order to more clearly illustrate the technical solutions of the embodiments of the present disclosure, the drawings used in the description of the embodiments will be briefly described below. It is obvious that the drawings in the following description are only examples or embodiments of the present description, and that for a person skilled in the art, the present description can also be applied to other similar scenarios on the basis of these drawings without inventive effort. Unless otherwise apparent from the context, or otherwise indicated, like reference numbers in the figures refer to the same structure or operation.
It should be understood that "system", "apparatus", "unit" and/or "module" as used herein is a method for distinguishing different components, elements, parts, portions or assemblies at different levels. However, other words may be substituted by other expressions if they accomplish the same purpose.
As used in this specification and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. In general, the terms "comprises" and "comprising" merely indicate that the explicitly identified steps and elements are included; the steps and elements do not form an exclusive list, and a method or apparatus may also include other steps or elements.
Flow charts are used in this description to illustrate operations performed by a system according to embodiments of the present description. It should be understood that the operations are not necessarily performed in the exact order shown. Rather, the various steps may be processed in reverse order or simultaneously. Other operations may also be added to these processes, or one or more steps may be removed from them.
FIG. 1 is a schematic diagram of an application scenario of an image processing system according to some embodiments of the present description. The image processing system 100 may include a processing device 110, a medical device 120, one or more terminals 130, a network 140, and a storage device 150.
The components in the image processing system 100 may be connected in various ways. By way of example only, the medical device 120 may be connected to the processing device 110 directly (as indicated by the dashed double-headed arrow connecting the medical device 120 and the processing device 110) or through the network 140. As yet another example, the storage device 150 may be connected to the medical device 120 directly (as indicated by the dashed double-headed arrow connecting the storage device 150 and the medical device 120) or through the network 140. As yet another example, terminal 130 may be connected to processing device 110 directly (as indicated by the dashed double-headed arrow connecting terminal 130 and processing device 110) or through network 140.
The processing device 110 may process data and/or information obtained from the medical device 120, the terminal 130, and/or the storage device 150. For example, the processing device 110 may acquire raw imaging data of the target object from the medical device 120. For another example, the processing device 110 may retrieve scan information related to a scan of the target object from the storage device 150. As another example, processing device 110 may determine image processing parameters based on the scan information. As another example, the processing device 110 may process the raw imaging data based on the image processing parameters to generate target imaging data. In some embodiments, the processing device 110 may be a single server or a group of servers. The server groups may be centralized or distributed. In some embodiments, processing device 110 may be a local component or a remote component with respect to one or more other components of image processing system 100. For example, the processing device 110 may access information and/or data stored in the medical device 120, the terminal 130, and/or the storage device 150 via the network 140.
The medical device 120 may scan a target object to acquire scan data (e.g., raw imaging data) of the target object. In some embodiments, the target object may include a biological object and/or a non-biological object. For example, the target object may include a particular portion of a human body, such as the head, chest, abdomen, etc., or a combination thereof. As another example, the target object may be a patient to be scanned by the medical device 120. In some embodiments, the scan data related to the target object may include raw data acquired by the medical device 120 (e.g., projection data of the target object), one or more scan images, and so forth.
In some embodiments, the medical device 120 may be a non-invasive medical imaging apparatus for disease diagnosis or research purposes. For example, the medical device 120 may include a single modality scanning device and/or a multi-modality scanning device. The single modality scanning device may include an ultrasound scanning device, an X-ray scanning device, a Computed Tomography (CT) scanning device, a Magnetic Resonance Imaging (MRI) scanning device, an ultrasound examination device, a Positron Emission Tomography (PET) scanning device, an Optical Coherence Tomography (OCT) scanning device, an Ultrasound (US) scanning device, an intravascular ultrasound (IVUS) scanning device, a near infrared spectroscopy (NIRS) scanning device, a Far Infrared (FIR) scanning device, and the like, or any combination thereof. The multi-modality scanning device may include an X-ray imaging-magnetic resonance imaging (X-ray-MRI) scanning device, a positron emission tomography-X-ray imaging (PET-X-ray) scanning device, a single photon emission computed tomography-magnetic resonance imaging (SPECT-MRI) scanning device, a positron emission tomography-computed tomography (PET-CT) scanning device, a digital subtraction angiography-magnetic resonance imaging (DSA-MRI) scanning device, and the like. The medical device 120 is provided above for illustrative purposes only and is not intended to limit the scope of the present application. As used herein, the term "imaging modality" or "modality" refers to an imaging method or technique that collects, generates, processes, and/or analyzes imaging information of a target object.
In some embodiments, the medical device 120 (e.g., a PET device) may include a gantry, a detector, and a scanning couch. The gantry may support a detector. The target object may be placed on a scanning couch and moved in an axial direction of the medical device 120 (e.g., a Z-axis direction shown in fig. 1) to scan the target object. In some embodiments, the detector may comprise one or more detector cells. The detector may include a scintillation detector (e.g., a cesium iodide detector), a gas detector, and the like.
In some embodiments, the medical device 120 may be a long-axis PET device. The axial length of a long-axis PET device (e.g., the length in the Z-axis direction shown in fig. 1) is typically large (e.g., greater than or equal to 0.75 meters), so a long-axis PET device typically has a long axial scan field of view that allows simultaneous scan imaging of multiple sites of a target object. For example, the scan field of view of a long-axis PET device may cover the entire body of the target subject, i.e., a single bed position may cover the entire body during the scanning process, ensuring that the radiotracer can be detected everywhere in the body of the target subject and that a single scan can complete whole-body imaging of the target subject.
The terminal 130 may include a mobile device 130-1, a tablet computer 130-2, a laptop computer 130-3, the like, or any combination thereof. In some embodiments, one or more terminals 130 may be part of processing device 110.
Network 140 may include any suitable network that may facilitate the exchange of information and/or data for image processing system 100. In some embodiments, one or more components of the image processing system 100 (e.g., the medical device 120, the terminal 130, the processing device 110, the storage device 150) may communicate information and/or data with one or more other components of the image processing system 100 via the network 140. For example, the processing device 110 may obtain imaging data from the medical device 120 via the network 140. As another example, processing device 110 may obtain user instructions from terminal 130 via network 140.
Storage device 150 may store data, instructions, and/or any other information. In some embodiments, the storage device 150 may store data obtained from the terminal 130 and/or the processing device 110. In some embodiments, storage device 150 may store data and/or instructions that processing device 110 may perform or be used to perform the exemplary methods described herein.
In some embodiments, storage device 150 may be connected to network 140 to communicate with one or more other components of image processing system 100 (e.g., processing device 110, terminal 130). One or more components of image processing system 100 may access data or instructions stored in storage device 150 via network 140. In some embodiments, storage device 150 may be directly connected to or in communication with one or more other components of image processing system 100 (e.g., processing device 110, terminal 130). In some embodiments, the storage device 150 may be part of the processing device 110.
In some embodiments, the location information in the image processing system 100 may be represented in a coordinate system 160 as shown in FIG. 1. Coordinate system 160 may include an X-axis, a Y-axis, and a Z-axis. As shown in fig. 1, the positive X-axis direction may be a direction from the left side to the right side of the scanning bed as viewed from a direction facing the front of the medical device 120. The positive Y-axis direction may be a direction from a lower portion of the medical device 120 to an upper portion of the medical device 120. The positive Z-axis direction may be the direction in which the scanning stage moves from the interior to the exterior of the medical device 120. In some embodiments, the Z-axis direction may also be referred to as an axial direction of the medical device 120.
It should be noted that the foregoing description is provided for illustrative purposes only, and is not intended to limit the scope of the present application. Many variations and modifications will occur to those skilled in the art in light of the teachings herein. The features, structures, methods, and other features of the example embodiments described herein may be combined in various ways to obtain additional and/or alternative example embodiments. However, such changes and modifications do not depart from the scope of the present application.
FIG. 2 is an exemplary flow diagram illustrating the generation of target imaging data according to some embodiments of the present description. In some embodiments, flow 200 may be performed by processing device 110. As shown in fig. 2, the process 200 includes the following steps.
At step 210, raw imaging data is acquired. In some embodiments, step 210 may be performed by the obtaining module 610.
Raw imaging data may refer to data to be image processed. The raw imaging data may be data acquired by a scan of a target object by a medical device (e.g., medical device 120). The medical devices may include PET devices, SPECT devices, MRI devices, CT devices, and the like. The raw imaging data may include raw data acquired in a scan of the target object and/or a scan image generated based on the raw data. For example, the raw imaging data may include PET data, SPECT data, MRI data, CT data, and the like. For example only, the raw imaging data may be PET projection data acquired by a PET device scanning a target object. The PET projection data may include list mode data or sinogram data.
In some embodiments, the raw imaging data may include a scan image. For example, the processing device 110 may generate a PET image based on the PET projection data. In some embodiments, the processing device 110 may generate a PET image based on the PET projection data according to an image reconstruction algorithm. The PET image may present the uptake of the tracer by the subject. Exemplary image reconstruction algorithms may include iterative algorithms, analytical algorithms, and the like. The iterative algorithm may include a Maximum Likelihood Estimation (MLE) algorithm, an Ordered Subset Expectation Maximization (OSEM), and the like. The analysis algorithm may include a Filtered Back Projection (FBP) algorithm, or the like.
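For orientation, the Python sketch below shows the kind of update an iterative algorithm such as MLEM performs: forward project the current image estimate, compare it with the measured projection data, backproject the ratio, and normalize. The dense system-matrix representation is a simplification for illustration only and is not the specific implementation of this application.

```python
import numpy as np

def mlem_iteration(image, projections, system_matrix, eps=1e-9):
    """One MLEM-style update: image <- image * A^T(p / A(image)) / A^T(1).

    image:         1-D array of voxel values (current estimate).
    projections:   1-D array of measured projection data.
    system_matrix: 2-D array A with A[i, j] = contribution of voxel j to bin i.
    """
    forward = system_matrix @ image                        # forward projection
    ratio = projections / np.maximum(forward, eps)         # measured / estimated
    backprojected = system_matrix.T @ ratio                # backprojection of the ratio
    normalization = system_matrix.T @ np.ones_like(ratio)  # sensitivity (normalization) term
    return image * backprojected / np.maximum(normalization, eps)
```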
In some embodiments, the processing device 110 may obtain raw imaging data via the network 140 from one or more components of the image processing system 100 (e.g., the medical device 120, the terminal 130, the storage device 150) or an external memory. For example, the medical device 120 may transmit the acquired raw imaging data to a storage device (e.g., storage device 150, an external storage device) for storage. The processing device 110 may obtain raw imaging data from a storage device. As another example, the processing device 110 may obtain raw imaging data directly from the medical device 120.
In step 220, scan information relating to the scanning of the target object is determined. In some embodiments, step 220 may be performed by the scan information determination module 620.
The scan information may refer to information on parameters that vary along a target direction. For example only, the target direction may be the axial direction of the medical device (e.g., the Z-axis direction shown in fig. 1). In some embodiments, the scan information may include information related to the medical device, information related to the raw imaging data, information related to the target object, and the like, or any combination thereof. For exemplary purposes, the kinds of scan information are described below taking a PET apparatus as an example. Exemplary information related to the medical device may include the sensitivity of the detectors, the gaps between adjacent detectors, the detection efficiency of the detectors, or the like, or any combination thereof.
During a PET scan, a tracer is first injected into the target subject. The tracer may undergo positron emission decay and emit positrons. A positron and an electron have the same mass and opposite charges; when the two particles collide, the positron and the electron (electrons are present in large numbers in the body of the subject) can undergo annihilation (also referred to as an "annihilation event" or "coincidence event"). Electron-positron annihilation produces two particles (e.g., two 511 keV gamma photons) that travel in opposite directions. In a PET scan of the target object, the particles resulting from an annihilation event reach and are detected by the detectors of the PET device.
The sensitivity of the detector characterizes the detector's ability to detect radiation. Taking a PET device as an example, the sensitivity of the detector may refer to the count of coincidence events detected by the detector per unit radiation dose per unit time. The higher the sensitivity of the detector, the more signals are detected for the same activity of the radiation source, and the higher the quality of the resulting image. The detection efficiency of a detector is the probability that a gamma photon passing through the detector crystal is recorded. The higher the detection efficiency of the detector, the higher the quality of the resulting image. In some embodiments, the medical device may include a detector array composed of a plurality of detectors (e.g., LSO crystals, LYSO crystals, BGO crystals, LaBr crystals). The smaller the gaps between adjacent detectors (e.g., adjacent crystals), the higher the detection efficiency of the detectors; the larger the gaps between adjacent detectors, the more PET data are missing, which affects image quality.
The information related to the raw imaging data may include coincidence event count information, tracer activity, and the like, or any combination thereof. In some embodiments, coincidence events may include instant (prompt) coincidence events, true coincidence events, random events, and scatter events. All coincidence events detected by the detector are referred to as instant coincidence events. When a pair of detector cells detects two incident photons (also called coincident photons) from the same annihilation event within a certain time window, the event is called a true coincidence event. When a pair of detector cells detects two incident photons from two different annihilation events within a certain time window, the event is called a random event. In some embodiments, photons generated by an annihilation event may undergo Compton scattering when passing through the target object; when at least one of the two incident photons detected within the time window has undergone Compton scattering at least once before reaching a detector cell, the event is called a scatter event.
The tracer activity may reflect biological activity information of the target subject. For example, one or more atoms of the tracer may be chemically incorporated into biologically active molecules within the body of the target subject. The active molecules may be concentrated in the tissue of interest within the target object. The tracer may include [15O]H2O, [15O]butanol, [11C]butanol, [18F]fluorodeoxyglucose (FDG), [64Cu]diacetyl-bis(N4-methylthiosemicarbazone) (64Cu-ATSM), [18F]fluoride, 3'-deoxy-3'-[18F]fluorothymidine (FLT), [18F]fluoromisonidazole (FMISO), gallium, thallium, etc., or any combination thereof.
The information related to the target object may include characteristic information of the target object, historical imaging data, or the like, or any combination thereof. The characteristic information of the target subject may include a body shape (e.g., height, body width, body thickness), a weight, scale information of parts of the body, physical disease information, physiological information (e.g., blood glucose concentration), anatomical structure information, and the like of the target subject, or any combination thereof. For example, an increase in blood glucose concentration in the target subject may promote the secretion of insulin, which accelerates the uptake of glucose (e.g., FDG) by insulin-sensitive tissues (e.g., heart muscle, fat, skeletal muscle), with a corresponding decrease in uptake by tumors and brain tissue, resulting in a change in the FDG profile in the target subject. The anatomical structure information of the target object may include any combination of the position, shape, etc. of the organ or tissue of the target object. The historical imaging data of the target object may include historical scan data (e.g., PET scan data, CT scan data, MRI scan data) acquired in a historical scan (e.g., PET scan, CT scan, MRI scan) of the target object.
In some embodiments, the processing device 110 may retrieve scan information related to a scan of the target object (e.g., information related to the medical device, information related to the target object) from one or more components of the image processing system 100 (e.g., the medical device 120, the terminal 130, the storage device 150) or an external memory. In some embodiments, the processing device 110 may determine scan information (e.g., coincidence count information, tracer activity) related to a scan of the target object from the raw imaging data. For example, the processing device 110 may determine an activity value (or concentration value) of the tracer within the body of the target object based on pixel values (or voxel values) of a PET image of the target object. In some embodiments, the processing device 110 may determine feature information and/or anatomical structure information of the target object from historical imaging data of the target object.
Based on the scan information, image processing parameters are determined, step 230. In some embodiments, step 230 may be performed by process parameter determination module 630.
The image processing parameters may be used to process the raw imaging data (e.g., image reconstruction processing, image smoothing processing, image enhancement processing, image fusion processing, image beautification processing). In some embodiments, the image processing parameters may include image reconstruction parameters, image smoothing parameters (e.g., regularization iterative smoothing parameters, artificial intelligence iterative smoothing parameters, post-filter smoothing parameters), image enhancement parameters, image fusion parameters, image beautification parameters, and the like, or any combination thereof. In some embodiments, the image processing parameters may include parameters related to image processing algorithms (e.g., image reconstruction processing algorithms, image smoothing processing algorithms, image enhancement processing algorithms, image fusion processing algorithms, image beautification processing algorithms).
In some embodiments, the image processing parameter may be a variable of the position in the target direction. In this specification, if there are at least two different positions in the target direction for which the corresponding image processing parameters are different, the image processing parameter may be considered a variable of the position in the target direction. For example only, the target direction may be the axial direction of the medical device (e.g., the Z-axis direction shown in fig. 1). If there are two or more positions with different axial coordinates whose corresponding image processing parameters are different (e.g., the image processing parameters corresponding to the head and the abdomen of the target subject are different), the image processing parameters may be considered variables of the axial position. In some embodiments, when a medical device (e.g., a long-axis PET device) is used to scan a target object, some factors affecting imaging quality vary along the target direction, so the quality of the scan data corresponding to different positions in the target direction differs; as a result, the quality of the parts of the reconstructed image corresponding to different axial positions differs, which affects the subsequent diagnosis and treatment of diseases based on the image. For example, at different axial positions, the detector sensitivity of the medical device, the organ or tissue being scanned, the coincidence count rate, the tracer activity, etc. may differ, affecting the quality of the scan data and of the reconstructed images.
In the embodiments in the present specification, by setting different image processing parameters for different positions of the target direction, the image quality (e.g., spatial resolution, density resolution, signal-to-noise ratio) of the image in the target direction can be controlled. For example, the difference in image quality at different positions in the target direction can be reduced by setting different image processing parameters for different positions in the target direction. As another example, specific image processing parameters may be set for a specific organ (e.g., head and chest) of the target object in the target direction, such that the organ has a specific optimization effect in the reconstructed image. For another example, for a position at an edge in the target direction, the noise of the corresponding region in the reconstructed image can be reduced by setting a specific image processing parameter.
In some embodiments, processing device 110 may determine image processing parameters based on the scan information. In some embodiments, the processing device 110 may determine the image processing parameters corresponding to a particular location in the target direction based on the scan information corresponding to the particular location. For example, the processing device 110 may determine the image processing parameters corresponding to a particular location in the target direction based on information related to the medical device corresponding to the particular location (e.g., sensitivity of a detector corresponding to the particular location, a gap between adjacent detectors corresponding to the particular location, detection efficiency of a detector corresponding to the particular location), information related to the raw image data (e.g., coincidence event count information corresponding to the particular location, tracer activity corresponding to the particular location), and/or information related to the target object (e.g., type and structure of organs and tissues of the target object corresponding to the particular location). For example only, the processing device 110 may determine image processing parameters corresponding to the chest region of the patient based on scan information (e.g., coincidence count information, tracer activity) corresponding to the chest region of the patient. The processing device 110 may determine image processing parameters corresponding to the abdominal region of the patient based on scan information (e.g., coincidence count information, tracer activity) corresponding to the abdominal region of the patient.
In some embodiments, the processing device 110 may determine the image processing parameters corresponding to a specific location in the target direction based on the scan information of that specific location and of a reference location of that location. In some embodiments, the reference location for a specific location may include a location adjacent to it (e.g., a location whose distance from it is less than a threshold distance). For example, the processing device 110 may determine the image processing parameters corresponding to the chest region of the patient based on scan information corresponding to the chest region, scan information corresponding to the neck region, and scan information corresponding to the abdomen region. By using the scan information of both the specific location and its reference location to determine the image processing parameters corresponding to the specific location, the determination of the image processing parameters may be made more reasonable and accurate. In some embodiments, the reference location for a specific location may include a location that is not adjacent to it (e.g., a location whose distance from it is greater than a threshold distance). For example, the processing device 110 may determine the image processing parameters corresponding to the brain region of the patient based on scan information corresponding to the brain region and scan information corresponding to the liver region. In some embodiments, when the scan information of a specific location is difficult to obtain, the image processing parameters corresponding to that location may be determined by obtaining the scan information of a reference location. For example, since the blood vessels in the human body are connected, the image processing parameters corresponding to a particular blood vessel region may be determined based on the scan information (e.g., tracer activity) of one or more reference blood vessel regions. In some embodiments, the processing device 110 may determine the image processing parameters corresponding to a specific location in the target direction based on the scan information of a reference location of that specific location, and may determine the image processing parameters corresponding to the reference location based on the scan information of the specific location. For example, the processing device 110 may estimate the scan information corresponding to the liver region of the patient based on the scan information corresponding to the brain region of the patient, and may estimate the scan information corresponding to the brain region based on the scan information corresponding to the liver region.
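To make this concrete, the following Python sketch shows one way a per-position parameter lookup with a reference-location fallback could be organized; the function names, the distance threshold, and the parameter mapping are hypothetical and are not taken from this application.

```python
def parameter_for_position(z, scan_info_by_z, param_from_info, max_ref_distance=3):
    """Return an image processing parameter for axial position z.

    scan_info_by_z:   dict mapping axial position -> scan information
                      (e.g., coincidence counts, tracer activity); positions
                      whose information could not be obtained are absent.
    param_from_info:  callable converting scan information into a parameter.
    max_ref_distance: how far away a reference position may be (hypothetical).
    """
    if z in scan_info_by_z:
        return param_from_info(scan_info_by_z[z])
    # Fall back to the closest reference position with available scan information.
    candidates = [zr for zr in scan_info_by_z if abs(zr - z) <= max_ref_distance]
    if not candidates:
        raise ValueError(f"no reference position with scan information near z={z}")
    z_ref = min(candidates, key=lambda zr: abs(zr - z))
    return param_from_info(scan_info_by_z[z_ref])
```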
In some embodiments, a user (e.g., a physician or technician) of the image processing system 100 may empirically determine the relationship between the scan information and the image processing parameters. In some embodiments, the relationship between the scan information and the image processing parameters may be determined by performing a simulated scan experiment on a phantom. For example only, the image smoothing parameter for a PET image may be determined according to equation (1):

λ_Z = k · (promptCounts_Z)^n    (1),

where λ_Z represents the image smoothing parameter corresponding to position Z in the axial direction; k and n represent fixed constants that may be set empirically; and promptCounts_Z represents the instantaneous coincidence event count corresponding to position Z in the axial direction.

As can be seen from equation (1), there is a correlation between the image smoothing parameter and the instantaneous coincidence event count. Since the instantaneous coincidence event count is a variable of position in the axial direction, the image smoothing parameter is also a variable of position in the axial direction. The instantaneous coincidence event counts may differ at different positions in the axial direction, and thus the image smoothing parameters may also differ at different positions in the axial direction. In some embodiments, the relationship between the image smoothing parameter and the instantaneous coincidence event count may be represented by a curve 700 as in FIG. 7 or a curve 800 as in FIG. 8. As shown in fig. 7, there may be a non-linear relationship between the image smoothing parameter and the instantaneous coincidence event count: when the instantaneous coincidence event count is less than 1×10^5, the image processing parameter decreases rapidly as the count increases; after the instantaneous coincidence event count reaches 1×10^5, the image processing parameter decreases slowly as the count increases. As shown in fig. 8, there may be a linear relationship between the image smoothing parameter and the instantaneous coincidence event count: the image processing parameter decreases linearly as the instantaneous coincidence event count increases.
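As a concrete illustration of equation (1), the following Python sketch maps per-position instantaneous coincidence counts to axial smoothing parameters; the constants k and n and the example counts are hypothetical values chosen only so that the smoothing parameter decreases as the count increases, as in FIGS. 7 and 8.

```python
import numpy as np

def axial_smoothing_parameters(prompt_counts_z, k=50.0, n=-0.5):
    """Compute a smoothing parameter for each axial position Z using
    equation (1): lambda_Z = k * (promptCounts_Z) ** n.

    prompt_counts_z: 1-D array of instantaneous coincidence event counts,
                     one entry per axial position.
    k, n:            empirically chosen constants (hypothetical defaults).
    """
    prompt_counts_z = np.asarray(prompt_counts_z, dtype=float)
    return k * prompt_counts_z ** n

# Example: counts are low near the axial edges and high near the center,
# so more smoothing (larger lambda) is applied at the edges than at the center.
counts = np.array([2e4, 8e4, 3e5, 8e5, 3e5, 8e4, 2e4])
print(axial_smoothing_parameters(counts))
```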
In some embodiments, the processing device 110 may model multiple kinds of scan information separately and multiply the resulting modeling functions to determine the image processing parameters. For example, the processing device 110 may model information related to the medical device (e.g., the sensitivity distribution of the detectors, the gaps between adjacent detectors, the detection efficiency of the detectors), information related to the raw imaging data (e.g., coincidence count information), and information related to the target object (e.g., historical imaging data) to determine the image processing parameters.
In some embodiments, the processing device 110 may determine the sources of the PET image noise (variance). For example, in a two-dimensional PET scanning scenario, assume a PET scan of a homogeneous water phantom in which the true coincidence count at a point of the reconstructed image is t_e, i.e., the mean at the current point is t_e. The noise (variance) at the current point can be determined by summing the contributions of the PET counts at different angles, i.e., variance = Σ_m VAR_e, where VAR_e equals the weighted variance of the samples from each of the m projection angles contributing to an image element. The signal-to-noise ratio can then be determined according to equation (2):

SNR = C · mean / √variance    (2),

where SNR represents the signal-to-noise ratio; C represents a constant; mean represents the mean at the current point; and variance represents the noise at the current point. For a given line-of-response sample, assuming that both the scatter count and the random count are estimated correctly, the expectation of the instant (prompt) count is:

E(p) = T_p + S_p + R_p    (3),

where T_p represents the true coincidence count; S_p represents the scatter count; and R_p represents the random count.

Since, by the properties of the Poisson distribution, the variance is equal to the expectation, it can be derived that:

variance = Σ_m w_m (T_p + S_p + R_p) = Σ_m w_m T_p (1 + α_sp + α_rp)    (4),

where w_m is the noise weight of the response line at the m-th angle, and α_sp = S_p / T_p and α_rp = R_p / T_p represent the ratios of the scatter count and of the random count, respectively, to the true count.
Assuming that the signal-to-noise ratio at the center point of the image is to be obtained, by symmetry the noise weight w is the same in every direction, and the noise at each angle is:

VAR_e = w · (D · t_e · a_c / d) · (1 + α_sp + α_rp)    (5),

where D is the diameter of the uniform water phantom; d is the number of pixels; a_c is the attenuation and dead-time correction coefficient; and w is the noise weight in each direction.

It can further be derived that:

t_e = avg(a_c) · T / (πD²/4d²)    (6),

where T is the total true coincidence count.

Thus, the signal-to-noise ratio can be determined by substituting equations (5) and (6) into equation (2) (equation (7)). The noise equivalent count rate (NEC) can be determined by equation (8):

NEC = T / (1 + α_sp + α_rp)    (8),

and the single-slice NEC count can be determined by equation (9):

NEC = T² / (T + S + R)    (9).

Because the probability of detecting the emitted photons differs at each position in the system, the system sensitivity differs at different positions, and the SNR distribution at non-central points of the image therefore differs from that at the center point. Accordingly, the counts vary for different locations within the system.
In some embodiments, assuming that there are two locations x and y, their means and noise are:

mean_x = Sns_x · t_e    (10),

variance_x = E(Sns_x · t_e) = Sns_x · t_e    (11),

mean_y = Sns_y · t_e    (12),

variance_y = E(Sns_y · t_e) = Sns_y · t_e    (13),

where Sns_x and Sns_y represent the system sensitivity at locations x and y, respectively.

Thus, it can be deduced that:

SNR_x / SNR_y = √(Sns_x · t_e) / √(Sns_y · t_e) = √(Sns_x / Sns_y).

From the above derivation, the SNR is proportional to √NEC, and NEC can therefore serve as a basis for measuring image noise. The SNR is related to the diameter of the object, so the patient volume can also serve as a measure of image noise. The SNR is related to the system sensitivity, so the system sensitivity can likewise serve as a measure of image noise.

Similarly, in a three-dimensional PET scanning scenario, the NEC of a region in the target direction, the patient volume in the target direction, and the system sensitivity in the target direction may be used to estimate the noise in the target direction. For example, for each region along the target direction, let the NEC be denoted NEC_z, the object volume be denoted Volume_z, and the system sensitivity be denoted Sns_z. The image processing parameter can then be modeled with respect to this count information as a function of NEC_z, Volume_z, and Sns_z scaled by a constant k, where NEC_z, Volume_z, and Sns_z are each variables of the position in the target direction.
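The exact form of the model is not reproduced here, but the following Python sketch shows how a per-position smoothing parameter could be assembled from NEC_z, Volume_z, and Sns_z. The assumed relationship (smoothing strength proportional to an estimated relative noise level √(Volume_z / (NEC_z · Sns_z)), scaled by a constant k) is an illustrative assumption, not the specific formula of this application.

```python
import numpy as np

def smoothing_from_noise_model(nec_z, volume_z, sns_z, k=1.0):
    """Illustrative per-position smoothing parameter.

    Assumption (not this application's formula): image noise at axial
    position z grows with object volume and falls with NEC and system
    sensitivity, so the smoothing strength is taken proportional to
    sqrt(Volume_z / (NEC_z * Sns_z)).
    """
    nec_z, volume_z, sns_z = map(np.asarray, (nec_z, volume_z, sns_z))
    noise_estimate = np.sqrt(volume_z / (nec_z * sns_z))
    return k * noise_estimate

# Hypothetical axial profiles: edge regions have lower NEC and sensitivity.
nec = np.array([0.8e6, 2.0e6, 3.5e6, 2.0e6, 0.8e6])
vol = np.array([18.0, 25.0, 30.0, 25.0, 18.0])   # e.g., litres per region
sns = np.array([0.5, 0.8, 1.0, 0.8, 0.5])        # relative sensitivity
print(smoothing_from_noise_model(nec, vol, sns))
```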
Step 240, processing the raw imaging data based on the image processing parameters to generate target imaging data. In some embodiments, step 240 may be performed by processing module 640.
The target imaging data may refer to data processed from the raw imaging data. In some embodiments, processing the raw imaging data may include image reconstruction processing, image smoothing processing, image enhancement processing, image fusion processing, image beautification processing, and the like, or any combination thereof, on the raw imaging data.
In some embodiments, the processing device 110 may process the raw imaging data according to an iterative image processing algorithm (e.g., an iterative reconstruction algorithm) based on the image processing parameters to generate target imaging data. The iterative reconstruction algorithm may include a Maximum Likelihood Estimation (MLE) algorithm, an Ordered Subset Expectation Maximization (OSEM), and the like. For example, the processing device 110 may generate target imaging data by performing multiple iterations of the raw imaging data based on the image processing parameters. For further description of generating target imaging data, reference is made to the description associated with FIG. 3. As another example, the processing device 110 may obtain the iterative update factor. The processing device 110 may generate a target reconstructed image by performing multiple iterations of the initialization image based on the iteration update factor and the image processing parameters. For more description of generating a reconstructed image of the object, reference is made to the description associated with fig. 4.
In some embodiments, the processing device 110 may process the raw imaging data according to a non-iterative image processing algorithm (e.g., a non-iterative reconstruction algorithm) based on the image processing parameters to generate target imaging data. The non-iterative reconstruction algorithm may include a filtered back projection (FBP) reconstruction algorithm, etc. For example, in the FBP reconstruction algorithm, according to the central slice theorem, the projection p_φ(s) of an image f(x, y) at view angle φ is a slice through the origin of the two-dimensional Fourier transform of f(x, y). The reconstructed image I(x, y) derived from the central slice theorem can therefore be written as:

I(x, y) = ∫_0^π ∫ p_φ(s) · h(x·cosφ + y·sinφ − s) ds dφ,

where h(s) represents a filter function. In some embodiments, the filter coefficients of the filter function may be adjusted according to the different scan information corresponding to different positions in the target direction. That is, compared with the conventional FBP reconstruction algorithm, in which the same h(s) is used at every position in the target direction, in some embodiments of the present application the filter function may be a variable of the position in the target direction, i.e., the filter function may be written as h_z(s).
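As an illustration of a position-dependent filter h_z(s), the Python sketch below applies a ramp filter apodized by a Gaussian window whose width varies per axial slice before backprojection; the window shape and the widths are hypothetical choices, not the specific filter of this application.

```python
import numpy as np

def filter_sinogram(sinogram, window_sigma):
    """Filter one slice's sinogram (rows = view angles, cols = detector bins)
    with a ramp filter apodized by a Gaussian window of width window_sigma
    (in cycles per bin). A smaller window_sigma suppresses more high-frequency
    noise, i.e., smooths more."""
    n_bins = sinogram.shape[1]
    freqs = np.fft.fftfreq(n_bins)
    ramp = np.abs(freqs)
    window = np.exp(-0.5 * (freqs / window_sigma) ** 2)
    spectrum = np.fft.fft(sinogram, axis=1) * (ramp * window)
    return np.real(np.fft.ifft(spectrum, axis=1))

# Hypothetical: noisier edge slices get a narrower window (stronger smoothing).
rng = np.random.default_rng(0)
sinogram_stack = rng.poisson(50.0, size=(5, 180, 128)).astype(float)  # (z, angle, bin)
sigmas_z = [0.08, 0.12, 0.20, 0.12, 0.08]
filtered = np.stack([filter_sinogram(s, sig) for s, sig in zip(sinogram_stack, sigmas_z)])
```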
in some embodiments, processing device 110 may process the raw imaging data according to a plurality of joint image processing algorithms based on image processing parameters to generate target imaging data. For example, the processing device 110 may process the raw imaging data according to an image reconstruction algorithm (e.g., a line processing maximum likelihood algorithm (RAMLA) algorithm) and a filtering algorithm based on the image processing parameters to generate target imaging data. The process of image reconstruction according to the RAMLA algorithm can be represented as:
Figure BDA0003665563030000133
where j denotes the pixel index, i denotes the number of iterations, b denotes the projection, a denotes the system matrix, l denotes the subset index, λ z Denotes a convergence coefficient, and Sn denotes a subset sequence. After the reconstruction of the image is completed, the image is filtered using a post-filtering algorithm (e.g., gaussian post-filtering) with target direction adjustment:
Figure BDA0003665563030000134
wherein σ z Representing the width at half maximum, σ, of Gauss z A variable that may be a position in a target direction; and x represents the distance of the neighborhood pixels from the center pixel. According to some embodiments of the present application, noise distribution in an image may be equalized by jointly using an image reconstruction algorithm and a filtering algorithm.
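A minimal sketch of the target-direction-adjusted Gaussian post-filtering step, assuming the reconstructed volume is ordered as (z, y, x) and that σ_z has already been determined per slice; the widths below are hypothetical.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def post_filter_along_z(volume, sigma_z):
    """Smooth each transaxial slice with its own in-plane Gaussian width.

    volume:  3-D array ordered (z, y, x).
    sigma_z: sequence of Gaussian widths, one per axial slice, e.g. derived
             from the position-dependent image processing parameters.
    """
    out = np.empty_like(volume, dtype=float)
    for z, sigma in enumerate(sigma_z):
        out[z] = gaussian_filter(volume[z], sigma=sigma)  # 2-D in-plane smoothing
    return out

# Hypothetical example: smooth the noisier edge slices more than the center one.
vol = np.random.default_rng(1).normal(size=(5, 64, 64))
smoothed = post_filter_along_z(vol, sigma_z=[3.0, 2.0, 1.0, 2.0, 3.0])
```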
FIG. 3 is an exemplary flow diagram illustrating the generation of target imaging data according to some embodiments of the present description. In some embodiments, flow 300 may be performed by processing device 110 (e.g., processing module 640). In some embodiments, step 240 shown in FIG. 2 may be implemented by flow 300. As shown in fig. 3, the process 300 includes one or more iterations. Each iteration round may include the following steps.
Step 310, obtaining original imaging data or updated imaging data generated in a previous iteration.
If the current iteration is the first iteration, the original imaging data to be subjected to image processing (e.g., image reconstruction processing, image smoothing processing, image enhancement processing, image fusion processing, and image beautification processing) may be acquired. For example, the raw imaging data may be PET projection data (e.g., list mode data, sinogram data), PET reconstructed images, or the like. In some embodiments, the raw imaging data may be a PET reconstructed image generated according to the process 400, and the processing device 110 may further perform other image processing operations (e.g., image smoothing, image enhancement, image fusion, image beautification) on the PET reconstructed image according to the process 300. The acquisition of raw imaging data is similar to that described in step 210 and will not be described in detail here.
If the current iteration is the second or subsequent iteration, the updated imaging data generated in the previous iteration may be acquired.
Step 320, processing the original imaging data or the updated imaging data based on the image processing parameters to generate processed imaging data.
In some embodiments, the image processing parameters used in different iterations may be the same or different. In some embodiments, in the current iteration, the processing device 110 may update the image processing parameters based on the updated imaging data and scan information generated in the previous iteration to generate updated image processing parameters. For example, the processing device 110 may determine a tracer activity based on the updated imaging data (e.g., an updated PET image) and adjust the image processing parameters according to the tracer activity to generate updated image processing parameters. In particular, the pixel values (or voxel values) of the updated imaging data (e.g., the updated PET image) may represent tracer activity distribution information within the human body, and the processing device 110 may adjust the image processing parameters based on the pixel value (or voxel value) information along the target direction in the imaging data to generate updated image processing parameters. For another example, the value of the image processing parameter used in each iteration may be gradually decreased as the number of iterations increases, so as to optimize the convergence path and achieve a better convergence effect for image processing. The processing device 110 may process the updated imaging data based on the updated image processing parameters to generate processed imaging data. According to some embodiments of the present description, by adjusting image processing parameters in each iteration, the convergence speed of the iteration can be increased, thereby increasing the efficiency of the iterative processing.
In some embodiments, the flow 300 may be used to smooth or enhance the raw imaging data. At this time, the image processing parameter may be an image smoothing parameter or an image enhancement parameter. In step 320, the processing device 110 may perform image smoothing processing (or image enhancement processing) on the updated imaging data generated in the previous iteration based on the image smoothing parameter (or image enhancement parameter) and the prior function. A prior function may be used to constrain the prior distribution of the image distribution. For example, the prior function may be a Markov random field model that has a smoothing effect on the image. In some embodiments, the process 300 may be used to reconstruct raw imaging data. At this time, the image processing parameter may be an image reconstruction parameter. In step 320, the processing device 110 may process the updated reconstructed image generated in the previous iteration based on the image reconstruction parameters and the iteration update factor. For further description of image reconstruction, reference is made to the description associated with fig. 4.
Step 330, determine whether the iteration satisfies the termination condition.
In some embodiments, the termination condition may be related to the number of iterations that have been performed. For example, the termination condition may be that the number of iterations that have been performed is greater than a number threshold. In some embodiments, the termination condition may be related to a difference between processed imaging data generated for two consecutive iterations. For example, the termination condition may be that a difference between processed imaging data generated for two consecutive iterations is less than a difference threshold. In some embodiments, the difference between the first imaging data (e.g., the first image) and the second imaging data (e.g., the second image) may be represented by a difference between an average gray-scale value of pixels or voxels of the first imaging data and an average gray-scale value of pixels or voxels of the second imaging data. In some embodiments, the termination condition may be related to the quality of the processed imaging data. For example, the termination condition may be that the quality (e.g., spatial resolution, density resolution, signal-to-noise ratio) of the processed imaging data satisfies a quality threshold (e.g., spatial resolution threshold, density resolution threshold, signal-to-noise ratio threshold). The number threshold and/or the difference threshold may be set manually by a user or set by one or more components of the image processing system 100 (e.g., the processing device 110) according to different circumstances.
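A minimal sketch of such a termination check is given below, assuming the processed results are NumPy arrays and combining an iteration-count limit with the difference between the mean grey values of two consecutive results; the thresholds and names are placeholders for illustration.

```python
def iteration_done(prev_result, curr_result, n_iter, max_iter=50, diff_thresh=1e-3):
    # Terminate when the number of iterations performed exceeds the count threshold,
    # or when the mean grey values of two consecutive results differ by less than
    # the difference threshold.
    if n_iter >= max_iter:
        return True
    return abs(float(curr_result.mean()) - float(prev_result.mean())) < diff_thresh
```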
Step 340, in response to determining that the iteration satisfies the termination condition, determining the processed imaging data as target imaging data.
In some embodiments, in response to determining that the iteration satisfies the termination condition, the processed imaging data resulting from the current iteration may be determined to be the target imaging data. In some embodiments, in response to determining that the iteration does not satisfy the termination condition, the processed imaging data obtained in the current iteration may be used as updated imaging data, and the process returns to step 310 for the next iteration until the termination condition is satisfied in a certain iteration.
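Putting steps 310-340 together, the overall loop of flow 300 might be organized as sketched below, reusing the iteration_done check sketched above; process_fn stands for whatever processing step 320 applies and is an assumption of this illustration.

```python
def iterative_processing(raw_data, params, process_fn, max_iter=50):
    current = raw_data                                        # step 310: first round starts from the raw imaging data
    for n in range(1, max_iter + 1):
        processed = process_fn(current, params)               # step 320
        if iteration_done(current, processed, n, max_iter):   # step 330
            return processed                                  # step 340: target imaging data
        current = processed                                   # updated imaging data for the next round
    return current
```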
It should be noted that the above description of the flow is for illustration and description only and does not limit the scope of the application of the present specification. Various modifications and alterations to the flow may occur to those skilled in the art, given the benefit of this description. However, such modifications and variations are intended to be within the scope of the present description.
FIG. 4 is an exemplary flow diagram illustrating the generation of a reconstructed image of a target according to some embodiments of the present description. In some embodiments, flow 400 may be performed by processing device 110 (e.g., processing module 640). In some embodiments, step 240 shown in FIG. 2 may be implemented by flow 400. As shown in fig. 4, the process 400 includes one or more iterations. Each iteration round may include the following steps.
At step 410, an initialization image or updated imaging data generated in a previous iteration is obtained.
If the current iteration is the first iteration, an initialization image may be obtained. In some embodiments, the initialization image may include at least two pixels or voxels having an estimated characteristic, such as a gray value, intensity, color, or the like. In some embodiments, the initialization image may be set by a user or determined by one or more components of image processing system 100 (e.g., processing device 110). In some embodiments, the gray-scale values of the pixels or voxels in the initialization image may be set to different values or the same value. For example, the gray scale values of pixels or voxels in the initialization image may each be set to 0.
If the current iteration is the second or subsequent iteration, the updated imaging data generated in the previous iteration may be acquired.
Step 420, reconstructing the initialization image or the updated imaging data based on the iterative update factor and the image processing parameters to generate a reconstructed image.
The iterative update factor may refer to a difference between the current reconstructed image estimate and the actual tissue photon distribution image, derived inversely by modeling a plurality of physical factors (e.g., particle collisions, detector efficiency, and randomness of events) based on the correlation between the number of photons emitted in the tissue and the number of photons received by the detector. In some embodiments, the iterative update factor may be determined according to flow 500. In some embodiments, during each iteration, processing device 110 may reconstruct the initialization image or the updated imaging data generated in a previous iteration based on the image processing parameters and the iterative update factor to generate a reconstructed image. The image processing parameters may be the same or different in each iteration, as described in step 320 of fig. 3.
In some embodiments, the processing device 110 may reconstruct the initialized image based on the iterative update factor and the image processing parameters according to an iterative reconstruction algorithm to generate a reconstructed image. In a conventional iterative reconstruction process, where the image processing parameter is not a variable of the position in the target direction (i.e. different positions in the target direction use the same image processing parameter), the iterative reconstruction process can be expressed as equation (20):
f^{n+1} = f^{n} + λG(f)    (20),

where f^{n+1} represents the image generated in the current iteration; f^{n} represents the image generated in the previous iteration; λ represents an image processing parameter (which may also be referred to as an image reconstruction parameter or an iterative adjustment factor) and may represent the convergence step size of the image reconstruction; and G(f) represents the iterative update factor. According to some embodiments of the present description, the image processing parameter may be a variable of the position in the target direction, and the iterative reconstruction process may be represented as equation (21):
f^{n+1} = f^{n} + λ_Z G(f)    (21),

where λ_Z represents the image processing parameter at position Z in the target direction. Merely as an example, λ_Z may denote the image processing parameter at position Z in the axial direction, and the image processing parameters are different for different Z positions.
In some embodiments, equation (21) may also represent other iterative image processing procedures, such as image smoothing, image denoising, and image enhancement. For example, during an image smoothing iteration (e.g., a regularization iteration), λ_Z represents the image smoothing parameter (e.g., a smoothing intensity coefficient) at position Z in the target direction.
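As a concrete illustration of equation (21), the following Python sketch applies a per-position update, assuming the image and the update factor are NumPy arrays whose first axis is the target (axial) direction; the array shapes and the function name are assumptions for illustration only. With a constant lambda_z, this reduces to the conventional update of equation (20).

```python
import numpy as np

def reconstruction_update(f_prev, update_factor, lambda_z):
    # f_prev:        image f^n from the previous iteration, shape (Z, Y, X)
    # update_factor: iterative update factor G(f), same shape as f_prev
    # lambda_z:      image processing parameter per position along the target (Z) direction, shape (Z,)
    # Broadcasting lambda_z over the Y and X axes gives each Z position its own step size.
    return f_prev + lambda_z[:, None, None] * update_factor
```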
Step 430, determining whether the iteration satisfies a termination condition.
Step 430 is similar to step 330 and will not be described herein.
Step 440, in response to determining that the iteration satisfies the termination condition, determining the reconstructed image as the target reconstructed image.
In some embodiments, in response to determining that the iteration satisfies the termination condition, the reconstructed image obtained in the current iteration may be determined as the target reconstructed image. In some embodiments, in response to determining that the iteration does not satisfy the termination condition, the reconstructed image obtained in the current iteration may be used as the updated imaging data, and the process returns to step 410 for the next iteration until the termination condition is satisfied in a certain iteration.
FIG. 5 is an exemplary flow diagram illustrating obtaining an iterative update factor according to some embodiments of the present description. In some embodiments, flow 500 may be performed by processing device 110 (e.g., processing module 640). As shown in fig. 5, the process 500 includes the following steps.
At step 510, a forward projection operation is performed on the initialized image or the updated imaging data to determine first projection data.
In some embodiments, during the first iteration, the processing device 110 may determine the first projection data by performing a forward projection operation on the initialization image. During subsequent iterations, the processing device 110 may perform a forward projection operation on updated imaging data (e.g., an updated reconstructed image) generated in a previous iteration to determine first projection data.
In some embodiments, the processing device 110 may determine the first projection data by projecting the initialization image (or the updated imaging data) onto a particular projection plane. In some embodiments, the processing device 110 may determine the first projection data based on the initialization image (or the updated imaging data) and the projection matrix. For example, the processing device 110 may determine the first projection data by multiplying the projection matrix by the initialization image. In some embodiments, the projection matrix may be set by a user or set by one or more components of image processing system 100 (e.g., processing device 110) according to different circumstances.
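As an illustration of the matrix formulation mentioned above, a forward projection can be written as a matrix-vector product; the projection_matrix name and the dense-matrix representation are assumptions of this sketch (practical systems typically use sparse or on-the-fly projectors).

```python
import numpy as np

def forward_project(image, projection_matrix):
    # projection_matrix has shape (number of projection bins, number of voxels);
    # multiplying it with the flattened image estimate yields the first projection data (step 510).
    return projection_matrix @ image.ravel()
```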
Second projection data is determined based on the raw imaging data, step 520.
In some embodiments, the raw imaging data may be raw projection data acquired by a medical device, which may be used directly as the second projection data. Alternatively, the processing device 110 may pre-process the raw projection data (e.g., PET projection data) and use the pre-processed raw projection data as the second projection data. In some embodiments, the raw imaging data may be an image, and the processing device 110 may acquire raw projection data corresponding to the image and use the raw projection data corresponding to the image as the second projection data. Alternatively, the processing device 110 may perform a forward projection operation on the image to determine the second projection data.
In step 530, third projection data is determined based on the first projection data and the second projection data.
The third projection data may characterize the deviation between the first projection data and the second projection data. In some embodiments, the third projection data may be a ratio of the first projection data to the second projection data. In some embodiments, the third projection data may be the difference between the first projection data and the second projection data; for example, the processing device 110 may determine the third projection data by subtracting one set of projection data from the other.
And 540, performing back projection operation on the third projection data to determine back projection data.
In some embodiments, the processing device 110 may perform a backprojection operation (e.g., a filtered backprojection operation) on the third projection data to determine the backprojection data.
In step 550, an iterative update factor is determined based on the first projection data, the backprojection data, and the normalization matrix.
In some embodiments, the processing device 110 may obtain an image with an initialization value of 1 and determine the normalization matrix based on the image with the initialization value of 1. For example, the processing device 110 may determine the normalization matrix by performing a backprojection operation on an image with an initialization value of 1.
In some embodiments, the processing module 640 may determine an iterative update factor based on the first projection data, the backprojection data, and the normalization matrix.
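One plausible way steps 520-550 could fit together is sketched below, assuming an MLEM-style ratio comparison (second projection data divided by first projection data) and a normalization matrix obtained by back projecting data initialized to 1; these choices, the dense-matrix projectors, and all names are assumptions of this sketch rather than the specific combination used by the embodiments.

```python
import numpy as np

def iterative_update_factor(first_proj, second_proj, projection_matrix, image_shape):
    # first_proj:  forward projection of the current image estimate (step 510)
    # second_proj: projection data derived from the raw imaging data (step 520)
    # Step 530: third projection data as a ratio of the two projection data sets.
    third_proj = second_proj / np.maximum(first_proj, 1e-10)
    # Step 540: back projection of the third projection data (transpose of the projector).
    back_proj = projection_matrix.T @ third_proj
    # Step 550: normalization matrix from back projecting all-ones data, then combine
    # it with the back projection data to form the iterative update factor.
    norm = projection_matrix.T @ np.ones_like(second_proj)
    return (back_proj / np.maximum(norm, 1e-10)).reshape(image_shape)
```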
It should be noted that the above description of the flow is for illustration and description only and does not limit the scope of the application of the present specification. Various modifications and alterations to the flow may occur to those skilled in the art, given the benefit of this description. However, such modifications and variations are intended to be within the scope of the present description.
FIG. 6 is an exemplary block diagram of an image processing system shown in accordance with some embodiments of the present description. In some embodiments, the processing device 110 may include an acquisition module 610, a scan information determination module 620, a processing parameter determination module 630, and a processing module 640.
The acquisition module 610 may be used to acquire information and data related to the image processing system 100. In some embodiments, the acquisition module 610 may acquire raw imaging data. More description of acquiring raw imaging data may be found elsewhere in this specification (e.g., step 210 in fig. 2 and its description).
The scan information determination module 620 may be used to determine scan information related to a scan of a target object. The scan information may include information related to the medical device, information related to the raw imaging data, information related to the target object, and the like, or any combination thereof. More description of determining scan information related to a scan of a target object may be found elsewhere in this specification (e.g., step 220 in fig. 2 and its description).
The processing parameter determination module 630 may be used to determine image processing parameters. In some embodiments, the processing parameter determination module 630 may determine image processing parameters based on the scan information. For example, the processing parameter determining module 630 may determine the image processing parameter corresponding to a specific position in the target direction based on the scanning information corresponding to the specific position. For another example, the processing parameter determining module 630 may determine the image processing parameter corresponding to a specific position in the target direction based on the scan information related to the specific position and the reference position of the specific position. More description of determining image processing parameters may be found elsewhere in this specification (e.g., step 230 in fig. 2 and its description).
The processing module 640 may be used to process the raw imaging data based on image processing parameters to generate target imaging data. In some embodiments, the processing module 640 may acquire raw imaging data or updated imaging data generated in a previous iteration. The processing module 640 may process the raw imaging data or the updated imaging data based on the image processing parameters to generate processed imaging data. The processing module 640 may determine whether the iteration satisfies a termination condition. The processing module 640 may determine the processed imaging data as target imaging data in response to determining that the iteration satisfies the termination condition. In some embodiments, the processing module 640 may obtain an initialization image or updated imaging data generated in a previous iteration. The processing module 640 may reconstruct the initialization image or the updated imaging data based on the iterative update factor and the image processing parameters to generate a reconstructed image. The processing module 640 may determine whether the iteration satisfies a termination condition. The processing module 640 may determine the reconstructed image as the target reconstructed image in response to determining that the iteration satisfies the termination condition. More description of generating target imaging data may be found elsewhere in this specification (e.g., step 240 in fig. 2, fig. 3-5, and descriptions thereof).
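The cooperation of the four modules can be pictured with a small skeleton like the one below; the class and method names are illustrative assumptions, and each module object is expected to wrap the behaviour described for the corresponding module above.

```python
class ProcessingDevice:
    # Illustrative skeleton of the modules in FIG. 6; concrete module objects
    # are expected to implement the methods called in run().
    def __init__(self, acquisition_module, scan_info_module, param_module, processing_module):
        self.acquisition_module = acquisition_module   # acquisition module 610
        self.scan_info_module = scan_info_module       # scan information determination module 620
        self.param_module = param_module                # processing parameter determination module 630
        self.processing_module = processing_module      # processing module 640

    def run(self):
        raw = self.acquisition_module.acquire_raw_imaging_data()
        scan_info = self.scan_info_module.determine_scan_info(raw)
        params = self.param_module.determine_image_processing_params(scan_info)
        return self.processing_module.process(raw, params)   # target imaging data
```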
It should be noted that the above description of the processing device 110 is for illustrative purposes and is not intended to limit the scope of the present application. Various changes and modifications will occur to those skilled in the art based on the description herein. However, such changes and modifications do not depart from the scope of the present application. In some embodiments, one or more modules may be combined into a single module. For example, the scan information determination module 620 and the processing parameter determination module 630 may be combined into a single module. In some embodiments, one or more modules may be added or omitted in processing device 110. For example, processing device 110 may also include a storage module (not shown in fig. 6) configured to store data and/or information associated with image processing system 100 (e.g., raw imaging data, scan information, image processing parameters, target imaging data).
FIG. 9 is a schematic diagram of reconstructed images according to some embodiments described herein. As shown in FIG. 9, reconstructed images 910A and 920A are reconstructed images of a target object generated according to a maximum intensity projection (MIP) algorithm, and reconstructed images 910B and 920B are reconstructed images of cross-sections of the corresponding target object. Reconstructed images 910A and 910B were processed based on conventional image smoothing parameters (i.e., image smoothing parameters that are fixed along the target direction), whereas reconstructed images 920A and 920B were processed based on image smoothing parameters determined according to an embodiment of the present specification (i.e., image smoothing parameters that vary with position along the target direction). As can be seen in FIG. 9, reconstructed images 920A and 920B are smoother and exhibit significantly less noise (e.g., image edge noise) than reconstructed images 910A and 910B.
The beneficial effects that may be brought by the embodiments of the present description include, but are not limited to: (1) by setting the image processing parameter as a variable of a position in the target direction, the image quality of the image in the target direction can be flexibly controlled; (2) determining image processing parameters based on scanning information related to scanning of the target object in the target direction, so that image differences at different positions in the target direction can be reduced, and relatively uniform image quality is obtained; (3) a desired image quality can be achieved at a predetermined position in the target direction, for example, specific image processing parameters can be set for a specific organ region of the target object, thereby obtaining a specific optimized image effect of the specific organ; for another example, for a region of a boundary layer in an image where noise is large, generation of noise can be suppressed by setting a specific image processing parameter corresponding to the region; (4) different image processing parameters can be determined according to the characteristic information of different patients and/or different scanning related information, so that personalized image processing parameter setting aiming at different patients and/or different scanning is realized, and the quality of target imaging data is further improved. It is to be noted that different embodiments may produce different advantages, and in different embodiments, any one or combination of the above advantages may be produced, or any other advantages may be obtained.
Having thus described the basic concept, it will be apparent to those skilled in the art that the foregoing detailed disclosure is to be regarded as illustrative only and not as limiting the present specification. Various modifications, improvements and adaptations to the present description may occur to those skilled in the art, although not explicitly described herein. Such modifications, improvements and adaptations are proposed in the present specification and thus fall within the spirit and scope of the exemplary embodiments of the present specification.
Also, this specification uses specific terms to describe its embodiments. Reference throughout this specification to "one embodiment," "an embodiment," and/or "some embodiments" means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the specification. Therefore, it is emphasized and should be appreciated that two or more references to "an embodiment," "one embodiment," or "an alternative embodiment" in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, certain features, structures, or characteristics of one or more embodiments of the specification may be combined as appropriate.
Additionally, the order in which the elements and sequences of the process are recited in the specification, the use of alphanumeric characters, or other designations, is not intended to limit the order in which the processes and methods of the specification occur, unless otherwise specified in the claims. While various presently contemplated embodiments of the invention have been discussed in the foregoing disclosure by way of example, it is to be understood that such detail is solely for that purpose and that the appended claims are not limited to the disclosed embodiments, but, on the contrary, are intended to cover all modifications and equivalent arrangements that are within the spirit and scope of the embodiments herein. For example, although the system components described above may be implemented by hardware devices, they may also be implemented by software-only solutions, such as installing the described system on an existing server or mobile device.
Similarly, it should be noted that in the preceding description of embodiments of the present specification, various features are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of one or more of the embodiments. This method of disclosure, however, is not to be interpreted as requiring more features than are expressly recited in each claim. Indeed, claimed subject matter may lie in fewer than all features of a single embodiment disclosed above.
Some embodiments use numerals describing quantities of components, attributes, and the like; it should be understood that such numerals used in the description of the embodiments are modified in some instances by the qualifier "about," "approximately," or "substantially." Unless otherwise indicated, "about," "approximately," or "substantially" indicates that the stated number allows a variation of ±20%. Accordingly, in some embodiments, the numerical parameters used in the specification and claims are approximations that may vary depending upon the desired properties of the individual embodiments. In some embodiments, a numerical parameter should take into account the specified significant digits and employ a general digit-preserving approach. Notwithstanding that the numerical ranges and parameters setting forth the broad scope of some embodiments of this specification are approximations, in specific examples such numerical values are set forth as precisely as practicable.
For each patent, patent application, patent application publication, and other material, such as articles, books, specifications, publications, and documents, cited in this specification, the entire contents thereof are hereby incorporated by reference into this specification. Excluded are application history documents that are inconsistent with or conflict with the contents of this specification, as well as documents (currently or later appended to this specification) that limit the broadest scope of the claims of this specification. It is to be understood that if the description, definition, and/or use of a term in material accompanying this specification is inconsistent with or contrary to what is stated herein, the description, definition, and/or use of the term in this specification shall control.
Finally, it should be understood that the embodiments described herein are merely illustrative of the principles of the embodiments of the present disclosure. Other variations are also possible within the scope of the present description. Thus, by way of example, and not limitation, alternative configurations of the embodiments of the specification can be considered consistent with the teachings of the specification. Accordingly, the embodiments of the present description are not limited to only those embodiments explicitly described and depicted herein.

Claims (10)

1. An image processing method, characterized in that the method comprises:
acquiring original imaging data, wherein the original imaging data is acquired by scanning a target object with a medical device;
determining scan information related to a scan of the target object;
determining an image processing parameter based on the scanning information, wherein the image processing parameter is a variable of a position in a target direction; and
processing the raw imaging data based on the image processing parameters to generate target imaging data.
2. The method of claim 1, wherein the scan information comprises at least one of: information related to the medical device, information related to the raw imaging data, or information related to the target object.
3. The method of claim 2, wherein the medical device comprises a positron emission tomography device,
the information related to the medical device includes at least one of sensitivity of detectors of the medical device, a gap between adjacent detectors, and detection efficiency of the detectors,
the information related to the raw imaging data includes at least one of coincidence event count information and tracer activity,
the information related to the target object includes at least one of characteristic information of the target object and historical imaging data.
4. The method of claim 1, wherein processing the raw imaging data based on the image processing parameters comprises a plurality of iterations, wherein at least one iteration comprises:
acquiring updated imaging data generated in the previous iteration;
processing the updated imaging data based on the image processing parameters to generate processed imaging data;
judging whether the iteration meets a termination condition; and
in response to determining that the iteration satisfies the termination condition, determining the processed imaging data as the target imaging data.
5. The method of claim 4, wherein processing the updated imaging data based on the image processing parameters to generate processed imaging data comprises:
updating the image processing parameters based on the updated imaging data and the scan information to generate updated image processing parameters; and
processing the updated imaging data based on the updated image processing parameters to generate processed imaging data.
6. The method of claim 4, wherein the processing comprises an image reconstruction process, and wherein the processing the updated imaging data based on the image processing parameters to generate processed imaging data comprises:
obtaining an iteration updating factor; and
reconstructing the updated imaging data based on the iterative update factor and the image processing parameters to generate a reconstructed image.
7. The method of claim 6, wherein obtaining the iterative update factor comprises:
performing a forward projection operation on the updated imaging data to determine first projection data;
determining second projection data based on the raw imaging data;
determining third projection data based on the first projection data and the second projection data;
carrying out back projection operation on the third projection data to determine back projection data; and
determining the iteration update factor according to the first projection data, the back projection data and the normalization matrix.
8. The method of claim 1, wherein the processing comprises at least one of: image smoothing processing, image enhancement processing, image fusion processing, and image beautification processing.
9. An image processing system, characterized in that the system comprises at least one processor and at least one memory;
the at least one memory is for storing computer instructions;
the at least one processor is configured to execute at least some of the computer instructions to implement the method of any of claims 1 to 8.
10. An image processing system, characterized in that the system comprises an acquisition module, a scanning information determination module, a processing parameter determination module, and a processing module;
the acquisition module is used for acquiring original imaging data, and the original imaging data is acquired by scanning a target object with a medical device;
the scanning information determination module is used for determining scanning information related to the scanning of the target object;
the processing parameter determining module is used for determining an image processing parameter based on the scanning information, wherein the image processing parameter is a variable of a position in a target direction;
the processing module is used for processing the original imaging data based on the image processing parameters to generate target imaging data.
CN202210584974.3A 2022-05-27 2022-05-27 Image processing method and system Pending CN114897861A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210584974.3A CN114897861A (en) 2022-05-27 2022-05-27 Image processing method and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210584974.3A CN114897861A (en) 2022-05-27 2022-05-27 Image processing method and system

Publications (1)

Publication Number Publication Date
CN114897861A (en) 2022-08-12

Family

ID=82725289

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210584974.3A Pending CN114897861A (en) 2022-05-27 2022-05-27 Image processing method and system

Country Status (1)

Country Link
CN (1) CN114897861A (en)

Similar Documents

Publication Publication Date Title
JP7518255B2 (en) Medical image processing device and medical image processing system
Gong et al. Iterative PET image reconstruction using convolutional neural network representation
Tong et al. Image reconstruction for PET/CT scanners: past achievements and future challenges
JP2024054204A (en) Learning method of neural network, program, medical image processing method and medical device
Guérin et al. Nonrigid PET motion compensation in the lower abdomen using simultaneous tagged‐MRI and PET imaging
Li et al. Model‐based image reconstruction for four‐dimensional PET
CN112381741B (en) Tomography image reconstruction method based on SPECT data sampling and noise characteristics
US9053569B2 (en) Generating attenuation correction maps for combined modality imaging studies and improving generated attenuation correction maps using MLAA and DCC algorithms
CN115605915A (en) Image reconstruction system and method
CN108209954B (en) Emission type computed tomography image reconstruction method and system
CN109961419B (en) Correction information acquisition method for attenuation correction of PET activity distribution image
CN107348969B (en) PET data processing method and system and PET imaging equipment
Hu et al. Design and implementation of automated clinical whole body parametric PET with continuous bed motion
Ote et al. List-mode PET image reconstruction using deep image prior
US11995745B2 (en) Systems and methods for correcting mismatch induced by respiratory motion in positron emission tomography image reconstruction
CN110458779B (en) Method for acquiring correction information for attenuation correction of PET images of respiration or heart
CN112529977B (en) PET image reconstruction method and system
CN114897861A (en) Image processing method and system
US11487029B2 (en) Systems and methods for positron emission tomography image reconstruction
US11663758B2 (en) Systems and methods for motion estimation in PET imaging using AI image reconstructions
US8437525B2 (en) Method and system for using a modified ordered subsets scheme for attenuation weighted reconstruction
Kaur et al. Complex diffusion regularisation-based low dose CT image reconstruction
CN112365593B (en) PET image reconstruction method and system
WO2018022565A1 (en) System and method for tomographic image reconstruction
Us Reduction of Limited Angle Artifacts in Medical Tomography via Image Reconstruction

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination