WO2023039736A1 - Systems and methods for image reconstruction - Google Patents

Info

Publication number: WO2023039736A1
Authority: WO (WIPO, PCT)
Application number: PCT/CN2021/118380
Other languages: French (fr)
Inventors: Xinlu GUO, Shankui LI, Junjie Li, Xin Wang, Zhonghua YU
Original Assignee / Applicant: Shanghai United Imaging Healthcare Co., Ltd.
Priority to: PCT/CN2021/118380 (published as WO2023039736A1); CN202180102293.2A (published as CN117940072A)
Prior art keywords: reconstruction, target, data, image, initial image

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B6/00 Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B6/52 Devices using data or image processing specially adapted for radiation diagnosis
    • A61B6/5205 Devices using data or image processing specially adapted for radiation diagnosis involving processing of raw data to produce diagnostic data
    • A61B6/02 Arrangements for diagnosis sequentially in different planes; Stereoscopic radiation diagnosis
    • A61B6/03 Computed tomography [CT]
    • A61B6/032 Transmission computed tomography [CT]
    • A61B6/037 Emission tomography

Definitions

  • the disclosure generally relates to the field of image reconstruction, and more particularly relates to systems and methods for multiple image reconstructions associated with the same raw data.
  • an imaging device, e.g., a computed tomography (CT) device, may be used for scanning an object (e.g., a patient or a portion thereof) to generate raw data of the object.
  • the multiple reconstructions performed on the raw data may include repeated or related operations, which may occupy more resources and be time-consuming. Therefore, it is desirable to provide systems and methods for improving the processing efficiency of multiple image reconstructions associated with the same raw data, thereby simplifying the multiple image reconstruction process and saving the resources and time of the multiple image reconstruction process.
  • a method for image reconstruction may be implemented on a computing device including at least one processor and at least one storage device.
  • the method may include obtaining raw data of an object generated by an imaging device.
  • the method may include generating preprocessed data by preprocessing the raw data.
  • the method may also include generating initial image data by performing, according to a plurality of reconstruction parameters associated with a plurality of potential reconstruction needs, a reconstruction operation on the preprocessed data, such that the initial image data meets the plurality of potential reconstruction needs.
  • the method may also include storing the initial image data.
  • the method may also include retrieving first target data from the initial image data according to a first target reconstruction need.
  • the method may further include generating a first target image of the object by performing a postprocessing operation on the first target data, such that the first target image meets the first target reconstruction need.
  • the first target reconstruction need may indicate at least one of a position and/or a scope of the first target data in the initial image data, or a size of pixels or voxels of the first target image.
  • a size of pixels or voxels of the first target image may be no less than sizes of pixels or voxels of an initial image sequence corresponding to the initial image data.
  • the plurality of reconstruction parameters may include at least one of a reconstruction slice thickness, a reconstruction interval, a reconstruction field of view (FOV) , reconstruction start and end positions, or a reconstruction matrix.
  • the method may further include retrieving second target data from the initial image data according to a second target reconstruction need; and generating a second target image of the object by performing a postprocessing operation on the second target data, such that the second target image meets the second target reconstruction need.
  • the second target reconstruction need may indicate at least one of a position and/or a scope of the second target data in the initial image data, or a size of pixels or voxels of the second target image.
  • the first target image of the object and the second target image of the object may be generated in parallel.
  • the first target reconstruction need may be associated with a first set of basic reconstruction parameters.
  • the second target reconstruction need may be associated with a second set of basic reconstruction parameters. At least one of the first set of basic reconstruction parameters or the second set of basic reconstruction parameters may be coarser than a corresponding one of the plurality of reconstruction parameters.
  • the preprocessing of the raw data may include determining whether the raw data is corrupted, determining whether the raw data needs correction, or determining whether the raw data needs noise reduction.
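By way of illustration only, these preprocessing checks might be arranged as in the following Python sketch; the correction and noise-reduction steps are simplified placeholders (assumptions for illustration), not the corrections contemplated by the disclosure:

```python
import numpy as np

def preprocess(raw_data: np.ndarray, noise_threshold: float = 0.05) -> np.ndarray:
    """Illustrative preprocessing: corruption check, correction, noise reduction."""
    # Corruption check: e.g., NaN/Inf samples from a faulty detector channel.
    if not np.isfinite(raw_data).all():
        raise ValueError("raw data is corrupted")
    # Correction (placeholder): rescale to a common range; a real system would
    # apply calibrated corrections such as air calibration or beam hardening.
    raw_data = raw_data / np.abs(raw_data).max()
    # Noise reduction (placeholder): smooth along the detector axis only if
    # the data appear noisy.
    if raw_data.std() > noise_threshold:
        kernel = np.ones(3) / 3.0
        raw_data = np.apply_along_axis(
            lambda row: np.convolve(row, kernel, mode="same"), -1, raw_data)
    return raw_data
```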
  • the postprocessing operation may include at least one of an artifact removal operation, a correction operation, or a merging operation.
  • the method may further include obtaining a request for adjusting the plurality of reconstruction parameters; and updating the plurality of reconstruction parameters according to the request.
  • the method may further include causing the first target image to be displayed for a user.
  • the method may further include determining whether there is an erroneous result in performing the method. In response to a determination that there is an erroneous result, the method may include identifying, from operations of the method, an error operation; and reporting the error operation.
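Taken together, the method amounts to a two-stage pipeline: the expensive read/prepare/reconstruction stage runs once, and a cheap pick/postprocess stage runs once per target reconstruction need. The sketch below illustrates this structure under assumed stage functions (all placeholders, not the disclosed algorithms), including the claimed identification and reporting of an erroneous operation:

```python
import numpy as np

# Placeholder stages (assumptions for illustration, not the disclosed algorithms).
def prepare(raw):                    # preprocessing
    return np.nan_to_num(raw)

def reconstruct(prepared, params):   # stand-in for the actual reconstruction
    n = params["n_slices"]
    m = params["matrix"]
    return prepared.reshape(n, m, m)  # pretend: projections -> slice stack

def pick(initial, need):             # retrieve target data for one need
    return initial[need["start"]:need["end"]]

def postprocess(data, need):         # e.g., artifact removal / merging (identity here)
    return data

def reconstruct_for_needs(raw, params, needs):
    """Expensive first part runs once; cheap pick/postprocess runs per target need."""
    stage = "prepare"
    try:
        prepared = prepare(raw)                    # read/prepare: once
        stage = "reconstruction"
        initial = reconstruct(prepared, params)    # reconstruction: once, fine granularity
        results = {}
        for name, need in needs.items():
            stage = f"pick/postprocess ({name})"   # second part: per need
            results[name] = postprocess(pick(initial, need), need)
        return results
    except Exception as exc:
        # Error reporting as claimed: identify and report the erroneous operation.
        print(f"erroneous operation: {stage}: {exc}")
        raise
```

For instance, `reconstruct_for_needs(raw, {"n_slices": 100, "matrix": 512}, {"thin": {"start": 0, "end": 100}, "thick": {"start": 20, "end": 60}})` would reconstruct once and satisfy both needs from the same initial stack.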
  • a system for image reconstruction may include a storage device storing a set of instructions, and at least one processor in communication with the storage device.
  • the at least one processor may be configured to cause the system to perform the following operations.
  • the operations may include obtaining raw data of an object generated by an imaging device.
  • the operations may include generating preprocessed data by preprocessing the raw data.
  • the operations may also include generating initial image data by performing, according to a plurality of reconstruction parameters associated with a plurality of potential reconstruction needs, a reconstruction operation on the preprocessed data, such that the initial image data meets the plurality of potential reconstruction needs.
  • the operations may also include storing the initial image data.
  • the operations may also include retrieving first target data from the initial image data according to a first target reconstruction need.
  • the operations may further include generating a first target image of the object by performing a postprocessing operation on the first target data, such that the first target image meets the first target reconstruction need.
  • the first target reconstruction need may indicate at least one of a position and/or a scope of the first target data in the initial image data, or a size of pixels or voxels of the first target image.
  • a size of pixels or voxels of the first target image may be no less than sizes of pixels or voxels of an initial image sequence corresponding to the initial image data.
  • the plurality of reconstruction parameters may include at least one of a reconstruction slice thickness, a reconstruction interval, a reconstruction field of view (FOV) , reconstruction start and end positions, or a reconstruction matrix.
  • the operations may further include retrieving second target data from the initial image data according to a second target reconstruction need; and generating a second target image of the object by performing a postprocessing operation on the second target data, such that the second target image meets the second target reconstruction need.
  • the second target reconstruction need may indicate at least one of a position and/or a scope of the second target data in the initial image data, or a size of pixels or voxels of the second target image.
  • the first target image of the object and the second target image of the object may be generated in parallel.
  • the first target reconstruction need may be associated with a first set of basic reconstruction parameters.
  • the second target reconstruction need may be associated with a second set of basic reconstruction parameters. At least one of the first set of basic reconstruction parameters or the second set of basic reconstruction parameters may be coarser than a corresponding one of the plurality of reconstruction parameters.
  • the preprocessing of the raw data may include determining whether the raw data is corrupted, determining whether the raw data needs correction, or determining whether the raw data needs noise reduction.
  • the postprocessing operation may include at least one of an artifact removal operation, a correction operation, or a merging operation.
  • the operations may further include obtaining a request for adjusting the plurality of reconstruction parameters; and updating the plurality of reconstruction parameters according to the request.
  • the operations may further include causing the first target image to be displayed for a user.
  • the operations may further include determining whether there is an erroneous result in performing the operations. In response to a determination that there is an erroneous result, the operations may further include identifying, from the operations, an error operation; and reporting the error operation.
  • a system for image reconstruction may include an obtaining module, a preprocess module, a reconstruction module, one or more pick modules, and one or more postprocess modules each of which corresponds to one of the one or more pick modules.
  • the obtaining module may be configured to obtain raw data of an object generated by an imaging device.
  • the preprocess module may be configured to generate preprocessed data by preprocessing the raw data.
  • the reconstruction module may be configured to generate initial image data by performing, according to a plurality of reconstruction parameters associated with a plurality of potential reconstruction needs, a reconstruction operation on the preprocessed data, such that the initial image data meets the plurality of potential reconstruction needs; and store the initial image data.
  • Each of the one or more pick modules may be configured to retrieve target data from the initial image data according to a target reconstruction need.
  • the one or more pick modules may be arranged in parallel.
  • Each of the one or more postprocess modules may be configured to generate a target image of the object by performing a postprocessing operation on the target data, such that the target image meets the corresponding target reconstruction need.
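Because the pick modules may be arranged in parallel, the second part of the pipeline could be dispatched concurrently, e.g., with a thread pool as sketched below (the `pick_fn`/`postprocess_fn` interface is an assumption for illustration, not the disclosed module design):

```python
from concurrent.futures import ThreadPoolExecutor

def run_second_part(initial_image_data, target_needs, pick_fn, postprocess_fn):
    """Run one pick + postprocess pair per target reconstruction need, in parallel.

    pick_fn and postprocess_fn stand in for a pick module and its paired
    postprocess module; the interface is an assumption for illustration.
    """
    def satisfy(need):
        target_data = pick_fn(initial_image_data, need)   # pick module
        return postprocess_fn(target_data, need)          # paired postprocess module

    # One parallel branch per need, mirroring the parallel pick-module arrangement.
    with ThreadPoolExecutor(max_workers=max(1, len(target_needs))) as pool:
        return list(pool.map(satisfy, target_needs))
```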
  • a non-transitory computer readable medium may include executable instructions that, when executed by at least one processor, direct the at least one processor to perform a method for image reconstruction.
  • the method may include obtaining raw data of an object generated by an imaging device.
  • the method may include generating preprocessed data by preprocessing the raw data.
  • the method may also include generating initial image data by performing, according to a plurality of reconstruction parameters associated with a plurality of potential reconstruction needs, a reconstruction operation on the preprocessed data, such that the initial image data meets the plurality of potential reconstruction needs.
  • the method may also include storing the initial image data.
  • the method may also include retrieving first target data from the initial image data according to a first target reconstruction need.
  • the method may further include generating a first target image of the object by performing a postprocessing operation on the first target data, such that the first target image meets the first target reconstruction need.
  • a method for image reconstruction may be implemented on a computing device including at least one processor and at least one storage device.
  • the method may include obtaining a first parameter set associated with reconstruction according to a first user input.
  • the method may include obtaining raw data of an object generated by an imaging device.
  • the method may also include generating initial image data of the object by performing, according to the first parameter set, a reconstruction operation on the raw data.
  • the method may also include obtaining a second parameter set associated with postprocessing according to a second user input.
  • the method may further include generating a target image by performing, according to the second parameter set, a postprocessing operation on the initial image data.
  • a method for image reconstruction may be implemented on a computing device including at least one processor and at least one storage device.
  • the method may include obtaining raw data of an object generated by an imaging device.
  • the method may include generating preprocessed data by preprocessing the raw data.
  • the method may also include obtaining a reconstruction parameter set associated with at least two target reconstruction needs.
  • the method may further include generating an initial image by performing, according to the reconstruction parameter set, a reconstruction operation on the preprocessed data.
  • the initial image may meet the at least two target reconstruction needs.
  • a method for image reconstruction may be implemented on a computing device including at least one processor and at least one storage device.
  • the method may include obtaining raw data of an object generated by an imaging device.
  • the method may include generating initial image data of the object by performing a reconstruction operation on the raw data.
  • the method may also include obtaining a parameter set associated with postprocessing according to a user input.
  • the method may further include generating a target image by performing, according to the parameter set, a postprocessing operation on the initial image data.
  • a method for image reconstruction may be implemented on a computing device including at least one processor and at least one storage device.
  • the method may include obtaining an initial image of an object.
  • the method may also include obtaining at least two parameter sets associated with postprocessing according to a user input.
  • the method may further include generating, for each of the at least two parameter sets, a target image by performing, according to the parameter set, a postprocessing operation on the initial image.
  • an interface may include at least two icons corresponding to at least two target reconstruction needs respectively.
  • Each of the at least two target reconstruction needs may correspond to a reconstruction parameter set and a postprocessing parameter set.
  • operations for image reconstruction may be triggered.
  • the operations may include determining a plurality of reconstruction parameters based on the at least two reconstruction parameter sets; obtaining raw data of an object generated by an imaging device; generating initial image data of the object by performing, according to the plurality of reconstruction parameters, a reconstruction operation on the raw data; and generating a target image by performing, according to each of the at least two postprocessing parameter sets, a postprocessing operation on the initial image data.
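A minimal sketch of such an interface binding follows; the icon names, parameter values, and the rule for merging the reconstruction parameter sets are all hypothetical:

```python
# Hypothetical icon-to-parameter-set bindings; names and values are illustrative.
PROTOCOL_ICONS = {
    "lung": {"recon": {"slice_thickness": 1.0, "fov": 350.0},
             "post": {"kernel": "sharp"}},
    "bone": {"recon": {"slice_thickness": 0.625, "fov": 250.0},
             "post": {"kernel": "bone"}},
}

def on_icons_triggered(selected, raw_data, preprocess, reconstruct, postprocess):
    """Triggered when icons for at least two target reconstruction needs are selected."""
    sets = [PROTOCOL_ICONS[name] for name in selected]
    # Determine one reconstruction parameter set covering all selected needs
    # (here: finest slice thickness and widest FOV; the merge rule is assumed).
    merged = {
        "slice_thickness": min(s["recon"]["slice_thickness"] for s in sets),
        "fov": max(s["recon"]["fov"] for s in sets),
    }
    initial = reconstruct(preprocess(raw_data), merged)   # one reconstruction pass
    # One postprocessing pass per selected icon's postprocessing parameter set.
    return {name: postprocess(initial, s["post"]) for name, s in zip(selected, sets)}
```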
  • FIG. 1 is a schematic diagram illustrating an exemplary medical imaging system according to some embodiments of the present disclosure;
  • FIG. 2 is a schematic diagram illustrating hardware and/or software components of an exemplary computing device according to some embodiments of the present disclosure;
  • FIG. 3 is a schematic diagram illustrating hardware and/or software components of an exemplary mobile device according to some embodiments of the present disclosure;
  • FIG. 4 is a block diagram illustrating an exemplary processing device according to some embodiments of the present disclosure;
  • FIG. 5 is a flowchart illustrating an exemplary process for multiple image reconstructions associated with the same raw data according to some embodiments of the present disclosure;
  • FIG. 6 is a flowchart illustrating an exemplary process for generating a target image of an object based on initial image data of the object according to some embodiments of the present disclosure;
  • FIG. 7 is a flowchart illustrating an exemplary process for multiple image reconstructions associated with the same raw data according to some embodiments of the present disclosure; and
  • FIG. 8 is a flowchart illustrating an exemplary process for multiple image reconstructions associated with the same raw data according to some embodiments of the present disclosure.
  • the terms “system,” “engine,” “unit,” “module,” and/or “block” used herein are one method to distinguish different components, elements, parts, sections, or assemblies of different levels in ascending order. However, the terms may be replaced by another expression if they achieve the same purpose.
  • the term “module,” “unit,” or “block,” as used herein, refers to logic embodied in hardware or firmware, or to a collection of software instructions.
  • a module, a unit, or a block described herein may be implemented as software and/or hardware and may be stored in any type of non-transitory computer-readable medium or another storage device.
  • a software module/unit/block may be compiled and linked into an executable program. It will be appreciated that software modules can be callable from other modules/units/blocks or from themselves, and/or may be invoked in response to detected events or interrupts.
  • Software modules/units/blocks configured for execution on computing devices (e.g., processor 210 as illustrated in FIG. 2) may be provided on a computer-readable medium, such as a compact disc, a digital video disc, a flash drive, a magnetic disc, or any other tangible medium, or as a digital download (and can be originally stored in a compressed or installable format that needs installation, decompression, or decryption prior to execution).
  • Such software code may be stored, partially or fully, on a storage device of the executing computing device, for execution by the computing device.
  • Software instructions may be embedded in firmware, such as an EPROM.
  • hardware modules/units/blocks may be included in connected logic components, such as gates and flip-flops, and/or can be included in programmable units, such as programmable gate arrays or processors.
  • modules/units/blocks or computing device functionality described herein may be implemented as software modules/units/blocks, but may be represented in hardware or firmware.
  • the modules/units/blocks described herein refer to logical modules/units/blocks that may be combined with other modules/units/blocks or divided into sub-modules/sub-units/sub-blocks despite their physical organization or storage. The description may be applicable to a system, an engine, or a portion thereof.
  • the term “image” in the present disclosure is used to collectively refer to image data (e.g., scan data) and/or images of various forms, including a two-dimensional (2D) image, a three-dimensional (3D) image, a four-dimensional (4D) image, etc.
  • the terms “pixel” and “voxel” in the present disclosure are used interchangeably to refer to an element of an image.
  • the subject may include a biological subject (e.g., a human, an animal) , a non-biological subject (e.g., a phantom) , etc.
  • the flowcharts used in the present disclosure illustrate operations that systems implement according to some embodiments of the present disclosure. It is to be expressly understood that the operations of the flowcharts may not be implemented in the order shown. Conversely, the operations may be implemented in an inverted order, or simultaneously. Moreover, one or more other operations may be added to the flowcharts, and one or more operations may be removed from the flowcharts.
  • a representation of a subject in an image may be referred to as the subject for brevity.
  • an image including a representation of a subject may be referred to as an image of the subject or an image including the subject for brevity.
  • an operation on a representation of a subject in an image may be referred to as an operation on the subject for brevity.
  • a segmentation of a portion of an image including a representation of an organ or tissue (e.g., the heart, the liver, a lung, etc., of a patient) from the image may be referred to as a segmentation of the organ or tissue for brevity.
  • An aspect of the present disclosure relates to systems and methods for image reconstruction (e.g., optimizing multiple image reconstructions associated with the same raw data) .
  • the systems and methods may generate initial image data (e.g., an initial image sequence) by performing, according to a plurality of reconstruction parameters (also referred to as parameters with relatively small granules) associated with a plurality of potential reconstruction needs, a reconstruction operation on the preprocessed data, such that the initial image data meets the plurality of potential reconstruction needs.
  • the systems and methods may store the initial image data.
  • the systems and methods may retrieve one or more sets of target data from the initial image data according to one or more target reconstruction needs (TRNs) respectively.
  • the systems and methods may further generate a target image of the object by performing a postprocessing operation on the set of target data, such that the target image meets the target reconstruction need corresponding to the set of target data.
  • Each image reconstruction process may include multiple operations such as read (e.g., for obtaining the raw data) , prepare (e.g., for preprocessing the raw data) , reconstruction (e.g., for reconstructing images based on the preprocessed raw data) , postprocess (e.g., for postprocessing the reconstructed image) , output image (e.g., for outputting the postprocessed image) , etc.
  • the operations of read, prepare and reconstruction in the reconstruction process of multiple images may be at least partially the same or similar.
  • the operations of read, prepare, and reconstruction may occupy a relatively large amount of the resources in the image reconstruction process, e.g., 60%-80% of the resources needed for the image reconstruction process.
  • the operations of read, prepare, and reconstruction may need to be repeated, which is time-consuming and resource intensive.
  • it may take a relatively large amount of manpower and time for maintenance. Therefore, it is desirable to provide systems and methods for improving the processing efficiency of multiple image reconstructions.
  • an image reconstruction process may be modified to include two parts.
  • a first part may include the similar and resource-intensive operations (e.g., the operations of read, preprocess, and reconstruction) of the image reconstruction process as described above.
  • a second part may include operations of pick, postprocess, and output image. Differing from the traditional image reconstruction process, the operations in the first part of reconstruction process may be performed to generate initial image data according to a plurality of potential reconstruction needs, such that the pick operation in the second part can determine target image data based on the initial image data for generating one or more target images that meet one or more target reconstruction needs.
  • in this way, for multiple target reconstruction needs (TRN 1, TRN 2, ..., TRN n), operations of read, prepare, and reconstruction may be performed on the raw data only once, and target image data corresponding to each target reconstruction need may be directly retrieved from the initial image data for generating a corresponding target image, thereby reducing the consumption of resources and time of the multiple image reconstructions.
  • with the image reconstruction process divided into the two parts, when an erroneous result occurs in the multiple image reconstructions, it is easy to identify which part has the error, which helps the developer position the erroneous operation faster and saves manpower and time.
  • FIG. 1 is a schematic diagram illustrating an exemplary medical imaging system according to some embodiments of the present disclosure.
  • the medical imaging system 100 may be used for non-invasive imaging, such as for disease diagnosis, treatment, and/or research purposes.
  • the medical imaging system 100 may include a single modality system and/or a multi-modality system.
  • the term “modality” used herein broadly refers to an imaging or treatment method or technology that gathers, generates, processes, and/or analyzes imaging information of a subject or treats the subject.
  • the single modality system may include a computed tomography (CT) system, a magnetic resonance imaging (MRI) system, a positron emission tomography (PET) system, or the like, or any combination thereof.
  • the multi-modality system may include a positron emission tomography-computed tomography (PET-CT) system, a positron emission tomography-magnetic resonance imaging (PET-MRI) system, an image guided radiotherapy system (e.g., a CT guided radiotherapy system) , or the like, or any combination thereof.
  • the medical imaging system 100 may include an imaging device 110, a network 120, a terminal device 130, a processing device 140, and a storage device 150.
  • the components of the medical imaging system 100 may be connected in one or more of various ways.
  • the imaging device 110 may be connected to the processing device 140 through the network 120.
  • the imaging device 110 may be connected to the processing device 140 directly (as indicated by the bi-directional arrow in dotted lines linking the imaging device 110 and the processing device 140) .
  • the storage device 150 may be connected to the processing device 140 directly or through the network 120.
  • the terminal device 130 may be connected to the processing device 140 directly (as indicated by the bi-directional arrow in dotted lines linking the terminal device 130 and the processing device 140) or through the network 120.
  • the imaging device 110 may be configured to scan or image an object or a portion thereof.
  • the object may include a biological subject (e.g., a patient) or a non-biological subject (e.g., a phantom) .
  • the object may include a specific part, organ, and/or tissue of a patient.
  • the object may include the head, the brain, the neck, the breast, the heart, the lung, the stomach, blood vessels, soft tissues, or the like, or any combination thereof.
  • the terms “object” and “subject” are used interchangeably in the present disclosure.
  • the imaging device 110 may include a single modality device.
  • the imaging device 110 may include a CT device, an MRI device, a PET device, etc.
  • the imaging device 110 may include a multi-modality device (e.g., a double-modality device) .
  • the imaging device 110 may include a PET-CT device, a PET-MRI device, an image guided radiotherapy device, etc.
  • the imaging device 110 illustrated in FIG. 1 is provided with reference to a CT device, which is not intended to limit the scope of the present disclosure.
  • the imaging device 110 may include a gantry 111, a detector 112, a detecting region 113, a table 114, and a radiation source 115.
  • the gantry 111 may support the detector 112 and the radiation source 115.
  • the gantry 111 may rotate, for example, clockwise or counterclockwise about an axis of rotation of the gantry 111.
  • the radiation source 115 and/or the detector 112 may rotate together with the gantry 111.
  • the object may be placed on the table 114 for scanning.
  • the radiation source 115 may emit a beam of radiation rays to the object.
  • the detector 112 may detect the radiation beam (e.g., gamma photons) emitted from the radiation source 115. After the detector 112 receives the radiation beam passing through the object, the received radiation beam may be converted into visible light. The visible light may be converted into electrical signals. The electrical signals may be further converted into digital information using an analog-to-digital (AD) converter. The digital information may be transmitted to a computing device (e.g., the processing device 140) for processing, or transmitted to a storage device (e.g., the storage device 150) for storage.
  • the detector 112 may include one or more detector units.
  • the detector unit (s) may be and/or include single-row detector elements and/or multi-row detector elements.
  • the coordinate system 116 may be a Cartesian system including an X-axis, a Y-axis, and a Z-axis.
  • the X-axis and the Z-axis shown in FIG. 1 may be horizontal and the Y-axis may be vertical.
  • the positive X direction along the X-axis may be from the left side to the right side of the table 114 viewed from the direction facing the front of the imaging device 110;
  • the positive Z direction along the Z-axis shown in FIG. 1 may be from the front side to the rear side of the imaging device 110;
  • the positive Y direction along the Y-axis shown in FIG. 1 may be from the lower part to the upper part of the imaging device 110.
  • the processing device 140 may process data and/or information.
  • the data and/or information may be obtained from one or more components of the medical imaging system 100 or an external source that the medical imaging system 100 can access.
  • the data and/or information may be obtained from the imaging device 110, the terminal (s) 130, the storage device 150, a medical database, etc.
  • the processing device 140 may process the data and/or information for image reconstruction.
  • the processing device 140 may obtain raw data of an object generated by the imaging device 110.
  • the processing device 140 may preprocess the raw data to generate preprocessed data.
  • the processing device 140 may generate initial image data by performing, according to a plurality of reconstruction parameters associated with a plurality of potential reconstruction needs, a reconstruction operation on the preprocessed data, such that the initial image data meets the plurality of potential reconstruction needs.
  • the processing device 140 may store the initial image data.
  • the processing device 140 may retrieve first target data from the initial image data according to a target reconstruction need.
  • the processing device 140 may generate a target image of the object by performing a postprocessing operation on the target data, such that the target image meets the target reconstruction need.
  • the processing device 140 may be a single server or a server group.
  • the server group may be centralized or distributed.
  • the processing device 140 may be local or remote.
  • the processing device 140 may access information and/or data stored in the imaging device 110, the terminal (s) 130, and/or the storage device 150 via the network 120.
  • the processing device 140 may be directly connected to the imaging device 110, the terminal (s) 130, and/or the storage device 150 to access stored information and/or data.
  • the processing device 140 may be implemented on a cloud platform.
  • a cloud platform may include a private cloud, a public cloud, a hybrid cloud, a community cloud, a distributed cloud, an inter-cloud, a multi-cloud, or the like, or any combination thereof.
  • the processing device 140 may be implemented by a computing device 200 having one or more components as illustrated in FIG. 2.
  • the terminal 130 may input/output signals, data, information, etc.
  • the terminal 130 may enable a user interaction with the processing device 140.
  • the terminal 130 may display a target image of the object on a screen of the terminal 130.
  • the terminal 130 may obtain a user’s input information (e.g., one or more target reconstruction needs input or selected by a user) through an input device (e.g., a keyboard, a touch screen, a brain wave monitoring device) , and transmit the input information to the processing device 140 for further processing.
  • the terminal 130 may include a mobile device 131, a tablet computer 132, a laptop computer 133, or the like, or any combination thereof.
  • the mobile device 131 may include a smart home device, a wearable device, a mobile device, a virtual reality device, an augmented reality device, or the like, or any combination thereof.
  • the smart home device may include a smart lighting device, a control device of an intelligent electrical apparatus, a smart monitoring device, a smart television, a smart video camera, an interphone, or the like, or any combination thereof.
  • the wearable device may include a bracelet, footwear, a pair of glasses, a helmet, a watch, clothing, a backpack, a smart accessory, or the like, or any combination thereof.
  • the mobile device may include a mobile phone, a personal digital assistant (PDA) , a navigation device, a point of sale (POS) device, a laptop computer, a tablet computer, a desktop computer, or the like, or any combination thereof.
  • the virtual reality device and/or augmented reality device may include a virtual reality helmet, a pair of virtual reality glasses, a virtual reality patch, an augmented reality helmet, a pair of augmented reality glasses, an augmented reality patch, or the like, or any combination thereof.
  • the virtual reality device and/or augmented reality device may include a Google Glass™, an Oculus Rift™, a HoloLens™, a Gear VR™, or the like.
  • the terminal 130 may be part of the processing device 140. In some embodiments, the terminal 130 may be integrated with the processing device 140 as an operation station of the imaging device 110. Merely by way of example, a user (for example, a doctor or an operator) of the medical imaging system 100 may control an operation of the imaging device 110 through the operation station.
  • the storage device 150 may store data (e.g., raw data of an object) , instructions, and/or any other information.
  • the storage device 150 may store data obtained from the imaging device 110, the terminal (s) 130 and/or the processing device 140.
  • the storage device 150 may store raw data of an object acquired from the imaging device 110.
  • the storage device 150 may store initial image data of the object generated by the processing device 140.
  • the storage device 150 may store one or more target images of the object generated by the processing device 140.
  • the storage device 150 may store data and/or instructions executed or used by the processing device 140 to perform exemplary methods described in the present disclosure.
  • the storage device 150 may include a mass storage device, a removable storage device, a volatile read-write memory, a read-only memory (ROM) , or the like, or any combination thereof.
  • the mass storage device may include a magnetic disk, an optical disk, a solid-state drive, a mobile storage, etc.
  • the removable storage device may include a flash drive, a floppy disk, an optical disk, a memory card, a ZIP disk, a magnetic tape, etc.
  • the volatile read-and-write memory may include a random access memory (RAM) .
  • the RAM may include a dynamic RAM (DRAM) , a double date rate synchronous dynamic RAM (DDR-SDRAM) , a static RAM (SRAM) , a thyristor RAM (T-RAM) , and a zero-capacitor RAM (Z-RAM) , etc.
  • the ROM may include a mask ROM (MROM) , a programmable ROM (PROM) , an erasable programmable ROM (EPROM) , an electrically erasable programmable ROM (EEPROM) , a compact disk ROM (CD-ROM) , a digital versatile disk ROM, etc.
  • the storage device 150 may be implemented by the cloud platform described in the present disclosure.
  • a cloud platform may include a private cloud, a public cloud, a hybrid cloud, a community cloud, a distributed cloud, an inter-cloud, a multi-cloud, or the like, or any combination thereof.
  • the storage device 150 may be connected to the network 120 to communicate with one or more components (e.g., the processing device 140, the terminal 130, etc. ) of the medical imaging system 100. One or more components of the medical imaging system 100 may access the data or instructions in the storage device 150 via the network 120. In some embodiments, the storage device 150 may be a part of the processing device 140 or may be independent and directly or indirectly connected to the processing device 140.
  • the network 120 may include any suitable network that can facilitate the exchange of information and/or data of the medical imaging system 100.
  • one or more components of the medical imaging system 100 e.g., the imaging device 110, the terminal 130, the processing device 140, the storage device 150, etc.
  • the network 120 may include a public network (e.g., the Internet), a private network (e.g., a local area network (LAN), a wide area network (WAN)), a wired network (e.g., an Ethernet network), a wireless network (e.g., an 802.11 network, a Wi-Fi network), a cellular network (e.g., a Long Term Evolution (LTE) network), a frame relay network, a virtual private network (VPN), a satellite network, a telephone network, routers, hubs, server computers, or the like, or a combination thereof.
  • the network 120 may include a wireline network, an optical fiber network, a telecommunication network, a local area network, a wireless local area network (WLAN), a metropolitan area network (MAN), a public telephone switched network (PSTN), a Bluetooth™ network, a ZigBee™ network, a near field communication (NFC) network, or the like, or a combination thereof.
  • the network 120 may include one or more network access points.
  • the network 120 may include wired and/or wireless network access points, such as base stations and/or Internet exchange points, through which one or more components of the medical imaging system 100 may be connected to the network 120 to exchange data and/or information.
  • the medical imaging system 100 may include one or more additional components and/or one or more components of the medical imaging system 100 described above may be omitted.
  • a component of the medical imaging system 100 may be implemented on two or more sub-components. Two or more components of the medical imaging system 100 may be integrated into a single component.
  • FIG. 2 is a schematic diagram illustrating exemplary hardware and/or software components of an exemplary computing device according to some embodiments of the present disclosure.
  • the computing device 200 may be configured to implement any component of the medical imaging system 100.
  • the imaging device 110, the terminal 130, the processing device 140, and/or the storage device 150 may be implemented on the computing device 200.
  • the computer functions relating to the medical imaging system 100 as described herein may be implemented in a distributed fashion on a number of similar platforms, to distribute the processing load.
  • the computing device 200 may include a processor 210, a storage 220, an input/output (I/O) 230, and a communication port 240.
  • I/O input/output
  • the processor 210 may execute computer instructions (e.g., program codes) and perform functions of the processing device 140 in accordance with techniques described herein.
  • the computer instructions may include, for example, routines, programs, objects, components, signals, data structures, procedures, modules, and functions, which perform particular functions described herein.
  • the processor 210 may perform instructions obtained from the terminal 130 and/or the storage device 150.
  • the processor 210 may include one or more hardware processors, such as a microcontroller, a microprocessor, a reduced instruction set computer (RISC), an application-specific integrated circuit (ASIC), an application-specific instruction-set processor (ASIP), a central processing unit (CPU), a graphics processing unit (GPU), a physics processing unit (PPU), a microcontroller unit, a digital signal processor (DSP), a field-programmable gate array (FPGA), an advanced RISC machine (ARM), a programmable logic device (PLD), any circuit or processor capable of executing one or more functions, or the like, or any combination thereof.
  • the computing device 200 in the present disclosure may also include multiple processors.
  • operations and/or method steps that are performed by one processor as described in the present disclosure may also be jointly or separately performed by the multiple processors.
  • for example, if the processor of the computing device 200 executes both operation A and operation B, operation A and operation B may also be performed by two or more different processors jointly or separately in the computing device 200 (e.g., a first processor executes operation A and a second processor executes operation B, or the first and second processors jointly execute operations A and B).
  • the storage 220 may store data/information obtained from the imaging device 110, the terminal 130, the storage device 150, or any other component of the medical imaging system 100.
  • the storage 220 may include a mass storage device, a removable storage device, a volatile read-and-write memory, a read-only memory (ROM) , or the like, or any combination thereof.
  • the storage 220 may store one or more programs and/or instructions to perform exemplary methods described in the present disclosure.
  • the I/O 230 may input or output signals, data, and/or information. In some embodiments, the I/O 230 may enable user interaction with the processing device 140. In some embodiments, the I/O 230 may include an input device and an output device. Exemplary input devices may include a keyboard, a mouse, a touch screen, a microphone, a camera capturing gestures, or the like, or a combination thereof. Exemplary output devices may include a display device, a loudspeaker, a printer, a projector, a 3D hologram, a light, a warning light, or the like, or a combination thereof.
  • Exemplary display devices may include a liquid crystal display (LCD) , a light-emitting diode (LED) -based display, a flat panel display, a curved screen, a television device, a cathode ray tube (CRT) , or the like, or a combination thereof.
  • the communication port 240 may be connected with a network (e.g., the network 120) to facilitate data communications.
  • the communication port 240 may establish connections between the processing device 140 and the imaging device 110, the terminal 130, or the storage device 150.
  • the connection may be a wired connection, a wireless connection, or a combination of both that enables data transmission and reception.
  • the wired connection may include an electrical cable, an optical cable, a telephone wire, or the like, or any combination thereof.
  • the wireless connection may include a Bluetooth™ network, a Wi-Fi network, a WiMax network, a WLAN, a ZigBee™ network, a mobile network (e.g., 3G, 4G, 5G), or the like, or any combination thereof.
  • the communication port 240 may be a standardized communication port, such as RS232, RS485, etc. In some embodiments, the communication port 240 may be a specially designed communication port. For example, the communication port 240 may be designed in accordance with the digital imaging and communications in medicine (DICOM) protocol.
  • FIG. 3 is a schematic diagram illustrating exemplary hardware and/or software components of an exemplary mobile device according to some embodiments of the present disclosure.
  • the processing device 140 or the terminal 130 may be implemented on the mobile device 300.
  • the mobile device 300 may include a communication module 310, a display 320, a graphics processing unit (GPU) 330, a central processing unit (CPU) 340, an I/O 350, a memory 360, and storage 390.
  • the CPU 340 may include interface circuits and processing circuits similar to the processor 210.
  • any other suitable component including but not limited to a system bus or a controller (not shown) , may also be included in the mobile device 300.
  • an operating system (OS) 370 (e.g., iOS™, Android™, Windows Phone™) and one or more applications 380 may be loaded into the memory 360 from the storage 390 in order to be executed by the CPU 340.
  • the applications 380 may include a browser or any other suitable mobile apps for receiving and rendering information relating to imaging on the mobile device 300. User interactions with the information stream may be achieved via the I/O devices 350 and provided to the processing device 140 and/or other components of the medical imaging system 100 via the network 120.
  • computer hardware platforms may be used as the hardware platform (s) for one or more of the elements described herein.
  • a computer with user interface elements may be used to implement a personal computer (PC) or any other type of work station or terminal device.
  • a computer may also act as a server if appropriately programmed.
  • FIG. 4 is a schematic diagram illustrating an exemplary processing device according to some embodiments of the present disclosure.
  • the processing device 140 may include a first part and a second part.
  • the first part may be configured to generate initial image data of an object for storage.
  • the second part may be configured to generate one or more target images of the object based on the initial image data.
  • the first part and the second part belonging to the same processing device 140 as shown in FIG. 4 are provided for illustration purposes.
  • the first part and the second part may belong to or be implemented on different processing devices, respectively.
  • the first part may belong to or be implemented on one or more first processing devices, while the second part may belong to or be implemented on one or more second processing devices.
  • the first part may include an obtaining module 401, a preprocess module 402, and a reconstruction module 403.
  • the obtaining module 401 may be configured to obtain data/information from one or more components (e.g., the imaging device 110, a terminal (e.g., the terminal 130) , a storage device (e.g., the storage device 150) , etc. ) of the medical imaging system 100 or an external source (e.g., a medical database) .
  • the obtaining module 401 may obtain raw data of the object generated by the imaging device 110.
  • the raw data of the object (e.g., a patient or a portion thereof) may include imaging data (e.g., scan data) of the object. For example, the raw data of the object may include projection data of the object acquired by the CT device. More descriptions regarding the obtaining of the raw data of the object may be found elsewhere in the present disclosure (e.g., operation 501 and the description thereof).
  • the preprocess module 402 may be configured to preprocess the raw data of the object. For example, the preprocess module 402 may perform one or more preprocessing operations on the raw data to generate preprocessed data.
  • the one or more preprocessing operations may include determining whether the raw data is corrupted, determining whether the raw data needs correction, determining whether the raw data needs noise reduction, or the like, or any combination thereof. More descriptions regarding the preprocessing of the raw data may be found elsewhere in the present disclosure (e.g., operation 503 and the description thereof) .
  • the reconstruction module 403 may be configured to generate initial image data based on the preprocessed data. For example, the reconstruction module 403 may generate the initial image data by performing, according to a plurality of reconstruction parameters associated with a plurality of potential reconstruction needs, a reconstruction operation on the preprocessed data, such that the initial image data meets the plurality of potential reconstruction needs.
  • the initial image data may include (or correspond to) an initial image sequence of the object.
  • the initial image sequence of the object may include a plurality of image slices relating to the object that are arranged along the Z-axis, Y-axis, or X-axis.
  • the plurality of reconstruction parameters may include at least one of a reconstruction slice thickness, a reconstruction interval, a reconstruction field of view (FOV), reconstruction start and end positions, a reconstruction matrix, etc.
  • the processing device 140 may determine the plurality of reconstruction parameters based on the plurality of potential reconstruction needs.
  • the plurality of potential reconstruction needs may be determined according to clinical experiences. That is, in general, one or more users (e.g., a doctor or a technician) may need to view one or more images of the object (e.g., from different views of the object) for diagnosis or research purposes. The one or more images of the object may be reconstructed based on the raw data of the object according to one or more of the plurality of potential reconstruction needs respectively.
  • Each of the plurality of potential reconstruction needs may correspond to a set of potential reconstruction parameters.
  • the processing device 140 may determine the plurality of reconstruction parameters based on (e.g., from) the plurality of sets of potential reconstruction parameters.
  • Each potential reconstruction parameter of a set of potential reconstruction parameters may be coarser than a corresponding one of the plurality of reconstruction parameters.
  • the reconstruction module 403 may store the initial image data in a storage device (e.g., the storage device 150) . More descriptions regarding the generation and storing of the initial image data may be found elsewhere in the present disclosure (e.g., operations 505 and 507 and the descriptions thereof) .
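One plausible reading of a potential parameter being “coarser” than the corresponding chosen one is that the reconstruction module adopts, for each parameter, the finest value required by any potential reconstruction need. The sketch below illustrates that assumed merge rule; the `ReconParams` fields mirror the reconstruction parameters listed above:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class ReconParams:
    slice_thickness: float  # mm
    interval: float         # mm, spacing between adjacent reconstructed slices
    fov: float              # mm, reconstruction field of view
    start: float            # mm, reconstruction start position along the Z-axis
    end: float              # mm, reconstruction end position along the Z-axis
    matrix: int             # e.g., 512 for a 512 x 512 reconstruction matrix

def finest_params(potential: List[ReconParams]) -> ReconParams:
    """Assumed merge rule: take, per parameter, the finest granularity across all
    potential reconstruction needs, so that a single initial reconstruction can
    later satisfy every coarser need without re-reconstruction."""
    return ReconParams(
        slice_thickness=min(p.slice_thickness for p in potential),
        interval=min(p.interval for p in potential),
        fov=max(p.fov for p in potential),        # widest FOV covers every need
        start=min(p.start for p in potential),    # union of the requested ranges
        end=max(p.end for p in potential),
        matrix=max(p.matrix for p in potential),  # densest reconstruction matrix
    )
```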
  • the second part may include one or more pick modules 404 (e.g., a pick module 404-1, a pick module 404-2, ..., a pick module 404-n), one or more postprocess modules 405 (e.g., a postprocess module 405-1, a postprocess module 405-2, ..., a postprocess module 405-n), and one or more output modules 406 (e.g., an output module 406-1, an output module 406-2, ..., and an output module 406-n).
  • Each pick module 404 may correspond to a postprocess module 405 and an output module 406.
  • the pick module 404 with its corresponding postprocess module 405 and corresponding output module 406 may correspond to a target reconstruction need for generating a target image that meets the target reconstruction need.
  • the pick module 404-1 may correspond to the postprocess module 405-1 and the output module 406-1.
  • the pick module 404-2 may correspond to the postprocess module 405-2 and the output module 406-2.
  • the pick module 404-1, the postprocess module 405-1 and the output module 406-1 may correspond to a first target reconstruction need.
  • the pick module 404-2, the postprocess module 405-2 and the output module 406-2 may correspond to a second target reconstruction need.
  • the modules corresponding to different target reconstruction needs may be implemented in different processing devices or different sub-devices of a processing device.
  • the pick module (s) 404 may be configured to retrieve target data from the initial image data according to one or more target reconstruction needs.
  • the pick module 404-1 may retrieve first target data from the initial image data according to a first target reconstruction need for generating a first target image.
  • the first target reconstruction need may indicate at least one of a position and/or a scope of the first target data in the initial image data, a size of pixels or voxels of the first target image, etc.
  • the size of pixels or voxels of the first target image may be no less than the sizes of pixels or voxels of the initial image sequence corresponding to the initial image data.
  • the first target reconstruction need may be associated with a first set of basic reconstruction parameters.
  • Each target reconstruction parameter of the first set of basic reconstruction parameters may be coarser than a corresponding one of the plurality of reconstruction parameters. More descriptions regarding the retrieving of target data from the initial image data may be found elsewhere in the present disclosure (e.g., operations 509 and 601 and the descriptions thereof) .
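Because a target pixel or voxel size may be no less than that of the initial image sequence, the pick operation can be pictured as pure selection plus downsampling over the stored fine-granularity stack, with no return to the raw data. The following sketch assumes the initial image data is a NumPy array ordered slices x rows x cols and uses block averaging as an illustrative downsampling rule:

```python
import numpy as np

def pick_target(initial: np.ndarray, z_start: int, z_stop: int,
                factor: int = 1) -> np.ndarray:
    """Retrieve target data from the fine initial stack (slices x rows x cols).

    z_start/z_stop express the position and scope along the slice axis; factor
    coarsens the in-plane pixel size by block averaging. Since a target pixel
    or voxel size is no less than the initial one, selection and downsampling
    suffice; no re-reconstruction of the raw data occurs.
    """
    sub = initial[z_start:z_stop]                 # position and scope of target data
    if factor > 1:
        s, r, c = sub.shape
        r, c = r - r % factor, c - c % factor     # trim to multiples of factor
        sub = sub[:, :r, :c].reshape(
            s, r // factor, factor, c // factor, factor)
        sub = sub.mean(axis=(2, 4))               # average factor x factor pixel blocks
    return sub
```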
  • the postprocess module (s) 405 may be configured to generate one or more target images based on the retrieved target data.
  • the postprocess module 405-1 may generate the first target image of the object by performing a postprocessing operation on the first target data.
  • the postprocessing operation may include an artifact removal operation, a correction operation, a merging operation, a multi-planner reformation (MPR) operation, or the like, or any combination thereof. More descriptions regarding the generation of the one or more target images may be found elsewhere in the present disclosure (e.g., operations 509 and 603 and the descriptions thereof) .
  • the output module (s) 406 may be configured to cause the one or more target images to be displayed for a user.
  • the output module 406-1 may cause the first target image to be displayed, e.g., on a screen of the terminal 130 for the user.
  • the user may input feedback indicating whether the first target image is satisfactory or whether to print the first target image via the terminal 130. More descriptions regarding causing the one or more target images to be displayed may be found elsewhere in the present disclosure (e.g., operation 511 and the description thereof).
  • the modules in the processing device 140 may be connected to or communicated with each other via a wired connection or a wireless connection.
  • the wired connection may include a metal cable, an optical cable, a hybrid cable, or the like, or any combination thereof.
  • the wireless connection may include a Local Area Network (LAN), a Wide Area Network (WAN), a Bluetooth™, a ZigBee™, a Near Field Communication (NFC), or the like, or any combination thereof.
  • Each of the modules described above may be a hardware circuit that is designed to perform certain actions, e.g., according to a set of instructions stored in one or more storage media, and/or any combination of the hardware circuit and the one or more storage media.
  • the processing device 140 may include one or more other modules and/or one or more modules described above may be omitted. Additionally or alternatively, two or more modules may be integrated into a single module, and/or a module may be divided into two or more units. For example, the above-mentioned modules may be integrated into a console (not shown). Via the console, a user may set parameters for scanning an object, control imaging processes, control reconstruction processes, view images, etc. As another example, the processing device 140 may include a storage module (not shown) (e.g., a first storage module in the first part and/or a second storage module in the second part) configured to store information and/or data (e.g., the initial image data, the one or more target images, etc.).
  • the processing device 140 may include a communication module configured to cause the initial image data and/or the one or more target images to be displayed for a user, e.g., transmit the initial image data and/or the one or more target images to the terminal 130 for display.
  • FIG. 5 is a flowchart illustrating an exemplary process for multiple image reconstructions associated with the same raw data according to some embodiments of the present disclosure.
  • process 500 may be implemented as a set of instructions (e.g., an application) stored in a storage device (e.g., the storage device 150, the storage 220, and/or the storage 390).
  • the processing device 140 (e.g., the processor 210, the CPU 340, and/or one or more modules illustrated in FIG. 4) may execute the set of instructions, and when executing the instructions, the processing device 140 may be configured to perform the process 500.
  • the operations of the illustrated process presented below are intended to be illustrative. In some embodiments, the process 500 may be accomplished with one or more additional operations not described and/or without one or more of the operations discussed. Additionally, the order of the operations of process 500 illustrated in FIG. 5 and described below is not intended to be limiting.
  • the processing device 140 may obtain raw data of an object generated by an imaging device (e.g., the imaging device 110) .
  • operation 501 may also be referred to as a read operation, e.g., as shown in FIG. 8.
  • the raw data of the object refers to imaging data (e.g., scan data) of the object generated by the imaging device 110.
  • the raw data of the object may include projection data of the object acquired by the CT device.
  • the raw data of the object may reflect attenuation information of radiation rays (e.g., X-rays) that pass through the object, and may be generally used to generate one or more images related to the object.
  • the raw data of the object may be detected and/or collected by the imaging device 110 at a plurality of angles during a scan of the object.
  • the raw data of the object may include a plurality of sets of data corresponding to the plurality of angles.
  • in the imaging device 110 (e.g., a CT imaging device), the radiation source 115 and the detector 112 may rotate with the gantry 111 around the Z-axis to scan the object from different angles.
  • the processing device 140 may obtain the raw data of the object from one or more components of the medical imaging system 100, such as the imaging device 110, a terminal (e.g., the terminal 130) , a storage device (e.g., the storage device 150) , etc.
  • the processing device 140 may obtain the raw data of the object from an external source via the network 120.
  • the obtaining module 401 may obtain the raw data from, for example, a medical database, etc.
  • the processing device 140 may generate preprocessed data by preprocessing the raw data.
  • the operation 503 may also be referred to as a preprocess operation, e.g., as shown in FIG. 8.
  • the processing device 140 may perform one or more preprocessing operations on the raw data to generate the preprocessed data.
  • the one or more preprocessing operations may include determining whether the raw data is corrupted, determining whether the raw data needs correction, determining whether the raw data needs noise reduction, or the like, or any combination thereof.
  • the processing device 140 may determine whether the raw data is corrupted (or damaged) .
  • the processing device 140 may determine that the raw data is corrupted if the raw data fails to be read or opened.
  • the processing device 140 may perform a recovery operation on the corrupted raw data to recover the raw data.
  • the process 500 may proceed to operation 501, that is, the processing device 140 may obtain the raw data of the object again.
  • the processing device 140 may determine whether the raw data needs correction.
  • different tissues or structures may correspond to different value ranges.
  • the processing device 140 may obtain a value range corresponding to the object and determine whether values of the raw data of the object are within the value range corresponding to the object.
  • the processing device 140 may determine that the raw data needs no correction.
  • Otherwise, the processing device 140 may determine that the raw data of the object needs correction, and the processing device 140 may perform a correction operation on the raw data of the object to correct the values of the raw data. As still another example, the processing device 140 may determine whether the raw data needs noise reduction. For instance, the processing device 140 may determine that the raw data needs noise reduction if the noise identified from the raw data is greater than a noise threshold. In response to a determination that the raw data needs noise reduction, the processing device 140 may determine a type of noise of the raw data and perform a noise reduction operation corresponding to the determined type of noise on the raw data of the object for noise reduction.
  • the processing device 140 may generate initial image data by performing, according to a plurality of reconstruction parameters associated with a plurality of potential reconstruction needs, a reconstruction operation on the preprocessed data, such that the initial image data meets the plurality of potential reconstruction needs.
  • operation 505 may also be referred to as a reconstruction operation, e.g., as shown in FIG. 8.
  • the plurality of reconstruction parameters may include a reconstruction slice thickness, a reconstruction interval, a reconstruction field of view (FOV) , reconstruction start and end positions, a reconstruction matrix, or the like, or a combination thereof.
  • the reconstruction slice thickness refers to a thickness (e.g., along the Z-axis, the X-axis, or the Y-axis) of a single image slice of the initial image sequence.
  • the reconstruction interval refers to a distance (e.g., along the Z-axis, the X-axis, or the Y-axis) between two adjacent image slices of the initial image sequence.
  • the reconstruction slice thickness and/or the reconstruction interval may indicate a count (or number) of image slices of the initial image sequence.
  • the reconstruction FOV refers to a size of a scan FOV that is reconstructed to produce image (s) (e.g., the image slice (s) of the initial image sequence) , which may indicate which portion of the object is shown in the image (s) .
  • the scan FOV refers to an area being scanned by the imaging device 110.
  • the reconstruction FOV may be no larger than the scan FOV.
  • the reconstruction start and end positions may include a start position and an end position, which indicate a range of the raw data that is used to reconstruct the initial image sequence.
  • the raw data of the object may be acquired by rotating the radiation source 115 of the imaging device 110 from 0° to 360° around the object.
  • the reconstruction start and end positions may include a start position denoted by a start angle (e.g., an angle no less than 0°) and an end position denoted by an end angle (e.g., an angle no larger than 360°), which indicates that the raw data generated from the start angle to the end angle is used to reconstruct the initial image sequence of the object.
  • the reconstruction matrix refers to an array of rows and columns of pixels/voxels (e.g., 512×512, 1024×1024, etc.) of an image of the initial image sequence, which can indicate a resolution of the image.
  • an image corresponding to a reconstruction matrix of 1024×1024 may have a higher resolution than an image corresponding to a reconstruction matrix of 512×512.
  • the processing device 140 may determine the plurality of reconstruction parameters based on the plurality of potential reconstruction needs.
  • the plurality of potential reconstruction needs may be determined according to clinical experiences. That is, in general, one or more users (e.g., a doctor or a technician) may need to view one or more images of the object (e.g., from different views of the object) for diagnosis or research purposes. The one or more images of the object may be reconstructed based on the raw data of the object according to one or more of the plurality of potential reconstruction needs respectively.
  • Each of the plurality of potential reconstruction needs may correspond to a set of potential reconstruction parameters (also referred to as a potential parameter set associated with reconstruction or a potential reconstruction parameter set) and a potential parameter set associated with postprocessing (also referred to as a potential postprocessing parameter set). Accordingly, the processing device 140 may determine the plurality of reconstruction parameters based on (e.g., from) the plurality of sets of potential reconstruction parameters.
  • each set of potential reconstruction parameters may have different parameter types.
  • Parameter types of any two sets of potential reconstruction parameters corresponding to different potential reconstruction needs may be the same or different.
  • Parameter types of the plurality of reconstruction parameters may include at least an assembly (e.g., a union) of the parameter types of the plurality of sets of potential reconstruction parameters.
  • a set of potential reconstruction parameters corresponding to a first potential reconstruction need may have 5 parameter types
  • a set of potential reconstruction parameters corresponding to a second potential reconstruction need may have 8 parameter types.
  • a portion (e.g., 3) of the 5 parameter types may be the same as that of the 8 parameter types.
  • the plurality of reconstruction parameters may have a union of the 5 parameter types and the 8 parameter types (e.g., 10 parameter types) .
  • the plurality of reconstruction parameters may have other parameter type (s) in addition to the parameter types of the plurality of sets of potential reconstruction parameters.
  • a reconstruction need may relate to which portion of the object needs to be showed in image (s) , resolution (s) of the image (s) , a range of data that is used to reconstruct the image (s) , a count (or number) of the image (s) , or the like, or any combination thereof.
  • different users may have different potential reconstruction needs
  • different objects may correspond to different potential reconstruction needs
  • different diseases may correspond to different potential reconstruction needs.
  • the processing device 140 may parse the plurality of potential reconstruction needs to determine the plurality of reconstruction parameters.
  • the plurality of potential reconstruction needs may be parsed manually to obtain the plurality of reconstruction parameters.
  • a reconstruction parameter of the plurality of reconstruction parameters may be determined based on a highest demand of one aspect of the plurality of potential reconstruction needs.
  • one aspect of the plurality of potential reconstruction needs may relate to various image resolutions (e.g., image resolution R1 may be needed for image preview, image resolution R2 may be needed for disease diagnosis, in which R2>R1)
  • that is, a reconstruction parameter (e.g., the reconstruction matrix) may be determined based on a highest demand (e.g., the highest image resolution R2).
  • image (s) with a relatively high resolution may be reconstructed for postprocessing.
  • Image (s) with a relatively low resolution may be obtained by down-sampling the image (s) with the relatively high resolution (e.g., R2) .
  • an aspect of the reconstruction need relating to image resolution may correspond to a potential reconstruction parameter with various values (e.g., reconstruction matrix M1 corresponding to resolution R1, reconstruction matrix M2 corresponding to resolution R2) .
  • the processing device 140 may determine a parameter with a finest value (e.g., M2 corresponding to R2) among the various values (e.g., M1 corresponding to R1, M2 corresponding to R2) or a value finer than the finest value as one of the plurality of reconstruction parameters.
  • the plurality of reconstruction parameters may also be referred to as parameters with relatively small granules, while each set of potential reconstruction parameters may also be referred to as parameters with relatively large granules.
  • the plurality of reconstruction parameters may also be referred to as a complete reconstruction parameter set indicating that the plurality of reconstruction parameters meet the plurality of potential reconstruction needs. That is, each potential reconstruction parameter of a set of potential reconstruction parameters may be coarser than a corresponding one of the plurality of reconstruction parameters. For instance, for each set of potential reconstruction parameters, a potential reconstruction parameter of the set of potential reconstruction parameters may be coarser than a corresponding reconstruction parameter of the plurality of reconstruction parameters.
  • a potential reconstruction slice thickness of a set of potential reconstruction parameters may be no less than the reconstruction slice thickness of the plurality of reconstruction parameters.
  • a potential reconstruction interval of a set of potential reconstruction parameters may be no less than the reconstruction interval of the plurality of reconstruction parameters.
  • a potential reconstruction FOV of a set of potential reconstruction parameters may be no greater than the reconstruction FOV of the plurality of reconstruction parameters.
  • potential reconstruction start and end positions of each set of potential reconstruction parameters may indicate a potential range of the raw data which is no larger than the range of the raw data indicated by the reconstruction start and end positions of the plurality of reconstruction parameters.
  • a potential reconstruction matrix of a set of potential reconstruction parameters may be no larger than the reconstruction matrix of the plurality of reconstruction parameters.
  • For example, if three sets of potential reconstruction parameters specify three potential reconstruction slice thicknesses (e.g., 1 mm, 1.5 mm, and 2 mm), the reconstruction slice thickness of the plurality of reconstruction parameters may be finer than the three potential reconstruction slice thicknesses, e.g., being 0.5 mm.
  • Similarly, if the three sets specify three potential reconstruction intervals, the reconstruction interval of the plurality of reconstruction parameters may be finer than the three potential reconstruction intervals, e.g., being 0.5 mm.
  • the plurality of reconstruction parameters may be as fine as possible such that the initial image data (or the initial image sequence) may meet subsequent target reconstruction needs.
  • the initial image data may include an image (e.g., a 3D image or a 2D image) of the object.
  • the initial image may include (or correspond to) an initial image sequence of the object.
  • the initial image sequence of the object may include a plurality of image slices relating to the object that are arranged along the Z-axis, the X-axis, and/or the Y-axis.
  • the processing device 140 may perform the reconstruction operation on the preprocessed data according to the plurality of reconstruction parameters to generate the initial image sequence.
  • because the plurality of reconstruction parameters have relatively small granules (i.e., are relatively fine), sizes of pixels or voxels of the initial image sequence may be relatively small.
  • one or more images of the initial image sequence may be relatively blurred and/or sharp, and may need to be postprocessed (e.g., according to operation 509) .
  • the initial image data may include multiple initial image sequences corresponding to different reconstruction parameters respectively.
  • the processing device 140 may generate different image sequences each corresponding to a different reconstruction slice thickness (e.g., 0.5 mm, 1 mm, 1.5 mm, etc.), such that target data corresponding to one of the different reconstruction slice thicknesses can be directly retrieved from the multiple image sequences.
  • the processing device 140 may perform the reconstruction operation on the preprocessed data using a reconstruction algorithm.
  • exemplary reconstruction algorithms may include a filtered back projection (FBP) algorithm, a forward projection algorithm, an iterative reconstruction algorithm, a Fourier-based reconstruction algorithm, a rearrangement algorithm, or the like, or any combination thereof.
  • the processing device 140 may store the initial image data.
  • operation 507 may be a portion of the reconstruction operation in 505 or independent from the reconstruction operation.
  • the processing device 140 may store the initial image data in a storage device such as the storage device 150, the storage 220, or the storage 390 for subsequent access and processing.
  • the processing device 140 may store the initial image data in a storage module of the processing device 140.
  • the processing device 140 may generate, based on the initial image data, one or more target images of the object according to one or more target reconstruction needs, respectively.
  • operation 509 may include a pick operation and a postprocess operation, e.g., as shown in FIG. 8.
  • each of the one or more target reconstruction needs may be associated with a set of basic reconstruction parameters (also referred to as a target parameter set associated with reconstruction, or a target reconstruction parameter set) .
  • the set of basic reconstruction parameters may be parameters with relatively large granules with respect to the plurality of reconstruction parameters. That is, each set of basic reconstruction parameters may be coarser than the plurality of reconstruction parameters.
  • each type of parameter of the set of basic reconstruction parameters may be coarser than a corresponding type of parameter of the plurality of reconstruction parameters, which is similar to that each potential reconstruction parameter of the set of potential reconstruction parameters is coarser than a corresponding reconstruction parameter of the plurality of reconstruction parameters.
  • the first target reconstruction need may be associated with a first set of basic reconstruction parameters.
  • the second target reconstruction need may be associated with a second set of basic reconstruction parameters.
  • At least one (e.g., each target reconstruction parameter) of the first set of basic reconstruction parameters and/or the second set of basic reconstruction parameters may be coarser than a corresponding one of the plurality of reconstruction parameters.
  • each set of basic reconstruction parameters may be input and/or adjusted by a user of the medical imaging system 100.
  • the processing device 140 may retrieve target data from the initial image data according to the target reconstruction need.
  • the target reconstruction need may indicate at least one of a position and/or a scope of the target data in the initial image data, a size of pixels or voxels of the target image, etc.
  • the size of pixels or voxels of the target image may be no less than the sizes of pixels or voxels of the initial image sequence corresponding to the initial image data.
  • the processing device 140 may generate a target image of the object by performing a postprocessing operation on the target data, such that the target image of the object meets the target reconstruction need.
  • the postprocessing operation may include at least one of an artifact removal operation, a correction operation, or a merging operation.
  • the target reconstruction need may correspond to a target parameter set associated with postprocessing (also referred to as a target postprocessing parameter set) .
  • the processing device 140 may generate the target image of the object by performing, according to the target postprocessing parameter set, the postprocessing operation on the target data. More descriptions regarding retrieving the target data and performing the postprocessing operation on the target data may be found elsewhere in the present disclosure (e.g., FIG. 6 and the description thereof) .
  • the processing device 140 may generate the one or more target images of the object in parallel. That is, the processing device 140 may retrieve the target data according to each of the one or more target reconstruction needs simultaneously (or synchronously) and separately for generating the one or more target images. For example, there may be a first target reconstruction need and a second target reconstruction need. The processing device 140 may retrieve first target data and second target data from the initial image data in parallel according to the first target reconstruction need and the second target reconstruction need, respectively. The processing device 140 may generate a first target image of the object by performing a first postprocessing operation on the first target data, and simultaneously (or synchronously) generate a second target image of the object by performing a second postprocessing operation on the second target data.
  • the first postprocessing operation may be the same as or different from the second postprocessing operation.
  • the first target reconstruction need may indicate at least one of a position and/or a scope of the first target data in the initial image data, or a size of pixels or voxels of the first target image.
  • the second target reconstruction need may indicate at least one of a position and/or a scope of the second target data in the initial image data, or a size of pixels or voxels of the second target image.
  • the processing device 140 may generate the one or more target images of the object in sequence.
  • the processing device 140 may cause the one or more target images of the object to be displayed for a user.
  • operation 511 may also be referred to as an operation of output image, e.g., as shown in FIG. 8.
  • the processing device 140 may cause the one or more target images of the object to be displayed on a screen of the terminal 130 for the user. Further, the processing device 140 may receive feedback from the user indicating whether the one or more target images are satisfactory via the terminal 130.
  • the one or more target images may be displayed simultaneously and individually. For example, the one or more target images may be displayed simultaneously such that the user can view the one or more target images synchronously. Further, the one or more target images may be printed in a same report. As another example, the one or more target images may be displayed individually according to the generation sequence of the one or more target images, a default setting, or a designation of the user.
  • the multiple image reconstructions may be divided into two stages, e.g., a first stage (e.g., operations 501-507) for generating the initial image data and a second stage (e.g., operations 509-511) for generating one or more target images based on the initial image data.
  • the division of multiple image reconstructions into the first stage and the second stage may avoid repeating resource-intensive operations, thereby saving a lot of resources and time and simplifying the process of the multiple image reconstructions.
  • for example, the operations of read, preprocess, and reconstruction may take up 60%-80% of the reconstruction time of an image reconstruction process according to a target reconstruction need.
  • without the division, the multiple image reconstructions may need to repeat these resource-intensive operations multiple times, which is time-consuming and resource-intensive.
  • with the division, the time-consuming and resource-intensive operations may need to be performed only once, which reduces the reconstruction time and resources used in the multiple image reconstructions.
  • the division of multiple image reconstructions into the first stage and the second stage may also help to quickly identify and locate error operation(s) occurring in the multiple image reconstructions, which saves manpower.
  • the processing device 140 may determine whether there is an erroneous result in performing the process 500.
  • in response to a determination that there is an erroneous result, the processing device 140 may identify, from operations (e.g., operations 501-511) of the process 500, an error operation and report the error operation. For instance, if the processing device 140 can retrieve the target data from the initial image data according to each target reconstruction need, the processing device 140 may determine the first stage to be normal and locate the error operation(s) in the second stage. In some embodiments, if operation(s) associated with a new target reconstruction need are to be performed, the processing device 140 may only need to perform the operation(s) in the second stage and avoid repeating the operations in the first stage.
  • similarly, if one or more candidate operations (e.g., a new or updated postprocessing operation) need to be performed or tested, the processing device 140 may only need to add the candidate operation(s) into the second stage of the process 500 (e.g., replace the operation(s) in 509 with the candidate operation(s) to perform or test the candidate operation(s)) without repeating the performing or testing of the operations in the first stage, thereby saving the test time.
  • the first stage and the second stage of the process 500 may be performed online and/or offline.
  • the first stage and the second stage of the process 500 may be performed separately.
  • the first stage of the process 500 may be performed offline, while the second stage of the process 500 may be performed online. That is, the first stage of the process 500 may be performed offline and the initial image data may be stored before the user selects or designates the one or more target reconstruction needs, and the second stage of the process 500 may be performed online when the user makes the selection and/or the designation, retrieving the target data from the initial image data for image processing.
  • the first stage and the second stage of the process 500 may both be performed online or offline. That is, the first stage and the second stage of the process 500 may be performed in response to the one or more selected and/or designated target reconstruction needs. In such cases, the one or more potential reconstruction needs in operation 505 may be determined based on the one or more target reconstruction needs.
  • one or more additional operations may be added in the process 500.
  • another storing operation may be added elsewhere in the process 500.
  • the processing device 140 may store information and/or data (e.g., the one or more target images) used or obtained in operations of the process 500 in a storage device (e.g., the storage device 150) disclosed elsewhere in the present disclosure.
  • one or more operations of the process 500 may be omitted.
  • operations 501 and 503 may be integrated into a single operation.
  • an operation of the process 500 may be achieved by two or more sub-operations.
  • operation 509 may be divided into two sub-operations, one of which is for retrieving target data from the initial image data according to each of the one or more target reconstruction needs, and the other of which is for generating a target image by postprocessing the retrieved target data.
  • the first stage (e.g., operations 501-507) and the second stage (e.g., operations 509-511) may be performed by different processing devices.
  • FIG. 6 is a flowchart illustrating an exemplary process for generating a target image of an object based on initial image data of the object according to some embodiments of the present disclosure.
  • process 600 may be implemented as a set of instructions (e.g., an application) stored in a storage device (e.g., the storage device 150, the storage 220, and/or the storage 390).
  • the processing device 140 (e.g., the processor 210, the CPU 340, and/or one or more modules illustrated in FIG. 4) may execute the set of instructions, and when executing the instructions, the processing device 140 may be configured to perform the process 600.
  • the operations of the illustrated process presented below are intended to be illustrative.
  • the process 600 may be accomplished with one or more additional operations not described and/or without one or more of the operations discussed. Additionally, the order of the operations of process 600 illustrated in FIG. 6 and described below is not intended to be limiting. In some embodiments, operation 509 in FIG. 5 may be achieved by the process 600.
  • the processing device 140 may retrieve target data from the initial image data of the object according to a target reconstruction need.
  • the target reconstruction need may be associated with a set of basic reconstruction parameters.
  • the processing device 140 may determine whether the set of basic reconstruction parameters is coarser than the plurality of reconstruction parameters according to which the initial image data is generated. In response to a determination that the set of basic reconstruction parameters is coarser than the plurality of reconstruction parameters, the processing device 140 may retrieve the target data from the initial image data according to the target reconstruction need.
  • the target reconstruction need may indicate at least one of a position and/or a scope of the target data in the initial image data, or a size of pixels or voxels of the target image that meets the target reconstruction need.
  • the processing device 140 may update the plurality of reconstruction parameters (used in 505) based on the target reconstruction need.
  • the processing device 140 may obtain a request for adjusting the plurality of reconstruction parameters (e.g., from a user) .
  • the processing device 140 may update the initial image data based on the plurality of updated reconstruction parameters and retrieve the target data from the updated initial image data.
  • the target reconstruction need may indicate at least one of a position and/or a scope of the target data in the updated initial image data, or a size of pixels or voxels of the target image that meets the target reconstruction need.
  • the processing device 140 may generate the target image of the object by performing a postprocessing operation on the target data, such that the target image meets the target reconstruction need.
  • the postprocessing operation may include an artifact removal operation, a correction operation, a merging operation, a multi-planner reformation (MPR) operation, or the like, or any combination thereof.
  • the artifact removal operation may be configured to remove an artifact (e.g., a beam hardening artifact, a ring artifact, a metal artifact, etc. ) from the target data to generate the target image.
  • for instance, an artifact removal operation may remove a metal artifact using a metal artifact reduction (MAR) algorithm.
  • the correction operation may be configured for image correction (e.g., motion correction) to generate the target image.
  • the merging operation may be configured to merge at least two image slices of the target image data to generate the target image.
  • the MPR operation may be configured to generate the target image based on the target data using an MPR algorithm.
  • the target reconstruction need may indicate a preset image quality.
  • the processing device 140 may determine whether the target image meets the preset image quality. In response to a determination that target image meets the preset image quality, the processing device 140 may output the target image. In response to a determination that the target image does not meet the preset image quality, the processing device 140 may further perform the postprocessing operation on the target image until the target image meets the preset image quality.
  • one or more additional operations may be added in the process 600.
  • a storing operation may be added elsewhere in the process 600.
  • the processing device 140 may store information and/or data used or obtained in operations of the process 600 in a storage device (e.g., the storage device 150) disclosed elsewhere in the present disclosure.
  • each of the at least two target reconstruction needs may correspond to a target reconstruction parameter set and a target postprocessing parameter set.
  • the at least two target reconstruction needs may be achieved according to the following operations S1-S5.
  • in S1, the processing device 140 may obtain a first parameter set associated with reconstruction, e.g., according to a first user input.
  • the first parameter set may include parameters with relatively small granules, while each of the at least two target reconstruction parameter sets may include parameters with relatively large granules.
  • the first parameter set may be a preset parameter set, e.g., a complete reconstruction parameter set including the plurality of reconstruction parameters associated with a plurality of potential reconstruction needs as described in 505.
  • the first parameter set may be determined based on the at least two target reconstruction parameter sets, which is similar to the determination of the plurality of reconstruction parameters.
  • Parameter types of the first parameter set may include at least an assembly (e.g., a union) of the parameter types of the at least two target reconstruction parameter sets.
  • the first parameter set may have other parameter type (s) in addition to the parameter types of the at least two target reconstruction parameter sets.
  • the first user input may indicate the at least two target reconstruction needs, or at least two target reconstruction parameter sets of the at least two target reconstruction needs or a portion thereof.
  • the processing device 140 may receive the first user input from an interface of the medical imaging system 100. For example, a user may select or click at least two icons (e.g., buttons) corresponding to the at least two target reconstruction parameter sets respectively on the interface to transmit the first user input before the image reconstruction.
  • in S2, the processing device 140 may obtain raw data of an object generated by an imaging device, which is similar to operation 501.
  • in S3, the processing device 140 may generate initial image data (e.g., an image such as a 3D image or a 2D image) of the object by performing, according to the first parameter set, a reconstruction operation on the raw data.
  • the processing device 140 may generate preprocessed data by preprocessing the raw data, and generate the initial image data by performing, according to the first parameter set, the reconstruction operation on the preprocessed data, which is similar to operations 503 and 505.
  • because the first parameter set associated with reconstruction is determined based on the complete reconstruction parameter set or the at least two target reconstruction parameter sets, the initial image data may meet the at least two target reconstruction needs.
  • in S4, the processing device 140 may obtain at least two second parameter sets associated with postprocessing according to a second user input.
  • for example, for each of the at least two target reconstruction needs, the processing device 140 may obtain a target postprocessing parameter set corresponding to the target reconstruction need as a second parameter set associated with postprocessing.
  • the second user input may indicate the at least two target reconstruction needs, or at least two target postprocessing parameter sets of the at least two target reconstruction needs.
  • the processing device 140 may receive the second user input via an interface of the medical imaging system 100. For example, the user may select or click at least two icons (or buttons) corresponding to the at least two target postprocessing parameter sets respectively on the interface after the initial image data is generated.
  • the second user input and the first user input may be integrated into a single user input.
  • the single user input may indicate the at least two target reconstruction parameter sets and the at least two target postprocessing parameter sets of the at least two target reconstruction needs. For example, a user may only need to select or click at least two icons (e.g., buttons) corresponding to the at least two target reconstruction needs on the interface before image reconstruction to generate the single user input.
  • the single user input may trigger the image reconstruction process, i.e., the processing device 140 may perform operations S1-S5 in response to the single user input.
  • in S5, for each of the at least two second parameter sets, the processing device 140 may generate a target image by performing, according to the second parameter set, a postprocessing operation on the initial image data, which is similar to operation 509.
  • the processing device 140 may generate at least two target images corresponding to the at least two target reconstruction needs simultaneously, during which only one reconstruction operation is performed, which saves the computing resource and improves the efficiency of image reconstruction.
  • the initial image data (e.g., an initial image) of the object generated in S3 may be stored (e.g., online or offline) in a storage device as described elsewhere in the present disclosure, which is similar to operation 507.
  • the processing device 140 may retrieve or obtain the initial image of the object from the storage device. Then, the processing device 140 may obtain at least two parameter sets associated with postprocessing according to a user input (e.g., similar to the second user input) . For each parameter set of the at least two parameter sets associated with postprocessing, the processing device 140 may generate a target image by performing, according to the each parameter set, a postprocessing operation on the initial image, which is similar to operation S5.
  • an interface for image reconstruction may be configured to receive one or more user inputs (e.g., the first user input, the second user input, etc. ) for triggering one or more operations associated with image reconstruction.
  • the interface may include at least two icons (e.g., buttons) corresponding to the at least two target reconstruction needs respectively.
  • Each of the at least two target reconstruction needs may correspond to a reconstruction parameter set and a postprocessing parameter set.
  • the at least two icons may be selected (or clicked) by the user together or individually.
  • the processing device 140 may determine a plurality of reconstruction parameters (e.g., based on the at least two reconstruction parameter sets corresponding to the at least two target reconstruction needs), which is similar to S1.
  • the processing device 140 may obtain raw data of an object generated by an imaging device, which is similar to S2.
  • the processing device 140 may generate initial image data of the object by performing, according to the plurality of reconstruction parameters, a reconstruction operation on the raw data, which is similar to S3.
  • the processing device 140 may generate a target image by performing, according to each of the at least two postprocessing parameter sets, a postprocessing operation on the initial image data, which is similar to S5.
  • the interface may include at least two first icons for receiving the first user input in S1, and at least two second icons for receiving the second user input in S4.
  • the user may select or input a target reconstruction need before image reconstruction.
  • the processing device 140 may receive the reconstruction parameter set and the postprocessing parameter set of the target reconstruction need before image reconstruction and perform the image reconstruction from the beginning to the end (e.g., from obtaining raw data to generating a target image) continuously for each image reconstruction.
  • the image reconstruction may be divided into two stages (or parts) .
  • a first stage may be configured for initial image reconstruction to generate initial image data (e.g., an initial image) .
  • a second stage may be configured for postprocessing to generate a target image.
  • the processing device 140 may generate the initial image data by performing the reconstruction operation only once according to the first stage, e.g., as described in operations 501-505 and/or S1-S3.
  • the initial image data may meet the at least two target reconstruction needs.
  • the processing device 140 may receive or obtain the postprocessing parameter set of each of the at least two target reconstruction needs for generating the target image according to the second stage. Accordingly, the consumption of resources and time of the at least two image reconstructions may be reduced in comparison with performing the at least two image reconstructions independently.
  • aspects of the present disclosure may be illustrated and described herein in any of a number of patentable classes or contexts including any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof. Accordingly, aspects of the present disclosure may be implemented as entirely hardware, entirely software (including firmware, resident software, micro-code, etc.), or a combination of software and hardware implementation that may all generally be referred to herein as a “unit, ” “module, ” or “system. ” Furthermore, aspects of the present disclosure may take the form of a computer program product embodied in one or more computer-readable media having computer-readable program code embodied thereon.
  • a non-transitory computer-readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated signal may take any of a variety of forms, including electromagnetic, optical, or the like, or any suitable combination thereof.
  • a computer-readable signal medium may be any computer-readable medium that is not a computer-readable storage medium and that may communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
  • Program code embodied on a computer-readable signal medium may be transmitted using any appropriate medium, including wireless, wireline, optical fiber cable, RF, or the like, or any suitable combination of the foregoing.
  • Computer program code for carrying out operations for aspects of the present disclosure may be written in any combination of one or more programming languages, including an object-oriented programming language such as Java, Scala, Smalltalk, Eiffel, JADE, Emerald, C++, C#, VB. NET, Python or the like, conventional procedural programming languages, such as the "C" programming language, Visual Basic, Fortran, Perl, COBOL, PHP, ABAP, dynamic programming languages such as Python, Ruby, and Groovy, or other programming languages.
  • the program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server.
  • the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN) , or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider) or in a cloud computing environment or offered as a service such as a Software as a Service (SaaS) .
  • the numbers expressing quantities, properties, and so forth, used to describe and claim certain embodiments of the application are to be understood as being modified in some instances by the term “about, ” “approximate, ” or “substantially. ”
  • “about, ” “approximate, ” or “substantially” may indicate a ±20% variation of the value it describes, unless otherwise stated.
  • the numerical parameters set forth in the written description and attached claims are approximations that may vary depending upon the desired properties sought to be obtained by a particular embodiment.
  • the numerical parameters should be construed in light of the number of reported significant digits and by applying ordinary rounding techniques. Notwithstanding that the numerical ranges and parameters setting forth the broad scope of some embodiments of the application are approximations, the numerical values set forth in the specific examples are reported as precisely as practicable.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Medical Informatics (AREA)
  • Optics & Photonics (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • High Energy & Nuclear Physics (AREA)
  • Veterinary Medicine (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Public Health (AREA)
  • Pathology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Physics & Mathematics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Pulmonology (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Apparatus For Radiation Diagnosis (AREA)

Abstract

Systems (100) and methods for image reconstruction. The methods may include obtaining raw data of an object generated by an imaging device (501), generating preprocessed data by preprocessing the raw data (503), generating initial image data by performing, according to a plurality of reconstruction parameters associated with a plurality of potential reconstruction needs, a reconstruction operation on the preprocessed data (505) such that the initial image data meets the plurality of potential reconstruction needs, storing the initial image data (507), retrieving target data from the initial image data according to a target reconstruction need (601), and generating a target image of the object by performing a postprocessing operation on the target data (603) such that the target image meets the target reconstruction need.

Description

SYSTEMS AND METHODS FOR IMAGE RECONSTRUCTION TECHNICAL FIELD
The disclosure generally relates to the field of image reconstruction, and more particularly relates to systems and methods for multiple image reconstructions associated with the same raw data.
BACKGROUND
Medical imaging technologies have been widely used for clinical examination and medical diagnosis. For example, an imaging device (e.g., a computed tomography (CT) device may be used for scanning an object (e.g., a patient or a portion thereof) to generate raw data of the object. In clinical practice, users (e.g., a doctor) may need to view multiple images of the object which are generated by multiple reconstructions performed on the raw data. The multiple reconstructions performed on the raw data may include repeated or related operations, which may occupy more resources and are time-consuming. Therefore, it is desirable to provide systems and methods for improving process efficiency of multiple image reconstructions associated with the same raw data, thereby simplifying the multiple image reconstruction process and saving the resources and time of the multiple image reconstruction process.
SUMMARY
In an aspect of the present disclosure, a method for image reconstruction is provided. The method may be implemented on a computing device including at least one processor and at least one storage device. The method may include obtaining raw data of an object generated by an imaging device. The method may include generating preprocessed data by preprocessing the raw data. The method may also include generating initial image data by performing, according to a plurality of reconstruction parameters associated with a plurality of potential reconstruction needs, a reconstruction operation on the preprocessed data, such that the initial image data meets the plurality of potential reconstruction needs. The method may also include storing the initial image data. The method may also include retrieving first target data from the initial image data according to a first target reconstruction need. The method may further include  generating a first target image of the object by performing a postprocessing operation on the first target data, such that the first target image meets the first target reconstruction need.
In some embodiments, the first target reconstruction need may indicate at least one of a position and/or a scope of the first target data in the initial image data, or a size of pixels or voxels of the first target image.
In some embodiments, a size of pixels or voxels of the first target image may be no less than sizes of pixels or voxels of an initial image sequence corresponding to the initial image data.
In some embodiments, the plurality of reconstruction parameters may include at least one of a reconstruction slice thickness, a reconstruction interval, a reconstruction field of view (FOV) , reconstruction start and end positions, or a reconstruction matrix.
In some embodiments, the method may further include retrieving second target data from the initial image data according to a second target reconstruction need; and generating a second target image of the object by performing a postprocessing operation on the second target data, such that the second target image meets the second target reconstruction need. The second target reconstruction need may indicate at least one of a position and/or a scope of the second target data in the initial image data, or a size of pixels or voxels of the second target image.
In some embodiments, the first target image of the object and the second target image of the object may be generated in parallel.
In some embodiments, the first target reconstruction need may be associated with a first set of basic reconstruction parameters. The second target reconstruction need may be associated with a second set of basic reconstruction parameters. At least one of the first set of basic reconstruction parameters or the second set of basic reconstruction parameters may be coarser than a corresponding one of the plurality of reconstruction parameters.
In some embodiments, the preprocessing the raw data may include determining whether the raw data is corrupted, determining whether the raw data needs correction, or determining whether the raw data needs noise reduction.
In some embodiments, the postprocessing operation may include at least one of an artifact removal operation, a correction operation, or a merging operation.
In some embodiments, the method may further include obtaining a request for adjusting the plurality of reconstruction parameters; and updating the plurality of reconstruction parameters according to the request.
In some embodiments, the method may further include causing the first target image to be displayed for a user.
In some embodiments, the method may further include determining whether there is an erroneous result in performing the method. In response to a determination that there is an erroneous result, the method may include identifying, from operations of the method, an error operation; and reporting the error operation.
In another aspect of the present disclosure, a system for image reconstruction is provided. The system may include a storage device storing a set of instructions, and at least one processor in communication with the storage device. When executing the set of instructions, the at least one processor may be configured to cause the system to perform the following operations. The operations may include obtaining raw data of an object generated by an imaging device. The operations may include generating preprocessed data by preprocessing the raw data. The operations may also include generating initial image data by performing, according to a plurality of reconstruction parameters associated with a plurality of potential reconstruction needs, a reconstruction operation on the preprocessed data, such that the initial image data meets the plurality of potential reconstruction needs. The operations may also include storing the initial image data. The operations may also include retrieving first target data from the initial image data according to a first target reconstruction need. The operations may further include generating a first target image of the object by performing a postprocessing operation on the first target data, such that the first target image meets the first target reconstruction need.
In some embodiments, the first target reconstruction need may indicate at least one of a position and/or a scope of the first target data in the initial image data, or a size of pixels or voxels of the first target image.
In some embodiments, a size of pixels or voxels of the first target image is no less than sizes of pixels or voxels of an initial image sequence corresponding to the initial image data.
In some embodiments, the plurality of reconstruction parameters may include at least one of a reconstruction slice thickness, a reconstruction interval, a reconstruction field of view (FOV) , reconstruction start and end positions, or a reconstruction matrix.
In some embodiments, the operations may further include retrieving second target data from the initial image data according to a second target reconstruction need; and generating a second target image of the object by performing a postprocessing operation on the second target data, such that the second target image meets the second target reconstruction need. The second target reconstruction need may indicate at least one of a position and/or a scope of the second target data in the initial image data, or a size of pixels or voxels of the second target image.
In some embodiments, the first target image of the object and the second target image of the object may be generated in parallel.
In some embodiments, the first target reconstruction need may be associated with a first set of basic reconstruction parameters. The second target reconstruction need may be associated with a second set of basic reconstruction parameters. At least one of the first set of basic reconstruction parameters or the second set of basic reconstruction parameters may be coarser than a corresponding one of the plurality of reconstruction parameters.
In some embodiments, the preprocessing the raw data may include determining whether the raw data is corrupted, determining whether the raw data needs correction, or determining whether the raw data needs noise reduction.
In some embodiments, the postprocessing operation may include at least one of an artifact removal operation, a correction operation, or a merging operation.
In some embodiments, the operations may further include obtaining a request for adjusting the plurality of reconstruction parameters; and updating the plurality of reconstruction parameters according to the request.
In some embodiments, the operations may further include causing the first target image to be displayed for a user.
In some embodiments, the operations may further include determining whether there is an erroneous result in performing the operations. In response to a determination that there is an erroneous result, the operations may further include identifying, from the operations, an error operation; and reporting the error operation.
In another aspect of the present disclosure, a system for image reconstruction is provided. The system may include an obtaining module, a preprocess module, a reconstruction module, one or more pick modules, and one or more postprocess modules each of which corresponds to one of the one or more pick modules. The obtaining module may be configured to obtain raw data of an object generated by an imaging device. The preprocess module may be configured to generate preprocessed data by preprocessing the raw data. The reconstruction module may be configured to generate initial image data by performing, according to a plurality of reconstruction parameters associated with a plurality of potential reconstruction needs, a reconstruction operation on the preprocessed data, such that the initial image data meets the plurality of potential reconstruction needs; and store the initial image data. Each of the one or more pick modules may be configured to retrieve target data from the initial image data according to a target reconstruction need. The one or more pick modules may be arranged in parallel. Each of the one or more postprocess modules may be configured to generate a target image of the object by performing a postprocessing operation on the target data, such that the target image meets the target reconstruction need.
In another aspect of the present disclosure, a non-transitory computer readable medium is provided. The non-transitory computer readable medium may include  executable instructions that, when executed by at least one processor, direct the at least one processor to perform a method for image reconstruction. The method may include obtaining raw data of an object generated by an imaging device. The method may include generating preprocessed data by preprocessing the raw data. The method may also include generating initial image data by performing, according to a plurality of reconstruction parameters associated with a plurality of potential reconstruction needs, a reconstruction operation on the preprocessed data, such that the initial image data meets the plurality of potential reconstruction needs. The method may also include storing the initial image data. The method may also include retrieving first target data from the initial image data according to a first target reconstruction need. The method may further include generating a first target image of the object by performing a postprocessing operation on the first target data, such that the first target image meets the first target reconstruction need.
In another aspect of the present disclosure, a method for image reconstruction is provided. The method may be implemented on a computing device including at least one processor and at least one storage device. The method may include obtaining a first parameter set associated with reconstruction according to a first user input. The method may include obtaining raw data of an object generated by an imaging device. The method may also include generating initial image data of the object by performing, according to the first parameter set, a reconstruction operation on the raw data. The method may also include obtaining a second parameter set associated with postprocessing according to a second user input. The method may further include generating a target image by performing, according to the second parameter set, a postprocessing operation on the initial image data.
In another aspect of the present disclosure, a method for image reconstruction is provided. The method may be implemented on a computing device including at least one processor and at least one storage device. The method may include obtaining raw data of an object generated by an imaging device. The method may include generating preprocessed data by preprocessing the raw data. The method may also include obtaining a reconstruction parameter set associated with at least two target reconstruction needs. The method may further include generating an initial image by performing, according to the reconstruction parameter set, a reconstruction operation on the preprocessed data. The initial image may meet the at least two target reconstruction needs.
In another aspect of the present disclosure, a method for image reconstruction is provided. The method may be implemented on a computing device including at least one processor and at least one storage device. The method may include obtaining raw data of an object generated by an imaging device. The method may include generating initial image data of the object by performing a reconstruction operation on the raw data. The method may also include obtaining a parameter set associated with postprocessing according to a user input. The method may further include generating a target image by performing, according to the parameter set, a postprocessing operation on the initial image data.
In another aspect of the present disclosure, a method for image reconstruction is provided. The method may be implemented on a computing device including at least one processor and at least one storage device. The method may include obtaining an initial image of an object. The method may also include obtaining at least two parameter sets associated with postprocessing according to a user input. For each parameter set of the at least two parameter sets, the method may further include generating a target image by performing, according to that parameter set, a postprocessing operation on the initial image.
In another aspect of the present disclosure, an interface is provided. The interface may include at least two icons corresponding to at least two target reconstruction needs respectively. Each of the at least two target reconstruction needs may correspond to a reconstruction parameter set and a postprocessing parameter set. In response to the at least two icons being selected, operations for image reconstruction may be triggered. The operations may include determining a plurality of reconstruction parameters based on the at least two reconstruction parameter sets; obtaining raw data of an object generated by an imaging device; generating initial image data of the object by performing, according to the plurality of reconstruction parameters, a reconstruction operation on the raw data; and generating a target image by performing, according to each of the at least two postprocessing parameter sets, a postprocessing operation on the initial image data.
Additional features will be set forth in part in the description which follows, and in part will become apparent to those skilled in the art upon examination of the following and the accompanying drawings or may be learned by production or operation of the examples. The features of the present disclosure may be realized and attained by practice or use of various aspects of the methodologies, instrumentalities, and combinations set forth in the detailed examples discussed below.
BRIEF DESCRIPTION OF THE DRAWINGS
The present disclosure is further described in terms of exemplary embodiments. These exemplary embodiments are described in detail with reference to the drawings. The drawings are not to scale. These embodiments are non-limiting exemplary embodiments, in which like reference numerals represent similar structures throughout the several views of the drawings, and wherein:
FIG. 1 is a schematic diagram illustrating an exemplary medical imaging system according to some embodiments of the present disclosure;
FIG. 2 is a schematic diagram illustrating hardware and/or software components of an exemplary computing device according to some embodiments of the present disclosure;
FIG. 3 is a schematic diagram illustrating hardware and/or software components of an exemplary mobile device according to some embodiments of the present disclosure;
FIG. 4 is a block diagram illustrating an exemplary processing device according to some embodiments of the present disclosure;
FIG. 5 is a flowchart illustrating an exemplary process for multiple image reconstructions associated with the same raw data according to some embodiments of the present disclosure;
FIG. 6 is a flowchart illustrating an exemplary process for generating a target image of an object based on initial image data of the object according to some embodiments of the present disclosure;
FIG. 7 is a flowchart illustrating an exemplary process for multiple image reconstructions associated with the same raw data according to some embodiments of the present disclosure; and
FIG. 8 is a flowchart illustrating an exemplary process for multiple image reconstructions associated with the same raw data according to some embodiments of the present disclosure.
DETAILED DESCRIPTION
In the following detailed description, numerous specific details are set forth by way of examples in order to provide a thorough understanding of the relevant disclosure. However, it should be apparent to those skilled in the art that the present disclosure may be practiced without such details. In other instances, well-known methods, procedures, systems, components, and/or circuitry have been described at a relatively high level, without detail, in order to avoid unnecessarily obscuring aspects of the present disclosure. Various modifications to the disclosed embodiments will be readily apparent to those skilled in the art, and the general principles defined herein may be applied to other embodiments and applications without departing from the spirit and scope of the present disclosure. Thus, the present disclosure is not limited to the embodiments shown, but is to be accorded the widest scope consistent with the claims.
The terminology used herein is for the purpose of describing particular example embodiments only and is not intended to be limiting. As used herein, the singular forms “a, ” “an, ” and “the” may be intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms “comprise, ” “comprises, ” and/or “comprising, ” “include, ” “includes, ” and/or “including, ” when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It will be understood that the terms “system,” “engine,” “unit,” “module,” and/or “block” used herein are one way to distinguish different components, elements, parts, sections, or assemblies of different levels in ascending order. However, the terms may be displaced by another expression if they achieve the same purpose.
Generally, the word “module,” “unit,” or “block,” as used herein, refers to logic embodied in hardware or firmware, or to a collection of software instructions. A module, a unit, or a block described herein may be implemented as software and/or hardware and may be stored in any type of non-transitory computer-readable medium or another storage device. In some embodiments, a software module/unit/block may be compiled and linked into an executable program. It will be appreciated that software modules can be callable from other modules/units/blocks or from themselves, and/or may be invoked in response to detected events or interrupts. Software modules/units/blocks configured for execution on computing devices (e.g., processor 210 as illustrated in FIG. 2) may be provided on a computer-readable medium, such as a compact disc, a digital video disc, a flash drive, a magnetic disc, or any other tangible medium, or as a digital download (and can be originally stored in a compressed or installable format that needs installation, decompression, or decryption prior to execution). Such software code may be stored, partially or fully, on a storage device of the executing computing device, for execution by the computing device. Software instructions may be embedded in firmware, such as an EPROM. It will be further appreciated that hardware modules/units/blocks may be included in connected logic components, such as gates and flip-flops, and/or may be included in programmable units, such as programmable gate arrays or processors. The modules/units/blocks or computing device functionality described herein may be implemented as software modules/units/blocks, but may be represented in hardware or firmware. In general, the modules/units/blocks described herein refer to logical modules/units/blocks that may be combined with other modules/units/blocks or divided into sub-modules/sub-units/sub-blocks despite their physical organization or storage. The description may be applicable to a system, an engine, or a portion thereof.
It will be understood that when a unit, engine, module, or block is referred to as being “on,” “connected to,” or “coupled to” another unit, engine, module, or block, it may be directly on, connected or coupled to, or communicate with the other unit, engine, module, or block, or an intervening unit, engine, module, or block may be present, unless the context clearly indicates otherwise. As used herein, the term “and/or” includes any and all combinations of one or more of the associated listed items. The term “image” in the present disclosure is used to collectively refer to image data (e.g., scan data) and/or images of various forms, including a two-dimensional (2D) image, a three-dimensional (3D) image, a four-dimensional (4D) image, etc. The terms “pixel” and “voxel” in the present disclosure are used interchangeably to refer to an element of an image. The subject may include a biological subject (e.g., a human, an animal), a non-biological subject (e.g., a phantom), etc.
These and other features, and characteristics of the present disclosure, as well as the methods of operation and functions of the related elements of structure and the combination of parts and economies of manufacture, may become more apparent upon consideration of the following description with reference to the accompanying drawings, all of which form a part of this disclosure. It is to be expressly understood, however, that the drawings are for the purpose of illustration and description only and are not intended to limit the scope of the present disclosure. It is understood that the drawings are not to scale.
The flowcharts used in the present disclosure illustrate operations that systems implement according to some embodiments of the present disclosure. It is to be expressly understood that the operations of the flowcharts may not necessarily be implemented in order. Instead, the operations may be implemented in an inverted order or simultaneously. Moreover, one or more other operations may be added to the flowcharts. One or more operations may be removed from the flowcharts.
As used herein, a representation of a subject (e.g., a patient, or a portion thereof) in an image may be referred to as the subject for brevity. For instance, a representation of an organ or tissue (e.g., the heart, the liver, a lung, etc., of a patient) in an image may  be referred to as the organ or tissue for brevity. An image including a representation of a subject may be referred to as an image of the subject or an image including the subject for brevity. As used herein, an operation on a representation of a subject in an image may be referred to as an operation on the subject for brevity. For instance, a segmentation of a portion of an image including a representation of an organ or tissue (e.g., the heart, the liver, a lung, etc., of a patient) from the image may be referred to as a segmentation of the organ or tissue for brevity.
An aspect of the present disclosure relates to systems and methods for image reconstruction (e.g., optimizing multiple image reconstructions associated with the same raw data). The systems and methods may generate initial image data (e.g., an initial image sequence) by performing, according to a plurality of reconstruction parameters (also referred to as parameters with relatively small granules) associated with a plurality of potential reconstruction needs, a reconstruction operation on preprocessed data of an object, such that the initial image data meets the plurality of potential reconstruction needs. The systems and methods may store the initial image data. The systems and methods may retrieve one or more sets of target data from the initial image data according to one or more target reconstruction needs (TRNs), respectively. For each of the one or more sets of target data, the systems and methods may further generate a target image of the object by performing a postprocessing operation on the set of target data, such that the target image meets the target reconstruction need corresponding to the set of target data.
Traditionally, as shown in FIG. 7, multiple image reconstructions may need to be performed on the same raw data according to multiple target reconstruction needs (e.g., TRN 1, TRN 2, …, TRN n), respectively. Each image reconstruction process may include multiple operations such as read (e.g., for obtaining the raw data), prepare (e.g., for preprocessing the raw data), reconstruction (e.g., for reconstructing images based on the preprocessed raw data), postprocess (e.g., for postprocessing the reconstructed image), output image (e.g., for outputting the postprocessed image), etc. In some embodiments, for the same raw data, the operations of read, prepare, and reconstruction in the reconstruction processes of multiple images may be at least partially the same or similar. In addition, the operations of read, prepare, and reconstruction may occupy a relatively large amount of resources in the image reconstruction process, e.g., 60%-80% of the resources needed for the image reconstruction process. For each target reconstruction need, the operations of read, prepare, and reconstruction may need to be repeated, which is time-consuming and resource-intensive. In some embodiments, when an erroneous result occurs in the multiple image reconstructions, it may take a relatively large amount of manpower and time for maintenance. Therefore, it is desirable to provide systems and methods for improving the processing efficiency of multiple image reconstructions.
According to some embodiments of the present disclosure, as shown in FIG. 8, an image reconstruction process may be modified to include two parts. A first part may include the similar and resource-intensive operations (e.g., the operations of read, preprocess, and reconstruction) of the image reconstruction process as described above. A second part may include operations of pick, postprocess, and output image. Differing from the traditional image reconstruction process, the operations in the first part of the reconstruction process may be performed to generate initial image data according to a plurality of potential reconstruction needs, such that the pick operation in the second part can determine target image data based on the initial image data for generating one or more target images that meet one or more target reconstruction needs. Accordingly, when there are multiple target reconstruction needs (e.g., TRN 1, TRN 2, …, TRN n), the operations of read, prepare, and reconstruction may be performed on the raw data only once, and target image data corresponding to each target reconstruction need may be directly retrieved from the initial image data for generating a corresponding target image, thereby reducing the consumption of resources and time of the multiple image reconstructions. In some embodiments, by using the image reconstruction process with the two parts, when an erroneous result occurs in the multiple image reconstructions, it is easy to identify which part contains the error, which helps the developer locate the error operation faster and saves manpower and time.
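As a minimal Python sketch of this two-part flow (every function below is a stub standing in for the corresponding operation; none of the names reflect a prescribed API):

```python
import numpy as np

def read_raw(source):
    # "read": obtain the raw data (stub).
    return np.asarray(source, dtype=float)

def prepare(raw):
    # "prepare": preprocess the raw data (no-op stub).
    return raw

def reconstruct(prepped, potential_needs):
    # "reconstruction": stand-in for generating initial image data at the
    # finest granularity implied by the potential reconstruction needs.
    return prepped

def pick(initial, need):
    # Second part: retrieve the target data indicated by one target need;
    # "slice_range" is an illustrative field, not a prescribed schema.
    start, end = need["slice_range"]
    return initial[start:end]

def postprocess(target_data, need):
    # Second part: e.g., artifact removal, correction, or merging (stub).
    return target_data

def reconstruct_all(source, target_needs, potential_needs):
    # First part: executed once per raw data set, no matter how many target
    # reconstruction needs (TRN 1, ..., TRN n) follow.
    initial = reconstruct(prepare(read_raw(source)), potential_needs)
    # Second part: executed once per target reconstruction need, reusing the
    # stored initial image data instead of repeating read/prepare/reconstruct.
    return [postprocess(pick(initial, need), need) for need in target_needs]
```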
FIG. 1 is a schematic diagram illustrating an exemplary medical imaging system according to some embodiments of the present disclosure. In some embodiments, the medical imaging system 100 may be used for non-invasive imaging, such as for disease diagnosis, treatment, and/or research purposes. In some embodiments, the medical imaging system 100 may include a single modality system and/or a multi-modality system. The term “modality” used herein broadly refers to an imaging or treatment method or technology that gathers, generates, processes, and/or analyzes imaging information of a subject or treats the subject. The single modality system may include a computed tomography (CT) system, a magnetic resonance imaging (MRI) system, a positron emission tomography (PET) system, or the like, or any combination thereof. The multi-modality system may include a positron emission tomography-computed tomography (PET-CT) system, a positron emission tomography-magnetic resonance imaging (PET-MRI) system, an image guided radiotherapy system (e.g., a CT guided radiotherapy system), or the like, or any combination thereof.
As shown in FIG. 1, the medical imaging system 100 may include an imaging device 110, a network 120, a terminal device 130, a processing device 140, and a storage device 150. The components of the medical imaging system 100 may be connected in one or more of various ways. Merely by way of example, as illustrated in FIG. 1, the imaging device 110 may be connected to the processing device 140 through the network 120. As another example, the imaging device 110 may be connected to the processing device 140 directly (as indicated by the bi-directional arrow in dotted lines linking the imaging device 110 and the processing device 140). As a further example, the storage device 150 may be connected to the processing device 140 directly or through the network 120. As still a further example, the terminal device 130 may be connected to the processing device 140 directly (as indicated by the bi-directional arrow in dotted lines linking the terminal device 130 and the processing device 140) or through the network 120.
The imaging device 110 may be configured to scan or image an object or a portion thereof. In some embodiments, the object may include a biological subject (e.g., a patient) or a non-biological subject (e.g., a phantom). For example, the object may include a specific part, organ, and/or tissue of a patient. As another example, the object may include the head, the brain, the neck, the breast, the heart, the lung, the stomach, blood vessels, soft tissues, or the like, or any combination thereof. The terms “object” and “subject” are used interchangeably in the present disclosure. In some embodiments, the imaging device 110 may include a single modality device. For example, the imaging device 110 may include a CT device, an MRI device, a PET device, etc. Alternatively, the imaging device 110 may include a multi-modality device (e.g., a double-modality device). For example, the imaging device 110 may include a PET-CT device, a PET-MRI device, an image guided radiotherapy device, etc. For illustration purposes, the imaging device 110 illustrated in FIG. 1 is provided with reference to a CT device, which is not intended to limit the scope of the present disclosure.
As illustrated, the imaging device 110 (e.g., the CT device) may include a gantry 111, a detector 112, a detecting region 113, a table 114, and a radiation source 115. The gantry 111 may support the detector 112 and the radiation source 115. The gantry 111 may rotate, for example, clockwise or counterclockwise about an axis of rotation of the gantry 111. The radiation source 115 and/or the detector 112 may rotate together with the gantry 111. The object may be placed on the table 114 for scanning. The radiation source 115 may emit a beam of radiation rays to the object. The detector 112 may detect the radiation beam (e.g., gamma photons) emitted from the radiation source 115. After the detector 112 receives the radiation beam passing through the object, the received radiation beam may be converted into visible light. The visible light may be converted into electrical signals. The electrical signals may be further converted into digital information using an analog-to-digital (AD) converter. The digital information may be transmitted to a computing device (e.g., the processing device 140) for processing, or transmitted to a storage device (e.g., the storage device 150) for storage. In some embodiments, the detector 112 may include one or more detector units. The detector unit(s) may be and/or include single-row detector elements and/or multi-row detector elements.
For illustration purposes, a coordinate system 116 is provided in FIG. 1. The coordinate system 116 may be a Cartesian system including an X-axis, a Y-axis, and a Z-axis. The X-axis and the Z-axis shown in FIG. 1 may be horizontal and the Y-axis may be vertical. As illustrated, the positive X direction along the X-axis may be from the left side to the right side of the table 114 viewed from the direction facing the front of the imaging device 110; the positive Z direction along the Z-axis shown in FIG. 1 may be from the front side to the rear side of the imaging device 110; the positive Y direction along the Y-axis shown in FIG. 1 may be from the lower part to the upper part of the imaging device 110.
The processing device 140 may process data and/or information. The data and/or information may be obtained from one or more components of the medical imaging system 100 or an external source that the medical imaging system 100 can access. For example, the data and/or information may be obtained from the imaging device 110, the terminal(s) 130, the storage device 150, a medical database, etc. In some embodiments, the processing device 140 may process the data and/or information for image reconstruction. For example, the processing device 140 may obtain raw data of an object generated by the imaging device 110. The processing device 140 may preprocess the raw data to generate preprocessed data. The processing device 140 may generate initial image data by performing, according to a plurality of reconstruction parameters associated with a plurality of potential reconstruction needs, a reconstruction operation on the preprocessed data, such that the initial image data meets the plurality of potential reconstruction needs. As another example, the processing device 140 may store the initial image data. As still another example, the processing device 140 may retrieve target data from the initial image data according to a target reconstruction need. The processing device 140 may generate a target image of the object by performing a postprocessing operation on the target data, such that the target image meets the target reconstruction need. In some embodiments, the processing device 140 may be a single server or a server group. The server group may be centralized or distributed. In some embodiments, the processing device 140 may be local or remote. For example, the processing device 140 may access information and/or data stored in the imaging device 110, the terminal(s) 130, and/or the storage device 150 via the network 120. As another example, the processing device 140 may be directly connected to the imaging device 110, the terminal(s) 130, and/or the storage device 150 to access stored information and/or data. In some embodiments, the processing device 140 may be implemented on a cloud platform. For example, a cloud platform may include a private cloud, a public cloud, a hybrid cloud, a community cloud, a distributed cloud, an inter-cloud, a multi-cloud, or the like, or any combination thereof. In some embodiments, the processing device 140 may be implemented by a computing device 200 having one or more components as illustrated in FIG. 2.
The terminal 130 may input/output signals, data, information, etc. In some embodiments, the terminal 130 may enable a user interaction with the processing device 140. For example, the terminal 130 may display a target image of the object on a screen of the terminal 130. As another example, the terminal 130 may obtain a user’s input information (e.g., one or more target reconstruction needs input or selected by a user) through an input device (e.g., a keyboard, a touch screen, a brain wave monitoring device) , and transmit the input information to the processing device 140 for further processing. The terminal 130 may include a mobile device 131, a tablet computer 132, a laptop computer 133, or the like, or any combination thereof. In some embodiments, the mobile device 131 may include a smart home device, a wearable device, a mobile device, a virtual reality device, an augmented reality device, or the like, or any combination thereof. In some embodiments, the smart home device may include a smart lighting device, a control device of an intelligent electrical apparatus, a smart monitoring device, a smart television, a smart video camera, an interphone, or the like, or any combination thereof. In some embodiments, the wearable device may include a bracelet, footwear, a pair of glasses, a helmet, a watch, clothing, a backpack, a smart accessory, or the like, or any combination thereof. In some embodiments, the mobile device may include a mobile phone, a personal digital assistant (PDA) , a navigation device, a point of sale (POS) device, a laptop computer, a tablet computer, a desktop computer, or the like, or any combination thereof. In some embodiments, the virtual reality device and/or augmented reality device may include a virtual reality helmet, a pair of virtual reality glasses, a virtual reality patch, an augmented reality helmet, a pair of augmented reality glasses, an augmented reality patch, or the like, or any combination thereof. For example, the virtual  reality device and/or augmented reality device may include a Google Glass TM, an Oculus Rift TM, a HoloLens TM, a Gear VR TM, or the like. In some embodiments, the terminal 130 may be part of the processing device 140. In some embodiments, the terminal 130 may be integrated with the processing device 140 as an operation station of the imaging device 110. Merely by way of example, a user (for example, a doctor or an operator) of the medical imaging system 100 may control an operation of the imaging device 110 through the operation station.
The storage device 150 may store data (e.g., raw data of an object), instructions, and/or any other information. In some embodiments, the storage device 150 may store data obtained from the imaging device 110, the terminal(s) 130, and/or the processing device 140. For example, the storage device 150 may store raw data of an object acquired from the imaging device 110. As another example, the storage device 150 may store initial image data of the object generated by the processing device 140. As still another example, the storage device 150 may store one or more target images of the object generated by the processing device 140. In some embodiments, the storage device 150 may store data and/or instructions executed or used by the processing device 140 to perform exemplary methods described in the present disclosure. In some embodiments, the storage device 150 may include a mass storage device, a removable storage device, a volatile read-and-write memory, a read-only memory (ROM), or the like, or any combination thereof. For example, the mass storage device may include a magnetic disk, an optical disk, a solid-state drive, a mobile storage, etc. The removable storage device may include a flash drive, a floppy disk, an optical disk, a memory card, a ZIP disk, a magnetic tape, etc. The volatile read-and-write memory may include a random access memory (RAM). The RAM may include a dynamic RAM (DRAM), a double data rate synchronous dynamic RAM (DDR SDRAM), a static RAM (SRAM), a thyristor RAM (T-RAM), a zero-capacitor RAM (Z-RAM), etc. The ROM may include a mask ROM (MROM), a programmable ROM (PROM), an erasable programmable ROM (EPROM), an electrically erasable programmable ROM (EEPROM), a compact disk ROM (CD-ROM), a digital versatile disk ROM, etc. In some embodiments, the storage device 150 may be implemented by the cloud platform described in the present disclosure. For example, a cloud platform may include a private cloud, a public cloud, a hybrid cloud, a community cloud, a distributed cloud, an inter-cloud, a multi-cloud, or the like, or any combination thereof.
In some embodiments, the storage device 150 may be connected to the network 120 to communicate with one or more components (e.g., the processing device 140, the terminal 130, etc. ) of the medical imaging system 100. One or more components of the medical imaging system 100 may access the data or instructions in the storage device 150 via the network 120. In some embodiments, the storage device 150 may be a part of the processing device 140 or may be independent and directly or indirectly connected to the processing device 140.
The network 120 may include any suitable network that can facilitate the exchange of information and/or data of the medical imaging system 100. In some embodiments, one or more components of the medical imaging system 100 (e.g., the imaging device 110, the terminal 130, the processing device 140, the storage device 150, etc.) may communicate information and/or data with one or more components of the medical imaging system 100 via the network 120. The network 120 may include a public network (e.g., the Internet), a private network (e.g., a local area network (LAN), a wide area network (WAN), etc.), a wired network (e.g., an Ethernet network), a wireless network (e.g., an 802.11 network, a Wi-Fi network, etc.), a cellular network (e.g., a Long Term Evolution (LTE) network), a frame relay network, a virtual private network (“VPN”), a satellite network, a telephone network, routers, hubs, server computers, or the like, or a combination thereof. For example, the network 120 may include a wireline network, an optical fiber network, a telecommunication network, a local area network, a wireless local area network (WLAN), a metropolitan area network (MAN), a public telephone switched network (PSTN), a Bluetooth™ network, a ZigBee™ network, a near field communication (NFC) network, or the like, or a combination thereof. In some embodiments, the network 120 may include one or more network access points. For example, the network 120 may include wired and/or wireless network access points, such as base stations and/or Internet exchange points, through which one or more components of the medical imaging system 100 may be connected to the network 120 to exchange data and/or information.
It should be noted that the above description regarding the medical imaging system 100 is merely provided for the purposes of illustration, and not intended to limit the scope of the present disclosure. For persons having ordinary skills in the art, multiple variations and modifications may be made under the teachings of the present disclosure. However, those variations and modifications do not depart from the scope of the present disclosure. In some embodiments, the medical imaging system 100 may include one or more additional components and/or one or more components of the medical imaging system 100 described above may be omitted. In some embodiments, a component of the medical imaging system 100 may be implemented on two or more sub-components. Two or more components of the medical imaging system 100 may be integrated into a single component.
FIG. 2 is a schematic diagram illustrating exemplary hardware and/or software components of an exemplary computing device according to some embodiments of the present disclosure. The computing device 200 may be configured to implement any component of the medical imaging system 100. For example, the imaging device 110, the terminal 130, the processing device 140, and/or the storage device 150 may be implemented on the computing device 200. Although only one such computing device is shown for convenience, the computer functions relating to the medical imaging system 100 as described herein may be implemented in a distributed fashion on a number of similar platforms, to distribute the processing load. As illustrated in FIG. 2, the computing device 200 may include a processor 210, a storage 220, an input/output (I/O) 230, and a communication port 240.
The processor 210 may execute computer instructions (e.g., program codes) and perform functions of the processing device 140 in accordance with techniques described herein. The computer instructions may include, for example, routines, programs, objects, components, signals, data structures, procedures, modules, and functions, which perform particular functions described herein. In some embodiments, the processor 210 may perform instructions obtained from the terminal 130 and/or the storage device 150. In some embodiments, the processor 210 may include one or more hardware processors, such as a microcontroller, a microprocessor, a reduced instruction set computer (RISC), an application-specific integrated circuit (ASIC), an application-specific instruction-set processor (ASIP), a central processing unit (CPU), a graphics processing unit (GPU), a physics processing unit (PPU), a microcontroller unit, a digital signal processor (DSP), a field-programmable gate array (FPGA), an advanced RISC machine (ARM), a programmable logic device (PLD), any circuit or processor capable of executing one or more functions, or the like, or any combination thereof.
Merely for illustration, only one processor is described in the computing device 200. However, it should be noted that the computing device 200 in the present disclosure may also include multiple processors. Thus operations and/or method steps that are performed by one processor as described in the present disclosure may also be jointly or separately performed by the multiple processors. For example, if in the present disclosure the processor of the computing device 200 executes both operation A and operation B, it should be understood that operation A and operation B may also be performed by two or more different processors jointly or separately in the computing device 200 (e.g., a first processor executes operation A and a second processor executes operation B, or the first and second processors jointly execute operations A and B) .
The storage 220 may store data/information obtained from the imaging device 110, the terminal 130, the storage device 150, or any other component of the medical imaging system 100. In some embodiments, the storage 220 may include a mass storage device, a removable storage device, a volatile read-and-write memory, a read-only memory (ROM) , or the like, or any combination thereof. In some embodiments, the storage 220 may store one or more programs and/or instructions to perform exemplary methods described in the present disclosure.
The I/O 230 may input or output signals, data, and/or information. In some embodiments, the I/O 230 may enable user interaction with the processing device 140. In some embodiments, the I/O 230 may include an input device and an output device.  Exemplary input devices may include a keyboard, a mouse, a touch screen, a microphone, a camera capturing gestures, or the like, or a combination thereof. Exemplary output devices may include a display device, a loudspeaker, a printer, a projector, a 3D hologram, a light, a warning light, or the like, or a combination thereof. Exemplary display devices may include a liquid crystal display (LCD) , a light-emitting diode (LED) -based display, a flat panel display, a curved screen, a television device, a cathode ray tube (CRT) , or the like, or a combination thereof.
The communication port 240 may be connected with a network (e.g., the network 120) to facilitate data communications. The communication port 240 may establish connections between the processing device 140 and the imaging device 110, the terminal 130, or the storage device 150. The connection may be a wired connection, a wireless connection, or a combination of both that enables data transmission and reception. The wired connection may include an electrical cable, an optical cable, a telephone wire, or the like, or any combination thereof. The wireless connection may include a Bluetooth TM network, a Wi-Fi network, a WiMax network, a WLAN, a ZigBee TM network, a mobile network (e.g., 3G, 4G, 5G) , or the like, or any combination thereof. In some embodiments, the communication port 240 may be a standardized communication port, such as RS232, RS485, etc. In some embodiments, the communication port 240 may be a specially designed communication port. For example, the communication port 240 may be designed in accordance with the digital imaging and communications in medicine (DICOM) protocol.
FIG. 3 is a schematic diagram illustrating exemplary hardware and/or software components of an exemplary mobile device according to some embodiments of the present disclosure. In some embodiments, the processing device 140 or the terminal 130 may be implemented on the mobile device 300. As illustrated in FIG. 3, the mobile device 300 may include a communication module 310, a display 320, a graphics processing unit (GPU) 330, a central processing unit (CPU) 340, an I/O 350, a memory 360, and storage 390. The CPU 340 may include interface circuits and processing circuits similar to the processor 210. In some embodiments, any other suitable component, including but not  limited to a system bus or a controller (not shown) , may also be included in the mobile device 300. In some embodiments, an operating system (OS) 370 (e.g., iOS TM, Android TM, Windows Phone TM) and one or more applications 380 may be loaded into the memory 360 from the storage 390 in order to be executed by the CPU 340. The applications 380 may include a browser or any other suitable mobile apps for receiving and rendering information relating to imaging on the mobile device 300. User interactions with the information stream may be achieved via the I/O devices 350 and provided to the processing device 140 and/or other components of the medical imaging system 100 via the network 120.
To implement various modules, units, and their functionalities described in the present disclosure, computer hardware platforms may be used as the hardware platform (s) for one or more of the elements described herein. A computer with user interface elements may be used to implement a personal computer (PC) or any other type of work station or terminal device. A computer may also act as a server if appropriately programmed.
FIG. 4 is a schematic diagram illustrating an exemplary processing device according to some embodiments of the present disclosure. As shown in FIG. 4, the processing device 140 may include a first part and a second part. The first part may be configured to generate initial image data of an object for storage. The second part may be configured to generate one or more target images of the object based on the initial image data. It should be noted that the first part and the second part belonging to the same processing device 140 as shown in FIG. 4 are provided for illustration purposes. In some embodiments, the first part and the second part may belong to or be implemented on different processing devices, respectively. For example, the first part may belong to or be implemented on one or more first processing devices, while the second part may belong to or be implemented on one or more second processing devices.
The first part may include an obtaining module 401, a preprocess module 402, and a reconstruction module 403.
The obtaining module 401 may be configured to obtain data/information from one or more components (e.g., the imaging device 110, a terminal (e.g., the terminal 130), a storage device (e.g., the storage device 150), etc.) of the medical imaging system 100 or an external source (e.g., a medical database). For example, the obtaining module 401 may obtain raw data of the object generated by the imaging device 110. As used herein, the raw data of the object (e.g., a patient or a portion thereof) refers to imaging data (e.g., scan data) of the object generated by the imaging device 110. Taking a CT device as an example of the imaging device 110, the raw data of the object may include projection data of the object acquired by the CT device. More descriptions regarding the obtaining of the raw data of the object may be found elsewhere in the present disclosure (e.g., operation 501 and the description thereof).
The preprocess module 402 may be configured to preprocess the raw data of the object. For example, the preprocess module 402 may perform one or more preprocessing operations on the raw data to generate preprocessed data. The one or more preprocessing operations may include determining whether the raw data is corrupted, determining whether the raw data needs correction, determining whether the raw data needs noise reduction, or the like, or any combination thereof. More descriptions regarding the preprocessing of the raw data may be found elsewhere in the present disclosure (e.g., operation 503 and the description thereof) .
The reconstruction module 403 may be configured to generate initial image data based on the preprocessed data. For example, the reconstruction module 403 may generate the initial image data by performing, according to a plurality of reconstruction parameters associated with a plurality of potential reconstruction needs, a reconstruction operation on the preprocessed data, such that the initial image data meets the plurality of potential reconstruction needs. In some embodiments, the initial image data may include (or correspond to) an initial image sequence of the object. The initial image sequence of the object may include a plurality of image slices relating to the object that are arranged along the Z-axis, Y-axis, or X-axis. In some embodiments, the plurality of reconstruction parameters may include at least one of a reconstruction slice thickness, a reconstruction interval, a reconstruction field of view (FOV), reconstruction start and end positions, or a reconstruction matrix. In some embodiments, the processing device 140 may determine the plurality of reconstruction parameters based on the plurality of potential reconstruction needs. The plurality of potential reconstruction needs may be determined according to clinical experience. That is, in general, one or more users (e.g., a doctor or a technician) may need to view one or more images of the object (e.g., from different views of the object) for diagnosis or research purposes. The one or more images of the object may be reconstructed based on the raw data of the object according to one or more of the plurality of potential reconstruction needs, respectively. Each of the plurality of potential reconstruction needs may correspond to a set of potential reconstruction parameters. Accordingly, the processing device 140 may determine the plurality of reconstruction parameters based on (e.g., from) the plurality of sets of potential reconstruction parameters. Each potential reconstruction parameter of a set of potential reconstruction parameters may be coarser than a corresponding one of the plurality of reconstruction parameters. Further, the reconstruction module 403 may store the initial image data in a storage device (e.g., the storage device 150). More descriptions regarding the generation and storing of the initial image data may be found elsewhere in the present disclosure (e.g., operations 505 and 507 and the descriptions thereof).
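For illustration only, deriving a single finest-granularity reconstruction parameter set from several sets of potential reconstruction parameters might look like the following sketch; the dictionary keys and the numeric values are assumptions made for this example, not values taken from the disclosure.

```python
def merge_parameters(potential_sets):
    # Select the finest value for each parameter so that every coarser
    # potential reconstruction need can later be met from the initial data.
    return {
        # finest slice thickness and interval among all potential needs
        "slice_thickness": min(p["slice_thickness"] for p in potential_sets),
        "interval": min(p["interval"] for p in potential_sets),
        # largest FOV and matrix so any smaller region can be extracted later
        "fov": max(p["fov"] for p in potential_sets),
        "matrix": max(p["matrix"] for p in potential_sets),
        # union of the scan range covered by all potential needs
        "start": min(p["start"] for p in potential_sets),
        "end": max(p["end"] for p in potential_sets),
    }

params = merge_parameters([
    {"slice_thickness": 5.0, "interval": 5.0, "fov": 300, "matrix": 512,
     "start": 0.0, "end": 150.0},
    {"slice_thickness": 1.0, "interval": 1.0, "fov": 500, "matrix": 1024,
     "start": 50.0, "end": 300.0},
])
# params now describes a reconstruction fine enough to serve both needs.
```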
The second part may include one or more pick modules 404 (e.g., a pick module 404-1, a pick module 404-2, …, a pick module 404-n), one or more postprocess modules 405 (e.g., a postprocess module 405-1, a postprocess module 405-2, …, a postprocess module 405-n), and one or more output modules 406 (e.g., an output module 406-1, an output module 406-2, …, an output module 406-n). Each pick module 404 may correspond to a postprocess module 405 and an output module 406. The pick module 404 with its corresponding postprocess module 405 and corresponding output module 406 may correspond to a target reconstruction need for generating a target image that meets the target reconstruction need. For example, the pick module 404-1 may correspond to the postprocess module 405-1 and the output module 406-1. The pick module 404-2 may correspond to the postprocess module 405-2 and the output module 406-2. The pick module 404-1, the postprocess module 405-1, and the output module 406-1 may correspond to a first target reconstruction need. The pick module 404-2, the postprocess module 405-2, and the output module 406-2 may correspond to a second target reconstruction need. In some embodiments, the modules corresponding to different target reconstruction needs may be implemented in different processing devices or different sub-devices of a processing device.
The pick module(s) 404 may be configured to retrieve target data from the initial image data according to one or more target reconstruction needs. Merely by way of example, the pick module 404-1 may retrieve first target data from the initial image data according to a first target reconstruction need for generating a first target image. The first target reconstruction need may indicate at least one of a position and/or a scope of the first target data in the initial image data, a size of pixels or voxels of the first target image, etc. The size of pixels or voxels of the first target image may be no less than the sizes of pixels or voxels of the initial image sequence corresponding to the initial image data. The first target reconstruction need may be associated with a first set of basic reconstruction parameters. Each basic reconstruction parameter of the first set of basic reconstruction parameters may be coarser than a corresponding one of the plurality of reconstruction parameters. More descriptions regarding the retrieving of target data from the initial image data may be found elsewhere in the present disclosure (e.g., operations 509 and 601 and the descriptions thereof).
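As a purely hypothetical illustration of such a pick operation, the sketch below maps a scope requested along the Z-axis onto slice indices of a stored initial image sequence; the field names and array sizes are invented for this example.

```python
import numpy as np

def pick(initial_volume, z_positions, need):
    # Map the scope requested by the target need (expressed along the
    # Z-axis, in mm) onto slice indices of the stored initial image data.
    mask = (z_positions >= need["z_start"]) & (z_positions <= need["z_end"])
    return initial_volume[mask]

# Initial image data: 300 slices reconstructed at a 1 mm interval (toy sizes).
volume = np.zeros((300, 64, 64), dtype=np.float32)
positions = np.arange(300) * 1.0
slab = pick(volume, positions, {"z_start": 50.0, "z_end": 120.0})  # 71 slices
```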
The postprocess module(s) 405 may be configured to generate one or more target images based on the retrieved target data. For example, the postprocess module 405-1 may generate the first target image of the object by performing a postprocessing operation on the first target data. The postprocessing operation may include an artifact removal operation, a correction operation, a merging operation, a multi-planar reformation (MPR) operation, or the like, or any combination thereof. More descriptions regarding the generation of the one or more target images may be found elsewhere in the present disclosure (e.g., operations 509 and 603 and the descriptions thereof).
The output module(s) 406 may be configured to cause the one or more target images to be displayed for a user. For example, the output module 406-1 may cause the first target image to be displayed, e.g., on a screen of the terminal 130 for the user. Further, the user may input a feedback indicating whether the first target image is satisfactory or whether to print the first target image via the terminal 130. More descriptions regarding causing the one or more target images to be displayed may be found elsewhere in the present disclosure (e.g., operation 511 and the description thereof).
The modules in the processing device 140 may be connected to or communicated with each other via a wired connection or a wireless connection. The wired connection may include a metal cable, an optical cable, a hybrid cable, or the like, or any combination thereof. The wireless connection may include a Local Area Network (LAN) , a Wide Area Network (WAN) , a Bluetooth TM, a ZigBee TM, a Near Field Communication (NFC) , or the like, or any combination thereof. Each of the modules described above may be a hardware circuit that is designed to perform certain actions, e.g., according to a set of instructions stored in one or more storage media, and/or any combination of the hardware circuit and the one or more storage media. In some embodiments, the processing device 140 may include one or more other modules and/or one or more modules described above may be omitted. Additionally or alternatively, two or more modules may be integrated into a single module, and/or a module may be divided into two or more units. For example, the above-mentioned modules may be integrated into a console (not shown) . Via the console, a user may set parameters for scanning an object, controlling imaging processes, controlling reconstruction processes, viewing images, etc. As another example, the processing device 140 may include a storage module (not shown) (e.g., a first storage module in the first part and/or a second storage module in the second part) configured to store information and/or data (e.g., the initial image data, the one or more target images, etc. ) associated with the above-mentioned modules. As still another example, the processing device 140 may include a communication module configured to cause the initial image data and/or the one or more target images to be displayed for a user, e.g.,  transmit the initial image data and/or the one or more target images to the terminal 130 for display.
FIG. 5 is a flowchart illustrating an exemplary process for multiple image reconstructions associated with the same raw data according to some embodiments of the present disclosure. In some embodiments, process 500 may be implemented as a set of instructions (e.g., an application) stored in a storage device (e.g., the storage device 150, the storage 220, and/or the storage 390). The processing device 140 (e.g., the processor 210, the CPU 340, and/or one or more modules illustrated in FIG. 4) may execute the set of instructions, and when executing the instructions, the processing device 140 may be configured to perform the process 500. The operations of the illustrated process presented below are intended to be illustrative. In some embodiments, the process 500 may be accomplished with one or more additional operations not described and/or without one or more of the operations discussed. Additionally, the order of the operations of process 500 illustrated in FIG. 5 and described below is not intended to be limiting.
In 501, the processing device 140 (e.g., the obtaining module 401) may obtain raw data of an object generated by an imaging device (e.g., the imaging device 110) . In some embodiments, operation 501 may also be referred to as a read operation, e.g., as shown in FIG. 8.
As used herein, the raw data of the object (e.g., a patient or a portion thereof) refers to imaging data (e.g., scan data) of the object generated by the imaging device 110. Taking a CT device as an example of the imaging device 110, the raw data of the object may include projection data of the object acquired by the CT device. In such cases, the raw data of the object may reflect attenuation information of radiation rays (e.g., X-rays) that pass through the object, and may be generally used to generate one or more images related to the object. In some embodiments, the raw data of the object may be detected and/or collected by the imaging device 110 at a plurality of angles during a scan of the object. The raw data of the object may include a plurality of sets of data corresponding to the plurality of angles. For example, the imaging device 110 (e.g., a CT imaging device) may perform a scan of the object by irradiating the object with X-rays. In some embodiments, during the scan, the radiation source 115 and the detector 112 may rotate with the gantry 111 around the Z-axis to scan the object from different angles.
In some embodiments, the processing device 140 may obtain the raw data of the object from one or more components of the medical imaging system 100, such as the imaging device 110, a terminal (e.g., the terminal 130) , a storage device (e.g., the storage device 150) , etc. Alternatively or additionally, the processing device 140 may obtain the raw data of the object from an external source via the network 120. For example, the obtaining module 401 may obtain the raw data from, for example, a medical database, etc.
In 503, the processing device 140 (e.g., the preprocess module 402) may generate preprocessed data by preprocessing the raw data. In some embodiments, the operation 503 may also be referred to as a preprocess operation, e.g., as shown in FIG. 8.
In some embodiments, the processing device 140 may perform one or more preprocessing operations on the raw data to generate the preprocessed data. The one or more preprocessing operations may include determining whether the raw data is corrupted, determining whether the raw data needs correction, determining whether the raw data needs noise reduction, or the like, or any combination thereof. For example, the processing device 140 may determine whether the raw data is corrupted (or damaged). For instance, the processing device 140 may determine that the raw data is corrupted if the raw data fails to be read or opened. In response to a determination that the raw data is corrupted, the processing device 140 may perform a recovery operation on the corrupted raw data to recover the raw data. Alternatively, the process 500 may proceed to operation 501, that is, the processing device 140 may obtain the raw data of the object again. As another example, the processing device 140 may determine whether the raw data needs correction. For raw data generated by the CT device, different tissues or structures may correspond to different value ranges. The processing device 140 may obtain a value range corresponding to the object and determine whether values of the raw data of the object are within the value range corresponding to the object. In response to a determination that the values of the raw data of the object are within the value range corresponding to the object, the processing device 140 may determine that the raw data needs no correction. In response to a determination that the values of the raw data of the object are not within the value range corresponding to the object, the processing device 140 may determine that the raw data of the object needs correction, and the processing device 140 may perform a correction operation on the raw data of the object to correct the values of the raw data. As still another example, the processing device 140 may determine whether the raw data needs noise reduction. For instance, the processing device 140 may determine that the raw data needs noise reduction if the noise identified from the raw data is greater than a noise threshold. In response to a determination that the raw data needs noise reduction, the processing device 140 may determine a type of noise of the raw data and perform a noise reduction operation corresponding to the determined type of noise on the raw data of the object for noise reduction.
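Merely by way of example, the following non-limiting sketch illustrates the three preprocessing checks described above on a NumPy array of projection data; the corruption test, the value range, the noise estimate, and the thresholds are assumptions for illustration and are not part of the disclosed system:

```python
import numpy as np

NOISE_THRESHOLD = 0.05     # assumed relative noise level
VALUE_RANGE = (0.0, 12.0)  # assumed valid value range for the object type

def is_corrupted(raw: np.ndarray) -> bool:
    """Treat unreadable values (NaN/Inf) as a sign of corrupted raw data."""
    return not np.isfinite(raw).all()

def needs_correction(raw: np.ndarray) -> bool:
    """The raw data needs correction if values fall outside the expected range."""
    lo, hi = VALUE_RANGE
    return bool((raw < lo).any() or (raw > hi).any())

def needs_noise_reduction(raw: np.ndarray) -> bool:
    """Estimate relative noise from first differences along the detector axis."""
    noise = np.std(np.diff(raw, axis=-1)) / (np.abs(raw).mean() + 1e-9)
    return noise > NOISE_THRESHOLD

def preprocess(raw: np.ndarray) -> np.ndarray:
    if is_corrupted(raw):
        raise IOError("raw data corrupted; recover it or read it again")
    data = raw.copy()
    if needs_correction(data):
        data = np.clip(data, *VALUE_RANGE)  # placeholder correction
    if needs_noise_reduction(data):
        # placeholder noise reduction: light smoothing along the detector axis
        kernel = np.array([0.25, 0.5, 0.25])
        data = np.apply_along_axis(
            lambda row: np.convolve(row, kernel, mode="same"), -1, data)
    return data

preprocessed = preprocess(np.random.rand(360, 512) * 10.0)
```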
In 505, the processing device 140 (e.g., the reconstruction module 403) may generate initial image data by performing, according to a plurality of reconstruction parameters associated with a plurality of potential reconstruction needs, a reconstruction operation on the preprocessed data, such that the initial image data meets the plurality of potential reconstruction needs. In some embodiments, operation 505 may also be referred to as a reconstruction operation, e.g., as shown in FIG. 8.
In some embodiments, the plurality of reconstruction parameters may include a reconstruction slice thickness, a reconstruction interval, a reconstruction field of view (FOV), reconstruction start and end positions, a reconstruction matrix, or the like, or a combination thereof. As used herein, the reconstruction slice thickness refers to a thickness (e.g., along the Z-axis, the X-axis, or the Y-axis) of a single image slice of the initial image sequence. The reconstruction interval refers to a distance (e.g., along the Z-axis, the X-axis, or the Y-axis) between two adjacent image slices of the initial image sequence. The reconstruction slice thickness and/or the reconstruction interval may indicate a count (or number) of image slices of the initial image sequence. The larger the reconstruction slice thickness and/or the larger the reconstruction interval is, the smaller the count of the image slices of the initial image sequence may be. The reconstruction FOV refers to a size of a scan FOV that is reconstructed to produce image(s) (e.g., the image slice(s) of the initial image sequence), which may indicate which portion of the object is shown in the image(s). The scan FOV refers to an area being scanned by the imaging device 110. The reconstruction FOV may be no larger than the scan FOV. The reconstruction start and end positions may include a start position and an end position, which indicate a range of the raw data that is used to reconstruct the initial image sequence. Merely by way of example, the raw data of the object may be acquired by rotating the radiation source 115 of the imaging device 110 from 0° to 360° around the object. The reconstruction start and end positions may include a start position denoted by a start angle (e.g., an angle no less than 0°) and an end position denoted by an end angle (e.g., an angle no larger than 360°), which indicates that the raw data generated between the start angle and the end angle is used to reconstruct the initial image sequence of the object. The reconstruction matrix refers to an array of rows and columns of pixels/voxels (e.g., 512×512, 1024×1024, etc.) of an image of the initial image sequence, which can indicate a resolution of the image. For an image with a preset size, the larger the reconstruction matrix is, the higher the resolution of the image may be. For example, an image corresponding to a reconstruction matrix of 1024×1024 may have a higher resolution than an image corresponding to a reconstruction matrix of 512×512.
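Merely by way of example, the reconstruction parameters described above may be represented as a simple data structure; the field names, units, and values in the following sketch are assumptions for illustration:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ReconParams:
    slice_thickness_mm: float  # thickness of a single reconstructed image slice
    interval_mm: float         # distance between two adjacent image slices
    fov_mm: float              # reconstruction FOV (no larger than the scan FOV)
    start_angle_deg: float     # start position of the raw-data range used
    end_angle_deg: float       # end position of the raw-data range used
    matrix: int                # rows/columns of the image, e.g., 512 or 1024

# a fine "complete" parameter set versus a coarser per-need parameter set
complete_set = ReconParams(0.5, 0.5, 500.0, 0.0, 360.0, 1024)
preview_set = ReconParams(5.0, 5.0, 350.0, 0.0, 360.0, 512)
```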
In some embodiments, the processing device 140 may determine the plurality of reconstruction parameters based on the plurality of potential reconstruction needs. The plurality of potential reconstruction needs may be determined according to clinical experience. That is, in general, one or more users (e.g., a doctor or a technician) may need to view one or more images of the object (e.g., from different views of the object) for diagnosis or research purposes. The one or more images of the object may be reconstructed based on the raw data of the object according to one or more of the plurality of potential reconstruction needs respectively. Each of the plurality of potential reconstruction needs may correspond to a set of potential reconstruction parameters (also referred to as a potential parameter set associated with reconstruction or a potential reconstruction parameter set) and a potential parameter set associated with postprocessing (also referred to as a potential postprocessing parameter set). Accordingly, the processing device 140 may determine the plurality of reconstruction parameters based on (e.g., from) the plurality of sets of potential reconstruction parameters.
In some embodiments, each set of potential reconstruction parameters may include parameters of different parameter types. Parameter types of any two sets of potential reconstruction parameters corresponding to different potential reconstruction needs may be the same or different. Parameter types of the plurality of reconstruction parameters may include at least a union of the parameter types of the plurality of sets of potential reconstruction parameters. For example, a set of potential reconstruction parameters corresponding to a first potential reconstruction need may have 5 parameter types, while a set of potential reconstruction parameters corresponding to a second potential reconstruction need may have 8 parameter types. A portion (e.g., 3) of the 5 parameter types may be the same as that of the 8 parameter types. Accordingly, the plurality of reconstruction parameters may have a union of the 5 parameter types and the 8 parameter types (e.g., 10 parameter types). Alternatively, the plurality of reconstruction parameters may have other parameter type(s) in addition to the parameter types of the plurality of sets of potential reconstruction parameters.
As used herein, a reconstruction need may relate to which portion of the object needs to be shown in image(s), resolution(s) of the image(s), a range of data that is used to reconstruct the image(s), a count (or number) of the image(s), or the like, or any combination thereof. In some embodiments, different users may have different potential reconstruction needs, different objects may correspond to different potential reconstruction needs, and/or different diseases may correspond to different potential reconstruction needs. In some embodiments, the processing device 140 may parse the plurality of potential reconstruction needs to determine the plurality of reconstruction parameters. In some embodiments, the plurality of potential reconstruction needs may be parsed manually to obtain the plurality of reconstruction parameters. In some embodiments, a reconstruction parameter of the plurality of reconstruction parameters may be determined based on a highest demand of one aspect of the plurality of potential reconstruction needs. For example, if one aspect of the plurality of potential reconstruction needs relates to various image resolutions (e.g., image resolution R1 may be needed for image preview, and image resolution R2 may be needed for disease diagnosis, in which R2>R1), a reconstruction parameter (e.g., the reconstruction matrix) may be determined based on the highest demand (e.g., the highest image resolution R2), which means image(s) with a relatively high resolution may be reconstructed for postprocessing. Image(s) with a relatively low resolution (e.g., R1) may be obtained by down-sampling the image(s) with the relatively high resolution (e.g., R2). Accordingly, an aspect of the reconstruction need relating to image resolution may correspond to a potential reconstruction parameter with various values (e.g., reconstruction matrix M1 corresponding to resolution R1, and reconstruction matrix M2 corresponding to resolution R2). In some embodiments, the processing device 140 may determine a parameter with the finest value (e.g., M2 corresponding to R2) among the various values (e.g., M1 corresponding to R1, M2 corresponding to R2), or a value finer than the finest value, as one of the plurality of reconstruction parameters.
In some embodiments, the plurality of reconstruction parameters may also be referred to as parameters with a relatively fine granularity, while each set of potential reconstruction parameters may also be referred to as parameters with a relatively coarse granularity. For brevity, the plurality of reconstruction parameters may also be referred to as a complete reconstruction parameter set, indicating that the plurality of reconstruction parameters meet the plurality of potential reconstruction needs. That is, for each set of potential reconstruction parameters, each potential reconstruction parameter of the set may be coarser than a corresponding one of the plurality of reconstruction parameters. For example, a potential reconstruction slice thickness of a set of potential reconstruction parameters may be no less than the reconstruction slice thickness of the plurality of reconstruction parameters. As another example, a potential reconstruction interval of a set of potential reconstruction parameters may be no less than the reconstruction interval of the plurality of reconstruction parameters. As still another example, a potential reconstruction FOV of a set of potential reconstruction parameters may be no greater than the reconstruction FOV of the plurality of reconstruction parameters. As yet another example, potential reconstruction start and end positions of each set of potential reconstruction parameters may indicate a potential range of the raw data which is no larger than the range of the raw data indicated by the reconstruction start and end positions of the plurality of reconstruction parameters. As yet another example, a potential reconstruction matrix of a set of potential reconstruction parameters may be no larger than the reconstruction matrix of the plurality of reconstruction parameters. Merely by way of example, there may be three sets of potential reconstruction parameters corresponding to three potential reconstruction needs respectively. Assuming that the three sets of potential reconstruction parameters include potential reconstruction slice thicknesses of 5 mm, 3 mm, and 2 mm, respectively, the reconstruction slice thickness of the plurality of reconstruction parameters may be finer than the three potential reconstruction slice thicknesses, e.g., 0.5 mm. Assuming that the three sets of potential reconstruction parameters include potential reconstruction intervals of 5 mm, 3 mm, and 2 mm, respectively, the reconstruction interval of the plurality of reconstruction parameters may be finer than the three potential reconstruction intervals, e.g., 0.5 mm. In some embodiments, the plurality of reconstruction parameters may be as fine as possible such that the initial image data (or the initial image sequence) meets subsequent target reconstruction needs.
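Merely by way of example, the following sketch derives a complete (finest) parameter set that is no coarser than any of several potential parameter sets, following the relations above; the keys and example values are assumptions for illustration, and, as noted above, the system may choose values even finer than the derived floor (e.g., 0.5 mm rather than 2 mm):

```python
# three potential reconstruction parameter sets (illustrative values)
potential_sets = [
    {"thickness": 5.0, "interval": 5.0, "fov": 350.0, "start": 0.0, "end": 180.0, "matrix": 512},
    {"thickness": 3.0, "interval": 3.0, "fov": 400.0, "start": 0.0, "end": 360.0, "matrix": 512},
    {"thickness": 2.0, "interval": 2.0, "fov": 500.0, "start": 90.0, "end": 270.0, "matrix": 1024},
]

def finest_parameter_set(sets):
    return {
        # slice thickness and interval: at least as fine as the finest need
        "thickness": min(s["thickness"] for s in sets),
        "interval": min(s["interval"] for s in sets),
        # FOV: at least as large as the largest need
        "fov": max(s["fov"] for s in sets),
        # start/end positions: cover the union of all requested ranges
        "start": min(s["start"] for s in sets),
        "end": max(s["end"] for s in sets),
        # matrix: at least as large as the largest need
        "matrix": max(s["matrix"] for s in sets),
    }

print(finest_parameter_set(potential_sets))
# {'thickness': 2.0, 'interval': 2.0, 'fov': 500.0,
#  'start': 0.0, 'end': 360.0, 'matrix': 1024}
```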
In some embodiments, the initial image data may include an image (e.g., a 3D image or a 2D image) of the object. For instance, the initial image may include (or correspond to) an initial image sequence of the object. The initial image sequence of the object may include a plurality of image slices relating to the object that are arranged along the Z-axis, the X-axis, and/or the Y-axis. For example, the processing device 140 may perform the reconstruction operation on the preprocessed data according to the plurality of reconstruction parameters to generate the initial image sequence. As the plurality of reconstruction parameters have a relatively fine granularity, sizes of pixels or voxels of the initial image sequence may be relatively small. In some embodiments, one or more images of the initial image sequence may appear relatively blurred and/or sharp, and may need to be postprocessed (e.g., according to operation 509). In some embodiments, the initial image data may include multiple initial image sequences corresponding to different reconstruction parameters respectively. Merely by way of example, the processing device 140 may generate different image sequences each corresponding to a different reconstruction slice thickness (e.g., 0.5 mm, 1 mm, 1.5 mm, etc.), such that target data corresponding to one of the different reconstruction slice thicknesses can be directly retrieved from the multiple image sequences.
In some embodiments, the processing device 140 may perform the reconstruction operation on the preprocessed data using a reconstruction algorithm. Exemplary reconstruction algorithms may include a filtered back projection (FBP) algorithm, a forward projection algorithm, an iterative reconstruction algorithm, a Fourier-based reconstruction algorithm, a rearrangement algorithm, or the like, or any combination thereof.
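Merely by way of example, the following sketch illustrates FBP reconstruction using scikit-image's radon/iradon transforms as a stand-in for the scanner's actual projection geometry; the phantom and the angle set are illustrative only:

```python
import numpy as np
from skimage.data import shepp_logan_phantom
from skimage.transform import radon, iradon, resize

phantom = resize(shepp_logan_phantom(), (256, 256))    # stand-in "object"
theta = np.linspace(0.0, 180.0, 180, endpoint=False)   # acquisition angles

sinogram = radon(phantom, theta=theta)  # forward model, analogous to raw data
recon = iradon(sinogram, theta=theta)   # FBP: ramp filtering + backprojection

print(recon.shape)  # (256, 256): one image slice of the initial image data
```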
In 507, the processing device 140 (e.g., the reconstruction module 403) may store the initial image data. In some embodiments, operation 507 may be a portion of the reconstruction operation in 505 or independent from the reconstruction operation.
In some embodiments, the processing device 140 may store the initial image data in a storage device such as the storage device 150, the storage 220, or the storage 390 for subsequent access and processing. Alternatively, the processing device 140 may store the initial image data in a storage module of the processing device 140.
In 509, the processing device 140 (e.g., the pick module(s) 404, the postprocess module(s) 405) may generate, based on the initial image data, one or more target images of the object according to one or more target reconstruction needs, respectively. In some embodiments, operation 509 may include a pick operation and a postprocess operation, e.g., as shown in FIG. 8.
In some embodiments, each of the one or more target reconstruction needs may be associated with a set of basic reconstruction parameters (also referred to as a target parameter set associated with reconstruction, or a target reconstruction parameter set). The set of basic reconstruction parameters may be parameters with a relatively coarse granularity with respect to the plurality of reconstruction parameters. That is, each set of basic reconstruction parameters may be coarser than the plurality of reconstruction parameters. For instance, for each set of basic reconstruction parameters, each type of parameter of the set of basic reconstruction parameters may be coarser than a corresponding type of parameter of the plurality of reconstruction parameters, in a manner similar to how each potential reconstruction parameter of a set of potential reconstruction parameters is coarser than a corresponding reconstruction parameter of the plurality of reconstruction parameters. For example, there may be a first target reconstruction need and a second target reconstruction need. The first target reconstruction need may be associated with a first set of basic reconstruction parameters. The second target reconstruction need may be associated with a second set of basic reconstruction parameters. At least one (e.g., each target reconstruction parameter) of the first set of basic reconstruction parameters and/or the second set of basic reconstruction parameters may be coarser than a corresponding one of the plurality of reconstruction parameters. In some embodiments, each set of basic reconstruction parameters may be input and/or adjusted by a user of the medical imaging system 100.
In some embodiments, for each of the one or more target reconstruction needs, the processing device 140 may retrieve target data from the initial image data according to the target reconstruction need. The target reconstruction need may indicate at least one of a position and/or a scope of the target data in the initial image data, a size of pixels or voxels of the target image, etc. The size of pixels or voxels of the target image may be no less than the sizes of pixels or voxels of the initial image sequence corresponding to the initial image data. The processing device 140 may generate a target image of the object by performing a postprocessing operation on the target data, such that the target image of the object meets the target reconstruction need. The postprocessing operation may include at least one of an artifact removal operation, a correction operation, or a merging operation. For example, the target reconstruction need may correspond to a target parameter set associated with postprocessing (also referred to as a target postprocessing parameter set). The processing device 140 may generate the target image of the object by performing, according to the target postprocessing parameter set, the postprocessing operation on the target data. More descriptions regarding retrieving the target data and performing the postprocessing operation on the target data may be found elsewhere in the present disclosure (e.g., FIG. 6 and the description thereof).
In some embodiments, the processing device 140 may generate the one or more target images of the object in parallel. That is, the processing device 140 may retrieve the target data according to each of the one or more target reconstruction needs simultaneously (or synchronously) and separately for generating the one or more target images. For example, there may be a first target reconstruction need and a second target reconstruction need. The processing device 140 may retrieve first target data and second target data from the initial image data in parallel according to the first target reconstruction need and the second target reconstruction need, respectively. The processing device 140 may generate a first target image of the object by performing a first postprocessing operation on the first target data, and simultaneously (or synchronously) generate a second target image of the object by performing a second postprocessing operation on the second target data. The first postprocessing operation may be the same as or different from the second postprocessing operation. The first target reconstruction need may indicate at least one of a position and/or a scope of the first target data in the initial image data, or a size of pixels or voxels of the first target image. The second target reconstruction need may indicate at least one of a position and/or a scope of the second target data in the initial image data, or a size of pixels or voxels of the second target image. Alternatively, the processing device 140 may generate the one or more target images of the object in sequence.
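Merely by way of example, the following sketch retrieves two sets of target data and postprocesses them in parallel; the pick and postprocess functions, the array sizes, and the target reconstruction needs are placeholders for the operations described above:

```python
from concurrent.futures import ThreadPoolExecutor

import numpy as np

initial = np.random.rand(40, 256, 256)  # stand-in initial image sequence

def pick(data, z_slice, step):
    """Retrieve target data: a sub-range of slices at a coarser interval."""
    z0, z1 = z_slice
    return data[z0:z1:step]

def postprocess(target):
    """Placeholder postprocessing: clip to an assumed display window."""
    return np.clip(target, 0.2, 0.8)

needs = [
    {"z_slice": (0, 40), "step": 1},   # first target reconstruction need
    {"z_slice": (10, 30), "step": 2},  # second target reconstruction need
]

with ThreadPoolExecutor() as pool:
    futures = [pool.submit(lambda n=n: postprocess(pick(initial, **n)))
               for n in needs]
    targets = [f.result() for f in futures]

print([t.shape for t in targets])  # [(40, 256, 256), (10, 256, 256)]
```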
In 511, the processing device 140 (e.g., the output module(s) 406) may cause the one or more target images of the object to be displayed for a user. In some embodiments, operation 511 may also be referred to as an output operation, e.g., as shown in FIG. 8.
In some embodiments, the processing device 140 may cause the one or more target images of the object to be displayed on a screen of the terminal 130 for the user. Further, the processing device 140 may receive, via the terminal 130, a feedback from the user indicating whether the one or more target images are satisfactory. In some embodiments, the one or more target images may be displayed simultaneously or individually. For example, the one or more target images may be displayed simultaneously such that the user can view the one or more target images synchronously. Further, the one or more target images may be printed in the same report. As another example, the one or more target images may be displayed individually according to the generation sequence of the one or more target images, a default setting, or a designation of the user.
In some embodiments, according to the process 500, the multiple image reconstructions may be divided into two stages, e.g., a first stage (e.g., operations 501-507) for generating the initial image data and a second stage (e.g., operations 509-511) for generating one or more target images based on the initial image data. The division of the multiple image reconstructions into the first stage and the second stage may avoid repeating resource-intensive operations, thereby saving resources and time and simplifying the process of the multiple image reconstructions. For example, in general, the read, preprocess, and reconstruction operations may take up 60%-80% of the reconstruction time of an image reconstruction process according to a target reconstruction need. In traditional multiple image reconstructions, the image reconstruction process may need to repeat these resource-intensive operations multiple times, which is time-consuming and resource-intensive. In the multiple image reconstructions according to the process 500, the time-consuming and resource-intensive operations may need to be performed only once, which reduces the reconstruction time and the resources used in the multiple image reconstructions. In some embodiments, the division of the multiple image reconstructions into the first stage and the second stage may also help to quickly identify and locate error operation(s) occurring in the multiple image reconstructions, which saves manpower. For example, the processing device 140 may determine whether there is an erroneous result in performing the process 500. In response to a determination that there is an erroneous result, the processing device 140 may identify, from operations (e.g., operations 501-511) of the process 500, an error operation and report the error operation. For instance, if the processing device 140 can retrieve the target data from the initial image data according to each target reconstruction need, the processing device 140 may determine the first stage to be normal and locate the error operation(s) in the second stage. In some embodiments, if operation(s) associated with a new target reconstruction need are to be performed, the processing device 140 may only need to perform the operation(s) in the second stage and avoid repeating the operations in the first stage. For example, if candidate operation(s) associated with a new target reconstruction need are to be tested, the processing device 140 may only need to add the candidate operation(s) into the second stage of the process 500 (e.g., replace the operation(s) in 509 with the candidate operation(s) to perform or test the candidate operation(s)) without repeating the performing or testing of the operations in the first stage, thereby saving test time.
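Merely by way of example, the following sketch shows how the two-stage division helps locate an error operation: a failure before the initial image data exists is reported against the first stage, while a failure afterwards is reported against the second stage for a specific need. All stage functions here are trivial placeholders:

```python
def preprocess(raw):
    return raw                             # placeholder first-stage step

def reconstruct(data):
    return data                            # placeholder reconstruction

def pick(initial, need):
    return initial[::need.get("step", 1)]  # placeholder second-stage pick

def postprocess(target, need):
    return target                          # placeholder postprocessing

def report(stage, err):
    print(f"error operation located in {stage}: {err}")

def run_pipeline(raw, needs):
    try:
        initial = reconstruct(preprocess(raw))  # first stage, performed once
    except Exception as err:
        report("the first stage (read/preprocess/reconstruction)", err)
        raise
    targets = []
    for need in needs:                          # second stage, per need
        try:
            targets.append(postprocess(pick(initial, need), need))
        except Exception as err:
            report(f"the second stage (need {need!r})", err)
    return targets

targets = run_pipeline(list(range(10)), [{"step": 1}, {"step": 2}])
```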
In some embodiments, the first stage and the second stage of the process 500 may be performed online and/or offline. For example, the first stage and the second stage of the process 500 may be performed separately. The first stage of the process 500 may be performed offline, while the second stage of the process 500 may be performed online. That is, the first stage of the process 500 may be performed offline and the initial image data may be stored before the user selects or designates the one or more target reconstruction needs, and the second stage of the process 500 may be performed online when the user makes the selection and/or the designation, in which the target data is retrieved from the initial image data for image processing. As another example, the first stage and the second stage of the process 500 may both be performed online or offline. That is, the first stage and the second stage of the process 500 may be performed in response to the one or more selected and/or designated target reconstruction needs. In such cases, the one or more potential reconstruction needs in operation 505 may be determined based on the one or more target reconstruction needs.
It should be noted that the above description regarding the process 500 is merely provided for the purposes of illustration, and not intended to limit the scope of the present disclosure. For persons having ordinary skills in the art, multiple variations and modifications may be made under the teachings of the present disclosure. However, those variations and modifications do not depart from the scope of the present disclosure. In some embodiments, one or more additional operations may be added in the process 500. For example, another storing operation may be added elsewhere in the process 500. In the storing operation, the processing device 140 may store information and/or data (e.g., the one or more target images) used or obtained in operations of the process 500 in a storage device (e.g., the storage device 150) disclosed elsewhere in the present disclosure. In some embodiments, one or more operations of the process 500 may be omitted or combined. For example, operations 501 and 503 may be integrated into a single operation. In some embodiments, an operation of the process 500 may be achieved by two or more sub-operations. For example, operation 509 may be divided into two sub-operations, one of which is for retrieving target data from the initial image data according to each of the one or more target reconstruction needs and another of which is for generating a target image by postprocessing the retrieved target data. In some embodiments, the first stage (e.g., operations 501-507) and the second stage (e.g., operations 509-511) may be performed by different processing devices.
FIG. 6 is a flowchart illustrating an exemplary process for generating a target image of an object based on initial image data of the object according to some embodiments of the present disclosure. In some embodiments, process 600 may be implemented as a set of instructions (e.g., an application) stored in a storage device (e.g., the storage device 150, the storage 220, and/or the storage 390). The processing device 140 (e.g., the processor 210, the CPU 340, and/or one or more modules illustrated in FIG. 4) may execute the set of instructions, and when executing the instructions, the processing device 140 may be configured to perform the process 600. The operations of the illustrated process presented below are intended to be illustrative. In some embodiments, the process 600 may be accomplished with one or more additional operations not described and/or without one or more of the operations discussed. Additionally, the order of the operations of process 600 illustrated in FIG. 6 and described below is not intended to be limiting. In some embodiments, operation 509 in FIG. 5 may be achieved by the process 600.
In 601, the processing device 140 (e.g., the pick module 404) may retrieve target data from the initial image data of the object according to a target reconstruction need.
As described in operation 509, the target reconstruction need may be associated with a set of basic reconstruction parameters. In some embodiments, the processing device 140 may determine whether the set of basic reconstruction parameters is coarser than the plurality of reconstruction parameters according to which the initial image data is generated. In response to a determination that the set of basic reconstruction parameters is coarser than the plurality of reconstruction parameters, the processing device 140 may retrieve the target data from the initial image data according to the target reconstruction need. The target reconstruction need may indicate at least one of a position and/or a scope of the target data in the initial image data, or a size of pixels or voxels of the target image that meets the target reconstruction need. In response to a determination that the set of basic reconstruction parameters is no coarser than the plurality of reconstruction parameters, the processing device 140 may update the plurality of reconstruction parameters (used in 505) based on the target reconstruction need. Alternatively, the processing device 140 may obtain a request for adjusting the plurality of reconstruction parameters (e.g., from a user). Then, the processing device 140 may update the initial image data based on the plurality of updated reconstruction parameters and retrieve the target data from the updated initial image data. The target reconstruction need may indicate at least one of a position and/or a scope of the target data in the updated initial image data, or a size of pixels or voxels of the target image that meets the target reconstruction need.
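Merely by way of example, the coarseness check of operation 601 may be sketched as follows; the parameter keys, values, and comparison rules mirror the granularity relations described above and are assumptions for illustration:

```python
def is_no_finer(basic, complete):
    """True if every basic parameter is no finer than the complete parameter."""
    return (
        basic["thickness"] >= complete["thickness"]  # thicker slices are coarser
        and basic["interval"] >= complete["interval"]
        and basic["fov"] <= complete["fov"]          # smaller FOV is coarser
        and basic["matrix"] <= complete["matrix"]    # smaller matrix is coarser
    )

complete = {"thickness": 0.5, "interval": 0.5, "fov": 500.0, "matrix": 1024}
basic = {"thickness": 3.0, "interval": 3.0, "fov": 350.0, "matrix": 512}

if is_no_finer(basic, complete):
    print("retrieve the target data directly from the initial image data")
else:
    print("update the reconstruction parameters and regenerate the initial image data")
```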
In 603, the processing device 140 (e.g., the postprocess module 405) may generate the target image of the object by performing a postprocessing operation on the target data, such that the target image meets the target reconstruction need.
In some embodiments, the postprocessing operation may include an artifact removal operation, a correction operation, a merging operation, a multi-planar reformation (MPR) operation, or the like, or any combination thereof. The artifact removal operation may be configured to remove an artifact (e.g., a beam hardening artifact, a ring artifact, a metal artifact, etc.) from the target data to generate the target image. For example, if the artifact removal operation is configured to remove the metal artifact, the artifact removal operation may be performed using a metal artifact reduction (MAR) algorithm. The correction operation may be configured for image correction (e.g., motion correction) to generate the target image. The merging operation may be configured to merge at least two image slices of the target data to generate the target image. The MPR operation may be configured to generate the target image based on the target data using an MPR algorithm.
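Merely by way of example, the merging operation may be sketched as averaging groups of adjacent thin slices of the target data into thicker slices; the 0.5 mm to 2 mm merging factor is an illustrative assumption:

```python
import numpy as np

thin = np.random.rand(40, 256, 256)  # 40 slices at an assumed 0.5 mm thickness
factor = 4                           # 4 x 0.5 mm -> 2 mm target slice thickness

usable = thin.shape[0] // factor * factor
thick = thin[:usable].reshape(-1, factor, 256, 256).mean(axis=1)

print(thick.shape)  # (10, 256, 256): ten merged 2 mm slices
```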
In some embodiments, the target reconstruction need may indicate a preset image quality. The processing device 140 may determine whether the target image meets the preset image quality. In response to a determination that the target image meets the preset image quality, the processing device 140 may output the target image. In response to a determination that the target image does not meet the preset image quality, the processing device 140 may further perform the postprocessing operation on the target image until the target image meets the preset image quality.
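Merely by way of example, the preset-image-quality loop may be sketched as follows; the quality score, the threshold, and the denoising step are toy assumptions for illustration:

```python
import numpy as np

def quality(img):
    """Toy quality score: higher is better (less high-frequency noise)."""
    return -float(np.std(np.diff(img, axis=-1)))

def denoise_step(img):
    """One pass of light smoothing along the last axis."""
    kernel = np.array([0.25, 0.5, 0.25])
    return np.apply_along_axis(
        lambda row: np.convolve(row, kernel, mode="same"), -1, img)

def refine_until_ok(img, threshold=-0.1, max_iters=10):
    """Keep postprocessing the target image until it meets the preset quality."""
    for _ in range(max_iters):
        if quality(img) >= threshold:
            break
        img = denoise_step(img)
    return img

target_image = refine_until_ok(np.random.rand(256, 256))
```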
It should be noted that the above description regarding the process 600 is merely provided for the purposes of illustration, and not intended to limit the scope of the present disclosure. For persons having ordinary skills in the art, multiple variations and modifications may be made under the teachings of the present disclosure. However, those variations and modifications do not depart from the scope of the present disclosure. In some embodiments, one or more additional operations may be added in the process 600. For example, a storing operation may be added elsewhere in the process 600. In  the storing operation, the processing device 140 may store information and/or data used or obtained in operations of the process 600 in a storage device (e.g., the storage device 150) disclosed elsewhere in the present disclosure.
According to some embodiments of the present disclosure, there may be at least two target reconstruction needs during image reconstruction. Each of the at least two target reconstruction needs may correspond to a target reconstruction parameter set and a target postprocessing parameter set. The at least two target reconstruction needs may be achieved according to following operations.
S1, the processing device 140 may obtain a first parameter set associated with reconstruction, e.g., according to a first user input.
The first parameter set may include parameters with a relatively fine granularity, while each of the at least two target reconstruction parameter sets may include parameters with a relatively coarse granularity. For example, the first parameter set may be a preset parameter set, e.g., a complete reconstruction parameter set including the plurality of reconstruction parameters associated with a plurality of potential reconstruction needs as described in 505. As another example, the first parameter set may be determined based on the at least two target reconstruction parameter sets, which is similar to the determination of the plurality of reconstruction parameters. Parameter types of the first parameter set may include at least a union of the parameter types of the at least two target reconstruction parameter sets. Alternatively, the first parameter set may have other parameter type(s) in addition to the parameter types of the at least two target reconstruction parameter sets.
The first user input may indicate the at least two target reconstruction needs, or at least two target reconstruction parameter sets of the at least two target reconstruction needs, or a portion thereof. In some embodiments, the processing device 140 may receive the first user input from an interface of the medical imaging system 100. For example, a user may select or click at least two icons (e.g., buttons) corresponding to the at least two target reconstruction parameter sets respectively on the interface to transmit the first user input before the image reconstruction.
S2, the processing device 140 may obtain raw data of an object generated by an imaging device, which is similar to operation 501.
S3, the processing device 140 may generate initial image data (e.g., an image such as a 3D image or a 2D image) of the object by performing, according to the first parameter set, a reconstruction operation on the raw data. For example, the processing device 140 may generate preprocessed data by preprocessing the raw data, and generate the initial image data by performing, according to the first parameter set, the reconstruction operation on the preprocessed data, which is similar to operations 503 and 505. As the first parameter set associated with reconstruction is determined based on the complete reconstruction parameter set or the at least two target reconstruction parameter sets, the initial image data may meet the at least two target reconstruction needs.
S4, the processing device 140 may obtain at least two second parameter sets associated with postprocessing according to a second user input.
For each of the at least two target reconstruction needs, the processing device 140 may obtain a target postprocessing parameter set corresponding to the target reconstruction need as the second parameter set associated with postprocessing. The second user input may indicate the at least two target reconstruction needs, or at least two target postprocessing parameter sets of the at least two target reconstruction needs. In some embodiments, the processing device 140 may receive the second user input via an interface of the medical imaging system 100. For example, the user may select or click at least two icons (or buttons) corresponding to the at least two target postprocessing parameter sets respectively on the interface after the initial image data is generated.
In some embodiments, the second user input and the first user input may be integrated into a single user input. The single user input may indicate the at least two target reconstruction parameter sets and the at least two target postprocessing parameter sets of the at least two target reconstruction needs. For example, a user may only need to select or click at least two icons (e.g., buttons) corresponding to the at least two target reconstruction needs on the interface before image reconstruction to generate the single user input. The single user input may trigger the image reconstruction process, i.e., the processing device 140 may perform operations S1-S5 in response to the single user input.
S5, for each of the at least two second parameter sets, the processing device 140 may generate a target image by performing, according to the second parameter set, a postprocessing operation on the initial image data, which is similar to operation 509.
Accordingly, the processing device 140 may generate at least two target images corresponding to the at least two target reconstruction needs simultaneously, during which only one reconstruction operation is performed, which saves computing resources and improves the efficiency of image reconstruction.
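Merely by way of example, operations S1-S5 may be sketched end to end as follows, with a single reconstruction feeding several postprocessing passes; all functions and parameter sets are illustrative stand-ins:

```python
import numpy as np

def reconstruct(raw, first_params):       # S3: performed only once
    n = first_params["matrix"]
    return raw.reshape(n, n)

def postprocess(initial, second_params):  # S5: performed once per target need
    step = second_params["downsample"]
    return initial[::step, ::step]

first_params = {"matrix": 256}                            # S1: first parameter set
raw = np.random.rand(256 * 256)                           # S2: raw data
initial = reconstruct(raw, first_params)                  # S3: initial image data
second_sets = [{"downsample": 1}, {"downsample": 2}]      # S4: postprocessing sets
targets = [postprocess(initial, p) for p in second_sets]  # S5: target images

print([t.shape for t in targets])  # [(256, 256), (128, 128)]
```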
In some embodiments, the initial image data (e.g., an initial image) of the object generated in S3 may be stored (e.g., online or offline) in a storage device as described elsewhere in the present disclosure, which is similar to operation 507. The processing device 140 may retrieve or obtain the initial image of the object from the storage device. Then, the processing device 140 may obtain at least two parameter sets associated with postprocessing according to a user input (e.g., similar to the second user input). For each parameter set of the at least two parameter sets associated with postprocessing, the processing device 140 may generate a target image by performing, according to each parameter set, a postprocessing operation on the initial image, which is similar to operation S5.
According to some embodiments of the present disclosure, an interface for image reconstruction is provided. The interface may be configured to receive one or more user inputs (e.g., the first user input, the second user input, etc.) for triggering one or more operations associated with image reconstruction. In some embodiments, the interface may include at least two icons (e.g., buttons) corresponding to the at least two target reconstruction needs respectively. Each of the at least two target reconstruction needs may correspond to a reconstruction parameter set and a postprocessing parameter set. The at least two icons may be selected (or clicked) by the user together or individually. For example, in response to two or more of the at least two icons being selected (e.g., clicked), the processing device 140 may determine a plurality of reconstruction parameters (e.g., the first parameter set described in S1) based on the two or more corresponding reconstruction parameter sets. The processing device 140 may obtain raw data of an object generated by an imaging device, which is similar to S2. The processing device 140 may generate initial image data of the object by performing, according to the plurality of reconstruction parameters, a reconstruction operation on the raw data, which is similar to S3. The processing device 140 may generate a target image by performing, according to each of the at least two postprocessing parameter sets, a postprocessing operation on the initial image data, which is similar to S5. As another example, the interface may include at least two first icons for receiving the first user input in S1, and at least two second icons for receiving the second user input in S4.
Generally, the user may select or input a target reconstruction need before image reconstruction. The processing device 140 may receive the reconstruction parameter set and the postprocessing parameter set of the target reconstruction need before image reconstruction and perform the image reconstruction from the beginning to the end (e.g., from obtaining raw data to generating a target image) continuously for each image reconstruction. Thus, at least two image reconstructions corresponding to the at least two target reconstruction needs may be performed individually. According to the systems and methods of the present disclosure, the image reconstruction may be divided into two stages (or parts). A first stage may be configured for initial image reconstruction to generate initial image data (e.g., an initial image). A second stage may be configured for postprocessing to generate a target image. For the at least two target reconstruction needs, the processing device 140 may generate the initial image data by performing the reconstruction operation only once according to the first stage, e.g., as described in operations 501-505 and/or S1-S3. The initial image data may meet the at least two target reconstruction needs. Then, the processing device 140 may receive or obtain the postprocessing parameter set of each of the at least two target reconstruction needs for generating the target image according to the second stage. Accordingly, the consumption of resources and time of the at least two image reconstructions may be reduced in comparison with performing the at least two image reconstructions independently.
Having thus described the basic concepts, it may be rather apparent to those skilled in the art after reading this detailed disclosure that the foregoing detailed disclosure is intended to be presented by way of example only and is not limiting. Various alterations, improvements, and modifications may occur to those skilled in the art, though not expressly stated herein. These alterations, improvements, and modifications are intended to be suggested by this disclosure and are within the spirit and scope of the exemplary embodiments of this disclosure.
Moreover, certain terminology has been used to describe embodiments of the present disclosure. For example, the terms “one embodiment, ” “an embodiment, ” and/or “some embodiments” mean that a particular feature, structure or characteristic described in connection with the embodiment is included in at least one embodiment of the present disclosure. Therefore, it is emphasized and should be appreciated that two or more references to “an embodiment” or “one embodiment” or “an alternative embodiment” in various portions of this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures or characteristics may be combined as suitable in one or more embodiments of the present disclosure.
Further, it will be appreciated by one skilled in the art that aspects of the present disclosure may be illustrated and described herein in any of a number of patentable classes or contexts, including any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof. Accordingly, aspects of the present disclosure may be implemented entirely in hardware, entirely in software (including firmware, resident software, micro-code, etc.) or in an implementation combining software and hardware that may all generally be referred to herein as a "unit," "module," or "system." Furthermore, aspects of the present disclosure may take the form of a computer program product embodied in one or more computer-readable media having computer-readable program code embodied thereon.
A computer-readable signal medium may include a propagated data signal with computer-readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated signal may take any of a variety of forms, including electromagnetic, optical, or the like, or any suitable combination thereof. A computer-readable signal medium may be any computer-readable medium that is not a computer-readable storage medium and that may communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer-readable signal medium may be transmitted using any appropriate medium, including wireless, wireline, optical fiber cable, RF, or the like, or any suitable combination of the foregoing.
Computer program code for carrying out operations for aspects of the present disclosure may be written in any combination of one or more programming languages, including an object-oriented programming language such as Java, Scala, Smalltalk, Eiffel, JADE, Emerald, C++, C#, VB.NET, Python or the like, conventional procedural programming languages, such as the "C" programming language, Visual Basic, Fortran, Perl, COBOL, PHP, ABAP, dynamic programming languages such as Python, Ruby, and Groovy, or other programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider), in a cloud computing environment, or offered as a service such as Software as a Service (SaaS).
Furthermore, the recited order of processing elements or sequences, or the use of numbers, letters, or other designations, therefore, is not intended to limit the claimed processes and methods to any order except as may be specified in the claims. Although the above disclosure discusses through various examples what is currently considered to be a variety of useful embodiments of the disclosure, it is to be understood that such detail is solely for that purpose and that the appended claims are not limited to the disclosed embodiments, but, on the contrary, are intended to cover modifications and equivalent arrangements that are within the spirit and scope of the disclosed embodiments. For  example, although the implementation of various components described above may be embodied in a hardware device, it may also be implemented as a software-only solution, e.g., an installation on an existing server or mobile device.
Similarly, it should be appreciated that in the foregoing description of embodiments of the present disclosure, various features are sometimes grouped together in a single embodiment, figure, or description thereof to streamline the disclosure and aid in the understanding of one or more of the various inventive embodiments. This method of disclosure, however, is not to be interpreted as reflecting an intention that the claimed subject matter requires more features than are expressly recited in each claim. Rather, inventive embodiments lie in less than all features of a single foregoing disclosed embodiment.
In some embodiments, the numbers expressing quantities, properties, and so forth, used to describe and claim certain embodiments of the application are to be understood as being modified in some instances by the term "about," "approximate," or "substantially." For example, "about," "approximate," or "substantially" may indicate a ±20% variation of the value it describes, unless otherwise stated. Accordingly, in some embodiments, the numerical parameters set forth in the written description and attached claims are approximations that may vary depending upon the desired properties sought to be obtained by a particular embodiment. In some embodiments, the numerical parameters should be construed in light of the number of reported significant digits and by applying ordinary rounding techniques. Notwithstanding that the numerical ranges and parameters setting forth the broad scope of some embodiments of the application are approximations, the numerical values set forth in the specific examples are reported as precisely as practicable.
Each of the patents, patent applications, publications of patent applications, and other material, such as articles, books, specifications, publications, documents, things, and/or the like, referenced herein is hereby incorporated herein by this reference in its entirety for all purposes, excepting any prosecution file history associated with same, any of same that is inconsistent with or in conflict with the present document, or any of same  that may have a limiting effect as to the broadest scope of the claims now or later associated with the present document. By way of example, should there be any inconsistency or conflict between the description, definition, and/or the use of a term associated with any of the incorporated material and that associated with the present document, the description, definition, and/or the use of the term in the present document shall prevail.
In closing, it is to be understood that the embodiments of the application disclosed herein are illustrative of the principles of the embodiments of the application. Other modifications that may be employed may be within the scope of the application. Thus, by way of example, but not of limitation, alternative configurations of the embodiments of the application may be utilized in accordance with the teachings herein. Accordingly, embodiments of the present application are not limited to that precisely as shown and described.

Claims (31)

  1. A method for image reconstruction, which is implemented on a computing device including at least one processor and at least one storage device, comprising:
    obtaining raw data of an object generated by an imaging device;
    generating preprocessed data by preprocessing the raw data;
    generating initial image data by performing, according to a plurality of reconstruction parameters associated with a plurality of potential reconstruction needs, a reconstruction operation on the preprocessed data, such that the initial image data meets the plurality of potential reconstruction needs;
    storing the initial image data;
    retrieving first target data from the initial image data according to a first target reconstruction need; and
    generating a first target image of the object by performing a postprocessing operation on the first target data, such that the first target image meets the first target reconstruction need.
  2. The method of claim 1, wherein the first target reconstruction need indicates at least one of a position and/or a scope of the first target data in the initial image data, or a size of pixels or voxels of the first target image.
  3. The method of claim 1, wherein a size of pixels or voxels of the first target image is no less than sizes of pixels or voxels of an initial image sequence corresponding to the initial image data.
  4. The method of claim 1, wherein the plurality of reconstruction parameters include at least one of a reconstruction slice thickness, a reconstruction interval, a reconstruction field of view (FOV) , reconstruction start and end positions, or a reconstruction matrix.
  5. The method of claim 1, further comprising:
    retrieving second target data from the initial image data according to a second target reconstruction need; and
    generating a second target image of the object by performing a postprocessing operation on the second target data, such that the second target image meets the second target reconstruction need,
    wherein the second target reconstruction need indicates at least one of a position and/or a scope of the second target data in the initial image data, or a size of pixels or voxels of the second target image.
  6. The method of claim 5, wherein the first target image of the object and the second target image of the object are generated in parallel.
  7. The method of claim 5 or 6, wherein
    the first target reconstruction need is associated with a first set of basic reconstruction parameters;
    the second target reconstruction need is associated with a second set of basic reconstruction parameters; and
    at least one of the first set of basic reconstruction parameters or the second set of basic reconstruction parameters is coarser than a corresponding one of the plurality of reconstruction parameters.
  8. The method of any one of claims 1-7, wherein the preprocessing the raw data includes:
    determining whether the raw data is corrupted,
    determining whether the raw data needs correction, or
    determining whether the raw data needs noise reduction.
  9. The method of any one of claims 1-6, wherein the postprocessing operation includes at least one of an artifact removal operation, a correction operation, or a merging operation.
  10. The method of any one of claims 1-9, further comprising:
    obtaining a request for adjusting the plurality of reconstruction parameters; and
    updating the plurality of reconstruction parameters according to the request.
  11. The method of any one of claims 1-10, further comprising:
    causing the first target image to be displayed for a user.
  12. The method of any one of claims 1-11, further comprising:
    determining whether there is an erroneous result in performing the method; and
    in response to a determination that there is an erroneous result,
    identifying, from operations of the method, an error operation; and
    reporting the error operation.
  13. A system for image reconstruction, comprising:
    a storage device storing a set of instructions;
    at least one processor in communication with the storage device, wherein when executing the set of instructions, the at least one processor is configured to cause the system to perform operations including:
    obtaining raw data of an object generated by an imaging device;
    generating preprocessed data by preprocessing the raw data;
    generating initial image data by performing, according to a plurality of reconstruction parameters associated with a plurality of potential reconstruction needs, a reconstruction operation on the preprocessed data, such that the initial image data meets the plurality of potential reconstruction needs;
    storing the initial image data;
    retrieving first target data from the initial image data according to a first target reconstruction need; and
    generating a first target image of the object by performing a postprocessing operation on the first target data, such that the first target image meets the first target reconstruction need.
  14. The system of claim 13, wherein the first target reconstruction need indicates at least one of a position and/or a scope of the first target data in the initial image data, or a size of pixels or voxels of the first target image.
  15. The system of claim 13, wherein a size of pixels or voxels of the first target image is no less than sizes of pixels or voxels of an initial image sequence corresponding to the initial image data.
  16. The system of claim 13, wherein the plurality of reconstruction parameters include at least one of a reconstruction slice thickness, a reconstruction interval, a reconstruction field of view (FOV) , reconstruction start and end positions, or a reconstruction matrix.
  17. The system of claim 13, the operations further comprising:
    retrieving second target data from the initial image data according to a second target reconstruction need; and
    generating a second target image of the object by performing a postprocessing operation on the second target data, such that the second target image meets the second target reconstruction need,
    wherein the second target reconstruction need indicates at least one of a position or a scope of the second target data in the initial image data, or a size of pixels or voxels of the second target image.
  18. The system of claim 17, wherein the first target image of the object and the second target image of the object are generated in parallel.
  19. The system of claim 17 or 18, wherein
    the first target reconstruction need is associated with a first set of basic reconstruction parameters;
    the second target reconstruction need is associated with a second set of basic reconstruction parameters; and
    at least one of the first set of basic reconstruction parameters or the second set of basic reconstruction parameters is coarser than a corresponding one of the plurality of reconstruction parameters.
  20. The system of any one of claims 13-19, wherein the preprocessing the raw data includes:
    determining whether the raw data is corrupted,
    determining whether the raw data needs correction, or
    determining whether the raw data needs noise reduction.
  21. The system of any one of claims 13-18, wherein the postprocessing operation includes at least one of an artifact removal operation, a correction operation, or a merging operation.
  22. The system of any one of claims 13-21, the operations further comprising:
    obtaining a request for adjusting the plurality of reconstruction parameters; and
    updating the plurality of reconstruction parameters according to the request.
  23. The system of any one of claims 13-22, the operations further comprising:
    causing the first target image to be displayed for a user.
  24. The system of any one of claims 13-23, the operations further comprising:
    determining whether there is an erroneous result in performing the operations; and
    in response to a determination that there is an erroneous result,
    identifying, from the operations, an error operation; and
    reporting the error operation.
  25. A system for image reconstruction, comprising:
    an obtaining module configured to obtain raw data of an object generated by an imaging device;
    a preprocess module configured to generate preprocessed data by preprocessing the raw data;
    a reconstruction module configured to:
    generate initial image data by performing, according to a plurality of reconstruction parameters associated with a plurality of potential reconstruction needs, a reconstruction operation on the preprocessed data, such that the initial image data meets the plurality of potential reconstruction needs; and
    store the initial image data; and
    one or more pick modules each of which is configured to retrieve target data from the initial image data according to a target reconstruction need, wherein the one or more pick modules are arranged in parallel; and
    one or more postprocess modules each of which corresponds to one of the one or more pick modules and is configured to generate a target image of the object by performing a postprocessing operation on the target data, such that the target image meets the target reconstruction need.
  26. A non-transitory computer readable medium, comprising executable instructions that, when executed by at least one processor, direct the at least one processor to perform a method for image reconstruction, the method comprising:
    obtaining raw data of an object generated by an imaging device;
    generating preprocessed data by preprocessing the raw data;
    generating initial image data by performing, according to a plurality of reconstruction parameters associated with a plurality of potential reconstruction needs, a reconstruction operation on the preprocessed data, such that the initial image data meets the plurality of potential reconstruction needs;
    storing the initial image data;
    retrieving first target data from the initial image data according to a first target reconstruction need; and
    generating a first target image of the object by performing a postprocessing operation on the first target data, such that the first target image meets the first target reconstruction need.
  27. A method for image reconstruction, which is implemented on a computing device including at least one processor and at least one storage device, comprising:
    obtaining a first parameter set associated with reconstruction according to a first user input;
    obtaining raw data of an object generated by an imaging device;
    generating initial image data of the object by performing, according to the first parameter set, a reconstruction operation on the raw data;
    obtaining a second parameter set associated with postprocessing according to a second user input; and
    generating a target image by performing, according to the second parameter set, a postprocessing operation on the initial image data.
  28. A method for image reconstruction, which is implemented on a computing device including at least one processor and at least one storage device, comprising:
    obtaining raw data of an object generated by an imaging device;
    generating preprocessed data by preprocessing the raw data;
    obtaining a reconstruction parameter set associated with at least two target reconstruction needs; and
    generating an initial image by performing, according to the reconstruction parameter set, a reconstruction operation on the preprocessed data, wherein the initial image meets the at least two target reconstruction needs.
  29. A method for image reconstruction, which is implemented on a computing device including at least one processor and at least one storage device, comprising:
    obtaining raw data of an object generated by an imaging device;
    generating initial image data of the object by performing a reconstruction operation on the raw data;
    obtaining a parameter set associated with postprocessing according to a user input; and
    generating a target image by performing, according to the parameter set, a postprocessing operation on the initial image data.
  30. A method for image reconstruction, which is implemented on a computing device including at least one processor and at least one storage device, comprising:
    obtaining an initial image of an object;
    obtaining at least two parameter sets associated with postprocessing according to a user input;
    for each parameter set of the at least two parameter sets, generating a target image by performing, according to the parameter set, a postprocessing operation on the initial image.
  31. An interface, comprising:
    at least two icons corresponding to at least two target reconstruction needs respectively, wherein each of the at least two target reconstruction needs corresponds to a reconstruction parameter set and a postprocessing parameter set,
    wherein, in response to the at least two icons being selected, operations for image reconstruction are triggered, the operations including:
    determining a plurality of reconstruction parameters based on the at least two reconstruction parameter sets;
    obtaining raw data of an object generated by an imaging device;
    generating initial image data of the object by performing, according to the plurality of reconstruction parameters, a reconstruction operation on the raw data; and
    generating a target image by performing, according to each of the at least two postprocessing parameter sets, a postprocessing operation on the initial image data.

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/CN2021/118380 WO2023039736A1 (en) 2021-09-15 2021-09-15 Systems and methods for image reconstruction
CN202180102293.2A CN117940072A (en) 2021-09-15 2021-09-15 Image reconstruction method and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2021/118380 WO2023039736A1 (en) 2021-09-15 2021-09-15 Systems and methods for image reconstruction

Publications (1)

Publication Number Publication Date
WO2023039736A1 (en) 2023-03-23

Family

ID=85602254

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/118380 WO2023039736A1 (en) 2021-09-15 2021-09-15 Systems and methods for image reconstruction

Country Status (2)

Country Link
CN (1) CN117940072A (en)
WO (1) WO2023039736A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6141398A (en) * 1998-08-25 2000-10-31 General Electric Company Protocol driven image reconstruction, display, and processing in a multislice imaging system
US6745066B1 (en) * 2001-11-21 2004-06-01 Koninklijke Philips Electronics, N.V. Measurements with CT perfusion
CN102525523A (en) * 2010-12-20 2012-07-04 Ge医疗系统环球技术有限公司 Image browser and CT (Computerized Tomography) equipment
CN106296764A (en) * 2016-08-02 2017-01-04 上海联影医疗科技有限公司 Image rebuilding method and system
US20200242815A1 (en) * 2019-01-28 2020-07-30 Wisconsin Alumni Research Foundation System for Harmonizing Medical Image Presentation

Also Published As

Publication number Publication date
CN117940072A (en) 2024-04-26

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21957027

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 202180102293.2

Country of ref document: CN

NENP Non-entry into the national phase

Ref country code: DE