CN109272562B - System and method for iterative reconstruction

Info

Publication number
CN109272562B
CN109272562B
Authority
CN
China
Prior art keywords
image
air mask
raw data
seed
air
Prior art date
Legal status
Active
Application number
CN201810954772.7A
Other languages
Chinese (zh)
Other versions
CN109272562A (en)
Inventor
王思睿
曹文静
亚历山大·扎米亚京
喻智聪
Current Assignee
Shanghai United Imaging Healthcare Co Ltd
Original Assignee
Shanghai United Imaging Healthcare Co Ltd
Priority date
Filing date
Publication date
Application filed by Shanghai United Imaging Healthcare Co Ltd filed Critical Shanghai United Imaging Healthcare Co Ltd
Priority to CN201810954772.7A (granted as CN109272562B)
Priority to US16/206,167 (granted as US10977840B2)
Publication of CN109272562A
Priority to US17/221,012 (granted as US11481935B2)
Application granted
Publication of CN109272562B

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/00 2D [Two Dimensional] image generation
    • G06T11/003 Reconstruction from projections, e.g. tomography
    • G06T2211/00 Image generation
    • G06T2211/40 Computed tomography
    • G06T2211/416 Exact reconstruction

Abstract

The present application relates to a system and method of iterative reconstruction. The system and method may acquire raw data detected by an imaging device from at least one angle; generate a first seed image by performing filtered backprojection on the raw data; determine a first air mask by performing a minimum backprojection (BP) on the raw data; and reconstruct at least one image by performing an iterative reconstruction based on the first seed image, the first air mask, and the raw data.

Description

System and method for iterative reconstruction
Technical Field
The present application relates to image reconstruction, and more particularly to a system and method for iterative reconstruction.
Background
Non-invasive imaging techniques can obtain image data of the internal structure of an object without performing invasive procedures on the object. Imaging techniques such as Computed Tomography (CT), Positron Emission Tomography (PET), Magnetic Resonance Imaging (MRI), and the like utilize various physical principles to acquire image data and provide images of objects. One of the methods of image reconstruction is iterative reconstruction. Iterative reconstruction refers to a technique of forming an image by repeatedly adjusting an existing estimate according to how well the measured data match simulated measurements computed from the current estimate of the image. However, iterative reconstruction techniques are typically computationally intensive and time consuming. Slow reconstruction speed is one of the biggest challenges of iterative reconstruction. Therefore, it is desirable to speed up iterative reconstruction.
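To make the loop described above concrete, here is a minimal, illustrative Python sketch of a generic iterative reconstruction update (not the accelerated method disclosed in this application); `forward_project`, `back_project`, `image_shape`, and the step size are hypothetical placeholders for a system-specific projector pair and geometry.

```python
import numpy as np

def iterative_reconstruction(raw_data, forward_project, back_project,
                             image_shape, n_iters=20, step=1.0):
    """Generic iterative loop: repeatedly compare simulated measurements
    from the current estimate against the measured raw data, and correct
    the estimate with the backprojected mismatch."""
    image = np.zeros(image_shape)                      # initial estimate
    for _ in range(n_iters):
        simulated = forward_project(image)             # simulate measurement
        residual = raw_data - simulated                # data mismatch
        image = image + step * back_project(residual)  # adjust estimate
    return image
```

Every element is visited in every forward and backward projection of every iteration, which is why the loop is computationally expensive; the air mask introduced in this application removes elements known to be air from that work.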
Disclosure of Invention
The present application aims to provide a system and a method for iterative reconstruction, which can accelerate the iterative reconstruction by performing minimum backprojection to generate an air mask and performing iterative reconstruction based on the generated air mask.
In order to achieve the above purpose, the technical solution provided by the present application is as follows:
according to an aspect of the present application, a method of iterative reconstruction is provided, which may be used to reconstruct an image, which may include at least one element, each of which may be a pixel or a voxel. The method may further include acquiring raw data detected by the imaging device from at least one angle; generating a first type image by performing filtered back-projection on the raw data; determining a first air mask by performing a minimum backprojection on the raw data; and reconstructing at least one image by performing an iterative reconstruction based on the first seed image, the first air mask and the raw data.
In some embodiments, the at least one image may comprise a 3D image or a volumetric image.
In some embodiments, the first air mask may include at least one element having zero attenuation of radiation rays in the imaging device.
In some embodiments, determining the first air mask by performing a minimum backprojection on the raw data may further include, for each element of the at least one image, performing the minimum backprojection by designating, as the backprojection value of that element, the minimum of a portion of the raw data or of modified data, where the portion of the raw data is associated with that element and detected from the at least one angle, and the modified data are generated by interpolation based on the raw data; generating a second seed image based on at least one backprojection value associated with the at least one element; and generating the first air mask by performing thresholding on the second seed image based on a second threshold.
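A minimal sketch of the minimum-backprojection step just described, under stated assumptions: `ray_values_for_element` is a hypothetical helper standing in for the system geometry (it returns, for one image element, the raw or interpolated data value of every ray through that element at every angle), and air is assumed to yield small line-integral values, so the mask marks elements whose minimum falls below the second threshold.

```python
import numpy as np

def minimum_bp_air_mask(image_shape, ray_values_for_element,
                        second_threshold):
    """Minimum backprojection followed by thresholding (illustrative)."""
    second_seed = np.empty(image_shape)
    for idx in np.ndindex(image_shape):
        # An element inside the object is crossed by attenuating material
        # at every angle, so its minimum stays high; an air element usually
        # has at least one angle at which its ray misses the object,
        # driving the minimum toward zero.
        second_seed[idx] = min(ray_values_for_element(idx))
    air_mask = second_seed < second_threshold  # thresholding (2nd threshold)
    return second_seed, air_mask
```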
In some embodiments, the raw data may be detected by at least one detector of the imaging device, and the at least one detector may be aligned along a channel direction and a row direction. Determining the first air mask by performing a minimum backprojection on the raw data may further include performing one-dimensional low-pass filtering on the raw data in the channel direction before performing the minimum backprojection.
In some embodiments, the raw data may correspond to a sinogram, and determining the first air mask by performing a minimum backprojection on the raw data may further include performing two-dimensional low-pass filtering on the raw data before performing the minimum backprojection.
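The two pre-filtering variants in the preceding paragraphs might be sketched as follows, assuming the raw data are arranged as an array of shape (n_views, n_rows, n_channels) and using a simple moving-average kernel as a stand-in for whichever low-pass filter an implementation actually chooses.

```python
import numpy as np
from scipy.ndimage import uniform_filter, uniform_filter1d

def prefilter_raw_data(views, mode="1d", size=5):
    """Low-pass filter the raw data before minimum backprojection."""
    if mode == "1d":
        # one-dimensional smoothing along the channel direction only
        return uniform_filter1d(views, size=size, axis=-1)
    # two-dimensional smoothing; here over (rows, channels) within each
    # view, though which two axes form the sinogram plane is an
    # implementation choice
    return uniform_filter(views, size=(1, size, size))
```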
In some embodiments, determining the first air mask by performing a minimum backprojection on the raw data may further include: generating a second air mask by performing thresholding on the first seed image based on a third threshold; and modifying the first air mask by performing a Boolean operation based on the first air mask and the second air mask.
In some embodiments, determining the first air mask by performing a minimum backprojection on the raw data may further include: modifying the first seed image by at least one of subtracting an outer ring background from the first seed image or performing two-dimensional low-pass filtering on the first seed image; generating a second air mask by performing thresholding on the modified first seed image based on a third threshold; and modifying the first air mask by performing a Boolean operation based on the first air mask and the second air mask.
In some embodiments, determining the first air mask by performing a minimum backprojection on the raw data may further include: performing morphological operations on the modified first air mask.
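Pulled together, the refinement steps in the three preceding paragraphs might look like the sketch below; the direction of the threshold test, the choice of AND as the Boolean operation, and opening as the morphological operation are illustrative assumptions rather than details fixed by the text.

```python
import numpy as np
from scipy.ndimage import binary_dilation, binary_erosion

def refine_air_mask(first_air_mask, first_seed_image, third_threshold):
    """Threshold the FBP seed, combine masks, then clean up morphologically."""
    # Second air mask: air assumed to reconstruct near zero attenuation.
    second_air_mask = first_seed_image < third_threshold
    # Boolean combination: AND keeps only elements both masks call air.
    modified = np.logical_and(first_air_mask, second_air_mask)
    # Morphological opening (erosion then dilation) removes isolated
    # misclassified elements from the modified mask.
    return binary_dilation(binary_erosion(modified))
```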
In some embodiments, the first air mask may be determined from at least one block of pixels, wherein each block of pixels of the at least one block of pixels may include N × M pixels.
In some embodiments, the iterative reconstruction may include at least one forward projection and at least one backward projection, and the first air mask may be used in the at least one forward projection and the at least one backward projection.
In some embodiments, the first air mask may be updated in at least one iteration of the iterative reconstruction.
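The sketch below shows how the first air mask might be used inside one iteration so that both the forward projection and the backprojection update skip air; `forward_project` and `back_project` remain hypothetical operators, and pinning air elements to zero is one plausible convention.

```python
import numpy as np

def masked_iteration(image, air_mask, raw_data, forward_project,
                     back_project):
    """One iterative-reconstruction update restricted to non-air elements."""
    non_air = ~air_mask
    image = np.where(non_air, image, 0.0)    # air elements stay exactly zero
    simulated = forward_project(image)       # FP may skip zeroed air elements
    residual = raw_data - simulated
    correction = back_project(residual)      # BP likewise ignores air elements
    image[non_air] += correction[non_air]    # update non-air elements only
    return image
```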
In some embodiments, the first seed image may be modified by subtracting an outer ring background from the first seed image.
In accordance with another aspect of the present application, there is provided a system for iterative reconstruction, the system being for reconstructing an image. The system may include at least one processor and a memory configured to store instructions, wherein, when executing the instructions, the at least one processor is configured to: acquire raw data detected by an imaging device from at least one angle; generate a first seed image by performing filtered backprojection on the raw data; determine a first air mask by performing a minimum backprojection on the raw data; and reconstruct at least one image by performing an iterative reconstruction based on the first seed image, the first air mask, and the raw data.
According to another aspect of the present application, there is provided a system for iterative reconstruction, the system being for reconstructing an image. The system may include an acquisition module, a seed image generation module, an air mask generation module, and a reconstruction module. The acquisition module may be configured to acquire raw data detected by an imaging device from at least one angle; the seed image generation module may be configured to generate a first seed image by performing filtered backprojection on the raw data; the air mask generation module may be configured to determine a first air mask by performing a minimum backprojection on the raw data; and the reconstruction module may be configured to reconstruct at least one image by performing an iterative reconstruction based on the first seed image, the first air mask, and the raw data.
According to another aspect of the present application, there is provided a computer-readable storage medium, which may store computer instructions that, when executed, perform the method of iterative reconstruction.
By adopting the above technical solutions, the present application achieves the following technical effects:
1. The air mask generated by minimum backprojection has high precision and good quality;
2. Performing iterative reconstruction based on the air mask increases the speed of the iterative reconstruction and improves the quality of the reconstructed image.
Additional features will be set forth in part in the description which follows and in part will become apparent to those having ordinary skill in the art upon examination of the following or may be learned from the manufacture or operation of the embodiments. The features of the present invention may be realized and attained by practice or use of various aspects of the methodologies, instrumentalities and combinations particularly pointed out hereinafter.
Drawings
The present application will be further described in conjunction with the exemplary embodiments. These exemplary embodiments will be described in detail by means of the accompanying drawings. The figures are not drawn to scale. These embodiments are non-limiting exemplary embodiments, in which like numerals represent like structures throughout the several views of the drawings, and in which:
FIG. 1 is a schematic view of an exemplary imaging system shown in accordance with some embodiments of the present application;
FIG. 2 is a schematic diagram of exemplary hardware and/or software components of a computing device that may be used to implement a processing device according to some embodiments of the present application;
FIG. 3 is a schematic diagram of exemplary hardware and/or software components of a mobile device shown in accordance with some embodiments of the present application;
FIG. 4 is a schematic illustration of an exemplary processing device according to some embodiments of the present application;
FIG. 5 is a flow diagram of an exemplary process for reconstructing at least one image, shown in accordance with some embodiments of the present application;
FIG. 6 is a flow chart illustrating an exemplary process of generating an air mask according to some embodiments of the present application;
FIGS. 7A-7D are schematic diagrams of exemplary seed images of an object shown according to some embodiments of the present application;
FIGS. 8A-8D are schematic diagrams of exemplary seed images of an object shown in accordance with some embodiments of the present application;
FIG. 9 is a flow chart of an exemplary process of modifying an air mask, shown in accordance with some embodiments of the present application;
FIG. 10A is a schematic diagram of an exemplary process of generating an air mask, according to some embodiments of the present application;
FIG. 10B is a schematic illustration of an exemplary process of generating an air mask, shown in accordance with some embodiments of the present application;
FIG. 11A shows an air mask seed image corresponding to the air mask depicted in FIG. 10A;
FIG. 11B shows an air mask seed image corresponding to the air mask depicted in FIG. 10B;
FIG. 12A is a schematic illustration of an exemplary process of generating an air mask, shown in accordance with some embodiments of the present application;
FIG. 12B is a schematic view of another exemplary process of generating an air mask, shown in accordance with some embodiments of the present application;
FIGS. 13A and 13B are exemplary air mask seed images of a portion of an object without truncation, according to some embodiments of the present application;
FIG. 13C illustrates an exemplary filtered back projection seed image generated based on the same raw data of the same object as in FIGS. 13A and 13B;
FIGS. 13D and 13E are exemplary air mask seed images with another portion of the object truncated, according to some embodiments of the present application;
FIG. 13F illustrates an exemplary filtered back projection seed image generated based on the same raw data of the same object as in FIGS. 13D and 13E;
FIG. 14A is a flow chart of an exemplary process of reconstructing at least one image, shown in accordance with some embodiments of the present application;
FIG. 14B is a schematic illustration of an exemplary outer ring shown according to some embodiments of the present application;
FIG. 15 is a schematic diagram of an exemplary seed image correction module shown in accordance with some embodiments of the present application;
FIG. 16 is a flow diagram illustrating an exemplary process of generating a corrected seed image according to some embodiments of the present application;
FIG. 17 is a flow diagram of an exemplary process of generating a background image, shown in accordance with some embodiments of the present application;
FIG. 18 is a schematic diagram of an exemplary seed image of an object shown in accordance with some embodiments of the present application;
FIG. 19 is a schematic diagram of an exemplary intermediate image of an object shown in accordance with some embodiments of the present application;
FIG. 20 is a diagram showing a plurality of reference values in a plurality of rows in the fourth sub-region;
FIG. 21A is a schematic diagram of an exemplary initial circle image shown in accordance with some embodiments of the present application;
FIG. 21B is a schematic illustration of an exemplary background image shown according to some embodiments of the present application;
FIG. 22 is a flow diagram of an exemplary process of modifying an initial circle image, shown in accordance with some embodiments of the present application;
FIG. 23 is a schematic diagram of an exemplary process of modifying an initial circle image, shown in accordance with some embodiments of the present application;
FIG. 24 is a flow diagram of an exemplary process of modifying an initial circle image, shown in accordance with some embodiments of the present application; and
FIG. 25 is a schematic diagram of an exemplary process of modifying an initial circle image, shown in accordance with some embodiments of the present application.
Detailed Description
In the following description, specific details are set forth by way of examples in order to provide a thorough understanding of the present disclosure. It will be apparent to one skilled in the art that the present application may be practiced without the specific details. In other instances, well-known methods, procedures, systems, components, and/or circuits have been described herein at a relatively high-level, in order to avoid unnecessarily obscuring aspects of the present application. Various modifications to the disclosed embodiments will be readily apparent to those skilled in the art, and the general principles defined herein may be applied to other embodiments and applications without departing from the spirit and scope of the present application. Thus, the present application is not limited to the embodiments shown, but is to be accorded the widest scope consistent with the claims.
The terminology used herein is for the purpose of describing particular example embodiments only and is not intended to limit the scope of the present application. As used herein, the singular forms "a," "an," and "the" may include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms "comprises" and/or "comprising," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of at least one other feature, integer, step, operation, element, component, and/or group thereof.
It should be understood that the terms "system," "unit," "module," and/or "block" as used herein are one method to distinguish different components, elements, parts, sections, or assemblies of different levels in ascending order. However, these terms may be replaced by other expressions if they achieve the same purpose.
Generally, a "module," "unit," and/or "block" in this application refers to logic or a set of software instructions stored in hardware, firmware. The "module," "unit," and/or "block" referred to herein can be implemented by software and/or hardware modules, and can be stored in any kind of computer-readable non-transitory medium or other storage device. In some embodiments, software modules/units/blocks may be compiled and linked into one executable program. The software modules herein may respond to information communicated by themselves or other modules/units/blocks and/or may respond upon detection of certain events or interrupts. Software modules/units/blocks configured to execute on a computing device (e.g., processor 210 shown in fig. 2) may be provided on a computer readable medium such as a compact disc, digital video disc, flash drive, magnetic disk, or any other tangible medium, or as a digital download (which may be stored in a compressed or installation package, requiring installation, decompression, or decryption prior to execution). The software code herein may be stored in part or in whole in a memory device of a computing device performing the operations and employed in the operations of the computing device. The software instructions may be embedded in firmware, such as erasable programmable read-only memory (EPROM). It should be further understood that the hardware modules/units/blocks may comprise connected logic components, such as gates and flip-flops, and/or may comprise programmable units, such as programmable gate arrays or processors. The modules/units/blocks or computing device functions described herein may be implemented as software modules/units/blocks, but may be represented in hardware or firmware. Generally, the modules/units/blocks described herein refer to logical modules/units/blocks that may be combined with other modules/units/blocks or divided into sub-modules/sub-units/sub-blocks, regardless of their physical organization or storage.
It will be understood that when a unit, engine, module, or block is referred to as being "on," "connected to," or "coupled to" another unit, engine, module, or block, it may be directly on, connected or coupled to, or in communication with the other unit, engine, module, or block, or intervening units, engines, modules, or blocks may be present, unless the context clearly dictates otherwise. In this application, the term "and/or" may include any one of, or any combination of, the associated listed items.
The features and characteristics of the present application, as well as the methods of operation and functions of the related elements of structure and the combination of parts and economies of manufacture, will become more apparent upon consideration of the following description of the drawings, which form a part of this application. It is to be understood, however, that the drawings are designed solely for the purposes of illustration and description and are not intended as a definition of the limits of the application. It should be understood that the drawings are not to scale.
One aspect of the present application relates to a system and method for accelerating iterative reconstruction. At least one image may be generated by performing an iterative reconstruction. Each of the at least one image may include at least one element. The elements may be pixels or voxels. To speed up iterative reconstruction, the system and method may generate an air mask, which may include only elements representing air. Elements representing air may be ignored in the iterative reconstruction, which may speed up the iterative reconstruction. To generate the air mask, the system and method may acquire raw data detected by the imaging device from at least one angle. The system and method may perform minimum backprojection (BP) by designating the minimum of a portion of the raw data or of modified data as the backprojection value for each element of the at least one image, and generate an air mask seed image (or second seed image) based on the at least one backprojection value associated with the at least one element. The system and method may generate an air mask (also referred to herein as a first air mask) by performing thresholding on the air mask seed image (or second seed image) based on a threshold. To perform iterative reconstruction, the system and method may also generate a seed image (also referred to herein as an initial image estimate) by performing filtered backprojection (FBP) on the raw data. The system and method may generate at least one image by performing an iterative reconstruction based on the seed image (or initial image estimate), the air mask (or first air mask), and the raw data.
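Under the naming conventions of this paragraph, the overall pipeline reduces to four calls; every helper below is a hypothetical placeholder for the corresponding operation rather than an actual API.

```python
def reconstruct(raw_data):
    """End-to-end sketch of the disclosed pipeline (placeholders only)."""
    seed_image = filtered_backprojection(raw_data)   # initial image estimate
    second_seed = minimum_backprojection(raw_data)   # air mask seed image
    first_air_mask = threshold(second_seed)          # binary air mask
    return iterative_reconstruction(seed_image, first_air_mask, raw_data)
```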
Alternatively or additionally, the seed image (i.e., the initial image estimate) may generally include an offset that occurs due to a lack of data in the outer ring. To compensate for this offset, more iterations of the reconstruction may need to be performed. However, the systems and methods disclosed herein may accelerate iterative reconstruction (i.e., reduce the number of iterations of the iterative reconstruction) by generating a corrected seed image. The system and method may determine the corrected seed image by subtracting the outer ring background from the seed image. The outer ring background may correspond to the background of the outer ring. The outer ring may correspond to a region between a (scanned) field of view (FOV) and a gantry aperture of an imaging device. The system and method may generate at least one image by performing an iterative reconstruction based on the corrected seed image and the raw data.
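One plausible reading of this correction is sketched below: the outer ring is modeled as the circular annulus between a FOV radius and a gantry-aperture radius (hypothetical pixel-unit parameters), its mean value stands in for the "outer ring background," and the subtraction is applied inside the ring; all three choices are illustrative assumptions.

```python
import numpy as np

def subtract_outer_ring_background(seed_image, fov_radius_px,
                                   aperture_radius_px):
    """Correct the seed image by removing the offset in the outer ring."""
    h, w = seed_image.shape
    yy, xx = np.ogrid[:h, :w]
    r = np.hypot(yy - (h - 1) / 2.0, xx - (w - 1) / 2.0)
    ring = (r >= fov_radius_px) & (r < aperture_radius_px)
    background = seed_image[ring].mean()   # crude outer-ring background
    corrected = seed_image.copy()
    corrected[ring] -= background          # remove the offset in the ring
    return corrected
```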
FIG. 1 is a schematic view of an exemplary imaging system shown in accordance with some embodiments of the present application. The imaging system 100 may include a Computed Tomography (CT) system, a Computed Tomography Angiography (CTA) system, a Positron Emission Tomography (PET) system, a Single Photon Emission Computed Tomography (SPECT) system, a Magnetic Resonance Imaging (MRI) system, or the like, or a combination thereof. In some embodiments, the medical imaging system may be used as a single-modality system. In some embodiments, the medical imaging system may be used as a multi-modality system (e.g., a PET-CT system).
As shown in fig. 1, the imaging system 100 may include an imaging device 110, a network 120, at least one terminal 130, a processing device 140, and a storage device 150. The components in the imaging system 100 may be connected in various ways. By way of example only, the imaging device 110 may be connected to the processing device 140 through the network 120. As another example, imaging device 110 may be directly connected to processing device 140, as indicated by the dashed double-headed arrow connecting imaging device 110 and processing device 140. As another example, storage device 150 may be connected to processing device 140 directly or through network 120. As another example, terminal 130 may be connected to processing device 140 directly (as indicated by the dashed double-headed arrow connecting terminal 130 and processing device 140) or through network 120.
Taking the CT system as an example, the imaging device 110 may include a gantry 111, a detector module 112, a radiation source 113, and a scanning table 114. The detector module 112 and the radiation source 113 may be mounted opposite each other on the gantry 111. The object may be placed on the scanning table 114 and moved into the detection channel of the imaging device 110. The object may be a biological object (e.g., a patient, an animal) or a non-biological object (e.g., a man-made object). In this application, "object" and "target" are used interchangeably. For ease of illustration, the coordinate system shown in FIG. 1 is introduced. The coordinate system may include an X-axis, a Y-axis, and a Z-axis. The Z-axis may refer to the direction along which an object moves into and/or out of the detection channel of the imaging device 110. The X-axis and the Y-axis may form a plane perpendicular to the Z-axis.
The radiation source 113 may emit radiation rays to scan an object placed on the scanning table 114. The radiation rays may include X-rays, gamma rays, alpha rays, ultraviolet rays, laser light, neutrons, protons, and the like, or combinations thereof. The detector module 112 may receive radiation rays that pass through the object. In some embodiments, the detector module 112 may include at least one detector, which may be arranged in the channel direction and the row direction. The detector may comprise a scintillation detector (e.g., a cesium iodide detector) or a gas detector.
Network 120 may facilitate the exchange of information and/or data. In some embodiments, at least one component of imaging system 100 (e.g., imaging device 110, terminal 130, processing device 140, or storage device 150) may transmit information and/or data to another component in imaging system 100 over network 120. For example, the processing device 140 may obtain raw data related to the object from the imaging device 110 via the network 120. As another example, processing device 140 may obtain user instructions for generating an air mask, correcting a seed image, reconstructing at least one image, etc., via network 120. In some embodiments, the network 120 may be any one of, or a combination of, a wired network or a wireless network. Network 120 may be and/or include a public network (e.g., the Internet), a private network (e.g., a Local Area Network (LAN), a Wide Area Network (WAN)), a wired network (e.g., Ethernet), a wireless network (e.g., an 802.11 network, a Wi-Fi network), a cellular network (e.g., a Long Term Evolution (LTE) network), a frame relay network, a virtual private network ("VPN"), a satellite network, a telephone network, a router, a hub, a switch, a server computer, and/or any combination thereof. By way of example only, network 120 may include a cable network, a wireline network, a fiber-optic network, a telecommunications network, an intranet, the Internet, a Local Area Network (LAN), a Wide Area Network (WAN), a Wireless Local Area Network (WLAN), a Metropolitan Area Network (MAN), the Public Switched Telephone Network (PSTN), a Bluetooth™ network, a ZigBee™ network, a Near Field Communication (NFC) network, etc., or any combination thereof. In some embodiments, network 120 may include at least one network access point. For example, the network 120 may include wired or wireless network access points, such as base stations and/or Internet exchange points, through which at least one component of the imaging system 100 may connect to the network 120 to exchange data and/or information.
The terminal 130 may include a mobile device 130-1, a tablet computer 130-2, a laptop computer 130-3, etc., or any combination thereof. In some embodiments, the mobile device 130-1 may include a smart home device, a wearable device, a smart mobile device, a virtual reality device, an augmented reality device, or the like, or any combination thereof. In some embodiments, the smart home devices may include smart lighting devices, control devices for smart appliances, smart monitoring devices, smart televisions, smart cameras, interphones, and the like, or any combination thereof. In some embodiments, the wearable device may include a bracelet, footwear, glasses, a helmet, a watch, clothing, a backpack, an accessory, or the like, or any combination thereof. In some embodiments, the smart mobile device may include a smartphone, a Personal Digital Assistant (PDA), a gaming device, a navigation device, a point of sale (POS) device, and the like, or any combination thereof. In some embodiments, the virtual reality device and/or the augmented reality device may include a virtual reality helmet, virtual reality glasses, virtual reality eyeshields, an augmented reality helmet, augmented reality glasses, augmented reality eyeshields, and the like, or any combination thereof. For example, the virtual reality device and/or augmented reality device may include Google Glass, Oculus Rift, HoloLens, Gear VR, or the like. In some embodiments, terminal 130 may remotely operate imaging device 110. In some embodiments, terminal 130 may operate imaging device 110 through a wireless connection. In some embodiments, the terminal 130 may receive information and/or instructions input by a user and transmit the received information and/or instructions to the imaging device 110 or the processing device 140 via the network 120. In some embodiments, the terminal 130 may receive information and/or data from the processing device 140. In some embodiments, the terminal 130 may be part of the processing device 140. In some embodiments, terminal 130 may be omitted.
In some embodiments, processing device 140 may process data obtained from imaging device 110, terminal 130, or storage device 150. For example, the processing device 140 may generate a seed image and/or an air mask based on raw data detected by the imaging device 110. As another example, processing device 140 may correct the seed image to determine a corrected seed image. For another example, the processing device 140 may reconstruct at least one image by performing an iterative reconstruction based on user instructions obtained from the terminal 130. The processing device 140 may be a Central Processing Unit (CPU), a Digital Signal Processor (DSP), a System on Chip (SoC), a Microcontroller Unit (MCU), etc., or any combination thereof. In some embodiments, the processing device 140 may be a single server or a group of servers. The server group may be centralized or distributed. In some embodiments, the processing device 140 may be local or remote. For example, processing device 140 may access information and/or data stored in imaging device 110, terminal 130, and/or storage device 150 via network 120. As another example, processing device 140 may be directly connected to imaging device 110, terminal 130, and/or storage device 150 to access stored information and/or data. In some embodiments, the processing device 140 may be implemented on a cloud platform. By way of example only, the cloud platform may include a private cloud, a public cloud, a hybrid cloud, a community cloud, a distributed cloud, an internal cloud, a multi-cloud, etc., or any combination thereof. In some embodiments, the processing device 140 may be implemented on a computing device 200 having at least one component as illustrated in FIG. 2 herein.
Storage device 150 may store data and/or instructions. In some embodiments, the storage device 150 may store data obtained from the terminal 130 and/or the processing device 140. In some embodiments, storage device 150 may store data and/or instructions executable by processing device 140 or used to perform the exemplary methods described herein. In some embodiments, the storage device 150 may include mass storage, removable storage, volatile read-write memory, read-only memory (ROM), etc., or any combination thereof. Exemplary mass storage devices may include magnetic disks, optical disks, solid state disks, and the like. Exemplary removable memories may include flash drives, floppy disks, optical disks, memory cards, compact disks, magnetic tape, and so forth. Exemplary volatile read-write memories may include Random Access Memory (RAM). Exemplary random access memories may include Dynamic Random Access Memory (DRAM), Double Data Rate Synchronous Dynamic Random Access Memory (DDR SDRAM), Static Random Access Memory (SRAM), Thyristor Random Access Memory (T-RAM), and Zero-Capacitance Random Access Memory (Z-RAM), among others. Exemplary read-only memories may include Mask Read-Only Memory (MROM), Programmable Read-Only Memory (PROM), Erasable Programmable Read-Only Memory (EPROM), Electrically Erasable Programmable Read-Only Memory (EEPROM), Compact Disc Read-Only Memory (CD-ROM), Digital Versatile Disc Read-Only Memory (DVD-ROM), and the like. In some embodiments, the storage device 150 may be implemented on a cloud platform. For example only, the cloud platform may include a private cloud, a public cloud, a hybrid cloud, a community cloud, a distributed cloud, an internal cloud, a multi-cloud, and the like, or any combination thereof.
In some embodiments, the storage device 150 may be connected to the network 120 to communicate with at least one component of the imaging system 100 (e.g., the terminal 130, the processing device 140). At least one component in imaging system 100 may access data or instructions stored in storage device 150 via network 120. In some embodiments, the storage device 150 may be directly connected to or in communication with at least one component of the imaging system 100 (e.g., the terminal 130, the processing device 140). In some embodiments, the storage device 150 may be part of the processing device 140.
Fig. 2 is a schematic diagram illustrating exemplary hardware and/or software components of a computing device 200 that may be used to implement the processing device 140 according to some embodiments of the present application. As shown in fig. 2, computing device 200 may include a processor 210, memory 220, input/output (I/O) 230, and communication ports 240.
The processor 210 may execute computer instructions (program code) that, when executed, cause the processing device 140 to perform the functions of the processing device 140 in accordance with the techniques described herein. The computer instructions may include instructions for performing the particular functions described herein, such as routines, programs, objects, components, signals, data structures, procedures, modules, and functions. In some embodiments, processor 210 may process data and/or images obtained from imaging device 110, terminal 130, storage device 150, and/or any other component of imaging system 100. For example, processor 210 may generate an air mask based on raw data detected by imaging device 110. As another example, processor 210 may correct the seed image by subtracting the outer ring background from the seed image. As another example, processor 210 may reconstruct at least one image by performing an iterative reconstruction. In some embodiments, processor 210 may include at least one hardware processor, such as a microcontroller, a microprocessor, a Reduced Instruction Set Computer (RISC) processor, an Application-Specific Integrated Circuit (ASIC), an Application-Specific Instruction-Set Processor (ASIP), a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), a Physics Processing Unit (PPU), a microcontroller unit, a Digital Signal Processor (DSP), a Field-Programmable Gate Array (FPGA), an Advanced RISC Machine (ARM), a Programmable Logic Device (PLD), any circuit or processor capable of performing at least one function, or the like, or any combination thereof.
For illustration only, only one processor is depicted in computing device 200. It should be noted that the computing device 200 in the present application may also include multiple processors. Thus, operations and/or method steps described herein as being performed by one processor may also be performed by multiple processors, either jointly or separately. For example, if a processor of computing device 200 performs process A and process B in the present application, it should be understood that process A and process B may also be performed by two or more different processors in computing device 200, jointly or separately (e.g., a first processor performs process A and a second processor performs process B, or the first and second processors jointly perform processes A and B).
Memory 220 may store data/information obtained from imaging device 110, terminal 130, storage device 150, or any other component of imaging system 100. In some embodiments, memory 220 may include mass storage, removable storage, volatile read-write memory, read-only memory (ROM), etc., or any combination thereof. For example, mass storage may include magnetic disks, optical disks, solid state disks, and so forth. Removable storage may include flash drives, floppy disks, optical disks, memory cards, compact disks, magnetic tape, and the like. The volatile read-write memory may include Random Access Memory (RAM). The random access memory may include Dynamic Random Access Memory (DRAM), Double Data Rate Synchronous Dynamic Random Access Memory (DDR SDRAM), Static Random Access Memory (SRAM), Thyristor Random Access Memory (T-RAM), Zero-Capacitance Random Access Memory (Z-RAM), and the like. The read-only memory may include Mask Read-Only Memory (MROM), Programmable Read-Only Memory (PROM), Erasable Programmable Read-Only Memory (EPROM), Electrically Erasable Programmable Read-Only Memory (EEPROM), Compact Disc Read-Only Memory (CD-ROM), Digital Versatile Disc Read-Only Memory (DVD-ROM), and the like. In some embodiments, memory 220 may store at least one program and/or instructions to perform the exemplary methods described herein. For example, the memory 220 may store a program (e.g., in the form of computer-executable instructions) for the processing device 140 to generate the air mask. As another example, memory 220 may store a program (e.g., in the form of computer-executable instructions) for the processing device 140 to correct the seed image. As another example, the memory 220 may store a program (e.g., in the form of computer-executable instructions) for the processing device 140 to reconstruct at least one image by performing iterative reconstruction.
Input/output 230 may input or output signals, data, and/or information. In some embodiments, input/output 230 may enable a user to interact with processing device 140. In some embodiments, input/output 230 may include an input device and an output device. Exemplary input devices may include a keyboard, mouse, touch screen, microphone, etc., or any combination thereof. Exemplary output devices may include a display device, speakers, printer, projector, etc., or any combination thereof. Exemplary display devices may include Liquid Crystal Displays (LCDs), light Emitting Diode (LED) based displays, flat panel displays, curved displays, television devices, cathode Ray Tubes (CRTs), and the like, or any combination thereof.
The communication port 240 may be connected to a network (e.g., network 120) to facilitate data communication. The communication port 240 may establish a connection between the processing device 140 and the imaging device 110, the terminal 130, or the storage device 150. The connection may be a wired connection, a wireless connection, or a combination thereof to enable data transmission and reception. The wired connection may include an electrical cable, an optical cable, a telephone line, etc., or any combination thereof. The wireless connection may include Bluetooth, Wi-Fi, WiMAX, WLAN, ZigBee, mobile networks (e.g., 3G, 4G, 5G), etc., or a combination thereof. In some embodiments, the communication port 240 may be a standardized port, such as RS232, RS485, and the like. In some embodiments, the communication port 240 may be a specially designed communication port. For example, the communication port 240 may be designed in accordance with the Digital Imaging and Communications in Medicine (DICOM) protocol.
Fig. 3 is a schematic diagram of exemplary hardware and/or software components of a mobile device 300 shown in accordance with some embodiments of the present application. As shown in fig. 3, mobile device 300 may include a communication platform 310, a display 320, a Graphics Processing Unit (GPU) 330, a Central Processing Unit (CPU) 340, an input/output 350, a memory 360, and a storage 390. In some embodiments, any other suitable component, including but not limited to a system bus or a controller (not shown), may also be included in the mobile device 300. In some embodiments, an operating system 370 (e.g., iOS, Android, Windows Phone, etc.) and at least one application 380 may be loaded from storage 390 into memory 360 for execution by CPU 340. The applications 380 may include a browser or any other suitable mobile application for receiving and presenting information related to image processing or other information from the processing device 140. User interaction with the information stream may be accomplished via input/output 350 and provided to processing device 140 and/or other components of imaging system 100 via network 120.
To implement the various modules, units, and their functions described herein, a computer hardware platform may be used as the hardware platform for at least one of the elements described herein. The hardware elements, operating systems, and programming languages of such computers are conventional in nature, and it is assumed that those skilled in the art are sufficiently familiar with these techniques to adapt them for the acceleration of iterative reconstruction as described herein. A computer with user interface elements may be used to implement a Personal Computer (PC) or another type of workstation or terminal device, and, if suitably programmed, may also act as a server. It is believed that those skilled in the art are familiar with the structure, programming, and general operation of such computer equipment, and therefore the drawings should be self-explanatory.
Fig. 4 is a schematic illustration of an exemplary processing device shown in accordance with some embodiments of the present application. Processing device 140 may be implemented on computing device 200 (e.g., processor 210 as shown in fig. 2 or CPU 340 as shown in fig. 3). The processing device 140 may include an acquisition module 402, a seed image generation module 404, an air mask generation module 406, a reconstruction module 408, and a seed image correction module 410.
The acquisition module 402 may be configured to acquire information related to the imaging system 100. The information may include scan data (e.g., raw data), image data, etc., or any combination thereof. For example, acquisition module 402 may acquire raw data detected by an imaging device (e.g., the imaging device 110) from at least one angle. The raw data may reflect attenuation information of radiation rays (e.g., X-rays) that have passed through the object, and may generally be used to generate at least one image associated with the object. The raw data may include at least one set of data corresponding to at least one angle. A set of raw data collected at an angle may also be referred to as a view. In some embodiments, the acquisition module 402 may obtain raw data from at least one component of the imaging system 100 (e.g., the imaging device 110, a terminal (e.g., the terminal 130), a storage device (e.g., the storage device 150), etc.). Alternatively or additionally, the acquisition module 402 may obtain raw data from an external source via the network 120. For example, the acquisition module 402 may obtain raw data from a medical database.
The seed image generation module 404 may be configured to generate a seed image (also referred to as a first seed image). In some embodiments, the first seed image may provide an initial image estimate for the iterative reconstruction. In some embodiments, the seed image generation module 404 may generate the first seed image by performing Filtered Back Projection (FBP) on the raw data obtained by the acquisition module 402. In an exemplary filtered backprojection operation, the seed image generation module 404 may perform filtering on at least one set (e.g., each set) of raw data. In some embodiments, the filtering may be performed using, for example, a high-pass filter, a ramp filter, and the like, or any combination thereof. In some embodiments, each set of raw data (also referred to as each view) may be convolved with the kernel of a filter to generate a filtered view. The seed image generation module 404 may perform backprojection on the at least one filtered view to reconstruct the first seed image.
The air mask generation module 406 may be configured to generate an air mask (also referred to as a first air mask). The first air mask may be used to distinguish elements representing air from elements not representing air in an image. In some embodiments, the air mask generation module 406 may determine the first air mask by performing a minimum Back Projection (BP) on the raw data obtained by the acquisition module 402. In particular, the air mask generation module 406 may perform a minimum backprojection for each element of the image to determine a backprojection value associated with that element. Each element of the image may refer to an element in the image to be reconstructed and may correspond to a spatial point of the object. The air mask generation module 406 may generate a second seed image (also referred to as an air mask seed image) based on at least one backprojection value associated with at least one element corresponding to at least one spatial point of the object. The air mask generation module 406 may then generate the first air mask by performing thresholding on the second seed image based on a threshold (i.e., a second threshold). Details regarding the generation of the first air mask may be found elsewhere in the present application (e.g., fig. 6 and its description).
In some embodiments, the air mask generation module 406 may also be configured to modify the first air mask. For example, the air mask generation module 406 can modify the first air mask by performing a logical operation (also referred to as a Boolean operation) based on the first air mask and a second air mask. In particular, the air mask generation module 406 may perform an OR operation and/or an AND operation on the first air mask and the second air mask to modify the first air mask. In some embodiments, the second air mask may be generated by performing thresholding on the first seed image based on a threshold (i.e., a third threshold). In some embodiments, the second air mask may be different from the first air mask. In some embodiments, the second air mask may be used to modify the first air mask. As another example, the air mask generation module 406 may modify the first air mask by performing morphological operations on the first air mask. The morphological operations may include erosion operations, dilation operations, opening operations, closing operations, and the like, or any combination thereof. Details regarding the modification of the first air mask can be found elsewhere in this application (e.g., fig. 9 and its description).
The reconstruction module 408 may be configured to reconstruct at least one image by performing an iterative reconstruction. Exemplary iterative reconstruction algorithms may include adaptive statistical iterative reconstruction (ASIR), model-based iterative reconstruction (MBIR), iterative reconstruction in image space (IRIS), sinogram-affirmed iterative reconstruction (SAFIRE), dual model-based iterative reconstruction (DMBIR), adaptive iterative dose reduction (AIDR), iterative model reconstruction (IMR), and the like, or any combination thereof. In some embodiments, the reconstruction module 408 may reconstruct at least one image by performing an iterative reconstruction based on the first seed image, the first air mask, and/or the raw data. Alternatively, the reconstruction module 408 may reconstruct at least one image by performing an iterative reconstruction based on the corrected seed image and the raw data. Details regarding image reconstruction may be found elsewhere in this application (e.g., operation 507 of process 500, operation 1407 of process 1400, and descriptions thereof).
Seed image correction module 410 may be configured to correct and/or modify a seed image (e.g., a first seed image) to determine a corrected seed image. In some embodiments, the seed image correction module 410 may perform low pass filtering on the seed image (e.g., the first seed image). For example, the seed image correction module 410 may perform two-dimensional low-pass filtering on the first seed image. As another example, the seed image correction module 410 may perform one-dimensional low-pass filtering in the channel direction of at least one detector of the detector modules 112. Alternatively or additionally, the seed image correction module 410 may subtract the outer ring background from the seed image (e.g., the first seed image) to determine a corrected seed image. The outer ring background may correspond to the background of the outer ring. The outer ring may correspond to a region between a (scanned) field of view (FOV) of the imaging device 110 and a gantry aperture of the imaging device 110. Details regarding the correction of the seed image may be found elsewhere in this application (e.g., operation 901 of process 900, operation 1405 of process 1400, process 1600, and descriptions thereof).
It should be noted that the above description is provided for illustrative purposes only and is not intended to limit the scope of the present application. Various modifications and changes may occur to those skilled in the art in light of the description herein. However, such modifications and changes do not depart from the scope of the present application. For example, processing device 140 may also include a storage module (not shown in FIG. 4). The storage module may be configured to store data generated by the processing device 140. In some embodiments, two or more modules (or units) may be combined into a single module (or unit). For example, the seed image generation module 404 and the seed image correction module 410 may be combined into a single module. The single module may be configured to generate and/or modify a seed image. Alternatively or additionally, any one module may be divided into two or more units (or sub-units). For example, the air mask generation module 406 may be divided into two units. One of the units may be configured to generate an air mask (e.g., the first air mask), and the other unit may be configured to modify the air mask (e.g., the first air mask).
FIG. 5 is a flow chart illustrating an exemplary process for reconstructing at least one image according to some embodiments of the present application. In some embodiments, at least one operation of process 500 may be implemented in imaging system 100 shown in fig. 1. For example, process 500 may be stored in storage device 150 and/or memory 220 in the form of instructions (e.g., an application) and invoked and/or executed by processing device 140 (e.g., processor 210 in computing device 200 as shown in fig. 2, CPU 340 of mobile device 300 as shown in fig. 3, at least one module of processing device 140 as shown in fig. 4, etc.). As another example, a portion of process 500 may be implemented on imaging device 110. The operations of the illustrated process presented below are intended to be illustrative. In some embodiments, process 500 may be accomplished with at least one additional operation not described and/or without at least one operation discussed herein. Further, the order in which the operations of process 500 are illustrated in FIG. 5 and described below is not intended to be limiting.
In operation 501, the processing device 140 (e.g., the acquisition module 402) may acquire raw data detected by an imaging device (e.g., the imaging device 110) from at least one angle (also referred to herein as a projection angle). In some embodiments, the raw data may be referred to as projection data. The raw data may reflect attenuation information of radiation rays (e.g., X-rays) that have passed through the object, and may generally be used to generate at least one image associated with the object. In some embodiments, the raw data may be detected and/or collected by the imaging device 110 from at least one angle. The raw data may include at least one set of data corresponding to at least one angle. In some embodiments, a set of raw data collected at an angle may also be referred to as a view. For example, the imaging device 110 (e.g., a CT imaging device) may perform a scan of the object by irradiating the object with X-rays. During scanning, the radiation source 113 and the detector module 112 may be rotated with the gantry 111 about the Z-axis to scan the object from different angles. The Z-axis may be from the front side to the back side of the gantry 111 along the axis of the scan path of the gantry 111.
In some embodiments, the projection angle may refer to an angle formed by the line connecting the radiation source 113 and the center of rotation of the gantry 111 with a coordinate axis (e.g., the X-axis or Y-axis as shown in fig. 1). In some embodiments, the angle may range from a start angle to a stop angle. The start angle may correspond to the position of the radiation source 113 at which the radiation source 113 starts emitting radiation rays and the detector module 112 starts detecting raw data. The stop angle may correspond to the position of the radiation source 113 at which the radiation source 113 stops emitting radiation rays and the detector module 112 stops detecting raw data. The start angle may be, for example, 0°, 5°, 10°, 20°, etc. The stop angle may be greater than the start angle, e.g., 180°, 210°, 270°, 360°, etc. In some embodiments, the radiation source 113 may continuously emit radiation rays (e.g., X-rays) toward the subject as the gantry 111 rotates. In some embodiments, the detector module 112 may collect multiple sets of raw data corresponding to multiple angles (e.g., 1200 or 2400 angles ranging from 0° to 360°). Alternatively, the radiation source 113 may intermittently emit radiation rays (e.g., X-rays) toward the subject as the gantry 111 rotates. In some embodiments, the radiation source 113 may emit radiation rays (e.g., X-rays) toward the subject at specific angular intervals (e.g., 1°, 2°, 5°, 10°, etc.). For example, the angular interval may be 1° and the angular range may be 0° to 360°, so the detector module 112 may collect 360 sets of raw data (also referred to as 360 views). As another example, the angular interval may be 0.5° and the angular range may be 0° to 360°, so the detector module 112 may collect 720 sets of raw data (also referred to as 720 views).
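The view counts in the two examples above follow from a one-line calculation:

```python
def num_views(start_deg=0.0, stop_deg=360.0, interval_deg=1.0):
    """Number of views collected over an angular range."""
    return round((stop_deg - start_deg) / interval_deg)

assert num_views(interval_deg=1.0) == 360   # 1 degree interval -> 360 views
assert num_views(interval_deg=0.5) == 720   # 0.5 degree interval -> 720 views
```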
In some embodiments, acquisition module 402 may obtain raw data from at least one component of imaging system 100, such as imaging device 110, a terminal (e.g., terminal 130), a storage device (e.g., storage device 150), and the like. Alternatively or additionally, the acquisition module 402 may obtain raw data from an external source via the network 120. For example, the acquisition module 402 may obtain raw data from a medical database.
In operation 503, the processing device 140 (e.g., the seed image generation module 404) may generate a first seed image. In some embodiments, the seed image generation module 404 may generate the first seed image by performing Filtered Back Projection (FBP) on the raw data obtained in operation 501. In some embodiments, filtered backprojection may refer to a technique that reconstructs an image from raw data (or projection data) by processing the data in projection space and determining the value of each element (also referred to herein as an image element) in image space as a linear combination of a portion of the processed data. The portion of the processed data may be obtained from the points in the projection space to which the image element contributes in a forward projection. The image elements may be pixels or voxels corresponding to spatial points of the object. The values of the image elements may be data related to attenuation coefficients of radiation rays (e.g., X-rays) that have traversed the respective spatial points of the object from at least one angle.
In an exemplary filtered backprojection operation, the seed image generation module 404 may perform filtering on at least one set (e.g., each set) of raw data. In some embodiments, the filtering may be performed using, for example, a high pass filter, a ramp filter, or the like, or any combination thereof. In some embodiments, each set of raw data (also referred to as each view) may be convolved with the kernel of a filter to generate a filtered view. The seed image generation module 404 may perform backprojection on the at least one filtered view to reconstruct the first seed image.
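For illustration only, a minimal sketch of such a filtered backprojection is given below for a simplified two-dimensional parallel-beam geometry. The ideal ramp filter, the nearest-channel lookup, the image size, and the angle list are all assumptions made for the example and are not part of the claimed implementation.

```python
import numpy as np

def ramp_filter_views(sinogram):
    """Convolve each view (row) of the sinogram with a ramp filter kernel,
    implemented here as multiplication in the frequency domain.

    sinogram: (num_views, num_channels) raw data, one row per projection angle.
    """
    num_channels = sinogram.shape[1]
    ramp = np.abs(np.fft.fftfreq(num_channels))   # ideal ramp filter
    return np.real(np.fft.ifft(np.fft.fft(sinogram, axis=1) * ramp, axis=1))

def backproject(filtered, angles_deg, size):
    """Smear each filtered view back across the image grid (parallel-beam)."""
    image = np.zeros((size, size))
    center = size // 2
    ys, xs = np.mgrid[0:size, 0:size] - center
    for view, theta in zip(filtered, np.deg2rad(angles_deg)):
        # detector channel hit by the ray through each pixel at this angle
        t = xs * np.cos(theta) + ys * np.sin(theta) + len(view) // 2
        t = np.clip(np.round(t).astype(int), 0, len(view) - 1)
        image += view[t]
    return image * np.pi / (2 * len(angles_deg))

# first seed image = backprojection of the ramp-filtered views, e.g.:
# seed = backproject(ramp_filter_views(raw), angles_deg=np.arange(0, 180), size=256)
```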
In some embodiments, the first seed image may be further modified and/or corrected to determine a modified first seed image. In some embodiments, the seed image correction module 410 may perform low pass filtering on the first seed image. Alternatively or additionally, the seed image correction module 410 may modify and/or correct the first seed image by subtracting the outer-ring background from the first seed image. More description regarding the modification and/or correction of the first seed image may be found elsewhere in this application (e.g., operation 901 of process 900, operation 1405 of process 1400, process 1600, and related descriptions thereof).
In operation 505, the processing device 140 (e.g., the air mask generation module 406) may determine a first air mask based on the raw data obtained in operation 501. In some embodiments, the air mask may include at least one element having zero attenuation to radiation rays (e.g., X-rays). An element having zero attenuation or relatively low (e.g., negligible) attenuation of the radiation rays may represent air. Each of the at least one element may be a pixel or a voxel. The first air mask may be used to distinguish elements representing air from elements not representing air in an image (e.g., the first seed image generated in operation 503). The first air mask may be a binary image. For example only, in the first air mask, an element representing air may have a first value, and an element not representing air (e.g., representing the object) may have a second value. The first value and the second value may be different. For example, the first value may be 1 and the second value may be 0. As another example, the first value may be 0 and the second value may be 1. These values are provided for illustrative purposes only and are not intended to limit the scope of the present application.
In some embodiments, the air mask generation module 406 may determine the first air mask based on the raw data. For example only, the air mask generation module 406 may perform minimum backprojection (BP) for each element of the image (also referred to herein as an image element) to determine a backprojection value associated with that element. Each element of the image may refer to an element in the image to be reconstructed and may correspond to a spatial point of the object. The air mask generation module 406 may generate a second seed image (also referred to as an air mask seed image) based on the at least one backprojection value associated with the at least one element corresponding to at least one spatial point of the object. The air mask generation module 406 may further generate the first air mask by performing thresholding on the second seed image based on a second threshold. The second threshold may be a default value or an empirical value associated with the imaging system 100. In some embodiments, the second threshold may be set according to a default setting of the imaging system 100, or preset by a user or operator through the terminal 130. It should be noted that in some embodiments, unlike a filtered backprojection operation, the air mask generation module 406 may not need to perform filtering on the raw data before performing the minimum backprojection operation to determine the first air mask. In some embodiments, the filtering process in filtered backprojection (e.g., the ramp filter used in filtered backprojection) may suppress low-frequency components and enhance high-frequency components in the raw data. Furthermore, the filtered backprojection operation is sensitive to truncation. At a truncation position, the data values drop sharply, and filtering this sharp transition produces an overshoot in the data values near the truncation edge, resulting in artifacts that appear as white banding in the resulting image. Furthermore, truncation artifacts may propagate toward the center of the resulting image, thereby reducing overall image quality. In the minimum backprojection operation, the value of each element may be determined as the minimum of the raw data detected from the at least one angle. Such a value is therefore not amplified by a filtering process, and the minimum backprojection operation may be insensitive to truncation. More description of the minimum backprojection operation and the generation of the first air mask based on the raw data can be found elsewhere in this application (e.g., fig. 6 and its associated description).
Alternatively or additionally, the air mask generation module 406 may determine the first air mask based on the raw data and the (modified) first seed image. For example only, the air mask generation module 406 may determine the first air mask based on the raw data as described above. The air mask generation module 406 may generate a second air mask based on the (modified) first seed image, e.g., by performing thresholding on the (modified) first seed image based on a third threshold. The third threshold may be a default value or an empirical value associated with the imaging system 100. In some embodiments, the third threshold may be set according to a default setting of the imaging system 100, or may be preset by a user or operator via the terminal 130. The third threshold may be the same as or different from the second threshold. For example only, for an element of the (modified) first seed image, the air mask generation module 406 may compare the value of the element (also referred to as the element value) to the third threshold. In response to determining that the element value of the (modified) first seed image is less than the third threshold, the air mask generation module 406 may assign a first value to the corresponding element of the second air mask (i.e., the element of the second air mask that corresponds to the element of the (modified) first seed image). In response to determining that the element value of the (modified) first seed image exceeds the third threshold, the air mask generation module 406 may assign a second value to the corresponding element of the second air mask. It should be noted that in response to determining that the value of an element of the (modified) first seed image is equal to the third threshold, the air mask generation module 406 may assign either the first value or the second value to the corresponding element of the second air mask. For example, the first value may be assigned to the corresponding element of the second air mask. As another example, the second value may be assigned to the corresponding element of the second air mask. When each of the elements of the second air mask has been given a specific value (the first value or the second value), the second air mask is generated.
The air mask generation module 406 may modify the first air mask by performing a logical operation (also referred to as a Boolean operation) based on the first air mask and the second air mask. Further description regarding generating a first air mask based on the raw data and the (modified) first seed image may be found elsewhere in the application (e.g., fig. 9 and its associated description).
In operation 507, the processing device 140 (e.g., the reconstruction module 408) may reconstruct at least one image by performing an iterative reconstruction based on the first seed image, the first air mask, and/or the raw data. In some embodiments, the at least one image may comprise a 3D image or a volumetric image. Each of the at least one image may include at least one element. Each element may be a pixel or a voxel. The first seed image may provide an initial image estimate for the at least one image in an iterative reconstruction.
In some embodiments, the iterative reconstruction may include at least one forward projection and/or at least one backprojection. The first air mask determined in operation 505 may be used in the forward projections and the backprojections. For example, the first air mask may include elements that represent air and elements that do not represent air (e.g., that represent the object). The elements representing air may have a first value (e.g., 0), and the elements not representing air may have a second value (e.g., 1). The elements representing air may be omitted in the forward projections and/or backprojections of the iterative reconstruction. By using the first air mask, the iterative reconstruction can thus be accelerated. Exemplary iterative reconstruction algorithms may include adaptive statistical iterative reconstruction (ASIR), model-based iterative reconstruction (MBIR), iterative reconstruction in image space (IRIS), sinogram affirmed iterative reconstruction (SAFIRE), dual model-based iterative reconstruction (DMBIR), adaptive iterative dose reduction (AIDR), iterative model reconstruction (IMR), and the like, or any combination thereof.
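As a hedged illustration of how such a mask accelerates the iterations (not the patent's specific algorithm), the sketch below uses a simple unnormalized SIRT-style update in which elements flagged as air are simply excluded. The system matrix `A`, the relaxation factor, and the iteration count are assumptions for the example; a practical implementation would also normalize the update by the row and column sums of `A`.

```python
import numpy as np

def masked_sirt(raw, A, air_mask, seed, num_iters=10, relax=0.01):
    """Illustrative (unnormalized) SIRT-style loop that skips air elements.

    raw      : measured projection data, shape (num_rays,)
    A        : system matrix, e.g. a scipy.sparse matrix of shape
               (num_rays, num_pixels); A @ x is the forward projection
    air_mask : boolean array, True where an element represents air
    seed     : first seed image, flattened, used as the initial estimate
    """
    x = seed.astype(float).copy()
    active = ~air_mask                        # only non-air elements are updated
    x[air_mask] = 0.0                         # air elements are fixed at zero
    for _ in range(num_iters):
        residual = raw - A @ x                # forward-project current estimate
        update = A.T @ residual               # backproject the residual
        x[active] += relax * update[active]   # elements flagged as air are skipped
    return x
```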
In some embodiments, the first air mask may be updated in one or more iterations of the iterative reconstruction. For example only, the first air mask may be generated based on the first seed image. The first seed image may be updated in an iteration of the iterative reconstruction, and the first air mask may therefore be updated in that iteration based on the updated first seed image.
It should be noted that the above description of process 500 is provided for illustrative purposes and is not intended to limit the scope of the present application. Various modifications and changes may occur to those skilled in the art in light of the description herein. However, such modifications and changes do not depart from the scope of the present application. In some embodiments, downsampling may be performed on the raw data. The first air mask may be determined based on the down-sampled raw data. Alternatively or additionally, downsampling may be performed on the first (or second) seed image. The first air mask may be determined from at least one block of pixels of the down-sampled first (or second) seed image. Each of the at least one pixel block may include N × M pixels, e.g., 3 × 2 pixels, 4 × 4 pixels, 4 × 6 pixels, and the like. The number N may refer to the number (or count) of rows of a pixel block and the number M may refer to the number (or count) of columns of the pixel block. In some embodiments, morphological operations may be performed on the first air mask to modify the first air mask. The morphological operations may include erosion operations, dilation operations, or the like, or any combination thereof. In some embodiments, the first air mask may be used for image processing, image segmentation, image denoising, and the like.
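For illustration only, one way to carry out the block-based determination mentioned above is sketched below. Treating a block as air only when all of its pixels are air is an assumption of this example; the description above does not fix the per-block rule.

```python
import numpy as np

def downsample_mask(mask, n=4, m=4):
    """Mark an N x M pixel block as air only if every pixel in it is air.

    mask: 2D boolean array (True = air); trailing rows/columns that do not
    fill a whole block are discarded in this sketch.
    """
    h, w = mask.shape
    blocks = mask[:h - h % n, :w - w % m].reshape(h // n, n, w // m, m)
    return blocks.all(axis=(1, 3))            # conservative per-block decision
```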
In some embodiments, the air mask generation module 406 may determine the first air mask based on the (modified) first seed image. The air mask generation module 406 may determine the first air mask by performing thresholding on the (modified) first seed image based on a first threshold. The first threshold may be the same as or different from the second threshold or the third threshold. In some embodiments, if the first threshold is the same as the third threshold, the first air mask generated based on the (modified) first seed image may be the same as the second air mask described above.
FIG. 6 is a flow chart illustrating an exemplary process of generating an air mask according to some embodiments of the present application. In some embodiments, at least one operation of process 600 may be implemented in imaging system 100 shown in fig. 1. For example, process 600 may be stored in the form of instructions (e.g., an application) in storage device 150 and/or memory 220 and invoked and/or executed by a processing device 140 (e.g., processor 210 of computing device 200 as shown in fig. 2, CPU 340 of mobile device 300 as shown in fig. 3, at least one module of processing device 140 as shown in fig. 4, etc.). As another example, a portion of process 600 may be implemented on imaging device 110. The operations of the illustrated process presented below are intended to be illustrative. In some embodiments, process 600 may be implemented with at least one additional operation not described, and/or without at least one operation described herein. Further, the order of the operations of process 600 illustrated in FIG. 6 and described below is not intended to be limiting. In some embodiments, operation 505 of process 500 may be performed in accordance with process 600.
In operation 601, the processing device 140 (e.g., the air mask generation module 406) may perform low pass filtering on the raw data (e.g., the raw data obtained in 501). The raw data may be detected by at least one detector of the detector module 112. The at least one detector may be arranged in a channel direction and a row direction. In some embodiments, the air mask generation module 406 may perform one-dimensional (1D) low pass filtering on the raw data in the channel direction before performing the minimum backprojection. Alternatively, the air mask generation module 406 may perform two-dimensional (2D) low pass filtering on the raw data in the channel and row directions before performing minimum back projection. In some embodiments, operation 601 may be omitted. The air mask generation module 406 may omit filtering of the raw data and perform minimum backprojection directly on the raw data.
In operation 603, the processing device 140 (e.g., the air mask generation module 406) may perform minimum backprojection by designating, for each element of the at least one image, the minimum value of a portion of the raw data or of modified data as the backprojection value for that element. The portion of the raw data may be associated with that element and detected from the at least one angle. The modified data may be generated by interpolation based on the raw data.
For example only, each element of the image may correspond to a spatial point of the object. Radiation rays (e.g., X-rays) emitted from the radiation source 113 may pass through a spatial point of the subject from at least one angle and be detected by the detector module 112. The air mask generation module 406 may compare the detected raw data associated with each element and assign the minimum of the detected raw data as the backprojection value for that element. In some embodiments, a radiation ray that passes through a spatial point of the object from a particular one of the at least one angle may not be detected directly by a detector of the detector module 112. For example, the radiation ray may impinge on a gap between two detectors. Data corresponding to such a radiation ray (also referred to as modified data) may be generated by interpolation based on raw data detected at two or more angles adjacent to the particular angle. The interpolation may include nearest neighbor interpolation, linear interpolation, and the like.
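A minimal sketch of this minimum backprojection is given below, again assuming a simplified two-dimensional parallel-beam geometry. The linear interpolation across detector channels stands in for the interpolation of modified data described above; the geometry, image size, and angle list are assumptions of the example.

```python
import numpy as np

def minimum_backprojection(raw, angles_deg, size):
    """Assign each image element the minimum raw value among the rays
    that pass through it, one ray per projection angle (parallel-beam sketch).

    raw: (num_views, num_channels) unfiltered projection data.
    Elements never intersected by a ray remain +inf (outside the FOV).
    """
    seed = np.full((size, size), np.inf)
    center = size // 2
    ys, xs = np.mgrid[0:size, 0:size] - center
    for view, theta in zip(raw, np.deg2rad(angles_deg)):
        t = xs * np.cos(theta) + ys * np.sin(theta) + len(view) // 2
        inside = (t >= 0) & (t <= len(view) - 1)
        # linear interpolation stands in for the "modified data" above
        vals = np.interp(t, np.arange(len(view)), view)
        seed[inside] = np.minimum(seed[inside], vals[inside])
    return seed
```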
In operation 605, the processing device 140 (e.g., the air mask generation module 406) may generate a second seed image based on the at least one backprojection value associated with the at least one element. A second seed image (also referred to as an air mask seed image) may be generated by assigning a back projection value to each of the at least one element. Unlike the first seed image generated based on filtered backprojection, the second seed image may be generated by performing minimum backprojection on the raw data. In some embodiments, the second seed image may not reflect actual attenuation information of the radiation ray, and the second seed image may not be used as an initial image estimate for the iterative reconstruction.
In operation 607, the processing device 140 (e.g., the air mask generation module 406) may generate the first air mask by performing thresholding on the second seed image based on the second threshold. In some embodiments, the first air mask may be used to distinguish elements representing air from elements not representing air in the second seed image. For example only, in the first air mask, elements representing air may have a first value (e.g., 0) and elements representing non-air may have a second value (e.g., 1).
In some embodiments, for an element of the second seed image, the air mask generation module 406 may compare the value of the element (also referred to as the element value) to the second threshold. Elements that represent air generally have lower attenuation values than elements that do not represent air. In response to determining that the element value of the second seed image is less than the second threshold, the air mask generation module 406 may assign a first value to the corresponding element of the first air mask (i.e., the element of the first air mask that corresponds to the element of the second seed image). In response to determining that the element value of the second seed image exceeds the second threshold, the air mask generation module 406 may assign a second value to the corresponding element of the first air mask. It should be noted that in response to determining that the element value of the second seed image is equal to the second threshold, the air mask generation module 406 may assign either the first value or the second value to the corresponding element of the first air mask. For example, the first value may be assigned to the corresponding element of the first air mask. As another example, the second value may be assigned to the corresponding element of the first air mask. When each of the elements of the first air mask has been given a specific value (the first value or the second value), the first air mask is generated. The second threshold may be a default value or an empirical value associated with the imaging system 100.
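For illustration only, the thresholding just described can be sketched as below. Assigning elements exactly equal to the threshold to air is one of the two acceptable conventions noted above, and the 0/1 encoding is an assumption of the example.

```python
import numpy as np

def threshold_air_mask(seed_image, threshold, air_value=0, non_air_value=1):
    """Binary air mask: elements at or below the threshold are treated as air."""
    mask = np.where(seed_image <= threshold, air_value, non_air_value)
    return mask.astype(np.uint8)
```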
It should be noted that the above description of process 600 is provided for illustrative purposes and is not intended to limit the scope of the present application. Various modifications and changes may occur to those skilled in the art in light of the description herein. However, such modifications and changes do not depart from the scope of the present application. In some embodiments, the processing device 140 may perform a clipping operation on the (filtered) raw data before performing the minimum backprojection. The clipping operation may refer to an operation of modifying data values based on at least one reference value. In an exemplary clipping operation, the processing device 140 may convert any portion of the raw data whose value is less than the reference value to the reference value (e.g., convert negative values to 0).
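A one-line sketch of this clipping operation, with a reference value of 0 assumed as in the example above:

```python
import numpy as np

def clip_raw_data(raw, reference=0.0):
    """Clipping operation: raise values below the reference to the reference
    (e.g., negative projection values become 0)."""
    return np.maximum(raw, reference)
```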
FIGS. 7A-7D are schematic diagrams of exemplary seed images of an object according to some embodiments of the present application. The seed images shown in fig. 7A-7D are generated based on the same raw data without truncation. Fig. 7A and 7C show the same image at different Window Widths (WW) and Window Levels (WL); this image is also referred to as the first seed image (or filtered backprojection seed image) in the present application. Fig. 7B and 7D show the same image at different Window Widths (WW) and Window Levels (WL); this image is also referred to as the second seed image (or air mask seed image) in the present application. In some embodiments, the Window Width (WW) of an image may affect the contrast of the image, and the Window Level (WL) of the image may affect the brightness of the image. The first seed image (the filtered backprojection seed image) shown in fig. 7A and 7C is generated according to the Feldkamp-Davis-Kress (FDK) algorithm. The FDK algorithm is a filtered backprojection algorithm. The second seed image (the air mask seed image) shown in fig. 7B and 7D is generated according to the minimum backprojection algorithm described in the present application. A first region 701 shown in fig. 7A represents the object, and a second region 703 (i.e., the other region) represents background information (e.g., air, the bed plate, etc.). Similarly, a first region 702 shown in fig. 7B represents the object, while a second region 704 (i.e., the other region) represents background information (e.g., air, the bed plate, etc.).
A first profile of a plurality of elements of the filtered backprojection seed image at line 705 is shown in fig. 7A. A second profile of a plurality of elements of the air mask seed image at line 706 is shown in fig. 7B. Line 706 has a position corresponding to line 705 (i.e., if the filtered backprojection seed image and the air mask seed image were aligned and overlapped, line 705 and line 706 would coincide). The first profile reflects the element values (e.g., grayscale values) along line 705. The second profile reflects the element values (e.g., grayscale values) along line 706. Element values near 0 may correspond to air. The element values near 0 in the first profile (marked with a solid circle in fig. 7A) fluctuate, which means that the variation of the element values near 0 along line 705 is relatively pronounced. The element values near 0 in the second profile (marked with a dashed circle in fig. 7B) are smooth, which means that the variation of the element values near 0 along line 706 is relatively small. If thresholding were performed on the filtered backprojection seed image and on the air mask seed image to generate air masks separately, the air mask generated based on the air mask seed image would likely be more accurate than the air mask generated based on the filtered backprojection seed image, because the thresholding operation is easily affected by fluctuations in the element values.
To compare the first seed image and the second seed image, the two images are displayed at the same window width and window level. Specifically, as shown in fig. 7C and 7D, the window width may be 100 Hounsfield Units (HU) and the window level may be 0 HU. As shown in fig. 7C, the elements in the first region 701 have relatively high element values and are displayed in white. The elements in the second region 703 have relatively low element values and are shown in gray. Similarly, as shown in fig. 7D, the elements in the first region 702 have relatively high element values and are displayed in white. The elements in the second region 704 have relatively low element values and are shown in gray. The element values of the second region 703 are not uniform, whereas the element values of the second region 704 are uniform. That is, the second seed image may provide a sharp outline of the object with minimal noise, which facilitates segmenting or distinguishing air regions from non-air regions (e.g., the object) by thresholding. In some embodiments, an air mask may be generated by performing thresholding on the second seed image.
FIGS. 8A-8D are schematic diagrams of exemplary seed images of an object according to some embodiments of the present application. The seed images shown in fig. 8A-8D are generated based on the same raw data with truncation. Truncation may be introduced in the tomographic reconstruction if the object extends outside the (scanned) field of view (FOV) of the imaging device 110. Fig. 8A and 8C show the same image at different Window Widths (WW) and Window Levels (WL); this image is also referred to as the first seed image (or filtered backprojection seed image) in the present application. Fig. 8B and 8D show the same image at different Window Widths (WW) and Window Levels (WL); this image is also referred to as the second seed image (or air mask seed image) in the present application. The first seed image (the filtered backprojection seed image) shown in fig. 8A and 8C may be generated according to the Feldkamp-Davis-Kress (FDK) algorithm. The FDK algorithm is a filtered backprojection algorithm. The second seed image (the air mask seed image) shown in fig. 8B and 8D may be generated according to the minimum backprojection algorithm described in this application. A first region 801 shown in fig. 8A represents the object, and a second region 803 (i.e., the other region) represents background information (e.g., air, the bed plate, etc.). Similarly, a first region 802 shown in fig. 8B represents the object, and a second region 804 (i.e., the other region) represents background information (e.g., air, the bed plate, etc.).
A first profile of a plurality of elements of the filtered backprojection seed image at line 805 is shown in fig. 8A. A second profile of a plurality of elements of the air mask seed image at line 806 is shown in fig. 8B. Line 806 has a position corresponding to line 805 (i.e., if the first seed image and the second seed image were aligned and overlapped, line 805 and line 806 would coincide). The first profile reflects the element values (e.g., grayscale values) along line 805. The second profile reflects the element values (e.g., grayscale values) along line 806. Element values near 0 may correspond to air. The element values near 0 in the first profile (marked with a solid circle in fig. 8A) fluctuate, which means that the variation of the element values near 0 along line 805 is relatively significant. The element values near 0 in the second profile (marked with a dashed circle in fig. 8B) are smooth, which means that the variation of the element values near 0 along line 806 is relatively small. Further, due to data truncation, the values of elements located at the boundary between air and the object change abruptly in the first profile (as shown by arrow 807 in fig. 8A).
In some embodiments, to compare the first seed image and the second seed image, the two images may be displayed at the same window width and window level. Specifically, as shown in fig. 8C and 8D, the window width may be 100 HU and the window level may be 0 HU. As shown in fig. 8C, the elements in the first region 801 have relatively high element values and are displayed in white. It should be noted that, due to data truncation, a portion of the element values in the second region 803 are higher than the element values of air. Therefore, it is difficult to segment or distinguish air and the object in the first seed image. As shown in fig. 8D, the elements in the first region 802 have relatively high element values and are displayed in white. The elements in the second region 804 have relatively low element values and are shown in gray. The element values of the second region 804 are uniform, thus facilitating segmenting or distinguishing air from the object in the second seed image. That is, when data truncation occurs, the second seed image generated according to the minimum backprojection algorithm described in the present application makes generating an air mask more convenient, and the generated air mask may be more accurate.
Fig. 9 is a flow chart of an exemplary process of modifying an air mask (e.g., the first air mask) according to some embodiments of the present application. In some embodiments, at least one operation of process 900 may be implemented in imaging system 100 shown in fig. 1. For example, process 900 may be stored in the form of instructions (e.g., an application) in storage device 150 and/or memory 220 and invoked and/or executed by a processing device 140 (e.g., processor 210 in computing device 200 as shown in fig. 2, CPU 340 of mobile device 300 as shown in fig. 3, at least one module of processing device 140 as shown in fig. 4, etc.). As another example, a portion of process 900 may be implemented on imaging device 110. The operations of the illustrated process presented below are intended to be illustrative. In some embodiments, process 900 may be implemented with at least one additional operation not described, and/or without at least one operation described herein. Further, the order of the operations of process 900 shown in FIG. 9 and described below is not intended to be limiting. In some embodiments, operation 505 of process 500 may be performed in accordance with process 900.
In operation 901, the processing device 140 (e.g., the seed image correction module 410) may modify the first seed image by subtracting the outer-ring background from the first seed image and/or performing low pass filtering on the first seed image.
In some embodiments, the seed image correction module 410 may perform low pass filtering on the first seed image. For example, the seed image correction module 410 may perform one-dimensional (1D) low pass filtering on the first seed image in the channel direction of at least one detector of the detector module 112. As another example, the seed image correction module 410 may perform two-dimensional (2D) low pass filtering on the first seed image in the channel direction and the row direction of at least one detector of the detector module 112. The size of the two-dimensional filter may be, for example, 5 × 5, 10 × 10, or the like. In some embodiments, the seed image correction module 410 may modify the first seed image by subtracting the outer-ring background from the first seed image. The outer ring may correspond to the region between the (scanned) field of view (FOV) of the imaging device 110 and the gantry aperture of the imaging device 110. More description regarding the modification and/or correction of the first seed image may be found elsewhere in this application (e.g., operation 1405 of process 1400, process 1600, and related descriptions thereof).
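For illustration only, the 1D and 2D low pass filtering steps can be sketched with simple mean filters as below. The mean kernel and the 5 × 5 size are assumptions of the example, as the application does not fix the kernel type.

```python
from scipy.ndimage import uniform_filter, uniform_filter1d

def lowpass_1d_channel(data, width=5):
    """1D mean filter along the channel direction (assumed last axis here)."""
    return uniform_filter1d(data, size=width, axis=-1)

def lowpass_2d(data, shape=(5, 5)):
    """2D mean filter over the channel and row directions, e.g. the 5 x 5
    size mentioned above."""
    return uniform_filter(data, size=shape)
```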
In operation 903, the processing device 140 (e.g., the air mask generation module 406) may generate a second air mask by performing thresholding on the (modified) first seed image based on a third threshold. The second air mask may be a binary image including elements having a first value and elements having a second value. In some embodiments, the elements having the first value may represent air and the elements having the second value may represent other regions (e.g., objects) of the image. In some embodiments, the second air mask may be different from the first air mask (i.e., at least one element of the second air mask may be different from a corresponding element of the first air mask). In some embodiments, a second air mask may be used to modify the first air mask. In some embodiments, operation 901 may be omitted and a second air mask may be generated by performing thresholding on the first seed image based on a third threshold.
In some embodiments, for an element of the (modified) first seed image, the air mask generation module 406 may compare the value of the element to the third threshold. In response to determining that the value of the element of the (modified) first seed image is less than the third threshold, the air mask generation module 406 may assign a first value to the corresponding element of the second air mask (i.e., the element of the second air mask that corresponds to the element of the (modified) first seed image). In response to determining that the value of the element of the (modified) first seed image exceeds the third threshold, the air mask generation module 406 may assign a second value to the corresponding element of the second air mask. It should be noted that in response to determining that the value of the element of the (modified) first seed image is equal to the third threshold, the air mask generation module 406 may assign either the first value or the second value to the corresponding element of the second air mask. For example, the first value may be assigned to the corresponding element of the second air mask. As another example, the second value may be assigned to the corresponding element of the second air mask. When each element of the second air mask has been given a specific value (the first value or the second value), the second air mask is generated. The third threshold may be a default value or an empirical value associated with the imaging system 100. In some embodiments, the third threshold may be set according to a default setting of the imaging system 100, or preset by a user or operator via the terminal 130.
In operation 905, the processing device 140 (e.g., the air mask generation module 406) may modify the first air mask (e.g., the first air mask generated in operation 607) by performing a logical operation (also referred to as a Boolean operation) based on the first air mask and the second air mask. In some embodiments, the air mask generation module 406 may perform an OR operation or an AND operation on the first air mask and the second air mask. In some embodiments, both the first air mask and the second air mask may be binary images. For example only, in the first and second air masks, the elements representing air may have a first value, while the elements representing other regions (e.g., the object) may have a second value. The first value and the second value may be different. For example, the first value may be 1 and the second value may be 0. As another example, the first value may be 0 and the second value may be 1. These values are provided for illustrative purposes only and are not intended to limit the scope of the present application.
For example only, the air mask generation module 406 may choose to perform the AND operation when modifying the first air mask. That is, the air mask generation module 406 may perform an AND operation on each element in the first air mask and the corresponding element in the second air mask. As another example, the air mask generation module 406 may choose to perform the OR operation when modifying the first air mask. That is, the air mask generation module 406 may perform an OR operation on each element in the first air mask and the corresponding element in the second air mask. The selection may be determined by default settings of the imaging system 100, or preset or adjusted by the user via the terminal 130.
For example only, the value of the elements representing air in the first and second air masks may be 0, and the value of the elements representing other regions may be 1. If the OR operation is performed, the number (or count) of elements in the modified first air mask that do not represent air may increase relative to the first air mask and, correspondingly, the number (or count) of elements that represent air may decrease. Therefore, an iterative reconstruction based on the modified first air mask may involve more elements than one based on the original first air mask, thereby increasing the computational cost of the reconstruction. Alternatively, if the AND operation is performed, the number (or count) of elements in the modified first air mask that do not represent air may decrease relative to the first air mask and, correspondingly, the number (or count) of elements that represent air may increase. Therefore, the iterative reconstruction may be accelerated based on the modified first air mask.
As another example, the value of the elements representing air in the first and second air masks may be 1, and the value of the elements not representing air may be 0. If the OR operation is performed, the number (or count) of elements representing air in the modified first air mask may increase relative to the first air mask and, correspondingly, the number (or count) of elements not representing air may decrease. Therefore, the iterative reconstruction may be accelerated based on the modified first air mask. Alternatively, if the AND operation is performed, the number (or count) of elements representing air in the modified first air mask may decrease relative to the first air mask and, correspondingly, the number (or count) of elements not representing air may increase. Therefore, an iterative reconstruction based on the modified first air mask may involve more elements than one based on the original first air mask, thereby increasing the computational cost of the reconstruction.
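For illustration only, the Boolean combination can be sketched as below under the convention that air is encoded as 0 and non-air as 1 (the first example above); with the opposite encoding, the roles of AND and OR are simply swapped.

```python
import numpy as np

def combine_air_masks(mask_a, mask_b, op="and"):
    """Combine two binary air masks element-wise (0 = air, 1 = non-air).

    With this encoding, AND keeps an element as non-air only if both masks
    agree, enlarging the air region and accelerating reconstruction; OR does
    the opposite.
    """
    a = mask_a.astype(bool)
    b = mask_b.astype(bool)
    combined = (a & b) if op == "and" else (a | b)
    return combined.astype(np.uint8)
```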
In operation 907, the processing device 140 (e.g., the air mask generation module 406) may perform a morphological operation on the modified first air mask. In some embodiments, the morphological operations may include an erosion operation, a dilation operation, an opening operation, a closing operation, and the like, or any combination thereof. In some embodiments, the air mask generation module 406 may perform the morphological operation based on a structuring element. The structuring element may be a relatively small binary image (i.e., a relatively small matrix of pixels or voxels, each having a value of 1 or 0). In some embodiments, the size of the structuring element may be smaller than that of the air mask. For example, the size of the structuring element may be 3 × 4, 5 × 5, 10 × 10, etc. In some embodiments, the origin of the structuring element may be one of its pixels or voxels. In some embodiments, the value of the origin of the structuring element may be 1. In some embodiments, the modified first air mask may still be a binary image, including elements having a first value (e.g., 0) and elements having a second value (e.g., 1). The air mask generation module 406 may perform the morphological operation on the modified first air mask.
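For illustration only, the dilation and closing operations used later (e.g., for the air masks VII and VIII) can be sketched with SciPy's binary morphology routines. The rectangular 10 × 10 structuring element mirrors the size mentioned below, and the boolean encoding is an assumption of the example.

```python
import numpy as np
from scipy.ndimage import binary_dilation, binary_closing

def dilate_mask(mask, size=(10, 10)):
    """Dilation with a rectangular structuring element (size assumed here)."""
    structure = np.ones(size, dtype=bool)
    return binary_dilation(mask.astype(bool), structure=structure).astype(np.uint8)

def close_mask(mask, size=(10, 10)):
    """Closing = dilation followed by erosion."""
    structure = np.ones(size, dtype=bool)
    return binary_closing(mask.astype(bool), structure=structure).astype(np.uint8)
```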
It should be noted that the above description of process 900 is provided for illustrative purposes and is not intended to limit the scope of the present application. Various modifications and changes may occur to those skilled in the art in light of the description herein. However, such modifications and changes do not depart from the scope of the present application. In some embodiments, operation 907 may be omitted. In some embodiments, the modified first air mask generated in operation 905 or the further modified first air mask generated in operation 907 may be used in place of the first air mask generated in operation 505 for use in the iterative reconstruction shown in operation 507 to accelerate the image reconstruction.
FIG. 10A is a schematic diagram of an exemplary process of generating an air mask, shown in accordance with some embodiments of the present application. At least one air mask may be generated by performing different operations on the raw data.
For example only, as shown in fig. 10A, the processing device 140 may perform minimum backprojection on the raw data to generate an air mask seed image V1, and further perform thresholding on the air mask seed image V1 based on the threshold T1 to determine the air mask I. More description of the generation of the air mask I may be found elsewhere in this application (e.g., operations 603-607 shown in fig. 6 and descriptions thereof). The threshold T1 may be, for example, a default value or an empirical value associated with the imaging system 100.
As another example, the processing device 140 may first perform a clipping operation on the raw data, perform minimum backprojection on the clipped raw data to generate the air mask seed image V2, and further perform thresholding on the air mask seed image V2 based on the threshold T2 to determine the air mask II. More description of the generation of the air mask II can be found elsewhere in this application (e.g., process 600 shown in fig. 6 and the description thereof). The threshold T2 may be, for example, a default value or an empirical value associated with the imaging system 100.
As another example, the processing device 140 may perform one-dimensional (1D) low pass filtering on the raw data in the channel direction, perform minimum backprojection on the filtered raw data to generate the air mask seed image V3, and further perform thresholding on the air mask seed image V3 based on the threshold T3 to determine the air mask III. More description about the generation of the air mask III can be found elsewhere in the present application (e.g., the process 600 shown in fig. 6 and the description thereof). The threshold T3 may be, for example, a default value or an empirical value associated with the imaging system 100.
As another example, the processing device 140 may perform one-dimensional (1D) low pass filtering on the raw data in the channel direction, perform a clipping operation on the filtered raw data, perform minimum backprojection on the clipped raw data to generate the air mask seed image V4, and further perform thresholding on the air mask seed image V4 based on the threshold T4 to determine the air mask IV. More description of the generation of the air mask IV may be found elsewhere in this application (e.g., process 600 shown in fig. 6 and the description thereof). The threshold T4 may be, for example, a default value or an empirical value associated with the imaging system 100.
As another example, the processing device 140 may perform two-dimensional (2D) low pass filtering on the raw data in the channel direction and the row direction, perform minimum backprojection on the filtered raw data to generate the air mask seed image V5, and further perform thresholding on the air mask seed image V5 based on the threshold T5 to determine the air mask V. More description of the generation of the air mask V may be found elsewhere in this application (e.g., process 600 shown in fig. 6 and description thereof). The threshold T5 may be, for example, a default value or an empirical value associated with the imaging system 100.
As another example, the processing device 140 may perform two-dimensional (2D) low pass filtering on the raw data in the channel direction and the row direction, perform a clipping operation on the filtered raw data, perform minimum backprojection on the clipped raw data to generate the air mask seed image V6, and further perform thresholding on the air mask seed image V6 based on the threshold T6 to determine the air mask VI. More description of the generation of the air mask VI may be found elsewhere in this application (e.g., process 600 shown in fig. 6 and the description thereof). The threshold T6 may be, for example, a default value or an empirical value associated with the imaging system 100. In some embodiments, all of the thresholds T1-T6 may be the same. In some embodiments, at least two of the thresholds T1-T6 may be the same. In some embodiments, at least two of the thresholds T1-T6 may be different. In some embodiments, all of the thresholds T1-T6 may be different from one another.
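For illustration only, the six variants above differ solely in their preprocessing, so they can be expressed as one parameterized pipeline using the illustrative helpers sketched earlier (`lowpass_1d_channel`, `lowpass_2d`, `clip_raw_data`, `minimum_backprojection`, `threshold_air_mask`). `ANGLES` and `SIZE` are hypothetical placeholders for the acquisition angles and image size.

```python
def build_air_mask(raw, threshold, lowpass=None, clip=False):
    """Compose the optional preprocessing behind the air masks I-VI."""
    data = raw
    if lowpass == "1d":
        data = lowpass_1d_channel(data)   # channel-direction mean filter
    elif lowpass == "2d":
        data = lowpass_2d(data)           # channel- and row-direction filter
    if clip:
        data = clip_raw_data(data)        # raise negatives to the reference (0)
    seed = minimum_backprojection(data, angles_deg=ANGLES, size=SIZE)
    return threshold_air_mask(seed, threshold)

# air mask I  : build_air_mask(raw, T1)
# air mask II : build_air_mask(raw, T2, clip=True)
# air mask III: build_air_mask(raw, T3, lowpass="1d")
# air mask IV : build_air_mask(raw, T4, lowpass="1d", clip=True)
# air mask V  : build_air_mask(raw, T5, lowpass="2d")
# air mask VI : build_air_mask(raw, T6, lowpass="2d", clip=True)
```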
FIG. 10B is a schematic diagram of an exemplary process of generating an air mask according to some embodiments of the present application. As shown in fig. 10B, the processing device 140 may perform filtered backprojection on the raw data to generate a filtered backprojection seed image, and perform thresholding on the filtered backprojection seed image based on a threshold T to generate a second air mask. The raw data used in fig. 10B is the same as the raw data in fig. 10A. The threshold T may be, for example, a default value or an empirical value associated with the imaging system 100. The processing device 140 may determine the air mask HI by performing a Boolean operation on the air mask I (see fig. 10A) and the second air mask, determine the air mask HII by performing a Boolean operation on the air mask II (see fig. 10A) and the second air mask, determine the air mask HIII by performing a Boolean operation on the air mask III (see fig. 10A) and the second air mask, determine the air mask HIV by performing a Boolean operation on the air mask IV (see fig. 10A) and the second air mask, determine the air mask HV by performing a Boolean operation on the air mask V (see fig. 10A) and the second air mask, and determine the air mask HVI by performing a Boolean operation on the air mask VI (see fig. 10A) and the second air mask. More description of the generation of the air masks HI-HVI may be found elsewhere in this application (e.g., operations 903 and 905 shown in FIG. 9 and the descriptions thereof).
FIG. 11A shows air mask seed images corresponding to the air masks depicted in FIG. 10A. The air mask seed images (i.e., the air mask seed images V1-V6) may be generated according to the processes shown in FIG. 10A. By thresholding an air mask seed image, the corresponding air mask can be generated. As shown in fig. 11A, the air mask seed image corresponding to the air mask I is referred to as the air mask seed image I, the air mask seed image corresponding to the air mask II is referred to as the air mask seed image II, the air mask seed image corresponding to the air mask III is referred to as the air mask seed image III, the air mask seed image corresponding to the air mask IV is referred to as the air mask seed image IV, the air mask seed image corresponding to the air mask V is referred to as the air mask seed image V, and the air mask seed image corresponding to the air mask VI is referred to as the air mask seed image VI. The air mask seed images I-VI are generated based on truncated data. Data truncation may be introduced if the object extends outside the (scanned) field of view (FOV) of the imaging device 110. Due to the data truncation, stripes are introduced into the air mask seed images (e.g., the stripes in the solid and dashed circles in the air mask seed images I-VI in fig. 11A). The air mask seed image VI contains the fewest stripes (e.g., the stripes marked with solid circles), which may facilitate segmenting or distinguishing air regions from non-air regions by thresholding.
FIG. 11B shows air mask seed images corresponding to the air masks depicted in FIG. 10B. These air mask seed images may be generated by modifying the air mask seed images I-VI using the filtered backprojection seed image. By performing thresholding on an air mask seed image, the corresponding air mask can be generated. As shown in FIG. 11B, the air mask seed image corresponding to the air mask HI is referred to as the air mask seed image HI, the air mask seed image corresponding to the air mask HII is referred to as the air mask seed image HII, the air mask seed image corresponding to the air mask HIII is referred to as the air mask seed image HIII, the air mask seed image corresponding to the air mask HIV is referred to as the air mask seed image HIV, the air mask seed image corresponding to the air mask HV is referred to as the air mask seed image HV, and the air mask seed image corresponding to the air mask HVI is referred to as the air mask seed image HVI. As shown in FIG. 11B, the stripes in the solid circles in the air mask seed images HI-HVI are reduced relative to the stripes in the air mask seed images I-VI in FIG. 11A. The stripes in the dashed circles in the air mask seed images HI-HVI are likewise reduced relative to those in the air mask seed images I-VI. The air mask seed image HVI has the fewest stripes and provides a clear outline of the object. Thus, the air mask seed image HVI can effectively identify air regions that are contaminated with stripes, as well as objects whose HU values are similar to those of air regions.
Fig. 12A is a schematic diagram of an exemplary process of generating an air mask according to some embodiments of the present application. As shown in fig. 12A, the processing device 140 may perform filtered backprojection on the raw data to generate a first seed image. The processing device 140 may modify the first seed image by subtracting the outer-ring background from the first seed image to optimize the first seed image. The processing device 140 may perform two-dimensional (2D) low pass filtering on the first seed image to reduce streak artifacts. Streak artifacts may be created during the backprojection process due to interpolation between the projection lines. The presence of streak artifacts may lead to misinterpretation of tomographic features. Accordingly, it may be desirable in some embodiments to reduce streak artifacts. For illustrative purposes only, the size of the two-dimensional filter may be 5 × 5. The processing device 140 may also perform thresholding on the modified (i.e., optimized and/or filtered) first seed image based on a threshold T to generate the air mask VII. The threshold T may be, for example, a default value or an empirical value associated with the imaging system 100. For example, when the raw data includes CT data, the threshold T may be 30 HU. The processing device 140 may also perform a morphological operation on the air mask VII to determine a modified air mask VII. For example, the processing device 140 may perform a two-dimensional dilation operation on the air mask VII. The size of the structuring element used in the two-dimensional dilation operation may be 10 × 10, which is much smaller than the size of the air mask VII.
FIG. 12B is a schematic diagram of another exemplary process of generating an air mask according to some embodiments of the present application. As shown in fig. 12B, the processing device 140 may perform filtered backprojection on the raw data to generate a first seed image. The raw data used in fig. 12B is the same as the raw data in fig. 12A. The processing device 140 may modify the first seed image by subtracting the outer-ring background from the first seed image to optimize the first seed image. The processing device 140 may perform two-dimensional (2D) low pass filtering on the first seed image to reduce streak artifacts, whose presence may lead to misinterpretation of tomographic features. For illustrative purposes only, the size of the two-dimensional filter may be 5 × 5. The processing device 140 may perform thresholding on the modified (i.e., optimized and/or filtered) first seed image based on a first threshold T1 to generate a second air mask. The first threshold T1 may be, for example, a default value or an empirical value associated with the imaging system 100. For example, when the raw data includes CT data, the first threshold T1 may be 30 HU. Meanwhile, the processing device 140 may perform two-dimensional (2D) low pass filtering on the raw data and minimum backprojection on the filtered raw data to generate a second seed image. The processing device 140 may also perform thresholding on the second seed image based on a second threshold T2 to generate a first air mask. The second threshold T2 may be, for example, a default value or an empirical value associated with the imaging system 100. For example, the second threshold T2 may be 0. The processing device 140 may perform a Boolean operation on the first air mask and the second air mask. For example, the processing device 140 may determine a modified first air mask (e.g., the air mask VIII) by performing an OR operation on the first air mask and the second air mask. After the OR operation, the number (or count) of elements in the air mask VIII that do not represent air may be increased relative to the air mask VII. In some embodiments, the processing device 140 may also perform a morphological operation on the air mask VIII to determine a modified air mask VIII. For example, the processing device 140 may perform a closing operation (a dilation operation followed by an erosion operation) on the air mask VIII. The size of the structuring element used in the closing operation may be 10 × 10, which is much smaller than the size of the air mask VIII.
Fig. 13A and 13B are exemplary air mask seed images of a portion of an object shown without truncation according to some embodiments of the present application. The air mask seed image shown in fig. 13A is generated according to the process shown in fig. 12A. By thresholding the air mask seed image VII, a corresponding air mask VII may be generated. The air mask seed image shown in fig. 13B is generated according to the process shown in fig. 12B. By thresholding the air mask seed image VIII, a corresponding air mask VIII can be generated. Fig. 13C shows an exemplary filtered back-projection seed image generated based on the same raw data of the same object as in fig. 13A and 13B. The filtered backprojection seed image shown in fig. 13C is generated according to a filtered backprojection algorithm.
Fig. 13D and 13E are exemplary air mask seed images of another portion of an object with truncation, as shown in accordance with some embodiments of the present application. Similar to the air mask seed image VII in fig. 13A, the air mask seed image VII' shown in fig. 13D is generated according to the process shown in fig. 12A. Similar to the air mask seed image VIII, the air mask seed image VIII' shown in fig. 13E is generated according to the process shown in fig. 12B. Fig. 13F shows an exemplary filtered back projection seed image generated based on the same raw data of the same object as in fig. 13D and 13E. The filtered backprojection seed image shown in fig. 13F is generated according to a filtered backprojection algorithm.
As shown in fig. 13A and 13B, background information (e.g., the bed plate, air) and the object can be effectively identified in the air mask seed images VII and VIII. The air mask seed image VII contains apertures between regions of interest of the subject (e.g., each aperture corresponding to an air region), while the air mask seed image VIII contains a smaller number (or count) of air regions that are more connected. As shown in fig. 13D and 13E, background information (e.g., the bed plate, air) and the object can likewise be effectively identified in the air mask seed image VII' and the air mask seed image VIII'. The air mask seed image VII' contains apertures between regions of interest of the subject (e.g., each aperture corresponding to an air region), while the air mask seed image VIII' contains a smaller number (or count) of air regions that are more connected. Accordingly, an iterative reconstruction based on the air mask VIII (e.g., an air mask generated based on the air mask seed image VIII or the air mask seed image VIII') may involve more computation, but may be safer (i.e., less likely to exclude portions of the object), than an iterative reconstruction based on the air mask VII (e.g., an air mask generated based on the air mask seed image VII or the air mask seed image VII').
FIG. 14A is a flow chart illustrating an exemplary process of reconstructing at least one image according to some embodiments of the present application. In some embodiments, at least one operation of process 1400 may be implemented in imaging system 100 shown in fig. 1. For example, process 1400 may be stored in storage device 150 and/or memory 220 in the form of instructions (e.g., an application) and invoked and/or executed by a processing device 140 (e.g., processor 210 of computing device 200 as shown in fig. 2, CPU 340 of mobile device 300 as shown in fig. 3, at least one module of processing device 140 as shown in fig. 4, etc.). As another example, a portion of process 1400 may be implemented on imaging device 110. The operations of the illustrated process presented below are intended to be illustrative. In some embodiments, process 1400 may be implemented with at least one additional operation not described, and/or without at least one operation described herein. Further, the order of the operations of process 1400 shown in FIG. 14A and described below is not intended to be limiting.
In operation 1401, the processing device 140 (e.g., acquisition module 402) may acquire raw data detected by an imaging device (e.g., imaging device 110). The raw data may reflect attenuation information of radiation rays (e.g., X-rays) passing through the object, and may generally be used to generate at least one image associated with the object. In some embodiments, raw data may be detected and/or collected by imaging device 110 from at least one angle. The raw data may include at least one set of data corresponding to at least one angle. In some embodiments, a set of raw data collected from the same angle may also be referred to as a view. More description of the raw data may be found elsewhere in this application (e.g., operation 501 of process 500 and its associated description).
In some embodiments, the acquisition module 402 may obtain raw data from at least one component of the imaging system 100 (e.g., the imaging device 110, a terminal (e.g., the terminal 130), a storage device (e.g., the storage device 150), etc.). Alternatively or additionally, the acquisition module 402 may obtain raw data from an external source via the network 120. For example, the acquisition module 402 may obtain raw data from a medical database.
In operation 1403, the processing device 140 (e.g., the seed image generation module 404) may generate a seed image. In some embodiments, the seed image generation module 404 may generate the seed image by performing filtered backprojection on the raw data obtained in operation 1401. The seed image may include at least one element. Each element may be a pixel or voxel corresponding to a spatial point of the object. In an exemplary filtered backprojection operation, the seed image generation module 404 may perform filtering on at least one set (e.g., each set) of the raw data. In some embodiments, the filtering may be performed using, for example, a high-pass filter, a ramp filter, or the like, or any combination thereof. In some embodiments, each set of raw data (also referred to as each view) may be convolved with the kernel of the filter to generate a filtered view. The seed image generation module 404 may then perform backprojection on the at least one filtered view to reconstruct the seed image.
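For illustration only, the filtering and backprojection described above may be sketched as follows in Python, assuming a parallel-beam geometry in which each row of the sinogram is one view; the ideal ramp (Ram-Lak) kernel and all names are assumptions of the sketch, not a definitive implementation of the seed image generation module 404.

import numpy as np

def ramp_filter_views(sinogram):
    """Filter each view (row) with an ideal ramp kernel in the frequency
    domain, the usual filtering step of filtered backprojection."""
    n_det = sinogram.shape[1]
    ramp = np.abs(np.fft.fftfreq(n_det))
    return np.real(np.fft.ifft(np.fft.fft(sinogram, axis=1) * ramp, axis=1))

def backproject(filtered, angles, size):
    """Smear each filtered view back across the image grid with linear
    interpolation along the detector."""
    img = np.zeros((size, size))
    xs = np.arange(size) - size / 2.0
    X, Y = np.meshgrid(xs, xs)
    n_det = filtered.shape[1]
    for view, theta in zip(filtered, angles):
        # detector coordinate of each pixel for this view angle
        t = X * np.cos(theta) + Y * np.sin(theta) + n_det / 2.0
        t0 = np.clip(t.astype(int), 0, n_det - 2)
        w = np.clip(t - t0, 0.0, 1.0)
        img += (1 - w) * view[t0] + w * view[t0 + 1]
    return img * np.pi / len(angles)

# seed = backproject(ramp_filter_views(raw), angles, size=512)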
In operation 1405, the processing device 140 (e.g., the seed image correction module 410) may determine a corrected seed image by subtracting the outer ring background from the seed image. The outer ring background may correspond to the background of the outer ring. The outer ring may correspond to a region between a (scanned) field of view (FOV) of the imaging device 110 and a gantry aperture of the imaging device 110.
As shown in fig. 14B, when the imaging device 110 images the object 1453, the object 1453 may be located within a gantry aperture 1451 of the imaging device 110. The imaging device 110 may have a scanned field of view 1452 within the gantry aperture 1451. The scanned field of view 1452 may define the size of a two-dimensional or three-dimensional region of an image, and may include an image region having a region of interest (ROI). The area between the gantry aperture 1451 and the scanned field of view 1452 may be referred to as an outer ring 1454. The center of the scanned field of view 1452 and the center of the gantry aperture 1451 may coincide at a center O, which is also the center of the outer ring 1454. In some embodiments, if the object 1453 has a relatively large size, a portion of the object 1453 (e.g., shaded region 1455) may extend outside of the scanned field of view 1452 and into the outer ring 1454. In some embodiments, image information (including background information) of the portion of the object 1453 (e.g., shaded region 1455) in the outer ring 1454 can be used in at least one forward projection of the iterative reconstruction to reconstruct an image. In some embodiments, an offset may be introduced in the reconstructed image due to the lack of data related to the portion of the object 1453 in the outer ring 1454. The offset may include an offset in the CT values of the image background (e.g., air), a reconstruction bias of the portion of the object 1453 located in the outer ring (e.g., the shaded region 1455), and so on. In some embodiments, it may be desirable to subtract the outer ring background (i.e., the background information in the outer ring 1454) from the seed image to generate a corrected seed image. In some embodiments, the corrected seed image may be generated according to process 1600, described below.
In operation 1407, the processing device 140 (e.g., the reconstruction module 408) may reconstruct at least one image. In some embodiments, the reconstruction module 408 may reconstruct at least one image by performing an iterative reconstruction based on the corrected seed image determined in operation 1405 and the raw data obtained in operation 1401. In some embodiments, the at least one image may comprise a 3D image (or volumetric image). Each image of the at least one image may include at least one element. Each element of the at least one element may be a pixel or a voxel.
In some embodiments, the iterative reconstruction may include at least one forward projection and at least one backprojection. The corrected seed image may provide an initial image estimate for the at least one image in the iterative reconstruction. By subtracting the outer ring background from the seed image determined in operation 1403, the offset of the corrected seed image may be reduced or eliminated. That is, the corrected seed image may include relatively little or no offset. Accordingly, an iterative reconstruction based on the corrected seed image can reduce the number of iterations, which can improve the efficiency of the iterative reconstruction. In addition, the quality of the reconstructed at least one image may be improved. Exemplary iterative reconstruction algorithms may include adaptive statistical iterative reconstruction (ASIR), model-based iterative reconstruction (MBIR), iterative reconstruction in image space (IRIS), sinogram-affirmed iterative reconstruction (SAFIRE), dual model-based iterative reconstruction (DMBIR), adaptive iterative dose reduction (AIDR), iterative model reconstruction (IMR), and the like, or any combination thereof.
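As an illustration of why a low-offset initial estimate reduces the iteration count, a generic gradient-style update loop is sketched below; forward_project and backproject stand in for the system's projector pair and, like the step size, are assumptions of the sketch rather than the method of this application.

import numpy as np

def iterative_reconstruction(raw, seed, forward_project, backproject,
                             n_iters=10, step=1.0):
    # Start from the (corrected) seed image; the closer it is to the
    # solution, the fewer iterations are needed for the residual to shrink.
    image = seed.copy()
    for _ in range(n_iters):
        residual = raw - forward_project(image)       # mismatch in the raw-data domain
        image = image + step * backproject(residual)  # push mismatch back to image space
    return image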
It should be noted that the above description of process 1400 is provided for illustrative purposes and is not intended to limit the scope of the present application. Various modifications and changes may occur to those skilled in the art in light of the description herein. However, such modifications and changes do not depart from the scope of the present application. In some embodiments, the seed image generation module 404 may generate an initial image by performing filtered backprojection on the raw data and determine the seed image by performing downsampling on the initial image. Alternatively, the seed image generation module 404 may perform downsampling on the raw data and generate the seed image by performing filtered backprojection on the downsampled raw data. Alternatively, the seed image generation module 404 may select a portion of the raw data collected from a portion of the at least one angle and generate the seed image by performing filtered backprojection on the selected portion of the raw data.
FIG. 15 is a schematic diagram of an exemplary seed image correction module shown in accordance with some embodiments of the present application. Seed image correction module 410 may be implemented on computing device 200 (e.g., processor 210 as shown in fig. 2 or CPU 340 as shown in fig. 3). The seed image correction module 410 may include a coordinate transformation unit 1502, an element extraction unit 1504, an element division unit 1506, a background image generation unit 1508, and an image subtraction unit 1510.
The coordinate transformation unit 1502 may be configured to perform coordinate transformation of the image or elements in the image. For example only, the coordinate transformation unit 1502 may perform a first coordinate transformation of the seed image from a first coordinate system (e.g., a cartesian coordinate system) to a second coordinate system (e.g., a polar coordinate system) to determine the intermediate image. The seed image may comprise at least one first element in a first coordinate system and the intermediate image may comprise at least one second element in a second coordinate system. Each first or second element may be a pixel or a voxel. In some embodiments, the intermediate image and the seed image may have substantially the same image information expressed in different coordinate systems.
The element extraction unit 1504 may be configured to extract at least one element from an image. For example only, the element extraction unit 1504 may extract at least one second element corresponding to the outer ring from the at least one second element in the second coordinate system (i.e., the polar coordinate system). In some embodiments, the element extraction unit 1504 may determine a first maximum pole diameter of the scanned field of view of the imaging device 110 and a second maximum pole diameter of the gantry aperture. The element extraction unit 1504 may determine the area between the first maximum pole diameter and the second maximum pole diameter as the outer ring. The element extraction unit 1504 may extract the at least one second element located in the outer ring.
The element division unit 1506 may be configured to divide the at least one extracted second element into at least one group based on the position of the at least one extracted second element in the second coordinate system. Each group of extracted second elements may correspond to a sub-region of the outer ring. In some embodiments, the element division unit 1506 may divide the at least one extracted second element into at least one group according to the polar angle of the at least one extracted second element. In some embodiments, the element division unit 1506 may divide the at least one extracted second element uniformly or non-uniformly.
The background image generation unit 1508 may be configured to generate a background image corresponding to the outer ring background in the first coordinate system based on the at least one group of extracted second elements. In some embodiments, the background image may reflect background information (e.g., air) corresponding to the outer ring. Details regarding the generation of the background image may be found elsewhere in this application (e.g., operation 1607 of process 1600, process 1700, and descriptions thereof).
The image subtraction unit 1510 may be configured to subtract a background image corresponding to the outer ring background from the seed image to obtain a corrected seed image. In some embodiments, the background image may include a plurality of elements that reflect background information (e.g., element values of air) in the outer ring. In some embodiments, the background image may also be referred to as the outer ring background. The image subtraction unit 1510 may subtract the element values of the outer ring background from the seed image to obtain the corrected seed image. The corrected seed image may contain less outer ring background than the original seed image. In some embodiments, the corrected seed image may not include an outer ring background.
It should be noted that the above description is provided for illustrative purposes only, and is not intended to limit the scope of the present application. Various modifications and changes may occur to those skilled in the art in light of the description herein. However, such modifications and changes do not depart from the scope of the present application. In some embodiments, the seed image correction module 410 may also include a storage unit (not shown in fig. 15). The storage unit may be configured to store data generated by the units of the seed image correction module 410.
FIG. 16 is a flow diagram illustrating an exemplary process of generating a corrected seed image according to some embodiments of the present application. In some embodiments, at least one operation of process 1600 may be implemented in imaging system 100 shown in fig. 1. For example, process 1600 may be stored in storage device 150 and/or memory 220 in the form of instructions (e.g., an application) and invoked and/or executed by a processing device 140 (e.g., processor 210 of computing device 200 as shown in fig. 2, CPU 340 of mobile device 300 as shown in fig. 3, at least one module of processing device 140 as shown in fig. 4, etc.). As another example, a portion of process 1600 may be implemented on imaging device 110. The operations of the illustrated processes presented below are intended to be illustrative. In some embodiments, process 1600, when implemented, may add at least one additional operation not described, and/or delete at least one operation described herein. Further, the order of operations in process 1600 shown in FIG. 16 and the order of operations described below is not intended to be limiting. In some embodiments, operation 1405 of process 1400 may be performed in accordance with process 1600.
In operation 1601, the processing device 140 (e.g., the coordinate transformation unit 1502) may determine an intermediate image based on the seed image. In some embodiments, the seed image may include at least one first element in a first coordinate system. The intermediate image may comprise at least one second element in a second coordinate system. Each first element may be a pixel or a voxel. Each second element may be a pixel or a voxel. The coordinate transformation unit 1502 may generate an intermediate image by performing a first coordinate transformation of the seed image from the first coordinate system to the second coordinate system.
In some embodiments, the first coordinate system may be a Cartesian coordinate system and the second coordinate system may be a polar coordinate system. In some embodiments, the origin of the first coordinate system and/or the origin of the second coordinate system may be located at the center of the gantry aperture of the gantry 111. For example, the origin of the first coordinate system and the origin of the second coordinate system may coincide, and both may coincide with the center of the outer ring (as shown in fig. 18).
Fig. 18 is a schematic diagram of an exemplary seed image of an object according to some embodiments of the present application. The seed image is a two-dimensional (2D) image. As shown in fig. 18, the outer ring refers to the area between the gantry aperture of the imaging device 110 and the scanned field of view, and an object 1804 is positioned within the scanned field of view 1803. The outer ring background 1802 includes background information such as air, bed plate information, etc., or any combination thereof. The seed image shown in fig. 18 is represented in a Cartesian coordinate system, whose horizontal axis is the X axis and whose vertical axis is the Y axis in fig. 18. The origin O of the Cartesian coordinate system coincides with the center of the outer ring.
FIG. 19 is a schematic diagram of an exemplary intermediate image of an object shown in accordance with some embodiments of the present application. The intermediate image is determined from a first coordinate transformation performed on the seed image shown in fig. 18. The intermediate image shown in fig. 19 is located in a polar coordinate system. The origin O of the polar coordinate system is also located at the center of the outer ring. The horizontal axis of the polar coordinate system is the polar angle θ in fig. 19, and the vertical axis of the polar coordinate system is the polar diameter r in fig. 19.
In some embodiments, the intermediate image and the seed image may have substantially the same image information expressed in different coordinate systems. In some embodiments, the at least one second element in the second coordinate system may not directly correspond to the at least one first element in the first coordinate system. For example, the second element may have a corresponding position between two first elements in the seed image (i.e., the corresponding position of the second element may occupy a portion of each of the two first elements). Thus, in some embodiments, the element values of the second element in the second coordinate system may be determined by interpolation based on the element values of at least two first elements in the first coordinate system.
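For example only, the first coordinate transformation with bilinear interpolation might be sketched as below; the output grid sizes, the centring convention, and all names are illustrative assumptions.

import numpy as np

def to_polar(image, n_r, n_theta, r_max):
    """Resample a Cartesian image onto a (radius, angle) grid centred on
    the image centre; each second element is interpolated bilinearly from
    the four nearest first elements."""
    cy, cx = (np.asarray(image.shape) - 1) / 2.0
    rs = np.linspace(0.0, r_max, n_r)
    thetas = np.linspace(0.0, 2 * np.pi, n_theta, endpoint=False)
    R, T = np.meshgrid(rs, thetas, indexing="ij")
    x = np.clip(cx + R * np.cos(T), 0, image.shape[1] - 1)
    y = np.clip(cy + R * np.sin(T), 0, image.shape[0] - 1)
    x0 = np.minimum(x.astype(int), image.shape[1] - 2)
    y0 = np.minimum(y.astype(int), image.shape[0] - 2)
    wx, wy = x - x0, y - y0
    return ((1 - wy) * (1 - wx) * image[y0, x0] +
            (1 - wy) * wx * image[y0, x0 + 1] +
            wy * (1 - wx) * image[y0 + 1, x0] +
            wy * wx * image[y0 + 1, x0 + 1])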
In operation 1603, the processing device 140 (e.g., the element extraction unit 1504) may extract the at least one second element corresponding to the outer ring from the at least one second element in the second coordinate system. In some embodiments, the element extraction unit 1504 may determine a first maximum pole diameter of the scanned field of view of the imaging device 110 and a second maximum pole diameter of the gantry aperture. The element extraction unit 1504 may determine the area between the first maximum pole diameter and the second maximum pole diameter as the outer ring. The element extraction unit 1504 may extract the at least one second element located in the outer ring. For example only, as shown in fig. 19, the maximum pole diameter r2 of the scanned field of view may be 500 millimeters and the maximum pole diameter r3 of the gantry aperture of the imaging device 110 may be 700 millimeters. The region between r2 = 500 mm and r3 = 700 mm (e.g., the region indicated by the dotted line) may be designated as the outer ring, and thus the at least one second element in this region may be extracted.
In operation 1605, the processing device 140 (e.g., the element division unit 1506) may divide the at least one extracted second element into at least one group based on the position of the at least one extracted second element in the second coordinate system. Each group of extracted second elements may correspond to a sub-region of the outer ring. In some embodiments, the element division unit 1506 may divide the at least one extracted second element into at least one group according to the polar angle of the at least one extracted second element. In some embodiments, the element division unit 1506 may divide the at least one extracted second element uniformly or non-uniformly. For example only, as shown in fig. 19, the plurality of extracted second elements corresponding to the outer ring may be uniformly divided into four groups, i.e., a first group, a second group, a third group, and a fourth group. The polar angle of the second elements in the first group may be in the range of 0° to 90°, representing the first sub-region 1902 of the outer ring. The polar angle of the second elements in the second group may be in the range of 90° to 180°, representing the second sub-region 1904 of the outer ring. The polar angle of the second elements in the third group may be in the range of 180° to 270°, representing the third sub-region 1906 of the outer ring. The polar angle of the second elements in the fourth group may be in the range of 270° to 360°, representing the fourth sub-region 1908 of the outer ring. It should be noted that in some embodiments, the at least one extracted second element may be divided into another number (or count) of groups, e.g., 5 groups, 8 groups, 10 groups, etc. In some embodiments, the at least one extracted second element may be divided according to at least one mode, which may be set according to a default setting of the imaging system 100 or preset by a user or operator via the terminal 130.
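A minimal sketch of the extraction (operation 1603) and the grouping (operation 1605) follows, assuming the polar image is laid out with one pole diameter per row and one polar angle per column; the radii and the sector count mirror the example above but are otherwise arbitrary.

import numpy as np

def outer_ring_sectors(polar_img, rs, r_fov=500.0, r_bore=700.0, n_sectors=4):
    # rs: pole diameter of each row of polar_img.
    # Keep only the rows whose pole diameter falls inside the outer ring,
    # then split the columns (polar angles) into uniform sectors.
    ring = polar_img[(rs >= r_fov) & (rs <= r_bore), :]
    return np.array_split(ring, n_sectors, axis=1)  # one block per sub-region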
In operation 1607, the processing device 140 (e.g., the background image generation unit 1508) may generate a background image corresponding to the outer ring background in the first coordinate system based on the at least one group of extracted second elements. In some embodiments, the background image may reflect background information (e.g., air) corresponding to the outer ring.
In some embodiments, the background image generation unit 1508 may determine at least one reference value in each sub-region of the outer ring along a first direction (e.g., radial) of the second coordinate system. The background image generation unit 1508 may determine an initial circular image including at least one third element in the first coordinate system by performing a second coordinate transformation of the at least one extracted second element from the second coordinate system (i.e., polar coordinate system) to the first coordinate system (i.e., cartesian coordinate system). The initial circular image may correspond to an outer ring. The background image generation unit 1508 may further modify the initial circular image by performing a homogenization operation to obtain a background image corresponding to the outer ring background. In some embodiments, the homogenization operation may include a rotation operation, a superposition operation, and a normalization operation. More description about the generation of the background image can be found elsewhere in the present application (e.g., fig. 17, 22, and 24 and their related descriptions).
In operation 1609, the processing device 140 (e.g., the image subtraction unit 1510) may subtract the background image corresponding to the outer ring background from the seed image to obtain the corrected seed image. In some embodiments, the background image may include at least one element reflecting background information (e.g., an element value of air) in the outer ring. In some embodiments, the background image may also be referred to as the outer ring background. The image subtraction unit 1510 may subtract the element values of the outer ring background from the seed image to obtain the corrected seed image. The corrected seed image may include less outer ring background than the original seed image. In some embodiments, the corrected seed image may not include an outer ring background.
It should be noted that the above description of process 1600 is provided for illustrative purposes and is not intended to limit the scope of the present application. Various modifications and changes may occur to those skilled in the art in light of the description herein. However, such modifications and changes do not depart from the scope of the present application. In some embodiments, operation 1603 and/or operation 1601 may be performed in different ways. For example only, processing device 140 may extract at least one first element corresponding to the outer ring from the at least one first element in the first coordinate system. The processing device 140 may also perform a first coordinate transformation of the at least one extracted first element from the first coordinate system to the second coordinate system to determine the at least one extracted second element.
In some embodiments, operation 1601 may be omitted and no coordinate transformation may be required to generate the corrected seed image. For example only, the processing device 140 may extract at least one element of the initial circular image corresponding to the outer ring and generate at least one rotated circular image by rotating the at least one extracted element at least once relative to the center of the outer ring and by one angle in each rotation. In some embodiments, the angle may be less than 360 °. In some embodiments, the processing device 140 may modify at least one of the at least one rotated circular image by interpolation. The processing device 140 may generate a background image corresponding to the outer ring background based on the initial circular image and the at least one rotated circular image. For example, the processing device 140 may generate an overlay image by overlaying the initial circular image and the at least one rotated circular image, and generate a background image corresponding to the outer ring background by normalizing the overlay image. The processing device 140 may subtract the background image corresponding to the outer ring background from the seed image to obtain a corrected seed image.
FIG. 17 is a flow diagram of an exemplary process of generating a background image, shown in accordance with some embodiments of the present application. In some embodiments, at least one operation of process 1700 may be implemented in imaging system 100 shown in fig. 1. For example, the process 1700 may be stored in the storage device 150 and/or memory 220 in the form of instructions (e.g., an application program) and invoked and/or executed by a processing device 140 (e.g., the processor 210 of the computing device 200 shown in fig. 2, the CPU 340 of the mobile device 300 shown in fig. 3, at least one module of the processing device 140 shown in fig. 4, etc.). As another example, a portion of process 1700 may be implemented on imaging device 110. The operations of the illustrated process presented below are intended to be illustrative. In some embodiments, process 1700, when implemented, may add at least one additional operation not described, and/or prune at least one operation described herein. Further, the order of the operations of the process 1700 shown in FIG. 17 and the order of the operations described below are not limiting. In some embodiments, operation 1607 of process 1600 may be performed in accordance with process 1700.
In operation 1701, the processing device 140 (e.g., the background image generation unit 1508) may determine at least one reference value in each sub-region of the outer ring along the first direction of the second coordinate system. The first direction of the second coordinate system may refer to a radial direction (e.g., a radial direction as shown in fig. 19). In some embodiments, the at least one extracted second element may be distributed in at least one row along the first direction of the second coordinate system. The reference value may be a median (or an average, a maximum, a minimum, etc.) of the element values of the extracted second element in the at least one row. Each reference value may correspond to one of the at least one row. Each row may correspond to a particular pole diameter (e.g., the pole diameter may be in the range of 500-700 millimeters as shown in fig. 19). Thus, the at least one reference value and the pole diameter of the outer ring may have a specific relationship as shown in fig. 20.
In some embodiments, for each sub-region of the outer ring and in each of the at least one row, the background image generation unit 1508 may specify the median value of the extracted second elements in each row of each sub-region as the reference value for that row of that sub-region. The median value (also referred to as the middle value) may be the value that separates the upper half of the element values of the extracted second elements in each row of each sub-region from the lower half thereof. For example only, if there are five element values (e.g., a first element value, a second element value, a third element value, a fourth element value, and a fifth element value) arranged from small to large (or from large to small), the third element value may be the median value. In some embodiments, the number (or count) of extracted second elements indicating background information may exceed the number (or count) of extracted second elements indicating information related to the object, so the median value may be an element value of the background information. For example only, as shown in fig. 19, for the fourth sub-region 1908 and the row r4, the background image generation unit 1508 may sort the extracted second elements in the row r4 of the fourth sub-region 1908 according to the corresponding element values. The background image generation unit 1508 may determine the median value of the extracted second elements based on the sorted extracted second elements, and specify the median value as the reference value of the row r4 of the fourth sub-region 1908 (e.g., the element value C1 shown in fig. 20). Fig. 20 shows a schematic diagram of a plurality of reference values in a plurality of rows in the fourth sub-region 1908. As shown in fig. 20, each reference value may correspond to a row, and each reference value may be the median value of the extracted second elements in that row of the fourth sub-region 1908. For example, the reference value C1 may be the median value of the extracted second elements in the row r4 of the fourth sub-region 1908.
Alternatively or additionally, for each sub-region of the outer ring and in each of the at least one row, the background image generation unit 1508 may determine a portion of the extracted second elements in each row of each sub-region. The portion of the extracted second elements may have element values within a particular range. Element values within the particular range may represent background information, such as air. For example only, for a sub-region of the outer ring and one of the at least one row, the background image generation unit 1508 may determine whether the element value of an extracted second element in the row of the sub-region is in the particular range. In response to determining that the element value of the extracted second element in the row of the sub-region is out of the particular range (i.e., the extracted second element may represent information related to the object), the background image generation unit 1508 may disregard the extracted second element when determining the reference value. In response to determining that the element value of the extracted second element in the row of the sub-region is within the particular range (i.e., the extracted second element may represent background information (e.g., air)), the background image generation unit 1508 may consider the extracted second element in determining the reference value. The background image generation unit 1508 may determine the portion of the extracted second elements (i.e., at least one extracted second element having an element value within the particular range) in each row of each sub-region and determine an average of the element values of the portion of the extracted second elements. The background image generation unit 1508 may specify the average of the portion of the extracted second elements as the reference value for that row of that sub-region.
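The two strategies for a row's reference value (the row median, or the average over values in an air-like range) could be sketched as below; the numeric range standing in for "air" is a placeholder assumption, not a value taken from this application.

import numpy as np

def reference_values(sector, air_range=(-1100.0, -900.0)):
    """One reference value per radial row of a sector: by default the row
    median; where values inside the assumed air range exist, their mean
    is used instead and object-like values are disregarded."""
    refs = np.median(sector, axis=1)
    lo, hi = air_range
    for i, row in enumerate(sector):
        air = row[(row >= lo) & (row <= hi)]
        if air.size:
            refs[i] = air.mean()
    return refs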
In operation 1703, the processing device 140 (e.g., the background image generation unit 1508) may determine an initial circular image including at least one third element in the first coordinate system by performing a second coordinate transformation of the at least one extracted second element from the second coordinate system (i.e., the polar coordinate system) to the first coordinate system (i.e., the Cartesian coordinate system). In some embodiments, the initial circular image may correspond to the outer ring. In some embodiments, at least one element value of the at least one third element may be determined based on the at least one reference value, and the at least one element value of the at least one third element may be different from the element value of the corresponding first element in the seed image.
By way of example only, fig. 21A is a schematic diagram of an exemplary initial circular image shown in accordance with some embodiments of the present application. The initial circular image may be generated from a second coordinate transformation of the at least one extracted second element. As shown in fig. 21A, the initial circular image may include four portions, namely a first portion 2102, a second portion 2104, a third portion 2106 and a fourth portion 2108. The first portion 2102 may correspond to the first sub-area 1902. The second portion 2104 may correspond to the second sub-region 1904. The third portion 2106 may correspond to the third sub-region 1906. The fourth portion 2108 may correspond to the fourth sub-region 1908. The initial circular image may include at least one third element. An element value of a third element in each section may be determined based on at least one reference value in the respective sub-region. For example, an element value for the third element in the first portion 2102 may be determined based on at least one reference value in the first sub-region 1902 of the outer ring.
In some embodiments, after the second coordinate transformation, the first portion of the third element in the initial circular image may directly correspond to a portion of the extracted second element in the intermediate image, so each element value of the first portion of the third element may be directly determined according to a relationship between the reference value and the polar diameter (as shown in fig. 20), and thus the reference value may be assigned to each element value of the first portion of the third element. Alternatively or additionally, the second portion of the third element in the initial circular image may not directly correspond to any extracted second element in the intermediate image, so each element value of the second portion of the third element may be determined by interpolation based on the relationship between the reference value and the polar diameter (as shown in fig. 20). Accordingly, in some embodiments, the background image generation unit 1508 may perform interpolation on a portion of the transformed extracted second elements based on at least two reference values to determine an element value of the third element. Interpolation algorithms may include nearest neighbor pixel interpolation, linear interpolation, and the like.
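For illustration, the second coordinate transformation might be carried out by evaluating, for every Cartesian pixel in the ring, its sector's reference-value-versus-pole-diameter relationship (fig. 20) with linear interpolation; all names below are illustrative assumptions.

import numpy as np

def initial_circular_image(shape, rs, sector_refs, r_fov=500.0, r_bore=700.0):
    # rs: ascending pole diameters of the ring rows; sector_refs: one
    # reference-value array per sector, aligned with rs.  Each third
    # element takes its sector's reference value, linearly interpolated
    # over the pole diameter of the pixel.
    cy, cx = (np.asarray(shape) - 1) / 2.0
    Y, X = np.indices(shape)
    r = np.hypot(X - cx, Y - cy)
    theta = np.mod(np.arctan2(Y - cy, X - cx), 2 * np.pi)
    n_sectors = len(sector_refs)
    sector = (theta / (2 * np.pi) * n_sectors).astype(int) % n_sectors
    out = np.zeros(shape)
    in_ring = (r >= r_fov) & (r <= r_bore)
    for s, refs in enumerate(sector_refs):
        sel = in_ring & (sector == s)
        out[sel] = np.interp(r[sel], rs, refs)
    return out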
In operation 1705, the processing device 140 (e.g., the background image generation unit 1508) may modify the initial circular image by performing a homogenization operation to obtain a background image corresponding to the outer ring background. The background image may reflect background information (e.g., air) related to the outer ring. Fig. 21B is a schematic illustration of an exemplary background image shown according to some embodiments of the present application. The background image shown in fig. 21B may be generated by modifying the initial circular image shown in fig. 21A. As shown in fig. 21B, the background image is more uniform than the initial circular image in fig. 21A.
In some embodiments, the homogenization operation may include a rotation operation, a superposition operation, a normalization operation, or the like, or a combination thereof. For example, the background image generation unit 1508 may generate at least one rotated circular image by rotating the at least one third element of the initial circular image at least once with respect to the center of the outer ring and by an angle at each rotation. By performing the rotation operation, the reference values (e.g., median values) may be distributed uniformly around the ring at each radius. For example only, when the at least one third element of the initial circular image is rotated once, the position of the at least one third element may be moved to a new position to generate a rotated circular image. For a third element of the initial circular image located at a first position, the element value of the corresponding element located at the first position in the rotated circular image may be different from the element value of the third element of the initial circular image. The background image generation unit 1508 may generate a superimposed image by superimposing the initial circular image and the at least one rotated circular image. The background image generation unit 1508 may also generate a background image corresponding to the outer ring background by normalizing the superimposed image. For example, the background image generation unit 1508 may divide each element (also referred to as a fourth element) in the superimposed image by n + 1, where n refers to the number of rotations, or the number (or count) of rotated circular images. As another example, the background image generation unit 1508 may rotate the at least one third element of the initial circular image at least once. At each rotation, the background image generation unit 1508 may generate a rotated circular image by rotating the at least one third element of the initial circular image by an angle with respect to the center of the outer ring, and generate another image by superimposing the initial circular image and the rotated circular image. The background image generation unit 1508 may also designate the superimposed image as the initial circular image for the next rotation. When the at least one rotation is completed, the background image generation unit 1508 may generate a background image corresponding to the outer ring background by normalizing the superimposed image. For example, the background image generation unit 1508 may divide each element (i.e., each fourth element) in the superimposed image by 2^n (n refers to the number of rotations) to generate the background image. More description of the generation of the background image may be found elsewhere in the present application (e.g., figs. 22 and 24, and their related descriptions).
FIG. 22 is a flow diagram illustrating an exemplary process of modifying an initial circular image, according to some embodiments of the present application. In some embodiments, at least one operation of process 2200 may be implemented in the imaging system 100 shown in fig. 1. For example, process 2200 may be stored in the form of instructions (e.g., an application program) in the storage device 150 and/or the memory 220 and invoked and/or executed by the processing device 140 (e.g., the processor 210 of the computing device 200 as shown in fig. 2, the CPU 340 of the mobile device 300 as shown in fig. 3, at least one module of the processing device 140 as shown in fig. 4, etc.). As another example, a portion of process 2200 may be implemented on the imaging device 110. The operations of the illustrated process presented below are intended to be illustrative. In some embodiments, process 2200, when implemented, may add at least one additional operation not described and/or omit at least one operation described herein. Further, the order of the operations of process 2200 shown in fig. 22 and the order of the operations described below are not intended to be limiting. In some embodiments, operation 1705 of process 1700 may be performed according to process 2200.
In operation 2201, the processing device 140 (e.g., the background image generation unit 1508) may generate at least one rotated circular image by rotating the at least one third element of the initial circular image at least once with respect to the center of the outer ring and by an angle at each rotation. Each rotated circular image may include at least one rotated third element. The center of the outer ring may coincide with the center of the scanned field of view and/or the gantry aperture of the imaging device 110. The angle may be less than 360°. In some embodiments, the angle in two or more rotations may be the same. Alternatively, the angles in different rotations may be different. For example, the angle at the first rotation may be 30°, and the angle at the second rotation may be 45°. The number of rotations may be set according to a default setting of the imaging system 100 or preset by a user or operator via the terminal 130.
As shown in fig. 23, at least one rotated circular image may be generated based on the initial circular image. FIG. 23 is a schematic diagram of an exemplary process of modifying an initial circular image according to some embodiments of the present application. As shown in fig. 23, the initial circular image 2301 may include four portions. The first rotated circular image 2302 may be generated by rotating the initial circular image 2301 by 30° with respect to the center O of the outer ring. The second rotated circular image 2303 may be generated by rotating the initial circular image 2301 a second time by 30° with respect to the center O of the outer ring (or, equivalently, by rotating the first rotated circular image 2302 by 30° with respect to the center O of the outer ring). It should be noted that the number of rotations is provided for illustrative purposes only and is not intended to limit the scope of the present application. The initial circular image 2301 may be rotated another number of times, e.g., 7 times, 10 times, 20 times, etc. The initial circular image may be rotated in the same direction each time (e.g., clockwise or counterclockwise).
In operation 2203, the processing device 140 (e.g., the background image generation unit 1508) may generate a new image (i.e., a superimposed image) by superimposing the initial circular image and the at least one rotated circular image. In some embodiments, the background image generation unit 1508 can superimpose the third element in the initial circular image and the corresponding rotated third element in the at least one rotated circular image. The overlay image may include at least one fourth element. Each fourth element may be a sum of the third element and the corresponding rotated third element. For example only, as shown in FIG. 23, a new image 2304 may be generated by superimposing an initial circular image 2301, a first rotated circular image 2302, and a second rotated circular image 2303.
In some embodiments, the background image generation unit 1508 may perform an interpolation operation on at least one of the at least one rotated circular image. Specifically, the background image generation unit 1508 may perform an interpolation operation on the rotated third element of the at least one rotated circular image. The background image generation unit 1508 may generate a new image (i.e., a superimposed image) by superimposing the initial circular image and the at least one (interpolated) rotated circular image.
In operation 2205, the processing device 140 (e.g., the background image generation unit 1508) may generate a background image corresponding to the outer ring background by normalizing the overlay image. The background image may reflect background information (e.g., air) related to the outer ring.
In some embodiments, the background image generation unit 1508 may divide each fourth element in the superimposed image by n + 1 (n refers to the number of rotations, or the number (or count) of rotated circular images). For example, if the superimposed image is generated based on the initial circular image and six rotated circular images, the background image generation unit 1508 may divide each fourth element in the superimposed image by 7 to generate the background image. As another example, as shown in fig. 23, the superimposed image 2304 may be generated based on the initial circular image 2301, the first rotated circular image 2302, and the second rotated circular image 2303. The background image 2305 may be generated by dividing the superimposed image 2304 (e.g., each fourth element in the superimposed image 2304) by 3.
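A minimal sketch of this rotation, superposition, and normalization, using scipy's image rotation as a stand-in for the rotation operation; the fixed angle schedule is an illustrative assumption.

import numpy as np
from scipy.ndimage import rotate

def homogenize(circ, n_rotations=6, angle=30.0):
    # Superimpose the initial circular image with n rotated copies about
    # the image centre, then normalize by (n + 1).
    acc = circ.copy()
    for k in range(1, n_rotations + 1):
        # reshape=False keeps every rotated copy on the original grid;
        # order=1 performs the (bilinear) interpolation mentioned above
        acc += rotate(circ, angle * k, reshape=False, order=1)
    return acc / (n_rotations + 1)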
FIG. 24 is a flow diagram of an exemplary process of modifying an initial circular image according to some embodiments of the present application. In some embodiments, at least one operation of process 2400 may be implemented in the imaging system 100 shown in fig. 1. For example, process 2400 may be stored in the form of instructions (e.g., an application) in the storage device 150 and/or the memory 220 and invoked and/or executed by the processing device 140 (e.g., the processor 210 of the computing device 200 as shown in fig. 2, the CPU 340 of the mobile device 300 as shown in fig. 3, at least one module of the processing device 140 as shown in fig. 4, etc.). As another example, a portion of process 2400 may be implemented on the imaging device 110. The operations of the illustrated process presented below are intended to be illustrative. In some embodiments, process 2400, when implemented, may add at least one additional operation not described and/or omit at least one operation described herein. Further, the order of the operations of process 2400 shown in fig. 24 and the order of the operations described below are not intended to be limiting. In some embodiments, operation 1705 of process 1700 may be performed in accordance with process 2400.
In operation 2401, the processing device 140 (e.g., the background image generation unit 1508) may generate a rotated circular image by rotating the at least one third element of the initial circular image by an angle with respect to the center of the outer ring. The rotated circular image may include at least one rotated third element. The center of the outer ring may be located at the center of the scanned field of view and/or the gantry aperture of the imaging device 110. The angle may be less than 360°.
In operation 2403, the processing device 140 (e.g., the background image generation unit 1508) may generate a new image (i.e., a superimposed image) by superimposing the initial circular image and the rotated circular image. In some embodiments, the background image generation unit 1508 may superimpose the third element in the initial circular image and the corresponding rotated third element in the rotated circular image. The overlay image may include at least one fourth element. Each fourth element may be a sum of the corresponding third element and the corresponding rotated third element.
In some embodiments, the background image generation unit 1508 may perform an interpolation operation on the rotated circular image. Specifically, the background image generation unit 1508 may perform an interpolation operation on the rotated third element in the rotated circular image. The background image generation unit 1508 may generate a new image (i.e., a superimposed image) by superimposing the initial circular image and the interpolated rotated circular image.
In operation 2405, the processing device 140 (e.g., the background image generation unit 1508) may determine whether a condition is satisfied. The condition may be associated with whether a specified number of rotations has been performed, whether the difference between the two superimposed images of two consecutive rotations is less than a threshold, or the like, or a combination thereof. It should be noted that, in some embodiments, when the difference between the two superimposed images of two consecutive rotations is equal to the threshold, the background image generation unit 1508 may determine either that the condition is satisfied or that the condition is not satisfied, depending on the embodiment. In response to determining that the condition is not satisfied, process 2400 may proceed to operation 2407. In operation 2407, the processing device 140 (e.g., the background image generation unit 1508) may designate the superimposed image as the initial circular image. Process 2400 may then return to operation 2401 to perform the next rotation. In some embodiments, the rotation angles of different rotations may be the same or different.
In response to determining that the condition is satisfied, process 2400 may perform operation 2409. In operation 2409, the processing device 140 (e.g., the background image generation unit 1508) may generate a background image corresponding to the outer ring background by normalizing the overlay image. The background image may reflect background information (e.g., air) related to the outer ring.
In some embodiments, the background image generation unit 1508 may divide each fourth element in the superimposed image by 2^n (n refers to the number of rotations) to generate the background image. For example, if the superimposed image is generated by performing 6 rotations, the background image generation unit 1508 may divide each fourth element in the superimposed image by 2^6 (i.e., 64) to generate the background image.
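The doubling variant of process 2400 admits a similarly short sketch; each round adds a rotated copy of the current overlay to itself, so n rounds accumulate 2^n copies and the normalization divides by 2^n. The angle schedule below follows the fig. 25 example but is otherwise an assumption.

import numpy as np
from scipy.ndimage import rotate

def homogenize_doubling(circ, angles=(30.0, 15.0)):
    acc = circ.copy()
    for ang in angles:  # e.g., halve the rotation angle each round
        acc = acc + rotate(acc, ang, reshape=False, order=1)
    return acc / (2 ** len(angles))  # n rounds -> 2**n superimposed copies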
FIG. 25 is a schematic diagram of an exemplary process of modifying an initial circular image according to some embodiments of the present application. As shown in fig. 25, two rotations may be performed. The initial circular image 2501 may include four portions. The first rotated circular image 2502 may be generated by rotating the initial circular image 2501 by 30° with respect to the center O of the outer ring. A first new image (i.e., a first superimposed image) 2503 may be generated by superimposing the initial circular image 2501 and the first rotated circular image 2502. The first superimposed image 2503 may be treated as the initial circular image in the next rotation. The second rotated circular image 2504 may be generated by rotating the first superimposed image 2503 by 15° with respect to the center O of the outer ring. A second new image (i.e., a second superimposed image) 2505 may be generated by superimposing the first superimposed image 2503 and the second rotated circular image 2504. The background image 2506 may be generated by normalizing the second superimposed image 2505. Specifically, the background image 2506 may be generated by dividing the second superimposed image 2505 (e.g., each fourth element in the second superimposed image 2505) by 4 (i.e., 2^2). That is, in fig. 25, a moving average along the angular direction is, in effect, applied to the four-sector image.
Systems and methods for iterative reconstruction based on air masks are provided. Using the first air mask, iterative reconstruction can be accelerated. In particular, the air mask may be a binary image that may distinguish between elements in the image that represent air and elements that do not represent air (e.g., elements that represent non-air). The elements representing air may not participate in the iterative reconstruction, which may speed up the iterative reconstruction. Thus, a high precision air mask that can distinguish between air and non-air can facilitate iterative reconstruction and improve the quality of the reconstructed image. In the present application, the air mask may be generated by performing a minimum back projection (described elsewhere in the present application) on the raw data. By performing a minimum back projection, an air mask seed image (also referred to as a second seed image) may be generated. Whether or not truncation is introduced in the reconstruction, the air mask seed image generated based on minimum back projection may provide a sharp contour of the object, which may facilitate segmentation or differentiation of the air region from the rest of the image by thresholding to generate the air mask. Therefore, using minimum back projection, the accuracy of the air mask can be improved.
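For illustration only, a parallel-beam sketch of the minimum back projection and the subsequent thresholding follows; the geometry, the nearest-neighbour detector sampling, and the threshold are assumptions of the sketch rather than the exact procedure of this application.

import numpy as np

def minimum_backprojection(sinogram, angles, size):
    """Assign each pixel the minimum, over all views, of the raw datum on
    the ray through that pixel; air outside the object sees a near-zero
    value from at least one angle, so it stays near zero in the result."""
    img = np.full((size, size), np.inf)
    xs = np.arange(size) - size / 2.0
    X, Y = np.meshgrid(xs, xs)
    n_det = sinogram.shape[1]
    for view, theta in zip(sinogram, angles):
        t = X * np.cos(theta) + Y * np.sin(theta) + n_det / 2.0
        t = np.clip(np.round(t).astype(int), 0, n_det - 1)
        img = np.minimum(img, view[t])  # keep the smallest contribution
    return img

# air_mask = minimum_backprojection(raw, angles, size=512) < threshold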
To accelerate the iterative reconstruction, the present application provides a corrected seed image. The (corrected) seed image may provide the initial image estimate for the iterative reconstruction. In general, the seed image may include an offset that occurs due to the lack of data in the outer ring (described elsewhere in this application). The offset may otherwise be reduced or eliminated only by increasing the number of iterations of the iterative reconstruction. In the present application, the seed image may be corrected by subtracting the outer ring background from the seed image to reduce or eliminate the offset. Thus, an initial image estimate (i.e., the corrected seed image) of good quality may be determined. An iterative reconstruction based on the corrected seed image can reduce the number of iterations, which can improve the efficiency of the iterative reconstruction and also improve the quality of the reconstructed image.
It should be noted that in some embodiments, the corrected seed image and the air mask may be used simultaneously in the iterative reconstruction, which may further accelerate the iterative reconstruction and improve the quality of the reconstructed image. For example only, the processing device 140 may reconstruct at least one image based on the corrected seed image, the first air mask, and/or the raw data.
It should be noted that the terms "first," "second," "third," "fourth," etc. are used for convenience of description only and do not denote a particular order or name.
Having thus described the basic concept, it will be apparent to those skilled in the art that the foregoing disclosure is by way of example only, and is not intended to limit the present application. Various modifications, improvements and adaptations to the present application may occur to those skilled in the art, although not explicitly described herein. Such modifications, improvements and adaptations are proposed in the present application and thus fall within the spirit and scope of the exemplary embodiments of the present application.
Also, this application uses specific terminology to describe embodiments of the application. For example, "one embodiment," "an embodiment," and/or "some embodiments" means a certain feature, structure, or characteristic described in connection with at least one embodiment of the application. Therefore, it is emphasized and should be appreciated that two or more references to "an embodiment" or "one embodiment" or "an alternative embodiment" in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, some features, structures, or characteristics of at least one embodiment of the present application may be combined as appropriate.
Moreover, those skilled in the art will appreciate that aspects of the present application may be illustrated and described in terms of several patentable species or situations, including any new and useful combination of processes, machines, manufacture, or materials, or any new and useful improvement thereon. Accordingly, aspects of the present application may be performed entirely by hardware, entirely by software (including firmware, resident software, micro-code, etc.) or by a combination of hardware and software. The above hardware or software may be referred to as a "unit", "module", or "system". Furthermore, aspects of the present application may be presented as a computer product having computer readable program code embodied in at least one computer readable medium.
A non-transitory computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated signal may take any of a variety of forms, including electro-magnetic, optical, and the like, or any suitable combination. A computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code on a computer readable signal medium may be propagated over any suitable medium, including radio, cable, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
Computer program code required for the operation of various portions of the present application may be written in any one or more programming languages, including object-oriented programming languages such as Java, Scala, Smalltalk, Eiffel, JADE, Emerald, C++, C#, VB.NET, and Python; conventional procedural programming languages such as C, Visual Basic, Fortran 2003, Perl, COBOL 2002, PHP, and ABAP; dynamic programming languages such as Python, Ruby, and Groovy; or other programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any network, such as a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet), or in a cloud computing environment, or as a service, such as software as a service (SaaS).
Furthermore, unless explicitly stated in the claims, the order of processing elements or sequences, use of numbers or letters, or use of other names is not intended to limit the order of the processes and methods described herein. While various presently contemplated embodiments of the invention have been discussed in the foregoing disclosure by way of example, it is to be understood that such detail is solely for that purpose and that the appended claims are not limited to the disclosed embodiments, but, on the contrary, are intended to cover all modifications and equivalent arrangements that are within the spirit and scope of the embodiments herein. For example, although the system components described above may be implemented by hardware devices, they may also be implemented by software-only solutions, such as installing the described system on an existing server or mobile device.
Similarly, it should be noted that in the foregoing description of embodiments of the present application, various features are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of at least one embodiment of the invention. This method of disclosure, however, is not to be interpreted as requiring more features than are expressly recited in each claim. Rather, claimed subject matter may lie in less than all features of a single disclosed embodiment.
In some embodiments, numbers expressing quantities or properties used to describe and claim certain embodiments of the present application are to be understood as being modified in some instances by the terms "about", "approximately", or "substantially". For example, unless otherwise specified, "about", "approximately", or "substantially" may indicate a ±20% variation of the value it describes. Accordingly, in some embodiments, the numerical parameters set forth in the specification and claims are approximations that may vary depending upon the desired properties of the individual embodiments. In some embodiments, the numerical parameters should be construed in light of the number of reported significant digits and by applying ordinary rounding techniques. Notwithstanding that the numerical ranges and parameters setting forth the broad scope of some embodiments of the application are approximations, the numerical values set forth in the specific examples are reported as precisely as practicable.
All patents, patent applications, patent application publications, and other materials (e.g., articles, books, specifications, publications, documents, and/or the like) referred to in this application are herein incorporated by reference in their entirety for all purposes, except to the extent any such material is inconsistent with or conflicts with this document, or limits the broadest scope of the claims now or later associated herewith. For example, if there is any inconsistency or conflict between the description, definition, and/or use of a term in this document and that in any incorporated material, the description, definition, and/or use of the term in this document shall prevail.
Finally, it should be understood that the embodiments described herein are merely illustrative of the principles of embodiments of the present application. Other variations are also possible within the scope of the present application. Thus, by way of example, and not limitation, alternative configurations of the embodiments of the present application may be viewed as being consistent with the teachings of the present application. Accordingly, the embodiments of the present application are not limited to only those embodiments explicitly described and depicted herein.

Claims (16)

1. A method of iterative reconstruction, said method for reconstructing an image, said image comprising at least one element, each element of said at least one element being a pixel or a voxel, characterized in that said method comprises:
acquiring raw data detected by an imaging device from at least one angle;
generating a first seed image by performing filtered back-projection on the raw data;
determining a first air mask by performing a minimum backprojection on the raw data; and
reconstructing at least one image by performing an iterative reconstruction based on the first seed image, the first air mask and the raw data.
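For orientation, the following is a minimal sketch of the flow of claim 1, assuming a two-dimensional parallel-beam geometry and using scikit-image's radon/iradon as stand-in projectors. Every function name, threshold, and step size below is an illustrative assumption, not taken from the patent; minimum_backprojection is sketched after claim 4.

import numpy as np
from skimage.transform import radon, iradon

def iterative_reconstruction(raw_data, angles, n_iter=10, air_threshold=0.05):
    # Step 2 of claim 1: the first seed image is a filtered back-projection.
    seed = iradon(raw_data, theta=angles, filter_name='ramp')
    # Step 3 of claim 1: the first air mask comes from a minimum back-projection;
    # pixels whose minimum ray value is near zero attenuate nothing, i.e. air.
    min_bp = minimum_backprojection(raw_data, angles)
    air_mask = min_bp > air_threshold          # True = object, False = air
    # Step 4 of claim 1: a SIRT-like iterative loop restricted to object pixels.
    image = seed * air_mask
    for _ in range(n_iter):
        simulated = radon(image, theta=angles)             # forward projection
        residual = raw_data - simulated
        correction = iradon(residual, theta=angles, filter_name=None)
        image = (image + 0.1 * correction) * air_mask      # air stays at zero
    return image

Restricting the update to object pixels is what buys speed: elements flagged as air never enter the forward or back-projection work.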
2. The method of iterative reconstruction of claim 1, wherein said at least one image comprises a 3D image or a volumetric image.
3. The method of iterative reconstruction of claim 1, wherein said first air mask includes at least one element having zero attenuation to radiation rays in said imaging device.
4. The method of iterative reconstruction of claim 1, wherein determining a first air mask by performing a minimum backprojection on the raw data comprises:
for each element of the at least one image, performing the minimum back-projection by designating, as a back-projection value for the element, a minimum value of a portion of the raw data or of modified data, the portion of the raw data being related to the element and detected from the at least one angle, and the modified data being generated by interpolation based on the raw data;
generating a second seed image based on at least one backprojection value associated with the at least one element; and
generating the first air mask by performing thresholding on the second seed image based on a second threshold.
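One possible reading of claim 4 in the same parallel-beam setting: for every pixel, interpolate the detector reading of the ray passing through it at each view angle, and keep the smallest value seen across all views. The result plays the role of the "second seed image"; thresholding it gives the first air mask. All names and geometric conventions here are assumptions.

import numpy as np

def minimum_backprojection(sinogram, angles_deg):
    n_det, n_views = sinogram.shape
    n = n_det                                    # reconstruct on an n-by-n grid
    center = (n - 1) / 2.0
    ys, xs = np.mgrid[0:n, 0:n]
    xs, ys = xs - center, ys - center
    det_coords = np.arange(n_det) - (n_det - 1) / 2.0
    min_bp = np.full((n, n), np.inf)
    for j, theta in enumerate(np.deg2rad(angles_deg)):
        # Signed detector coordinate of the ray through each pixel at this view.
        t = xs * np.cos(theta) + ys * np.sin(theta)
        # Linear interpolation between detector channels stands in for the
        # "modified data generated by interpolation" of the claim.
        vals = np.interp(t, det_coords, sinogram[:, j])
        np.minimum(min_bp, vals, out=min_bp)
    return min_bp                                # the "second seed image"

# first_air_mask = minimum_backprojection(sino, angles) > second_threshold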
5. The method of iterative reconstruction of claim 4, wherein said raw data is detected by at least one detector of said imaging device, said at least one detector being arranged along a channel direction and a row direction, and wherein determining the first air mask by performing the minimum back-projection on said raw data further comprises:
performing one-dimensional low-pass filtering on the raw data in the channel direction before performing the minimum back-projection.
6. The method of iterative reconstruction of claim 4, wherein said raw data corresponds to a sinogram, and wherein determining the first air mask by performing the minimum back-projection on said raw data further comprises:
performing two-dimensional low-pass filtering on the raw data before performing the minimum back-projection.
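Claims 5 and 6 low-pass filter the raw data before the minimum back-projection, so that isolated noisy readings cannot punch false air holes into the mask. A sketch with scipy, reusing the sinogram layout above (detector channels along axis 0, views along axis 1); the filter sizes are assumed, not specified by the patent.

from scipy.ndimage import uniform_filter1d, gaussian_filter

# Claim 5: one-dimensional smoothing along the detector channel direction only.
smoothed_1d = uniform_filter1d(sinogram, size=5, axis=0)

# Claim 6: two-dimensional smoothing of the whole sinogram.
smoothed_2d = gaussian_filter(sinogram, sigma=2.0)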
7. The method of iterative reconstruction of claim 4, wherein determining a first air mask by performing a minimum backprojection on the raw data further comprises:
generating a second air mask by performing thresholding on the first seed image based on a third threshold; and
modifying the first air mask by performing a Boolean operation based on the first air mask and the second air mask.
8. The method of iterative reconstruction of claim 4, wherein determining a first air mask by performing a minimum backprojection on the raw data further comprises:
modifying the first seed image by at least one of subtracting an outer ring background from the first seed image or performing two-dimensional low-pass filtering on the first seed image;
generating a second air mask by performing thresholding on the modified first seed image based on a third threshold; and
modifying the first air mask by performing a Boolean operation based on the first air mask and the second air mask.
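Claims 7 and 8 derive a second air mask by thresholding the (optionally cleaned-up) first seed image, then merge the two masks with a Boolean operation. A union of the two object regions is one plausible choice, since a pixel flagged as object by either source is unsafe to skip; the patent only requires some Boolean combination. Variable names and threshold values continue the earlier sketches and are assumptions.

import numpy as np

second_threshold, third_threshold = 0.05, 0.1    # assumed values
mask_from_min_bp = min_bp > second_threshold     # first air mask (claim 4)
mask_from_fbp = seed > third_threshold           # second air mask (claims 7-8)
combined_mask = np.logical_or(mask_from_min_bp, mask_from_fbp)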
9. The method of iterative reconstruction of claim 8, wherein determining a first air mask by performing a minimum backprojection on the raw data further comprises:
performing a morphological operation on the modified first air mask.
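Claim 9 tidies the merged mask with a morphological operation. A binary closing followed by a small dilation is a typical choice: closing fills small holes inside the object so low-attenuation tissue is not mislabeled as air, and dilation keeps a safety margin of object pixels at edges. Both the choice of operation and the 3-by-3 structuring element are assumptions.

import numpy as np
from scipy.ndimage import binary_closing, binary_dilation

structure = np.ones((3, 3), dtype=bool)
cleaned_mask = binary_closing(combined_mask, structure=structure)
cleaned_mask = binary_dilation(cleaned_mask, structure=structure)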
10. The method of iterative reconstruction of claim 1, wherein said first air mask is determined from at least one block of pixels, wherein each block of said at least one block of pixels comprises N×M pixels.
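Claim 10 evaluates the mask on N-by-M pixel blocks rather than single pixels, which lets the projectors skip whole tiles at once. A sketch in which a block counts as object if any pixel inside it is object; the block size and the any-pixel rule are illustrative.

import numpy as np

def blockwise_mask(pixel_mask, n=8, m=8):
    h, w = pixel_mask.shape
    # Pad so the image divides evenly into n-by-m tiles.
    padded = np.pad(pixel_mask, ((0, (-h) % n), (0, (-w) % m)),
                    constant_values=False)
    tiles = padded.reshape(padded.shape[0] // n, n, padded.shape[1] // m, m)
    block_flags = tiles.any(axis=(1, 3))          # one flag per n-by-m block
    # Expand back to pixel resolution for use inside the projectors.
    return np.kron(block_flags, np.ones((n, m), dtype=bool))[:h, :w]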
11. The method of iterative reconstruction of claim 1, wherein said iterative reconstruction includes at least one forward projection and at least one back projection, and said first air mask is used in said at least one forward projection and said at least one back projection.
12. The method of iterative reconstruction of claim 1, further comprising:
updating the first air mask in at least one iteration of the iterative reconstruction.
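Claims 11 and 12 together: the air mask gates both the forward projection and the back-projection inside each iteration, and may be refreshed as the estimate improves. An illustrative continuation of the claim-1 loop (radon, iradon, image, air_mask, raw_data, angles, and n_iter as defined there); the refresh interval and threshold are assumed placeholders.

refresh_threshold = 0.05                                # assumed placeholder
for it in range(n_iter):
    simulated = radon(image * air_mask, theta=angles)   # masked forward projection
    residual = raw_data - simulated
    update = iradon(residual, theta=angles, filter_name=None)
    image = image + 0.1 * update * air_mask             # masked back-projection update
    if it % 5 == 4:
        # Claim 12: re-derive the mask from the current estimate.
        air_mask = image > refresh_threshold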
13. The method of iterative reconstruction of claim 1, further comprising:
modifying the first seed image by subtracting an outer ring background from the first seed image.
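Claim 13 subtracts an "outer ring background" from the first seed image. One plausible reading: estimate a constant offset from pixels in a thin annulus near the image border, outside the scan field of view, and remove it so that true air regions center on zero before thresholding. The seed variable continues the claim-1 sketch, and the ring radii are assumptions.

import numpy as np

h, w = seed.shape
yy, xx = np.mgrid[0:h, 0:w]
r = np.hypot(yy - (h - 1) / 2.0, xx - (w - 1) / 2.0)
ring = (r > 0.45 * min(h, w)) & (r < 0.50 * min(h, w))
seed = seed - np.median(seed[ring])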
14. A system for iterative reconstruction, the system for reconstructing an image, comprising:
at least one processor; and
a memory configured to store instructions, wherein the at least one processor, when executing the instructions, is configured to:
acquire raw data detected by an imaging device from at least one angle;
generate a first seed image by performing filtered back-projection on the raw data;
determine a first air mask by performing a minimum back-projection on the raw data; and
reconstruct at least one image by performing an iterative reconstruction based on the first seed image, the first air mask, and the raw data.
15. An iterative reconstruction system for reconstructing an image, comprising an acquisition module, a seed image generation module, an air mask generation module, and a reconstruction module, wherein:
the acquisition module is configured to acquire raw data detected by an imaging device from at least one angle;
the seed image generation module is configured to generate a first seed image by performing filtered back-projection on the raw data;
the air mask generation module is configured to determine a first air mask by performing a minimum back-projection on the raw data; and
the reconstruction module is configured to reconstruct at least one image by performing an iterative reconstruction based on the first seed image, the first air mask, and the raw data.
16. A computer-readable storage medium, wherein the storage medium stores computer instructions that, when executed, cause the method of iterative reconstruction of any one of claims 1 to 13 to be performed.
CN201810954772.7A 2018-08-21 2018-08-21 System and method for iterative reconstruction Active CN109272562B (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN201810954772.7A CN109272562B (en) 2018-08-21 2018-08-21 System and method for iterative reconstruction
US16/206,167 US10977840B2 (en) 2018-08-21 2018-11-30 Systems and methods for iterative reconstruction
US17/221,012 US11481935B2 (en) 2018-08-21 2021-04-02 Systems and methods for iterative reconstruction

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810954772.7A CN109272562B (en) 2018-08-21 2018-08-21 System and method for iterative reconstruction

Publications (2)

Publication Number Publication Date
CN109272562A CN109272562A (en) 2019-01-25
CN109272562B true CN109272562B (en) 2022-11-11

Family

ID=65153869

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810954772.7A Active CN109272562B (en) 2018-08-21 2018-08-21 System and method for iterative reconstruction

Country Status (1)

Country Link
CN (1) CN109272562B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110415311B (en) * 2019-07-29 2024-04-16 上海联影医疗科技股份有限公司 PET image reconstruction method, system, readable storage medium and apparatus
US11763498B2 (en) 2019-11-12 2023-09-19 Shanghai United Imaging Healthcare Co., Ltd. Systems and methods for image reconstruction
CN110853742B (en) * 2019-11-12 2023-03-28 上海联影医疗科技股份有限公司 Image reconstruction method, system, device and storage medium
CN113393547B (en) * 2021-05-25 2023-03-24 上海联影医疗科技股份有限公司 PET (positron emission tomography) coincidence data volume control method, device, equipment and storage medium
CN113643420B (en) * 2021-07-02 2023-02-03 北京三快在线科技有限公司 Three-dimensional reconstruction method and device

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102842141A (en) * 2012-07-03 2012-12-26 东南大学 Rotary X-ray contrastographic picture iteration reconstruction method
CN103366389A (en) * 2013-04-27 2013-10-23 中国人民解放军北京军区总医院 CT (computed tomography) image reconstruction method
GB201710479D0 (en) * 2016-05-20 2017-08-16 Shanghai United Imaging Healthcare Co Ltd No title

Also Published As

Publication number Publication date
CN109272562A (en) 2019-01-25

Similar Documents

Publication Publication Date Title
CN109272562B (en) System and method for iterative reconstruction
US11653890B2 (en) Systems and methods for image acquisition
US11105939B2 (en) System and method for PET data correction
CN110809782B (en) Attenuation correction system and method
US10706550B2 (en) System and method for positron emission tomography imaging
US11776170B2 (en) Systems and methods for medical imaging
CN109242923B (en) System and method for iterative reconstruction
US10438378B2 (en) System and method for determining an activity map and an attenuation map
CN107886553B (en) Image reconstruction system and method
US11393139B2 (en) System and method for MPR streak reduction
US11481935B2 (en) Systems and methods for iterative reconstruction
US11361480B2 (en) System and method for 3D image reconstruction from axial step-and-shoot CT
US11717248B2 (en) Systems and methods for image generation
US20230293132A1 (en) Systems and methods for image acquisition
WO2019091087A1 (en) Systems and methods for correcting projection images in computed tomography image reconstruction
WO2023039736A1 (en) Systems and methods for image reconstruction
US11900602B2 (en) System and method for medical imaging
CN117897733A (en) Medical imaging system and method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: No. 2258 Chengbei Road, Jiading District, Shanghai 201807

Applicant after: Shanghai Lianying Medical Technology Co.,Ltd.

Address before: No. 2258 Chengbei Road, Jiading District, Shanghai 201807

Applicant before: SHANGHAI UNITED IMAGING HEALTHCARE Co.,Ltd.

GR01 Patent grant