CN111627083B - Bone hardening artifact correction method, device, computer equipment and readable storage medium - Google Patents
- Publication number
- CN111627083B CN111627083B CN202010456142.4A CN202010456142A CN111627083B CN 111627083 B CN111627083 B CN 111627083B CN 202010456142 A CN202010456142 A CN 202010456142A CN 111627083 B CN111627083 B CN 111627083B
- Authority
- CN
- China
- Prior art keywords
- projection data
- bone
- image
- artifact
- neural network
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- G06T11/008 — Specific post-processing after tomographic reconstruction, e.g. voxelisation, metal artifact correction (under G—Physics; G06—Computing; G06T—Image data processing or generation; G06T11/00—2D image generation; G06T11/003—Reconstruction from projections, e.g. tomography)
- G06T7/11 — Region-based segmentation (under G06T7/00—Image analysis; G06T7/10—Segmentation; edge detection)
- G06T2207/10081 — Computed x-ray tomography [CT] (under G06T2207/00—Indexing scheme for image analysis or image enhancement; G06T2207/10—Image acquisition modality; G06T2207/10072—Tomographic images)
- G06T2207/20081 — Training; learning (under G06T2207/20—Special algorithmic details)
- G06T2207/20084 — Artificial neural networks [ANN] (under G06T2207/20—Special algorithmic details)
- G06T2207/30008 — Bone (under G06T2207/30—Subject of image; context of image processing; G06T2207/30004—Biomedical image processing)
Abstract
The application relates to a method, an apparatus, computer equipment, and a computer-readable storage medium for correcting bone hardening artifacts. The bone hardening artifact correction method comprises the following steps: acquiring first projection data of a first scan image; processing the first projection data using a bone hardening artifact correction model to obtain second projection data, wherein the bone hardening artifact correction model comprises a trained artificial neural network trained to predict either projection data of a scan image without bone hardening artifacts or projection data of a bone hardening artifact image; and reconstructing a second scan image from the second projection data. The application solves the problem in the related art that the bone hardening artifact correction process is complex, and thereby simplifies the correction of bone hardening artifacts.
Description
Technical Field
The present application relates to the field of image processing, and in particular to a method, an apparatus, a computer device, and a computer-readable storage medium for correcting bone hardening artifacts.
Background
Many correction algorithms for bone hardening artifacts have been proposed in the conventional art; in terms of implementation they can be classified as hardware methods and software methods. Hardware methods suppress bone hardening artifacts by adding a correction tool to the physical system of the computed tomography equipment. Software methods start from the physical cause of the bone hardening artifact and apply a correction algorithm to the acquired projection data or the reconstructed image, so as to reduce the degree of bone hardening artifacts in the image.
In terms of hardware correction, the main methods are the water bag method and the shaped-filter method. The water bag method wraps a water bag uniformly around the surface of the scanned object so that the paths of the ray beams penetrating the object are consistent at different positions, thereby achieving correction; however, this approach is inconvenient and lacks simplicity. In a medical CT apparatus, besides correcting beam hardening, the dose delivered to the patient across the scanning field of view should be kept as constant as possible, so a shaped (bow-tie) filter is generally used to correct beam hardening.
In terms of software correction, methods fall largely into two categories: preprocessing and post-processing. Preprocessing correction methods are applied before the final CT image reconstruction is completed, while post-processing correction methods are typically applied to the reconstructed image and require the original reconstructed image or an image segmentation operation during execution. Preprocessing methods include polynomial fitting, Monte Carlo simulation, and the like, and are suitable for single-component substances; however, they require manufacturing a correction phantom of the same material as the scanned object, which hinders their adoption in engineering practice. Moreover, the human body is composed of complex elements, and a phantom of the same material as a human being cannot be manufactured, so the application of these methods in medical CT is limited. Preprocessing methods also include algorithms based on iterative reconstruction, which fuse the bone hardening correction operation into the image reconstruction process; these, however, are limited by complicated parameter selection and long running times and are therefore not widely used.
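The polynomial-fitting idea mentioned above can be sketched as follows; all names and the synthetic "hardened" response here are illustrative assumptions, not taken from the patent. A polynomial is calibrated on a single-material phantom so that it maps the measured polychromatic projection back onto the ideal linear relation p = mu * L.

```python
import numpy as np

def calibrate_polynomial(path_lengths, measured_proj, mu_mono, degree=3):
    """Fit f so that f(p_measured) approximates mu_mono * path_length."""
    ideal = mu_mono * path_lengths
    coeffs = np.polyfit(measured_proj, ideal, degree)
    return np.poly1d(coeffs)

# Synthetic hardened measurements: the projection grows sublinearly with
# path length L through water (a stand-in for the polychromatic response).
L = np.linspace(0.0, 20.0, 50)        # path lengths (cm)
mu = 0.2                              # assumed monochromatic coefficient (1/cm)
p_measured = mu * L - 0.002 * L**2    # hardened, sublinear projections

correct = calibrate_polynomial(L, p_measured, mu)
p_corrected = correct(p_measured)     # approximately linear in L again
```

In a real system the calibration pairs would come from scans of a water phantom of known thicknesses rather than from a synthetic formula.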
In recent years, the dual-energy method has also been applied to the CT field. Its idea is to scan the target object twice with different ray energy spectra and then solve for the linear attenuation coefficient distribution of the scanned section at a given ray energy, thereby obtaining a corrected image free of bone hardening artifacts. The generation of monochromatic images by dual-energy CT is based on the energy-dependent absorption of the X-ray beam, and X-ray spectrum information must be acquired during correction, which makes the bone hardening artifact correction process complex and time-consuming.
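The dual-energy idea described above can be written compactly; the notation below is an assumed textbook-style formulation, not taken from the patent:

```latex
% Attenuation decomposed onto two basis materials (water w, bone b):
\mu(\mathbf{r},E) = a_1(\mathbf{r})\,\mu_w(E) + a_2(\mathbf{r})\,\mu_b(E).
% Two scans with spectra S_L(E), S_H(E) yield the polychromatic projections
p_k = -\ln\!\int S_k(E)\,\exp\!\big(-A_1\,\mu_w(E) - A_2\,\mu_b(E)\big)\,dE,
\qquad k \in \{L, H\},
% which are inverted for the basis line integrals A_1, A_2. A hardening-free
% monochromatic projection at any energy E_0 then follows as
p(E_0) = A_1\,\mu_w(E_0) + A_2\,\mu_b(E_0).
```

Solving the two nonlinear integral equations for A_1 and A_2 is exactly where the spectrum information S_L, S_H is needed, which is the source of the complexity noted above.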
In summary, existing methods for correcting bone hardening artifacts are various but have limitations in specific applications. At present, no effective solution has been proposed for the problem of the complex bone hardening artifact correction process in the related art.
Disclosure of Invention
The embodiments of the present application provide a method, an apparatus, computer equipment, and a computer-readable storage medium for correcting bone hardening artifacts, so as to at least solve the problem of the complex bone hardening artifact correction process in the related art.
In a first aspect, an embodiment of the present application provides a method for correcting bone hardening artifacts, the method including:
acquiring first projection data of a first scanning image;
processing the first projection data using a bone hardening artifact correction model to obtain second projection data, wherein the bone hardening artifact correction model comprises a trained artificial neural network trained to predict either projection data of a scan image without bone hardening artifacts or projection data of a bone hardening artifact image;
and reconstructing a second scan image from the second projection data.
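The three steps above can be sketched as a toy pipeline; the identity "network" and the trivial reconstructor are stand-ins chosen for this demo, not the patent's actual model or reconstruction algorithm:

```python
import numpy as np

def correct_and_reconstruct(first_projection, model, reconstruct):
    """Acquire -> correct -> reconstruct, as in the claimed method."""
    second_projection = model(first_projection)    # corrected sinogram
    return reconstruct(second_projection)          # second scan image

# Stand-ins in place of a trained network and filtered back-projection.
model = lambda p: p
reconstruct = lambda p: p.sum(axis=0)
first_projection = np.ones((4, 8))                 # 4 views x 8 detector bins
image = correct_and_reconstruct(first_projection, model, reconstruct)
```

In practice `model` would be the trained artificial neural network and `reconstruct` a real tomographic reconstruction such as filtered back-projection.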
In some of these embodiments, processing the first projection data using the bone hardening artifact correction model to obtain second projection data comprises:
processing the first projection data by using the trained artificial neural network to obtain third projection data of the bone hardening artifact image;
and determining the second projection data according to the first projection data and the third projection data, wherein the trained artificial neural network is trained to predict projection data of a bone hardening artifact image.
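One natural reading of "determining the second projection data according to the first and third" is a residual subtraction, sketched below; the subtraction and the toy artifact predictor are assumptions made for illustration, since the claim does not fix the exact combination rule:

```python
import numpy as np

def residual_correct(first_projection, artifact_net):
    """Network predicts the artifact-image projections (third); the
    corrected (second) projections are taken as first minus third."""
    third_projection = artifact_net(first_projection)
    return first_projection - third_projection

artifact_net = lambda p: 0.1 * p        # toy stand-in for the trained network
first = np.full((2, 3), 1.0)
second = residual_correct(first, artifact_net)
```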
In some of these embodiments, the training process of the trained artificial neural network comprises:
obtaining a training sample and an untrained artificial neural network, the training sample comprising: fourth projection data of a scan image with bone hardening artifacts and fifth projection data of the corresponding bone hardening artifact image;
and training the untrained artificial neural network with the fourth projection data as input and the fifth projection data as supervision until the parameters of the untrained artificial neural network converge, so as to obtain the trained artificial neural network.
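The training procedure above can be illustrated with a toy loop: input is the fourth projection data, supervision is the fifth, and training stops on parameter convergence. A single linear layer trained by gradient descent stands in for the artificial neural network; the data, learning rate, and convergence tolerance are all assumptions for the demo.

```python
import numpy as np

def train(inputs, targets, lr=0.1, tol=1e-8, max_steps=10000):
    """Gradient descent on mean squared error until the parameter
    update falls below tol (parameter convergence)."""
    w = np.zeros(inputs.shape[1])
    for _ in range(max_steps):
        grad = inputs.T @ (inputs @ w - targets) / len(targets)
        w_next = w - lr * grad
        if np.linalg.norm(w_next - w) < tol:
            return w_next
        w = w_next
    return w

rng = np.random.default_rng(0)
fourth = rng.normal(size=(64, 3))       # simulated artifact-bearing projections
true_w = np.array([0.5, -0.2, 0.1])     # hidden mapping the training recovers
fifth = fourth @ true_w                 # simulated artifact projections
w = train(fourth, fifth)                # w approaches true_w
```

The second training variant (sixth projection data as input, seventh as supervision) has exactly the same structure, differing only in what the targets represent.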
In some of these embodiments, processing the first projection data using the bone hardening artifact correction model to obtain second projection data comprises:
processing the first projection data using the trained artificial neural network to obtain the second projection data, wherein the trained artificial neural network is trained to predict projection data of a scanned image without bone hardening artifacts.
In some of these embodiments, the training process of the trained artificial neural network comprises:
obtaining a training sample and an untrained artificial neural network, the training sample comprising: sixth projection data of the scan image with the bone hardening artifact and seventh projection data of the corresponding scan image without the bone hardening artifact;
and training the untrained artificial neural network with the sixth projection data as input and the seventh projection data as supervision until the parameters of the untrained artificial neural network converge, so as to obtain the trained artificial neural network.
In some embodiments, the generating of the seventh projection data includes:
image segmentation is carried out on the scanning image corresponding to the sixth projection data, so as to obtain a water image and a bone image;
projecting the water image to obtain water image projection data and projecting the bone image to obtain bone image projection data;
and performing beam hardening correction according to the water image projection data and the bone image projection data to obtain seventh projection data.
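The three steps above can be sketched as follows. The threshold, the one-view row-sum "projector", and the bone re-weighting used as the hardening correction are all assumptions made for this demo; a real implementation would use a calibrated segmentation, a full forward projector, and a physics-based correction.

```python
import numpy as np

def split_water_bone(image, bone_threshold=1.5):
    """Threshold segmentation of a reconstructed slice into bone and water."""
    bone = np.where(image >= bone_threshold, image, 0.0)
    water = image - bone
    return water, bone

def project(image):
    return image.sum(axis=1)     # stand-in for a single parallel-beam view

def corrected_projection(image, bone_scale=1.2):
    water, bone = split_water_bone(image)
    # Beam hardening correction modeled as re-weighting the bone contribution.
    return project(water) + bone_scale * project(bone)

slice_img = np.array([[1.0, 2.0],
                      [0.5, 0.0]])
p = corrected_projection(slice_img)     # per-row corrected projections
```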
In some embodiments, the first projection data, the fourth projection data, and the sixth projection data have all undergone water hardening pre-correction.
In a second aspect, an embodiment of the present application provides an apparatus for correcting bone hardening artifacts, the apparatus comprising:
the first acquisition module is used for acquiring first projection data of a first scanning image;
a first processing module for processing the first projection data using a bone-hardening artifact correction model to obtain second projection data, wherein the bone-hardening artifact correction model comprises a trained artificial neural network trained to predict projection data of a scanned image without bone-hardening artifacts or projection data of a bone-hardening artifact image;
and a reconstruction module for reconstructing a second scan image from the second projection data.
In a third aspect, an embodiment of the present application provides a computer device, including a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the processor, when executing the computer program, implements the method for correcting bone hardening artifacts according to the first aspect.
In a fourth aspect, an embodiment of the present application provides a computer-readable storage medium on which a computer program is stored, which, when executed by a processor, implements the method for correcting bone hardening artifacts according to the first aspect.
Compared with the related art, the method, apparatus, computer equipment, and computer-readable storage medium for correcting bone hardening artifacts provided by the embodiments of the present application acquire first projection data of a first scan image; process the first projection data using a bone hardening artifact correction model to obtain second projection data, wherein the bone hardening artifact correction model comprises a trained artificial neural network trained to predict either projection data of a scan image without bone hardening artifacts or projection data of a bone hardening artifact image; and reconstruct a second scan image from the second projection data. This solves the problem of the complex bone hardening artifact correction process in the related art and simplifies the correction of bone hardening artifacts.
The details of one or more embodiments of the application are set forth in the accompanying drawings and the description below, so that the other features, objects, and advantages of the application will be more readily understood.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this specification, illustrate embodiments of the application and together with the description serve to explain the application and do not constitute a limitation on the application. In the drawings:
FIG. 1 is a schematic diagram of a CT system according to an embodiment of the present application;
FIG. 2 is a schematic diagram of the hardware architecture of a computer device according to an embodiment of the application;
FIG. 3 is a flow chart of a method of correcting bone hardening artifacts according to an embodiment of the present application;
FIG. 4 is a schematic diagram of artificial neural network-based reduction of bone hardening artifacts according to an embodiment of the present application;
FIG. 5 is a second schematic diagram of artificial neural network-based reduction of bone hardening artifacts according to an embodiment of the present application;
FIG. 6 is a block diagram of a bone hardening artifact correction apparatus according to an embodiment of the present application.
Detailed Description
The present application will be described and illustrated with reference to the accompanying drawings and examples in order to make the objects, technical solutions and advantages of the present application more apparent. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the application. All other embodiments, which can be made by a person of ordinary skill in the art based on the embodiments provided by the present application without making any inventive effort, are intended to fall within the scope of the present application.
It is apparent that the drawings in the following description are only some examples or embodiments of the present application, and those of ordinary skill in the art may apply the present application to other similar situations according to these drawings without inventive effort. Moreover, it should be appreciated that while such a development effort might be complex and time-consuming, it would nevertheless be a routine undertaking of design, fabrication, and manufacture for those of ordinary skill having the benefit of this disclosure.
Reference in the specification to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment may be included in at least one embodiment of the application. The appearances of such phrases in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is to be expressly and implicitly understood by those of ordinary skill in the art that the described embodiments of the application can be combined with other embodiments without conflict.
Unless defined otherwise, technical or scientific terms used herein should be given the ordinary meaning understood by one of ordinary skill in the art to which this application belongs. The terms "a," "an," "the," and similar referents in the context of the application do not limit quantity and may denote the singular or the plural. The terms "comprising," "including," "having," and any variations thereof are intended to cover a non-exclusive inclusion; for example, a process, method, system, article, or apparatus that comprises a list of steps or modules (elements) is not limited to those steps or elements but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus. The terms "connected," "coupled," and the like in the context of the present application are not limited to physical or mechanical connections, but may include electrical connections, whether direct or indirect. The term "plurality" as used herein means two or more. "And/or" describes an association relationship between associated objects and covers three cases; for example, "A and/or B" may mean: A exists alone, A and B exist together, or B exists alone. The character "/" generally indicates an "or" relationship between the associated objects. The terms "first," "second," "third," and the like merely distinguish similar objects and do not imply a particular ordering of the objects.
The computer equipment provided by the embodiment of the application can be applied to a medical image processing system, and the medical image processing system can comprise medical image scanning equipment and computer equipment.
The medical image scanning device may be any one or more medical image scanning systems such as a magnetic resonance system (MR system), a positron emission computed tomography system (PET system), a positron emission computed tomography-magnetic resonance multi-modality hybrid system (PET-MR system), an electronic computed tomography system (CT system), an X-ray scanning system, and the like.
The following describes and illustrates embodiments of the present application using a medical image scanning apparatus as an example of a CT system.
In this embodiment, the CT system 100 includes an examination couch 110 and a scanning assembly 120. The couch 110 is adapted to carry the person to be examined and is movable, so that the part to be examined can be moved to a position suitable for examination, for example the position denoted 200 in FIG. 1. The scanning assembly 120 has a radiation source 121 and a detector 122.
The radiation source 121 may be configured to emit radiation toward the region to be examined of the person to be examined in order to generate scan data for a medical image. The part to be examined may include a substance, tissue, an organ, a sample, a body, or the like, or any combination thereof. In certain embodiments, the part to be examined may comprise a patient or a portion thereof, i.e., the head, chest, lung, pleura, mediastinum, abdomen, large intestine, small intestine, bladder, gall bladder, pelvis, diaphysis, extremities, bones, blood vessels, or the like, or any combination thereof. The radiation emitted by the radiation source 121 can pass through the part to be examined of the person to be examined and is then received by the detector 122.
The radiation source 121 may include a radiation generator, which may comprise one or more radiation tubes that emit radiation or a beam of radiation. The radiation source 121 may be an X-ray tube, a cold cathode ion tube, a high vacuum hot cathode tube, a rotating anode tube, or the like. The shape of the emitted radiation beam may be linear, narrow pen-shaped, narrow fan-shaped, cone-shaped, wedge-shaped, irregular, or the like, or any combination thereof. The fan angle of the beam may be a value in the range of 20° to 90°. The tube in the radiation source 121 may be fixed in one position; in some cases, it may be translated or rotated.
The detector 122 may be configured to receive radiation from the radiation source 121 or another radiation source. Radiation from the radiation source 121 passes through the person to be examined and then reaches the detector 122. After receiving the radiation, the detector 122 generates a detection result containing a radiation image of the person to be examined. The detector 122 includes a radiation detector or other components. The shape of the radiation detector may be flat, arcuate, circular, or the like, or any combination thereof. The fan angle of an arcuate detector may range from 20° to 90°, and may be fixed or adjustable according to circumstances, including the desired image resolution, image size, detector sensitivity, detector stability, or the like, or any combination thereof. In some embodiments, a pixel of the detector may be the smallest detection unit, e.g., a detector cell (such as a scintillator or photosensor). The pixels of the detector may be arranged in a single row, a double row, or another number of rows, and the radiation detector may be one-dimensional, two-dimensional, or three-dimensional.
The CT system also comprises a scanning control device and an image generation device. The scanning control device is configured to control the couch 110 and the scanning assembly 120 to perform scanning, and the image generation device is configured to generate a medical image based on the detection result of the detector 122.
Since the scanning assembly 120 emits radiation when scanning, in some embodiments, to avoid exposing an operator of the CT system 100 to such radiation, the image generation device may be located in a different room from the scanning assembly 120, so that the operator can remain protected from the radiation while generating and viewing the scan results via the image generation device.
The computer device of the present embodiment includes a processor 211, a memory 212 storing computer program instructions, and a display device 214.
The processor 211 may include a Central Processing Unit (CPU), or an application specific integrated circuit (Application Specific Integrated Circuit, abbreviated as ASIC), or may be configured as one or more integrated circuits that implement embodiments of the present application.
In some of these embodiments, the processor 211 may be configured to: acquiring first projection data of a first scanning image; processing the first projection data by using a bone-setting artifact correction model to obtain second projection data, wherein the bone-setting artifact correction model comprises a trained artificial neural network which is trained to predict projection data of a scanned image without bone-setting artifact or projection data of a bone-setting artifact image; and reconstructing to obtain a second scanning image according to the second projection data.
In some of these embodiments, the processor 211 may be configured to: process the first projection data using the trained artificial neural network to obtain third projection data of the bone hardening artifact image; and determine the second projection data according to the first projection data and the third projection data, wherein the trained artificial neural network is trained to predict projection data of the bone hardening artifact image.
In some of these embodiments, the processor 211 may be configured to: obtain a training sample and an untrained artificial neural network, the training sample comprising: fourth projection data of a scan image with bone hardening artifacts and fifth projection data of the corresponding bone hardening artifact image; and train the untrained artificial neural network with the fourth projection data as input and the fifth projection data as supervision until the parameters of the untrained artificial neural network converge, so as to obtain the trained artificial neural network.
In some of these embodiments, the processor 211 may be configured to: the first projection data is processed using a trained artificial neural network to obtain second projection data, wherein the trained artificial neural network is trained to predict projection data of a scanned image without bone hardening artifacts.
In some of these embodiments, the processor 211 may be configured to: obtaining training samples and an untrained artificial neural network, wherein the training samples comprise: sixth projection data of the scan image with the bone hardening artifact and seventh projection data of the corresponding scan image without the bone hardening artifact; and taking the sixth projection data as input, taking the seventh projection data as supervision, and training the untrained artificial neural network until the untrained artificial neural network parameters converge, so as to obtain the trained artificial neural network.
In some of these embodiments, the processor 211 may be configured to: image segmentation is carried out on the scanned image corresponding to the sixth projection data, so as to obtain a water image and a bone image; projecting the water image to obtain water image projection data and projecting the bone image to obtain bone image projection data; and performing beam hardening correction according to the water image projection data and the bone image projection data to obtain seventh projection data.
In some of these embodiments, the first projection data, the fourth projection data, and the sixth projection data have all undergone water hardening pre-correction.
Memory 212 may include mass storage for data or instructions. By way of example and not limitation, memory 212 may comprise a hard disk drive (HDD), a floppy disk drive, a solid state drive (SSD), flash memory, an optical disk, a magneto-optical disk, tape, a universal serial bus (USB) drive, or a combination of two or more of these. Memory 212 may include removable or non-removable (or fixed) media, where appropriate, and may be internal or external to the data processing apparatus. In a particular embodiment, the memory 212 is non-volatile memory and includes read-only memory (ROM) and random access memory (RAM). Where appropriate, the ROM may be a mask-programmed ROM, a programmable ROM (PROM), an erasable PROM (EPROM), an electrically erasable PROM (EEPROM), an electrically alterable ROM (EAROM), flash memory (FLASH), or a combination of two or more of these. The RAM may be static random-access memory (SRAM) or dynamic random-access memory (DRAM), where the DRAM may be fast page mode DRAM (FPMDRAM), extended data out DRAM (EDODRAM), synchronous DRAM (SDRAM), or the like, as appropriate.
Memory 212 may be used to store or cache various data files (e.g., medical images, operating systems, preset knowledge base data, machine learning models, etc.) that need to be processed and/or communicated for use, as well as possible computer program instructions executed by processor 211.
The processor 211 implements the method for correcting bone hardening artifacts according to the embodiment of the present application by reading and executing the computer program instructions stored in the memory 212.
In some of these embodiments, the computer device may also include a communication interface 213 and a bus 210. As shown in fig. 2, the processor 211, the memory 212, the communication interface 213, and the display device 214 are connected and communicate with each other through the bus 210.
The communication interface 213 is used to implement communication between the modules, devices, and/or units in the present embodiment. The communication interface 213 may also enable data communication with other components, such as external devices, medical image scanning devices, databases, external storage, and image/data processing workstations.
Bus 210 includes hardware, software, or both, coupling the components of the computer device to each other. Bus 210 includes, but is not limited to, at least one of: a data bus, an address bus, a control bus, an expansion bus, or a local bus. By way of example, and not limitation, bus 210 may include an Accelerated Graphics Port (AGP) or other graphics bus, an Extended Industry Standard Architecture (EISA) bus, a Front Side Bus (FSB), a HyperTransport (HT) interconnect, an Industry Standard Architecture (ISA) bus, an InfiniBand interconnect, a Low Pin Count (LPC) bus, a memory bus, a Micro Channel Architecture (MCA) bus, a Peripheral Component Interconnect (PCI) bus, a PCI-Express (PCI-X) bus, a Serial Advanced Technology Attachment (SATA) bus, a Video Electronics Standards Association Local Bus (VLB), another suitable bus, or a combination of two or more of these. Bus 210 may include one or more buses, where appropriate. Although embodiments of the application have been described and illustrated with respect to a particular bus, the application contemplates any suitable bus or interconnect.
The computer device may perform the method for correcting the bone hardening artifact provided in the present embodiment based on the acquired first projection data of the first scan image.
In addition, embodiments of the present application may be implemented by providing a computer-readable storage medium. The computer readable storage medium has stored thereon computer program instructions; the computer program instructions, when executed by a processor, implement the method for correcting bone hardening artifacts provided in the present embodiment.
The embodiment also provides a correction method for bone hardening artifacts. Fig. 3 is a flowchart of a method for correcting a bone hardening artifact according to an embodiment of the present application. As shown in fig. 3, the flow includes the following steps:
in step S301, first projection data of a first scan image is acquired.
The first scan image acquired in this embodiment may be a scan image acquired from a CT system in real time, or a scan image retrieved from a scan image database. The first scan image may be one with obvious artifacts or one without obvious artifacts.
In step S302, the first projection data is processed using a bone hardening artifact correction model to obtain second projection data, wherein the bone hardening artifact correction model includes a trained artificial neural network trained to predict projection data of a scan image without bone hardening artifacts or projection data of a bone hardening artifact image.
It should be noted that the artificial neural network in this step may adopt any existing network architecture, such as an image segmentation network (U-Net), a residual network (ResNet), a densely connected convolutional network (DenseNet), or a generative adversarial network (GAN).
Step S303, reconstructing a second scanning image according to the second projection data.
In step S302, the bone hardening artifact correction model corrects the bone hardening artifact in the first projection data to obtain the second projection data; the second projection data is then reconstructed into a second scan image that is free of artifacts, or very close to it, thereby realizing the correction of the bone hardening artifact image.
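The three steps above can be summarized in a short sketch. This is only an illustrative skeleton, not the patented implementation: `artifact_model` stands in for the trained correction model of step S302, and `reconstruct` is a trivial placeholder for the filtered back projection or iterative reconstruction a real CT pipeline would use in step S303.

```python
import numpy as np

def reconstruct(p_second):
    # Trivial placeholder for step S303: a real CT pipeline would use
    # filtered back projection or iterative reconstruction here.
    return p_second.reshape(4, 4)

def correct_scan(p_first, artifact_model):
    # Step S302: the correction model maps the first projection data to
    # second projection data; step S303 reconstructs the second image.
    p_second = artifact_model(p_first)
    return reconstruct(p_second)

p_first = np.arange(16.0)                      # step S301: acquired projections
image = correct_scan(p_first, artifact_model=lambda p: p * 0.9)
print(image.shape)                             # (4, 4)
```

The lambda here is a dummy model; in the embodiments below it is replaced by a trained neural network.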
Compared with the prior art, the steps provided in this embodiment introduce a neural network: projection data of a scan image without bone hardening artifacts, or projection data of the bone hardening artifact image, are predicted from the acquired projection data of the scan image, and image reconstruction is then performed according to the predicted projection data, thereby realizing the correction of the image artifact.
In this embodiment, depending on whether the trained artificial neural network is trained to predict projection data of a scan image without bone hardening artifacts or projection data of a bone hardening artifact image, the process of processing the first projection data with the bone hardening artifact correction model to obtain the second projection data differs, and the corresponding training process of the bone hardening artifact correction model differs accordingly. This is described below with reference to some examples.
For example, in some of these embodiments, processing the first projection data using the bone hardening artifact correction model to obtain the second projection data includes: processing the first projection data with a trained artificial neural network to obtain third projection data of the bone hardening artifact image; and determining the second projection data according to the first projection data and the third projection data, wherein the trained artificial neural network is trained to predict projection data of the bone hardening artifact image.
In this embodiment, the first projection data is processed by an artificial neural network trained to predict the projection data of the bone hardening artifact image; the output is the difference projection data, namely the third projection data, that is, the portion to be removed from the first projection data of the first scan image containing the bone hardening artifact.
In this embodiment, the second projection data may be determined by directly subtracting the third projection data from the first projection data, or by first processing the third projection data and then determining the second projection data, for example, weighting the third projection data before subtracting it from the first projection data. The method of determining the second projection data is not limited to the above; any other method capable of determining the second projection data may be used.
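A minimal sketch of this subtraction step follows. The function name and the optional weighting factor are illustrative, not from the patent:

```python
import numpy as np

def correct_projections(p_first, p_artifact, weight=1.0):
    # Subtract the (optionally weighted) predicted artifact projections
    # (third projection data) from the measured first projection data.
    return np.asarray(p_first) - weight * np.asarray(p_artifact)

p1 = np.array([2.0, 3.0, 4.0])      # toy first projection data
p3 = np.array([0.5, 0.5, 1.0])      # toy predicted artifact projections
print(correct_projections(p1, p3))               # [1.5 2.5 3. ]
print(correct_projections(p1, p3, weight=0.5))   # [1.75 2.75 3.5 ]
```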
For another example, in some of these embodiments, processing the first projection data using the bone hardening artifact correction model to obtain the second projection data includes: the first projection data is processed using a trained artificial neural network to obtain second projection data, wherein the trained artificial neural network is trained to predict projection data of a scanned image without bone hardening artifacts.
In this embodiment, the first projection data is processed by an artificial neural network trained to predict the projection data of a scan image without bone hardening artifacts; the second projection data obtained in this way directly reduces or eliminates the bone hardening artifact in the first scan image.
In the above embodiments, the training method for the artificial neural network of the bone hardening artifact correction model may be divided into:
Mode 1: obtain a training sample and an untrained artificial neural network, where the training sample includes: fourth projection data of a scan image with a bone hardening artifact and fifth projection data of the corresponding bone hardening artifact image; and train the untrained artificial neural network, taking the fourth projection data as input and the fifth projection data as supervision, until the network parameters converge, thereby obtaining the trained artificial neural network.

Mode 2: obtain a training sample and an untrained artificial neural network, where the training sample includes: sixth projection data of a scan image with a bone hardening artifact and seventh projection data of the corresponding scan image without the bone hardening artifact; and train the untrained artificial neural network, taking the sixth projection data as input and the seventh projection data as supervision, until the network parameters converge, thereby obtaining the trained artificial neural network.
With the artificial neural network trained in mode 1, the network outputs difference projection data, namely the portion to be removed from the projection data of a scan image with bone hardening artifacts; subtracting this output of the bone hardening artifact correction model from the projection data of the scan image reduces or eliminates the bone hardening artifact. With the artificial neural network trained in mode 2, the scan image can be reconstructed directly from the projection data output by the fully trained network, thereby reducing or removing the bone hardening artifact in the scan image.
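The supervised scheme shared by both modes can be illustrated with a deliberately tiny stand-in: a single linear layer fitted by gradient descent on synthetic projection pairs. A real implementation would use one of the architectures named earlier (U-Net, ResNet, DenseNet, GAN); only the input/supervision pairing differs between mode 1 and mode 2. All data and dimensions here are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic paired data. Inputs play the role of projections with the
# artifact; targets follow mode 2 (projections without the artifact).
# For mode 1 one would instead supervise with the artifact-only
# projections, i.e. inputs minus clean projections.
X = rng.normal(size=(256, 8))        # stand-in for fourth/sixth projection data
true_W = rng.normal(size=(8, 8))
Y = X @ true_W                       # stand-in for fifth/seventh projection data

W = np.zeros((8, 8))                 # untrained "network": one linear layer
for _ in range(500):                 # train until the parameters converge
    grad = X.T @ (X @ W - Y) / len(X)   # gradient of the mean-squared loss
    W -= 0.1 * grad

print(np.abs(W - true_W).max() < 1e-3)   # True: parameters have converged
```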
In the above embodiments, because the linear attenuation coefficient of most materials depends on energy, a typical material has a higher attenuation coefficient for low-energy photons and a lower one for high-energy photons, while the X-ray beam produced by an X-ray tube covers a broad spectrum. As a polychromatic X-ray beam passes through a material, low-energy photons are preferentially absorbed, so the proportion of high-energy photons in the transmitted beam increases; this is "beam hardening".

Beam hardening is thus caused by the energy dependence of the attenuation coefficient together with the breadth of the X-ray energy spectrum. When polychromatic X-rays pass through an object, the lower-energy X-ray photons are preferentially absorbed due to the photoelectric effect, so the proportion of the high-energy component of the transmitted X-rays increases and the average energy rises; the beam becomes easier to penetrate with increasing path length, and the peak of the spectral distribution shifts toward higher energy. This is the beam hardening effect. As a result, an object slice of uniform density shows brightness differences in the reconstructed CT image, with the pixel values distributed in a cup shape, darker in the middle and brighter at the edges, known as the cupping artifact.
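The hardening effect described above can be reproduced numerically with a toy two-energy "spectrum" (the weights and attenuation values are made up for illustration): as the path length grows, the transmitted beam becomes dominated by high-energy photons and the measured projection falls below the linear prediction.

```python
import numpy as np

# Toy two-bin "spectrum": 60% low-energy and 40% high-energy photons,
# with the low-energy bin attenuated more strongly (values invented).
weights = np.array([0.6, 0.4])
mu = np.array([0.4, 0.2])                 # attenuation per cm in each bin

d = np.linspace(0.0, 10.0, 50)            # path length through a uniform object
intensity = (weights * np.exp(-np.outer(d, mu))).sum(axis=1)
p_poly = -np.log(intensity)               # measured polychromatic projection

# Fraction of high-energy photons in the transmitted beam: it grows
# with depth, i.e. the beam "hardens" and becomes more penetrating.
frac_high = weights[1] * np.exp(-d * mu[1]) / intensity
print(frac_high[0], round(frac_high[-1], 3))    # grows from 0.4 to ~0.831

# Hardening also makes the projection sub-linear in thickness:
print(p_poly[-1] < (weights @ mu) * d[-1])      # True
```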
Therefore, to prevent an object slice of uniform density from showing brightness differences in the reconstructed CT image, in some embodiments the generation process of the seventh projection data in the foregoing mode 2 may include: performing image segmentation on the scan image corresponding to the sixth projection data to obtain a water image and a bone image; projecting the water image to obtain water image projection data and projecting the bone image to obtain bone image projection data; and performing beam hardening correction according to the water image projection data and the bone image projection data to obtain the seventh projection data.
In the present embodiment, the X-ray hardening phenomenon in the sixth projection data can be eliminated or reduced by performing beam hardening correction on the water image projection data and the bone image projection data to obtain the seventh projection data.
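A minimal sketch of the segmentation and projection steps, using a simple threshold segmentation and a row-sum "forward projection" as stand-ins (the threshold value and the toy phantom are invented for illustration):

```python
import numpy as np

def split_water_bone(image, bone_threshold=300.0):
    # Simple threshold segmentation as a stand-in for the image
    # segmentation step: high-valued voxels go to the bone image,
    # the rest to the water image.
    bone = np.where(image >= bone_threshold, image, 0.0)
    water = np.where(image < bone_threshold, image, 0.0)
    return water, bone

def project(image):
    # Toy parallel-beam forward projection: sum along one axis.
    return image.sum(axis=0)

phantom = np.zeros((4, 4))
phantom[1:3, 1:3] = 50.0       # soft tissue block
phantom[2, 2] = 500.0          # one "bone" voxel

water, bone = split_water_bone(phantom)
p_water, p_bone = project(water), project(bone)
print(p_water, p_bone)         # water: 0, 100, 50, 0; bone: 0, 0, 500, 0
```

The two projection sets would then be combined by the beam hardening correction of the next paragraph to form the seventh projection data.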
The beam hardening correction in the above embodiment may be performed according to the following formula:

P = sum_i a_i * P_c^i

The formula above represents the beam hardening correction process, where P_c is the nonlinear (uncorrected) projection curve, P is the linear projection curve after beam hardening correction, and a_i is the coefficient of the i-th polynomial term. The polynomial coefficients can be configured online by the user, pre-configured by the user, or calculated by a preset algorithm. Correction by this method can reduce or eliminate beam hardening artifacts.
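A small sketch of this polynomial correction, with made-up coefficients a_i (in practice they would be calibrated or user-configured, as the text notes):

```python
import numpy as np

# Hypothetical polynomial coefficients a_i; in practice they are
# configured by the user or computed by a calibration algorithm.
a = np.array([0.0, 1.0, 0.15, 0.02])    # P = a0 + a1*Pc + a2*Pc^2 + a3*Pc^3

def beam_hardening_correct(p_c, coeffs):
    # Evaluate P = sum_i a_i * P_c**i for each projection value.
    powers = np.vander(np.asarray(p_c), N=len(coeffs), increasing=True)
    return powers @ coeffs

p_c = np.array([0.0, 1.0, 2.0])         # nonlinear (uncorrected) projections
print(beam_hardening_correct(p_c, a))   # [0.   1.17 2.76]
```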
In addition to the above beam hardening correction method, in the present embodiment the beam hardening correction may also employ a monoenergetic method, a dual-energy method, or any other beam hardening correction method in the related art, as conditions allow. The beam hardening correction methods in this embodiment are described and explained below by way of examples.
In the present embodiment, the beam hardening correction may also employ a monoenergetic method, one of the earliest studied approaches. The monoenergetic method proposed by Nalcioglu assumes that the attenuating substance along each ray is either bone or water. On this basis, the material distribution corresponding to each projection line is obtained through an initial reconstruction; each ray is then processed to obtain corrected projection data, and a corrected image is obtained after reconstruction.
The monoenergetic method in this embodiment performs the correction via the relation between the polychromatic projection and the intersection lengths of the two substances:

p = -ln ∫ τ(E) * exp(-μ_a(E)*l_a - μ_b(E)*l_b) dE

where μ_a(E) and μ_b(E) are the attenuation coefficients of substances a and b at energy E, l_a and l_b are the corresponding intersection lengths of the ray with the two substances, τ(E) is the normalized energy spectrum of the polychromatic X-ray, and m is the monochromatic projection at the mean energy Ē of the polychromatic beam.

The right-hand side of the above formula is a monotonically increasing function of the monochromatic projection m. Therefore, beam hardening correction can be performed using the relationship between the monochromatic and polychromatic projections: the monochromatic projection m is estimated with a function f(p) of the polychromatic projection p, which yields an approximately monoenergetic reconstruction result.
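The f(p) mapping can be sketched by tabulating the polychromatic projection against path length for a toy two-energy spectrum of a single reference material and inverting the monotone table; all numerical values here are illustrative.

```python
import numpy as np

# Toy two-energy spectrum tau and attenuation of a single reference
# material at those energies (all values invented for illustration).
tau = np.array([0.6, 0.4])
mu = np.array([0.4, 0.2])
mu_bar = tau @ mu                        # attenuation at the mean energy

L = np.linspace(0.0, 10.0, 200)          # path lengths for the lookup table
p_table = -np.log((tau * np.exp(-np.outer(L, mu))).sum(axis=1))
m_table = mu_bar * L                     # monochromatic projection m

def f(p):
    # Estimate the monochromatic projection m = f(p) by inverting the
    # monotonically increasing p(m) relation with table interpolation.
    return np.interp(p, p_table, m_table)

p_meas = 2.0                             # a measured polychromatic projection
print(f(p_meas) > p_meas)                # True: the correction "unbends" p
```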
In this embodiment, the beam hardening correction may also employ a dual-energy method, whose correction idea is as follows: assuming that the polychromatic X-ray beam consists mainly of rays of two energies E1 and E2 (E1 < E2), the projection data is:

p = -ln[ a*exp(-μ1*d) + (1-a)*exp(-μ2*d) ]

where μ1 and μ2 are the attenuation coefficients of the substance at energies E1 and E2, respectively, d is the length of the ray path through the object, and a is a fitting parameter. The method mainly comprises the following implementation steps:
(1) Select two different X-ray tube voltages and scan the object to be measured;

(2) Measure standard substances (phantoms) of different thicknesses under the two selected tube voltages and calculate the polychromatic projection values;

(3) Estimate the parameters μ1, μ2, and a from the obtained data using the least squares method;

(4) Reconstruct from the data calculated with the fitted parameters.
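The four steps can be sketched on simulated data; here a coarse grid search stands in for the least-squares estimation of step (3), and all parameter values are invented for illustration.

```python
import numpy as np

def p_model(d, mu1, mu2, a):
    # Dual-energy projection model:
    # p = -ln( a*exp(-mu1*d) + (1-a)*exp(-mu2*d) )
    return -np.log(a * np.exp(-mu1 * d) + (1 - a) * np.exp(-mu2 * d))

# Steps (1)-(2): simulate projections of a standard object of varying
# thickness; the "true" parameters (0.4, 0.2, 0.6) are invented.
d = np.linspace(0.5, 10.0, 20)
p_meas = p_model(d, 0.4, 0.2, 0.6)

# Step (3): least-squares estimation, here via a coarse grid search
# instead of a proper solver, purely for illustration.
best, best_err = None, np.inf
for mu1 in np.arange(0.2, 0.61, 0.02):
    for mu2 in np.arange(0.10, 0.31, 0.02):
        for a in np.arange(0.1, 0.95, 0.05):
            err = ((p_model(d, mu1, mu2, a) - p_meas) ** 2).sum()
            if err < best_err:
                best, best_err = (mu1, mu2, a), err

print(best)   # ~ (0.4, 0.2, 0.6); step (4) would reconstruct with these
```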
In the above embodiment, the first projection data, the fourth projection data, and the sixth projection data have all undergone water hardening pre-correction.
The application is described and illustrated below in connection with the drawings and the preferred embodiments.
Fig. 4 is a schematic diagram of reducing bone hardening artifacts based on an artificial neural network according to an embodiment of the present application. As shown in fig. 4, in this embodiment, image segmentation is performed on the acquired sixth scan image to obtain the corresponding water image and bone image; the bone image and the water image are forward projected to obtain the projection data of the bone image and of the water image; and beam hardening correction is performed on the projection data of the water image and of the bone image to obtain the projection data of the scan image predicted to have no bone hardening artifacts.
The acquired first scan image is then projected to obtain the projection data of the first scan image, which serve as the input to the neural network. Either the projection data of the scan image predicted to have no bone hardening artifacts are used as the gold standard of the artificial neural network, and the network is trained until convergence so that it outputs such artifact-free projection data; or, alternatively, the difference projection data are used as the gold standard, the network is trained until convergence so that it outputs difference projection data, and the projection data of the first scan image and the difference projection data are finally subtracted to obtain the projection data of the scan image predicted to have no bone hardening artifacts. Reconstruction is finally performed according to these predicted artifact-free projection data, yielding the image after bone hardening artifact correction.
Fig. 5 is a second schematic diagram of reducing bone hardening artifacts based on an artificial neural network according to an embodiment of the present application. As shown in fig. 5, a and d are images processed by a conventional physical bone hardening correction method, b and e are images processed by the method of this embodiment of the present application, and c and f are images without bone hardening correction. Comparing the images at the boxed positions in fig. 5 shows that the method of this embodiment, which exploits the strong generalization capability of neural networks to remove artifacts from scan images, is superior to the traditional physical method and removes bone hardening artifacts from scan images more effectively.
The present embodiment also provides a device for correcting bone hardening artifacts, which is used to implement the foregoing embodiments and preferred implementations; what has already been described will not be repeated. As used below, the terms "module," "unit," "sub-unit," and the like may be a combination of software and/or hardware that implements a predetermined function. While the devices described in the following embodiments are preferably implemented in software, implementation in hardware, or a combination of software and hardware, is also possible and contemplated.
Fig. 6 is a block diagram of a bone hardening artifact correction device according to an embodiment of the present application, as shown in fig. 6, the device includes:
a first acquisition module 610, configured to acquire first projection data of a first scan image;
a first processing module 620, coupled to the first acquisition module 610, for processing the first projection data using a bone hardening artifact correction model to obtain second projection data, wherein the bone hardening artifact correction model comprises a trained artificial neural network trained to predict projection data of a scan image without bone hardening artifacts or projection data of a bone hardening artifact image;
a reconstruction module 630, coupled to the first processing module 620, is configured to reconstruct a second scan image according to the second projection data.
In some of these embodiments, the apparatus further comprises a second processing module for processing the first projection data using the trained artificial neural network to obtain third projection data of the bone hardening artifact image; and a determining module for determining the second projection data according to the first projection data and the third projection data, wherein the trained artificial neural network is trained to predict projection data of the bone hardening artifact image.
In some of these embodiments, the apparatus further comprises: a second acquisition module for acquiring a training sample and an untrained artificial neural network, the training sample comprising: fourth projection data of a scan image with a bone hardening artifact and fifth projection data of the corresponding bone hardening artifact image; and a first training module for training the untrained artificial neural network, taking the fourth projection data as input and the fifth projection data as supervision, until the network parameters converge, thereby obtaining the trained artificial neural network.
In some of these embodiments, the apparatus further comprises: and a third processing module for processing the first projection data using a trained artificial neural network to obtain second projection data, wherein the trained artificial neural network is trained to predict projection data of a scanned image without bone hardening artifacts.
In some of these embodiments, the apparatus further comprises: the third obtaining module is configured to obtain a training sample and an untrained artificial neural network, where the training sample includes: sixth projection data of the scan image with the bone hardening artifact and seventh projection data of the corresponding scan image without the bone hardening artifact; and the second training module is used for taking the sixth projection data as input, taking the seventh projection data as supervision, and training the untrained artificial neural network until the untrained artificial neural network parameters are converged, so as to obtain the trained artificial neural network.
In some of these embodiments, the apparatus further comprises: the segmentation module is used for carrying out image segmentation on the scanning image corresponding to the sixth projection data to obtain a water image and a bone image; the projection module is used for projecting the water image to obtain water image projection data and projecting the bone image to obtain bone image projection data; and the correction module is used for carrying out beam hardening correction according to the water image projection data and the bone image projection data to obtain seventh projection data.
In some of these embodiments, the first projection data, the fourth projection data, and the sixth projection data have all undergone water hardening pre-correction.
The above-described respective modules may be functional modules or program modules, and may be implemented by software or hardware. For modules implemented in hardware, the various modules described above may be located in the same processor; or the above modules may be located in different processors in any combination.
The technical features of the above embodiments may be arbitrarily combined, and all possible combinations of the technical features in the above embodiments are not described for brevity of description, however, as long as there is no contradiction between the combinations of the technical features, they should be considered as the scope of the description.
The above examples illustrate only a few embodiments of the application, which are described in detail and are not to be construed as limiting the scope of the application. It should be noted that it will be apparent to those skilled in the art that several variations and modifications can be made without departing from the spirit of the application, which are all within the scope of the application. Accordingly, the scope of protection of the present application is to be determined by the appended claims.
Claims (9)
1. A method of correcting bone hardening artifacts, the method comprising:
acquiring first projection data of a first scanning image;
processing the first projection data by using a bone hardening artifact correction model to obtain second projection data, wherein the bone hardening artifact correction model comprises a trained artificial neural network which is trained by training data to predict projection data of a scan image without bone hardening artifacts or projection data of a bone hardening artifact image; the training data includes: fourth projection data of the scan image with the bone hardening artifact and fifth projection data of the corresponding bone hardening artifact image; or sixth projection data of the scan image with the bone hardening artifact and seventh projection data of the corresponding scan image without the bone hardening artifact; the processing the first projection data by using the bone hardening artifact correction model to obtain second projection data specifically includes:
Processing the first projection data by using the trained artificial neural network to obtain third projection data of the bone hardening artifact image;
determining the second projection data according to the first projection data and the third projection data, wherein the trained artificial neural network is trained to predict projection data of a bone hardening artifact image;
and reconstructing to obtain a second scanning image according to the second projection data.
2. The method of claim 1, wherein the training process of the trained artificial neural network comprises:
obtaining a training sample and an untrained artificial neural network, the training sample comprising: fourth projection data of the scan image with the bone hardening artifact and fifth projection data of the corresponding bone hardening artifact image;
and training the untrained artificial neural network by taking the fourth projection data as input and taking the fifth projection data as supervision until the untrained artificial neural network parameters are converged, so as to obtain the trained artificial neural network.
3. The method of claim 1, wherein processing the first projection data using a bone hardening artifact correction model to obtain second projection data comprises:
Processing the first projection data using the trained artificial neural network to obtain the second projection data, wherein the trained artificial neural network is trained to predict projection data of a scanned image without bone hardening artifacts.
4. The method of correcting bone hardening artifacts according to claim 3, wherein the training process of said trained artificial neural network comprises:
obtaining a training sample and an untrained artificial neural network, the training sample comprising: sixth projection data of the scan image with the bone hardening artifact and seventh projection data of the corresponding scan image without the bone hardening artifact;
and training the untrained artificial neural network by taking the sixth projection data as input and taking the seventh projection data as supervision until the untrained artificial neural network parameters are converged, so as to obtain the trained artificial neural network.
5. The method of claim 4, wherein the generating of the seventh projection data comprises:
image segmentation is carried out on the scanning image corresponding to the sixth projection data, so as to obtain a water image and a bone image;
Projecting the water image to obtain water image projection data and projecting the bone image to obtain bone image projection data;
and performing beam hardening correction according to the water image projection data and the bone image projection data to obtain seventh projection data.
6. The method according to any one of claims 1 to 5, wherein the first projection data, the fourth projection data, and the sixth projection data have all undergone water hardening pre-correction.
7. A bone hardening artifact correction device, the device comprising:
the first acquisition module is used for acquiring first projection data of a first scanning image;
a first processing module for processing the first projection data using a bone hardening artifact correction model to obtain second projection data, wherein the bone hardening artifact correction model comprises a trained artificial neural network trained by training data to predict projection data of a scan image without bone hardening artifacts or projection data of a bone hardening artifact image; the training data includes: fourth projection data of the scan image with the bone hardening artifact and fifth projection data of the corresponding bone hardening artifact image; or sixth projection data of the scan image with the bone hardening artifact and seventh projection data of the corresponding scan image without the bone hardening artifact; the processing the first projection data by using the bone hardening artifact correction model to obtain second projection data specifically includes:
Processing the first projection data by using the trained artificial neural network to obtain third projection data of the bone hardening artifact image;
determining the second projection data according to the first projection data and the third projection data, wherein the trained artificial neural network is trained to predict projection data of a bone hardening artifact image;
and the reconstruction module is used for reconstructing and obtaining a second scanning image according to the second projection data.
8. A computer device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, characterized in that the processor implements the method of bone-hardening artifact correction according to any of claims 1 to 6 when executing the computer program.
9. A computer-readable storage medium, on which a computer program is stored, characterized in that the program, when executed by a processor, implements the bone-hardening artifact correction method according to any one of claims 1 to 6.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010456142.4A CN111627083B (en) | 2020-05-26 | 2020-05-26 | Bone hardening artifact correction method, device, computer equipment and readable storage medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010456142.4A CN111627083B (en) | 2020-05-26 | 2020-05-26 | Bone hardening artifact correction method, device, computer equipment and readable storage medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN111627083A CN111627083A (en) | 2020-09-04 |
CN111627083B true CN111627083B (en) | 2023-11-21 |
Family
ID=72271123
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010456142.4A Active CN111627083B (en) | 2020-05-26 | 2020-05-26 | Bone hardening artifact correction method, device, computer equipment and readable storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111627083B (en) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN116097301A (en) * | 2020-12-01 | 2023-05-09 | 西安大医集团股份有限公司 | Image processing method and device, medical image equipment and storage medium |
CN112734877B (en) * | 2021-01-13 | 2023-04-07 | 上海联影医疗科技股份有限公司 | Method and system for correcting artifacts |
CN115063498A (en) * | 2021-12-31 | 2022-09-16 | 深圳微创踪影医疗装备有限公司 | Guide wire artifact suppression method and device, IVUS system and storage medium |
CN115409736B (en) * | 2022-09-16 | 2023-06-20 | 深圳市宝润科技有限公司 | Geometric correction method for medical digital X-ray photographic system and related equipment |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8023767B1 (en) * | 2008-03-10 | 2011-09-20 | University Of Rochester | Method and apparatus for 3D metal and high-density artifact correction for cone-beam and fan-beam CT imaging |
CN105608721A (en) * | 2016-01-30 | 2016-05-25 | 上海联影医疗科技有限公司 | Computer tomography pseudo shadow correction method and apparatus |
CN107330949A (en) * | 2017-06-28 | 2017-11-07 | 上海联影医疗科技有限公司 | A kind of artifact correction method and system |
CN108670282A (en) * | 2018-03-28 | 2018-10-19 | 上海联影医疗科技有限公司 | A kind of osteosclerosis artifact correction method |
CN110916708A (en) * | 2019-12-26 | 2020-03-27 | 南京安科医疗科技有限公司 | CT scanning projection data artifact correction method and CT image reconstruction method |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2017128892A1 (en) * | 2016-01-30 | 2017-08-03 | 上海联影医疗科技有限公司 | Computed tomography artifact correction method and system |
US11126914B2 (en) * | 2017-10-11 | 2021-09-21 | General Electric Company | Image generation using machine learning |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN111627083B (en) | Bone hardening artifact correction method, device, computer equipment and readable storage medium | |
US11341613B2 (en) | System and method for image reconstruction | |
US10255696B2 (en) | System and method for image reconstruction | |
JP2019516460A (en) | System and method for noise control in multi-energy CT images based on spatial and spectral information | |
CN110636796B (en) | Beam hardening correction in X-ray dark field imaging | |
CN103649990A (en) | Spectral CT | |
WO2010127241A2 (en) | System and methods for fast implementation of equally-sloped tomography | |
EP2036038A1 (en) | Method and system for error compensation | |
JP2011510757A (en) | Radiation dose reduction and image quality improvement in tomography by using the subject's surroundings as dynamic constraints | |
US8842899B2 (en) | Method and apparatus for reconstructing image and medical image system employing the method | |
US11419566B2 (en) | Systems and methods for improving image quality with three-dimensional scout | |
CN111754436B (en) | Acceleration method for medical image artifact correction, computer device and storage medium | |
CN112204607B (en) | Scattering correction for X-ray imaging | |
CN117437144A (en) | Method and system for image denoising | |
CN112581556B (en) | Multi-energy CT image hardening correction method and device, computer equipment and storage medium | |
JP7346546B2 (en) | How to perform automatic adaptive energy settings for CT virtual monochromatic imaging | |
CN111583354A (en) | Training method for medical image processing unit and medical image motion estimation method | |
CN109259779B (en) | Medical image generation method and medical image processing system | |
CN111768463A (en) | PET image attenuation correction method and device and computer equipment | |
CN110840479A (en) | Bone component determination method, device, equipment and storage medium |
Legal Events
Date | Code | Title | Description
---|---|---|---
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |
CB02 | Change of applicant information | Address after: No. 2258, Chengbei Road, Jiading District, Shanghai 201807. Applicant after: Shanghai Lianying Medical Technology Co., Ltd. Address before: No. 2258, Chengbei Road, Jiading District, Shanghai 201807. Applicant before: SHANGHAI UNITED IMAGING HEALTHCARE Co., Ltd.
GR01 | Patent grant | |