CN115482183A - Image correction method and system - Google Patents

Image correction method and system

Info

Publication number
CN115482183A
CN115482183A
Authority
CN
China
Prior art keywords
image
quality evaluation
image quality
evaluation result
target object
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110602490.2A
Other languages
Chinese (zh)
Inventor
赵小芬 (Zhao Xiaofen)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai United Imaging Healthcare Co Ltd
Original Assignee
Shanghai United Imaging Healthcare Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai United Imaging Healthcare Co Ltd filed Critical Shanghai United Imaging Healthcare Co Ltd
Priority to CN202110602490.2A priority Critical patent/CN115482183A/en
Priority to PCT/CN2022/096256 priority patent/WO2022253227A1/en
Priority to EP22815271.6A priority patent/EP4330915A1/en
Publication of CN115482183A publication Critical patent/CN115482183A/en
Priority to US18/523,960 priority patent/US20240104705A1/en

Classifications

    • G06T 7/0012 Image analysis; biomedical image inspection
    • G06N 3/02 Neural networks
    • G06T 5/00 Image enhancement or restoration
    • G06T 7/11 Segmentation; region-based segmentation
    • G06T 2207/10081 Computed x-ray tomography [CT]
    • G06T 2207/10088 Magnetic resonance imaging [MRI]
    • G06T 2207/10104 Positron emission tomography [PET]
    • G06T 2207/30168 Image quality inspection

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Computing Systems (AREA)
  • Software Systems (AREA)
  • Evolutionary Computation (AREA)
  • Computational Linguistics (AREA)
  • Molecular Biology (AREA)
  • Biophysics (AREA)
  • General Engineering & Computer Science (AREA)
  • Biomedical Technology (AREA)
  • Mathematical Physics (AREA)
  • Data Mining & Analysis (AREA)
  • Artificial Intelligence (AREA)
  • Medical Informatics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Radiology & Medical Imaging (AREA)
  • Quality & Reliability (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Apparatus For Radiation Diagnosis (AREA)

Abstract

Embodiments of the present application disclose an image correction method and system. The image correction method comprises the following steps: acquiring a first image of a target object; determining an image quality evaluation result of the first image through a preset algorithm; and, in response to the image quality evaluation result of the first image not meeting a preset condition, automatically starting an image correction algorithm to correct the first image.

Description

Image correction method and system
Technical Field
The present application relates to the field of medical technology, and in particular, to a method and a system for correcting an image.
Background
Images obtained by scanning with imaging systems of various modalities, such as Magnetic Resonance Imaging (MRI), Magnetic Resonance Angiography (MRA), Computed Tomography (CT), and Positron Emission Tomography (PET), may be degraded by influences such as a non-uniform magnetic field or coil sensitivity, uneven contrast medium display, patient positioning, and movement of the target part, making the images difficult to use for diagnosis. For example, in a coronary artery CTA examination, the pulsation of the coronary artery generally causes motion artifacts in the CTA image, thereby affecting the diagnostic result.
Therefore, it is desirable to provide a method for correcting an image.
Disclosure of Invention
One of the embodiments of the present application provides a method for correcting an image, where the method includes: acquiring a first image of a target object; determining an image quality evaluation result of the first image through a preset algorithm; and automatically starting an image correction algorithm to correct the first image in response to the image quality evaluation result of the first image not meeting a preset condition.
One of the embodiments of the present application provides a system for correcting an image, the system including: an acquisition module configured to acquire a first image of a target object; a quality evaluation module configured to determine an image quality evaluation result of the first image through a preset algorithm; and an image correction module configured to automatically start an image correction algorithm to correct the first image in response to the image quality evaluation result of the first image not meeting a preset condition.
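For illustration only, the acquire/evaluate/correct flow of the method and system above can be sketched as a minimal pipeline. The scoring and smoothing functions here are toy stand-ins (a spread-based score and a 3-point mean filter), not the preset algorithm or correction algorithm of the embodiments, and the threshold is an assumed preset condition:

```python
QUALITY_THRESHOLD = 0.8  # assumed preset condition, not from the patent

def evaluate_quality(image):
    """Toy stand-in for the preset evaluation algorithm:
    score = 1 - spread of pixel values (more noise -> lower score)."""
    mean = sum(image) / len(image)
    spread = (sum((p - mean) ** 2 for p in image) / len(image)) ** 0.5
    return max(0.0, 1.0 - spread)

def correct_image(image):
    """Toy stand-in for an image correction algorithm: 3-point mean smoothing."""
    padded = [image[0]] + list(image) + [image[-1]]
    return [(padded[i - 1] + padded[i] + padded[i + 1]) / 3
            for i in range(1, len(padded) - 1)]

def correction_pipeline(first_image):
    """Acquire -> evaluate -> automatically correct if the condition fails."""
    score = evaluate_quality(first_image)
    if score >= QUALITY_THRESHOLD:
        return first_image, score          # preset condition met, no correction
    corrected = correct_image(first_image)  # correction started automatically
    return corrected, evaluate_quality(corrected)
```

A uniform (artifact-free) image passes through unchanged, while a noisy one is smoothed before being returned.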
In some embodiments, the determining the result of the image quality evaluation of the first image by the preset algorithm includes: determining a target area image of the target object based on the first image; and determining an image quality evaluation result of the first image through a preset algorithm based on the target area image.
In some embodiments, said determining a target area image of said target object based on said first image comprises: based on the first image, a target area image of the target object is determined using an image segmentation algorithm.
In some embodiments, the determining the image quality evaluation result of the first image through the preset algorithm includes: determining the image quality evaluation result of the first image using a trained image quality evaluation model, where the evaluation indices of the image quality include one or more of the clarity of anatomical structures, the contrast of the target part, the form fitting degree of the target part, the enhancement degree of the target part, the uniformity of the image signal, the level of image noise, and the degree of artifact suppression.
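Several of the listed evaluation indices can be approximated with simple statistics. The sketch below shows toy versions of two of them (image noise level and target-part contrast) and one plausible, assumed way of aggregating per-index scores into a composite result; a trained image quality evaluation model would learn such a mapping rather than use fixed weights:

```python
def noise_level(region):
    """Estimate image noise as the standard deviation of a flat region."""
    mean = sum(region) / len(region)
    return (sum((p - mean) ** 2 for p in region) / len(region)) ** 0.5

def target_contrast(target_pixels, background_pixels):
    """Contrast of the target part vs. background (simple mean difference)."""
    t = sum(target_pixels) / len(target_pixels)
    b = sum(background_pixels) / len(background_pixels)
    return abs(t - b)

def composite_score(index_scores, weights):
    """Weighted combination of per-index scores in [0, 1].
    The weighting scheme is an illustrative assumption."""
    total = sum(weights.values())
    return sum(index_scores[k] * w for k, w in weights.items()) / total
```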
In some embodiments, the target object is a coronary artery; the first image is an image containing the coronary arteries; the target region image is an image of the coronary artery.
In some embodiments, the image quality evaluation result of the first image comprises a composite image quality evaluation of the form fitting degree and/or the enhancement degree of the coronary artery.
In some embodiments, the method further comprises: acquiring a corrected first image; determining an image quality evaluation result of the corrected first image through the preset algorithm; and sending a prompt to a user in response to the corrected image quality evaluation result of the first image not meeting the preset condition.
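The correct-then-re-evaluate-then-prompt step above can be expressed generically. In this sketch the evaluation algorithm, correction algorithm, preset condition, and prompt mechanism are all passed in as callables, since the embodiments do not fix any of them:

```python
def correct_and_verify(image, evaluate, correct, meets_condition, notify_user):
    """Correct the first image, re-evaluate it with the same preset
    algorithm, and prompt the user if the corrected image still fails
    the preset condition."""
    corrected = correct(image)
    result = evaluate(corrected)
    if not meets_condition(result):
        # corrected image still does not meet the condition -> prompt the user
        notify_user(f"corrected image still fails quality check: {result}")
    return corrected, result
```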
In some embodiments, the method further comprises: and determining a corresponding correction algorithm according to the image quality evaluation result of the first image and/or a preset algorithm corresponding to the image quality evaluation.
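One plausible reading of determining a "corresponding correction algorithm" from the evaluation result is a lookup from each failing quality index to a correction algorithm. The index names and algorithm names below are illustrative assumptions, not taken from the patent:

```python
# Hypothetical mapping from a failing quality index to a correction algorithm.
CORRECTION_FOR_INDEX = {
    "artifact_suppression": "motion_artifact_correction",
    "signal_uniformity": "bias_field_correction",
    "image_noise": "denoising",
}

def select_correction(evaluation, threshold=0.8):
    """Return the correction algorithms for every known index whose
    score falls below the (assumed) threshold."""
    return [CORRECTION_FOR_INDEX[name]
            for name, score in evaluation.items()
            if name in CORRECTION_FOR_INDEX and score < threshold]
```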
One embodiment of the present application provides a computer-readable storage medium, where the storage medium stores computer instructions, and after the computer reads the computer instructions in the storage medium, the computer executes the method for correcting an image as described above.
Drawings
The present application will be further explained by way of exemplary embodiments, which will be described in detail by way of the accompanying drawings. These embodiments are not intended to be limiting, and in these embodiments like numerals refer to like structures, wherein:
FIG. 1 is a schematic diagram of an application scenario of a system for correcting an image according to some embodiments of the present application;
FIG. 2 is an exemplary flow chart of a method of correcting an image according to some embodiments of the present application;
FIG. 3 is an exemplary block diagram of a system for correcting an image shown in accordance with some embodiments of the present application.
Detailed Description
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings used in the description of the embodiments will be briefly introduced below. It is obvious that the drawings in the following description are only examples or embodiments of the application, and that for a person skilled in the art the application can also be applied to other similar contexts on the basis of these drawings without inventive effort. Unless otherwise apparent from the context, or otherwise indicated, like reference numbers in the figures refer to the same structure or operation.
It should be understood that "system", "apparatus", "unit" and/or "module" as used herein is a method for distinguishing different components, elements, parts, portions or assemblies at different levels. However, other words may be substituted by other expressions if they accomplish the same purpose.
As used in this application and the appended claims, the terms "a," "an," "the," and/or "said" do not refer specifically to the singular and may also include the plural, unless the context clearly indicates otherwise. In general, the terms "comprise" and "include" merely indicate that the explicitly identified steps and elements are included; these steps and elements do not form an exclusive list, and a method or apparatus may also include other steps or elements.
Flow charts are used herein to illustrate operations performed by systems according to embodiments of the present application. It should be understood that the preceding or following operations are not necessarily performed in the exact order shown. Rather, the various steps may be processed in reverse order or simultaneously. Meanwhile, other operations may be added to these processes, or one or more steps may be removed from them.
Fig. 1 is a schematic view of an application scenario of a correction system for an image according to some embodiments of the present application.
In some embodiments, as shown in fig. 1, the system 100 for correcting an image may include a processing device 110, a terminal device 120, a storage device 130, a network 140, and an imaging device 150.
In some embodiments, the system 100 may perform an image quality evaluation of an initial image (e.g., a first image) of the target object through one or more components included therein (e.g., the processing device 110, the terminal device 120, the storage device 130, the network 140, and/or the imaging device 150), and correct the initial image based on the evaluation result. For example, the processing device 110 may acquire a medical scan image of the target object from the imaging device 150 or the storage device 130 through the network 140, determine an image quality score of the medical scan image through a preset algorithm, and automatically initiate an image artifact correction algorithm to correct the medical scan image in response to the score being less than a preset threshold. In some embodiments, the processing device 110 may determine the image quality of the artifact-corrected medical image through a preset algorithm and, in response to the image quality evaluation result of the artifact-corrected medical image not satisfying a preset condition, send a prompt message to the terminal device 120 to prompt the user. In some embodiments, in response to the quality score or rating of the medical image of the target object being greater than or equal to a preset threshold, the processing device 110 may send the medical image to the terminal device 120 for output or to the storage device 130 for storage.
In some embodiments, the imaging device 150 may transmit the medical image of the target object to the processing device 110, the storage device 130, and/or the terminal device 120 via the network 140. In some embodiments, the processing device 110 may store the processed image in the storage device 130 or send the processed image to the terminal device 120.
The processing device 110 may process data and/or information obtained from the imaging device 150, the at least one terminal device 120, the storage device 130, or other components of the system 100. For example, the processing device 110 may acquire a first image of the target object from the imaging device 150 and perform motion artifact correction thereon. In some embodiments, the processing device 110 may be a stand-alone server or a group of servers, which may be centralized or distributed. In some embodiments, the processing device 110 may be regional or remote. In some embodiments, the processing device 110 may execute on a cloud platform. For example, the cloud platform may include one or any combination of a private cloud, a public cloud, a hybrid cloud, a community cloud, a decentralized cloud, an internal cloud, and the like. In some embodiments, processing device 110 may access information and/or data stored in terminal device 120, storage device 130, imaging device 150 through network 140. In some embodiments, the processing device 110 may be directly connected to the terminal device 120, the storage device 130, the imaging device 150 to access data and/or information stored therein.
In some embodiments, the processing device 110 may include one or more processors (e.g., a single-chip processor or a multi-chip processor). By way of example only, the processing device 110 may include a central processing unit (CPU), an application-specific integrated circuit (ASIC), an application-specific instruction-set processor (ASIP), a graphics processing unit (GPU), a physics processing unit (PPU), a digital signal processor (DSP), a field programmable gate array (FPGA), a programmable logic device (PLD), a controller, a microcontroller unit, a reduced instruction set computer (RISC), a microprocessor, or the like, or any combination thereof. In some embodiments, the processing device 110 may be part of the imaging device 150 or the terminal device 120. For example, the processing device 110 may be integrated within the imaging device 150 to analyze and process the first image of the target object and obtain a corrected first image.
The terminal device 120 may be in communication and/or connected with the imaging device 150, the processing device 110, and/or the storage device 130. For example, a user may interact with the imaging device 150 through the terminal device 120 to control one or more components of the imaging device 150. In some embodiments, the terminal device 120 may include a mobile device 121, a tablet computer 122, a laptop computer 123, the like, or any combination thereof. For example, the mobile device 121 may include a mobile joystick, a Personal Digital Assistant (PDA), a smart phone, or the like, or any combination thereof.
In some embodiments, terminal device 120 may include an input device, an output device, and the like. The input device may be selected from keyboard input, touch screen (e.g., with tactile or haptic feedback) input, voice input, eye tracking input, gesture tracking input, brain monitoring system input, image input, video input, or any other similar input mechanism. Input information received via the input device may be transmitted, e.g., via a bus, to the processing device 110 for further processing. Other types of input devices may include cursor control devices, such as a mouse, a trackball, or cursor direction keys. In some embodiments, an operator (e.g., a medical professional) may input instructions reflecting the medical image category of the target object via an input device. Output devices may include a display, speakers, printer, etc., or any combination thereof. In some embodiments, the output device may be used to output a medical image (e.g., a first image of the target object) scanned by the imaging device 150, and/or an image determined by the processing device 110 (e.g., a corrected first image of the target object), and/or the like. In some embodiments, terminal device 120 may be part of processing device 110.
Storage device 130 may store data, instructions, and/or any other information. For example, the storage device 130 may store a first image of the target object acquired by the imaging device 150, a first image of the target object after artifact correction obtained by the processing device 110, and so on. In some embodiments, the storage device 130 may store data obtained from the imaging device 150, the at least one terminal device 120, and/or the processing device 110. In some embodiments, storage device 130 may store data and/or instructions for use by processing device 110 in performing or using the exemplary methods described in this application.
In some embodiments, the storage device 130 may include mass storage, removable storage, volatile read-write memory, read-only memory (ROM), and the like, or any combination thereof. Exemplary mass storage devices may include magnetic disks, optical discs, solid-state disks, and the like. Exemplary removable memory may include flash drives, floppy disks, optical discs, memory cards, compact discs, magnetic tape, and the like. Exemplary volatile read-write memory may include random access memory (RAM). Exemplary RAM may include dynamic RAM (DRAM), double data rate synchronous dynamic RAM (DDR SDRAM), static RAM (SRAM), thyristor RAM (T-RAM), zero-capacitor RAM (Z-RAM), and the like. Exemplary ROM may include mask ROM (MROM), programmable ROM (PROM), erasable programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), compact disc ROM (CD-ROM), digital versatile disc ROM, and the like. In some embodiments, the storage device 130 may be implemented on a cloud platform. By way of example only, the cloud platform may include a private cloud, a public cloud, a hybrid cloud, a community cloud, a distributed cloud, an internal cloud, a multi-tiered cloud, and the like, or any combination thereof. In some embodiments, certain types of historical data may be stored uniformly on one cloud platform so that multiple processing devices 110 or terminal devices 120 can access or update the data, ensuring real-time availability and cross-platform use of the data.
In some embodiments, at least one component of the system 100 (e.g., the processing device 110, the terminal device 120, the storage device 130, and the imaging device 150) may exchange information and/or data with at least one other component of the system 100 via the network 140. In some embodiments, the network 140 may be any form of wired or wireless network, or any combination thereof. By way of example only, the network 140 may include a cable network, a wireline network, a fiber-optic network, a telecommunications network, an intranet, the Internet, a local area network (LAN), a wide area network (WAN), a wireless local area network (WLAN), a metropolitan area network (MAN), a public switched telephone network (PSTN), a Bluetooth™ network, a ZigBee™ network, a near field communication (NFC) network, a global system for mobile communications (GSM) network, a code division multiple access (CDMA) network, a time division multiple access (TDMA) network, a general packet radio service (GPRS) network, an enhanced data rates for GSM evolution (EDGE) network, a wideband code division multiple access (WCDMA) network, a high speed downlink packet access (HSDPA) network, a long term evolution (LTE) network, a user datagram protocol (UDP) network, a transmission control protocol/internet protocol (TCP/IP) network, a short message service (SMS) network, a wireless application protocol (WAP) network, an ultra wideband (UWB) network, a mobile communication (1G, 2G, 3G, 4G, 5G) network, Wi-Fi, Li-Fi, narrowband internet of things (NB-IoT), and the like, or any combination thereof.
The imaging device 150 may be used to scan a target object within the examination region, resulting in scan data for the target object. In some embodiments, the target object may include a biological object and/or a non-biological object. For example, the target object may include a particular portion of the body, such as the head, chest, abdomen, coronary arteries, etc., or any combination thereof. As another example, the target object may be an artificial composition of organic and/or inorganic matter, living or non-living. In some embodiments, the medical image data related to the target object may include projection data, one or more scan images, etc. of the target object.
In some embodiments, the imaging device 150 may be a non-invasive biomedical imaging device used for disease diagnosis or research purposes. For example, the imaging device 150 may include a single-modality scanner and/or a multi-modality scanner. The single-modality scanner may include, for example, an ultrasound (US) scanner, an X-ray scanner, a computed tomography (CT) scanner, a magnetic resonance imaging (MRI) scanner, a positron emission tomography (PET) scanner, an optical coherence tomography (OCT) scanner, an intravascular ultrasound (IVUS) scanner, a near-infrared spectroscopy (NIRS) scanner, a far-infrared (FIR) scanner, or the like, or any combination thereof. The multi-modality scanner may include, for example, an X-ray imaging-magnetic resonance imaging (X-ray-MRI) scanner, a positron emission tomography-X-ray imaging (PET-X-ray) scanner, a single photon emission computed tomography-magnetic resonance imaging (SPECT-MRI) scanner, a positron emission tomography-computed tomography (PET-CT) scanner, a digital subtraction angiography-magnetic resonance imaging (DSA-MRI) scanner, and the like. The scanners provided above are for illustration purposes only and are not intended to limit the scope of the present application. As used herein, the term "imaging modality" or "modality" broadly refers to an imaging method or technique that collects, generates, processes, and/or analyzes imaging information of a target object.
In some embodiments, the imaging device 150 may include a gantry, a detector, an examination region, a scanning bed, and a radiation source. The gantry may be used to support the detector and the radiation source. The scanning bed may be used to position the target object for scanning; for example, the target object may lie supine, on the side, or prone on the scanning bed. In some embodiments, the scanning bed may be a device separate from the imaging device 150. The target object may include a patient, a phantom, or another scanned object. The radiation source may emit radiation toward the target object to irradiate it, and the detector may be configured to receive radiation that has passed through the target object. In some embodiments, the imaging device 150 may be or include an X-ray imaging device, such as a digital subtraction angiography (DSA) device, a digital radiography (DR) device, a computed radiography (CR) device, a digital fluoroscopy (DF) device, a CT scanner, a magnetic resonance scanner, a mammography machine, a C-arm device, and the like.
In some embodiments, the imaging device 150 may also include a display screen. The display screen may be used to observe the imaging device 150 and/or data information of a target object scanned by the imaging device 150. For example, the medical staff can observe the lesion information of the detection part of the chest, the bone, the mammary gland and the like of the patient through the display screen. In some embodiments, the display screen may include a Liquid Crystal Display (LCD), a Light Emitting Diode (LED) based display, a flat panel back display, a curved screen, a television device, a Cathode Ray Tube (CRT), a touch screen, or the like, or combinations thereof. In some embodiments, the display screen may also include output devices such as speakers, printers, and/or input devices such as a keyboard, mouse, and the like.
In some embodiments, imaging device 150 may include modules and/or components for performing imaging and/or correlation analysis. For example, the imaging device 150 may include a radiation source, a detector, and a processor (e.g., processing device 110), among others.
In some embodiments, the image data (e.g., the first image of the target object) acquired by the imaging device 150 may be communicated to the processing device 110 for further analysis. Additionally or alternatively, image data acquired by imaging device 150 may be sent to a terminal device (e.g., terminal device 120) for display and/or a storage device (e.g., storage device 130) for storage.
It should be noted that the foregoing description is provided for illustrative purposes only, and is not intended to limit the scope of the present application. Many variations and modifications will occur to those skilled in the art in light of the teachings herein. The features, structures, methods, and other features of the exemplary embodiments described herein may be combined in various ways to obtain additional and/or alternative exemplary embodiments. For example, the storage device 130 may be a data storage device comprising a cloud computing platform (e.g., public cloud, private cloud, community and hybrid cloud, etc.). However, such changes and modifications do not depart from the scope of the present application.
In some embodiments, during scanning and imaging by the imaging device, influences such as a non-uniform magnetic field or coil sensitivity, uneven contrast agent display, patient positioning, and movement of the target part may degrade the obtained scan image and affect the diagnosis of the target part. For example, in a coronary CTA diagnostic procedure, motion artifacts may be present in the obtained scan images due to the pulsation of the coronary artery, resulting in poor image clarity, heavy artifacts, and difficult diagnosis. In some embodiments, the obtained image may be reconstructed by an image correction algorithm to obtain a corrected medical image. However, before the image is reconstructed by the image correction algorithm, medical staff need to check the image quality, and when the image quality is poor, they must manually add a reconstruction sequence and select a corresponding correction algorithm to reconstruct the image. This approach is labor-intensive; when the scanning throughput is large, the workload of the medical staff is heavy, which may lead to misjudgment of image quality and affect diagnosis.
The present application provides an image correction method that can intelligently judge whether the image quality of a scanned image or an initially reconstructed image of a target object meets a preset condition and, when it does not, automatically start a corresponding image correction algorithm to correct the image. In some embodiments, intelligently evaluating the image quality and intelligently starting an image correction algorithm based on the evaluation result can improve the efficiency and accuracy of image correction, save manpower, and improve the efficiency of medical diagnosis.
FIG. 2 is an exemplary flow chart of a method of correcting an image shown in accordance with some embodiments of the present application.
In some embodiments, the method 200 for correcting an image may be performed by the system 100 for correcting an image (e.g., the processing device 110). For example, the method 200 may be stored in a storage device (e.g., storage device 130) in the form of a program or instructions that when executed by the image correction system 100 (e.g., processing device 110) may implement the method 200. In some embodiments, the method 200 may be performed by a correction system 300 for an image.
Step 210, a first image of the target object is acquired. In some embodiments, step 210 may be performed by the acquisition module 310.
Illustratively, the target object may be any object for which an image scan is desired, e.g. a patient or some part of the patient's body. In some embodiments, the first image may reflect projection data of a target object scanned by the imaging device. In some embodiments, the first image may be an original scanned image of the target object. For example, the first image may be raw scan data obtained by scanning the target object by the imaging device 150. In some embodiments, the first image may be an image reconstructed based on the raw scan data. For example, the first image may be an FBP (filtered back projection) map obtained by the imaging device 150 based on raw scan data of the target object. In some embodiments, the first image may be an image determined by a preset algorithm. For example, the first image may be the best time phase map selected by the imaging device 150 through its built-in algorithm from the existing FBP map of the target object.
In some embodiments, the first image may be an image with artifacts. Artifacts are spurious features of various forms that appear in an image but do not correspond to any structure of the scanned object. In some embodiments, the morphology of an artifact may include a triangle, an arc, a trailing tail, etc., or any combination thereof. In some embodiments, the causes of artifacts may include: physiological activity of the target object (e.g., movement, respiration, heartbeat, pulse, bowel movement, etc.), metallic foreign objects inside and/or outside the target object, sampling aliasing during scanning, beam hardening, noise, and the like. Artifacts caused by voluntary movement (e.g., limb movement, swallowing, etc.) or involuntary movement (e.g., heartbeat, vascular pulsation, etc.) of the target object during scanning are referred to as motion artifacts.
In some embodiments, the processing device may acquire a first image of the target object from an imaging device (e.g., imaging device 150). In some embodiments, the processing device may retrieve a first image of the target object from a storage device (e.g., storage device 130). In some embodiments, the processing device may obtain the first image of the target object from other data sources and in any reasonable manner, which is not limited by this specification.
Step 220, determining an image quality evaluation result of the first image through a preset algorithm. In some embodiments, step 220 may be performed by quality assessment module 320.
The image quality evaluation result can reflect how accurately the image presents information about the target object. For example, the image quality evaluation result of a medical image of a coronary artery may reflect how accurately the shape and size of the user's coronary artery are presented in the medical image. In some embodiments, the image quality evaluation result may include a quality score or quality rating of the image. Generally, the higher the quality score or rating, the more accurately the corresponding image presents information. Generally, a smaller numeric value corresponds to a higher rating; for example, the first rating is higher than the second rating. In some alternative embodiments, a smaller numeric value may correspond to a lower rating; this specification is not limited thereto.
In some embodiments, the processing device may determine a target area image of the target object based on the first image of the target object, and determine an image quality evaluation result of the first image through a preset algorithm based on the target area image. The target area image refers to an image of the region of the target object that is of interest to a user (e.g., medical staff). In some embodiments, the target area image may be the part of the first image that contains only the target object. For example, when the first image is an image including the coronary arteries of the heart, the target area image may be an image containing only the coronary arteries. In some embodiments, the processing device may determine the target area image of the target object through an image segmentation algorithm. For example, the image segmentation algorithm may include, but is not limited to, a threshold-based segmentation method, a region-based segmentation method, an edge-based segmentation method, a segmentation method based on a specific theory, a histogram-based segmentation method, a genetic-algorithm-based segmentation method, etc., or any combination thereof. In some embodiments, the processing device may determine the target area image of the target object by other feasible methods, for example, a trained neural network model, which is not limited by this specification.
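As a concrete illustration of the threshold-based segmentation method mentioned above, the following Python sketch builds a binary mask from a window of CT values. The HU window, the function name, and the toy image are illustrative assumptions, not values prescribed by this specification.

```python
import numpy as np

def segment_target_area(image, lower=150, upper=600):
    """Minimal threshold-based segmentation: keep pixels whose CT
    values (in HU) fall inside [lower, upper]. The HU window is
    illustrative only."""
    return (image >= lower) & (image <= upper)

# Tiny 3x3 "image" of CT values in HU.
image = np.array([[100, 300, 700],
                  [450, 200, 120],
                  [800, 160, 500]])
mask = segment_target_area(image)  # boolean mask of the target area
```

In practice the mask would be applied to the first image to crop out the target area image before quality evaluation.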
In some embodiments, the processing device may take the image quality evaluation result of the target area image, determined through a preset algorithm, as the image quality evaluation result of the first image. In some embodiments, the preset algorithm may specify the evaluation indexes, scoring criteria, weights, etc. for the target object. For example, the preset algorithm may determine a quality score or rating of the image according to indexes such as the shape, area, center line, diameter, and edge thickness of the target object in the image and the weight corresponding to each part. As another example, the preset algorithm may determine the quality score or rating of the image according to indexes such as the contrast, gray scale, and CT values of the pixels of the target object in the image.
For example only, when the target object is a coronary artery, the processing device 110 may determine the region corresponding to the coronary artery in the first image through an image segmentation algorithm, based on the first image of the coronary artery acquired from the imaging device 150, and segment each part of the coronary artery image region to determine the target area images of the coronary artery. The processing device 110 may determine the image quality evaluation result of the first image based on a composite image quality evaluation of the degree of morphological fit and the degree of enhancement of the coronary artery in the target area images. In some embodiments, among the segmented coronary artery region images, if the image quality evaluation result of at least one image does not satisfy the preset condition, correction is required.
In some embodiments, the coronary morphology may include the shape, contour, diameter, thickness, etc., of the coronary arteries, or any combination thereof. In some embodiments, the processing device 110 may assign a morphology score or grade based on the similarity (or degree of fit) between the morphology of the coronary artery in the target area image and its standard morphology. For example, if the similarity between the morphology of the coronary artery in the target area image and the standard morphology is more than 95%, the score is 5 points or the first grade; at 90%-95%, 4 points or the second grade; at 85%-90%, 3 points or the third grade; at 50%-85%, 2 points or the fourth grade; below 50%, 1 point or the fifth grade.
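The tiered morphology scoring described above can be sketched as a simple mapping from similarity to score. The handling of the exact tier boundaries is an assumption, since the specification leaves them open.

```python
def morphology_score(similarity):
    """Map the similarity (0.0-1.0) between the segmented coronary
    morphology and its standard morphology to a 1-5 score, following
    the tiers above. Boundary handling is an assumption."""
    if similarity > 0.95:
        return 5
    if similarity >= 0.90:
        return 4
    if similarity >= 0.85:
        return 3
    if similarity >= 0.50:
        return 2
    return 1
```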
The degree of enhancement refers to the CT value of the coronary artery region in the image. The CT value reflects the density of human tissue and is generally expressed in Hounsfield units (HU). In some embodiments, the processing device 110 may score or grade the degree of enhancement based on the mean CT value of the coronary artery in the target area image. For example, when the mean CT value of the coronary artery in the target area image is greater than 450HU, it may be scored as 5 points; at 400HU-450HU, 4 points; at 300HU-400HU, 3 points; at 200HU-300HU, 2 points; below 200HU, 1 point. In some embodiments, the processing device 110 may score or grade the degree of enhancement based on a reference CT value for the coronary artery. For example, the reference CT value of the coronary artery may be set to 300HU; when the mean CT value of the coronary artery in the target area image is greater than 300HU, the score is 5 points, and when it is less than 300HU, the score is 1 point.
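The mean-CT-value tiers above can likewise be sketched as a mapping; as before, the handling of the exact tier boundaries is an assumption.

```python
def enhancement_score(mean_hu):
    """Map the mean coronary CT value (in HU) to a 1-5 enhancement
    score, following the tiers above. Boundary handling is an
    assumption."""
    if mean_hu > 450:
        return 5
    if mean_hu >= 400:
        return 4
    if mean_hu >= 300:
        return 3
    if mean_hu >= 200:
        return 2
    return 1
```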
In some embodiments, the morphology score/grade and the enhancement score/grade of the coronary artery may be combined in any feasible way, such as a weighted average. For example, the composite image quality evaluation result may be obtained by calculating E = a × X + b × Y, where a and b are weighting coefficients, and X and Y are the scores or grades of the corresponding evaluation indexes. For example, X may be the morphology score of the coronary artery and Y the enhancement score; if a = 0.7, b = 0.3, X = 5, and Y = 3, the composite score is E = 5 × 0.7 + 3 × 0.3 = 4.4.
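The weighted combination E = a × X + b × Y can be written directly in code; the weights 0.7 and 0.3 are the example values above, not fixed by the specification.

```python
def composite_score(x, y, a=0.7, b=0.3):
    """Weighted combination E = a*X + b*Y of the morphology score X
    and the enhancement score Y, using the example weights above."""
    return a * x + b * y

e = composite_score(5, 3)  # reproduces the worked example: E = 4.4
```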
It is to be understood that the above evaluation indexes and their corresponding values are merely examples. In some embodiments, one or more of the evaluation indexes, evaluation criteria, weight values, etc. of the image quality may be adjusted periodically or at any time according to the actual situation. For example, the evaluation indexes may include one or more of the clarity of anatomical structures, the contrast of the target portion, the uniformity of the image signal, the level of image noise, the degree of artifact suppression, the edge sharpness of the coronary arteries, and the like, and the user may adjust the weight values through an open configuration item, which is not limited by this specification.
In some embodiments, the first image may optionally be pre-processed before the image quality evaluation is performed. For example, the pre-processing may include, but is not limited to, horizontal (left-right) flipping, translation in the horizontal and vertical directions, random rotation, edge padding, contrast adjustment, image normalization, etc., or any combination thereof.
In some embodiments, the processing device may determine the image quality evaluation result of the first image using a trained image quality evaluation model. For example, the image quality evaluation model may include, but is not limited to, a GoogLeNet model, an AlexNet model, a VGG model, a ResNet model, etc. In some embodiments, the input of the image quality evaluation model may be the first image, and the output may be a quality score or rating of the first image, such as a score in [0, 10], or a rating from the first grade to the third grade. In some embodiments, the output of the image quality evaluation model may be whether the image requires correction. For example, the output may be 1 or 0, where 1 indicates no correction is required and 0 indicates correction is required. In some embodiments, each evaluation index may correspond to its own trained image quality evaluation model. For example, the morphology score of the coronary artery may be obtained from a morphology evaluation model, and the enhancement score from an enhancement evaluation model. In some embodiments, multiple evaluation indexes may correspond to the same quality evaluation model. For example, one image quality evaluation model may simultaneously evaluate the degree of morphological fit and the degree of enhancement of the coronary artery. In some embodiments, the output of the image quality evaluation model may be an evaluation result for each evaluation index separately. For example, the output of the image quality evaluation model may be: coronary artery morphology 5 points & enhancement 3 points. In some embodiments, the output of the image quality evaluation model may be a composite image quality evaluation result over all evaluation indexes. In some embodiments, an initial model may be trained on multiple sets of labeled training samples to obtain the trained image quality evaluation model.
In some embodiments, the training may be ended when the trained image quality evaluation model satisfies a preset training condition. The preset training condition may be that the result of the loss function converges or is smaller than a preset value, and the like.
In some embodiments, the image segmentation algorithm and/or the preset algorithm may be stored in a storage device (e.g., storage device 130) in the form of software.
Step 230, determining whether the image quality evaluation result of the first image meets a preset condition. In some embodiments, step 230 may be performed by quality assessment module 320.
The preset condition may reflect a requirement for image quality; for example, the preset condition may be an image quality score or grade that the first image needs to meet. In some embodiments, the processing device may determine whether the image quality evaluation result of the first image satisfies the preset condition. When the image quality evaluation result of the first image satisfies the preset condition, the first image is output as the final medical image of the target object and the medical imaging process ends; when it does not satisfy the preset condition, step 233 is executed to automatically start the image correction algorithm. In some embodiments, step 233 may be performed by the image correction module 330.
In some embodiments, the preset condition may be a quality score or grade value that the image needs to satisfy, e.g., a preset threshold. In some embodiments, the preset threshold may be any reasonable value. For example, when the full quality score is 10 points, the preset threshold may be set to 8 points. In some embodiments, the preset threshold may be determined based on the diagnostic needs of the target object. For example, in coronary CTA, the quality of the coronary artery image may be scored on a 1-5-point scale, and accordingly the preset threshold may be set to 3 points. When the image quality score is 3 points or more, the image is diagnosable and may be output directly; when the score is less than 3 points, the image is non-diagnostic, the image artifact correction algorithm is automatically started, and the group of data is reconstructed with a coronary artery motion artifact correction algorithm, so as to improve the quality of the coronary artery image. In some embodiments, the preset threshold may be adjusted periodically or at any time.
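The threshold decision above reduces to a one-line predicate; the 1-5 scale and the default threshold of 3 follow the coronary CTA example, and the function name is illustrative.

```python
def needs_correction(quality_score, threshold=3):
    """Return True when the image quality score falls below the
    preset threshold, i.e., when the correction algorithm should be
    started automatically (step 233). Scores at or above the
    threshold indicate a diagnosable image."""
    return quality_score < threshold
```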
In some embodiments, the image correction algorithm may be any feasible correction algorithm, for example, a filtered back projection reconstruction algorithm, a registration algorithm, a noise processing algorithm, a contrast processing algorithm, an artifact removal algorithm, etc., which is not limited by this specification. In some embodiments, the processing device may determine the corresponding correction algorithm based on the image quality evaluation result of the first image. For example, when the image quality scores differ (e.g., 3 points versus 1 point), different artifact correction algorithms may be selected. As another example, different image correction algorithms may be selected when the morphology score of the coronary artery in the image is high but the enhancement score is low, versus when the morphology score is low. In some embodiments, the processing device may determine the corresponding correction algorithm according to the preset algorithm used for the image quality evaluation. For example, when the image quality evaluation algorithm is an image noise evaluation model, the corresponding correction algorithm may be a noise processing algorithm; when the image quality evaluation algorithm is an image artifact evaluation algorithm, the corresponding correction algorithm may be an artifact removal algorithm; and so on. In some embodiments, the processing device may determine the corresponding correction algorithm according to both the image quality of the first image and the preset algorithm used for the image quality evaluation. In some embodiments, different image quality evaluation results and/or different image quality evaluation algorithms may correspond to the same or different correction algorithms.
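The per-index dispatch described above can be sketched as a lookup over the evaluation results. Both the index names and the mapping below are assumptions for illustration, not the mapping mandated by the specification.

```python
def select_correction_algorithm(evaluation):
    """Illustrative dispatch from per-index evaluation results
    (a dict of index name -> 1-5 score) to a correction algorithm
    label. Missing indexes default to a passing score of 5."""
    if evaluation.get("artifact_score", 5) < 3:
        return "artifact_removal"
    if evaluation.get("noise_score", 5) < 3:
        return "noise_processing"
    if evaluation.get("enhancement_score", 5) < 3:
        return "contrast_processing"
    return "no_correction_needed"
```

A real implementation could map each label to the stored correction routine; the point is only that the choice of algorithm can be driven by which evaluation index failed.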
In some embodiments, the image correction algorithm may be stored in a storage device (e.g., storage device 130) in the form of software.
Step 240, acquiring the corrected first image. In some embodiments, step 240 may be performed by acquisition module 310.
In some embodiments, the processing device may acquire the corrected first image after execution of the image correction algorithm is complete. For example, the acquisition module 310 may acquire the corrected first image from the imaging device 150 after the image correction algorithm is performed. In some embodiments, the image quality of the corrected image may or may not meet a preset condition, e.g. the artifact-corrected first image may or may not contain motion artifacts.
Step 250, determining an image quality evaluation result of the corrected first image through a preset algorithm. In some embodiments, step 250 may be performed by the quality evaluation module 320.
In some embodiments, the processing device may determine a target area image of the target object based on the corrected first image, and determine an image quality evaluation result of the corrected first image by a preset algorithm based on the target area image. In some embodiments, the processing device may directly score (or rate) the quality based on the corrected first image. In some embodiments, the processing device may determine the image quality evaluation result of the corrected first image by an algorithm that is the same as or different from an algorithm for evaluating the image quality of the first image. For more details on the image quality evaluation, reference may be made to the related description in step 220, which is not repeated herein.
Due to individual differences and other factors, the corrected first image may still fail to meet the preset condition; for example, artifacts may remain in the corrected image, or its quality may still be poor. Determining the image quality of the corrected image thus provides a further check on the correction effect.
Step 260, determining whether the image quality evaluation result of the corrected first image satisfies a preset condition. In some embodiments, step 260 may be performed by the quality evaluation module 320.
In some embodiments, the processing device may determine whether the quality score of the corrected first image is smaller than a preset threshold, and when the quality score of the corrected first image is greater than or equal to the preset threshold, output the corrected first image as a final medical image of the target object, and the medical imaging process is ended; when the quality score of the corrected first image is smaller than the preset threshold, step 263 may be executed to send a prompt to the user. In some embodiments, the prompting means may include, but is not limited to, a musical prompt, a voice announcement prompt, an information prompt, a video prompt, etc., or any combination thereof. In some embodiments, the processing device may send the reminder information and/or the corrected first image to a terminal device, e.g., terminal device 120, to remind the user.
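The post-correction branch above (output the image, or prompt the user at step 263) can be sketched as follows; the return labels are illustrative, not part of the specification.

```python
def post_correction_action(quality_score, threshold=3):
    """After correction: output the corrected image when its quality
    score meets the preset threshold, otherwise prompt the user
    (step 263). Return values are illustrative labels."""
    return "output_image" if quality_score >= threshold else "prompt_user"
```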
When the quality of the corrected image still does not meet the preset condition, a prompt is sent to the user. The user can then manually assess the image quality of the corrected image, or the target object can be rescanned, or the corrected image can be processed in other ways, so as to obtain a higher-quality medical image and improve diagnostic accuracy.
In some embodiments, the preset conditions for the pre-correction and post-correction image quality evaluation results may be set to different evaluation conditions; for example, the score thresholds may be set to different values. For example, the threshold corresponding to the first image may be set lower than the threshold corresponding to the corrected first image. In some embodiments, the quality score threshold of the corrected first image may be adjusted according to the user's secondary determination. For example, when, after receiving the prompt, the user manually determines that the artifact-corrected first image can be used for diagnosis, the preset threshold corresponding to the artifact-corrected first image may be reduced.
It should be noted that the above description of method 200 is for purposes of example and illustration only and is not intended to limit the scope of applicability of the present application. Various modifications and alterations to method 200 will be apparent to those skilled in the art in light of the present disclosure. However, such modifications and variations are intended to be within the scope of the present application. For example, the method 200 may be used for correction of images other than medical images. For another example, in step 260, a new correction algorithm may be stored in the processing device, and automatically started when the quality score of the corrected first image is still less than the preset threshold.
FIG. 3 is an exemplary block diagram of a system for correcting an image, shown in accordance with some embodiments of the present application.
As shown in fig. 3, the system may include an acquisition module 310, a quality evaluation module 320, and an image correction module 330. These modules may also be implemented as an application or a set of instructions that are read and executed by a processing device. Further, a module may be any combination of hardware circuitry and applications/instructions. For example, a module may be part of a processor when a processing device or processor executes an application/set of instructions.
The acquisition module 310 may be used to acquire an image of a target object. In some embodiments, the acquisition module 310 may be used to acquire a first image of a target object. In some embodiments, the acquisition module 310 may be configured to acquire the corrected first image. In some embodiments, the target object may be a coronary artery and the first image may be a medical image of the coronary artery.
The quality evaluation module 320 may be used to evaluate image quality. In some embodiments, the quality evaluation module 320 may be configured to determine an image quality evaluation result of the first image of the target object through a preset algorithm. In some embodiments, the quality evaluation module 320 may be configured to determine an image quality evaluation result of the corrected first image through a preset algorithm. In some embodiments, the quality evaluation module 320 may determine a target area image of the target object based on the first image of the target object or the corrected first image, and determine the image quality evaluation result through a preset algorithm based on the target area image. In some embodiments, the quality evaluation module 320 may determine the target area image of the target object using an image segmentation algorithm based on the first image of the target object or the corrected first image. In some embodiments, the target area image may be an image containing only the coronary arteries. In some embodiments, the quality evaluation module 320 may determine the image quality evaluation result of the first image or the corrected first image using a trained image quality evaluation model. In some embodiments, the image quality evaluation of the first image may include a composite image quality evaluation of the degree of morphological fit and the degree of enhancement of the coronary artery.
The image correction module 330 may be used to correct the image. In some embodiments, the image correction module 330 may automatically start the image correction algorithm to correct the first image in response to the image quality evaluation result of the first image not satisfying the preset condition. In some embodiments, the image correction module 330 may issue a prompt to the user in response to the result of the image quality evaluation of the corrected first image not satisfying the preset condition.
It should be appreciated that the system 300 shown in FIG. 3 and its modules may be implemented in a variety of ways. For example, in some embodiments, the system 300 and its modules may be implemented in hardware, software, or a combination of the two. The hardware portion may be implemented using dedicated logic; the software portion may be stored in a memory and executed by a suitable instruction execution system, such as a microprocessor or specially designed hardware. Those skilled in the art will appreciate that the methods and systems described above may be implemented using computer-executable instructions and/or embodied in processor control code, for example, code provided on a carrier medium such as a diskette, CD- or DVD-ROM, programmable memory such as read-only memory (firmware), or a data carrier such as an optical or electronic signal carrier. The system and its modules of the present application may be implemented not only by hardware circuits such as very large scale integrated circuits or gate arrays, semiconductors such as logic chips and transistors, or programmable hardware devices such as field programmable gate arrays and programmable logic devices, but also by software executed by various types of processors, or by a combination of the above hardware circuits and software (e.g., firmware).
It should be noted that the above description of the system 300 and its modules is merely for convenience of description and does not limit the present disclosure to the illustrated embodiments. It will be appreciated by those skilled in the art that, given an understanding of the principle of the system, modules may be combined in any configuration, or a subsystem may be connected to other modules, without departing from this principle. In some embodiments, the acquisition module 310, the quality evaluation module 320, and the image correction module 330 may be different modules in one system, or a single module may implement the functions of two or more of these modules. In some embodiments, the acquisition module 310, the quality evaluation module 320, and the image correction module 330 may share one storage module, or each module may have its own storage module. Such variations are within the scope of the present disclosure.
The beneficial effects that may be brought by the embodiments of the present description include, but are not limited to: (1) The quality evaluation is carried out on the image of the target object through a preset algorithm, so that the efficiency of evaluating the image quality can be improved; (2) When the image quality does not meet the preset condition, the correction algorithm is automatically started, so that the medical image quality can be improved, and the process of manual intervention screening can be reduced; (3) The corrected image is subjected to secondary quality evaluation, and a user is prompted when the image quality does not meet a preset condition, so that the misjudgment rate can be reduced, and the diagnosis efficiency can be improved; (4) According to different image quality evaluation results and/or image quality evaluation algorithms, the corresponding correction algorithm is selected to perform targeted correction on the image which does not meet the preset condition, so that the image correction effect can be improved, and the diagnosis efficiency is further improved. It is to be noted that different embodiments may produce different advantages, and in different embodiments, the advantages that may be produced may be any one or combination of the above, or any other advantages that may be obtained.
Having thus described the basic concept, it will be apparent to those skilled in the art that the foregoing detailed disclosure is to be regarded as illustrative only and not as limiting the present specification. Various modifications, improvements and adaptations to the present description may occur to those skilled in the art, although not explicitly described herein. Such modifications, improvements and adaptations are proposed in the present specification and thus fall within the spirit and scope of the exemplary embodiments of the present specification.
Also, the description uses specific words to describe embodiments of the specification. Reference throughout this specification to "one embodiment," "an embodiment," and/or "some embodiments" means that a particular feature, structure, or characteristic described in connection with at least one embodiment of the specification is included. Therefore, it is emphasized and should be appreciated that two or more references to "an embodiment" or "one embodiment" or "an alternative embodiment" in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, some features, structures, or characteristics of one or more embodiments of the specification may be combined as appropriate.
Additionally, the order in which the elements and sequences of the process are recited in the specification, the use of alphanumeric characters, or other designations, is not intended to limit the order in which the processes and methods of the specification occur, unless otherwise specified in the claims. While various presently contemplated embodiments of the invention have been discussed in the foregoing disclosure by way of example, it is to be understood that such detail is solely for that purpose and that the appended claims are not limited to the disclosed embodiments, but, on the contrary, are intended to cover all modifications and equivalent arrangements that are within the spirit and scope of the embodiments herein. For example, although the system components described above may be implemented by hardware devices, they may also be implemented by software-only solutions, such as installing the described system on an existing server or mobile device.
Similarly, it should be noted that in the preceding description of embodiments of the present specification, various features are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding the understanding of one or more of the embodiments. This method of disclosure, however, is not intended to imply that more features are required than are expressly recited in each claim. Indeed, claimed embodiments may have fewer than all of the features of a single embodiment disclosed above.
Numerals describing quantities of components, attributes, etc. are used in some embodiments. It should be understood that such numerals used in the description of the embodiments are modified in some instances by the terms "about", "approximately" or "substantially". Unless otherwise indicated, "about", "approximately" or "substantially" indicates that the number allows a variation of ±20%. Accordingly, in some embodiments, the numerical parameters used in the specification and claims are approximations that may vary depending upon the desired properties of the individual embodiments. In some embodiments, the numerical parameters should take into account the specified significant digits and employ ordinary rounding. Notwithstanding that the numerical ranges and parameters setting forth the broad scope of some embodiments are approximations, in the specific examples such numerical values are set forth as precisely as practicable.
For each patent, patent application publication, and other material, such as articles, books, specifications, publications, documents, etc., cited in this specification, the entire contents are hereby incorporated by reference. Excepted are application history documents that are inconsistent with or conflict with the contents of this specification, and documents that would limit the broadest scope of the claims of this specification (currently or later appended). It is to be understood that if there is any inconsistency or conflict between the descriptions, definitions, and/or use of terms in the materials accompanying this specification and the statements herein, the descriptions, definitions, and/or use of terms in this specification shall control.
Finally, it should be understood that the embodiments described herein are merely illustrative of the principles of the embodiments described herein. Other variations are also possible within the scope of the present description. Thus, by way of example, and not limitation, alternative configurations of the embodiments of the present specification can be seen as consistent with the teachings of the present specification. Accordingly, the embodiments of the present description are not limited to only those explicitly described and depicted herein.

Claims (10)

1. A method for correcting an image, comprising:
acquiring a first image of a target object;
determining an image quality evaluation result of the first image through a preset algorithm;
and automatically starting an image correction algorithm to correct the first image in response to the image quality evaluation result of the first image not meeting a preset condition.
2. The method of claim 1, wherein the determining the image quality evaluation result of the first image using a preset algorithm comprises:
determining a target area image of the target object based on the first image; and
determining the image quality evaluation result of the first image using the preset algorithm based on the target area image.
3. The method of claim 2, wherein the determining a target area image of the target object based on the first image comprises:
determining the target area image of the target object using an image segmentation algorithm based on the first image.
4. The method of claim 1, wherein the determining the image quality evaluation result of the first image using a preset algorithm comprises:
determining the image quality evaluation result of the first image using a trained image quality evaluation model, wherein evaluation indexes of image quality include one or more of: sharpness of an anatomical structure, contrast of a target part, morphological conformity of the target part, degree of enhancement of the target part, uniformity of an image signal, level of image noise, and degree of artifact suppression.
5. The method of claim 1, wherein the target object is a coronary artery;
the first image is an image containing the coronary artery; and
the target area image is an image of the coronary artery.
6. The method of claim 5, wherein the image quality evaluation result of the first image comprises a composite image quality evaluation of morphological conformity and degree of enhancement of the coronary artery.
7. The method of claim 1, further comprising:
acquiring the corrected first image;
determining an image quality evaluation result of the corrected first image using the preset algorithm; and
in response to the image quality evaluation result of the corrected first image failing to meet the preset condition, sending a prompt to a user.
8. The method of claim 1, further comprising: determining a corresponding correction algorithm according to the image quality evaluation result of the first image and/or the preset algorithm corresponding to the image quality evaluation.
9. A system for correcting an image, comprising:
an acquisition module configured to acquire a first image of a target object;
a quality evaluation module configured to determine an image quality evaluation result of the first image using a preset algorithm; and
an image correction module configured to, in response to the image quality evaluation result of the first image failing to meet a preset condition, automatically initiate an image correction algorithm to correct the first image.
10. A computer-readable storage medium storing computer instructions which, when read by a computer, cause the computer to perform the method of any one of claims 1-8.
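As an illustrative, non-authoritative sketch, the control flow described in claims 1 and 7 can be expressed as follows. All function names and the threshold-style condition are hypothetical stand-ins: the claims leave the preset algorithm, the preset condition, and the correction algorithm unspecified, so they are passed in as callables.

```python
from typing import Any, Callable

def correct_image(
    first_image: Any,
    evaluate: Callable[[Any], float],          # the "preset algorithm" (claim 1)
    meets_condition: Callable[[float], bool],  # the "preset condition" (claim 1)
    correct: Callable[[Any], Any],             # automatically started correction (claim 1)
    notify_user: Callable[[str], None],        # prompt sent to the user (claim 7)
) -> Any:
    """Evaluate the first image; if the result fails the preset condition,
    automatically correct the image, re-evaluate it, and prompt the user
    if the corrected image still fails (claims 1 and 7)."""
    if meets_condition(evaluate(first_image)):
        return first_image                     # quality acceptable, no correction
    corrected = correct(first_image)           # claim 1: automatic correction
    if not meets_condition(evaluate(corrected)):
        notify_user("corrected image still fails the preset condition")  # claim 7
    return corrected

# Toy demonstration: the "image" is a single quality score in [0, 1],
# the preset condition is a 0.8 threshold, and correction adds 0.5.
if __name__ == "__main__":
    prompts: list[str] = []
    out = correct_image(
        0.5,
        evaluate=lambda img: img,
        meets_condition=lambda score: score >= 0.8,
        correct=lambda img: img + 0.5,
        notify_user=prompts.append,
    )
    print(out, prompts)  # corrected score and any prompts issued
```

In a real pipeline, `evaluate` would be a stand-in for the trained image quality evaluation model of claim 4, and `correct` could be chosen per claim 8 according to the evaluation result.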
CN202110602490.2A 2021-05-31 2021-05-31 Image correction method and system Pending CN115482183A (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
CN202110602490.2A CN115482183A (en) 2021-05-31 2021-05-31 Image correction method and system
PCT/CN2022/096256 WO2022253227A1 (en) 2021-05-31 2022-05-31 Systems and methods for image correction
EP22815271.6A EP4330915A1 (en) 2021-05-31 2022-05-31 Systems and methods for image correction
US18/523,960 US20240104705A1 (en) 2021-05-31 2023-11-30 Systems and methods for image correction

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110602490.2A CN115482183A (en) 2021-05-31 2021-05-31 Image correction method and system

Publications (1)

Publication Number Publication Date
CN115482183A true CN115482183A (en) 2022-12-16

Family

ID=84419185

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110602490.2A Pending CN115482183A (en) 2021-05-31 2021-05-31 Image correction method and system

Country Status (1)

Country Link
CN (1) CN115482183A (en)


Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117808718A (en) * 2024-02-29 2024-04-02 江西科技学院 Method and system for improving medical image data quality based on Internet
CN117808718B (en) * 2024-02-29 2024-05-24 江西科技学院 Method and system for improving medical image data quality based on Internet
CN118096735A (en) * 2024-04-23 2024-05-28 大连华璟科技有限公司 Livestock and veterinary vision detection system

Similar Documents

Publication Publication Date Title
CN112150574B (en) Method, system and device for automatically correcting image artifacts and storage medium
WO2021213519A1 (en) Image obtaining method and system, image quality determination method and system, and medical image acquisition method and system
JP6296942B2 (en) CT image evaluation apparatus and CT image evaluation method
JP2021039748A (en) Information processor, information processing method, information processing system, and program
CN108135551A (en) For the method and system of adaptive scanning control
US20170209105A1 (en) Methods and systems for automated tube current modulation
US20130066198A1 (en) Contrast Agent Perfusion Adaptive Imaging System
US11690950B2 (en) Methods and systems for timing a second contrast bolus
CN115482183A (en) Image correction method and system
CN111462112A (en) Method and system for judging image quality
US11341636B2 (en) Methods and systems for an adaptive five-zone perfusion scan
US11179127B2 (en) Methods and systems for a single-bolus angiography and perfusion scan
KR101946576B1 (en) Apparatus and method for processing medical image, and computer readable recording medium related to the method
US20240104705A1 (en) Systems and methods for image correction
CN113962949A (en) Brain medical image processing system, method and computer readable storage medium
US20200240934A1 (en) Tomography apparatus and controlling method for the same
US11452490B2 (en) Methods and systems for an adaptive perfusion scan
US11350896B2 (en) Methods and systems for an adaptive four-zone perfusion scan
US20230076352A1 (en) Systems and methods for image processing
US20230030595A1 (en) Methods and systems for selecting data processing models
EP4084010A1 (en) Method for operating an evaluation system for medical image data sets, evaluation system, computer program and electronically readable storage medium
US11564648B2 (en) Methods and systems for an adaptive multi-phase angiography scan
KR20190002960A (en) Method for task based CT reconstruction protocol optimization using body-recognition
CN115700740A (en) Medical image processing method, apparatus, computer device and storage medium
CN111281404A (en) Imaging method of X-ray imaging equipment and X-ray imaging equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination