CN111919264A - System and method for synchronizing an imaging system and an edge computing system - Google Patents

System and method for synchronizing an imaging system and an edge computing system

Info

Publication number
CN111919264A
CN111919264A
Authority
CN
China
Prior art keywords
imaging system
data
computing device
imaging
scan
Prior art date
Legal status
Pending
Application number
CN201980022365.5A
Other languages
Chinese (zh)
Inventor
David Eric Chevalier
Sandeep Dutta
Saad Sirohey
Raja Ramnarayan
Current Assignee
General Electric Co
Original Assignee
General Electric Co
Priority date
Filing date
Publication date
Application filed by General Electric Co filed Critical General Electric Co
Publication of CN111919264A

Classifications

    • G - PHYSICS
    • G16 - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H - HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H50/00 - ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H50/20 - ICT specially adapted for computer-aided diagnosis, e.g. based on medical expert systems
    • G16H30/00 - ICT specially adapted for the handling or processing of medical images
    • G16H30/20 - ICT specially adapted for handling medical images, e.g. DICOM, HL7 or PACS
    • G16H30/40 - ICT specially adapted for processing medical images, e.g. editing

Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Medical Informatics (AREA)
  • Public Health (AREA)
  • Epidemiology (AREA)
  • General Health & Medical Sciences (AREA)
  • Primary Health Care (AREA)
  • Radiology & Medical Imaging (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Biomedical Technology (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • Pathology (AREA)
  • Apparatus For Radiation Diagnosis (AREA)

Abstract

The present invention provides a method and system for synchronizing an imaging system with an Edge Computing System (ECS). In one embodiment, a system includes an imaging system including at least a scanner and a processor configured to reconstruct an image from data acquired during scanning of a subject via the scanner; and a computing device communicatively coupled to and positioned external to the imaging system, the computing device configured to generate a decision support calculation based on the data, wherein the imaging system transmits the data to the computing device during the scan. In this way, the processing power of the imaging system can be substantially extended. Furthermore, post-processing techniques may be performed in parallel with the scan, thereby reducing time for clinical diagnosis.

Description

System and method for synchronizing an imaging system and an edge computing system
Cross Reference to Related Applications
This application claims priority to U.S. provisional application No. 62/657,633, entitled "SYSTEMS AND METHODS FOR SYNCHRONIZATION OF IMAGING SYSTEMS AND AN EDGE COMPUTING SYSTEM," filed on April 13, 2018. The entire contents of the above application are hereby incorporated by reference for all purposes.
Technical Field
Embodiments of the subject matter disclosed herein relate to non-invasive diagnostic imaging and, more particularly, to expanding the processing power of an imaging system.
Background
Non-invasive imaging techniques allow images of the internal structure of a patient or subject to be obtained without the need to perform invasive procedures on the patient or subject. In particular, techniques such as Computed Tomography (CT) use various physical principles, such as differential transmission of X-rays through a target volume, to acquire image data and construct tomographic images (e.g., three-dimensional representations of the interior of a human body or other imaging structure).
New post-processing techniques can substantially improve the functionality of the imaging system and the accuracy of clinical diagnosis. For example, modern deep learning techniques may allow for accurate detection of lesions in tomographic images with lower image quality, thereby enabling a reduction in radiation dose (and thus potentially image quality) without sacrificing diagnostic effectiveness of the imaging system.
However, the processing power of the imaging system is limited and any overhead in processing power may not be sufficient to support the new post-processing techniques. Upgrading the imaging system to accommodate additional processing power or replacing the imaging system is cost prohibitive. Furthermore, even if such post-processing techniques can be performed by the imaging system, they may be performed after the imaging procedure, which will increase the time for clinical diagnosis.
Disclosure of Invention
In one embodiment, a system includes an imaging system including at least a scanner and a processor configured to reconstruct an image from data acquired during scanning of a subject via the scanner; and a computing device communicatively coupled to and positioned external to the imaging system, the computing device configured to generate a decision support calculation based on the data, wherein the imaging system transmits the data to the computing device during the scan. In this way, the processing power of the imaging system can be substantially extended. Furthermore, post-processing techniques may be performed in parallel with the scan, thereby reducing time for clinical diagnosis.
It should be appreciated that the brief description above is provided to introduce in simplified form a selection of concepts that are further described in the detailed description. It is not meant to identify key or essential features of the claimed subject matter, the scope of which is defined uniquely by the claims that follow the detailed description. Furthermore, the claimed subject matter is not limited to implementations that solve any disadvantages noted above or in any part of this disclosure.
Drawings
The invention will be better understood by reading the following description of non-limiting embodiments with reference to the attached drawings, in which:
FIG. 1 shows a block schematic diagram of a simplified exemplary system for extending the capabilities of an imaging system, according to one embodiment;
FIG. 2 depicts a high level flow chart that illustrates an exemplary method for synchronizing an imaging system with an edge computing system in accordance with one embodiment;
FIG. 3 depicts a high level flow chart that illustrates an exemplary method for generating deep learning training data with an imaging system in accordance with one embodiment;
FIG. 4 depicts a high level flow chart that illustrates an exemplary method for an imaging system express mode in accordance with one embodiment;
FIG. 5 depicts a high level flow chart that illustrates an exemplary method for managing applications on an edge computing system in accordance with one embodiment;
FIG. 6 shows a pictorial view of an imaging system according to one embodiment;
FIG. 7 shows a block schematic diagram of an exemplary imaging system according to one embodiment;
FIG. 8 shows a block schematic diagram of an exemplary system for extending the capabilities of an imaging system in greater detail than FIG. 1, according to one embodiment; and is
FIG. 9 illustrates a flow diagram showing an exemplary method for generating a decision support output for an imaging system using an edge computing system, according to one embodiment.
Detailed Description
The following description relates to various embodiments of diagnostic imaging. Specifically, systems and methods for synchronizing at least one imaging system with an Edge Computing System (ECS) are provided. The processing power of the imaging system may be extended by coupling the imaging system to the ECS, as shown in fig. 1. Imaging data acquired by the imaging system may be transmitted or streamed to the ECS during the scan. Methods for synchronizing imaging data transmission to an ECS, such as the method depicted in fig. 2, allow imaging data to be processed by a Deep Learning (DL) application while scanning, thereby reducing the amount of time to obtain decision support. The DL application may be trained using additional information related to acquiring imaging data as well as information characterizing the scanned subject, as depicted in fig. 3. When coupled with the ECS, the extended processing capabilities of the imaging system allow the imaging system to operate in an "express mode" in which scanning is performed with minimal intervention by or input from an operator of the imaging system, as depicted in fig. 4. New and updated DL applications may be retrieved from a remote repository, as depicted in fig. 5, allowing the ECS to provide up-to-date DL capabilities compatible with the imaging system. An example of a CT imaging system that may be used to acquire images processed in accordance with the present techniques is provided in figs. 6 and 7.
The ECS is connected to an imaging system/scanner. The ECS includes a CPU/GPU running one or more Virtual Machines (VMs) configured for different types of tasks. Data is streamed from the scanner in real time to the ECS, which processes the data (in image and/or projection space) and returns the results. In this way, the imaging system appears to have additional processing power, as the post-processing performed by the ECS is output through the user interface of the imaging system along with the reconstructed image.
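As a rough sketch of this arrangement, the following Python fragment models an ECS that consumes streamed data chunks on a background worker and returns its post-processing results when the scan ends. The class, method names, and queue-based design are illustrative assumptions, not interfaces described in the patent.

```python
import queue
import threading

class EdgeComputingSystem:
    """Minimal model of an ECS: receives imaging data streamed from a
    scanner and applies a post-processing function to each chunk.
    All names here are illustrative, not from the patent."""

    def __init__(self, postprocess):
        self._inbox = queue.Queue()
        self._results = []
        self._postprocess = postprocess  # e.g. a DL inference function
        self._worker = threading.Thread(target=self._run, daemon=True)
        self._worker.start()

    def receive(self, data_chunk):
        # Called by the imaging system as data is acquired during the scan.
        self._inbox.put(data_chunk)

    def _run(self):
        while True:
            chunk = self._inbox.get()
            if chunk is None:            # sentinel: scan complete
                break
            self._results.append(self._postprocess(chunk))

    def finish(self):
        # Signal end of scan and collect the decision-support results.
        self._inbox.put(None)
        self._worker.join()
        return self._results
```

Because the worker drains a FIFO queue, results come back in acquisition order, which matches the idea that post-processing output is presented alongside the reconstructed images.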
The data streaming to the ECS is synchronized with the status of the scanner. Data is transferred from the scanner to the ECS only when the scanner is not in a critical state. When in the interventional mode (e.g., when a physician performs an intervention at the scanner, such as a contrast agent injection), the scanner does not transmit data at all to avoid data corruption.
The ECS provides task-based decision support. A specific task input to the imaging system triggers a secondary task that is input to and executed by the ECS. For example, a task may specify a particular scanning protocol and/or type of image reconstruction to be performed by the scanner, while a secondary task may specify the application of relevant post-processing techniques to the acquired data. These post-processing techniques may include deep learning analysis of the acquired data. As depicted in the method of fig. 9, the ECS may select an appropriate DL application based on the secondary task, the exam type, and/or other information, and generate decision support with the selected DL application. Each instance of decision support generated at the ECS can be saved with associated data (e.g., the DL application used, the type of exam, and the final/edited exam report). After a threshold amount of decision support has been generated and saved on the ECS, one or more of the DL applications may be retrained with the new data, both locally (e.g., the DL applications stored on the ECS) and globally (e.g., updated model weights may be sent to a central server along with data from other locations to create an updated global model).
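The primary-to-secondary task coupling could be modeled as a pair of lookup tables, as in this hypothetical sketch; the task and application names are invented for illustration and do not come from the patent.

```python
# Hypothetical mapping from a primary (scan) task to the secondary
# (post-processing) task it triggers, and from the secondary task to a
# DL application on the ECS.
SECONDARY_TASKS = {
    "non_contrast_head_scan": "hemorrhage_detection",
    "multiphase_liver_scan": "lesion_detection",
    "full_spine_scan": "fracture_assessment",
}

DL_APPLICATIONS = {
    "hemorrhage_detection": "dl_head_bleed_v2",
    "lesion_detection": "dl_liver_lesion_v1",
    "fracture_assessment": "dl_spine_fracture_v1",
}

def select_dl_application(primary_task):
    """Return (secondary_task, dl_app) for a primary task, or
    (None, None) when the task has no associated post-processing."""
    secondary = SECONDARY_TASKS.get(primary_task)
    if secondary is None:
        return None, None
    return secondary, DL_APPLICATIONS[secondary]
```

In practice the selection would also weigh the exam type and other metadata, as the text notes; the table lookup only captures the basic trigger relationship.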
FIG. 1 shows a block schematic diagram of an exemplary system 100 for extending the capabilities of an imaging system 101 with an Edge Computing System (ECS) 110, according to one embodiment. The imaging system 101 may include any suitable non-invasive imaging system, including, but not limited to, a Computed Tomography (CT) imaging system, a Positron Emission Tomography (PET) imaging system, a Single Photon Emission Computed Tomography (SPECT) imaging system, a Magnetic Resonance (MR) imaging system, an X-ray imaging system, an ultrasound system, and combinations thereof (e.g., a multi-modality imaging system, such as a PET/CT, PET/MR, or SPECT/CT imaging system). Exemplary imaging systems are further described herein with respect to fig. 6 and 7.
The imaging system 101 includes a processor 103 and a non-transitory memory 104. One or more of the methods described herein may be implemented as executable instructions in the non-transitory memory 104 that, when executed by the processor 103, cause the processor 103 to perform various acts. Such methods are further described herein with respect to fig. 2-4.
The imaging system 101 also includes a scanner 105 for scanning a subject (such as a patient) to acquire imaging data. Depending on the type of imaging system 101, the scanner 105 may include a number of components necessary to scan a subject. For example, if the imaging system 101 includes a CT imaging system, the scanner 105 may include a CT tube and a detector array, as well as various components for controlling the CT tube and the detector array, as discussed further herein with respect to fig. 6 and 7. As another example, if the imaging system 101 comprises an ultrasound imaging system, the scanner 105 may comprise an ultrasound transducer. Thus, the term "scanner" as used herein refers to a component of an imaging system that is used and controlled to perform a scan of a subject.
The type of imaging data acquired by the scanner 105 also depends on the type of imaging system 101. For example, if the imaging system 101 comprises a CT imaging system, the imaging data acquired by the scanner 105 may comprise projection data. Similarly, if the imaging system 101 comprises an ultrasound imaging system, the imaging data acquired by the scanner 105 may comprise analog echoes and/or digital echoes of ultrasound waves emitted by the ultrasound transducer into the subject.
In some examples, the imaging system 101 includes a protocol engine 106 for automatically selecting and adjusting a scan protocol for scanning a subject. The scan protocol selected by the protocol engine 106 specifies a variety of settings for controlling the scanner 105 during a scan of a subject. As discussed further herein, the protocol engine 106 may select or determine a scanning protocol based on the indicated primary task. Although the protocol engine 106 is depicted as a separate component from the non-transitory memory 104, it should be understood that in some examples, the protocol engine 106 may include a software module stored in the non-transitory memory 104 as executable instructions that, when executed by the processor 103, cause the processor 103 to select and adjust a scanning protocol.
The imaging system 101 also includes a user interface 107 configured to receive input from and display information to an operator of the imaging system 101. To this end, the user interface 107 may include one or more of an input device (including but not limited to a keyboard, mouse, touch screen device, microphone, etc.) and an output device (including but not limited to a display device, printer, etc.).
In some examples, the imaging system 101 also includes a camera 108 for assisting in the automatic positioning of the subject within the imaging system. For example, the camera 108 may capture real-time images of a subject within the imaging system, and the processor 103 determines the position of the subject within the imaging system based on the real-time images. The processor 103 may then control the table motor controller, for example, to adjust the position of the table against which the subject is resting in order to position at least one region of interest (ROI) of the subject within the imaging system. Further, in some examples, the scan range may be determined based at least approximately on real-time images captured by camera 108.
The system 100 also includes an ECS 110 communicatively coupled to the imaging system 101 via a wired or wireless connection. ECS 110 includes a plurality of processors 113 that run one or more Virtual Machines (VMs) 114 configured for different types of tasks. The plurality of processors 113 includes one or more Graphics Processing Units (GPUs) and/or one or more Central Processing Units (CPUs). The ECS 110 also includes a non-transitory memory 115 that stores executable instructions that are executable by one or more of the plurality of processors 113. The non-transitory memory 115 may further include a Deep Learning (DL) model 116 that may be executed by the virtual machines 114 of the plurality of processors 113. Although only one DL model is depicted in fig. 1, it should be understood that ECS 110 may include more than one DL model stored in non-transitory memory.
In some examples, system 100 also includes one or more external databases 130 to which imaging system 101 and ECS 110 may be communicatively coupled via network 120. As an illustrative and non-limiting example, the one or more external databases 130 may include one or more of a Hospital Information System (HIS), a radiology department information system (RIS), a Picture Archiving and Communication System (PACS), and an Electronic Medical Records (EMR) system. The imaging system 101 and/or the ECS 110 can retrieve information, such as subject metadata, which can include metadata describing or relating to the particular subject to be scanned (e.g., the patient's age, sex, height, and weight), which can be retrieved from the EMR for the subject. As further described herein, the imaging system 101 and/or ECS 110 can use subject metadata retrieved from one or more external databases 130 to automatically determine scan protocols, train deep learning models, and the like.
In some examples, the system 100 further includes a repository 140 communicatively coupled to one or more of the imaging system 101 and the ECS 110 via the network 120. The repository 140 stores a plurality of applications 142 that may be utilized by one or more of the imaging system 101 and the ECS 110. To this end, the imaging system 101 and/or the ECS 110 may retrieve one or more of the plurality of applications 142 from the repository 140 via the network 120. Alternatively, the repository 140 may push one of the plurality of applications 142 to one or more of the imaging system 101 and the ECS 110. The method for retrieving an application from repository 140 is further described herein with respect to fig. 5.
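As a rough illustration of the retrieval step, the sketch below compares locally installed application versions against the repository and pulls anything newer. The data structures and version-tuple convention are assumptions made for the example, not details from the patent.

```python
def update_applications(installed, repository):
    """Pull any application whose repository version is newer than the
    locally installed one. Both arguments map an application name to a
    version tuple, e.g. (1, 2); these structures are illustrative."""
    updates = {}
    for name, repo_version in repository.items():
        # Missing apps default to version (0,), so new apps are pulled too.
        if installed.get(name, (0,)) < repo_version:
            updates[name] = repo_version
    installed.update(updates)
    return updates
```

The same comparison would work whether the imaging system polls the repository 140 or the repository pushes a notification, since either way the decision reduces to "is the remote version newer than mine."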
FIG. 2 illustrates a high-level flow diagram showing an exemplary method 200 for synchronizing an imaging system with an Edge Computing System (ECS), according to one embodiment. In particular, the method 200 involves streaming imaging data acquired from an imaging system (such as imaging system 101) during a scan to an ECS (such as ECS 110) for processing the imaging data while scanning. The method 200 is described with respect to the systems and components depicted in fig. 1 and described below, but it should be understood that the method may be implemented with other systems and components without departing from the scope of the present disclosure. The method 200 may be implemented as executable instructions in the non-transitory memory 104 of the imaging system 101 that are executable by the processor 103 of the imaging system 101.
Method 200 begins at 205. At 205, the method 200 receives an indication of an initial task, which is also referred to herein as a primary task. The initial task comprises a clinical task to be performed by the imaging system and thus typically specifies what type of scan should be performed by the imaging system. The initial task is typically a clinical action that must be completed during the pre-scan and scan phases of the imaging prescription. Examples of pre-scan tasks include patient setup; receiving and viewing data from the EMR, a clinical decision support system (CDSS), and the HIS/RIS; and selecting a protocol for the scan. Examples of scan tasks include patient positioning, image acquisition, and image reconstruction. In one example, the method 200 may receive an indication of an initial task, e.g., via the user interface 107 of the imaging system 101. That is, an operator of the imaging system 101 may select or otherwise indicate an initial task via the user interface 107. In another example, the method 200 may automatically identify the initial task, such as by evaluating the EMR of the subject to be scanned.
For example, the initial task may include or be defined by a diagnostic goal of the imaging scan (e.g., referencing the patient to the reason for the imaging scan). Exemplary diagnostic goals may include diagnosing the presence (or absence) of cerebral hemorrhage, diagnosing the presence (or absence) of a liver lesion, determining the presence or extent of a spinal fracture, and performing organ segmentation to plan for radiation therapy. Each diagnostic target may be specific to a particular anatomy or set of anatomies, and thus may dictate the type of scan/examination to be performed. For example, a diagnosis of cerebral hemorrhage may be for the brain and may therefore prescribe performing a non-contrast head scan, a diagnosis of liver lesions may be for the liver and may therefore prescribe performing an abdominal scan, and in particular, a multi-phasic liver study with contrast at different timings, a diagnosis of spinal fractures may be for the neck and/or back and may therefore prescribe performing a complete spinal non-contrast scan, and organ segmentation may be for the entire abdominal region (or even the whole body) and may therefore prescribe performing a non-contrast whole body scan. Each different scan type may have an associated set of scan parameters that specify how the scanner is to be controlled during the scan. For example, for computed tomography scans, each different scan type may specify the specific tube current (mA), tube voltage (kV), and gantry rotational speed of the CT scanner during the scan.
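The association between a scan type and its scan-parameter set could be expressed as a simple lookup, as in this sketch. The numeric values are placeholders chosen for illustration only; they are not clinical settings and do not come from the patent.

```python
# Illustrative scan-parameter sets keyed by scan type, following the
# examples in the text. Values are placeholders, not clinical settings.
SCAN_PARAMETERS = {
    "non_contrast_head": {"tube_current_mA": 250, "tube_voltage_kV": 120,
                          "rotation_time_s": 1.0},
    "multiphase_liver":  {"tube_current_mA": 300, "tube_voltage_kV": 100,
                          "rotation_time_s": 0.5, "contrast": True},
    "full_spine":        {"tube_current_mA": 200, "tube_voltage_kV": 120,
                          "rotation_time_s": 0.8},
}

def protocol_for(scan_type):
    """Look up the parameter set a protocol engine might start from."""
    try:
        return SCAN_PARAMETERS[scan_type]
    except KeyError:
        raise ValueError(f"no protocol defined for scan type {scan_type!r}")
```

A protocol engine such as element 106 would presumably start from such a baseline and then adjust parameters per patient (e.g., using the subject metadata discussed later), rather than use fixed values.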
Continuing at 210, method 200 determines whether the initial task is associated with a secondary task. The initial task specifies how imaging data may be acquired, while the secondary task specifies how the imaging data may be processed. By way of non-limiting example, the secondary tasks may include automatic lesion detection, organ classification, segmentation, or any type of post-processing method applicable to the imaging data. A secondary task is typically a clinical action that must be completed during the post-scan phase of the imaging prescription. An example of a post-scan task is post-processing of the images by applying post-processing applications.
In some examples, a particular initial task may always specify one or more secondary tasks to be performed in conjunction with the initial task or the primary task. Additionally or alternatively, the initial task indication received at 205 may specify one or more secondary tasks. For example, the initial task may not specify the secondary task, but an operator of the imaging system may select the secondary task when indicating the initial task. As a non-limiting example, the initial task may specify a brain scan, and the secondary task may include lesion detection. Since lesion detection may not be necessary for each brain scan, an operator of the imaging system may optionally select a secondary task comprising lesion detection for the initial task of the brain scan for a particular situation in which the presence of a lesion may be suspected. Additionally or as an alternative to the operator manually selecting the secondary task, in some examples, the method 200 may automatically determine the secondary task based on the initial task and the subject metadata (such as the subject's EMR).
If the initial task is not associated with a secondary task ("NO"), the method 200 proceeds to 212. At 212, the method 200 performs a scan according to the initial task instructions. The method 200 may further reconstruct and output one or more images from imaging data acquired during the scan. Since no secondary tasks are associated with the initial task, the imaging data acquired at 212 is not streamed to the ECS. The method 200 then ends.
However, referring again to 210, if the initial task is associated with a secondary task ("yes"), the method 200 continues to 215. At 215, the method 200 outputs the secondary task indication to the ECS 110. The ECS 110 performs post-processing of the imaging data based on the secondary task instructions.
At 220, the method 200 begins scanning the subject with the scanner 105 to acquire imaging data according to the initial mission instructions. For example, the method 200 begins scanning the subject according to a scan protocol associated with the initial task. As a non-limiting example, the scanning protocol may be selected by the protocol engine 106.
During the scan beginning at 220, the method 200 synchronizes the imaging system 101 to the ECS 110 based on the state of the scanner 105. To do so, at 225, the method 200 evaluates the status of the scanner 105. At 230, the method 200 determines whether the scanner 105 is in a critical state. The scanner 105 may be in a critical state when asynchronous external operations may negatively impact its ability to meet basic safety requirements. For example, the scanner 105 may be in a critical state when active data acquisition is being performed, such as when the X-ray source is on and image data is being stored to a disk or memory storage location. If the imaging system is synchronized with the ECS during such times (so that imaging data can be sent from the imaging system to the ECS), scanning failures can result (e.g., due to data loss, delays, or other problems) and the patient will need to be rescanned, exposing the patient to additional doses of radiation. Primarily, CT scanners are in a critical state when the X-ray source is on or during a fast acquisition sequence. Another example of a critical state is when the system performs an interventional procedure in which both the scan data storage and image display operations are time sensitive and essential for safety.
If the scanner is in the critical state ("YES"), the method 200 proceeds to 232, where the method 200 continues scanning. More specifically, the method 200 continues to scan the subject, but does not transmit the acquired imaging data to the ECS 110.
However, referring again to 230, if the scanner is not in a critical state ("no"), the method 200 continues to 233, where the method 200 transmits the acquired imaging data to the ECS 110 for post-processing. The ECS 110 processes the transmitted imaging data according to the secondary task.
From both 232 and 233, method 200 proceeds to 235. At 235, the method 200 determines whether the scan is complete. If the scan is not complete ("NO"), the method 200 proceeds back to 237, where the method 200 continues the scan. Method 200 proceeds to 225 to re-evaluate the scanner status. Thus, the method 200 continuously evaluates the state of the scanner 105 during a scan to determine whether the scanner 105 is in a critical state, and streams or transmits imaging data acquired during the scan to the ECS 110 only when the scanner 105 is not in the critical state. In other words, unless the scanner 105 is in a critical state, the method 200 continuously streams imaging data to the ECS 110 as it is acquired during a scan. In this manner, the ECS 110 can receive and process imaging data while scanning. Furthermore, when the scanner 105 is operating in a critical state, the imaging data is not damaged by transmission during the critical state, and the operation of the imaging system 101 is not disturbed.
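The loop through steps 225-237 could be sketched as follows. One point the text leaves open is what happens to data acquired while the scanner is in a critical state; the sketch adopts one plausible reading, in which such chunks are buffered locally and flushed once the critical state ends, so the ECS eventually sees all of the data. The `scanner` and `ecs` interfaces are illustrative assumptions.

```python
def run_scan(scanner, ecs):
    """Acquire imaging data and stream it to the ECS, withholding
    transmission while the scanner is in a critical state.
    The interfaces used here are hypothetical."""
    pending = []                        # chunks held back during critical states
    while not scanner.scan_complete():
        chunk = scanner.acquire_chunk()
        if scanner.in_critical_state():
            pending.append(chunk)       # buffer locally; do not transmit
        else:
            for buffered in pending:    # flush anything held back earlier
                ecs.receive(buffered)
            pending.clear()
            ecs.receive(chunk)
    for buffered in pending:            # flush any trailing buffered data
        ecs.receive(buffered)
```

The key property is that `ecs.receive` is never called while `in_critical_state()` is true, so transmission cannot interfere with time-sensitive acquisition or interventional operations.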
Once the method 200 determines at 235 that the scan is complete ("yes"), the method 200 proceeds to 240. At 240, the method 200 reconstructs one or more images from the acquired imaging data. By way of non-limiting example, the method 200 may reconstruct one or more images using any suitable iterative or analytical image reconstruction algorithm.
At 245, the method 200 receives decision support from the ECS 110. The decision support includes the results of one or more processing algorithms applied to the imaging data that is streamed to the ECS 110 during the scan at 233. For example, if the examination is a cerebral hemorrhage examination performing a head scan, the decision support output may include an indication of whether bleeding is detected. If the examination is a liver lesion examination scanning the liver, the decision support output may comprise an indication of whether a lesion is detected, and if a lesion is detected, the decision support output may comprise an indication of the size, shape, location, etc. of the lesion.
At 250, the method 200 outputs one or more images and decision support. For example, the method 200 may output the one or more images and the decision support to a display device for display to an operator of the imaging system 101. Additionally or alternatively, for example, the method 200 may output one or more images and decision support to a mass storage device for later viewing, or to a Picture Archiving and Communication System (PACS) for viewing at a remote workstation. The method 200 then ends.
Accordingly, a method for an imaging system includes performing a scan of a subject to acquire imaging data, transmitting the imaging data during the scan to a computing device communicatively coupled to and positioned external to the imaging system, receiving a decision support output from the computing device, the decision support output calculated by the computing device from the imaging data, and displaying an image reconstructed from the imaging data and the decision support output. In this manner, deep learning techniques may be used to provide decision support in parallel with the acquisition of imaging data, thereby reducing the time required to make an informed diagnosis.
Automated analysis of clinical images using Deep Learning (DL) techniques can greatly simplify and improve clinical diagnosis by physicians using such images. However, preparing a data set to train the DL algorithm can be difficult and time consuming because the images and corresponding diagnoses must be manually collated and prepared for input into the algorithm. Furthermore, such training data sets are unnecessarily limited with respect to the amount of potentially available information that is relevant to the scan and that can be utilized by the DL algorithm. Thus, as described below with respect to fig. 3, the imaging system 101 outputs data suitable for input to a DL algorithm (such as the DL application 116 in the ECS 110). The output data includes scan data, image data, EMR data of the patient, scanner type (and other scanner metadata), scan protocol, decision support output, and any clinical diagnosis associated with the scan. To this end, the imaging system may retrieve data from a Hospital Information System (HIS) and/or Radiology Information System (RIS) and the EMR of a given patient (to obtain patient data as well as clinical results). The output data can be used by DL algorithms to optimize/personalize the scanning protocol for different patients and image quality goals, improve decision support, and potentially automate clinical diagnosis.
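The data sources enumerated above could be bundled into a single training record, as in this sketch. The field names and record structure are assumptions made for illustration; the patent does not prescribe a format.

```python
def build_training_record(scan_data, image_data, patient_emr,
                          scanner_metadata, scan_protocol,
                          decision_support, clinical_diagnosis):
    """Bundle the data sources listed in the text into one record
    suitable for DL training. Field names are illustrative."""
    return {
        "scan_data": scan_data,
        "image_data": image_data,
        "emr": patient_emr,              # e.g. age, sex, height, weight
        "scanner": scanner_metadata,     # scanner type and other metadata
        "protocol": scan_protocol,
        "decision_support": decision_support,
        "diagnosis": clinical_diagnosis, # final/edited exam report
    }
```

Collecting these fields automatically at scan time is what removes the manual collation step the paragraph identifies as the bottleneck in preparing DL training sets.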
Fig. 3 illustrates a high-level flow diagram showing an exemplary method 300 for generating deep learning training data with an imaging system, according to one embodiment. In particular, the method 300 involves training a deep learning model with all information and analysis thereof that may potentially be relevant to the quality of the reconstructed image. The method 300 is described with respect to the systems and components of fig. 1, but it should be understood that the method may be implemented with other systems and components without departing from the scope of the present disclosure. The method 300 may be implemented as executable instructions in the non-transitory memory 104 of the imaging system 101 and executed by the processor 103 of the imaging system 101.
The method 300 begins at 305. At 305, the method 300 receives a selection of a scan protocol. The scanning protocol may be manually selected, e.g., by an operator via the user interface 107, or may be automatically selected, e.g., via the protocol engine 106.
At 310, the method 300 retrieves subject metadata for a subject to be scanned from one or more external databases. The subject metadata may include at least a subset of information related to the subject, and thus may include, but is not limited to, demographic information and medical history. The method 300 may retrieve subject metadata from one or more databases 130, including one or more of a HIS, RIS, and EMR database, e.g., via the network 120.
At 315, the method 300 performs a scan of the subject according to a scan protocol to acquire imaging data. For example, the method 300 may control the scanner 105 to scan a subject, where the scan protocol selected in 305 dictates the control parameters of the scanner 105. At 320, the method 300 reconstructs an image from the imaging data acquired during the scan at 315. The method 300 may reconstruct the image using any suitable image reconstruction algorithm according to the modality of the imaging system.
At 325, the method 300 transmits the image reconstructed at 320 and/or the imaging data acquired at 315 to the ECS 110. The ECS 110 processes the images and/or imaging data using one or more DL algorithms to generate a decision support output. While transmitting imaging data is depicted in fig. 3 as occurring after a scan, it should be understood that in some examples, the method 300 may transmit imaging data during the scan at 315 such that the ECS 110 processes the imaging data while the scan at 315 is ongoing, as discussed above with respect to fig. 2.
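The in-scan transmission alternative can be sketched as sending each view of data as soon as it is acquired, rather than as a post-scan batch upload. The per-view acquisition function and the transport (a plain callback standing in for a network send) are assumptions for illustration.

```python
# Minimal sketch of transmitting imaging data while the scan is still in
# progress, so the ECS can begin processing in parallel with acquisition.

def acquire_views(n_views):
    """Stand-in for the scanner producing one view of data at a time."""
    for i in range(n_views):
        yield {"view": i, "data": [i, i + 1]}

def scan_and_stream(n_views, send):
    """Transmit each view to the ECS as soon as it is acquired."""
    acquired = []
    for view in acquire_views(n_views):
        acquired.append(view)
        send(view)   # in-scan transmission, not a post-scan upload
    return acquired

sent = []
views = scan_and_stream(3, sent.append)
```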
The ECS 110 processes the image and/or imaging data during the scan or after the scan is completed to generate a decision support output. At 330, the method 300 receives a decision support output calculated for the image and/or imaging data from the ECS 110 using a learning model (e.g., a DL algorithm, such as a neural network).
At 335, the method 300 displays the image and the decision support output. Both the image and the decision support output may be displayed via a display device of the imaging system 101. Depending on the type of decision support output, the decision support output may be superimposed on the image or displayed side-by-side with the image.
At 340, the method 300 receives a resulting decision related to the displayed image and the decision support output. The resulting decision comprises a clinical diagnosis made, for example, by a physician or radiologist based on the displayed image and decision support output. Additionally or alternatively, the resulting decision may include ground truth for the decision support output. For example, if the decision support output includes detection or classification of an object (such as a lesion) in the image, the ground truth for the decision support output may include an indication that the detection or classification is correct or incorrect.
At 345, the method 300 updates the training data set with the imaging data, images, scan protocols, subject metadata, decision support output, outcome decisions, and system metadata. Since the training data set may be located remotely from the imaging system 101 rather than in the non-transitory memory 104 of the imaging system, in some examples, updating the training data set may include aggregating the imaging data, images, scanning protocols, subject metadata, decision support output, result decisions, and system metadata into a single instance to be added to the training data set. The system metadata includes information characterizing the imaging system 101 itself, such as the model of the imaging system 101, the date of manufacture of the imaging system 101, and the like.
At 350, the method 300 transmits the updated training data set to the ECS 110 for updating the learning model used to generate the decision support output received at 330. In this manner, additional data that may affect the performance of the learning model, such as subject metadata, system metadata, scanning protocols, and clinical diagnoses made by physicians based on reconstructed images and/or decision support outputs, may be utilized to improve the performance of the learning model. The method 300 then ends.
Accordingly, a method for an imaging system includes performing a scan of a subject according to a scan protocol to acquire imaging data; displaying an image and decision support associated with the image, the image reconstructed from the imaging data, and the decision support calculated using the learning model and the imaging data; and updating a training data set for the learning model with the imaging data, the images, the scan protocol, subject metadata describing the subject, decision support, outcome decisions related to the images and the decision support, and system metadata related to the imaging system.
As the functions of imaging systems become more powerful and complex, users may find the operation of the imaging systems too complex. Radiologists and technologists must be trained to select and prepare the correct scanning protocol for each patient, and patients may also be positioned differently within the imaging system for a given protocol. If any part of the scan preparation is incorrect, the quality of the resulting image or images may be too poor for clinical use, and so the scan must be performed again. Furthermore, image analysis must be performed by the radiologist or physician, so the turnaround time of the clinical report may be slow. As described further below with respect to fig. 4, the imaging system 101 may thus include a "shortcut mode" that automates as many portions of the imaging process as possible using the extended processing power provided by the ECS 110 and the more complex DL applications 116. Hospital information system, radiology information system, and electronic medical record data are used by the protocol engine 106 to select and personalize a scanning protocol for a given patient, which in some examples may be aided by deep learning. The patient is automatically positioned based on the imaging protocol and with the assistance of the camera. The camera further enables determination of the correct scanning range for the patient so that the scanning protocol can be further adjusted and personalized for the patient. The scanning and post-processing/decision support are performed automatically by the imaging system 101 and the ECS 110. The DL applications 116 executed by the plurality of processors 113 of the ECS 110 provide automated processing of the acquired data and at least a draft evaluation of the reconstructed image.
FIG. 4 illustrates a high-level flow diagram showing an exemplary method 400 for imaging system shortcut mode in accordance with one embodiment. In particular, the method 400 involves controlling the imaging system to scan a subject with minimal operator intervention or input. The method 400 is described with respect to the systems and components of fig. 1, but it should be understood that the method may be implemented with other systems and components without departing from the scope of the present disclosure. The method 400 may be implemented as executable instructions in the non-transitory memory 104 of the imaging system 101 and the non-transitory memory 115 of the ECS 110 and may be executed by the processor 103 of the imaging system 101 and the plurality of processors 113 of the ECS 110.
The method 400 begins at 405. At 405, the method 400 receives an indication of a shortcut mode for scanning. The method 400 may receive an indication of a shortcut mode for scanning, for example, via the user interface 107 of the imaging system 101. In some examples, the user interface 107 may include a dedicated button, switch, or another mechanism for indicating a shortcut mode in which use of the imaging system 101 is desired.
At 410, the method 400 receives an identification of a subject to be scanned. In some examples, the method 400 may receive, via the user interface 107, an identification of a subject to be scanned. For example, an operator of the imaging system 101 may manually input an identification of the subject to be scanned. In other examples, the method 400 may automatically identify the subject to be scanned. As an illustrative and non-limiting example, the method 400 may obtain an image of the face of a subject via the camera 108, and may employ facial recognition techniques to identify the subject based on the image.
At 415, the method 400 retrieves the EMR of the subject identified at 410, for example, by accessing the EMRs in the one or more databases 130 via the network 120. At 420, the method 400 determines an initial task or primary task based on the EMR. As discussed above, the primary task may indicate the clinical context of the scan, and thus may indicate what type of scan should be performed. The method 400 may determine a primary task from the EMR, which may include a prescription of a physician for a particular type of scan.
At 425, the method 400 determines a scan protocol for the primary task. For example, the method 400 may input the primary task to the protocol engine 106 of the imaging system 101 to determine the scan protocol. In other examples, the primary task may be associated with a particular scanning protocol. Further, the scan protocol may be determined or adjusted based on the subject. For example, the scan protocol may prescribe more or less radiation doses depending on the age and size of the subject.
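The age- and size-based protocol adjustment described at 425 can be sketched as scaling a dose-related parameter of a base protocol. The scaling rule, thresholds, and parameter name below are invented for illustration; real systems use validated dose-modulation techniques, not this toy rule.

```python
# Hedged sketch of personalizing a scan protocol for the subject.
# The dose-scaling rule and thresholds are illustrative assumptions.

def personalize_protocol(base_protocol, age_years, weight_kg):
    """Return a copy of the protocol with dose adjusted for the subject."""
    protocol = dict(base_protocol)
    factor = 1.0
    if age_years < 12:       # pediatric subject: reduce dose
        factor *= 0.5
    if weight_kg > 100:      # larger habitus: more tube current needed
        factor *= 1.3
    protocol["tube_current_mA"] = round(
        base_protocol["tube_current_mA"] * factor)
    return protocol

adult = personalize_protocol({"tube_current_mA": 200},
                             age_years=45, weight_kg=80)
child = personalize_protocol({"tube_current_mA": 200},
                             age_years=8, weight_kg=30)
```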
At 430, the method 400 positions the subject within the imaging system. In some examples, the method 400 may determine the position of the subject on the movable table relative to the imaging system, for example, by processing real-time images of the subject captured by the camera 108. The method 400 may control a table motor controller to adjust the position of the table and, thus, the subject such that the region of interest to be scanned is within an imaging region of the imaging system (e.g., positioned within a gantry between the radiation source and the radiation detector).
At 435, the method 400 determines a scan range for the subject. The method 400 may determine a scan range for the subject based on real-time images captured by the camera 108. For example, different subjects have different body sizes, and therefore the scan range should be adjusted accordingly. Thus, the method 400 may evaluate images captured by the camera 108 to determine the size and scale of the subject, and may then set an appropriate scan range over which the ROI of the subject is to be scanned. At 440, the method 400 adjusts the scan protocol using the scan range determined at 435.
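The scan-range determination at 435 can be sketched by reducing the camera's view of the subject to a bounding box of the region of interest in table coordinates and adding a margin. The coordinate mapping and margin value are assumptions for illustration, not the patent's method.

```python
# Illustrative sketch: deriving a scan range from a camera-detected
# bounding box of the subject's region of interest (in mm along the
# table axis). The margin is a hypothetical safety allowance.

def scan_range_from_bbox(roi_top_mm, roi_bottom_mm, margin_mm=20):
    """Return (start, end) table positions covering the ROI plus margin."""
    start = roi_top_mm - margin_mm
    end = roi_bottom_mm + margin_mm
    return start, end

start, end = scan_range_from_bbox(300, 650)
```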
At 445, the method 400 performs a scan of the subject according to the adjusted scan protocol to acquire imaging data. For example, the method 400 controls the scanner 105 to scan the subject according to the adjusted scan protocol. At 450, the method 400 outputs the imaging data to the ECS 110. As discussed above with respect to fig. 2, in some examples, imaging data may be output to the ECS 110 for processing during a scan. The ECS 110 processes the imaging data using a learning model to generate a decision support output. At 455, the method 400 receives the decision support output from the ECS 110.
Continuing at 460, method 400 reconstructs an image from the imaging data acquired at 445. The method 400 may reconstruct an image from the imaging data using any suitable image reconstruction technique. Although fig. 4 depicts the reconstruction of the image as occurring after the imaging data is output to the ECS 110, it should be understood that in some examples, the reconstructed image may be transmitted to the ECS 110 and the decision support output received at 455 may be generated by the ECS 110 based on the reconstructed image rather than the original imaging data. At 465, the method 400 outputs the image and decision support output to one or more of a display device for display, a mass storage device for subsequent retrieval and viewing, and a PACS. The method 400 then ends.
Accordingly, a method for an imaging system includes receiving an identification of a subject to be scanned, automatically determining a personalized scan protocol for the subject, automatically performing a scan of the subject according to the personalized scan protocol to acquire imaging data, and displaying an image and decision support, the image and decision support being automatically generated from the imaging data.
As mentioned above, post-processing techniques for imaging systems regularly improve over time. However, once an imaging system is installed, it is difficult to update the imaging system to include improved algorithms and new functionality. As discussed further herein with respect to fig. 5, an application repository enables remote deployment of software applications for the imaging system. As discussed above with respect to fig. 1, the imaging system 101 and/or ECS 110 may be coupled to an application repository 140 via a network 120. In some examples, the imaging system 101 or ECS 110 may retrieve new or updated applications 142 from the application repository 140. Alternatively, the application repository 140 may push the application 142 to the ECS 110 and/or the imaging system 101. Furthermore, as discussed below, certain applications may only be compatible with a particular combination of imaging system 101 and ECS 110. For example, a different set of applications may be displayed or deployed to a newer, high-power imaging system coupled to a low-power ECS than to an older, low-power imaging system coupled to a high-power ECS.
FIG. 5 depicts a high-level flow diagram that illustrates an exemplary method 500 for managing applications on an Edge Computing System (ECS), according to one embodiment. In particular, method 500 involves retrieving new or updated applications for processing images and/or imaging data from an external repository. The method 500 is described with respect to the systems and components of fig. 1, but it should be understood that the method may be implemented with other systems and components without departing from the scope of the present disclosure. The method 500 may be implemented as executable instructions in the non-transitory memory 115 of the ECS 110 and may be executed by one or more of the plurality of processors 113 of the ECS 110.
The method 500 begins at 505. At 505, method 500 transmits the access request to a repository, such as repository 140. The access request may include an identification of the imaging system 101 and the ECS 110. The repository 140 includes a plurality of applications that may be compatible or incompatible with one or more of the imaging system 101 and the ECS 110. Thus, the repository 140 determines which of the plurality of applications are compatible with a given combination of the imaging system 101 and the ECS 110 based on the identification of the imaging system 101 and the ECS 110.
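The repository-side compatibility check described above can be sketched as filtering the stored applications by the identified imaging system and ECS. The compatibility-metadata format and application names below are assumptions for illustration.

```python
# Sketch: the repository returns only the applications compatible with
# both the identified imaging system and the identified ECS.
# App metadata and identifiers are hypothetical.

APPS = [
    {"name": "lung-seg", "version": "2.0",
     "systems": {"CT-A", "CT-B"}, "ecs": {"ECS-1", "ECS-2"}},
    {"name": "bone-density", "version": "1.1",
     "systems": {"CT-B"}, "ecs": {"ECS-2"}},
]

def compatible_apps(apps, system_id, ecs_id):
    """Return the subset of apps supporting this system/ECS combination."""
    return [a for a in apps
            if system_id in a["systems"] and ecs_id in a["ecs"]]

matches = compatible_apps(APPS, "CT-A", "ECS-2")
```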
At 510, method 500 receives a list of compatible applications from repository 140. The list of compatible applications may include a subset of the plurality of applications stored in repository 140.
At 515, method 500 receives a selection of an application in the list of compatible applications. For example, an operator of the imaging system 101 may view a list of compatible applications and select a desired application from the list via the user interface 107. Accordingly, the selection may be communicated from the imaging system 101 to the ECS 110. In other examples, ECS 110 may include its own user interface for enabling selection of an application, and thus may receive selection of an application via the user interface.
Additionally or alternatively, the selection of the application may be automatic. For example, if the repository 140 includes an updated version of an application that is already installed on the ECS 110, the method 500 may automatically select the updated version of the application from the list of compatible applications.
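The automatic-selection rule at this step can be sketched as choosing any listed application that is a newer version of one already installed. The dotted version strings and the comparison rule are simplifying assumptions.

```python
# Sketch of automatic update selection: pick available apps whose
# version is newer than the locally installed copy.

def select_updates(installed, available):
    """Return available apps that are newer versions of installed ones."""
    def as_tuple(version):
        return tuple(int(part) for part in version.split("."))
    updates = []
    for app in available:
        local = installed.get(app["name"])
        if local is not None and as_tuple(app["version"]) > as_tuple(local):
            updates.append(app)
    return updates

installed = {"lung-seg": "1.4"}
available = [{"name": "lung-seg", "version": "2.0"},
             {"name": "cardiac-flow", "version": "1.0"}]
chosen = select_updates(installed, available)
```

Applications in the compatible list that are not already installed (here, "cardiac-flow") are left for manual selection rather than auto-installed.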
At 520, method 500 retrieves the selected application from repository 140 and installs it locally in non-transitory memory 115. Thus, the DL applications 116 in the non-transitory memory 115 of the ECS 110 depicted in fig. 1 may include applications retrieved from the repository 140.
At 525, the method 500 generates a decision support calculation from imaging data received by the imaging system 101 using an application. For example, if the application comprises a segmentation application, the method 500 may segment an image reconstructed from imaging data acquired with the imaging system 101, and the segmentation of the image includes decision support calculations. At 530, the method 500 outputs the decision support calculation to the imaging system 101. The method 500 then ends.
Accordingly, a method for an ECS communicatively coupled to and positioned external to an imaging system includes: the method includes receiving, via a network, a list of compatible deep learning applications from an application repository communicatively coupled to the ECS, retrieving a deep learning application selected from the list from the application repository, receiving imaging data from the imaging system, processing the imaging data with the deep learning application to generate a decision support calculation, and outputting the decision support calculation to the imaging system.
Fig. 6 illustrates an exemplary CT system 600 configured to allow fast and iterative image reconstruction. Specifically, the CT system 600 is configured to image a subject 612 (such as a patient, an inanimate object, one or more manufacturing parts) and/or a foreign object (such as a dental implant, a stent and/or a contrast agent present within the body). In one embodiment, the CT system 600 includes a gantry 602, which in turn may further include at least one X-ray radiation source 604 configured to project a beam of X-ray radiation 606 for imaging a subject 612. In particular, the X-ray radiation source 604 is configured to project X-rays 606 toward a detector array 608 positioned on an opposite side of the gantry 602. Although fig. 6 depicts only a single source of X-ray radiation 604, in certain embodiments, multiple sources of radiation may be employed to project multiple X-rays 606 to acquire projection data corresponding to the subject 612 at different energy levels.
Although a CT system is described by way of example, it will be appreciated that the present techniques may also be useful when applied to images acquired using other imaging modalities, such as tomosynthesis, MRI, C-arm angiography, and the like. The present discussion of CT imaging modalities is provided merely as an example of one suitable imaging modality.
In certain embodiments, the CT system 600 is communicatively coupled to an Edge Computing System (ECS) 610 configured to process the projection data using DL techniques. For example, as discussed above, the CT system 600 may transmit the projection data to the ECS 610 as it is acquired, and then the ECS 610 processes the projection data to provide decision support with the image reconstructed from the projection data.
In some known CT imaging system configurations, a radiation source projects a fan-shaped beam that is collimated to lie within an X-Y plane of a Cartesian coordinate system (Cartesian coordinate system) and is commonly referred to as an "imaging plane". The radiation beam passes through an object being imaged, such as a patient or subject 612. The beam, after being attenuated by the object, impinges upon an array of radiation detectors. The intensity of the attenuated radiation beam received at the detector array is dependent upon the attenuation of the radiation beam by the object. Each detector element of the array produces a separate electrical signal that is a measure of the beam attenuation at the detector location. Attenuation measurements from all detectors are collected separately to produce a transmission profile.
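The attenuation relation described above is the Beer-Lambert law: the detected intensity falls off exponentially with the line integral of the attenuation coefficients along the ray, and that line integral is what each detector measurement effectively encodes. The voxel values below are illustrative.

```python
# Sketch of the detector-side attenuation measurement:
# I = I0 * exp(-sum(mu_i * dl)) along the ray (Beer-Lambert law),
# and recovery of the line integral (the projection value) from it.
import math

def detected_intensity(i0, mus, dl):
    """Intensity after the beam crosses voxels with coefficients `mus`."""
    line_integral = sum(mu * dl for mu in mus)
    return i0 * math.exp(-line_integral)

def line_integral_from_measurement(i0, i_detected):
    """Projection value recovered from the measured intensity."""
    return -math.log(i_detected / i0)

i = detected_intensity(1000.0, [0.1, 0.2, 0.15], dl=1.0)
p = line_integral_from_measurement(1000.0, i)   # recovers 0.45
```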
In some CT systems, a gantry is used to rotate the radiation source and the detector array in an imaging plane and around the object to be imaged such that the angle at which the radiation beam intersects the object constantly changes. A set of radiation attenuation measurements (i.e., projection data) from the detector array at one gantry angle is referred to as a "view". A "scan" of the object comprises a set of views made at different gantry angles, or view angles, during one rotation of the radiation source and detector. It is envisaged that the benefits of the methods described herein accrue to medical imaging modalities other than CT, and thus as used herein the term view is not limited to the use described above in relation to projection data from one gantry angle. The term "view" is used to mean one data acquisition whenever there are multiple data acquisitions from different angles (whether from CT, PET, or SPECT acquisitions), and/or any other modality (including modalities yet to be developed) and combinations thereof in the fusion embodiment.
In an axial scan, the projection data is processed to reconstruct an image corresponding to a two-dimensional slice taken through the object. One method for reconstructing an image from a set of projection data is known in the art as the Filtered Back Projection (FBP) technique. Transmission and emission tomography reconstruction techniques also include statistical iterative methods, such as Maximum Likelihood Expectation Maximization (MLEM) and ordered-subsets expectation maximization reconstruction techniques, as well as other iterative reconstruction techniques. These methods convert the attenuation measurements from the scan into integers called "CT numbers" or "Hounsfield units," which are used to control the brightness of the corresponding pixel on the display device.
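The CT-number mapping mentioned above rescales linear attenuation coefficients so that water is 0 HU and air is about -1000 HU. The attenuation coefficient used below is a rough illustrative figure, not a calibrated value.

```python
# The Hounsfield-unit scale: HU = 1000 * (mu - mu_water) / mu_water.

def to_hounsfield(mu, mu_water):
    """Rescale a linear attenuation coefficient to a CT number in HU."""
    return 1000.0 * (mu - mu_water) / mu_water

MU_WATER = 0.19   # rough linear attenuation coefficient of water (1/cm)
hu_water = to_hounsfield(0.19, MU_WATER)   # 0 HU by definition
hu_air = to_hounsfield(0.0, MU_WATER)      # about -1000 HU
```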
To reduce the total scan time, a "helical" scan may be performed. To perform a helical scan, the patient is moved while data for a prescribed number of slices is acquired. Such systems generate a single helix from a cone beam helical scan. The helix mapped out by the cone beam produces projection data from which an image in each prescribed slice can be reconstructed.
As used herein, the phrase "reconstructing an image" is not intended to exclude embodiments of the present disclosure in which data representing an image is generated rather than a visual image. Thus, as used herein, the term "image" broadly refers to both a viewable image and data representing a viewable image. However, many embodiments generate (or are configured to generate) at least one viewable image.
Additionally or alternatively, the CT system 600 may be communicatively coupled to a "cloud" network 620 by communicating with the ECS 610.
Fig. 7 shows an exemplary imaging system 700 similar to CT system 600 of fig. 6. The imaging system 700 may include the imaging system 101 described above with respect to fig. 1. In one embodiment, imaging system 700 includes detector array 608 (see fig. 6). The detector array 608 also includes a plurality of detector elements 702 that together sense an X-ray beam 606 (see fig. 6) passing through a subject 704, such as a patient, to acquire corresponding projection data. Thus, in one embodiment, detector array 608 is fabricated in a multi-slice configuration including multiple rows of cells or detector elements 702. In such a configuration, one or more additional rows of detector elements 702 are arranged in a parallel configuration for acquiring projection data.
In certain embodiments, the imaging system 700 is configured to traverse different angular positions around the subject 704 to acquire desired projection data. Thus, the gantry 602 and the components mounted thereon may be configured to rotate about a center of rotation 706 to acquire projection data at different energy levels, for example. Alternatively, in embodiments where the projection angle relative to the subject 704 varies over time, the mounted components may be configured to move along a general arc rather than along a segment of a circle.
As the X-ray source 604 and detector array 608 rotate, the detector array 608 collects data for the attenuated X-ray beam. The data collected by the detector array 608 undergoes pre-processing and calibration to adjust the data to represent the line integrals of the attenuation coefficients of the scanned subject 704. The processed data is commonly referred to as projections.
In dual-energy or multi-energy imaging, two or more sets of projection data of the imaged object are typically obtained at different tube peak kilovoltage (kVp) levels, which change the peak and spectrum of energy of the incident photons comprising the emitted X-ray beam, or, alternatively, at a single tube kVp level or spectrum using energy-resolving detectors of the detector array 608.
The acquired projection data sets may be used for Basis Material Decomposition (BMD). During BMD, the measured projections are converted into a set of density line-integral projections. The density line-integral projections may be reconstructed to form a density map or density image (such as a map of bone, soft tissue, and/or contrast agent) for each respective basis material. The density maps or density images may then be correlated to form a volume rendering of the basis materials (e.g., bone, soft tissue, and/or contrast agent) in the imaged volume.
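With measurements at two energies and known basis attenuation coefficients, the two-material decomposition above reduces to solving a 2x2 linear system per ray. The coefficients and projection values below are invented for illustration, not physically calibrated.

```python
# Two-material basis decomposition sketch: solve
# [p_low, p_high] = A @ [a_bone, a_soft] for the basis contributions.
# Matrix entries and projection values are illustrative assumptions.

def decompose(p_low, p_high, coeffs):
    """Solve the 2x2 system by Cramer's rule."""
    (m11, m12), (m21, m22) = coeffs
    det = m11 * m22 - m12 * m21
    a1 = (p_low * m22 - p_high * m12) / det
    a2 = (m11 * p_high - m21 * p_low) / det
    return a1, a2

# rows = [low kVp, high kVp], cols = [bone, soft tissue]
A = ((0.5, 0.2), (0.3, 0.18))
a_bone, a_soft = decompose(p_low=1.2, p_high=0.78, coeffs=A)
```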
Once reconstructed, the base material image produced by the imaging system 700 reveals internal features of the subject 704 in terms of the densities of the two base materials. A density image may be displayed to show these features. In conventional methods of diagnosing medical conditions (such as disease states) and more generally medical events, the radiologist or physician will consider a hard copy or display of the density image to identify characteristic features of interest. Such features may include lesions, sizes, and shapes of particular anatomical structures or organs, as well as other features that will be discernable in the image based on the skill and knowledge of the individual physician.
In one embodiment, the imaging system 700 includes a control mechanism 708 to control movement of components, such as rotation of the gantry 602 and operation of the X-ray radiation source 604. In certain embodiments, the control mechanism 708 further comprises an X-ray controller 710 configured to provide power and timing signals to the X-ray radiation source 604. In addition, the control mechanism 708 includes a gantry motor controller 712 configured to control the rotational speed and/or position of the gantry 602 based on imaging requirements.
In certain embodiments, control mechanism 708 further includes a Data Acquisition System (DAS) 714 configured to sample analog data received from detector elements 702 and convert the analog data to digital signals for subsequent processing. The data sampled and digitized by the DAS 714 is transmitted to a computer or computing device 716. In one example, the computing device 716 stores the data in a storage device 718, such as a mass storage device. For example, the mass storage device 718 may include a hard disk drive, a floppy disk drive, a compact disk read/write (CD-R/W) drive, a Digital Versatile Disk (DVD) drive, a flash drive, and/or a solid-state storage drive.
In addition, computing device 716 provides commands and parameters to one or more of DAS 714, X-ray controller 710, and gantry motor controller 712 for controlling system operations, such as data acquisition and/or processing. In certain embodiments, the computing device 716 controls system operation based on operator input. The computing device 716 receives operator input, including, for example, commands and/or scan parameters, via an operator console 720 operatively coupled to the computing device 716. Operator console 720 may include a keyboard (not shown) and/or a touch screen to allow an operator to specify commands and/or scanning parameters.
Although only one operator console 720 is shown in fig. 7, more than one operator console may be coupled to the imaging system 700, for example, for inputting or outputting system parameters, requesting examinations, and/or viewing images. Further, in certain embodiments, the imaging system 700 may be coupled, via one or more configurable wired and/or wireless networks (such as the internet and/or a virtual private network), to multiple displays, printers, workstations, and/or similar devices located either locally or remotely, e.g., within an institution or hospital, or in an entirely different location.
In one embodiment, for example, the imaging system 700 includes or is coupled to a Picture Archiving and Communication System (PACS) 724. In an exemplary embodiment, the PACS 724 is further coupled to a remote system (such as radiology department information system, hospital information system) and/or to an internal or external network (not shown) to allow operators at different locations to supply commands and parameters and/or gain access to image data.
Computing device 716 uses operator supplied and/or system defined commands and parameters to operate a table motor controller 726, which in turn may control a table 728, which may include a motorized table. In particular, the table motor controller 726 moves the table 728 for properly positioning the subject 704 in the gantry 602 to acquire projection data corresponding to a target volume of the subject 704.
As previously described, DAS 714 samples and digitizes projection data acquired by detector elements 702. Subsequently, the image reconstructor 730 performs a high-speed reconstruction using the sampled and digitized X-ray data. Although fig. 7 illustrates image reconstructor 730 as a separate entity, in certain embodiments, image reconstructor 730 may form a portion of computing device 716. Alternatively, image reconstructor 730 may not be present in imaging system 700, and computing device 716 may instead perform one or more functions of image reconstructor 730. Further, the image reconstructor 730 may be located locally or remotely and may be operatively connected to the imaging system 700 using a wired or wireless network. In particular, one exemplary embodiment may use computing resources in a "cloud" network cluster for image reconstructor 730.
In one embodiment, the image reconstructor 730 stores the reconstructed image in a storage device or mass storage device 718. Alternatively, image reconstructor 730 transmits the reconstructed image to computing device 716 to generate available patient information for diagnosis and evaluation. In certain embodiments, the computing device 716 transmits the reconstructed image and/or patient information to a display 732 that is communicatively coupled to the computing device 716 and/or the image reconstructor 730.
Various methods and processes further described herein may be stored as executable instructions in non-transitory memory on a computing device in the imaging system 700. For example, the image reconstructor 730 may include such executable instructions in non-transitory memory and may apply the methods described herein to reconstruct an image from scan data. In another embodiment, the computing device 716 may include the instructions in non-transitory memory, and may apply the methods described herein, at least in part, to the reconstructed image after receiving the reconstructed image from the image reconstructor 730. In yet another embodiment, the methods and processes described herein may be distributed across the image reconstructor 730 and the computing device 716.
In one embodiment, display 732 allows the operator to evaluate the imaged anatomy. The display 732 may also allow an operator to select a volume of interest (VOI) and/or request patient information, e.g., via a Graphical User Interface (GUI), for subsequent scanning or processing.
A technical effect of the present disclosure is to transmit imaging data to an external computing system for processing as it is acquired, in parallel with the acquisition of the imaging data. Another technical effect of the present disclosure is to suspend transmission of imaging data to an external computing system when the imaging system is in a critical state. Yet another technical effect of the present disclosure is to display reconstructed images and automatically generated decision support.
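The suspend-and-resume transmission behavior described above can be sketched as follows. This is a minimal illustration only; the state names, the buffering strategy, and the transmit callable are assumptions for the sketch, not taken from the disclosure:

```python
from collections import deque

# Hypothetical states in which transmission to the external system is suspended.
CRITICAL_STATES = {"gantry_accelerating", "x_ray_peak_load"}

class ImagingDataStreamer:
    """Streams acquired imaging data to an external computing system,
    pausing transmission while the imaging system is in a critical state."""

    def __init__(self, transmit):
        self.transmit = transmit   # callable that sends one data chunk to the ECS
        self.pending = deque()     # chunks buffered while transmission is suspended

    def on_data_acquired(self, chunk, system_state):
        self.pending.append(chunk)
        if system_state not in CRITICAL_STATES:
            # Not critical: flush everything buffered so far, in acquisition order.
            while self.pending:
                self.transmit(self.pending.popleft())
        # Critical: keep buffering; data is sent once the state clears.

sent = []
streamer = ImagingDataStreamer(sent.append)
streamer.on_data_acquired("view_001", "idle")
streamer.on_data_acquired("view_002", "gantry_accelerating")  # suspended
streamer.on_data_acquired("view_003", "idle")                 # resumes, flushes backlog
print(sent)  # ['view_001', 'view_002', 'view_003']
```

Buffering during the critical state, rather than dropping data, keeps the external computation consistent with the full acquisition while never loading the network during the critical interval.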
In one embodiment, a system includes an imaging system including at least a scanner and a processor configured to reconstruct an image from data acquired during scanning of a subject via the scanner; and a computing device communicatively coupled to and positioned external to the imaging system, the computing device configured to generate a decision support calculation based on the data, wherein the imaging system transmits the data to the computing device during the scan.
In a first example of the system, the imaging system further comprises a user interface, wherein the scan protocol for the scan is determined based on a primary task input to the processor via the user interface. In a second example of the system, optionally including the first example, the processor transmits a secondary task based on the primary task to the computing device, and the computing device generates the decision support calculation from the secondary task. In a third example of the system, optionally including one or more of the first example and the second example, the imaging system further comprises a display device, wherein the processor is configured to display the image and the decision support calculation via the display device. In a fourth example of the system, which optionally includes one or more of the first through third examples, the imaging system transmits the data only during the scan when the imaging system is not in a critical state. In a fifth example of the system, which optionally includes one or more of the first through fourth examples, the computing device includes a deep learning model in non-transitory memory, wherein the computing device inputs the data to the deep learning model to generate the decision support calculation.
In another embodiment, a method for an imaging system includes performing a scan of a subject to acquire imaging data, transmitting the imaging data during the scan to a computing device communicatively coupled to and positioned external to the imaging system, receiving a decision support output from the computing device, the decision support output calculated by the computing device from the imaging data, and displaying an image reconstructed from the imaging data and the decision support output.
In a first example of the method, the method further comprises evaluating a state of the imaging system during the scan, wherein transmitting the imaging data to the computing device comprises transmitting the imaging data to the computing device when the state of the imaging system is not critical, and not transmitting the imaging data to the computing device when the state of the imaging system is critical. In a second example of the method, which optionally includes the first example, the method further includes receiving an indication of the primary task, and transmitting an indication of a secondary task related to the primary task to the computing system. In a third example of the method, which optionally includes one or more of the first example and the second example, the method further comprises determining a scanning protocol for scanning according to the primary task, wherein the computing system computes the decision support output according to the secondary task. In a fourth example of the method, which optionally includes one or more of the first through third examples, the computing system calculates the decision support output during the scan. In a fifth example of the method, which optionally includes one or more of the first through fourth examples, displaying the image and the decision support output includes displaying the decision support output superimposed on the image. In a sixth example of the method, which optionally includes one or more of the first through fifth examples, the computing device calculates the decision support output by inputting the imaging data to a deep learning model that outputs the decision support output. 
In a seventh example of the method, which optionally includes one or more of the first through sixth examples, the method further includes receiving, via a user interface of the imaging system, ground truth about the decision support output, and transmitting the ground truth to the computing device to update the deep learning model.
In yet another embodiment, an imaging system includes an X-ray source that emits an X-ray beam toward a subject to be imaged; a detector that receives X-rays attenuated by a subject; a Data Acquisition System (DAS) operably connected to the detector, and a computing device operably connected to the DAS and configured with executable instructions in a non-transitory memory that, when executed, cause the computing device to: controlling an X-ray source to perform a scan of a subject; during a scan, receiving projection data via the DAS and transmitting the projection data to an Edge Computing System (ECS) communicatively coupled to and positioned external to the imaging system; receiving a decision support output generated by the ECS based on the projection data; and outputting the decision support output and the image reconstructed from the projection data.
In a first example of the imaging system, the imaging system further comprises a display device, wherein the decision support output and the image are output to the display device. In a second example of the imaging system, optionally including the first example, the computing device is further configured with executable instructions in the non-transitory memory that, when executed, cause the computing device to reconstruct an image from the projection data. In a third example of the imaging system, optionally including one or more of the first example and the second example, the computing device is further configured with executable instructions in the non-transitory memory that, when executed, cause the computing device to receive, from the ECS, an image reconstructed by the ECS from the projection data. In a fourth example of the imaging system, optionally including one or more of the first through third examples, the imaging system further includes an operator console coupled to the computing device and configured to receive input from an operator of the imaging system, wherein the computing device is further configured with executable instructions in the non-transitory memory that, when executed, cause the computing device to receive an indication of the primary task via the operator console and select a scanning protocol for scanning in accordance with the primary task. In a fifth example of the imaging system, optionally including one or more of the first example through the fourth example, the computing device is further configured with executable instructions in the non-transitory memory that, when executed, cause the computing device to output an indication of a secondary task to the ECS, the secondary task being associated with the primary task, wherein the decision support output is generated by the ECS from the secondary task.
Fig. 8 shows a block schematic diagram of an exemplary system 800, in greater detail than fig. 1, for extending the capabilities of an imaging system or other device 802, according to one embodiment. The system 800 may include a plurality of imaging systems 802, such as any suitable non-invasive imaging system, including but not limited to Computed Tomography (CT) imaging systems, Positron Emission Tomography (PET) imaging systems, Magnetic Resonance (MR) imaging systems, ultrasound systems, and combinations thereof (e.g., multi-modality imaging systems, such as PET/CT or PET/MR imaging systems), or other devices 802, such as Advanced Workstations (AW) for visualizing images, PACS, and mobile client devices, such as tablets, iPads, smartphones, iPhones, and so forth. The plurality of imaging systems and other devices 802 may be communicatively coupled to the edge computing system 804 via a network. The plurality of imaging systems and other devices 802 and the edge computing system 804 are communicatively coupled to a cloud network 806. The communication between the plurality of imaging systems and other devices 802, the ECS 804, and the cloud network 806 is secure.
The ECS 804 may also be referred to as an on-premise cloud and may include data 808 from various inputs and sources (including EMR, HIS/RIS data, ensemble data, etc.) that is communicatively coupled to a DL interface 810 and DL training 812. The ECS 804 can further include at least one edge host 814, which can include at least one edge platform 816 and an application store 818 having a plurality of applications 820.
The cloud network 806 may include data 822 from various inputs and sources as well as DL training 824. Additionally or alternatively, the cloud network 806 can include an application store having a plurality of applications 826. The cloud network 806 may also include an Artificial Intelligence (AI) interface for generating learning models or other models that may be used in the system 800.
Data from the system 800 may be shared and/or stored among the plurality of imaging systems and other devices 802, the ECS 804, and the cloud network 806. Post-processing may be performed on data sent to the ECS before the data is forwarded to DICOM. Quantification and segmentation may be performed on the imaging system before the results are sent to the PACS.
Some examples provide core processing capabilities that are organized into units or modules that can be deployed in a variety of locations. Off-device processing may be utilized to provide a cloudlet, a mini-cloud, and/or a global cloud, among others. For example, the cloudlet provides a one-to-one configuration with an imaging device console that addresses ultra-low-latency processing (e.g., stroke detection, etc.) for customers without cloud connectivity or the like. Mini-clouds are deployed on a customer network or the like for low-latency processing, e.g., for customers who prefer to keep their data on premises. Global clouds are deployed across customer organizations to enable high-performance computing and management on a robust information technology infrastructure.
In certain other examples, one or more off-device processing engines (e.g., an acquisition engine, a reconstruction engine, a diagnostic engine, etc., and their associated deployed deep learning network devices, etc.) may be included in system 800. For example, examination purpose, electronic medical record information, heart rate and/or heart rate variability, blood pressure, weight, prone/supine positioning, head-first or feet-first positioning, visual assessments, and the like may be used to determine one or more acquisition settings, such as default field of view (DFOV), center, spacing, orientation, contrast agent injection rate, contrast agent injection timing, voltage, current, and the like, thereby providing a "one-click" imaging device. Similarly, for example, reconstruction kernel information, slice thicknesses, slice spacings, and the like may be used to determine one or more reconstruction parameters, including image quality feedback. Acquisition feedback, reconstruction feedback, etc. may be provided to a system design engine to provide real-time (or substantially real-time, accounting for processing and/or transmission delays) health analysis for the imaging device, as represented by one or more digital models (e.g., deep learning models, machine models, digital twins, etc.). The one or more digital models may be used to predict the health of components of the imaging device in real-time (or substantially real-time, accounting for processing and/or transmission delays).
Fig. 9 shows a flow diagram illustrating a method 900 for generating decision support output using one or more Deep Learning (DL) applications. The method 900 is described with respect to the systems and components of fig. 1, but it should be understood that the method may be implemented with other systems and components without departing from the scope of the present disclosure. The method 900 may be implemented as executable instructions in the non-transitory memory 115 of the ECS 110 and may be executed by one or more of the plurality of processors 113 of the ECS 110.
At 905, an exam type and a secondary task for decision support output related to an imaging scan are received. As explained above, when performing an imaging scan in order to diagnose a patient pathology, an operator of the scanner may enter an initial task or a primary task (as explained above with respect to fig. 2), or may obtain the initial/primary task from the patient's EMR (as explained above with respect to fig. 4). The initial/primary task may include an indication of the type of examination and may include an associated secondary task. The examination type may include the target anatomy to be scanned and certain features of the scan used to perform the examination (such as a cerebral hemorrhage examination, a liver lesion examination, etc.). The associated secondary task may indicate decision support to be generated and output with the images obtained during the imaging scan. Decision support may include bleeding detection, lesion detection, organ segmentation, etc.
At 910, an application to be executed is selected for generating a decision support output. The selected application may be a deep learning application (DL application) that utilizes imaging information sent from the scanner (e.g., as explained above with respect to fig. 2) as input into a DL model. The DL application then generates a decision support output by applying the DL model to the input imaging information. The DL application may be selected based on the secondary task and the type of examination. For example, if the exam is a brain exam to be performed to detect the presence of bleeding, a brain bleeding DL application may be selected. If the examination is a liver scan to be performed to detect the presence of a lesion, a liver lesion DL application may be selected. In some examples, the same DL application may be applied to more than one type of examination. For example, a lesion detection DL application may be selected for both liver lesion detection and kidney lesion detection. In other examples, a separate DL application may be selected for each different exam type/secondary task combination. In such examples, different DL applications may be selected for liver and kidney lesion detection examinations.
The appropriate DL application may be selected according to a suitable mechanism. In some embodiments, the ECS may store a look-up table in memory that indexes the DL application according to the type of exam and the secondary task. Other mechanisms for selecting DL applications are possible. For example, rather than the ECS selecting the appropriate DL application, the operator may select the desired DL application and an indication of the selected DL application may be sent to the ECS along with the type of exam and the secondary task. In some embodiments, selection of a particular DL application may automatically trigger the sequence of other DL applications to be triggered for pre-or post-processing of data for the primary task. For example, selecting a DL application for liver lesion detection may trigger an anatomy localization DL application that may be executed to reduce the acquired image to an image that includes only the liver.
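The look-up-table selection mechanism described above, including the automatic triggering of a pre-processing application, can be sketched as follows. The table keys, application names, and pre-processing chain are hypothetical, chosen only to mirror the examples in the text:

```python
# Hypothetical lookup table mapping (exam type, secondary task) -> DL application.
DL_APP_TABLE = {
    ("head_ct_non_contrast", "hemorrhage_detection"): "brain_hemorrhage_dl_app",
    ("liver_ct_multiphase", "lesion_detection"): "liver_lesion_dl_app",
    ("kidney_ct", "lesion_detection"): "liver_lesion_dl_app",  # one app may serve several exams
}

# Some applications automatically trigger other applications for pre-processing.
PREPROCESSING_CHAIN = {
    # e.g., anatomy localization reduces the acquired images to the liver only
    "liver_lesion_dl_app": ["anatomy_localization_dl_app"],
}

def select_dl_apps(exam_type, secondary_task):
    """Return the ordered list of DL applications to execute for this exam."""
    primary_app = DL_APP_TABLE[(exam_type, secondary_task)]
    return PREPROCESSING_CHAIN.get(primary_app, []) + [primary_app]

print(select_dl_apps("liver_ct_multiphase", "lesion_detection"))
# ['anatomy_localization_dl_app', 'liver_lesion_dl_app']
```

An operator-driven variant would simply bypass the table and pass the chosen application name to the ECS directly.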
At 915, a request for additional input is optionally sent. Additional inputs may include subject and/or scanner metadata or other information that may be used by the selected DL application to generate decision support output. Thus, requests for additional input may be sent to the EMR, PACS, or other database storing additional input. For example, as described above with respect to fig. 3, in addition to the images/imaging data obtained during an imaging scan of a subject, the DL application may also utilize various information as input to the corresponding DL model, including but not limited to subject information (e.g., demographic data, medical history) and scanner information. The scanner information may include the type of scanner (such as a CT scanner or ultrasound), imaging class/type (such as contrast agent imaging, non-contrast agent imaging, Doppler, B-mode), scanner settings (such as tube current and tube voltage), and so forth. Additional inputs may also include laboratory test results of the subject, previous diagnostic imaging scans, and complementary images from other imaging modalities. In some embodiments, the additional input requested and/or used by the selected DL application may be based on the type of examination and the secondary task. For example, the ECS may store a look-up table that stores the additional inputs to request according to the type of examination and the secondary task. As one non-limiting example, if the examination type is a non-contrast head CT scan to be analyzed by a cerebral hemorrhage detection DL application, the requested additional inputs may include laboratory reports, neurophysiological examination reports, the X-ray tube kVp used during the scan, and/or the type of CT scan.
As another non-limiting example, if the examination type is a multi-phase liver study conducted with contrast agents administered at different timings to be analyzed by the liver lesion detection module, additional inputs may include the scan type and X-ray tube kVp used during the scan, contrast agent timing, volume of contrast agent injected, ejection fraction of the heart, and/or previous CT studies for the patient.
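The table-driven request for additional input at 915 can be sketched in the same style, using the two non-limiting examples above. The field names, table keys, and source list are illustrative assumptions, not part of the disclosure:

```python
# Hypothetical table of additional inputs requested per (exam type, secondary task).
ADDITIONAL_INPUT_TABLE = {
    ("head_ct_non_contrast", "hemorrhage_detection"): [
        "laboratory_reports", "neurophysiological_exam_reports",
        "tube_kvp", "ct_scan_type",
    ],
    ("liver_ct_multiphase", "lesion_detection"): [
        "ct_scan_type", "tube_kvp", "contrast_timing", "contrast_volume",
        "cardiac_ejection_fraction", "prior_ct_studies",
    ],
}

def build_input_requests(exam_type, secondary_task, sources=("EMR", "PACS")):
    """Build one request per external source for the inputs the DL app needs."""
    needed = ADDITIONAL_INPUT_TABLE.get((exam_type, secondary_task), [])
    return [{"source": src, "fields": needed} for src in sources]

requests = build_input_requests("head_ct_non_contrast", "hemorrhage_detection")
print(len(requests), requests[0]["source"])  # 2 EMR
```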
At 920, images and/or imaging data are received from the scanner. As explained above with respect to figs. 2 and 3, the scanner may send reconstructed images and/or raw imaging data to the ECS during and/or after the imaging scan. At 925, a decision support output is generated using the selected DL application. To generate the decision support output, the selected DL application may provide the received images/imaging data, along with any additional information specified by the DL application, as input to a DL model executed by the DL application. The DL model may then output decision support. As explained above, decision support may include organ segmentation, anatomical structure identification, indications of whether bleeding, lesions, fractures, etc., are detected, and so forth. At 930, the decision support output is sent to a requesting device, such as the scanner that sent the imaging information, and/or another device, such as a PACS, a care provider device, or other suitable device. As explained above, the decision support output may be displayed or otherwise presented with the reconstructed images from the imaging scan. The decision support output may support diagnosis or determination of clinical findings via the reconstructed images. The decision support output and the reconstructed images may be analyzed by one or more clinicians, such as radiologists. The decision support output may be accepted or rejected by the one or more clinicians. For example, if the decision support output includes an indication of a lesion on the subject's liver, the clinician may agree with the lesion detection and accept the output, or disagree that the identified lesion is in fact a lesion and reject the output. In examples where the decision support output comprises organ segmentation, the clinician may edit or update a contour defining the organ, where the contour is generated by the DL application.
Further, the clinician may generate a final report with detailed and, if indicated, edited findings (e.g., updated contours, rejected decision support, etc.). The final report may be saved in the subject's EMR.
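Steps 920 through 930 can be sketched as follows. This is a minimal illustration only; the stand-in model, the feature layout, and the intensity threshold are assumptions for the sketch and do not represent the disclosed DL models:

```python
# Sketch of steps 920-930: feed received imaging data plus any additional
# inputs to the selected DL model, then return its decision support output.
def generate_decision_support(dl_model, imaging_data, additional_inputs=None):
    """Run the selected DL application's model over the imaging data."""
    features = {"images": imaging_data, **(additional_inputs or {})}
    return dl_model(features)

def fake_hemorrhage_model(features):
    # Stand-in for a trained deep learning model: flags any slice whose
    # (pretend) intensity statistic exceeds a fixed threshold.
    suspicious = [i for i, s in enumerate(features["images"]) if s > 0.8]
    return {"hemorrhage_detected": bool(suspicious), "slices": suspicious}

output = generate_decision_support(fake_hemorrhage_model, [0.2, 0.9, 0.5])
print(output)  # {'hemorrhage_detected': True, 'slices': [1]}
```

A real deployment would send `output` back to the requesting scanner or PACS, where a clinician accepts, rejects, or edits it as described above.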
At 935, a final report, which may include the edited results, is received (e.g., from an EMR database). The final report may be analyzed by the ECS (e.g., by a training data aggregator module of the ECS) to identify the type of examination and DL application used to generate the decision support included in the report, which may be flagged/stored with the final report. At 940, the labeled report (e.g., the final report, including edited results where appropriate, and labeled with exam type and DL application) is placed in a training, testing, and/or validation dataset. Each data set may be stored on the ECS. Further, each data set may be specific to a DL application. For example, each DL application may have a particular training data set, test data set, and validation data set. Thus, the flagged report may be placed in a data set based on the DL application that is used to generate the decision support listed in the report. In at least some embodiments, within a data set for a particular DL application, a labeled report may be randomly or semi-randomly assigned to one of the training, testing, and/or validation data sets.
At 945, method 900 includes determining whether the training data set includes a threshold number of reports. The threshold number of reports may be a number sufficient to accurately reflect the functionality of the DL model executed by the DL application, such as 100 reports. Thus, if the training data set for a given DL application reaches 100 reports, the answer at 945 is "yes"; otherwise, the answer at 945 is "no". If the answer at 945 is "no", the method 900 returns. If the answer is "yes", the method 900 proceeds to 950 to retrain the DL application with the new data in the data set. In this way, the DL model can be fine-tuned by retraining with new data. The updated weights may be deployed locally to tune the model at the site. For example, the DL model executed by the DL application on the ECS may include weighted connections, decision trees, etc., and the DL model may be retrained such that the weights are updated to better fit the new data. As more and more data is collected, the DL model may become more accurate. By retraining the DL model locally (e.g., on the ECS) rather than globally via a central device such as the cloud, site-specific preferences in deploying the DL model can be maintained. For example, a particular medical facility may prefer an aggressive lesion detection method to ensure that false negative detections are reduced, while a different medical facility may prefer to reduce false positives by resorting to a more conservative lesion detection method. Such preferences may be established and maintained by allowing local retraining of the appropriate DL model. Furthermore, local retraining of DL models can be used to customize DL models for the specific patient demographics of certain regions of the world. For example, if the majority of the patient population in a hospital is obese, the DL model may need to be tuned for the imaging characteristics of that patient demographic.
However, in at least some examples, a subset or all of the data and/or updated weights may be sent to a central device (e.g., a cloud) along with data from other sites to create an updated global model, as shown at 955. The data sent back to the cloud may include images and other specific data sets that may be used to train a particular model. The updated global model may also be sent back to update the locally stored DL application models.
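The report aggregation and threshold-triggered local retraining of steps 935 through 955 can be sketched as follows. The split weights, the fixed seed, and the retraining callable are assumptions chosen only to make the sketch deterministic and runnable:

```python
import random

REPORT_THRESHOLD = 100  # number of labeled reports that triggers retraining (step 945)

class TrainingDataAggregator:
    """Collects labeled final reports per DL application and triggers local
    retraining once the training split reaches a threshold (steps 935-950)."""

    def __init__(self, retrain, seed=0):
        self.retrain = retrain   # callable: retrain(app, reports) -> updated local weights
        self.datasets = {}       # app -> {"train": [...], "test": [...], "validate": [...]}
        self.rng = random.Random(seed)

    def add_report(self, app, labeled_report):
        splits = self.datasets.setdefault(app, {"train": [], "test": [], "validate": []})
        # Semi-random assignment to one of the three data sets (step 940).
        split = self.rng.choices(["train", "test", "validate"], weights=[8, 1, 1])[0]
        splits[split].append(labeled_report)
        if len(splits["train"]) >= REPORT_THRESHOLD:
            # Local fine-tuning preserves site-specific preferences (step 950);
            # a copy of the data/weights could also go to the cloud (step 955).
            self.retrain(app, splits["train"])
            splits["train"].clear()

retrained = []
agg = TrainingDataAggregator(lambda app, reports: retrained.append((app, len(reports))))
for i in range(200):
    agg.add_report("liver_lesion_dl_app", {"id": i})
print(bool(retrained))  # True: the training split reached the threshold
```

Keeping one data-set triple per DL application mirrors the text's point that each application is retrained only on reports generated with that application.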
As used herein, an element or step recited in the singular and preceded with the word "a" or "an" should be understood as not excluding plural of said elements or steps, unless such exclusion is explicitly recited. Furthermore, references to "one embodiment" of the present invention are not intended to be interpreted as excluding the existence of additional embodiments that also incorporate the recited features. Moreover, unless explicitly stated to the contrary, embodiments "comprising," "including," or "having" an element or a plurality of elements having a particular property may include additional such elements not having that property. The terms "including" and "in which" are used as the plain-language equivalents of the respective terms "comprising" and "wherein." Furthermore, the terms "first," "second," and "third," etc. are used merely as labels, and are not intended to impose numerical requirements or a particular positional order on their objects.
This written description uses examples to disclose the invention, including the best mode, and also to enable any person skilled in the relevant art to practice the invention, including making and using any devices or systems and performing any incorporated methods. The patentable scope of the invention is defined by the claims, and may include other examples that occur to those of ordinary skill in the art. Such other examples are intended to be within the scope of the claims if they have structural elements that do not differ from the literal language of the claims, or if they include equivalent structural elements with insubstantial differences from the literal language of the claims.

Claims (20)

1. A system, comprising:
an imaging system comprising at least a scanner and a processor configured to reconstruct an image from data acquired during scanning of a subject via the scanner; and
a computing device communicatively coupled to and positioned external to the imaging system, the computing device configured to generate a decision support calculation based on the data, wherein the imaging system transmits the data to the computing device during the scan.
2. The system of claim 1, wherein the imaging system further comprises a user interface, wherein a scan protocol for the scan is determined based on a primary task input to the processor via the user interface.
3. The system of claim 2, wherein the processor transmits a secondary task based on the primary task to the computing device, and wherein the computing device generates the decision support calculation as a function of the secondary task.
4. The system of claim 1, wherein the imaging system further comprises a display device, wherein the processor is configured to display the image and the decision support calculation via the display device.
5. The system of claim 1, wherein the imaging system transmits the data only during the scan when the imaging system is not in a critical state.
6. The system of claim 1, wherein the computing device comprises a deep learning model in non-transitory memory, wherein the computing device inputs the data to the deep learning model to generate the decision support computation.
7. A method for an imaging system, comprising:
performing a scan of a subject to acquire imaging data;
transmitting the imaging data to a computing device communicatively coupled to and positioned external to the imaging system during the scan;
receiving a decision support output from the computing device, the decision support output calculated by the computing device from the imaging data; and
displaying an image reconstructed from the imaging data and the decision support output.
8. The method of claim 7, further comprising evaluating a state of the imaging system during the scan, wherein transmitting the imaging data to the computing device comprises transmitting the imaging data to the computing device when the state of the imaging system is not critical and not transmitting the imaging data to the computing device when the state of the imaging system is critical.
9. The method of claim 7, further comprising receiving an indication of a primary task and transmitting an indication of a secondary task related to the primary task to the computing system.
10. The method of claim 9, further comprising determining a scanning protocol for the scan according to the primary task, wherein the computing system calculates the decision support output according to the secondary task.
11. The method of claim 7, wherein the computing system computes the decision support output during the scan.
12. The method of claim 7, wherein displaying the image and the decision support output comprises displaying the decision support output superimposed on the image.
13. The method of claim 7, wherein the computing device computes the decision support output by inputting the imaging data to a deep learning model that outputs the decision support output.
14. The method of claim 13, further comprising receiving, via a user interface of the imaging system, ground truth about the decision support output, and transmitting the ground truth to the computing device to update the deep learning model.
15. An imaging system, comprising:
an X-ray source that emits an X-ray beam toward a subject to be imaged;
a detector that receives the X-rays attenuated by the subject;
a Data Acquisition System (DAS) operably connected to the detectors; and
a computing device operatively connected to the DAS and configured with executable instructions in non-transitory memory that, when executed, cause the computing device to:
controlling the X-ray source to perform a scan of the subject;
during the scan, receiving projection data via the DAS and transmitting the projection data to an Edge Computing System (ECS) communicatively coupled to and positioned external to the imaging system;
receiving a decision support output generated by the ECS based on the projection data; and
outputting the decision support output and an image reconstructed from the projection data.
16. The imaging system of claim 15, further comprising a display device, wherein the decision support output and the image are output to the display device.
17. The imaging system of claim 15, wherein the computing device is further configured with executable instructions in non-transitory memory that, when executed, cause the computing device to reconstruct the image from the projection data.
18. The imaging system of claim 15, wherein the computing device is further configured with executable instructions in non-transitory memory that, when executed, cause the computing device to receive, from the ECS, the image reconstructed by the ECS from the projection data.
19. The imaging system of claim 15, further comprising an operator console coupled to the computing device and configured to receive input from an operator of the imaging system, wherein the computing device is further configured with executable instructions in non-transitory memory that, when executed, cause the computing device to receive an indication of a primary task via the operator console and select a scan protocol for the scan according to the primary task.
20. The imaging system of claim 19, wherein the computing device is further configured with executable instructions in non-transitory memory that, when executed, cause the computing device to output an indication of a secondary task to the ECS, the secondary task associated with the primary task, wherein the decision support output is generated by the ECS in accordance with the secondary task.
CN201980022365.5A 2018-04-13 2019-04-12 System and method for synchronizing an imaging system and an edge calculation system Pending CN111919264A (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US201862657633P 2018-04-13 2018-04-13
US62/657,633 2018-04-13
PCT/US2019/027370 WO2019200346A1 (en) 2018-04-13 2019-04-12 Systems and methods for synchronization of imaging systems and an edge computing system

Publications (1)

Publication Number Publication Date
CN111919264A true CN111919264A (en) 2020-11-10

Family

ID=66397456

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201980022365.5A Pending CN111919264A (en) System and method for synchronizing an imaging system and an edge computing system

Country Status (2)

Country Link
CN (1) CN111919264A (en)
WO (1) WO2019200346A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023280121A1 (en) * 2021-07-07 2023-01-12 Huawei Technologies Co., Ltd. Method and apparatus for obtaining edge service

Families Citing this family (1)

Publication number Priority date Publication date Assignee Title
CN114002332B (en) * 2021-09-29 2023-07-25 西安交通大学 Structural damage monitoring and early warning method and structural integrity digital twin system

Patent Citations (2)

Publication number Priority date Publication date Assignee Title
US20050267348A1 (en) * 2004-06-01 2005-12-01 Wollenweber Scott D Methods and apparatus for automatic protocol selection
CN106537398A (en) * 2014-07-16 2017-03-22 Koninklijke Philips N.V. IRECON: intelligent image reconstruction system with anticipatory execution

Non-Patent Citations (2)

Title
MOHAMMAD-PARSA HOSSEINI ET AL: "Deep Learning with Edge Computing for Localization of Epileptogenicity using Multimodal rs-fMRI and EEG Big Data", 2017 IEEE INTERNATIONAL CONFERENCE ON AUTONOMIC COMPUTING, 10 August 2017, pages 83-92 *
QINGZENG SONG ET AL: "Using Deep Learning for Classification of Lung Nodules on Computed Tomography Images", JOURNAL OF HEALTHCARE ENGINEERING, vol. 2017, 9 August 2017, pages 1-7, XP055595889, DOI: 10.1155/2017/8314740 *

Also Published As

Publication number Publication date
WO2019200346A1 (en) 2019-10-17

Similar Documents

Publication Publication Date Title
US20220117570A1 (en) Systems and methods for contrast flow modeling with deep learning
US7756314B2 (en) Methods and systems for computer aided targeting
US10755407B2 (en) Systems and methods for capturing deep learning training data from imaging systems
US11393579B2 (en) Methods and systems for workflow management
CN112005314A (en) System and method for training a deep learning model of an imaging system
JP2004105728A (en) Computer aided acquisition of medical image
US11141079B2 (en) Systems and methods for profile-based scanning
US10679346B2 (en) Systems and methods for capturing deep learning training data from imaging systems
US20170042494A1 (en) Computed tomography apparatus and method of reconstructing a computed tomography image by the computed tomography apparatus
CN111374690A (en) Medical imaging method and system
CN112004471A (en) System and method for imaging system shortcut mode
US20220399107A1 (en) Automated protocoling in medical imaging systems
CN111919264A (en) System and method for synchronizing an imaging system and an edge computing system
US10552959B2 (en) System and method for using imaging quality metric ranking
US20220375038A1 (en) Systems and methods for computed tomography image denoising with a bias-reducing loss function
CN112005313A (en) System and method for deploying deep learning applications to imaging systems
US11955228B2 (en) Methods and system for simulated radiology studies based on prior imaging data
US20230048231A1 (en) Method and systems for aliasing artifact reduction in computed tomography imaging
US20230154594A1 (en) Systems and methods for protocol recommendations in medical imaging
US20240029415A1 (en) Simulating pathology images based on anatomy data
US11941022B2 (en) Systems and methods for database synchronization
WO2021252751A1 (en) Systems and methods for generating synthetic baseline x-ray images from computed tomography for longitudinal analysis
KR20160072004A (en) Tomography apparatus and method for reconstructing a tomography image thereof

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination