WO2022089626A1 - Systems and methods for medical imaging - Google Patents

Systems and methods for medical imaging

Info

Publication number: WO2022089626A1
Application number: PCT/CN2021/127787
Authority: WIPO (PCT)
Prior art keywords: parameter values, sample, exposure parameters, learning model, machine learning
Other languages: French (fr)
Inventors: Haijun Chen, Kun TANG
Original assignee: Shanghai United Imaging Healthcare Co., Ltd.
Application filed by Shanghai United Imaging Healthcare Co., Ltd.

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 6/00 Apparatus or devices for radiation diagnosis; apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B 6/54 Control of apparatus or devices for radiation diagnosis
    • A61B 6/542 Control of apparatus or devices for radiation diagnosis involving control of exposure
    • A61B 6/02 Arrangements for diagnosis sequentially in different planes; stereoscopic radiation diagnosis
    • A61B 6/03 Computed tomography [CT]
    • A61B 6/037 Emission tomography
    • Y02B 20/40 Control techniques providing energy savings, e.g., smart controller or presence detection

Definitions

  • the present disclosure generally relates to medical imaging, and more particularly, relates to systems and methods for determining one or more exposure parameters of an X-ray radiography device.
  • X-ray radiography is widely used in medical imaging and diagnosis.
  • Conventional X-ray radiography devices including computed tomography (CT) devices, digital radiography (DR) devices, C-arm X-ray devices, and so on, generally acquire images by scanning part or all of a subject.
  • exposure parameters associated with an X-ray generator of an X-ray radiography device need to be automatically adjusted multiple times, usually according to an automatic brightness stabilization (ABS) curve of the X-ray generator, in order to obtain desired images (or stable images) .
  • making multiple adjustments can be time-consuming and can cause the scanned subject (e.g., a patient) to be exposed to an excessive amount of radiation.
  • a system may include at least one storage device including a set of instructions for generating an image; and at least one processor in communication with the at least one storage device. When executing the set of instructions, the at least one processor is configured to cause the system to perform operations.
  • the operations may include obtaining parameter values of one or more acquisition parameters of an imaging device for scanning an object; obtaining initial parameter values of exposure parameters of the imaging device and a target dose associated with the object; determining, by using a trained machine learning model, stable parameter values of the exposure parameters corresponding to the parameter values of the one or more acquisition parameters, the initial parameter values of the exposure parameters, and the target dose; and generating an image by causing the imaging device to scan the object based on the stable parameter values of the exposure parameters.
  • the one or more acquisition parameters may include at least one of: an object type of the object, an acquisition type, a source image receptor distance (SID) , a source object distance (SOD) , or a filtration mode.
  • the exposure parameters may include at least one of a tube voltage, a tube current, or a radiation time.
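  • For illustration only, the mapping described in the operations above (acquisition parameters, initial exposure parameters, and a target dose in; stable exposure parameters out) could be realized with a small feed-forward regressor. The sketch below is a hypothetical minimal implementation; the class name ExposureParamNet, the layer sizes, and the input dimensions are assumptions, not part of the disclosure.

```python
import torch
import torch.nn as nn

class ExposureParamNet(nn.Module):
    """Hypothetical regressor: acquisition parameters, initial exposure
    parameters, and a target dose in; stable exposure parameters out."""
    def __init__(self, n_acq: int = 5, n_exp: int = 3):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_acq + n_exp + 1, 64),  # +1 input for the target dose
            nn.ReLU(),
            nn.Linear(64, 64),
            nn.ReLU(),
            nn.Linear(64, n_exp),  # e.g., tube voltage, tube current, radiation time
        )

    def forward(self, acq, init_exp, target_dose):
        x = torch.cat([acq, init_exp, target_dose], dim=-1)
        return self.net(x)

# Usage with made-up, already-normalized values:
model = ExposureParamNet()
acq = torch.tensor([[0.2, 1.0, 0.9, 0.7, 0.0]])  # object type, acquisition type, SID, SOD, filtration mode
init_exp = torch.tensor([[0.5, 0.4, 0.1]])       # initial kV, mA, radiation time
target_dose = torch.tensor([[0.6]])
stable_exp = model(acq, init_exp, target_dose)   # predicted stable parameter values
```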
  • the obtaining the trained machine learning model may include obtaining multiple groups of training samples, each group corresponding to a sample object; and generating the trained machine learning model by training a preliminary machine learning model using the multiple groups of training samples in a training process.
  • Each group of the multiple groups of training samples may include sample parameter values of the one or more acquisition parameters, sample initial parameter values of the exposure parameters, a sample target dose, and sample stable parameter values of the exposure parameters.
  • the sample parameter values of the one or more acquisition parameters, the sample initial parameter values of the exposure parameters, and the sample target dose may serve as an input of the preliminary machine learning model and the sample stable parameter values of the exposure parameters may serve as desired outputs of the preliminary machine learning model during the training process.
  • the sample stable parameter values of the exposure parameters may be determined based on the sample parameter values of the one or more acquisition parameters, the sample initial parameter values of the exposure parameters, and the sample target dose.
  • the determining the sample stable parameter values of the exposure parameters may include performing an iteration process including one or more iterations. At least one iteration of the one or more iterations may include: obtaining an updated image that is captured by scanning the sample object based on updated parameter values of the exposure parameters using the imaging device according to the sample parameter values of the one or more acquisition parameters; determining an updated practical dose corresponding to the updated image; determining whether a dose difference between the updated practical dose and the sample target dose is less than a threshold value; and in response to determining that the dose difference is less than the threshold value, designating the updated parameter values of the exposure parameters as the sample stable parameter values of the exposure parameters.
  • the updated parameter values of the exposure parameters may be determined based on a previous dose difference determined in a previous iteration and an automatic brightness stabilization (ABS) curve of an X-ray generator of the imaging device.
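  • The iteration just described can be sketched as a small control loop. In the sketch below, scan_and_measure_dose and abs_curve_update are hypothetical stand-ins for the device scan (image capture plus practical-dose measurement) and the ABS-curve adjustment, neither of which is given in code form by the disclosure.

```python
def find_stable_exposure(acq_params, init_exp, target_dose,
                         scan_and_measure_dose, abs_curve_update,
                         threshold=0.05, max_iters=20):
    """Iterate until the practical dose is within `threshold` of the target
    dose; the final exposure values label the training sample."""
    exp = dict(init_exp)
    for _ in range(max_iters):
        practical_dose = scan_and_measure_dose(acq_params, exp)  # scan -> image -> dose
        diff = practical_dose - target_dose
        if abs(diff) < threshold:
            return exp  # sample stable parameter values of the exposure parameters
        exp = abs_curve_update(exp, diff)  # next values from the ABS curve and the dose difference
    raise RuntimeError("exposure parameters did not stabilize within max_iters")
```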
  • the training process may include configuring a first weight for the sample parameter values of the one or more acquisition parameters, a second weight for the sample target dose, and a third weight for the sample initial parameter values of the exposure parameters, wherein both the first weight and the second weight are greater than the third weight; and training the preliminary machine learning model based on the sample parameter values of the one or more acquisition parameters with the first weight, the sample target dose with the second weight, and the sample initial parameter values of the exposure parameters with the third weight.
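  • One plausible reading of the weighting scheme above, sketched here under stated assumptions, is to scale each input group by its configured weight before it enters the network; the weight values and the loss are illustrative, and the sketch reuses the hypothetical ExposureParamNet from the earlier example (e.g., model = ExposureParamNet() with optimizer = torch.optim.Adam(model.parameters())).

```python
import torch.nn.functional as F

w_acq, w_dose, w_init = 1.0, 1.0, 0.3  # first and second weights greater than the third (illustrative)

def training_step(model, optimizer, batch):
    # One group of training samples: the three inputs plus the desired (stable) output.
    acq, init_exp, target_dose, stable_exp = batch
    pred = model(w_acq * acq, w_init * init_exp, w_dose * target_dose)
    loss = F.mse_loss(pred, stable_exp)  # penalize deviation from the sample stable values
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```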
  • the trained machine learning model may include a deep learning model.
  • a method for image generation may be implemented on a computing device having at least one processor and at least one storage device, the method may include: obtaining parameter values of one or more acquisition parameters of an imaging device for scanning an object; obtaining initial parameter values of exposure parameters of the imaging device and a target dose associated with the object; determining, by using a trained machine learning model, stable parameter values of the exposure parameters corresponding to the parameter values of the one or more acquisition parameters, the initial parameter values of the exposure parameters, and the target dose; and generating an image by causing the imaging device to scan the object based on the stable parameter values of the exposure parameters.
  • a non-transitory computer readable medium may include at least one set of instructions for image generation.
  • When executed by at least one processor of a computing device, the at least one set of instructions may direct the at least one processor to perform operations including: obtaining parameter values of one or more acquisition parameters of an imaging device for scanning an object; obtaining initial parameter values of exposure parameters of the imaging device and a target dose associated with the object; determining, by using a trained machine learning model, stable parameter values of the exposure parameters corresponding to the parameter values of the one or more acquisition parameters, the initial parameter values of the exposure parameters, and the target dose; and generating an image by causing the imaging device to scan the object based on the stable parameter values of the exposure parameters.
  • the system for model training may include at least one storage device storing executable instructions for generating a trained machine learning model; and at least one processor in communication with the at least one storage device. When executing the set of instructions, the at least one processor is configured to cause the system to perform operations.
  • the operations may include obtaining multiple groups of training samples, each group corresponding to a sample object; and generating the trained machine learning model by training a preliminary machine learning model using the multiple groups of training samples in a training process, wherein each group of the multiple groups of training samples includes sample parameter values of one or more acquisition parameters of an imaging device, sample initial parameter values of exposure parameters of the imaging device, a sample target dose, and sample stable parameter values of the exposure parameters, wherein the sample parameter values of the one or more acquisition parameters, the sample initial parameter values of the exposure parameters, and the sample target dose serve as an input of the preliminary machine learning model and the sample stable parameter values of the exposure parameters serve as desired outputs of the preliminary machine learning model during the training process, the sample stable parameter values of the exposure parameters being determined based on the sample parameter values of the one or more acquisition parameters, the sample initial parameter values of the exposure parameters, and the sample target dose.
  • a method for model training may be implemented on a computing device having at least one processor and at least one storage device, the method may include: obtaining multiple groups of training samples, each group corresponding to a sample object; and generating the trained machine learning model by training a preliminary machine learning model using the multiple groups of training samples in a training process, wherein each group of the multiple groups of training samples includes sample parameter values of one or more acquisition parameters of an imaging device, sample initial parameter values of exposure parameters of the imaging device, a sample target dose, and sample stable parameter values of the exposure parameters, wherein the sample parameter values of the one or more acquisition parameters, the sample initial parameter values of the exposure parameters, and the sample target dose serve as an input of the preliminary machine learning model and the sample stable parameter values of the exposure parameters serve as desired outputs of the preliminary machine learning model during the training process, the sample stable parameter values of the exposure parameters being determined based on the sample parameter values of the one or more acquisition parameters, the sample initial parameter values of the exposure parameters, and the sample target dose.
  • a non-transitory computer readable medium may include at least one set of instructions for model training.
  • When executed by at least one processor of a computing device, the at least one set of instructions may direct the at least one processor to perform operations including: obtaining multiple groups of training samples, each group corresponding to a sample object; and generating the trained machine learning model by training a preliminary machine learning model using the multiple groups of training samples in a training process, wherein each group of the multiple groups of training samples includes sample parameter values of one or more acquisition parameters of an imaging device, sample initial parameter values of exposure parameters of the imaging device, a sample target dose, and sample stable parameter values of the exposure parameters, wherein the sample parameter values of the one or more acquisition parameters, the sample initial parameter values of the exposure parameters, and the sample target dose serve as an input of the preliminary machine learning model and the sample stable parameter values of the exposure parameters serve as desired outputs of the preliminary machine learning model during the training process, the sample stable parameter values of the exposure parameters being determined based on the sample parameter values of the one or more acquisition parameters, the sample initial parameter values of the exposure parameters, and the sample target dose.
  • FIG. 1 is a schematic diagram illustrating an exemplary medical system according to some embodiments of the present disclosure
  • FIG. 2 is a schematic diagram illustrating exemplary hardware and/or software components of an exemplary computing device according to some embodiments of the present disclosure
  • FIG. 3 is a schematic diagram illustrating exemplary hardware and/or software components of an exemplary mobile device according to some embodiments of the present disclosure
  • FIG. 4A is a block diagram illustrating an exemplary processing device according to some embodiments of the present disclosure.
  • FIG. 4B is a block diagram illustrating an exemplary processing device according to some embodiments of the present disclosure.
  • FIG. 5 is a flowchart illustrating an exemplary process for generating an image of a subject according to some embodiments of the present disclosure
  • FIG. 6 is a flowchart illustrating an exemplary process for generating a trained machine learning model for determining exposure parameters according to some embodiments of the present disclosure
  • FIG. 7 is a flowchart illustrating an exemplary process for determining a training sample for a trained machine learning model according to some embodiments of the present disclosure.
  • FIG. 8 is a schematic diagram illustrating an exemplary trained machine learning model according to some embodiments of the present disclosure.
  • The terms “system,” “engine,” “unit,” “module,” and/or “block” used herein are one method to distinguish different components, elements, parts, sections, or assemblies of different levels in ascending order. However, the terms may be replaced by other expressions if they achieve the same purpose.
  • The term “module,” “unit,” or “block,” as used herein, refers to logic embodied in hardware or firmware, or to a collection of software instructions.
  • a module, a unit, or a block described herein may be implemented as software and/or hardware and may be stored in any type of non-transitory computer-readable medium or another storage device.
  • a software module/unit/block may be compiled and linked into an executable program. It will be appreciated that software modules can be callable from other modules/units/blocks or from themselves, and/or may be invoked in response to detected events or interrupts.
  • Software modules/units/blocks configured for execution on computing devices (e.g., processor 210 as illustrated in FIG. 2) may be provided on a computer-readable medium, such as a compact disc, a digital video disc, a flash drive, a magnetic disc, or any other tangible medium, or as a digital download (and can be originally stored in a compressed or installable format that needs installation, decompression, or decryption prior to execution).
  • Such software code may be stored, partially or fully, on a storage device of the executing computing device, for execution by the computing device.
  • Software instructions may be embedded in firmware, such as an EPROM.
  • hardware modules/units/blocks may be included in connected logic components, such as gates and flip-flops, and/or can be included in programmable units, such as programmable gate arrays or processors.
  • modules/units/blocks or computing device functionality described herein may be implemented as software modules/units/blocks but may be represented in hardware or firmware.
  • the modules/units/blocks described herein refer to logical modules/units/blocks that may be combined with other modules/units/blocks or divided into sub-modules/sub-units/sub-blocks despite their physical organization or storage. The description may be applicable to a system, an engine, or a portion thereof.
  • The term “image” in the present disclosure is used to collectively refer to image data (e.g., scan data, projection data) and/or images of various forms, including a two-dimensional (2D) image, a three-dimensional (3D) image, a four-dimensional (4D) image, etc.
  • The terms “pixel” and “voxel” in the present disclosure are used interchangeably to refer to an element of an image.
  • The terms “region,” “location,” and “area” in the present disclosure may refer to a location of an anatomical structure shown in the image or an actual location of the anatomical structure existing in or on a target subject’s body, since the image may indicate the actual location of a certain anatomical structure existing in or on the target subject’s body.
  • an image of a subject may be referred to as the subject for brevity.
  • Segmentation of an image of a subject may be referred to as segmentation of the subject.
  • segmentation of an organ refers to segmentation of a region corresponding to the organ in an image.
  • a method provided in the present disclosure may include obtaining parameter values of one or more acquisition parameters of an imaging device for scanning a subject.
  • the method may further include obtaining initial parameter values of exposure parameters of the imaging device and a target dose associated with the subject.
  • the method may further include determining, by using a trained machine learning model, stable parameter values of the exposure parameters corresponding to the parameter values of the one or more acquisition parameters, the initial parameter values of the exposure parameters, and the target dose.
  • the method may further include generating an image by causing the imaging device to scan the subject based on the stable parameter values of the exposure parameters.
  • the stable parameter values of the exposure parameters may be determined by automatically adjusting the initial parameter values of the exposure parameters multiple times according to an automatic brightness stabilization (ABS) curve of a ray generator of the imaging device, which is time-consuming and causes the scanned subject (e.g., a patient) to be exposed to excessive radiation.
  • the systems and methods of the present disclosure may directly determine the stable parameter values of the exposure parameters using a trained machine learning model based on the parameter values of the acquisition parameters, the initial parameter values of the exposure parameters, and the target dose, thereby improving scanning efficiency and protecting the subject from excessive radiation exposure.
  • FIG. 1 is a schematic diagram illustrating an exemplary imaging system according to some embodiments of the present disclosure.
  • the imaging system 100 may be applied to any application scenario in which radiation rays (e.g., X-rays) are used for generating images and/or providing treatment, such as a computed tomography (CT) system, a digital radiography (DR) system, a C-arm X-ray system, a computed tomography-positron emission tomography (CT-PET) system, or the like, or a combination thereof.
  • the imaging system 100 may include an imaging device 110, a network 120, a terminal device 130, a processing device 140, and a storage device 150.
  • the components in the imaging system 100 may be connected in one or more of various ways.
  • the imaging device 110 may be connected to the processing device 140 through the network 120.
  • the imaging device 110 may be connected to the processing device 140 directly as indicated by the bi-directional arrow in dotted lines linking the imaging device 110 and the processing device 140.
  • the storage device 150 may be connected to the processing device 140 directly or through the network 120.
  • the terminal device 130 may be connected to the processing device 140 directly (as indicated by the bi-directional arrow in dotted lines linking the terminal device 130 and the processing device 140) or through the network 120.
  • the imaging device 110 may be configured to scan a subject using radiation rays and generate imaging data used to generate one or more images relating to the subject.
  • the imaging data relating to at least one part of the subject may include an image (e.g., an image slice) , projection data, or a combination thereof.
  • the imaging data may be two-dimensional (2D) imaging data, three-dimensional (3D) imaging data, four-dimensional (4D) imaging data, or the like, or any combination thereof.
  • the imaging device 110 may transmit the imaging data to the processing device 140 for further processing (e.g., generating one or more images) .
  • the imaging data and/or the one or more images associated with the subject may be stored in the storage device 150 and/or the processing device 140.
  • the imaging device 110 may include a computed tomography (CT) device, a digital radiography (DR) device, a C-arm X-ray device, a digital subtraction angiography (DSA) device, a dynamic spatial reconstructor (DSR) scanner, an X-ray microscopy device, a computed tomography-positron emission tomography (CT-PET) device, a computed tomography-magnetic resonance imaging (CT-MRI) device, or the like, or a combination thereof.
  • one or more components in the imaging system 100 may be omitted.
  • the imaging system 100 may not include the terminal device 130.
  • the imaging device 110 may include a gantry 111, a detector 112, a detecting region 113, a table 114, a ray generator 115, or any other components.
  • the gantry 111 may be configured to provide support for other components (e.g., the ray generator 115, the detector 112, etc. ) of the imaging device 110.
  • the detector 112 and the ray generator 115 may be oppositely mounted on the gantry 111.
  • the gantry 111 may rotate and/or move.
  • the detector 112 and the ray generator 115 may rotate along with the rotation of the gantry 111.
  • the table 114 may be configured to locate and/or support a subject to be scanned.
  • the subject may be placed on the table 114 and moved into the detecting region 113 (e.g., a space between the detector 112 and the ray generator 115) of the imaging device 110.
  • the subject may be biological or non-biological.
  • the subject may include a patient, a man-made subject, etc.
  • the subject may include a specific portion, organ, and/or tissue of the patient.
  • the subject may include the head, the brain, the neck, the body, a shoulder, an arm, the thorax, the heart, the stomach, a blood vessel, a soft tissue, a knee, feet, or the like, or any combination thereof.
  • the ray generator 115 may emit radiation rays to scan the subject that is placed on the table 114.
  • the radiation rays may include X-rays, γ-rays, β-rays, ultraviolet, laser, neutron, proton, or the like, or a combination thereof.
  • the detector 112 may receive the radiation rays that have passed through the subject.
  • the detector 112 may include a plurality of detector units, which may be arranged in a channel direction and a row direction.
  • the detector 112 may include a scintillation detector (e.g., a cesium iodide detector) or a gas detector.
  • the network 120 may include any suitable network that can facilitate the exchange of information and/or data for the imaging system 100.
  • one or more components of the imaging system 100 (e.g., the imaging device 110, the terminal device 130, the processing device 140, the storage device 150) may exchange information and/or data with one or more other components of the imaging system 100 via the network 120.
  • the processing device 140 may obtain image data from the imaging device 110 via the network 120.
  • the processing device 140 may obtain user instruction (s) from the terminal device 130 via the network 120.
  • the network 120 may be any type of wired or wireless network, or a combination thereof.
  • the network 120 may be or include a public network (e.g., the Internet) , a private network (e.g., a local area network (LAN) , a wide area network (WAN) ) , a wired network (e.g., an Ethernet network) , a wireless network (e.g., an 802.11 network, a Wi-Fi network, etc. ) , a cellular network (e.g., a Long Term Evolution (LTE) network) , a frame relay network, a virtual private network (VPN) , a satellite network, a telephone network, routers, hubs, switches, server computers, and/or any combination thereof.
  • the network 120 may include a cable network, a wireline network, a fiber-optic network, a telecommunications network, an intranet, a wireless local area network (WLAN), a metropolitan area network (MAN), a public telephone switched network (PSTN), a Bluetooth™ network, a ZigBee™ network, a near field communication (NFC) network, or the like, or any combination thereof.
  • the network 120 may include one or more network access points.
  • the network 120 may include wired and/or wireless network access points such as base stations and/or internet exchange points through which one or more components of the imaging system 100 may be connected to the network 120 to exchange data and/or information.
  • the terminal device 130 may be connected to and/or communicate with the imaging device 110, the processing device 140, and/or the storage device 150.
  • the terminal device 130 may obtain a processed image from the processing device 140.
  • the terminal device 130 may enable user interactions with the imaging system 100.
  • the terminal device 130 may include a mobile device 131, a tablet computer 132, a laptop computer 133, or the like, or any combination thereof.
  • the mobile device 131 may include a mobile phone, a personal digital assistant (PDA) , a gaming device, a navigation device, a point of sale (POS) device, a laptop, a tablet computer, a desktop, or the like, or any combination thereof.
  • the terminal device 130 may include an input device, an output device, etc.
  • the input device may include alphanumeric and other keys that may be input via a keyboard, a touch screen (for example, with haptics or tactile feedback) , a speech input, an eye-tracking input, a brain monitoring system, or any other comparable input mechanism.
  • the input information received through the input device may be transmitted to the processing device 140 via, for example, a bus, for further processing.
  • Other types of input devices may include a cursor control device, such as a mouse, a trackball, or cursor direction keys.
  • the output device may include a display, a speaker, a printer, or the like, or a combination thereof.
  • the terminal device 130 may be part of the processing device 140.
  • the processing device 140 may process data and/or information obtained from the imaging device 110, the storage device 150, the terminal device 130, or other components of the imaging system 100. For example, the processing device 140 may determine stable parameter values of exposure parameters corresponding to parameter values of one or more acquisition parameters, initial parameter values of the exposure parameters, and a target dose based on a trained machine learning model. As another example, the processing device 140 may obtain multiple groups of training samples and generate the trained machine learning model by training a preliminary machine learning model using the multiple groups of training samples. As a further example, for determining one group of training samples, the processing device 140 may direct the imaging device 110 to perform an iteration operation until an image satisfying a preset condition is generated.
  • the processing device 140 may determine the parameter values of the exposure parameters that are used to generate the image as the sample stable parameter values of the exposure parameters.
  • the trained machine learning model may be updated from time to time, e.g., periodically or not, based on a sample set that is at least partially different from the original sample set from which the original trained machine learning model is determined. For instance, the trained machine learning model may be updated based on a sample set including new samples that are not in the original sample set, samples processed using the machine learning model in connection with the original trained machine learning model of a prior version, or the like, or a combination thereof.
  • the generation and/or updating of the trained machine learning model may be performed on a processing device, while the application of the trained machine learning model may be performed on a different processing device.
  • the determination and/or updating of the trained machine learning model may be performed on a processing device of a system different from the imaging system 100 or a server different from a server including the processing device 140 on which the application of the trained machine learning model is performed.
  • the determination and/or updating of the trained machine learning model may be performed on a first system of a vendor who provides and/or maintains such a machine learning model and/or has access to training samples used to determine and/or update the trained machine learning model, while exposure parameter determination based on the provided trained machine learning model may be performed on a second system of a client of the vendor.
  • the determination and/or updating of the trained machine learning model may be performed online in response to a request for scanning a subject.
  • the determination and/or updating of the trained machine learning model may be performed offline.
  • the trained machine learning model may be generated and/or updated (or maintained) by, e.g., the manufacturer of the imaging device 110 or a vendor.
  • the manufacturer or the vendor may load the trained machine learning model into the imaging system 100 or a portion thereof (e.g., the processing device 140) before or during the installation of the imaging device 110 and/or the processing device 140, and maintain or update the trained machine learning model from time to time (periodically or not) .
  • the maintenance or update may be achieved by installing a program stored on a storage device (e.g., a compact disc, a USB drive, etc. ) or retrieved from an external source (e.g., a server maintained by the manufacturer or vendor) via the network 120.
  • the program may include a new model (e.g., a new trained machine learning model) or a portion of a model that substitutes or supplements a corresponding portion of the model.
  • the processing device 140 may be a single server or a server group.
  • the server group may be centralized or distributed.
  • the processing device 140 may be local to or remote from the imaging system 100.
  • the processing device 140 may access information and/or data from the imaging device 110, the storage device 150, and/or the terminal device 130 via the network 120.
  • the processing device 140 may be directly connected to the imaging device 110, the terminal device 130, and/or the storage device 150 to access information and/or data.
  • the processing device 140 may be implemented on a cloud platform.
  • the cloud platform may include a private cloud, a public cloud, a hybrid cloud, a community cloud, a distributed cloud, an inter-cloud, a multi-cloud, or the like, or a combination thereof.
  • the processing device 140 may be implemented on a computing device 200 having one or more components as described in connection with FIG. 2.
  • the storage device 150 may store data, instructions, and/or any other information.
  • the storage device 150 may store data obtained from the processing device 140 and/or the terminal device 130.
  • the storage device 150 may store one or more images obtained from the processing device 140.
  • the storage device 150 may store data and/or instructions that the processing device 140 may execute or use to perform exemplary methods described in the present disclosure.
  • the storage device 150 may include a mass storage device, a removable storage device, a volatile read-and-write memory, a read-only memory (ROM) , or the like, or any combination thereof.
  • Exemplary mass storage devices may include a magnetic disk, an optical disk, a solid-state drive, etc.
  • Exemplary removable storage devices may include a flash drive, a floppy disk, an optical disk, a memory card, a zip disk, a magnetic tape, etc.
  • Exemplary volatile read-and-write memory may include a random access memory (RAM) .
  • Exemplary RAM may include a dynamic RAM (DRAM), a double data rate synchronous dynamic RAM (DDR SDRAM), a static RAM (SRAM), a thyristor RAM (T-RAM), a zero-capacitor RAM (Z-RAM), etc.
  • Exemplary ROM may include a mask ROM (MROM) , a programmable ROM (PROM) , an erasable programmable ROM (EPROM) , an electrically erasable programmable ROM (EEPROM) , a compact disk ROM (CD-ROM) , and a digital versatile disk ROM, etc.
  • the storage device 150 may be implemented on a cloud platform as described elsewhere in the disclosure.
  • the storage device 150 may be connected to the network 120 to communicate with one or more other components of the imaging system 100 (e.g., the processing device 140, the terminal device 130, etc. ) .
  • One or more components of the imaging system 100 may access the data or instructions stored in the storage device 150 via the network 120.
  • the storage device 150 may be part of the processing device 140.
  • the storage device 150 may be a data storage device including cloud computing platforms, such as a public cloud, a private cloud, a community cloud, a hybrid cloud, etc. However, those variations and modifications do not depart from the scope of the present disclosure.
  • FIG. 2 is a schematic diagram illustrating exemplary hardware and/or software components of an exemplary computing device on which the processing device 140 may be implemented according to some embodiments of the present disclosure.
  • a computing device 200 may include a processor 210, a storage 220, an input/output (I/O) 230, and a communication port 240.
  • the processor 210 may execute computer instructions (e.g., program code) and perform functions of the processing device 140 in accordance with techniques described herein.
  • the computer instructions may include, for example, routines, programs, subjects, components, data structures, procedures, modules, and functions, which perform particular functions described herein.
  • the processor 210 may process image data obtained from the imaging device 110, the terminal device 130, the storage device 150, and/or any other component of the imaging system 100.
  • the processor 210 may include one or more hardware processors, such as a microcontroller, a microprocessor, a reduced instruction set computer (RISC), an application-specific integrated circuit (ASIC), an application-specific instruction-set processor (ASIP), a central processing unit (CPU), a graphics processing unit (GPU), a physics processing unit (PPU), a microcontroller unit, a digital signal processor (DSP), a field-programmable gate array (FPGA), an advanced RISC machine (ARM), a programmable logic device (PLD), any circuit or processor capable of executing one or more functions, or the like, or any combination thereof.
  • the computing device 200 in the present disclosure may also include multiple processors, and thus operations and/or method operations that are performed by one processor as described in the present disclosure may also be jointly or separately performed by the multiple processors.
  • if the processor of the computing device 200 executes both operation A and operation B, it should be understood that operation A and operation B may also be performed by two or more different processors jointly or separately in the computing device 200 (e.g., a first processor executes operation A and a second processor executes operation B, or the first and second processors jointly execute operations A and B).
  • the storage 220 may store data/information obtained from the imaging device 110, the terminal device 130, the storage device 150, and/or any other component of the imaging system 100.
  • the storage 220 may include a mass storage device, a removable storage device, a volatile read-and-write memory, a read-only memory (ROM) , or the like, or any combination thereof.
  • the storage 220 may store one or more programs and/or instructions to perform exemplary methods described in the present disclosure.
  • the storage 220 may store a program for the processing device 140 for determining stable exposure parameters of an X-ray generator of the imaging device for scanning a subject.
  • the I/O 230 may input and/or output signals, data, information, etc. In some embodiments, the I/O 230 may enable a user interaction with the processing device 140. In some embodiments, the I/O 230 may include an input device and an output device. Exemplary input devices may include a keyboard, a mouse, a touch screen, a microphone, or the like, or a combination thereof. Exemplary output devices may include a display device, a loudspeaker, a printer, a projector, or the like, or a combination thereof.
  • Exemplary display devices may include a liquid crystal display (LCD) , a light-emitting diode (LED) -based display, a flat panel display, a curved screen, a television device, a cathode ray tube (CRT) , a touch screen, or the like, or a combination thereof.
  • the communication port 240 may be connected to a network (e.g., the network 120) to facilitate data communications.
  • the communication port 240 may establish connections between the processing device 140 and the imaging device 110, the terminal device 130, and/or the storage device 150.
  • the connection may be a wired connection, a wireless connection, any other communication connection that can enable data transmission and/or reception, and/or any combination of these connections.
  • the wired connection may include, for example, an electrical cable, an optical cable, a telephone wire, or the like, or any combination thereof.
  • the wireless connection may include, for example, a Bluetooth™ link, a Wi-Fi™ link, a WiMax™ link, a WLAN link, a ZigBee™ link, a mobile network link (e.g., 3G, 4G, 5G), or the like, or a combination thereof.
  • the communication port 240 may be and/or include a standardized communication port, such as RS232, RS485, etc.
  • the communication port 240 may be a specially designed communication port.
  • the communication port 240 may be designed in accordance with the digital imaging and communications in medicine (DICOM) protocol.
  • FIG. 3 is a schematic diagram illustrating exemplary hardware and/or software components of an exemplary mobile device according to some embodiments of the present disclosure.
  • one or more components (e.g., the terminal device 130 and/or the processing device 140) of the imaging system 100 may be implemented on the mobile device 300.
  • the mobile device 300 may include a communication platform 310, a display 320, a graphic processing unit (GPU) 330, a central processing unit (CPU) 340, an I/O 350, a memory 360, and a storage 390.
  • any other suitable component including but not limited to a system bus or a controller (not shown) , may also be included in the mobile device 300.
  • a mobile operating system 370 (e.g., iOS™, Android™, Windows Phone™) and one or more applications 380 may be loaded into the memory 360 from the storage 390 in order to be executed by the CPU 340.
  • the applications 380 may include a browser or any other suitable mobile apps for receiving and rendering information relating to image processing or other information from the processing device 140.
  • User interactions with the information stream may be achieved via the I/O 350 and provided to the processing device 140 and/or other components of the imaging system 100 via the network 120.
  • computer hardware platforms may be used as the hardware platform (s) for one or more of the elements described herein.
  • a computer with user interface elements may be used to implement a personal computer (PC) or any other type of work station or terminal device.
  • a computer may also act as a server if appropriately programmed.
  • FIGs. 4A and 4B are block diagrams illustrating exemplary processing devices according to some embodiments of the present disclosure.
  • the processing devices 140A and 140B may be exemplary processing devices 140 as described in connection with FIG. 1.
  • the processing device 140A may be configured to generate a stable image of a subject.
  • the processing device 140B may be configured to generate a trained machine learning model for exposure parameter determination.
  • the processing devices 140A and 140B may be respectively implemented on a processing unit (e.g., the processor 210 illustrated in FIG. 2 or the CPU 340 illustrated in FIG. 3) .
  • the processing device 140A may be implemented on a CPU 340 of a terminal device, and the processing device 140B may be implemented on a computing device 200.
  • the processing devices 140A and 140B may be implemented on a same computing device 200 or a same CPU 340.
  • the processing device 140A may include an acquisition module 410, an exposure parameter determination module 420, and a scanning module 430.
  • the acquisition module 410 may be configured to obtain parameter values of one or more acquisition parameters of an imaging device for scanning an object.
  • the acquisition module 410 may further be configured to obtain initial parameter values of exposure parameters of the imaging device and a target dose associated with the object.
  • the exposure parameter determination module 420 may be configured to determine stable parameter values of the exposure parameters corresponding to the parameter values of the one or more acquisition parameters, the initial parameter values of the exposure parameters, and the target dose by using a trained machine learning model. For example, the exposure parameter determination module 420 may input the parameter values of the acquisition parameter (s) , the initial exposure parameters, and the target dose into the trained machine learning model, and the trained machine learning model may output the stable exposure parameters.
  • the scanning module 430 may be configured to generate an image (also referred to as a stable image) of the object by directing the imaging device to scan the object based on the stable exposure parameters.
  • the processing device 140B may include an acquisition module 440, a sample generation module 450, and a model determination module 460.
  • the acquisition module 440 may be configured to obtain multiple groups of training samples.
  • Each group of the multiple groups of training samples may include sample parameter values of one or more acquisition parameters of an imaging device, a sample target dose, sample initial parameter values of exposure parameters of the imaging device, and sample stable parameter values of the exposure parameters.
  • Each group of the multiple groups of training samples may correspond to a sample object.
  • the sample generation module 450 may be configured to determine a training sample for a trained machine learning model. For example, the sample generation module 450 may obtain sample parameter values of one or more acquisition parameters of an imaging device for scanning a sample object and sample initial parameter values of exposure parameters of the imaging device. The sample generation module 450 may further obtain an initial image that is captured by scanning the sample object based on the sample initial exposure parameters using the imaging device according to the sample parameter values of the acquisition parameter (s) . The sample generation module 450 may determine a practical dose corresponding to the initial image. The sample generation module 450 may determine whether a dose difference between the practical dose and a sample target dose is less than a threshold value.
  • in response to a determination that the dose difference is less than the threshold value, the sample generation module 450 may designate the sample initial exposure parameters as sample stable exposure parameters. In response to a determination that the dose difference is not less than the threshold value, the sample generation module 450 may obtain an updated image that is captured by scanning the sample object based on updated parameter values of the exposure parameters (also referred to as updated exposure parameters) using the imaging device according to the sample parameter values of the acquisition parameter (s).
  • the model determination module 460 may be configured to generate a trained machine learning model by training a preliminary machine learning model using the multiple groups of training samples in a training process.
  • the processing device 140A and/or the processing device 140B may share two or more of the modules, and any one of the modules may be divided into two or more units.
  • the processing devices 140A and 140B may share a same acquisition module, that is, the acquisition module 410 and the acquisition module 440 are a same module.
  • the processing device 140A and/or the processing device 140B may include one or more additional modules, such as a storage module (not shown) for storing data. In some embodiments, the processing device 140A and the processing device 140B may be integrated into one processing device 140.
  • FIG. 5 is a flowchart illustrating an exemplary process for generating an image of a subject according to some embodiments of the present disclosure.
  • a process 500 may be implemented as a set of instructions (e.g., an application) stored in the storage device 150, the storage 220, or the storage 390.
  • the processing device 140A (e.g., implemented on the processor 210, the CPU 340, and/or one or more modules illustrated in FIG. 4A) may execute the set of instructions and may accordingly be directed to perform the process 500.
  • the operations of the illustrated process 500 presented below are intended to be illustrative. In some embodiments, the process 500 may be accomplished with one or more additional operations not described, and/or without one or more of the operations discussed. Additionally, the order of the operations of the process 500 illustrated in FIG. 5 and described below is not intended to be limiting.
  • the processing device 140A may obtain parameter values of one or more acquisition parameters of an imaging device for scanning a subject.
  • the subject may be biological or non-biological.
  • the subject may include a patient (or a portion thereof) , a man-made subject (e.g., a phantom) , etc., as described elsewhere in the present disclosure (e.g., FIG. 1 and the descriptions thereof) .
  • the imaging device may include a ray generator, a detector, a table, etc., in connection with the imaging device 110 as described in FIG. 1.
  • an X-ray generator may be taken as an example of the ray generator of the imaging device.
  • the imaging device may include a DR device, a C-arm X-ray device, etc.
  • the acquisition parameters may also be referred to as intrinsic characteristic parameters.
  • the intrinsic characteristic parameters may refer to parameters of the imaging device that do not change once they are set during a scanning process.
  • the acquisition parameter (s) may include a subject type of a subject to be scanned, an acquisition type, a filtration mode, a source image receptor distance (SID), a source subject distance (SOD), an image subject distance (IOD), a focal spot size, a beam field size, or the like, or any combination thereof.
  • the subject type refers to a category to which the subject belongs.
  • the subject type may include a biological type and a non-biological type.
  • different organs and/or tissues may correspond to different subject types.
  • the acquisition type may be related to an application scenario of the imaging device.
  • the acquisition type may be related to an X-ray dose on the detector.
  • the acquisition type may include an exposure type and a transmission type.
  • a dose corresponding to the exposure type may be higher than a dose corresponding to the transmission type.
  • the filtration mode may specify a filter material placed in front of the X-ray generator in order to reduce the intensity of particular X-ray wavelengths in its spectrum and to selectively alter the distribution of X-ray wavelengths within a given X-ray beam.
  • Exemplary filter materials may include aluminum, copper, silver, iron, and so on.
  • the thicknesses of the filters may differ, which results in different doses of X-rays being transmitted through the filters.
  • the SID refers to a distance between a focal spot of the X-ray generator and the detector (e.g., an imaging region of the detector) .
  • the SOD refers to a distance between the focal spot of the X-ray generator and the subject (e.g., a region of interest (ROI) of the subject) .
  • the IOD refers to a distance between the subject and the detector (e.g., an imaging region of the detector) .
  • the SID may be equal to a sum of the SOD and IOD, and the focal spot of the X-ray generator, a center of the imaging region of the detector, and a center of the ROI of the subject may be in a straight line.
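  • For example, with hypothetical distances SOD = 70 cm and IOD = 30 cm, the collinearity assumption gives SID = SOD + IOD = 70 cm + 30 cm = 100 cm.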
  • the acquisition parameters may also include information of the subject. For example, if the subject is a patient, the information may include a body thickness of the patient, an age of the patient, etc.
  • different subjects to be scanned may correspond to different parameter values of the acquisition parameter (s) . That is, the parameter values of the acquisition parameter (s) may be associated with the subject type of the subject to be scanned. For example, parameter values of the acquisition parameter (s) for scanning the head may be different from parameter values of the acquisition parameter (s) for scanning the heart.
  • the parameter values of the acquisition parameter (s) may be set according to a default setting of the imaging system 100 or preset by a user or operator via the terminal device 130. For example, multiple sets of candidate parameter values of the acquisition parameter (s) may be stored in a storage device (e.g., the storage device 150) as multiple scan protocols.
  • the processing device 140A may determine the parameter values of the acquisition parameter (s) by selecting a scan protocol from the storage device. In some embodiments, a user (e.g., a doctor) may adjust the parameter values in the scan protocol via the terminal device 130.
  • the processing device 140A may obtain initial parameter values of exposure parameters of the imaging device and a target dose associated with the subject.
  • the initial parameter values of the exposure parameters may also be referred to as initial exposure parameters.
  • the exposure parameters may refer to parameters of an X-ray generator (e.g., a high voltage generator) of the imaging device used to capture an image of the subject.
  • Parameter values of the exposure parameters may affect an X-ray dose on the detector of the imaging device and a brightness of an image of the subject obtained based on the parameter values of the exposure parameters.
  • the brightness of the image may refer to an average value of gray values of pixels in at least a portion (e.g., an ROI) of the image.
  • a dose (or an X-ray dose) disclosed in the present disclosure may refer to a dose on the detector of the imaging device.
  • the target dose may refer to a desired dose on the detector of the imaging device.
  • the target dose may be related to a target brightness of a desired image (e.g., an image whose resolution exceeds a resolution threshold) .
  • An X-ray dose corresponding to the desired image may be close to or the same as the target dose.
  • “close to or the same as” may indicate that the deviation between the X-ray dose corresponding to the desired image and the target dose does not exceed a threshold, e.g., 30%, 20%, 15%, 10%, or 5% of one of the two doses.
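  • A minimal sketch of that deviation test, assuming the target dose as the reference quantity and an illustrative 10% threshold:

```python
def dose_close_enough(practical_dose: float, target_dose: float,
                      rel_threshold: float = 0.10) -> bool:
    """True if the practical dose deviates from the target dose by no more
    than rel_threshold of the target dose (assumed reference quantity)."""
    return abs(practical_dose - target_dose) <= rel_threshold * target_dose
```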
  • the exposure parameters of the imaging device may include a tube voltage of the X-ray generator, a tube current of the X-ray generator, a radiation time, etc.
  • the tube voltage refers to a voltage between a cathode of the X-ray generator and an anode of the X-ray generator while the X-ray generator is emitting X-ray beams.
  • the tube current refers to a current between the cathode of the X-ray generator and the anode of the X-ray generator while the X-ray generator is emitting X-ray beams.
  • different subjects to be scanned may correspond to different initial exposure parameters.
  • initial exposure parameters for scanning the head may be different from initial exposure parameters for scanning the heart.
  • the initial exposure parameters and/or the target dose may be set according to a default setting of the imaging system 100 or preset by a user or operator via the terminal device 130.
  • multiple sets of candidate initial exposure parameters may also be stored in the storage device.
  • the processing device 140A may select the initial exposure parameters from the multiple sets of candidate initial exposure parameters based on the subject type of the subject.
  • each set of candidate initial exposure parameters may be stored with each set of candidate parameter values of the acquisition parameter (s) as a scan protocol.
  • each scan protocol may include a set of candidate initial exposure parameters and a set of candidate parameter values of the acquisition parameter (s) .
  • the processing device 140A may directly determine the parameter values of the acquisition parameter (s) and the initial exposure parameters by selecting the corresponding scan protocol.
  • a user (e.g., a doctor) may adjust the selected parameter values via the terminal device 130.
  • the initial exposure parameters may be determined based on one or more experiments on a phantom (e.g., water) .
  • the imaging device 110 may scan the subject under different sets of parameter values of the exposure parameters to generate different initial images.
  • the processing device 140A may determine parameter values of the exposure parameters corresponding to an initial image that satisfies a preset condition (e.g., a brightness is greater than a brightness threshold) as the initial exposure parameters.
  • the processing device 140A may determine stable parameter values of the exposure parameters corresponding to the parameter values of the one or more acquisition parameters, the initial parameter values of the exposure parameters, and the target dose by using a trained machine learning model.
  • the stable parameter values of the exposure parameters may also be referred to as stable exposure parameters.
  • the processing device 140A may input the parameter values of the acquisition parameter (s) , the initial exposure parameters, and the target dose into the trained machine learning model, and the trained machine learning model may output the stable exposure parameters.
  • the stable exposure parameters may refer to parameters of the X-ray generator for generating a stable image in which a dose difference between an X-ray dose (also referred to as a practical dose) corresponding to the stable image and the target dose is less than a threshold value.
  • that the stable exposure parameters correspond to the parameter values of the acquisition parameter (s) , the initial exposure parameters, and the target dose means that the stable exposure parameters are determined by the imaging device based on the initial exposure parameters and the target dose by performing an iteration process as illustrated in FIG. 7.
  • the trained machine learning model may include a deep learning model, a neural network model, etc.
  • the trained machine learning model may be generated by training a preliminary machine learning model using multiple groups of training samples.
  • Each group of the training samples may include sample parameter values of the acquisition parameter (s) , sample initial exposure parameters, a sample target dose, and sample stable exposure parameters.
  • the sample stable exposure parameters may correspond to the sample parameter values of the acquisition parameter (s) , the sample initial exposure parameters, and the sample target dose.
  • the sample parameter values of the acquisition parameter (s) , the sample initial exposure parameters, and the sample target dose may be used as an input of the preliminary machine learning model, and the sample stable exposure parameters may be used as desired outputs (i.e., Ground Truth) of the preliminary machine learning model. More descriptions regarding the training of the preliminary machine learning model may be found elsewhere in the present disclosure (e.g., FIG. 6 and the descriptions thereof) .
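For concreteness, one group of training samples can be pictured as an (input, ground-truth) pair. The Python sketch below is an assumption-laden illustration: the class name, field names, and the flat-vector encoding are hypothetical, since the disclosure only specifies which quantities form the input and which serve as the desired output.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class TrainingSample:
    # Hypothetical encoding of one group of training samples.
    acquisition_params: List[float]  # e.g., encoded subject type, SID, SOD, filtration mode
    initial_exposure: List[float]    # e.g., [kV_initial, mA_initial, ms_initial]
    target_dose: float
    stable_exposure: List[float]     # ground truth, e.g., [kV_stable, mA_stable, ms_stable]

    def model_input(self) -> List[float]:
        # Serves as the input of the preliminary machine learning model.
        return self.acquisition_params + self.initial_exposure + [self.target_dose]

    def desired_output(self) -> List[float]:
        # Serves as the desired output (Ground Truth) during training.
        return self.stable_exposure
```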
  • different trained machine learning models may be generated for different subject types. For example, multiple groups of training samples of hearts may be used to train a first preliminary machine learning model, and multiple groups of training samples of heads may be used to train a second preliminary machine learning model.
  • the first preliminary machine learning model, the second preliminary machine learning model, and the preliminary machine learning model of the trained machine learning model may be different or the same.
  • the processing device 140A may transmit the stable exposure parameters to a terminal device (e.g., the terminal device 130 in the imaging system 100) .
  • a user may further adjust the stable exposure parameters via the terminal device.
  • the processing device 140A may generate an image (also be referred to as a stable image) of the subject by directing the imaging device to scan the subject based on the stable exposure parameters.
  • the imaging device may scan the subject using the parameter values of the acquisition parameter (s) and the stable exposure parameters.
  • a dose difference between an X-ray dose corresponding to the stable image and the target dose may be less than a threshold value.
  • the dose difference may be smaller than or equal to a certain percentage (e.g., 5%, 10%, 15%, 20%, etc. ) of the target dose.
  • the user may further adjust the exposure parameters to obtain a new image.
  • one or more operations may be omitted and/or one or more additional operations may be added.
  • operation 510 and operation 520 may be combined into a single operation.
  • the process 500 may include an additional operation to transmit the stable image to the terminal device 130 for display.
  • the process 500 may be executed by a cloud server to reduce the performance requirements and/or the workload of a local processing device.
  • a special trained machine learning model may be generated by training a special preliminary machine learning model using a plurality of groups of special training samples.
  • Each group of the special training samples may include sample parameter values of the acquisition parameter (s) , sample initial exposure parameters, a sample target brightness, and sample stable exposure parameters.
  • the sample stable exposure parameters may correspond to the sample parameter values of the acquisition parameter (s) , the sample initial exposure parameters, and the sample target brightness.
  • the sample parameter values of the acquisition parameter (s) , the sample initial exposure parameters, and the sample target brightness may be used as an input of the special preliminary machine learning model, and the sample stable exposure parameters may be used as desired outputs (i.e., Ground Truth) of the special preliminary machine learning model.
  • the processing device 140A may convert the target dose into a target brightness.
  • the processing device 140A may input the parameter values of the acquisition parameters, the initial exposure parameters, and the target brightness into the special trained machine learning model to determine the stable exposure parameters.
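A plausible form of the dose-to-brightness conversion, consistent with the linear relation between gray values and doses given later in connection with Equation (1), is sketched below; the function name and the coefficients c1 and c2 are assumptions.

```python
# Hedged sketch: convert a target dose into a target brightness using the
# detector-gain coefficient (c1) and the filter coefficient (c2), assuming
# the linear gray-value/dose relation of Equation (1) applies.
def target_brightness_from_dose(target_dose: float, c1: float, c2: float) -> float:
    return target_dose * c1 * c2  # target brightness (average gray value)
```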
  • the process 600 may be accomplished with one or more additional operations not described and/or without one or more of the operations discussed. Additionally, the order of the operations of process 600 illustrated in FIG. 6 and described below is not intended to be limiting. In some embodiments, the trained machine learning model described in connection with operation 530 in FIG. 5 may be obtained according to the process 600. In some embodiments, the process 600 may be performed by another device or system other than the imaging system 100, e.g., a device or system of a vendor or a manufacturer. For illustration purposes, the implementation of the process 600 by the processing device 140B is described as an example.
  • the processing device 140B may obtain multiple groups of training samples.
  • Each group of the multiple groups of training samples may include sample parameter values of one or more acquisition parameters of an imaging device (e.g., the imaging device 110) , a sample target dose, sample initial parameter values of exposure parameters (also referred to as sample initial exposure parameters) of the imaging device, and sample stable parameter values of the exposure parameters (also referred to as sample stable exposure parameters) .
  • Each group of the multiple groups of training samples may correspond to a sample subject.
  • the sample parameter values of the acquisition parameter (s) , the sample initial exposure parameters, and the sample target dose in each group of the multiple groups of training samples may be used as an input of a preliminary machine learning model during a training process of the preliminary machine learning model.
  • the sample stable exposure parameters corresponding to the sample parameter values of the acquisition parameter (s) , the sample initial exposure parameters, and the sample target dose in each group of the multiple groups of training samples may be used as desired outputs of the preliminary machine learning model during the training process of the preliminary machine learning model.
  • the sample subject corresponding to one or more of the multiple groups of training samples may be a subject as described elsewhere in the present disclosure (e.g., FIG. 5 and the descriptions thereof) .
  • the sample stable exposure parameters may be determined based on the sample parameter values of the acquisition parameter (s) , the sample target dose, and the sample initial exposure parameters.
  • the sample stable exposure parameters may be determined using the imaging device to perform an iteration process on the sample subject based on the sample parameter values of the acquisition parameter (s) and the sample initial exposure parameters.
  • a dose difference between an X-ray dose corresponding to the generated image and the sample target dose may be less than a threshold value.
  • the dose difference may be smaller than or equal to a certain percentage (e.g., 5%, 10%, 15%, 20%, etc. ) of the sample target dose. More descriptions for determining the sample stable exposure parameters of each group of training samples may be found in FIG. 7 and the descriptions thereof.
  • the processing device 140B may generate a trained machine learning model by training a preliminary machine learning model using the multiple groups of training samples in a training process.
  • the trained machine learning model may be constructed based on a convolutional neural network (CNN) model, a fully convolutional neural network (FCN) model, a recurrent neural network (RNN) model, a fast-RCNN model, a generative adversarial network (GAN) model, or the like, or any combination thereof.
  • the preliminary machine learning model may include a plurality of parameters, such as architecture parameters, learning parameters, etc.
  • the plurality of parameters may also be referred to as training parameters.
  • One or more parameter values of the plurality of parameters (e.g., the learning parameters) may be altered during the training of the preliminary machine learning model using the multiple groups of training samples.
  • the parameter values of the plurality of parameters may be initialized, set, and/or adjusted before the training of the preliminary machine learning model to obtain an initialized preliminary machine learning model.
  • Exemplary parameters of the preliminary machine learning model may include the size of a kernel of a layer, the total count (or number) of layers, the count (or number) of nodes in each layer, a learning rate, a batch size, an epoch, a connected weight between two connected nodes, a bias vector relating to a node, etc.
  • the preliminary machine learning model may be trained based on the multiple groups of training samples using a machine learning algorithm.
  • Exemplary machine learning algorithms may include a gradient descent algorithm, Newton’s algorithm, an artificial neural network algorithm, a deep learning algorithm, a decision tree algorithm, an association rule algorithm, an inductive logic programming algorithm, a support vector machine algorithm, a clustering algorithm, a Bayesian network algorithm, a reinforcement learning algorithm, a representation learning algorithm, a similarity and metric learning algorithm, a sparse dictionary learning algorithm, a genetic algorithm, a rule-based machine learning algorithm, or the like, or any combination thereof.
  • the trained machine learning model may be obtained by performing a plurality of iterations to iteratively update one or more parameter values of the preliminary machine learning model.
  • the parameter values of the preliminary machine learning model may be initialized.
  • the connected weights and/or the bias vector of nodes of the preliminary machine learning model may be initialized by assigning random values in a range, e.g., the range from -1 to 1.
  • all the connected weights of the preliminary machine learning model may be assigned the same value in the range from -1 to 1, for example, 0.
  • the bias vector of nodes in the preliminary machine learning model may be initialized by assigning random values in a range from 0 to 1.
  • the parameter values of the preliminary machine learning model may be initialized based on a Gaussian random algorithm, a Xavier algorithm, etc.
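The initialization schemes mentioned above can be sketched as follows; the helper name, the layer shapes, and the uniform variant of Xavier initialization are assumptions.

```python
import numpy as np

rng = np.random.default_rng()

def init_layer(n_in: int, n_out: int, scheme: str = "uniform"):
    # Illustrative initialization of connected weights and a bias vector.
    if scheme == "uniform":
        w = rng.uniform(-1.0, 1.0, size=(n_in, n_out))    # random values in [-1, 1]
    elif scheme == "xavier":
        limit = np.sqrt(6.0 / (n_in + n_out))             # Xavier: scale by fan-in/fan-out
        w = rng.uniform(-limit, limit, size=(n_in, n_out))
    else:
        w = rng.normal(0.0, 1.0, size=(n_in, n_out))      # Gaussian random initialization
    b = rng.uniform(0.0, 1.0, size=n_out)                 # bias values in [0, 1]
    return w, b
```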
  • a specific group of training samples may first be input into the preliminary machine learning model.
  • specific sample parameter values of the acquisition parameter (s) , specific sample initial exposure parameters, and a specific sample target dose in the specific group of training samples may be input into an input layer (e.g., the input layer shown in FIG. 8) of the preliminary machine learning model, and the specific sample stable exposure parameters corresponding to them may be provided to an output layer (e.g., the output layer shown in FIG. 8) of the preliminary machine learning model as the desired outputs of the preliminary machine learning model.
  • the specific sample parameter values of the acquisition parameter (s) , the specific sample initial exposure parameters, and the specific sample target dose may be processed by one or more layers of the preliminary machine learning model to generate estimated stable exposure parameters.
  • the estimated stable exposure parameters may be compared with the specific sample stable exposure parameters based on a cost function of the preliminary machine learning model.
  • the cost function (also referred to as a loss function) of the preliminary machine learning model may be configured to assess a difference between estimated values (e.g., predicted outputs or estimated stable exposure parameters) of the preliminary machine learning model and expected values (e.g., the desired outputs or the input specific sample stable exposure parameters) .
  • the value of the cost function may be used to evaluate the accuracy and reliability of the updated preliminary machine learning model; for example, the smaller the value of the cost function is, the more reliable the updated preliminary machine learning model is.
  • Exemplary cost functions may include an L1 loss function, a focal loss function, a log loss function, a cross-entropy loss function, a Dice loss function, etc. If the value of the cost function exceeds a threshold in a current iteration, the parameter values of the preliminary machine learning model may be adjusted and/or updated in order to decrease the value of the cost function (i.e., the difference between the predicted outputs and the input specific sample stable exposure parameters) to below the threshold. Accordingly, in a next iteration, another group of training samples may be input into the preliminary machine learning model to train the preliminary machine learning model as described above.
  • the plurality of iterations may be performed to update the parameter values of the preliminary machine learning model until a termination condition is satisfied.
  • the termination condition may provide an indication of whether the preliminary machine learning model is sufficiently trained.
  • the termination condition may relate to the cost function or an iteration count of the iterative process or training process. For example, the termination condition may be satisfied if the value of the cost function associated with the preliminary machine learning model is minimal or smaller than a threshold (e.g., a constant) . As another example, the termination condition may be satisfied if the value of the cost function converges. The convergence may be deemed to have occurred if the variation of the values of the cost function in two or more consecutive iterations is smaller than a threshold (e.g., a constant) .
  • the termination condition may be satisfied when a specified number (or count) of iterations are performed in the training process.
  • the trained machine learning model may be determined based on the updated parameter values.
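The iterative procedure described above can be summarized in a short training skeleton. This is a sketch under stated assumptions: `model` is a generic object exposing hypothetical `forward`/`backward` methods, the cost function is taken to be mean squared error, and the samples follow the hypothetical `TrainingSample` interface sketched earlier.

```python
import numpy as np

def train(model, samples, threshold=1e-4, max_iterations=10_000):
    # Iteratively update the model until a termination condition is satisfied.
    prev_cost = None
    for iteration in range(max_iterations):               # cap on the iteration count
        sample = samples[iteration % len(samples)]
        predicted = np.asarray(model.forward(sample.model_input()))
        desired = np.asarray(sample.desired_output())
        cost = float(np.mean((predicted - desired) ** 2))  # cost (loss) value
        if cost < threshold:
            break                                          # cost smaller than a threshold
        if prev_cost is not None and abs(prev_cost - cost) < threshold:
            break                                          # cost has converged
        model.backward(predicted - desired)                # adjust parameter values
        prev_cost = cost
    return model
```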
  • the trained machine learning model may be transmitted to a storage device (e.g., the storage device 150, the storage 220, the storage 390, etc. ) for storage.
  • the processing device 140B may configure a first weight for the sample parameter values of the acquisition parameter (s) , a second weight for the sample target dose, and a third weight for the sample initial exposure parameters. Both the first weight and the second weight are greater than the third weight.
  • the processing device 140B may train the preliminary machine learning model based on the sample parameter values of the acquisition parameter (s) with the first weight, the sample target dose with the second weight, and the sample initial exposure parameters with the third weight.
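One simple way to realize this weighting is to scale each part of the input vector before it enters the model, as in the sketch below; the numeric weight values are illustrative assumptions (the disclosure only requires the first and second weights to exceed the third).

```python
import numpy as np

w_acq, w_dose, w_init = 1.0, 1.0, 0.5   # first, second, and third weights (illustrative)

def weighted_input(acq_params, target_dose, init_exposure):
    # Scale the three parts of the model input by their respective weights.
    return np.concatenate([
        w_acq * np.asarray(acq_params, dtype=float),
        [w_dose * float(target_dose)],
        w_init * np.asarray(init_exposure, dtype=float),
    ])
```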
  • one or more operations may be added or omitted.
  • the processing device 140B may further test the trained machine learning model using a set of testing samples. Additionally or alternatively, the processing device 140B may update the trained machine learning model periodically or irregularly based on one or more newly-generated training samples. For example, the processing device 140B may update the trained machine learning model at a certain time interval (e.g., per month, per two months) based on a plurality of newly-generated training samples.
  • FIG. 7 is a flowchart illustrating an exemplary process for determining a training sample for a trained machine learning model according to some embodiments of the present disclosure.
  • a process 700 may be implemented as a set of instructions (e.g., an application) stored in the storage device 150, storage 220, and/or storage 390.
  • the processing device 140B (e.g., implemented on the processor 210, the CPU 340, and/or one or more modules illustrated in FIG. 4B) may execute the set of instructions to perform the process 700.
  • the operations of the illustrated process presented below are intended to be illustrative.
  • the process 700 may be accomplished with one or more additional operations not described and/or without one or more of the operations discussed. Additionally, the order of the operations of process 700 illustrated in FIG. 7 and described below is not intended to be limiting. In some embodiments, a group of training samples in the multiple groups of training samples described elsewhere in the present disclosure (e.g., operation 610 illustrated in FIG. 6) may be determined according to the process 700. In some embodiments, the process 700 may be performed by another device or system other than the imaging system 100, e.g., a device or system of a vendor or a manufacturer.
  • the processing device 140B may obtain sample parameter values of one or more acquisition parameters of an imaging device for scanning a sample subject.
  • the sample subject may be same as a subject as described elsewhere in the present disclosure (e.g., FIG. 5 and the descriptions thereof) .
  • the acquisition parameter (s) of the imaging device may be same as or similar to the acquisition parameter (s) described in FIG. 5.
  • the acquisition parameter (s) may include a subject type of the sample subject, an acquisition type, a filtration mode, a source image distance (SID) , a source subject distance (SOD) , an image subject distance (IOD) , a focal spot size, a beam field size, or the like, or any combination thereof.
  • the sample parameter values of the acquisition parameter (s) may be set according to a default setting of the imaging system 100 or preset by a user or operator via the terminal device 130.
  • the processing device 140B may obtain sample initial parameter values of exposure parameters of the imaging device.
  • the exposure parameters of the imaging device may include parameters of an X-ray generator of the imaging device, such as a tube voltage of the X-ray generator, a tube current of the X-ray generator, a radiation time, etc.
  • sample initial parameter values of the exposure parameters may also be referred to as sample initial exposure parameters.
  • sample initial exposure parameters may be set according to a default setting of the imaging system 100 or preset by a user or operator via the terminal device 130.
  • the processing device 140B may obtain an initial image that is captured by scanning the sample subject based on the sample initial exposure parameters using the imaging device according to the sample parameter values of the acquisition parameter (s) .
  • the processing device 140B may direct the imaging device to scan the sample subject using the sample initial exposure parameters and the sample parameter values of the acquisition parameter (s) to generate the initial image.
  • the initial image may include a representation of the sample subject using a plurality of pixels or voxels with gray values.
  • the processing device 140B may obtain the initial image from the imaging device (e.g., the imaging device 110) , the storage device 150, or any other storage device.
  • the imaging device may transmit acquired initial imaging data (e.g., projection data) to the storage device 150 or any other storage device for storage.
  • the processing device 140A may obtain the initial imaging data from the storage device 150 or any other storage device and generate the initial image based on the initial imaging data.
  • the processing device 140B may obtain the initial image from the imaging device directly.
  • the processing device 140B may determine a practical dose corresponding to the initial image.
  • the processing device 140B may determine the practical dose based on the initial image and one or more conversion coefficients associated with the imaging device.
  • the one or more conversion coefficients may be related to a detector, a filter, etc., of the imaging device.
  • the processing device 140B may determine an average value of gray values of pixels in at least a portion (e.g., an ROI) of the initial image.
  • the processing device 140B may directly obtain a conversion coefficient between the average value and the practical dose.
  • the conversion coefficient may be related to a gain of the detector of the imaging device.
  • the conversion coefficient between the average value and the practical dose may be related to a first conversion coefficient related to the gain of the detector of the imaging device and a second conversion coefficient related to the filter (e.g., a copper filter) of the imaging device.
  • the processing device 140B may determine the practical dose based on Equation (1) as follows:
a_gray = P_dose × C_1 × C_2,  (1)
where a_gray denotes the average value of gray values of pixels in the at least one portion of the initial image, P_dose denotes the practical dose, C_1 denotes the first conversion coefficient related to the gain of the detector of the imaging device, and C_2 denotes the second conversion coefficient related to the filter of the imaging device.
  • the first conversion coefficient and/or the second conversion coefficient may be set according to a default setting of the imaging system 100 or preset by a user or operator via the terminal device 130.
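Inverting Equation (1) gives the practical dose directly from the average gray value of the ROI. The sketch below is illustrative; the function name and the assumption that the ROI is supplied as a pixel array are hypothetical.

```python
import numpy as np

def practical_dose(roi_pixels: np.ndarray, c1: float, c2: float) -> float:
    a_gray = float(np.mean(roi_pixels))  # average gray value over the ROI
    return a_gray / (c1 * c2)            # P_dose = a_gray / (C_1 * C_2), from Equation (1)
```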
  • the processing device 140B may determine whether a dose difference between the practical dose and a sample target dose is less than a threshold value.
  • the dose difference may be smaller than or equal to a certain percentage (e.g., 5%, 10%, 15%, 20%, etc. ) of the sample target dose.
  • in response to determining that the dose difference is less than the threshold value, the processing device 140B may designate the sample initial exposure parameters as sample stable exposure parameters in operation 760.
  • in response to determining that the dose difference is not less than the threshold value, the processing device 140B may return to operation 730 to obtain an updated image that is captured by scanning the sample subject based on updated parameter values of the exposure parameters (also referred to as updated exposure parameters) using the imaging device according to the sample parameter values of the acquisition parameter (s) .
  • the processing device 140B may also determine an updated practical dose corresponding to the updated image.
  • the processing device 140B may also determine whether an updated dose difference between the updated practical dose and the sample target dose is less than the threshold value.
  • in response to determining that the updated dose difference is less than the threshold value, the processing device 140B may designate the updated exposure parameters as the sample stable exposure parameters in operation 760.
  • otherwise, the processing device 140B may obtain a further updated image by scanning the sample subject based on further updated exposure parameters using the imaging device according to the sample parameter values of the acquisition parameter (s) , repeating the iteration until the corresponding dose difference is less than the threshold value.
  • the updated exposure parameters may be determined based on a previous dose difference determined in a previous iteration and an automatic brightness stabilization (ABS) curve of the X-ray generator of the imaging device.
  • the ABS curve of the X-ray generator may include a plurality of sets of parameter values of the exposure parameters.
  • the processing device 140B may identify a current position of the sample initial exposure parameters on the ABS curve.
  • the processing device 140B may convert the sample target dose into a sample target brightness.
  • the processing device 140B may compare a brightness of the initial image with the sample target brightness. If the brightness of the initial image is greater than the sample target brightness, the processing device 140B may determine parameter values on the ABS curve corresponding to the next position of the current position downward along the ABS curve as the updated exposure parameters. If the brightness of the initial image is less than the sample target brightness, the processing device 140B may determine parameter values on the ABS curve corresponding to the next position of the current position upward along the ABS curve as the updated exposure parameters.
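The brightness comparison and the step along the ABS curve can be sketched as below, modeling the curve as an ordered sequence of exposure-parameter sets (from lowest to highest output dose). The list representation and the function name are assumptions.

```python
def next_exposure_on_abs_curve(abs_curve, current_index,
                               image_brightness, target_brightness):
    # Step one position along the ABS curve toward the target brightness.
    if image_brightness > target_brightness:
        new_index = max(current_index - 1, 0)                   # too bright: step downward
    else:
        new_index = min(current_index + 1, len(abs_curve) - 1)  # too dark: step upward
    return new_index, abs_curve[new_index]                      # updated exposure parameters
```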
  • the processing device 140B may determine a ratio of the practical dose and the sample target dose.
  • the processing device 140B may determine whether an absolute value of a difference between the ratio and one is less than a second threshold value (e.g., 0.05, 0.1, 0.15, 0.2, etc. ) .
  • in response to determining that the absolute value is less than the second threshold value, the processing device 140B may designate the sample initial exposure parameters as sample stable exposure parameters in operation 760.
  • otherwise, the processing device 140B may return to operation 730 as described above.
  • the processing device 140B may determine the sample parameter values of the acquisition parameters, the sample initial exposure parameters, the sample target dose, and the sample stable exposure parameters as a group of training samples for training a preliminary machine learning model. In some embodiments, the processing device 140B may transmit the training sample to the storage device 150, the storage 220, the storage 390, or any other storage device via the network 120.
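Putting operations 710 through 760 together, the generation of one group of training samples might look like the loop below. This is a sketch that reuses the hypothetical helpers from the earlier snippets (`practical_dose`, `next_exposure_on_abs_curve`); `scan` is assumed to return the ROI pixel values of the acquired image, and the iteration cap is a safety assumption not stated in the disclosure.

```python
def generate_training_sample(scan, abs_curve, start_index, acq_params,
                             target_dose, c1, c2, tolerance=0.10, max_iters=20):
    index = start_index
    init_exposure = abs_curve[start_index]   # sample initial exposure parameters
    exposure = init_exposure
    target_brightness = target_dose * c1 * c2
    for _ in range(max_iters):
        roi = scan(acq_params, exposure)                 # operation 730: acquire an image
        dose = practical_dose(roi, c1, c2)               # operation 740: practical dose
        if abs(dose - target_dose) < tolerance * target_dose:
            break                                        # operations 750/760: dose is stable
        brightness = dose * c1 * c2                      # image brightness via Equation (1)
        index, exposure = next_exposure_on_abs_curve(
            abs_curve, index, brightness, target_brightness)
    # One group of training samples: (acquisition params, initial exposure,
    # target dose) as the input, the final exposure as the stable ground truth.
    return acq_params, init_exposure, target_dose, exposure
```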
  • one or more operations may be omitted and/or one or more additional operations may be added.
  • operation 710 and operation 720 may be combined into a single operation.
  • one or more other optional operations (e.g., a storing operation) may be added elsewhere in the process 700.
  • the processing device 140B may store information and/or data associated with the imaging system 100 in a storage device (e.g., the storage device 150) disclosed elsewhere in the present disclosure.
  • the sample initial parameter values of the exposure parameters and the sample target dose may be designated as control factor (s) for optimizing the ABS curve of the imaging device to obtain an optimized ABS curve.
  • the processing device 140B may update exposure parameters based on the optimized ABS curve.
  • the sample parameter values of the acquisition parameters of the imaging device may also be designated as the control factor (s) for optimizing the ABS curve.
  • the sample initial exposure parameter values and the sample target dose may be designated as main control factors, and each main control factor may be configured with a fourth weight.
  • the sample parameter values of the acquisition parameters may be designated as auxiliary control factor (s) , and each auxiliary control factor may be configured with a fifth weight. The fourth weight may be greater than the fifth weight.
  • the ABS curve may be optimized using a neural network model.
  • the main control factor (s) and the auxiliary control factor (s) may be input into the neural network model.
  • the neural network model may learn from the main control factor (s) and the auxiliary control factor (s) inputted into it to adjust the ABS curve, for example, by correcting data (e.g., a tube voltage, a tube current, or a radiation time) of one or more discrete points in the ABS curve.
  • FIG. 8 is a schematic diagram illustrating an exemplary trained machine learning model according to some embodiments of the present disclosure.
  • the trained machine learning model 800 may include an input layer 810, a hidden layer 820, and an output layer 830.
  • the hidden layer 820 may include a plurality of hidden sub-layers including one or more convolutional layers, one or more Rectified Linear Units layers (ReLU layers) , one or more pooling layers, one or more fully connected layers, or the like, or a combination thereof.
  • the processing device 140A may acquire parameter values of acquisition parameters (e.g., a subject type, an acquisition type (AcqType) , a SID, a SOD, a Filtration mode) of an imaging device, a target dose, and initial exposure parameters (e.g., an initial tube voltage (kV_initial) , an initial tube current (mA_initial) , and an initial radiation time (ms_initial) ) of the imaging device as input of the trained machine learning model.
  • the parameter values of the acquisition parameters, the target dose, and the initial exposure parameters may be denoted by a plurality of initial vectors. At least one portion of the plurality of initial vectors may be inputted into the hidden layer 820.
  • the hidden layer 820 may extract one or more features, such as a tube voltage feature (kV_feature) , a tube current feature (mA_feature) , a radiation time feature (ms_feature) , a convergence feature, etc.
  • the hidden layer 820 may configure a first weight for the parameter values of the acquisition parameter (s) , a second weight for the target dose, and a third weight for the initial exposure parameters. Both the first weight and the second weight are greater than the third weight.
  • the one or more features may be extracted from the parameter values of the acquisition parameter (s) with the first weight, the target dose with the second weight, and the initial exposure parameters with the third weight.
  • the trained machine learning model may determine a plurality of target vectors corresponding to a plurality of neurons based on the input of the trained machine learning model.
  • the output layer 830 may determine outputs, such as a stable tube voltage (kV_stable) , a stable tube current (mA_stable) , and a stable radiation time (ms_stable) , based on the plurality of target vectors and weighting coefficients obtained in the hidden layer 820.
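A minimal fully connected realization of the model in FIG. 8 is sketched below in PyTorch; the layer sizes and input dimensionality are assumptions, and the input vector is taken to pack the acquisition parameters, the target dose, and the initial exposure parameters. A forward pass on a batch, e.g., `ExposureModel()(torch.randn(4, 9))`, yields a 4×3 tensor of stable exposure parameters.

```python
import torch
import torch.nn as nn

class ExposureModel(nn.Module):
    # Illustrative input -> hidden -> output structure mirroring FIG. 8.
    def __init__(self, n_inputs: int = 9, n_hidden: int = 64):
        super().__init__()
        self.hidden = nn.Sequential(
            nn.Linear(n_inputs, n_hidden),
            nn.ReLU(),                        # Rectified Linear Units layer
            nn.Linear(n_hidden, n_hidden),
            nn.ReLU(),
        )
        self.output = nn.Linear(n_hidden, 3)  # kV_stable, mA_stable, ms_stable

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.output(self.hidden(x))
```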
  • the trained machine learning model may be modified when applied in different conditions.
  • a Rectified Linear Units layer may be added into the hidden layer 820.
  • An activation function may be used by the Rectified Linear Units layer to constrain an output of the Rectified Linear Units layer.
  • Exemplary activation functions may include a linear function, a ramp function, a threshold function, a Sigmoid function, etc.
  • aspects of the present disclosure may be illustrated and described herein in any of a number of patentable classes or contexts, including any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof. Accordingly, aspects of the present disclosure may be implemented as entirely hardware, entirely software (including firmware, resident software, micro-code, etc. ) , or a combination of software and hardware, which may all generally be referred to herein as a “module, ” “unit, ” “component, ” “device, ” or “system. ” Furthermore, aspects of the present disclosure may take the form of a computer program product embodied in one or more computer-readable media having computer-readable program code embodied thereon.
  • a computer-readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated signal may take any of a variety of forms, including electro-magnetic, optical, or the like, or any suitable combination thereof.
  • a computer-readable signal medium may be any computer-readable medium that is not a computer-readable storage medium and that may communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
  • Program code embodied on a computer-readable signal medium may be transmitted using any appropriate medium, including wireless, wireline, optical fiber cable, RF, or the like, or any suitable combination of the foregoing.
  • Computer program code for carrying out operations for aspects of the present disclosure may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Scala, Smalltalk, Eiffel, JADE, Emerald, C++, C#, VB.NET, Python or the like, conventional procedural programming languages, such as the "C" programming language, Visual Basic, Fortran 2003, Perl, COBOL 2002, PHP, ABAP, dynamic programming languages such as Python, Ruby, and Groovy, or other programming languages.
  • the program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server.
  • the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN) , or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider) or in a cloud computing environment or offered as a service such as a Software as a Service (SaaS) .


Abstract

Systems (100) and methods for image generation are provided. The methods may include obtaining parameter values of one or more acquisition parameters of an imaging device (110) for scanning a subject (510); obtaining initial parameter values of exposure parameters of the imaging device (110) and a target dose associated with the subject (520); determining, by using a trained machine learning model, stable parameter values of the exposure parameters corresponding to the parameter values of the one or more acquisition parameters, the initial parameter values of the exposure parameters, and the target dose (530); and generating an image by causing the imaging device (110) to scan the subject based on the stable parameter values of the exposure parameters (540).

Description

SYSTEMS AND METHODS FOR MEDICAL IMAGING
CROSS-REFERENCE TO RELATED APPLICATIONS
This application claims priority of Chinese Patent Application No. 202011196345.0, filed on October 30, 2020, the contents of which are hereby incorporated by reference.
TECHNICAL FIELD
The present disclosure generally relates to medical imaging, and more particularly, relates to systems and methods for determining one or more exposure parameters of an X-ray radiography device.
BACKGROUND
X-ray radiography is widely used in medical imaging and diagnosis. Conventional X-ray radiography devices, including computed tomography (CT) devices, digital radiography (DR) devices, C-arm X-ray devices, and so on, generally acquire images by scanning part or all of a subject. To perform a sequence scan using the existing technology, exposure parameters associated with an X-ray generator of an X-ray radiography device need to be automatically adjusted multiple times, usually according to an automatic brightness stabilization (ABS) curve of the X-ray generator, in order to obtain desired images (or stable images) . However, making multiple adjustments can be time-consuming and cause the scanned subject (e.g., a patient) to be exposed to an excessive amount of radiation. Thus, it is desirable to provide systems and methods for directly determining stable exposure parameters to improve scanning efficiency.
SUMMARY
According to an aspect of the present disclosure, a system is provided. The system may include at least one storage device including a set of instructions for generating an image; and at least one processor in communication with the at least one storage device. When executing the set of instructions, the at least one processor is configured to cause the system to perform operations. The operations may include obtaining parameter values of one or more acquisition parameters of an imaging device for  scanning an object; obtaining initial parameter values of exposure parameters of the imaging device and a target dose associated with the object; determining, by using a trained machine learning model, stable parameter values of the exposure parameters corresponding to the parameter values of the one or more acquisition parameters, the initial parameter values of the exposure parameters, and the target dose; and generating an image by causing the imaging device to scan the object based on the stable parameter values of the exposure parameters.
In some embodiments, the one or more acquisition parameters may include at least one of: an object type of the object, an acquisition type, a source image receptor distance (SID) , a source object distance (SOD) , or a filtration mode.
In some embodiments, the exposure parameters may include at least one of a tube voltage, a tube current, or a radiation time.
In some embodiments, the obtaining the trained machine learning model may include obtaining multiple groups of training samples, each group corresponding to a sample object; and generating the trained machine learning model by training a preliminary machine learning model using the multiple groups of training samples in a training process. Each group of the multiple groups of training samples may include sample parameter values of the one or more acquisition parameters, sample initial parameter values of the exposure parameters, a sample target dose, and sample stable parameter values of the exposure parameters. The sample parameter values of the one or more acquisition parameters, the sample initial parameter values of the exposure parameters, and the sample target dose may serve as an input of the preliminary machine learning model and the sample stable parameter values of the exposure parameters may serve as desired outputs of the preliminary machine learning model during the training process. The sample stable parameter values of the exposure parameters may be determined based on the sample parameter values of the one or more acquisition parameters, the sample initial parameter values of the exposure parameters, and the sample target dose.
In some embodiments, the determining the sample stable parameter values of the exposure parameters may include performing an iteration process including one or more iterations. At least one iteration of the one or more iterations may include: obtaining an updated image that is captured by scanning the sample object based on updated parameter values of the exposure parameters using the imaging device according to the sample parameter values of the one or more acquisition parameters; determining an updated practical dose corresponding to the updated image; determining whether a dose difference between the updated practical dose and the sample target dose is less than a threshold value; and in response to determining that the dose difference is less than the threshold value, designating the updated parameter values of the exposure parameters as the sample stable parameter values of the exposure parameters.
In some embodiments, the updated parameter values of the exposure parameters may be determined based on a previous dose difference determined in a previous iteration and an automatic brightness stabilization (ABS) curve of an X-ray generator of the imaging device.
In some embodiments, the training process may include configuring a first weight for the sample parameter values of the one or more acquisition parameters, a second weight for the sample target dose, and a third weight for the sample initial parameter values of the exposure parameters, wherein both the first weight and the second weight are greater than the third weight; and training the preliminary machine learning model based on the sample parameter values of the one or more acquisition parameters with the first weight, the sample target dose with the second weight, and the sample initial parameter values of the exposure parameters with the third weight.
In some embodiments, the trained machine learning model may include a deep learning model.
According to another aspect of the present disclosure, a method for image generation is provided. The method may be implemented on a computing device having at least one processor and at least one storage device, the method may include: obtaining  parameter values of one or more acquisition parameters of an imaging device for scanning an object; obtaining initial parameter values of exposure parameters of the imaging device and a target dose associated with the object; determining, by using a trained machine learning model, stable parameter values of the exposure parameters corresponding to the parameter values of the one or more acquisition parameters, the initial parameter values of the exposure parameters, and the target dose; and generating an image by causing the imaging device to scan the object based on the stable parameter values of the exposure parameters.
According to yet another aspect of the present disclosure, a non-transitory computer readable medium is provided. The non-transitory computer readable medium may include at least one set of instructions for image generation. When executed by at least one processor of a computing device, the at least one set of instructions direct the at least one processor to perform operations including: obtaining parameter values of one or more acquisition parameters of an imaging device for scanning an object; obtaining initial parameter values of exposure parameters of the imaging device and a target dose associated with the object; determining, by using a trained machine learning model, stable parameter values of the exposure parameters corresponding to the parameter values of the one or more acquisition parameters, the initial parameter values of the exposure parameters, and the target dose; and generating an image by causing the imaging device to scan the object based on the stable parameter values of the exposure parameters.
According to yet another aspect of the present disclosure, a system for model training is provided. The system for model training may include at least one storage device storing executable instructions for generating a trained machine learning model; and at least one processor in communication with the at least one storage device. When executing the set of instructions, the at least one processor is configured to cause the system to perform operations. The operations may include obtaining multiple groups of training samples, each group corresponding to a sample object; and generating the trained machine learning model by training a preliminary machine learning model using the  multiple groups of training samples in a training process, wherein each group of the multiple groups of training samples includes sample parameter values of one or more acquisition parameters of an imaging device, sample initial parameter values of exposure parameters of the imaging device, a sample target dose, and sample stable parameter values of the exposure parameters, wherein the sample parameter values of the one or more acquisition parameters, the sample initial parameter values of the exposure parameters, and the sample target dose serve as an input of the preliminary machine learning model and the sample stable parameter values of the exposure parameters serve as desired outputs of the preliminary machine learning model during the training process, the sample stable parameter values of the exposure parameters being determined based on the sample parameter values of the one or more acquisition parameters, the sample initial parameter values of the exposure parameters, and the sample target dose.
According to another aspect of the present disclosure, a method for model training is provided. The method may be implemented on a computing device having at least one processor and at least one storage device, the method may include: obtaining multiple groups of training samples, each group corresponding to a sample object; and generating the trained machine learning model by training a preliminary machine learning model using the multiple groups of training samples in a training process, wherein each group of the multiple groups of training samples includes sample parameter values of one or more acquisition parameters of an imaging device, sample initial parameter values of exposure parameters of the imaging device, a sample target dose, and sample stable parameter values of the exposure parameters, wherein the sample parameter values of the one or more acquisition parameters, the sample initial parameter values of the exposure parameters, and the sample target dose serve as an input of the preliminary machine learning model and the sample stable parameter values of the exposure parameters serve as desired outputs of the preliminary machine learning model during the training process, the sample stable parameter values of the exposure parameters being determined based  on the sample parameter values of the one or more acquisition parameters, the sample initial parameter values of the exposure parameters, and the sample target dose.
According to yet another aspect of the present disclosure, a non-transitory computer readable medium is provided. The non-transitory computer readable medium may include at least one set of instructions for model training. When executed by at least one processor of a computing device, the at least one set of instructions direct the at least one processor to perform operations including: obtaining multiple groups of training samples, each group corresponding to a sample object; and generating the trained machine learning model by training a preliminary machine learning model using the multiple groups of training samples in a training process, wherein each group of the multiple groups of training samples includes sample parameter values of one or more acquisition parameters of an imaging device, sample initial parameter values of exposure parameters of the imaging device, a sample target dose, and sample stable parameter values of the exposure parameters, wherein the sample parameter values of the one or more acquisition parameters, the sample initial parameter values of the exposure parameters, and the sample target dose serve as an input of the preliminary machine learning model and the sample stable parameter values of the exposure parameters serve as desired outputs of the preliminary machine learning model during the training process, the sample stable parameter values of the exposure parameters being determined based on the sample parameter values of the one or more acquisition parameters, the sample initial parameter values of the exposure parameters, and the sample target dose.
Additional features will be set forth in part in the description which follows, and in part will become apparent to those skilled in the art upon examination of the following and the accompanying drawings or may be learned by production or operation of the examples. The features of the present disclosure may be realized and attained by practice or use of various aspects of the methodologies, instrumentalities, and combinations set forth in the detailed examples discussed below.
BRIEF DESCRIPTION OF THE DRAWINGS
The present disclosure is further described in terms of exemplary embodiments. These exemplary embodiments are described in detail with reference to the drawings. The drawings are not to scale. These embodiments are non-limiting exemplary embodiments, in which like reference numerals represent similar structures throughout the several views of the drawings, and wherein:
FIG. 1 is a schematic diagram illustrating an exemplary medical system according to some embodiments of the present disclosure;
FIG. 2 is a schematic diagram illustrating exemplary hardware and/or software components of an exemplary computing device according to some embodiments of the present disclosure;
FIG. 3 is a schematic diagram illustrating exemplary hardware and/or software components of an exemplary mobile device according to some embodiments of the present disclosure;
FIG. 4A is a block diagram illustrating an exemplary processing device according to some embodiments of the present disclosure;
FIG. 4B is a block diagram illustrating an exemplary processing device according to some embodiments of the present disclosure;
FIG. 5 is a flowchart illustrating an exemplary process for generating an image of a subject according to some embodiments of the present disclosure;
FIG. 6 is a flowchart illustrating an exemplary process for generating a trained machine learning model for determining exposure parameters according to some embodiments of the present disclosure;
FIG. 7 is a flowchart illustrating an exemplary process for determining a training sample for a trained machine learning model according to some embodiments of the present disclosure; and
FIG. 8 is a schematic diagram illustrating an exemplary trained machine learning model according to some embodiments of the present disclosure.
DETAILED DESCRIPTION
In the following detailed description, numerous specific details are set forth by way of examples in order to provide a thorough understanding of the relevant disclosure. However, it should be apparent to those skilled in the art that the present disclosure may be practiced without such details. In other instances, well-known methods, procedures, systems, components, and/or circuitry have been described at a relatively high-level, without detail, in order to avoid unnecessarily obscuring aspects of the present disclosure. Various modifications to the disclosed embodiments will be readily apparent to those skilled in the art, and the general principles defined herein may be applied to other embodiments and applications without departing from the spirit and scope of the present disclosure. Thus, the present disclosure is not limited to the embodiments shown, but to be accorded the widest scope consistent with the claims.
The terminology used herein is for the purpose of describing particular example embodiments only and is not intended to be limiting. As used herein, the singular forms “a, ” “an, ” and “the” may be intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms “comprise, ” “comprises, ” and/or “comprising, ” “include, ” “includes, ” and/or “including, ” when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It will be understood that the term “system, ” “engine, ” “unit, ” “module, ” and/or “block” used herein are one method to distinguish different components, elements, parts, sections, or assembly of different levels in ascending order. However, the terms may be displaced by another expression if they achieve the same purpose.
Generally, the word “module, ” “unit, ” or “block, ” as used herein, refers to logic embodied in hardware or firmware, or to a collection of software instructions. A module, a unit, or a block described herein may be implemented as software and/or hardware and may be stored in any type of non-transitory computer-readable medium or another storage device. In some embodiments, a software module/unit/block may be compiled and linked  into an executable program. It will be appreciated that software modules can be callable from other modules/units/blocks or from themselves, and/or may be invoked in response to detected events or interrupts. Software modules/units/blocks configured for execution on computing devices (e.g., processor 210 as illustrated in FIG. 2) may be provided on a computer-readable medium, such as a compact disc, a digital video disc, a flash drive, a magnetic disc, or any other tangible medium, or as a digital download (and can be originally stored in a compressed or installable format that needs installation, decompression, or decryption prior to execution) . Such software code may be stored, partially or fully, on a storage device of the executing computing device, for execution by the computing device. Software instructions may be embedded in firmware, such as an EPROM. It will be further appreciated that hardware modules/units/blocks may be included in connected logic components, such as gates and flip-flops, and/or can be included in programmable units, such as programmable gate arrays or processors. The modules/units/blocks or computing device functionality described herein may be implemented as software modules/units/blocks but may be represented in hardware or firmware. In general, the modules/units/blocks described herein refer to logical modules/units/blocks that may be combined with other modules/units/blocks or divided into sub-modules/sub-units/sub-blocks despite their physical organization or storage. The description may be applicable to a system, an engine, or a portion thereof.
It will be understood that when a unit, engine, module, or block is referred to as being “on, ” “connected to, ” or “coupled to, ” another unit, engine, module, or block, it may be directly on, connected or coupled to, or communicate with the other unit, engine, module, or block, or an intervening unit, engine, module, or block may be present unless the context clearly indicates otherwise. As used herein, the term “and/or” includes any and all combinations of one or more of the associated listed items.
These and other features, and characteristics of the present disclosure, as well as the methods of operation and functions of the related elements of structure and the combination of parts and economies of manufacture, may become more apparent upon  consideration of the following description with reference to the accompanying drawings, all of which form a part of this disclosure. It is to be expressly understood, however, that the drawings are for the purpose of illustration and description only and are not intended to limit the scope of the present disclosure. It is understood that the drawings are not to scale.
The term “image” in the present disclosure is used to collectively refer to image data (e.g., scan data, projection data) and/or images of various forms, including a two-dimensional (2D) image, a three-dimensional (3D) image, a four-dimensional (4D) image, etc. The terms “pixel” and “voxel” in the present disclosure are used interchangeably to refer to an element of an image. The terms “region,” “location,” and “area” in the present disclosure may refer to a location of an anatomical structure shown in the image or an actual location of the anatomical structure existing in or on a target subject’s body, since the image may indicate the actual location of a certain anatomical structure existing in or on the target subject’s body. In some embodiments, an image of a subject may be referred to as the subject for brevity. Segmentation of an image of a subject may be referred to as segmentation of the subject. For example, segmentation of an organ refers to segmentation of a region corresponding to the organ in an image.
The present disclosure provides mechanisms (which can include methods, systems, a computer-readable medium, etc. ) for generating an image of a subject. A method provided in the present disclosure may include obtaining parameter values of one or more acquisition parameters of an imaging device for scanning a subject. The method may further include obtaining initial parameter values of exposure parameters of the imaging device and a target dose associated with the subject. The method may further include determining, by using a trained machine learning model, stable parameter values of the exposure parameters corresponding to the parameter values of the one or more acquisition parameters, the initial parameter values of the exposure parameters, and the target dose. The method may further include generating an image by causing the  imaging device to scan the subject based on the stable parameter values of the exposure parameters.
In some conventional methods for image generation in sequence scanning, the stable parameter values of the exposure parameters may be determined by automatically adjusting the initial parameter values of the exposure parameters multiple times according to an automatic brightness stabilization (ABS) curve of a ray generator of the imaging device, which is time-consuming and exposes the scanned subject (e.g., a patient) to excessive radiation. The systems and methods of the present disclosure may directly determine the stable parameter values of the exposure parameters using a trained machine learning model based on the parameter values of the acquisition parameters, the initial parameter values of the exposure parameters, and the target dose, thereby improving scanning efficiency and protecting the subject from excessive radiation exposure.
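For concreteness, the overall workflow may be sketched in Python as below. All names (the `device` and `model` interfaces, the protocol dictionary keys) are hypothetical placeholders introduced for illustration only; the present disclosure does not define a software API.

```python
# A minimal sketch of the disclosed workflow under assumed interfaces:
# `model.predict` stands in for one forward pass of the trained machine
# learning model, and `device.scan` stands in for a single acquisition.

def acquire_stable_image(device, model, protocol, target_dose):
    """`protocol` is a dict holding one stored scan protocol, e.g.
    {"acquisition": {...}, "initial_exposure": {...}}, selected by
    subject type (an assumption of this sketch)."""
    stable_exposure = model.predict(
        protocol["acquisition"],       # acquisition parameter values
        protocol["initial_exposure"],  # initial exposure parameter values
        target_dose,                   # desired dose on the detector
    )
    # A single scan with the predicted stable exposure parameters replaces
    # the repeated exposure adjustments of an ABS-curve-based approach.
    return device.scan(protocol["acquisition"], stable_exposure)
```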
FIG. 1 is a schematic diagram illustrating an exemplary imaging system according to some embodiments of the present disclosure. In some embodiments, the imaging system 100 may be applied to any application scenario in which radiation rays (e.g., X-rays) are used for generating images and/or providing treatment, such as a computed tomography (CT) system, a digital radiography (DR) system, a C-arm X-ray system, a computed tomography-positron emission tomography (CT-PET) system, or the like, or a combination thereof.
As shown in FIG. 1, the imaging system 100 may include an imaging device 110, a network 120, a terminal device 130, a processing device 140, and a storage device 150. The components in the imaging system 100 may be connected in one or more of various ways. Merely by way of example, the imaging device 110 may be connected to the processing device 140 through the network 120. As another example, the imaging device 110 may be connected to the processing device 140 directly as indicated by the bi-directional arrow in dotted lines linking the imaging device 110 and the processing device 140. As a further example, the storage device 150 may be connected to the processing device 140 directly or through the network 120. As still a further example, the terminal  device 130 may be connected to the processing device 140 directly (as indicated by the bi-directional arrow in dotted lines linking the terminal device 130 and the processing device 140) or through the network 120.
The imaging device 110 may be configured to scan a subject using radiation rays and generate imaging data used to generate one or more images relating to the subject. The imaging data relating to at least one part of the subject may include an image (e.g., an image slice), projection data, or a combination thereof. In some embodiments, the imaging data may be two-dimensional (2D) imaging data, three-dimensional (3D) imaging data, four-dimensional (4D) imaging data, or the like, or any combination thereof. In some embodiments, the imaging device 110 may transmit the imaging data to the processing device 140 for further processing (e.g., generating one or more images). In some embodiments, the imaging data and/or the one or more images associated with the subject may be stored in the storage device 150 and/or the processing device 140. In some embodiments, the imaging device 110 may include a computed tomography (CT) device, a digital radiography (DR) device, a C-arm X-ray device, a digital subtraction angiography (DSA) device, a dynamic spatial reconstructor (DSR) scanner, an X-ray microscopy device, a computed tomography-positron emission tomography (CT-PET) device, a computed tomography-magnetic resonance imaging (CT-MRI) device, or the like, or a combination thereof. In some embodiments, one or more components in the imaging system 100 may be omitted. Merely by way of example, the imaging system 100 may not include the terminal device 130.
In some embodiments, the imaging device 110 may include a gantry 111, a detector 112, a detecting region 113, a table 114, a ray generator 115, or any other components. The gantry 111 may be configured to provide support for other components (e.g., the ray generator 115, the detector 112, etc. ) of the imaging device 110. In some embodiments, the detector 112 and the ray generator 115 may be oppositely mounted on the gantry 111. In some embodiments, the gantry 111 may rotate and/or move. The detector 112 and the ray generator 115 may rotate along with the rotation of the gantry  111. The table 114 may be configured to locate and/or support a subject to be scanned. The subject may be placed on the table 114 and moved into the detecting region 113 (e.g., a space between the detector 112 and the ray generator 115) of the imaging device 110. The subject may be biological or non-biological. Merely by way of example, the subject may include a patient, a man-made subject, etc. As another example, the subject may include a specific portion, organ, and/or tissue of the patient. For example, the subject may include the head, the brain, the neck, the body, a shoulder, an arm, the thorax, the heart, the stomach, a blood vessel, a soft tissue, a knee, feet, or the like, or any combination thereof.
The ray generator 115 may emit radiation rays to scan the subject that is placed on the table 114. The radiation rays may include X-rays, γ-rays, α-rays, ultraviolet rays, laser beams, neutron beams, proton beams, or the like, or a combination thereof. The detector 112 may receive the radiation rays that have passed through the subject. In some embodiments, the detector 112 may include a plurality of detector units, which may be arranged in a channel direction and a row direction. The detector 112 may include a scintillation detector (e.g., a cesium iodide detector) or a gas detector.
The network 120 may include any suitable network that can facilitate the exchange of information and/or data for the imaging system 100. In some embodiments, one or more components of the imaging system 100 (e.g., the imaging device 110, the terminal device 130, the processing device 140, the storage device 150) may communicate information and/or data with one or more other components of the imaging system 100 via the network 120. For example, the processing device 140 may obtain image data from the imaging device 110 via the network 120. As another example, the processing device 140 may obtain user instruction(s) from the terminal device 130 via the network 120. In some embodiments, the network 120 may be any type of wired or wireless network, or a combination thereof. The network 120 may be or include a public network (e.g., the Internet), a private network (e.g., a local area network (LAN), a wide area network (WAN)), a wired network (e.g., an Ethernet network), a wireless network (e.g., an 802.11 network, a Wi-Fi network, etc.), a cellular network (e.g., a Long Term Evolution (LTE) network), a frame relay network, a virtual private network (VPN), a satellite network, a telephone network, routers, hubs, switches, server computers, and/or any combination thereof. For example, the network 120 may include a cable network, a wireline network, a fiber-optic network, a telecommunications network, an intranet, a wireless local area network (WLAN), a metropolitan area network (MAN), a public switched telephone network (PSTN), a Bluetooth™ network, a ZigBee™ network, a near field communication (NFC) network, or the like, or any combination thereof. In some embodiments, the network 120 may include one or more network access points. For example, the network 120 may include wired and/or wireless network access points such as base stations and/or internet exchange points through which one or more components of the imaging system 100 may be connected to the network 120 to exchange data and/or information.
The terminal device 130 may be connected to and/or communicate with the imaging device 110, the processing device 140, and/or the storage device 150. For example, the terminal device 130 may obtain a processed image from the processing device 140. As another example, the terminal device 130 may enable user interactions with the imaging system 100. In some embodiments, the terminal device 130 may include a mobile device 131, a tablet computer 132, a laptop computer 133, or the like, or any combination thereof. For example, the mobile device 131 may include a mobile phone, a personal digital assistant (PDA), a gaming device, a navigation device, a point of sale (POS) device, a laptop, a tablet computer, a desktop, or the like, or any combination thereof. In some embodiments, the terminal device 130 may include an input device, an output device, etc. The input device may include alphanumeric and other keys that may be input via a keyboard, a touch screen (for example, with haptics or tactile feedback), a speech input, an eye-tracking input, a brain monitoring system, or any other comparable input mechanism. The input information received through the input device may be transmitted to the processing device 140 via, for example, a bus, for further processing. Other types of input devices may include a cursor control device, such as a mouse, a trackball, or cursor direction keys. The output device may include a display, a speaker, a printer, or the like, or a combination thereof. In some embodiments, the terminal device 130 may be part of the processing device 140.
The processing device 140 may process data and/or information obtained from the imaging device 110, the storage device 150, the terminal device 130, or other components of the imaging system 100. For example, the processing device 140 may determine stable parameter values of exposure parameters corresponding to parameter values of one or more acquisition parameters, initial parameter values of the exposure parameters, and a target dose based on a trained machine learning model. As another example, the processing device 140 may obtain multiple groups of training samples and generate the trained machine learning model by training a preliminary machine learning model using the multiple groups of training samples. As a further example, for determining one group of training samples, the processing device 140 may direct the imaging device 110 to perform an iteration operation until an image satisfying a preset condition is generated. The processing device 140 may determine the parameter values of the exposure parameters used to generate that image as the sample stable parameter values of the exposure parameters. The trained machine learning model may be updated from time to time, e.g., periodically or not, based on a sample set that is at least partially different from the original sample set from which the original trained machine learning model is determined. For instance, the trained machine learning model may be updated based on a sample set including new samples that are not in the original sample set, samples processed using the trained machine learning model of a prior version, or the like, or a combination thereof. In some embodiments, the generation and/or updating of the trained machine learning model may be performed on a processing device, while the application of the trained machine learning model may be performed on a different processing device. In some embodiments, the determination and/or updating of the trained machine learning model may be performed on a processing device of a system different from the imaging system 100, or on a server different from a server including the processing device 140 on which the application of the trained machine learning model is performed. For instance, the determination and/or updating of the trained machine learning model may be performed on a first system of a vendor who provides and/or maintains such a machine learning model and/or has access to training samples used to determine and/or update the trained machine learning model, while exposure parameter determination based on the provided trained machine learning model may be performed on a second system of a client of the vendor. In some embodiments, the determination and/or updating of the trained machine learning model may be performed online in response to a request for scanning a subject. In some embodiments, the determination and/or updating of the trained machine learning model may be performed offline.
In some embodiments, the trained machine learning model may be generated and/or updated (or maintained) by, e.g., the manufacturer of the imaging device 110 or a vendor. For instance, the manufacturer or the vendor may load the trained machine learning model into the imaging system 100 or a portion thereof (e.g., the processing device 140) before or during the installation of the imaging device 110 and/or the processing device 140, and maintain or update the trained machine learning model from time to time (periodically or not). The maintenance or update may be achieved by installing a program stored on a storage device (e.g., a compact disc, a USB drive, etc.) or retrieved from an external source (e.g., a server maintained by the manufacturer or vendor) via the network 120. The program may include a new model (e.g., a new trained machine learning model) or a portion of a model that substitutes or supplements a corresponding portion of the model.
In some embodiments, the processing device 140 may be a single server or a server group. The server group may be centralized or distributed. In some embodiments, the processing device 140 may be local to or remote from the imaging system 100. For example, the processing device 140 may access information and/or data from the imaging device 110, the storage device 150, and/or the terminal device 130 via the network 120. As another example, the processing device 140 may be directly  connected to the imaging device 110, the terminal device 130, and/or the storage device 150 to access information and/or data. In some embodiments, the processing device 140 may be implemented on a cloud platform. For example, the cloud platform may include a private cloud, a public cloud, a hybrid cloud, a community cloud, a distributed cloud, an inter-cloud, a multi-cloud, or the like, or a combination thereof. In some embodiments, the processing device 140 may be implemented on a computing device 200 having one or more components as described in connection with FIG. 2.
The storage device 150 may store data, instructions, and/or any other information. In some embodiments, the storage device 150 may store data obtained from the processing device 140 and/or the terminal device 130. For example, the storage device 150 may store one or more images obtained from the processing device 140. In some embodiments, the storage device 150 may store data and/or instructions that the processing device 140 may execute or use to perform exemplary methods described in the present disclosure. In some embodiments, the storage device 150 may include a mass storage device, a removable storage device, a volatile read-and-write memory, a read-only memory (ROM), or the like, or any combination thereof. Exemplary mass storage devices may include a magnetic disk, an optical disk, a solid-state drive, etc. Exemplary removable storage devices may include a flash drive, a floppy disk, an optical disk, a memory card, a zip disk, a magnetic tape, etc. Exemplary volatile read-and-write memory may include a random access memory (RAM). Exemplary RAM may include a dynamic RAM (DRAM), a double data rate synchronous dynamic RAM (DDR SDRAM), a static RAM (SRAM), a thyristor RAM (T-RAM), a zero-capacitor RAM (Z-RAM), etc. Exemplary ROM may include a mask ROM (MROM), a programmable ROM (PROM), an erasable programmable ROM (EPROM), an electrically erasable programmable ROM (EEPROM), a compact disk ROM (CD-ROM), a digital versatile disk ROM, etc. In some embodiments, the storage device 150 may be implemented on a cloud platform as described elsewhere in the disclosure.
In some embodiments, the storage device 150 may be connected to the network  120 to communicate with one or more other components of the imaging system 100 (e.g., the processing device 140, the terminal device 130, etc. ) . One or more components of the imaging system 100 may access the data or instructions stored in the storage device 150 via the network 120. In some embodiments, the storage device 150 may be part of the processing device 140.
This description is intended to be illustrative, and not to limit the scope of the present disclosure. Many alternatives, modifications, and variations will be apparent to those skilled in the art. The features, structures, methods, and other characteristics of the exemplary embodiments described herein may be combined in various ways to obtain additional and/or alternative exemplary embodiments. For example, the storage device 150 may be a data storage device including cloud computing platforms, such as public clouds, private clouds, community clouds, and hybrid clouds. However, those variations and modifications do not depart from the scope of the present disclosure.
FIG. 2 is a schematic diagram illustrating exemplary hardware and/or software components of an exemplary computing device on which the processing device 140 may be implemented according to some embodiments of the present disclosure. As illustrated in FIG. 2, a computing device 200 may include a processor 210, a storage 220, an input/output (I/O) 230, and a communication port 240.
The processor 210 may execute computer instructions (e.g., program code) and perform functions of the processing device 140 in accordance with techniques described herein. The computer instructions may include, for example, routines, programs, objects, components, data structures, procedures, modules, and functions, which perform particular functions described herein. For example, the processor 210 may process image data obtained from the imaging device 110, the terminal device 130, the storage device 150, and/or any other component of the imaging system 100. In some embodiments, the processor 210 may include one or more hardware processors, such as a microcontroller, a microprocessor, a reduced instruction set computer (RISC), an application-specific integrated circuit (ASIC), an application-specific instruction-set processor (ASIP), a central processing unit (CPU), a graphics processing unit (GPU), a physics processing unit (PPU), a microcontroller unit, a digital signal processor (DSP), a field-programmable gate array (FPGA), an advanced RISC machine (ARM), a programmable logic device (PLD), any circuit or processor capable of executing one or more functions, or the like, or any combination thereof.
Merely for illustration, only one processor is described in the computing device 200. However, it should be noted that the computing device 200 in the present disclosure may also include multiple processors, and thus operations and/or method operations that are performed by one processor as described in the present disclosure may also be jointly or separately performed by the multiple processors. For example, if in the present disclosure the processor of the computing device 200 executes both operation A and operation B, it should be understood that operation A and operation B may also be performed by two or more different processors jointly or separately in the computing device 200 (e.g., a first processor executes operation A and a second processor executes operation B, or the first and second processors jointly execute operations A and B) .
The storage 220 may store data/information obtained from the imaging device 110, the terminal device 130, the storage device 150, and/or any other component of the imaging system 100. In some embodiments, the storage 220 may include a mass storage device, a removable storage device, a volatile read-and-write memory, a read-only memory (ROM) , or the like, or any combination thereof. In some embodiments, the storage 220 may store one or more programs and/or instructions to perform exemplary methods described in the present disclosure. For example, the storage 220 may store a program for the processing device 140 for determining stable exposure parameters of an X-ray generator of the imaging device for scanning a subject.
The I/O 230 may input and/or output signals, data, information, etc. In some embodiments, the I/O 230 may enable a user interaction with the processing device 140. In some embodiments, the I/O 230 may include an input device and an output device. Exemplary input devices may include a keyboard, a mouse, a touch screen, a microphone,  or the like, or a combination thereof. Exemplary output devices may include a display device, a loudspeaker, a printer, a projector, or the like, or a combination thereof. Exemplary display devices may include a liquid crystal display (LCD) , a light-emitting diode (LED) -based display, a flat panel display, a curved screen, a television device, a cathode ray tube (CRT) , a touch screen, or the like, or a combination thereof.
The communication port 240 may be connected to a network (e.g., the network 120) to facilitate data communications. The communication port 240 may establish connections between the processing device 140 and the imaging device 110, the terminal device 130, and/or the storage device 150. The connection may be a wired connection, a wireless connection, any other communication connection that can enable data transmission and/or reception, and/or any combination of these connections. The wired connection may include, for example, an electrical cable, an optical cable, a telephone wire, or the like, or any combination thereof. The wireless connection may include, for example, a Bluetooth™ link, a Wi-Fi™ link, a WiMax™ link, a WLAN link, a ZigBee™ link, a mobile network link (e.g., 3G, 4G, 5G), or the like, or a combination thereof. In some embodiments, the communication port 240 may be and/or include a standardized communication port, such as RS232, RS485, etc. In some embodiments, the communication port 240 may be a specially designed communication port. For example, the communication port 240 may be designed in accordance with the digital imaging and communications in medicine (DICOM) protocol.
FIG. 3 is a schematic diagram illustrating exemplary hardware and/or software components of an exemplary mobile device according to some embodiments of the present disclosure. In some embodiments, one or more components (e.g., the terminal device 130 and/or the processing device 140) of the imaging system 100 may be implemented on the mobile device 300.
As illustrated in FIG. 3, the mobile device 300 may include a communication platform 310, a display 320, a graphics processing unit (GPU) 330, a central processing unit (CPU) 340, an I/O 350, a memory 360, and a storage 390. In some embodiments, any other suitable component, including but not limited to a system bus or a controller (not shown), may also be included in the mobile device 300. In some embodiments, a mobile operating system 370 (e.g., iOS™, Android™, Windows Phone™) and one or more applications 380 may be loaded into the memory 360 from the storage 390 in order to be executed by the CPU 340. The applications 380 may include a browser or any other suitable mobile apps for receiving and rendering information relating to image processing or other information from the processing device 140. User interactions with the information stream may be achieved via the I/O 350 and provided to the processing device 140 and/or other components of the imaging system 100 via the network 120.
To implement various modules, units, and their functionalities described in the present disclosure, computer hardware platforms may be used as the hardware platform (s) for one or more of the elements described herein. A computer with user interface elements may be used to implement a personal computer (PC) or any other type of work station or terminal device. A computer may also act as a server if appropriately programmed.
FIGs. 4A and 4B are block diagrams illustrating exemplary processing devices according to some embodiments of the present disclosure. The processing devices 140A and 140B may be exemplary processing devices 140 as described in connection with FIG. 1. In some embodiments, the processing device 140A may be configured to generate a stable image of a subject. The processing device 140B may be configured to generate a trained machine learning model for exposure parameter determination. In some embodiments, the processing devices 140A and 140B may be respectively implemented on a processing unit (e.g., the processor 210 illustrated in FIG. 2 or the CPU 340 illustrated in FIG. 3). Merely by way of example, the processing device 140A may be implemented on a CPU 340 of a terminal device, and the processing device 140B may be implemented on a computing device 200. Alternatively, the processing devices 140A and 140B may be implemented on a same computing device 200 or a same CPU 340.
As illustrated in FIG. 4A, the processing device 140A may include an acquisition module 410, an exposure parameter determination module 420, and a scanning module 430.
The acquisition module 410 may be configured to obtain parameter values of one or more acquisition parameters of an imaging device for scanning an object. The acquisition module 410 may further be configured to obtain initial parameter values of exposure parameters of the imaging device and a target dose associated with the object.
The exposure parameter determination module 420 may be configured to determine stable parameter values of the exposure parameters corresponding to the parameter values of the one or more acquisition parameters, the initial parameter values of the exposure parameters, and the target dose by using a trained machine learning model. For example, the exposure parameter determination module 420 may input the parameter values of the acquisition parameter (s) , the initial exposure parameters, and the target dose into the trained machine learning model, and the trained machine learning model may output the stable exposure parameters.
The scanning module 430 may be configured to generate an image (also referred to as a stable image) of the object by directing the imaging device to scan the object based on the stable exposure parameters.
As illustrated in FIG. 4B, the processing device 140B may include an acquisition module 440, a sample generation module 450, and a model determination module 460.
The acquisition module 440 may be configured to obtain multiple groups of training samples. Each group of the multiple groups of training samples may include sample parameter values of one or more acquisition parameters of an imaging device, a sample target dose, sample initial parameter values of exposure parameters of the imaging device, and sample stable parameter values of the exposure parameters. Each group of the multiple groups of training samples may correspond to a sample object.
The sample generation module 450 may be configured to determine a training sample for a trained machine learning model. For example, the sample generation module 450 may obtain sample parameter values of one or more acquisition parameters of an imaging device for scanning a sample object and sample initial parameter values of exposure parameters of the imaging device. The sample generation module 450 may further obtain an initial image that is captured by scanning the sample object based on the sample initial exposure parameters using the imaging device according to the sample parameter values of the acquisition parameter(s). The sample generation module 450 may determine a practical dose corresponding to the initial image. The sample generation module 450 may determine whether a dose difference between the practical dose and a sample target dose is less than a threshold value. In response to a determination that the dose difference is less than the threshold value, the sample generation module 450 may designate the sample initial exposure parameters as sample stable exposure parameters. In response to a determination that the dose difference is not less than the threshold value, the sample generation module 450 may obtain an updated image that is captured by scanning the sample object based on updated parameter values of the exposure parameters (also referred to as updated exposure parameters) using the imaging device according to the sample parameter values of the acquisition parameter(s), and repeat the comparison until the dose difference is less than the threshold value, as shown in the sketch below.
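A compact way to view this sample-generation procedure is as a loop over trial scans. The sketch below is illustrative only; `device.scan`, `device.measure_dose`, and `adjust_exposure` are assumed stand-ins, since the present disclosure does not fix how the exposure values are updated between iterations (e.g., along an ABS curve).

```python
# Hypothetical sketch of generating one group of training samples.

def generate_training_sample(device, acquisition_params, initial_exposure,
                             target_dose, threshold, adjust_exposure):
    exposure = initial_exposure
    while True:
        image = device.scan(acquisition_params, exposure)  # trial scan
        practical_dose = device.measure_dose(image)        # dose on the detector
        if abs(practical_dose - target_dose) < threshold:
            # These exposure values are the sample stable exposure parameters.
            return (acquisition_params, target_dose, initial_exposure, exposure)
        # Dose difference too large: update the exposure values and rescan.
        exposure = adjust_exposure(exposure, practical_dose, target_dose)
```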
The model determination module 460 may be configured to generate a trained machine learning model by training a preliminary machine learning model using the multiple groups of training samples in a training process.
It should be noted that the above description is merely provided for the purposes of illustration, and not intended to limit the scope of the present disclosure. For persons having ordinary skills in the art, multiple variations or modifications may be made under the teachings of the present disclosure. However, those variations and modifications do not depart from the scope of the present disclosure. In some embodiments, the processing device 140A and/or the processing device 140B may share two or more of the modules,  and any one of the modules may be divided into two or more units. For instance, the  processing devices  140A and 140B may share a same acquisition module, that is, the acquisition module 410 and the acquisition module 440 are a same module. In some embodiments, the processing device 140A and/or the processing device 140B may include one or more additional modules, such as a storage module (not shown) for storing data. In some embodiments, the processing device 140A and the processing device 140B may be integrated into one processing device 140.
FIG. 5 is a flowchart illustrating an exemplary process for generating an image of a subject according to some embodiments of the present disclosure. In some embodiments, a process 500 may be implemented as a set of instructions (e.g., an application) stored in the storage device 150, the storage 220, or the storage 390. The processing device 140A (e.g., implemented on the processor 210, the CPU 340, and/or one or more modules illustrated in FIG. 4A) may execute the set of instructions, and when executing the instructions, the processing device 140A may be configured to perform the process 500. The operations of the illustrated process 500 presented below are intended to be illustrative. In some embodiments, the process 500 may be accomplished with one or more additional operations not described, and/or without one or more of the operations discussed. Additionally, the order of the operations of the process 500 illustrated in FIG. 5 and described below is not intended to be limiting.
In 510, the processing device 140A (e.g., the acquisition module 410) may obtain parameter values of one or more acquisition parameters of an imaging device for scanning a subject. The subject may be biological or non-biological. For example, the subject may include a patient (or a portion thereof) , a man-made subject (e.g., a phantom) , etc., as described elsewhere in the present disclosure (e.g., FIG. 1 and the descriptions thereof) . The imaging device may include a ray generator, a detector, a table, etc., in connection with the imaging device 110 as described in FIG. 1. For illustration purposes, an X-ray generator may be taken as an example of the ray generator of the imaging device. For example, the imaging device may include a DR device, a C-arm X-ray device, etc.
In some embodiments, the acquisition parameters may also be referred to as intrinsic characteristic parameters. The intrinsic characteristic parameters may refer to parameters of the imaging device that may not change once they are set during a scanning process. In some embodiments, the acquisition parameter(s) may include a subject type of a subject to be scanned, an acquisition type, a filtration mode, a source image distance (SID), a source subject distance (SOD), an image subject distance (IOD), a focal spot size, a beam field size, or the like, or any combination thereof. The subject type refers to a category to which the subject belongs. For example, the subject type may include a biological type and a non-biological type. As another example, different organs and/or tissues may correspond to different subject types. The acquisition type may be related to an application scenario of the imaging device. In some embodiments, the acquisition type may be related to an X-ray dose on the detector. For example, the acquisition type may include an exposure type and a transmission type. A dose corresponding to the exposure type may be higher than a dose corresponding to the transmission type. The filtration mode may specify a filter material placed in front of the X-ray generator in order to reduce the intensity of particular X-ray wavelengths from its spectrum and selectively alter the distribution of X-ray wavelengths within a given X-ray beam. Exemplary filter materials may include aluminum, copper, silver, iron, and so on. For filters of different filter materials, the thickness of the filters may be different, resulting in different doses of X-rays being transmitted through the filters. The SID refers to a distance between a focal spot of the X-ray generator and the detector (e.g., an imaging region of the detector). The SOD refers to a distance between the focal spot of the X-ray generator and the subject (e.g., a region of interest (ROI) of the subject). The IOD refers to a distance between the subject and the detector (e.g., an imaging region of the detector). In some embodiments, the SID may be equal to a sum of the SOD and IOD, and the focal spot of the X-ray generator, a center of the imaging region of the detector, and a center of the ROI of the subject may be in a straight line. In some embodiments, the acquisition parameters may also include information about the subject. For example, if the subject is a patient, the information may include a body thickness of the patient, an age of the patient, etc.
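As a rough illustration of how these acquisition parameters might be carried through software, consider the container below. The field names, units, and the geometry check are assumptions made for the sketch, not a data model defined by the present disclosure.

```python
from dataclasses import dataclass

@dataclass
class AcquisitionParams:
    subject_type: str       # e.g., "head", "heart", or "phantom"
    acquisition_type: str   # e.g., "exposure" or "transmission"
    filtration_mode: str    # filter material, e.g., "aluminum", "copper"
    sid_mm: float           # source image distance (SID)
    sod_mm: float           # source subject distance (SOD)
    iod_mm: float           # image subject distance (IOD)
    focal_spot_mm: float    # focal spot size
    beam_field_mm: tuple    # beam field size, e.g., (width, height)

    def geometry_is_consistent(self, tol_mm: float = 1.0) -> bool:
        # Per the description above, the SID may equal the sum of SOD and IOD.
        return abs(self.sid_mm - (self.sod_mm + self.iod_mm)) <= tol_mm
```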
In some embodiments, different subjects to be scanned may correspond to different parameter values of the acquisition parameter (s) . That is, the parameter values of the acquisition parameter (s) may be associated with the subject type of the subject to be scanned. For example, parameter values of the acquisition parameter (s) for scanning the head may be different from parameter values of the acquisition parameter (s) for scanning the heart. In some embodiments, the parameter values of the acquisition parameter (s) may be set according to a default setting of the imaging system 100 or preset by a user or operator via the terminal device 130. For example, multiple sets of candidate parameter values of the acquisition parameter (s) may be stored in a storage device (e.g., the storage device 150) as multiple scan protocols. The processing device 140A may determine the parameter values of the acquisition parameter (s) by selecting a scan protocol from the storage device. In some embodiments, a user (e.g., a doctor) may adjust the parameter values in the scan protocol via the terminal device 130.
In 520, the processing device 140A (e.g., the acquisition module 410) may obtain initial parameter values of exposure parameters of the imaging device and a target dose associated with the subject. In some embodiments, the initial parameter values of the exposure parameters may also be referred to as initial exposure parameters.
As used herein, the exposure parameters may refer to parameters of an X-ray generator (e.g., a high voltage generator) of the imaging device used to capture an image of the subject. Parameter values of the exposure parameters may affect an X-ray dose on the detector of the imaging device and a brightness of an image of the subject obtained based on the parameter values of the exposure parameters. The brightness of the image may refer to an average value of gray values of pixels in at least a portion (e.g., an ROI) of the image. It should be noted that a dose (or an X-ray dose) disclosed in the present disclosure may refer to a dose on the detector of the imaging device. The target dose may refer to a desired dose on the detector of the imaging device. The target dose may be related to a target brightness of a desired image (e.g., an image whose resolution exceeds a resolution threshold). An X-ray dose corresponding to the desired image may be close to or the same as the target dose. As used herein, “close to or the same as” may indicate that the deviation between the X-ray dose corresponding to the desired image and the target dose does not exceed a threshold, e.g., 30%, 20%, 15%, 10%, or 5% of one of the two doses.
In some embodiments, the exposure parameters of the imaging device may include a tube voltage of the X-ray generator, a tube current of the X-ray generator, a radiation time, etc. The tube voltage refers to a voltage between a cathode of the X-ray generator and an anode of the X-ray generator while the X-ray generator is emitting X-ray beams. The tube current refers to a current between the cathode of the X-ray generator and the anode of the X-ray generator while the X-ray generator is emitting X-ray beams.
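In the same illustrative style, the exposure parameters named above could be grouped as follows; the field names and units are assumptions of the sketch only.

```python
from dataclasses import dataclass

@dataclass
class ExposureParams:
    tube_voltage_kv: float    # voltage between cathode and anode during emission
    tube_current_ma: float    # current between cathode and anode during emission
    radiation_time_ms: float  # duration of the exposure
```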
In some embodiments, different subjects to be scanned may correspond to different initial exposure parameters. For example, initial exposure parameters for scanning the head may be different from initial exposure parameters for scanning the heart. In some embodiments, the initial exposure parameters and/or the target dose may be set according to a default setting of the imaging system 100 or preset by a user or operator via the terminal device 130. For example, multiple sets of candidate initial exposure parameters may also be stored in the storage device. The processing device 140A may select the initial exposure parameters from the multiple sets of candidate initial exposure parameters based on the subject type of the subject. In some embodiments, each set of candidate initial exposure parameters may be stored with each set of candidate parameter values of the acquisition parameter (s) as a scan protocol. That is, each scan protocol may include a set of candidate initial exposure parameters and a set of candidate parameter values of the acquisition parameter (s) . The processing device 140A may directly determine the parameter values of the acquisition parameter (s) and the initial exposure parameters by selecting the corresponding scan protocol. In some  embodiments, a user (e.g., a doctor) may adjust the initial exposure parameters in the scan protocol via the terminal device 130.
In some embodiments, if the subject is a patient, the initial exposure parameters may be determined based on one or more experiments on a phantom (e.g., water). Specifically, before scanning the patient, the imaging device 110 may scan the phantom under different sets of parameter values of the exposure parameters to generate different initial images. The processing device 140A may determine parameter values of the exposure parameters corresponding to an initial image that satisfies a preset condition (e.g., a brightness greater than a brightness threshold) as the initial exposure parameters.
In 530, the processing device 140A (e.g., the exposure parameter determination module 420) may determine stable parameter values of the exposure parameters corresponding to the parameter values of the one or more acquisition parameters, the initial parameter values of the exposure parameters, and the target dose by using a trained machine learning model. In some embodiments, the stable parameter values of the exposure parameters may also be referred to as stable exposure parameters.
The processing device 140A may input the parameter values of the acquisition parameter(s), the initial exposure parameters, and the target dose into the trained machine learning model, and the trained machine learning model may output the stable exposure parameters. As used herein, the stable exposure parameters may refer to parameters of the X-ray generator for generating a stable image for which a dose difference between an X-ray dose (also referred to as a practical dose) corresponding to the stable image and the target dose is less than a threshold value. That the stable exposure parameters correspond to the parameter values of the acquisition parameter(s), the initial exposure parameters, and the target dose may refer to the fact that the stable exposure parameters are those the imaging device would determine based on the initial exposure parameters and the target dose by performing an iteration process as illustrated in FIG. 7.
In some embodiments, the trained machine learning model may include a deep learning model, a neural network model, etc. The trained machine learning model may be generated by training a preliminary machine learning model using multiple groups of training samples. Each group of the training samples may include sample parameter values of the acquisition parameter(s), sample initial exposure parameters, a sample target dose, and sample stable exposure parameters. The sample stable exposure parameters may correspond to the sample parameter values of the acquisition parameter(s), the sample initial exposure parameters, and the sample target dose. During the training, the sample parameter values of the acquisition parameter(s), the sample initial exposure parameters, and the sample target dose may be used as an input of the preliminary machine learning model, and the sample stable exposure parameters may be used as the desired outputs (i.e., the ground truth) of the preliminary machine learning model. More descriptions regarding the training of the preliminary machine learning model may be found elsewhere in the present disclosure (e.g., FIG. 6 and the descriptions thereof).
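The disclosure fixes which quantities serve as model inputs and which serve as the ground truth, but not their numeric encoding. One plausible flattening, assuming the categorical acquisition parameters (e.g., subject type, filtration mode) have already been encoded as numbers, might look like the following.

```python
import numpy as np

def to_training_pair(acq_values, initial_exposure, target_dose, stable_exposure):
    # Model input: acquisition parameter values + initial exposure parameter
    # values + target dose, concatenated into one feature vector.
    x = np.asarray(list(acq_values) + list(initial_exposure) + [target_dose],
                   dtype=np.float32)
    # Desired output (ground truth): the sample stable exposure parameters.
    y = np.asarray(stable_exposure, dtype=np.float32)
    return x, y
```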
In some embodiments, the processing device 140A may obtain the trained machine learning model from a model library. The model library may include a plurality of candidate trained machine learning models each of which corresponds to a subject type. For example, the model library may include trained machine learning models for the head, an arm, the chest, the heart, the stomach, etc. The processing device 140A may identify the trained machine learning model for the subject from the model library based on the subject type of the subject. It should be noted that different trained machine learning models in the model library may be trained based on different training samples. For example, for determining the trained machine learning model for the heart, multiple groups of training samples of hearts may be used to train a first preliminary machine learning model. As another example, for determining the trained machine learning model for the head, multiple groups of training samples of heads may be used to train a second preliminary machine learning model. In some embodiments, the first preliminary machine  learning model, the second preliminary machine learning model, and the preliminary machine learning model of the trained machine learning model may be different or the same.
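A subject-type-keyed model library could be as simple as a mapping from subject types to stored models. The paths and the loader below are purely illustrative assumptions.

```python
# Hypothetical model library: one candidate trained model per subject type.
MODEL_LIBRARY = {
    "head": "models/head_exposure_model.pt",
    "heart": "models/heart_exposure_model.pt",
    "chest": "models/chest_exposure_model.pt",
}

def select_model(subject_type, load_fn):
    """Identify the trained model for a subject type; `load_fn` is whatever
    deserializer the deployment uses (an assumption of this sketch)."""
    try:
        return load_fn(MODEL_LIBRARY[subject_type])
    except KeyError:
        raise ValueError(f"no trained model for subject type {subject_type!r}")
```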
In some embodiments, the processing device 140A may transmit the stable exposure parameters to a terminal device (e.g., the terminal device 130 in the imaging system 100) . A user may further adjust the stable exposure parameters via the terminal device.
In 540, the processing device 140A (e.g., the scanning module 430) may generate an image (also referred to as a stable image) of the subject by directing the imaging device to scan the subject based on the stable exposure parameters.
The imaging device may scan the subject using the parameter values of the acquisition parameter(s) and the stable exposure parameters. A dose difference between an X-ray dose corresponding to the stable image and the target dose may be less than a threshold value. For example, the dose difference may be smaller than or equal to a certain percentage (e.g., 5%, 10%, 15%, 20%, etc.) of the target dose. In some embodiments, if the stable image does not meet a requirement of the user, the user may further adjust the exposure parameters to obtain a new image.
It should be noted that the above description is merely provided for the purposes of illustration, and not intended to limit the scope of the present disclosure. For persons having ordinary skills in the art, multiple variations and modifications may be made under the teachings of the present disclosure. However, those variations and modifications do not depart from the scope of the present disclosure. In some embodiments, one or more operations may be omitted and/or one or more additional operations may be added. For example, operation 510 and operation 520 may be combined into a single operation. As still another example, the process 500 may include an additional operation to transmit the stable image to the terminal device 130 for display. In some embodiments, the process 500 may be executed by a cloud server to reduce the performance requirements and/or the workload of a local processing device.
In some embodiments, a special trained machine learning model may be generated by training a special preliminary machine learning model using a plurality of groups of special training samples. Each group of the special training samples may include sample parameter values of the acquisition parameter(s), sample initial exposure parameters, a sample target brightness, and sample stable exposure parameters. The sample stable exposure parameters may correspond to the sample parameter values of the acquisition parameter(s), the sample initial exposure parameters, and the sample target brightness. During the training, the sample parameter values of the acquisition parameter(s), the sample initial exposure parameters, and the sample target brightness may be used as an input of the special preliminary machine learning model, and the sample stable exposure parameters may be used as the desired outputs (i.e., the ground truth) of the special preliminary machine learning model. The processing device 140A may convert the target dose into a target brightness, and may then input the parameter values of the acquisition parameters, the initial exposure parameters, and the target brightness into the special trained machine learning model to determine the stable exposure parameters.
FIG. 6 is a flowchart illustrating an exemplary process for generating a trained machine learning model for determining exposure parameters according to some embodiments of the present disclosure. In some embodiments, a process 600 may be implemented as a set of instructions (e.g., an application) stored in the storage device 150, the storage 220, and/or the storage 390. The processing device 140B (e.g., implemented on the processor 210, the CPU 340, and/or one or more modules illustrated in FIG. 4B) may execute the set of instructions, and when executing the instructions, the processing device 140B may be configured to perform the process 600. The operations of the illustrated process presented below are intended to be illustrative. In some embodiments, the process 600 may be accomplished with one or more additional operations not described and/or without one or more of the operations discussed. Additionally, the order of the operations of the process 600 illustrated in FIG. 6 and described below is not intended to be limiting. In some embodiments, the trained machine learning model described in connection with operation 530 in FIG. 5 may be obtained according to the process 600. In some embodiments, the process 600 may be performed by another device or system other than the imaging system 100, e.g., a device or system of a vendor or a manufacturer. For illustration purposes, the implementation of the process 600 by the processing device 140B is described as an example.
In 610, the processing device 140B (e.g., the acquisition module 440) may obtain multiple groups of training samples. Each group of the multiple groups of training samples may include sample parameter values of one or more acquisition parameters of an imaging device (e.g., the imaging device 110), a sample target dose, sample initial parameter values of exposure parameters (also referred to as sample initial exposure parameters) of the imaging device, and sample stable parameter values of the exposure parameters (also referred to as sample stable exposure parameters). Each group of the multiple groups of training samples may correspond to a sample subject. The sample parameter values of the acquisition parameter(s), the sample initial exposure parameters, and the sample target dose in each group of the multiple groups of training samples may be used as an input of a preliminary machine learning model during a training process of the preliminary machine learning model, and the sample stable exposure parameters corresponding to the sample parameter values of the acquisition parameter(s), the sample initial exposure parameters, and the sample target dose in each group may be used as the desired outputs of the preliminary machine learning model during the training process.
In some embodiments, the sample subject corresponding to one or more of the multiple groups of training samples may be a subject as described elsewhere in the present disclosure (e.g., FIG. 5 and the descriptions thereof). The sample stable exposure parameters may be determined based on the sample parameter values of the acquisition parameter(s), the sample target dose, and the sample initial exposure parameters. For example, the sample stable exposure parameters may be determined by using the imaging device to perform an iteration process on the sample subject based on the sample parameter values of the acquisition parameter(s) and the sample initial exposure parameters. When the imaging device scans the sample subject based on the sample parameter values of the acquisition parameter(s) and the sample stable exposure parameters, a dose difference between an X-ray dose corresponding to the generated image and the sample target dose may be less than a threshold value. For example, the dose difference may be smaller than or equal to a certain percentage (e.g., 5%, 10%, 15%, 20%, etc.) of the sample target dose. More descriptions for determining the sample stable exposure parameters of each group of training samples may be found in FIG. 7 and the descriptions thereof.
In 620, the processing device 140B (e.g., the model determination module 460) may generate a trained machine learning model by training a preliminary machine learning model using the multiple groups of training samples in a training process. In some embodiments, the trained machine learning model may be constructed based on a convolutional neural network (CNN) model, a fully convolutional neural network (FCN) model, a recurrent neural network (RNN) model, a fast-RCNN model, a generative adversarial network (GAN) model, or the like, or any combination thereof.
In some embodiments, the preliminary machine learning model may include a plurality of parameters, such as architecture parameters, learning parameters, etc. The plurality of parameters may also be referred to as training parameters. One or more parameter values of the plurality of parameters (e.g., the learning parameters) may be altered during the training of the preliminary machine learning model using the multiple groups of training samples. The parameter values of the plurality of parameters may be initialized, set, and/or adjusted before the training of the preliminary machine learning model to obtain an initialized preliminary machine learning model. Exemplary parameters of the preliminary machine learning model may include the size of a kernel of a layer, the total count (or number) of layers, the count (or number) of nodes in each layer, a learning  rate, a batch size, an epoch, a connected weight between two connected nodes, a bias vector relating to a node, etc.
The preliminary machine learning model may be trained based on the multiple groups of training samples using a machine learning algorithm. Exemplary machine learning algorithms may include a gradient descent algorithm, Newton’s algorithm, an artificial neural network algorithm, a deep learning algorithm, a decision tree algorithm, an association rule algorithm, an inductive logic programming algorithm, a support vector machine algorithm, a clustering algorithm, a Bayesian network algorithm, a reinforcement learning algorithm, a representation learning algorithm, a similarity and metric learning algorithm, a sparse dictionary learning algorithm, a genetic algorithm, a rule-based machine learning algorithm, or the like, or any combination thereof. In some embodiments, the trained machine learning model may be obtained by performing a plurality of iterations to iteratively update one or more parameter values of the preliminary machine learning model. Before the plurality of iterations start, the parameter values of the preliminary machine learning model may be initialized. For example, the connected weights and/or the bias vector of nodes of the preliminary machine learning model may be initialized by assigning random values in a range, e.g., the range from -1 to 1. As another example, all the connected weights of the preliminary machine learning model may be assigned a same value in the range from -1 to 1, for example, 0. As another example, the bias vector of nodes in the preliminary machine learning model may be initialized by assigning random values in a range from 0 to 1. In some embodiments, the parameter values of the preliminary machine learning model may be initialized based on a Gaussian random algorithm, a Xavier algorithm, etc.
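The two initialization schemes mentioned above (uniform random values in a fixed range, and Xavier initialization) can be sketched as follows. The fan-in/fan-out scaling shown is the standard Xavier-uniform rule; it is an assumption insofar as the disclosure does not specify which Xavier variant is used.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

def init_layer(fan_in, fan_out, scheme="xavier"):
    if scheme == "uniform":
        # Connected weights drawn uniformly from the range [-1, 1].
        w = rng.uniform(-1.0, 1.0, size=(fan_in, fan_out))
    else:
        # Xavier (Glorot) uniform: the scale keeps activation variance
        # roughly constant across layers.
        limit = np.sqrt(6.0 / (fan_in + fan_out))
        w = rng.uniform(-limit, limit, size=(fan_in, fan_out))
    # Bias vector initialized with random values in the range [0, 1].
    b = rng.uniform(0.0, 1.0, size=fan_out)
    return w, b
```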
For each of the plurality of iterations, a specific group of training samples may first be input into the preliminary machine learning model. For example, specific sample parameter values of the acquisition parameter(s), specific sample initial exposure parameters, and a specific sample target dose in a specific group of training samples may be input into an input layer (e.g., the input layer shown in FIG. 8) of the preliminary machine learning model, and the specific sample stable exposure parameters corresponding to the specific sample parameter values of the acquisition parameter(s), the specific sample initial exposure parameters, and the specific sample target dose may be input into an output layer (e.g., the output layer shown in FIG. 8) of the preliminary machine learning model as desired outputs of the preliminary machine learning model. The specific sample parameter values of the acquisition parameter(s), the specific sample initial exposure parameters, and the specific sample target dose may be processed by one or more layers of the preliminary machine learning model to generate estimated stable exposure parameters. The estimated stable exposure parameters may be compared with the specific sample stable exposure parameters based on a cost function (also referred to as a loss function) of the preliminary machine learning model. The cost function of the preliminary machine learning model may be configured to assess a difference between estimated values (e.g., predicted outputs or estimated stable exposure parameters) of the preliminary machine learning model and expected values (e.g., the desired outputs or the input specific sample stable exposure parameters). The value of the cost function may be used to evaluate the accuracy and reliability of the updated preliminary machine learning model; for example, the smaller the value of the cost function is, the more reliable the updated preliminary machine learning model is. Exemplary cost functions may include an L1 loss function, a focal loss function, a log loss function, a cross-entropy loss function, a Dice loss function, etc. If the value of the cost function exceeds a threshold in a current iteration, the parameter values of the preliminary machine learning model may be adjusted and/or updated in order to decrease the value of the cost function (i.e., the difference between the predicted outputs and the input specific sample stable exposure parameters) to below the threshold. Accordingly, in a next iteration, another group of training samples may be input into the preliminary machine learning model to train the preliminary machine learning model as described above.
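A minimal sketch of one such training iteration is given below, assuming a PyTorch-style regression model. The input width of 9 (e.g., five acquisition parameter values, one target dose, and three initial exposure parameters), the L1 cost function, the optimizer, and the threshold are illustrative assumptions:

    import torch
    import torch.nn as nn

    # Hypothetical preliminary model: the input layer receives the sample
    # parameter values, initial exposure parameters, and target dose; the
    # output layer produces three estimated stable exposure parameters
    # (tube voltage, tube current, radiation time).
    model = nn.Sequential(nn.Linear(9, 32), nn.ReLU(), nn.Linear(32, 3))
    cost_fn = nn.L1Loss()                    # one of the cost functions listed above
    optimizer = torch.optim.SGD(model.parameters(), lr=1e-3)
    threshold = 0.01                         # illustrative cost threshold

    def train_iteration(x, y_stable):
        """One iteration: estimate, compare with the desired outputs, update."""
        estimated = model(x)                 # estimated stable exposure parameters
        cost = cost_fn(estimated, y_stable)  # difference from sample stable values
        if cost.item() > threshold:          # adjust parameter values if cost too high
            optimizer.zero_grad()
            cost.backward()
            optimizer.step()
        return cost.item()

    # Example: one group of training samples as a (features, desired outputs) pair.
    cost = train_iteration(torch.randn(1, 9), torch.randn(1, 3))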
The plurality of iterations may be performed to update the parameter values of the preliminary machine learning model until a termination condition is satisfied. The termination condition may provide an indication of whether the preliminary machine learning model is sufficiently trained. The termination condition may relate to the cost function or an iteration count of the iterative process or training process. For example, the termination condition may be satisfied if the value of the cost function associated with the preliminary machine learning model is minimal or smaller than a threshold (e.g., a constant). As another example, the termination condition may be satisfied if the value of the cost function converges. The convergence may be deemed to have occurred if the variation of the values of the cost function in two or more consecutive iterations is smaller than a threshold (e.g., a constant). As still another example, the termination condition may be satisfied when a specified number (or count) of iterations has been performed in the training process. The trained machine learning model may be determined based on the updated parameter values. In some embodiments, the trained machine learning model may be transmitted to a storage device (e.g., the storage device 150, the storage 220, the storage 390, etc.) for storage.
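The termination check itself may be expressed as a small helper; a sketch under the assumption that cost values are collected per iteration, with illustrative tolerance and iteration limits:

    def should_stop(cost_history, tol=1e-4, window=3, max_iterations=10000):
        """Return True if the training process may terminate: either a
        specified count of iterations has been performed, or the variation
        of the cost values over consecutive iterations is below tol
        (i.e., the cost function has converged)."""
        if len(cost_history) >= max_iterations:
            return True
        if len(cost_history) >= window:
            recent = cost_history[-window:]
            return (max(recent) - min(recent)) < tol
        return False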
In some embodiments, during the training of the preliminary machine learning model, the processing device 140B may configure a first weight for the sample parameter values of the acquisition parameter (s) , a second weight for the sample target dose, and a third weight for the sample initial exposure parameters. Both the first weight and the second weight are greater than the third weight. The processing device 140B may train the preliminary machine learning model based on the sample parameter values of the acquisition parameter (s) with the first weight, the sample target dose with the second weight, and the sample initial exposure parameters with the third weight.
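A sketch of this input weighting, assuming the three input groups are tensors and using illustrative numeric weights (any values with the first and second weights greater than the third would do):

    import torch

    def weight_inputs(acq_params, target_dose, init_exposure,
                      first_w=1.0, second_w=1.0, third_w=0.5):
        """Scale the acquisition parameter values, the target dose, and the
        initial exposure parameters by the first, second, and third weights,
        then concatenate them into a single model input."""
        return torch.cat([first_w * acq_params,
                          second_w * target_dose,
                          third_w * init_exposure], dim=-1)

    # Example: five acquisition parameter values, one target dose, and
    # three initial exposure parameter values form a 9-element input.
    x = weight_inputs(torch.randn(5), torch.tensor([0.8]), torch.randn(3))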
It should be noted that the above description is merely provided for the purposes of illustration, and not intended to limit the scope of the present disclosure. For persons having ordinary skills in the art, multiple variations and modifications may be made under the teachings of the present disclosure. However, those variations and modifications do not depart from the scope of the present disclosure. In some embodiments, one or more operations may be added or omitted. For example, after the trained machine learning model is generated, the processing device 140B may further test the trained machine learning model using a set of testing samples. Additionally or alternatively, the processing device 140B may update the trained machine learning model periodically or irregularly based on one or more newly-generated training samples. For example, the processing device 140B may update the trained machine learning model at a certain time interval (e.g., per month, per two months) based on a plurality of newly-generated training samples.
FIG. 7 is a flowchart illustrating an exemplary process for determining a training sample for a trained machine learning model according to some embodiments of the present disclosure. In some embodiments, a process 700 may be implemented as a set of instructions (e.g., an application) stored in the storage device 150, storage 220, and/or storage 390. The processing device 140B (e.g., implemented on the processor 210, the CPU 340, and/or one or more modules illustrated in FIG. 4B) may execute the set of instructions, and when executing the instructions, the processing device 140B may be configured to perform the process 700. The operations of the illustrated process presented below are intended to be illustrative. In some embodiments, the process 700 may be accomplished with one or more additional operations not described and/or without one or more of the operations discussed. Additionally, the order of the operations of process 700 illustrated in FIG. 7 and described below is not intended to be limiting. In some embodiments, a group of training samples among the multiple groups of training samples described elsewhere in the present disclosure (e.g., operation 610 illustrated in FIG. 6) may be determined according to the process 700. In some embodiments, the process 700 may be performed by another device or system other than the imaging system 100, e.g., a device or system of a vendor or a manufacturer.
In 710, the processing device 140B (e.g., the acquisition module 440) may obtain sample parameter values of one or more acquisition parameters of an imaging device for scanning a sample subject. The sample subject may be the same as a subject as described elsewhere in the present disclosure (e.g., FIG. 5 and the descriptions thereof). The acquisition parameter(s) of the imaging device may be the same as or similar to the acquisition parameter(s) described in FIG. 5. For example, the acquisition parameter(s) may include a subject type of the sample subject, an acquisition type, a filtration mode, a source image distance (SID), a source subject distance (SOD), an image subject distance (IOD), a focal spot size, a beam field size, or the like, or any combination thereof. As another example, the sample parameter values of the acquisition parameter(s) may be set according to a default setting of the imaging system 100 or preset by a user or operator via the terminal device 130.
In 720, the processing device 140B (e.g., the acquisition module 440) may obtain sample initial parameter values of exposure parameters of the imaging device. In some embodiments, the exposure parameters of the imaging device may include parameters of an X-ray generator of the imaging device, such as a tube voltage of the X-ray generator, a tube current of the X-ray generator, a radiation time, etc.
In some embodiments, the sample initial parameter values of the exposure parameters may also be referred to as sample initial exposure parameters. In some embodiments, the sample initial exposure parameters may be set according to a default setting of the imaging system 100 or preset by a user or operator via the terminal device 130.
In 730, the processing device 140B (e.g., the sample generation module 450) may obtain an initial image that is captured by scanning the sample subject based on the sample initial exposure parameters using the imaging device according to the sample parameter values of the acquisition parameter (s) . In other words, the processing device 140B may direct the imaging device to scan the sample subject using the sample initial exposure parameters and the sample parameter values of the acquisition parameter (s) to generate the initial image.
The initial image may include a representation of the sample subject using a plurality of pixels or voxels with gray values. In some embodiments, the processing device 140B may obtain the initial image from the imaging device (e.g., the imaging device 110), the storage device 150, or any other storage device. For example, the imaging device may transmit acquired initial imaging data (e.g., projection data) to the storage device 150 or any other storage device for storage. The processing device 140B may obtain the initial imaging data from the storage device 150 or any other storage device and generate the initial image based on the initial imaging data. As another example, the processing device 140B may obtain the initial image from the imaging device directly.
In 740, the processing device 140B (e.g., the sample generation module 450) may determine a practical dose corresponding to the initial image.
In some embodiments, the processing device 140B may determine the practical dose based on the initial image and one or more conversion coefficients associated with the imaging device. The one or more conversion coefficients may be related to a detector, a filter, etc., of the imaging device. For example, the processing device 140B may determine an average value of gray values of pixels in at least a portion (e.g., an ROI) of the initial image. The processing device 140B may directly obtain a conversion coefficient between the average value and the practical dose. The conversion coefficient may be related to a gain of the detector of the imaging device. In some embodiments, the conversion coefficient between the average value and the practical dose may be related to a first conversion coefficient related to the gain of the detector of the imaging device and a second conversion coefficient related to the filter (e.g., a copper filter) of the imaging device. In such cases, the processing device 140B may determine the practical dose based on Equation (1) as follows:
$A_{gray} = P_{dose} \times C_1 \times C_2$         (1)
where $A_{gray}$ denotes the average value of gray values of pixels in the at least one portion of the initial image, $P_{dose}$ denotes the practical dose, $C_1$ denotes the first conversion coefficient related to the gain of the detector of the imaging device, and $C_2$ denotes the second conversion coefficient related to the filter of the imaging device. In some embodiments, the first conversion coefficient and/or the second conversion coefficient may be set according to a default setting of the imaging system 100 or preset by a user or operator via the terminal device 130.
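Rearranging Equation (1) gives the practical dose from the measured gray values. A sketch follows, with hypothetical conversion coefficients and a hypothetical ROI (the disclosure does not fix their values):

    import numpy as np

    def practical_dose(image, roi, c1, c2):
        """Invert Equation (1): P_dose = A_gray / (C1 x C2), where A_gray
        is the average gray value of pixels in the given portion (ROI) of
        the initial image."""
        a_gray = float(np.mean(image[roi]))
        return a_gray / (c1 * c2)

    # Example with an illustrative 256x256 image and a central ROI.
    image = np.random.default_rng(0).uniform(0, 4095, size=(256, 256))
    dose = practical_dose(image, (slice(64, 192), slice(64, 192)), c1=2.5, c2=0.8)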
In 750, the processing device 140B (e.g., the sample generation module 450) may determine whether a dose difference between the practical dose and a sample target dose is less than a threshold value. For example, the dose difference may be smaller than or equal to a certain percentage (e.g., 5%, 10%, 15%, 20%, etc.) of the sample target dose. In response to a determination that the dose difference is less than the threshold value, the processing device 140B may designate the sample initial exposure parameters as sample stable exposure parameters in operation 760. On the other hand, in response to a determination that the dose difference is not less than the threshold value, the processing device 140B may return to operation 730 of the process 700 to obtain an updated image that is captured by scanning the sample subject based on updated parameter values of the exposure parameters (also referred to as updated exposure parameters) using the imaging device according to the sample parameter values of the acquisition parameter(s).
The processing device 140B may also determine an updated practical dose corresponding to the updated image. The processing device 140B may also determine whether an updated dose difference between the updated practical dose and the sample target dose is less than the threshold value. In response to a determination that the updated dose difference is less than the threshold value, the processing device 140B may designate the updated exposure parameters as the sample stable exposure parameters in operation 760. On the other hand, in response to a determination that the updated dose difference is not less than the threshold value, the processing device 140B may obtain a further updated image by scanning the sample subject based on further updated exposure parameters using the imaging device according to the sample parameter values of the acquisition parameter(s), and repeat the determination until the resulting dose difference is less than the threshold value.
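The loop formed by operations 730 through 760 may be sketched as follows. The callables scan(), dose_of(), and next_exposure() are hypothetical stand-ins for the imaging device, the practical-dose computation of operation 740, and the ABS-curve update described below:

    def find_stable_exposure(scan, dose_of, next_exposure,
                             initial_exposure, target_dose, threshold):
        """Iterate operations 730-760: scan, compute the practical dose,
        and update the exposure parameters until the dose difference is
        less than the threshold value."""
        exposure = initial_exposure
        while True:
            image = scan(exposure)                          # operation 730
            dose_difference = dose_of(image) - target_dose  # operation 740
            if abs(dose_difference) < threshold:            # operation 750
                return exposure                             # operation 760
            exposure = next_exposure(exposure, dose_difference)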
In some embodiments, the updated exposure parameters may be determined based on a previous dose difference determined in a previous iteration and an automatic brightness stabilization (ABS) curve of the X-ray generator of the imaging device. For example, the ABS curve of the X-ray generator may include a plurality of sets of parameter values of the exposure parameters. The processing device 140B may identify a current position of the sample initial exposure parameters on the ABS curve. The processing device 140B may convert the sample target dose into a sample target brightness. The processing device 140B may compare a brightness of the initial image with the sample target brightness. If the brightness of the initial image is greater than the sample target brightness, the processing device 140B may determine the parameter values at the next position downward from the current position along the ABS curve as the updated exposure parameters. If the brightness of the initial image is less than the sample target brightness, the processing device 140B may determine the parameter values at the next position upward from the current position along the ABS curve as the updated exposure parameters.
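A sketch of one step along the ABS curve, representing the curve as an ordered list of (tube voltage, tube current, radiation time) tuples; the representation and index handling are illustrative assumptions:

    def step_on_abs_curve(abs_curve, position, brightness, target_brightness):
        """Move to the next position downward (if the image is brighter than
        the sample target brightness) or upward (if darker) along the ABS
        curve, and return the parameter values at the new position."""
        if brightness > target_brightness:
            position = max(position - 1, 0)
        elif brightness < target_brightness:
            position = min(position + 1, len(abs_curve) - 1)
        return abs_curve[position], position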
In some embodiments, the processing device 140B may determine a ratio of the practical dose to the sample target dose. The processing device 140B may determine whether an absolute value of a difference between the ratio and one is less than a second threshold value (e.g., 0.05, 0.1, 0.15, 0.2, etc.). In response to a determination that the absolute value is less than the second threshold value, the processing device 140B may designate the sample initial exposure parameters as sample stable exposure parameters in operation 760. In response to a determination that the absolute value is not less than the second threshold value, the processing device 140B may return to operation 730 as described above.
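This ratio-based test compares a relative dose difference against the second threshold value; a one-function sketch with an illustrative default:

    def ratio_criterion(practical_dose, sample_target_dose, second_threshold=0.1):
        """Return True if |practical/target - 1| is less than the second
        threshold value (e.g., 0.1)."""
        return abs(practical_dose / sample_target_dose - 1.0) < second_threshold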
In some embodiments, the processing device 140B may determine the sample parameter values of the acquisition parameters, the sample initial exposure parameters, the sample target dose, and the sample stable exposure parameters as a group of training samples for training a preliminary machine learning model. In some embodiments, the processing device 140B may transmit the group of training samples to the storage device 150, the storage 220, the storage 390, or any other storage device via the network 120.
It should be noted that the above description is merely provided for the purposes of illustration, and not intended to limit the scope of the present disclosure. For persons having ordinary skills in the art, multiple variations and modifications may be made under the teachings of the present disclosure. However, those variations and modifications do not depart from the scope of the present disclosure. In some embodiments, one or more operations may be omitted and/or one or more additional operations may be added. For example, operation 710 and operation 720 may be combined into a single operation. As another example, one or more other optional operations (e.g., a storing operation) may be added elsewhere in the process 700. In the storing operation, the processing device 140B may store information and/or data associated with the imaging system 100 in a storage device (e.g., the storage device 150) disclosed elsewhere in the present disclosure.
In some embodiments, the sample initial parameter values of the exposure parameters and the sample target dose may be designated as control factor(s) for optimizing the ABS curve of the imaging device to obtain an optimized ABS curve. In some embodiments, for another training sample, the processing device 140B may update exposure parameters based on the optimized ABS curve. In some embodiments, the sample parameter values of the acquisition parameters of the imaging device may also be designated as the control factor(s) for optimizing the ABS curve. In some embodiments, the sample initial exposure parameter values and the sample target dose may be designated as main control factors, and each main control factor may be configured with a fourth weight. The sample parameter values of the acquisition parameters may be designated as auxiliary control factor(s), and each auxiliary control factor may be configured with a fifth weight. The fourth weight may be greater than the fifth weight. In some embodiments, the ABS curve may be optimized using a neural network model. For example, the main control factor(s) and the auxiliary control factor(s) may be input into the neural network model. The neural network model may learn from the input main control factor(s) and auxiliary control factor(s) to adjust the ABS curve, for example, to correct data (e.g., a tube voltage, a tube current, or a radiation time) of one or more discrete points on the ABS curve.
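A sketch of such an ABS-curve correction network follows, assuming a curve of 16 discrete (tube voltage, tube current, radiation time) points, four main control factors (three initial exposure values plus the target dose), five auxiliary control factors, and illustrative fourth/fifth weights:

    import torch
    import torch.nn as nn

    N_POINTS = 16  # hypothetical count of discrete points on the ABS curve

    # Hypothetical corrector network: outputs per-point corrections to the
    # (kV, mA, ms) values of the discrete points on the ABS curve.
    corrector = nn.Sequential(nn.Linear(9, 64), nn.ReLU(),
                              nn.Linear(64, N_POINTS * 3))

    def optimize_abs_curve(abs_curve, main_factors, aux_factors,
                           fourth_w=1.0, fifth_w=0.3):
        """Weight the main and auxiliary control factors (fourth_w > fifth_w)
        and add the predicted corrections to the curve's discrete points."""
        x = torch.cat([fourth_w * main_factors, fifth_w * aux_factors], dim=-1)
        corrections = corrector(x).reshape(N_POINTS, 3)
        return abs_curve + corrections

    # Example with an illustrative curve and random control factors.
    curve = optimize_abs_curve(torch.zeros(N_POINTS, 3),
                               torch.randn(4), torch.randn(5))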
FIG. 8 is a schematic diagram illustrating an exemplary trained machine learning model according to some embodiments of the present disclosure. The trained machine learning model 800 may include an input layer 810, a hidden layer 820, and an output layer 830. The hidden layer 820 may include a plurality of hidden sub-layers including one or more convolutional layers, one or more rectified linear unit (ReLU) layers, one or more pooling layers, one or more fully connected layers, or the like, or a combination thereof.
As described in connection with process 500, the processing device 140A may acquire parameter values of acquisition parameters (e.g., a subject type, an acquisition type (AcqType), an SID, an SOD, a filtration mode) of an imaging device, a target dose, and initial exposure parameters (e.g., an initial tube voltage (kV_initial), an initial tube current (mA_initial), and an initial radiation time (ms_initial)) of the imaging device as inputs of the trained machine learning model. The parameter values of the acquisition parameters, the target dose, and the initial exposure parameters may be denoted by a plurality of initial vectors. At least a portion of the plurality of initial vectors may be input into the hidden layer 820.
In the hidden layer 820, one or more features, such as a tube voltage feature (kV_feature) , a tube current feature (mA_feature) , a radiation time feature (ms_feature) , a convergence feature, etc., may be extracted from the plurality of initial vectors. In some embodiments, the hidden layer 820 may configure a first weight for the parameter values of the acquisition parameter (s) , a second weight for the target dose, and a third weight for the initial exposure parameters. Both the first weight and the second weight are greater than the third weight. The one or more features may be extracted from the parameter values of the acquisition parameter (s) with the first weight, the target dose with the second  weight, and the initial exposure parameters with the third weight. The trained machine learning model may determine a plurality of target vectors corresponding to a plurality of neurons based on the input of the trained machine learning model. The output layer 830 may determine outputs, such as a stable tube voltage (kV_stable) , a stable tube current (mA_stable) , and a stable radiation time (ms_stable) , based on the plurality of target vectors and weighting coefficients obtained in the hidden layer 820.
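A compact sketch of the model of FIG. 8 is given below, assuming a single fully connected hidden sub-layer; the layer widths and weights are illustrative, and the named features (kV_feature, mA_feature, ms_feature) are represented implicitly by the hidden activations:

    import torch
    import torch.nn as nn

    class ExposureModel(nn.Module):
        """Input layer 810 -> hidden layer 820 (feature extraction) ->
        output layer 830 (kV_stable, mA_stable, ms_stable)."""
        def __init__(self, n_in=9, n_hidden=64):
            super().__init__()
            self.hidden = nn.Sequential(nn.Linear(n_in, n_hidden), nn.ReLU())
            self.output = nn.Linear(n_hidden, 3)

        def forward(self, acq_params, target_dose, init_exposure,
                    first_w=1.0, second_w=1.0, third_w=0.5):
            # Weighted concatenation of the three input groups (first and
            # second weights greater than the third, per the text above).
            x = torch.cat([first_w * acq_params, second_w * target_dose,
                           third_w * init_exposure], dim=-1)
            return self.output(self.hidden(x))

    model = ExposureModel()
    stable = model(torch.randn(5), torch.tensor([0.8]), torch.randn(3))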
It should be noted that the trained machine learning model may be modified when applied under different conditions. For example, in a training process, a ReLU layer may be added to the hidden layer 820. An activation function may be used by the ReLU layer to constrain its output. Exemplary activation functions may include a linear function, a ramp function, a threshold function, a Sigmoid function, etc.
Having thus described the basic concepts, it may be rather apparent to those skilled in the art after reading this detailed disclosure that the foregoing detailed disclosure is intended to be presented by way of example only and is not limiting. Various alterations, improvements, and modifications may occur to those skilled in the art, though not expressly stated herein. These alterations, improvements, and modifications are intended to be suggested by this disclosure and are within the spirit and scope of the exemplary embodiments of this disclosure.
Moreover, certain terminology has been used to describe embodiments of the present disclosure. For example, the terms “one embodiment,” “an embodiment,” and “some embodiments” mean that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the present disclosure. Therefore, it is emphasized and should be appreciated that two or more references to “an embodiment” or “one embodiment” or “an alternative embodiment” in various portions of this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined as suitable in one or more embodiments of the present disclosure.
Further, it will be appreciated by one skilled in the art that aspects of the present disclosure may be illustrated and described herein in any of a number of patentable classes or contexts, including any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof. Accordingly, aspects of the present disclosure may be implemented entirely in hardware, entirely in software (including firmware, resident software, micro-code, etc.), or in an implementation combining software and hardware that may all generally be referred to herein as a “module,” “unit,” “component,” “device,” or “system.” Furthermore, aspects of the present disclosure may take the form of a computer program product embodied in one or more computer-readable media having computer-readable program code embodied thereon.
A computer-readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated signal may take any of a variety of forms, including electro-magnetic, optical, or the like, or any suitable combination thereof. A computer-readable signal medium may be any computer-readable medium that is not a computer-readable storage medium and that may communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer-readable signal medium may be transmitted using any appropriate medium, including wireless, wireline, optical fiber cable, RF, or the like, or any suitable combination of the foregoing.
Computer program code for carrying out operations for aspects of the present disclosure may be written in any combination of one or more programming languages, including an object-oriented programming language such as Java, Scala, Smalltalk, Eiffel, JADE, Emerald, C++, C#, VB.NET, or Python; conventional procedural programming languages, such as the "C" programming language, Visual Basic, Fortran 2003, Perl, COBOL 2002, PHP, or ABAP; dynamic programming languages such as Python, Ruby, and Groovy; or other programming languages. The program code may execute entirely on the user's computer, partly on the user's computer as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider), in a cloud computing environment, or offered as a service such as Software as a Service (SaaS).
Furthermore, the recited order of processing elements or sequences, or the use of numbers, letters, or other designations therefor, is not intended to limit the claimed processes and methods to any order except as may be specified in the claims. Although the above disclosure discusses through various examples what is currently considered to be a variety of useful embodiments of the disclosure, it is to be understood that such detail is solely for that purpose and that the appended claims are not limited to the disclosed embodiments, but, on the contrary, are intended to cover modifications and equivalent arrangements that are within the spirit and scope of the disclosed embodiments. For example, although the implementation of various components described above may be embodied in a hardware device, it may also be implemented as a software-only solution, e.g., an installation on an existing server or mobile device.
Similarly, it should be appreciated that in the foregoing description of embodiments of the present disclosure, various features are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of one or more of the various embodiments. This method of disclosure, however, is not to be interpreted as reflecting an intention that the claimed subject matter requires more features than are expressly recited in each claim. Rather, claimed subject matter lies in less than all features of a single foregoing disclosed embodiment.

Claims (23)

  1. A system for image generation, comprising:
    at least one storage device storing executable instructions; and
    at least one processor in communication with the at least one storage device, wherein when executing the executable instructions, the at least one processor is configured to cause the system to perform operations including:
    obtaining parameter values of one or more acquisition parameters of an imaging device for scanning a subject;
    obtaining initial parameter values of exposure parameters of the imaging device and a target dose associated with the subject;
    determining, by using a trained machine learning model, stable parameter values of the exposure parameters corresponding to the parameter values of the one or more acquisition parameters, the initial parameter values of the exposure parameters, and the target dose; and
    generating an image by causing the imaging device to scan the subject based on the stable parameter values of the exposure parameters.
  2. The system of claim 1, wherein the one or more acquisition parameters include at least one of:
    a subject type of the subject,
    an acquisition type,
    a source image receptor distance (SID) ,
    a source subject distance (SOD) , or
    a filtration mode.
  3. The system of claim 1 or claim 2, wherein the exposure parameters include at least one of a tube voltage, a tube current, or a radiation time.
  4. The system of any one of claims 1 to 3, wherein obtaining the trained machine learning model includes:
    obtaining multiple groups of training samples, each group corresponding to a sample subject; and
    generating the trained machine learning model by training a preliminary machine learning model using the multiple groups of training samples in a training process,
    wherein each group of the multiple groups of training samples includes sample parameter values of the one or more acquisition parameters, sample initial parameter values of the exposure parameters, a sample target dose, and sample stable parameter values of the exposure parameters, wherein the sample parameter values of the one or more acquisition parameters, the sample initial parameter values of the exposure parameters, and the sample target dose serve as an input of the preliminary machine learning model and the sample stable parameter values of the exposure parameters serve as desired outputs of the preliminary machine learning model during the training process, the sample stable parameter values of the exposure parameters being determined based on the sample parameter values of the one or more acquisition parameters, the sample initial parameter values of the exposure parameters, and the sample target dose.
  5. The system of claim 4, wherein determining the sample stable parameter values of the exposure parameters includes performing an iteration process including one or more iterations, at least one iteration of the one or more iterations including:
    obtaining an updated image that is captured by scanning the sample subject based on updated parameter values of the exposure parameters using the imaging device according to the sample parameter values of the one or more acquisition parameters;
    determining an updated practical dose corresponding to the updated image;
    determining whether a dose difference between the updated practical dose and the sample target dose is less than a threshold value; and
    in response to determining that the dose difference is less than the threshold value, designating the updated parameter values of the exposure parameters as the sample stable parameter values of the exposure parameters.
  6. The system of claim 5, wherein the updated parameter values of the exposure parameters are determined based on a previous dose difference determined in a previous iteration and an automatic brightness stabilization (ABS) curve of an X-ray generator of the imaging device.
  7. The system of any one of claims 4 to 6, wherein the training process includes:
    configuring a first weight for the sample parameter values of the one or more acquisition parameters, a second weight for the sample target dose, and a third weight for the sample initial parameter values of the exposure parameters, wherein both the first weight and the second weight are greater than the third weight; and
    training the preliminary machine learning model based on the sample parameter values of the one or more acquisition parameters with the first weight, the sample target dose with the second weight, and the sample initial parameter values of the exposure parameters with the third weight.
  8. The system of any one of claims 1 to 7, wherein the trained machine learning model includes a deep learning model.
  9. A method for image generation, implemented on a computing device having at least one processor and at least one storage device, the method comprising:
    obtaining parameter values of one or more acquisition parameters of an imaging device for scanning a subject;
    obtaining initial parameter values of exposure parameters of the imaging device and a target dose associated with the subject;
    determining, by using a trained machine learning model, stable parameter values of the exposure parameters corresponding to the parameter values of the one or more acquisition parameters, the initial parameter values of the exposure parameters, and the target dose; and
    generating an image by causing the imaging device to scan the subject based on the stable parameter values of the exposure parameters.
  10. The method of claim 9, wherein the one or more acquisition parameters include at least one of:
    a subject type of the subject,
    an acquisition type,
    a source image receptor distance (SID) ,
    a source subject distance (SOD) , or
    a filtration mode.
  11. The method of claim 9 or claim 10, wherein the exposure parameters include at least one of a tube voltage, a tube current, or a radiation time.
  12. The method of any one of claims 9 to 11, wherein obtaining the trained machine learning model includes:
    obtaining multiple groups of training samples, each group corresponding to a sample subject; and
    generating the trained machine learning model by training a preliminary machine learning model using the multiple groups of training samples in a training process,
    wherein each group of the multiple groups of training samples includes sample parameter values of the one or more acquisition parameters, sample initial parameter values of the exposure parameters, a sample target dose, and sample stable parameter values of the exposure parameters, wherein the sample parameter values of the one or  more acquisition parameters, the sample initial parameter values of the exposure parameters, and the sample target dose serve as an input of the preliminary machine learning model and the sample stable parameter values of the exposure parameters serve as desired outputs of the preliminary machine learning model during the training process, the sample stable parameter values of the exposure parameters being determined based on the sample parameter values of the one or more acquisition parameters, the sample initial parameter values of the exposure parameters, and the sample target dose.
  13. The method of claim 12, wherein determining the sample stable parameter values of the exposure parameters includes performing an iteration process including one or more iterations, at least one iteration of the one or more iterations including:
    obtaining an updated image that is captured by scanning the sample subject based on updated parameter values of the exposure parameters using the imaging device according to the sample parameter values of the one or more acquisition parameters;
    determining an updated practical dose corresponding to the updated image;
    determining whether a dose difference between the updated practical dose and the sample target dose is less than a threshold value; and
    in response to determining that the dose difference is less than the threshold value, designating the updated parameter values of the exposure parameters as the sample stable parameter values of the exposure parameters.
  14. The method of claim 13, wherein the updated parameter values of the exposure parameters are determined based on a previous dose difference determined in a previous iteration and an automatic brightness stabilization (ABS) curve of an X-ray generator of the imaging device.
  15. The method of any one of claims 12 to 14, wherein the training process includes:
    configuring a first weight for the sample parameter values of the one or more acquisition parameters, a second weight for the sample target dose, and a third weight for the sample initial parameter values of the exposure parameters, wherein both the first weight and the second weight are greater than the third weight; and
    training the preliminary machine learning model based on the sample parameter values of the one or more acquisition parameters with the first weight, the sample target dose with the second weight, and the sample initial parameter values of the exposure parameters with the third weight.
  16. The method of any one of claims 9 to 15, wherein the trained machine learning model includes a deep learning model.
  17. A non-transitory computer readable medium, comprising at least one set of instructions for image generation, wherein when executed by at least one processor of a computing device, the at least one set of instructions direct the at least one processor to perform operations including:
    obtaining parameter values of one or more acquisition parameters of an imaging device for scanning a subject;
    obtaining initial parameter values of exposure parameters of the imaging device and a target dose associated with the subject;
    determining, by using a trained machine learning model, stable parameter values of the exposure parameters corresponding to the parameter values of the one or more acquisition parameters, the initial parameter values of the exposure parameters, and the target dose; and
    generating an image by causing the imaging device to scan the subject based on the stable parameter values of the exposure parameters.
  18. A system for model training, comprising:
    at least one storage device storing executable instructions for generating a trained machine learning model; and
    at least one processor in communication with the at least one storage device, wherein when executing the executable instructions, the at least one processor is configured to cause the system to perform operations including:
    obtaining multiple groups of training samples, each group corresponding to a sample subject; and
    generating the trained machine learning model by training a preliminary machine learning model using the multiple groups of training samples in a training process,
    wherein each group of the multiple groups of training samples includes sample parameter values of one or more acquisition parameters of an imaging device, sample initial parameter values of exposure parameters of the imaging device, a sample target dose, and sample stable parameter values of the exposure parameters, wherein the sample parameter values of the one or more acquisition parameters, the sample initial parameter values of the exposure parameters, and the sample target dose serve as an input of the preliminary machine learning model and the sample stable parameter values of the exposure parameters serve as desired outputs of the preliminary machine learning model during the training process, the sample stable parameter values of the exposure parameters being determined based on the sample parameter values of the one or more acquisition parameters, the sample initial parameter values of the exposure parameters, and the sample target dose.
  19. The system of claim 18, wherein to determine the sample stable parameter values of the exposure parameters based on the sample parameter values of the one or more acquisition parameters, the sample initial parameter values of the exposure parameters, and the sample target dose, the at least one processor is further configured to cause the system to perform a first iteration operation including one or more first iterations, at least one first iteration of the one or more first iterations including:
    obtaining an updated image that is captured by scanning the sample subject based on updated parameter values of the exposure parameters using the imaging device according to the sample parameter values of the one or more acquisition parameters;
    determining an updated practical dose corresponding to the updated image;
    determining whether a dose difference between the updated practical dose and the sample target dose is less than a threshold value; and
    in response to determining that the dose difference is less than the threshold value, designating the updated parameter values of the exposure parameters as the sample stable parameter values of the exposure parameters.
  20. The system of claim 19, wherein the updated parameter values of the exposure parameters are determined based on a previous dose difference determined in a previous first iteration and an automatic brightness stabilization (ABS) curve of an X-ray generator of the imaging device.
  21. The system of any one of claims 18 to 20, wherein the training process includes:
    configuring a first weight for the sample parameter values of the one or more acquisition parameters, a second weight for the sample target dose, and a third weight for the sample initial parameter values of the exposure parameters, wherein both the first weight and the second weight are greater than the third weight; and
    training the preliminary machine learning model based on the sample parameter values of the one or more acquisition parameters with the first weight, the sample target dose with the second weight, and the sample initial parameter values of the exposure parameters with the third weight.
  22. The system of any one of claims 18 to 21, wherein the generating the trained machine learning model by training a preliminary machine learning model using the multiple groups of training samples includes:
    initializing parameter values of the preliminary machine learning model; and
    generating the trained machine learning model by iteratively updating the parameter values of the preliminary machine learning model based on the multiple groups of training samples.
  23. The system of claim 22, wherein the iteratively updating the parameter values of the preliminary machine learning model includes performing a second iterative operation including one or more second iterations, at least one second iteration of the one or more second iterations including:
    for each of at least some of the multiple groups of training samples, generating estimated parameter values of the exposure parameters by processing the sample initial parameter values of the exposure parameters, the sample parameter values of the one or more acquisition parameters, and the sample target dose of the training sample using an updated preliminary machine learning model determined in a previous second iteration;
    determining a value of a loss function based on the estimated parameter values of the exposure parameters and the sample stable parameter values of the exposure parameters of each of the training samples; and
    further updating at least some of the parameter values of the updated preliminary machine learning model to be used in a next second iteration based on the value of the loss function.