WO2021136505A1 - Imaging systems and methods - Google Patents

Imaging systems and methods

Info

Publication number
WO2021136505A1
Authority
WO
WIPO (PCT)
Prior art keywords
breast
image
model
feature
parameters
Prior art date
Application number
PCT/CN2020/142136
Other languages
French (fr)
Inventor
Kan LIANG
Ming Chen
Wei Li
Dingche TANG
Original Assignee
Shanghai United Imaging Healthcare Co., Ltd.
Priority date
Filing date
Publication date
Application filed by Shanghai United Imaging Healthcare Co., Ltd. filed Critical Shanghai United Imaging Healthcare Co., Ltd.
Publication of WO2021136505A1 publication Critical patent/WO2021136505A1/en

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/0002: Inspection of images, e.g., flaw detection
    • G06T 7/0012: Biomedical image inspection
    • G06T 7/0014: Biomedical image inspection using an image reference approach
    • G06T 11/00: 2D [Two Dimensional] image generation
    • G06T 11/003: Reconstruction from projections, e.g., tomography
    • G06T 11/005: Specific pre-processing for tomographic reconstruction, e.g., calibration, source positioning, rebinning, scatter correction, retrospective gating
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/10: Image acquisition modality
    • G06T 2207/10072: Tomographic images
    • G06T 2207/10112: Digital tomosynthesis [DTS]
    • G06T 2207/20: Special algorithmic details
    • G06T 2207/20081: Training; Learning
    • G06T 2207/20084: Artificial neural networks [ANN]
    • G06T 2207/30: Subject of image; Context of image processing
    • G06T 2207/30004: Biomedical image processing
    • G06T 2207/30068: Mammography; Breast
    • G06T 2211/00: Image generation
    • G06T 2211/40: Computed tomography
    • G06T 2211/424: Iterative

Definitions

  • the present disclosure generally relates to medical imaging, and in particular, to systems and methods for determining scan parameters of breast tomography.
  • Mammography is an effective screening and diagnostic tool for the detection of breast disease (e.g., breast cancer).
  • In 2D mammography (e.g., full-field digital mammography (FFDM)), however, overlapping breast tissue in a single projection may obscure a lesion.
  • Breast tomography (e.g., digital breast tomosynthesis (DBT)) is a three-dimensional (3D) imaging technique that is designed to overcome this problem. Therefore, it is desirable to provide systems and methods for automatically determining scan parameters of breast tomography to realize more accurate and more efficient breast tomography.
  • a system may include one or more storage devices and one or more processors configured to communicate with the one or more storage devices.
  • the one or more storage devices may include a set of instructions. When the one or more processors execute the set of instructions, the one or more processors may be directed to perform one or more of the following operations.
  • the one or more processors may obtain a compression thickness of a compressed target breast and a first breast image of the target breast.
  • the one or more processors may determine one or more breast feature parameters by inputting the first breast image into a feature extraction model.
  • the one or more processors may determine, based on the compression thickness and the one or more breast feature parameters, one or more scan parameters of breast tomography on the target breast.
  • the one or more scan parameters may include a scan angle range and a frame number of the breast tomography.
  • the one or more processors may process the first breast image.
  • the one or more processors may perform at least one of downsampling, upsampling, image segmentation, or image enhancement on the first breast image.
  • the one or more processors may obtain an equalization image by performing histogram equalization on the first breast image.
  • the one or more processors may transform the equalization image into a frequency domain image.
  • the one or more processors may obtain an amplified frequency domain image by amplifying a signal at one or more frequency points in the frequency domain image.
  • the one or more processors may obtain a spatial domain image by performing inverse transformation on the amplified frequency domain image.
  • the one or more processors may perform enhancement related to a greyscale range and contour extraction on the spatial domain image.
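As a concrete illustration of the preprocessing sequence above, the following Python sketch chains histogram equalization, a forward Fourier transform, amplification of selected frequency points, an inverse transform, and a grayscale-range stretch. The radial frequency band, the gain, and the 8-bit single-channel input are illustrative assumptions; the disclosure does not specify which frequency points are amplified or by how much, and contour extraction is omitted here.

```python
import numpy as np

def enhance_breast_image(image: np.ndarray, gain: float = 2.0,
                         band: tuple = (0.05, 0.25)) -> np.ndarray:
    """Sketch of the described enhancement pipeline for an 8-bit 2D image.

    The radial band and gain are illustrative assumptions, not values
    taken from the disclosure.
    """
    # 1. Histogram equalization to obtain an equalization image.
    hist, _ = np.histogram(image.ravel(), bins=256, range=(0, 256))
    cdf = hist.cumsum().astype(np.float64)
    cdf = (cdf - cdf.min()) / (cdf.max() - cdf.min()) * 255.0
    equalized = cdf[image.astype(np.uint8)]

    # 2. Transform the equalization image into a frequency domain image.
    freq = np.fft.fftshift(np.fft.fft2(equalized))

    # 3. Amplify the signal at one or more frequency points
    #    (here: every point in an assumed radial band).
    rows, cols = equalized.shape
    y, x = np.ogrid[:rows, :cols]
    radius = np.hypot(y - rows / 2.0, x - cols / 2.0) / max(rows, cols)
    freq[(radius >= band[0]) & (radius <= band[1])] *= gain

    # 4. Inverse transformation back to a spatial domain image.
    spatial = np.real(np.fft.ifft2(np.fft.ifftshift(freq)))

    # 5. Enhancement related to the grayscale range (contrast stretch);
    #    contour extraction is not shown in this sketch.
    spatial = (spatial - spatial.min()) / (np.ptp(spatial) + 1e-8) * 255.0
    return spatial.astype(np.uint8)
```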
  • the feature extraction model may include an expert system or an artificial intelligence network model.
  • the artificial intelligence network model may include a machine learning model.
  • the artificial intelligence network model may include a deep learning model.
  • the deep learning model may include a convolutional neural network model.
  • the convolutional neural network model may include an input layer, a convolutional layer, a pooling layer, a fully connected layer, and an output layer.
  • the convolutional layer may include 36 convolutional kernels each of which has a size of 13*13 and a stride of 1.
  • the pooling layer may include a max-pooling layer.
  • the feature extraction model may be provided by: obtaining a plurality of sample breast images; and obtaining the feature extraction model by training a preliminary model using the plurality of sample breast images.
  • Each of the plurality of sample breast images may include one or more sample feature parameters.
  • the one or more processors may obtain the feature extraction model by performing an iteration process including one or more iterations. At least one of the one or more iterations may include: obtaining one or more output feature parameters by inputting one or more of the plurality of sample breast images into an intermediate model; and updating the intermediate model based on a difference between the one or more output feature parameters and the one or more sample feature parameters.
  • the intermediate model may include the preliminary model in a first iteration of the one or more iterations of the iteration process or an updated model generated in a previous iteration of the at least one of the one or more iterations.
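A minimal sketch of this iteration process, assuming a PyTorch model trained with a mean-squared-error loss over the feature parameters (the disclosure fixes neither the framework, the loss, nor the optimizer):

```python
import torch
from torch import nn, optim

def train_feature_extraction_model(preliminary_model: nn.Module,
                                   sample_images: torch.Tensor,
                                   sample_feature_params: torch.Tensor,
                                   num_iterations: int = 100,
                                   lr: float = 1e-4) -> nn.Module:
    """Iteratively update an intermediate model starting from a
    preliminary model.

    sample_images:         (N, 1, H, W) sample breast images
    sample_feature_params: (N, F) sample (label) feature parameters
    The MSE loss and Adam optimizer are illustrative assumptions.
    """
    intermediate_model = preliminary_model  # first iteration uses the preliminary model
    optimizer = optim.Adam(intermediate_model.parameters(), lr=lr)
    criterion = nn.MSELoss()

    for _ in range(num_iterations):
        # Obtain output feature parameters from the intermediate model.
        output_params = intermediate_model(sample_images)
        # The difference between output and sample feature parameters.
        loss = criterion(output_params, sample_feature_params)
        # Update the intermediate model based on that difference.
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()

    return intermediate_model
```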
  • the one or more processors may determine the one or more scan parameters by inputting the compression thickness and the one or more breast feature parameters into a parameter determination model.
  • the parameter determination model may include a machine learning model.
  • the one or more processors may determine a breast density feature of the target breast based on the compression thickness.
  • the one or more processors may modify at least one of the one or more breast feature parameters based on the breast density feature.
  • the one or more processors may determine the one or more scan parameters based on the one or more modified breast feature parameters.
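Hypothetically, the parameter determination step could be wrapped as follows; `parameter_model`, the feature names, and the units are all placeholders for illustration and are not defined by the disclosure:

```python
def determine_scan_parameters(compression_thickness_mm: float,
                              breast_feature_params: dict,
                              parameter_model) -> dict:
    """Map the compression thickness and breast feature parameters to
    a scan angle range and a frame number via a trained model.

    `parameter_model` is a stand-in for the parameter determination
    model (e.g., any regressor with a predict() method); the feature
    keys below are hypothetical.
    """
    features = [compression_thickness_mm,
                breast_feature_params.get("roi_major_axis_mm", 0.0),
                breast_feature_params.get("lump_probability", 0.0),
                breast_feature_params.get("calcification_probability", 0.0)]
    scan_angle_range_deg, frame_number = parameter_model.predict([features])[0]
    return {"scan_angle_range_deg": float(scan_angle_range_deg),
            "frame_number": int(round(frame_number))}
```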
  • the one or more breast feature parameters may include at least one of a type, shape, distribution, edge, or a major axis of each of one or more regions of interest (ROIs) in the first breast image.
  • the scan angle range and the frame number may be any value allowed by an imaging device that performs the breast tomography.
  • the one or more processors may cause, based on the one or more scan parameters, an imaging device to perform the breast tomography on the target breast.
  • the one or more processors may generate a second breast image of the target breast based on the breast tomography.
  • the breast tomography may include digital breast tomosynthesis (DBT) .
  • a method may include one or more of the following operations.
  • One or more processors may obtain a compression thickness of a compressed target breast and a first breast image of the target breast.
  • the one or more processors may determine one or more breast feature parameters by inputting the first breast image into a feature extraction model.
  • the one or more processors may determine, based on the compression thickness and the one or more breast feature parameters, one or more scan parameters of breast tomography on the target breast.
  • the one or more scan parameters may include a scan angle range and a frame number of the breast tomography.
  • a system may include an obtaining module configured to obtain a compression thickness of a compressed target breast and a first breast image of the target breast.
  • the system may also include a feature parameter determination module configured to determine one or more breast feature parameters by inputting the first breast image into a feature extraction model.
  • the system may also include a scan parameter determination module configured to determine, based on the compression thickness and the one or more breast feature parameters, one or more scan parameters of breast tomography on the target breast.
  • the one or more scan parameters may include a scan angle range and a frame number of the breast tomography.
  • a non-transitory computer readable medium may comprise at least one set of instructions.
  • the at least one set of instructions may be executed by one or more processors of a computing device.
  • the one or more processors may obtain a compression thickness of a compressed target breast and a first breast image of the target breast.
  • the one or more processors may determine one or more breast feature parameters by inputting the first breast image into a feature extraction model.
  • the one or more processors may determine, based on the compression thickness and the one or more breast feature parameters, one or more scan parameters of breast tomography on the target breast.
  • the one or more scan parameters may include a scan angle range and a frame number of the breast tomography.
  • FIG. 1 is a schematic diagram illustrating an exemplary imaging system according to some embodiments of the present disclosure
  • FIG. 2 is a schematic diagram illustrating exemplary hardware and/or software components of a computing device according to some embodiments of the present disclosure
  • FIG. 3 is a schematic diagram illustrating exemplary hardware and/or software components of a mobile device according to some embodiments of the present disclosure
  • FIG. 4 is a schematic block diagram illustrating an exemplary processing device according to some embodiments of the present disclosure.
  • FIG. 5 is a flowchart illustrating an exemplary process for determining one or more scan parameters according to some embodiments of the present disclosure
  • FIG. 6 is a flowchart illustrating an exemplary process for determining one or more scan parameters according to some embodiments of the present disclosure
  • FIG. 7 is a flowchart illustrating an exemplary process for obtaining a feature extraction model according to some embodiments of the present disclosure
  • FIG. 8 is a flowchart illustrating an exemplary training process for obtaining a feature extraction model according to some embodiments of the present disclosure
  • FIG. 9 is a flowchart illustrating an exemplary process for determining one or more scan parameters according to some embodiments of the present disclosure.
  • FIG. 10 is a flowchart illustrating an exemplary process for determining one or more scan parameters according to some embodiments of the present disclosure.
  • The terms "system," "unit," "module," and/or "block" used herein are one method to distinguish different components, elements, parts, sections, or assemblies of different levels in ascending order. However, the terms may be displaced by another expression if they achieve the same purpose.
  • The terms "module," "unit," or "block," as used herein, refer to logic embodied in hardware or firmware, or to a collection of software instructions.
  • a module, a unit, or a block described herein may be implemented as software and/or hardware and may be stored in any type of non-transitory computer-readable medium or another storage device.
  • a software module/unit/block may be compiled and linked into an executable program. It will be appreciated that software modules can be callable from other modules/units/blocks or from themselves, and/or may be invoked in response to detected events or interrupts.
  • Software modules/units/blocks configured for execution on computing devices (e.g., processor 210 as illustrated in FIG. 2) may be provided on a computer-readable medium, such as a compact disc, a digital video disc, a flash drive, a magnetic disc, or any other tangible medium, or as a digital download (and can be originally stored in a compressed or installable format that needs installation, decompression, or decryption prior to execution).
  • Such software code may be stored, partially or fully, on a storage device of the executing computing device, for execution by the computing device.
  • Software instructions may be embedded in firmware, such as an EPROM.
  • hardware modules/units/blocks may be comprised of connected logic components, such as gates and flip-flops, and/or of programmable units, such as programmable gate arrays or processors.
  • modules/units/blocks or computing device functionality described herein may be implemented as software modules/units/blocks, but may be represented in hardware or firmware.
  • the modules/units/blocks described herein refer to logical modules/units/blocks that may be combined with other modules/units/blocks or divided into sub-modules/sub-units/sub-blocks despite their physical organization or storage.
  • the medical system may include an imaging system.
  • the imaging system may include a single modality imaging system and/or a multi-modality imaging system.
  • the single modality imaging system may include, for example, an X-ray imaging system (e.g., a computed tomography (CT) imaging system, a digital subtraction angiography (DSA) imaging system, a digital radiology (DR) imaging system, a computed radiology (CR) imaging system, etc.), an ultrasound imaging system (e.g., a color Doppler flow imaging (CDFI) system), a magnetic resonance imaging (MRI) system, a nuclear medical imaging system (e.g., a positron emission tomography (PET) imaging system, a single photon emission computed tomography (SPECT) imaging system, etc.), or the like.
  • the multi-modality imaging system may include, for example, a magnetic resonance imaging-computed tomography (MRI-CT) system, a positron emission tomography-magnetic resonance imaging (PET-MRI) system, a single photon emission computed tomography-magnetic resonance imaging (SPECT-MRI) system, a digital subtraction angiography-magnetic resonance imaging (DSA-MRI) system, a positron emission tomography-computed tomography (PET-CT) imaging system, etc.
  • the medical system may include a treatment system.
  • the treatment system may include a treatment plan system (TPS), image-guided radiotherapy (IGRT), etc.
  • the image-guided radiotherapy may include a treatment device and an imaging device.
  • the treatment device may include a linear accelerator, a cyclotron, a synchrotron, etc., configured to perform radiotherapy on a subject.
  • the treatment device may include an accelerator of particle species including, for example, photons, electrons, protons, or heavy ions.
  • the imaging device may include an MRI scanner, a CT scanner (e.g., a cone beam computed tomography (CBCT) scanner), a digital radiology (DR) scanner, an electronic portal imaging device (EPID), etc.
  • an image, or a portion thereof (e.g., a region in the image), corresponding to an object may be referred to as an image, or a portion thereof (e.g., a region), of or including the object, or the object itself.
  • a region in an image that corresponds to or represents a breast may be described as a region that includes a breast.
  • an image of or including a breast may be referred to as a breast image, or simply a breast.
  • that a portion of an image corresponding to or representing an object is processed (e.g., extracted, segmented, etc.) may be described as the object being processed.
  • that a portion of an image corresponding to a breast is segmented from the rest of the image may be described as the breast being segmented from the image.
  • Mammography is an effective screening and diagnostic tool for the detection of breast disease (e.g., breast cancer). In 2D mammography (e.g., full-field digital mammography (FFDM)), overlapping breast tissue in a single projection may obscure a lesion. Breast tomography (e.g., digital breast tomosynthesis (DBT)) overcomes this problem by scanning the compressed breast from different scan angles within a scan angle range.
  • Volumetric images may be generated based on a number of 2D X-ray images or 2D projections that are obtained from different scan angles in the scan angle range. The number (or count) of the 2D X-ray images or 2D projections may be referred to as a frame number of the breast tomography.
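For example, under the simplifying assumption of uniformly spaced projections (not required by the disclosure), the scan angle range and the frame number jointly determine the individual projection angles:

```python
import numpy as np

def projection_angles(scan_angle_range_deg: float, frame_number: int) -> np.ndarray:
    """Uniformly spaced projection angles, centered on 0 degrees.

    E.g., a 30-degree scan angle range with a frame number of 15
    yields 15 projections from -15 to +15 degrees.
    """
    half = scan_angle_range_deg / 2.0
    return np.linspace(-half, half, frame_number)
```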
  • An aspect of the present disclosure provides systems and methods for automatically determining scan parameters of breast tomography to realize more accurate and more efficient breast tomography.
  • a compression thickness of a compressed target breast and a first breast image of the target breast may be obtained.
  • the first breast image may be a 2D preliminary breast image of the target breast.
  • One or more breast feature parameters may be determined by inputting the first breast image into a feature extraction model.
  • One or more scan parameters of breast tomography on the target breast may be determined based on the compression thickness and the one or more breast feature parameters.
  • the one or more scan parameters may include a scan angle range and a frame number of the breast tomography.
  • the one or more scan parameters may be configured to cause an imaging device to perform the breast tomography on the target breast.
  • a second breast image of the target breast may be generated based on the breast tomography.
  • the second breast image may be a volumetric image and used to diagnose disease of the target breast.
  • the systems and methods for automatically determining scan parameters of breast tomography provided in the present disclosure may save human resources through automatic parameter determination.
  • the automatic parameter determination is no longer limited to the exposure dose, but also covers the scan angle range and the frame number, which improves the quality of the resultant images, makes the resultant images conducive to the diagnosis of breast disease, avoids repeated scanning and meaningless multi-frame scanning, and prevents patients from receiving an excessive dose of radiation.
  • FIG. 1 is a schematic diagram illustrating an exemplary imaging system 100 according to some embodiments of the present disclosure.
  • the imaging system 100 may be applied in breast tomography, e.g., digital breast tomosynthesis (DBT) .
  • the imaging system 100 may be applied in a Combo mode and/or a U-view mode.
  • the Combo mode may refer to a mode in which one or more 2D breast images and one or more tomographic breast images are obtained in the breast tomography.
  • the U-view mode may refer to a mode in which one or more fused 2D breast images and one or more tomographic breast images are obtained in the breast tomography.
  • the imaging system 100 may include an imaging device 110, a network 120, a terminal 130, a processing device 140, and a storage device 150.
  • the components of the imaging system 100 may be connected in one or more of various ways.
  • the imaging device 110 may be connected to the processing device 140 through the network 120.
  • the imaging device 110 may be connected to the processing device 140 directly (as indicated by the bi-directional arrow in dotted lines linking the imaging device 110 and the processing device 140) .
  • the storage device 150 may be connected to the processing device 140 directly or through the network 120.
  • as a further example, a terminal device (e.g., 131, 132, 133, etc.) may be connected to the processing device 140 directly or through the network 120.
  • the imaging device 110 may include a medical imaging device.
  • the imaging device 110 may be used to scan an object located within its detection region and generate a plurality of scan data (e.g., digital signals) used to generate one or more images relating to the object.
  • “subject” and “object” are used interchangeably.
  • the object may include a patient, a man-made object, etc.
  • the object may include a specific portion, organ, and/or tissue of a patient.
  • the object may include head, brain, neck, body, shoulder, arm, thorax, cardiac, stomach, blood vessel, soft tissue, knee, feet, breast, or the like, or any combination thereof.
  • the imaging device 110 may be used to scan a breast of the object (e.g., a patient) .
  • the imaging device 110 may be an X-ray device.
  • the X-ray device may include a breast-holder tray on which the patient lays her breast, an X-ray tube, and a detector.
  • the breast-holder tray may be placed on the top of the detector.
  • the detector may be placed beneath the breast-holder tray.
  • the X-ray tube may emit X-rays going through the breast.
  • the detector may be located opposite to the X-ray tube so as to detect the X-rays that have crossed the patient’s breast and the breast-holder tray.
  • the detector may transform the light signals of the detected X-rays into digital signals and transmit the digital signals to the processing device 140 for further processing (e.g., generating a breast image) .
  • the X-ray device may further include a compression pad. For reasons related both to the immobilizing of the breast and to image quality or intensity of X-rays delivered to the patient's breast, it is necessary to compress the patient's breast during the scan process. The compression force may be applied through the compression pad that compresses the breast on the breast-holder tray.
  • the X-ray device may further include a high-voltage generator configured to provide the voltage that is required for the X-ray tube to produce X-rays.
  • the X-ray device may further include a collimator configured to adjust an X-ray irradiation range.
  • the collimator also can absorb some scattered X-rays, which may improve the image quality.
  • the collimator may be located in front of the X-ray tube in the emitting direction of the X-rays.
  • the imaging device 110 may include a digital breast tomosynthesis (DBT) device.
  • the network 120 may include any suitable network that can facilitate the exchange of information and/or data for the imaging system 100.
  • one or more components of the imaging system 100 e.g., the imaging device 110, the terminal 130, the processing device 140, or the storage device 150
  • the processing device 140 may obtain scan data (e.g., digital signals) of a breast of an object (e.g., a patient) from the imaging device 110 via the network 120.
  • the network 120 may be any type of wired or wireless network, or a combination thereof.
  • the network 120 may be and/or include a public network (e.g., the Internet), a private network (e.g., a local area network (LAN), a wide area network (WAN), etc.), a wired network (e.g., an Ethernet network), a wireless network (e.g., an 802.11 network, a Wi-Fi network, etc.), a cellular network (e.g., a Long Term Evolution (LTE) network), a frame relay network, a virtual private network ("VPN"), a satellite network, a telephone network, routers, hubs, switches, server computers, and/or any combination thereof.
  • the network 120 may include a cable network, a wireline network, a fiber-optic network, a telecommunications network, an intranet, a wireless local area network (WLAN), a metropolitan area network (MAN), a public telephone switched network (PSTN), a Bluetooth™ network, a ZigBee™ network, a near field communication (NFC) network, or the like, or any combination thereof.
  • the network 120 may include one or more network access points.
  • the network 120 may include wired and/or wireless network access points such as base stations and/or internet exchange points through which one or more components of the imaging system 100 may be connected to the network 120 to exchange data and/or information.
  • the terminal 130 may include a mobile device 131, a tablet computer 132, a laptop computer 133, or the like, or any combination thereof.
  • the mobile device 131 may include a smart home device, a wearable device, a smart mobile device, a virtual reality device, an augmented reality device, or the like, or any combination thereof.
  • the smart home device may include a smart lighting device, a control device of an intelligent electrical apparatus, a smart monitoring device, a smart television, a smart video camera, an interphone, or the like, or any combination thereof.
  • the wearable device may include a smart bracelet, smart footgear, a pair of smart glasses, a smart helmet, a smart watch, smart clothing, a smart backpack, a smart accessory, or the like, or any combination thereof.
  • the smart mobile device may include a smartphone, a personal digital assistant (PDA) , a gaming device, a navigation device, a point of sale (POS) device, or the like, or any combination thereof.
  • the virtual reality device and/or the augmented reality device may include a virtual reality helmet, a virtual reality glass, a virtual reality patch, an augmented reality helmet, an augmented reality glass, an augmented reality patch, or the like, or any combination thereof.
  • the virtual reality device and/or the augmented reality device may include a Google™ Glass, an Oculus Rift, a HoloLens, a Gear VR, etc.
  • the terminal 130 may remotely operate the imaging device 110 and/or the processing device 140.
  • the terminal 130 may operate the imaging device 110 and/or the processing device 140 via a wireless connection.
  • the terminal 130 may receive information and/or instructions inputted by a user, and send the received information and/or instructions to the imaging device 110 or to the processing device 140 via the network 120.
  • the terminal 130 may receive data and/or information from the processing device 140.
  • the terminal 130 may be part of the processing device 140.
  • the terminal 130 may be omitted.
  • the processing device 140 may process data and/or information obtained from the imaging device 110, the terminal 130, and/or the storage device 150.
  • the processing device 140 may generate one or more medical images (e.g., breast images) by processing scan data (e.g., digital signals) from the imaging device 110.
  • the processing device 140 may determine one or more scan parameters for scanning a breast using the imaging device 110.
  • the processing device 140 may be a single server, or a server group. The server group may be centralized or distributed.
  • the processing device 140 may be local or remote.
  • the processing device 140 may access information and/or data stored in or acquired by the imaging device 110, the terminal 130, and/or the storage device 150 via the network 120.
  • the processing device 140 may be directly connected to the imaging device 110 (as illustrated by the bidirectional arrow in dashed lines connecting the processing device 140 and the imaging device 110 in FIG. 1) , the terminal 130 (as illustrated by the bidirectional arrow in dashed lines connecting the processing device 140 and the terminal 130 in FIG. 1) , and/or the storage device 150 to access stored or acquired information and/or data.
  • the processing device 140 may be implemented on a cloud platform.
  • the cloud platform may include a private cloud, a public cloud, a hybrid cloud, a community cloud, a distributed cloud, an inter-cloud, a multi-cloud, or the like, or any combination thereof.
  • the processing device 140 may be implemented on a computing device 200 having one or more components illustrated in FIG. 2 in the present disclosure.
  • the storage device 150 may store data and/or instructions.
  • the storage device 150 may store data obtained from the imaging device 110, the terminal 130 and/or the processing device 140.
  • the storage device 150 may store medical images (e.g., breast images) generated by the processing device 140.
  • the storage device 150 may store data and/or instructions that the processing device 140 may execute or use to perform exemplary methods described in the present disclosure.
  • the storage device 150 may store instructions that the processing device 140 may execute to determine one or more scan parameters.
  • the storage device 150 may include a mass storage device, a removable storage device, a volatile read-and-write memory, a read-only memory (ROM) , or the like, or any combination thereof.
  • Exemplary mass storage may include a magnetic disk, an optical disk, a solid-state drive, etc.
  • Exemplary removable storage may include a flash drive, a floppy disk, an optical disk, a memory card, a zip disk, a magnetic tape, etc.
  • Exemplary volatile read-and-write memory may include a random access memory (RAM) .
  • Exemplary RAM may include a dynamic RAM (DRAM), a double data rate synchronous dynamic RAM (DDR SDRAM), a static RAM (SRAM), a thyristor RAM (T-RAM), a zero-capacitor RAM (Z-RAM), etc.
  • Exemplary ROM may include a mask ROM (MROM) , a programmable ROM (PROM) , an erasable programmable ROM (PEROM) , an electrically erasable programmable ROM (EEPROM) , a compact disk ROM (CD-ROM) , and a digital versatile disk ROM, etc.
  • the storage device 150 may be implemented on a cloud platform.
  • the cloud platform may include a private cloud, a public cloud, a hybrid cloud, a community cloud, a distributed cloud, an inter-cloud, a multi-cloud, or the like, or any combination thereof.
  • the storage device 150 may be connected to the network 120 to communicate with one or more components of the imaging system 100 (e.g., the imaging device 110, the processing device 140, the terminal 130, etc. ) .
  • One or more components of the imaging system 100 may access the data or instructions stored in the storage device 150 via the network 120.
  • the storage device 150 may be directly connected to or communicate with one or more components of the imaging system 100 (e.g., the imaging device 110, the processing device 140, the terminal 130, etc. ) .
  • the storage device 150 may be part of the processing device 140.
  • the imaging system 100 may further include one or more power supplies (not shown in FIG. 1) connected to one or more components of the imaging system 100 (e.g., the imaging device 110, the processing device 140, the terminal 130, the storage device 150, etc. ) .
  • FIG. 2 is a schematic diagram illustrating exemplary hardware and/or software components of a computing device on which the processing device 140 may be implemented according to some embodiments of the present disclosure.
  • the computing device 200 may include a processor 210, a storage 220, an input/output (I/O) 230, and a communication port 240.
  • the processor 210 may execute computer instructions (program code) and perform functions of the processing device 140 in accordance with techniques described herein.
  • the computer instructions may include routines, programs, objects, components, signals, data structures, procedures, modules, and functions, which perform particular functions described herein.
  • the processor 210 may generate one or more medical images (e.g., breast images) by processing scan data (e.g., digital signals) from the imaging device 110.
  • the processor 210 may determine one or more scan parameters for scanning a breast using the imaging device 110.
  • the processor 210 may include a microcontroller, a microprocessor, a reduced instruction set computer (RISC), an application-specific integrated circuit (ASIC), an application-specific instruction-set processor (ASIP), a central processing unit (CPU), a graphics processing unit (GPU), a physics processing unit (PPU), a microcontroller unit, a digital signal processor (DSP), a field programmable gate array (FPGA), an advanced RISC machine (ARM), a programmable logic device (PLD), any circuit or processor capable of executing one or more functions, or the like, or any combination thereof.
  • the computing device 200 in the present disclosure may also include multiple processors, and thus operations of a method that are performed by one processor as described in the present disclosure may also be jointly or separately performed by the multiple processors.
  • For example, if in the present disclosure the processor of the computing device 200 executes both operations A and B, it should be understood that operations A and B may also be performed by two different processors jointly or separately in the computing device 200 (e.g., a first processor executes operation A and a second processor executes operation B, or the first and second processors jointly execute operations A and B).
  • the storage 220 may store data/information obtained from the imaging device 110, the terminal 130, the storage device 150, or any other component of the imaging system 100.
  • the storage 220 may include a mass storage device, a removable storage device, a volatile read-and-write memory, a read-only memory (ROM) , or the like, or any combination thereof.
  • the mass storage device may include a magnetic disk, an optical disk, a solid-state drive, etc.
  • the removable storage device may include a flash drive, a floppy disk, an optical disk, a memory card, a zip disk, a magnetic tape, etc.
  • the volatile read-and-write memory may include a random access memory (RAM) .
  • the RAM may include a dynamic RAM (DRAM), a double data rate synchronous dynamic RAM (DDR SDRAM), a static RAM (SRAM), a thyristor RAM (T-RAM), a zero-capacitor RAM (Z-RAM), etc.
  • the ROM may include a mask ROM (MROM) , a programmable ROM (PROM) , an erasable programmable ROM (PEROM) , an electrically erasable programmable ROM (EEPROM) , a compact disk ROM (CD-ROM) , and a digital versatile disk ROM, etc.
  • the storage 220 may store one or more programs and/or instructions to perform exemplary methods described in the present disclosure.
  • the storage 220 may store instructions that the processor 210 may execute to determine one or more scan parameters.
  • the I/O 230 may input or output signals, data, or information. In some embodiments, the I/O 230 may enable user interaction with the processing device 140. In some embodiments, the I/O 230 may include an input device and an output device. Exemplary input devices may include a keyboard, a mouse, a touch screen, a microphone, a trackball, or the like, or a combination thereof. Exemplary output devices may include a display device, a loudspeaker, a printer, a projector, or the like, or a combination thereof.
  • Exemplary display devices may include a liquid crystal display (LCD) , a light-emitting diode (LED) -based display, a flat panel display, a curved screen, a television device, a cathode ray tube (CRT) , or the like, or a combination thereof.
  • the communication port 240 may be connected to a network (e.g., the network 120) to facilitate data communications.
  • the communication port 240 may establish connections between the processing device 140 and the imaging device 110, the terminal 130, or the storage device 150.
  • the connection may be a wired connection, a wireless connection, or a combination of both that enables data transmission and reception.
  • the wired connection may include an electrical cable, an optical cable, a telephone wire, or the like, or any combination thereof.
  • the wireless connection may include Bluetooth, Wi-Fi, WiMax, WLAN, ZigBee, mobile network (e.g., 3G, 4G, 5G, etc. ) , or the like, or a combination thereof.
  • the communication port 240 may be a standardized communication port, such as RS232, RS485, etc. In some embodiments, the communication port 240 may be a specially designed communication port. For example, the communication port 240 may be designed in accordance with the digital imaging and communications in medicine (DICOM) protocol.
  • FIG. 3 is a schematic diagram illustrating exemplary hardware and/or software components of a mobile device on which the terminal 130 may be implemented according to some embodiments of the present disclosure.
  • the mobile device 300 may include a communication platform 310, a display 320, a graphics processing unit (GPU) 330, a central processing unit (CPU) 340, an I/O 350, a memory 360, and a storage 390.
  • any other suitable component including but not limited to a system bus or a controller (not shown) , may also be included in the mobile device 300.
  • In some embodiments, a mobile operating system 370 (e.g., iOS, Android, Windows Phone, etc.) and one or more applications 380 may be loaded into the memory 360 from the storage 390 in order to be executed by the CPU 340.
  • the applications 380 may include a browser or any other suitable mobile apps for receiving and rendering information relating to image processing or other information from the processing device 140. User interactions with the information stream may be achieved via the I/O 350 and provided to the processing device 140 and/or other components of the imaging system 100 via the network 120.
  • computer hardware platforms may be used as the hardware platform (s) for one or more of the elements described herein.
  • the hardware elements, operating systems, and programming languages of such computers are conventional in nature, and it is presumed that those skilled in the art are adequately familiar therewith to adapt those technologies to the imaging described herein.
  • a computer with user interface elements may be used to implement a personal computer (PC) or another type of work station or terminal device, although a computer may also act as a server if appropriately programmed. It is believed that those skilled in the art are familiar with the structure, programming and general operation of such computer equipment and as a result the drawings should be self-explanatory.
  • FIG. 4 is a schematic block diagram illustrating an exemplary processing engine 400 according to some embodiments of the present disclosure.
  • the processing engine 400 may include an obtaining module 410, a feature parameter determination module 420, and a scan parameter determination module 430.
  • the obtaining module 410 may be configured to obtain a compression thickness of a compressed target breast and a first breast image of the target breast. In some embodiments, the obtaining module 410 may be further configured to process the first breast image.
  • the feature parameter determination module 420 may be configured to determine one or more breast feature parameters by inputting the first breast image into a feature extraction model.
  • the scan parameter determination module 430 may be configured to determine, based on the compression thickness and the one or more breast feature parameters, one or more scan parameters of breast tomography on the target breast.
  • the one or more scan parameters may include a scan angle range and a frame number of the breast tomography.
  • the scan parameter determination module 430 may determine the one or more scan parameters by inputting the compression thickness and the one or more breast feature parameters into a parameter determination model.
  • the scan parameter determination module 430 may be configured to determine, based on the compression thickness and the first breast image, one or more scan parameters of breast tomography on the target breast. In some embodiments, the scan parameter determination module 430 may determine the one or more scan parameters by inputting the compression thickness and the first breast image into a scan parameter model.
  • the scan parameter determination module 430 may be configured to cause, based on the one or more scan parameters, the imaging device 110 to perform the breast tomography on the target breast.
  • the scan parameter determination module 430 may be configured to generate a second breast image of the target breast based on the breast tomography.
  • the feature parameter determination module may be configured to obtain a feature extraction model.
  • the scan parameter determination module may be configured to obtain a parameter determination model and/or a scan parameter model.
  • the modules in the processing engine 400 may be connected to or communicate with each other via a wired connection or a wireless connection.
  • the wired connection may include a metal cable, an optical cable, a hybrid cable, or the like, or any combination thereof.
  • the wireless connection may include a Local Area Network (LAN) , a Wide Area Network (WAN) , a Bluetooth, a ZigBee, a Near Field Communication (NFC) , or the like, or any combination thereof.
  • LAN Local Area Network
  • WAN Wide Area Network
  • NFC Near Field Communication
  • Two or more of the modules may be combined as a single module, and any one of the modules may be divided into two or more units.
  • the obtaining module 410 may be divided into two units. One of the two units may be configured to obtain the compression thickness, and the other may be configured to obtain the first breast image.
  • the processing engine 400 may further include a storage module (not shown in FIG. 4) .
  • the storage module may be configured to store data generated during any process performed by any component of the processing engine 400.
  • each of the components of the processing engine 400 may include a storage device. Additionally or alternatively, the components of the processing engine 400 may share a common storage device.
  • the feature parameter determination module 420 may be omitted.
  • FIG. 5 is a flowchart illustrating an exemplary process for determining one or more scan parameters according to some embodiments of the present disclosure.
  • the process 500 may be implemented in the imaging system 100 illustrated in FIG. 1.
  • the process 500 may be stored in a storage medium (e.g., the storage device 150, or the storage 220 of the processing device 140, the storage 390 of the terminal 130, the memory 360 of the terminal 130) as a form of instructions, and can be invoked and/or executed by the processing device 140 or the terminal 130 (e.g., the processor 210 of the processing device 140, the CPU 340 and/or the GPU 330 of the terminal 130, or one or more modules in the processing device 140 illustrated in FIG. 4) .
  • The operations of the process 500 presented below are intended to be illustrative. In some embodiments, the process 500 may be accomplished with one or more additional operations not described, and/or without one or more of the operations discussed. Additionally, the order in which the operations of the process 500 are illustrated in FIG. 5 and described below is not intended to be limiting.
  • the obtaining module 410 may obtain a compression thickness of a compressed target breast and a first breast image of the target breast.
  • For reasons related both to the immobilizing of the target breast and to image quality or intensity of X-rays delivered to the target breast, it is necessary to compress the target breast during a scan process (e.g., a process for obtaining a first breast image of the target breast and/or a process for obtaining a second breast image of the target breast).
  • the compression force may be applied through a compression pad of the imaging device 110 that compresses the target breast on a breast-holder tray of the imaging device 110.
  • the imaging device 110 may perform one or more pre-scans on the compressed target breast to obtain the first breast image of the target breast.
  • the first breast image may be used to determine one or more scan parameters that are configured to perform one or more formal scans on the target breast.
  • the first breast image may include one or more two-dimensional (2D) images.
  • the first breast image may include one or more preliminary images that have a relatively low image quality.
  • the first breast image may include breast tissue overlap.
  • the imaging device 110 may scan the target breast at one or more scan angles to obtain the first breast image.
  • the imaging device 110 may scan the target breast using a low dose of X-rays to obtain the first breast image.
  • the imaging device 110 may scan the target breast to obtain the first breast image using one or more preliminary parameters.
  • the preliminary parameters may be determined based on, e.g., the experience of a user (e.g., a doctor, an engineer, an operator, a technician) of the imaging system 100, and/or default settings of the imaging system 100.
  • the obtaining module 410 may obtain the compression thickness and/or the first breast image from the imaging device 110 and/or a storage device (e.g., the storage device 150, the storage 220, etc. ) of the imaging system 100.
  • the obtaining module 410 may receive a request for obtaining the compression thickness and/or the first breast image.
  • the obtaining module 410 may obtain the compression thickness and/or the first breast image from the imaging device 110 and/or a storage device of the imaging system 100.
  • the obtaining module 410 may receive a request for performing one or more pre-scans on the compressed target breast.
  • the obtaining module 410 may send the request to the imaging device 110 to cause the imaging device 110 to perform the one or more pre-scans on the compressed target breast to generate pre-scan data.
  • the processing device 140 may generate the first breast image based on the pre-scan data.
  • the obtaining module 410 may obtain the first breast image from the processing device 140.
  • the feature parameter determination module 420 may determine one or more breast feature parameters by inputting the first breast image into a feature extraction model.
  • the feature extraction model may include an expert system or an artificial intelligence network model.
  • the artificial intelligence network model may include a machine learning model.
  • the artificial intelligence network model may include a deep learning model.
  • the feature extraction model may be provided by a training process and configured to output the one or more breast feature parameters. Details regarding the training process of the feature extraction model may be found elsewhere in the present disclosure (e.g., the description in connection with FIG. 7 and FIG. 8) .
  • the artificial intelligence network model may include an artificial neural network model.
  • the artificial neural network model may include an input layer, an output layer, and a plurality of hidden layers.
  • the layers of the artificial neural network model may be connected in a feed-forward fashion, and an output of an i-th layer may be provided as an input to an (i+1)-th layer.
  • the input layer may be configured to receive an input (e.g., the first breast image) of the artificial neural network model.
  • Each hidden layer may perform a specific function, e.g., convolution, pooling, normalization, matrix multiplication, non-linear activation, or the like.
  • the output layer may receive an input from the preceding layer and apply one or more transformations to the received input to generate a processing result (e.g., the one or more breast feature parameters of the first breast image) of the artificial neural network model.
  • a convolutional neural network (CNN) model may be taken as an example.
  • exemplary hidden layers may include a convolutional layer, a pooling layer, and a fully connected layer.
  • an image (e.g., a sample breast image, the first breast image) input into the CNN model may be represented as a 2D matrix or a 3D matrix including a plurality of elements (e.g., pixels or voxels).
  • Each of the plurality of elements in the matrix may have a value representing a feature or characteristic of the element.
  • the convolutional layer may include one or more kernels, which may be used to extract a feature of the image.
  • each of the one or more convolutional kernels may have a specific size and stride.
  • each of the one or more kernels may filter a portion of the image to generate a specific feature corresponding to the portion.
  • the specific feature may be determined based on the one or more kernels.
  • Exemplary features may include a low-level feature (e.g., an edge feature, a textural feature) , a high-level feature, or a complicated feature.
  • the pooling layer may take an output of the convolutional layer as an input.
  • the pooling layer may include a plurality of pooling nodes, which may be used to sample the output of the convolutional layer, so as to reduce the computational load of data processing and accelerate the speed of data processing.
  • a size of the matrix representing the image may be reduced in the pooling layer.
  • the fully connected layer may include a plurality of neurons.
  • the neurons may be connected to the pooling nodes in the pooling layer.
  • a plurality of vectors corresponding to the plurality of pooling nodes may be determined based on one or more features of the image, and a plurality of weighting coefficients may be assigned to the plurality of vectors.
  • the output layer may determine an output based on the vectors and the weighting coefficients obtained from the fully connected layer.
  • an output of the output layer may include the one or more breast feature parameters.
  • the convolutional layer may include 36 convolutional kernels each of which has a size of 13*13 and a stride of 1.
  • the pooling layer may include a max-pooling layer.
  • the convolutional layer may extract features of the input first breast image.
  • the 36 convolution kernels with the size of 13*13 and the stride of 1 may output 36 feature images based on the first breast image.
  • the 36 feature images may be input into the pooling layer to achieve information filtering and feature selection in the 36 feature images.
  • the pooling layer may include a max-pooling layer, which can retain the main features in the feature images, prevent overfitting, improve the generalization ability of the feature extraction model, and reduce the data volume of the feature images, thereby improving the recognition rate of the one or more breast feature parameters.
  • the fully connected layer may reassemble all the local features output by the pooling layer into a complete representation through its weight matrix to achieve effective classification of the features.
  • the output layer may use logistic functions to output the one or more breast feature parameters.
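One way to realize the layer structure described above is sketched below in PyTorch. The 36 kernels of size 13*13 with stride 1 and the max-pooling layer follow the description; the input size (1×256×256), the 2×2 pooling window, the hidden width, and the number of output feature parameters are assumptions not fixed by the disclosure.

```python
import torch
from torch import nn

class BreastFeatureCNN(nn.Module):
    """Sketch of the described CNN: 36 convolutional kernels of size
    13*13 with stride 1, a max-pooling layer, a fully connected layer,
    and an output layer. Input size, pooling window, hidden width, and
    output count (8) are illustrative assumptions."""

    def __init__(self, num_feature_params: int = 8):
        super().__init__()
        self.conv = nn.Conv2d(1, 36, kernel_size=13, stride=1)  # 36 feature images
        self.pool = nn.MaxPool2d(2)  # information filtering / feature selection
        self.fc = nn.Linear(36 * 122 * 122, 128)  # reassemble local features
        self.out = nn.Linear(128, num_feature_params)  # logistic output

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (N, 1, 256, 256) under the assumed input size.
        x = torch.relu(self.conv(x))       # (N, 36, 244, 244)
        x = self.pool(x)                   # (N, 36, 122, 122)
        x = x.flatten(1)
        x = torch.relu(self.fc(x))
        return torch.sigmoid(self.out(x))  # e.g., type probabilities in [0, 1]
```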
  • the feature extraction model may be implemented on one or more processing devices (e.g., the processing device 140, the processor 210, the terminal 130, the CPU 340, the GPU 330, etc. ) .
  • one or more layers may be respectively implemented on a processing device.
  • one or more components of a layer may be implemented on a same processing device.
  • a plurality of processing devices may execute a parallel processing operation in some layers of the feature extraction model by, for example, assigning two or more processing devices for an operation of different nodes (e.g., a kernel, a pooling node, a neuron) in the feature extraction model.
  • For example, a first GPU may execute the operation corresponding to kernel A and kernel B, and a second GPU may execute the operation corresponding to kernel C and kernel D.
  • a plurality of GPUs may also execute the operation of other nodes (e.g., a kernel, a pooling node, a neuron) in the feature extraction model.
  • In some embodiments, a storage device (e.g., the storage device 150, the storage 220, the storage 390, the memory 360, etc.) may store data related to the feature extraction model, such as activation functions, a learned weight of each node, and/or a network topology (e.g., a number (or count) of the hidden layers, a type of each hidden layer, etc.).
  • the storage device may further store a training data set.
  • the one or more breast feature parameters may include at least one of a type, type probability, shape, distribution, edge, density, location, or a major axis of each of one or more regions of interest (ROIs) in the first breast image.
  • the background and the area of fat of the target breast in the first breast image may be darker, and the other areas in the first breast image may be brighter.
  • the brighter areas other than the glands of the target breast in the first breast image may be designated as one or more ROIs.
  • the type of an ROI may refer to a type of a lesion that is possibly included in the ROI.
  • the type of an ROI may include a calcification, a lump, a tumor, or the like.
  • the classification of the calcification may be further refined.
  • the calcification may include benign calcification, suspicious calcification, malignant calcification, or the like, or any combination thereof.
  • the benign calcification may include a skin calcification, a round calcification, a vascular calcification, a suture calcification, a small calcification, or the like, or any combination thereof.
  • the suspicious calcification may include an amorphous calcification.
  • the small calcification may have a first probability to belong to the benign calcification and have a second probability to belong to the amorphous calcification of the suspicious calcification.
  • the type probability of an ROI may refer to a probability that the ROI includes a certain type of a lesion (e.g., a calcification, a lump, a tumor, etc. ) .
  • the shape of the ROI may be output as one of the one or more breast feature parameters of the first breast image, instead of the distribution of the ROI.
  • the shape of the ROI may include a regular shape or an irregular shape.
  • the distribution of the ROI may be output as one of the one or more breast feature parameters of the first breast image, instead of the shape of the ROI.
  • the distribution of an ROI may include a diffuse distribution, a regional distribution, a clustered distribution, a linear distribution, a segmental distribution, or the like, or any combination thereof.
  • when the type of an ROI includes a lump, the edge of the ROI may be clear, fuzzy, or rough.
  • the edge of the ROI may be basically clear.
  • the major axis of an ROI may be considered as a distance between two points furthest away from each other in the ROI, which can be used to characterize the size of the ROI.
  • the feature parameter determination module 420 may determine a breast density feature of the target breast based on the compression thickness and/or the first breast image. In some embodiments, the feature parameter determination module 420 may modify at least one of the one or more breast feature parameters based on the breast density feature. The feature parameter determination module 420 may determine one or more scan parameters based on the one or more modified breast feature parameters.
  • the feature extraction model may include a function of determining the breast density feature of the target breast.
  • the first breast image may be input into the feature extraction model.
  • the feature extraction model may output the breast density feature of the target breast by analyzing the first breast image.
  • the compression thickness and the first breast image may be input into the feature extraction model.
  • the feature extraction model may output the breast density feature of the target breast by analyzing the first breast image and the compression thickness.
  • the breast size of the target breast may be estimated based on the breast contour of the target breast in the first breast image and the compression thickness. According to the difference in gray values of fat and glands of the target breast in the first breast image, a first ratio of the size of the fat area to the breast size, and a second ratio of the size of the gland area to the breast size may be estimated.
  • the breast density feature may be estimated based on the first ratio and the second ratio.
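The ratio-based estimation above can be illustrated with a short sketch. This is an assumption-laden toy example: the gray-value thresholds, the normalization of the image to [0, 1], and the final combination of the two ratios are all illustrative, not taken from the disclosure.

```python
import numpy as np

def estimate_density_feature(image: np.ndarray, thickness_mm: float,
                             fat_max: float = 0.35, gland_min: float = 0.55) -> float:
    breast_mask = image > 0.05                         # crude breast contour vs. dark background
    breast_size = breast_mask.sum() * thickness_mm     # contour area scaled by compression thickness
    fat_size = (breast_mask & (image <= fat_max)).sum() * thickness_mm
    gland_size = (image >= gland_min).sum() * thickness_mm
    first_ratio = fat_size / max(breast_size, 1.0)     # fat area : breast size
    second_ratio = gland_size / max(breast_size, 1.0)  # gland area : breast size
    # one plausible density feature: gland fraction among fat + gland
    return second_ratio / max(first_ratio + second_ratio, 1e-9)
```

Note that the thickness factor cancels in the ratios; it is kept here only to mirror the text's breast-size estimate.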
  • the breast density feature of the target breast may be particularly important in lesion detection and lesion positioning.
  • the gland of the target breast may be identified as a lesion (e.g., a lump or a tumor) , which leads to false positive recognition of the lesion. Therefore, the one or more breast feature parameters determined by the feature extraction model may be modified according to the breast density feature of the target breast. For example, when the target breast is a dense breast, if the one or more ROIs in the first breast image are identified as including a lump and the lump is located at the gland of the target breast, the type probability that the one or more ROIs include a lump may be appropriately reduced, as sketched below.
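A sketch of this modification step, under the assumption that the breast feature parameters are held in a plain dictionary; the reduction factor 0.8 is invented for illustration.

```python
def modify_for_density(feature_params: dict, is_dense_breast: bool,
                       reduction: float = 0.8) -> dict:
    # for a dense breast, a "lump" located at the gland is at risk of being a
    # false positive, so its type probability is appropriately reduced
    if (is_dense_breast
            and feature_params.get("type") == "lump"
            and feature_params.get("located_at_gland", False)):
        feature_params["type_probability"] *= reduction
    return feature_params
```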
  • the scan parameter determination module 430 may determine, based on the compression thickness and the one or more breast feature parameters, one or more scan parameters of breast tomography on the target breast.
  • the one or more scan parameters may include a scan angle range and a frame number of the breast tomography.
  • breast tomography (e.g., digital breast tomosynthesis (DBT) ) may generate volumetric images based on a number of 2D X-ray images or 2D projections that are obtained from different scan angles in the scan angle range.
  • the number (or count) of the 2D X-ray images or 2D projections may be referred to as a frame number of the breast tomography.
  • the scan angle range may include a range width and a range location.
  • the range location of the scan angle range may be 10° to the left of a scan reference point and 20° to the right of the scan reference point.
  • the scan reference point may refer to an initialized position of the X-ray tube and the detector of the imaging device 110.
  • a line connecting the focus of the X-ray tube and the central point of the detector may be perpendicular or horizontal to the ground.
  • the compression thickness and the one or more breast feature parameters may be impact factors of the one or more scan parameters, and may collectively affect the determination of the one or more scan parameters.
  • the weights of these impact factors may be different in different situations. For example, when an ROI includes a lump, the weight of the distribution of the ROI may be relatively small, which basically does not affect the determination of the one or more scan parameters. As another example, when the type of an ROI includes a calcification, the weights of the edge and the shape of the ROI may be relatively small, which basically do not affect the determination of the one or more scan parameters.
  • the scan parameter determination module 430 may determine the one or more scan parameters based on the compression thickness and the one or more breast feature parameters. In some embodiments, choosing a larger scan angle range and a larger frame number may facilitate the analysis of the internal layered structure of the target breast. When the one or more ROIs are concentrated and have a small size, for example, when the one or more ROIs include a calcification, choosing a smaller scan angle range and an appropriate frame number may help the identification of the calcification.
  • the corresponding relationship between the above-mentioned impact factors and the one or more scan parameters may include the following manners.
  • the scan angle range may be positively related to the frame number, and may be further affected by factors such as the compression thickness and the length of the major axis of the one or more ROIs. For example, when the estimated compression thickness is too large or small, or the estimated major axis is too long or short, after the scan angle range is determined based on the estimated compression thickness and/or the estimated major axis, the scan angle range may be accordingly enlarged or reduced according to the actual compression thickness and/or the major axis of the target breast.
  • Table 1 shows the corresponding relationship of the compression thickness and one or more breast feature parameters with the one or more scan parameters of the breast tomography in a condition that the one or more ROIs include a lump.
  • the feature extraction model may determine the density and the major axis of the one or more ROIs in the first breast image. According to the density and the major axis, the type and the type probability of the one or more ROIs may be preliminarily determined, as shown in Table 1. When the density of the one or more ROIs is lower than the density of the calcification, and the major axis is relatively large (e.g., larger than 1mm, 2mm, 3mm, 4mm, 5mm, 6mm, 7mm, 8mm, 9mm, 10mm, etc. ) , the one or more ROIs may be preliminarily determined as including a lump.
  • the feature extraction model may adjust the preliminarily determined type probability of the one or more ROIs based on the shape and the edge of the one or more ROIs. As shown in the second row and the third row in Table 1, the type probability of determining the one or more ROIs as including a lump may be increased (e.g., 80%) because of the clear edge and the regular shape of the one or more ROIs. According to the adjusted type and type probability of the one or more ROIs, the feature extraction model may finally determine the type and the type probability of the one or more ROIs. As shown in Table 1, it is concluded that the one or more ROIs may include a lump.
  • the scan angle range and the frame number may be determined based on the shape, edge and major axis of the one or more ROIs, and the compression thickness.
  • when the major axis of the one or more ROIs is shorter than 15 mm, the more irregular the shape of the one or more ROIs is, the larger the scan angle range and the frame number may be.
  • when the shape of the one or more ROIs is irregular, the thicker the compression thickness is, the larger the scan angle range and the frame number may be.
  • when the compression thickness is less than 30 mm, the longer the major axis of the one or more ROIs is, the larger the scan angle range and the frame number may be (these monotone relationships are sketched below) .
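The three monotone relationships above can be encoded as a small heuristic. All base values and increments below are invented for illustration; they stand in for the disclosure's Table 1 rather than reproduce it.

```python
def lump_scan_params(shape_irregularity: float, compression_mm: float,
                     major_axis_mm: float) -> tuple:
    # shape_irregularity in [0, 1]: 0 = regular, 1 = highly irregular
    angle_range_deg = 15.0 + 10.0 * shape_irregularity  # more irregular -> wider range
    frame_number = 9
    if compression_mm >= 30.0:        # thicker compression -> wider range, more frames
        angle_range_deg += 5.0
        frame_number += 2
    if major_axis_mm >= 15.0:         # longer major axis -> wider range, more frames
        angle_range_deg += 5.0
        frame_number += 2
    return angle_range_deg, frame_number
```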
  • Table 2 shows the corresponding relationship of the compression thickness and the one or more breast feature parameters with the one or more scan parameters of the breast tomography when the one or more ROIs include calcification.
  • the feature extraction model may determine the density and the major axis of the one or more ROIs in the first breast image. According to the density and the major axis, the type and the type probability of the one or more ROIs may be preliminarily determined. As shown in Table 2, when the density of the one or more ROIs is relatively high and the major axis of the one or more ROIs is relatively small (e.g., less than 1mm, 2mm, 3mm, 4mm, 5mm, 6mm, 7mm, 8mm, 9mm, 10mm, etc. ) , the one or more ROIs may be preliminarily determined as including a calcification.
  • the feature extraction model may further adjust the type and/or the type probability of the one or more ROIs based on the shape and the edge of the one or more ROIs. For example, the clearer the edge of the one or more ROIs is, the more regular the shape of the one or more ROIs may be, which may improve the probability of determining that the one or more ROIs include a calcification. According to the adjusted type and type probability of the one or more ROIs, the feature extraction model may finally determine the type and the type probability of the one or more ROIs. As shown in Table 2, it is concluded that the one or more ROIs may include a calcification.
  • the distribution of the one or more ROIs and the compression thickness may be further used to determine the scan angle range and the frame number.
  • the scan angle range and the frame number in a condition that the distribution of the one or more ROIs is diffuse may be larger than those in a condition that the distribution of the one or more ROIs is regional and/or clustered.
  • the scan angle range corresponding to the skin calcification may be larger than that corresponding to other benign calcifications (e.g., the round calcification and the small calcification in Table 2) .
  • the range location of the scan angle range may be determined based on the location of the one or more ROIs. For example, when the range width of the scan angle range is 30°, the range location may be determined by setting, based on the location of the one or more ROIs relative to the scan reference point of the imaging device 110, the scan angle range to 10° to the left of the scan reference point and 20° to the right of the scan reference point. Compared with a scan angle range of 30° with 15° on the left side of the reference point and 15° on the right side of the reference point, the scan angle range of 30° with 10° on the left side of the reference point and 20° on the right side of the reference point may lead to an image that is more conducive to the diagnosis of breast lesion.
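The asymmetric placement in the example above (30° split as 10° left / 20° right) can be expressed as a simple biasing function. The helper name and the 1/6 bias factor are assumptions chosen only to reproduce that example.

```python
def split_scan_range(range_width_deg: float, roi_offset: float) -> tuple:
    # roi_offset in [-1, 1]; positive values mean the ROI lies to the right of
    # the scan reference point, shifting the range toward that side
    right = range_width_deg * (0.5 + roi_offset / 6.0)
    left = range_width_deg - right
    return left, right

# split_scan_range(30.0, 1.0) == (10.0, 20.0): 10° left, 20° right of the reference point
```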
  • the one or more scan parameters may be any value allowed by the imaging device 110. In some embodiments, if the one or more determined scan parameters are beyond the allowed range of the imaging device 110, the scan parameter determination module 430 may modify the one or more determined scan parameters to a maximum value or a minimum value allowed by the imaging device 110.
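Clamping to the device's allowed range, as described above, reduces to a one-line helper (the name is illustrative):

```python
def clamp_to_device_range(value: float, device_min: float, device_max: float) -> float:
    # parameters beyond the allowed range are replaced by the device's
    # minimum or maximum allowed value
    return min(max(value, device_min), device_max)
```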
  • the scan parameter determination module 430 may cause, based on the one or more scan parameters, the imaging device 110 to perform the breast tomography on the target breast.
  • the scan parameter determination module 430 may generate a second breast image of the target breast based on the breast tomography.
  • the imaging device 110 may scan the compressed target breast at different scan angles within the scan angle range to obtain a number (equal to the frame number) of 2D X-ray images or 2D projections of the target breast.
  • the second breast image that is a volumetric image may be generated based on the 2D X-ray images or 2D projections.
  • the second breast image may be used to diagnose breast disease of the target breast.
  • FIG. 6 is a flowchart illustrating an exemplary process for determining one or more scan parameters according to some embodiments of the present disclosure.
  • the process 600 may be implemented in the imaging system 100 illustrated in FIG. 1.
  • the process 600 may be stored in a storage medium (e.g., the storage device 150, or the storage 220 of the processing device 140, the storage 390 of the terminal 130, the memory 360 of the terminal 130) as a form of instructions, and can be invoked and/or executed by the processing device 140 or the terminal 130 (e.g., the processor 210 of the processing device 140, the CPU 340 and/or the GPU 330 of the terminal 130, or one or more modules in the processing device 140 illustrated in FIG. 4) .
  • the process 600 may be accomplished with one or more additional operations not described, and/or without one or more of the operations discussed. Additionally, the order of the operations of the process 600 as illustrated in FIG. 6 and described below is not intended to be limiting. In some embodiments, on the basis of the process 500 in FIG. 5, the process 600 may further include an operation of processing the first breast image before determining the one or more breast feature parameters.
  • the obtaining module 410 may obtain a compression thickness of a compressed target breast and a first breast image of the target breast.
  • the obtaining module 410 may process the first breast image.
  • the feature parameter determination module 420 may determine one or more breast feature parameters by inputting the processed first breast image into a feature extraction model.
  • the scan parameter determination module 430 may determine, based on the compression thickness and the one or more breast feature parameters, one or more scan parameters of breast tomography on the target breast.
  • the one or more scan parameters may include a scan angle range and a frame number of the breast tomography.
  • the operations 610, 630, and 640 may be similar to the operations 510-530 of the process 500 in FIG. 5.
  • the obtaining module 410 may process the first breast image by performing downsampling, upsampling, image segmentation, image enhancement, or the like, or any combination thereof on the first breast image.
  • the obtaining module 410 may perform upsampling on the first breast image. For example, the obtaining module 410 may perform interpolation on the first breast image to improve the image quality of the first breast image, which is helpful to improve the accuracy of determining the one or more breast feature parameters. In some embodiments, when the first breast image includes relatively large data, the obtaining module 410 may perform downsampling on the first breast image. For example, the obtaining module 410 may reduce the number (or count) of pixels of the first breast image to reduce data volume of the first breast image, which may help increase the rate of determining the one or more breast feature parameters.
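A minimal resampling sketch, assuming SciPy is available; the cubic interpolation order and the single isotropic zoom factor are illustrative choices.

```python
import numpy as np
from scipy import ndimage

def resample(image: np.ndarray, factor: float) -> np.ndarray:
    # factor > 1 upsamples by interpolation (improving apparent resolution);
    # factor < 1 downsamples, reducing the pixel count and data volume
    return ndimage.zoom(image, zoom=factor, order=3)
```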
  • the obtaining module 410 may perform image segmentation on the first breast image by dividing the first breast image according to, for example, a preset image segmentation size (e.g., 240 mm *240 mm) .
  • the normalization of different first breast images may be achieved, which may improve the accuracy of determining the one or more breast feature parameters, reduce the difficulty of determining the one or more breast feature parameters, and increase the rate of determining the one or more breast feature parameters.
  • the obtaining module 410 may use spatial domain algorithms to achieve image enhancement processing on the first breast image, so as to obtain a processed first breast image that is conducive to the automatic determination of the one or more breast feature parameters.
  • the obtaining module 410 may perform the following operations to achieve the image enhancement of the first breast image.
  • the obtaining module 410 may obtain an equalization image by performing histogram equalization on the first breast image.
  • the obtaining module 410 may transform the equalization image into a frequency domain image.
  • the obtaining module 410 may obtain an amplified frequency domain image by amplifying a signal at one or more frequency points in the frequency domain image.
  • the obtaining module 410 may obtain a spatial domain image by performing inverse transformation on the amplified frequency domain image.
  • the obtaining module 410 may perform enhancement related to a greyscale range and contour extraction on the spatial domain image.
  • the first breast image may be understood as a spatial domain image.
  • the obtaining module 410 may perform the histogram equalization on the first breast image. For example, the obtaining module 410 may perform histogram equalization in the spatial domain to adjust the contrast of the first breast image using its image histogram, which can effectively address the problem that the background and/or the foreground of the first breast image is too dark or too bright due to overexposure or underexposure, thereby better reflecting the image details of the first breast image.
  • the obtaining module 410 may transform the equalization image into a frequency domain image. For example, the obtaining module 410 may transform the equalization image into a frequency domain image through a fast Fourier transformation algorithm.
  • the obtaining module 410 may transform the frequency domain image into a spatial domain image through an inverse Fourier transformation algorithm.
  • amplifying a signal at one or more frequency points in the frequency domain image may make the boundary of tissue in the first breast image clearer.
  • the contour extraction processing may be performed on the spatial domain image using, for example, a high-contrast retention algorithm to effectively extract the contour of tissue in the first breast image, so as to realize the sharpening of the image local features of the first breast image.
  • the grayscale range may refer to a grayscale range corresponding to a grayscale distribution of different tissues such as the calcification, the lump, the gland, the fat, or the skin in the first breast image.
  • the obtaining module 410 may perform enhancement on the spatial domain image in the grayscale range.
  • the obtaining module 410 may take a window location (e.g., the center point of the grayscale range) and a window width (e.g., the width of the grayscale range) of the grayscale range to cut and normalize the image grayscale of the spatial domain image.
  • the obtaining module 410 may set the gray values in the spatial domain image higher than the grayscale range to 1 (e.g., presented as white) , and set the gray values in the spatial domain image lower than the grayscale range to 0 (e.g., presented as black) , so as to highlight the region in the spatial domain image corresponding to the grayscale range. A combined sketch of these enhancement steps follows.
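Putting the enhancement steps above together, a hedged end-to-end sketch in NumPy follows. The image is assumed normalized to [0, 1]; the choice of which frequency points to amplify, the gain, and the window values are all illustrative.

```python
import numpy as np

def enhance(image: np.ndarray, window_center: float, window_width: float,
            gain: float = 2.0) -> np.ndarray:
    # 1) histogram equalization in the spatial domain
    hist, bin_edges = np.histogram(image.ravel(), bins=256, range=(0.0, 1.0))
    cdf = hist.cumsum() / hist.sum()
    equalized = np.interp(image.ravel(), bin_edges[:-1], cdf).reshape(image.shape)

    # 2) transform to the frequency domain (fast Fourier transformation)
    freq = np.fft.fftshift(np.fft.fft2(equalized))

    # 3) amplify the signal at selected frequency points (illustrative choice)
    mask = np.ones_like(freq)
    mask[: freq.shape[0] // 4, :] = gain
    freq *= mask

    # 4) inverse transformation back to a spatial domain image
    spatial = np.real(np.fft.ifft2(np.fft.ifftshift(freq)))

    # 5) grayscale-range enhancement: cut and normalize by window location/width,
    #    mapping values below the range to 0 (black) and above it to 1 (white)
    lo = window_center - window_width / 2.0
    hi = window_center + window_width / 2.0
    return np.clip((spatial - lo) / (hi - lo), 0.0, 1.0)
```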
  • the process 600 for determining the one or more scan parameters provided in this embodiment includes an operation of processing the first breast image, which can improve the efficiency and accuracy of determining the one or more breast feature parameters using the feature extraction model.
  • FIG. 7 is a flowchart illustrating an exemplary process for obtaining a feature extraction model according to some embodiments of the present disclosure.
  • the process 700 may be implemented in the imaging system 100 illustrated in FIG. 1.
  • the process 700 may be stored in a storage medium (e.g., the storage device 150, or the storage 220 of the processing device 140, the storage 390 of the terminal 130, the memory 360 of the terminal 130) as a form of instructions, and can be invoked and/or executed by the processing device 140 or the terminal 130 (e.g., the processor 210 of the processing device 140, the CPU 340 and/or the GPU 330 of the terminal 130, or one or more modules in the processing device 140 illustrated in FIG. 4) .
  • the feature parameter determination module 420 may obtain a plurality of sample breast images. Each of the plurality of sample breast images may include one or more sample feature parameters. In some embodiments, the one or more sample feature parameters may correspond to the one or more breast feature parameters.
  • the feature parameter determination module 420 may process the plurality of sample breast images, which may improve the accuracy and the efficiency of the training process of the feature extraction model.
  • the operation of processing the plurality of sample breast images may be similar to the operation of processing the first breast image illustrated in FIG. 6.
  • the extraction of the sample feature parameters from the plurality of sample breast images may be fully automated, semi-automated, manual, or a combination thereof.
  • one or more image processing algorithms may first detect one or more sample ROIs and then extract the sample feature parameters.
  • the image processing algorithm (s) may automatically detect one or more sample ROIs (e.g., calcifications, lumps, tumors in the plurality of sample breast images) , and then extract the sample feature parameters from the detected sample ROI (s) .
  • the sample feature parameters may be annotated or measured by a user.
  • the sample feature parameters may be extracted during the training using deep learning (e.g., deep structured learning, hierarchical learning, or deep machine learning) .
  • the preliminary model may process the sample breast images via multiple layers of feature extraction to produce the sample feature parameters used as ground truth of the training process.
  • Deep learning may be effective in extracting high-level abstractions (features) from low-level input data (e.g., raw images) suitable for the trained task, and so may be more convenient than, and superior to, hand-crafted features for identifying breast features in breast images.
  • the sample feature parameters may be extracted from the sample breast images based on gray information and/or morphological information in the sample breast images.
  • the gray values of breast calcifications in a breast image may be relatively higher, the distribution of breast calcifications in a breast image may be relatively loose, the edges of breast calcifications in a breast image may be relatively smooth, and the diameters of breast calcifications in a breast image may be between 100 microns and several millimeters.
  • the gray values of breast lumps in a breast image may be lower than those of calcifications.
  • the edges of breast lumps in a breast image may differ between malignant lumps and benign lumps.
  • the malignant lump may have sharp and scattering edges or irregular edges, and the benign lump may have smooth edges that have an oval or round shape.
  • the diameter of a lump may be a few millimeters to a few centimeters.
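The gray-value and morphology cues above suggest a simple rule-of-thumb classifier for automated annotation. The thresholds are assumptions for illustration only.

```python
def preliminary_type(mean_gray: float, major_axis_mm: float,
                     calcification_gray: float = 0.8) -> str:
    # calcifications: relatively high gray values, sub-centimeter diameters
    if mean_gray >= calcification_gray and major_axis_mm < 10.0:
        return "calcification"
    # lumps: lower gray values than calcifications, millimeters to centimeters
    if mean_gray < calcification_gray and major_axis_mm >= 10.0:
        return "lump"
    return "unknown"
```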
  • the feature parameter determination module 420 may obtain the feature extraction model by training a preliminary model using the plurality of sample breast images.
  • the plurality of sample breast images may be used as a training set of data (e.g., a training set of inputs each having a known output (e.g., ground truth) such as the sample feature parameters) to train the feature extraction model.
  • a preliminary model may be obtained.
  • the preliminary model may be trained based on the sample breast images and the sample feature parameters to obtain the feature extraction model.
  • the preliminary model may include a plurality of weight parameters that are to be determined during learning, which may be referred to as a training process.
  • a training set of data (a training set of inputs such as the sample breast images each having a known output such as the sample feature parameters) may be processed by the preliminary model so that the preliminary model may learn how to provide an output for new input data by generalizing the information it learns in the training process from the training data.
  • the purpose of learning may be to adapt the weight parameters on the incoming connections to predict the correct output when given an input.
  • the training process may be repeated to update the feature extraction model using the additional sample breast images with or without previous sample breast images used for earlier rounds of the training.
  • the feature extraction model may be determined by the imaging system 100 (e.g., the processing device 140, the terminal 130, a storage device (the storage device 150, the storage 220, the storage 390) ) or a third party (e.g., an external device) .
  • the imaging system 100 may determine and/or update the feature extraction model offline and store the feature extraction model in the storage device.
  • the feature extraction model may be determined and/or updated (or maintained) by, e.g., the manufacturer of the imaging device 110 or a vendor.
  • the manufacturer or the vendor may load the feature extraction model into the imaging system 100 or a portion thereof (e.g., the processing device 140 and/or the terminal 130) before or during the installation of the imaging device 110, the processing device 140, and/or the terminal 130, and maintain or update the feature extraction model from time to time (periodically or not) .
  • the maintenance or update may be achieved by installing a program stored on a storage device (e.g., a compact disc, a USB drive, etc. ) or retrieved from an external source (e.g., a server maintained by the manufacturer or vendor) via the network 120.
  • the program may include a new model (e.g., a new feature extraction model) or a portion of a model that substitutes or supplements a corresponding portion of the model.
  • FIG. 8 is a flowchart illustrating an exemplary training process for obtaining a feature extraction model according to some embodiments of the present disclosure.
  • the process 800 may be implemented in the imaging system 100 illustrated in FIG. 1.
  • the process 800 may be stored in a storage medium (e.g., the storage device 150, or the storage 220 of the processing device 140, the storage 390 of the terminal 130, the memory 360 of the terminal 130) as a form of instructions, and can be invoked and/or executed by the processing device 140 or the terminal 130 (e.g., the processor 210 of the processing device 140, the CPU 340 and/or the GPU 330 of the terminal 130, or one or more modules in the processing device 140 illustrated in FIG. 4) .
  • the process 800 may be accomplished with one or more additional operations not described, and/or without one or more of the operations discussed. Additionally, the order of the operations of the process 800 as illustrated in FIG. 8 and described below is not intended to be limiting. In some embodiments, operation 720 of the process 700 may be performed based on the process 800.
  • the feature parameter determination module 420 may obtain the feature extraction model by performing an iteration process including one or more iterations. In some embodiments, the feature parameter determination module 420 may update weight parameters of a preliminary model by performing an iteration process of a backpropagation neural network training procedure, e.g., a stochastic gradient descent backpropagation training technique, to determine the updated values of the weight parameters of the feature extraction model. For example, the feature parameter determination module 420 may backpropagate the error determined for the output of the neural network in order to adjust the parameters of the neural network layers.
  • the feature parameter determination module 420 may obtain one or more output feature parameters by inputting one or more of the plurality of sample breast images into an intermediate model.
  • the intermediate model may include a preliminary model in a first iteration of the one or more iterations of the iteration process or an updated model generated in a previous iteration of the current iteration.
  • the feature parameter determination module 420 may determine a difference between the one or more output feature parameters and the one or more sample feature parameters. In some embodiments, the feature parameter determination module 420 may determine a value of a loss function based on the difference.
  • the feature parameter determination module 420 may determine whether a termination condition is satisfied.
  • An exemplary termination condition may be that the value of the loss function in the current iteration is less than a threshold value.
  • Other exemplary termination conditions may include that a maximum number (or count) of iterations has been performed, and/or that a difference between the values of the loss function obtained in a previous iteration and the current iteration (or among the values of the loss function within a certain number or count of successive iterations) is less than a certain threshold.
  • in response to determining that the termination condition is not satisfied, the process 800 may proceed to 840, and initiate a new iteration by further repeating 810-830 until the termination condition is satisfied.
  • in response to determining that the termination condition is satisfied, the iterative process may be terminated, and the intermediate model in the current iteration may be determined as the feature extraction model, which may be stored and/or output.
  • the feature parameter determination module 420 may update the intermediate model based on the difference between the one or more output feature parameters and the one or more sample feature parameters. For example, the feature parameter determination module 420 may update the weight parameters in the intermediate model based on the difference between the one or more output feature parameters and the one or more sample feature parameters.
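A condensed sketch of operations 810-840, assuming PyTorch, a mean-squared-error loss, and plain stochastic gradient descent (the disclosure names SGD backpropagation as one example); the data loader and all hyperparameters are assumptions.

```python
import torch

def train(model, loader, loss_threshold: float = 1e-3,
          max_iterations: int = 10_000, lr: float = 1e-3):
    optimizer = torch.optim.SGD(model.parameters(), lr=lr)
    loss_fn = torch.nn.MSELoss()
    for iteration, (sample_images, sample_params) in enumerate(loader):
        output_params = model(sample_images)          # 810: output feature parameters
        loss = loss_fn(output_params, sample_params)  # 820: difference -> loss value
        if loss.item() < loss_threshold or iteration >= max_iterations:
            break                                     # 830: termination condition satisfied
        optimizer.zero_grad()
        loss.backward()                               # backpropagate the error
        optimizer.step()                              # 840: update the weight parameters
    return model                                      # the trained feature extraction model
```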
  • FIG. 9 is a flowchart illustrating an exemplary process for determining one or more scan parameters according to some embodiments of the present disclosure.
  • the process 900 may be implemented in the imaging system 100 illustrated in FIG. 1.
  • the process 900 may be stored in a storage medium (e.g., the storage device 150, or the storage 220 of the processing device 140, the storage 390 of the terminal 130, the memory 360 of the terminal 130) as a form of instructions, and can be invoked and/or executed by the processing device 140 or the terminal 130 (e.g., the processor 210 of the processing device 140, the CPU 340 and/or the GPU 330 of the terminal 130, or one or more modules in the processing device 140 illustrated in FIG. 4) .
  • the process 900 may be accomplished with one or more additional operations not described, and/or without one or more of the operations discussed. Additionally, the order of the operations of the process 900 as illustrated in FIG. 9 and described below is not intended to be limiting. In some embodiments, on the basis of the process 500 in FIG. 5, the process 900 may determine the one or more scan parameters by inputting the compression thickness and the one or more breast feature parameters into a parameter determination model.
  • the obtaining module 410 may obtain a compression thickness of a compressed target breast and a first breast image of the target breast.
  • the feature parameter determination module 420 may determine one or more breast feature parameters by inputting the first breast image into a feature extraction model.
  • the scan parameter determination module 430 may determine one or more scan parameters by inputting the compression thickness and the one or more breast feature parameters into a parameter determination model.
  • the operations 910 and 920 may be similar to the operations 510 and 520 of the process 500 in FIG. 5.
  • the parameter determination model may include an expert system or an artificial intelligence network model.
  • the artificial intelligence network model may include a machine learning model.
  • the artificial intelligence network model may include a deep learning model.
  • the artificial intelligence network model may include an artificial neural network model (e.g., a CNN model) .
  • the parameter determination model may be provided by a training process and configured to output the one or more scan parameters.
  • the training process of the parameter determination model may be similar to the training process of the feature extraction model illustrated in FIG. 7 and FIG. 8.
  • sample feature parameters and sample compression thicknesses of a plurality of sample breast images, and sample scan parameters used to generate the plurality of sample breast images may be obtained and used as a training set of data (e.g., a training set of inputs such as the sample feature parameters and the sample compression thicknesses each having a known output (e.g., ground truth) such as the sample scan parameters) to train the parameter determination model.
  • the extraction of the sample feature parameters may be similar to the relevant description in connection with operation 710 of the process 700 in FIG. 7.
  • the parameter determination model may realize the automatic determination of the one or more scan parameters of the breast tomography.
  • FIG. 10 is a flowchart illustrating an exemplary process for determining one or more scan parameters according to some embodiments of the present disclosure.
  • the process 1000 may be implemented in the imaging system 100 illustrated in FIG. 1.
  • the process 1000 may be stored in a storage medium (e.g., the storage device 150, or the storage 220 of the processing device 140, the storage 390 of the terminal 130, the memory 360 of the terminal 130) as a form of instructions, and can be invoked and/or executed by the processing device 140 or the terminal 130 (e.g., the processor 210 of the processing device 140, the CPU 340 and/or the GPU 330 of the terminal 130, or one or more modules in the processing device 140 illustrated in FIG. 4) .
  • the operations of the process 1000 presented below are intended to be illustrative. In some embodiments, the process 1000 may be accomplished with one or more additional operations not described, and/or without one or more of the operations discussed. Additionally, the order of the operations of the process 1000 as illustrated in FIG. 10 and described below is not intended to be limiting.
  • the obtaining module 410 may obtain a compression thickness of a compressed target breast and a first breast image of the target breast.
  • the operation 1010 may be similar to the operation 510 of the process 500 in FIG. 5.
  • the scan parameter determination module 430 may determine, based on the compression thickness and the first breast image, one or more scan parameters of breast tomography on the target breast.
  • the one or more scan parameters may include a scan angle range and a frame number of the breast tomography.
  • the scan parameter determination module 430 may determine the one or more scan parameters by inputting the compression thickness and the first breast image into a scan parameter model.
  • the scan parameter model may include a function of extracting features from an image. Therefore, the scan parameter model may directly output the one or more scan parameters based on the input of the compression thickness and the first breast image. For example, the scan parameter model may extract one or more breast feature parameters from the first breast image. Then, the scan parameter model may determine the one or more scan parameters based on the compression thickness and the one or more breast feature parameters.
  • the scan parameter model may include an expert system or an artificial intelligence network model.
  • the artificial intelligence network model may include a machine learning model.
  • the artificial intelligence network model may include a deep learning model.
  • the scan parameter model may be provided by a training process and configured to output the one or more scan parameters.
  • the artificial intelligence network model may include an artificial neural network model.
  • the artificial neural network model may include an input layer, an output layer, and a plurality of hidden layers.
  • the layers of the artificial neural network model may be connected in a feed-forward fashion, and an output of an i-th layer may be provided as an input to an (i+1) -th layer.
  • the input layer may be configured to receive an input (e.g., the first breast image and the compression thickness) of the artificial neural network model.
  • Each hidden layer may perform a specific function, e.g., convolution, pooling, normalization, matrix multiplication, non-linear activation, or the like.
  • the output layer may receive an input from the preceding layer and apply one or more transformations to the received input to generate a processing result (e.g., the one or more scan parameters) of the artificial neural network model.
  • a convolutional neural network (CNN) model may be taken as an example.
  • exemplary hidden layers may include a convolutional layer, a pooling layer, and a fully connected layer.
  • the first breast image and the compression thickness may be inputted into the scan parameter model.
  • the first breast image may be represented as a 2D matrix or a 3D matrix including a plurality of elements (e.g., pixels or voxels) . Each of the plurality of elements in the matrix may have a value representing a feature or characteristic of the element.
  • the convolutional layer may include one or more kernels, which may be used to extract a feature of the image (e.g., one or more breast feature parameters of the first breast image) .
  • each of the one or more convolutional kernels may have a specific size and stride.
  • each of the one or more kernels may filter a portion of the first breast image to generate a specific feature corresponding to the portion. The specific feature may be determined based on the one or more kernels.
  • Exemplary features may include a low-level feature (e.g., an edge feature, a textural feature) , a high-level feature, or a complicated feature.
  • the pooling layer may take an output of the convolutional layer as an input.
  • the pooling layer may include a plurality of pooling nodes, which may be used to sample the output of the convolutional layer, so as to reduce the computational load and accelerate the speed of data processing.
  • a size of the matrix representing the first breast image may be reduced in the pooling layer.
  • the fully connected layer may include a plurality of neurons.
  • the neurons may be connected to the pooling nodes in the pooling layer.
  • a plurality of vectors corresponding to the plurality of pooling nodes may be determined based on the one or more breast feature parameters of the first breast image and the compression thickness, and a plurality of weighting coefficients may be assigned to the plurality of vectors.
  • the output layer may determine an output based on the vectors and the weighting coefficients obtained from the fully connected layer.
  • an output of the output layer may include the one or more scan parameters.
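A hedged PyTorch sketch of such a network, showing one way the scalar compression thickness can be fused with the image features before the fully connected layer; all layer sizes, the class name, and the two-element output (scan angle range, frame number) are assumptions.

```python
import torch
import torch.nn as nn

class ScanParameterNet(nn.Module):                    # name is illustrative
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=5),          # kernels extract image features
            nn.ReLU(),
            nn.MaxPool2d(2),                          # pooling nodes sample the output
            nn.AdaptiveAvgPool2d((8, 8)),
            nn.Flatten(),
        )
        # +1 input: the compression thickness joins the image features here
        self.fc = nn.Sequential(nn.Linear(16 * 8 * 8 + 1, 64), nn.ReLU())
        self.out = nn.Linear(64, 2)                   # scan angle range, frame number

    def forward(self, image: torch.Tensor, thickness: torch.Tensor) -> torch.Tensor:
        feats = self.features(image)                  # (N, 1024)
        x = torch.cat([feats, thickness.unsqueeze(1)], dim=1)
        return self.out(self.fc(x))
```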
  • the training process of the scan parameter model may be similar to the training process of the feature extraction model illustrated in FIG. 7 and FIG. 8.
  • a plurality of sample breast images, sample compression thicknesses of the plurality of sample breast images, and sample scan parameters used to generate the plurality of sample breast images may be obtained and used as a training set of data (e.g., a training set of inputs such as the sample breast images and the sample compression thicknesses each having a known output (e.g., ground truth) such as the sample scan parameters) to train the scan parameter model.
  • aspects of the present disclosure may be illustrated and described herein in any of a number of patentable classes or context including any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof. Accordingly, aspects of the present disclosure may be implemented entirely in hardware, entirely in software (including firmware, resident software, micro-code, etc. ) , or in an implementation combining software and hardware that may all generally be referred to herein as a “unit, ” “module, ” or “system. ” Furthermore, aspects of the present disclosure may take the form of a computer program product embodied in one or more computer readable media having computer readable program code embodied thereon.
  • a computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated signal may take any of a variety of forms, including electro-magnetic, optical, or the like, or any suitable combination thereof.
  • a computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that may communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
  • Program code embodied on a computer readable signal medium may be transmitted using any appropriate medium, including wireless, wireline, optical fiber cable, RF, or the like, or any suitable combination of the foregoing.
  • Computer program code for carrying out operations for aspects of the present disclosure may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Scala, Smalltalk, Eiffel, JADE, Emerald, C++, C#, VB.NET, Python or the like, conventional procedural programming languages, such as the "C" programming language, Visual Basic, Fortran 2003, Perl, COBOL 2002, PHP, ABAP, dynamic programming languages such as Python, Ruby and Groovy, or other programming languages.
  • the program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server.
  • the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN) , or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider) or in a cloud computing environment or offered as a service such as a Software as a Service (SaaS) .

Abstract

A method may include obtaining a compression thickness of a compressed target breast and a first breast image of the target breast. The method may also include determining one or more breast feature parameters by inputting the first breast image into a feature extraction model. The method may also include determining, based on the compression thickness and the one or more breast feature parameters, one or more scan parameters of breast tomography on the target breast. The one or more scan parameters may include a scan angle range and a frame number of the breast tomography.

Description

IMAGING SYSTEMS AND METHODS
CROSS-REFERENCE TO RELATED APPLICATIONS
This application claims priority to Chinese Patent Application No. 201911419230.0 filed on December 31, 2019, the contents of which are hereby incorporated by reference.
TECHNICAL FIELD
The present disclosure generally relates to medical imaging, and in particular, to systems and methods for determining scan parameters of breast tomography.
BACKGROUND
Mammography is an effective screening and diagnostic tool for detection of breast disease (e.g., breast cancer) . However, the nature of two-dimensional (2D) mammography (e.g., full-field digital mammography, FFDM) leads to breast tissue overlap in the resultant images, which lowers the accuracy of diagnosis for breast disease. Breast tomography, e.g., digital breast tomosynthesis (DBT) , is a three-dimensional (3D) imaging technique that is designed to overcome this problem. Therefore, it is desirable to provide systems and methods for automatically determining scan parameters of breast tomography to realize more accurate and more efficient breast tomography.
SUMMARY
According to a first aspect of the present disclosure, a system may include one or more storage devices and one or more processors configured to communicate with the one or more storage devices. The one or more storage devices may include a set of instructions. When the one or more processors execute the set of instructions, the one or more processors may be directed to perform one or more of the following operations. The one or more processors may obtain a compression thickness of a compressed target breast and a first breast  image of the target breast. The one or more processors may determine one or more breast feature parameters by inputting the first breast image into a feature extraction model. The one or more processors may determine, based on the compression thickness and the one or more breast feature parameters, one or more scan parameters of breast tomography on the target breast. The one or more scan parameters may include a scan angle range and a frame number of the breast tomography.
In some embodiments, before determining the one or more breast feature parameters based on the first breast image, the one or more processors may process the first breast image.
In some embodiments, to process the first breast image, the one or more processors may perform at least one of downsampling, upsampling, image segmentation, or image enhancement on the first breast image.
In some embodiments, to perform image enhancement on the first breast image, the one or more processors may obtain an equalization image by performing histogram equalization on the first breast image. The one or more processors may transform the equalization image into a frequency domain image. The one or more processors may obtain an amplified frequency domain image by amplifying a signal at one or more frequency points in the frequency domain image. The one or more processors may obtain a spatial domain image by performing inverse transformation on the amplified frequency domain image. The one or more processors may perform enhancement related to a greyscale range and contour extraction on the spatial domain image.
In some embodiments, the feature extraction model may include an expert system or an artificial intelligence network model.
In some embodiments, the artificial intelligence network model may include a machine learning model.
In some embodiments, the artificial intelligence network model may include a  deep learning model.
In some embodiments, the deep learning model may include a convolutional neural network model. The convolutional neural network model may include an input layer, a convolutional layer, a pooling layer, a fully connected layer, and an output layer.
In some embodiments, the convolutional layer may include 36 convolutional kernels each of which has a size of 13*13 and a stride of 1. The pooling layer may include a max-pooling layer.
In some embodiments, the feature extraction model may be provided by: obtaining a plurality of sample breast images; and obtaining the feature extraction model by training a preliminary model using the plurality of sample breast images. Each of the plurality of sample breast images may include one or more sample feature parameters.
In some embodiments, to obtain the feature extraction model by training the preliminary model using the plurality of sample breast images, the one or more processors may obtain the feature extraction model by performing an iteration process including one or more iterations. At least one of the one or more iterations may include: obtaining one or more output feature parameters by inputting one or more of the plurality of sample breast images into an intermediate model; and updating the intermediate model based on a difference between the one or more output feature parameters and the one or more sample feature parameters. The intermediate model may include the preliminary model in a first iteration of the one or more iterations of the iteration process or an updated model generated in a previous iteration of the at least one of the one or more iterations.
In some embodiments, to determine the one or more scan parameters based on the compression thickness and the one or more breast feature parameters, the one or more processors may determine the one or more scan parameters by inputting the compression thickness and the one or more breast feature parameters  into a parameter determination model.
In some embodiments, the parameter determination model may include a machine learning model.
In some embodiments, to determine the one or more scan parameters based on the compression thickness and the one or more breast feature parameters, the one or more processors may determine a breast density feature of the target breast based on the compression thickness. The one or more processors may modify at least one of the one or more breast feature parameters based on the breast density feature. The one or more processors may determine the one or more scan parameters based on the one or more modified breast feature parameters.
In some embodiments, the one or more breast feature parameters may include at least one of a type, shape, distribution, edge, or a major axis of each of one or more regions of interest (ROIs) in the first breast image.
In some embodiments, the scan angle range and the frame number may be any value allowed by an imaging device that performs the breast tomography.
In some embodiments, the one or more processors may cause, based on the one or more scan parameters, an imaging device to perform the breast tomography on the target breast. The one or more processors may generate a second breast image of the target breast based on the breast tomography.
In some embodiments, the breast tomography may include digital breast tomosynthesis (DBT) .
According to another aspect of the present disclosure, a method may include one or more of the following operations. One or more processors may obtain a compression thickness of a compressed target breast and a first breast image of the target breast. The one or more processors may determine one or more breast feature parameters by inputting the first breast image into a feature extraction model. The one or more processors may determine, based on the compression thickness and the one or more breast feature parameters, one or more scan parameters of  breast tomography on the target breast. The one or more scan parameters may include a scan angle range and a frame number of the breast tomography.
According to yet another aspect of the present disclosure, a system may include an obtaining module configured to obtain a compression thickness of a compressed target breast and a first breast image of the target breast. The system may also include a feature parameter determination module configured to determine one or more breast feature parameters by inputting the first breast image into a feature extraction model. The system may also include a scan parameter determination module configured to determine, based on the compression thickness and the one or more breast feature parameters, one or more scan parameters of breast tomography on the target breast. The one or more scan parameters may include a scan angle range and a frame number of the breast tomography.
According to yet another aspect of the present disclosure, a non-transitory computer readable medium may comprise at least one set of instructions. The at least one set of instructions may be executed by one or more processors of a computing device. The one or more processors may obtain a compression thickness of a compressed target breast and a first breast image of the target breast. The one or more processors may determine one or more breast feature parameters by inputting the first breast image into a feature extraction model. The one or more processors may determine, based on the compression thickness and the one or more breast feature parameters, one or more scan parameters of breast tomography on the target breast. The one or more scan parameters may include a scan angle range and a frame number of the breast tomography.
Additional features will be set forth in part in the description which follows, and in part will become apparent to those skilled in the art upon examination of the following and the accompanying drawings or may be learned by production or operation of the examples. The features of the present disclosure may be realized and attained by practice or use of various aspects of the methodologies,  instrumentalities, and combinations set forth in the detailed examples discussed below.
BRIEF DESCRIPTION OF THE DRAWINGS
The present disclosure is further described in terms of exemplary embodiments. These exemplary embodiments are described in detail with reference to the drawings. These embodiments are non-limiting exemplary embodiments, in which like reference numerals represent similar structures throughout the several views of the drawings, and wherein:
FIG. 1 is a schematic diagram illustrating an exemplary imaging system according to some embodiments of the present disclosure;
FIG. 2 is a schematic diagram illustrating exemplary hardware and/or software components of a computing device according to some embodiments of the present disclosure;
FIG. 3 is a schematic diagram illustrating exemplary hardware and/or software components of a mobile device according to some embodiments of the present disclosure;
FIG. 4 is a schematic block diagram illustrating an exemplary processing device according to some embodiments of the present disclosure;
FIG. 5 is a flowchart illustrating an exemplary process for determining one or more scan parameters according to some embodiments of the present disclosure;
FIG. 6 is a flowchart illustrating an exemplary process for determining one or more scan parameters according to some embodiments of the present disclosure;
FIG. 7 is a flowchart illustrating an exemplary process for obtaining a feature extraction model according to some embodiments of the present disclosure;
FIG. 8 is a flowchart illustrating an exemplary training process for obtaining a feature extraction model according to some embodiments of the present disclosure;
FIG. 9 is a flowchart illustrating an exemplary process for determining one or more scan parameters according to some embodiments of the present disclosure;  and
FIG. 10 is a flowchart illustrating an exemplary process for determining one or more scan parameters according to some embodiments of the present disclosure.
DETAILED DESCRIPTION
In the following detailed description, numerous specific details are set forth by way of examples in order to provide a thorough understanding of the relevant disclosure. However, it should be apparent to those skilled in the art that the present disclosure may be practiced without such details. In other instances, well-known methods, procedures, systems, components, and/or circuitry have been described at a relatively high level, without detail, in order to avoid unnecessarily obscuring aspects of the present disclosure. Various modifications to the disclosed embodiments will be readily apparent to those skilled in the art, and the general principles defined herein may be applied to other embodiments and applications without departing from the spirit and scope of the present disclosure. Thus, the present disclosure is not limited to the embodiments shown, but is to be accorded the widest scope consistent with the claims.
The terminology used herein is for the purpose of describing particular example embodiments only and is not intended to be limiting. As used herein, the singular forms “a, ” “an, ” and “the” may be intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms “comprise, ” “comprises, ” and/or “comprising, ” “include, ” “includes, ” and/or “including, ” when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It will be understood that the terms “system, ” “unit, ” “module, ” and/or “block” used herein are one method to distinguish different components, elements, parts, sections, or assemblies of different levels in ascending order. However, the terms may be replaced by another expression if they achieve the same purpose.
Generally, the word “module, ” “unit, ” or “block, ” as used herein, refers to logic embodied in hardware or firmware, or to a collection of software instructions. A module, a unit, or a block described herein may be implemented as software and/or hardware and may be stored in any type of non-transitory computer-readable medium or another storage device. In some embodiments, a software module/unit/block may be compiled and linked into an executable program. It will be appreciated that software modules can be callable from other modules/units/blocks or from themselves, and/or may be invoked in response to detected events or interrupts. Software modules/units/blocks configured for execution on computing devices (e.g., processor 210 as illustrated in FIG. 2) may be provided on a computer readable medium, such as a compact disc, a digital video disc, a flash drive, a magnetic disc, or any other tangible medium, or as a digital download (and can be originally stored in a compressed or installable format that needs installation, decompression, or decryption prior to execution) . Such software code may be stored, partially or fully, on a storage device of the executing computing device, for execution by the computing device. Software instructions may be embedded in firmware, such as an EPROM. It will be further appreciated that hardware modules/units/blocks may be comprised of connected logic components, such as gates and flip-flops, and/or of programmable units, such as programmable gate arrays or processors. The modules/units/blocks or computing device functionality described herein may be implemented as software modules/units/blocks, but may be represented in hardware or firmware. In general, the modules/units/blocks described herein refer to logical modules/units/blocks that may be combined with other modules/units/blocks or divided into sub-modules/sub-units/sub-blocks despite their physical organization or storage.
It will be understood that when a unit, engine, module or block is referred to as being “on, ” “connected to, ” or “coupled to, ” another unit, engine, module, or block,  it may be directly on, connected or coupled to, or communicate with the other unit, engine, module, or block, or an intervening unit, engine, module, or block may be present, unless the context clearly indicates otherwise. As used herein, the term “and/or” includes any and all combinations of one or more of the associated listed items.
These and other features, and characteristics of the present disclosure, as well as the methods of operation and functions of the related elements of structure and the combination of parts and economies of manufacture, may become more apparent upon consideration of the following description with reference to the accompanying drawings, all of which form a part of this disclosure. It is to be expressly understood, however, that the drawings are for the purpose of illustration and description only and are not intended to limit the scope of the present disclosure. It is understood that the drawings are not to scale.
Provided herein are systems and components for medical imaging and/or medical treatment. In some embodiments, the medical system may include an imaging system. The imaging system may include a single modality imaging system and/or a multi-modality imaging system. The single modality imaging system may include, for example, an X-ray imaging system (e.g., a computed tomography (CT) imaging system, a digital subtraction angiography (DSA) imaging system, a digital radiology (DR) imaging system, a computed radiology (CR) imaging system, etc. ) , an ultrasound imaging system (e.g., a color Doppler flow imaging (CDFI) system) , a magnetic resonance imaging (MRI) system, or a nuclear medical imaging system (e.g., a positron emission tomography (PET) imaging system, a single photon emission computed tomography (SPECT) imaging system, etc. ) . The multi-modality imaging system may include, for example, a magnetic resonance imaging-computed tomography (MRI-CT) system, a positron emission tomography-magnetic resonance imaging (PET-MRI) system, a single photon emission computed tomography-magnetic resonance imaging (SPECT-MRI) system, a digital subtraction angiography-magnetic resonance imaging (DSA-MRI) system, a positron emission tomography-computed tomography (PET-CT) imaging system, etc. In some embodiments, the medical system may include a treatment system. The treatment system may include a treatment plan system (TPS) , image-guided radiotherapy (IGRT) , etc. The image-guided radiotherapy (IGRT) may include a treatment device and an imaging device. The treatment device may include a linear accelerator, a cyclotron, a synchrotron, etc., configured to perform radiotherapy on a subject. The treatment device may include an accelerator of species of particles including, for example, photons, electrons, protons, or heavy ions. The imaging device may include an MRI scanner, a CT scanner (e.g., a cone beam computed tomography (CBCT) scanner) , a digital radiology (DR) scanner, an electronic portal imaging device (EPID) , etc.
It should be noted that, in the present disclosure, an image, or a portion thereof (e.g., a region in the image) corresponding to an object (e.g., tissue, an organ, a tumor, etc. ) may be referred to as an image, or a portion thereof (e.g., a region) of or including the object, or the object itself. For instance, a region in an image that corresponds to or represents a breast may be described as a region that includes a breast. As another example, an image of or including a breast may be referred to as a breast image, or simply a breast. For brevity, that a portion of an image corresponding to or representing an object is processed (e.g., extracted, segmented, etc. ) may be described as the object being processed. For instance, that a portion of an image corresponding to a breast is segmented from the rest of the image may be described as the breast being segmented from the image.
Mammography is an effective screening and diagnostic tool for detection of breast disease (e.g., breast cancer) . However, the nature of two-dimensional (2D) mammography (e.g., full-field digital mammography, FFDM) leads to breast tissue overlap in the resultant images, which lowers the accuracy of diagnosis of breast disease. Breast tomography, e.g., digital breast tomosynthesis (DBT) , is a three-dimensional (3D) imaging technique that is designed to overcome this problem. It is an angle tomography technique that provides reconstruction planes in the breast using projection images from a scan angle range. Volumetric images may be generated based on a number of 2D X-ray images or 2D projections that are obtained from different scan angles in the scan angle range. The number (or count) of the 2D X-ray images or 2D projections may be referred to as a frame number of the breast tomography.
An aspect of the present disclosure provides systems and methods for automatically determining scan parameters of breast tomography to realize more accurate and more efficient breast tomography. In the present disclosure, a compression thickness of a compressed target breast and a first breast image of the target breast may be obtained. The first breast image may be a 2D preliminary breast image of the target breast. One or more breast feature parameters may be determined by inputting the first breast image into a feature extraction model. One or more scan parameters of breast tomography on the target breast may be determined based on the compression thickness and the one or more breast feature parameters. The one or more scan parameters may include a scan angle range and a frame number of the breast tomography. The one or more scan parameters may be configured to cause an imaging device to perform the breast tomography on the target breast. A second breast image of the target breast may be generated based on the breast tomography. The second breast image may be a volumetric image and used to diagnose disease of the target breast.
The systems and methods for automatically determining scan parameters of breast tomography provided in the present disclosure may save human resources through automatic parameter determination. In addition, in the systems and methods provided in the present disclosure, the automatic parameter determination is no longer limited to the exposure dose, but extends to the scan angle range and the frame number. This improves the quality of the resultant images, makes the resultant images conducive to the diagnosis of breast disease, avoids repeated scanning and meaningless multi-frame scanning, and prevents patients from receiving an unnecessarily large dose of radiation.
FIG. 1 is a schematic diagram illustrating an exemplary imaging system 100 according to some embodiments of the present disclosure. In some embodiments, the imaging system 100 may be applied in breast tomography, e.g., digital breast tomosynthesis (DBT) . In some embodiments, the imaging system 100 may be applied in a Combo mode and/or a U-view mode. The Combo mode may refer to a mode in which one or more 2D breast images and one or more tomographic breast images are obtained in the breast tomography. The U-view mode may refer to a mode in which one or more fused 2D breast images and one or more tomographic breast images are obtained in the breast tomography.
As illustrated, the imaging system 100 may include an imaging device 110, a network 120, a terminal 130, a processing device 140, and a storage device 150. The components of the imaging system 100 may be connected in one or more of various ways. Merely by way of example, as illustrated in FIG. 1, the imaging device 110 may be connected to the processing device 140 through the network 120. As another example, the imaging device 110 may be connected to the processing device 140 directly (as indicated by the bi-directional arrow in dotted lines linking the imaging device 110 and the processing device 140) . As a further example, the storage device 150 may be connected to the processing device 140 directly or through the network 120. As still a further example, a terminal device (e.g., 131, 132, 133, etc. ) may be connected to the processing device 140 directly (as indicated by the bi-directional arrow in dotted lines linking the terminal 130 and the processing device 140) or through the network 120.
In some embodiments, the imaging device 110 may include a medical imaging device. The imaging device 110 may be used to scan an object located within its detection region and generate a plurality of scan data (e.g., digital signals) used to generate one or more images relating to the object. In the present disclosure, “subject” and “object” are used interchangeably. Merely by way of example, the object may include a patient, a man-made object, etc. As another example, the object may include a specific portion, organ, and/or tissue of a patient. For example, the object may include head, brain, neck, body, shoulder, arm, thorax, heart, stomach, blood vessel, soft tissue, knee, feet, breast, or the like, or any combination thereof.
In some embodiments, the imaging device 110 may be used to scan a breast of the object (e.g., a patient) . For example, the imaging device 110 may be an X-ray device. The X-ray device may include a breast-holder tray on which the patient lays her breast, an X-ray tube, and a detector. The breast-holder tray may be placed on the top of the detector. The detector may be placed beneath the breast-holder tray. The X-ray tube may emit X-rays going through the breast. The detector may be located opposite to the X-ray tube so as to detect the X-rays that have crossed the patient’s breast and the breast-holder tray. The detector may transform the light signals of the detected X-rays into digital signals and transmit the digital signals to the processing device 140 for further processing (e.g., generating a breast image) . In some embodiments, the X-ray device may further include a compression pad. For reasons related both to the immobilizing of the breast and to image quality or intensity of X-rays delivered to the patient's breast, it is necessary to compress the patient's breast during the scan process. The compression force may be applied through the compression pad that compresses the breast on the breast-holder tray. In some embodiments, the X-ray device may further include a high-voltage generator configured to provide the voltage that is required for the X-ray tube to produce X-rays. In some embodiments, the X-ray device may further include a collimator configured to adjust an X-ray irradiation range. The collimator also can absorb some scattered X-rays, which may improve the image quality. The  collimator may be located in front of the X-ray tube in the emitting direction of the X-rays. In some embodiments, the imaging device 110 may include a digital breast tomosynthesis (DBT) device.
The network 120 may include any suitable network that can facilitate the exchange of information and/or data for the imaging system 100. In some embodiments, one or more components of the imaging system 100 (e.g., the imaging device 110, the terminal 130, the processing device 140, or the storage device 150) may communicate information and/or data with one or more other components of the imaging system 100 via the network 120. For example, the processing device 140 may obtain scan data (e.g., digital signals) of a breast of an object (e.g., a patient) from the imaging device 110 via the network 120. In some embodiments, the network 120 may be any type of wired or wireless network, or a combination thereof. The network 120 may be and/or include a public network (e.g., the Internet) , a private network (e.g., a local area network (LAN) , a wide area network (WAN) , etc. ) , a wired network (e.g., an Ethernet network) , a wireless network (e.g., an 802.11 network, a Wi-Fi network, etc. ) , a cellular network (e.g., a Long Term Evolution (LTE) network) , a frame relay network, a virtual private network ( "VPN" ) , a satellite network, a telephone network, routers, hubs, switches, server computers, and/or any combination thereof. Merely by way of example, the network 120 may include a cable network, a wireline network, a fiber-optic network, a telecommunications network, an intranet, a wireless local area network (WLAN) , a metropolitan area network (MAN) , a public telephone switched network (PSTN) , a Bluetooth TM network, a ZigBee TM network, a near field communication (NFC) network, or the like, or any combination thereof. In some embodiments, the network 120 may include one or more network access points. For example, the network 120 may include wired and/or wireless network access points such as base stations and/or internet exchange points through which one or more components of the imaging system 100 may be connected to the network 120 to exchange data and/or information.
The terminal 130 may include a mobile device 131, a tablet computer 132, a laptop computer 133, or the like, or any combination thereof. In some embodiments, the mobile device 131 may include a smart home device, a wearable device, a smart mobile device, a virtual reality device, an augmented reality device, or the like, or any combination thereof. In some embodiments, the smart home device may include a smart lighting device, a control device of an intelligent electrical apparatus, a smart monitoring device, a smart television, a smart video camera, an interphone, or the like, or any combination thereof. In some embodiments, the wearable device may include a smart bracelet, smart footgear, a pair of smart glasses, a smart helmet, a smart watch, smart clothing, a smart backpack, a smart accessory, or the like, or any combination thereof. In some embodiments, the smart mobile device may include a smartphone, a personal digital assistant (PDA) , a gaming device, a navigation device, a point of sale (POS) device, or the like, or any combination thereof. In some embodiments, the virtual reality device and/or the augmented reality device may include a virtual reality helmet, a virtual reality glass, a virtual reality patch, an augmented reality helmet, an augmented reality glass, an augmented reality patch, or the like, or any combination thereof. For example, the virtual reality device and/or the augmented reality device may include a Google TM Glass, an Oculus Rift, a Hololens, a Gear VR, etc. In some embodiments, the terminal 130 may remotely operate the imaging device 110 and/or the processing device 140. In some embodiments, the terminal 130 may operate the imaging device 110 and/or the processing device 140 via a wireless connection. In some embodiments, the terminal 130 may receive information and/or instructions inputted by a user, and send the received information and/or instructions to the imaging device 110 or to the processing device 140 via the network 120. In some embodiments, the terminal 130 may receive data and/or information from the processing device 140. In some embodiments, the terminal 130 may be part of the processing device 140. In some embodiments, the terminal 130 may be omitted.
The processing device 140 may process data and/or information obtained from the imaging device 110, the terminal 130, and/or the storage device 150. For example, the processing device 140 may generate one or more medical images (e.g., breast images) by processing scan data (e.g., digital signals) from the imaging device 110. As another example, the processing device 140 may determine one or more scan parameters for scanning a breast using the imaging device 110. In some embodiments, the processing device 140 may be a single server, or a server group. The server group may be centralized or distributed. In some embodiments, the processing device 140 may be local or remote. For example, the processing device 140 may access information and/or data stored in or acquired by the imaging device 110, the terminal 130, and/or the storage device 150 via the network 120. As another example, the processing device 140 may be directly connected to the imaging device 110 (as illustrated by the bidirectional arrow in dashed lines connecting the processing device 140 and the imaging device 110 in FIG. 1) , the terminal 130 (as illustrated by the bidirectional arrow in dashed lines connecting the processing device 140 and the terminal 130 in FIG. 1) , and/or the storage device 150 to access stored or acquired information and/or data. In some embodiments, the processing device 140 may be implemented on a cloud platform. Merely by way of example, the cloud platform may include a private cloud, a public cloud, a hybrid cloud, a community cloud, a distributed cloud, an inter-cloud, a multi-cloud, or the like, or any combination thereof. In some embodiments, the processing device 140 may be implemented on a computing device 200 having one or more components illustrated in FIG. 2 in the present disclosure.
The storage device 150 may store data and/or instructions. In some embodiments, the storage device 150 may store data obtained from the imaging device 110, the terminal 130 and/or the processing device 140. For example, the storage device 150 may store medical images (e.g., breast images) generated by the processing device 140. In some embodiments, the storage device 150 may store data and/or instructions that the processing device 140 may execute or use to perform exemplary methods described in the present disclosure. For example, the storage device 150 may store instructions that the processing device 140 may execute to determine one or more scan parameters. In some embodiments, the storage device 150 may include a mass storage device, a removable storage device, a volatile read-and-write memory, a read-only memory (ROM) , or the like, or any combination thereof. Exemplary mass storage may include a magnetic disk, an optical disk, a solid-state drive, etc. Exemplary removable storage may include a flash drive, a floppy disk, an optical disk, a memory card, a zip disk, a magnetic tape, etc. Exemplary volatile read-and-write memory may include a random access memory (RAM) . Exemplary RAM may include a dynamic RAM (DRAM) , a double data rate synchronous dynamic RAM (DDR SDRAM) , a static RAM (SRAM) , a thyristor RAM (T-RAM) , and a zero-capacitor RAM (Z-RAM) , etc. Exemplary ROM may include a mask ROM (MROM) , a programmable ROM (PROM) , an erasable programmable ROM (PEROM) , an electrically erasable programmable ROM (EEPROM) , a compact disk ROM (CD-ROM) , and a digital versatile disk ROM, etc. In some embodiments, the storage device 150 may be implemented on a cloud platform. Merely by way of example, the cloud platform may include a private cloud, a public cloud, a hybrid cloud, a community cloud, a distributed cloud, an inter-cloud, a multi-cloud, or the like, or any combination thereof.
In some embodiments, the storage device 150 may be connected to the network 120 to communicate with one or more components of the imaging system 100 (e.g., the imaging device 110, the processing device 140, the terminal 130, etc. ) . One or more components of the imaging system 100 may access the data or instructions stored in the storage device 150 via the network 120. In some embodiments, the storage device 150 may be directly connected to or communicate with one or more components of the imaging system 100 (e.g., the imaging device  110, the processing device 140, the terminal 130, etc. ) . In some embodiments, the storage device 150 may be part of the processing device 140.
In some embodiments, the imaging system 100 may further include one or more power supplies (not shown in FIG. 1) connected to one or more components of the imaging system 100 (e.g., the imaging device 110, the processing device 140, the terminal 130, the storage device 150, etc. ) .
FIG. 2 is a schematic diagram illustrating exemplary hardware and/or software components of a computing device on which the processing device 140 may be implemented according to some embodiments of the present disclosure. As illustrated in FIG. 2, the computing device 200 may include a processor 210, a storage 220, an input/output (I/O) 230, and a communication port 240.
The processor 210 may execute computer instructions (program code) and perform functions of the processing device 140 in accordance with techniques described herein. The computer instructions may include routines, programs, objects, components, signals, data structures, procedures, modules, and functions, which perform particular functions described herein. For example, the processor 210 may generate one or more medical images (e.g., breast images) by processing scan data (e.g., digital signals) from the imaging device 110. As another example, the processor 210 may determine one or more scan parameters for scanning a breast using the imaging device 110. In some embodiments, the processor 210 may include a microcontroller, a microprocessor, a reduced instruction set computer (RISC) , an application-specific integrated circuit (ASIC) , an application-specific instruction-set processor (ASIP) , a central processing unit (CPU) , a graphics processing unit (GPU) , a physics processing unit (PPU) , a microcontroller unit, a digital signal processor (DSP) , a field programmable gate array (FPGA) , an advanced RISC machine (ARM) , a programmable logic device (PLD) , any circuit or processor capable of executing one or more functions, or the like, or any combinations thereof.
Merely for illustration purposes, only one processor is described in the computing device 200. However, it should be noted that the computing device 200 in the present disclosure may also include multiple processors, and thus operations of a method that are performed by one processor as described in the present disclosure may also be jointly or separately performed by the multiple processors. For example, if in the present disclosure the processor of the computing device 200 executes both operations A and B, it should be understood that operations A and B may also be performed by two different processors jointly or separately in the computing device 200 (e.g., a first processor executes operation A and a second processor executes operation B, or the first and second processors jointly execute operations A and B) .
The storage 220 may store data/information obtained from the imaging device 110, the terminal 130, the storage device 150, or any other component of the imaging system 100. In some embodiments, the storage 220 may include a mass storage device, a removable storage device, a volatile read-and-write memory, a read-only memory (ROM) , or the like, or any combination thereof. For example, the mass storage device may include a magnetic disk, an optical disk, a solid-state drive, etc. The removable storage device may include a flash drive, a floppy disk, an optical disk, a memory card, a zip disk, a magnetic tape, etc. The volatile read-and-write memory may include a random access memory (RAM) . The RAM may include a dynamic RAM (DRAM) , a double data rate synchronous dynamic RAM (DDR SDRAM) , a static RAM (SRAM) , a thyristor RAM (T-RAM) , and a zero-capacitor RAM (Z-RAM) , etc. The ROM may include a mask ROM (MROM) , a programmable ROM (PROM) , an erasable programmable ROM (PEROM) , an electrically erasable programmable ROM (EEPROM) , a compact disk ROM (CD-ROM) , and a digital versatile disk ROM, etc. In some embodiments, the storage 220 may store one or more programs and/or instructions to perform exemplary methods described in the present disclosure. For example, the storage 220 may store instructions that the processor 210 may execute to determine one or more scan parameters.
The I/O 230 may input or output signals, data, or information. In some embodiments, the I/O 230 may enable user interaction with the processing device 140. In some embodiments, the I/O 230 may include an input device and an output device. Exemplary input devices may include a keyboard, a mouse, a touch screen, a microphone, a trackball, or the like, or a combination thereof. Exemplary output devices may include a display device, a loudspeaker, a printer, a projector, or the like, or a combination thereof. Exemplary display devices may include a liquid crystal display (LCD) , a light-emitting diode (LED) -based display, a flat panel display, a curved screen, a television device, a cathode ray tube (CRT) , or the like, or a combination thereof.
The communication port 240 may be connected to a network (e.g., the network 120) to facilitate data communications. The communication port 240 may establish connections between the processing device 140 and the imaging device 110, the terminal 130, or the storage device 150. The connection may be a wired connection, a wireless connection, or a combination of both that enables data transmission and reception. The wired connection may include an electrical cable, an optical cable, a telephone wire, or the like, or any combination thereof. The wireless connection may include Bluetooth, Wi-Fi, WiMax, WLAN, ZigBee, mobile network (e.g., 3G, 4G, 5G, etc. ) , or the like, or a combination thereof. In some embodiments, the communication port 240 may be a standardized communication port, such as RS232, RS485, etc. In some embodiments, the communication port 240 may be a specially designed communication port. For example, the communication port 240 may be designed in accordance with the digital imaging and communications in medicine (DICOM) protocol.
FIG. 3 is a schematic diagram illustrating exemplary hardware and/or software components of a mobile device on which the terminal 130 may be implemented according to some embodiments of the present disclosure. As illustrated in FIG. 3, the mobile device 300 may include a communication platform 310, a display 320, a graphics processing unit (GPU) 330, a central processing unit (CPU) 340, an I/O 350, a memory 360, and a storage 390. In some embodiments, any other suitable component, including but not limited to a system bus or a controller (not shown) , may also be included in the mobile device 300. In some embodiments, a mobile operating system 370 (e.g., iOS, Android, Windows Phone, etc. ) and one or more applications 380 may be loaded into the memory 360 from the storage 390 in order to be executed by the CPU 340. The applications 380 may include a browser or any other suitable mobile apps for receiving and rendering information relating to image processing or other information from the processing device 140. User interactions with the information stream may be achieved via the I/O 350 and provided to the processing device 140 and/or other components of the imaging system 100 via the network 120.
To implement various modules, units, and their functionalities described in the present disclosure, computer hardware platforms may be used as the hardware platform (s) for one or more of the elements described herein. The hardware elements, operating systems and programming languages of such computers are conventional in nature, and it is presumed that those skilled in the art are adequately familiar therewith to adapt those technologies to the imaging systems and methods as described herein. A computer with user interface elements may be used to implement a personal computer (PC) or another type of work station or terminal device, although a computer may also act as a server if appropriately programmed. It is believed that those skilled in the art are familiar with the structure, programming and general operation of such computer equipment and as a result the drawings should be self-explanatory.
FIG. 4 is a schematic block diagram illustrating an exemplary processing engine 400 according to some embodiments of the present disclosure. The  processing engine 400 may include an obtaining module 410, a feature parameter determination module 420, and a scan parameter determination module 430.
The obtaining module 410 may be configured to obtain a compression thickness of a compressed target breast and a first breast image of the target breast. In some embodiments, the obtaining module 410 may be further configured to process the first breast image.
The feature parameter determination module 420 may be configured to determine one or more breast feature parameters by inputting the first breast image into a feature extraction model.
The scan parameter determination module 430 may be configured to determine, based on the compression thickness and the one or more breast feature parameters, one or more scan parameters of breast tomography on the target breast. In some embodiments, the one or more scan parameters may include a scan angle range and a frame number of the breast tomography. In some embodiments, the scan parameter determination module 430 may determine the one or more scan parameters by inputting the compression thickness and the one or more breast feature parameters into a parameter determination model.
In some embodiments, the scan parameter determination module 430 may be configured to determine, based on the compression thickness and the first breast image, one or more scan parameters of breast tomography on the target breast. In some embodiments, the scan parameter determination module 430 may determine the one or more scan parameters by inputting the compression thickness and the first breast image into a scan parameter model.
In some embodiments, the scan parameter determination module 430 may be configured to cause, based on the one or more scan parameters, the imaging device 110 to perform the breast tomography on the target breast. The scan parameter determination module 430 may be configured to generate a second breast image of the target breast based on the breast tomography.
In some embodiments, the feature parameter determination module may be configured to obtain a feature extraction model. In some embodiments, the scan parameter determination module may be configured to obtain a parameter determination model and/or a scan parameter model.
The modules in the processing engine 400 may be connected to or communicate with each other via a wired connection or a wireless connection. The wired connection may include a metal cable, an optical cable, a hybrid cable, or the like, or any combination thereof. The wireless connection may include a Local Area Network (LAN) , a Wide Area Network (WAN) , a Bluetooth, a ZigBee, a Near Field Communication (NFC) , or the like, or any combination thereof. Two or more of the modules may be combined as a single module, and any one of the modules may be divided into two or more units. For example, the obtaining module 410 may be divided into two units. One of the two units may be configured to obtain the compression thickness, and the other one of the two units may be configured to obtain the first breast image.
It should be noted that the above description is merely provided for the purposes of illustration, and not intended to limit the scope of the present disclosure. For persons having ordinary skills in the art, multiple variations and modifications may be made under the teachings of the present disclosure. However, those variations and modifications do not depart from the scope of the present disclosure. For example, the processing engine 400 may further include a storage module (not shown in FIG. 4) . The storage module may be configured to store data generated during any process performed by any component of the processing engine 400. As another example, each of the components of the processing engine 400 may include a storage device. Additionally or alternatively, the components of the processing engine 400 may share a common storage device. As still another example, the feature parameter determination module 420 may be omitted.
FIG. 5 is a flowchart illustrating an exemplary process for determining one or more scan parameters according to some embodiments of the present disclosure. In some embodiments, the process 500 may be implemented in the imaging system 100 illustrated in FIG. 1. For example, the process 500 may be stored in a storage medium (e.g., the storage device 150, or the storage 220 of the processing device 140, the storage 390 of the terminal 130, the memory 360 of the terminal 130) as a form of instructions, and can be invoked and/or executed by the processing device 140 or the terminal 130 (e.g., the processor 210 of the processing device 140, the CPU 340 and/or the GPU 330 of the terminal 130, or one or more modules in the processing device 140 illustrated in FIG. 4) . The operations of the illustrated process 500 presented below are intended to be illustrative. In some embodiments, the process 500 may be accomplished with one or more additional operations not described, and/or without one or more of the operations discussed. Additionally, the order of the operations of the process 500 as illustrated in FIG. 5 and described below is not intended to be limiting.
In 510, the obtaining module 410 may obtain a compression thickness of a compressed target breast and a first breast image of the target breast.
In some embodiments, for reasons related both to the immobilizing of the target breast and to image quality or intensity of X-rays delivered to the target breast, it is necessary to compress the target breast during a scan process (e.g., a process for obtaining a first breast image of the target breast and/or a process for obtaining a second breast image of the target breast) . The compression force may be applied through a compression pad of the imaging device 110 that compresses the target breast on a breast-holder tray of the imaging device 110.
In some embodiments, before the target breast is formally scanned, the imaging device 110 may perform one or more pre-scans on the compressed target breast to obtain the first breast image of the target breast. The first breast image may be used to determine one or more scan parameters that are configured to perform one or more formal scans on the target breast. In some embodiments, the first breast image may include one or more two-dimensional (2D) images. In some embodiments, the first breast image may include one or more preliminary images that have a relatively low image quality. In some embodiments, the first breast image may include breast tissue overlap. In some embodiments, the imaging device 110 may scan the target breast at one or more scan angles to obtain the first breast image. In some embodiments, the imaging device 110 may scan the target breast using a low dose of X-rays to obtain the first breast image.
In some embodiments, the imaging device 110 may scan the target breast to obtain the first breast image using one or more preliminary parameters. In some embodiments, the preliminary parameters may be determined based on, e.g., experience of a user (e.g., a doctor, an engineer, an operator, a technician) of the imaging system 100, and/or default setting of the imaging system 100.
In some embodiments, the obtaining module 410 may obtain the compression thickness and/or the first breast image from the imaging device 110 and/or a storage device (e.g., the storage device 150, the storage 220, etc. ) of the imaging system 100. For example, the obtaining module 410 may receive a request for obtaining the compression thickness and/or the first breast image. In response to the request, the obtaining module 410 may obtain the compression thickness and/or the first breast image from the imaging device 110 and/or a storage device of the imaging system 100. As another example, the obtaining module 410 may receive a request for performing one or more pre-scans on the compressed target breast. The obtaining module 410 may send the request to the imaging device 110 to cause the imaging device 110 to perform the one or more pre-scans on the compressed target breast to generate pre-scan data. The processing device 140 may generate the first breast image based on the pre-scan data. The obtaining module 410 may obtain the first breast image from the processing device 140.
It should be noted that the above description about obtaining the compression thickness and the first breast image is merely provided for the purposes of illustration, and not intended to limit the scope of the present disclosure. For persons having ordinary skills in the art, multiple variations and modifications may be made under the teachings of the present disclosure. However, those variations and modifications do not depart from the scope of the present disclosure. Other embodiments may also be applied to the present disclosure, which are not exhaustively described here.
In 520, the feature parameter determination module 420 may determine one or more breast feature parameters by inputting the first breast image into a feature extraction model.
In some embodiments, the feature extraction model may include an expert system or an artificial intelligence network model. In some embodiments, the artificial intelligence network model may include a machine learning model. In some embodiments, the artificial intelligence network model may include a deep learning model. In some embodiments, the feature extraction model may be provided by a training process and configured to output the one or more breast feature parameters. Details regarding the training process of the feature extraction model may be found elsewhere in the present disclosure (e.g., the description in connection with FIG. 7 and FIG. 8) .
In some embodiments, the artificial intelligence network model may include an artificial neural network model. The artificial neural network model may include an input layer, an output layer, and a plurality of hidden layers. The layers of the artificial neural network model may be connected in a feed-forward fashion, and an output of the i-th layer may be provided as an input to the (i+1)-th layer.
In some embodiments, in the artificial neural network model, the input layer may be configured to receive an input (e.g., the first breast image) of the artificial neural network model. Each hidden layer may perform a specific function, e.g., convolution, pooling, normalization, matrix multiplication, non-linear activation, or the like. The output layer may receive an input from the preceding layer and apply one  or more transformations to the received input to generate a processing result (e.g., the one or more breast feature parameters of the first breast image) of the artificial neural network model.
For illustration purposes, a convolutional neural network (CNN) model may be taken as an example. Exemplary hidden layers may include a convolutional layer, a pooling layer, and a fully connected layer. In some embodiments, an image (e.g., a sample breast image, the first breast image) may be inputted into the feature extraction model. The image may be represented as a 2D matrix or a 3D matrix including a plurality of elements (e.g., pixels or voxels) . Each of the plurality of elements in the matrix may have a value representing a feature or characteristic of the element.
The convolutional layer may include one or more kernels, which may be used to extract a feature of the image. In some embodiments, each of the one or more convolutional kernels may have a specific size and stride. In some embodiments, each of the one or more kernels may filter a portion of the image to generate a specific feature corresponding to the portion. The specific feature may be determined based on the one or more kernels. Exemplary features may include a low-level feature (e.g., an edge feature, a textural feature) , a high-level feature, or a complicated feature.
The pooling layer may take an output of the convolutional layer as an input. The pooling layer may include a plurality of pooling nodes, which may be used to sample the output of the convolutional layer, so as to reduce the computational load of data processing and accelerate the speed of data processing. In some embodiments, a size of the matrix representing the image may be reduced in the pooling layer.
The fully connected layer may include a plurality of neurons. The neurons may be connected to the pooling nodes in the pooling layer. In the fully connected layer, a plurality of vectors corresponding to the plurality of pooling nodes may be  determined based on one or more features of the image, and a plurality of weighting coefficients may be assigned to the plurality of vectors.
The output layer may determine an output based on the vectors and the weighting coefficients obtained from the fully connected layer. In some embodiments, an output of the output layer may include the one or more breast feature parameters.
Merely by way of example, the convolutional layer may include 36 convolutional kernels each of which has a size of 13*13 and a stride of 1, and the pooling layer may include a max-pooling layer. When the first breast image is input into the feature extraction model, the convolutional layer may extract features of the input first breast image. Specifically, the 36 convolution kernels with the size of 13*13 and the stride of 1 may output 36 feature images based on the first breast image. After the feature extraction in the convolutional layer, the 36 feature images may be input into the pooling layer to achieve information filtering and feature selection in the 36 feature images. The max-pooling layer can retain the main features in the feature images, prevent overfitting, improve the generalization ability of the feature extraction model, and reduce the data volume of the feature images, thereby improving the recognition rate of the one or more breast feature parameters. The fully connected layer may reassemble all the local features output by the pooling layer into a complete graph through the weight matrix to achieve effective classification of the features. The output layer may use logic functions to output the one or more breast feature parameters.
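Merely by way of example, and not as part of the disclosed subject matter, the exemplary layer configuration above may be sketched in PyTorch as follows. The 36 convolutional kernels of size 13*13 with a stride of 1, the max-pooling layer, the fully connected layer, and the logistic output follow the example; the single-channel 256*256 input resolution, the 2*2 pooling size, and the output dimension of 8 breast feature parameters are illustrative assumptions.

```python
import torch
import torch.nn as nn

class FeatureExtractionNet(nn.Module):
    """Minimal sketch of the exemplary CNN above: 36 convolutional kernels of
    size 13x13 with stride 1, a max-pooling layer, a fully connected layer,
    and a logistic (sigmoid) output. The input resolution (256x256), pooling
    size (2x2), and output dimension (8) are illustrative assumptions."""

    def __init__(self, num_feature_params: int = 8):
        super().__init__()
        self.conv = nn.Conv2d(1, 36, kernel_size=13, stride=1)   # 36 feature images
        self.pool = nn.MaxPool2d(kernel_size=2)                  # information filtering / feature selection
        self.fc = nn.Linear(36 * 122 * 122, num_feature_params)  # reassemble local features via a weight matrix
        self.out = nn.Sigmoid()                                  # logistic output function

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = torch.relu(self.conv(x))   # (N, 36, 244, 244) for a 256x256 single-channel input
        x = self.pool(x)               # (N, 36, 122, 122)
        x = x.flatten(start_dim=1)
        return self.out(self.fc(x))

# Usage: one single-channel 256x256 first breast image
model = FeatureExtractionNet()
feature_params = model(torch.randn(1, 1, 256, 256))  # shape (1, 8)
```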
In some embodiments, the feature extraction model may be implemented on one or more processing devices (e.g., the processing device 140, the processor 210, the terminal 130, the CPU 340, the GPU 330, etc. ) . For example, one or more layers may be respectively implemented on a processing device. As another example, one or more components of a layer may be implemented on a same processing device. In some embodiments, a plurality of processing devices may execute a parallel processing operation in some layers of the feature extraction model by, for example, assigning two or more processing devices for an operation of different nodes (e.g., a kernel, a pooling node, a neuron) in the feature extraction model. For example, a first GPU may execute the operation corresponding to kernel A and kernel B, and a second GPU may execute the operation corresponding to kernel C and kernel D. Similarly, a plurality of GPUs may also execute the operation of other nodes (e.g., a kernel, a pooling node, a neuron) in the feature extraction model.
In addition, in some embodiments, a storage device (e.g., the storage device 150, the storage 220, the storage 390, the memory 360, etc. ) may be provided for storing data related to the feature extraction model, such as activation functions, a learned weight for each node, and/or a network topology (e.g., a number (or count) of the hidden layers, a type of each hidden layer, etc. ) . Optionally, the storage device may further store a training data set.
In some embodiments, the one or more breast feature parameters may include at least one of a type, type probability, shape, distribution, edge, density, location, or a major axis of each of one or more regions of interest (ROIs) in the first breast image.
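Merely by way of example, the one or more breast feature parameters may be carried per ROI in a simple data structure such as the following sketch; the field names and types are illustrative assumptions rather than an interface defined by the present disclosure.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class ROIFeatureParameters:
    """Per-ROI breast feature parameters listed above. Field names and types
    are illustrative assumptions, not an API defined by the disclosure."""
    roi_type: str                       # e.g., "calcification", "lump", "tumor"
    type_probability: float             # probability that the ROI includes that lesion type
    shape: Optional[str] = None         # "regular" / "irregular" (mainly for lumps)
    distribution: Optional[str] = None  # "diffuse", "regional", "clustered", "linear", "segmental"
    edge: Optional[str] = None          # "clear", "fuzzy", "rough"
    density: Optional[float] = None     # relative density of the ROI in the image
    location: Optional[Tuple[float, float]] = None  # ROI position in image coordinates
    major_axis_mm: Optional[float] = None           # longest point-to-point distance within the ROI
```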
In some embodiments, the background and the area of fat of the target breast in the first breast image may be darker, and the other areas in the first breast image may be brighter. The brighter areas other than the glands of the target breast in the first breast image may be designated as one or more ROIs.
In some embodiments, the type of an ROI may refer to a type of a lesion that is possibly included in the ROI. The type of an ROI may include a calcification, a lump, a tumor, or the like. In some embodiments, when an ROI includes a calcification, the classification of the calcification may be further refined. In some embodiments, the calcification may include benign calcification, suspicious calcification, malignant calcification, or the like, or any combination thereof. The  benign calcification may include a skin calcification, a round calcification, a vascular calcification, a suture calcification, a small calcification, or the like, or any combination thereof. The suspicious calcification may include an amorphous calcification. In some embodiments, the small calcification may have a first probability to belong to the benign calcification and have a second probability to belong to the amorphous calcification of the suspicious calcification. The type probability of an ROI may refer to a probability that the ROI includes a certain type of a lesion (e.g., a calcification, a lump, a tumor, etc. ) .
In some embodiments, when the type of an ROI includes a lump, since the ROI may not fill the entire target breast, the shape of the ROI may be output as one of the one or more breast feature parameters of the first breast image, instead of the distribution of the ROI. The shape of the ROI may include a regular shape or an irregular shape. When the type of an ROI includes a calcification, since the shape of the ROI may be circular or elliptical, the distribution of the ROI may be output as one of the one or more breast feature parameters of the first breast image, instead of the shape of the ROI. The distribution of an ROI may include a diffuse distribution, a regional distribution, a clustered distribution, a linear distribution, a segmental distribution, or the like, or any combination thereof.
In some embodiments, when the type of an ROI includes a lump, the edge of the ROI may be clear, fuzzy, or rough. When the type of an ROI includes a calcification, the edge of the ROI may be basically clear. The major axis of an ROI may be considered as the distance between the two points furthest away from each other in the ROI, which can be used to characterize the size of the ROI.
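Merely by way of example, the major axis so defined may be computed from an ROI mask as the maximum pairwise distance over the points of the ROI, as in the following sketch; restricting the search to the convex hull (via SciPy, an assumed dependency) is exact because the farthest pair of points always lies on the hull, and the pixel spacing parameter is an illustrative assumption.

```python
import numpy as np
from scipy.spatial import ConvexHull  # SciPy is an assumed dependency

def major_axis_mm(roi_mask: np.ndarray, pixel_spacing_mm: float) -> float:
    """Major axis of an ROI: the largest distance between any two points in
    the ROI. The farthest pair of points always lies on the convex hull, so
    restricting the pairwise search to the hull vertices is exact and cheap."""
    ys, xs = np.nonzero(roi_mask)
    if len(xs) == 0:
        return 0.0
    points = np.column_stack([xs, ys]).astype(float)
    if len(points) >= 3:
        points = points[ConvexHull(points).vertices]
    diffs = points[:, None, :] - points[None, :, :]
    max_px = float(np.sqrt((diffs ** 2).sum(axis=-1)).max())
    return max_px * pixel_spacing_mm
```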
In some embodiments, the feature parameter determination module 420 may determine a breast density feature of the target breast based on the compression thickness and/or the first breast image. In some embodiments, the feature parameter determination module 420 may modify at least one of the one or more  breast feature parameters based on the breast density feature. The feature parameter determination module 420 may determine one or more scan parameters based on the one or more modified breast feature parameters.
In some embodiments, the feature extraction model may include a function of determining the breast density feature of the target breast. For example, the first breast image may be input into the feature extraction model. The feature extraction model may output the breast density feature of the target breast by analyzing the first breast image. As another example, the compression thickness and the first breast image may be input into the feature extraction model. The feature extraction model may output the breast density feature of the target breast by analyzing the first breast image and the compression thickness.
For example, the breast size of the target breast may be estimated based on the breast contour of the target breast in the first breast image and the compression thickness. According to the difference in gray values of fat and glands of the target breast in the first breast image, a first ratio of the size of the fat area to the breast size, and a second ratio of the size of the gland area to the breast size may be estimated. The breast density feature may be estimated based on the first ratio and the second ratio.
The breast density feature of the target breast may be particularly important in lesion detection and lesion positioning. When the target breast is a dense breast, the gland of the target breast may be identified as a lesion (e.g., a lump or a tumor) , which leads to false positive recognition of the lesion. Therefore, the one or more breast feature parameters determined by the feature extraction model may be modified according to the breast density feature of the target breast. For example, when the target breast is a dense breast, if the one or more ROIs in the first breast image are identified as including a lump and the lump is located at the gland of the target breast, the type probability that the one or more ROIs include a lump may be appropriately reduced.
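Merely by way of example, the gray-value-based density estimation and the density-based modification described above may be sketched as follows; the gland threshold, the use of the gland area ratio alone as the density feature, the dense-breast cutoff, and the reduction factor are all illustrative assumptions.

```python
import numpy as np

def estimate_density_feature(image: np.ndarray, breast_mask: np.ndarray,
                             gland_threshold: float) -> float:
    """Estimate the breast density feature from gray values as described
    above: glands appear brighter than fat in the first breast image. The
    gray-value threshold is an illustrative assumption."""
    breast_pixels = image[breast_mask]
    gland_ratio = float((breast_pixels >= gland_threshold).mean())  # the second ratio
    # The first ratio (fat area over breast size) would be 1.0 - gland_ratio.
    return gland_ratio  # higher value -> denser breast

def adjust_lump_probability(type_probability: float, density_feature: float,
                            roi_on_gland: bool, dense_cutoff: float = 0.5,
                            reduction_factor: float = 0.8) -> float:
    """Appropriately reduce the lump type probability for a dense breast when
    the ROI is located at the gland, limiting false positive recognition.
    The cutoff and reduction factor are illustrative assumptions."""
    if density_feature >= dense_cutoff and roi_on_gland:
        return type_probability * reduction_factor
    return type_probability
```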
In 530, the scan parameter determination module 430 may determine, based on the compression thickness and the one or more breast feature parameters, one or more scan parameters of breast tomography on the target breast. In some embodiments, the one or more scan parameters may include a scan angle range and a frame number of the breast tomography.
Breast tomography, e.g., digital breast tomosynthesis (DBT) , is a three-dimensional (3D) imaging technique. It is an angle tomography technique that provides reconstruction planes in the breast using projection images from a scan angle range. Volumetric images may be generated based on a number of 2D X-ray images or 2D projections that are obtained from different scan angles in the scan angle range. The number (or count) of the 2D X-ray images or 2D projections may be referred to as a frame number of the breast tomography.
In some embodiments, the scan angle range may include a range width and a range location. For example, when the range width of the scan angle range is 30°, the range location of the scan angle range may be 10° to the left of a scan reference point and 20° to the right of the scan reference point. The scan reference point may refer to an initialized position of the X-ray tube and the detector of the imaging device 110. For example, when the X-ray tube and the detector of the imaging device 110 are located at the scan reference point, a line connecting the focus of the X-ray tube and the central point of the detector may be perpendicular or horizontal to the ground.
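Merely by way of example, the scan angle range (range width and range location) and the per-frame tube angles may be represented as in the following sketch; expressing the range location as offsets to either side of the scan reference point follows the 30° example above, while the uniform angular spacing and the frame number of 15 in the usage line are illustrative assumptions.

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class ScanAngleRange:
    """Range location expressed as offsets around the scan reference point
    (0 degrees), e.g., 10 degrees to the left and 20 degrees to the right
    for a range width of 30 degrees."""
    left_deg: float
    right_deg: float

    @property
    def width_deg(self) -> float:
        return self.left_deg + self.right_deg

def frame_angles(scan_range: ScanAngleRange, frame_number: int) -> np.ndarray:
    """Tube angles at which the 2D projections are acquired. Uniform angular
    spacing across the range is an assumption, not stated by the disclosure."""
    return np.linspace(-scan_range.left_deg, scan_range.right_deg, frame_number)

# The 30-degree example above, with an assumed frame number of 15
angles = frame_angles(ScanAngleRange(left_deg=10.0, right_deg=20.0), 15)
```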
In some embodiments, the compression thickness and the one or more breast feature parameters may be impact factors of the one or more scan parameters, and may collectively affect the determination of the one or more scan parameters. In some embodiments, the weights of these impact factors may be different in different situations. For example, when an ROI includes a lump, the weight of the distribution of the ROI may be relatively small, which basically does not affect the determination of the one or more scan parameters. As another example,  when the type of an ROI includes a calcification, the weights of the edge and the shape of the ROI may be relatively small, which basically do not affect the determination of the one or more scan parameters. According to the corresponding relationship between the above impact factors and the one or more scan parameters, the scan parameter determination module 430 may determine the one or more scan parameters based on the compression thickness and the one or more breast feature parameters. In some embodiments, choosing a larger scan angle range and a larger frame number may facilitate the analysis of the internal layered structure of the target breast. When the one or more ROIs are concentrated and have a small size, for example, when the one or more ROIs include a calcification, choosing a smaller scan angle range and an appropriate frame number may help the identification of the calcification.
When the type of the one or more ROIs includes a lump, the corresponding relationship between the above-mentioned impact factors and the one or more scan parameters may include the following manners. The more irregular the shape of the one or more ROIs is, the larger the frame number may be. The fuzzier and/or rougher the edge of the one or more ROIs is, the larger the frame number may be. The longer the major axis of the one or more ROIs is, the larger the frame number may be. The larger the compression thickness is, the larger the frame number may be.
When the type of the one or more ROIs includes a calcification, the corresponding relationship between the above-mentioned impact factors and the one or more scan parameters may include the following manners. The more diffuse the distribution of the one or more ROIs is, the larger the frame number may be. The longer the major axis of the one or more ROIs is, the larger the frame number may be. The larger the compression thickness is, the larger the frame number may be.
In some embodiments, the scan angle range may be positively related to the frame number, and may be further affected by factors such as the compression thickness and the length of the major axis of the one or more ROIs. For example, when the estimated compression thickness is too large or too small, or the estimated major axis is too long or too short, after the scan angle range is determined based on the estimated compression thickness and/or the estimated major axis, the scan angle range may be enlarged or reduced accordingly based on the actual compression thickness and/or major axis of the target breast.
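Merely by way of example, the monotonic relationships described above (and in Tables 1 and 2 below) may be sketched as a simple rule engine. All thresholds, increments, and base values below are illustrative assumptions; the disclosure specifies only the qualitative directions of the relationships.

```python
# Hedged sketch of the rule-based determination of the scan angle range and
# frame number from the breast feature parameters and compression thickness.
def determine_scan_parameters(roi, compression_thickness_mm,
                              base_angle=15.0, base_frames=9):
    angle_range, frame_number = base_angle, base_frames
    if roi["type"] == "lump":
        if roi["shape"] == "irregular":
            frame_number += 4                 # more irregular shape -> more frames
        if roi["edge"] in ("fuzzy", "rough"):
            frame_number += 2                 # fuzzier/rougher edge -> more frames
        frame_number += int(roi["major_axis_mm"] // 10)    # longer major axis
        angle_range += compression_thickness_mm / 10.0     # thicker compression
    elif roi["type"] == "calcification":
        if roi["distribution"] == "diffuse":
            frame_number += 3                 # more diffuse distribution
        else:
            angle_range = min(angle_range, 12.0)  # concentrated, small ROIs
                                                  # favor a smaller angle range
        frame_number += int(compression_thickness_mm // 20)
    return angle_range, frame_number
```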
Merely by way of example, Table 1 shows the corresponding relationship of the compression thickness and one or more breast feature parameters with the one or more scan parameters of the breast tomography in a condition that the one or more ROIs include a lump.
Table 1
[Table 1 is reproduced as image PCTCN2020142136-appb-000001 in the original publication and is not available as text. According to the surrounding description, it relates the density, major axis, shape, and edge of the one or more ROIs to the type and type probability of a lump, and relates the shape, edge, and major axis of the one or more ROIs and the compression thickness to the scan angle range and the frame number.]
After the first breast image is input into the feature extraction model, the feature extraction model may determine the density and the major axis of the one or more ROIs in the first breast image. According to the density and the major axis, the type and the type probability of the one or more ROIs may be preliminarily determined, as shown in Table 1. When the density of the one or more ROIs is lower than the density of the calcification, and the major axis is relatively large (e.g., larger than 1 mm, 2 mm, 3 mm, 4 mm, 5 mm, 6 mm, 7 mm, 8 mm, 9 mm, 10 mm, etc.), the one or more ROIs may be preliminarily determined as including a lump. In some embodiments, the feature extraction model may adjust the preliminarily determined type probability of the one or more ROIs based on the shape and the edge of the one or more ROIs. As shown in the second row and the third row in Table 1, the type probability of determining the one or more ROIs as including a lump may be increased (e.g., to 80%) because of the clear edge and the regular shape of the one or more ROIs. According to the adjusted type and type probability of the one or more ROIs, the feature extraction model may finally determine the type and the type probability of the one or more ROIs. As shown in Table 1, it is concluded that the one or more ROIs may include a lump.
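Merely by way of example, the preliminary type determination and its adjustment by the edge and the shape may be sketched as follows. The density and major-axis thresholds and the probability increments are illustrative assumptions.

```python
# Sketch of the two-stage type determination: a preliminary type from the
# density and major axis, then a probability adjustment from edge and shape.
def preliminary_roi_type(density, major_axis_mm, edge, shape,
                         calcification_density=0.9, axis_threshold_mm=5.0):
    if density >= calcification_density and major_axis_mm < axis_threshold_mm:
        roi_type, probability = "calcification", 0.6
    elif density < calcification_density and major_axis_mm > axis_threshold_mm:
        roi_type, probability = "lump", 0.6
    else:
        return "unknown", 0.5
    # A clear edge and a regular shape raise the preliminary probability
    # (e.g., toward the 80% figure cited for Table 1).
    if edge == "clear":
        probability += 0.1
    if shape == "regular":
        probability += 0.1
    return roi_type, min(probability, 1.0)
```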
When the one or more ROIs include a lump, the scan angle range and the frame number may be determined based on the shape, edge, and major axis of the one or more ROIs, and the compression thickness. As shown in Table 1, when the major axis of the one or more ROIs is shorter than 15 mm, the more irregular the shape of the one or more ROIs is, the larger the scan angle range and the frame number may be. As shown in Table 1, when the shape of the one or more ROIs is irregular, the larger the compression thickness is, the larger the scan angle range and the frame number may be. As shown in Table 1, when the compression thickness is less than 30 mm, the longer the major axis of the one or more ROIs is, the larger the scan angle range and the frame number may be.
It should be noted that the above description of the breast feature parameters, the compression thickness, and the scan parameters in Table 1 is merely provided for the purposes of illustration, and not intended to limit the scope of the present disclosure. For persons having ordinary skills in the art, multiple variations and modifications may be made under the teachings of the present disclosure. However, those variations and modifications do not depart from the  scope of the present disclosure.
Merely by way of example, Table 2 shows the corresponding relationship of the compression thickness and the one or more breast feature parameters with the one or more scan parameters of the breast tomography when the one or more ROIs include a calcification.
Table 2
[Table 2 is reproduced as image PCTCN2020142136-appb-000002 in the original publication and is not available as text. According to the surrounding description, it relates the density, major axis, shape, and edge of the one or more ROIs to the type and type probability of a calcification, and relates the distribution of the one or more ROIs (e.g., diffuse, regional, clustered, skin, round, small) and the compression thickness to the scan angle range and the frame number.]
After the first breast image is input into the feature extraction model, the feature extraction model may determine the density and the major axis of the one or more ROIs in the first breast image. According to the density and the major axis, the type and the type probability of the one or more ROIs may be preliminarily determined. As shown in Table 2, when the density of the one or more ROIs is relatively high and the major axis of the one or more ROIs is relatively small (e.g., less than 1 mm, 2 mm, 3 mm, 4 mm, 5 mm, 6 mm, 7 mm, 8 mm, 9 mm, 10 mm, etc.), the one or more ROIs may be preliminarily determined as including a calcification. When preliminarily determining that the one or more ROIs include a calcification, the feature extraction model may further adjust the type and/or the type probability of the one or more ROIs based on the shape and the edge of the one or more ROIs. For example, the clearer the edge and the more regular the shape of the one or more ROIs are, the higher the probability of determining that the one or more ROIs include a calcification may be. According to the adjusted type and type probability of the one or more ROIs, the feature extraction model may finally determine the type and the type probability of the one or more ROIs. As shown in Table 2, it is concluded that the one or more ROIs may include a calcification.
When the one or more ROIs include a calcification, the distribution of the one or more ROIs and the compression thickness may be further used to determine the scan angle range and the frame number. As shown in Table 2, when the compression thickness is less than 30mm, the scan angle range and the frame number in a condition that the distribution of the one or more ROIs is diffuse may be larger than those in a condition that the distribution of the one or more ROIs is regional and/or clustered. In some embodiments, as shown in Table 2, the scan angle range corresponding to the skin calcification may be larger than those corresponding to other benign calcifications (e.g., round calcification and small calcification in Table 2) .
It should be noted that the above description of the breast feature parameters, the compression thickness, and the scan parameters in Table 2 is merely provided for the purposes of illustration, and not intended to limit the scope of the present disclosure. For persons having ordinary skills in the art, multiple variations and modifications may be made under the teachings of the present disclosure. However, those variations and modifications do not depart from the scope of the present disclosure.
In some embodiments, the range location of the scan angle range may be determined based on the location of the one or more ROIs. For example, when the range width of the scan angle range is 30°, the range location may be determined by setting, based on the location of the one or more ROIs relative to the scan reference point of the imaging device 110, the scan angle range to 10° to the left of the scan reference point and 20° to the right of the scan reference point. Compared with a scan angle range of 30° with 15° on the left side of the reference point and 15° on the right side of the reference point, the scan angle range of 30° with 10° on the left side of the reference point and 20° on the right side of the reference point may lead to an image that is more conducive to the diagnosis of breast lesions.
In some embodiments, the one or more scan parameters may take any value allowed by the imaging device 110. In some embodiments, if the one or more determined scan parameters are beyond the allowed range of the imaging device 110, the scan parameter determination module 430 may modify the one or more determined scan parameters to a maximum value or a minimum value allowed by the imaging device 110.
In some embodiments, the scan parameter determination module 430 may cause, based on the one or more scan parameters, the imaging device 110 to perform the breast tomography on the target breast. The scan parameter determination module 430 may generate a second breast image of the target breast based on the breast tomography. For example, the imaging device 110 may scan the compressed target breast at different scan angles within the scan angle range to obtain a number (equal to the frame number) of 2D X-ray images or 2D projections of the target breast. The second breast image, which is a volumetric image, may be generated based on the 2D X-ray images or 2D projections. The second breast image may be used to diagnose breast disease of the target breast.
It should be noted that the above description is merely provided for the purposes of illustration, and not intended to limit the scope of the present disclosure. For persons having ordinary skills in the art, multiple variations and modifications may be made under the teachings of the present disclosure. However, those  variations and modifications do not depart from the scope of the present disclosure.
FIG. 6 is a flowchart illustrating an exemplary process for determining one or more scan parameters according to some embodiments of the present disclosure. In some embodiments, the process 600 may be implemented in the imaging system 100 illustrated in FIG. 1. For example, the process 600 may be stored in a storage medium (e.g., the storage device 150, the storage 220 of the processing device 140, the storage 390 of the terminal 130, or the memory 360 of the terminal 130) as a form of instructions, and can be invoked and/or executed by the processing device 140 or the terminal 130 (e.g., the processor 210 of the processing device 140, the CPU 340 and/or the GPU 330 of the terminal 130, or one or more modules in the processing device 140 illustrated in FIG. 4). The operations of the illustrated process 600 presented below are intended to be illustrative. In some embodiments, the process 600 may be accomplished with one or more additional operations not described, and/or without one or more of the operations discussed. Additionally, the order in which the operations of the process 600 are illustrated in FIG. 6 and described below is not intended to be limiting. In some embodiments, on the basis of the process 500 in FIG. 5, the process 600 may further include an operation of processing the first breast image before determining the one or more breast feature parameters.
In 610, the obtaining module 410 may obtain a compression thickness of a compressed target breast and a first breast image of the target breast.
In 620, the obtaining module 410 may process the first breast image.
In 630, the feature parameter determination module 420 may determine one or more breast feature parameters by inputting the processed first breast image into a feature extraction model.
In 640, the scan parameter determination module 430 may determine, based on the compression thickness and the one or more breast feature parameters, one or more scan parameters of breast tomography on the target breast. The one or more scan parameters may include a scan angle range and a frame number of the breast tomography.
The operations 610, 630, and 640 may be similar to the operations 510-530 of the process 500 in FIG. 5.
In some embodiments, the obtaining module 410 may process the first breast image by performing downsampling, upsampling, image segmentation, image enhancement, or the like, or any combination thereof on the first breast image.
In some embodiments, when the image quality of the first breast image is relatively low, the obtaining module 410 may perform upsampling on the first breast image. For example, the obtaining module 410 may perform interpolation on the first breast image to improve the image quality of the first breast image, which helps improve the accuracy of determining the one or more breast feature parameters. In some embodiments, when the first breast image includes a relatively large amount of data, the obtaining module 410 may perform downsampling on the first breast image. For example, the obtaining module 410 may reduce the number (or count) of pixels of the first breast image to reduce the data volume of the first breast image, which may help increase the rate of determining the one or more breast feature parameters. In some embodiments, the obtaining module 410 may perform image segmentation on the first breast image by dividing the first breast image according to, for example, a preset image segmentation size (e.g., 240 mm × 240 mm). By dividing different first breast images into a uniform size, the normalization of different first breast images may be achieved, which may improve the accuracy of determining the one or more breast feature parameters, reduce the difficulty of determining the one or more breast feature parameters, and increase the rate of determining the one or more breast feature parameters. In some embodiments, the obtaining module 410 may use spatial domain algorithms to perform image enhancement processing on the first breast image, so as to obtain a processed first breast image that is conducive to the automatic determination of the one or more breast feature parameters.
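Merely by way of example, the upsampling, downsampling, and segmentation operations may be sketched as follows. The size thresholds and zoom factors are illustrative assumptions; the 240 mm × 240 mm segmentation size follows the example above.

```python
# Illustrative preprocessing sketch for the first breast image.
import numpy as np
from scipy.ndimage import zoom

def preprocess(image: np.ndarray, pixel_spacing_mm: float,
               target_size_mm: float = 240.0) -> np.ndarray:
    if image.shape[0] < 512:            # low quality: upsample by interpolation
        image = zoom(image, 2.0, order=1)
        pixel_spacing_mm /= 2.0
    elif image.shape[0] > 4096:         # large data volume: downsample
        image = zoom(image, 0.5, order=1)
        pixel_spacing_mm *= 2.0
    # Segment (crop) to a uniform physical size to normalize different images.
    n = int(target_size_mm / pixel_spacing_mm)
    return image[:n, :n]
```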
In some embodiments, the obtaining module 410 may perform the following operations to achieve the image enhancement of the first breast image. The obtaining module 410 may obtain an equalization image by performing histogram equalization on the first breast image. The obtaining module 410 may transform the equalization image into a frequency domain image. The obtaining module 410 may obtain an amplified frequency domain image by amplifying a signal at one or more frequency points in the frequency domain image. The obtaining module 410 may obtain a spatial domain image by performing inverse transformation on the amplified frequency domain image. The obtaining module 410 may perform enhancement related to a greyscale range and contour extraction on the spatial domain image.
In some embodiments, the first breast image may be understood as a spatial domain image. The obtaining module 410 may perform the histogram equalization on the first breast image. For example, the obtaining module 410 may perform histogram equalization in the spatial domain to adjust the contrast of the first breast image using the image histogram of the first breast image, which can effectively address the problem, caused by overexposure or underexposure, of the background and/or the foreground of the first breast image being too dark or too bright, thereby better reflecting the image details of the first breast image. In some embodiments, the obtaining module 410 may transform the equalization image into a frequency domain image. For example, the obtaining module 410 may transform the equalization image into a frequency domain image through a fast Fourier transformation algorithm. Correspondingly, the obtaining module 410 may transform the frequency domain image into a spatial domain image through an inverse Fourier transformation algorithm. In some embodiments, amplifying a signal at one or more frequency points in the frequency domain image may make the boundary of tissue in the first breast image clearer. In some embodiments, the contour extraction processing may be performed on the spatial domain image using, for example, a high-contrast retention algorithm to effectively extract the contour of tissue in the first breast image, so as to sharpen the local image features of the first breast image. In some embodiments, the grayscale range may refer to a grayscale range corresponding to a grayscale distribution of different tissues such as the calcification, the lump, the gland, the fat, or the skin in the first breast image. In some embodiments, the obtaining module 410 may perform enhancement on the spatial domain image in the grayscale range. In some embodiments, the obtaining module 410 may take a window location (e.g., the center point of the grayscale range) and a window width (e.g., the width of the grayscale range) of the grayscale range to cut and normalize the image grayscale of the spatial domain image. The obtaining module 410 may set the gray values in the spatial domain image higher than the grayscale range to 1 (e.g., presented as white), and set the gray values in the spatial domain image lower than the grayscale range to 0 (e.g., presented as black), so as to highlight the region in the spatial domain image corresponding to the grayscale range.
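Merely by way of example, the enhancement pipeline described above may be sketched as follows. The amplified frequency band, the window parameters, and the blur radius of the high-contrast retention step are illustrative assumptions.

```python
# Sketch of the enhancement pipeline: histogram equalization, frequency-domain
# amplification, grayscale windowing, and contour extraction.
import numpy as np
from scipy.ndimage import gaussian_filter

def enhance(image: np.ndarray, window_center: float, window_width: float):
    # 1. Histogram equalization in the spatial domain.
    hist, bins = np.histogram(image.ravel(), bins=256)
    cdf = hist.cumsum() / hist.sum()
    equalized = np.interp(image.ravel(), bins[:-1], cdf).reshape(image.shape)

    # 2-4. Transform to the frequency domain, amplify a band of frequency
    # points to sharpen tissue boundaries, and transform back.
    spectrum = np.fft.fftshift(np.fft.fft2(equalized))
    rows, cols = spectrum.shape
    r = np.hypot(*np.ogrid[-rows // 2:rows // 2, -cols // 2:cols // 2])
    spectrum[(r > 20) & (r < 60)] *= 2.0      # assumed band and gain
    spatial = np.real(np.fft.ifft2(np.fft.ifftshift(spectrum)))

    # 5a. Grayscale windowing: values above the window -> 1 (white), values
    # below -> 0 (black), values inside -> normalized to [0, 1].
    lo, hi = window_center - window_width / 2, window_center + window_width / 2
    windowed = np.clip((spatial - lo) / (hi - lo), 0.0, 1.0)

    # 5b. Contour extraction via high-contrast retention (unsharp residual).
    contours = windowed - gaussian_filter(windowed, sigma=3)
    return windowed, contours
```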
The process 600 for determining the one or more scan parameters provided in this embodiment includes an operation of processing the first breast image, which can improve the efficiency and accuracy of determining the one or more breast feature parameters using the feature extraction model.
It should be noted that the above description is merely provided for the purposes of illustration, and not intended to limit the scope of the present disclosure. For persons having ordinary skills in the art, multiple variations and modifications may be made under the teachings of the present disclosure. However, those variations and modifications do not depart from the scope of the present disclosure.
FIG. 7 is a flowchart illustrating an exemplary process for obtaining a feature extraction model according to some embodiments of the present disclosure. In some embodiments, the process 700 may be implemented in the imaging system 100 illustrated in FIG. 1. For example, the process 700 may be stored in a storage medium (e.g., the storage device 150, the storage 220 of the processing device 140, the storage 390 of the terminal 130, or the memory 360 of the terminal 130) as a form of instructions, and can be invoked and/or executed by the processing device 140 or the terminal 130 (e.g., the processor 210 of the processing device 140, the CPU 340 and/or the GPU 330 of the terminal 130, or one or more modules in the processing device 140 illustrated in FIG. 4). The operations of the illustrated process 700 presented below are intended to be illustrative. In some embodiments, the process 700 may be accomplished with one or more additional operations not described, and/or without one or more of the operations discussed. Additionally, the order in which the operations of the process 700 are illustrated in FIG. 7 and described below is not intended to be limiting.
In 710, the feature parameter determination module 420 may obtain a plurality of sample breast images. Each of the plurality of sample breast images may include one or more sample feature parameters. In some embodiments, the one or more sample feature parameters may correspond to the one or more breast feature parameters.
In some embodiments, the feature parameter determination module 420 may process the plurality of sample breast images, which may improve the accuracy and the efficiency of the training process of the feature extraction model. The operation of processing the plurality of sample breast images may be similar to the operation of processing the first breast image illustrated in FIG. 6.
In some embodiments, the extraction of the sample feature parameters from the plurality of sample breast images may be fully automated, semi-automated, manual, or a combination thereof. In some embodiments, in a fully-automated feature extraction approach, one or more image processing algorithms may first detect one or more sample ROIs and then extract the sample feature parameters. For instance, the image processing algorithm (s) may automatically detect one or more sample ROIs (e.g., calcifications, lumps, tumors in the plurality of sample breast images) , and then extract the sample feature parameters from the detected  sample ROI (s) . Under a semi-automated approach, some of the sample feature parameters may be extracted automatically, while others may be annotated, edited, or corrected by a user. Under a manual approach, the sample feature parameters may be annotated or measured by a user.
In some embodiments, the sample feature parameters may be extracted during the training. For example, deep learning (e.g., deep structured learning, hierarchical learning, or deep machine learning) may model high-level abstractions in data by using multiple processing layers with structures composed of multiple non-linear transformations, where the input data features are not engineered explicitly. When the sample breast images are input into a preliminary model, the preliminary model may process the sample breast images via multiple layers of feature extraction to produce the sample feature parameters used as ground truth of the training process. Deep learning may be effective in extracting high-level abstractions (features) from low-level input data (e.g., raw images) that are suitable for the trained task, and thus may be more convenient than, and superior to, hand-crafted features for identifying breast features in breast images.
In some embodiments, the sample feature parameters may be extracted from the sample breast images based on gray information and/or morphological information in the sample breast images. In some embodiments, the gray values of breast calcifications in a breast image may be relatively high, the distribution of breast calcifications in a breast image may be relatively loose, the edges of breast calcifications in a breast image may be relatively smooth, and the diameters of breast calcifications in a breast image may be between 100 microns and several millimeters. In some embodiments, there may also be a few calcifications that are large, rod-shaped, and linearly distributed. In some embodiments, the gray values of breast lumps in a breast image may be lower than those of calcifications. The edges of breast lumps in a breast image may differ depending on whether the lumps are malignant or benign. A malignant lump may have sharp and scattered edges or irregular edges, and a benign lump may have smooth edges and an oval or round shape. The diameter of a lump may be a few millimeters to a few centimeters.
In 720, the feature parameter determination module 420 may obtain the feature extraction model by training a preliminary model using the plurality of sample breast images. In some embodiments, the plurality of sample breast images may be used as a training set of data (e.g., a training set of inputs each having a known output (e.g., ground truth) such as the sample feature parameters) to train the feature extraction model.
In some embodiments, during a training process of the feature extraction model, a preliminary model may be obtained. The preliminary model may be trained based on the sample breast images and the sample feature parameters to obtain the feature extraction model. In some embodiments, the preliminary model may include a plurality of weight parameters that are to be determined during learning, which may be referred to as a training process. In the training process, a training set of data (a training set of inputs such as the sample breast images each having a known output such as the sample feature parameters) may be processed by the preliminary model so that the preliminary model may learn how to provide an output for new input data by generalizing the information it learns in the training process from the training data. The purpose of learning may be to adapt the weight parameters on the incoming connections to predict the correct output when given an input.
Details regarding the training process of the feature extraction model may be found elsewhere in the present disclosure (e.g., the description in connection with FIG. 8) .
In some embodiments, after additional sample breast images are collected, the training process may be repeated to update the feature extraction model using the additional sample breast images with or without previous sample breast images  used for earlier rounds of the training.
In some embodiments, the feature extraction model may be determined by the imaging system 100 (e.g., the processing device 140, the terminal 130, a storage device (the storage device 150, the storage 220, the storage 390)) or a third party (e.g., an external device). In some embodiments, the imaging system 100 may determine and/or update the feature extraction model offline and store the feature extraction model in the storage device. In some embodiments, the feature extraction model may be determined and/or updated (or maintained) by, e.g., the manufacturer of the imaging device 110 or a vendor. For instance, the manufacturer or the vendor may load the feature extraction model into the imaging system 100 or a portion thereof (e.g., the processing device 140 and/or the terminal 130) before or during the installation of the imaging device 110, the processing device 140, and/or the terminal 130, and maintain or update the feature extraction model from time to time (periodically or not). The maintenance or update may be achieved by installing a program stored on a storage device (e.g., a compact disc, a USB drive, etc.) or retrieved from an external source (e.g., a server maintained by the manufacturer or vendor) via the network 120. The program may include a new model (e.g., a new feature extraction model) or a portion of a model that substitutes or supplements a corresponding portion of the model.
It should be noted that the above description is merely provided for the purposes of illustration, and not intended to limit the scope of the present disclosure. For persons having ordinary skills in the art, multiple variations and modifications may be made under the teachings of the present disclosure. However, those variations and modifications do not depart from the scope of the present disclosure.
FIG. 8 is a flowchart illustrating an exemplary training process for obtaining a feature extraction model according to some embodiments of the present disclosure. In some embodiments, the process 800 may be implemented in the imaging system 100 illustrated in FIG. 1. For example, the process 800 may be stored in a storage medium (e.g., the storage device 150, the storage 220 of the processing device 140, the storage 390 of the terminal 130, or the memory 360 of the terminal 130) as a form of instructions, and can be invoked and/or executed by the processing device 140 or the terminal 130 (e.g., the processor 210 of the processing device 140, the CPU 340 and/or the GPU 330 of the terminal 130, or one or more modules in the processing device 140 illustrated in FIG. 4). The operations of the illustrated process 800 presented below are intended to be illustrative. In some embodiments, the process 800 may be accomplished with one or more additional operations not described, and/or without one or more of the operations discussed. Additionally, the order in which the operations of the process 800 are illustrated in FIG. 8 and described below is not intended to be limiting. In some embodiments, operation 720 of the process 700 may be performed based on the process 800.
In some embodiments, the feature parameter determination module 420 may obtain the feature extraction model by performing an iteration process including one or more iterations. In some embodiments, the feature parameter determination module 420 may update weight parameters of a preliminary model by performing an iteration process of a backpropagation neural network training procedure, e.g., a stochastic gradient descent backpropagation training technique, to determine the updated values of the weight parameters of the feature extraction model. For example, the feature parameter determination module 420 may backpropagate the error determined for the output of the neural network in order to adjust the parameters of the neural network layers.
In 810, the feature parameter determination module 420 may obtain one or more output feature parameters by inputting one or more of the plurality of sample breast images into an intermediate model. The intermediate model may be the preliminary model in a first iteration of the one or more iterations of the iteration process, or an updated model generated in an iteration prior to the current iteration.
In 820, the feature parameter determination module 420 may determine a difference between the one or more output feature parameters and the one or more sample feature parameters. In some embodiments, the feature parameter determination module 420 may determine a value of a loss function based on the difference.
In 830, the feature parameter determination module 420 may determine whether a termination condition is satisfied. An exemplary termination condition may be that the value of the loss function in the current iteration is less than a threshold value. Other exemplary termination conditions may include that a maximum number (or count) of iterations has been performed, and/or that a difference between the values of the loss function obtained in a previous iteration and the current iteration (or among the values of the loss function within a certain number or count of successive iterations) is less than a certain threshold. In response to a determination that the termination condition is not satisfied in 830, the process 800 may proceed to 850 to update the intermediate model, and initiate a new iteration by further repeating 810-830 until the termination condition is satisfied. In response to a determination that the termination condition is satisfied in 830, the process 800 may proceed to operation 840, e.g., the iterative process may be terminated, and the intermediate model in the current iteration may be determined as the feature extraction model and may be stored and/or output.
In 850, the feature parameter determination module 420 may update the intermediate model based on the difference between the one or more output feature parameters and the one or more sample feature parameters. For example, the feature parameter determination module 420 may update the weight parameters in the intermediate model based on the difference between the one or more output feature parameters and the one or more sample feature parameters.
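Merely by way of example, the iteration process of FIG. 8 may be sketched in PyTorch as follows. The model, the loss function, the learning rate, and the termination threshold are illustrative assumptions; the disclosure prescribes only the loop structure.

```python
# Minimal sketch of the training loop (operations 810-850).
import torch

def train(model, sample_images, sample_features,
          max_iterations=1000, loss_threshold=1e-3, lr=1e-3):
    optimizer = torch.optim.SGD(model.parameters(), lr=lr)  # stochastic gradient descent
    criterion = torch.nn.MSELoss()
    for _ in range(max_iterations):
        output_features = model(sample_images)              # 810: output feature parameters
        loss = criterion(output_features, sample_features)  # 820: difference to ground truth
        if loss.item() < loss_threshold:                    # 830: termination condition
            break                                           # 840: finalize the model
        optimizer.zero_grad()
        loss.backward()                                     # backpropagate the error
        optimizer.step()                                    # 850: update the intermediate model
    return model
```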
It should be noted that the above description is merely provided for the purposes of illustration, and not intended to limit the scope of the present disclosure.  For persons having ordinary skills in the art, multiple variations and modifications may be made under the teachings of the present disclosure. However, those variations and modifications do not depart from the scope of the present disclosure.
FIG. 9 is a flowchart illustrating an exemplary process for determining one or more scan parameters according to some embodiments of the present disclosure. In some embodiments, the process 900 may be implemented in the imaging system 100 illustrated in FIG. 1. For example, the process 900 may be stored in a storage medium (e.g., the storage device 150, the storage 220 of the processing device 140, the storage 390 of the terminal 130, or the memory 360 of the terminal 130) as a form of instructions, and can be invoked and/or executed by the processing device 140 or the terminal 130 (e.g., the processor 210 of the processing device 140, the CPU 340 and/or the GPU 330 of the terminal 130, or one or more modules in the processing device 140 illustrated in FIG. 4). The operations of the illustrated process 900 presented below are intended to be illustrative. In some embodiments, the process 900 may be accomplished with one or more additional operations not described, and/or without one or more of the operations discussed. Additionally, the order in which the operations of the process 900 are illustrated in FIG. 9 and described below is not intended to be limiting. In some embodiments, on the basis of the process 500 in FIG. 5, the process 900 may determine the one or more scan parameters by inputting the compression thickness and the one or more breast feature parameters into a parameter determination model.
In 910, the obtaining module 410 may obtain a compression thickness of a compressed target breast and a first breast image of the target breast.
In 920, the feature parameter determination module 420 may determine one or more breast feature parameters by inputting the first breast image into a feature extraction model.
In 930, the scan parameter determination module 430 may determine one or more scan parameters by inputting the compression thickness and the one or more  breast feature parameters into a parameter determination model.
The operations 910 and 920 may be similar to the operations 510 and 520 of the process 500 in FIG. 5.
In some embodiments, the parameter determination model may include an expert system or an artificial intelligence network model. In some embodiments, the artificial intelligence network model may include a machine learning model. In some embodiments, the artificial intelligence network model may include a deep learning model. In some embodiments, the artificial intelligence network model may include an artificial neural network model (e.g., a CNN model) . In some embodiments, the parameter determination model may be provided by a training process and configured to output the one or more scan parameters.
In some embodiments, the training process of the parameter determination model may be similar to the training process of the feature extraction model illustrated in FIG. 7 and FIG. 8.
In some embodiments, sample feature parameters and sample compression thicknesses of a plurality of sample breast images, and sample scan parameters used to generate the plurality of sample breast images may be obtained and used as a training set of data (e.g., a training set of inputs such as the sample feature parameters and the sample compression thicknesses each having a known output (e.g., ground truth) such as the sample scan parameters) to train the parameter determination model. In some embodiments, the extraction of the sample feature parameters may be similar to the relevant description in connection with operation 710 of the process 700 in FIG. 7. In some embodiments, the parameter determination model may realize the automatic determination of the one or more scan parameters of the breast tomography.
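Merely by way of example, assembling the training set and fitting a parameter determination model may be sketched as follows. The synthetic arrays below stand in for real sample data, and the regressor choice is an assumption; any model mapping the inputs to the sample scan parameters could be substituted.

```python
# Sketch of training a parameter determination model on (thickness, feature
# parameters) -> (scan angle range, frame number) pairs.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)
n = 200
sample_thicknesses = rng.uniform(20, 80, size=(n, 1))  # compression thickness (mm)
sample_features = rng.uniform(0, 1, size=(n, 4))       # encoded type, shape, edge, major axis
X = np.hstack([sample_thicknesses, sample_features])
y = np.hstack([rng.uniform(10, 50, size=(n, 1)),       # sample scan angle ranges (deg)
               rng.integers(9, 25, size=(n, 1))])      # sample frame numbers (ground truth)

parameter_model = RandomForestRegressor(n_estimators=50, random_state=0).fit(X, y)
angle_range, frame_number = parameter_model.predict(X[:1])[0]
```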
It should be noted that the above description is merely provided for the purposes of illustration, and not intended to limit the scope of the present disclosure. For persons having ordinary skills in the art, multiple variations and modifications  may be made under the teachings of the present disclosure. However, those variations and modifications do not depart from the scope of the present disclosure.
FIG. 10 is a flowchart illustrating an exemplary process for determining one or more scan parameters according to some embodiments of the present disclosure. In some embodiments, the process 1000 may be implemented in the imaging system 100 illustrated in FIG. 1. For example, the process 1000 may be stored in a storage medium (e.g., the storage device 150, the storage 220 of the processing device 140, the storage 390 of the terminal 130, or the memory 360 of the terminal 130) as a form of instructions, and can be invoked and/or executed by the processing device 140 or the terminal 130 (e.g., the processor 210 of the processing device 140, the CPU 340 and/or the GPU 330 of the terminal 130, or one or more modules in the processing device 140 illustrated in FIG. 4). The operations of the illustrated process 1000 presented below are intended to be illustrative. In some embodiments, the process 1000 may be accomplished with one or more additional operations not described, and/or without one or more of the operations discussed. Additionally, the order in which the operations of the process 1000 are illustrated in FIG. 10 and described below is not intended to be limiting.
In 1010, the obtaining module 410 may obtain a compression thickness of a compressed target breast and a first breast image of the target breast. The operation 1010 may be similar to the operation 510 of the process 500 in FIG. 5.
In 1020, the scan parameter determination module 430 may determine, based on the compression thickness and the first breast image, one or more scan parameters of breast tomography on the target breast. In some embodiments, the one or more scan parameters may include a scan angle range and a frame number of the breast tomography. In some embodiments, the scan parameter determination module 430 may determine the one or more scan parameters by inputting the compression thickness and the first breast image into a scan parameter model.
In some embodiments, the scan parameter model may include a function of extracting features from an image. Therefore, the scan parameter model may directly output the one or more scan parameters based on the input of the compression thickness and the first breast image. For example, the scan parameter model may extract one or more breast feature parameters from the first breast image. Then, the scan parameter model may determine the one or more scan parameters based on the compression thickness and the one or more breast feature parameters.
In some embodiments, the scan parameter model may include an expert system or an artificial intelligence network model. In some embodiments, the artificial intelligence network model may include a machine learning model. In some embodiments, the artificial intelligence network model may include a deep learning model. In some embodiments, the scan parameter model may be provided by a training process and configured to output the one or more scan parameters.
In some embodiments, the artificial intelligence network model may include an artificial neural network model. The artificial neural network model may include an input layer, an output layer, and a plurality of hidden layers. The layers of the artificial neural network model may be connected in a feed-forward fashion, and an output of an i-th layer may be provided as an input to an (i+1)-th layer.
In some embodiments, in the artificial neural network model, the input layer may be configured to receive an input (e.g., the first breast image and the compression thickness) of the artificial neural network model. Each hidden layer may perform a specific function, e.g., convolution, pooling, normalization, matrix multiplication, non-linear activation, or the like. The output layer may receive an input from the preceding layer and apply one or more transformations to the received input to generate a processing result (e.g., the one or more scan parameters) of the artificial neural network model.
For illustration purposes, a convolutional neural network (CNN) model may  be taken as an example. Exemplary hidden layers may include a convolutional layer, a pooling layer, and a fully connected layer. In some embodiments, the first breast image and the compression thickness may be inputted into the scan parameter model. The first breast image may be represented as a 2D matrix or a 3D matrix including a plurality of elements (e.g., pixels or voxels) . Each of the plurality of elements in the matrix may have a value representing a feature or characteristic of the element.
The convolutional layer may include one or more kernels, which may be used to extract a feature of the image (e.g., one or more breast feature parameters of the first breast image). In some embodiments, each of the one or more convolutional kernels may have a specific size and stride. In some embodiments, each of the one or more kernels may filter a portion of the first breast image to generate a specific feature corresponding to the portion. The specific feature may be determined based on the one or more kernels. Exemplary features may include a low-level feature (e.g., an edge feature, a textural feature), a high-level feature, or a complicated feature.
The pooling layer may take an output of the convolutional layer as an input. The pooling layer may include a plurality of pooling nodes, which may be used to sample the output of the convolutional layer, so as to reduce the computational load of data processing and accelerate the data processing speed. In some embodiments, a size of the matrix representing the first breast image may be reduced in the pooling layer.
The fully connected layer may include a plurality of neurons. The neurons may be connected to the pooling nodes in the pooling layer. In the fully connected layer, a plurality of vectors corresponding to the plurality of pooling nodes may be determined based on the one or more breast feature parameters of the first breast image and the compression thickness, and a plurality of weighting coefficients may be assigned to the plurality of vectors.
The output layer may determine an output based on the vectors and the weighting coefficients obtained from the fully connected layer. In some embodiments, an output of the output layer may include the one or more scan parameters.
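Merely by way of example, such a CNN may be sketched in PyTorch as follows. The convolutional layer echoes the 36 kernels of size 13×13 with a stride of 1 and the max pooling recited in claim 9 for the feature extraction model; reusing them here, and every other layer size, is an assumption.

```python
# Hedged sketch of a scan parameter model that takes the first breast image
# and the compression thickness and outputs a scan angle range and frame number.
import torch
import torch.nn as nn

class ScanParameterModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv = nn.Conv2d(1, 36, kernel_size=13, stride=1)  # feature extraction
        self.pool = nn.MaxPool2d(2)                             # max-pooling layer
        self.squeeze = nn.AdaptiveAvgPool2d((4, 4))
        # The compression thickness is concatenated at the fully connected
        # layer; the two outputs are the scan angle range and frame number.
        self.fc = nn.Sequential(
            nn.Linear(36 * 4 * 4 + 1, 64), nn.ReLU(), nn.Linear(64, 2))

    def forward(self, image, thickness):
        x = self.squeeze(self.pool(torch.relu(self.conv(image))))
        x = torch.cat([x.flatten(1), thickness], dim=1)
        return self.fc(x)

model = ScanParameterModel()
image = torch.randn(1, 1, 256, 256)   # first breast image (illustrative size)
thickness = torch.tensor([[45.0]])    # compression thickness in mm
angle_range, frame_number = model(image, thickness)[0]
```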
In some embodiments, the training process of the scan parameter model may be similar to the training process of the feature extraction model illustrated in FIG. 7 and FIG. 8.
In some embodiments, a plurality of sample breast images, sample compression thicknesses of the plurality of sample breast images, and sample scan parameters used to generate the plurality of sample breast images may be obtained and used as a training set of data (e.g., a training set of inputs such as the sample breast images and the sample compression thicknesses each having a known output (e.g., ground truth) such as the sample scan parameters) to train the scan parameter model.
It should be noted that the above description is merely provided for the purposes of illustration, and not intended to limit the scope of the present disclosure. For persons having ordinary skills in the art, multiple variations and modifications may be made under the teachings of the present disclosure. However, those variations and modifications do not depart from the scope of the present disclosure.
Having thus described the basic concepts, it may be rather apparent to those skilled in the art after reading this detailed disclosure that the foregoing detailed disclosure is intended to be presented by way of example only and is not limiting. Various alterations, improvements, and modifications may occur to those skilled in the art, though not expressly stated herein. These alterations, improvements, and modifications are intended to be suggested by this disclosure, and are within the spirit and scope of the exemplary embodiments of this disclosure.
Moreover, certain terminology has been used to describe embodiments of the present disclosure. For example, the terms “one embodiment, ” “an embodiment, ”  and/or “some embodiments” mean that a particular feature, structure or characteristic described in connection with the embodiment is included in at least one embodiment of the present disclosure. Therefore, it is emphasized and should be appreciated that two or more references to “an embodiment” or “one embodiment” or “an alternative embodiment” in various portions of this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures or characteristics may be combined as suitable in one or more embodiments of the present disclosure.
Further, it will be appreciated by one skilled in the art that aspects of the present disclosure may be illustrated and described herein in any of a number of patentable classes or contexts, including any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof. Accordingly, aspects of the present disclosure may be implemented entirely as hardware, entirely as software (including firmware, resident software, micro-code, etc.), or as a combination of software and hardware implementations that may all generally be referred to herein as a "unit," "module," or "system." Furthermore, aspects of the present disclosure may take the form of a computer program product embodied in one or more computer readable media having computer readable program code embodied thereon.
A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated signal may take any of a variety of forms, including electro-magnetic, optical, or the like, or any suitable combination thereof. A computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that may communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable signal medium may be transmitted using any appropriate medium,  including wireless, wireline, optical fiber cable, RF, or the like, or any suitable combination of the foregoing.
Computer program code for carrying out operations for aspects of the present disclosure may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Scala, Smalltalk, Eiffel, JADE, Emerald, C++, C#, VB.NET, Python or the like, conventional procedural programming languages, such as the "C" programming language, Visual Basic, Fortran 2003, Perl, COBOL 2002, PHP, ABAP, dynamic programming languages such as Python, Ruby and Groovy, or other programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider) or in a cloud computing environment or offered as a service such as a Software as a Service (SaaS).
Furthermore, the recited order of processing elements or sequences, or the use of numbers, letters, or other designations therefore, is not intended to limit the claimed processes and methods to any order except as may be specified in the claims. Although the above disclosure discusses through various examples what is currently considered to be a variety of useful embodiments of the disclosure, it is to be understood that such detail is solely for that purpose, and that the appended claims are not limited to the disclosed embodiments, but, on the contrary, are intended to cover modifications and equivalent arrangements that are within the spirit and scope of the disclosed embodiments. For example, although the implementation of various components described above may be embodied in a hardware device, it  may also be implemented as a software only solution, e.g., an installation on an existing server or mobile device.
Similarly, it should be appreciated that in the foregoing description of embodiments of the present disclosure, various features are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of one or more of the various embodiments. This method of disclosure, however, is not to be interpreted as reflecting an intention that the claimed subject matter requires more features than are expressly recited in each claim. Rather, claimed subject matter may lie in less than all features of a single foregoing disclosed embodiment.

Claims (55)

  1. A system comprising:
    at least one storage device including a set of instructions; and
    at least one processor in communication with the at least one storage device, wherein when executing the set of instructions, the at least one processor is directed to perform operations including:
    obtaining a compression thickness of a compressed target breast and a first breast image of the target breast;
    determining one or more breast feature parameters by inputting the first breast image into a feature extraction model; and
    determining, based on the compression thickness and the one or more breast feature parameters, one or more scan parameters of breast tomography on the target breast, the one or more scan parameters including a scan angle range and a frame number of the breast tomography.
  2. The system of claim 1, wherein the at least one processor is directed to perform the operations including:
    before determining the one or more breast feature parameters based on the first breast image, processing the first breast image.
  3. The system of claim 2, wherein processing the first breast image includes performing at least one of downsampling, upsampling, image segmentation, or image enhancement on the first breast image.
  4. The system of claim 3, wherein performing image enhancement on the first breast image includes:
    obtaining an equalization image by performing histogram equalization on the first breast image;
    transforming the equalization image into a frequency domain image;
    obtaining an amplified frequency domain image by amplifying a signal at one or more frequency points in the frequency domain image;
    obtaining a spatial domain image by performing inverse transformation on the amplified frequency domain image; and
    performing enhancement related to a greyscale range and contour extraction on the spatial domain image.
  5. The system of any one of claims 1-4, wherein the feature extraction model includes an expert system or an artificial intelligence network model.
  6. The system of claim 5, wherein the artificial intelligence network model includes a machine learning model.
  7. The system of claim 6, wherein the artificial intelligence network model includes a deep learning model.
  8. The system of claim 7, wherein the deep learning model includes a convolutional neural network model, the convolutional neural network model including an input layer, a convolutional layer, a pooling layer, a fully connected layer, and an output layer.
  9. The system of claim 8, wherein
    the convolutional layer includes 36 convolutional kernels each of which has a size of 13*13 and a stride of 1; and
    the pooling layer includes a max-pooling layer.
  10. The system of any one of claims 1-9, wherein the feature extraction model is provided by:
    obtaining a plurality of sample breast images, each of the plurality of sample breast images including one or more sample feature parameters; and
    obtaining the feature extraction model by training a preliminary model using the plurality of sample breast images.
  11. The system of claim 10, wherein obtaining the feature extraction model by training the preliminary model using the plurality of sample breast images includes:
    obtaining the feature extraction model by performing an iteration process including one or more iterations, at least one of the one or more iterations including:
    obtaining one or more output feature parameters by inputting one or more of the plurality of sample breast images into an intermediate model, the intermediate model including the preliminary model in a first iteration of the one or more iterations of the iteration process or an updated model generated in a previous iteration of the at least one of the one or more iterations; and
    updating the intermediate model based on a difference between the one or more output feature parameters and the one or more sample feature parameters.
  12. The system of any one of claims 1-11, wherein determining the one or more scan parameters based on the compression thickness and the one or more breast feature parameters includes:
    determining the one or more scan parameters by inputting the compression thickness and the one or more breast feature parameters into a parameter determination model.
  13. The system of claim 12, wherein the parameter determination model includes a machine learning model.
  14. The system of any one of claims 1-13, wherein determining the one or more scan parameters based on the compression thickness and the one or more breast feature parameters includes:
    determining a breast density feature of the target breast based on the compression thickness;
    modifying at least one of the one or more breast feature parameters based on the breast density feature; and
    determining the one or more scan parameters based on the one or more modified breast feature parameters.
  15. The system of any one of claims 1-14, wherein the one or more breast feature parameters include at least one of a type, shape, distribution, edge, or a major axis of each of one or more regions of interest (ROIs) in the first breast image.
  16. The system of any one of claims 1-15, wherein the scan angle range and the frame number are any value allowed by an imaging device that performs the breast tomography.
  17. The system of any one of claims 1-16, wherein the at least one processor is directed to perform the operations including:
    causing, based on the one or more scan parameters, an imaging device to perform the breast tomography on the target breast; and
    generating a second breast image of the target breast based on the breast tomography.
  18. The system of any one of claims 1-17, wherein the breast tomography includes digital breast tomosynthesis (DBT) .
  19. A method implemented on a machine including one or more processors and one or more storage devices, comprising:
    obtaining a compression thickness of a compressed target breast and a first breast image of the target breast;
    determining one or more breast feature parameters by inputting the first breast image into a feature extraction model; and
    determining, based on the compression thickness and the one or more breast feature parameters, one or more scan parameters of breast tomography on the target breast, the one or more scan parameters including a scan angle range and a frame number of the breast tomography.
  20. The method of claim 19, further comprising:
    before determining the one or more breast feature parameters based on the first breast image, processing the first breast image.
  21. The method of claim 20, wherein processing the first breast image includes performing at least one of downsampling, upsampling, image segmentation, or image enhancement on the first breast image.
  22. The method of claim 21, wherein performing image enhancement on the first breast image includes:
    obtaining an equalization image by performing histogram equalization on the first breast image;
    transforming the equalization image into a frequency domain image;
    obtaining an amplified frequency domain image by amplifying a signal at one or more frequency points in the frequency domain image;
    obtaining a spatial domain image by performing inverse transformation on the amplified frequency domain image; and
    performing enhancement related to a greyscale range and contour extraction on the spatial domain image.
  23. The method of any one of claims 19-22, wherein the feature extraction model includes an expert system or an artificial intelligence network model.
  24. The method of claim 23, wherein the artificial intelligence network model includes a machine learning model.
  25. The method of claim 24, wherein the artificial intelligence network model includes a deep learning model.
  26. The method of claim 25, wherein the deep learning model includes a convolutional neural network model, the convolutional neural network model including an input layer, a convolutional layer, a pooling layer, a fully connected layer, and an output layer.
  27. The method of claim 26, wherein
    the convolutional layer includes 36 convolutional kernels each of which has a size of 13*13 and a stride of 1; and
    the pooling layer includes a max-pooling layer.
  28. The method of any one of claims 19-27, wherein the feature extraction model is provided by:
    obtaining a plurality of sample breast images, each of the plurality of sample breast images including one or more sample feature parameters; and
    obtaining the feature extraction model by training a preliminary model using the plurality of sample breast images.
  29. The method of claim 28, wherein obtaining the feature extraction model by training the preliminary model using the plurality of sample breast images includes:
    obtaining the feature extraction model by performing an iteration process including one or more iterations, at least one of the one or more iterations including:
    obtaining one or more output feature parameters by inputting one or more of the plurality of sample breast images into an intermediate model, the intermediate model being the preliminary model in a first iteration of the iteration process, or an updated model generated in a previous iteration; and
    updating the intermediate model based on a difference between the one or more output feature parameters and the one or more sample feature parameters.
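One straightforward realization of the iteration of claim 29 is an ordinary supervised training loop, as sketched below; the mean-squared-error loss, the Adam optimizer, and the hyperparameters are assumptions rather than choices stated in the claims.

    import torch
    import torch.nn as nn

    def train_feature_extraction_model(model, loader, epochs=10, lr=1e-4):
        # loader is assumed to yield (sample image, sample feature
        # parameters) pairs built from the plurality of sample images.
        criterion = nn.MSELoss()
        optimizer = torch.optim.Adam(model.parameters(), lr=lr)
        for _ in range(epochs):
            for images, sample_params in loader:
                # Obtain output feature parameters from the intermediate
                # model (the preliminary or previously updated model).
                output_params = model(images)
                # Update based on the difference between the output and
                # the sample feature parameters.
                loss = criterion(output_params, sample_params)
                optimizer.zero_grad()
                loss.backward()
                optimizer.step()
        return model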
  30. The method of any one of claims 19-29, wherein determining the one or more scan parameters based on the compression thickness and the one or more breast feature parameters includes:
    determining the one or more scan parameters by inputting the compression thickness and the one or more breast feature parameters into a parameter determination model.
  31. The method of claim 30, wherein the parameter determination model includes a machine learning model.
  32. The method of any one of claims 19-31, wherein determining the one or more scan parameters based on the compression thickness and the one or more breast feature parameters includes:
    determining a breast density feature of the target breast based on the compression thickness;
    modifying at least one of the one or more breast feature parameters based on the breast density feature; and
    determining the one or more scan parameters based on the one or more modified breast feature parameters.
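Claim 32 can be read as a rule-based alternative to the parameter determination model of claims 30-31. The sketch below makes that reading concrete; every numeric threshold and weight in it is invented purely for illustration, and the claim does not state how the density feature follows from the thickness.

    def determine_scan_parameters(compression_thickness_mm, feature_params):
        # Derive a breast density feature from the compression thickness;
        # the linear mapping and the 80 mm cap are assumptions.
        density = min(compression_thickness_mm / 80.0, 1.0)
        # Modify the breast feature parameters based on the density feature.
        modified = {k: v * (1.0 + density) for k, v in feature_params.items()}
        # Map the modified parameters to a scan angle range and a frame
        # number, clamped to what an imaging device would allow (claim 34).
        score = sum(modified.values()) / max(len(modified), 1)
        scan_angle_range = min(15.0 + 30.0 * score, 50.0)  # degrees
        frame_number = int(min(9 + 16 * score, 25))
        return {"scan_angle_range": scan_angle_range,
                "frame_number": frame_number}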
  33. The method of any one of claims 19-32, wherein the one or more breast feature parameters include at least one of a type, shape, distribution, edge, or a major axis of each of one or more regions of interest (ROIs) in the first breast image.
  34. The method of any one of claims 19-33, wherein the scan angle range and the frame number are any values allowed by an imaging device that performs the breast tomography.
  35. The method of any one of claims 19-34, further comprising:
    causing, based on the one or more scan parameters, an imaging device to perform the breast tomography on the target breast; and
    generating a second breast image of the target breast based on the breast tomography.
  36. The method of any one of claims 19-35, wherein the breast tomography includes digital breast tomosynthesis (DBT).
  37. A system comprising:
    an obtaining module configured to obtain a compression thickness of a compressed target breast and a first breast image of the target breast;
    a feature parameter determination module configured to determine one or more breast feature parameters by inputting the first breast image into a feature extraction model; and
    a scan parameter determination module configured to determine, based on the compression thickness and the one or more breast feature parameters, one or more scan parameters of breast tomography on the target breast, the one or more scan parameters including a scan angle range and a frame number of the breast tomography.
  38. The system of claim 37, wherein the obtaining module is further configured to:
    process the first breast image before the one or more breast feature parameters are determined based on the first breast image.
  39. The system of claim 38, wherein processing the first breast image includes performing at least one of downsampling, upsampling, image segmentation, or image enhancement on the first breast image.
  40. The system of claim 39, wherein performing image enhancement on the first breast image includes:
    obtaining an equalization image by performing histogram equalization on the first breast image;
    transforming the equalization image into a frequency domain image;
    obtaining an amplified frequency domain image by amplifying a signal at one or more frequency points in the frequency domain image;
    obtaining a spatial domain image by performing inverse transformation on the amplified frequency domain image; and
    performing enhancement related to a greyscale range and contour extraction on the spatial domain image.
  41. The system of any one of claims 37-40, wherein the feature extraction model includes an expert system or an artificial intelligence network model.
  42. The system of claim 41, wherein the artificial intelligence network model includes a machine learning model.
  43. The system of claim 42, wherein the artificial intelligence network model includes a deep learning model.
  44. The system of claim 43, wherein the deep learning model includes a convolutional neural network model, the convolutional neural network model including an input layer, a convolutional layer, a pooling layer, a fully connected layer, and an output layer.
  45. The system of claim 44, wherein
    the convolutional layer includes 36 convolutional kernels each of which has a size of 13*13 and a stride of 1; and
    the pooling layer includes a max-pooling layer.
  46. The system of any one of claims 37-45, wherein the feature extraction model is provided by:
    obtaining a plurality of sample breast images, each of the plurality of sample breast images including one or more sample feature parameters; and
    obtaining the feature extraction model by training a preliminary model using the plurality of sample breast images.
  47. The system of claim 46, wherein obtaining the feature extraction model by training the preliminary model using the plurality of sample breast images includes:
    obtaining the feature extraction model by performing an iteration process including one or more iterations, at least one of the one or more iterations including:
    obtaining one or more output feature parameters by inputting one or more of the plurality of sample breast images into an intermediate model, the intermediate model being the preliminary model in a first iteration of the iteration process, or an updated model generated in a previous iteration; and
    updating the intermediate model based on a difference between the one or more output feature parameters and the one or more sample feature parameters.
  48. The system of any one of claims 37-47, wherein determining the one or more scan parameters based on the compression thickness and the one or more breast feature parameters includes:
    determining the one or more scan parameters by inputting the compression thickness and the one or more breast feature parameters into a parameter determination model.
  49. The system of claim 48, wherein the parameter determination model includes a machine learning model.
  50. The system of any one of claims 37-49, wherein determining the one or more scan parameters based on the compression thickness and the one or more breast feature parameters includes:
    determining a breast density feature of the target breast based on the compression thickness;
    modifying at least one of the one or more breast feature parameters based on the breast density feature; and
    determining the one or more scan parameters based on the one or more modified breast feature parameters.
  51. The system of any one of claims 37-50, wherein the one or more breast feature parameters include at least one of a type, shape, distribution, edge, or a major axis of each of one or more regions of interest (ROIs) in the first breast image.
  52. The system of any one of claims 37-51, wherein the scan angle range and the frame number are any values allowed by an imaging device that performs the breast tomography.
  53. The system of any one of claims 37-52, wherein the scan parameter determination module is further configured to:
    cause, based on the one or more scan parameters, an imaging device to perform the breast tomography on the target breast; and
    generate a second breast image of the target breast based on the breast tomography.
  54. The system of any one of claims 37-53, wherein the breast tomography includes digital breast tomosynthesis (DBT).
  55. A non-transitory computer readable medium, comprising at least one set of instructions, wherein when executed by one or more processors of a computing device, the at least one set of instructions causes the computing device to perform a method, the method comprising:
    obtaining a compression thickness of a compressed target breast and a first breast image of the target breast;
    determining one or more breast feature parameters by inputting the first breast image into a feature extraction model; and
    determining, based on the compression thickness and the one or more breast feature parameters, one or more scan parameters of breast tomography on the target breast, the one or more scan parameters including a scan angle range and a frame number of the breast tomography.
PCT/CN2020/142136 2019-12-31 2020-12-31 Imaging systems and methods WO2021136505A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201911419230.0 2019-12-31
CN201911419230.0A CN111028310B (en) 2019-12-31 2019-12-31 Method, device, terminal and medium for determining scanning parameters of breast tomography

Publications (1)

Publication Number Publication Date
WO2021136505A1 (en) 2021-07-08

Family

ID=70201483

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/142136 WO2021136505A1 (en) 2019-12-31 2020-12-31 Imaging systems and methods

Country Status (2)

Country Link
CN (1) CN111028310B (en)
WO (1) WO2021136505A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023205212A1 (en) * 2022-04-20 2023-10-26 Clarix Imaging Corporation Co-registration, display, and visualization of volumetric specimen imaging data with pre-surgical imaging data

Families Citing this family (2)

Publication number Priority date Publication date Assignee Title
CN111028310B (en) * 2019-12-31 2023-10-03 上海联影医疗科技股份有限公司 Method, device, terminal and medium for determining scanning parameters of breast tomography
CN116433695B (en) * 2023-06-13 2023-08-22 天津市第五中心医院 Mammary gland region extraction method and system of mammary gland molybdenum target image

Citations (5)

Publication number Priority date Publication date Assignee Title
US8411923B2 (en) * 2008-06-13 2013-04-02 Siemens Aktiengesellschaft Tomographic image reconstruction method and apparatus using filtered back projection
CN105611876A (en) * 2013-09-30 2016-05-25 富士胶片株式会社 Breast thickness measurement device and breast thickness measurement method
CN109363699A (en) * 2018-10-16 2019-02-22 杭州依图医疗技术有限公司 A kind of method and device of breast image lesion identification
CN110276411A (en) * 2019-06-28 2019-09-24 腾讯科技(深圳)有限公司 Image classification method, device, equipment, storage medium and medical treatment electronic equipment
CN111028310A (en) * 2019-12-31 2020-04-17 上海联影医疗科技有限公司 Scanning parameter determination method, device, terminal and medium for breast tomography

Family Cites Families (11)

Publication number Priority date Publication date Assignee Title
JP2002008006A (en) * 2000-06-20 2002-01-11 Fuji Photo Film Co Ltd Method and device for determining image processing condition
US7123684B2 (en) * 2002-11-27 2006-10-17 Hologic, Inc. Full field mammography with tissue exposure control, tomosynthesis, and dynamic field of view processing
WO2005120357A1 (en) * 2004-06-04 2005-12-22 U-Systems, Inc. Breast ultrasound scanning promoting patient comfort and improved imaging near chest wall
EP2428163B1 (en) * 2010-09-09 2021-12-01 Hologic, Inc. Methods and systems for dynamically modifying acquisition parameter during image acquisition
CN103582455B (en) * 2011-02-14 2016-12-28 罗切斯特大学 Computer aided detection based on cone beam breast CT image and the method and apparatus of diagnosis
KR102326968B1 (en) * 2014-08-29 2021-11-17 (주)바텍이우홀딩스 mammography system and method
JP6695164B2 (en) * 2016-02-19 2020-05-20 富士フイルム株式会社 Tomographic image generator, radiation image capturing system, tomographic image generating method, and tomographic image generating program
CN105662445A (en) * 2016-02-25 2016-06-15 深圳柏斯生物科技有限公司 Full-automatic exposure control method and device of digital mammary gland X-ray imaging system
US9943280B2 (en) * 2016-03-07 2018-04-17 General Electric Company Breast tomosynthesis with flexible compression paddle
CN109381212A (en) * 2018-09-27 2019-02-26 上海联影医疗科技有限公司 A kind of image formation control method and system
CN109447065B (en) * 2018-10-16 2020-10-16 杭州依图医疗技术有限公司 Method and device for identifying mammary gland image


Also Published As

Publication number Publication date
CN111028310B (en) 2023-10-03
CN111028310A (en) 2020-04-17

Similar Documents

Publication Publication Date Title
US11565130B2 (en) System and method for diagnostic and treatment
US11847763B2 (en) Systems and methods for image reconstruction
WO2021136505A1 (en) Imaging systems and methods
US20210142476A1 (en) Systems and methods for image optimization
US11562469B2 (en) System and method for image processing
CA3067078C (en) System and method for image processing
WO2021068975A1 (en) Systems and methods for image reconstruction
US11475569B2 (en) Systems and methods for image processing
US20220192619A1 (en) Imaging systems and methods
US11672496B2 (en) Imaging systems and methods
US20230064456A1 (en) Imaging systems and methods
WO2023125683A1 (en) Systems and methods for image reconstruction
US20230225687A1 (en) System and method for medical imaging
US11244446B2 (en) Systems and methods for imaging
US20220114801A1 (en) Systems and methods for image processing
US20230342974A1 (en) Imaging systems and methods

Legal Events

Date Code Title Description
121 Ep: the EPO has been informed by WIPO that EP was designated in this application (Ref document number: 20909997; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: PCT application non-entry in European phase (Ref document number: 20909997; Country of ref document: EP; Kind code of ref document: A1)