WO2022061787A1 - Medical systems and methods - Google Patents

Medical systems and methods

Info

Publication number: WO2022061787A1
Authority: WO (WIPO (PCT))
Prior art keywords: diagnosis result, images, suspected lesions, severity degree, suspected
Application number: PCT/CN2020/117944
Other languages: English (en)
Inventor: Leimin SHANG
Original Assignee: Shanghai United Imaging Healthcare Co., Ltd.
Application filed by Shanghai United Imaging Healthcare Co., Ltd.
Priority to CN202080105497.7A (published as CN116325009A)
Priority to PCT/CN2020/117944
Publication of WO2022061787A1

Classifications

    • G - PHYSICS
    • G16 - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H - HEALTHCARE INFORMATICS, i.e. ICT SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 20/40 - ICT specially adapted for therapies or health-improving plans relating to mechanical, radiation or invasive therapies, e.g. surgery, laser therapy, dialysis or acupuncture
    • G16H 30/20 - ICT specially adapted for the handling of medical images, e.g. DICOM, HL7 or PACS
    • G16H 30/40 - ICT specially adapted for processing medical images, e.g. editing
    • G16H 40/20 - ICT specially adapted for the management or administration of healthcare resources or facilities, e.g. managing hospital staff or surgery rooms
    • G16H 40/67 - ICT specially adapted for the remote operation of medical equipment or devices
    • G16H 50/20 - ICT specially adapted for computer-aided diagnosis, e.g. based on medical expert systems
    • G16H 50/30 - ICT specially adapted for calculating health indices; for individual health risk assessment
    • G16H 50/70 - ICT specially adapted for mining of medical data, e.g. analysing previous cases of other patients
    • G16H 80/00 - ICT specially adapted for facilitating communication between medical practitioners or patients, e.g. for collaborative diagnosis, therapy or health monitoring

Definitions

  • the disclosure generally relates to medical technology, and more particularly, to systems and methods for automatic diagnosis.
  • Automatic diagnosis includes, e.g., computer-aided diagnosis (CAD).
  • a system may include one or more storage devices and one or more processors configured to communicate with the one or more storage devices.
  • the one or more storage devices may include a set of instructions.
  • the one or more processors may be directed to perform one or more of the following operations.
  • the one or more processors may cause a scanner to perform one or more scans on an object.
  • the one or more processors may obtain scan data of the object based on the one or more scans.
  • the one or more processors may generate one or more images of the object based on the scan data.
  • the one or more processors may generate, based on the one or more images, a diagnosis result.
  • the diagnosis result may at least indicate whether one or more suspected lesions exist in the object.
  • the diagnosis result may be generated based on a machine learning model.
  • the diagnosis result may indicate lesion information including at least one of a type of the one or more suspected lesions, a size of the one or more suspected lesions, a location of the one or more suspected lesions, a characteristic of the one or more suspected lesions, a severity degree of the one or more suspected lesions, or a suggestion of subsequent treatment associated with the one or more suspected lesions.
  • the lesion information may include the severity degree of the one or more suspected lesions.
  • the one or more processors may determine whether the severity degree of the one or more suspected lesions is greater than a threshold. In response to determining that the severity degree of the one or more suspected lesions is greater than the threshold, the one or more processors may generate a message indicating that the object needs a priority treatment.
  • the one or more processors may transmit, via a network, the message (notification) to a remote surgery device.
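A minimal sketch of this severity-gating step follows, assuming a numeric severity score; the threshold value and all function and field names are illustrative assumptions, not terms from the patent.

```python
# Hypothetical sketch of the severity-gating step described above.
SEVERITY_THRESHOLD = 0.8  # assumed numeric scale; the patent does not fix one

def notify_remote_surgery_device(message: dict) -> None:
    # Placeholder transport; a real system might use DICOM, HL7, or HTTPS.
    print("notify:", message)

def gate_by_severity(severity_degree: float, object_id: str) -> dict | None:
    """Generate and send a priority-treatment message when severity exceeds the threshold."""
    if severity_degree > SEVERITY_THRESHOLD:
        message = {
            "object_id": object_id,
            "severity_degree": severity_degree,
            "note": "object needs a priority treatment",
        }
        notify_remote_surgery_device(message)
        return message
    return None

gate_by_severity(0.93, "P001")  # would trigger the priority notification
```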
  • the lesion information may include the severity degree of the one or more suspected lesions.
  • a tag associated with the severity degree may be assigned to the diagnosis result.
  • the one or more processors may transmit, via a network, the one or more images and/or the diagnosis result to a target remote device.
  • the target remote device may include a picture archiving and communication system (PACS) or a post-processing workstation.
  • the one or more processors may cause the one or more images and the diagnosis result to be displayed.
  • the one or more processors may transmit an instruction to a remote assistance device to direct the remote assistance device to arrange subsequent treatments associated with the object based on the diagnosis result.
  • the remote assistance device may include at least one of a hospital information system (HIS) , a laboratory information system (LIS) , a radiology information system (RIS) , a doctor workstation, or a nurse workstation.
  • the one or more processors may be configured to allow one or more user devices to access, via a network, the one or more processors to view the one or more images and the diagnosis result.
  • modification of at least one of a scan parameter of the one or more scans, a reconstruction parameter used to generate the one or more images, or the diagnosis result may be performed online, based on the one or more images and/or the diagnosis result, through a first user device of the one or more user devices.
  • the modification may be able to be viewed in real-time through a second user device of the one or more user devices.
  • one of the one or more user devices may relate to a doctor or an imaging technician.
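As a concrete illustration of this shared viewing, below is a minimal in-process observer-pattern sketch; the class and parameter names are hypothetical, and a real deployment would push updates to the user devices over the network (e.g., WebSockets) instead of local callbacks.

```python
# Hypothetical sketch: a modification made through one user device (e.g., the
# technician's) is broadcast so another device (e.g., the doctor's) sees it live.
from typing import Callable

class SharedSession:
    """Holds scan/reconstruction parameters shared by all connected user devices."""

    def __init__(self) -> None:
        self.parameters: dict[str, object] = {}
        self._subscribers: list[Callable[[str, object], None]] = []

    def subscribe(self, callback: Callable[[str, object], None]) -> None:
        self._subscribers.append(callback)

    def modify(self, name: str, value: object) -> None:
        """Apply an online modification and notify every subscribed device."""
        self.parameters[name] = value
        for notify in self._subscribers:
            notify(name, value)

session = SharedSession()
session.subscribe(lambda n, v: print(f"doctor's device sees {n} = {v}"))
session.modify("tube_voltage_kv", 120)  # technician edits a scan parameter
```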
  • a method may include one or more of the following operations.
  • One or more processors may cause a scanner to perform one or more scans on an object.
  • the one or more processors may obtain scan data of the object based on the one or more scans.
  • the one or more processors may generate one or more images of the object based on the scan data.
  • the one or more processors may generate, based on the one or more images, a diagnosis result.
  • the diagnosis result may at least indicate whether one or more suspected lesions exist in the object.
  • a system may include a control module configured to cause a scanner to perform one or more scans on an object.
  • the system may also include a reconstruction module configured to obtain scan data of the object based on the one or more scans.
  • the reconstruction module may also be configured to generate one or more images of the object based on the scan data.
  • the system may also include a diagnosis module configured to generate, based on the one or more images, a diagnosis result.
  • the diagnosis result may at least indicate whether one or more suspected lesions exist in the object.
  • a non-transitory computer readable medium may comprise at least one set of instructions.
  • the at least one set of instructions may be executed by one or more processors of a computing device.
  • the one or more processors may cause a scanner to perform one or more scans on an object.
  • the one or more processors may obtain scan data of the object based on the one or more scans.
  • the one or more processors may generate one or more images of the object based on the scan data.
  • the one or more processors may generate, based on the one or more images, a diagnosis result.
  • the diagnosis result may at least indicate whether one or more suspected lesions exist in the object.
  • FIG. 1 is a schematic diagram illustrating an exemplary imaging system according to some embodiments of the present disclosure
  • FIG. 2 is a schematic diagram illustrating hardware and/or software components of an exemplary computing device according to some embodiments of the present disclosure
  • FIG. 3 is a schematic diagram illustrating hardware and/or software components of an exemplary mobile device according to some embodiments of the present disclosure
  • FIG. 4 is a block diagram illustrating an exemplary processing device according to some embodiments of the present disclosure.
  • FIG. 5 is a flowchart illustrating an exemplary process for automatic diagnosis according to some embodiments of the present disclosure.
  • The terms “system,” “engine,” “unit,” “module,” and/or “block” used herein are one method to distinguish different components, elements, parts, sections, or assemblies of different levels in ascending order. However, the terms may be displaced by another expression if they achieve the same purpose.
  • The terms “module,” “unit,” or “block,” as used herein, refer to logic embodied in hardware or firmware, or to a collection of software instructions.
  • a module, a unit, or a block described herein may be implemented as software and/or hardware and may be stored in any type of non-transitory computer-readable medium or another storage device.
  • a software module/unit/block may be compiled and linked into an executable program. It will be appreciated that software modules can be callable from other modules/units/blocks or from themselves, and/or may be invoked in response to detected events or interrupts.
  • Software modules/units/blocks configured for execution on computing devices may be provided on a computer-readable medium, such as a compact disc, a digital video disc, a flash drive, a magnetic disc, or any other tangible medium, or as a digital download (and can be originally stored in a compressed or installable format that needs installation, decompression, or decryption prior to execution) .
  • Such software code may be stored, partially or fully, on a storage device of the executing computing device, for execution by the computing device.
  • Software instructions may be embedded in firmware, such as an EPROM.
  • hardware modules/units/blocks may be included in connected logic components, such as gates and flip-flops, and/or can be included in programmable units, such as programmable gate arrays or processors.
  • modules/units/blocks or computing device functionality described herein may be implemented as software modules/units/blocks, but may be represented in hardware or firmware.
  • the modules/units/blocks described herein refer to logical modules/units/blocks that may be combined with other modules/units/blocks or divided into sub-modules/sub-units/sub-blocks despite their physical organization or storage. The description may be applicable to a system, an engine, or a portion thereof.
  • image in the present disclosure is used to collectively refer to image data and/or images of various forms, including a two-dimensional (2D) image, a three-dimensional (3D) image, a four-dimensional (4D) image, etc.
  • The terms “first,” “second,” “third,” etc. are used to distinguish similar objects and do not denote a specific ranking of the objects.
  • the character “/” includes one of the associated terms.
  • the systems may include an imaging system.
  • the imaging system may include a single-modality system and/or a multi-modality system.
  • the term “modality” used herein broadly refers to an imaging or treatment method or technology that gathers, generates, processes, and/or analyzes imaging information of a subject or treats the subject.
  • the single-modality system may include, for example, a magnetic resonance imaging (MRI) system, a computed tomography (CT) system, a positron emission tomography (PET) system, a single photon emission computed tomography (SPECT) system, an X-ray system, an ultrasound system, etc.
  • the multi-modality system may include, for example, a positron emission tomography-magnetic resonance imaging (PET-MRI) system, a positron emission tomography-computed tomography (PET-CT) system, a single photon emission computed tomography-magnetic resonance imaging (SPECT-MRI) system, a single photon emission computed tomography-computed tomography (SPECT-CT) system, a magnetic resonance imaging-computed tomography (MRI-CT) system, etc.
  • the systems may include a treatment system.
  • the treatment system may include a treatment plan system (TPS), an image-guided radiotherapy (IGRT) system, etc.
  • the image-guided radiotherapy (IGRT) system may include a treatment device and an imaging device.
  • the treatment device may include a linear accelerator, a cyclotron, a synchrotron, etc., configured to perform a radiotherapy on a subject.
  • the treatment device may include an accelerator of species of particles including, for example, photons, electrons, protons, or heavy ions.
  • the imaging device may include a single-modality device and/or a multi-modality device.
  • the single-modality device may include, for example, a magnetic resonance (MR) scanner, a CT scanner, an X-ray scanner, a PET scanner, an SPECT scanner, an ultrasound scanner, etc.
  • the multi-modality device may include, for example, a PET-MRI scanner, a PET-CT scanner, an SPECT-MRI scanner, an SPECT-CT scanner, an MR-CT scanner, etc.
  • the subject may include a biological object and/or a non-biological object.
  • the biological subject may be a human being, an animal, a plant, or a specific portion, organ, and/or tissue thereof.
  • the subject may include a head, a neck, a thorax, a heart, a stomach, a blood vessel, a soft tissue, a tumor, a nodule, or the like, or any combination thereof.
  • the subject may be a man-made composition of organic and/or inorganic matters that are with or without life.
  • object or “subject” are used interchangeably in the present disclosure.
  • a representation of an object (e.g., a patient, a subject, or a portion thereof) in an image may be referred to as an object for brevity.
  • a representation of an organ or tissue (e.g., a heart, a liver, a lung) in an image may be referred to as an organ or tissue for brevity.
  • an image including a representation of an object may be referred to as an image of an object or an image including an object for brevity.
  • an operation performed on a representation of an object in an image may be referred to as an operation performed on an object for brevity.
  • a segmentation of a portion of an image including a representation of an organ or tissue from the image may be referred to as a segmentation of an organ or tissue for brevity.
  • An aspect of the present disclosure provides an automatic diagnosis system and method.
  • an examination terminal may be capable of performing automatic diagnosis based on the one or more generated images.
  • remote data transmission for diagnosis purposes may thus be omitted.
  • the reconstructed images may be archived to PACS together with the diagnosis result.
  • the operator of the examination terminal may view the diagnosis result directly on the display of the control console of the examination terminal, instead of accessing the PACS. According to the diagnosis result, the operator of the examination terminal may directly make a decision of subsequent treatment of the patient, so that the subsequent treatment may be performed on the patient (especially emergency patient) as soon as possible.
  • FIG. 1 is a schematic diagram illustrating an exemplary imaging system according to some embodiments of the present disclosure.
  • the imaging system 100 may include an imaging device 110, a processing device 120, a storage device 130, one or more terminals 140, and a network 150.
  • the components in the imaging system 100 may be connected in various ways.
  • the imaging device 110 may be connected to the processing device 120 through the network 150 or directly (as indicated by the bi-directional arrow in dotted lines linking the imaging device 110 and the processing device 120) as illustrated in FIG. 1.
  • the terminal (s) 140 may be connected to the processing device 120 via the network 150 or directly (as indicated by the bi-directional arrow in dotted lines linking the terminal (s) 140 and the processing device 120) as illustrated in FIG. 1.
  • the imaging device 110 may be configured to acquire imaging data (also referred to as scan data) relating to at least one part of an object.
  • the imaging device 110 may scan the object or a portion thereof that is located within its detection region and generate imaging data relating to the object or the portion thereof.
  • the imaging data relating to at least one part of an object may include an image (e.g., an image slice) , PET data (e.g., gamma photon information) , SPECT data (e.g., gamma photon information) , MR data (e.g., echo signals) , CT data (e.g., projection data) , or the like, or a combination thereof.
  • the imaging data may be two-dimensional (2D) imaging data, three-dimensional (3D) imaging data, four-dimensional (4D) imaging data, or the like, or any combination thereof.
  • the object may be biological or non-biological.
  • the object may include a patient, a man-made object, etc.
  • the object may include a specific portion, organ, and/or tissue of the patient.
  • the object may include the head, the neck, the thorax, the heart, the stomach, a blood vessel, soft tissue, a tumor, nodules, or the like, or any combination thereof.
  • the imaging device 110 may include a single-modality imaging device or a multi-modality device.
  • the single-modality device may include, for example, a magnetic resonance (MR) scanner, a CT scanner, an X-ray scanner, a PET scanner, an SPECT scanner, an ultrasound scanner, etc.
  • the multi-modality device may include, for example, a PET-MRI scanner, a PET-CT scanner, an SPECT-MRI scanner, an SPECT-CT scanner, an MR-CT scanner, etc.
  • the processing device 120 may process data and/or information obtained from the imaging device 110, the terminal (s) 140, and/or the storage device 130. For example, the processing device 120 may cause the imaging device 110 to perform one or more scans on an object. The processing device 120 may obtain scan data of the object based on the one or more scans. The processing device 120 may generate one or more images of the object based on the scan data. The processing device 120 may generate, based on the one or more images, a diagnosis result, the diagnosis result at least indicating lesion information associated with one or more suspected lesions of the object. The processing device 120 may transmit, via a network, the one or more images and the diagnosis result to a target remote device.
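The end-to-end flow described in the bullet above can be summarized in a short sketch. Every name below is an illustrative placeholder (the patent defines no such APIs), and toy logic stands in for real reconstruction and model inference.

```python
# Hypothetical sketch of the processing device 120 workflow:
# scan -> scan data -> image(s) -> diagnosis result -> transmission.

def reconstruct_images(scan_data: list[float]) -> list[list[float]]:
    """Stand-in for image reconstruction (e.g., filtered back-projection for CT)."""
    return [scan_data]  # toy: treat the raw data as one "image"

def generate_diagnosis(images: list[list[float]]) -> dict:
    """Stand-in for the machine-learning diagnosis step."""
    suspected = any(v > 0.5 for image in images for v in image)  # toy rule
    return {"suspected_lesions_exist": suspected}

def run_examination(scan_data: list[float]) -> tuple[list[list[float]], dict]:
    images = reconstruct_images(scan_data)   # generate one or more images
    diagnosis = generate_diagnosis(images)   # generate the diagnosis result
    # transmit(images, diagnosis)            # e.g., to a PACS via the network 150
    return images, diagnosis

print(run_examination([0.2, 0.7, 0.1]))  # -> (..., {'suspected_lesions_exist': True})
```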
  • the processing device 120 may be a computer, a user console, a single server or a server group, etc.
  • the server group may be centralized or distributed.
  • the processing device 120 may be local or remote.
  • the processing device 120 may access information and/or data stored in the imaging device 110, the terminal (s) 140, and/or the storage device 130 via the network 150.
  • the processing device 120 may be directly connected to the imaging device 110, the terminal (s) 140, and/or the storage device 130 to access stored information and/or data.
  • the processing device 120 may be implemented on a cloud platform.
  • the cloud platform may include a private cloud, a public cloud, a hybrid cloud, a community cloud, a distributed cloud, an inter-cloud, a multi-cloud, or the like, or any combination thereof.
  • the storage device 130 may store data, instructions, and/or any other information.
  • the storage device 130 may store data obtained from the terminal (s) 140 and/or the processing device 120.
  • the storage device 130 may store imaging data acquired by the imaging device 110.
  • the storage device 130 may store one or more images.
  • the storage device 130 may store a diagnosis result of one or more images.
  • the storage device 130 may store data and/or instructions that the processing device 120 may execute or use to perform exemplary methods/systems described in the present disclosure.
  • the storage device 130 may store data and/or instructions that the processing device 120 may execute or use to generate one or more images.
  • the storage device 130 may store data and/or instructions that the processing device 120 may execute or use to generate a diagnosis result.
  • the storage device 130 may include a mass storage device, a removable storage device, a volatile read-and-write memory, a read-only memory (ROM) , or the like, or any combination thereof.
  • Exemplary mass storage devices may include a magnetic disk, an optical disk, a solid-state drive, etc.
  • Exemplary removable storage devices may include a flash drive, a floppy disk, an optical disk, a memory card, a zip disk, a magnetic tape, etc.
  • Exemplary volatile read-and-write memories may include a random access memory (RAM) .
  • Exemplary RAM may include a dynamic RAM (DRAM), a double data rate synchronous dynamic RAM (DDR SDRAM), a static RAM (SRAM), a thyristor RAM (T-RAM), a zero-capacitor RAM (Z-RAM), etc.
  • Exemplary ROM may include a mask ROM (MROM) , a programmable ROM (PROM) , an erasable programmable ROM (EPROM) , an electrically erasable programmable ROM (EEPROM) , a compact disk ROM (CD-ROM) , and a digital versatile disk ROM, etc.
  • the storage device 130 may be implemented on a cloud platform.
  • the cloud platform may include a private cloud, a public cloud, a hybrid cloud, a community cloud, a distributed cloud, an inter-cloud, a multi-cloud, or the like, or any combination thereof.
  • the storage device 130 may be connected to the network 150 to communicate with one or more other components in the imaging system 100 (e.g., the processing device 120, the terminal (s) 140, etc. ) .
  • One or more components in the imaging system 100 may access the data or instructions stored in the storage device 130 via the network 150.
  • the storage device 130 may be directly connected to or communicate with one or more other components in the imaging system 100 (e.g., the processing device 120, the terminal (s) 140, etc. ) .
  • the storage device 130 may be part of the processing device 120.
  • a user may interact with the imaging system 100 through the terminal (s) 140.
  • a user may set one or more scan parameters (e.g., a scan protocol) through an interface of the terminal 140.
  • a diagnosis result or one or more images may be displayed on an interface of the terminal 140.
  • the user may perform one or more user operations (e.g., zooming in, zooming out, rotation, marking, storing, uploading, cropping, etc. ) with respect to the diagnosis result or the one or more images via the terminal 140.
  • the terminal (s) 140 may include a mobile device 140-1, a tablet computer 140-2, a laptop computer 140-3, or the like, or any combination thereof.
  • the mobile device 140-1 may include a smart home device, a wearable device, a mobile device, a virtual reality device, an augmented reality device, or the like, or any combination thereof.
  • the smart home device may include a smart lighting device, a control device of an intelligent electrical apparatus, a smart monitoring device, a smart television, a smart video camera, an interphone, or the like, or any combination thereof.
  • the wearable device may include a bracelet, a footgear, eyeglasses, a helmet, a watch, clothing, a backpack, a smart accessory, or the like, or any combination thereof.
  • the mobile device may include a mobile phone, a personal digital assistant (PDA) , a gaming device, a navigation device, a point of sale (POS) device, a laptop, a tablet computer, a desktop, or the like, or any combination thereof.
  • the virtual reality device and/or the augmented reality device may include a virtual reality helmet, virtual reality glasses, a virtual reality patch, an augmented reality helmet, augmented reality glasses, an augmented reality patch, or the like, or any combination thereof.
  • the virtual reality device and/or the augmented reality device may include a Google Glass™, an Oculus Rift™, a Hololens™, a Gear VR™, etc.
  • the terminal (s) 140 may be part of the processing device 120.
  • the terminal 140 may include a control console of the imaging system 100 and/or one or more user devices (e.g., a mobile device, a tablet computer, a laptop computer, etc. ) that establish connection (e.g., a wireless communication) and communicate with the processing device 120 via an application installed in the one or more user devices, respectively.
  • the network 150 may include any suitable network that can facilitate the exchange of information and/or data for the imaging system 100.
  • one or more components of the imaging device 110, the terminal (s) 140, the processing device 120, the storage device 130, etc. may communicate information and/or data with one or more other components of the imaging system 100 via the network 150.
  • the processing device 120 may obtain imaging data from the imaging device 110 via the network 150.
  • the processing device 120 may obtain user instructions from the terminal (s) 140 via the network 150.
  • the network 150 may be and/or include a public network (e.g., the Internet), a private network (e.g., a local area network (LAN), a wide area network (WAN)), a wired network (e.g., an Ethernet network), a wireless network (e.g., an 802.11 network, a Wi-Fi network), a cellular network (e.g., a Long Term Evolution (LTE) network), a frame relay network, a virtual private network (“VPN”), a satellite network, a telephone network, routers, hubs, switches, server computers, and/or any combination thereof.
  • the network 150 may include a cable network, a wireline network, a fiber-optic network, a telecommunications network, an intranet, a wireless local area network (WLAN), a metropolitan area network (MAN), a public telephone switched network (PSTN), a Bluetooth™ network, a ZigBee™ network, a near field communication (NFC) network, or the like, or any combination thereof.
  • the network 150 may include one or more network access points.
  • the network 150 may include wired and/or wireless network access points such as base stations and/or internet exchange points through which one or more components of the imaging system 100 may be connected to the network 150 to exchange data and/or information.
  • the imaging system 100 may communicate with one or more remote devices 160 via the network 150.
  • a remote device 160 may include one or more servers 161.
  • the imaging system 100 (e.g., the storage device 130, the processing device 120, and/or the terminal 140) may establish a communication (e.g., a wireless communication) with the remote device 160 via the network 150.
  • the processing device 120 may transmit one or more images and/or a diagnosis result corresponding to the one or more images to the remote device 160.
  • the processing device 120 may transmit a notification indicating that an object needs a priority treatment to remote device 160.
  • the processing device 120 may transmit an instruction to the remote device 160 to direct the remote device 160 to arrange subsequent treatments associated with an object based on the diagnosis result.
  • the remote device 160 may include a picture archiving and communication system (PACS) , a post-processing workstation, a remote surgery device, a remote assistance device, or the like, or any combination thereof.
  • the assistance device may include a hospital information system (HIS) , a laboratory information system (LIS) , a radiology information system (RIS) , a doctor workstation, a nurse workstation, or the like, or any combination thereof.
  • the imaging system 100 may include one or more additional components and/or one or more components of the imaging system 100 described above may be omitted. Additionally or alternatively, two or more components of the imaging system 100 may be integrated into a single component. A component of the imaging system 100 may be implemented on two or more sub-components.
  • FIG. 2 is a schematic diagram illustrating hardware and/or software components of an exemplary computing device according to some embodiments of the present disclosure.
  • the computing device 200 may be used to implement any component of the imaging system 100 as described herein.
  • the processing device 120 may be implemented on the computing device 200 via its hardware, software program, firmware, or a combination thereof. Although only one such computing device is shown, for convenience, the computer functions relating to the imaging system 100 as described herein may be implemented in a distributed fashion on a number of similar platforms, to distribute the processing load.
  • the computing device 200 may include a processor 210, a storage 220, an input/output (I/O) 230, and a communication port 240.
  • the processor 210 may execute computer instructions (program codes) and perform functions of the processing device 120 in accordance with techniques described herein.
  • the computer instructions may include, for example, routines, programs, objects, components, signals, data structures, procedures, modules, and functions, which perform particular functions described herein.
  • the processor 210 may perform instructions obtained from the terminal (s) 140.
  • the processor 210 may include one or more hardware processors, such as a microcontroller, a microprocessor, a reduced instruction set computer (RISC), an application-specific integrated circuit (ASIC), an application-specific instruction-set processor (ASIP), a central processing unit (CPU), a graphics processing unit (GPU), a physics processing unit (PPU), a microcontroller unit, a digital signal processor (DSP), a field-programmable gate array (FPGA), an advanced RISC machine (ARM), a programmable logic device (PLD), any circuit or processor capable of executing one or more functions, or the like, or any combination thereof.
  • the computing device 200 in the present disclosure may also include multiple processors.
  • operations and/or method steps that are performed by one processor as described in the present disclosure may also be jointly or separately performed by the multiple processors.
  • if the processor of the computing device 200 executes both operation A and operation B, it should be understood that operation A and operation B may also be performed by two or more different processors jointly or separately in the computing device 200 (e.g., a first processor executes operation A and a second processor executes operation B, or the first and second processors jointly execute operations A and B).
  • the storage 220 may store data/information obtained from the imaging device 110, the terminal (s) 140, the storage device 130, or any other component of the imaging system 100.
  • the storage 220 may include a mass storage device, a removable storage device, a volatile read-and-write memory, a read-only memory (ROM) , or the like, or any combination thereof.
  • the storage 220 may store one or more programs and/or instructions to perform exemplary methods described in the present disclosure.
  • the I/O 230 may input or output signals, data, and/or information. In some embodiments, the I/O 230 may enable user interaction with the processing device 120. In some embodiments, the I/O 230 may include an input device and an output device. Exemplary input devices may include a keyboard, a mouse, a touch screen, a microphone, or the like, or a combination thereof. Exemplary output devices may include a display device, a loudspeaker, a printer, a projector, or the like, or a combination thereof.
  • Exemplary display devices may include a liquid crystal display (LCD) , a light-emitting diode (LED) -based display, a flat panel display, a curved screen, a television device, a cathode ray tube (CRT) , or the like, or a combination thereof.
  • the communication port 240 may be connected with a network (e.g., the network 150) to facilitate data communications.
  • the communication port 240 may establish connections between the processing device 120 and one or more components (e.g., the imaging device 110, the terminal (s) 140, or the storage device 130) of the imaging system 100, and/or connections between the processing device 120 and one or more external devices (e.g., a database, an external storage, and an image/data processing work station, etc. ) .
  • the connection may be a wired connection, a wireless connection, or a combination of both that enables data transmission and reception.
  • the wired connection may include an electrical cable, an optical cable, a telephone wire, or the like, or any combination thereof.
  • the wireless connection may include a Bluetooth network, a Wi-Fi network, a WiMax network, a WLAN, a ZigBee network, a mobile network (e.g., 3G, 4G, 5G, etc. ) , or the like, or any combination thereof.
  • the communication port 240 may be a standardized communication port, such as RS232, RS485, etc.
  • the communication port 240 may be a specially designed communication port.
  • the communication port 240 may be designed in accordance with the digital imaging and communications in medicine (DICOM) protocol.
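As a concrete illustration, the sketch below sends a reconstructed image over such a DICOM port as a C-STORE request, assuming the open-source pydicom/pynetdicom libraries; the host, port, AE titles, and file name are hypothetical.

```python
# Hypothetical sketch of pushing an image over a DICOM communication port,
# using the pynetdicom library. Host, port, AE titles, and file are placeholders.
from pydicom import dcmread
from pynetdicom import AE
from pynetdicom.sop_class import CTImageStorage

ae = AE(ae_title="SCANNER01")
ae.add_requested_context(CTImageStorage)

assoc = ae.associate("pacs.example.org", 11112, ae_title="PACS")
if assoc.is_established:
    ds = dcmread("reconstructed_slice.dcm")   # a reconstructed image
    status = assoc.send_c_store(ds)           # C-STORE: archive the image
    print("C-STORE status:", status.Status if status else "failed")
    assoc.release()
```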
  • the components of the computing device 200 (e.g., the processor 210, the storage 220, the I/O 230, or the communication port 240) may be coupled to one another via a bus. The bus may include hardware, software, or the like, or any combination thereof, to couple the components of the computing device 200.
  • the bus may include at least one of a data bus, an address bus, an expansion bus, or a local bus.
  • the bus may include a graphics bus (e.g., an accelerated graphics port (AGP) ) , an extended industry standard architecture (EISA) bus, a front side bus (FSB) , a hyper transport (HT) interconnect, an industry standard architecture (ISA) bus, an infiniband interconnect, a low pin count (LPC) bus, a storage bus, a micro channel architecture (MCA) bus, a peripheral component interconnect (PCI) bus, a PCI-express (PCI-X) bus, a serial advanced technology attachment (SATA) bus, a video electronics standards association local bus (VLB) , or the like, or any combination thereof.
  • the bus may include one or more buses.
  • the bus may include any type of bus or interconnect, which is not limited herein.
  • FIG. 3 is a schematic diagram illustrating hardware and/or software components of an exemplary mobile device according to some embodiments of the present disclosure.
  • one or more components (e.g., a terminal 140) of the imaging system 100 may be implemented on the mobile device 300.
  • the mobile device 300 may include a communication unit 310, a display unit 320, a graphics processing unit (GPU) 330, a central processing unit (CPU) 340, an I/O 350, a memory 360, and a storage unit 370.
  • any other suitable component, including but not limited to a system bus or a controller (not shown), may also be included in the mobile device 300.
  • a mobile operating system 361 (e.g., iOS, Android, Windows Phone, etc.) and one or more applications 362 may be loaded into the memory 360 from the storage unit 370 in order to be executed by the CPU 340.
  • the applications 362 may include a browser or any other suitable mobile apps for receiving and rendering information relating to image processing or other information from the processing device 120.
  • User interactions with the information stream may be achieved via the I/O 350 and provided to the processing device 120 and/or other components of the imaging system 100 via the network 150.
  • computer hardware platforms may be used as the hardware platform (s) for one or more of the elements described herein.
  • the hardware elements, operating systems and programming languages of such computers are conventional in nature, and it is presumed that those skilled in the art are adequately familiar therewith to adapt those technologies to generate an image as described herein.
  • a computer with user interface elements may be used to implement a personal computer (PC) or another type of work station or terminal device, although a computer may also act as a server if appropriately programmed. It is believed that those skilled in the art are familiar with the structure, programming and general operation of such computer equipment and as a result, the drawings should be self-explanatory.
  • FIG. 4 is a block diagram illustrating an exemplary computing device according to some embodiments of the present disclosure.
  • the processing device 120 and/or the terminal 140 may be implemented on the computing device 400, respectively, via its hardware, software program, firmware, or a combination thereof.
  • the computing device 400 may include a control module 410, a reconstruction module 420, a diagnosis module 430, and an input/output (I/O) module 440.
  • the control module 410 may cause a scanner (e.g., the imaging device 110 of the imaging system 100 in FIG. 1) to perform one or more scans on an object (e.g., one or more regions of interest (ROIs) of a patient).
  • the reconstruction module 420 may obtain scan data of the object based on the one or more scans.
  • the reconstruction module 420 may generate one or more images of the object based on the scan data.
  • the diagnosis module 430 may generate, based on the one or more images, a diagnosis result.
  • the diagnosis result may at least indicate whether one or more suspected lesions exist in the object.
  • the diagnosis result may further indicate lesion information of the one or more suspected lesions of the object.
  • the lesion information may include basic information of the object, a determination indicating a possibility that one or more suspected lesions exist in the object, a type (e.g., tumor, pulmonary nodule, hemorrhage point, etc.) of the one or more suspected lesions, a type (e.g., cancer, infectious disease such as 2019 Novel Coronavirus (2019-nCoV), cerebral hemorrhage, etc.) of the disease related to the one or more suspected lesions, a size of the one or more suspected lesions, an outline of the one or more suspected lesions, a location of the one or more suspected lesions, a characteristic of the one or more suspected lesions, a severity degree of the one or more suspected lesions, a tag assigned to the diagnosis result corresponding to the severity degree, a suggestion of subsequent treatment of the one or more suspected lesions, or the like, or any combination thereof.
  • the basic information of the object may include the name, age, gender, occupation, medical history, identity number, contact information (e.g., phone number, email address, etc. ) , a portrait photo of the object, or the like, or any combination thereof.
  • the characteristic of the one or more suspected lesions may include morphology of the one or more suspected lesions, symptom of the one or more suspected lesions, physiological information (e.g., a tissue density, etc. ) of the one or more suspected lesions, etc.
  • the suggestion of subsequent treatment may include rescanning at least a portion of the object, performing one or more other scans such as an enhancement scan, a perfusion scan, etc. using the imaging device 110, performing other physical examinations, sending directly to the operating room, determining which doctor will receive the patient, a message indicating that the object needs a priority treatment based on the severity degree of the one or more suspected lesions, or the like, or any combination thereof.
  • the message indicating that the object needs a priority treatment may correspond to a severity degree of the one or more suspected lesions that is greater than a threshold.
  • the diagnosis module 430 may determine whether the severity degree of the one or more suspected lesions is greater than a threshold. In response to determining that the severity degree of the one or more suspected lesions is greater than the threshold, the diagnosis module 430 may generate a message indicating that the object needs a priority treatment. Alternatively or additionally, the diagnosis module 430 may generate a suggestion of subsequent treatment of the one or more suspected lesions and directly send, based on the suggestion of subsequent treatment, the message to the remote assistance device to make nurses and doctors prepare for the subsequent treatment (e.g., surgery) as soon as possible.
  • the one or more generated images and/or the diagnosis result may be displayed in the terminal 140 (e.g., the display unit 320) .
  • the diagnosis result may be displayed in form of text, picture, voice, video, or the like, or any combination thereof.
  • the one or more suspected lesions may be highlighted in the one or more images.
  • the terminal 140 may present the message indicating that the object needs a priority treatment by voice broadcasting, initiating an alarm bell and/or an alarm light, or the like.
  • the user may perform one or more user operations (e.g., zooming in, zooming out, rotation, marking, storing, uploading, cropping, etc. ) with respect to the diagnosis result or the one or more images via the terminal 140.
  • the I/O module 440 may transmit, via a network (e.g., the network 150) , the one or more images and the diagnosis result to a target remote device (e.g., the remote device 160) .
  • the diagnosis module 430 may generate, for the diagnosis result, a tag corresponding to the severity degree of the object.
  • the diagnosis module 430 may identify the severity degree from the diagnosis result by text recognition and generate, for the diagnosis result, a tag corresponding to the identified severity degree.
  • the tag assigned to the diagnosis result corresponding to the severity degree may cause the PACS server to rank the diagnosis result based on the severity degree, so that the diagnosis results to be viewed in the PACS may be ranked in descending order. In this way, the diagnosis result corresponding to a relatively high severity degree is given a priority to be viewed and/or treated by a doctor.
  • the diagnosis module 430 may generate a diagnosis result including a severity degree and a tag corresponding to the severity degree.
  • the diagnosis results may be transmitted to the PACS and ranked in descending order of severity degree in the PACS, so that a doctor may determine which patient should be treated first according to the ranking order of the diagnosis results.
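A minimal sketch of this tagging-and-ranking behavior follows, assuming a numeric severity tag; the data structure and field names are illustrative and not taken from any particular PACS.

```python
# Hypothetical sketch of ranking archived diagnosis results by severity tag,
# so the most severe cases are listed (and viewed) first. Names are illustrative.
from dataclasses import dataclass

@dataclass
class DiagnosisResult:
    patient_id: str
    summary: str
    severity_tag: int  # higher value = more severe

worklist = [
    DiagnosisResult("P001", "small pulmonary nodule", severity_tag=2),
    DiagnosisResult("P002", "suspected cerebral hemorrhage", severity_tag=9),
    DiagnosisResult("P003", "no suspected lesion", severity_tag=0),
]

# Rank in descending order of severity, mimicking the PACS viewing order.
for result in sorted(worklist, key=lambda r: r.severity_tag, reverse=True):
    print(result.patient_id, result.severity_tag, result.summary)
```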
  • the I/O module 440 may transmit an instruction to a remote assistance device to direct the remote assistance device to arrange subsequent treatments of the object based on the diagnosis result.
  • the operator of the imaging system 100 may immediately contact a corresponding doctor and prompt the doctor to diagnose and treat the patient as soon as possible.
  • the I/O module 440 may automatically send a notification indicating that the patient needs a priority treatment to the doctor workstation, prompting a corresponding doctor to diagnose and treat the patient as soon as possible.
  • the operator of the imaging system 100 may immediately contact corresponding doctors and nurses, and make an appointment of a surgery room.
  • the I/O module 440 may automatically transmit a notification indicating that surgery needs to be performed on the patient as soon as possible to the doctor workstation, the nurse workstation, and the remote surgery device, prompting the corresponding doctors and nurses to prepare for surgery as soon as possible.
  • the operator of the imaging system 100 may make an appointment of the suggested examination.
  • the I/O module 440 may automatically make an appointment of the suggested examination by accessing, e.g., a hospital information system (HIS) , a laboratory information system (LIS) , or a radiology information system (RIS) .
  • each of the modules described above may be a hardware circuit that is designed to perform certain actions, e.g., according to a set of instructions stored in one or more storage media, and/or any combination of the hardware circuit and the one or more storage media.
  • the computing device 400 may include one or more additional modules, such as a storage module (not shown) for storing data.
  • one or more modules of the computing device 400 may be omitted.
  • two or more modules of the computing device 400 may be integrated into a single module.
  • FIG. 5 is a flowchart illustrating an exemplary process for automatic diagnosis according to some embodiments of the present disclosure.
  • process 500 may be implemented as a set of instructions (e.g., an application) stored in a storage device (e.g., the storage device 130, the storage 220, etc. ) .
  • the processing device 120 (e.g., the processor 210 and/or the one or more modules illustrated in FIG. 4) may execute the set of instructions and, accordingly, may be directed to perform the process 500.
  • the operations of the illustrated process presented below are intended to be illustrative.
  • the process 500 may be accomplished with one or more additional operations not described and/or without one or more of the operations discussed. Additionally, the order of the operations of process 500 illustrated in FIG. 5 and described below is not intended to be limiting.
  • the processing device 120 may cause a scanner (e.g., the imaging device 110 of the imaging system 100 in FIG. 1) to perform one or more scans on an object (e.g., one or more regions of interest (ROIs) of a patient).
  • a user may set scan parameters of the one or more scans through the terminal 140.
  • the user may set a position of a component (e.g., a scanning table, a detector, an X-ray tube, a gantry) of the imaging device 110, a collimator aperture, an X-ray tube voltage and/or current, a moving speed of the scanning table, a gantry rotation speed, a field of view (FOV) , a scan time, or the like.
  • the terminal 140 may transmit the scan parameters to the processing device 120.
  • the processing device 120 may cause the imaging device 110 to perform one or more scans on the object based on the scan parameters.
  • the processing device 120 may obtain scan data of the object based on the one or more scans.
  • the scan data (also referred to as imaging data) of the object may include an image (e.g., an image slice) , PET data (e.g., gamma photon information) , SPECT data (e.g., gamma photon information) , MR data (e.g., echo signals) , CT data (e.g., projection data) , or the like, or any combination thereof.
  • the scan data may be two-dimensional (2D) scan data, three-dimensional (3D) scan data, four-dimensional (4D) scan data, or the like, or any combination thereof.
  • the processing device 120 may obtain the scan data from the imaging device 110.
  • the scan data may be acquired by the imaging device 110 and stored in a storage device (e.g., the storage device 130, the storage 220, the storage unit 370, or an external source).
  • the processing device 120 may retrieve the scan data from the storage device.
  • the processing device 120 may generate one or more images of the object based on the scan data.
  • the user may input a reconstruction protocol through the terminal 140.
  • the terminal 140 may transmit the reconstruction protocol to the processing device 120.
  • the processing device 120 may generate the one or more images of the object by reconstructing the scan data based on the reconstruction protocol.
  • the reconstruction protocol may include a reconstruction algorithm, an image processing algorithm (e.g., an image segmentation algorithm, an image fusion algorithm, an image registration algorithm, an image rendering algorithm, a multi-planar reconstruction algorithm, a curved planar reconstruction algorithm) , etc.
  • the one or more images may include a 2D image, a 3D image, a 4D image (e.g., a series of 2D or 3D images in time order) , etc.
  • the one or more images may include a PET image, an SPECT image, an MR image, a CT image, or the like, or any combination thereof.
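For CT data in particular, the reconstruction step can be illustrated with filtered back-projection. The sketch below assumes a recent scikit-image (for the `filter_name` argument) and synthesizes a sinogram from a test phantom in place of real scan data.

```python
# Hypothetical CT reconstruction sketch: filtered back-projection with scikit-image.
# The phantom and sinogram are synthetic stand-ins for real scan data.
import numpy as np
from skimage.data import shepp_logan_phantom
from skimage.transform import radon, iradon, rescale

phantom = rescale(shepp_logan_phantom(), 0.5)        # toy "object"
theta = np.linspace(0.0, 180.0, max(phantom.shape))  # projection angles (degrees)
sinogram = radon(phantom, theta=theta)               # stand-in for CT scan data

# Reconstruct one image from the scan data (ramp-filtered back-projection).
image = iradon(sinogram, theta=theta, filter_name="ramp")
print(image.shape)
```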
  • the processing device 120 may generate, based on the one or more images, a diagnosis result.
  • the diagnosis result may at least indicate whether one or more suspected lesions exist in the object.
  • the diagnosis result may further indicate lesion information of the one or more suspected lesions of the object.
  • the processing device 120 may automatically generate the diagnosis result by analyzing the one or more images using existing automatic diagnosis technology. For example, the processing device 120 may obtain one or more trained diagnosis models. The processing device 120 may input the one or more images into the one or more trained diagnosis models. The one or more trained diagnosis models may segment the one or more images into one or more regions of interest (ROIs) and generate the diagnosis result based on the one or more ROIs. As another example, the user may manually draw the one or more ROIs of the one or more images through the terminal 140. The processing device 120 may obtain one or more trained diagnosis models. The processing device 120 may input the one or more images with the one or more ROIs into the one or more trained diagnosis models. The one or more trained diagnosis models may generate the diagnosis result based on the one or more ROIs.
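As an illustration of this model-based step, the sketch below assumes PyTorch and two hypothetical TorchScript files (a segmentation model and a lesion-type classifier); the file names, threshold, and label set are assumptions, not artifacts from the patent.

```python
# Hypothetical sketch of automatic diagnosis with trained models (PyTorch).
# The model files, threshold, and label set are illustrative assumptions.
import torch

seg_model = torch.jit.load("lesion_seg.pt").eval()   # segments candidate ROIs
cls_model = torch.jit.load("lesion_cls.pt").eval()   # classifies lesion type

LESION_TYPES = ["none", "tumor", "pulmonary nodule", "hemorrhage point"]

def diagnose(image: torch.Tensor) -> dict:
    """image: (1, 1, H, W) normalized slice; returns a simple diagnosis result."""
    with torch.no_grad():
        roi_mask = seg_model(image).sigmoid() > 0.5          # suspected-lesion ROIs
        probs = cls_model(image * roi_mask).softmax(dim=1)[0]
    return {
        "suspected_lesions_exist": bool(roi_mask.any()),
        "lesion_type": LESION_TYPES[int(probs.argmax())],
        "confidence": float(probs.max()),
    }
```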
  • a lesion refers to a pathological change in the morphology and function of human tissues and organs, such as changes caused by the invasion of bacteria and viruses (e.g., an inflammatory infection, an ulcer, a tumor, etc.).
  • a suspected lesion refers to at least a portion of the one or more ROIs that is segmented from the one or more images and deemed as a lesion by the automatic diagnosis (e.g., the one or more trained diagnosis models) .
  • the lesion information may include basic information of the object, a determination indicating a possibility that one or more suspected lesions exist in the object, a type (e.g., tumor, pulmonary nodule, hemorrhage point, etc.) of the one or more suspected lesions, a type (e.g., cancer, infectious disease such as 2019 Novel Coronavirus (2019-nCoV), cerebral hemorrhage, etc.) of the disease related to the one or more suspected lesions, a size of the one or more suspected lesions, an outline of the one or more suspected lesions, a location of the one or more suspected lesions, a characteristic of the one or more suspected lesions, a severity degree of the one or more suspected lesions, a tag assigned to the diagnosis result corresponding to the severity degree, a suggestion of subsequent treatment of the one or more suspected lesions, or the like, or any combination thereof.
  • the basic information of the object may include the name, age, gender, occupation, medical history, identity number, contact information (e.g., phone number, email address, etc. ) , a portrait photo of the object, or the like, or any combination thereof.
  • the characteristic of the one or more suspected lesions may include the morphology of the one or more suspected lesions, one or more symptoms of the one or more suspected lesions, physiological information (e.g., a tissue density, etc.) of the one or more suspected lesions, etc.
  • the type of the one or more suspected lesions, the type of the disease related to the one or more suspected lesions, and/or the severity degree of the one or more suspected lesions may be set through the terminal 140.
  • the suggestion of subsequent treatment may include rescanning at least a portion of the object, performing one or more other scans (e.g., an enhancement scan, a perfusion scan, etc.) using the imaging device 110, performing other physical examinations, sending the object directly to the operating room, determining which doctor will receive the patient, a message indicating that the object needs a priority treatment based on the severity degree of the one or more suspected lesions, or the like, or any combination thereof.
  • the message indicating that the object needs a priority treatment may correspond to a severity degree of the one or more suspected lesions that is greater than a threshold.
  • the severity degree of the one or more suspected lesions may be represented in various forms.
  • the severity degree may be represented as a score, and a greater score may indicate that the suspected lesion is more serious.
  • a mild case of the 2019-nCoV may be assigned a severity degree with a smaller score than a critical case of the 2019-nCoV.
  • as another example, a disease (e.g., cerebral hemorrhage) that needs to be treated in time may be assigned a relatively high score (e.g., the highest score).
  • the processing device 120 may determine whether the severity degree of the one or more suspected lesions is greater than a threshold. In response to determining that the severity degree of the one or more suspected lesions is greater than the threshold, the processing device 120 may generate a message indicating that the object needs a priority treatment. Alternatively or additionally, the processing device 120 may generate a suggestion of subsequent treatment of the one or more suspected lesions and directly send, based on the suggestion of subsequent treatment, the message to the remote assistance device so that nurses and doctors can prepare for the subsequent treatment (e.g., surgery) as soon as possible.
  • the threshold may be set manually by a user based on experience. In some embodiments, the threshold may be set based on the default settings of the imaging system 100. In some embodiments, the threshold may be automatically determined by the processing device 120 using, for example, a machine learning model. For example, if the total score is 10, the threshold may be set as 5, 6, 7, 8, or 9.
  • the processing device 120 may determine, by analyzing the one or more images of the object, that one or more hemorrhage points exist in the brain of the object and that the disease of the object is cerebral hemorrhage, which requires surgery as soon as possible.
  • the processing device 120 may assign the highest severity score of 10 to the object.
  • the processing device 120 may determine that the severity degree of the object is greater than the threshold of 7.
  • the processing device 120 may generate a message indicating that the object needs a priority treatment and send the message to the remote assistance device so that nurses and doctors can prepare for the surgery as soon as possible. A minimal sketch of this check follows.
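The following minimal sketch, assuming a 0-10 score and dictionary-shaped diagnosis results (both assumptions of this sketch, not the disclosed data model), shows how such a threshold check and priority message could fit together.

```python
# Sketch (assumed field names and a 0-10 scale) of the severity
# threshold check and priority message described above.
from typing import Callable, Optional

SEVERITY_THRESHOLD = 7  # could instead come from user settings or a model

def check_priority(diagnosis_result: dict,
                   notify: Callable[[dict], None]) -> Optional[dict]:
    if diagnosis_result["severity_degree"] > SEVERITY_THRESHOLD:
        message = {
            "kind": "priority_treatment",
            "severity_degree": diagnosis_result["severity_degree"],
            "suggestion": diagnosis_result.get("suggestion"),
        }
        notify(message)  # e.g., push to the remote assistance device
        return message
    return None

# Example: a cerebral hemorrhage case scored 10 exceeds the threshold of 7.
check_priority({"severity_degree": 10, "suggestion": "surgery"}, print)
```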
  • the one or more generated images and/or the diagnosis result may be displayed in the terminal 140 (e.g., the display unit 320) .
  • the diagnosis result may be presented in the form of text, picture, voice, video, or the like, or any combination thereof.
  • the one or more suspected lesions may be highlighted in the one or more images.
  • the terminal 140 may present the message indicating that the object needs a priority treatment by voice broadcasting, initiating an alarm bell and/or an alarm light, or the like.
  • the user may perform one or more user operations (e.g., zooming in, zooming out, rotation, marking, storing, uploading, cropping, etc. ) with respect to the diagnosis result or the one or more images via the terminal 140.
  • the processing device 120 may transmit, via a network (e.g., the network 150) , the one or more images and the diagnosis result to a target remote device (e.g., the remote device 160) .
  • the remote device 160 may include a picture archiving and communication system (PACS) , a post-processing workstation, a remote surgery device, a remote assistance device, or the like, or any combination thereof.
  • the remote assistance device may include a hospital information system (HIS), a laboratory information system (LIS), a radiology information system (RIS), a doctor workstation, a nurse workstation, or the like, or any combination thereof.
  • the one or more images and/or the diagnosis result may be archived in the PACS.
  • a PACS application may be installed on the terminal 140, and a user may upload, from, e.g., the storage device 130, the processing device 120, or the terminal 140, the one or more images and the diagnosis result to the PACS server (e.g., the server 161) through the PACS application via the network 150.
  • Other users may access the PACS server through the PACS application installed on their personal terminals to browse and/or download the one or more images and the diagnosis result.
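The disclosure does not fix a particular transport for this upload. As one common possibility (an assumption of this sketch, not the disclosed mechanism), a DICOM C-STORE to the PACS server using the open-source pydicom/pynetdicom libraries could look roughly as follows; the host, port, AE titles, and file name are placeholders.

```python
# One possible (not the disclosed) upload path: DICOM C-STORE to a PACS
# via pydicom/pynetdicom. Host, port, AE titles, and file are placeholders.
from pydicom import dcmread
from pynetdicom import AE
from pynetdicom.sop_class import CTImageStorage

ae = AE(ae_title="TERMINAL140")
ae.add_requested_context(CTImageStorage)

assoc = ae.associate("pacs.example.org", 11112, ae_title="PACSSERVER")
if assoc.is_established:
    ds = dcmread("reconstructed_slice.dcm")  # one of the generated images
    status = assoc.send_c_store(ds)          # status 0x0000 indicates success
    assoc.release()
```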
  • the processing device 120 may generate, for the diagnosis result, a tag corresponding to the severity degree of the object.
  • the processing device 120 may identify the severity degree from the diagnosis result by text recognition and generate, for the diagnosis result, a tag corresponding to the identified severity degree of the object.
  • the tag assigned to the diagnosis result corresponding to the severity degree may cause the PACS server to rank the diagnosis results based on the severity degree, so that the diagnosis results to be viewed in the PACS are ranked in descending order. In this way, a diagnosis result corresponding to a relatively high severity degree is given priority to be viewed and/or treated by a doctor.
  • the processing device 120 may generate a diagnosis result including a severity degree and a tag corresponding to the severity degree.
  • the diagnosis results may be transmitted to the PACS and ranked in descending order of severity degree in the PACS, so that a doctor may determine which patient should be treated first according to the ranking order of the diagnosis results.
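As a rough illustration of the severity tagging and descending ranking just described, consider the sketch below; the score bands, tag labels, and field names are assumptions of this sketch rather than the actual PACS schema.

```python
# Illustrative tagging and ranking of diagnosis results by severity.
# The score bands and tag labels are assumptions of this sketch.
def tag_for(severity: int) -> str:
    if severity >= 8:
        return "critical"
    if severity >= 4:
        return "moderate"
    return "mild"

def rank_for_viewing(results: list) -> list:
    for r in results:
        r["tag"] = tag_for(r["severity_degree"])
    # Descending order: the most severe cases are viewed/treated first.
    return sorted(results, key=lambda r: r["severity_degree"], reverse=True)

worklist = rank_for_viewing([
    {"patient": "A", "severity_degree": 3},
    {"patient": "B", "severity_degree": 10},
    {"patient": "C", "severity_degree": 6},
])
print([r["patient"] for r in worklist])  # ['B', 'C', 'A']
```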
  • the processing device 120 may transmit an instruction to a remote assistance device to direct the remote assistance device to arrange subsequent treatments of the object based on the diagnosis result.
  • the operator of the imaging system 100 may immediately contact a corresponding doctor and prompt the doctor to diagnose and treat the patient as soon as possible.
  • the processing device 120 may automatically send a notification indicating that the patient needs a priority treatment to the doctor workstation, prompting a corresponding doctor to diagnose and treat the patient as soon as possible.
  • the operator of the imaging system 100 may immediately contact corresponding doctors and nurses, and make an appointment for a surgery room.
  • the processing device 120 may automatically transmit a notification indicating that surgery needs to be performed on the patient as soon as possible to the doctor workstation, the nurse workstation, and the remote surgery device, prompting the corresponding doctors and nurses to prepare for surgery as soon as possible.
  • the operator of the imaging system 100 may make an appointment for the suggested examination.
  • the processing device 120 may automatically make an appointment for the suggested examination by accessing, e.g., a hospital information system (HIS), a laboratory information system (LIS), or a radiology information system (RIS).
  • one or more user devices may establish connection (e.g., a wireless communication) and communicate with the processing device 120 via an application installed in the one or more user devices, respectively.
  • the one or more user devices may directly access the processing device 120 to view the one or more images and/or the diagnosis result, instead of accessing the PACS, in which case the one or more images and/or the diagnosis result would first need to be transmitted to the PACS from the imaging system 100. In this way, by omitting the data transmission to the PACS, a doctor may view or make the diagnosis result of a patient (especially an emergency patient) as soon as possible.
  • a user may online modify at least one of a scan parameter of the one or more scans, a reconstruction parameter used to generate the one or more images, and the diagnosis result through her/his user device.
  • the user may include a doctor or an imaging technician.
  • a second imaging technician at a distance away from the imaging system 100 may modify the scan parameter and/or the reconstruction parameter through his/her user device.
  • the processing device 120 may perform a scan on the patient based on the modified scan parameter, and/or generate one or more images of the patient based on the modified reconstruction parameter.
  • an imaging technician at a distance away from the imaging system 100 may access the processing device 120 to view the one or more images and/or the diagnosis result.
  • the imaging technician may modify, based on the one or more images (e.g., the image quality of the one or more images) and/or the diagnosis result, a scan parameter of the first scan and/or a reconstruction parameter used to generate the one or more images through his/her user device.
  • the processing device 120 may perform a second scan on the patient based on the modified scan parameter, and/or perform image reconstruction based on scan data acquired in the first scan or the second scan and the modified reconstruction parameter. For example, if the imaging technician is not satisfied with the quality of the one or more images, the imaging technician may modify a scan parameter of the scan and/or a reconstruction parameter used to generate the one or more images through his/her user device to improve the image quality.
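A hedged sketch of this first-scan/second-scan loop is given below; the device interface, the quality check, and the remote revision hook are hypothetical names introduced only for illustration.

```python
# Hedged sketch of the first-scan / second-scan loop described above.
# The device interface, quality check, and revision hook are hypothetical.
from typing import Any, Callable

def scan_until_acceptable(device: Any,
                          scan_params: dict,
                          recon_params: dict,
                          quality_ok: Callable[[Any], bool],
                          revise: Callable[[dict, dict], tuple]) -> Any:
    scan_data = device.scan(scan_params)              # first scan
    images = device.reconstruct(scan_data, recon_params)
    if not quality_ok(images):
        # A remote technician modifies the parameters through a user device.
        scan_params, recon_params = revise(scan_params, recon_params)
        scan_data = device.scan(scan_params)          # second scan
        images = device.reconstruct(scan_data, recon_params)
    return images
```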
  • a doctor may modify the diagnosis result through her/his user device, such as modifying or deleting the suggestion of subsequent treatment in the diagnosis result, adding a new suggestion of subsequent treatment to the diagnosis result, or modifying the severity degree in the diagnosis result.
  • when a first user modifies at least one of a scan parameter of the one or more scans, a reconstruction parameter used to generate the one or more images, or the diagnosis result through her/his user device (also referred to as a first user device), a second user that accesses the processing device 120 through a second user device during the same time period is able to view the real-time update (synchronized with the modification operation of the first user) of the scan parameter, the reconstruction parameter, and/or the diagnosis result.
  • the processing device 120 may transmit a message indicating that the patient needs a priority operation to a remote surgery device and/or a remote assistance device.
  • doctors responsible for the patient may directly access the processing device 120 through their user devices (e.g., their smartphones, tablet computers, laptop computers, etc. ) to view the patient’s image and diagnosis result.
  • the modification of the suggestion of subsequent treatment may be provided in real-time to the user devices of the other doctors that are viewing the diagnosis result by accessing the processing device 120, which may help the other doctors know the updated treatment and prepare for the operation in time.
  • doctors having less experience in making a diagnosis and/or imaging technicians may access the processing device 120 to view the real-time modification operation on the diagnosis result, so as to learn diagnosis experience online.
  • an imaging technician having less operational experience with the imaging system 100 may access the processing device 120 to view the real-time modification operation on scan parameters and/or reconstruction parameters, so as to learn online how to generate images with high quality. A minimal synchronization sketch follows.
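One way the real-time synchronization described above could be organized is a simple publish/subscribe pattern, sketched below; the session object and the `receive` callback are assumptions of this sketch, not the disclosed mechanism.

```python
# Minimal publish/subscribe sketch of the real-time update: an edit by
# the first user's device is pushed to every other attached viewer.
class StudySession:
    def __init__(self):
        self.viewers = []                 # connected user devices

    def attach(self, device):
        self.viewers.append(device)

    def modify(self, source, field, value):
        # field: "scan_parameter", "reconstruction_parameter",
        # or "diagnosis_result"
        update = {"field": field, "value": value, "by": source}
        for viewer in self.viewers:
            if viewer is not source:
                viewer.receive(update)    # synchronized, real-time push
```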
  • the imaging system 100 may be referred to as an examination terminal.
  • the function of the examination terminal is to scan an object (e.g., a patient) and generate one or more images of the object.
  • the imaging device 110 may be positioned in a scanning room.
  • an operator of the examination terminal may input a scan protocol and/or a reconstruction protocol through a control console (e.g., the terminal 140) to control the scan performed on the object and the image reconstruction.
  • the control console may be positioned in a control room separated from the scanning room by a glass window, so that the operator can monitor the scan performed on the object using the imaging device 110.
  • the control console may cause the processing device 120 to control the imaging device 110 to scan the object based on the scan protocol and cause the processing device 120 to generate one or more images of the object based on scan data generated by the scan of the object.
  • the examination terminal may transmit the one or more images to a remote device (e.g., the remote device 160) via a network (e.g., the network 150) .
  • the examination terminal may transmit the one or more images to a picture archiving and communication system (PACS) via the network 150.
  • the PACS may generate, for example, using the server 161, a diagnosis result by performing automatic diagnosis based on the one or more images.
  • a doctor may access the PACS to obtain the diagnosis result and make subsequent treatment on the object based on the diagnosis result.
  • the examination terminal may transmit the one or more images to the PACS via the network 150.
  • a doctor may access the PACS to obtain the one or more images and use a personal terminal or a post-processing workstation to generate a diagnosis result. The doctor may make subsequent treatment on the object based on the diagnosis result.
  • the examination terminal may transmit the one or more images to a post-processing workstation via the network 150.
  • the post-processing workstation may generate, for example, using the server 161, a diagnosis result by performing automatic diagnosis based on the one or more images.
  • the post-processing workstation may archive the diagnosis result in the PACS.
  • a doctor may access the PACS or the post-processing workstation to obtain the diagnosis result and make subsequent treatment on the object based on the diagnosis result.
  • since the examination terminal is not capable of performing the automatic diagnosis in these embodiments, the examination terminal needs to transmit the one or more generated images to a remote device (e.g., the PACS or the post-processing workstation) via the network 150 to achieve the automatic diagnosis.
  • the time interval from the image reconstruction to the automatic diagnosis may be relatively long and depends on the data transmission speed between the examination terminal and the remote device. For example, for dynamic imaging, the number of generated images is large (e.g., thousands of images), so the data transmission may be very slow. Moreover, for emergency patients, this long time interval may be fatal.
  • this long time interval may reduce the diagnosis efficiency.
  • the interface and protocol of the remote device may be inconsistent with those of the examination terminal, which may increase the difficulty of the data transmission between the remote device and the examination terminal.
  • the examination terminal (e.g., the imaging system 100) may be capable of performing the automatic diagnosis based on the one or more generated images.
  • the automatic diagnosis may be performed by the processing device 120 or the terminal 140.
  • the remote data transmission is omitted.
  • the reconstructed images may be archived to the PACS together with the diagnosis result.
  • the operator of the imaging system 100 may view the diagnosis result directly on the display of the control console of the examination terminal (e.g., the terminal 140 of the imaging system 100) , instead of accessing the PACS.
  • the operator of the examination terminal may directly make a decision of subsequent treatment of the patient, so that the subsequent treatment may be performed on the patient (especially emergency patient) as soon as possible.
  • the process 500 may be performed by the terminal 140 (e.g., the CPU 340, the GPU 330, and/or the one or more modules illustrated in FIG. 4) .
  • aspects of the present disclosure may be illustrated and described herein in any of a number of patentable classes or contexts, including any new and useful process, machine, manufacture, or composition of matter, or any new and useful improvement thereof. Accordingly, aspects of the present disclosure may be implemented entirely in hardware, entirely in software (including firmware, resident software, micro-code, etc.) or in an implementation combining software and hardware that may all generally be referred to herein as a “unit,” “module,” or “system.” Furthermore, aspects of the present disclosure may take the form of a computer program product embodied in one or more computer-readable media having computer-readable program code embodied thereon.
  • a computer-readable signal medium may include a propagated data signal with computer-readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated signal may take any of a variety of forms, including electromagnetic, optical, or the like, or any suitable combination thereof.
  • a computer-readable signal medium may be any computer-readable medium that is not a computer-readable storage medium and that may communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
  • Program code embodied on a computer-readable signal medium may be transmitted using any appropriate medium, including wireless, wireline, optical fiber cable, RF, or the like, or any suitable combination of the foregoing.
  • Computer program code for carrying out operations for aspects of the present disclosure may be written in any combination of one or more programming languages, including an object-oriented programming language such as Java, Scala, Smalltalk, Eiffel, JADE, Emerald, C++, C#, VB.NET, Python or the like, conventional procedural programming languages, such as the "C" programming language, Visual Basic, Fortran, Perl, COBOL, PHP, ABAP, dynamic programming languages such as Python, Ruby, and Groovy, or other programming languages.
  • the program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server.
  • the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN) , or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider) or in a cloud computing environment or offered as a service such as a Software as a Service (SaaS) .
  • LAN local area network
  • WAN wide area network
  • SaaS Software as a Service
  • the numbers expressing quantities, properties, and so forth, used to describe and claim certain embodiments of the application are to be understood as being modified in some instances by the term “about,” “approximate,” or “substantially.”
  • “about,” “approximate,” or “substantially” may indicate a ±20% variation of the value it describes, unless otherwise stated.
  • the numerical parameters set forth in the written description and attached claims are approximations that may vary depending upon the desired properties sought to be obtained by a particular embodiment.
  • the numerical parameters should be construed in light of the number of reported significant digits and by applying ordinary rounding techniques. Notwithstanding that the numerical ranges and parameters setting forth the broad scope of some embodiments of the application are approximations, the numerical values set forth in the specific examples are reported as precisely as practicable.

Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Medical Informatics (AREA)
  • Public Health (AREA)
  • Primary Health Care (AREA)
  • Epidemiology (AREA)
  • General Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Data Mining & Analysis (AREA)
  • Pathology (AREA)
  • Databases & Information Systems (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Business, Economics & Management (AREA)
  • General Business, Economics & Management (AREA)
  • Radiology & Medical Imaging (AREA)
  • Surgery (AREA)
  • Urology & Nephrology (AREA)
  • Measuring And Recording Apparatus For Diagnosis (AREA)

Abstract

A method (500) may include causing a scanning device to perform one or more scans on an object (510). The method may also include obtaining scan data of the object based on the one or more scans (520). The method may also include generating one or more images of the object based on the scan data (530). The method may also include generating, based on the one or more images, a diagnosis result (540). The diagnosis result may at least indicate whether one or more suspected lesions exist in the object.
PCT/CN2020/117944 2020-09-25 2020-09-25 Systèmes et procédés médicaux WO2022061787A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202080105497.7A CN116325009A (zh) 2020-09-25 2020-09-25 医疗系统和方法
PCT/CN2020/117944 WO2022061787A1 (fr) 2020-09-25 2020-09-25 Systèmes et procédés médicaux

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2020/117944 WO2022061787A1 (fr) 2020-09-25 2020-09-25 Systèmes et procédés médicaux

Publications (1)

Publication Number Publication Date
WO2022061787A1 true WO2022061787A1 (fr) 2022-03-31

Family

ID=80844787

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/117944 WO2022061787A1 (fr) 2020-09-25 2020-09-25 Systèmes et procédés médicaux

Country Status (2)

Country Link
CN (1) CN116325009A (fr)
WO (1) WO2022061787A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023232068A1 (fr) * 2022-05-31 2023-12-07 Shanghai United Imaging Healthcare Co., Ltd. Systèmes et procédés de traitement d'images

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8109875B2 (en) * 2007-01-03 2012-02-07 Gizewski Theodore M Derma diagnostic and automated data analysis system
CN108305671A (zh) * 2018-01-23 2018-07-20 深圳科亚医疗科技有限公司 由计算机实现的医学图像调度方法、调度系统及存储介质
CN108573490A (zh) * 2018-04-25 2018-09-25 王成彦 一种针对肿瘤影像数据的智能读片系统
CN109686444A (zh) * 2018-12-27 2019-04-26 上海联影智能医疗科技有限公司 用于医学图像分级的系统和方法
CN110379492A (zh) * 2019-07-24 2019-10-25 复旦大学附属中山医院青浦分院 一种全新的ai+pacs系统及其检查报告构建方法
US20200303049A1 (en) * 2019-03-22 2020-09-24 Shanghai United Imaging Healthcare Co., Ltd. System and method for generating imaging report

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106388843A (zh) * 2016-10-25 2017-02-15 上海联影医疗科技有限公司 医学影像设备及其扫描方法

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8109875B2 (en) * 2007-01-03 2012-02-07 Gizewski Theodore M Derma diagnostic and automated data analysis system
CN108305671A (zh) * 2018-01-23 2018-07-20 深圳科亚医疗科技有限公司 由计算机实现的医学图像调度方法、调度系统及存储介质
CN108573490A (zh) * 2018-04-25 2018-09-25 王成彦 一种针对肿瘤影像数据的智能读片系统
CN109686444A (zh) * 2018-12-27 2019-04-26 上海联影智能医疗科技有限公司 用于医学图像分级的系统和方法
US20200303049A1 (en) * 2019-03-22 2020-09-24 Shanghai United Imaging Healthcare Co., Ltd. System and method for generating imaging report
CN110379492A (zh) * 2019-07-24 2019-10-25 复旦大学附属中山医院青浦分院 一种全新的ai+pacs系统及其检查报告构建方法

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023232068A1 (fr) * 2022-05-31 2023-12-07 Shanghai United Imaging Healthcare Co., Ltd. Systèmes et procédés de traitement d'images

Also Published As

Publication number Publication date
CN116325009A (zh) 2023-06-23

Similar Documents

Publication Publication Date Title
US20210201066A1 (en) Systems and methods for displaying region of interest on multi-plane reconstruction image
US20200211188A1 (en) Systems and methods for image processing
JP7027438B2 (ja) 画像取得システム及び方法
US11995837B2 (en) System and method for medical image visualization
US11494877B2 (en) Systems and methods for image reconstruction
US11727608B2 (en) Systems and methods for image reconstruction
US11842465B2 (en) Systems and methods for motion correction in medical imaging
US11989881B2 (en) Systems and methods for imaging device
US11911201B2 (en) Systems and methods for determining position of region of interest
WO2021239141A1 (fr) Procédé et système de traitement d'image
KR102422871B1 (ko) 디지털 방사선 촬영을 위한 시스템 및 방법
US20220351372A1 (en) Systems and methods for image processing
WO2022061787A1 (fr) Systèmes et procédés médicaux
US20240037762A1 (en) Systems and methods for image processing
WO2022126475A1 (fr) Systèmes et procédés de contourage de cibles cliniques en radiothérapie
CN111161371B (zh) 成像系统和方法
US20230225687A1 (en) System and method for medical imaging
US20220183649A1 (en) Systems and methods for configuring medical device
US20220114801A1 (en) Systems and methods for image processing
US12040074B2 (en) Systems and methods for data synchronization
US20230030595A1 (en) Methods and systems for selecting data processing models
WO2021136250A1 (fr) Systèmes et procédés d'imagerie
CN117015801A (zh) 图像分割的系统和方法
US20230401768A1 (en) Systems and methods for image reconstruction
WO2023039736A1 (fr) Systèmes et procédés de reconstruction d'image

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20954639

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20954639

Country of ref document: EP

Kind code of ref document: A1