WO2018218478A1 - Image Processing Method and System - Google Patents

Image Processing Method and System

Info

Publication number
WO2018218478A1
WO2018218478A1 (application PCT/CN2017/086539)
Authority
WO
WIPO (PCT)
Prior art keywords
interest
region
dimensional
volume
data
Prior art date
Application number
PCT/CN2017/086539
Other languages
English (en)
French (fr)
Inventor
向昱 (Xiang Yu)
张洋 (Zhang Yang)
Original Assignee
Shanghai United Imaging Healthcare Co., Ltd. (上海联影医疗科技有限公司)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai United Imaging Healthcare Co., Ltd. (上海联影医疗科技有限公司)
Priority to PCT/CN2017/086539 priority Critical patent/WO2018218478A1/zh
Priority to EP17911976.3A priority patent/EP3627442A4/en
Priority to US15/854,705 priority patent/US10824896B2/en
Publication of WO2018218478A1 publication Critical patent/WO2018218478A1/zh
Priority to US17/086,517 priority patent/US11461990B2/en
Priority to US17/821,481 priority patent/US11798168B2/en

Classifications

    • G06T7/11 Region-based segmentation
    • G06F18/25 Fusion techniques
    • G06F18/251 Fusion techniques of input or preprocessed data
    • G06T7/0012 Biomedical image inspection
    • G06T7/136 Segmentation; edge detection involving thresholding
    • G06T7/155 Segmentation; edge detection involving morphological operators
    • G06T7/187 Segmentation; edge detection involving region growing, region merging, or connected component labelling
    • G06V10/25 Determination of region of interest [ROI] or a volume of interest [VOI]
    • G06V10/255 Detecting or recognising potential candidate objects based on visual cues, e.g. shapes
    • G06V10/803 Fusion of input or preprocessed data at the sensor, preprocessing, feature extraction or classification level
    • G06T2207/10016 Video; image sequence
    • G06T2207/10081 Computed x-ray tomography [CT]
    • G06T2207/10088 Magnetic resonance imaging [MRI]
    • G06T2207/10104 Positron emission tomography [PET]
    • G06T2207/10108 Single photon emission computed tomography [SPECT]
    • G06T2207/10116 X-ray image
    • G06T2207/20076 Probabilistic image processing
    • G06T2207/20104 Interactive definition of region of interest [ROI]
    • G06T2207/20221 Image fusion; image merging
    • G06T2207/30004 Biomedical image processing
    • G06V2201/03 Recognition of patterns in medical or anatomical images

Definitions

  • The present application relates to methods and systems for image processing, and in particular to methods and systems for processing regions of interest in medical images, including two-dimensional regions of interest (also referred to as 2D ROIs) and/or three-dimensional volumes of interest (also referred to as 3D VOIs).
  • Medical image post-processing software usually provides tools for processing regions of interest (two-dimensional regions of interest and/or three-dimensional volumes of interest), for example, to analyze and display characteristic information of a 2D ROI (Region of Interest) or 3D VOI (Volume of Interest), such as statistical features.
  • Users often hope to achieve two functions: (1) drawing 2D ROIs/3D VOIs of the same size and shape at the same two-dimensional position in different data layers of one medical image volume, and computing and comparing statistics over these 2D ROIs/3D VOIs; and (2) drawing 2D ROIs/3D VOIs of the same size and shape at the same three-dimensional position in multiple medical image volumes, and computing and comparing statistics over these 2D ROIs/3D VOIs.
  • Each drawn 2D ROI/3D VOI is usually a contiguous region. If some areas inside the contiguous region are not of interest to the user, or are not what the user wants to analyze, traditional 2D ROI/3D VOI tools usually cannot remove those parts to meet this need.
  • Each 2D ROI/3D VOI can display its corresponding statistical features; however, when there are multiple 2D ROIs/3D VOIs, statistical features of the population of 2D ROIs/3D VOIs cannot be obtained and displayed.
  • The present application provides methods and systems for positioning, transmitting, cropping, and merging 2D ROIs/3D VOIs, to solve the above problems and meet various user needs.
  • The first image processing method may include: acquiring an image data set, where the image data set may include first volume data, the first volume data may include at least one data layer, and the at least one data layer may include at least one voxel; and determining, by the at least one processor, a target region of interest in the first volume data. The target region of interest may include at least one voxel in the at least one data layer, and may include at least one two-dimensional region of interest or at least one three-dimensional volume of interest. Determining the target region of interest may include: drawing an initial region of interest in the first volume data, where the initial region of interest may include at least one two-dimensional region of interest or at least one three-dimensional volume of interest; and cropping the initial region of interest to obtain the target region of interest.
  • the second image processing method can be implemented on at least one machine, each machine can include at least one processor and memory.
  • The second image processing method may include: acquiring an image data set, where the image data set may include first volume data, the first volume data may include at least one data layer, and the at least one data layer may include at least one voxel; acquiring an initial region of interest in the first volume data, where the initial region of interest may include at least one two-dimensional region of interest or at least one three-dimensional volume of interest; determining, by the at least one processor, position information of the initial region of interest; and transmitting the initial region of interest based on the position information.
  • the third image processing method can be implemented on at least one machine, each machine can include at least one processor and memory.
  • The third image processing method may include: acquiring an image data set; acquiring an initial region of interest in the image data set, where the initial region of interest may include at least one two-dimensional region of interest or at least one three-dimensional volume of interest; acquiring a region of interest to be merged in the image data set, where the region to be merged may include at least one two-dimensional region of interest or at least one three-dimensional volume of interest; merging, by the at least one processor, the initial region of interest and the region to be merged; and analyzing feature information of the merged region of interest.
  • the first non-transitory computer readable medium can include executable instructions.
  • the instructions when executed by at least one processor, may cause the at least one processor to implement the first image processing method.
  • the second non-transitory computer readable medium can include executable instructions.
  • The instructions, when executed by at least one processor, may cause the at least one processor to implement the second image processing method.
  • the third non-transitory computer readable medium can include executable instructions.
  • the instructions when executed by at least one processor, may cause the at least one processor to implement the third image processing method.
  • the first system can include: at least one processor, and a memory for storing instructions that, when executed by the at least one processor, cause the system to implement the first image processing method.
  • the second system can include: at least one processor, and a memory for storing instructions that, when executed by the at least one processor, cause the system to implement the second image processing method.
  • the third system can include at least one processor and a memory for storing instructions that, when executed by the at least one processor, cause the system to implement the third image processing method.
  • Cropping the initial region of interest to obtain the target region of interest may include: drawing a region to be cropped in the first volume data; and removing the intersection of the region to be cropped and the initial region of interest from the initial region of interest, to obtain the target region of interest.
  • The first image processing method may further include: determining whether to continue cropping the target region of interest; and, if it is determined to continue cropping, continuing to draw a region to be cropped in the first volume data.
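The cropping step described above amounts to removing, from the initial ROI, its intersection with a user-drawn region to be cropped. The following is a minimal sketch, not the patent's implementation, assuming both regions are represented as NumPy boolean voxel masks over the same volume grid:

```python
import numpy as np

def crop_roi(initial_roi: np.ndarray, region_to_crop: np.ndarray) -> np.ndarray:
    """Remove the intersection of `region_to_crop` and `initial_roi`
    from `initial_roi`, yielding the target region of interest."""
    intersection = initial_roi & region_to_crop
    # equivalent to: initial_roi & ~region_to_crop
    return initial_roi & ~intersection

# Toy single-layer volume (1 x 4 x 4) for illustration
initial_roi = np.zeros((1, 4, 4), dtype=bool)
initial_roi[0, 1:3, 1:3] = True          # a 2x2 initial ROI (4 voxels)
region_to_crop = np.zeros_like(initial_roi)
region_to_crop[0, 1, 1] = True           # crop out one voxel

target_roi = crop_roi(initial_roi, region_to_crop)
print(target_roi.sum())  # 3 voxels remain
```

Repeated cropping, as in the iterative step above, is just successive applications of `crop_roi` with newly drawn regions.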
  • The first image processing method may further include transmitting the target region of interest. Transmitting the target region of interest may include: determining first location information, where the first location information may include a three-dimensional coordinate position, in the first volume data, of a voxel of the target region of interest; acquiring a first transmission depth in a first transmission direction, where the first transmission direction may form a first angle with the plane of a first data layer in the first volume data; acquiring a second transmission depth in a second transmission direction, where the second transmission direction may form a second angle with the plane of the first data layer in the first volume data; determining at least one second data layer of the volume data within the range corresponding to the first transmission depth and the second transmission depth; and generating the transmitted region of interest in the at least one second data layer according to the first location information.
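Under the simplifying assumption that the transmission directions are perpendicular to the data layers (first and second angles of 90 degrees) and that the transmission depths are expressed as layer counts, the intra-volume transmission step can be sketched as follows; all function and parameter names are illustrative, not from the patent:

```python
import numpy as np

def transfer_roi_within_volume(volume_shape, roi_2d, source_layer,
                               depth_up, depth_down):
    """Copy a 2D ROI mask from `source_layer` into every data layer
    within `depth_up` layers above and `depth_down` layers below it."""
    n_layers = volume_shape[0]
    roi_3d = np.zeros(volume_shape, dtype=bool)
    start = max(0, source_layer - depth_up)          # clamp to the volume
    stop = min(n_layers, source_layer + depth_down + 1)
    for z in range(start, stop):
        roi_3d[z] = roi_2d                           # same 2D position each layer
    return roi_3d

roi_2d = np.zeros((4, 4), dtype=bool)
roi_2d[1:3, 1:3] = True                              # 2x2 ROI in one layer
roi_3d = transfer_roi_within_volume((10, 4, 4), roi_2d, source_layer=5,
                                    depth_up=2, depth_down=2)
print(roi_3d.any(axis=(1, 2)).sum())  # ROI present in 5 layers (3..7)
```

Oblique transmission directions would additionally shift the 2D position per layer according to the stated angles; that refinement is omitted here.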
  • The first image processing method may further include transmitting the target region of interest, which may include: acquiring second volume data; determining second location information, where the second location information may include a three-dimensional coordinate position, in the first volume data, of a voxel of the target region of interest; and generating the transmitted region of interest in the second volume data according to the second location information.
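As a hedged sketch of the cross-volume transmission step: assuming the two volumes are voxel-aligned (a real implementation would handle resampling or registration between volumes), the ROI is re-created in the second volume so that a chosen anchor voxel lands at the corresponding three-dimensional coordinate position. Names are illustrative:

```python
import numpy as np

def transfer_roi_to_volume(roi_mask, anchor_voxel, target_shape, target_anchor):
    """Re-create `roi_mask` in a second volume so that `anchor_voxel`
    (a voxel of the ROI in the first volume) lands on `target_anchor`."""
    target_roi = np.zeros(target_shape, dtype=bool)
    # offsets of every ROI voxel relative to the anchor voxel
    offsets = np.argwhere(roi_mask) - np.asarray(anchor_voxel)
    coords = offsets + np.asarray(target_anchor)
    # keep only voxels that fall inside the target volume
    inside = np.all((coords >= 0) & (coords < np.asarray(target_shape)), axis=1)
    coords = coords[inside]
    target_roi[tuple(coords.T)] = True
    return target_roi

roi = np.zeros((4, 4, 4), dtype=bool)
roi[1, 1:3, 1:3] = True                              # 4-voxel ROI in volume 1
out = transfer_roi_to_volume(roi, anchor_voxel=(1, 1, 1),
                             target_shape=(8, 8, 8), target_anchor=(4, 4, 4))
print(out.sum())  # 4 voxels transferred
```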
  • The first image processing method may further include: acquiring a region to be merged in the first volume data; and merging the target region of interest and the region to be merged.
  • the first image processing method may further include: transmitting the initial region of interest.
  • Drawing the initial region of interest may include: drawing a delivery-source region of interest in the first volume data; determining third location information, where the third location information may include a three-dimensional coordinate position, in the first volume data, of a voxel of the delivery-source region of interest; and transmitting the delivery-source region of interest within the first volume data based on the third location information, to generate the initial region of interest.
  • Drawing the initial region of interest may include: acquiring third volume data; drawing a delivery-source region of interest in the third volume data; determining fourth location information, where the fourth location information may include a three-dimensional coordinate position, in the third volume data, of a voxel of the delivery-source region of interest; and transmitting the delivery-source region of interest to the first volume data according to the fourth location information, to generate the initial region of interest.
  • Drawing the initial region of interest may include: drawing at least two different merge-source regions of interest in the first volume data; and merging the at least two different merge-source regions of interest to generate the initial region of interest.
  • The first image processing method may further include analyzing the target region of interest, which may include analyzing feature information of the target region of interest. The feature information may include statistical feature information, that is, information obtained by statistical analysis of a plurality of voxels in the target region of interest.
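The statistical feature information described above can be illustrated as follows, assuming the target region of interest is a boolean mask over the volume's voxel values; the particular statistics shown (count, mean, standard deviation, minimum, maximum) are examples, not an exhaustive list from the patent:

```python
import numpy as np

def roi_statistics(volume: np.ndarray, roi_mask: np.ndarray) -> dict:
    """Statistical features over the voxels selected by `roi_mask`."""
    values = volume[roi_mask]          # 1D array of the ROI's voxel values
    return {
        "count": int(values.size),
        "mean": float(values.mean()),
        "std": float(values.std()),
        "min": float(values.min()),
        "max": float(values.max()),
    }

# Toy 3x3x3 volume with values 0..26; ROI = the middle data layer
volume = np.arange(27, dtype=float).reshape(3, 3, 3)
mask = np.zeros(volume.shape, dtype=bool)
mask[1] = True
stats = roi_statistics(volume, mask)
print(stats["count"], stats["mean"])  # 9 13.0
```

Population statistics over several ROIs would apply the same computation to the union of their masks.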
  • Transmitting the initial region of interest may include: acquiring a first transmission depth in a first transmission direction, where the first transmission direction may form a first angle with the plane of a first data layer in the first volume data; and acquiring a second transmission depth in a second transmission direction, where the second transmission direction may form a second angle with the plane of the first data layer in the first volume data.
  • The second image processing method may further include: drawing a region of interest to be merged in the first volume data; and merging the target region of interest with the region of interest to be merged.
  • Transmitting the initial region of interest may include: acquiring second volume data; determining second location information, where the second location information may include a three-dimensional coordinate position, in the first volume data, of a voxel of the initial region of interest; and generating the transmitted region of interest in the second volume data based on the second location information.
  • The second image processing method may further include: drawing a region of interest to be merged in the second volume data; and merging the transmitted region of interest and the region of interest to be merged.
  • The second image processing method may further include: drawing a region of interest to be merged in the first volume data; and merging the initial region of interest and the region of interest to be merged.
  • Acquiring the initial region of interest in the first volume data may include: drawing at least two different merge-source regions of interest in the first volume data; and merging the at least two different merge-source regions of interest to obtain the initial region of interest.
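Merging two or more merge-source regions drawn on the same volume data reduces, in mask terms, to a voxel-wise union; the following is an illustrative sketch under that assumption:

```python
import numpy as np

def merge_rois(*roi_masks: np.ndarray) -> np.ndarray:
    """Merge ROI masks drawn on the same volume by taking their union."""
    merged = np.zeros_like(roi_masks[0], dtype=bool)
    for mask in roi_masks:
        merged |= mask                 # voxel-wise union
    return merged

a = np.zeros((1, 4, 4), dtype=bool)
a[0, 0:2, 0:2] = True                  # 4 voxels
b = np.zeros((1, 4, 4), dtype=bool)
b[0, 1:3, 1:3] = True                  # 4 voxels, 1 overlapping with `a`
merged = merge_rois(a, b)
print(merged.sum())  # 7 voxels in the merged region
```

Feature information of the merged region (as in the third method) would then be computed over this union mask.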
  • FIG. 1 is an exemplary schematic diagram of an image processing system, in accordance with some embodiments of the present application.
  • FIG. 2 is an exemplary schematic diagram of a computing device of an image processing server, in accordance with some embodiments of the present application.
  • FIG. 3 is an exemplary schematic diagram of a processing engine, in accordance with some embodiments of the present application.
  • FIG. 4 is an exemplary schematic diagram of a processing module, in accordance with some embodiments of the present application.
  • FIG. 5 is an exemplary flow diagram of processing an image, in accordance with some embodiments of the present application.
  • FIG. 6 is an exemplary flow diagram for determining a region of interest (two-dimensional region of interest/three-dimensional volume of interest), in accordance with some embodiments of the present application.
  • FIG. 7A is an exemplary flow diagram of cropping a region of interest (two-dimensional region of interest/three-dimensional volume of interest), in accordance with some embodiments of the present application.
  • FIG. 7B is an exemplary schematic diagram of a region of interest (two-dimensional region of interest/three-dimensional volume of interest), in accordance with some embodiments of the present application.
  • FIG. 7C is an exemplary schematic diagram of a cropped region of interest (two-dimensional region of interest/three-dimensional volume of interest), in accordance with some embodiments of the present application.
  • FIG. 8A is an exemplary flow diagram of delivering a two-dimensional region of interest, in accordance with some embodiments of the present application.
  • FIG. 8B is an exemplary schematic diagram of a two-dimensional region of interest, in accordance with some embodiments of the present application.
  • FIG. 8C is an exemplary schematic diagram of a two-dimensional region of interest after delivery, in accordance with some embodiments of the present application.
  • FIG. 9A is an exemplary flow diagram of delivering a region of interest (two-dimensional region of interest/three-dimensional volume of interest), in accordance with some embodiments of the present application.
  • FIG. 9B is an exemplary schematic diagram of delivering a two-dimensional region of interest, in accordance with some embodiments of the present application.
  • FIG. 9C is an exemplary schematic diagram of delivering a three-dimensional volume of interest, in accordance with some embodiments of the present application.
  • FIG. 10A is an exemplary flow diagram of merging regions of interest (two-dimensional regions of interest/three-dimensional volumes of interest), in accordance with some embodiments of the present application.
  • FIG. 10B is an exemplary schematic diagram of regions of interest (two-dimensional regions of interest/three-dimensional volumes of interest), in accordance with some embodiments of the present application.
  • FIG. 10C is an exemplary schematic diagram of a merged region of interest (two-dimensional region of interest/three-dimensional volume of interest), in accordance with some embodiments of the present application.
  • In the present application, an ROI may refer to a corresponding region of interest or volume of interest in one or more data layers, and a VOI may refer to a corresponding volume of interest in two or more data layers.
  • A VOI can be regarded as a three-dimensional ROI.
  • A two-dimensional ROI in the present application may refer to a corresponding two-dimensional region of interest in a single data layer.
  • Embodiments described herein may be applied to regions of interest (two-dimensional ROI and/or three-dimensional VOI).
  • the region of interest may comprise a two-dimensional ROI and/or a three-dimensional VOI.
  • FIG. 1 is an exemplary schematic diagram of an image processing system 100 shown in accordance with some embodiments of the present application.
  • the image processing system 100 can include an imaging system 110, an image processing server 120, a network 130, and a database 140.
  • imaging system 110 can be a standalone imaging device, or a multimodal imaging system.
  • image processing server 120 may analyze, process, and/or output the processed information.
  • Imaging system 110 can include a single imaging device, or a combination of multiple different imaging devices.
  • the imaging device can perform imaging by scanning one target.
  • the imaging device can be a medical imaging device.
  • the medical imaging device can collect image information of various parts of the human body.
  • The imaging system 110 can be a Positron Emission Tomography (PET) system, a Single Photon Emission Computed Tomography (SPECT) system, a Computed Tomography (CT) system, a Magnetic Resonance Imaging (MRI) system, a Digital Radiography (DR) system, a Computed Tomography Colonography (CTC) system, etc., or a combination thereof.
  • Imaging system 110 can include one or more scanners.
  • The scanner may be a Digital Subtraction Angiography (DSA) scanner, a Magnetic Resonance Angiography (MRA) scanner, a Computed Tomography Angiography (CTA) scanner, a PET scanner, a SPECT scanner, a CT scanner, an MRI scanner, a DR scanner, a multi-modality scanner, etc., or a combination thereof.
  • The multi-modality scanner may be a CT-PET (Computed Tomography-Positron Emission Tomography) scanner, a SPECT-MRI (Single Photon Emission Computed Tomography-Magnetic Resonance Imaging) scanner, a PET-MRI (Positron Emission Tomography-Magnetic Resonance Imaging) scanner, a DSA-MRI (Digital Subtraction Angiography-Magnetic Resonance Imaging) scanner, or the like.
  • the image processing server 120 can process the acquired data information.
  • the image processing server 120 can perform denoising, image artifact removal, image segmentation, image rendering, image registration, image fusion, image reconstruction, and the like on the data information.
  • image processing server 120 may perform operations such as rendering, editing, etc. on the image two-dimensional ROI/three-dimensional VOI.
  • the data information may include text information, image information, sound information, video information, etc., or a combination of several.
  • image processing server 120 may include a processing engine 122, a processing core, one or more memories, etc., or a combination of several.
  • The image processing server 120 may include a central processing unit (CPU), an application-specific integrated circuit (ASIC), an application-specific instruction-set processor (ASIP), a graphics processing unit (GPU), a physics processing unit (PPU), a digital signal processor (DSP), a field-programmable gate array (FPGA), a programmable logic device (PLD), a controller, a microcontroller unit, a processor, a microprocessor, an ARM (Advanced RISC Machines) processor, etc., or a combination thereof.
  • image processing server 120 can process image information acquired from imaging system 110.
  • image processing server 120 can process image information from database 140 or other storage devices. The results of image processing server 120 processing the information may be stored in internal memory, database 140, or other external data source.
  • image processing server 120 may directly receive user instructions and perform corresponding image processing operations.
  • a user may access image processing server 120 over network 130 using a remote terminal (not shown). The results processed by image processing server 120 may be presented directly to the user or sent to the remote terminal over network 130 for viewing by the user.
  • Network 130 can be a single network, or a combination of multiple different networks.
  • The network 130 may be a local area network (LAN), a wide area network (WAN), a public network, a private network, a public switched telephone network (PSTN), the Internet, a wireless network, a virtual network, a metropolitan area network, a telephone network, etc., or a combination thereof.
  • Network 130 may include multiple network access points, such as wired access points, wireless access points, base stations, Internet exchange points, and the like. Through these access points, image processing server 120 and/or imaging system 110 can access network 130 and transmit and/or receive data information over network 130.
  • the imaging system 110 in the field of medical imaging is now described as an example, but the application is not limited to the scope of this embodiment.
  • The imaging system 110 may be a Computed Tomography (CT) system or a Magnetic Resonance Imaging (MRI) system.
  • The network 130 of the image processing system 100 may include a wireless network (Bluetooth, wireless local area network (WLAN), Wi-Fi, WiMAX, etc.), a mobile network (2G, 3G, 4G signals, etc.), or other connection methods (a Virtual Private Network (VPN), a shared network, Near Field Communication (NFC), ZigBee, etc.).
  • network 130 may be used for communication of image processing system 100, receiving information internal or external to image processing system 100, and transmitting information to other portions or external portions of image processing system 100.
  • the imaging system 110, the image processing server 120, and the database 140 can be connected to the network 130 by a wired connection, a wireless connection, or a wired connection in combination with a wireless connection.
  • Database 140 can store information.
  • the database 140 can be built on a device having a storage function.
  • Database 140 can be local or remote.
  • database 140 or other storage devices within image processing system 100 may store various information, such as image data and the like.
  • database 140 or other storage devices within the system may be media with read/write capabilities.
  • the database 140 or other storage devices in the system may be internal devices of the system, or external devices of the system.
  • the connection of the database 140 to other storage devices in the system can be wired or wireless.
  • Database 140 or other storage devices within the system may include hierarchical databases, networked databases, relational databases, etc., or a combination of several.
  • the database 140 or other storage devices within the system may digitize the information and store it using electrical, magnetic or optical storage devices.
  • the database 140 or other storage devices in the system may be devices that store information by means of electrical energy, such as random access memory (RAM), read only memory (ROM), etc., or a combination of several.
  • The random access memory (RAM) may include a decimal counting tube, a selectron tube, a delay line memory, a Williams tube, a dynamic random access memory (DRAM), a static random access memory (SRAM), a thyristor random access memory (T-RAM), a zero-capacitor random access memory (Z-RAM), etc., or a combination thereof.
  • the read only memory (ROM) may include a magnetic bubble memory, a twistor memory, a thin film memory, a magnetic plated wire memory, a magnetic core memory, a magnetic drum memory, an optical disc drive, a hard disk, a magnetic tape, a phase change memory, a flash memory, an electrically erasable programmable read-only memory (EEPROM), an erasable programmable read-only memory (EPROM), a programmable read-only memory (PROM), a mask ROM, a racetrack memory, a resistive random-access memory, programmable metallization cells, etc., or a combination thereof.
  • the database 140 or other storage devices within the system may be devices that store information using magnetic energy, such as hard disks, floppy disks, magnetic tapes, magnetic core memories, magnetic bubble memories, USB flash drives, flash memories, and the like.
  • the database 140 or other storage device within the system may be a device that optically stores information, such as a CD or DVD.
  • the database 140 or other storage device within the system may be a device that uses magneto-optical means to store information, such as a magneto-optical disk or the like.
  • the access mode of the database 140 or other storage devices in the system may be random access, serial access, read-only access, etc., or a combination thereof.
  • Database 140 or other storage devices within the system may be non-persistent memory or permanent memory.
  • the storage devices mentioned above are just a few examples, and the storage devices that the system can use are not limited thereto.
  • database 140 may be part of imaging system 110 and/or image processing server 120. In some embodiments, database 140 can be stand-alone and directly connected to network 130. In some embodiments, database 140 may store data collected from imaging system 110, image processing server 120, and/or network 130. In some embodiments, database 140 may store various data utilized, generated, and/or output in the operation of image processing server 120. In some embodiments, the connection or communication of database 140 with imaging system 110, image processing server 120, and/or network 130 may be wired, wireless, or a combination of both. In some embodiments, imaging system 110 can access database 140, image processing server 120, etc., either directly or through network 130.
  • the image processing server 120 and/or the database 140 described above may actually exist in the imaging system 110 or perform corresponding functions through a cloud computing platform.
  • the cloud computing platform may include a storage-based cloud platform that stores data, a computing cloud platform that processes data, and an integrated cloud computing platform that takes into account data storage and processing.
  • the cloud platform used by the imaging system 110 may be a public cloud, a private cloud, a community cloud, or a hybrid cloud.
  • the image information and/or data information output by the imaging system 110 may be computed and/or stored by a cloud platform according to actual needs, or may be computed and/or stored by the local image processing server 120 and/or the database 140.
  • the above description of the image processing system 100 is provided only for convenience of description and is not intended to limit the present application to the scope of the enumerated embodiments. It will be understood by those skilled in the art that, after understanding the principle of the system, it is possible, without departing from this principle, to arbitrarily combine the modules, to form a subsystem connected with other modules, or to make various modifications and changes to the configuration of the image processing system 100. However, such modifications and changes remain within the scope of the above description.
  • the database 140 may be a cloud computing platform with data storage capabilities, including public clouds, private clouds, community clouds, hybrid clouds, and the like. Variations such as these are within the scope of the present application.
  • FIG. 2 is an exemplary schematic diagram of a computing device 200 of image processing server 120, shown in accordance with some embodiments of the present application.
  • Computing device 200 can implement and/or realize the particular systems disclosed in this application.
  • the particular system in this embodiment utilizes a functional block diagram to explain a hardware platform that includes a user interface.
  • Computing device 200 can implement one or more components, modules, units, sub-units of image processing server 120 as currently described.
  • image processing server 120 can be implemented by computing device 200 through its hardware devices, software programs, firmware, and combinations thereof.
  • Computing device 200 can be a general-purpose computer or a special-purpose computer; both can be used to implement the particular system in this embodiment.
  • Only one computing device is drawn in FIG. 2, but the computer functions described in this embodiment for information processing and information pushing may be implemented in a distributed manner by a group of similar platforms, spreading the processing load of the system.
  • computing device 200 can include an internal communication bus 210, a processor 220, a read only memory (ROM) 230, a random access memory (RAM) 240, a communication port 250, an input/output component 260, a hard disk 270, and a user interface 280.
  • the internal communication bus 210 can enable data communication between components of the computing device 200.
  • the processor 220 is operative to execute program instructions to perform any of the functions, components, modules, units, subunits of the image processing server 120 described in this disclosure.
  • Processor 220 may include one or more processors.
  • Communication port 250 may enable data communication between computing device 200 and other components of image processing system 100, such as imaging system 110, image processing server 120 (such as through network 130).
  • Computing device 200 can also include various forms of program storage units and data storage units, such as hard disk 270, read only memory (ROM) 230, and random access memory (RAM) 240, which can store data used in computer processing and/or communication, as well as program instructions executed by processor 220.
  • Input/output component 260 supports input/output of data streams between computing device 200 and other components, such as user interface 280, and/or other components of image processing system 100, such as database 140.
  • Computing device 200 can also transmit and receive information and data from network 130 via communication port 250.
  • FIG. 3 is an exemplary schematic diagram of a processing engine 122, shown in accordance with some embodiments of the present application.
  • the processing engine 122 in the image processing server 120 can include a processing module 310, a communication module 320, and a storage module 330.
  • Processing engine 122 may further include an input/output module 340.
  • the input/output module 340 can receive image data of one or more imaging devices in the imaging system 110 and send it to the processing module 310 or the like.
  • the input/output module 340 can transmit the image data processed by the processing module 310 to the imaging system 110 and/or the database 140 and the like connected to the image processing server 120 through the network 130.
  • the connections between the various modules of the processing engine 122 may be wired, wireless, and/or a combination of wired and wireless connections.
  • the various modules of processing engine 122 may be local, remote, and/or a combination of local and remote.
  • the correspondence between the modules of the processing engine 122 may be one-to-one, one-to-many, or many-to-many.
  • the processing engine 122 can include a processing module 310 and a communication module 320.
  • the processing engine 122 can include a plurality of processing modules 310 and a plurality of storage modules 330.
  • the plurality of processing modules 310 may respectively correspond to the plurality of storage modules 330 to respectively process image data from the corresponding storage module 330.
  • Image data may include one or more images (e.g., two-dimensional images, three-dimensional images, etc.) or one or more portions thereof, video data (e.g., one or more videos, video frames, and other video-associated data, etc.), data that can be used to process images and/or video (e.g., data used to compress or decompress, encrypt or decrypt, send or receive, and play back images and/or video, etc.), image-related data, and the like.
  • Input/output module 340 can receive information from other modules or external modules in image processing system 100. Input/output module 340 can send information to other modules or external modules in image processing system 100. In some embodiments, input/output module 340 can receive image data generated by imaging system 110. The image data may include computed tomography image data, X-ray image data, magnetic resonance image data, ultrasonic image data, thermal image data, nuclear image data, optical image data, and the like. In some embodiments, the information received by the input/output module 340 can be processed in the processing module 310 and/or stored in the storage module 330. In some embodiments, the input/output module 340 can output image data processed by the processing module 310.
  • the information received and/or output by input/output module 340 may be data in the Digital Imaging and Communications in Medicine (DICOM) format.
  • data in the DICOM format may be transmitted and/or stored in accordance with one or more standards.
  • input/output module 340 can perform corresponding operations through input/output component 260.
  • Processing module 310 can process the image data.
  • the processing module 310 may acquire image data from imaging system 110 and/or database 140 through input/output module 340.
  • the processing module 310 can retrieve image data directly from the storage module 330.
  • the processing module 310 can process the acquired image data.
  • the processing of the image data may include image digitization, image compression (e.g., image encoding), image enhancement and restoration (e.g., image enhancement, image restoration, image reconstruction), image analysis (e.g., image segmentation, image matching, image recognition), etc., or a combination thereof.
  • the processing module 310 can process medical image data.
  • the processing of the medical image data may include volume rendering (e.g., volume ray casting, early ray termination), determining a two-dimensional region of interest/three-dimensional volume of interest (e.g., drawing, cropping, transferring, or merging a two-dimensional region of interest/three-dimensional volume of interest), image compression (e.g., image encoding, including model-based coding and neural-network-based coding), image analysis (e.g., image segmentation, including region-based segmentation (region growing, region splitting, octree), threshold segmentation, edge-based segmentation, the histogram method, etc.), image enhancement and restoration (e.g., filtering (including high-pass filtering, low-pass filtering, band-pass filtering, etc.), the Fourier transform, pseudo-color enhancement), image reconstruction (including texture mapping), image coloring (including radiative coloring), image rendering (including ray tracing, photon mapping, volume rendering), image edge blending (including grayscale-window-based matching), fitting, interpolation, etc., or a combination thereof.
  • a two-dimensional region/three-dimensional volume of interest is drawn on an image and characteristic information (eg, statistical features) of the two-dimensional region/three-dimensional volume of interest is analyzed.
  • the statistical features may include, but are not limited to, variance, area, length, average, maximum, minimum, volume, frequency distribution, histogram, etc., or a combination thereof.
  • processing module 310 can perform corresponding operations by processor 220.
  • processing module 310 can include one or more processing elements or devices, such as a central processing unit (CPU), a graphics processing unit (GPU), a digital signal processor (DSP), a system on a chip (SoC), a microcontroller unit (MCU), etc., or a combination thereof.
  • processing module 310 can include processing elements having special functions. For example, processing module 310 can include processing elements that determine a two-dimensional region/three-dimensional volume of interest of the image. As another example, processing module 310 can include processing elements of a user-defined function.
  • Communication module 320 can enable communication of image processing server 120 with network 130.
  • the communication mode of the communication module 320 may include wired communication and/or wireless communication.
  • the wired communication may refer to communication carried out through transmission media such as wires, cables, optical cables, waveguides, nanomaterials, etc. The wireless communication may include IEEE 802.11 series wireless local area network communication, IEEE 802.15 series wireless communication (e.g., Bluetooth, ZigBee, etc.), mobile communication (e.g., TDMA, CDMA, WCDMA, TD-SCDMA, TD-LTE, FDD-LTE, etc.), satellite communication, microwave communication, scatter communication, atmospheric laser communication, etc., or a combination thereof.
  • communication module 320 may encode the transmitted information using one or more encoding methods.
  • the encoding method may include phase encoding, non-return-to-zero encoding, differential Manchester encoding, etc., or a combination thereof.
  • communication module 320 can select different encoding and/or transmission methods depending on the type of image. For example, when the image data is in the DICOM form, the communication module 320 can encode and transmit according to the DICOM standard. In some embodiments, communication module 320 can perform corresponding operations through communication port 250.
  • the storage module 330 can store information.
  • the information may include image data acquired by the input/output module 340, and/or results processed by the processing module 310, and the like.
  • the information may include text, numbers, sounds, images, videos, etc., or a combination thereof.
  • the storage module 330 may include various types of storage devices such as a solid state drive, a mechanical hard disk, a USB flash drive, an SD memory card, an optical disc, a random access memory (RAM), a read-only memory (ROM), etc., or a combination thereof.
  • storage module 330 can be storage local to image processing server 120, external storage, and/or storage connected via network 130 (eg, cloud storage, etc.), and the like.
  • storage module 330 can include a data management unit (not shown).
  • the data management unit can monitor and manage data in the storage module, delete data with zero or low utilization, and enable the storage module 330 to maintain sufficient storage capacity.
  • storage module 330 can perform corresponding operations through ROM 230, RAM 240, and/or other storage devices.
  • processing engine 122 can include a control module.
  • the control module can control each module of the processing engine 122 to receive, process, store, input, and/or output image data.
  • the control module can control the input/output module 340 to obtain information (eg, obtain user instructions, expert opinions, etc. from the user interface 280), or transmit information to the network 130 (eg, share a disease in a medical system) Information, etc.).
  • FIG. 4 is an exemplary schematic diagram of a processing module 310 shown in accordance with some embodiments of the present application.
  • the processing module 310 can include a rendering unit 410, a delivery unit 420, a cropping unit 430, a merging unit 440, and an analysis unit 450.
  • processing module 310 may also include other units. In some embodiments, some of the above units may not be necessary. In some embodiments, some of the above units may be combined into one unit that performs their functions together. In some embodiments, the above units may be independent, meaning that each unit performs its own function. In some embodiments, the above units may communicate with one another, meaning that data from each unit may be exchanged among them.
  • the object of image processing may be an image or a part thereof.
  • the image may be a two-dimensional image and/or a three-dimensional image.
  • in a two-dimensional image, the finest resolvable elements can be pixels.
  • in a three-dimensional image, the finest resolvable elements can be voxels.
  • the image may include one or more two-dimensional regions of interest (ROI) and/or a three-dimensional volume of interest (VOI).
  • the two-dimensional ROI may refer to one or more pixel points of interest in a two-dimensional image (or a layer of an image in a three-dimensional image).
  • the three-dimensional VOI may refer to one or more voxel points of interest in the three-dimensional image.
  • the three dimensional VOI may comprise an ROI of one or more layers of images.
  • the two-dimensional ROI/three-dimensional VOI may correspond to one or more tissues in the image or a portion thereof.
  • a two-dimensional ROI/three-dimensional VOI can refer to a tumor, a hardening, or a diseased tissue in an image.
  • the processing module 310 can process pixels/voxels in the image corresponding to a portion of a tissue, organ, or related content (eg, colon, small intestine, lung, or air, liquid, etc. therein).
  • the processing of an image may include drawing a two-dimensional ROI/three-dimensional VOI, transferring a two-dimensional ROI/three-dimensional VOI, cropping a two-dimensional ROI/three-dimensional VOI, merging two-dimensional ROIs/three-dimensional VOIs, identifying or segmenting a tissue in an image, removing a certain area or volume from the image, etc.
  • the drawing unit 410 can draw a two-dimensional area/three-dimensional volume based on the image data.
  • the rendering unit 410 can render one or more specific two-dimensional regions/three-dimensional volumes based on the images.
  • the particular two-dimensional region/three-dimensional volume may be a region of interest (ROI) and/or a volume of interest (VOI).
  • the drawing unit 410 can draw a two-dimensional area/three-dimensional volume of a specific shape.
  • the two-dimensional area/three-dimensional volume of the specific shape may include a two-dimensional area/three-dimensional volume of a regular shape and/or a two-dimensional area/three-dimensional volume of an irregular shape.
  • the two-dimensional area/three-dimensional volume of the regular shape may include rectangles, squares, diamonds, circles, ellipses, triangles, cuboids, cubes, cones, spheres, etc.
  • the two-dimensional area/three-dimensional volume of the irregular shape may include a two-dimensional area/three-dimensional volume of an arbitrary shape.
  • the rendering unit 410 can perform any modification operations on the interior of the two-dimensional region/three-dimensional volume and/or its boundaries. The operations may include stretching, dragging, enlarging, reducing, erasing, thickening, adding colors, and the like. For example, the user may add color to the particular two-dimensional area/three-dimensional volume drawn by the rendering unit 410 at the user interface 280.
  • rendering unit 410 may render a two-dimensional ROI/three-dimensional VOI on the image processed by rendering unit 410, delivery unit 420, crop unit 430, and/or merging unit 440.
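The drawing described above can be sketched in code. The following is a minimal illustration only (not the patent's implementation): it assumes a two-dimensional ROI or three-dimensional VOI of a regular shape is represented as a boolean mask over the pixel/voxel grid, and the function names are hypothetical.

```python
import numpy as np

def circular_roi_mask(shape, center, radius):
    """Boolean mask of a circular two-dimensional ROI on a 2-D image grid."""
    yy, xx = np.ogrid[:shape[0], :shape[1]]
    return (yy - center[0]) ** 2 + (xx - center[1]) ** 2 <= radius ** 2

def spherical_voi_mask(shape, center, radius):
    """Boolean mask of a spherical three-dimensional VOI on a 3-D voxel grid."""
    zz, yy, xx = np.ogrid[:shape[0], :shape[1], :shape[2]]
    return ((zz - center[0]) ** 2 + (yy - center[1]) ** 2
            + (xx - center[2]) ** 2 <= radius ** 2)

# Draw a circular ROI on a 64 x 64 layer and a spherical VOI in a volume.
roi = circular_roi_mask((64, 64), center=(32, 32), radius=10)
voi = spherical_voi_mask((32, 64, 64), center=(16, 32, 32), radius=8)
```

Irregular shapes would simply be masks built from user-drawn contours instead of analytic formulas; the mask representation is what the later transfer, crop, and merge operations act on.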
  • the transfer unit 420 can deliver the two-dimensional region of interest/three-dimensional volume based on the image data.
  • the transfer unit 420 can transfer one or more specific two-dimensional regions of interest/three-dimensional volumes of interest.
  • the transfer may refer to copying the shape of a certain two-dimensional area/three-dimensional volume of an image to different positions of the same image or to different images, for example, copying the shape of one or more areas of a certain layer of an image to different positions of the same layer image, copying the shape of one or more regions of one layer image to the same position of another layer image, or copying the shape of one or more volumes of one three-dimensional image to the same position of another three-dimensional image, and so on.
  • the passing may refer to generating a two-dimensional area/three-dimensional volume of the same shape at a corresponding location of the image without changing the pixel value/voxel value information of the image.
  • the transfer unit 420 can transfer a particular two-dimensional region of interest/three-dimensional volume of interest between different layer images (i.e., data layers) of the same three-dimensional image (i.e., volume data).
  • the transfer unit 420 can deliver a particular two-dimensional region of interest/three-dimensional volume at a different data layer of the same volume data.
  • the transfer unit 420 may determine location information (e.g., three-dimensional coordinates (x, y, z), etc.) of one or more reference voxel points in the two-dimensional region of interest, and generate a two-dimensional region of interest based on this position information at a corresponding position of another data layer (e.g., three-dimensional coordinates (x, y, z'), etc.).
  • the reference voxel point may be any voxel point in the two-dimensional region of interest ROI 1, for example, a central voxel point, or a voxel point on the edge of ROI 1, etc.
  • the relative position of the reference voxel points within the two-dimensional region of interest may remain unchanged.
  • the positional information of the reference voxel points in the two-dimensional region/three-dimensional volume of interest may be used to represent positional information of the two-dimensional region/three-dimensional volume of interest.
  • the transfer unit 420 may transfer the two-dimensional region of interest/three-dimensional volume of interest among one or more data layers of the volume data corresponding to the transfer depth in one or more transfer directions (e.g., a first data layer z1 and a second data layer z2 in the z-axis direction, a first data layer y1 and a second data layer y2 in the y-axis direction, a first data layer x1 and a second data layer x2 in the x-axis direction, and the like).
  • the transfer direction may be perpendicular to the plane of the two-dimensional region of interest/three-dimensional volume of interest, or at an arbitrary inclination angle to that plane, for example, the positive and/or negative Z-axis direction of the volume data, the positive and/or negative Y-axis direction of the volume data, the positive and/or negative X-axis direction of the volume data, or any other direction in three-dimensional space.
  • the transfer unit 420 may transfer the two-dimensional region of interest ROI 1 in the XY plane to the first data layer z1 to obtain a two-dimensional region of interest ROI 1 ' whose reference voxel point in the first data layer z1 is located at the three-dimensional coordinates (x, y, z1).
  • the transfer unit 420 can transfer the two-dimensional region of interest ROI 1 in the XY plane to the second data layer z2 to obtain a two-dimensional region of interest ROI 1 '' whose reference voxel point in the second data layer z2 is located at the three-dimensional coordinates (x, y, z2), and so on.
  • the transfer unit 420 can transfer the two-dimensional region of interest ROI 2 in the XZ plane (reference voxel point position information (x, y, z)) to the first data layer y1 to obtain a two-dimensional region of interest ROI 2 ' whose reference voxel point in the first data layer y1 is located at the three-dimensional coordinates (x, y1, z).
  • similarly, the transfer unit 420 may transfer the two-dimensional region of interest ROI 2 in the XZ plane to the second data layer y2 to obtain a two-dimensional region of interest ROI 2 '' whose reference voxel point in the second data layer y2 is located at the three-dimensional coordinates (x, y2, z).
  • the transfer unit 420 may transfer the two-dimensional region of interest ROI 3 in the YZ plane (reference voxel point position information (x, y, z)) to the first data layer x1 to obtain a two-dimensional region of interest ROI 3 ' whose reference voxel point in the first data layer x1 is located at the three-dimensional coordinates (x1, y, z).
  • the transfer unit 420 may transfer the two-dimensional region of interest ROI 3 in the YZ plane (reference voxel point position information (x, y, z)) to the second data layer x2 to obtain a two-dimensional region of interest ROI 3 '' whose reference voxel point in the second data layer x2 is located at the three-dimensional coordinates (x2, y, z).
  • the positive Z-axis direction may refer to the direction toward the head along the head-to-tail direction of the detected object in the image; the negative Z-axis direction may refer to the direction toward the tail along the head-to-tail direction of the detected object in the image; the positive Y-axis direction may refer to the direction toward the front along the front-rear direction of the detected object in the image;
  • the negative Y-axis direction may refer to the direction toward the rear along the front-rear direction of the detected object in the image;
  • the positive X-axis direction may refer to the direction toward the right side along the left-right direction of the detected object in the image;
  • the negative X-axis direction may refer to the direction toward the left side along the left-right direction of the detected object in the image.
  • the transfer depth may be the extent of the volume data along the Z-axis, Y-axis, or X-axis direction.
  • one or more data layers may be determined based on the layer spacing of the volume data.
  • the layer spacing may be 0.1-20 mm or another suitable spacing.
  • the layer spacing may be defined for different axial directions, for example, the layer spacing along the X-axis, the layer spacing along the Y-axis, and the layer spacing along the Z-axis.
  • the layer spacings in different axial directions may be the same or different.
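As a simple numeric illustration of the relationship just described (the values below are hypothetical, not from the patent), the number of data layers along one axis follows from the transfer depth and the layer spacing:

```python
# Hypothetical values: a transfer depth of 30 mm along the Z-axis and a
# layer spacing of 1.5 mm yield 20 data layers in that direction.
depth_mm = 30.0
spacing_mm = 1.5
n_layers = int(depth_mm / spacing_mm)
print(n_layers)  # 20
```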
  • the transfer unit 420 may transfer a particular two-dimensional region/three-dimensional volume between different image data.
  • the transfer unit 420 can transfer a particular two-dimensional region of interest/three-dimensional volume of interest between different volume data. For example, for a three-dimensional volume of interest VOI 1 in one volume data, the transfer unit 420 can determine the three-dimensional position (x, y, z) of a reference voxel point in VOI 1; according to this three-dimensional position, the transfer unit 420 can generate a VOI 1 ' of the same shape at the same three-dimensional position (x, y, z) and its neighborhood in another volume data.
  • the reference voxel point may be any voxel point in the three-dimensional volume of interest VOI 1, for example, a central voxel point, or a voxel point on the surface of VOI 1, etc.
  • the positional information of the reference voxel point in the three-dimensional volume of interest may be used to represent positional information of the three-dimensional volume of interest.
  • the transfer unit 420 can transfer the two-dimensional ROI/three-dimensional VOI of interest between different data layers and/or between different volume data.
  • the transferred two-dimensional ROI/three-dimensional VOI may be drawn in different data layers and/or different volume data by the rendering unit 410.
  • the transfer unit 420 may transfer a two-dimensional ROI/three-dimensional VOI of an image processed by the rendering unit 410, the cropping unit 430, the transfer unit 420, and/or the merging unit 440.
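The transfer behavior described above can be sketched as follows. This is an illustrative assumption of how a transfer might be realized when ROIs/VOIs are stored as boolean masks aligned with the volume data (the function names are hypothetical): only the mask shape is copied to the corresponding position; the image voxel values themselves are untouched.

```python
import numpy as np

def transfer_roi_between_layers(mask_volume, src_z, dst_z):
    """Copy the 2-D ROI shape from data layer src_z to data layer dst_z
    of the same volume data; image voxel values are not modified."""
    out = mask_volume.copy()
    out[dst_z] = mask_volume[src_z]
    return out

def transfer_voi_between_volumes(src_voi_mask, dst_shape):
    """Generate a VOI of the same shape at the same (x, y, z) position in
    another volume, assuming both volumes share dimensions."""
    if src_voi_mask.shape != tuple(dst_shape):
        raise ValueError("volumes must share dimensions for a direct transfer")
    return src_voi_mask.copy()

# ROI 1 drawn on data layer z = 3, transferred to layers z1 = 5 and z2 = 7.
masks = np.zeros((10, 8, 8), dtype=bool)
masks[3, 2:5, 2:5] = True
masks = transfer_roi_between_layers(masks, 3, 5)
masks = transfer_roi_between_layers(masks, 3, 7)
```

Transfers in the Y-axis or X-axis directions would index the mask along the corresponding axis instead of the first one.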
  • the cropping unit 430 can crop the two-dimensional area of interest/three-dimensional volume based on the image data.
  • the cropping unit 430 may retain a partial two-dimensional area/three-dimensional volume of the two-dimensional ROI/three-dimensional VOI of interest in certain volume data or a data layer, and/or cut one or more specific two-dimensional areas/three-dimensional volumes out of the two-dimensional ROI/three-dimensional VOI of interest.
  • cropping unit 430 can implement a cropping function through operations of a set of pixels and/or a set of voxels.
  • the cropping unit 430 may crop, from within the two-dimensional ROI/three-dimensional VOI of interest, a specific two-dimensional area/three-dimensional volume not required by the user, to obtain a cropping result (i.e., the difference set between the two-dimensional ROI/three-dimensional VOI of interest and the specific two-dimensional area/three-dimensional volume to be cropped). For example, the cropping unit 430 may subtract the pixel/voxel set corresponding to the specific two-dimensional area/three-dimensional volume from the pixel/voxel set of the two-dimensional ROI/three-dimensional VOI of interest to achieve the cropping. In some embodiments, the cropping unit 430 can adjust the cropped two-dimensional region of interest/three-dimensional volume.
  • the cropping unit 430 can restore the erroneously cropped two-dimensional area/three-dimensional volume.
  • the cropping unit 430 can restore the two-dimensional ROI/three-dimensional VOI by combining the pixel/voxel set of the erroneously cropped two-dimensional area/three-dimensional volume.
  • the cropping unit 430 may perform cropping based on the image data processed by the rendering unit 410, the delivery unit 420, the cropping unit 430, and/or the merging unit 440.
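The set-difference cropping and the restoration of an erroneous crop can be illustrated with boolean voxel masks. This is a sketch under the assumption that ROIs/VOIs are stored as NumPy masks; the shapes below are hypothetical.

```python
import numpy as np

# A VOI of interest and a specific sub-volume the user wants cut out,
# both represented as boolean voxel sets.
voi = np.zeros((4, 8, 8), dtype=bool)
voi[:, 1:7, 1:7] = True
to_cut = np.zeros_like(voi)
to_cut[:, 3:5, 3:5] = True

# Cropping: the difference set (the VOI minus the voxels to be cut).
cropped = voi & ~to_cut

# Restoring an erroneous crop: union with the voxels that were removed.
restored = cropped | (voi & to_cut)
```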
  • the merging unit 440 can merge the two-dimensional regions of interest/three-dimensional volume based on the image data.
  • Merge unit 440 can combine two or more specific two-dimensional regions/three-dimensional volumes (e.g., two-dimensional ROI/three-dimensional VOI).
  • merging unit 440 can achieve merging of a particular two-dimensional region/three-dimensional volume by merging pixel/voxel sets.
  • the result of the merging may be a union of two or more pixel/voxel sets to be merged.
  • the merging unit 440 can combine at least two specific two-dimensional regions/three-dimensional volumes.
  • the merging unit 440 can combine at least two specific volumes.
  • the merging unit 440 can merge one or more specific two-dimensional regions/three-dimensional volumes, as well as one or more specific volumes. As an example, the merging unit 440 can combine at least two specific two-dimensional regions/three-dimensional volumes. For example, merging unit 440 can combine data of two different ROIs in the same data layer (e.g., a first two-dimensional region of interest ROI 1 and a second two-dimensional region of interest ROI 2). In some embodiments, the merging unit 440 can combine data of at least two connected or non-connected two-dimensional regions/three-dimensional volumes in the same data layer for further analysis. The merging unit 440 may perform merging based on the two-dimensional ROI/three-dimensional VOI processed by the rendering unit 410, the transfer unit 420, the merging unit 440, and/or the cropping unit 430.
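The union-based merging can be sketched the same way; representing the pixel sets as boolean masks is an assumption for illustration only.

```python
import numpy as np

# Two ROIs in the same data layer, connected or not, as boolean pixel sets.
roi1 = np.zeros((8, 8), dtype=bool)
roi1[1:4, 1:4] = True        # first two-dimensional region of interest
roi2 = np.zeros_like(roi1)
roi2[5:7, 5:7] = True        # second, non-connected region of interest

# Merging: the union of the two pixel sets.
merged = roi1 | roi2
```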
  • the analysis unit 450 may analyze information of the two-dimensional area/three-dimensional volume of interest based on the image data. Analysis unit 450 can analyze information for a particular two-dimensional region of interest/three-dimensional volume (e.g., a two-dimensional ROI/three-dimensional VOI). In some embodiments, analysis unit 450 can analyze feature information (e.g., statistical feature information) of the two-dimensional ROI/three-dimensional VOI. The statistical features may include variance, area, length, mean, maximum, minimum, volume, etc., or a combination thereof. In some embodiments, the feature information may include gray value feature information, such as a grayscale distribution, a grayscale average, and the like. Analysis unit 450 can display the analysis results at user interface 280 via input/output module 340.
  • the statistical features may include variance, area, length, mean, maximum, minimum, volume, etc., or a combination of several.
  • the feature information may include gray value feature information, such as a grayscale distribution, a grayscale average, and the like.
  • the analysis unit 450 may analyze the image, the two-dimensional ROI/three-dimensional VOI, and the like processed by the rendering unit 410, the delivery unit 420, the cropping unit 430, and/or the merging unit 440. In some embodiments, the analysis unit 450 can analyze whether a particular two-dimensional region/three-dimensional volume requires further processing. For example, the analysis unit 450 can determine via user interface 280 whether to continue drawing, cropping, transferring, and/or merging the two-dimensional ROI/three-dimensional VOI.
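  • the statistical feature analysis attributed to the analysis unit 450 can be sketched as follows. This is a hedged illustration of computing area (pixel count), mean, variance, minimum, and maximum over the gray values inside an ROI; the gray values are hypothetical.

```python
# Sketch of ROI statistical feature analysis: area, mean, variance,
# minimum, and maximum of the gray values inside a region of interest.

def analyze_roi(gray_values):
    n = len(gray_values)
    mean = sum(gray_values) / n
    variance = sum((v - mean) ** 2 for v in gray_values) / n
    return {
        "area": n,            # number of pixels in the ROI
        "mean": mean,         # grayscale average
        "variance": variance,
        "min": min(gray_values),
        "max": max(gray_values),
    }

features = analyze_roi([10, 20, 30, 40])  # hypothetical gray values
assert features["area"] == 4
assert features["mean"] == 25.0
assert features["variance"] == 125.0
assert (features["min"], features["max"]) == (10, 40)
```

  • for a three-dimensional VOI, the same computation would run over voxel values, with "volume" taking the place of "area".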
  • the above description of the processing module 310 in the image processing server 120 is merely exemplary and is not intended to limit the present application to the scope of the enumerated embodiments. It will be understood that, for those skilled in the art, after understanding the functions performed by the processing module, any combination of the modules, units, or sub-units may be made, and various corrections and changes to the configuration of the processing module 310 may be performed, while still implementing the above functions. However, these modifications and changes are still within the scope of the above description.
  • the processing module 310 can also include a separate image unit used to implement processing of the image data. The separate image unit may be independent of the rendering unit 410.
  • the image unit may implement the functions of the rendering unit 410, the delivery unit 420, the cropping unit 430, the merging unit 440, and/or the analysis unit 450.
  • some units are not required, for example, the rendering unit 410.
  • processing module 310 can include other units or sub-units. Variations such as these are within the scope of the present application.
  • FIG. 5 is an exemplary flow diagram of processing an image, shown in accordance with some embodiments of the present application.
  • Process 500 can be implemented by processing engine 122.
  • in 501, image data can be acquired.
  • 501 can be implemented by input/output module 340.
  • the image data may be obtained by imaging system 110 scanning a target object or a portion thereof.
  • the image data may be obtained from an internal storage device (eg, database 140 and/or storage module 330, etc.).
  • the image data may be obtained from an external storage device (eg, a network storage device, a cloud disk, a mobile hard disk, etc., or a combination of several).
  • the image data may include an image matrix, image information, image vectors, bitmaps, animations, image encodings, primitives, segments, etc., or a combination of several.
  • the image data can be medical image data.
  • the medical image data can be obtained by one or more scanners.
  • the scanner may include a magnetic resonance imaging (MRI) scanner, a computed tomography (CT) scanner, a positron emission tomography (PET) scanner, a single photon emission computed tomography (SPECT) scanner, a computed tomography colonography (CTC) scanner, etc., or a combination thereof.
  • the image data may be data obtained by scanning one or more targets such as an organ, a body, an object, a dysfunction, a tumor, or the like.
  • the image data may be data obtained by scanning one or more targets such as a head, a chest, an organ, a bone, a blood vessel, a colon, or the like.
  • the image data can be two-dimensional data and/or three-dimensional data.
  • the image data may be composed of a plurality of two-dimensional pixels or three-dimensional voxels.
  • a value in the image data may correspond to one or more properties of the pixel or voxel, such as grayscale, brightness, color, absorbance of X-rays or gamma rays, hydrogen atom density, biomolecular metabolism, receptor and neurotransmitter activity, etc.
  • in 502, at least one region of interest (a two-dimensional region of interest/a three-dimensional volume of interest) can be determined based on the image data acquired in 501.
  • 502 can be implemented by rendering unit 410, delivery unit 420, cropping unit 430, merging unit 440, or any combination of one or more of the above-described units in processing module 310.
  • the determining of the two-dimensional region/three-dimensional volume of interest may include drawing the two-dimensional region/three-dimensional volume of interest, transferring the two-dimensional region/three-dimensional volume of interest, cropping the two-dimensional region/three-dimensional volume of interest, and/or merging the two-dimensional regions/three-dimensional volumes of interest, or the like.
  • the region of interest may include a two-dimensional region of interest (ROI) and/or a three-dimensional volume of interest (VOI).
  • the two-dimensional ROI may be specific regions of different sizes, and/or different shapes.
  • the region of interest may be an area outlined by a circle, an ellipse, a box, an irregular polygon, or the like.
  • the region of interest may outline a particular region by a rectangle.
  • the particular area may be an area that requires further processing.
  • the drawing of the region of interest may be to outline a particular region within a volume of data.
  • processing module 310 can determine one or more three-dimensional VOIs.
  • the passing of the region of interest may be the transfer of the region of interest of one data layer to other data layers within the same volume data.
  • the passing of the region of interest may be to transfer the region of interest within one volume of data to other volume data.
  • the cropping of the region of interest may be to remove a particular region within a region of interest.
  • the merging of the regions of interest may be the merging of two or more regions of interest.
  • the merging of the two regions of interest may be merging image data of the two regions of interest.
  • the merged region of interest may be a collection of pixels that combine the two regions of interest.
  • in 503, image data related to at least one of the two-dimensional regions/three-dimensional volumes of interest determined in 502 can be analyzed.
  • 503 can be implemented by analysis unit 450 in processing module 310.
  • the analyzing the two-dimensional region/three-dimensional volume-related image data of interest may include analyzing feature information of the two-dimensional region/three-dimensional volume of interest.
  • the feature information may be a statistical feature.
  • the statistical features can include variance, area, length, mean, maximum, minimum, volume, etc., or a combination of several.
  • the analyzing the two-dimensional region/three-dimensional volume-dependent image data of interest may include determining whether a tissue image is within the two-dimensional region/three-dimensional volume of interest.
  • the image data may correspond to a tissue, organ, and/or related content (eg, colon, small intestine, lung, or air, liquid, etc. therein).
  • the particular condition may include an area reaching a certain threshold, and/or a volume reaching a certain threshold, and the like.
  • the image data may be further analyzed and/or processed when the associated image data meets certain conditions.
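  • the threshold condition described above can be sketched as a simple gate before further processing. The threshold value and function name here are hypothetical illustrations, not values from the patent.

```python
# Sketch of "further analysis only when a condition is met": the ROI
# must reach a hypothetical minimum area before further processing.

MIN_AREA = 5  # hypothetical area threshold, in pixels

def needs_further_processing(roi_pixels, min_area=MIN_AREA):
    """Return True when the ROI's area reaches the threshold."""
    return len(roi_pixels) >= min_area

small_roi = {(0, 0), (0, 1)}                         # area 2
large_roi = {(r, c) for r in range(3) for c in range(3)}  # area 9

assert not needs_further_processing(small_roi)
assert needs_further_processing(large_roi)
```

  • a volume threshold for a three-dimensional VOI would gate on the voxel count in the same way.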
  • in 504, the results of the analysis in 503 can be displayed.
  • 504 can be implemented by input/output module 340.
  • the analysis results may be displayed in user interface 280 via input/output module 340 and/or input/output component 260.
  • the displaying of the analysis result may be outputting the image data related to the two-dimensional region/three-dimensional volume of interest obtained by the analysis in 503.
  • the output of the image data may include transmitting the analyzed image data to other modules of the image processing system 100.
  • the input/output module 340 can transmit the analyzed image data directly to and/or via the network 130 to the imaging system 110, and/or the database 140 at 504.
  • the output of the image data can include displaying the analyzed image data through the imaging system 110 and/or the image processing server 120.
  • 504 can send the analysis results to a module or device external to the system.
  • the transmission of the image data by the input/output module 340 may be wireless, wired, or a combination of both.
  • the analysis results may be sent to a module or device outside the system through the communication module 320 of the processing engine 122 in the image processing server 120.
  • 504 can further store the analysis results into storage module 330 and/or database 140.
  • 504 can display the image data acquired in 501, the two-dimensional region of interest/three-dimensional volume determined in 502, or other information related to the intermediate state of the process.
  • the above description of process 500 is merely exemplary and is not intended to limit the scope of the embodiments. It will be understood that, for those skilled in the art, after understanding the operations performed by the process 500, any combination of the operations may be made, and various modifications and changes to the operations of the process may be performed. However, these modifications and changes are still within the scope of the above description.
  • process 500 may not perform partial operations. As an example, operation 503 and/or operation 504 may not be performed.
  • the process 500 can include other operations, such as processing image data of a two-dimensional region/three-dimensional volume of interest. Variations such as these are within the scope of the present application.
  • FIG. 6 is an exemplary flow diagram for determining a region of interest (a two-dimensional region of interest/three-dimensional volume), in accordance with some embodiments of the present application.
  • the process 600 can be implemented by the processing module 310 in the image processing server 120.
  • in 601, one or more regions of interest can be drawn.
  • 601 can be implemented by rendering unit 410 in processing module 310.
  • 601 can draw one or more two-dimensional regions of interest/three-dimensional volume based on image data acquired in 501.
  • the image data may include a medical image.
  • the medical image may include a magnetic resonance imaging (MRI) image, a computed tomography (CT) image, a positron emission tomography (PET) image, a single photon emission computed tomography (SPECT) image, a computed tomography colonography (CTC) image, and the like.
  • a two-dimensional region/three-dimensional volume of interest can be drawn in image data conforming to the Digital Imaging and Communications in Medicine (DICOM) standard.
  • the rendering of the two-dimensional region of interest/three-dimensional volume in 601 can be automated, manual, and/or a combination of both.
  • the automatic rendering may refer to the system automatically outlining a two-dimensional region/three-dimensional volume of a specific shape at a specific position of the image data.
  • the particular location may be determined by the system and/or selected by the user.
  • the particular shape two-dimensional area/three-dimensional volume may be determined by the system and/or selected by the user.
  • the rendering unit 410 may draw a correspondingly shaped two-dimensional region/three-dimensional volume of interest at a corresponding location in the image data based on the position and/or shape of one or more two-dimensional regions/three-dimensional volumes of interest that have already been drawn.
  • the automatic rendering may refer to the rendering unit 410 segmenting the image data using one or more algorithms to extract the two-dimensional region/three-dimensional volume of interest.
  • the algorithm may include an image segmentation algorithm, for example, a grayscale threshold segmentation method, a region growing and split-merge method, an edge segmentation method, a histogram method, a fuzzy theory segmentation method (e.g., a fuzzy threshold segmentation method, a fuzzy connectivity segmentation method, a fuzzy clustering segmentation method, etc.), a neural-network-based segmentation method, a mathematical morphology segmentation method (for example, a morphological watershed algorithm, etc.), or a combination thereof.
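  • the simplest of the listed algorithms, grayscale threshold segmentation, can be sketched as follows. The image values and threshold are hypothetical; real automatic extraction would use one of the more sophisticated methods named above.

```python
# Sketch of automatic ROI extraction by grayscale threshold
# segmentation: keep every pixel whose gray value reaches a threshold.

def threshold_segment(image, threshold):
    """Return the set of (row, col) pixels with gray value >= threshold."""
    return {
        (r, c)
        for r, row in enumerate(image)
        for c, value in enumerate(row)
        if value >= threshold
    }

image = [                 # hypothetical grayscale image
    [10, 20, 200],
    [30, 210, 220],
    [40, 50, 60],
]

roi = threshold_segment(image, threshold=200)
assert roi == {(0, 2), (1, 1), (1, 2)}
```

  • the resulting pixel set can then feed the set-based cropping and merging operations described elsewhere in this section.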
  • the manual rendering may refer to a user manually outlining a two-dimensional region/three-dimensional volume of interest of a particular shape at a particular location of the image data.
  • the particular shape may include a regular shape and/or an irregular shape.
  • the drawing of a two-dimensional region/three-dimensional volume of interest may include drawing a two-dimensional region/three-dimensional volume of a specific shape, and may also include performing different operations on the two-dimensional region/three-dimensional volume of interest and/or the boundary of the two-dimensional region/three-dimensional volume of interest.
  • Different operations may include modifying the rendered two-dimensional area of interest/three-dimensional volume.
  • the modifications may include stretching, dragging, erasing, thickening, adding colors, and the like.
  • the user can modify the two-dimensional region/three-dimensional volume of interest that has been drawn, for example, by adding color to its boundary.
  • the rendering of a two-dimensional region of interest/three-dimensional volume may be to re-draw a particular shape of the two-dimensional region of interest/three-dimensional volume within the existing two-dimensional region of interest/three-dimensional volume.
  • the existing two-dimensional area of interest/three-dimensional volume may be a two-dimensional area of interest/three-dimensional volume obtained by drawing, cropping, transferring, and/or merging.
  • 601 may draw one or more two-dimensional regions/three-dimensional volumes of interest based on the two-dimensional region/three-dimensional volume of interest cropped in 602, passed in 603, and/or merged in 604.
  • 601 can draw another 2D region of interest/3D volume within the drawn 2D region/3D volume of interest.
  • the two-dimensional region of interest/three-dimensional volume drawn in 601 can be subjected to subsequent rendering, cropping, passing, and/or merging.
  • in 602, one or more two-dimensional regions/three-dimensional volumes of interest can be cropped.
  • 602 can be implemented by cropping unit 430 in processing module 310.
  • the cropping of a two-dimensional region of interest/three-dimensional volume may be to remove a particular two-dimensional region of interest/three-dimensional volume within a two-dimensional region of interest/three-dimensional volume.
  • the particular two-dimensional region of interest/three-dimensional volume that needs to be cropped can be adjusted in 602.
  • the adjusting may include restoring the removed two-dimensional region/three-dimensional volume of interest, reducing or expanding the specific two-dimensional region/three-dimensional volume of interest to be cropped, and changing the specific two-dimensional region/three-dimensional volume of interest to be cropped, or the like.
  • the cropping of a two-dimensional region of interest/three-dimensional volume may be accomplished by an operation of a set of pixels and/or a set of voxels.
  • the cropping of the two-dimensional region/three-dimensional volume of interest may be achieved by taking the difference set between the pixel/voxel set of the two-dimensional region/three-dimensional volume of interest and the pixel/voxel set of the specific two-dimensional region/three-dimensional volume to be cropped inside it.
  • the cropping a two-dimensional region of interest/three-dimensional volume may be cropped within an existing two-dimensional region of interest/three-dimensional volume.
  • the existing two-dimensional area of interest/three-dimensional volume may be a two-dimensional area of interest/three-dimensional volume obtained by drawing, cropping, transferring, and/or merging.
  • 602 may crop the two-dimensional region/three-dimensional volume of interest based on the two-dimensional region/three-dimensional volume of interest drawn in 601, passed in 603, and/or merged in 604. As another example, 602 can crop within an already cropped two-dimensional region/three-dimensional volume of interest.
  • the cropped 2D region/3D volume of interest may be subjected to subsequent rendering, cropping, passing, and/or merging.
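  • the set-difference cropping described above can be sketched in a few lines; the coordinates are hypothetical. Note that merging the cropped result back with the removed region recovers the original ROI, which is the restoration behavior described later for process 700.

```python
# Sketch of cropping as a set difference: removing a specific inner
# region from an ROI modeled as a set of (row, col) pixels.

def crop_region(roi_pixels, pixels_to_remove):
    """Crop by taking the difference set of the two pixel sets."""
    return roi_pixels - pixels_to_remove

first_roi = {(r, c) for r in range(3) for c in range(3)}  # 3x3 block
second_roi = {(1, 1)}                                     # inner region

third_roi = crop_region(first_roi, second_roi)
assert (1, 1) not in third_roi
assert len(third_roi) == 8
# Restoring: the union of the cropped result and the removed region
# yields the ROI as it was before cropping.
assert third_roi | second_roi == first_roi
```

  • voxel sets of three-dimensional volumes crop the same way with (layer, row, col) tuples.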
  • in 603, one or more two-dimensional regions/three-dimensional volumes of interest can be delivered.
  • 603 can be implemented by the delivery unit 420 in the processing module 310.
  • the transfer of a two-dimensional region/three-dimensional volume of interest may be transferring the location information and/or shape information of the two-dimensional region/three-dimensional volume of interest.
  • the location information may include a two-dimensional location, a three-dimensional location, and the like.
  • the shape information may include a regular shape, an irregular shape, and the like.
  • the transfer of the two-dimensional region/three-dimensional volume of interest may be performed between different volume data or within the same volume data.
  • the transferring the two-dimensional region of interest/three-dimensional volume may be between different data layers of the same volume data according to the two-dimensional position of the two-dimensional region/three-dimensional volume of interest.
  • the transmitting the two-dimensional region of interest/three-dimensional volume may be between different volume data according to the three-dimensional position of the two-dimensional region/three-dimensional volume of interest.
  • the transmitting of the two-dimensional region/three-dimensional volume of interest may be copying the shape information of an existing two-dimensional region/three-dimensional volume of interest to the image data to be delivered according to its position information.
  • the transmitted two-dimensional region of interest/three-dimensional volume may have the same two-dimensional position and/or shape as the two-dimensional region of interest/three-dimensional volume being delivered.
  • the transmitted two-dimensional region of interest/three-dimensional volume may have the same three-dimensional position and/or shape as the two-dimensional region of interest/three-dimensional volume being delivered.
  • the voxel value/pixel value of the volume data and/or data layer may not be changed in 603.
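  • the transfer operation described above can be sketched as copying an ROI's position/shape from one data layer to others while leaving the image's voxel values untouched. The layer indices, coordinates, and function name are hypothetical.

```python
# Sketch of "transferring" an ROI: the position/shape drawn on one
# data layer is copied to other layers of the same volume data; the
# voxel values themselves are never modified.

def transfer_roi(roi_by_layer, source_layer, target_layers):
    """Copy the ROI shape of source_layer onto each target layer."""
    shape = set(roi_by_layer[source_layer])  # 2D position/shape info
    for layer in target_layers:
        roi_by_layer[layer] = set(shape)     # same position and shape
    return roi_by_layer

rois = {0: {(5, 5), (5, 6), (6, 5)}}         # ROI drawn on layer 0
rois = transfer_roi(rois, source_layer=0, target_layers=[1, 2])

assert rois[1] == rois[0] and rois[2] == rois[0]
assert rois[1] is not rois[0]  # independent copies of the shape
```

  • transfer between different volume data would copy the same shape keyed by a three-dimensional position instead of a layer index.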
  • the transferring a two-dimensional region of interest/three-dimensional volume may be to transfer an existing two-dimensional region of interest/three-dimensional volume.
  • the existing two-dimensional area of interest/three-dimensional volume may be a two-dimensional area of interest/three-dimensional volume obtained by drawing, cropping, transferring, and/or merging.
  • 603 may pass the two-dimensional region/three-dimensional volume of interest based on the two-dimensional region/three-dimensional volume of interest drawn in 601, cropped in 602, and/or merged in 604. The passed two-dimensional region/three-dimensional volume of interest may be subjected to subsequent drawing, cropping, passing, and/or merging.
  • in 604, at least two two-dimensional regions/three-dimensional volumes of interest can be combined.
  • 604 can be implemented by a merging unit 440 in processing module 310.
  • the combining of the at least two two-dimensional regions/three-dimensional volumes of interest may be combining the data of at least two two-dimensional regions/three-dimensional volumes of interest, the data including feature information.
  • the feature information may include statistical features and the like.
  • the statistical features can include variance, area, length, mean, maximum, minimum, volume, and the like.
  • the combining at least two two-dimensional regions of interest/three-dimensional volume may be implemented by operations of a set of pixels and/or a set of voxels.
  • the combining of at least two two-dimensional regions/three-dimensional volumes of interest may refer to taking the union of the pixel set of one two-dimensional region/three-dimensional volume of interest and the pixel set of another two-dimensional region/three-dimensional volume of interest.
  • the merging of the two-dimensional region of interest/three-dimensional volume may be merging the existing two-dimensional region of interest/three-dimensional volume.
  • the existing two-dimensional area of interest/three-dimensional volume may be a two-dimensional area of interest/three-dimensional volume obtained by drawing, cropping, transferring, and/or merging.
  • 604 may merge two or more of the two-dimensional regions/three-dimensional volumes of interest drawn in 601, cropped in 602, and/or passed in 603.
  • a cropped 2D region of interest/3D volume and a merged 2D region/3D volume of interest can be combined in 604.
  • 604 can combine a transmitted two-dimensional region/three-dimensional volume of interest and a drawn two-dimensional region/three-dimensional volume of interest.
  • the merged two-dimensional region of interest/three-dimensional volume may be subjected to subsequent rendering, cropping, passing, and/or merging.
  • process 600 may not perform a portion of the operations.
  • process 600 may not perform operation 602 and/or operation 603.
  • the sequence of operations performed by process 600 can be interchanged.
  • the process 600 may first perform operation 603 and then perform operation 602.
  • process 600 can repeat a portion of the operations.
  • the process 600 can perform operation 602 and/or operation 603 after operation 604.
  • the two-dimensional region of interest drawn in 601, cropped in 602, passed in 603, and/or merged in 604 may be replaced with the three-dimensional volume of interest.
  • the drawing, cropping, transferring, merging, and the like of the two-dimensional region/three-dimensional volume of interest can refer to the processing in the process 600. Variations such as these are within the scope of the present application.
  • FIG. 7A is an exemplary flow diagram of cropping a region of interest (a two-dimensional region of interest/three-dimensional volume), in accordance with some embodiments of the present application.
  • the process 700 can be implemented by the cropping unit 430 in the processing module 310.
  • Process 700 can be an exemplary implementation of 602 in process 600.
  • the cropping of the two-dimensional region of interest/three-dimensional volume may be achieved by an operation of a set of pixels of a two-dimensional region/three-dimensional volume of interest and/or a set of voxels.
  • in 701, a first two-dimensional region/three-dimensional volume of interest can be acquired.
  • 701 can be implemented by input/output module 340 of processing engine 122, and/or storage module 330.
  • the acquired first two-dimensional region/three-dimensional volume of interest may include a drawn two-dimensional region/three-dimensional volume of interest, a transferred two-dimensional region/three-dimensional volume of interest, a merged two-dimensional region/three-dimensional volume of interest, and/or a cropped two-dimensional region/three-dimensional volume of interest, etc.
  • the acquiring the first two-dimensional region of interest/three-dimensional volume may be drawn by the rendering unit 410 in the processing module 310.
  • the first two-dimensional area of interest/three-dimensional volume may be a connected domain or a non-connected domain.
  • FIG. 7B is an exemplary schematic diagram of a region of interest (two-dimensional region of interest/three-dimensional volume) shown in accordance with some embodiments of the present application.
  • the two-dimensional region/three-dimensional volume of interest 720 may be an irregular connected or non-connected two-dimensional region/three-dimensional volume.
  • the user interface 280 can display the two-dimensional region/three-dimensional volume 720 of interest and/or its feature information, such as statistical features.
  • in 702, a second two-dimensional region/three-dimensional volume of interest can be drawn.
  • 702 can be implemented by rendering unit 410 in processing module 310.
  • the second region of interest/three-dimensional volume of interest may be drawn within the first two-dimensional region of interest/three-dimensional volume.
  • the second two-dimensional region/three-dimensional volume of interest can be drawn partially inside and partially outside the first two-dimensional region/three-dimensional volume of interest.
  • the second two-dimensional region of interest/three-dimensional volume may be a partial two-dimensional region/three-dimensional volume inside the first two-dimensional region of interest/three-dimensional volume.
  • the second two-dimensional region/three-dimensional volume of interest may be a partial two-dimensional region/three-dimensional volume inside the first two-dimensional region/three-dimensional volume of interest that the user does not need to analyze.
  • a portion of the second two-dimensional region/three-dimensional volume of interest may be inside the first two-dimensional region/three-dimensional volume of interest, and another portion may be outside the first two-dimensional region/three-dimensional volume of interest; that is, the second two-dimensional region/three-dimensional volume of interest and the first two-dimensional region/three-dimensional volume of interest may have one or more intersections.
  • the second two-dimensional region of interest/three-dimensional volume may be a hollow structure inside the tissue.
  • the void structure may be a necrotic area inside an annular tumor, a necrotic area of an annular ischemic lesion inside the brain, and/or a hollow structure of the trachea.
  • in 702, the shape and/or size of the second two-dimensional region/three-dimensional volume of interest may be further modified, for example, enlarged and/or reduced by adjusting the boundary of the second two-dimensional region/three-dimensional volume of interest, or the like.
  • the adjusting of the boundary may include dragging the boundary, resizing the boundary, or the like.
  • in 703, the second two-dimensional region/three-dimensional volume of interest can be cropped from the first two-dimensional region/three-dimensional volume of interest to obtain a third two-dimensional region/three-dimensional volume of interest.
  • 703 can be implemented by cropping unit 430 in processing module 310.
  • the cropping the second two-dimensional region of interest/three-dimensional volume may be removing the second two-dimensional region of interest/three-dimensional volume within the first two-dimensional region of interest/three-dimensional volume.
  • the acquired third two-dimensional region/three-dimensional volume of interest may be the two-dimensional region/three-dimensional volume remaining after the second two-dimensional region/three-dimensional volume of interest is removed from the first two-dimensional region/three-dimensional volume of interest.
  • if the second two-dimensional region/three-dimensional volume of interest is located inside the first two-dimensional region/three-dimensional volume of interest, then, in 703, the difference set of the first two-dimensional region/three-dimensional volume of interest and the second two-dimensional region/three-dimensional volume of interest may be taken to obtain the third two-dimensional region/three-dimensional volume of interest.
  • if the second two-dimensional region/three-dimensional volume of interest and the first two-dimensional region/three-dimensional volume of interest have one or more intersections, then, in 703, the difference set of the first two-dimensional region/three-dimensional volume of interest and the intersection(s) may be taken to obtain the third two-dimensional region/three-dimensional volume of interest.
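  • the intersecting case described above can be sketched as follows: when the second region lies only partly inside the first, the third region is the difference set of the first region and their intersection. Coordinates are illustrative.

```python
# Sketch of cropping when the second region is only partly inside the
# first: subtract the intersection, not the whole second region.

first_roi = {(0, 0), (0, 1), (1, 0), (1, 1)}   # hypothetical first ROI
second_roi = {(1, 1), (1, 2), (2, 2)}          # partly outside first_roi

intersection = first_roi & second_roi
third_roi = first_roi - intersection

assert intersection == {(1, 1)}
assert third_roi == {(0, 0), (0, 1), (1, 0)}
# Pixels of the second region lying outside the first are unaffected.
assert (2, 2) not in third_roi and (2, 2) in second_roi
```

  • since set difference ignores elements absent from the left operand, `first_roi - second_roi` gives the same result; subtracting the intersection makes the described semantics explicit.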
  • the third two-dimensional region/three-dimensional volume of interest may be a ring-shaped tissue structure.
  • the annular structure may be a ring-shaped tumor that does not contain internal necrotic tissue, and/or an annular ischemic lesion that does not contain necrosis within the brain.
  • FIG. 7C is an exemplary schematic diagram of a cropped region of interest (two-dimensional region of interest/three-dimensional volume), shown in accordance with some embodiments of the present application.
  • the two-dimensional area of interest/three-dimensional volume 730 can be a cropped connected domain or a non-connected domain.
  • the second two-dimensional region of interest/three-dimensional volume to be cropped may be one of a two-dimensional region of interest/three-dimensional volume 731, a two-dimensional region of interest/three-dimensional volume 732, a two-dimensional region of interest/three-dimensional volume 733, and the like. Or multiple.
  • the first two-dimensional region of interest/three-dimensional volume may be a two-dimensional region of interest/three-dimensional volume 730
  • the second two-dimensional region of interest/three-dimensional volume may be a two-dimensional region of interest/three-dimensional volume 731
  • the third two-dimensional region/three-dimensional volume of interest may be the two-dimensional region/three-dimensional volume obtained after the two-dimensional region/three-dimensional volume of interest 731 is removed from the two-dimensional region/three-dimensional volume of interest 730.
  • if the user needs to restore the two-dimensional region/three-dimensional volume of interest to its state before cropping, and/or the second two-dimensional region/three-dimensional volume of interest was incorrectly cropped, 703 may restore the cropped second two-dimensional region/three-dimensional volume of interest.
  • the cropping operation of 703 may be undone, or the cropped two-dimensional region/three-dimensional volume of interest and the removed two-dimensional region/three-dimensional volume may be merged to obtain the two-dimensional region/three-dimensional volume of interest before cropping.
  • the pixel set of the first two-dimensional region/three-dimensional volume of interest may be recovered by merging the pixel set of the third two-dimensional region/three-dimensional volume of interest and the pixel set of the second two-dimensional region/three-dimensional volume of interest.
  • in 704, it can be determined whether to continue cropping.
  • 704 can be implemented by analysis unit 450 in processing module 310.
  • the determining whether to continue cropping may be performed according to one or more criteria.
  • the criteria may be stored in a storage device (eg, storage module 330, database 140, etc.) such that analysis unit 450 invokes relevant information for automatic determination.
  • the criteria may be based on the user's experience so that the user makes a manual determination while viewing the third 2D region of interest/3D volume.
  • the criteria may include whether there are other two-dimensional regions/three-dimensional volumes that do not require analysis within the first two-dimensional region of interest/three-dimensional volume. As an example, the determination may be based on whether there are other areas of necrosis within the tissue.
  • if it is determined not to continue cropping, the process 700 can end the cropping.
  • in 705, feature information of the third two-dimensional region/three-dimensional volume of interest can be analyzed.
  • 705 can be implemented by analysis unit 450 in processing module 310.
  • operation 704 may not be performed, and 705 is performed directly after 703.
  • the feature information may include statistical feature information.
  • the two-dimensional region of interest/three-dimensional volume processed by the process 700 may be the two-dimensional region of interest/three-dimensional volume 730 from which the two-dimensional region of interest/three-dimensional volume 731 has been cropped.
  • the two-dimensional area of interest/three-dimensional volume 731 can be a regular shape, such as an ellipse.
  • the process 700 can continue to crop.
  • Flow 700 may return to 702 to draw other two-dimensional regions of interest/three-dimensional volume to be cropped within the first two-dimensional region of interest/three-dimensional volume.
  • the 2D region/3D volume of interest 732 and/or the 2D region/3D volume of interest 733 can further be cropped from the 2D region/3D volume of interest 730.
  • the two-dimensional area of interest/three-dimensional volume 732 can be a particular shape, such as a rounded rectangle.
  • the two-dimensional area of interest/three-dimensional volume 733 can be an irregular shape.
  • the user interface 280 can display the cropped feature information of the 2D region/3D volume of interest, eg, the cropped statistics.
  • process 700 may not perform a portion of operations (eg, operation 704).
  • the process 700 can crop another region of interest from one region of interest, crop a two-dimensional region of interest from a three-dimensional volume of interest, crop another three-dimensional volume of interest from a three-dimensional volume of interest, or crop a three-dimensional volume of interest from a two-dimensional region of interest (specifically, the intersection of the three-dimensional volume of interest and the two-dimensional region of interest). Variations such as these are within the scope of the present application.
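The cropping operation itself can be sketched as a set difference on boolean masks, where a 2D region or a whole data layer is removed from a 3D volume of interest. A minimal sketch under assumed names; numpy and the (Z, Y, X) layout are illustrative choices:

```python
import numpy as np

def crop_roi(source_mask: np.ndarray, to_crop_mask: np.ndarray) -> np.ndarray:
    """Crop `to_crop_mask` out of `source_mask` (set difference).
    Only the overlap with the source is actually removed."""
    return np.logical_and(source_mask, np.logical_not(to_crop_mask))

# Hypothetical 3-layer volume of interest: a 2x2 column through all layers.
vol = np.zeros((3, 4, 4), dtype=bool)
vol[:, 1:3, 1:3] = True
# Crop away everything in the middle data layer (a 2-D region spanning layer 1).
plane = np.zeros((3, 4, 4), dtype=bool)
plane[1] = True
remaining = crop_roi(vol, plane)  # layers 0 and 2 untouched, layer 1 emptied
```

The same function covers all four cases listed above, since 2D regions and 3D volumes are both represented as voxel masks over the volume data.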
  • FIG. 8A is an exemplary flow diagram of communicating a two-dimensional region of interest, in accordance with some embodiments of the present application.
  • the process 800 can be implemented by the delivery unit 420 in the processing module 310.
  • Process 800 can be an exemplary implementation of 603 in process 600.
  • a first two-dimensional region of interest of the first data layer can be acquired.
  • 801 can be implemented by input/output module 340 in processing engine 122.
  • 801 can read data related to the first two-dimensional region of interest from storage module 330.
  • the user interface 280 can retrieve a two-dimensional region of interest selected by the user that needs to be communicated.
  • 801 can draw a first two-dimensional region of interest through rendering unit 410.
  • FIG. 8B is an exemplary schematic diagram of a two-dimensional region of interest, shown in accordance with some embodiments of the present application.
  • volume data 820 can include a data layer 821.
  • a two-dimensional region of interest 821-1 may be included in the data layer 821. It should be noted that the first two-dimensional region of interest acquired at 801 may be the same as or different from the first two-dimensional region of interest acquired by 701 in FIG. 7A.
  • the first two-dimensional region of interest may be a two-dimensional region of interest, or a collection of two-dimensional regions of interest.
  • the first data layer is merely for convenience of description, and does not necessarily mean that the data layer is located in the first layer of the axial direction of the volume data.
  • the first data layer may be located in an XY plane, an XZ plane, a YZ plane, or a plane that forms an arbitrary oblique angle with the plane.
  • a first delivery depth of the first direction of transmission of the first data layer can be obtained.
  • 802 can be implemented by input/output module 340 in processing engine 122.
  • the first transfer direction may be perpendicular to the first data layer, along its positive and/or negative direction.
  • the first data layer may be located in an XY plane, an XZ plane, a YZ plane, or a plane that forms an arbitrary oblique angle with the plane.
  • the first transfer direction may be the positive or negative direction of the Z-axis, the positive or negative direction of the Y-axis, the positive or negative direction of the X-axis, or the positive or negative direction of the axis perpendicular to a plane at any of the above-described oblique angles.
  • the first transfer direction may be a direction that forms an arbitrary oblique angle with a plane in which the first data layer is located.
  • the input/output module 340 can obtain a transfer direction and/or a corresponding transfer depth of the user settings through the user interface 280.
  • the user can click with the mouse on a side of the data layer 821, in the window displaying the volume data 820 through the user interface 280, to select a delivery direction.
  • the user can enter a value through the user interface 280 or drag a value slider to set the corresponding delivery depth.
  • FIG. 8C is an exemplary schematic diagram of a two-dimensional region of interest after delivery, in accordance with some embodiments of the present application.
  • the first transfer depth may be a transfer depth H1 in the positive direction perpendicular to the data layer 821.
  • a second delivery depth of the second transfer direction of the first data layer can be obtained.
  • 803 can be implemented by input/output module 340 in processing engine 122.
  • the second transfer direction may be perpendicular to the first data layer, along its positive and/or negative direction.
  • the second transfer direction may be opposite to the first transfer direction.
  • the second transfer direction may be a direction that forms an arbitrary oblique angle with a plane in which the first data layer is located.
  • the second transfer depth may be a transfer depth H2 in the negative direction perpendicular to the data layer 821.
  • the input/output module 340 can obtain another delivery direction and/or a corresponding delivery depth set by the user via the user interface 280.
  • the second delivery depth set by the user in the second transfer direction may be similar to 802.
  • operation 802 and operation 803 can be combined into one operation, for example, input/output module 340 can obtain two delivery directions and/or two delivery depths set by the user through user interface 280.
  • alternatively, only one of operations 802 and 803 may be performed.
  • the input/output module 340 can obtain, through the user interface 280, a delivery direction and/or a corresponding delivery depth set by the user.
  • one or more data layers in the volume data corresponding to the first delivery depth and the second delivery depth may be determined.
  • 804 can be implemented by analysis unit 450 in processing module 310.
  • the volume data corresponding to the first delivery depth H1 may include a data layer 831 and a data layer 832.
  • the volume data corresponding to the second delivery depth H2 may include a data layer 841, a data layer 842, and/or a data layer 843.
  • 804 may determine one or more data layers in the volume data corresponding to the delivery depth by layer spacing.
  • the layer spacing may be the layer spacing of the volume data itself (i.e., the unit layer spacing), which may relate to the image data scanned by imaging system 110, or to the layer thickness set by the user during scanning.
  • the layer spacing may be a layer spacing defined by the image processing system 100 or the user, for example, a layer spacing obtained by multiplying the unit layer spacing by a magnification (e.g., 1.5 times the unit layer spacing, 2 times the unit layer spacing, 3 times the unit layer spacing, etc.).
  • the magnification can be any positive real number.
  • the layer spacing of different volume data can be different.
  • the layer spacing of PET (Positron Emission Tomography) images may be 1-10 mm.
  • the layer spacing of CT (Computed Tomography) images may be 0.1-20 mm or other suitable spacing.
  • the layer spacing of different axial or transmission directions of the same volume data may be the same or different.
  • for example, the layer spacing of the X-axis and/or the Y-axis may be 1.5 mm, while the layer spacing of the Z-axis may be 5 mm.
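The mapping from a delivery depth to the number of data layers it covers can be sketched as a division by the (possibly magnified) layer spacing. Whether the quotient is floored or rounded is not specified above, so the floor used here is an assumption, as are the function and parameter names:

```python
import math

def layers_for_depth(depth_mm: float, layer_spacing_mm: float) -> int:
    """Number of further data layers covered by a delivery depth,
    given the effective layer spacing (flooring is an assumption)."""
    return math.floor(depth_mm / layer_spacing_mm)

# With the 5 mm Z-axis spacing mentioned above, a 12 mm depth covers 2 layers.
n_z = layers_for_depth(12, 5)        # 2
# A 2x magnification of a 1.5 mm unit spacing gives a 3 mm effective spacing.
n_mag = layers_for_depth(12, 1.5 * 2)  # 4
```

This is how, e.g., the depths H1 and H2 in FIG. 8C would select the data layers 831-832 and 841-843 on the two sides of the data layer 821.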
  • the data layer can be a different data layer within the same vertebral body, or a different cross-sectional data layer within the same blood vessel.
  • a two-dimensional position of the first two-dimensional region of interest at the first data layer can be determined.
  • 805 can be implemented by analysis unit 450 in processing module 310.
  • the determined two-dimensional position may be a two-dimensional position of the reference voxel point in the first two-dimensional region of interest at the first data layer.
  • the two-dimensional position may be two-dimensional coordinate information of a reference voxel point of the first two-dimensional region of interest on a plane of the first data layer, for example, a Cartesian Cartesian coordinate system position (x, y).
  • for example, the determined two-dimensional position may be that of the two-dimensional region of interest 821-1 in the data layer 821 in FIG. 8C.
  • the two-dimensional position of the first two-dimensional region of interest may include a two-dimensional position of all pixels in the set of pixels of the first two-dimensional region of interest. In some embodiments, the two-dimensional position of the first two-dimensional region of interest may include a two-dimensional position of a boundary pixel of the first two-dimensional region of interest.
  • one or more fourth two-dimensional regions of interest can be generated in one or more data layers determined at 804 based on the two-dimensional location determined by 805.
  • a fourth two-dimensional region of interest may include a collection of one or more two-dimensional regions of interest.
  • 806 can be implemented by delivery unit 420 in processing module 310.
  • the fourth two-dimensional region of interest may include a two-dimensional region of interest 831-1 generated at the data layer 831, a two-dimensional region of interest 832-1 generated at the data layer 832, and the like.
  • the fourth two-dimensional region of interest may have the same size and/or shape as the first two-dimensional region of interest. It should be noted that a fourth two-dimensional region of interest, each of a plurality of fourth two-dimensional regions of interest, or a partial region of a fourth two-dimensional region of interest may be separately extracted and/or analyzed by the image processing system 100 or the user.
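Generating the fourth regions of interest can be sketched as stamping the 2D mask of the first data layer, at the same two-dimensional position, onto each target layer determined at 804. A minimal sketch; numpy, the (Z, Y, X) layout, and all names are illustrative assumptions:

```python
import numpy as np

def propagate_roi(volume_shape, layer_mask, target_layers):
    """Generate same-size, same-shape ROIs by copying the 2-D mask of the
    first data layer onto each target layer of the volume data."""
    roi = np.zeros(volume_shape, dtype=bool)
    for z in target_layers:
        roi[z] = layer_mask          # identical 2-D position in every layer
    return roi

# Hypothetical 2x2 ROI drawn in the first data layer of a 5-layer volume.
mask = np.zeros((4, 4), dtype=bool)
mask[1:3, 1:3] = True
fourth = propagate_roi((5, 4, 4), mask, target_layers=[1, 2, 4])
```

Each slice `fourth[z]` for a target layer is then an independent fourth region of interest that can be extracted or analyzed on its own.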
  • the above description of the process 800 and the schematic diagrams of FIG. 8B, FIG. 8C are merely exemplary and are not intended to limit the scope of the embodiments. It will be understood that, for those skilled in the art, after understanding the operations performed by the process 800, it is possible to perform any combination of the operations and perform various modifications and changes to the operations of the processes. However, these modifications and changes are still within the scope of the above description.
  • the two-dimensional region of interest communicated in FIG. 8B or FIG. 8C may replace the circular region with a rectangular region, a diamond region, and/or an irregular region, and the like.
  • the first two-dimensional region of interest in process 800 can be replaced with a three-dimensional volume of interest, i.e., a three-dimensional volume of interest can be passed between different ranges of data layers of the same volume data.
  • the different ranges of data layers may refer to different ranges of data layers in any direction in three-dimensional space. Variations such as these are within the scope of the present application.
  • 805 can be executed prior to 802, 803, or 804.
  • FIG. 9A is an exemplary flow diagram of delivering a region of interest (a two-dimensional region of interest/three-dimensional volume), in accordance with some embodiments of the present application.
  • the process 900 can be implemented by the delivery unit 420 in the processing module 310.
  • Process 900 can be another exemplary implementation of 603 in process 600.
  • a first two-dimensional region of interest/three-dimensional volume of the first volume data can be acquired.
  • 901 can be implemented by input/output module 340 in processing engine 122.
  • 901 can read first volume data and/or first two-dimensional area of interest/three-dimensional volume related data from storage module 330.
  • the user interface 280 can retrieve a first two-dimensional region of interest/three-dimensional volume that the user selects and/or draws in the first volume data.
  • FIG. 9B is an exemplary schematic diagram of delivering a two-dimensional region of interest, in accordance with some embodiments of the present application.
  • volume data 920 can include a data layer 921.
  • the volume data may include, but is not limited to, gating data, dynamic data, functional images, structural images, raw images, and/or algorithmic analysis result data, and the like.
  • the first 2D region of interest/3D volume acquired by 901 may be the same as or different from the first 2D region of interest/3D volume acquired by 701 in FIG. 7A, and/or the first two-dimensional region of interest acquired by 801 in FIG. 8A.
  • the first two-dimensional region of interest/three-dimensional volume may be a two-dimensional region of interest/three-dimensional volume, or a collection of multiple two-dimensional regions/three-dimensional volumes of interest.
  • the plurality of two-dimensional regions of interest/three-dimensional volume may be a two-dimensional region of interest/three-dimensional volume in the same data layer or a plurality of different data layers.
  • the first volume data is merely for convenience of description, and does not necessarily mean that the volume data is volume data acquired for the first time.
  • a three-dimensional volume of interest may intersect a plurality of data layers such that a contour of a two-dimensional region of interest may be formed in each of the plurality of data layers.
  • FIG. 9C is an exemplary schematic diagram of delivering a three-dimensional volume of interest, in accordance with some embodiments of the present application.
  • volume data 960 can include multiple data layers, such as data layer 961, data layer 962, and/or data layer 963, and the like.
  • Volume data 960 can include a three-dimensional volume 960-1 of interest.
  • the three-dimensional volume of interest 960-1 intersects the data layer 961, the data layer 962, and the data layer 963, forming a contour of the two-dimensional region of interest 961-1 in the data layer 961, a contour of the two-dimensional region of interest 962-1 in the data layer 962, and a contour of the two-dimensional region of interest 963-1 in the data layer 963.
  • the two-dimensional region 961-1 of interest, the two-dimensional region 962-1 of interest, and the two-dimensional region 963-1 of interest may have the same or different shapes and/or sizes.
  • the three-dimensional volume 960-1 of interest may have various three-dimensional shapes, such as a cube, a cuboid, a sphere, an ellipsoid, a cylinder, a cone, and/or a three-dimensional geometry of any shape, and the like.
  • one or more second volume data can be obtained.
  • 902 can be implemented by input/output module 340 in processing engine 122.
  • 902 can read second volume data related data from storage module 330.
  • the second volume data can be used for comparative analysis with the first volume data, for example, a comparative analysis of gated data (first volume data) and dynamic data (second volume data) at different time points, a comparative analysis of a functional image (first volume data) and a structural image (second volume data), or a comparative analysis of an original image (first volume data) and algorithm analysis result data (second volume data), and the like.
  • as shown in FIG. 9B, the second volume data may include volume data 930, volume data 940, and/or volume data 950, and the like. As shown in FIG. 9C, the second volume data may include volume data 970, volume data 980, and/or volume data 990, and the like. In some embodiments, the second volume data may have the same three-dimensional shape and/or size as the first volume data. In some embodiments, the second volume data may have a different three-dimensional shape and/or size than the first volume data. The number of voxels in the second volume data may be more than, less than, or equal to the number of voxels in the first volume data. In some embodiments, the second volume data can have a shape and/or size that is greater than or equal to that of the first two-dimensional region of interest/three-dimensional volume.
  • a three-dimensional position of the first two-dimensional region of interest/three-dimensional volume in the first volume data can be determined.
  • 903 can be implemented by analysis unit 450 in processing module 310.
  • the three-dimensional position information may include three-dimensional Cartesian Cartesian coordinate information (x, y, z).
  • the three-dimensional position of the first two-dimensional region of interest/three-dimensional volume may include the three-dimensional position of one or more voxels (e.g., a reference voxel point) in the voxel set of the first two-dimensional region of interest/three-dimensional volume.
  • the three-dimensional position of the first two-dimensional region of interest/three-dimensional volume may include a three-dimensional position of one or more voxels in the set of boundary voxels of the first two-dimensional region/three-dimensional volume of interest.
  • the three-dimensional position of the two-dimensional region 921-1 of interest in the volume data 920 can be determined.
  • the three-dimensional position of the three-dimensional volume 960-1 of interest in the volume data 960 can be determined.
  • the three-dimensional position of the three-dimensional volume 960-1 of interest in the volume data 960 may mean that the three-dimensional position of the two-dimensional region 961-1 of interest in the data layer 961 is determined, the two-dimensional region of interest The three-dimensional position of the 962-1 in the data layer 962 is determined, and the three-dimensional position of the two-dimensional region 963-1 of interest in the data layer 963 is determined.
  • one or more fifth two-dimensional regions of interest/three-dimensional volume can be generated in the second volume data based on the three-dimensional position.
  • a fifth two-dimensional region of interest/three-dimensional volume may include one or more sets of two-dimensional regions of interest/three-dimensional volumes.
  • 904 can be implemented by delivery unit 420 in processing module 310.
  • the fifth two-dimensional region of interest/three-dimensional volume may have the same size and/or shape as the first two-dimensional region of interest/three-dimensional volume.
  • as shown in FIG. 9B, the generated fifth region of interest may include a two-dimensional region of interest 931-1 generated in the data layer 931 of the volume data 930 according to the three-dimensional position of the two-dimensional region of interest 921-1 in the volume data 920, a two-dimensional region of interest 941-1 generated at the data layer 941 of the volume data 940, and/or a two-dimensional region of interest 951-1 generated at the data layer 951 of the volume data 950.
  • as shown in FIG. 9C, the generated fifth three-dimensional volume of interest may include the three-dimensional volume of interest 970-1 generated in the volume data 970, the three-dimensional volume of interest 980-1 generated in the volume data 980, and/or the three-dimensional volume of interest 990-1 generated in the volume data 990.
  • the three-dimensional volume of interest 970-1 may include a two-dimensional region of interest 971-1 in the data layer 971 of the volume data 970, a two-dimensional region of interest 972-1 in the data layer 972, and/or a two-dimensional region of interest 973-1 in the data layer 973, and the like.
  • the three-dimensional volume of interest 980-1 may include a two-dimensional region of interest 981-1 in the data layer 981 of the volume data 980, a two-dimensional region of interest 982-1 in the data layer 982, and/or a two-dimensional region of interest 983-1 in the data layer 983, and the like.
  • the three-dimensional volume of interest 990-1 may include a two-dimensional region of interest 991-1 in the data layer 991 of the volume data 990, a two-dimensional region of interest 992-1 in the data layer 992, and/or a two-dimensional region of interest 993-1 in the data layer 993, and the like.
  • a fifth two-dimensional region of interest/three-dimensional volume, each of a plurality of fifth two-dimensional regions of interest/three-dimensional volumes, or a portion of a two-dimensional region/three-dimensional volume within a fifth two-dimensional region of interest/three-dimensional volume may be separately extracted and/or analyzed by the image processing system 100 or the user.
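Generating the fifth regions of interest can be sketched as reproducing the ROI at the same voxel coordinates in each second volume data. This assumes the volumes are already registered on a common grid; numpy and all names are illustrative assumptions:

```python
import numpy as np

def transfer_roi(roi_mask: np.ndarray, second_volume_shape) -> np.ndarray:
    """Reproduce an ROI at the same three-dimensional positions in second
    volume data of the same (or larger) grid."""
    out = np.zeros(second_volume_shape, dtype=bool)
    coords = np.argwhere(roi_mask)   # 3-D positions of the ROI's voxel set
    out[tuple(coords.T)] = True
    return out

# Hypothetical 2x2 ROI in layer 1 of the first volume data.
first_vol_roi = np.zeros((3, 4, 4), dtype=bool)
first_vol_roi[1, 1:3, 1:3] = True
fifth_same = transfer_roi(first_vol_roi, (3, 4, 4))   # same-size second volume
fifth_big = transfer_roi(first_vol_roi, (4, 5, 5))    # larger second volume
```

The resulting fifth ROI has the same size and shape as the first one, since only its voxel coordinates, not its extent, depend on the target volume.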
  • the above description of the process 900 and the schematic diagrams of FIGS. 9B and 9C are merely exemplary and are not intended to limit the scope of the embodiments. It will be understood that those skilled in the art, after understanding the operations performed by the process 900, may perform any combination of the operations and perform various modifications and changes to the operations of the processes. However, these modifications and changes are still within the scope of the above description.
  • the two-dimensional region of interest passed in FIG. 9B can replace the circular region with a rectangular region, a diamond region, and/or an irregular region, and the like.
  • the three-dimensional volume of interest passed in FIG. 9C can replace the spherical volume with an ellipsoid, a cube, a cuboid, a cylinder, a cone, and/or a volume of any shape, and the like. Variations such as these are within the scope of the present application.
  • FIG. 10A is an exemplary flow diagram of merging regions of interest (two-dimensional regions of interest/three-dimensional volume), shown in some embodiments of the present application.
  • the process 1000 can be implemented by the merging unit 440 in the processing module 310.
  • Process 1000 can be an exemplary implementation of operation 604 in process 600.
  • a first two-dimensional region of interest/three-dimensional volume can be acquired.
  • 1001 can be implemented by input/output module 340 of processing engine 122.
  • 1001 can read the first 2D region of interest/three dimensional volume related data from storage module 330.
  • the user interface 280 can acquire a first two-dimensional area of interest/three-dimensional volume determined by the user.
  • the first two-dimensional region of interest/three-dimensional volume may be a connected domain.
  • the first two-dimensional region of interest/three-dimensional volume may be a non-connected domain, as shown in FIG. 10B.
  • FIG. 10B is an exemplary schematic diagram of a region of interest (two-dimensional region of interest/three-dimensional volume) shown in accordance with some embodiments of the present application.
  • 1001 can acquire the two-dimensional area/three-dimensional volume 1021 of interest.
  • the two-dimensional area of interest/three-dimensional volume 1021 may include feature information, such as statistical information A.
  • a second two-dimensional region of interest/three-dimensional volume can be acquired.
  • 1002 can be implemented by input/output module 340 of processing engine 122.
  • 1002 can read the second 2D region of interest/three dimensional volume related data from storage module 330.
  • the second two-dimensional region of interest/three-dimensional volume and the first two-dimensional region of interest/three-dimensional volume may be located in the same data layer.
  • the second two-dimensional region of interest/three-dimensional volume and the first two-dimensional region of interest/three-dimensional volume may be located in different data layers of the same volume data.
  • the second two-dimensional region of interest/three-dimensional volume and the first two-dimensional region of interest/three-dimensional volume may be located in different volumetric data.
  • the acquired second two-dimensional region of interest/three-dimensional volume may be used for a combined analysis with the first two-dimensional region of interest/three-dimensional volume.
  • the second two-dimensional region of interest/three-dimensional volume may be related to the first two-dimensional region of interest/three-dimensional volume.
  • the first two-dimensional region of interest/three-dimensional volume and the second two-dimensional region of interest/three-dimensional volume may be scattered two-dimensional regions/three-dimensional volumes in a biomedical image that require merged statistics, or connected or non-connected areas in a medical image that require merged statistics.
  • the connected or non-connected region may be a tumor that has spread, and the location information of the tumor image may be discontinuous.
  • 1002 may acquire a two-dimensional region/three-dimensional volume 1022 of interest, a two-dimensional region of interest/three-dimensional volume 1023, and/or a two-dimensional region of interest/three-dimensional volume 1024, and the like.
  • the two-dimensional area of interest/three-dimensional volume 1022 may include feature information, such as statistical information B.
  • the two-dimensional region of interest/three-dimensional volume 1023 may include feature information, such as statistical information C.
  • the two-dimensional region of interest/three-dimensional volume 1024 may include feature information, such as statistical information D.
  • the two-dimensional region of interest/three-dimensional volume 1022, the two-dimensional region of interest/three-dimensional volume 1023, the two-dimensional region of interest/three-dimensional volume 1024, and the two-dimensional region of interest/three-dimensional volume 1021 may be located in the same data. Layer, different data layers of the same volume data, or different volume data.
  • the first two-dimensional region of interest/three-dimensional volume and the second two-dimensional region of interest/three-dimensional volume may be combined.
  • 1003 may be implemented by the merging unit 440 in processing module 310.
  • merging the first two-dimensional region of interest/three-dimensional volume and the second two-dimensional region of interest/three-dimensional volume may refer to combining the data of the first two-dimensional region of interest/three-dimensional volume and the second two-dimensional region of interest/three-dimensional volume.
  • FIG. 10C is an exemplary schematic diagram of a merged region of interest (two-dimensional region of interest/three-dimensional volume), shown in accordance with some embodiments of the present application.
  • 1003 can merge the two-dimensional region of interest/three-dimensional volume 1031 and the two-dimensional region of interest/three-dimensional volume 1032.
  • at 1004, it can be determined whether to merge other two-dimensional regions of interest/three-dimensional volumes.
  • 1004 can be implemented by analysis unit 450 in processing module 310.
  • the criteria for judgment can be related to the user's analytical needs, ie, whether the user needs to combine more of the two-dimensional regions of interest/three-dimensional volume for analysis.
  • the process 1000 can return to 1002 to obtain other two-dimensional regions of interest/three-dimensional volume to be merged.
  • the other two-dimensional regions of interest/three-dimensional volume to be merged may be a two-dimensional region of interest/three-dimensional volume 1033, and/or a two-dimensional region of interest/three-dimensional volume 1034.
  • the process 1000 can proceed to 1005 to analyze the feature information of the merged two-dimensional region/three-dimensional volume of interest.
  • 1005 can be implemented by analysis unit 450 in processing module 310.
  • operation 1004 may not be performed, and 1005 is performed directly after 1003.
  • 1005 can analyze the combined feature information of a plurality of connected or non-connected regions.
  • 1005 can analyze the overall feature information of the tumor that has been diffused after merging.
  • the feature information may include statistical features.
  • the combined feature information of the two-dimensional region/three-dimensional volume of interest may be merged statistical information 1040.
  • operation 1003 and operation 1005 can be performed together, for example, the merged feature information can be directly analyzed after combining at least two 2D regions of interest/3D volume.
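Combining 1003 and 1005 can be sketched as taking the union of the ROI masks and computing statistics directly over the merged voxel set, e.g., for a tumor that has spread into non-connected regions. A minimal sketch; numpy, the statistics chosen, and all names are illustrative assumptions:

```python
import numpy as np

def merged_statistics(image: np.ndarray, masks) -> dict:
    """Union the ROI masks, then compute feature statistics over the
    merged voxel set (connected or non-connected)."""
    union = np.logical_or.reduce(masks)
    values = image[union]
    return {"voxels": int(values.size),
            "mean": float(values.mean()),
            "max": float(values.max())}

# Hypothetical 4x4 image and two non-connected single-pixel ROIs.
img = np.arange(16, dtype=float).reshape(4, 4)
a = np.zeros((4, 4), dtype=bool); a[0, 0] = True   # value 0
b = np.zeros((4, 4), dtype=bool); b[3, 3] = True   # value 15
stats = merged_statistics(img, [a, b])
# → {'voxels': 2, 'mean': 7.5, 'max': 15.0}
```

This mirrors how the statistical information A, B, C, D of the individual regions in FIG. 10B is replaced by the single merged statistical information 1040 of FIG. 10C.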
  • the above description of the process 1000 and the schematic diagrams of FIG. 10B and FIG. 10C are merely exemplary and are not intended to limit the scope of the embodiments. It will be understood that, for those skilled in the art, after understanding the operations performed by the process 1000, it is possible to perform any combination of the operations and perform various modifications and changes to the operations of the processes. However, these modifications and changes are still within the scope of the above description. In some embodiments, the order of operations of process 1000 can be interchanged. For example, the process 1000 may first perform operation 1004 and then perform operation 1003.
  • one 2D region of interest may be merged with another 2D region of interest, a 2D region of interest may be merged with a 3D volume of interest, and a 3D volume of interest may be associated with another Interest in 3D volume merge. Variations such as these are within the scope of the present application.
  • the present application may perform operations such as cropping, passing, and/or merging based on one or more initial regions of interest (interest two-dimensional regions/three-dimensional volumes) of the volume data to obtain one or more targets. Area of interest. In some embodiments, operations such as further cropping, passing, and/or merging may be performed based on the target region of interest.
  • the initial region of interest may be the two-dimensional region of interest/three-dimensional volume drawn in step 601, the two-dimensional region of interest/three-dimensional volume cropped in step 602, the two-dimensional region of interest/three-dimensional volume passed in step 603, the two-dimensional region of interest/three-dimensional volume merged in step 604, the first two-dimensional region of interest/three-dimensional volume acquired in step 701, the third two-dimensional region of interest/three-dimensional volume acquired in step 703, the first two-dimensional region of interest acquired in step 801, the fourth two-dimensional region of interest generated in step 806, the first two-dimensional region of interest/three-dimensional volume acquired in step 901, the fifth two-dimensional region of interest/three-dimensional volume generated in step 904, the two-dimensional region of interest/three-dimensional volume merged in step 1003, etc., or any other region of interest.
  • This application does not limit the initial area of interest.
  • the initial region of interest may be merged from other regions of interest, and the other regions of interest may be referred to as merged source regions of interest.
  • if the merged two-dimensional region/three-dimensional volume of step 1003 is taken as the initial region of interest, then the first region of interest acquired in step 1001 and/or the second two-dimensional region of interest/three-dimensional volume acquired in step 1002 may be called a merged source region of interest.
  • the initial region of interest may be transferred from other regions of interest, and those other regions of interest may be referred to as transfer-source regions of interest.
  • the first 2D region of interest acquired in step 801 may be referred to as a transfer-source region of interest.
  • the first 2D region/3D volume of interest acquired in step 904 may be referred to as a transfer-source region of interest.
  • one or more regions of interest may be cropped from the initial region of interest or the target region of interest; the one or more regions of interest that are cropped away may be referred to as to-be-cropped regions of interest.
  • the to-be-cropped region of interest may be the 2D region/3D volume of interest cropped in step 602, the second 2D region/3D volume of interest cropped in step 703, or any other region of interest.
  • the initial region of interest or the target region of interest may be merged with other regions of interest, which may be referred to as to-be-merged regions of interest.
  • the to-be-merged region of interest may be the 2D region/3D volume of interest to be merged in step 604, the third 2D region/3D volume of interest acquired in step 703, the fourth 2D region of interest generated in step 806, the fifth 2D region/3D volume of interest generated in step 904, the merged 2D region/3D volume of interest in step 1003, etc., or any other region of interest.
  • Tangible, permanent storage media include the memory or storage used by any computer, processor, or similar device, or associated modules, for example, various semiconductor memories, tape drives, disk drives, or any other medium that can provide storage functions for software at any time.
  • All software or parts of it may sometimes communicate over a network, such as the Internet or other communication networks.
  • Such communication can load software from one computer device or processor to another.
  • For example, software may be loaded from a management server or host computer of an image processing system onto the hardware platform of a computing environment, or onto another computing environment implementing the system, or a system with similar functions related to providing information required for image processing. Accordingly, another medium capable of carrying software elements may also be used between local devices, for example, light waves, electric waves, or electromagnetic waves, propagated through cables, fiber-optic cables, or the air.
  • the physical medium carrying such waves, such as a cable, wireless link, or fiber-optic cable, may also be considered a medium that carries the software.
  • a computer-readable medium can take many forms, including but not limited to transitory readable media and non-transitory readable media.
  • non-transitory readable media can be stable storage media, including optical or magnetic disks, and storage systems used in computers or similar devices that implement the system components described in the figures.
  • transitory readable media can be unstable storage media, including dynamic memory, such as the main memory of a computer platform.
  • computer-readable media can include tangible transmission media, carrier-wave transmission media, or physical transmission media. Tangible transmission media include coaxial cables, copper cables, and optical fibers, including the wires that form the bus within a computer system.
  • carrier-wave transmission media can carry electrical, electromagnetic, acoustic, or light-wave signals, and these signals can be produced by radio-frequency or infrared data communication methods.
  • Typical computer-readable media include hard disks, floppy disks, magnetic tape, or any other magnetic media; CD-ROMs, DVDs, DVD-ROMs, or any other optical media; punched cards or any other physical storage media containing hole patterns; RAM, PROM, EPROM, FLASH-EPROM, or any other memory chip or tape; a carrier wave, cable, or link transmitting data or instructions; or any other program code and/or data that a computer can read. Many of these forms of computer-readable media appear in the process of a processor executing instructions and passing one or more results.
  • the present application uses specific words to describe embodiments of the present application.
  • a "one embodiment,” “an embodiment,” and/or “some embodiments” means a feature, structure, or feature associated with at least one embodiment of the present application. Therefore, it should be emphasized and noted that “an embodiment” or “an embodiment” or “an alternative embodiment” that is referred to in this specification two or more times in different positions does not necessarily refer to the same embodiment. . Furthermore, some of the features, structures, or characteristics of one or more embodiments of the present application can be combined as appropriate.
  • aspects of the present application can be illustrated and described through a number of patentable categories or situations, including any new and useful process, machine, product, or combination of materials, or any new and useful improvement thereof. Accordingly, various aspects of the present application can be performed entirely by hardware, entirely by software (including firmware, resident software, microcode, etc.), or by a combination of hardware and software.
  • the above hardware or software may be referred to as a "data block,” “module,” “sub-module,” “engine,” “unit,” “sub-unit,” “component,” or “system.”
  • aspects of the present application may be embodied in a computer product located in one or more computer readable medium(s) including a computer readable program code.
  • a computer-readable signal medium may contain a propagated data signal containing computer program code, for example, in baseband or as part of a carrier wave.
  • the propagated signal may have a variety of manifestations, including electromagnetic forms, optical forms, and the like, or a suitable combination.
  • the computer-readable signal medium may be any computer-readable medium other than a computer-readable storage medium that can communicate, propagate, or transmit a program for use by or in connection with an instruction execution system, apparatus, or device.
  • the program code located on a computer-readable signal medium can be propagated through any suitable medium, including radio, cable, fiber-optic cable, radio-frequency signals, or similar media, or any combination of the above.
  • the computer program code required for the operation of various parts of the application can be written in any one or more programming languages, including object-oriented programming languages such as Java, Scala, Smalltalk, Eiffel, JADE, Emerald, C++, C#, VB.NET, and Python; conventional procedural programming languages such as C, Visual Basic, Fortran 2003, Perl, COBOL 2002, PHP, and ABAP; dynamic programming languages such as Python, Ruby, and Groovy; or other programming languages.
  • the program code can run entirely on the user's computer, or run as a stand-alone software package on the user's computer, or partially on the user's computer, partly on a remote computer, or entirely on a remote computer or server.
  • the remote computer can be connected to the user's computer via any network, such as a local area network (LAN) or wide area network (WAN), or connected to an external computer (e.g., via the Internet), or used in a cloud computing environment, or as a service, such as software as a service (SaaS).

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Health & Medical Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Radiology & Medical Imaging (AREA)
  • Quality & Reliability (AREA)
  • Computing Systems (AREA)
  • Software Systems (AREA)
  • Databases & Information Systems (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • General Engineering & Computer Science (AREA)
  • Apparatus For Radiation Diagnosis (AREA)
  • Image Processing (AREA)
  • Magnetic Resonance Imaging Apparatus (AREA)
  • Image Generation (AREA)

Abstract

An image processing method and system. The image processing method may include: acquiring an image data set, the image data set including first volume data; determining, with at least one processor, a target region of interest in the first volume data, including: drawing an initial region of interest, the initial region of interest being in the first volume data; and cropping the initial region of interest to obtain the target region of interest. The target region of interest may include at least one 2D region of interest or at least one 3D volume of interest, and the initial region of interest may include at least one 2D region of interest or at least one 3D volume of interest.

Description

IMAGE PROCESSING METHOD AND SYSTEM
Technical Field
The present application relates to image processing methods and systems, and more particularly, to methods and systems for processing regions of interest in medical images, including 2D regions of interest (also referred to as 2D ROIs) and/or 3D volumes of interest (also referred to as 3D VOIs).
Background
Medical image post-processing software typically provides tools for processing regions of interest (2D regions of interest and/or 3D volumes of interest) that analyze and display feature information, such as statistical features, of a 2D ROI (Region of Interest) or 3D VOI (Volume of Interest). On one hand, users want two capabilities: (1) drawing 2D ROIs/3D VOIs of the same size and shape at the same 2D position in different data layers of one medical image volume, then computing statistics for these 2D ROIs/3D VOIs and comparing them; and (2) drawing 2D ROIs/3D VOIs of the same size and shape at the same 3D position in multiple medical image volumes, then computing statistics for these 2D ROIs/3D VOIs and comparing them. Currently, drawing a 2D ROI/3D VOI takes time, and it is difficult to manually draw 2D ROIs and/or 3D VOIs with identical position, shape, and size in different data layers and/or different volume data.
On the other hand, most 2D ROI/3D VOI tools share the following problems. Each drawn 2D ROI/3D VOI is usually a contiguous region, but if part of that contiguous region is not of interest to the user or is not intended for analysis, conventional 2D ROI/3D VOI tools usually cannot remove the unwanted part to meet this need. In addition, each 2D ROI/3D VOI can display its corresponding statistical features, but when multiple 2D ROIs/3D VOIs exist, the overall statistical features of the multiple 2D ROIs/3D VOIs cannot be obtained and displayed. The present application provides a method and system with functions such as positioning, transferring, cropping, and merging of 2D ROIs/3D VOIs to solve the above problems and meet various user needs.
Summary
One aspect of the present application relates to a first image processing method. The first image processing method may be implemented on at least one machine, each of which may include at least one processor and a memory. The first image processing method may include: acquiring an image data set, the image data set may include first volume data, the first volume data may include at least one data layer, and the at least one data layer may contain at least one voxel; determining, with at least one processor, a target region of interest in the first volume data, where the target region of interest may include at least one voxel in the at least one data layer and may include at least one 2D region of interest or at least one 3D volume of interest. Determining the target region of interest may include: drawing an initial region of interest, where the initial region of interest may be in the first volume data and may include at least one 2D region of interest or at least one 3D volume of interest; and cropping the initial region of interest to obtain the target region of interest.
Another aspect of the present application relates to a second image processing method. The second image processing method may be implemented on at least one machine, each of which may include at least one processor and a memory. The second image processing method may include: acquiring an image data set, the image data set may include first volume data, the first volume data may include at least one data layer, and the at least one data layer may contain at least one voxel; acquiring an initial region of interest in the first volume data, where the initial region of interest may include at least one 2D region of interest or at least one 3D volume of interest; determining, with the at least one processor, position information of the initial region of interest; and transferring the initial region of interest according to the position information.
Another aspect of the present application relates to a third image processing method. The third image processing method may be implemented on at least one machine, each of which may include at least one processor and a memory. The third image processing method may include: acquiring an image data set; acquiring an initial region of interest in the image data set, where the initial region of interest may include at least one 2D region of interest or at least one 3D volume of interest; acquiring a to-be-merged region of interest in the image data set, where the to-be-merged region of interest may include at least one 2D region of interest or at least one 3D volume of interest; merging, with the at least one processor, the initial region of interest and the to-be-merged region of interest; and analyzing feature information of the merged region of interest.
Another aspect of the present application relates to a first non-transitory computer-readable medium. The first non-transitory computer-readable medium may include executable instructions that, when executed by at least one processor, may cause the at least one processor to implement the first image processing method.
Another aspect of the present application relates to a second non-transitory computer-readable medium. The second non-transitory computer-readable medium may include executable instructions that, when executed by at least one processor, may cause the at least one processor to implement the second image processing method.
Another aspect of the present application relates to a third non-transitory computer-readable medium. The third non-transitory computer-readable medium may include executable instructions that, when executed by at least one processor, may cause the at least one processor to implement the third image processing method.
Another aspect of the present application relates to a first system. The first system may include: at least one processor, and a memory for storing instructions that, when executed by the at least one processor, cause the system to implement the first image processing method.
Another aspect of the present application relates to a second system. The second system may include: at least one processor, and a memory for storing instructions that, when executed by the at least one processor, cause the system to implement the second image processing method.
Another aspect of the present application relates to a third system. The third system may include: at least one processor, and a memory for storing instructions that, when executed by the at least one processor, cause the system to implement the third image processing method.
According to some embodiments of the present application, cropping the initial region of interest to obtain the target region of interest may include: drawing, in the first volume data, a to-be-cropped region of interest; and removing, from the initial region of interest, the intersection of the to-be-cropped region of interest and the initial region of interest to obtain the target region of interest.
According to some embodiments of the present application, the first image processing method may further include: determining whether to continue cropping the target region of interest; and, if it is determined to continue cropping, continuing to draw a region of interest to be cropped in the first volume data.
According to some embodiments of the present application, the first image processing method may further include transferring the target region of interest, which may include: determining first position information, the first position information may include the 3D coordinate position, in the first volume data, of a voxel in the target region of interest; acquiring a first transfer depth in a first transfer direction, where the first transfer direction may form a first angle with the plane of a first data layer in the first volume data; acquiring a second transfer depth in a second transfer direction, where the second transfer direction may form a second angle with the plane of the first data layer in the first volume data; determining at least one second data layer of the volume data within the range corresponding to the first transfer depth and the second transfer depth; and generating, according to the first position information, a transferred region of interest in the at least one second data layer.
According to some embodiments of the present application, the first image processing method may further include transferring the target region of interest, which may include: acquiring second volume data; determining second position information, the second position information may include the 3D coordinate position, in the first volume data, of a voxel in the target region of interest; and generating, according to the second position information, a transferred region of interest in the second volume data.
According to some embodiments of the present application, the first image processing method may further include: acquiring a to-be-merged region of interest in the first volume data; and merging the target region of interest and the to-be-merged region of interest.
According to some embodiments of the present application, the first image processing method may further include: transferring the initial region of interest.
According to some embodiments of the present application, drawing the initial region of interest may include: drawing a transfer-source region of interest in the first volume data; determining third position information, the third position information may include the 3D coordinate position, in the first volume data, of a voxel in the transfer-source region of interest; and transferring, according to the third position information, the transfer-source region of interest within the first volume data to generate the initial region of interest.
According to some embodiments of the present application, drawing the initial region of interest may include: acquiring third volume data; drawing a transfer-source region of interest in the third volume data; determining fourth position information, the fourth position information may include the 3D coordinate position, in the third volume data, of a voxel in the transfer-source region of interest; and transferring, according to the fourth position information, the transfer-source region of interest to the first volume data to generate the initial region of interest.
According to some embodiments of the present application, drawing the initial region of interest may include: drawing at least two different merge-source regions of interest in the first volume data; and merging the at least two different merge-source regions of interest to generate the initial region of interest.
According to some embodiments of the present application, the first image processing method may further include analyzing the target region of interest, which may include: analyzing feature information of the target region of interest, where the feature information includes statistical feature information obtained by statistical analysis of multiple voxels in the target region of interest.
According to some embodiments of the present application, transferring the initial region of interest may include: acquiring a first transfer depth in a first transfer direction, where the first transfer direction may form a first angle with the plane of a first data layer in the first volume data; acquiring a second transfer depth in a second transfer direction, where the second transfer direction may form a second angle with the plane of the first data layer in the first volume data; determining at least one second data layer contained in the volume data within the range corresponding to the first transfer depth and the second transfer depth; determining first position information, the first position information may include the 3D coordinate position, in the first volume data, of a voxel in the initial region of interest; and generating, according to the first position information, a transferred target region of interest in the at least one second data layer.
According to some embodiments of the present application, the second image processing method may further include: drawing a to-be-merged region of interest in the first volume data; and merging the target region of interest and the to-be-merged region of interest.
According to some embodiments of the present application, transferring the initial region of interest may include: acquiring second volume data; determining second position information, the second position information may include the 3D coordinate position, in the first volume data, of a voxel in the initial region of interest; and generating, according to the second position information, a transferred region of interest in the second volume data.
According to some embodiments of the present application, the second image processing method may further include: drawing a to-be-merged region of interest in the second volume data; and merging the transferred region of interest and the to-be-merged region of interest.
According to some embodiments of the present application, the second image processing method may further include: drawing a to-be-merged region of interest in the first volume data; and merging the initial region of interest and the to-be-merged region of interest.
According to some embodiments of the present application, acquiring the initial region of interest in the first volume data may include: drawing at least two different merge-source regions of interest in the first volume data; and merging the at least two different merge-source regions of interest to obtain the initial region of interest.
Some additional features of the present application may be set forth in the following description. Some additional features of the present application will become apparent to those skilled in the art upon examination of the following description and the accompanying drawings, or upon learning about the production or operation of the embodiments. The features of the present disclosure may be realized and attained by practice or use of the methods, means, and combinations of the various aspects of the specific embodiments described below.
Brief Description of the Drawings
The drawings described herein are provided for further understanding of the present application and form a part thereof. The exemplary embodiments of the present application and their descriptions are intended to explain, not limit, the present application. Like reference numerals denote like components throughout the drawings.
FIG. 1 is an exemplary schematic diagram of an image processing system according to some embodiments of the present application;
FIG. 2 is an exemplary schematic diagram of a computing device of an image processing server according to some embodiments of the present application;
FIG. 3 is an exemplary schematic diagram of a processing engine according to some embodiments of the present application;
FIG. 4 is an exemplary schematic diagram of a processing module according to some embodiments of the present application;
FIG. 5 is an exemplary flowchart of processing an image according to some embodiments of the present application;
FIG. 6 is an exemplary flowchart of determining a region of interest (2D region/3D volume of interest) according to some embodiments of the present application;
FIG. 7A is an exemplary flowchart of cropping a region of interest (2D region/3D volume of interest) according to some embodiments of the present application;
FIG. 7B is an exemplary schematic diagram of a region of interest (2D region/3D volume of interest) according to some embodiments of the present application;
FIG. 7C is an exemplary schematic diagram of a cropped region of interest (2D region/3D volume of interest) according to some embodiments of the present application;
FIG. 8A is an exemplary flowchart of transferring a 2D region of interest according to some embodiments of the present application;
FIG. 8B is an exemplary schematic diagram of a 2D region of interest according to some embodiments of the present application;
FIG. 8C is an exemplary schematic diagram of a transferred 2D region of interest according to some embodiments of the present application;
FIG. 9A is an exemplary flowchart of transferring a region of interest (2D region/3D volume of interest) according to some embodiments of the present application;
FIG. 9B is an exemplary schematic diagram of transferring a 2D region of interest according to some embodiments of the present application;
FIG. 9C is an exemplary schematic diagram of transferring a 3D volume of interest according to some embodiments of the present application;
FIG. 10A is an exemplary flowchart of merging regions of interest (2D regions/3D volumes of interest) according to some embodiments of the present application;
FIG. 10B is an exemplary schematic diagram of regions of interest (2D regions/3D volumes of interest) according to some embodiments of the present application; and
FIG. 10C is an exemplary schematic diagram of a merged region of interest (2D region/3D volume of interest) according to some embodiments of the present application.
Detailed Description
To more clearly illustrate the technical solutions of the embodiments of the present application, the accompanying drawings used in the description of the embodiments are briefly introduced below. Obviously, the drawings described below are only some examples or embodiments of the present application; those of ordinary skill in the art may, without creative effort, apply the present application to other similar scenarios based on these drawings. Unless obvious from the context or otherwise stated, like reference numerals in the figures denote like structures or operations.
As used in this application and the claims, unless the context clearly suggests otherwise, the words "a," "an," "one," and/or "the" do not specifically denote the singular and may also include the plural. Generally, the terms "include" and "comprise" merely indicate that the explicitly identified operations and elements are included, and these operations and elements do not constitute an exclusive list; a method or device may also contain other operations or elements.
Although the present application makes various references to certain modules in the systems according to embodiments of the present application, any number of different modules may be used and run on the image processing system and/or processors. The modules are merely illustrative, and different aspects of the systems and methods may use different modules.
It should be noted that, in this application, a region of interest (ROI) may refer to a region or volume of interest corresponding to one or more data layers, while a VOI may be a volume of interest corresponding to two or more data layers (for example, a VOI may be a three-dimensional ROI). A 2D ROI in this application may refer to a 2D region of interest corresponding to one data layer. The embodiments described in this application may be applied to regions of interest (2D ROIs and/or 3D VOIs). In some embodiments, a region of interest may include a 2D ROI and/or a 3D VOI.
Flowcharts are used in this application to illustrate the operations performed by the systems according to embodiments of the present application. It should be understood that the preceding or following operations are not necessarily performed precisely in order. Instead, various operations may be processed in reverse order or simultaneously; other operations may be added to these processes, or one or more operations may be removed from them. It should be noted that "first," "second," "third," "fourth," "fifth," "sixth," "seventh," "eighth," etc. in this application are used only for convenience of description and do not denote a particular order or name.
FIG. 1 is an exemplary schematic diagram of an image processing system 100 according to some embodiments of the present application. The image processing system 100 may include an imaging system 110, an image processing server 120, a network 130, and a database 140. In some embodiments, the imaging system 110 may be a standalone imaging device or a multi-modality imaging system. In some embodiments, the image processing server 120 may analyze and process the acquired information and/or output processing results.
The imaging system 110 may include a single imaging device or a combination of multiple different imaging devices. The imaging device may perform imaging by scanning a target. In some embodiments, the imaging device may be a medical imaging device, which may acquire image information of various parts of the human body. In some embodiments, the imaging system 110 may be a Positron Emission Tomography (PET) system, a Single Photon Emission Computed Tomography (SPECT) system, a Computed Tomography (CT) system, a Magnetic Resonance Imaging (MRI) system, a Digital Radiography (DR) system, a Computed Tomography Colonography (CTC) system, etc., or a combination thereof. The imaging system 110 may include one or more scanners. The scanner may be a Digital Subtraction Angiography (DSA) scanner, a Magnetic Resonance Angiography (MRA) scanner, a Computed Tomography Angiography (CTA) scanner, a PET scanner, a SPECT scanner, a CT scanner, an MRI scanner, a DR scanner, a multi-modality scanner, etc., or a combination thereof. In some embodiments, the multi-modality scanner may be a CT-PET (Computed Tomography-Positron Emission Tomography) scanner, a SPECT-MRI (Single Photon Emission Computed Tomography-Magnetic Resonance Imaging) scanner, a PET-MRI (Positron Emission Tomography-Magnetic Resonance Imaging) scanner, a DSA-MRI (Digital Subtraction Angiography-Magnetic Resonance Imaging) scanner, etc.
The image processing server 120 may process the acquired data information. The image processing server 120 may perform denoising, image artifact removal, image segmentation, image rendering, image registration, image fusion, image reconstruction, and other processing on the data information. In some embodiments, the image processing server 120 may draw, edit, and perform other operations on 2D ROIs/3D VOIs of an image. In some embodiments, the data information may include text information, image information, sound information, video information, etc., or a combination thereof. In some embodiments, the image processing server 120 may include a processing engine 122, a processing core, one or more memories, etc., or a combination thereof. For example, the image processing server 120 may include a Central Processing Unit (CPU), an Application-Specific Integrated Circuit (ASIC), an Application-Specific Instruction-Set Processor (ASIP), a Graphics Processing Unit (GPU), a Physics Processing Unit (PPU), a Digital Signal Processor (DSP), a Field Programmable Gate Array (FPGA), a Programmable Logic Device (PLD), a controller, a microcontroller unit, a processor, a microprocessor, an ARM (Advanced RISC Machines) processor, etc., or a combination thereof. In some embodiments, the image processing server 120 may process image information acquired from the imaging system 110. In some embodiments, the image processing server 120 may process image information from the database 140 or other storage devices. The results of information processing by the image processing server 120 may be saved in an internal memory, the database 140, or other external data sources.
In some embodiments, the image processing server 120 may directly receive user instructions and perform corresponding image processing operations. In some embodiments, a user may access the image processing server 120 via the network 130 using a remote terminal (not shown). The results processed by the image processing server 120 may be presented to the user directly, or sent to a remote terminal via the network 130 for the user to view.
The network 130 may be a single network or a combination of multiple different networks. For example, the network 130 may be a local area network (LAN), a wide area network (WAN), a public network, a private network, a proprietary network, a public switched telephone network (PSTN), the Internet, a wireless network, a virtual network, a metropolitan area network, a telephone network, etc., or a combination thereof. The network 130 may include multiple network access points, such as wired access points, wireless access points, base stations, and Internet exchange points. Through these access points, the image processing server 120 and/or the imaging system 110 may connect to the network 130 and send and/or receive data information through the network 130. For ease of understanding, the imaging system 110 in the medical imaging field is taken as an example below, but the present application is not limited to the scope of this embodiment. For example, the imaging system 110 may be Computed Tomography (CT) or Magnetic Resonance Imaging (MRI), and the network 130 of the image processing system 100 may include a wireless network (Bluetooth, wireless local area network (WLAN, Wi-Fi, WiMax, etc.)), a mobile network (2G, 3G, 4G signals, etc.), or other connection means (Virtual Private Network (VPN), shared network, Near Field Communication (NFC), ZigBee, etc.). In some embodiments, the network 130 may be used for communication of the image processing system 100, receiving information internal or external to the image processing system 100, and sending information to other parts of the image processing system 100 or to the outside. In some embodiments, the imaging system 110, the image processing server 120, and the database 140 may connect to the network 130 via wired connections, wireless connections, or a combination of wired and wireless connections.
The database 140 may store information. The database 140 may be built on a device with a storage function. The database 140 may be local or remote. In some embodiments, the database 140 or other storage devices in the image processing system 100 may store various information, such as image data. In some embodiments, the database 140 or other storage devices in the system may be media with read/write functions. The database 140 or other storage devices in the system may be devices internal to the system or external devices connected to the system. The database 140 may be connected to other storage devices in the system by wire or wirelessly. The database 140 or other storage devices in the system may include a hierarchical database, a network database, a relational database, etc., or a combination thereof. The database 140 or other storage devices in the system may digitize information and then store it using electrical, magnetic, optical, or other storage devices.
The database 140 or other storage devices in the system may be devices that store information electrically, for example, Random Access Memory (RAM), Read Only Memory (ROM), etc., or a combination thereof. The RAM may include a dekatron, a selectron, delay-line memory, a Williams tube, Dynamic Random Access Memory (DRAM), Static Random Access Memory (SRAM), Thyristor Random Access Memory (T-RAM), Zero-capacitor Random Access Memory (Z-RAM), etc., or a combination thereof. The ROM may include magnetic bubble memory, twistor memory, thin-film memory, plated-wire memory, magnetic-core memory, drum memory, optical disc drives, hard disks, magnetic tape, phase-change memory, flash memory, electronically erasable programmable read-only memory, erasable programmable read-only memory, programmable read-only memory, mask ROM, racetrack memory, resistive random-access memory, programmable metallization cells, etc., or a combination thereof. The database 140 or other storage devices in the system may be devices that store information magnetically, such as hard disks, floppy disks, magnetic tape, magnetic-core memory, magnetic bubble memory, USB flash drives, and flash memory. The database 140 or other storage devices in the system may be devices that store information optically, such as CDs or DVDs, or magneto-optically, such as magneto-optical disks. The access mode of the database 140 or other storage devices in the system may be random access, serial access, read-only access, etc., or a combination thereof. The database 140 or other storage devices in the system may be non-permanent memory or permanent memory. The storage devices mentioned above are merely examples, and the storage devices that the system can use are not limited thereto.
In some embodiments, the database 140 may be part of the imaging system 110 and/or the image processing server 120. In some embodiments, the database 140 may be independent and directly connected to the network 130. In some embodiments, the database 140 may store data collected from the imaging system 110, the image processing server 120, and/or the network 130. In some embodiments, the database 140 may store various data used, generated, and/or output during operation of the image processing server 120. In some embodiments, the connection or communication of the database 140 with the imaging system 110, the image processing server 120, and/or the network 130 may be wired, wireless, or a combination of both. In some embodiments, the imaging system 110 may access the database 140, the image processing server 120, etc., directly or via the network 130.
It should be noted that the image processing server 120 and/or the database 140 described above may actually exist within the imaging system 110 or perform their corresponding functions through a cloud computing platform. The cloud computing platform may include a storage-based cloud platform mainly for storing data, a computing-based cloud platform mainly for processing data, and an integrated cloud computing platform for both data storage and processing. The cloud platform used by the imaging system 110 may be a public cloud, a private cloud, a community cloud, a hybrid cloud, etc. For example, according to actual needs, the image information and/or data information output by the imaging system 110 may be computed and/or stored by a user cloud platform, or by the local image processing server 120 and/or database 140.
It should be noted that the above description of the image processing system 100 is for convenience of description only and does not limit the present application to the scope of the listed embodiments. It can be understood that, for those skilled in the art, after understanding the principle of the system, it may be possible, without departing from this principle, to combine the modules arbitrarily, or to form subsystems connected to other modules, and to make various modifications and changes to the configuration of the image processing system 100. However, such modifications and changes remain within the scope of the above description. For example, the database 140 may be a cloud computing platform with data storage functions, including public clouds, private clouds, community clouds, hybrid clouds, etc. Variations such as these are within the protection scope of the present application.
FIG. 2 is an exemplary schematic diagram of a computing device 200 of the image processing server 120 according to some embodiments of the present application. The computing device 200 may implement and/or realize the particular systems disclosed in this application. The particular system in this embodiment uses a functional block diagram to explain a hardware platform containing a user interface. The computing device 200 may implement one or more of the currently described components, modules, units, and subunits of the image processing server 120. In addition, the image processing server 120 can be implemented by the computing device 200 through its hardware devices, software programs, firmware, and combinations thereof. The computing device 200 may be a general-purpose computer or a special-purpose computer; both can be used to implement the particular system in this embodiment. For convenience, only one computing device is drawn in FIG. 2, but the relevant computer functions of processing and pushing information described in this embodiment may be implemented in a distributed manner by a group of similar platforms, spreading the processing load of the system.
As shown in FIG. 2, the computing device 200 may include an internal communication bus 210, a processor 220, a read-only memory (ROM) 230, a random access memory (RAM) 240, a communication port 250, an input/output component 260, a hard disk 270, and a user interface 280. The internal communication bus 210 may enable data communication among the components of the computing device 200. The processor 220 executes program instructions to perform any function, component, module, unit, or subunit of the image processing server 120 described in this disclosure. The processor 220 consists of one or more processors. The communication port 250 may enable data communication (e.g., via the network 130) between the computing device 200 and other parts of the image processing system 100 (such as the imaging system 110 and the image processing server 120). The computing device 200 may also include various forms of program storage units and data storage units, such as the hard disk 270, the read-only memory (ROM) 230, and the random access memory (RAM) 240, capable of storing various data files used in computer processing and/or communication, as well as possible program instructions executed by the processor 220. The input/output component 260 supports the input/output of data streams between the computing device 200 and other components (such as the user interface 280), and/or other components of the image processing system 100 (such as the database 140). The computing device 200 may also send and receive information and data from the network 130 via the communication port 250.
FIG. 3 is an exemplary schematic diagram of the processing engine 122 according to some embodiments of the present application. The processing engine 122 in the image processing server 120 may include a processing module 310, a communication module 320, and a storage module 330. The processing engine 122 may further include an input/output module 340. The input/output module 340 may receive image data from one or more imaging devices in the imaging system 110 and send it to the processing module 310, etc. The input/output module 340 may send the image data processed by the processing module 310, via the network 130, to the imaging system 110 and/or the database 140 connected to the image processing server 120. The connections between the modules of the processing engine 122 may be wired, wireless, and/or a combination of wired and wireless connections. The modules of the processing engine 122 may be local, remote, and/or a combination of local and remote. The correspondence between the modules of the processing engine 122 may be one-to-one, one-to-many, or many-to-many. For example, the processing engine 122 may include one processing module 310 and one communication module 320. As another example, the processing engine 122 may include multiple processing modules 310 and multiple storage modules 330; the multiple processing modules 310 may respectively correspond to the multiple storage modules 330 and respectively process image data from the corresponding storage modules 330. Image data may include one or more images (e.g., 2D images, 3D images) and one or more portions thereof, video data (e.g., one or more videos, video frames, and other data associated with video), data that can be used to process images and/or video (e.g., data for compression or decompression, encryption or decryption, sending or receiving, and playing images and/or video), data related to images, etc.
The input/output module 340 may receive information from other modules in the image processing system 100 or external modules. The input/output module 340 may send information to other modules in the image processing system 100 or external modules. In some embodiments, the input/output module 340 may receive image data generated by the imaging system 110. The image data may include computed tomography image data, X-ray image data, magnetic resonance image data, ultrasound image data, thermal image data, nuclear image data, optical image data, etc. In some embodiments, the information received by the input/output module 340 may be processed in the processing module 310 and/or stored in the storage module 330. In some embodiments, the input/output module 340 may output image data processed by the processing module 310. In some embodiments, the information received and/or output by the input/output module 340 may be data in the Digital Imaging and Communications in Medicine (DICOM) format. Data in DICOM format may be transmitted and/or stored according to one or more standards. In some embodiments, the input/output module 340 may perform corresponding operations through the input/output component 260.
The processing module 310 may process image data. In some embodiments, the processing module 310 may acquire image data from the imaging system 110 and/or the database 140 through the input/output module 340. In some embodiments, the processing module 310 may acquire image data directly from the storage module 330. In some embodiments, the processing module 310 may process the acquired image data. The processing of image data may include image digitization, image compression (e.g., image encoding), image enhancement and restoration (e.g., image enhancement, image restoration, image reconstruction), image analysis (e.g., image segmentation, image matching, image recognition), etc., or a combination thereof. In some embodiments, the processing module 310 may process medical image data. The processing of medical image data may include volume rendering (e.g., volume ray casting, early ray termination, determining 2D regions/3D volumes of interest (e.g., drawing 2D regions/3D volumes of interest, cropping 2D regions/3D volumes of interest, transferring 2D regions/3D volumes of interest, merging 2D regions/3D volumes of interest)), image compression (e.g., image encoding (including model-based coding and neural-network-based coding)), image analysis (e.g., image segmentation (including region segmentation (including region growing, region split-and-merge, and octree methods), threshold segmentation, edge segmentation, histogram methods, etc.)), image enhancement and restoration (e.g., filtering (including high-pass, low-pass, and band-pass filtering), Fourier transform, pseudo-color-enhanced image reconstruction (including texture mapping), image shading (including radiation shading), image rendering (including ray tracing, photon mapping, and volume rendering), image edge blending (including gray-window correlation matching)), fitting, interpolation, discretization, etc., or a combination thereof. For example, in medical image processing, a 2D region/3D volume of interest may be drawn on an image, and feature information (e.g., statistical features) of the 2D region/3D volume of interest may be analyzed. The statistical features may include, but are not limited to, variance, area, length, mean, maximum, minimum, volume, frequency distribution, histogram, etc., or a combination thereof. In some embodiments, the processing module 310 may perform corresponding operations through the processor 220.
In some embodiments, the processing module 310 may include one or more processing elements or devices, such as a central processing unit (CPU), a graphics processing unit (GPU), a digital signal processor (DSP), a system on a chip (SoC), a microcontroller unit (MCU), etc., or a combination thereof. In some embodiments, the processing module 310 may include processing elements with special functions. For example, the processing module 310 may include a processing element for determining 2D regions/3D volumes of interest of an image. As another example, the processing module 310 may include a processing element with user-defined functions.
The communication module 320 may enable communication between the image processing server 120 and the network 130. The communication mode of the communication module 320 may include wired communication and/or wireless communication. Wired communication may refer to communication through transmission media such as wires, cables, optical cables, waveguides, and nanomaterials; wireless communication may include IEEE 802.11 series wireless LAN communication, IEEE 802.15 series wireless communication (e.g., Bluetooth, ZigBee), mobile communication (e.g., TDMA, CDMA, WCDMA, TD-SCDMA, TD-LTE, FDD-LTE), satellite communication, microwave communication, scatter communication, atmospheric laser communication, etc., or a combination thereof. In some embodiments, the communication module 320 may encode the transmitted information using one or more encoding methods. The encoding methods may include phase encoding, non-return-to-zero code, differential Manchester code, etc., or a combination thereof. In some embodiments, the communication module 320 may select different encoding and/or transmission modes according to the image type. For example, when the image data is in DICOM format, the communication module 320 may encode and transmit it according to the DICOM standard. In some embodiments, the communication module 320 may perform corresponding operations through the communication port 250.
The storage module 330 may store information. The information may include the image data acquired by the input/output module 340, and/or the results processed by the processing module 310, etc. The information may include text, numbers, sound, images, video, etc., or a combination thereof. In some embodiments, the storage module 330 may include various types of storage devices, such as solid-state drives, mechanical hard drives, USB flash memory, SD memory cards, optical discs, random access memory (RAM), and read-only memory (ROM), etc., or a combination thereof. In some embodiments, the storage module 330 may be storage local to the image processing server 120, external storage, and/or storage connected via the network 130 (such as cloud storage). In some embodiments, the storage module 330 may include a data management unit (not shown). The data management unit may monitor and manage the data in the storage module, deleting data with zero or low utilization so that the storage module 330 maintains sufficient storage capacity. In some embodiments, the storage module 330 may perform corresponding operations through the ROM 230, the RAM 240, and/or other storage devices.
It should be noted that the above description of the processing engine 122 is for convenience of description only and does not limit the present application to the scope of the listed embodiments. It can be understood that, for those skilled in the art, after understanding the principle of each module, it may be possible, without departing from this principle, to combine the modules arbitrarily or form subsystems connected to other modules, and to make various modifications and changes to the configuration of the processor. However, such modifications and changes remain within the scope of the above description. For example, the processing engine 122 may include a control module. The control module may control the modules of the processing engine 122 to receive, process, store, input, and/or output image data. For example, the control module may control the input/output module 340 to acquire information (e.g., user instructions and expert opinions from the user interface 280) or transmit information to the network 130 (e.g., sharing patient information within a medical system).
FIG. 4 is an exemplary schematic diagram of the processing module 310 according to some embodiments of the present application. The processing module 310 may include a drawing unit 410, a transfer unit 420, a cropping unit 430, a merging unit 440, and an analysis unit 450. It should be noted that the above description of the processing module 310 of the processing engine 122 in the image processing server 120 is merely exemplary and does not limit the present application. In some embodiments, the processing module 310 may also contain other units. In some embodiments, some of the above units may not be necessary. In some embodiments, some of the above units may be combined into one unit to work together. In some embodiments, the above units may be independent, meaning each unit performs its own function. In some embodiments, the above units may be interrelated, meaning the data of each unit may be used across units.
In this application, the object of image processing may be an image or a portion thereof. The image may be a 2D image and/or a 3D image. In a 2D image, the finest resolvable element may be a pixel; in a 3D image, the finest resolvable element may be a voxel. The image may include one or more 2D regions of interest (ROIs) and/or 3D volumes of interest (VOIs). A 2D ROI may refer to one or more pixels of interest to the user in a 2D image (or in one layer of a 3D image). A 3D VOI may refer to one or more voxels of interest to the user in a 3D image. A 3D VOI may include the ROIs of one or more image layers. A 2D ROI/3D VOI may correspond to one or more tissues in the image or portions thereof; for example, a 2D ROI/3D VOI may refer to a tumor, sclerosis, or diseased tissue in the image. The processing module 310 may process pixels/voxels of the portion of the image corresponding to a tissue, organ, or related content (e.g., colon, small intestine, lung, or air or liquid therein), for example, drawing 2D ROIs/3D VOIs, transferring 2D ROIs/3D VOIs, cropping 2D ROIs/3D VOIs, merging 2D ROIs/3D VOIs, identifying or segmenting certain tissue in the image, or removing a region or volume from the image.
The drawing unit 410 may draw 2D regions/3D volumes based on image data. The drawing unit 410 may draw one or more specific 2D regions/3D volumes based on an image. In some embodiments, the specific 2D region/3D volume may be a 2D Region of Interest (ROI) and/or a 3D Volume of Interest (VOI). The drawing unit 410 may draw 2D regions/3D volumes of specific shapes. The specific shapes may include regular shapes and/or irregular shapes. The regular shapes may include rectangles, squares, rhombuses, circles, ellipses, triangles, cuboids, cubes, cones, spheres, etc. The irregular shapes may include 2D regions/3D volumes of any form. In some embodiments, the drawing unit 410 may perform arbitrary modification operations on the interior and/or boundary of a 2D region/3D volume. The operations may include stretching, dragging, zooming in, zooming out, erasing, thickening, adding color, etc. For example, a user may, on the user interface 280, add color to a specific 2D region/3D volume drawn by the drawing unit 410. In some embodiments, the drawing unit 410 may draw 2D ROIs/3D VOIs on images processed by the drawing unit 410, the transfer unit 420, the cropping unit 430, and/or the merging unit 440.
The transfer unit 420 may transfer 2D regions/3D volumes of interest based on image data. The transfer unit 420 may transfer one or more specific 2D regions/3D volumes of interest. The transfer may refer to copying the shape of a 2D region/3D volume of one image to a different position in the same image or to a different image; for example, copying the shape of one or more regions of one image layer to a different position in the same layer, copying the shape of one or more regions of one image layer to the same position in another layer, or copying the shape of one or more volumes of one 3D image to the same position in another 3D image. In some embodiments, the transfer may refer to generating a 2D region/3D volume of the same shape at the corresponding position of an image without changing the pixel value/voxel value information of the image.
In some embodiments, according to the position of the specific 2D region/3D volume of interest, the transfer unit 420 may transfer the specific 2D region/3D volume among different image layers (i.e., data layers) of the same 3D image (i.e., volume data). As an example, according to the 2D position of the specific 2D region/3D volume of interest, the transfer unit 420 may transfer it among different data layers of the same volume data. For example, for a 2D region of interest ROI1 in one data layer of a volume, the transfer unit 420 may determine the position information (e.g., 3D coordinates (x, y, z)) of one or more reference voxels in the 2D region of interest, and, according to the position information, generate a 2D region of interest at the corresponding position (e.g., 3D coordinates (x, y, z')) of another data layer. The reference voxel may be any voxel in ROI1, for example, the center voxel or a voxel on the edge of ROI1. During the transfer, the relative position of the reference voxel inside the 2D region of interest may remain unchanged. In some embodiments, the position information of the reference voxel may represent the position information of the 2D region/3D volume of interest. The transfer unit 420 may transfer the 2D region/3D volume of interest among one or more data layers of the volume data corresponding to the transfer depths in one or more transfer directions (e.g., a first data layer z1 and a second data layer z2 in the z-axis direction, a first data layer y1 and a second data layer y2 in the y-axis direction, a first data layer x1 and a second data layer x2 in the x-axis direction). The transfer direction may be perpendicular to the plane of the 2D region/3D volume of interest, or at any oblique angle to that plane, for example, the positive and/or negative Z direction of the volume data, the positive and/or negative Y direction, the positive and/or negative X direction, or any other direction in 3D space. For example, the transfer unit 420 may transfer a 2D region of interest ROI1 in the XY plane to the first data layer z1 to obtain a 2D region of interest ROI1' whose reference voxel is at 3D coordinates (x, y, z1). As another example, the transfer unit 420 may transfer ROI1 to the second data layer z2 to obtain a 2D region of interest ROI1'' whose reference voxel is at (x, y, z2). Similarly, a 2D region of interest ROI2 in the XZ plane (reference voxel at (x, y, z)) may be transferred to the first data layer y1 to obtain ROI2' with its reference voxel at (x, y1, z), or to the second data layer y2 to obtain ROI2'' at (x, y2, z); a 2D region of interest ROI3 in the YZ plane (reference voxel at (x, y, z)) may be transferred to the first data layer x1 to obtain ROI3' at (x1, y, z), or to the second data layer x2 to obtain ROI3'' at (x2, y, z). In some embodiments, the positive Z direction may refer to the direction toward the head along the head-to-tail axis of the scanned subject in the image, and the negative Z direction the direction toward the tail; the positive Y direction may refer to the direction toward the front along the front-back axis, and the negative Y direction the direction toward the back; the positive X direction may refer to the direction toward the right along the left-right axis, and the negative X direction the direction toward the left. The transfer depth may be the height of the volume data in the Z, Y, or X direction. The one or more data layers of the volume data may be obtained from the layer spacing of the volume data. The layer spacing may be 0.1-20 mm or other suitable spacing. The layer spacing may be for different axes, for example, X-axis spacing, Y-axis spacing, and Z-axis spacing, which may be the same or different.
In some embodiments, according to the position of the specific 2D region/3D volume, the transfer unit 420 may transfer the specific 2D region/3D volume between different image data. As an example, according to the 3D position of the specific 2D region/3D volume of interest, the transfer unit 420 may transfer it between different volume data. For example, for VOI1 in one volume data, the transfer unit 420 may determine the 3D position (x, y, z) of a reference voxel in VOI1; according to this 3D position, the transfer unit 420 may generate an identically shaped VOI1' at the same 3D position (x, y, z) and its neighborhood in another volume data. The reference voxel may be any voxel in VOI1, for example, the center voxel or a voxel on the surface of VOI1. In some embodiments, the position information of a reference voxel in the 3D volume of interest may represent the position information of the 3D volume of interest. During the transfer, the relative position of the reference voxel inside the 3D volume of interest may remain unchanged. In some embodiments, the transfer unit 420 may receive 2D ROIs/3D VOIs transferred between different data layers and/or between different volume data. In some embodiments, the transfer unit 420 may, through the drawing unit 410, draw the transferred 2D ROIs/3D VOIs in different data layers and/or different volume data. The transfer unit 420 may transfer 2D ROIs/3D VOIs processed by the drawing unit 410, the cropping unit 430, the transfer unit 420, and/or the merging unit 440.
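The layer-to-layer and volume-to-volume transfer described above can be sketched with boolean masks. The following is a minimal illustration, assuming ROIs/VOIs are represented as NumPy boolean arrays; the function names are this sketch's own, not from the application:

```python
import numpy as np

def transfer_roi_across_layers(volume_shape, roi_mask_2d, target_layers):
    """Copy the shape of a 2D ROI mask to other data layers (z-slices)
    of a volume, keeping the same in-plane (x, y) position."""
    depth = volume_shape[0]
    transferred = np.zeros(volume_shape, dtype=bool)
    for z in target_layers:
        if 0 <= z < depth:
            transferred[z] = roi_mask_2d  # same shape, same (x, y) position
    return transferred

def transfer_voi_across_volumes(voi_mask_3d, target_volume_shape):
    """Copy a 3D VOI mask to another volume at the same 3D coordinates.
    Assumes the target volume is at least as large as the source mask."""
    transferred = np.zeros(target_volume_shape, dtype=bool)
    sz, sy, sx = voi_mask_3d.shape
    transferred[:sz, :sy, :sx] = voi_mask_3d
    return transferred

# A 3x3 square ROI in one slice, transferred to layers 2 and 4 of a volume
roi = np.zeros((5, 5), dtype=bool)
roi[1:4, 1:4] = True
result = transfer_roi_across_layers((6, 5, 5), roi, target_layers=[2, 4])
```

Only the mask (the shape and position of the region) is copied; the underlying pixel/voxel values of the target layers are untouched, matching the description above.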
The cropping unit 430 may crop 2D regions/3D volumes of interest based on image data. The cropping unit 430 may keep part of a 2D ROI/3D VOI in a volume or data layer, and/or cut away one or more specific 2D regions/3D volumes inside the 2D ROI/3D VOI. In some embodiments, the cropping unit 430 may implement the cropping function through operations on pixel sets and/or voxel sets. As an example, the cropping unit 430 may cut away a specific 2D region/3D volume inside a 2D ROI/3D VOI that the user does not need, to obtain the cropping result (i.e., the set difference between the 2D ROI/3D VOI and the specific 2D region/3D volume to be cut away). For example, the cropping unit 430 may subtract the pixel/voxel set corresponding to the unwanted specific 2D region/3D volume from the pixel/voxel set of the 2D ROI/3D VOI to implement cropping. In some embodiments, the cropping unit 430 may adjust the cropped 2D region/3D volume of interest. As an example, the cropping unit 430 may restore an incorrectly cropped 2D region/3D volume. For example, the cropping unit 430 may restore the 2D ROI/3D VOI by merging the pixel/voxel set of the incorrectly cropped 2D region/3D volume back in. The cropping unit 430 may crop based on image data processed by the drawing unit 410, the transfer unit 420, the cropping unit 430, and/or the merging unit 440.
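The set-difference cropping and its undo (restore by union) can be illustrated with boolean masks. A minimal sketch, assuming ROIs are NumPy boolean arrays; the function names are hypothetical:

```python
import numpy as np

def crop_roi(roi_mask, to_crop_mask):
    """Cropping as a set difference: remove from the ROI the pixels/voxels
    it shares with the to-be-cropped region, i.e. ROI minus (ROI ∩ to_crop)."""
    return roi_mask & ~to_crop_mask

def restore_crop(cropped_mask, restored_mask):
    """Undo an erroneous crop by merging the removed pixel/voxel set back in."""
    return cropped_mask | restored_mask

roi = np.zeros((4, 4), dtype=bool)
roi[0:3, 0:3] = True                  # a 3x3 ROI, 9 pixels
hole = np.zeros((4, 4), dtype=bool)
hole[1, 1] = True                     # 1 unwanted pixel inside the ROI
cropped = crop_roi(roi, hole)         # 8 pixels remain
restored = restore_crop(cropped, hole)  # back to the original 9
```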
The merging unit 440 may merge 2D regions/3D volumes of interest based on image data. The merging unit 440 may merge two or more specific 2D regions/3D volumes (e.g., 2D ROIs/3D VOIs). In some embodiments, the merging unit 440 may implement the merging of specific 2D regions/3D volumes by merging pixel/voxel sets. In some embodiments, the merging result may be the union of the two or more pixel/voxel sets to be merged. In some embodiments, the merging unit 440 may merge at least two specific 2D regions/3D volumes. In some embodiments, the merging unit 440 may merge at least two specific volumes. In some embodiments, the merging unit 440 may merge one or more specific 2D regions/3D volumes with one or more specific volumes. As an example, the merging unit 440 may merge at least two specific 2D regions/3D volumes. For example, the merging unit 440 may merge the data of two different ROIs (e.g., a first 2D region of interest ROI1 and a second 2D region of interest ROI2) in the same data layer. In some embodiments, the merging unit 440 may merge the data of at least two connected or disconnected 2D regions/3D volumes in the same data layer for further analysis. The merging unit 440 may merge 2D ROIs/3D VOIs processed by the drawing unit 410, the transfer unit 420, the merging unit 440, and/or the cropping unit 430.
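Merging as a set union, including of disconnected regions, can be sketched the same way. A minimal illustration with NumPy boolean masks; the function name is hypothetical:

```python
import numpy as np

def merge_rois(*masks):
    """Merging as a set union: the merged ROI/VOI contains every pixel/voxel
    that belongs to at least one input region, connected or not."""
    merged = np.zeros_like(masks[0], dtype=bool)
    for m in masks:
        merged |= m
    return merged

roi1 = np.zeros((5, 5), dtype=bool)
roi1[0:2, 0:2] = True        # 4 pixels in one corner
roi2 = np.zeros((5, 5), dtype=bool)
roi2[3:5, 3:5] = True        # 4 pixels in the opposite corner, disconnected
merged = merge_rois(roi1, roi2)  # the two regions are analyzed as one
```

The merged mask can then be passed to the analysis step so that overall statistics cover both regions at once, which is the capability the background section says conventional tools lack.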
The analysis unit 450 may analyze information of 2D regions/3D volumes of interest based on image data. The analysis unit 450 may analyze information of specific 2D regions/3D volumes of interest (e.g., 2D ROIs/3D VOIs). In some embodiments, the analysis unit 450 may analyze feature information (e.g., statistical feature information) of a 2D ROI/3D VOI. The statistical features may include variance, area, length, mean, maximum, minimum, volume, etc., or a combination thereof. In some embodiments, the feature information may include gray-value feature information, such as gray-level distribution and mean gray value. The analysis unit 450 may display analysis results on the user interface 280 through the input/output module 340. The analysis unit 450 may analyze images, 2D ROIs/3D VOIs, etc., processed by the drawing unit 410, the transfer unit 420, the cropping unit 430, and/or the merging unit 440. In some embodiments, the analysis unit 450 may analyze whether a specific 2D region/3D volume requires further processing. For example, the analysis unit 450 may determine, through the user interface 280, whether to continue drawing, cropping, transferring, and/or merging 2D ROIs/3D VOIs.
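The statistical features listed above reduce to computations over the pixel/voxel values selected by an ROI/VOI mask. A minimal sketch, assuming a NumPy image and boolean mask; the function name and the `voxel_volume` parameter are this sketch's own assumptions:

```python
import numpy as np

def roi_statistics(image, roi_mask, voxel_volume=1.0):
    """Statistical features of the pixels/voxels inside an ROI/VOI mask:
    mean, variance, max, min, and area/volume (count times unit size)."""
    values = image[roi_mask]          # gray values inside the region only
    return {
        "mean": float(values.mean()),
        "variance": float(values.var()),
        "max": float(values.max()),
        "min": float(values.min()),
        "volume": float(values.size * voxel_volume),
    }

image = np.arange(16, dtype=float).reshape(4, 4)
mask = np.zeros((4, 4), dtype=bool)
mask[0, 0:4] = True                   # first row: gray values 0, 1, 2, 3
stats = roi_statistics(image, mask)
```

Because the same function accepts any mask, it works unchanged on a merged mask, giving the overall statistics of several ROIs/VOIs at once.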
It should be noted that the above description of the processing module 310 in the image processing server 120 is merely exemplary and does not limit the present application to the scope of the listed embodiments. It can be understood that, for those skilled in the art, after understanding the functions performed by the processing module, it may be possible, while implementing the above functions, to combine the modules, units, or subunits arbitrarily and to make various modifications and changes to the configuration of the processing module 310. However, such modifications and changes remain within the scope of the above description. In some embodiments, the processing module 310 may also include an independent image unit to process image data. The independent image unit may be independent of the drawing unit 410. As another example, an image unit may implement the functions of the drawing unit 410, the transfer unit 420, the cropping unit 430, the merging unit 440, and/or the analysis unit 450. In some embodiments, some units are not necessary, for example, the drawing unit 410. In some embodiments, the processing module 310 may contain other units or subunits. Variations such as these are within the protection scope of the present application.
FIG. 5 is an exemplary flowchart of processing an image according to some embodiments of the present application. The process 500 may be implemented by the processing engine 122. In 501, image data may be acquired. In some embodiments, 501 may be implemented by the input/output module 340. In some embodiments, the image data may be obtained by the imaging system 110 scanning a target object or a portion thereof. In some embodiments, the image data may be acquired from an internal storage device (e.g., the database 140 and/or the storage module 330). In some embodiments, the image data may be acquired from an external storage device (e.g., a network storage device, cloud disk, mobile hard drive, etc., or a combination thereof). The image data may include image matrices, image information, image vectors, bitmaps, animations, image codes, primitives, segments, etc., or a combination thereof.
In some embodiments, the image data may be medical image data. In some embodiments, the medical image data may be obtained by one or more scanners. The scanners may include magnetic resonance imaging (MRI), computed tomography (CT), positron emission tomography (PET), single photon emission computed tomography (SPECT), computed tomography colonography (CTC), etc., or a combination thereof. In some embodiments, the image data may be data obtained by scanning organs, bodies, objects, dysfunctions, tumors, or various other targets. In some embodiments, the image data may be data obtained by scanning the head, chest, organs, bones, blood vessels, colon, or various other targets. In some embodiments, the image data may be 2D data and/or 3D data. In some embodiments, the image data may consist of multiple 2D pixels or 3D voxels. A value in the image data may correspond to one or more properties of the pixel or voxel, such as gray level, brightness, color, absorbance of X-rays or gamma rays, hydrogen atom density, biomolecular metabolism, and receptor and neurotransmitter activity.
In 502, at least one region of interest (2D region/3D volume of interest) may be determined based on the image data acquired in 501. In some embodiments, 502 may be implemented by the drawing unit 410, the transfer unit 420, the cropping unit 430, or the merging unit 440 in the processing module 310, or any combination of one or more of the above units. Determining the 2D region/3D volume of interest may include drawing a 2D region/3D volume of interest, transferring a 2D region/3D volume of interest, cropping a 2D region/3D volume of interest, and/or merging 2D regions/3D volumes of interest. In some embodiments, the region of interest may include a 2D region of interest (ROI) and/or a 3D volume of interest. The 2D ROI may be a specific region of different sizes and/or shapes. As an example, the region of interest may be a region outlined by a circle, ellipse, box, irregular polygon, etc. The region of interest may be a specific region outlined by a rectangle. For example, the specific region may be a region requiring further processing. Drawing a region of interest may be outlining a specific region within a volume data. In some embodiments, the processing module 310 may determine one or more 3D VOIs. Transferring a region of interest may be passing the region of interest of one data layer to other data layers within the same volume data. Transferring a region of interest may also be passing a region of interest within one volume data to other volume data. Cropping a region of interest may be removing a specific region within a region of interest. Merging regions of interest may be merging two or more regions of interest. In some embodiments, merging two regions of interest may be merging the image data of the two regions of interest. As an example, merging the regions of interest may be merging the pixel sets of the two regions of interest.
In 503, image data related to the at least one 2D region/3D volume of interest determined in 502 may be analyzed. In some embodiments, 503 may be implemented by the analysis unit 450 in the processing module 310. In some embodiments, analyzing the image data related to the 2D region/3D volume of interest may include analyzing its feature information. As an example, the feature information may be statistical features. For example, the statistical features may include variance, area, length, mean, maximum, minimum, volume, etc., or a combination thereof. In some embodiments, analyzing the image data related to the 2D region/3D volume of interest may include determining whether a tissue image is within the 2D region/3D volume of interest. The image data may correspond to a tissue, organ, and/or related content (e.g., colon, small intestine, lung, or air or liquid therein). As an example, when the related image data is determined to be within the 2D region/3D volume of interest, it may be further determined whether the related image data satisfies a specific condition. The specific condition may include an area above a certain threshold and/or a volume above a certain threshold. For example, when the related image data satisfies the specific condition, the image data may be further analyzed and/or processed.
In 504, the results of the analysis in 503 may be displayed. In some embodiments, 504 may be implemented by the input/output module 340. As an example, the analysis results may be displayed in the user interface 280 through the input/output module 340 and/or the input/output component 260. In some embodiments, displaying the analysis results may be outputting the image data related to the 2D region/3D volume of interest obtained through the analysis in 503. As an example, outputting the image data may include sending the analyzed image data to other modules of the image processing system 100. For example, the input/output module 340 may, in 504, send the analyzed image data directly and/or via the network 130 to the imaging system 110 and/or the database 140. In some embodiments, outputting the image data may include displaying the analyzed image data through a display module in the imaging system 110 and/or the image processing server 120. In some embodiments, 504 may send the analysis results to modules or devices outside the system. The input/output module 340 may send image data wirelessly, by wire, or by a combination of both. For example, the analysis results may be sent to modules or devices outside the system through the communication module 320 of the processing engine 122 in the image processing server 120. In some embodiments, 504 may further store the analysis results in the storage module 330 and/or the database 140. In some embodiments, 504 may display the image data acquired in 501, the 2D region/3D volume of interest determined in 502, or other information related to intermediate states of the processing.
It should be noted that the above description of process 500 is merely exemplary and does not limit the present application to the scope of the listed embodiments. It can be understood that, for those skilled in the art, after understanding the operations performed in process 500, it may be possible, while implementing the above functions, to combine the operations arbitrarily and make various modifications and changes to the operations of the process. However, such modifications and changes remain within the scope of the above description. In some embodiments, process 500 may omit some operations. As an example, operation 503 and/or operation 504 may not be performed. In some embodiments, process 500 may contain other operations, for example, processing image data related to the 2D region/3D volume of interest. Variations such as these are within the protection scope of the present application.
FIG. 6 is an exemplary flowchart of determining a region of interest (2D region/3D volume of interest) according to some embodiments of the present application. Process 600 may be implemented by the processing module 310 in the image processing server 120.
在601,可以绘制一个或多个感兴趣区域(感兴趣二维区域/三维体积)。在一些实施例中,601可以通过处理模块310中的绘制单元410实现。在一些实施例中,601可以基于501中获取的图像数据绘制一个或多个感兴趣二维区域/三维体积。所述图像数据可以包括医学图像。所述医学图像可以包括磁共振图像(MRI图像)、计算机断层扫描图像(CT图像)、正电子计算机断层显像图像(PET图像)、单光子发射计算机断层显像图像(SPECT图像)、计算机断层扫描结肠图像(CTC图像)等。作为示例,可以在符合医学数字成像和通信标准3.0(Digital Imaging and Communications in Medicine,DICOM 3.0)的图像数据中绘制一个感兴趣二维区域/三维体积。
在一些实施例中,601中感兴趣二维区域/三维体积的绘制可以采用自动的、手动的和/或二者结合的方式。所述自动绘制可以指系统在图像数据的特定位置自动勾勒出特定的形状二维区域/三维体积。作为示例,所述特定位置可以是由系统确定的,和/或由用户选择的。所述特定的形状二维区域/三维体积可以是由系统确定的,和/或由用户选择的。例如,用户(例如,医生、专家等)可以根据经验选择图像数据的一个特定位置,和/或需要绘制的一个特定形状/体积。在一些实施例中,绘制单元410可以根据一个或多个已绘制的感兴趣二维区域/三维体积的位置和/或形状,在图像数据中的相应位置绘制相应形状的感兴趣二维区域/三维体积。在一些实施例中,所述自动绘制可以指绘制单元410利用一个或多个算法对图像数据进行分割,以提取出感兴趣二维区域/三维体积。所述算法可以包括图像分割算法,例如,灰度阈值分割法、区域生长和分裂合并法、边缘分割法、直方图法、基于模糊理论分割法(例如模糊阈值分割法、模糊连接度分割法、模糊聚类分割法等)、基于神经网络分割法、基于数学形态学分割法(例如形态学分水岭算法等)等,或多种的组合。所述手动绘制可以由用户在图像数据的特定位置手动勾勒出特定的形状的感兴趣二维区域/三维体积。所述特定的形状可以包括规则的形状和/或不规则的形状。所述绘制一个感兴趣二维区域/三维体积可以包括绘制特定形状的二维区域/三维体积,也可以包括对感兴趣二维区域/三维体积和/或感兴趣二维区域/三维体积边界进行的不同操作。作为示例,所述操作可以包括对绘制的感兴趣二维区域/三维体积进行修改。所述修改可以包括拉伸、拖动、擦除、加粗、添加颜色等。例如,用户可以通过添加感兴趣二维区域/三维体积边界颜色修改已绘制的感兴趣二维区域/三维体积。
在一些实施例中,所述绘制一个感兴趣二维区域/三维体积可以是在已有的感兴趣二维区域/三维体积内部再绘制一个特定形状的感兴趣二维区域/三维体积。作为示例,所述已有的感兴趣二维区域/三维体积可以是通过绘制、裁剪、传递和/或合并得到的感兴趣二维区域/三维体积。例如,601可以基于602中裁剪的感兴趣二维区域/三维体积、603中传递的感兴趣二维区域/三维体积、和/或604中合并的感兴趣二维区域/三维体积绘制一个或多个感兴趣二维区域/三维体积。又如,601可以在已绘制的感兴趣二维区域/三维体积内部再绘制一个感兴趣二维区域/三维体积。在一些实施例中,601中绘制的感兴趣二维区域/三维体积可以进行后续的绘制、裁剪、传递和/或合并等处理。
在602,可以裁剪一个或多个感兴趣二维区域/三维体积。在一些实施例中,602可以通过处理模块310中的裁剪单元430实现。所述裁剪一个感兴趣二维区域/三维体积可以是去除一个感兴趣二维区域/三维体积内部的一个特定感兴趣二维区域/三维体积。在一些实施例中,602中可以调整所述需裁剪的特定感兴趣二维区域/三维体积。作为示例,所述调整可以包括还原已去除的特定感兴趣二维区域/三维体积,缩小或扩大待裁剪的特定感兴趣二维区域/三维体积,改变待裁剪的特定感兴趣二维区域/三维体积的特定形状、位置等。在一些实施例中,所述裁剪一个感兴趣二维区域/三维体积可以通过像素集合和/或体素集合的运算实现。作为示例,所述裁剪感兴趣二维区域/三维体积可以通过求取感兴趣二维区域/三维体积的像素/体素集合与所述感兴趣二维区域/三维体积内部的一个待裁剪的特定二维区域/三维体积的像素/体素集合的差集而得到。
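上文“求取像素/体素集合的差集”的裁剪操作可以用布尔掩膜的集合运算写出。以下仅为一种可能实现的草图,示例数据为假设:

```python
import numpy as np

# 第一感兴趣区域:6x6 数据层中一个 4x4 的方形掩膜
first = np.zeros((6, 6), dtype=bool)
first[1:5, 1:5] = True

# 待裁剪的特定区域:其内部一个 2x2 的方形掩膜
to_crop = np.zeros((6, 6), dtype=bool)
to_crop[2:4, 2:4] = True

# 差集:保留属于 first 而不属于 to_crop 的像素
cropped = first & ~to_crop

remaining = int(cropped.sum())  # 16 - 4 = 12 个像素
```

对感兴趣三维体积,同样的逐体素逻辑运算在三维布尔数组上成立。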
在一些实施例中,所述裁剪一个感兴趣二维区域/三维体积可以是在已有的感兴趣二维区域/三维体积内部再进行裁剪。作为示例,所述已有的感兴趣二维区域/三维体积可以是通过绘制、裁剪、传递和/或合并得到的感兴趣二维区域/三维体积。例如,602可以基于601中绘制的感兴趣二维区域/三维体积、603中传递的感兴趣二维区域/三维体积、和/或604中合并的感兴趣二维区域/三维体积裁剪感兴趣二维区域/三维体积。又如,602可以在已裁剪的感兴趣二维区域/三维体积内部再进行裁剪。在一些实施例中,所述裁剪后的感兴趣二维区域/三维体积可以进行后续的绘制、裁剪、传递和/或合并等处理。
在603,可以传递一个或多个感兴趣二维区域/三维体积。在一些实施例中,603可以通过处理模块310中的传递单元420实现。所述传递一个感兴趣二维区域/三维体积可以是传递感兴趣二维区域/三维体积的位置信息和/或形状信息。所述位置信息可以包括二维位置,三维位置等。所述形状信息可以包括规则形状、不规则形状等。在一些实施例中,所述传递感兴趣二维区域/三维体积可以是在不同的体数据之间传递,也可以是在同一体数据内进行传递。作为示例,所述传递感兴趣二维区域/三维体积可以是根据感兴趣二维区域/三维体积的二维位置在同一体数据的不同数据层之间进行传递。又如,所述传递感兴趣二维区域/三维体积可以是根据感兴趣二维区域/三维体积的三维位置在不同体数据之间进行传递。
在一些实施例中,所述传递感兴趣二维区域/三维体积可以是将已有的感 兴趣二维区域/三维体积的形状信息通过位置信息复制到待传递的图像数据。例如,在同一体数据的不同数据层之间,传递后的感兴趣二维区域/三维体积可以具有与被传递的感兴趣二维区域/三维体积相同的二维位置和/或形状。又如,在不同的体数据之间,传递后的感兴趣二维区域/三维体积可以具有与被传递的感兴趣二维区域/三维体积相同的三维位置和/或形状。在一些实施例中,603中可以不改变体数据和/或数据层的体素值/像素值。
在一些实施例中,所述传递一个感兴趣二维区域/三维体积可以是传递已有的感兴趣二维区域/三维体积。作为示例,所述已有的感兴趣二维区域/三维体积可以是通过绘制、裁剪、传递和/或合并得到的感兴趣二维区域/三维体积。例如,603可以基于601中绘制的感兴趣二维区域/三维体积、602中裁剪的感兴趣二维区域/三维体积、和/或604中合并的感兴趣二维区域/三维体积传递感兴趣二维区域/三维体积。在一些实施例中,所述传递后的感兴趣二维区域/三维体积可以进行后续的绘制、裁剪、传递和/或合并等处理。
在604,可以合并至少两个感兴趣二维区域/三维体积。在一些实施例中,604可以通过处理模块310中的合并单元440实现。所述合并至少两个感兴趣二维区域/三维体积可以是合并至少两个感兴趣二维区域/三维体积的数据,所述数据包括特征信息。所述特征信息可以包括统计学特征等。在一些实施例中,所述统计学特征可以包括方差、面积、长度、平均值、最大值、最小值、体积等。在一些实施例中,所述合并至少两个感兴趣二维区域/三维体积可以通过像素集合和/或体素集合的运算实现。作为示例,所述合并至少两个感兴趣二维区域/三维体积可以指求取一个感兴趣二维区域/三维体积的像素集合与另一个感兴趣二维区域/三维体积的像素集合的并集。
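上述“求取两个感兴趣区域像素集合的并集”可以粗略示意为布尔掩膜的逻辑或运算。示例为假设,并非本申请的具体实现:

```python
import numpy as np

region_a = np.zeros((5, 5), dtype=bool)
region_a[0:2, 0:2] = True          # 4 个像素

region_b = np.zeros((5, 5), dtype=bool)
region_b[1:4, 1:4] = True          # 9 个像素,与 region_a 有 1 个像素重叠

merged = region_a | region_b       # 并集:属于任一区域的像素都属于合并结果

merged_count = int(merged.sum())   # 4 + 9 - 1 = 12
```

合并后可在 merged 掩膜上统一计算统计学特征(平均值、方差、面积/体积等),即对应正文所述的合并分析。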
在一些实施例中,所述合并感兴趣二维区域/三维体积可以是合并已有的感兴趣二维区域/三维体积。作为示例,所述已有的感兴趣二维区域/三维体积可以是通过绘制、裁剪、传递和/或合并得到的感兴趣二维区域/三维体积。例如,604中可以基于601中绘制的感兴趣二维区域/三维体积、602中裁剪的感兴趣二维区域/三维体积、603中传递的感兴趣二维区域/三维体积中的两个或多个进行合并。又如,604中可以合并一个裁剪后的感兴趣二维区域/三维体积和一个合并后的感兴趣二维区域/三维体积。再如,604中可以合并一个传递后的感兴趣二维区域/三维体积和一个绘制的感兴趣二维 区域/三维体积。在一些实施例中,所述合并后的感兴趣二维区域/三维体积可以进行后续的绘制、裁剪、传递和/或合并等处理。
需要注意的是,以上对流程600的描述只是示例性的,并不能把本申请限制在所列举的实施例范围之内。可以理解,对于本领域的技术人员来说,在了解流程600所执行的操作后,可能在实现上述功能的情况下,对各个操作进行任意组合,对流程的操作进行各种修正和改变。但这些修正和改变仍在以上描述的范围内。在一些实施例中,流程600可以不执行部分操作。作为示例,流程600可以不执行操作602和/或操作603。在一些实施例中,流程600所执行的操作顺序可以互换。例如,流程600可以先执行操作603再执行操作602。在一些实施例中,流程600可以重复部分操作。例如,流程600可以在操作604之后再执行操作602和/或操作603。在一些实施例中,601绘制的感兴趣二维区域、602裁剪的感兴趣二维区域、603传递的感兴趣二维区域、和/或604合并的感兴趣二维区域可以替换为感兴趣三维体积。对感兴趣二维区域/三维体积进行的绘制、裁剪、传递、合并等操作都可以参考流程600中的处理。诸如此类的变形,均在本申请的保护范围之内。
图7A是根据本申请的一些实施例所示的裁剪感兴趣区域(感兴趣二维区域/三维体积)的一种示例性流程图。流程700可以通过处理模块310中的裁剪单元430实现。流程700可以是流程600中602的一种示例性实现方式。在一些实施例中,所述裁剪感兴趣二维区域/三维体积可以通过感兴趣二维区域/三维体积的像素集合和/或体素集合的运算实现。
在701,可以获取第一感兴趣二维区域/三维体积。在一些实施例中,701可以通过处理引擎122的输入/输出模块340,和/或存储模块330实现。所述获取的第一感兴趣二维区域/三维体积可以包括绘制的感兴趣二维区域/三维体积、传递后的感兴趣二维区域/三维体积、合并后的感兴趣二维区域/三维体积、和/或裁剪后的感兴趣二维区域/三维体积等。作为示例,所述获取第一感兴趣二维区域/三维体积可以是通过处理模块310中的绘制单元410绘制得到的。在一些实施例中,所述第一感兴趣二维区域/三维体积可以是一个连通域或非连通域。如图7B所示,图7B是根据本申请的一些实施例所示的感兴趣区域(感兴趣二维区域/三维体积)的一个示例性示意图。在图7B中,感兴趣二维区域/三维体积720可以是一个形状不规则的连通或非连通的二维区域/三维 体积。在一些实施例中,用户界面280可以显示感兴趣二维区域/三维体积720和/或它的特征信息,例如,统计学特征。
在702,可以绘制第二感兴趣二维区域/三维体积。在一些实施例中,702可以通过处理模块310中的绘制单元410实现。在一些实施例中,可以在第一感兴趣二维区域/三维体积内部绘制第二感兴趣二维区域/三维体积。在一些实施例中,可以在第一感兴趣二维区域/三维体积内部和外部绘制第二感兴趣二维区域/三维体积。在一些实施例中,所述第二感兴趣二维区域/三维体积可以是在第一感兴趣二维区域/三维体积内部的部分二维区域/三维体积。所述第二感兴趣二维区域/三维体积可以是第一感兴趣二维区域/三维体积内部用户不需要分析的部分二维区域/三维体积。在一些实施例中,所述第二感兴趣二维区域/三维体积的一部分可以在第一感兴趣二维区域/三维体积内部,而另一部分可以在第一感兴趣二维区域/三维体积外部,即,所述第二感兴趣二维区域/三维体积与第一感兴趣二维区域/三维体积可以具有一个或多个交集。作为示例,所述第二感兴趣二维区域/三维体积可以是组织内部的空洞结构。所述空洞结构可以是环状肿瘤内部的坏死区域,脑内部环状缺血病灶的坏死区域,和/或气管的中空结构等。在一些实施例中,702中可以进一步修改第二感兴趣二维区域/三维体积的形状和/或大小,也可以通过调整第二感兴趣二维区域/三维体积的边界进行放大和/或缩小等操作。作为示例,所述调整边界可以是拖动边界,或设置边界的尺寸等。
在703,可以从第一感兴趣二维区域/三维体积裁剪第二感兴趣二维区域/三维体积,以获取第三感兴趣二维区域/三维体积。在一些实施例中,703可以通过处理模块310中的裁剪单元430实现。所述裁剪第二感兴趣二维区域/三维体积可以是在第一感兴趣二维区域/三维体积内部去除第二感兴趣二维区域/三维体积。所述获取的第三感兴趣二维区域/三维体积可以是第一感兴趣二维区域/三维体积去除第二感兴趣二维区域/三维体积后的二维区域/三维体积。在一些实施例中,第二感兴趣二维区域/三维体积可以位于第一感兴趣二维区域/三维体积内部,那么,703中可以求取第一感兴趣二维区域/三维体积与第二感兴趣二维区域/三维体积的差集,以获得第三感兴趣二维区域/三维体积。在一些实施例中,第二感兴趣二维区域/三维体积和第一感兴趣二维区域/三维体积可以具有一个或多个交集,那么,703中可以求取第一感兴趣二维区域/三维体积与所述交集的差集,以获得第三感兴趣二维区域/三维体积。作为示例,所述获取的第三感兴 趣二维区域/三维体积可以是环状的组织结构。所述环状结构可以是不包含内部坏死组织的环状肿瘤,和/或不包含脑内部坏死的环状缺血病灶。如图7C所示,图7C是根据本申请的一些实施例所示的裁剪后的感兴趣区域(感兴趣二维区域/三维体积)的一个示例性示意图。在图7C中,感兴趣二维区域/三维体积730可以是一个裁剪后的连通域或非连通域。所述待裁剪的第二感兴趣二维区域/三维体积可以是感兴趣二维区域/三维体积731、感兴趣二维区域/三维体积732、感兴趣二维区域/三维体积733等中的一个或多个。作为示例,所述第一感兴趣二维区域/三维体积可以是感兴趣二维区域/三维体积730,所述第二感兴趣二维区域/三维体积可以是感兴趣二维区域/三维体积731,所述第三感兴趣二维区域/三维体积可以是感兴趣二维区域/三维体积730去除感兴趣二维区域/三维体积731后得到的二维区域/三维体积。
在一些实施例中,用户需要恢复裁剪前的感兴趣二维区域/三维体积,和/或第二感兴趣二维区域/三维体积被错误裁剪时,703可以还原已裁剪的第二感兴趣二维区域/三维体积。作为示例,可以撤销703的裁剪操作,也可以合并裁剪后的感兴趣二维区域/三维体积和被裁剪的感兴趣二维区域/三维体积以获取裁剪前的感兴趣二维区域/三维体积。例如,可以通过合并第三感兴趣二维区域/三维体积的像素集合和第二感兴趣二维区域/三维体积的像素集合以获取第一感兴趣二维区域/三维体积的像素集合。
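当第二感兴趣区域完全位于第一感兴趣区域内部时,上述“合并第三区域与第二区域以还原第一区域”的性质可以示意如下(草图,示例数据为假设):

```python
import numpy as np

first = np.zeros((6, 6), dtype=bool)
first[1:5, 1:5] = True             # 第一感兴趣区域

second = np.zeros((6, 6), dtype=bool)
second[2:4, 2:4] = True            # 位于第一区域内部的被裁剪区域

third = first & ~second            # 裁剪:得到第三感兴趣区域
restored = third | second          # 还原:并集等于裁剪前的第一区域

is_restored = bool(np.array_equal(restored, first))
```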
在704,可以判断是否继续裁剪。在一些实施例中,704可以通过处理模块310中的分析单元450实现。所述判断是否继续裁剪可以根据一个或多个标准进行。在一些实施例中,所述标准可以存储在存储设备(例如,存储模块330、数据库140等)中以便分析单元450调用相关信息进行自动判断。在一些实施例中,所述标准可以基于用户的经验,以便用户在查看第三感兴趣二维区域/三维体积时进行人工判断。所述标准可以包括第一感兴趣二维区域/三维体积内部是否存在其他不需要分析的二维区域/三维体积。作为示例,所述判断可以是基于组织内部是否存在其他坏死的区域。
若判断为否,流程700可以结束裁剪。在705,可以分析第三感兴趣二维区域/三维体积的特征信息。在一些实施例中,705可以通过处理模块310中的分析单元450实现。在一些实施例中,可以不执行操作704,在703之后直接执行705。所述特征信息可以包括统计学特征信息。如图7C所示,若不继续裁剪,流程700处理得到的感兴趣二维区域/三维体积可以是感兴趣二维区域/三维体积730裁剪感兴趣二维区域/三维体积731后的二维区域/三维体积。作为示例,感兴趣二维区域/三维体积731可以是规则的形状,例如椭圆形。
若判断为是,流程700可以继续裁剪。流程700可以返回至702,在第一感兴趣二维区域/三维体积内部绘制其他待裁剪的感兴趣二维区域/三维体积。如图7C所示,若继续裁剪,感兴趣二维区域/三维体积730可以继续裁剪感兴趣二维区域/三维体积732和/或感兴趣二维区域/三维体积733。作为示例,感兴趣二维区域/三维体积732可以是特定的形状,例如圆角矩形。又如,感兴趣二维区域/三维体积733可以是不规则的形状。在一些实施例中,用户界面280可以显示感兴趣二维区域/三维体积730裁剪后的特征信息,例如,裁剪后的统计信息。
需要注意的是,以上对流程700的描述以及图7B、图7C的示意图只是示例性的,并不能把本申请限制在所列举的实施例范围之内。可以理解,对于本领域的技术人员来说,在了解流程700所执行的操作后,可能在实现上述功能的情况下,对各个操作进行任意组合,对流程的操作进行各种修正和改变。但这些修正和改变仍在以上描述的范围内。例如,在一些实施例中,流程700可以不执行部分操作(例如,操作704)。又如,流程700可以从一个感兴趣区域裁剪另一个感兴趣区域,从一个感兴趣三维体积裁剪一个感兴趣二维区域,从一个感兴趣三维体积裁剪另一个感兴趣三维体积,以及,从一个感兴趣二维区域裁剪一个感兴趣三维体积(具体地,所述感兴趣三维体积与所述感兴趣二维区域的交集)。诸如此类的变形,均在本申请的保护范围之内。
图8A是根据本申请的一些实施例所示的传递感兴趣二维区域的一个示例性流程图。流程800可以通过处理模块310中的传递单元420实现。流程800可以是流程600中603的一种示例性实现方式。
在801,可以获取第一数据层的第一感兴趣二维区域。在一些实施例中,801可以通过处理引擎122中的输入/输出模块340实现。在一些实施例中,801可以从存储模块330读取第一感兴趣二维区域相关的数据。在一些实施例中,用户界面280可以获取用户选择的需要传递的感兴趣二维区域。在一些实施例中,801可以通过绘制单元410绘制第一感兴趣二维区域。如图8B所示,图8B是根据本申请的一些实施例所示的感兴趣二维区域的一个示例性示意图。在图8B中,体数据820可以包括一个数据层821。所述数据层821中可以包括一个感兴趣二维区域821-1。需要注意的是,801获取的第一感兴 趣二维区域可以相同于或不同于图7A中的701获取的第一感兴趣二维区域。在一些实施例中,所述第一感兴趣二维区域可以是一个感兴趣二维区域,或多个感兴趣二维区域的集合。所述第一数据层仅为了描述方便,并不必须表示所述数据层位于体数据某个轴向的第一层。所述第一数据层可以位于XY平面、XZ平面、YZ平面、或与上述平面形成任意倾斜角度的平面。
在802,可以获取第一数据层的第一传递方向的第一传递深度。在一些实施例中,802可以通过处理引擎122中的输入/输出模块340实现。所述第一传递方向可以是垂直于所述第一数据层的正方向和/或负方向。所述第一数据层可以位于XY平面、XZ平面、YZ平面、或与上述平面形成任意倾斜角度的平面。在一些实施例中,相应地,所述第一传递方向可以是Z轴正或负方向、Y轴正或负方向、X轴正或负方向、或上述任意倾斜角度的平面的垂直轴的正或负方向。在一些实施例中,所述第一传递方向可以是与所述第一数据层所在平面形成任意倾斜角度的方向。在一些实施例中,输入/输出模块340可以通过用户界面280获取用户设置的一个传递方向和/或相应的传递深度。例如,用户可以通过用户界面280在显示体数据820的窗口中用鼠标在数据层821的某一侧进行点击以选中一个传递方向。又如,用户可以通过用户界面280输入一个数值、或拖动一个数值滑块以设置相应的传递深度。如图8C所示,图8C是根据本申请的一些实施例所示的传递后的感兴趣二维区域的一个示例性示意图。在图8C中,第一传递深度可以是一个垂直于数据层821的正方向的传递深度H1。
在803,可以获取第一数据层的第二传递方向的第二传递深度。在一些实施例中,803可以通过处理引擎122中的输入/输出模块340实现。所述第二传递方向可以是垂直于所述第一数据层的正方向和/或负方向。所述第二传递方向可以与第一传递方向相反。在一些实施例中,所述第二传递方向可以是与所述第一数据层所在平面形成任意倾斜角度的方向。如图8C所示,第二传递深度可以是一个垂直于数据层821的负方向的传递深度H2。在一些实施例中,输入/输出模块340可以通过用户界面280获取用户设置的另一传递方向和/或相应的传递深度。用户设置第二传递方向的第二传递深度可以与802类似。在一些实施例中,操作802和操作803可以合并为一个操作,例如,输入/输出模块340可以通过用户界面280获取用户设置的两个传递方向和/或相应的两个传递深度。在一些实施例中,操作802和操作803可以择一执行。例如,输入/输出模块340可以通过 用户界面280获取用户设置的一个传递方向和/或相应的一个传递深度。
在804,可以确定所述第一传递深度和第二传递深度对应的体数据中的一个或多个数据层。在一些实施例中,804可以通过处理模块310中的分析单元450实现。如图8C所示,第一传递深度H1对应的体数据中可以包括数据层831、数据层832。又如,第二传递深度H2对应的体数据中可以包括数据层841、数据层842和/或数据层843。在一些实施例中,804可以通过层间距确定传递深度对应的体数据中的一个或多个数据层。在一些实施例中,所述层间距可以是体数据自身的层间距(即,单位层间距),可能与成像系统110扫描的图像数据、或扫描过程中用户设置的层厚有关。在一些实施例中,所述层间距可以是图像处理系统100或用户自定义的层间距,例如,单位层间距被放大一个倍率得到的层间距(例如,1.5倍单位层间距、2倍单位层间距、3倍单位层间距等)。所述倍率可以是任意正实数。作为示例,不同体数据的层间距可以是不同的。例如,PET(Positron Emission Tomography)图像的层间距可以是1-10mm,而CT(Computed Tomography)图像的层间距可以是0.1-20mm或其它适宜的间距。在一些实施例中,同一体数据的不同轴向或传递方向的层间距可以是相同的,也可以是不同的。例如,X轴和/或Y轴的层间距可以是1.5mm,而Z轴的层间距可以是5mm。在一些实施例中,所述数据层可以是同一椎体内的不同数据层,或同一血管内的不同横截面数据层。
在805,可以确定第一感兴趣二维区域在第一数据层的二维位置。在一些实施例中,805可以通过处理模块310中的分析单元450实现。在一些实施例中,所确定的二维位置可以是第一感兴趣二维区域中的参考体素点在第一数据层的二维位置。作为示例,所述二维位置可以是第一感兴趣二维区域的参考体素点在第一数据层所在平面的二维坐标信息,例如,笛卡尔直角坐标系位置(x,y)。例如,图8C中感兴趣二维区域821-1在数据层821中的二维位置。在一些实施例中,所述第一感兴趣二维区域的二维位置可以包括所述第一感兴趣二维区域的像素集合中所有像素的二维位置。在一些实施例中,所述第一感兴趣二维区域的二维位置可以包括所述第一感兴趣二维区域的边界像素的二维位置。
在806,可以根据805确定的二维位置,在804确定的一个或多个数据层中生成一个或多个第四感兴趣二维区域。在一些实施例中,一个第四感兴趣二维区域可以包括一个或多个感兴趣二维区域的集合。在一些实施例中,806可以通过处理模块310中的传递单元420实现。如图8C所示,所述第四感兴趣二维区域可以包括在数据层831生成的感兴趣二维区域831-1,在数据层832生成的感兴趣二维区域832-1,在数据层841生成的感兴趣二维区域841-1,在数据层842生成的感兴趣二维区域842-1,和/或在数据层843生成的感兴趣二维区域843-1。在一些实施例中,所述第四感兴趣二维区域可以具有与第一感兴趣二维区域相同的大小和/或形状。需要注意的是,一个第四感兴趣二维区域、多个第四感兴趣二维区域中的每一个区域、或一个第四感兴趣二维区域中的部分区域,可以被图像处理系统100或用户单独提取、和/或分析等。
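上述根据传递深度与层间距确定目标数据层、再按二维位置复制感兴趣区域的过程,可以粗略示意为以下草图(层间距与传递深度的数值均为假设):

```python
import numpy as np

def propagate_roi(volume_shape, roi_mask, layer_index, depth_pos, depth_neg, spacing):
    """在同一体数据内沿正/负方向传递一个二维感兴趣区域(示意)。
    volume_shape: (层数, 高, 宽); roi_mask: 第一数据层上的二维布尔掩膜;
    depth_pos/depth_neg: 两个传递方向的传递深度; spacing: 层间距。"""
    n_layers = volume_shape[0]
    masks = np.zeros(volume_shape, dtype=bool)
    n_pos = int(depth_pos // spacing)   # 正方向覆盖的数据层数
    n_neg = int(depth_neg // spacing)   # 负方向覆盖的数据层数
    lo = max(0, layer_index - n_neg)
    hi = min(n_layers - 1, layer_index + n_pos)
    for z in range(lo, hi + 1):
        masks[z] = roi_mask             # 传递后的区域与原区域具有相同的二维位置和形状
    return masks

roi = np.zeros((4, 4), dtype=bool)
roi[1:3, 1:3] = True
masks = propagate_roi((8, 4, 4), roi, layer_index=3,
                      depth_pos=10.0, depth_neg=5.0, spacing=5.0)
layers_with_roi = int(masks.any(axis=(1, 2)).sum())  # 覆盖第 2 至第 5 层
```

此处简单地以“传递深度除以层间距”取整来确定层数;实际系统中层间距可按正文所述由扫描参数或用户自定义倍率决定。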
需要注意的是,以上对流程800的描述以及图8B、图8C的示意图只是示例性的,并不能把本申请限制在所列举的实施例范围之内。可以理解,对于本领域的技术人员来说,在了解流程800所执行的操作后,可能在实现上述功能的情况下,对各个操作进行任意组合,对流程的操作进行各种修正和改变。但这些修正和改变仍在以上描述的范围内。例如,在一些实施例中,图8B或图8C中传递的感兴趣二维区域可以将圆形区域替换为矩形区域,菱形区域和/或不规则区域等。又如,流程800中的第一感兴趣二维区域可以替换为感兴趣三维体积,即,可以在同一个体数据的不同范围的数据层间传递感兴趣三维体积。所述不同范围的数据层可以指三维空间中沿任意方向的不同范围的数据层。又如,805可以在802、803或804之前执行。诸如此类的变形,均在本申请的保护范围之内。
图9A是根据本申请的一些实施例所示的传递感兴趣区域(感兴趣二维区域/三维体积)的一个示例性流程图。在一些实施例中,流程900可以通过处理模块310中的传递单元420实现。流程900可以是流程600中603的另一种示例性实现方式。
在901,可以获取第一体数据的第一感兴趣二维区域/三维体积。在一些实施例中,901可以通过处理引擎122中的输入/输出模块340实现。在一些实施例中,901可以从存储模块330读取第一体数据和/或第一感兴趣二维区域/三维体积相关的数据。在一些实施例中,用户界面280可以获取用户在第一体数据中选择和/或绘制的第一感兴趣二维区域/三维体积。如图9B所示,图9B是根据本申请的一些实施例所示的传递感兴趣二维区域的一个示例性示意图。在图9B中,体数据920可以包括一个数据层921。所述 数据层921中可以包括第一感兴趣二维区域921-1。在一些实施例中,所述体数据可以包括但不限于门控数据、动态数据、功能图像、结构图像、原始图像、和/或算法分析结果数据等。需要注意的是,901获取的第一感兴趣二维区域/三维体积可以相同于或不同于图7A中的701获取的第一感兴趣二维区域/三维体积、和/或图8A中的801获取的第一感兴趣二维区域。在一些实施例中,所述第一感兴趣二维区域/三维体积可以是一个感兴趣二维区域/三维体积,或多个感兴趣二维区域/三维体积的集合。所述多个感兴趣二维区域/三维体积可以是在同一个数据层或多个不同数据层中的感兴趣二维区域/三维体积。在一些实施例中,所述第一体数据仅为了描述方便,并不必须表示所述体数据为第一次获取的体数据。
在一些实施例中,一个感兴趣三维体积可以与多个数据层相交,从而在所述多个数据层中的每一个数据层中可以形成一个感兴趣二维区域的轮廓。如图9C所示,图9C是根据本申请的一些实施例所示的传递感兴趣三维体积的一个示例性示意图。在图9C中,体数据960可以包括多个数据层,例如,数据层961,数据层962,和/或数据层963等。体数据960可以包括一个感兴趣三维体积960-1。所述感兴趣三维体积960-1与数据层961,数据层962,和数据层963相交,并在数据层961中形成感兴趣二维区域961-1的轮廓,在数据层962中形成感兴趣二维区域962-1的轮廓,在数据层963中形成感兴趣二维区域963-1的轮廓。所述感兴趣二维区域961-1、感兴趣二维区域962-1、感兴趣二维区域963-1可以具有相同或不同的形状和/或尺寸。所述感兴趣三维体积960-1可以具有各种三维形状,例如,正方体、长方体、球体、椭圆体、圆柱体、圆锥体、和/或任意形状的三维几何体等。
在902,可以获取一个或多个第二体数据。在一些实施例中,902可以通过处理引擎122中的输入/输出模块340实现。在一些实施例中,902可以从存储模块330读取第二体数据相关的数据。在一些实施例中,所述第二体数据可以用于和第一体数据的对比分析。例如,门控数据(第一体数据)和动态数据(第二体数据)不同时间点的对比分析。又如,功能图像(第一体数据)和结构图像(第二体数据)的对比分析。再如,原始图像(第一体数据)和算法分析结果数据(第二体数据)的对比分析等。如图9B所示,第二体数据可以包括体数据930、体数据940、和/或体数据950等。如图9C所示,第二体数据可以包括体数据970、体数据980、和/或体数据990等。在一些实施例中,上 述第二体数据可以具有与第一体数据相同的三维形状和/或尺寸。在一些实施例中,上述第二体数据可以具有与第一体数据不同的三维形状和/或尺寸。第二体数据中体素的数量可以多于、少于或等于第一体数据中体素的数量。在一些实施例中,第二体数据可以具有大于或等于所述第一感兴趣二维区域/三维体积的形状和/或尺寸。
在903,可以确定第一感兴趣二维区域/三维体积在第一体数据的三维位置。在一些实施例中,903可以通过处理模块310中的分析单元450实现。作为示例,所述三维位置信息可以包括三维笛卡尔直角坐标信息(x,y,z)。在一些实施例中,所述第一感兴趣二维区域/三维体积的三维位置可以包括所述第一感兴趣二维区域/三维体积的体素集合中一个或多个体素(例如,参考体素点)的三维位置。在一些实施例中,所述第一感兴趣二维区域/三维体积的三维位置可以包括所述第一感兴趣二维区域/三维体积的边界体素集合中一个或多个体素的三维位置。如图9B所示,可以确定感兴趣二维区域921-1在体数据920中的三维位置。如图9C所示,可以确定感兴趣三维体积960-1在体数据960中的三维位置。进一步地,如果确定了感兴趣三维体积960-1在体数据960中的三维位置,可能意味着,感兴趣二维区域961-1在数据层961中的三维位置被确定,感兴趣二维区域962-1在数据层962中的三维位置被确定,以及,感兴趣二维区域963-1在数据层963中的三维位置被确定。
在904,可以根据所述三维位置,在所述第二体数据生成一个或多个第五感兴趣二维区域/三维体积。在一些实施例中,一个第五感兴趣二维区域/三维体积可以包括一个或多个感兴趣二维区域/三维体积的集合。在一些实施例中,904可以通过处理模块310中的传递单元420实现。在一些实施例中,所述第五感兴趣二维区域/三维体积可以具有与第一感兴趣二维区域/三维体积相同的大小和/或形状。如图9B所示,根据感兴趣二维区域921-1在体数据920中的三维位置,所述生成的第五感兴趣区域可以包括在体数据930的数据层931生成的感兴趣二维区域931-1,在体数据940的数据层941生成的感兴趣二维区域941-1,和/或在体数据950的数据层951生成的感兴趣二维区域951-1。如图9C所示,根据感兴趣三维体积960-1在体数据960中的三维位置,所述生成的第五感兴趣三维体积可以包括在体数据970生成的感兴趣三维体积970-1,在体数据980生成的感兴趣三维体积980-1,和/或在体数据990生成的感兴趣三维体积990-1。所述感兴趣三维体积970-1可以包括体数据970中的数据层971中的感兴趣二维区域 971-1,数据层972中的感兴趣二维区域972-1,和/或数据层973中的感兴趣二维区域973-1等。所述感兴趣三维体积980-1可以包括体数据980中的数据层981中的感兴趣二维区域981-1,数据层982中的感兴趣二维区域982-1,和/或数据层983中的感兴趣二维区域983-1等。所述感兴趣三维体积990-1可以包括体数据990中的数据层991中的感兴趣二维区域991-1,数据层992中的感兴趣二维区域992-1,和/或数据层993中的感兴趣二维区域993-1等。需要注意的是,一个第五感兴趣二维区域/三维体积、多个第五感兴趣二维区域/三维体积中的每一个二维区域/三维体积、或一个第五感兴趣二维区域/三维体积中的部分二维区域/三维体积,可以被图像处理系统100或用户单独提取、和/或分析等。
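跨体数据的传递可以理解为:把第一体数据中感兴趣体素的三维坐标原样映射到第二体数据中;当第二体数据较小时,范围之外的坐标可以被丢弃。以下为一个示意草图(示例数据为假设):

```python
import numpy as np

def transfer_roi(src_mask, dst_shape):
    """按三维坐标把感兴趣三维体积从一个体数据传递到另一个体数据(示意)。"""
    dst_mask = np.zeros(dst_shape, dtype=bool)
    coords = np.argwhere(src_mask)                       # 感兴趣体素的 (z, y, x) 坐标
    inside = (coords < np.array(dst_shape)).all(axis=1)  # 丢弃目标体数据范围之外的坐标
    kept = coords[inside]
    dst_mask[kept[:, 0], kept[:, 1], kept[:, 2]] = True
    return dst_mask

src = np.zeros((4, 4, 4), dtype=bool)
src[1:3, 1:3, 1:3] = True                # 第一感兴趣三维体积:2x2x2 共 8 个体素
dst = transfer_roi(src, (4, 4, 4))       # 第二体数据与第一体数据形状相同
same_position = bool(np.array_equal(dst, src))
```

该草图假设两个体数据的体素坐标系一致;若两者分辨率或层间距不同,实际实现中通常还需要一次坐标换算或重采样,此处从略。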
需要注意的是,以上对流程900的描述以及图9B、图9C的示意图只是示例性的,并不能把本申请限制在所列举的实施例范围之内。可以理解,对于本领域的技术人员来说,在了解流程900所执行的操作后,可能在实现上述功能的情况下,对各个操作进行任意组合,对流程的操作进行各种修正和改变。但这些修正和改变仍在以上描述的范围内。例如,在一些实施例中,图9B中传递的感兴趣二维区域可以将圆形区域替换为矩形区域,菱形区域和/或不规则区域等。又如,在一些实施例中,图9C中传递的感兴趣三维体积可以将圆形体积替换为椭圆体、正方体、长方体、圆柱体、圆锥体、和/或任意形状的体积等。诸如此类的变形,均在本申请的保护范围之内。
图10A是根据本申请的一些实施例所示的合并感兴趣区域(感兴趣二维区域/三维体积)的一个示例性流程图。在一些实施例中,流程1000可以通过处理模块310中的合并单元440实现。流程1000可以是流程600中操作604的一个示例性实现方式。
在1001,可以获取第一感兴趣二维区域/三维体积。在一些实施例中,1001可以通过处理引擎122的输入/输出模块340实现。在一些实施例中,1001可以从存储模块330读取第一感兴趣二维区域/三维体积相关的数据。在一些实施例中,用户界面280可以获取用户确定的第一感兴趣二维区域/三维体积。作为示例,所述第一感兴趣二维区域/三维体积可以是一个连通域。在一些实施例中,所述第一感兴趣二维区域/三维体积可以是一个非连通域。如图10B所示,图10B是根据本申请的一些实施例所示的感兴趣区域(感兴趣二维区域/三维体积)的一个示例性示意图。在图10B中,1001可以获取感兴趣二维区域/三维体积1021。所述感兴趣二维区域/三维体积1021可以包括特征信息,例如统计信息A。
在1002,可以获取第二感兴趣二维区域/三维体积。在一些实施例中,1002可以通过处理引擎122的输入/输出模块340实现。在一些实施例中,1002可以从存储模块330读取第二感兴趣二维区域/三维体积相关的数据。在一些实施例中,第二感兴趣二维区域/三维体积和第一感兴趣二维区域/三维体积可以位于相同的数据层中。在一些实施例中,第二感兴趣二维区域/三维体积和第一感兴趣二维区域/三维体积可以位于同一个体数据的不同数据层中。在一些实施例中,第二感兴趣二维区域/三维体积和第一感兴趣二维区域/三维体积可以位于不同的体数据中。在一些实施例中,所述获取的第二感兴趣二维区域/三维体积可以用于与第一感兴趣二维区域/三维体积的合并分析。作为示例,所述第二感兴趣二维区域/三维体积可以与第一感兴趣二维区域/三维体积相关。例如,所述第一感兴趣二维区域/三维体积和第二感兴趣二维区域/三维体积可以是生物医学图像中需要合并统计的分散二维区域/三维体积,或医学图像中需要合并统计的连通或非连通区域。举例来说,所述连通或非连通区域可以是已经扩散的肿瘤,所述肿瘤图像的位置信息可以是不连续的。如图10B所示,1002可以获取感兴趣二维区域/三维体积1022、感兴趣二维区域/三维体积1023和/或感兴趣二维区域/三维体积1024等。所述感兴趣二维区域/三维体积1022可以包括特征信息,例如统计信息B。所述感兴趣二维区域/三维体积1023可以包括特征信息,例如统计信息C。所述感兴趣二维区域/三维体积1024可以包括特征信息,例如统计信息D。在一些实施例中,感兴趣二维区域/三维体积1022、感兴趣二维区域/三维体积1023、感兴趣二维区域/三维体积1024与感兴趣二维区域/三维体积1021可以位于同一个数据层、同一体数据的不同数据层、或不同的体数据中。
在1003,可以合并第一感兴趣二维区域/三维体积和第二感兴趣二维区域/三维体积。在一些实施例中,1003可以通过处理模块310中的合并单元440实现。所述合并第一感兴趣二维区域/三维体积和第二感兴趣二维区域/三维体积可以是合并第一感兴趣二维区域/三维体积和第二感兴趣二维区域/三维体积的数据。如图10C所示,图10C是根据本申请的一些实施例所示的合并后的感兴趣区域(感兴趣二维区域/三维体积)的一个示例性示意图。在图10C中,1003可以合并感兴趣二维区域/三维体积1031和感兴趣二维区域/三维体积1032。
在1004,可以判断是否合并其它感兴趣二维区域/三维体积。在一些实施例中,1004可以通过处理模块310中的分析单元450实现。在1004,判断的标准可以与用户的分析需求有关,即,用户是否需要将更多的感兴趣二维区域/三维体积进行合并分析。
若判断为是,流程1000可以返回至1002,以获取其它待合并的感兴趣二维区域/三维体积。在一些实施例中,如图10C所示,所述其它待合并的感兴趣二维区域/三维体积可以是感兴趣二维区域/三维体积1033,和/或感兴趣二维区域/三维体积1034。
若判断为否,流程1000可以进入1005,以分析合并后的感兴趣二维区域/三维体积的特征信息。在一些实施例中,1005可以通过处理模块310中的分析单元450实现。在一些实施例中,可以不执行操作1004,在1003之后直接执行1005。在一些实施例中,1005可以分析多个连通或非连通区域合并后的特征信息。作为示例,1005可以分析已经扩散的肿瘤合并后的整体特征信息。所述特征信息可以包括统计学特征。在一些实施例中,如图10C所示,所述合并后的感兴趣二维区域/三维体积的特征信息可以是合并的统计信息1040。在一些实施例中,操作1003和操作1005可以一起执行,例如,合并至少两个感兴趣二维区域/三维体积后可以直接分析合并后的特征信息。
需要注意的是,以上对流程1000的描述以及图10B、图10C的示意图只是示例性的,并不能把本申请限制在所列举的实施例范围之内。可以理解,对于本领域的技术人员来说,在了解流程1000所执行的操作后,可能在实现上述功能的情况下,对各个操作进行任意组合,对流程的操作进行各种修正和改变。但这些修正和改变仍在以上描述的范围内。在一些实施例中,流程1000的操作顺序可以互换。例如,流程1000可以先执行操作1004,再执行操作1003。在一些实施例中,一个感兴趣二维区域可以和另一个感兴趣二维区域合并,一个感兴趣二维区域可以和一个感兴趣三维体积合并,以及,一个感兴趣三维体积可以和另一个感兴趣三维体积合并。诸如此类的变形,均在本申请的保护范围之内。
本申请所描述的内容仅为部分实施例,并不能把本申请限制在所列举的实 施例范围之内。可以理解,对于本领域的技术人员来说,在了解本申请的原理之后,可能在实现上述功能的情况下,对本申请进行各种修正和改变。但这些修正和改变仍在以上描述的范围内。需要注意的是,本申请可以基于体数据的一个或多个初始感兴趣区域(感兴趣二维区域/三维体积),进行裁剪、传递、和/或合并等操作,以获得一个或多个目标感兴趣区域。在一些实施例中,可以基于所述目标感兴趣区域进行进一步地裁剪、传递、和/或合并等操作。
在一些实施例中,所述初始感兴趣区域可以是步骤601绘制的感兴趣二维区域/三维体积、步骤602裁剪后的感兴趣二维区域/三维体积、步骤603传递后的感兴趣二维区域/三维体积、步骤604合并后的感兴趣二维区域/三维体积、步骤701获取的第一感兴趣二维区域/三维体积、步骤703获取的第三感兴趣二维区域/三维体积、步骤801获取的第一感兴趣二维区域、步骤806生成的第四感兴趣二维区域、步骤901获取的第一感兴趣二维区域/三维体积、步骤904生成的第五感兴趣二维区域/三维体积、步骤1003合并后的感兴趣二维区域/三维体积等,或任何其他感兴趣区域。本申请不对初始感兴趣区域做任何限定。
在一些实施例中,所述初始感兴趣区域可以由其他感兴趣区域合并而来,则所述其他感兴趣区域可以称为合并源感兴趣区域。例如,如果将步骤1003合并后的感兴趣二维区域/三维体积作为初始感兴趣区域,那么步骤1001获取的第一感兴趣区域和/或步骤1002获取的第二感兴趣二维区域/三维体积可以称为合并源感兴趣区域。
在一些实施例中,所述初始感兴趣区域可以由其他感兴趣区域传递而来,则所述其他感兴趣区域可以称为传递源感兴趣区域。例如,如果将步骤806生成的第四感兴趣二维区域作为初始感兴趣区域,那么步骤801获取的第一感兴趣二维区域可以称为传递源感兴趣区域。又如,如果将步骤904生成的第五感兴趣二维区域/三维体积作为初始感兴趣区域,那么步骤901获取的第一感兴趣二维区域/三维体积可以称为传递源感兴趣区域。
在一些实施例中,可以从所述初始感兴趣区域或目标感兴趣区域中裁剪掉一个或多个感兴趣区域,所述被裁剪的一个或多个感兴趣区域可以称为待裁剪感兴趣区域。所述待裁剪感兴趣区域可以是步骤602中裁剪掉的感兴趣二维区域/三维体积、步骤703中裁剪掉的第二感兴趣二维区域/三维体积等,或任何其他感兴趣区域。
在一些实施例中,可以将所述初始感兴趣区域或目标感兴趣区域与其他感兴趣区域合并,所述用于合并的其他感兴趣区域可以称为待合并感兴趣区域。所述待合并感兴趣区域可以是步骤604中的待合并感兴趣二维区域/三维体积、步骤703获取的第三感兴趣二维区域/三维体积、步骤806生成的第四感兴趣二维区域、步骤904生成的第五感兴趣二维区域/三维体积、步骤1003合并后的感兴趣二维区域/三维体积等,或任何其他感兴趣区域。
以上概述了图像处理所需要的方法的不同方面和/或通过程序实现其他操作的方法。技术中的程序部分可以被认为是以可执行的代码和/或相关数据的形式而存在的“产品”或“制品”,是通过计算机可读的介质所参与或实现的。有形的、永久的储存介质包括任何计算机、处理器、或类似设备或相关的模块所用到的内存或存储器。例如各种半导体存储器、磁带驱动器、磁盘驱动器或者类似任何时间能够为软件提供存储功能的设备。
所有软件或其中的一部分有时可能会通过网络进行通信,如互联网或其他通信网络。此类通信能够将软件从一个计算机设备或处理器加载到另一个。例如:从图像处理系统的一个管理服务器或主机计算机加载至一个计算机环境的硬件平台,或其他实现系统的计算机环境,或与提供图像处理所需要的信息相关的类似功能的系统。因此,另一种能够传递软件元素的介质或被用作局部设备之间的物理连接,例如光波、电波、电磁波等,通过电缆、光缆或者空气实现传播。用来载波的物理介质如电缆、无线连接或光缆等类似设备,或被认为是承载软件的介质。在这里的用法除非限制了有形的“储存”介质,其他表示计算机或机器“可读介质”的术语都表示在处理器执行任何指令的过程中参与的介质。
因此,一个计算机可读媒体介质可能有多种形式,包括但不限于临时可读媒体介质和非临时可读媒体介质。非临时可读媒体介质可以是稳定的储存介质,包括:光盘或磁盘,以及其他计算机或类似设备中使用的,能够实现图中所描述的系统组件的存储系统。临时可读媒体介质可以是不稳定的存储介质,包括动态内存,例如计算机平台的主内存。计算机可读媒体介质可以包括有形的传输介质,载波传输介质或物理传输介质。有形的传输介质包括同轴电缆、铜电缆以及光纤,包括计算机系统内部形成总线的线路。载波传输介质可以传递电信号、电磁信号,声波信号或光波信号,这些信号可 以由无线电频率或红外数据通信的方法所产生的。通常的计算机可读介质包括硬盘、软盘、磁带、任何其他磁性介质;CD-ROM、DVD、DVD-ROM、任何其他光学介质;穿孔卡、任何其他包含小孔模式的物理存储介质;RAM、PROM、EPROM、FLASH-EPROM,任何其他存储器片或磁带;传输数据或指令的载波、电缆或传输载波的连接装置、任何其他可以利用计算机读取的程序代码和/或数据。这些计算机可读介质的形式中,会有很多种出现在处理器在执行指令、传递一个或更多结果的过程之中。
上文已对基本概念做了描述,显然,对于本领域技术人员来说,上述发明披露仅仅作为示例,而并不构成对本申请的限定。虽然此处并没有明确说明,本领域技术人员可能会对本申请进行各种修改、改进和修正。该类修改、改进和修正在本申请中被建议,所以该类修改、改进、修正仍属于本申请示范实施例的精神和范围。
同时,本申请使用了特定词语来描述本申请的实施例。如“一个实施例”、“一实施例”、和/或“一些实施例”意指与本申请至少一个实施例相关的某一特征、结构或特点。因此,应强调并注意的是,本说明书中在不同位置两次或多次提及的“一实施例”或“一个实施例”或“一替代性实施例”并不一定是指同一实施例。此外,本申请的一个或多个实施例中的某些特征、结构或特点可以进行适当的组合。
此外,本领域技术人员可以理解,本申请的各方面可以通过若干具有可专利性的种类或情况进行说明和描述,包括任何新的和有用的工序、机器、产品或物质的组合,或对他们的任何新的和有用的改进。相应地,本申请的各个方面可以完全由硬件执行、可以完全由软件(包括固件、常驻软件、微码等)执行、也可以由硬件和软件组合执行。以上硬件或软件均可被称为“数据块”、“模块”、“子模块”、“引擎”、“单元”、“子单元”、“组件”或“系统”。此外,本申请的各方面可能表现为位于一个或多个计算机可读介质中的计算机产品,该产品包括计算机可读程序编码。
计算机可读信号介质可能包含一个内含有计算机程序编码的传播数据信号,例如在基带上或作为载波的一部分。该传播信号可能有多种表现形式,包括电磁形式、光形式等等、或合适的组合形式。计算机可读信号介质可以是除计算机可读存储介质之外的任何计算机可读介质,该介质可以通过连接至一个指令执行系统、装置或设备以实现通讯、传播或传输供使用的程序。位于计算机可读信号介质上的程序编码可以通过任何合适的介质进行传播,包括无线电、电缆、光纤电缆、射频信号、或类似介质、或任 何上述介质的组合。
本申请各部分操作所需的计算机程序编码可以用任意一种或多种程序语言编写,包括面向对象编程语言如Java、Scala、Smalltalk、Eiffel、JADE、Emerald、C++、C#、VB.NET、Python等,常规程序化编程语言如C语言、Visual Basic、Fortran 2003、Perl、COBOL 2002、PHP、ABAP,动态编程语言如Python、Ruby和Groovy,或其他编程语言等。该程序编码可以完全在用户计算机上运行、或作为独立的软件包在用户计算机上运行、或部分在用户计算机上运行部分在远程计算机运行、或完全在远程计算机或服务器上运行。在后种情况下,远程计算机可以通过任何网络形式与用户计算机连接,比如局域网(LAN)或广域网(WAN),或连接至外部计算机(例如通过因特网),或在云计算环境中,或作为服务使用如软件即服务(SaaS)。
此外,除非权利要求中明确说明,本申请所述处理元素和序列的顺序、数字字母的使用、或其他名称的使用,并非用于限定本申请流程和方法的顺序。尽管上述披露中通过各种示例讨论了一些目前认为有用的发明实施例,但应当理解的是,该类细节仅起到说明的目的,附加的权利要求并不仅限于披露的实施例,相反,权利要求旨在覆盖所有符合本申请实施例实质和范围的修正和等价组合。例如,虽然以上所描述的系统组件可以通过硬件设备实现,但是也可以只通过软件的解决方案得以实现,如在现有的服务器或移动设备上安装所描述的系统。
同理,应当注意的是,为了简化本申请披露的表述,从而帮助对一个或多个发明实施例的理解,前文对本申请实施例的描述中,有时会将多种特征归并至一个实施例、附图或对其的描述中。但是,这种披露方法并不意味着本申请对象所需要的特征比权利要求中提及的特征多。实际上,实施例的特征要少于上述披露的单个实施例的全部特征。
一些实施例中使用了描述数量的数字,应当理解的是,此类用于实施例描述的数字,在一些示例中使用了修饰词“大约”、“近似”或“大体上”来修饰。除非另外说明,“大约”、“近似”或“大体上”表明所述数字允许有±20%的变化。相应地,在一些实施例中,说明书和权利要求中使用的数值参数均为近似值,该近似值根据个别实施例所需特点可以发生改变。在一些实施例中,数值参数应考虑规定的有效数位并采用一般位数保留的方法。尽管本申请一些实施例中用于确认其范围广度的数值域和参数为近似值, 在具体实施例中,此类数值的设定在可行范围内尽可能精确。
针对本申请引用的每个专利、专利申请、专利申请公开物和其他材料,如文章、书籍、说明书、出版物、文档等,特此将其全部内容并入本申请作为参考。与本申请内容不一致或产生冲突的申请历史文件除外,对本申请权利要求最广范围有限制的文件(当前或之后附加于本申请中的)也除外。需要说明的是,如果本申请附属材料中的描述、定义、和/或术语的使用与本申请所述内容有不一致或冲突的地方,以本申请的描述、定义和/或术语的使用为准。
最后,应当理解的是,本申请中所述实施例仅用以说明本申请实施例的原则。其他的变形也可能属于本申请的范围。因此,作为示例而非限制,本申请实施例的替代配置可视为与本申请的教导一致。相应地,本申请的实施例不仅限于本申请明确介绍和描述的实施例。

Claims (25)

  1. 一种图像处理方法,在至少一个机器上实施,每个机器包括至少一个处理器和存储器,所述方法包括:
    获取图像数据集,所述图像数据集包括第一体数据,所述第一体数据包括至少一个数据层,所述至少一个数据层包含至少一个体素;
    用所述至少一个处理器在所述第一体数据中确定目标感兴趣区域,所述目标感兴趣区域包括所述至少一个数据层中的至少一个体素,并且所述目标感兴趣区域包括至少一个感兴趣二维区域或至少一个感兴趣三维体积,所述确定所述目标感兴趣区域包括:
    绘制初始感兴趣区域,所述初始感兴趣区域在所述第一体数据中,并且所述初始感兴趣区域包括至少一个感兴趣二维区域或至少一个感兴趣三维体积;和
    裁剪所述初始感兴趣区域,以获得所述目标感兴趣区域。
  2. 权利要求1的方法,所述裁剪所述初始感兴趣区域以获得所述目标感兴趣区域包括:
    在所述第一体数据中绘制待裁剪感兴趣区域;和
    从所述初始感兴趣区域中去除所述待裁剪感兴趣区域与所述初始感兴趣区域的交集,以获得所述目标感兴趣区域。
  3. 权利要求1的方法,进一步包括:
    判断是否继续裁剪所述目标感兴趣区域;和
    若判断继续裁剪,在所述第一体数据中继续绘制需要裁剪的感兴趣区域。
  4. 权利要求1的方法,进一步包括传递所述目标感兴趣区域,所述传递所述目标感兴趣区域包括:
    确定第一位置信息,所述第一位置信息包括所述目标感兴趣区域中的一个体素在所述第一体数据中的三维坐标位置;
    获取第一传递方向的第一传递深度,所述第一传递方向与所述第一体数据中的一个第一数据层所在平面形成第一夹角;
    获取第二传递方向的第二传递深度,所述第二传递方向与所述第一体数据中的所述第一数据层所在平面形成第二夹角;
    确定所述第一传递深度和第二传递深度对应的范围中的体数据的至少一个第二数据层;和
    根据所述第一位置信息,在所述至少一个第二数据层生成传递后的感兴趣区域。
  5. 权利要求1的方法,进一步包括传递所述目标感兴趣区域,所述传递所述目标感兴趣区域包括:
    获取第二体数据;
    确定第二位置信息,所述第二位置信息包括所述目标感兴趣区域中的一个体素在所述第一体数据中的三维坐标位置;和
    根据所述第二位置信息,在所述第二体数据生成传递后的感兴趣区域。
  6. 权利要求1的方法,进一步包括:
    在所述第一体数据中获取待合并感兴趣区域;和
    合并所述目标感兴趣区域和所述待合并感兴趣区域。
  7. 权利要求1的方法,进一步包括:
    传递所述初始感兴趣区域。
  8. 权利要求1的方法,所述绘制初始感兴趣区域包括:
    在所述第一体数据中绘制传递源感兴趣区域;
    确定第三位置信息,所述第三位置信息包括所述传递源感兴趣区域中的一个体素在所述第一体数据中的三维坐标位置;和
    根据所述第三位置信息,在第一体数据中传递所述传递源感兴趣区域,以生成所述初始感兴趣区域。
  9. 权利要求1的方法,所述绘制初始感兴趣区域包括:
    获取第三体数据;
    在第三体数据中绘制传递源感兴趣区域;
    确定第四位置信息,所述第四位置信息包括所述传递源感兴趣区域中的一个体素在所述第三体数据中的三维坐标位置;和
    根据所述第四位置信息,传递所述传递源感兴趣区域至所述第一体数据,以生成所述初始感兴趣区域。
  10. 权利要求1的方法,所述绘制初始感兴趣区域包括:
    在所述第一体数据中绘制至少两个不同的合并源感兴趣区域;和
    合并所述至少两个不同的合并源感兴趣区域,以生成所述初始感兴趣区域。
  11. 权利要求1的方法,所述方法进一步包括分析所述目标感兴趣区域,所述分析所述目标感兴趣区域包括:
    分析所述目标感兴趣区域的特征信息,所述特征信息包括统计学特征信息,所述统计学特征信息是根据所述目标感兴趣区域中的多个体素统计分析得到的信息。
  12. 一种图像处理方法,在至少一个机器上实施,每个机器包括至少一个处理器和存储器,所述方法包括:
    获取图像数据集,所述图像数据集包括第一体数据,所述第一体数据包括至少一个数据层,所述至少一个数据层包含至少一个体素;
    获取所述第一体数据中的初始感兴趣区域,所述初始感兴趣区域包括至少一个感兴趣二维区域或至少一个感兴趣三维体积;
    用所述至少一个处理器确定所述初始感兴趣区域的位置信息;和
    根据所述位置信息,传递所述初始感兴趣区域。
  13. 权利要求12的方法,所述传递所述初始感兴趣区域包括:
    获取第一传递方向的第一传递深度,所述第一传递方向与所述第一体数据中的一个第一数据层所在平面形成第一夹角;
    获取第二传递方向的第二传递深度,所述第二传递方向与所述第一体数据中的所述第一数据层所在平面形成第二夹角;
    确定所述第一传递深度和所述第二传递深度对应的范围中的体数据中包含的至少一个第二数据层;
    确定第一位置信息,所述第一位置信息包括所述初始感兴趣区域中的一个体素在所述第一体数据中的三维坐标位置;和
    根据所述第一位置信息,在所述至少一个第二数据层生成传递后的目标感兴趣区域。
  14. 权利要求13的方法,进一步包括:
    在所述第一体数据中绘制待合并感兴趣区域;和
    合并所述目标感兴趣区域与所述待合并感兴趣区域。
  15. 权利要求12的方法,所述传递所述初始感兴趣区域包括:
    获取第二体数据;
    确定第二位置信息,所述第二位置信息包括所述初始感兴趣区域中的一个体素在所述第一体数据中的三维坐标位置;和
    根据所述第二位置信息,在所述第二体数据生成传递后的感兴趣区域。
  16. 权利要求15的方法,进一步包括:
    在所述第二体数据中绘制待合并感兴趣区域;和
    合并所述传递后的感兴趣区域与所述待合并感兴趣区域。
  17. 权利要求12的方法,进一步包括:
    在所述第一体数据中绘制待合并感兴趣区域;和
    合并所述初始感兴趣区域和所述待合并感兴趣区域。
  18. 权利要求12的方法,所述获取所述第一体数据中的初始感兴趣区域包括:
    在所述第一体数据中绘制至少两个不同的合并源感兴趣区域;和
    合并所述至少两个不同的合并源感兴趣区域,以获得所述初始感兴趣区域。
  19. 一种图像处理方法,在至少一个机器上实施,每个机器包括至少一个处理器和存储器,所述方法包括:
    获取图像数据集;
    在所述图像数据集中获取初始感兴趣区域,所述初始感兴趣区域包括至少一个感兴趣二维区域或至少一个感兴趣三维体积;
    在所述图像数据集中获取待合并感兴趣区域,所述待合并感兴趣区域包括至少一个感兴趣二维区域或至少一个感兴趣三维体积;
    用所述至少一个处理器合并所述初始感兴趣区域和所述待合并感兴趣区域;和
    分析合并后的感兴趣区域的特征信息。
  20. 一种记录有信息的非临时的机器可读媒体,当被至少一个机器执行时,所述信息使所述至少一个机器执行一个方法,所述方法包括:
    获取图像数据集,所述图像数据集包括第一体数据,所述第一体数据包括至少一个数据层,所述至少一个数据层包含至少一个体素;
    用至少一个处理器在所述第一体数据中确定目标感兴趣区域,所述目标感兴趣区域包括所述至少一个数据层中的至少一个体素,并且所述目标感兴趣区域包括至少一个感兴趣二维区域或至少一个感兴趣三维体积,所述确定所述目标感兴趣区域包括:
    绘制初始感兴趣区域,所述初始感兴趣区域在所述第一体数据中,并且所述初始感兴趣区域包括至少一个感兴趣二维区域或至少一个感兴趣三维体积;和
    裁剪所述初始感兴趣区域,以获得所述目标感兴趣区域。
  21. 一个系统,包括:
    至少一个处理器,以及
    存储器,用于存储指令,所述指令被所述至少一个处理器执行时,导致所述系统实现的操作包括:
    获取图像数据集,所述图像数据集包括第一体数据,所述第一体数据包括至少一个数据层,所述至少一个数据层包含至少一个体素;
    用所述至少一个处理器在所述第一体数据中确定目标感兴趣区域,所述目标感兴趣区域包括所述至少一个数据层中的至少一个体素,并且所述目标感兴趣区域包括至少一个感兴趣二维区域或至少一个感兴趣三维体积,所述确定所述目标感兴趣区域包括:
    绘制初始感兴趣区域,所述初始感兴趣区域在所述第一体数据中,并且所述初始感兴趣区域包括至少一个感兴趣二维区域或至少一个感兴趣三维体积;和
    裁剪所述初始感兴趣区域,以获得所述目标感兴趣区域。
  22. 一种记录有信息的非临时的机器可读媒体,当被至少一个机器执行时,所述信息使所述至少一个机器执行一个方法,所述方法包括:
    获取图像数据集,所述图像数据集包括第一体数据,所述第一体数据包括至少一个数据层,所述至少一个数据层包含至少一个体素;
    获取所述第一体数据中的初始感兴趣区域,所述初始感兴趣区域包括至少一个感兴趣二维区域或至少一个感兴趣三维体积;
    用至少一个处理器确定所述初始感兴趣区域的位置信息;和
    根据所述位置信息,传递所述初始感兴趣区域。
  23. 一个系统,包括:
    至少一个处理器,以及
    存储器,用于存储指令,所述指令被所述至少一个处理器执行时,导致所述系统实现的操作包括:
    获取图像数据集,所述图像数据集包括第一体数据,所述第一体数据包括至少一个数据层,所述至少一个数据层包含至少一个体素;
    获取所述第一体数据中的初始感兴趣区域,所述初始感兴趣区域包括至少一个感兴趣二维区域或至少一个感兴趣三维体积;
    用所述至少一个处理器确定所述初始感兴趣区域的位置信息;和
    根据所述位置信息,传递所述初始感兴趣区域。
  24. 一种记录有信息的非临时的机器可读媒体,当被至少一个机器执行时,所述信息使所述至少一个机器执行一个方法,所述方法包括:
    获取图像数据集;
    在所述图像数据集中获取初始感兴趣区域,所述初始感兴趣区域包括至少一个感兴趣二维区域或至少一个感兴趣三维体积;
    在所述图像数据集中获取待合并感兴趣区域,所述待合并感兴趣区域包括至少一个感兴趣二维区域或至少一个感兴趣三维体积;
    用至少一个处理器合并所述初始感兴趣区域和所述待合并感兴趣区域;和
    分析合并后的感兴趣区域的特征信息。
  25. 一个系统,包括:
    至少一个处理器,以及
    存储器,用于存储指令,所述指令被所述至少一个处理器执行时,导致所述系统实现的操作包括:
    获取图像数据集;
    在所述图像数据集中获取初始感兴趣区域,所述初始感兴趣区域包括至少一个感兴趣二维区域或至少一个感兴趣三维体积;
    在所述图像数据集中获取待合并感兴趣区域,所述待合并感兴趣区域包括至少一个感兴趣二维区域或至少一个感兴趣三维体积;
    用所述至少一个处理器合并所述初始感兴趣区域和所述待合并感兴趣区域;和
    分析合并后的感兴趣区域的特征信息。
PCT/CN2017/086539 2017-05-31 2017-05-31 图像处理方法及系统 WO2018218478A1 (zh)

Priority Applications (5)

Application Number Priority Date Filing Date Title
PCT/CN2017/086539 WO2018218478A1 (zh) 2017-05-31 2017-05-31 图像处理方法及系统
EP17911976.3A EP3627442A4 (en) 2017-05-31 2017-05-31 IMAGE PROCESSING METHOD AND SYSTEM
US15/854,705 US10824896B2 (en) 2017-05-31 2017-12-26 Method and system for image processing
US17/086,517 US11461990B2 (en) 2017-05-31 2020-11-02 Method and system for image processing
US17/821,481 US11798168B2 (en) 2017-05-31 2022-08-23 Method and system for image processing


Related Child Applications (1)

Application Number Title Priority Date Filing Date
US15/854,705 Continuation US10824896B2 (en) 2017-05-31 2017-12-26 Method and system for image processing

Publications (1)

Publication Number Publication Date
WO2018218478A1

Family

ID=64454205





Also Published As

Publication number Publication date
US11461990B2 (en) 2022-10-04
EP3627442A1 (en) 2020-03-25
US11798168B2 (en) 2023-10-24
US20180349724A1 (en) 2018-12-06
US10824896B2 (en) 2020-11-03
US20210142089A1 (en) 2021-05-13
US20220415005A1 (en) 2022-12-29
EP3627442A4 (en) 2020-05-20


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17911976

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2017911976

Country of ref document: EP

Effective date: 20191206