WO2018161257A1 - Method and system for generating a color medical image - Google Patents
Method and system for generating a color medical image
- Publication number
- WO2018161257A1 (PCT/CN2017/075892)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- color table
- combined color table
- tissue
- Prior art date
Classifications
- G06T11/60—Editing figures and text; Combining figures or text
- A61B6/00—Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device
- G06F3/14—Digital output to display device; Cooperation and interconnection of the display device with other functional units
- G06T11/00—2D [Two Dimensional] image generation
- G06T11/001—Texturing; Colouring; Generation of texture or colour
- G06T7/0012—Biomedical image inspection
- G06T7/11—Region-based segmentation
- G09G5/06—Colour display control using colour palettes, e.g. look-up tables
- A61B6/032—Transmission computed tomography [CT]
- A61B6/037—Emission tomography
- A61B6/463—Displaying multiple images or images and diagnostic data on one display
- A61B6/5217—Extracting a diagnostic or physiological parameter from medical diagnostic data
- A61B8/463—Displaying multiple images or images and diagnostic data on one display (ultrasound)
- A61B8/5223—Extracting a diagnostic or physiological parameter from ultrasonic diagnostic data
- G06T2207/10024—Color image
- G06T2207/10028—Range image; Depth image; 3D point clouds
- G06T2207/10081—Computed x-ray tomography [CT]
- G06T2207/10088—Magnetic resonance imaging [MRI]
- G06T2207/10104—Positron emission tomography [PET]
- G06T2207/10108—Single photon emission computed tomography [SPECT]
- G06T2207/10132—Ultrasound image
- G06T2207/30024—Cell structures in vitro; Tissue sections in vitro
- G06T2211/404—Angiography
- G06T2211/412—Dynamic
- G09G2380/08—Biomedical applications
Description
- the present application relates to a medical image generation method and system, and more particularly to a method and system for generating a color medical image from a combined color table.
- the color display technology commonly used for medical images is the color table (pseudo-color table), that is, different gray levels are expressed by different colors and transparencies, thereby achieving the purpose of displaying medical images in color.
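- As an illustrative sketch (not part of the original disclosure), a pseudo-color table can be represented as a 256-entry lookup table that maps each gray level to an RGBA value; the table contents and the sample image below are hypothetical.

```python
import numpy as np

# Hypothetical pseudo-color table: 256 gray levels -> (R, G, B, A), values in [0, 1].
# Low gray levels fade to transparent blue, high gray levels to opaque red.
gray_levels = np.arange(256)
pseudo_color_table = np.stack([
    gray_levels / 255.0,              # R rises with gray level
    np.zeros(256),                    # G fixed at 0
    1.0 - gray_levels / 255.0,        # B falls with gray level
    gray_levels / 255.0,              # A (opacity): brighter pixels are more opaque
], axis=1)

def apply_pseudo_color(gray_image: np.ndarray) -> np.ndarray:
    """Map an 8-bit grayscale image (H, W) to an RGBA image (H, W, 4)."""
    return pseudo_color_table[gray_image]

# Usage with a hypothetical 4x4 gray image.
gray = np.random.randint(0, 256, size=(4, 4), dtype=np.uint8)
rgba = apply_pseudo_color(gray)
print(rgba.shape)  # (4, 4, 4)
```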
- a medical image often contains a variety of tissues, and medical applications need to display multiple tissues at the same time and distinguish them. This requires the use of multiple color tables to configure different tissue regions on the basis of image segmentation, so as to achieve the purpose of displaying multiple tissues.
- the method for generating a color medical image may include: acquiring medical image data, dividing the tissue in the image, selecting a combined color table from a combined color table library according to the divided tissue, and drawing a color medical image containing the divided tissue according to the selected combined color table.
- the combined color table can be data containing color schemes for the divided tissues.
- the non-transitory computer readable medium can include executable instructions.
- the instructions, when executed by at least one processor, can cause the at least one processor to implement a method.
- the system can include at least one processor and executable instructions.
- the instructions, when executed by at least one processor, cause the at least one processor to implement the method of generating a color medical image.
- the system can include at least one processor and a memory for storing instructions that, when executed by the at least one processor, can cause the system to implement the method of generating a color medical image.
- the method of generating a color medical image may further comprise presenting the rendered color medical image comprising the segmented tissue to a user.
- selecting a combined color table from a combined color table library according to the divided tissue may include: determining a category of the tissue, retrieving combined color tables related to the determined type of tissue, presenting the related combined color tables to the user, obtaining an instruction to select a combined color table, and determining a combined color table.
- the method for generating a color medical image may further include a method of generating a combined color table in a color table library. The combined color table generating method may include: determining a number of tissues, determining a color table for the first tissue, determining a new color table based on the determined color table until the color tables of all tissues are determined, and combining all the color tables into one combined color table.
- the method for generating a color medical image may further include a method of modifying a combined color table in a color table library.
- the combined color table modification method may include: acquiring a combined color table, determining the data to be modified in the combined color table, modifying the data to be modified to determine a new combined color table, and storing the new combined color table in the combined color table library.
- the medical image data may be at least one of the following, or a combination thereof: a two-dimensional image, a three-dimensional spatial model, and a dynamic image.
- dividing the tissue in the image may include dividing different tissues in the medical image and different portions of the same tissue.
- the tissue color scheme may include a number, a name, a label, a color, a transparency, and whether the tissue is hidden.
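- A possible in-memory representation of such a tissue color scheme and a combined color table is sketched below; the class and field names are illustrative assumptions, not taken from the application.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class TissueColorScheme:
    """One tissue entry of a combined color table (illustrative field names)."""
    number: int                    # tissue number
    name: str                      # tissue name, e.g. "bone"
    label: int                     # segmentation label of the tissue
    color: Tuple[int, int, int]    # RGB color assigned to the tissue
    transparency: float            # 0.0 (opaque) .. 1.0 (fully transparent)
    hidden: bool = False           # whether the tissue is hidden in the rendered image

@dataclass
class CombinedColorTable:
    """A combined color table: a collection of tissue color schemes."""
    name: str
    tissues: List[TissueColorScheme] = field(default_factory=list)

# Usage: a hypothetical combined color table for bone and skin.
table = CombinedColorTable(name="head_bone_skin", tissues=[
    TissueColorScheme(1, "bone", 1, (255, 255, 230), 0.0),
    TissueColorScheme(2, "skin", 2, (255, 200, 180), 0.6),
])
```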
- the determining the type of tissue may be determining category characteristics of the tissue, wherein the category characteristics of the tissue may include the kind of tissue, the number of different tissues, and the morphology of each tissue.
- the instruction for selecting a combined color table may be sent by a user through an interaction device, wherein the search results may be displayed on the interaction device, and the manner of display may include a list, a schematic diagram, and a rendering of a color medical image drawn using a combined color table.
- the combined color table may include: a plurality of tissues representing partial regions or subsets of the medical image data, and a color table corresponding to each tissue, the color table including the window width and window level of the tissue and the color effect corresponding to the gray values determined by the window width and window level.
- the searching for combined color tables related to the determined type of tissue may be a hierarchical search according to the category characteristics of the tissue.
- the combined color table may further include display or hide options for the respective tissues.
- the combined color table may further include a transparency option for the respective tissues.
- FIG. 1 is a schematic diagram of a color medical image generation system shown in accordance with some embodiments of the present application.
- FIG. 2 is a block diagram showing the structure of a computing device according to some embodiments of the present application.
- FIG. 3 is a schematic diagram of a color medical image generation system shown in accordance with some embodiments of the present application.
- FIG. 4 is a schematic diagram of a combined color table selection unit shown in accordance with some embodiments of the present application.
- FIG. 5 is an exemplary flow diagram of generating a color medical image, shown in some embodiments of the present application.
- FIG. 6 is an exemplary flow diagram of selecting a combined color table from a library of combined color tables, in accordance with some embodiments of the present application.
- FIG. 7 is an exemplary flow diagram of constructing a combined color table library, shown in accordance with some embodiments of the present application.
- FIG. 8 is an exemplary flow diagram of modifying a combined color table, shown in accordance with some embodiments of the present application.
- FIGs. 10A-10D are examples of generating color medical images using different combined color tables, shown in accordance with some embodiments of the present application.
- FIG. 11 is an example of a user interface for modifying a combined color table, shown in accordance with some embodiments of the present application.
- for the modules in a data processing system in accordance with embodiments of the present application, any number of different modules can be used and run on a client connected to the system over a network and/or on a server.
- the modules are merely illustrative, and different aspects of the systems and methods may use different modules.
- the imaging system can include one or more modalities.
- the modality may include digital subtraction angiography (DSA), magnetic resonance imaging (MRI), magnetic resonance angiography (MRA), computed tomography (CT), computed tomography angiography (CTA), ultrasound scanning (US), positron emission tomography (PET), single photon emission computed tomography (SPECT), SPECT-MR, CT-PET, CE-SPECT, DSA-MR, PET-MR, PET-US, SPECT-US, TMS-MR, US-CT, US-MR, X-ray-CT, X-ray-PET, X-ray-US, video-CT, video-US, or a combination of one or more thereof.
- the target of the imaging scan can be a combination of one or more of an organ, a body, an object, a lesion, a tumor, and the like. In some embodiments, the target of the imaging scan can be a combination of one or more of the head, chest, abdomen, organs, bones, blood vessels, and the like. In some embodiments, the target of the scan can be vascular tissue at one or more locations.
- the image can be a two-dimensional image and/or a three-dimensional image. In a two-dimensional image, the finest resolvable elements can be pixels. In a three-dimensional image, the finest resolvable elements can be voxels. In a three-dimensional image, the image can be composed of a series of two-dimensional slices or two-dimensional layers.
- the tissue partitioning process can be based on the corresponding features of the pixel points (or voxel points) of the image.
- the respective features of the pixel points (or voxel points) may include a combination of one or more of texture, grayscale, average grayscale, signal strength, color saturation, contrast, brightness, and the like.
- the spatial location features of the pixel points (or voxel points) may also be used in an image segmentation process.
- the color medical image generation system 100 can include a data collection device 110, a processing device 120, a storage device 130, and an interaction device 140.
- the data collection device 110, the processing device 120, the storage device 130, and the interaction device 140 can communicate with each other over the network 180.
- the data collection device 110 can be a device that collects data.
- the data may include image data, object feature data, and the like.
- the data collection device 110 can include an imaging device.
- the imaging device can acquire the image data.
- the imaging device may be a combination of one or more of a magnetic resonance imaging (MRI) device, a computed tomography (CT) device, a positron emission computed tomography (PET) device, a B-mode ultrasound (B-scan ultrasonography) device, a diasonograph, a thermal texture maps (TTM) device, a medical electronic endoscope (MEE), and the like.
- the image data may be a picture or data comprising blood vessels, tissues or organs of the subject.
- the object feature collection device can be integrated in the imaging device to simultaneously acquire image data and object feature data.
- the data collection device 110 can transmit its collected data to the processing device 120, the storage device 130, and/or the interaction device 140, etc., via the network 180.
- Processing device 120 can process the data.
- the data may be data collected by the data collection device 110, data read from the storage device 130, feedback data obtained from the interaction device 140 (such as user input data), data obtained from the cloud or an external device through the network 180, and the like.
- the data can include image data, object feature data, user input data, and the like.
- the processing can include selecting an area of interest in the image data.
- the area of interest may be selected by the processing device 120 itself or selected based on user input data.
- the selected region of interest can be a blood vessel, tissue or organ, and the like.
- the region of interest may be a head region such as a blood vessel, a bone, a skin, or the like.
- Processing device 120 may further segment the region of interest in the image.
- the method of image segmentation may include edge-based image segmentation methods, such as the Prewitt operator method, the Sobel operator method, the gradient operator method, and the Kirsch operator method; region-based image segmentation methods, such as the region growing method, the threshold method, and clustering methods; and other segmentation methods, such as methods based on fuzzy sets or neural networks.
- processing device 120 can perform division of different tissues. For example, the gray image of the head region is divided into head blood vessels, head capillaries, head bone, head skin, and the like.
- tissue division can be based on user identification or interactive division, or based on algorithms that automatically divide tissues, such as region growing algorithms, grayscale-based algorithms, level sets, neural networks, clustering, graph cuts, deformable models, atlas-based methods, and the like.
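- As an illustrative sketch of one of the automatic division algorithms named above (region growing on a 2D gray image), the code below grows a region from a hypothetical seed point with a hypothetical gray-value tolerance; it is not the application's segmentation method.

```python
import numpy as np
from collections import deque

def region_grow(gray: np.ndarray, seed, tolerance=50):
    """Grow a region of pixels whose gray value is within `tolerance` of the seed value."""
    mask = np.zeros(gray.shape, dtype=bool)
    seed_value = float(gray[seed])
    queue = deque([seed])
    while queue:
        y, x = queue.popleft()
        if mask[y, x] or abs(float(gray[y, x]) - seed_value) > tolerance:
            continue
        mask[y, x] = True
        for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):   # 4-connected neighbours
            ny, nx = y + dy, x + dx
            if 0 <= ny < gray.shape[0] and 0 <= nx < gray.shape[1] and not mask[ny, nx]:
                queue.append((ny, nx))
    return mask

# Usage with a hypothetical 3x3 gray image and seed in the dark region.
gray = np.array([[10, 12, 200], [11, 13, 210], [9, 14, 205]])
print(region_grow(gray, seed=(0, 0), tolerance=20))
```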
- processing device 120 can convert the grayscale image to a color medical image in accordance with a combined color table.
- the combined color table may include: a plurality of tissues representing partial regions or subsets of the medical image data; a color table corresponding to each tissue, which may include the window width and window level of the tissue and the color effect corresponding to the gray values determined by the window width and window level; display or hide options for the respective tissues; and transparency options for the respective tissues.
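- A minimal sketch of the window width/window level mapping described above, assuming a simple linear window and a constant per-tissue color; the parameter values and function names are hypothetical.

```python
import numpy as np

def window_to_unit(gray: np.ndarray, window_width: float, window_level: float) -> np.ndarray:
    """Map raw gray values (e.g. CT numbers) into [0, 1] using a window width/level."""
    low = window_level - window_width / 2.0
    normalized = (gray - low) / window_width
    return np.clip(normalized, 0.0, 1.0)

def colorize_tissue(gray, window_width, window_level, color, transparency):
    """Assign a constant tissue color whose opacity follows the windowed gray value."""
    t = window_to_unit(np.asarray(gray, dtype=float), window_width, window_level)
    rgba = np.zeros(t.shape + (4,))
    rgba[..., :3] = np.asarray(color) / 255.0        # constant tissue color
    rgba[..., 3] = t * (1.0 - transparency)          # opacity scaled by windowed gray value
    return rgba

# Usage with a hypothetical bone window (width 1000 HU, level 400 HU).
ct_values = np.array([[-100, 300], [700, 1200]], dtype=float)
bone_rgba = colorize_tissue(ct_values, 1000.0, 400.0, color=(255, 255, 230), transparency=0.0)
print(bone_rgba.shape)  # (2, 2, 4)
```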
- the results of applying different color schemes to the grayscale image of the head may include a combination of one or more of a head blood vessel color medical image 150, a head capillary color medical image 160, a head blood vessel and skull color medical image 170, a head capillary and skull color medical image 180, and the like.
- processing device 120 can edit the tissue management options corresponding to the combined color table, such as tissue display controls, colors of individual tissues, color table controls, transparency, lighting parameters, and the like. In some embodiments, processing device 120 may modify the transparency parameters of a tissue in a selected combined color table. In some embodiments, processing device 120 may modify the color parameters of certain tissues of a selected combined color table.
- processing device 120 may perform noise reduction or smoothing on the data or processing results obtained therefrom.
- processing device 120 may send the data or processing results it obtains to storage device 130 for storage, or to interactive device 140 for display.
- the processing result may be an intermediate result produced during the processing, such as the result of tissue division, or may be the final result of the processing, such as the final obtained color medical image.
- processing device 120 may be one or more processing elements or devices, such as a central processing unit (CPU), a graphics processing unit (GPU), a digital signal processor (DSP), a system on a chip (SoC), a microcontroller (MCU), etc.
- processing device 120 may also be a specially designed processing element or device having special functions. Processing device 120 may be local or remote relative to data collection device 110.
- the storage device 130 can store data or information.
- the data or information may include data acquired by the data collection device 110, processing results or control instructions generated by the processing device 120, user input data received by the interaction device 140, and the like.
- the storage device 130 can be one or more storage media that can be read or written, including static random access memory (SRAM), random-access memory (RAM), read-only memory (ROM), hard disk, flash memory, etc.
- storage device 130 can also be a remote storage such as a cloud disk or the like.
- the interaction device 140 can receive, transmit, and/or display data or information.
- the received data or information may include data acquired by the data collection device 110, processing results generated by the processing device 120, data stored by the storage device 130, and the like.
- the data or information displayed by the interaction device 140 may include an actual image of the cephalic blood vessel obtained by the data acquisition device 110, a color medical image generated by the processing device 120 according to a color scheme, and the like.
- display forms can include one or more combinations of two-dimensional color medical images, three-dimensional color medical images, color geometric models and their mesh analysis, vector graphics (such as velocity vector lines), contour maps, filled contour maps (cloud maps), XY scatter plots, particle trajectories, and simulated flow effects.
- the data or information transmitted by the interaction device 140 can include input information of the user.
- the interaction device 140 can receive one or more operational parameters entered by the user and send to the processing device 120.
- the interaction device 140 can include a user interaction interface.
- the user can input user input data to the interaction device 140 via a particular input device, such as a mouse, keyboard, touch pad, microphone, or the like.
- the interaction device 140 can be a display device or the like having a display function. In some embodiments, the interaction device 140 can have the functionality of some or all of the processing device 120. For example, the interaction device 140 can perform operations such as smoothing, noise reduction, color change, and the like on the results generated by the processing device 120. For example, the color change operation can change a grayscale image into a color image or turn a color image into a grayscale image.
- the interaction device 140 and the processing device 120 can be an integrated device. The integrated device can implement the functions of the processing device 120 and the interaction device 140 at the same time.
- interactive device 140 can include a desktop computer, a server, a mobile device, and the like.
- mobile devices may include laptops, tablets, iPads, built-in devices of vehicles (e.g., motor vehicles, boats, airplanes), wearable devices, and the like.
- interactive device 140 can include or be connected to a display device, printer, fax, or the like.
- the network 180 can be used for communication within the color medical image generation system 100, receiving information external to the system, transmitting information to the outside of the system, and the like.
- the data collection device 110, the processing device 120, and the interaction device 140 can access the network 180 by way of a wired connection, a wireless connection, or a combination thereof.
- Network 180 can be a single network or a combination of multiple networks.
- network 180 may include, but is not limited to, a combination of one or more of a local area network, a wide area network, a public network, a private network, a wireless local area network, a virtual network, a metropolitan area network, a public switched telephone network, and the like.
- network 180 may include a variety of network access points, such as wired or wireless access points, base stations, or network switching points, through which the data sources connect to network 180 and transmit information over the network.
- FIG. 2 is a block diagram of a computing device 200, shown in accordance with some embodiments of the present application.
- the computing device 200 can implement the particular system disclosed in this application.
- the particular system in this embodiment utilizes a functional block diagram to explain a hardware platform that includes a user interface.
- computing device 200 can implement one or more components, modules, units, or sub-units (e.g., processing device 120, interaction device 140, etc.) in the current color medical image generation system 100. Additionally, one or more components, modules, units, or sub-units (e.g., processing device 120, interaction device 140, etc.) in color medical image generation system 100 can be implemented by computing device 200 through its hardware devices, software programs, firmware, or a combination thereof.
- Such a computer can be a general purpose computer or a computer with a specific purpose.
- Both computers can be used to implement the particular system in this embodiment.
- only one computing device is drawn in FIG. 2, but the related computer functions described in this embodiment for information processing and pushing information can be implemented in a distributed manner by a similar set of platforms.
- computing device 200 can include an internal communication bus 210, a processor 220, a read only memory (ROM) 230, a random access memory (RAM) 240, a communication port 250, an input/output component 260, a hard disk 270, and a user interface 280.
- the internal communication bus 210 can enable data communication between components of the computing device 200.
- the processor 220 can execute program instructions to perform one or more of the functions, components, modules, units, subunits of the color medical image generation system 100 described in this disclosure.
- Processor 220 is comprised of one or more processors.
- Communication port 250 can be configured to enable data communication (e.g., via network 180) between computing device 200 and other components of color medical image generation system 100, such as data collection device 110.
- the computing device 200 can also include different forms of program storage units and data storage units, such as a hard disk 270, read only memory (ROM) 230, and random access memory (RAM) 240, storing various data files used for computer processing and/or communication, as well as possible program instructions executed by processor 220.
- input/output component 260 supports input/output data streams between computing device 200 and other components, such as user interface 280, and/or with other components of color medical image generation system 100, such as the interaction device 140.
- Computing device 200 can also transmit and receive information and data from network 180 via communication port 250.
- the color medical image generation system 300 can include a receiving module 310, a processing module 320, a storage module 330, and an output module 340.
- the processing module 320 may further include a tissue dividing unit 321, a combined color table selection unit 322, and an image generation unit 323.
- the modules shown can be connected directly (and/or indirectly) to each other.
- the receiving module 310 can receive a medical image.
- the medical image may include an XR image, a CT image, an MR image, an ultrasound image, and the like.
- the medical image may reflect information about a part of the body or a part of the animal or plant.
- the medical image may be one or a set of two-dimensional images, for example, black-and-white X-ray film, CT scan images of different slices, and the like.
- the two-dimensional medical image can be composed of several pixels (Pixel).
- the medical image may be a three-dimensional spatial model, such as a three-dimensional model of an organ reconstructed from CT scan images of different slices, or a three-dimensional spatial model output by a device having three-dimensional imaging capabilities.
- the three-dimensional medical image can be composed of several voxels.
- the medical image may also be a dynamic image over a period of time. For example, a video that reflects changes in the heart and its surrounding tissues during a cardiac cycle.
- the medical image may be from the imaging device 110, may be from the storage module 330, or may be from an input of the user through the interactive device.
- the pixels or voxels that make up the medical image are black and white pixels or voxels.
- the gray value of the pixel or voxel may be correspondingly changed according to different states of the imaged tissue organ to present a black and white tissue organ image.
- the pixels or voxels and their corresponding gray values may be stored in the storage module 330 in the form of a list.
- a pixel or voxel belonging to a certain tissue or organ may correspond to a certain range of gray values.
- bone, muscle and skin tissue are present on a medical image.
- each tissue can correspond to a certain range of gray values. For example, the gray value of bone is the highest, the gray value of muscle is lower, and the gray value of skin is the lowest.
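- As an illustrative sketch of assigning pixels to tissue classes by gray-value range, the thresholds below are hypothetical and would differ per modality and dataset.

```python
import numpy as np

# Hypothetical gray-value thresholds (roughly in the spirit of CT numbers).
SKIN_MAX = 0      # skin/soft surface: lowest range
MUSCLE_MAX = 300  # muscle: intermediate range; anything above counts as bone

def classify_tissues(gray: np.ndarray) -> np.ndarray:
    """Return a label image: 1 = skin, 2 = muscle, 3 = bone."""
    labels = np.zeros(gray.shape, dtype=np.uint8)
    labels[gray <= SKIN_MAX] = 1
    labels[(gray > SKIN_MAX) & (gray <= MUSCLE_MAX)] = 2
    labels[gray > MUSCLE_MAX] = 3
    return labels

labels = classify_tissues(np.array([[-200, 100], [500, 900]]))
print(labels)  # [[1 2] [3 3]]
```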
- the processing module 320 can process the medical image received by the receiving module 310.
- the processing may include dividing and determining different portions of different tissues or the same tissue in the medical image.
- the processing may be performed by the tissue dividing unit 321, the combined color table selecting unit 322, and the image generating unit 323, respectively.
- the tissue dividing unit 321 can divide different tissues in the received medical images or different portions of the same tissue.
- the division may be automatic division, semi-automatic division or manual division.
- the different tissues may be different organs included in the medical image, such as bones, skin, organs, and the like.
- the different portions of the same tissue may be portions of a certain tissue organ that differ in position, such as the left and right lungs of a person.
- the dividing may be to classify pixels or voxels corresponding to different tissues. For example, in a medical image containing bone, muscle, and skin, the pixels or voxels corresponding to the bone portion can be classified into the first category, the pixels or voxels corresponding to the muscle portion can be classified into the second category, and the pixels or voxels corresponding to the skin portion can be classified into the third category.
- the classification information of the pixels or voxels may be stored in the storage module 330.
- the combined color table selection unit 322 can determine a combined color table that is used to assist in generating color medical images.
- the combined color table can be composed of a series of color tables.
- the color table may be a pseudo-color table of the kind currently applied in the color display technology commonly used in medical imaging.
- the color table may correspond to different colors and/or transparencies for different gray levels or gray-level intervals. For example, a higher gray value may correspond to a longer-wavelength color, or a certain gray-level interval may correspond to one color.
- the color table may also correspond to a certain type of pixel or voxel. For example, pixels or voxels of the same class may correspond to the same color.
- the content of the color table may include information such as the number, name, label, color, transparency, and hidden flag of a certain type of tissue. The hidden flag can be a yes-or-no value. For example, when the hidden flag is YES, that class of tissue can be hidden in the finally generated color medical image.
- parsing and storage of the combined color table may be implemented by eXtensible Markup Language (XML) related techniques.
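- The application only states that XML-related techniques may be used; as a sketch under that assumption, the element and attribute names below are hypothetical.

```python
import xml.etree.ElementTree as ET

def combined_color_table_to_xml(name, tissues):
    """Serialize a combined color table to an XML string (hypothetical schema)."""
    root = ET.Element("CombinedColorTable", attrib={"name": name})
    for t in tissues:
        ET.SubElement(root, "Tissue", attrib={
            "number": str(t["number"]),
            "name": t["name"],
            "label": str(t["label"]),
            "color": ",".join(str(c) for c in t["color"]),
            "transparency": str(t["transparency"]),
            "hidden": "yes" if t["hidden"] else "no",
        })
    return ET.tostring(root, encoding="unicode")

def xml_to_combined_color_table(xml_text):
    """Parse the XML string back into a list of tissue dictionaries."""
    root = ET.fromstring(xml_text)
    return [{
        "number": int(e.get("number")),
        "name": e.get("name"),
        "label": int(e.get("label")),
        "color": tuple(int(c) for c in e.get("color").split(",")),
        "transparency": float(e.get("transparency")),
        "hidden": e.get("hidden") == "yes",
    } for e in root.findall("Tissue")]

# Round-trip usage with a hypothetical single-tissue table.
xml_text = combined_color_table_to_xml("head_bone_skin", [
    {"number": 1, "name": "bone", "label": 1, "color": (255, 255, 230),
     "transparency": 0.0, "hidden": False},
])
print(xml_to_combined_color_table(xml_text))
```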
- the combined color table can be a collection of at least two color tables described above.
- a combined color table may include color tables for the two tissue types of bone and skin; in a color medical image drawn according to the combined color table, the bone portion is drawn according to the bone-type information in the combined color table, and the skin portion is drawn according to the skin-type information in the combined color table.
- the information of the different kinds of tissues in the combined color table may be related to each other. For example, in a combined color table containing the two tissue types of bone and skin, the color in the bone-type information may be significantly different from the color in the skin-type information, so that the two can be distinguished by the human eye in the generated color medical image.
- when generating a three-dimensional color medical image, a certain tissue A may be completely surrounded by tissue B. In the corresponding combined color table, the user can then choose to hide the information of tissue B so as to display tissue A in the three-dimensional color medical image when tissue A needs to be observed.
- the combined color table can be stored in a combined color table library.
- the parsing and storage of the combined color table can be achieved by XML-related techniques.
- the user can retrieve the desired combined color table from the combined color table library according to actual needs.
- the combined color table selection unit 322 can determine, from a combined color table library and according to user instructions, different color tables for different tissue divisions for use in subsequent color medical image drawing.
- the management of the combined color table can be implemented by MVC (Model-view-controller).
- the combined color table content is used as the model, and the displayed content is used as the view. In this way, according to the multiple tissue divisions selected by the user, a plurality of related combined color table modules are automatically acquired from the combined color table library, and a color medical image containing multiple tissue divisions is generated in one step.
- the image generation unit 323 can generate a color medical image based on the combined color table determined by the combined color table selection unit 322.
- the generating method may be a color-table-based rendering technique, such as volume rendering (VR), maximum intensity projection (MIP), and the like, or a volume rendering technique based on light illumination.
- the ray-illuminated volume rendering technique may be a ray casting simulation using OpenGL's Shader technique.
- the rendering technique can be based on a multi-threaded CPU or a GPU based rendering technique.
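- As a minimal illustration of one of the color-table-based techniques named above (maximum intensity projection), the sketch below projects the maximum gray value of a hypothetical volume along one axis and then applies a hypothetical 256-entry RGBA color table; it is not the application's rendering implementation.

```python
import numpy as np

def mip_render(volume: np.ndarray, color_table: np.ndarray, axis: int = 0) -> np.ndarray:
    """Maximum intensity projection of an 8-bit volume, colorized by a 256-entry RGBA table."""
    projection = volume.max(axis=axis)        # (H, W) image of per-ray maxima
    return color_table[projection]            # look up RGBA for each projected gray value

# Hypothetical data: a 32^3 volume and a gray-to-RGBA ramp.
volume = np.random.randint(0, 256, size=(32, 32, 32), dtype=np.uint8)
ramp = np.linspace(0.0, 1.0, 256)
color_table = np.stack([ramp, ramp, ramp, ramp], axis=1)   # grayscale ramp with opacity
image = mip_render(volume, color_table)
print(image.shape)  # (32, 32, 4)
```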
- the storage module 330 can store data or information.
- the stored data or information may be in various forms, such as a combination of one or more of numerical values, signals, images, related information of a target object, commands, algorithms, programs, and the like.
- the stored data may be a black-and-white medical image, a color medical image, a tissue color table, a combined color table, or a program and/or algorithm applied in image processing.
- the output module 340 can output the generated color medical image.
- the output module 340 can send the color medical image to the storage device 130 for storage, or to the interaction device 140 for display or other presentation to the user (e.g., as an image, sound, etc.).
- the displayed content can be intermediate results, such as a model of the region of interest, or the final result, such as the color medical image of the divided tissues.
- the combined color table selection unit 322 may include a tissue category determination sub-unit 410, a retrieval sub-unit 420, and a combined color table determination sub-unit 430.
- the modules shown can be connected directly (and/or indirectly) to each other.
- the tissue category determination sub-unit 410 can determine the category characteristics of the tissue contained in the received medical image.
- the category characteristics may include the type of tissue, the number of different kinds of tissues, and the morphology of each tissue.
- the number of categories of tissue may be determined by the tissue category determination sub-unit 410. For example, if a medical image contains bone, skin, and muscle, three different tissues can be identified. In some embodiments, the number of a certain tissue can be determined. For example, if a medical image contains 5 segments of vertebrae, the number of vertebra segments can be determined as 5.
- the morphology of a certain tissue can also be determined. For example, the cardiac morphology of a heart disease patient can be determined, since the morphology associated with certain diseases differs from that of healthy people.
- this series of category characteristics can serve as image features of the received medical image and provide a reference for the subsequent color image drawing process.
- the retrieval sub-unit 420 can search a combined color table library using the category characteristics determined by the tissue category determination sub-unit 410.
- the combined color table library may be stored in the storage module 330 or may be stored in a cloud server and accessed through the network 180.
- the retrieval can be hierarchical. For example, the search can first be performed by tissue type and quantity. If a medical image contains three kinds of tissues, namely bone, skin, and muscle, the first layer of the search can screen out the combined color tables whose tissue type information consists of bone, skin, and muscle. Further searches can be made using the specific quantities of certain tissues.
- for example, a combined color table containing 5 segments of vertebrae can be further screened out from the results of the first-layer search. The search can be further refined by the morphology of a tissue. For example, for a medical image containing two lung lobes, a combined color table containing information on two lung lobes is screened out from the first-layer or further search results. After at least one layer of retrieval, there may be several related combined color tables selected as search results.
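- A minimal sketch of such a hierarchical retrieval: filter first by the set of tissue types, then by specific tissue counts; the library entries and field names are hypothetical.

```python
def retrieve_combined_color_tables(library, tissue_types, tissue_counts=None):
    """Hierarchically filter a combined color table library.

    library: list of dicts with "types" (set of tissue names) and "counts" (dict name -> count).
    """
    # Level 1: keep tables whose tissue types match exactly.
    results = [t for t in library if t["types"] == set(tissue_types)]
    # Level 2 (optional): narrow further by specific tissue counts, e.g. 5 vertebra segments.
    if tissue_counts:
        results = [t for t in results
                   if all(t["counts"].get(name) == n for name, n in tissue_counts.items())]
    return results

# Usage with a hypothetical two-entry library.
library = [
    {"name": "bone_skin_muscle", "types": {"bone", "skin", "muscle"}, "counts": {}},
    {"name": "spine_5_vertebrae", "types": {"vertebra"}, "counts": {"vertebra": 5}},
]
print(retrieve_combined_color_tables(library, {"vertebra"}, {"vertebra": 5}))
```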
- the combined color table determination sub-unit 430 can determine a combined color table from the series of search results filtered out by the retrieval sub-unit 420.
- the plurality of retrieval results can be displayed on the interaction device 140.
- the display mode may be a list, a simplified diagram, a rendering of a color medical image by applying a combined color table, and the like.
- the user can perform a selection operation through the interaction device 140, and the combined color table determination sub-unit 430 confirms one of the combined color tables.
- the above description of the combined color table selection unit 322 is merely for convenience of description and does not limit the present application to the scope of the cited embodiments. It will be understood that, after understanding the principle of the system, those skilled in the art may arbitrarily combine the various modules, or connect the constituent subsystems to other modules, and make various modifications and changes in the form and details of the application of the method and system, without departing from this principle.
- the combined color table determination sub-unit 430 can be omitted or integrated into the retrieval sub-unit 420, and the retrieval sub-unit 420 directly outputs the retrieval result containing only one combined color table.
- FIG. 5 is an exemplary flow diagram of acquiring a color medical image, shown in accordance with some embodiments of the present application.
- the process of acquiring the color medical image may include acquiring the medical image data 510, dividing the tissue in the image 520, selecting a combined color table from a combined color table library according to the divided tissue 530, drawing a color medical image according to the selected combined color table 540, and displaying the color medical image 550.
- a medical image can be acquired.
- obtaining a medical image operation can be performed by the receiving module 310.
- the medical image may include a grayscale image such as an XR image, a CT image, an MR image, an ultrasound image, or the like, or any combination of the above images.
- the medical image can be obtained from the storage module 330.
- image data may be obtained from an external data source via network 180.
- image data may be obtained from input/output device 260.
- the related medical image acquisition method can refer to the description in the receiving module 310.
- the tissue division may be performed by the tissue dividing unit 321.
- the tissue division operation may be based on one or more algorithms, such as edge-based image segmentation methods (e.g., the Prewitt operator method, the Sobel operator method, the gradient operator method, the Kirsch operator method), region-based image segmentation methods (e.g., the region growing method, the threshold method, clustering methods), and other segmentation methods, such as methods based on fuzzy sets or neural networks, or any combination of the above algorithms.
- the tissue division can be based on a Mesh structure, a Mask mode, or any combination of the above. For related tissue division methods, refer to the description of the tissue dividing unit 321.
- a combined color table can be selected from a combined color table library based on the divided organization.
- the selection operation can be performed by the combined color table selection unit 322.
- the selection according to the divided tissue may be made according to the type of tissue or different parts of a tissue.
- the selecting may be first searching in the combined color table library according to the divided organization, and selecting one of the search results.
- the related combined color table selection method can refer to the description of the combined color table selection unit 322.
- colored medical images can be rendered in accordance with the combined color table selected in 530.
- rendering of the color medical image may be performed by image generation unit 323.
- a related drawing method can be referred to the description in the image generating unit 323.
- the color medical images drawn in 540 can be displayed.
- the presentation may be via the output module 340 to transmit the color medical image to the interactive device 140 for presentation.
- the presentation may be presented to the user in the form of an image, sound or movie.
- 540 and 550 can be combined into one step.
- the image data processing flow may return to 530 for further processing of the image data. For example, reselect a combined color table for drawing.
- one or more operations may be added to, or deleted from, the process. For example, before 510, a scanning operation of the object to be tested may be added, and the object to be tested may be scanned by the imaging device. As another example, a data storage operation can be added between or after 510, 520, 530, 540, and/or 550. The data can be saved in the storage module 330.
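- As a minimal end-to-end sketch of the flow 510 through 550 (acquire, divide, select, draw, display), the following illustrative code uses hypothetical gray-value thresholds, a hypothetical combined color table library, and invented helper names; it is not the claimed implementation.

```python
import numpy as np

def divide_tissues(gray):
    """Step 520: divide tissues by hypothetical gray-value thresholds (1=skin, 2=muscle, 3=bone)."""
    return np.digitize(gray, bins=[0, 300]) + 1

def select_combined_color_table(labels, library):
    """Step 530: pick the first combined color table whose tissue labels match the division."""
    present = set(np.unique(labels).tolist())
    for table in library:
        if set(table.keys()) == present:
            return table
    raise LookupError("no matching combined color table in the library")

def draw_color_image(gray, labels, table):
    """Step 540: color each pixel with its tissue's RGBA entry from the combined color table."""
    rgba = np.zeros(gray.shape + (4,))
    for label, scheme in table.items():
        if scheme["hidden"]:
            continue
        mask = labels == label
        rgba[mask, :3] = np.asarray(scheme["color"]) / 255.0
        rgba[mask, 3] = 1.0 - scheme["transparency"]
    return rgba

# Step 510: hypothetical acquired gray image (roughly CT-number-like values).
gray = np.array([[-200, 100], [500, 900]], dtype=float)
library = [{
    1: {"color": (255, 200, 180), "transparency": 0.6, "hidden": False},   # skin
    2: {"color": (200, 60, 60), "transparency": 0.3, "hidden": False},     # muscle
    3: {"color": (255, 255, 230), "transparency": 0.0, "hidden": False},   # bone
}]
labels = divide_tissues(gray)
table = select_combined_color_table(labels, library)
color_image = draw_color_image(gray, labels, table)   # Step 550 would display this image
print(color_image.shape)  # (2, 2, 4)
```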
- FIG. 6 is an exemplary flow diagram of selecting a combined color table from a library of combined color tables, in accordance with some embodiments of the present application.
- the process of selecting a combined color table from a combined color table library can include determining a category of the tissue 610, retrieving related combined color tables based on the determined type of tissue 620, presenting the related combined color tables to the user 630, obtaining an instruction to select a combined color table 640, and determining a combined color table 650.
- the type of tissue in the received medical image can be determined.
- the operation of determining the tissue type may be performed by the tissue category determination sub-unit 410.
- the determining the type of tissue includes determining category characteristics of the tissue.
- the category characteristics may include the type of tissue, the number of different kinds of tissues, and the morphology of each tissue.
- a related method of determining the tissue type may refer to the description in the tissue category determination sub-unit 410.
- a related combined color table can be retrieved based on the determined type of tissue.
- the operation of retrieving the associated combined color table may be performed by the retrieval sub-unit 420.
- the search is based on the tissue type characteristics determined in 610.
- the search step can be divided into multiple levels to gradually narrow down the search results.
- a related retrieval method can refer to the description in the retrieval sub-unit 420.
- a related combined color table can be presented to the user.
- the operation of presenting the combined color table can be performed by output module 340.
- the associated combined color table may be the search result in 620.
- the search results can be displayed on the interaction device 140.
- the display mode may be a list, a simplified diagram, a rendering of a color medical image drawn by applying a combined color table, and the like. For example, a set of renderings using different combined color tables can be displayed on the display, so that users can intuitively see the drawing effect of the different combined color tables.
- an instruction to select a combined color table can be obtained.
- the operation of the fetch instruction may be performed by the receiving module 310.
- the instructions to select a combined color table may be issued by the user via the interaction device 140.
- the instruction to select a combined color table may select one of the search results presented at 630. For example, when a set of renderings using different combined color tables is displayed on the display, the user can click on one of the renderings with the mouse, which issues an instruction to select the corresponding combined color table.
- a combined color table can be determined.
- the determining a combined color table may be performed by the combined color table determining sub-unit 430.
- the determining a combined color table may be determined based on user instructions obtained in 640.
- the determined combined color table can be used as a basis for drawing a color medical image.
- the process of constructing a combined color table library may include determining a number of tissues 710, determining and storing a color table of the first tissue, determining whether the color tables of all tissues have been determined 730, determining a new color table based on the determined color tables 740, storing the new color table in the combined color table 750, determining a combined color table from the determined color tables 760, and storing the combined color table in the combined color table library 770.
- In 710, the number of tissues can be determined.
- in constructing a combined color table library, individual combined color tables may be completed first, and a number of combined color tables may then be assembled into a combined color table library; when constructing a single combined color table, the number of tissues in that combined color table may be determined first.
- the number of tissues may be the number of tissue types, or the number of different parts of the same tissue (for example, the number of vertebrae in a spinal image).
- the number of tissues can be used to assist in constructing a combined color table data unit.
- the data unit can be divided into several parts according to the number of tissues. The parts of the data unit may correspond to different types of tissues or to different parts of the same tissue. For example, the data unit of a combined color table may be divided into three parts: the first part holds information related to the bone, the second part information related to the left lung lobe, and the third part information related to the right lung lobe.
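A minimal sketch of such a data unit, assuming it is simply a mapping with one empty slot per tissue (the patent does not prescribe this representation):

```python
def new_combined_color_table_unit(tissue_names):
    """Create an empty combined-color-table data unit with one slot per tissue (step 710)."""
    return {name: None for name in tissue_names}  # None = color table not yet determined

unit = new_combined_color_table_unit(["bone", "left lung lobe", "right lung lobe"])
print(unit)  # all three parts are still empty
```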
- In 720, the color table of the first tissue can be determined and stored.
- the color table may be stored in the data unit described in 710.
- the first tissue may be any one of the tissues determined in 710, or a specific one.
- for example, when constructing a combined color table containing bone, skin, and muscle, any tissue can be selected as the first tissue for which a color table is to be determined.
- in some embodiments, determining the color table of a certain tissue has higher priority than determining the color tables of the other tissues; that tissue may then be treated as the first tissue. For example, when bone, skin, and muscle are displayed at the same time, there may be higher requirements on the transparency of the skin, and the skin color table may therefore be determined first.
- the color table may include information such as the number, name, label, color, transparency, and whether the tissue is hidden.
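The per-tissue color table fields listed above can be pictured as a small record; the field types and the sample values below are placeholders, not values from the patent:

```python
from dataclasses import dataclass

@dataclass
class TissueColorTable:
    """Color table of a single tissue (step 720): number, name, label, color, transparency, hidden."""
    number: int
    name: str
    label: str
    color: tuple          # RGB, each channel in [0, 255]
    transparency: float   # 0.0 = opaque, 1.0 = fully transparent
    hidden: bool = False

skin = TissueColorTable(number=1, name="skin", label="contains skin",
                        color=(255, 220, 180), transparency=0.7)
print(skin)
```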
- In 730, it can be determined whether the color tables of all tissues have been determined. If the color table of some tissue has not been determined, step 740 is performed. If the color tables of all tissues have been determined, step 760 is performed. In some embodiments, the determination may be based on whether the amount of data in the color table data unit described in 710 matches the number of tissues. For example, when 730 is executed for the first time, the data unit contains only the color table of the first tissue, so the data unit of the current combined color table is not yet filled, that is, some tissue's color table remains undetermined, and 740 is performed. As another example, if the number of tissues in the color table data unit matches the number determined in 710, 760 is performed.
- In 740, a new color table can be determined based on the color tables already determined.
- the information in the color tables of tissues that have been determined may constrain the information in the color tables of tissues not yet determined. For example, if the color in the determined bone color table is green, the colors in the color tables determined afterwards may not be green.
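A minimal sketch of that constraint, assuming the only rule enforced is that a new color table may not reuse a color already assigned to a determined tissue (the patent leaves the exact constraints open):

```python
def pick_unused_color(determined_tables, palette):
    """Return the first palette color not already used by a determined color table (step 740)."""
    used = {t["color"] for t in determined_tables}
    for color in palette:
        if color not in used:
            return color
    raise ValueError("palette exhausted; no distinct color left")

determined = [{"name": "bone", "color": "green"}]
palette = ["green", "red", "yellow"]
print(pick_unused_color(determined, palette))  # -> "red", since green is taken by bone
```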
- In 750, the new color table can be stored in the combined color table.
- the storing may write the newly determined color table information into the combined color table data unit described in 710.
- the information may be written into the portion of the data unit that has been allocated to that tissue type.
- In 760, a combined color table can be determined based on the determined color tables.
- according to the determination in 730, the color tables of all tissues have been determined, that is, all the information in the combined color table data unit has been filled in.
- the information in the combined color table data unit can then be determined as a combined color table.
- In 770, the combined color table can be stored in a combined color table library.
- the combined color table may be numbered and stored according to the tissue types determined in 710.
- the numbering may set a number of tags for the combined color table, so that at retrieval time the combined color table can be found based on the relationship between the search terms and the tags.
- for example, the tags of a combined color table containing bone, skin, and muscle may be set to: contains three tissues, contains bone, contains skin, contains muscle, and the like.
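The numbering/tagging in 770 might be sketched as attaching a tag set derived from the tissue names, so that the retrieval in 620 can match search terms against tags; the tag strings are assumptions chosen to line up with the bone/skin/muscle example and with the retrieval sketch above:

```python
def store_in_library(library, combined_table, tissue_names):
    """Tag a finished combined color table by its tissues and append it to the library (step 770)."""
    tags = set(tissue_names)
    tags.add(f"{len(tissue_names)} tissues")
    entry = {"id": len(library) + 1, "tags": tags, "table": combined_table}
    library.append(entry)
    return entry

library = []
entry = store_in_library(library,
                         {"bone": {}, "skin": {}, "muscle": {}},
                         ["bone", "skin", "muscle"])
print(entry["id"], sorted(entry["tags"]))
```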
- the process of modifying a combined color table may include obtaining a combined color table 810, determining the data to be modified in the combined color table 820, determining a new combined color table 830 by modifying that data, and storing the new combined color table in the combined color table library 840.
- In 810, a combined color table can be obtained.
- the obtained combined color table may be retrieved from a combined color table library, or may be written manually by the user.
- in some embodiments, when the user applies a combined color table to assist in drawing a color medical image, the effect may be unsatisfactory and some of its data may need to be adjusted.
- the user can manually write a combined color table based on the display effect. For example, the data corresponding to the current display effect may be filled in according to the data structure of the combined color table data unit described above.
- In 820, the data that needs to be modified in the combined color table can be determined.
- a color medical image drawn according to the combined color table obtained in 810 may be unsatisfactory and fail to meet the user's needs.
- the user may then wish to modify some of the data in the combined color table.
- for example, the color of the bone and the color of the skin may be so close that distinguishing the two tissues is difficult.
- the color of the bone or of the skin can then be determined as data that needs to be modified.
- as another example, the transparency of the skin may not be high enough to allow the internal tissue to be observed; the transparency of the skin can then be determined as data that needs to be modified.
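Which data "needs to be modified" is ultimately the user's judgment; purely as an illustration, the sketch below flags tissue pairs whose colors lie closer than an assumed RGB-distance threshold and a skin entry whose transparency falls below an assumed minimum:

```python
from itertools import combinations

def flag_data_to_modify(tables, min_color_dist=60, min_transparency=0.5):
    """Heuristically flag color-table entries a user may want to modify (step 820)."""
    issues = []
    for a, b in combinations(tables, 2):
        # Euclidean distance between the two RGB colors
        dist = sum((ca - cb) ** 2 for ca, cb in zip(a["color"], b["color"])) ** 0.5
        if dist < min_color_dist:
            issues.append(f"colors of {a['name']} and {b['name']} are hard to tell apart")
    for t in tables:
        if t["name"] == "skin" and t["transparency"] < min_transparency:
            issues.append("skin transparency too low to observe internal tissue")
    return issues

tables = [
    {"name": "bone", "color": (230, 230, 210), "transparency": 0.0},
    {"name": "skin", "color": (240, 225, 205), "transparency": 0.2},
]
print(flag_data_to_modify(tables))
```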
- In 830, a new combined color table can be determined by modifying that data.
- the user can modify the data identified in 820 by modifying the corresponding data in the combined color table data unit.
- the data in the combined color table can also be presented in a more user-friendly interface. For example, the data in the combined color table may be mapped to buttons in a display window, and the data may be modified by clicking the corresponding buttons.
- In 840, the new combined color table can be stored in the combined color table library.
- the storage method may refer to the description of 770. If the combined color table obtained in 810 also came from the same combined color table library, the original combined color table data may be overwritten with the new combined color table data.
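Steps 830-840 might be sketched as an in-place edit followed by overwriting the original library entry; keying the library by an `id` field is an assumption carried over from the earlier sketches:

```python
def modify_and_store(library, entry_id, tissue, field, new_value):
    """Modify one field of one tissue's color table and write the result back (steps 830-840)."""
    for entry in library:
        if entry["id"] == entry_id:
            new_table = {t: dict(v) for t, v in entry["table"].items()}  # copy before editing
            new_table[tissue][field] = new_value
            entry["table"] = new_table  # overwrite the original combined color table
            return new_table
    raise KeyError(f"no combined color table with id {entry_id}")

library = [{"id": 1, "table": {"skin": {"transparency": 0.2}, "bone": {"transparency": 0.0}}}]
print(modify_and_store(library, 1, "skin", "transparency", 0.8))
```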
- FIG. 9 is an example of selecting a combined color table, shown in accordance with some embodiments of the present application.
- the eight images shown in the figure may be the effect diagrams obtained by applying the search results retrieved from the combined color table library to the acquired medical image.
- the user can visually see on the screen the preliminary effects of the color medical images drawn according to the different combined color tables, and can select one or more of the combined color tables by operations such as a mouse click.
- FIGS. 10A-10D are examples of applying different combined color tables, shown in accordance with some embodiments of the present application.
- for the same medical image, drawing color medical images with different combined color tables can produce different display effects.
- FIG. 10A shows only the blood vessels of the head;
- FIG. 10B shows the blood vessels and the skull;
- FIG. 10C shows the bone; and
- FIG. 10D shows the blood vessels, the skull, and the skin.
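The differences between FIGS. 10A-10D come from which tissues are shown and how their color and transparency are set. A toy sketch of applying such per-tissue settings, assuming the segmented image is available as a label image in which each pixel stores a tissue label (NumPy is used here only for convenience):

```python
import numpy as np

def apply_combined_color_table(label_image, combined_table):
    """Map a tissue-label image to an RGBA image according to a combined color table."""
    h, w = label_image.shape
    rgba = np.zeros((h, w, 4), dtype=np.float32)
    for label, entry in combined_table.items():
        if entry.get("hidden", False):
            continue  # hidden tissue stays fully transparent (alpha remains 0)
        mask = label_image == label
        r, g, b = entry["color"]
        alpha = 1.0 - entry.get("transparency", 0.0)
        rgba[mask] = (r / 255.0, g / 255.0, b / 255.0, alpha)
    return rgba

labels = np.array([[0, 1], [2, 1]])  # 0 = background, 1 = vessel, 2 = skull
table = {1: {"color": (200, 30, 30), "transparency": 0.0},
         2: {"color": (230, 230, 210), "transparency": 0.5}}
print(apply_combined_color_table(labels, table)[0, 1])  # RGBA of a vessel pixel
```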
- FIG. 11 is an example of a user interface for modifying a combined color table, shown in accordance with some embodiments of the present application.
- the title of the user interface shown is "Tissue List".
- in dashed box A, the tissue types in the combined color table are listed as blood vessel and bone.
- dashed box B is the color adjustment item; the user can adjust the tissue colors through these buttons.
- dashed box C is the hide option; the user can choose whether to hide the corresponding tissue through this button.
- dashed box D contains other data adjustment items; the user can adjust data such as transparency, brightness, and contrast.
- the present application uses specific words to describe embodiments of the present application.
- a "one embodiment,” “an embodiment,” and/or “some embodiments” means a feature, structure, or feature associated with at least one embodiment of the present application. Therefore, it should be emphasized and noted that “an embodiment” or “an embodiment” or “an alternative embodiment” that is referred to in this specification two or more times in different positions does not necessarily refer to the same embodiment. . Furthermore, some of the features, structures, or characteristics of one or more embodiments of the present application can be combined as appropriate.
- aspects of the present application can be illustrated and described through a number of patentable categories or situations, including any new and useful process, machine, product, or composition of matter, or any new and useful improvement thereof. Accordingly, various aspects of the present application may be performed entirely by hardware, entirely by software (including firmware, resident software, microcode, etc.), or by a combination of hardware and software.
- the above hardware or software may be referred to as a "data block,” “module,” “engine,” “unit,” “component,” or “system.”
- aspects of the present application may be embodied as a computer product located on one or more computer readable media, the product including computer readable program code.
- a computer readable signal medium may contain a propagated data signal carrying computer program code, for example, in baseband or as part of a carrier wave.
- the propagated signal may take a variety of forms, including an electromagnetic form, an optical form, and the like, or a suitable combination thereof.
- the computer readable signal medium may be any computer readable medium other than a computer readable storage medium that can communicate, propagate, or transmit a program for use by or in connection with an instruction execution system, apparatus, or device.
- program code located on a computer readable signal medium can be propagated through any suitable medium, including radio, cable, fiber optic cable, RF, or the like, or any combination of the above.
- the computer program code required for the operation of the various parts of the present application can be written in any one or more programming languages, including object-oriented programming languages such as Java, Scala, Smalltalk, Eiffel, JADE, Emerald, C++, C#, VB.NET, and Python; conventional procedural programming languages such as C, Visual Basic, Fortran 2003, Perl, COBOL 2002, PHP, and ABAP; dynamic programming languages such as Python, Ruby, and Groovy; or other programming languages.
- the program code can run entirely on the user's computer, as a stand-alone software package on the user's computer, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server.
- in the latter case, the remote computer can be connected to the user's computer via any form of network, such as a local area network (LAN) or a wide area network (WAN), or connected to an external computer (for example, via the Internet), or used in a cloud computing environment, or as a service such as Software as a Service (SaaS).
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- Health & Medical Sciences (AREA)
- Human Computer Interaction (AREA)
- Medical Informatics (AREA)
- Life Sciences & Earth Sciences (AREA)
- Computer Vision & Pattern Recognition (AREA)
- General Health & Medical Sciences (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Radiology & Medical Imaging (AREA)
- Computer Hardware Design (AREA)
- Quality & Reliability (AREA)
- Biomedical Technology (AREA)
- Molecular Biology (AREA)
- Surgery (AREA)
- Animal Behavior & Ethology (AREA)
- Heart & Thoracic Surgery (AREA)
- Public Health (AREA)
- Veterinary Medicine (AREA)
- Pathology (AREA)
- Optics & Photonics (AREA)
- High Energy & Nuclear Physics (AREA)
- Biophysics (AREA)
- Apparatus For Radiation Diagnosis (AREA)
- Magnetic Resonance Imaging Apparatus (AREA)
- Measuring And Recording Apparatus For Diagnosis (AREA)
Abstract
The present application discloses a method for generating a color medical image. The method may include: acquiring medical image data, partitioning the tissues in the image, selecting a combined color table from a combined color table library according to the partitioned tissues, and drawing, according to the selected combined color table, a color medical image containing the partitioned tissues. In some embodiments, the combined color table may be data containing a color scheme for the partitioned tissues.
Description
本申请涉及医学影像生成方法及系统,尤其是涉及根据组合颜色表生成彩色医学影像的方法和系统。
随着医学影像在医学领域的广泛应用,人们也开始不断地努力研究改善医学图像质量以及提高医学图像显示的可鉴别度。在医学影像领域,有大量的图像信息,包括XR图像,CT图像,MR图像以及超声图像等,这些图像大部分数据都表示着信号的强度,如CT值代表着衡量组织对于X光的吸收性,反映了组织的密度。这些信号绝大多数都是一维信号,从图像的表现形式来说也就是绘图图像。一般人眼对于黑白灰度级的分辨能力较差,多数只能辨别二十多个灰度等级,对灰度变化并不敏感。但是人眼却能同时区分上千种不同亮度、色调、饱和度的彩色图像。因此,人们常把灰度图像变换为彩色图像,利用彩色来显示特定的灰度图像,将人眼不能区分的微小灰度差别显示为明显色彩差异,从而提高图像的可鉴别度,而且能够使医务人员从图像中获得更多的图像信息,有助于疾病的诊断。目前在医学图像中常用的彩色显示技术为颜色表(伪彩色表),也就是利用不同的颜色和透明度表示不同的灰阶,从而达到通过色彩手段显示医学影响的目的。但是随着医学应用的发展,一幅医学影像中常包含了多种组织,而医学应用需要同时显示多种组织并加以区分。这就需要在图像分割的基础上,使用多个颜色表来分别配置不同的组织区域,从而达到显示多个组织的目的。这其中,使用多个颜色表尽管可以达到多组织同时显示的目的,但在切换组织显示以及修改某个颜色表配置的信息时操作十分繁琐,需要医务人员进行很多个步骤的操作才能实现,增加了操作的复杂度和耗时。
简述
本申请的一个方面是关于一种生成彩色医学影像的方法。所述生成彩色医学影像的方法可以包括:获取医学影像数据,划分影像中的组织,根据划分的
组织,从一个组合颜色表库中选择一个组合颜色表,以及根据选择的组合颜色表绘制包含所述划分的组织的彩色的医学影像。在一些实施例中中,所述组合颜色表可以为一种包含所述划分的组织配色方案的数据。
本申请的另一个方面是关于一种非暂时性的计算机可读介质。所述非暂时性的计算机可读介质可以包括可执行指令。所述指令被至少一个处理器执行时,可以导致所述至少一个处理器实现一种方法。
本申请的另一个方面是关于一个生成彩色医学影像的系统。所述系统可以包括:至少一个处理器,以及可执行指令。所述指令被至少一个处理器执行时,导致所述至少一个处理器实现所述生成彩色医学影像的方法。
本申请的另一个方面是关于一个系统。所述系统可以包括:至少一个处理器,以及用来存储指令的存储器,所述指令被所述至少一个处理器执行时,可以导致所述系统实现所述生成彩色医学影像的方法。
根据本申请的一些实施例,所述生成彩色医学影像的方法可以进一步包括将绘制的包含所述划分的组织的彩色医学影像展示给用户。
根据本申请的一些实施例,所述根据划分的组织,从一个组合颜色表库中选择一个组合颜色表可以包括:确定组织的种类,检索与确定的组织的种类相关的组合颜色表,呈现相关的组合颜色表给用户,获取一个选择组合颜色表的指令,以及确定一个组合颜色表。
根据本申请的一些实施例,所述生成彩色医学影像的方法可以进一步包括组合颜色表库中的组合颜色表的生成方法,所述组合颜色表生成方法可以包括:确定组织数,确定第一个组织的颜色表,根据已确定颜色表确定一个新的颜色表,直到所有组织的颜色表都被确定,以及将所有的颜色表组合为一个组合颜色表。
根据本申请的一些实施例,所述生成彩色医学影像的方法可以进一步包括组合颜色表库中的组合颜色表的修改方法,所述组合颜色表修改方法可以包括:获取一个组合颜色表,确定组合颜色表中需要修改的数据,通过修改所述需要修改的数据确定一个新的组合颜色表,以及将所述新的组合颜色表存储至组合颜色表库中。
根据本申请的一些实施例,所述医学影像数据可以至少为以下组合中的一种:一个二维图片、一个三维空间模型、一个动态影像。
根据本申请的一些实施例,所述划分影像中的组织可以包括:划分所述医学影像中的不同组织以及相同组织的不同部分。
根据本申请的一些实施例,所述组织配色方案可以包括编号、名称、标签、颜色、透明度、是否隐藏。
根据本申请的一些实施例,所述确定组织的种类可以是确定组织的种类特征,其中,组织的种类特征可以包括组织的种类、不同组织的数量以及各个组织的形态。
根据本申请的一些实施例,所述选择组合颜色表的指令可以由用户通过交互设备发出,其中,交互设备上可以显示有所述检索的检索结果,所述显示的方式可以包括列表、简图和应用某组合颜色表绘制彩色医学影像的效果图。
根据本申请的一些实施例,所述组合颜色表可以包括:多个组织,所述组织表示医学影像数据的部分区域或子集,以及与各个组织对应的颜色表,所述颜色表包括组织的窗宽窗位以及按照窗宽窗位确定的灰度值所对应的彩色配色效果。
根据本申请的一些实施例,所述检索与确定的组织的种类相关的组合颜色表可以是根据组织的种类特征分层级进行检索。
根据本申请的一些实施例,所述组合颜色表可以进一步包括所述各个组织的显示或者隐藏选项。
根据本申请的一些实施例,所述组合颜色可以表进一步包括所述各个组织的透明度选项。
附图描述
在此所述的附图用来提供对本申请的进一步理解,构成本申请的一部分,本申请的示意性实施例及其说明用于解释本申请,并不构成对本申请的限定。在各图中,相同标号表示相同部件。
图1是根据本申请的一些实施例所示的彩色医学影像生成系统示意图;
图2是根据本申请的一些实施例所示的一个计算设备的结构示意图;
图3是根据本申请的一些实施例所示的一个彩色医学影像生成系统示意图;
图4是根据本申请的一些实施例所示的一个组合颜色表选择单元示意图;
图5是根据本申请的一些实施例所示的一个生成彩色医学影像的示例性流程图;
图6是根据本申请的一些实施例所示的一个从一个组合颜色表库中选择一个组合颜色表的示例性流程图;
图7是根据本申请的一些实施例所示的构建组合颜色表库的示例性流程图;
图8是根据本申请的一些实施例所示的修改组合颜色表的示例性流程图;
图9是根据本申请的一些实施例所示的一个选择组合颜色表的示例;
图10A-D是根据本申请的一些实施例所示的应用不同组合颜色表生成彩色医学图像的示例;以及
图11是根据本申请的一些实施例所示的修改组合颜色表的用户界面示例。
具体描述
为了更清楚地说明本申请的实施例的技术方案,下面将对实施例描述中所需要使用的附图作简单的介绍。显而易见地,下面描述中的附图仅仅是本申请的一些示例或实施例,对于本领域的普通技术人员来讲,在不付出创造性劳动的前提下,还可以根据这些附图将本申请应用于其他类似情景。除非从语言环境中显而易见或另做说明,图中相同标号代表相同结构或操作。
如本申请和权利要求书中所示,除非上下文明确提示例外情形,“一”、“一个”、“一种”和/或“该”等词并非特指单数,也可包括复数。一般说来,术语“包括”与“包含”仅提示包括已明确标识的步骤和元素,而这些步骤和元素不构成一个排它性的罗列,方法或者设备也可能包含其他的步骤或元素。
虽然本申请对根据本申请的实施例的数据处理系统中的某些模块做出了各种引用,然而,任何数量的不同模块可以被使用并运行在一个通过网络与该系统连接的客户端和/或服务器上。所述模块仅是说明性的,并且所述系统和方法的不同方面可以使用不同模块。
本申请中使用了流程图用来说明根据本申请的实施例的数据处理系统所执行的操作步骤。应当理解的是,显示在前面或后面的操作步骤不一定按照顺序来精确地执行。相反,可以按照倒序或同时处理各种步骤。同时,也可以将其他操作步骤添加到这些过程中,或从这些过程移除某一步或数步操作。
在图像数据处理过程中,“组织划分”、“图像分割”、“图像提取”、“图像分类”可以相互转化,均表达从大范围区域内选取符合某条件的图像。在一些实施例中,成像系统可以包括一种或多种形态。所述形态可以包括数字减影血管造影(DSA)、磁共振成像(MRI)、磁共振血管造影(MRA)、计算机断层扫描(CT)、计算机断层扫描血管造影(CTA)、超声波扫描(US)、正电子发射断层扫描术(PET)、单光子发射计算机断层扫描(SPECT)、SPECT-MR、CT-PET、CE-SPECT、DSA-MR、PET-MR、PET-US、SPECT-US、TMS-MR、US-CT、US-MR、X射线-CT、X射线-PET、X射线-US、视频-CT、视频-US和/或类似的一种或多种的组合。在一些实施例中,成像扫描的目标可以是器官、机体、物体、损伤部位、肿瘤等一种或多种的组合。在一些实施例中,成像扫描的目标可以是头部、胸腔、腹部、器官、骨骼、血管等一种或多种的组合。在一些实施例中,扫描的目标可以为一个或多个部位的血管组织。在一些实施例中,图像可以是二维图像和/或三维图像。在二维图像中,最细微的可分辨元素可以为像素点(pixel)。在三维图像中,最细微的可分辨元素可以为体素点(voxel)。在三维图像中,图像可由一系列的二维切片或二维图层构成。
组织划分过程可以基于图像的像素点(或体素点)的相应特征进行。在一些实施例中,所述像素点(或体素点)的相应特征可以包括纹理结构、灰度、平均灰度、信号强度、颜色饱和度、对比度、亮度等一种或多种的组合。在一些实施例中,所述像素点(或体素点)的空间位置特征也可以用于图像分割过程。
需要注意的是,以上对于图像数据处理系统的描述,仅为描述方便,并不能把本申请限制在所举实施例范围之内。可以理解,对于本领域的技术人员来说,在了解该系统的原理后,可能在不背离这一原理的情况下,对各
个模块进行任意组合,或者构成子系统与其他模块连接,对实施上述方法和系统的应用领域形式和细节上的各种修正和改变。
图1是根据本申请的一些实施例所示的彩色医学影像生成系统100示意图。该彩色医学影像生成系统100可以包括数据采集设备110、处理设备120、存储设备130和交互设备140。数据采集设备110、处理设备120、存储设备130和交互设备140相互之间可以通过网络180进行通信。
数据采集设备110可以是一个采集数据的设备。所述数据可以包括图像数据、对象特征数据等。在一些实施例中,所述数据采集设备110可以包括一个成像设备。所述成像设备可以采集所述图像数据。所述成像设备可以是磁共振成像仪(magnetic resonance imaging,MRI)、电子计算机断层扫描仪(computed tomography,CT)、正电子发射型计算机断层显像仪(positron emission computed tomography,PET)、B超仪(b-scan ultrasonography)、超声诊断仪(diasonography)、热断层扫描仪(thermal texture maps,TTM)、医用电子内窥镜(medical electronic endoscope,MEE)等中的一种或多种的组合。所述图像数据可以是包括对象的血管、组织或器官的图片或数据。在一些实施例中,所述对象特征采集设备可以集成在所述成像设备中,从而同时采集图像数据和对象特征数据。在一些实施例中,所述数据采集设备110可以通过网络180将其所采集的数据发送至处理设备120、存储设备130和/或交互设备140等。
处理设备120可以对数据进行处理。所述数据可以是通过数据采集设备110采集到的数据,从存储设备130中读取的数据,从交互设备140中获得的反馈数据,如用户的输入数据,或通过网络180从云端或者外接设备中获得的数据等。在一些实施例中,所述数据可以包括图像数据、对象特征数据、用户输入数据等。所述处理可以包括在图像数据中选择感兴趣的区域。所述感兴趣的区域可以由处理设备120自行选择或根据用户输入数据选择。在一些实施例中,选择的感兴趣区域可以是血管、组织或者器官等。例如,所述感兴趣区域可以是头部区域,如血管、骨头、皮肤等。处理设备120可以进一步对所述图像中对感兴趣的区域进行分割。图像分割的方法可以包括基于边缘的图像分割方法,如Perwitt算子法、Sobel算子法、梯度算子法、Kirch算子法等,基于区域的图像分割方法,如区域生长法、阈值法、聚类法等以及其他分割方法,如基于模糊集、神经网络
的方法等。
在一些实施例中,处理设备120可以进行不同组织的划分。如将头部区域灰度图像划分为头部血管、头部细血管、头部骨骼、头部皮肤等。组织划分可以基于用户标识或者交互式划分的方式,也可以基于自动划分组织的算法,如区域增长算法、基于灰度级算法、水平集、神经网络、聚类法、图切割、形变模型和图谱法等。
在一些实施例中,处理设备120可以根据组合颜色表将灰度图像转换为彩色医学影像。所述组合颜色表可以包括:多个组织,所述组织表示医学影像数据的部分区域或子集;与各个组织对应的颜色表,所述颜色表可以包括组织的窗宽窗位以及按照窗宽窗位确定的灰度值所对应的彩色配色效果;所述各个组织的显示或者隐藏选项以及所述各个组织的透明度选项。例如,对头部灰度图像应用不同配色方案进行处理的结果可以包括头部血管彩色医学影像150、头部细血管彩色医学影像160、头部血管和头骨彩色医学影像170、头部细血管和头骨彩色医学影像180等中的一种或多种的组合。
在一些实施例中,处理设备120可以编辑组合颜色表对应的组织管理模块,如组织显示控制、各个组织的颜色、颜色表控制、透明度、光照参数等。在一些实施例中,处理设备120可以对一个选定的组合颜色表中的某一组织的透明度参数进行修改。在一些实施例中,处理设备120可以对一个选定的组合颜色表的某几个组织的颜色参数进行修改。
在一些实施例中,处理设备120可以对其获得的数据或处理结果进行降噪或平滑处理。在一些实施例中,处理设备120可以将其获得的数据或处理结果发送至存储设备130进行存储,或者发送至交互设备140进行显示。所述处理结果可以是处理过程中产生的中间结果,如组织划分的结果,也可以是处理的最终结果,如最终获得的彩色医学影像。在一些实施例中,处理设备120可以是一个或多个处理元件或设备,如中央处理器(central processing unit,CPU)、图形处理器(graphics processing unit,GPU)、数字信号处理器(digital signal processor,DSP)、系统芯片(system on a chip,SoC)、微控制器(microcontroller unit,MCU)等。在一些实施例中,处理设备120也可以是特殊设计的具备特殊功能的处理元件或设备。处理设备120可以是本地的,或相对于数据采集设备110是远程的。
存储设备130可以储存数据或信息。所述数据或信息可以包括数据采集设备110获取的数据、处理设备120产生的处理结果或控制指令、以及交互设备140所接收到的用户输入数据等。存储设备130可以是一种或多种可以读取或写入的存储媒介,包括静态随机存储器(static random access memory,SRAM),随机存储器(random-access memory,RAM)、只读存储器(read-only memory,ROM)、硬盘、闪存等。在一些实施例中,存储设备130也可以是远程的存储器,如云盘等。
交互设备140可以接收、发送,以及/或显示数据或信息。所述接收的数据或信息可以包括数据采集设备110获取的数据、处理设备120产生的处理结果、存储设备130存储的数据等。例如,交互设备140显示的数据或信息可以包括数据采集设备110获得的头部血管的实际图像、处理设备120根据一个配色方案生成的的彩色医学影像等。显示的形式可以包括二维的彩色医学图像、三维的彩色医学图像、彩色几何模型及其网格分析、矢量图(如速度矢量线)、等值线图、填充型的等值线图(云图)、XY散点图、粒子轨迹图、模拟流动效果等一种或多种组合。又例如,交互设备140发送的数据或信息可以包括用户的输入信息。交互设备140可以接收用户输入的一个或多个运行参数,并发送到处理设备120。在一些实施例中,交互设备140可以包括一个用户交互界面。用户可以通过特定的交互装置,如鼠标、键盘、触摸板、麦克风等向交互设备140输入一个用户输入数据。
在一些实施例中,交互设备140可以是显示屏等具有显示功能的设备。在一些实施例中,交互设备140可以具有处理设备120部分或全部的功能。例如,交互设备140可以对处理设备120生成的结果进行平滑、降噪、变色等操作。举例说明,变色操作可以将一个灰度图变成彩图,或将一个彩图变成一个灰度图。在一些实施例中,交互设备140与处理设备120可以是一个集成的设备。所述集成的设备可以同时实现处理设备120和交互设备140的功能。在一些实施例中,交互设备140可以包括台式电脑、服务器、移动设备等。移动设备可以包括笔记本电脑、平板电脑、ipad、交通工具(例如,机动车、船、飞机等)的内置设备、可穿戴设备等。在一些实施例中,交互设备140可以包括或连接到显示装置、打印机、传真等。
网络180可以用于彩色医学影像生成系统100内部的通信,接收系统外部的信息,向系统外部发送信息等。在一些实施例中,数据采集设备110、处理设备120和交互设备140之间可以通过有线连接、无线连接、或其结合的方式接入网络180。网络180可以是单一网络,也可以是多种网络的组合。在一些实施例中,网络180可以包括但不限于局域网、广域网、公用网络、专用网络、无线局域网、虚拟网络、都市城域网、公用开关电话网络等中的一种或几种的组合。在一些实施例中,网络180可以包括多种网络接入点,例如有线或无线接入点、基站或网络交换点,通过以上接入点使数据源连接网络180并通过网络发送信息。
图2是根据本申请的一些实施例所示的一个计算设备200的结构示意图。该计算设备200可以实施本申请中披露的特定系统。本实施例中的特定系统利用功能框图解释了一个包含用户界面的硬件平台。计算设备200可以实施当前彩色医学影像生成系统100中的一个或多个组件、模块、单元、子单元(例如,处理设备120,交互设备140等)。另外,彩色医学影像生成系统100中的一个或多个组件、模块、单元、子单元(例如,处理设备120,交互设备140等)能够被计算设备200通过其硬件设备、软件程序、固件以及它们的组合所实现。这种计算机可以是一个通用目的的计算机,也可以是一个有特定目的的计算机。两种计算机都可以被用于实现本实施例中的特定系统。为了方便起见,图2中只绘制了一台计算设备,但是本实施例所描述的进行信息处理并推送信息的相关计算机功能是可以以分布的方式、由一组相似的平台所实施的,分散系统的处理负荷。
如图2所示,计算设备200可以包括内部通信总线210,处理器(processor)220,只读存储器(ROM)230,随机存取存储器(RAM)240,通信端口250,输入/输出组件260,硬盘270,用户界面280。内部通信总线210可以实现计算设备200组件间的数据通信。处理器220可以执行程序指令完成在此披露书中所描述的彩色医学影像生成系统100的一个或多个功能、组件、模块、单元、子单元。处理器220由一个或多个处理器组成。通信端口250可以配置实现计算设备200与彩色医学影像生成系统100其他部件(比如数据采集设备110)之间数据通信(比如通过网络180)。计算设备200还可以包括不同形式的程序储存单元以及数据储存单元,例如硬盘270,只读存储器(ROM)230,随机存取存储器
(RAM)240,能够用于计算机处理和/或通信使用的各种数据文件,以及处理器220所执行的可能的程序指令。输入/输出组件260支持计算设备200与其他组件(如用户界面280),和/或与彩色医学影像生成系统100其他组件(如数据库140)之间的输入/输出数据流。计算设备200也可以通过通信端口250从网络180发送和接收信息及数据。
图3是根据本申请的一些实施例所示的一个彩色医学影像生成系统300示意图。彩色医学影像生成系统300可以包含一个接收模块310、一个处理模块320、一个存储模块330和一个输出模块340。其中处理模块320还可以包含一个组织划分单元321、一个组合颜色表选择单元322和一个图像生成单元323。所示模块之间可以彼此直接(和/或间接)连接。
接收模块310可以接收医学影像。所述医学影像可以包括XR图像、CT图像、MR图像以及超声图像等。所述医学影像可以反映人体或动植物的某一部分组织的信息。在一些实施例中,所述医学影像是一个或一组二维影像。例如,黑白的X光胶片,不同断层的CT扫描影像等。所述二维医学影像可以由若干像素(Pixel)组成。在一些实施例中,所述医学影像可以是一个三维的空间模型,例如,根据不同断层的CT扫描影像重建的器官三维模型,或者由具有三维造影能力的设备输出的三维空间模型。所述三维医学影像可以由若干体素(Voxel)组成。在一些实施例中,所述医学影像还可以是一段时间内的动态影像。例如,一段反映心脏及其周围组织在一个心动周期内的变化情况的视频等。所述医学影像可以来自于成像设备110,可以来自于存储模块330,也可以来自于用户通过交互设备的输入。在一些实施例中,所述构成医学影像的像素或体素为黑白像素或体素。所述像素或体素的灰度值可以根据成像组织器官的不同状态相应改变以呈现出黑白的组织器官影像。所述像素或体素与其对应的灰度值可以以列表的形式存储在存储模块330中。在一些实施例中,属于某一组织器官的像素或体素可以对应一部分的灰度值。例如,在某医学影像上同时存在骨头,肌肉和皮肤组织。各个组织可以分别对应某一灰度值的取值范围。例如,骨头的灰度值较高,肌肉的灰度值次之,而皮肤的灰度值最小等。
处理模块320可以对接收模块310所接收到的医学影像进行处理。所述处理可以包括对医学影像中的不同组织或相同组织的不同部分进行划分、确定划
分后的各个组织的配色方案,以及根据确定的配色方案生成彩色的医学影像。所述处理过程可以分别由组织划分单元321、组合颜色表选择单元322和图像生成单元323来执行。
组织划分单元321可以将接收到的医学影像中不同的组织或者相同组织的不同部分进行划分。所述划分可以是自动划分、半自动划分或手动划分。所述不同的组织可以是医学影像中包含的不同器官,例如骨头、皮肤、脏器等。所述相同组织的不同部分可以是某一组织器官的在位置上存在差别的部分,例如人的左肺叶和右肺叶等。在一些实施例中,所述划分可以是将不同组织对应的像素或体素进行分类。例如,在一幅包含骨头、肌肉和皮肤的医学影像中,骨头部分对应的像素或体素可以被分为第一类,肌肉部分对应的像素或体素可以被分为第二类,皮肤部分对应的像素或体素可以被分为第三类。所述像素或体素的分类信息可以存储在存储模块330中。
组合颜色表选择单元322可以确定一个用来辅助生成彩色医学影像的组合颜色表。在一些实施例中,所述组合颜色表可以是由一系列的颜色表组成。所述颜色表可以是目前在医学影像中常用的彩色显示技术中应用的颜色表(伪彩色表)。所述颜色表可以针对不同的灰度或灰度区间对应不同的颜色和/或透明度。例如,可以灰度值越高可以对应波长越长的颜色,或者某一灰度区间对应统一种颜色。在一些实施例中,所述颜色表也可以对应于某一类的像素或体素。例如,对于同一类的像素或体素可以对应同一种颜色。在一些实施例中,所述颜色表的内容可以包括某一类组织的编号、名称、标签、颜色、透明度、是否隐藏等信息。所述是否隐藏信息可以是一个是或否的判断信息。例如当是否隐藏信息为是时,则可以在最终生成的彩色医学影像中将该类的组织隐去。在一些实施例中所述组合颜色表可以通过可扩展标记语言Xml(eXtensible Markup Language)相关技术来实现组合颜色表的解析和存储。
在一些实施例中,所述组合颜色表可以是上述至少两个颜色表的集合。例如,某组合颜色表可以包含骨头和皮肤这两类组织的颜色表,则根据该组合颜色表绘制的彩色医学影像中,骨头部分根据该组合颜色表中的骨头类信息进行绘制,皮肤部分根据该组合颜色表中的皮肤类信息进行绘制。在一些实施例中,所述组合颜色表中不同种类组织的信息可以是相互关联的。例如,在一个包含骨头
和皮肤两类组织的组合颜色表中,骨头类信息中的颜色与皮肤类信息中的颜色可能存在明显的差别,以利于在生成彩色的医学影像时可以通过人眼进行分辨。又例如在生成三维的彩色医学影像时,某组织A可能完全被组织B包围。则在相应的组合颜色表中,用户可以选择隐藏组织B的信息,以利于在需要观察组织A的情况时在三维的彩色医学影像中显示组织A。
在一些实施例中,所述组合颜色表可以存储在一个组合颜色表库中。例如,通过Xml相关技术来实现组合颜色表的解析和存储。用户可以根据实际需求从组合颜色表库中调取所需的组合颜色表。组合颜色表选择单元322可以根据用户指令从一个组合颜色表库中确定针对不同组织划分的不同组合颜色表来进行后续的彩色医学影像绘制工作。在一些实施例中,可以通过MVC(Model-view-controller)来实现对组合颜色表的管理。即将组合颜色表内容作为模型,将显示的内容作为视图。通过这种方法,可以实现根据用户选择的多个组织划分,从组合颜色表库中自动获取相关的多个组合颜色表模块,一步生成包括多个组织划分的彩色医学影像。
图像生成单元323可以根据组合颜色表选择单元322确定的组合颜色表来生成彩色的医学影像。所述生成方法可以是基于颜色表的绘制技术,例如医学影像的VR(virtual reality)、MIP(Maximum Intensity Projection)等绘制技术中,采用基于光线照射的体绘制技术。所述基于光线照射的体绘制技术可以是通过OpenGL的Shader技术来实现光线投射模拟。在一些实施例中,所述绘制技术可以是基于多线程CPU或者基于GPU的绘制技术。
存储模块330可以存储数据或信息。存储的数据或信息可以是各种形式,例如,数值、信号、图像、目标物体的相关信息、命令、算法、程序等一种或多种的组合。在一些实施例中,存储数据可以是黑白医学影像、彩色医学影像、组织颜色表、组合颜色表或图像处理所应用的程序和/或算法等。
输出模块340可以将生成的彩色医学影像进行输出。例如,输出模块340可以将彩色医学影像发送至存储设备130进行存储,或者发送至交互设备140进行显示或以其他方式(如图像,声音等)呈现给客户。展示的内容可以是生成的中间结果,如感兴趣区域的模型,或生成的最终结果,如绘制彩色医学影像时的组织划分情况等。
需要注意的是,以上对于彩色医学影像生成系统300的描述,仅为描述方便,并不能把本申请限制在所举实施例范围之内。可以理解,对于本领域的技术人员来说,在了解该系统的原理后,可能在不背离这一原理的情况下,对各个模块进行任意组合,或者构成子系统与其他模块连接,对实施上述方法和系统的应用领域形式和细节上的各种修正和改变。
图4是根据本申请的一些实施例所示的一个组合颜色表选择单元322示意图。组合颜色表选择单元322可以包含一个组织种类确定子单元410、一个检索子单元420和一个组合颜色表确定子单元430。所示模块之间可以彼此直接(和/或间接)连接。
组织种类确定子单元410可以确定接收到的医学影像中所包含组织的种类特征。所述种类特征可以包括组织的种类、不同种类组织的数量和各个组织的形态等。在一些实施例中,组织的种类数可以被组织种类确定子单元410确定。例如,某一幅医学影像中包含骨头、皮肤和肌肉,则三种不同组织可以被确定。在一些实施例中,某种组织的数量可以被确定。例如,某一幅医学影像中包含5段脊椎骨,则该5段脊椎骨可以被确定。在一些实施例中,某一组织的形态也可以被确定。例如,某心脏病患者的心脏形态由于某种疾病异于常人,则该形态特征可以被确定。所述一系列种类特征可以作为接收到的医学影像的图像特征且为后续彩色影像绘制过程提供参考。
检索子单元420可以根据组织种类确定子单元410确定的种类特征在一个组合颜色表数据库中进行检索。所述组合颜色表库可以是存储在存储模块330中,也可以存储在某一云端服务器中并通过网络180进行访问。在一些实施例中,所述检索可以是分层级进行检索。例如,可以先通过组织种类及数量进行一层的检索。如一幅医学影像中包含三种组织且分别为骨头、皮肤和肌肉,则第一层的检索可以筛选出组织种类信息由骨头、皮肤和肌肉组成的组合颜色表。可以通过某些组织的具体数量进行进一步检索。例如,针对包含5段脊椎骨的医学影像,在第一层检索后的结果中可以进一步将包含有5段脊椎骨的组合颜色表筛选出来。可以通过某种组织的形态进一步检索。例如,针对包含两片肺叶的医学影像,在第一层或进一步检索的结果中将包含有两片肺叶信息的组合颜色表筛选出来。经过至少一层的检索,可能有若干相关的组合颜色表作为检索结果被筛选出来。
组合颜色表确定子单元430可以确定检索子单元420中筛选出来的一系列检索结果。在一些实施例中,所述一些列检索结果可以被显示在交互设备140上。显示方式可以是列表、简图,应用某组合颜色表绘制彩色医学影像的效果图等形式。用户可以通过交互设备140执行选择操作,由组合颜色表确定子单元430确认其中一个组合颜色表。
需要注意的是,以上对于组合颜色表选择单元322的描述,仅为描述方便,并不能把本申请限制在所举实施例范围之内。可以理解,对于本领域的技术人员来说,在了解该系统的原理后,可能在不背离这一原理的情况下,对各个模块进行任意组合,或者构成子系统与其他模块连接,对实施上述方法和系统的应用领域形式和细节上的各种修正和改变。例如在一些实施例中,组合颜色表确定子单元430可以被省略或集成到检索子单元420中,由检索子单元420直接输出只包含一个组合颜色表的检索结果。
图5是根据本申请的一些实施例所示的一个获取彩色医学影像的示例性流程图。所述获取彩色医学影像的流程可以包括获取医学影像数据510,划分影像中的组织520,根据划分的组织从一个组合颜色表库中选择一个组合颜色表530,根据选择的组合颜色表绘制彩色的医学影像540,展示彩色医学影像550。
在510中,可以获取医学影像。在一些实施例中,获取医学影像操作可以被接收模块310执行。所述医学影像可以包含XR图像,CT图像,MR图像,超声图像等灰度图像,或者上述图像的任意组合。在一些实施例中,医学影像可以从存储模块330获取。在一些实施例中,图像数据可以通过网络180从外部数据源获取。在一些实施例中,图像数据可以从输入/输出设备260获取。相关医学影像获取方法可以参考接收模块310中的描述。
在520中,可以划分医学影像中包含的不同组织或组织的不同部分。在一些实施例中,所述组织划分可以由组织划分单元321来执行。在一些实施例中,组织划分操作可以基于一个或多个算法,例如,基于边缘的图像分割方法,如Perwitt算子法、Sobel算子法、梯度算子法、Kirch算子法等,基于区域的图像分割方法,如区域生长法、阈值法、聚类法等以及其他分割方法,如基于模糊集、神经网络的方法等,或者上述算法的任意组合。在一些实施例中,组织划分可以基于Mesh结构、Mask方式,或者上述方式的任意组合。相关的组织划分方法可
以参考组织划分单元321中的描述。
在530中,可以根据划分的组织,从一个组合颜色表库中选择一个组合颜色表。在一些实施例中,所述选择操作可以由组合颜色表选择单元322来执行。所述根据划分的组织可以是根据组织的种类或者某种组织的不同部分来进行选择。所述选择可以是先根据划分的组织在所述组合颜色表库中进行检索,并在检索结果中选择其中的一个。相关的组合颜色表选择方法可以参考组合颜色表选择单元322的表述。
在540中,可以根据在530中选择的组合颜色表绘制彩色的医学影像。在一些实施例中,彩色医学影像的绘制可以由图像生成单元323来执行。相关的绘制方法可以参考图像生成单元323中的描述。
在550中,可以将540中绘制的彩色医学影像进行展示。在一些实施例中,所述展示可以是经由输出模块340将彩色医学影像传输至交互设备140进行展示。所述展示可以是以图像、声音或影片的形式呈现给用户。
需要注意的是,以上对彩色医学影像生成流程的描述仅仅是具体的示例,不应被视为是唯一可行的实施方案。显然,对于本领域的专业人员来说,在了解图像形成过程的基本原理后,可能在不背离这一原理的情况下,对图像形成过程的具体实施方式与步骤进行形式和细节上的各种修正和改变,还可以做出若干简单推演或替换,在不付出创造性劳动的前提下,对个别步骤的顺序作出一定调整或组合,但是这些修正和改变仍在以上描述的范围之内。在一些实施例中,540和550可以组合成一个步骤。在一些实施例中,在540执行之后,图像数据处理流程可以返回至530,进行图像数据的进一步处理。例如,重新选择一个组合颜色表进行绘制。在一些实施例中,一个或多个操作可以添加至流程中,或从流程中删除。例如,在510之前,可以添加一个被测物的扫描操作,被测物可以由成像设备进行扫描。再如,在510、520、530、540和/或550之间或之后,可以添加一个数据存储操作。数据可以保存在存储模块330。
图6是根据本申请的一些实施例所示的一个从一个组合颜色表库中选择一个组合颜色表的示例性流程图。从一个组合颜色表库中选择一个组合颜色表的流程可以包括确定组织的种类610,根据确定的组织的种类,检索相关的组合颜色表620,呈现相关的组合颜色表给用户630,获取一个选择组合颜色方案的指
令640以及确定一个组合颜色表650。
在610中,可以确定接收到的医学影像中的组织的种类。在一些实施例中,确定组织种类的操作可以被组织种类确定子单元410来执行。所述确定组织的种类包括确定组织的种类特征。所述种类特征可以包括组织的种类、不同种类组织的数量和各个组织的形态等。相关的确定组织种类的方法可以参考组织种类确定子单元410中的描述。
在620中,可以根据确定的组织的种类,检索相关的组合颜色表。在一些实施例中,检索相关组合颜色表的操作可以被检索子单元420执行。所述检索是根据610中确定的组织种类特征进行检索的。该检索步骤可以分为多个层级,逐步缩小检索结果的范围。相关的检索方法可以参考检索子单元420中的描述。
在630中,可以呈现相关的组合颜色表给用户。在一些实施例中,该呈现组合颜色表的操作可以由输出模块340来执行。所述相关的组合颜色表可以是620中的检索结果。在一些实施例中,所述检索结果可以被显示在交互设备140上。显示方式可以是列表、简图,应用某组合颜色表绘制彩色医学影像的效果图等形式。例如,在显示器上可以显示一组应用不同组合颜色表的效果图。用户可以直观的看到不同组合颜色表的绘制效果。
在640中,可以获取一个选择组合颜色方案的指令。在一些实施例中,该获取指令的操作可以由接收模块310来执行。所述选择组合颜色表的指令可以是由用户通过交互设备140来发出的。在一些实施例中,所述选择组合颜色表的指令可以是在630呈现的检索结果中选中其中的一个。例如,在显示器上显示了一组应用不同组合颜色表的效果图。可以用鼠标点选其中的一个效果图,即意味着发出了选择该组合颜色表的指令。
在650中,可以确定一个组合颜色表。在一些实施例中,所述确定一个组合颜色表可以由组合颜色表确定子单元430来执行。所述确定一个组合颜色表可以是根据640中获取的用户指令来确定。所述确定的组合颜色表可以作为绘制彩色医学影像的依据。
图7是根据本申请的一些实施例所示的构建组合颜色表库的示例性流程图。所述构建组合颜色表库的流程可以包括确定组织数710,确定第一个组织的颜色表并存储,判断是否所有组织的颜色表都已确定730,根据已确定的颜色表
确定一个新的颜色表740,将该新的颜色表存储至组合颜色表中750,根据已确定的颜色表确定一个组合颜色表760,将该组合颜色表存储至组合颜色表库中770。
在710中,可以确定组织数。在构建组合颜色表库的过程中,可以先通过完成单个的组合颜色表,再将若干组合颜色表构成一个组合颜色表库。在构建单个组合颜色表时,可以先确定该组合颜色表中的组织数。在一些实施例中,所述组织数可以是组织的种类数,也可以是同种组织不同部分数(例如脊椎影像中脊椎骨的根数等)。在一些实施例中,所述组织数可以用来辅助构建一个组合颜色表数据单元。所述数据单元可以根据所述组织数分为若干部分。所述数据单元的若干部分可以分别对应某一种组织或某种组织的不同部分。例如,某组合颜色表的数据单元可以分为三个部分,第一部分为骨头的相关信息,第二部分为左肺叶的相关信息,第三部分为右肺叶的相关信息。
在720,可以确定第一个组织的颜色表并存储。在一些实施例中,所述存储可以是存储到710中所述的数据单元中。所述第一个组织可以是710中确定的组织中的任意一个,也可以是特定的某一种。例如,在构建包含骨头、皮肤和肌肉的组合颜色表时,可以任意选择一种组织作为第一个需要确定颜色表的组织。在一些实施例中,有些组织颜色表的确定较其他组织颜色表的确定优先级较高,则先将该组织认为是第一个组织。例如,在同时显示骨头、皮肤和肌肉时,可能对皮肤的透明度有更高的要求,则可以将皮肤作为第一个组织先行确定其颜色表。所述颜色表可以包括组织的编号、名称、标签、颜色、透明度、是否隐藏等信息。
在730中,可以判断是否所有组织的颜色表都已确定。如果还有组织的颜色表没有确定,则执行步骤740。如果所有组织的颜色表都已经确定,则执行步骤760。在一些实施例中,所述判断可以是根据710中所述的颜色表数据单元中的数据量与组织数是否相同来判断。例如,当第一次执行730时,所述颜色表数据单元中只有第一个组织的颜色表数据,则当前组合颜色表数据单元中还有数据没有填满,即还有组织的颜色表未确定,继续执行740。又例如,所述颜色表数据单元中所有的组织的数量与710中的数量相符,则执行760。
在740中,可以根据已确定的颜色表确定一个新的颜色表。在一些实施例中,已经确定的组织的颜色表中的信息可能会对未确定的组织的颜色表中信息
产生制约。例如,如果已确定的骨头的颜色表中颜色为绿色,则之后确定的组织的颜色表中的颜色就不可以为绿色。
在750中,可以将该新的颜色表存储至组合颜色表中。所述存储可以是将新确定的颜色表信息写入710中所述的组合颜色表数据单元中。所述写入可以是写入到已经根据组织种类划分好的数据单元部分中。
在760中,可以根据已确定的颜色表确定一个组合颜色表。在730的判断中,所有组织的颜色表都已经被确定,即所述组合颜色表数据单元中所有信息都已经被填满。则可以将所述组合颜色表数据单元中的信息确定为一个组合颜色表。
在770中,可以将该组合颜色表存储至组合颜色表库中。所述存储可以是根据710中确定的组织种类进行编号并存储。所述编号可以是对该组合颜色表设定若干标签,使得在检索时可以根据检索词与标签的关系检索出该组合颜色表。例如,包含骨头、皮肤和肌肉的组合颜色表的标签可以被设定为:包含三种组织、包含骨头、包含皮肤、包含肌肉等。
图8是根据本申请的一些实施例所示的修改组合颜色表的示例性流程图。所述修改组合颜色表的流程可以包括获取一个组合颜色表810,确定组合颜色表中需要修改的数据820,通过修改该数据确定一个新的组合颜色表830以及将该新的组合颜色表存储至组合颜色表库中。
在810中,可以获取一个组合颜色表。所述获取的组合颜色表可以是从组合颜色表库中调取的,也可以是用户手动编写的。在一些实施例中,用户在应用某个组合颜色表辅助绘制彩色医学影像时,可能效果不佳,需要对其中的某些数据进行调整。用户可以根据显示效果手动编写一个组合颜色表。例如,按照前述组合颜色表数据单元的数据结构将当前显示效果对应的数据填入其中。
在820中,可以确定组合颜色表中需要修改的数据。在一些实施例中,根据810中获取的组合颜色表绘制的彩色医学影像可能效果不佳,不能满足用户的需求。则用户可能会希望对该组合颜色表中的某些数据进行修改。例如,骨头的颜色和皮肤的颜色较为接近,导致在分辨这两种组织的时候可能会存在困难。则可以将骨头或皮肤的颜色确定为需要修改的数据。又例如,皮肤的透明度不够高,导致无法观察到内部的组织情况。则可以将皮肤的透明度确定为需要修改的
数据。
在830中,可以通过修改该数据确定一个新的组合颜色表。在一些实施例中,用户可以通过修改组合颜色表数据单元中的对应数据来修改820中确定的需要修改的数据。在一些实施例中,可以用一个更加友好的用户界面来显示组合颜色表中的数据。例如,可以将组合颜色表中的数据对应为显示器窗口上的各个按钮,通过点选对应按钮来进行数据的修改。
在840中,可以将新的组合颜色表存储至组合颜色表库中。所述存储方法可以参考770中的描述。如果810中获取的组合颜色表也是来自同一组合颜色表库,则可以用新的组合颜色表数据覆盖原来的组合颜色表数据。
图9是根据本申请的一些实施例所示的一个选择组合颜色表的示例。图示的八张图可以是根据获取的医学影像信息在应用了组合颜色表库中的检索结果后对应的效果图。用户可以直观地从屏幕上看到根据不同的组合颜色表绘制出的彩色医学影像的初步效果并可以通过诸如鼠标单击等操作来选中其中的一个或多个组合颜色表。
图10A-D是根据本申请的一些实施例所示的应用不同组合颜色表的示例。对于同一幅医学影像,应用不同的组合颜色表进行彩色医学影像绘制可以得到不同的显示效果。图10A的显示效果为只显示头部血管,图10B的显示效果为血管和头骨,图10C的显示效果为骨头,图10D的显示效果为血管、头骨和皮肤。
图11是根据本申请的一些实施例所示的修改组合颜色表的用户界面示例。所示用户界面的标题为组织列表。在虚线框A中列出了该组合颜色表中的组织种类为血管和骨。虚线框B为颜色调整项,用户可以通过该按钮调整组织颜色。虚线框C为是否隐藏项,用户可以通过该按钮来选择是否隐藏相应组织。虚线框D为其他数据调整项,用户可以调整诸如透明度,亮度,对比度等数据。
上文已对基本概念做了描述,显然,对于本领域技术人员来说,上述发明披露仅仅作为示例,而并不构成对本申请的限定。虽然此处并没有明确说明,本领域技术人员可能会对本申请进行各种修改、改进和修正。该类修改、改进和修正在本申请中被建议,所以该类修改、改进、修正仍属于本申请示范实施例的精神和范围。
同时,本申请使用了特定词语来描述本申请的实施例。如“一个实施例”、“一实施例”、和/或“一些实施例”意指与本申请至少一个实施例相关的某一特征、结构或特点。因此,应强调并注意的是,本说明书中在不同位置两次或多次提及的“一实施例”或“一个实施例”或“一替代性实施例”并不一定是指同一实施例。此外,本申请的一个或多个实施例中的某些特征、结构或特点可以进行适当的组合。
此外,本领域技术人员可以理解,本申请的各方面可以通过若干具有可专利性的种类或情况进行说明和描述,包括任何新的和有用的工序、机器、产品或物质的组合,或对他们的任何新的和有用的改进。相应地,本申请的各个方面可以完全由硬件执行、可以完全由软件(包括固件、常驻软件、微码等)执行、也可以由硬件和软件组合执行。以上硬件或软件均可被称为“数据块”、“模块”、“引擎”、“单元”、“组件”或“系统”。此外,本申请的各方面可能表现为位于一个或多个计算机可读介质中的计算机产品,该产品包括计算机可读程序编码。
计算机可读信号介质可能包含一个内含有计算机程序编码的传播数据信号,例如在基带上或作为载波的一部分。该传播信号可能有多种表现形式,包括电磁形式、光形式等等、或合适的组合形式。计算机可读信号介质可以是除计算机可读存储介质之外的任何计算机可读介质,该介质可以通过连接至一个指令执行系统、装置或设备以实现通讯、传播或传输供使用的程序。位于计算机可读信号介质上的程序编码可以通过任何合适的介质进行传播,包括无线电、电缆、光纤电缆、RF、或类似介质、或任何上述介质的组合。
本申请各部分操作所需的计算机程序编码可以用任意一种或多种程序语言编写,包括面向对象编程语言如Java、Scala、Smalltalk、Eiffel、JADE、Emerald、C++、C#、VB.NET、Python等,常规程序化编程语言如C语言、Visual Basic、Fortran 2003、Perl、COBOL 2002、PHP、ABAP,动态编程语言如Python、Ruby和Groovy,或其他编程语言等。该程序编码可以完全在用户计算机上运行、或作为独立的软件包在用户计算机上运行、或部分在用户计算机上运行部分在远程计算机运行、或完全在远程计算机或服务器上运行。在后种情况下,远程计算机可以通过任何网络形式与用户计算机连接,比如局域网(LAN)或广域网(WAN),或连接至外部计算机(例如通过因特网),或在云计算环境中,或作为服务使用如软件即服务(SaaS)。
此外,除非权利要求中明确说明,本申请所述处理元素和序列的顺序、数字字母的使用、或其他名称的使用,并非用于限定本申请流程和方法的顺序。尽管上述披露中通过各种示例讨论了一些目前认为有用的发明实施例,但应当理解的是,该类细节仅起到说明的目的,附加的权利要求并不仅限于披露的实施例,相反,权利要求旨在覆盖所有符合本申请实施例实质和范围的修正和等价组合。例如,虽然以上所描述的系统组件可以通过硬件设备实现,但是也可以只通过软件的解决方案得以实现,如在现有的服务器或移动设备上安装所描述的系统。
同理,应当注意的是,为了简化本申请披露的表述,从而帮助对一个或多个发明实施例的理解,前文对本申请实施例的描述中,有时会将多种特征归并至一个实施例、附图或对其的描述中。但是,这种披露方法并不意味着本申请对象所需要的特征比权利要求中提及的特征多。实际上,实施例的特征要少于上述披露的单个实施例的全部特征。
一些实施例中使用了描述成分、属性数量的数字,应当理解的是,此类用于实施例描述的数字,在一些示例中使用了修饰词“大约”、“近似”或“大体上”等来修饰。除非另外说明,“大约”、“近似”或“大体上”表明所述数字允许有±20%的变化。相应地,在一些实施例中,说明书和权利要求中使用的数值数据均为近似值,该近似值根据个别实施例所需特点可以发生改变。在一些实施例中,数值数据应考虑规定的有效数位并采用一般位数保留的方法。尽管本申请一些实施例中用于确认其范围广度的数值域和数据为近似值,在具体实施例中,此类数值的设定在可行范围内尽可能精确。
最后,应当理解的是,本申请中所述实施例仅用以说明本申请实施例的原则。其他的变形也可能属于本申请的范围。因此,作为示例而非限制,本申请实施例的替代配置可视为与本申请的教导一致。相应地,本申请的实施例不仅限于本申请明确介绍和描述的实施例。
Claims (20)
- A method for generating a color medical image, comprising: acquiring medical image data; partitioning tissues in the image; selecting a combined color table from a combined color table library according to the partitioned tissues; and drawing, according to the selected combined color table, a color medical image containing the partitioned tissues, wherein the combined color table is data containing a color scheme for the partitioned tissues.
- The method of claim 1, further comprising presenting the drawn color medical image containing the partitioned tissues to a user.
- The method of claim 1, wherein selecting a combined color table from a combined color table library according to the partitioned tissues comprises: determining the types of the tissues; retrieving combined color tables related to the determined tissue types; presenting the related combined color tables to a user; obtaining an instruction to select a combined color table; and determining a combined color table.
- The method of claim 1, further comprising a method for generating the combined color tables in the combined color table library, the method for generating a combined color table comprising: determining the number of tissues; determining a color table of a first tissue; determining a new color table based on the color tables already determined, until the color tables of all tissues have been determined; and combining all the color tables into a combined color table.
- The method of claim 1, further comprising a method for modifying a combined color table in the combined color table library, the method for modifying a combined color table comprising: obtaining a combined color table; determining data to be modified in the combined color table; determining a new combined color table by modifying the data to be modified; and storing the new combined color table in the combined color table library.
- The method of claim 1, wherein the medical image data is at least one of: a two-dimensional picture, a three-dimensional spatial model, or a dynamic image.
- The method of claim 1, wherein partitioning tissues in the image comprises partitioning different tissues in the medical image and different parts of the same tissue.
- The method of claim 1, wherein the tissue color scheme includes one or more of a number, a name, a label, a color, a transparency, and whether the tissue is hidden.
- The method of claim 1, wherein the combined color table comprises: a plurality of tissues, each tissue representing a subset of the medical image data; and a color table corresponding to each tissue, the color table including a window width and window level of the tissue and the color effects corresponding to the gray values determined according to the window width and window level.
- The method of claim 3, wherein determining the types of the tissues is determining type characteristics of the tissues, the type characteristics including the types of the tissues, the number of different tissues, and the morphology of each tissue.
- The method of claim 3, wherein the instruction to select a combined color table is issued by a user via an interactive device on which the search results of the retrieval are displayed, the display mode including a list, a simplified diagram, and an effect diagram of a color medical image drawn by applying a combined color table.
- The method of claim 9, wherein retrieving the combined color tables related to the determined tissue types is performed hierarchically according to the type characteristics of the tissues.
- The method of claim 11, wherein the combined color table further comprises a display or hide option for each of the tissues.
- The method of claim 11, wherein the combined color table further comprises a transparency option for each of the tissues.
- A non-transitory computer readable medium comprising executable instructions that, when executed by at least one processor, cause the at least one processor to implement a method comprising: acquiring medical image data; partitioning tissues in the image; selecting a combined color table from a combined color table library according to the partitioned tissues; and drawing, according to the selected combined color table, a color medical image containing the partitioned tissues, wherein the combined color table is data containing a color scheme for the partitioned tissues.
- The computer readable medium of claim 15, wherein selecting a combined color table from a combined color table library according to the partitioned tissues comprises: determining the types of the tissues; retrieving combined color tables related to the determined tissue types; presenting the related combined color tables to a user; obtaining an instruction to select a combined color table; and determining a combined color table.
- A system for generating a color medical image, comprising: at least one processor and executable instructions that, when executed by the at least one processor, cause the at least one processor to implement a method comprising: acquiring medical image data; partitioning tissues in the image; selecting a combined color table from a combined color table library according to the partitioned tissues; and drawing, according to the selected combined color table, a color medical image containing the partitioned tissues, wherein the combined color table is data containing a color scheme for the partitioned tissues.
- The system of claim 17, wherein selecting a combined color table from a combined color table library according to the partitioned tissues comprises: determining the types of the tissues; retrieving combined color tables related to the determined tissue types; presenting the related combined color tables to a user; obtaining an instruction to select a combined color table; and determining a combined color table.
- A system comprising: at least one processor and a memory storing instructions that, when executed by the at least one processor, cause the system to perform operations comprising: acquiring medical image data; partitioning tissues in the image; selecting a combined color table from a combined color table library according to the partitioned tissues; and drawing, according to the selected combined color table, a color medical image containing the partitioned tissues, wherein the combined color table is data containing a color scheme for the partitioned tissues.
- The system of claim 19, wherein selecting a combined color table from a combined color table library according to the partitioned tissues comprises: determining the types of the tissues; retrieving combined color tables related to the determined tissue types; presenting the related combined color tables to a user; obtaining an instruction to select a combined color table; and determining a combined color table.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/CN2017/075892 WO2018161257A1 (zh) | 2017-03-07 | 2017-03-07 | 生成彩色医学影像的方法及系统 |
EP17900232.4A EP3588438A4 (en) | 2017-03-07 | 2017-03-07 | METHOD AND SYSTEM FOR PRODUCING COLORED MEDICAL IMAGES |
US15/691,815 US10580181B2 (en) | 2017-03-07 | 2017-08-31 | Method and system for generating color medical image based on combined color table |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/CN2017/075892 WO2018161257A1 (zh) | 2017-03-07 | 2017-03-07 | 生成彩色医学影像的方法及系统 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/691,815 Continuation US10580181B2 (en) | 2017-03-07 | 2017-08-31 | Method and system for generating color medical image based on combined color table |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2018161257A1 true WO2018161257A1 (zh) | 2018-09-13 |
Family
ID=63444660
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2017/075892 WO2018161257A1 (zh) | 2017-03-07 | 2017-03-07 | 生成彩色医学影像的方法及系统 |
Country Status (3)
Country | Link |
---|---|
US (1) | US10580181B2 (zh) |
EP (1) | EP3588438A4 (zh) |
WO (1) | WO2018161257A1 (zh) |
Families Citing this family (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3108456B1 (en) * | 2014-02-19 | 2020-06-24 | Koninklijke Philips N.V. | Motion adaptive visualization in medical 4d imaging |
WO2018161257A1 (zh) * | 2017-03-07 | 2018-09-13 | 上海联影医疗科技有限公司 | 生成彩色医学影像的方法及系统 |
US11080857B2 (en) * | 2018-04-26 | 2021-08-03 | NeuralSeg Ltd. | Systems and methods for segmenting an image |
KR102313749B1 (ko) * | 2019-12-23 | 2021-10-18 | 주식회사 메가젠임플란트 | 인공지능 기반 자동 구강 ct 색상변환장치 및 그 장치의 구동방법 |
CN111145278B (zh) * | 2019-12-31 | 2024-01-09 | 上海联影医疗科技股份有限公司 | 弥散张量图像的颜色编码方法、装置、设备及存储介质 |
CN111311738B (zh) * | 2020-03-04 | 2023-08-11 | 杭州市第三人民医院 | 一种采用影像学的输尿管3d数模建立方法及其数据采集装置 |
CN113870167A (zh) * | 2020-06-11 | 2021-12-31 | 通用电气精准医疗有限责任公司 | 用于对肺部图像进行分割的方法、系统以及存储介质 |
CN113041515A (zh) * | 2021-03-25 | 2021-06-29 | 中国科学院近代物理研究所 | 三维图像引导运动器官定位方法、系统及存储介质 |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6549645B1 (en) * | 1997-06-13 | 2003-04-15 | Hitachi, Ltd. | Image processing method and apparatus adapted for radiotherapy treatment planning using digitally reconstructed radiograph |
CN101336831A (zh) * | 2008-08-13 | 2009-01-07 | 汕头超声仪器研究所 | 实时三维医学超声图像的重建方法 |
US20090174729A1 (en) * | 2008-01-09 | 2009-07-09 | Ziosoft, Inc. | Image display device and control method thereof |
CN102573625A (zh) * | 2010-10-13 | 2012-07-11 | 株式会社东芝 | 磁共振成像装置、磁共振成像方法及图像显示装置 |
CN104167010A (zh) * | 2014-06-03 | 2014-11-26 | 上海联影医疗科技有限公司 | 一种迭代渲染的方法 |
CN105828726A (zh) * | 2014-06-11 | 2016-08-03 | 奥林巴斯株式会社 | 医用诊断装置、医用诊断装置的工作方法以及医用诊断装置的工作程序 |
Family Cites Families (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2006030731A1 (ja) * | 2004-09-13 | 2006-03-23 | Hitachi Medical Corporation | 超音波撮像装置及び投影像生成方法 |
US7561728B2 (en) | 2005-03-23 | 2009-07-14 | Siemens Medical Solutions Usa, Inc. | Detection of intervertebral disk orientation in spine images using curve evolution |
US8488863B2 (en) * | 2008-11-06 | 2013-07-16 | Los Alamos National Security, Llc | Combinational pixel-by-pixel and object-level classifying, segmenting, and agglomerating in performing quantitative image analysis that distinguishes between healthy non-cancerous and cancerous cell nuclei and delineates nuclear, cytoplasm, and stromal material objects from stained biological tissue materials |
CN102292747B (zh) * | 2009-01-22 | 2015-01-28 | 皇家飞利浦电子股份有限公司 | 针对pet/ct图像的像素特征混合融合 |
CN102727200B (zh) | 2011-03-31 | 2016-03-30 | 深圳迈瑞生物医疗电子股份有限公司 | 脊柱椎体和椎间盘分割方法、装置、磁共振成像系统 |
CN103300856B (zh) | 2012-03-13 | 2015-11-25 | 深圳迈瑞生物医疗电子股份有限公司 | Mri图像的颈椎椎体轴线及相关组织的定位方法与装置 |
WO2014201052A2 (en) * | 2013-06-10 | 2014-12-18 | University Of Mississippi Medical Center | Medical image processing method |
US9536045B1 (en) * | 2015-03-16 | 2017-01-03 | D.R. Systems, Inc. | Dynamic digital image compression based on digital image characteristics |
WO2018161257A1 (zh) * | 2017-03-07 | 2018-09-13 | 上海联影医疗科技有限公司 | 生成彩色医学影像的方法及系统 |
US10482604B2 (en) * | 2017-05-05 | 2019-11-19 | Shanghai United Imaging Healthcare Co., Ltd. | Systems and methods for image processing |
- 2017
- 2017-03-07 WO PCT/CN2017/075892 patent/WO2018161257A1/zh unknown
- 2017-03-07 EP EP17900232.4A patent/EP3588438A4/en active Pending
- 2017-08-31 US US15/691,815 patent/US10580181B2/en active Active
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6549645B1 (en) * | 1997-06-13 | 2003-04-15 | Hitachi, Ltd. | Image processing method and apparatus adapted for radiotherapy treatment planning using digitally reconstructed radiograph |
US20090174729A1 (en) * | 2008-01-09 | 2009-07-09 | Ziosoft, Inc. | Image display device and control method thereof |
CN101336831A (zh) * | 2008-08-13 | 2009-01-07 | 汕头超声仪器研究所 | 实时三维医学超声图像的重建方法 |
CN102573625A (zh) * | 2010-10-13 | 2012-07-11 | 株式会社东芝 | 磁共振成像装置、磁共振成像方法及图像显示装置 |
CN104167010A (zh) * | 2014-06-03 | 2014-11-26 | 上海联影医疗科技有限公司 | 一种迭代渲染的方法 |
CN105828726A (zh) * | 2014-06-11 | 2016-08-03 | 奥林巴斯株式会社 | 医用诊断装置、医用诊断装置的工作方法以及医用诊断装置的工作程序 |
Non-Patent Citations (1)
Title |
---|
See also references of EP3588438A4 * |
Also Published As
Publication number | Publication date |
---|---|
EP3588438A4 (en) | 2020-03-18 |
EP3588438A1 (en) | 2020-01-01 |
US10580181B2 (en) | 2020-03-03 |
US20180260989A1 (en) | 2018-09-13 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10818048B2 (en) | Advanced medical image processing wizard | |
WO2018161257A1 (zh) | 生成彩色医学影像的方法及系统 | |
US20240144495A1 (en) | Method and system for processing multi-modality image | |
Lawonn et al. | A survey on multimodal medical data visualization | |
JP6081126B2 (ja) | 医用画像処理装置、画像診断装置、コンピュータシステム、医用画像処理プログラム、及び医用画像処理方法 | |
CN109801254B (zh) | 医学成像中的传递函数确定 | |
CN108701370A (zh) | 基于机器学习的基于内容的医学成像渲染 | |
CN103222876B (zh) | 医用图像处理装置、图像诊断装置、计算机系统以及医用图像处理方法 | |
CN109712217B (zh) | 一种医学图像可视化方法和系统 | |
US10188361B2 (en) | System for synthetic display of multi-modality data | |
CN106934841A (zh) | 生成彩色医学影像的方法及医学系统 | |
US10275946B2 (en) | Visualization of imaging uncertainty | |
US20130135306A1 (en) | Method and device for efficiently editing a three-dimensional volume using ray casting | |
JP6564075B2 (ja) | 医用画像を表示するための伝達関数の選択 | |
US9799135B2 (en) | Semantic cinematic volume rendering | |
CN114387380A (zh) | 用于生成3d医学图像数据的基于计算机的可视化的方法 | |
US20230410413A1 (en) | Systems and methods for volume rendering | |
Preim et al. | Visualization, Visual Analytics and Virtual Reality in Medicine: State-of-the-art Techniques and Applications | |
EP4439445A1 (en) | Device, system and method for generating a medical image of a region of interest of a subject indicating contrast-enhanced regions | |
Jung | Feature-Driven Volume Visualization of Medical Imaging Data | |
Eichner | Interactive co-registration for multi-modal cancer imaging data based on segmentation masks | |
WO2024083817A1 (en) | De-identifying sensitive information in 3d a setting | |
Raspe | GPU-assisted diagnosis and visualization of medical volume data | |
Monclús Lahoya | Advanced interaction techniques for medical models | |
CN112541882A (zh) | 医学体积渲染中的隐式表面着色 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 17900232 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
ENP | Entry into the national phase |
Ref document number: 2017900232 Country of ref document: EP Effective date: 20190923 |