CN107833231B - Medical image display method, apparatus and computer storage medium - Google Patents


Info

Publication number
CN107833231B
Authority
CN
China
Prior art keywords
medical image
window width
regions
window
region
Prior art date
Legal status
Active
Application number
CN201711173703.4A
Other languages
Chinese (zh)
Other versions
CN107833231A (en)
Inventor
荣成城
Current Assignee
Shanghai United Imaging Healthcare Co Ltd
Original Assignee
Shanghai United Imaging Healthcare Co Ltd
Priority date
Filing date
Publication date
Application filed by Shanghai United Imaging Healthcare Co Ltd filed Critical Shanghai United Imaging Healthcare Co Ltd
Priority to CN201711173703.4A priority Critical patent/CN107833231B/en
Publication of CN107833231A publication Critical patent/CN107833231A/en
Application granted granted Critical
Publication of CN107833231B publication Critical patent/CN107833231B/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/10 Segmentation; Edge detection
    • G06T 7/11 Region-based segmentation
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10072 Tomographic images
    • G06T 2207/10081 Computed x-ray tomography [CT]
    • G06T 2207/10088 Magnetic resonance imaging [MRI]
    • G06T 2207/10104 Positron emission tomography [PET]

Abstract

The invention provides a medical image display method, a medical image display apparatus and a computer storage medium. The medical image display method includes the steps of: segmenting a medical image into a plurality of regions; mapping each region using its corresponding window width and window level values; and jointly displaying at least two different regions of the plurality of mapped regions.

Description

Medical image display method, apparatus and computer storage medium
Technical Field
The present invention relates to the field of medical images, and in particular, to a medical image display method, apparatus, and computer storage medium.
Background
Medical images, in particular Computed Tomography (CT), Positron Emission Tomography (PET) and Magnetic Resonance (MR) images, are very important because they can noninvasively provide images of a patient's anatomical structure, thereby providing effective technical support for the diagnosis of related diseases.
Information acquired by medical imaging equipment such as CT, PET and MR scanners is converted, after image reconstruction, image preprocessing and similar steps, into gray-level images that doctors can inspect visually. The intensity values of the original signal collected by the imaging device span a wide range (e.g., 0-65535), whereas hardware limitations make the range of gray values a display can render much smaller: a typical home computer display offers only 256 gray levels, and a dedicated medical display can be extended to 1024 or 2048 levels, which is still smaller than the range of original signal intensity values.
Therefore, when converting the original signal into display data for the display, a numerical mapping is required, i.e., the original intensity values with more levels (e.g., 65535 levels) are mapped into the range of gray values supported by the display (e.g., 256 levels). This mapping is medically referred to as window width and window level mapping. An exemplary calculation formula is as follows:
G(V) = 0,                      if V < L - W/2
G(V) = Gm * (V - L + W/2) / W, if L - W/2 <= V <= L + W/2
G(V) = Gm,                     if V > L + W/2
where V is the original signal intensity value of a pixel, Vmax and Vmin respectively represent the maximum and minimum values of V in the whole image, G(V) is the gray value shown on the display, Gm is the maximum displayable gray value of the display (e.g., 255), W is the current window width, and L is the current window level.
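As a minimal illustration (not part of the patent; the function name, the NumPy dependency and the example window values below are assumptions made here for clarity), the piecewise mapping above can be sketched in Python as follows:

```python
import numpy as np

def window_level_map(v, width, level, g_max=255):
    """Piecewise-linear window width / window level mapping.

    Values at or below L - W/2 map to 0, values at or above L + W/2 map
    to g_max, and values inside the window are mapped linearly.
    """
    v = np.asarray(v, dtype=np.float64)
    lower = level - width / 2.0
    g = (v - lower) / width * g_max
    return np.clip(g, 0, g_max).astype(np.uint8)

# Example: window a simulated CT slice for soft tissue (W=400, L=40).
ct_slice = np.random.randint(-1000, 1500, size=(512, 512))
display = window_level_map(ct_slice, width=400, level=40)
```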
Different human tissues have different densities, so their original signal intensity values in a medical image also differ greatly. For example, the original signal intensity of bone is about 1000; that of the lung, which contains mostly air, is about -1000; and that of internal organs such as the liver falls roughly between 100 and 300. The window width W and window level L in the above formula are variable, so by selecting different window width and window level combinations, a specific range of pixel values can be displayed while pixel values in other ranges are masked, allowing local details to be observed more clearly. However, using different window width and window level combinations means switching back and forth between them, which is inconvenient for image reading.
Disclosure of Invention
The invention aims to provide a medical image display method, a medical image display apparatus and a computer storage medium, which can reduce how often a doctor switches the window width and window level while reading images.
In order to solve the above technical problem, the invention provides a medical image display method comprising the following steps: segmenting a medical image into a plurality of regions; mapping each region using its corresponding window width and window level values; and jointly displaying at least two different regions of the plurality of mapped regions.
In an embodiment of the invention, each of the plurality of regions corresponds to a range of CT values of the medical image.
In an embodiment of the invention, each of the plurality of regions corresponds to an organ or tissue of a display object of the medical image.
In an embodiment of the invention, the step of segmenting the medical image into a plurality of regions comprises: automatically segmenting the medical image into a plurality of regions while the imported one or more medical images are being received; or, after the imported one or more medical images are received, segmenting a selected region from the medical image in response to a region selection operation.
In an embodiment of the present invention, mapping the pixels of each region using the corresponding window width and window level values includes: creating a corresponding marker for each pixel of the medical image according to the segmented regions; and globally traversing all pixels of the medical image, mapping each pixel according to the preset window width and window level values corresponding to its marker, to obtain a display gray value.
In an embodiment of the present invention, mapping the pixels of each region using the corresponding window width and window level values includes: traversing each region within its boundary, and mapping each pixel in the region according to the preset corresponding window width and window level values to obtain a display gray value.
In an embodiment of the present invention, the method further includes: automatically or in response to an operation, displaying the window width and window level values of each currently displayed region; or, in response to a selection operation, displaying the window width and window level values of a selected region.
In an embodiment of the present invention, the method further includes: receiving a modification to the window level and/or window width of a selected region; and correspondingly changing the window level and/or window width of the medical image according to the modification.
In an embodiment of the present invention, the operation manner includes: one or more of mouse operation, keyboard operation, touch operation and voice operation.
In an embodiment of the invention, the medical image is a computed tomography image, a positron emission tomography image or a magnetic resonance image.
The present invention also proposes a medical image display apparatus comprising: a memory for storing instructions executable by the processor; a processor for executing the instructions to implement the method as described above.
The present invention also proposes a computer-readable storage medium having stored thereon computer instructions, wherein the computer instructions, when executed by a processor, perform the method as described above.
Compared with the prior art, the method and apparatus set different window width and window level combinations for different regions and display all the regions jointly. Therefore, details of two regions with widely different signal values in the same medical image can be viewed at the same time, reducing how often an observer (such as a technician or a doctor) switches back and forth between window settings.
Drawings
FIG. 1 is a schematic diagram of a computer device according to an embodiment of the present invention;
FIG. 2 is a flow chart of a medical image display method according to an embodiment of the present invention;
FIG. 3 is a schematic view of a segmentation selection interface of a medical image display method according to an embodiment of the present invention;
FIG. 4 is a medical image segmentation schematic diagram of a medical image display method according to an embodiment of the present invention;
FIG. 5 is a combined display diagram of a medical image display method according to an embodiment of the present invention;
FIG. 6 is a flow chart of a medical image display method according to another embodiment of the present invention;
FIGS. 7A and 7B are schematic views of a window width and window level display interface of a medical image display method according to an embodiment of the present invention;
FIG. 8 is a flow chart of modifying the window width and window level in a medical image display method according to another embodiment of the present invention;
FIG. 9 is an interface diagram of modifying the window width and window level in a medical image display method according to an embodiment of the present invention.
Detailed Description
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings used in the description of the embodiments are briefly introduced below. Obviously, the drawings in the following description are only examples or embodiments of the application, based on which a person skilled in the art can apply the application to other similar scenarios without inventive effort. Unless otherwise apparent from the context or otherwise indicated, like reference numbers in the figures refer to the same structure or operation.
As used in this application and the appended claims, the terms "a," "an," and/or "the" do not refer specifically to the singular and may also include the plural, unless the context clearly dictates otherwise. In general, the terms "comprises" and "comprising" merely indicate that the explicitly identified steps and elements are included; these steps and elements do not form an exclusive list, and a method or apparatus may also include other steps or elements.
Although various references are made herein to certain modules in a system according to embodiments of the present application, any number of different modules may be used and run on a display system and/or processor. The modules are merely illustrative and different aspects of the systems and methods may use different modules.
Flow charts are used herein to illustrate operations performed by systems according to embodiments of the present application. It should be understood that the preceding or following operations are not necessarily performed in the exact order shown. Rather, various steps may be processed in reverse order or simultaneously. Meanwhile, other operations may be added to or removed from these processes.
FIG. 1 is a schematic block diagram of a computer device according to some embodiments of the invention. Computer 100 may be used to implement particular methods and apparatus disclosed in some embodiments of the invention. The specific apparatus in this embodiment is illustrated by a functional block diagram of a hardware platform that includes a display module. In some embodiments, computer 100 may implement implementations of some embodiments of the invention by its hardware devices, software programs, firmware, and combinations thereof. In some embodiments, the computer 100 may be a general purpose computer, or a special purpose computer.
As shown in FIG. 1, computer 100 may include an internal communication bus 101, a processor 102, a read-only memory (ROM) 103, a random access memory (RAM) 104, a communication port 105, an input/output component 106, a hard disk 107, and a user interface 108. The internal communication bus 101 may enable data communication among the components of the computer 100. The processor 102 may make determinations and issue prompts. In some embodiments, the processor 102 may be comprised of one or more processors. The communication port 105 may enable data communication between the computer 100 and other components (not shown), such as external devices, image acquisition devices, databases, external storage, image processing workstations, and the like. In some embodiments, computer 100 may send and receive information and data from a network through communication port 105. Input/output component 106 supports the flow of input/output data between computer 100 and other components. The user interface 108 may enable interaction and information exchange between the computer 100 and a user. The computer 100 may also include various forms of program storage units and data storage units, such as the hard disk 107, the read-only memory (ROM) 103 and the random access memory (RAM) 104, capable of storing various data files used in computer processing and/or communications, as well as possible program instructions executed by the processor 102.
By way of example, the input/output components 106 may include one or more of the following components: a mouse, a trackball, a keyboard, a touch-sensitive component, a sound receiver, etc.
Fig. 2 is a flowchart of a medical image display method according to an embodiment of the present invention. Referring to fig. 2, the medical image display method of the embodiment includes the following steps:
In step 201, a medical image is segmented into a plurality of regions.
At step 202, for each region, a mapping is performed using the corresponding window width and level values.
At step 203, at least two regions of the plurality of regions that have undergone mapping are jointly displayed.
In this embodiment, the window width and window level values may be set for each region separately. For example, a first region has a first window width and a first window level, a second region has a second window width and a second window level, a third region has a third window width and a third window level, and so on. The window width/window level combination may be different for each region; however, the window widths of some regions may be the same, or the window levels of some regions may be the same. By setting different window width and window level combinations for different regions, the presentation of each region can be improved. Thus, in the joint display of step 203, details of two regions with widely different signal values in the same medical image can be seen at the same time, reducing how often an observer (e.g., a technician or doctor) switches back and forth.
In an embodiment of the invention, each of the plurality of regions may correspond to a range of CT values of the medical image. In another embodiment, each of the plurality of regions may correspond to an organ or tissue of the display object of the medical image. Taking organs as an example, the regions may include the head, chest, lung, pleura, mediastinum, abdomen, large intestine, small intestine, bladder, gall bladder, triple energizer, pelvic cavity, diaphysis, extremities, bones, blood vessels, or the like, or any combination thereof. In step 201, a model-based, threshold-based, active-contour-based or level-set-based image segmentation method, among others, may be used. Through such segmentation, for example, the outer contour of the skin, the lungs, the liver, the spleen, and the whole-body skeleton of the human body can be obtained. It should be noted that the medical image may be entirely segmented into a plurality of regions, or only a portion (e.g., a main portion) may be segmented into a plurality of regions.
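As one possible sketch of the threshold-based segmentation by CT value range mentioned above, the region names, Hounsfield-unit thresholds and NumPy implementation below are illustrative assumptions, not values prescribed by the patent:

```python
import numpy as np

# Hypothetical CT value ranges (in Hounsfield units) used to split an image
# into regions; pixels falling outside every range keep label 0 (unsegmented).
REGION_RANGES = {
    "lung":        (-1000, -400),
    "soft_tissue": (-100,   300),
    "bone":        (300,   3000),
}

def segment_by_ct_range(ct_image):
    """Return a label map: 0 = unsegmented, 1..N = regions in REGION_RANGES order."""
    labels = np.zeros(ct_image.shape, dtype=np.uint8)
    for idx, (lo, hi) in enumerate(REGION_RANGES.values(), start=1):
        labels[(ct_image >= lo) & (ct_image < hi)] = idx
    return labels
```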
In embodiments of the present invention, each region may comprise one connected region, such as a simply connected region or a multiply connected region; each region may also include a plurality of connected regions.
In an embodiment of the present invention, step 201 may be performed automatically. For example, during the reception of the imported medical image or images, the respective medical image is automatically segmented into a plurality of regions (e.g. organs). Such an operation may reduce user operations and, in the event that system processing resources are sufficient, may also reduce the wait time required for a user to view a medical image.
In another embodiment of the present invention, step 201 may be performed under a user operation. Specifically, upon receiving the imported one or more medical images, the selected region is segmented from the medical images in response to the selected region. Fig. 3 is a schematic view of a segmentation selection interface of a medical image display method according to an embodiment of the present invention. Referring to fig. 3, in a segmentation selection interface 300, an organ list 301 is provided for selection by a user. When the user has selected some of the organs, the organs are entered into the selection list 302 on the right. Upon selection of button 303, organ selection is complete and the system then performs region segmentation at step 201.
In the embodiment of the present invention, the segmentation performed in step 201 may segment a partial region of the medical image, or may segment a whole region of the medical image.
In an embodiment of the present invention, the method of step 202 may include: first, creating a corresponding marker for each pixel of the medical image according to the segmented regions; and then globally traversing all pixels of the medical image and mapping each pixel according to the preset corresponding window width and window level values to obtain a display gray value. For example, pixels in the lungs of the medical image are labeled 0, pixels in the liver are labeled 1, and so on. While globally traversing all pixels of the medical image, each marked pixel undergoes the window width/window level mapping calculation using the window width and window level that the system presets for that marker. Alternatively, a correspondence between a plurality of CT value ranges (which are used for region segmentation) and window width/window level values may be established; before the mapping calculation, the CT value range corresponding to the current region is found, and the corresponding window width and window level are then looked up and used for the mapping calculation. Or, after the regions are segmented, a correspondence between each region and a window width/window level may be established; before the mapping calculation, the window width and window level corresponding to the current region are looked up and used for the mapping calculation.
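A minimal sketch of this marker-based mapping, reusing the window_level_map helper from the earlier sketch; the marker values follow the lung/liver example above, while the specific window settings (lung 1500/-600, liver 200/50, and a global default of 1000/500 as in the FIG. 7B example) are illustrative assumptions:

```python
import numpy as np

# Hypothetical marker -> (window width, window level) table; markers not in
# the table (e.g., a value reserved for unsegmented pixels) use the default.
WINDOW_BY_MARKER = {
    0: (1500, -600),   # lung
    1: (200,    50),   # liver
}
DEFAULT_WINDOW = (1000, 500)

def map_with_markers(image, markers, g_max=255):
    """Globally traverse the image, windowing each pixel by its marker.

    The per-pixel loop is vectorized as one boolean mask per marker value.
    """
    display = np.zeros(image.shape, dtype=np.uint8)
    for marker in np.unique(markers):
        width, level = WINDOW_BY_MARKER.get(int(marker), DEFAULT_WINDOW)
        mask = markers == marker
        display[mask] = window_level_map(image[mask], width, level, g_max)
    return display
```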
In another embodiment of the present invention, the method of step 202 may include performing a traversal within the boundary of each region and mapping each pixel in that region according to the preset corresponding window width and window level values to obtain a display gray value. For example, a traversal is performed over the liver region, and each pixel in the liver region is mapped according to the preset corresponding window width and window level values. This per-region traversal may be performed immediately after each region is segmented, or after all regions to be segmented have been segmented.
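The boundary-limited traversal could look like the sketch below, which visits only each region's bounding box rather than the whole image; the region masks, per-region window settings and default window are assumed inputs, and window_level_map is the helper from the earlier sketch:

```python
import numpy as np

def map_within_boundaries(image, region_masks, windows,
                          default_window=(1000, 500), g_max=255):
    """region_masks: {name: boolean mask}; windows: {name: (width, level)}."""
    # Unsegmented background uses the global/default window.
    display = window_level_map(image, *default_window, g_max=g_max)
    for name, mask in region_masks.items():
        if not mask.any():
            continue
        ys, xs = np.nonzero(mask)                # pixels belonging to this region
        y0, y1 = ys.min(), ys.max() + 1          # bounding box of the region
        x0, x1 = xs.min(), xs.max() + 1
        width, level = windows.get(name, default_window)
        local = window_level_map(image[y0:y1, x0:x1], width, level, g_max=g_max)
        box_mask = mask[y0:y1, x0:x1]
        display[y0:y1, x0:x1][box_mask] = local[box_mask]
    return display
```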
Fig. 4 is a medical image segmentation schematic diagram of a medical image display method according to an embodiment of the present invention. Referring to fig. 4, a medical image 400 is segmented to have regions 401, 402a-402c, 403, 404, and 405a-405b.
As an example, the method shown in FIG. 2 may be performed on the computer device shown in FIG. 1. A display interface of the medical image, including a slider, is presented through the user interface 108. The input/output component 106 can detect an event, such as a user operation. Here, the operation manner includes one or more of mouse operation, keyboard operation, touch operation and voice operation. The processor 102 may perform region segmentation in response to the import operation, and map each region with its corresponding window width and window level values. At least two of the mapped regions are jointly displayed on the user interface 108.
Fig. 5 is a combined display diagram of a medical image display method according to an embodiment of the present invention. Referring to fig. 5, a medical image 500 containing a joint display of the liver, lungs, and various soft tissues is displayed, in which the window width and level of each region are set according to the characteristics of each region, thereby better representing this region.
Embodiments of the present invention also provide for the display of window width and level. Fig. 6 is a flowchart of a medical image display method according to another embodiment of the present invention. Referring to fig. 6, the medical image display method of the present embodiment includes the following steps:
In step 601, a medical image is segmented into a plurality of regions.
At step 602, for each region, a mapping is performed using the corresponding window width and level values.
At step 603, at least two regions of the mapped plurality of regions are jointly displayed.
In step 604, the window width and level values for each region currently displayed are displayed.
Step 604 may be performed in response to an operation. Fig. 7A and 7B are schematic views of a window width and window level display interface of a medical image display method according to an embodiment of the present invention. Referring to FIG. 7A, a right-click menu may be popped up on the displayed image, for example, by clicking the right mouse button; the right-click menu has a "united window width level" item, and after clicking it with the left mouse button, a dialog box as shown in FIG. 7B pops up, displaying a list of all currently applied window width and window level values, from which the user can learn all the applied values. For example, "Global 1000/500" indicates that the medical image as a whole has a window width of 1000 and a window level of 500, which is used for those regions that are not segmented; "Liver: 200/50" indicates a window width of 200 and a window level of 50 for the liver, and the rest are similar.
Alternatively, in step 604, the window width and window level values of a selected region may be displayed in response to a selection operation. For example, when a user clicks on or hovers over a certain region, the window width and window level values of that region are displayed.
In another alternative embodiment, step 604 may be performed automatically. For example, the window width and the window level of each area are displayed in a window of the display interface.
Embodiments of the present invention also provide for modifications to window width and level. Fig. 8 is a flowchart of a medical image display method according to another embodiment of the present invention.
In step 801, a medical image is segmented into a plurality of regions.
At step 802, for each region, a mapping is performed using the corresponding window width and level values.
At step 803, at least two regions of the mapped plurality of regions are jointly displayed.
At step 804, a modification to the window level and/or window width of the selected region is received.
In step 805, the window level and/or the window width of the medical image is correspondingly changed according to the modification.
It will be appreciated that the window width and window level may be adjusted independently of each other. For example, when the window level of the medical image is changed according to a modification, the window width of the medical image remains unchanged; when the window width of the medical image is changed, the window level remains unchanged. It is understood, however, that the window width and window level adjustments may also be interrelated.
Fig. 9 is an interface diagram of modifying the window width and window level in the medical image display method according to an embodiment of the present invention. Referring to FIG. 9, in the display interface 900, if the user wants to edit the window width and window level of a certain local region, the window level adjustment tool may be selected first, and the mouse then moved over the current image. When the mouse enters the boundary of a certain region (e.g., region 904) during the movement, the software system automatically recognizes that the boundary has been entered. At this point, the user can press the left mouse button to enter a mode for adjusting the window width and window level of that region. The adjustment may be performed in various ways, such as moving the mouse horizontally to change the window width, moving it vertically to change the window level, or entering a value directly.
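A small sketch of how such a drag interaction could be wired up; the sensitivity factors, the sign convention for vertical movement and the function name are assumptions made here, since the patent only states that horizontal movement adjusts the window width and vertical movement adjusts the window level:

```python
# Hypothetical sensitivities: how much one screen pixel of drag changes W or L.
WIDTH_PER_PIXEL = 4
LEVEL_PER_PIXEL = 2

def adjust_window(width, level, dx, dy):
    """Update (width, level) for a mouse drag of (dx, dy) screen pixels made
    with the left button held down inside the selected region."""
    new_width = max(1, width + dx * WIDTH_PER_PIXEL)   # window width stays positive
    new_level = level - dy * LEVEL_PER_PIXEL           # dragging up raises the level
    return new_width, new_level
```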
In an embodiment of the invention, the medical image may be a computed tomography image, a positron emission tomography image, a magnetic resonance image or another type of image. The medical image may come from medical imaging devices such as X-ray machines, magnetic resonance (MR) imaging devices, computed tomography (CT) imaging devices, positron emission tomography (PET) imaging devices, and multi-modality imaging devices formed by combining the above, such as PET-CT, PET-MR and RT-MR devices.
By implementing the method described above on a computer device, a medical image display apparatus may be realized, comprising: a memory for storing instructions executable by a processor; and a processor for executing the instructions to implement the method described in the embodiments above.
The methods described above may also be implemented as a computer-readable storage medium having computer instructions stored thereon, which, when executed by a processor, perform the methods described in the embodiments above.
Having thus described the basic concept, it will be apparent to those skilled in the art that the foregoing disclosure is by way of example only, and is not intended to limit the present application. Various modifications, improvements and adaptations to the present application may occur to those skilled in the art, although not explicitly described herein. Such modifications, improvements and adaptations are proposed in the present application and thus fall within the spirit and scope of the exemplary embodiments of the present application.
Also, this application uses specific language to describe embodiments of the application. Reference throughout this specification to "one embodiment," "an embodiment," and/or "some embodiments" means that a particular feature, structure, or characteristic described in connection with at least one embodiment of the present application is included in at least one embodiment of the present application. Therefore, it is emphasized and should be appreciated that two or more references to "an embodiment" or "one embodiment" or "an alternative embodiment" in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, some features, structures, or characteristics of one or more embodiments of the present application may be combined as appropriate.
Moreover, those skilled in the art will appreciate that aspects of the present application may be illustrated and described in terms of several patentable species or situations, including any new and useful combination of processes, machines, manufacture, or materials, or any new and useful improvement thereon. Accordingly, various aspects of the present application may be embodied entirely in hardware, entirely in software (including firmware, resident software, micro-code, etc.) or in a combination of hardware and software. The above hardware or software may be referred to as a "data block," "module," "engine," "unit," "component," or "system." Furthermore, aspects of the present application may be embodied as a computer product, including computer readable program code, embodied in one or more computer readable media.
Similarly, it should be noted that in the preceding description of embodiments of the application, various features are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of one or more of the embodiments. This method of disclosure, however, is not intended to imply that more features are required than are expressly recited in the claims. Indeed, an embodiment may have fewer than all of the features of a single embodiment disclosed above.
Numerals describing the number of components, attributes, and the like are used in some embodiments. It should be understood that such numerals used in the description of the embodiments are modified in some instances by the modifier "about," "approximately" or "substantially." Unless otherwise indicated, "about," "approximately" or "substantially" indicates that the number allows a variation of ±20%. Accordingly, in some embodiments, the numerical parameters used in the specification and claims are approximations that may vary depending upon the desired properties of the individual embodiments. In some embodiments, the numerical parameters should take into account the specified significant digits and employ a general digit-preserving approach. Notwithstanding that the numerical ranges and parameters setting forth the broad scope of the application are approximations, in the specific examples such numerical values are set forth as precisely as possible.
Although the present invention has been described with reference to the present specific embodiments, it will be appreciated by those skilled in the art that the above embodiments are merely illustrative of the present invention, and that various equivalent changes and substitutions may be made without departing from the spirit of the invention; therefore, all changes and modifications to the above embodiments within the spirit and scope of the present invention are intended to be covered by the appended claims.

Claims (12)

1. A medical image display method, comprising the steps of:
segmenting the medical image into a plurality of regions;
mapping each region by using preset corresponding window width and window level values; and
at least two different regions of the mapped plurality of regions are jointly displayed.
2. The method of claim 1, wherein each of the plurality of regions corresponds to a range of CT values for the medical image.
3. The method of claim 1, wherein each of the plurality of regions corresponds to an organ or tissue of a display object of the medical image.
4. The method of claim 1, wherein the step of segmenting the medical image into a plurality of regions comprises:
in the process of receiving the imported one or more medical images, automatically segmenting the medical images into a plurality of regions; or
Upon receiving the imported one or more medical images, segmenting the selected region from the medical images in response to a region selection operation.
5. The method of claim 1, wherein the step of mapping with corresponding window width and level values for the pixels of each region comprises:
creating a corresponding marker for each pixel of the medical image according to the segmented region;
and traversing all pixels of the medical image globally, and mapping each pixel according to preset corresponding window width and window level values to obtain a display gray value.
6. The method of claim 1, wherein the step of mapping with corresponding window width and level values for the pixels of each region comprises:
and traversing the regions in the boundaries of the regions, and mapping each pixel in each region according to preset corresponding window width and window level values to obtain a display gray value.
7. The method of claim 1, further comprising:
automatically or in response to an operation, displaying window width and window level values of each currently displayed area; or
In response to the selection operation, the window width and level values of the selected area are displayed.
8. The method of claim 1 or 7, further comprising:
receiving a modification to a window level and/or a window width of a selected area;
correspondingly changing the window level and/or the window width of the medical image according to the modification.
9. The method of claim 7, wherein the manner of operation comprises: one or more of mouse operation, keyboard operation, touch operation and voice operation.
10. The method of claim 1, wherein the medical image is a computed tomography image, a positron emission tomography image, or a magnetic resonance image.
11. A medical image display apparatus comprising:
a memory for storing instructions executable by the processor;
a processor for executing the instructions to implement the method of any one of claims 1-10.
12. A computer readable storage medium having computer instructions stored thereon, wherein the computer instructions, when executed by a processor, perform the method of any of claims 1-10.
CN201711173703.4A 2017-11-22 2017-11-22 Medical image display method, apparatus and computer storage medium Active CN107833231B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711173703.4A CN107833231B (en) 2017-11-22 2017-11-22 Medical image display method, apparatus and computer storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201711173703.4A CN107833231B (en) 2017-11-22 2017-11-22 Medical image display method, apparatus and computer storage medium

Publications (2)

Publication Number Publication Date
CN107833231A CN107833231A (en) 2018-03-23
CN107833231B true CN107833231B (en) 2020-12-04

Family

ID=61652334

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711173703.4A Active CN107833231B (en) 2017-11-22 2017-11-22 Medical image display method, apparatus and computer storage medium

Country Status (1)

Country Link
CN (1) CN107833231B (en)

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3564906A1 (en) * 2018-05-04 2019-11-06 Siemens Healthcare GmbH Method for generating image data in a computer tomography unit, image generating computer, computer tomography unit, computer program product and computer-readable medium
CN109493396B (en) * 2018-11-07 2024-03-26 上海联影医疗科技股份有限公司 CT image display method, device, equipment and medium
CN109542284B (en) * 2018-11-15 2022-09-06 上海联影医疗科技股份有限公司 Image display parameter adjusting method, device, terminal and storage medium
CN109725796A (en) * 2018-12-28 2019-05-07 上海联影医疗科技有限公司 A kind of medical image display method and its device
CN112991242A (en) * 2019-12-13 2021-06-18 RealMe重庆移动通信有限公司 Image processing method, image processing apparatus, storage medium, and terminal device
CN111166362B (en) * 2019-12-31 2021-12-03 推想医疗科技股份有限公司 Medical image display method and device, storage medium and electronic equipment
CN111462139A (en) * 2020-04-24 2020-07-28 上海联影医疗科技有限公司 Medical image display method, medical image display device, computer equipment and readable storage medium
CN111462115A (en) * 2020-04-27 2020-07-28 上海联影医疗科技有限公司 Medical image display method and device and computer equipment
CN111803104B (en) * 2020-07-20 2021-06-11 上海杏脉信息科技有限公司 Medical image display method, medium and electronic equipment
CN114219813A (en) * 2021-12-16 2022-03-22 数坤(北京)网络科技股份有限公司 Image processing method, intelligent terminal and storage medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0409206A2 (en) * 1989-07-19 1991-01-23 Kabushiki Kaisha Toshiba Digital image display apparatus
WO2006132651A2 (en) * 2004-08-26 2006-12-14 Lumeniq, Inc. Dynamic contrast visualization (dcv)
CN101112315A (en) * 2007-08-24 2008-01-30 珠海友通科技有限公司 X-ray human body clairvoyance image automatic anastomosing and splicing method
CN101901474A (en) * 2008-11-27 2010-12-01 爱克发医疗保健公司 Method of changing at least one of density and contrast of an image
WO2017135686A1 (en) * 2016-02-04 2017-08-10 삼성전자 주식회사 Tomographic image processing device and method, and recording medium relating to method

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002017685A (en) * 2000-07-10 2002-01-22 Ge Yokogawa Medical Systems Ltd Medical image displaying method and medical image displaying device
US9741104B2 (en) * 2015-05-18 2017-08-22 Toshiba Medical Systems Corporation Apparatus, method, and computer-readable medium for quad reconstruction using hybrid filter convolution and high dynamic range tone-mapping
CN106557224A (en) * 2015-09-29 2017-04-05 青岛海信医疗设备股份有限公司 A kind of CT method for displaying image and device

Also Published As

Publication number Publication date
CN107833231A (en) 2018-03-23

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: 201807 Shanghai City, north of the city of Jiading District Road No. 2258

Applicant after: Shanghai Lianying Medical Technology Co., Ltd

Address before: 201807 Shanghai City, north of the city of Jiading District Road No. 2258

Applicant before: SHANGHAI UNITED IMAGING HEALTHCARE Co.,Ltd.

GR01 Patent grant