WO2006027536A1 - User interface for CT scan analysis - Google Patents

User interface for CT scan analysis

Info

Publication number
WO2006027536A1
WO2006027536A1
Authority
WO
WIPO (PCT)
Prior art keywords
scan image
enhanced
original
image
original scan
Prior art date
Application number
PCT/GB2005/002953
Other languages
French (fr)
Inventor
Jamshid Dehmeshki
Original Assignee
Medicsight Plc
Priority date
Filing date
Publication date
Priority claimed from GB0420147A external-priority patent/GB2418094B/en
Application filed by Medicsight Plc filed Critical Medicsight Plc
Priority to AU2005281551A priority Critical patent/AU2005281551A1/en
Priority to CA002579858A priority patent/CA2579858A1/en
Priority to JP2007530755A priority patent/JP2008512161A/en
Publication of WO2006027536A1 publication Critical patent/WO2006027536A1/en

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 6/00 Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B 6/46 Arrangements for interfacing with the operator or the patient
    • A61B 6/461 Displaying means of special interest
    • A61B 6/463 Displaying means of special interest characterised by displaying multiple images or images and diagnostic data on one display
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 6/00 Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B 6/02 Arrangements for diagnosis sequentially in different planes; Stereoscopic radiation diagnosis
    • A61B 6/03 Computed tomography [CT]
    • A61B 6/032 Transmission computed tomography [CT]
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 6/00 Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B 6/46 Arrangements for interfacing with the operator or the patient
    • A61B 6/467 Arrangements for interfacing with the operator or the patient characterised by special input means
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 11/00 2D [Two Dimensional] image generation
    • G06T 11/003 Reconstruction from projections, e.g. tomography
    • G06T 11/008 Specific post-processing after tomographic reconstruction, e.g. voxelisation, metal artifact correction
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00 Image enhancement or restoration
    • G06T 5/70 Denoising; Smoothing
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00 Image enhancement or restoration
    • G06T 5/73 Deblurring; Sharpening
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/0002 Inspection of images, e.g. flaw detection
    • G06T 7/0012 Biomedical image inspection
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2200/00 Indexing scheme for image data processing or generation, in general
    • G06T 2200/24 Indexing scheme for image data processing or generation, in general involving graphical user interfaces [GUIs]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10072 Tomographic images
    • G06T 2207/10081 Computed x-ray tomography [CT]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/20 Special algorithmic details
    • G06T 2207/20172 Image enhancement details
    • G06T 2207/20192 Edge enhancement; Edge preservation

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Medical Informatics (AREA)
  • Physics & Mathematics (AREA)
  • Radiology & Medical Imaging (AREA)
  • Theoretical Computer Science (AREA)
  • General Health & Medical Sciences (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Biomedical Technology (AREA)
  • Public Health (AREA)
  • Optics & Photonics (AREA)
  • High Energy & Nuclear Physics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Veterinary Medicine (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Pulmonology (AREA)
  • Quality & Reliability (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Apparatus For Radiation Diagnosis (AREA)
  • Image Processing (AREA)

Abstract

An original scan image and an enhanced scan image derived from the original scan image are displayed side by side or alternately, to facilitate comparison between the original and enhanced image. The user is able to change enhancement parameters of the enhanced image while viewing the original and enhanced image, to obtain the most suitable settings for the image while observing the effect on the enhanced image and comparing this with his or her own analysis of the original image. The adjusted parameters may then be applied to other parts of the original image so as to provide a more accurate analysis of those parts.

Description

USER INTERFACE FOR CT SCAN ANALYSIS
Field of the Invention
The present invention relates to a user interface for CT scan analysis, and particularly for displaying a computed tomography (CT) scan image which is enhanced so as to highlight potential lesions or other abnormalities.
Background of the Invention
In conventional analysis of CT images, a radiologist visually inspects each slice of a CT scan, using his or her expertise to identify abnormalities and to distinguish them from normal structures. The task can be made easier by storing the scan image on a computer and providing a user interface which allows the user to move rapidly between slices and visualise the structures in different ways. However, the process is time consuming and must be performed with great care in case any abnormalities are missed.
To replace some or all of the work of the radiologist, Computer Assisted Detection (CAD) software has been designed to analyse the scan image and detect potential lesions. The detection can be performed semi-automatically, with some interaction with the radiologist, or automatically, involving no interaction beyond the selection of the image to be analysed. For example, the applicant's MedicHeart™, MedicLung™ and MedicColon™ diagnostic software perform semi-automatic diagnosis of CT scans of the heart, lung and colon respectively.
In practice, the results of CAD must be checked by a radiologist as a safeguard. If the software is used as the 'first reader', the radiologist only verifies the results produced by the software and does not analyse the original CT scan. To be effective as a 'first reader', the software must have both high sensitivity (i.e. a low percentage of missed lesions) and high specificity (i.e. a low percentage of false positives), because the radiologist may make medically important decisions based on the results. 'First reader' has the greatest potential to save radiologists' time, but it is a great challenge to achieve both high sensitivity and high specificity.
Alternatively, the software can be used as a 'second reader', where the radiologist makes a preliminary diagnosis based entirely on the original images, and then runs the software as a check for any missed lesions. When used as a 'second reader', the software does not actually save time, but can assist the radiologist in making better diagnoses. The software does not need to have particularly high sensitivity or specificity, so long as it leads to more accurate diagnoses than an unassisted radiologist. Used in this way, the software is analogous to a spelling or grammar checker for word-processing - it merely draws the user's attention to oversights, rather than replacing the actions of the user.
It would be desirable to produce a user interface for analysis of CT scans that does not need to be as accurate as 'first reader' software but saves more time than 'second reader' software.
The document WO-A-03/077203 discloses a user interface which allows corresponding areas from different scans to be displayed side-by-side.
A further problem is that many CAD algorithms rely for detection on a predefined set of parameter ranges. For example the Agatston method, as originally described in 'Quantification of coronary artery calcium using ultrafast computed tomography', Agatston AS, Janowitz WR, Hildner FJ et al, J Am Coll Cardiol 1990 15:827-832, applies a threshold of 130 Hounsfield units (HU) to the CT image, and identifies all pixels above that threshold as containing calcium. A scoring system is then used to rate the severity of the calcification, based on the number of pixels above the threshold multiplied by a weight based on the highest intensity within the calcification. If the highest intensity is between 130 and 200 HU, then the weight is 1; if between 200 and 300 HU, the weight is 2; and if over 300 HU, the weight is 3. The values of the threshold and the weights are based on empirical studies of coronary scans and the subsequent outcome for the patients. However, there is continuing debate as to which parameter ranges give the most accurate results; different ranges may be appropriate for different scan images.
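The scoring scheme described above can be expressed directly in code. The following is a minimal sketch of an Agatston-style score following the description in this paragraph (the pixel count of each calcification multiplied by a weight derived from its peak intensity); the function names, the handling of the 200 HU boundary and the use of scipy.ndimage.label to group connected above-threshold pixels are illustrative choices, not taken from the patent or the cited paper.

```python
import numpy as np
from scipy import ndimage


def intensity_weight(peak_hu: float) -> int:
    """Weight from the peak intensity of a calcification (130/200/300 HU bands)."""
    if peak_hu > 300:
        return 3
    if peak_hu >= 200:
        return 2
    return 1  # peak lies between the 130 HU threshold and 200 HU


def agatston_like_score(slice_hu: np.ndarray, threshold_hu: float = 130.0) -> float:
    """Score one CT slice: sum over calcifications of (pixel count * weight)."""
    mask = slice_hu > threshold_hu
    labels, n_regions = ndimage.label(mask)  # group connected above-threshold pixels
    score = 0.0
    for region in range(1, n_regions + 1):
        region_mask = labels == region
        peak = float(slice_hu[region_mask].max())
        score += int(region_mask.sum()) * intensity_weight(peak)
    return score
```

Adjusting `threshold_hu` or the weight bands in such a routine is exactly the kind of parameter tuning that the debate mentioned above concerns, and that the interface described below exposes to the user.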
The document US6058322 discloses an interactive user modification function in which the software displays detected microcalcifications and the user may then add or delete microcalcifications. The software modifies its estimated likelihood of malignancy accordingly.
Statement of the Invention
According to one aspect of the invention, there is provided a method of displaying a CT scan image, comprising displaying an original scan image and displaying an enhanced scan image derived from the original scan image. Preferably, the original and enhanced images are displayed with similar image attributes, such as size and scale, to facilitate comparison between the original and enhanced image. The enhanced image may be enhanced so as to identify to the user lesions or abnormalities in the original image.
In one embodiment, the original and enhanced images are displayed simultaneously. In another embodiment, the original and enhanced images are displayed alternately, with a similar size, scale and position. Preferably, the original and enhanced images are displayed on the same display device.
An advantage of these arrangements is that the original image can be visually checked against the enhanced image without the enhanced image obscuring features of the original image. Instead of using the enhanced image as a first or second reader, the enhanced image acts as a joint reader with the user, who can examine the original image while using the enhanced image as assistance.
In one embodiment, the user is able to change enhancement parameters of the enhanced image while viewing the original and enhanced image. In this way, the user is able to adjust the parameters to the most suitable settings for the image while observing the effect on the enhanced image and comparing this with his or her own analysis of the original image. The adjusted parameters may then be applied to other parts of the original image so as to provide a more accurate analysis of those parts.
Brief Description of the Drawings
Specific embodiments of the present invention will now be illustrated with reference to the accompanying drawings, in which:
Figure 1 is a schematic diagram showing a CT scanner and a remote computer for processing image data from the scanner and operating a user interface;
Figure 2 is a flow chart of the main steps of the method of operation of the user interface in an embodiment of the invention;
Figure 3 is a screenshot of the user interface, with a sphericity enhancement filter applied;
Figure 4 shows a filter window of the user interface with the sphericity enhancement filter selected;
Figure 5 is a screenshot of the user interface, with an edge enhancement filter applied;
Figure 6 shows the filter window with a standard noise removal filter selected;
Figure 7 shows the filter window with an advanced noise removal filter selected; and
Figure 8 is a screenshot of the user interface in a second embodiment, for use with colon scan images.
Detailed Description of the Embodiments
Scan Image
A CT image may include one or more slices obtained from a human or animal patient using a CT scanner. Each slice is a 2-dimensional digital grey-scale image of the x-ray absorption of the scanned area. The properties of the slice may depend on the CT scanner used; for example, a high-resolution multi-slice CT scanner may produce images with a resolution of 0.5-0.6 mm/pixel in the x and y directions (i.e. in the plane of the slice). Each pixel may have 32-bit greyscale resolution. The intensity value of each pixel is normally expressed in Hounsfield units (HU). Sequential slices may be separated by a constant distance along the z direction (i.e. the scan separation axis); for example, by a distance of between 0.75 and 2.5 mm. Hence, the scan image may be a two-dimensional (2D) or three-dimensional (3D) grey-scale image, with an overall size depending on the area and number of slices scanned.
The present invention is not restricted to any specific scanning technique, and is applicable to electron beam computed tomography (EBCT), multi-detector or spiral scans or any technique which produces as output a 2D or 3D image representing X-ray absorption.
Computer System
As shown in Figure 1, the scan image is created by a computer 4 which receives scan data from a scanner 2 and constructs the scan image. The scan image is saved as an electronic file or a series of files which are stored on a storage medium 6, such as a fixed or removable disc. The scan image may be stored in a standard format, such as DICOM 3.0. The scan image may be processed by the computer 4, or the scan image may be transferred to another computer 8 which runs software for processing and displaying images as described below. The software may be stored on a carrier, such as a removable disc, or downloaded over a network.
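As a concrete illustration of this storage and transfer workflow, the sketch below loads a stored series of slice files into a single grey-scale volume in Hounsfield units. It assumes the third-party pydicom package, a flat directory of .dcm files and the standard DICOM rescale tags; the directory layout and function name are hypothetical, and real series handling usually needs more robust sorting and validation.

```python
from pathlib import Path

import numpy as np
import pydicom


def load_ct_volume(series_dir: str) -> np.ndarray:
    """Read every DICOM slice in a directory and stack them into a z-ordered HU volume."""
    slices = [pydicom.dcmread(p) for p in sorted(Path(series_dir).glob("*.dcm"))]
    # Order slices along the z (scan separation) axis.
    slices.sort(key=lambda ds: float(ds.ImagePositionPatient[2]))
    volume = np.stack([ds.pixel_array.astype(np.float32) for ds in slices])
    # Convert stored pixel values to Hounsfield units using the rescale tags.
    slope = float(slices[0].RescaleSlope)
    intercept = float(slices[0].RescaleIntercept)
    return volume * slope + intercept
```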
User Interface Flowchart
A flowchart of a method of operation of a user interface, implemented by software in an embodiment of the invention, is shown in Figure 2. An original scan image is provided as input (S1) and is processed to provide an enhanced scan image (S2). A representation of the original scan image and the enhanced scan image are displayed (S3), either side by side or alternately, together with an indication of the parameter values used for the enhancement. If a user input is received (S4) to adjust the parameter values, the values are adjusted (S5) and the enhanced scan image is produced again (S2) using the adjusted parameters.
The original and enhanced images may be displayed side by side, or may be displayed alternately at the same position and with the same size and scale; for example, the image enhancement may be alternately switched on and off. The switching between original and enhanced images may occur every time the user presses a specific key, or clicks on a button on-screen.
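The S1-S5 loop of Figure 2 amounts to re-running the enhancement whenever the user changes a parameter, while the original image stays fixed and both views are redrawn. A minimal sketch of that control flow, with hypothetical enhance and display callables standing in for the filter and display code (none of these names come from the patent):

```python
from dataclasses import dataclass, field
from typing import Callable, Dict

import numpy as np


@dataclass
class EnhancementSession:
    original: np.ndarray                                   # S1: original scan image
    enhance: Callable[[np.ndarray, Dict], np.ndarray]      # filter implementation
    display: Callable[[np.ndarray, np.ndarray, Dict], None]
    params: Dict = field(default_factory=dict)

    def run_once(self) -> None:
        enhanced = self.enhance(self.original, self.params)   # S2: produce enhanced image
        self.display(self.original, enhanced, self.params)    # S3: show both, plus parameter values

    def on_parameter_change(self, **updates) -> None:
        self.params.update(updates)                           # S4/S5: adjust parameter values
        self.run_once()                                       # re-enhance with the adjusted parameters
```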
Lung Scan Embodiment
Figure 3 shows a screenshot of the user interface in a first embodiment, displaying a two-dimensional slice of a scan of a lung phantom (i.e. a model approximating the lung and containing objects with known properties, used for testing and calibration).
The user interface is shown in a window comprising an original image pane 10, and an enhanced image pane 12. The original image pane 10 displays a slice of the scan image. A first toolbar 18 is located at the bottom of the original image pane 10, containing buttons which allow the user to manipulate the original image, for example by zooming, panning and rotating. The original image pane 10 also includes a first scroll bar 22 which allows the user to move forward and backward between slices.
The enhanced image pane 12 displays a version of the original image processed by one or more filters so as to highlight features of the original image. In this example, a sphericity enhancement filter is applied, and objects satisfying the sphericity filter are circled. A second toolbar 20 in the enhanced image pane 12 contains buttons which allow the user to manipulate the enhanced image, for example by zooming, panning and rotating. The enhanced image pane 12 also includes a second scroll bar 24 which allows the user to move forward and backward between slices. The first and second toolbars 18, 20 may be linked so that an image manipulation performed by either toolbar has a corresponding or equivalent effect on both the original and the enhanced image. For example, magnifying the original scan image using the first toolbar 18 magnifies the enhanced scan image in the enhanced image pane 12 by the same amount. In another example, magnifying the enhanced scan image using the second toolbar 20 magnifies the original scan image in the original image pane 10 by the same amount. In this way, a side-by-side comparison may be made between the same view of the original and enhanced images.
The first and second scroll bars 22, 24 may also be linked so that moving either scrollbar has a corresponding or equal effect on both the original scan image and the enhanced scan image. This allows the user to move back and forth in the z-direction while comparing the original and enhanced scan images.
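One way to keep the two panes in lockstep is to have both render from a single shared view state, so that a zoom, pan or slice change issued from either toolbar or scroll bar updates both displays. A minimal sketch of that arrangement follows; the class and method names are illustrative and not taken from the patent.

```python
from dataclasses import dataclass
from typing import List, Tuple


@dataclass
class ViewState:
    zoom: float = 1.0
    pan: Tuple[float, float] = (0.0, 0.0)
    rotation_deg: float = 0.0
    slice_index: int = 0


class ImagePane:
    def __init__(self, name: str):
        self.name = name

    def render(self, state: ViewState) -> None:
        # Placeholder for drawing the pane's image with the shared view parameters.
        print(f"{self.name}: slice {state.slice_index} at zoom {state.zoom:.1f}")


class LinkedPanes:
    """Both panes share one ViewState; any change redraws the original and enhanced views."""

    def __init__(self, panes: List[ImagePane]):
        self.state = ViewState()
        self.panes = panes

    def set_zoom(self, zoom: float) -> None:
        self.state.zoom = zoom
        self._redraw()

    def set_slice(self, index: int) -> None:
        self.state.slice_index = index
        self._redraw()

    def _redraw(self) -> None:
        for pane in self.panes:
            pane.render(self.state)
```

For example, `LinkedPanes([ImagePane("original"), ImagePane("enhanced")]).set_slice(42)` moves both panes to the same slice, which is the behaviour described for the linked scroll bars 22, 24.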
The user interface window includes a right-hand toolbar 14 which displays various options selectable by the user. A 'Display' option allows the user to control how many panes are displayed: single, two, four or window, in which the panes are displayed as separate windows. In this example, only two panes are used for displaying scan images, while the third and fourth panes are assigned a data display and a 3D nodule display function respectively.
The type and parameters of the filter are controlled by the user by means of a filter window 16, which is shown in more detail in Figure 4. In this example, the user can select any one of a sphericity enhancement filter, an edge filter, a standard noise removal filter and an enhanced noise removal filter, by selecting the corresponding tab. The user then sets the parameter values of the filter, and clicks a button to apply the filter to update the enhanced image.
Sphericity Enhancement Filter
In this example, the sphericity enhancement filter tab includes a sphericity slider 26 allowing the user to set the minimum level of sphericity for objects passed by the filter, a minimum intensity slider 28a and a maximum intensity slider 28b allowing the user to select the minimum and maximum peak intensity respectively within an object to be passed by the filter, and a nodule size selector 30 allowing the user to select one of a plurality of different size ranges of objects to be passed by the filter. The user can also select a first option 32 to detect plural tiny nodules, and a second option 34 to remove cylindrical shapes. The latter provides the user with the ability to avoid enhancement of spherical shapes that are within cylindrical structures, such as blood vessels.
The sphericity enhancement filter may operate by analysing each volume element (voxel) within the scan image and comparing with surrounding voxels of similar intensity to derive the three-dimensional curvature of a surface of equal intensity. Surfaces having a sphericity above the level selected by the user are identified as belonging to spherical objects. Voxels contained within those surfaces are then grouped together as parts of the same object. Once all such objects have been identified, those having a maximum intensity between the minimum and maximum selected by the user, and a size within the range selected by the user, are highlighted by the filter.
Where a spherical object passed by the filter occupies multiple consecutive slices, the object may be highlighted only on the slice which contains the greatest area of the object.
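The curvature computation itself is beyond a short example, but the selection logic described above (group candidate voxels into objects, then keep those whose shape, peak intensity and size fall within the user-selected ranges) can be sketched as follows. The compactness ratio used here is a simplified stand-in for the surface-curvature sphericity measure, the default parameter values are placeholders, and all function names are illustrative; the parameters mirror the sliders of Figure 4.

```python
import numpy as np
from scipy import ndimage


def sphericity_candidates(volume_hu, min_sphericity=0.7,
                          min_peak_hu=-400.0, max_peak_hu=200.0,
                          min_voxels=10, max_voxels=5000,
                          intensity_floor_hu=-600.0):
    """Return a mask of objects passing simplified sphericity, intensity and size tests."""
    labels, _ = ndimage.label(volume_hu > intensity_floor_hu)
    keep = np.zeros(volume_hu.shape, dtype=bool)
    for label_value, obj in enumerate(ndimage.find_objects(labels), start=1):
        region = labels[obj] == label_value
        size = int(region.sum())
        if not (min_voxels <= size <= max_voxels):
            continue
        peak = float(volume_hu[obj][region].max())
        if not (min_peak_hu <= peak <= max_peak_hu):
            continue
        # Compactness proxy: fraction of the bounding box the object fills.
        # A sphere fills about pi/6 (~52%) of its bounding cube; elongated
        # cylindrical structures such as vessels fill far less.
        compactness = size / float(region.size)
        if compactness >= min_sphericity * (np.pi / 6.0):
            keep[obj] |= region
    return keep
```

The returned mask marks the objects that would be circled in the enhanced image pane 12; rejecting low-compactness objects plays a role loosely analogous to the 'remove cylindrical shapes' option 34.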
Edge Enhancement Filter
Figure 5 shows the embodiment when applied to another scan image, with the edge enhancement filter selected and applied. The edge enhancement is applied to the lung parenchyma area only, and is advantageous in identifying signs of interstitial disease such as reticulation.
When the edge enhancement filter is selected, a contrast setting window 36 is displayed, allowing the user to vary contrast parameters of the edge enhancement filter.
Standard Noise Removal Filter
Figure 6 shows the filter window 16 when the standard noise removal filter is selected. The filter window displays a blur slider 38 by which the user can vary the degree of noise smoothing. This filter is advantageous for reading low-dose multi-slice spiral CT (MSCT) studies, where radiation dose reduction is achieved at the expense of increased background noise. The standard noise removal filter smoothes the background noise to the degree set by the user.
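A standard smoothing filter of this kind can be as simple as a Gaussian blur whose strength is driven by the slider value. Below is a minimal sketch, assuming the blur slider maps linearly onto the Gaussian sigma; that mapping and the function name are assumptions, not details given in the patent.

```python
import numpy as np
from scipy import ndimage


def standard_noise_removal(slice_hu: np.ndarray, blur_slider: float) -> np.ndarray:
    """Smooth background noise; blur_slider in [0, 1] maps to a Gaussian sigma in pixels."""
    sigma = 3.0 * float(np.clip(blur_slider, 0.0, 1.0))  # assumed slider-to-sigma mapping
    if sigma == 0.0:
        return slice_hu.copy()
    return ndimage.gaussian_filter(slice_hu, sigma=sigma)
```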
Advanced Noise Removal Filter
Figure 7 shows the filter window 16 when the advanced noise removal filter is selected. The filter window 16 displays a window size selector 40 which sets the window size used by the advanced noise reduction technique. This technique results in less apparent blurring than the standard technique, but is more time consuming.
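One plausible edge-preserving, window-based technique with the trade-off described above is a local median filter: it smooths noise within each window while blurring edges less than a plain Gaussian, at a higher computational cost. This sketch is an illustrative choice, not necessarily the technique used in the embodiment.

```python
import numpy as np
from scipy import ndimage


def advanced_noise_removal(slice_hu: np.ndarray, window_size: int = 3) -> np.ndarray:
    """Median-filter the slice using the window size chosen in the filter window."""
    if window_size < 2:
        return slice_hu.copy()
    return ndimage.median_filter(slice_hu, size=window_size)
```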
Colon Scan Embodiment
Figure 8 shows a screen display of a user interface in a second embodiment, for use with scan images of the colon. This embodiment differs from the lung scan embodiment in the type of filters available; a single polyp enhancement filter is available rather than the choice of filters used in the lung scan embodiment. However, any of the filters available in the lung scan embodiment may be used with the second embodiment. Similar parts are shown with the same reference numerals, and their description is not repeated.
The filter window 16 displays settings for the polyp enhancement filter, which highlights raised objects with a spherical element but does not highlight objects that are elongated and likely to be folds. The filter window includes minimum and maximum flatness sliders 38a, 38b which allow the user to select the degree of flatness of objects to be highlighted. In this example, the third and fourth panes comprise a second original image pane 40 and a second enhanced image pane 42, displaying a supine axial view of the scan image while the original image pane 10 and enhanced image pane 12 display a prone axial view. Manipulation of the second original image pane 40 and the second enhanced image pane 42 is also linked, so that changes made in one pane are echoed in the other.
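The distinction between raised, roughly spherical polyps and elongated fold-like objects can be approximated from the shape of each candidate object, for example by comparing the principal-axis spreads of its voxel coordinates. The sketch below uses a covariance-eigenvalue flatness measure bounded by the two slider values; it is an illustrative stand-in for the polyp enhancement filter, not the patented method, and the function names are hypothetical.

```python
import numpy as np


def flatness_measure(voxel_coords: np.ndarray) -> float:
    """Ratio of smallest to largest principal-axis spread of an object's voxels.

    voxel_coords has shape (n_voxels, 3); values near 1 indicate compact,
    roughly spherical objects, values near 0 indicate flat or elongated
    (fold-like) objects.
    """
    if voxel_coords.shape[0] < 4:
        return 0.0  # too few voxels to estimate a shape
    centred = voxel_coords - voxel_coords.mean(axis=0)
    eigvals = np.linalg.eigvalsh(np.cov(centred, rowvar=False))  # ascending order
    eigvals = np.clip(eigvals, 1e-9, None)
    return float(np.sqrt(eigvals[0] / eigvals[-1]))


def passes_flatness(voxel_coords: np.ndarray, min_flatness: float, max_flatness: float) -> bool:
    """Keep objects whose flatness lies between the two slider settings."""
    return min_flatness <= flatness_measure(voxel_coords) <= max_flatness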
Single View Embodiment
As an alternative to the side-by-side views described above, the user may select a display mode in which only a single scan image view is displayed in the user interface window, and the enhancement selected by the filter window can be switched on and off by the user, for example by toggling a button in the filter window. Image display parameters are kept unchanged when switching between the original and the enhanced image, so that the user can easily compare the enhancement and the original image.
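In this mode the toggle simply swaps which image buffer is drawn, while the display parameters (zoom, pan, slice and so on) are left untouched, which is what makes the before/after comparison easy. A minimal sketch of such a toggle, assuming a shared view-state object along the lines sketched earlier (the class is illustrative):

```python
class ToggleView:
    """Switch between original and enhanced buffers without altering display parameters."""

    def __init__(self, original, enhanced, view_state):
        self.buffers = {"original": original, "enhanced": enhanced}
        self.current = "original"
        self.view_state = view_state  # zoom, pan and slice persist across toggles

    def toggle(self):
        self.current = "enhanced" if self.current == "original" else "original"
        return self.buffers[self.current], self.view_state
```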
Alternative Embodiments
The embodiments described above are illustrative of, rather than limiting to, the present invention. Alternative embodiments apparent on reading the above description may nevertheless fall within the scope of the invention.

Claims

C L A I M S
1. A computer-implemented method of displaying a computed tomography (CT) scan image, comprising: a) processing (S2) an original scan image to generate an enhanced scan image in which one or more features indicative of an abnormality in the original scan image are enhanced and at least one other feature of the original scan image is obscured; and characterised by: b) displaying (S3) the original scan image and the enhanced scan image so as to enable a visual comparison therebetween.
2. The method of claim 1, wherein the original scan image (10) and the enhanced scan image (12) are displayed simultaneously.
3. The method of claim 1, wherein the original scan image and the enhanced scan image are displayed alternately.
4. The method of claim 3, wherein the display alternates between the original scan image and the enhanced scan image in response to a user input.
5. The method of claim 3 or claim 4, wherein the original scan image and the enhanced scan image are displayed alternately in substantially the same position.
6. The method of any preceding claim, including modifying (S5) a displayed representation (10) of the original scan image and a displayed representation (12) of the enhanced scan image, in response to a user input (S4).
7. The method of claim 6, wherein the displayed representation (10) of the original scan image is a slice of the original scan image and the displayed representation (12) of the enhanced scan image is a corresponding slice of the enhanced scan image, and the displayed slice of the original scan image and the corresponding slice of the enhanced scan image are modified in response to the user input.
8. The method of any preceding claim, wherein the processing step (S2) is performed according to one or more user-defined parameters, the method including receiving a user input for modifying one or more said parameters, and performing the processing and displaying steps using said modified one or more parameters.
9. The method of any preceding claim, wherein the processing step is performed using a selected one of a plurality of filters, the method including receiving a user input for selecting one of said filters, and performing the processing and displaying steps using the selected filter.
10. The method of any preceding claim, wherein the original scan image is a scan image of a lung, and the original scan image is processed (S2) to enhance spherical objects in the enhanced scan image.
11. The method of any preceding claim, wherein the original scan image is a scan image of a lung, and the original scan image is processed (S2) to enhance edges in the enhanced scan image.
12. The method of any one of claims 1 to 9, wherein the original scan image is a scan image of a colon, and the original scan image is processed (S2) to enhance spherical objects raised from the colon wall in the enhanced scan image.
13. A computer program arranged to perform the method of any preceding claim.
14. A computer program product comprising the computer program of claim 13, recorded on a carrier.
15. Apparatus (8) for displaying a computed tomography (CT) scan image, comprising: a) means arranged to process an original scan image to generate an enhanced scan image in which one or more features indicative of an abnormality in the original scan image are enhanced and at least one other feature of the original scan image is obscured; and b) means arranged to display the original scan image and the enhanced scan image so as to enable a visual comparison therebetween.
PCT/GB2005/002953 2004-09-10 2005-07-27 User interface for ct scan analysis WO2006027536A1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
AU2005281551A AU2005281551A1 (en) 2004-09-10 2005-07-27 User interface for CT scan analysis
CA002579858A CA2579858A1 (en) 2004-09-10 2005-07-27 User interface for ct scan analysis
JP2007530755A JP2008512161A (en) 2004-09-10 2005-07-27 User interface for CT scan analysis

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
GB0420147.1 2004-09-10
GB0420147A GB2418094B (en) 2004-09-10 2004-09-10 User interface for CT scan analysis
US10/978,369 US7149334B2 (en) 2004-09-10 2004-11-02 User interface for computed tomography (CT) scan analysis
US10/978,369 2004-11-02
EP05251836.2 2005-03-24
EP05251836A EP1635295A1 (en) 2004-09-10 2005-03-24 User interface for CT scan analysis

Publications (1)

Publication Number Publication Date
WO2006027536A1 true WO2006027536A1 (en) 2006-03-16

Family

ID=35197721

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/GB2005/002953 WO2006027536A1 (en) 2004-09-10 2005-07-27 User interface for ct scan analysis

Country Status (5)

Country Link
JP (1) JP2008512161A (en)
KR (1) KR20070083645A (en)
AU (1) AU2005281551A1 (en)
CA (1) CA2579858A1 (en)
WO (1) WO2006027536A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2013014554A1 (en) * 2011-07-28 2013-01-31 Koninklijke Philips Electronics N.V. Image generation apparatus
EP2766816A4 (en) * 2011-10-10 2016-01-27 Vivoom Inc Network-based rendering and steering of visual effects

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101129718B1 (en) * 2008-01-04 2012-03-28 주식회사 메디칼스탠다드 Reading aid method of digital test images
JP5863330B2 (en) * 2011-08-22 2016-02-16 国立大学法人旭川医科大学 Image processing apparatus, image processing method, and program
TR201809179T4 (en) * 2012-10-01 2018-07-23 Koninklijke Philips Nv Visualization of image data.
KR102451377B1 (en) * 2015-10-08 2022-10-06 (주)바텍이우홀딩스 Method and Apparatus for Providing a Reconstructed Medical Image with Enhanced Depth Information

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5982917A (en) * 1996-06-03 1999-11-09 University Of South Florida Computer-assisted method and apparatus for displaying x-ray images
US20020076091A1 (en) * 1993-09-29 2002-06-20 Shih-Ping Wang Computer-aided diagnosis method and system
US6697506B1 (en) * 1999-03-17 2004-02-24 Siemens Corporate Research, Inc. Mark-free computer-assisted diagnosis method and system for assisting diagnosis of abnormalities in digital medical images using diagnosis based image enhancement

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0737061A (en) * 1993-07-20 1995-02-07 Toshiba Corp Medical diagnosis support device
JPH08294479A (en) * 1995-01-23 1996-11-12 Fuji Photo Film Co Ltd Computer aided image diagnostic device
JPH1031761A (en) * 1996-07-17 1998-02-03 Ge Yokogawa Medical Syst Ltd Image display method and image display device
CN1157682C (en) * 1997-11-28 2004-07-14 王士平 Computer-aided diagnosis system and method
JP2003070781A (en) * 2001-09-04 2003-03-11 Hitachi Medical Corp Supporting unit for medical image diagnosis
JP2003190134A (en) * 2001-12-28 2003-07-08 Konica Corp Medical image processor, medical image processing method, program and storage medium
JP2003319933A (en) * 2002-05-01 2003-11-11 Fuji Photo Film Co Ltd Picture display system
JP2004073432A (en) * 2002-08-15 2004-03-11 Ge Medical Systems Global Technology Co Llc X-ray ct apparatus, image processing method, and program

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020076091A1 (en) * 1993-09-29 2002-06-20 Shih-Ping Wang Computer-aided diagnosis method and system
US5982917A (en) * 1996-06-03 1999-11-09 University Of South Florida Computer-assisted method and apparatus for displaying x-ray images
US6697506B1 (en) * 1999-03-17 2004-02-24 Siemens Corporate Research, Inc. Mark-free computer-assisted diagnosis method and system for assisting diagnosis of abnormalities in digital medical images using diagnosis based image enhancement

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2013014554A1 (en) * 2011-07-28 2013-01-31 Koninklijke Philips Electronics N.V. Image generation apparatus
US9299172B2 (en) 2011-07-28 2016-03-29 Koninklijke Philips N.V. Image generation apparatus
EP2766816A4 (en) * 2011-10-10 2016-01-27 Vivoom Inc Network-based rendering and steering of visual effects

Also Published As

Publication number Publication date
CA2579858A1 (en) 2006-03-16
AU2005281551A1 (en) 2006-03-16
JP2008512161A (en) 2008-04-24
KR20070083645A (en) 2007-08-24

Similar Documents

Publication Publication Date Title
EP1635295A1 (en) User interface for CT scan analysis
US7133546B2 (en) Digital medical image analysis
US8165365B2 (en) Method and system for displaying tomosynthesis images
JP5039310B2 (en) Cerebral hemorrhage segmentation device
JP5138910B2 (en) 3D CAD system and method using projected images
Zhou et al. Automated coronary artery tree extraction in coronary CT angiography using a multiscale enhancement and dynamic balloon tracking (MSCAR-DBT) method
US7756314B2 (en) Methods and systems for computer aided targeting
US6463181B2 (en) Method for optimizing visual display of enhanced digital images
JP4911029B2 (en) Abnormal shadow candidate detection method, abnormal shadow candidate detection device
US8326013B2 (en) Image processing device, image processing method, and image processing program
JP2007203046A (en) Method and system for preparing image slice of object
JP2004105731A (en) Processing of computer aided medical image
JP2006095279A (en) Medical image display apparatus
US10140715B2 (en) Method and system for computing digital tomosynthesis images
JP2000287955A (en) Image diagnostic supporting apparatus
WO2006027536A1 (en) User interface for ct scan analysis
JP2007151645A (en) Medical diagnostic imaging support system
JP2006334140A (en) Display method of abnormal shadow candidate and medical image processing system
JPH08287230A (en) Computer-aided image diagnostic device
Kaucic et al. Model-based detection of lung lesions in CT exams
WO2017018230A1 (en) Image processing device, method, and program
JP2007534352A (en) System and method for performing ground glass-like nodule (GGN) segmentation
CN114947912A (en) Medical image display device, medical image display method, and storage medium
Jiang et al. Support system for lung cancer screening by CT

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KM KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NG NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SM SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LT LU LV MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
WWE Wipo information: entry into national phase

Ref document number: 2007530755

Country of ref document: JP

Ref document number: 2005281551

Country of ref document: AU

Ref document number: 2579858

Country of ref document: CA

WWE Wipo information: entry into national phase

Ref document number: 200580030239.2

Country of ref document: CN

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2005281551

Country of ref document: AU

Date of ref document: 20050727

Kind code of ref document: A

WWP Wipo information: published in national office

Ref document number: 2005281551

Country of ref document: AU

WWE Wipo information: entry into national phase

Ref document number: 1020077008158

Country of ref document: KR

122 Ep: pct application non-entry in european phase