WO2008007344A2 - Locally optimized transfer functions for volume visualization - Google Patents

Locally optimized transfer functions for volume visualization

Info

Publication number
WO2008007344A2
WO2008007344A2 (PCT/IB2007/052770)
Authority
WO
WIPO (PCT)
Prior art keywords
image
computing
display
image data
region
Prior art date
Application number
PCT/IB2007/052770
Other languages
French (fr)
Other versions
WO2008007344A3 (en)
Inventor
Hauke Schramm
Gundolf Kiefer
Original Assignee
Koninklijke Philips Electronics N.V.
Philips Intellectual Property & Standards GmbH
Priority date
Filing date
Publication date
Application filed by Koninklijke Philips Electronics N.V., Philips Intellectual Property & Standards GmbH filed Critical Koninklijke Philips Electronics N.V.
Priority to JP2009519047A priority Critical patent/JP2010509941A/en
Priority to EP07805118A priority patent/EP2044574A2/en
Publication of WO2008007344A2 publication Critical patent/WO2008007344A2/en
Publication of WO2008007344A3 publication Critical patent/WO2008007344A3/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 15/00 3D [Three Dimensional] image rendering
    • G06T 15/10 Geometric effects

Definitions

  • the invention relates to the field of volume visualization of medical image data and more specifically to the use of transfer functions for direct volume visualization of medical image data.
  • the TF is represented by a set of coefficients of a linear combination of basis transfer functions.
  • a genetic search algorithm optimizes a plurality of sets of coefficients, each set representing a candidate TF.
  • the evaluation of search results may be based on a user evaluation or on an automatic evaluation of candidate TFs.
  • the search terminates when a satisfactory TF is found. Additional operations such as smoothing the candidate TFs are also described.
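
As a rough illustration of the Ref. 1 approach, the sketch below represents a candidate opacity TF as a linear combination of Gaussian basis functions and improves the coefficients with a simple mutate-and-select loop. This is a simplified stand-in, not the genetic algorithm of Ref. 1, and the fitness function merely mimics the user or automatic evaluation mentioned above; all names and constants are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Gaussian bumps over the normalized intensity range [0, 1] act as basis TFs.
centers = np.linspace(0.0, 1.0, 8)

def candidate_tf(coeffs, intensities):
    """Opacity TF as a linear combination of Gaussian basis functions."""
    basis = np.exp(-(intensities[:, None] - centers[None, :]) ** 2 / (2 * 0.05**2))
    return np.clip(basis @ coeffs, 0.0, 1.0)

def fitness(coeffs, intensities):
    """Stand-in for the user or automatic evaluation of a candidate TF:
    here, reward opacity concentrated around intensity 0.6."""
    target = np.exp(-(intensities - 0.6) ** 2 / (2 * 0.1**2))
    return -np.mean((candidate_tf(coeffs, intensities) - target) ** 2)

intensities = np.linspace(0.0, 1.0, 256)
population = [rng.uniform(0, 1, size=centers.size) for _ in range(20)]

for _ in range(50):                       # mutate-and-select search loop
    population.sort(key=lambda c: fitness(c, intensities), reverse=True)
    parents = population[:5]              # keep the best candidates
    population = parents + [np.clip(p + rng.normal(0, 0.1, p.shape), 0, 1)
                            for p in parents for _ in range(3)]

best = max(population, key=lambda c: fitness(c, intensities))
```
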
  • a problem with a TF generated in this way is that the quality of an image computed using the generated TF is often unsatisfactory.
  • a system for computing an image based on image data for displaying in a display, the image comprising a first image and a second image, comprises: a first determination unit for determining a first image data within the image data for computing the first image and for determining a second image data within the image data for computing the second image; a second determination unit for determining a first transfer function for computing the first image and for determining a second transfer function for computing the second image; a third determination unit for determining a viewing direction for computing the first image and for computing the second image; a fourth determination unit for determining a first region of the display for displaying the first image and for determining a second region of the display for displaying the second image; and a computation unit for computing the first image and the second image, thereby computing the image.
  • the first determination unit is arranged to determine the first image data within the image data for computing the first image for displaying in the first region of the display, e.g. in the top half of the display, and to determine the second image data within the image data for computing the second image for displaying in the second region of the display, e.g. in the bottom half of the display.
  • the second determination unit is arranged to determine the first TF for computing the first image and to determine the second TF for computing the second image.
  • the first TF and the second TF may be determined, for example, based on a user input for selecting a TF from a plurality of predefined TFs for optimal visualization of the first image data and of the second image data, respectively.
  • the first TF and the second TF may be further optimized.
  • the first TF and the second TF may be computed by the second determination unit on the basis of the first image data and of the second image data, respectively.
  • a method of computing a TF based on image data is described in Ref. 1.
  • the viewing direction is perpendicular to the viewing plane.
  • the viewing plane may be considered substantially identical with the plane defined by the display.
  • the viewing direction is defined relative to the image data.
  • the viewing direction for computing the first image and for computing the second image is determined by the third determination unit, e.g. based on a user input for selecting an orientation of the image data relative to the viewing direction.
  • the fourth determination unit is arranged to determine the first region of the display for displaying the first image and a second region of the display for displaying the second image.
  • the computation unit is arranged to compute the first image based on the first image data, the viewing direction, and the first TF.
  • the computation unit is further arranged to compute the second image based on the second image data, the viewing direction, and the second TF. Determining two independent TFs for computing the first image and the second image allows determining the first TF so as to be optimal for visualizing the first image data and the second TF so as to be optimal for visualizing the second image data.
  • the system is thus capable of improving the quality of the displayed image in the first display region and in the second display region.
  • the first image data is determined based on the viewing direction and the first region of the display.
  • the viewing direction is determined relative to the image data.
  • orthographic ray casting is employed for computing the image.
  • the viewing rays of the orthographic ray casting are mutually parallel and are perpendicular to the viewing plane.
  • the plane defined by the display may be considered substantially identical with the viewing plane.
  • the viewing plane comprises the first region of the display.
  • the first region of the display may be determined based on a user input. The user may interactively select the first region of the display using a viewing frame, which may be moved and sized within a viewing region.
  • a data element belongs to the first image data when the orthogonal projection of the spatial location of said data element onto the viewing plane is comprised in the first region of the display.
  • the first transfer function may then be optimally determined for viewing the first image computed from the first image data in the first region of the display.
  • the embodiment offers a method for improving the quality of the first image.
  • the second image data may be determined based on the viewing direction and the second region of the display in a similar way.
  • the second region of the display may be also rectangular and may be determined based on a user input. For example, the user may determine the width and the height of a second window comprising the second region of the display. In an embodiment of the system, the second region of the display is determined based on the first region of the display.
  • the second region of the display may be a complement of the first region of the display in a viewing region of the display.
  • the viewing region of the display is a portion of the display, e.g. a window, reserved for displaying the image.
  • the first region of the display may be determined by the user using, for example, a square or a rectangular frame for selecting the first region of the display in the viewing region of the display.
  • the viewing area may be divided into the first region and the second region by a line or a curve. This embodiment further simplifies selecting the first display region and the second display region.
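
A minimal sketch of the projection rule above. It assumes, for simplicity, orthographic viewing rays along the z axis of the image data coordinate system, so that the orthogonal projection onto the viewing plane amounts to dropping the z coordinate; the volume, the rectangular region, and all names are illustrative.

```python
import numpy as np

volume = np.random.default_rng(1).random((64, 64, 64))   # stand-in image data

def first_region_mask(shape, region):
    """Mark data elements whose orthogonal projection onto the viewing plane
    (here: dropping z, since the rays run along the z axis) falls inside the
    first display region, given as (x0, x1, y0, y1) in volume coordinates."""
    x0, x1, y0, y1 = region
    mask = np.zeros(shape, dtype=bool)
    mask[x0:x1, y0:y1, :] = True          # the selected elements form a cuboid
    return mask

mask = first_region_mask(volume.shape, region=(16, 48, 16, 48))
first_image_data = volume[mask]           # intensities of the first image data
second_image_data = volume[~mask]         # e.g. the complement of the first region
```
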
  • the first transfer function is determined based on the first image data and the second transfer function is determined based on the second image data.
  • An optimal first TF and an optimal second TF may be generated using the method described in Ref. 1, for example, in order to compute an optimal first image and an optimal second image.
  • the system is further arranged to compute a transition image for displaying in a transition region of the display separating the first region of the display and the second region of the display, and: the first determination unit is further arranged to determine a transition image data within the image data for computing the transition image; - the fourth determination unit is further arranged to determine the transition region of the display for displaying the transition image; and the computation unit is further arranged to compute the transition image based on the first transfer function and on the second transfer function. To avoid artificial edges in the displayed image comprising the first image and the second image, i.e. to blend the first image and the second image.
  • the transition image for displaying in the transition region of the display separating the first region of the display and the second region of the display is computed based on the first TF and on the second TF.
  • the renderable value corresponding to a data element in the transition image data which contributes to the transition image for displaying in the transition region of the display, may be a normalized linear combination of the renderable value defined by the first TF and of the renderable value defined by the second TF.
  • the first coefficient of the normalized linear combination and the second coefficient of the normalized linear combination are functions of the location of the data element. At a location close to the first image data volume, the value of the first coefficient is close to one. At a location close to the second image data volume, the value of the first coefficient is close to zero.
  • the sum of the first coefficient of the normalized linear combination and the second coefficient of the normalized linear combination is equal to one.
  • an image acquisition apparatus comprises a system for computing an image based on image data for displaying in a display, the image comprising a first image and a second image, the system comprising: a first determination unit for determining a first image data within the image data for computing the first image and for determining a second image data within the image data for computing the second image; - a second determination unit for determining a first transfer function for computing the first image and for determining a second transfer function for computing the second image; a third determination unit for determining a viewing direction for computing the first image and for computing the second image; - a fourth determination unit for determining a first region of the display for displaying the first image and for determining a second region of the display for displaying the second image; and a computation unit for computing the first image and the second image, thereby computing the image.
  • a workstation comprises a system for computing an image based on image data for displaying in a display, the image comprising a first image and a second image, the system comprising: a first determination unit for determining a first image data within the image data for computing the first image and for determining a second image data within the image data for computing the second image; a second determination unit for determining a first transfer function for computing the first image and for determining a second transfer function for computing the second image; a third determination unit for determining a viewing direction for computing the first image and for computing the second image; a fourth determination unit for determining a first region of the display for displaying the first image and for determining a second region of the display for displaying the second image; and a computation unit for computing the first image and the second image, thereby computing the image.
  • a method of computing an image based on image data for displaying in a display comprises: a first determination step for determining a first image data within the image data for computing the first image and for determining a second image data within the image data for computing the second image; a second determination step for determining a first transfer function for computing the first image and for determining a second transfer function for computing the second image; a third determination step for determining a viewing direction for computing the first image and for computing the second image; a fourth determination step for determining a first region of the display for displaying the first image and for determining a second region of the display for displaying the second image; and a computation step for computing the first image and the second image, thereby computing the image.
  • a computer program product to be loaded by a computer arrangement comprises instructions for computing an image based on image data for displaying in a display, the image comprising a first image and a second image, the computer arrangement comprising a processing unit and a memory, the computer program product, after being loaded, providing said processing unit with the capability to carry out the following tasks: - a first determination step for determining a first image data within the image data for computing the first image and for determining a second image data within the image data for computing the second image; a second determination step for determining a first transfer function for computing the first image and for determining a second transfer function for computing the second image; a third determination step for determining a viewing direction for computing the first image and for computing the second image; a fourth determination step for determining a first region of the display for displaying the first image and for determining a second region of the display for displaying the second image; and a computation step for computing the first image and the second image, thereby computing the image.
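
Read together, the determination steps and the computation step form a pipeline. The toy sketch below wires such steps together for the simplest case discussed above, a display split into a top half and a bottom half viewed along the z axis; every function body is an illustrative stand-in for the corresponding step, not the method itself.

```python
import numpy as np

def determine_image_data(data, row):
    """Step 210 (toy version): split the volume into the data projected onto
    the top rows and the bottom rows of the viewing region."""
    return data[:row], data[row:]

def determine_tf(sub_data):
    """Step 220 (toy version): a contrast-stretching opacity TF derived from
    the sub-volume's own intensity distribution."""
    lo, hi = np.percentile(sub_data, [5, 95])
    return lambda I: np.clip((I - lo) / (hi - lo + 1e-9), 0.0, 1.0)

def render(sub_data, tf):
    """Step 250 (toy version): average the opacities along viewing rays that
    run parallel to the z axis (the viewing direction of step 230)."""
    return tf(sub_data).mean(axis=2)

data = np.random.default_rng(2).random((32, 32, 32))          # stand-in image data
first_data, second_data = determine_image_data(data, row=16)  # steps 210/240
image = np.concatenate([render(first_data, determine_tf(first_data)),
                        render(second_data, determine_tf(second_data))], axis=0)
```
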
  • FIG. 1 schematically shows a block diagram of an exemplary embodiment of the system
  • Fig. 2 shows a flowchart of an exemplary implementation of the method
  • Fig. 3 illustrates determining the first image data on the basis of the viewing direction and the first display region
  • Fig. 4 shows an exemplary image comprising a first image and a second image computed by an embodiment of the system
  • Fig. 5 schematically shows an exemplary embodiment of the image acquisition apparatus
  • Fig. 6 schematically shows an exemplary embodiment of the workstation.
  • the same reference numerals are used to denote similar parts throughout the Figures.
  • Fig. 1 schematically shows a block diagram of an exemplary embodiment of the system 100 for computing an image based on image data for displaying in a display, the image comprising a first image and a second image
  • the system 100 comprising: - a first determination unit 110 for determining a first image data within the image data for computing the first image and for determining a second image data within the image data for computing the second image; a second determination unit 120 for determining a first transfer function for computing the first image and for determining a second transfer function for computing the second image; a third determination unit 130 for determining a viewing direction for computing the first image and for computing the second image; a fourth determination unit 140 for determining a first region of the display for displaying the first image and for determining a second region of the display for displaying the second image; and a computation unit 150 for computing the first image and the second image, thereby computing the image.
  • the exemplary embodiment of the system 100 further comprises the following optional units: - a control unit 160 for controlling the workflow in the system 100; a user interface 165 for communicating with a user of the system 100; and a memory unit 170 for storing data.
  • the first input connector 181 is arranged to receive data coming in from a data storage means such as, but not limited to, a hard disk, a magnetic tape, a flash memory, or an optical disk.
  • the second input connector 182 is arranged to receive data coming in from a user input device such as, but not limited to, a mouse or a touch screen.
  • the third input connector 183 is arranged to receive data coming in from a user input device such as a keyboard.
  • the input connectors 181, 182 and 183 are connected to an input control unit 180.
  • the first output connector 191 is arranged to output the data to a data storage means such as a hard disk, a magnetic tape, a flash memory, or an optical disk.
  • the second output connector 192 is arranged to output the data to a display device.
  • the output connectors 191 and 192 receive the respective data via an output control unit 190.
  • the system 100 comprises a memory unit 170.
  • the system 100 is arranged to receive input data from external devices via any of the input connectors 181, 182, and 183 and to store the received input data in the memory unit 170. Loading the input data into the memory unit 170 allows quick access to relevant data portions by the units of the system 100.
  • the input data may comprise, for example, the image data.
  • the memory unit 170 may be embodied by devices such as, but not limited to, a Random Access Memory (RAM) chip, a Read Only Memory (ROM) chip, and/or a hard disk drive and a hard disk.
  • the memory unit 170 may be further arranged to store the output data.
  • the output data may comprise, for example, the computed first image and the computed second image.
  • the memory unit 170 is also arranged to receive data from and deliver data to the units of the system 100 comprising the first determination unit 110, the second determination unit 120, the third determination unit 130, the fourth determination unit 140, the computation unit 150, the control unit 160, and the user interface 165, via a memory bus 175.
  • the memory unit 170 is further arranged to make the output data available to external devices via any of the output connectors 191 and 192. Storing the data from the units of the system 100 in the memory unit 170 may advantageously improve the performance of the units of the system 100 as well as the rate of transfer of the output data from the units of the system 100 to external devices.
  • the system 100 may not comprise the memory unit 170 and the memory bus 175.
  • the input data used by the system 100 may be supplied by at least one external device, such as an external memory or a processor, connected to the units of the system 100.
  • the output data produced by the system 100 may be supplied to at least one external device, such as an external memory or a processor, connected to the units of the system 100.
  • the units of the system 100 may be arranged to receive the data from each other via internal connections or via a data bus.
  • the system 100 comprises a control unit 160 for controlling the workflow in the system 100.
  • the control unit may be arranged to receive control data from and provide control data to the units of the system 100.
  • the first determination unit 110 may be arranged to provide input control data "the first image data is ready" to the control unit 160 and the control unit 160 may be arranged to provide output control data "start determining the first TF" to the second determination unit 120, requesting the second determination unit 120 to start the process of determining the first TF.
  • a control function may be implemented in another unit of the system 100.
  • the system 100 comprises a user interface 165 for communicating with the user of the system 100.
  • the user interface 165 may be arranged to prompt the user for input and to accept a user input for determining the first image data and the second image data, for example.
  • the user interface 165 may further provide the user with a test image of the image data to facilitate the user-selection of the viewing direction.
  • the user interface may receive a user input for selecting a mode of operation of the system 100, such as a mode for using an algorithm for generating TFs.
  • the system 100 may comprise an input device such as a mouse or a keyboard and/or an output device such as a display.
  • the system 100 obtains image data via the input connector 181 and stores the image data in the memory 170.
  • the first determination unit 110 is arranged to determine a first image data and a second image data.
  • the second determination unit 120 is arranged to determine a first TF and a second TF.
  • the third determination unit 130 is arranged to determine a viewing direction for computing the first image and for computing the second image.
  • the system 100 further comprises the fourth determination unit 140 for determining a first region of the display for displaying the first image and for determining a second region of the display for displaying the second image.
  • the computation unit 150 is arranged to compute the image comprising the first image and the second image.
  • the first image and the second image computed by the computation unit 150 are stored in the memory unit 170.
  • the control unit 160 is further arranged to control the workflow in the system 100.
  • the control unit 160 is arranged to terminate computing the image after receiving control data "the image is computed" from the computation unit 150 or "terminate computing immediately" from the user interface 165.
  • Fig. 2 shows a flowchart of an exemplary implementation of the method 200 of computing an image based on image data for displaying in a display, the image comprising a first image and a second image.
  • in a first determination step 210, a first image data for computing the first image and a second image data for computing the second image are determined within the image data.
  • the method 200 proceeds to a second determination step 220 for determining a first transfer function for computing the first image and for determining a second transfer function for computing the second image.
  • the method 200 proceeds to a third determination step 230 for determining a viewing direction for computing the first image and for computing the second image.
  • the method 200 proceeds to a fourth determination step 240 for determining the first region of the display and the second region of the display.
  • the method 200 proceeds to a computation step 250 for computing the first image based on the first image data and on the first transfer function and for computing the second image based on the second image data and on the second transfer function, thereby computing the first image and the second image.
  • the method 200 terminates.
  • the first determination unit 110 is arranged to determine a first image data within the image data for computing the first image and to determine a second image data within the image data for computing the second image.
  • Each element (x, y, z, I) of the image data comprises a location (x, y, z), typically represented by three Cartesian coordinates x, y, z in an image data coordinate system, and an intensity I at this location.
  • the image data volume is defined by the locations (x, y, z) comprised in the image data elements (x, y, z, I).
  • the determination unit 110 is arranged to determine the image data volume. The user may be prompted via the user interface 165 to define a plane cutting through the image data volume.
  • the cutting plane divides the image data into a first image data and a second image data.
  • the first image data comprises image data elements (x, y, z, I) comprising locations (x, y, z) above the cutting plane.
  • the second image data comprises image data elements (x, y, z, I) comprising locations (x, y, z) below the cutting plane. Any image data element (x, y, z, I) located in the cutting plane may be assigned, for example, to the first image data.
  • the skilled person will understand that there are other ways of determining the first image data and the second image data, which can be employed by the system 100.
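
A sketch of the cutting-plane division described above, assuming the user-defined plane is given as n·(x, y, z) = d; the plane parameters and names are made up. Elements lying on the plane are assigned to the first image data, following the convention above.

```python
import numpy as np

shape = (64, 64, 64)
volume = np.random.default_rng(3).random(shape)   # intensities I at (x, y, z)

n = np.array([0.0, 0.0, 1.0])   # made-up cutting plane normal
d = 32.0                        # made-up plane offset: plane is n.(x, y, z) = d

x, y, z = np.indices(shape)
signed = x * n[0] + y * n[1] + z * n[2] - d

above = signed >= 0                    # on-plane elements join the first image data
first_locations = np.argwhere(above)   # (x, y, z) locations of the first image data
first_intensities = volume[above]      # ... and the intensities I at those locations
second_intensities = volume[~above]
```
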
  • the second determination unit 120 is arranged to determine a first transfer function for computing the first image and to determine a second transfer function for computing the second image.
  • the determination of TFs may be based on a user input.
  • the user interface 165 may be arranged to prompt the user to select the first TF and the second TF. The selection may be done by specifying the structures to be displayed in the first image and in the second image and/or the modality used to acquire the image data.
  • the system may further comprise a table with typical TFs used for displaying typical structures and/or used for displaying images computed from image data acquired by various modalities. The optimal TF is selected based on the user-specified structure and/or image modality.
  • the user interface 165 may provide the user with interactive means for constructing the first TF and the second TF based on displayed histograms and/or on test images computed from the first image data and from the second image data and displayed in the display.
  • the user may define how to assign opacities to intensities comprised in the image data.
  • the user may define how to assign other renderable properties, e.g. colors, to intensities comprised in the image data.
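
A common concrete form of such user-defined assignments is a piecewise-linear transfer function through a handful of control points, one for opacity and one per color channel. The control points below are purely illustrative:

```python
import numpy as np

# User-defined control points: (intensity, opacity). Values are illustrative.
op_points = np.array([[0.0, 0.0], [0.3, 0.0], [0.5, 0.8], [1.0, 1.0]])
# (intensity, R, G, B) control points for a color TF, also illustrative.
col_points = np.array([[0.0, 0, 0, 0], [0.5, 1.0, 0.6, 0.4], [1.0, 1.0, 1.0, 1.0]])

def opacity_tf(I):
    """Piecewise-linear opacity transfer function."""
    return np.interp(I, op_points[:, 0], op_points[:, 1])

def color_tf(I):
    """Piecewise-linear color transfer function, one channel at a time."""
    I = np.asarray(I)
    return np.stack([np.interp(I, col_points[:, 0], col_points[:, c])
                     for c in (1, 2, 3)], axis=-1)

alpha = opacity_tf(0.42)       # opacity assigned to intensity 0.42
rgb = color_tf([0.42, 0.9])    # colors assigned to two intensities
```
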
  • a method for interactive direct volume rendering is described by P. Hadover et al. in an article entitled "Fast Analysis of Intracranial Aneurysms Based on Interactive Direct Volume Rendering".
  • the second determination unit 120 is arranged to generate the first TF based on the first image data, using the algorithm described in Ref. 1.
  • Another algorithm for generating a TF is described by G. Kindlmann and J. W. Durkin in an article entitled "Semi-automatic generation of transfer functions for direct volume rendering", Proceedings of the 1998 IEEE Symposium on Volume Visualization, pages 79-86, October 19-20, 1998, hereinafter referred to as Ref. 2.
  • the algorithm described in Ref. 2 may be employed to generate the first TF instead of the algorithm described in Ref. 1.
  • the second determination unit 120 may be further arranged to generate the second TF based on the image data, using the algorithm described in Ref. 1 or in Ref. 2.
  • the skilled person will understand that the choice of the algorithm for determining the first TF and the second TF does not limit the scope of the claims.
  • the second determination unit 120 is arranged to generate the first TF based on the first image data, using the algorithm described in Ref. 1 or in Ref. 2, and to generate the second TF based on the second image data, using the algorithm described in Ref. 1 or in Ref. 2.
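
The sketch below is only loosely inspired by the boundary-emphasis idea of Ref. 2 and is much simpler than the published algorithm: it assigns opacity to an intensity in proportion to the average gradient magnitude observed at that intensity, so intensities occurring at material boundaries tend to become opaque. It is a heuristic stand-in, not the method of Ref. 1 or Ref. 2.

```python
import numpy as np

volume = np.random.default_rng(4).random((32, 32, 32))   # stand-in image data

gx, gy, gz = np.gradient(volume)
grad_mag = np.sqrt(gx**2 + gy**2 + gz**2)   # gradient magnitude per data element

# Average gradient magnitude per intensity bin: intensities that mostly occur
# at strong gradients (material boundaries) receive high opacity.
bins = np.linspace(0.0, 1.0, 65)
idx = np.digitize(volume.ravel(), bins) - 1
sums = np.bincount(idx, weights=grad_mag.ravel(), minlength=64)
counts = np.maximum(np.bincount(idx, minlength=64), 1)
opacity_per_bin = sums / counts
opacity_per_bin /= opacity_per_bin.max()    # normalize opacities to [0, 1]

def opacity_tf(I):
    return opacity_per_bin[np.clip(np.digitize(I, bins) - 1, 0, 63)]
```
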
  • the third determination unit 130 is arranged to determine a viewing direction for computing the first image and for computing the second image.
  • the viewing direction is defined relative to the image data volume, for example, in an image data coordinate system for defining locations of the image data elements.
  • the viewing direction is typically perpendicular to the viewing plane, which may be considered substantially identical with the display plane.
  • the user interface enables the user to orient the image data volume relative to the viewing direction and the viewing plane.
  • the third determination unit determines the viewing direction based on the orientation of the image data volume.
  • the user interface 165 may enable the user to define the viewing direction without reorienting the image data volume, e.g. by specifying the directional cosines of the viewing direction in the image data coordinate system.
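
For instance, an arbitrary user-specified vector can be normalized into directional cosines; a minimal sketch with a made-up input:

```python
import numpy as np

def viewing_direction(v):
    """Normalize a user-specified vector to directional cosines
    (cos a, cos b, cos c), which satisfy cos^2 a + cos^2 b + cos^2 c = 1."""
    v = np.asarray(v, dtype=float)
    return v / np.linalg.norm(v)

d = viewing_direction([1.0, 1.0, 2.0])   # made-up user input
assert np.isclose((d**2).sum(), 1.0)
```
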
  • the fourth determination unit 140 of the system 100 is arranged to determine the first region of the display and the second region of the display.
  • the first image displayed in the first display region visualizes in greater detail a part of the second image displayed in the second display region.
  • the second image may visualize a lateral surface of the brain and the first image may visualize an area on the parietal lobe surface.
  • the first region of the display may be square.
  • the fourth determination unit 140 is arranged to determine the location and the size of the first region of the display based on a user input.
  • the second region is rectangular. The location, the width, and the height of the second region are automatically determined by the fourth determination unit 140.
  • the first display region and the second display region may be rectangular.
  • the fourth determination unit is arranged to determine the location, the width, and the height of the first and of the second display region based on a user input. Alternatively, the location, the size, and the shape of the first region of the display and of the second region of the display may be automatically determined by the system 100.
  • the system 100 further comprises the computation unit 150 for computing the first image and the second image.
  • the first image is computed based on the first image data, the first TF, and the viewing direction.
  • the second image is computed based on the second image data, the second TF, and the viewing direction.
  • Section 2 of Ref. 1 describes computing an image based on a TF.
  • a location on a viewing ray extending substantially parallel to the viewing direction and passing through an image data volume contributes to the computed image intensity defined by the viewing ray. The contribution is inversely proportional to the absorption of light at the contributing location on the viewing ray. The absorption of light at the contributing location on the viewing ray is determined by the opacity at the contributing location on the viewing ray.
  • the opacity at the contributing location on the viewing ray is defined by an opacity-TF.
  • the opacity-TF assigns opacity to intensity at the contributing location on the viewing ray.
  • a source term describing the light added at the contributing location on the viewing ray may be specified using a source-TF.
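
A minimal front-to-back compositing sketch consistent with the model above: the opacity-TF determines how strongly each location absorbs, the source-TF the light it adds, and each contribution is attenuated by the opacity accumulated in front of it. The sampling and both TFs are illustrative stand-ins.

```python
import numpy as np

def composite_ray(intensities, opacity_tf, source_tf, step=1.0):
    """Front-to-back compositing of the samples along one viewing ray:
    the opacity-TF gives the absorption at each location, the source-TF
    the light added there."""
    color, transmittance = 0.0, 1.0
    for I in intensities:
        alpha = float(np.clip(opacity_tf(I) * step, 0.0, 1.0))
        color += transmittance * alpha * source_tf(I)  # attenuated by what lies in front
        transmittance *= 1.0 - alpha
        if transmittance < 1e-4:                       # early ray termination
            break
    return color

samples = np.linspace(0.2, 0.8, 50)                    # intensities along one ray
pixel = composite_ray(samples, opacity_tf=lambda I: 0.1 * I, source_tf=lambda I: I)
```
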
  • the first image data is determined based on the viewing direction and the first region of the display.
  • the viewing plane may be considered substantially identical with the display plane. If, for example, orthographic ray casting is employed for computing the image, then a data element belongs to the first image data when the orthogonal projection of the spatial location of said data element onto the display plane is comprised in the first region of the display.
  • the first region of the display may be rectangular and may be determined based on a user input. For example, the user may determine the width and the height of a first window comprising the first region of the display.
  • Fig. 3 illustrates the process of determining the first image data on the basis of the viewing direction and the first display region.
  • the viewing direction 315 is perpendicular to the viewing plane 305.
  • the locations of the first image data are comprised in the first cuboid 320.
  • the second image data may be determined based on the viewing direction and the second region of the display in a similar way.
  • the second region of the display may be also rectangular and may be determined based on a user input. For example, the user may determine the width and the height of a second window comprising the second region of the display. In an embodiment of the system 100, the second region of the display is determined based on the first region of the display.
  • the second region of the display may be a complement of the first region of the display in a viewing region of the display.
  • the viewing region of the display is a portion of the display, e.g. a window, reserved for displaying the image.
  • the first region of the display may be determined by the user using, for example, a square or a rectangular frame for selecting the first region of the display in the viewing region of the display. This embodiment further simplifies selecting the first region of the display and the second region of the display, and may also simplify selecting the first image data and the second image data.
  • the first region of the display for displaying the first image may be used for improving the view of a structure of interest in the second image.
  • Fig. 4 shows an exemplary image 400 comprising a first image 410 and a second image 420 computed by an embodiment of the system 100.
  • the image shows a human head computed from MRI data.
  • the first image 410 is surrounded by a rectangular frame 430.
  • the first TF for computing the first image is optimized based on the first image data to show the surface of the brain.
  • the first image 410 shows a detail of the surface of the brain.
  • the second region of the display is the complement of the first region of the display in the viewing region.
  • the second transfer function is optimized based on all the image data for visualizing the skin tissue.
  • the frame for determining the first region of the display may be moved to display another region of the brain surface computed from another first image data.
  • the frame may be resized to display a larger or a smaller region of the brain surface.
  • the user interface 165 may provide the user with a possibility to determine the scale of the first image, i.e. to zoom-in the first image or to zoom-out the first image.
  • the system 100 is further arranged to compute a transition image for displaying in a transition region of the display separating the first region of the display and the second region of the display, and: the first determination unit is further arranged to determine a transition image data within the image data for computing the transition image; - the fourth determination unit is further arranged to determine the transition region of the display for displaying the transition image; and the computation unit is further arranged to compute the transition image based on the first transfer function and on the second transfer function.
  • the transition image for displaying in the transition region of the display separating the first region of the display and the second region of the display is computed based on the first transfer function TF1 and on the second transfer function TF2.
  • the renderable value corresponding to a data element (x,y,z,I) contributing to the transition region of the display may be a normalized linear combination of the renderable value defined by TF1(I) and of the renderable value defined by TF2(I): c1(x,y,z)*TF1(I) + c2(x,y,z)*TF2(I).
  • the first coefficient c1(x,y,z) of the normalized linear combination and the second coefficient c2(x,y,z) of the normalized linear combination are functions of the location (x,y,z) of the data element (x,y,z,I).
  • c1(x,y,z) = d2(x,y,z)/[d1(x,y,z) + d2(x,y,z)], where d1(x,y,z) is a distance from the location (x,y,z) in the transition image data region to the first image data region and d2(x,y,z) is a distance from the location (x,y,z) in the transition image data region to the second image data region.
  • the distance functions may be Euclidean distance functions, for example.
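
The blend above transcribes directly into code. In this sketch each image data region is represented, for illustration only, by a small set of points, and the TFs are stand-ins; the coefficients satisfy c1 + c2 = 1 by construction.

```python
import numpy as np

def blended_value(p, I, tf1, tf2, region1_pts, region2_pts):
    """Renderable value c1*TF1(I) + c2*TF2(I) for a data element at location p
    in the transition image data, with c1 = d2/(d1 + d2), c2 = d1/(d1 + d2)."""
    d1 = np.min(np.linalg.norm(region1_pts - p, axis=1))   # Euclidean distances
    d2 = np.min(np.linalg.norm(region2_pts - p, axis=1))
    c1 = d2 / (d1 + d2)
    c2 = d1 / (d1 + d2)          # c1 + c2 == 1: the combination is normalized
    return c1 * tf1(I) + c2 * tf2(I)

region1 = np.array([[0.0, 0.0, 0.0]])    # made-up points of the first image data region
region2 = np.array([[10.0, 0.0, 0.0]])   # made-up points of the second image data region

v = blended_value(np.array([2.0, 0.0, 0.0]), I=0.5,
                  tf1=lambda I: 0.9 * I, tf2=lambda I: 0.1 * I,
                  region1_pts=region1, region2_pts=region2)
```
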
  • the transition region of the display may be automatically determined by the system 100 and the transition image data volume may be automatically determined by the system 100.
  • the transition region of the display may be determined based on a user input. For example, the user may determine the first region of the display, using a first frame.
  • the fourth determination unit 140 of the system 100 may be arranged to define a part of the viewing region of the display surrounding the first region of the display as a transition region of the display.
  • a system for computing an image, the image comprising a plurality of partial images, each partial image computed based on a partial image data and a partial TF and displayed in a partial region of the display, is also contemplated.
  • the skilled person will further understand that other embodiments of the system 100 are also possible. It is possible, among other things, to redefine the units of the system and to redistribute their functions.
  • the functions of the control unit 160 may be assigned to other units of the system 100.
  • the units of the system 100 may be implemented using a processor. Normally, their functions are performed under control of a software program product. During execution, the software program product is normally loaded into a memory, like a RAM, and executed from there. The program may be loaded from a background memory, like a ROM, hard disk, or magnetic and/or optical storage means, or may be loaded via a network like the Internet. Optionally, an application-specific integrated circuit may provide the described functionality.
  • the order of steps in the method 200 of computing an image comprising a first image and a second image is not mandatory; the skilled person may change the order of some steps or perform some steps concurrently using threading models, multiprocessor systems or multiple processes without departing from the concept as intended by the present invention.
  • determining the first display region, the second display region and the third display region must precede determining the first image data and the second image data.
  • two or more steps of the method 200 of the current invention may be combined into one step.
  • a step of the method 200 of the current invention may be split into a plurality of steps. Some steps of the method 200 are optional and may be omitted.
  • Fig. 5 schematically shows an exemplary embodiment of the image acquisition apparatus 500 employing the system 100, said image acquisition apparatus 500 comprising an image acquisition unit 510 connected via an internal connection with the system 100, an input connector 501, and an output connector 502.
  • This arrangement advantageously increases the capabilities of the image acquisition apparatus 500 by providing said image acquisition apparatus 500 with advantageous capabilities of the system 100 for computing an image based on image data.
  • examples of the image acquisition apparatus comprise, but are not limited to, a CT system, an X-ray system, an MRI system, a US system, a PET system, a SPECT system, and an NM system.
  • Fig. 6 schematically shows an exemplary embodiment of the workstation 600.
  • the workstation comprises a system bus 601.
  • a processor 610, a memory 620, a disk input/output (I/O) adapter 630, and a user interface (UI) 640 are operatively connected to the system bus 601.
  • a disk storage device 631 is operatively coupled to the disk I/O adapter 630.
  • a keyboard 641, a mouse 642, and a display 643 are operatively coupled to the UI 640.
  • the system 100 of the invention, implemented as a computer program, is stored in the disk storage device 631.
  • the workstation 600 is arranged to load the program and input data into memory 620 and execute the program on the processor 610.
  • the user can input information into the workstation 600, using the keyboard 641 and/or the mouse 642.
  • the workstation is arranged to output information to the display device 643 and/or to the disk 631.
  • the skilled person will understand that there are numerous other embodiments of the workstation 600 known in the art and that the present embodiment serves the purpose of illustrating the invention and must not be interpreted as limiting the invention to this particular embodiment.

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Geometry (AREA)
  • Computer Graphics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Measuring And Recording Apparatus For Diagnosis (AREA)
  • Ultra Sonic Diagnosis Equipment (AREA)
  • Apparatus For Radiation Diagnosis (AREA)
  • Magnetic Resonance Imaging Apparatus (AREA)
  • Image Processing (AREA)

Abstract

The invention relates to a system (100) for computing an image based on image data for displaying in a display, the image comprising a first image and a second image, the system comprising a first determination unit (110) for determining a first image data within the image data for computing the first image and for determining a second image data within the image data for computing the second image; a second determination unit (120) for determining a first transfer function for computing the first image and for determining a second transfer function for computing the second image; a third determination unit (130) for determining a viewing direction for computing the first image and for computing the second image; a fourth determination unit (140) for determining a first region of the display for displaying the first image and for determining a second region of the display for displaying the second image; and a computation unit (150) for computing the first image and the second image, thereby computing the image. The system (100) is thus capable of improving the quality of the displayed image by optimizing the first transfer function for computing the first image for displaying in the first display region and by optimizing the second transfer function for computing the second image for displaying in the second display region.

Description

Locally optimized transfer functions for volume visualization
FIELD OF THE INVENTION
The invention relates to the field of volume visualization of medical image data and more specifically to the use of transfer functions for direct volume visualization of medical image data.
BACKGROUND OF THE INVENTION
An implementation of image visualization based on direct volume rendering is described, for example, by T. He et al. in an article entitled "Generation of Transfer Functions with Stochastic Search Techniques" in Proceedings of IEEE Visualization, pages 227-234, 1996, hereinafter referred to as Ref. 1. In this article, the authors describe a stochastic search technique for generating a useful transfer function (TF). The TFs are used to make volume data visible by assigning renderable optical properties, such as opacity, color and/or emittance, to the image intensities comprised in the image data. Often, it is useful to include the gradient direction and/or the gradient value and assign renderable optical properties to a plurality of quantities, e.g. to a pair of quantities comprising an intensity and a gradient value. Using second order derivatives is also possible. In Ref. 1, the TF is represented by a set of coefficients of a linear combination of basis transfer functions. A genetic search algorithm optimizes a plurality of sets of coefficients, each set representing a candidate TF. The evaluation of search results may be based on a user evaluation or on an automatic evaluation of candidate TFs. The search terminates when a satisfactory TF is found. Additional operations such as smoothing the candidate TFs are also described. A problem with a TF generated in this way is that the quality of an image computed using the generated TF is often unsatisfactory.
SUMMARY OF THE INVENTION
A careful study of various TFs was carried out and showed that while the quality of a computed image displayed in a first region of the display might be good, the quality of the computed image displayed in a second region of the display was often poor, e.g. the details displayed in the second display region were fuzzy. The study further showed that by changing criteria of generating the TF to improve the quality of the image in the second region, the quality of the image displayed in the first region of the display often worsened. It would be advantageous to achieve an improvement of image quality in both the first display region and the second display region. To better address this concern, in an aspect of the invention, a system for computing an image based on image data for displaying in a display, the image comprising a first image and a second image, comprises: a first determination unit for determining a first image data within the image data for computing the first image and for determining a second image data within the image data for computing the second image; a second determination unit for determining a first transfer function for computing the first image and for determining a second transfer function for computing the second image; a third determination unit for determining a viewing direction for computing the first image and for computing the second image; a fourth determination unit for determining a first region of the display for displaying the first image and for determining a second region of the display for displaying the second image; and a computation unit for computing the first image and the second image, thereby computing the image.
The first determination unit is arranged to determine the first image data within the image data for computing the first image for displaying in the first region of the display, e.g. in the top half of the display, and to determine the second image data within the image data for computing the second image for displaying in the second region of the display, e.g. in the bottom half of the display. The second determination unit is arranged to determine the first TF for computing the first image and for determining the second TF for computing the second image. The first TF and the second TF may be determined, for example, based on a user input for selecting a TF from a plurality of predefined TFs for optimal visualization of the first image data and of the second image data, respectively. Optionally, the first TF and the second TF may be further optimized. Alternatively, the first TF and the second TF may be computed by the second determination unit on the basis of the first image data and of the second image data, respectively. A method of computing a TF based on image data is described in Ref. 1. Typically, the viewing direction is perpendicular to the viewing plane. The viewing plane may be considered substantially identical with the plane defined by the display. The viewing direction is defined relative to the image data. The viewing direction for computing the first image and for computing the second image is determined by the third determination unit, e.g. based on a user input for selecting an orientation of the image data relative to the viewing direction. The fourth determination unit is arranged to determine the first region of the display for displaying the first image and a second region of the display for displaying the second image. The computation unit is arranged to compute the first image based on the first image data, the viewing direction, and the first TF. The computation unit is further arranged to compute the second image based on the second image data, the viewing direction, and the second TF. Determining two independent TFs for computing the first image and the second image allows determining the first TF so as to be optimal for visualizing the first image data and the second TF so as to be optimal for visualizing the second image data. The system is thus capable of improving the quality of the displayed image in the first display region and in the second display region.
In an embodiment of the system, the first image data is determined based on the viewing direction and the first region of the display. The viewing direction is determined relative to the image data. In an embodiment, orthographic ray casting is employed for computing the image. The viewing rays of the orthographic ray casting are mutually parallel and are perpendicular to the viewing plane. The plane defined by the display may be considered substantially identical with the viewing plane. The viewing plane comprises the first region of the display. The first region of the display may be determined based on a user input. The user may interactively select the first region of the display using a viewing frame, which may be moved and sized within a viewing region. A data element belongs to the first image data when the orthogonal projection of the spatial location of said data element onto the viewing plane is comprised in the first region of the display. The first transfer function may then be optimally determined for viewing the first image computed from the first image data in the first region of the display. Thus, the embodiment offers a method for improving the quality of the first image. The second image data may be determined based on the viewing direction and the second region of the display in a similar way. The second region of the display may be also rectangular and may be determined based on a user input. For example, the user may determine the width and the height of a second window comprising the second region of the display. In an embodiment of the system, the second region of the display is determined based on the first region of the display. The second region of the display may be a complement of the first region of the display in a viewing region of the display. The viewing region of the display is a portion of the display, e.g. a window, reserved for displaying the image. The first region of the display may be determined by the user using, for example, a square or a rectangular frame for selecting the first region of the display in the viewing region of the display. Alternatively, the viewing area may be divided into the first region and the second region by a line or a curve. This embodiment further simplifies selecting the first display region and the second display region. In an embodiment of the system, the first transfer function is determined based on the first image data and the second transfer function is determined based on the second image data. An optimal first TF and an optimal second TF may be generated using the method described in Ref. 1, for example, in order to compute an optimal first image and an optimal second image. In an embodiment of the system, the system is further arranged to compute a transition image for displaying in a transition region of the display separating the first region of the display and the second region of the display, and: the first determination unit is further arranged to determine a transition image data within the image data for computing the transition image; - the fourth determination unit is further arranged to determine the transition region of the display for displaying the transition image; and the computation unit is further arranged to compute the transition image based on the first transfer function and on the second transfer function.
To avoid artificial edges in the displayed image comprising the first image and the second image, i.e. to blend the first image and the second image, the transition image for displaying in the transition region of the display separating the first region of the display and the second region of the display is computed based on the first TF and on the second TF. For example, the renderable value corresponding to a data element in the transition image data, which contributes to the transition image for displaying in the transition region of the display, may be a normalized linear combination of the renderable value defined by the first TF and of the renderable value defined by the second TF. The first coefficient of the normalized linear combination and the second coefficient of the normalized linear combination are functions of the location of the data element. At a location close to the first image data volume, the value of the first coefficient is close to one. At a location close to the second image data volume, the value of the first coefficient is close to zero. The sum of the first coefficient of the normalized linear combination and the second coefficient of the normalized linear combination is equal to one.
In a further aspect of the invention, an image acquisition apparatus comprises a system for computing an image based on image data for displaying in a display, the image comprising a first image and a second image, the system comprising: a first determination unit for determining a first image data within the image data for computing the first image and for determining a second image data within the image data for computing the second image; - a second determination unit for determining a first transfer function for computing the first image and for determining a second transfer function for computing the second image; a third determination unit for determining a viewing direction for computing the first image and for computing the second image; - a fourth determination unit for determining a first region of the display for displaying the first image and for determining a second region of the display for displaying the second image; and a computation unit for computing the first image and the second image, thereby computing the image.
In a further aspect of the invention, a workstation comprises a system for computing an image based on image data for displaying in a display, the image comprising a first image and a second image, the system comprising: a first determination unit for determining a first image data within the image data for computing the first image and for determining a second image data within the image data for computing the second image; a second determination unit for determining a first transfer function for computing the first image and for determining a second transfer function for computing the second image; a third determination unit for determining a viewing direction for computing the first image and for computing the second image; a fourth determination unit for determining a first region of the display for displaying the first image and for determining a second region of the display for displaying the second image; and a computation unit for computing the first image and the second image, thereby computing the image.
In a further aspect of the invention, a method of computing an image based on image data for displaying in a display, the image comprising a first image and a second image, comprises: a first determination step for determining a first image data within the image data for computing the first image and for determining a second image data within the image data for computing the second image; a second determination step for determining a first transfer function for computing the first image and for determining a second transfer function for computing the second image; a third determination step for determining a viewing direction for computing the first image and for computing the second image; a fourth determination step for determining a first region of the display for displaying the first image and for determining a second region of the display for displaying the second image; and a computation step for computing the first image and the second image, thereby computing the image.
In a further aspect of the invention, a computer program product to be loaded by a computer arrangement comprises instructions for computing an image based on image data for displaying in a display, the image comprising a first image and a second image, the computer arrangement comprising a processing unit and a memory, the computer program product, after being loaded, providing said processing unit with the capability to carry out the following tasks: - a first determination step for determining a first image data within the image data for computing the first image and for determining a second image data within the image data for computing the second image; a second determination step for determining a first transfer function for computing the first image and for determining a second transfer function for computing the second image; a third determination step for determining a viewing direction for computing the first image and for computing the second image; a fourth determination step for determining a first region of the display for displaying the first image and for determining a second region of the display for displaying the second image; and a computation step for computing the first image and the second image, thereby computing the image.
Modifications and variations of the image acquisition apparatus, of the workstation, of the method, and/or of the computer program product, which correspond to the described modifications and variations of the system, can be carried out by a skilled person on the basis of the present description.

The skilled person will appreciate that the method may be applied to three-dimensional (3D) image data generated by various acquisition modalities such as, but not limited to, conventional X-Ray, Computed Tomography (CT), Magnetic Resonance Imaging (MRI), Ultrasound (US), Positron Emission Tomography (PET), Single Photon Emission Computed Tomography (SPECT), and Nuclear Medicine (NM).
BRIEF DESCRIPTION OF THE DRAWINGS
These and other aspects of the invention will become apparent from and will be elucidated with respect to the implementations and embodiments described hereinafter and with reference to the accompanying drawings, wherein:

Fig. 1 schematically shows a block diagram of an exemplary embodiment of the system;
Fig. 2 shows a flowchart of an exemplary implementation of the method;
Fig. 3 illustrates determining the first image data on the basis of the viewing direction and the first display region;
Fig. 4 shows an exemplary image comprising a first image and a second image computed by an embodiment of the system;
Fig. 5 schematically shows an exemplary embodiment of the image acquisition apparatus; and
Fig. 6 schematically shows an exemplary embodiment of the workstation.

The same reference numerals are used to denote similar parts throughout the Figures.

DETAILED DESCRIPTION OF EMBODIMENTS
Fig. 1 schematically shows a block diagram of an exemplary embodiment of the system 100 for computing an image based on image data for displaying in a display, the image comprising a first image and a second image, the system 100 comprising:
- a first determination unit 110 for determining a first image data within the image data for computing the first image and for determining a second image data within the image data for computing the second image;
- a second determination unit 120 for determining a first transfer function for computing the first image and for determining a second transfer function for computing the second image;
- a third determination unit 130 for determining a viewing direction for computing the first image and for computing the second image;
- a fourth determination unit 140 for determining a first region of the display for displaying the first image and for determining a second region of the display for displaying the second image; and
- a computation unit 150 for computing the first image and the second image, thereby computing the image.
The exemplary embodiment of the system 100 further comprises the following optional units:
- a control unit 160 for controlling the workflow in the system 100;
- a user interface 165 for communicating with a user of the system 100; and
- a memory unit 170 for storing data.
In the exemplary embodiment of the system 100, there are three input connectors 181, 182 and 183 for the incoming data. The first input connector 181 is arranged to receive data coming in from a data storage means such as, but not limited to, a hard disk, a magnetic tape, a flash memory, or an optical disk. The second input connector 182 is arranged to receive data coming in from a user input device such as, but not limited to, a mouse or a touch screen. The third input connector 183 is arranged to receive data coming in from a user input device such as a keyboard. The input connectors 181, 182 and 183 are connected to an input control unit 180.
In the exemplary embodiment of the system 100, there are two output connectors 191 and 192 for the outgoing data. The first output connector 191 is arranged to output the data to a data storage means such as a hard disk, a magnetic tape, a flash memory, or an optical disk. The second output connector 192 is arranged to output the data to a display device. The output connectors 191 and 192 receive the respective data via an output control unit 190.
The skilled person will understand that there are many ways to connect input devices to the input connectors 181, 182 and 183 and the output devices to the output connectors 191 and 192 of the system 100. These ways comprise, but are not limited to, a wired and a wireless connection, a digital network such as, but not limited to, a Local Area Network (LAN) and a Wide Area Network (WAN), the Internet, a digital telephone network, and an analogue telephone network.
In the exemplary embodiment of the system 100, the system 100 comprises a memory unit 170. The system 100 is arranged to receive input data from external devices via any of the input connectors 181, 182, and 183 and to store the received input data in the memory unit 170. Loading the input data into the memory unit 170 allows quick access to relevant data portions by the units of the system 100. The input data may comprise, for example, the image data. The memory unit 170 may be embodied by devices such as, but not limited to, a Random Access Memory (RAM) chip, a Read Only Memory (ROM) chip, and/or a hard disk drive and a hard disk. The memory unit 170 may be further arranged to store the output data. The output data may comprise, for example, the computed first image and the computed second image. The memory unit 170 is also arranged to receive data from and deliver data to the units of the system 100 comprising the first determination unit 110, the second determination unit 120, the third determination unit 130, the fourth determination unit 140, the computation unit 150, the control unit 160, and the user interface 165, via a memory bus 175. The memory unit 170 is further arranged to make the output data available to external devices via any of the output connectors 191 and 192. Storing the data from the units of the system 100 in the memory unit 170 may advantageously improve the performance of the units of the system 100 as well as the rate of transfer of the output data from the units of the system 100 to external devices.
Alternatively, the system 100 may not comprise the memory unit 170 and the memory bus 175. The input data used by the system 100 may be supplied by at least one external device, such as an external memory or a processor, connected to the units of the system 100. Similarly, the output data produced by the system 100 may be supplied to at least one external device, such as an external memory or a processor, connected to the units of the system 100. The units of the system 100 may be arranged to receive the data from each other via internal connections or via a data bus.

In the exemplary embodiment of the system 100 shown in Fig. 1, the system 100 comprises a control unit 160 for controlling the workflow in the system 100. The control unit may be arranged to receive control data from and provide control data to the units of the system 100. For example, after the first image data is determined by the first determination unit 110, the first determination unit 110 may be arranged to provide input control data "the first image data is ready" to the control unit 160 and the control unit 160 may be arranged to provide output control data "start determining the first TF" to the second determination unit 120, requesting the second determination unit 120 to start the process of determining the first TF. Alternatively, a control function may be implemented in another unit of the system 100.
In the exemplary embodiment of the system 100 shown in Fig. 1, the system 100 comprises a user interface 165 for communicating with the user of the system 100. The user interface 165 may be arranged to prompt the user for input and to accept a user input for determining the first image data and the second image data, for example. The user interface 165 may further provide the user with a test image of the image data to facilitate the user-selection of the viewing direction. Optionally, the user interface may receive a user input for selecting a mode of operation of the system 100, such as a mode for using an algorithm for generating TFs. The skilled person will understand that more functions may be advantageously implemented in the user interface 165 of the system 100.
Optionally, in a further embodiment of the system 100, the system 100 may comprise an input device such as a mouse or a keyboard and/or an output device such as a display. The skilled person will understand that there exist a large number of input and output devices that can be advantageously comprised in the system 100.

In the embodiment of the system 100 depicted in Fig. 1, the system 100 obtains image data via the input connector 181 and stores the image data in the memory unit 170. The first determination unit 110 is arranged to determine a first image data and a second image data. The second determination unit 120 is arranged to determine a first TF and a second TF. The third determination unit 130 is arranged to determine a viewing direction for computing the first image and for computing the second image. The fourth determination unit 140 is arranged to determine a first region of the display for displaying the first image and a second region of the display for displaying the second image. The computation unit 150 is arranged to compute the image comprising the first image and the second image. The first image and the second image computed by the computation unit 150 are stored in the memory unit 170. The control unit 160 is arranged to control the workflow in the system 100; for example, the control unit 160 is arranged to terminate computing the image after receiving the control data "the image is computed" from the computation unit 150 or "terminate computing immediately" from the user interface 165.
Fig. 2 shows a flowchart of an exemplary implementation of the method 200 of computing an image based on image data for displaying in a display, the image comprising a first image and a second image. In a first determination step 210, a first image data for computing the first image and a second image data for computing the second image are determined within the image data. After the first determination step 210, the method 200 proceeds to a second determination step 220 for determining a first transfer function for computing the first image and for determining a second transfer function for computing the second image. After the second determination step 220, the method 200 proceeds to a third determination step 230 for determining a viewing direction for computing the first image and for computing the second image. After the third determination step 230, the method 200 proceeds to a fourth determination step 240 for determining the first region of the display and the second region of the display. After the fourth determination step 240, the method 200 proceeds to a computation step 250 for computing the first image based on the first image data and on the first transfer function and for computing the second image based on the second image data and on the second transfer function, thereby computing the image. After the computation step 250, the method 200 terminates.
The first determination unit 110 is arranged to determine a first image data within the image data for computing the first image and to determine a second image data within the image data for computing the second image. Each element (x, y, z, I) of the image data comprises a location (x, y, z), typically represented by three Cartesian coordinates x, y, z in an image data coordinate system, and an intensity I at this location. The image data volume is defined by the locations (x, y, z) comprised in the image data elements (x, y, z, I). The first determination unit 110 is arranged to determine the image data volume. The user may be prompted via the user interface 165 to define a plane cutting through the image data volume. The cutting plane divides the image data into a first image data and a second image data. The first image data comprises the image data elements (x, y, z, I) whose locations (x, y, z) lie above the cutting plane. The second image data comprises the image data elements (x, y, z, I) whose locations (x, y, z) lie below the cutting plane. Any image data element (x, y, z, I) located in the cutting plane may be assigned, for example, to the first image data. The skilled person will understand that there are other ways of determining the first image data and the second image data, which can be employed by the system 100.
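By way of illustration, the cutting-plane split can be expressed as a signed-distance test on each image data element. The following is a minimal numpy sketch; the array layout (N×4 rows of x, y, z, I), the plane representation, and the function name are assumptions made for this example, not details taken from the description above.

```python
import numpy as np

def split_by_plane(elements, normal, point_on_plane):
    """Split (x, y, z, I) rows into first/second image data by a cutting plane.

    elements:       (N, 4) array, columns x, y, z, I.
    normal:         (3,) plane normal; elements on the normal side (or exactly
                    in the plane) go to the first image data, consistent with
                    assigning in-plane elements to the first image data.
    point_on_plane: (3,) any point lying in the cutting plane.
    """
    normal = np.asarray(normal, dtype=float)
    signed = (elements[:, :3] - point_on_plane) @ normal
    first = elements[signed >= 0.0]   # above or exactly on the plane
    second = elements[signed < 0.0]   # below the plane
    return first, second

# Example: split a tiny 2x2x2 volume by the plane z = 0.5.
xs, ys, zs = np.meshgrid([0, 1], [0, 1], [0, 1], indexing="ij")
intensities = np.arange(8).reshape(2, 2, 2)
elements = np.stack([xs, ys, zs, intensities], axis=-1).reshape(-1, 4)
first, second = split_by_plane(elements, normal=(0, 0, 1), point_on_plane=(0, 0, 0.5))
```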
The second determination unit 120 is arranged to determine a first transfer function for computing the first image and to determine a second transfer function for computing the second image. In an embodiment of the system 100, the determination of TFs may be based on a user input. The user interface 165 may be arranged to prompt the user to select the first TF and the second TF. The selection may be done by specifying the structures to be displayed in the first image and in the second image and/or the modality used to acquire the image data. The system may further comprise a table of typical TFs used for displaying typical structures and/or for displaying images computed from image data acquired by various modalities. The optimal TF is selected based on the user-specified structure and/or image modality.

In a further embodiment of the system 100, the user interface 165 may provide the user with interactive means for constructing the first TF and the second TF based on displayed histograms and/or on test images computed from the first image data and from the second image data and displayed in the display. The user may define how to assign opacities to intensities comprised in the image data. Optionally, the user may define how to assign other renderable properties, e.g. colors, to intensities comprised in the image data. A method for interactive direct volume rendering is described by P. Hastreiter et al. in an article entitled "Fast Analysis of Intracranial Aneurysms Based on Interactive Direct Volume Rendering and CT-Angiography", published in Proc. MICCAI, pages 660-669, Springer, 1998. The skilled person will understand that there are many methods for determining TFs. The described methods are for illustrating the embodiments of the system 100 and do not limit the scope of the claims.

In an embodiment of the system 100, the second determination unit 120 is arranged to generate the first TF based on the first image data, using the algorithm described in Ref. 1. Another algorithm for generating a TF is described by G. Kindlmann and J. W. Durkin in an article entitled "Semi-automatic generation of transfer functions for direct volume rendering", Proceedings of the 1998 IEEE Symposium on Volume Visualization, pages 79-86, October 19-20, 1998, hereinafter referred to as Ref. 2. The algorithm described in Ref. 2 may be employed to generate the first TF instead of the algorithm described in Ref. 1. The second determination unit 120 may be further arranged to generate the second TF based on the image data, using the algorithm described in Ref. 1 or in Ref. 2. The skilled person will understand that the choice of the algorithm for determining the first TF and the second TF does not limit the scope of the claims.
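Whichever method is used, a TF is commonly realized as a lookup table from intensity to a renderable property. As an illustration of the interactive construction from user-placed control points, here is a minimal numpy sketch; the 8-bit intensity range and the control-point values are assumptions of this example only, not values from the patent.

```python
import numpy as np

def make_opacity_tf(control_points, i_max=255):
    """Build a lookup table mapping intensity 0..i_max to opacity in [0, 1]
    by piecewise-linear interpolation between (intensity, opacity) points."""
    intensities, opacities = zip(*sorted(control_points))
    return np.interp(np.arange(i_max + 1), intensities, opacities)

# Hypothetical control points emphasizing a mid-intensity tissue range.
tf1 = make_opacity_tf([(0, 0.0), (80, 0.0), (120, 0.8), (255, 0.1)])
opacity = tf1[130]  # opacity assigned to intensity 130
```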
In an embodiment of the system 100, the second determination unit 120 is arranged to generate the first TF based on the first image data, using the algorithm described in Ref. 1 or in Ref. 2, and to generate the second TF based on the second image data, using the algorithm described in Ref. 1 or in Ref. 2.
The third determination unit 130 is arranged to determine a viewing direction for computing the first image and for computing the second image. The viewing direction is defined relative to the image data volume, for example, in the image data coordinate system used for defining the locations of the image data elements. The viewing direction is typically perpendicular to the viewing plane, which may be considered substantially identical with the display plane. The user interface 165 enables the user to orient the image data volume relative to the viewing direction and the viewing plane. The third determination unit 130 then determines the viewing direction based on the orientation of the image data volume. Alternatively, the user interface 165 may enable the user to define the viewing direction without reorienting the image data volume, e.g. by specifying the directional cosines of the viewing direction in the image data coordinate system.
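For reference, the directional cosines of a direction are simply the components of its unit vector, i.e. the cosines of the angles the direction makes with the coordinate axes. A one-function sketch (the vector convention is an assumption of this example):

```python
import numpy as np

def directional_cosines(view_vector):
    """Return the directional cosines of a viewing direction, i.e. the
    cosines of the angles between the direction and the x, y, z axes."""
    v = np.asarray(view_vector, dtype=float)
    return v / np.linalg.norm(v)

# A viewing direction looking diagonally through the volume.
cosines = directional_cosines([1.0, 1.0, 1.0])  # approx. (0.577, 0.577, 0.577)
```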
The fourth determination unit 140 of the system 100 is arranged to determine the first region of the display and the second region of the display. In an embodiment, the first image displayed in the first display region visualizes in greater detail a part of the second image displayed in the second display region. For example, the second image may visualize a lateral surface of the brain and the first image may visualize an area on the parietal lobe surface. The first region of the display may be square. The fourth determination unit 140 is arranged to determine the location and the size of the first region of the display based on a user input. The second region is rectangular. The location, the width, and the height of the second region are automatically determined by the fourth determination unit 140.
In an embodiment of the system 100, the first display region and the second display region may be rectangular. The fourth determination unit 140 is arranged to determine the location, the width, and the height of the first and of the second display region based on a user input. Alternatively, the location, the size, and the shape of the first region of the display and of the second region of the display may be automatically determined by the system 100.
The system 100 further comprises the computation unit 150 for computing the first image and the second image. The first image is computed based on the first image data, the first TF, and the viewing direction. Similarly, the second image is computed based on the second image data, the second TF, and the viewing direction. Section 2 of Ref. 1 describes computing an image based on a TF. A location on a viewing ray extending substantially parallel to the viewing direction and passing through an image data volume contributes to the computed image intensity defined by the viewing ray. The contribution is attenuated by the light absorbed along the viewing ray between the contributing location and the viewing plane. The absorption of light at a location on the viewing ray is determined by the opacity at that location, which is defined by an opacity-TF: the opacity-TF assigns an opacity to the intensity at the location. A source term describing the light added at the contributing location on the viewing ray may be specified using a source-TF.
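For concreteness, a minimal front-to-back compositing sketch along a single viewing ray is shown below. The lookup-table conventions, the pre-sampled intensities, and the early-termination threshold are assumptions made for this illustration; they are not the exact formulation of Ref. 1.

```python
import numpy as np

def composite_ray(samples, opacity_tf, source_tf):
    """Front-to-back compositing of one viewing ray.

    samples:    1D array of intensities sampled along the ray,
                ordered from the viewing plane into the volume.
    opacity_tf: lookup table intensity -> opacity in [0, 1].
    source_tf:  lookup table intensity -> emitted light (source term).
    """
    color = 0.0          # accumulated light reaching the viewing plane
    transparency = 1.0   # fraction of light not yet absorbed in front
    for intensity in samples:
        alpha = opacity_tf[intensity]
        color += transparency * alpha * source_tf[intensity]
        transparency *= (1.0 - alpha)
        if transparency < 1e-4:  # early ray termination
            break
    return color

# Toy example: a constant-intensity ray with mild opacity per sample.
opacity_tf = np.linspace(0.0, 1.0, 256)   # opacity grows with intensity
source_tf = np.ones(256)                  # uniform source term
pixel = composite_ray(np.full(32, 64), opacity_tf, source_tf)
```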
The skilled person will understand that there are many methods described in the literature, including, but not limited to, orthographic ray casting, perspective ray casting, and ray tracing, which may be employed for computing the first image and the second image. The scope of the claims is independent of the method of computing the first image and the second image employed by the computation unit 150 of the system 100.
In an embodiment of the system 100, the first image data is determined based on the viewing direction and the first region of the display. The viewing plane may be considered substantially identical with the display plane. If, for example, orthographic ray casting is employed for computing the image, then a data element belongs to the first image data when the orthogonal projection of the spatial location of said data element onto the display plane is comprised in the first region of the display (see the sketch following this passage). The first region of the display may be rectangular and may be determined based on a user input. For example, the user may determine the width and the height of a first window comprising the first region of the display. Fig. 3 illustrates the process of determining the first image data on the basis of the viewing direction and the first display region. Fig. 3 shows an exemplary image data volume 300, an exemplary viewing plane 305 and an exemplary first display region 310 in the viewing plane 305. The viewing direction 315 is perpendicular to the viewing plane 305. The locations of the first image data are comprised in the first cuboid 320.

The second image data may be determined based on the viewing direction and the second region of the display in a similar way. The second region of the display may also be rectangular and may be determined based on a user input. For example, the user may determine the width and the height of a second window comprising the second region of the display.

In an embodiment of the system 100, the second region of the display is determined based on the first region of the display. The second region of the display may be the complement of the first region of the display in a viewing region of the display. The viewing region of the display is a portion of the display, e.g. a window, reserved for displaying the image. The first region of the display may be determined by the user using a square or rectangular frame, for example, for selecting the first region of the display in the viewing region of the display. This embodiment further simplifies selecting the first region of the display and the second region of the display, and may also simplify selecting the first image data and the second image data. Advantageously, the first region of the display for displaying the first image may be used for improving the view of a structure of interest in the second image.
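Under orthographic projection along the z axis of a viewing coordinate system, the membership test reduces to a 2D point-in-rectangle check of the projected location, which delimits the cuboid of Fig. 3. A minimal sketch; the rectangle and coordinate conventions are assumed for illustration.

```python
import numpy as np

def in_first_image_data(locations, region):
    """Orthographic membership test: a data element belongs to the first
    image data when its (x, y) projection falls in the first display region.

    locations: (N, 3) array of (x, y, z) in viewing coordinates, where the
               viewing direction is the z axis.
    region:    (x_min, y_min, x_max, y_max) of the first display region.
    """
    x_min, y_min, x_max, y_max = region
    x, y = locations[:, 0], locations[:, 1]
    return (x >= x_min) & (x <= x_max) & (y >= y_min) & (y <= y_max)

locations = np.array([[0.2, 0.3, 5.0], [0.9, 0.9, 1.0]])
mask = in_first_image_data(locations, region=(0.0, 0.0, 0.5, 0.5))
# mask == [True, False]: only the first element projects into the region.
```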
Fig. 4 shows an exemplary image 400 comprising a first image 410 and a second image 420 computed by an embodiment of the system 100. The image shows a human head computed from MRI data. The first image 410 is surrounded by a rectangular frame 430. The first TF for computing the first image is optimized based on the first image data to show the surface of the brain. Thus, the first image 410 shows a detail of the surface of the brain. The second region of the display is the complement of the first region of the display in the viewing region. The second transfer function is optimized based on all the image data for visualizing the skin tissue. The frame for determining the first region of the display may be moved to display another region of the brain surface, computed from another first image data. Optionally, the frame may be resized to display a larger or a smaller region of the brain surface. Optionally, the user interface 165 may allow the user to determine the scale of the first image, i.e. to zoom in on or zoom out of the first image.

In an embodiment of the system 100, the system 100 is further arranged to compute a transition image for displaying in a transition region of the display separating the first region of the display and the second region of the display, wherein:
- the first determination unit is further arranged to determine a transition image data within the image data for computing the transition image;
- the fourth determination unit is further arranged to determine the transition region of the display for displaying the transition image; and
- the computation unit is further arranged to compute the transition image based on the first transfer function and on the second transfer function.

To avoid sharp edges in the displayed image comprising the first image and the second image, the transition image for displaying in the transition region of the display is computed based on the first transfer function TF1 and on the second transfer function TF2. For example, the renderable value corresponding to a data element (x, y, z, I) contributing to the transition region of the display may be a normalized linear combination of the renderable value defined by TF1(I) and the renderable value defined by TF2(I): c1(x, y, z)*TF1(I) + c2(x, y, z)*TF2(I). The first coefficient c1(x, y, z) and the second coefficient c2(x, y, z) of the normalized linear combination are functions of the location (x, y, z) of the data element (x, y, z, I), and their sum is equal to one: c1(x, y, z) + c2(x, y, z) = 1. In an embodiment, c1(x, y, z) = d2(x, y, z)/[d1(x, y, z) + d2(x, y, z)], where d1(x, y, z) is the distance from the location (x, y, z) in the transition image data region to the first image data region and d2(x, y, z) is the distance from the location (x, y, z) in the transition image data region to the second image data region. The distance functions may be Euclidean distance functions, for example. Hence, at a location (x, y, z) adjacent to the first image data volume, i.e. when d1(x, y, z) = 0, c1(x, y, z) = 1 and c2(x, y, z) = 0; at a location adjacent to the second image data volume, i.e. when d2(x, y, z) = 0, c1(x, y, z) = 0 and c2(x, y, z) = 1; and at a location equally distant from the first image data volume and from the second image data volume, i.e. when d1(x, y, z) = d2(x, y, z), c1(x, y, z) = 0.5 and c2(x, y, z) = 0.5. The skilled person will understand that the described coefficients c1 and c2 illustrate the embodiment and do not limit the scope of the claims.
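The coefficient scheme above translates directly into code. A minimal sketch of the normalized linear combination; array-based TFs and scalar distances are assumptions of this illustration, and inside the transition region d1 + d2 > 0, so the division is well defined.

```python
import numpy as np

def blend_tf(intensity, d1, d2, tf1, tf2):
    """Normalized linear combination of two TFs in the transition region.

    d1, d2: distances of the location to the first and second image data
            regions; the weights c1 = d2/(d1+d2) and c2 = d1/(d1+d2)
            sum to one, as in the described embodiment.
    """
    c1 = d2 / (d1 + d2)
    c2 = d1 / (d1 + d2)
    return c1 * tf1[intensity] + c2 * tf2[intensity]

tf1 = np.linspace(0.0, 1.0, 256)  # hypothetical first TF
tf2 = np.linspace(1.0, 0.0, 256)  # hypothetical second TF
mid = blend_tf(100, d1=2.0, d2=2.0, tf1=tf1, tf2=tf2)  # equal weights of 0.5
```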
In an embodiment of the system 100, both the transition region of the display and the transition image data volume may be determined automatically by the system 100. In another embodiment, the transition region of the display may be determined based on a user input. For example, the user may determine the first region of the display, using a first frame. The fourth determination unit 140 of the system 100 may then be arranged to define a part of the viewing region of the display surrounding the first region of the display as the transition region of the display.
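One simple way to realize such a surrounding part is to expand the user-placed frame by a fixed margin and use the band between the two rectangles as the transition region. A minimal sketch; the rectangle convention and the margin value are assumptions of this example, not details from the description.

```python
def transition_band(first_region, margin):
    """Expand the first display region by `margin` pixels on every side;
    the band between the original and the expanded rectangle can serve
    as the transition region separating the first and second regions."""
    x_min, y_min, x_max, y_max = first_region
    return (x_min - margin, y_min - margin, x_max + margin, y_max + margin)

outer = transition_band((100, 100, 300, 300), margin=20)  # (80, 80, 320, 320)
```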
The skilled person will understand that a system for computing an image comprising a plurality of partial images, each partial image computed from a partial image data using a partial TF and displayed in a partial region of the display, is also contemplated. The skilled person will further understand that other embodiments of the system 100 are also possible. It is possible, among other things, to redefine the units of the system and to redistribute their functions. For example, in an embodiment of the system 100, the functions of the control unit 160 may be assigned to other units of the system 100. In a further embodiment of the system 100, there can be a plurality of computation units replacing the computation unit 150 of the previous embodiments of the system 100, each computation unit arranged to employ a different computation algorithm. The employed algorithm may be selected based on a user input.
The units of the system 100 may be implemented using a processor. Normally, their functions are performed under the control of a software program product. During execution, the software program product is normally loaded into a memory, like a RAM, and executed from there. The program may be loaded from a background memory, like a ROM, hard disk, or magnetic and/or optical storage means, or may be loaded via a network like the Internet. Optionally, an application-specific integrated circuit may provide the described functionality.

The order of steps in the method 200 of computing an image comprising a first image and a second image is not mandatory; the skilled person may change the order of some steps or perform some steps concurrently using threading models, multiprocessor systems or multiple processes without departing from the concept as intended by the present invention. For example, in the embodiment where the first image data is determined based on the first region of the display and on the viewing direction and where the second image data is determined based on the second region of the display and on the viewing direction, determining the first display region and the second display region must precede determining the first image data and the second image data. Optionally, two or more steps of the method 200 may be combined into one step. Optionally, a step of the method 200 may be split into a plurality of steps. Some steps of the method 200 are optional and may be omitted.
Fig. 5 schematically shows an exemplary embodiment of the image acquisition apparatus 500 employing the system 100, said image acquisition apparatus 500 comprising an image acquisition unit 510 connected via an internal connection with the system 100, an input connector 501, and an output connector 502. This arrangement advantageously increases the capabilities of the image acquisition apparatus 500 by providing said image acquisition apparatus 500 with advantageous capabilities of the system 100 for computing an image based on image data. Examples of image acquisition apparatus comprise, but are not limited to, a CT system, an X-ray system, an MRI system, a US system, a PET system, a SPECT system, and an NM system.
Fig. 6 schematically shows an exemplary embodiment of the workstation 600. The workstation comprises a system bus 601. A processor 610, a memory 620, a disk input/output (I/O) adapter 630, and a user interface (UI) 640 are operatively connected to the system bus 601. A disk storage device 631 is operatively coupled to the disk I/O adapter 630. A keyboard 641, a mouse 642, and a display 643 are operatively coupled to the UI 640. The system 100 of the invention, implemented as a computer program, is stored in the disk storage device 631. The workstation 600 is arranged to load the program and input data into the memory 620 and to execute the program on the processor 610. The user can input information into the workstation 600, using the keyboard 641 and/or the mouse 642. The workstation is arranged to output information to the display 643 and/or to the disk storage device 631. The skilled person will understand that there are numerous other embodiments of the workstation 600 known in the art and that the present embodiment serves the purpose of illustrating the invention and must not be interpreted as limiting the invention to this particular embodiment.
It should be noted that the above-mentioned embodiments illustrate rather than limit the invention and that those skilled in the art will be able to design alternative embodiments without departing from the scope of the appended claims. In the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The word "comprising" does not exclude the presence of elements or steps not listed in a claim or in the description. The word "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. The invention can be implemented by means of hardware comprising several distinct elements and by means of a programmed computer. In the system claims enumerating several units, several of these units can be embodied by one and the same item of hardware or software. The usage of the words first, second and third, et cetera does not indicate any ordering. These words are to be interpreted as names.


CLAIMS:
1. A system (100) for computing an image based on image data for displaying in a display, the image comprising a first image and a second image, the system comprising:
- a first determination unit (110) for determining a first image data within the image data for computing the first image and for determining a second image data within the image data for computing the second image;
- a second determination unit (120) for determining a first transfer function for computing the first image and for determining a second transfer function for computing the second image;
- a third determination unit (130) for determining a viewing direction for computing the first image and for computing the second image;
- a fourth determination unit (140) for determining a first region of the display for displaying the first image and for determining a second region of the display for displaying the second image; and
- a computation unit (150) for computing the first image and the second image, thereby computing the image.
2. A system (100) as claimed in claim 1, wherein the first image data is determined based on the viewing direction and the first region of the display.
3. A system (100) as claimed in claim 2, wherein the second region of the display is determined based on the first region of the display.
4. A system (100) as claimed in claim 1, wherein the first transfer function is determined based on the first image data and the second transfer function is determined based on the second image data.
5. A system (100) as claimed in claim 1, further arranged to compute a transition image for displaying in a transition region of the display separating the first region of the display and the second region of the display, wherein:
- the first determination unit (110) is further arranged to determine a transition image data within the image data for computing the transition image;
- the fourth determination unit (140) is further arranged to determine the transition region of the display for displaying the transition image; and
- the computation unit (150) is further arranged to compute the transition image based on the first transfer function and on the second transfer function.
6. An image acquisition apparatus (500) comprising a system (100) as claimed in claim 1.
7. A workstation (600) comprising a system (100) as claimed in claim 1.
8. A method (200) of computing an image based on image data for displaying in a display, the image comprising a first image and a second image, the method comprising:
- a first determination step (210) for determining a first image data within the image data for computing the first image and for determining a second image data within the image data for computing the second image;
- a second determination step (220) for determining a first transfer function for computing the first image and for determining a second transfer function for computing the second image;
- a third determination step (230) for determining a viewing direction for computing the first image and for computing the second image;
- a fourth determination step (240) for determining a first region of the display for displaying the first image and for determining a second region of the display for displaying the second image; and
- a computation step (250) for computing the first image and the second image, thereby computing the image.
9. A computer program product to be loaded by a computer arrangement, comprising instructions for computing an image based on image data for displaying in a display, the image comprising a first image and a second image, the computer arrangement comprising a processing unit and a memory, the computer program product, after being loaded, providing said processing unit with the capability to carry out the following tasks:
- determining a first image data within the image data for computing the first image and determining a second image data within the image data for computing the second image;
- determining a first transfer function for computing the first image and determining a second transfer function for computing the second image;
- determining a viewing direction for computing the first image and for computing the second image;
- determining a first region of the display for displaying the first image and determining a second region of the display for displaying the second image; and
- computing the first image and the second image, thereby computing the image.
