WO2008149291A1 - X-ray tool for 3d ultrasound - Google Patents

X-ray tool for 3d ultrasound

Info

Publication number
WO2008149291A1
WO2008149291A1 (PCT/IB2008/052166)
Authority
WO
WIPO (PCT)
Prior art keywords
image
ultrasound volume
rendering
ultrasound
composite image
Prior art date
Application number
PCT/IB2008/052166
Other languages
French (fr)
Inventor
Michael Vion
Allen David Snook
Rohit Garg
Original Assignee
Koninklijke Philips Electronics, N.V.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Koninklijke Philips Electronics, N.V. filed Critical Koninklijke Philips Electronics, N.V.
Priority to CN200880018715A priority Critical patent/CN101680950A/en
Priority to JP2010510935A priority patent/JP2010535043A/en
Priority to EP08763176A priority patent/EP2167991A1/en
Priority to RU2009149622/28A priority patent/RU2469308C2/en
Priority to US12/663,088 priority patent/US8466914B2/en
Publication of WO2008149291A1 publication Critical patent/WO2008149291A1/en

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 15/00 Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
    • G01S 15/88 Sonar systems specially adapted for specific applications
    • G01S 15/89 Sonar systems specially adapted for specific applications for mapping or imaging
    • G01S 15/8906 Short-range imaging systems; Acoustic microscope systems using pulse-echo techniques
    • G01S 15/8993 Three dimensional imaging systems
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/46 Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient
    • A61B 8/461 Displaying means of special interest
    • A61B 8/463 Displaying means of special interest characterised by displaying multiple images or images and diagnostic data on one display
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/46 Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient
    • A61B 8/461 Displaying means of special interest
    • A61B 8/466 Displaying means of special interest adapted to display 3D data
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/48 Diagnostic techniques
    • A61B 8/483 Diagnostic techniques involving the acquisition of a 3D volume of data
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S 7/52 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00
    • G01S 7/52017 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00 particularly adapted to short-range imaging
    • G01S 7/52053 Display arrangements
    • G01S 7/52057 Cathode ray tube displays
    • G01S 7/5206 Two-dimensional coordinated display of distance and direction; B-scan display
    • G01S 7/52063 Sector scan display
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S 7/52 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00
    • G01S 7/52017 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00 particularly adapted to short-range imaging
    • G01S 7/52053 Display arrangements
    • G01S 7/52057 Cathode ray tube displays
    • G01S 7/52074 Composite displays, e.g. split-screen displays; Combination of multiple images or of images and alphanumeric tabular information
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S 7/52 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00
    • G01S 7/52017 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00 particularly adapted to short-range imaging
    • G01S 7/52079 Constructional features
    • G01S 7/52084 Constructional features related to particular user interfaces
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 19/00 Manipulating 3D models or images for computer graphics
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2210/00 Indexing scheme for image generation or computer graphics
    • G06T 2210/41 Medical

Definitions

  • the present disclosure is directed to systems and methods for displaying medical diagnostic images and, more particularly, to ultrasound data display systems/apparatus featuring interactive, user-controlled image manipulation.
  • Ultrasound technology is increasingly being utilized to beneficial effect in a wide variety of clinical applications.
  • two-dimensional (2D), three-dimensional (3D), and/or motion-reproduction (e.g., color and power Doppler velocities, loops/sequenced images, etc.) ultrasonic technology is now regularly used for collecting data and generating diagnostic images with respect to most bodily structures and volumes, including: abdominal (e.g., kidney, gallbladder), interventional (e.g., breast-ductal carcinoma/RFA contrast), obstetric/prenatal, breast (e.g., breast fibroadenoma), transcranial (e.g., cerebral artery), cardiac (myocardial ischemia), pediatrics/neonatal, musculoskeletal (e.g., biceps tendonitis), vascular (e.g., femoral vein thrombosis, pre-clinical atherosclerosis), and/or small parts (e.g., testicular abnormalities)
  • Such ultrasound systems would be of little use, however, without complementary tools or systems (sometimes referred to as "quantification" tools or systems) designed and configured, for example, to receive and process such image data in an efficient and orderly fashion, and/or to store, distribute, and display such data at such times, and in such forms, as will be most convenient, useful and edifying to the intended viewer.
  • the intended viewer could be, for example, a technician tasked with using the ultrasound system to conduct a diagnostic test, a nurse or other health care worker processing or reviewing the test results, a physician attempting to develop a diagnosis based on such results, or a sick patient attempting to learn more about their medical condition.
  • Philips Electronics' QLAB™ Advanced Quantification software product provides a user with the ability to analyze image data either on the ultrasound system itself, or on a separate personal computer or workstation (so-called 'off-cart' use). More particularly, Philips' QLAB™ features a user interface via which an operator is allowed, for example, to adjust one or more visualization settings associated with an ultrasound volume (e.g., brightness, transparency, thresholding, etc.). QLAB™ further features a sculpting tool and an eraser to enable an operator or practitioner to crop data from a visually-displayed ultrasound volume.
  • U.S. Patent No. 6,975,335 to Watanabe discloses magnified or reduced areas of a display that are easily distinguished by shades of color and pattern density corresponding to the magnification or reduction ratio of areas of the display.
  • the Watanabe '335 patent describes a method for linking the displaying of a diagram to a pointing device so that a displayed portion is magnified when the area is pointed to by the pointing device.
  • U.S. Patent No. 6,803,931 to Roman et al. discloses a graphical user interface (GUI) corresponding to an image display window through which a single image or a stream of images or video frames are displayed.
  • the GUI includes a zoom control box having an inner region positioned within an outer region, wherein the size of the inner region relative to the outer region represents the magnification of the portion of the image being displayed within the image display window.
  • the magnification of the image being displayed can be increased or decreased, respectively, by positioning a cursor within the inner region and clicking a cursor control device, or by positioning the cursor outside of the inner region but inside of the outer region and clicking the cursor control device.
  • the size of the inner region relative to the outer region is changed accordingly.
  • the portion of the image being displayed within the image display window is changed by clicking and dragging the inner region to the desired position within the outer region using the cursor control device.
  • U.S. Patent No. 6,633,305 to Sarfeld discloses an image editing system that uses a loupe cursor to magnify a selected area of a basic image displayed on a display device.
  • the system generates basic image data representing the selected area of the basic image, and generates magnified image data by magnifying the selected basic image data. It displays within the loupe cursor a loupe image based on the magnified image data.
  • When a user editing signal is received for editing the loupe image, the system generates modified image data, and dynamically modifies the loupe image displayed within the loupe cursor based on the modified image data.
  • An ultrasonic image scanning system for scanning an organic object is disclosed in U.S. Patent Application Publication No. 2006/0111634 by Wu that includes a display system for displaying a scanned image of the organic object in a plurality of display modes.
  • the display system is operative to simultaneously display, as respective upper and lower images shown on a screen, a so-called 'zoom-out' image including a graphic border for defining a so-called zoom region of interest (ZROI) on the zoom-out image, and a so-called 'zoom-in' image containing a magnified version of such ZROI.
  • the zoom-in updates in real time as the user uses a trackball to pan and/or resize the ZROI.
  • a system and method are disclosed for rendering an ultrasound volume including generating an external image of an ultrasound volume, wherein a fractional part of the external image corresponds to a fractional portion of the ultrasound volume, and generating a composite image of the ultrasound volume using the external image, wherein the fractional part of the external image is replaced with an internal image of the ultrasound volume corresponding to the fractional portion.
  • the internal image may be generated by changing a value of a visualization parameter used to generate the external image to a value more suitable for rendering an internal image.
  • the ultrasound volume may include an organic structure, wherein the external image depicts an outer surface of the organic structure, and the internal image depicts a vascularity of the organic structure, such that the composite image simultaneously depicts both an outer surface and the vascularity of the organic structure. Additional features, functions and benefits of the disclosed systems and methods for rendering an ultrasound volume will be apparent from the description which follows, particularly when read in conjunction with the appended figures.
  • FIGURE 1 illustrates a screen display of an external image of an ultrasound volume according to the present disclosure
  • FIGURE 2 illustrates a screen display of a composite image of the Figure 1 ultrasound volume generated using the Figure 1 external image, a part of the external image having been replaced with an internal image of the ultrasound volume according to the present disclosure
  • FIGURE 3 illustrates a screen display of a modified composite image of the Figure 1 ultrasound volume, also generated using the Figure 1 external image, showing a different portion of the ultrasound volume in internal view in response to a user-directed change according to the present disclosure
  • FIGURE 4 illustrates a screen display of an external image of another ultrasound volume according to the present disclosure
  • FIGURE 5 illustrates a screen display of a composite image of the Figure 4 ultrasound volume generated using the Figure 4 external image, a part of the external image having been replaced with an internal image of the ultrasound volume according to the present disclosure
  • FIGURE 6 illustrates a screen display of a modified composite image of the Figure 4 ultrasound volume, also generated using the Figure 4 external image, showing a different portion of the ultrasound volume in internal view in response to a user-directed change according to the present disclosure.
  • a 3D visualization tool for rendering ultrasound volumes, wherein an external image of an ultrasound volume is generated, and a composite image of the ultrasound volume is generated using the external image.
  • a fractional part of the external image corresponding to a fractional portion of the ultrasound volume may be replaced in the composite image with an internal image of the ultrasound volume corresponding to the same fractional portion of the ultrasound volume.
  • Such functionality enables a user or viewer to obtain a localized view into the ultrasound volume without changing the overall visualization parameter values.
  • a composite image of an ultrasound volume including an organic structure is provided including outer surfaces of much of the organic structure, as well as interior details associated with a selected portion of the organic structure.
  • a portion of the screen image 100 includes an external image 102 of an ultrasound volume 104.
  • the ultrasound volume 104 may contain an organic structure 106.
  • An outer surface of the organic structure is at least partially shown in the external image 102.
  • the external image 102 may be generated by applying a set of visualization parameters to the ultrasound volume 104 for highlighting or emphasizing externally-oriented and/or externally-disposed aspects of the ultrasound volume 104 and/or the organic structure 106 contained therewithin.
  • such set of visualization parameters may include one or more of such visualization parameters as Brightness, Transparency, Thresholding, Lighting, Smoothing, Gray Map, Chroma Map, and Ray Cast Method. Other visualization parameters are possible.
  • a portion of the screen image 200 includes a composite image 202 of the ultrasound volume 104.
  • the composite image may be generated using the external image 102 of Figure 1, wherein a fractional part of the external image 102 is replaced with an internal image 204 of a corresponding fractional portion of the ultrasound volume 104.
  • a vasculature 206 of the organic structure 106 is shown.
  • the vasculature 206 of the organic structure 106 is not necessarily shown in the external image 102 of Figure 1.
  • the fractional part of the Figure 1 external image 102 replaced by the internal image 204 in Figure 2 may be defined by a visible border 208, which may be depicted as part of the screen image 200.
  • the screen image 200 may include no such border 208, or a differently-appearing border.
  • the border 208, shown in solid line in Figure 2, may instead be shown in dashed or ghost line form, and/or may have another shape, e.g., depending on the shape of the internal image 204
  • the composite image 202 is typically generated by applying the visualization parameters and parameter values associated with the external image 102 of Figure 1.
  • the composite image 202 is generated by applying different visualization parameters and/or parameter values than those associated with the external image 102.
  • a set of visualization parameters applied to the ultrasound volume 104 to generate the internal image 204 may include one or more of such visualization parameters as Brightness, Transparency, Thresholding, Lighting, Smoothing, Gray Map, Chroma Map, and Ray Cast Method, including a set of such visualization parameters similar to or identical to that used to generate the external image 102 of the ultrasound volume 104, wherein one or more of such visualization parameters in the case of the internal image 204 is associated with a different value than that associated with a corresponding visualization parameter in the case of the external image 102.
  • Such differences may contribute to the generation of an internal (e.g., as opposed to an external) image of the ultrasound volume 104.
  • the values of the visualization parameters associated with the internal image 204 may be selected so as to highlight or emphasize internally-oriented and/or internally-disposed aspects of the ultrasound volume 104 and/or the organic structure 106 contained within the corresponding fractional portion of the ultrasound volume 104. Accordingly, within the internal image 204, the composite image 202 appears at least to some extent different from the corresponding fractional part of the Figure 1 external image 102, which is absent from Figure 2.
  • a value of a single visualization parameter associated with the internal image 204 that is at least incrementally different than the value of a corresponding visualization parameter associated with the external image 102 may be sufficient to generate and/or display a composite image (e.g., the composite image 202) of an ultrasound volume (e.g., the ultrasound volume 104) from an external image (e.g., the external image 102), wherein a fractional part of the external image is replaced by an internal image (e.g., the internal image 204) of the ultrasound volume.
  • the same parameter and parameter value may be used to generate either such image where that parameter does not necessarily correlate to suitability or non-suitability with respect to generating an internal image of a given ultrasound volume.
  • such parameters may in some circumstances include Image Magnification.
  • the internal image 204 can be generated by applying parameters to the ultrasound volume 104 that are set in advance, and/or by default. Examples of such preset parameters and/or parameter values may include X-Ray, average, and minimum. Upon or after the composite image 202 being displayed to the viewer or user, he or she may elect to apply the same visualization parameters and/or parameter values to the entire ultrasound volume 104 that were applied to the fractional portion thereof to which the internal view 204 corresponds. In such circumstances, an internal image (not separately shown) of the ultrasound volume 104 may be generated that, in the context of the composite image 202, is substantially coextensive with, and therefore more or less fully replaces, the external image 102.
  • a computer mouse (not separately shown) (e.g., a rotatable wheel thereof, or a click-and-drag feature associated therewith), and/or any other suitable indicating, pointing, or cursor movement device (not separately shown) may be used to execute so-called "on-the-fly" modifications or adjustments to values associated with one or more such visualization parameters associated with the internal image 204 (e.g., changes to a Transparency visualization setting) to emphasize or highlight internally-disposed features within the corresponding fractional portion of the ultrasound volume 104.
  • a group of different types of such modifications or adjustments may be implemented simultaneously, e.g., via a predetermined mouse click or series of mouse clicks, a particular software menu command (not separately shown), and/or a dedicated hardware switch (not separately shown).
  • Other implementation techniques with respect to modifications or adjustments to the visualization parameters used to generate the internal image 204 are possible.
  • the internal image 204 (e.g., and/or the border 208 associated therewith) can be of a different size (e.g., of a larger or smaller absolute size than that shown in Figure 2), and/or of a different size relative to that of the overall composite image 202, that of the screen image 200, and/or that of the external image 102, in accordance with embodiments of the present disclosure.
  • the internal image 204 (and/or the border 208) may be of a different shape than rectangular/square or polygonal (e.g., a curved and/or circular shape, an irregular shape, etc.), in accordance with embodiments of the present disclosure.
  • a computer mouse (not separately shown)(e.g., a rotatable wheel thereof, or a click-and-drag feature associated therewith), and/or any other suitable indicating, pointing, or cursor movement device may be used to execute so-called “on-the-fly” modifications or adjustments to the size or shape of the internal image 204 (and/or of the border 208).
  • the size and/or shape of the internal image 204 (and/or of the border 208) may be changed to, from, or between any one of a number of predetermined sizes or shapes, including to, from, or between one or more customized shapes corresponding to that of a particular organic structure or volume (e.g., as viewed from a particular perspective or vantage point) or a separately identifiable portion thereof, e.g., via a predetermined mouse click or series of mouse clicks, a particular software menu command (not separately shown), and/or a dedicated hardware switch (not separately shown).
  • Other implementation techniques with respect to modifications or adjustments to the size and/or shape of the internal image 204 (and/or of the border 208) are possible.
  • visualization settings applied to the ultrasound volume 104 to generate the external image 102 can be selected so as (and/or generally tend) to afford the ultrasound volume 104 an opaque and/or three-dimensional overall appearance, such that the external image 102 will appear to show an outer surface or wall(s) of a particular organic structure (e.g., a heart or other bodily organ).
  • modifications or adjustments to such visualization settings can be selected so as to produce within the internal image 204 a visual effect (e.g., akin to an X-ray effect) for viewing one or more structures (e.g., cardiac vascularity, such as a coronary artery) and/or functions (e.g., valve operation, blood flow, etc.) ordinarily understood and/or expected by a practitioner or technician to be disposed (or to take place) within the ultrasound volume 104, rather than on its periphery or outer surface.
  • the compound visualization parameters applied to the ultrasound volume 104 may be determined or selected to give the internal image 204 (and/or the border 208) the appearance of a 'window' with respect to the inner structural and/or functional constituents of an organic structure contained within the ultrasound volume 104, while at the same time, and in the same image, at least some portion of the external or outer wall structures of such organic structure is also visible (e.g., in those portions of the composite image 202 appearing outside the internal image 204 ).
  • Applying compound visualization settings to the ultrasound volume 104 in such a way can be advantageous, at least insofar as it can provide a practitioner or technician with important and powerful visual cues and/or contextual information with respect to the external (e.g., outer wall) structure of a bodily structure in the same image in which he or she may also (e.g., simultaneously) observe important structural and/or functional details associated with interior regions of such bodily structure.
  • the image magnification or zoom parameters associated with the external image 102 of the ultrasound volume 104 may be maintained and/or kept constant, even as other visualization settings are modified or adjusted to generate the internal image 204 thereof.
  • applying the same magnification or zoom settings to the ultrasound volume 104 both within and outside the fractional portion thereof associated with the internal image 204 can be advantageous, insofar as such an arrangement may tend to increase and/or maximize the amount of accurate (e.g., clinically useable and/or reliable) visual or image information appearing at or near the border 208.
  • such window can be sharply defined (e.g., the window can appear to be relatively sharply 'cut' from surrounding external structure) so as to reduce and/or minimize any loss of visual detail associated with rendering such inner workings.
  • the ultrasound volume 104 to which the above-discussed external, internal, and/or compound set of visualization parameters are applied can be any one or more of the following ultrasound volume types: an echo volume, a color volume, and/or an echo+color volume. Still further ultrasound volume types are possible.
  • a position of the internal image 204 (and/or of the border 208) within the screen image 200, and/or within the composite image 202, may be changed in accordance with the present invention to correspond to a different fractional portion of the ultrasound volume 104, and/or to another fractional part of the Figure 1 external image 102.
  • a computer mouse (not separately shown)(e.g., a rotatable wheel thereof, or a click-and-drag feature associated therewith), and/or any other suitable indicating, pointing, or cursor movement device may be used to so move the internal image 204 (and/or the border 208) from the position within the screen image 200 or within the composite image 202 shown in Figure 2 to a new position, e.g., as shown in Figure 3 with respect to a new screen image 300, and in a new composite image 302.
  • Other techniques for so moving the interior image 204 (and/or the border 208) are possible.
  • the same visualization parameters and/or parameter values discussed above and associated with generating an external image may now be applied to the corresponding fractional portion of ultrasound volume 104, thereby restoring to the now new composite image 302 the previously replaced fractional part of the external image 102.
  • to the extent the new position of the internal image 204 within the composite image 302 is sufficiently close to the earlier position therein, some overlap may exist as between the previous and new corresponding fractional parts of the external image 102. In such circumstances, the previously replaced fractional part of the external image 102 may only be partly restored by virtue of such movement of the internal image 204.
  • a screen image 400 associated with acoustic data from an ultrasound system is displayed in accordance with an exemplary embodiment of the present disclosure.
  • a portion of the screen image 400 includes an external image 402 of another ultrasound volume 404.
  • the ultrasound volume 404 may contain an organic structure 406.
  • An outer surface of the organic structure is at least partially shown in the external image 402.
  • the external image 402 may be generated by applying a set of visualization parameters to the ultrasound volume 404 for highlighting or emphasizing externally-oriented and/or externally-disposed aspects of the ultrasound volume 404 and/or the organic structure 406 contained therewithin.
  • a portion of the screen image 500 includes a composite image 502 of the ultrasound volume 404.
  • the composite image may be generated using the external image 402 of Figure 4, wherein a fractional part of the external image 402 is replaced with an internal image 504 of a corresponding fractional portion of the ultrasound volume 404.
  • a vasculature 506 of the organic structure 406 is shown within the internal image 504.
  • the same vasculature 506 of the organic structure 406 may not be shown, or at least may not be as effectively shown, or as easily visible, in the external image 402 of Figure 4.
  • a position of the internal image 504 (and/or of the border 508) within the screen image 500, and/or within the composite image 502, may be changed in accordance with the present invention to correspond to a different fractional portion of the ultrasound volume 404, and/or to another fractional part of the Figure 4 external image 402.
  • a computer mouse (not separately shown)(e.g., a rotatable wheel thereof, or a click-and-drag feature associated therewith), and/or any other suitable indicating, pointing, or cursor movement device may be used to so move the internal image 504 (and/or the border 508) from the position within the screen image 500 or within the composite image 502 shown in Figure 5 to a new position, e.g., as shown in Figure 6 with respect to a new screen image 600, and in a new composite image 602.
  • Other techniques for so moving the interior image 504 (and/or the border 508) are possible.
  • Embodiments of the present disclosure include a computer system (not shown, e.g., including a processor, related accessories such as a computer mouse and/or a trackball, and a computer monitor or other display) and a related algorithm or software product operable via such computer system and/or by said processor for permitting a user of such computer system to display the screen images 100, 200, 300, 400, 500, and 600, including the various images depicted therein of the respective ultrasound volumes 104, 404, and to manipulate such images in the manner described herein, including but not limited to achieving the above-described 'window' or 'X-ray' visual effect.
  • hardware-related embodiments of the present disclosure may include a personal computer or workstation, including a computer display or monitor, e.g., such as are presently used to run and/or utilize the above-discussed Philips Electronics QLAB™ Advanced Quantification software product off-cart, or a computer-implemented ultrasound system (e.g., on-cart) such as the above-discussed Philips Electronics iU22, iE33, and/or HD11 XE Ultrasound Systems.
  • Software-related embodiments of the present disclosure may include a quantification software product including all relevant features and aspects of the above-discussed Philips Electronics QLAB™ software product, with additional code, for example, and/or one or more appropriate software 'plug-ins' for implementing additional features and aspects as disclosed herein.
  • Embodiments of the present disclosure may further include a computer-readable medium including a computer-executable algorithm for implementing the ultrasound imaging features and functions disclosed herein.
  • such an algorithm may include appropriate computer-executable code.

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • General Physics & Mathematics (AREA)
  • Medical Informatics (AREA)
  • Surgery (AREA)
  • Veterinary Medicine (AREA)
  • Public Health (AREA)
  • General Health & Medical Sciences (AREA)
  • Animal Behavior & Ethology (AREA)
  • Molecular Biology (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Biophysics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Pathology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Acoustics & Sound (AREA)
  • Computer Graphics (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Theoretical Computer Science (AREA)
  • Software Systems (AREA)
  • Computer Hardware Design (AREA)
  • Ultra Sonic Diagnosis Equipment (AREA)

Abstract

A system and method are disclosed for rendering an ultrasound volume. An external image of an ultrasound volume is generated. A fractional part of the external image corresponds to a fractional portion of the ultrasound volume. A composite image of the ultrasound volume is generated using the external image, wherein the fractional part of the external image is replaced with an internal image of the corresponding fractional portion of the ultrasound volume. The internal image may be generated by changing a value of a visualization parameter used to generate the external image to a value more suitable for rendering an internal image. The ultrasound volume may include an organic structure, wherein the external image depicts an outer surface of the organic structure, and the internal image depicts a vascularity of the organic structure, such that the composite image simultaneously depicts both an outer surface and the vascularity of the organic structure.

Description

X-RAY TOOL FOR 3D ULTRASOUND
The present disclosure is directed to systems and methods for displaying medical diagnostic images and, more particularly, to ultrasound data display systems/apparatus featuring interactive, user-controlled image manipulation. Ultrasound technology is increasingly being utilized to beneficial effect in a wide variety of clinical applications. For example, two-dimensional (2D), three-dimensional (3D), and/or motion-reproduction (e.g., color and power Doppler velocities, loops/sequenced images, etc.) ultrasonic technology is now regularly used for collecting data and generating diagnostic images with respect to most bodily structures and volumes, including: abdominal (e.g., kidney, gallbladder), interventional (e.g., breast-ductal carcinoma/RFA contrast), obstetric/prenatal, breast (e.g., breast fibroadenoma), transcranial (e.g., cerebral artery), cardiac (myocardial ischemia), pediatrics/neonatal, musculoskeletal (e.g., biceps tendonitis), vascular (e.g., femoral vein thrombosis, pre-clinical atherosclerosis), and/or small parts (e.g., testicular abnormalities). As the demand on the part of doctors and their patients for ultrasound services and diagnostic data has increased, the market for related equipment has likewise grown. Modern embodiments of such equipment, such as the iU22, iE33, and HD11 XE Ultrasound Systems manufactured by Philips Electronics, can be enormously sophisticated tools for generating optimized image data containing high quality, undistorted acoustic information, often in real time, and commonly in large quantity.
Such ultrasound systems would be of little use, however, without complementary tools or systems (sometimes referred to as "quantification" tools or systems) designed and configured, for example, to receive and process such image data in an efficient and orderly fashion, and/or to store, distribute, and display such data at such times, and in such forms, as will be most convenient, useful and edifying to the intended viewer. Depending on the particular context, the intended viewer could be, for example, a technician tasked with using the ultrasound system to conduct a diagnostic test, a nurse or other health care worker processing or reviewing the test results, a physician attempting to develop a diagnosis based on such results, or a sick patient attempting to learn more about their medical condition.
One modern example of a solution for ultrasound data quantification is Philips Electronics' QLAB™ Advanced Quantification software product. The QLAB™ quantification software provides a user with the ability to analyze image data either on the ultrasound system itself, or on a separate personal computer or workstation (so-called 'off-cart' use). More particularly, Philips' QLAB™ features a user interface via which an operator is allowed, for example, to adjust one or more visualization settings associated with an ultrasound volume (e.g., brightness, transparency, thresholding, etc.). QLAB™ further features a sculpting tool and an eraser to enable an operator or practitioner to crop data from a visually-displayed ultrasound volume.
The patent literature includes additional teachings relative to user-adjustable display settings. For example, U.S. Patent No. 6,975,335 to Watanabe discloses magnified or reduced areas of a display that are easily distinguished by shades of color and pattern density corresponding to the magnification or reduction ratio of areas of the display. In addition, the Watanabe '335 patent describes a method for linking the displaying of a diagram to a pointing device so that a displayed portion is magnified when the area is pointed to by the pointing device.
U.S. Patent No. 6,803,931 to Roman et al. discloses a graphical user interface (GUI) corresponding to an image display window through which a single image or a stream of images or video frames are displayed. According to the '931 Roman et al. patent, the GUI includes a zoom control box having an inner region positioned within an outer region, wherein the size of the inner region relative to the outer region represents the magnification of the portion of the image being displayed within the image display window. The magnification of the image being displayed can be increased or decreased, respectively, by positioning a cursor within the inner region and clicking a cursor control device, or by positioning the cursor outside of the inner region but inside of the outer region and clicking the cursor control device. As the magnification is increased or decreased, the size of the inner region relative to the outer region is changed accordingly. The portion of the image being displayed within the image display window is changed by clicking and dragging the inner region to the desired position within the outer region using the cursor control device.
In U.S. Patent No. 6,633,305 to Sarfeld is disclosed an image editing system that uses a loupe cursor to magnify a selected area of a basic image displayed on a display device. According to the '305 Sarfeld patent, the system generates basic image data representing the selected area of the basic image, and generates magnified image data by magnifying the selected basic image data. It displays within the loupe cursor a loupe image based on the magnified image data. When a user editing signal is received for editing the loupe image, the system generates modified image data, and dynamically modifies the loupe image displayed within the loupe cursor based on the modified image data.
An ultrasonic image scanning system for scanning an organic object is disclosed in U.S. Patent Application Publication No. 2006/0111634 by Wu that includes a display system for displaying a scanned image of the organic object in a plurality of display modes. According to the '634 Wu publication, the display system is operative to simultaneously display, as respective upper and lower images shown on a screen, a so-called 'zoom-out' image including a graphic border for defining a so-called zoom region of interest (ZROI) on the zoom-out image, and a so-called 'zoom-in' image containing a magnified version of such ZROI. The zoom-in updates in real time as the user uses a trackball to pan and/or resize the ZROI. Despite efforts to date, a need remains for ultrasound data quantification solutions that are effective to distribute, display, and/or store acoustic information in such forms and at such times as to be convenient, useful, and/or informative to the intended recipients or viewers of such data. These and other needs are satisfied by the disclosed systems and methods, as will be apparent from the description which follows. A system and method are disclosed for rendering an ultrasound volume including generating an external image of an ultrasound volume, wherein a fractional part of the external image corresponds to a fractional portion of the ultrasound volume, and generating a composite image of the ultrasound volume using the external image, wherein the fractional part of the external image is replaced with an internal image of the ultrasound volume corresponding to the fractional portion.
The internal image may be generated by changing a value of a visualization parameter used to generate the external image to a value more suitable for rendering an internal image. The ultrasound volume may include an organic structure, wherein the external image depicts an outer surface of the organic structure, and the internal image depicts a vascularity of the organic structure, such that the composite image simultaneously depicts both an outer surface and the vascularity of the organic structure. Additional features, functions and benefits of the disclosed systems and methods for rendering an ultrasound volume will be apparent from the description which follows, particularly when read in conjunction with the appended figures.
To assist those of skill in the art in making and using the disclosed systems and methods for rendering an ultrasound volume, reference is made to the accompanying figures, wherein:
FIGURE 1 illustrates a screen display of an external image of an ultrasound volume according to the present disclosure;
FIGURE 2 illustrates a screen display of a composite image of the Figure 1 ultrasound volume generated using the Figure 1 external image, a part of the external image having been replaced with an internal image of the ultrasound volume according to the present disclosure;
FIGURE 3 illustrates a screen display of a modified composite image of the Figure 1 ultrasound volume, also generated using the Figure 1 external image, showing a different portion of the ultrasound volume in internal view in response to a user-directed change according to the present disclosure;
FIGURE 4 illustrates a screen display of an external image of another ultrasound volume according to the present disclosure;
FIGURE 5 illustrates a screen display of a composite image of the Figure 4 ultrasound volume generated using the Figure 4 external image, a part of the external image having been replaced with an internal image of the ultrasound volume according to the present disclosure; and
FIGURE 6 illustrates a screen display of a modified composite image of the Figure 4 ultrasound volume, also generated using the Figure 4 external image, showing a different portion of the ultrasound volume in internal view in response to a user-directed change according to the present disclosure.
In accordance with exemplary embodiments of the present disclosure, a 3D visualization tool is provided for rendering ultrasound volumes, wherein an external image of an ultrasound volume is generated, and a composite image of the ultrasound volume is generated using the external image. A fractional part of the external image corresponding to a fractional portion of the ultrasound volume may be replaced in the composite image with an internal image of the ultrasound volume corresponding to the same fractional portion of the ultrasound volume. Such functionality enables a user or viewer to obtain a localized view into the ultrasound volume without changing the overall visualization parameter values. In examples, a composite image of an ultrasound volume including an organic structure is provided including outer surfaces of much of the organic structure, as well as interior details associated with a selected portion of the organic structure.
Referring now to Figure 1, a screen image 100 associated with acoustic data from an ultrasound system (not separately shown) is displayed in accordance with an exemplary embodiment of the present disclosure. A portion of the screen image 100 includes an external image 102 of an ultrasound volume 104. The ultrasound volume 104 may contain an organic structure 106. An outer surface of the organic structure is at least partially shown in the external image 102. The external image 102 may be generated by applying a set of visualization parameters to the ultrasound volume 104 for highlighting or emphasizing externally-oriented and/or externally-disposed aspects of the ultrasound volume 104 and/or the organic structure 106 contained therewithin. For example, such set of visualization parameters may include one or more of such visualization parameters as Brightness, Transparency, Thresholding, Lighting, Smoothing, Gray Map, Chroma Map, and Ray Cast Method. Other visualization parameters are possible.
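The disclosure does not tie such external rendering to any particular implementation. Purely as a minimal illustrative sketch, assuming a 3D echo volume stored as a NumPy array with rays cast along its first axis, the following Python code shows how a small parameter set of the kind listed above (Thresholding, Brightness, Transparency) might be applied to produce a surface-emphasizing external view; the names VisualizationParams and render_external, the default values, and the single-axis ray direction are illustrative assumptions rather than part of the disclosed system:

    import numpy as np
    from dataclasses import dataclass

    @dataclass
    class VisualizationParams:
        """Hypothetical subset of the visualization parameters named in the text."""
        threshold: float = 0.35    # echoes below this are treated as transparent
        brightness: float = 1.0    # linear gain applied to the rendered pixel
        transparency: float = 0.0  # 0 = fully opaque surface, 1 = fully see-through

    def render_external(volume: np.ndarray, p: VisualizationParams) -> np.ndarray:
        """Render a surface-emphasizing ('external') view of a 3D echo volume.

        The ray for each pixel runs along axis 0; the pixel takes the value of the
        first voxel exceeding the threshold, approximating an opaque outer wall.
        """
        above = volume >= p.threshold
        first_hit = np.argmax(above, axis=0)   # index of first above-threshold voxel per ray
        hit_any = above.any(axis=0)            # rays that intersect the structure at all
        _, rows, cols = volume.shape
        surface = volume[first_hit, np.arange(rows)[:, None], np.arange(cols)[None, :]]
        image = np.where(hit_any, surface, 0.0)
        return np.clip(p.brightness * (1.0 - p.transparency) * image, 0.0, 1.0)

    if __name__ == "__main__":
        # Synthetic 64^3 volume containing a bright spherical "organ".
        z, y, x = np.mgrid[0:64, 0:64, 0:64]
        volume = (np.sqrt((z - 32)**2 + (y - 32)**2 + (x - 32)**2) < 20).astype(float)
        external = render_external(volume, VisualizationParams())
        print(external.shape, external.max())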
Turning now to Figure 2, a screen image 200 associated with acoustic data from an ultrasound system (not separately shown) is displayed in accordance with an exemplary embodiment of the present disclosure. A portion of the screen image 200 includes a composite image 202 of the ultrasound volume 104. The composite image may be generated using the external image 102 of Figure 1, wherein a fractional part of the external image 102 is replaced with an internal image 204 of a corresponding fractional portion of the ultrasound volume 104. Within the internal view 204, a vasculature 206 of the organic structure 106 is shown. By contrast, the vasculature 206 of the organic structure 106 is not necessarily shown in the external image 102 of Figure 1.
The fractional part of the Figure 1 external image 102 replaced by the internal image 204 in Figure 2 may be defined by a visible border 208, which may be depicted as part of the screen image 200. Alternatively, the screen image 200 may include no such border 208, or a differently-appearing border. For example, the border 208, shown in solid line in Figure 2, may be shown in dashed or ghost line form, and/or may have another shape, e.g., depending on the shape of the internal image 204, which in turn may be of any suitable shape. Other than within the internal image 204, the composite image 202 is typically generated by applying the visualization parameters and parameter values associated with the external image 102 of Figure 1. Within the internal image 204, however, the composite image 202 is generated by applying different visualization parameters and/or parameter values than those associated with the external image 102. For example, a set of visualization parameters applied to the ultrasound volume 104 to generate the internal image 204 may include one or more of such visualization parameters as Brightness, Transparency, Thresholding, Lighting, Smoothing, Gray Map, Chroma Map, and Ray Cast Method, including a set of such visualization parameters similar to or identical to that used to generate the external image 102 of the ultrasound volume 104, wherein one or more of such visualization parameters in the case of the internal image 204 is associated with a different value than that associated with a corresponding visualization parameter in the case of the external image 102. Such differences may contribute to the generation of an internal (e.g., as opposed to an external) image of the ultrasound volume 104.
The values of the visualization parameters associated with the internal image 204 may be selected so as to highlight or emphasize internally-oriented and/or internally-disposed aspects of the ultrasound volume 104 and/or the organic structure 106 contained within the corresponding fractional portion of the ultrasound volume 104. Accordingly, within the internal image 204, the composite image 202 appears at least to some extent different from the corresponding fractional part of the Figure 1 external image 102, which is absent from Figure 2. In accordance with the present disclosure, a value of a single visualization parameter associated with the internal image 204 that is at least incrementally different than the value of a corresponding visualization parameter associated with the external image 102 may be sufficient to generate and/or display a composite image (e.g., the composite image 202) of an ultrasound volume (e.g., the ultrasound volume 104) from an external image (e.g., the external image 102), wherein a fractional part of the external image is replaced by an internal image (e.g., the internal image 204) of the ultrasound volume. Other arrangements are possible, including wherein two or more common visualization parameters have different values, wherein one or more common visualization parameters have widely differing values, and/or wherein one or more visualization parameters applied to the ultrasound volume 104 to generate the internal image 204 or the external image 102 was not so applied to generate the other such view.
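A minimal sketch of the compositing step described above, assuming rectangular windows and the same axis-aligned projection as before: the external rendering is computed for the whole volume, and only the sub-volume behind the window is re-rendered with the internal parameter set. The function names and the window convention are hypothetical, not taken from the disclosure:

    import numpy as np

    def render_composite(volume, render_external, render_internal, window):
        """Build a composite image: external rendering everywhere except inside
        `window`, where the same fractional portion of the volume is re-rendered
        with the internal parameter set.

        window = (row0, row1, col0, col1) in image coordinates; because the
        projection runs along axis 0, the same indices select the corresponding
        fractional portion (a sub-column) of the volume.
        """
        r0, r1, c0, c1 = window
        composite = render_external(volume).copy()
        # Re-render only the sub-volume behind the window with the internal settings.
        composite[r0:r1, c0:c1] = render_internal(volume[:, r0:r1, c0:c1])
        return composite

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        vol = rng.random((32, 64, 64))
        external = lambda v: v.max(axis=0)   # opaque-looking maximum projection
        internal = lambda v: v.mean(axis=0)  # X-ray-like average projection
        img = render_composite(vol, external, internal, (16, 48, 16, 48))
        print(img.shape)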
At least some of the visualization parameters applied to the ultrasound volume 104 to generate the internal image 204 can be the same or similar to those used to generate the external image 102. For example, the same parameter and parameter value may be used to generate either such image where that parameter does not necessarily correlate to suitability or non-suitability with respect to generating an internal image of a given ultrasound volume. In accordance with the present disclosure, such parameters may in some circumstances include Image Magnification.
The internal image 204 can be generated by applying parameters to the ultrasound volume 104 that are set in advance, and/or by default. Examples of such preset parameters and/or parameter values may include X-Ray, average, and minimum. Upon or after the composite image 202 being displayed to the viewer or user, he or she may elect to apply the same visualization parameters and/or parameter values to the entire ultrasound volume 104 that were applied to the fractional portion thereof to which the internal view 204 corresponds. In such circumstances, an internal image (not separately shown) of the ultrasound volume 104 may be generated that, in the context of the composite image 202, is substantially coextensive with, and therefore more or less fully replaces, the external image 102.
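The preset values named above (X-Ray, average, minimum) are not further specified in the disclosure. The sketch below interprets them as three ray-cast projection modes, an attenuation-style 'X-ray' compositing, a mean projection, and a minimum-intensity projection; this is one plausible reading offered for illustration, not the patented implementation:

    import numpy as np

    def project(sub_volume: np.ndarray, mode: str = "xray") -> np.ndarray:
        """Hypothetical preset projection modes for the internal ('window') view.

        'xray'    - attenuation-style compositing, akin to a radiograph
        'average' - mean intensity along each ray
        'minimum' - minimum intensity along each ray
        Rays run along axis 0 of `sub_volume`.
        """
        if mode == "average":
            return sub_volume.mean(axis=0)
        if mode == "minimum":
            return sub_volume.min(axis=0)
        if mode == "xray":
            # Treat voxel values as attenuation coefficients; brighter pixels mean
            # more cumulative attenuation along the ray (Beer-Lambert style).
            return 1.0 - np.exp(-sub_volume.sum(axis=0) / sub_volume.shape[0])
        raise ValueError(f"unknown mode: {mode}")

    if __name__ == "__main__":
        vol = np.random.default_rng(1).random((32, 16, 16))
        for m in ("xray", "average", "minimum"):
            print(m, project(vol, m).shape)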
The above-discussed modifications or adjustments with respect to the visualization parameters and/or parameter values as between the internal image 204 and the external image 102 may be implemented in one or more of a plurality of different ways in accordance with embodiments of the present disclosure. For example, a computer mouse (not separately shown) (e.g., a rotatable wheel thereof, or a click-and-drag feature associated therewith), and/or any other suitable indicating, pointing, or cursor movement device (not separately shown) may be used to execute so-called "on-the-fly" modifications or adjustments to values associated with one or more such visualization parameters associated with the internal image 204 (e.g., changes to a Transparency visualization setting) to emphasize or highlight internally-disposed features within the corresponding fractional portion of the ultrasound volume 104. For another example, a group of different types of such modifications or adjustments, one or more of which may be determined in advance, and/or according to a pre-set menu or schedule of adjustments intended to create a particular visual effect or a distinct look (e.g., depending on the particular clinical application), may be implemented simultaneously, e.g., via a predetermined mouse click or series of mouse clicks, a particular software menu command (not separately shown), and/or a dedicated hardware switch (not separately shown). Other implementation techniques with respect to modifications or adjustments to the visualization parameters used to generate the internal image 204 are possible.
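As an illustration of such 'on-the-fly' adjustment, the toolkit-agnostic sketch below maps mouse-wheel steps onto a Transparency value and re-renders only the windowed sub-volume; the class name, the blend used to emulate transparency, and the step size are assumptions made for the example:

    import numpy as np

    class WindowTransparencyController:
        """Sketch of 'on-the-fly' adjustment: each wheel step nudges the Transparency
        value used for the internal view and re-renders only the windowed sub-volume."""

        def __init__(self, volume, window, transparency=0.5, step=0.05):
            self.volume = volume
            self.window = window          # (row0, row1, col0, col1)
            self.transparency = transparency
            self.step = step

        def on_wheel(self, delta_steps: int) -> np.ndarray:
            """Called by the GUI layer with the signed number of wheel detents."""
            self.transparency = float(np.clip(
                self.transparency + delta_steps * self.step, 0.0, 1.0))
            return self.render_window()

        def render_window(self) -> np.ndarray:
            r0, r1, c0, c1 = self.window
            sub = self.volume[:, r0:r1, c0:c1]
            # Higher transparency lets deeper voxels contribute more (a simple blend
            # between a surface-like max projection and an averaged interior view).
            return ((1.0 - self.transparency) * sub.max(axis=0)
                    + self.transparency * sub.mean(axis=0))

    if __name__ == "__main__":
        vol = np.random.default_rng(2).random((32, 64, 64))
        ctl = WindowTransparencyController(vol, (10, 40, 10, 40))
        print(ctl.on_wheel(+2).shape, ctl.transparency)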
The internal image 204 (e.g., and/or the border 208 associated therewith) can be of a different size (e.g., of a larger or smaller absolute size than that shown in Figure 2), and/or of a different size relative to that of the overall composite image 202, that of the screen image 200, and/or that of the external image 102, in accordance with embodiments of the present disclosure. In addition, the internal image 204 (and/or the border 208) may be of a different shape than rectangular/square or polygonal (e.g., a curved and/or circular shape, an irregular shape, etc.), in accordance with embodiments of the present disclosure. For example, a computer mouse (not separately shown) (e.g., a rotatable wheel thereof, or a click-and-drag feature associated therewith), and/or any other suitable indicating, pointing, or cursor movement device may be used to execute so-called "on-the-fly" modifications or adjustments to the size or shape of the internal image 204 (and/or of the border 208). For another example, the size and/or shape of the internal image 204 (and/or of the border 208) may be changed to, from, or between any one of a number of predetermined sizes or shapes, including to, from, or between one or more customized shapes corresponding to that of a particular organic structure or volume (e.g., as viewed from a particular perspective or vantage point) or a separately identifiable portion thereof, e.g., via a predetermined mouse click or series of mouse clicks, a particular software menu command (not separately shown), and/or a dedicated hardware switch (not separately shown). Other implementation techniques with respect to modifications or adjustments to the size and/or shape of the internal image 204 (and/or of the border 208) are possible.
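Non-rectangular window shapes reduce to a per-pixel mask in the image plane, as in the brief sketch below, which composites a circular window; circular_mask and composite_with_mask are hypothetical helper names introduced only for this illustration:

    import numpy as np

    def circular_mask(shape, center, radius):
        """Boolean mask selecting a circular window in image coordinates."""
        rows, cols = np.ogrid[:shape[0], :shape[1]]
        return (rows - center[0])**2 + (cols - center[1])**2 <= radius**2

    def composite_with_mask(external_img, internal_img, mask):
        """Replace the masked pixels of the external image with the internal
        rendering; any window shape reduces to such a boolean mask."""
        out = external_img.copy()
        out[mask] = internal_img[mask]
        return out

    if __name__ == "__main__":
        vol = np.random.default_rng(3).random((32, 64, 64))
        ext, internal = vol.max(axis=0), vol.mean(axis=0)
        mask = circular_mask(ext.shape, center=(32, 32), radius=15)
        print(composite_with_mask(ext, internal, mask).shape)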
The above-discussed modifications or adjustments to the visualization settings of the external image 102 of the ultrasound volume 104 can be implemented to cause the composite image 202 to reflect one or more desired visual effects or distinct looks within the internal image 204 (e.g., as compared to those of the ultrasound volume 104 as a whole) in accordance with embodiments of the present disclosure. For example, in some embodiments in accordance with the present disclosure, visualization settings applied to the ultrasound volume 104 to generate the external image 102 can be selected so as (and/or generally tend) to afford the ultrasound volume 104 an opaque and/or three-dimensional overall appearance, such that the external image 102 will appear to show an outer surface or wall(s) of a particular organic structure (e.g., a heart or other bodily organ). In at least some such instances, modifications or adjustments to such visualization settings can be selected so as to produce within the internal image 204 a visual effect (e.g., akin to an X-ray effect) for viewing one or more structures (e.g., cardiac vascularity, such as a coronary artery) and/or functions (e.g., valve operation, blood flow, etc.) ordinarily understood and/or expected by a practitioner or technician to be disposed (or to take place) within the ultrasound volume 104, rather than on its periphery or outer surface. In other words, the compound visualization parameters applied to the ultrasound volume 104 may be determined or selected to give the internal image 204 (and/or the border 208) the appearance of a 'window' with respect to the inner structural and/or functional constituents of an organic structure contained within the ultrasound volume 104, while at the same time, and in the same image, at least some portion of the external or outer wall structures of such organic structure is also visible (e.g., in those portions of the composite image 202 appearing outside the internal image 204 ).
Applying compound visualization settings to the ultrasound volume 104 in such a way can be advantageous, at least insofar as it can provide a practitioner or technician with important and powerful visual cues and/or contextual information with respect to the external (e.g., outer wall) structure of a bodily structure in the same image in which he or she may also (e.g., simultaneously) observe important structural and/or functional details associated with interior regions of such bodily structure.
As discussed above, in accordance with embodiments of the present disclosure, the image magnification or zoom parameters associated with the external image 102 of the ultrasound volume 104 may be maintained and/or kept constant, even as other visualization settings are modified or adjusted to generate the internal image 204 thereof. For example, in accordance with embodiments of the present disclosure, applying the same magnification or zoom settings to the ultrasound volume 104 both within and outside the fractional portion thereof associated with the internal image 204 can be advantageous, insofar as such an arrangement may tend to increase and/or maximize the amount of accurate (e.g., clinically useable and/or reliable) visual or image information appearing at or near the border 208. In other words, to the extent compound visualization parameters are applied to the ultrasound volume 104 to produce the appearance of a 'window' to the inner workings of an otherwise externally-rendered bodily structure, such window can be sharply defined (e.g., the window can appear to be relatively sharply 'cut' from surrounding external structure) so as to reduce and/or minimize any loss of visual detail associated with rendering such inner workings.
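To make the point about a single shared magnification concrete, here is a small assumed helper showing that when the same zoom factor applies both inside and outside the window, image-space window coordinates map to volume coordinates through one scale factor, so the border can be 'cut' pixel-exactly without separate resampling. The function name and the simple linear mapping are assumptions for illustration only.

```python
def window_to_volume_slices(window, zoom, origin=(0.0, 0.0)):
    # One shared zoom factor for the whole composite image means the
    # window edges land directly on volume rows/columns; no extra
    # resampling (and hence no blurring) is needed at the border.
    r0, r1, c0, c1 = window
    oy, ox = origin
    to_vol = lambda px, o: int(round((px - o) / zoom))
    return (slice(to_vol(r0, oy), to_vol(r1, oy)),
            slice(to_vol(c0, ox), to_vol(c1, ox)))

# Example: a 2x zoom maps a 32-pixel-wide window onto 16 volume columns.
row_slice, col_slice = window_to_volume_slices((16, 48, 16, 48), zoom=2.0)
```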
In accordance with embodiments of the present disclosure, the ultrasound volume 104 to which the above-discussed external, internal, and/or compound set of visualization parameters are applied can be any one or more of the following ultrasound volume types: an echo volume, a color volume, and/or an echo+color volume. Still further ultrasound volume types are possible.
Referring now to Figures 2 and 3, a position of the internal image 204 (and/or of the border 208) within the screen image 200, and/or within the composite image 202, may be changed in accordance with the present invention to correspond to a different fractional portion of the ultrasound volume 104, and/or to another fractional part of the Figure 1 external image 102. For example, a computer mouse (not separately shown) (e.g., a rotatable wheel thereof, or a click-and-drag feature associated therewith), and/or any other suitable indicating, pointing, or cursor movement device may be used to so move the internal image 204 (and/or the border 208) from the position within the screen image 200 or within the composite image 202 shown in Figure 2 to a new position, e.g., as shown in Figure 3 with respect to a new screen image 300, and in a new composite image 302. Other techniques for so moving the internal image 204 (and/or the border 208) are possible.
In accordance with embodiments of the present disclosure, and as shown in Figures 2 and 3, upon the internal image 204 (and/or the border 208) being moved as described immediately above to correspond to the new fractional portion of the ultrasound volume 104, the same visualization parameters and/or parameter values discussed above and associated with generating an internal image may be applied to the ultrasound volume 104 there, where previously, the visualization parameters and/or parameter values associated with generating an external image were applied. By the same token, and/or by virtue of the internal image 204 (and/or the border 208) being moved away from one fractional part of the external image 102 and to a new fractional part thereof, the same visualization parameters and/or parameter values discussed above and associated with generating an external image may now be applied to the corresponding fractional portion of the ultrasound volume 104, thereby restoring to the new composite image 302 the previously replaced fractional part of the external image 102. To the extent the new position of the internal image 204 within the composite image 302 is sufficiently close to the earlier position therein, some overlap may exist as between the previous and new corresponding fractional parts of the external image 102. In such circumstances, the previously replaced fractional part of the external image 102 may only be partly restored by virtue of such movement of the internal image 204.
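The movement and partial-restoration behavior described in the two paragraphs above can be sketched, reusing the render_external and render_internal stand-ins from the earlier snippet, with a small assumed helper that caches both renderings and rebuilds the composite whenever the window is dragged; rebuilding from the cached external image is what restores the previously replaced fractional part, fully or only partly depending on overlap. The class and method names are hypothetical.

```python
class XRayWindowView:
    """Assumed helper: caches the external and internal renderings so
    the composite can be rebuilt cheaply as the user drags the window."""

    def __init__(self, volume, axis=0):
        self.external = render_external(volume, axis).astype(float)
        self.internal = render_internal(volume, axis)

    def set_window(self, window):
        # Start again from the cached external image, so any area the
        # window no longer covers reverts to the external rendering;
        # if the old and new windows overlap, restoration is only partial.
        image = self.external.copy()
        if window is not None:
            r0, r1, c0, c1 = window
            image[r0:r1, c0:c1] = self.internal[r0:r1, c0:c1]
        return image

# Example: drag the window from one fractional portion to an overlapping one.
view = XRayWindowView(vol)
frame_a = view.set_window((16, 48, 16, 48))
frame_b = view.set_window((24, 56, 24, 56))
```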
Referring now to Figure 4, a screen image 400 associated with acoustic data from an ultrasound system (not separately shown) is displayed in accordance with an exemplary embodiment of the present disclosure. A portion of the screen image 400 includes an external image 402 of another ultrasound volume 404. The ultrasound volume 404 may contain an organic structure 406. An outer surface of the organic structure is at least partially shown in the external image 402.
As discussed above, the external image 402 may be generated by applying a set of visualization parameters to the ultrasound volume 404 for highlighting or emphasizing externally-oriented and/or externally-disposed aspects of the ultrasound volume 404 and/or the organic structure 406 contained therewithin.
Turning now to Figure 5, a screen image 500 associated with acoustic data from an ultrasound system (not separately shown) is displayed in accordance with an exemplary embodiment of the present disclosure. A portion of the screen image 500 includes a composite image 502 of the ultrasound volume 404. The composite image may be generated using the external image 402 of Figure 4, wherein a fractional part of the external image 402 is replaced with an internal image 504 of a corresponding fractional portion of the ultrasound volume 404. Within the internal image 504, a vasculature 506 of the organic structure 406 is shown. By contrast, the same vasculature 506 of the organic structure 406 may not be shown, or at least may not be as effectively shown, or as easily visible, in the external image 402 of Figure 4.
Referring now to Figures 5 and 6, a position of the internal image 504 (and/or of the border 508) within the screen image 500, and/or within the composite image 502, may be changed in accordance with the present invention to correspond to a different fractional portion of the ultrasound volume 404, and/or to another fractional part of the Figure 4 external image 402. For example, a computer mouse (not separately shown) (e.g., a rotatable wheel thereof, or a click-and-drag feature associated therewith), and/or any other suitable indicating, pointing, or cursor movement device may be used to so move the internal image 504 (and/or the border 508) from the position within the screen image 500 or within the composite image 502 shown in Figure 5 to a new position, e.g., as shown in Figure 6 with respect to a new screen image 600, and in a new composite image 602. Other techniques for so moving the internal image 504 (and/or the border 508) are possible.
Embodiments of the present disclosure include a computer system (not shown, e.g., including a processor, related accessories such as a computer mouse and/or a trackball, and a computer monitor or other display) and a related algorithm or software product operable via such computer system and/or by said processor for permitting a user of such computer system to display the screen images 100, 200, 300, 400, 500, and 600, including the various images depicted therein of the respective ultrasound volumes 104, 404, and to manipulate such images in the manner described herein, including but not limited to achieving the above-described 'window' or 'X-ray' visual effect. For example, hardware-related embodiments of the present disclosure may include a personal computer or workstation, including a computer display or monitor, e.g., such as are presently used to run and/or utilize the above-discussed Philips Electronics QLAB™ Advanced Quantification software product off-cart, or a computer-implemented ultrasound system (e.g., on-cart) such as the above-discussed Philips Electronics iU22, iE33, and/or HD11 XE Ultrasound Systems. Software-related embodiments of the present disclosure may include a quantification software product including all relevant features and aspects of the above-discussed Philips Electronics QLAB™ software product, with additional code, for example, and/or one or more appropriate software 'plug-ins' for implementing additional features and aspects as disclosed herein. Embodiments of the present disclosure may further include a computer-readable medium including a computer-executable algorithm for implementing the ultrasound imaging features and functions disclosed herein. For example, such an algorithm may include appropriate computer-executable code.
The systems and methods of the present disclosure are particularly useful for displaying and manipulating displayed images of ultrasound volumes. However, the disclosed systems and methods are susceptible to many variations and alternative applications, without departing from the spirit or scope of the present disclosure.

Claims

CLAIMS:
1. A method for rendering an ultrasound volume, the method comprising: generating an external image of an ultrasound volume, wherein a fractional part of said external image corresponds to a fractional portion of said ultrasound volume; and generating a composite image of said ultrasound volume using said external image, wherein said fractional part is replaced with an internal image of said ultrasound volume corresponding to said fractional portion.
2. A method for rendering an ultrasound volume according to claim 1, wherein said external image generation step includes setting a visualization parameter to reflect a first parameter value and applying said visualization parameter to said ultrasound volume, and further comprising generating said internal image by setting said visualization parameter to a second parameter value different than said first parameter value, said second parameter value being at least incrementally more suitable than said first parameter value for purposes of rendering an internal image of said ultrasound volume.
3. A method for rendering an ultrasound volume according to claim 2, the method further comprising displaying said composite image on a screen of a computer monitor in conjunction with a mouse wheel, and permitting a viewer of said computer monitor screen to use the mouse wheel to selectably incrementally adjust a value of said visualization parameter associated with said internal image.
4. A method for rendering an ultrasound volume according to claim 3, wherein said visualization parameter is Transparency.
5. A method for rendering an ultrasound volume according to claim 2, wherein said visualization parameter includes at least one selected from the group comprising Brightness, Transparency, Thresholding, Lighting, Smoothing, Gray Map, Chroma Map, and Ray Cast Method.
6. A method for rendering an ultrasound volume according to claim 5, wherein said visualization parameter comprehends a set of two or more visualization parameters selected from said group, and said first and second values comprehend respective sets of values to which the visualization parameters of said set of two or more visualization parameters are respectively set.
7. A method for rendering an ultrasound volume according to claim 2, wherein said visualization parameter is an intensity parameter, and said second value of said visualization parameter is preset to include at least one selected from the group comprising X-ray, average, and minimum.
8. A method for rendering an ultrasound volume according to claim 1, wherein an image magnification parameter associated with said internal and external images is of substantially the same value.
9. A method for rendering an ultrasound volume according to claim 1, wherein said ultrasound volume is one selected from a group comprising an echo volume, a color volume, or an echo+color volume.
10. A method for rendering an ultrasound volume according to claim 1, wherein said ultrasound volume includes an organic structure, said external image depicts an outer surface of said organic structure, and said internal image depicts a vascularity of said organic structure, such that said composite image simultaneously depicts both an outer surface of said organic structure and said vascularity thereof.
11. A method for rendering an ultrasound volume according to claim 1 , the method further comprising displaying said external image on a screen of a computer monitor in conjunction with a user-movable cursor, and permitting a viewer of said computer monitor to initiate said composite image generation step by using said cursor to designate said fractional portion of said ultrasound volume for internal viewing.
11. A method for rendering an ultrasound volume according to claim 1, the method further comprising displaying said external image on a screen of a computer monitor in conjunction with a user-movable cursor, and permitting a viewer of said computer monitor to initiate said composite image generation step by using said cursor to designate said fractional portion of said ultrasound volume for internal viewing.
13. A method for rendering an ultrasound volume according to claim 1, the method further comprising displaying said composite image on a screen of a computer monitor in conjunction with a user movable cursor positioned over said composite image, and permitting a viewer of said computer monitor to trigger said composite image modification step by moving said cursor to a new position over said composite image.
14. A method for rendering an ultrasound volume according to claim 1, wherein a respective second fractional part of said external image corresponds to a second fractional portion of said ultrasound volume, and the method further comprising modifying said composite image to form a modified composite image wherein said second fractional part is replaced with a second internal image of said ultrasound volume corresponding to said second fractional portion.
15. A method for rendering an ultrasound volume according to claim 14, the method further comprising displaying said composite image on a screen of a computer monitor, and wherein said composite image modification step includes wherein said internal image of said fractional portion contained in said composite image is replaced on said computer monitor screen with said fractional part to whatever extent said fractional part does not intersect with said second fractional part.
15. A method for rendering an ultrasound volume according to claim 14, the method further comprising displaying said composite image on a screen of a computer monitor, and wherein said composite image modification step includes replacing, on said computer monitor screen, said internal image of said fractional portion contained in said composite image with said fractional part to whatever extent said fractional part does not intersect with said second fractional part.
17. A method for rendering an ultrasound volume according to claim 1, the method further comprising displaying said composite image on a screen of a computer monitor in conjunction with a user movable cursor, permitting a viewer of said computer monitor screen to move said cursor relative to said composite image to designate a second fractional portion of said ultrasound volume for internal viewing, and in response to said user designation, generating a modified composite image of said ultrasound volume using said external image, wherein said fractional part is restored, and a second fractional part of said external image corresponding to said second fractional portion is replaced with an internal image of said ultrasound volume corresponding to said second fractional portion.
18. A computer readable medium comprising a program that, when executed by a processor, performs a method for rendering an ultrasound volume, the method comprising: generating an external image of an ultrasound volume, wherein a fractional part of said external image corresponds to a fractional portion of said ultrasound volume; and generating a composite image of said ultrasound volume using said external image, wherein said fractional part is replaced with an internal image of said ultrasound volume corresponding to said fractional portion.
19. A computer system, comprising: a computer readable medium comprising a program that, when executed by a processor, performs a method for rendering an ultrasound volume, the method comprising: generating an external image of an ultrasound volume, wherein a fractional part of said external image corresponds to a fractional portion of said ultrasound volume; and generating a composite image of said ultrasound volume using said external image, wherein said fractional part is replaced with an internal image of said ultrasound volume corresponding to said fractional portion; and a processor for executing said program.
PCT/IB2008/052166 2007-06-04 2008-06-03 X-ray tool for 3d ultrasound WO2008149291A1 (en)

Priority Applications (5)

Application Number Priority Date Filing Date Title
CN200880018715A CN101680950A (en) 2007-06-04 2008-06-03 X-ray tool for 3D ultrasound
JP2010510935A JP2010535043A (en) 2007-06-04 2008-06-03 X-ray tool for 3D ultrasound
EP08763176A EP2167991A1 (en) 2007-06-04 2008-06-03 X-ray tool for 3d ultrasound
RU2009149622/28A RU2469308C2 (en) 2007-06-04 2008-06-03 X-ray instrument for three-dimensional ultrasonic analysis
US12/663,088 US8466914B2 (en) 2007-06-04 2008-06-03 X-ray tool for 3D ultrasound

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US94176107P 2007-06-04 2007-06-04
US60/941,761 2007-06-04

Publications (1)

Publication Number Publication Date
WO2008149291A1 true WO2008149291A1 (en) 2008-12-11

Family

ID=39820934

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2008/052166 WO2008149291A1 (en) 2007-06-04 2008-06-03 X-ray tool for 3d ultrasound

Country Status (6)

Country Link
US (1) US8466914B2 (en)
EP (1) EP2167991A1 (en)
JP (1) JP2010535043A (en)
CN (1) CN101680950A (en)
RU (1) RU2469308C2 (en)
WO (1) WO2008149291A1 (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010188118A (en) * 2009-01-20 2010-09-02 Toshiba Corp Ultrasound diagnosis apparatus, ultrasound image processing apparatus, image processing method, and image display method
EP2253273A1 (en) 2009-05-18 2010-11-24 Medison Co., Ltd. Ultrasound diagnostic system and method for displaying organ
EP3178402A1 (en) * 2015-12-10 2017-06-14 Samsung Medison Co., Ltd. Ultrasound apparatus and method of displaying ultrasound images

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101009782B1 (en) * 2008-10-28 2011-01-19 (주)메디슨 Ultrasound system and method providing wide image mode
US20140358004A1 (en) * 2012-02-13 2014-12-04 Koninklijke Philips N.V. Simultaneous ultrasonic viewing of 3d volume from multiple directions
US20140078134A1 (en) * 2012-09-18 2014-03-20 Ixonos Oyj Method for determining three-dimensional visual effect on information element using apparatus with touch sensitive display
KR102367194B1 (en) 2014-12-31 2022-02-25 삼성메디슨 주식회사 Ultrasonic diagnostic apparatus and operating method for the same
JP6691209B2 (en) * 2015-09-26 2020-04-28 ボストン サイエンティフィック サイムド,インコーポレイテッドBoston Scientific Scimed,Inc. Methods for editing anatomical shells
WO2019119429A1 (en) * 2017-12-22 2019-06-27 中国科学院深圳先进技术研究院 Dual-transducer compensation imaging method and ultrasonic imaging system
US20220292655A1 (en) * 2021-03-15 2022-09-15 Canon Medical Systems Corporation Medical image processing apparatus, x-ray diagnostic apparatus, and method of medical image processing

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6254540B1 (en) * 1999-11-19 2001-07-03 Olympus Optical Co., Ltd. Ultrasonic image processing apparatus for constructing three-dimensional image using volume-rendering data and surface-rendering data simultaneously
US20020183607A1 (en) * 2000-09-11 2002-12-05 Thomas Bauch Method and system for visualizing a body volume and computer program product
WO2003045222A2 (en) * 2001-11-21 2003-06-05 Viatronix Incorporated System and method for visualization and navigation of three-dimensional medical images
US6803931B1 (en) * 1999-11-04 2004-10-12 Kendyl A. Roman Graphical user interface including zoom control box representing image and magnification of displayed image
US20060111634A1 (en) * 2004-10-30 2006-05-25 Sonowise, Inc. User interface for medical imaging including improved pan-zoom control
US20070046661A1 (en) * 2005-08-31 2007-03-01 Siemens Medical Solutions Usa, Inc. Three or four-dimensional medical imaging navigation methods and systems

Family Cites Families (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH05288495A (en) * 1992-04-08 1993-11-02 Mitsubishi Heavy Ind Ltd Trajectory analyser
UA9565A (en) * 1995-07-28 1996-09-30 Володимир Васильович Овчаренко Method for ultrasonic visualization of biological structures and device for its realization
US5645066A (en) * 1996-04-26 1997-07-08 Advanced Technology Laboratories, Inc. Medical ultrasonic diagnostic imaging system with scanning guide for three dimensional imaging
JP2001128982A (en) * 1999-11-02 2001-05-15 Toshiba Corp Ultrasonic image diagnosing apparatus and image processor
JP2001188639A (en) * 1999-12-28 2001-07-10 Internatl Business Mach Corp <Ibm> Method and device for displaying magnified and reduced areas
CA2310945C (en) * 2000-06-05 2009-02-03 Corel Corporation System and method for magnifying and editing images
RU2208391C1 (en) * 2001-11-05 2003-07-20 Демин Виктор Владимирович Method for three-dimensional visualization of atheromatosis substrate at obliterating arterial lesions during one's life period
RU2232547C2 (en) * 2002-03-29 2004-07-20 Общество с ограниченной ответственностью "АММ - 2000" Method and device for making ultrasonic images of cerebral structures and blood vessels
JP2004141514A (en) * 2002-10-28 2004-05-20 Toshiba Corp Image processing apparatus and ultrasonic diagnostic apparatus
JP2006000127A (en) * 2004-06-15 2006-01-05 Fuji Photo Film Co Ltd Image processing method, apparatus and program
US7604595B2 (en) * 2004-06-22 2009-10-20 General Electric Company Method and system for performing real time navigation of ultrasound volumetric data
US7339585B2 (en) * 2004-07-19 2008-03-04 Pie Medical Imaging B.V. Method and apparatus for visualization of biological structures with use of 3D position information from segmentation results
JP2006130071A (en) * 2004-11-05 2006-05-25 Matsushita Electric Ind Co Ltd Image processing apparatus
US20060203010A1 (en) * 2005-03-14 2006-09-14 Kirchner Peter D Real-time rendering of embedded transparent geometry in volumes on commodity graphics processing units
JP4653542B2 (en) * 2005-04-06 2011-03-16 株式会社東芝 Image processing device

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6803931B1 (en) * 1999-11-04 2004-10-12 Kendyl A. Roman Graphical user interface including zoom control box representing image and magnification of displayed image
US6254540B1 (en) * 1999-11-19 2001-07-03 Olympus Optical Co., Ltd. Ultrasonic image processing apparatus for constructing three-dimensional image using volume-rendering data and surface-rendering data simultaneously
US20020183607A1 (en) * 2000-09-11 2002-12-05 Thomas Bauch Method and system for visualizing a body volume and computer program product
WO2003045222A2 (en) * 2001-11-21 2003-06-05 Viatronix Incorporated System and method for visualization and navigation of three-dimensional medical images
US20060111634A1 (en) * 2004-10-30 2006-05-25 Sonowise, Inc. User interface for medical imaging including improved pan-zoom control
US20070046661A1 (en) * 2005-08-31 2007-03-01 Siemens Medical Solutions Usa, Inc. Three or four-dimensional medical imaging navigation methods and systems

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
HUBBOLD R J ET AL: "STEREO DISPLAY OF NESTED 3D VOLUME DATA USING AUTOMATIC TUNNELLING", PROCEEDINGS OF THE SPIE - THE INTERNATIONAL SOCIETY FOR OPTICAL ENGINEERING, SPIE, BELLINGHAM, VA; US, vol. 3639, 1 January 1999 (1999-01-01), pages 200 - 207, XP000901206 *
ROBB R A: "Visualization in biomedical computing", PARALLEL COMPUTING, ELSEVIER PUBLISHERS, AMSTERDAM, NL, vol. 25, no. 13-14, 1 December 1999 (1999-12-01), pages 2067 - 2110, XP004363672, ISSN: 0167-8191 *
VIEGA J ET AL: "3D MAGIC LENSES", UIST '96. 9TH ANNUAL SYMPOSIUM ON USER INTERFACE SOFTWARE AND TECHNOLOGY. PROCEEDINGS OF THE ACM SYMPOSIUM ON USER INTERFACE SOFTWARE AND TECHNOLOGY. SEATTLE, WA, NOV. 6 - 8, 1996; [ACM SYMPOSIUM ON USER INTERFACE SOFTWARE AND TECHNOLOGY], NEW YORK,, 6 November 1996 (1996-11-06), pages 51 - 58, XP000728616, ISBN: 978-0-89791-798-8 *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010188118A (en) * 2009-01-20 2010-09-02 Toshiba Corp Ultrasound diagnosis apparatus, ultrasound image processing apparatus, image processing method, and image display method
EP2253273A1 (en) 2009-05-18 2010-11-24 Medison Co., Ltd. Ultrasound diagnostic system and method for displaying organ
EP3178402A1 (en) * 2015-12-10 2017-06-14 Samsung Medison Co., Ltd. Ultrasound apparatus and method of displaying ultrasound images
US10278674B2 (en) 2015-12-10 2019-05-07 Samsung Medison Co., Ltd. Ultrasound apparatus and method of displaying ultrasound images

Also Published As

Publication number Publication date
JP2010535043A (en) 2010-11-18
RU2469308C2 (en) 2012-12-10
CN101680950A (en) 2010-03-24
RU2009149622A (en) 2011-07-20
US20100188398A1 (en) 2010-07-29
US8466914B2 (en) 2013-06-18
EP2167991A1 (en) 2010-03-31

Similar Documents

Publication Publication Date Title
US8466914B2 (en) X-ray tool for 3D ultrasound
JP6208731B2 (en) System and method for generating 2D images from tomosynthesis data sets
JP6088498B2 (en) System and method for processing medical images
KR102307530B1 (en) Method and system for displaying to a user a transition between a first rendered projection and a second rendered projection
JP4937261B2 (en) System and method for selectively mixing 2D X-ray images and 3D ultrasound images
CN106569673B (en) Display method and display equipment for multimedia medical record report
US11055899B2 (en) Systems and methods for generating B-mode images from 3D ultrasound data
US20070279436A1 (en) Method and system for selective visualization and interaction with 3D image data, in a tunnel viewer
US20070279435A1 (en) Method and system for selective visualization and interaction with 3D image data
US20070046661A1 (en) Three or four-dimensional medical imaging navigation methods and systems
JPWO2006033377A1 (en) Medical image display apparatus and method, and program
JP6058286B2 (en) Medical image diagnostic apparatus, medical image processing apparatus and method
US9792261B2 (en) Medical image display apparatus, medical image display method, and recording medium
CA3152809A1 (en) Method for analysing medical image data in a virtual multi-user collaboration, a computer program, a user interface and a system
JP6647796B2 (en) Image processing apparatus and X-ray diagnostic apparatus
JP6740051B2 (en) Ultrasonic diagnostic device, medical image processing device, and medical image processing program
JP2002101428A (en) Image stereoscopic vision display device
JP2001101449A (en) Three-dimensional image display device
JP6662580B2 (en) Medical image processing equipment
Jian et al. A preliminary study on multi-touch based medical image analysis and visualization system
Aspin et al. Medivol: An initial study into real-time, interactive 3D visualisation of soft tissue pathologies
Wick et al. First experiences with cardiological teleconsultation using 3D ultrasound data sets
Gering Integrated image zoom and montage

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase
Ref document number: 200880018715.2
Country of ref document: CN

121 Ep: the epo has been informed by wipo that ep was designated in this application
Ref document number: 08763176
Country of ref document: EP
Kind code of ref document: A1

WWE Wipo information: entry into national phase
Ref document number: 2008763176
Country of ref document: EP

WWE Wipo information: entry into national phase
Ref document number: 2010510935
Country of ref document: JP

WWE Wipo information: entry into national phase
Ref document number: 12663088
Country of ref document: US

NENP Non-entry into the national phase
Ref country code: DE

WWE Wipo information: entry into national phase
Ref document number: 7543/CHENP/2009
Country of ref document: IN

WWE Wipo information: entry into national phase
Ref document number: 2009149622
Country of ref document: RU