US20050033123A1 - Region of interest methods and systems for ultrasound imaging - Google Patents

Info

Publication number
US20050033123A1
Authority
US
United States
Prior art keywords
invention
ultrasound images
selected
phase
ultrasound
Prior art date
Legal status
Abandoned
Application number
US10/861,880
Inventor
Edward Gardner
Richard Kane
Joan Main
Current Assignee
Siemens Medical Solutions USA Inc
Original Assignee
Siemens Medical Solutions USA Inc
Priority date
Filing date
Publication date
Priority to US49032403P
Application filed by Siemens Medical Solutions USA Inc
Priority to US10/861,880
Assigned to SIEMENS MEDICAL SOLUTIONS USA, INC. Assignors: GARDNER, EDWARD A., KANE, RICHARD M., MAIN, JOAN C.
Publication of US20050033123A1
Application status: Abandoned

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/20 Analysis of motion
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Detecting, measuring or recording for diagnostic purposes; Identification of persons
    • A61B 5/72 Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B 5/7203 Signal processing specially adapted for physiological signals or for diagnostic purposes for noise prevention, reduction or removal
    • A61B 5/7207 Signal processing specially adapted for physiological signals or for diagnostic purposes for noise prevention, reduction or removal of noise induced by motion artifacts
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/08 Detecting organic movements or changes, e.g. tumours, cysts, swellings
    • A61B 8/0883 Detecting organic movements or changes, e.g. tumours, cysts, swellings for diagnosis of the heart
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/13 Tomography
    • A61B 8/14 Echo-tomography
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/46 Ultrasonic, sonic or infrasonic diagnostic devices with special arrangements for interfacing with the operator or the patient
    • A61B 8/461 Displaying means of special interest
    • A61B 8/463 Displaying means of special interest characterised by displaying multiple images or images and diagnostic data on one display
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/48 Diagnostic techniques
    • A61B 8/481 Diagnostic techniques involving the use of contrast agent, e.g. microbubbles introduced into the bloodstream
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/48 Diagnostic techniques
    • A61B 8/483 Diagnostic techniques involving the acquisition of a 3D volume of data
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/52 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/5269 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving detection or reduction of artifacts
    • A61B 8/5276 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving detection or reduction of artifacts due to motion
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 15/00 Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
    • G01S 15/88 Sonar systems specially adapted for specific applications
    • G01S 15/89 Sonar systems specially adapted for specific applications for mapping or imaging
    • G01S 15/8906 Short-range imaging systems; Acoustic microscope systems using pulse-echo techniques
    • G01S 15/8979 Combined Doppler and pulse-echo imaging systems
    • G01S 15/8981 Discriminating between fixed and moving objects or between objects moving at different speeds, e.g. wall clutter filter
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S 7/52 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00
    • G01S 7/52017 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00 particularly adapted to short-range imaging
    • G01S 7/52023 Details of receivers
    • G01S 7/52036 Details of receivers using analysis of echo signal for target characterisation
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S 7/52 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00
    • G01S 7/52017 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00 particularly adapted to short-range imaging
    • G01S 7/52053 Display arrangements
    • G01S 7/52057 Cathode ray tube displays
    • G01S 7/5206 Two-dimensional coordinated display of distance and direction; B-scan display
    • G01S 7/52066 Time-position or time-motion displays
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S 7/52 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00
    • G01S 7/52017 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S15/00 particularly adapted to short-range imaging
    • G01S 7/52085 Details related to the ultrasound signal acquisition, e.g. scan sequences
    • G01S 7/52087 Details related to the ultrasound signal acquisition, e.g. scan sequences using synchronization techniques
    • G01S 7/52088 Details related to the ultrasound signal acquisition, e.g. scan sequences using synchronization techniques involving retrospective scan line rearrangements
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/0002 Inspection of images, e.g. flaw detection
    • G06T 7/0012 Biomedical image inspection
    • G06T 7/0014 Biomedical image inspection using an image reference approach
    • G06T 7/0016 Biomedical image inspection using an image reference approach involving temporal comparison
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Detecting, measuring or recording for diagnostic purposes; Identification of persons
    • A61B 5/04 Measuring bioelectric signals of the body or parts thereof
    • A61B 5/0402 Electrocardiography, i.e. ECG
    • A61B 5/0452 Detecting specific parameters of the electrocardiograph cycle
    • A61B 5/0456 Detecting R peaks, e.g. for synchronising diagnostic apparatus
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/08 Detecting organic movements or changes, e.g. tumours, cysts, swellings
    • A61B 8/0891 Detecting organic movements or changes, e.g. tumours, cysts, swellings for diagnosis of blood vessels
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/52 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/5284 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving retrospective matching to a physiological signal
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/54 Control of the diagnostic device
    • A61B 8/543 Control of the diagnostic device involving acquisition triggered by a physiological signal
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10132 Ultrasound image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/20 Special algorithmic details
    • G06T 2207/20092 Interactive image processing based on input by user
    • G06T 2207/20104 Interactive definition of region of interest [ROI]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/30 Subject of image; Context of image processing
    • G06T 2207/30004 Biomedical image processing
    • G06T 2207/30048 Heart; Cardiac

Abstract

In one embodiment, at least one visual characteristic from a selected image is used to automatically select an ultrasound image from a set of images from a plurality of heart cycles. In another embodiment, motion correction is performed on ultrasound images that are automatically selected from a plurality of ultrasound images associated with the same phase of the heart cycle. In yet another embodiment, ultrasound images are automatically selected from a set of images based on a time interval that is within a tolerance range from a reference phase of the heart cycle. In another embodiment, a stored user-preference of a phase of the heart cycle is used to automatically select an ultrasound image from a set of images. In yet another embodiment, a user-defined region of interest is placed on an ultrasound image associated with a selected phase of a heart cycle.

Description

    RELATED APPLICATION
  • This application claims the benefit of U.S. Provisional Application No. 60/490,324, filed Jul. 25, 2003, which is hereby incorporated by reference herein.
  • BACKGROUND
  • Contrast agents can be used in a medical ultrasound examination to enhance diagnosis. The rate of contrast agent enhancement of tissue relates to the rate of blood flow to the tissue and can be used to diagnose a variety of disease states. While contrast agent quantification can be performed over an entire heart cycle, techniques have been discussed for performing quantification on specific parts of the heart cycle. For example, U.S. patent application Publication No. US2003/0114759A1 to Skyba et al. describes an ultrasonic imaging system and method for displaying tissue perfusion and other parameters varying with time in which gated or ungated images are used to enable a parametric display to be keyed to specific phases of the heart cycle. Additionally, QLAB Software by Philips Medical Systems has been described as containing custom tools to “auto trim” relative to an ECG trigger for quantification of specific parts of a cardiac cycle.
  • SUMMARY
  • The present invention is defined by the following claims, and nothing in this section should be taken as a limitation on those claims.
  • By way of introduction, the embodiments described below relate to phase selection for cardiac contrast assessment. In one embodiment, at least one visual characteristic from a selected image is used to automatically select an ultrasound image from a set of images from a plurality of heart cycles. In another embodiment, motion correction is performed on ultrasound images that are automatically selected from a plurality of ultrasound images associated with the same phase of the heart cycle. In yet another embodiment, ultrasound images are automatically selected from a set of images based on a time interval that is within a tolerance range from a reference phase of the heart cycle. In another embodiment, a stored user-preference of a phase of the heart cycle is used to automatically select an ultrasound image from a set of images. In yet another embodiment, a user-defined region of interest is placed on an ultrasound image associated with a selected phase of a heart cycle. Other embodiments are provided, and each of the embodiments described herein can be used alone or in combination with one another.
  • The embodiments will now be described with reference to the attached drawings.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a block diagram of a medical diagnostic ultrasound imaging system of an embodiment.
  • FIG. 2 is a schematic of ultrasound images showing the time course of contrast agent into a heart.
  • FIG. 3 shows time intensity curves of an embodiment.
  • FIG. 4 is a flow chart of a method of an embodiment.
  • FIG. 5 illustrates a plurality of sets of ultrasound images of an embodiment.
  • FIG. 6 is a timeline showing an ECG trace and times when image frames of an embodiment were acquired.
  • DETAILED DESCRIPTION OF THE PRESENTLY PREFERRED EMBODIMENTS
  • Introduction
  • Turning to the drawings, FIG. 1 is a block diagram of a medical diagnostic ultrasound imaging system 5 that can be used with the embodiments described herein, which generally relate to diagnostic ultrasonic imaging with contrast agents. As shown in FIG. 1, the ultrasound system 5 comprises a transducer probe 10, a beamformer 15, a processor 20, a display device 25, an ECG device 30, and a user interface 35, each of which is in communication with the others through one or more named or unnamed components. Two components can be in communication with each other through a wired or wireless connection. The term “processor” broadly refers to the appropriate hardware and/or software components of the ultrasound system 5 that can be used to implement the functionality described herein. The ultrasound system 5 can comprise additional components, which are not shown in FIG. 1 for simplicity.
  • During an ultrasound examination, a sonographer contacts the transducer probe 10 with a patient, and the ultrasound system's processor 20 causes the beamformer 15 to apply a voltage to the transducer 10 to cause it to vibrate and emit an ultrasonic beam into the portion of the patient's body in contact with the transducer 10. Ultrasonic energy reflected from the patient's body impinges on the transducer 10, and the resulting voltages created by the transducer 10 are received by the beamformer 15. The processor 20 processes the sensed voltages to create an ultrasound image and displays the image on the display device 25. The ECG device 30 captures ECG information (e.g., a heart cycle waveform) from the patient. The ECG information can be displayed with an ultrasound image during image acquisition. Additionally, the ECG information can be stored with an ultrasound image on a storage device internal or external to the ultrasound system 5 for later review.
  • The ultrasound system 5 can be used for contrast agent imaging to enhance diagnosis. Venous injection of contrast agent causes an increase in the ultrasound signal when the contrast agent washes into the tissue being imaged. In operation, a high-intensity ultrasound pulse is transmitted to destroy contrast agent in a tissue, and then the high-intensity ultrasound pulse is turned off and the tissue is imaged as new contrast-agent-filled blood perfuses into the tissue. The rate of the contrast agent enhancement (or “wash-in”) relates to the rate of blood flow to the tissue and ultimately to tissue perfusion. Because of this, the time course of contrast enhancement can be assessed to diagnose a variety of disease states from coronary artery disease to tumor neovasculature to liver metastases. As used herein, the phrase “assessment of the time course of contrast enhancement” refers to either qualitative or quantitative assessment of contrast ultrasound images. All of these techniques rely on comparisons between images acquired at different times in order to see the variation caused by contrast agent enhancement. These techniques will now be discussed.
  • Qualitative assessment is the simplest method of determining the time course of contrast enhancement and relies on observing, by eye, the visible changes in an ultrasound image caused by contrast agent. Qualitative assessment has been done successfully for coronary perfusion assessment (Cwajg et al., “Detection of Angiographically Significant Coronary Artery Disease with Accelerated Intermittent Imaging and Intravenous Administration of Ultrasound Contrast Materials,” American Heart J. 139: 675-683 (2000)) and characterization of liver diseases. Diagnoses can be made by simply observing the contrast enhancement in a “live” ultrasound image or by reviewing a stored image clip.
  • Quantitative assessment (or analysis) refers to time intensity curve analysis, parametric imaging, displaying results of calculations, or any other contrast enhancement assessment technique (now existing or later developed) other than pure qualitative analysis. Time intensity curve (TIC) analysis allows more precise assessment of the time course of contrast enhancement than can be made by eye alone. TICs present the variation of the average signal level or number of color pixels in a number of regions of interest (ROIs) in graphical form to identify contrast enhancement differences between the regions. The data can also be fitted to functional curves, the parameters of which are used to make absolute determinations of flow characteristics (Wei et al., “Basis for Detection of Stenosis Using Venous Administration of Microbubbles During Myocardial Contrast Echocardiography: Bolus or Continuous Infusion,” JACC 32: 252-60 (July 1998)). These parameters can be related to coronary flow reserve (Wei et al., “Noninvasive Quantification of Coronary Blood Flow Reserve in Humans Using Myocardial Contrast Echocardiography,” Circulation 103: 2560-2565 (2001)) and can be used to make diagnoses.
  • FIGS. 2 and 3 illustrate the use of TICs. FIG. 2 is a schematic of ultrasound images showing the time course of contrast agent into a heart. The image at time 0 seconds was taken before the start of contrast agent wash-in, and the images at times 1 second, 2 seconds, and 3 seconds were taken during contrast wash-in. As shown in these images, the signal from tissue grows over time, but enhancement is slower in the top part of the image, indicating a problem in blood flow. Quantification is accomplished by placing two regions of interest 40, 50 at different regions of the heart image. Changes in the pixel intensity in the two regions of interest 40, 50 are indicative of contrast wash-in in those regions 40, 50, and the average intensity from within those regions 40, 50 is plotted in time-intensity curves (see FIG. 3). The time-intensity curves show that the contrast enhancement in region of interest 50 is slower than the contrast enhancement in region of interest 40 by virtue of the reduced signal at times 1 second and 2 seconds. The data points in the time intensity curves are fitted with smooth functions 60, 70, and parameters of these functions are related to the blood flow in the tissue.
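  • The TIC computation and curve fitting described above can be sketched as follows. This is an illustrative sketch only, not part of the patented disclosure: the function names are hypothetical, frames are assumed to be NumPy arrays, and the wash-in model y(t) = A(1 - e^(-βt)) is the one attributed to Wei et al. in the cited references. The simple log-linear fit stands in for whatever fitting procedure a real system would use.

```python
import numpy as np

def roi_time_intensity(frames, roi_mask):
    # Mean pixel intensity inside the ROI for each frame of the clip.
    return np.array([frame[roi_mask].mean() for frame in frames])

def fit_wash_in(times, intensities):
    # Fit y(t) = A * (1 - exp(-beta * t)) -- the wash-in model of Wei et
    # al. -- by taking A just above the plateau level and solving for beta
    # with a log-linear least-squares fit: ln(1 - y/A) = -beta * t.
    A = float(intensities.max()) * 1.01   # small margin keeps the log finite
    slope = np.polyfit(times, np.log(1.0 - intensities / A), 1)[0]
    return A, -slope
```

In use, `fit_wash_in` would be applied to the TIC from each ROI, and the fitted A (plateau) and beta (wash-in rate) would be the parameters related to blood flow.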
  • TIC analysis has had limited penetration into the market due to the relatively time-consuming analysis required and the lack of spatial information derived from a small number of ROIs. Parametric imaging removes this obstacle while maintaining much of the sensitivity of TIC analysis. Parametric imaging combines the spatial resolution and ease of use of qualitative assessment with the sensitivity of quantitative curve fitting. In parametric imaging, the variation over time of signal level from each pixel is fitted by some function that relates to a physiological phenomenon. Selected parameters from the function for each pixel can then be combined into an image to show variation in the entire series in a static image. For contrast agent imaging, some parameter or combination of parameters related to contrast wash-in is displayed in this parametric image. In this way, spatial variation of the wash-in time course can be easily assessed.
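  • A per-pixel parametric image can be sketched as follows (an illustrative sketch, not the patented method; the function name is hypothetical, and a simple half-rise time stands in for whatever fitted parameter a real system would display):

```python
import numpy as np

def parametric_wash_in_image(clip, times):
    # clip: array of shape (n_frames, H, W); times: acquisition time of each
    # frame. Returns an (H, W) parametric image whose value at each pixel is
    # the time at which that pixel first reaches half of its final intensity,
    # a simple wash-in-time parameter.
    final = clip[-1]                      # intensity at the end of wash-in
    reached = clip >= 0.5 * final         # (n_frames, H, W) boolean
    first_idx = reached.argmax(axis=0)    # first frame index where True
    # Note: a pixel that never reaches half its final value maps to times[0];
    # a real implementation would flag such pixels instead.
    return times[first_idx]
```

Displaying this image shows the spatial variation of the wash-in time course at a glance, which is the purpose of parametric imaging described above.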
  • Motion Effects on the Assessment of the Time Course of Contrast Enhancement
  • During a myocardial contrast imaging exam, cardiac ultrasound images are acquired at a high frame rate (typically, more than 15 frames per second) in order to provide wall motion information for qualitative wall motion assessment as well as to facilitate maintaining the scan plane. However, this high frame rate provides little additional information for quantification assessment, and cardiac motion shown by this high frame rate can make quantification difficult. For example, cardiac motion can cause substantial changes in intensity from different regions. These cyclic changes are apparent in time intensity curves generated from each frame in a real-time clip. It is also difficult or impossible to maintain the same tissue in a ROI through the cardiac cycle. Thickening causes the tissue in a fixed area to change, and any rotation of the heart will also cause different tissue to be in the image plane at different phases of the heart cycle.
  • One solution to this problem is to select one image from each heart cycle for analysis. (As described below, in some situations, more than one image from each heart cycle can be selected.) The selection of images from the same phase of the heart cycle prevents contractile cardiac motion from interfering with quantification of myocardial perfusion because subsequent analysis (preferably including ROI placement) is based on the set of images acquired at the selected phase instead of all of the acquired images. (As used herein, “set” refers to a group of one or more than one member.)
  • FIG. 4 is a flow chart of a method for phase selection for cardiac contrast assessment of an embodiment. As shown in FIG. 4, a plurality of sets of ultrasound images associated with a respective plurality of heart cycles is provided (act 100). Next, a phase of the heart cycle is selected (act 110), and ultrasound images associated with the selected phase are automatically selected (act 120). Next, qualitative and/or quantitative assessment is performed. With qualitative assessment, the automatically-selected ultrasound images are displayed for user assessment (act 130). With quantification assessment involving time intensity curves, one or more regions of interest are placed on an ultrasound image associated with the selected phase (act 150), and quantification analysis is performed (act 160). As shown in FIG. 4, with parametric imaging, no regions of interest are necessary (since quantification is done on each pixel).
  • Motion correction (act 140) can be used with either qualitative or quantitative assessment. It is important to note that while FIG. 4 shows a method with multiple acts, each of these acts can be used alone or in combination with one another. Additionally, these acts can be performed on an ultrasound system or an image review station (e.g., a dedicated workstation or a personal computer with a processor, display device, and user interface). These acts will now be discussed.
  • Providing a Plurality of Sets of Ultrasound Images
  • The first act of the method is to provide a plurality of sets of ultrasound images associated with a respective plurality of heart cycles (act 100). These images can be stored, for example, in a storage device in an ultrasound system or image review station, on removable media (e.g., a magneto-optical disc), or in a network location accessible by the ultrasound system or image review station.
  • With some of the embodiments below, it is preferred that the stored images be acquired using “real-time imaging.” As used herein, the term “real-time imaging” refers to acquiring ultrasound images with a sufficiently high frame rate such that enough ultrasound images are acquired to allow the selection of an image associated with a desired phase of the heart cycle. “Real-time images” refers to images acquired during “real-time imaging.” “Real-time imaging” is different from gated imaging, in which a portion of the heart cycle, such as an R-wave, triggers the acquisition of a single ultrasound image per heart cycle. “Gated images” refer to images acquired using gated imaging. Without intending to set a lower limit to the definition of “real-time imaging,” an example of real-time imaging is acquiring ultrasound images at a frame rate of more than 15 frames per second. It should be noted that, as used herein, the term “real time” does not indicate when images are displayed with respect to a time of acquisition. Accordingly, images played from a stored clip can be “real time” images even though they are being displayed at a time after acquisition.
  • Selecting a Phase of the Heart Cycle
  • Next, a phase of the heart cycle is selected (act 110). This act can be performed manually or automatically. FIG. 5 will be used to illustrate the manual operation. FIG. 5 shows a display of four sets 200, 210, 220, 230 of ultrasound images of an apical four-chamber view of the heart over four heart cycles. Each of the displayed images is associated with a particular phase of the heart cycle, and selecting one of the displayed images selects a phase in the heart cycle. In operation, a user selects a displayed image to represent the cardiac phase that he is interested in based on the shape of the heart in the image or the state of the ECG signal associated with that image. For example, selection of image A selects a portion of the heart cycle that is X ms after the preceding R-wave.
  • In the automatic operation, a user can set a phase preference (e.g., X ms after the R-wave) that is stored in a storage device of the ultrasound system. In future examinations, the ultrasound system retrieves the stored user-preference to automatically select the phase rather than requiring the user to select a displayed image or take some other action to manually select the desired phase. A combination of the automatic and manual operations can be used. For example, the automatic operation can be used to initially select the phase, and, if the user is dissatisfied with this selection, the manual operation can be used.
  • Automatically Selecting Ultrasound Images Associated with the Selected Phase
  • For each of the plurality of sets of ultrasound images, an image associated with the selected phase of the heart cycle is automatically selected (act 120). This selection can be performed, for example, based on a time interval from a part of the heart cycle, such as the R-wave, or from image information. Each of these techniques will now be described.
  • Cardiac ultrasound machines provide a subsystem that measures physiological signals (such as the ECG signal) and presents the information along with the ultrasound images as a reference, as shown in FIG. 5. The ultrasound system typically detects when the R-wave occurs in each heartbeat. The times for each R-wave can be stored along with the times of acquisition of each of the contrast images. The difference between the timestamp for each image and the time of the previous R-wave produces an interval that is analogous to the cardiac phase. Cardiac phase can be defined in a number of ways. The simplest method is to define only two phases of the heart—systole and diastole. This is similar to dividing a sinusoid into a negative phase and a positive phase. Sinusoidal phase can be more specifically defined using a phase angle, and, in a similar fashion, the phase of the cardiac cycle can be divided more finely. Although a phase angle does not map onto the cardiac cycle as directly as it does for sinusoids, the phase can be defined as changing continuously during a cycle where the heart is in the same state during the same phase in all cycles. Irregularities can be considered as variations in the rate of phase change during a heartbeat or between heartbeats.
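  • The interval computation described above (frame timestamp minus the time of the preceding R-wave) can be sketched as follows. This is an illustrative sketch under assumed conventions, not the patented implementation; the function name and the NumPy representation of timestamps are hypothetical:

```python
import numpy as np

def phase_intervals(frame_times, r_wave_times):
    # For each frame timestamp, the interval since the preceding R-wave.
    # This interval is the analog of cardiac phase described in the text.
    ft = np.asarray(frame_times, dtype=float)
    r = np.asarray(r_wave_times, dtype=float)
    idx = np.searchsorted(r, ft, side="right") - 1  # index of preceding R-wave
    intervals = ft - r[np.maximum(idx, 0)]
    intervals[idx < 0] = np.nan                     # frames before the first R-wave
    return intervals
```

For example, with R-waves at 0, 1, and 2 seconds, a frame acquired at 1.5 seconds has an interval of 0.5 seconds, i.e., it sits 500 ms into its heart cycle.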
  • In operation, the time interval between a reference phase (e.g., a preceding R-wave) and the selected phase is determined. That time interval is then compared with the time interval between the reference phase and the phase associated with each image in a set of images from a heart cycle. An image in each cardiac cycle can be selected that has an interval from the preceding R-wave that most closely matches the interval for the image associated with the selected phase. With reference to FIG. 6, a timeline 300 shows an ECG trace and the times when image frames were acquired. Each rectangle indicates the time for one ultrasound frame. When a user selects image A, the ultrasound system automatically identifies frames B, C, D, E, F, and G as having a similar phase as A based on the ECG signal characteristics, such as the time interval from the preceding R-wave or some other characteristic of the ECG signal. In this way, the ECG information is used to sample an image clip at a rate of one frame per heartbeat at the selected phase to retain only images A-G for subsequent analysis.
  • If the imaging frame rate is irregular (possibly due to contrast destruction), there may not be an image that was acquired with a similar interval. In this case, the system preferably identifies that the match is poor and does not provide any images for some of the cardiac cycles. For example, a poor match can be defined as when the best match differs by more than 100 ms. Further, the above method relies on each cardiac cycle being identical. When there are irregularities, the interval between an image and the preceding R-wave may not give a good indication of the phase of the heart in that image. The interval to the succeeding R-wave can also be used to identify the cardiac phase in an image in question, although it should be acknowledged that the length of diastole varies to a much greater extent than does the length of systole. Other characteristics of the ECG signal can also be used to get a better estimate of the phase of a particular image. For example, the T-wave can be identified in order to better identify end-systole. This time can be used as another reference when determining the phase of a particular frame.
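  • The best-match selection, together with the poor-match rejection described above, can be sketched as follows. This is an illustrative sketch only: the function name is hypothetical, and the 100 ms rejection threshold is the example value given in the text. Frames after the last detected R-wave are simply skipped here, since they have no closing R-wave to bound their cycle:

```python
import numpy as np

def select_one_frame_per_cycle(frame_times, r_wave_times, target_interval,
                               max_mismatch=0.100):
    # For each heart cycle (a pair of consecutive R-waves), choose the frame
    # whose interval from the preceding R-wave best matches target_interval.
    # Cycles whose best match is off by more than max_mismatch (100 ms by
    # default, per the text) contribute no frame.
    ft = np.asarray(frame_times, dtype=float)
    r = np.asarray(r_wave_times, dtype=float)
    selected = []
    for start, end in zip(r[:-1], r[1:]):
        in_cycle = (ft >= start) & (ft < end)
        if not in_cycle.any():
            continue                      # no frames acquired in this cycle
        candidates = ft[in_cycle]
        mismatch = np.abs((candidates - start) - target_interval)
        best = mismatch.argmin()
        if mismatch[best] <= max_mismatch:
            selected.append(candidates[best])
    return selected
```

Applied to a clip, this yields at most one timestamp per heartbeat, i.e., the frames A-G of FIG. 6, with cycles omitted when the frame rate was too irregular to supply a close match.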
  • Instead of using ECG information, an image can be automatically selected based on one or more visual characteristics of an image, such as the shape of the heart and/or the state of the valves. In this way, visual characteristics are used to identify a phase in the heart cycle. In operation, an image from a set of images associated with a heart cycle is selected. This image can be manually selected by a user or automatically selected by identifying which image from the set is associated with a phase indicated by a stored user-preference. Then, at least one visual characteristic of the selected image is compared to at least one visual characteristic of the ultrasound images from the other sets. The images that most closely match (or that satisfy some other matching criterion) are automatically selected. With reference again to the illustration in FIG. 5, if the user selects image A, the ultrasound system would compare image characteristics such as the position of the valve leaflets and/or the shape of the myocardium in image A with corresponding visual characteristics in other images in the clip to select the images that match most closely. In this example, A closely matches images B, C, and D, so images A, B, C, and D are used for subsequent analysis. (Alternatively, only the automatically-selected images (images B, C, and D) can be used for subsequent analysis and/or display.)
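Visual-characteristic matching can be sketched as follows. The text leaves the similarity measure open; sum of squared pixel differences is used here purely as a placeholder for feature-based comparisons such as valve position or myocardial shape, and the list-of-lists image format is an assumption.

```python
def select_by_similarity(selected_image, other_cycles):
    """From each cycle's list of images, pick the one most visually similar
    to the selected image. Images are equally-sized 2-D lists of pixel
    intensities; similarity is scored by sum of squared differences (SSD),
    a stand-in for the feature comparisons described in the text."""
    def ssd(a, b):
        return sum((pa - pb) ** 2
                   for row_a, row_b in zip(a, b)
                   for pa, pb in zip(row_a, row_b))
    # Lowest SSD = closest visual match within each cycle.
    return [min(cycle, key=lambda img: ssd(selected_image, img))
            for cycle in other_cycles]
```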
  • In the above discussion, only one frame per cardiac cycle was selected for analysis. In some cases, however, the heart does not change significantly over portions of the heart cycle. For example, in diastole, there are periods where the heart is comparatively still. Several neighboring images from these quiet periods can be used in quantification (to provide more data points) without the complication of heart motion. To do this based on the time interval from the previous R-wave, the system would identify all the images that had intervals within some tolerance range of the interval for the user-selected image. Because the speed of the heart changes dramatically during the heart cycle, this tolerance range preferably varies based on the interval (e.g., longer for intervals within diastole and shorter for intervals within systole).
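The phase-dependent tolerance described above might be sketched like this. The 0.35 s systole boundary and the two tolerance values are invented for illustration; the patent specifies only that the tolerance is longer in diastole than in systole.

```python
def select_quiet_frames(frame_times, r_start, target_interval,
                        systole_end=0.35):
    """Select every frame in one cycle whose offset from the R-wave falls
    within a tolerance of the target offset. The tolerance widens in
    diastole, when the heart is comparatively still, so several neighboring
    frames can feed quantification. Threshold values are illustrative."""
    # Narrow tolerance during systole (rapid motion), wide during diastole.
    tol = 0.02 if target_interval < systole_end else 0.08
    return [t for t in frame_times
            if abs((t - r_start) - target_interval) <= tol]
```

With a diastolic target offset, the sketch returns a small cluster of neighboring frames rather than a single frame per cycle, providing the extra data points mentioned above.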
  • Qualitative Assessment
  • During qualitative assessment, the ultrasound images associated with the desired phase are displayed (act 130). By visually observing the images, the user can detect visible changes caused by contrast agent to make a diagnosis.
  • Quantitative Assessment
  • As described above, quantitative assessment (or analysis) refers to time intensity curve analysis, parametric imaging, displaying results of calculations, or any other contrast enhancement assessment technique (now existing or later developed) other than pure qualitative analysis. Quantitative assessment often relies on the placement of at least one region of interest. Although the region(s) of interest can be placed on an image before or after the automatic selection of images described above, placing the ROI(s) on an automatically-selected image allows a user to rapidly scroll through an image clip to check ROI placement in all the relevant images because the data set has been reduced to images at a particular cardiac phase. After the ROI(s) have been placed, quantification can be performed. For example, time intensity curves can be drawn for each phase of interest in the cardiac cycle. In one presently preferred embodiment, at least one region of interest used in quantification is user-defined and comprises more than nine pixels but covers less than the entire imaged tissue (e.g., the entire myocardium). As used herein, the term “pixel” refers to any one of the small discrete basic elements from which an image is composed.
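A time intensity curve over a placed ROI can be sketched as below. The rectangular ROI and list-based frames are simplifying assumptions; real ROIs may be arbitrarily shaped, and a clinical system would operate on the acoustic intensity data.

```python
def time_intensity_curve(frames, roi):
    """Mean pixel intensity inside a rectangular ROI for each frame of a
    phase-selected sequence. Plotted against acquisition time, the result
    is a time intensity curve for that ROI. roi = (row0, row1, col0, col1),
    half-open bounds. Frames are equally-sized 2-D lists of intensities."""
    r0, r1, c0, c1 = roi
    curve = []
    for frame in frames:
        pixels = [frame[r][c] for r in range(r0, r1) for c in range(c0, c1)]
        curve.append(sum(pixels) / len(pixels))
    return curve
```

Because the sequence has already been reduced to one phase per heartbeat, the curve tracks contrast enhancement over successive cycles without contractile motion confounding the ROI contents.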
  • Motion Correction
  • By selecting images from the same phase of the cardiac cycle, contractile cardiac motion is removed before further analysis. Contractile motion makes up a large part of the motion of the heart, but patient breathing and inadvertent probe movement can also cause motion of the heart in an ultrasound image. Preferably, this motion is also tracked in order to effectively quantify changes in the ultrasound signal received from the tissue. Many techniques have been developed to automatically determine the motion between pairs of images. While these techniques have been used, motion correction in a real-time image clip can be very difficult because of contractile cardiac motion. The phase selection techniques described above simplify motion correction because they allow motion tracking to be performed on a reduced set of images. By first identifying the phase to align, motion correction processing time can be reduced by a factor of 10-30 due to the reduction in the number of frames processed.
  • The motion between pairs of images can be used to change the location of the ROI's so that each ROI always encompasses the same tissue. Alternatively, after the motion has been determined, the images can be registered together so that the motion between the images is removed. Simple registration can allow the entire image to move as a unit to remove bulk motion. More complicated techniques register different parts of the image independently and stretch or shrink the intervening image in order to accommodate relative movement in an image.
  • The most successful motion detection techniques use correlation-based algorithms such as minimizing the sum of absolute differences (SAD). The SAD technique is a good choice because it is well-suited for rapid calculation. Using this technique, the absolute difference is calculated between each point in the original image and points in the subsequent image corresponding to each of many potential motions. The best estimate of the actual motion is chosen as the movement of the second image that produces the minimum of the sum of the absolute differences. When ROI's are tracked to eliminate motion, the SAD is preferably calculated between each ROI in the original image and possible motions of that ROI in the subsequent image. The minimum of the SAD for each ROI will provide an estimate of the position of each ROI in the subsequent image. When simple, bulk-movement registration is used, the SAD is preferably calculated for a large region in the center of the image. This will estimate the motion of the most important part of the image in order to eliminate that motion through registration. When more complicated registration techniques are used that accommodate relative movement in the image, the SAD is preferably calculated for many regions throughout the image. The movement of each of these regions can then be estimated based on the minimum of these SAD calculations and error checking. In this technique, more weight is preferably placed on the displacement estimates from some regions when calculating the displacement field for the image.
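The SAD search described above can be sketched in a few lines. This is an illustrative exhaustive search over integer shifts with pure-Python images; the region layout and search radius are assumptions, not the patent's parameters.

```python
def estimate_motion_sad(ref, nxt, region, max_shift=2):
    """Estimate the (dy, dx) displacement of `region` between two frames by
    minimizing the sum of absolute differences (SAD) over candidate shifts.
    `region` = (row0, row1, col0, col1), half-open, in `ref`; images are
    equally-sized 2-D lists of intensities. Sketch of the SAD technique."""
    r0, r1, c0, c1 = region
    best_shift, best_sad = (0, 0), float("inf")
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            # Skip shifts that move the region outside the next frame.
            if not (0 <= r0 + dy and r1 + dy <= len(nxt)
                    and 0 <= c0 + dx and c1 + dx <= len(nxt[0])):
                continue
            sad = sum(abs(ref[r][c] - nxt[r + dy][c + dx])
                      for r in range(r0, r1) for c in range(c0, c1))
            if sad < best_sad:
                best_sad, best_shift = sad, (dy, dx)
    return best_shift
```

Run once per ROI, the minimizing shift tracks that ROI into the next frame; run on a large central region, it estimates the bulk motion to remove by registration, as described above.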
  • There are several alternatives that can be used with these embodiments. For example, while the embodiments described above have been illustrated using contrast-enhanced ultrasound images, these techniques can also be used with ultrasound images that are free of contrast agent. Further, as noted above, these embodiments can be used with real-time or gated images. For example, motion correction can be performed on gated images or images manually selected by a user from a real-time image clip instead of automatically-selected images. Further, motion correction can take place before or after region of interest placement and can be used in conjunction with either qualitative or quantitative assessment. Finally, as noted above, each of the embodiments described herein can be used alone or in combination with one another.
  • It is intended that the foregoing detailed description be understood as an illustration of selected forms that the invention can take and not as a definition of the invention. It is only the following claims, including all equivalents, that are intended to define the scope of this invention.

Claims (88)

1. A method for automatically selecting an ultrasound image from a set of ultrasound images, the method comprising:
(a) providing a plurality of sets of ultrasound images associated with a respective plurality of heart cycles;
(b) selecting a first ultrasound image from a first set of ultrasound images associated with a first heart cycle; and
(c) automatically selecting an ultrasound image from each of the other sets of ultrasound images by comparing at least one visual characteristic of the first ultrasound image with at least one visual characteristic of the ultrasound images in each of the other sets.
2. The invention of claim 1, wherein (c) comprises determining which ultrasound image in each of the other sets comprises at least one visual characteristic that most-closely matches the at least one visual characteristic of the first ultrasound image.
3. The invention of claim 1, wherein the ultrasound images are of a heart, and wherein the at least one visual characteristic comprises a shape of the heart.
4. The invention of claim 1, wherein the ultrasound images are of a heart comprising a valve, and wherein the at least one visual characteristic comprises a state of the valve.
5. The invention of claim 1, wherein the ultrasound images comprise real-time images.
6. The invention of claim 1, wherein the ultrasound images comprise contrast-enhanced ultrasound images.
7. The invention of claim 1 further comprising:
(d) displaying the ultrasound images automatically selected in (c).
8. The invention of claim 7, wherein (d) further comprises displaying the ultrasound image selected in (b).
9. The invention of claim 1 further comprising:
(d) performing a quantification analysis on the ultrasound images automatically selected in (c).
10. The invention of claim 9, wherein (d) comprises performing a quantification analysis on the ultrasound images automatically selected in (c) and the ultrasound image selected in (b).
11. The invention of claim 1 further comprising:
(d) generating a set of time intensity curves using the ultrasound images automatically selected in (c).
12. The invention of claim 1 further comprising:
(d) performing motion correction on the ultrasound images automatically selected in (c).
13. The invention of claim 12, wherein (d) comprises performing motion correction on the ultrasound images automatically selected in (c) and the ultrasound image selected in (b).
14. The invention of claim 1, wherein the first ultrasound image is selected by a user from a plurality of displayed ultrasound images.
15. The invention of claim 1, wherein the first ultrasound image is automatically selected by identifying which image from the first set is associated with a phase of the heart cycle indicated by a stored user-preference.
16. The invention of claim 1 further comprising placing at least one region of interest on at least one of the first image and an automatically-selected image.
17. A method for motion correcting a set of ultrasound images associated with a phase of a heart cycle, the method comprising:
(a) providing a plurality of sets of ultrasound images associated with a respective plurality of heart cycles;
(b) selecting a phase of a heart cycle;
(c) for each of the plurality of sets, automatically selecting an ultrasound image that is associated with the selected phase of the heart cycle; and
(d) performing motion correction on the automatically-selected ultrasound images.
18. The invention of claim 17, wherein the motion correction comprises a sum of absolute differences (SAD) technique.
19. The invention of claim 17 further comprising placing a region of interest on an image associated with the selected phase before (d), and wherein (d) comprises changing the location of the region of interest.
20. The invention of claim 17, wherein the motion correction registers ultrasound images together.
21. The invention of claim 17, wherein the phase of the heart cycle is selected in (b) by a stored user-preference.
22. The invention of claim 17, wherein the phase of the heart cycle is selected in (b) by user selection of one of a plurality of displayed ultrasound images, each of the displayed ultrasound images being associated with a respective phase of the heart cycle.
23. The invention of claim 22, wherein the user-selected image comprises at least one visual characteristic, and wherein the ultrasound images are automatically selected in (c) by comparing the at least one visual characteristic of the user-selected ultrasound image with at least one visual characteristic of the ultrasound images in each of the plurality of sets.
24. The invention of claim 17, wherein the ultrasound images are automatically selected in (c) by comparing (1) a time interval between a reference phase in the heart cycle and a phase associated with an ultrasound image in a set with (2) a time interval between the reference phase and the selected phase.
25. The invention of claim 24, wherein the reference phase comprises a preceding R-wave.
26. The invention of claim 24, wherein the reference phase comprises a succeeding R-wave.
27. The invention of claim 24, wherein the reference phase comprises a T-wave.
28. The invention of claim 17, wherein the ultrasound images comprise real-time images.
29. The invention of claim 17, wherein the ultrasound images comprise contrast-enhanced ultrasound images.
30. The invention of claim 17 further comprising displaying the ultrasound images associated with the selected phase.
31. The invention of claim 30, wherein the displaying is performed before the motion correction in (d).
32. The invention of claim 30, wherein the displaying is performed after the motion correction in (d).
33. The invention of claim 17 further comprising:
(e) performing a quantification analysis on the motion-corrected ultrasound images.
34. The invention of claim 17 further comprising:
(e) generating a set of time intensity curves using the motion-corrected ultrasound images.
35. The invention of claim 17 further comprising placing at least one region of interest on an automatically-selected ultrasound image.
36. A method for motion correcting a set of ultrasound images associated with a phase of a heart cycle, the method comprising:
(a) providing a plurality of ultrasound images associated with a single phase of a heart cycle; and
(b) performing motion correction on the plurality of ultrasound images.
37. The invention of claim 36, wherein the plurality of ultrasound images comprise gated images.
38. The invention of claim 36, wherein the plurality of ultrasound images comprise real-time images.
39. The invention of claim 36, wherein the plurality of ultrasound images comprise contrast-enhanced ultrasound images.
40. The invention of claim 36, wherein the plurality of ultrasound images are manually selected by a user from a real-time image clip.
41. The invention of claim 36, wherein the motion correction comprises a sum of absolute differences (SAD) technique.
42. A method for automatically selecting ultrasound images associated with a phase of a heart cycle, the method comprising:
(a) providing a plurality of sets of ultrasound images associated with a respective plurality of heart cycles, wherein each ultrasound image is characterized by a time interval between an acquisition time of the ultrasound image and a reference phase in the heart cycle;
(b) selecting a phase of the heart cycle, wherein the selected phase is characterized by a time interval between the selected phase and the reference phase; and
(c) automatically selecting a plurality of ultrasound images from each heart cycle that are characterized by a time interval that is within a tolerance range of the time interval between the selected phase and the reference phase.
43. The invention of claim 42, wherein the automatically-selected plurality of ultrasound images are from the diastole portion of the heart cycle.
44. The invention of claim 42, wherein the tolerance range varies with the heart cycle.
45. The invention of claim 44, wherein the tolerance range is greater for diastole than it is for systole.
46. The invention of claim 42, wherein the reference phase comprises a preceding R-wave.
47. The invention of claim 42, wherein the reference phase comprises a succeeding R-wave.
48. The invention of claim 42, wherein the reference phase comprises a T-wave.
49. The invention of claim 42 further comprising:
(d) motion correcting the automatically-selected plurality of ultrasound images.
50. The invention of claim 42 further comprising:
(d) displaying the automatically-selected plurality of ultrasound images.
51. The invention of claim 42 further comprising:
(d) performing a quantification analysis on the automatically-selected plurality of ultrasound images.
52. The invention of claim 42 further comprising:
(d) generating a set of time intensity curves using the automatically-selected plurality of ultrasound images.
53. The invention of claim 42, wherein the phase of the heart cycle is selected in (b) by a stored user-preference.
54. The invention of claim 42, wherein the phase of the heart cycle is selected in (b) by user selection of one of a plurality of displayed ultrasound images, each of the displayed ultrasound images being associated with a respective phase of the heart cycle.
55. The invention of claim 42, wherein the ultrasound images comprise real-time images.
56. The invention of claim 42, wherein the ultrasound images comprise contrast-enhanced ultrasound images.
57. The invention of claim 42 further comprising placing at least one region of interest on an automatically-selected ultrasound image.
58. A method for automatically selecting ultrasound images associated with a phase of a heart cycle, the method comprising:
(a) providing a plurality of sets of ultrasound images associated with a respective plurality of heart cycles;
(b) retrieving a stored user-preference of a phase of the heart cycle; and
(c) for each of the plurality of sets of ultrasound images, automatically selecting an ultrasound image that is associated with the phase specified in the stored user-preference.
59. The invention of claim 58 further comprising:
(d) performing motion correction on the automatically-selected ultrasound images.
60. The invention of claim 58 further comprising:
(d) displaying the automatically-selected ultrasound images.
61. The invention of claim 58 further comprising:
(d) performing a quantification analysis on the automatically-selected ultrasound images.
62. The invention of claim 58 further comprising:
(d) generating a set of time intensity curves using the automatically-selected ultrasound images.
63. The invention of claim 58, wherein the ultrasound images are automatically selected in (c) by comparing at least one visual characteristic of an ultrasound image associated with the phase specified in the stored user-preference with at least one visual characteristic of the ultrasound images in each of the plurality of sets.
64. The invention of claim 58, wherein the ultrasound images are automatically selected in (c) by comparing (1) a time interval between a reference phase in the heart cycle and a phase associated with an ultrasound image in a set with (2) a time interval between the reference phase and the phase specified in the stored user-preference.
65. The invention of claim 58, wherein the ultrasound images comprise real-time images.
66. The invention of claim 58, wherein the ultrasound images comprise contrast-enhanced ultrasound images.
67. The invention of claim 58 further comprising placing at least one region of interest on an automatically-selected image.
68. A method for automatically selecting ultrasound images associated with a phase of a heart cycle, the method comprising:
(a) providing a plurality of sets of ultrasound images associated with a respective plurality of heart cycles;
(b) selecting a phase of a heart cycle;
(c) for each of the plurality of sets, automatically selecting an ultrasound image that is associated with the selected phase of the heart cycle; and
(d) placing at least one user-defined region of interest on an ultrasound image associated with the selected phase of the heart cycle.
69. The invention of claim 68, wherein the at least one user-defined region of interest comprises more than nine pixels.
70. The invention of claim 68, wherein the at least one user-defined region of interest comprises less than an entire imaged tissue.
71. The invention of claim 68, wherein the at least one user-defined region of interest comprises less than an entire myocardium.
72. The invention of claim 68 further comprising performing motion correction on the automatically-selected ultrasound images.
73. The invention of claim 72, wherein the motion correction is performed before (d).
74. The invention of claim 72, wherein the motion correction is performed after (d).
75. The invention of claim 68, wherein the phase of the heart cycle is selected in (b) by a stored user-preference.
76. The invention of claim 68, wherein the phase of the heart cycle is selected in (b) by user selection of one of a plurality of displayed ultrasound images, each of the displayed ultrasound images being associated with a respective phase of the heart cycle.
77. The invention of claim 76, wherein the user-selected image comprises at least one visual characteristic, and wherein the ultrasound images are automatically selected in (c) by comparing the at least one visual characteristic of the user-selected image with at least one visual characteristic of the ultrasound images in each of the plurality of sets.
78. The invention of claim 68, wherein the ultrasound images are automatically selected in (c) by comparing (1) a time interval between a reference phase in the heart cycle and a phase associated with an ultrasound image in a set with (2) a time interval between the reference phase and the selected phase.
79. The invention of claim 78, wherein the reference phase comprises a preceding R-wave.
80. The invention of claim 78, wherein the reference phase comprises a succeeding R-wave.
81. The invention of claim 78, wherein the reference phase comprises a T-wave.
82. The invention of claim 68, wherein the ultrasound images comprise real-time images.
83. The invention of claim 68, wherein the ultrasound images comprise contrast-enhanced ultrasound images.
84. The invention of claim 68 further comprising displaying the automatically-selected ultrasound images.
85. The invention of claim 68 further comprising:
(e) performing a quantification analysis on the automatically-selected ultrasound images.
86. The invention of claim 68 further comprising:
(e) generating a time-intensity curve for the at least one region of interest.
87. A medical diagnostic ultrasound imaging system comprising:
a transducer;
a beamformer in communication with the transducer;
a processor in communication with the beamformer; and
a display device in communication with the processor;
wherein the processor is operative to perform one or more of the methods in claims 1, 17, 36, 42, 58, or 68.
88. An ultrasound image review station comprising:
a display device;
a user interface device; and
a processor in communication with the display device and the user interface device, wherein the processor is operative to perform one or more of the methods in claims 1, 17, 36, 42, 58, or 68.
US10/861,880 2003-07-25 2004-06-03 Region of interest methods and systems for ultrasound imaging Abandoned US20050033123A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US49032403P 2003-07-25 2003-07-25
US10/861,880 US20050033123A1 (en) 2003-07-25 2004-06-03 Region of interest methods and systems for ultrasound imaging

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US10/861,880 US20050033123A1 (en) 2003-07-25 2004-06-03 Region of interest methods and systems for ultrasound imaging
US12/234,484 US8285357B2 (en) 2003-07-25 2008-09-19 Region of interest methods and systems for ultrasound imaging
US12/234,511 US8320989B2 (en) 2003-07-25 2008-09-19 Region of interest methods and systems for ultrasound imaging

Related Child Applications (2)

Application Number Title Priority Date Filing Date
US12/234,511 Division US8320989B2 (en) 2003-07-25 2008-09-19 Region of interest methods and systems for ultrasound imaging
US12/234,484 Division US8285357B2 (en) 2003-07-25 2008-09-19 Region of interest methods and systems for ultrasound imaging

Publications (1)

Publication Number Publication Date
US20050033123A1 true US20050033123A1 (en) 2005-02-10

Family

ID=34118825

Family Applications (3)

Application Number Title Priority Date Filing Date
US10/861,880 Abandoned US20050033123A1 (en) 2003-07-25 2004-06-03 Region of interest methods and systems for ultrasound imaging
US12/234,511 Active 2026-11-09 US8320989B2 (en) 2003-07-25 2008-09-19 Region of interest methods and systems for ultrasound imaging
US12/234,484 Active 2026-09-29 US8285357B2 (en) 2003-07-25 2008-09-19 Region of interest methods and systems for ultrasound imaging


Country Status (1)

Country Link
US (3) US20050033123A1 (en)

Cited By (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050033179A1 (en) * 2003-07-25 2005-02-10 Gardner Edward A. Phase selection for cardiac contrast assessment
US20060241457A1 (en) * 2005-03-09 2006-10-26 Siemens Medical Solutions Usa, Inc. Cyclical information determination with medical diagnostic ultrasound
EP1757955A1 (en) * 2005-08-24 2007-02-28 Medison Co., Ltd. Apparatus and method for processing an ultrasound image
EP1806594A2 (en) * 2006-01-06 2007-07-11 Medison Co., Ltd. Ultrasound system and method of displaying ultrasound image
US20070230758A1 (en) * 2006-03-31 2007-10-04 Siemens Medical Solutions Usa, Inc. Cross reference measurement for diagnostic medical imaging
JP2007330764A (en) * 2006-01-10 2007-12-27 Toshiba Corp Ultrasonic diagnostic apparatus and ultrasonic image creating method
EP1872724A1 (en) * 2006-01-10 2008-01-02 Kabushiki Kaisha Toshiba Ultrasonograph and ultrasonogram creating method
WO2008053401A1 (en) * 2006-11-03 2008-05-08 Philips Intellectual Property & Standards Gmbh Cardiac phase determination
KR100836146B1 (en) 2005-08-24 2008-06-16 주식회사 메디슨 Apparatus and method for processing a 3-dimensional ultrasound image
EP1949857A1 (en) * 2005-11-15 2008-07-30 Hitachi Medical Corporation Ultrasonographic device
US20080214934A1 (en) * 2007-03-02 2008-09-04 Siemens Medical Solutions Usa, Inc. Inter-frame processing for contrast agent enhanced medical diagnostic ultrasound imaging
US20090010511A1 (en) * 2003-07-25 2009-01-08 Gardner Edward A Region of interest methods and systems for ultrasound imaging
WO2009061521A1 (en) * 2007-11-11 2009-05-14 Imacor, Llc Method and system for synchronized playback of ultrasound images
WO2009093211A1 (en) * 2008-01-23 2009-07-30 Michalakis Averkiou Therapy assessment with ultrasonic contrast agents
US20100228127A1 (en) * 2006-08-09 2010-09-09 Koninklijke Philips Electronics N.V. Ultrasound imaging system
US20120288172A1 (en) * 2011-05-10 2012-11-15 General Electric Company Method and system for ultrasound imaging with cross-plane images
US20130169782A1 (en) * 2012-01-04 2013-07-04 Samsung Medison Co., Ltd. Diagnostic imaging apparatus and method of operating the same
WO2013140358A1 (en) * 2012-03-23 2013-09-26 Koninklijke Philips N.V. Imaging system for imaging a periodically moving object
US20130274601A1 (en) * 2010-12-13 2013-10-17 Toshiba Medical Systems Corporation Ultrasound diagnosis apparatus, image processing apparatus, and image processing method
GB2505988A (en) * 2012-06-26 2014-03-19 Gen Electric Diagnostic system and method for obtaining an ultrasound image frame
US20140099010A1 (en) * 2012-10-07 2014-04-10 Aspect Imaging Ltd. Mri system with means to eliminate object movement whilst acquiring its image
US20150031995A1 (en) * 2013-07-26 2015-01-29 Siemens Medical Solutions Usa, Inc. Motion Artifact Suppression for Three-Dimensional Parametric Ultrasound Imaging
EP2886059A1 (en) * 2013-09-25 2015-06-24 CureFab Technologies GmbH 4d pulse corrector with deformable registration
CN105686851A (en) * 2016-01-14 2016-06-22 深圳开立生物医疗科技股份有限公司 Blood flow 3D imaging method, device and ultrasonic equipment thereof

Families Citing this family (11)

Publication number Priority date Publication date Assignee Title
US8520947B2 (en) * 2007-05-22 2013-08-27 The University Of Western Ontario Method for automatic boundary segmentation of object in 2D and/or 3D image
EP2248122A1 (en) * 2008-02-25 2010-11-10 Inventive Medical Limited Medical training method and apparatus
DE102010033610A1 (en) * 2010-08-06 2012-02-09 Siemens Aktiengesellschaft Method for displaying a lymph node and a correspondingly designed combined MR / PET device
JP6150532B2 (en) * 2013-01-22 2017-06-21 オリンパス株式会社 Measuring device and program
KR20150106779A (en) 2014-03-12 2015-09-22 삼성메디슨 주식회사 The method and apparatus for displaying a plurality of different images of an object
CN105426927B (en) * 2014-08-26 2019-05-10 东芝医疗系统株式会社 Medical image processing devices, medical image processing method and medical image equipment
CN104915924B (en) * 2015-05-14 2018-01-26 常州迪正雅合电子科技有限公司 One kind realizes that three-dimensional ultrasound pattern determines calibration method automatically
US20170119352A1 (en) 2015-10-30 2017-05-04 Carestream Health, Inc. Ultrasound display method
US9971498B2 (en) * 2015-12-15 2018-05-15 General Electric Company Medical imaging device and method for using adaptive UI objects
US20170347992A1 (en) 2016-06-02 2017-12-07 Carestream Health, Inc. Automated region of interest placement
WO2019210292A1 (en) * 2018-04-27 2019-10-31 Delphinus Medical Technologies, Inc. System and method for feature extraction and classification on ultrasound tomography images

Citations (27)

Publication number Priority date Publication date Assignee Title
US4373533A (en) * 1980-02-27 1983-02-15 Tokyo Shibaura Denki Kabushiki Kaisha Ultrasonic diagnosing apparatus
US5355887A (en) * 1991-10-31 1994-10-18 Fujitsu Limited Ultrasonic diagnostic apparatus
US5456255A (en) * 1993-07-12 1995-10-10 Kabushiki Kaisha Toshiba Ultrasonic diagnosis apparatus
US5538003A (en) * 1995-05-18 1996-07-23 Hewlett-Packard Company Quick method and apparatus for identifying a region of interest in an ultrasound display
US5615680A (en) * 1994-07-22 1997-04-01 Kabushiki Kaisha Toshiba Method of imaging in ultrasound diagnosis and diagnostic ultrasound system
US5657760A (en) * 1994-05-03 1997-08-19 Board Of Regents, The University Of Texas System Apparatus and method for noninvasive doppler ultrasound-guided real-time control of tissue damage in thermal therapy
US5701897A (en) * 1992-10-02 1997-12-30 Kabushiki Kaisha Toshiba Ultrasonic diagnosis apparatus and image displaying system
US5743266A (en) * 1995-04-25 1998-04-28 Molecular Biosystems, Inc. Method for processing real-time contrast enhanced ultrasonic images
US5785654A (en) * 1995-11-21 1998-07-28 Kabushiki Kaisha Toshiba Ultrasound diagnostic apparatus
US5820561A (en) * 1996-07-30 1998-10-13 Vingmed Sound A/S Analysis and measurement of temporal tissue velocity information
US6030344A (en) * 1996-12-04 2000-02-29 Acuson Corporation Methods and apparatus for ultrasound image quantification
US6174287B1 (en) * 1999-06-11 2001-01-16 Acuson Corporation Medical diagnostic ultrasound system and method for continuous M-mode imaging and periodic imaging of contrast agents
US6213945B1 (en) * 1999-08-18 2001-04-10 Acuson Corporation Ultrasound system and method for generating a graphical vascular report
US6217520B1 (en) * 1998-12-02 2001-04-17 Acuson Corporation Diagnostic medical ultrasound system and method for object of interest extraction
US6306095B1 (en) * 1997-04-11 2001-10-23 Acuson Corporation Gated ultrasound imaging apparatus and method
US6352507B1 (en) * 1999-08-23 2002-03-05 G.E. Vingmed Ultrasound As Method and apparatus for providing real-time calculation and display of tissue deformation in ultrasound imaging
US6360027B1 (en) * 1996-02-29 2002-03-19 Acuson Corporation Multiple ultrasound image registration system, method and transducer
US6447453B1 (en) * 2000-12-07 2002-09-10 Koninklijke Philips Electronics N.V. Analysis of cardiac performance using ultrasonic diagnostic images
US6458082B1 (en) * 1999-09-29 2002-10-01 Acuson Corporation System and method for the display of ultrasound data
US6500121B1 (en) * 1997-10-14 2002-12-31 Guided Therapy Systems, Inc. Imaging, therapy, and temperature monitoring ultrasonic system
US6579240B2 (en) * 2001-06-12 2003-06-17 Ge Medical Systems Global Technology Company, Llc Ultrasound display of selected movement parameter values
US20040064036A1 (en) * 2002-09-26 2004-04-01 Zuhua Mao Methods and systems for motion tracking
US6884216B2 (en) * 2002-08-12 2005-04-26 Kabushiki Kaisha Toshiba Ultrasound diagnosis apparatus and ultrasound image display method and apparatus
US6994673B2 (en) * 2003-01-16 2006-02-07 Ge Ultrasound Israel, Ltd Method and apparatus for quantitative myocardial assessment

Family Cites Families (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5235984A (en) 1992-03-30 1993-08-17 Hewlett-Packard Company On-line acoustic densitometry tool for use with an ultrasonic imaging system
NO943214D0 (en) 1994-08-30 1994-08-30 Vingmed Sound As A method of ultrasound imaging
US5779641A (en) * 1997-05-07 1998-07-14 General Electric Company Method and apparatus for three-dimensional ultrasound imaging by projecting filtered pixel data
US6099471A (en) * 1997-10-07 2000-08-08 General Electric Company Method and apparatus for real-time calculation and display of strain in ultrasound imaging
US6346124B1 (en) * 1998-08-25 2002-02-12 University Of Florida Autonomous boundary detection system for echocardiographic images
US6193660B1 (en) * 1999-03-31 2001-02-27 Acuson Corporation Medical diagnostic ultrasound system and method for region of interest determination
US6558325B1 (en) 2000-07-13 2003-05-06 Acuson Corporation Medical diagnostic ultrasonic imaging method and system for displaying multi-phase, multi-frame images
US6491636B2 (en) * 2000-12-07 2002-12-10 Koninklijke Philips Electronics N.V. Automated border detection in ultrasonic diagnostic images
AT539681T (en) * 2001-01-30 2012-01-15 R Christopher Decharms Methods for physiological monitoring, training and regulation
US6692438B2 (en) * 2001-12-18 2004-02-17 Koninklijke Philips Electronics Nv Ultrasonic imaging system and method for displaying tissue perfusion and other parameters varying with time
US6673017B1 (en) * 2002-08-28 2004-01-06 Acuson Corporation Temporal resolution method and systems for ultrasound imaging
US7593554B2 (en) * 2002-10-03 2009-09-22 Koninklijke Philips Electronics N.V. System and method for comparing ultrasound images corresponding to two user-selected data points
US20040066389A1 (en) * 2002-10-03 2004-04-08 Koninklijke Philips Electronics N.V System and method for automatically generating a series of ultrasound images each representing the same point in a physiologic periodic waveform
US7731660B2 (en) 2003-07-25 2010-06-08 Siemens Medical Solutions Usa, Inc. Phase selection for cardiac contrast assessment
US20050033123A1 (en) * 2003-07-25 2005-02-10 Siemens Medical Solutions Usa, Inc. Region of interest methods and systems for ultrasound imaging
US6980844B2 (en) * 2003-08-28 2005-12-27 Ge Medical Systems Global Technology Company Method and apparatus for correcting a volumetric scan of an object moving at an uneven period

Cited By (58)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7731660B2 (en) 2003-07-25 2010-06-08 Siemens Medical Solutions Usa, Inc. Phase selection for cardiac contrast assessment
US8285357B2 (en) 2003-07-25 2012-10-09 Siemens Medical Solutions Usa, Inc. Region of interest methods and systems for ultrasound imaging
US20080033294A1 (en) * 2003-07-25 2008-02-07 Siemens Medical Solutions Usa, Inc. Phase selection for cardiac contrast assessment
US20080027319A1 (en) * 2003-07-25 2008-01-31 Siemens Medical Solutions Usa, Inc. Phase selection for cardiac contrast assessment
US8320989B2 (en) 2003-07-25 2012-11-27 Siemens Medical Solutions Usa, Inc. Region of interest methods and systems for ultrasound imaging
US7981035B2 (en) 2003-07-25 2011-07-19 Siemens Medical Solutions Usa, Inc. Phase selection for cardiac contrast assessment
US7854702B2 (en) 2003-07-25 2010-12-21 Siemens Medical Solutions Usa, Inc. Phase selection for cardiac contrast assessment
US20050033179A1 (en) * 2003-07-25 2005-02-10 Gardner Edward A. Phase selection for cardiac contrast assessment
US20090010511A1 (en) * 2003-07-25 2009-01-08 Gardner Edward A Region of interest methods and systems for ultrasound imaging
US20090016586A1 (en) * 2003-07-25 2009-01-15 Gardner Edward A Region of interest methods and systems for ultrasound imaging
US7775978B2 (en) 2005-03-09 2010-08-17 Siemens Medical Solutions Usa, Inc. Cyclical information determination with medical diagnostic ultrasound
US20060241457A1 (en) * 2005-03-09 2006-10-26 Siemens Medical Solutions Usa, Inc. Cyclical information determination with medical diagnostic ultrasound
KR100836146B1 (en) 2005-08-24 2008-06-16 주식회사 메디슨 Apparatus and method for processing a 3-dimensional ultrasound image
US7628755B2 (en) 2005-08-24 2009-12-08 Medison Co., Ltd. Apparatus and method for processing an ultrasound image
US20070053566A1 (en) * 2005-08-24 2007-03-08 Medison Co., Ltd. Apparatus and method for processing an ultrasound image
EP1757955A1 (en) * 2005-08-24 2007-02-28 Medison Co., Ltd. Apparatus and method for processing an ultrasound image
US20100022877A1 (en) * 2005-11-15 2010-01-28 Tomoaki Chono Ultrasonographic device
EP1949857A4 (en) * 2005-11-15 2009-12-23 Hitachi Medical Corp Ultrasonographic device
EP1949857A1 (en) * 2005-11-15 2008-07-30 Hitachi Medical Corporation Ultrasonographic device
EP1806594A2 (en) * 2006-01-06 2007-07-11 Medison Co., Ltd. Ultrasound system and method of displaying ultrasound image
US20070161895A1 (en) * 2006-01-06 2007-07-12 Medison Co., Ltd. Ultrasound system and method of displaying ultrasound image
EP1806594A3 (en) * 2006-01-06 2007-12-12 Medison Co., Ltd. Ultrasound system and method of displaying ultrasound image
EP1872724A1 (en) * 2006-01-10 2008-01-02 Kabushiki Kaisha Toshiba Ultrasonograph and ultrasonogram creating method
US10265053B2 (en) 2006-01-10 2019-04-23 Toshiba Medical Systems Corporation Ultrasonic diagnostic apparatus and method of generating ultrasonic image
JP2013059679A (en) * 2006-01-10 2013-04-04 Toshiba Corp Ultrasonic diagnostic apparatus
JP2007330764A (en) * 2006-01-10 2007-12-27 Toshiba Corp Ultrasonic diagnostic apparatus and ultrasonic image creating method
EP1872724A4 (en) * 2006-01-10 2010-02-24 Toshiba Kk Ultrasonograph and ultrasonogram creating method
US20080262354A1 (en) * 2006-01-10 2008-10-23 Tetsuya Yoshida Ultrasonic diagnostic apparatus and method of generating ultrasonic image
US20070230758A1 (en) * 2006-03-31 2007-10-04 Siemens Medical Solutions Usa, Inc. Cross reference measurement for diagnostic medical imaging
US7817835B2 (en) * 2006-03-31 2010-10-19 Siemens Medical Solutions Usa, Inc. Cross reference measurement for diagnostic medical imaging
US20100228127A1 (en) * 2006-08-09 2010-09-09 Koninklijke Philips Electronics N.V. Ultrasound imaging system
US10353069B2 (en) 2006-08-09 2019-07-16 Koninklijke Philips N.V. Ultrasound imaging system with image rate sequences
WO2008053401A1 (en) * 2006-11-03 2008-05-08 Philips Intellectual Property & Standards Gmbh Cardiac phase determination
US8175356B2 (en) 2006-11-03 2012-05-08 Koninklijke Philips Electronics N.V. Cardiac phase determination
US20100074485A1 (en) * 2006-11-03 2010-03-25 Koninklijke Philips Electronics N.V. Cardiac phase determination
US20080214934A1 (en) * 2007-03-02 2008-09-04 Siemens Medical Solutions Usa, Inc. Inter-frame processing for contrast agent enhanced medical diagnostic ultrasound imaging
WO2008108922A1 (en) * 2007-03-02 2008-09-12 Siemens Medical Solutions Usa, Inc. Inter-frame processing for contrast agent enhanced medical diagnostic ultrasound imaging
WO2009061521A1 (en) * 2007-11-11 2009-05-14 Imacor, Llc Method and system for synchronized playback of ultrasound images
US20090149749A1 (en) * 2007-11-11 2009-06-11 Imacor Method and system for synchronized playback of ultrasound images
WO2009093211A1 (en) * 2008-01-23 2009-07-30 Michalakis Averkiou Therapy assessment with ultrasonic contrast agents
US8460194B2 (en) 2008-01-23 2013-06-11 Michalakis Averkiou Therapy assessment with ultrasound contrast agents
US20100298710A1 (en) * 2008-01-23 2010-11-25 Koninklijke Philips Electronics N.V. Therapy assessment with ultrasound contrast agents
US20130274601A1 (en) * 2010-12-13 2013-10-17 Toshiba Medical Systems Corporation Ultrasound diagnosis apparatus, image processing apparatus, and image processing method
US20120288172A1 (en) * 2011-05-10 2012-11-15 General Electric Company Method and system for ultrasound imaging with cross-plane images
US8798342B2 (en) * 2011-05-10 2014-08-05 General Electric Company Method and system for ultrasound imaging with cross-plane images
US20130169782A1 (en) * 2012-01-04 2013-07-04 Samsung Medison Co., Ltd. Diagnostic imaging apparatus and method of operating the same
US9398857B2 (en) * 2012-01-04 2016-07-26 Samsung Medison Co., Ltd. Diagnostic imaging apparatus and method of operating the same
US10130341B2 (en) 2012-03-23 2018-11-20 Koninklijke Philips N.V. Imaging system for imaging a periodically moving object
CN104203108A (en) * 2012-03-23 2014-12-10 皇家飞利浦有限公司 Imaging system for imaging a periodically moving object
WO2013140358A1 (en) * 2012-03-23 2013-09-26 Koninklijke Philips N.V. Imaging system for imaging a periodically moving object
GB2505988A (en) * 2012-06-26 2014-03-19 Gen Electric Diagnostic system and method for obtaining an ultrasound image frame
US8777856B2 (en) 2012-06-26 2014-07-15 General Electric Company Diagnostic system and method for obtaining an ultrasound image frame
US9709652B2 (en) * 2012-10-07 2017-07-18 Aspect Imaging Ltd. MRI system with means to eliminate object movement whilst acquiring its image
US20140099010A1 (en) * 2012-10-07 2014-04-10 Aspect Imaging Ltd. Mri system with means to eliminate object movement whilst acquiring its image
US10034657B2 (en) * 2013-07-26 2018-07-31 Siemens Medical Solutions Usa, Inc. Motion artifact suppression for three-dimensional parametric ultrasound imaging
US20150031995A1 (en) * 2013-07-26 2015-01-29 Siemens Medical Solutions Usa, Inc. Motion Artifact Suppression for Three-Dimensional Parametric Ultrasound Imaging
EP2886059A1 (en) * 2013-09-25 2015-06-24 CureFab Technologies GmbH 4d pulse corrector with deformable registration
CN105686851A (en) * 2016-01-14 2016-06-22 深圳开立生物医疗科技股份有限公司 Blood flow 3D imaging method, device and ultrasonic equipment thereof

Also Published As

Publication number Publication date
US8320989B2 (en) 2012-11-27
US8285357B2 (en) 2012-10-09
US20090016586A1 (en) 2009-01-15
US20090010511A1 (en) 2009-01-08

Similar Documents

Publication Publication Date Title
Klingensmith et al. Evaluation of three-dimensional segmentation algorithms for the identification of luminal and medial-adventitial borders in intravascular ultrasound images
Wilkenshoff et al. Regional mean systolic myocardial velocity estimation by real-time color Doppler myocardial imaging: a new technique for quantifying regional systolic function
Kühl et al. High-resolution transthoracic real-time three-dimensional echocardiography: quantitation of cardiac volumes and function using semi-automatic border detection and comparison with cardiac magnetic resonance imaging
EP1350470B1 (en) Image processing device and ultrasonic diagnostic device
US8622915B2 (en) Ultrasound image processing to render three-dimensional images from two-dimensional images
Badano et al. Right ventricle in pulmonary arterial hypertension: haemodynamics, structural changes, imaging, and proposal of a study protocol aimed to assess remodelling and treatment effects
Ingul et al. Automated analysis of strain rate and strain: feasibility and clinical implications
Gopal et al. Freehand three-dimensional echocardiography for determination of left ventricular volume and mass in patients with abnormal ventricles: comparison with magnetic resonance imaging
US7824337B2 (en) Ultrasonic image processing apparatus and control program for ultrasonic image processing apparatus
US6436049B1 (en) Three-dimensional ultrasound diagnosis based on contrast echo technique
US7308297B2 (en) Cardiac imaging system and method for quantification of desynchrony of ventricles for biventricular pacing
US8150128B2 (en) Systems and method for composite elastography and wave imaging
CN102123665B (en) Dynamical visualization of coronary vessels and myocardial perfusion information
EP1543773B1 (en) Biological tissue motion trace method and image diagnosis device using the trace method
US6994673B2 (en) Method and apparatus for quantitative myocardial assessment
US6628743B1 (en) Method and apparatus for acquiring and analyzing cardiac data from a patient
JP4763883B2 (en) Ultrasonic diagnostic equipment
JP2006522664A (en) Method and system for knowledge-based diagnostic imaging
US8265358B2 (en) Ultrasonic image processing apparatus and method for processing ultrasonic image
JP4172962B2 (en) Ultrasound image acquisition with synchronized reference images
JP2005080791A (en) Ultrasonic diagnostic equipment and image processing apparatus
JP4676334B2 (en) Biological signal monitoring device
US20060078182A1 (en) Methods and apparatus for analyzing ultrasound images
Pérez et al. On-line assessment of ventricular function by automatic boundary detection and ultrasonic backscatter imaging
Horton et al. Assessment of the right ventricle by echocardiography: a primer for cardiac sonographers

Legal Events

Date Code Title Description
AS Assignment

Owner name: SIEMENS MEDICAL SOLUTIONS USA, INC., PENNSYLVANIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:GARDNER, EDWARD A.;KANE, RICHARD M.;MAIN, JOAN C.;REEL/FRAME:015456/0959;SIGNING DATES FROM 20040602 TO 20040603

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION