WO2014097103A1 - Segmentation of breast lesions in 3d ultrasound images - Google Patents

Segmentation of breast lesions in 3D ultrasound images

Info

Publication number
WO2014097103A1
Authority
WO
WIPO (PCT)
Prior art keywords
segmentation
image
interest
region
markers
Prior art date
Application number
PCT/IB2013/060965
Other languages
French (fr)
Inventor
Kongkuo Lu
Xin Liu
Original Assignee
Koninklijke Philips N.V.
Priority date
Filing date
Publication date
Priority to US201261737994P priority Critical
Priority to US61/737,994 priority
Application filed by Koninklijke Philips N.V. filed Critical Koninklijke Philips N.V.
Publication of WO2014097103A1 publication Critical patent/WO2014097103A1/en

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/149Segmentation; Edge detection involving deformable models, e.g. active contour models
    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/12Edge-based segmentation
    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10132Ultrasound image
    • G06T2207/101363D ultrasound image
    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20092Interactive image processing based on input by user
    • G06T2207/20101Interactive definition of point of interest, landmark or seed
    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30004Biomedical image processing
    • G06T2207/30068Mammography; Breast
    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30004Biomedical image processing
    • G06T2207/30096Tumor; Lesion

Abstract

A medical imaging system for segmentation of breast lesions in 3D ultrasound images has a display screen in which automatic segmentation, semi-automatic segmentation or manual segmentation of a breast lesion may be selected. When the lesion in an image has been previously delineated by the placement of six markers around the lesion or marking of its three major axes, the system automatically detects the markers, then automatically segments the boundary of the lesion from the marker locations. If automatic segmentation fails, semi-automatic segmentation may be enabled to allow manual placement of the markers on an image followed by automatic boundary segmentation. If neither automatic nor semi-automatic segmentation is effective, manual segmentation can be enabled.

Description

SEGMENTATION OF BREAST LESIONS

IN 3D ULTRASOUND IMAGES

This invention relates to medical diagnostic ultrasound and, in particular, to the segmentation of breast lesions in three dimensional diagnostic images.

Technological advances in the field of non-invasive imaging of the human body have expanded the ability of medical professionals to detect, diagnose, and treat disease. Among the imaging modalities currently in use for breast cancer diagnosis, ultrasound has many advantages, such as being easy to use, inexpensive, employing non-ionizing radiation, being sensitive to dense breast tissue, and having low false positive rates. The diagnostic information obtained from breast imaging can serve as input to clinical decision support systems (CDSS). A CDSS system can take input from segmentation results to help ascertain malignancy. For example, CDSS could potentially greatly increase the currently low sensitivity/specificity of the diagnosis of breast cancer with ultrasound. A CDSS system takes as its input characteristics of identified breast lesions, processes and weights the information, and suggests a probable diagnosis of malignant or benign or displays similar cases with known diagnoses from a database. This guidance can assist physicians in improving breast lesion diagnosis and therapy.

For most CDSS systems, lesion segmentation is one of the first steps. As used herein, segmentation means delineating the boundary of the lesion from surrounding normal tissue in a medical image. However, due to the nature of some imaging modalities such as ultrasound, these images often suffer from poor image quality caused by speckle noise, low contrast, blurred edges and shadow effects, which pose difficulties for automated segmentation techniques. This is particularly the case with lesions, which are often highly heterogeneous and present signals which overlap with those from the background.

Recently, several research groups have proposed semi-automated methods for segmenting lesions in medical images. Most proposed methods require the clinician to identify the region of interest (ROI) which encompasses the lesion or where the lesion can be found, then to start the segmentation algorithm in a proper direction by clicking on certain features or landmarks in the image. Even so, existing methods are not robust for every type of lesion, and current segmentation techniques do not always produce the accurate lesion boundaries required by many CDSS systems. The present invention is directed to improving lesion segmentation in a workflow that identifies key features in an image and uses such identification to maximize automation of the segmentation process while still providing clinically acceptable segmentation results.

In accordance with the principles of the present invention, a diagnostic or interventional or therapy system receives a plurality of lesion images and enables a clinician to select one of the images for the particular clinical procedure. If the image has already undergone basic identification of the long and short axes of the lesion, the system automatically identifies the short and long axis markers, then uses their locations to segment the lesion. If no markers are present or found in the image, the clinician is prompted to indicate marker locations and the system then uses the user-applied markers to segment the lesion. If the quality of the image is such that automatic segmentation cannot be performed with a high likelihood of success, the clinician is prompted to do the segmentation manually. The user interface enables the clinician to start with the highest level of automation and proceed to the lowest level of automation which produces clinically acceptable results, or to start immediately with a lower level of automated segmentation.
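The escalating workflow just described can be summarized as a simple decision chain. The following Python sketch is illustrative only; the injected helper callables (detect_markers, auto_segment, prompt_user_markers, manual_trace) are hypothetical names standing in for the system's marker detection, boundary segmentation, and user-interface functions, not functions defined in this disclosure.

```python
from typing import Callable

def segment_lesion(
    image,
    detect_markers: Callable,       # image -> list of marker locations (possibly empty)
    auto_segment: Callable,         # (image, markers) -> boundary contour, or None on failure
    prompt_user_markers: Callable,  # image -> markers placed manually by the clinician
    manual_trace: Callable,         # image -> boundary traced freehand or by rubberbanding
):
    """Escalate from fully automatic to semi-automatic to manual segmentation."""
    # Highest level of automation: detect pre-placed axis markers, then segment.
    markers = detect_markers(image)
    if markers:
        boundary = auto_segment(image, markers)
        if boundary is not None:
            return boundary, "automatic"

    # Semi-automatic: the clinician indicates the marker locations,
    # then the same automatic boundary search is run from those markers.
    markers = prompt_user_markers(image)
    boundary = auto_segment(image, markers)
    if boundary is not None:
        return boundary, "semi-automatic"

    # Lowest level of automation: the clinician traces the boundary manually.
    return manual_trace(image), "manual"
```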

In the drawings:

FIGURE 1 illustrates in block diagram form an ultrasonic diagnostic imaging system constructed in accordance with the principles of the present invention.

FIGURE 2 illustrates a user interface for a breast lesion diagnosis system constructed in accordance with the principles of the present invention.

FIGURE 3 is a flowchart of a sequence of operating the system of FIGURE 2 from the highest level of automation to the lowest.

FIGURE 4 illustrates the user interface of FIGURE 2 after fully automated or semi-automated segmentation of a breast lesion.

FIGURE 5 illustrates the user interface of FIGURE 2 after a breast lesion has been manually segmented.

FIGURE 6 is a flowchart of a process for segmenting a three dimensional image of a breast lesion.

FIGURE 7 illustrates diagrams explaining the segmentation of a three dimensional image of a breast lesion.

FIGURE 1 illustrates an ultrasound system constructed in accordance with the present invention. An ultrasound probe 410 which includes a 1D or 2D array transducer 412 transmits ultrasonic waves and receives ultrasonic echo signals. This transmission and reception is performed under control of a beamformer 420 which processes the received echo signals to form coherent beams of echo signals from the anatomy being scanned. When the probe contains a 2D array transducer, the beamformation process is partitioned between a microbeamformer in the probe, which partially beamforms the received echo signals, and the system beamformer 420, which completes the beamforming process for three dimensional imaging. The echo information is Doppler processed by a Doppler processor 430 when flow or motion information is to be obtained, and the processed Doppler information is coupled to an image processor 440 which forms 2D or 3D grayscale or Doppler images. The images pass through a Cineloop memory 460 from which they may be coupled directly to a video processor 470 for display of the images on an image display 480. In accordance with the present invention the images are also applied to an image segmentation processor 490 which operates on the 2D or 3D images as described below to define the anatomical borders and boundaries of lesions in the images. The defined borders are overlaid on the images which are coupled to the video processor 470 for display. The system may operate to define and display boundaries on loops of images saved in the Cineloop memory 460, or to display borders drawn on real time images produced during live scanning of a patient.

FIGURE 2 illustrates the display screen of an ultrasound system when operating in the segmentation mode in accordance with the present invention. This display screen may be produced and used on the ultrasound system which acquired the images, such as the ultrasound system shown in FIGURE 1, or it may be produced and used on a diagnostic image workstation to which acquired images have been exported.

Arranged vertically along the left side of the display screen are images 12 acquired in an ultrasound exam which may be used in a diagnosis, as stated at step 300 of FIGURE 3. The clinician reviews these images 12 and selects one for display in the main review window 14, as stated at step 302 of FIGURE 3. The images 12 in this example are images of a breast which may contain lesions, and the objective of the exam is to identify and diagnose lesions found in the breast. Breast images in many clinics undergo an initial screening in which a clinician reviews the images, looking for breast lesions. When the clinician finds a lesion, the clinician marks the height and width dimensions of the lesion. This may be done in different ways, as by marking the top and bottom of the lesion's maximum vertical extent and the left and right sides of the maximum horizontal extent of the lesion, the short and long axes of the lesion. Lines may be drawn between the indicated vertical and horizontal points, or marker icons may be placed on the points indicated by mouse clicks by the clinician, depending upon the options available for marking and selected for use by the clinician on the ultrasound system. In the example of FIGURE 2, the selected image 16 in the review window 14 has previously been marked by a clinician who placed markers 20 marking the height and width of a lesion in the image 16. The markers 20 used on the ultrasound system on which this preliminary review was performed are seen to be small circles with crosses over them.

When the clinician sees height and width markers 20 of the axes of a lesion in an image as in FIGURE 2, the clinician can command the system to automatically detect them by invoking the "auto" mode in the segmentation mode box 30. In another implementation, the marker-searching algorithm will automatically identify the markers when present, and automatically trigger auto-segmentation by switching to "auto" mode. The system can use various techniques to detect the markers. If the image format has a graphic overlay with the markers on it, the system simply has to identify the image locations of the overlaid markers. Another technique is to perform image analysis to search the image for the marker graphics. A preferred technique is to have a number of marker templates available in the system which correspond to marker graphics in common use on ultrasound systems. The clinician may then select the appropriate graphic template, or the system can automatically apply each template available to it. The system then performs a matching process, trying to correlate uniformly appearing pixel patterns in the image to the shapes and appearances of the template graphics. As each marker is found, its image location is recorded by the system.
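By way of illustration, a template-matching search of this kind could be sketched as follows using OpenCV's normalized cross-correlation. The function name, score threshold, and the assumption that the marker graphics appear in a grayscale copy of the image are illustrative choices, not details taken from this disclosure.

```python
import cv2
import numpy as np

def find_markers(image_gray, templates, score_threshold=0.8, max_markers=6):
    """Locate marker graphics (e.g. small circles with crosses) by matching a
    set of known marker templates against the image.

    image_gray : 2D uint8 image with the marker graphics visible in it
    templates  : list of 2D uint8 template patches
    Returns a list of (x, y) marker centre locations.
    """
    found = []
    for tmpl in templates:
        th, tw = tmpl.shape
        response = cv2.matchTemplate(image_gray, tmpl, cv2.TM_CCOEFF_NORMED)
        # Greedily take the strongest correlation peaks, suppressing a
        # template-sized neighbourhood so one marker is not reported twice.
        for _ in range(max_markers):
            _, score, _, (x, y) = cv2.minMaxLoc(response)
            if score < score_threshold:
                break
            found.append((x + tw // 2, y + th // 2))
            y0, y1 = max(0, y - th), min(response.shape[0], y + th)
            x0, x1 = max(0, x - tw), min(response.shape[1], x + tw)
            response[y0:y1, x0:x1] = -1.0
    return found
```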

After the four markers 20 are detected in step 304 of FIGURE 3, automatic segmentation of the lesion in the image 16 can be performed as stated in step 308. A preferred automatic technique is to analytically connect the four markers with a polygon to define an initial contour delineating the lesion. For example, the four markers can be connected by an ellipse (piece-wise spline curve) or by straight lines which form a diamond shape. This initial contour is then used to initialize a more local search for the lesion boundary in the image. This search can proceed in directions away from the initial contour line and on both sides of the line, looking for an energy change that denotes a lesion boundary. For example, the brightness of lines of pixels extending outward from the contour line can be examined to look for a sharp change, a maximum gradient, of the pixel brightness at a boundary, as illustrated in Figs. 7a-7c of US Pat. 6,447,454 (Chenal et al.). This brightness gradient search technique is especially effective with a fluid-filled lesion such as a cyst or non-diffusive mass, which will appear distinctive against the surrounding bright boundary of the lesion with normal tissue. Other energy characteristics of the pixels can be taken into consideration, such as the smoothness of normal tissue in comparison with that of the lesion, or the homogeneity of normal tissue in comparison with a non-homogeneous lesion. A transition from one characteristic to another is indicative of a lesion boundary. Other techniques can be used in the localized search, such as spline-fitting a line to the lesion boundary, or snake algorithms. Preferably the lesion will be found within the initial contour so that the initial contour can be applied to limit possible leakage of a delineated boundary beyond the lesion. Through a number of iterations of energy optimization, a stopping criterion will be met when the energy converges to a local minimum. FIGURE 4 illustrates a successful, automatically fitted boundary 22 of a lesion in ultrasound image 16.
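A minimal NumPy sketch of this idea follows, assuming the four markers are given as (x, y) pixel coordinates in top, right, bottom, left order and that the image is a 2D grayscale array. The axis-aligned ellipse and the single maximum-gradient pass along each ray stand in for the iterative energy optimization described above; they are not the disclosed implementation.

```python
import numpy as np

def initial_ellipse(markers, n_points=64):
    """Connect the four axis markers (top, right, bottom, left) with an
    axis-aligned ellipse used as the initial contour."""
    top, right, bottom, left = [np.asarray(m, dtype=float) for m in markers]
    center = (top + right + bottom + left) / 4.0
    a = (right[0] - left[0]) / 2.0   # horizontal half-axis
    b = (bottom[1] - top[1]) / 2.0   # vertical half-axis (image y points down)
    t = np.linspace(0.0, 2.0 * np.pi, n_points, endpoint=False)
    contour = np.stack([center[0] + a * np.cos(t),
                        center[1] + b * np.sin(t)], axis=1)
    return contour, center

def refine_by_gradient(image, contour, center, search_range=15):
    """For each point of the initial contour, examine pixel brightness along
    the ray from the lesion centre and move the point to the position of the
    sharpest brightness change (maximum gradient)."""
    refined = []
    for pt in contour:
        direction = pt - center
        direction = direction / (np.linalg.norm(direction) + 1e-9)
        offsets = np.arange(-search_range, search_range + 1)
        samples = []
        for d in offsets:
            x, y = pt + d * direction
            xi = int(np.clip(round(x), 0, image.shape[1] - 1))
            yi = int(np.clip(round(y), 0, image.shape[0] - 1))
            samples.append(float(image[yi, xi]))
        grad = np.abs(np.diff(samples))  # brightness change between neighbouring samples
        refined.append(pt + offsets[int(np.argmax(grad))] * direction)
    return np.asarray(refined)
```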

If the four markers 20 cannot be automatically detected in step 304 of FIGURE 3, a message to that effect is presented to the clinician and the clinician is invited to proceed with semi-automatic segmentation in step 306. It is possible that the four markers were not found because they are indistinct against the surrounding image, or that there were no markers in the image because the image did not undergo preliminary review and marker placement. In either case, the clinician can perform semi-automatic segmentation of a lesion by indicating marker locations in the image with a pointing device such as a computer mouse, and clicking to fix markers in the proper locations in the image. After the markers have been placed in position by the clinician, the automated procedure of fitting an initial contour to the markers and searching locally to delineate the lesion boundary can be performed as described above in conjunction with step 308.

After the lesion has been segmented by delineating the lesion boundary 22, a clinician evaluates the results of the segmentation in step 310. If the clinician judges the segmentation to be acceptable, the shape of the lesion boundary 22 can be extracted from the image 16 and used for further processing, such as applying it to a CDSS system to develop a suggested diagnosis, using it for image-guided ablation planning, and so on. For instance, if the shape of the lesion boundary 22 is highly irregular as illustrated in FIGURE 4, this high irregularity is indicative of malignancy. A smooth boundary shape is indicative of a benign lesion.

Other characteristics may be extracted using the lesion boundary 22, such as using the boundary to delineate tissue within the region of interest (ROI) in comparison to normal tissue outside of the boundary, as indicated in step 312. A lesion with a homogeneous tissue texture is generally benign, whereas a non-homogeneous tissue texture is often malignant. This texture feature is extracted at step 316 and supplied to a CDSS system to aid in diagnosis of the lesion.
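As a purely illustrative sketch (this disclosure does not specify particular texture measures), simple first-order statistics computed inside and outside the traced boundary can quantify how homogeneous the lesion tissue is relative to its surroundings:

```python
import numpy as np

def texture_features(image, lesion_mask):
    """Compare tissue texture inside the segmented lesion with the normal
    tissue outside it. `lesion_mask` is a boolean array derived from the
    boundary tracing; the feature names here are illustrative only."""
    inside = image[lesion_mask].astype(float)
    outside = image[~lesion_mask].astype(float)
    return {
        "inside_mean": inside.mean(),
        "inside_std": inside.std(),
        "outside_mean": outside.mean(),
        "outside_std": outside.std(),
        # A value near or below 1 suggests homogeneous (often benign) tissue;
        # a large value suggests heterogeneous, more suspicious tissue.
        "relative_heterogeneity": inside.std() / (outside.std() + 1e-9),
    }
```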

If the lesion boundary cannot be automatically traced in steps 306 or 308, or the clinician does not think the segmentation was successful in step 310, the clinician is then presented with the option to segment the lesion manually. The clinician selects the manual mode in box 30 and is presented with the tools necessary to manually trace the lesion boundary. This is generally done by clicking on the boundary with a computer mouse to start tracing, then carefully moving the pointing device along the lesion boundary to trace it freehand. Alternatively the clinician can place a polygon shape over the lesion, then can click on control points on the polygon to stretch it to the lesion boundary, a process known as "rubberbanding" as described in the aforementioned Chenal et al. patent. FIGURE 5 illustrates a lesion which has been successfully segmented with a manually drawn tracing 24. After manual segmentation, the shape of the boundary tracing 24 or the texture of the lesion tissue or both can be extracted and used in a CDSS system to aid lesion diagnosis.

FIGURES 6 and 7 illustrate use of the lesion segmentation system of the present invention with three dimensional (3D) images. In this case the initial review of 3D images of a lesion includes the placement of six markers 1-6 on a 3D image around the ROI of the lesion. The six markers are placed above, below, to the left, to the right, and in front of and behind the ROI of the lesion, generally in line with the three major axes in the three dimensions of the lesion. FIGURE 6 describes the use of the screen display of FIGURE 2 to segment 3D images. In step 600, 3D images from an exam are imported into the system and displayed as the series of images 12. One of the 3D images 16 is selected for review in the main review window 14. Automatic segmentation begins by automatically detecting the six markers 1-6 as indicated at step 602. Instead of segmenting the lesion in the full volume image, it is preferable to perform segmentation on three planar images intersecting the lesion as illustrated by FIGURE 7. The cube 700 in FIGURE 7A represents an image volume which contains a lesion. The coordinate directions of the image volume are indicated in the upper right corner of the drawing. This volume is intersected by three planes 702, 704 and 706 as shown in FIGURE 7A. These planes may be individually scanned planes through the volume or may be multiplanar reconstructed (MPR) images of three planes through the volume image data. In this example the three planes are all orthogonal to each other, although that will not necessarily be true in most situations. In the preliminary review six markers 1-6 have been placed in the volume around the ROI of the lesion. Markers 1 and 3 are above and below the lesion and markers 2 and 4 are to the right and left of the lesion. Markers 5 and 6 are in front of and behind the lesion. Image plane 702 is selected so that its orientation intersects markers 6, 2, 5, and 4 as shown in FIGURE 7A and individually in FIGURE 7B. Similarly, image plane 704 is oriented to intersect markers 1, 2, 3, and 4, and image plane 706 is oriented to intersect markers 1, 6, 3, and 5 as shown in FIGURES 7A, 7C, and 7D. Since each image plane includes four markers as in the 2D imaging example, six-corner segmentation can be performed on each of the three planar images with their four markers as in the 2D image case described above.
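Under the simplifying assumption that the three planes are the axis-aligned slices through the centroid of the six markers (the disclosure also allows individually scanned or arbitrarily oriented MPR planes), extracting them from the volume might look like the following sketch; the function name and the [z, y, x] indexing convention are assumptions, not details of this disclosure.

```python
import numpy as np

def planes_through_markers(volume, markers):
    """Extract three mutually orthogonal slices through the centroid of the
    six markers placed above/below, left/right and in front of/behind the
    lesion. Each slice then passes near four of the six markers, so the
    planar segmentation described above can be applied to it.

    volume  : 3D array indexed as [z, y, x]
    markers : six (x, y, z) marker locations
    """
    cx, cy, cz = np.mean(np.asarray(markers, dtype=float), axis=0).round().astype(int)
    axial    = volume[cz, :, :]   # horizontal plane through the marker centroid
    coronal  = volume[:, cy, :]   # vertical plane, left-right direction
    sagittal = volume[:, :, cx]   # vertical plane, front-back direction
    return axial, coronal, sagittal
```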

When the six markers 1-6 are automatically identified in step 602 using the marker detection techniques described above, the boundary of the lesion in each of the three image planes 702, 704, and 706 can be automatically delineated in step 606 using the segmentation techniques described above. It can be seen from FIGURE 7 that the markers in two of the image planes, which define the boundaries in those two planes, will also include the four markers of the third plane which define the boundary in that third plane. For example, the boundaries on planes 704 and 706 will intersect the boundary on plane 702 at four intersection points. Using the four intersection points on each 2D sectional image, the four-corner segmentation described above will be performed to define the boundary on each plane. When all the 2D planar images are processed, the set of 2D boundary tracings on all of the planes results in the 3D segmentation of the lesion. The three lesion boundary tracings are output to a CDSS system in step 608.

If the six markers 1-6 are not detected automatically in step 602, or the image being reviewed does not contain markers, the clinician is given the option to proceed with semi-automatic segmentation by placing the six markers 1-6 on the volume image in step 604. After the six markers have been manually placed around the lesion ROI by the clinician, six-corner segmentation can be performed on the three intersecting image planes as shown in step 606. If the automatic segmentation process of step 606 fails for any reason, the clinician can be given the option of manually selecting three image planes from the volume and either tracing the lesion boundary freehand or rubberbanding a polygon shape or shapes over the three image planes. As before, the clinician is again given the option to proceed with automatic, semi-automatic, or manual boundary segmentation to produce the most accurate boundary tracings with the greatest amount of effective automation and in the most efficient length of time.

Other variations of the above techniques and embodiments will readily occur to those skilled in the art. For instance, segmentation of a planar lesion image may be performed with fewer than four markers 20. In FIGURE 2 the right-most marker 20 is seen to be at the edge of the image 16, but a given image may not extend this far to the right. The system will then be able to detect only the three markers 20 in the image area but not the marker to the right of the image area. The symmetry of the marker placement can indicate to the system that the missing marker is to the right of the image area, and segmentation can proceed with just the three markers identified in the image area, with the boundary tracing truncated at the right side of the image. In the 3D case, different volumes with different orientations of the lesion in the volume can be analyzed to measure the extent of the lesion in different image planes through the lesion. Only one or two boundary tracings may be output for diagnosis. For example, only the most irregular boundary tracing may be output for diagnostic aid, or only that which delineates the most non-homogeneous lesion ROI may be output for further use.

Claims

WHAT IS CLAIMED IS:
1. A medical imaging system for segmentation of a region of interest in a three dimensional diagnostic image comprising:
a display screen onto which a three dimensional diagnostic image can be imported for review in a review window;
an image segmentation processor, responsive to an imported image, to automatically detect markers in the image which delineate a region of interest in three dimensions,
wherein the processor is further responsive to the detection of the markers to segment the region of interest by delineation of the boundary of the region of interest;
wherein the display screen further enables selection of semi-automatic segmentation of a region of interest by enabling the manual placement of markers on an imported image; and
wherein the display screen further enables selection of manual segmentation of a region of interest by enabling the manual delineation of the boundary of a region of interest.
2. The medical imaging system of Claim 1, wherein the processor is further responsive to the manual placement of markers on an image to segment a region of interest by delineation of the boundary of the region of interest.
3. The medical imaging system of Claim 1 wherein the image segmentation processor is further adapted to automatically detect six markers which delineate the region of interest.
4. The medical imaging system of Claim 3, wherein the six markers are located to the left, to the right, above, below, in front of, and behind the region of interest in the three dimensional image.
5. The medical imaging system of Claim 4, wherein the image segmentation processor is further operable to identify three planes through the region of interest which intersect marker locations.
6. The medical imaging system of Claim 5, wherein the three planes further comprise orthogonally oriented planes, each of which intersects four marker locations.
7. The medical imaging system of Claim 5, wherein the three planes further comprise three image planes of the region of interest,
wherein the processor is further responsive to the locations of the markers in each image plane to delineate a boundary of the region of interest in each image plane.
8. The medical imaging system of Claim 7, wherein the region of interest of the ultrasound image further comprises a breast lesion,
wherein the six markers are located above, below, to the left, to the right, in front of and behind the breast lesion in the image.
9. The medical imaging system of Claim 8, wherein pairs of the six markers are located on the approximate locations of three major, generally orthogonal axes of the three dimensional image of the breast lesion.
10. The medical imaging system of Claim 3, wherein the three dimensional diagnostic image further comprises no detectable markers,
wherein segmentation of a region of interest in the three dimensional image is performed by enabling semi-automatic or manual segmentation of a region of interest .
11. The medical imaging system of Claim 10, wherein the region of interest contains a breast lesion;
wherein the boundary of the region of interest further comprises a breast lesion boundary; and
wherein the system further produces as an output at least one of a breast lesion boundary shape or a breast lesion texture.
12. The medical imaging system of Claim 10, wherein, in the event of a segmentation of a region of interest which is unacceptable, the selection of manual segmentation is enabled.
13. The medical imaging system of Claim 12, wherein the selection of manual segmentation enables either freehand segmentation or manual segmentation by use of a polygon shape.
14. The medical imaging system of Claim 1, wherein the selection of manual segmentation enables either freehand segmentation or manual segmentation by use of a polygon shape.
15. The medical imaging system of Claim 14, wherein manual segmentation by use of a polygon shape further comprises stretching the polygon shape to the boundary of a region of interest in the image by a rubberbanding process.
PCT/IB2013/060965 2012-12-17 2013-12-16 Segmentation of breast lesions in 3d ultrasound images WO2014097103A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US201261737994P true 2012-12-17 2012-12-17
US61/737,994 2012-12-17

Publications (1)

Publication Number Publication Date
WO2014097103A1 true WO2014097103A1 (en) 2014-06-26

Family

ID=50029169

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2013/060965 WO2014097103A1 (en) 2012-12-17 2013-12-16 Segmentation of breast lesions in 3d ultrasound images

Country Status (1)

Country Link
WO (1) WO2014097103A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0990922A2 (en) * 1998-09-30 2000-04-05 Matsushita Electric Industrial Co., Ltd. System for accurately obtaining a contour and/or quantitative information from an echo image with reduced manual operation
WO2002045586A1 (en) * 2000-12-07 2002-06-13 Koninklijke Philips Electronics N.V. Acquisition, analysis and display of ultrasonic diagnostic cardiac images
WO2007071050A1 (en) * 2005-12-20 2007-06-28 Resonant Medical Inc. Methods and systems for segmentation and surface matching

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0990922A2 (en) * 1998-09-30 2000-04-05 Matsushita Electric Industrial Co., Ltd. System for accurately obtaining a contour and/or quantitative information from an echo image with reduced manual operation
WO2002045586A1 (en) * 2000-12-07 2002-06-13 Koninklijke Philips Electronics N.V. Acquisition, analysis and display of ultrasonic diagnostic cardiac images
US6447454B1 (en) 2000-12-07 2002-09-10 Koninklijke Philips Electronics N.V. Acquisition, analysis and display of ultrasonic diagnostic cardiac images
WO2007071050A1 (en) * 2005-12-20 2007-06-28 Resonant Medical Inc. Methods and systems for segmentation and surface matching

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
DE BRUIN P W ET AL: "Interactive 3D segmentation using connected orthogonal contours", COMPUTERS IN BIOLOGY AND MEDICINE, NEW YORK, NY, US, vol. 35, no. 4, 1 May 2005 (2005-05-01), pages 329 - 346, XP027650246, ISSN: 0010-4825, [retrieved on 20050501] *
HANSEGARD JOGER ET AL: "Semi-automated quantification of left ventricular volumes and ejection fraction by real-time three-dimensional echocardiography", CARDIOVASCULAR ULTRASOUND, BIOMED CENTRAL, LONDON, GB, vol. 7, no. 1, 20 April 2009 (2009-04-20), XP021050230, ISSN: 1476-7120, DOI: 10.1186/1476-7120-7-18 *
OLABARRIAGA S D ET AL: "Interaction in the segmentation of medical images: A survey", MEDICAL IMAGE ANALYSIS, OXFORD UNIVERSITY PRESS, OXFORD, GB, vol. 5, no. 2, 13 June 2001 (2001-06-13), pages 127 - 142, XP002463194, ISSN: 1361-8415, DOI: 10.1016/S1361-8415(00)00041-4 *

Similar Documents

Publication Publication Date Title
Rohling et al. Automatic registration of 3-D ultrasound images
US6764449B2 (en) Method and apparatus for enabling a biopsy needle to be observed
CN103402453B (en) Auto-initiation and the system and method for registration for navigation system
Zhang et al. Tissue characterization in intravascular ultrasound images
JP5474342B2 (en) Anatomical modeling with 3-D images and surface mapping
EP1778093B1 (en) Ultrasonic diagnosis of ischemic cardiodisease
JP4152746B2 (en) Ultrasound diagnostic cardiac image capture, analysis and display method
EP1538986B1 (en) 3d ultrasound-based instrument for non-invasive measurement of fluid-filled and non fluid-filled structures
US20070116357A1 (en) Method for point-of-interest attraction in digital images
EP1673013B1 (en) Ultrasonic cardiac volume quantification
Grau et al. Registration of multiview real-time 3-D echocardiographic sequences
US6537221B2 (en) Strain rate analysis in ultrasonic diagnostic images
US20080146932A1 (en) 3D ultrasound-based instrument for non-invasive measurement of Amniotic Fluid Volume
US6491636B2 (en) Automated border detection in ultrasonic diagnostic images
CN101357067B (en) Edge detection in ultrasound images
JP2010505558A (en) System and method for segmenting regions in medical images
US6447453B1 (en) Analysis of cardiac performance using ultrasonic diagnostic images
US7678052B2 (en) Method and apparatus for detecting anatomic structures
EP1799110B1 (en) Ultrasonic imaging system with body marker annotations
US7450746B2 (en) System and method for cardiac imaging
US7343031B2 (en) Method of tracking position and velocity of objects' borders in two or three dimensional digital images, particularly in echographic images
US8444543B2 (en) Apparatus and computing device for performing brachytherapy and methods of imaging using the same
Wein et al. Simulation and fully automatic multimodal registration of medical ultrasound
US7087022B2 (en) 3D ultrasound-based instrument for non-invasive measurement of amniotic fluid volume
US20030038802A1 (en) Automatic delineation of heart borders and surfaces from images

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13824660

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase in:

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 13824660

Country of ref document: EP

Kind code of ref document: A1