WO2014155223A1 - Segmentation of planar contours of target anatomy in 3D ultrasound images

Info

Publication number
WO2014155223A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
target anatomy
plane
planes
further
Prior art date
Application number
PCT/IB2014/059656
Other languages
French (fr)
Inventor
Robert Joseph SCHNEIDER
Michael Daniel Cardinale
Original Assignee
Koninklijke Philips N.V.
Priority date
2013-03-25
Filing date
2014-03-12
Publication date
2014-10-02
Priority to US61/804,793 (provisional application US201361804793P), filed 2013-03-25
Application filed by Koninklijke Philips N.V.
Publication of WO2014155223A1

Classifications

    • A61B8/0883: Detecting organic movements or changes, e.g. tumours, cysts, swellings, for diagnosis of the heart
    • A61B8/463: Displaying means of special interest characterised by displaying multiple images or images and diagnostic data on one display
    • A61B8/466: Displaying means of special interest adapted to display 3D data
    • A61B8/469: Special input means for selection of a region of interest
    • A61B8/5223: Processing of medical diagnostic data for extracting a diagnostic or physiological parameter
    • G01S7/52073: Production of cursor lines, markers or indicia by electronic means
    • G06T19/00: Manipulating 3D models or images for computer graphics
    • A61B8/14: Echo-tomography
    • A61B8/467: Interfacing with the operator or the patient characterised by special input means
    • A61B8/483: Diagnostic techniques involving the acquisition of a 3D volume of data
    • A61B8/523: Processing of medical diagnostic data for generating planar views from image data in a user selectable plane not corresponding to the acquisition plane
    • A61B8/543: Control of the diagnostic device involving acquisition triggered by a physiological signal
    • G01S15/8993: Three dimensional imaging systems
    • G01S7/52063: Sector scan display
    • G01S7/52074: Composite displays, e.g. split-screen displays; combination of multiple images or of images and alphanumeric tabular information
    • G06T2210/41: Medical (indexing scheme for image generation or computer graphics)
    • G06T2219/008: Cut plane or projection plane definition (indexing scheme for manipulating 3D models or images)

Abstract

An ultrasonic diagnostic imaging system and method enable a clinician to automatically delineate the plane of target anatomy in a 3D image dataset. An initial image is acquired at the approximate location of the target anatomy. Three reference images are positioned to intersect the plane of the initial image, each viewing a different anatomical landmark of the target anatomy. The ultrasound system then uses the three landmarks to produce an MPR image of a plane of the 3D dataset which contains the three anatomical landmarks. The target anatomy can then be outlined and measured to support a diagnosis of the patient.

Description

SEGMENTATION OF PLANAR CONTOURS OF

TARGET ANATOMY IN 3D ULTRASOUND IMAGES

This invention relates to medical diagnostic ultrasound systems and, in particular, to the segmentation of planar contours of target anatomy in volumetric ultrasound images.

One of the important uses for diagnostic ultrasound imaging is to identify and then measure anatomical regions of the body. Such measurements are needed, for instance, in planning for an implant such as a heart valve replacement. In order to measure specific anatomy, the anatomy must first be located by imaging, and then a planar region defined which contains the target anatomy. This survey of the anatomy often begins with acquisition of a three-dimensional (3D) image of the region of the target anatomy, as structures in the body can usually be found quickly with 3D imaging and the 3D image data will generally contain a planar image region of the target anatomy. It is generally desirable to measure anatomy in a two-dimensional (2D) image plane, since the accuracy and scale of a 2D image are readily apparent on a flat ultrasound system display screen. Measurements are then made of the anatomy when viewed in the 2D image.

Defining a desired planar anatomical region within a 3D image can be particularly difficult. Difficulties arise due to limitations in visualizing the anatomy of interest contained within the 3D image, for instance due to volume rendering capabilities, or due to the use of cut planes which may or may not be able to contain landmarks of interest. Difficulties also arise in specifying the position and orientation of the image plane that contains the anatomical region. This is because the position and orientation of the plane together have six degrees of freedom, while ultrasound systems are typically equipped with pointing devices that are limited to two-dimensional operations with respect to visualization and manipulation (i.e., 2D images on a flat screen monitor, mouse pointers that move in only two dimensions, etc.).

An example of an application where a planar region needs to be found within a 3D image is the problem of defining the virtual annulus of the aortic valve in a 3D ultrasound image. The virtual annulus is a planar contour which contains the three basal attachment points of the aortic valve. Measurement of this annulus has become more important recently as it is used to size replacement valve devices in transcatheter aortic valve replacement (TAVR) procedures. A typical procedure used to segment (delineate) the virtual annulus of the aortic valve is to acquire a number of cut planes through a volumetric region of the body. A limited number of cut planes are used to define and iteratively refine the planar location of the virtual annulus, after which the contour is segmented using image information contained solely within the plane. This method, however, is time consuming and complicated and has been found to require a great deal of expertise and experience, as a mental understanding of the three-dimensional location of the basal attachment points relative to the reference planes needs to be maintained throughout the procedure. Accordingly, it is desirable to have image manipulation and analysis tools which enable a clinician to quickly and easily identify an image plane containing a region of interest in a 3D volumetric region.

In accordance with the principles of the present invention, a diagnostic ultrasound imaging system and method are provided for quickly and easily delineating a planar region of interest in a volumetric region.

Volumetric ultrasonic image data is acquired which contains the region of interest. An image plane is identified at approximately the location of the desired image plane of the region of interest. A plurality of image planes which intersect the approximated image plane are positioned, each intersecting an identifiable landmark of the region of interest. The ultrasound system then produces an image of the plane containing the identified landmarks and containing the region of interest. Quantification of the anatomy in the planar contour can then be performed and measurements made for uses such as sizing an implant.

In the drawings:

FIGURE 1 illustrates in block diagram form an ultrasonic diagnostic imaging system constructed in accordance with the principles of the present invention.

FIGURE 2 illustrates an aortic valve graphically and in relation to image planes oriented in accordance with the present invention for identifying the image plane of the virtual annulus of the valve.

FIGURES 3a and 3b illustrate the problem of dealing with the many degrees of freedom when positioning a plane in a volumetric region to identify the image plane of specific target anatomy.

FIGURE 4 illustrates the initial image planes acquired in a diagnostic imaging workflow in accordance with the present invention.

FIGURE 5 illustrates the re-orienting of the images in a second step of the workflow depicted in FIGURE 4 to identify the image plane of a region of interest in an ultrasound image volume.

FIGURE 6 illustrates the refinement of the image orientation of FIGURE 5 and measurements made of target anatomy in an identified image plane.

FIGURES 7 and 8 illustrate images acquired and oriented in accordance with the present invention and used to delineate and measure the sinus of Valsalva of an aortic valve.

FIGURES 9 and 10 illustrate images acquired and oriented in accordance with the present invention and used to delineate and measure the sinotubular junction of an aortic valve.

Referring first to FIGURE 1, an ultrasound system constructed in accordance with the principles of the present invention is shown in block diagram form. The illustrated ultrasound system operates through two major subsystems, a front end acquisition subsystem 10A and a display subsystem 10B. An ultrasound probe is coupled to the acquisition subsystem to transmit ultrasound waves and receive ultrasound echo signals. The probe does this with a two-dimensional matrix array of transducer elements 70. The 2D matrix array is operated by a micro-beamformer 72 located in the probe with the array transducer. The micro-beamformer contains circuitry which controls the signals applied to groups of elements of the array transducer 70 for transmission and acquisition of ultrasonic echo signals, and does some processing of the echo signals received by the elements of each group. Micro-beamforming in the probe advantageously reduces the number of conductors in the cable between the probe and the ultrasound system, as described in US Pat. 5,997,479 (Savord et al.) and US Pat. 6,436,048 (Pesque), and provides electronic steering and focusing of beams on transmit and during beam reception for the production of highly resolved image data of volumetric regions of the body suitable for anatomical quantification and measurement.

The probe 70, 72 is coupled to the acquisition subsystem 10A of the ultrasound system. The acquisition subsystem includes a beamform controller 74 which provides control signals to the micro-beamformer 72, instructing the probe as to the timing, frequency, direction and focusing of transmit and receive beams. The beamform controller can also control the timing of acquisition so as to be synchronized to physiological activity of the body indicated by gating signals. The beamform controller also controls the beamforming of echo signals received by the acquisition subsystem through its control of analog-to-digital (A/D) converters 18 and a system beamformer 20. Partially beamformed echo signals received by the probe are amplified by preamplifier and TGC (time gain control) circuitry 16 in the acquisition subsystem, then digitized by the A/D converters 18. The digitized echo signals are formed into fully steered and focused beams by the main system beamformer 20. The echo signals are processed by an image data processor 22 which performs digital filtering, B mode and M mode detection, and Doppler processing, and can also perform other signal processing such as harmonic separation, speckle reduction, and other desired image signal processing.
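
The "fully steered and focused beams" referred to above are conventionally formed by delay-and-sum processing: each element's echo trace is read out at the delay corresponding to the round-trip path to a focal point, and the delayed samples are summed coherently. The sketch below is a generic, minimal illustration of that principle under assumed conditions (a linear array, transmit from the array centre, nearest-sample delays); it is not the patent's micro-beamformer 72 or system beamformer 20.

```python
import numpy as np

def delay_and_sum_sample(rf, element_x, fs, c, focus):
    """rf: (n_elements, n_samples) digitized per-element echo traces.
    element_x: lateral element positions [m]; fs: sampling rate [Hz];
    c: speed of sound [m/s]; focus: (x, z) focal point [m].
    Returns one beamformed sample focused at `focus`."""
    fx, fz = focus
    element_x = np.asarray(element_x, dtype=float)
    t_tx = np.hypot(fx, fz) / c                      # transmit path from the array centre
    t_rx = np.hypot(element_x - fx, fz) / c          # receive path back to each element
    idx = np.round((t_tx + t_rx) * fs).astype(int)   # nearest-sample delay per element
    idx = np.clip(idx, 0, rf.shape[1] - 1)
    return rf[np.arange(rf.shape[0]), idx].sum()     # coherent (delay-and-sum) summation
```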

The echo signals produced by the acquisition subsystem 10A are coupled to the display subsystem 10B, which processes the echo signals for display in the desired image format on the display screen 62. The echo signals are processed by an image line processor 24, which is capable of sampling the echo signals, splicing segments of beams into complete line signals, and averaging line signals for signal-to-noise improvement or flow persistence. The image lines for a 2D image are scan converted into the desired image format by a 2D image processor 26 which performs R-theta conversion as is known in the art. The 2D image processor can thus format rectilinear or sector image formats. The 2D images are stored in an image memory 28 with other 2D images, from which they can be displayed on the display 62. The images in memory are also overlaid with graphics to be displayed with the images, which are generated by a graphics generator 34. Individual images or image sequences can be stored in the image memory 28 for display of image loops or live sequences.
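
R-theta conversion maps beam samples acquired on a polar (range, angle) grid onto a Cartesian pixel raster for display as a sector image. The following is a minimal sketch of that mapping using nearest-neighbour lookup; the names, grid sizes and interpolation are illustrative assumptions rather than the actual algorithm of the 2D image processor 26.

```python
import numpy as np

def scan_convert_sector(polar, r_max, theta_max, nx=512, nz=512):
    """polar: (n_r, n_theta) detected line samples, with beam angles spanning
    [-theta_max, +theta_max] radians about the probe axis; r_max: max depth.
    Returns an (nz, nx) Cartesian raster of the sector image."""
    n_r, n_theta = polar.shape
    x = np.linspace(-r_max * np.sin(theta_max), r_max * np.sin(theta_max), nx)
    z = np.linspace(0.0, r_max, nz)
    X, Z = np.meshgrid(x, z)
    R = np.hypot(X, Z)                    # radius of each output pixel
    T = np.arctan2(X, Z)                  # angle from the probe (depth) axis
    ri = np.clip(np.round(R / r_max * (n_r - 1)).astype(int), 0, n_r - 1)
    ti = np.clip(np.round((T + theta_max) / (2 * theta_max) * (n_theta - 1)).astype(int),
                 0, n_theta - 1)
    image = polar[ri, ti]                 # nearest-neighbour lookup into the polar grid
    image[(R > r_max) | (np.abs(T) > theta_max)] = 0.0   # blank pixels outside the sector
    return image
```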

For real-time volumetric imaging the display subsystem 10B also includes a 3D (three-dimensional) image processor 30 which receives image lines from the image line processor 24 for the rendering of real-time three-dimensional images. The 3D images can be displayed as live (real time) 3D images on the display 62 or stored in the image memory 28 for later review and diagnosis. The 2D and 3D images are coupled to a border detector 32 which can manually or automatically delineate specific anatomy by tracing its outline or border, such as the endothelial wall of a chamber of the heart. The traced borders of anatomy in an image are outlined graphically by coupling the border detector 32 to the graphics generator 34. The border detector 32 is also responsive to user control signals from a control panel 40 to quantify and measure anatomical regions in the ultrasound images.

An ECG subsystem is provided for use when it is desirable to acquire images at particular phases of the heart cycle. ECG leads 50 can be adhesively attached to a patient and provide ECG signals for a QRS processor 52 which identifies the R-wave peak of each heartbeat. The timing of the R-wave is used to acquire images of a particular heart cycle. Images of the heart can be acquired at specific phases of the heart cycle by coupling the R-wave timing as a trigger signal from a trigger signal generator 54 to the beamform controller 74 and to the controls of the control panel 40 used to select the desired heart phases at which heart phase-gated images are to be acquired by the ultrasound system.
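
A rough sketch of such R-wave gating is shown below: R-wave peaks are detected in the ECG trace and acquisition triggers are placed at a chosen fraction of each R-R interval. The simple threshold detector and parameter values are assumptions for illustration only, not the patent's QRS processor 52 or trigger signal generator 54.

```python
import numpy as np

def r_wave_indices(ecg, fs, thresh=0.6, refractory_s=0.25):
    """Return sample indices of R-wave peaks using a simple threshold plus
    local-maximum detector with a refractory period."""
    level = thresh * np.max(np.abs(ecg))
    peaks, last = [], -int(refractory_s * fs)
    for i in range(1, len(ecg) - 1):
        if (ecg[i] > level and ecg[i] >= ecg[i - 1] and ecg[i] >= ecg[i + 1]
                and i - last > refractory_s * fs):
            peaks.append(i)
            last = i
    return peaks

def gated_trigger_times(ecg, fs, phase=0.4):
    """Place an acquisition trigger at a fixed fraction of each R-R interval.
    Returns trigger times in seconds."""
    r = r_wave_indices(ecg, fs)
    return [(r[k] + phase * (r[k + 1] - r[k])) / fs for k in range(len(r) - 1)]
```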

The concept of the present invention is illustrated in FIGURE 2, which is a schematic representation of the anatomical structures of an aortic valve. Illustrated in this representation are the sinotubular junction 42, the crown-like ring 44 of the valve leaflets, the ventriculo-arterial junction 46, and the virtual ring 48 of the basal attachments of the aortic valvular leaflets. Instead of trying to directly define the image plane of the virtual ring of the three basal attachment points BAP1, BAP2, and BAP3, the ultrasound system of the present invention enables the clinician to acquire a planar image of each basal attachment point individually by manipulating three image planes 64, 66, and 68 which each intersect one of the basal attachment points. Each image plane is shown as a live 2D image which enables the clinician to find one of the basal attachment points. In this example image plane 64 has been manipulated to acquire an image containing BAP1, image plane 66 has been manipulated to acquire an image containing BAP2, and image plane 68 has been located to acquire an image containing BAP3. The image planes are all seen to intersect the plane of the virtual ring 48 and in this example are substantially orthogonal to the plane of the virtual ring. Once the three basal attachment points are acquired in the three 2D images 64, 66, and 68, the clinician clicks on the attachment points BAP1, BAP2 and BAP3 in the three images to define the points for the ultrasound system, and the border detector of the system then creates a 2D image of the plane containing the three points by multiplanar reconstruction (MPR) from the image data of the 3D image. The clinician is then presented with a live image of the virtual ring and can measure characteristics of the ring for replacement by an appropriately sized new valve.
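
As a rough sketch of the geometry involved, the plane through the three marked points can be described by two in-plane axes and a normal, and the MPR image obtained by resampling the 3D dataset at a grid of positions in that plane. The code below is an illustrative example only, assuming voxel coordinates and trilinear interpolation via SciPy; the patent does not specify the border detector's MPR implementation at this level.

```python
import numpy as np
from scipy.ndimage import map_coordinates

def mpr_from_landmarks(volume, p1, p2, p3, size=256, spacing=0.5):
    """volume: 3D array; p1..p3: landmark positions in voxel coordinates
    (same axis order as the volume). Returns a size x size MPR slice of the
    plane through the three landmarks, sampled every `spacing` voxels."""
    p1, p2, p3 = (np.asarray(p, dtype=float) for p in (p1, p2, p3))
    u = (p2 - p1) / np.linalg.norm(p2 - p1)      # first in-plane axis
    n = np.cross(p2 - p1, p3 - p1)
    n /= np.linalg.norm(n)                       # plane normal (fails if landmarks are collinear)
    v = np.cross(n, u)                           # second in-plane axis, orthogonal to u
    center = (p1 + p2 + p3) / 3.0
    s = (np.arange(size) - size / 2) * spacing
    gu, gv = np.meshgrid(s, s)
    pts = center + gu[..., None] * u + gv[..., None] * v   # (size, size, 3) sample positions
    slice_img = map_coordinates(volume, pts.reshape(-1, 3).T, order=1, mode='nearest')
    return slice_img.reshape(size, size)
```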

The reason why it is difficult to simply position the image plane of the virtual ring directly can be understood with reference to FIGURES 3a and 3b. One source of difficulty is that it can be difficult to see the virtual ring within the 3D volume 76 due to rendering issues. As the planar image of the virtual ring is contained in a plane 74 that generally does not coincide with any of the major axes of the volume 76 but is rather arbitrarily positioned and oriented, visualizing the object of interest 48 using only a single cut plane 74 (i.e., an MPR image) can still be difficult, as adequate overlap of the object of interest and the cut plane is not assured. Furthermore, defining the desired plane location (which is a six degree-of-freedom problem as shown by the arrows in FIGURE 3a, as both the position and the orientation need to be defined) by manipulating the plane 74 leads to an iterative and time intensive process, as an MPR cut plane shows only 2D information and allows for manipulation of the desired plane in only three dimensions: the x- and y-position of the desired plane within the cut plane, and rotation of the desired plane 74 about the cut plane's normal axis. The present invention overcomes these difficulties by allowing for the specification of three landmarks P1, P2, and P3 which are known to be contained in the desired plane, as shown in FIGURE 3b. The three landmarks define the plane of the target anatomy and subsequently constrain the segmentation of the anatomical object. This approach of defining landmarks which are then used to define the desired plane 74 separates the plane definition problem into three easier sub-problems of specifying point locations (which each have only three degrees of freedom, as shown by the arrows in FIGURE 3b) that are easier to locate in exploratory cut planes through the image. The corresponding workflow is much more intuitive and requires less expertise and experience on the part of the user as compared to other workflows which require the user to place the plane at the exact desired position and orientation through iterative refinement within cut plane images.

As a comparative example, assume we can restrict the search space for the object plane such that there are Nx, Ny, and Nz possible coordinate positions for the plane center along each primary axis of the 3D image, and Rx, Ry, and Rz possible rotational positions of the plane about the primary axes. If we assume that Nx = Ny = Nz = Rx = Ry = Rz = 100, then the size of the search space to define the plane using the traditional method is 100⁶, or 10¹². By comparison, the size of the search space to define the plane using the principles of the present invention is only 3·Nx·Ny·Nz = 3(100³) = 3(10⁶), which is nearly six orders of magnitude smaller than the traditional method.
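
The arithmetic can be restated directly (plain Python, simply reproducing the numbers above):

```python
# Worked version of the search-space comparison, with Nx = Ny = Nz = Rx = Ry = Rz = 100.
N = R = 100
plane_search = (N ** 3) * (R ** 3)    # positions x orientations = 100**6 = 10**12 candidates
landmark_search = 3 * N ** 3          # three independent points  = 3 * 100**3 = 3 * 10**6 candidates
print(plane_search, landmark_search, plane_search // landmark_search)
# 1000000000000  3000000  333333  -> roughly 3 x 10**5 times fewer candidates
```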

The following illustrations show actual ultrasound images used in implementations of the present invention. In these illustrations the conventional white-on-black ultrasound image presentation has been reversed to show anatomical structure in black against a white background for ease of illustration. The first example shows the segmentation of an image plane of an aortic valve virtual annulus. Since the aortic valve is a tri-leaflet valve, it lends itself nicely to the three-landmark segmentation strategy of the present invention. However, as the succeeding examples show, the invention is not limited to segmenting only the aortic valve virtual annulus as seen in 3D ultrasound, but can be used on a variety of structures and medical images.

In a first step the long axis of the aortic valve is located using a three-orthogonal-plane system. The long axis is generally aligned with the length of the blood vessel, and the short axis refers to image planes which cut across the vessel. The short axis view is shown in an image outlined in red (R) in this example. Other images are outlined in green (G) and blue (B), a conventional way in which multiple image planes are shown simultaneously on an ultrasound display. The short-axis plane (R) is positioned along the axis of the valve such that the three cusps of the aortic valve are visible, as shown in the R image of FIGURE 4. The G and B image planes are at this point oriented orthogonal to the R image plane and to each other, as indicated by the R, B, and G lines in the drawing which mark the relative positions of the three image planes. The positioning of the three image planes is shown in a perspective view 80 in the lower right quadrant of the screen illustration of FIGURE 4. After the short axis view R has been positioned to show the three valve cusps, the user hits a button on the user interface 40 to proceed to the next step.

Next, the user positions three long axis reference planes G, B, and Y, corresponding to the non-, left, and right coronary cusps, at the centers of the respective cusps as seen in the short axis image R in FIGURE 5 and indicated by the arrows. The three reference planes are marked as NCC, LCC, and RCC in the short axis view and in the individual planes. In the example shown in FIGURE 2 the reference planes were positioned tangentially to the annulus of the valve, thereby intersecting points BAP1, BAP2, and BAP3. In the "edge-on" view of the reference planes of FIGURE 5 it is seen that the user has rotated the reference planes NCC, LCC and RCC, as indicated by the arrows, to be approximately normal to the annulus of the valve and to all intersect in the middle of the valve. When the three reference planes are positioned as indicated in the short axis view, a basal attachment point (BAP) is visible in each image. The user marks the basal attachment point (BAP) in each of the three corresponding image planes using a pointing device on the user interface. Once finished, the user hits a button on the user interface 40 to proceed to the next step.

The image plane of the annulus contour is initialized by the border detector 32 using the specified basal attachment points, either by use of only the geometry of the three points in the volume data or through the use of a more sophisticated segmentation algorithm that uses the point locations and the surrounding image data. The short axis image plane R is thus repositioned to the plane of the valve annulus as shown in FIGURE 6. The user can edit the position of the annulus contour plane either by editing the plane coordinates in the short-axis image plane R, or by editing the marked basal attachment point in any of the three long-axis image planes G, B, or Y. The adjustments of the contour intersection points BAP are constrained to the plane defined by the basal attachment sites specified in the previous step, as indicated by the arrows in the long axis images. The long-axis planes can be rotated around the virtual annulus plane normal to allow for long-axis visualization and editing of the annulus at all positions around the valve. The border detector 32 can then be used to provide measurements of the annulus, such as by manually or automatically tracing the border 86 of the annulus as shown in the annulus plane image R in FIGURE 6, or by delineating and measuring the minor diameter 82 and major diameter 84 of the valve annulus.
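
Constraining an edited landmark to the previously defined plane amounts to removing the component of the displacement along the plane normal. A minimal sketch, assuming the plane is held in point-and-normal form (a hypothetical helper for illustration, not the system's editing code):

```python
import numpy as np

def constrain_to_plane(point, plane_point, plane_normal):
    """Project an edited landmark back onto the annulus plane
    (point-and-normal form), removing its out-of-plane component."""
    p = np.asarray(point, dtype=float)
    q = np.asarray(plane_point, dtype=float)
    n = np.asarray(plane_normal, dtype=float)
    n = n / np.linalg.norm(n)
    return p - np.dot(p - q, n) * n    # displacement along n is removed
```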

FIGURES 7 and 8 show an example of segmentation of the image plane of the sinus of Valsalva of the aortic root, the major diameter landmark of an aortic valve. Selection of this image plane begins by locating the short axis plane further along the long axis of the valve, as shown by the short axis R view in FIGURE 7. The long axis G, B, and Y views are positioned in line with the three coronary cusps at individual landmarks P1, P2 and P3 of the sinus of Valsalva around the valve. The border detector 32 then uses the three landmark points P1, P2 and P3 of the three reference images to create an MPR image of the plane defined by the three points, and displays this MPR image as the short axis R image as shown in FIGURE 8. As before, each landmark point can be repositioned, but is constrained to adjustment in the previously positioned long axis reference planes. The border detector 32 can then be used to delineate and measure the outline and diameters of the identified sinus of Valsalva in the short axis view R.
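
One straightforward way to derive major and minor diameters from a traced planar contour is to measure the contour's extent along its principal axes. The sketch below illustrates that approach under assumed inputs (in-plane contour coordinates); it is not the border detector's actual measurement algorithm.

```python
import numpy as np

def contour_diameters(points_2d):
    """points_2d: (n, 2) in-plane contour coordinates (e.g. in mm).
    Returns (major_diameter, minor_diameter) as principal-axis extents."""
    pts = np.asarray(points_2d, dtype=float)
    centered = pts - pts.mean(axis=0)
    _, vecs = np.linalg.eigh(np.cov(centered.T))   # principal axes of the contour
    proj = centered @ vecs                         # coordinates along those axes
    extents = proj.max(axis=0) - proj.min(axis=0)
    return extents.max(), extents.min()            # major, minor diameter
```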

The third example illustrates segmentation of the image plane of the sinotubular junction of the aortic valve. As before, the short axis view R is initially positioned along the valve at the approximate location of the sinotubular junction. The reference long axis views G, B, and Y are then rotated and positioned to intersect three landmarks P1, P2, and P3 of the junction around the valve, as shown in FIGURE 9. The border detector 32 then uses the coordinates of the three identified landmarks in the reference views to reposition the short axis view to the plane of the sinotubular junction, as shown by the short axis R view in FIGURE 10. As before, fine adjustment can be made to the orientation of the sinotubular junction plane by moving the landmark points P1, P2 and P3 in their reference planes. When the sinotubular junction plane is positioned as desired, the border detector can be used to identify characteristics and make measurements of the sinotubular junction.

Claims

WHAT IS CLAIMED IS:
1. A method for using an ultrasonic diagnostic imaging system to produce an image plane of target anatomy from a volumetric image dataset comprising: acquiring a volumetric image dataset of a region of a subject;
positioning a first image plane in intersection with the volumetric image dataset at an approximate location of target anatomy;
positioning a plurality of reference image planes, each of which intersect the first image plane, so that the reference image planes each display a planar image of a different landmark of the target anatomy;
identifying the different landmarks in the reference image planes; and
automatically displaying a planar image of the target anatomy in a plane defined by the identified different landmarks.
2. The method of Claim 1, further comprising adjusting the identified position of a landmark in one or more of the reference image planes.
3. The method of Claim 1, further comprising measuring a characteristic of the target anatomy in the image of the target anatomy.
4. The method of Claim 3, wherein measuring further comprises measuring a diameter of the target anatomy.
5. The method of Claim 3, further comprising segmenting the target anatomy in the image of the target anatomy.
6. The method of Claim 5, wherein segmenting further comprises outlining the border of the target anatomy.
7. The method of Claim 1, further comprising simultaneously producing live images of the first image plane and the reference image planes.
8. The method of Claim 7, further comprising indicating the relative positions of other planes in relation to each of the image planes.
9. The method of Claim 1, further comprising distinguishing each image plane and its relative position to other image planes by a distinctive color.
10. An ultrasonic diagnostic imaging system which produces an image plane of target anatomy from a volumetric image dataset comprising:
a source of volumetric image data;
a two dimensional image processor which produces a two dimensional image of a region in which target anatomy is located and a plurality of differently positioned reference images;
a display which displays the two dimensional image of a region of target anatomy and the plurality of reference images simultaneously;
a user control operable to position the plane of the two dimensional image at approximately the location of the target anatomy, wherein the user control is further operable to position the planes of the reference images to intersect the plane of the two dimensional image so that each images a different landmark of the target anatomy,
wherein the user control is further operable to identify the different landmarks in the reference images; and
a border detector responsive to the identified different landmarks which produces a planar image of the target anatomy in a plane defined by the identified different landmarks.
11. The ultrasonic diagnostic imaging system of Claim 10, wherein the target anatomy further comprises an aortic valve.
12. The ultrasonic diagnostic imaging system of Claim 10, wherein the two dimensional image processor is further adapted to produce three differently positioned reference images.
13. The ultrasonic diagnostic imaging system of Claim 12, wherein the two dimensional image processor produces live two dimensional images of the target anatomy and the reference images simultaneously.
14. The ultrasonic diagnostic imaging system of Claim 10, wherein the border detector is further adapted to delineate the border of the target anatomy in the planar image.
15. The ultrasonic diagnostic imaging system of Claim 10, wherein the border detector is further adapted to measure a characteristic of the target anatomy in the planar image.
PCT/IB2014/059656 2013-03-25 2014-03-12 Segmentation of planar contours of target anatomy in 3D ultrasound images WO2014155223A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US201361804793P 2013-03-25 2013-03-25
US61/804,793 2013-03-25

Publications (1)

Publication Number Publication Date
WO2014155223A1 (en) 2014-10-02

Family

ID=50391240

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2014/059656 WO2014155223A1 (en) 2013-03-25 2014-03-12 Segmentation of planar contours of target anatomy in 3D ultrasound images

Country Status (1)

Country Link
WO (1) WO2014155223A1 (en)

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5997479A (en) 1998-05-28 1999-12-07 Hewlett-Packard Company Phased array acoustic systems with intra-group processors
US6436048B1 (en) 2000-08-24 2002-08-20 Koninklijke Philips Electronics N.V. Ultrasonic diagnostic imaging system with scanhead elevation beamforming
JP2003325514A (en) * 2002-05-16 2003-11-18 Aloka Co Ltd Ultrasonic diagnostic apparatus
US20040210138A1 (en) * 2003-04-21 2004-10-21 Aloka Co., Ltd. Ultrasonic diagnostic apparatus
US20090136109A1 (en) * 2006-03-20 2009-05-28 Koninklijke Philips Electronics, N.V. Ultrasonic diagnosis by quantification of myocardial performance
US20110066031A1 (en) * 2009-09-16 2011-03-17 Kwang Hee Lee Ultrasound system and method of performing measurement on three-dimensional ultrasound image
US20110172534A1 (en) * 2010-01-12 2011-07-14 Medison Co., Ltd. Providing at least one slice image based on at least three points in an ultrasound system
US20110172536A1 (en) * 2008-09-24 2011-07-14 Koninklijke Philips Electronics N.V. Generation of standard protocols for review of 3d ultrasound image data
US20110201935A1 (en) * 2008-10-22 2011-08-18 Koninklijke Philips Electronics N.V. 3-d ultrasound imaging
US20110282205A1 (en) * 2010-05-13 2011-11-17 Samsung Medison Co., Ltd. Providing at least one slice image with additional information in an ultrasound system

Legal Events

Date Code Title Description
121 Ep: the EPO has been informed by WIPO that EP was designated in this application (Ref document number: 14713922; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase in: Ref country code: DE
122 Ep: PCT application non-entry in European phase (Ref document number: 14713922; Country of ref document: EP; Kind code of ref document: A1)