WO2015068073A1 - Multi-plane target tracking with an ultrasonic diagnostic imaging system - Google Patents

Info

Publication number: WO2015068073A1
Authority: WO (WIPO, PCT)
Prior art keywords: target, image, images, planes, plane
Application number: PCT/IB2014/065528
Other languages: French (fr)
Inventors: Robert Joseph SCHNEIDER, Michael Daniel Cardinale
Original Assignee: Koninklijke Philips N.V.
Priority date: 2013-11-11
Filing date: 2014-10-22
Publication date: 2015-05-14
Application filed by Koninklijke Philips N.V.
Publication of WO2015068073A1

Classifications

    • A61B 8/145: Echo-tomography characterised by scanning multiple planes
    • A61B 8/463: Displaying means characterised by displaying multiple images or images and diagnostic data on one display
    • A61B 8/466: Displaying means adapted to display 3D data
    • A61B 8/467: Interfacing with the operator or the patient characterised by special input means
    • A61B 8/483: Diagnostic techniques involving the acquisition of a 3D volume of data
    • A61B 8/5276: Detection or reduction of artifacts due to motion
    • A61B 8/4483: Constructional features characterised by features of the ultrasound transducer
    • A61B 2034/2065: Surgical navigation; tracking using image or pattern recognition
    • G01S 15/8925: Pulse-echo short-range imaging using a two-dimensional transducer array (matrix or orthogonal linear arrays)
    • G01S 15/8993: Three dimensional imaging systems
    • G01S 7/52073: Production of cursor lines, markers or indicia by electronic means
    • G01S 7/52074: Composite displays, e.g. split-screen displays; combination of multiple images or of images and alphanumeric tabular information

Abstract

A two dimensional array transducer probe (70, 72) acquires real time images of a plurality of different image planes (50, 52) of a volumetric region, at least one of which contains an image of a target object or anatomy. A user control or image processing is used to identify a target in one of the images, and the other image planes are steered to image the same target. The target is tracked from frame to frame in the images by motion detection using one of block matching, optical flow, rigid registration, or non-rigid registration of successively acquired images. The motion detection is used to continually image the target in different image planes in real time.

Description

MULTI-PLANE TARGET TRACKING WITH AN
ULTRASONIC DIAGNOSTIC IMAGING SYSTEM
This invention relates to medical ultrasound systems and, in particular, to ultrasonic imaging systems which automatically track anatomy or an
object of interest in the body by multi-plane imaging.
Real-time (live) multi-plane ultrasound imaging, such as the xPlane imaging mode on the iE33 and Epiq ultrasound systems available from Philips Healthcare of Andover, MA, allows a clinician to visualize objects and anatomy from multiple vantage points, which is often necessary for diagnostic or interventional purposes.
Unlike three dimensional (3D) ultrasound, the multi-plane mode does not require the training necessary to know how to interpret live 3D images and navigate and manipulate cut planes and cropping boxes to accurately visualize a region of interest. Multi-plane imaging typically also has the advantage over 3D ultrasound of higher frame rates, higher spatial resolution, and better image quality. The multiple views allow the clinician to navigate around a region of interest using information contained in all
available planes. Navigation is performed either by manually moving the probe or by electronically
controlling the position of the planes using a
graphical interface that shows the relative position and orientation of the planes on a display screen.
In the commercially available implementation of the multi-plane mode the orientation of one of the planes is fixed relative to the ultrasound probe, extending normal to the center of the transducer. This plane is used as a reference plane. The probe is
manipulated until the anatomy or object of interest is visualized in this plane. The second plane, when utilizing only two planes (biplane imaging), is initially oriented orthogonal to the reference plane; however, the relative orientation of the second plane (and any additional planes) can be manipulated by the user. Once the anatomy or object of interest is visualized in the reference plane, a user control on the control panel is manipulated to steer the second plane to also capture the anatomy or object of interest. Real time images of both image planes are then displayed side-by-side on the display screen.
The problem with xPlane imaging, or any 2D imaging mode for that matter, is that objects or anatomy of interest have the potential to move out of the viewing plane due to either probe movement or movement of anatomy. Probe movement may be caused by the clinician manually moving the probe
(intentionally or unintentionally), or may be due to the probe moving relative to anatomy because of factors such as patient respiration. Anatomic movement is typical in structures which exhibit appreciable translation or rotation during normal or abnormal functioning, such as the movement seen in a beating heart.
These movements, either of the probe or of the anatomy, can be problematic for those cases where continuous observation of a region of interest is desired. For instance, in an interventional case where a device is being deployed to a particular location within the body (for instance, in a
transcatheter aortic valve replacement (TAVR) procedure), it is desired to continuously observe the anatomic target location at which the device should be deployed. This is true for cases such as TAVR, for needle biopsies in the liver or breast, or any other intervention where a device or instrument needs to be guided to a particular location within the body.
Accordingly it is desirable to be able to
continuously and reliably visualize the anatomy or object of interest in the multiple images, despite effects such as probe or anatomical movement.
In accordance with the principles of the present invention, an ultrasonic imaging system with a real-time multi-plane ultrasound imaging mode is provided that is designed to track a point (i.e., an object or anatomy) of interest, thereby providing continuous visualization of the target in the multiple planes. The target, defined by the user or by an automated algorithm, is tracked by real-time image registration and/or an optical flow algorithm which uses
information from the multiple planes to determine the movement of the target. The motion of the target and its estimated real-time position are used to
determine where to electronically steer the multi-plane system such that the target is always contained within the multi-plane system. This is performed in real-time using a fast tracking method, such as localized block matching, optical flow, rigid
registration, non-rigid registration, or other motion tracking techniques. Because the multi-plane system is electronically steered, the user does not have to manually correct for the position of the probe to continuously observe the target within the multiple cut plane images. Rather, this is done automatically, where the only action the user has to perform is to keep the probe in contact with the patient so that adequate images are acquired which can be used to track the target. In utilizing an xPlane system, or two or more planes which are not parallel to each other, the component of the motion of the target that does not reside within a plane can generally be determined from the motion of the target as found in the other planes.
In the drawings:
FIGURE 1 illustrates in block diagram form an ultrasonic imaging system constructed in accordance with the principles of the present invention.
FIGURE 2 illustrates the tracking of an object of interest in a biplane imaging mode in accordance with the present invention.
FIGURE 3 illustrates a heart valve being tracked and visualized in a tri-plane imaging mode.
FIGURE 4 illustrates a biplane image with an icon that indicates relative image plane orientation.
FIGURES 5a and 5b illustrate how the orientation icon of FIGURE 4 changes as the image plane
orientation is changed.
FIGURE 5c illustrates another implementation of an image plane orientation icon for multi-images.
FIGURE 6 illustrates two biplane images and their orientation in relation to an array transducer.
FIGURE 7 illustrates a thick slice image which may be used in an implementation of the present invention.
FIGURES 8 and 9 illustrate the tracking of two points of an object of interest in accordance with the present invention.
FIGURE 10 illustrates a catheter being used in a cardiac procedure.
FIGURES 11a and 11b illustrate the tracking of the tip of the catheter of FIGURE 10 in biplane images in accordance with the present invention.

Referring first to FIGURE 1, an ultrasonic imaging system of the present invention is shown in block diagram form. The ultrasound system is
configured with two subsystems, a front end acquisition subsystem 10A and a display subsystem 10B. An ultrasound probe is coupled to the acquisition subsystem which includes a two-dimensional matrix array transducer 70 and a micro-beamformer 72. The micro-beamformer contains circuitry which controls the signals applied to groups of elements ("patches") of the array transducer 70 for transmission, applying properly timed transmit waveforms to the elements to steer transmitted beams in the desired direction and to the desired focal depth, and does some processing of the echo signals received by elements of each group in response to transmit waves. Micro-beamforming in the probe advantageously reduces the number of conductors in the cable between the probe and the ultrasound system and is described in US Pat. 5,997,479 (Savord et al.) and in US Pat. 6,436,048 (Pesque).
The probe is coupled to the acquisition
subsystem 10A of the ultrasound system. The
acquisition subsystem includes a beamform controller 74 which is responsive to a user control 36 and provides control signals to the microbeamformer 72, instructing the probe as to the timing, frequency, direction and focusing of transmit beams. The beamform controller also controls the beamforming of echo signals received by the acquisition subsystem by its control of analog-to-digital (A/D) converters 18 and a system beamformer 20. Echo signals received by the probe are amplified by preamplifier and TGC (time gain control) circuitry 16 in the acquisition
subsystem, then digitized by the A/D converters 18. The digitized echo signals are then formed into fully steered and focused beams by the system beamformer 20. The echo signals are then processed by a signal processor 22 which performs digital filtering, B mode and M mode detection, and Doppler processing, and can also perform other signal processing such as harmonic separation, speckle reduction, and other desired image signal processing.
The echo signals produced by the acquisition subsystem 10A are coupled to the display subsystem
10B, which processes the echo signals for display in the desired image format. The echo signals are processed by an image line processor 24, which is capable of sampling the echo signals, splicing
segments of beams into complete line signals, and averaging line signals for signal-to-noise
improvement or flow persistence. The image lines for a 2D image are scan converted into the desired image format by a scan converter 26 which performs R-theta conversion as is known in the art. The image is then stored in an image buffer or memory 28 from which it can be displayed on a display 38. The image in
memory 28 is also overlaid with graphics to be
displayed with the image, which are generated by a graphics generator (not shown) which is responsive to the user control 36. Individual images or image sequences can be stored in a cine memory (not shown) during capture of image loops or sequences.
For real-time volumetric imaging the display subsystem 10B also includes a 3D image rendering processor 32 which receives image lines from the image line processor 24 for the rendering of realtime three dimensional images. The 3D images can be displayed as live (real time) 3D images on the
display 38 or coupled to the image memory 28 for storage of the 3D data sets for later review and diagnosis.
In accordance with the principles of the present invention, an x,y,z motion detector 30 estimates the motion of selected anatomy or targets in multiple images when the ultrasound system is operated in a multi-plane mode, such as the biplane or higher order live 2D imaging mode. In a live imaging multi-plane mode, a 3D imaging probe, preferably one with an electronically steered two dimensional array
transducer, is operated to scan only selected B mode image planes of a volumetric region of the body. US Pat. 6,709,394 (Frisa et al.) describes a biplane imaging system, for example. In a conventional biplane imaging mode, one plane is used as a
reference plane. The reference plane is at a fixed relation to the imaging probe, extending normal to the center of the transducer array. As the imaging probe is manipulated to find the anatomy of interest, the reference plane is used to look for the anatomy or target. Once the target or anatomy is found in the reference plane, the second or other planes imaged by the probe are tilted or rotated by manipulation of a user control to view the anatomy or target in the second or additional planes. Use of an ultrasound system of the present invention begins in the same way, with the user manipulating the probe 70,72 to acquire an image of the desired anatomy or target in one of the viewing planes. Once the desired anatomy or target is acquired in one of the images, the user marks the target or anatomy in the image by use of a control on the control panel 36 or a softkey on the display screen 38. Alternatively, automated
techniques such as border detection can be used to mark the target on the display. With the target or desired anatomy thus indicated to the ultrasound system, the x,y,z motion detector 30 tracks the motion of the target or anatomy, estimates its real time position, and uses this information to control the beamform controller to continually electronically steer the imaging planes of the probe to the target or anatomy. Because the multi-plane system is electronically steered, the user does not have to manually correct for the position of the probe to continuously observe the target within the multiple cut plane images. Rather, this is done automatically, where the only action the user has to perform is to keep the probe in contact with the patient such that adequate images are acquired which can be used to track the target.
An example of this operation is illustrated in FIGURE 2. In this example, the multi-plane system is comprised of only two planes, although more planes could be used as necessary or desired as shown in FIGURE 3. The relative position of each cut plane 50,
52 with respect to the other is shown by the
corresponding line 51, 53 within each cut plane. In a constructed embodiment the outline of each image plane and its cut line is shown in a distinctive color. For instance, the position of the image plane
52 in step A with respect to the imaging plane 50 is shown by a red line 51 within the image plane 50, which itself is outlined in green. The intersection of plane 50 with image plane 52 is shown by the green cut line 53 in image plane 52, and image 52 is
outlined in red in correspondence with the color of its cut line 51. In the first step A, the user or an automated algorithm identifies a target (dot 54) within one of the planes in the real-time multi-plane system. Notice that at this step, the user is not necessarily capable of viewing the target 54 in the other image planes. In step B, which immediately follows the target point selection (step A), the image planes which do not already contain the target point 54 are electronically steered such that the target 54 is contained in these other cut planes, plane 52 in this example. Step A and step B can be considered as initialization steps. Once
initialization is complete, step C and step D are carried out in a real-time and iterative fashion. In step C, the movement of the target 54 within each cut plane is detected and computed by a real-time
tracking method executed by the x,y,z motion detector 30 (block matching, optical flow, rigid registration, non-rigid registration, etc.). In step D, the newly detected target position is used to determine where to electronically steer the multiple planes in the multi-plane system such that the target 54 is
continuously visualized in all planes. Note that if a large displacement of the target is exhibited between any two consecutive frames in the imaging sequence, step C and step D may need to be performed several times (i.e., iteratively) in the time between the acquisition of these two frames to accurately reflect and capture the movement of the target.
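To make the step C/step D loop concrete, the following minimal Python sketch simulates one inter-frame interval. It is not the patent's implementation: the `steer_planes` and `detect_motion` stand-ins are hypothetical reductions of the scanner's beam-steering and motion-detection interfaces, and each "plane" is reduced to the 3D point at which it is aimed.

```python
import numpy as np

def steer_planes(planes, target):
    # Step D: re-aim every plane at the newly estimated target position.
    return [np.asarray(target, dtype=float) for _ in planes]

def track_and_steer(target, planes, detect_motion, n_substeps=2):
    """One inter-frame interval of steps C and D; iterated more than once
    per interval when the displacement between frames is large."""
    for _ in range(n_substeps):
        target = target + detect_motion(target)   # step C: frame-to-frame displacement
        planes = steer_planes(planes, target)     # step D: follow the target
    return target, planes

# Usage: a target drifting 1 mm along x per substep stays at the planes' aim point.
target, planes = track_and_steer(np.zeros(3), [None, None],
                                 detect_motion=lambda t: np.array([1.0, 0.0, 0.0]))
print(target, planes[0])   # [2. 0. 0.] [2. 0. 0.]
```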
Resolving the motion of the target is only possible, however, because of the relative orientation of the different cut planes. In having image planes that are not parallel to one another, the component of the displacement of the target that cannot be detected in one plane, because the motion is orthogonal to the plane, can be detected in the remaining planes, thereby allowing for real-time and continuous
tracking of the target.
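A rough numerical illustration of why non-parallel planes suffice (a sketch under the assumption that each plane reports the 2D projection of the true displacement onto its own in-plane axes; the patent does not prescribe this formulation): stacking the per-plane observations gives an overdetermined linear system whose least-squares solution is the 3D displacement.

```python
import numpy as np

def resolve_3d_motion(plane_axes, inplane_disp):
    """Least-squares 3D displacement from per-plane 2D displacements.

    plane_axes   : list of (u, v) pairs, orthonormal 3-vectors spanning each plane
    inplane_disp : list of (du, dv) displacements measured in each plane
    """
    A, b = [], []
    for (u, v), (du, dv) in zip(plane_axes, inplane_disp):
        A.extend([u, v])    # projection of the true motion onto u and v ...
        b.extend([du, dv])  # ... equals the displacement observed in that plane
    d, *_ = np.linalg.lstsq(np.asarray(A, float), np.asarray(b, float), rcond=None)
    return d

# Two orthogonal planes: the xz-plane (axes x, z) and the yz-plane (axes y, z).
axes = [((1, 0, 0), (0, 0, 1)), ((0, 1, 0), (0, 0, 1))]
meas = [(0.8, 0.5), (-0.3, 0.5)]       # what each plane would observe
print(resolve_3d_motion(axes, meas))   # recovers [ 0.8 -0.3  0.5]
```

Note that a motion component orthogonal to one plane simply contributes no row from that plane; as long as the planes are not parallel, the stacked system still constrains all three components.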
As mentioned above, the x,y,z motion detector can operate by any of a number of image processing techniques that estimate frame-to-frame motion.
Block matching such as the MSAD (minimum sum of absolute difference) block matching technique
described in US Pat. 6,442,289 (Olsson et al.) may be used. Optical flow analysis as described in the paper "Determining Optical Flow," by B. Horn & B. Schunck, Artificial Intelligence 17 at pgs. 185-203 (1981) may be employed. Rigid registration as described in co-pending application SN (2013PF02066) entitled "AUTOMATED SEGMENTATION OF TRI-PLANE IMAGES FOR REAL TIME ULTRASONIC IMAGING" (Schneider et al.) may be used for motion estimation, as well as non-rigid registration. The so-estimated motion
transform or vectors of the identified target or anatomy of interest from one frame to another are applied to the beamform controller in the ultrasound system of FIGURE 1 to steer the direction of acquisition of subsequent multi-planes so that the target or anatomy is imaged continuously during the sequence of image acquisition. For more smoothly appearing image sequences the tracked location can be computed such that the planes smoothly shift from one plane orientation to the next, thereby presenting more appealing visualization of the plane shifting.
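Of the estimators listed above, block matching is the simplest to sketch. The following minimal MSAD search is a generic illustration of the technique, not the specific variant of US Pat. 6,442,289:

```python
import numpy as np

def msad_track(prev, curr, center, block=8, search=4):
    """Minimum-sum-of-absolute-differences block matching.

    prev, curr : 2D image frames
    center     : (row, col) of the target in prev
    Returns the (drow, dcol) displacement minimizing the SAD.
    """
    r, c = center
    ref = prev[r - block:r + block, c - block:c + block].astype(float)
    best, best_d = np.inf, (0, 0)
    for dr in range(-search, search + 1):           # exhaustive local search
        for dc in range(-search, search + 1):
            cand = curr[r + dr - block:r + dr + block,
                        c + dc - block:c + dc + block]
            sad = np.abs(ref - cand).sum()          # sum of absolute differences
            if sad < best:
                best, best_d = sad, (dr, dc)
    return best_d

# Usage: a bright spot shifted by (2, -1) pixels between frames.
prev = np.zeros((64, 64)); prev[30:34, 30:34] = 1.0
curr = np.zeros((64, 64)); curr[32:36, 29:33] = 1.0
print(msad_track(prev, curr, (32, 32)))   # (2, -1)
```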
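The smooth plane shifting just described could be realized with a simple low-pass step on the steering parameters. The patent does not specify a filter, so the exponential smoothing below is only an assumed example:

```python
import numpy as np

def smooth_steer(current_angles, target_angles, alpha=0.3):
    """Move the plane steering angles only a fraction of the way toward the
    newly tracked orientation each frame, so plane shifts appear smooth."""
    current = np.asarray(current_angles, float)
    target = np.asarray(target_angles, float)
    return current + alpha * (target - current)

# Usage: a rotation/tilt pair easing toward a new orientation over frames.
angles = np.array([0.0, 0.0])
for _ in range(5):
    angles = smooth_steer(angles, [30.0, 10.0])
print(np.round(angles, 1))   # approaches [30. 10.] without jumping
```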
The multiple planes in the multi-plane system are preferably just a few steered B mode planes with scanlines extending from the array transducer for high frame rates of display, but could also be cut planes (MPR planes) taken from a three-dimensional ultrasound volume. The tracking in such an
implementation is done using the volumetric
information, or just information from the extracted planes. Representative MPR cut planes are then shown from the successive 3D ultrasound volumes which contain the tracked target or anatomy. In this way, the user would not have to re-position cut planes within a 3D ultrasound volume or re-position cropping borders. This implementation, however, has lower frame rates of display since the entire 3D volume
(rather than just selected image planes) has to be re-scanned for each frame update.
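For this volume-based variant, an MPR cut plane is resampled from the 3D data set along the tracked orientation. Below is a simplified nearest-neighbor sketch; a real implementation would interpolate, and the coordinate and sampling conventions here are assumptions rather than anything the patent specifies:

```python
import numpy as np

def extract_mpr(volume, origin, u, v, size=(128, 128), spacing=1.0):
    """Resample an MPR cut plane from a 3D volume by nearest-neighbor lookup.
    origin is the plane center in voxel coordinates; u and v are orthonormal
    in-plane axes."""
    h, w = size
    rows = (np.arange(h) - h / 2) * spacing
    cols = (np.arange(w) - w / 2) * spacing
    # 3D voxel coordinate of every pixel of the cut plane
    pts = origin + rows[:, None, None] * u + cols[None, :, None] * v
    idx = np.clip(np.rint(pts).astype(int), 0, np.array(volume.shape) - 1)
    return volume[idx[..., 0], idx[..., 1], idx[..., 2]]

# Usage: an axial cut through the middle of a random test volume.
vol = np.random.rand(64, 64, 64)
img = extract_mpr(vol, origin=np.array([32.0, 32.0, 32.0]),
                  u=np.array([1.0, 0.0, 0.0]), v=np.array([0.0, 1.0, 0.0]),
                  size=(32, 32))
print(img.shape)   # (32, 32)
```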
FIGURE 3 illustrates a tri-plane implementation of the present invention, in which an anatomical target is continuously tracked in three image planes
50, 52 and 56 of a volumetric region of the body. In this example the heart 56 of a patient is being imaged and the tracking system is continuously steering the planes of the three images so that the anatomical target, heart valve 54, is always being visualized in the three planar images.
FIGURE 4 illustrates a biplane display screen for an implementation of the present invention which shows a left image L, a right image R, and an image orientation icon 400. Each image has a dot 402, 404 associated with it which corresponds to an orientation marker on a particular side of the ultrasound probe and denotes the matching side of the image. The probe marker enables the sonographer to distinguish a side of the image relative to the probe. In this example the marker and dots mark the right side of the planar images L and R. The circle 410 of the orientation icon 400 indicates the circle about which the two image planes can be rotated by the
sonographer. In this example the planes of the two images, indicated by the line 412, are aligned. The dots 406, 408 of the icon indicate the right-side dots of the images L and R. As the plane of the right image is rotated, the plane line of the R plane 412 departs from the plane line of the L plane as shown in FIGURE 5a. When the right image is rotated another 180° the plane lines take on the relative orientation shown in FIGURE 5b. The image orientation icons of FIGURES 5a and 5b are suitable for a biplane system which permits rotation of the two planes about their common center line.
FIGURE 5c illustrates an orientation icon 600 which can be used to indicate the relative tilt orientation of two image planes for a biplane system which permits plane tilting. In the icon 600 the small graphical sector 602 represents the fixed position of the left image L in FIGURE 4. A cursor line 604 represents the right image R viewed "edge-on" from the side. In this example the right image plane is tilted 30° from a nominal orientation in which the center lines of the two images are aligned, which is a 0° reference orientation. In the orientation at initialization the cursor line is vertically oriented in the icon 600. As the plane of the right image is tilted relative to the left image, the cursor line moves to indicate the relative tilt of one image plane to the other. As an alternative to the icon 600, the cursor line 604 can be displayed over the reference image L. Cursor display types other than a line, such as dots or pointers, can also be used for cursor line 604. Icons for both rotation and tilting can be shown individually or an icon illustrating both characteristics can be used. Instead of viewing the cursor line 604 for the right image R edge-on, the line 604 can be replaced by a perspective view of a rotated and tilted right image plane, for instance.
FIGURE 6 is another illustration of two biplane sector images 291 and 286, shown in relation to the 2D transducer array 337 which acquired them. In this example the X-axis is the long axis of the transducer array and is the azimuth (in-plane) dimension of the right image 286. Icon 370 is the orientation icon for the right image 286. The left image 291 has its azimuth dimension in the Y direction when the image planes are orthogonal as shown in this drawing. An orientation icon 372 shows the orientation of the plane of the left image 291. The Z (depth) dimension of both images extends normal to the plane of the transducer array 337. Both planes may be rotated and tilted from these initial orientations to follow a target and keep it in the planes of both images.
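As an illustration of this coordinate convention, the normal of a steered plane can be derived by composing a rotation about the depth (Z) axis with a tilt. The construction below is only a geometric sketch; the rotation order and tilt axis are assumptions, since the patent describes the steering geometry qualitatively:

```python
import numpy as np

def rot_z(deg):
    """Rotation about the probe's depth (Z) axis."""
    a = np.radians(deg)
    return np.array([[np.cos(a), -np.sin(a), 0.0],
                     [np.sin(a),  np.cos(a), 0.0],
                     [0.0,        0.0,       1.0]])

def rot_x(deg):
    """Rotation about the array's long (X) axis, used here as the tilt."""
    a = np.radians(deg)
    return np.array([[1.0, 0.0,        0.0],
                     [0.0, np.cos(a), -np.sin(a)],
                     [0.0, np.sin(a),  np.cos(a)]])

# The right image initially contains the X (azimuth) and Z (depth) axes,
# so its normal is Y; apply a 30-degree rotation, then a 20-degree tilt.
normal = rot_x(20.0) @ rot_z(30.0) @ np.array([0.0, 1.0, 0.0])
print(np.round(normal, 3))   # [-0.5    0.814  0.296]
```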
For relatively slow moving targets such as a lesion in the liver that moves with the patient's respiration, or the wall of the heart which moves with the motion of the heart in the chest cavity, the motion is generally slow enough (and the frame rate high enough) that a target will not leave both planes in less than an inter-frame interval, that is, will not suddenly move a significant distance in the elevation direction of both planes. The direction and velocity of the motion that is determined in one inter-frame interval can be used to predict the location of the target at the time of the next frame, enabling the next frame to "lead" the moving target and acquire the next image at the predicted target location. However there can be times when the target suddenly leaves both frames, such as can occur due to abrupt plane motion, requiring the target to be re-acquired in an image and marked again for tracking.
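The "leading" of a slow-moving target described above amounts to a constant-velocity prediction. A minimal sketch (the patent states the idea, not a particular predictor):

```python
import numpy as np

def predict_next(position, prev_position, dt=1.0):
    """Constant-velocity prediction: use the displacement over the last
    inter-frame interval to 'lead' the target for the next acquisition."""
    velocity = (position - prev_position) / dt
    return position + velocity * dt

# Usage: a target that moved 2 mm in elevation last frame is expected
# another 2 mm further at the next frame.
p_prev, p_now = np.array([0.0, 0.0, 0.0]), np.array([0.0, 0.0, 2.0])
print(predict_next(p_now, p_prev))   # [0. 0. 4.]
```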
The problem of suddenly losing the target can be exacerbated when a frame is very thin due to tight beam focusing. It can be ameliorated somewhat by use of "thick slice" imaging as described in US Pat. pub. no. 2010/0168580 (Thiele). FIGURE 7 illustrates a volumetric region 10 in which a thick slice is acquired. The thick slice comprises three thin, parallel adjacent planar slices 12, 14, and 16 which are combined in the elevation direction to form a single thick slice for imaging. When used in an implementation of the present invention the target will be tracked to keep it visible in the center slice 14 of the three parallel slices. Should the target move in the elevation direction it will then be found in one of the elevationally adjacent slices 12 or 16, and the slice acquisition re-steered to again acquire the target in the relocated center slice 14. While thick slice imaging will increase the acquisition time and hence slow the frame rate due to the need to acquire multiple adjacent slices instead of just a single planar image, it can enhance the ability to continually track the target by judicious choice of the inter-slice thickness and number of slices used.
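A toy sketch of the thick-slice idea: combine the three thin slices for display, and check which slice shows the target most strongly so acquisition can be re-steered when the target drifts out of the center slice. The maximum-intensity combination and the brightness test are assumptions for illustration, not the method of the Thiele publication:

```python
import numpy as np

def thick_slice(slices):
    """Combine thin, elevationally adjacent slices into one thick slice
    (maximum-intensity combination, an assumed rule)."""
    return np.max(np.stack(slices), axis=0)

def elevation_drift(slices, target_rc):
    """Return -1, 0, or +1 according to which slice shows the target most
    strongly, i.e. how acquisition should be re-steered in elevation so the
    target returns to the center slice (index 1)."""
    r, c = target_rc
    strength = [s[r, c] for s in slices]   # brightness at the tracked point
    return int(np.argmax(strength)) - 1

# Usage: target bright in the slice at +1 elevation -> re-steer by +1.
s12, s14, s16 = np.zeros((8, 8)), np.zeros((8, 8)), np.zeros((8, 8))
s16[4, 4] = 1.0
print(elevation_drift([s12, s14, s16], (4, 4)))   # 1: shift toward slice 16
```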
Instead of tracking a single point target, multiple points can be tracked. If the probe capabilities are such that all utilized planes cannot be generated so that they pass through the multiple tracked targets, best-fit plane approximations can be used instead as described in the above-mentioned co-pending application SN (2013PF02066).
FIGURES 8 and 9 show an implementation of the present invention in which a 2D array transducer 500 is used to obtain data from a set of biplanes 510 and 512. FIGURE 8 illustrates the array transducer and
biplanes in perspective and FIGURE 9 shows the image planes and transducer from above or below. In this example the target 130 has several points 506, 508, 514 and 516 which are initially located in both of planes 510 and 512. In this implementation the array transducer generates beams such as beam 504 that lies in plane 510, intersecting points 514 and 506; and beam 505 that lies in plane 512, intersecting points 516 and 508. The rays emanating from two-dimensional array transducer 500 are electronically steered in three dimensions, thus avoiding the need to
mechanically sweep the transducer across the
volumetric region of interest. In similar fashion, data is received from the lines of interest in the respective planes using well-known beam steering and focusing and/or gating techniques applicable to a two-dimensional array transducer. Once the points of the target 130 have been located and marked in one or both image planes they can be tracked in both planes to span the two planes, or the image planes can be re-steered to relocate the target in the center at the intersection of the two planes 510 and 512 and tracked from there.
An ultrasound system of the present invention can be used to locate and track an invasive device inside the body during an interventional procedure, such as a catheter used for a cardiac ablation procedure. FIGURE 10 is a cross-sectional view of the human heart along its long axis, showing the introduction of a catheter 40 into the left ventricle (LV) 394 from the aortic outflow tract (AO) and through the aortic valve. The distal tip 46 of the catheter is shown resting on the myocardial wall 399 of the LV where an ablation procedure is to be performed. The site of the procedure is within a depth range bounded by lines 416B and 417B where the focus of the probe is optimized. The site 415B of the procedure is visualized in sector images (FIGURES 11a and 11b) bounded by lines 412B and 413B in one plane, and lines 445B and 446B in another. Also shown in FIGURE 10 are the left atrium (LA) 390 and the mitral valve 392 between the LA and the LV. The right ventricle (RV) 386 is at the top of the heart in this view.
In the conduct of the procedure the heart is imaged to find the distal tip 46 of the catheter in an image. The tip 46 is manually selected as the target to be tracked by clicking a cursor on the catheter tip in an image on the display screen. Alternatively, the catheter tip can be automatically identified and selected by image processing to find its specular reflection in the image or, when so equipped, by a signal produced by a locating transducer on the catheter tip as shown in US Pat. 5,158,088 (Nelson et al.). The plane of the second image is then steered to image the catheter tip 46 in the second image. Thereafter, the motion of the catheter tip 46 is tracked and the image plane orientations adjusted so that the catheter tip is continually visualized in the two images 450B and 420B throughout the ablation procedure. The short axis image sector 445B-446B in
FIGURE 11a shows the myocardial wall 388 of the RV at the top of the image, the LV 394 in the center of the image, the mitral valve 392, and the myocardial wall 399 of the LV on which the catheter tip 46 is
positioned for the procedure in site region 448B.
The long axis image sector 412B-413B of the second view in FIGURE 11b shows a more cross-sectional view of the catheter and its distal tip 46 on the wall 399 of the myocardium 388 of the LV. To the right of each image is an orientation icon 460B, 430B. These icons each illustrate the position of the respective image sector 464B, 434B over a range 462B, 436B of possible in-plane sector steering. The clinician can thereby continually view the tip of the catheter and the site of the procedure in two different views as the procedure is performed.

Claims

WHAT IS CLAIMED IS:
1. An ultrasonic imaging system for imaging a targeted object or anatomy in a plurality of
different image planes comprising:
a two dimensional array transducer probe adapted to image a plurality of different image planes in a volumetric region;
a beamform controller coupled to the two
dimensional array transducer probe;
an image processing and display system, coupled to the two dimensional array transducer probe, and adapted to display real time images of a plurality of different image planes;
a target selector adapted to identify a target in an ultrasound image produced by the image
processing and display system; and
a motion detector, responsive to the
identification of a target in an ultrasound image and adapted to track the target for display in the plurality of real time images.
2. The ultrasonic imaging system of Claim 1, wherein the motion detector is further coupled to the beamform controller,
wherein the beamform controller is further responsive to a tracking signal from the motion detector to steer the planes of the real time images.
3. The ultrasonic imaging system of Claim 1, further comprising a 3D image memory which comprises a 3D volume data set,
wherein the 3D image memory is further
responsive to a tracking signal from the motion detector to select a plurality of different image planes containing a view of the target.
4. The ultrasonic imaging system of Claim 1, wherein the target selector further comprises a manually operated user control for selecting a target in an image plane.
5. The ultrasonic imaging system of Claim 1, wherein the target selector further comprises an image processor responsive to images produced by the image processing and display system which
automatically identifies a desired target in an image.
6. The ultrasonic imaging system of Claim 1, wherein the motion detector produces a target
tracking signal by one of block matching, optical flow, rigid registration, or non-rigid registration of successively acquired images of an image plane.
7. The ultrasonic imaging system of Claim 6, wherein the image processing and display system is further adapted to simultaneously display real time biplane images of different image planes.
8. The ultrasonic imaging system of Claim 6, wherein the image processing and display system is further adapted to simultaneously display real time tri-plane images of different image planes.
9. The ultrasonic imaging system of Claim 1, wherein the target further comprises a lesion in an organ.
10. The ultrasonic imaging system of Claim 1, wherein the target further comprises an invasive object for an interventional procedure.
11. A method for imaging a targeted object or anatomy in a plurality of different image planes comprising:
imaging a subject with a two dimensional array transducer probe adapted to image a plurality of different image planes in a volumetric region;
controlling the two dimensional array transducer probe with a beamform controller;
identifying a target in a first ultrasound image with a target selector;
tracking the target in the ultrasound images; and
processing and displaying real time images of the target in a plurality of different image planes.
12. The method of Claim 11, further comprising, following the identifying step, imaging the target in a second ultrasound image.
13. The method of Claim 12, wherein tracking further comprises tracking the target in the
ultrasound images by one of block matching, optical flow, rigid registration, or non-rigid registration of successively acquired images of an image plane.
14. The method of Claim 11, further comprising displaying a plane orientation icon indicating the relative orientation of the different image planes.
15. The method of Claim 11, wherein controlling further comprises steering the planes of the images imaged with the two dimensional array transducer probe in response to the tracking step.
PCT/IB2014/065528 2013-11-11 2014-10-22 Multi-plane target tracking with an ultrasonic diagnostic imaging system WO2015068073A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201361902547P 2013-11-11 2013-11-11
US61/902,547 2013-11-11

Publications (1)

Publication Number Publication Date
WO2015068073A1 (en) 2015-05-14

Family

ID=52004004

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2014/065528 WO2015068073A1 (en) 2013-11-11 2014-10-22 Multi-plane target tracking with an ultrasonic diagnostic imaging system

Country Status (1)

Country Link
WO (1) WO2015068073A1 (en)

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5158088A (en) 1990-11-14 1992-10-27 Advanced Technology Laboratories, Inc. Ultrasonic diagnostic systems for imaging medical instruments within the body
US5997479A (en) 1998-05-28 1999-12-07 Hewlett-Packard Company Phased array acoustic systems with intra-group processors
US6442289B1 (en) 1999-06-30 2002-08-27 Koninklijke Philips Electronics N.V. Extended field of view ultrasonic diagnostic imaging
US6709394B2 (en) 2000-08-17 2004-03-23 Koninklijke Philips Electronics N.V. Biplane ultrasonic imaging
US6436048B1 (en) 2000-08-24 2002-08-20 Koninklijke Philips Electronics N.V. Ultrasonic diagnostic imaging system with scanhead elevation beamforming
US20060004291A1 (en) * 2004-06-22 2006-01-05 Andreas Heimdal Methods and apparatus for visualization of quantitative data on a model
US8303502B2 (en) * 2007-03-06 2012-11-06 General Electric Company Method and apparatus for tracking points in an ultrasound image
US20100168580A1 (en) 2007-04-13 2010-07-01 Koninklijke Philips Electronics N.V. High speed ultrasonic thick slice imaging
GB2493902A (en) * 2011-06-28 2013-02-27 Surf Technology As Multiple scan-plane ultrasound imaging apparatus and method suitable for the assessment of wall motion abnormalities of the heart

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
B. HORN; B. SCHUNCK: "Determining Optical Flow", Artificial Intelligence, vol. 17, 1981, pages 185-203, XP000195787, DOI: 10.1016/0004-3702(81)90024-2

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20190197764A1 (en) * 2016-06-12 2019-06-27 Telefield Medical Imaging Limited Three-dimensional imaging method and system
WO2018065282A1 (en) * 2016-10-03 2018-04-12 Koninklijke Philips N.V. X-plane and 3d imaging for asymmetric apertures
US11357473B2 (en) 2017-02-14 2022-06-14 Koninklijke Philips N.V. Path tracking in ultrasound system for device tracking
CN108852409A (en) * 2017-05-10 2018-11-23 通用电气公司 For the visualization method and system by across planar ultrasound image enhancing moving structure
US10299764B2 (en) * 2017-05-10 2019-05-28 General Electric Company Method and system for enhanced visualization of moving structures with cross-plane ultrasound images
KR20210011477A (en) * 2017-05-10 2021-02-01 제네럴 일렉트릭 컴퍼니 Method and system for enhanced visualization of moving structures with cross-plane ultrasound images
KR102321853B1 (en) * 2017-05-10 2021-11-08 제네럴 일렉트릭 컴퍼니 Method and system for enhanced visualization of moving structures with cross-plane ultrasound images
CN108852409B (en) * 2017-05-10 2022-03-22 通用电气公司 Method and system for enhancing visualization of moving structures by cross-plane ultrasound images
CN111200972A (en) * 2017-10-05 2020-05-26 医视特有限公司 Frameless ultrasound therapy
WO2020043795A1 (en) * 2018-08-29 2020-03-05 Koninklijke Philips N.V. Imaging plane control and display for intraluminal ultrasound, and associated devices, systems, and methods
CN112638277A (en) * 2018-08-29 2021-04-09 皇家飞利浦有限公司 Imaging plane control and display for intraluminal ultrasound and related devices, systems, and methods
CN112638277B (en) * 2018-08-29 2024-03-08 皇家飞利浦有限公司 Imaging plane control and display for intraluminal ultrasound and related devices, systems, and methods

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14806424

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 14806424

Country of ref document: EP

Kind code of ref document: A1