US20210106310A1 - Ultrasound imaging system having automatic image presentation - Google Patents


Info

Publication number
US20210106310A1
US20210106310A1 (Application US 17/131,073)
Authority
US
United States
Prior art keywords
ultrasound
dimensional
imaging
medical device
probe
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
US17/131,073
Inventor
Jeremy B. Cox
Michael A. Randall
Peng Zheng
Dean M. Addison
Bryan A. Matthews
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Starfish Product Engineering Inc
CR Bard Inc
Bard Peripheral Vascular Inc
Original Assignee
CR Bard Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by CR Bard Inc filed Critical CR Bard Inc
Priority to US 17/131,073
Publication of US20210106310A1
Assigned to C.R. BARD, INC. reassignment C.R. BARD, INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: BARD PERIPHERAL VASCULAR, INC.
Assigned to BARD PERIPHERAL VASCULAR, INC. reassignment BARD PERIPHERAL VASCULAR, INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: STARFISH PRODUCT ENGINEERING INC.
Assigned to STARFISH PRODUCT ENGINEERING INC. reassignment STARFISH PRODUCT ENGINEERING INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: Addison, Dean M., Matthews, Bryan A.
Assigned to BARD PERIPHERAL VASCULAR, INC. reassignment BARD PERIPHERAL VASCULAR, INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: COX, JEREMY B., ZHENG, PENG, Randall, Michael A.

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00: Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/0841: Detecting organic movements or changes, e.g. tumours, cysts, swellings, involving detecting or locating foreign bodies or organic structures, for locating instruments
    • A61B 8/4254: Determining the position of the probe, e.g. with respect to an external reference frame or to the patient, using sensors mounted on the probe
    • A61B 8/4263: Determining the position of the probe using sensors not mounted on the probe, e.g. mounted on an external reference frame
    • A61B 8/4405: Device being mounted on a trolley
    • A61B 8/4461: Features of the scanning mechanism, e.g. for moving the transducer within the housing of the probe
    • A61B 8/461: Displaying means of special interest
    • A61B 8/466: Displaying means of special interest adapted to display 3D data
    • A61B 8/483: Diagnostic techniques involving the acquisition of a 3D volume of data
    • A61B 8/5246: Processing of medical diagnostic data for combining image data of patient, e.g. merging several images from different acquisition modes into one image, combining images from the same or different imaging techniques, e.g. color Doppler and B-mode

Definitions

  • the present invention relates to ultrasound imaging, and, more particularly, to an ultrasound imaging system that assists in the positioning of an ultrasound probe.
  • an ultrasound imaging system as in the present invention assists a person not experienced in ultrasound imaging in successful image acquisition, via system-assisted positioning of an ultrasound probe, such that an image of a location of interest under, i.e., in the imaging view of, the ultrasound probe can be displayed.
  • the present invention provides an ultrasound imaging system that assists in image acquisition, and in positioning of an ultrasound probe, such that an image of a location of interest under, i.e., in the imaging view of, the probe can be displayed.
  • the ultrasound imaging system assists in the positioning of an ultrasound probe such that a specific image containing a medical device and/or the surrounding area can automatically be presented to the user.
  • the system may further be used to create three-dimensional (3D) images of underlying structures, which may convey additional information regarding the state of the underlying anatomy. This may assist one performing peripheral arterial disease (PAD) or other interventional procedures.
  • the invention in one form is directed to an ultrasound imaging system that includes an electromagnetic (EM) field generator configured to generate an EM locator field.
  • An interventional medical device is defined by an elongate body having a distal tip and a distal end portion extending proximally from the distal tip.
  • the interventional medical device has a first tracking element mounted at the distal end portion of the interventional medical device.
  • the first tracking element is configured to generate tip location data based on the EM locator field.
  • An ultrasound probe has a probe housing, an ultrasound transducer mechanism, and a second tracking element.
  • the probe housing has a handle portion and a head portion. The ultrasound transducer mechanism and the second tracking element are mounted to the probe housing.
  • the ultrasound transducer mechanism has an active ultrasound transducer array configured to generate two-dimensional ultrasound slice data at any of a plurality of discrete imaging locations within a three-dimensional imaging volume associated with the head portion.
  • the second tracking element is configured to generate probe location data based on the EM locator field.
  • a display screen is configured to display an ultrasound image.
  • a processor circuit is communicatively coupled to the first tracking element, the second tracking element, the ultrasound transducer mechanism, and the display screen.
  • the processor circuit is configured to execute program instructions to process the two-dimensional ultrasound slice data to generate the ultrasound image for display at the display screen.
  • the processor circuit is configured to generate a positioning signal based on the tip location data and the probe location data to dynamically position the active ultrasound transducer array at a desired imaging location of the plurality of discrete imaging locations so that the two-dimensional ultrasound slice data includes at least the distal tip of the interventional medical device so long as a location of the distal tip of the interventional medical device remains in the three-dimensional imaging volume.
  • a further version of the invention lies in the electromagnetic field generator adapted for use in such a system, the interventional medical device adapted for use in such a system, an ultrasound probe adapted for use in such a system, a display screen adapted for use in such a system, and a processor circuit adapted for use in such a system.
  • An alternative version of the invention lies in a system comprising a combination of any of the objects recited in the previous sentence.
  • the invention in another form is directed to a method of operating an ultrasound imaging system, including acquiring a position of a first tracking element associated with an interventional medical device; acquiring a position of a second tracking element associated with an ultrasound probe; determining an ultrasound imaging plane position of the ultrasound probe based on the position of the second tracking element; determining an offset distance between the position of the first tracking element of the interventional medical device and the ultrasound imaging plane position; and driving an ultrasound transducer mechanism to position an active ultrasound transducer array of the ultrasound probe at a determined point of convergence as defined by the offset distance.
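The method steps above (acquire both positions, compute the offset, drive the array to the point of convergence) can be sketched as follows. This is an illustrative Python sketch only; all names, the millimeter units, and the single-scan-axis geometry are assumptions, since the patent specifies positioning along the probe's scan axis but no particular implementation:

```python
# Hypothetical sketch of the probe-positioning method; not the patent's code.

def offset_distance(tip_y: float, plane_y: float) -> float:
    """Offset between the device tip and the current imaging plane,
    both expressed as y-coordinates (mm) in the probe's scan axis."""
    return tip_y - plane_y

def convergence_point(tip_y: float, scan_min: float, scan_max: float) -> float:
    """Target array position: the tip's y-coordinate clamped to the
    transducer's mechanical scan range."""
    return max(scan_min, min(scan_max, tip_y))

# Example: the tip sits 12 mm ahead of the current plane at y = 8 mm,
# so the array is driven to y = 20 mm within a 0-40 mm scan range.
plane_y = 8.0
tip_y = 20.0
print(offset_distance(tip_y, plane_y))      # 12.0
print(convergence_point(tip_y, 0.0, 40.0))  # 20.0
```

The clamp models the "so long as the tip remains in the imaging volume" condition: a tip beyond the scan range simply pins the array at the nearest end.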
  • a motion indicator is located on at least one of the ultrasound probe and the display screen.
  • the processor circuit is operably coupled to the motion indicator, wherein if the distal tip of the interventional medical device is presently located outside the three-dimensional imaging volume, a visual prompt is generated at the motion indicator to prompt the user to move the head portion of the ultrasound probe in a particular direction to a general location such that the distal tip of the interventional medical device resides in the three-dimensional imaging volume.
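The motion-indicator prompt could be computed as below; a minimal sketch assuming the tip position and imaging volume are both expressed in probe coordinates (an assumption), with per-axis prompts standing in for the visual indicator:

```python
# Illustrative sketch of the motion-indicator logic; names are hypothetical.

def move_prompt(tip, volume_min, volume_max):
    """Return per-axis prompts ('+x', '-y', ...) indicating which way the
    probe head must move to bring the tip inside the imaging volume;
    an empty list means the tip is already within the volume."""
    prompts = []
    for axis, t, lo, hi in zip("xyz", tip, volume_min, volume_max):
        if t < lo:
            prompts.append("-" + axis)
        elif t > hi:
            prompts.append("+" + axis)
    return prompts

# Tip is past the volume in +x and short of it in -y:
print(move_prompt((50.0, -5.0, 30.0), (0, 0, 0), (40, 40, 60)))  # ['+x', '-y']
```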
  • a third tracking element is attached to a patient; when the third tracking element is energized by the EM field generator, it generates six-axis patient location data, which is supplied to the processor circuit.
  • the processor circuit processes the six-axis patient location data and assigns location information for images captured by the active ultrasound transducer array to known positions within a 3D volume referenced from the third tracking element.
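Referencing image locations to the patient-mounted tracker amounts to a change of coordinate frame. The sketch below models only translation plus yaw for brevity; a full implementation would use the complete six-axis pose, and all names and units are assumptions:

```python
import math

# Hedged sketch: express an acquired slice's position in a patient-fixed
# frame defined by the third (patient-mounted) tracking element.

def to_patient_frame(point, patient_origin, patient_yaw_rad):
    """Map a point from the EM field generator's frame into the patient
    reference frame: translate to the tracker origin, then rotate by the
    inverse of the patient's yaw."""
    dx = point[0] - patient_origin[0]
    dy = point[1] - patient_origin[1]
    c, s = math.cos(-patient_yaw_rad), math.sin(-patient_yaw_rad)
    return (c * dx - s * dy, s * dx + c * dy, point[2] - patient_origin[2])

# A slice acquired at (110, 60, 20) mm with the patient tracker at
# (100, 50, 0) and the patient yawed 90 degrees:
p = to_patient_frame((110.0, 60.0, 20.0), (100.0, 50.0, 0.0), math.pi / 2)
print(tuple(round(v, 6) for v in p))  # (10.0, -10.0, 20.0)
```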
  • the ultrasound imaging system has a three-dimensional imaging mode, wherein with the ultrasound probe held in a fixed position over an area of interest, a scanning signal is supplied to the ultrasound transducer mechanism to scan the active ultrasound transducer array over at least a portion of the possible imaging volume located below the transducer array.
  • the active transducer array is repeatedly actuated during the scan to generate a plurality of sequential two-dimensional ultrasound data slices which are combined to form three-dimensional ultrasound volumetric data from which a three-dimensional ultrasound image is generated.
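The scan-and-stack acquisition described above can be sketched as follows; the nested-list volume, the `[y][z][x]` indexing, and the fake transducer callback are all assumptions for illustration:

```python
# Minimal sketch of three-dimensional acquisition: the array is stepped
# through the scan extent and each two-dimensional slice is stacked into
# a volume indexed [y][z][x].

def acquire_volume(scan_positions, acquire_slice):
    """Drive the array to each y position, grab a 2-D slice (a list of
    pixel rows), and stack the slices into a 3-D volume."""
    return [acquire_slice(y) for y in scan_positions]

# Fake transducer: a 2x3 slice whose pixel values encode the scan position.
fake_slice = lambda y: [[y * 10 + c for c in range(3)] for _ in range(2)]
volume = acquire_volume([0, 1, 2], fake_slice)
print(len(volume), len(volume[0]), len(volume[0][0]))  # 3 2 3
```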
  • the active ultrasound transducer array is operated to generate multiple sets of ultrasound image data that includes metadata describing the location of the scan within the three-dimensional volume.
  • the multiple sets of ultrasound image data are summed to generate composite ultrasound image data.
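A minimal sketch of the compounding step follows. The patent says only that the sets are "summed"; pixel-wise averaging is one common realization and is an assumption here, as are all names:

```python
# Hedged sketch of composite image generation from co-located image sets.

def composite(image_sets):
    """Pixel-wise mean of equally sized 2-D image data sets."""
    n = len(image_sets)
    rows, cols = len(image_sets[0]), len(image_sets[0][0])
    return [[sum(img[r][c] for img in image_sets) / n for c in range(cols)]
            for r in range(rows)]

# Two 1x2 image sets averaged into one composite:
print(composite([[[0, 2]], [[4, 6]]]))  # [[2.0, 4.0]]
```

Averaging rather than raw summation keeps the composite in the same intensity range as the inputs, which is why compounding implementations typically normalize.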
  • a desired image plane is defined in the three-dimensional ultrasound volumetric data. At least one synthetic scan plane is generated corresponding to the desired image plane.
  • a first two-dimensional ultrasound image slice is generated from a series of two-dimensional B-scan ultrasound image slices acquired from the three-dimensional ultrasound volumetric data.
  • the first two-dimensional ultrasound image slice includes a particular region of interest.
  • the first two-dimensional ultrasound image slice lies in a first imaging plane different from that of the native B-scan imaging plane of the series of two-dimensional ultrasound image slices.
  • At least one slice selection slider provides a sequential parallel variation from the first two-dimensional ultrasound image slice to manually select a second two-dimensional ultrasound image slice parallel to the first two-dimensional ultrasound image slice, wherein the second two-dimensional ultrasound image slice may lie on either side of the first two-dimensional ultrasound image slice.
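Extracting a synthetic plane and stepping through its parallel neighbors can be sketched as reslicing the stacked volume. The `volume[y][z][x]` indexing convention and the coronal (constant-depth) choice of plane are assumptions for this sketch:

```python
# Illustrative reslicing of a B-scan stack into a synthetic scan plane.

def coronal_slice(volume, z):
    """Synthetic scan plane at constant depth z, i.e. a plane parallel to
    the skin, cut across the native B-scan imaging planes."""
    return [[col[z][x] for x in range(len(col[z]))] for col in volume]

def slider_select(volume, z, step):
    """Slice-selection slider: pick a parallel slice offset by `step`
    (negative or positive), clamped to the volume's depth range."""
    z2 = max(0, min(len(volume[0]) - 1, z + step))
    return coronal_slice(volume, z2)

# Toy volume where pixel value encodes its coordinates: 100*y + 10*z + x.
vol = [[[y * 100 + z * 10 + x for x in range(2)] for z in range(3)]
       for y in range(4)]
print(coronal_slice(vol, 1)[0])     # [10, 11]
print(slider_select(vol, 1, 1)[2])  # [220, 221]
```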
  • an orientation of the ultrasound image that is displayed on a display screen is adjusted such that a vertical top of the acquired ultrasound image data is always rendered as “up” on the display screen relative to the position of the patient, regardless of the actual orientation of the ultrasound probe relative to the patient.
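One way to realize the "patient-up" rule is to counter-rotate the rendered image by the probe's tracked roll angle, so the anatomy stays upright however the probe is held. The roll-only, two-dimensional model and the function name below are assumptions for this sketch:

```python
# Hedged sketch of the patient-up display correction.

def display_rotation_deg(probe_roll_deg: float) -> float:
    """Rotation (degrees) to apply to the displayed image so the vertical
    top of the data is rendered 'up' relative to the patient."""
    return (-probe_roll_deg) % 360.0

print(display_rotation_deg(90.0))  # 270.0
print(display_rotation_deg(0.0))   # 0.0
```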
  • Another aspect of the invention is directed to a method of operating an ultrasound imaging system, including acquiring a position of a first tracking element associated with an interventional medical device; acquiring a position of a second tracking element associated with an ultrasound probe; determining an ultrasound imaging plane position of the ultrasound probe based on the position of the second tracking element; determining an offset distance between the position of the first tracking element of the interventional medical device and the ultrasound imaging plane position; and using the offset distance to dynamically control at least one ultrasound imaging setting of the ultrasound imaging system in near real time.
  • the term “near real time” means real time as limited by the data acquisition and processing speed of the processing system.
  • the at least one ultrasound imaging setting may include ultrasound focus, such that a lateral resolution is optimized at a depth that contains the interventional medical device.
  • the at least one ultrasound imaging setting may include a depth setting, such that a depth of imaging is automatically adjusted to match a depth of the interventional medical device.
  • the at least one ultrasound imaging setting may include zoom, wherein an imaging window can be “zoomed” such that a larger view of an area of interest is automatically displayed to the user.
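The three offset-driven settings above (focus, depth, zoom) could be derived from the tip's tracked depth as sketched below. Every constant, the 25% depth margin, and the zoom heuristic are assumptions for illustration; the patent names the settings but not their control laws:

```python
# Hedged sketch of offset-driven automatic imaging settings.

def auto_settings(tip_depth_mm: float) -> dict:
    """Derive focus, imaging depth, and zoom from the device tip's depth."""
    focus = tip_depth_mm                            # best lateral resolution at the tip
    depth = min(150.0, tip_depth_mm * 1.25)         # keep 25% headroom below the tip
    zoom = max(1.0, 60.0 / max(tip_depth_mm, 1.0))  # shallower tip -> tighter window
    return {"focus_mm": focus, "depth_mm": depth, "zoom": round(zoom, 2)}

print(auto_settings(40.0))  # {'focus_mm': 40.0, 'depth_mm': 50.0, 'zoom': 1.5}
```

Re-running this on every tracking update is what makes the settings follow the device in near real time.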
  • FIG. 1 is an illustration of an ultrasound imaging system in accordance with an aspect of the present invention.
  • FIG. 2 is an electrical block diagram of the ultrasound imaging system of FIG. 1 .
  • FIG. 3 shows an interventional medical device, such as a catheter or sheath, having a tracking element near its distal tip.
  • FIG. 4 shows an interventional medical device, such as a catheter, having a wireless dongle.
  • FIG. 5A shows the ultrasound probe of FIG. 1 having an ultrasound transducer mechanism with an active ultrasound transducer array configured to generate two-dimensional ultrasound slice data.
  • FIG. 5B shows a graphical user interface having a display screen showing a two-dimensional ultrasound image of the two-dimensional ultrasound slice data acquired by the ultrasound probe depicted in FIG. 5A .
  • FIG. 6A is a block diagram of an embodiment of the ultrasound probe of FIG. 1 , having a movable one-dimensional transducer array.
  • FIG. 6B shows the ultrasound probe of FIGS. 1 and 6A , with a portion broken away to expose an ultrasound transducer mechanism having a movable one-dimensional transducer array, a carriage, and a stepper motor.
  • FIG. 7A is a block diagram of another embodiment of the ultrasound probe of FIG. 1 , having a stationary two-dimensional transducer array.
  • FIG. 7B shows the ultrasound probe of FIG. 7A , depicting the two-dimensional transducer array in phantom (dashed) lines.
  • FIG. 8 is a flowchart depicting a lock-on tracking mode in accordance with an aspect of the present invention.
  • FIG. 9 is a flowchart depicting ultrasound data acquisition in accordance with an aspect of the present invention.
  • FIG. 10 shows a general side view of a patient having a position tracking element affixed to the skin.
  • FIG. 11 shows a screen of the graphical user interface of FIG. 1 , configured to display one or more synthetic (user chosen) scan planes, such as a coronal scan plane and an axial (sagittal) scan plane.
  • FIG. 12 is a pictorial representation of the graphical user interface of FIG. 1 depicting a sagittal plane slice extending through a series of two-dimensional ultrasound image slices in a three-dimensional imaging volume at sagittal slice location 270 .
  • FIG. 13 is a pictorial representation of the graphical user interface of FIG. 1 depicting a coronal plane slice extending through a series of two-dimensional ultrasound image slices in a three-dimensional imaging volume at coronal slice location 150 .
  • FIG. 14 is a flowchart describing the generation of a set of ultrasound images derived or synthesized from the three-dimensional volume data set, and shown in the correct location in the 3D virtual environment, in accordance with an aspect of the present invention.
  • FIG. 15A is a diagrammatic illustration of the ultrasound probe of FIG. 1 taking a two-dimensional ultrasound imaging slice of a portion of a leg of a patient.
  • FIG. 15B is a diagrammatic illustration of the graphical user interface of FIG. 1 having a patient oriented imaging window depicting a patient oriented virtual environment, wherein the location and orientation of the acquired ultrasound image data is rendered on the display screen to correspond to the orientation of the patient, such that the orientation and location of where the image is being acquired relative to the patient can be indicated and communicated to the viewer via use of the virtual environment.
  • FIG. 15C is a full view of the ultrasound image shown in FIG. 15B , in which the orientation of the location and orientation of the acquired ultrasound image data is rendered on the display screen to correspond to the orientation of the patient.
  • FIG. 15D is a comparative view of the ultrasound image shown in FIG. 15B when rendered in accordance with the prior art, wherein the orientation of the acquired ultrasound image data rendered on the display screen does not correspond to the orientation of the patient.
  • FIG. 16 is a flowchart of a patient oriented imaging window mode, or virtual environment imaging mode, associated with the depiction of the patient oriented imaging window of FIG. 15B shown in the correct location in the 3D virtual environment, in accordance with an aspect of the present invention.
  • FIGS. 1 and 2 there is shown an ultrasound imaging system 10 in accordance with the present invention.
  • Ultrasound imaging system 10 includes an electromagnetic (EM) field generator 12 , an ultrasound console 14 , and an ultrasound probe 16 (handheld). Ultrasound probe 16 is connected to an ultrasound console 14 by a flexible electrical cable 17 . Supplemental to ultrasound imaging system 10 is an interventional medical device 18 .
  • an interventional medical device is an elongate intrusive medical device that is configured to be inserted into the tissue, vessel, or cavity of a patient.
  • interventional medical device 18 may be, for example, a catheter, a lesion crossing catheter such as the CROSSER® Catheter available from C. R. Bard, Inc., a guide wire, a sheath, an angioplasty balloon, a stent delivery catheter, or a needle. It is intended that the interventional medical device 18 may be considered as a part of the overall ultrasound imaging system 10 , but alternatively, also may be considered as an auxiliary part of ultrasound imaging system 10 as a separately provided item.
  • Ultrasound imaging system 10 is configured to track the location of the ultrasound probe 16 and interventional medical device 18 , and in turn, to operate ultrasound probe 16 such that an active ultrasound transducer array of ultrasound probe 16 is dynamically positioned to image a desired portion of interventional medical device 18 , as further described below.
  • ultrasound console 14 includes a mobile housing 20 , to which is mounted a graphical user interface 22 , and a processor circuit 24 .
  • Graphical user interface 22 may be in the form of a touch-screen display 26 having a display screen 28 .
  • Graphical user interface 22 is used in displaying information to the user, and accommodates user input via the touch-screen 26 .
  • touch-screen 26 is configured to display an ultrasound image formed from two-dimensional ultrasound slice data provided by ultrasound probe 16 , to display virtual location information of tracked elements within a 3D volume, and to display prompts intended to guide the user in the correct positioning of the ultrasound probe 16 above the area of interest.
  • Processor circuit 24 is an electrical circuit that has data processing capability and command generating capability, and in the present embodiment has a microprocessor 24 - 1 and associated non-transitory electronic memory 24 - 2 .
  • Microprocessor 24 - 1 and associated non-transitory electronic memory 24 - 2 are commercially available components, as will be recognized by one skilled in the art.
  • Microprocessor 24 - 1 may be in the form of a single microprocessor, or two or more parallel microprocessors, as is known in the art.
  • Non-transitory electronic memory 24 - 2 may include multiple types of digital data memory, such as random access memory (RAM), non-volatile RAM (NVRAM), read only memory (ROM), and/or electrically erasable programmable read-only memory (EEPROM).
  • Non-transitory electronic memory 24 - 2 may further include mass data storage in one or more of the electronic memory forms described above, or on a computer hard disk drive or optical disk.
  • processor circuit 24 may be assembled as one or more Application Specific Integrated Circuits (ASIC).
  • Processor circuit 24 processes program instructions received from a program source, such as software or firmware, to which processor circuit 24 has electronic access. More particularly, processor circuit 24 is configured, as more fully described below, to process location signals received from ultrasound probe 16 and interventional medical device 18 , and to generate a digital positioning signal that is conditioned and provided as a control output to ultrasound probe 16 . More particularly, the digital positioning signal and control output correspond to a coordinate in the scan axis, e.g., the y-axis, of ultrasound probe 16 where the active ultrasound transducer array of ultrasound probe 16 is to be positioned.
  • Processor circuit 24 is communicatively coupled to a probe input/output (I/O) interface circuit 30 , a probe position control circuit 31 , and a device input/output (I/O) interface circuit 32 via an internal bus structure 30 - 1 , 31 - 1 , and 32 - 1 , respectively.
  • the term “communicatively coupled” means connected for communication over a communication medium, wherein the communication medium may be a direct wired connection having electrical conductors and/or printed circuit electrical conduction paths, or a wireless connection, and may be an indirect wired or wireless connection having intervening electrical circuits, such as amplifiers or repeaters.
  • Probe input/output (I/O) interface circuit 30 and probe position control circuit 31 are configured to connect to electrical cable 17 , which in turn is connected to ultrasound probe 16 .
  • device input/output (I/O) interface circuit 32 is configured to connect to a flexible electrical cable 34 , which in turn is connected to interventional medical device 18 .
  • EM field generator 12 is placed near the area of interest of the patient P, and is used in triangulating the location of one or more tracked elements, such as the position of ultrasound probe 16 and interventional medical device 18 .
  • EM field generator 12 may be, for example, the field generator of an Aurora Electromagnetic Tracking System available from Northern Digital Inc. (NDI), which generates a base electromagnetic field that radiates in a known orientation to facilitate electromagnetic spatial measurement, which will be referred to hereinafter as an EM locator field 36 (see FIG. 2 ).
  • the field strength of the EM locator field 36 defines a detection volume 38 , as diagrammatically illustrated as a cube volume, for convenience, in FIG. 1 .
  • interventional medical device 18 has a distal tip 40 and a distal end portion 42 extending proximally from the distal tip 40 .
  • a tracking element 44 , i.e., a wire electrical tracking coil, is mounted at distal end portion 42 near distal tip 40 of interventional medical device 18 .
  • as used here, the term “near” means a range of zero to 2 centimeters (cm) from distal tip 40 .
  • the extent of distal end portion 42 is in a range of 1 millimeter (mm) to 3 cm.
  • tracking element 44 allows the location of interventional medical device 18 to be known relative to ultrasound probe 16 , as more fully described below.
  • Tracking element 44 is configured to generate tip location data defining five degrees of freedom based on the EM locator field 36 generated by EM field generator 12 .
  • the five degrees of freedom are the X-axis, Y-axis, Z-axis, pitch, and yaw.
  • a sixth degree of freedom, i.e., roll, may be also included, if desired.
  • Tracking element 44 of interventional medical device 18 is communicatively coupled to processor circuit 24 of ultrasound console 14 via electrical cable 34 , serving as a communication link 46 between processor circuit 24 and tracking element 44 .
  • “communications link” refers to an electrical transmission of data, i.e., information, and/or electrical power signals, over a wired or wireless communication medium.
  • the communication link 46 provided by electrical cable 34 is a multi-conductor electrical cable that physically connects tracking element 44 to the ultrasound console 14 , and in turn to processor circuit 24 .
  • communication link 46 may be in the form of a short range wireless connection, such as Bluetooth, via a Bluetooth dongle 48 attached to interventional medical device 18 .
  • the Bluetooth dongle 48 is configured as a Bluetooth transmitter using Bluetooth protocol, and a corresponding Bluetooth receiver is connected to processor circuit 24 .
  • Bluetooth dongle 48 communicates tracking information from tracking element 44 , and other information associated with interventional medical device 18 , such as an operating state, to processor circuit 24 of ultrasound imaging system 10 .
  • Bluetooth dongle 48 may be used to provide power to the EM tracking components incorporated into interventional medical device 18 , in the case where the EM tracking component is an active circuit requiring a power source.
  • Bluetooth dongle 48 may be disposable, and included with each interventional medical device 18 .
  • Bluetooth dongle 48 may be reusable. Sterility requirements for the reusable dongle are addressed by placing the sterilized dongle in a sterile bag through which a sterile connection to interventional medical device 18 is made.
  • ultrasound probe 16 includes a probe housing 50 having a handle portion 52 joined with a head portion 54 .
  • handle portion 52 has an extent that is generally perpendicular (within a range of ±5 degrees) to the extent of head portion 54 .
  • Ultrasound probe 16 is communicatively coupled to processor circuit 24 of ultrasound console 14 via electrical cable 17 , which may be a wired or a wireless connection.
  • electrical cable 17 is depicted as a multi-conductor electrical cable that physically connects ultrasound probe 16 to ultrasound console 14 , and includes a communication link 56 , a communication link 58 , and a communication link 60 , each formed with wire conductors.
  • communication link 56 , communication link 58 , and communication link 60 may be in the form of a (short range) wireless connection, such as Bluetooth.
  • Portions of the processor circuit 24 could also be embedded in the ultrasound probe to analyze or process the signal received from, or transmitted to, the ultrasound emitting element. The analyzed or processed signal is then transmitted back to the console via the electrical cable.
  • ultrasound probe 16 includes an ultrasound transducer mechanism 62 and a tracking element 64 . Both ultrasound transducer mechanism 62 and tracking element 64 are mounted to probe housing 50 (see also FIG. 5A ), and may be contained within probe housing 50 , which may be formed from plastic. Also, tracking element 64 may be embedded in the plastic of probe housing 50 . Ultrasound transducer mechanism 62 is communicatively coupled to processor circuit 24 via communication links 56 and 58 .
  • ultrasound transducer mechanism 62 has an active ultrasound transducer array 66 configured to generate two-dimensional ultrasound slice data representing a two-dimensional ultrasound imaging slice 67 at any of a plurality of discrete imaging locations within a three-dimensional imaging volume 68 associated with head portion 54 of ultrasound probe 16 .
  • the three-dimensional imaging volume 68 is defined by a depth 68 - 1 of penetration of the ultrasound emission in the direction of the z-axis, a width 68 - 2 of ultrasound emission in the x-axis, and an ultrasound transducer scan extent 68 - 3 along the y-axis.
  • Active ultrasound transducer array 66 may be, for example, a one-dimensional transducer array in the form of a linear ultrasound transducer array, or alternatively, may be in the form of a convex or concave ultrasound transducer array.
  • one-dimensional transducer array is an array of ultrasound transducer elements arranged in a single row, wherein the row may be linear or curved.
  • Active ultrasound transducer array 66 is communicatively coupled to processor circuit 24 via communication link 58 , and supplies two-dimensional ultrasound data to processor circuit 24 via communication link 58 .
  • processor circuit 24 executes program instructions to store the two-dimensional ultrasound data in mass storage provided in non-transitory electronic memory 24 - 2 .
  • processor circuit 24 includes circuitry, or alternatively executes program instructions, to convert the two-dimensional ultrasound data to a form for viewing as a two-dimensional ultrasound image 69 on display screen 28 of graphical user interface 22 .
  • the two-dimensional ultrasound image 69 depicts interventional medical device 18 having tracking element 44 located in a blood vessel BV, and depicts distal tip 40 of distal end portion 42 of interventional medical device 18 engaged with an intravascular occlusion IC.
  • tracking element 64 may be, e.g., a wire electrical tracking coil.
  • Tracking element 64 is configured to generate probe location data defining six degrees of freedom based on the EM locator field 36 generated by EM field generator 12 .
  • the six degrees of freedom are the X-axis, Y-axis, Z-axis, pitch, yaw, and roll.
  • Tracking element 64 is communicatively coupled to processor circuit 24 via communication link 60 , and supplies probe location data to processor circuit 24 via communication link 60 .
  • Tracking element 64 allows for the determination of the location of ultrasound probe 16 within detection volume 38 as depicted in FIG. 1 , wherein detection volume 38 is considerably larger (more than 20 times larger) than the three-dimensional imaging volume 68 of ultrasound probe 16 depicted in FIG. 5A .
  • active ultrasound transducer array 66 of ultrasound transducer mechanism 62 of ultrasound probe 16 may incorporate a movable one-dimensional (1D) transducer array, as in the embodiment depicted in FIGS. 6A and 6B .
  • active ultrasound transducer array 66 of ultrasound transducer mechanism 62 of ultrasound probe 16 may be in the form of a selectable portion of a two-dimensional (2D) matrix transducer array.
  • active ultrasound transducer array 66 is physically movable relative to the probe housing 50 , i.e., is dynamically positioned within probe housing 50 , in order to capture ultrasound images of locations within the three-dimensional imaging volume 68 (diagrammatically illustrated cube volume, for convenience) beneath ultrasound probe 16 .
  • ultrasound transducer mechanism 62 includes a one-dimensional (1D) ultrasound transducer array 70 , a carriage 72 , and a stepper motor 74 .
  • one-dimensional ultrasound transducer array 70 serves as the active ultrasound transducer array 66 .
  • the one-dimensional ultrasound transducer array 70 has a row of a plurality of discrete ultrasound transducer elements.
  • Carriage 72 is connected to one-dimensional ultrasound transducer array 70 , such that one-dimensional ultrasound transducer array 70 moves in unison with carriage 72 .
  • Carriage 72 converts a rotation of a rotatable shaft 74 - 1 of stepper motor 74 into a linear translation of carriage 72 , and in turn, into a linear translation of one-dimensional ultrasound transducer array 70 relative to head portion 54 of probe housing 50 , in a determined one of two translation directions D 1 , D 2 .
  • Stepper motor 74 is operably connected (electrically and communicatively) to probe position control circuit 31 (see FIG. 2 ) via communication link 56 of electrical cable 17 .
  • probe position control circuit 31 is in the form of a motor control circuit, which converts the digital positioning signal supplied by processor circuit 24 into a stepper motor positioning signal, which may include multiple stepper motor control signals. These control signals are supplied by motor control circuit 76 to stepper motor 74 to command rotation of rotatable shaft 74 - 1 by an amount corresponding to the amount and position dictated by the digital positioning signal.
  • the digital positioning signal and the stepper motor positioning signal may be referred to herein collectively as the “positioning signal”, since the stepper motor positioning signal is a form change of the digital positioning signal, and the “positioning signal” is considered herein to have been generated by processor circuit 24 .
  • Carriage 72 converts the rotation of rotatable shaft 74 - 1 of stepper motor 74 into a linear translation of carriage 72 , and in turn, moves one-dimensional ultrasound transducer array 70 relative to head portion 54 of probe housing 50 in a determined one of two translation directions D 1 , D 2 , to a location thus dictated by the digital positioning signal generated by processor circuit 24 .
  • the one-dimensional ultrasound transducer array 70 may be moved to a desired position relative to head portion 54 of probe housing 50 .
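As a rough sketch of this positioning chain, the digital positioning signal can be thought of as a target carriage position that is converted into a signed step count for the stepper motor. The millimeters-per-step constant and the D1/D2 sign convention below are illustrative assumptions, not values from the disclosure.

```python
def steps_for_target(target_mm, current_mm, mm_per_step=0.05):
    """Convert a desired carriage position into a stepper motor command.

    mm_per_step is a hypothetical drive constant (belt pitch radius times
    step angle); the direction labels D1/D2 follow the two translation
    directions described for carriage 72.
    """
    delta = target_mm - current_mm
    steps = round(delta / mm_per_step)
    direction = "D1" if steps >= 0 else "D2"
    return abs(steps), direction

# e.g. moving the array from 10.0 mm to 12.5 mm along the head portion
steps, direction = steps_for_target(12.5, 10.0)
```

In a closed-loop system the target position would come from the convergence calculation described later (step S 110), with the step count re-derived on every loop pass.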
  • FIG. 6B shows an embodiment of carriage 72 , wherein carriage 72 has an endless toothed belt 78 suspended between two longitudinally spaced idler gears/pulleys 80 - 1 , 80 - 2 .
  • Rotatable shaft 74 - 1 of stepper motor 74 is connected to a drive gear 82 .
  • Drive gear 82 is drivably engaged with the teeth of endless toothed belt 78 .
  • One-dimensional ultrasound transducer array 70 is attached to the lower run 78 - 1 of endless toothed belt 78 , and is movable along the longitudinal extent between the two longitudinally spaced idler gears/pulleys 80 - 1 , 80 - 2 .
  • toothed belt 78 suspended between two longitudinally spaced idler gears/pulleys 80 - 1 , 80 - 2 converts a rotation of the rotatable shaft 74 - 1 of the stepper motor 74 into a translation of the one-dimensional ultrasound transducer array 70 in a selectable one of the two translation directions D 1 , D 2 .
  • an alternative ultrasound transducer mechanism 62 - 1 includes a two-dimensional (2D) ultrasound transducer array 84 , and probe position control circuit 31 (see FIG. 2 ) is in the form of a matrix address circuit of the type used in addressing electronic memory.
  • Two-dimensional ultrasound transducer array 84 has a plurality of columns 84 - 1 and a plurality of addressable rows 84 - 2 of discrete ultrasound transducer elements arranged in a matrix pattern.
  • the two-dimensional ultrasound transducer array 84 may be a planar transducer arrangement, or alternatively may be a concave or convex arrangement.
  • Two-dimensional ultrasound transducer array 84 is communicatively coupled to processor circuit 24 via communications link 58 to supply two-dimensional ultrasound data from two-dimensional ultrasound transducer array 84 to processor circuit 24 .
  • probe position control circuit 31 is electrically connected to processor circuit 24 to receive the digital positioning signal generated by processor circuit 24 .
  • probe position control circuit 31 operates as a matrix address circuit to convert the digital positioning signal supplied by processor circuit 24 into a row selection positioning signal which is supplied to two-dimensional (2D) ultrasound transducer array 84 via communications link 56 to dynamically select one row of the plurality of rows 84 - 2 of discrete ultrasound transducer elements as the active linear ultrasound transducer array 66 .
  • the row selection positioning signal corresponds to the position dictated by the digital positioning signal generated by processor circuit 24 .
  • the row selection positioning signal is a form change of the digital positioning signal
  • the digital positioning signal and the row selection positioning signal may be referred to herein collectively as the “positioning signal”, and the “positioning signal” is considered herein to have been generated by processor circuit 24 .
  • The embodiment of FIGS. 7A and 7B emulates the dynamic positioning of the one-dimensional ultrasound transducer array 70 discussed above with respect to FIGS. 6A and 6B , and allows for similar control of where the ultrasound probe will image within the three-dimensional imaging volume 68 beneath the ultrasound probe (see FIG. 5A ).
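The row-addressing emulation can be sketched as a mapping from a desired Y-axis imaging position to a row index of the matrix array. The even row spacing and clamping behavior are assumptions for illustration; the disclosure does not specify the addressing scheme.

```python
def select_active_row(y_mm, scan_extent_mm, num_rows):
    """Map a desired imaging position along the Y-axis to one addressable
    row of the 2D matrix array, emulating the moving 1D array.

    Assumes num_rows rows evenly spaced over the scan extent.
    """
    y = max(0.0, min(y_mm, scan_extent_mm))       # clamp into the array extent
    row = round(y / scan_extent_mm * (num_rows - 1))
    return row
```

The returned index plays the role of the row selection positioning signal: only that row of discrete transducer elements is driven as the active linear array 66.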
  • ultrasound imaging system 10 provides a “lock-on” functionality, wherein the position of each of the ultrasound probe 16 and interventional medical device 18 are tracked, and the active ultrasound transducer array 66 in ultrasound probe 16 is dynamically positioned at a convergence of the tracking information, which is further described with reference to the flowchart of FIG. 8 .
  • processor circuit 24 is communicatively coupled to each of the tracking element 44 of interventional medical device 18 , tracking element 64 of ultrasound probe 16 , ultrasound transducer mechanism 62 of ultrasound probe 16 , and to the graphical user interface 22 having display screen 28 .
  • processor circuit 24 executes program instructions to determine the type of tracking elements that are associated with each of ultrasound probe 16 and interventional medical device 18 , the communications rate between processor circuit 24 and each of ultrasound probe 16 and interventional medical device 18 , the rate of data acquisition updating, and probe parameters.
  • probe parameters may include scan extent start point and end point, and the desired velocity of the movement of active ultrasound transducer array 66 , with respect to the origin point 71 (see FIG. 5A ), defining the 0, 0, 0 location in the X, Y, and Z axes.
  • the location of tracking elements of ultrasound probe 16 and interventional medical device 18 may be calibrated with respect to the 3D detection volume 38 defined by EM field generator 12 (see FIG. 1 ).
  • “WHILE” defines the entry into a continuous loop to virtually converge the position of the ultrasound imaging plane of active ultrasound transducer array 66 of ultrasound probe 16 with the position of tracking element 44 , and in turn distal tip 40 , of interventional medical device 18 .
  • Processor circuit 24 remains in this continuous loop until the program execution is stopped.
  • the current position of tracking element 44 of interventional medical device 18 is determined in relation to the 3D detection volume 38 defined by EM field generator 12 .
  • tracking element 44 of interventional medical device 18 generates tip location data as physical coordinates based on the EM locator field 36 generated by EM field generator 12 , and provides the tip location data associated with the physical coordinates to processor circuit 24 .
  • At step S 106 , in parallel to step S 104 , the current position of tracking element 64 of ultrasound (US) probe 16 is determined in relation to the 3D detection volume 38 defined by EM field generator 12 .
  • tracking element 64 of ultrasound probe 16 generates probe location data as physical coordinates based on the EM locator field 36 generated by EM field generator 12 , and provides the probe location data associated with the physical coordinates to processor circuit 24 .
  • an ultrasound plane position (B-scan position) is determined based on the probe location data.
  • processor circuit 24 executes program instructions to define a unit vector, i.e., the Z-axis at origin point 71 (0,0,0) of FIG. 5A , that is perpendicular to (e.g., points downwardly from) the surface of head portion 54 of ultrasound probe 16 , wherein the unit vector initially lies on a current ultrasound image plane.
  • Processor circuit 24 executes program instructions to virtually rotate the vector to be normal to the current ultrasound image plane.
  • Processor circuit 24 then executes program instructions to rotate the normal vector about the Z-axis using the probe location data acquired at step S 106 , which corresponds to the orientation angle of ultrasound probe 16 .
  • Processor circuit 24 then executes program instructions to determine the position of the current ultrasound image plane, with respect to the origin, using the following plane equation: Ax+By+Cz=D (Equation 1).
  • A, B, C are coefficients of the x, y, z position coordinates (of the probe location data) defining the plane of ultrasound probe 16 .
  • D is the length of the distance vector from the origin point 71 to the Ax+By+Cz plane.
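The plane-position computation of step S 108 can be sketched as follows, simplified to a rotation about the Z-axis only (the full system uses all six degrees of freedom from the probe location data). The initial normal direction is an assumption for illustration.

```python
import math

def plane_from_probe(probe_pos, yaw_deg):
    """Sketch of step S108: derive coefficients A, B, C, D of the current
    ultrasound image plane from the tracked probe pose.

    Simplified: the normal starts as a unit vector along the X-axis
    (normal to an image plane containing the Y and Z axes) and is rotated
    about the Z-axis by the probe orientation angle.
    """
    yaw = math.radians(yaw_deg)
    # unit normal to the image plane after rotating about Z
    a, b, c = math.cos(yaw), math.sin(yaw), 0.0
    # D is the signed distance from the origin to the plane Ax+By+Cz=D
    x0, y0, z0 = probe_pos
    d = a * x0 + b * y0 + c * z0
    return a, b, c, d
```

Because the normal is a unit vector, D comes out directly as the distance from origin point 71 to the plane, matching the description above.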
  • processor circuit 24 executes program instructions to calculate an offset distance between the position of interventional medical device 18 , as defined by the tip location data, and the ultrasound plane position (determined at step S 108 ) of ultrasound probe 16 , by using the equation: OFFSET=(Ax1+By1+Cz1−D)/√(A²+B²+C²) (Equation 2).
  • A, B, C, and D are coefficients of the ultrasound plane position (see step S 108 ), and x1, y1, z1 are the position coordinates (of the tip location data) of interventional medical device 18 .
  • the Equation 2 offset calculation gives the minimum, or perpendicular, distance from tracking element 44 of interventional medical device 18 to the ultrasound plane position, which is the distance (and direction) that ultrasound transducer mechanism 62 needs to move active ultrasound transducer array 66 so that there is a convergence (intersection) of the ultrasound position plane with the tracking element 44 , and in turn distal tip 40 , of interventional medical device 18 .
  • the calculation determines the offset used to achieve a convergence of the tip location data with the ultrasound plane position associated with the probe location data.
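The Equation 2 point-to-plane offset can be sketched directly. The general form divides by the magnitude of (A, B, C), which equals 1 when the normal is the rotated unit vector described above, so the divisor then has no effect.

```python
import math

def offset_to_plane(a, b, c, d, tip):
    """Signed perpendicular distance from the tracked tip (x1, y1, z1)
    to the image plane Ax+By+Cz=D; the sign indicates which side of the
    plane the tip lies on, i.e. the direction the array must move."""
    x1, y1, z1 = tip
    norm = math.sqrt(a * a + b * b + c * c)
    return (a * x1 + b * y1 + c * z1 - d) / norm
```

A zero result means the image plane already intersects the tip, i.e. convergence has been achieved.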
  • ultrasound transducer mechanism 62 is driven to position active ultrasound transducer array 66 at the determined point of convergence as defined by the OFFSET calculated at step S 110 .
  • processor circuit 24 executes program instructions to process the OFFSET to generate the positioning signal corresponding to the point of convergence, and the positioning signal is communicatively coupled to ultrasound transducer mechanism 62 to dynamically position active ultrasound transducer array 66 at a desired imaging location of the plurality of discrete imaging locations, so that the two-dimensional ultrasound slice data captured by active ultrasound transducer array 66 includes an image of at least the distal tip 40 of interventional medical device 18 , so long as distal tip 40 of the interventional medical device 18 remains in the three-dimensional imaging volume 68 under the surface of the head portion of ultrasound probe 16 .
  • the positioning signal will culminate in stepper motor control signals that are supplied to stepper motor 74 .
  • the positioning signal will culminate in a row selection signal supplied to two-dimensional ultrasound transducer array 84 .
  • “under” or “underlying” with respect to ultrasound probe 16 means within the possible imaging view extent of ultrasound probe 16 .
  • The process then returns to step S 102 , “WHILE”, to continue in the continuous loop in maintaining a convergence of the position of the active ultrasound transducer array 66 of ultrasound probe 16 with tracking element 44 , and in turn distal tip 40 , of interventional medical device 18 .
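The overall lock-on loop (steps S 104 through S 112) can be sketched with hypothetical callables standing in for the tracking and actuation hardware; the names below are illustrative, not part of the disclosure.

```python
def lock_on_step(get_tip_pos, get_probe_plane, move_array):
    """One pass of the FIG. 8 lock-on loop.

    get_tip_pos()     -> (x1, y1, z1), tip location from tracking element 44
    get_probe_plane() -> (A, B, C, D), image plane from tracking element 64
    move_array(offset)   drives the transducer mechanism by the offset
    """
    x1, y1, z1 = get_tip_pos()              # S104: device tip position
    a, b, c, d = get_probe_plane()          # S106/S108: plane position
    offset = a * x1 + b * y1 + c * z1 - d   # S110: Equation 2 (unit normal)
    move_array(offset)                      # S112: converge array on the tip
    return offset

# example: tip at y=5, image plane at y=2, so the array must move by 3
moved = []
offset = lock_on_step(lambda: (0, 5, 0), lambda: (0, 1, 0, 2), moved.append)
```

In the real system this body runs continuously inside the "WHILE" loop until program execution is stopped.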
  • Referring to FIG. 9 , there is shown a flowchart describing the acquisition of ultrasound data concurrently with, i.e., during, the “lock-on” function described above with respect to FIG. 8 .
  • ultrasound probe 16 is configured for acquisition of ultrasound data. For example, parameters such as the desired resolution, and emission strength of active ultrasound transducer array 66 to achieve a desired depth of penetration, may be set.
  • ultrasound imaging system 10 is configured to collect a series of two-dimensional ultrasound imaging slices (ultrasound B-scan) data.
  • ultrasound imaging system 10 is configured to collect a series of ultrasound B-scan data to form three-dimensional ultrasound volumetric data representing the three-dimensional imaging volume 68 , from which C-scan data, or other plane oriented data, may be derived.
  • “WHILE” defines the entry into a continuous loop for acquisition of ultrasound data with active ultrasound transducer array 66 of ultrasound probe 16 .
  • processor circuit 24 is configured to execute program instructions, or alternatively includes circuitry, to process two-dimensional ultrasound slice data generated by the active ultrasound transducer array 66 of ultrasound transducer mechanism 62 of ultrasound probe 16 , and to generate the ultrasound image for display at display screen 28 of graphical user interface 22 .
  • processor circuit 24 may execute program instructions to automatically store the two-dimensional ultrasound slice data in non-transitory electronic memory 24 - 2 , and thus accumulate multiple image data sets of the location of interest.
  • graphical user interface 22 may provide a user command to processor circuit 24 to store the two-dimensional ultrasound slice data in non-transitory electronic memory 24 - 2 on demand at the command from a user.
  • a series of two-dimensional ultrasound imaging slices (ultrasound B-scan) data is collected and stored in non-transitory electronic memory 24 - 2 .
  • active ultrasound transducer array 66 is scanned along the Y-axis across all, or a selected portion, of the three-dimensional imaging volume 68 to take a detailed volumetric scan of the underlying area beneath head portion 54 of ultrasound probe 16 , such that a series of ultrasound B-scan data representing the three-dimensional imaging volume is collected and stored in non-transitory electronic memory 24 - 2 .
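The accumulation of B-scan slices into volumetric data, and the derivation of a C-scan from it, can be sketched as follows; the indexing conventions are illustrative assumptions, with each B-scan a 2D grid of echo samples.

```python
def assemble_volume(b_scans):
    """Stack a series of B-scan slices (each indexed [z][x]) captured
    along the Y-axis into a volume indexed [y][z][x], and return a helper
    that derives one C-scan (a constant-depth x-y plane) from it."""
    volume = list(b_scans)                       # [y][z][x]

    def c_scan(depth_index):
        # a C-scan is the x-y plane at a fixed depth z
        return [slice_[depth_index] for slice_ in volume]

    return volume, c_scan

# toy 2x2x2 volume: two B-scans of two depths with two lateral samples
volume, c_scan = assemble_volume([[[1, 2], [3, 4]], [[5, 6], [7, 8]]])
```

Other plane-oriented data, such as the synthetic scan planes described later, can be derived from the same stacked volume.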
  • The process then returns to step S 202 , “WHILE”, to continue in the acquisition and updating of the ultrasound data.
  • ultrasound imaging system 10 is able to dynamically position active ultrasound transducer array 66 to converge at a desired imaging location of the plurality of discrete imaging locations in the three-dimensional imaging volume 68 so that the two-dimensional ultrasound slice data includes an image of at least the distal tip 40 of interventional medical device 18 in generating the ultrasound image displayed on display screen 28 .
  • a motion indicator 88 located on at least one of the ultrasound probe 16 and the display screen 28 of graphical user interface 22 (see also FIG. 2 ) is provided to guide the user to an acceptable placement of ultrasound probe 16 relative to the tracked interventional medical device 18 .
  • Motion indicator 88 is operably coupled to processor circuit 24 , and may be in the form of directional arrows that may be selectively illuminated by processor circuit 24 so as to guide the user to an acceptable placement of ultrasound probe 16 relative to the tracked interventional medical device 18 .
  • processor circuit 24 executes program logic to determine whether tracking element 44 of interventional medical device 18 is outside the three-dimensional imaging volume 68 , and thus is outside the imagable range of ultrasound probe 16 .
  • processor circuit 24 executes program instructions to determine whether the distal tip 40 of the interventional medical device 18 is presently located outside the three-dimensional imaging volume 68 .
  • processor circuit 24 of ultrasound imaging system 10 further executes program instructions to generate a visual prompt at motion indicator 88 to prompt the user to move head portion 54 of ultrasound probe 16 in a particular direction to a general location such that tracking element 44 , and thus distal tip 40 , of interventional medical device 18 resides in the three-dimensional imaging volume 68 under ultrasound probe 16 , thereby permitting the active ultrasound transducer array 66 of ultrasound probe 16 to automatically capture ultrasound image data containing the tracking element 44 and distal tip 40 of interventional medical device 18 for display on display screen 28 .
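The choice of which directional prompt to illuminate can be sketched from the signed Equation 2 offset; the direction labels and threshold convention are illustrative assumptions.

```python
def arrow_for_offset(offset_mm, scan_half_extent_mm):
    """Choose which directional arrow of motion indicator 88 to light
    when the tracked tip lies beyond the imaging volume along the scan
    axis; None means the tip is imagable and no prompt is needed."""
    if abs(offset_mm) <= scan_half_extent_mm:
        return None                      # tip within the imaging volume
    return "move +Y" if offset_mm > 0 else "move -Y"
```

A full implementation would evaluate all three axes of the tip position against the boundaries of three-dimensional imaging volume 68, not just the scan axis.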
  • The user may thus use ultrasound imaging system 10 to converge on a two-dimensional ultrasound image slice that includes the underlying interventional medical device 18 , even if ultrasound probe 16 is not placed directly over tracking element 44 /distal tip 40 of interventional medical device 18 .
  • the position of the active ultrasound transducer array 66 of ultrasound probe 16 is dynamically adjusted in near real time, limited by data acquisition and processing speed, which allows ultrasound imaging system 10 to adapt to small changes in position of ultrasound probe 16 , the position of the tracking element 44 of interventional medical device 18 , and/or the patient position, such that an ultrasound image of the underlying interventional medical device 18 is maintained within view of ultrasound probe 16 .
  • positioning prompts in the form of motion indicator 88 are again generated and used to prompt the user to move ultrasound probe 16 in a direction that allows ultrasound imaging system 10 to again converge on, and display, an ultrasound image of the underlying interventional medical device 18 .
  • Ultrasound imaging system 10 also may be operated in a three-dimensional (3D) high resolution scan imaging mode, with reference to step S 204 of FIG. 9 .
  • In the three-dimensional (3D) high resolution imaging mode, the ultrasound probe 16 is held in a fixed position over an area of interest, and the active ultrasound transducer array 66 is scanned along the Y-axis across all, or a selected portion, of the three-dimensional imaging volume 68 to take a detailed volumetric scan of the underlying area beneath head portion 54 of ultrasound probe 16 .
  • Ultrasound probe 16 may be held in the fixed position by the hand of the user. Metadata containing the position location from each two-dimensional slice obtained in the high resolution mode is further used to identify images taken from the same point in space, and subsequently used for image integration processing.
  • processor circuit 24 of ultrasound console 14 is configured to execute program instructions to generate a scanning signal that is supplied to ultrasound transducer mechanism 62 to scan active ultrasound transducer array 66 over at least a portion of the three-dimensional imaging volume 68 .
  • the active ultrasound transducer array 66 is repeatedly actuated during the scan to generate a plurality, i.e., a series, of sequential two-dimensional ultrasound slices, which are stored in memory 24 - 2 , and combined to form the 3D ultrasound volumetric data from which a three-dimensional (3D) high resolution ultrasound image is formed and displayed on display screen 28 of graphical user interface 22 (see also FIG. 2 ).
  • the quality of the high resolution 3D images may be improved by generating a composite ultrasound image of the location of interest. Because the location of the ultrasound probe 16 is known by processor circuit 24 , multiple sets of 2D or 3D, ultrasound images of a particular location in the three-dimensional imaging volume 68 underlying, e.g., perpendicular to, the surface of the head portion 54 of ultrasound probe 16 may be taken, and stored in non-transitory electronic memory 24 - 2 , from which a compound composite ultrasound image may be generated from the multiple sets of 2D, or 3D, ultrasound images by summing together the multiple sets of ultrasound images of the same location.
  • processor circuit 24 is configured to execute program instructions to operate the active ultrasound transducer array 66 to generate multiple sets of ultrasound image data that includes metadata corresponding to a particular location, i.e., metadata describing the location of the scan within the three-dimensional volume 68 , and save the multiple sets in non-transitory electronic memory 24 - 2 .
  • Processor circuit 24 is further configured to execute program instructions to sum the multiple sets of ultrasound image data to generate composite (compound) ultrasound image data, which is then stored in non-transitory memory 24 - 2 and/or is displayed on display screen 28 of graphical user interface 22 .
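The compounding step can be sketched as an average of co-located frames; averaging rather than raw summation keeps the brightness scale while suppressing uncorrelated speckle, which is one common way to realize the summing described above.

```python
def compound_frames(frames):
    """Combine multiple ultrasound frames of the same location (each a
    2D list of pixel values, identified by matching position metadata)
    into one composite image by per-pixel averaging."""
    n = len(frames)
    rows, cols = len(frames[0]), len(frames[0][0])
    return [[sum(f[r][c] for f in frames) / n for c in range(cols)]
            for r in range(rows)]
```

The same per-pixel combination applies slice by slice to 3D sets, since each 3D set is itself a stack of 2D frames.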
  • the quality of the high resolution 3D images also may be improved by tracking the position of the patient P in relation to the position of ultrasound probe 16 to reduce motion artifacts in the 3D images.
  • a third EM tracking element 90 , i.e., a wire electrical tracking coil, may be attached to the patient P.
  • Tracking element 90 is communicatively coupled to processor circuit 24 of ultrasound console 14 by a communication link 92 , such as a wired or wireless connection.
  • Tracking element 90 when energized by electromagnetic (EM) field generator 12 , generates three-axis patient location data, which is supplied via communications link 92 to processor circuit 24 .
  • Processor circuit 24 processes the three-axis patient location data to further adjust the position of the active ultrasound transducer array 66 of ultrasound probe 16 in response to any motion of the patient.
  • tracking element 90 allows for the position of the patient to be known, which in turn allows ultrasound imaging system 10 to adjust the position of the active ultrasound transducer array 66 of ultrasound probe 16 to any motion created by the patient.
  • Ultrasound imaging system 10 also may be operated to render and display one or more synthetic (user chosen) scan planes.
  • FIG. 11 shows graphical user interface 22 having a three-dimensional ultrasound image 94 and user controls 96 displayed on display screen 28 .
  • a plurality, i.e., a series, of sequential two-dimensional ultrasound slices may be generated and combined to generate 3D ultrasound volumetric data defining a three-dimensional imaging volume.
  • the user may select for rendering and display one or more synthetic (user chosen) scan planes, such as a coronal scan plane 98 and an axial (sagittal) scan plane 100 .
  • the user may define, using user controls 96 , a desired synthetic plane orientation with respect to the 3D ultrasound volumetric data associated with three-dimensional ultrasound image 94 .
  • processor circuit 24 of ultrasound imaging system 10 executes program instructions to identify within the 3D ultrasound volumetric data of three-dimensional ultrasound image 94 the image data associated with the desired synthetic plane orientation.
  • the desired synthetic plane may pass through multiple two-dimensional image data slices in the 3D ultrasound volumetric data.
  • the desired one or more synthetic (user chosen) scan planes may be rendered and displayed on display screen 28 of graphical user interface 22 within the generated three-dimensional ultrasound image 94 as shown in FIG. 11 , or as standalone two-dimensional images.
  • Various views such as those associated with the sagittal plane, the transverse plane, and the coronal plane, may be visualized, and a slice from one or more, or all, of the planes, as defined by the location of the tracked device(s), e.g., tracking element 44 of interventional medical device 18 and/or tracking element 64 of ultrasound probe 16 , can be displayed, individually or as a group.
  • Scan planes that are not oriented at 90 degrees from each other also could be defined and selected by the user. Additionally, the user defined scan planes need not be planar, and may follow a curved path.
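Rendering a synthetic, possibly curved, scan surface can be sketched as sampling the volumetric data at an arbitrary list of coordinates. Nearest-neighbor lookup is used here for brevity; a real renderer would interpolate between voxels.

```python
def sample_synthetic_plane(volume, points):
    """Render a synthetic (user-chosen) scan surface by sampling 3D
    volumetric data (indexed [y][z][x]) at a list of (y, z, x) index
    coordinates; the points need not lie on a native B-scan plane and
    may follow a curved path."""
    def sample(p):
        y, z, x = (int(round(v)) for v in p)   # nearest-neighbor voxel
        return volume[y][z][x]

    return [sample(p) for p in points]

# toy 2x2x2 volume used to demonstrate off-grid sampling
demo_volume = [[[1, 2], [3, 4]], [[5, 6], [7, 8]]]
```

To build a planar synthetic slice, the point list would be a regular grid of coordinates generated from the user-selected plane orientation.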
  • Another aspect of the present invention provides for a focusing of the three-dimensional imaging volume around a determined region of interest, i.e., the region around the location of tracking element 44 of interventional medical device 18 , by reducing the scan extent along the Y-axis (see FIG. 5A ), thus reducing the amount of three-dimensional ultrasound volumetric data required to adequately view the region surrounding interventional medical device 18 .
  • the scan extent of active ultrasound transducer array 66 along the Y-axis is reduced, i.e., focused, to that of most interest, thus reducing scanning time and the amount of data required to adequately represent the three-dimensional volume of interest.
  • processor circuit 24 executes program instructions to determine a region of interest in the three-dimensional ultrasound volumetric data defining the three-dimensional imaging volume 68 .
  • Processor circuit 24 also executes program instructions to reduce the scan range of the active ultrasound transducer array 66 of the ultrasound transducer mechanism 62 along the Y-axis for acquisition of subsequent three-dimensional ultrasound volumetric data at the region of interest from that of the scan range of the previous scan, so as to reduce the amount of acquired three-dimensional ultrasound volumetric data from that of the prior scan.
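The scan-range reduction can be sketched as clamping a window around the tracked tip's Y position; the margin value is an illustrative assumption.

```python
def focused_scan_range(tip_y, margin_mm, full_extent_mm):
    """Narrow the Y-axis scan extent to a window around the tracked tip,
    clamped to the physical scan extent, so subsequent volume
    acquisitions cover only the region of interest."""
    start = max(0.0, tip_y - margin_mm)
    end = min(full_extent_mm, tip_y + margin_mm)
    return start, end
```

With a narrower window, each volumetric sweep takes fewer B-scans, reducing both scanning time and the amount of stored volumetric data, as described above.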
  • user controls 96 of graphical user interface 22 may include one or more slice selection sliders 102 , such as a coronal slider 102 - 1 and a sagittal slider 102 - 2 , to provide a sequential variation from an automatically, or manually, selected two-dimensional ultrasound image slice being displayed.
  • a plurality, i.e., a series, of sequential two-dimensional ultrasound B-scan imaging slices 67 may be generated and combined to generate 3D ultrasound volumetric data defining a three-dimensional imaging volume 68 .
  • a desired two-dimensional ultrasound image slice on a desired imaging plane may be generated from the 3D ultrasound volumetric data that includes a particular region of interest, such as distal tip 40 of interventional medical device 18 .
  • the desired two-dimensional ultrasound image slice may be in an imaging plane different from that of the native B-scan imaging plane of the sequential two-dimensional ultrasound imaging slices 67 that when combined form the 3D ultrasound volumetric data defining the three-dimensional imaging volume 68 .
  • slice selection sliders 102 permit the user to select a slice in each of one or more imaging planes for display, if desired, wherein the selected two-dimensional ultrasound image slice may intersect, or lie on either side of, the two-dimensional ultrasound image slice that was automatically, or manually, selected.
  • the slice selection sliders 102 are configured to provide a sequential parallel variation from the initially selected two-dimensional ultrasound image slice to manually select a second two-dimensional ultrasound image slice parallel to the initially selected two-dimensional ultrasound image, wherein the second two-dimensional ultrasound image slice lies on either side of the initially selected two-dimensional ultrasound image slice.
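The slider behavior can be sketched as a clamped index step through the parallel slices; the zero-based indexing is an illustrative convention.

```python
def slide_slice(current_index, clicks, num_slices):
    """Step the displayed slice index with a slider's up/down arrows,
    staying parallel to the initially selected slice and clamped to the
    valid index range (e.g. the 560 sagittal locations of FIG. 12)."""
    return max(0, min(num_slices - 1, current_index + clicks))
```

Each coronal and sagittal slider would hold its own index, so the user can vary the displayed slice in each imaging plane independently.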
  • FIG. 12 is a pictorial representation at graphical user interface 22 depicting a selection of a sagittal plane slice 104 extending through a series of two-dimensional ultrasound image slices 67 in the three-dimensional imaging volume 68 at sagittal slice location 270 .
  • By manipulation of sagittal slider 102 - 2 using one of the up-down arrows, sagittal slice location 271 , or others 1 - 269 or 272 - 560 , parallel to the sagittal slice location 270 , may be selected for display.
  • FIG. 13 is a pictorial representation depicting a selection of a coronal plane slice 106 extending through a series of two-dimensional ultrasound image slices 67 in a three-dimensional imaging volume 68 at coronal slice location 150 .
  • By manipulation of coronal slider 102-1 using one of the up-down arrows, coronal slice location 151, or any of the other locations 1-149 or 152-560, may be selected for display.
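The up-down arrow behavior of sliders 102-1 and 102-2 amounts to stepping an index through parallel slice locations with clamping at the ends of the volume. A minimal sketch (the function name and the clamping choice are assumptions):

```python
def step_slice(current, delta, n_slices=560):
    """Step to an adjacent parallel slice location, clamped to the valid
    range 1..n_slices (FIGS. 12 and 13 show locations 1-560)."""
    return max(1, min(n_slices, current + delta))
```

Pressing the up arrow from sagittal slice location 270 yields 271; stepping below 1 or above 560 leaves the selection at the boundary.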
  • Referring now to FIG. 14, there is shown a flowchart describing the generation of a 3D ultrasound image as a set of three orthogonal ultrasound images.
  • ultrasound imaging system 10 is initialized for rendering a 3D ultrasound image as a set of three orthogonal images, such as setting up processor circuit 24 and graphical user interface 22 for construction of 3D models.
  • “WHILE” defines the entry into a continuous loop for generation and updating of the displayed 3D ultrasound image.
  • an ultrasound (US) volume transform node is updated based on the position of ultrasound probe 16 , as determined at step S 106 of FIG. 8 .
  • processor circuit 24 executes program instructions to move the 3D model of the three-dimensional imaging volume 68 to match the current position of ultrasound probe 16 .
  • processor circuit 24 executes program instructions to choose a two-dimensional ultrasound imaging slice 67 (B-scan) from a C-scan data slice that includes the tracking element 44 , and in turn the distal tip 40 , of interventional medical device 18 .
  • processor circuit 24 executes program instructions to generate 3D display data representative of three orthogonal images in a virtual 3D environment associated with the three-dimensional imaging volume 68 matched to the current position of ultrasound probe 16 .
  • Processor circuit 24 sends the 3D display data to user interface 22 for display on display screen 28 as three orthogonal images that include the tracking element 44 , and in turn the distal tip 40 , of interventional medical device 18 .
  • The process then returns to step S302, "WHILE", to continue updating the displayed 3D ultrasound image.
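One way to picture the body of this loop is as index operations on the imaging volume: once the slice containing tracking element 44 is known, the three orthogonal images all pass through the same voxel. The sketch below is a simplification under an assumed (scan, depth, width) axis ordering; the real system works with transform nodes in a 3D scene, not raw arrays:

```python
import numpy as np

def orthogonal_views(volume, tip_index):
    """Return the three orthogonal slices of `volume` that intersect the
    device tip at tip_index = (scan, depth, width)."""
    s, d, w = tip_index
    b_scan   = volume[s, :, :]   # native B-scan plane containing the tip
    coronal  = volume[:, d, :]
    sagittal = volume[:, :, w]
    return b_scan, coronal, sagittal

volume = np.arange(4 * 5 * 6).reshape(4, 5, 6)
b, c, sg = orthogonal_views(volume, (2, 3, 1))
```

All three returned images contain the voxel at the tip index, which is why the distal tip 40 appears in each of the three displayed views.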
  • Referring now to FIGS. 15A, 15B, 15C and 16, there is described below a patient oriented imaging window mode.
  • In the prior art, that which was rendered as "up" on the ultrasound display screen followed the orientation of the ultrasound probe.
  • In the patient oriented imaging window mode, by contrast, the orientation of the displayed ultrasound image is true to the orientation of the patient, regardless of the actual orientation of the ultrasound probe.
  • FIG. 15A shows a diagrammatic illustration of ultrasound probe 16 taking a two-dimensional ultrasound imaging slice 67 of a portion of a leg L of a patient.
  • FIG. 15B is a diagrammatic illustration of graphical user interface 22 having a patient oriented imaging window 108 depicting a patient oriented virtual environment on display screen 28 of graphical user interface 22 , wherein the location and orientation of the acquired ultrasound image data is rendered on the display screen 28 to correspond to the orientation of the patient P, wherein the orientation and location of where the ultrasound image is being acquired relative to a position of the patient P is indicated and communicated to the clinician via use of the virtual environment.
  • FIG. 15B shows graphical user interface 22 having patient oriented imaging window 108 including an image of leg L, rendered as an actual image of patient leg L or as a computer generated virtual rendering, and including a virtual rendering of ultrasound probe 16 and the two-dimensional ultrasound imaging slice 67 that is generated by ultrasound probe 16. Also shown is a secondary imaging window 110 including a computer generated virtual rendering, i.e., a graphic, of the orientation of the body of patient P, as well as an UP arrow indicating the "up" direction relative to the patient.
  • the display of the ultrasound image on display screen 28 of graphical user interface 22 may be adjusted such that a vertical "top" 67-1 of the acquired ultrasound image data of two-dimensional ultrasound imaging slice 67, or the vertical top of the acquired volumetric data in 3D data acquisition, is always rendered as "UP" on display screen 28 relative to the position of the patient P, regardless of the actual orientation of ultrasound probe 16 relative to the patient. In other words, even if the actual orientation of ultrasound probe 16 is changed relative to the position of the leg L from that depicted in FIG. 15A, the displayed image continues to be rendered in an orientation corresponding to the position of the patient P.
  • FIG. 15D depicts the ultrasound image generated in FIG. 15A as it would be rendered in accordance with the prior art, wherein the orientation of the acquired ultrasound image data rendered on the display screen does not correspond to the orientation of the patient.
  • In the prior art, the image is rendered on the display screen such that the ultrasound probe head is in a virtual position at the top of the display screen and the bottom of the display screen always corresponds to the distal extent of the generated ultrasound image. More particularly, with the ultrasound probe oriented as depicted in FIGS. 15A and 15B, the prior art rendered ultrasound image would position the upper blood vessel 107-1 and the lower-left blood vessel 107-2 on the display screen as shown in FIG. 15D.
  • As such, the displayed image no longer corresponds to the orientation of the patient P. Rather, as shown in FIG. 15D, using arrow 112 to designate the true "up" orientation, the prior art ultrasound image is actually rendered to face toward the left on the display screen. Accordingly, in the prior art, the ultrasound technician was required to mentally associate the orientation of the displayed image with that of the actual orientation of the patient.
  • The patient oriented imaging window aspect of the present invention described above with respect to FIGS. 15A, 15B and 15C generates a virtual environment that aids a clinician, including a person not experienced in ultrasound imaging, in successful image acquisition.
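For probe rotations that are multiples of 90 degrees, the "always up" rendering can be sketched as a counter-rotation of the acquired slice by the tracked probe roll. The sign convention and the restriction to quarter turns are simplifying assumptions; a real implementation would rotate the textured slice in the 3D scene by an arbitrary angle:

```python
import numpy as np

def patient_oriented(image, probe_roll_deg):
    """Counter-rotate the acquired slice by the probe's tracked roll so the
    vertical top of the data always renders as 'up' relative to the patient.
    Only multiples of 90 degrees are handled here, via np.rot90."""
    quarter_turns = int(round(probe_roll_deg / 90.0)) % 4
    return np.rot90(image, k=quarter_turns)

img = np.array([[1, 2],
                [3, 4]])
```

With the probe unrotated the slice is displayed as acquired; with the probe rolled a quarter turn, the displayed slice is counter-rotated so the patient's "up" stays at the top of the screen.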
  • FIG. 16 is a flowchart of a patient oriented imaging window mode, i.e., a virtual environment imaging mode, associated with the generation of the patient oriented imaging window as depicted above with respect to FIGS. 15A, 15B and 15C .
  • ultrasound imaging system 10 is initialized for rendering a 3D ultrasound image, such as setting up processor circuit 24 and graphical user interface 22 for construction of 3D models, initializing a camera video data transfer, and configuring appropriate patient lighting for video.
  • “WHILE” defines the entry into a continuous loop for generation and updating of the displayed patient oriented imaging window 108 as depicted in FIGS. 15B and 15C .
  • an ultrasound (US) volume transform node is updated based on the position of ultrasound probe 16 , as determined at step S 106 of FIG. 8 .
  • processor circuit 24 executes program instructions to move the 3D model of the three-dimensional imaging volume 68 (see FIG. 5A ) to match the current position of ultrasound probe 16 .
  • an ultrasound (US) image transform node is updated based on the calculated OFFSET from step S 110 of FIG. 8 .
  • processor circuit 24 executes program instructions to update the ultrasound image transform node by moving a 3D model of the three-dimensional ultrasound imaging data to match the current two-dimensional ultrasound imaging slice 67 (B-scan) acquired from ultrasound probe 16 .
  • processor circuit 24 executes program instructions to display the two-dimensional ultrasound imaging slice 67 (B-scan) in a 3-D environment in the patient oriented imaging window 108 , such that the vertical “top” 67 - 1 of the acquired ultrasound image data of two-dimensional ultrasound imaging slice 67 , or the vertical top of the acquired volumetric data in 3D data acquisition, is always rendered as “up” on display screen 28 relative to the position of the patient, and regardless of the actual orientation of ultrasound probe 16 relative to the patient.
  • The process then returns to step S402, "WHILE", to continue updating the patient oriented imaging window 108.
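The two transform-node updates in this loop compose naturally as homogeneous transforms: the 3D volume model follows the probe pose, and the current B-scan is placed within the volume by the calculated OFFSET. The pure-translation matrices below are an illustrative simplification (real probe poses also include rotation):

```python
import numpy as np

def translation(x, y, z):
    """4x4 homogeneous translation matrix."""
    T = np.eye(4)
    T[:3, 3] = [x, y, z]
    return T

# US volume transform node: the 3D model follows the tracked probe position.
probe_pose = translation(10.0, 5.0, 0.0)

# US image transform node: the current B-scan sits at the calculated OFFSET
# along the probe's scan axis (step S110 of FIG. 8).
scan_offset = translation(0.0, 2.5, 0.0)

# Composing both places the acquired slice correctly in the virtual environment.
slice_to_world = probe_pose @ scan_offset
```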
  • This offset, or depth, information can further be used to dynamically control some of the ultrasound imaging settings in near real time, as identified below. This allows the system to optimize the image quality settings such that the best image of the interventional medical device 18 is displayed to the user at display screen 28.
  • Because the z-axis offset from the ultrasound probe 16 can be calculated, the ultrasound imaging settings that may be dynamically controlled may include:
  • Depth setting: because the z-axis offset from the ultrasound probe 16 can be calculated, the Depth setting can be dynamically controlled such that the depth of imaging is automatically adjusted to match the depth of the interventional medical device 18.
  • Zoom: the imaging window can be "zoomed" such that a larger view of the area of interest may be automatically displayed to the user.
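A sketch of how the z-axis offset might drive these settings in near real time. The margin, maximum depth, and returned setting names are illustrative assumptions, not values from the specification:

```python
def dynamic_settings(tip_z_offset_mm, margin_mm=10.0, max_depth_mm=150.0):
    """Derive imaging settings from the device tip's z-axis offset from the
    probe: imaging depth tracks the device depth plus a margin, the zoom
    window brackets the device, and focus is set at the device depth."""
    depth = min(max_depth_mm, tip_z_offset_mm + margin_mm)
    zoom_window = (max(0.0, tip_z_offset_mm - margin_mm), depth)
    return {"depth_mm": depth,
            "zoom_window_mm": zoom_window,
            "focus_mm": tip_z_offset_mm}
```

As the device advances, each new offset reading yields an updated settings dictionary, so depth, zoom, and focus follow the interventional medical device without user intervention.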

Abstract

An ultrasound imaging system includes an interventional medical device having a first tracking element that generates tip location data based on an EM locator field. An ultrasound probe has an ultrasound transducer mechanism and a second tracking element. The ultrasound transducer mechanism has an active ultrasound transducer array that generates two-dimensional ultrasound slice data at any of a plurality of discrete imaging locations within a three-dimensional imaging volume. The second tracking element generates probe location data based on the EM locator field. A processor circuit is configured to execute program instructions to generate an ultrasound image for display, and is configured to generate a positioning signal based on the tip location data and the probe location data to dynamically position the active ultrasound transducer array so that the two-dimensional ultrasound slice data includes the distal tip of the interventional medical device.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • This application claims priority to U.S. provisional patent application Ser. No. 62/081,275, filed Nov. 18, 2014, which is incorporated herein by reference in its entirety.
  • BACKGROUND OF THE INVENTION
  • 1. Field of the Invention
  • The present invention relates to ultrasound imaging, and, more particularly, to an ultrasound imaging system that assists in the positioning of an ultrasound probe.
  • 2. Description of the Related Art
  • Correctly positioning an ultrasound probe such that a diagnostically relevant image is produced is a skill often only obtained after training and consistent ultrasound use. This initial “training period” necessary to become proficient in ultrasound imaging may be a contributing factor to the current underutilization of ultrasound by non-sonographers.
  • What is needed in the art is an ultrasound imaging system, as in the present invention, which assists a person not experienced in ultrasound imaging in successful image acquisition, via system assisted positioning of an ultrasound probe, such that an image of a location of interest under, i.e., in the imaging view of, the ultrasound probe can be displayed.
  • SUMMARY OF THE INVENTION
  • The present invention provides an ultrasound imaging system that assists in image acquisition, and in positioning of an ultrasound probe, such that an image of a location of interest under, i.e., in the imaging view of, the probe can be displayed. For example, the ultrasound imaging system assists in the positioning of an ultrasound probe such that a specific image containing a medical device and/or the surrounding area can automatically be presented to the user. The system may further be used to create three-dimensional (3D) images of underlying structures, which may convey additional information regarding the state of the underlying anatomy. This may assist one performing peripheral arterial disease (PAD) or other interventional procedures.
  • The invention in one form is directed to an ultrasound imaging system that includes an electromagnetic (EM) field generator configured to generate an EM locator field. An interventional medical device is defined by an elongate body having a distal tip and a distal end portion extending proximally from the distal tip. The interventional medical device has a first tracking element mounted at the distal end portion of the interventional medical device. The first tracking element is configured to generate tip location data based on the EM locator field. An ultrasound probe has a probe housing, an ultrasound transducer mechanism, and a second tracking element. The probe housing has a handle portion and a head portion. The ultrasound transducer mechanism and the second tracking element are mounted to the probe housing. The ultrasound transducer mechanism has an active ultrasound transducer array configured to generate two-dimensional ultrasound slice data at any of a plurality of discrete imaging locations within a three-dimensional imaging volume associated with the head portion. The second tracking element is configured to generate probe location data based on the EM locator field. A display screen is configured to display an ultrasound image. A processor circuit is communicatively coupled to the first tracking element, the second tracking element, the ultrasound transducer mechanism, and the display screen. The processor circuit is configured to execute program instructions to process the two-dimensional ultrasound slice data to generate the ultrasound image for display at the display screen. 
Also, the processor circuit is configured to generate a positioning signal based on the tip location data and the probe location data to dynamically position the active ultrasound transducer array at a desired imaging location of the plurality of discrete imaging locations so that the two-dimensional ultrasound slice data includes at least the distal tip of the interventional medical device so long as a location of the distal tip of the interventional medical device remains in the three-dimensional imaging volume.
  • A further version of the invention lies in the electromagnetic field generator adapted for use in such a system, the interventional medical device adapted for use in such a system, an ultrasound probe adapted for use in such a system, a display screen adapted for use in such a system, and a processor circuit adapted for use in such a system. An alternative version of the invention lies in a system comprising a combination of any of the objects recited in the previous sentence.
  • The invention in another form is directed to a method of operating an ultrasound imaging system, including acquiring a position of a first tracking element associated with an interventional medical device; acquiring a position of a second tracking element associated with an ultrasound probe; determining an ultrasound imaging plane position of the ultrasound probe based on the position of the second tracking element; determining an offset distance between the position of the first tracking element of the interventional medical device and the ultrasound plane position; and driving an ultrasound transducer mechanism to position an active ultrasound transducer array of the ultrasound probe at a determined point of convergence as defined by the offset distance.
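The offset determination in the method above reduces, geometrically, to a signed point-to-plane distance. A sketch (vector names and the use of a plane normal are assumptions; the patent does not spell out the math):

```python
import numpy as np

def plane_offset(tip_pos, plane_point, plane_normal):
    """Signed distance from the device tip to the current ultrasound imaging
    plane. The magnitude is how far the active transducer array must be driven
    to converge on the tip; the sign gives the direction along the scan axis."""
    n = np.asarray(plane_normal, dtype=float)
    n = n / np.linalg.norm(n)
    d = np.asarray(tip_pos, dtype=float) - np.asarray(plane_point, dtype=float)
    return float(np.dot(d, n))
```

With the imaging plane at y = 5 mm and the tip at y = 12 mm, the offset is +7 mm along the scan axis.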
  • In accordance with another aspect of the invention, a motion indicator is located on at least one of the ultrasound probe and the display screen. The processor circuit is operably coupled to the motion indicator, wherein if the distal tip of the interventional medical device is presently located outside the three-dimensional imaging volume, a visual prompt is generated at the motion indicator to prompt the user to move the head portion of the ultrasound probe in a particular direction to a general location such that the distal tip of the interventional medical device resides in the three-dimensional imaging volume.
  • In accordance with another aspect of the invention, a third tracking element is attached to a patient. When the third tracking element is energized by the EM field generator, it generates six-axis patient location data, which is supplied to the processor circuit. The processor circuit processes the six-axis patient location data and assigns location information for images captured by the active ultrasound transducer array to known positions within a 3D volume referenced from the third tracking element.
  • In accordance with another aspect of the invention, the ultrasound imaging system has a three-dimensional imaging mode, wherein with the ultrasound probe held in a fixed position over an area of interest, a scanning signal is supplied to the ultrasound transducer mechanism to scan the active ultrasound transducer array over at least a portion of the possible imaging volume located below the transducer array. The active transducer array is repeatedly actuated during the scan to generate a plurality of sequential two-dimensional ultrasound data slices which are combined to form three-dimensional ultrasound volumetric data from which a three-dimensional ultrasound image is generated.
  • In accordance with another aspect of the invention, the active ultrasound transducer array is operated to generate multiple sets of ultrasound image data that includes metadata describing the location of the scan within the three-dimensional volume. The multiple sets of ultrasound image data are summed to generate composite ultrasound image data.
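The summing of location-tagged image sets into composite data can be sketched as accumulation into a common grid, averaging where scans overlap. The grid size, the (row, col) metadata layout, and the averaging choice are assumptions for illustration:

```python
import numpy as np

def composite(scans, grid_shape=(8, 8)):
    """Sum multiple ultrasound image sets into composite image data. Each scan
    carries metadata giving its location within the common volume (here, a
    top-left (row, col) offset into a 2D grid for simplicity)."""
    acc = np.zeros(grid_shape, dtype=float)
    cnt = np.zeros(grid_shape, dtype=float)
    for data, (row, col) in scans:
        h, w = data.shape
        acc[row:row + h, col:col + w] += data
        cnt[row:row + h, col:col + w] += 1
    # Average where scans overlap; leave uncovered cells at zero.
    return np.divide(acc, cnt, out=np.zeros_like(acc), where=cnt > 0)
```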
  • In accordance with another aspect of the invention, a desired image plane is defined in the three-dimensional ultrasound volumetric data. At least one synthetic scan plane is generated corresponding to the desired image plane.
  • In accordance with another aspect of the invention, a first two-dimensional ultrasound image slice is generated from a series of two-dimensional B-scan ultrasound image slices acquired from the three-dimensional ultrasound volumetric data. The first two-dimensional ultrasound image slice includes a particular region of interest. The first two-dimensional ultrasound image slice lies in a first imaging plane different from that of the native B-scan imaging plane of the series of two-dimensional ultrasound image slices. At least one slice selection slider provides a sequential parallel variation from the first two-dimensional ultrasound image slice to manually select a second two-dimensional ultrasound image slice parallel to the first two-dimensional ultrasound image, wherein the second two-dimensional ultrasound image slice lies on either side of the first two-dimensional ultrasound image slice.
  • In accordance with another aspect of the invention, an orientation of the ultrasound image that is displayed on a display screen is adjusted such that a vertical top of the acquired ultrasound image data is always rendered as “up” on the display screen relative to the position of the patient, and regardless of the actual orientation of ultrasound probe relative to the patient.
  • Another aspect of the invention is directed to a method of operating an ultrasound imaging system, including acquiring a position of a first tracking element associated with an interventional medical device; acquiring a position of a second tracking element associated with an ultrasound probe; determining an ultrasound imaging plane position of the ultrasound probe based on the position of the second tracking element; determining an offset distance between the position of the first tracking element of the interventional medical device and the ultrasound plane position; and using the offset distance to dynamically control at least one ultrasound imaging setting of the ultrasound imaging system in near real time. As used herein, the term "near real time" means real time as limited by data acquisition and processing speed of the processing system. The at least one ultrasound imaging setting may include ultrasound focus, such that a lateral resolution is optimized at a depth that contains the interventional medical device. Also, the at least one ultrasound imaging setting may include a depth setting, such that a depth of imaging is automatically adjusted to match a depth of the interventional medical device. Also, the at least one ultrasound imaging setting may include zoom, wherein an imaging window can be "zoomed" such that a larger view of an area of interest is automatically displayed to the user.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The above-mentioned and other features and advantages of this invention, and the manner of attaining them, will become more apparent and the invention will be better understood by reference to the following description of embodiments of the invention taken in conjunction with the accompanying drawings, wherein:
  • FIG. 1 is an illustration of an ultrasound imaging system in accordance with an aspect of the present invention.
  • FIG. 2 is an electrical block diagram of the ultrasound imaging system of FIG. 1.
  • FIG. 3 shows an interventional medical device, such as a catheter or sheath, having a tracking element near its distal tip.
  • FIG. 4 shows an interventional medical device, such as a catheter, having a wireless dongle.
  • FIG. 5A shows the ultrasound probe of FIG. 1 having an ultrasound transducer mechanism with an active ultrasound transducer array configured to generate two-dimensional ultrasound slice data.
  • FIG. 5B shows a graphical user interface having a display screen showing a two-dimensional ultrasound image of the two-dimensional ultrasound slice data acquired by the ultrasound probe depicted in FIG. 5A.
  • FIG. 6A is a block diagram of an embodiment of the ultrasound probe of FIG. 1, having a movable one-dimensional transducer array.
  • FIG. 6B shows the ultrasound probe of FIGS. 1 and 6A, with a portion broken away to expose an ultrasound transducer mechanism having a movable one-dimensional transducer array, a carriage, and a stepper motor.
  • FIG. 7A is a block diagram of another embodiment of the ultrasound probe of FIG. 1, having a stationary two-dimensional transducer array.
  • FIG. 7B shows the ultrasound probe of FIG. 7A, depicting the two-dimensional transducer array in phantom (dashed) lines.
  • FIG. 8 is a flowchart depicting a lock-on tracking mode in accordance with an aspect of the present invention.
  • FIG. 9 is a flowchart depicting ultrasound data acquisition in accordance with an aspect of the present invention.
  • FIG. 10 shows a general side view of a patient having a position tracking element affixed to the skin.
  • FIG. 11 shows a screen of the graphical user interface of FIG. 1, configured to display one or more synthetic (user chosen) scan planes, such as a coronal scan plane and an axial (sagittal) scan plane.
  • FIG. 12 is a pictorial representation of the graphical user interface of FIG. 1 depicting a sagittal plane slice extending through a series of two-dimensional ultrasound image slices in a three-dimensional imaging volume at sagittal slice location 270.
  • FIG. 13 is a pictorial representation of the graphical user interface of FIG. 1 depicting a coronal plane slice extending through a series of two-dimensional ultrasound image slices in a three-dimensional imaging volume at coronal slice location 150.
  • FIG. 14 is a flowchart describing the generation of a set of ultrasound images derived or synthesized from the three-dimensional volume data set, and shown in the correct location in the 3D virtual environment, in accordance with an aspect of the present invention.
  • FIG. 15A is a diagrammatic illustration of the ultrasound probe of FIG. 1 taking a two-dimensional ultrasound imaging slice of a portion of a leg of a patient.
  • FIG. 15B is a diagrammatic illustration of the graphical user interface of FIG. 1 having a patient oriented imaging window depicting a patient oriented virtual environment, wherein the location and orientation of the acquired ultrasound image data is rendered on the display screen to correspond to the orientation of the patient, such that the orientation and location of where the image is being acquired relative to the patient can be indicated and communicated to the viewer via use of the virtual environment.
  • FIG. 15C is a full view of the ultrasound image shown in FIG. 15B, in which the orientation of the location and orientation of the acquired ultrasound image data is rendered on the display screen to correspond to the orientation of the patient.
  • FIG. 15D is a comparative view of the ultrasound image shown in FIG. 15B when rendered in accordance with the prior art, wherein the orientation of the acquired ultrasound image data rendered on the display screen does not correspond to the orientation of the patient.
  • FIG. 16 is a flowchart of a patient oriented imaging window mode, or virtual environment imaging mode, associated with the depiction of the patient oriented imaging window of FIG. 15B shown in the correct location in the 3D virtual environment, in accordance with an aspect of the present invention.
  • Corresponding reference characters indicate corresponding parts throughout the several views. The exemplifications set out herein illustrate embodiments of the invention, and such exemplifications are not to be construed as limiting the scope of the invention in any manner.
  • DETAILED DESCRIPTION OF THE INVENTION
  • Referring now to the drawings, and more particularly to FIGS. 1 and 2, there is shown an ultrasound imaging system 10 in accordance with the present invention.
Ultrasound imaging system 10 includes an electromagnetic (EM) field generator 12, an ultrasound console 14, and an ultrasound probe 16 (handheld). Ultrasound probe 16 is connected to ultrasound console 14 by a flexible electrical cable 17. Supplemental to ultrasound imaging system 10 is an interventional medical device 18.
  • As used herein, the term “interventional medical device” is an elongate intrusive medical device that is configured to be inserted into the tissue, vessel or cavity of a patient. In the context of the present invention, interventional medical device 18 may be, for example, a catheter, a lesion crossing catheter such as the CROSSER® Catheter available from C. R. Bard, Inc., a guide wire, a sheath, an angioplasty balloon, a stent delivery catheter, or a needle. It is intended that the interventional medical device 18 may be considered as a part of the overall ultrasound imaging system 10, but alternatively, also may be considered as an auxiliary part of ultrasound imaging system 10 as a separately provided item.
  • Ultrasound imaging system 10 is configured to track the location of the ultrasound probe 16 and interventional medical device 18, and in turn, to operate ultrasound probe 16 such that an active ultrasound transducer array of ultrasound probe 16 is dynamically positioned to image a desired portion of interventional medical device 18, as further described below.
  • In the present embodiment, ultrasound console 14 includes a mobile housing 20, to which is mounted a graphical user interface 22, and a processor circuit 24. Graphical user interface 22 may be in the form of a touch-screen display 26 having a display screen 28. Graphical user interface 22 is used in displaying information to the user, and accommodates user input via the touch-screen 26. For example, touch-screen 26 is configured to display an ultrasound image formed from two-dimensional ultrasound slice data provided by ultrasound probe 16, to display virtual location information of tracked elements within a 3D volume, and to display prompts intended to guide the user in the correct positioning of the ultrasound probe 16 above the area of interest.
  • Processor circuit 24 is an electrical circuit that has data processing capability and command generating capability, and in the present embodiment has a microprocessor 24-1 and associated non-transitory electronic memory 24-2. Microprocessor 24-1 and associated non-transitory electronic memory 24-2 are commercially available components, as will be recognized by one skilled in the art. Microprocessor 24-1 may be in the form of a single microprocessor, or two or more parallel microprocessors, as is known in the art. Non-transitory electronic memory 24-2 may include multiple types of digital data memory, such as random access memory (RAM), non-volatile RAM (NVRAM), read only memory (ROM), and/or electrically erasable programmable read-only memory (EEPROM). Non-transitory electronic memory 24-2 may further include mass data storage in one or more of the electronic memory forms described above, or on a computer hard disk drive or optical disk. Alternatively, processor circuit 24 may be assembled as one or more Application Specific Integrated Circuits (ASIC).
  • Processor circuit 24 processes program instructions received from a program source, such as software or firmware, to which processor circuit 24 has electronic access. More particularly, processor circuit 24 is configured, as more fully described below, to process location signals received from ultrasound probe 16 and interventional medical device 18, and to generate a digital positioning signal that is conditioned and provided as a control output to ultrasound probe 16. More particularly, the digital positioning signal and control output correspond to a coordinate in the scan axis, e.g., the y-axis, of ultrasound probe 16 where the active ultrasound transducer array of ultrasound probe 16 is to be positioned.
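The mapping from the digital positioning signal to a physical transducer location can be pictured as quantizing the desired scan-axis coordinate to the nearest of the probe's discrete imaging locations. The location count and scan span below are illustrative values, not taken from the specification:

```python
def positioning_index(desired_y_mm, n_locations=31, span_mm=40.0):
    """Map a desired y-axis coordinate to the nearest discrete imaging
    location index, clamped to the ends of the probe's scan range."""
    step = span_mm / (n_locations - 1)
    index = round(desired_y_mm / step)
    return max(0, min(n_locations - 1, index))
```

A request beyond either end of the scan range simply parks the active ultrasound transducer array at the nearest end location.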
  • Processor circuit 24 is communicatively coupled to a probe input/output (I/O) interface circuit 30, a probe position control circuit 31, and a device input/output (I/O) interface circuit 32 via an internal bus structure 30-1, 31-1, and 32-1, respectively. As used herein, the term “communicatively coupled” means connected for communication over a communication medium, wherein the communication medium may be a direct wired connection having electrical conductors and/or printed circuit electrical conduction paths, or a wireless connection, and may be an indirect wired or wireless connection having intervening electrical circuits, such as amplifiers or repeaters. Probe input/output (I/O) interface circuit 30 and probe position control circuit 31 are configured to connect to electrical cable 17, which in turn is connected to ultrasound probe 16. In the present embodiment, device input/output (I/O) interface circuit 32 is configured to connect to a flexible electrical cable 34, which in turn is connected to interventional medical device 18.
  • Referring again to FIG. 1, EM field generator 12 is placed near the area of interest of the patient P, and is used in triangulating the location of one or more tracked elements, such as the position of ultrasound probe 16 and interventional medical device 18. EM field generator 12 may be, for example, the field generator of an Aurora Electromagnetic Tracking System available from Northern Digital Inc. (NDI), which generates a base electromagnetic field that radiates in a known orientation to facilitate electromagnetic spatial measurement, which will be referred to hereinafter as an EM locator field 36 (see FIG. 2). The field strength of the EM locator field 36 defines a detection volume 38, as diagrammatically illustrated as a cube volume, for convenience, in FIG. 1.
  • Referring also to FIG. 3, interventional medical device 18 has a distal tip 40 and a distal end portion 42 extending proximally from the distal tip 40. In the present embodiment, a tracking element 44 (i.e., a wire electrical tracking coil) is mounted at distal end portion 42 of interventional medical device 18 near distal tip 40. In the context of the preceding sentence, the term “near” means within a range of zero to 2 centimeters (cm), and the extent of distal end portion 42 is in a range of 1 millimeter (mm) to 3 cm. Those skilled in the art will recognize, however, that the exact location of the placement of tracking element 44 on interventional medical device 18 will depend on the portion of interventional medical device 18 that is to be tracked by ultrasound imaging system 10. Tracking element 44 allows the location of interventional medical device 18 to be known relative to ultrasound probe 16, as more fully described below.
  • Tracking element 44 is configured to generate tip location data defining five degrees of freedom based on the EM locator field 36 generated by EM field generator 12. The five degrees of freedom are the X-axis, Y-axis, Z-axis, pitch, and yaw. A sixth degree of freedom, i.e., roll, may be also included, if desired. Tracking element 44 of interventional medical device 18 is communicatively coupled to processor circuit 24 of ultrasound console 14 via electrical cable 34, serving as a communication link 46 between processor circuit 24 and tracking element 44. As used herein, “communications link” refers to an electrical transmission of data, i.e., information, and/or electrical power signals, over a wired or wireless communication medium. In the present embodiment, the communication link 46 provided by electrical cable 34 is a multi-conductor electrical cable that physically connects tracking element 44 to the ultrasound console 14, and in turn to processor circuit 24.
  • Alternatively, as depicted in FIG. 4, in place of a physical connection, communication link 46 may be in the form of a short range wireless connection, such as Bluetooth, via a Bluetooth dongle 48 attached to interventional medical device 18. The Bluetooth dongle 48 is configured as a Bluetooth transmitter using Bluetooth protocol, and a corresponding Bluetooth receiver is connected to processor circuit 24. Bluetooth dongle 48 communicates tracking information from tracking element 44, and other information associated with interventional medical device 18, such as an operating state, to processor circuit 24 of ultrasound imaging system 10. Also, Bluetooth dongle 48 may be used to provide power to the EM tracking components incorporated into interventional medical device 18, in the case where the EM tracking component is an active circuit requiring a power source.
  • Bluetooth dongle 48 may be disposable, and included with each interventional medical device 18. Alternatively, Bluetooth dongle 48 may be reusable. Sterility requirements for the reusable dongle are addressed by placing the sterilized dongle in a sterile bag through which a sterile connection to interventional medical device 18 is made.
  • As shown in FIG. 5A, ultrasound probe 16 includes a probe housing 50 having a handle portion 52 joined with a head portion 54. In the present embodiment, handle portion 52 has an extent that is generally perpendicular (range of ±5 degrees) to the extent of head portion 54.
  • Ultrasound probe 16 is communicatively coupled to processor circuit 24 of ultrasound console 14 via electrical cable 17, which may be a wired or a wireless connection. In the present embodiment, with reference to FIG. 2, electrical cable 17 is depicted as a multi-conductor electrical cable that physically connects ultrasound probe 16 to ultrasound console 14, and includes a communication link 56, a communication link 58, and a communication link 60, each formed with wire conductors. However, it is contemplated that one or more of communication link 56, communication link 58, and communication link 60 may be in the form of a short range wireless connection, such as Bluetooth. Portions of processor circuit 24 could also be embedded in ultrasound probe 16 to analyze or process the signals received from, or transmitted to, the ultrasound emitting elements; the analyzed or processed signals are then transmitted back to ultrasound console 14 via electrical cable 17.
  • Referring to FIG. 2, ultrasound probe 16 includes an ultrasound transducer mechanism 62 and a tracking element 64. Both ultrasound transducer mechanism 62 and tracking element 64 are mounted to probe housing 50 (see also FIG. 5A), and may be contained within probe housing 50, which may be formed from plastic. Also, tracking element 64 may be embedded in the plastic of probe housing 50. Ultrasound transducer mechanism 62 is communicatively coupled to processor circuit 24 via communication links 56 and 58.
  • Referring to FIGS. 2 and 5A, ultrasound transducer mechanism 62 has an active ultrasound transducer array 66 configured to generate two-dimensional ultrasound slice data representing a two-dimensional ultrasound imaging slice 67 at any of a plurality of discrete imaging locations within a three-dimensional imaging volume 68 associated with head portion 54 of ultrasound probe 16. The three-dimensional imaging volume 68 is defined by a depth 68-1 of penetration of the ultrasound emission in the direction of the z-axis, a width 68-2 of ultrasound emission in the x-axis, and an ultrasound transducer scan extent 68-3 along the y-axis. Active ultrasound transducer array 66 may be, for example, a one-dimensional transducer array in the form of a linear ultrasound transducer array, or alternatively, may be in the form of a convex or concave ultrasound transducer array. As used herein, the term “one-dimensional transducer array” is an array of ultrasound transducer elements arranged in a single row, wherein the row may be linear or curved.
  • Active ultrasound transducer array 66 is communicatively coupled to processor circuit 24 via communication link 58, and supplies two-dimensional ultrasound data to processor circuit 24 via communication link 58. Automatically, or alternatively based on a user input at graphical user interface 22, processor circuit 24 executes program instructions to store the two-dimensional ultrasound data in mass storage provided in non-transitory electronic memory 24-2.
  • Referring also to FIG. 5B, processor circuit 24 includes circuitry, or alternatively executes program instructions, to convert the two-dimensional ultrasound data to a form for viewing as a two-dimensional ultrasound image 69 on display screen 28 of graphical user interface 22. The two-dimensional ultrasound image 69 depicts interventional medical device 18 having tracking element 44 located in a blood vessel BV, and depicts distal tip 40 of distal end portion 42 of interventional medical device 18 engaged with an intravascular occlusion IC.
  • Referring again to FIGS. 2 and 5A, tracking element 64 (i.e., a wire electrical tracking coil) is configured to generate probe location data defining six degrees of freedom based on the EM locator field 36 generated by EM field generator 12. The six degrees of freedom are the X-axis, Y-axis, Z-axis, pitch, yaw, and roll. Tracking element 64 is communicatively coupled to processor circuit 24 via communication link 60, and supplies probe location data to processor circuit 24 via communication link 60. Tracking element 64 allows for the determination of the location of ultrasound probe 16 within detection volume 38 as depicted in FIG. 1, wherein detection volume 38 is considerably larger (more than 20 times larger) than the three-dimensional imaging volume 68 of ultrasound probe 16 depicted in FIG. 5A.
  • In accordance with the present invention, active ultrasound transducer array 66 of ultrasound transducer mechanism 62 of ultrasound probe 16 may incorporate a movable one-dimensional (1D) transducer array, as in the embodiment depicted in FIGS. 6A and 6B. Alternatively, as depicted in FIGS. 7A and 7B, active ultrasound transducer array 66 of ultrasound transducer mechanism 62 of ultrasound probe 16 may be in the form of a selectable portion of a two-dimensional (2D) matrix transducer array.
  • In the embodiment depicted in FIGS. 6A and 6B, active ultrasound transducer array 66 is physically movable relative to the probe housing 50, i.e., is dynamically positioned within probe housing 50, in order to capture ultrasound images of locations within the three-dimensional imaging volume 68 (diagrammatically illustrated cube volume, for convenience) beneath ultrasound probe 16.
  • In the embodiment of FIGS. 6A and 6B, ultrasound transducer mechanism 62 includes a one-dimensional (1D) ultrasound transducer array 70, a carriage 72, and a stepper motor 74. In the present embodiment, one-dimensional ultrasound transducer array 70 serves as the active ultrasound transducer array 66. The one-dimensional ultrasound transducer array 70 has a row of a plurality of discrete ultrasound transducer elements.
  • Carriage 72 is connected to one-dimensional ultrasound transducer array 70, such that one-dimensional ultrasound transducer array 70 moves in unison with carriage 72. Carriage 72 converts a rotation of a rotatable shaft 74-1 of stepper motor 74 into a linear translation of carriage 72, and in turn, into a linear translation of one-dimensional ultrasound transducer array 70 relative to head portion 54 of probe housing 50, in a determined one of two translation directions D1, D2.
  • Stepper motor 74 is operably connected (electrically and communicatively) to probe position control circuit 31 (see FIG. 2) via communication link 56 of electrical cable 17. In the present embodiment, probe position control circuit 31 is in the form of a motor control circuit, which converts the digital positioning signal supplied by processor circuit 24 into a stepper motor positioning signal, which may include multiple stepper motor control signals, and which are supplied by the motor control circuit to stepper motor 74 to command rotation of rotatable shaft 74-1 by an amount corresponding to the amount and position dictated by the digital positioning signal. In the present embodiment, the digital positioning signal and the stepper motor positioning signal may be referred to herein collectively as the “positioning signal”, since the stepper motor positioning signal is a form change of the digital positioning signal, and the “positioning signal” is considered herein to have been generated by processor circuit 24.
  • Carriage 72 converts the rotation of rotatable shaft 74-1 of stepper motor 74 into a linear translation of carriage 72, and in turn, moves one-dimensional ultrasound transducer array 70 relative to head portion 54 of probe housing 50 in a determined one of two translation directions D1, D2, to a location thus dictated by the digital positioning signal generated by processor circuit 24. Thus, based on the positioning signal initiated by processor circuit 24, the one-dimensional ultrasound transducer array 70 may be moved to a desired position relative to head portion 54 of probe housing 50.
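The chain from the digital positioning signal to carriage motion can be illustrated with a short sketch. All numerical values (steps per revolution, belt pitch, drive gear teeth) and the function name are illustrative assumptions, not taken from the patent:

```python
# Hypothetical sketch: converting a commanded y-axis position into a
# signed stepper step count for the belt-driven carriage of FIGS. 6A/6B.
# The geometry constants below are assumed for illustration only.

STEPS_PER_REV = 200        # full steps per shaft revolution (assumed)
BELT_PITCH_MM = 2.0        # linear belt travel per gear tooth (assumed)
DRIVE_GEAR_TEETH = 20      # teeth on drive gear 82 (assumed)

MM_PER_REV = BELT_PITCH_MM * DRIVE_GEAR_TEETH   # 40 mm per revolution
MM_PER_STEP = MM_PER_REV / STEPS_PER_REV        # 0.2 mm per step

def steps_to_position(current_mm: float, target_mm: float) -> int:
    """Signed step count that moves the array from current_mm to
    target_mm; positive maps to direction D1, negative to D2."""
    return round((target_mm - current_mm) / MM_PER_STEP)
```

With these assumed constants, a 10 mm move corresponds to 50 steps; the sign of the result selects the translation direction.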
  • FIG. 6B shows an embodiment of carriage 72, wherein carriage 72 has an endless toothed belt 78 suspended between two longitudinally spaced idler gears/pulleys 80-1, 80-2. Rotatable shaft 74-1 of stepper motor 74 is connected to a drive gear 82. Drive gear 82 is drivably engaged with the teeth of endless toothed belt 78. One-dimensional ultrasound transducer array 70 is attached to the lower run 78-1 of endless toothed belt 78, and is movable along the longitudinal extent between the two longitudinally spaced idler gears/pulleys 80-1, 80-2. As such, the arrangement of toothed belt 78 suspended between two longitudinally spaced idler gears/pulleys 80-1, 80-2 converts a rotation of the rotatable shaft 74-1 of the stepper motor 74 into a translation of the one-dimensional ultrasound transducer array 70 in a selectable one of the two translation directions D1, D2.
  • In the alternative embodiment depicted in FIGS. 7A and 7B, and identified as ultrasound probe 16-1, an alternative ultrasound transducer mechanism 62-1 includes a two-dimensional (2D) ultrasound transducer array 84, and probe position control circuit 31 (see FIG. 2) is in the form of a matrix address circuit of the type used in addressing electronic memory. Two-dimensional ultrasound transducer array 84 has a plurality of columns 84-1 and a plurality of addressable rows 84-2 of discrete ultrasound transducer elements arranged in a matrix pattern. The two-dimensional ultrasound transducer array 84 may be a planar transducer arrangement, or alternatively may be a concave or convex arrangement. Two-dimensional ultrasound transducer array 84 is communicatively coupled to processor circuit 24 via communications link 58 to supply two-dimensional ultrasound data from two-dimensional ultrasound transducer array 84 to processor circuit 24.
  • In the embodiment of FIGS. 7A, 7B, with reference to FIG. 2, probe position control circuit 31 is electrically connected to processor circuit 24 to receive the digital positioning signal generated by processor circuit 24. In the present embodiment, probe position control circuit 31 operates as a matrix address circuit to convert the digital positioning signal supplied by processor circuit 24 into a row selection positioning signal which is supplied to two-dimensional (2D) ultrasound transducer array 84 via communications link 56 to dynamically select one row of the plurality of rows 84-2 of discrete ultrasound transducer elements as the active linear ultrasound transducer array 66. Thus, the row selection positioning signal corresponds to the position dictated by the digital positioning signal generated by processor circuit 24.
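The row selection just described can be sketched as a simple mapping from a commanded y-coordinate to a row index. The function name, scan extent, and row count are hypothetical values for illustration:

```python
def select_row(y_mm: float, scan_extent_mm: float, num_rows: int) -> int:
    """Map a commanded y-coordinate (0..scan_extent_mm) to the index of
    the transducer row to activate as the active array; the result is
    clamped to the bounds of the matrix."""
    if scan_extent_mm <= 0 or num_rows < 1:
        raise ValueError("invalid array geometry")
    # Round to the nearest row, then clamp.
    row = int(y_mm / scan_extent_mm * (num_rows - 1) + 0.5)
    return max(0, min(num_rows - 1, row))
```

In a real matrix address circuit this mapping would be done in hardware; the sketch only shows the positioning-signal-to-row correspondence.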
  • In the embodiment of FIGS. 7A and 7B, since the row selection positioning signal is a form change of the digital positioning signal, the digital positioning signal and the row selection positioning signal may be referred to herein collectively as the “positioning signal”, and the “positioning signal” is considered herein to have been generated by processor circuit 24.
  • As such, the embodiment of FIGS. 7A and 7B emulates the dynamic positioning of the one-dimensional ultrasound transducer array 70 discussed above with respect to FIGS. 6A and 6B, and allows for similar control of where the ultrasound probe will image within the three-dimensional imaging volume 68 beneath the ultrasound probe (see FIG. 5A).
  • In accordance with the present invention, and in view of the embodiments discussed above, ultrasound imaging system 10 provides a “lock-on” functionality, wherein the position of each of the ultrasound probe 16 and interventional medical device 18 are tracked, and the active ultrasound transducer array 66 in ultrasound probe 16 is dynamically positioned at a convergence of the tracking information, which is further described with reference to the flowchart of FIG. 8. Recall that processor circuit 24 is communicatively coupled to each of the tracking element 44 of interventional medical device 18, tracking element 64 of ultrasound probe 16, ultrasound transducer mechanism 62 of ultrasound probe 16, and to the graphical user interface 22 having display screen 28.
  • Referring to FIG. 8, at step S100, the tracking and data acquisition aspects of ultrasound imaging system 10 are initialized. In particular, processor circuit 24 executes program instructions to determine the type of tracking elements that are associated with each of ultrasound probe 16 and interventional medical device 18, the communications rate between processor circuit 24 and each of ultrasound probe 16 and interventional medical device 18, the rate of data acquisition updating, and probe parameters. Such probe parameters may include the scan extent start point and end point, and the desired velocity of the movement of active ultrasound transducer array 66, with respect to the origin point 71 (see FIG. 5A) defining the 0, 0, 0 location in the X, Y, and Z axes. Also, the location of tracking elements of ultrasound probe 16 and interventional medical device 18 may be calibrated with respect to the 3D detection volume 38 defined by EM field generator 12 (see FIG. 1).
  • At step S102, “WHILE” defines the entry into a continuous loop to virtually converge the position of the ultrasound imaging plane of active ultrasound transducer array 66 of ultrasound probe 16 with the position of tracking element 44, and in turn distal tip 40, of interventional medical device 18. Processor circuit 24 remains in this continuous loop until the program execution is stopped.
  • At step S104, the current position of tracking element 44 of interventional medical device 18 is determined in relation to the 3D detection volume 38 defined by EM field generator 12. In particular, tracking element 44 of interventional medical device 18 generates tip location data as physical coordinates based on the EM locator field 36 generated by EM field generator 12, and provides the tip location data associated with the physical coordinates to processor circuit 24.
  • At step S106, in parallel to step S104, the current position of tracking element 64 of ultrasound (US) probe 16 is determined in relation to the 3D detection volume 38 defined by EM field generator 12. In particular, tracking element 64 of ultrasound probe 16 generates probe location data as physical coordinates based on the EM locator field 36 generated by EM field generator 12, and provides the probe location data associated with the physical coordinates to processor circuit 24.
  • At step S108, an ultrasound plane position (B-scan position) is determined based on the probe location data. In particular, processor circuit 24 executes program instructions to define a unit vector, i.e., the Z-axis at origin point 71 (0,0,0) of FIG. 5A, that is perpendicular to (e.g., points downwardly from) the surface of head portion 54 of ultrasound probe 16, wherein the unit vector initially lies on a current ultrasound image plane. Processor circuit 24 executes program instructions to virtually rotate the vector to be normal to the current ultrasound image plane. Processor circuit 24 then executes program instructions to rotate the normal vector about the Z-axis using the probe location data acquired at step S106, which corresponds to the orientation angle of ultrasound probe 16. Processor circuit 24 then executes program instructions to determine the position of the current ultrasound image plane, with respect to the origin, using the following equation:

  • ultrasound plane position = Ax + By + Cz + D  (Equation 1)
  • where A, B, C are coefficients of the x, y, z position coordinates (of the probe location data) defining the plane of ultrasound probe 16, and D is the length of the distance vector from the origin point 71 to the Ax+By+Cz plane.
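The plane coefficients of Equation 1 can be illustrated with a short sketch. The function name and the use of NumPy are assumptions for illustration; (A, B, C) is taken as the unit normal derived from the probe orientation, and D as the signed distance term fixed by a point on the plane:

```python
import numpy as np

def plane_from_probe(probe_pos, normal):
    """Return (A, B, C, D) for the image plane A*x + B*y + C*z + D = 0,
    where probe_pos is a point on the plane (from the probe location
    data) and normal is the plane normal derived from the probe
    orientation. The normal is normalized so D is a true distance."""
    n = np.asarray(normal, dtype=float)
    n = n / np.linalg.norm(n)
    A, B, C = n
    # D is fixed by requiring the probe position to lie on the plane.
    D = -float(n @ np.asarray(probe_pos, dtype=float))
    return float(A), float(B), float(C), D
```

Evaluating Ax + By + Cz + D for an arbitrary point then gives zero exactly when the point lies on the current image plane.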
  • At step S110, processor circuit 24 executes program instructions to calculate an offset distance between the position of interventional medical device 18, as defined by the tip location data, and the ultrasound plane position (determined at step S108) of ultrasound probe 16, by using the equation:

  • OFFSET = (Ax1 + By1 + Cz1 + D)/sqrt(A² + B² + C²)  (Equation 2)
  • where: A, B, C, and D are coefficients of the ultrasound plane position (see step S108), and x1, y1, z1 are the position coordinates (of the tip location data) of interventional medical device 18.
  • The Equation 2 offset calculation gives the minimum, or perpendicular, distance from tracking element 44 of interventional medical device 18 to the ultrasound plane position, which is the distance (and direction) that ultrasound transducer mechanism 62 needs to move active ultrasound transducer array 66 so that there is a convergence (intersection) of the ultrasound position plane with the tracking element 44, and in turn distal tip 40, of interventional medical device 18. Thus, in essence, the calculation determines the offset used to achieve a convergence of the tip location data with the ultrasound plane position associated with the probe location data.
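Equation 2 translates directly into code. This sketch (function name assumed for illustration) returns the signed perpendicular distance from the tracked tip to the image plane; the sign selects the translation direction D1 or D2 for the active array:

```python
import math

def offset_to_plane(A, B, C, D, x1, y1, z1):
    """Equation 2: signed perpendicular distance from the tracked tip
    at (x1, y1, z1) to the plane A*x + B*y + C*z + D = 0. A zero
    result means the tip already lies on the current image plane."""
    return (A * x1 + B * y1 + C * z1 + D) / math.sqrt(A * A + B * B + C * C)
```

For a plane y = 5 (A=0, B=1, C=0, D=-5) and a tip at y = 8, the offset is 3, i.e., the array must move 3 units along the scan axis to converge on the tip.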
  • At step S112, ultrasound transducer mechanism 62 is driven to position active ultrasound transducer array 66 at the determined point of convergence as defined by the OFFSET calculated at step S110. In particular, processor circuit 24 executes program instructions to process the OFFSET to generate the positioning signal corresponding to the point of convergence, and the positioning signal is communicatively coupled to ultrasound transducer mechanism 62 to dynamically position active ultrasound transducer array 66 at a desired imaging location of the plurality of discrete imaging locations, so that the two-dimensional ultrasound slice data captured by active ultrasound transducer array 66 includes an image of at least the distal tip 40 of interventional medical device 18, so long as distal tip 40 of the interventional medical device 18 remains in the three-dimensional imaging volume 68 under the surface of the head portion of ultrasound probe 16.
  • In the embodiment of FIGS. 6A and 6B, the positioning signal will culminate in stepper motor control signals that are supplied to stepper motor 74. In the embodiment of FIGS. 7A and 7B, the positioning signal will culminate in a row selection signal supplied to two-dimensional ultrasound transducer array 84. As used herein, the terms “under” or “underlying” with respect to ultrasound probe 16 mean within the possible imaging view extent of ultrasound probe 16.
  • Thereafter, the process returns to step S102, “WHILE”, to continue in the continuous loop in maintaining a convergence of the position of the active ultrasound transducer array 66 of ultrasound probe 16 with tracking element 44, and in turn distal tip 40, of interventional medical device 18.
  • Referring to FIG. 9, there is shown a flowchart describing the acquisition of ultrasound data concurrently with, i.e., during, the “lock-on” function described above with respect to FIG. 8.
  • At step S200, ultrasound probe 16 is configured for acquisition of ultrasound data. For example, parameters such as the desired resolution, and emission strength of active ultrasound transducer array 66 to achieve a desired depth of penetration, may be set. For two-dimensional image scanning, ultrasound imaging system 10 is configured to collect a series of two-dimensional ultrasound imaging slices (ultrasound B-scan) data. For volume scan imaging, ultrasound imaging system 10 is configured to collect a series of ultrasound B-scan data to form three-dimensional ultrasound volumetric data representing the three-dimensional imaging volume 68, from which C-scan data, or other plane oriented data, may be derived.
  • At step S202, “WHILE” defines the entry into a continuous loop for acquisition of ultrasound data with active ultrasound transducer array 66 of ultrasound probe 16.
  • At step S204, ultrasound image data is acquired. More particularly, with reference to FIGS. 2 and 5A, processor circuit 24 is configured to execute program instructions, or alternatively includes circuitry, to process two-dimensional ultrasound slice data generated by the active ultrasound transducer array 66 of ultrasound transducer mechanism 62 of ultrasound probe 16, and to generate the ultrasound image for display at display screen 28 of graphical user interface 22. Also, processor circuit 24 may execute program instructions to automatically store the two-dimensional ultrasound slice data in non-transitory electronic memory 24-2, and thus accumulate multiple image data sets of the location of interest. Alternatively, graphical user interface 22 may provide a user command to processor circuit 24 to store the two-dimensional ultrasound slice data in non-transitory electronic memory 24-2 on demand.
  • For two-dimensional image scanning, a series of two-dimensional ultrasound imaging slices (ultrasound B-scan) data is collected and stored in non-transitory electronic memory 24-2. For volume scan imaging, active ultrasound transducer array 66 is scanned along the Y-axis across all, or a selected portion, of the three-dimensional imaging volume 68 to take a detailed volumetric scan of the underlying area beneath head portion 54 of ultrasound probe 16, such that a series of ultrasound B-scan data representing the three-dimensional imaging volume is collected and stored in non-transitory electronic memory 24-2.
  • Thereafter, the process returns to step S202, “WHILE”, to continue in the acquisition and updating of the ultrasound data.
  • While relative movement of ultrasound probe 16 and the distal tip 40 of interventional medical device 18 will result in a movement of the location of distal tip 40 of interventional medical device 18 in the three-dimensional imaging volume 68, so long as tracking element 44 and thus distal tip 40 of interventional medical device 18 remains in the three-dimensional imaging volume 68 of ultrasound probe 16, ultrasound imaging system 10 is able to dynamically position active ultrasound transducer array 66 to converge at a desired imaging location of the plurality of discrete imaging locations in the three-dimensional imaging volume 68 so that the two-dimensional ultrasound slice data includes an image of at least the distal tip 40 of interventional medical device 18 in generating the ultrasound image displayed on display screen 28.
  • However, referring again to FIG. 5A, in the event that tracking element 44 of interventional medical device 18 is outside the three-dimensional imaging volume 68, a motion indicator 88 located on at least one of the ultrasound probe 16 and the display screen 28 of graphical user interface 22 (see also FIG. 2) is provided to guide the user to an acceptable placement of ultrasound probe 16 relative to the tracked interventional medical device 18. Motion indicator 88 is operably coupled to processor circuit 24, and may be in the form of directional arrows that are selectively illuminated by processor circuit 24 to indicate the direction in which ultrasound probe 16 should be moved.
  • In particular, based on the tip location data provided by tracking element 44 of interventional medical device 18 and the probe location data provided by tracking element 64 of ultrasound probe 16, as processed by processor circuit 24, processor circuit 24 executes program logic to determine whether tracking element 44 of interventional medical device 18 is outside the three-dimensional imaging volume 68, and thus is outside the imagable range of ultrasound probe 16.
  • For example, when ultrasound probe 16 having tracking element 64 and interventional medical device 18 having tracking element 44 are placed within detection volume 38 of the EM field generator 12, the location of both tracking element 44 and tracking element 64, and the relative distance between tracking element 44 and tracking element 64, are calculated by processor circuit 24. Using this location and distance information, processor circuit 24 executes program instructions to determine whether the distal tip 40 of the interventional medical device 18 is presently located outside the three-dimensional imaging volume 68. If so, processor circuit 24 of ultrasound imaging system 10 further executes program instructions to generate a visual prompt at motion indicator 88 to prompt the user to move head portion 54 of ultrasound probe 16 in a particular direction to a general location such that tracking element 44, and thus distal tip 40, of interventional medical device 18 resides in the three-dimensional imaging volume 68 under ultrasound probe 16, thereby permitting the active ultrasound transducer array 66 of ultrasound probe 16 to automatically capture ultrasound image data containing the tracking element 44 and distal tip 40 of interventional medical device 18 for display on display screen 28.
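The prompting logic can be sketched as follows, assuming an axis-aligned bounding box for the three-dimensional imaging volume 68; the function name and the arrow encoding are illustrative, not from the patent:

```python
def prompt_direction(tip, volume_min, volume_max):
    """Return per-axis prompts ('+x', '-x', '+y', ...) indicating the
    direction(s) the probe must move to bring the tracked tip inside
    the imaging volume; an empty list means the tip is imageable and
    no motion indicator needs to be illuminated."""
    prompts = []
    for axis, (t, lo, hi) in enumerate(zip(tip, volume_min, volume_max)):
        name = "xyz"[axis]
        if t < lo:
            prompts.append("-" + name)   # tip lies beyond the low bound
        elif t > hi:
            prompts.append("+" + name)   # tip lies beyond the high bound
    return prompts
```

In the system described above, a non-empty result would drive the directional arrows of motion indicator 88; once the list is empty, the lock-on loop of FIG. 8 takes over.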
  • Thus, in practicing the “lock-on” functionality mode of action of the present invention, if the tracking element 44, and thus distal tip 40, of the interventional medical device 18 is outside the three-dimensional imaging volume 68 of ultrasound probe 16, manual probe positioning prompts will be generated, in the form of motion indicator 88, which is present on ultrasound probe 16 and/or on graphical user interface 22 to prompt the user to move ultrasound probe 16 to the general location that contains the interventional medical device 18 having tracking element 44, such that tracking element 44 and distal tip 40 of interventional medical device 18 lies within the three-dimensional imaging volume 68 of ultrasound probe 16.
  • Once the user has placed ultrasound probe 16 over the general area to be visualized, location information from ultrasound probe 16 and interventional medical device 18 is further used to move the position of the active ultrasound transducer array 66 of ultrasound probe 16, which allows ultrasound imaging system 10 to converge on a two-dimensional ultrasound image slice that includes the underlying interventional medical device 18, even if ultrasound probe 16 is not placed directly over tracking element 44/distal tip 40 of interventional medical device 18.
  • The position of the active ultrasound transducer array 66 of ultrasound probe 16 is dynamically adjusted in near real time, limited by data acquisition and processing speed, which allows ultrasound imaging system 10 to adapt to small changes in position of ultrasound probe 16, the position of the tracking element 44 of interventional medical device 18, and/or the patient position, such that an ultrasound image of the underlying interventional medical device 18 is maintained within view of ultrasound probe 16.
  • If the interventional medical device 18 to be imaged moves outside of the possible three-dimensional imaging volume 68 beneath ultrasound probe 16, positioning prompts in the form of motion indicator 88 are again generated and used to prompt the user to move ultrasound probe 16 in a direction that allows ultrasound imaging system 10 to again converge on, and display, an ultrasound image of the underlying interventional medical device 18.
  • Ultrasound imaging system 10 also may be operated in a three-dimensional (3D) high resolution scan imaging mode, with reference to step S204 of FIG. 9.
  • In general, with further reference to FIG. 5A, in the three-dimensional (3D) high resolution imaging mode the ultrasound probe 16 is held in a fixed position over an area of interest, and the active ultrasound transducer array 66 is scanned along the Y-axis across all, or a selected portion, of the three-dimensional imaging volume 68 to take a detailed volumetric scan of the underlying area beneath head portion 54 of ultrasound probe 16. Ultrasound probe 16 may be held in the fixed position by the hand of the user. Metadata containing the position of each two-dimensional slice obtained in the high resolution mode is further used to identify images taken from the same point in space, and subsequently used for image integration processing.
  • More particularly, in the 3D high resolution imaging mode, processor circuit 24 of ultrasound console 14 is configured to execute program instructions to generate a scanning signal that is supplied to ultrasound transducer mechanism 62 to scan active ultrasound transducer array 66 over at least a portion of the three-dimensional imaging volume 68. The active ultrasound transducer array 66 is repeatedly actuated during the scan to generate a plurality, i.e., a series, of sequential two-dimensional ultrasound slices, which are stored in memory 24-2, and combined to form the 3D ultrasound volumetric data from which a three-dimensional (3D) high resolution ultrasound image is formed and displayed on display screen 28 of graphical user interface 22 (see also FIG. 2).
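Forming the 3D volumetric data from the sequence of slices might look like the following sketch, which assumes each stored slice carries y-position metadata; the data layout and function name are illustrative:

```python
import numpy as np

def assemble_volume(slices_with_y):
    """Order B-scan slices by their y-position metadata and stack them
    along the scan axis into a 3-D volume of shape (y, x, depth).
    Each element of slices_with_y is assumed to be (y_mm, slice_2d)."""
    ordered = sorted(slices_with_y, key=lambda s: s[0])
    return np.stack([img for _, img in ordered], axis=0)
```

Sorting by the position metadata means slices can be acquired and stored in any order during the scan and still assemble into a spatially consistent volume.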
  • The quality of the high resolution 3D images may be improved by generating a composite ultrasound image of the location of interest. Because the location of the ultrasound probe 16 is known to processor circuit 24, multiple sets of 2D or 3D ultrasound images of a particular location in the three-dimensional imaging volume 68 underlying, e.g., perpendicular to, the surface of the head portion 54 of ultrasound probe 16 may be taken and stored in non-transitory electronic memory 24-2, from which a composite (compound) ultrasound image may be generated by summing together the multiple sets of ultrasound images of the same location.
  • In particular, processor circuit 24 is configured to execute program instructions to operate the active ultrasound transducer array 66 to generate multiple sets of ultrasound image data that includes metadata corresponding to a particular location, i.e., metadata describing the location of the scan within the three-dimensional volume 68, and save the multiple sets in non-transitory electronic memory 24-2. Processor circuit 24 is further configured to execute program instructions to sum the multiple sets of ultrasound image data to generate composite (compound) ultrasound image data, which is then stored in non-transitory memory 24-2 and/or is displayed on display screen 28 of graphical user interface 22.
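The summing of multiple co-located image sets into a composite (compound) image can be sketched as below. This is a minimal illustration of pixel-wise compounding under the assumption that the frames are already registered to the same location; it is not the patented implementation:

```python
def compound(frames):
    """Average multiple co-located frames pixel-wise to form a composite
    image; summing then normalizing suppresses uncorrelated noise while
    preserving structure common to all frames."""
    n = len(frames)
    rows, cols = len(frames[0]), len(frames[0][0])
    return [[sum(f[r][c] for f in frames) / n for c in range(cols)]
            for r in range(rows)]

# Two frames of the same location, differing only in noise
a = [[10.0, 20.0], [30.0, 40.0]]
b = [[20.0, 20.0], [10.0, 40.0]]
composite = compound([a, b])
# composite == [[15.0, 20.0], [20.0, 40.0]]
```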
  • Referring also to FIG. 10, the quality of the high resolution 3D images also may be improved by tracking the position of the patient P in relation to the position of ultrasound probe 16 to reduce motion artifacts in the 3D images. A third EM tracking element 90 (i.e., a wired electrical tracking coil) is affixed to the patient, such as by an adhesive. Tracking element 90 is communicatively coupled to processor circuit 24 of ultrasound console 14 by a communication link 92, such as a wired or wireless connection. Tracking element 90, when energized by electromagnetic (EM) field generator 12, generates three-axis patient location data, which is supplied via communication link 92 to processor circuit 24. Processor circuit 24 processes the three-axis patient location data to further adjust the position of the active ultrasound transducer array 66 of ultrasound probe 16 in response to any motion of the patient. In other words, tracking element 90 allows the position of the patient to be known, which in turn allows ultrasound imaging system 10 to adjust the position of the active ultrasound transducer array 66 of ultrasound probe 16 to compensate for any motion by the patient.
  • Ultrasound imaging system 10 also may be operated to render and display one or more synthetic (user chosen) scan planes.
  • Referring also to FIG. 11, there is shown the graphical user interface 22 having a three-dimensional ultrasound image 94 and user controls 96 displayed on display screen 28. As described above, a plurality, i.e., a series, of sequential two-dimensional ultrasound slices may be generated and combined to generate 3D ultrasound volumetric data defining a three-dimensional imaging volume. Using the 3D ultrasound volumetric data acquired from ultrasound probe 16, the user may select for rendering and display one or more synthetic (user chosen) scan planes, such as a coronal scan plane 98 and an axial (sagittal) scan plane 100.
  • In particular, the user may define, using user controls 96, a desired synthetic plane orientation with respect to the 3D ultrasound volumetric data associated with three-dimensional ultrasound image 94. From the plane orientation inputs provided at user controls 96, processor circuit 24 of ultrasound imaging system 10 executes program instructions to identify within the 3D ultrasound volumetric data of three-dimensional ultrasound image 94 the image data associated with the desired synthetic plane orientation. The desired synthetic plane may pass through multiple two-dimensional image data slices in the 3D ultrasound volumetric data. Once the image data associated with the desired synthetic plane orientation within the 3D ultrasound volumetric data is identified, the desired one or more synthetic (user chosen) scan planes may be rendered and displayed on display screen 28 of graphical user interface 22 within the generated three-dimensional ultrasound image 94 as shown in FIG. 11, or as standalone two-dimensional images. These additional views may allow for further inspection of the underlying anatomy, beyond what is normally obtained via fluoroscopy, which in turn may result in improved clinical outcomes.
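Identifying the image data lying on a user-chosen plane that passes through multiple two-dimensional slices can be sketched as a resampling of the voxel volume. The following uses nearest-neighbor lookup for brevity (a real renderer would likely interpolate); the function names and voxel-index conventions are illustrative assumptions:

```python
def sample_plane(volume, origin, u, v, nu, nv):
    """Sample a synthetic (user-chosen) plane from a voxel volume.
    origin is a (z, y, x) voxel-space point; u and v are in-plane step
    vectors, which need not be axis-aligned, so the sampled plane may
    cut obliquely through many native B-scan slices."""
    nz, ny, nx = len(volume), len(volume[0]), len(volume[0][0])
    def at(z, y, x):
        z, y, x = int(round(z)), int(round(y)), int(round(x))
        if 0 <= z < nz and 0 <= y < ny and 0 <= x < nx:
            return volume[z][y][x]
        return 0.0  # outside the imaging volume
    return [[at(origin[0] + i * u[0] + j * v[0],
                origin[1] + i * u[1] + j * v[1],
                origin[2] + i * u[2] + j * v[2])
             for j in range(nv)] for i in range(nu)]

# Toy volume in which voxel (z, y, x) holds 100*z + 10*y + x
vol = [[[100 * z + 10 * y + x for x in range(3)]
        for y in range(3)] for z in range(3)]
# Sample the plane y = 1 (rows step through z, columns through x)
plane = sample_plane(vol, (0, 1, 0), (1, 0, 0), (0, 0, 1), 3, 3)
```

Because u and v are arbitrary vectors, the same routine serves coronal, sagittal, or oblique synthetic planes.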
  • Various views, such as those associated with the sagittal plane, the transverse plane, and the coronal plane, may be visualized, and a slice from one or more, or all, of the planes, as defined by the location of the tracked device(s), e.g., tracking element 44 of interventional medical device 18 and/or tracking element 64 of ultrasound probe 16, can be displayed, individually or as a group. It is also envisioned that scan planes that do not lie at 90 degrees from each other could be defined and selected by the user. Additionally, the user defined scan planes need not be planar, and may follow a curved path.
  • Another aspect of the present invention provides for a focusing of the three-dimensional imaging volume around a determined region of interest, i.e., the region around the location of tracking element 44 of interventional medical device 18, by reducing the scan extent along the Y-axis (see FIG. 5A), thus reducing the amount of three-dimensional ultrasound volumetric data required to adequately view the region surrounding interventional medical device 18. In other words, following an initial 3D ultrasound volumetric data scan, on a subsequent 3D ultrasound volumetric data scan centered on the determined region of interest, the scan extent of active ultrasound transducer array 66 along the Y-axis is reduced, i.e., focused, to the region of most interest, thus reducing scanning time and the amount of data required to adequately represent the three-dimensional volume of interest.
  • In particular, processor circuit 24 executes program instructions to determine a region of interest in the three-dimensional ultrasound volumetric data defining the three-dimensional imaging volume 68. Processor circuit 24 also executes program instructions to reduce the scan range of the active ultrasound transducer array 66 of the ultrasound transducer mechanism 62 along the Y-axis for acquisition of subsequent three-dimensional ultrasound volumetric data at the region of interest from that of the scan range of the previous scan, so as to reduce the amount of acquired three-dimensional ultrasound volumetric data from that of the prior scan.
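The Y-axis scan-range reduction around a region of interest can be sketched as a simple clamped window. The margin value below is an arbitrary illustrative parameter, not a value taken from the disclosure:

```python
def focused_scan_range(roi_y_mm, margin_mm, full_range_mm):
    """Return a reduced Y-axis scan window centered on the region of
    interest, clamped to the full mechanical scan range, so subsequent
    volume scans acquire less data in less time."""
    y_min, y_max = full_range_mm
    lo = max(y_min, roi_y_mm - margin_mm)
    hi = min(y_max, roi_y_mm + margin_mm)
    return lo, hi

# ROI at y = 20 mm with a 5 mm margin inside a 0-40 mm scan range
window = focused_scan_range(20.0, 5.0, (0.0, 40.0))   # (15.0, 25.0)
# ROI near the edge of the range is clamped
edge = focused_scan_range(2.0, 5.0, (0.0, 40.0))      # (0.0, 7.0)
```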
  • Referring to FIGS. 12 and 13, as another aspect of the present invention, user controls 96 of graphical user interface 22 may include one or more slice selection sliders 102, such as a coronal slider 102-1 and a sagittal slider 102-2, to provide a sequential variation from an automatically, or manually, selected two-dimensional ultrasound image slice being displayed.
  • Referring also to FIG. 5A, a plurality, i.e., a series, of sequential two-dimensional ultrasound B-scan imaging slices 67 may be generated and combined to generate 3D ultrasound volumetric data defining a three-dimensional imaging volume 68. As such, based on tracking of the location of tracking element 44 of interventional medical device 18 and tracking element 64 of ultrasound probe 16, a desired two-dimensional ultrasound image slice on a desired imaging plane may be generated from the 3D ultrasound volumetric data that includes a particular region of interest, such as distal tip 40 of interventional medical device 18. The desired two-dimensional ultrasound image slice may be in an imaging plane different from that of the native B-scan imaging plane of the sequential two-dimensional ultrasound imaging slices 67 that when combined form the 3D ultrasound volumetric data defining the three-dimensional imaging volume 68.
  • Thus, slice selection sliders 102 permit the user to select a slice in each of one or more imaging planes for display, if desired, wherein the selected two-dimensional ultrasound image slice may intersect, or lie on either side of, the two-dimensional ultrasound image slice that was automatically, or manually, selected. The slice selection sliders 102 are configured to provide a sequential parallel variation from the initially selected two-dimensional ultrasound image slice to manually select a second two-dimensional ultrasound image slice parallel to the initially selected two-dimensional ultrasound image, wherein the second two-dimensional ultrasound image slice lies on either side of the initially selected two-dimensional ultrasound image slice.
  • For example, FIG. 12 is a pictorial representation at graphical user interface 22 depicting a selection of a sagittal plane slice 104 extending through a series of two-dimensional ultrasound image slices 67 in the three-dimensional imaging volume 68 at sagittal slice location 270. By manipulation of sagittal slider 102-2 using one of the up-down arrows, sagittal slice location 271, or others 1-269 or 272-560, parallel to the sagittal slice location 270 may be selected for display. Likewise, FIG. 13 is a pictorial representation depicting a selection of a coronal plane slice 106 extending through a series of two-dimensional ultrasound image slices 67 in a three-dimensional imaging volume 68 at coronal slice location 150. By manipulation of coronal slider 102-1 using one of the up-down arrows, coronal slice location 151, or others 1-149 or 152-560, may be selected for display.
  • Referring to FIG. 14, there is shown a flowchart describing the generation of a 3D ultrasound image as a set of three orthogonal ultrasound images.
  • At step S300, ultrasound imaging system 10 is initialized for rendering a 3D ultrasound image as a set of three orthogonal images, such as setting up processor circuit 24 and graphical user interface 22 for construction of 3D models.
  • At step S302, “WHILE” defines the entry into a continuous loop for generation and updating of the displayed 3D ultrasound image.
  • At step S304, an ultrasound (US) volume transform node is updated based on the position of ultrasound probe 16, as determined at step S106 of FIG. 8. In particular, processor circuit 24 executes program instructions to move the 3D model of the three-dimensional imaging volume 68 to match the current position of ultrasound probe 16.
  • At step S306, using the calculated OFFSET from step S110 of FIG. 8, and the 3D image data acquisition as described at step S204 of FIG. 9, processor circuit 24 executes program instructions to choose a two-dimensional ultrasound imaging slice 67 (B-scan) from a C-scan data slice that includes the tracking element 44, and in turn the distal tip 40, of interventional medical device 18.
  • At step S308, processor circuit 24 executes program instructions to generate 3D display data representative of three orthogonal images in a virtual 3D environment associated with the three-dimensional imaging volume 68 matched to the current position of ultrasound probe 16. Processor circuit 24 sends the 3D display data to user interface 22 for display on display screen 28 as three orthogonal images that include the tracking element 44, and in turn the distal tip 40, of interventional medical device 18.
  • Thereafter, the process returns to step S302, “WHILE”, to continue updating the displayed 3D ultrasound image.
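The extraction of three orthogonal images through the tracked device tip (steps S304 through S308) can be sketched as follows; the plane naming and index conventions are illustrative assumptions rather than the system's actual data layout:

```python
def orthogonal_slices(volume, tip_index):
    """Extract the three orthogonal slices of a voxel volume that pass
    through the voxel index of the tracked device tip, per the loop of
    FIG. 14: one slice per fixed axis."""
    z, y, x = tip_index
    axial = [row[:] for row in volume[z]]                        # fixed z
    coronal = [plane[y][:] for plane in volume]                  # fixed y
    sagittal = [[row[x] for row in plane] for plane in volume]   # fixed x
    return axial, coronal, sagittal

# Toy volume in which voxel (z, y, x) holds 100*z + 10*y + x
vol = [[[100 * z + 10 * y + x for x in range(3)]
        for y in range(3)] for z in range(3)]
a, c, s = orthogonal_slices(vol, (1, 2, 0))
```

In the loop of FIG. 14, the tip index would be recomputed from the tracking data on every iteration before the three slices are redisplayed.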
  • Referring now to FIGS. 15A, 15B, 15C and 16, there is described below a patient oriented imaging window mode. In the past, that which was rendered as “up” on the ultrasound display screen followed the orientation of the ultrasound probe. However, in this aspect of the present invention, the orientation of the displayed ultrasound image is true to the orientation of the patient, regardless of the actual orientation of the ultrasound probe.
  • FIG. 15A shows a diagrammatic illustration of ultrasound probe 16 taking a two-dimensional ultrasound imaging slice 67 of a portion of a leg L of a patient. For purposes of comparison, note the location and orientation of the upper blood vessel 107-1, and the lower-left blood vessel 107-2, in relation to the orientation of the leg L of a patient P.
  • FIG. 15B is a diagrammatic illustration of graphical user interface 22 having a patient oriented imaging window 108 depicting a patient oriented virtual environment on display screen 28, wherein the acquired ultrasound image data is rendered on display screen 28 in a location and orientation corresponding to the orientation of the patient P, such that the location and orientation of where the ultrasound image is being acquired relative to the patient P is communicated to the clinician via the virtual environment. In particular, FIG. 15B shows patient oriented imaging window 108 including an image of leg L, rendered either as an actual image of patient leg L or as a computer generated virtual rendering, and including a virtual rendering of ultrasound probe 16 and the two-dimensional ultrasound imaging slice 67 generated by ultrasound probe 16. Also shown is a secondary imaging window 110 including a computer generated virtual rendering, i.e., a graphic, of the orientation of the body of patient P, as well as an UP arrow indicating the "up" orientation relative to the patient.
  • Referring also to FIG. 15C, since the orientation of ultrasound probe 16 is known to ultrasound imaging system 10, as described above, the display of the ultrasound image on display screen 28 of graphical user interface 22 may be adjusted such that a vertical “top” 67-1 of the acquired ultrasound image data of two-dimensional ultrasound imaging slice 67, or the vertical top of the acquired volumetric data in 3D data acquisition, is always rendered as “UP” on display screen 28 relative to the position of the patient P, and regardless of the actual orientation of ultrasound probe 16 relative to the patient. In other words, even if the actual orientation of ultrasound probe 16 is changed relative to the position of the leg L from that depicted in FIG. 15B, such as the head of ultrasound probe 16 pointing downward, the orientation of the ultrasound image on display screen 28 of graphical user interface 22 remains as depicted in FIG. 15C. Thus, as viewed in display screen 28, features of the displayed image, such as the upper blood vessel 107-1, and the lower-left blood vessel 107-2, are always displayed in the correct orientation relative to the patient P.
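Keeping the displayed image "up" relative to the patient amounts to counter-rotating the acquired image by the tracked probe orientation. A two-dimensional sketch of that correction is below; the single-angle model is a simplification (the tracked probe pose is actually a full 3D orientation), and the names are illustrative:

```python
import math

def to_patient_frame(points, probe_angle_rad):
    """Counter-rotate image-space points by the tracked probe angle so
    that the rendered image stays 'up' relative to the patient,
    whatever the actual orientation of the probe."""
    c, s = math.cos(-probe_angle_rad), math.sin(-probe_angle_rad)
    return [(x * c - y * s, x * s + y * c) for x, y in points]

# With the probe rotated 90 degrees, a feature at (1, 0) in probe
# coordinates maps back to approximately (0, -1) in the patient frame
pts = to_patient_frame([(1.0, 0.0)], math.pi / 2)
```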
  • In comparison, FIG. 15D depicts the ultrasound image generated in FIG. 15A as it would be rendered in accordance with the prior art, wherein the orientation of the acquired ultrasound image data rendered on the display screen does not correspond to the orientation of the patient. This is because in the prior art, the image is rendered with the ultrasound probe head in a virtual position at the top of the display screen, and the bottom of the display screen always corresponds to the distal extent of the generated ultrasound image. More particularly, with the ultrasound probe oriented as depicted in FIGS. 15A and 15B, the prior art rendering would position the upper blood vessel 107-1 and the lower-left blood vessel 107-2 on the display screen as shown in FIG. 15D (i.e., rotated 90 degrees from that depicted in FIG. 15C), and as such, the displayed image no longer corresponds to the orientation of the patient P. Rather, as shown in FIG. 15D, using arrow 112 to designate the true "up" orientation, the prior art ultrasound image is actually rendered facing toward the left on the display screen. Accordingly, in the prior art, the ultrasound technician was required to mentally associate the orientation of the displayed image with the actual orientation of the patient.
  • Advantageously, the patient oriented imaging window aspect of the present invention described above with respect to FIGS. 15A, 15B and 15C, generates a virtual environment that aids a clinician, including a person not experienced in ultrasound imaging, in successful image acquisition.
  • More particularly, FIG. 16 is a flowchart of a patient oriented imaging window mode, i.e., a virtual environment imaging mode, associated with the generation of the patient oriented imaging window as depicted above with respect to FIGS. 15A, 15B and 15C.
  • At step S400, ultrasound imaging system 10 is initialized for rendering a 3D ultrasound image, such as setting up processor circuit 24 and graphical user interface 22 for construction of 3D models, initializing a camera video data transfer, and configuring appropriate patient lighting for video.
  • At step S402, "WHILE" defines the entry into a continuous loop for generation and updating of the displayed patient oriented imaging window 108 as depicted in FIGS. 15B and 15C.
  • At step S404, an ultrasound (US) volume transform node is updated based on the position of ultrasound probe 16, as determined at step S106 of FIG. 8. In particular, processor circuit 24 executes program instructions to move the 3D model of the three-dimensional imaging volume 68 (see FIG. 5A) to match the current position of ultrasound probe 16.
  • At step S406, an ultrasound (US) image transform node is updated based on the calculated OFFSET from step S110 of FIG. 8. In particular, processor circuit 24 executes program instructions to update the ultrasound image transform node by moving a 3D model of the three-dimensional ultrasound imaging data to match the current two-dimensional ultrasound imaging slice 67 (B-scan) acquired from ultrasound probe 16.
  • At step S408, based on 2D and/or 3D image data acquisition as described at step S204 of FIG. 9, processor circuit 24 executes program instructions to display the two-dimensional ultrasound imaging slice 67 (B-scan) in a 3D environment in the patient oriented imaging window 108, such that the vertical "top" 67-1 of the acquired ultrasound image data of two-dimensional ultrasound imaging slice 67, or the vertical top of the acquired volumetric data in 3D data acquisition, is always rendered as "up" on display screen 28 relative to the position of the patient, regardless of the actual orientation of ultrasound probe 16 relative to the patient.
  • Thereafter, the process returns to step S402, "WHILE", to continue updating the patient oriented imaging window 108.
  • As an additional aspect, since the offset distance (z-axis) between the ultrasound probe 16 and the interventional medical device 18 can be calculated using Equations 1 and 2 (see steps S108 and S110, discussed above), this offset, or depth information, can further be used to dynamically control some of the ultrasound imaging settings in near real time, as identified below. This allows the system to optimize the image quality settings such that the best image of the interventional medical device 18 is displayed to the user at display screen 28. Because the z-axis offset from the ultrasound probe 16 can be calculated, the ultrasound imaging settings that may be dynamically controlled include:
  • 1) Ultrasound focus: using the z-axis offset between the ultrasound probe 16 and the interventional medical device 18, the focus can be automatically adjusted to the depth that contains the interventional medical device 18, such that the lateral resolution is optimized at that depth.
  • 2) Depth setting: because the z-axis offset from the ultrasound probe 16 can be calculated, the depth setting can be dynamically controlled such that the depth of imaging is automatically adjusted to match the depth of the interventional medical device 18.
  • 3) Zoom: because the z-axis offset from the ultrasound probe 16 can be calculated, the imaging window can be "zoomed" such that a larger view of the area of interest may be automatically displayed to the user.
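The three offset-driven settings above can be gathered into one short sketch. The depth margin of 1.25 and the 30 mm zoom window are arbitrary illustrative constants, not values from the disclosure:

```python
def auto_imaging_settings(z_offset_mm, depth_margin=1.25, zoom_window_mm=30.0):
    """Derive focus, depth, and zoom from the calculated z-axis offset
    between the probe and the interventional device: focus at the device
    depth, imaging depth just past it, and zoom filling the view."""
    focus_mm = z_offset_mm                 # 1) focus at the device depth
    depth_mm = z_offset_mm * depth_margin  # 2) imaging depth past the device
    zoom = max(1.0, depth_mm / zoom_window_mm)  # 3) never zoom below 1x
    return {"focus_mm": focus_mm, "depth_mm": depth_mm, "zoom": zoom}

settings = auto_imaging_settings(40.0)
# settings["focus_mm"] == 40.0, settings["depth_mm"] == 50.0
```

As the device moves, recomputing the offset per Equations 1 and 2 and re-running this mapping is what makes the control "near real time."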
  • While this invention has been described with respect to at least one embodiment, the present invention can be further modified within the spirit and scope of this disclosure. This application is therefore intended to cover any variations, uses, or adaptations of the invention using its general principles. Further, this application is intended to cover such departures from the present disclosure as come within known or customary practice in the art to which this invention pertains and which fall within the limits of the appended claims.

Claims (23)

1-23. (canceled)
24. A method of operating an ultrasound imaging system, comprising:
acquiring a position of a first tracking element associated with an interventional medical device;
acquiring a position of a second tracking element associated with an ultrasound probe;
determining an ultrasound imaging plane position of the ultrasound probe based on the position of the second tracking element;
determining an offset distance between the position of the first tracking element of the interventional medical device and the ultrasound imaging plane position; and
driving an ultrasound transducer mechanism to position an active ultrasound transducer array of the ultrasound probe at a determined point of convergence as defined by the offset distance.
25. The method of claim 24, the first tracking element being associated with a distal tip of the interventional medical device, the method comprising:
determining whether the distal tip of the interventional medical device is presently located outside a three-dimensional imaging volume defined by the ultrasound probe; and
generating a visual prompt to prompt the user to move a head portion of the ultrasound probe in a particular direction to a general location such that the distal tip of the interventional medical device resides in the three-dimensional imaging volume of the ultrasound probe.
26. The method of claim 24, wherein the ultrasound transducer mechanism includes:
a motion unit having a carriage arranged to perform linear movement and configured to effect rotational-to-linear translation conversion;
a one-dimensional ultrasound transducer array as the active ultrasound transducer array, the one-dimensional ultrasound transducer array being connected to the carriage for movement in unison with the carriage; and
the motion unit including a stepper motor having a rotatable shaft rotated by a rotational amount corresponding to the offset distance, the rotatable shaft being drivably coupled to the carriage, wherein the carriage converts a rotation of the rotatable shaft of the stepper motor to a translation of the one-dimensional ultrasound transducer array to position the one-dimensional ultrasound transducer array at the determined point of convergence.
27. The method of claim 24, wherein the transducer mechanism includes a two-dimensional ultrasound transducer array having a plurality of columns and a plurality of rows of ultrasound transducer elements arranged in a matrix pattern, wherein one row of the plurality of rows of discrete ultrasound transducer elements is selected as the active ultrasound transducer array based on the offset distance to position the active ultrasound transducer array at the point of convergence.
28. The method of claim 24, wherein the interventional medical device is one of a catheter, a lesion crossing catheter, a guide wire, a sheath, an angioplasty balloon, a stent delivery catheter, and a needle.
29. The method of claim 24, comprising:
scanning the active ultrasound transducer array over at least a portion of the three-dimensional imaging volume; and
repeatedly actuating the active ultrasound transducer array during the scanning to generate a plurality of sequential two-dimensional ultrasound data slices which are combined to form three-dimensional ultrasound volumetric data from which a three-dimensional ultrasound image is generated.
30. The method of claim 24, comprising:
operating the active ultrasound transducer array to generate multiple sets of ultrasound image data corresponding to a particular location; and
summing the multiple sets of ultrasound image data to generate composite ultrasound image data.
31. The method of claim 24, comprising:
attaching a third tracking element to a patient;
energizing the third tracking element with an EM field to generate six-axis patient location data; and
adjusting a position of the active ultrasound transducer array of the ultrasound transducer mechanism of the ultrasound probe in response to any motion of the patient.
32. The method of claim 29, comprising:
defining a desired image plane in the three-dimensional ultrasound volumetric data; and
generating at least one synthetic scan plane corresponding to the desired image plane.
33. The method of claim 32, wherein the desired image plane is one of a coronal scan plane and an axial scan plane.
34. The method of claim 29, comprising:
determining a region of interest in the three-dimensional ultrasound volumetric data defining the three-dimensional imaging volume; and
reducing the scan range of the active ultrasound transducer array of the ultrasound transducer mechanism for acquisition of subsequent three-dimensional ultrasound volumetric data at the region of interest from that of the scan range of the previous scan.
35. The method of claim 29, comprising:
generating a first two-dimensional ultrasound image slice from a series of two-dimensional ultrasound image slices in the three-dimensional ultrasound volumetric data, the first two-dimensional ultrasound image slice including a particular region of interest, the first two-dimensional ultrasound image slice lying in a first imaging plane different from that of the imaging plane of the series of two-dimensional ultrasound image slices; and
providing at least one slice selection slider configured to provide a sequential parallel variation from the first two-dimensional ultrasound image slice to manually select a second two-dimensional ultrasound image slice parallel to the first two-dimensional ultrasound image, wherein the second two-dimensional ultrasound image slice lies on either side of the first two-dimensional ultrasound image slice.
36. The method of claim 35, wherein the particular region of interest includes the distal tip of the interventional medical device.
37. The method of claim 35, wherein the at least one slice selection slider includes a sagittal slice selection slider and the first imaging plane is a sagittal plane.
38. The method of claim 35, wherein the at least one slice selection slider includes a coronal slice selection slider and the first imaging plane is a coronal plane.
39. The method of claim 24, comprising adjusting an orientation of the ultrasound image that is displayed on a display screen such that a vertical top of acquired ultrasound image data is always rendered as “up” on the display screen relative to the position of the patient, and regardless of the actual orientation of the ultrasound probe relative to the patient.
40. The method of claim 39, wherein the ultrasound image is a two-dimensional ultrasound image.
41. The method of claim 39, wherein the ultrasound image is a three-dimensional ultrasound image.
42. A method of operating an ultrasound imaging system, comprising:
acquiring a position of a first tracking element associated with an interventional medical device;
acquiring a position of a second tracking element associated with an ultrasound probe;
determining an ultrasound imaging plane position of the ultrasound probe based on the position of the second tracking element;
determining an offset distance between the position of the first tracking element of the interventional medical device and the ultrasound plane position; and
using the offset distance to dynamically control at least one ultrasound imaging setting of the ultrasound imaging system in near real time.
43. The method of claim 42, wherein the at least one ultrasound imaging setting includes ultrasound focus, such that a lateral resolution is optimized at a depth that contains the interventional medical device.
44. The method of claim 42, wherein the at least one ultrasound imaging setting includes a depth setting, such that a depth of imaging is automatically adjusted to match a depth of the interventional medical device.
45. The method of claim 42, wherein the at least one ultrasound imaging setting includes zoom, wherein an imaging window can be “zoomed” such that a larger view of an area of interest is automatically displayed to the user.
US17/131,073 2014-11-18 2020-12-22 Ultrasound imaging system having automatic image presentation Pending US20210106310A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US17/131,073 US20210106310A1 (en) 2014-11-18 2020-12-22 Ultrasound imaging system having automatic image presentation

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US201462081275P 2014-11-18 2014-11-18
PCT/US2015/018068 WO2016081023A1 (en) 2014-11-18 2015-02-27 Ultrasound imaging system having automatic image presentation
US201715525307A 2017-05-09 2017-05-09
US17/131,073 US20210106310A1 (en) 2014-11-18 2020-12-22 Ultrasound imaging system having automatic image presentation

Related Parent Applications (2)

Application Number Title Priority Date Filing Date
US15/525,307 Division US10905396B2 (en) 2014-11-18 2015-02-27 Ultrasound imaging system having automatic image presentation
PCT/US2015/018068 Division WO2016081023A1 (en) 2014-11-18 2015-02-27 Ultrasound imaging system having automatic image presentation

Publications (1)

Publication Number Publication Date
US20210106310A1 true US20210106310A1 (en) 2021-04-15

Family

ID=52633725

Family Applications (2)

Application Number Title Priority Date Filing Date
US15/525,307 Active 2036-02-11 US10905396B2 (en) 2014-11-18 2015-02-27 Ultrasound imaging system having automatic image presentation
US17/131,073 Pending US20210106310A1 (en) 2014-11-18 2020-12-22 Ultrasound imaging system having automatic image presentation

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US15/525,307 Active 2036-02-11 US10905396B2 (en) 2014-11-18 2015-02-27 Ultrasound imaging system having automatic image presentation

Country Status (4)

Country Link
US (2) US10905396B2 (en)
EP (1) EP3220828B1 (en)
CN (1) CN106999146B (en)
WO (1) WO2016081023A1 (en)

Families Citing this family (32)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2016081321A2 (en) 2014-11-18 2016-05-26 C.R. Bard, Inc. Ultrasound imaging system having automatic image presentation
US10905396B2 (en) 2014-11-18 2021-02-02 C. R. Bard, Inc. Ultrasound imaging system having automatic image presentation
US20180153504A1 (en) * 2015-06-08 2018-06-07 The Board Of Trustees Of The Leland Stanford Junior University 3d ultrasound imaging, associated methods, devices, and systems
US20170203053A1 (en) * 2016-01-19 2017-07-20 Joseph Choate Burkett Visual-Assisted Insertion Device
KR20170093632A (en) * 2016-02-05 2017-08-16 삼성전자주식회사 Electronic device and operating method thereof
CA3022430A1 (en) * 2016-04-27 2017-11-02 Neux Technologies, Inc. Electrotherapeutic treatment
WO2017202795A1 (en) * 2016-05-23 2017-11-30 Koninklijke Philips N.V. Correcting probe induced deformation in an ultrasound fusing imaging system
CN107518917A (en) * 2017-08-21 2017-12-29 慈溪立声科技有限公司 A kind of imaging device of deformable hand-held three dimensional ultrasound probe and the application probe
US11911144B2 (en) 2017-08-22 2024-02-27 C. R. Bard, Inc. Ultrasound imaging system and interventional medical device for use therewith
WO2019040045A1 (en) * 2017-08-22 2019-02-28 C.R. Bard, Inc. Ultrasound imaging probe for use in an ultrasound imaging system
CA3077311A1 (en) * 2017-09-29 2019-04-04 C. R. Bard, Inc. Apparatus and method for tracking a medical ultrasonic object
EP3482691A1 (en) * 2017-11-14 2019-05-15 Koninklijke Philips N.V. Ice catheter with multiple transducer arrays
CN111655156A (en) * 2017-12-19 2020-09-11 皇家飞利浦有限公司 Combining image-based and inertial probe tracking
WO2019162422A1 (en) * 2018-02-22 2019-08-29 Koninklijke Philips N.V. Interventional medical device tracking
EP3574841A1 (en) * 2018-05-28 2019-12-04 Koninklijke Philips N.V. Ultrasound probe positioning system
CN112312840A (en) * 2018-06-22 2021-02-02 皇家飞利浦有限公司 Intravascular ultrasound location identification
US10685439B2 (en) * 2018-06-27 2020-06-16 General Electric Company Imaging system and method providing scalable resolution in multi-dimensional image data
US20210251697A1 (en) * 2018-08-22 2021-08-19 Koninklijke Philips N.V. 3d tracking of interventional medical devices
US10966750B2 (en) 2018-08-29 2021-04-06 Andrew J. Butki Needle assembly with reverberation feature to facilitate ultrasound guidance of the needle assembly
US11278312B2 (en) 2018-08-29 2022-03-22 Andrew J. Butki Introducer assembly with reverberation feature to facilitate ultrasound guidance
US20200113542A1 (en) * 2018-10-16 2020-04-16 General Electric Company Methods and system for detecting medical imaging scan planes using probe position feedback
CN109701168B (en) * 2018-12-27 2022-03-18 成植温 Gamma radiation tumor treatment system
CN111789630B (en) * 2019-04-08 2023-06-20 中慧医学成像有限公司 Ultrasonic probe three-dimensional space information measuring device
EP3968861B1 (en) * 2019-05-17 2022-11-09 Koninklijke Philips N.V. Ultrasound system and method for tracking movement of an object
EP4025132A4 (en) 2019-09-20 2023-10-04 Bard Access Systems, Inc. Automatic vessel detection tools and methods
CN111436937A (en) * 2020-03-16 2020-07-24 北京东软医疗设备有限公司 Catheter/guide wire tracking method and device and scanning equipment
CN216135922U (en) * 2020-09-08 2022-03-29 巴德阿克塞斯系统股份有限公司 Dynamically adjustable ultrasound imaging system
CN112155595B (en) * 2020-10-10 2023-07-07 达闼机器人股份有限公司 Ultrasonic diagnostic apparatus, ultrasonic probe, image generation method, and storage medium
US20220346881A1 (en) * 2021-04-30 2022-11-03 Manikantan Shanmugham System and device to provide umbilical catheter tracking navigation and confirmation
CN113425325A (en) * 2021-06-24 2021-09-24 北京理工大学 Preoperative liver three-dimensional ultrasonic splicing system and method
CN113855071B (en) * 2021-09-28 2023-05-30 核工业总医院 Improved ultrasonic diagnostic apparatus and method of displaying ultrasonic image
CN116019486A (en) * 2021-10-25 2023-04-28 巴德阿克塞斯系统股份有限公司 High fidelity Doppler ultrasound with relative orientation using vessel detection

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130195313A1 (en) * 2010-03-19 2013-08-01 Koninklijke Philips Electronics N.V. Automatic positioning of imaging plane in ultrasonic imaging
US10905396B2 (en) * 2014-11-18 2021-02-02 C. R. Bard, Inc. Ultrasound imaging system having automatic image presentation

Family Cites Families (288)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB1484699A (en) 1973-10-15 1977-09-01 Tokyo Shibaura Electric Co Ultrasonic wave diagnosis apparatus
EP0038055A3 (en) 1980-04-16 1982-08-18 Georg Fischer Aktiengesellschaft Ultrasonic test method
US4483344A (en) 1980-12-30 1984-11-20 Atkov Oleg J Device for positioning cardiographic sensor
US4431007A (en) 1981-02-04 1984-02-14 General Electric Company Referenced real-time ultrasonic image display
US4669482A (en) 1985-10-28 1987-06-02 Board Of Regents, The University Of Texas System Pulse echo method and apparatus for sound velocity estimation in vivo
US4821731A (en) 1986-04-25 1989-04-18 Intra-Sonix, Inc. Acoustic image system and method
US4796632A (en) 1986-08-11 1989-01-10 General Electric Company Standoff adapter for ultrasound probe
US4920966A (en) 1986-10-02 1990-05-01 Hon Edward H Ultrasound transducer holder
US4831601A (en) 1986-10-31 1989-05-16 Siemens Aktiengesellschaft Apparatus for transmitting and receiving ultrasonic signals
FR2626773A1 (en) 1988-02-05 1989-08-11 Puy Philippe ECHOGRAPHIC PROBE SUPPORT, IN PARTICULAR ECHOCARDIOGRAPHIC PROBE
US5372138A (en) 1988-03-21 1994-12-13 Boston Scientific Corporation Acousting imaging catheters and the like
US4974593A (en) 1989-11-24 1990-12-04 Ng Raymond C Holder apparatus for transducer applicable to human body
CA2032204C (en) 1989-12-14 1995-03-14 Takashi Mochizuki Three-dimensional ultrasonic scanner
DE9004824U1 (en) 1990-04-27 1990-08-02 Hewlett-Packard Gmbh, 7030 Boeblingen, De
JPH0773576B2 (en) 1992-05-27 1995-08-09 アロカ株式会社 Ultrasonic probe for 3D data acquisition
US6757557B1 (en) 1992-08-14 2004-06-29 British Telecommunications Position location system
US5622174A (en) 1992-10-02 1997-04-22 Kabushiki Kaisha Toshiba Ultrasonic diagnosis apparatus and image displaying system
US5335663A (en) * 1992-12-11 1994-08-09 Tetrad Corporation Laparoscopic probes and probe sheaths useful in ultrasonic imaging applications
US5381794A (en) 1993-01-21 1995-01-17 Aloka Co., Ltd. Ultrasonic probe apparatus
IL116699A (en) 1996-01-08 2001-09-13 Biosense Ltd Method of constructing cardiac map
US5615680A (en) 1994-07-22 1997-04-01 Kabushiki Kaisha Toshiba Method of imaging in ultrasound diagnosis and diagnostic ultrasound system
US5503152A (en) 1994-09-28 1996-04-02 Tetrad Corporation Ultrasonic transducer assembly and method for three-dimensional imaging
US6690963B2 (en) 1995-01-24 2004-02-10 Biosense, Inc. System for determining the location and orientation of an invasive medical instrument
US5626554A (en) 1995-02-21 1997-05-06 Exogen, Inc. Gel containment structure
US5592939A (en) 1995-06-14 1997-01-14 Martinelli; Michael A. Method and system for navigating a catheter probe
US6048323A (en) 1995-10-02 2000-04-11 Hon; Edward H. Transducer support plate and tocodynamometer attachment system
US5598845A (en) 1995-11-16 1997-02-04 Stellartech Research Corporation Ultrasound transducer device for continuous imaging of the heart and other body parts
US5769843A (en) 1996-02-20 1998-06-23 Cormedica Percutaneous endomyocardial revascularization
EP0883860B1 (en) 1996-02-29 2006-08-23 Acuson Corporation Multiple ultrasound image registration system, method and transducer
US5669385A (en) 1996-03-13 1997-09-23 Advanced Technology Laboratories, Inc. Ultrasonic scanning of tissue motion in three dimensions
US5727553A (en) 1996-03-25 1998-03-17 Saad; Saad A. Catheter with integral electromagnetic location identification device
US5860929A (en) 1996-06-07 1999-01-19 The Regents Of The University Of Michigan Fractional moving blood volume estimation with power doppler ultrasound
US6443974B1 (en) 1996-07-28 2002-09-03 Biosense, Inc. Electromagnetic cardiac biostimulation
US6490474B1 (en) 1997-08-01 2002-12-03 Cardiac Pathways Corporation System and method for electrode localization using ultrasound
US6248074B1 (en) 1997-09-30 2001-06-19 Olympus Optical Co., Ltd. Ultrasonic diagnosis system in which periphery of magnetic sensor included in distal part of ultrasonic endoscope is made of non-conductive material
US6081738A (en) 1998-01-15 2000-06-27 Lumend, Inc. Method and apparatus for the guided bypass of coronary occlusions
US6505062B1 (en) 1998-02-09 2003-01-07 Stereotaxis, Inc. Method for locating magnetic implant by source field
JP3218216B2 (en) 1998-03-20 2001-10-15 アロカ株式会社 3D image processing device
US6241675B1 (en) 1998-06-09 2001-06-05 Volumetrics Medical Imaging Methods and systems for determining velocity of tissue using three dimensional ultrasound data
AUPP431898A0 (en) 1998-06-24 1998-07-16 Northern Cardiac Sonography Pty Ltd Ultrasonic cardiac output monitor
US7806829B2 (en) 1998-06-30 2010-10-05 St. Jude Medical, Atrial Fibrillation Division, Inc. System and method for navigating an ultrasound catheter to image a beating heart
JP2003524443A (en) 1998-08-02 2003-08-19 スーパー ディメンション リミテッド Medical guidance device
US6132378A (en) 1998-08-10 2000-10-17 Marino; Sharon Cover for ultrasound probe
US6261231B1 (en) 1998-09-22 2001-07-17 Dupont Pharmaceuticals Company Hands-free ultrasound probe holder
US6080108A (en) 1998-11-17 2000-06-27 Atl Ultrasound, Inc. Scanning aid for quantified three dimensional ultrasonic diagnostic imaging
US6312380B1 (en) 1998-12-23 2001-11-06 Radi Medical Systems Ab Method and sensor for wireless measurement of physiological variables
US7194294B2 (en) 1999-01-06 2007-03-20 Scimed Life Systems, Inc. Multi-functional medical catheter and methods of use
US6911026B1 (en) 1999-07-12 2005-06-28 Stereotaxis, Inc. Magnetically guided atherectomy
DE19922056A1 (en) 1999-05-14 2000-11-23 Heinz Lehr Medical instrument for internal examinations using ultrasonic or electromagnetic transducers, has drive connected to transducer at remote end of instrument so that it can be rotated
US7778688B2 (en) 1999-05-18 2010-08-17 MediGuide, Ltd. System and method for delivering a stent to a selected position within a lumen
US20040015079A1 (en) 1999-06-22 2004-01-22 Teratech Corporation Ultrasound probe with integrated electronics
US6527718B1 (en) 1999-08-20 2003-03-04 Brian G Connor Ultrasound system for continuous imaging and delivery of an encapsulated agent
US6464642B1 (en) 1999-08-20 2002-10-15 Kabushiki Kaisha Toshiba Ultrasonic diagnosis apparatus
US6423006B1 (en) 2000-01-21 2002-07-23 Siemens Medical Solutions Usa, Inc. Method and apparatus for automatic vessel tracking in ultrasound systems
US6413218B1 (en) 2000-02-10 2002-07-02 Acuson Corporation Medical diagnostic ultrasound imaging system and method for determining an acoustic output parameter of a transmitted ultrasonic beam
US6716166B2 (en) 2000-08-18 2004-04-06 Biosense, Inc. Three-dimensional reconstruction using ultrasound
US6524303B1 (en) 2000-09-08 2003-02-25 Stereotaxis, Inc. Variable stiffness magnetic catheter
US7210223B2 (en) 2000-12-13 2007-05-01 Image-Guided Neurologics, Inc. Method of manufacturing a microcoil construction
US8214015B2 (en) 2001-02-06 2012-07-03 Medtronic Vascular, Inc. In vivo localization and tracking of tissue penetrating catheters using magnetic resonance imaging
DE10115341A1 (en) 2001-03-28 2002-10-02 Philips Corp Intellectual Pty Method and imaging ultrasound system for determining the position of a catheter
US6685644B2 (en) 2001-04-24 2004-02-03 Kabushiki Kaisha Toshiba Ultrasound diagnostic apparatus
US6773402B2 (en) 2001-07-10 2004-08-10 Biosense, Inc. Location sensing with real-time ultrasound imaging
US6895267B2 (en) 2001-10-24 2005-05-17 Scimed Life Systems, Inc. Systems and methods for guiding and locating functional elements on medical devices positioned in a body
US6735465B2 (en) 2001-10-24 2004-05-11 Scimed Life Systems, Inc. Systems and processes for refining a registered map of a body cavity
JP2003180697A (en) 2001-12-18 2003-07-02 Olympus Optical Co Ltd Ultrasonic diagnostic equipment
US20050131289A1 (en) 2002-01-08 2005-06-16 Bio Scan Ltd Ultrasonic transducer probe
DE10203371A1 (en) 2002-01-29 2003-08-07 Siemens Ag Intravascular catheter with magnetic component in tip, allows magnetic field generated to be varied after introducing catheter into patient
DE10203372A1 (en) 2002-01-29 2003-09-04 Siemens Ag Medical examination and / or treatment system
US6755789B2 (en) 2002-02-05 2004-06-29 Inceptio Medical Technologies, Llc Ultrasonic vascular imaging system and method of blood vessel cannulation
US7806828B2 (en) 2002-02-05 2010-10-05 Inceptio Medical Technologies, Lc Multiplanar ultrasonic vascular sensor assembly and apparatus for movably affixing a sensor assembly to a body
KR100437974B1 (en) 2002-05-11 2004-07-02 주식회사 메디슨 Three-dimensional ultrasound imaging method and apparatus using lateral distance correlation function
CA2487140C (en) 2002-05-29 2011-09-20 Surgi-Vision, Inc. Magnetic resonance probes
JP4233808B2 (en) 2002-06-04 2009-03-04 株式会社日立メディコ Ultrasonic diagnostic equipment
US7520857B2 (en) 2002-06-07 2009-04-21 Verathon Inc. 3D ultrasound-based instrument for non-invasive measurement of amniotic fluid volume
US7477763B2 (en) 2002-06-18 2009-01-13 Boston Scientific Scimed, Inc. Computer generated representation of the imaging pattern of an imaging device
US6957098B1 (en) 2002-06-27 2005-10-18 Advanced Cardiovascular Systems, Inc. Markers for interventional devices in magnetic resonant image (MRI) systems
US7769427B2 (en) 2002-07-16 2010-08-03 Magnetics, Inc. Apparatus and method for catheter guidance control and imaging
JP4202697B2 (en) 2002-08-12 2008-12-24 株式会社東芝 Ultrasonic diagnostic apparatus, ultrasonic image display apparatus, and ultrasonic image display method
US7493154B2 (en) 2002-10-23 2009-02-17 Medtronic, Inc. Methods and apparatus for locating body vessels and occlusions in body vessels
US7881769B2 (en) 2002-11-18 2011-02-01 Mediguide Ltd. Method and system for mounting an MPS sensor on a catheter
US20040097803A1 (en) 2002-11-20 2004-05-20 Dorin Panescu 3-D catheter localization using permanent magnets with asymmetrical properties about their longitudinal axis
AU2003295968A1 (en) 2002-11-27 2004-06-23 Wang Shih-Ping Volumetric ultrasound scanning of smaller-sized breast
US8591417B2 (en) 2003-05-20 2013-11-26 Panasonic Corporation Ultrasonic diagnostic apparatus
US7090639B2 (en) 2003-05-29 2006-08-15 Biosense, Inc. Ultrasound catheter calibration system
US8335555B2 (en) 2003-05-30 2012-12-18 Lawrence Livermore National Security, Llc Radial reflection diffraction tomography
WO2004109495A1 (en) 2003-06-10 2004-12-16 Koninklijke Philips Electronics, N.V. System and method for annotating an ultrasound image
US6951543B2 (en) 2003-06-24 2005-10-04 Koninklijke Philips Electronics N.V. Automatic setup system and method for ultrasound imaging systems
US20050027195A1 (en) 2003-08-01 2005-02-03 Assaf Govari Calibration data compression
US7803116B2 (en) 2003-10-03 2010-09-28 University of Washington through its Center for Commericalization Transcutaneous localization of arterial bleeding by two-dimensional ultrasonic imaging of tissue vibrations
US7951081B2 (en) 2003-10-20 2011-05-31 Boston Scientific Scimed, Inc. Transducer/sensor assembly
US20050085718A1 (en) 2003-10-21 2005-04-21 Ramin Shahidi Systems and methods for intraoperative targetting
JP4594610B2 (en) 2003-10-21 2010-12-08 株式会社東芝 Ultrasonic image processing apparatus and ultrasonic diagnostic apparatus
DE10354496B4 (en) 2003-11-21 2011-03-31 Siemens Ag Medical examination and / or treatment system
US7727151B2 (en) 2003-11-28 2010-06-01 U-Systems Inc. Navigation among multiple breast ultrasound volumes
US7081093B2 (en) 2003-12-05 2006-07-25 Vermon Array transducer for 3D tilting probes
US20050154308A1 (en) 2003-12-30 2005-07-14 Liposonix, Inc. Disposable transducer seal
US20080021297A1 (en) 2004-02-10 2008-01-24 Koninklijke Philips Electronic, N.V. Method,a System for Generating a Spatial Roadmap for an Interventional Device and Quality Control System for Guarding the Spatial Accuracy Thereof
DE102004008371B4 (en) 2004-02-20 2006-05-24 Siemens Ag atherectomy
DE102004015640B4 (en) 2004-03-31 2007-05-03 Siemens Ag Apparatus for performing a cutting-balloon intervention with OCT monitoring
US8900149B2 (en) 2004-04-02 2014-12-02 Teratech Corporation Wall motion analyzer
US20060282081A1 (en) 2004-04-16 2006-12-14 Fanton Gary S Apparatus and method for securing tissue to bone with a suture
US7197354B2 (en) 2004-06-21 2007-03-27 Mediguide Ltd. System for determining the position and orientation of a catheter
US20060004291A1 (en) 2004-06-22 2006-01-05 Andreas Heimdal Methods and apparatus for visualization of quantitative data on a model
US7433504B2 (en) 2004-08-27 2008-10-07 General Electric Company User interactive method for indicating a region of interest
JP2008511367A (en) 2004-08-30 2008-04-17 コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ Method and apparatus for adjustable trace of flow velocity in Doppler velocity spectrum
US8303507B2 (en) 2004-09-07 2012-11-06 Kabushiki Kaisha Toshiba Ultrasonic doppler diagnostic apparatus and measuring method of diagnostic parameter
WO2006031765A2 (en) 2004-09-13 2006-03-23 University Of California Fluoroscopy-free guidewire systems and methods
WO2006042047A1 (en) 2004-10-07 2006-04-20 Flea Street Translational, Llc Methods, systems and devices for establishing communication between hollow organs and tissue lumens
US7713210B2 (en) 2004-11-23 2010-05-11 St. Jude Medical, Atrial Fibrillation Division, Inc. Method and apparatus for localizing an ultrasound catheter
CN100574707C (en) 2004-11-24 2009-12-30 株式会社日立医药 Ultrasonic imaging apparatus
DE102004058008B4 (en) 2004-12-01 2007-08-23 Siemens Ag Guidewire for vascular catheter with improved tracking and navigation
WO2006073088A1 (en) 2005-01-04 2006-07-13 Hitachi Medical Corporation Ultrasonographic device, ultrasonographic program, and ultrasonographic method
US20060184029A1 (en) 2005-01-13 2006-08-17 Ronen Haim Ultrasound guiding system and method for vascular access and operation mode
KR100562886B1 (en) 2005-03-24 2006-03-22 주식회사 프로소닉 Ultrasonic probe for 4 dimensional image
CA2603495A1 (en) 2005-04-01 2006-10-12 Visualsonics Inc. System and method for 3-d visualization of vascular structures using ultrasound
GB0508250D0 (en) 2005-04-23 2005-06-01 Smith & Nephew Composition
US10143398B2 (en) 2005-04-26 2018-12-04 Biosense Webster, Inc. Registration of ultrasound data with pre-acquired image
US7517318B2 (en) 2005-04-26 2009-04-14 Biosense Webster, Inc. Registration of electro-anatomical map with pre-acquired image using ultrasound
US20060247522A1 (en) 2005-04-28 2006-11-02 Boston Scientific Scimed, Inc. Magnetic navigation systems with dynamic mechanically manipulatable catheters
US8212554B2 (en) 2005-05-11 2012-07-03 The University Of Houston System Intraluminal magneto sensor system and method of use
DE102005027951A1 (en) 2005-06-16 2007-01-04 Siemens Ag Medical system for introducing a catheter into a vessel
DE102005028226A1 (en) 2005-06-17 2006-12-28 Siemens Ag Device for controlling movement of catheter in patient's body, has control device coupled with joystick that guides magnetic tip of catheter in patient's body, when catheter approaches obstacle in patient's body
JP4920302B2 (en) 2005-06-20 2012-04-18 株式会社東芝 Ultrasonic diagnostic apparatus and ultrasonic measurement method
US9314222B2 (en) 2005-07-07 2016-04-19 Stereotaxis, Inc. Operation of a remote medical navigation system using ultrasound image
US7536218B2 (en) 2005-07-15 2009-05-19 Biosense Webster, Inc. Hybrid magnetic-based and impedance-based position sensing
CN100445488C (en) 2005-08-01 2008-12-24 邱则有 Hollow member for cast-in-situ concrete moulding
KR20080031929A (en) 2005-08-04 2008-04-11 코닌클리케 필립스 일렉트로닉스 엔.브이. System and method for magnetic tracking of a sensor for interventional device localization
US20070038113A1 (en) 2005-08-11 2007-02-15 Kabushiki Kaisha Toshiba Puncture adaptor, ultrasonic probe for puncture, ultrasonic diagnostic apparatus for puncture, method for detecting angle of puncture needle
US7740584B2 (en) 2005-08-16 2010-06-22 The General Electric Company Method and system for mapping physiology information onto ultrasound-based anatomic structure
US7981038B2 (en) 2005-10-11 2011-07-19 Carnegie Mellon University Sensor guided catheter navigation system
CN101291629B (en) 2005-10-19 2010-12-01 株式会社日立医药 Ultrasonic diagnosis device
US8167805B2 (en) 2005-10-20 2012-05-01 Kona Medical, Inc. Systems and methods for ultrasound applicator station keeping
US7766833B2 (en) 2005-11-23 2010-08-03 General Electric Company Ablation array having independently activated ablation elements
DE102005059261B4 (en) 2005-12-12 2013-09-05 Siemens Aktiengesellschaft Catheter device for the treatment of a partial and / or complete vascular occlusion and X-ray device
JP2007175431A (en) 2005-12-28 2007-07-12 Olympus Medical Systems Corp Ultrasonograph
US20070239020A1 (en) * 2006-01-19 2007-10-11 Kazuhiro Iinuma Ultrasonography apparatus
US7677078B2 (en) 2006-02-02 2010-03-16 Siemens Medical Solutions Usa, Inc. Line-based calibration of ultrasound transducer integrated with a pose sensor
US20070238979A1 (en) 2006-03-23 2007-10-11 Medtronic Vascular, Inc. Reference Devices for Placement in Heart Structures for Visualization During Heart Valve Procedures
US9612142B2 (en) 2006-04-27 2017-04-04 General Electric Company Method and system for measuring flow through a heart valve
WO2008115188A2 (en) 2006-05-08 2008-09-25 C. R. Bard, Inc. User interface and methods for sonographic display device
KR20090010995A (en) 2006-05-26 2009-01-30 코닌클리케 필립스 일렉트로닉스 엔.브이. Improved calibration method for catheter tracking system using medical imaging data
JP4745133B2 (en) 2006-05-30 2011-08-10 株式会社東芝 Ultrasonic diagnostic apparatus, medical image processing apparatus, and medical image processing program
JP5019562B2 (en) 2006-06-01 2012-09-05 株式会社東芝 Ultrasonic diagnostic apparatus and diagnostic program for the apparatus
US7961924B2 (en) 2006-08-21 2011-06-14 Stereotaxis, Inc. Method of three-dimensional device localization using single-plane imaging
US8652086B2 (en) 2006-09-08 2014-02-18 Abbott Medical Optics Inc. Systems and methods for power and flow rate control
WO2008042423A2 (en) 2006-10-02 2008-04-10 Hansen Medical, Inc. Systems for three-dimensional ultrasound mapping
US9241683B2 (en) 2006-10-04 2016-01-26 Ardent Sound Inc. Ultrasound system and method for imaging and/or measuring displacement of moving tissue and fluid
US8211025B2 (en) 2006-10-20 2012-07-03 General Electric Company Four-way steerable catheter system and method of use
US8388546B2 (en) 2006-10-23 2013-03-05 Bard Access Systems, Inc. Method of locating the tip of a central venous catheter
US7794407B2 (en) 2006-10-23 2010-09-14 Bard Access Systems, Inc. Method of locating the tip of a central venous catheter
EP2079368B1 (en) 2006-10-26 2011-07-13 Cardiogal Ltd. Non-invasive cardiac parameter measurement
US7831076B2 (en) 2006-12-08 2010-11-09 Biosense Webster, Inc. Coloring electroanatomical maps to indicate ultrasound data acquisition
EP2104460A2 (en) 2006-12-29 2009-09-30 Vascure Ltd. Atherectomy methods and apparatus
CN100581479C (en) * 2007-01-10 2010-01-20 华中科技大学 Method for reestablishing three-D ultrasonic image
US20120046553A9 (en) 2007-01-18 2012-02-23 General Electric Company Ultrasound catheter housing with electromagnetic shielding properties and methods of manufacture
US7735349B2 (en) 2007-01-31 2010-06-15 Biosense Websters, Inc. Correlation of ultrasound images and gated position measurements
US7996057B2 (en) 2007-01-31 2011-08-09 Biosense Webster, Inc. Ultrasound catheter calibration with enhanced accuracy
JP4911540B2 (en) 2007-03-05 2012-04-04 国立大学法人山口大学 Ultrasonic diagnostic equipment
US20080249395A1 (en) 2007-04-06 2008-10-09 Yehoshua Shachar Method and apparatus for controlling catheter positioning and orientation
EP2139392B1 (en) 2007-04-26 2014-02-26 Koninklijke Philips N.V. Localization system
US9757036B2 (en) 2007-05-08 2017-09-12 Mediguide Ltd. Method for producing an electrophysiological map of the heart
US8527032B2 (en) 2007-05-16 2013-09-03 General Electric Company Imaging system and method of delivery of an instrument to an imaged subject
US9055883B2 (en) 2007-05-16 2015-06-16 General Electric Company Surgical navigation system with a trackable ultrasound catheter
US8428690B2 (en) 2007-05-16 2013-04-23 General Electric Company Intracardiac echocardiography image reconstruction in combination with position tracking system
CA2687876A1 (en) 2007-05-23 2009-03-05 Oscillon Ltd. Apparatus and method for guided chronic total occlusion penetration
US9173638B2 (en) 2007-06-04 2015-11-03 Biosense Webster, Inc. Cardiac mechanical assessment using ultrasound
JP4934513B2 (en) 2007-06-08 2012-05-16 株式会社日立メディコ Ultrasonic imaging device
US8172757B2 (en) 2007-06-18 2012-05-08 Sunnybrook Health Sciences Centre Methods and devices for image-guided manipulation or sensing or anatomic structures
US7854237B2 (en) 2007-06-28 2010-12-21 Nancy Beck Irland Fetal monitoring transducer aligning device
EP2166936A4 (en) 2007-07-03 2010-07-28 Irvine Biomedical Inc Magnetically guided catheter with flexible tip
JP2009018115A (en) 2007-07-13 2009-01-29 Toshiba Corp Three-dimensional ultrasonograph
JP5394620B2 (en) 2007-07-23 2014-01-22 ジーイー・メディカル・システムズ・グローバル・テクノロジー・カンパニー・エルエルシー Ultrasonic imaging apparatus and image processing apparatus
DE102008031146B4 (en) 2007-10-05 2012-05-31 Siemens Aktiengesellschaft Device for navigating a catheter through a closure region of a vessel
US20090105579A1 (en) 2007-10-19 2009-04-23 Garibaldi Jeffrey M Method and apparatus for remotely controlled navigation using diagnostically enhanced intra-operative three-dimensional image data
US9439624B2 (en) 2007-10-19 2016-09-13 Metritrack, Inc. Three dimensional mapping display system for diagnostic ultrasound machines and method
JP5449175B2 (en) 2007-10-26 2014-03-19 コーニンクレッカ フィリップス エヌ ヴェ Closed-loop registration control for multi-modality soft tissue imaging
EP2205991B1 (en) 2007-10-29 2018-08-29 Koninklijke Philips N.V. Ultrasound assembly including multiple imaging transducer arrays
US20090118620A1 (en) 2007-11-06 2009-05-07 General Electric Company System and method for tracking an ultrasound catheter
US8034075B2 (en) 2007-11-09 2011-10-11 Micrus Endovascular Corporation Tethered coil for treatment of body lumens
KR101132531B1 (en) 2007-11-14 2012-04-03 삼성메디슨 주식회사 Ultrasound diagnostic device having transducers facing each other
EP2992825B1 (en) 2007-11-26 2017-11-01 C.R. Bard Inc. Integrated system for intravascular placement of a catheter
US8781555B2 (en) 2007-11-26 2014-07-15 C. R. Bard, Inc. System for placement of a catheter including a signal-generating stylet
US9649048B2 (en) 2007-11-26 2017-05-16 C. R. Bard, Inc. Systems and methods for breaching a sterile field for intravascular placement of a catheter
US9456766B2 (en) 2007-11-26 2016-10-04 C. R. Bard, Inc. Apparatus for use with needle insertion guidance system
US9521961B2 (en) 2007-11-26 2016-12-20 C. R. Bard, Inc. Systems and methods for guiding a medical instrument
US8320711B2 (en) 2007-12-05 2012-11-27 Biosense Webster, Inc. Anatomical modeling from a 3-D image and a surface mapping
US8175679B2 (en) 2007-12-26 2012-05-08 St. Jude Medical, Atrial Fibrillation Division, Inc. Catheter electrode that can simultaneously emit electrical energy and facilitate visualization by magnetic resonance imaging
US20130237826A1 (en) 2012-03-12 2013-09-12 Arcscan, Inc. Precision ultrasonic scanner for body parts with extended imaging depth
US20090192385A1 (en) 2008-01-25 2009-07-30 Oliver Meissner Method and system for virtual roadmap imaging
US8478382B2 (en) 2008-02-11 2013-07-02 C. R. Bard, Inc. Systems and methods for positioning a catheter
US8352015B2 (en) 2008-05-27 2013-01-08 Kyma Medical Technologies, Ltd. Location tracking of a metallic object in a living body using a radar detector and guiding an ultrasound probe to direct ultrasound waves at the location
US20100016726A1 (en) 2008-07-18 2010-01-21 Meier Joseph H Handheld Imaging Device And Method For Manufacture Thereof
WO2010024168A1 (en) 2008-08-29 2010-03-04 株式会社 日立メディコ Ultrasonic diagnosing device
US8414495B2 (en) * 2008-09-10 2013-04-09 General Electric Company Ultrasound patch probe with micro-motor
US8086298B2 (en) 2008-09-29 2011-12-27 Civco Medical Instruments Co., Inc. EM tracking systems for use with ultrasound and other imaging modalities
WO2010042776A2 (en) 2008-10-10 2010-04-15 Vasostar, Inc. Medical device with a guidewire for penetrating occlusions
WO2010044385A1 (en) 2008-10-14 2010-04-22 株式会社 日立メディコ Ultrasonographic device and ultrasonographic display method
DE102008054297A1 (en) 2008-11-03 2010-05-06 Siemens Aktiengesellschaft A catheter assembly for insertion into a blood vessel, medical examination and treatment device comprising such a catheter assembly and method for minimally invasive intervention on a blood vessel in the brain
WO2010052868A1 (en) 2008-11-10 2010-05-14 株式会社日立メディコ Ultrasonic image processing method and device, and ultrasonic image processing program
JPWO2010055816A1 (en) 2008-11-14 2012-04-12 株式会社日立メディコ Ultrasound diagnostic apparatus and standard image data generation method for ultrasonic diagnostic apparatus
JP5147656B2 (en) 2008-11-20 2013-02-20 キヤノン株式会社 Image processing apparatus, image processing method, program, and storage medium
US20100160781A1 (en) 2008-12-09 2010-06-24 University Of Washington Doppler and image guided device for negative feedback phased array hifu treatment of vascularized lesions
CN101564304B (en) * 2009-01-19 2013-08-21 北京汇影互联科技有限公司 Method and equipment for standardized and precise ultrasound scanning
US20100191101A1 (en) 2009-01-23 2010-07-29 Yoav Lichtenstein Catheter with isolation between ultrasound transducer and position sensor
FR2942338B1 (en) 2009-02-13 2011-08-26 Univ Paris Descartes ECHOGRAPHIC BROWSER
JP5537171B2 (en) 2009-02-27 2014-07-02 株式会社東芝 Ultrasonic imaging apparatus, image processing apparatus, image processing method, and image processing program
US8298149B2 (en) 2009-03-31 2012-10-30 Boston Scientific Scimed, Inc. Systems and methods for making and using a motor distally-positioned within a catheter of an intravascular ultrasound imaging system
WO2010117025A1 (en) 2009-04-10 2010-10-14 株式会社 日立メディコ Ultrasonic diagnosis apparatus and method for constructing distribution image of blood flow dynamic state
DK200900527A (en) 2009-04-24 2010-10-25 Region Nordjylland Aalborg Syg Device for holding an imaging probe and use of such device
US8945147B2 (en) 2009-04-27 2015-02-03 Smith & Nephew, Inc. System and method for identifying a landmark
CN102421373B (en) 2009-05-20 2014-07-16 株式会社日立医疗器械 Medical image diagnosis device and region-of-interest setting method therefor
KR101121286B1 (en) 2009-07-31 2012-03-23 한국과학기술원 Ultrasound system and method for performing calibration of sensor
US9216299B2 (en) 2009-10-21 2015-12-22 Thomas J. Wolfe Electromagnetic pathologic lesion treatment system and method
US20110125022A1 (en) * 2009-11-25 2011-05-26 Siemens Medical Solutions Usa, Inc. Synchronization for multi-directional ultrasound scanning
US9445780B2 (en) 2009-12-04 2016-09-20 University Of Virginia Patent Foundation Tracked ultrasound vessel imaging
KR101188593B1 (en) 2009-12-15 2012-10-05 삼성메디슨 주식회사 Ultrasound system and method for providing a plurality of three-dimensional ultrasound images
US20110196238A1 (en) 2010-02-05 2011-08-11 Jacobson Nathan A System and Method for Fetal Heart Monitoring Using Ultrasound
CA2733621C (en) 2010-03-10 2017-10-10 Northern Digital Inc. Multi-field magnetic tracking
WO2011117788A1 (en) 2010-03-23 2011-09-29 Koninklijke Philips Electronics N.V. Volumetric ultrasound image data reformatted as an image plane sequence
JP5762076B2 (en) 2010-03-30 2015-08-12 株式会社東芝 Ultrasonic diagnostic apparatus, ultrasonic image processing apparatus, and medical image diagnostic apparatus
US20110255762A1 (en) 2010-04-15 2011-10-20 Harald Deischinger Method and system for determining a region of interest in ultrasound data
CN102869308B (en) * 2010-05-03 2015-04-29 皇家飞利浦电子股份有限公司 Apparatus and method for ultrasonic tracking of ultrasound transducer(s) aboard an interventional tool
US8439840B1 (en) 2010-05-04 2013-05-14 Sonosite, Inc. Ultrasound imaging system and method with automatic adjustment and/or multiple sample volumes
JP5234671B2 (en) 2010-05-19 2013-07-10 ジーイー・メディカル・システムズ・グローバル・テクノロジー・カンパニー・エルエルシー Ultrasonic diagnostic equipment
EP4122385A1 (en) 2010-05-28 2023-01-25 C. R. Bard, Inc. Insertion guidance system for needles and medical components
US20110301460A1 (en) 2010-06-04 2011-12-08 Doris Nkiruka Anite Self-administered medical ultrasonic imaging systems
WO2012020758A1 (en) 2010-08-11 2012-02-16 株式会社東芝 Medical image diagnosis device, image-processing device and method
US20120071894A1 (en) 2010-09-17 2012-03-22 Tanner Neal A Robotic medical systems and methods
US8801693B2 (en) 2010-10-29 2014-08-12 C. R. Bard, Inc. Bioimpedance-assisted placement of a medical device
US9289187B2 (en) 2010-12-10 2016-03-22 B-K Medical Aps Imaging transducer probe
US9308041B2 (en) 2010-12-22 2016-04-12 Biosense Webster (Israel) Ltd. Lasso catheter with rotating ultrasound transducer
US20120165671A1 (en) 2010-12-27 2012-06-28 Hill Anthony D Identification of objects in ultrasound
US9107607B2 (en) 2011-01-07 2015-08-18 General Electric Company Method and system for measuring dimensions in volumetric ultrasound data
US9993304B2 (en) * 2011-01-13 2018-06-12 Koninklijke Philips N.V. Visualization of catheter of three-dimensional ultrasound
US9700280B2 (en) 2011-01-31 2017-07-11 Sunnybrook Health Sciences Centre Ultrasonic probe with ultrasonic transducers addressable on common electrical channel
JP5759197B2 (en) 2011-02-09 2015-08-05 Canon Inc. Information processing apparatus and information processing method
US20120245457A1 (en) 2011-03-25 2012-09-27 Crowley Robert J Ultrasound imaging catheters and guidewires with non-interfering and coordinated position and orientation sensors
JP6005905B2 (en) 2011-04-06 2016-10-12 Toshiba Medical Systems Corporation Image processing system, image processing apparatus, and image processing method
US20120259209A1 (en) 2011-04-11 2012-10-11 Harhen Edward P Ultrasound guided positioning of cardiac replacement valves
US10918307B2 (en) 2011-09-13 2021-02-16 St. Jude Medical, Atrial Fibrillation Division, Inc. Catheter navigation using impedance and magnetic field measurements
WO2012143885A2 (en) * 2011-04-21 2012-10-26 Koninklijke Philips Electronics N.V. Mpr slice selection for visualization of catheter in three-dimensional ultrasound
DE102011018954B4 (en) 2011-04-29 2017-12-14 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Ultrasonic test head and method for non-destructive testing of a flat test specimen
US20120289830A1 (en) 2011-05-10 2012-11-15 General Electric Company Method and ultrasound imaging system for image-guided procedures
US20120289836A1 (en) 2011-05-12 2012-11-15 Osamu Ukimura Automatic real-time display system for the orientation and location of an ultrasound tomogram in a three-dimensional organ model
WO2012157585A1 (en) 2011-05-19 2012-11-22 Toshiba Corporation Medical image diagnostic apparatus and image-processing apparatus
US20120310093A1 (en) 2011-06-06 2012-12-06 Fujifilm Corporation Ultrasound image producing method and ultrasound image diagnostic apparatus
JP5788229B2 (en) 2011-06-06 2015-09-30 Toshiba Corporation Ultrasonic diagnostic equipment
JP6053766B2 (en) * 2011-06-13 2016-12-27 Koninklijke Philips N.V. 3D needle localization using a 2D imaging probe
JP2013017577A (en) 2011-07-08 2013-01-31 Toshiba Corp Image processing system, device, method, and medical image diagnostic device
US20130018264A1 (en) 2011-07-15 2013-01-17 General Electric Company Method and system for ultrasound imaging
JP6054089B2 (en) 2011-08-19 2016-12-27 Toshiba Medical Systems Corporation Ultrasonic diagnostic apparatus, medical image processing apparatus, and medical image processing program
US10791950B2 (en) 2011-09-30 2020-10-06 Biosense Webster (Israel) Ltd. In-vivo calibration of contact force-sensing catheters using auto zero zones
US8859718B2 (en) 2011-10-21 2014-10-14 Solvay Usa, Inc. Synthesis of conjugated polymers via oxidative polymerization and related compositions
WO2013065310A1 (en) 2011-11-02 2013-05-10 Panasonic Corporation Ultrasound probe
US11109835B2 (en) * 2011-12-18 2021-09-07 Metritrack Llc Three dimensional mapping display system for diagnostic ultrasound machines
JP2013132354A (en) 2011-12-26 2013-07-08 GE Medical Systems Global Technology Co., LLC Ultrasonic diagnosis apparatus and control program of the same
KR101364527B1 (en) 2011-12-27 2014-02-19 Samsung Medison Co., Ltd. Ultrasound system and method for providing motion profile information of target object
KR101406807B1 (en) 2011-12-28 2014-06-12 Samsung Medison Co., Ltd. Ultrasound system and method for providing user interface
KR101323330B1 (en) 2011-12-28 2013-10-29 Samsung Medison Co., Ltd. Ultrasound system and method for providing vector doppler image based on decision data
KR101348771B1 (en) 2011-12-28 2014-01-07 Samsung Medison Co., Ltd. Ultrasound system and method for estimating motion of particle based on vector doppler
KR101368750B1 (en) 2012-01-16 2014-02-28 Samsung Medison Co., Ltd. Method and apparatus for providing multi spectral doppler images
US9295449B2 (en) 2012-01-23 2016-03-29 Ultrasonix Medical Corporation Landmarks for ultrasound imaging
WO2013116240A1 (en) 2012-01-30 2013-08-08 Inneroptic Technology, Inc. Multiple medical device guidance
US20130197365A1 (en) 2012-01-31 2013-08-01 Toshiba Medical Systems Corporation Ultrasonic diagnostic apparatus and ultrasonic diagnostic apparatus control method
WO2013116807A1 (en) 2012-02-03 2013-08-08 Los Alamos National Security, Llc Systems and methods for synthetic aperture ultrasound tomography
EP2827777A4 (en) 2012-03-23 2015-12-16 Ultrasound Medical Devices Inc Method and system for acquiring and analyzing multiple image data loops
US20130289411A1 (en) 2012-04-26 2013-10-31 dBMEDx INC Apparatus to removably secure an ultrasound probe to tissue
WO2013163605A1 (en) 2012-04-26 2013-10-31 Dbmedx Inc. Ultrasound apparatus and methods to monitor bodily vessels
KR101516992B1 (en) 2012-05-03 2015-05-04 Samsung Medison Co., Ltd. Apparatus and method for displaying ultrasound image
US20130296691A1 (en) 2012-05-04 2013-11-07 Ascension Technology Corporation Magnetically tracked surgical needle assembly
US20130303886A1 (en) 2012-05-09 2013-11-14 Doron Moshe Ludwin Locating a catheter sheath end point
US10588543B2 (en) 2012-05-23 2020-03-17 Biosense Webster (Israel), Ltd. Position sensing using electric dipole fields
KR101501518B1 (en) 2012-06-11 2015-03-11 Samsung Medison Co., Ltd. The method and apparatus for displaying a two-dimensional image and a three-dimensional image
US9474465B2 (en) 2012-06-27 2016-10-25 Ascension Technology Corporation System and method for magnetic position tracking
US9572549B2 (en) 2012-08-10 2017-02-21 Maui Imaging, Inc. Calibration of multiple aperture ultrasound probes
KR20140024190A (en) 2012-08-20 2014-02-28 Samsung Medison Co., Ltd. Method for managing and displaying ultrasound image, and apparatus thereto
KR20140046754A (en) 2012-10-11 2014-04-21 Samsung Medison Co., Ltd. Ultrasound system and method for automatically activating ultrasound probe based on motion of ultrasound probe
US9375163B2 (en) 2012-11-28 2016-06-28 Biosense Webster (Israel) Ltd. Location sensing using a local coordinate system
US8994366B2 (en) * 2012-12-12 2015-03-31 Ascension Technology Corporation Magnetically tracked sensor
US9204820B2 (en) 2012-12-31 2015-12-08 Biosense Webster (Israel) Ltd. Catheter with combined position and pressure sensing structures
US20140187950A1 (en) * 2012-12-31 2014-07-03 General Electric Company Ultrasound imaging system and method
JP2015008777A (en) 2013-06-27 2015-01-19 GE Medical Systems Global Technology Company, LLC Ultrasonic diagnostic apparatus and control program for the same
US10098565B2 (en) 2013-09-06 2018-10-16 Covidien Lp System and method for lung visualization using ultrasound
US10772489B2 (en) 2014-07-09 2020-09-15 Acclarent, Inc. Guidewire navigation for sinuplasty
CN104083178A (en) * 2014-07-22 2014-10-08 Shantou Institute of Ultrasonic Instruments Co., Ltd. Ultrasonic automatic scanning and checking device for mammary gland
WO2016081321A2 (en) 2014-11-18 2016-05-26 C.R. Bard, Inc. Ultrasound imaging system having automatic image presentation
US10675006B2 (en) 2015-05-15 2020-06-09 Siemens Medical Solutions Usa, Inc. Registration for multi-modality medical imaging fusion with narrow field of view
JP6662578B2 (en) 2015-05-18 2020-03-11 Canon Medical Systems Corporation Ultrasonic probe and ultrasonic diagnostic device

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130195313A1 (en) * 2010-03-19 2013-08-01 Koninklijke Philips Electronics N.V. Automatic positioning of imaging plane in ultrasonic imaging
US10905396B2 (en) * 2014-11-18 2021-02-02 C. R. Bard, Inc. Ultrasound imaging system having automatic image presentation

Also Published As

Publication number Publication date
EP3220828B1 (en) 2021-12-22
CN106999146A (en) 2017-08-01
CN106999146B (en) 2020-11-10
US10905396B2 (en) 2021-02-02
EP3220828A1 (en) 2017-09-27
US20180296185A1 (en) 2018-10-18
WO2016081023A1 (en) 2016-05-26

Similar Documents

Publication Publication Date Title
US20210106310A1 (en) Ultrasound imaging system having automatic image presentation
US11696746B2 (en) Ultrasound imaging system having automatic image presentation
JP5230589B2 (en) Ultrasonic device, ultrasonic imaging program, and ultrasonic imaging method
JP5417609B2 (en) Medical diagnostic imaging equipment
US20100262008A1 (en) Robotic ultrasound system with microadjustment and positioning control using feedback responsive to acquired image data
KR20170125360A (en) A method and apparatus for using a physical object to manipulate corresponding virtual objects in a virtual environment
CN111655160A (en) Three-dimensional imaging and modeling of ultrasound image data
CN104994792B (en) Ultrasonic diagnostic device and medical image processing device
US20140171800A1 (en) Ultrasound diagnostic device and ultrasound image display method
CN217907826U (en) Medical analysis system
JP5784388B2 (en) Medical manipulator system
US8467850B2 (en) System and method to determine the position of a medical instrument
US20200367856A1 (en) Ultrasound imaging probe for use in an ultrasound imaging system
US20120172701A1 (en) Apparatus and Method for Acquiring Diagnostic Information
US20220270247A1 (en) Apparatus for moving a medical object and method for providing a control instruction
US20190388061A1 (en) Ultrasound diagnosis apparatus displaying shear wave data for object and method for operating same
CN209847368U (en) Diagnosis and treatment integrated surgical robot system
CN112752545A (en) Ultrasound system and method for shear wave elastography of anisotropic tissue
CN117582288A (en) Spatially aware medical device configured to perform insertion path approximation

Legal Events

Date Code Title Description
STPP Information on status: patent application and granting procedure in general

Free format text: APPLICATION DISPATCHED FROM PREEXAM, NOT YET DOCKETED

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

AS Assignment

Owner name: BARD PERIPHERAL VASCULAR, INC., NEW JERSEY

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:STARFISH PRODUCT ENGINEERING INC.;REEL/FRAME:061709/0205

Effective date: 20150227

Owner name: STARFISH PRODUCT ENGINEERING INC., CANADA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:ADDISON, DEAN M.;MATTHEWS, BRYAN A.;REEL/FRAME:061709/0082

Effective date: 20150227

Owner name: BARD PERIPHERAL VASCULAR, INC., NEW JERSEY

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:COX, JEREMY B.;RANDALL, MICHAEL A.;ZHENG, PENG;SIGNING DATES FROM 20150226 TO 20150227;REEL/FRAME:061708/0774

Owner name: C.R. BARD, INC., NEW JERSEY

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:BARD PERIPHERAL VASCULAR, INC.;REEL/FRAME:061709/0257

Effective date: 20150227

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: ADVISORY ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: FINAL REJECTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: ADVISORY ACTION MAILED