
Method and system for generating an image from high and low frequency sound waves

Info

Publication number
WO2006124192A2
Authority
WO
Grant status
Application
Patent type
Prior art keywords
probe
lfs
sensor
asa
array
Prior art date
Application number
PCT/US2006/015001
Other languages
French (fr)
Other versions
WO2006124192A3 (en)
Inventor
Thomas James Royston
Todd William Spohnholtz
Francis Loth
Original Assignee
The Board Of Trustees Of The University Of Illinois
Priority date
Filing date
Publication date

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/13 Tomography
    • A61B 8/14 Echo-tomography
    • A61B 8/44 Constructional features of the ultrasonic, sonic or infrasonic diagnostic device
    • A61B 8/4416 Constructional features related to combined acquisition of different diagnostic modalities, e.g. combination of ultrasound and X-ray acquisitions
    • A61B 8/4444 Constructional features related to the probe
    • A61B 8/4472 Wireless probes
    • A61B 8/4483 Constructional features characterised by features of the ultrasound transducer
    • A61B 8/48 Diagnostic techniques
    • A61B 8/485 Diagnostic techniques involving measuring strain or elastic properties
    • A61B 8/56 Details of data transmission or power supply
    • A61B 8/565 Details of data transmission or power supply involving data transmission via a network
    • A61B 8/58 Testing, adjusting or calibrating the diagnostic device
    • A61B 2562/00 Details of sensors; Constructional details of sensor housings or probes; Accessories for sensors
    • A61B 2562/02 Details of sensors specially adapted for in-vivo measurements
    • A61B 2562/0204 Acoustic sensors
    • A61B 2562/04 Arrangements of multiple sensors of the same type
    • A61B 2562/046 Arrangements of multiple sensors of the same type in a matrix array
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S 15/00 Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
    • G01S 15/88 Sonar systems specially adapted for specific applications
    • G01S 15/89 Sonar systems specially adapted for mapping or imaging
    • G01S 15/8906 Short-range imaging systems; Acoustic microscope systems using pulse-echo techniques
    • G01S 15/8909 Short-range imaging systems using a static transducer configuration
    • G01S 15/8915 Short-range imaging systems using a transducer array
    • G01S 15/8925 Short-range imaging systems in which the array is a two-dimensional transducer configuration, i.e. matrix or orthogonal linear arrays
    • G01S 15/895 Short-range imaging systems characterised by the transmitted frequency spectrum
    • G01S 15/8952 Short-range imaging systems using discrete, multiple frequencies
    • G01S 7/00 Details of systems according to groups G01S 13/00, G01S 15/00, G01S 17/00
    • G01S 7/52 Details of systems according to group G01S 15/00
    • G01S 7/52017 Details of systems particularly adapted to short-range imaging
    • G01S 7/52053 Display arrangements
    • G01S 7/52057 Cathode ray tube displays
    • G01S 7/52074 Composite displays, e.g. split-screen displays; combination of multiple images or of images and alphanumeric tabular information
    • G01S 7/52079 Constructional features
    • G01S 7/5208 Constructional features with integration of processing functions inside probe or scanhead

Abstract

A system and method are disclosed for generating an image from high and low frequency sound waves. A system that incorporates teachings of the present disclosure may include, for example, a multimode imager (100) having an ultrasound (US) probe (102), a low frequency sound (LFS) probe (104), and a controller (110) that manages operations of the US and LFS probes. The controller can be programmed to combine (608) images derived from sound data collected by the US and LFS probes. Additional embodiments are disclosed.

Description

METHOD AND SYSTEM FOR GENERATING AN IMAGE FROM HIGH AND LOW FREQUENCY SOUND WAVES

Inventors

Thomas J. Royston

Todd William Spohnholtz

Francis Loth

FEDERAL FUNDING

[0001] The present invention is funded in part by the National Institutes of Health grant no. EB 002511. The U.S. government has certain rights in this invention.

PRIOR APPLICATION

[0002] This application claims the benefit of U.S. Provisional Patent Application No. 60/675,792, filed April 27, 2005, the contents of which are expressly incorporated herein by reference in their entirety.

FIELD OF THE DISCLOSURE

[0003] The present disclosure relates generally to imaging systems, and more specifically to a method and system for generating an image from high and low frequency sound waves.

BACKGROUND

[0004] Atherosclerosis is one of many vascular diseases that result in a narrowing or blocking of blood vessels. A narrowing or constriction of a vein also frequently occurs due to vein wall thickening just downstream of arteriovenous grafts, which are used in many patients with advanced diabetes to aid in dialysis. This wall thickening may be a response to irregular blood flow patterns that occur downstream of the constructed graft.

[0005] Several methods exist to detect such constrictions in blood vessels and identify associated diseases. One of these is angiography, an invasive procedure in which a catheter is surgically inserted directly into an artery and releases dye. Blood flow is then tracked by visualizing the dye, generally by exposing the patient to X-rays, to find areas of reduced blood flow.

[0006] Invasive diagnostic techniques such as this can be dangerous, discomforting to a patient, costly, and time-consuming to schedule and implement.

[0007] A need therefore arises for a non-invasive means to probe a portion of a human body.

BRIEF DESCRIPTION OF THE DRAWINGS

[0008] FIG. 1 depicts an exemplary embodiment of a multimode imager;

[0009] FIG. 2 depicts exemplary embodiments for coupling an ultrasound probe and a low frequency sound (LFS) probe;

[00010] FIGs. 3-5 depict an exemplary embodiment for the construction of an audio sensor array (ASA) in the LFS probe;

[00011] FIG. 6 depicts an exemplary method operating in the multimode imager;

[00012] FIG. 7 depicts an image produced by the multimode imager according to the method of FIG. 6;

[00013] FIG. 8 depicts an exemplary two dimensional array illustrating a means for processing sonic data collected from the LFS probe; and

[00014] FIG. 9 depicts an exemplary diagrammatic representation of a machine in the form of a computer system within which a set of instructions, when executed, may cause the machine to perform any one or more of the methodologies disclosed herein.

DETAILED DESCRIPTION

[00015] FIG. 1 depicts an exemplary embodiment of a multimode imager 100. The multimode imager 100 can comprise an ultrasound (US) probe 102, a low frequency sound (LFS) probe 104, an electromagnetic position and orientation measurement (EPOM) system comprising an EPOM sensor 106, an EPOM processor 108 and a magnetic field generator 109, and a controller 110.

[00016] The US probe 102 and the controller 110 can comprise common technology such as that employed in an Acuson 128XP/10 color Doppler ultrasonography (CDU) system. Alternatively, the US probe 102 and controller 110 can comprise newer, more advanced technology such as that found in matrix array ultrasound systems.

[00017] The US probe 102 and the LFS probe 104 can be mechanically linked by a housing assembly. In this embodiment the US and LFS probes 102, 104 operate as a single mechanical unit so that the relative position of each probe is well known. In an alternative embodiment, the LFS probe 104 can be placed on a portion 101 of a human body as shown in the illustrations of FIG. 2. In this embodiment, the LFS probe 104 can be designed to cover an expansive portion of the human body such as, for example, a forearm, a thigh, a calf, an entire limb, a chest and/or back, a neck, and so on. Alternatively, the LFS probe 104 can be small, covering a limited area (e.g., 3 inches by 5 inches). In any case, the LFS probe 104 can be designed with flexible material (e.g., silicone) so that it contours to the skin portion of the human body being probed.

[00018] The LFS probe 104 can comprise an audio sensor array (ASA) pad 300 such as the one illustrated in FIG. 3 for measuring low frequency sound waves of, for example, 1 kHz or less. In a first embodiment, the ASA pad 300 can be both flexible and transparent to ultrasonic waves. Flexibility allows the ASA pad 300 to conform to curved surfaces of the body such as arms or legs. Though not transparent to visible light, the pad is effectively invisible to ultrasound, which allows a commercially available ultrasound system to acquire data directly through the acoustic ASA pad 300 while the pad simultaneously acquires acoustic (sonic) data.

[00019] Accordingly, the ASA pad 300 can have properties such that it will not interfere with the field of view of the US probe 102 when the LFS probe 104 and the US probe 102 are slidably coupled to each other by way of a gel commonly used in ultrasound applications. This embodiment is shown in views (C) and (D) of FIG. 2.

[00020] In another embodiment, the ASA pad can be semi-rigid with properties that can interfere with the field of view of the US probe 102 when the US and LFS probes are slidably coupled to each other. In this embodiment the US and LFS probes 102, 104 would be operated adjacently such as shown in views (A) and (B) of FIG. 2.

[00021] Each of the foregoing embodiments of the ASA pad will be discussed in further detail shortly.

[00022] The EPOM sensor 106, EPOM processor 108, and magnetic field generator 109 operate together as a position sensor, collectively referred to herein as an EPOM system, that tracks positioning information of the US and/or LFS probes 102, 104. The EPOM system can utilize a common acquisition, position, and orientation system such as Ascension Technology's pciBIRD system. The EPOM processor 108 can be a common circuit board integrally coupled to the controller 110 as shown in FIG. 1 for performing the processing functions of the EPOM system. The EPOM sensor 106 can comprise a small magnetic probe which can be encapsulated with three orthogonally aligned coils. The magnetic field generator 109 can be placed in the vicinity of the EPOM sensor 106 as shown in FIG. 1.

[00023] Current generated in each of the coils of the EPOM sensor 106 by the magnetic fields is used by the EPOM processor 108 to calculate positioning information with up to six degrees of freedom, [x, y, z, α, β, γ], where [x, y, z] are Cartesian coordinates and [α, β, γ] are orientation coordinates such as azimuth, elevation, and roll of the EPOM sensor 106. The EPOM sensor 106 can be placed on the US probe 102 and/or the LFS probe 104 depending on the coupling embodiment deployed therebetween. In the illustration of FIG. 1, one EPOM sensor 106 can be placed on the US probe 102 and two or more EPOM sensors 106 can be placed on the periphery of a flexible ASA pad 300 to detect the location of each array element 306 in the ASA pad 300 by interpolation techniques. For semi-rigid ASA pads 300, a single EPOM sensor 106 would be sufficient for position detection.

[00024] The EPOM system described above can be used to enhance the generation of images based on sound waves collected by the controller 110 from the US and LFS probes 102, 104, as will be described shortly.
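The interpolation of array element locations from EPOM sensors on the periphery of a flexible pad, mentioned in paragraph [00023], can be sketched as a bilinear blend of four corner readings. This is an illustrative sketch only; the function name, corner ordering, and the 3 x 5 grid size are our assumptions, not part of the disclosure.

```python
import numpy as np

def interpolate_element_positions(corners, n_rows=3, n_cols=5):
    """Estimate 3-D positions of ASA array elements from EPOM sensor
    readings at the four corners of a flexible pad (bilinear interpolation).

    corners: four (x, y, z) readings ordered
             [top-left, top-right, bottom-left, bottom-right].
    Returns an (n_rows, n_cols, 3) array of element positions.
    """
    tl, tr, bl, br = (np.asarray(c, dtype=float) for c in corners)
    u = np.linspace(0.0, 1.0, n_cols)   # fractional position across the pad
    v = np.linspace(0.0, 1.0, n_rows)   # fractional position along the pad
    U, V = np.meshgrid(u, v)            # shape (n_rows, n_cols)
    # Bilinear blend of the four corner positions
    pos = ((1 - U)[..., None] * (1 - V)[..., None] * tl
           + U[..., None] * (1 - V)[..., None] * tr
           + (1 - U)[..., None] * V[..., None] * bl
           + U[..., None] * V[..., None] * br)
    return pos
```

For a curved pad, more peripheral sensors and a higher-order surface fit would improve the estimate; the bilinear form is the simplest interpolation consistent with the text.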

[00025] It would be apparent to an artisan with ordinary skill in the art that there are innumerable techniques for sensing position at the US and LFS probes 102, 104. For instance, three or more metal pellets can be dispersed throughout each of the array elements 306 of the ASA pad 300 as location markers. These pellets can in turn be detected by the controller 110 by way of the US probe 102, which can determine therefrom a 3-D coordinate that identifies a location of the array element 306. From the 3-D information derived for each of the pellets of the array element 306, the controller 110 can be further programmed to determine the orientation of the US probe 102. The controller 110 can be further programmed to remove an image of the pellets from any images produced for diagnosis.

[00026] In yet another embodiment, a high frequency signal (e.g., a beacon signal) can be injected into each piezo of the ASA pad 300 to generate an ultrasound wave at a frequency range detectable by the US probe 102. The controller 110 can detect this signal by way of the US probe 102 and can thereby determine a 3-D location of the array element 306 transmitting the signal. By varying the frequency transmitted at each piezo, the controller 110 can further identify which array element 306 is transmitting the signal. Additionally, the array elements 306 can be designed small enough so that the US probe 102 can collect at least three unique beacon signals. From the 3-D information captured for each array element 306, the controller 110 can determine an orientation of the US probe 102. Similar to the earlier embodiment, the controller 110 can be programmed to discard the beacon information once processed and remove any aspect of said signals from the images produced for diagnosis.

[00027] Other embodiments are possible for determining position information of the US and LFS probes 102, 104, such as, for example, utilizing a common optical positioning system. It would therefore be apparent to an artisan with ordinary skill in the art that embodiments not discussed in the present disclosure would be within the scope and spirit of the claims described below.

[00028] Referring back to FIG. 3, the ASA pad 300 can utilize common piezoelectric sensors for transduction of mechanical waves to electric signals. The direct piezoelectric effect, discovered in 1880 by Pierre and Jacques Curie (Ikeda 1990), is the property of certain materials to produce a charge separation due to an applied strain. This charge can be picked up by electric leads, electrically conditioned if necessary, and then acquired by a common data acquisition system operating in the controller 110 for analysis.

[00029] The degree of piezoelectric effect present in a material is characterized by its electro-mechanical coupling coefficient, k. A common interpretation is that the coefficient squared equals the ratio of the stored mechanical energy to the supplied electrical energy. The coupling coefficient thus provides an estimate of the level of charge produced by a given applied strain in a particular direction.

[00030] Polyvinylidene fluoride (PVDF) can be chosen as one embodiment of a piezoelectric material in the construction of the ASA pad 300. PVDF film can easily flex to maintain contact with complex curved surfaces. PVDF has an acoustic impedance closer to that of water and human tissue than other piezoelectric materials (approximately 2.6 times that of water, as compared to at least 10 times greater for most piezo-ceramics). This facilitates a more efficient coupling of mechanical energy from a sound source to the ASA pad 300 and the US probe 102.

[00031] The PVDF film used in the construction of an array element 306 of the ASA pad 300 can be 110 μm thick, having a silver coating silk-screened on both surfaces to function as electrodes, and an electro-mechanical coupling coefficient of, for example, 0.12. An array of fifteen individual sensors can be created from the PVDF film by scribing a grid of lines through the silver coating to form electrical sensor regions and conducting pathways on one side of the PVDF film as shown in FIG. 4. Alternatively, a symmetric pattern of twice the desired number of sensor elements (2 x 15 = 30) can be scribed such that when folded in half along the line of symmetry, symmetrically opposed array elements would meet and a sensor pad with 15 elements would be formed. This would give twice the piezo surface for each array element (resulting in greater sensitivity) and total electrical shielding (resulting in greater interference rejection). The tradeoff is increased ultrasound measurement loss through the ASA pad 300 due to the additional boundaries.

[00032] For either of these embodiments, a multimeter can be used to ensure adequate scribing and that no sensor is short-circuited with an adjacent one. The opposite side of the PVDF film can be left unscribed to form a common ground across the individual sensors. This also provides some shielding of the sensors from electrical interference. A border of unused PVDF film can extend along the outer edge for the purpose of lead attachments 308.
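As a numeric illustration of the coupling coefficient defined in paragraph [00029]: since k squared equals the ratio of stored mechanical energy to supplied electrical energy, the PVDF film's k of 0.12 implies only about 1.4% energy conversion, which is why downstream amplification is needed. The sketch below is ours; only the 0.12 figure comes from the text.

```python
def converted_energy_fraction(k):
    """For electro-mechanical coupling coefficient k, k**2 gives the ratio
    of stored mechanical energy to supplied electrical energy."""
    return k ** 2

# PVDF film cited in the text: k = 0.12 -> roughly 1.4 % conversion.
pvdf_fraction = converted_energy_fraction(0.12)
```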

[00033] The controller 110, which captures the acoustic signals, can have a limited number of channels of acquisition (e.g., 16 channels). A 3 x 5 inch array of sensors (i.e., 15 sensors) can be chosen with 1 cm² array elements 306 so as to approximately match both the width of the US probe 102 and the physical path length of a desired ultrasound data capture. It should be noted that the ASA pad 300 can be of any suitable size and number of sensors depending on a desired application. For instance, in the case where the LFS probe 104 is utilized for probing a large surface, a large array with hundreds of sensors can be constructed so as to cover an entire limb, a chest, or other body part if desired. Accordingly, a diagnostician can be supplied a number of LFS probes 104 to select from depending on the application. For illustration purposes only, the present disclosure will be limited to a 15-sensor ASA pad 300.

[00034] The 15-sensor ASA pad 300 can have a sensing area (not including the unused border area) of 40 x 67 mm. This corresponds to an individual sensor dimension of approximately 13.3 x 13.4 mm. Note that some edge sensors have slightly less area than other sensors in order to accommodate the conductive traces leading to the central sensors. This reduction in area, and therefore sensitivity, can be approximately 7-8% for the affected sensors. For this sensor configuration, the maximum resistance between the lead attachment zone and the furthest point of the corresponding sensor area can be approximately 2 ohms. This maximum corresponds to the central sensors, while the sensors along the edge of the array have a significantly smaller resistance.
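The element dimensions quoted above follow directly from dividing the sensing area by the 3 x 5 grid. A quick arithmetic check (illustrative only; the variable names and the 7.5% midpoint for the edge-sensor area loss are ours):

```python
# Sensing area and grid of the 15-sensor ASA pad from the text
width_mm, height_mm = 40.0, 67.0   # sensing area dimensions
n_across, n_along = 3, 5           # 3 x 5 grid of array elements

elem_w = width_mm / n_across       # ~13.33 mm per element across the pad
elem_h = height_mm / n_along       # 13.4 mm per element along the pad

# Edge sensors cede roughly 7-8 % of area to the traces feeding the
# central sensors; take 7.5 % as a midpoint for illustration.
full_area_mm2 = elem_w * elem_h
edge_area_mm2 = full_area_mm2 * (1 - 0.075)
```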

[00035] Shielded copper wire can be used to conduct the generated charge to signal conditioners. Since the electrical leads 310 cannot be soldered directly to the PVDF film, an interstitial connection can be made. This connection can be made by first attaching thin copper foil strips 308, with conductive adhesive on one side, to each of the lead attachment zones. The copper lead wire 310 can then be soldered to a surface of the copper foil 308 at the end opposite the PVDF connection as shown in FIG. 5.

[00036] Better connections, both electrically and mechanically, can be made if the copper foil is attached to the PVDF film prior to soldering the electrical leads to the copper foil. Since copper is an efficient thermal conductor and elevated temperatures can "de-pole" the PVDF, a cooled heat sink can be placed in contact with the copper foil during soldering to minimize heat conducted to the PVDF film. A single lead attachment can be used for the non-scribed side of the PVDF film to ground the sensor array. Since electrical leads extend from opposite sides of the sensor array, two 8-conductor shielded cables 312 can be used to connect the ASA pad 300 to common signal conditioners that perform impedance conversion and signal amplification for the ASA pad 300 outputs. Conversion from high to low impedance is desired as close to the source as possible to minimize effects such as capacitive loading of a cable 312 and the introduction of tribo-electric and piezo-electric cable noise common with high-impedance sources.

[00037] For connection reinforcement and robustness, a flexible plastic film 304 with adhesive on one side covers the lead attachment zones and frames the ASA pad 300. The ASA pad 300 can be encapsulated in Sylgard 184 (an elastomeric polymer of Dow Chemicals™), a silicone-based material. This provides protection to the sensor array elements 306 and electrical leads 310, is flexible enough to conform to curved surfaces, and provides efficient mechanical coupling for ultrasound interrogation through the pad. The PVDF film can be oriented with the grounded side down or exposed so that the scribed traces are embedded in the Sylgard. In this configuration, any small scratches on the surface of the conducting electrode on the exposed surface due to normal wear will not result in an open circuit of the narrow traces leading to the central sensors.

[00038] On the underside of the pad, a flexible plastic film 314 similar to the one covering the copper foil contacts described above can be used to frame the sensor area and provide additional surface protection outside the sensing area. The plastic film constrains elongation and helps prevent lead pullout. This also helps prevent delamination of the PVDF film from the pad due to any adhesion of the PVDF film to the test surface. Additionally, a swatch of fabric 302 can be embedded in the ASA pad 300 surrounding, but not overlapping the acoustic sensors, to provide additional strength to the ASA pad.

[00039] The ASA pad 300 and attached wiring can be placed in a mold and cast in Sylgard 184. The Sylgard 184 can be prepared using a nominal 20:1 mixing ratio of base to curing agent. This results in an elastomer that is far more compliant than the one resulting from the manufacturer's recommended 10:1 ratio. Before casting into the mold, the Sylgard can be degassed in a vacuum chamber to remove bubbles entrained in the viscous solution. The pad can be cast to a thickness of approximately 1 cm, and any remaining bubbles can be mechanically removed by lancing and dragging them to the surface with a slender, sharp object. The combined sensor array and mold can be cured in an oven or at room temperature.

[00040] In an alternative embodiment, a semi-rigid LFS probe 104 can be constructed with piezoceramic audio transducers, such as a DigiKey 102-1144-ND, impregnated with silicone to enhance coupling to a skin surface. Prior art systems utilize an encapsulating membrane to retain a liquid or semi-liquid coupling gel. One concern with the encapsulating membrane is the trapping of voids between the inner walls of the sensor cavity and the encapsulated gel. These voids serve to reduce the transmitted acoustic energy since some of the energy is spent "pumping" the surrounding gel into the void.

[00041] According to the present disclosure, casting of a solid silicone gel directly into the sensor cavity of the ASA pad 300 embodiments described earlier eliminates the need for an encapsulating membrane and reduces the potential presence of voids. The lack of containment membrane further enhances the transmissibility of the detected acoustic energy by eliminating the additional boundary layer through which acoustic waves must pass. Additionally, the elimination of the encapsulating membrane simplifies the overall design and eliminates the potential of gel leakage due to a compromised membrane.

[00042] The presence of the outer silicone gel disk of the ASA pad 300 prevents direct contact of the LFS probe 104 housing with the tissue surface being examined and provides a larger conformable surface. This allows the sensors of the ASA pad 300 to maintain a large contact area when used on curved or complex surfaces. Additionally, some of the gel coupling materials can be adhesive in nature, thereby facilitating easy attachment to the specimen of interest without the need for additional hardware.

[00043] An ASA pad constructed with piezoceramic audio transducers has the advantage of being more sensitive and robust than PVDF piezoelectric material, thereby requiring less amplification of the generated signal. However, piezoceramic transducers have the disadvantage that ultrasound imaging cannot be performed through them. Consequently, a US probe 102 would have to be adjacently coupled to the LFS probe 104 during operation as shown in views (A) and (B) of FIG. 2. An ASA of piezoceramic transducers can be encapsulated with Sylgard 184 and can have electrical connections as described for the previous embodiment.

[00044] The flexible or rigid ASA pad 300 can be strapped onto a skin surface over or near, for example, an arteriovenous (AV) graft region using medical gauze for diagnostic purposes. Standard ultrasound coupling gel can be used at the US and LFS probes 102, 104 for simultaneous ultrasound and sonic measurements as will be described below.

[00045] In yet another embodiment the US probe 102 can employ a matrix array or wobble array that can generate 3-D images with its position and that of the LFS probe 104 fixed. Consequently, in these embodiments the EPOM sensor 106 would not be needed. The US and LFS probes 102, 104 would be mechanically linked in such a way that their position and orientation with respect to each other are known.

[00046] Referring back to FIG. 1, the US and LFS probes 102, 104 and the EPOM sensor 106 can be tethered by one or more flexible shielded electrical cables 103. As wireless technologies continue to be designed for miniaturization and portability, a transceiver (such as, for example, a Bluetooth™ transceiver) and other components such as an analog-to-digital sampler and one or more battery cells can be placed in the US and/or LFS probes 102, 104 in place of the cable system 103 to convey to the controller 110 over-the-air digitally sampled information from the US and LFS probes as well as the EPOM sensor 106, thereby affording a diagnostician greater flexibility of movement.

[00047] FIG. 6 depicts an exemplary method 600 operating in the multimode imager 100. Method 600 begins with step 602, where a diagnostician slidably couples the US probe 102 with the LFS probe 104 using a common ultrasound gel. During operation, the controller 110 collects in step 604 low and high frequency sound data from the US and LFS probes 102, 104 as they are applied to a portion of a patient's body. The controller 110 also collects in step 606 positioning information from the EPOM system 106-108. In step 608, the controller 110 processes the sampled sound waves with their corresponding positioning information.

[00048] In this step the calculated sound field of the LFS probe 104 is combined with the geometry obtained from ultrasonic imaging to produce in step 610 a composite image. FIG. 7 shows an artificial representation of a multimode or composite image of a blood vessel with blood flow from right to left. The vessel's geometry as determined by ultrasound is represented by the outline of the image. The vessel has a small constriction 704 present at the center of the image indicating a moderate blockage 705. From experimental results, the constriction can be at a point where the image of the vessel fades slightly. The acoustic field information obtained from the LFS probe 104 can be displayed in color as a region of noise 706. This noise occurs just downstream of the blockage indicating the presence and approximate location of the blockage with a direction of blood flow known from the Doppler mode of conventional ultrasound.
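The combining performed in step 608 can be sketched as a simple overlay: normalize the acoustic energy measured at each LFS array element and paint it into a color channel over the grayscale ultrasound geometry, akin to the colored noise region 706 of FIG. 7. This is an illustrative sketch under our own assumptions (pixel-mapped element positions, a red tint for acoustic energy); the disclosure does not specify the rendering at this level of detail.

```python
import numpy as np

def composite_image(us_image, acoustic_rms, element_pixels):
    """Overlay per-element LFS acoustic energy onto a grayscale ultrasound
    image (values in [0, 1]) as a red tint.

    us_image:       (H, W) grayscale ultrasound geometry
    acoustic_rms:   RMS amplitude measured at each ASA element
    element_pixels: (row, col) pixel location of each element
    Returns an (H, W, 3) RGB image.
    """
    overlay = np.zeros_like(us_image, dtype=float)
    peak = max(float(np.max(acoustic_rms)), 1e-12)   # avoid divide-by-zero
    for amp, (r, c) in zip(acoustic_rms, element_pixels):
        overlay[r, c] = amp / peak                   # normalized acoustic energy
    red = np.clip(us_image + overlay, 0.0, 1.0)      # acoustic noise tints red
    return np.stack([red, us_image, us_image], axis=-1)
```

A full implementation would spread each element's energy over its footprint using the EPOM positioning data rather than a single pixel; the single-pixel form keeps the sketch short.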

[00049] This bi-modal approach to assessing vascular obstructions is synergistically better than existing ultrasound methods. The sonic image from the LFS probe 104 confirms the presence of the constriction and gives unique information about how the blood flow is affected by it. Additionally, the controller 110 can be programmed in step 614 to detect an anomalous flow and predict from the sonic image a present, or possibly future, constriction resulting from an irregular blood flow pattern, which may occur, for example, downstream of a vascular graft. Said predictions can be presented in step 616 with one or more suggestive indicators such as shown in FIG. 7. Although not shown, metric information can also be displayed by the controller 110 (e.g., speed of fluid, average thickness of vessel, etc.).

[00050] The sonic array measurement can also tell the diagnostician whether the flow is still relatively smooth and laminar or has become turbulent. This type of information may be useful in assessing the relative danger of the constriction and can answer questions such as how severe the constriction is and what types of forces the blood vessel wall is exposed to from the flow, forces that may cause the vessel wall to remodel itself, thicken or possibly fail in the future. By non-invasively and simultaneously acquiring both ultrasonic and sonic data from the US and LFS probes 102, 104, respectively, and combining them in a single image, a medical professional may more readily determine the best treatment option.

[00051] Referring back to step 608, the controller 110 can process the sonic or acoustic field data from the LFS probe 104 using a Multiple Auscultation Point (MAP) beamformer process described in Owsley, N., Hull, A. (1998), "Beamformed Nearfield Imaging of a Simulated Coronary Artery Containing a Stenosis," IEEE Transactions on Medical Imaging, 17(6): 900-909, incorporated herein by reference in its entirety.

[00052] The LFS probe 104 can provide M independent records of time-domain data, each of length L data points. The signal obtained can be assumed to be zero-mean and sufficiently wide-sense stationary (WSS). Since in each case the overall dimension of the acoustic array can be greater than the distance from the array to the acoustic source, sensing is performed in the acoustic nearfield and the simplifying assumption of impinging plane waves cannot be made. A coordinate system is chosen such that the x-y plane is parallel to the model's upper surface and z is chosen positive into the model.

[00053] FIG. 8 illustrates a two-dimensional array with a focus point (only four sensors' projected paths are shown). The focus point is determined by appropriate selection of each sensor's phase, which translates to a particular path length (and time of flight) from each sensor.

[00054] The vector of time-domain sensor outputs is Fourier transformed at the desired analysis frequency ω₀ and is given by

$$\mathbf{X}(\omega_0) = \left[\, X_1(\omega_0)\;\; X_2(\omega_0)\;\; \cdots\;\; X_m(\omega_0) \,\right]^T \qquad (1)$$

where x_n(t) is the time-varying output of the nth sensor and X_n(ω₀) is the corresponding frequency-transformed data at frequency ω₀.

[00055] The ensemble-averaged source intensity p, determined at frequency ω₀ and at the coordinate (x_i, y_i, z_i) by the conventionally focused beamformer (CFB), is expressed by

$$p(x_i, y_i, z_i, \omega_0) = \frac{\mathbf{d}^H(x_i, y_i, z_i, \omega_0)\, \mathbf{R}(\omega_0)\, \mathbf{d}(x_i, y_i, z_i, \omega_0)}{\mathbf{d}^H(x_i, y_i, z_i, \omega_0)\, \mathbf{d}(x_i, y_i, z_i, \omega_0)} \qquad (2)$$

Here, R(ω₀) = E[X(ω₀)X(ω₀)^H] is the normalized m × m cross-spectral density matrix (CSDM), E[·] is the statistical expectation operator, and d(x_i, y_i, z_i, ω₀) is the m × 1 focused MAP imaging vector with the kth element given by

$$d_k(x_i, y_i, z_i, \omega_0) = \frac{1}{r_i(k)^{\beta}}\, \exp\!\left[-j\, \omega_0\, r_i(k)/c \right] \qquad (3)$$

Here, r_i(k) is the Euclidean distance between the coordinate (x_i, y_i, z_i) and the kth sensor,

$$r_i(k) = \sqrt{\left[x_i - x(k)\right]^2 + \left[y_i - y(k)\right]^2 + \left[z_i - z(k)\right]^2} \qquad (4)$$

c is the wave speed, and β is the geometric loss factor.

[00056] The geometric loss factor represents the amplitude attenuation due to the spreading of the wave through space and is equal to 1 for a spherically expanding wave and 0.5 for a circularly (or cylindrically) expanding wave. This attenuation is independent of the amplitude attenuation due to viscoelastic energy loss. Both experimental models can be roughly approximated by a semi-infinite half space with a source close to the free surface. Because of this, radiation of waves from the source in a direction away from the free surface (deeper into the model) closely resembles spherical waves, while waves radiating from the source towards the free surface almost immediately radiate as rings of circular waves. A geometric loss factor of 0.81, between these two limiting values, can be chosen for this application.
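As a concrete illustration of equations (2)-(4), the following sketch (not part of the disclosure; the array geometry, analysis frequency and shear wave speed are assumed illustration values) builds the focused imaging vector with a geometric loss factor and evaluates the CFB intensity at a focus point:

```python
import numpy as np

def imaging_vector(focus, sensors, omega0, c, beta):
    """m x 1 focused imaging vector of eq. (3); eq. (4) gives the distances r_i(k)."""
    r = np.linalg.norm(sensors - np.asarray(focus, dtype=float), axis=1)
    return (r ** -beta) * np.exp(-1j * omega0 * r / c)

def cfb_intensity(focus, sensors, R, omega0, c, beta=0.81):
    """Conventionally focused beamformer output of eq. (2): d^H R d / d^H d."""
    d = imaging_vector(focus, sensors, omega0, c, beta)
    return float(np.real(d.conj() @ R @ d) / np.real(d.conj() @ d))

# Hypothetical 4x4 surface array (z = 0) with 1 cm pitch; a shear wave speed
# of 3 m/s and a 200 Hz analysis frequency are assumed for illustration.
xy = np.linspace(0.0, 0.03, 4)
sensors = np.array([(x, y, 0.0) for x in xy for y in xy])
omega0, c = 2 * np.pi * 200.0, 3.0

# Synthetic rank-one CSDM from a point source 1 cm below the array centre.
src = [0.015, 0.015, 0.01]
x = imaging_vector(src, sensors, omega0, c, 0.81)
R = np.outer(x, x.conj())

# Focusing at the true source location out-scores an off-source focus point.
on = cfb_intensity(src, sensors, R, omega0, c)
off = cfb_intensity([0.0, 0.0, 0.02], sensors, R, omega0, c)
```

Because the replica vector at the true focus matches the source's propagation vector, the quotient in eq. (2) attains its Cauchy-Schwarz maximum there, which is why the scan peaks at the source.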

[00057] An estimate for the cross-spectral density matrix (CSDM) Fourier transformed at the analysis frequency f₀ is found by first calculating the power spectral density (PSD) of the fifteen channel outputs. The well-known Blackman-Tukey method can be used to estimate the PSD of each channel, found by taking the discrete Fourier transform (DFT) of the windowed, biased autocorrelation function. The biased autocorrelation of the nth sensor output is

$$r_n(\tau) = \frac{1}{L} \sum_{t=-L+1}^{L-1} x_n(t)\, x_n(t+\tau) \qquad (5)$$

The term 'biased' indicates that the autocorrelation is normalized to the length L of the signal xn. The spectral estimate of the nth sensor output can be found by windowing this and Fourier transforming the product.

[00058] The chosen window has a length equal to the original time-domain record length and is applied to the central portion of the autocorrelation. A Blackman window is chosen for the window function because of its good amplitude accuracy and low spectral leakage. The relatively poor frequency resolution of the Blackman window does not impact this work because of the broadband nature of the signal. The resulting PSD has the same number of points as the original time record. Note that the window is not exactly centered about τ = 0 but is one point off, since the autocorrelation has an odd number of points and the window an even number. The resulting PSD estimate has a significantly reduced variance as compared to a periodogram.
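A minimal sketch of this Blackman-Tukey estimate for one channel (assuming NumPy; the sampling rate, record length and test tone are arbitrary illustration values, not taken from the disclosure):

```python
import numpy as np

def blackman_tukey_psd(x):
    """Blackman-Tukey PSD of one channel: the DFT of the windowed, biased
    autocorrelation of eq. (5); the window length equals the record length."""
    L = len(x)
    r = np.correlate(x, x, mode="full") / L   # biased autocorrelation, lags -(L-1)..(L-1)
    w = np.blackman(L)                        # window over the central L lags
    start = (L - 1) - L // 2                  # one point off centre, as noted above
    seg = r[start:start + L] * w
    return np.abs(np.fft.rfft(seg, n=L))      # magnitude spectrum of the product

# Assumed test signal: a 100 Hz tone in noise, sampled at 1 kHz.
fs, L = 1000.0, 1024
t = np.arange(L) / fs
x = np.sin(2 * np.pi * 100.0 * t) + 0.1 * np.random.default_rng(0).standard_normal(L)
psd = blackman_tukey_psd(x)
peak_hz = np.fft.rfftfreq(L, 1 / fs)[np.argmax(psd)]
```

The spectral peak of the estimate lands at the tone frequency, and the variance reduction relative to a raw periodogram comes from the lag windowing.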

[00059] Next, a complex vector is formed by taking the PSD values of each channel corresponding to the analysis frequency f₀.

Since the frequency data were obtained by a PSD estimate rather than a simple DFT of the time-domain data, the square root of each value is taken to produce the correct units. The Fourier-transformed CSDM is then found by taking the outer product of the frequency vector with itself:

$$\mathrm{CSDM} = \mathbf{R}(\omega_0) = \left(\sqrt{\mathbf{X}(f_0)}\right)\left(\sqrt{\mathbf{X}(f_0)}\right)^{H} \qquad (8)$$

[00060] In Owsley and Hull's method, the CSDM can be normalized to offset channel-to-channel amplitude variations due to changing conditions not accounted for by calibration. However, this may not be necessary or desirable if the array elements 306 are accurately calibrated with respect to each other. Such calibration could be performed once the LFS probe 104 is in place, using each array element 306 as a driver and performing a self-calibration routine. The normalized CSDM, R̃_ij(ω₀), is

$$\tilde{R}_{ij}(\omega_0) = P(\omega_0)\, \frac{R_{ij}(\omega_0)}{\sqrt{P_i(\omega_0)\, P_j(\omega_0)}} \qquad (9)$$

where P_n(ω₀) = R_nn(ω₀) and

$$P(\omega_0) = \frac{1}{m} \sum_{n=1}^{m} R_{nn}(\omega_0) \qquad (10)$$
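The CSDM construction and normalization can be sketched as follows (illustrative only; the rank-one estimate from a single frequency vector and the 8-channel snapshot are assumptions of this sketch, not part of the disclosure):

```python
import numpy as np

def normalized_csdm(X):
    """Rank-one CSDM from the frequency vector (outer product with itself),
    then normalized by the average channel power over the geometric mean of
    each pair of channel powers."""
    R = np.outer(X, X.conj())                    # CSDM estimate
    P_n = np.real(np.diag(R))                    # per-channel powers R_nn
    P = P_n.mean()                               # average power across channels
    return P * R / np.sqrt(np.outer(P_n, P_n))   # normalized CSDM

# Assumed 8-channel frequency-domain snapshot for illustration.
rng = np.random.default_rng(1)
X = rng.standard_normal(8) + 1j * rng.standard_normal(8)
R_tilde = normalized_csdm(X)
```

After normalization every diagonal entry equals the average power, so channel-to-channel gain differences no longer bias the beamformer scan.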

[00061] Acoustic intensity can be determined by scanning the beamformer through any set of points, though planar cross sections at roughly constant depth can be most useful for examining vascular structure near the skin surface. In this application the beamformer is tuned to the velocity of shear wave propagation, and the resulting intensity field represents acoustic shear wave activity. In other applications, such as imaging low frequency sound waves emanating from the lung parenchymal region, where compression waves propagate at speeds of 20-50 m/s, the beamformer can be tuned to the compression wave propagation speed.

[00062] In an alternative embodiment, the foregoing sonic or audio source localization algorithm can be improved with the Bartlett process described in H. Krim and M. Viberg, "Two decades of array signal processing research," IEEE Signal Processing Magazine, 13: 67-94 (1996), incorporated herein by reference in its entirety.

[00063] The simple beamformer equation for the Bartlett processor is:

$$B(\omega, \theta) = \frac{\mathbf{a}^H(\omega, \theta)\, \mathbf{y}(\omega)\, \mathbf{y}^H(\omega)\, \mathbf{a}(\omega, \theta)}{\mathbf{a}^H(\omega, \theta)\, \mathbf{a}(\omega, \theta)} \qquad (11)$$

Here, a is a vector of the amplitudes of the so-called replica vector (calculated based on a model for shear wave propagation in the medium), y is the vector of measured responses at the sensor locations, θ denotes the hypothetical location of the source, ω is the excitation frequency, and superscript H denotes the Hermitian transpose of the vector. A three-dimensional grid of hypothetical locations is created within the volume to be searched for the source. A spherically symmetric shear wave source is placed at each grid location,

and the Bartlett processor is evaluated. Its value reaches a local maximum at locations that are most likely the true source location.

[00064] Referring back to the MAP beamforming process, equation (2) can be replaced by equation (11), which is very similar. Also, equation (3) can be replaced by the following:

$$a_k(\omega, \theta_i) = \frac{1}{r_i(k)}\, \exp\!\left[-j\, k\, r_i(k)\right] \qquad (12)$$

where k = ω/c̃ and the complex wave speed c̃ incorporates attenuation with distance in its imaginary part. Aside from these changes, the rest of the MAP algorithm remains the same. This calibration can be more reliable for an ASA pad 300 of piezoceramic elements than the PVDF piezo material described above.

[00065] The present disclosure improves ultrasound (US) medical imaging technology by integrating a simultaneous noninvasive audible-frequency measurement of biological sounds indicative of pathology. This multimode sonic and ultrasound imaging technique advances diagnostic capabilities beyond the state of the art and is ideal for integration into existing ultrasound systems. Measurement of naturally-occurring biological acoustic phenomena can augment conventional imaging technology by providing unique information about material structure and system function. Sonic phenomena of diagnostic value are associated with a wide range of biological functions, such as breath sounds, bowel sounds and vascular sounds.

[00066] The present disclosure can be applied to vascular pathology, such as for the characterization of constrictions and the hemodynamic environment downstream of arteriovenous grafts, a common site of intimal hyperplasia and turbulent blood flow. The multimode imager 100 can form a 3-D composite image of vascular morphology and localized sonic source strength (as a function of frequency and other discriminating clinically-useful parameters). The sonic phenomena can result from turbulent blood flow and its dynamic interaction with the vascular wall, graft and/or plaque and surrounding biological tissues.
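The Bartlett grid-search localization described above can be sketched as follows (a hypothetical illustration; the 3x3 array layout, excitation frequency and complex shear wave speed are assumed values, and `replica` and `bartlett_power` are illustrative names, not part of the disclosure):

```python
import numpy as np

def replica(theta, sensors, omega, c_complex):
    """Replica vector of eq. (12): a_k = exp(-j k r_i(k)) / r_i(k), where the
    complex wavenumber k = omega / c_complex carries attenuation in its
    imaginary part."""
    r = np.linalg.norm(sensors - np.asarray(theta, dtype=float), axis=1)
    k = omega / c_complex
    return np.exp(-1j * k * r) / r

def bartlett_power(theta, y, sensors, omega, c_complex):
    """Bartlett processor of eq. (11) evaluated at hypothetical source theta."""
    a = replica(theta, sensors, omega, c_complex)
    return float(np.abs(a.conj() @ y) ** 2 / np.real(a.conj() @ a))

# Hypothetical 3x3 surface array; 150 Hz and a complex shear speed of
# 3 + 0.3j m/s are assumed illustration values.
xy = np.linspace(0.0, 0.04, 3)
sensors = np.array([(x, y, 0.0) for x in xy for y in xy])
omega, c = 2 * np.pi * 150.0, 3.0 + 0.3j

# Noiseless synthetic measurement from a source 1.5 cm below the array centre.
true_src = np.array([0.02, 0.02, 0.015])
y = replica(true_src, sensors, omega, c)

# Coarse 3-D grid search: the Bartlett power reaches its maximum at the
# grid point closest to the true source location.
grid = [np.array([gx, gy, gz]) for gx in xy for gy in xy
        for gz in (0.005, 0.015, 0.025)]
best = max(grid, key=lambda th: bartlett_power(th, y, sensors, omega, c))
```

In practice the grid would be far finer, and `y` would come from the Fourier-transformed ASA channel data rather than a synthetic replica.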

[00067] It would be evident to an artisan with ordinary skill in the art that there are many applications for combining ultrasound and sonic sound systems to create composite images. It would also be evident to said artisan that the multimode imaging system 100 of the present disclosure can be utilized for the measurement of fluid dynamics in inanimate objects. Accordingly, said artisan would expect that the embodiments disclosed herein can be rearranged, modified, reduced, or enhanced without departing from the scope and spirit of the claims described below. The reader is therefore directed to the claims for a fuller understanding of the breadth and scope of the present disclosure.

[00068] FIG. 9 depicts an exemplary diagrammatic representation of a machine in the form of a computer system 900 within which a set of instructions, when executed, may cause the machine to perform any one or more of the methodologies discussed above. In some embodiments, the machine operates as a standalone device. In some embodiments, the machine may be connected (e.g., using a network) to other machines. In a networked deployment, the machine may operate in the capacity of a server or a client user machine in a server-client user network environment, or as a peer machine in a peer-to-peer (or distributed) network environment.

[00069] The machine may comprise a server computer, a client user computer, a personal computer (PC), a tablet PC, a laptop computer, a desktop computer, a control system, a network router, switch or bridge, or any machine capable of executing a set of instructions (sequential or otherwise) that specify actions to be taken by that machine. It will be understood that a device of the present disclosure includes broadly any electronic device that provides voice, video or data communication. Further, while a single machine is illustrated, the term "machine" shall also be taken to include any collection of machines that individually or jointly execute a set (or multiple sets) of instructions to perform any one or more of the methodologies discussed herein.

[00070] The computer system 900 may include a processor 902 (e.g., a central processing unit (CPU), a graphics processing unit (GPU), or both), a main memory 904 and a static memory 906, which communicate with each other via a bus 908. The computer system 900 may further include a video display unit 910 (e.g., a liquid crystal display (LCD), a flat panel, a solid state display, or a cathode ray tube (CRT)).
The computer system 900 may include an input device 912 (e.g., a keyboard), a cursor control device 914 (e.g., a mouse), a disk drive unit 916, a signal generation device 918 (e.g., a speaker or remote control) and a network interface device 920.

[00071] The disk drive unit 916 may include a machine-readable medium 922 on which is stored one or more sets of instructions (e.g., software 924) embodying any one or more of the methodologies or functions described herein, including those methods illustrated above. The instructions 924 may also reside, completely or at least partially, within the main memory 904, the static memory 906, and/or within the processor 902 during execution thereof by the computer system 900. The main memory 904 and the processor 902 also may constitute machine-readable media.

[00072] Dedicated hardware implementations including, but not limited to, application specific integrated circuits, programmable logic arrays and other hardware devices can likewise be constructed to implement the methods described herein. Applications that may include the apparatus and systems of various embodiments broadly include a variety of electronic and computer systems. Some embodiments implement functions in two or more specific interconnected hardware modules or devices with related control and data signals communicated between and through the modules, or as portions of an application-specific integrated circuit. Thus, the example system is applicable to software, firmware, and hardware implementations.

[00073] In accordance with various embodiments of the present disclosure, the methods described herein are intended for operation as software programs running on a computer processor. Furthermore, software implementations, including but not limited to distributed processing, component/object distributed processing, parallel processing, or virtual machine processing, can also be constructed to implement the methods described herein.

[00074] The present disclosure contemplates a machine readable medium containing instructions 924, or that which receives and executes instructions 924 from a propagated signal so that a device connected to a network environment 926 can send or receive voice, video or data, and to communicate over the network 926 using the instructions 924. The instructions 924 may further be transmitted or received over a network 926 via the network interface device 920.

[00075] While the machine-readable medium 922 is shown in an example embodiment to be a single medium, the term "machine-readable medium" should be taken to include a single medium or multiple media (e.g., a centralized or distributed database, and/or associated caches and servers) that store the one or more sets of instructions. The term "machine-readable medium" shall also be taken to include any medium that is capable of storing, encoding or carrying a set of instructions for execution by the machine and that cause the machine to perform any one or more of the methodologies of the present disclosure.

[00076] The term "machine-readable medium" shall accordingly be taken to include, but not be limited to: solid-state memories such as a memory card or other package that houses one or more read-only (non-volatile) memories, random access memories, or other re-writable (volatile) memories; magneto-optical or optical media such as a disk or tape; and carrier wave signals such as a signal embodying computer instructions in a transmission medium. A digital file attachment to e-mail or another self-contained information archive or set of archives is considered a distribution medium equivalent to a tangible storage medium. Accordingly, the disclosure is considered to include any one or more of a machine-readable medium or a distribution medium, as listed herein and including art-recognized equivalents and successor media, in which the software implementations herein are stored.

[00077] Although the present specification describes components and functions implemented in the embodiments with reference to particular standards and protocols, the disclosure is not limited to such standards and protocols. Each of the standards for Internet and other packet-switched network transmission (e.g., TCP/IP, UDP/IP, HTML, HTTP) represents an example of the state of the art. Such standards are periodically superseded by faster or more efficient equivalents having essentially the same functions. Accordingly, replacement standards and protocols having the same functions are considered equivalents.

[00078] The illustrations of embodiments described herein are intended to provide a general understanding of the structure of various embodiments, and they are not intended to serve as a complete description of all the elements and features of apparatus and systems that might make use of the structures described herein. Many other embodiments will be apparent to those of skill in the art upon reviewing the above description. Other embodiments may be utilized and derived therefrom, such that structural and logical substitutions and changes may be made without departing from the scope of this disclosure. Figures are also merely representational and may not be drawn to scale. Certain proportions thereof may be exaggerated, while others may be minimized. Accordingly, the specification and drawings are to be regarded in an illustrative rather than a restrictive sense.

[00079] Such embodiments of the inventive subject matter may be referred to herein, individually and/or collectively, by the term "invention" merely for convenience and without intending to voluntarily limit the scope of this application to any single invention or inventive concept if more than one is in fact disclosed. Thus, although specific embodiments have been illustrated and described herein, it should be appreciated that any arrangement calculated to achieve the same purpose may be substituted for the specific embodiments shown. This disclosure is intended to cover any and all adaptations or variations of various embodiments. Combinations of the above embodiments, and other embodiments not specifically described herein, will be apparent to those of skill in the art upon reviewing the above description.

[00080] The Abstract of the Disclosure is provided to comply with 37 C.F.R. §1.72(b), requiring an abstract that will allow the reader to quickly ascertain the nature of the technical disclosure. It is submitted with the understanding that it will not be used to interpret or limit the scope or meaning of the claims. In addition, in the foregoing Detailed Description, it can be seen that various features are grouped together in a single embodiment for the purpose of streamlining the disclosure. This method of disclosure is not to be interpreted as reflecting an intention that the claimed embodiments require more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive subject matter lies in less than all features of a single disclosed embodiment. Thus the following claims are hereby incorporated into the Detailed Description, with each claim standing on its own as separately claimed subject matter.

Claims

What is claimed is:
1. A multimode imager, comprising: an ultrasound (US) probe; a low frequency sound (LFS) probe; and a controller that manages operations of the US and LFS probes, wherein the controller is programmed to combine images derived from sound data collected from the US and LFS probes.
2. The multimode imager of claim 1, wherein the US probe and the LFS probe are mechanically linked and are slidably coupled to a portion of a human body to perform a diagnostic measurement.
3. The multimode imager of claim 1, wherein the LFS probe is coupled to a portion of a human body, and wherein the US probe is slidably coupled to the LFS probe to perform a diagnostic measurement.
4. The multimode imager of claim 1, wherein a material is applied to the LFS probe to facilitate a slidable contact with the US probe.
5. The multimode imager of claim 1, wherein the LFS probe is coupled to a portion of a human body, and wherein the US probe is adjacently coupled to the LFS probe to perform a diagnostic measurement.
6. The multimode imager of claim 1, comprising a position sensor coupled to at least one among the US and LFS probes, wherein the controller is programmed to combine the images according to positioning information supplied by the position sensor.
7. The multimode imager of claim 1, wherein the position sensor comprises an electromagnetic position and orientation measurement (EPOM) sensor coupled to an EPOM processor.
8. The multimode imager of claim 1, wherein the LFS probe comprises an audio sensor array (ASA) having properties that do not interfere with a direct field of view of the US probe.
9. The multimode imager of claim 8, wherein the ASA comprises an array of piezoelectric sensors.
10. The multimode imager of claim 1, comprising a display, wherein the US probe and the LFS probe are coupled therebetween for performing a diagnostic measurement on a portion of a human body, and wherein the controller is programmed to present on the display a composite image having at least two dimensional resolution derived from the sound data collected by the US and LFS probes for illustration of a constriction or potential for constriction in a vessel within the portion of the human body.
11. The multimode imager of claim 1, wherein the US probe comprises a matrix array probe fixed on a portion of a human body and coupled to the LFS probe for performing a diagnostic measurement on a portion of a human body.
12. A computer-readable storage medium in an imager, comprising computer instructions for producing a composite image having at least two dimensions derived from sound data collected by an ultrasound (US) probe and low frequency sound (LFS) probe.
13. The storage medium of claim 12, comprising computer instructions for: receiving position information from a position sensor coupled to one among the US and LFS probes; producing the composite image according to the position information.
14. The storage medium of claim 13, wherein the position information comprises three dimensional (3-D) coordinates and orientation coordinates, and wherein the storage medium comprises computer instructions for processing the 3-D coordinates and orientation coordinates for producing the composite image.
15. The storage medium of claim 12, comprising computer instructions for presenting the composite image on a display device of the imager.
16. The storage medium of claim 12, wherein the US and LFS probes are coupled to each other and placed near a portion of a human body, and wherein the storage medium comprises computer instruction for determining a direction of fluid in a vessel within a view of the portion of the human body according to a Doppler mode of ultrasound sound data collected from the US probe.
17. The storage medium of claim 12, comprising computer instructions for: receiving low frequency sound from the LFS; and processing the low frequency sound according to a beamformer method for producing in part the composite image.
18. The storage medium of claim 17, wherein the beamformer method comprises at least one among a Multiple Auscultation Point (MAP) beamformer and a Bartlett beamformer.
19. A method, comprising generating a composite image from sound waves collected by way of an ultrasound (US) probe and an audio sensor array (ASA) probe.
20. The method of claim 19, comprising the step of generating the composite image according to positioning information collected from a position sensor coupled to at least one among the US and ASA probes.
21. The method of claim 20, wherein the positioning information comprises three or more degrees of freedom.
22. The method of claim 19, comprising the step of applying a material to at least one among the US and ASA probes to slidably couple said US and ASA probes and thereby vary a field of view of said US and ASA probes.
23. The method of claim 19, comprising the step of: placing the US and ASA probes on a portion of a human body having one or more vessels for carrying body fluids; detecting an anomalous flow in one or more of the vessels; and presenting in the composite image the one or more vessels with one or more indicators corresponding to the detected anomalous flow.
24. The method of claim 23, wherein the anomalous flow corresponds to a constriction or potential for constriction in the one or more vessels.
25. The method of claim 19, wherein the ASA probe comprises one among a semi-rigid array of low frequency sound sensors, and a flexible array of low frequency sound sensors.
26. The method of claim 19, wherein the US and ASA probes are wirelessly coupled to a controller that generates the composite image.
27. The method of claim 19, comprising the step of generating the composite image according to positioning information collected from at least one position sensor coupled to the ASA probe.
28. The method of claim 19, wherein the ASA probe comprises one or more location markers, wherein the method comprises the step of generating the composite image according to positioning information collected by the US probe when coupled to said location markers.
29. The method of claim 19, comprising the step of: injecting one or more high frequency signals into a corresponding one or more array elements of the ASA probe; and generating the composite image according to positioning information collected by the US probe from a detection of the one or more high frequency signals corresponding to the one or more array elements of the ASA probe.
PCT/US2006/015001 2005-04-28 2006-04-21 Method and system for generating an image from high and low frequency sound waves WO2006124192A3 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US67579205 true 2005-04-28 2005-04-28
US60/675,792 2005-04-28

Publications (2)

Publication Number Publication Date
WO2006124192A2 true true WO2006124192A2 (en) 2006-11-23
WO2006124192A3 true WO2006124192A3 (en) 2007-06-14

Family

ID=37431745

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2006/015001 WO2006124192A3 (en) 2005-04-28 2006-04-21 Method and system for generating an image from high and low frequency sound waves

Country Status (1)

Country Link
WO (1) WO2006124192A3 (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9257220B2 (en) 2013-03-05 2016-02-09 Ezono Ag Magnetization device and method
US9459087B2 (en) 2013-03-05 2016-10-04 Ezono Ag Magnetic position detection system
US9597008B2 (en) 2011-09-06 2017-03-21 Ezono Ag Imaging probe and method of obtaining position and/or orientation information

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050154314A1 (en) * 2003-12-30 2005-07-14 Liposonix, Inc. Component ultrasound transducer



Also Published As

Publication number Publication date Type
WO2006124192A3 (en) 2007-06-14 application

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
NENP Non-entry into the national phase in:

Ref country code: DE

NENP Non-entry into the national phase in:

Ref country code: RU

122 Ep: pct application non-entry in european phase

Ref document number: 06784366

Country of ref document: EP

Kind code of ref document: A2