CA2861089C - Dual modality imaging system for coregistered functional and anatomical mapping - Google Patents
Dual modality imaging system for coregistered functional and anatomical mapping
- Publication number
- CA2861089C CA2861089A
- Authority
- CA
- Canada
- Prior art keywords
- hand
- imaging probe
- held imaging
- optical
- optoacoustic
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/0093—Detecting, measuring or recording by applying one single type of energy and measuring its conversion into another type of energy
- A61B5/0095—Detecting, measuring or recording by applying one single type of energy and measuring its conversion into another type of energy by applying light and detecting acoustic waves, i.e. photoacoustic measurements
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/145—Measuring characteristics of blood in vivo, e.g. gas concentration, pH value; Measuring characteristics of body fluids or tissues, e.g. interstitial fluid, cerebral tissue
- A61B5/1455—Measuring characteristics of blood in vivo, e.g. gas concentration, pH value; Measuring characteristics of body fluids or tissues, e.g. interstitial fluid, cerebral tissue using optical sensors, e.g. spectral photometrical oximeters
- A61B5/14551—Measuring characteristics of blood in vivo, e.g. gas concentration, pH value; Measuring characteristics of body fluids or tissues, e.g. interstitial fluid, cerebral tissue using optical sensors, e.g. spectral photometrical oximeters for measuring blood gases
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/08—Detecting organic movements or changes, e.g. tumours, cysts, swellings
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/13—Tomography
- A61B8/14—Echo-tomography
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/52—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/5215—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data
- A61B8/5238—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data for combining image data of patient, e.g. merging several images from different acquisition modes into one image
- A61B8/5261—Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data for combining image data of patient, e.g. merging several images from different acquisition modes into one image combining images from different diagnostic modalities, e.g. ultrasound and X-ray
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/17—Systems in which incident light is modified in accordance with the properties of the material investigated
- G01N21/1702—Systems in which incident light is modified in accordance with the properties of the material investigated with opto-acoustic detection, e.g. for gases or analysing solids
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B2562/00—Details of sensors; Constructional details of sensor housings or probes; Accessories for sensors
- A61B2562/02—Details of sensors specially adapted for in-vivo measurements
- A61B2562/0204—Acoustic sensors
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B2562/00—Details of sensors; Constructional details of sensor housings or probes; Accessories for sensors
- A61B2562/02—Details of sensors specially adapted for in-vivo measurements
- A61B2562/0233—Special features of optical sensors or probes classified in A61B5/00
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/43—Detecting, measuring or recording for evaluating the reproductive systems
- A61B5/4306—Detecting, measuring or recording for evaluating the reproductive systems for evaluating the female reproductive systems, e.g. gynaecological evaluations
- A61B5/4312—Breast evaluation or disorder diagnosis
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/08—Detecting organic movements or changes, e.g. tumours, cysts, swellings
- A61B8/0825—Detecting organic movements or changes, e.g. tumours, cysts, swellings for diagnosis of the breast, e.g. mammography
Landscapes
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Physics & Mathematics (AREA)
- Pathology (AREA)
- General Health & Medical Sciences (AREA)
- Engineering & Computer Science (AREA)
- Public Health (AREA)
- Animal Behavior & Ethology (AREA)
- Surgery (AREA)
- Molecular Biology (AREA)
- Biomedical Technology (AREA)
- Medical Informatics (AREA)
- Biophysics (AREA)
- Heart & Thoracic Surgery (AREA)
- Veterinary Medicine (AREA)
- Radiology & Medical Imaging (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Immunology (AREA)
- General Physics & Mathematics (AREA)
- Biochemistry (AREA)
- Analytical Chemistry (AREA)
- Chemical & Material Sciences (AREA)
- Spectroscopy & Molecular Physics (AREA)
- Optics & Photonics (AREA)
- Acoustics & Sound (AREA)
- Ultrasonic Diagnosis Equipment (AREA)
- Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
Abstract
A real-time imaging system that provides ultrasonic imaging and optoacoustic imaging coregistered through application of the same hand-held probe to generate and detect ultrasonic and optoacoustic signals. These signals are digitized, processed and used to reconstruct anatomical maps superimposed with maps of two functional parameters: blood hemoglobin index and blood oxygenation index. The blood hemoglobin index represents blood hemoglobin concentration changes in the areas of diagnostic interest relative to the background blood concentration. The blood oxygenation index represents blood oxygenation changes in the areas of diagnostic interest relative to the background level of blood oxygenation. These coregistered maps can be used to noninvasively differentiate malignant tumors from benign lumps and cysts.
Description
DUAL MODALITY IMAGING SYSTEM FOR COREGISTERED FUNCTIONAL
AND ANATOMICAL MAPPING
[0001] This application claims priority to U.S. Patent Application Nos. 13/667,808 and 13/667,830, filed November 2, 2012. This application is also a continuation-in-part of U.S. Patent Application No. 13/507,217, filed June 13, 2012, entitled "System and Method for Acquiring Optoacoustic Data and Producing Parametric Maps Thereof," a continuation-in-part of U.S. Patent Application No. 13/341,950, filed December 31, 2011, entitled "System and Method for Adjusting the Light Output of an Optoacoustic Imaging System," and a continuation-in-part of U.S. Patent Application No. 13/287,759, filed November 2, 2011, entitled "Handheld Optoacoustic Probe."
FIELD OF THE TECHNOLOGY
[0002] At least some embodiments disclosed herein relate, in general, to systems for biomedical imaging, and more particularly, to real-time imaging systems that visualize thin tissue slices noninvasively through skin.
BACKGROUND
[0003] Medical ultrasound imaging is a well-established imaging technology for visualization of tissue morphology in various organs that provides diagnostic information based on analysis of anatomy. Optoacoustic imaging is used in medical applications for in vivo and in vitro mapping of animal and human tissues and organs based on variations in tissue optical properties. Optoacoustic tomography can provide anatomical, functional and molecular imaging, but the most significant value of optoacoustic imaging is its capability to provide quantitative functional information based on the endogenous contrast of molecular constituents of red blood cells. The essence of functional imaging is to provide the physician with a map of blood distribution and its level of oxygenation, so that the physician can determine whether particular tissue is functioning normally. For example, a map of total hemoglobin distribution simultaneously showing an area with increased concentration and decreased oxygen saturation indicates potential malignancy. The essence of molecular imaging is to provide maps of distributions and concentrations of various molecules of interest for a specific health condition. For example, the distribution of specific protein receptors in cell membranes gives insight into the molecular biology of cells, which aids in designing drugs and therapeutic methods for treating human diseases.
SUMMARY
[0004] In an embodiment, the invention provides a real-time imaging system that visualizes thin tissue slices noninvasively through skin and provides three independent and co-registered images with biomedically significant information. Specifically, images of deep biological tissue structures are precisely superimposed with images of the tissue functional state, such as the total hemoglobin concentration and the blood level of oxygen saturation. The invention in this embodiment thus combines ultrasonic imaging and optoacoustic imaging technologies in a novel manner. These technologies can advantageously be combined given the complementary nature of the information they provide, and the fact that the same set of ultrasonic/pressure detectors and the same set of analog and digital electronics can be used to acquire both types of signals from tissues. In order to achieve a high level of accuracy of quantitative information and present it substantially in real time (i.e., substantially as it occurs), a design is disclosed that utilizes one or more dual-wavelength short-pulse lasers or a plurality of single-wavelength short-pulse lasers, a fiberoptic light delivery system, a hand-held imaging probe, and associated electronic hardware and processing software.
[0005] In an embodiment, an imaging system is disclosed for visualization of slices into the depth of tissue of at least a portion of a body. The system includes a processing subsystem that produces three independent images, including two functional images showing the distribution of the total hemoglobin concentration and the distribution of the blood oxygen saturation and one morphological image of tissue structures, the images being co-registered in time and space by utilizing the same hand-held imaging probe. The system may include a three-dimensional positioning system, which provides the capability of assembling three-dimensional volumetric images of said body from two-dimensional slices made through the depth of tissue obtained by scanning a hand-held probe along the surface of at least a portion of the body.
[0006] In an embodiment, an imaging method provides coregistered functional and anatomical mapping of tissue of at least a portion of a body. Ultrasonic pulses are delivered into the tissue and backscattered ultrasonic signals reflected from various structural tissue boundaries associated with body morphology are detected. Two optical pulses having different spectral bands of electromagnetic energy are delivered, and transient ultrasonic signals resulting from selective absorption of different energy fractions from each of the two optical pulses by hemoglobin and oxyhemoglobin of blood-containing tissues are detected. Detected ultrasonic signals are processed to remove noise, to reverse signal alterations incurred in the course of signal propagation through tissue and through the detection system components, and to restore the temporal shape and ultrasonic spectrum of the original signals. Image reconstruction and processing is performed to generate morphological images of tissue structures coregistered and superimposed with partially transparent functional images of the total hemoglobin concentration and blood oxygen saturation. The above steps of the process are repeated at a video frame rate so that real-time images can display tissue functional and morphological changes substantially as they occur.
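The final reconstruction-and-display step described above amounts to alpha-blending partially transparent functional maps onto a grayscale anatomical image. The following minimal numpy sketch, with synthetic data and an assumed 50% opacity (none of the names or values below come from the disclosure), illustrates that overlay operation.

```python
import numpy as np

def overlay_functional(anatomy, functional, mask, alpha=0.5):
    """Superimpose a partially transparent functional map on an anatomical image.

    anatomy    : 2-D grayscale ultrasound image, values in [0, 1]
    functional : 2-D functional parameter map (e.g., hemoglobin index)
    mask       : boolean array marking pixels where the functional map is shown
    """
    rgb = np.repeat(anatomy[..., None], 3, axis=2)          # gray -> RGB
    # Map the functional values to a simple red-blue scale (illustrative only).
    f = (functional - functional.min()) / (np.ptp(functional) + 1e-12)
    color = np.stack([f, np.zeros_like(f), 1.0 - f], axis=2)
    out = rgb.copy()
    out[mask] = (1 - alpha) * rgb[mask] + alpha * color[mask]
    return out

# Synthetic example: a 128 x 128 anatomical slice with a circular "lesion".
y, x = np.mgrid[0:128, 0:128]
anatomy = np.clip(0.4 + 0.1 * np.random.rand(128, 128), 0, 1)
lesion = (x - 64) ** 2 + (y - 80) ** 2 < 15 ** 2
hemoglobin_index = np.where(lesion, 2.0, 1.0)               # elevated in the lesion
fused = overlay_functional(anatomy, hemoglobin_index, lesion)
print(fused.shape)                                          # (128, 128, 3)
```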
BRIEF DESCRIPTION OF THE DRAWINGS
[0007] The disclosed embodiments are illustrated by way of example and not limitation in the figures of the accompanying drawings in which like references indicate similar elements.
[0008] FIG. 1A illustrates an embodiment of an optoacoustic probe with illumination of tissue through skin by a scattered light beam formed in tissue by merging two optical beams.
[0009] FIG. 1B illustrates how laser illumination light and an acoustic signal from an optoacoustic probe can be scattered from the skin towards an acoustic lens of a probe.
[0010] FIGS. 2A and 2B illustrate optoacoustic signals showing the impact of lateral ultrasonic waves induced by laser pulses in skin using optical beams on each side of an ultrasound transducer array, and using detection by transducers tilted at a large angle relative to the plane of images generated therefrom.
[0011] FIG. 3 illustrates an embodiment of the manifestation of image artifacts associated with the edge effect of an optical illumination beam having abrupt changes of optical fluence.
[0012] FIGS. 4A-4C illustrate embodiments wherein optical illumination of tissue is accomplished using a hand-held optoacoustic probe that delivers optical energy from either under the optoacoustic probe or on the side of the probe at different distances.
[0013] FIGS. 5A and 5B illustrate two embodiments of a hand-held optoacoustic ultrasonic probe protected from optical illumination of the acoustic lens.
[0014] FIG. 6 illustrates optoacoustic images using a probe with an acoustic lens that is not totally optically reflective and with a probe having an optically reflective layer of gold, which removes lens related image artifacts.
[0015] FIG. 7A illustrates an embodiment of an optical beam with sharp edges that may produce edge effects of acoustic waves and related artifacts and an optical beam with smooth edges producing reduced edge-related artifacts.
[0016] FIGS. 7B and 7C illustrate designs of an output fiber bundle with multiple sub-bundles shaped to provide even illumination of the image plane and reduce edge-related optoacoustic artifacts.
[0017] FIG. 8 illustrates the effect of optical illumination for two probes where two fiber bundles on each side of the respective probes are oriented to illuminate skin directly under the probe.
[0018] FIG. 9A illustrates embodiments of ultrasonic probes having flat, concave or convex arc shapes.
[0019] FIG. 9B shows a hand-held optoacoustic probe having a concave arc shape.
[0020] FIG. 9C illustrates details of a hand-held optoacoustic probe having a concave arc shape.
[0021] FIG. 9D shows an optoacoustic image of three spherical objects and demonstrates that within the field of view of the arc spatial (and especially lateral) resolution is excellent even for a large object.
[0022] FIG. 9E illustrates an alternate embodiment of an optoacoustic/ultrasonic hand-held probe design.
[0023] FIGS. 10A-10C show examples of the impulse response of an ultrasonic transducer with a relatively narrow ultrasonic frequency band of sensitivity, the impulse response of an ultrawide-band ultrasonic transducer, and the ultrasonic spectra of transducer sensitivity as a function of frequency for ultrawide-band and narrow band resonant transducers.
[0024] FIGS. 11A-11B provide an illustrative example of the deconvolution of the impulse response of transducers from the detected optoacoustic signals, where deconvolution restores the original, unaltered, N-shaped pressure signals.
[0025] FIGS. 12A-12C provide an illustrative example of wavelet filtered N-shaped optoacoustic signals restored to their original rectangular pressure profile by summation of all scales corresponding to frequency ranges from low to high for five scales, seven scales and nine scales.
[0026] FIG. 13 provides an illustrative diagram of radial backprojection where each transducer element aperture is weighted and normalized for the total aperture of the transducer array.
[0027] FIGS. 14A and 14B provide an illustrative example of optoacoustic tomographic images of an imaging slice through tissue with a small artery, larger vein and a rectangular grid allowing estimation of system performance in visualization of microvessels.
[0028] FIGS. 15A and 15B provide an illustrative example of optoacoustic tomographic images of a point spread function as visualized with a flat linear probe using a backpropagation algorithm and an aperture normalized backprojection algorithm.
[0029] FIGS. 16A and 16B provide an illustrative example of optoacoustic images of a phantom with hairs embedded at different depths where the first image was created using an embodiment of a standard palette and the second image was created using an embodiment of a depth-normalized palette.
[0030] FIGS. 17A and 17B provide an illustrative example of optoacoustic images of a phantom of a spherical simulated tumor obtained with a flat linear probe.
[0031] FIG. 18 shows a diagram illustrating tumor differentiation based on absorption coefficients at two wavelengths, 757 nm and 1064 nm, which match the local maximum of hemoglobin absorption, as in totally hypoxic blood (757 nm), and the minimum of the ratio of absorption by hypoxic hemoglobin to absorption by oxyhemoglobin, as in normally oxygenated blood (1064 nm).
[0032] FIG. 19 illustrates tumor differentiation based on absorption coefficients at two wavelengths in a phantom simulating benign (box) and malignant (sphere) tumors.
[0033] FIG. 20A shows an optoacoustic image of two intersecting tubes filled with blood having different levels of blood [S02].
[0034] FIG. 20B shows a photograph of an experimental setup that includes artificial blood vessels placed in a milk solution and imaged using an arc-shaped optoacoustic probe.
[0035] FIG. 20C shows coregistered 2D cross-sectional anatomical and functional images of blood vessel tubes showing six image panels with different anatomical and functional images.
[0036] FIGS. 21A and 21B show optoacoustic signal amplitude as a function of blood oxygen saturation (with constant hematocrit) under laser illumination at a wavelength of 1064 nm in FIG. 21A and at 757 nm in FIG. 21B. These plots illustrate that blood oxygen saturation can be monitored with optoacoustic imaging.
[0037] FIG. 22 illustrates optical absorption spectra of the main tissue chromophores absorbing optical energy in the near-infrared range: hemoglobin, oxyhemoglobin and water.
[0038] FIGS. 23A and 23B illustrate coregistered functional and anatomical imaging of breast tumors in phantoms accurately replicating optical and acoustic properties of an average breast with tumors.
[0039] FIGS. 24A and 24B illustrate coregistered functional and anatomical imaging of breast tumors.
DETAILED DESCRIPTION
[0040] The following description and drawings are illustrative and are not to be construed as limiting. Numerous specific details are described to provide a thorough understanding.
However, in certain instances, well-known or conventional details are not described in order to avoid obscuring the description. References to one embodiment or an embodiment in the present disclosure are not necessarily references to the same embodiment, and such references mean at least one.
[0041] Reference in this specification to "an embodiment" or "the embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least an embodiment of the disclosure. The appearances of the phrase "in an embodiment" in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. Moreover, various features are described which may be exhibited by some embodiments and not by others. Similarly, various requirements are described which may be requirements for some embodiments but not other embodiments.
System Overview
[0042] In at least some embodiments, the present disclosure is directed to a dual-modality ultrasonic/optoacoustic system for medical diagnostics that uses a hand-held probe for scanning along the skin surface of an organ and provides two types of two-dimensional maps into the depth of tissue, anatomical (morphological) and functional (blood hemoglobin index and blood oxygenation index). In an embodiment, these two maps are spatially coregistered by using the same array of ultrasonic transducers and temporally coregistered by acquiring the two types of images in real time, faster than any physiological changes can occur in the tissue of diagnostic interest. The blood hemoglobin index represents blood hemoglobin concentration changes in the areas of diagnostic interest relative to the background blood concentration. The blood oxygenation index represents blood oxygenation changes in the areas of diagnostic interest relative to the background level of blood oxygenation. These coregistered maps can be used to noninvasively differentiate malignant tumors from benign lumps and cysts.
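As a concrete illustration of how the two indices can be formed from already reconstructed maps, the sketch below normalizes mean values inside a region of diagnostic interest by the corresponding background means. It is a simplified numpy example; the function name, array names and synthetic values are assumptions, not part of the disclosure.

```python
import numpy as np

def blood_indices(thb_map, so2_map, roi_mask, background_mask):
    """Relative functional indices for a region of diagnostic interest.

    thb_map         : reconstructed total hemoglobin concentration map
    so2_map         : reconstructed blood oxygen saturation map
    roi_mask        : boolean mask of the area of diagnostic interest
    background_mask : boolean mask of surrounding background tissue
    """
    thb_bg = thb_map[background_mask].mean()
    so2_bg = so2_map[background_mask].mean()
    # Index > 1 means elevated relative to background, < 1 means reduced.
    hemoglobin_index = thb_map[roi_mask].mean() / thb_bg
    oxygenation_index = so2_map[roi_mask].mean() / so2_bg
    return hemoglobin_index, oxygenation_index

# Illustrative use with synthetic maps: a lesion with more blood, less oxygen.
thb = np.full((64, 64), 50.0)               # arbitrary units
so2 = np.full((64, 64), 0.95)
roi = np.zeros((64, 64), bool)
roi[20:30, 20:30] = True
thb[roi] = 90.0
so2[roi] = 0.70
hb_idx, ox_idx = blood_indices(thb, so2, roi, ~roi)
print(round(hb_idx, 2), round(ox_idx, 2))   # ~1.8 and ~0.74: suggestive of malignancy
```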
[0043] In an embodiment, the dual-modality ultrasonic/optoacoustic system of the present disclosure provides two-dimensional imaging of a body utilizing delivery of optical energy and acoustic detection of the resulting transient pressure waves using interchangeable hand-held probes, one of which is flat and used to perform a translational scan through at least a flat portion of the body under examination, and the second of which is curved, shaped as a concave arc, to perform a translational scan through at least a cylindrical or curved portion of the body under examination, both scans contributing to a more complete understanding of normal or pathological functions in the body.
[0044] In an embodiment, at least a portion of the body under examination contains molecules of blood constituents, such as hemoglobin and oxyhemoglobin, responsible for body functions, or receptors in cells responsible for cell functioning, as well as water, lipids or other constituents.
[0045] In an embodiment, optical energy produced using at least one laser beam is used for body illumination with at least one wavelength of light. In an embodiment, the optical energy is pulsed, with the pulse duration shorter than the time of ultrasound propagation through the distance in the body equal to the desirable spatial resolution. In an embodiment, the optical energy is within the spectral range from 532 nm to 1064 nm. In an embodiment, the optical energy is replaced with other electromagnetic energy with a wavelength from 1 nm to 1 m.
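A worked example of the pulse-duration condition stated above, using assumed but representative numbers (a 0.5 mm target resolution and a soft-tissue speed of sound), shows why nanosecond pulses are used:

```python
# Estimate from [0045]: the optical pulse must be shorter than the time
# ultrasound needs to cross one resolution cell. The 0.5 mm resolution target
# is an assumed, illustrative value.
speed_of_sound = 1540.0        # m/s, typical soft tissue
resolution = 0.5e-3            # m, desired spatial resolution (assumption)
max_pulse_duration = resolution / speed_of_sound
print(f"pulse must be shorter than ~{max_pulse_duration * 1e9:.0f} ns")   # ~325 ns
```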
[0046] In an embodiment, electronic signals produced by ultrasonic transducers are amplified using low noise wide band electronic amplifiers with high input impedance. In an embodiment, analog electronic signals are digitized by a multi-channel analog-to-digital converter and further processed utilizing a field programmable gate array. In an embodiment, the ultrasonic transducers are ultrawide-band transducers that detect ultrasonic signals with no or minimal reverberations. In an embodiment, the system is integrated with an ultrasound imaging system used to enhance visualization of acoustic boundaries in the body and parts of the body with different density and/or speed of sound.
[0047] In an embodiment, quantitative measurements of concentrations of target molecules, cells or tissues are made through characterization of optical energy propagation and absorption combined with processing of digital electronic signals by deconvolution of the hardware transfer function, in order to obtain the intrinsic optoacoustic amplitude and profile of such signals and the distribution of the optical absorption coefficient in the body.
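Deconvolution of the hardware transfer function is commonly done in the frequency domain; the sketch below shows a generic Wiener-type deconvolution with a toy impulse response and an idealized N-shaped transient. It illustrates the idea only and is not the processing code of the disclosed system; the regularization constant and the signal shapes are assumptions.

```python
import numpy as np

def wiener_deconvolve(recorded, impulse_response, noise_reg=1e-3):
    """Frequency-domain (Wiener) deconvolution of the hardware impulse response.

    Generic sketch: divide by the transfer function where it is strong and
    regularize where it is weak, so noise is not amplified without bound.
    """
    n = len(recorded)
    H = np.fft.rfft(impulse_response, n)
    S = np.fft.rfft(recorded, n)
    W = np.conj(H) / (np.abs(H) ** 2 + noise_reg * np.max(np.abs(H)) ** 2)
    return np.fft.irfft(S * W, n)

# Illustrative use: an idealized N-shaped pressure transient blurred by a toy
# band-limited transducer response, then restored (all values are assumptions).
t = np.linspace(-1, 1, 400)
n_shape = -t * (np.abs(t) < 0.3)                       # idealized N-shaped signal
ir = np.exp(-0.5 * (t / 0.05) ** 2) * np.cos(40 * t)   # toy impulse response
recorded = np.convolve(n_shape, ir, mode="full")       # what the transducer records
restored = wiener_deconvolve(recorded, ir)[:len(n_shape)]
print(np.argmax(n_shape), np.argmax(restored))         # peak positions nearly coincide
```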
[0048] In an embodiment, an optoacoustic contrast agent is used to visualize a portion of the body of interest or characterize distribution of certain molecules, cells or tissues in the body.
[0049] In an embodiment, the system comprises at least a laser, a light delivery system, an optoacoustic probe, an electronic system, a computer and an image display.
Laser
[0050] In an embodiment, the laser is capable of emitting short, nanosecond pulses of near-infrared light at two (or more) different toggling wavelengths, i.e., two different spectral bands. In an embodiment, one of the wavelengths is preferentially absorbed by hemoglobin of blood and the other is preferentially absorbed by oxyhemoglobin of blood. In an embodiment, illumination of the organ under examination with the first laser pulse at one wavelength (spectral band) and detection of the first optoacoustic signal profile resulting from the first illumination, followed by illumination with the second laser pulse at the second wavelength band and detection of the second optoacoustic signal profile, can provide data that can be used for reconstruction of two coregistered tomographic images, which in turn can be used for generation of functional maps of the areas of diagnostic interest based on (i) blood hemoglobin index and (ii) blood oxygenation index.
Light Delivery System
[0051] In an embodiment, the light delivery system comprises bundles of optical fibers.
In an embodiment, the input of the optical fiber bundle is circular to match the incident laser beam, while the output of the fiber bundle is rectangular to match the size and shape of the ultrasonic transducer array. In an embodiment, each fiber has a small diameter (e.g., down to 50 micron) to provide excellent flexibility of the bundles. In an embodiment, the input tip of the fiber bundle is fused to shape the bundle into a hexagon and to eliminate spaces between the fibers in the bundle, thereby providing up to 20% better transmission of the laser energy.
In an embodiment, the output tip of the fiber bundle is fully randomized such that fibers that appear close to each other at the input will appear far from each other at the output or even in different branches of the bifurcated fiber bundle.
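The two-wavelength scheme described under "Laser" above can be made concrete with a per-pixel linear unmixing step. The sketch below uses illustrative relative absorption values, not tabulated spectroscopic constants; they merely reflect the qualitative statement above that hypoxic hemoglobin dominates absorption at 757 nm and oxyhemoglobin at 1064 nm. The function and array names are likewise assumptions.

```python
import numpy as np

# Illustrative relative extinction values (arbitrary units); deoxygenated
# hemoglobin absorbs more strongly at 757 nm, oxyhemoglobin at 1064 nm.
EPS = np.array([[1.5, 0.6],    # 757 nm:  [Hb, HbO2]
                [0.6, 1.1]])   # 1064 nm: [Hb, HbO2]

def unmix(mu_757, mu_1064):
    """Per-pixel two-wavelength unmixing into total hemoglobin and SO2.

    Solves EPS @ c = mu for each pixel. Assumes the two absorption maps are
    already coregistered and fluence-compensated.
    """
    mu = np.stack([mu_757.ravel(), mu_1064.ravel()])
    c = np.linalg.solve(EPS, mu)                 # rows: Hb, HbO2
    hb = c[0].reshape(mu_757.shape)
    hbo2 = c[1].reshape(mu_757.shape)
    total = hb + hbo2
    so2 = np.divide(hbo2, total, out=np.zeros_like(total), where=total > 0)
    return total, so2

# Example: a pixel that absorbs strongly at 757 nm reports low oxygenation.
total, so2 = unmix(np.array([[1.2]]), np.array([[0.7]]))
print(round(float(total[0, 0]), 2), round(float(so2[0, 0]), 2))
```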
Optoacoustic Probe
[0052] The probe is designed to provide high contrast and resolution of both optoacoustic and ultrasonic images. In an embodiment, the probe is a hand-held probe with an array of ultrasonic/optoacoustic transducers, which can be designed to be single dimensional, 1.5 dimensional or two-dimensional. In an embodiment, the transducers detect acoustic waves within an ultra-wide band of ultrasonic frequencies and the ultra-wide band is shaped to match the spectrum of optoacoustic signals emitted by tissue of diagnostic interest. In an embodiment, the transducers are also designed to emit acoustic waves as short pulses of ultrasound with short ring-down time and minimal reverberations of gradually decreasing magnitude.
[0053] To achieve such a design, transducer material can be chosen from, for example, piezoelectric ceramics (such as PZT, PMN-PT, and PZNT), piezoelectric single crystals (such as PZT, PMN-PT, and PZNT), piezoelectric polymers (such as PVDF and PVDF copolymers), composite polymer-ceramic and polymer-crystal piezoelectric materials, and capacitive micromachined ultrasonic transducers (CMUT). In an embodiment, the thickness of the transducer elements, which determines the central frequency, and the materials of the backing layer and the front surface matching layer of the transducers are optimized.
[0054] In various embodiments, the shape of the ultrasound transducer array may be either flat or a concave arc. A flat design is suited to scanning of the surface of an organ under examination that has a radius of curvature much larger than the size of the probe, such as a human body. A concave arc-shaped design provides the largest aperture for optoacoustic signal detection with minimal physical dimensions. The large aperture, in turn, provides for improved lateral resolution within the angle of the field of view formed by lines connecting the arc's focal point with each edge transducer of the array. The arc-shaped probe is often the most effective for scanning body surfaces that are curved with a radius approximately matching that of the probe (such as the average-sized breast, neck, arms and legs).
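The aperture and field-of-view advantage of the concave arc can be quantified with elementary geometry. The sketch below uses assumed example dimensions (a 40 mm radius of curvature and a 60 mm arc; the disclosure does not specify these numbers here) to compute the field-of-view angle at the focal point and its lateral width at several depths.

```python
import math

# Assumed example dimensions (not taken from the disclosure): a concave arc
# probe with a 40 mm radius of curvature and a 60 mm long transducer arc.
radius = 40.0          # mm, radius of curvature (focal point at this depth)
arc_length = 60.0      # mm, length of the transducer arc

theta = arc_length / radius                    # angular extent of the arc, rad
print(f"field-of-view angle at the focal point: {math.degrees(theta):.0f} deg")

# Lateral width of the field of view at a given imaging depth below the face,
# bounded by the lines joining the focal point to the edge elements.
for depth in (10.0, 20.0, 30.0):               # mm
    width = 2.0 * (radius - depth) * math.sin(theta / 2.0)
    print(f"depth {depth:.0f} mm -> FOV width {width:.0f} mm")
```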
[0055] FIG. 1A illustrates an embodiment of an optoacoustic probe that provides illumination of tissue (TS) through skin (SK) by the scattered light (SL) beam formed in tissue by merging two optical beams (OB) emerging from fiber bundles (FB), expanding and passing through light diffusers (LD), then passing through optical windows (OW). Acoustic waves (AW) generated in blood vessels or tumors (BV or TM) by the scattered light (SL) in tissue propagate through the acoustic lens (AL) to the transducers (TR) and are converted into electrical signals that are transmitted by electrical cables (EC) through the backing material (BM) to the electronic amplifiers.
[0056] In an embodiment, the design of the optical fiber bundle is as follows. The input of the fiber bundle is circular with fused fiber tips to avoid loss of light through spaces between the fibers. The fiber diameter may be approximately 200 microns for good flexibility, and a fiber diameter of 100 microns or even 50 microns may be desirable in a particular application. This fiber bundle is Y-split into two half-bundles and fully randomized, so that substantially any two neighboring fibers from the input appear in different half-bundles. At least a majority of the neighboring fibers should be randomized in this regard. Each half-bundle is preferably split into multiple sub-bundles, and each sub-bundle is placed in its slot/niche to form fiber bundle "paddles". The two paddles are placed on each side of the ultrasonic transducer (TR) array assembly. As is discussed below with reference to FIGS. 7B and 7C, the output shape of each fiber bundle paddle may be rectangular for the width of the field of view, typically 40 mm, and have triangular-shaped ends. Such a triangular shape allows the output beam to have smooth edges after passing through the light diffuser (LD), FIG. 1A. Finally, the optical beams from the fiber bundle paddles exit from the probe into the skin (SK) through optical windows (OW) that comprise thin anti-reflection-coated glass plates or anti-reflection-coated polymer or plastic plates with acoustic impedance matching that of the tissues to be imaged.
[0057] There are a number of objectives for the present optoacoustic probe design: (i) substantially no light should propagate either through the acoustic lens (AL) or through the optical block acoustic damper (OBAD) on the sides of the probe; (ii) substantially no acoustic waves should be generated in the acoustic lens or the optical block acoustic damper materials through absorption of light; acoustic waves in a wide range of ultrasonic frequencies from 0.1 MHz to 15 MHz should be able to pass through the acoustic lens (AL) with no attenuation, and no acoustic waves should be able to pass through the OBAD; (iii) the optical beams (OB) exiting through the optical windows (OW) should have smooth edges of the optical fluence, and these optical beams should enter the skin as close to each other as necessary to merge due to optical scattering within the skin and enter underlying tissue under the array of transducers, providing maximum fluence in the image plane.
[0058] In an embodiment, the light delivery system directs light underneath the transducer elements, not through the array of transducer elements. In an embodiment, the design of the optoacoustic probe is based on an array of ultrasonic transducers with fiber optic delivery systems on each side of the ultrasonic array, positioned as close to the transducers as possible and with dimensions in the elevation axis of the transducer as small as possible, considering the need to focus ultrasonic beams to the depth of the most probable target. In an embodiment, the fiber optic delivery system is designed to allow penetration of the optical energy of the near infrared light into the organ being imaged, such as a breast, and minimum opto-thermo-mechanical interaction of the light beam with skin.
[0059] Another alternative design of the light delivery system delivers light to a mirror or prism(s) placed underneath the ultrasonic transducers in order to reflect the light orthogonally to the skin surface of an organ being imaged. In such embodiments, a standoff can be placed between the transducer elements and the skin/tissue. These alternative embodiments may be combined within the scope of the invention.
Detailed Description of Aspects of System Components
Optical Illumination and Probe Design
[0060] An acoustic lens is typically placed on transducers within an optoacoustic probe for purposes of focusing ultrasonic beams. While a probe could be provided without an acoustic lens, if there is no lens then the ultrasonic transducers may be directly exposed to light and absorb such light, which can result in very large artifact ultrasonic signals, especially where such light is pulsed. Optical illumination of the lens on an ultrasonic probe causes very strong transient acoustic waves that result in image artifacts. Up to 50% of near infrared light can be diffusely scattered by skin, depending on skin color. Mismatch of acoustic impedance between the lens and the transducer elements can cause reverberations with long ring-down time. Therefore, an embodiment of a probe design includes a white, strongly scattering, opaque lens. If such a lens is not needed due to the curved shape of each transducer element, then a white, strongly scattering front matching layer should be employed to protect the transducer elements from near-infrared light.
[0061] FIG. 1B illustrates how laser illumination light 110 and 120 from an optoacoustic probe can be scattered 130 from the skin 140 towards an acoustic lens 150 of a probe.
[0062] Furthermore, (laser) optical pulses can have a direct impact on the ultrasonic transducers through the acoustic waves induced by strong interaction of the optical pulses with the skin of the organ being imaged, which laterally traverse along the skin surface in a direction orthogonal to the image plane. When detected by the array of transducers, spatial distributions of these acoustic waves are projected onto the optoacoustic image at a depth equal to the lateral distance between the array of transducers and the optical beams on the skin surface, creating artifacts. Furthermore, acoustic waves generated in skin through reverberation of the acoustic lens and the housing of the probe can further affect the quality of imaging.
[0063] FIGS. 2A and 2B illustrate exemplary optoacoustic signals showing the impact of lateral ultrasonic waves induced by laser pulses in skin using optical beams on each side of an ultrasound transducer array. The signals shown are generated by transducers in the direction almost orthogonal to the plane of images generated therefrom. Such transducers may receive signals at a large oblique angle (up to 90 deg) relative to the plane of images generated therefrom, which is undesirable. Therefore, the design of the transducer array includes means to reject signals coming out of the image plane. Such means include, but are not limited to, a concave arc shape of the transducer elements and acoustic lens and delivery of the optical beam underneath the transducers. The detected optoacoustic signals 210 in FIG. 2A were generated using an effective acoustic coupling agent, in this case water. The signals 220 in FIG. 2B were generated in the absence of such acoustic coupling agent, i.e., using only air space to couple the acoustic signals to the transducer array.
[0064] Furthermore, the finite dimensions of the optical beam can affect the acoustic waves generated in response to impingement of the optical beam on tissue. Such acoustic waves can be generated at the sharp edges of the optical beam, propagate towards the array of transducers and result in artifacts. Since a system that utilizes a flat linear array of ultrasonic transducers is configured such that the first and the last transducer in the array detect these waves first and the central transducers detect these waves last, this "edge effect" results in v-shaped artifacts on a sinogram of optoacoustic signals and low frequency bulk artifacts on optoacoustic images.
[0065] FIG. 3 illustrates an example of manifestation of v-shaped artifacts 310 on a sinogram 300 of optoacoustic signals and associated artifacts 320 on optoacoustic images.
Since these acoustic waves are associated with the edge effect of the optical illumination beams having abrupt changes of the optical fluence, in an embodiment, one can see V-shaped bright signals on the sinogram and an associated series of artifact waves on the optoacoustic image.
[0066] Furthermore, the illumination geometry of optical beams projected by an optoacoustic probe can affect image quality. Where the optical beams of an optoacoustic probe are placed too far apart, this can result in a gradual transition from dark-field illumination (two separate optical beams on each side of the probe resulting in the absence of direct light under the probe in the image plane) into bright-field illumination (one beam under the probe going into the depth of tissue along the image plane). This transition creates a problem in the image brightness map, making the map not quantitatively accurate, and causes artifacts at the depth equal to the initial width between the separate optical illumination beams on each side of the probe.
[0067] FIGS. 4A-4C illustrate an embodiment wherein optical illumination of tissue is accomplished using a hand-held optoacoustic probe 410, 420, 430 that delivers optical energy from either under the optoacoustic probe or on the side of the probe at different distances. In the embodiment of FIG. 4A, when the optical beams are delivered under the ultrasonic probe, the distribution of the optical energy in the image plane has a smooth gradient with a maximum at the skin surface. This optical distribution is beneficial for high contrast of optoacoustic images. In the embodiment of FIG. 4B, when the optical beams are delivered close to a thin optoacoustic probe, the two beams can merge due to optical scattering within the skin, so that the distribution of the optical energy in tissue under the skin can be made similar to the embodiment of FIG. 4A. In the embodiment of FIG. 4C, when the optical beams are separated by a large distance, they merge only at significant depth within tissue, creating the optical distribution in the image plane with a dark zone (no light) in a subsurface layer of the tissue and a bright zone in the depth of the tissue, which is detrimental to the contrast of optoacoustic images, especially considering projection of brightly illuminated areas of skin onto the optoacoustic image plane at a depth equal to the separation distance of the two beams.
[0068] Thus, in the embodiments illustrated in FIGS. 4A-4C, the image brightness map 412, 422 and 432 of the tissue being scanned is optimized where the illumination of the skin is directly under the probe 410. As the distance between the center of the transducers and the center of the optical beams increases, as shown at 420 and 430, the image brightness map 422 and 432 of the tissue being scanned becomes progressively more uneven.
[0069] Lastly, the reflection of laser-induced ultrasound waves, launched into the tissue after being generated in skin, from boundaries of tissue structures (such as tumors, vessels or tissue layers) can also lead to image artifacts represented by lines, curves and overall noise.
[0070] In an embodiment, the acoustic lens of the probe is designed such that the lens reflects and scatters, but does not absorb, light from the illumination components, yet it is optically opaque. In various embodiments, such a lens can be made either using a strongly optically scattering material such as silicone rubber filled with titanium dioxide or barium sulfate powder, or using a thin metallic highly reflective layer such as aluminum or gold, or a combination of a white opaque lens material and a metal layer. In an embodiment, to avoid peel-off of the thin metallic layer from the front surface of the acoustic lens, in the case of a combination of diffusely scattering lens material and a thin reflective layer (foil), the metallic reflective layer can be placed between the two layers of diffusely scattering material.
As it is difficult to make a material with absolutely zero optical absorption, and such absorption may generate ultrasound in thermoelastic materials, the lens material can be made from thermoplastic materials having minimal thermal expansion, which produces minimal or no ultrasound in response to the absorbed optical energy.
[0071] FIGS. 5A and 5B illustrate two embodiments, respectively, of hand-held optoacoustic ultrasonic probes 510 and 520 that are protected from optical illumination of the acoustic lens. In FIG. 5A, a totally reflective opaque white lens is utilized, and in FIG. 5B a partially reflective white lens is utilized, with the light reflection capability of the lens enhanced by a gold layer or coating.
[0072] FIG. 6 illustrates optoacoustic images using a probe with a non-reflective acoustic lens 610 and a probe with reflective layer of gold 620. The probe utilizing a reflective layer of gold 620 produces an image with reduced artifacts 612 and 614.
[0073] In an embodiment, the probe housing serves as hypo-echoic encapsulation of the probe, which means that the probe housing is made from materials that (i) do not absorb laser light (more specifically near-infrared light), but if a small absorption is unavoidable, the materials having low thermal expansion do not emit ultrasound after absorption of the laser light, (ii) strongly attenuate and dampen ultrasonic waves and do not reverberate. The transducer assembly inside the probe housing is also made of the hypo-echoic material.
Alternatively, a layer of said hypo-echoic material is placed between the transducer assembly and the fiberoptic assembly to avoid generation of any ultrasound upon interaction of light with transducer assembly. In various embodiments, such materials can be chosen, for example, from white color porous and anechoic heterogeneous composites for baffles, foams, polymers, rubbers and plastics (such as CR-600 casting resin available from Micro-Mark of Berkeley Heights, NJ, or AM-37 available from Syntech Materials of Springfield, VA), and others. In an embodiment, any such materials are electrically non-conducting insulators to, inter alia, protect the probe from external electromagnetic emissions.
[0074] In an embodiment, the optical illumination subsystem is configured to deliver optical beams with smooth intensity edges. In an embodiment, the width of the optical beams is equal to that of the array of ultrasonic transducers within the optoacoustic probe (for example, about 5 mm). This is achieved by designing the bundle of optical fibers to have a gradually decreasing density of fibers at the edges. This design enables one to deliver laser illumination to the skin of the organ under examination so that the beam does not generate sharp edge-related acoustic waves, and such laser-induced acoustic waves do not produce V-shaped artifacts in the sinogram of optoacoustic images.
[0075] FIG. 7A illustrates an embodiment of an optical beam with sharp edges 710 that may produce edge effects of acoustic waves and related artifacts and an optical beam 720 with smooth edges producing reduced edge-related artifacts. FIGS. 7B and 7C illustrate that an optical beam with smooth edges of fluence can be produced using a fiberoptic bundle design having multiple sub-bundles and a triangular shape at each end of the fiber bundle assembly.
[0076] In an embodiment, the fiber bundle is positioned at a distance from the skin that is sufficient for the optical beam to expand to a desirable width. Where the dimensions of the probe are compact, the fibers used in the fiber bundle can be selected to have a higher numerical aperture (e.g., > 0.22). In an embodiment, in order to achieve better coupling of the optical beam into the skin, the beam is delivered through an optical window. In such an embodiment, the optical window touches the skin, making its surface flat for better light penetration, while simultaneously removing any excess of coupling gel from the skin surface being imaged. In an embodiment, the fiber bundle and the optical window are incorporated into the probe housing, so that the air gap between the fiber bundle and the window is protected.
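The relationship between fiber numerical aperture, standoff distance and beam width mentioned above can be estimated with simple geometry. The sketch below is illustrative only; the NA > 0.22 figure comes from the text, while the initial beam width and standoff distances are assumptions.

```python
import math

# How far the fiber output must stand off from the skin for the beam to expand
# to a desired width, given the fiber numerical aperture (propagation in air).
def beam_width(initial_width_mm, standoff_mm, na, n_medium=1.0):
    """Approximate full beam width after propagating a standoff distance."""
    half_angle = math.asin(na / n_medium)          # divergence half-angle
    return initial_width_mm + 2.0 * standoff_mm * math.tan(half_angle)

for na in (0.22, 0.39):
    w = beam_width(initial_width_mm=1.0, standoff_mm=10.0, na=na)
    print(f"NA={na}: ~{w:.1f} mm wide after a 10 mm air gap")
```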
[0077] In an embodiment, the optical window is designed to allow minimal interactions of both the optical beam and the laser-induced acoustic waves with such window. In an embodiment, the window is very thin and made of optically transparent material with an antireflection (AR) optical coating. In an embodiment, such material has anechoic acoustic properties. These anechoic acoustic properties and the fact that the illuminated skin is being depressed upon optoacoustic scanning result in dampening of ultrasonic waves laterally propagating from the laser-illuminated skin surface to the transducer array, thereby reducing associated artifacts.
[0078] In an embodiment, the probe is designed such that the optical beams are very close to the transducer elements on each side of the ultrasonic probe, which is made as thin as technologically possible. In an embodiment, the thickness of the probe is so small (e.g., 5 mm) that the light beams delivered to skin at this distance, d, from the probe center will merge into one beam within the thickness of the skin (about z=5 mm), and the tissue of the organ under examination receives one beam underneath the transducer elements.
This is called bright field illumination. In an embodiment, the optoacoustic probe is designed such that the optical light beam is delivered to the skin directly underneath the transducer elements.
[0079] FIG. 8 illustrates the effect of optical illumination for two probes 810 and 820 where two fiber bundles on each side of the respective probes are oriented to illuminate skin directly under the probe 812 and on either side of the probe 822. Where the skin is illuminated directly under the probe 812, a tumor 814 is clearly discernible and there is no clutter on the image background 816. Where the skin is illuminated on either side of the probe 822, the tumor is not discernible 824 and there are numerous artifacts on the image background 826.
[0080] In an embodiment, the optical beam width is designed to deliver increased light into the slice of tissue being imaged. In an embodiment, the beam is homogeneous, such that it has a constant fluence through the beam, as a heterogeneous beam generates acoustic sources of heterogeneities, which in turn produce artifacts in optoacoustic images. The fluence level is defined by the ANSI laser safety standards for the laser illumination of skin.
The beam width is limited by the capability of the optical scattering in tissue to deliver photons of light into the central slice located underneath the transducer elements (the slice being imaged). In an embodiment, the length of the optical beam is equal to the length of the transducer array. In an embodiment, the optical beam also has smooth edges, that is to say, gradually reduced fluence at the edges, since sharp edges produce strong edge artifacts on optoacoustic images.
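The ANSI-limited fluence mentioned above translates directly into a per-pulse energy budget for a given illuminated area. The sketch below is a rough, illustrative calculation: the 20 mJ/cm2 figure is only a representative conservative number (the actual limit is wavelength- and exposure-dependent and must be taken from the standard), and the beam dimensions are assumptions consistent with the roughly 40 mm by 5 mm outputs described earlier.

```python
# Rough pulse-energy budget implied by a skin fluence limit. All numbers are
# illustrative assumptions, not values taken from the disclosure or the
# standard itself.
fluence_limit = 20.0           # mJ/cm^2 (representative, conservative)
beam_length_cm = 4.0           # 40 mm along the transducer array
beam_width_cm = 0.5            # 5 mm in elevation
n_beams = 2                    # one paddle on each side of the array

area_cm2 = n_beams * beam_length_cm * beam_width_cm
max_pulse_energy_mJ = fluence_limit * area_cm2
print(f"illuminated area: {area_cm2:.0f} cm^2, "
      f"max energy per pulse: {max_pulse_energy_mJ:.0f} mJ")   # 4 cm^2 -> 80 mJ
```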
[0081] In an embodiment, design features of the optical illumination system and optoacoustic probe of the present disclosure can be summarized in the following Table:
Table 1. Summary of optical illumination and probe design.
System Feature: Advantages
- Arc hand-held probe: Higher aperture - lower distortions.
- Light delivery into the imaging plane: Improves optoacoustic image contrast and decreases artifacts by increasing the ratio of useful information (from the imaging plane) to noise (outside of the imaging plane).
- Optical shielding of the probe: Reduces artifacts from direct and scattered light striking the acoustic lens, probe housing, etc.
- Acoustic shielding of the probe: Acoustic shielding of the probe's housing reduces artifacts (clutter) from acoustic waves propagating through the probe's housing.
- Ultrawide-band transducers for both ultrasound and optoacoustic imaging: Allows the same array to be used for both ultrasonic and optoacoustic imaging.

[0082] In various embodiments, the shape of the ultrasonic transducer array for the combined optoacoustic/ultrasonic imaging can be either flat or convex arc-shaped. In an embodiment, the probe shape for optoacoustic imaging is concave arc-shaped.
Such a concave shape provides a large aperture with minimal physical dimensions and a wider field of view of the object being imaged, which in turn provides for improved lateral resolution and better reconstruction of the shape of the object being imaged.
[0083] FIGS. 9A-9C illustrate embodiments of optoacoustic/ultrasonic hand-held probes having flat or concave arc shapes 910 (FIG. 9A) and a hand-held transrectal probe with a linear shape 920 (FIG. 9B). FIG. 9C illustrates details of the optoacoustic/ultrasonic hand-held probe design with its face showing the ultrasonic transducer assembly, two layers of hypo-echoic light-reflecting and ultrasound-damping material on each side, and two optical windows for delivery of the optical beam.
[0084] FIG. 9C illustrates details of a hand-held optoacoustic probe having a concave arc shape. Electrical cables 930 are provided for bi-directional communication to and from the probe, and fiberoptic bundles 940 are provided for delivering light to the probe. An array of wide-band ultrasonic transducers 950 send and receive acoustic energy. The transducer array 950 is covered by an opaque white cylindrical lens (not shown for clarity purposes) that extensively scatters and reflects near-infrared light. Optical windows 960 provide optical beam outputs. In the embodiment FIG. 9C, the ultrasonic transducers within the probe may be designed so as not to be sensitive to lateral acoustic (ultrasonic) waves and to be free of reverberations, especially in the lateral direction. This can be achieved by the selection of the piezoelectric composite material, the shape of piezoceramic elements in the matrix and anechoic properties of the matrix. In an embodiment, the ultrasonic transducers are also designed to possess high sensitivity within an ultrawide band of ultrasonic frequencies. This in turn results in minimal reverberations that cause artifacts on optoacoustic/ultrasonic images.
[0085] FIG. 9D shows an optoacoustic image that illustrates advantages of the concave arc-shaped hand-held probe in terms of resolution in optoacoustic images. As presented in this embodiment, the shape and sharp edges of a large sphere are well depicted in cases where the object is within the field of view of the probe aperture. Outside the probe aperture, resolution and accuracy of shape reproduction decrease, but remain better than those of flat linear probes of similar width.
[0086] FIG. 9E illustrates an alternate embodiment of an optoacoustic/ultrasonic hand-held probe design that is capable of two-dimensional imaging within a plane going parallel to the skin surface at various selected depths, and three-dimensional imaging as well.
[0087] In an embodiment, a hand-held probe that is scanned along the skin surface producing real-time two-dimensional images of tissues in the body under examination also has a component serving for accurate global 3D positioning of the probe. This design allows the imaging system to remember positions of all tissue slices and to reconstruct three-dimensional images at the end of the scanning procedure.
Electronic Data Acquisition System
[0088] In an embodiment, the present disclosure is directed to an optoacoustic imaging system having an electronic data acquisition system that operates in both optoacoustic and ultrasonic modes and can rapidly switch between such modes. In an embodiment, this is achieved with firmware that controls functions of a Field Programmable Gate Array (FPGA), the main microprocessor on the electronic data acquisition system. In an embodiment, a reprogrammable FPGA can toggle between optoacoustic and ultrasound operation modes in real-time, thus enabling co-registration of ultrasound and optoacoustic images, which can be used for diagnostic imaging based on functional and anatomical maps. In an embodiment, FPGA functions include controlling, acquiring, and storing optoacoustic and/or ultrasound data, signal processing and transferring data for real-time image reconstruction and processing. In an embodiment, the FPGA may also be employed in ultrasound beam forming and image reconstruction.
[0089] In an embodiment, the electronic data acquisition system design utilizes one or more multi-core Graphical Processor Units (GPU) for image reconstruction and processing.
In the ultrasound mode, in an embodiment, the FPGA controls the ultrasound transmission and it performs both ultrasound and optoacoustic data acquisitions on a multi-channel board.
In order to enhance operation of the memory of the FPGA, an external memory buffer can be used. In an embodiment, the FPGA allows rapid reprogramming from ultrasonic data acquisition with a signal/frame repetition rate of about 2 to 20 kHz to optoacoustic data acquisition with a signal/frame repetition rate of about 10-20 Hz, and also configures the structure of gates and internal memory structure and size to allow real-time switching between ultrasound emission and detection, laser synchronization, and system controls. In an embodiment, multiple FPGAs can be used to enhance the system performance. In an embodiment, in the ultrasound and optoacoustic modes, the FPGA clock can be changed by the appropriate time-division multiplexing (TDM). In an embodiment, the design of a multi-channel electronic data acquisition system can be based on modules, with a module typically being from 16 to 128 channels, although 256 channels or more may be appropriate in some applications. In an embodiment, the design of a multi-channel electronic data acquisition system has 64 channels.
[0090] In order to achieve dual-modality operation of the optoacoustic/ultrasonic system, a separate optoacoustic electronic system could also be combined with a separate ultrasonic electronic system through a single probe. In an embodiment, the probe has a cable with a Y-split to connect the probe to the optoacoustic and ultrasonic electronic systems. In an embodiment, a programmable electronic switch allows one to send the detected signal from the probe (transducer array) either to the optoacoustic electronics (to operate in optoacoustic mode) or to the ultrasonic electronics, and from the ultrasonic electronics to the probe (to operate in ultrasound mode). In an embodiment, a synchronization trigger signal is sent to the optoacoustic and ultrasonic systems sequentially, so that the optoacoustic and ultrasonic images are acquired one after the other.
Processing, Reconstruction and Display of Images
Signal Processing
[0091] In various embodiments, a goal of the diagnostic imaging procedure is to display each pixel with a brightness that correctly replicates the originally generated signals in each voxel of tissue displayed on the image. On the other hand, intrinsic pressure signals generated by optical pulses within tissues may be significantly altered in the course of propagation through tissue and especially in the course of detection and recording by the ultrasonic transducers and the electronics subsystem.
[0092] In an embodiment, detected signals are processed to reverse alterations and restore original signals. In an embodiment, such reversal can be achieved through deconvolution of the impulse response (IR) of the system. In an embodiment, the impulse response can be measured by recording and digitizing a delta-function ultrasonic signal generated by short (nanosecond) laser pulses in a strongly absorbing optical medium with high thermoelastic expansion coefficient.
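The impulse-response deconvolution described above can be sketched in the frequency domain roughly as follows; this is a minimal illustration assuming a regularized (Wiener-like) inverse filter, with the regularization constant chosen arbitrarily rather than taken from the disclosure.

```python
import numpy as np

def deconvolve_impulse_response(signal, impulse_response, reg=1e-3):
    """Regularized deconvolution of the measured system impulse response
    from a detected optoacoustic signal. `reg` stabilizes frequencies where
    the impulse-response spectrum is weak (assumed value, tune per system)."""
    n = len(signal)
    S = np.fft.rfft(signal, n)
    H = np.fft.rfft(impulse_response, n)
    # Wiener-style inverse filter: H* / (|H|^2 + reg * max|H|^2)
    H2 = np.abs(H) ** 2
    inverse_filter = np.conj(H) / (H2 + reg * H2.max())
    return np.fft.irfft(S * inverse_filter, n)
```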
[0093] One component of the impulse response is the acousto-electrical impulse response, which accounts for the optoacoustic or ultrasonic signal distortions due to the properties of the ultrasonic transducers, cables and analog electronics. A second component of the impulse response is the spatial impulse response, which accounts for the signal distortions associated with the finite dimensions of the ultrasonic transducers. In various embodiments, large transducers can integrate ultrasonic waves incident at an angle, whereas point-source-like transducers can provide a perfect or near-perfect delta-function spatial impulse response.
[0094] In an embodiment, any distortions in the acousto-electrical impulse response can be reversed by deconvolution of the impulse response from the detected signals.
However, possible distortions in the spatial impulse response can be avoided by designing transducers with small dimensions within the image plane. In an embodiment, the dimensions of the transducers are much smaller than the shortest wavelength of the ultrasound that may be detected or emitted by the transducers.
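As a short worked check of this criterion, assuming a soft-tissue speed of sound of about 1.5 mm/µs and an upper detection frequency of 15 MHz (both values are assumptions used only for illustration):

```python
SPEED_OF_SOUND_MM_PER_US = 1.5   # assumed soft-tissue value
F_MAX_MHZ = 15.0                 # assumed upper frequency of the detection band

# Shortest detectable wavelength: lambda_min = c / f_max
lambda_min_mm = SPEED_OF_SOUND_MM_PER_US / F_MAX_MHZ   # = 0.1 mm
print(f"lambda_min = {lambda_min_mm:.2f} mm; in-plane element size "
      f"should be well below this, e.g. < {lambda_min_mm / 2:.2f} mm")
```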
[0095] FIGS. 10A-10C show examples of the impulse response of an ultrasonic transducer with a relatively narrow band of sensitivity 1010, the impulse response of an ultrawide-band ultrasonic transducer 1020 and ultrasonic spectra of the transducer sensitivity as a function of frequency for ultrawide-band and narrow band resonant transducers 1030.
[0096] In an embodiment, the first step in processing an optoacoustic signal in an imaging system that produces two-dimensional optoacoustic images is deconvolution of the acousto-electrical impulse response.
[0097] FIGS. 11A and 11B provide an illustrative example of the deconvolution of impulse response of transducers from the detected optoacoustic signals 1110, where deconvolution restores the original, unaltered, N-shaped pressure signals 1120.
[0098] In an embodiment, the second step in processing an optoacoustic signal is filtering to remove noise. In an embodiment, the signal filter is based on a wavelet transform that operates simultaneously in the frequency and time domains. In an embodiment, such a wavelet filter is capable of filtering out certain frequency components of the signal that belong to noise and appear at a given time, while preserving similar frequency components of the useful signal that appear at a different time. In an embodiment, the frequency spectrum of the wavelet filter replicates the frequency band of a typical N-shaped optoacoustic signal while simultaneously providing smooth window edges that do not cause signal distortions upon convolution.
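A minimal sketch of such time-frequency filtering, assuming the PyWavelets package; the wavelet family, decomposition depth and thresholding rule are illustrative assumptions rather than the specific filter of the disclosure.

```python
import numpy as np
import pywt

def wavelet_denoise(signal, wavelet="sym6", level=5):
    """Soft-threshold the detail coefficients of a discrete wavelet transform:
    noise appearing at a given time and scale is suppressed while signal
    components at other times in the same frequency band are preserved."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    # Robust noise estimate from the finest detail scale (universal threshold).
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745
    thr = sigma * np.sqrt(2.0 * np.log(len(signal)))
    coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: len(signal)]
```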
[0099] In an embodiment, such a wavelet filter is useful in optoacoustic imaging in its capability to restore the original pressure profile generated in tissue prior to pressure propagation. In the course of propagation through tissue, the originally positive pressure signal converts into a bipolar (compression/tension) profile. Therefore, reconstruction of an image of absorbed optical energy (optoacoustic image) requires a transformation that starts with bipolar signals and provides for all-positive values of the optoacoustic image intensities.
In an embodiment, a multi-scale wavelet filter, for example, a filter that simultaneously integrates the signal over time and provides summation of a number of frequency bands present in the signal, can convert bipolar pressure signals into a monopolar signal representing thermal energy or the originally generated positive pressure.
[00100] FIGS. 12A-12C provide an illustrative example of wavelet filtered N-shaped optoacoustic signals restored to their original rectangular pressure profile by summation of all scales corresponding to frequency ranges from low to high for five scales 1210, seven scales 1220 and nine scales 1230.
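The bipolar-to-monopolar conversion can be illustrated in a simplified way by time-integrating a synthetic N-shaped signal; this is only a stand-in, under assumed signal parameters, for the multi-scale wavelet summation described above.

```python
import numpy as np

# Synthetic N-shaped optoacoustic signal from a uniformly absorbing sphere:
# compression followed by rarefaction (bipolar profile).
t = np.linspace(-2.0, 2.0, 2001)                 # time axis, arbitrary units
n_wave = np.where(np.abs(t) <= 1.0, -t, 0.0)

# Time integration converts the bipolar N-wave into a monopolar (all-positive)
# hump proportional to the originally generated pressure / absorbed energy.
dt = t[1] - t[0]
monopolar = np.cumsum(n_wave) * dt
print(monopolar.min() >= -1e-9, monopolar.max() > 0)   # ~non-negative hump
```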
[00101] In various embodiments, wavelet filtering permits enhancement of objects on the image within a certain range of dimensions. An imaging operator (ultrasonic technician or diagnostic radiologist) typically desires to better visualize a tumor with specific dimensions and other objects, such as blood vessels, with their specific dimensions. In an embodiment, the wavelet filter allows the operator to apply a specific selection of wavelet filter scales that enhances only objects of certain sizes and suppresses objects of other, less important sizes. In an embodiment, boundaries can be well visualized for objects of any size, so the high-frequency wavelet scales are beneficial for the image quality and are included in the selection of scales. In an embodiment, for a mathematically correct tomographic reconstruction, a ramp filter can be applied to the signal, which linearly enhances the contribution of higher frequencies.
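A minimal sketch of the ramp filter mentioned above, applied in the Fourier domain; the normalization is an assumption.

```python
import numpy as np

def ramp_filter(signal, fs):
    """Weight each frequency component linearly with |f| (ramp filter),
    as used for mathematically exact filtered backprojection."""
    n = len(signal)
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    weights = freqs / freqs.max()      # linear ramp, normalized to 1 at f_max
    return np.fft.irfft(spectrum * weights, n)
```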
Image Reconstruction
[00102] In various embodiments, image reconstruction typically uses radial back-projection of processed and filtered signals onto the image plane. However, due to the limited field of view available from small hand-held probes, only an incomplete data set can be obtained. As a result, the 2D optoacoustic images may include artifacts distorting the shape and brightness of the objects displayed on the images. In an embodiment, aperture integrated normalized radial back projection is used to correct some of the reconstruction artifacts that are observed in limited aperture optoacoustic tomography.
[00103] FIG. 13 provides an illustrative diagram of radial backprojection where each transducer element aperture is weighted and normalized for the total aperture of the transducer array.
[00104] In an embodiment, $T_{k1}$–$T_{k5}$, 1311-1315, are the transducers 1310 in the array; $I_{i,j}$ is the brightness (intensity) of a pixel with coordinates $(i,j)$; $\omega_{i,j,k}$, 1320, 1330, is the angular portion of the optoacoustic wave front emitted by the pixel $(i,j)$ as it is visualized by transducer $k$; $\Omega_{i,j} = \sum_k \omega_{i,j,k}$ (the sum of all $\omega_{i,j,k}$) is the portion of the optoacoustic wave front emitted by pixel $(i,j)$ as it is visualized by the entire transducer array; and $S_{i,j,k}$ is the sample of the optoacoustic signal measured by the $k$-th transducer and used in reconstruction of the brightness in the pixel $(i,j)$. Various backpropagation algorithms can be used to normalize an optoacoustic image.
[00105] In an embodiment, a backpropagation algorithm can be expressed as:

$$I_{i,j} = \sum_{k} S_{i,j,k} \qquad (1)$$

[00106] However, in at least some embodiments, aperture normalized backprojection produces superior image results. In an embodiment, the aperture normalized backprojection can be expressed as:

$$I_{i,j} = \frac{1}{\Omega_{i,j}} \sum_{k} \omega_{i,j,k}\, S_{i,j,k} \qquad (2)$$

[00107] FIGS. 14A and 14B provide an illustrative example of optoacoustic tomographic images 1410 and 1420 of an imaging slice through a tumor angiogenesis model. In the first image 1410, a backpropagation algorithm, such as the first algorithm immediately above, is used to normalize the image. The resulting image has strong, bright arc-shaped artifacts 1412 around the blood vessels 1414 that are close to the array surface. In the second image 1420, the aperture normalized backprojection algorithm, such as the second algorithm immediately above, is used to normalize the image. As can be seen, the aperture normalized backprojection algorithm corrects image brightness and reduces the arc-shaped artifacts.
[00108] FIGS. 15A and 15B provide an illustrative example of optoacoustic tomographic images 1510 and 1520 of a point spread function as visualized with a flat linear probe using a backpropagation algorithm (1510), such as the first algorithm immediately above, and an aperture normalized backprojection algorithm (1520), such as the second algorithm immediately above. As can be seen, the aperture normalized backprojection algorithm corrects image brightness and reduces artifacts.
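Equations (1) and (2) can be sketched numerically as follows, assuming a flat linear array at z = 0 and approximating the angular weight ω_{i,j,k} by the angular aperture of each element as seen from the pixel; the geometry, sampling parameters and weight model are illustrative assumptions, not the disclosed implementation.

```python
import numpy as np

def backproject(signals, fs, c, elem_x, xs, zs, pitch, aperture_normalized=True):
    """Radial backprojection of per-channel signals onto an image grid.

    signals : (n_elements, n_samples) processed optoacoustic signals S_{i,j,k}
    elem_x  : x positions of the array elements (flat array assumed at z = 0)
    xs, zs  : image grid coordinates; pitch : element pitch
    aperture_normalized=False implements Eq. (1); True implements Eq. (2).
    """
    n_elem, n_samp = signals.shape
    image = np.zeros((zs.size, xs.size))
    omega_sum = np.zeros_like(image)
    for k in range(n_elem):
        dx = xs[None, :] - elem_x[k]
        dz = zs[:, None] + np.zeros_like(dx)
        r = np.hypot(dx, dz)
        idx = np.clip(np.round(r / c * fs).astype(int), 0, n_samp - 1)
        s = signals[k][idx]                        # S_{i,j,k}
        if aperture_normalized:
            # ~ angular aperture of element k as seen from pixel (i,j)
            omega = pitch * dz / (r ** 2 + 1e-12)
            image += omega * s
            omega_sum += omega
        else:
            image += s
    if aperture_normalized:
        image /= np.maximum(omega_sum, 1e-12)      # divide by Omega_{i,j}
    return image
```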
Image Processing and Display
[00109] In an embodiment, the optoacoustic image palette is equalized to diminish effects of light distribution within tissue. Such equalization transforms the dynamic range of optoacoustic images for better visualization of both shallow and deep objects.
[00110] FIGS. 16A and 16B provide an illustrative example of optoacoustic images 1610 and 1620 of a phantom with hairs embedded at different depths, where the first image 1610 was created using an embodiment of a standard palette and the second image 1620 was created using an embodiment of a depth-normalized palette. As can be seen, utilizing the depth-normalized palette enhances visibility of deep objects in the illustrated embodiment.
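A minimal sketch of such depth-dependent equalization, assuming a simple exponential model of effective fluence decay; the attenuation coefficient and gain cap are placeholder assumptions.

```python
import numpy as np

def depth_equalize(image, dz_mm, mu_eff_per_mm=0.1, max_gain=50.0):
    """Multiply each image row by a depth-dependent gain that roughly
    compensates exponential decay of optical fluence with depth, so that
    shallow and deep objects can be displayed on a common palette."""
    depths = np.arange(image.shape[0]) * dz_mm
    gain = np.minimum(np.exp(mu_eff_per_mm * depths), max_gain)
    return image * gain[:, None]
```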
[00111] In an embodiment, principal component analysis (PCA) on a single optoacoustic image acquisition (different channels) is used to remove cross-correlated signal noise.
Principal component analysis on a dataset of optoacoustic signals can remove correlated image clutter. Principal component analysis on optoacoustic frames can also remove correlated image clutter.
[00112] FIGS. 17A and 17B provide an illustrative example of optoacoustic images 1710 and 1720 of a phantom of a spherical simulated tumor obtained with a flat linear probe. The first image 1710 is a raw image that was not subjected to principal component analysis processing. The second image 1720 has been subjected to principal component analysis processing with first principal component deconvolution. As can be seen, utilizing principal component analysis processing enhances image quality by, inter alia, reducing artifacts.
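One possible realization of this PCA step is sketched below: the first principal component across the channels of a single acquisition, assumed to capture the cross-correlated noise, is subtracted.

```python
import numpy as np

def remove_first_principal_component(channel_data):
    """channel_data: (n_channels, n_samples). Subtract the rank-1 component
    shared across channels, which is assumed here to capture cross-correlated
    noise common to all channels."""
    mean = channel_data.mean(axis=0, keepdims=True)
    centered = channel_data - mean
    u, s, vt = np.linalg.svd(centered, full_matrices=False)
    rank1 = s[0] * np.outer(u[:, 0], vt[0])     # first principal component
    return centered - rank1 + mean
```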
[00113] In an embodiment, design features of signal and image processing of the present disclosure can be summarized in Table 2 as follows:
Table 2. Summary of signal and image processing.
System Feature | Advantages
Operator-assisted boundary tracking on ultrasonic and optoacoustic images | Can improve quantitative optoacoustic diagnostics by evaluating the diagnostic parameters within the tumor boundary defined on US images; diagnostics can be enhanced by morphological analysis of the tumor boundary.
Aperture integrated normalized radial back projection | Corrects some of the reconstruction artifacts that are observed in limited aperture optoacoustic tomography.
Equalization of the optoacoustic image palette to diminish effects of light distribution within the tissue | Transforms the dynamic range of optoacoustic images for better visualization of both shallow and deep objects.
Principal component analysis (PCA) of the optoacoustic signal data | PCA on a single optoacoustic acquisition (different channels) is a fast and efficient way to remove cross-correlated signal noise; PCA on a dataset of optoacoustic signals removes correlated image clutter; PCA on optoacoustic frames removes correlated image clutter.
Optoacoustic imaging system with quantitative assessment of total hemoglobin, blood oxygenation, and water | Cancer diagnostics based on those parameters or a single malignancy index (tHb*water/oxygenation) with respect to average background.
Wavelet transform that enhances images of objects within a certain dimension range | Operator can easily select the maximum size of the objects to be enhanced on the image; everything larger will be filtered out.
Adaptive beam-forming for optoacoustic imaging | Allows individual reconstruction on a family of radial wavelet sub-bands.

Diagnostic Image Reprocessing
[00114] The principles of functional diagnostic imaging can be based on the tumor pathophysiology. For example, malignant tumors have an enhanced concentration of the total hemoglobin and a reduced level of oxygen saturation in the hemoglobin of blood.
In an embodiment, optoacoustic images can be reprocessed and converted into, inter alia, images of (i) the total hemoglobin [tHb] and (ii) the oxygen saturation of hemoglobin [SO2]. FIG. 18 demonstrates an example of two breast tumors.
[00115] FIG. 18 shows a diagram illustrating tumor differentiation based on absorption coefficients at two wavelengths, 757 nm, 1810, and 1064 nm, 1820, which match the local maximum (757 nm) and minimum (1064 nm) of the ratio of absorption by hemoglobin (hypoxic blood) to absorption by oxyhemoglobin. As can be seen, a malignant tumor, 1830, has a higher absorption coefficient at 757 nm than a benign tumor, 1840, whereas the benign tumor, 1840, has a higher absorption coefficient at 1064 nm than the malignant tumor, 1830.
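A minimal sketch of such reprocessing by two-wavelength spectral unmixing; the 2x2 system follows from expressing the absorption coefficient at each wavelength as a weighted sum of the hemoglobin and oxyhemoglobin contributions. The extinction-coefficient matrix shown is a relative, illustrative placeholder only; tabulated values must be used in practice.

```python
import numpy as np

def unmix_thb_so2(mua_757, mua_1064, eps):
    """Solve, per pixel, the 2x2 spectral system
        mu_a(lambda) = eps_HHb(lambda)*[HHb] + eps_HbO2(lambda)*[HbO2],
    then form [tHb] = [HHb] + [HbO2] and [SO2] = [HbO2] / [tHb].

    eps is a 2x2 matrix [[eps_HHb(757),  eps_HbO2(757)],
                         [eps_HHb(1064), eps_HbO2(1064)]]
    supplied by the caller (tabulated extinction coefficients)."""
    mua = np.stack([mua_757.ravel(), mua_1064.ravel()])   # (2, n_pixels)
    conc = np.linalg.solve(eps, mua)                      # rows: [HHb], [HbO2]
    hhb = conc[0].reshape(mua_757.shape)
    hbo2 = conc[1].reshape(mua_757.shape)
    thb = hhb + hbo2
    so2 = np.divide(hbo2, thb, out=np.zeros_like(thb), where=thb != 0)
    return thb, so2

# Relative, illustrative PLACEHOLDERS only (not tabulated data): deoxygenated
# hemoglobin dominates at 757 nm, oxyhemoglobin dominates at 1064 nm.
EPS_PLACEHOLDER = np.array([[1.0, 0.35],
                            [0.25, 1.0]])
```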
[00116] FIG. 19 illustrates tumor differentiation by optoacoustic imaging based on absorption coefficients at two wavelengths 1910 and 1920 in a phantom. At 757 nm, 1920, a model of a malignant tumor is clearly visible, 1922, whereas the model of the malignant tumor, 1922, is not visible at 1064 nm, 1910.
[00117] FIG. 20A shows an optoacoustic image of two intersecting tubes filled with blood having different levels of blood [SO2] (98% in the left tube, and 31% in the right tube). The tubes were placed in 1% fat milk with optical properties similar to those found in the human breast. The wavelength of laser illumination used for this image is 1064 nm.
FIG. 20B
shows a photograph of an experimental setup that includes artificial blood vessels placed in milk solution and imaged using an arc-shaped optoacoustic probe. FIG. 20C shows coregistered 2D cross-sectional anatomical and functional images of blood vessel tubes showing six image panels: (1-upper left) ultrasound image depicting anatomy of the body with vessels; (2-upper right) optoacoustic image obtained at the wavelength of 757 nm; (3-lower right) optoacoustic image obtained at the wavelength of 1064 nm; (4-lower left) functional image of the total hemoglobin [tHb]; (5-lower center) functional image of the blood oxygen saturation [SO2];
(6-upper center) functional image of the blood oxygen saturation presented only in the area of maximum concentration of the total hemoglobin. Raw optoacoustic images depicted in FIG.
20C in the upper right and lower right panels demonstrate different brightness of blood vessels having blood with different levels of the total hemoglobin concentration [tHb] and blood oxygen saturation [SO2]. Accurate quantitative measurements could be performed under conditions of normalized fluence of the optical illumination of tissue in the body as a function of depth. These optoacoustic images were used to reconstruct functional images of the total hemoglobin [tHb] and the blood oxygenation [SO2]. All functional images displayed in FIG. 20C are coregistered and superimposed with the anatomical image of tissue structure for better correlation of features.
[00118] FIGS. 21A and 21B show optoacoustic signal amplitude as a function of blood oxygen saturation (with constant hematocrit) under laser illumination at the wavelength of 1064 nm in FIG. 21A and at 757 nm in FIG. 21B. These plots illustrate that blood oxygen saturation can be monitored with optoacoustic imaging. Specifically, this embodiment illustrates quantitative data based on measurements of the optoacoustic signal amplitude in blood having various levels of oxygen saturation (from 30% to 98%) and a hematocrit of 38 g/dL of hemoglobin [tHb] in erythrocytes. As predicted by the published absorption spectra of blood, the optoacoustic signal amplitude at 1064 nm illumination increases with increased level of oxygen saturation, while the optoacoustic signal amplitude decreases with increased blood oxygenation at the 757 nm illumination wavelength.
[00119] FIG. 22 illustrates optical absorption spectra of the main tissue chromophores absorbing optical energy in the near-infrared range: hemoglobin, oxyhemoglobin and water.
Preferred laser wavelengths for functional imaging are 757 nm and 1064 nm, matching the maximum and minimum of the [HHb]/[O2Hb] absorption ratio, while the wavelength of 800 nm is best suited for calibration purposes through measurements of the total hemoglobin [tHb].
[00120] FIGS. 23A and 23B illustrate coregistered functional and anatomical imaging of breast tumors in phantoms accurately replicating optical and acoustic properties of an average breast with tumors. FIG. 23A shows 2D images of: model of malignant tumor morphology based on ultrasound (left), the same anatomical image coregistered with functional image of the total hemoglobin concentration (center) and with functional image of the blood oxygenation (right). FIG. 23B shows 2D images of a model benign tumor:
morphology image based on ultrasound (left), the same anatomical image coregistered with functional image of the total hemoglobin concentration (center) and with functional image of the blood oxygenation (right).
[00121] FIGS. 24A and 24B illustrate coregistered functional and anatomical imaging of breast tumors. FIG. 24A shows 2D images of invasive ductal carcinoma, a malignant tumor with rough boundaries, heterogeneous morphology, high concentration of total hemoglobin and low oxygen saturation (hypoxia). The malignant tumor morphology is based on ultrasound in the left image, and the same anatomical image coregistered with functional image of the blood oxygenation in the center image and with functional image of the total hemoglobin concentration in the right image. FIG. 24B shows 2D images of a breast with fibroadenoma, a benign tumor with relatively round boundaries, normal concentration of oxyhemoglobin and relatively low total hemoglobin. Breast morphology is based on ultrasound in the left image, and the same anatomical image is coregistered with a functional image of the blood oxygenation in the center image and with a functional image of the total hemoglobin concentration in the right image.
Conclusion
[00122] While some embodiments can be implemented in fully functioning computers and computer systems, various embodiments are capable of being distributed as a computing product in a variety of forms and are capable of being applied regardless of the particular type of machine or computer-readable media used to actually effect the distribution.
[00123] At least some aspects disclosed can be embodied, at least in part, in software.
That is, the techniques described herein may be carried out in a special purpose or general purpose computer system or other data processing system in response to its processor, such as a microprocessor, executing sequences of instructions contained in a memory, such as ROM, volatile RAM, non-volatile memory, cache or a remote storage device.
[00124] Routines executed to implement the embodiments may be implemented as part of an operating system, firmware, ROM, middleware, service delivery platform, SDK
(Software Development Kit) component, web services, or other specific application, component, program, object, module or sequence of instructions referred to as "computer programs."
Invocation interfaces to these routines can be exposed to a software development community as an API (Application Programming Interface). The computer programs typically comprise one or more sets of instructions stored at various times in various memory and storage devices in a computer that, when read and executed by one or more processors in the computer, cause the computer to perform operations necessary to execute elements involving the various aspects.
[00125] A machine-readable medium can be used to store software and data which when executed by a data processing system causes the system to perform various methods. The executable software and data may be stored in various places including for example ROM, volatile RAM, non-volatile memory and/or cache. Portions of this software and/or data may be stored in any one of these storage devices. Further, the data and instructions can be obtained from centralized servers or peer-to-peer networks. Different portions of the data and instructions can be obtained from different centralized servers and/or peer-to-peer networks at different times and in different communication sessions or in a same communication session. The data and instructions can be obtained in entirety prior to the execution of the applications. Alternatively, portions of the data and instructions can be obtained dynamically, just in time, when needed for execution. Thus, it is not required that the data and instructions be on a machine-readable medium in entirety at a particular instance of time.
[00126] Examples of computer-readable media include but are not limited to recordable and non-recordable type media such as volatile and non-volatile memory devices, read only SUBSTITUTE SHEET (RULE 26) memory (ROM), random access memory (RAM), flash memory devices, floppy and other removable disks, magnetic disk storage media, optical storage media (e.g., Compact Disk Read-Only Memory (CD ROMS), Digital Versatile Disks (DVDs), etc.), among others.
[00127] In general, a machine readable medium includes any mechanism that provides (e.g., stores) information in a form accessible by a machine (e.g., a computer, network device, personal digital assistant, manufacturing tool, any device with a set of one or more processors, etc.).
[00128] In various embodiments, hardwired circuitry may be used in combination with software instructions to implement the techniques. Thus, the techniques are neither limited to any specific combination of hardware circuitry and software nor to any particular source for the instructions executed by the data processing system.
[00129] Although some of the drawings illustrate a number of operations in a particular order, operations that are not order dependent may be reordered and other operations may be combined or broken out. While some reordering or other groupings are specifically mentioned, others will be apparent to those of ordinary skill in the art and so do not present an exhaustive list of alternatives. Moreover, it should be recognized that the stages could be implemented in hardware, firmware, software or any combination thereof.
[00130] In the foregoing specification, the disclosure has been described with reference to specific exemplary embodiments thereof. It will be evident that various modifications may be made thereto without departing from the broader spirit and scope as set forth in the following claims. The specification and drawings are, accordingly, to be regarded in an illustrative sense rather than a restrictive sense.
substantially as it occurs), a design is disclosed that utilizes one or more dual-wavelength short-pulse lasers or a plurality of single-wavelength short pulse lasers, a fiberoptic light delivery system, a hand-held imaging probe, other electronic hardware and processing software.
[0005] In an embodiment, an imaging system is disclosed for visualization of slices into the depth of tissue of at least a portion of a body. The system includes a processing subsystem that produces three independent images, including two functional images showing distribution of the total hemoglobin concentration and distribution of the blood oxygen saturation and one morphological image of tissue structures, the images being co-registered in time and space by utilizing one and the same hand-held imaging probe. The system may include a three-dimensional positioning system which provides the capability of assembling three-dimensional volumetric images of said body from two-dimensional slices made through the depth of tissue obtained by scanning a hand-held probe along the surface of at least a portion of the body.
[0006] In an embodiment, an imaging method provides coregistered functional and anatomical mapping of tissue of at least a portion of a body. Ultrasonic pulses are delivered into the tissue and backscattered ultrasonic signals reflected from various structural tissue boundaries associated with body morphology are detected. Two optical pulses having different spectral bands of electromagnetic energy are delivered, and transient ultrasonic signals resulting from selective absorption of different energy fractions from each of the two optical pulses by hemoglobin and oxyhemoglobin of blood containing tissues are detected.
Detected ultrasonic signals are processed to remove noise, to revert signal alterations in the course of signal propagation through tissue and through the detection system components, and to restore the temporal shape and ultrasonic spectrum of the original signals. Image reconstruction and processing is performed to generate morphological images of tissue structures coregistered and superimposed with partially transparent functional images of the total hemoglobin concentration and blood oxygen saturation. The above steps of the process are repeated with a video frame rate so that real-time images can display tissue functional and morphological changes substantially as they occur.
BRIEF DESCRIPTION OF THE DRAWINGS
[0007] The disclosed embodiments are illustrated by way of example and not limitation in the figures of the accompanying drawings in which like references indicate similar elements.
[0008] FIG. 1A illustrates an embodiment of an optoacoustic probe with illumination of tissue through skin by a scattered light beam formed in tissue by merging two optical beams.
[0009] FIG. 1B illustrates how laser illumination light and an acoustic signal from an optoacoustic probe can be scattered from the skin towards an acoustic lens of a probe.
[0010] FIGS. 2A and 2B illustrate optoacoustic signals showing the impact of lateral ultrasonic waves induced by laser pulses in skin using optical beams on each side of an ultrasound transducer array, and using detection by transducers tilted at large angle relative to the plane of images generated therefrom.
[0011] FIG. 3 illustrates an example of image artifacts associated with the edge effect of an optical illumination beam having abrupt changes of the optical fluence.
[0012] FIGS. 4A-4C illustrate embodiments wherein optical illumination of tissue is accomplished using a hand-held optoacoustic probe that delivers optical energy from either under the optoacoustic probe or on the side of the probe at different distances.
[0013] FIGS. 5A and 5B illustrate two embodiments of a hand-held optoacoustic ultrasonic probe protected from optical illumination of the acoustic lens.
[0014] FIG. 6 illustrates optoacoustic images using a probe with an acoustic lens that is not totally optically reflective and with a probe having an optically reflective layer of gold, which removes lens related image artifacts.
[0015] FIG. 7A illustrates an embodiment of an optical beam with sharp edges that may produce edge effects of acoustic waves and related artifacts, and an optical beam with smooth edges producing reduced edge-related artifacts.
[0016] FIGS. 7B and 7C illustrate designs of an output fiber bundle with multiple sub-bundles shaped to provide even illumination of the image plane and reduce edge-related optoacoustic artifacts.
[0017] FIG. 8 illustrates the effect of optical illumination for two probes where two fiber bundles on each side of the respective probes are oriented to illuminate skin directly under the probe.
[0018] FIG. 9A illustrates embodiments of ultrasonic probes having flat, concave or convex arc shapes.
[0019] FIG. 9B shows a hand-held optoacoustic probe having a concave arc shape.
[0020] FIG. 9C illustrates details of a hand-held optoacoustic probe having a concave arc shape.
[0021] FIG. 9D shows an optoacoustic image of three spherical objects and demonstrates that within the field of view of the arc spatial (and especially lateral) resolution is excellent even for a large object.
[0022] FIG. 9E illustrates an alternate embodiment of an optoacoustic/ultrasonic hand-held probe design.
[0023] FIGS. 10A-10C show examples of the impulse response of an ultrasonic transducer with a relatively narrow ultrasonic frequency band of sensitivity, the impulse response of an ultrawide-band ultrasonic transducer, and the ultrasonic spectra of transducer sensitivity as a function of frequency for ultrawide-band and narrow band resonant transducers.
[0024] FIGS. 11A-11B provide an illustrative example of the deconvolution of impulse response of transducers from the detected optoacoustic signals where deconvolution restores the original, unaltered, N-shaped pressure signals.
[0025] FIGS. 12A-12C provide an illustrative example of wavelet filtered N-shaped optoacoustic signals restored to their original rectangular pressure profile by summation of all scales corresponding to frequency ranges from low to high for five scales, seven scales and nine scales.
[0026] FIG. 13 provides an illustrative diagram of radial backprojection where each transducer element aperture is weighted and normalized for the total aperture of the transducer array.
[0027] FIGS. 14A and 14B provide an illustrative example of optoacoustic tomographic images of an imaging slice through tissue with a small artery, larger vein and a rectangular grid allowing estimation of system performance in visualization of microvessels.
[0028] FIGS. 15A and 15B provide an illustrative example of optoacoustic tomographic images of a point spread function as visualized with a flat linear probe using a backpropagation algorithm and an aperture normalized backprojection algorithm.
[0029] FIGS. 16A and 16B provide an illustrative example of optoacoustic images of a phantom with hairs embedded at different depths where the first image was created using an embodiment of a standard palette and the second image was created using an embodiment of a depth-normalized palette.
[0030] FIGS. 17A and 17B provide an illustrative example of optoacoustic images of a phantom of a spherical simulated tumor obtained with a flat linear probe.
[0031] FIG. 18 shows a diagram illustrating tumor differentiation based on absorption coefficients at two wavelengths, 757 nm and 1064 nm, which match the local maximum of hemoglobin absorption, as in totally hypoxic blood (757 nm), and the minimum of the ratio of absorption by hypoxic hemoglobin to absorption by oxyhemoglobin, as in normally oxygenated blood (1064 nm).
[0032] FIG. 19 illustrates tumor differentiation based on absorption coefficients at two wavelengths in a phantom simulating benign (box) and malignant (sphere) tumors.
[0033] FIG. 20A shows an optoacoustic image of two intersecting tubes filled with blood having different levels of blood [SO2].
[0034] FIG. 20B shows a photograph of an experimental setup that includes artificial blood vessels placed in milk solution and imaged using arc-shaped optoacoustic probe.
[0035] FIG. 20C shows coregistered 2D cross-sectional anatomical and functional images of blood vessel tubes showing six image panels with different anatomical and functional images.
[0036] FIGS. 21A and 21B show optoacoustic signal amplitude as a function of blood oxygen saturation (with constant hematocrit) under laser illumination at a wavelength of 1064 nm in FIG. 21A and at 757 nm in FIG. 21B. These plots illustrate that blood oxygen saturation can be monitored with optoacoustic imaging.
[0037] FIG. 22 illustrates optical absorption spectra of the main tissue chromophores absorbing optical energy in the near-infrared range: hemoglobin, oxyhemoglobin and water.
[0038] FIGS. 23A and 23B illustrate coregistered functional and anatomical imaging of breast tumors in phantoms accurately replicating optical and acoustic properties of an average breast with tumors.
[0039] FIGS. 24A and 24B illustrate coregistered functional and anatomical imaging of breast tumors.
DETAILED DESCRIPTION
[0040] The following description and drawings are illustrative and are not to be construed as limiting. Numerous specific details are described to provide a thorough understanding.
However, in certain instances, well-known or conventional details are not described in order to avoid obscuring the description. References to one embodiment or an embodiment in the present disclosure are not necessarily references to the same embodiment; such references mean at least one embodiment.
[0041] Reference in this specification to "an embodiment" or "the embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least an embodiment of the disclosure. The appearances of the phrase "in an embodiment" in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. Moreover, various features are described which may be exhibited by some embodiments and not by others. Similarly, various requirements are described which may be requirements for some embodiments but not other embodiments.
System Overview
[0042] In at least some embodiments, the present disclosure is directed to a dual-modality ultrasonic/optoacoustic system for medical diagnostics that uses a hand-held probe for scanning along the skin surface of an organ and provides two types of two-dimensional maps into the depth of tissue, anatomical (morphological) and functional (blood hemoglobin index and blood oxygenation index). In an embodiment, these two maps are spatially coregistered by using the same array of ultrasonic transducers and temporally coregistered by acquiring the two types of images in real time, faster than any physiological changes can occur in the tissue of diagnostic interest. The blood hemoglobin index represents blood hemoglobin concentration changes in the areas of diagnostic interest relative to the background blood concentration. The blood oxygenation index represents blood oxygenation changes in the areas of diagnostic interest relative to the background level of blood oxygenation. These coregistered maps can be used to noninvasively differentiate malignant tumors from benign lumps and cysts.
[0043] In an embodiment, the dual-modality ultrasonic/optoacoustic system of the present disclosure provides two-dimensional imaging of a body utilizing delivery of optical energy and acoustic detection of resulting transient pressure waves using interchangeable hand-held probes, one of which is flat and used to perform a translational scan through at least a flat portion of the body under examination, and the second of which is curved, being shaped as a concave arc, to perform a translational scan through at least a cylindrical or curved portion of the body under examination; both scans contribute to a more complete understanding of normal or pathological functions in the body.
[0044] In an embodiment, at least a portion of the body under examination contains molecules of blood constituents such as hemoglobin and oxyhemoglobin responsible for body functions, or receptors in cells responsible for cell functioning, water, lipids or other constituents.
[0045] In an embodiment, optical energy produced using at least one laser beam is used for body illumination with at least one wavelength of light. In an embodiment, the optical energy is pulsed, with the pulse duration shorter than the time of ultrasound propagation through the distance in the body equal to the desirable spatial resolution. In an embodiment, the optical energy is within the spectral range from 532 nm to 1064 nm. In an embodiment, the optical energy is replaced with other electromagnetic energy with a wavelength from 1 nm to 1 m.
[0046] In an embodiment, electronic signals produced by ultrasonic transducers are amplified using low noise wide band electronic amplifiers with high input impedance. In an embodiment, analog electronic signals are digitized by a multi-channel analog-to-digital converter and further processed utilizing a field programmable gate array. In an embodiment, the ultrasonic transducers are ultrawide-band transducers that detect ultrasonic signals with no or minimal reverberations. In an embodiment, the system is integrated with an ultrasound imaging system used to enhance visualization of acoustic boundaries in the body and parts of the body with different density and/or speed of sound.
[0047] In an embodiment, quantitative measurements of concentrations of target molecules, cells or tissues are made through characterization of optical energy propagation and absorption combined with processing of digital electronic signals by deconvolution of the hardware transfer function in order to obtain the intrinsic optoacoustic amplitude and profile of such signals and the distribution of the optical absorption coefficient in the body.
[0048] In an embodiment, an optoacoustic contrast agent is used to visualize a portion of the body of interest or characterize distribution of certain molecules, cells or tissues in the body.
[0049] In an embodiment, the system comprises at least a laser, a light delivery system, an optoacoustic probe, an electronic system, a computer and an image display.
Laser
[0050] In an embodiment, the laser is capable of emitting short, nanosecond pulses of near infrared light at two (or more) different toggling wavelengths, i.e., two different spectral bands. In an embodiment, one of the wavelengths is preferentially absorbed by hemoglobin of blood and the other is preferentially absorbed by oxyhemoglobin of blood.
In an embodiment, illumination of the organ under examination with the first laser pulse at one wavelength (spectral band) and detection of the first optoacoustic signal profile resulting from the first illumination, followed by illumination with the second laser pulse at the second wavelength band and detection of the second optoacoustic signal profile, can provide data that can be used for reconstruction of two coregistered tomographic images, which in turn can be used for generation of functional maps of the areas of diagnostic interest based on (i) the blood hemoglobin index and (ii) the blood oxygenation index.
Light Delivery System
[0051] In an embodiment, the light delivery system comprises bundles of optical fibers.
In an embodiment, the input of the optical fiber bundle is circular to match the incident laser beam, while the output of the fiber bundle is rectangular to match the size and shape of the ultrasonic transducer array. In an embodiment, each fiber has a small diameter (e.g., down to 50 micron) to provide excellent flexibility of the bundles. In an embodiment, the input tip of the fiber bundle is fused to shape the bundle into a hexagon and to eliminate spaces between the fibers in the bundle, thereby providing up to 20% better transmission of the laser energy.
In an embodiment, the output tip of the fiber bundle is fully randomized such that fibers that appear close to each other at the input will appear far from each other at the output or even in different branches of the bifurcated fiber bundle.
Optoacoustic Probe
[0052] The probe is designed to provide high contrast and resolution of both optoacoustic and ultrasonic images. In an embodiment, the probe is a hand-held probe with an array of ultrasonic/optoacoustic transducers, which can be designed to be single dimensional, 1.5 dimensional or two-dimensional. In an embodiment, the transducers detect acoustic waves within an ultra-wide band of ultrasonic frequencies and the ultra-wide band is shaped to match the spectrum of optoacoustic signals emitted by tissue of diagnostic interest. In an embodiment, the transducers are also designed to emit acoustic waves as short pulses of ultrasound with short ring-down time and minimal reverberations of gradually decreasing magnitude.
[0053] To achieve such a design, transducer material can be chosen from, for example, piezoelectric ceramics (such as PZT, PMN-PT, and PZNT), piezoelectric single crystals (such as PZT, PMN-PT, and PZNT), piezoelectric polymers (such as PVDF and PVDF copolymer), composite polymer-ceramic and polymer-crystal piezoelectric materials, and capacitive micromachined ultrasonic transducers (CMUT). In an embodiment, the thickness of the transducer elements, which determines the central frequency, and the materials of the backing layer and the front surface matching layer of the transducers are optimized.
[0054] In various embodiments, the shape of the ultrasound transducer array may be either flat or a concave arc. A flat design is suited to scanning the surface of an organ under examination that has a radius of curvature much larger than the size of the probe, such as a human body. A concave arc-shaped design provides the largest aperture for the optoacoustic signal detection with minimal physical dimensions. The large aperture, in turn, provides for improved lateral resolution within the angle of the field of view formed by lines connecting the arc's focal point with each edge transducer of the array. The arc-shaped probe is often the most effective for scanning body surfaces that are curved with a radius approximately matching that of the probe (such as the average sized breast, neck, arms and legs).
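A short geometric sketch of the arc probe's angular field of view, assuming it equals the arc angle subtended at the arc's focal point (center of curvature); the radius, element count and pitch are illustrative assumptions.

```python
import math

RADIUS_MM = 40.0      # assumed radius of curvature of the concave arc
N_ELEMENTS = 128      # assumed number of transducer elements
PITCH_MM = 0.5        # assumed element pitch measured along the arc

# Arc length spanned by the array and the angle it subtends at the focal point.
arc_length_mm = (N_ELEMENTS - 1) * PITCH_MM
fov_angle_deg = math.degrees(arc_length_mm / RADIUS_MM)
print(f"angular field of view ~ {fov_angle_deg:.1f} degrees")
```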
[0055] FIG. 1A
illustrates an embodiment of an optoacoustic probe that provides illumination of tissue (TS) through skin (SK) by the scattered light (SL) beam formed in tissue by merging two optical beams (OB) emerging from fiber bundles (FB), expanding, and passing through light diffusers (LD) and then through optical windows (OW). Acoustic waves (AW) generated in blood vessels or tumors (BV or TM) by the scattered light (SL) in tissue propagate through the acoustic lens (AL) to transducers (TR), where they are converted into electrical signals transmitted by electrical cables (EC) through the backing material (BM) to the electronic amplifiers.
[0056] In an embodiment, the design of the optical fiber bundle is as follows. The input of the fiber bundle is circular with fused fiber tips to avoid loss of light through spaces between the fibers. The fiber diameter may be approximately 200 microns for good flexibility, and a fiber diameter of 100 microns or even 50 microns may be desirable in a particular application. This fiber bundle is Y-split into two half-bundles and fully randomized, so that substantially any two neighboring fibers from the input appear in different half-bundles. At least a majority of the neighboring fibers should be randomized in this regard. Each half-bundle is preferably split into multiple sub-bundles, and each sub-bundle is placed in its slot/niche to form fiber bundle "paddles". The two paddles are placed on each side of the ultrasonic transducer (TR) array assembly. As is discussed below with reference to FIGS. 7B and 7C, the output shape of each fiber bundle paddle may be rectangular across the width of the field of view, typically 40 mm, with triangular shaped ends. Such a triangular shape allows the output beam to have smooth edges after passing through the light diffuser (LD), FIG. 1A. Finally, the optical beams from the fiber bundle paddles exit from the probe into the skin (SK) through optical windows (OW) that comprise thin anti-reflection-coated glass plates or anti-reflection-coated polymer or plastic plates with acoustic impedance matching that of tissues to be imaged.
[0057] There are a number of objectives for the present optoacoustic probe design: (i) substantially no light should propagate either through the acoustic lens (AL) or through the optical block acoustic damper (OBAD) on the sides of the probe; (ii) substantially no acoustic waves should be generated in the acoustic lens or the optical block acoustic damper materials through absorption of light; acoustic waves in a wide range of ultrasonic frequencies from 0.1 MHz to 15 MHz should be able to pass through the AL with no attenuation, and no acoustic waves should be able to pass through the OBAD; (iii) the optical beams (OB) exiting through the optical windows (OW) should have smooth edges of the optical fluence, and these optical beams should enter the skin as close to each other as necessary to merge due to optical scattering within the skin and enter underlying tissue under the array of transducers, providing maximum fluence in the image plane.
[0058] In an embodiment, the light delivery system directs light underneath the transducer elements, not through the array of transducer elements. In an embodiment, the design of the optoacoustic probe is based on an array of ultrasonic transducers with fiber optic delivery systems on each side of the ultrasonic array, positioned as close to the transducers as possible and with dimensions in the elevation axis of the transducer as small as possible, considering the need to focus ultrasonic beams to the depth of the most probable target. In an embodiment, the fiber optic delivery system is designed to allow penetration of the optical energy of the near infrared light into the organ being imaged, such as a breast, with minimum opto-thermo-mechanical interaction of the light beam with skin.
[0059] Another alternative design of the light delivery system delivers light to a mirror or prism(s) placed underneath the ultrasonic transducers in order to reflect the light orthogonally to the skin surface of an organ being imaged. In such embodiments, a standoff can be placed between the transducer elements and the skin/tissue. These alternative embodiments may be combined within the scope of the invention.
Detailed Description of Aspects of System Components
Optical Illumination and Probe Design
[0060] An acoustic lens is typically placed on transducers within an optoacoustic probe for purposes of focusing ultrasonic beams. While a probe could be provided without an acoustic lens, if there is no lens then ultrasonic transducers may be directly exposed to light and absorb such light, which can result in very large artifact ultrasonic signals, especially where such light is pulsed. Optical illumination of the lens on an ultrasonic probe causes very strong transient acoustic waves that result in image artifacts. Up to 50%
of near infrared light can be diffusely scattered by skin, depending on skin color. Mismatch of acoustic impedance between the lens and the transducer elements can cause reverberations with a long ring-down time. Therefore, an embodiment of a probe design includes a white, strongly scattering, opaque lens. If such a lens is not needed due to the curved shape of each transducer element, then a white strongly scattering front matching layer should be employed to protect transducer elements from near-infrared light.
[0061] FIG. 1B illustrates how laser illumination light 110 and 120 from an optoacoustic probe can be scattered 130 from the skin 140 towards an acoustic lens 150 of a probe.
[0062] Furthermore, (laser) optical pulses can have a direct impact on ultrasonic transducers through the acoustic waves induced by strong interaction of the optical pulses with skin of the organ being imaged, which traverse laterally along the skin surface in a direction orthogonal to the image plane. When detected by the array of transducers, spatial distributions of these acoustic waves are projected onto the optoacoustic image at a depth equal to the lateral distance between the array of transducers and the optical beams on the skin surface, creating artifacts. Furthermore, acoustic waves generated in skin through reverberation of the acoustic lens and the housing of the probe can further affect the quality of imaging.
[0063] FIGS. 2A and 2B illustrate exemplary optoacoustic signals showing the impact of lateral ultrasonic waves induced by laser pulses in skin using optical beams on each side of an ultrasound transducer array. The signals shown are generated by transducers in the direction almost orthogonal to the plane of images generated therefrom. Such transducers may receive signals at a large oblique angle (up to 90 deg) relative to the plane of images generated therefrom, which is undesirable. Therefore, the design of the transducer array includes means to reject signals coming from outside the image plane. Such means include, but are not limited to, a concave arc shape of the transducer elements and acoustic lens, and delivery of the optical beam underneath the transducers. The detected optoacoustic signals 210 in FIG.
2A were generated using an effective acoustic coupling agent, in this case water. The signals 220 in FIG. 2B were generated in the absence of such an acoustic coupling agent, i.e., using only an air space to couple the acoustic signals to the transducer array.
[0064] Furthermore, the finite dimensions of the optical beam can affect the acoustic waves generated in response to impingement of the optical beam on tissue. Such acoustic waves can be generated at the sharp edges of the optical beam, propagate towards the array of transducers and result in artifacts. Since a system that utilizes a flat linear array of ultrasonic transducers is configured such that the first and the last transducer in the array detect these waves first and the central transducers detect these waves last, this "edge effect" results in v-shaped artifacts on a sinogram of optoacoustic signals and low frequency bulk artifacts on optoacoustic images.
[0065] FIG. 3 illustrates an example of manifestation of v-shaped artifacts 310 on a sinogram 300 of optoacoustic signals and associated artifacts 320 on optoacoustic images.
Since these acoustic waves are associated with the edge effect of the optical illumination beams having abrupt changes of the optical fluence, in an embodiment, one can see V-shaped bright signals on the sinogram and an associated series of artifact waves on the optoacoustic image.
[0066] Furthermore, the illumination geometry of optical beams projected by an optoacoustic probe can affect image quality. Where the optical beams of an optoacoustic probe are placed too far apart, this can result in a gradual transition from dark field illumination (two separate optical beams on each side of the probe, resulting in the absence of direct light under the probe in the image plane) into bright field illumination (one beam under the probe going into the depth of tissue along the image plane). This transition creates a problem in the image brightness map, making the map not quantitatively accurate, and causes artifacts at the depth equal to the initial width between the separate optical illumination beams on each side of the probe.
[0067] FIGS. 4A-4C illustrate an embodiment wherein optical illumination of tissue is accomplished using a hand-held optoacoustic probe 410, 420, 430 that delivers optical energy from either under the optoacoustic probe or on the side of the probe at different distances. In the embodiment of FIG. 4A, when the optical beams are delivered under the ultrasonic probe, the distribution of the optical energy in the image plane has a smooth gradient with a maximum at the skin surface. This optical distribution is beneficial for high contrast of optoacoustic images. In the embodiment of FIG. 4B, when the optical beams are delivered close to a thin optoacoustic probe, the two beams can merge due to optical scattering within the skin, so that the distribution of the optical energy in tissue under the skin can be made similar to the embodiment of FIG. 4A. In the embodiment of FIG. 4C, when the optical beams are separated by a large distance, they merge only at significant depth within tissue, creating the optical distribution in the image plane with a dark zone (no light) in a subsurface layer of the tissue and a bright zone in the depth of the tissue, which is detrimental to the contrast of optoacoustic images, especially considering projection of brightly illuminated areas of skin onto the optoacoustic image plane at a depth equal to the separation distance of the two beams.
[0068] Thus, in the embodiments illustrated in FIGS. 4A-4C, the image brightness map 412, 422 and 432 of the tissue being scanned is optimized where the illumination of the skin is directly under the probe 410. As the distance between the center of the transducers and the center of the optical beams increases, as shown at 420 and 430, the image brightness map 422 and 432 of the tissue being scanned becomes progressively more uneven.
[0069] Lastly, laser-induced ultrasound waves generated in skin and launched into the tissue can reflect from boundaries of tissue structures (such as tumors, vessels or tissue layers), which can also lead to image artifacts represented by lines, curves and overall noise.
[0070] In an embodiment, the acoustic lens of the probe is designed such that the lens reflects and scatters, but does not absorb, light from the illumination components, yet it is optically opaque. In various embodiments, such a lens can be made either using strongly optically scattering material such as silicone rubber filled with titanium dioxide or barium sulfate powder, or using a thin metallic highly reflective layer such as aluminum or gold, or a combination of a white opaque lens material and a metal layer. In an embodiment, to avoid peel-off of the thin metallic layer from the front surface of the acoustic lens, in the case of a combination of diffusely scattering lens material and a thin reflective layer (foil), the metallic reflective layer can be placed between the two layers of diffusely scattering material.
As it is difficult to make a material with absolutely zero optical absorption, and such absorption may generate ultrasound in thermoelastic materials, the lens material can be made from thermoplastic materials having minimal thermal expansion, which produces minimal or no ultrasound in response to the absorbed optical energy.
[0071] FIGS. 5A and 5B illustrate two embodiments, respectively, of hand-held optoacoustic ultrasonic probes 510 and 520 that are protected from optical illumination of the acoustic lens. In FIG. 5A, a totally reflective opaque white lens is utilized, and in FIG. 5B a partially reflective white lens is utilized, with light reflection capability of the lens enhanced by a gold layer or coating.
[0072] FIG. 6 illustrates optoacoustic images using a probe with a non-reflective acoustic lens 610 and a probe with reflective layer of gold 620. The probe utilizing a reflective layer of gold 620 produces an image with reduced artifacts 612 and 614.
[0073] In an embodiment, the probe housing serves as hypo-echoic encapsulation of the probe, which means that the probe housing is made from materials that (i) do not absorb laser light (more specifically near-infrared light), but if a small absorption is unavoidable, the materials having low thermal expansion do not emit ultrasound after absorption of the laser light, (ii) strongly attenuate and dampen ultrasonic waves and do not reverberate. The transducer assembly inside the probe housing is also made of the hypo-echoic material.
Alternatively, a layer of said hypo-echoic material is placed between the transducer assembly and the fiberoptic assembly to avoid generation of any ultrasound upon interaction of light with transducer assembly. In various embodiments, such materials can be chosen, for example, from white color porous and anechoic heterogeneous composites for baffles, foams, polymers, rubbers and plastics (such as CR-600 casting resin available from Micro-Mark of Berkeley Heights, NJ, or AM-37 available from Syntech Materials of Springfield, VA), and others. In an embodiment, any such materials are electrically non-conducting insulators to, inter alia, protect the probe from external electromagnetic emissions.
[0074] In an embodiment, the optical illumination subsystem is configured to deliver optical beams with smooth intensity edges. In an embodiment, the width of the optical beams is equal to that of the array of ultrasonic transducers within the optoacoustic probe (for example, about 5 mm). This is achieved by designing the bundle of optical fibers to have a gradually decreasing density of fibers at the edges. This design enables one to deliver laser illumination to the skin of the organ under examination so that the beam does not generate sharp edge-related acoustic waves, and such laser-induced acoustic waves do not produce V-shaped artifacts in the sinogram of optoacoustic images.
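The effect of tapered fiber density can be illustrated numerically. The following sketch is not part of the disclosed embodiments; it simply assumes an error-function roll-off for the tapered beam and compares the peak spatial fluence gradient of a hard-edged and a smooth-edged beam of the same width, since that gradient is what launches the edge-related acoustic waves.

```python
import numpy as np
from scipy.special import erf

def beam_profile(x_mm, half_width_mm, edge_mm):
    """Fluence profile across the beam (arbitrary units).

    edge_mm == 0 gives a hard-edged (top-hat) beam; a finite edge_mm rolls the
    fluence off smoothly, mimicking a bundle whose fiber density tapers at the edges.
    """
    if edge_mm == 0:
        return (np.abs(x_mm) <= half_width_mm).astype(float)
    return 0.5 * (erf((x_mm + half_width_mm) / edge_mm)
                  - erf((x_mm - half_width_mm) / edge_mm))

x = np.linspace(-10, 10, 2001)        # mm across the beam
sharp = beam_profile(x, 2.5, 0.0)     # ~5 mm hard-edged beam
smooth = beam_profile(x, 2.5, 1.0)    # same nominal width, tapered edges

# The tapered beam has a far smaller peak fluence gradient, and therefore
# launches much weaker edge waves.
print(np.abs(np.gradient(sharp, x)).max(), np.abs(np.gradient(smooth, x)).max())
```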
[0075] FIG. 7A illustrates an embodiment of an optical beam with sharp edges 710 that may produce edge effects of acoustic waves and related artifacts, and an optical beam 720 with smooth edges producing reduced edge-related artifacts. FIGS. 7B and 7C
illustrate that an optical beam with smooth edges of fluence can be produced using a fiberoptic bundle design having multiple sub-bundles and a triangular shape at each end of the fiber bundle assembly.
[0076] In an embodiment, the fiber bundle is positioned at a distance from skin that is sufficient for the optical beam to expand to a desirable width. Where the dimensions of the probe are compact, the fibers used in the fiber bundle can be selected to have a higher numerical aperture (e.g., > 0.22). In an embodiment, in order to achieve better coupling of the optical beam into the skin, the beam is delivered through an optical window. In such embodiment, the optical window touches the skin, making its surface flat for better light penetration, simultaneously removing any excess of coupling gel from the skin surface being imaged. In an embodiment, the fiber bundle and the optical window are incorporated into the probe housing, so that the air gap between the fiber bundle and the window is protected.
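As a rough geometric illustration (not taken from the disclosure), the standoff needed for a bundle output to expand to a desired beam width can be estimated from the fiber numerical aperture. The bundle width, target width and NA values below are assumptions chosen only to show the calculation.

```python
import math

def standoff_for_beam_width(bundle_width_mm, target_width_mm, numerical_aperture, n_medium=1.0):
    """Approximate standoff for a fiber-bundle output to expand from its own
    width to a desired beam width, assuming the beam grows linearly at the
    divergence half-angle theta = asin(NA / n_medium)."""
    theta = math.asin(numerical_aperture / n_medium)
    return (target_width_mm - bundle_width_mm) / (2.0 * math.tan(theta))

# Hypothetical 1.5 mm wide sub-bundle expanding to a ~5 mm beam:
print(standoff_for_beam_width(1.5, 5.0, 0.22))   # roughly 7.8 mm of air gap
print(standoff_for_beam_width(1.5, 5.0, 0.39))   # higher-NA fibers need roughly 4.1 mm
```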
[0077] In an embodiment, the optical window is designed to allow minimal interactions of both the optical beam and the laser-induced acoustic waves with such window. In an embodiment, the window is very thin and made of optically transparent material with antireflection (AR) optical coating. In an embodiment, such material has anechoic acoustic properties. These anechoic acoustic properties, and the fact that the illuminated skin is depressed during optoacoustic scanning, result in dampening of ultrasonic waves laterally propagating from the laser-illuminated skin surface to the transducer array, thereby reducing associated artifacts.
[0078] In an embodiment, the probe is designed such that the optical beams are very close to the transducer elements on each side of the ultrasonic probe, which is made as thin as technologically possible. In an embodiment, the thickness of the probe is so small (e.g., 5 mm) that the light beams delivered to skin at this distance, d, from the probe center will merge into one beam within the thickness of the skin (about z=5 mm), and the tissue of the organ under examination receives one beam underneath the transducer elements.
This is called bright field illumination. In an embodiment, the optoacoustic probe is designed such that the optical light beam is delivered to the skin directly underneath the transducer elements.
[0079] FIG. 8 illustrates the effect of optical illumination for two probes 810 and 820 where two fiber bundles on each side of the respective probes are oriented to illuminate skin directly under the probe 812 and on either side of the probe 822. Where the skin is illuminated directly under the probe 812, a tumor 814 is clearly discernable and there is no clutter on the image background 816. Where the skin is illuminated on either side of the probe 822, the tumor is not discernable 824 and there are numerous artifacts on the image background 826.
[0080] In an embodiment, the optical beam width is designed to deliver increased light into the slice of tissue being imaged. In an embodiment, the beam is homogeneous, such that it has a constant fluence across the beam, since a heterogeneous beam generates acoustic sources at its heterogeneities, which in turn produce artifacts in optoacoustic images. The fluence level is defined by the ANSI laser safety standards for the laser illumination of skin.
The beam width is limited by the capability of the optical scattering in tissue to deliver photons of light into the central slice located underneath the transducer elements (the slice being imaged). In an embodiment, the length of the optical beam is equal to the length of the transducer array. In an embodiment, the optical beam also has smooth edges, that is to say, gradually reduced fluence at the edges, since sharp edges produce strong edge artifacts on optoacoustic images.
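A fluence check of the kind implied here can be sketched as follows; the pulse energy, beam dimensions and exposure limit are placeholders, and the actual maximum permissible exposure must be taken from the applicable ANSI laser safety standard for the wavelength and pulse duration used.

```python
def surface_fluence_mj_per_cm2(pulse_energy_mj, beam_length_cm, beam_width_cm):
    """Average fluence of a rectangular optical beam at the skin surface."""
    return pulse_energy_mj / (beam_length_cm * beam_width_cm)

MPE_MJ_PER_CM2 = 20.0   # placeholder exposure limit, not an ANSI value

# Hypothetical 40 mm x 5 mm illumination strip and a 50 mJ pulse:
fluence = surface_fluence_mj_per_cm2(50.0, beam_length_cm=4.0, beam_width_cm=0.5)
print(fluence, fluence <= MPE_MJ_PER_CM2)   # 25.0 mJ/cm^2 exceeds this placeholder limit
```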
[0081] In an embodiment, design features of the optical illumination system and optoacoustic probe of the present disclosure can be summarized in the following Table:
Table 1. Summary of optical illumination and probe design.
System Feature | Advantages
---|---
Arc hand-held probe | Higher aperture - lower distortions
Light delivery into the imaging plane | Improves optoacoustic image contrast and decreases artifacts by increasing the ratio of useful information (from the imaging plane) to noise (outside of the imaging plane)
Optical shielding of the probe | Reduces artifacts from direct and scattered light striking the acoustic lens, probe housing, etc.
Acoustic shielding of the probe | Acoustic shielding of the probe's housing reduces artifacts (clutter) from acoustic waves propagating through the probe's housing
Using ultrawide band transducers for both ultrasound and optoacoustic imaging | Allows the same array to work in ultrasonic and optoacoustic imaging

[0082] In various embodiments, the shape of the ultrasonic transducer array for the combined optoacoustic/ultrasonic imaging can be either flat or convex arc-shaped. In an embodiment, the probe shape for optoacoustic imaging is concave arc-shaped.
Such a concave shape provides a large aperture with minimal physical dimensions and a wider field of view of the object being imaged, which in turn provides improved lateral resolution and better reconstruction of the shape of the object being imaged.
[0083] FIGS. 9A-9C illustrate embodiments of optoacoustic/ultrasonic hand-held probes having flat or concave arc shapes 910 (FIG. 9A) and a hand-held transrectal probe with a linear shape 920 (FIG. 9B). FIG. 9C illustrates details of the optoacoustic/ultrasonic hand-held probe design with its face showing the ultrasonic transducer assembly, two layers of hypo-echoic light reflecting and ultrasound damping material on each side, and two optical windows for delivery of the optical beam.
[0084] FIG. 9C illustrates details of a hand-held optoacoustic probe having a concave arc shape. Electrical cables 930 are provided for bi-directional communication to and from the probe, and fiberoptic bundles 940 are provided for delivering light to the probe. An array of wide-band ultrasonic transducers 950 sends and receives acoustic energy. The transducer array 950 is covered by an opaque white cylindrical lens (not shown for clarity purposes) that extensively scatters and reflects near-infrared light. Optical windows 960 provide optical beam outputs. In the embodiment of FIG. 9C, the ultrasonic transducers within the probe may be designed so as not to be sensitive to lateral acoustic (ultrasonic) waves and to be free of reverberations, especially in the lateral direction. This can be achieved by the selection of the piezoelectric composite material, the shape of piezoceramic elements in the matrix and the anechoic properties of the matrix. In an embodiment, the ultrasonic transducers are also designed to possess high sensitivity within an ultrawide band of ultrasonic frequencies. This in turn results in minimal reverberations that cause artifacts on optoacoustic/ultrasonic images.
[0085] FIG. 9D shows an optoacoustic image that illustrates advantages of the concave-arc shaped hand-held probe in terms of resolution in optoacoustic images. As presented in this embodiment, the shape and sharp edges of a large sphere are well depicted in cases where the object is within the field of view of the probe aperture. Outside the probe aperture, resolution and accuracy of shape reproduction decrease; however, they remain better than those of flat linear probes of similar width.
[0086] FIG. 9E illustrates an alternate embodiment of an optoacoustic/ultrasonic hand-held probe design that is capable of two-dimensional imaging within the plane going parallel to the skin surface at various selected depths, and three-dimensional images as well. [0087] In an embodiment, a hand-held probe that is scanned along the skin surface producing real-time two-dimensional images of tissues in the body under examination also has a component serving for accurate global 3D positioning of the probe. This design allows the imaging system to remember positions of all tissue slices and to reconstruct three-dimensional images at the end of the scanning procedure.
Electronic Data Acquisition System
[0088] In an embodiment, the present disclosure is directed to an optoacoustic imaging system having an electronic data acquisition system that operates in both optoacoustic and ultrasonic modes and can rapidly switch between such modes. In an embodiment, this is achieved with firmware that controls functions of a Field Programmable Gate Array (FPGA), the main microprocessor on the electronic data acquisition system. In an embodiment, a reprogrammable FPGA can toggle between optoacoustic and ultrasound operation modes in real-time, thus enabling co-registration of ultrasound and optoacoustic images, which can be used for diagnostic imaging based on functional and anatomical maps. In an embodiment, FPGA functions include controlling, acquiring, and storing optoacoustic and/or ultrasound data, signal processing and transferring data for real-time image reconstruction and processing. In an embodiment, the FPGA may also be employed in ultrasound beam forming and image reconstruction.
[0089] In an embodiment, the electronic data acquisition system design utilizes one or more multi-core graphics processing units (GPUs) for image reconstruction and processing.
In the ultrasound mode, in an embodiment, the FPGA controls the ultrasound transmission and it performs both ultrasound and optoacoustic data acquisitions on a multi-channel board.
In order to enhance operation of the memory of the FPGA, an external memory buffer can be used. In an embodiment, the FPGA allows rapid reprogramming from ultrasonic data acquisition, with a signal/frame repetition rate of about 2 to 20 kHz, to optoacoustic data acquisition, with a signal/frame repetition rate of about 10-20 Hz, and also configures the structure of gates and the internal memory structure and size to allow real-time switching between ultrasound emission and detection, laser synchronization, and system controls. In an embodiment, multiple FPGAs can be used to enhance the system performance. In an embodiment, in the ultrasound and optoacoustic modes, the FPGA clock can be changed by appropriate time-division multiplexing (TDM). In an embodiment, the design of a multi-channel electronic data acquisition system can be based on modules, with a module typically having from 16 to 128 channels, although 256 channels or more may be appropriate in some applications. In an embodiment, the design of a multi-channel electronic data acquisition system has 64 channels.
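The firmware itself is not reproduced here, but the interleaving implied by these very different repetition rates can be sketched at a high level; the rates and the instantaneous-switching assumption below are illustrative only.

```python
def acquisition_schedule(n_oa_frames, us_frame_rate_hz=2000, oa_frame_rate_hz=10):
    """Yield a simple mode schedule: one laser-triggered optoacoustic frame,
    then a burst of conventional ultrasound frames until the next laser pulse.
    Assumes mode switching is instantaneous."""
    us_frames_per_oa = int(us_frame_rate_hz // oa_frame_rate_hz)
    for _ in range(n_oa_frames):
        yield "OA"                       # receive-only frame on the laser trigger
        for _ in range(us_frames_per_oa):
            yield "US"                   # transmit/receive ultrasound frame

schedule = list(acquisition_schedule(2))
print(schedule.count("OA"), schedule.count("US"))   # 2 optoacoustic frames, 400 ultrasound frames
```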
[0090] In order to achieve dual modality operation of the optoacoustic/ultrasonic system, a separate optoacoustic electronic system could also be combined with a separate ultrasonic electronic system through a single probe. In an embodiment, the probe has a cable with a Y-split to connect the probe to the optoacoustic and ultrasonic electronic systems. In an embodiment, a programmable electronic switch allows one to send the detected signal from the probe (transducer array) either to the optoacoustic electronics (to operate in optoacoustic mode) or to the ultrasonic electronics and from the ultrasonic electronics to the probe (to operate in ultrasound mode). In an embodiment, a synchronization trigger signal is sent to the optoacoustic and ultrasonic systems sequentially, so that the optoacoustic and ultrasonic images are acquired one after the other.
Processing, Reconstruction and Display of Images
Signal processing
[0091] In various embodiments, a goal of the diagnostic imaging procedure is to display each pixel with a brightness that correctly replicates the originally generated signals in each voxel of tissue displayed on the image. On the other hand, intrinsic pressure signals generated by optical pulses within tissues may be significantly altered in the course of propagation through tissue and especially in the course of detection and recording by the ultrasonic transducers and the electronics subsystem.
[0092] In an embodiment, detected signals are processed to reverse alterations and restore original signals. In an embodiment, such reversal can be achieved through deconvolution of the impulse response (IR) of the system. In an embodiment, the impulse response can be measured by recording and digitizing a delta-function ultrasonic signal generated by short (nanosecond) laser pulses in a strongly absorbing optical medium with high thermoelastic expansion coefficient.
[0093] One component of the impulse response is the acousto-electrical impulse response, which provides for the optoacoustic or ultrasonic signal distortions due to the properties of the ultrasonic transducers, cables and analog electronics. A
second part of the impulse response is the spatial impulse response that provides for the signal distortions associated with finite dimensions of the ultrasonic transducers. In various embodiments, large transducers can integrate ultrasonic waves incident at an angle, whereas point-source-like transducers can provide perfect or near perfect delta-function spatial impulse response.
[0094] In an embodiment, any distortions in acousto-electrical impulse response can be reversed by the impulse response deconvolution from the detected signals.
However, possible distortions in the spatial impulse response can be avoided by designing transducers with small dimensions within the image plane. In an embodiment, the dimensions of the transducers are much smaller than the shortest wavelength of the ultrasound that may be detected or emitted by the transducers.
[0095] FIGS. 10A-10C show examples of the impulse response of an ultrasonic transducer with a relatively narrow band of sensitivity 1010, the impulse response of an ultrawide-band ultrasonic transducer 1020 and ultrasonic spectra of the transducer sensitivity as a function of frequency for ultrawide-band and narrow band resonant transducers 1030.
[0096] In an embodiment, the first step in processing an optoacoustic signal in an imaging system that produces two-dimensional optoacoustic images is deconvolution of the acousto-electrical impulse response.
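A minimal frequency-domain sketch of this deconvolution step is shown below; the Wiener-style regularization and the synthetic impulse response are assumptions for illustration, not the system's actual processing chain.

```python
import numpy as np

def deconvolve_impulse_response(detected, impulse_response, reg=1e-2):
    """Wiener-style frequency-domain deconvolution of the measured
    acousto-electrical impulse response from a detected signal; `reg` scales
    the regularization relative to the peak of |H|^2 to avoid dividing by
    near-zero spectral components."""
    n = len(detected)
    s = np.fft.rfft(detected, n)
    h = np.fft.rfft(impulse_response, n)
    h2 = np.abs(h) ** 2
    restored = np.conj(h) * s / (h2 + reg * h2.max())
    return np.fft.irfft(restored, n)

# Toy check with a synthetic N-shaped pressure signal and a hypothetical
# band-limited impulse response centered in the record.
t = np.linspace(-1.0, 1.0, 512)
n_shape = np.where(np.abs(t) < 0.2, -t, 0.0)
ir = np.exp(-(t / 0.05) ** 2) * np.cos(40.0 * t)
detected = np.convolve(n_shape, ir, mode="same")
# ifftshift aligns the centered impulse response with the circular FFT model.
restored = deconvolve_impulse_response(detected, np.fft.ifftshift(ir))
```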
[0097] FIGS. 11A and 11B provide an illustrative example of the deconvolution of impulse response of transducers from the detected optoacoustic signals 1110, where deconvolution restores the original, unaltered, N-shaped pressure signals 1120.
[0098] In an embodiment, the second step in processing an optoacoustic signal is signal filtering to remove noise using a signal filter. In an embodiment, the signal filter is based on a wavelet transform that operates simultaneously in the frequency and time domain. In an embodiment, such a wavelet filter is capable of filtering certain frequency components of the signal that belong to noise and appear at a given time, while preserving similar frequency components of the useful signal that appear at a different time. In an embodiment, the frequency spectrum of a wavelet filter replicates the frequency band of a typical N-shaped optoacoustic signal while simultaneously providing smooth window edges which do not cause signal distortions upon convolution.
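A sketch of such a filter, assuming the PyWavelets library, a 'sym6' wavelet and soft thresholding (none of which are specified by the disclosure), might look as follows.

```python
import numpy as np
import pywt

def wavelet_denoise(signal, wavelet="sym6", level=5, k=3.0):
    """Denoise a 1-D optoacoustic signal with a multi-level wavelet transform.
    Detail coefficients are soft-thresholded band by band, so noise at a given
    frequency and time can be suppressed while similar frequency components of
    the useful signal at other times are preserved."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    cleaned = [coeffs[0]]                               # keep the approximation band
    for detail in coeffs[1:]:
        sigma = np.median(np.abs(detail)) / 0.6745      # robust per-band noise estimate
        cleaned.append(pywt.threshold(detail, k * sigma, mode="soft"))
    return pywt.waverec(cleaned, wavelet)[: len(signal)]
```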
[0099] In an embodiment, such a wavelet filter is useful in optoacoustic imaging in its capability to restore the original pressure profile generated in tissue prior to pressure propagation. In the course of propagation through tissue, the originally positive pressure signal converts into a bipolar (compression/tension) profile. Therefore, reconstruction of an image of absorbed optical energy (optoacoustic image) requires a transformation that starts with bipolar signals and provides for all-positive values of the optoacoustic image intensities.
In an embodiment, a multi-scale wavelet filter, for example, a filter that simultaneously integrates the signal over time and provides summation of a number of frequency bands present in the signal, can convert bipolar pressure signals into monopolar signals representing
[00100] FIGS. 12A-12C provide an illustrative example of wavelet filtered N-shaped optoacoustic signals restored to their original rectangular pressure profile by summation of all scales corresponding to frequency ranges from low to high for five scales 1210, seven scales 1220 and nine scales 1230.
[00101] In various embodiments, wavelet filtering permits enhancement of objects on the image within a certain range of dimensions. An imaging operator (ultrasonic technician or diagnostic radiologist) typically desires to better visualize a tumor with specific dimensions and other objects, such as blood vessels, with their specific dimensions. In an embodiment, the wavelet filter allows the operator to apply a specific selection of wavelet filter scales that enhances only objects of certain sizes and suppresses objects of other, unimportant sizes. In an embodiment, boundaries can be well visualized for objects of any size, so the high-frequency wavelet scales are beneficial for the image quality and are included in the selection of scales. In an embodiment, for a mathematically correct tomographic reconstruction, a ramp filter can be applied to the signal, which linearly enhances the contribution of higher frequencies.
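A ramp filter of the kind mentioned above can be sketched in the frequency domain; the cosine taper near Nyquist is an assumption added here to limit high-frequency noise gain.

```python
import numpy as np

def ramp_filter(signal, dt, cutoff_fraction=0.8):
    """Apply a frequency-domain ramp filter |f| to a 1-D signal, with a cosine
    taper above `cutoff_fraction` of Nyquist to limit noise amplification."""
    n = len(signal)
    freqs = np.fft.rfftfreq(n, d=dt)
    nyquist = freqs[-1]
    ramp = np.abs(freqs)
    taper = np.where(
        freqs > cutoff_fraction * nyquist,
        0.5 * (1 + np.cos(np.pi * (freqs - cutoff_fraction * nyquist)
                          / ((1 - cutoff_fraction) * nyquist))),
        1.0)
    return np.fft.irfft(np.fft.rfft(signal) * ramp * taper, n)
```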
Image Reconstruction
[00102] In various embodiments, image reconstruction typically uses radial back-projection of processed and filtered signals onto the image plane. However, due to the limited field of view available from small hand-held probes, only an incomplete data set can be obtained. As a result, the 2D optoacoustic images may include artifacts distorting the shape and brightness of the objects displayed on the images. In an embodiment, aperture integrated normalized radial back projection is used to correct some of the reconstruction artifacts that are observed in limited aperture optoacoustic tomography.
[00103] FIG. 13 provides an illustrative diagram of radial backprojection where each transducer element aperture is weighted and normalized for the total aperture of the transducer array.
[00104] In an embodiment, T_k ... T_{k+4}, 1311-1315, are the transducers 1310 in the array, I_{i,j} is the brightness (intensity) of a pixel with coordinates (i,j), ω_{i,j,k}, 1320 and 1330, is the angular portion of the optoacoustic wave front emitted by the pixel (i,j) as it is visualized by transducer #k, Ω_{i,j} = Σ_k ω_{i,j,k} (the sum of all ω_{i,j,k}) is the portion of the optoacoustic wave front emitted by pixel (i,j) as it is visualized by the entire transducer array, and S_{i,j,k} is the sample of the optoacoustic signal measured by the k-th transducer and used in reconstruction of the brightness in the pixel (i,j). Various backpropagation algorithms can be used to normalize an optoacoustic image.
[00105] In an embodiment, a backpropagation algorithm can be expressed as:

I_{i,j} = Σ_k S_{i,j,k}    (1)

[00106] However, in at least some embodiments, aperture normalized backprojection produces superior image results. In an embodiment, the aperture normalized backprojection can be expressed as:

I_{i,j} = (1/Ω_{i,j}) Σ_k ω_{i,j,k} S_{i,j,k}    (2)

[00107] FIGS. 14A and 14B provide an illustrative example of optoacoustic tomographic images 1410 and 1420 of an imaging slice through a tumor angiogenesis model. In the first image 1410, a backpropagation algorithm, such as the first algorithm (1) immediately above, is used to normalize the image. The resulting image has strong, bright arc-shaped artifacts 1412 around the blood vessels 1414 that are close to the array surface. In the second image 1420, an aperture normalized backprojection algorithm, such as the second algorithm (2) immediately above, is used to normalize the image. As can be seen, the aperture normalized backprojection algorithm corrects image brightness and reduces the arc-shaped artifacts.
[00108] FIGS. 15A and 15B provide an illustrative example of optoacoustic tomographic images 1510 and 1520 of a point spread function as visualized with flat linear probe using 1510 a backpropagation algorithm, such as the first algorithm immediately above, and 1520 an aperture normalized backprojection algorithm, such as the second algorithm immediately above. As can be seen, the aperture normalized backprojection algorithm corrects image brightness and reduces artifacts.
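The two reconstruction variants can be sketched as follows. The inverse-distance weight used here is only a crude stand-in for the angular aperture terms ω and Ω of equations (1) and (2), and the function is an illustrative sketch rather than the system's reconstruction code.

```python
import numpy as np

def backproject(signals, fs, speed_of_sound, element_xy, grid_x, grid_z,
                aperture_normalized=True):
    """Radial backprojection of per-channel optoacoustic signals onto an image
    grid, optionally normalized by the summed per-pixel weights.

    signals        : (n_elements, n_samples) array of processed signals
    element_xy     : (n_elements, 2) transducer coordinates in meters
    grid_x, grid_z : 1-D arrays of pixel coordinates in meters
    """
    n_elements, n_samples = signals.shape
    xx, zz = np.meshgrid(grid_x, grid_z)               # image plane
    image = np.zeros_like(xx)
    weight_sum = np.zeros_like(xx)
    for k in range(n_elements):
        ex, ez = element_xy[k]
        r = np.hypot(xx - ex, zz - ez)                 # pixel-to-element distance
        idx = np.clip(np.round(r / speed_of_sound * fs).astype(int), 0, n_samples - 1)
        sample = signals[k, idx]                       # time-of-flight sample per pixel
        # Crude weight standing in for the angular wavefront portion seen by element k.
        w = 1.0 / np.maximum(r, 1e-4)
        image += w * sample if aperture_normalized else sample
        weight_sum += w
    if aperture_normalized:
        image /= np.maximum(weight_sum, 1e-12)
    return image
```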
Image Processing and Display
[00109] In an embodiment, the optoacoustic image palette is equalized to diminish effects of light distribution within tissue. Such equalization transforms the dynamic range of optoacoustic images for better visualization of both shallow and deep objects.
[00110] FIGS. 16A and 16B provide an illustrative example of optoacoustic images 1610 and 1620 of a phantom with hairs embedded at different depths, where the first image 1610 was created using an embodiment of a standard palette and the second image 1620 was created using an embodiment of a depth-normalized palette. As can be seen, utilizing the depth-normalized palette enhances visibility of deep objects in the illustrated embodiment.
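A simple form of such palette equalization is a depth-dependent gain that compensates the roughly exponential decay of fluence with depth; the effective attenuation coefficient below is a placeholder that in practice depends on tissue type and wavelength.

```python
import numpy as np

def depth_normalize(image, dz_mm, mu_eff_per_mm=0.1, max_gain=50.0):
    """Compensate an optoacoustic image (rows = depth) for the approximately
    exponential decay of optical fluence with depth, so that deep and shallow
    objects share a common display dynamic range."""
    depths_mm = np.arange(image.shape[0]) * dz_mm
    gain = np.minimum(np.exp(mu_eff_per_mm * depths_mm), max_gain)
    return image * gain[:, np.newaxis]
```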
[00111] In an embodiment, principal component analysis (PCA) on a single optoacoustic
Principal component analysis on a dataset of optoacoustic signals can remove correlated image clutter. Principal component analysis on optoacoustic frames can also remove correlated image clutter.
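A minimal sketch of channel-domain PCA clutter removal using a singular value decomposition is given below; the number of components removed is a processing choice, not a value from the disclosure.

```python
import numpy as np

def remove_principal_components(channel_data, n_remove=1):
    """Suppress cross-correlated noise/clutter by removing the leading
    principal components of a (n_channels, n_samples) acquisition.

    The dominant components capture content that is nearly identical on all
    channels (e.g., correlated clutter), which is then subtracted."""
    mean = channel_data.mean(axis=1, keepdims=True)
    centered = channel_data - mean
    u, s, vt = np.linalg.svd(centered, full_matrices=False)
    s_clean = s.copy()
    s_clean[:n_remove] = 0.0            # drop the leading component(s)
    return (u * s_clean) @ vt + mean
```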
[00112] FIGS. 17A and 17B provide an illustrative example of optoacoustic images 1710 and 1720 of a phantom of a spherical simulated tumor obtained with a flat linear probe. The first image 1710 is a raw image that was not subjected to principal component analysis processing. The second image 1720 has been subjected to principal component analysis processing with first principal component deconvolution. As can be seen, utilizing principal component analysis processing enhances image quality by, inter alia, reducing artifacts.
[00113] In an embodiment, design features of signal and image processing of the present disclosure can be summarized in Table 2 as follows:
Table 2. Summary of signal and image processing.
System Feature | Advantages
---|---
Operator-assisted boundary tracking on ultrasonic and optoacoustic images | Can improve quantitative optoacoustic diagnostics by evaluating the diagnostic parameters within the tumor boundary defined on US images; diagnostics can be enhanced by morphological analysis of the tumor boundary
Aperture integrated normalized radial back projection | Corrects some of the reconstruction artifacts that are observed in limited aperture optoacoustic tomography
Equalization of the optoacoustic image palette to diminish effects of light distribution within the tissue | Transforms dynamic range of optoacoustic images for better visualization of both shallow and deep objects
Principal component analysis (PCA) of the optoacoustic signal data | PCA on a single optoacoustic acquisition (different channels) is a fast and efficient way to remove cross-correlated signal noise; PCA on a dataset of optoacoustic signals removes correlated image clutter; PCA on optoacoustic frames removes correlated image clutter
Optoacoustic imaging system with quantitative assessment of total hemoglobin, blood oxygenation, and water | Cancer diagnostics based on those parameters or a single malignancy index (tHb*water/oxygenation) with respect to average background
Wavelet transform that enhances images of objects within certain dimension range | Operator can easily select the maximum size of the objects to be enhanced on the image; everything larger will be filtered out
Adaptive beam-forming for optoacoustic imaging | Allows individual reconstruction on a family of radial wavelet sub-bands

Diagnostic Image Reprocessing
[00114] The principles of functional diagnostic imaging can be based on the tumor pathophysiology. For example, malignant tumors have enhanced concentration of the total hemoglobin and reduced level of oxygen saturation in the hemoglobin of blood.
In an embodiment, optoacoustic images can be reprocessed and converted into, inter alia, images of (i) the total hemoglobin [tHb] and (ii) the oxygen saturation of hemoglobin [SO2]. FIG. 18 demonstrates an example of two breast tumors. [00115] FIG. 18 shows a diagram illustrating tumor differentiation based on absorption coefficients at two wavelengths, 755 nm, 1810, and 1064 nm, 1820, which match the local maximum (757 nm) and minimum (1064 nm) of the ratio of absorption by hemoglobin (hypoxic blood) to absorption by oxyhemoglobin. As can be seen, a malignant tumor, 1830, has a higher absorption coefficient at 757 nm than a benign tumor, 1840, whereas the benign tumor, 1840, has a higher absorption coefficient at 1064 nm than a malignant tumor, 1830.
[00116] FIG. 19 illustrates tumor differentiation by optoacoustic imaging based on absorption coefficients at two wavelengths 1910 and 1920 in a phantom. At 757 nm, 1920, a model of a malignant tumor is clearly visible, 1922, whereas the model of the malignant tumor, 1922, is not visible at 1064 nm, 1910.
[00117] FIG. 20A shows an optoacoustic image of two intersecting tubes filled with blood having different levels of blood [SO2] (98% in the left tube, and 31% in the right tube). The tubes were placed in 1% fat milk with optical properties similar to those found in the human breast. The wavelength of laser illumination used for this image is 1064 nm.
FIG. 20B
shows a photograph of an experimental setup that includes artificial blood vessels placed in milk solution and imaged using arc-shaped optoacoustic probe. FIG. 20C shows coregistered 2D cross-sectional anatomical and functional images of blood vessel tubes showing six image panels: (1-upper left) ultrasound image depicting anatomy of the body with vessels; (2-upper right) optoacoustic image obtained at the wavelength of 757 nm; (3-lower right) optoacoustic image obtained at the wavelength of 1064 nm; (4-lower left) functional image of the total hemoglobin [tHb]; (5-lower center) functional image of the blood oxygen saturation [S02];
(6-upper center) functional image of the blood oxygen saturation presented only in the area of maximum concentration of the total hemoglobin. Raw optoacoustic images depicted in FIG.
20C in the upper right and lower right panels demonstrate different brightness of blood vessels having blood with different levels of the total hemoglobin concentration [tHb] and blood oxygen saturation [SO2]. Accurate quantitative measurements could be performed under conditions of normalized fluence of the optical illumination of tissue in the body as a function of depth. These optoacoustic images were used to reconstruct functional images of the total hemoglobin [tHb] and the blood oxygenation [SO2]. All functional images displayed in FIG. 20C are coregistered and superimposed with the anatomical image of tissue structure for better correlation of features.
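Under the normalized-fluence assumption stated above, the per-pixel conversion of two-wavelength optoacoustic images into [tHb] and [SO2] reduces to a 2x2 linear unmixing. The extinction coefficients in the sketch below are placeholders only and must be replaced with published hemoglobin absorption spectra.

```python
import numpy as np

# Placeholder extinction coefficients at the two wavelengths; real values
# should be taken from published hemoglobin/oxyhemoglobin spectra.
EXT = {
    757:  {"HHb": 1.60e3, "O2Hb": 0.60e3},
    1064: {"HHb": 0.10e3, "O2Hb": 1.00e3},
}

def unmix_hemoglobin(img_757, img_1064):
    """Per-pixel linear unmixing of fluence-normalized optoacoustic images at
    757 nm and 1064 nm into [HHb] and [O2Hb], then [tHb] and [SO2]."""
    a = np.array([[EXT[757]["HHb"],  EXT[757]["O2Hb"]],
                  [EXT[1064]["HHb"], EXT[1064]["O2Hb"]]])
    a_inv = np.linalg.inv(a)
    stacked = np.stack([img_757, img_1064])              # (2, H, W)
    hhb, o2hb = np.tensordot(a_inv, stacked, axes=1)     # solve the 2x2 system per pixel
    thb = hhb + o2hb
    so2 = np.divide(o2hb, thb, out=np.zeros_like(thb), where=thb != 0)
    return thb, so2
```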
[00118] FIGS. 21A and 21B show optoacoustic signal amplitude as a function of blood oxygen saturation (with constant hematocrit) under laser illumination at the wavelength of 1064 nm in FIG. 21A and at 757 nm in FIG. 21B. These plots illustrate that blood oxygen saturation can be monitored with optoacoustic imaging. Specifically, this embodiment illustrates quantitative data based on measurements of the optoacoustic signal amplitude in blood having various levels of oxygen saturation (from 30% to 98%) and hematocrit of 38 g/dL of hemoglobin [tHb] in erythrocytes. As predicted by the published absorption spectra of blood, the optoacoustic signal amplitude at 1064 nm illumination increases with increased
100119] FIG. 22 illustrates optical absorption spectra of the main tissue chromophores absorbing optical energy in the near-infrared range: hemoglobin, oxyhemoglobin and water.
Preferred laser wavelengths for functional imaging are 757 nm and 1064 nm, matching the maximum and minimum of the ratio [HHb]/[O2Hb], while the wavelength of 800 nm is best for calibration purposes through measurements of the total hemoglobin [tHb].
[00120] FIGS. 23A and 23B illustrate coregistered functional and anatomical imaging of breast tumors in phantoms accurately replicating optical and acoustic properties of an average breast with tumors. FIG. 23A shows 2D images of a model malignant tumor: morphology based on ultrasound (left), the same anatomical image coregistered with the functional image of the total hemoglobin concentration (center) and with the functional image of the blood oxygenation (right). FIG. 23B shows 2D images of a model benign tumor:
morphology image based on ultrasound (left), the same anatomical image coregistered with functional image of the total hemoglobin concentration (center) and with functional image of the blood oxygenation (right).
[00121] FIGS. 24A and 24B illustrate coregistered functional and anatomical imaging of breast tumors. FIG. 24A shows 2D images of invasive ductal carcinoma, a malignant tumor with rough boundaries, heterogeneous morphology, high concentration of total hemoglobin and low oxygen saturation (hypoxia). The malignant tumor morphology is based on ultrasound in the left image, and the same anatomical image is coregistered with the functional image of the blood oxygenation in the center image and with the functional image of the total hemoglobin concentration in the right image. FIG. 24B shows 2D images of a breast with fibroadenoma, a benign tumor with relatively round boundaries, normal concentration of oxyhemoglobin and relatively low total hemoglobin. Breast morphology is based on ultrasound in the left image, and the same anatomical image is coregistered with a functional image of the blood oxygenation in the center image and with a functional image of the total hemoglobin concentration in the right image.
Conclusion
[00122] While some embodiments can be implemented in fully functioning computers and computer systems, various embodiments are capable of being distributed as a computing product in a variety of forms and are capable of being applied regardless of the particular type of machine or computer-readable media used to actually effect the distribution.
[00123] At least some aspects disclosed can be embodied, at least in part, in software.
That is, the techniques described herein may be carried out in a special purpose or general purpose computer system or other data processing system in response to its processor, such as a microprocessor, executing sequences of instructions contained in a memory, such as ROM, volatile RAM, non-volatile memory, cache or a remote storage device.
[00124] Routines executed to implement the embodiments may be implemented as part of an operating system, firmware, ROM, middleware, service delivery platform, SDK
(Software Development Kit) component, web services, or other specific application, component, program, object, module or sequence of instructions referred to as "computer programs."
Invocation interfaces to these routines can be exposed to a software development community as an API (Application Programming Interface). The computer programs typically comprise one or more instructions set at various times in various memory and storage devices in a computer, and that, when read and executed by one or more processors in a computer, cause the computer to perform operations necessary to execute elements involving the various aspects.
[00125] A machine-readable medium can be used to store software and data which when executed by a data processing system causes the system to perform various methods. The executable software and data may be stored in various places including for example ROM, volatile RAM, non-volatile memory and/or cache. Portions of this software and/or data may be stored in any one of these storage devices. Further, the data and instructions can be obtained from centralized servers or peer-to-peer networks. Different portions of the data and instructions can be obtained from different centralized servers and/or peer-to-peer networks at different times and in different communication sessions or in a same communication session. The data and instructions can be obtained in entirety prior to the execution of the applications. Alternatively, portions of the data and instructions can be obtained dynamically, just in time, when needed for execution. Thus, it is not required that the data and instructions be on a machine-readable medium in entirety at a particular instance of time.
[00126] Examples of computer-readable media include but are not limited to recordable and non-recordable type media such as volatile and non-volatile memory devices, read only
[00127] In general, a machine readable medium includes any mechanism that provides (e.g., stores) information in a form accessible by a machine (e.g., a computer, network device, personal digital assistant, manufacturing tool, any device with a set of one or more processors, etc.).
[00128] In various embodiments, hardwired circuitry may be used in combination with software instructions to implement the techniques. Thus, the techniques are neither limited to any specific combination of hardware circuitry and software nor to any particular source for the instructions executed by the data processing system.
[00129] Although some of the drawings illustrate a number of operations in a particular order, operations that are not order dependent may be reordered and other operations may be combined or broken out. While some reordering or other groupings are specifically mentioned, others will be apparent to those of ordinary skill in the art and so do not present an exhaustive list of alternatives. Moreover, it should be recognized that the stages could be implemented in hardware, firmware, software or any combination thereof.
[00130] In the foregoing specification, the disclosure has been described with reference to specific exemplary embodiments thereof. It will be evident that various modifications may be made thereto without departing from the broader spirit and scope as set forth in the following claims. The specification and drawings are, accordingly, to be regarded in an illustrative sense rather than a restrictive sense.
Claims (45)
1. An imaging system for visualization of slices into a depth of tissue of at least a portion of a body, comprising:
a hand-held imaging probe comprising a light emitting portion and an array of ultrasonic transducers, wherein the light emitting portion comprises:
a first fiber-optic bundle and a first light diffuser configured to form a first light beam and a second fiber-optic bundle and a second light diffuser configured to form a second light beam; and a processing system configured to receive data originating from the hand-held imaging probe and to process at least three independent images based at least in part upon said data wherein the processing comprises deconvolution of an impulse response associated with the array of ultrasonic transducers, the impulse response based on an ultrasonic signal generated by laser pulses in an absorbing optical medium, the three independent images together comprising:
a first functional image reflecting distribution of total hemoglobin concentration; a second functional image reflecting distribution of blood oxygen saturation;
and a morphological image of tissue structures;
the processing system being further configured to substantially co-register the first functional image, the second functional image, and the morphological image in time and space, and to output a substantially co-registered image.
2. The system of claim 1, in which the light emitting portion and the array of ultrasonic transducers in the hand-held imaging probe are arranged in a generally flat linear shape.
3. The system of claim 1, in which the light emitting portion and the array of ultrasonic transducers in the hand-held imaging probe are arranged in a curved concave arc shape.
4. The system of claim 1, in which the hand-held imaging probe is configured to produce at least two optical beams, one on each side of the array of ultrasonic transducers, so as to deliver optical energy to a skin surface at such angle and such distance between them that the optical beams merge into one beam within a distance of skin thickness under the array of ultrasonic transducers.
5. The system of claim 1, further comprising one or more dual-wavelength short-pulse lasers.
6. The system of claim 1, further comprising a plurality of single-wavelength short pulse lasers.
7. The system of claim 1, further comprising a fiberoptic light delivery system.
8. The system of claim 1, in which the system is configured to present images substantially in real time by operating at a video frame rate.
9. The system of claim 1, in which the hand-held imaging probe is configured to deliver optical energy from either under a face of said array of transducers or its side.
10. The system of claim 1, in which the hand-held imaging probe comprises an acoustic lens.
11. The system of claim 10, in which the acoustic lens comprises optically reflective materials.
12. The system of claim 11, in which the optically reflective materials comprise a thin, highly optically reflective metallic layer that removes image artifacts associated with light interactions with the acoustic lens.
13. The system of claim 12, in which the acoustic lens is formed from a white opaque material.
14. The system of claim 12, in which the thin, highly optically reflective metallic layer comprises aluminum, gold, or silver.
15. The system of claim 1, in which the hand-held imaging probe comprises an output fiber bundle with multiple sub-bundles.
16. The system of claim 15, in which said multiple sub-bundles are shaped to provide even illumination of an image plane and smooth illumination edges so as to reduce edge-related optoacoustic artifacts.
17. The system of claim 1, in which the array of ultrasonic transducers comprises ultrasonic transducers having an ultrawide ultrasonic frequency band of sensitivity, with bandwidth of up to 200% from the central frequency.
18. The system of claim 1, in which the hand-held imaging probe comprises an input fiber bundle that is circular in shape to match an incident laser beam.
19. The system of claim 1, in which the hand-held imaging probe comprises an input fiber bundle having a thermally fused fiber bundle tip such that substantially all fibers in the bundle are reshaped to avoid loss of light through spaces between fibers.
20. The system of claim 1, in which the hand-held imaging probe comprises a fiber bundle that is divided into at least two sub-bundles, with fibers in each sub-bundle being randomized such that two neighboring fibers at an input appear in different sub-bundles of the fiber bundle.
21. The system of claim 1, in which the hand-held imaging probe comprises a fiber bundle that is divided into at least two sub-bundles to form fiber bundle paddles, with at least two paddles placed on each side of the ultrasonic transducer array, each paddle, in turn, being divided into smaller sub-bundles, each smaller sub-bundle being in a slot in said paddle so as to provide controlled profile of an optical beam.
22. The system of claim 1, in which the hand-held imaging probe comprises a fiber bundle, which produces an optical beam that is shaped to complement a size and shape of the ultrasonic transducer array.
23. The system of claim 1, in which the hand-held imaging probe comprises an output fiber bundle having triangular shaped ends so as to allow an output beam to have smooth edges of optical fluence after passing through the first light diffuser.
24. The system of claim 1, in which the hand-held imaging probe comprises a plurality of optical windows, each comprising one or more anti-reflection-coated plates with acoustic impedance matching that of tissues to be imaged.
25. The system of claim 24, in which the anti-reflection-coated windows comprise glass, polymer or other solid optically transparent material.
26. The system of claim 1, in which the hand-held imaging probe comprises:
the first light diffuser and the second light diffuser;
first and second optical windows;
at least two output fiber bundles arranged such that optical beams respectively emerging therefrom pass through the respective light diffusers, then pass through the respective optical windows, then merge at least partially.
27. The system of claim 1, further comprising a three-dimensional positioning system configured to control position of the hand-held imaging probe so as to allow assembly of three-dimensional volumetric images of the body from two-dimensional slices made though the depth of tissue obtained by scanning the hand-held imaging probe along the surface of the at least the portion of the body.
28. The system of claim 1, in which the hand-held imaging probe further comprises an acoustic lens formed from a material that allows it to reflect and scatter light from illumination components with substantially no absorption of such light, and yet be optically opaque.
29. The system of claim 28, in which the acoustic lens is formed from silicon rubber.
30. The system of claim 29, in which the silicon rubber is filled with titanium dioxide.
31. The system of claim 29, in which the silicon rubber is filled with barium sulfate powder.
32. The system of claim 1, in which the hand-held imaging probe further comprises a housing that provides hypo-echoic encapsulation of the probe.
33. The system of claim 32, in which internal or external parts of the housing comprise materials that do not absorb near-infrared laser light.
34. The system of claim 33, in which internal or external parts of the housing comprise materials having low thermal expansion properties such that the housing does not emit ultrasound after absorption of laser light.
35. The system of claim 1, in which an assembly of the array of ultrasonic transducers is made of hypo-echoic material.
36. The system of claim 1, further comprising a layer of hypo-echoic material between an assembly of the array of ultrasonic transducers and a fiberoptic assembly to avoid generation of ultrasound upon interaction of light with assembly of the array of ultrasonic transducers.
37. A hand-held imaging probe comprising an array of ultrasonic transducers and a light emitting portion, wherein the light emitting portion comprises:
a first fiber-optic bundle and a first light diffuser configured to form a first light beam and a second fiber-optic bundle and a second light diffuser configured to form a second light beam, wherein the first and second fiber-optic bundles are disposed on opposite sides of the array of ultrasonic transducers; and a first optical window through which the first light beam passes;
a second optical window through which the second light beam passes;
wherein the first and second fiber optic bundles are oriented such that, when the first optical window and second optical window are positioned proximate a tissue, the first light beam passes through the first optical window and converges in the tissue with the second light beam that passes through the second optical window;
a processing system configured to receive data originating from the hand-held imaging probe and to process independent images based at least in part upon said data, wherein the processing comprises deconvolution of an impulse response associated with the array of ultrasonic transducers, the impulse response based on an ultrasonic signal generated by short laser pulses in an absorbing optical medium.
38. The hand-held imaging probe of claim 37, further comprising a signal filter that filters a delta-function ultrasonic signal to remove noise from the ultrasonic signal and communicates with the processing system.
39. The hand-held imaging probe of claim 38, wherein the signal filter is a wavelet filter based on a wavelet transform that operates simultaneously in the frequency and time domain.
40. The hand-held imaging probe of claim 39, wherein the wavelet filter is configured to filter frequency components of the delta-function ultrasonic signal that are noise at a first time; and preserve frequency components of the delta-function ultrasonic signal at a second time.
41. The hand-held imaging probe of claim 39, wherein a frequency spectrum of the wavelet filter replicates a frequency band of an N-shaped optoacoustic signal.
42. The hand-held imaging probe of claim 41, wherein the wavelet filter provides smooth window edges that do not cause signal distortions upon convolution.
43. The hand-held imaging probe of claim 41, wherein the wavelet filter converts bipolar pressure signals into monopolar signals.
44. The hand-held imaging probe of claim 37, wherein the deconvolution of the impulse response restores N-shaped pressure signals.
45. The hand-held imaging probe of claim 37, wherein the hand-held imaging probe has a concave arc shape.
Applications Claiming Priority (11)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/287,759 | 2011-11-02 | ||
US13/287,759 US20130109950A1 (en) | 2011-11-02 | 2011-11-02 | Handheld optoacoustic probe |
US13/341,950 US8686335B2 (en) | 2011-12-31 | 2011-12-31 | System and method for adjusting the light output of an optoacoustic imaging system |
US13/341,950 | 2011-12-31 | ||
US13/507,217 US9289191B2 (en) | 2011-10-12 | 2012-06-13 | System and method for acquiring optoacoustic data and producing parametric maps thereof |
US13/507,217 | 2012-06-13 | ||
US13/667,830 | 2012-11-02 | ||
US13/667,808 | 2012-11-02 | ||
US13/667,808 US20130289381A1 (en) | 2011-11-02 | 2012-11-02 | Dual modality imaging system for coregistered functional and anatomical mapping |
PCT/US2012/063409 WO2013067419A1 (en) | 2011-11-02 | 2012-11-02 | Dual modality imaging system for coregistered functional and anatomical mapping |
US13/667,830 US9757092B2 (en) | 2011-11-02 | 2012-11-02 | Method for dual modality optoacoustic imaging |
Publications (2)
Publication Number | Publication Date |
---|---|
CA2861089A1 CA2861089A1 (en) | 2013-05-10 |
CA2861089C true CA2861089C (en) | 2021-01-12 |
Family
ID=48192850
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CA2861089A Active CA2861089C (en) | 2011-11-02 | 2012-11-02 | Dual modality imaging system for coregistered functional and anatomical mapping |
Country Status (8)
Country | Link |
---|---|
JP (2) | JP6322578B2 (en) |
KR (1) | KR102117132B1 (en) |
AU (2) | AU2012332233B2 (en) |
CA (1) | CA2861089C (en) |
IL (1) | IL232414A0 (en) |
MX (1) | MX2014005408A (en) |
SG (1) | SG11201401986WA (en) |
WO (1) | WO2013067419A1 (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20230085179A1 (en) * | 2021-09-10 | 2023-03-16 | Rockley Photonics Limited | Optical speckle receiver |
Families Citing this family (31)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9451884B2 (en) | 2007-12-13 | 2016-09-27 | Board Of Trustees Of The University Of Arkansas | Device and method for in vivo detection of clots within circulatory vessels |
US9144383B2 (en) | 2007-12-13 | 2015-09-29 | The Board Of Trustees Of The University Of Arkansas | Device and method for in vivo noninvasive magnetic manipulation of circulating objects in bioflows |
US20090156932A1 (en) | 2007-12-13 | 2009-06-18 | Board Of Trustees Of The University Of Arkansas | Device and method for in vivo flow cytometry using the detection of photoacoustic waves |
US8686335B2 (en) | 2011-12-31 | 2014-04-01 | Seno Medical Instruments, Inc. | System and method for adjusting the light output of an optoacoustic imaging system |
US20130116538A1 (en) | 2011-11-02 | 2013-05-09 | Seno Medical Instruments, Inc. | Optoacoustic imaging systems and methods with enhanced safety |
US10433732B2 (en) | 2011-11-02 | 2019-10-08 | Seno Medical Instruments, Inc. | Optoacoustic imaging system having handheld probe utilizing optically reflective material |
US20140005544A1 (en) | 2011-11-02 | 2014-01-02 | Seno Medical Instruments, Inc. | System and method for providing selective channel sensitivity in an optoacoustic imaging system |
US9814394B2 (en) | 2011-11-02 | 2017-11-14 | Seno Medical Instruments, Inc. | Noise suppression in an optoacoustic system |
US20130289381A1 (en) | 2011-11-02 | 2013-10-31 | Seno Medical Instruments, Inc. | Dual modality imaging system for coregistered functional and anatomical mapping |
US9445786B2 (en) | 2011-11-02 | 2016-09-20 | Seno Medical Instruments, Inc. | Interframe energy normalization in an optoacoustic imaging system |
US11191435B2 (en) | 2013-01-22 | 2021-12-07 | Seno Medical Instruments, Inc. | Probe with optoacoustic isolator |
KR102105728B1 (en) | 2012-03-09 | 2020-04-28 | 세노 메디컬 인스투르먼츠 인코포레이티드 | Statistical mapping in an optoacoustic imaging system |
JP6061571B2 (en) * | 2012-09-04 | 2017-01-18 | キヤノン株式会社 | Subject information acquisition device |
CA2884954A1 (en) | 2012-09-25 | 2014-04-03 | Mark S. Smeltzer | Device and method for in vivo photoacoustic diagnosis and photothermal purging of infected blood |
US20160213257A1 (en) * | 2013-09-04 | 2016-07-28 | Canon Kabushiki Kaisha | Photoacoustic apparatus |
CN103512960B (en) * | 2013-09-27 | 2016-01-06 | 中国科学院声学研究所 | A kind of supersonic array formation method |
KR20160067881A (en) | 2013-10-11 | 2016-06-14 | 세노 메디컬 인스투르먼츠 인코포레이티드 | Systems and methods for component separation in medical imaging |
CN104739453A (en) * | 2013-12-31 | 2015-07-01 | 深圳市鹏瑞智能技术应用研究院 | Ultrasonic tomography system and method |
JP6049209B2 (en) * | 2014-01-28 | 2016-12-21 | 富士フイルム株式会社 | Photoacoustic measurement probe and photoacoustic measurement apparatus including the same |
US10945610B2 (en) | 2014-12-31 | 2021-03-16 | Bioventures, Llc | Devices and methods for fractionated photoacoustic flow cytometry |
WO2019156975A1 (en) * | 2018-02-07 | 2019-08-15 | Atherosys, Inc. | Apparatus and method to guide ultrasound acquisition of the peripheral arteries in the transverse plane |
AU2019247406A1 (en) * | 2018-04-04 | 2020-11-26 | Tomowave Laboratories, Inc. | Quantitative imaging system and uses thereof |
AU2019331103A1 (en) * | 2018-08-29 | 2021-03-25 | Tel Hashomer Medical Research Infrastructure And Services Ltd. | System and method for determining oxygenated-blood content of biological tissue |
US11832872B2 (en) | 2019-04-01 | 2023-12-05 | Anya L. Getman | Resonating probe with optional sensor, emitter, and/or injection capability |
JP7301676B2 (en) * | 2019-08-28 | 2023-07-03 | キヤノンメディカルシステムズ株式会社 | ULTRASOUND DIAGNOSTIC APPARATUS, SIGNAL PROCESSING METHOD, AND SIGNAL PROCESSING PROGRAM |
JP7292434B2 (en) * | 2020-01-21 | 2023-06-16 | 株式会社エビデント | Erythrocyte differentiation monitoring device and erythrocyte differentiation monitoring method |
CN111671436A (en) * | 2020-05-21 | 2020-09-18 | 东南大学 | Temperature-compensated photoacoustic noninvasive hemoglobin detection device and detection method |
CN111839730B (en) * | 2020-07-07 | 2022-02-11 | 厦门大学附属翔安医院 | Photoacoustic imaging surgical navigation platform for guiding tumor resection |
CN116138805B (en) * | 2022-12-30 | 2023-09-08 | 深圳开立生物医疗科技股份有限公司 | Photoacoustic ultrasound multi-modality imaging apparatus and method, electronic apparatus, and storage medium |
CN115868956A (en) * | 2023-03-01 | 2023-03-31 | 暨南大学附属第一医院(广州华侨医院) | Anti-interference method of excitation source for magneto-optical acoustic imaging |
CN118356211B (en) * | 2024-06-19 | 2024-09-10 | 杭州励影光电成像有限责任公司 | Ultrahigh-speed multimode fusion imaging system and method |
Family Cites Families (39)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3974946B2 (en) * | 1994-04-08 | 2007-09-12 | オリンパス株式会社 | Image classification device |
JPH0961359A (en) * | 1995-08-29 | 1997-03-07 | Hamamatsu Photonics Kk | Concentration measuring device |
US5830146A (en) * | 1997-03-17 | 1998-11-03 | Polartechnics Limited | Sheathed probes for tissue type recognition |
JPH1176232A (en) * | 1997-09-11 | 1999-03-23 | Hitachi Medical Corp | Ultrasonic diagnostic apparatus |
US6827686B2 (en) * | 2002-08-21 | 2004-12-07 | Koninklijke Philips Electronics N.V. | System and method for improved harmonic imaging |
JP4406226B2 (en) * | 2003-07-02 | 2010-01-27 | 株式会社東芝 | Biological information video device |
JP4643153B2 (en) * | 2004-02-06 | 2011-03-02 | 株式会社東芝 | Non-invasive biological information imaging device |
IL166408A0 (en) * | 2005-01-20 | 2006-01-15 | Ultraview Ltd | Combined 2D pulse-echo ultrasound and optoacoustic signal for glaucoma treatment |
WO2006090298A1 (en) * | 2005-02-23 | 2006-08-31 | Philips Intellectual Property & Standards Gmbh | Imaging an object of interest |
JP4745743B2 (en) * | 2005-07-14 | 2011-08-10 | Hoya株式会社 | Fluorescence observation endoscope system |
US20070093708A1 (en) * | 2005-10-20 | 2007-04-26 | Benaron David A | Ultra-high-specificity device and methods for the screening of in-vivo tumors |
US20070093702A1 (en) * | 2005-10-26 | 2007-04-26 | Skyline Biomedical, Inc. | Apparatus and method for non-invasive and minimally-invasive sensing of parameters relating to blood |
JP2007267837A (en) * | 2006-03-30 | 2007-10-18 | Toshiba Corp | Biolight measuring apparatus |
CN101472520B (en) * | 2006-06-23 | 2015-06-03 | 皇家飞利浦电子股份有限公司 | Timing controller for combined photoacoustic and ultrasound imager |
US8070682B2 (en) * | 2006-07-19 | 2011-12-06 | The University Of Connecticut | Method and apparatus for medical imaging using combined near-infrared optical tomography, fluorescent tomography and ultrasound |
JP4820239B2 (en) * | 2006-08-28 | 2011-11-24 | 公立大学法人大阪府立大学 | Probe for optical tomography equipment |
BRPI0719142A8 (en) * | 2006-11-21 | 2015-10-13 | Koninklijke Philips Electronics Nv | System and method for image formation of prostate tissue in an anatomical structure |
JP5406729B2 (en) * | 2007-02-05 | 2014-02-05 | ブラウン ユニバーシティ | Improved high-resolution acoustic microscope |
JP5002397B2 (en) * | 2007-09-28 | 2012-08-15 | 株式会社東芝 | Ultrasonic diagnostic apparatus and program |
WO2010048258A1 (en) * | 2008-10-23 | 2010-04-29 | Washington University In St. Louis | Reflection-mode photoacoustic tomography using a flexibly-supported cantilever beam |
US8454512B2 (en) * | 2007-10-25 | 2013-06-04 | Washington University | Confocal photoacoustic microscopy with optical lateral resolution |
JP2010179085A (en) * | 2008-07-11 | 2010-08-19 | Canon Inc | Biological information acquisition apparatus |
EP2328480B1 (en) * | 2008-07-18 | 2016-01-06 | University Of Rochester | Low-cost device for c-scan photoacoustic imaging |
US9572497B2 (en) * | 2008-07-25 | 2017-02-21 | Helmholtz Zentrum Munchen Deutsches Forschungszentrum Fur Gesundheit Und Umwelt (Gmbh) | Quantitative multi-spectral opto-acoustic tomography (MSOT) of tissue biomarkers |
JP4900979B2 (en) * | 2008-08-27 | 2012-03-21 | キヤノン株式会社 | Photoacoustic apparatus and probe for receiving photoacoustic waves |
US20100094134A1 (en) * | 2008-10-14 | 2010-04-15 | The University Of Connecticut | Method and apparatus for medical imaging using near-infrared optical tomography combined with photoacoustic and ultrasound guidance |
JP2010125260A (en) * | 2008-12-01 | 2010-06-10 | Canon Inc | Biological testing apparatus |
JP5241465B2 (en) * | 2008-12-11 | 2013-07-17 | キヤノン株式会社 | Photoacoustic imaging apparatus and photoacoustic imaging method |
JP5275830B2 (en) * | 2009-01-26 | 2013-08-28 | 富士フイルム株式会社 | Optical ultrasonic tomographic imaging apparatus and optical ultrasonic tomographic imaging method |
JP5483905B2 (en) * | 2009-03-03 | 2014-05-07 | キヤノン株式会社 | Ultrasonic device |
JP4621781B2 (en) * | 2009-03-06 | 2011-01-26 | 株式会社東芝 | Laser ultrasonic inspection equipment |
WO2010127199A2 (en) * | 2009-05-01 | 2010-11-04 | Visualsonics Inc. | System for photoacoustic imaging and related methods |
JP2011072702A (en) * | 2009-10-01 | 2011-04-14 | Konica Minolta Medical & Graphic Inc | Acoustic lens for ultrasonic probe, and ultrasonic probe |
JP5692988B2 (en) * | 2009-10-19 | 2015-04-01 | キヤノン株式会社 | Acoustic wave measuring device |
EP2494923B1 (en) * | 2009-10-29 | 2015-07-29 | Canon Kabushiki Kaisha | Photo-acoustic device |
WO2011091423A2 (en) * | 2010-01-25 | 2011-07-28 | The Arizona Board Of Regents On Behalf Of The University Of Arizona | Ultrasonic/photoacoustic imaging devices and methods |
JP5818444B2 (en) * | 2010-02-04 | 2015-11-18 | キヤノン株式会社 | Function information acquisition apparatus, function information acquisition method, and program |
JP5448918B2 (en) * | 2010-02-24 | 2014-03-19 | キヤノン株式会社 | Biological information processing device |
JP5479173B2 (en) * | 2010-03-17 | 2014-04-23 | キヤノン株式会社 | Information processing apparatus and information processing method |
- 2012
  - 2012-11-02 MX MX2014005408A patent/MX2014005408A/en unknown
  - 2012-11-02 SG SG11201401986WA patent/SG11201401986WA/en unknown
  - 2012-11-02 KR KR1020147014671A patent/KR102117132B1/en active IP Right Grant
  - 2012-11-02 JP JP2014540150A patent/JP6322578B2/en active Active
  - 2012-11-02 CA CA2861089A patent/CA2861089C/en active Active
  - 2012-11-02 AU AU2012332233A patent/AU2012332233B2/en not_active Ceased
  - 2012-11-02 WO PCT/US2012/063409 patent/WO2013067419A1/en active Application Filing
- 2014
  - 2014-05-01 IL IL232414A patent/IL232414A0/en unknown
- 2017
  - 2017-11-28 AU AU2017268522A patent/AU2017268522A1/en not_active Abandoned
- 2018
  - 2018-04-09 JP JP2018074832A patent/JP6732830B2/en active Active
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20230085179A1 (en) * | 2021-09-10 | 2023-03-16 | Rockley Photonics Limited | Optical speckle receiver |
US12109006B2 (en) * | 2021-09-10 | 2024-10-08 | Rockley Photonics Limited | Optical speckle receiver |
Also Published As
Publication number | Publication date |
---|---|
WO2013067419A1 (en) | 2013-05-10 |
KR102117132B1 (en) | 2020-05-29 |
JP6322578B2 (en) | 2018-05-09 |
CA2861089A1 (en) | 2013-05-10 |
AU2012332233B2 (en) | 2017-08-31 |
AU2012332233A1 (en) | 2014-05-22 |
JP6732830B2 (en) | 2020-07-29 |
IL232414A0 (en) | 2014-06-30 |
JP2018143778A (en) | 2018-09-20 |
JP2015501194A (en) | 2015-01-15 |
AU2017268522A1 (en) | 2017-12-14 |
SG11201401986WA (en) | 2014-08-28 |
KR20140103932A (en) | 2014-08-27 |
MX2014005408A (en) | 2015-02-12 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10709419B2 (en) | Dual modality imaging system for coregistered functional and anatomical mapping | |
CA2861089C (en) | Dual modality imaging system for coregistered functional and anatomical mapping | |
US9757092B2 (en) | Method for dual modality optoacoustic imaging | |
US10433732B2 (en) | Optoacoustic imaging system having handheld probe utilizing optically reflective material | |
CA2861979C (en) | Laser optoacoustic ultrasonic imaging system (LOUIS) and methods of use | |
KR102105728B1 (en) | Statistical mapping in an optoacoustic imaging system | |
US20100087733A1 (en) | Biological information processing apparatus and biological information processing method | |
WO2012014391A1 (en) | Image information obtaining apparatus and control method for same | |
US20130190594A1 (en) | Scanning Optoacoustic Imaging System with High Resolution and Improved Signal Collection Efficiency | |
US20220133273A1 (en) | Transparent ultrasound transducers for photoacoustic imaging | |
JP2013255697A (en) | Object information acquiring apparatus and control method thereof | |
JP2013158531A (en) | Apparatus and method for obtaining subject information | |
JP2017047177A (en) | Subject information acquiring apparatus and control method for subject information acquiring apparatus | |
US20150182126A1 (en) | Photoacoustic apparatus, signal processing method, and program | |
JP6486056B2 (en) | Photoacoustic apparatus and processing method of photoacoustic apparatus | |
EP2773267B1 (en) | Dual modality imaging system for coregistered functional and anatomical mapping | |
RU2787527C2 (en) | System for quantitative image generation and its use | |
US20240184241A1 (en) | Systems and methods for an imaging device | |
JP6643108B2 (en) | Subject information acquisition device and subject information acquisition method | |
JP6223073B2 (en) | Subject information acquisition device | |
JP2019136520A (en) | Processing device, photoacoustic image display method, and program | |
JP2017124264A (en) | Processing device, subject information obtaining device, photoacoustic image display method, and program |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| EEER | Examination request | Effective date: 20171030