AU2012332233B2 - Dual modality imaging system for coregistered functional and anatomical mapping - Google Patents


Info

Publication number
AU2012332233B2
Authority
AU
Australia
Prior art keywords
optoacoustic
tissue
optical
image
ultrasonic
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
AU2012332233A
Other versions
AU2012332233A1 (en)
Inventor
Peter BRECHT
Bryan CLINGMAN
Andre CONJUSTEAU
Sergey Ermilov
Donald G. Herzog
Vyacheslav NADVORETSKIY
Alexander Oraevsky
Richard SU
Jason ZALEV
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Seno Medical Instruments Inc
Original Assignee
Seno Medical Instruments Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Priority claimed from US 13/287,759 (US20130109950A1)
Priority claimed from US 13/341,950 (US8686335B2)
Priority claimed from US 13/507,217 (US9289191B2)
Application filed by Seno Medical Instruments Inc
Priority claimed from US 13/667,808 (US20130289381A1)
Priority claimed from US 13/667,830 (US9757092B2)
Publication of AU2012332233A1
Application granted
Publication of AU2012332233B2
Priority to AU2017268522A (AU2017268522A1)
Legal status: Ceased
Anticipated expiration


Classifications

    • A61B 5/0095 Detecting, measuring or recording by applying light and detecting acoustic waves, i.e. photoacoustic measurements
    • A61B 5/14551 Measuring characteristics of blood in vivo using optical sensors, e.g. spectral photometrical oximeters, for measuring blood gases
    • A61B 5/4312 Breast evaluation or disorder diagnosis
    • A61B 8/08 Detecting organic movements or changes, e.g. tumours, cysts, swellings
    • A61B 8/0825 Detecting organic movements or changes for diagnosis of the breast, e.g. mammography
    • A61B 8/14 Echo-tomography
    • A61B 8/5238 Devices using data or image processing for combining image data of a patient, e.g. merging several images from different acquisition modes into one image
    • A61B 8/5261 Devices using data or image processing for combining images from different diagnostic modalities, e.g. ultrasound and X-ray
    • A61B 2562/0204 Acoustic sensors
    • A61B 2562/0233 Special features of optical sensors or probes classified in A61B 5/00
    • G01N 21/1702 Systems in which incident light is modified in accordance with the properties of the material investigated, with opto-acoustic detection, e.g. for gases or analysing solids

Abstract

A real-time imaging system that provides ultrasonic imaging and optoacoustic imaging coregistered through application of the same hand-held probe to generate and detect ultrasonic and optoacoustic signals. These signals are digitized, processed and used to reconstruct anatomical maps superimposed with maps of two functional parameters: the blood hemoglobin index and the blood oxygenation index. The blood hemoglobin index represents blood hemoglobin concentration changes in the areas of diagnostic interest relative to the background blood concentration. The blood oxygenation index represents blood oxygenation changes in the areas of diagnostic interest relative to the background level of blood oxygenation. These coregistered maps can be used to noninvasively differentiate malignant tumors from benign lumps and cysts.

Description

DUAL MODALITY IMAGING SYSTEM FOR COREGISTERED FUNCTIONAL
AND ANATOMICAL MAPPING
[0001] This application claims priority to U.S. Patent Application Nos. 13/667,808 and 13/667,830, filed November 2, 2012; U.S. Patent Application No. 13/507,217, filed June 13, 2012, entitled "System and Method for Acquiring Optoacoustic Data and Producing Parametric Maps Thereof"; U.S. Patent Application No. 13/341,950, filed December 31, 2011, entitled "System and Method for Adjusting the Light Output of an Optoacoustic Imaging System"; and U.S. Patent Application No. 13/287,759, filed November 2, 2011, entitled "Handheld Optoacoustic Probe." The entire disclosures of those applications, including the appendices thereof, are incorporated herein by reference.
FIELD OF THE TECHNOLOGY
[0002] At least some embodiments disclosed herein relate, in general, to systems for biomedical imaging, and more particularly, to real-time imaging systems that visualize thin tissue slices noninvasively through skin.
BACKGROUND
[0003] Medical ultrasound imaging is a well-established imaging technology for visualization of tissue morphology in various organs that provides diagnostic information based on analysis of anatomy. Optoacoustic imaging is used in medical applications for in vivo and in vitro mapping of animal and human tissues and organs based on variation in tissue optical properties. Optoacoustic tomography can provide anatomical, functional and molecular imaging, but the most significant value of optoacoustic imaging is in its capability to provide quantitative functional information based on endogenous contrast of molecular constituents of red blood cells. The essence of functional imaging is to provide the physician with a map of blood distribution and its level of oxygenation, so that the physician can determine whether a particular tissue is functioning normally. For example, a map of total hemoglobin distribution simultaneously showing an area with increased concentration and decreased oxygen saturation indicates potential malignancy. The essence of molecular imaging is to provide maps of distributions and concentrations of various molecules of interest for a specific health condition. For example, the distribution of specific protein receptors in cell membranes gives insight into the molecular biology of cells, which aids in designing drugs and therapeutic methods for treating human diseases.
SUMMARY
[0004] According to a first broad aspect of the present invention, there is provided an optoacoustic imaging system for visualization of slices into the depth of tissue of at least a portion of a body, comprising: a hand-held imaging probe comprising a light emitting portion configured to emit at least two optical pulses having different spectral bands of electromagnetic energy, and an array of ultrasonic transducers configured to detect transient ultrasonic signals resulting from selective absorption of each of the at least two optical pulses in hemoglobin and oxyhemoglobin of blood contained in the tissue; a processing system configured to receive data originating from the ultrasonic transducers of the hand-held imaging probe and to process at least three independent images based at least in part upon said data, the three independent images together comprising: a first functional image reflecting distribution of total hemoglobin concentration; a second functional image reflecting distribution of blood oxygen saturation; and a morphological image of tissue structures; the processing system being further configured to substantially co-register the first functional image, the second functional image, and the morphological image in time and space, and to output a substantially co-registered image.
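By way of illustration only, the co-registration and overlay described in this aspect can be sketched as an alpha blend of the two functional maps over the grayscale morphological image; the function name, color convention and blending weight below are assumptions made for the sketch, not features recited by the disclosure.

```python
import numpy as np

def overlay_functional_on_anatomy(anatomy, thb, so2, alpha=0.5):
    """Compose a co-registered RGB display frame (illustrative sketch).

    anatomy : 2-D grayscale ultrasound image, values in [0, 1]
    thb     : 2-D total-hemoglobin map on the same pixel grid
    so2     : 2-D blood-oxygen-saturation map, values in [0, 1]
    alpha   : opacity of the functional overlay (assumed value)
    """
    rgb = np.dstack([anatomy] * 3)                  # grayscale anatomy as RGB
    thb_norm = thb / (thb.max() + 1e-12)            # normalize hemoglobin map
    # Illustrative color convention: low SO2 -> red, high SO2 -> green,
    # overlay strength scaled by total hemoglobin.
    overlay = np.dstack([1.0 - so2, so2, np.zeros_like(so2)]) * thb_norm[..., None]
    return (1 - alpha) * rgb + alpha * overlay

# Example on synthetic maps of matching size (both modalities share the same
# transducer array, so the images share one pixel grid).
anatomy = np.random.rand(256, 256)
thb = np.random.rand(256, 256)
so2 = np.clip(np.random.rand(256, 256), 0, 1)
frame = overlay_functional_on_anatomy(anatomy, thb, so2)
print(frame.shape)  # (256, 256, 3)
```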
[0005] In an embodiment, the light emitting portion and the array of ultrasonic transducers in the hand-held imaging probe are arranged in a generally flat linear shape.
[0006] In another embodiment, the light emitting portion and the array of ultrasonic transducers in the hand-held imaging probe are arranged in a curved concave arc shape.
[0007] In a further embodiment, the hand-held imaging probe is configured to produce at least two optical beams, one on each side of the array of ultrasonic transducers, so as to deliver optical energy to a skin surface at such angle and such distance between them that the optical beams merge into one beam within a distance of skin thickness under the array of ultrasonic transducers.
[0008] In a further embodiment, the system further comprises one or more dual-wavelength short-pulse lasers.
[0009] In an embodiment, the system further comprises a plurality of single-wavelength short pulse lasers.
[0010] In one embodiment, the system further comprises a fiberoptic light delivery system.
[0011] In another embodiment, the system is configured to present images substantially in real time by operating at a video frame rate.
[0012] In a certain embodiment, the hand-held optoacoustic probe is configured to deliver optical energy from either under a face of said array of transducers or its side.
[0013] In an embodiment, the hand-held imaging probe comprises an acoustic lens. In an example, the acoustic lens comprises optically reflective materials. The optically reflective materials may comprise a thin, highly optically reflective metallic layer that removes image artifacts associated with light interactions with the acoustic lens. The acoustic lens may be formed from a white opaque material. The thin, highly optically reflective metallic layer may comprise aluminum, gold, or silver.
[0014] In another embodiment, the hand-held imaging probe comprises an output fiber bundle with multiple sub-bundles. In an example, the multiple sub-bundles are shaped to provide even illumination of the image plane and smooth illumination edges so as to reduce edge-related optoacoustic artifacts.
[0015] In yet another embodiment, the array of ultrasonic transducers comprises ultrasonic transducers having an ultrawide ultrasonic frequency band of sensitivity, with a bandwidth of up to 200% of the central frequency.
[0016] In a further embodiment, the hand-held imaging probe comprises an input fiber bundle that is circular in shape to match an incident laser beam.
[0017] In an embodiment, the hand-held imaging probe comprises an input fiber bundle having a thermally fused fiber bundle tip such that substantially all fibers in the bundle are reshaped to avoid loss of light through spaces between fibers.
[0018] In another embodiment, the hand-held imaging probe comprises a fiber bundle that is divided into at least two sub-bundles, with fibers in each sub-bundle being randomized such that two neighboring fibers at an input appear in different sub-bundles of the output fiber bundle.
[0019] In a further embodiment, the hand-held imaging probe comprises a fiber bundle that is divided into at least two sub-bundles to form fiber bundle paddles, with at least two paddles placed on each side of the ultrasonic transducer array, each paddle, in turn, being divided into smaller sub-bundles, each smaller sub-bundle being in a slot in said paddle so as to provide controlled profile of an optical beam.
[0020] In one embodiment, the hand-held imaging probe comprises a fiber bundle, which produces an optical beam that is shaped to complement the size and shape of the ultrasonic transducer array.
[0021] In another embodiment, the hand-held imaging probe comprises an output fiber bundle having triangular shaped ends so as to allow an output beam to have smooth edges of optical fluence after passing through a light diffuser.
[0022] In an embodiment, the hand-held imaging probe comprises a plurality of optical windows, each comprising one or more anti-reflection-coated plates with acoustic impedance matching that of tissues to be imaged. In an example, the anti-reflection-coated windows comprise glass, polymer or other solid optically transparent material.
[0023] In one embodiment, the hand-held imaging probe comprises: first and second light diffusers; first and second optical windows; at least two output fiber bundles arranged such that optical beams respectively emerging therefrom pass through the respective light diffusers, then pass through the respective optical windows, then merge at least partially.
[0024] In an embodiment, the system further comprises a three-dimensional positioning system configured to control the position of the hand-held probe so as to allow assembly of three-dimensional volumetric images of the body from two-dimensional slices made through a depth of tissue obtained by scanning the hand-held probe along the surface of at least a portion of the body.
[0025] In another embodiment, the handheld imaging probe further comprises an acoustic lens formed from a material that allows it to reflect and scatter light from illumination components with substantially no absorption of such light, and yet be optically opaque. In an example, the acoustic lens is formed from silicone rubber. The silicone rubber may be filled with titanium dioxide. The silicone rubber may be filled with barium sulfate powder.
[0026] In a further embodiment, the handheld imaging probe further comprises a housing that provides hypo-echoic encapsulation of the probe. In an example, internal or external parts of the housing comprise materials that do not absorb near-infrared laser light. Materials of internal or external parts of the housing may have low thermal expansion properties such that the housing does not emit ultrasound after absorption of laser light.
[0027] In one embodiment, an assembly of the array of ultrasonic transducers is made of hypo-echoic material.
[0028] In another embodiment, the system further comprises a layer of hypo-echoic material between an assembly of the array of ultrasonic transducers and a fiberoptic assembly to avoid generation of ultrasound upon interaction of light with the assembly of the array of ultrasonic transducers.
[0029] In a certain embodiment, the processing comprises deconvolution of a hardware transfer function.
[0030] According to a second broad aspect of the present invention, there is provided an optoacoustic imaging method for coregistered functional and anatomical mapping of tissue of at least a portion of a body, the method comprising: a) delivering ultrasonic pulses into the tissue and detecting backscattered ultrasonic signals reflected from structural tissue boundaries associated with body morphology; b) sequentially delivering to the tissue at least two optical pulses having different spectral bands of electromagnetic energy and detecting transient ultrasonic signals resulting from selective absorption of each of the at least two optical pulses in hemoglobin and oxyhemoglobin of blood contained in tissues; c) processing detected ultrasonic signals to remove noise, to revert signal alterations in the course of signal propagation through tissue and through the detection system components, and to restore a temporal shape and ultrasonic spectrum of the original signals; d) performing image reconstruction and further processing to generate morphological images of tissue structures coregistered and superimposed with partially transparent functional images reflecting total hemoglobin concentration and blood oxygen saturation; and, e) repeating steps a) to d) at a video frame rate such that real-time images display tissue functional and morphological changes substantially as they occur.
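A minimal sketch of one display frame of this method, assuming placeholder routines for acquisition, signal conditioning, reconstruction and display (none of which are specified by the disclosure), is shown below.

```python
def acquire_frame(acquire_us, acquire_oa, wavelengths, process, reconstruct, overlay):
    """One display frame of the coregistered method (steps a to d), as a sketch.

    All callables are placeholders: `acquire_us` performs pulse-echo
    acquisition (step a), `acquire_oa(wl)` fires one optical pulse at
    wavelength `wl` and records transient signals (step b), `process`
    denoises and restores signal shape and spectrum (step c), `reconstruct`
    forms an image, and `overlay` fuses anatomy with functional maps (step d).
    """
    us_signals = process(acquire_us())                            # steps a + c
    oa_signals = [process(acquire_oa(wl)) for wl in wavelengths]  # steps b + c
    anatomy = reconstruct(us_signals)
    oa_images = [reconstruct(s) for s in oa_signals]
    return overlay(anatomy, oa_images)                            # step d

# Step e: call acquire_frame repeatedly at a video frame rate, e.g. roughly
# ten frames per second (an assumed rate, chosen only for illustration).
```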
[0031] In an embodiment, three or more optical pulses each having different spectral bands of electromagnetic radiation are sequentially delivered to the tissue so as to generate functional images with improved accuracy that reflect significant molecular chromophores of tissue. In an example, the molecular chromophores of tissue comprise water. The molecular chromophores of tissue may comprise lipids. Four optical pulses each having different spectral bands of electromagnetic radiation may be sequentially delivered to the tissue. In another example, spectral bands of the two optical pulses are selected so that one of them matches the local peak of hemoglobin absorption around 757 nm and the other matches the spectral range around 1064 nm corresponding to the maximum ratio of the optical absorption of oxyhemoglobin to that of hemoglobin.
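For the two-wavelength case, the conversion from wavelength-dependent absorption images to total-hemoglobin and oxygen-saturation maps can be sketched as a 2x2 linear spectral unmixing. The extinction values below are rough placeholders, not data from the disclosure, and would have to be replaced with tabulated hemoglobin spectra.

```python
import numpy as np

# Rough, illustrative molar extinction coefficients [cm^-1 / M]; assumptions only.
EPS = {
    757:  {"Hb": 1600.0, "HbO2": 600.0},
    1064: {"Hb": 350.0,  "HbO2": 1000.0},
}

def unmix(mu_757, mu_1064):
    """Estimate relative [Hb], [HbO2], total hemoglobin and SO2 per pixel
    from absorption-coefficient images at the two wavelengths."""
    E = np.array([[EPS[757]["Hb"],  EPS[757]["HbO2"]],
                  [EPS[1064]["Hb"], EPS[1064]["HbO2"]]])
    mu = np.stack([mu_757.ravel(), mu_1064.ravel()])   # 2 x Npix system E @ c = mu
    hb, hbo2 = np.linalg.solve(E, mu)                  # exact solve for the 2x2 case
    thb = hb + hbo2
    so2 = np.where(thb > 0, hbo2 / np.maximum(thb, 1e-12), 0.0)
    return thb.reshape(mu_757.shape), so2.reshape(mu_757.shape)
```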
[0032] In another embodiment, the method further comprises a step of indicating tumor differentiation based at least in part on absorption coefficients measured in the tumor at first and second wavelengths. In an example, the first wavelength comprises 757 nanometers. In an example, the second wavelength comprises 1064 nanometers.
[0033] In a further embodiment, the step of tumor differentiation comprises displaying either: a. a relatively smooth shape of a tumor, or of tissue surrounding a tumor, superimposed with a relatively low elevation in concentration of the total hemoglobin and normal blood oxygen saturation, to indicate a benign tumor; or b. a rough shape of a tumor, or of tissue surrounding the tumor, superimposed with a high elevation in the total hemoglobin and low blood oxygen saturation, to indicate a malignant tumor.
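The qualitative display criteria of this embodiment can be expressed as a small decision rule; the numeric thresholds below are placeholders chosen only to make the sketch runnable and are not taken from the disclosure.

```python
def indicate_differentiation(shape_roughness, thb_elevation, so2):
    """Map the qualitative criteria of this paragraph onto an indication string.

    shape_roughness : assumed normalized roughness score of the lesion boundary
    thb_elevation   : assumed total-hemoglobin elevation relative to background
    so2             : mean blood oxygen saturation within the lesion
    """
    rough = shape_roughness > 0.5   # placeholder threshold
    high_thb = thb_elevation > 2.0  # placeholder threshold
    low_so2 = so2 < 0.7             # placeholder threshold
    if rough and high_thb and low_so2:
        return "suspicious for malignancy"
    if (not rough) and (not high_thb) and (not low_so2):
        return "consistent with a benign lump"
    return "indeterminate"
```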
[0034] In one embodiment, the method further comprises a step of renormalizing an image display palette by reducing relative brightness of image pixels corresponding to the tissue’s surface, thereby amplifying relative brightness of pixels corresponding to larger depths in tissue, so as to make objects located at larger depths visible with greater contrast.
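One plausible way to implement the palette renormalization described here is to divide each depth row of the image by a per-depth brightness estimate; the `strength` blending control in this sketch is an assumed convenience, not part of the disclosure.

```python
import numpy as np

def depth_normalize(image, depth_axis=0, strength=1.0):
    """Renormalize display brightness so deep structures regain contrast.

    Divides each depth row by its mean brightness, which suppresses the bright
    surface rows and amplifies deeper rows, then rescales to the original
    dynamic range. `strength` in [0, 1] blends original and normalized palettes.
    """
    profile = np.abs(image).mean(axis=1 - depth_axis, keepdims=True) + 1e-12
    normalized = image / profile
    normalized *= np.abs(image).max() / (np.abs(normalized).max() + 1e-12)
    return (1 - strength) * image + strength * normalized
```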
[0035] In another embodiment, the step of processing detected ultrasonic signals to revert signal alterations comprises deconvolution of a hardware transfer function to obtain the intrinsic optoacoustic amplitude and profile of the detected ultrasonic signals and the distribution of an optical absorption coefficient in the body.
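The transfer-function deconvolution mentioned here could, for example, be implemented as a frequency-domain Wiener filter applied to each detected A-line; the regularization constant is an assumed tuning parameter, and the disclosure does not mandate this particular filter.

```python
import numpy as np

def wiener_deconvolve(signal, impulse_response, noise_to_signal=1e-2):
    """Deconvolve a measured A-line with the hardware impulse response.

    signal           : 1-D detected optoacoustic signal
    impulse_response : measured system impulse response (same sampling rate)
    noise_to_signal  : assumed regularization level of the Wiener filter
    """
    n = len(signal)
    H = np.fft.rfft(impulse_response, n)
    S = np.fft.rfft(signal, n)
    W = np.conj(H) / (np.abs(H) ** 2 + noise_to_signal)  # regularized inverse filter
    return np.fft.irfft(S * W, n)
```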
[0036] In a further embodiment, the method further comprises a step of using optoacoustic and ultrasonic contrast agents to enhance visualization of tissue morphology and contrast of functional information within a portion of the body being imaged.
[0037] In a certain embodiment, the method further comprises a step of using optoacoustic and ultrasonic contrast agents to enhance characterization of distribution of predetermined types of molecules, cells or tissues in the body.
BRIEF DESCRIPTION OF THE DRAWINGS
[0038] In order that the invention may be more clearly ascertained, embodiments will now be described, by way of example, with reference to the accompanying drawings. The disclosed embodiments are illustrated by way of example and not limitation in the figures of the accompanying drawings, in which like references indicate similar elements.
[0039] FIG. 1A illustrates an embodiment of an optoacoustic probe with illumination of tissue through skin by a scattered light beam formed in tissue by merging two optical beams.
[0040] FIG. 1B illustrates how laser illumination light and an acoustic signal from an optoacoustic probe can be scattered from the skin towards an acoustic lens of a probe.
9380808 1 (GHMatters) P96966.AU 7 2012332233 16 Aug 2017 [0041] FIGS. 2A and 2B illustrate optoacoustic signals showing the impact of lateral ultrasonic waves induced by laser pulses in skin using optical beams on each side of an ultrasound transducer array, and using detection by transducers tilted at large angle relative to the plane of images generated therefrom.
[0042] FIG. 3 illustrates an embodiment of manifestation of image artifacts associated with the edge effect of optical illumination beam having abrupt changes of the optical fluence.
[0043] FIGS. 4A-4C illustrate embodiments wherein optical illumination of tissue is accomplished using a hand-held optoacoustic probe that delivers optical energy from either under the optoacoustic probe or on the side of the probe at different distances.
[0044] FIGS. 5A and 5B illustrate two embodiments of a hand-held optoacoustic ultrasonic probe protected from optical illumination of the acoustic lens.
[0045] FIG. 6 illustrates optoacoustic images using a probe with an acoustic lens that is not totally optically reflective and with a probe having an optically reflective layer of gold, which removes lens related image artifacts.
[0046] FIG. 7A illustrates an embodiment of an optical beam with sharp edges that may produce edge effects of acoustic waves and related artifacts and an optical beam with smooth edges producing reduced edge-related artifacts.
[0047] FIGS. 7B and 7C illustrate designs of an output fiber bundle with multiple subbundles shaped to provide even illumination of the image plane and reduce edge related optoacoustic artifacts.
[0048] FIG. 8 illustrates the effect of optical illumination for two probes where two fiber bundles on each side of the respective probes are oriented to illuminate skin directly under the probe.
[0049] FIG. 9A illustrates embodiments of ultrasonic probes having flat, concave or convex arc shapes.
[0050] FIG. 9B shows a hand-held optoacoustic probe having a concave arc shape.
[0051] FIG. 9C illustrates details of a hand-held optoacoustic probe having a concave arc shape.
[0052] FIG. 9D shows an optoacoustic image of three spherical objects and demonstrates that within the field of view of the arc spatial (and especially lateral) resolution is excellent even for a large object.
[0053] FIG. 9E illustrates an alternate embodiment of an optoacoustic/ultrasonic handheld probe design.
[0054] FIGS. 10A-10C show examples of the impulse response of an ultrasonic transducer with a relatively narrow ultrasonic frequency band of sensitivity, the impulse response of an ultrawide-band ultrasonic transducer, and the ultrasonic spectra of transducer sensitivity as a function of frequency for ultrawide-band and narrow-band resonant transducers.
[0055] FIGS. 11A-11B provide an illustrative example of the deconvolution of the impulse response of transducers from the detected optoacoustic signals, where deconvolution restores the original, unaltered, N-shaped pressure signals.
[0056] FIGS. 12A-12C provide an illustrative example of wavelet filtered N-shaped optoacoustic signals restored to their original rectangular pressure profile by summation of all scales corresponding to frequency ranges from low to high for five scales, seven scales and nine scales.
[0057] FIG. 13 provides an illustrative diagram of radial backprojection where each transducer element aperture is weighted and normalized for the total aperture of the transducer array.
[0058] FIGS. 14A and 14B provide an illustrative example of optoacoustic tomographic images of an imaging slice through tissue with a small artery, larger vein and a rectangular grid allowing estimation of system performance in visualization of microvessels.
[0059] FIGS. 15A and 15B provide an illustrative example of optoacoustic tomographic images of a point spread function as visualized with a flat linear probe using a backpropagation algorithm and an aperture normalized backprojection algorithm.
[0060] FIGS. 16A and 16B provide an illustrative example of optoacoustic images of a phantom with hairs embedded at different depths where the first image was created using an embodiment of a standard palette and the second image was created using an embodiment of a depth-normalized palette.
[0061] FIGS. 17A and 17B provide an illustrative example of optoacoustic images of a phantom of a spherical simulated tumor obtained with a flat linear probe.
[0062] FIG. 18 shows a diagram illustrating tumor differentiation based on absorption coefficients at two wavelengths, 757 nm and 1064 nm, which match the local maximum of hemoglobin absorption, as in totally hypoxic blood (757 nm), and the minimum of the ratio of absorption by hypoxic hemoglobin to absorption by oxyhemoglobin, as in normally oxygenated blood (1064 nm).
[0063] FIG. 19 illustrates tumor differentiation based on absorption coefficients at two wavelengths in a phantom simulating benign (box) and malignant (sphere) tumors.
[0064] FIG. 20A shows an optoacoustic image of two intersecting tubes filled with blood having different levels of blood [SO2].
[0065] FIG. 20B shows a photograph of an experimental setup that includes artificial blood vessels placed in a milk solution and imaged using an arc-shaped optoacoustic probe.
[0066] FIG. 20C shows coregistered 2D cross-sectional anatomical and functional images of blood vessel tubes showing six image panels with different anatomical and functional images.
[0067] FIGS. 21A and 21B show optoacoustic signal amplitude as a function of blood oxygen saturation (with constant hematocrit) under laser illumination at a wavelength of 1064 nm in FIG. 21A and at 757 nm in FIG. 21B. These plots illustrate that blood oxygen saturation can be monitored with optoacoustic imaging.
[0068] FIG. 22 illustrates optical absorption spectra of the main tissue chromophores absorbing optical energy in the near-infrared range: hemoglobin, oxyhemoglobin and water.
[0069] FIGS. 23A and 23B illustrate coregistered functional and anatomical imaging of breast tumors in phantoms accurately replicating optical and acoustic properties of an average breast with tumors.
[0070] FIGS. 24A and 24B illustrate coregistered functional and anatomical imaging of breast tumors.
DETAILED DESCRIPTION
[0071] The following description and drawings are illustrative and are not to be construed as limiting. Numerous specific details are described to provide a thorough understanding. However, in certain instances, well-known or conventional details are not described in order to avoid obscuring the description. References to "one embodiment" or "an embodiment" in the present disclosure are not necessarily references to the same embodiment; such references mean at least one.
[0072] Reference in this specification to “an embodiment” or “the embodiment” means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least an embodiment of the disclosure. The appearances of the phrase “in an embodiment” in various places in the specification are not necessarily all
referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. Moreover, various features are described which may be exhibited by some embodiments and not by others. Similarly, various requirements are described which may be requirements for some embodiments but not other embodiments.
System Overview
[0073] In at least some embodiments, the present disclosure is directed to a dual-modality ultrasonic/optoacoustic system for medical diagnostics that uses a hand-held probe for scanning along the skin surface of an organ and provides two types of two-dimensional maps into the depth of tissue, anatomical (morphological) and functional (blood hemoglobin index and blood oxygenation index). In an embodiment, these two maps are spatially coregistered by using the same array of ultrasonic transducers and temporally coregistered by acquiring the two types of images in real time, faster than any physiological changes can occur in the tissue of diagnostic interest. The blood hemoglobin index represents blood hemoglobin concentration changes in the areas of diagnostic interest relative to the background blood concentration. The blood oxygenation index represents blood oxygenation changes in the areas of diagnostic interest relative to the background level of blood oxygenation. These coregistered maps can be used to noninvasively differentiate malignant tumors from benign lumps and cysts.
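As an illustrative sketch, the two indices defined above could be computed relative to a background region as follows; the ratio and difference forms, and the region masks, are assumptions chosen for the example rather than definitions given in the disclosure.

```python
import numpy as np

def hemoglobin_and_oxygenation_indices(thb, so2, roi_mask, background_mask):
    """Express the two functional parameters relative to background tissue.

    thb, so2        : reconstructed total-hemoglobin and oxygen-saturation maps
    roi_mask        : boolean mask of the area of diagnostic interest
    background_mask : boolean mask of surrounding reference tissue
    """
    # Blood hemoglobin index: lesion hemoglobin relative to background (assumed ratio form).
    bhi = thb[roi_mask].mean() / (thb[background_mask].mean() + 1e-12)
    # Blood oxygenation index: lesion saturation change vs. background (assumed difference form).
    boi = so2[roi_mask].mean() - so2[background_mask].mean()
    return bhi, boi
```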
[0074] In an embodiment, the dual-modality ultrasonic/optoacoustic system of the present disclosure provides two-dimensional imaging of a body utilizing delivery of optical energy and acoustic detection of the resulting transient pressure waves using interchangeable hand-held probes, one of which is flat and used to perform a translational scan through at least a flat portion of the body under examination, and the second of which is curved, shaped as a concave arc, to perform a translational scan through at least a cylindrical or curved portion of the body under examination; both scans contribute to a more complete understanding of normal or pathological functions in the body.
[0075] In an embodiment, at least a portion of the body under examination contains molecules of blood constituents, such as hemoglobin and oxyhemoglobin, responsible for body functions, or receptors in cells responsible for cell functioning, water, lipids or other constituents.
[0076] In an embodiment, optical energy produced using at least one laser beam is used for body illumination with at least one wavelength of light. In an embodiment, the optical energy is pulsed, with the pulse duration shorter than the time of ultrasound propagation through the distance in the body equal to the desirable spatial resolution. In an embodiment,
the optical energy is within the spectral range from 532 nm to 1064 nm. In an embodiment, the optical energy is replaced with other electromagnetic energy with a wavelength from 1 nm to 1 m.
[0077] In an embodiment, electronic signals produced by the ultrasonic transducers are amplified using low-noise, wide-band electronic amplifiers with high input impedance. In an embodiment, analog electronic signals are digitized by a multi-channel analog-to-digital converter and further processed utilizing a field-programmable gate array. In an embodiment, the ultrasonic transducers are ultrawide-band transducers that detect ultrasonic signals with no or minimal reverberations. In an embodiment, the system is integrated with an ultrasound imaging system used to enhance visualization of acoustic boundaries in the body and parts of the body with different density and/or speed of sound.
[0078] In an embodiment, quantitative measurements of concentrations of target molecules, cells or tissues are made through characterization of optical energy propagation and absorption, combined with processing of digital electronic signals by deconvolution of the hardware transfer function, in order to obtain the intrinsic optoacoustic amplitude and profile of such signals and the distribution of the optical absorption coefficient in the body.
[0079] In an embodiment, an optoacoustic contrast agent is used to visualize a portion of the body of interest or to characterize the distribution of certain molecules, cells or tissues in the body.
[0080] In an embodiment, the system comprises at least a laser, a light delivery system, an optoacoustic probe, an electronic system, a computer and an image display.
Laser
[0081] In an embodiment, the laser is capable of emitting short, nanosecond pulses of near infrared light at two (or more) different toggling wavelengths, i.e. two different spectral bands. In an embodiment, one of the wavelengths is preferentially absorbed by hemoglobin of blood and the other is preferentially absorbed by oxyhemoglobin of blood. In an embodiment, illumination of the organ under examination with the first laser pulse at one wavelength (spectral band) and detection of the first optoacoustic signal profile resulting from the first illumination, followed by illumination with the second laser pulse at the second wavelength band and detection of the second optoacoustic signal profile, can provide data that can be used for reconstruction of two coregistered tomographic images, which in turn can be used for generation of functional maps of the areas of diagnostic interest based on (i) the blood hemoglobin index and (ii) the blood oxygenation index.
Light Delivery System
[0082] In an embodiment, the light delivery system comprises bundles of optical fibers. In an embodiment, the input of the optical fiber bundle is circular to match the incident laser beam, while the output of the fiber bundle is rectangular to match the size and shape of the ultrasonic transducer array. In an embodiment, each fiber has a small diameter (e.g., down to 50 microns) to provide excellent flexibility of the bundles. In an embodiment, the input tip of the fiber bundle is fused to shape the bundle into a hexagon and to eliminate spaces between the fibers in the bundle, thereby providing up to 20% better transmission of the laser energy. In an embodiment, the output tip of the fiber bundle is fully randomized such that fibers that appear close to each other at the input will appear far from each other at the output or even in different branches of the bifurcated fiber bundle.
Optoacoustic Probe
[0083] The probe is designed to provide high contrast and resolution of both optoacoustic and ultrasonic images. In an embodiment, the probe is a hand-held probe with an array of ultrasonic/optoacoustic transducers, which can be designed to be single dimensional, 1.5 dimensional or two-dimensional. In an embodiment, the transducers detect acoustic waves within an ultra-wide band of ultrasonic frequencies and the ultra-wide band is shaped to match the spectrum of optoacoustic signals emitted by tissue of diagnostic interest. In an embodiment, the transducers are also designed to emit acoustic waves as short pulses of ultrasound with short ring-down time and minimal reverberations of gradually decreasing magnitude.
[0084] To achieve such a design, the transducer material can be chosen from, for example, piezoelectric ceramics (such as PZT, PMN-PT, and PZNT), piezoelectric single crystals (such as PZT, PMN-PT, and PZNT), piezoelectric polymers (such as PVDF and PVDF copolymers), composite polymer-ceramic and polymer-crystal piezoelectric materials, and capacitive micromachined ultrasonic transducers (CMUT). In an embodiment, the thickness of the transducer elements (which determines the central frequency) and the materials of the backing layer and the front-surface matching layer of the transducers are optimized.
[0085] In various embodiments, the shape of the ultrasound transducer array may be either flat or concave arc. A flat design is suited to scanning of the surface of an organ under examination that has radius of curvature much larger than the size of the probe, such as a human body. A concave arc-shaped design provides the largest aperture for the optoacoustic signal detection with minimal physical dimensions. The large aperture, in turn, provides for improved lateral resolution within the angle of the field of view formed by lines connecting the arc's focal point with each edge transducer of the array. The arc-shaped probe is often the
most effective for scanning body surfaces that are curved with a radius approximately matching that of the probe (such as the average-sized breast, neck, arms and legs).
[0086] FIG. 1A illustrates an embodiment of an optoacoustic probe that provides illumination of tissue (TS) through skin (SK) by the scattered light (SL) beam formed in tissue by merging two optical beams (OB) that emerge from fiber bundles (FB), expand and pass through light diffusers (LD), and then pass through optical windows (OW). Acoustic waves (AW) generated in blood vessels or tumors (BV or TM) by the scattered light (SL) in tissue propagate through the acoustic lens (AL) to the transducers (TR) and are converted into electrical signals that are transmitted by electrical cables (EC) through the backing material (BM) to the electronic amplifiers.
[0087] In an embodiment, the design of the optical fiber bundle is as follows. The input of the fiber bundle is circular with fused fiber tips to avoid loss of light through spaces between the fibers. The fiber diameter may be approximately 200 microns for good flexibility, and a fiber diameter of 100 microns or even 50 microns may be desirable in a particular application. This fiber bundle is Y-split into two half-bundles and fully randomized, so that substantially any two neighboring fibers from the input appear in different half-bundles. At least a majority of the neighboring fibers should be randomized in this regard. Each half-bundle is preferably split into multiple sub-bundles, and each sub-bundle is placed in its slot/niche to form fiber bundle "paddles". The two paddles are placed on each side of the ultrasonic transducer (TR) array assembly. As is discussed below with reference to FIGS. 7B and 7C, the output shape of each fiber bundle paddle may be rectangular for the width of the field of view, typically 40 mm, and have triangular shaped ends. Such a triangular shape allows the output beam to have smooth edges after passing through the light diffuser (LD), FIG. 1A. Finally, the optical beams from the fiber bundle paddles exit the probe into the skin (SK) through optical windows (OW) that comprise thin anti-reflection-coated glass plates or anti-reflection-coated polymer or plastic plates with acoustic impedance matching that of tissues to be imaged.
[0088] There are a number of objectives for the present optoacoustic probe design: (i) substantially no light should propagate either through the acoustic lens (AL) or through the optical block acoustic damper (OBAD) on the sides of the probe, (ii) substantially no acoustic waves should be generated in the acoustic lens or the optical block acoustic damper materials through absorption of light; acoustic waves in a wide range of ultrasonic frequencies from 0.1 MHz to 15 MHz should be able to pass through (AL) with no attenuation, and no acoustic waves should be able to pass through OBAD; (iii) the optical beams (OB) exiting through
optical windows (OW) should have smooth edges of the optical fluence, and these optical beams should enter the skin as close to each other as necessary to merge due to optical scattering within the skin and enter the underlying tissue under the array of transducers, providing maximum fluence in the image plane.
[0089] In an embodiment, the light delivery system directs light underneath the transducer elements, not through the array of transducer elements. In an embodiment, the design of the optoacoustic probe is based on an array of ultrasonic transducers with fiber optic delivery systems on each side of the ultrasonic array, positioned as close to the transducers as possible and with dimensions in the elevation axis of the transducer as small as possible, considering the need to focus ultrasonic beams to the depth of the most probable target. In an embodiment, the fiber optic delivery system is designed to allow penetration of the optical energy of the near infrared light into the organ being imaged, such as a breast, and minimum opto-thermo-mechanical interaction of the light beam with skin.
[0090] Another alternative design of the light delivery system delivers light to a mirror or prism(s) placed underneath the ultrasonic transducers in order to reflect the light orthogonally to the skin surface of an organ being imaged. In such embodiments, a standoff can be placed between the transducer elements and the skin/tissue. These alternative embodiments may be combined within the scope of the invention.
Detailed Description of Aspects of System Components
Optical Illumination and Probe Design
[0091] An acoustic lens is typically placed on transducers within an optoacoustic probe for purposes of focusing ultrasonic beams. While a probe could be provided without an acoustic lens, if there is no lens then the ultrasonic transducers may be directly exposed to light and absorb such light, which can result in very large artifact ultrasonic signals, especially where such light is pulsed. Optical illumination of the lens on an ultrasonic probe causes very strong transient acoustic waves that result in image artifacts. Up to 50% of near infrared light can be diffusely scattered by skin, depending on skin color. Mismatch of acoustic impedance between the lens and the transducer elements can cause reverberations with a long ring-down time. Therefore, an embodiment of a probe design includes a white, strongly scattering, opaque lens. If such a lens is not needed due to the curved shape of each transducer element, then a white, strongly scattering front matching layer should be employed to protect the transducer elements from near-infrared light.
[0092] FIG. 1B illustrates how laser illumination light 110 and 120 from an optoacoustic probe can be scattered 130 from the skin 140 towards an acoustic lens 150 of a probe.
[0093] Furthermore, (laser) optical pulses can have a direct impact on the ultrasonic transducers through acoustic waves induced by strong interaction of the optical pulses with the skin of the organ being imaged, which laterally traverse along the skin surface in a direction orthogonal to the image plane. When detected by the array of transducers, spatial distributions of these acoustic waves are projected onto the optoacoustic image at a depth equal to the lateral distance between the array of transducers and the optical beams on the skin surface, creating artifacts. Furthermore, acoustic waves generated in skin through reverberation of the acoustic lens and the housing of the probe can further affect the quality of imaging.
[0094] FIGS. 2A and 2B illustrate exemplary optoacoustic signals showing the impact of lateral ultrasonic waves induced by laser pulses in skin using optical beams on each side of an ultrasound transducer array. The signals shown are generated by transducers in the direction almost orthogonal to the plane of images generated therefrom. Such transducers may receive signals at a large oblique angle (up to 90 degrees) relative to the plane of images generated therefrom, which is undesirable. Therefore, the design of the transducer array includes means to reject signals arriving from outside the image plane. Such means include, but are not limited to, a concave arc shape of the transducer elements and acoustic lens, and delivery of the optical beam underneath the transducers. The detected optoacoustic signals 210 in FIG. 2A were generated using an effective acoustic coupling agent, in this case water. The signals 220 in FIG. 2B were generated in the absence of such an acoustic coupling agent, i.e., using only an air space to couple the acoustic signals to the transducer array.
[0095] Furthermore, the finite dimensions of the optical beam can affect the acoustic waves generated in response to impingement of the optical beam on tissue. Such acoustic waves can be generated at the sharp edges of the optical beam, propagate towards the array of transducers and result in artifacts. Since a system that utilizes a flat linear array of ultrasonic transducers is configured such that the first and the last transducers in the array detect these waves first and the central transducers detect them last, this "edge effect" results in V-shaped artifacts on a sinogram of optoacoustic signals and low-frequency bulk artifacts on optoacoustic images.
[0096] FIG. 3 illustrates an example of manifestation of v-shaped artifacts 310 on a sinogram 300 of optoacoustic signals and associated artifacts 320 on optoacoustic images. Since these acoustic waves are associated with the edge effect of the optical illumination
beams having abrupt changes of the optical fluence, in an embodiment, one can see V-shaped bright signals on the sinogram and an associated series of artifact waves on the optoacoustic image.
[0097] Furthermore, the illumination geometry of optical beams projected by an optoacoustic probe can affect image quality. Where the optical beams of an optoacoustic probe are placed too far apart, this can result in a gradual transition from dark-field illumination (two separate optical beams, one on each side of the probe, resulting in the absence of direct light under the probe in the image plane) into bright-field illumination (one beam under the probe going into the depth of tissue along the image plane). This transition creates a problem in the image brightness map, making the map not quantitatively accurate, and causes artifacts at the depth equal to the initial width between the separate optical illumination beams on each side of the probe.
[0098] FIGS. 4A-4C illustrate an embodiment wherein optical illumination of tissue is accomplished using a hand-held optoacoustic probe 410, 420, 430 that delivers optical energy from either under the optoacoustic probe or on the side of the probe at different distances. In the embodiment of FIG. 4A, when the optical beams are delivered under the ultrasonic probe, the distribution of the optical energy in the image plane has a smooth gradient with a maximum at the skin surface. This optical distribution is beneficial for high contrast of optoacoustic images. In the embodiment of FIG. 4B, when the optical beams are delivered close to a thin optoacoustic probe, the two beams can merge due to optical scattering within the skin, so that the distribution of the optical energy in tissue under the skin can be made similar to the embodiment of FIG. 4A. In the embodiment of FIG. 4C, when the optical beams are separated by a large distance, they merge only at significant depth within tissue, creating the optical distribution in the image plane with a dark zone (no light) in a subsurface layer of the tissue and a bright zone in the depth of the tissue, which is detrimental to the contrast of optoacoustic images, especially considering projection of brightly illuminated areas of skin onto the optoacoustic image plane at a depth equal to the separation distance of the two beams.
[0099] Thus, in the embodiments illustrated in FIGS. 4A-4C, the image brightness map 412, 422 and 432 of the tissue being scanned is optimized where the illumination of the skin is directly under the probe 410. As the distance between the center of the transducers and the center of the optical beams increases, as shown at 420 and 430, the image brightness map 422 and 432 of the tissue being scanned becomes progressively more uneven.
[00100] Lastly, the reflection from boundaries of tissue structures (such as tumors, vessels
or tissue layers) of laser-induced ultrasound waves launched into the tissue after being generated in skin, can also lead to image artifacts represented by lines, curves and overall noise.
[00101] In an embodiment, the acoustic lens of the probe is designed such that the lens reflects and scatters, but does not absorb, light from the illumination components, yet it is optically opaque. In various embodiments, such a lens can be made either using a strongly optically scattering material such as silicone rubber filled with titanium dioxide or barium sulfate powder, or using a thin, highly reflective metallic layer such as aluminum or gold, or a combination of a white opaque lens material and a metal layer. In an embodiment, to avoid peel-off of the thin metallic layer from the front surface of the acoustic lens, in the case of a combination of a diffusely scattering lens material and a thin reflective layer (foil), the metallic reflective layer can be placed between two layers of diffusely scattering material. As it is difficult to make a material with absolutely zero optical absorption, and such absorption may generate ultrasound in thermoelastic materials, the lens material can be made from thermoplastic materials having minimal thermal expansion, which produce minimal or no ultrasound in response to the absorbed optical energy.
[00102] FIGS. 5A and 5B illustrate two embodiments, respectively, of hand-held optoacoustic ultrasonic probes 510 and 520 that are protected from optical illumination of the acoustic lens. In FIG. 5A, a totally reflective opaque white lens is utilized, and in FIG. 5B a partially reflective white lens is utilized, with light reflection capability of the lens enhanced by a gold layer or coating.
[00103] FIG. 6 illustrates optoacoustic images using a probe with a non-reflective acoustic lens 610 and a probe with a reflective layer of gold 620. The probe utilizing a reflective layer of gold 620 produces an image with reduced artifacts 612 and 614.
[00104] In an embodiment, the probe housing serves as hypo-echoic encapsulation of the probe, which means that the probe housing is made from materials that (i) do not absorb laser light (more specifically near-infrared light), but if a small absorption is unavoidable, the materials having low thermal expansion do not emit ultrasound after absorption of the laser light, (ii) strongly attenuate and dampen ultrasonic waves and do not reverberate. The transducer assembly inside the probe housing is also made of the hypo-echoic material. Alternatively, a layer of said hypo-echoic material is placed between the transducer assembly and the fiberoptic assembly to avoid generation of any ultrasound upon interaction of light with transducer assembly. In various embodiments, such materials can be chosen, for example, from white color porous and anechoic heterogeneous composites for baffles, foams,
polymers, rubbers and plastics (such as CR-600 casting resin available from Micro-Mark of Berkeley Heights, NJ, or AM-37 available from Syntech Materials of Springfield, VA), and others. In an embodiment, any such materials are electrically non-conducting insulators to, inter alia, protect the probe from external electromagnetic emissions.
[00105] In an embodiment, the optical illumination subsystem is configured to deliver optical beams with smooth intensity edges. In an embodiment, the width of the optical beams is equal to that of the array of ultrasonic transducers within the optoacoustic probe (for example, about 5 mm). This is achieved by designing the bundle of optical fibers to have a gradually decreasing density of fibers at the edges. This design enables one to deliver laser illumination to the skin of the organ under examination so that the beam does not generate sharp edge-related acoustic waves, and such laser-induced acoustic waves do not produce V-shaped artifacts in the sinogram of optoacoustic images.
[00106] FIG. 7A illustrates an embodiment of an optical beam with sharp edges 710 that may produce edge effects of acoustic waves and related artifacts and an optical beam 720 with smooth edges producing reduced edge-related artifacts. FIGS. 7B and 7C illustrate that an optical beam with smooth edges of fluence can be produced using a fiberoptic bundle design having multiple sub-bundles and a triangular shape in each end of the fiber bundle assembly.
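The benefit of smoothing the beam edges can be seen numerically: the strength of the edge-generated acoustic source roughly tracks the spatial gradient of fluence at the beam boundary. The profile width and taper length in the short sketch below are illustrative assumptions, not dimensions taken from the disclosure.

```python
import numpy as np

x = np.linspace(-30, 30, 1201)                   # position across the beam in mm (assumed grid)
sharp = ((x > -20) & (x < 20)).astype(float)     # top-hat beam with abrupt edges
taper = np.clip((20 - np.abs(x)) / 8.0, 0, 1)    # same width with an ~8 mm linear taper (assumed)

# Edge-wave strength roughly follows the spatial derivative of the fluence profile.
edge_sharp = np.max(np.abs(np.gradient(sharp, x)))
edge_taper = np.max(np.abs(np.gradient(taper, x)))
print(f"peak fluence gradient, sharp edge:   {edge_sharp:.2f} per mm")
print(f"peak fluence gradient, tapered edge: {edge_taper:.2f} per mm")
```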
[00107] In an embodiment, the fiber bundle is positioned at a distance from skin that is sufficient for the optical beam to expand to a desirable width. Where the dimensions of the probe are compact, the fibers used in the fiber bundle can be selected to have a higher numerical aperture (e.g., > 0.22). In an embodiment, in order to achieve better coupling of the optical beam into the skin, the beam is delivered through an optical window. In such an embodiment, the optical window touches the skin, making its surface flat for better light penetration, and simultaneously removing any excess coupling gel from the skin surface being imaged. In an embodiment, the fiber bundle and the optical window are incorporated into the probe housing, so that the air gap between the fiber bundle and the window is protected.
[00108] In an embodiment, the optical window is designed to allow minimal interactions of both the optical beam and the laser-induced acoustic waves with such window. In an embodiment, the window is very thin and made of optically transparent material with an antireflection (AR) optical coating. In an embodiment, such material has anechoic acoustic properties. These anechoic acoustic properties, and the fact that the illuminated skin is being depressed during optoacoustic scanning, result in dampening of ultrasonic waves laterally propagating from the laser-illuminated skin surface to the transducer array, thereby reducing
associated artifacts.
[00109] In an embodiment, the probe is designed such that the optical beams are very close to the transducer elements on each side of the ultrasonic probe, which is made as thin as technologically possible. In an embodiment, the thickness of the probe is so small (e.g., 5 mm) that the light beams delivered to skin at this distance, d, from the probe center will merge into one beam within the thickness of the skin (about z=5 mm), and the tissue of the organ under examination receives one beam underneath the transducer elements. This is called bright field illumination. In an embodiment, the optoacoustic probe is designed such that the optical light beam is delivered to the skin directly underneath the transducer elements.
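As a purely geometric illustration (ignoring optical scattering, which in practice merges the beams even sooner), the depth at which two inwardly tilted beam centers cross can be estimated as follows; the tilt angle and the function name are assumptions chosen for the example, not values taken from this disclosure.

```python
import math

def merge_depth_mm(d_mm: float, tilt_deg: float) -> float:
    """Depth at which two beam centers, launched a distance d_mm on either side of
    the probe centerline and tilted toward it by tilt_deg, intersect.
    Purely geometric sketch; scattering in tissue merges the beams even sooner."""
    return d_mm / math.tan(math.radians(tilt_deg))

# Example: beams 5 mm from the probe center, tilted 45 degrees inward,
# cross at about 5 mm depth -- comparable to the skin thickness mentioned above.
print(round(merge_depth_mm(5.0, 45.0), 1))  # -> 5.0
```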
[00110] FIG. 8 illustrates the effect of optical illumination for two probes 810 and 820 where two fiber bundles on each side of the respective probes are oriented to illuminate skin directly under the probe 812 and on either side of the probe 822. Where the skin is illuminated directly under the probe 812, a tumor 814 is clearly discernable and there is no clutter on the image background 816. Where the skin is illuminated on either side of the probe 822, the tumor is not discernable 824 and there are numerous artifacts on the image background 826.
[00111] In an embodiment, the optical beam width is designed to deliver increased light into the slice of tissue being imaged. In an embodiment, the beam is homogeneous, that is, it has a constant fluence across the beam, because a heterogeneous beam generates acoustic sources at its heterogeneities, which in turn produce artifacts in optoacoustic images. The fluence level is defined by the ANSI laser safety standards for laser illumination of skin. The beam width is limited by the capability of the optical scattering in tissue to deliver photons of light into the central slice located underneath the transducer elements (the slice being imaged). In an embodiment, the length of the optical beam is equal to the length of the transducer array. In an embodiment, the optical beam also has smooth edges, that is to say, gradually reduced fluence at the edges, since sharp edges produce strong edge artifacts on optoacoustic images.
[00112] In an embodiment, design features of the optical illumination system and optoacoustic probe of the present disclosure can be summarized in the following Table:
Table 1. Summary of optical illumination and probe design.
System Feature | Advantages
Arc hand-held probe | Higher aperture - lower distortions
Light delivery into the imaging plane | Improves optoacoustic image contrast and decreases artifacts by increasing the ratio of useful information (from the imaging plane) to noise (outside of the imaging plane)
Optical shielding of the probe | Reduces artifacts from direct and scattered light striking the acoustic lens, probe housing, etc.
Acoustic shielding of the probe | Acoustic shielding of the probe's housing reduces artifacts (clutter) from acoustic waves propagating through the probe's housing
Using ultrawide-band transducers for both ultrasound and optoacoustic imaging | Allows the same array to be used for both ultrasonic and optoacoustic imaging

[00113] In various embodiments, the shape of the ultrasonic transducer array for the combined optoacoustic/ultrasonic imaging can be either flat or convex arc-shaped. In an embodiment, the probe shape for optoacoustic imaging is concave arc-shaped. Such a concave shape provides a large aperture with minimal physical dimensions and a wider field of view of an object being imaged, which in turn provides for improved lateral resolution and better reconstruction of the shape of the object being imaged.
[00114] FIGS. 9A-9C illustrate embodiments of optoacoustic/ultrasonic hand-held probes having flat or concave arc shapes 910 (FIG. 9A) and a hand-held transrectal probe with a linear shape 920 (FIG. 9B). FIG. 9C illustrates details of the optoacoustic/ultrasonic handheld probe design with its face showing the ultrasonic transducer assembly, two layers of hypo-echoic light-reflecting and ultrasound-damping material on each side, and two optical windows for delivery of the optical beam.
[00115] FIG. 9C illustrates details of a hand-held optoacoustic probe having a concave arc shape. Electrical cables 930 are provided for bi-directional communication to and from the probe, and fiberoptic bundles 940 are provided for delivering light to the probe. An array of wide-band ultrasonic transducers 950 send and receive acoustic energy. The transducer array 950 is covered by an opaque white cylindrical lens (not shown for clarity purposes) that
extensively scatters and reflects near-infrared light. Optical windows 960 provide optical beam outputs. In the embodiment of FIG. 9C, the ultrasonic transducers within the probe may be designed so as not to be sensitive to lateral acoustic (ultrasonic) waves and to be free of reverberations, especially in the lateral direction. This can be achieved by the selection of the piezoelectric composite material, the shape of the piezoceramic elements in the matrix and the anechoic properties of the matrix. In an embodiment, the ultrasonic transducers are also designed to possess high sensitivity within an ultrawide band of ultrasonic frequencies. This in turn results in minimal reverberations that cause artifacts on optoacoustic/ultrasonic images.
[00116] FIG. 9D shows an optoacoustic image that illustrates advantages of the concave-arc shaped hand-held probe in terms of resolution in optoacoustic images. As presented in this embodiment, the shape and sharp edges of a large sphere are well depicted in cases where the object is within the field of view of the probe aperture. Outside the probe aperture, resolution and accuracy of shape reproduction decrease, but remain better than those of flat linear probes of similar width.
[00117] FIG. 9E illustrates an alternate embodiment of an optoacoustic/ultrasonic handheld probe design that is capable of two-dimensional imaging within the plane going parallel to the skin surface at various selected depths, and three-dimensional images as well.
[00118] In an embodiment, a hand-held probe that is scanned along the skin surface producing real-time two-dimensional images of tissues in the body under examination also has a component serving for accurate global 3D positioning of the probe. This design allows the imaging system to remember positions of all tissue slices and to reconstruct three-dimensional images at the end of the scanning procedure.
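A minimal sketch of how tracked two-dimensional frames might be stacked into a volume is given below; it assumes a straight scan along a single axis, and the function and parameter names are illustrative, whereas an actual implementation would use the full pose reported by the 3D positioning component.

```python
import numpy as np

def assemble_volume(frames, y_positions_mm, y_step_mm=0.5):
    """Stack tracked 2D slices (depth x lateral) into a 3D volume.

    frames         : list of 2D numpy arrays, all the same shape
    y_positions_mm : probe position recorded for each frame along the scan direction
    y_step_mm      : voxel spacing along the scan direction

    Minimal sketch assuming a straight scan along one axis; frames that fall
    into the same slot are averaged.
    """
    nz, nx = frames[0].shape
    y0 = min(y_positions_mm)
    ny = int(round((max(y_positions_mm) - y0) / y_step_mm)) + 1
    volume = np.zeros((nz, ny, nx))
    counts = np.zeros(ny)
    for frame, y in zip(frames, y_positions_mm):
        iy = int(round((y - y0) / y_step_mm))
        volume[:, iy, :] += frame
        counts[iy] += 1
    counts[counts == 0] = 1
    return volume / counts[None, :, None]
```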
Electronic Data Acquisition System

[00119] In an embodiment, the present disclosure is directed to an optoacoustic imaging system having an electronic data acquisition system that operates in both optoacoustic and ultrasonic modes and can rapidly switch between such modes. In an embodiment, this is achieved with firmware that controls functions of a Field Programmable Gate Array (FPGA), the main programmable processing device on the electronic data acquisition system. In an embodiment, a reprogrammable FPGA can toggle between optoacoustic and ultrasound operation modes in real-time, thus enabling co-registration of ultrasound and optoacoustic images, which can be used for diagnostic imaging based on functional and anatomical maps. In an embodiment, FPGA functions include controlling, acquiring, and storing optoacoustic and/or ultrasound data, signal processing and transferring data for real-time image reconstruction and
processing. In an embodiment, the FPGA may also be employed in ultrasound beam forming and image reconstruction.
[00120] In an embodiment, the electronic data acquisition system design utilizes one or more multi-core graphics processing units (GPUs) for image reconstruction and processing. In the ultrasound mode, in an embodiment, the FPGA controls the ultrasound transmission and performs both ultrasound and optoacoustic data acquisition on a multi-channel board. In order to enhance operation of the memory of the FPGA, an external memory buffer can be used. In an embodiment, the FPGA allows rapid reprogramming from ultrasonic data acquisition, with a signal/frame repetition rate of about 2 to 20 kHz, to optoacoustic data acquisition, with a signal/frame repetition rate of about 10-20 Hz, and also configures the structure of gates and the internal memory structure and size to allow real-time switching between ultrasound emission and detection, laser synchronization, and system controls. In an embodiment, multiple FPGAs can be used to enhance the system performance. In an embodiment, in the ultrasound and optoacoustic modes, the FPGA clock can be changed by appropriate time-division multiplexing (TDM). In an embodiment, the design of a multi-channel electronic data acquisition system can be based on modules, with a module typically being from 16 to 128 channels, although 256 channels or more may be appropriate in some applications. In an embodiment, the design of a multi-channel electronic data acquisition system has 64 channels.
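The interleaving of the two acquisition modes can be sketched at a high level as follows; this is not FPGA firmware, and the frame rates, names and scheduling policy are illustrative assumptions consistent with the rates mentioned above.

```python
import itertools

# Minimal sketch of the interleaving concept only -- not firmware.
US_FRAME_RATE_HZ = 10_000     # ultrasound frames, ~2-20 kHz per the text above
OA_FRAME_RATE_HZ = 10         # optoacoustic frames, laser-limited to ~10-20 Hz

def acquisition_schedule(duration_s=1.0):
    """Yield ('US' | 'OA', timestamp) events so that each optoacoustic frame is
    surrounded by ultrasound frames, supporting co-registration in time."""
    us_period = 1.0 / US_FRAME_RATE_HZ
    oa_period = 1.0 / OA_FRAME_RATE_HZ
    next_oa = 0.0
    t = 0.0
    while t < duration_s:
        if t >= next_oa:
            yield ('OA', t)    # switch gates/memory to optoacoustic acquisition
            next_oa += oa_period
        else:
            yield ('US', t)    # ultrasound transmit/receive frame
        t += us_period

events = list(itertools.islice(acquisition_schedule(), 10))
```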
[00121] In order to achieve dual-modality operation of the optoacoustic/ultrasonic system, a separate optoacoustic electronic system could also be combined with a separate ultrasonic electronic system through a single probe. In an embodiment, the probe has a cable with a Y-split to connect the probe to the optoacoustic and ultrasonic electronic systems. In an embodiment, a programmable electronic switch allows one to send the detected signal from the probe (transducer array) either to the optoacoustic electronics (to operate in optoacoustic mode) or to the ultrasonic electronics, and from the ultrasonic electronics to the probe (to operate in ultrasound mode). In an embodiment, a synchronization trigger signal is sent to the optoacoustic and ultrasonic systems sequentially, so that the optoacoustic and ultrasonic images are acquired one after the other.
Processing, Reconstruction and Display of Images
Signal processing

[00122] In various embodiments, a goal of the diagnostic imaging procedure is to display each pixel with a brightness that correctly replicates the originally generated signals in each
voxel of tissue displayed on the image. On the other hand, intrinsic pressure signals generated by optical pulses within tissues may be significantly altered in the course of propagation through tissue and especially in the course of detection and recording by the ultrasonic transducers and the electronics subsystem.
[00123] In an embodiment, detected signals are processed to reverse alterations and restore original signals. In an embodiment, such reversal can be achieved through deconvolution of the impulse response (IR) of the system. In an embodiment, the impulse response can be measured by recording and digitizing a delta-function ultrasonic signal generated by short (nanosecond) laser pulses in a strongly absorbing optical medium with high thermoelastic expansion coefficient.
[00124] One component of the impulse response is the acousto-electrical impulse response, which accounts for the optoacoustic or ultrasonic signal distortions due to the properties of the ultrasonic transducers, cables and analog electronics. A second part of the impulse response is the spatial impulse response, which accounts for the signal distortions associated with the finite dimensions of the ultrasonic transducers. In various embodiments, large transducers can integrate ultrasonic waves incident at an angle, whereas point-source-like transducers can provide a perfect or near-perfect delta-function spatial impulse response.
[00125] In an embodiment, any distortions in acousto-electrical impulse response can be reversed by the impulse response deconvolution from the detected signals. However, possible distortions in the spatial impulse response can be avoided by designing transducers with small dimensions within the image plane. In an embodiment, the dimensions of the transducers are much smaller than the shortest wavelength of the ultrasound that may be detected or emitted by the transducers.
[00126] FIGS. 10A-10C show examples of the impulse response of an ultrasonic transducer with a relatively narrow band of sensitivity 1010, the impulse response of an ultrawide-band ultrasonic transducer 1020 and ultrasonic spectra of the transducer sensitivity as a function of frequency for ultrawide-band and narrow band resonant transducers 1030.
[00127] In an embodiment, the first step in processing an optoacoustic signal in an imaging system that produces two-dimensional optoacoustic images is deconvolution of the acousto-electrical impulse response.
[00128] FIGS. 11A and 11B provide an illustrative example of the deconvolution of impulse response of transducers from the detected optoacoustic signals 1110, where deconvolution restores the original, unaltered, N-shaped pressure signals 1120.
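A minimal sketch of such an impulse-response deconvolution, implemented in the frequency domain with simple regularization, is shown below; the regularization constant and the function name are assumptions for the example rather than parameters of the disclosed system.

```python
import numpy as np

def deconvolve_impulse_response(signal, impulse_response, reg=1e-2):
    """Frequency-domain deconvolution of a measured acousto-electrical impulse
    response from a detected optoacoustic signal.

    Minimal sketch with Tikhonov-style regularization; 'reg' and the
    zero-padding policy are illustrative choices.
    """
    n = len(signal) + len(impulse_response) - 1
    S = np.fft.rfft(signal, n)
    H = np.fft.rfft(impulse_response, n)
    # Regularized inverse filter: H* / (|H|^2 + reg * max|H|^2)
    H2 = np.abs(H) ** 2
    inv = np.conj(H) / (H2 + reg * H2.max())
    restored = np.fft.irfft(S * inv, n)
    return restored[:len(signal)]
```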
[00129] In an embodiment, the second step in processing an optoacoustic signal is signal
filtering to remove noise using a signal filter. In an embodiment, the signal filter is based on a wavelet transform that operates simultaneously in the frequency and time domain. In an embodiment, such a wavelet filter is capable of filtering certain frequency components of the signal that belong to noise and appear at a given time, while preserving similar frequency components of the useful signal that appear at a different time. In an embodiment, the frequency spectrum of a wavelet filter replicates the frequency band of a typical N-shaped optoacoustic signal while simultaneously providing smooth window edges which do not cause signal distortions upon convolution.
[00130] In an embodiment, such a wavelet filter is useful in optoacoustic imaging in its capability to restore the original pressure profile generated in tissue prior to pressure propagation. In the course of propagation through tissue, the originally positive pressure signal converts into a bipolar (compression / tension) profile. Therefore, reconstruction of an image of absorbed optical energy (optoacoustic image) requires a transformation that starts with bipolar signals and provides for all-positive values of the optoacoustic image intensities. In an embodiment, a multi-scale wavelet filter, for example, a filter that simultaneously integrates the signal over time and provides summation of a number of frequency bands present in the signal, can convert bipolar pressure signals into a monopolar signal representing thermal energy or the originally generated positive pressure.
[00131] FIGS. 12A-12C provide an illustrative example of wavelet filtered N-shaped optoacoustic signals restored to their original rectangular pressure profile by summation of all scales corresponding to frequency ranges from low to high for five scales 1210, seven scales 1220 and nine scales 1230.
[00132] In various embodiments, wavelet filtering permits enhancement of objects on the image within a certain range of dimensions. An imaging operator (ultrasonic technician or diagnostic radiologist) typically desires to better visualize a tumor with specific dimensions and other objects, such as blood vessels, with their specific dimensions. In an embodiment, the wavelet filter allows the operator to apply a specific selection of wavelet-filter scales that enhances only objects of certain sizes and suppresses objects of other, unimportant sizes. In an embodiment, boundaries can be well visualized for objects of any size, so the high-frequency wavelet scales are beneficial for the image quality and are included in the selection of scales. In an embodiment, for a mathematically correct tomographic reconstruction, a ramp filter can be applied to the signal, which can linearly enhance the contribution of higher frequencies.
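A minimal sketch of scale-selective wavelet filtering of a single A-line is given below, assuming the PyWavelets package; the wavelet family, number of levels and retained scales are illustrative choices, not parameters prescribed by this disclosure.

```python
import numpy as np
import pywt  # PyWavelets, assumed available

def wavelet_band_filter(signal, wavelet='sym4', levels=7, keep=(1, 2, 3, 4, 5)):
    """Keep only selected wavelet scales of an optoacoustic A-line.

    Minimal sketch: decompose into 'levels' scales, zero the detail bands that
    are not in 'keep' (1 = finest), drop the coarse approximation, and
    reconstruct. Assumes the A-line is long enough for the requested levels.
    """
    coeffs = pywt.wavedec(signal, wavelet, level=levels)
    filtered = []
    for i, c in enumerate(coeffs):
        if i == 0:
            filtered.append(np.zeros_like(c))       # discard coarse approximation
        else:
            scale = levels - i + 1                  # coeffs[-1] is the finest scale (1)
            filtered.append(c if scale in keep else np.zeros_like(c))
    return pywt.waverec(filtered, wavelet)[:len(signal)]
```

Selecting fewer or more scales in `keep` corresponds to the operator's choice of object sizes to enhance, as described above.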
Image Reconstruction

[00133] In various embodiments, image reconstruction typically uses radial back-projection of processed and filtered signals onto the image plane. However, due to the limited field of view available from small hand-held probes, only an incomplete data set can be obtained. As a result, the 2D optoacoustic images may include artifacts distorting the shape and brightness of the objects displayed on the images. In an embodiment, aperture-integrated normalized radial back-projection is used to correct some of the reconstruction artifacts that are observed in limited-aperture optoacoustic tomography.

[00134] FIG. 13 provides an illustrative diagram of radial backprojection where each transducer element aperture is weighted and normalized for the total aperture of the transducer array.

[00135] In an embodiment, T_k-T_{k+4}, 1311-1315, are the transducers 1310 in the array; B_{i,j} is the brightness (intensity) of a pixel with coordinates (i,j), 1320; ω_{i,j,k}, 1330, is the angular portion of the optoacoustic wave front emitted by the pixel (i,j) as it is visualized by transducer #k; Ω_{i,j} = Σ_k ω_{i,j,k} (the sum of all ω_{i,j,k}) is the portion of the optoacoustic wave front emitted by pixel (i,j) as it is visualized by the entire transducer array; and S_{i,j,k} is the sample of the optoacoustic signal measured by the k-th transducer and used in reconstruction of the brightness in the pixel (i,j). Various backpropagation algorithms can be used to normalize an optoacoustic image.

[00136] In an embodiment, a backpropagation algorithm can be expressed as:

B_{i,j} = Σ_k S_{i,j,k}    (1)

[00137] However, in at least some embodiments, aperture-normalized backprojection produces superior image results. In an embodiment, the aperture-normalized backprojection can be expressed as:

B_{i,j} = (1/Ω_{i,j}) Σ_k S_{i,j,k} ω_{i,j,k}    (2)

[00138] FIGS. 14A and 14B provide an illustrative example of optoacoustic tomographic images 1410 and 1420 of an imaging slice through a tumor angiogenesis model. In the first image 1410, a backpropagation algorithm, such as the first algorithm immediately above, is used to normalize the image. The resulting image has strong, bright arc-shaped artifacts 1412 around the blood vessels 1414 that are close to the array surface. In the second image 1420, an aperture-normalized backprojection algorithm, such as the second algorithm immediately above, is used to normalize the image. As can be seen, the aperture-normalized backprojection algorithm corrects image brightness and reduces the arc-shaped artifacts.
[00139] FIGS. 15A and 15B provide an illustrative example of optoacoustic tomographic images 1510 and 1520 of a point spread function as visualized with flat linear probe using 1510 a backpropagation algorithm, such as the first algorithm immediately above, and 1520 an aperture normalized backprojection algorithm, such as the second algorithm immediately above. As can be seen, the aperture normalized backprojection algorithm corrects image brightness and reduces artifacts.
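The two reconstruction variants can be sketched as follows; the element-aperture weight below uses a small-element approximation (element width times the cosine of the incidence angle divided by distance) as an illustrative stand-in for the angular portion ω_{i,j,k} described above, and the geometry, speed of sound and sampling interval are assumptions for the example.

```python
import numpy as np

def backproject(signals, elem_xy, xs, zs, c=1.49, dt=0.025, elem_width=0.3,
                normalize_aperture=True):
    """Radial backprojection of per-channel optoacoustic signals onto a 2D grid.

    signals  : (n_elements, n_samples) processed/filtered A-lines
    elem_xy  : (n_elements, 2) transducer coordinates in mm
    xs, zs   : 1D arrays of image pixel coordinates in mm
    c        : speed of sound (mm/us); dt : sampling interval (us)
    Minimal sketch assuming elements facing the +z direction; the aperture
    weight elem_width * cos(theta) / r is an illustrative approximation.
    """
    X, Z = np.meshgrid(xs, zs)                      # (nz, nx) pixel grid
    image = np.zeros_like(X, dtype=float)
    weight_sum = np.zeros_like(X, dtype=float)
    n_samples = signals.shape[1]
    for k, (ex, ez) in enumerate(elem_xy):
        dx, dz = X - ex, Z - ez
        r = np.sqrt(dx**2 + dz**2) + 1e-9           # pixel-to-element distance, mm
        idx = np.clip((r / c / dt).astype(int), 0, n_samples - 1)
        sample = signals[k][idx]                    # S_{i,j,k}
        omega = elem_width * np.abs(dz) / r**2      # ~ width * cos(theta) / r
        if normalize_aperture:
            image += sample * omega
            weight_sum += omega
        else:
            image += sample
    return image / (weight_sum + 1e-12) if normalize_aperture else image
```

Calling the function with `normalize_aperture=False` corresponds to equation (1) above, and with `normalize_aperture=True` to the aperture-normalized form of equation (2).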
Image Processing and Display

[00140] In an embodiment, the optoacoustic image palette is equalized to diminish effects of light distribution within tissue. Such equalization transforms the dynamic range of optoacoustic images for better visualization of both shallow and deep objects.
[00141] FIGS. 16A and 16B provide an illustrative example of optoacoustic images 1610 and 1620 of a phantom with hairs embedded at different depths, where the first image 1610 was created using an embodiment of a standard palette and the second image 1620 was created using an embodiment of a depth-normalized palette. As can be seen, utilizing the depth-normalized palette enhances visibility of deep objects in the illustrated embodiment.
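A minimal sketch of depth-dependent palette equalization is shown below; the exponential gain model and the effective attenuation value are illustrative assumptions rather than the equalization actually used to produce the images above.

```python
import numpy as np

def depth_normalized_display(image, zs_mm, mu_eff_per_mm=0.12):
    """Rescale an optoacoustic image so that deep objects remain visible.

    image         : (nz, nx) reconstructed image, rows ordered by depth
    zs_mm         : depth coordinate of each row in mm
    mu_eff_per_mm : assumed effective light-attenuation rate for the gain
    Returns values scaled to [0, 1] for display with a standard palette.
    """
    gain = np.exp(mu_eff_per_mm * np.asarray(zs_mm))[:, None]   # grows with depth
    compensated = image * gain
    lo, hi = np.percentile(compensated, [1, 99])
    return np.clip((compensated - lo) / (hi - lo + 1e-12), 0.0, 1.0)
```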
[00142] In an embodiment, principal component analysis (PCA) on a single optoacoustic image acquisition (different channels) is used to remove cross-correlated signal noise. Principal component analysis on a dataset of optoacoustic signals can remove correlated image clutter. Principal component analysis on optoacoustic frames can also remove correlated image clutter.
[00143] FIGS. 17A and 17B provide an illustrative example of optoacoustic images 1710 and 1720 of a phantom of a spherical simulated tumor obtained with flat linear probe. The first image 1710 is a raw image that was not subjected to principal component analysis processing. The second image 1720 has been subjected to principal component analysis processing with first principal component deconvolution. As can be seen, utilizing principal component analysis processing enhances image quality by, inter alia, reducing artifacts.
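A minimal sketch of removing the leading principal component from one acquisition (channels x samples) is given below; treating the first component as correlated clutter, and the function name, are assumptions for the example.

```python
import numpy as np

def remove_principal_components(channel_data, n_remove=1):
    """Suppress cross-correlated noise by subtracting the leading principal
    component(s) from a single optoacoustic acquisition.

    channel_data : (n_channels, n_samples) A-lines from one frame
    n_remove     : number of leading components to subtract (1 in the example
                   images above)
    """
    mean = channel_data.mean(axis=0, keepdims=True)
    centered = channel_data - mean
    # SVD of the channel x sample matrix; rows of vt are principal directions
    u, s, vt = np.linalg.svd(centered, full_matrices=False)
    clutter = (u[:, :n_remove] * s[:n_remove]) @ vt[:n_remove, :]
    return channel_data - clutter
```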
[00144] In an embodiment, design features of signal and image processing of the present disclosure can be summarized in Table 2 as follows:
Table 2. Summary of signal and image processing.
System Feature | Advantages
Operator-assisted boundary tracking on ultrasonic and optoacoustic images | Can improve quantitative optoacoustic diagnostics by evaluating the diagnostic parameters within the tumor boundary defined on US images; diagnostics can be enhanced by morphological analysis of the tumor boundary
Aperture integrated normalized radial back projection | Corrects some of the reconstruction artifacts that are observed in limited-aperture optoacoustic tomography
Equalization of the optoacoustic image palette to diminish effects of light distribution within the tissue | Transforms the dynamic range of optoacoustic images for better visualization of both shallow and deep objects
Principal component analysis (PCA) of the optoacoustic signal data | PCA on a single optoacoustic acquisition (different channels) is a fast and efficient way to remove cross-correlated signal noise; PCA on a dataset of optoacoustic signals removes correlated image clutter; PCA on optoacoustic frames removes correlated image clutter
Optoacoustic imaging system with quantitative assessment of total hemoglobin, blood oxygenation, and water | Cancer diagnostics based on those parameters or a single malignancy index (tHb*water/oxygenation) with respect to the average background
Wavelet transform that enhances images of objects within a certain dimension range | Operator can easily select the maximum size of the objects to be enhanced on the image; everything larger will be filtered out
Adaptive beamforming for optoacoustic imaging | Allows individual reconstruction on a family of radial wavelet subbands
Diagnostic Image Reprocessing

[00145] The principles of functional diagnostic imaging can be based on the tumor pathophysiology. For example, malignant tumors have an enhanced concentration of total hemoglobin and a reduced level of oxygen saturation in the hemoglobin of blood. In an embodiment, optoacoustic images can be reprocessed and converted into, inter alia, images of (i) the total hemoglobin [tHb] and (ii) the oxygen saturation of hemoglobin [sO2]. FIG. 18 demonstrates an example of two breast tumors.

[00146] FIG. 18 shows a diagram illustrating tumor differentiation based on absorption coefficients at two wavelengths, 755 nm, 1810, and 1064 nm, 1820, which match the local maximum (757 nm) and minimum (1064 nm) of the ratio of absorption by hemoglobin (hypoxic blood) to absorption by oxyhemoglobin. As can be seen, a malignant tumor, 1830, has a higher absorption coefficient at 757 nm than a benign tumor, 1840, whereas the benign tumor, 1840, has a higher absorption coefficient at 1064 nm than a malignant tumor, 1830.
[00147] FIG. 19 illustrates tumor differentiation by optoacoustic imaging based on absorption coefficients at two wavelengths 1910 and 1920 in a phantom. At 757 nm, 1920, a model of a malignant tumor is clearly visible, 1922, whereas the model of the malignant tumor, 1922, is not visible at 1064 nm, 1910.
[00148] FIG. 20A shows an optoacoustic image of two intersecting tubes filled with blood having different levels of blood [sO2] (98% in the left tube, and 31% in the right tube). The tubes were placed in 1% fat milk with optical properties similar to those found in the human breast. The wavelength of laser illumination used for this image is 1064 nm. FIG. 20B shows a photograph of an experimental setup that includes artificial blood vessels placed in milk solution and imaged using the arc-shaped optoacoustic probe. FIG. 20C shows coregistered 2D cross-sectional anatomical and functional images of blood vessel tubes showing six image panels: (1-upper left) ultrasound image depicting anatomy of the body with vessels; (2-upper right) optoacoustic image obtained at the wavelength of 757 nm; (3-lower right) optoacoustic image obtained at the wavelength of 1064 nm; (4-lower left) functional image of the total hemoglobin [tHb]; (5-lower center) functional image of the blood oxygen saturation [sO2]; (6-upper center) functional image of the blood oxygen saturation presented only in the area of maximum concentration of the total hemoglobin. Raw optoacoustic images depicted in FIG. 20C in the upper right and lower right panels demonstrate different brightness of blood vessels having blood with different levels of the total hemoglobin concentration [tHb] and blood oxygen saturation [sO2]. Accurate quantitative measurements could be performed under conditions of normalized fluence of the optical illumination of tissue in the body as a function of depth. These optoacoustic images were used to reconstruct functional images of the total hemoglobin [tHb] and the blood oxygenation [sO2]. All functional images displayed in FIG. 20C are coregistered and superimposed with the anatomical image of tissue structure for better correlation of features.
[00149] FIGS. 21A and 21B show optoacoustic signal amplitude as a function of blood oxygen saturation (with constant hematocrit) under laser illumination at the wavelength of 1064 nm in FIG. 21A and at 757 nm in FIG. 21B. These plots illustrate that blood oxygen saturation can be monitored with optoacoustic imaging. Specifically, this embodiment illustrates quantitative data based on measurements of the optoacoustic signal amplitude in blood having various levels of oxygen saturation (from 30% to 98%) and a hematocrit of 38 g/dL of hemoglobin [tHb] in erythrocytes. As predicted by the published absorption spectra of blood, the optoacoustic signal amplitude at 1064 nm illumination increases with increased level of oxygen saturation, while the optoacoustic signal amplitude decreases with increased blood oxygenation at the 757 nm illumination wavelength.
[00150] FIG. 22 illustrates optical absorption spectra of the main tissue chromophores absorbing optical energy in the near-infrared range: hemoglobin, oxyhemoglobin and water. Preferred laser wavelengths for functional imaging are 757 nm and 1064 nm, matching the maximum and minimum of the ratio [HHb]/[O2Hb], while the wavelength of 800 nm is best suited for calibration purposes through measurements of the total hemoglobin [tHb].
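A minimal sketch of two-wavelength spectral unmixing into total hemoglobin and oxygen saturation maps is given below; the extinction coefficients are rough, order-of-magnitude placeholders (a working system would use tabulated spectra), and the sketch assumes the optoacoustic images have already been normalized for depth-dependent fluence as discussed above.

```python
import numpy as np

# Illustrative (order-of-magnitude) molar extinction coefficients in 1/(cm*M);
# placeholders only -- real use requires tabulated HHb / O2Hb spectra.
EPS = {            # wavelength: (eps_HHb, eps_O2Hb)
    757:  (1600.0, 600.0),
    1064: (400.0, 1100.0),
}

def unmix_hemoglobin(mu_757, mu_1064):
    """Estimate relative [HHb] and [O2Hb] per pixel from fluence-normalized
    absorption estimates at 757 nm and 1064 nm, then derive tHb and sO2."""
    E = np.array([[EPS[757][0],  EPS[757][1]],
                  [EPS[1064][0], EPS[1064][1]]])
    Einv = np.linalg.inv(E)
    mu = np.stack([np.asarray(mu_757, dtype=float),
                   np.asarray(mu_1064, dtype=float)], axis=0)
    hhb, o2hb = np.tensordot(Einv, mu, axes=([1], [0]))   # solve 2x2 per pixel
    thb = hhb + o2hb
    so2 = np.divide(o2hb, thb, out=np.zeros_like(thb), where=thb != 0)
    return thb, so2
```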
[00151] FIGS. 23A and 23B illustrate coregistered functional and anatomical imaging of breast tumors in phantoms accurately replicating optical and acoustic properties of an average breast with tumors. FIG. 23A shows 2D images of: model of malignant tumor morphology based on ultrasound (left), the same anatomical image coregistered with functional image of the total hemoglobin concentration (center) and with functional image of the blood oxygenation (right). FIG. 23B shows 2D images of a model benign tumor: morphology image based on ultrasound (left), the same anatomical image coregistered with functional image of the total hemoglobin concentration (center) and with functional image of the blood oxygenation (right).
[00152] FIGS. 24A and 24B illustrate coregistered functional and anatomical imaging of breast tumors. FIG. 24A shows 2D images of invasive ductal carcinoma, a malignant tumor with rough boundaries, heterogeneous morphology, high concentration of total hemoglobin and low oxygen saturation (hypoxia). The malignant tumor morphology is based on ultrasound in the left image, and the same anatomical image coregistered with functional image of the blood oxygenation in the center image and with functional image of the total hemoglobin concentration in the right image. FIG. 24B shows 2D images of a breast with Fibroadenoma, a benign tumor with relatively round boundaries, normal concentration of oxyhemoglobin and relatively low total hemoglobin. Breast morphology is based on ultrasound in the left image, and the same anatomical image is coregistered with a functional image of the blood oxygenation in the center image and with a functional image of the total hemoglobin concentration in the right image.
Conclusion

[00153] While some embodiments can be implemented in fully functioning computers and computer systems, various embodiments are capable of being distributed as a computing product in a variety of forms and are capable of being applied regardless of the particular type of machine or computer-readable media used to actually effect the distribution.
[00154] At least some aspects disclosed can be embodied, at least in part, in software. That is, the techniques described herein may be carried out in a special purpose or general purpose computer system or other data processing system in response to its processor, such as a microprocessor, executing sequences of instructions contained in a memory, such as ROM, volatile RAM, non-volatile memory, cache or a remote storage device.
[00155] Routines executed to implement the embodiments may be implemented as part of an operating system, firmware, ROM, middleware, service delivery platform, SDK (Software Development Kit) component, web services, or other specific application, component, program, object, module or sequence of instructions referred to as “computer programs.” Invocation interfaces to these routines can be exposed to a software development community as an API (Application Programming Interface). The computer programs typically comprise
one or more instructions set at various times in various memory and storage devices in a computer, and that, when read and executed by one or more processors in a computer, cause the computer to perform operations necessary to execute elements involving the various aspects.
[00156] A machine-readable medium can be used to store software and data which when executed by a data processing system causes the system to perform various methods. The executable software and data may be stored in various places including for example ROM, volatile RAM, non-volatile memory and/or cache. Portions of this software and/or data may be stored in any one of these storage devices. Further, the data and instructions can be obtained from centralized servers or peer-to-peer networks. Different portions of the data and instructions can be obtained from different centralized servers and/or peer-to-peer networks at different times and in different communication sessions or in a same communication session. The data and instructions can be obtained in entirety prior to the execution of the applications. Alternatively, portions of the data and instructions can be obtained dynamically, just in time, when needed for execution. Thus, it is not required that the data and instructions be on a machine-readable medium in entirety at a particular instance of time.
[00157] Examples of computer-readable media include but are not limited to recordable and non-recordable type media such as volatile and non-volatile memory devices, read only memory (ROM), random access memory (RAM), flash memory devices, floppy and other removable disks, magnetic disk storage media, optical storage media (e.g., Compact Disk Read-Only Memory (CD ROMS), Digital Versatile Disks (DVDs), etc.), among others.
[00158] In general, a machine readable medium includes any mechanism that provides (e.g., stores) information in a form accessible by a machine (e.g., a computer, network device, personal digital assistant, manufacturing tool, any device with a set of one or more processors, etc.).
[00159] In various embodiments, hardwired circuitry may be used in combination with software instructions to implement the techniques. Thus, the techniques are neither limited to any specific combination of hardware circuitry and software nor to any particular source for the instructions executed by the data processing system.
[00160] Although some of the drawings illustrate a number of operations in a particular order, operations that are not order dependent may be reordered and other operations may be combined or broken out. While some reordering or other groupings are specifically mentioned, others will be apparent to those of ordinary skill in the art and so do not present an exhaustive list of alternatives. Moreover, it should be recognized that the stages could be
implemented in hardware, firmware, software or any combination thereof.
[00161] In the foregoing specification, the disclosure has been described with reference to specific exemplary embodiments thereof. It will be evident that various modifications may be made thereto without departing from the broader spirit and scope as set forth in the following claims. The specification and drawings are, accordingly, to be regarded in an illustrative sense rather than a restrictive sense.
[00162] In the claims that follow and in the preceding description of the invention, except where the context requires otherwise owing to express language or necessary implication, the word “comprise” or variations such as “comprises” or “comprising” is used in an inclusive sense, that is, to specify the presence of the stated features but not to preclude the presence or addition of further features in various embodiments of the invention.
[00163] Further, any reference herein to prior art is not intended to imply that such prior art forms or formed a part of the common general knowledge in any country.

Claims (21)

1. An optoacoustic imaging system for visualization of slices into the depth of tissue of at least a portion of a body, comprising: a hand-held imaging probe comprising a light emitting portion configured to emit at least two optical pulses having different spectral bands of electromagnetic energy, and an array of ultrasonic transducers configured to detect transient ultrasonic signals resulting from selective absorption of each of the at least two optical pulses in hemoglobin and oxyhemoglobin of blood contained in the tissue; a processing system configured to receive data originating from the ultrasonic transducers of the hand-held imaging probe and to process at least three independent images based at least in part upon said data, the three independent images together comprising: a first functional image reflecting distribution of total hemoglobin concentration; a second functional image reflecting distribution of blood oxygen saturation; and a morphological image of tissue structures; the processing system being further configured to substantially co-register the first functional image, the second functional image, and the morphological image in time and space, and to output a substantially co-registered image.
2. The system of claim 1, in which the hand-held imaging probe is configured to produce at least two optical beams, one on each side of the array of ultrasonic transducers, so as to deliver optical energy to a skin surface at such angle and such distance between them that the optical beams merge into one beam within a distance of skin thickness under the array of ultrasonic transducers.
3. The system of either claim 1 or 2, further comprising one or more dual-wavelength short-pulse lasers or comprising a plurality of single-wavelength short pulse lasers and a fiberoptic light delivery system configured to deliver light from the lasers to the probe.
4. The system of any one of the preceding claims, in which the system is configured to present images substantially in real time by operating at a video frame rate.
5. The system of any one of the preceding claims, in which the hand-held imaging probe comprises an acoustic lens.
6. The system of claim 5, wherein the acoustic lens is formed from a white opaque material, and the optically reflective materials comprise a thin, highly optically reflective metallic layer that removes image artifacts associated with light interactions with the acoustic lens.
7. The system of any one of the preceding claims, in which the array of ultrasonic transducers comprises ultrasonic transducers having an ultrawide ultrasonic frequency band of sensitivity, with bandwidth of up to 200% from the central frequency.
8. The system of any one of the preceding claims, in which the hand-held imaging probe comprises a fiber bundle that is divided into at least two sub-bundles, with fibers in each subbundle being randomized such that two neighboring fibers at an input appear in different subbundles of the output fiber bundle.
9. The system of any one of the preceding claims, in which the hand-held imaging probe comprises a fiber bundle that is divided into at least two sub-bundles to form fiber bundle paddles, with at least two paddles placed on each side of the ultrasonic transducer array, each paddle, in turn, being divided into smaller sub-bundles, each smaller sub-bundle being in a slot in said paddle so as to provide controlled profile of an optical beam.
10. The system of any one of the preceding claims, in which the hand-held imaging probe comprises: first and second light diffusers; first and second optical windows; at least two output fiber bundles arranged such that optical beams respectively emerging therefrom pass through the respective light diffusers, then pass through the respective optical windows, then merge at least partially.
11. The system of any one of the preceding claims, further comprising a three-dimensional positioning system configured to control position of the hand-held probe so as to allow assembly of three-dimensional volumetric images of the body from two-dimensional slices made through a depth of tissue obtained by scanning the hand-held probe along the surface of at least a portion of the body.
12. The system of any one of the preceding claims, in which the handheld imaging probe further comprises an acoustic lens formed from a material that allows it to reflect and scatter light from illumination components with substantially no absorption of such light, and yet be optically opaque.
13. The system of any one of the preceding claims, further comprising a layer of hypo-echoic material between an assembly of the array of ultrasonic transducers and a fiberoptic assembly to avoid generation of ultrasound upon interaction of light with assembly of the array of ultrasonic transducers.
14. An optoacoustic imaging method for coregistered functional and anatomical mapping of tissue of at least a portion of a body, the method comprising: a) delivering ultrasonic pulses into the tissue and detecting backscattered ultrasonic signals reflected from structural tissue boundaries associated with body morphology; b) sequentially delivering to the tissue at least two optical pulses having different spectral bands of electromagnetic energy and detecting transient ultrasonic signals resulting from selective absorption of each of the at least two optical pulses in hemoglobin and oxyhemoglobin of blood contained in tissues; c) processing detected ultrasonic signals to remove noise, to revert signal alterations in the course of signal propagation through tissue and through the detection system components, and to restore a temporal shape and ultrasonic spectrum of the original signals; d) performing image reconstruction and further processing to generate morphological images of tissue structures coregistered and superimposed with partially transparent functional images reflecting total hemoglobin concentration and blood oxygen saturation; and, e) repeating steps a) to d) at a video frame rate such that real-time images display tissue functional and morphological changes substantially as they occur.
15. The method of claim 14, in which four optical pulses each having different spectral bands of electromagnetic radiation are sequentially delivered to the tissue.
16. The method of either claim 14 or 15, in which spectral bands of the two optical pulses are selected so that one of them matches local maximum peak of hemoglobin absorption around 757 nm and the other matches the spectral range around 1064 nm corresponding to the maximum ratio in the optical absorption of oxyhemoglobin to that of hemoglobin.
17. The method of any one of claims 14 to 16, further comprising a step of indicating tumor differentiation based at least in part on absorption coefficients measured in the tumor at first and second wavelengths.
18. The method of claim 17, wherein the step of tumor differentiation comprises displaying either: a. a relatively smooth shape of a tumor, or tissue surrounding a tumor, superimposed with relatively low elevation in concentration of the total hemoglobin and normal blood oxygen saturation to indicate a benign tumor, or b. displaying a rough shape of a tumor, or tissue surrounding the tumor, superimposed with high elevation in the total hemoglobin and low blood oxygen saturation to indicate a malignant tumor.
19. The method of any one of claims 14 to 18, further comprising a step of renormalizing an image display palette by reducing relative brightness of image pixels corresponding to the tissue’s surface, thereby amplifying relative brightness of pixels corresponding to larger depths in tissue, so as to make objects located at larger depths visible with greater contrast.
20. The method of any one of claims 14 to 19, in which the step of processing detected ultrasonic signals to revert signal alterations comprises deconvolution of a hardware transfer function to obtain intrinsic optoacoustic amplitude and profile of the detected ultrasonic signals and distribution of an optical absorption coefficient in the body.
21. The system of any one of claims 1 to 13, wherein the processing comprises deconvolution of a hardware transfer function.
AU2012332233A 2011-11-02 2012-11-02 Dual modality imaging system for coregistered functional and anatomical mapping Ceased AU2012332233B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU2017268522A AU2017268522A1 (en) 2011-11-02 2017-11-28 Dual modality imaging system for coregistered functional and anatomical mapping

Applications Claiming Priority (11)

Application Number Priority Date Filing Date Title
US13/287,759 2011-11-02
US13/287,759 US20130109950A1 (en) 2011-11-02 2011-11-02 Handheld optoacoustic probe
US13/341,950 US8686335B2 (en) 2011-12-31 2011-12-31 System and method for adjusting the light output of an optoacoustic imaging system
US13/341,950 2011-12-31
US13/507,217 US9289191B2 (en) 2011-10-12 2012-06-13 System and method for acquiring optoacoustic data and producing parametric maps thereof
US13/507,217 2012-06-13
US13/667,830 2012-11-02
US13/667,808 US20130289381A1 (en) 2011-11-02 2012-11-02 Dual modality imaging system for coregistered functional and anatomical mapping
US13/667,808 2012-11-02
PCT/US2012/063409 WO2013067419A1 (en) 2011-11-02 2012-11-02 Dual modality imaging system for coregistered functional and anatomical mapping
US13/667,830 US9757092B2 (en) 2011-11-02 2012-11-02 Method for dual modality optoacoustic imaging

Related Child Applications (1)

Application Number Title Priority Date Filing Date
AU2017268522A Division AU2017268522A1 (en) 2011-11-02 2017-11-28 Dual modality imaging system for coregistered functional and anatomical mapping

Publications (2)

Publication Number Publication Date
AU2012332233A1 AU2012332233A1 (en) 2014-05-22
AU2012332233B2 true AU2012332233B2 (en) 2017-08-31

Family

ID=48192850

Family Applications (2)

Application Number Title Priority Date Filing Date
AU2012332233A Ceased AU2012332233B2 (en) 2011-11-02 2012-11-02 Dual modality imaging system for coregistered functional and anatomical mapping
AU2017268522A Abandoned AU2017268522A1 (en) 2011-11-02 2017-11-28 Dual modality imaging system for coregistered functional and anatomical mapping

Family Applications After (1)

Application Number Title Priority Date Filing Date
AU2017268522A Abandoned AU2017268522A1 (en) 2011-11-02 2017-11-28 Dual modality imaging system for coregistered functional and anatomical mapping

Country Status (8)

Country Link
JP (2) JP6322578B2 (en)
KR (1) KR102117132B1 (en)
AU (2) AU2012332233B2 (en)
CA (1) CA2861089C (en)
IL (1) IL232414A0 (en)
MX (1) MX2014005408A (en)
SG (1) SG11201401986WA (en)
WO (1) WO2013067419A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11832872B2 (en) 2019-04-01 2023-12-05 Anya L. Getman Resonating probe with optional sensor, emitter, and/or injection capability

Families Citing this family (29)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9144383B2 (en) 2007-12-13 2015-09-29 The Board Of Trustees Of The University Of Arkansas Device and method for in vivo noninvasive magnetic manipulation of circulating objects in bioflows
US9451884B2 (en) 2007-12-13 2016-09-27 Board Of Trustees Of The University Of Arkansas Device and method for in vivo detection of clots within circulatory vessels
US20090156932A1 (en) 2007-12-13 2009-06-18 Board Of Trustees Of The University Of Arkansas Device and method for in vivo flow cytometry using the detection of photoacoustic waves
US8686335B2 (en) 2011-12-31 2014-04-01 Seno Medical Instruments, Inc. System and method for adjusting the light output of an optoacoustic imaging system
US9445786B2 (en) 2011-11-02 2016-09-20 Seno Medical Instruments, Inc. Interframe energy normalization in an optoacoustic imaging system
US11191435B2 (en) 2013-01-22 2021-12-07 Seno Medical Instruments, Inc. Probe with optoacoustic isolator
US20130116538A1 (en) 2011-11-02 2013-05-09 Seno Medical Instruments, Inc. Optoacoustic imaging systems and methods with enhanced safety
US10433732B2 (en) 2011-11-02 2019-10-08 Seno Medical Instruments, Inc. Optoacoustic imaging system having handheld probe utilizing optically reflective material
US9814394B2 (en) 2011-11-02 2017-11-14 Seno Medical Instruments, Inc. Noise suppression in an optoacoustic system
US20130289381A1 (en) 2011-11-02 2013-10-31 Seno Medical Instruments, Inc. Dual modality imaging system for coregistered functional and anatomical mapping
US20140005544A1 (en) 2011-11-02 2014-01-02 Seno Medical Instruments, Inc. System and method for providing selective channel sensitivity in an optoacoustic imaging system
CA2866840C (en) 2012-03-09 2022-03-29 Seno Medical Instruments, Inc. Statistical mapping in an optoacoustic imaging system
JP6061571B2 (en) * 2012-09-04 2017-01-18 キヤノン株式会社 Subject information acquisition device
WO2014052449A1 (en) 2012-09-25 2014-04-03 The Board Of Trustees Of The University Of Arkansas Device and method for in vivo photoacoustic diagnosis and photothermal purging of infected blood
JP6472437B2 (en) * 2013-09-04 2019-02-20 キヤノン株式会社 Photoacoustic apparatus and acoustic wave receiving apparatus
CN103512960B (en) * 2013-09-27 2016-01-06 中国科学院声学研究所 A kind of supersonic array formation method
WO2015054688A2 (en) 2013-10-11 2015-04-16 Seno Medical Instruments, Inc. Systems and methods for component separation in medical imaging
CN104739453A (en) * 2013-12-31 2015-07-01 深圳市鹏瑞智能技术应用研究院 Ultrasonic tomography system and method
JP6049209B2 (en) * 2014-01-28 2016-12-21 富士フイルム株式会社 Photoacoustic measurement probe and photoacoustic measurement apparatus including the same
EP3240472B1 (en) * 2014-12-31 2023-07-26 Bioventures, LLC. Devices and methods for fractionated photoacoustic flow cytometry
WO2019156975A1 (en) * 2018-02-07 2019-08-15 Atherosys, Inc. Apparatus and method to guide ultrasound acquisition of the peripheral arteries in the transverse plane
CN112272540A (en) * 2018-04-04 2021-01-26 托莫维实验室有限公司 Quantitative imaging system and use thereof
AU2019331103A1 (en) * 2018-08-29 2021-03-25 Tel Hashomer Medical Research Infrastructure And Services Ltd. System and method for determining oxygenated-blood content of biological tissue
JP7301676B2 (en) * 2019-08-28 2023-07-03 キヤノンメディカルシステムズ株式会社 ULTRASOUND DIAGNOSTIC APPARATUS, SIGNAL PROCESSING METHOD, AND SIGNAL PROCESSING PROGRAM
JP7292434B2 (en) * 2020-01-21 2023-06-16 株式会社エビデント Erythrocyte differentiation monitoring device and erythrocyte differentiation monitoring method
CN111671436A (en) * 2020-05-21 2020-09-18 东南大学 Temperature-compensated photoacoustic noninvasive hemoglobin detection device and detection method
CN111839730B (en) * 2020-07-07 2022-02-11 厦门大学附属翔安医院 Photoacoustic imaging surgical navigation platform for guiding tumor resection
CN116138805B (en) * 2022-12-30 2023-09-08 深圳开立生物医疗科技股份有限公司 Photoacoustic ultrasound multi-modality imaging apparatus and method, electronic apparatus, and storage medium
CN115868956A (en) * 2023-03-01 2023-03-31 暨南大学附属第一医院(广州华侨医院) Anti-interference method of excitation source for magneto-optical acoustic imaging

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100256496A1 (en) * 2006-07-19 2010-10-07 Quing Zhu Method and apparatus for medical imaging using combined near-infrared optical tomography, fluorescent tomography and ultrasound
US20110201914A1 (en) * 2008-10-23 2011-08-18 Washington University In St. Louis Reflection-Mode Photoacoustic Tomography Using A Flexibly-Supported Cantilever Beam

Family Cites Families (37)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3974946B2 (en) * 1994-04-08 2007-09-12 オリンパス株式会社 Image classification device
JPH0961359A (en) * 1995-08-29 1997-03-07 Hamamatsu Photonics Kk Concentration measuring device
US5830146A (en) * 1997-03-17 1998-11-03 Polartechnics Limited Sheathed probes for tissue type recognition
JPH1176232A (en) * 1997-09-11 1999-03-23 Hitachi Medical Corp Ultrasonic diagnostic apparatus
US6827686B2 (en) * 2002-08-21 2004-12-07 Koninklijke Philips Electronics N.V. System and method for improved harmonic imaging
JP4406226B2 (en) * 2003-07-02 2010-01-27 株式会社東芝 Biological information video device
JP4643153B2 (en) * 2004-02-06 2011-03-02 株式会社東芝 Non-invasive biological information imaging device
IL166408A0 (en) * 2005-01-20 2006-01-15 Ultraview Ltd Combined 2d pulse-echo ultrasound and optoacousticsignal for glaucoma treatment
WO2006090298A1 (en) * 2005-02-23 2006-08-31 Philips Intellectual Property & Standards Gmbh Imaging an object of interest
JP4745743B2 (en) * 2005-07-14 2011-08-10 Hoya株式会社 Fluorescence observation endoscope system
US20070093708A1 (en) * 2005-10-20 2007-04-26 Benaron David A Ultra-high-specificity device and methods for the screening of in-vivo tumors
US20070093702A1 (en) * 2005-10-26 2007-04-26 Skyline Biomedical, Inc. Apparatus and method for non-invasive and minimally-invasive sensing of parameters relating to blood
JP2007267837A (en) * 2006-03-30 2007-10-18 Toshiba Corp Biolight measuring apparatus
CN101472520B (en) * 2006-06-23 2015-06-03 皇家飞利浦电子股份有限公司 Timing controller for combined photoacoustic and ultrasound imager
JP4820239B2 (en) * 2006-08-28 2011-11-24 公立大学法人大阪府立大学 Probe for optical tomography equipment
JP2010509977A (en) * 2006-11-21 2010-04-02 コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ System, apparatus, method, computer readable medium and use for biological imaging of tissue in anatomical structures
EP2115450A4 (en) * 2007-02-05 2015-03-04 Univ Brown Enhanced ultra-high resolution acoustic microscope
JP5002397B2 (en) * 2007-09-28 2012-08-15 株式会社東芝 Ultrasonic diagnostic apparatus and program
WO2009055705A2 (en) * 2007-10-25 2009-04-30 Washington University In St. Louis Confocal photoacoustic microscopy with optical lateral resolution
JP2010179085A (en) * 2008-07-11 2010-08-19 Canon Inc Biological information acquisition apparatus
WO2010009412A2 (en) * 2008-07-18 2010-01-21 University Of Rochester Medical Center Low-cost device for c-scan photoacoustic imaging
CN102137618B (en) * 2008-07-25 2015-06-17 健康与环境慕尼黑德国研究中心赫姆霍茨中心(有限公司) Quantitative multi-spectral opto-acoustic tomography (MSOT) of tissue biomarkers
JP4900979B2 (en) * 2008-08-27 2012-03-21 キヤノン株式会社 Photoacoustic apparatus and probe for receiving photoacoustic waves
US20100094134A1 (en) * 2008-10-14 2010-04-15 The University Of Connecticut Method and apparatus for medical imaging using near-infrared optical tomography combined with photoacoustic and ultrasound guidance
JP2010125260A (en) * 2008-12-01 2010-06-10 Canon Inc Biological testing apparatus
JP5241465B2 (en) * 2008-12-11 2013-07-17 キヤノン株式会社 Photoacoustic imaging apparatus and photoacoustic imaging method
JP5275830B2 (en) * 2009-01-26 2013-08-28 富士フイルム株式会社 Optical ultrasonic tomographic imaging apparatus and optical ultrasonic tomographic imaging method
JP5483905B2 (en) * 2009-03-03 2014-05-07 キヤノン株式会社 Ultrasonic device
JP4621781B2 (en) * 2009-03-06 2011-01-26 株式会社東芝 Laser ultrasonic inspection equipment
WO2010127199A2 (en) * 2009-05-01 2010-11-04 Visualsonics Inc. System for photoacoustic imaging and related methods
JP2011072702A (en) * 2009-10-01 2011-04-14 Konica Minolta Medical & Graphic Inc Acoustic lens for ultrasonic probe, and ultrasonic probe
JP5692988B2 (en) * 2009-10-19 2015-04-01 キヤノン株式会社 Acoustic wave measuring device
WO2011052061A1 (en) * 2009-10-29 2011-05-05 キヤノン株式会社 Photo-acoustic device
US8879352B2 (en) * 2010-01-25 2014-11-04 The Arizona Board Of Regents On Behalf Of The University Of Arizona Ultrasonic/photoacoustic imaging devices and methods
JP5818444B2 (en) * 2010-02-04 2015-11-18 キヤノン株式会社 Function information acquisition apparatus, function information acquisition method, and program
JP5448918B2 (en) * 2010-02-24 2014-03-19 キヤノン株式会社 Biological information processing device
JP5479173B2 (en) * 2010-03-17 2014-04-23 キヤノン株式会社 Information processing apparatus and information processing method

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100256496A1 (en) * 2006-07-19 2010-10-07 Quing Zhu Method and apparatus for medical imaging using combined near-infrared optical tomography, fluorescent tomography and ultrasound
US20110201914A1 (en) * 2008-10-23 2011-08-18 Washington University In St. Louis Reflection-Mode Photoacoustic Tomography Using A Flexibly-Supported Cantilever Beam

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Jiang Zhen et al., 'Different optical spectral characteristics in a necrotic transmissible venereal tumor ... under trans-rectal ultrasound guidance', Multimodal Biomedical Imaging VI, SPIE, vol. 7892, 2011, pages 1-10. *
Zhu et al., 'Optical Tomography with ultrasound localization for breast cancer diagnosis and treatment monitoring', Surg Oncol Clin N Am, 2008, pages 1-18. *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11832872B2 (en) 2019-04-01 2023-12-05 Anya L. Getman Resonating probe with optional sensor, emitter, and/or injection capability

Also Published As

Publication number Publication date
JP6732830B2 (en) 2020-07-29
SG11201401986WA (en) 2014-08-28
JP2015501194A (en) 2015-01-15
WO2013067419A1 (en) 2013-05-10
JP6322578B2 (en) 2018-05-09
JP2018143778A (en) 2018-09-20
AU2017268522A1 (en) 2017-12-14
KR20140103932A (en) 2014-08-27
IL232414A0 (en) 2014-06-30
CA2861089C (en) 2021-01-12
KR102117132B1 (en) 2020-05-29
CA2861089A1 (en) 2013-05-10
MX2014005408A (en) 2015-02-12
AU2012332233A1 (en) 2014-05-22

Similar Documents

Publication Publication Date Title
US10709419B2 (en) Dual modality imaging system for coregistered functional and anatomical mapping
AU2012332233B2 (en) Dual modality imaging system for coregistered functional and anatomical mapping
US9757092B2 (en) Method for dual modality optoacoustic imaging
US10433732B2 (en) Optoacoustic imaging system having handheld probe utilizing optically reflective material
AU2013212213B2 (en) Laser optoacoustic ultrasonic imaging system (LOUIS) and methods of use
JP5441795B2 (en) Imaging apparatus and imaging method
WO2011070778A1 (en) Image generating apparatus, image generating method, and program
US9360551B2 (en) Object information acquiring apparatus and control method thereof
JP2013158531A (en) Apparatus and method for obtaining subject information
US20190183349A1 (en) Information acquisition apparatus and signal processing method
JP2017047177A (en) Subject information acquiring apparatus and control method for subject information acquiring apparatus
JP6742734B2 (en) Object information acquisition apparatus and signal processing method
US20150182126A1 (en) Photoacoustic apparatus, signal processing method, and program
JP6486056B2 (en) Photoacoustic apparatus and processing method of photoacoustic apparatus
EP2773267B1 (en) Dual modality imaging system for coregistered functional and anatomical mapping
RU2787527C2 (en) System for quantitative image generation and its use
JP6643108B2 (en) Subject information acquisition device and subject information acquisition method
JP2019136520A (en) Processing device, photoacoustic image display method, and program
Reyman et al. Two-dimensional optoacoustic tomography of large-scale phantoms
JP2017124264A (en) Processing device, subject information obtaining device, photoacoustic image display method, and program
JP2017086173A (en) Subject information acquisition device and control method thereof

Legal Events

Date Code Title Description
FGA Letters patent sealed or granted (standard patent)
MK14 Patent ceased section 143(a) (annual fees not paid) or expired