WO2024104848A1 - Ultrasound imaging system and method for photoacoustic detection of transtemporal acoustic windows in transcranial ultrasound imaging - Google Patents

Ultrasound imaging system and method for photoacoustic detection of transtemporal acoustic windows in transcranial ultrasound imaging

Info

Publication number
WO2024104848A1
Authority
WO
WIPO (PCT)
Prior art keywords
pas
power
ultrasound
threshold
acoustic window
Prior art date
Application number
PCT/EP2023/081074
Other languages
French (fr)
Inventor
William Tao Shi
Faik Can MERAL
Shyam Bharat
Khaled Salem Abdalleh YOUNIS
Antonio Luigi PERRONE
Sven Peter PREVRHAL
Original Assignee
Koninklijke Philips N.V.
Application filed by Koninklijke Philips N.V. filed Critical Koninklijke Philips N.V.
Publication of WO2024104848A1 publication Critical patent/WO2024104848A1/en


Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/08 Detecting organic movements or changes, e.g. tumours, cysts, swellings
    • A61B8/0808 Detecting organic movements or changes, e.g. tumours, cysts, swellings for diagnosis of the brain
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/0093 Detecting, measuring or recording by applying one single type of energy and measuring its conversion into another type of energy
    • A61B5/0095 Detecting, measuring or recording by applying one single type of energy and measuring its conversion into another type of energy by applying light and detecting acoustic waves, i.e. photoacoustic measurements
    • A61B8/44 Constructional features of the ultrasonic, sonic or infrasonic diagnostic device
    • A61B8/4483 Constructional features of the ultrasonic, sonic or infrasonic diagnostic device characterised by features of the ultrasound transducer
    • A61B8/4488 Constructional features of the ultrasonic, sonic or infrasonic diagnostic device characterised by features of the ultrasound transducer the transducer being a phased array
    • A61B8/54 Control of the diagnostic device
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S15/00 Systems using the reflection or reradiation of acoustic waves, e.g. sonar systems
    • G01S15/88 Sonar systems specially adapted for specific applications
    • G01S15/89 Sonar systems specially adapted for specific applications for mapping or imaging
    • G01S15/8906 Short-range imaging systems; Acoustic microscope systems using pulse-echo techniques
    • G01S15/8909 Short-range imaging systems; Acoustic microscope systems using pulse-echo techniques using a static transducer configuration
    • G01S15/8915 Short-range imaging systems; Acoustic microscope systems using pulse-echo techniques using a static transducer configuration using a transducer array
    • G01S15/8925 Short-range imaging systems; Acoustic microscope systems using pulse-echo techniques using a static transducer configuration using a transducer array the array being a two-dimensional transducer configuration, i.e. matrix or orthogonal linear arrays
    • G01S15/899 Combination of imaging systems with ancillary equipment
    • A61B6/00 Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • A61B6/46 Arrangements for interfacing with the operator or the patient
    • A61B6/461 Displaying means of special interest
    • A61B6/463 Displaying means of special interest characterised by displaying multiple images or images and diagnostic data on one display
    • A61B8/0833 Detecting organic movements or changes, e.g. tumours, cysts, swellings involving detecting or locating foreign bodies or organic structures
    • A61B8/085 Detecting organic movements or changes, e.g. tumours, cysts, swellings involving detecting or locating foreign bodies or organic structures for locating body or organic structures, e.g. tumours, calculi, blood vessels, nodules
    • A61B8/52 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/5215 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data
    • A61B8/5238 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data for combining image data of patient, e.g. merging several images from different acquisition modes into one image
    • A61B8/5246 Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves involving processing of medical diagnostic data for combining image data of patient, e.g. merging several images from different acquisition modes into one image combining images from the same or different imaging techniques, e.g. color Doppler and B-mode

Definitions

  • transcranial color-coded duplex ultrasonography One common type of ultrasound imaging is transcranial color-coded duplex ultrasonography. This type of imaging is widely used due to its capability of assessing both the intracerebral vascular system and anatomical structures, either bone or parenchymal. Because it is a noninvasive and readily available method, transcranial color-coded duplex imaging can be used as a repeatable bedside tool to identify patients with compromised intracranial hemodynamics, already during the ultra-early phase of acute brain injury, thus providing important prognostic information for the clinician. Therefore, transcranial ultrasound has been developed into a point-of-care modality in pre-hospital emergency and in-hospital critical care settings.
  • Photoacoustic signals One application of photoacoustics is illumination of anatomical tissue with optical energy, such as focused laser light, which stimulates the illuminated tissue to generate a mechanical response that may be detected by an acoustic transducer, such as an ultrasound probe. Systems and methods which exploit this feature are described herein.
  • a system for providing ultrasound images comprises: a probe comprising a plurality of ultrasound transducer elements and a source of optical energy; a processor; and a tangible, non-transitory computer-readable medium that stores instructions.
  • When executed by the processor, the instructions cause the processor to: activate the source of optical energy to illuminate an area of a body to stimulate photoacoustic signals (PAS).
  • PAS photoacoustic signals
  • each of the plurality of ultrasound transducer elements does not emit ultrasound energy during the stimulation of the photoacoustic signals.
  • the instructions further cause the processor to: receive the photoacoustic signals at the probe; determine a power of the PAS incident on the probe; and activate one or more of the plurality of ultrasound transducer elements only when the power of the PAS is greater than a threshold, or activate one or more of the plurality of ultrasound transducer elements in an area only when the power of the PAS is greater than the threshold.
  • a method of ultrasound imaging comprises: activating a source of optical energy to illuminate an area of a body to stimulate PAS.
  • each of a plurality of transducer elements does not emit ultrasound energy during the stimulation of the PAS.
  • the method further comprises receiving the photoacoustic signals at an ultrasound probe; determining a power of the PAS incident on the ultrasound probe; and activating one or more of the plurality of ultrasound transducer elements in a portion of the body only when the power of the PAS is greater than a threshold, or activating one or more of the plurality of transducers in an area of the portion of the body only when the power of the PAS is greater than the threshold.
  • a tangible, non-transitory computer-readable medium that stores instructions.
  • When executed by a processor, the instructions cause the processor to: activate a source of optical energy to illuminate an area of a body to stimulate PAS.
  • each of a plurality of transducer elements does not emit ultrasound energy during the stimulation of the photoacoustic signals.
  • the instructions further cause the processor to receive the photoacoustic signals at a probe; determine a power of the PAS incident on the probe; and activate one or more of the plurality of ultrasound transducer elements only when the power of the PAS is greater than a threshold, or activate one or more of the plurality of transducers in an area only when the power of the PAS is greater than the threshold.
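The claim variants above share one gating step: determine the power of the PAS received by each transducer element, and activate only those elements whose received power exceeds a threshold. A minimal sketch of that step, assuming per-element PAS traces in a NumPy array and an illustrative threshold value (the application does not specify either):

```python
import numpy as np

# Illustrative threshold; the claims leave the actual value open.
POWER_THRESHOLD = 0.5

def element_powers(pas_frames: np.ndarray) -> np.ndarray:
    """Mean signal power of the PAS trace received by each element."""
    return np.mean(pas_frames ** 2, axis=1)

def select_active_elements(pas_frames: np.ndarray,
                           threshold: float = POWER_THRESHOLD) -> np.ndarray:
    """Indices of elements whose received PAS power exceeds the threshold.

    Only these elements would later be activated for pulse-echo imaging;
    the remaining elements stay silent, as in the claims.
    """
    powers = element_powers(pas_frames)
    return np.flatnonzero(powers > threshold)

# Example: 8 elements, 4 samples each; elements 2 and 5 see strong PAS.
frames = np.full((8, 4), 0.1)
frames[2] = 1.0
frames[5] = 0.9
print(select_active_elements(frames))  # -> [2 5]
```

Mean-squared amplitude is used here as a simple power estimate; any monotone power metric would gate the elements the same way.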
  • FIG. 1 is a simplified block diagram of an ultrasound (US) imaging system for imaging a portion of a body, according to a representative embodiment.
  • US ultrasound
  • FIGs. 2A-2C are perspective views of ultrasound transducer probe heads each comprising a plurality of ultrasound transducer elements and a light source used to generate photoacoustic signals (PAS) according to a representative embodiment.
  • PAS photoacoustic signals
  • Fig. 3A is an image locating a temporal acoustic window according to a representative embodiment.
  • Fig. 3B is an image showing manual placement of an ultrasound probe in a temporal region according to a representative embodiment.
  • Fig. 3C is an image of data from a simulated PAS scan provided on a display according to a representative embodiment.
  • Fig. 4A is a flow-chart of operation of a system for locating an acoustic window according to a representative embodiment.
  • Fig. 4B is a flow-chart of operation of a system for performing an ultrasound scan with an optimal aperture on the acoustic window according to a representative embodiment.
  • the term ‘approximately’ means to within acceptable limits or degree.
  • “approximately 2 MHz” means one of ordinary skill in the art would consider the signal to be 2 MHz within reasonable measure.
  • the term ‘substantially’ means within acceptable limits or degree.
  • the “plurality of transducer elements are substantially the same” means one of ordinary skill in the art would consider the plurality of transducer elements to be the same.
  • PA signals are adequately generated inside the human brain (mostly around “shallow” cortical layers behind the temporal bone) and can thus be used for two-dimensional PA sensing to locate a transtemporal acoustic window (often referred to herein as the ‘temporal acoustic window’ or the ‘acoustic window’).
  • PA sensing utilizes adequate PA signals from the cortical layers as “acoustic sources” for receive-only US detection of 2D acoustic transparency of the temporal bone.
  • a transtemporal acoustic window (a thin area of the temporal bone that allows ultrasound signals to penetrate) is identified using PA signals. Not only do the system and method of various representative embodiments improve the ease of taking ultrasound images of the brain, but they also enable good quality ultrasound imaging by operators who may not be experts.
  • FIG. 1 is a simplified block diagram of an ultrasound imaging system 100 for imaging a region of interest of a subject, according to a representative embodiment.
  • the ultrasound imaging system 100 comprises an imaging device 110 and a computer system 115 for controlling imaging of a region of interest in a patient 105 on a table 106.
  • the imaging device 110 is illustratively an ultrasound imaging probe having a plurality of transducer elements (“elements” not shown in Fig. 1) for conducting US imaging scans of a region of interest (ROI) of the patient 105.
  • the imaging device 110 also comprises one or more sources of optical energy for stimulating photoacoustic signals (PAS) in the body to facilitate locating an acoustic window to be displayed (e.g., on display 140) by the operator of the imaging device 110.
  • PAS photoacoustic signals
  • the region of interest is illustratively the temporal acoustic window in the temporal region of the head of patient 105. It is emphasized that the application of the present teachings is not limited to identifying the temporal acoustic window in the temporal region of the head of patient 105, and other portions of the body of the patient 105 are contemplated for US imaging according to the various systems, devices and methods of the present teachings. For example, and again just by way of illustration, another application of the systems, devices and methods of various representative embodiments are contemplated in the comparatively highly perfused liver behind the rib cage. The present teachings can thus be applied to identify a region between the ribs (i.e., an acoustic window) where US imaging of the underlying liver is desired.
  • the computer system 115 receives image data from the imaging device 110, and stores and processes the imaging data according to representative embodiments described herein.
  • the computer system 115 comprises a controller 120, a memory 130, a display 140 comprising a graphical user interface (GUI) 145, and a user interface 150.
  • GUI graphical user interface
  • the display 140 may also include a loudspeaker (not shown) to provide audible feedback.
  • the memory 130 stores instructions executable by the controller 120. When executed, and as described more fully below, the instructions cause the controller 120 to allow the user to perform different steps using the GUI 145 or the user interface 150, or both, and to initialize an ultrasound imaging device comprising a transducer.
  • the GUI 145 and the display 140, or the user interface 150 and the display 140, are used to select the time-based parameter desired to be reviewed by the clinician or sonographer.
  • the controller 120 may implement additional operations based on executing instructions, such as instructing or otherwise communicating with another component of the computer system 115, including the memory 130 and the display 140, to perform one or more of the above-noted processes.
  • the memory 130 may include a main memory and/or a static memory, where such memories may communicate with each other and the controller 120 via one or more buses.
  • the memory 130 stores instructions used to implement some or all aspects of methods and processes described herein.
  • the memory 130 may be implemented by any number, type and combination of random access memory (RAM) and read-only memory (ROM), for example, and may store various types of information, such as software algorithms, which serve as instructions which, when executed by a processor, cause the processor to perform various steps and methods according to the present teachings. Furthermore, updates to the methods and processes described herein may also be provided to the computer system 115 and stored in memory 130.
  • RAM random access memory
  • ROM read-only memory
  • ROM and RAM may include any number, type and combination of computer readable storage media, such as a disk drive, flash memory, an electrically programmable read-only memory (EPROM), an electrically erasable and programmable read only memory (EEPROM), registers, a hard disk, a removable disk, tape, compact disk read only memory (CD-ROM), digital versatile disk (DVD), floppy disk, Blu-ray disk, a universal serial bus (USB) drive, or any other form of storage medium known in the art.
  • the memory 130 is a tangible storage medium for storing data and executable software instructions, and is non-transitory during the time software instructions are stored therein.
  • non-transitory is to be interpreted not as an eternal characteristic of a state, but as a characteristic of a state that will last for a period.
  • the term “non-transitory” specifically disavows fleeting characteristics such as characteristics of a carrier wave or signal or other forms that exist only transitorily in any place at any time.
  • the memory 130 may store software instructions and/or computer readable code that enable performance of various functions.
  • the memory 130 may be secure and/or encrypted, or unsecure and/or unencrypted.
  • Memory is an example of computer-readable storage media, and should be interpreted as possibly being multiple memories or databases.
  • the memory or database for instance may be multiple memories or databases local to the computer, and/or distributed amongst multiple computer systems or computing devices, or disposed in the ‘cloud’ according to known components and methods.
  • a computer readable storage medium is defined to be any medium that constitutes patentable subject matter under 35 U.S.C. §101 and excludes any medium that does not constitute patentable subject matter under 35 U.S.C. §101. Examples of such media include tangible non-transitory media such as computer memory devices that store information in a format that is readable by a computer or data processing system.
  • non-transitory media include computer disks and non-volatile memories.
  • modules for carrying different functions according to the present teachings. These modules comprise executable instructions, which when executed by a processor, cause the processor to carry out the various methods and functions of the representative embodiments.
  • the controller 120 is representative of one or more processors 121, and is configured to execute software instructions stored in memory 130 to perform functions as described in the various embodiments herein.
  • one or more of the processors 121 may be separate from the controller and may be dedicated to carrying out certain functions.
  • the controller 120 may be implemented by field programmable gate arrays (FPGAs), application specific integrated circuits (ASICs), systems on a chip (SOC), a general purpose computer, a central processing unit, a computer processor, a microprocessor, a graphics processing unit (GPU), a microcontroller, a state machine, programmable logic device, or combinations thereof, using any combination of hardware, software, firmware, hard-wired logic circuits, or combinations thereof. Additionally, any processing unit or processor herein may include multiple processors, parallel processors, or both. Multiple processors may be included in, or coupled to, a single device or multiple devices.
  • processor encompasses an electronic component able to execute a program or machine executable instruction.
  • references to a computing device comprising “a processor” should be interpreted to include more than one processor or processing core, as in a multi-core processor.
  • a processor may also refer to a collection of processors within a single computer system or distributed among multiple computer systems, such as in a cloud-based or other multi-site application.
  • the term computing device should also be interpreted to include a collection or network of computing devices each including a processor or processors. Modules have software instructions to carry out the various functions using one or multiple processors that may be within the same computing device or which may be distributed across multiple computing devices.
  • the display 140 may be a monitor such as a computer monitor, a television, a liquid crystal display (LCD), a light emitting diode (LED) display, a flat panel display, a solid-state display, or a cathode ray tube (CRT) display, or an electronic whiteboard, for example.
  • the display 140 may also provide a graphical user interface (GUI) 145 for displaying and receiving information to and from the user.
  • GUI graphical user interface
  • the display 140 shows regions where the intensity/power of the PAS is comparatively high, and where the acoustic window may be located.
  • the identification of the acoustic window can be made readily by reviewing the image displayed, and identifying connected pixels on the display with a comparatively high intensity/power.
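The "connected pixels" criterion above can be sketched as a connected-component search over a PAS intensity map: find the largest group of adjacent above-threshold pixels. The map values, the threshold, and the use of 4-connectivity are illustrative assumptions; the application does not prescribe a particular algorithm:

```python
from collections import deque

def largest_bright_region(intensity, threshold):
    """Return the set of (row, col) pixels in the largest 4-connected
    component whose intensity exceeds `threshold`."""
    rows, cols = len(intensity), len(intensity[0])
    seen, best = set(), set()
    for r0 in range(rows):
        for c0 in range(cols):
            if intensity[r0][c0] <= threshold or (r0, c0) in seen:
                continue
            # Breadth-first flood fill over adjacent bright pixels.
            component, queue = set(), deque([(r0, c0)])
            seen.add((r0, c0))
            while queue:
                r, c = queue.popleft()
                component.add((r, c))
                for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    nr, nc = r + dr, c + dc
                    if (0 <= nr < rows and 0 <= nc < cols
                            and (nr, nc) not in seen
                            and intensity[nr][nc] > threshold):
                        seen.add((nr, nc))
                        queue.append((nr, nc))
            if len(component) > len(best):
                best = component
    return best

# Toy PAS intensity map: the 5-pixel bright patch on the right would be
# reported as the candidate acoustic window; the lone pixel at (2, 0) is not.
pas_map = [[0, 0, 9, 9],
           [0, 0, 9, 9],
           [8, 0, 0, 9],
           [0, 0, 0, 0]]
window = largest_bright_region(pas_map, threshold=5)
print(sorted(window))  # -> [(0, 2), (0, 3), (1, 2), (1, 3), (2, 3)]
```

In practice a library routine such as a labeling function from an image-processing package would replace the hand-rolled flood fill; the sketch only shows the idea of "connected high-intensity pixels".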
  • the user interface 150 may include a user and/or network interface for providing information and data output by the controller 120 and/or the memory 130 to the user and/or for receiving information and data input by the user. That is, the user interface 150 enables the user to operate the imaging device as described herein, and to schedule, control or manipulate aspects of the ultrasound imaging system 100 of the present teachings. Notably, the user interface 150 enables the controller 120 to indicate the effects of the user’s control or manipulation.
  • the user interface 150 may include one or more of ports, disk drives, wireless antennas, or other types of receiver circuitry.
  • the user interface 150 may further connect one or more interface devices, such as a keyboard, a mouse, a trackball, a joystick, a microphone, a video camera, a touchpad, a touchscreen, or voice or gesture recognition captured by a microphone or video camera, for example.
  • the controller 120, the memory 130, the display 140, the GUI 145 and the user interface 150 may be located away from (e.g., in another location of a building, or another building) the imaging device 110 operated by a sonographer.
  • the controller 120, the memory 130, the display 140, the GUI 145 and the user interface 150 may be, for example, located where the radiologist/clinician is located.
  • additional controllers, memories, displays, GUIs and user interfaces may be located near the sonographer and are useful in effecting the various functions of the imaging device 110 needed to complete the US scans contemplated by the present teachings.
  • the ultrasound imaging system 100 may comprise a source of ultrasound signal data from a US examination.
  • the computer system 115 may be connected to the source of these ultrasound signal data (e.g., at a remote location from the computer system 115) from the US examination to receive these data, but the ultrasound imaging system 100 does not include the imaging device 110 or the table 106.
  • Figs. 2A-2C are perspective views of ultrasound transducer probe heads each comprising a plurality of ultrasound transducer elements and a light source used to generate photoacoustic signals (PAS) according to a representative embodiment.
  • PAS photoacoustic signals
  • the ultrasound probe heads shown in Figs. 2A-2C are contemplated for use in the imaging device 110 shown and described in connection with representative embodiments of Fig. 1. It is emphasized that the light sources and their placement on the transducer probe head, and the transducer elements described below are merely illustrative and other light source and placement are contemplated.
  • a light generator may comprise: a commercially available 10 Hz Nd:YAG tunable laser having a wavelength in the range of 690 nm to 900 nm; or four (4) silica fiber bundles comprising a number of optical fibers (e.g., 100 optical fibers) with an average output energy of around 20 mJ.
  • LEDs that provide discrete near-infrared light having a wavelength in the range of 690 nm to 980 nm at up to 200 µJ per pulse with 100-nsec pulse duration and 200 Hz PRF (pulse repetition frequency).
  • an imaging device 110 may comprise a commercially available 4-channel pulsed LED source.
  • Fig. 2 A shows an US transducer probe head 202 in accordance with a representative embodiment.
  • the US transducer probe head 202 comprises a plurality of transducer elements 204 disposed at the surface of the US transducer probe head 202 that transmit and receive US images.
  • the US transducer probe head 202 further comprises a light source 206 that is used to generate the PA signals used to locate an acoustic window in accordance with various representative embodiments.
  • Fig. 2B shows an US transducer probe head 202 in accordance with another representative embodiment.
  • the US transducer probe head 202 comprises a plurality of transducer elements 204 disposed at the surface of the US transducer probe head 202 that transmit and receive US images.
  • the US transducer probe head 202 also comprises a light source 208 that transmits optical energy to generate PA signals used to locate an acoustic window in accordance with representative embodiments.
  • Fig. 2C shows an US transducer probe head 202 in accordance with another representative embodiment.
  • the US transducer probe head 202 comprises a plurality of transducer elements 204 disposed at the surface of the US transducer probe head 202 that transmit and receive US images.
  • the US transducer probe head 202 also comprises light sources 210, 212, 214 and 216 disposed at locations along the perimeter of an array of transducer elements 204. These light sources 210, 212, 214 and 216 generate PA signals used to locate an acoustic window in accordance with representative embodiments described more fully below.
  • in the embodiments of Figs. 2A-2C, moderately intense light pulses (providing light energy in the range of approximately 2 mJ to approximately 20 mJ) are repetitively applied to illuminate the comparatively shallow regions (e.g., cortical layers) inside the brain through all intervening materials (skin, muscle layer and temporal bone window).
  • the light sources provide light at repetition frequencies in the range of approximately 5 Hz to approximately 500 Hz.
  • Light transmitted by the light sources of the various representative embodiments passes through the skin and muscle layer, the temporal bone, and into the brain tissue.
  • PA signals generated in regions of the body that have comparatively high blood perfusion are of comparatively high intensity.
  • the bone areas where the perfusion is comparatively low provide comparatively low intensity PA signals.
  • optical signals that are transmitted by the light sources (e.g., LEDs, optical fibers) of the US transducer probe head 202 to regions of the body having larger blood vessels, such as the surface of the brain generate PA signals that are comparatively intense.
  • optical signals that are transmitted by the light sources of the US transducer probe head 202 to regions where the perfusion is low generate comparatively low intensity PA signals.
  • the PA signals having comparatively high intensity are shown on the display 140 to allow the operator of the imaging device 110 to readily determine the acoustic window where acceptable US imaging can be done.
  • the comparatively low intensity PA signals from regions of low perfusion may be selectively filtered out and not shown on the display 140 so as to avoid obscuring the US signals from the acoustic window.
  • the transducer elements 204 that are located in regions where the PA signals have a comparatively high intensity are activated to receive US signals reflected from that portion of the body, and transducer elements 204 that are in regions where the PA signals have a comparatively low intensity (i.e., regions with comparatively low perfusion) are not activated to avoid obscuring the US images with US signals that can reduce the clarity of the resultant US images.
  • Fig. 3A is an image showing a transtemporal acoustic window 302 on the temporal bone identified by a system, device and method according to a representative embodiment.
  • the transtemporal acoustic window 302 is a region through which PAS generated in regions of comparatively high perfusion are received. It is in this region that transtemporal imaging provides the highest quality images, with the transtemporal acoustic window 302 providing a better level of transparency than other regions. For example, in a region 304 outside the transtemporal acoustic window 302, where bone, tissue and skin exist in greater thickness, the transparency is lower.
  • Fig. 3C shows a PAS intensity/power map provided on a display (e.g., display 140) according to a representative embodiment. As noted above, in regions of comparatively high perfusion, the intensity/power of the PAS signals is greatest, locating an acoustic window. So, in Fig. 3C, in region 310, where the intensity of the PAS signals is comparatively high, an operator can locate the imaging device 110 (for example as shown in Fig. 3B).
  • an intensity threshold may be used to differentiate between regions of high PAS signal intensity and low PAS signal intensity.
  • a filtering function can be applied so PAS having an intensity/power above the threshold are shown on the display 140, and PAS having an intensity/power below the threshold are not shown on the display 140.
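A minimal sketch of such a display filter, assuming the PAS intensity/power map is available as a 2-D array (the threshold value here is illustrative, not taken from the application):

```python
import numpy as np

def filter_for_display(pas_map: np.ndarray, threshold: float) -> np.ndarray:
    """Keep only above-threshold PAS intensities; suppress the rest to zero
    so low-perfusion regions do not obscure the acoustic window on screen."""
    return np.where(pas_map > threshold, pas_map, 0.0)

pas_map = np.array([[0.2, 0.9],
                    [0.7, 0.1]])
filtered = filter_for_display(pas_map, threshold=0.5)
# Bright pixels (0.9, 0.7) survive; dim pixels are suppressed to 0.
print(filtered)
```

The suppressed pixels could equally be rendered transparent or in a muted colormap; zeroing them is just the simplest form of "not shown on the display."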
  • PA signals generated within the comparatively highly-perfused brain tissues located deeper (i.e., behind the temporal bone) are retained, while PA signals generated in shallow, intervening tissue layers can be effectively removed by a time filter.
  • This time filter is also referred to as a range filter, where the range is given by the time of propagation of the PA signal multiplied by the speed of sound in the medium (e.g., tissue, bone, etc. in the shallow regions).
  • the PA signals generated within shallow intervening tissue layers arrive at the transducer surface earlier in time than the PA signals generated within highly-perfused brain tissues behind the temporal bone.
  • these undesired PA signals from the shallow intervening tissue layers can be identified and removed by the processor 121 so as not to be displayed on the display 140.
  • this provides an improved SNR because the PA signals generated in shallow intervening tissue layers, which arrive at the transducer surface and are received by transducer elements 204 at an earlier time than the PA signals generated within highly-perfused brain tissues behind the temporal bone, can be eliminated.
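The time (range) filtering described in the bullets above can be sketched as zeroing the early-arriving samples of each element's record. The 1540 m/s speed of sound, the 20 MHz sampling rate, the 30 mm cutoff depth, and the `range_gate` helper are all illustrative assumptions. Note that for photoacoustic reception the propagation is one-way, so depth equals arrival time multiplied by the speed of sound (not divided by two as in pulse-echo imaging):

```python
import numpy as np

SPEED_OF_SOUND = 1540.0  # m/s, nominal soft-tissue value (assumption)

def range_gate(rf_samples, sample_rate_hz, min_depth_m):
    """Zero out the early-arriving portion of a per-element PAS record,
    discarding signals generated in shallow intervening tissue layers.
    One-way propagation: depth = arrival_time * c."""
    t_min = min_depth_m / SPEED_OF_SOUND           # earliest arrival kept
    n_min = int(round(t_min * sample_rate_hz))     # first sample index kept
    gated = np.array(rf_samples, dtype=float)
    gated[:n_min] = 0.0                            # remove shallow-layer PAS
    return gated

# Hypothetical record: 2000 samples at 20 MHz; keep depths beyond 30 mm.
record = np.ones(2000)
gated = range_gate(record, sample_rate_hz=20e6, min_depth_m=0.030)
```

With these assumed numbers the first 390 samples (arrivals shallower than about 30 mm) are discarded, leaving only the later arrivals from behind the temporal bone.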
  • the PAS intensity/power levels from PA signals from the comparatively highly-perfused brain tissues located deeper (i.e., behind the temporal bone), which are transmitted and provided on the display 140, clearly show a good location for the imaging device 110 in the acoustic window, and provide US images that have a comparatively high signal-to-noise ratio (SNR). This is because the imaging device 110 is not located in regions where reverberations can result in undesired noise.
  • US imaging occurs after the PAS map is determined.
  • the processor 121 may be adapted to determine which transducer elements 204 are located in region 310 and to activate these transducer elements to provide comparatively high quality images.
  • the processor 121 may be adapted to determine which transducer elements 204 are located in region 312, and not to activate these transducer elements 204.
  • US signals are not provided in regions where reverberations may cause noise that obscures the US image and reduces the SNR.
  • Fig. 4A is a flow-chart of operation of a system 400 of locating an acoustic window according to a representative embodiment. Certain details and aspects of the various representative embodiments described above in connection with Figs. 1-3C may be common to the ultrasound transducer probe head described in connection with Fig. 4A. These common details and aspects may not be repeated to avoid obscuring the description of the currently described representative embodiments.
  • a head 402 is disposed near an US imaging probe device 408 comprising an US transducer probe head that comprises a plurality of transducer elements and sources of visible or invisible light (often referred to herein as ‘light’).
  • the US imaging probe device 408 is substantially the same as US transducer probe head 202 described above.
  • the US transducer elements of the US imaging probe device 408 are not active in transmit mode but are active in a passive receive mode of operation, and light 404 (e.g., near-infrared (IR) radiation) is incident on the head 402.
  • This light is provided by a source 410, which may comprise light sources such as described above.
  • the US imaging probe device 408 is disposed so the US imaging probe device 408 is located near the temple for locating the temporal acoustic window.
  • the present teachings are not limited to this application.
  • light 404 is pulsed and induces PAS by the photoacoustic effect as discussed above.
  • PA signals 406 are transmitted from the regions of generation in the head and are incident on transducer elements of the US imaging probe device 408, as described above.
  • the signals generated by the PAS incident on transducer elements of the US probe head are provided to a US transmit/receive unit 412, which is an ultrasound beam forming unit known to those of ordinary skill in the art.
  • the US transmit/receive unit 412 acts only in a passive manner, receiving only PA signals 406 incident on the US imaging probe device 408.
  • the US transmit/receive unit 412 is adapted to receive electrical signals generated by the transducer elements of the US imaging probe device 408, the electrical signals in turn responsive to the received PAS 406.
  • all transducer elements of the US imaging probe device 408 are adapted to receive PAS. As such, in this mode the US transducer probe is operating in “full aperture mode.”
  • the PAS 406 are provided from the US transmit/receive unit 412 to the controller 120 to carry out the range-filtering function as described above.
  • range gate instructions are stored in a range filtering instructions module 416 in memory 130.
  • the range filter instructions are executed by the processor 121 and cause the processor 121 to discard PAS having an intensity/power that is below a threshold and save PAS having an intensity/power above the threshold in the memory 130.
  • the data stored in memory 130 from the executed range filtering instructions are then summed. Specifically, summation instructions are stored in a summation instructions module 418 in memory 130.
  • the summation instructions are executed by the processor 121 and provide data for display by execution of instructions by the processor 121 stored in a display instructions module 420.
  • these data are the sum of the intensity/power at each transducer element of the US imaging probe device 408 from the PAS from the head 402, and by execution of the display instructions, the processor 121 provides commands to display the acoustic window by the intensity/power at each transducer element on the display 140.
  • summation instructions from the summation instructions module 418 cause the processor 121 to integrate the PAS received from the execution of the range filtering instructions that are above the set threshold for each US transducer element over time. Accordingly, the total PA intensity/power and acoustic window size (number of transducer elements having adequate PA sense signals) are also calculated for each intensity/power map obtained using the illustrative method.
  • the summed intensity/power data are then provided to the display 140, which is adapted to show the regions of the US imaging probe device 408 having comparatively high intensity/power PAS signals in order to determine not only where the acoustic window is located on the head 402, but also, as described below, to selectively activate only transducer elements in the US imaging probe device 408 in the acoustic window during the subsequent US imaging procedure in “optimal aperture” mode.
  • a measure of the intensity/power of the PAS at each transducer element is displayed in a 2D map, such as shown in Fig. 3C. This 2D map will show the user where the acoustic window is.
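The summation step described in the bullets above can be sketched as time-integration of the squared, range-filtered samples at each element, with the acoustic-window size taken as the count of elements whose integrated power exceeds the threshold. The record shapes, values, and the `pas_power_map` helper are illustrative assumptions:

```python
import numpy as np

def pas_power_map(rf_records, threshold):
    """Time-integrate the (range-gated) PAS power at each element and
    report the acoustic-window size as the number of elements whose
    integrated power exceeds the threshold.

    rf_records : array of shape (n_elements, n_samples)
    """
    rf_records = np.asarray(rf_records, dtype=float)
    power = np.sum(rf_records ** 2, axis=1)        # integrated power per element
    window_size = int(np.sum(power > threshold))   # elements inside the window
    return power, window_size

# Hypothetical 4-element probe, 8 samples per record.
records = np.array([[0.0] * 8,     # outside the acoustic window
                    [0.5] * 8,     # weak PAS, below threshold
                    [1.0] * 8,     # inside the window
                    [1.2] * 8])    # inside the window
power, window_size = pas_power_map(records, threshold=4.0)
```

For a 2D probe, `power` would be reshaped to the element grid and rendered as the intensity/power map of Fig. 3C, showing the user where the acoustic window is.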
  • Fig. 4B is a flow-chart of operation of a system 400 for performing an US imaging procedure according to a representative embodiment. Certain details and aspects of the various representative embodiments described above in connection with Figs. 1-4A may be common to the ultrasound transducer probe head described in connection with Fig. 4B. These common details and aspects may not be repeated to avoid obscuring the description of the currently described representative embodiments.
  • the light sources of the US imaging probe device 408 are not active in this mode, and no light illumination is transmitted to the head by the US imaging probe device 408.
  • the range filtering instructions module 416 and the summation instructions module 418 are dormant, as these are used to determine the acoustic window as described above in connection with Fig. 4A.
  • the data determined at summation instructions module 418 are provided by the controller 120 to an US image processing unit 414.
  • the US image processing unit 414 may comprise executable instructions stored in memory 130.
  • the US image processing unit 414 may comprise firmware, such as an application specific integrated circuit (ASIC) or a field programmable gate array (FPGA).
  • When executed by the processor 121, these instructions cause the US transmit/receive unit 412 to activate transducer elements in the US imaging probe device 408 that are in the acoustic window determined as described above. As such, based on data generated from the display instructions module 420, the US image processing unit 414 determines which transducer elements are located in the acoustic window and which transducer elements are not located in the acoustic window. These data are provided to the US transmit/receive unit 412, which in turn controls the US imaging probe device 408 with control instructions. The control instructions cause transducer elements in the acoustic window to be activated and transducer elements outside the acoustic window not to be activated.
  • the data acquired from the PAS 406 are used not only to display the intensity/power level at each transducer element of the US imaging probe device 408, but also to determine which transducer elements of the US imaging probe device are disposed in the acoustic window (and are thus activated), and which are not disposed in the acoustic window (and thus are not activated) during operation in the optimal aperture mode.
  • the US image processing unit 414 is adapted to send control signals to activate the transducer elements of the US transducer probe based on whether the intensity of the PAS signal on each transducer element is greater than the threshold.
  • the US image processing unit 414 provides control signals to the US transmit/receive unit 412 to carry out imaging through the acoustic window located by the system 400.
  • the data from the display instructions module 420 can provide an apodization map to the US transmit/receive unit 412 directly.
  • the apodization map shows which transducer elements are turned on and which are turned off in the “optimal aperture” mode.
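The apodization map described above can be sketched as a binary per-element mask derived from the integrated PAS power: 1 activates an element inside the acoustic window, 0 leaves it off in "optimal aperture" mode. The power values and the `apodization_map` helper are illustrative assumptions:

```python
import numpy as np

def apodization_map(element_power, threshold):
    """Binary per-element apodization: 1 turns an element on (its
    time-integrated PAS power exceeds the threshold, so it lies inside
    the acoustic window), 0 turns it off."""
    element_power = np.asarray(element_power, dtype=float)
    return (element_power > threshold).astype(int)

# Hypothetical integrated PAS power for a 1 x 6 row of elements.
power = np.array([0.4, 2.1, 5.0, 6.3, 4.8, 0.9])
apod = apodization_map(power, threshold=2.0)
```

In this sketch the same mask would be applied to both the transmit and receive apertures, consistent with minimizing reverberation and noise from elements outside the window.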
  • US transmit signals 407 are provided from the US imaging probe device 408 to the head 402
  • US receive signals 409 are provided from the head 402, and especially from the acoustic window of the head 402.
  • both transmit and receive apertures for ultrasound imaging are adapted to the located acoustic window in order to minimize the transmit reverberation and received noise signals from transducer elements outside the acoustic window.
  • US pulses with higher amplitudes and/or longer durations can be transmitted from the transducer elements only within the acoustic window for greater SNR.
  • devices, systems and methods of the present teachings facilitate locating an ultrasound device in a region of interest.
  • the present teachings provide a practical application of an ultrasound imaging system, device and method that does not require a highly skilled sonographer. Rather, and as alluded to above, in certain situations (e.g., emergencies) the ultrasound imaging system, device and method of the present teachings may allow acceptable US images to be gathered by an emergency medical technician (EMT) or an emergency room clinician who is not highly trained to conduct ultrasound imaging.
  • these benefits are illustrative, and other advancements in the field of medical imaging will become apparent to one of ordinary skill in the art having the benefit of the present disclosure.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Acoustics & Sound (AREA)
  • Pathology (AREA)
  • Public Health (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Biophysics (AREA)
  • Veterinary Medicine (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Radiology & Medical Imaging (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Gynecology & Obstetrics (AREA)
  • Neurology (AREA)
  • Ultra Sonic Diagnosis Equipment (AREA)

Abstract

A system (400), method and a tangible, non-transitory computer readable medium that stores instructions are described. Acoustic windows are located using depth-filtered, time-integrated photoacoustic signals (PAS (406)) based on the intensity/power levels of the PAS (406) dominantly originating from highly perfused tissues behind acoustic windows. The acoustic window can be displayed on a display (140) of the system (400), along with data reflecting which transducer elements (204) of an ultrasound transducer probe should be activated for scanning in the acoustic window. Transducer elements (204) identified to be outside the acoustic window are not activated, to avoid noise from reverberations or from transducer elements (204) outside the acoustic window.

Description

ULTRASOUND IMAGING SYSTEM AND METHOD FOR PHOTOACOUSTIC
DETECTION OF TRANSTEMPORAL ACOUSTIC WINDOWS IN TRANSCRANIAL
ULTRASOUND IMAGING
BACKGROUND
[0001] The quality of ultrasound images of certain anatomical parts is hindered by unacceptably high transmit reverberations between the individual transducers of the ultrasound probe and certain anatomical elements (e.g., bone surfaces). As a result, using certain known ultrasound probes and systems, gathering ultrasound images with acceptable quality can be difficult. This difficulty is especially acute when the probe is being used by an inexperienced user, who must differentiate real echoes from artifacts. For example, reverberations from bone tissue may be hard to distinguish from the tissue that is desirably imaged.
[0002] One common type of ultrasound imaging is transcranial color-coded duplex ultrasonography. This type of imaging is widely used due to its capability of assessing both the intracerebral vascular system and anatomical structures, either bone or parenchymal. Because it is a noninvasive and readily available method, transcranial color-coded duplex imaging can be used as a repeatable bedside tool to identify patients with compromised intracranial hemodynamics, already during the ultra-early phase of acute brain injury, thus providing important prognostic information for the clinician. Therefore, transcranial ultrasound has been developed into a point-of-care modality in pre-hospital emergency and in-hospital critical care settings. However, and as alluded to above, because clinicians in such situations are often not experienced sonographers, their ability to distinguish between images from tissue and those caused by reverberations from other anatomical components, such as bone, is compromised.
[0003] What is needed is a system and method that overcomes at least the noted drawbacks of known ultrasound devices, methods and systems set forth above.
SUMMARY
[0004] Applicants have discovered that one way of overcoming the above described drawbacks may be by the use of photoacoustic signals (PAS). One application of photoacoustics is illumination of anatomical tissue with optical energy, such as focused laser light, which stimulates the illuminated tissue to generate a mechanical response that may be detected by an acoustic transducer, such as an ultrasound probe. Systems and methods which exploit this feature are described herein.
[0005] According to one aspect of the present disclosure, a system for providing ultrasound images is described. The system comprises: a probe comprising a plurality of ultrasound transducer elements and a source of optical energy; a processor; and a tangible, non-transitory computer-readable medium that stores instructions. When executed by the processor, the instructions cause the processor to: activate the source of optical energy to illuminate an area of a body to stimulate photoacoustic signals (PAS). Notably, each of the plurality of ultrasound transducer elements does not emit ultrasound energy during the stimulation of the photoacoustic signals. The instructions further cause the processor to: receive the photoacoustic signals at the probe; determine a power of the PAS incident on the probe; and activate one or more of the plurality of ultrasound transducer elements only when the power of the PAS is greater than a threshold, or activate one or more of the plurality of ultrasound transducer elements in an area only when the power of the PAS is greater than the threshold.
[0006] According to yet another aspect of the present disclosure, a method of ultrasound imaging is disclosed. The method comprises: activating a source of optical energy to illuminate an area of a body to stimulate PAS. Notably, each of a plurality of transducer elements does not emit ultrasound energy during the stimulation of the PAS. The method further comprises receiving the photoacoustic signals at an ultrasound probe; determining a power of the PAS incident on the ultrasound probe; and activating one or more of the plurality of ultrasound transducer elements in a portion of the body only when the power of the PAS is greater than a threshold, or activating one or more of the plurality of transducers in an area of the portion of the body only when the power of the PAS is greater than the threshold.
[0007] According to another aspect of the present disclosure, a tangible, non-transitory computer-readable medium that stores instructions is disclosed. When executed by a processor, the instructions cause the processor to: activate a source of optical energy to illuminate an area of a body to stimulate PAS. Notably, each of a plurality of transducer elements does not emit ultrasound energy during the stimulation of the photoacoustic signals. The instructions further cause the processor to receive the photoacoustic signals at a probe; determine a power of the PAS incident on the probe; and activate one or more of the plurality of ultrasound transducer elements only when the power of the PAS is greater than a threshold, or activate one or more of the plurality of transducers in an area only when the power of the PAS is greater than the threshold.
BRIEF DESCRIPTION OF THE DRAWINGS
[0008] The representative embodiments are best understood from the following detailed description when read with the accompanying drawing figures. It is emphasized that the various features are not necessarily drawn to scale. In fact, the dimensions may be arbitrarily increased or decreased for clarity of discussion. Wherever applicable and practical, like reference numerals refer to like elements.
[0009] Fig. 1 is a simplified block diagram of an ultrasound (US) imaging system for imaging a portion of a body, according to a representative embodiment.
[0010] Figs. 2A-2C are perspective views of ultrasound transducer probe heads each comprising a plurality of ultrasound transducer elements and a light source used to generate photoacoustic signals (PAS) according to a representative embodiment.
[0011] Fig. 3A is an image locating a temporal acoustic window according to a representative embodiment.
[0012] Fig. 3B is an image showing manual placement of an ultrasound probe in a temporal region according to a representative embodiment.
[0013] Fig. 3C is an image of data from a simulated PAS scan provided on a display according to a representative embodiment.
[0014] Fig. 4A is a flow-chart of operation of a system for locating an acoustic window according to a representative embodiment.
[0015] Fig. 4B is a flow-chart of operation of a system for performing an ultrasound scan with an optimal aperture on the acoustic window according to a representative embodiment.
DETAILED DESCRIPTION
[0016] In the following detailed description, for the purposes of explanation and not limitation, representative embodiments disclosing specific details are set forth in order to provide a thorough understanding of an embodiment according to the present teachings. Descriptions of known systems, devices, materials, methods of operation and methods of manufacture may be omitted so as to avoid obscuring the description of the representative embodiments. Nonetheless, systems, devices, materials and methods that are within the purview of one of ordinary skill in the art are within the scope of the present teachings and may be used in accordance with the representative embodiments. It is to be understood that the terminology used herein is for purposes of describing particular embodiments only and is not intended to be limiting. The defined terms are in addition to the technical and scientific meanings of the defined terms as commonly understood and accepted in the technical field of the present teachings.
[0017] It will be understood that, although the terms first, second, third, etc. may be used herein to describe various elements or components, these elements or components should not be limited by these terms. These terms are only used to distinguish one element or component from another element or component. Thus, a first element or component discussed below could be termed a second element or component without departing from the teachings of the inventive concept.
[0018] The terminology used herein is for purposes of describing particular embodiments only and is not intended to be limiting. As used in the specification and appended claims, the singular forms of terms “a,” “an” and “the” are intended to include both singular and plural forms, unless the context clearly dictates otherwise. Additionally, the terms “comprises,” “comprising,” and/or similar terms specify the presence of stated features, elements, and/or components, but do not preclude the presence or addition of one or more other features, elements, components, and/or groups thereof. As used herein, the term “and/or” includes any and all combinations of one or more of the associated listed items.
[0019] As used in the specification and appended claims, and in addition to their ordinary meanings, the term ‘approximately’ means within acceptable limits or degree. For example, “approximately 2 MHz” means one of ordinary skill in the art would consider the signal to be 2 MHz within reasonable measure.
[0020] As used in the specification and appended claims, in addition to their ordinary meanings, the term ‘substantially’ means within acceptable limits or degree. For example, the “plurality of transducer elements are substantially the same” means one of ordinary skill in the art would consider the plurality of transducer elements to be the same.
[0021] As will become clearer as the present description continues, various representative embodiments are directed to transcranial ultrasound imaging. As alluded to above, using known US imaging devices and systems, it is challenging, if not impossible, to garner sufficient quality US images beyond the comparatively shallow cortical layers of the human brain because the optical illumination path is mainly affected by the cortical layers in the form of optical energy attenuation due to the higher absorption, scattering, and reflection of these layers. As will be described more fully below, PA signals are adequately generated inside the human brain (mostly around “shallow” cortical layers behind the temporal bone) and can thus be used for two-dimensional PA sensing to locate a transtemporal acoustic window (often referred to herein as the ‘temporal acoustic window’ or the ‘acoustic window’). As such, by the present teachings, PA sensing utilizes adequate PA signals from the cortical layers as “acoustic sources” for receive-only US detection of 2D acoustic transparency of the temporal bone.
[0022] Moreover, in typical US imaging using US pulses, strong US reverberations occur within the intervening tissue layers (skin, muscle layer and compact temporal bone). However, by the present teachings, such transmit reverberations can be substantially reduced when acoustic signals are dominantly generated inside the brain via photoacoustic illumination. Owing to the comparatively high specificity of the photoacoustic effect in hemoglobin, PA generated signals in shallow intervening tissue layers (skin, muscle layer and compact temporal bone window, all with minimal to low perfusion) are much weaker than those from deeper highly-perfused brain tissues. The PA signals generated in these shallow intervening tissue layers include reflected PA signals within the shallow intervening tissue layers outside the acoustic window. As described more fully below, in accordance with a representative embodiment PA signals outside the acoustic window can be prevented from being displayed during the process of locating the acoustic window.
[0023] Finally, it is noted that while various representative embodiments are directed to locating an acoustic window in the region of the human temple, the present teachings are not limited to transcranial US imaging. Common to the application of the present teachings to other portions of the body is the incidence of strong reverberations in comparatively low-perfused tissues, with comparatively high-perfused regions behind the low-perfused regions.
[0024] Beneficially, by the present teachings, a transtemporal acoustic window (a thin area of the temporal bone that allows ultrasound signals to penetrate) is identified using PA signals. Not only do the system and method of various representative embodiments improve the ease of taking ultrasound images of the brain, but they also enable good-quality ultrasound imaging by operators who may not be experts.
[0025] Fig. 1 is a simplified block diagram of an ultrasound imaging system 100 for imaging a region of interest of a subject, according to a representative embodiment.
[0026] Referring to Fig. 1, the ultrasound imaging system 100 comprises an imaging device 110 and a computer system 115 for controlling imaging of a region of interest in a patient 105 on a table 106. The imaging device 110 is illustratively an ultrasound imaging probe having a plurality of transducer elements (“elements” not shown in Fig. 1) for conducting US imaging scans of a region of interest (ROI) of the patient 105. As described more fully below, the imaging device 110 also comprises one or more sources of optical energy for stimulating photoacoustic signals (PAS) in the body to facilitate locating an acoustic window to be displayed (e.g., on display 140) by the operator of the imaging device 110. As noted above, and as will become clearer as the present description continues, the region of interest is illustratively the temporal acoustic window in the temporal region of the head of patient 105. It is emphasized that the application of the present teachings is not limited to identifying the temporal acoustic window in the temporal region of the head of patient 105, and other portions of the body of the patient 105 are contemplated for US imaging according to the various systems, devices and methods of the present teachings. For example, and again just by way of illustration, another application of the systems, devices and methods of various representative embodiments is contemplated in the comparatively highly perfused liver behind the rib cage. The present teachings can thus be applied to identify a region between the ribs (i.e., an acoustic window) where US imaging of the underlying liver is desired.
[0027] The computer system 115 receives image data from the imaging device 110, and stores and processes the imaging data according to representative embodiments described herein. The computer system 115 comprises a controller 120, a memory 130, a display 140 comprising a graphical user interface (GUI) 145, and a user interface 150. The display 140 may also include a loudspeaker (not shown) to provide audible feedback.
[0028] The memory 130 stores instructions executable by the controller 120. When executed, and as described more fully below, the instructions cause the controller 120 to allow the user to perform different steps using the GUI 145 or the user interface 150, or both, and to initialize an ultrasound imaging device comprising a transducer. Notably, and among other functions, the GUI 145 and the display 140, or the user interface 150 and the display 140, are used to select the desired time-based parameter desired to be reviewed by the clinician or sonographer. In addition, the controller 120 may implement additional operations based on executing instructions, such as instructing or otherwise communicating with another component of the computer system 115, including the memory 130 and the display 140, to perform one or more of the above-noted processes.
[0029] The memory 130 may include a main memory and/or a static memory, where such memories may communicate with each other and the controller 120 via one or more buses. The memory 130 stores instructions used to implement some or all aspects of methods and processes described herein.
[0030] The memory 130 may be implemented by any number, type and combination of random access memory (RAM) and read-only memory (ROM), for example, and may store various types of information, such as software algorithms, which serve as instructions that, when executed by a processor, cause the processor to perform various steps and methods according to the present teachings. Furthermore, updates to the methods and processes described herein may also be provided to the computer system 115 and stored in memory 130.
[0031] The various types of ROM and RAM may include any number, type and combination of computer readable storage media, such as a disk drive, flash memory, an electrically programmable read-only memory (EPROM), an electrically erasable and programmable read only memory (EEPROM), registers, a hard disk, a removable disk, tape, compact disk read only memory (CD-ROM), digital versatile disk (DVD), floppy disk, Blu-ray disk, a universal serial bus (USB) drive, or any other form of storage medium known in the art. The memory 130 is a tangible storage medium for storing data and executable software instructions, and is non-transitory during the time software instructions are stored therein. As used herein, the term “non-transitory” is to be interpreted not as an eternal characteristic of a state, but as a characteristic of a state that will last for a period. The term “non-transitory” specifically disavows fleeting characteristics such as characteristics of a carrier wave or signal or other forms that exist only transitorily in any place at any time. The memory 130 may store software instructions and/or computer readable code that enable performance of various functions. The memory 130 may be secure and/or encrypted, or unsecure and/or unencrypted.
[0032] “Memory” is an example of computer-readable storage media, and should be interpreted as possibly being multiple memories or databases. The memory or database for instance may be multiple memories or databases local to the computer, and/or distributed amongst multiple computer systems or computing devices, or disposed in the ‘cloud’ according to known components and methods. A computer readable storage medium is defined to be any medium that constitutes patentable subject matter under 35 U.S.C. §101 and excludes any medium that does not constitute patentable subject matter under 35 U.S.C. §101. Examples of such media include tangible non-transitory media such as computer memory devices that store information in a format that is readable by a computer or data processing system. More specific examples of non-transitory media include computer disks and non-volatile memories. Notably, described below in connection with Figs. 4A-4B are “modules” for carrying out different functions according to the present teachings. These modules comprise executable instructions, which, when executed by a processor, cause the processor to carry out the various methods and functions of the representative embodiments.
[0033] The controller 120 is representative of one or more processors 121, and is configured to execute software instructions stored in memory 130 to perform functions as described in the various embodiments herein. Notably, one or more of the processors 121 may be separate from the controller and may be dedicated to carrying out certain functions.
[0034] The controller 120 may be implemented by field programmable gate arrays (FPGAs), application specific integrated circuits (ASICs), systems on a chip (SOC), a general purpose computer, a central processing unit, a computer processor, a microprocessor, a graphics processing unit (GPU), a microcontroller, a state machine, programmable logic device, or combinations thereof, using any combination of hardware, software, firmware, hard-wired logic circuits, or combinations thereof. Additionally, any processing unit or processor herein may include multiple processors, parallel processors, or both. Multiple processors may be included in, or coupled to, a single device or multiple devices.
[0035] The term “processor” as used herein encompasses an electronic component able to execute a program or machine executable instruction. References to a computing device comprising “a processor” should be interpreted to include more than one processor or processing core, as in a multi-core processor. A processor may also refer to a collection of processors within a single computer system or distributed among multiple computer systems, such as in a cloud-based or other multi-site application. The term computing device should also be interpreted to include a collection or network of computing devices each including a processor or processors. Modules have software instructions to carry out the various functions using one or multiple processors that may be within the same computing device or which may be distributed across multiple computing devices.
[0036] The display 140 may be a monitor such as a computer monitor, a television, a liquid crystal display (LCD), a light emitting diode (LED) display, a flat panel display, a solid-state display, or a cathode ray tube (CRT) display, or an electronic whiteboard, for example. The display 140 may also provide a graphical user interface (GUI) 145 for displaying information to and receiving information from the user. As described more fully below, among other functions, the display 140 shows regions where the intensity/power of the PAS is comparatively high, and where the acoustic window may be located. Notably, and again as described more fully below, the identification of the acoustic window can be made readily by reviewing the image displayed, and identifying connected pixels on the display with a comparatively high intensity/power.
[0037] The user interface 150 may include a user and/or network interface for providing information and data output by the controller 120 and/or the memory 130 to the user and/or for receiving information and data input by the user. That is, the user interface 150 enables the user to operate the imaging device as described herein, and to schedule, control or manipulate aspects of the ultrasound imaging system 100 of the present teachings. Notably, the user interface 150 enables the controller 120 to indicate the effects of the user’s control or manipulation. The user interface 150 may include one or more of ports, disk drives, wireless antennas, or other types of receiver circuitry. The user interface 150 may further connect to one or more interface devices, such as a mouse, a keyboard, a trackball, a joystick, a microphone, a video camera, a touchpad, a touchscreen, or voice or gesture recognition captured by a microphone or video camera, for example.
[0038] Notably, the controller 120, the memory 130, the display 140, the GUI 145 and the user interface 150 may be located away from (e.g., in another location of a building, or another building) the imaging device 110 operated by a sonographer. The controller 120, the memory 130, the display 140, the GUI 145 and the user interface 150 may be, for example, located where the radiologist/clinician is located. Notably, however, additional controllers, memories, displays, GUIs and user interfaces may be located near the sonographer and are useful in effecting the various functions of the imaging device 110 needed to complete the US scans contemplated by the present teachings. Finally, in a more general sense, the ultrasound imaging system 100 may comprise a source of ultrasound signal data from a US examination. In such an embodiment, the computer system 115 may be connected to the source of these ultrasound signal data (e.g., at a remote location from the computer system 115) from the US examination to receive these data, but the ultrasound imaging system 100 does not include the imaging device 110 or the table 106.
[0039] Figs. 2A-2C are perspective views of ultrasound transducer probe heads each comprising a plurality of ultrasound transducer elements and a light source used to generate photoacoustic signals (PAS) according to a representative embodiment. As will be appreciated, the ultrasound probe heads shown in Figs. 2A-2C are contemplated for use in the imaging device 110 shown and described in connection with representative embodiments of Fig. 1. It is emphasized that the light sources and their placement on the transducer probe head, and the transducer elements described below, are merely illustrative, and other light sources and placements are contemplated. More generally, laser elements, LED elements or optical fibers/optical fiber bundles adapted to transmit optical energy to induce PA signals may be distributed among and/or next to US transducer elements. Furthermore, both optical fiber bundles and LED arrays with various bundle/array distribution shapes (rectangle, square, round, etc.) are contemplated for generating PA signals. Just by way of illustration and not limitation, a light generator may comprise: a commercially available 10 Hz Nd:YAG tunable laser having a wavelength in the range 690 nm to 900 nm; or four (4) silica fiber bundles comprising a total number of optical fibers (e.g., 100 optical fibers) with an average output energy of around 20 mJ. Alternatively, current LED technology contemplated for use in connection with the representative embodiments includes LEDs that provide discrete near-infrared light having a wavelength in the range of 690 nm to 980 nm at up to 200 µJ per pulse with 100-nsec pulse duration and 200 Hz PRF (pulse repetition frequency). Just by way of illustration, an imaging device 110 may comprise a commercially available 4-channel pulsed LED source.
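As a quick sanity check on the LED figures quoted above (reading them as 200 µJ per pulse, 100 nsec pulse duration and 200 Hz PRF), the implied duty cycle and average and peak optical powers work out as follows. This is an illustrative back-of-envelope calculation only, not part of the disclosure:

```python
# Back-of-envelope check of the LED pulse figures quoted above, read as
# 200 uJ per pulse, 100 ns pulse duration, 200 Hz PRF (illustrative values).
energy_per_pulse_j = 200e-6          # 200 uJ
pulse_duration_s = 100e-9            # 100 ns
prf_hz = 200                         # pulse repetition frequency

avg_power_w = energy_per_pulse_j * prf_hz             # average optical power
duty_cycle = pulse_duration_s * prf_hz                # fraction of time emitting
peak_power_w = energy_per_pulse_j / pulse_duration_s  # power during a pulse

print(avg_power_w, duty_cycle, peak_power_w)  # ~0.04 W, ~2e-05, ~2000 W
```

The average power (tens of milliwatts) is modest even though the peak power during each 100 ns pulse is on the order of kilowatts, which is consistent with the "moderately intense" characterization used throughout this description.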
[0040] Notably, certain details and aspects of the ultrasound imaging system 100 described above may be common to the ultrasound transducer probe head described in connection with Figs. 2A-2C. These common details and aspects may not be repeated to avoid obscuring the description of the currently described representative embodiments.
[0041] Fig. 2A shows an US transducer probe head 202 in accordance with a representative embodiment. The US transducer probe head 202 comprises a plurality of transducer elements 204 disposed at the surface of the US transducer probe head 202 that transmit and receive US signals. The US transducer probe head 202 further comprises a light source 206 that is used to generate the PA signals used to locate an acoustic window in accordance with various representative embodiments.
[0042] Fig. 2B shows an US transducer probe head 202 in accordance with another representative embodiment. The US transducer probe head 202 comprises a plurality of transducer elements 204 disposed at the surface of the US transducer probe head 202 that transmit and receive US signals. The US transducer probe head 202 also comprises a light source 208 that transmits optical energy to generate PA signals used to locate an acoustic window in accordance with representative embodiments.
[0043] Fig. 2C shows an US transducer probe head 202 in accordance with another representative embodiment. The US transducer probe head 202 comprises a plurality of transducer elements 204 disposed at the surface of the US transducer probe head 202 that transmit and receive US signals. The US transducer probe head 202 also comprises light sources 210, 212, 214 and 216 disposed at locations along the perimeter of an array of transducer elements 204. These light sources 210, 212, 214 and 216 generate PA signals used to locate an acoustic window in accordance with representative embodiments described more fully below.
[0044] The light pulses from the various light sources of the representative US transducer probe heads 202 of Figs. 2A-2C are moderately intense (providing light energy in the range of approximately 2 mJ to approximately 20 mJ) and are repetitively applied to illuminate the comparatively shallow regions (e.g., cortical layers) inside the brain through all intervening materials (skin, muscle layer and temporal bone window). Just by way of illustration, the light sources provide light at repetition frequencies in the range of approximately 5 Hz to approximately 500 Hz.
[0045] Light transmitted by the light sources of the various representative embodiments passes through the skin and muscle layers, the temporal bone, and into the brain tissue. As is known, PA signals generated in regions of the body that have comparatively high blood perfusion are of comparatively high intensity. By contrast, the bone areas where the perfusion is comparatively low provide comparatively low intensity PA signals. As such, in accordance with a representative embodiment, optical signals that are transmitted by the light sources (e.g., LEDs, optical fibers) of the US transducer probe head 202 to regions of the body having larger blood vessels, such as the surface of the brain, generate PA signals that are comparatively intense. By contrast, optical signals that are transmitted by the light sources of the US transducer probe head 202 to regions where the perfusion is low (e.g., the bone, muscle tissue and the skin) generate comparatively low intensity PA signals.
[0046] As will be described more fully below, and among other functions, the PA signals having comparatively high intensity are shown on the display 140 to allow the operator of the imaging device 110 to readily determine the acoustic window where acceptable US imaging can be done. Moreover, in certain embodiments, the comparatively low intensity PA signals from regions of low perfusion may be selectively filtered out and not shown on the display 140 so as to avoid obscuring the US signals from the acoustic window. Finally, and as described more fully below, in accordance with another representative embodiment, the transducer elements 204 that are located in regions where the PA signals have a comparatively high intensity (i.e., regions with comparatively high perfusion) are activated to receive US signals reflected from that portion of the body, and transducer elements 204 that are in regions where the PA signals have a comparatively low intensity (i.e., regions with comparatively low perfusion) are not activated, to avoid obscuring the US images with US signals that can reduce the clarity of the resultant US images. As noted below, operation of the imaging device 110 in which transducer elements 204 located in regions where the PA signals have comparatively high intensity are activated, and in which transducer elements 204 located in regions where the PA signals have comparatively low intensity are not activated, may be referred to as operation with an “optimal aperture.” As will be appreciated, by activating only selected transducer elements 204, images taken using the optimal aperture have improved quality and comparatively high SNR because the imaging device 110 is generating US signals only in the temporal acoustic window.
[0047] Fig. 3A is an image showing a transtemporal acoustic window 302 on the temporal bone identified by a system, device and method according to a representative embodiment.
Notably, certain details and aspects of the ultrasound imaging system 100 and the US transducer probe heads 202 described above in connection with Figs. 1-2C may be common to the ultrasound transducer probe head described in connection with Figs. 3A-3C. These common details and aspects may not be repeated to avoid obscuring the description of the currently described representative embodiments.
[0048] Referring to Fig. 3A, and as can be seen, the transtemporal acoustic window 302 is a region where PAS are received from regions of comparatively high perfusion. It is in this region that transtemporal imaging provides the highest quality images, as the transtemporal acoustic window 302 provides a better level of transparency than other regions. For example, in a region 304 outside the transtemporal acoustic window 302, where bone, tissue and skin exist in greater thickness, the transparency is lower. Because of comparatively strong transmit US reverberations between the ultrasound transducer and temporal bone surfaces in regions such as region 304, it can be difficult to differentiate echoes from the tissues between the ultrasound transducer probe and the bone caused by these near-field reverberations from those caused by comparatively high perfusion regions deeper inside the brain. According to various representative embodiments noted above and discussed more fully below, these undesired transmit US reverberations can be substantially reduced when acoustic signals are generated inside the brain using PAS generated prior to the US imaging procedure. As such, using PA sensing to aid in identifying acoustically transparent windows can enable the choice of aperture shapes that are tailored to these windows, thereby resulting in better conventional US images. Accordingly, as shown in Fig. 3B, the imaging device 110 can be properly located at a selected acoustic window by an operator, even an operator who is not an expert sonographer.
[0049] Fig. 3C shows a PAS intensity/power map provided on a display (e.g., display 140) according to a representative embodiment. As noted above, in regions of comparatively high perfusion, the intensity/power of the PAS signals is greatest, and locates an acoustic window. So, in Fig. 3C, in region 310, where the intensity of the PAS signals is comparatively high, an operator can locate the imaging device 110 (for example as shown in Fig. 3B). By contrast, in region 312 the intensity of the PAS signals is comparatively low, indicating a region where the perfusion is comparatively low, and thus not a desirable location to conduct a US scan. As described more fully below, in accordance with a representative embodiment an intensity threshold may be used to differentiate between regions of high PAS signal intensity and low PAS signal intensity.
[0050] In accordance with a representative embodiment described more fully below, a filtering function can be applied so that PAS having an intensity/power above the threshold are shown on the display 140, and PAS having an intensity/power below the threshold are not shown on the display 140. To this end, as noted above, PA signals generated within the comparatively highly-perfused brain tissues located deeper (i.e., behind the temporal bone) are desirably used to determine the location of the acoustic window, and PA signals generated in shallow, intervening tissue layers can be effectively removed by the time filter. This time filter is also referred to as a range filter, where the range is given by the time of propagation of the PA signal times the speed of sound in the medium (e.g., tissue, bone, etc. in the shallow regions). As such, the PA signals generated within shallow intervening tissue layers arrive at the transducer surface earlier in time than the PA signals generated within highly-perfused brain tissues behind the temporal bone. In accordance with a representative embodiment, these undesired PA signals from the shallow intervening tissue layers can be identified and removed by the processor 121 so as not to be displayed on the display 140. Ultimately, this provides an improved SNR because the PA signals generated in shallow intervening tissue layers, which arrive at the transducer surface and are received by transducer elements 204 earlier than the PA signals generated within highly-perfused brain tissues behind the temporal bone, can be eliminated.
As such, by eliminating the undesired PA signals received in the entire region (including the regions 310 and 312), the PAS intensity/power levels from PA signals from the comparatively highly-perfused brain tissues located deeper (i.e., behind the temporal bone), which are transmitted and provided on the display 140, clearly show a good location for the imaging device 110 in the acoustic window, and provide US images that have a comparatively high signal-to-noise ratio (SNR). This is because the imaging device 110 is not located in regions where reverberations can result in undesired noise.
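The time (range) filter described above can be sketched in a few lines. Because the PA sources emit once and the signal travels one way to the transducer, range is simply arrival time times the speed of sound; samples arriving before the time of flight from the gating depth are suppressed. The sampling rate, speed of sound and gating depth below are assumed illustrative values, not taken from the disclosure:

```python
def range_gate(samples, fs_hz=20e6, c_m_per_s=1540.0, min_depth_m=0.015):
    """Zero out PA samples arriving earlier than the one-way time of flight
    from min_depth_m (i.e., PAS generated in shallow intervening tissue)."""
    # One-way travel for PAS: range = arrival time * speed of sound.
    t_min = min_depth_m / c_m_per_s      # earliest acceptable arrival time
    n_min = int(t_min * fs_hz)           # corresponding sample index
    return [0.0 if i < n_min else s for i, s in enumerate(samples)]

# Example: a 400-sample A-line of unit amplitude; the early (shallow) part
# of the line is suppressed and the deeper part is kept.
gated = range_gate([1.0] * 400)
```

A full implementation would gate the raw per-element RF data before any intensity/power computation, so that shallow-tissue PAS never contribute to the displayed map.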
[0051] As noted above, and as described more fully below, US imaging occurs after the PAS map is determined. In accordance with a representative embodiment, when the PAS intensity/power map is known, the processor 121 may be adapted to determine which transducer elements 204 are located in region 310 and to activate these transducer elements to provide comparatively high quality images. By contrast, the processor 121 may be adapted to determine which transducer elements 204 are located in region 312, and not to activate these transducer elements 204. In this way, using the PAS intensity/power map of Fig. 3C, US signals are not provided to regions where reverberations may cause noise that degrades the US image quality and reduces the SNR.
[0052] It is further noted that not all regions of the PAS intensity/power map having comparatively high intensity/power need to be used in determining which transducer elements 204 are to be activated, and which are not. To this end, in the region 310, the pixels on the display 140, and thus the regions of high PAS intensity/power, are connected. Such connections facilitate the activation of the transducer elements 204 in regions where the PAS intensity/power is sufficient to activate certain transducer elements 204. However, in other regions (e.g., regions 314, 316, 318) that are not connected or that are loosely connected, the processor 121 is adapted not to activate the transducer elements 204 in these regions.
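The "connected pixels" criterion might, for example, be implemented as a connected-component pass over the thresholded intensity/power map, keeping only the largest connected region and discarding isolated or loosely connected pixels such as regions 314, 316 and 318. The following is a minimal pure-Python sketch under those assumptions; a production system would likely use an optimized image-processing library instead:

```python
from collections import deque

def largest_connected_region(mask):
    """Keep only the largest 4-connected region of True pixels in a 2D
    boolean mask; smaller, disconnected regions are dropped."""
    rows, cols = len(mask), len(mask[0])
    labels = [[0] * cols for _ in range(rows)]
    sizes = {}
    label = 0
    for r in range(rows):
        for c in range(cols):
            if mask[r][c] and labels[r][c] == 0:
                label += 1
                sizes[label] = 0
                queue = deque([(r, c)])
                labels[r][c] = label
                while queue:                      # breadth-first flood fill
                    y, x = queue.popleft()
                    sizes[label] += 1
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and mask[ny][nx] and labels[ny][nx] == 0):
                            labels[ny][nx] = label
                            queue.append((ny, nx))
    if not sizes:
        return [[False] * cols for _ in range(rows)]
    best = max(sizes, key=sizes.get)
    return [[labels[r][c] == best for c in range(cols)] for r in range(rows)]

# A toy thresholded map: one 4-pixel connected region (kept) and one
# isolated pixel (dropped), loosely analogous to region 310 vs. 314-318.
mask = [[True,  True,  False, False, False],
        [True,  True,  False, False, True ],
        [False, False, False, False, False]]
kept = largest_connected_region(mask)
```

Only the elements inside the surviving region would then be eligible for activation; whether to keep one region or all regions above some minimum size is a design choice not specified here.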
[0053] Fig. 4A is a flow chart of operation of a system 400 for locating an acoustic window according to a representative embodiment. Certain details and aspects of the various representative embodiments described above in connection with Figs. 1-3C may be common to the ultrasound transducer probe head described in connection with Fig. 4A. These common details and aspects may not be repeated to avoid obscuring the description of the currently described representative embodiments.
[0054] Turning to Fig. 4A, a head 402 is disposed near an US imaging probe device 408 comprising an US transducer probe head that comprises a plurality of transducer elements and sources of visible or invisible light (often referred to herein as ‘light’). By way of illustration, the US imaging probe device 408 is substantially the same as the US transducer probe head 202 described above.
[0055] Initially, the US transducer elements of the US imaging probe device 408 are not active in transmit mode but are active in a passive receive mode of operation, and light 404 (e.g., near-infrared radiation) is incident on the head 402. This light is provided by a source 410, which may comprise light sources such as described above. Keeping with the previously described applications of the present teachings, the US imaging probe device 408 is disposed so the US imaging probe device 408 is located near the temple for locating the temporal acoustic window. Again, it is noted that the present teachings are not limited to this application.
[0056] According to a representative embodiment, light 404 is pulsed and induces PAS by the photoacoustic effect as discussed above. PA signals 406 are transmitted from the regions of generation in the head and are incident on transducer elements of the US imaging probe device 408, as described above.
[0057] The signals generated by the PAS incident on transducer elements of the US probe head are provided to a US transmit/receive unit 412, which is an ultrasound beamforming unit known to those of ordinary skill in the art. As alluded to above, and as described below as well, in the presently described representative embodiment, in the mode of operation of Fig. 4A, the US transmit/receive unit 412 acts only in a passive manner, receiving only PA signals 406 incident on the US imaging probe device 408. As such, in the mode of operation of the system 400 shown in Fig. 4A, the US transmit/receive unit 412 is adapted to receive electrical signals generated by the transducer elements of the US imaging probe device 408, the electrical signals in turn responsive to the received PAS 406. Notably, in the presently described representative embodiment, all transducer elements of the US imaging probe device 408 are adapted to receive PAS. As such, in this mode the US transducer probe is operating in “full aperture mode.”
[0058] The PAS 406 are provided from the US transmit/receive unit 412 to the controller 120 to carry out the range-filtering function as described above. Specifically, range gate instructions are stored in a range filtering instructions module 416 in memory 130. The range filter instructions are executed by the processor 121 and cause the processor 121 to discard PAS having an intensity/power that is below a threshold and save PAS having an intensity/power above the threshold in the memory 130. These data are used not only to provide a visual representation of the intensity/power of PAS received from the US imaging probe device 408, but as described below, also are used to determine which transducer elements in the US imaging probe device 408 are to be activated (and which should not be activated) during the ultrasound imaging sequence described in connection with Fig.
4B when the US imaging probe device 408 is operating in the “optimal aperture mode.” Notably, when operating in “optimal aperture mode,” the US imaging probe device 408 may be referred to as being apodized.
[0059] The data stored in memory 130 from the executed range filtering instructions are then summed. Specifically, summation instructions are stored in a summation instructions module 418 in memory 130. The summation instructions are executed by the processor 121 and provide data for display by execution of instructions by the processor 121 stored in a display instructions module 420. As noted above, these data are the sum of the intensity/power at each transducer element of the US imaging probe device 408 from the PAS from the head 402, and by execution of the display instructions, the processor 121 provides commands to display the acoustic window by the intensity/power at each transducer element on the display 140. In one representative embodiment, summation instructions from the summation instructions module 418 cause the processor 121 to integrate the PAS received from the execution of the range filtering instructions that are above the set threshold for each US transducer element over time. Accordingly, the total PA intensity/power and acoustic window size (number of transducer elements having adequate PA sense signals) are also calculated for each intensity/power map obtained using the illustrative method.
[0060] The summed intensity/power data are then provided to the display 140, which is adapted to show the regions of the US imaging probe device 408 having comparatively high intensity/power PAS signals in order to determine not only where the acoustic window is located on the head 402, but also, as described below, to selectively activate only transducer elements in the US imaging probe device 408 in the acoustic window during the subsequent US imaging procedure in “optimal aperture” mode. As such, with the sum of the PAS at each transducer element of the US imaging probe device 408 determined, a measure of the intensity/power of the PAS at each transducer element is displayed in a 2D map, such as shown in Fig. 3C. This 2D map shows the user where the acoustic window is.
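The summation (integration) step described above can be sketched as follows. The per-element sample layout and the threshold value are illustrative assumptions, not taken from the disclosure; in practice the samples would be the range-gated RF data for each element:

```python
def pas_power_map(rf, threshold=0.0):
    """Integrate PAS power per transducer element over time.
    rf[r][c] holds the (range-gated) samples received at element (r, c);
    samples whose magnitude does not exceed `threshold` are ignored."""
    return [[sum(s * s for s in rf[r][c] if abs(s) > threshold)
             for c in range(len(rf[r]))]
            for r in range(len(rf))]

# A toy 2x2 element array with 3 samples per element (illustrative values).
rf = [[[0.1, 2.0, 2.0], [0.0, 0.1, 0.0]],
      [[1.0, 1.0, 1.0], [0.0, 0.0, 0.0]]]
power_map = pas_power_map(rf, threshold=0.5)   # -> [[8.0, 0], [3.0, 0]]
```

The resulting 2D map is what would be rendered on the display, and the count of elements with nonzero (or above-threshold) integrated power corresponds to the acoustic window size mentioned above.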
[0061] Fig. 4B is a flow chart of operation of a system 400 for performing an US imaging procedure according to a representative embodiment. Certain details and aspects of the various representative embodiments described above in connection with Figs. 1-4A may be common to the ultrasound transducer probe head described in connection with Fig. 4B. These common details and aspects may not be repeated to avoid obscuring the description of the currently described representative embodiments.
[0062] During operation of the system 400 in the method of Fig. 4B, the light sources of the US imaging probe device 408 are not active in this mode, and no light illumination is transmitted to the head by the US imaging probe device 408. Moreover, during operation according to the representative embodiment of Fig. 4B, the range filtering instructions module 416 and the summation instructions module 418 are dormant, as these are used to determine the acoustic window as described above in connection with Fig. 4A. However, the data determined at summation instructions module 418 are provided by the controller 120 to an US image processing unit 414. Notably, the US image processing unit 414 may comprise executable instructions stored in memory 130. Alternatively, the US image processing unit 414 may comprise firmware, such as an application specific integrated circuit (ASIC) or a field programmable gate array (FPGA).
[0063] When executed by the processor 121, these instructions cause the US transmit/receive unit 412 to activate transducer elements in the US imaging probe device 408 that are in the acoustic window determined as described above. As such, based on data generated from the display instructions module 420, the US image processing unit 414 determines which transducer elements are located in the acoustic window and which transducer elements are not located in the acoustic window. These data are provided to the US transmit/receive unit 412, which in turn controls the US imaging probe device 408 with control instructions. The control instructions cause transducer elements in the acoustic window to be activated and transducer elements outside the acoustic window not to be activated. So, the data acquired from the PAS 406 are used not only to display the intensity/power level at each transducer element of the US imaging probe device 408, but also to determine which transducer elements of the US imaging probe device 408 are disposed in the acoustic window (and are thus activated), and which are not disposed in the acoustic window (and thus are not activated) during operation in the optimal aperture mode. As such, the US image processing unit 414 is adapted to send control signals to activate the transducer elements of the US transducer probe based on whether the intensity of the PAS signal on each transducer element is greater than the threshold. Accordingly, after each transducer element of the US transducer probe is matched to an intensity/power value gathered from the integrated PAS at each transducer element, the US image processing unit 414 provides control signals to the US transmit/receive unit 412 to carry out imaging through the acoustic window located by the system 400.
[0064] In other representative embodiments, the data from the display instructions module 420 can provide an apodization map to the US transmit/receive unit 412 directly. Notably, the apodization map shows which transducer elements are turned on and which are turned off in the “optimal aperture” mode.
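A minimal sketch of such an apodization map, assuming a per-element integrated PAS power map and a scalar threshold (both hypothetical values); elements marked 1 are activated in "optimal aperture" mode and elements marked 0 are turned off:

```python
def apodization_map(power_map, threshold):
    """Per-element on/off transmit-receive weights: elements whose
    integrated PAS power exceeds `threshold` lie in the acoustic window."""
    return [[1 if p > threshold else 0 for p in row] for row in power_map]

# Hypothetical integrated-power values for a 2x2 element array.
apod = apodization_map([[8.0, 0.4], [3.0, 0.0]], threshold=1.0)
# apod -> [[1, 0], [1, 0]]: only the left column of elements is activated.
```

A binary on/off map is the simplest choice; a refinement (not described in the disclosure) could taper the weights near the window edge to reduce aperture-edge artifacts.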
[0065] Once the combination probe is positioned on the acoustic window, ultrasound pulse-echo imaging is switched on and the US imaging procedure is carried out. As shown, US transmit signals 407 are provided from the US imaging probe device 408 to the head 402, and US receive signals 409 are provided from the head 402, and especially from the acoustic window of the head 402.
[0066] By selection of the transducer elements to activate within an acoustic window based on the PAS intensity/power at each transducer element, both transmit and receive apertures for ultrasound imaging are adapted to the located acoustic window in order to minimize the transmit reverberation and received noise signals from transducer elements outside the acoustic window. US pulses with higher amplitudes and/or longer durations can be transmitted from the transducer elements only within the acoustic window for greater SNR.
[0067] As will be appreciated by one of ordinary skill in the art having the benefit of the present disclosure, devices, systems and methods of the present teachings facilitate locating an ultrasound device in a region of interest. For example, compared to known methods and systems, the present teachings provide a practical application of an ultrasound imaging system, device and method that does not require a highly skilled sonographer. Rather, and as alluded to above, in certain situations (e.g., emergencies) the ultrasound imaging system, device and method of the present teachings may allow acceptable US images to be gathered by an emergency medical technician (EMT) or an emergency room clinician who is not highly trained to conduct ultrasound imaging. Notably, these benefits are illustrative, and other advancements in the field of medical imaging will become apparent to one of ordinary skill in the art having the benefit of the present disclosure.
[0068] Although methods, systems and components for implementing imaging protocols have been described with reference to several exemplary embodiments, it is understood that the words that have been used are words of description and illustration, rather than words of limitation. Changes may be made within the purview of the appended claims, as presently stated and as amended, without departing from the scope and spirit of the protocol implementation of the present teachings. The preceding description of the disclosed embodiments is provided to enable any person skilled in the art to practice the concepts described in the present disclosure. As such, the above disclosed subject matter is to be considered illustrative, and not restrictive, and the appended claims are intended to cover all such modifications, enhancements, and other embodiments which fall within the true spirit and scope of the present disclosure. Thus, to the maximum extent allowed by law, the scope of the present disclosure is to be determined by the broadest permissible interpretation of the following claims and their equivalents and shall not be restricted or limited by the foregoing detailed description.


CLAIMS:
1. A system (400) for ultrasound imaging, comprising:
a probe comprising a plurality of ultrasound transducers and a source (410) of optical energy;
a processor (121); and
a tangible, non-transitory computer-readable medium that stores instructions, which when executed by the processor (121) cause the processor (121) to:
activate the source (410) of optical energy to illuminate an area of a body to stimulate photoacoustic signals (PAS (406)), wherein each of the plurality of ultrasound transducers does not emit ultrasound energy during the stimulation of the photoacoustic signals (406);
receive the PAS (406) at the probe;
determine a power of the PAS (406) incident on the probe; and
activate one or more of the plurality of ultrasound transducers only when the power of the PAS (406) is greater than a threshold, or activate the one or more of the plurality of ultrasound transducers only in an area where the power of the PAS (406) is greater than the threshold.
2. The system (400) of claim 1, wherein the instructions further cause the processor (121) to indicate a location of the plurality of ultrasound transducers having power equal to or greater than the threshold.
3. The system (400) of claim 2, wherein the instructions further cause the processor (121) to integrate PAS (406) values received from the area of the body to determine a location of the plurality of ultrasound transducers having a value equal to or greater than the threshold.
4. The system (400) of claim 1, wherein a region (304) where the power of the PAS (406) is greater than the threshold is an acoustic window.
5. The system (400) of claim 4, wherein reverberation of the PAS (406) in the acoustic window is at least 10 dB below an average of the PAS (406) originating from behind the acoustic bone window.
6. The system (400) of claim 4, wherein, after the acoustic window is determined, the instructions cause the processor (121) to deactivate the source (410) of optical energy, and to activate the plurality of ultrasound transducers selectively in the acoustic window to obtain an ultrasound image.
7. The system (400) of claim 1, wherein a power of optical energy from the source (410) of optical energy is adjustable.
8. The system (400) of claim 7, wherein the source (410) of optical energy comprises a plurality of sources of optical energy, and an output of the power of the optical energy at each of the plurality of sources of optical energy is selectable.
9. The system (400) of claim 8, wherein a power map is generated based on the PAS (406) of each of the plurality of sources of optical energy, and the plurality of ultrasound transducers is activated based on the power map.
10. The system (400) of claim 8, wherein a power map of the area where the power of the PAS (406) is greater than the threshold is generated based on the PAS (406) of each of the plurality of sources of optical energy, and the plurality of ultrasound transducers is activated based on the power map.
11. A method of ultrasound imaging, the method comprising:
activating a source (410) of optical energy to illuminate an area of a body to stimulate photoacoustic signals (406) (PAS (406)), wherein each of a plurality of ultrasound transducers does not emit ultrasound energy during the stimulation of the PAS (406);
receiving the photoacoustic signals (406) at an ultrasound probe;
determining a power of the PAS (406) incident on the ultrasound probe; and
activating one or more of the plurality of ultrasound transducers in a portion of the body only when the power of the PAS (406) is greater than a threshold, or activating the plurality of ultrasound transducers only in an area of the portion of the body where the power of the PAS (406) is greater than the threshold.
12. The method of claim 11, wherein determining the power further comprises integrating PAS (406) values received from the area of the body to determine a location of the plurality of ultrasound transducers having a value equal to or greater than the threshold.
13. The method of claim 11, wherein a location of the plurality of ultrasound transducers having a power greater than the threshold is provided on a display (140).
14. The method of claim 11, further comprising determining an acoustic window where the power of the PAS (406) is greater than the threshold.
15. The method of claim 14, wherein the method further comprises, after the determining the acoustic window, deactivating the source (410) of optical energy, and activating the plurality of ultrasound transducers selectively in the acoustic window to obtain an ultrasound image.
16. A tangible, non-transitory computer readable medium that stores instructions, which when executed by a processor (121), cause the processor (121) to:
activate a source (410) of optical energy to illuminate an area of a body to stimulate photoacoustic signals (PAS (406)), wherein each of a plurality of ultrasound transducers does not emit ultrasound energy during the stimulation of the photoacoustic signals (406);
receive the PAS (406) at a probe;
determine a power of the PAS (406) incident on the probe; and
activate one or more of the plurality of ultrasound transducers only when the power of the PAS (406) is greater than a threshold, or activate the plurality of ultrasound transducers only in an area where the power of the PAS (406) is greater than the threshold.
17. The tangible, non-transitory computer readable medium of claim 16, wherein the instructions further cause the processor (121) to indicate a location of the plurality of ultrasound transducers having power equal to or greater than the threshold.
18. The tangible, non-transitory computer readable medium of claim 17, wherein the instructions further cause the processor (121) to integrate PAS (406) values received from the area of the body to determine a location of the plurality of ultrasound transducers having a value equal to or greater than the threshold.
19. The tangible, non-transitory computer readable medium of claim 16, wherein a region (304) where the power of the PAS (406) is greater than the threshold is an acoustic window.
20. The tangible, non-transitory computer readable medium of claim 19, wherein reverberation of the PAS (406) in the acoustic window is at least 10 dB below an average of the PAS (406) originating from behind the acoustic bone window.
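The selection logic common to the independent claims — illuminate with the optical source while the transducers stay silent, integrate the received PAS power per transducer element, and enable only the elements whose power exceeds the threshold — can be sketched as follows. This is an illustrative sketch, not the patented implementation; the function names and the hardware stubs (`stimulate`, `receive`) are hypothetical placeholders for the optical-source driver and the probe's receive path.

```python
import numpy as np

def detect_acoustic_window(stimulate, receive, threshold, n_frames=4):
    """Sketch of the claimed procedure: fire the optical source with the
    transducers silent, integrate PAS power per element over several
    frames, and return the element indices forming the acoustic window."""
    stimulate(on=True)                      # optical illumination only; no ultrasound emitted
    power = np.zeros_like(receive(), dtype=float)
    for _ in range(n_frames):
        power += receive() ** 2             # integrate PAS power per element
    stimulate(on=False)                     # deactivate the optical source before imaging
    mean_power = power / n_frames           # per-element PAS power map
    return np.flatnonzero(mean_power > threshold)

# Hypothetical stand-ins for hardware I/O, for illustration only:
def stimulate(on):
    pass                                    # would drive the optical source on/off

def receive():
    # would return per-element PAS amplitudes; here a fixed 8-element toy frame
    return np.array([0.1, 0.2, 1.0, 1.2, 1.1, 0.3, 0.1, 0.2])

window = detect_acoustic_window(stimulate, receive, threshold=0.5)
print(window)  # indices of elements over the temporal acoustic window
```

For the multi-source variants (claims 8 to 10), each optical source would contribute its own per-element reading, and the readings would be combined into a single power map before the same threshold comparison.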
PCT/EP2023/081074 2022-11-17 2023-11-08 Ultrasound imaging system and method for photoacoustic detection of transtemporal acoustic windows in transcranial ultrasound imaging WO2024104848A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202263426118P 2022-11-17 2022-11-17
US63/426,118 2022-11-17

Publications (1)

Publication Number Publication Date
WO2024104848A1 true WO2024104848A1 (en) 2024-05-23

Family

ID=88757546

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2023/081074 WO2024104848A1 (en) 2022-11-17 2023-11-08 Ultrasound imaging system and method for photoacoustic detection of transtemporal acoustic windows in transcranial ultrasound imaging

Country Status (1)

Country Link
WO (1) WO2024104848A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7515948B1 (en) * 2003-09-12 2009-04-07 Ornim Inc. Photoacoustic analyzer of region of interest in a human body
US20120165670A1 (en) * 2009-09-03 2012-06-28 Koninklijke Philips Electronics N.V. Contralateral array based correction of transcranial ultrasound aberration
US20150297176A1 (en) * 2012-10-19 2015-10-22 Koninklijke Philips N.V. Ultrasound head frame for emergency medical services
US20220151496A1 (en) * 2019-03-15 2022-05-19 Ithera Medical Gmbh Device and method for analyzing optoacoustic data, optoacoustic system and computer program


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
VIGNON F ET AL: "Mapping skull attenuation for optimal probe placement in transcranial ultrasound applications", ULTRASONICS SYMPOSIUM (IUS), 2009 IEEE INTERNATIONAL, IEEE, PISCATAWAY, NJ, USA, 20 September 2009 (2009-09-20), pages 2336 - 2339, XP031654713, ISBN: 978-1-4244-4389-5 *

Similar Documents

Publication Publication Date Title
JP6732830B2 (en) Dual modality image processing system for simultaneous functional and anatomical display mapping
EP1614387B1 (en) Ultrasonic diagnostic apparatus, image processing apparatus and image processing method
US3735755A (en) Noninvasive surgery method and apparatus
EP1953566B1 (en) Ultrasonic diagnostic apparatus and ultrasonic image display method
EP2938246B1 (en) Subject information obtaining apparatus, display method, and program
Zhang et al. A high-frequency, high frame rate duplex ultrasound linear array imaging system for small animal imaging
JP6132466B2 (en) Subject information acquisition apparatus and subject information acquisition method
EP3266378A1 (en) Apparatus, method, and program for obtaining information derived from ultrasonic waves and photoacoustic waves
JP2015085013A (en) Subject information acquisition device, display method, subject information acquisition method, and program
WO2018008439A1 (en) Apparatus, method and program for displaying ultrasound image and photoacoustic image
US20190150894A1 (en) Control device, control method, control system, and non-transitory storage medium
JP4373698B2 (en) Ultrasonic diagnostic apparatus and ultrasonic diagnostic support program
JP5818582B2 (en) Subject information acquisition apparatus and subject information acquisition method
US11119199B2 (en) Acoustic wave image generation apparatus and acoustic wave image generation method
JP6533984B2 (en) Ultrasonic diagnostic apparatus, ultrasonic image analysis method, ultrasonic infection analysis method
WO2024104848A1 (en) Ultrasound imaging system and method for photoacoustic detection of transtemporal acoustic windows in transcranial ultrasound imaging
WO2019102969A1 (en) Information processing device, information processing method, and program
US20200289095A1 (en) Ultrasound diagnostic system and method of operating ultrasound diagnostic system
Suarez et al. Transcranial vibro-acoustography can detect traumatic brain injury, in-vivo: preliminary studies
JP4909132B2 (en) Optical tomography equipment
US20180146859A1 (en) Information acquisition apparatus
KR20160022639A (en) Ultrasonic diagnostic equipment capable of generating harmonic image and method of generating ultrasonic image including harmonic image
JP2019088346A (en) Photoacoustic apparatus and subject information acquisition method
JP6513121B2 (en) Processing apparatus, object information acquiring apparatus, display method of photoacoustic image, and program
JP2019136520A (en) Processing device, photoacoustic image display method, and program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23804643

Country of ref document: EP

Kind code of ref document: A1