US20190350460A1 - Photoacoustic imaging apparatus, photoacoustic imaging method, and program for executing photoacoustic imaging method - Google Patents

Photoacoustic imaging apparatus, photoacoustic imaging method, and program for executing photoacoustic imaging method

Info

Publication number
US20190350460A1
US20190350460A1 (application US16/512,934)
Authority
US
United States
Prior art keywords
subject
information regarding
light
container
distribution
Prior art date
Legal status
Abandoned
Application number
US16/512,934
Inventor
Takuro Miyasato
Current Assignee
Canon Inc
Original Assignee
Canon Inc
Priority date
Filing date
Publication date
Application filed by Canon Inc
Priority to US16/512,934
Publication of US20190350460A1
Legal status: Abandoned

Classifications

    • A: HUMAN NECESSITIES; A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE; A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/06: Devices, other than using radiation, for detecting or locating foreign bodies; determining position of probes within or on the body of the patient
    • A61B 5/0073: Measuring for diagnostic purposes using light, e.g. diagnosis by transillumination, diascopy, fluorescence, by tomography, i.e. reconstruction of 3D images from 2D projections
    • A61B 5/0095: Detecting, measuring or recording by applying one single type of energy and measuring its conversion into another type of energy, by applying light and detecting acoustic waves, i.e. photoacoustic measurements
    • A61B 5/0091: Measuring for diagnostic purposes using light, adapted for particular medical purposes, for mammography
    • A61B 8/00: Diagnosis using ultrasonic, sonic or infrasonic waves
    • G: PHYSICS; G01: MEASURING; TESTING; G01N: INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N 21/1702: Systems in which incident light is modified in accordance with the properties of the material investigated, with opto-acoustic detection, e.g. for gases or analysing solids

Definitions

  • the photoacoustic imaging apparatus according to the second embodiment differs from that of the first embodiment in that the acoustic-wave generating member 10 is not provided.
  • the surface shape is calculated using an acoustic wave that is generated due to the discontinuity in the optical characteristics (for example, the absorption coefficients) between the subject 6 and its surroundings.
  • the illuminance distribution (hereinafter referred to as “surface-illuminance distribution”) at the surface of the subject 6 is calculated on the basis of the calculation result of the surface shape and the intensity distribution of the light emitted from the light source 3 .
  • an example in which air surrounds the subject 6 will be described below, but this embodiment is not limited thereto.
  • the absorption coefficients and the Grüneisen coefficients of air and the subject 6 are discontinuous. Therefore, light is absorbed at the surface therebetween, i.e., the surface of the subject 6 , and, as a result, an acoustic wave 82 is generated at the surface of the subject 6 .
  • the acoustic converting unit 1 receives an acoustic wave 81 generated at the optical absorber 7 and the acoustic wave 82 and converts these acoustic waves to electrical signals.
  • the acoustic wave 81 generated at the optical absorber 7 inside the subject 6 and the acoustic wave 82 generated at the surface of the subject 6 as a result of irradiating the subject 6 with the light beam 4 are received by the acoustic converting unit 1 (S10).
  • the received acoustic waves are converted to electrical signals at the acoustic converting unit 1 (S11) and are received by the processing unit 2.
  • after performing filtering on the electrical signals (S12), the processing unit 2 calculates the position and size of the optical absorber 7 or biological information, such as the initial-acoustic-pressure distribution, to generate first image data (S13).
  • the processing unit 2 determines the shape of the subject 6 on the basis of the first image data acquired from the electrical signals (S20). This is described below.
  • because the acoustic wave 82 generated at the surface of the subject 6 is generated by receiving light that is substantially not attenuated, the acoustic wave 82 is larger than the acoustic wave 81 generated at the optical absorber 7.
  • a part having an initial acoustic pressure greater than the other parts can be extracted from the first image data acquired in Step S13 (initial-acoustic-pressure distribution P0).
  • the extracted part corresponds to the boundary between the subject 6 and its surroundings, i.e., the surface of the subject 6.
  • the line obtained by connecting the parts having an initial acoustic pressure greater than a predetermined threshold defines the boundary between the subject 6 and its surroundings.
  • the surface-illuminance distribution of the light emitted from the light source 3 incident on the subject 6 can be calculated from the initial-acoustic-pressure distribution at the boundary.
  • the processing unit 2 determines the surface-illuminance distribution of the light beam 4 emitted from the light source 3 and incident on the subject 6 on the basis of the shape of the subject 6 and the intensity distribution of the light beam 4 emitted from the light source 3 (S21). This is described below.
  • the intensity distribution of the light beam 4 emitted from the light source 3, which is a light intensity distribution in the inner direction orthogonal to the depth direction of the subject 6, is measured in advance. This is described below with reference to FIG. 6.
  • the shape of the subject 6 is represented by positions z in the depth direction of the subject 6, positions x in the inner direction orthogonal to the depth direction of the subject 6, and the inclination θ(x) of the surface.
  • the light intensity distribution of the light beam 4 in the inner direction orthogonal to the depth direction of the subject 6 is represented by A(x). It is presumed that light travels linearly outside the subject 6.
  • the inclination distribution θ(x) of the light beam 4 on the surface of the subject 6 irradiated with light can be calculated with reference to the normal direction, which is calculated from the surface shape of the subject 6.
  • the surface-illuminance distribution of the subject 6 can be calculated.
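  • As a concrete illustration of Step S21, the sketch below projects a measured beam profile A(x) onto the reconstructed surface z(x) to estimate the surface illuminance. The cosine-projection model, the assumption that the beam travels along the depth direction, and all names and values are mine, not taken from the text; a real implementation would follow the geometry of FIG. 6.

```python
# Minimal sketch of Step S21, assuming the beam travels along the depth (+z)
# direction and that the illuminance on the tilted surface scales with the
# cosine of the local inclination theta(x). All names and values are
# illustrative placeholders.
import numpy as np

def surface_illuminance_from_shape(x, z_surface, beam_profile):
    """x: lateral positions [mm], z_surface: surface depth z(x) [mm],
    beam_profile: A(x), the beam intensity measured in the plane orthogonal
    to the depth direction of the subject."""
    dz_dx = np.gradient(z_surface, x)          # local surface slope
    theta = np.arctan(np.abs(dz_dx))           # inclination of the surface normal to the beam
    return beam_profile * np.cos(theta)        # illuminance per unit surface area

# Example with a gently curved surface and a Gaussian beam profile.
x = np.linspace(-20.0, 20.0, 201)
z_surface = 0.01 * x**2                        # placeholder surface shape
beam = np.exp(-(x / 10.0) ** 2)                # placeholder A(x)
phi0 = surface_illuminance_from_shape(x, z_surface, beam)
```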
  • the internal-light-intensity distribution in the subject 6 is determined (S15). Specifically, using the shape of the surface of the subject 6 acquired in Step S20 and the surface-illuminance distribution of the subject 6 acquired in Step S21, an imaginary light source having a light intensity distribution that is the same as the surface-illuminance distribution is disposed on the surface of the subject 6 in a numerical space to calculate the internal-light-intensity distribution. At this time, the internal-light-intensity distribution is calculated using the light diffusion equation, the transport equation, or a Monte Carlo simulation of light propagation.
  • the processing unit 2 generates second image data, such as the absorption-coefficient distribution, on the basis of the internal-light-intensity distribution determined in Step S15 and the first image data acquired in Step S13 (S16).
  • by using the internal-light-intensity distribution determined in Step S15 in Expression 1, the absorption-coefficient distribution can be calculated.
  • an image based on the second image data acquired in this way is displayed on the display device 9 (S17).
  • FIG. 7 illustrates a photoacoustic imaging apparatus according to the third embodiment of the present invention.
  • This embodiment differs from the second embodiment in that a measuring unit 30 is provided.
  • the other configurations are the same as those of the second embodiment.
  • the measuring unit 30 measures the shape of a subject 6 .
  • the processing unit 2 determines the shape of the subject 6 by calculating the outer shape and thickness of the subject 6 from the image captured by the measuring unit 30.
  • the measuring unit 30 may instead be an acoustic-wave converting unit (so-called ultrasonic-wave-echo acoustic converting unit) that transmits and receives acoustic waves.
  • the acoustic converting unit 1 may function as the measuring unit 30 , or the measuring unit 30 may be provided separately.
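  • A hypothetical sketch of Step S30 follows, assuming the measuring unit 30 is an optical camera viewing the subject 6 against a darker background; the intensity threshold, the pixel pitch, and the function name are illustrative assumptions, since the text does not specify how the captured image is processed.

```python
# Hypothetical sketch of extracting the outer shape and thickness of the
# subject from a camera image (Step S30). Threshold and pixel pitch are
# assumptions made only for illustration.
import numpy as np

def subject_shape_from_image(gray_image, threshold, mm_per_pixel):
    """gray_image: 2-D camera image in which the subject appears brighter
    than the background."""
    mask = gray_image > threshold                       # subject silhouette
    rows = np.nonzero(mask.any(axis=1))[0]
    cols = np.nonzero(mask.any(axis=0))[0]
    if rows.size == 0:
        raise ValueError("no subject found above the threshold")
    outline = {
        "height_mm": (rows[-1] - rows[0] + 1) * mm_per_pixel,
        "width_mm": (cols[-1] - cols[0] + 1) * mm_per_pixel,
    }
    thickness_mm = mask.sum(axis=0) * mm_per_pixel      # per-column thickness
    return mask, outline, thickness_mm
```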
  • in the second embodiment, the shape of the subject 6 is determined from the electrical signals (first image data) (S20).
  • the operation of the photoacoustic imaging apparatus according to this embodiment differs in that the shape of the subject 6 is determined from an image of the subject 6 acquired by the measuring unit 30 (S30).
  • the other operations are the same as those of the second embodiment.
  • the photoacoustic imaging apparatus according to this embodiment will be described with reference to FIGS. 1A and 1B .
  • the photoacoustic imaging apparatus according to this embodiment includes a container 40 defining the shape of the subject 6 instead of the acoustic-wave generating member 10 in the photoacoustic imaging apparatus according to the first embodiment.
  • the other configurations are the same as those of the first embodiment.
  • because the shape of the subject 6 is uniquely determined by the container 40, the surface-illuminance distribution of the light beam 4 emitted from the light source 3 and incident on the surface of the subject 6 is also uniquely determined.
  • the container 40 suitable for the subject 6 is selected from a plurality of containers having different shapes and sizes, and then the subject 6 is placed into the container 40 to perform PAT measurement.
  • the surface-illuminance distribution of the light incident on the surface of the subject 6 for each container is determined in advance and is stored in the processing unit 2 as a surface-illuminance distribution data table containing the surface-illuminance distribution data of the subject 6 for each container.
  • the data table is prepared such that when a container is selected, the corresponding surface-illuminance distribution data of the subject 6 for the selected container is retrieved.
  • a single container of which the capacity, size, and/or shape are changeable may be provided.
  • the surface-illuminance distributions of the light incident on the surface of the subject 6 when the size and/or shape of the container is changed in various ways may be determined in advance, and a surface-illuminance distribution data table containing the surface-illuminance distribution data for when the size and/or shape of the container is changed may be stored in the processing unit 2 .
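  • The container-indexed lookup could be organized as sketched below. The container identifiers and the flat placeholder profiles are assumptions for illustration; in practice each entry would hold the surface-illuminance distribution determined in advance for that container shape and size.

```python
# Sketch of the surface-illuminance data table described above (used in Step
# S40). Container IDs and the placeholder Phi_0 profiles are illustrative.
import numpy as np

SURFACE_ILLUMINANCE_TABLE = {
    "container_small":  np.full(64, 8.0),    # placeholder Phi_0(x) profiles
    "container_medium": np.full(96, 6.5),
    "container_large":  np.full(128, 5.0),
}

def surface_illuminance_for(container_id):
    """Return the pre-computed surface-illuminance distribution for the
    selected container."""
    if container_id not in SURFACE_ILLUMINANCE_TABLE:
        raise KeyError(f"no surface-illuminance data stored for {container_id!r}")
    return SURFACE_ILLUMINANCE_TABLE[container_id]

phi0 = surface_illuminance_for("container_medium")
```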
  • the container 40 is selected from a plurality of containers on the basis of the size and shape of the subject 6 , and the subject 6 is placed into the container 40 .
  • the subject 6 is irradiated with the light beam 4 emitted from the light source 3, and the acoustic converting unit 1 receives the acoustic wave 81 generated at the optical absorber 7 inside the subject 6 (S10).
  • the received acoustic wave 81 is converted to an electrical signal at the acoustic converting unit 1 (S11) and is received by the processing unit 2.
  • after performing filtering on the electrical signal (S12), the processing unit 2 calculates the position and size of the optical absorber 7 or biological information, such as the initial-acoustic-pressure distribution, to generate first image data (image reconstruction, S13).
  • the processing unit 2 selects and reads in the surface-illuminance distribution data corresponding to the selected container 40 from the surface-illuminance distribution data table stored in the processing unit 2 (S40) and determines the surface-illuminance distribution of the light beam 4 emitted from the light source 3 and incident on the subject 6 (S21).
  • the internal-light-intensity distribution in the subject 6 is determined (S15). Specifically, using the shape of the subject 6 defined by the container 40 and the surface-illuminance distribution of the subject 6 acquired in Step S21, an imaginary light source having a light intensity distribution that is the same as the surface-illuminance distribution is disposed on the surface of the subject 6 in a numerical space to calculate the internal-light-intensity distribution. At this time, the internal-light-intensity distribution is calculated using the light diffusion equation, the transport equation, or a Monte Carlo simulation of light propagation, as in the sketch below.
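  • The following is a very small 2-D Monte Carlo sketch of the light-propagation option named in the previous item. Homogeneous optical properties, isotropic scattering, and the photon budget are simplifying assumptions of mine; a production implementation would handle anisotropic scattering, boundaries, and layered media properly.

```python
# Very small 2-D Monte Carlo sketch of light propagation inside the subject.
# Homogeneous optical properties, isotropic scattering and the photon count
# are simplifying assumptions; the optical parameters are placeholders.
import numpy as np

def monte_carlo_fluence(phi0_x, n_z, h, mu_a, mu_s, n_photons=10_000, seed=0):
    """phi0_x: relative surface-illuminance profile used to draw launch
    positions, n_z: grid depth in voxels, h: voxel size [mm]."""
    rng = np.random.default_rng(seed)
    n_x = phi0_x.size
    mu_t = mu_a + mu_s
    absorbed = np.zeros((n_z, n_x))
    launch_ix = rng.choice(n_x, size=n_photons, p=phi0_x / phi0_x.sum())
    for ix0 in launch_ix:
        x, z = (ix0 + 0.5) * h, 0.0
        ux, uz = 0.0, 1.0                               # initially heading into the subject
        w = 1.0
        while w > 1e-4:
            s = -np.log(1.0 - rng.random()) / mu_t      # free path length
            x, z = x + ux * s, z + uz * s
            if x < 0.0 or z < 0.0:
                break                                   # photon escaped through the surface
            iz, ix = int(z / h), int(x / h)
            if iz >= n_z or ix >= n_x:
                break                                   # photon left the grid
            absorbed[iz, ix] += w * mu_a / mu_t         # deposit the absorbed weight
            w *= mu_s / mu_t                            # weight surviving the scattering event
            ang = rng.uniform(0.0, 2.0 * np.pi)         # isotropic scattering direction
            ux, uz = np.cos(ang), np.sin(ang)
    # fluence = absorbed energy density divided by the local absorption coefficient
    return absorbed / (mu_a * h * h * n_photons)

phi = monte_carlo_fluence(np.ones(64), n_z=64, h=0.5, mu_a=0.01, mu_s=1.0)
```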
  • the processing unit 2 generates second image data, such as the absorption-coefficient distribution, on the basis of the internal-light-intensity distribution determined in Step S15 and the first image data acquired in Step S13 (S16).
  • by using the internal-light-intensity distribution determined in Step S15 in Expression 1, the absorption-coefficient distribution can be calculated.
  • an image based on the second image data acquired in this way is displayed on the display device 9 (S17).
  • in Step S40, internal-light-intensity-distribution data may be read in instead of the surface-illuminance distribution data, and Step S21 may be combined with Step S40.
  • the present invention may also be realized by the following processing.
  • a software program that realizes the functions of the above-described first to fourth embodiments is supplied to a system or an apparatus via a network or various storage media, and the program is read out and executed by a computer (CPU or MPU) of the system or apparatus.

Abstract

An optical property distribution, such as an absorption coefficient of the inside of a subject, is acquired with high precision. A photoacoustic imaging apparatus includes an acoustic converting unit configured to receive an acoustic wave generated by irradiating a subject with emitted light and to convert the acoustic wave to an electrical signal; and a processing unit configured to determine a light intensity distribution inside the subject on the basis of a light intensity distribution or an illuminance distribution of the light incident on the surface of the subject and to generate image data on the basis of the electrical signal and the determined light intensity distribution inside the subject.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • The present application is a Continuation of co-pending U.S. patent application Ser. No. 13/634,181, filed Sep. 11, 2012, which is a U.S. National Stage application of International Patent Application No. PCT/JP2011/056670 filed Mar. 14, 2011, which claims foreign priority benefit of Japanese Patent Application No. 2010-075662, filed Mar. 29, 2010. The above-named patent applications are hereby incorporated by reference herein in their entirety.
  • TECHNICAL FIELD
  • The present invention relates to a photoacoustic imaging apparatus, a photoacoustic imaging method, and a program to execute a photoacoustic imaging method.
  • BACKGROUND ART
  • Research on photoacoustic imaging apparatuses that acquire information about the inside of a subject by allowing light, such as a laser beam, emitted from a light source to enter and propagate through the subject has been actively carried out. In PTL 1, photoacoustic tomography (PAT) is proposed as such a photoacoustic imaging technique.
  • PAT is a technique of visualizing information related to the optical characteristics of the inside of an organism, which is a subject, by irradiating the organism (subject) with pulsed light emitted from a light source, receiving an acoustic wave generated when the light that has propagated and diffused through the subject is absorbed by the organism's tissue, and analytically processing the received acoustic wave. In this way, biological information, such as an optical-characteristic-value distribution in the subject, and particularly an optical-energy-absorption density distribution, can be acquired.
  • In PAT, an initial acoustic pressure P0 of an acoustic wave generated from an optical absorber inside the subject can be represented by the following expression.

  • P0 = Γ·μa·Φ   (1)
  • Here, Γ represents the Grüneisen coefficient and is obtained by dividing the product of the isobaric volume expansion coefficient β and the square of the sonic speed c by the isobaric specific heat CP. Γ is known to be a substantially constant value when the subject is specified, where μa represents an optical absorption coefficient of an absorber, and Φ represents the light intensity (which is the intensity of light incident on the absorber and is also referred to as optical fluence) in a local area.
  • The change over time of the acoustic pressure P, which is the magnitude of an acoustic wave propagated through the subject, is measured, and an initial-acoustic-pressure distribution is calculated from the measured result. By dividing the calculated initial-acoustic-pressure distribution by the Grüneisen coefficient Γ, the distribution of the product of μa and Φ, i.e., the optical-energy-absorption density distribution, can be acquired.
  • As represented by Expression 1, to acquire the distribution of the optical absorption coefficient μa from the initial-acoustic-pressure distribution, it is necessary to determine the distribution of the light intensity Φ inside the subject. When an area sufficiently large with respect to the thickness of the subject is irradiated with a uniform amount of light, the distribution Φ of the light intensity in the subject can be represented by the following expression, assuming that light propagates through the subject as plane waves.

  • Φ = Φ0·exp(−μeff·d)   (2)
  • Here, μeff represents an average effective attenuation coefficient of the subject, and Φ0 represents the amount of light incident on the subject from a light source (the light intensity at the surface of the subject). Furthermore, d represents the distance between the area on the surface of the subject irradiated with the light emitted from the light source (light irradiation area) and the optical absorber in the subject.
  • By using the light intensity distribution Φ represented by Expression 2, the optical-absorption-coefficient distribution (μa) can be calculated from the optical-energy-absorption density distribution (μaΦ) of Expression 1.
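  • The short sketch below works through Expressions 1 and 2 numerically. The Grüneisen coefficient, attenuation coefficient, surface light intensity, depth, and measured pressure are placeholder values chosen only to illustrate the algebra, not values taken from the patent.

```python
# Worked example of Expressions 1 and 2 with placeholder values.
import math

gamma = 1.0          # Grueneisen coefficient (Gamma), assumed known for the subject
mu_eff = 0.1         # average effective attenuation coefficient mu_eff [1/mm]
phi_0 = 10.0         # light intensity at the subject surface (Phi_0)
d = 20.0             # depth of the absorber below the irradiated surface [mm]

# Expression 2: plane-wave attenuation of the light intensity inside the subject.
phi = phi_0 * math.exp(-mu_eff * d)

# Expression 1 in reverse: recover mu_a from a hypothetical reconstructed
# initial acoustic pressure P0 at the absorber position.
p0_measured = 0.27
mu_a = p0_measured / (gamma * phi)

print(f"local light intensity Phi = {phi:.4f}, recovered mu_a = {mu_a:.4f} /mm")
```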
  • CITATION LIST
  • Patent Literature
  • PTL 1: U.S. Pat. No. 5,713,356
  • SUMMARY OF INVENTION Technical Problem
  • However, when the shape of the subject is not simple and/or when the amount of light emitted from a light source incident on the subject is not uniform, the area of the light irradiation region on the surface of the subject and the irradiation light intensity distribution are not uniform. Therefore, the light intensity in the subject is not uniform in the inner direction from the irradiated surface. Consequently, Expression 2 cannot be used. Thus, to precisely determine the optical-characteristic-value distribution in the subject, such non-uniform properties need to be considered. The present invention acquires an optical-characteristic-value distribution, such as an absorption coefficient of the inside of a subject, with high precision.
  • Solution to Problem
  • The present invention provides a photoacoustic imaging apparatus including an acoustic converting unit configured to receive an acoustic wave generated by irradiating a subject with light emitted from a light source and to convert the acoustic wave to an electrical signal; and a processing unit configured to determine a light intensity distribution inside the subject on the basis of a light intensity distribution or an illuminance distribution of the light incident on the surface of the subject and to generate image data on the basis of the electrical signal and the determined light intensity distribution inside the subject.
  • The present invention also provides a method of photoacoustic imaging including the steps of: generating image data from an electrical signal converted from an acoustic wave generated when light emitted from a light source is incident on a subject; determining a light intensity distribution or an illuminance distribution, at a surface of the subject, of the light emitted from the light source; determining a light intensity distribution of the inside of the subject on the basis of the light intensity distribution or the illuminance distribution at the surface of the subject; and generating image data on the basis of the electrical signal and the light intensity distribution of the inside of the subject.
  • Advantageous Effects of Invention
  • The present invention acquires an optical property distribution, such as an absorption coefficient of the inside of a subject, with high precision.
  • BRIEF DESCRIPTION OF DRAWINGS
  • FIGS. 1A and 1B are schematic views of a photoacoustic imaging apparatus according to first to fourth embodiments of the present invention.
  • FIG. 2 is a schematic diagram illustrating a problem to be solved by the present invention.
  • FIG. 3 is a flow chart illustrating a process carried out by the photoacoustic imaging apparatus according to the first embodiment of the present invention.
  • FIGS. 4A and 4B are top schematic views of acoustic-wave generating members included in the photoacoustic imaging apparatus according to the first embodiment of the present invention.
  • FIG. 5 is a flow chart illustrating a process carried out by the photoacoustic imaging apparatus according to the second embodiment of the present invention.
  • FIG. 6 is a schematic view illustrating a process of determining an illuminance distribution at the surface, which is Step S21 in FIG. 5.
  • FIG. 7 is a schematic view of a photoacoustic imaging apparatus according to the third embodiment of the present invention.
  • FIG. 8 is a flow chart illustrating a process carried out by the photoacoustic imaging apparatus according to the third embodiment of the present invention.
  • FIG. 9 is a flow chart illustrating a process carried out by the photoacoustic imaging apparatus according to the fourth embodiment of the present invention.
  • DESCRIPTION OF EMBODIMENTS
  • The present invention will be described below with reference to the drawings. In the present invention, acoustic waves include sonic waves, ultrasonic waves, and photoacoustic waves and are elastic waves that are generated inside a subject by irradiating the subject with light (electromagnetic waves), such as near-infrared rays. The photoacoustic imaging apparatus according to the present invention is an apparatus that generates image data, which is to be used in the diagnosis of malignant tumors and vascular diseases in human beings and other animals and in follow-up of chemotherapy, by acquiring biological information about the inside of the subject. The subject may be a region in a human body or an animal body that is to be diagnosed, such as a breast, a finger, or a limb. An optical absorber inside a subject is a part of the subject that has a relatively high absorption coefficient. When the subject is a human body, the optical absorber is, for example, a malignant tumor, which has many blood vessels or new blood vessels containing oxygenated and/or reduced hemoglobin.
  • First Embodiment
  • FIG. 1A illustrates a photoacoustic imaging apparatus according to this embodiment. The photoacoustic imaging apparatus according to this embodiment includes an acoustic converting unit 1 and a processing unit 2. Furthermore, in this embodiment, an acoustic-wave generating member 10 is provided along the face of a subject 6. The acoustic-wave generating member 10 has an absorption coefficient different from that of the subject 6. The thickness, optical absorption coefficient, and Grüneisen coefficient of the acoustic-wave generating member 10 are measured in advance. A light beam 4 emitted from a light source 3 is incident on the subject 6, which is, for example, an organism, via an optical system 5, including a lens, a mirror, and an optical fiber. When part of the optical energy propagated through the subject 6 is absorbed by an optical absorber 7 (which is a sound source), such as the interior of a blood vessel or blood, thermal expansion of the optical absorber 7 generates an acoustic wave 81 (which is typically an ultrasonic wave). An acoustic wave 82 is generated at the acoustic-wave generating member 10 in response to receiving the light beam 4 emitted from the light source 3. The acoustic waves 81 and 82 are received by the acoustic converting unit 1 and are converted to electrical signals. Then, the processing unit 2 generates image data, such as an optical-characteristic-value distribution, of the subject 6 on the basis of the electrical signals and a light intensity distribution of light emitted from the light source 3 incident on the surface of the subject 6 (hereinafter referred to as “surface-light-intensity distribution”). Specifically, the light intensity distribution in the subject 6 (hereinafter referred to as “internal-light-intensity distribution”) is determined by the processing unit 2 on the basis of the surface-light-intensity distribution, and image data is generated on the basis of the electrical signals and the internal-light-intensity distribution. Then, the image data is displayed as an image on a display device 9, such as a liquid crystal display. The photoacoustic imaging apparatus may include securing members 11, such as those illustrated in FIG. 1B, to secure the subject 6. The securing members 11 define part of the shape of the subject 6. Although not mentioned in particular, the other embodiments may also include such securing members.
  • Optical absorbers 7 having the same shape, size, and absorption coefficient but present at different positions in the subject 6 are displayed with different luminance and color in images of the optical-energy-absorption density distribution and the optical-absorption-coefficient distribution. This is because the number of photons that reach each optical absorber 7, i.e., the local amount of light in the subject 6, differs. The local amount of light inside the subject 6 may differ due to the influence of the surface-light-intensity distribution of the subject 6. FIG. 2 illustrates two areas (A and B) on the subject 6 having the same size being irradiated with light emitted from light sources that emit light of the same intensity. With reference to FIG. 2, even though the intensities of the light emitted from the light sources are the same, the illuminance at the areas A and B on the surface of the subject 6 differs because the sizes of the light irradiation areas differ. When the light from the light source 3 or the light beam 4 incident on the subject 6 via the optical system 5 diverges finitely and when the light intensity distribution is not uniform in the diverging direction, the illuminance differs within a light irradiation area (area C) depending on the position. Expression 2 can be applied when the emitted light intensity (surface-illuminance distribution) is uniform. However, as in the cases described above, when the light intensity is not uniform, Expression 2 cannot be applied. According to the present invention, by correcting the light intensity distribution in the subject using the surface-illuminance distribution of light emitted from a light source and incident on a subject, optical absorbers having the same shape, size, and absorption coefficient can be displayed with substantially the same luminance and/or color.
  • Next, the operation of the photoacoustic imaging apparatus according to this embodiment will be described with reference to FIGS. 1 and 3.
  • The subject 6 is irradiated with the light beam 4 from the light source 3, and the acoustic converting unit 1 receives the acoustic wave 81 generated at the optical absorber 7 in the subject 6 and the acoustic wave 82 generated at the acoustic-wave generating member 10 disposed on the surface of the subject 6 (S10). The received acoustic waves are converted to electrical signals at the acoustic converting unit 1 (S11) and are sent to the processing unit 2. The processing unit 2 performs amplification, A/D conversion, and filtering on the electrical signals (S12), calculates the position and size of the optical absorber 7 or biological information, such as the initial-acoustic-pressure distribution, and generates first image data (S13).
  • The processing unit 2 determines the surface-illuminance distribution of the light from the light source 3 incident on the subject 6 from the first image data acquired from the electrical signals (S14). This is described below.
  • The acoustic wave 81 is generated in response to light propagated through and attenuated inside the subject 6, whereas the acoustic wave 82 is generated at the surface of the subject 6 in response to light that is substantially not attenuated. The acoustic-wave generating member 10 has an absorption coefficient larger than that of the subject 6. Therefore, the acoustic wave 82 generated at the surface of the subject 6 has a larger magnitude than that of the acoustic wave 81 generated in the optical absorber 7. Thus, a part having an initial acoustic pressure greater than the other parts can be extracted from the first image data acquired in Step S13 (initial-acoustic-pressure distribution P0). The extracted part corresponds to a line between the subject 6 and the acoustic-wave generating member 10, i.e., the surface of the subject 6. Specifically, the line connecting parts having an initial acoustic pressure greater than a predetermined threshold defines the line between the subject 6 and the acoustic-wave generating member 10. The surface of the subject 6 is thus determined, and the initial-acoustic-pressure distribution (ΓbμbΦ0) along the line can be obtained. Γb represents the Grüneisen coefficient of the acoustic-wave generating member 10, and μb represents the absorption coefficient of the acoustic-wave generating member 10. The surface-illuminance distribution Φ0 of the light emitted from the light source 3 and incident on the subject 6 can be calculated by dividing the initial-acoustic-pressure distribution (ΓbμbΦ0) along the line by the coefficients Γb and μb.
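  • A minimal sketch of Step S14 follows: the bright layer produced by the acoustic-wave generating member 10 is located by thresholding the reconstructed initial-pressure image, and the surface illuminance is obtained by dividing out the member's known Γb and μb. The array layout and the threshold choice are assumptions made for illustration.

```python
# Sketch of Step S14: locate the subject surface in the thresholded
# initial-pressure image and convert the pressure along it into a
# surface-illuminance profile. Array layout and threshold are assumptions.
import numpy as np

def surface_illuminance(p0_image, gamma_b, mu_b, threshold):
    """p0_image: 2-D initial-pressure map indexed as [z (depth), x]."""
    n_z, n_x = p0_image.shape
    surface_z = np.full(n_x, -1, dtype=int)       # surface depth index per column
    phi0 = np.zeros(n_x)                          # surface illuminance Phi_0(x)
    for ix in range(n_x):
        bright = np.nonzero(p0_image[:, ix] > threshold)[0]
        if bright.size:                           # shallowest bright voxel = surface
            surface_z[ix] = bright[0]
            phi0[ix] = p0_image[bright[0], ix] / (gamma_b * mu_b)
    return surface_z, phi0
```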
  • Then, the internal-light-intensity distribution Φ in the subject 6 is determined on the basis of the surface-illuminance distribution Φ0 (S15). Specifically, using the shape of the surface of the subject 6 and the surface-illuminance distribution of the subject 6 acquired in Step S14, an imaginary light source having a light intensity distribution that is the same as the surface-illuminance distribution Φ0 is disposed on the surface of the subject 6 in a numerical space to calculate the internal-light-intensity distribution Φ in the subject 6. At this time, the internal-light-intensity distribution is calculated using a light diffusion equation or a transport equation.
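  • As one possible numerical realization of Step S15, the sketch below solves a 2-D steady-state diffusion approximation on a rectangular grid, with the surface illuminance imposed as a Dirichlet condition on the top row and zero fluence on the remaining edges. The homogeneous optical properties, the flat-surface grid, and all parameter values are simplifying assumptions; a real implementation would use the measured surface shape and a full diffusion or transport solver. Dividing the reconstructed initial-pressure map element-wise by Γ and the resulting fluence map then yields the absorption-coefficient distribution of Step S16 via Expression 1.

```python
# Simplified finite-difference solve of D*laplacian(Phi) - mu_a*Phi = 0 with
# the surface illuminance as a boundary condition (a stand-in for Step S15).
# Homogeneous D and mu_a and the flat top surface are assumptions.
import numpy as np

def internal_fluence(phi0_top, n_z, h, D, mu_a_bg):
    n_x = phi0_top.size
    n = n_z * n_x
    idx = lambda iz, ix: iz * n_x + ix
    A = np.zeros((n, n))
    b = np.zeros(n)
    for iz in range(n_z):
        for ix in range(n_x):
            k = idx(iz, ix)
            if iz == 0:                                # subject surface: imaginary source
                A[k, k] = 1.0
                b[k] = phi0_top[ix]
            elif iz == n_z - 1 or ix in (0, n_x - 1):
                A[k, k] = 1.0                          # zero fluence on the far edges
            else:                                      # interior diffusion node
                A[k, k] = -4.0 * D / h**2 - mu_a_bg
                for jz, jx in ((iz - 1, ix), (iz + 1, ix), (iz, ix - 1), (iz, ix + 1)):
                    A[k, idx(jz, jx)] = D / h**2
    return np.linalg.solve(A, b).reshape(n_z, n_x)

phi = internal_fluence(np.ones(40), n_z=40, h=1.0, D=0.3, mu_a_bg=0.01)
```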
  • The processing unit 2 generates second image data, such as the absorption-coefficient distribution, on the basis of the internal-light-intensity distribution Φ determined in Step S15 and the first image data (initial-acoustic-pressure distribution P0) acquired in Step S13 (S16). By using the internal-light-intensity distribution determined in Step S15 in Expression 1, the absorption-coefficient distribution can be calculated. An image based on the second image data acquired in this way is displayed on the display device 9 (S17).
  • Next, the configuration of the photoacoustic imaging apparatus according to this embodiment will be described in detail below.
  • The acoustic converting unit 1 includes at least one element that converts acoustic waves to electrical signals, such as a transducer using a piezoelectric phenomenon, resonance of light, and/or a change in capacitance. Any type of element may be used so long as it is capable of converting acoustic waves into electrical signals. By one- or two-dimensionally arranging a plurality of acoustic-wave receiving elements, photoacoustic waves can be received simultaneously at different sites. Thus, the reception time can be reduced, and the influence of vibration of the subject can be reduced. By moving a single element, it is possible to receive the same signals as those received when the elements are arranged one- or two-dimensionally. It is desirable to apply an acoustic matching material, such as a gel, between the acoustic converting unit 1 and the subject 6 in order to improve the acoustic matching.
  • A workstation is typically used as the processing unit 2, and image reconstruction (generation of image data) is performed using preprogrammed software. For example, the software used at the workstation includes a module for determining the light intensity distribution or the illuminance distribution on the surface of the subject from electrical signals from the photoacoustic imaging apparatus or an external unit, and a signal processing module for noise reduction. Furthermore, the software used at the workstation includes an image reconstruction module for image reconstruction. In PAT, normally, as preprocessing of image reconstruction, noise reduction is performed on signals received at different sites. It is desirable that such preprocessing be performed by the signal processing module. The image reconstruction module forms image data by image reconstruction, and as an image reconstruction algorithm, for example, backprojection in a time domain or a Fourier domain, which is typically used in tomography techniques, is applied. Image data is two- or three-dimensional data about biological information. Two-dimensional data consists of multiple sets of pixel data, and three-dimensional data consists of multiple sets of voxel data. Pixel data and voxel data are obtained through image reconstruction of acoustic waves acquired at multiple sites. Three-dimensional image data will be described below. However, the present invention can also be applied to two-dimensional image data.
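  • The image reconstruction module could, for example, be built around a simple delay-and-sum backprojection as sketched below. The detector geometry, sampling rate, and speed of sound are placeholder parameters, and practical systems typically use filtered or Fourier-domain variants rather than this unweighted sum.

```python
# Unweighted delay-and-sum backprojection sketch for the image reconstruction
# module described above. Geometry and sampling parameters are placeholders.
import numpy as np

def delay_and_sum(signals, det_pos, grid_pos, fs, c):
    """signals: (n_det, n_samples) received photoacoustic signals,
    det_pos: (n_det, 2) detector coordinates [mm],
    grid_pos: (n_pts, 2) reconstruction-point coordinates [mm],
    fs: sampling rate [samples/s], c: speed of sound [mm/s]."""
    n_det, n_samples = signals.shape
    image = np.zeros(len(grid_pos))
    for det in range(n_det):
        dist = np.linalg.norm(grid_pos - det_pos[det], axis=1)   # flight distance
        sample = np.round(dist / c * fs).astype(int)             # time-of-flight sample
        valid = sample < n_samples
        image[valid] += signals[det, sample[valid]]
    return image / n_det

# Example: 32 detectors along the top edge, a 64 x 64 voxel slice, fake data.
fs, c = 20e6, 1.5e6                                              # 20 MS/s, 1.5 mm/us
det_pos = np.stack([np.linspace(0, 63, 32), np.zeros(32)], axis=1)
gz, gx = np.meshgrid(np.arange(64), np.arange(64), indexing="ij")
grid_pos = np.stack([gx.ravel(), gz.ravel()], axis=1).astype(float)
signals = np.random.default_rng(0).normal(size=(32, 2048))       # stand-in data
image = delay_and_sum(signals, det_pos, grid_pos, fs, c).reshape(64, 64)
```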
  • The light source 3 emits light having a predetermined wavelength that is absorbed by a predetermined component (e.g., hemoglobin) constituting an organism. Specifically, the wavelength of the light is preferably 500 nm or greater and 1,200 nm or smaller; with light in this range, the processing described below can more easily distinguish between the acoustic waves generated at the surface of the subject (for example, at the skin) and the acoustic waves generated at an optical absorber inside the subject (for example, hemoglobin). At least one light source 3 capable of generating pulsed light with a pulse width of 5 to 50 ns is provided. A laser, which provides high power, is desirable as the light source 3; however, a light-emitting diode may be used instead. Various types of lasers, such as solid-state lasers, gas lasers, dye lasers, and semiconductor lasers, can be used. The light may be emitted from the same side as the acoustic converting unit 1 or from the opposite side; furthermore, the light may be emitted from both sides of the subject 6.
  • The optical system 5 includes mirrors that reflect light and lenses that converge or diverge light and change its shape. The optical system 5 may include optical waveguides in addition to the mirrors and lenses, and may have any configuration so long as the light emitted from the light source 3 is incident on the subject 6 in a desired shape. It is desirable that the light be converged by a lens so as to irradiate a predetermined area, and that the area on the subject 6 irradiated with light be movable; in other words, it is desirable that the light emitted from the light source 3 be movable over the subject 6, because movable light allows a larger area to be irradiated. It is even more desirable that the irradiated area on the subject 6 move in synchronization with the acoustic converting unit 1. Methods of moving the irradiated area on the subject 6 include using a movable mirror and mechanically moving the light source 3.
  • The acoustic-wave generating member 10 is disposed on the surface of the subject 6 and has a known thickness, optical absorption coefficient, and Grüneisen coefficient. The acoustic-wave generating member 10 generates acoustic waves by absorbing light emitted from the light source 3, which makes it possible to calculate the surface shape and the surface-light-intensity distribution of the subject 6. The acoustic-wave generating member 10 is made of a material whose absorption coefficient for the light that generates the acoustic waves is greater than the average absorption coefficient of the subject 6. Specifically, it is desirable that the optical absorption coefficient be 0.005 mm−1 or greater and 0.100 mm−1 or smaller. When the absorption coefficient is greater than 0.100 mm−1, the amount of light entering the subject 6 decreases, and thus the acoustic waves generated inside the subject 6 become weak. In contrast, when the absorption coefficient is smaller than 0.005 mm−1, it is smaller than the average absorption coefficient of the inside of the subject 6; therefore, it is difficult to distinguish between the acoustic waves from the inside and from the surface of the subject 6, and hence difficult to calculate the surface shape of the subject 6. It is more desirable that the optical absorption coefficient be 0.010 mm−1 or greater and 0.080 mm−1 or smaller. It is also desirable to use a material having a Grüneisen coefficient of 0.8 or greater and 1.5 or smaller, the average Grüneisen coefficient of the subject 6 being approximately 0.5. The acoustic-wave generating member 10 may include absorber particles having a known absorption coefficient disposed as a spotty film, as illustrated in FIG. 4A, may include absorbers arranged in a grid, as illustrated in FIG. 4B, or may include absorber fine particles disposed as a uniform film. An acoustic matching material having a known absorption coefficient, such as a gel, may also be used as the acoustic-wave generating member 10.
  • Second Embodiment
  • The photoacoustic imaging apparatus according to this embodiment differs from the photoacoustic imaging apparatus according to the first embodiment in that the acoustic-wave generating member 10 is not provided. In this embodiment, the surface shape is calculated using an acoustic wave that is generated owing to the discontinuity in optical characteristics (for example, absorption coefficients) between the subject 6 and the surroundings. Then, the illuminance distribution at the surface of the subject 6 (hereinafter referred to as the "surface-illuminance distribution") is calculated on the basis of the calculated surface shape and the intensity distribution of the light emitted from the light source 3. An example in which air surrounds the subject 6 is described below, but this embodiment is not limited thereto.
  • The absorption coefficients and the Grüneisen coefficients of air and the subject 6 are discontinuous. Therefore, light is absorbed at the surface therebetween, i.e., the surface of the subject 6, and, as a result, an acoustic wave 82 is generated at the surface of the subject 6. The acoustic converting unit 1 receives an acoustic wave 81 generated at the optical absorber 7 and the acoustic wave 82 and converts these acoustic waves to electrical signals.
  • Next, the operation of the photoacoustic imaging apparatus according to this embodiment will be described with reference to FIG. 5. The acoustic wave 81 generated at the optical absorber 7 inside the subject 6 and the acoustic wave 82 generated at the surface of the subject 6 as a result of irradiating the subject 6 with the light beam 4 are received by the acoustic converting unit 1 (S10). The received acoustic waves are converted to electrical signals at the acoustic converting unit 1 (S11) and are received by the processing unit 2. After performing filtering on the electrical signals (S12), the processing unit 2 calculates biological information, such as the position and size of the optical absorber 7 or the initial-acoustic-pressure distribution, to generate first image data (S13).
  • The processing unit 2 determines the shape of the subject 6 on the basis of the first image data acquired from the electrical signal (S20). This is described below.
  • Since the acoustic wave 82 generated at the surface of the subject 6 is produced by light that is substantially unattenuated, the acoustic wave 82 is larger than the acoustic wave 81 generated at the optical absorber 7. Thus, a part having an initial acoustic pressure greater than that of the other parts can be extracted from the first image data acquired in Step S13 (initial-acoustic-pressure distribution P0). The extracted part corresponds to the boundary between the subject 6 and the surroundings, i.e., the surface of the subject 6. Specifically, the line obtained by connecting the parts having an initial acoustic pressure greater than a predetermined threshold defines the boundary between the subject 6 and the surroundings.
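As a rough illustration of this thresholding step, the sketch below marks the voxels of a reconstructed P0 distribution whose value exceeds a percentile-based threshold; the percentile value and the random stand-in data are assumptions for the example, not values from the patent.

```python
import numpy as np

def extract_surface_voxels(p0, percentile=99.0):
    """Return indices of voxels whose initial acoustic pressure exceeds a threshold;
    the connected high-pressure shell approximates the subject surface."""
    threshold = np.percentile(p0, percentile)
    return np.argwhere(p0 > threshold)

p0 = np.random.rand(64, 64, 64)               # stand-in for the reconstructed P0 distribution
surface_voxels = extract_surface_voxels(p0)   # (N, 3) voxel indices on the extracted boundary
```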
  • When the absorption (absorption coefficient) at the boundary between the subject 6 and air (surroundings of the subject 6) is known, similar to the first embodiment, the surface-illuminance distribution of the light emitted from the light source 3 incident on the subject 6 can be calculated from the initial-acoustic-pressure distribution at the boundary.
  • When the absorption (absorption coefficient) at the boundary between the subject 6 and air (the surroundings of the subject 6) is not known, the surface-illuminance distribution of the light emitted from the light source 3 and incident on the subject 6 cannot be calculated from the initial-acoustic-pressure distribution at the boundary; thus, the following process is carried out.
  • The processing unit 2 determines the surface-illuminance distribution of the light beam 4 emitted from the light source 3 incident on the subject 6 on the basis of the shape of the subject 6 and the intensity distribution of the light beam 4 emitted from the light source 3 (S21). This is described below.
  • The intensity distribution of the light beam 4 emitted from the light source 3, i.e., the light intensity distribution in the in-plane direction orthogonal to the depth direction of the subject 6, is measured in advance. This is described below with reference to FIG. 6. In FIG. 6, the shape of the subject 6 is represented by positions z in the depth direction of the subject 6, positions x in the in-plane direction orthogonal to the depth direction, and the inclination θ(x) of the surface. The light intensity distribution of the light beam 4 in the in-plane direction is represented by A(x). It is presumed that light travels linearly outside the subject 6. The inclination distribution θ(x) of the surface of the subject 6 irradiated with light can be calculated with reference to the normal direction, which is in turn calculated from the surface shape of the subject 6. By multiplying the light intensity distribution A(x) by cos θ(x) at each position x, the surface-illuminance distribution of the subject 6 can be calculated.
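For illustration, the sketch below evaluates A(x)·cos θ(x) for light travelling along +z, obtaining θ(x) from the gradient of a surface profile z(x); the example profile and beam shape are assumptions, not data from the patent.

```python
import numpy as np

def surface_illuminance(x_mm, z_surface_mm, a_of_x):
    """For light travelling along +z, theta(x) follows from the gradient of the
    surface profile, and the surface illuminance is A(x) * cos(theta(x))."""
    theta = np.arctan(np.gradient(z_surface_mm, x_mm))   # surface inclination at each x
    return a_of_x * np.cos(theta)

x = np.linspace(-20.0, 20.0, 201)              # lateral positions in mm
z_surface = 0.01 * x**2                        # a gently curved surface profile
a = np.exp(-(x / 10.0) ** 2)                   # Gaussian beam intensity profile A(x)
phi0 = surface_illuminance(x, z_surface, a)    # surface-illuminance distribution
```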
  • In the example described above, light is presumed to travel linearly outside the subject 6. However, it is also possible to determine the surface-illuminance distribution by determining the propagation of the light beam 4 outside the subject 6 to the surface of the subject 6 using the light transport equation or the Monte Carlo simulation for light propagation.
  • Based on the surface-illuminance distribution, the internal-light-intensity distribution in the subject 6 is determined (S15). Specifically, using the surface shape of the subject 6 acquired in Step S20 and the surface-illuminance distribution of the subject 6 acquired in Step S21, an imaginary light source having a light intensity distribution that is the same as the surface-illuminance distribution is disposed on the surface of the subject 6 in a numerical space, and the internal-light-intensity distribution is calculated. At this time, the internal-light-intensity distribution is calculated using the light diffusion equation, the transport equation, or a Monte Carlo simulation of light propagation.
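As one illustration of the Monte Carlo option, the sketch below launches photons from the imaginary surface source with probability proportional to the surface illuminance and records the absorbed weight on a 2-D grid, using isotropic scattering and homogeneous μa and μs. All parameter values and names are assumptions for the example, not the patent's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def monte_carlo_fluence(surface_x_mm, surface_w, mu_a=0.005, mu_s=1.0,
                        nx=60, nz=60, dx_mm=0.5, n_photons=20000):
    """Minimal 2-D Monte Carlo light propagation with isotropic scattering:
    photons start at z = 0 with launch probability proportional to the surface
    illuminance, take exponentially distributed steps, and deposit the fraction
    mu_a / mu_t of their weight at each interaction (absorption weighting)."""
    mu_t = mu_a + mu_s
    absorbed = np.zeros((nx, nz))
    p_launch = surface_w / surface_w.sum()
    for _ in range(n_photons):
        x = rng.choice(surface_x_mm, p=p_launch)
        z, w = 0.0, 1.0
        ux, uz = 0.0, 1.0                          # launched straight into the tissue
        while w > 1e-4:
            step = -np.log(rng.random()) / mu_t    # free path length [mm]
            x, z = x + ux * step, z + uz * step
            ix, iz = int(np.floor(x / dx_mm)), int(np.floor(z / dx_mm))
            if not (0 <= ix < nx and 0 <= iz < nz):
                break                              # photon left the grid
            absorbed[ix, iz] += w * mu_a / mu_t
            w *= mu_s / mu_t                       # surviving weight after deposition
            ang = rng.uniform(0.0, 2.0 * np.pi)
            ux, uz = np.cos(ang), np.sin(ang)
    return absorbed / (mu_a * dx_mm**2 * n_photons)   # fluence estimate (arbitrary units)

x_surf = np.linspace(0.0, 30.0, 61)                   # surface positions covered by the beam [mm]
w_surf = np.exp(-((x_surf - 15.0) / 8.0) ** 2)        # surface-illuminance profile from Step S21
phi_inside = monte_carlo_fluence(x_surf, w_surf)
```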
  • The processing unit 2 generates second image data, such as the absorption-coefficient distribution, on the basis of the internal-light-intensity distribution determined in Step S15 and the first image data acquired in Step S13 (S16). The absorption-coefficient distribution can be calculated by applying Expression 1 to the internal-light-intensity distribution determined in Step S15, as in the first embodiment. An image based on the second image data acquired in this way is displayed on the display device 9 (S17).
  • Third Embodiment
  • FIG. 7 illustrates a photoacoustic imaging apparatus according to the third embodiment of the present invention. This embodiment differs from the second embodiment in that a measuring unit 30, which measures the shape of the subject 6, is provided. The other configurations are the same as those of the second embodiment.
  • As the measuring unit 30, an image pickup device, such as a CCD camera, can be used. In such a case, the processing unit 2 determines the shape of the subject 6 by calculating the outer shape and thickness of the subject 6 from the captured image. The measuring unit 30 may instead be an acoustic-wave converting unit that transmits and receives acoustic waves (a so-called ultrasound-echo converting unit). The acoustic converting unit 1 may serve as the measuring unit 30, or the measuring unit 30 may be provided separately.
  • Next, with reference to FIG. 8, the operation of the photoacoustic imaging apparatus according to this embodiment of the present invention will be described. In the second embodiment, the shape of the subject 6 is determined from electrical signals (first image data) (S20). The operation of the photoacoustic imaging apparatus according to this embodiment differs in that the shape of the subject 6 is determined from an image of the subject 6 acquired by the measuring unit 30 (S30). The other operations are the same as those of the second embodiment.
  • Fourth Embodiment
  • The photoacoustic imaging apparatus according to this embodiment will be described with reference to FIGS. 1A and 1B. The photoacoustic imaging apparatus according to this embodiment includes a container 40 defining the shape of the subject 6 instead of the acoustic-wave generating member 10 in the photoacoustic imaging apparatus according to the first embodiment. The other configurations are the same as those of the first embodiment.
  • In this embodiment, since the shape of the subject 6 is uniquely determined, the surface-illuminance distribution of the light beam 4 emitted from the light source 3 incident on the surface of the subject 6 is uniquely determined. Specifically, the container 40 suitable for the subject 6 is selected from a plurality of containers having different shapes and sizes, and then the subject 6 is placed into the container 40 to perform PAT measurement.
  • The surface-illuminance distribution of the light incident on the surface of the subject 6 is determined in advance for each container and is stored in the processing unit 2 as a surface-illuminance distribution data table containing the surface-illuminance distribution data of the subject 6 for each container. The data table is prepared such that, when a container is selected, the corresponding surface-illuminance distribution data for that container is retrieved. Instead of providing a plurality of containers, a single container whose capacity, size, and/or shape are changeable may be provided. In such a case, the surface-illuminance distributions of the light incident on the surface of the subject 6 for the various sizes and/or shapes of the container may be determined in advance, and a surface-illuminance distribution data table containing these data may be stored in the processing unit 2.
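The container-keyed lookup described above can be sketched as a simple table indexed by a container identifier. The identifiers and array contents below are hypothetical placeholders; in practice the distributions would be precomputed offline for each container shape.

```python
import numpy as np

# Hypothetical precomputed surface-illuminance distributions, one per container.
surface_illuminance_table = {
    "container_small": np.full((64, 64), 0.9),
    "container_large": np.full((96, 96), 0.7),
}

def read_surface_illuminance(container_id):
    """Step S40: retrieve the stored surface-illuminance distribution
    corresponding to the selected container."""
    return surface_illuminance_table[container_id]

phi0 = read_surface_illuminance("container_small")
```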
  • Next, the operation of the photoacoustic imaging apparatus according to this embodiment will be described with reference to FIG. 9. First, the container 40 is selected from a plurality of containers on the basis of the size and shape of the subject 6, and the subject 6 is placed into the container 40.
  • Then, the subject 6 is irradiated with the light beam 4 emitted from the light source 3, and the acoustic converting unit 1 receives the acoustic wave 81 generated at the optical absorber 7 inside the subject 6 (S10). The received acoustic wave 81 is converted to an electrical signal at the acoustic converting unit 1 (S11) and is received by the processing unit 2. After performing filtering on the electrical signal (S12), the processing unit 2 calculates biological information, such as the position and size of the optical absorber 7 or the initial-acoustic-pressure distribution, to generate first image data (image reconstruction; S13).
  • The processing unit 2 selects and reads in the surface-illuminance distribution data corresponding to the selected container 40 from the surface-illuminance distribution data table stored in the processing unit 2 (S40) and determines the surface-illuminance distribution of the light beam 4 emitted from the light source 3 incident on the subject 6 (S21).
  • Based on the surface-illuminance distribution, the internal-light-intensity distribution in the subject 6 is determined (S15). Specifically, using the shape of the subject 6 defined by the container 40 and the surface-illuminance distribution of the subject 6 acquired in Step S21, an imaginary light source having a light intensity distribution that is the same as the surface-illuminance distribution is disposed on the surface of the subject 6 in a numerical space, and the internal-light-intensity distribution is calculated. At this time, the internal-light-intensity distribution is calculated using the light diffusion equation, the transport equation, or a Monte Carlo simulation of light propagation.
  • The processing unit 2 generates second image data, such as the absorption-coefficient distribution, on the basis of the internal-light-intensity distribution determined in Step S15 and the first image data acquired in Step S13 (S16). The absorption-coefficient distribution can be calculated by applying Expression 1 to the internal-light-intensity distribution determined in Step S15. An image based on the second image data acquired in this way is displayed on the display device 9 (S17).
  • When light diffusion within the subject 6 can be predicted, an internal-light-intensity-distribution data table corresponding to the inside of the subject 6 may be used instead of the surface-illuminance distribution data table. In such a case, in Step S40, internal-light-intensity-distribution data is read in instead of the surface-illuminance distribution data, and Step S21 may be combined with Step S40.
  • Fifth Embodiment
  • The present invention may also be realized by the following processing: software (a program) that realizes the functions of the above-described first to fourth embodiments is supplied to a system or an apparatus via a network or various storage media, and the program is read out and executed by a computer (CPU or MPU) of the system or apparatus.
  • While the present invention has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions.
  • REFERENCE SIGNS LIST
  • 1 acoustic converting unit
  • 2 processing unit
  • 3 light source
  • 4 light beam emitted from the light source
  • 6 subject
  • 81, 82 acoustic wave

Claims (18)

1. A processing apparatus that acquires subject information by using a signal derived from an acoustic wave generated by irradiating a subject a shape of which is defined by a container with light, comprising:
a storage unit configured to store pieces of information regarding illuminance distribution on a surface of the subject;
a reading unit configured to read, out of the storage unit, information regarding illuminance distribution corresponding to the container among the pieces of information regarding illuminance distribution; and
an acquisition unit configured to, based on the information regarding illuminance distribution corresponding to the container and based on the signal, acquire the subject information.
2. The processing apparatus according to claim 1,
wherein, based on the information regarding illuminance distribution corresponding to the container, the acquisition unit acquires information regarding light intensity distribution inside the subject, and
wherein, based on the information regarding light intensity distribution inside the subject and based on the signal, the acquisition unit acquires the subject information.
3. The processing apparatus according to claim 2,
wherein, based on the information regarding illuminance distribution corresponding to the container, the acquisition unit places an imaginary light source corresponding to the illuminance distribution in a calculation space, and acquires the information regarding light intensity distribution inside the subject by calculating propagation of light inside the subject from the imaginary light source.
4. The processing apparatus according to claim 3,
wherein the acquisition unit acquires the information regarding light intensity distribution inside the subject by calculating the propagation of light inside the subject from the imaginary light source by using a light diffusion equation, a transport equation, or a Monte Carlo simulation for light propagation.
5. The processing apparatus according to claim 2,
wherein, based on the signal, the acquisition unit acquires information regarding optical-energy-absorption density distribution or initial-acoustic-pressure distribution inside the subject, and
wherein, based on the information regarding optical-energy-absorption density distribution or initial-acoustic-pressure distribution and based on the information regarding illuminance distribution corresponding to the container, the acquisition unit acquires the subject information.
6. The processing apparatus according to claim 5,
wherein the subject information is information regarding absorption-coefficient distribution.
7. The processing apparatus according to claim 1,
wherein the storage unit stores pieces of information regarding illuminance distribution corresponding to a plurality of containers, inclusive of the information regarding illuminance distribution corresponding to the container.
8. The processing apparatus according to claim 1,
wherein the reading unit reads information regarding illuminance distribution corresponding to a size or a shape of the container.
9. The processing apparatus according to claim 1, further comprising:
a display control unit configured to cause a display to display an image that is based on the subject information.
10. A processing apparatus that acquires subject information by using a signal derived from an acoustic wave generated by irradiating a subject a shape of which is defined by a container with light, comprising:
a storage unit configured to store pieces of information regarding light intensity distribution inside the subject;
a reading unit configured to read, out of the storage unit, information regarding light intensity distribution corresponding to the container among the pieces of information regarding light intensity distribution; and
an acquisition unit configured to, based on the information regarding light intensity distribution corresponding to the container and based on the signal, acquire the subject information.
11. The processing apparatus according to claim 10,
wherein, based on the signal, the acquisition unit acquires information regarding optical-energy-absorption density distribution or initial-acoustic-pressure distribution inside the subject, and
wherein, based on the information regarding optical-energy-absorption density distribution or initial-acoustic-pressure distribution and based on the information regarding light intensity distribution corresponding to the container, the acquisition unit acquires the subject information.
12. The processing apparatus according to claim 11,
wherein the subject information is information regarding absorption-coefficient distribution.
13. The processing apparatus according to claim 10,
wherein the storage unit stores pieces of information regarding light intensity distribution corresponding to a plurality of containers, inclusive of the information regarding light intensity distribution corresponding to the container.
14. The processing apparatus according to claim 10,
wherein the reading unit reads information regarding light intensity distribution corresponding to a size or a shape of the container.
15. A processing method for acquiring subject information by using a signal derived from an acoustic wave generated by irradiating a subject a shape of which is defined by a container with light, comprising:
reading, out of a storage unit configured to store pieces of information regarding illuminance distribution on a surface of the subject, information regarding illuminance distribution corresponding to the container; and
acquiring, based on the information regarding illuminance distribution corresponding to the container and based on the signal, the subject information.
16. A processing method for acquiring subject information by using a signal derived from an acoustic wave generated by irradiating a subject a shape of which is defined by a container with light, comprising:
reading, out of a storage unit configured to store pieces of information regarding light intensity distribution inside the subject, information regarding light intensity distribution corresponding to the container; and
acquiring, based on the information regarding light intensity distribution corresponding to the container and based on the signal, the subject information.
17. A processing method, comprising:
acquiring image data that is based on a signal derived from an acoustic wave generated by irradiating a subject with light;
acquiring, based on the image data, information regarding a shape of the subject;
acquiring, based on the information regarding a shape of the subject and based on intensity distribution of the light with which the subject is irradiated, information regarding illuminance distribution of the light with which the subject is irradiated on a surface of the subject; and
acquiring, based on the information regarding illuminance distribution on the surface of the subject and based on the signal, the subject information.
18. A non-transitory computer-readable storage medium storing a program for causing a computer to execute processes of the processing method according to claim 15.
US16/512,934 2010-03-29 2019-07-16 Photoacoustic imaging apparatus, photoacoustic imaging method, and program for executing photoacoustic imaging method Abandoned US20190350460A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US16/512,934 US20190350460A1 (en) 2010-03-29 2019-07-16 Photoacoustic imaging apparatus, photoacoustic imaging method, and program for executing photoacoustic imaging method

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
JP2010-075662 2010-03-29
JP2010075662A JP5675142B2 (en) 2010-03-29 2010-03-29 Subject information acquisition apparatus, subject information acquisition method, and program for executing subject information acquisition method
PCT/JP2011/056670 WO2011122382A1 (en) 2010-03-29 2011-03-14 Photoacoustic imaging apparatus, photoacoustic imaging method, and program for executing photoacoustic imaging method
US201213634181A 2012-09-11 2012-09-11
US16/512,934 US20190350460A1 (en) 2010-03-29 2019-07-16 Photoacoustic imaging apparatus, photoacoustic imaging method, and program for executing photoacoustic imaging method

Related Parent Applications (2)

Application Number Title Priority Date Filing Date
US13/634,181 Continuation US10390706B2 (en) 2010-03-29 2011-03-14 Photoacoustic imaging apparatus, photoacoustic imaging method, and storage medium
PCT/JP2011/056670 Continuation WO2011122382A1 (en) 2010-03-29 2011-03-14 Photoacoustic imaging apparatus, photoacoustic imaging method, and program for executing photoacoustic imaging method

Publications (1)

Publication Number Publication Date
US20190350460A1 true US20190350460A1 (en) 2019-11-21

Family

ID=44080148

Family Applications (2)

Application Number Title Priority Date Filing Date
US13/634,181 Active 2032-10-01 US10390706B2 (en) 2010-03-29 2011-03-14 Photoacoustic imaging apparatus, photoacoustic imaging method, and storage medium
US16/512,934 Abandoned US20190350460A1 (en) 2010-03-29 2019-07-16 Photoacoustic imaging apparatus, photoacoustic imaging method, and program for executing photoacoustic imaging method

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US13/634,181 Active 2032-10-01 US10390706B2 (en) 2010-03-29 2011-03-14 Photoacoustic imaging apparatus, photoacoustic imaging method, and storage medium

Country Status (8)

Country Link
US (2) US10390706B2 (en)
EP (1) EP2552299B1 (en)
JP (1) JP5675142B2 (en)
KR (1) KR101483502B1 (en)
CN (3) CN104644127B (en)
BR (1) BR112012021776A2 (en)
RU (1) RU2535602C2 (en)
WO (1) WO2011122382A1 (en)

Families Citing this family (34)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5675142B2 (en) 2010-03-29 2015-02-25 キヤノン株式会社 Subject information acquisition apparatus, subject information acquisition method, and program for executing subject information acquisition method
JP5661451B2 (en) * 2010-12-27 2015-01-28 キヤノン株式会社 Subject information acquisition apparatus and subject information acquisition method
JP5783779B2 (en) 2011-04-18 2015-09-24 キヤノン株式会社 Subject information acquisition apparatus and subject information acquisition method
JP5950538B2 (en) * 2011-10-26 2016-07-13 キヤノン株式会社 Subject information acquisition device
KR101273585B1 (en) 2011-12-05 2013-06-11 삼성전자주식회사 Ultrasound imaging apparatus and display method of ultrasound image
JP6146955B2 (en) * 2012-03-13 2017-06-14 キヤノン株式会社 Apparatus, display control method, and program
JP6071260B2 (en) * 2012-06-13 2017-02-01 キヤノン株式会社 Subject information acquisition apparatus and information processing method
JP5984547B2 (en) 2012-07-17 2016-09-06 キヤノン株式会社 Subject information acquisition apparatus and control method thereof
JP6053512B2 (en) * 2012-12-28 2016-12-27 キヤノン株式会社 Subject information acquisition apparatus and control method thereof
JP6029521B2 (en) * 2013-04-15 2016-11-24 株式会社アドバンテスト Photoacoustic wave measuring apparatus, method, program, and recording medium
JP6425527B2 (en) * 2013-12-17 2018-11-21 キヤノン株式会社 Photoacoustic apparatus, signal processing method, and program
KR101654675B1 (en) 2014-02-03 2016-09-06 삼성메디슨 주식회사 Method, apparatus and system for generating diagnostic image using photoacoustic material
JP6391249B2 (en) * 2014-02-10 2018-09-19 キヤノン株式会社 Subject information acquisition apparatus and signal processing method
US10342436B2 (en) * 2014-08-26 2019-07-09 Canon Kabushiki Kaisha Object information acquiring apparatus and processing method
JP6664176B2 (en) * 2014-09-30 2020-03-13 キヤノン株式会社 Photoacoustic apparatus, information processing method, and program
JP6544910B2 (en) * 2014-11-07 2019-07-17 キヤノン株式会社 INFORMATION PROCESSING APPARATUS, OBJECT INFORMATION ACQUIRING APPARATUS, AND METHOD OF DETERMINING SOUND SPEED
JP6497896B2 (en) * 2014-11-18 2019-04-10 キヤノン株式会社 Information acquisition device
DE102014226827A1 (en) * 2014-12-22 2016-06-23 Robert Bosch Gmbh Method, apparatus and sensor for determining an absorption behavior of a medium
JP2016152879A (en) * 2015-02-20 2016-08-25 キヤノン株式会社 Subject information acquisition apparatus
JP6512969B2 (en) 2015-07-06 2019-05-15 キヤノン株式会社 PROCESSING APPARATUS, PHOTOACOUSTIC APPARATUS, PROCESSING METHOD, AND PROGRAM
US9987089B2 (en) 2015-07-13 2018-06-05 University of Central Oklahoma Device and a method for imaging-guided photothermal laser therapy for cancer treatment
EP3344983A1 (en) * 2015-08-31 2018-07-11 C/o Canon Kabushiki Kaisha Photoacoustic object information obtaining apparatus and method
JP2017047178A (en) * 2015-09-04 2017-03-09 キヤノン株式会社 Subject information acquisition device
JP2017047056A (en) * 2015-09-04 2017-03-09 キヤノン株式会社 Subject information acquisition device
US20170086679A1 (en) * 2015-09-24 2017-03-30 Canon Kabushiki Kaisha Photoacoustic apparatus and method for acquiring object information
JP6632368B2 (en) 2015-12-21 2020-01-22 キヤノン株式会社 Information processing device, photoacoustic device, information processing method, and program
JP2018050775A (en) * 2016-09-27 2018-04-05 キヤノン株式会社 Photoacoustic apparatus, information processing method, and program
JP6929048B2 (en) 2016-11-30 2021-09-01 キヤノン株式会社 Display control device, display method, and program
CN108113650A (en) * 2016-11-30 2018-06-05 佳能株式会社 Display control unit, display control method and storage medium
JP2018110734A (en) * 2017-01-12 2018-07-19 キヤノン株式会社 Analyte information acquisition device and analyte information acquisition method
JP6501820B2 (en) * 2017-05-17 2019-04-17 キヤノン株式会社 Processing device, processing method, and program
CN107607473B (en) * 2017-08-31 2020-05-19 华南师范大学 Simultaneous multipoint excitation and matching received photoacoustic three-dimensional imaging device and method
JP2018012027A (en) * 2017-10-26 2018-01-25 キヤノン株式会社 Structure of recorded data
CN115024739B (en) * 2022-08-11 2022-11-29 之江实验室 Method for measuring distribution of Getsiram parameter in organism and application

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007229320A (en) * 2006-03-03 2007-09-13 Nippon Telegr & Teleph Corp <Ntt> Component concentration measuring apparatus
US20080221647A1 (en) * 2007-02-23 2008-09-11 The Regents Of The University Of Michigan System and method for monitoring photodynamic therapy
US20080306371A1 (en) * 2007-06-11 2008-12-11 Canon Kabushiki Kaisha Intravital-information imaging apparatus

Family Cites Families (26)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0616881B2 (en) 1986-01-17 1994-03-09 石川島播磨重工業株式会社 Slab forming equipment row
US5781294A (en) * 1991-12-24 1998-07-14 Hitachi, Ltd. Method and apparatus for detecting photoacoustic signal to detect surface and subsurface information of the specimen
US5473392A (en) 1992-05-01 1995-12-05 Summit Technology, Inc. Method and system for topographic measurement
US5673114A (en) * 1995-06-27 1997-09-30 Nikon Corporation Apparatus for measuring optical absorption of sample and sample holder applicable to the same
US5713356A (en) 1996-10-04 1998-02-03 Optosonics, Inc. Photoacoustic breast scanner
GB9704737D0 (en) 1997-03-07 1997-04-23 Optel Instr Limited Biological measurement system
KR100493154B1 (en) * 2002-03-20 2005-06-03 삼성전자주식회사 Apparatus of non-invasive measurement of bio-fluid concentration by using photoacoustic spectroscopy
EP1810610B1 (en) * 2006-01-20 2016-09-14 Olympus Corporation Method and apparatus for analyzing characteristic information of object with the use of mutual interaction between ultrasound wave and light
RU2008151407A (en) * 2006-05-25 2010-06-27 Конинклейке Филипс Электроникс Н.В. (Nl) METHOD OF PHOTOACOUSTIC VISUALIZATION
EP2086396A1 (en) * 2006-11-21 2009-08-12 Koninklijke Philips Electronics N.V. A system, device, method, computer-readable medium, and use for in vivo imaging of tissue in an anatomical structure
EP1935346A1 (en) * 2006-12-21 2008-06-25 Stichting voor de Technische Wetenschappen Imaging apparatus and method
JP4469903B2 (en) * 2007-06-11 2010-06-02 キヤノン株式会社 Biological information imaging device
US20090005685A1 (en) * 2007-06-29 2009-01-01 Canon Kabushiki Kaisha Ultrasonic probe and inspection apparatus equipped with the ultrasonic probe
WO2009011934A1 (en) * 2007-07-17 2009-01-22 University Of Florida Research Foundation, Inc. Method and apparatus for tomographic imaging of absolute optical absorption coefficient in turbid media using combined photoacoustic and diffusing light measurements
US20090105588A1 (en) * 2007-10-02 2009-04-23 Board Of Regents, The University Of Texas System Real-Time Ultrasound Monitoring of Heat-Induced Tissue Interactions
WO2009102036A1 (en) 2008-02-13 2009-08-20 Toto Ltd. Shower apparatus
WO2009158146A2 (en) * 2008-05-30 2009-12-30 Stc.Unm Photoacoustic imaging devices and methods of making and using the same
JP5572293B2 (en) * 2008-07-07 2014-08-13 株式会社日立ハイテクノロジーズ Defect inspection method and defect inspection apparatus
JP5189912B2 (en) * 2008-07-11 2013-04-24 キヤノン株式会社 Photoacoustic measuring device
JP5460000B2 (en) * 2008-08-20 2014-04-02 キヤノン株式会社 Imaging apparatus and imaging method
CN102131463B (en) * 2008-08-27 2013-01-16 佳能株式会社 Device for processing information relating to living body and method for processing information relating to living body
JP5541662B2 (en) 2008-09-12 2014-07-09 キヤノン株式会社 Subject information acquisition apparatus and control method thereof
JP2010088627A (en) * 2008-10-07 2010-04-22 Canon Inc Apparatus and method for processing biological information
US20110142316A1 (en) * 2009-10-29 2011-06-16 Ge Wang Tomography-Based and MRI-Based Imaging Systems
WO2011052061A1 (en) * 2009-10-29 2011-05-05 キヤノン株式会社 Photo-acoustic device
JP5675142B2 (en) * 2010-03-29 2015-02-25 キヤノン株式会社 Subject information acquisition apparatus, subject information acquisition method, and program for executing subject information acquisition method


Also Published As

Publication number Publication date
US10390706B2 (en) 2019-08-27
RU2012145885A (en) 2014-05-10
CN102843960A (en) 2012-12-26
WO2011122382A1 (en) 2011-10-06
CN104644126A (en) 2015-05-27
CN104644126B (en) 2017-09-05
JP5675142B2 (en) 2015-02-25
BR112012021776A2 (en) 2016-05-17
JP2011206192A (en) 2011-10-20
EP2552299B1 (en) 2020-12-30
CN104644127A (en) 2015-05-27
RU2535602C2 (en) 2014-12-20
CN104644127B (en) 2017-12-15
US20130006088A1 (en) 2013-01-03
EP2552299A1 (en) 2013-02-06
KR20120126109A (en) 2012-11-20
CN102843960B (en) 2015-02-18
KR101483502B1 (en) 2015-01-21

Similar Documents

Publication Publication Date Title
US20190350460A1 (en) Photoacoustic imaging apparatus, photoacoustic imaging method, and program for executing photoacoustic imaging method
US20160338596A1 (en) Photoacoustic imaging apparatus and photoacoustic imaging method
US9615751B2 (en) Object information acquiring apparatus and object information acquiring method
US20120296192A1 (en) Photoacoustic imaging apparatus and photoacoustic imaging method
US8942058B2 (en) Display data obtaining apparatus and display data obtaining method
JP5489624B2 (en) measuring device
JP2012135462A (en) Device and method for acquiring test object information
US9572531B2 (en) Object information acquiring apparatus and control method thereof
JP2011217914A (en) Photoacoustic imaging apparatus, photoacoustic imaging method, and program
CN106175666B (en) Subject information acquisition device
US11006929B2 (en) Object information acquiring apparatus and signal processing method
KR101899838B1 (en) Photoacoustic apparatus and information acquisition apparatus
JP2012125447A (en) Apparatus and method for acquiring subject information
JP6456129B2 (en) SUBJECT INFORMATION ACQUISITION DEVICE, ITS CONTROL METHOD, AND LIGHT CONTROL METHOD
JP6016881B2 (en) Photoacoustic imaging apparatus, photoacoustic imaging method, and program for executing photoacoustic imaging method
US20140066744A1 (en) Object information acquiring apparatus
JP6272427B2 (en) Photoacoustic imaging apparatus, photoacoustic imaging method, and program for executing photoacoustic imaging method

Legal Events

Date Code Title Description
STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION