WO2019031607A1 - Photoacoustic apparatus and object information acquiring method - Google Patents


Info

Publication number: WO2019031607A1
Authority: WO (WIPO, PCT)
Prior art keywords: light, imaging, period, electric signal, photoacoustic
Application number: PCT/JP2018/030104
Other languages: French (fr)
Inventor: Naoto Abe
Original assignee: Canon Kabushiki Kaisha
Application filed by: Canon Kabushiki Kaisha
Publication of: WO2019031607A1


Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B 5/0093: Detecting, measuring or recording by applying one single type of energy and measuring its conversion into another type of energy
    • A61B 5/0095: Detecting, measuring or recording by applying one single type of energy and measuring its conversion into another type of energy by applying light and detecting acoustic waves, i.e. photoacoustic measurements
    • A61B 5/145: Measuring characteristics of blood in vivo, e.g. gas concentration, pH value; Measuring characteristics of body fluids or tissues, e.g. interstitial fluid, cerebral tissue
    • A61B 5/24: Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
    • A61B 5/316: Modalities, i.e. specific diagnostic methods
    • A61B 5/318: Heart-related electrical modalities, e.g. electrocardiography [ECG]
    • A61B 5/346: Analysis of electrocardiograms
    • A61B 5/349: Detecting specific parameters of the electrocardiograph cycle
    • A61B 5/352: Detecting R peaks, e.g. for synchronising diagnostic apparatus; Estimating R-R interval
    • A61B 5/48: Other medical applications
    • A61B 5/4887: Locating particular structures in or on the body
    • A61B 5/489: Blood vessels
    • A61B 5/72: Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B 5/7271: Specific aspects of physiological measurement analysis
    • A61B 5/7285: Specific aspects of physiological measurement analysis for synchronising or triggering a physiological measurement or image acquisition with a physiological event or waveform, e.g. an ECG signal
    • A61B 5/7289: Retrospective gating, i.e. associating measured signals or images with a physiological event after the actual measurement or image acquisition, e.g. by simultaneously recording an additional physiological signal during the measurement or image acquisition
    • A61B 5/7292: Prospective gating, i.e. predicting the occurrence of a physiological event for use as a synchronisation signal
    • A61B 5/74: Details of notification to user or communication with user or patient; user input means
    • A61B 5/742: Details of notification to user or communication with user or patient using visual displays
    • A61B 5/7425: Displaying combinations of multiple images regardless of image source, e.g. displaying a reference anatomical image with a live image
    • A61B 8/00: Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/44: Constructional features of the ultrasonic, sonic or infrasonic diagnostic device
    • A61B 8/4444: Constructional features of the ultrasonic, sonic or infrasonic diagnostic device related to the probe
    • A61B 8/4477: Constructional features of the ultrasonic, sonic or infrasonic diagnostic device using several separate ultrasound transducers or probes
    • A61B 8/54: Control of the diagnostic device
    • A61B 8/543: Control of the diagnostic device involving acquisition triggered by a physiological signal
    • A61B 2576/00: Medical imaging apparatus involving image processing or analysis
    • G: PHYSICS
    • G16: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H: HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 30/00: ICT specially adapted for the handling or processing of medical images
    • G16H 30/40: ICT specially adapted for the handling or processing of medical images for processing medical images, e.g. editing

Definitions

  • The present invention relates to an apparatus that acquires information on an object by using the photoacoustic effect.
  • Photoacoustic tomography (PAT) is one such technique.
  • An apparatus using photoacoustic tomography can detect oxyhemoglobin and deoxyhemoglobin in blood, which are light absorbers, whereby the structural information of blood vessels and such functional information as oxygen saturation can be acquired.
  • PTL 1 discloses a photoacoustic apparatus that acquires and images a photoacoustic signal at a timing that matches the variation period of the blood volume.
  • The apparatus according to PTL 1 generates an image at a frame rate that matches the pulsation.
  • Human pulsation is relatively slow (60 to 100 beats per minute); hence, if the object position changes during an image generation interval, the generated image is blurred.
  • a photoacoustic apparatus includes: a light source configured to irradiate an object with light; an acoustic wave detection unit configured to receive an acoustic wave generated in the object by the irradiation of the light, and to convert the acoustic wave into an electric signal; a control unit configured to set an imaging period corresponding to an approximately periodic motion of the object, and to execute the irradiation of the light and acquisition of the electric signal at a timing in an approximately same phase in each imaging period; a composition unit configured to compose a plurality of electric signals acquired in each of the imaging periods; and an image generating unit configured to generate image data representing characteristic information of the object, based on the composed electric signal.
  • A photoacoustic apparatus according to another aspect includes: a light source configured to irradiate an object with light; an acoustic wave detection unit configured to receive an acoustic wave generated in the object by the irradiation of the light, and to convert the acoustic wave into an electric signal; a control unit configured to set an imaging period corresponding to an approximately periodic motion of the object, and to execute the irradiation of the light and acquisition of the electric signal at a timing in an approximately same phase in each imaging period; an image generating unit configured to generate image data representing characteristic information of the object in each of the imaging periods, based on the acquired electric signal; and a composition unit configured to compose a plurality of image data generated in each of the imaging periods.
  • An object information acquiring method according to the present invention is performed by a photoacoustic apparatus including a light source configured to irradiate an object with light, and an acoustic wave detection unit configured to receive an acoustic wave generated in the object by the irradiation of the light, and to convert the acoustic wave into an electric signal, the method including: a control step of setting an imaging period corresponding to an approximately periodic motion of the object, and executing the irradiation of the light and the acquisition of the electric signal at a timing in an approximately same phase in each imaging period; a composition step of composing a plurality of electric signals acquired in each of the imaging periods; and an image generating step of generating image data representing characteristic information of the object, based on the composed electric signal.
  • Another object information acquiring method according to the present invention is performed by a photoacoustic apparatus including a light source configured to irradiate an object with light, and an acoustic wave detection unit configured to receive an acoustic wave generated in the object by the irradiation of the light, and to convert the acoustic wave into an electric signal, the method including: a control step of setting an imaging period corresponding to an approximately periodic motion of the object, and executing the irradiation of the light and the acquisition of the electric signal at a timing in an approximately same phase in each imaging period; an image generating step of generating image data representing characteristic information of the object in each of the imaging periods, based on the acquired electric signal; and a composition step of composing a plurality of image data generated in each of the imaging periods.
  • According to the present invention, the influence of the motion of an object can be reduced, and the accuracy of the measurement can be improved in the photoacoustic apparatus. Further features of the present invention will become apparent from the following description of exemplary embodiments with reference to the attached drawings.
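As a rough sketch, the gated acquisition summarized above (set an imaging period from the approximately periodic motion, then irradiate and sample at approximately the same phase in every period) might look like the following. All function names and parameter values are illustrative assumptions, not taken from the patent.

```python
# Sketch of ECG-gated photoacoustic acquisition: one imaging period per
# heartbeat (delimited by R peaks), with light pulses fired at the same
# phase offset in every period. All names and values are illustrative.

def gated_schedule(r_peak_times, phase_delay, n_pulses, pulse_interval):
    """Return one list of light-emission times per imaging period,
    starting at the same phase offset after each R peak."""
    schedule = []
    for r in r_peak_times:
        start = r + phase_delay               # same phase in every period
        schedule.append([start + k * pulse_interval for k in range(n_pulses)])
    return schedule

# Three R peaks 0.8 s apart (75 bpm), firing 4 pulses 1 ms apart,
# starting 10 ms after each peak:
sched = gated_schedule([0.0, 0.8, 1.6], 0.010, 4, 0.001)
```

Because every burst of pulses lands at the same cardiac phase, the signals acquired in different periods can be composed without pulsation blur.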
  • Fig. 1 is a system block diagram of a photoacoustic apparatus according to Embodiment 1.
  • Fig. 2A is a schematic diagram of a probe 180 according to Embodiment 1.
  • Fig. 2B is a schematic diagram of the probe 180 of Embodiment 1.
  • Fig. 3 is a hardware block diagram according to Embodiment 1.
  • Fig. 4 is an operation timing chart of the photoacoustic apparatus according to Embodiment 1.
  • Fig. 5A is an example of a screen to set a timing for sampling.
  • Fig. 5B is an example of a screen to set a timing for sampling.
  • Fig. 5C is an example of a screen to set a timing for sampling.
  • Fig. 6 is an operation timing chart of a photoacoustic apparatus according to Embodiment 2.
  • Fig. 7 is a hardware block diagram according to Embodiment 3.
  • The present invention relates to a technique to detect an acoustic wave propagated from an object, and to generate and acquire the characteristic information of the inside of the object.
  • The present invention may be regarded as a photoacoustic apparatus (object information acquiring apparatus), a control method thereof, or an object information acquiring method.
  • The present invention may also be regarded as a program which causes an information processing apparatus, equipped with such hardware resources as a CPU and memory, to execute these methods, or as a computer-readable non-transitory storage medium storing this program.
  • The photoacoustic apparatus (object information acquiring apparatus) is an apparatus that utilizes the photoacoustic effect: it receives an acoustic wave generated inside the object when the object is irradiated with light (an electromagnetic wave), and acquires the characteristic information of the object as image data.
  • the characteristic information refers to information on the characteristic values corresponding to a plurality of positions inside the object respectively, and these characteristic values are generated using received signals which are acquired by receiving a photoacoustic wave.
  • the characteristic information acquired by the photoacoustic measurement refers to the values reflecting the absorption rate of the light energy.
  • the characteristic information includes a generation source of an acoustic wave which was generated by the light irradiation, an initial sound pressure inside the object, a light energy absorption density and an absorption coefficient derived from the initial sound pressure, and a concentration of a substance constituting a tissue.
  • spectral information such as the concentration of a substance constituting the object, can be acquired.
  • the spectral information may be an oxygen saturation, a value generated by weighting the oxygen saturation using intensity (e.g.
  • the spectral information may also be a glucose concentration, a collagen concentration, a melanin concentration, or a volume percentage of fat or water.
  • the distribution data can be generated as image data.
  • the characteristic information may be determined, not as numeric data, but as the distribution information at each position inside the object. In other words, such distribution information as the initial sound pressure distribution, the energy absorption density distribution, the absorption coefficient distribution, and the oxygen saturation distribution may be determined.
  • the “acoustic wave” in the present description is typically an ultrasonic wave, including an elastic wave called a “sound wave” or a “photoacoustic wave”.
  • An electric signal, which was converted from an acoustic wave by a probe or the like, is called an “acoustic signal”.
  • Such phrases as “ultrasonic wave” or “acoustic wave” in this description, however, are not intended to limit the wavelengths of these elastic waves.
  • An acoustic wave generated due to the photoacoustic effect is called a “photoacoustic wave” or a “light-induced ultrasonic wave”.
  • An electric signal, which originates from a photoacoustic wave is called a “photoacoustic signal”.
  • the photoacoustic signal includes both an analog signal and a digital signal.
  • the distribution data is also called “photoacoustic image data” or “reconstructed image data”.
  • a photoacoustic apparatus is an apparatus configured to irradiate an object with pulsed light, and generate an image of blood vessels (structural image) inside the object by receiving a photoacoustic wave generated inside the object.
  • a photoacoustic apparatus having a hand held probe is used, but the present invention can also be applied to a photoacoustic apparatus which has a probe on a mechanical stage so as to scan the object mechanically.
  • the photoacoustic apparatus according to Embodiment 1 includes a probe 180, a signal collecting unit 140, a computer 150, a display unit 160 and an input unit 170.
  • the probe 180 includes a light source unit 200, an optical system 112, a light irradiating unit 113, and a reception unit 120.
  • the computer 150 includes a computing unit 151, a storage unit 152, a control unit 153 and a frame rate converting unit 159.
  • the light source unit 200 periodically supplies a pulsed light to the light irradiating unit 113 via the optical system 112 constituted by optical fibers (bundle fibers) and the like.
  • the light irradiating unit 113 irradiates the object 100 with the supplied light. Thereby an acoustic wave is periodically generated from the object 100.
  • the reception unit 120 receives a photoacoustic wave generated from the object 100, and outputs an analog electric signal.
  • the signal collecting unit 140 converts the analog signal, which was outputted from the reception unit 120, into a digital signal, and outputs the digital signal to the computer 150.
  • the period of irradiating the object with the pulsed light and acquiring the electric signal is called a “sampling period”.
  • the computer 150 performs processing to compose a digital signal, which is outputted from the signal collecting unit 140, at each sampling period, and stores the composed signal in the storage unit 152.
  • The composition is not limited to simple addition; it includes weighted addition, averaging, moving average and the like. In the following, averaging will be primarily described, but a composing method other than averaging may be used.
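A minimal sketch of such composition, assuming per-pulse signals stored as lists of samples (the data and function names are illustrative, not the patent's implementation):

```python
# Composing per-pulse photoacoustic signals acquired in one imaging
# period. Data and names are illustrative assumptions.

def average(signals):
    """Simple averaging: mean across pulses, sample by sample."""
    n = len(signals)
    return [sum(col) / n for col in zip(*signals)]

def weighted_average(signals, weights):
    """Weighted addition normalized by the total weight."""
    total = sum(weights)
    return [sum(w * s for w, s in zip(weights, col)) / total
            for col in zip(*signals)]

# Four pulses, five samples each:
sig = [[0, 1, 2, 1, 0],
       [0, 3, 4, 3, 0],
       [0, 1, 2, 1, 0],
       [0, 3, 4, 3, 0]]
avg = average(sig)                        # -> [0.0, 2.0, 3.0, 2.0, 0.0]
wavg = weighted_average(sig, [1, 1, 1, 1])   # equals avg for equal weights
```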
  • the computer 150 performs such processing as image reconstruction for the digital signals stored in the storage unit 152, so as to generate the photoacoustic image data in each period corresponding to the imaging frame rate (hereafter “imaging period”). Concrete processing thereof will be described later.
  • the computer 150 outputs the generated photoacoustic image data to the frame rate converting unit 159 in each imaging period.
  • The frame rate converting unit 159 converts the photoacoustic image data, which is inputted in each imaging period, to the refresh rate of the display unit 160 (hereafter "display period"). Details on the method will be described later. Then the display unit 160 refreshes and displays the photoacoustic image data in each display period.
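Since the imaging frame rate follows the heartbeat while the display refreshes far more often, one simple conversion policy is to repeat the most recent image each display period. The policy and all names below are assumptions for the sketch, since the patent defers the details:

```python
# Frame rate conversion by repeating the latest available image:
# for each display refresh, pick the newest image whose timestamp
# is not in the future. Names and policy are illustrative.

def convert_frame_rate(frame_times, duration, display_hz):
    """Return, for each display refresh in [0, duration), the index of
    the most recent image available at that refresh instant."""
    shown = []
    idx = 0
    t = 0.0
    while t < duration:
        while idx + 1 < len(frame_times) and frame_times[idx + 1] <= t:
            idx += 1
        shown.append(idx)
        t += 1.0 / display_hz
    return shown

# Images produced at t = 0 s and t = 1 s; 4 Hz display over 2 s:
shown = convert_frame_rate([0.0, 1.0], 2.0, 4)
```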
  • The user of the apparatus (e.g. a physician or technician) can confirm the photoacoustic image displayed on the display unit 160 so as to perform diagnosis.
  • the display image may be stored in the memory of the computer 150, or a data management system connected with the photoacoustic apparatus via a network, based on the storing instruction from the user or the computer 150.
  • the user of the apparatus can input data to the apparatus via the input unit 170.
  • Fig. 2A is a schematic diagram of the probe 180 according to the present embodiment.
  • the probe 180 includes a light source unit 200, the optical system 112, the light irradiating unit 113, the reception unit 120, and a housing 181.
  • the housing 181 is a case which houses the light source unit 200, the optical system 112, the light irradiating unit 113 and the reception unit 120. By holding the housing 181, the user can use the probe 180 as a handheld probe.
  • the light irradiating unit 113 is a unit configured to irradiate the object with the pulsed light, which is propagated by the optical system 112.
  • the probe 180 in Fig. 2A is connected to the signal collecting unit 140 via a cable 182.
  • the cable 182 includes a wire to supply power to the light source unit 200, a wire to transmit a light emission control signal, and a wire to output an analog signal, which is outputted from the reception unit 120, to the signal collecting unit 140 (these wires are not illustrated).
  • A connector may be disposed in the cable 182, so as to detachably attach the probe 180 to the other composing elements of the photoacoustic apparatus. Further, as illustrated in Fig. 2B, a semiconductor laser, a light emitting diode or the like may be used as the light source unit 200, so that the pulsed light is directly radiated to the object without using the optical system 112.
  • In this case, the light emitting end portion of the semiconductor laser, LED or the like (the front end of the housing) becomes the light irradiating unit 113.
  • the light source unit 200 is a unit configured to generate a light which is radiated to an object 100.
  • the light source is preferably a laser light source in order to acquire high power, but a light emitting diode, flash lamp or the like may be used instead of laser.
  • Various lasers, such as a solid-state laser, a gas laser, a dye laser and a semiconductor laser, can be used.
  • the irradiation timing, waveform, intensity and the like are controlled by a light source control unit (not illustrated).
  • This light source control unit may be integrated with the light source.
  • It is preferable to use a light source which can output a plurality of wavelengths.
  • the light source unit 200 To install the light source unit 200 in the housing 181, it is preferable to use a semiconductor light emitting element, such as a semiconductor laser and a light emitting diode, as shown in Fig. 2B. Further, to output a plurality of wavelengths, the wavelengths may be switched using a plurality of types of semiconductor laser and light emitting diodes which generate lights having different wavelengths.
  • the light must be radiated in a sufficiently short time in accordance with the thermal characteristics of the object.
  • The pulse width of the pulsed light generated from the light source is preferably 10 nanoseconds to 1 microsecond.
  • the wavelength of the pulsed light is preferably a wavelength at which the light propagates into the object.
  • the wavelength of the pulsed light is preferably at least 400 nm and not more than 1600 nm. Needless to say, the wavelength may be determined in accordance with the light absorption characteristic of a light absorber to be imaged.
  • a wavelength at which the light is well absorbed by the blood vessels (at least 400 nm, not more than 800 nm) may be used.
  • a light having a wavelength at which the light is not absorbed very much by the background tissue (e.g. water, fat) of the living body (at least 700 nm, not more than 1100 nm) may be used.
  • In the present embodiment, a semiconductor light emitting element is used as the light source; hence a large quantity of light cannot be radiated to the object, and the photoacoustic signal that can be acquired by one irradiation may not easily reach a desired S/N ratio. Therefore, light is emitted from the light source in each sampling period, and the acquired photoacoustic signals are averaged, so as to improve the S/N ratio.
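The benefit of this averaging can be checked numerically: averaging N pulses of uncorrelated noise reduces its standard deviation by roughly sqrt(N). A small simulation (illustrative only; the numbers are not from the patent):

```python
import random
import statistics

# Averaging the photoacoustic signals of N light pulses reduces
# uncorrelated noise by about sqrt(N), which is why a low-power
# semiconductor source can still reach a usable S/N ratio.
random.seed(1)

def noise_std_after_averaging(n_pulses, n_trials=2000):
    """Std. deviation of the mean of n_pulses unit-variance noise samples."""
    means = [statistics.fmean(random.gauss(0.0, 1.0) for _ in range(n_pulses))
             for _ in range(n_trials)]
    return statistics.pstdev(means)

s1 = noise_std_after_averaging(1)     # roughly 1.0
s16 = noise_std_after_averaging(16)   # roughly 0.25
# s1 / s16 is close to sqrt(16) = 4
```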
  • An example of a preferable wavelength of the light source unit 200 that is used for the present embodiment is 797 nm.
  • This wavelength is a wavelength at which the light reaches the deep region inside the object, and at which absorption coefficients of the oxyhemoglobin and deoxyhemoglobin are approximately the same, therefore this wavelength is suitable for detecting the blood vessel structure. If 756 nm is additionally used as the second wavelength, the oxygen saturation can be determined, using the absorption coefficient difference between the oxyhemoglobin and deoxyhemoglobin.
  • the light irradiating unit 113 is a portion to emit the light which is radiated to the object (emitting end). If bundle fiber is used as the optical system 112, the terminal portion is the light irradiating unit 113. In the case when a part of the living body (e.g. breast) is the object, a diffuser or the like, to diffuse light, may be disposed in the light irradiating unit 113. Thereby the beam diameter of the pulsed light is expanded, and the pulsed light can be radiated to the object in this state. Further, in the case of using a plurality of semiconductor light emitting elements as the light source unit 200, as illustrated in Fig. 2B, the light irradiating unit 113 is constituted by aligning the light emitting end portion of each element (housing front end), whereby the object can be irradiated over a wide range.
  • the reception unit 120 is a unit including: a transducer (acoustic wave detection element) which receives a photoacoustic wave generated by a pulsed light, and outputs an electric signal; and a support member that supports the transducer.
  • As the transducer, one using a piezoelectric material, a capacitive type transducer (CMUT), a transducer using a Fabry-Perot interferometer, and the like may be used, for example.
  • Examples of piezoelectric material are a piezoelectric ceramic material, such as PZT (lead zirconate titanate), or a high polymer piezoelectric film material, such as PVDF (polyvinylidene fluoride).
  • the electric signal acquired by the transducer is a time-resolved signal.
  • the amplitude of the acquired electric signal is a value based on the sound pressure which the transducer received at each timing (e.g. value in proportion to sound pressure).
  • The transducer can detect the frequency components constituting the photoacoustic wave (typically 100 kHz to 10 MHz).
  • A plurality of transducers may be arranged on the support member so as to form a plane or a curved plane, such as a 1D array, a 1.5D array, a 1.75D array or a 2D array.
  • the reception unit 120 may include an amplifier configured to amplify time-series analog signals which are outputted from the transducer. Further, the reception unit 120 may include an A/D convertor to convert time-series analog signal which are outputted from the transducer into time series digital signals. In other words, the reception unit 120 may also play a part of the signal collecting unit 140.
  • the handheld probe was described as an example, but in order to improve image accuracy, it is preferable to use transducers which completely surround the object 100, so that the acoustic wave can be detected at various angles. If the object 100 is too large to be completely surrounded by transducers, transducers may be arranged on a hemispherical support member. In the case when the probe includes the reception unit having such a shape, the probe may be relatively moved mechanically with respect to the object 100. To move the probe, such a mechanism as an XY stage may be used.
  • the arrangement and number of transducers, and the shape of the support member are not limited to the above description, but may be optimized in accordance with the object 100.
  • a medium to propagate the photoacoustic wave (acoustic matching material) is preferably disposed in a space between the reception unit 120 and the object 100. Thereby the acoustic impedance of the object 100 and that of the transducer at the interface therebetween can be matched.
  • the acoustic matching material can be water, oil or ultrasonic gel, for example.
  • the photoacoustic apparatus according to the present embodiment may include a holding member, which holds the object 100 and stabilizes the shape of the object 100.
  • the holding member preferably has both high light transmissivity and high acoustic wave transmissivity. Polymethyl pentene, polyethylene terephthalate, acryl or the like, for example, can be used.
  • the transducer may have the function of a transmission unit which transmits an acoustic wave.
  • the transducer as the reception unit and the transducer as the transmission unit may be the same unit, or be separate units.
  • the signal collecting unit 140 includes an amplifier that amplifies an analog electric signal outputted from the reception unit 120, and an A/D convertor that converts an analog signal, which is outputted from the amplifier, into a digital signal.
  • the signal collecting unit 140 may be composed of an FPGA (Field Programmable Gate Array) chip and the like.
  • The rate of the A/D conversion is preferably at least double the bandwidth of the input signal. As mentioned above, if the frequency components constituting the photoacoustic wave are 100 kHz to 10 MHz, the A/D conversion rate is at least 20 MHz, preferably at least 40 MHz.
  • The signal collecting unit 140 synchronizes the timing of the light irradiation and the timing of the signal collecting processing by using the light emission control signal. In other words, based on the light emission timing in each sampling period, the signal collecting unit 140 starts A/D conversion at the rate mentioned above, and converts the analog signals into digital signals. As a result, a digital signal string is acquired for each transducer at intervals of the reciprocal of the A/D conversion rate (the period of the A/D conversion clock).
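For a sense of scale, the length of this digital signal string per transducer follows from the imaging depth, the speed of sound and the A/D rate; the values below (soft-tissue sound speed, 4 cm depth) are typical assumptions, not figures from the patent:

```python
# Length of the digital signal string per transducer: the A/D clock
# runs from the light emission until the photoacoustic wave from the
# deepest point of interest arrives. Depth and sound speed are assumed.

SPEED_OF_SOUND = 1500.0   # m/s in soft tissue (typical assumption)

def samples_per_channel(depth_m, adc_rate_hz):
    """A/D samples covering the one-way flight time from depth_m."""
    time_of_flight = depth_m / SPEED_OF_SOUND
    return int(time_of_flight * adc_rate_hz)

n = samples_per_channel(0.04, 40e6)   # 4 cm depth at 40 MHz: ~1066 samples
```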
  • the signal collecting unit 140 is also called the “Data Acquisition System (DAS)”.
  • The signal collecting unit 140 may be disposed inside the housing 181. If this configuration is used, information between the probe 180 and the computer 150 can be transmitted as digital signals, hence noise resistance improves. Further, compared with transmission by analog signals, the number of wires can be decreased, and the operability of the probe 180 improves.
  • the later mentioned averaging may be performed by the signal collecting unit 140. In this case, it is preferable to use such hardware as an FPGA to perform the averaging.
  • The computer 150 is an arithmetic unit (the image generating unit and display control unit in the present invention) that includes a computing unit 151, a storage unit 152, a control unit 153 and a frame rate converting unit 159.
  • the computing unit 151 which performs the computing function is constituted by such processors as a CPU and GPU (Graphics Processing Unit), and by such arithmetic circuits as an FPGA (Field Programmable Gate Array) chip. These units may be a single processor or a single arithmetic circuit, or may be constituted of a plurality of processors and arithmetic circuits.
  • the computer 150 performs the following processing operations for each of the plurality of transducers. First, the computer 150 composes the plurality of digital signals (photoacoustic signals) outputted from the signal collecting unit 140 in each sampling period. The composed photoacoustic signal is stored in the storage unit 152 in each imaging period. Then the computing unit 151 reconstructs the image in each imaging period, based on the composed photoacoustic signal stored in the storage unit 152, so as to generate the photoacoustic image (structural image, functional image), and executes other arithmetic processing operations. The imaging period is set at timings synchronized with the heartbeat of the examinee, using the electrocardiograph 173 described later. The method of setting the imaging period will be described later.
  • any reconstruction method, such as a back projection method in the time domain, a back projection method in the Fourier domain, or a model based method (iterative reconstruction), may be used.
  • the back projection methods in the time domain are, for example, universal back projection (UBP), filtered back projection (FBP) and delay-and-sum (phased addition).
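Of the time-domain methods listed above, delay-and-sum is the simplest to illustrate. The following is a hedged sketch only (the array shapes, 2D geometry and uniform sound velocity are assumptions made for illustration), not the apparatus's actual reconstruction code:

```python
import numpy as np

def delay_and_sum(signals, sensor_pos, grid_pos, fs, c=1540.0):
    """Minimal delay-and-sum back projection.
    signals:    (n_sensors, n_samples) signals, t = 0 at light emission
    sensor_pos: (n_sensors, 2) transducer coordinates [m]
    grid_pos:   (n_pixels, 2) reconstruction point coordinates [m]
    fs:         A/D conversion rate [Hz];  c: sound velocity [m/s]"""
    n_sensors, n_samples = signals.shape
    image = np.zeros(len(grid_pos))
    for s in range(n_sensors):
        # time of flight from every reconstruction point to this transducer
        dist = np.linalg.norm(grid_pos - sensor_pos[s], axis=1)
        idx = np.round(dist / c * fs).astype(int)
        valid = idx < n_samples
        image[valid] += signals[s, idx[valid]]
    return image / n_sensors
```

Each reconstruction point accumulates the signal sample whose time of flight matches its distance to the transducer; practical implementations add interpolation, solid-angle weighting and sound velocity corrections.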
  • when the light source unit 200 generates light beams having two different wavelengths, the computing unit 151 generates a first initial sound pressure distribution from the photoacoustic signal originating from the light having the first wavelength, and generates a second initial sound pressure distribution from the photoacoustic signal originating from the light having the second wavelength. Further, the computing unit 151 acquires a first absorption coefficient distribution by correcting the first initial sound pressure distribution using the light quantity distribution of the light having the first wavelength, and acquires a second absorption coefficient distribution by correcting the second initial sound pressure distribution using the light quantity distribution of the light having the second wavelength. Furthermore, an oxygen saturation distribution is acquired based on the first and second absorption coefficient distributions. The content and sequence of the computing are not limited to this, as long as the oxygen saturation distribution can finally be acquired.
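The two-wavelength computation above amounts to solving, per pixel, a 2×2 linear system relating the absorption coefficients to the hemoglobin concentrations. A minimal sketch follows; note that the molar absorption coefficients below are rough placeholder values, and a real implementation must use tabulated coefficients for the actual wavelengths:

```python
import numpy as np

# Molar absorption coefficients of [Hb, HbO2]; the numbers are rough
# placeholders for two near-infrared wavelengths, not authoritative values.
EPS = np.array([[7860.0,  518.0],    # first wavelength  (e.g. ~750 nm)
                [ 693.0, 1058.0]])   # second wavelength (e.g. ~850 nm)

def oxygen_saturation(mu_a1, mu_a2):
    """Per-pixel sO2 from the absorption coefficient distributions
    acquired at the first (mu_a1) and second (mu_a2) wavelengths."""
    mu = np.stack([np.ravel(mu_a1), np.ravel(mu_a2)])  # (2, n_pixels)
    conc = np.linalg.solve(EPS, mu)                    # rows: [Hb], [HbO2]
    so2 = conc[1] / (conc[0] + conc[1])
    return so2.reshape(np.shape(mu_a1))
```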
  • the storage unit 152 is such a volatile memory as RAM (Random Access Memory), or such non-transitory storage media as ROM (Read Only Memory), magnetic disk and flash memory.
  • a storage medium in which a program is stored is a non-transitory storage medium.
  • the storage unit 152 may be constituted of a plurality of storage media.
  • the storage unit 152 can store various data, such as a photoacoustic signal averaged in each imaging period, photoacoustic image data generated by the computing unit 151, and reconstructed image data based on the photoacoustic image data.
  • the control unit 153 is a unit to control the operation of each composing element of the photoacoustic apparatus, and is constituted by arithmetic elements, such as a CPU.
  • the control unit 153 may control each composing element of the photoacoustic apparatus based on an instruction signal (e.g. measurement start signal) which is inputted via the input unit 170. Further, the control unit 153 reads program codes stored in the storage unit 152, and controls the operation of each composing element of the photoacoustic apparatus.
  • the control unit 153 can also adjust a generated image.
  • the frame rate converting unit 159 converts a photoacoustic image generated at a predetermined frame rate corresponding to the imaging period (hereinafter referred to as an imaging frame rate) into a photoacoustic image generated at a predetermined frame rate corresponding to the display period (hereinafter referred to as a display frame rate), and outputs the photoacoustic image to the display unit 160.
  • in Fig. 1, the frame rate converting unit 159 is depicted as an independent unit, but it need not always be an independent unit.
  • a photoacoustic image may be stored in the storage unit 152 at each imaging frame rate, and the stored photoacoustic image may be read in accordance with the display frame rate.
  • the display frame rate is preferably a frame rate corresponding to a general purpose display (e.g. 50 Hz, 60 Hz, 72 Hz, 120 Hz).
  • a frame rate appropriate for measurement and a frame rate appropriate for displaying the image can be set independently.
  • a frame rate appropriate for the measurement can be freely set, regardless of the frame rate appropriate for displaying the image. It is also possible to freely change only the imaging period in accordance with an instruction of the user.
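The imaging-to-display frame rate conversion can be sketched as a simple sample-and-hold: the most recent photoacoustic image is repeated until a new one is generated. The function below is an illustrative sketch, not the actual frame rate converting unit 159:

```python
def display_frames(imaging_frames, imaging_fps, display_fps, duration_s):
    """Sample-and-hold frame rate conversion: at each display instant,
    output the most recent image generated at the imaging frame rate."""
    out = []
    for k in range(int(duration_s * display_fps)):
        t = k / display_fps                               # display instant [s]
        idx = min(int(t * imaging_fps), len(imaging_frames) - 1)
        out.append(imaging_frames[idx])                   # hold latest image
    return out
```

For example, two images generated at 1 Hz and displayed at 4 Hz yield each image repeated four times.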
  • the display unit 160 is a unit to display a photoacoustic image.
  • the display unit 160 refreshes the actual screen, synchronizing with the display frame rate.
  • the display frame rate and the rate of refreshing the actual screen (refresh rate) may be the same.
  • some recent liquid crystal displays have a function to support inputs at a plurality of frame rates (frame frequencies), and some of these displays can convert the inputted frame rate into the rate at which the actual screen is refreshed (refresh rate). A display unit 160 having such a function includes a unit to convert the display frame rate into the actual refresh rate, in which case the computer 150 need not have the frame rate converting unit 159 shown in Fig. 1, and the configuration of the computer 150 can be simplified. The display unit 160 may display information relating to the imaging period and the display period together with the photoacoustic image.
  • the computer 150 may be a custom designed workstation, or may be a general purpose PC or workstation.
  • the computer 150 may be operated in accordance with the instructions of the program stored in the storage unit 152.
  • each component of the computer 150 may be implemented by separate hardware, or at least a part of the components may be implemented by a single hardware component.
  • Fig. 3 is a concrete configuration example of the computer 150 according to the present embodiment.
  • the computer 150 according to the present embodiment includes a CPU 154, a GPU 155, a RAM 156, a ROM 157, an external storage device 158 and a frame rate converting unit 159.
  • a liquid crystal display 161 serving as the display unit 160, a mouse 171 and a keyboard 172 serving as an input unit 170 are connected to the computer 150.
  • the computer 150 and the reception unit 120 may be housed in a common case.
  • a computer housed in a case may perform part of the signal processing operations, and a computer installed outside the case may perform the remainder of the signal processing operations.
  • computers inside and outside the case in total are regarded as the “computer” according to the present embodiment.
  • hardware that constitutes the computer may be distributed.
  • an information processing apparatus installed in a remote area, provided by a cloud computing service or the like may be used for the computer 150.
  • the computer 150 may perform image processing for the acquired photoacoustic image, or perform processing to compose GUI graphics or the like when necessary. These processing operations may be performed before or after the frame rate conversion.
  • the display unit 160 is a display device, such as a liquid crystal display or an organic EL display. The image generated by the computer 150, numeric values at specific positions, and the like are displayed by the display unit 160. The image is inputted to the display unit 160 at a frame rate corresponding to the display period (e.g. 50 Hz, 60 Hz, 72 Hz, 120 Hz), as mentioned above.
  • the display unit 160 may display the image at the inputted frame rate, or may convert the frame rate.
  • the display unit 160 may also display a GUI to operate the images and the apparatus on screen.
  • the input unit 170 is a unit to acquire inputs of instructions and numeric values from the user.
  • the input unit 170 may be an operation console, for example, constituted by a mouse, keyboard and special knobs that the user can operate. If a touch panel is used for the display unit 160, the display unit 160 may also be the input unit 170.
  • the user can specify the start or end of measurement, specify the imaging period (imaging frame rate), and perform such an operation as an image saving instruction.
  • the input unit 170 may acquire various parameters on the sound velocity inside the object, configuration of the holding unit and the like, so that the computer 150 performs processing using this information.
  • the electrocardiograph 173 is a unit to detect an electric signal that is associated with the heart of the patient (examinee).
  • the signal detected by the electrocardiograph 173 is outputted to the computer 150, whereby the computer 150 can detect the pulsation of the examinee.
  • the computer 150 radiates the pulsed light and acquires the photoacoustic signal synchronizing with the output of the electrocardiograph (electrocardiographic waveform), and calculates the reconstructed image data. Details on this processing will be described later.
  • Each composing element of the photoacoustic apparatus described above may be configured as an independent apparatus respectively, or may be integrated into one configuration. Further, at least a part of the configuration of the photoacoustic apparatus may be integrated, and the remainder may be configured as independent apparatuses.
  • An object 100 is not a part of the photoacoustic apparatus of the present invention, but will be described here.
  • the photoacoustic apparatus according to the present embodiment is used to perform diagnosis of malignant cancers, vascular diseases and the like of humans and animals, and to perform follow up observation of chemotherapy.
  • a possible object 100 is the diagnostic target portion of a living body, such as a breast, an organ, blood vessels, the head, the neck, the abdomen, or a limb including fingers and toes, of a human or an animal.
  • if the measurement target is a human body, oxyhemoglobin, deoxyhemoglobin, blood vessels containing a large quantity of oxy- or deoxyhemoglobin, or new blood vessels formed near a malignant tumor may be target light absorbers.
  • plaque or a carotid artery wall may be a target light absorber.
  • such dyes as methylene blue (MB) and indocyanine green (ICG), gold particles, or substances introduced from the outside and formed by integrating or chemically modifying these substances, may be target light absorbers.
  • a needle or a light absorber attached to a needle may be observed.
  • the object may be a phantom or inorganic matter, such as a testing object.
  • Fig. 4 is a timing chart to describe the operation of the photoacoustic apparatus according to Embodiment 1.
  • the abscissa of each timing chart is a time axis.
  • the light source unit 200 emits light in each sampling period (tw1), and the apparatus acquires the photoacoustic signal generated by each light emission.
  • the period in which sampling is performed starts when the delay time DLY has elapsed after the R wave is detected in the electrocardiographic waveform, and ends when the sampling possible time SW has elapsed. Thereby sampling can be performed at a timing in approximately the same phase in each period from the detection of one R wave to the detection of the next R wave.
  • the sampling possible time SW is set sufficiently shorter than the period of the motion of the object living body (contraction of blood vessels caused by the heartbeat). Then a photoacoustic signal that is not influenced by the motion of the object can be acquired.
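The gating by DLY and SW described above can be sketched as follows (the function name and the list-of-timestamps representation of the electrocardiographic waveform are illustrative assumptions): a time instant lies inside the sampling window only if it falls between DLY and DLY + SW after the most recently detected R wave.

```python
def in_sampling_window(t, r_wave_times, dly, sw):
    """True if time t [s] falls inside the sampling window that opens
    DLY seconds after the most recent R wave and stays open for SW."""
    past = [r for r in r_wave_times if r <= t]
    if not past:
        return False           # no R wave detected yet
    r = max(past)              # most recent detected R wave
    return r + dly <= t < r + dly + sw
```

With R waves at 0 s and 1 s, DLY = 0.2 s and SW = 0.1 s, sampling is permitted around 0.25 s and 1.25 s, i.e. at approximately the same phase of each heartbeat.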
  • the length of the sampling period tw1 is preferably set considering the maximum permissible exposure (MPE) to the skin. For example, if the measurement wavelength is 750 nm, the pulse width of the pulsed light is 1 microsecond, and the sampling period tw1 is 0.1 milliseconds, the MPE value for skin is about 14 J/m². On the other hand, if the peak power of the pulsed light radiated from the light irradiating unit 113 is 2 kW, and the irradiation area irradiated by the light irradiating unit 113 is 150 mm², then the light energy radiated to the object 100 is about 13.3 J/m².
  • the light energy radiated from the light irradiating unit 113 is therefore not more than the MPE value. In this way, the light energy does not exceed the MPE value with certainty if the sampling period tw1 is at least 0.1 milliseconds.
  • the light energy that is radiated to the object can be calculated using the value of the sampling period tw1, the peak power of the pulsed light and the irradiation area.
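The MPE check above can be reproduced with the stated example values (750 nm, 1 µs pulse width, 2 kW peak power, 150 mm² irradiation area); the function name is illustrative:

```python
def fluence_per_period(peak_power_w: float, pulse_width_s: float, area_m2: float) -> float:
    """Light energy radiated per sampling period, divided by the
    irradiated area [J/m^2] (one pulse per sampling period)."""
    return peak_power_w * pulse_width_s / area_m2

fluence = fluence_per_period(2e3, 1e-6, 150e-6)  # about 13.3 J/m^2
mpe_skin = 14.0   # about 14 J/m^2 at 750 nm with tw1 = 0.1 ms (value from the text)
assert fluence <= mpe_skin                       # irradiation stays within the MPE
print(fluence)
```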
  • the time-series photoacoustic signals are acquired eight times, once in each sampling period, and then averaged.
  • the computer 150 also synchronizes the imaging period tw2 with the RR period (tw4) of the electrocardiographic waveform T5 detected by the electrocardiograph 173.
  • the computer 150 starts the imaging period tw2 using the R wave, which is the maximum-amplitude peak of the electrocardiographic waveform T5, as a trigger.
  • the averaged photoacoustic signal A_1 is acquired in each imaging period tw2 (T2).
  • a simple averaging, moving average, weighted average or the like can be used.
  • for example, if the sampling period tw1 is 0.1 milliseconds and the imaging frame rate is 60 Hz, the imaging period tw2 is 16.7 milliseconds, and a maximum of about 167 additions can be performed within one period of the imaging frame rate.
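These figures follow directly from the two periods; a quick check (illustrative script):

```python
tw1 = 0.1e-3             # sampling period [s]
imaging_fps = 60         # imaging frame rate [Hz]
tw2 = 1.0 / imaging_fps  # imaging period, about 16.7 ms
max_additions = round(tw2 / tw1)
print(round(tw2 * 1e3, 1), max_additions)  # 16.7 167
```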
  • the above mentioned reconstruction processing is performed based on the averaged photoacoustic signal A_1, and the reconstructed image data R_1 is determined (T3).
  • the image data is sequentially generated in each imaging period.
  • the frame rate converting unit 159 outputs the image data R_S, which is generated in T3, in the period (display period) tw3 corresponding to the display frame rate. Then the display unit 160 displays the image data inputted in the display period tw3.
  • the timing shown in Fig. 4 is merely an example, and in an actual apparatus, the sampling period is about 0.1 to several msec, the imaging period is about 0.4 to 2 sec, and the display frame rate is about 50 to 240 Hz.
  • the blood vessels move approximately periodically because of the heartbeat. Therefore, if the pulsation of the examinee and the imaging period are matched and sampling is performed in the same phase, the photoacoustic signals can be acquired in periods where the blood pressure is similar. Thereby motion blur, due to the contraction and motion of blood vessels caused by the heartbeat, can be minimized, and a reconstructed image having a good S/N ratio can be acquired.
  • the motion blur of arteries in particular can be decreased, and a clear photoacoustic image of arteries can be acquired.
  • a moving average of the plurality of photoacoustic image data acquired in each imaging period is determined and outputted. Thereby even if the object moves between pulsations, a drop in accuracy of the image can be minimized.
  • the delay time DLY and the sampling possible time SW are adjustable, so that the user can search for the appropriate values while checking the reconstructed image.
  • the user may freely set the delay time DLY and the sampling possible time SW, via the input unit 170.
  • Fig. 5 shows examples of the interface screen, which is displayed on the display unit 160 when the parameters related to imaging are acquired from the user via the input unit 170.
  • DLY and the length (time) of SW may be specified.
  • alternatively, a phase within the imaging period, or a relative time with respect to the imaging period, may be specified.
  • DLY is the delay of the start of sampling expressed as a time,
  • PH is the delay of the start of sampling expressed as a phase.
  • the electrocardiographic waveform may be displayed at the same time.
  • Embodiment 2: In Embodiment 1, the moving average of the photoacoustic image data generated in each imaging period is determined, whereby the photoacoustic image data are composed. In Embodiment 2, the moving average of the photoacoustic signals acquired in each imaging period is determined to compose the photoacoustic signals, and the image is reconstructed using the composed photoacoustic signal.
  • Fig. 6 is a diagram depicting the timing according to Embodiment 2.
  • the photoacoustic apparatus according to Embodiment 2 performs the irradiation of the light and the acquisition of the photoacoustic signal a plurality of times, once in each sampling period tw1, similarly to Embodiment 1.
  • the sampling possible time SW is a value determined by multiplying the sampling period tw1 by the number of light emissions (eight in this case).
  • the signal collecting unit 140 averages the photoacoustic signals acquired a plurality of times in the sampling periods tw1 to calculate a photoacoustic signal A_S; in Embodiment 2, however, a plurality of photoacoustic signals, instead of the reconstructed images, are composed.
  • the moving average of the plurality of photoacoustic signals acquired in each imaging period tw2 is determined, whereby the composed photoacoustic signal is generated. For example, if the photoacoustic signal generated in the latest imaging period is A_n, and the moving average over the previous five periods is determined, the composed photoacoustic signal A_S is determined by the following expression.
  • A_S = (1/5) × (A_(n-4) + A_(n-3) + A_(n-2) + A_(n-1) + A_n)
  • the composed photoacoustic signal determined in this way is generated in each imaging period tw2.
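The five-period moving average above can be sketched with a small ring buffer; the class name and the array representation of a per-period signal are illustrative assumptions:

```python
import numpy as np
from collections import deque

class SignalComposer:
    """Moving average of the photoacoustic signals from the most recent
    n_periods imaging periods (n_periods = 5 in the expression above)."""
    def __init__(self, n_periods=5):
        self.buf = deque(maxlen=n_periods)  # oldest signal drops out automatically

    def push(self, signal):
        """Store the latest per-period signal A_n; return the composed A_S."""
        self.buf.append(np.asarray(signal, dtype=float))
        return sum(self.buf) / len(self.buf)
```

Until the buffer is full, the average is taken over however many periods are available, so a composed signal can be produced from the first imaging period onward.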
  • the computing unit 151 performs the reconstruction processing based on the photoacoustic signal generated by the moving average. Thereby the reconstructed image data R_n is sequentially calculated in each imaging period tw2.
  • the frame rate converting processing and the processing to display the image data are the same as in Embodiment 1, hence description thereof is omitted.
  • the timing shown in Fig. 6 is also an example, and in an actual apparatus, the sampling period is about 0.1 to several msec, the imaging period is about 0.4 to 2 sec, and the display frame rate is about 50 to 240 Hz.
  • since the moving average of the plurality of electric signals acquired in each of the plurality of imaging periods is determined, the acquired image can be more natural than an image acquired by composing the individual reconstructed images.
  • Embodiment 3: In Embodiments 1 and 2, the imaging period is synchronized with the periodic motion which the living body performs spontaneously. In Embodiment 3, on the other hand, the photoacoustic apparatus induces periodic motion in a living body (object).
  • Fig. 7 is a block diagram of the photoacoustic apparatus according to Embodiment 3.
  • an applying pad 174 is installed instead of the electrocardiograph 173.
  • the applying pad 174 is a pad having an electrode, and can generate expansion/contraction motions of a muscle by supplying a weak current to the living body (object), as implemented in a low frequency treatment device, for example.
  • the applying pad is a unit that applies a stimulation signal to induce motion to a living body (object).
  • motion of the object synchronized with the imaging period can be induced by supplying a weak signal to the living body via the applying pad in synchronization with the imaging period. Further, in Embodiment 3 as well, the irradiation of the light and the acquisition of the photoacoustic signal are performed at a timing corresponding to the same phase of the imaging period.
  • the method of generating the image data based on the photoacoustic signal is the same as in Embodiments 1 and 2, hence description thereof is omitted.
  • expansion/contraction of a muscle can be caused by applying a stimulation signal which induces motion in a living body (object).
  • blood vessels in a desired state of muscular expansion/contraction can thereby be observed.
  • the applying unit to apply the stimulation signal may be different from an applying pad which applies electric stimulation.
  • mechanical stimulation may be applied to a painful spot using an actuator or the like.
  • a stimulation signal may be applied to other sensory organs.
  • each embodiment is an example to describe the present invention, and the present invention can be carried out by appropriately changing or combining the above embodiments within a scope of not departing from the essence of the invention.
  • the present invention may be carried out as a photoacoustic apparatus that carries out at least a part of the above mentioned processing.
  • the present invention may also be carried out as an object information acquiring method that includes at least a part of the above mentioned processing.
  • the above processing and means may be freely combined within the scope of not generating technical inconsistencies.
  • the heart beat is detected based on electrocardiographic waveforms, but a biological signal other than electrocardiographic waveforms may be used.
  • arterial pressure waveforms, sound waves emitted by the heart (cardiac sound) or the like may be used.
  • the present invention may be applied to a periodic activity of the living body, other than the heartbeat.
  • a simple moving average is used as the moving average, but a weighted moving average, an exponential moving average or the like may be used.
  • a pause period may be set.
  • light beams of a plurality of wavelengths may be generated by the light source unit 200. If a plurality of wavelengths are used, the oxygen saturation can be calculated as functional information. For example, two wavelengths may be alternately switched in each imaging period to acquire the photoacoustic signals, whereby the reconstructed image data may be calculated, and the oxygen saturation may be calculated based on the calculated reconstructed image data.
  • the method of calculating the oxygen saturation is known, hence detailed description thereof is omitted.
  • the above described plurality of embodiments may be installed in one photoacoustic apparatus and used switchably. Moreover, a function to transmit the ultrasonic wave from the transducer and a function to receive the ultrasonic echo reflected by the object and perform measurement based on this ultrasonic echo may be added.
  • Embodiment(s) of the present invention can also be realized by a computer of a system or apparatus that reads out and executes computer executable instructions (e.g., one or more programs) recorded on a storage medium (which may also be referred to more fully as a 'non-transitory computer-readable storage medium') to perform the functions of one or more of the above-described embodiment(s) and/or that includes one or more circuits (e.g., application specific integrated circuit (ASIC)) for performing the functions of one or more of the above-described embodiment(s), and by a method performed by the computer of the system or apparatus by, for example, reading out and executing the computer executable instructions from the storage medium to perform the functions of one or more of the above-described embodiment(s) and/or controlling the one or more circuits to perform the functions of one or more of the above-described embodiment(s).
  • the computer may comprise one or more processors (e.g., central processing unit (CPU), micro processing unit (MPU)) and may include a network of separate computers or separate processors to read out and execute the computer executable instructions.
  • the computer executable instructions may be provided to the computer, for example, from a network or the storage medium.
  • the storage medium may include, for example, one or more of a hard disk, a random-access memory (RAM), a read only memory (ROM), a storage of distributed computing systems, an optical disk (such as a compact disc (CD), digital versatile disc (DVD), or Blu-ray Disc (BD) TM ), a flash memory device, a memory card, and the like.
  • 120: reception unit, 140: signal collecting unit, 151: computing unit, 200: light source unit

Abstract

A photoacoustic apparatus comprises a light source configured to irradiate an object with light; an acoustic wave detection unit configured to receive an acoustic wave generated in the object by the irradiation of the light, and to convert the acoustic wave into an electric signal; a control unit configured to set an imaging period corresponding to an approximately periodic motion of the object, and to execute the irradiation of the light and acquisition of the electric signal at a timing in an approximately same phase in each imaging period; a composition unit configured to compose a plurality of electric signals acquired in each of the imaging periods; and an image generating unit configured to generate image data representing characteristic information of the object, based on the composed electric signal.

Description

PHOTOACOUSTIC APPARATUS AND OBJECT INFORMATION ACQUIRING METHOD
The present invention relates to an apparatus that acquires information on an object by using a photoacoustic effect.
Recently in medical fields, research on imaging structural information and functional information, such as physiological information, inside an object is progressing. As one such technique, photoacoustic tomography (PAT) has been proposed.
If light, such as laser light, is radiated to a living body, which is an object, an acoustic wave (typically an ultrasonic wave) is generated when the light is absorbed by a biological tissue inside the object. This phenomenon is called a “photoacoustic effect”, and the acoustic wave generated by the photoacoustic effect is called a “photoacoustic wave”. The tissues constituting the object have different absorption rates of light energy, hence the generated photoacoustic waves also have different sound pressures. With PAT, a generated photoacoustic wave is received by a probe, and the received signal is mathematically analyzed so as to acquire characteristic information inside the object.
An apparatus using photoacoustic tomography can detect oxyhemoglobin and deoxyhemoglobin in blood, which are light absorbers, whereby the structural information of the blood vessels, and such functional information as the oxygen saturation, can be acquired.
In the case of acquiring information on blood and blood vessels, it is known that the S/N ratio changes depending on the timing of acquiring the signal, since the blood volume changes with pulsation. As an example of a technique to suppress the influence of pulsation, PTL 1 discloses a photoacoustic apparatus that acquires and images a photoacoustic signal at a timing that matches the variation period of the blood volume.
[PTL 1] Japanese Patent Application Publication No. 2016-107069
The apparatus according to PTL 1 generates an image at a frame rate that matches the pulsation. However, human pulsation is relatively slow, 60 to 100 beats per minute, hence if the object position changes during an image generation interval, the generated image is blurred.
With the foregoing in view, it is an object of the present invention to reduce the influence of the motion of the object, and improve the accuracy of the measurement in the photoacoustic apparatus.
A photoacoustic apparatus according to a first aspect of the present invention includes: a light source configured to irradiate an object with light; an acoustic wave detection unit configured to receive an acoustic wave generated in the object by the irradiation of the light, and to convert the acoustic wave into an electric signal; a control unit configured to set an imaging period corresponding to an approximately periodic motion of the object, and to execute the irradiation of the light and acquisition of the electric signal at a timing in an approximately same phase in each imaging period; a composition unit configured to compose a plurality of electric signals acquired in each of the imaging periods; and an image generating unit configured to generate image data representing characteristic information of the object, based on the composed electric signal.
A photoacoustic apparatus according to a second aspect of the present invention includes: a light source configured to irradiate an object with light; an acoustic wave detection unit configured to receive an acoustic wave generated in the object by the irradiation of the light, and to convert the acoustic wave into an electric signal; a control unit configured to set an imaging period corresponding to an approximately periodic motion of the object, and to execute the irradiation of the light and acquisition of the electric signal at a timing in an approximately same phase in each imaging period; an image generating unit configured to generate image data representing characteristic information of the object in each of the imaging periods, based on the acquired electric signal; and a composition unit configured to compose a plurality of image data generated in each of the imaging periods.
An object information acquiring method according to the first aspect of the present invention is an object information acquiring method performed by a photoacoustic apparatus including a light source configured to irradiate an object with light, and an acoustic wave detection unit configured to receive an acoustic wave generated in the object by the irradiation of the light, and to convert the acoustic wave into an electric signal, the object information acquiring method including: a control step of setting an imaging period corresponding to an approximately periodic motion of the object, and executing the irradiation of the light and the acquisition of the electric signal at a timing in an approximately same phase in each imaging period; a composition step of composing a plurality of electric signals acquired in each of the imaging periods; and an image generating step of generating image data representing characteristic information of the object, based on the composed electric signal.
An object information acquiring method according to the second aspect of the present invention is an object information acquiring method performed by a photoacoustic apparatus including a light source configured to irradiate an object with light, and an acoustic wave detection unit configured to receive an acoustic wave generated in the object by the irradiation of the light, and to convert the acoustic wave into an electric signal, the object information acquiring method including: a control step of setting an imaging period corresponding to an approximately periodic motion of the object, and executing the irradiation of the light and the acquisition of the electric signal at a timing in an approximately same phase in each imaging period; an image generating step of generating image data representing characteristic information of the object in each of the imaging periods, based on the acquired electric signal; and a composition step of composing a plurality of image data generated in each of the imaging periods.
According to the present invention, the influence of the motion of an object can be reduced, and accuracy of the measurement can be improved in the photoacoustic apparatus.
Further features of the present invention will become apparent from the following description of the exemplary embodiments with reference to the attached drawings.
Fig. 1 is a system block diagram of a photoacoustic apparatus according to Embodiment 1.
Fig. 2A is a schematic diagram of a probe 180 according to Embodiment 1.
Fig. 2B is a schematic diagram of the probe 180 of Embodiment 1.
Fig. 3 is a hardware block diagram according to Embodiment 1.
Fig. 4 is an operation timing chart of the photoacoustic apparatus according to Embodiment 1.
Fig. 5A is an example of a screen to set a timing for sampling.
Fig. 5B is an example of a screen to set a timing for sampling.
Fig. 5C is an example of a screen to set a timing for sampling.
Fig. 6 is an operation timing chart of a photoacoustic apparatus according to Embodiment 2.
Fig. 7 is a hardware block diagram according to Embodiment 3.
Embodiments of the present invention will be described with reference to the drawings. Dimensions, materials, shapes and relative positions of the components described below can be appropriately changed depending on the configurations and various conditions of the apparatus to which the invention is applied. Therefore, the following description is not intended to limit the scope of the invention.
The present invention relates to a technique to detect an acoustic wave propagated from an object, and generate and acquire the characteristic information inside the object. This means that the present invention is regarded as a photoacoustic apparatus (object information acquiring apparatus), or a control method thereof, or an object information acquiring method. The present invention is also regarded as a program which causes an information processing apparatus, equipped with such hardware resources as a CPU and memory, to execute these methods, or a computer readable non-transitory storage medium storing this program.
The photoacoustic apparatus (object information acquiring apparatus) according to the present invention is an apparatus that utilizes the photoacoustic effect, configured to receive an acoustic wave generated inside the object in a case where light (electromagnetic wave) is radiated to an object, and to acquire the characteristic information of the object as image data. In this case, the characteristic information refers to information on the characteristic values corresponding to a plurality of positions inside the object respectively, and these characteristic values are generated using received signals which are acquired by receiving a photoacoustic wave.
The characteristic information acquired by the photoacoustic measurement refers to the values reflecting the absorption rate of the light energy. For example, the characteristic information includes a generation source of an acoustic wave which was generated by the light irradiation, an initial sound pressure inside the object, a light energy absorption density and an absorption coefficient derived from the initial sound pressure, and a concentration of a substance constituting a tissue.
Based on the photoacoustic waves that are generated by lights having a plurality of different wavelengths, spectral information, such as the concentration of a substance constituting the object, can be acquired. The spectral information may be an oxygen saturation, a value generated by weighting the oxygen saturation using intensity (e.g. absorption coefficient), a total hemoglobin concentration, an oxyhemoglobin concentration, or a deoxyhemoglobin concentration. The spectral information may also be a glucose concentration, a collagen concentration, a melanin concentration, or a volume percentage of fat or water.
Based on the characteristic information at each position inside the object, a two-dimensional or three-dimensional characteristic information distribution is acquired. The distribution data can be generated as image data. The characteristic information may be determined, not as numeric data, but as the distribution information at each position inside the object. In other words, such distribution information as the initial sound pressure distribution, the energy absorption density distribution, the absorption coefficient distribution, and the oxygen saturation distribution may be determined.
The “acoustic wave” in the present description is typically an ultrasonic wave, including an elastic wave called a “sound wave” or a “photoacoustic wave”. An electric signal, which was converted from an acoustic wave by a probe or the like, is called an “acoustic signal”. Such phrases as “ultrasonic wave” or “acoustic wave” in this description, however, are not intended to limit the wavelengths of these elastic waves. An acoustic wave generated due to the photoacoustic effect is called a “photoacoustic wave” or a “light-induced ultrasonic wave”. An electric signal, which originates from a photoacoustic wave, is called a “photoacoustic signal”. In this description, the photoacoustic signal includes both an analog signal and a digital signal. The distribution data is also called “photoacoustic image data” or “reconstructed image data”.
(Embodiment 1)
<System configuration>
A photoacoustic apparatus according to Embodiment 1 is an apparatus configured to irradiate an object with pulsed light, and generate an image of blood vessels (structural image) inside the object by receiving a photoacoustic wave generated inside the object. In the following description of the embodiments, a photoacoustic apparatus having a hand held probe is used, but the present invention can also be applied to a photoacoustic apparatus which has a probe on a mechanical stage so as to scan the object mechanically.
A configuration of the photoacoustic apparatus according to Embodiment 1 will be described with reference to Fig. 1. The photoacoustic apparatus according to Embodiment 1 includes a probe 180, a signal collecting unit 140, a computer 150, a display unit 160 and an input unit 170. The probe 180 includes a light source unit 200, an optical system 112, a light irradiating unit 113, and a reception unit 120. The computer 150 includes a computing unit 151, a storage unit 152, a control unit 153 and a frame rate converting unit 159.
An overview of the photoacoustic measurement of an object will be described.
First the light source unit 200 periodically supplies a pulsed light to the light irradiating unit 113 via the optical system 112 constituted by optical fibers (bundle fibers) and the like. The light irradiating unit 113 irradiates the object 100 with the supplied light. Thereby an acoustic wave is periodically generated from the object 100.
The reception unit 120 receives a photoacoustic wave generated from the object 100, and outputs an analog electric signal. Then the signal collecting unit 140 converts the analog signal, which was outputted from the reception unit 120, into a digital signal, and outputs the digital signal to the computer 150.
Hereafter the period of irradiating the object with the pulsed light and acquiring the electric signal is called a “sampling period”.
The computer 150 performs processing to compose the digital signals outputted from the signal collecting unit 140 in each sampling period, and stores the composed signal in the storage unit 152. The composition is not limited to simple addition; it includes weighted addition, averaging, moving averages and the like. In the following, averaging will be primarily described, but a composing method other than averaging may be used.
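The composition step described above can be sketched as follows. This is a minimal illustration only, assuming the per-shot signals are held as NumPy arrays; the function name, array shapes and weight handling are hypothetical and not part of the apparatus.

```python
import numpy as np

def compose_signals(shots, weights=None):
    """Compose per-shot photoacoustic signals into one composed signal.

    shots: array of shape (n_shots, n_channels, n_samples), one row per
    sampling period. With weights=None this is plain averaging; passing a
    weight vector yields the weighted addition mentioned in the text.
    """
    shots = np.asarray(shots, dtype=float)
    if weights is None:
        return shots.mean(axis=0)          # simple averaging
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()                        # normalize the weights
    return np.tensordot(w, shots, axes=1)  # weighted addition over shots

# Averaging 8 shots of uncorrelated noise improves the S/N ratio
# by roughly sqrt(8), which is the motivation given in the text.
rng = np.random.default_rng(0)
noisy = rng.normal(size=(8, 4, 128))       # 8 shots, 4 channels, 128 samples
avg = compose_signals(noisy)
```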
Further, the computer 150 performs such processing as image reconstruction for the digital signals stored in the storage unit 152, so as to generate the photoacoustic image data in each period corresponding to the imaging frame rate (hereafter “imaging period”). Concrete processing thereof will be described later.
The computer 150 outputs the generated photoacoustic image data to the frame rate converting unit 159 in each imaging period. The frame rate converting unit 159 converts the photoacoustic image data, which is inputted in each imaging period, into a refresh rate corresponding to the display unit 160 (hereafter “display period”). Details on the method will be described later.
Then the display unit 160 refreshes and displays the photoacoustic image data in each display period.
The user (e.g. physician, technician) of the apparatus can confirm the photoacoustic image displayed on the display unit 160 so as to perform diagnosis. The display image may be stored in the memory of the computer 150, or a data management system connected with the photoacoustic apparatus via a network, based on the storing instruction from the user or the computer 150. The user of the apparatus can input data to the apparatus via the input unit 170.
Details on each composing element will be described next.
<<Probe 180>>
Fig. 2A is a schematic diagram of the probe 180 according to the present embodiment. The probe 180 includes a light source unit 200, the optical system 112, the light irradiating unit 113, the reception unit 120, and a housing 181.
The housing 181 is a case which houses the light source unit 200, the optical system 112, the light irradiating unit 113 and the reception unit 120. By holding the housing 181, the user can use the probe 180 as a handheld probe.
The light irradiating unit 113 is a unit configured to irradiate the object with the pulsed light, which is propagated by the optical system 112. The XYZ axes in Fig. 2A indicate the coordinate axes when the probe is still, and these axes are not intended to limit the direction when the probe is used.
The probe 180 in Fig. 2A is connected to the signal collecting unit 140 via a cable 182. The cable 182 includes a wire to supply power to the light source unit 200, a wire to transmit a light emission control signal, and a wire to output an analog signal, which is outputted from the reception unit 120, to the signal collecting unit 140 (these wires are not illustrated). A connector may be disposed in the cable 182, so as to detachably attach the probe 180 to the other composing elements of the photoacoustic apparatus.
Further, as illustrated in Fig. 2B, a semiconductor laser, a light emitting diode or the like may be used as the light source unit 200, so that the pulsed light is directly radiated to the object without using the optical system 112. In this case, a light emitting end portion of the semiconductor laser, LED or the like (front end of housing) becomes the light irradiating unit 113.
<<Light source unit 200>>
The light source unit 200 is a unit configured to generate a light which is radiated to an object 100.
The light source is preferably a laser light source in order to acquire high power, but a light emitting diode, flash lamp or the like may be used instead of a laser. In the case of using a laser as the light source, various lasers, such as a solid-state laser, a gas laser, a dye laser and a semiconductor laser, can be used. The irradiation timing, waveform, intensity and the like are controlled by a light source control unit (not illustrated). This light source control unit may be integrated with the light source.
To acquire a substance concentration, such as an oxygen saturation, it is preferable to use a light source which can output a plurality of wavelengths. To install the light source unit 200 in the housing 181, it is preferable to use a semiconductor light emitting element, such as a semiconductor laser and a light emitting diode, as shown in Fig. 2B. Further, to output a plurality of wavelengths, the wavelengths may be switched using a plurality of types of semiconductor laser and light emitting diodes which generate lights having different wavelengths.
To effectively generate the photoacoustic wave, the light must be radiated in a sufficiently short time in accordance with the thermal characteristics of the object. When the object is a living body, the pulse width of the pulsed light generated from the light source is preferably 10 nanoseconds to 1 microsecond. The wavelength of the pulsed light is preferably a wavelength at which the light propagates into the object. In concrete terms, when the object is a living body, the wavelength of the pulsed light is preferably at least 400 nm and not more than 1600 nm. Needless to say, the wavelength may be determined in accordance with the light absorption characteristic of a light absorber to be imaged.
In the case of imaging blood vessels at high resolution, a wavelength at which the light is well absorbed by the blood vessels (at least 400 nm, not more than 800 nm) may be used. In the case of imaging a deep region in a living body, a light having a wavelength at which the light is not absorbed very much by the background tissue (e.g. water, fat) of the living body (at least 700 nm, not more than 1100 nm) may be used.
In the present embodiment, the semiconductor light emitting element is used as the light source, hence a large quantity of light is not radiated to the object. In other words, the photoacoustic signal that can be acquired by one irradiation may not easily reach a desired S/N ratio. Therefore, light is emitted from the light source in each sampling period, and the acquired photoacoustic signals are averaged, so as to improve the S/N ratio.
An example of a preferable wavelength of the light source unit 200 that is used for the present embodiment is 797 nm. This wavelength is a wavelength at which the light reaches the deep region inside the object, and at which absorption coefficients of the oxyhemoglobin and deoxyhemoglobin are approximately the same, therefore this wavelength is suitable for detecting the blood vessel structure. If 756 nm is additionally used as the second wavelength, the oxygen saturation can be determined, using the absorption coefficient difference between the oxyhemoglobin and deoxyhemoglobin.
<Light irradiating unit 113>
The light irradiating unit 113 is a portion to emit the light which is radiated to the object (emitting end). If bundle fiber is used as the optical system 112, the terminal portion is the light irradiating unit 113. In the case when a part of the living body (e.g. breast) is the object, a diffuser or the like, to diffuse light, may be disposed in the light irradiating unit 113. Thereby the beam diameter of the pulsed light is expanded, and the pulsed light can be radiated to the object in this state.
Further, in the case of using a plurality of semiconductor light emitting elements as the light source unit 200, as illustrated in Fig. 2B, the light irradiating unit 113 is constituted by aligning the light emitting end portion of each element (housing front end), whereby the object can be irradiated over a wide range.
<Reception unit 120>
The reception unit 120 is a unit including: a transducer (acoustic wave detection element) which receives a photoacoustic wave generated by a pulsed light, and outputs an electric signal; and a support member that supports the transducer.
For the transducer, a piezoelectric material, a capacitive type transducer (CMUT), a transducer using a Fabry-Perot interferometer and the like may be used, for example. Examples of piezoelectric material are a piezoelectric ceramic material, such as PZT (lead zirconate titanate), or a high polymer piezoelectric film material, such as PVDF (polyvinylidene fluoride).
The electric signal acquired by the transducer is a time-resolved signal. In other words, the amplitude of the acquired electric signal is a value based on the sound pressure which the transducer received at each timing (e.g. value in proportion to sound pressure).
It is preferable that the transducer can detect the frequency components constituting the photoacoustic wave (typically 100 kHz to 10 MHz). A plurality of transducers may be arranged on the support member so as to form a plane or a curved plane, such as a 1D array, a 1.5D array, a 1.75D array or a 2D array.
The reception unit 120 may include an amplifier configured to amplify time-series analog signals which are outputted from the transducer. Further, the reception unit 120 may include an A/D convertor to convert the time-series analog signals which are outputted from the transducer into time-series digital signals. In other words, the reception unit 120 may also play a part of the signal collecting unit 140.
In the present embodiment, the handheld probe was described as an example, but in order to improve image accuracy, it is preferable to use transducers which completely surround the object 100, so that the acoustic wave can be detected at various angles. If the object 100 is too large to be completely surrounded by transducers, transducers may be arranged on a hemispherical support member. In the case when the probe includes the reception unit having such a shape, the probe may be relatively moved mechanically with respect to the object 100. To move the probe, such a mechanism as an XY stage may be used. The arrangement and number of transducers, and the shape of the support member, are not limited to the above description, but may be optimized in accordance with the object 100.
A medium to propagate the photoacoustic wave (acoustic matching material) is preferably disposed in a space between the reception unit 120 and the object 100. Thereby the acoustic impedance of the object 100 and that of the transducer at the interface therebetween can be matched. The acoustic matching material can be water, oil or ultrasonic gel, for example.
The photoacoustic apparatus according to the present embodiment may include a holding member, which holds the object 100 and stabilizes the shape of the object 100. The holding member preferably has both high light transmissivity and high acoustic wave transmissivity. Polymethyl pentene, polyethylene terephthalate, acrylic or the like, for example, can be used.
If the apparatus according to the present embodiment has a function to generate not only a photoacoustic image, but also an ultrasonic image by transmitting/receiving an ultrasonic wave, the transducer may have the function of a transmission unit which transmits an acoustic wave. The transducer as the reception unit and the transducer as the transmission unit may be the same unit, or be separate units.
<<Signal collecting unit 140>>
The signal collecting unit 140 includes an amplifier that amplifies an analog electric signal outputted from the reception unit 120, and an A/D convertor that converts an analog signal, which is outputted from the amplifier, into a digital signal. The signal collecting unit 140 may be composed of an FPGA (Field Programmable Gate Array) chip and the like.
Analog signals outputted by a plurality of transducers, which are arranged in an array in the reception unit 120, are amplified by a plurality of amplifiers corresponding to each of the transducers, and are converted into digital signals by a plurality of A/D convertors corresponding to each of the amplifiers respectively. The A/D conversion rate is preferably at least double the bandwidth of the input signal. As mentioned above, if the frequency components constituting the photoacoustic wave are 100 kHz to 10 MHz, the A/D conversion rate is at least 20 MHz, preferably at least 40 MHz.
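The rate requirement above is simply the Nyquist sampling criterion. A small sanity check, using the values from the text (the function and the margin parameter are illustrative, not part of the apparatus):

```python
def min_adc_rate(signal_band_hz, margin=2.0):
    """Minimum A/D conversion rate for a given signal band.

    Nyquist: sample at least twice the highest signal frequency;
    a larger margin gives the preferred rate mentioned in the text.
    """
    return margin * signal_band_hz

band = 10e6  # upper edge of the photoacoustic band, 10 MHz

assert min_adc_rate(band) == 20e6             # minimum: 20 MHz
assert min_adc_rate(band, margin=4.0) == 40e6  # preferred: 40 MHz
```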
The signal collecting unit 140 synchronizes the timing of the light irradiation and the timing of the signal collecting processing by using the light emission control signal. In other words, based on the light emission timing in each sampling period, the signal collecting unit 140 starts A/D conversion at the rate mentioned above, and converts the analog signals into digital signals. As a result, a digital signal string is acquired for each transducer at an interval equal to the reciprocal of the A/D conversion rate (the period of the A/D conversion clock). The signal collecting unit 140 is also called the “Data Acquisition System (DAS)”.
The signal collecting unit 140 may be disposed inside the housing 181. If this configuration is used, information between the probe 180 and the computer 150 can be propagated by digital signals, hence noise resistance improves. Further, compared with the case of transmission by analog signals, the number of wires can be decreased and the operability of the probe 180 improves in the case of transmission by digital signals. The later mentioned averaging may be performed by the signal collecting unit 140. In this case, it is preferable to use such hardware as an FPGA to perform the averaging.
<<Computer 150>>
The computer 150 is an arithmetic unit (image generating unit and display control unit in the present invention), that includes a computing unit 151, a storage unit 152, a control unit 153 and a frame rate converting unit 159. The computing unit 151 which performs the computing function is constituted by such processors as a CPU and GPU (Graphics Processing Unit), and by such arithmetic circuits as an FPGA (Field Programmable Gate Array) chip. These units may be a single processor or a single arithmetic circuit, or may be constituted of a plurality of processors and arithmetic circuits.
The computer 150 performs the following processing operations for each of the plurality of transducers.
First the computer 150 composes a plurality of digital signals (photoacoustic signals) which are outputted from the signal collecting unit 140 in each sampling period. The composed photoacoustic signal is stored in the storage unit 152 in each imaging period.
Then the computing unit 151 reconstructs the image in each imaging period, based on the composed photoacoustic signal that is stored in the storage unit 152, so as to generate the photoacoustic image (structural image, functional image), and executes other arithmetic processing operations. The imaging period is set at timings synchronized with the heartbeat of the examinee, using the electrocardiograph 173 described later. The method of setting the imaging period will be described later.
As for the reconstruction algorithm used when the computing unit 151 converts the photoacoustic signal into the photoacoustic image (e.g. three-dimensional volume data), any method, such as a back projection method in the time domain, a back projection method in the Fourier domain, or a model based method (iterative reconstruction method), may be used. Back projection methods in the time domain include, for example, universal back projection (UBP), filtered back projection (FBP) and phasing addition (delay-and-sum).
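As one concrete instance of the time-domain back projection named above, a bare-bones delay-and-sum (phasing addition) reconstruction might look like the following. This is a sketch under stated assumptions: element positions, voxel grid, sampling rate and sound velocity are hypothetical inputs, and apodization and interpolation are omitted.

```python
import numpy as np

def delay_and_sum(signals, elem_pos, grid_pos, fs, c=1500.0):
    """Delay-and-sum reconstruction of a photoacoustic image.

    signals:  (n_elem, n_samples) signals, t=0 at the light emission
    elem_pos: (n_elem, 3) transducer element positions [m]
    grid_pos: (n_vox, 3) reconstruction voxel positions [m]
    fs:       A/D conversion rate [Hz]; c: sound velocity [m/s]
    """
    n_elem, n_samples = signals.shape
    image = np.zeros(len(grid_pos))
    for e in range(n_elem):
        # time of flight from every voxel to this element, as a sample index
        dist = np.linalg.norm(grid_pos - elem_pos[e], axis=1)
        idx = np.round(dist / c * fs).astype(int)
        valid = idx < n_samples
        # sum each element's delayed sample into the corresponding voxel
        image[valid] += signals[e, idx[valid]]
    return image / n_elem
```

A point absorber shows up where the per-element delays line up, which is the essence of phasing addition.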
When the light source unit 200 generates lights having two different wavelengths, the computing unit 151 generates a first initial sound pressure distribution from the photoacoustic signal originating from the light having a first wavelength, and generates a second initial sound pressure distribution from the photoacoustic signal originating from the light having a second wavelength. Further, the computing unit 151 acquires a first absorption coefficient distribution by correcting the first initial sound pressure distribution using the light quantity distribution of the light having the first wavelength, and acquires a second absorption coefficient distribution by correcting the second initial sound pressure distribution using the light quantity distribution of the light having the second wavelength. Furthermore, an oxygen saturation distribution is acquired based on the first and second absorption coefficient distributions. The content and sequence of the computing are not limited to this, as long as the oxygen saturation distribution can finally be acquired.
The storage unit 152 is such a volatile memory as RAM (Random Access Memory), or such non-transitory storage media as ROM (Read Only Memory), magnetic disk and flash memory. A storage medium in which a program is stored is a non-transitory storage medium. The storage unit 152 may be constituted of a plurality of storage media.
The storage unit 152 can store various data, such as a photoacoustic signal averaged in each imaging period, photoacoustic image data generated by the computing unit 151, and reconstructed image data based on the photoacoustic image data.
The control unit 153 is a unit to control the operation of each composing element of the photoacoustic apparatus, and is constituted by arithmetic elements, such as a CPU. The control unit 153 may control each composing element of the photoacoustic apparatus based on an instruction signal (e.g. measurement start signal) which is inputted via the input unit 170.
Further, the control unit 153 reads program codes stored in the storage unit 152, and controls the operation of each composing element of the photoacoustic apparatus.
The control unit 153 can also adjust a generated image.
The frame rate converting unit 159 converts a photoacoustic image generated at a predetermined frame rate corresponding to the imaging period (hereinafter referred to as an imaging frame rate) into a photoacoustic image generated at a predetermined frame rate corresponding to the display period (hereinafter referred to as a display frame rate), and outputs the photoacoustic image to the display unit 160.
In the example in Fig. 1, the frame rate converting unit 159 is independent, but the frame rate converting unit 159 need not always be an independent unit. For example, a photoacoustic image may be stored in the storage unit 152 at each imaging frame rate, and the stored photoacoustic image may be read in accordance with the display frame rate.
The display frame rate is preferably a frame rate corresponding to a general purpose display (e.g. 50 Hz, 60 Hz, 72 Hz, 120 Hz). By making the imaging period and the display period independent from each other, a frame rate appropriate for measurement and a frame rate appropriate for displaying the image can be set independently. In other words, a frame rate appropriate for measurement can be freely set, regardless of the frame rate appropriate for displaying the image. It is also possible to freely change only the imaging period in accordance with an instruction of the user.
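One simple way to decouple the two rates, in the spirit of the description above, is nearest-frame selection: at each display refresh the most recent imaging frame is shown, so frames are repeated or dropped as needed. This is an illustrative sketch only; the function name and the example rates are assumptions, not the apparatus's actual conversion method.

```python
def frame_indices(imaging_fps, display_fps, duration_s):
    """For each display refresh, pick the latest available imaging frame.

    Returns one imaging-frame index per display tick; indices repeat when
    display_fps > imaging_fps, and frames are skipped in the opposite case.
    """
    n_disp = int(duration_s * display_fps)
    out = []
    for k in range(n_disp):
        t = k / display_fps                # time of this display refresh
        out.append(int(t * imaging_fps))   # latest frame at or before t
    return out

# A 15 Hz imaging stream shown on a 60 Hz display repeats each frame 4 times.
idx = frame_indices(15, 60, 1.0)
```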
The display unit 160 is a unit to display a photoacoustic image. The display unit 160 refreshes the actual screen, synchronizing with the display frame rate. The display frame rate and the rate of refreshing the actual screen (refresh rate) may be the same.
Some recent liquid crystal displays have a function to support inputs at a plurality of frame rates (frame frequencies). And some such liquid crystal displays have a function to convert the inputted frame rate into the rate at which the actual screen is refreshed (refresh rate). If the display unit 160 has such a function, the display unit 160 includes a unit to convert the display frame rate into the actual refresh rate.
If the display unit 160 includes such a unit, the computer 150 need not have the frame rate converting unit 159 shown in Fig. 1. If the display unit 160 includes the frame rate converting function, the configuration of the computer 150 can be simplified.
The display unit 160 may display information relating to the imaging period and the display period together with the photoacoustic image.
The computer 150 may be a custom designed workstation, or may be a general purpose PC or workstation. The computer 150 may be operated in accordance with the instructions of the program stored in the storage unit 152. Each configuration of the computer 150 may be implemented by different hardware respectively. At least a part of the configuration of the computer 150 may be implemented by a single hardware component.
Fig. 3 is a concrete configuration example of the computer 150 according to the present embodiment. The computer 150 according to the present embodiment includes a CPU 154, a GPU 155, a RAM 156, a ROM 157, an external storage device 158 and a frame rate converting unit 159. A liquid crystal display 161 serving as the display unit 160, a mouse 171 and a keyboard 172 serving as an input unit 170 are connected to the computer 150.
The computer 150 and the reception unit 120 may be housed in a common case. A computer housed in a case may perform part of the signal processing operations, and a computer installed outside the case may perform the remainder of the signal processing operations. In this case, computers inside and outside the case in total are regarded as the “computer” according to the present embodiment. In other words, hardware that constitutes the computer may be distributed. Further, for the computer 150, an information processing apparatus installed in a remote area, provided by a cloud computing service or the like, may be used.
The computer 150 may perform image processing for the acquired photoacoustic image, or perform processing to compose GUI graphics or the like when necessary. These processing operations may be performed before or after the frame rate conversion.
<<Display unit 160>>
The display unit 160 is a display device, such as a liquid crystal display and an organic EL. An image generated by the computer 150 and the numeric values at specific positions and the like are displayed by the display unit 160. In the display unit 160, the image is inputted at a frame rate corresponding to the display period (e.g. 50 Hz, 60 Hz, 72 Hz, 120 Hz), as mentioned above. The display unit 160 may display the image at the inputted frame rate, or may convert the frame rate. The display unit 160 may also display a GUI to operate the images and the apparatus on screen.
<<Input unit 170>>
The input unit 170 is a unit to acquire input of an instruction and numeric value from the user. The input unit 170 may be an operation console, for example, constituted by a mouse, keyboard and special knobs that the user can operate. If a touch panel is used for the display unit 160, the display unit 160 may also be the input unit 170. Using the input unit 170, the user can specify the start or end of measurement, specify the imaging period (imaging frame rate), and perform such an operation as an image saving instruction. The input unit 170 may acquire various parameters on the sound velocity inside the object, configuration of the holding unit and the like, so that the computer 150 performs processing using this information.
<<Electrocardiograph 173>>
The electrocardiograph 173 is a unit to detect an electric signal that is associated with the heart of the patient (examinee). The signal detected by the electrocardiograph 173 is outputted to the computer 150, whereby the computer 150 can detect the pulsation of the examinee. In the present embodiment, the computer 150 radiates the pulsed light and acquires the photoacoustic signal synchronizing with the output of the electrocardiograph (electrocardiographic waveform), and calculates the reconstructed image data. Details on this processing will be described later.
Each composing element of the photoacoustic apparatus described above may be configured as an independent apparatus respectively, or may be integrated into one configuration. Further, at least a part of the configuration of the photoacoustic apparatus may be integrated, and the remainder may be configured as independent apparatuses.
<<Object 100>>
An object 100 is not a part of the photoacoustic apparatus of the present invention, but will be described here. The photoacoustic apparatus according to the present embodiment is used to perform diagnosis of malignant cancers, vascular diseases and the like of humans and animals, and to perform follow up observation of chemotherapy. This means that a possible object 100 is the diagnostic target portion of a living body, such as a breast, an organ, blood vessels, the head, neck, abdomen, a limb including a finger and a toe of humans and animals. For example, if the measurement target is a human body, then oxyhemoglobin, deoxyhemoglobin, blood vessels which contain a large quantity of oxy- or deoxyhemoglobin, or new blood vessels formed near a malignant tumor, may be a target light absorber. Further, plaque or a carotid artery wall may be a target light absorber. Furthermore, such dyes as methylene blue (MB) and indocyanine green (ICG), gold particles or a substance generated by integrating or chemically modifying these substances introduced from the outside, may be a target light absorber. Furthermore, a needle or a light absorber attached to a needle may be observed. The object may be a phantom or inorganic matter, such as a testing object.
<Processing details>
Details on the processing will be described next with reference to Fig. 4, which is a timing chart to describe the operation of the photoacoustic apparatus according to Embodiment 1. The abscissa of each timing chart is a time axis.
As indicated by the reference sign T1 in Fig. 4, in the photoacoustic apparatus according to the present embodiment, the light source unit 200 emits light in each sampling period (tw1), and the photoacoustic signal generated by each light emission is acquired. Sampling is performed in the period from the timing when the delay time DLY has elapsed after the R wave is detected in the electrocardiographic waveform until the sampling possible time SW elapses. Thereby sampling can be performed at a timing in approximately the same phase in each period from the detection of one R wave to the detection of the next R wave.
The sampling possible time SW is set sufficiently shorter than the period of the motion of the object living body (contraction of blood vessels caused by the heartbeat). Thereby a photoacoustic signal that is not influenced by the motion of the object can be acquired.
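The gating described above can be sketched as a small scheduler. This is a hypothetical illustration, not part of the disclosure: the function name, its signature, and the numeric values are assumptions; DLY, SW and tw1 follow the notation of Fig. 4.

```python
def sampling_triggers(r_wave_time, dly, sw, tw1):
    """Return light-emission times for one sampling possible time SW.

    Emission starts DLY seconds after the detected R wave and repeats
    every tw1 seconds until SW has elapsed.
    """
    start = r_wave_time + dly
    n = int(round(sw / tw1))          # number of pulses inside SW
    return [start + i * tw1 for i in range(n)]

# Example: tw1 = 0.1 ms, 8 emissions per window (SW = 8 * tw1 = 0.8 ms),
# with an assumed delay DLY of 50 ms after the R wave
triggers = sampling_triggers(r_wave_time=0.0, dly=0.05, sw=8e-4, tw1=1e-4)
```

With these values the scheduler emits 8 pulses, all within a window far shorter than one cardiac cycle, which is how sampling stays in approximately the same phase of each heartbeat.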
The length of the sampling period tw1 is preferably set considering the maximum permissible exposure (MPE) for skin. For example, if the measurement wavelength is 750 nm, the pulse width of the pulsed light is 1 microsecond and the sampling period tw1 is 0.1 milliseconds, then the MPE value for skin is about 14 J/m2. On the other hand, if the peak power of the pulsed light radiated from the light irradiating unit 113 is 2 kW and the irradiation area irradiated by the light irradiating unit 113 is 150 mm2, then the light energy radiated to the object 100 is about 13.3 J/m2. In this case, the light energy radiated from the light irradiating unit 113 does not exceed the MPE value.
In this way, the light energy reliably does not exceed the MPE value if the sampling period tw1 is at least 0.1 milliseconds. As described above, the light energy radiated to the object can be calculated from the sampling period tw1, the peak power of the pulsed light and the irradiation area. Here it is assumed that the time-series photoacoustic signals are acquired 8 times (once in each sampling period tw1) within the sampling possible time SW, and then averaged.
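The arithmetic behind the figures above can be checked as follows. This is a back-of-envelope sketch: the MPE value of 14 J/m2 is taken from the text as given, not re-derived from a laser-safety standard, and the function name is an illustrative assumption.

```python
def fluence_per_pulse(peak_power_w, pulse_width_s, area_m2):
    """Pulse energy divided by the irradiated area, in J/m^2."""
    return peak_power_w * pulse_width_s / area_m2

MPE_SKIN = 14.0                            # J/m^2, as stated in the text
f = fluence_per_pulse(peak_power_w=2e3,    # 2 kW peak power
                      pulse_width_s=1e-6,  # 1 microsecond pulse
                      area_m2=150e-6)      # 150 mm^2 irradiation area
# f is about 13.3 J/m^2, i.e. below the quoted MPE value
```

The check confirms the text: 2 kW x 1 us = 2 mJ per pulse, and 2 mJ over 150 mm2 is about 13.3 J/m2, under the quoted 14 J/m2 limit.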
The computer 150 also synchronizes the imaging period tw2 with the RR period (tw4) of the electrocardiographic waveform T5 detected by the electrocardiograph 173. In concrete terms, the computer 150 starts the imaging period tw2 using the R wave, which has the maximum amplitude in the electrocardiographic waveform T5, as a trigger.
The averaged photoacoustic signal A1 is acquired in each imaging period tw2 (T2). For the averaging, a simple average, a moving average, a weighted average or the like can be used. For example, if the sampling period tw1 is 0.1 milliseconds and the imaging frame rate is 60 Hz, then tw2 is 16.7 milliseconds and a maximum of 167 additions can be performed within one period of the imaging frame rate.
Then the above mentioned reconstruction processing is performed based on the averaged photoacoustic signal A1, and the reconstructed image data R1 is determined (T3). The image data is sequentially generated in each imaging period.
In the present embodiment, the composed image data is generated by acquiring the moving average of a plurality of image data determined in each imaging period tw2. For example, if the image data generated in the latest imaging period is Rn, and the moving average of the past 5 periods is acquired, then the composed image data Rs can be determined by the following expression.
Rs = (1/5) * (Rn-4 + Rn-3 + Rn-2 + Rn-1 + Rn)
In the present embodiment, the composed image data determined like this is generated in each imaging period tw2.
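The 5-period moving-average composition above can be sketched as follows. This is a minimal illustration, not the disclosed implementation: image frames are represented as flat lists of pixel values, and the class and method names are assumptions.

```python
from collections import deque

class MovingAverageCompositor:
    def __init__(self, periods=5):
        # keeps only the latest N frames (Rn-4 ... Rn)
        self.frames = deque(maxlen=periods)

    def compose(self, frame):
        """Add the latest frame Rn and return the composed frame Rs."""
        self.frames.append(frame)
        k = len(self.frames)            # fewer than N until warmed up
        # elementwise average over the buffered frames
        return [sum(px) / k for px in zip(*self.frames)]

comp = MovingAverageCompositor(periods=5)
for v in (1.0, 2.0, 3.0, 4.0, 5.0):
    rs = comp.compose([v, v])           # a 2-pixel "image" per period
# rs is now [(1+2+3+4+5)/5, ...] = [3.0, 3.0]
```

The `deque(maxlen=5)` automatically discards the oldest frame each period, so one `compose()` call per imaging period tw2 yields the composed data Rs of the expression above.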
The frame rate converting unit 159 outputs the image data RS, which is generated in T3, in the period (display period) tw3 corresponding to the display frame rate. Then the display unit 160 displays the image data inputted in the display period tw3.
The timing shown in Fig. 4 is merely an example, and in an actual apparatus, the sampling period is about 0.1 to several msec, the imaging period is about 0.4 to 2 sec, and the display frame rate is about 50 to 240 Hz.
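Because the imaging period (about 0.4 to 2 s) is much longer than the display period (50 to 240 Hz), the frame rate converting unit essentially repeats the most recent composed frame at each display tick. The sketch below is an assumption about one simple way to do this; the function name and edge handling are illustrative, not part of the disclosure.

```python
def frame_index_for_tick(tick, display_rate_hz, imaging_period_s):
    """Index of the composed frame available at a given display tick."""
    t = tick / display_rate_hz          # wall-clock time of the tick
    return int(t // imaging_period_s)   # latest completed imaging period

# Display at 60 Hz with imaging period tw2 = 0.5 s: each composed
# frame is repeated for about 30 display ticks.
idx_first = frame_index_for_tick(0, 60, 0.5)
idx_last_repeat = frame_index_for_tick(29, 60, 0.5)
idx_next = frame_index_for_tick(30, 60, 0.5)
```

Here ticks 0 through 29 all map to frame 0 and tick 30 advances to frame 1, which is the repeat-latest-frame behavior the converting unit needs.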
The blood vessels move approximately periodically because of the heartbeat. Therefore, if the pulsation of the examinee and the imaging period are matched and sampling is performed in the same phase, the photoacoustic signals can be acquired in periods where the blood pressure is similar. Thereby motion blur, due to the contraction and motion of blood vessels caused by the heartbeat, can be minimized, and a reconstructed image having a good S/N ratio can be acquired. The motion blur of arteries in particular can be decreased, and a clear photoacoustic image of arteries can be acquired.
Further, in the present embodiment, a moving average of the plurality of photoacoustic image data acquired in each imaging period is determined and outputted. Thereby even if the object moves between pulsations, a drop in accuracy of the image can be minimized.
Depending on the target measurement portion, the timing and magnitude of the blood pressure change with respect to the electrocardiographic waveform differ, because the distance from the heart differs. Therefore, it is preferable that the delay time DLY and the sampling possible time SW are adjustable, so that the user can search for the appropriate values while checking the reconstructed image. For example, the user may freely set the delay time DLY and the sampling possible time SW via the input unit 170.
Fig. 5 shows examples of the interface screen, which is displayed on the display unit 160 when the parameters related to imaging are acquired from the user via the input unit 170.
For example, as shown in Fig. 5A, DLY and the length (time) of SW may be specified. Further, as shown in Fig. 5B, the start of sampling may be specified as a phase in the imaging period or as a relative time with respect to the imaging period: DLY is the delay of the start of sampling expressed as a time, and PH is the delay of the start of sampling expressed as a phase. Furthermore, as shown in Fig. 5C, the electrocardiographic waveform may be displayed at the same time.
(Embodiment 2)
In Embodiment 1, the moving average of the photoacoustic image data generated in each imaging period is determined, whereby each photoacoustic image data is composed. In Embodiment 2, the moving average of the photoacoustic signals acquired in each imaging period is determined to compose the photoacoustic signals, and the image is reconstructed using the composed photoacoustic signal.
Fig. 6 is a diagram depicting the timing according to Embodiment 2. The photoacoustic apparatus according to Embodiment 2 performs the irradiation of the light and the acquisition of the photoacoustic signal a plurality of times (once in each sampling period tw1), similarly to Embodiment 1. The sampling possible time SW is a value determined by multiplying the sampling period tw1 by the number of emissions (8 in this case).
In Embodiment 2, similarly to Embodiment 1, the signal collecting unit 140 averages the photoacoustic signals acquired a plurality of times within the sampling possible time SW to calculate a photoacoustic signal An; in Embodiment 2, however, a plurality of photoacoustic signals are composed instead of the reconstructed images.
In concrete terms, the moving average of a plurality of photoacoustic signals acquired in each imaging period tw2 is determined, whereby the composed photoacoustic signal is generated. For example, if the photoacoustic signal generated in the latest imaging period is An, and the moving average of the previous 5 periods is determined, the composed photoacoustic signal AS is determined by the following expression.
As = (1/5) * (An-4 + An-3 + An-2 + An-1 + An)
In the present embodiment, the composed photoacoustic signal determined like this is generated in each imaging period tw2.
The computing unit 151 performs reconstruction processing based on the photoacoustic signal generated after the moving average. Thereby the reconstructed image data Rn is sequentially calculated in each imaging period tw2.
The frame rate converting processing and the processing to display the image data are the same as Embodiment 1, therefore description thereof is omitted.
The timing shown in Fig. 6 is also an example, and in an actual apparatus, the sampling period is about 0.1 to several msec, the imaging period is about 0.4 to 2 sec, and the display frame rate is about 50 to 240 Hz.
As described above, according to Embodiment 2, the moving average of a plurality of electric signals, each acquired in one of a plurality of imaging periods, is determined, hence the acquired image can be more natural than an image acquired by composing the individual reconstructed images.
(Embodiment 3)
In Embodiments 1 and 2, the imaging period is synchronized with the periodic motion which the living body performs spontaneously. In Embodiment 3, on the other hand, the photoacoustic apparatus induces periodic motion in the living body (object).
Fig. 7 is a block diagram of the photoacoustic apparatus according to Embodiment 3. In the present embodiment, an applying pad 174 is installed instead of the electrocardiograph 173. The applying pad 174 is a pad having an electrode, and can generate expansion/contraction motions of a muscle by supplying a weak current to the living body (object), as implemented in a low frequency treatment device, for example. In other words, the applying pad is a unit that applies a stimulation signal to induce motion in the living body (object).
In Embodiment 3, motion of the object synchronized with the imaging period can be induced by supplying a weak signal to the living body via the applying pad in synchronization with the imaging period. Further, in Embodiment 3, the irradiation of the light and the acquisition of the photoacoustic signal are performed at a timing corresponding to the same phase of the imaging period.
The method of generating image data based on the photoacoustic signal is the same as Embodiment 1 and 2, hence description thereof is omitted.
In Embodiment 3, expansion/contraction of a muscle can be caused by applying a stimulation signal which induces motion in the living body (object). In other words, the blood vessels can be observed while the muscle is in a target state of expansion/contraction.
The applying unit to apply the stimulation signal is not limited to an applying pad which applies electric stimulation. For example, mechanical stimulation may be applied to a painful spot using an actuator or the like. Further, a stimulation signal may be applied to other sensory organs.
(Other embodiments)
The above description on each embodiment is an example to describe the present invention, and the present invention can be carried out by appropriately changing or combining the above embodiments within a scope of not departing from the essence of the invention.
For example, the present invention may be carried out as a photoacoustic apparatus that carries out at least a part of the above mentioned processing. The present invention may also be carried out as an object information acquiring method that includes at least a part of the above mentioned processing. The above processing and means may be freely combined within the scope of not generating technical inconsistencies.
In the description of the embodiments, the heart beat is detected based on electrocardiographic waveforms, but a biological signal other than electrocardiographic waveforms may be used. For example, arterial pressure waveforms, sound waves emitted by the heart (cardiac sound) or the like may be used. Furthermore, the present invention may be applied to a periodic activity of the living body, other than the heartbeat.
In the description of the embodiments, a simple moving average is used as the moving average, but a weighted moving average, an exponential moving average or the like may be used.
In the description of the embodiments, such terms as the sampling period, the imaging period and the display period were used, but "period" here need not be precisely constant. In other words, "period" in this description includes cases of repeating at intervals which are not constant. Further, a pause period may be set within the sampling period.
As mentioned above, light of a plurality of wavelengths may be generated by the light source unit 200. If a plurality of wavelengths are used, the oxygen saturation can be calculated as functional information. For example, two wavelengths may be alternately switched in each imaging period to acquire the photoacoustic signal, the reconstructed image data may be calculated for each wavelength, and the oxygen saturation may be calculated based on the calculated reconstructed image data. The method of calculating the oxygen saturation is known, hence detailed description thereof is omitted.
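The known two-wavelength calculation can be sketched as a 2x2 linear solve: the absorption coefficient at each wavelength is a linear combination of the oxy- and deoxyhemoglobin concentrations. The sketch below uses arbitrary placeholder extinction coefficients, not literature data, and the function name is an assumption.

```python
def oxygen_saturation(mu1, mu2, e_hbo2_1, e_hb_1, e_hbo2_2, e_hb_2):
    """Solve mu(lambda_i) = e_HbO2(lambda_i)*C_HbO2 + e_Hb(lambda_i)*C_Hb
    for the two wavelengths, then return sO2 = C_HbO2 / (C_HbO2 + C_Hb)."""
    det = e_hbo2_1 * e_hb_2 - e_hbo2_2 * e_hb_1   # Cramer's rule
    c_hbo2 = (mu1 * e_hb_2 - mu2 * e_hb_1) / det
    c_hb = (e_hbo2_1 * mu2 - e_hbo2_2 * mu1) / det
    return c_hbo2 / (c_hbo2 + c_hb)

# Round-trip check with placeholder extinction coefficients:
# C_HbO2 = 0.7, C_Hb = 0.3 gives mu1 = 1.3 and mu2 = 1.7 below.
so2 = oxygen_saturation(mu1=1.3, mu2=1.7,
                        e_hbo2_1=1.0, e_hb_1=2.0,
                        e_hbo2_2=2.0, e_hb_2=1.0)
# recovers sO2 = 0.7
```

In practice mu1 and mu2 would come from the reconstructed image data at the two wavelengths, evaluated voxel by voxel.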
The above described plurality of embodiments may be installed in one photoacoustic apparatus and used switchably. Moreover, a function to transmit the ultrasonic wave from the transducer and a function to receive the ultrasonic echo reflected by the object and perform measurement based on this ultrasonic echo may be added.
Embodiment(s) of the present invention can also be realized by a computer of a system or apparatus that reads out and executes computer executable instructions (e.g., one or more programs) recorded on a storage medium (which may also be referred to more fully as a 'non-transitory computer-readable storage medium') to perform the functions of one or more of the above-described embodiment(s) and/or that includes one or more circuits (e.g., application specific integrated circuit (ASIC)) for performing the functions of one or more of the above-described embodiment(s), and by a method performed by the computer of the system or apparatus by, for example, reading out and executing the computer executable instructions from the storage medium to perform the functions of one or more of the above-described embodiment(s) and/or controlling the one or more circuits to perform the functions of one or more of the above-described embodiment(s). The computer may comprise one or more processors (e.g., central processing unit (CPU), micro processing unit (MPU)) and may include a network of separate computers or separate processors to read out and execute the computer executable instructions. The computer executable instructions may be provided to the computer, for example, from a network or the storage medium. The storage medium may include, for example, one or more of a hard disk, a random-access memory (RAM), a read only memory (ROM), a storage of distributed computing systems, an optical disk (such as a compact disc (CD), digital versatile disc (DVD), or Blu-ray Disc (BD)TM), a flash memory device, a memory card, and the like.
While the present invention has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation, so as to encompass all such modifications and equivalent structures and functions.
This application claims the benefit of Japanese Patent Application No. 2017-155617, filed on August 10, 2017, which is hereby incorporated by reference herein in its entirety.
120: reception unit
140: signal collecting unit
151: computing unit
200: light source unit

Claims (21)

  1. A photoacoustic apparatus, comprising:
    a light source configured to irradiate an object with light;
    an acoustic wave detection unit configured to receive an acoustic wave generated in the object by the irradiation of the light, and to convert the acoustic wave into an electric signal;
    a control unit configured to set an imaging period corresponding to an approximately periodic motion of the object, and to execute the irradiation of the light and acquisition of the electric signal at a timing in an approximately same phase in each imaging period;
    a composition unit configured to compose a plurality of electric signals acquired in each of the imaging periods; and
    an image generating unit configured to generate image data representing characteristic information of the object, based on the composed electric signal.
  2. The photoacoustic apparatus according to claim 1, wherein the composition unit performs the composition by acquiring a moving average of a plurality of electric signals acquired in each of the imaging periods.
  3. The photoacoustic apparatus according to claim 1 or 2,
    wherein the control unit executes the irradiation of the light and the acquisition of the electric signal for a plurality of times in the imaging period, and
    wherein the composition unit further composes the electric signals acquired for a plurality of times in the imaging period.
  4. The photoacoustic apparatus according to any one of claims 1 to 3, further comprising a converting unit configured to convert a frame rate of the image data generated in the imaging period into a frame rate corresponding to a display period which is shorter than the imaging period.
  5. The photoacoustic apparatus according to claim 4, further comprising a display control unit configured to display the image data on a display unit in the display period.
  6. The photoacoustic apparatus according to claim 5, wherein the display control unit displays information relating to the display period on the display unit.
  7. The photoacoustic apparatus according to any one of claims 1 to 6, wherein the object is a living body,
    the photoacoustic apparatus further comprising
    a determination unit configured to determine the imaging period, based on an approximately periodic activity of the living body.
  8. The photoacoustic apparatus according to claim 7, further comprising an applying unit configured to apply stimulation to the living body in each of imaging periods.
  9. A photoacoustic apparatus, comprising:
    a light source configured to irradiate an object with light;
    an acoustic wave detection unit configured to receive an acoustic wave generated in the object by the irradiation of the light, and to convert the acoustic wave into an electric signal;
    a control unit configured to set an imaging period corresponding to an approximately periodic motion of the object, and to execute the irradiation of the light and acquisition of the electric signal at a timing in an approximately same phase in each imaging period;
    an image generating unit configured to generate image data representing characteristic information of the object in each of the imaging periods, based on the acquired electric signal; and
    a composition unit configured to compose a plurality of image data generated in each of the imaging periods.
  10. The photoacoustic apparatus according to claim 9, wherein the composition unit performs the composition by acquiring a moving average of the plurality of image data acquired in each of the imaging periods.
  11. The photoacoustic apparatus according to any one of claims 1 to 10, wherein the light source is a light source including a semiconductor light emitting element.
  12. An object information acquiring method performed by a photoacoustic apparatus including a light source configured to irradiate an object with light, and an acoustic wave detection unit configured to receive an acoustic wave generated in the object by the irradiation of the light, and to convert the acoustic wave into an electric signal,
    the method comprising:
    a control step of setting an imaging period corresponding to an approximately periodic motion of the object, and executing the irradiation of the light and the acquisition of the electric signal at a timing in an approximately same phase in each imaging period;
    a composition step of composing a plurality of electric signals acquired in each of the imaging periods; and
    an image generating step of generating image data representing characteristic information of the object, based on the composed electric signal.
  13. The object information acquiring method according to claim 12, wherein in the composition step, the composition is performed by acquiring a moving average of the plurality of electric signals acquired in each of the imaging periods.
  14. The object information acquiring method according to claim 12 or 13,
    wherein in the control step, the irradiation of the light and the acquisition of the electric signal are executed for a plurality of times in the imaging period, and
    wherein the composition step further includes composing the electric signals acquired for a plurality of times in the imaging period.
  15. The object information acquiring method according to any one of claims 12 to 14, further comprising a converting step of converting a frame rate of the image data generated in the imaging period into a frame rate corresponding to a display period which is shorter than the imaging period.
  16. The object information acquiring method according to claim 15, further comprising a display control step of displaying the image data on a display unit in the display period.
  17. The object information acquiring method according to claim 16, wherein in the display control step, information relating to the display period is displayed on the display unit.
  18. The object information acquiring method according to any one of claims 12 to 17, wherein the object is a living body,
    the object information acquiring method further comprising
    a determination step of determining the imaging period based on an approximately periodic activity of the living body.
  19. The object information acquiring method according to claim 18, further comprising an applying step of applying stimulation to the living body in each of imaging periods.
  20. An object information acquiring method performed by a photoacoustic apparatus including a light source configured to irradiate an object with light, and an acoustic wave detection unit configured to receive an acoustic wave generated in the object by the irradiation of the light, and to convert the acoustic wave into an electric signal,
    the object information acquiring method comprising:
    a control step of setting an imaging period corresponding to an approximately periodic motion of the object, and executing the irradiation of the light and acquisition of the electric signal at a timing in an approximately same phase in each imaging period;
    an image generating step of generating image data representing characteristic information of the object in each of the imaging periods, based on the acquired electric signal; and
    a composition step of composing a plurality of image data generated in each of the imaging periods.
  21. The object information acquiring method according to claim 20, wherein in the composition step, the composition is performed by acquiring a moving average of a plurality of image data acquired in each of the imaging periods.
PCT/JP2018/030104 2017-08-10 2018-08-10 Photoacoustic apparatus and object information acquiring method WO2019031607A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2017155617A JP2019033806A (en) 2017-08-10 2017-08-10 Photoacoustic apparatus and subject information acquisition method
JP2017-155617 2017-08-10

Publications (1)

Publication Number Publication Date
WO2019031607A1 true WO2019031607A1 (en) 2019-02-14

Family

ID=63556376

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/030104 WO2019031607A1 (en) 2017-08-10 2018-08-10 Photoacoustic apparatus and object information acquiring method

Country Status (2)

Country Link
JP (1) JP2019033806A (en)
WO (1) WO2019031607A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160150990A1 (en) * 2014-11-28 2016-06-02 Canon Kabushiki Kaisha Photoacoustic apparatus, subject information acquisition method, and program
JP2016107069A (en) 2014-11-28 2016-06-20 キヤノン株式会社 Photoacoustic apparatus, subject information acquisition method, and program
JP2017155617A (en) 2016-02-29 2017-09-07 井関農機株式会社 Tractor

Also Published As

Publication number Publication date
JP2019033806A (en) 2019-03-07

Legal Events

Code Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 18768951; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 18768951; Country of ref document: EP; Kind code of ref document: A1)