WO2018207713A1 - Photoacoustic apparatus and photoacoustic image generating method
- Publication number: WO2018207713A1 (PCT/JP2018/017571)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- frame rate
- photoacoustic
- cyclic period
- period
- display
Classifications
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/0093—Detecting, measuring or recording by applying one single type of energy and measuring its conversion into another type of energy
- A61B5/0095—Detecting, measuring or recording by applying one single type of energy and measuring its conversion into another type of energy by applying light and detecting acoustic waves, i.e. photoacoustic measurements
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/72—Signal processing specially adapted for physiological signals or for diagnostic purposes
- A61B5/7203—Signal processing specially adapted for physiological signals or for diagnostic purposes for noise prevention, reduction or removal
- A61B5/7207—Signal processing specially adapted for physiological signals or for diagnostic purposes for noise prevention, reduction or removal of noise induced by motion artifacts
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/74—Details of notification to user or communication with user or patient; user input means
- A61B5/742—Details of notification to user or communication with user or patient; user input means using visual displays
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/24—Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
- A61B5/316—Modalities, i.e. specific diagnostic methods
- A61B5/318—Heart-related electrical modalities, e.g. electrocardiography [ECG]
- A61B5/346—Analysis of electrocardiograms
- A61B5/349—Detecting specific parameters of the electrocardiograph cycle
- A61B5/352—Detecting R peaks, e.g. for synchronising diagnostic apparatus; Estimating R-R interval
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/74—Details of notification to user or communication with user or patient; user input means
- A61B5/742—Details of notification to user or communication with user or patient; user input means using visual displays
- A61B5/743—Displaying an image simultaneously with additional graphical information, e.g. symbols, charts, function plots
Definitions
- the present invention relates to a photoacoustic apparatus and a method for processing subject information obtained by applying a photoacoustic effect.
- a photoacoustic apparatus which images an internal region of a subject by applying a photoacoustic effect.
- PTL 1 discloses a photoacoustic imaging apparatus having a hand-held probe, like one in an ultrasonic wave diagnosis apparatus.
- the photoacoustic imaging apparatus disclosed in PTL 1 causes a light emitting semiconductor device to emit light at a sampling cyclic period shorter than the cyclic period (hereinafter called a refresh rate period) of the refresh rate, that is, the number of screen rewrites per unit time.
- Detection signals obtained by receiving photoacoustic waves emitted from a subject irradiated with light are averaged to generate an image to be displayed on a display unit.
- the image to be displayed on the display unit is generated in a period equal to the refresh rate period.
- the period for generating such an image is limited by the refresh rate. Therefore, it may be difficult to change a condition for obtaining a photoacoustic image, such as the light emitting period of the light source or the selection of signals to be used for generating an image.
- if photoacoustic signals cannot be obtained under proper conditions, the generated image may exhibit blurring caused by motion of the subject or degradation caused by an insufficient S/N ratio, for example.
- the present disclosure improves the degree of freedom in obtaining a photoacoustic image, thereby improving the image quality of the obtained image.
- a photoacoustic apparatus includes a light irradiating unit configured to irradiate light to a subject, an acoustic wave receiving unit configured to receive acoustic waves generated in the subject upon irradiation of the light to the subject and to output a reception signal, and an image generating unit configured to generate an internal image of the subject based on the reception signal.
- the light irradiating unit is further configured to irradiate the light to the subject repetitively at a first cyclic period
- the image generating unit is further configured to combine a plurality of the reception signals obtained by irradiating the light a plurality of times and to generate image data at a frame rate of a second cyclic period based on the combined signal.
- the photoacoustic apparatus further includes a frame rate converting unit configured to convert a display rate of an image based on the image data generated at the frame rate of the second cyclic period to a frame rate of a third cyclic period.
- the present disclosure improves the degree of freedom in obtaining a photoacoustic image, thereby improving the image quality of the obtained image.
- Fig. 1 is a block diagram illustrating a photoacoustic apparatus according to a first embodiment.
- Fig. 2A is a schematic diagram illustrating a hand-held probe according to the first embodiment.
- Fig. 2B is a schematic diagram illustrating a hand-held probe according to the first embodiment.
- Fig. 3 is a block diagram illustrating a computer and peripheral components according to the first embodiment.
- Fig. 4A is a timing chart for describing operations according to the first embodiment.
- Fig. 4B is a timing chart for describing operations according to the first embodiment.
- Fig. 4C is a timing chart for describing operations according to the first embodiment.
- Fig. 5A is a timing chart for describing operations according to a second embodiment.
- Fig. 5B is a timing chart for describing operations according to the second embodiment.
- Fig. 6 is a timing chart for describing operations according to a third embodiment.
- Fig. 7 is a timing chart for describing operations according to a fourth embodiment.
- Fig. 8 is a block diagram illustrating a computer and peripheral components according to a fifth embodiment.
- Fig. 9 is a timing chart for describing operations according to the fifth embodiment.
- Fig. 10A illustrates a display screen displaying measurement conditions according to the present disclosure.
- Fig. 10B illustrates a display screen displaying measurement conditions according to the present disclosure.
- Fig. 10C illustrates a display screen displaying measurement conditions according to the present disclosure.
- the present invention relates to a technology which detects acoustic waves propagated from a subject and generates and obtains property information on an internal region of the subject. Therefore, the present invention may be considered as a subject information obtaining apparatus or a control method therefor, a subject information obtaining method, or a signal processing method. Alternatively, the present invention may be considered as a display method for generating and displaying an image illustrating property information regarding an internal region of a subject. Alternatively, the present invention may be considered as a program causing an information processing apparatus including hardware resources such as a CPU and a memory to execute one of the methods described above, or a computer-readable, non-transitory storage medium storing the program.
- a subject information obtaining apparatus may include a photoacoustic imaging apparatus which receives acoustic waves generated within a subject irradiated with light (electromagnetic waves) and uses the photoacoustic effect to obtain property information on the subject as image data.
- the property information is information which is generated by using a signal originating from the received photoacoustic waves and indicates property values corresponding to a plurality of positions within the subject.
- Photoacoustic image data according to the present invention is a concept including all kinds of image data originating from photoacoustic waves in the subject upon irradiation of light.
- photoacoustic image data may be image data indicating a spatial distribution of at least one piece of subject information, such as the sound pressure (initial sound pressure) of the generated photoacoustic waves, an energy absorption density, an absorption factor, or a concentration (such as an oxygen saturation) of a substance included in the subject.
- photoacoustic image data representing spectrum information, such as the concentration of a substance included in a subject, may be obtained based on photoacoustic waves generated in the subject upon irradiation with light beams having a plurality of mutually different wavelengths.
- the photoacoustic image data representing such spectrum information may be an oxygen saturation, a value acquired by weighting an oxygen saturation with an absorption factor, a total hemoglobin concentration, an oxyhemoglobin concentration, or a deoxyhemoglobin concentration, for example.
- the photoacoustic image data representing such spectrum information may also be a glucose concentration, a collagen concentration, a melanin concentration, or a volume fraction of fat or water.
- a two-dimensional or three-dimensional property information distribution may be obtained.
- the distribution data may be generated as image data.
- the property information may be obtained as distribution information with respect to positions within a subject instead of numerical value data.
- the property information may be distribution information such as an initial sound pressure distribution, an energy absorption density distribution, an absorption factor distribution and an oxygen saturation distribution.
- "acoustic wave" herein typically refers to an ultrasonic wave and includes elastic waves such as sonic waves.
- An electric signal converted from an acoustic wave by a transducer may also be called an acoustic signal.
- An acoustic wave generated by a photoacoustic effect is called a photoacoustic wave or a photoacoustically induced ultrasonic wave.
- An electric signal originating from a photoacoustic wave may also be called a photoacoustic signal.
- the distribution data may also be called photoacoustic image data or reconstructed image data.
- the following embodiments relate to a photoacoustic apparatus, as a subject information obtaining apparatus, which irradiates pulsed light to a subject, receives photoacoustic waves from the subject and generates a blood vessel image (structure image) within the subject.
- while a photoacoustic apparatus having a hand-held probe is described below, the present invention is also applicable to a photoacoustic apparatus including a probe mounted on a mechanical stage for mechanical scanning.
- the photoacoustic apparatus 1 has a probe 180, a signal collecting unit 140, a computer 150, a display unit 160, and an input unit 170.
- the probe 180 includes a light source unit 200, an optical system 112, a light irradiating unit 113, and an acoustic wave receiving unit 120.
- the computer 150 includes a calculating unit 151, a storage unit 152, a control unit 153, and a frame rate converting unit 159.
- the light source unit 200 is configured to supply light pulses to the light irradiating unit 113 through the optical system 112 such as an optical fiber (bundle fiber) at a first cyclic period.
- the light irradiating unit 113 is configured to irradiate supplied light to a subject 100.
- photoacoustic waves are generated in the subject 100 at the first cyclic period.
- the acoustic wave receiving unit 120 is configured to receive photoacoustic waves generated in the subject 100 at the first cyclic period and output an electric signal (hereinafter, also called a reception signal or photoacoustic signal) that is an analog signal.
- the acoustic wave receiving unit 120 may receive photoacoustic waves at intervals each defined by the first cyclic period.
- the signal collecting unit 140 is configured to convert the analog signal output from the acoustic wave receiving unit 120 to a digital signal and output it to the computer 150.
- the computer 150 is configured to use the calculating unit 151, the storage unit 152, and the control unit 153 to combine, at a second cyclic period (also called an imaging frame rate period), the digital signals that are output from the signal collecting unit 140 at the first cyclic period in response to a plurality of light irradiations, and to store the combined signal in the storage unit 152 as an electric signal (photoacoustic signal) originating from photoacoustic waves.
- the combining may include not only a simple addition but also a weighted addition, an averaging, and a moving average; although averaging is mainly described below, any other kind of combining may be applied.
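- as an illustrative sketch (not part of the patent disclosure), the combining step can be expressed as follows, assuming the per-pulse records are stacked into an array indexed as (pulse, transducer channel, A/D sample); all names and shapes are assumptions:

```python
import numpy as np

def combine_frames(pulse_signals, weights=None):
    """Combine per-pulse photoacoustic records into one frame signal.

    pulse_signals: array of shape (n_pulses, n_channels, n_samples),
    one record per light emission at the first cyclic period tw1.
    Returns the combined signal of shape (n_channels, n_samples).
    """
    if weights is None:
        return pulse_signals.mean(axis=0)          # simple averaging
    w = np.asarray(weights, dtype=float)
    w /= w.sum()                                   # normalized weighted addition
    return np.tensordot(w, pulse_signals, axes=1)
```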
- the computer 150 may perform a process such as an image reconstruction on a digital signal stored in the storage unit 152 to generate photoacoustic image data within a time period defined by the second cyclic period (imaging frame rate period).
- the computer 150 may function as an image generating unit configured to generate an image of an internal region of a subject based on the received signal.
- the computer 150 is configured to output the generated photoacoustic image data to the frame rate converting unit 159 at the second cyclic period.
- the frame rate converting unit 159 is configured to convert photoacoustic image data generated at the second cyclic period to photoacoustic image data at a third cyclic period (hereinafter, also called a display frame rate period) that is suitable for display on the display unit 160.
- the computer 150 converts the display rate of an image based on the image data generated at the second cyclic period to a frame rate of the third cyclic period.
- the computer 150 is configured to generally control the photoacoustic apparatus 1 by using the control unit 153.
- the display unit 160 is configured to display a photoacoustic image based on the photoacoustic image data at the third cyclic period (display frame rate period).
- the computer 150 may perform image processing for display and processing for combining graphic representations for GUI on the obtained photoacoustic image data. This processing may be performed on the photoacoustic image data generated at the second cyclic period or may be performed on the photoacoustic image data generated at the third cyclic period.
- the first cyclic period is not necessarily required to be a completely uniform repetition time period. Herein, "period" is also used to refer to repetition time intervals that are not uniform.
- likewise, when a pause period is inserted, the repetition time period within the time excluding the pause period is still called a period in the present disclosure.
- a user may check a photoacoustic image displayed on the display unit 160.
- the image displayed on the display unit 160 may be saved in a memory within the computer 150 or a data management system connected to the photoacoustic apparatus over a communication network in response to a save instruction from a user or the computer 150.
- the input unit 170 is configured to receive an instruction from a user.
- Fig. 2A is a schematic diagram of the probe 180 according to this embodiment.
- the probe 180 includes the light source unit 200, the optical system 112, the light irradiating unit 113, the acoustic wave receiving unit 120, and a housing 181.
- the housing 181 is configured to enclose the light source unit 200, the optical system 112, the light irradiating unit 113 and the acoustic wave receiving unit 120.
- a user may grip the housing 181 to use the probe 180 as a hand-held probe.
- the light irradiating unit 113 is configured to irradiate light pulses propagated from the optical system 112 to a subject.
- the X, Y, and Z axes illustrated in Figs. 2A and 2B represent coordinate axes in a case where the probe is settled but are not intended to limit the orientation of the probe while being used.
- the probe 180 illustrated in Fig. 2A is connected to the signal collecting unit 140 through a cable 182.
- the cable 182 may include a wire configured to supply power to the light source unit 200, a light emission control signal wire, or a wire (not illustrated) configured to output an analog signal output from the acoustic wave receiving unit 120 to the signal collecting unit 140.
- the cable 182 may have a connector and may be configured to be capable of separating the probe 180 and the other components of the photoacoustic apparatus.
- light pulses may be irradiated to a subject directly by using a semiconductor laser or a light emitting diode as the light source unit 200, without using the optical system 112.
- a light emitting end part of the semiconductor laser or LED (or the leading end of the housing), for example, may correspond to the light irradiating unit 113.
- the light source unit 200 is configured to generate light to be irradiated to the subject 100.
- the light source unit 200 may be a light source capable of generating pulsed light and outputting light beams having a plurality of wavelengths for obtaining a substance concentration such as an oxygen saturation. Because the light source is to be mounted within the housing of the probe 180, a compact semiconductor light emitting device such as a semiconductor laser or a light emitting diode, as illustrated in Fig. 2B, may be used.
- the light beams having a plurality of wavelengths may be output by switching the light emission by using a plurality of types of semiconductor lasers or light emitting diodes which generate light beams having different wavelengths.
- the light source unit 200 can generate light having a pulse width equal to or longer than 10 ns and equal to or shorter than 1 μs, for example.
- the light may have a wavelength equal to or higher than 400 nm and equal to or lower than 1600 nm; the wavelength may be determined in accordance with the light absorption property of the light absorber to be imaged.
- light having a wavelength (equal to or higher than 400 nm and equal to or lower than 800 nm) that is highly absorbed by blood vessels may be applied.
- alternatively, light having a wavelength (equal to or higher than 700 nm and equal to or lower than 1100 nm) that is less absorbed by background tissue (water or fat) of a living body may be applied.
- when a semiconductor light emitting device is used as the light source of the light source unit 200, the light amount may be insufficient.
- one irradiation may produce a photoacoustic signal having an S/N ratio lower than a desired ratio.
- light may be emitted at the first cyclic period, and resulting photoacoustic signals may be averaged for an improved S/N ratio.
- a photoacoustic image may be calculated at the second cyclic period (imaging frame rate period).
- the light source unit 200 may emit light having a wavelength of 797 nm, for example.
- light having the wavelength can reach a deep part of a subject and can be absorbed by oxyhemoglobin and deoxyhemoglobin with substantially equal absorption factors. Therefore, the wavelength is suitable for detection of a blood vessel structure.
- a light source may be used which produces a second wavelength of 756 nm so that an oxygen saturation can be acquired by using a difference between the absorption factors of oxyhemoglobin and deoxyhemoglobin.
- the light irradiating unit 113 is an emission end configured to irradiate light to a subject.
- when a bundle fiber is used as the optical system 112, the emission end of the fiber may serve as the light irradiating unit 113.
- a diffuser for diffusing light may be used so that pulsed light having an increased beam diameter can be irradiated.
- in a case where the light source unit 200 is a semiconductor light emitting device as illustrated in Fig. 2B, light emission end parts (housing leading edges) of a plurality of semiconductor light emitting devices may be arranged to function as the light irradiating unit 113 so that light can be irradiated to a wide range of the subject.
- the acoustic wave receiving unit 120 includes a transducer configured to receive photoacoustic waves generated in the subject upon irradiation of light at the first cyclic period and to output an electric signal and a supporting member configured to support the transducer.
- the transducer may include components such as a piezoelectric material, a Capacitive Micro-machined Ultrasonic Transducer (CMUT), and a Fabry-Perot interferometer.
- the piezoelectric material may be a piezoelectric ceramic material such as PZT (lead zirconate titanate) or a polymer piezoelectric film material such as PVDF (polyvinylidene difluoride), for example.
- the electric signal obtained by the transducer at the first cyclic period is a time-resolved signal. Therefore, the electric signal has an amplitude representing a value based on a sound pressure (such as a value in proportion to the sound pressure) received by the transducer at each time.
- the transducer may be capable of detecting frequency components (typically from 100 kHz to 10 MHz) of a photoacoustic wave.
- a plurality of transducers may be arranged on the supporting member to form a plane or a curved surface called a 1D array, a 1.5D array, a 1.75D array, or a 2D array, for example.
- the acoustic wave receiving unit 120 may include an amplifier configured to amplify time-series analog signals output from the transducers.
- the acoustic wave receiving unit 120 may include an A/D converter configured to convert time-series analog signals output from the transducers to time-series digital signals.
- the acoustic wave receiving unit 120 may include the signal collecting unit 140.
- the transducers may surround the whole circumference of the subject 100 in order to detect acoustic waves from various angles for improved image accuracy.
- the transducers may be arranged on a hemispherical supporting member.
- the probe 180 including the acoustic wave receiving unit 120 having such a shape is suitable for a scanning photoacoustic apparatus which moves the probe relative to the subject 100, rather than for a hand-held photoacoustic apparatus.
- the probe may be moved by a scanning unit such as an XY stage.
- the arrangement and number of transducers and the shape of the supporting member are not limited to those described above but may be optimized in accordance with the subject 100.
- a medium for propagating photoacoustic waves may be provided in a space between the acoustic wave receiving unit 120 and the subject 100. This can cause an acoustic impedance match at an interface between the subject 100 and the transducers.
- the medium may be water, oil, or ultrasound gel, for example.
- the photoacoustic apparatus 1 may include a holding member configured to hold the subject 100 to stabilize the shape of the subject 100.
- the holding member may have both of a high luminous transmittance and a high acoustic wave transmittance.
- polymethylpentene, polyethylene terephthalate, or acrylic may be used.
- the transducers may function as a transmitting unit configured to transmit acoustic waves in order to generate not only a photoacoustic image but also an ultrasonic wave image by transmitting and receiving acoustic waves.
- a transducer functioning as a receiving unit and a transducer functioning as a transmitting unit may be implemented by a single (common) transducer or by separate transducers.
- the signal collecting unit 140 includes an amplifier and an A/D converter.
- the amplifier is configured to amplify an electric signal that is an analog signal generated in the subject upon irradiation of light at the first cyclic period and output from the acoustic wave receiving unit 120.
- the A/D converter is configured to convert the analog signal output from the amplifier to a digital signal.
- the signal collecting unit 140 may be implemented by an FPGA (Field Programmable Gate Array) chip.
- Analog signals output from a plurality of transducers arranged in an array form in the acoustic wave receiving unit 120 are amplified by a plurality of corresponding amplifiers and are converted to digital signals by a plurality of corresponding A/D converters.
- the A/D conversion is performed at an A/D conversion rate equal to or higher than twice the bandwidth of the input signal.
- because the photoacoustic band typically extends to about 10 MHz, the A/D conversion rate may be equal to or higher than 20 MHz and, desirably, equal to 40 MHz.
- the signal collecting unit 140 synchronizes the timing of light irradiation and the timing of signal collection processing by using a light emission control signal.
- the A/D conversion converts the analog signal to a digital signal at the A/D conversion rate, with reference to the light emission time in each first cyclic period.
- thus, for each of the plurality of transducers, a digital data string sampled at time intervals (A/D conversion intervals) equal to the reciprocal of the A/D conversion rate, referenced to the light emission time in each first cyclic period, can be obtained.
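- as a rough illustration (not from the patent), the length of each such data string follows from the maximum imaging depth and the speed of sound; the depth and sound-speed values below are assumptions:

```python
import math

SPEED_OF_SOUND_M_S = 1540.0   # typical soft-tissue value (assumption)
AD_RATE_HZ = 40e6             # A/D conversion rate from the example above
MAX_DEPTH_M = 0.04            # assumed maximum imaging depth of 4 cm

# The one-way travel time from the deepest observation point to the
# transducer surface bounds the receive window after each light emission.
receive_window_s = MAX_DEPTH_M / SPEED_OF_SOUND_M_S
samples_per_emission = math.ceil(receive_window_s * AD_RATE_HZ)
print(samples_per_emission)   # -> 1039 samples per channel per emission
```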
- the signal collecting unit 140 may also be called a Data Acquisition System (DAS).
- An electric signal herein is a concept including an analog signal and a digital signal.
- the signal collecting unit 140 may be placed within the housing 181 of the probe 180, as described above. In this configuration, information between the probe 180 and the computer 150 is propagated as a digital signal for improved noise immunity. Use of high-speed digital signals can reduce the number of wires, which can improve operability of the probe 180, compared with transmission of analog signals.
- the signal collecting unit 140 may perform averaging, which will be described below.
- the averaging may be performed by using hardware such as an FPGA.
- the computer 150 includes the calculating unit 151, the storage unit 152, the control unit 153, and the frame rate converting unit 159.
- a unit responsible for a calculation function of the calculating unit 151 may be a computing circuit including a processor such as a CPU or a GPU (Graphics Processing Unit) and an FPGA (Field Programmable Gate Array) chip. This unit may be implemented by a single processor or computing circuit or a plurality of processors or computing circuits.
- the computer 150 may combine, across the digital data strings output from the signal collecting unit 140 in every first cyclic period, the data pieces having an equal time difference from the light emission time of the light source unit 200.
- the computer 150 then stores in the storage unit 152 the combined digital data string as a combined electric signal (photoacoustic signal) originating from photoacoustic waves in each second cyclic period (imaging frame rate period).
- the calculating unit 151 then generates photoacoustic image data (of a structure image or a function image) by performing image reconstruction and executes other kinds of computing processing based on the averaged photoacoustic signal stored in the storage unit 152 in each second cyclic period (imaging frame rate period).
- the calculating unit 151 may receive a parameter such as the speed of sound of a subject or a configuration of the holding unit from the input unit 170 for use in a computing operation.
- the calculating unit 151 may apply any reconstruction algorithm for converting an electric signal to three-dimensional volume data, such as a time-domain back projection method, a Fourier-domain back projection method, or a model-based method (iterative calculation method).
- the time-domain back projection method may be universal back-projection (UBP), filtered back-projection (FBP), or phased addition (delay-and-sum).
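- the following is a minimal sketch of the delay-and-sum idea named above, given only for illustration and not as the apparatus's actual implementation; the array layout, names, and sound-speed default are assumptions:

```python
import numpy as np

def delay_and_sum(signals, elem_pos, grid_pts, fs, c=1540.0):
    """Minimal delay-and-sum reconstruction for photoacoustic data.

    signals:  (n_channels, n_samples) combined records, t=0 at the
              light emission time.
    elem_pos: (n_channels, 3) transducer element positions [m].
    grid_pts: (n_voxels, 3) reconstruction grid positions [m].
    fs:       A/D sampling rate [Hz]; c: speed of sound [m/s].
    """
    n_channels, n_samples = signals.shape
    image = np.zeros(len(grid_pts))
    for ch in range(n_channels):
        # one-way time of flight from every voxel to this element
        dist = np.linalg.norm(grid_pts - elem_pos[ch], axis=1)
        idx = np.round(dist / c * fs).astype(int)
        valid = idx < n_samples
        image[valid] += signals[ch, idx[valid]]
    return image / n_channels
```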
- the calculating unit 151 may perform image reconstruction processing to generate a first initial sound pressure distribution from a photoacoustic signal originating from a light beam having a first wavelength and a second initial sound pressure distribution from a photoacoustic signal originating from a light beam having a second wavelength.
- the calculating unit 151 may further correct the first initial sound pressure distribution by using a light amount distribution of the light beam having the first wavelength to obtain a first absorption factor distribution and correct the second initial sound pressure distribution by using a light amount distribution of the light beams having the second wavelength to obtain a second absorption factor distribution.
- the calculating unit 151 can further obtain an oxygen saturation distribution from the first and second absorption factor distributions. The details and order of the computing operations are not limited to the above as long as the oxygen saturation distribution can finally be obtained.
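- as an illustrative sketch (not from the patent) of the two-wavelength computation, the oxygen saturation at each position can be obtained by solving a 2x2 linear absorption model; the extinction coefficients below are rough, illustration-only values:

```python
import numpy as np

# Approximate molar extinction coefficients [cm^-1/M]; illustrative
# values only. Near 797 nm, oxy- and deoxyhemoglobin absorb almost
# equally (the near-isosbestic behavior mentioned in the text).
EPS = {756: {"HbO2": 560.0, "Hb": 1500.0},
       797: {"HbO2": 780.0, "Hb": 780.0}}

def oxygen_saturation(mu_a_756, mu_a_797):
    """Solve for sO2 from two absorption factor distributions.

    mu_a_756, mu_a_797: arrays of the same shape, reconstructed from
    the two wavelengths after light-amount correction.
    """
    A = np.array([[EPS[756]["HbO2"], EPS[756]["Hb"]],
                  [EPS[797]["HbO2"], EPS[797]["Hb"]]])
    A_inv = np.linalg.inv(A)
    c_hbo2 = A_inv[0, 0] * mu_a_756 + A_inv[0, 1] * mu_a_797
    c_hb   = A_inv[1, 0] * mu_a_756 + A_inv[1, 1] * mu_a_797
    return c_hbo2 / (c_hbo2 + c_hb)   # sO2 = [HbO2] / ([HbO2] + [Hb])
```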
- the storage unit 152 may include a volatile memory such as a RAM (random access memory) and a non-volatile storage medium such as a ROM (read only memory), a magnetic disk, and a flash memory.
- the storage medium to store a program is a non-volatile storage medium.
- the storage unit 152 may include a plurality of storage media.
- the storage unit 152 can save a photoacoustic signal averaged at the second cyclic period (imaging frame rate period) and data such as photoacoustic image data generated by the calculating unit 151 and reconstructed image data based on photoacoustic image data.
- the control unit 153 includes computing elements such as a CPU.
- the control unit 153 is configured to control operations to be performed by components of the photoacoustic apparatus.
- the control unit 153 may control a component of the photoacoustic apparatus in response to an instruction signal for an operation such as starting a measurement from the input unit 170.
- the control unit 153 may read out program code stored in the storage unit 152 and control an operation of a corresponding component of the photoacoustic apparatus.
- the control unit 153 may adjust an image for display on the display unit 160.
- oxygen saturation distribution images are displayed sequentially in response to movements of the probe and photoacoustic measurements.
- the frame rate converting unit 159 outputs photoacoustic image data (structure image or function image) generated at a frame rate (imaging frame rate) of the second cyclic period to the display unit as an image signal at a frame rate (display frame rate) of the third cyclic period.
- reconstructed image data may be stored in the storage unit 152 at the frame rate of the second cyclic period (imaging frame rate), and the stored reconstructed image data may be read out at the third frame rate (display frame rate).
- the control unit 153 and the storage unit 152 can also function as the frame rate converting unit 159; in that case, these components may collectively be called the frame rate converting unit.
- the frame rate of the third cyclic period may be selected from frame rates of 50 Hz, 60 Hz, 72 Hz, and 120 Hz which are supported by a general-purpose display apparatus, for example.
- in this manner, the frame rate of the second cyclic period (imaging frame rate) suitable for measurement can be selected independently from the frame rate of the third cyclic period (display frame rate) suitable for image display.
- a configuration may easily be provided in which the frame rate of the second cyclic period (imaging frame rate) can be changed in response to a user's instruction, for example.
- the display unit 160 is configured to rewrite the real screen in synchronization with the frame rate of the third cyclic period (display frame rate) input to the display unit 160.
- the frame rate of the third cyclic period (display frame rate) is equal to a rate for rewriting the real screen (refresh rate).
- some liquid crystal display apparatuses may have a function for processing a plurality of input frame rates (frame frequencies).
- Such liquid crystal display apparatuses may contain a frame rate converter configured to convert an input frame rate to a rate for rewriting the real screen (refresh rate).
- in other words, the display unit 160 may include a frame rate converter configured to convert the frame rate of the third cyclic period (display frame rate) to the rate for rewriting the real screen (refresh rate).
- in such a configuration, the frame rate converting unit 159 is provided in the display unit 160 instead of in the computer 150 as illustrated in Fig. 1.
- the configuration having the frame rate converting unit in the display unit 160 can advantageously simplify the configuration of the computer 150.
- the computer 150 may be a specially designed workstation.
- the computer 150 may cause a general-purpose PC or workstation to operate in response to instructions of a program stored in the storage unit 152.
- the components of the computer 150 may be configured by different hardware modules. At least some components of the computer 150 may be configured by one hardware module.
- Fig. 3 illustrates a specific configuration example of the computer 150 according to this embodiment.
- the computer 150 includes a CPU 154, a GPU 155, a RAM 156, a ROM 157, an external storage device 158, and the frame rate converting unit 159.
- a liquid crystal display 161 corresponding to the display unit 160, and a mouse 171 and a keyboard 172 corresponding to the input unit 170, are connected to the computer 150.
- the computer 150 and the acoustic wave receiving unit 120 may be contained in a common housing.
- the computer contained in the housing may perform some signal processes, and a computer provided externally to the housing may perform the other signal processes.
- the computers provided internally and externally to the housing can collectively be called a computer according to this embodiment.
- hardware modules included in the computer may not be contained in one housing.
- the computer 150 may be an information processing apparatus provided by a cloud computing service, for example, and installed remotely.
- the computer 150 corresponds to a processing unit according to an aspect of the present invention; particularly, the calculating unit 151 mainly implements the functionality of the processing unit.
- the display unit 160 may be a display apparatus such as a liquid crystal display or an organic electroluminescence (EL) display.
- the display unit 160 is an apparatus configured to display an image based on subject information obtained by the computer 150 and a numerical value for a specific position.
- the display unit 160 receives and displays reconstructed image data at the frame rate of the third cyclic period (display frame rate).
- the frame rate of the third cyclic period (display frame rate) may be equal to a frame rate of 50 Hz, 60 Hz, 72 Hz, or 120 Hz, for example.
- the display unit 160 may display an image and a GUI for operating the apparatus.
- the display unit 160 or the computer 150 may perform image processing (such as adjustment of a luminance value).
- the input unit 170 may be an operating console including a mouse, a keyboard, and a specific knob, which can be operated by a user.
- the display unit 160 may be configured by a touch panel, and the display unit 160 may be used as the input unit 170.
- the input unit 170 may receive an instruction or a numerical value from a user and transmit it to the computer 150.
- the components of the photoacoustic apparatus may be implemented by mutually different devices or may be implemented by one integral apparatus. At least some components of the photoacoustic apparatus may be implemented by one integral apparatus.
- the computer 150 may further use the control unit 153 to control driving of the components of the photoacoustic apparatus.
- the display unit 160 may display a GUI in addition to an image generated by the computer 150.
- the input unit 170 is configured to receive information from a user. A user can use the input unit 170 to instruct to perform operations such as starting and finishing a measurement, designating the frame rate of the second cyclic period (imaging frame rate), which will be described below, and saving a generated image.
- the subject 100 will be described below though it is not a component of the photoacoustic apparatus.
- the photoacoustic apparatus according to this embodiment is usable for purposes such as diagnosis of malignant tumors and blood vessel diseases of humans or animals and follow-up observation of chemotherapy. Therefore, the subject 100 is assumed to be a living body, more specifically a diagnosis target region such as the breast, an organ, a vascular network, the head, the neck, the abdomen, or a limb including fingers and toes of a human or animal body.
- oxyhemoglobin, deoxyhemoglobin, a blood vessel containing a large amount of them, or a new blood vessel formed in the neighborhood of a tumor may be the optical absorber.
- Plaque of a carotid artery wall may be an optical absorber.
- a pigment such as methylene blue (MB) or indocyanine green (ICG), fine gold particles, or an externally introduced substance in which these are aggregated or chemically modified may also be the optical absorber.
- a puncture needle or an optical absorber attached to a puncture needle may be observed.
- the subject may be a lifeless thing such as a phantom or a test subject.
- Figs. 4A, 4B and 4C are timing charts for describing operations according to the first embodiment of the present invention.
- in Figs. 4A, 4B and 4C, the horizontal axis represents time.
- Controls may be executed by the computer 150, an FPGA or a dedicated hardware module.
- Fig. 4A illustrates a case where the frame rate of the second cyclic period (imaging frame rate) and the frame rate of the third cyclic period (display frame rate) have an equal frequency.
- the light source unit 200 emits light at a first cyclic period (first cyclic period: tw1), and a photoacoustic signal based on the emitted light is obtained at the first cyclic period: tw1.
- the first cyclic period: tw1 may have a length set in consideration of a Maximum Permissible Exposure (MPE) for skin. This is because the MPE value decreases as the length of the first cyclic period: tw1 decreases.
- in this example, the optical energy irradiated from the light source unit 200 to the subject 100, such as a human body, is about 13.3 J/m² per pulse.
- the optical energy irradiated from the light irradiating unit 113 must be equal to or lower than the MPE value; because the MPE value decreases as the first cyclic period: tw1 decreases, setting the first cyclic period: tw1 equal to or longer than 0.1 msec guarantees that this optical energy is equal to or lower than the MPE value.
- in other words, the length of the first cyclic period: tw1 is set, based on the peak power and the irradiated area, so that the irradiated energy stays below the MPE value.
- the light irradiating unit irradiates light to the subject eight times at the first cyclic period: tw1, and the photoacoustic signals generated in the subject upon each irradiation are obtained ((1) to (8)).
- the obtained photoacoustic signals are averaged, and the averaged photoacoustic signal A1 is obtained for each imaging frame rate period: tw2.
- simple averaging, moving averaging, or weighted averaging may be performed, for example, as described above.
- in a case where the first cyclic period: tw1 is equal to 0.1 msec and the imaging frame rate is equal to 60 Hz, the imaging frame rate period: tw2 is equal to 16.7 msec, which means that the averaging is performed 167 times per imaging frame rate period.
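- the arithmetic of this example can be checked with a short sketch (illustrative only; the variable names are not from the patent):

```python
# Worked timing arithmetic from the example above.
TW1_S = 1e-4                          # first cyclic period tw1 = 0.1 msec
IMAGING_FRAME_RATE_HZ = 60

tw2_s = 1.0 / IMAGING_FRAME_RATE_HZ   # imaging frame rate period tw2
n_average = round(tw2_s / TW1_S)      # light emissions combined per frame
print(f"tw2 = {tw2_s * 1e3:.1f} msec, averaging {n_average} times")
# -> tw2 = 16.7 msec, averaging 167 times
```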
- Processing for image reconstruction may be performed based on the averaged photoacoustic signal A1 in the time interval defined by the second cyclic period: tw2 so that reconstructed image data R1 can be obtained.
- the reconstructed image data are sequentially calculated by the calculating unit in the imaging frame rate period: tw2.
- the reconstructed image data R1 calculated by the calculating unit are output at the frame rate of the third cyclic period as an image 1 from the frame rate converting unit 159 to the display unit.
- the imaging frame rate and the display frame rate exhibit an equal frequency, as described above.
- the frame rate converting unit 159 outputs the reconstructed image data R1 obtained in T3 as display image data in the display frame rate period: tw3.
- the display unit 160 then displays the display image data input in the display frame rate period: tw3.
- the first cyclic period: tw1 depends on the peak power of pulsed light and an irradiated area on a subject, as described above.
- the number of times of averaging depends on the ratio of the S/N ratio of the photoacoustic signal obtained by one irradiation of pulsed light to the S/N ratio of the photoacoustic signal determined based on the image quality requested by a user.
- in a case where the S/N ratio of the photoacoustic signal obtained by one irradiation of pulsed light is 1/10 of the requested S/N ratio, the S/N ratio must be improved by a factor of 10.
- because averaging N signals improves the S/N ratio by a factor of √N, the averaging may be performed 100 times in this case.
- with the first cyclic period: tw1 equal to 0.1 msec, the imaging frame rate period is then equal to or longer than 10 msec; that is, the imaging frame rate is equal to or lower than 100 Hz.
- the first cyclic period: tw1 is also limited by heat generated by the semiconductor light emitting device. In other words, for a given thermal resistance of the probe, the device temperature is determined by the power consumption of the semiconductor light emitting device. Therefore, the first cyclic period: tw1 may be increased to prevent the temperature of the semiconductor light emitting device from exceeding the allowable temperature.
- to suppress motion blurring, the number of times of averaging may advantageously be reduced as much as possible; more specifically, the design may keep the motion blurring at 1/2 or less of the requested resolution. For example, in a case where the requested resolution is 0.2 mm, the body motion of the subject is 5 mm/sec, and the first cyclic period: tw1 is equal to 0.1 msec, the number of times of averaging may be equal to or lower than 200; that is, the imaging frame rate period: tw2 may be equal to or shorter than 20 msec.
- based on these conditions, the first cyclic period: tw1 and the imaging frame rate period: tw2 may be determined. If all of the conditions cannot be satisfied, those parameters may be determined by assigning priority levels to the conditions.
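- a hypothetical helper combining the two constraints above (S/N improvement by averaging versus motion blurring) might look as follows; the function and its numbers are illustrative, not from the patent:

```python
def choose_averaging_count(snr_single, snr_target,
                           resolution_m, motion_m_s, tw1_s):
    """Return an averaging count N satisfying both constraints, or None."""
    # S/N grows with sqrt(N): need N >= (snr_target / snr_single)^2
    n_min = (snr_target / snr_single) ** 2
    # Blur = motion * N * tw1 must stay at or below half the resolution
    n_max = (resolution_m / 2) / (motion_m_s * tw1_s)
    if n_min > n_max:
        return None   # conditions conflict; prioritize one of them
    return int(n_max)

# Example from the text: single-shot S/N is 1/10 of the target,
# 0.2 mm resolution, 5 mm/sec body motion, tw1 = 0.1 msec.
print(choose_averaging_count(1.0, 10.0, 0.2e-3, 5e-3, 1e-4))  # -> 200
```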
- the photoacoustic apparatus may be configured such that a user can input desired conditions such as a resolution and an S/N ratio through the input unit and such that the control unit may determine the first cyclic period and second cyclic period based on the input conditions.
- Figs. 4B and 4C are timing charts according to the first embodiment of the present invention in a case where the imaging frame rate and the display frame rate have different frequencies.
- referring to Fig. 4B, the methods for obtaining a photoacoustic signal and for generating photoacoustic image data based on the obtained photoacoustic signal in the photoacoustic apparatus according to the first embodiment will be described in detail.
- Figs. 4B and 4C illustrate the same operations in T1 to T3 as those in Fig. 4A.
- the operations illustrated in Figs. 4B and 4C, from irradiation of light pulses to obtaining of reconstructed image data, can be performed under the same obtaining conditions (the same measurement condition) as the operations illustrated in Fig. 4A. Therefore, identical reconstructed image data can be obtained under identical measurement conditions irrespective of the display frame rate of the display unit.
- in Fig. 4B, T4 indicates an example with a display frame rate of 72 Hz, where the display frame rate period: tw3 is equal to about 13.9 msec. This means that the imaging frame rate period of the second cyclic period is longer than the display frame rate period of the third cyclic period.
- in Fig. 4C, T4 indicates an example with a display frame rate of 50 Hz, where the display frame rate period: tw3 is equal to 20 msec. This means that the imaging frame rate period of the second cyclic period is shorter than the display frame rate period of the third cyclic period.
- the frame rate converting unit 159 converts the reconstructed image data obtained under the same measurement conditions from the imaging frame rate (60 Hz) to the display frame rate (72 Hz or 50 Hz).
- while the above description assumes an imaging frame rate period (the second cyclic period) of about 20 msec or shorter, the present invention is not limited to this condition.
- because the first cyclic period must be longer than a time period that depends on the distance from the receiving unit to an observation target (that is, the distance over which photoacoustic waves are received), the first cyclic period is realistically in a range from 0.1 msec to several msec.
- accordingly, the imaging frame rate may be set to 240 Hz or lower if photoacoustic signals based on light pulses irradiated 10 or more times are to be obtained and averaged. More simply, the frame rate converting unit may perform image combining, frame thinning, and frame repeating between a plurality of frames for the frame rate conversion. In a case where the probe moves quickly and the converted image sequence would otherwise look unnatural, the frame rate converting unit may perform inter-frame interpolation by using motion vectors, for example, to perform frame rate conversion including generation of interpolation frames.
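- a minimal sketch of such frame rate conversion by frame repeating and thinning is shown below (an illustration under assumed names, not the patent's implementation); motion-vector interpolation would replace the nearest-frame pick with a synthesized frame:

```python
import numpy as np

def convert_frame_rate(frames, rate_in_hz, rate_out_hz):
    """Nearest-frame rate conversion (frame repeating / thinning).

    frames: list of reconstructed images produced at the imaging
    frame rate (second cyclic period). Returns the sequence resampled
    to the display frame rate (third cyclic period).
    """
    duration_s = len(frames) / rate_in_hz
    n_out = int(round(duration_s * rate_out_hz))
    out = []
    for k in range(n_out):
        t = k / rate_out_hz                              # display timestamp
        src = min(int(t * rate_in_hz), len(frames) - 1)  # nearest source frame
        out.append(frames[src])                          # repeat or skip frames
    return out

# 60 Hz imaging -> 72 Hz display: every fifth frame is shown twice.
frames = [np.full((2, 2), i) for i in range(60)]
print(len(convert_frame_rate(frames, 60, 72)))   # -> 72
```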
- reconstructed image data that is never displayed may be generated when the display frame rate period: tw3 is longer than the imaging frame rate period: tw2; if the reconstructed image data is only to be displayed, such data is wasted. In this case, in order to eliminate the waste, reconstructed image data may be generated at times satisfying the condition that the imaging frame rate period: tw2 is equal to or longer than the display frame rate period: tw3.
- processes for reconstructing images for a plurality of frames may be performed in parallel so that the effective imaging frame rate period can be shorter than the display frame rate period.
- for example, the calculating unit may start the process for reconstructing the next image R2 before the previous reconstruction is completed so that reconstructed image data for one frame is obtained within each display frame rate period.
- the obtained reconstructed image data may be displayed and, at the same time, be sequentially stored in the storage unit 152.
- the reconstructed image data sequentially stored in the storage unit 152 may be displayed in another time period.
- the reconstructed image data are read out from the storage unit 152 in the imaging frame rate period: tw2, undergo the frame rate conversion in the frame rate converting unit 159, and, in the display frame rate period: tw3, are output to the display unit 160 as display data.
- the reconstructed image data stored in the storage unit 152 can be displayed.
- alternatively, the reconstructed image data sequentially stored in the storage unit 152 may be read out in the display frame rate period: tw3 and output to the display unit 160 as display data.
- This processing is also called a frame rate conversion herein according to an aspect of the present invention.
- in this case, the time period for displaying the reconstructed image data may increase or decrease (slow-motion or high-speed playback), but no reconstructed image data is left undisplayed.
- in this case, the condition on the relationship between the imaging frame rate period: tw2 and the display frame rate period: tw3 is not necessarily required.
- the operations illustrated in Figs. 4A to 4C may be changed based on a user's instruction or a result of a predetermined judgment performed by the computer 150.
- in other words, a user can change the operating mode in accordance with the frequency of image refreshing.
- in the conventional technology, by contrast, the timing for obtaining reconstructed image data depends on the refresh rate of the display.
- according to this embodiment, the imaging frame rate and the display frame rate may be independent from each other, and the frame rate converting unit 159 may convert the imaging frame rate of reconstructed image data to the display frame rate. Therefore, even when a display apparatus with a different display frame rate is used, reconstructed image data can be obtained without changing the measurement condition.
- reconstructed image data can be obtained without changing the measurement condition, which can facilitate a comparison between a plurality of reconstructed image data pieces.
- the photoacoustic apparatus may also have a simple apparatus configuration because it can operate at predefined timings, except for the display frame rate indicated by T4, independently of the refresh rate of the display apparatus.
- the display apparatus to be used can be changed easily without any change of the measurement condition.
- with a conventional configuration in which image generation is tied to the refresh rate, by contrast, the measurement condition cannot be easily changed.
- according to a second embodiment, even in a case where a reconstructed image is to be displayed on a display apparatus having a fixed display frame rate, the measurement condition can be changed in accordance with the subject or a region of interest, either in response to a user's designation or automatically.
- Fig. 5A is a timing chart for illustrating examples of operations according to this embodiment.
- the timing chart in Fig. 5A is different from the timing chart illustrated in Fig. 4A in times indicated in T1 to T3.
- the display frame rate T4 is the same as that illustrated in Fig. 4A.
- the operations in the timing chart illustrated in Fig. 5A are effective for improving the S/N ratio of a reconstructed image.
- the operations are also effective in a case where the region of interest is at a deeper position within a body.
- the light source unit 200 emits light at a first cyclic period: tw1, and a photoacoustic signal based on the emitted light is obtained at the first cyclic periods: tw1.
- the first cyclic period: tw1 is the same as the time period indicated by T1 in Fig. 4A because, as described above, its length relates to the Maximum Permissible Exposure (MPE) for skin.
- the length of the first cyclic period: tw1 may be changed within the limit imposed by the MPE.
- the number of times of averaging may be increased from 8 according to the first embodiment to 10.
- the signal collecting unit obtains a photoacoustic signal ten times at the first cyclic period: tw1 ((1) to (10)), and the calculating unit averages the obtained photoacoustic signals to obtain an averaged photoacoustic signal A1 in each imaging frame rate period: tw2. Then, in each imaging frame rate period: tw2, the calculating unit calculates reconstructed image data. As a result, though the imaging frame rate period: tw2 is longer than that in Fig. 4A, an averaged photoacoustic signal with an improved S/N ratio is obtained.
- the frame rate converting unit 159 converts the output period for reconstructed image data pieces R1, R2, ... to the display frame rate period: tw3 and outputs the image data pieces as display image data.
- the display unit 160 displays the display image data input in the display frame rate period: tw3.
- in Fig. 5B, the signal collecting unit obtains a photoacoustic signal six times at the first cyclic period: tw1 ((1) to (6)), and the calculating unit averages the obtained photoacoustic signals to obtain an averaged photoacoustic signal A1 in each imaging frame rate period: tw2. Then, in each imaging frame rate period: tw2, the calculating unit calculates reconstructed image data.
- the frame rate converting unit 159 converts the output period for reconstructed image data pieces R1, R2, ... to the display frame rate period: tw3 and outputs the image data pieces as display image data.
- the display unit 160 displays the display image data input in the display frame rate period: tw3.
- the operations illustrated in Fig. 5B are more effective for reduction of motion blurring than those in Fig. 5A.
- a user can select an operating mode in accordance with the priority levels of the reduction of motion blurring and the S/N ratio of the resulting photoacoustic image.
- the computer 150 may be configured to automatically determine the operating mode.
- the control unit may detect a motion of the probe by using an accelerometer provided within the probe so that the operating mode can be changed in accordance with the detection result, for example.
- when the detected motion of the probe is fast, the measurement may be executed in the operating mode illustrated in the timing chart in Fig. 5B (with a lower number of times of averaging photoacoustic signals) for reduction of motion blurring.
- when the detected motion of the probe is slow, the measurement may be executed in the operating mode illustrated in the timing chart in Fig. 5A (with a higher number of times of averaging photoacoustic signals) for an improved S/N ratio.
- in this manner, the operating mode may be determined automatically in accordance with a result of detection of the motion of the probe.
- alternatively, the depth of the region of interest may be used.
- when the region of interest is in a deeper area, priority is given to the S/N ratio of the resulting photoacoustic image, and the measurement is executed in the operating mode illustrated in the timing chart in Fig. 5A (with a higher number of times of averaging photoacoustic signals).
- when the region of interest is in a shallower area, priority is given to reduction of motion blurring, and the measurement is executed in the operating mode indicated by the timing chart in Fig. 5B (with a lower number of times of averaging photoacoustic signals).
- in this manner, the operating mode may be determined automatically based on the depth of the region of interest.
- the probe may have a pressure sensor, and the pressing force of the probe may be detected to determine whether priority is given to the S/N ratio of a photoacoustic image or the reduction of motion blurring.
- when the pressing force of the probe is higher than a predetermined threshold value, it may be determined that the region of interest is in a deeper area and that the probe moves slowly. Therefore, priority is given to the S/N ratio of the photoacoustic image, and the number of photoacoustic signals to be averaged can be increased.
- when the pressing force of the probe is lower than the predetermined threshold value, it may be determined that the region of interest is in a shallower area and that the probe moves fast. Therefore, priority is given to reduction of motion blurring, and the number of photoacoustic signals to be averaged is reduced. In this manner, the priority may be determined automatically based on a result of detection of the pressing force of the probe.
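- a hypothetical sketch of this automatic selection is shown below; the threshold and averaging counts are illustrative, not from the patent:

```python
# Hypothetical automatic mode selection based on probe pressing force.
PRESSURE_THRESHOLD_N = 5.0   # assumed threshold, not from the patent

def select_averaging_count(pressing_force_n):
    """Pick the number of signals to average from the probe pressing force."""
    if pressing_force_n > PRESSURE_THRESHOLD_N:
        # Firm contact: region of interest assumed deep, probe assumed
        # slow-moving -> prioritize S/N (Fig. 5A-style, more averaging).
        return 10
    # Light contact: shallow region, probe may move fast -> prioritize
    # motion-blur reduction (Fig. 5B-style, fewer averages).
    return 6

print(select_averaging_count(7.0))  # -> 10
```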
- as described above, because the frame rate converting unit is provided, the measurement condition for obtaining reconstructed image data can be changed easily, either in response to a user's designation or automatically, without changing the display frame rate of the display apparatus.
- a third embodiment of the present invention is different from the first embodiment in the selection of the photoacoustic signals to be averaged for generating a reconstructed image.
- Fig. 6 illustrates time periods T1 to T4 which are identical to those in Fig. 4A.
- the operations illustrated in Fig. 4A include averaging of photoacoustic signals obtained in a time period equal to the imaging frame rate period: tw2 while, according to this embodiment, photoacoustic signals ((1) to (10)) obtained in a period longer than the imaging frame rate period are to be averaged.
- averaging of photoacoustic signals over a period longer than the imaging frame rate period: tw2 can provide an improved S/N ratio.
- using photoacoustic signals obtained over a period longer than the imaging frame rate period: tw2 may delay generation of the reconstructed image data R1 compared with the operations illustrated in Fig. 4A. In other words, the standby time from irradiation of light to the subject to display of the resulting image is longer than in the operations illustrated in Fig. 4A.
- alternatively, only some of the photoacoustic signals obtained in the imaging frame rate period: tw2 may be averaged to generate reconstructed image data.
- for example, the photoacoustic signals (7) to (10) of the photoacoustic signals illustrated in Fig. 6 may be averaged, without using the photoacoustic signals (1) to (6).
- thus, photoacoustic signals can be averaged over a time period shorter than the imaging frame rate period so that motion blurring can be reduced.
- This processing facilitates changing the measurement condition for obtaining reconstructed image data, as illustrated in the second embodiment, and can be implemented with a simplified apparatus configuration; a windowed-averaging sketch follows.
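- as a rough illustration of this selectable averaging window, the sketch below (an assumption for illustration; the array shapes and helper name are not from the disclosure) averages either all buffered per-shot signals, as in Fig. 6, or only the most recent few.

```python
import numpy as np

def averaged_signal(shots: np.ndarray, window: int) -> np.ndarray:
    """Average the most recent `window` per-shot photoacoustic signals.

    shots:  array of shape (n_shots, n_samples), one row per light emission.
    window: number of shots to average; a window longer than one imaging
            frame improves the S/N ratio (signals (1) to (10) in Fig. 6),
            while a shorter window (e.g. only (7) to (10)) reduces blurring.
    """
    return shots[-window:].mean(axis=0)

# Hypothetical example: 10 shots of 4 samples each.
rng = np.random.default_rng(0)
shots = rng.normal(size=(10, 4))
a_long = averaged_signal(shots, window=10)  # S/N priority
a_short = averaged_signal(shots, window=4)  # motion-blur priority
```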
- according to a fourth embodiment, the times for obtaining photoacoustic signals are changed.
- Fig. 7 illustrates a timing chart for describing operations according to this embodiment.
- Fig. 7 illustrates a case where the imaging frame rate and the display frame rate have an equal frequency, like Fig. 4A according to the first embodiment. This embodiment is also applicable to a case where the imaging frame rate and the display frame rate have different frequencies from each other, as illustrated in Figs. 4B and 4C.
- the fourth embodiment illustrated in Fig. 7 is significantly different from the first embodiment in the sampling times for the photoacoustic signals to be averaged, that is, in the light emission times of the light source unit 200.
- the first cyclic period is fixed.
- the times for obtaining photoacoustic signals are concentrated in a part of the imaging frame rate period, so that a pause period, in which no light is irradiated to the subject, occurs within the imaging frame rate period.
- the first cyclic period and so on may be determined under conditions satisfying the MPE, as described above.
- the light source unit 200 emits light at a first cyclic period: tw1, and the photoacoustic apparatus obtains photoacoustic signals generated upon the irradiation of light at the first cyclic period: tw1.
- the calculating unit averages photoacoustic signals ((1) to (8)) obtained eight times to obtain an averaged photoacoustic signal A1.
- the calculating unit further sequentially calculates reconstructed image data pieces R1, R2, ... in a period defined by the second cyclic period based on the averaged photoacoustic signal A1.
- the frame rate converting unit 159 converts the reconstructed image data pieces R1, R2, ... and outputs them as display image data in the display frame rate period: tw3.
- the display unit 160 displays the display image data input in the display frame rate period: tw3.
- the operation according to this embodiment obtains and averages photoacoustic signals intensively in a time period within the imaging frame rate period, which can reduce motion blurring in the resulting photoacoustic image more than the operation according to the first embodiment.
- the unnecessary photoacoustic signals that occur in the third embodiment (signals obtained but not used for averaging) do not occur in this embodiment.
- the time period for irradiating light and obtaining photoacoustic signals is only a part of the imaging frame rate period, and a time period in which no photoacoustic signal is obtained occurs within the imaging frame rate period. Therefore, the photoacoustic signals can be obtained, averaged, and reconstructed within a single imaging frame rate period. As a result, the delay from obtaining a photoacoustic signal to output of the reconstructed image data can be reduced.
- the fourth embodiment can reduce the time width over which photoacoustic data are averaged, which therefore reduces motion blurring; an emission-schedule sketch follows.
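- a minimal sketch of the concentrated emission schedule follows; the parameter names and numbers are hypothetical, and a real apparatus would drive the light emission control signal in hardware rather than in Python.

```python
def emission_times(frame_period: float, tw1: float, n_shots: int, n_frames: int):
    """Light-emission times (seconds) with the shots concentrated at the
    start of each imaging frame; the remainder of each frame is a pause
    period in which no light is irradiated."""
    assert n_shots * tw1 <= frame_period, "shots must fit inside one frame"
    times = []
    for frame in range(n_frames):
        start = frame * frame_period
        times.extend(start + k * tw1 for k in range(n_shots))
    return times

# Hypothetical numbers: 16.7 ms frame, 0.1 ms emission period, 8 shots/frame.
print(emission_times(frame_period=16.7e-3, tw1=0.1e-3, n_shots=8, n_frames=2))
```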
- according to a fifth embodiment, a measurement condition is determined in accordance with the subject. More specifically, a photoacoustic apparatus will be described which synchronizes a frame rate period with periodic biological activity such as heartbeats or electrocardiographic waveforms.
- Fig. 8 is a block diagram illustrating a computer and peripheral components according to the fifth embodiment. Like numbers refer to like parts in Fig. 3 and Fig. 8.
- an electrocardiograph 173 is configured to detect an electric signal propagating through the heart of a patient, who is the subject, and to output it to the computer 150.
- the computer 150 determines an imaging frame rate so as to be in synchronization with the output (electrocardiographic waveform) of the electrocardiograph, obtains a photoacoustic signal, and calculates reconstructed image data. This processing will be described in detail with reference to Fig. 9.
- in this embodiment, the imaging frame rate period: tw2 and the display frame rate period: tw3 are different from each other.
- the computer 150 synchronizes the imaging frame rate period: tw2 with an RR period: tw4 of an electrocardiographic waveform T5 detected by the electrocardiograph 173. More specifically, the imaging frame rate period: tw2 starts with reference to an R wave having a maximum amplitude of the electrocardiographic waveform T5.
- the light source unit 200 emits light at the first cyclic period: tw1, and the photoacoustic apparatus obtains the photoacoustic signals generated upon the irradiation of light at the first cyclic period: tw1.
- the photoacoustic signal is obtained in a sampling enabled period SW after a delay time DLY with reference to the R wave.
- the sampling enabled period SW is equal to a value acquired by multiplying the number of times of light emission (eight times in this case) by the first cyclic period: tw1.
- the first cyclic period: tw1 may be determined based on a limitation due to the MPE as described above.
- the signal collecting unit obtains photoacoustic signals eight times at the first cyclic period: tw1 ((1) to (8)), and the calculating unit averages the obtained photoacoustic signals and calculates the averaged photoacoustic signal A1.
- the calculating unit further performs processing for reconstruction.
- the reconstructed image data pieces R1, R2, ... are sequentially calculated in the imaging frame rate period: tw2.
- the frame rate converting unit 159 outputs the calculated reconstructed image data R1, R2, ... in the imaging frame rate period: tw2 as display image data in the display frame rate period: tw3.
- the display unit 160 then displays the display image data input in the display frame rate period: tw3.
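- the relationship SW = (number of light emissions) × tw1, with the delay DLY referenced to the R wave, can be written compactly; the sketch below is illustrative, and the function name and numeric values are assumptions.

```python
def sampling_window(r_wave_time: float, dly: float, tw1: float, n_shots: int):
    """Return (start, end) of the sampling enabled period SW and the
    individual emission times, referenced to a detected R wave."""
    start = r_wave_time + dly
    sw = n_shots * tw1                 # SW = number of emissions x tw1
    shots = [start + k * tw1 for k in range(n_shots)]
    return (start, start + sw), shots

# Hypothetical values: DLY = 50 ms after the R wave, tw1 = 0.5 ms, 8 shots.
window, shots = sampling_window(r_wave_time=0.0, dly=0.05, tw1=0.5e-3, n_shots=8)
print(window, shots)
```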
- for example, the first cyclic period may be about 0.1 msec to several msec, the imaging frame rate period may be about 0.4 to 2 sec, and the display frame rate may be about 50 to 240 Hz.
- the delay time DLY and the sampling enabled time SW may be set arbitrarily by a user through the input unit 170, for example.
- a blood vessel moves periodically in response to heartbeats.
- a photoacoustic signal is received in synchronization with an electrocardiographic waveform, that is, a heartbeat, to obtain a reconstructed image. Therefore, motion blurring due to contraction and motion of blood vessels in response to heartbeats can be reduced.
- the ventricle contracts with a QRS wave, and blood is ejected in the time period between an S wave and a T wave. In other words, this corresponds to a time period in which the artery has a high blood pressure.
- the delay time DLY and the sampling enabled time SW may be set to obtain photoacoustic signals in a time period from an S wave to a T wave, as illustrated in Fig. 9.
- the photoacoustic signals obtained in the set time period can be reconstructed to obtain a clear reconstructed image without motion blurring.
- the time (phase within tw4) at which the blood pressure changes relative to the electrocardiographic waveform depends on the distance of the region for obtaining a photoacoustic image from the heart.
- the magnitude of the blood pressure change also differs (as the distance from the heart increases, the change of the blood pressure decreases). Accordingly, the apparatus may be configured such that a user can adjust the delay time DLY and the sampling enabled time SW to good values while checking the reconstructed image.
- reconstructed image data pieces such as blood vessel images can be obtained at different times such as atrial expansion times (or under different measurement conditions).
- the fifth embodiment of the present invention may use a biological signal synchronized with the heartbeat instead of the electrocardiographic waveform. More specifically, an arterial blood pressure waveform or a sound wave (heart sound) generated by the heart may be used.
- the fifth embodiment is also applicable to periodic biological activity other than heartbeats.
- reconstructed image data are generated at an imaging frame rate period synchronized with, for example, the electrocardiographic waveform, which differs from the display frame rate period. Then, the frame rate converting unit 159 converts the period of the image data to the display frame rate, and the display unit 160 displays the resulting image.
- a good reconstructed image can thus be obtained which has less motion blurring of blood vessels due to periodic physiological phenomena of the living body, such as those reflected in an electrocardiographic waveform. In particular, motion blurring of an artery can be reduced so that a clear photoacoustic image of the artery can be obtained.
- according to a sixth embodiment, a user interface mainly using a display screen for displaying a measurement condition will be described.
- a display screen displaying a measurement condition, which is a user interface corresponding to an example of an input unit configured to receive input from a user, may be displayed along with a reconstructed image on the liquid crystal display 161 corresponding to the display unit 160.
- the liquid crystal display 161 may display a plurality of windows so that display screens displaying a reconstructed image and a measurement condition may be displayed on different windows.
- alternatively, a separate liquid crystal display apparatus may be added and specifically assigned to displaying the user interface.
- a user can set measurement parameters including the first cyclic period, the second cyclic period, and the third cyclic period through the user interface. A user may input all of these parameters or only some of them.
- Fig. 10A illustrates a display screen example according to this embodiment.
- an image U1 is a schematic image to be displayed on a display apparatus.
- the image U1 displays a waveform U101 schematically illustrating timing of a display frame rate, and a number (msec) U102 indicates a display frame rate period DT.
- the image U1 further displays a waveform U103 schematically illustrating an imaging frame rate and a number (msec) U104 indicating an imaging frame rate period FT.
- the image U1 further displays a waveform U105 indicating a delay time DLY and a sampling enabled time SW.
- the image U1 further displays a number (msec) U106 indicating the delay time DLY from the start of the imaging frame rate U103 to the start of the sampling enabled time SW and a number (msec) U107 indicating the sampling enabled time SW.
- a user can change a numerical value by using a mouse or a keyboard to change a measurement condition while checking the displayed image U1.
- a specially designed knob for changing a parameter can be provided for the adjustment.
- a software module may be designed such that a waveform or a numerical value on the image U1 changes interactively in response to a user's instruction.
- Fig. 10B illustrates another display screen example.
- an image U2 is a schematic image to be displayed on a display apparatus.
- the image U2 displays a waveform U201 schematically indicating timing of the display frame rate.
- the image U2 displays a number (Hz) U202 indicating a display frame rate DR.
- the image U2 displays a waveform U203 schematically indicating an imaging frame rate and a number (Hz) U204 indicating an imaging frame rate FR.
- the image U2 displays a waveform U205 indicating a delay time DLY and a sampling enabled time SW.
- the image U2 displays a number (°, where one period equals 360°) U206 indicating, as a phase PH, the ratio of the delay time DLY (from the start of the imaging frame to the start of the sampling enabled time) to the period of the imaging frame rate U203, and a number (%) U207 indicating the ratio of the sampling enabled time SW to the period FT of the imaging frame rate U203.
- thus, the time for starting to obtain a photoacoustic signal and the sampling enabled time can be intuitively identified even when a user changes the imaging frame rate.
- the display screen may be any combination of the aforementioned images or may be any other representations.
- a user can change a measurement condition interactively while checking the displayed image U2; a conversion sketch between the two representations follows.
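- the U1 representation (milliseconds) and the U2 representation (phase and percentage) carry the same information; the conversion is shown in the sketch below with hypothetical numbers.

```python
def to_phase_percent(dly_ms: float, sw_ms: float, ft_ms: float):
    """Convert the U1-style values (msec) to the U2-style values: the delay
    expressed as a phase PH (one frame period = 360 degrees) and the
    sampling enabled time expressed as a percentage of the frame period."""
    ph_deg = 360.0 * dly_ms / ft_ms
    sw_pct = 100.0 * sw_ms / ft_ms
    return ph_deg, sw_pct

# Hypothetical: FT = 1000 ms, DLY = 250 ms, SW = 100 ms -> (90.0, 10.0)
print(to_phase_percent(dly_ms=250, sw_ms=100, ft_ms=1000))
```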
- the display frame rate may generally be fixed in accordance with the applied display unit 160. Therefore, the waveforms (U101, U201) schematically indicating the timing of the display frame rate and the numbers (U102, U202) indicating the display frame rate in the images (U1, U2) illustrated in Figs. 10A and 10B may be omitted. Thus, the number of displayed items can be reduced so that a user can more easily set and grasp a measurement condition.
- Fig. 10C illustrates another display screen example.
- Fig. 10C illustrates an embodiment which determines a measurement condition (imaging frame rate) according to the fifth embodiment in accordance with a subject. More specifically, Fig. 10C illustrates a display screen displaying a measurement condition for the photoacoustic apparatus which synchronizes a frame rate period with periodical biological activities such as heartbeats or electrocardiographic waveforms.
- an image U3 is a schematic image to be displayed on a display apparatus.
- the image U3 illustrates a waveform U301 schematically representing an imaging frame rate and a number (sec) U302 indicating an imaging frame rate period FT.
- the image U3 further displays a waveform U303 indicating a delay time DLY and a sampling enabled time SW.
- the image U3 displays a number (°, one period is equal to 360°) U304 indicating, as a phase PH, the ratio of the time from the start of the imaging frame to the start of the sampling enabled time SW to the period of the imaging frame rate U301, and a number (%) U305 indicating the ratio of the sampling enabled time SW to the period FT of the imaging frame rate U301.
- the image U3 further displays a waveform U306 indicating an electrocardiographic waveform.
- the image U3 further displays a line U307 indicating a set trigger level with respect to the electrocardiographic waveform U306.
- the trigger level can be set by a user by using a mouse, a keyboard, or a specially designed knob. Alternatively, the computer 150 may set the trigger level automatically, for example to 90% of the peak value of the amplitude. In the example illustrated in Fig. 10C, a trigger may be generated when the rising electrocardiographic waveform crosses the set trigger level.
- when the trigger is generated, the computer 150 generates an imaging frame rate in which one period is the time from one trigger to the next trigger. In other words, the imaging frame rate is determined by the triggers rather than set by the user.
- a user can adjust the sampling start phase and the sampling enabled time SW, which are measurement conditions, to desired values while checking the electrocardiographic waveform, so that a good reconstructed image can be obtained; a trigger-detection sketch follows.
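- a minimal sketch of the rising-edge trigger detection follows; the synthetic ECG waveform, the sampling rate, and the 90% automatic level are illustrative assumptions rather than the disclosed implementation.

```python
import numpy as np

def detect_triggers(ecg: np.ndarray, fs: float, level: float) -> np.ndarray:
    """Times (seconds) at which the ECG rises through the trigger level."""
    above = ecg >= level
    rising = np.flatnonzero(~above[:-1] & above[1:]) + 1
    return rising / fs

def auto_level(ecg: np.ndarray) -> float:
    """Automatic trigger level at 90% of the peak amplitude (see above)."""
    return 0.9 * ecg.max()

# Hypothetical ECG sampled at 500 Hz; each trigger-to-trigger interval
# becomes one imaging frame rate period.
fs = 500.0
t = np.arange(0.0, 4.0, 1.0 / fs)
ecg = np.sin(2 * np.pi * 1.0 * t) ** 15   # crude R-wave-like spikes
triggers = detect_triggers(ecg, fs, auto_level(ecg))
frame_periods = np.diff(triggers)          # ~1 s per imaging frame here
```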
- the sixth embodiment facilitates grasping and setting a measurement condition. Therefore, advantageously, a user can easily set an optimum measurement condition in accordance with the state of the living body that is the subject.
- the light source unit 200 may be configured to irradiate light having a plurality of wavelengths, as described above.
- an oxygen saturation can be calculated as function information. For example, light beams of two wavelengths may be irradiated to the subject alternately, one wavelength per imaging frame, and the corresponding photoacoustic signals are obtained. Thus, reconstructed image data pieces for the two wavelengths can be calculated over two imaging frame rate periods, and an oxygen saturation can be calculated from them.
- the oxygen saturation calculation may apply the scheme disclosed in PTL 2 or other schemes.
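- as one possible illustration (a generic two-wavelength linear unmixing, not necessarily the scheme of PTL 2), the oxygen saturation can be estimated by solving for the oxy- and deoxyhemoglobin contributions at the two wavelengths; the absorption coefficients below are placeholder values, not literature data.

```python
import numpy as np

# Rows: wavelengths (e.g. 756 nm and 797 nm); columns: [HbO2, Hb].
# These coefficients are placeholders chosen only to make the demo solvable.
EPS = np.array([
    [0.60, 0.90],   # 756 nm: deoxyhemoglobin absorbs more strongly here
    [0.80, 0.80],   # 797 nm: near-isosbestic, absorption roughly equal
])

def oxygen_saturation(mu_a_756: float, mu_a_797: float) -> float:
    """Estimate sO2 from absorption factors measured at two wavelengths."""
    hbo2, hb = np.linalg.solve(EPS, [mu_a_756, mu_a_797])
    return hbo2 / (hbo2 + hb)

print(oxygen_saturation(mu_a_756=0.75, mu_a_797=0.80))  # 0.5 for this demo
```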
- the plurality of aforementioned embodiments may be implemented by one photoacoustic apparatus so that the photoacoustic apparatus can be switched to provide a function corresponding to one of the embodiments.
- the photoacoustic apparatus may additionally have a function for performing the measurement based on a reflected wave of an ultrasonic wave transmitted from a transducer.
- an ultrasonic wave echo image and a photoacoustic image may be displayed side by side or may be superimposed by registration for display.
- Embodiment(s) of the present invention can also be realized by a computer of a system or apparatus that reads out and executes computer executable instructions (e.g., one or more programs) recorded on a storage medium (which may also be referred to more fully as a 'non-transitory computer-readable storage medium') to perform the functions of one or more of the above-described embodiment(s) and/or that includes one or more circuits (e.g., application specific integrated circuit (ASIC)) for performing the functions of one or more of the above-described embodiment(s), and by a method performed by the computer of the system or apparatus by, for example, reading out and executing the computer executable instructions from the storage medium to perform the functions of one or more of the above-described embodiment(s) and/or controlling the one or more circuits to perform the functions of one or more of the above-described embodiment(s).
- the computer may comprise one or more processors (e.g., central processing unit (CPU), micro processing unit (MPU)) and may include a network of separate computers or separate processors to read out and execute the computer executable instructions.
- the computer executable instructions may be provided to the computer, for example, from a network or the storage medium.
- the storage medium may include, for example, one or more of a hard disk, a random-access memory (RAM), a read only memory (ROM), a storage of distributed computing systems, an optical disk (such as a compact disc (CD), digital versatile disc (DVD), or Blu-ray Disc (BD) TM ), a flash memory device, a memory card, and the like.
Abstract
A conventional technology which generates a reconstructed image at periods equal to a refresh rate of a display device cannot optimize a measurement condition for obtaining a photoacoustic image. A photoacoustic apparatus according to the present invention irradiates light to a subject at a first cyclic period, averages reception signals obtained by the irradiation, generates reconstructed image data at a frame rate of a second cyclic period based on the averaged signal, and converts the display rate of an image based on the reconstructed image data from the frame rate of the second cyclic period to a frame rate of a third cyclic period.
Description
The present invention relates to a photoacoustic apparatus and a method for processing subject information obtained by applying a photoacoustic effect.
A photoacoustic apparatus has been known which images an internal region of a subject by applying a photoacoustic effect.
However, in the apparatus disclosed in PTL 1, the period for generating such an image is limited by the refresh rate. Therefore, it may be difficult to change a condition for obtaining a photoacoustic image, such as a light emitting period of a light source and selection of signals to be used for generating an image. When photoacoustic signals cannot be obtained under proper conditions, there is a possibility that the generated image may have blurring caused by an insufficient S/N ratio or motion of a subject, for example.
Accordingly, the present disclosure improves the degree of freedom in obtaining a photoacoustic image, thereby improving the image quality of the obtained image.
According to an aspect of the present invention, a photoacoustic apparatus includes a light irradiating unit configured to irradiate light to a subject, an acoustic wave receiving unit configured to receive acoustic waves generated in the subject upon irradiation of the light to the subject and to output a reception signal, and an image generating unit configured to generate an internal image of the subject based on the reception signal. The light irradiating unit is further configured to irradiate the light to the subject repetitively at a first cyclic period, and the image generating unit is further configured to combine a plurality of the reception signals obtained by irradiating the light a plurality of times and to generate image data at a frame rate of a second cyclic period based on the combined signal. The photoacoustic apparatus further includes a frame rate converting unit configured to convert a display rate of an image based on the image data generated at the frame rate of the second cyclic period to a frame rate of a third cyclic period.
The present disclosure thus improves the degree of freedom in obtaining a photoacoustic image, thereby improving the image quality of the obtained image.
Further features of the present invention will become apparent from the following description of exemplary embodiments with reference to the attached drawings.
Preferred embodiments of the present invention will be described below with reference to the drawings. However, the dimensions, materials, shapes, and relative positions of the components described below should be changed appropriately in accordance with the configuration and various conditions of the apparatus to which the present invention is applied. Therefore, the scope of the invention is not intended to be limited to the following description.
The present invention relates to a technology which detects acoustic waves propagated from a subject and generates and obtains property information on an internal region of the subject. Therefore, the present invention may be considered as a subject information obtaining apparatus or a control method therefor, a subject information obtaining method, or a signal processing method. Alternatively, the present invention may further be considered as a display method for generating and displaying an image illustrating property information regarding an internal region of a subject. Alternatively, the present invention may be considered as a program causing an information processing apparatus including hardware resources such as a CPU and a memory to execute the method as described above, or a computer-readable, non-transitory storage medium storing the program.
A subject information obtaining apparatus according to the present invention may include a photoacoustic imaging apparatus which receives acoustic waves generated within a subject irradiated with light (electromagnetic waves) and uses a photoacoustic effect to obtain property information on the subject as image data. In this case, the property information is information which is generated by using a signal originating from the received photoacoustic waves and indicates property values corresponding to a plurality of positions within the subject.
Photoacoustic image data according to the present invention is a concept including all kinds of image data originating from photoacoustic waves generated in the subject upon irradiation of light. For example, photoacoustic image data may be image data indicating a spatial distribution of at least one piece of subject information among a sound pressure (initial sound pressure) of the generated photoacoustic waves, an energy absorption density, an absorption factor, and a concentration (such as an oxygen saturation) of a substance included in the subject. It should be noted that photoacoustic image data representing spectrum information such as a concentration of a substance included in a subject may be obtained based on photoacoustic waves generated in the subject upon irradiation of light beams having a plurality of mutually different wavelengths. The photoacoustic image data representing such spectrum information may be an oxygen saturation, a value acquired by weighting an oxygen saturation with an absorption factor, a total hemoglobin concentration, an oxyhemoglobin concentration, or a deoxyhemoglobin concentration. The photoacoustic image data representing such spectrum information may also be a glucose concentration, a collagen concentration, a melanin concentration, or a volume fraction of fat or water.
Based on property information with respect to positions within a subject, a two-dimensional or three-dimensional property information distribution may be obtained. The distribution data may be generated as image data. The property information may be obtained as distribution information with respect to positions within a subject instead of numerical value data. In other words, the property information may be distribution information such as an initial sound pressure distribution, an energy absorption density distribution, an absorption factor distribution and an oxygen saturation distribution.
The term "acoustic wave" herein typically refers to an ultrasonic wave and includes an elastic wave called a sonic wave or an acoustic wave. An electric signal converted from an acoustic wave by a transducer may also be called an acoustic signal. However, it is not intended that the term, "ultrasonic wave" or "acoustic wave", herein limit the wavelength of such an elastic wave. An acoustic wave generated by a photoacoustic effect is called a photoacoustic wave or a photoacoustically induced ultrasonic wave. An electric signal originating from a photoacoustic wave may also be called a photoacoustic signal. The distribution data may also be called photoacoustic image data or reconstructed image data.
The following embodiments relate to a photoacoustic apparatus, as a subject information obtaining apparatus, which irradiates pulsed light to a subject, receives photoacoustic waves from the subject and generates a blood vessel image (structure image) within the subject. Although the following embodiments relate to a photoacoustic apparatus having a hand-held probe, the present invention is also applicable to a photoacoustic apparatus including a probe in a mechanical stage for mechanical scan.
With reference to a block diagram in Fig. 1, a configuration of a photoacoustic apparatus 1 according to a first embodiment will be described below. The photoacoustic apparatus 1 has a probe 180, a signal collecting unit 140, a computer 150, a display unit 160, and an input unit 170. The probe 180 includes a light source unit 200, an optical system 112, a light irradiating unit 113, and an acoustic wave receiving unit 120. The computer 150 includes a calculating unit 151, a storage unit 152, a control unit 153, and a frame rate converting unit 159.
The light source unit 200 is configured to supply light pulses to the light irradiating unit 113 through the optical system 112 such as an optical fiber (bundle fiber) at a first cyclic period. The light irradiating unit 113 is configured to irradiate supplied light to a subject 100. Thus, photoacoustic waves are generated in the subject 100 at the first cyclic period. The acoustic wave receiving unit 120 is configured to receive photoacoustic waves generated in the subject 100 at the first cyclic period and output an electric signal (hereinafter, also called a reception signal or photoacoustic signal) that is an analog signal. In other words, the acoustic wave receiving unit 120 may receive photoacoustic waves at intervals each defined by the first cyclic period. The signal collecting unit 140 is configured to convert the analog signal output from the acoustic wave receiving unit 120 to a digital signal and output to the computer 150.
The computer 150 is configured to use the calculating unit 151, the storage unit 152, and the control unit 153 to combine, at a second cyclic period (also called an imaging frame rate period), the digital signals that are generated in the subject upon irradiation of light a plurality of times and output from the signal collecting unit 140 at the first cyclic period, and to store the result in the storage unit 152 as an electric signal (photoacoustic signal) originating from photoacoustic waves. Here, the combining may include not only a simple addition but also a weighted addition, an averaging, and a moving average. Any kind of combining other than averaging may be applied, although averaging will mainly be described below. The computer 150 may perform a process such as an image reconstruction on a digital signal stored in the storage unit 152 to generate photoacoustic image data within a time period defined by the second cyclic period (imaging frame rate period). The computer 150 may function as an image generating unit configured to generate an image of an internal region of a subject based on the received signal.
The computer 150 is configured to output the generated photoacoustic image data to the frame rate converting unit 159 at the second cyclic period. The frame rate converting unit 159 is configured to convert photoacoustic image data generated at the second cyclic period to photoacoustic image data at a third cyclic period (hereinafter, also called a display frame rate period) that is suitable for display on the display unit 160. In other words, the computer 150 converts the display rate of an image based on the image data generated at the second cyclic period to a frame rate of the third cyclic period. The computer 150 is configured to generally control the photoacoustic apparatus 1 by using the control unit 153.
The display unit 160 is configured to display a photoacoustic image based on the photoacoustic image data at the third cyclic period (display frame rate period).
The computer 150 may perform image processing for display and processing for combining graphic representations for GUI on the obtained photoacoustic image data. This processing may be performed on the photoacoustic image data generated at the second cyclic period or may be performed on the photoacoustic image data generated at the third cyclic period.
Although the terms "first cyclic period", "second cyclic period (imaging frame rate period)", and "third cyclic period (display frame rate period)" are used for describing embodiments, the term "period" according to the present disclosure is not necessarily required to be a completely uniform repetition time period. In other words, the term "period" is also used to refer to repetition time intervals that are not uniform according to the present disclosure. There may be a pause period in the first cyclic period, as will be described below. A repetition time period in a time period excluding the pause period is also called a period according to the present disclosure.
A user (such as a doctor or a technician) may check a photoacoustic image displayed on the display unit 160. The image displayed on the display unit 160 may be saved in a memory within the computer 150 or a data management system connected to the photoacoustic apparatus over a communication network in response to a save instruction from a user or the computer 150. The input unit 170 is configured to receive an instruction from a user.
Next, configurations of blocks will be described in detail.
Fig. 2A is a schematic diagram of the probe 180 according to this embodiment. The probe 180 includes the light source unit 200, the optical system 112, the light irradiating unit 113, the acoustic wave receiving unit 120, and a housing 181. The housing 181 is configured to enclose the light source unit 200, the optical system 112, the light irradiating unit 113 and the acoustic wave receiving unit 120. A user may grip the housing 181 to use the probe 180 as a hand-held probe. The light irradiating unit 113 is configured to irradiate light pulses propagated from the optical system 112 to a subject. The X, Y, and Z axes illustrated in Figs. 2A and 2B represent coordinate axes in a case where the probe is settled but are not intended to limit the orientation of the probe while being used.
The probe 180 illustrated in Fig. 2A is connected to the signal collecting unit 140 through a cable 182. The cable 182 may include a wire configured to supply power to the light source unit 200, a light emission control signal wire, or a wire (not illustrated) configured to output an analog signal output from the acoustic wave receiving unit 120 to the signal collecting unit 140. The cable 182 may have a connector and may be configured to be capable of separating the probe 180 from the other components of the photoacoustic apparatus. As illustrated in Fig. 2B, light pulses may be irradiated to a subject directly by using a semiconductor laser or a light emitting diode as the light source unit 200, without using the optical system 112. In this case, a light emitting end part of the semiconductor laser or LED (or the leading end of the housing), for example, may correspond to the light irradiating unit 113.
The light source unit 200 is configured to generate light to be irradiated to the subject 100. The light source unit 200 may be a light source capable of generating pulsed light and outputting light beams having a plurality of wavelengths for obtaining a substance concentration such as an oxygen saturation. Because the light source unit 200 must be mounted within the housing of the probe 180, a compact semiconductor light emitting device such as a semiconductor laser or a light emitting diode as illustrated in Fig. 2B may be used. The light beams having a plurality of wavelengths may be output by switching the light emission among a plurality of types of semiconductor lasers or light emitting diodes which generate light beams having different wavelengths.
The light source unit 200 can generate light having a pulse width equal to or higher than 10 ns and equal to or lower than 1 μs, for example. Though the light may have a wavelength equal to or higher than 400 nm and equal to or lower than 1600 nm, the wavelength may be determined in accordance with the light absorption property of the light absorber to be imaged. In order to image a blood vessel at a high resolution, light having a wavelength (equal to or higher than 400 nm and equal to or lower than 800 nm) which is highly absorbed by the blood vessel may be applied. In order to image a deep part of a living body, light having a wavelength (equal to or higher than 700 nm and equal to or lower than 1100 nm) which is less absorbed by a background tissue (water or fat) of the living body may be applied. According to an aspect of the present invention, because a semiconductor light emitting device is used as the light source for the light source unit 200, the light amount may be insufficient. In other words, one irradiation may produce a photoacoustic signal having an S/N ratio lower than a desired ratio. Accordingly, light may be emitted at the first cyclic period, and the resulting photoacoustic signals may be averaged for an improved S/N ratio. Then, based on the averaged photoacoustic signal, a photoacoustic image may be calculated at the second cyclic period (imaging frame rate period).
The light source unit 200 according to this embodiment may emit light having a wavelength of 797 nm, for example. Light having this wavelength can reach a deep part of a subject and is absorbed by oxyhemoglobin and deoxyhemoglobin with substantially equal absorption factors. Therefore, the wavelength is suitable for detection of a blood vessel structure. A light source which produces a second wavelength of 756 nm may also be used so that an oxygen saturation can be acquired by using the difference between the absorption factors of oxyhemoglobin and deoxyhemoglobin.
The light irradiating unit 113 is an emission end configured to irradiate light to a subject. In a case where a bundle fiber is used as the optical system 112, the emission end of the fiber may serve as the light irradiating unit 113. In a case where the subject 100 is a part (such as the breast) of a living body, a diffuser for diffusing light may be used so that pulsed light having an increased beam diameter can be irradiated. In a case where the light source unit 200 illustrated in Fig. 2B is a semiconductor light emitting device, light emission end parts (housing leading edges) of a plurality of semiconductor light emitting devices may be arranged to function as the light irradiating unit 113 so that light can be irradiated to a wide range of a subject.
The acoustic wave receiving unit 120 includes a transducer configured to receive photoacoustic waves generated in the subject upon irradiation of light at the first cyclic period and to output an electric signal and a supporting member configured to support the transducer. The transducer may include components such as a piezoelectric material, a Capacitive Micro-machined Ultrasonic Transducer (CMUT), and a Fabry-Perot interferometer. The piezoelectric material may be a piezoelectric ceramic material such as PZT (lead zirconate titanate) or a polymer piezoelectric film material such as PVDF (polyvinylidene difluoride), for example.
The electric signal obtained by the transducer at the first cyclic period is a time-resolved signal. Therefore, the electric signal has an amplitude representing a value based on a sound pressure (such as a value in proportion to the sound pressure) received by the transducer at each time.
The transducer may be capable of detecting a frequency component (typically from 100 kHz to 10 MHz) of a photoacoustic wave. A plurality of transducers may be arranged on the supporting member to form a plane or a curved surface called a 1D array, a 1.5D array, a 1.75D array, or a 2D array, for example.
The acoustic wave receiving unit 120 may include an amplifier configured to amplify time-series analog signals output from the transducers. The acoustic wave receiving unit 120 may include an A/D converter configured to convert time-series analog signals output from the transducers to time-series digital signals. In other words, the acoustic wave receiving unit 120 may include the signal collecting unit 140.
The transducers may surround the whole circumference of the subject 100 in order to detect acoustic waves from various angles for improved image accuracy. In a case where the transducers cannot surround the subject 100 because the subject 100 is large, the transducers may be arranged on a hemispherical supporting member. The probe 180 including the acoustic wave receiving unit 120 having such a shape is suitable for a mechanically scanning photoacoustic apparatus which moves the probe relative to the subject 100, rather than for a hand-held photoacoustic apparatus. The probe may be moved by a scanning unit such as an XY stage. The arrangement and number of transducers and the shape of the supporting member are not limited to those described above but may be optimized in accordance with the subject 100.
A medium for propagating photoacoustic waves may be provided in a space between the acoustic wave receiving unit 120 and the subject 100. This can cause an acoustic impedance match at an interface between the subject 100 and the transducers. The medium may be water, oil, or ultrasound gel, for example.
The photoacoustic apparatus 1 may include a holding member configured to hold the subject 100 to stabilize the shape of the subject 100. The holding member may have both of a high luminous transmittance and a high acoustic wave transmittance. For example, polymethylpentene, polyethylene terephthalate, or acrylic may be used.
In the apparatus according to this embodiment, the transducers may function as a transmitting unit configured to transmit acoustic waves in order to generate not only a photoacoustic image but also an ultrasonic wave image by transmitting and receiving acoustic waves. A transducer functioning as a receiving unit and a transducer functioning as a transmitting unit may be implemented by a single (common) transducer or by separate transducers.
The signal collecting unit 140 includes an amplifier and an A/D converter. The amplifier is configured to amplify an electric signal that is an analog signal generated in the subject upon irradiation of light at the first cyclic period and output from the acoustic wave receiving unit 120. The A/D converter is configured to convert the analog signal output from the amplifier to a digital signal. The signal collecting unit 140 may be implemented by an FPGA (Field Programmable Gate Array) chip.
Operations of the signal collecting unit 140 will be described in detail. Analog signals output from a plurality of transducers arranged in an array form in the acoustic wave receiving unit 120 are amplified by a plurality of corresponding amplifiers and are converted to digital signals by a plurality of corresponding A/D converters. The A/D conversion is performed at an A/D conversion rate equal to or higher than at least two times the bandwidth of the input signal. In a case where the photoacoustic waves contain frequency components in a range from 100 kHz to 10 MHz, the A/D conversion rate may be equal to or higher than 20 MHz or, desirably, may be equal to 40 MHz. The signal collecting unit 140 synchronizes the timing of light irradiation and the timing of signal collection processing by using a light emission control signal. In other words, the A/D conversion starts, with reference to the light emission time in each first cyclic period, to convert the analog signals to digital signals at the A/D conversion rate. As a result, a digital data string sampled at time intervals (A/D conversion intervals) equal to the reciprocal of the A/D conversion rate, referenced to the light emission time in each first cyclic period, can be obtained for each of the plurality of transducers.
The signal collecting unit 140 may also be called a Data Acquisition System (DAS). An electric signal herein is a concept including an analog signal and a digital signal.
The signal collecting unit 140 may be placed within the housing 181 of the probe 180, as described above. In this configuration, information between the probe 180 and the computer 150 is propagated as a digital signal for improved noise immunity. Use of high-speed digital signals can reduce the number of wires, which can improve operability of the probe 180, compared with transmission of analog signals.
The signal collecting unit 140 may perform averaging, which will be described below. The averaging may be performed by using hardware such as an FPGA.
The computer 150 includes the calculating unit 151, the storage unit 152, the control unit 153, and the frame rate converting unit 159. The unit responsible for the calculation function of the calculating unit 151 may be a computing circuit including a processor such as a CPU or a GPU (Graphics Processing Unit), or an FPGA (Field Programmable Gate Array) chip. This unit may be implemented by a single processor or computing circuit or by a plurality of processors or computing circuits.
The computer 150 may combine, across the digital data strings output from the signal collecting unit 140 in every first cyclic period, the data pieces having an equal time difference from the time of light emission from the light source unit 200. The computer 150 then stores in the storage unit 152 the combined digital data string as a combined electric signal (photoacoustic signal) originating from photoacoustic waves in each second cyclic period (imaging frame rate period).
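In this combining, samples with an equal offset from the light emission time are averaged across emissions; a rough sketch follows, in which the array shapes are illustrative assumptions.

```python
import numpy as np

def combine_emissions(data: np.ndarray) -> np.ndarray:
    """Combine digital data strings across light emissions.

    data: shape (n_emissions, n_transducers, n_samples); sample index k in
    every emission corresponds to the same time difference from the light
    emission, so samples sharing an index k are averaged together.
    """
    return data.mean(axis=0)  # simple averaging; weighted or moving averages
                              # may be substituted, as noted above

# Hypothetical sizes: 167 emissions, 128 transducers, 1024 samples each.
rng = np.random.default_rng(0)
data = rng.normal(size=(167, 128, 1024))
photoacoustic_signal = combine_emissions(data)  # shape (128, 1024)
```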
The calculating unit 151 then generates photoacoustic image data (of a structure image or a function image) by performing image reconstruction and executes other kinds of computing processing based on the averaged photoacoustic signal stored in the storage unit 152 in each second cyclic period (imaging frame rate period). The calculating unit 151 may receive a parameter such as the speed of sound of a subject or a configuration of the holding unit from the input unit 170 for use in a computing operation.
The calculating unit 151 may apply any arbitrary reconstruction algorithm for converting an electric signal to three-dimensional volume data, such as a time-domain back projection method, a Fourier-domain back projection method, and a model-based method (iterative calculation method). The time-domain back projection method may be Universal back-projection (UBP), Filtered back-projection (FBP), or phased addition (delay-and-sum).
In a case where the light source unit 200 is capable of switching the wavelength of a light beam to be emitted, the calculating unit 151 may perform image reconstruction processing to generate a first initial sound pressure distribution from a photoacoustic signal originating from a light beam having a first wavelength and a second initial sound pressure distribution from a photoacoustic signal originating from a light beam having a second wavelength. The calculating unit 151 may further correct the first initial sound pressure distribution by using a light amount distribution of the light beam having the first wavelength to obtain a first absorption factor distribution and correct the second initial sound pressure distribution by using a light amount distribution of the light beams having the second wavelength to obtain a second absorption factor distribution. The calculating unit 151 can further obtain an oxygen saturation distribution from the first and second absorption factor distributions. Details and order of the computing operations are not limited thereto if the oxygen saturation distribution can finally be obtained.
The storage unit 152 may include a volatile memory such as a RAM (random access memory) and a non-volatile storage medium such as a ROM (read only memory), a magnetic disk, and a flash memory. The storage medium to store a program is a non-volatile storage medium. The storage unit 152 may include a plurality of storage media.
The storage unit 152 can save a photoacoustic signal averaged at the second cyclic period (imaging frame rate period) and data such as photoacoustic image data generated by the calculating unit 151 and reconstructed image data based on photoacoustic image data.
The control unit 153 includes computing elements such as a CPU. The control unit 153 is configured to control operations to be performed by components of the photoacoustic apparatus. The control unit 153 may control a component of the photoacoustic apparatus in response to an instruction signal for an operation such as starting a measurement from the input unit 170. The control unit 153 may read out program code stored in the storage unit 152 and control an operation of a corresponding component of the photoacoustic apparatus.
The control unit 153 may adjust an image for display on the display unit 160. Thus, oxygen saturation distribution images are displayed sequentially in response to movements of the probe and photoacoustic measurements.
The frame rate converting unit 159 outputs photoacoustic image data (a structure image or a function image) generated at the frame rate (imaging frame rate) of the second cyclic period to the display unit as an image signal at the frame rate (display frame rate) of the third cyclic period. Although Fig. 1 illustrates the frame rate converting unit 159 as an independent component, reconstructed image data may instead be stored in the storage unit 152 at the frame rate of the second cyclic period (imaging frame rate), and the stored reconstructed image data may be read out at the frame rate of the third cyclic period (display frame rate). In this case, the control unit 153 and the storage unit 152 also function as the frame rate converting unit 159. According to an aspect of the present invention, in a case where a component which implements another function is used to implement the frame rate conversion, that component may also be called the frame rate converting unit.
The frame rate of the third cyclic period (display frame rate) may be selected from frame rates of 50 Hz, 60 Hz, 72 Hz, and 120 Hz, which are supported by general-purpose display apparatuses, for example. With this configuration, the frame rate of the second cyclic period (imaging frame rate) suitable for measurement can be selected independently from the frame rate of the third cyclic period (display frame rate) suitable for image display. A configuration may thus easily be provided in which the frame rate of the second cyclic period (imaging frame rate) can be changed in response to a user's instruction, for example.
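A minimal sketch of such a buffer-based converter follows, under the assumption that repeating or dropping the most recent frame is acceptable when the two rates differ; the class and method names are hypothetical.

```python
import threading

class FrameRateConverter:
    """Buffer-based frame rate conversion: reconstructed image data are
    written at the imaging frame rate (second cyclic period) and the most
    recent frame is read out at the display frame rate (third cyclic
    period). Frames are implicitly repeated or dropped as the rates differ."""

    def __init__(self):
        self._lock = threading.Lock()
        self._latest = None

    def write(self, frame):   # called once per imaging frame period tw2
        with self._lock:
            self._latest = frame

    def read(self):           # called once per display frame period tw3
        with self._lock:
            return self._latest

converter = FrameRateConverter()
converter.write("R1")         # produced at the imaging frame rate
print(converter.read())       # consumed at the display frame rate
```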
The display unit 160 is configured to rewrite the real screen in synchronization with the frame rate of the third cyclic period (display frame rate) input to the display unit 160. In this case, the frame rate of the third cyclic period (display frame rate) is equal to the rate for rewriting the real screen (refresh rate). In recent years, some liquid crystal display apparatuses have a function for processing a plurality of input frame rates (frame frequencies). Such liquid crystal display apparatuses may contain a frame rate converter configured to convert an input frame rate to the rate for rewriting the real screen (refresh rate). Such a configuration corresponds to a configuration in which the display unit 160 has a frame rate converter configured to convert the frame rate of the third cyclic period (display frame rate) to the rate for rewriting the real screen (refresh rate).
In a case where the display unit 160 contains such a frame rate converter, the frame rate converting unit 159 may be provided in the display unit 160 instead of in the computer 150 as illustrated in Fig. 1. According to an aspect of the present invention, the frame rate converting unit 159 may thus be provided in the display unit 160 instead of being provided in the computer 150. The configuration having the frame rate converting unit in the display unit 160 can advantageously simplify the configuration of the computer 150.
The computer 150 may be a specially designed workstation. The computer 150 may also be a general-purpose PC or workstation operating in response to instructions of a program stored in the storage unit 152. The components of the computer 150 may be configured as different hardware modules. Alternatively, at least some components of the computer 150 may be configured as one hardware module.
Fig. 3 illustrates a specific configuration example of the computer 150 according to this embodiment. The computer 150 according to this embodiment includes a CPU 154, a GPU 155, a RAM 156, a ROM 157, an external storage device 158, and the frame rate converting unit 159. To the computer 150, a liquid crystal display 161 corresponding to the display unit 160, a mouse 171 corresponding to the input unit 170, and a keyboard 172 are connected.
The computer 150 and the acoustic wave receiving unit 120 may be contained in a common housing. A computer contained in the housing may perform some of the signal processing, and a computer provided externally to the housing may perform the remaining signal processing. In this case, the computers provided internally and externally to the housing can collectively be called the computer according to this embodiment. In other words, the hardware modules included in the computer need not be contained in one housing. The computer 150 may also be an information processing apparatus provided by a cloud computing service, for example, and installed remotely.
The computer 150 corresponds to a processing unit according to an aspect of the present invention. In particular, the calculating unit 151 mainly implements the functionality of the processing unit.
The display unit 160 may be a display apparatus such as a liquid crystal display or an organic electroluminescence (EL) display. The display unit 160 is an apparatus configured to display an image based on subject information obtained by the computer 150 and a numerical value for a specific position. The display unit 160 receives and displays reconstructed image data at the frame rate of the third cyclic period (display frame rate). The frame rate of the third cyclic period (display frame rate) may be equal to 50 Hz, 60 Hz, 72 Hz, or 120 Hz, for example. The display unit 160 may display an image and a GUI for operating the apparatus. The display unit 160 or the computer 150 may perform image processing (such as adjustment of a luminance value).
The input unit 170 may be an operating console including a mouse, a keyboard, and a specific knob, which can be operated by a user. The display unit 160 may be configured by a touch panel, and the display unit 160 may be used as the input unit 170. The input unit 170 may receive an instruction or a numerical value from a user and transmit it to the computer 150.
The components of the photoacoustic apparatus may be implemented by mutually different devices or may be implemented by one integral apparatus. At least some components of the photoacoustic apparatus may be implemented by one integral apparatus.
The computer 150 may further use the control unit 153 to control driving of the components of the photoacoustic apparatus. The display unit 160 may display a GUI in addition to an image generated by the computer 150. The input unit 170 is configured to receive information from a user. A user can use the input unit 170 to instruct to perform operations such as starting and finishing a measurement, designating the frame rate of the second cyclic period (imaging frame rate), which will be described below, and saving a generated image.
The subject 100 will be described below, though it is not a component of the photoacoustic apparatus. The photoacoustic apparatus according to this embodiment is usable for purposes such as diagnoses of human or animal malignant tumors and blood vessel diseases and follow-ups of chemical treatments. Therefore, the subject 100 is assumed to be a living body, more specifically, a diagnosis target region such as the breast, organs, a vascular network, the head, the neck, the abdomen, or a limb including fingers and toes of a human or animal body. For example, in a case where a human body is the measuring subject, oxyhemoglobin or deoxyhemoglobin, a blood vessel mostly including them, or a new blood vessel formed in the neighborhood of a tumor may be the optical absorber. Plaque on a carotid artery wall may also be an optical absorber. Alternatively, a pigment such as methylene blue (MB), indocyanine green (ICG), gold minute particles, or an externally introduced substance integrating or chemically modifying them may be an optical absorber. A puncture needle or an optical absorber attached to a puncture needle may also be observed. The subject may be a lifeless thing such as a phantom or a test subject.
Figs. 4A, 4B and 4C are timing charts for describing operations according to the first embodiment of the present invention. Figs. 4A, 4B and 4C have a horizontal axis being a time axis. With reference to Figs. 4A, 4B and 4C, the first embodiment of the present invention will be described. Controls may be executed by the computer 150, an FPGA or a dedicated hardware module.
First of all, with reference to Fig. 4A, methods for obtaining a photoacoustic signal and for generating a photoacoustic image based on the obtained photoacoustic signal in the photoacoustic apparatus according to an aspect of the present invention will be described in detail. For simple description, Fig. 4A illustrates a case where the frame rate of the second cyclic period (imaging frame rate) and the frame rate of the third cyclic period (display frame rate) have an equal frequency.
As indicated by T1 in Fig. 4A, in the photoacoustic apparatus, the light source unit 200 emits light at the first cyclic period: tw1, and a photoacoustic signal based on the emitted light is obtained at the first cyclic period: tw1.
The first cyclic period: tw1 may have a length set in consideration of the Maximum Permissible Exposure (MPE) for skin. This is because the MPE value decreases as the length of the first cyclic period: tw1 decreases. For example, in a case where the measurement wavelength is equal to 750 nm, the pulse width of the pulsed light is equal to 1 μsec, and the first cyclic period: tw1 is equal to 0.1 msec, the MPE value for skin is about 14 J/m2. On the other hand, in a case where the pulsed light irradiated from the light emitting unit 113 has a peak power of 2 kW and the area of irradiation from the light emitting unit 113 is equal to 150 mm2, the optical energy irradiated from the light source unit 200 to the subject 100, such as a human body, is equal to about 13.3 J/m2. In this case, the optical energy irradiated from the light emitting unit 113 is equal to or lower than the MPE value. Therefore, in a case where the first cyclic period: tw1 is equal to or longer than 0.1 msec, it can be guaranteed that the optical energy is equal to or lower than the MPE value. In this manner, the length of the first cyclic period: tw1 is set, based on the peak power and the area of irradiation, such that the optical energy stays at or below the MPE value.
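A minimal sketch of this check, using the numerical values assumed in the example above (they illustrate one operating point, not fixed apparatus parameters):

```python
# Compare the per-pulse fluence delivered to the skin against the MPE value.
PEAK_POWER_W = 2e3           # peak power of the pulsed light: 2 kW
PULSE_WIDTH_S = 1e-6         # pulse width: 1 usec
IRRADIATED_AREA_M2 = 150e-6  # irradiated area: 150 mm^2
MPE_J_PER_M2 = 14.0          # MPE for skin at 750 nm with tw1 = 0.1 msec

pulse_energy_j = PEAK_POWER_W * PULSE_WIDTH_S           # 2 mJ per pulse
fluence_j_per_m2 = pulse_energy_j / IRRADIATED_AREA_M2  # ~13.3 J/m^2

assert fluence_j_per_m2 <= MPE_J_PER_M2, "irradiation exceeds the MPE"
print(f"fluence: {fluence_j_per_m2:.1f} J/m2 (MPE: {MPE_J_PER_M2} J/m2)")
```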
Referring to Fig. 4A, as indicated by T1 to T3, the light irradiating unit irradiates light to the subject eight times at the first cyclic period: tw1, and the photoacoustic signals generated in the subject upon each irradiation of light are obtained ((1) to (8)). The obtained photoacoustic signals are averaged, and the averaged photoacoustic signal A1 is obtained for each imaging frame rate period: tw2. The averaging may be simple averaging, moving averaging, or weighted averaging, for example, as described above. To give more specific numerical examples, in a case where the first cyclic period: tw1 is equal to 0.1 msec and the imaging frame rate is equal to 60 Hz, the imaging frame rate period: tw2 is equal to 16.7 msec, which means that 167 signals are averaged in each imaging frame rate period.
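The relationship between the two periods and the number of signals combined per frame can be sketched as follows (the array shapes and random data are placeholders for real acquisitions, not apparatus specifics):

```python
import numpy as np

tw1 = 0.1e-3                    # first cyclic period [s]
imaging_frame_rate = 60.0       # frame rate of the second cyclic period [Hz]
tw2 = 1.0 / imaging_frame_rate  # imaging frame rate period: ~16.7 msec
n_avg = round(tw2 / tw1)        # ~167 acquisitions averaged per frame

# Toy stand-in for real acquisitions: n_avg signals of 128 channels x 1024
# samples each; a simple average yields the averaged signal A1 of one frame.
signals = np.random.randn(n_avg, 128, 1024)
averaged_a1 = signals.mean(axis=0)
```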
Processing for image reconstruction may be performed based on the averaged photoacoustic signal A1 in the time interval defined by the second cyclic period: tw2 so that reconstructed image data R1 can be obtained. Here, the reconstructed image data are sequentially calculated by the calculating unit in the imaging frame rate period: tw2.
After that, the reconstructed image data R1 calculated by the calculating unit are output from the frame rate converting unit 159 to the display unit as an image 1 at the frame rate of the third cyclic period. According to the example illustrated in Fig. 4A, the imaging frame rate and the display frame rate have an equal frequency, as described above. Thus, the frame rate converting unit 159 outputs the reconstructed image data R1 obtained in T3 as display image data in the display frame rate period: tw3. The display unit 160 then displays the display image data input in the display frame rate period: tw3.
Next, an example method for setting the first cyclic period: tw1 and the imaging frame rate period: tw2 will be described. Because of the limitation due to the MPE value, the first cyclic period: tw1 depends on the peak power of the pulsed light and the irradiated area on the subject, as described above. The number of times of averaging depends on the ratio between the S/N ratio of the photoacoustic signal obtained by one irradiation of pulsed light and the S/N ratio required for the image quality requested by a user. For example, in a case where the S/N ratio of the photoacoustic signal obtained by one irradiation of pulsed light is 1/10 of the requested S/N ratio, the S/N ratio must be improved by a factor of 10; because averaging N signals improves the S/N ratio by a factor of √N, the averaging may be performed 100 times. In that case, when the first cyclic period: tw1 is equal to 0.1 msec, the imaging frame rate period is equal to or longer than 10 msec, that is, the imaging frame rate is equal to or lower than 100 Hz.
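A short sketch of this sizing rule, assuming the numbers above and the √N improvement of averaging:

```python
import math

single_shot_snr = 1.0  # S/N of one pulsed-light acquisition (normalized)
requested_snr = 10.0   # S/N required for the requested image quality

# Averaging N signals improves the S/N by sqrt(N), so N = (ratio)^2 = 100.
n_avg = math.ceil((requested_snr / single_shot_snr) ** 2)

tw1 = 0.1e-3                            # first cyclic period [s]
min_tw2 = n_avg * tw1                   # >= 10 msec
max_imaging_frame_rate = 1.0 / min_tw2  # <= 100 Hz
print(n_avg, min_tw2, max_imaging_frame_rate)  # 100 0.01 100.0
```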
The first cyclic period: tw1 is also limited by heat generated by the semiconductor light emitting device. In other words, given the thermal resistance of the probe, the device temperature can be estimated from the power consumption of the semiconductor light emitting device. Therefore, the first cyclic period: tw1 may be increased to prevent the temperature of the semiconductor light emitting device from exceeding the allowable temperature.
On the other hand, as the number of times of averaging is increased, the time taken for averaging photoacoustic signals also increases, and a body motion of the subject, if any, may cause blurring. In order to reduce the motion blurring, the number of times of averaging may advantageously be reduced as much as possible. More specifically, the apparatus may be designed such that the motion blurring is reduced to 1/2 or less of the requested resolution. For example, in a case where the requested resolution is 0.2 mm, the body motion of the subject is 5 mm/sec, and the first cyclic period: tw1 is equal to 0.1 msec, the number of times of averaging may be equal to or lower than 200, that is, the imaging frame rate period: tw2 may be equal to or shorter than 20 msec. The first cyclic period: tw1 and the imaging frame rate period: tw2 may be determined in consideration of this plurality of conditions. If all of the conditions cannot be satisfied, the parameters may be determined by giving priority levels to the conditions. Alternatively, the photoacoustic apparatus may be configured such that a user can input desired conditions such as a resolution and an S/N ratio through the input unit and such that the control unit determines the first cyclic period and the second cyclic period based on the input conditions.
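The blur budget translates into an upper bound on the averaging count as follows (values are those assumed in the example above; the final range also folds in the S/N lower bound from the previous sketch):

```python
resolution_m = 0.2e-3   # requested resolution: 0.2 mm
body_motion_m_s = 5e-3  # body motion of the subject: 5 mm/sec
tw1 = 0.1e-3            # first cyclic period [s]

# Keep the motion blur at or below half of the requested resolution.
max_window_s = (resolution_m / 2.0) / body_motion_m_s  # 20 msec
max_n_avg = int(max_window_s / tw1)                    # at most 200 averages

# Together with the S/N constraint of the previous sketch (n_avg >= 100),
# any averaging count in [100, 200] satisfies both conditions; an empty
# interval would call for the prioritization described above.
n_avg_range = (100, max_n_avg)
```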
Figs. 4B and 4C are timing charts according to the first embodiment of the present invention in a case where the imaging frame rate and the display frame rate have different frequencies. With reference to Fig. 4B, methods for obtaining a photoacoustic signal and for generating photoacoustic image data based on the obtained photoacoustic signal in the photoacoustic apparatus according to the first embodiment of the present invention will be described in detail.
Figs. 4B and 4C illustrate the same operations in T1 to T3 as those in Fig. 4A. In other words, the operations illustrated in Figs. 4B and 4C from the irradiation of light pulses to the obtaining of reconstructed image data can be performed under the same obtaining conditions (the same measurement condition) as those of the operations illustrated in Fig. 4A. Therefore, identical reconstructed image data can be obtained under identical measurement conditions irrespective of the display frame rate of the display unit.
Referring to Fig. 4B, T4 indicates an example with a display frame rate of 72 Hz, where the display frame rate period: tw3 is equal to about 13.9 msec. This means that the imaging frame rate period in the second cyclic period is longer than the display frame rate period in the third cyclic period. On the other hand, referring to Fig. 4C, T4 indicates an example with a display frame rate of 50 Hz, where the display frame rate period: tw3 is equal to 20 msec. This means that the imaging frame rate period in the second cyclic period is shorter than the display frame rate period in the third cyclic period. As described above, the frame rate converting unit 159 converts the reconstructed image data obtained under the same measurement conditions from the imaging frame rate (60 Hz) to the display frame rate (72 Hz or 50 Hz). Having described an example in which the imaging frame rate period, that is, the second cyclic period, is equal to 20 msec, the present invention is not limited to this condition. In practice, because the first cyclic period may be required to be longer than a time that depends on the distance from the receiving unit to the observation target, that is, the propagation time of the photoacoustic waves to be received, the first cyclic period is realistically in a range from 0.1 to several msec. For an improved S/N ratio of photoacoustic signals, the imaging frame rate may be set to 240 Hz or lower if photoacoustic signals based on light pulses irradiated 10 or more times are to be obtained and averaged. More simply, the frame rate converting unit may perform image combining, frame thinning, and frame rewriting between a plurality of frames for the frame rate conversion. In a case where the probe moves quickly and the discontinuity between frames would be noticeable, the frame rate converting unit may perform inter-frame interpolation by using a motion vector, for example, to perform frame rate conversion including generation of interpolation frames.
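As a minimal sketch of such a conversion (an illustration, not the apparatus's actual implementation), frame repetition and frame thinning can be expressed as mapping each display tick onto the most recent reconstructed frame; motion-vector interpolation would replace this lookup with a synthesized frame:

```python
def convert_frame_rate(n_display_frames, imaging_rate_hz, display_rate_hz):
    """Map each display frame index to the imaging frame index it shows."""
    mapping = []
    for k in range(n_display_frames):
        t = k / display_rate_hz                   # display time of frame k
        mapping.append(int(t * imaging_rate_hz))  # latest imaging frame <= t
    return mapping

print(convert_frame_rate(6, 60.0, 72.0))  # repetition: [0, 0, 1, 2, 3, 4]
print(convert_frame_rate(6, 60.0, 50.0))  # thinning:   [0, 1, 2, 3, 4, 6]
```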
Because a simple frame rate conversion includes frame thinning and frame rewriting processing, reconstructed image data that are never displayed may be generated when the display frame rate period: tw3 is longer than the imaging frame rate period: tw2. Such reconstructed image data are wasted if the reconstructed image data are only to be displayed. Therefore, in this case, in order to eliminate the waste, reconstructed image data may be generated at a timing that satisfies the condition that the imaging frame rate period: tw2 is equal to or longer than the display frame rate period: tw3.
In a case where the imaging frame rate period is longer than the display frame rate period, processes for reconstructing images of a plurality of frames may be performed in parallel so that the effective imaging frame rate period becomes shorter than the display frame rate period. In other words, while the process for reconstructing an image R1 is being performed, the calculating unit may start the process for reconstructing an image R2 so that reconstructed image data for one frame can be obtained within each display frame rate period.
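An illustrative sketch of this pipelining (the reconstruction function, its cost, and the array shapes are placeholder assumptions): with two workers, reconstruction of R2 starts while R1 is still running, so finished frames become available faster than one reconstruction completes.

```python
from concurrent.futures import ThreadPoolExecutor
import time
import numpy as np

def reconstruct(averaged_signal):
    """Placeholder for reconstructing one frame from an averaged signal."""
    time.sleep(0.025)              # pretend one reconstruction takes 25 msec
    return averaged_signal.mean()  # stand-in for reconstructed image data

averaged_signals = [np.random.randn(128, 1024) for _ in range(4)]

# With two workers, one finished frame arrives roughly every 12.5 msec.
with ThreadPoolExecutor(max_workers=2) as executor:
    for image in executor.map(reconstruct, averaged_signals):
        pass  # hand each frame to the frame rate converting unit 159
```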
On the other hand, the obtained reconstructed image data may be displayed and, at the same time, sequentially stored in the storage unit 152. The reconstructed image data sequentially stored in the storage unit 152 may be displayed in another time period. In other words, the reconstructed image data are read out from the storage unit 152 at the imaging frame rate period: tw2, undergo the frame rate conversion in the frame rate converting unit 159, and are output to the display unit 160 as display data in the display frame rate period: tw3. Thus, the reconstructed image data stored in the storage unit 152 can be displayed. Alternatively, the reconstructed image data sequentially stored in the storage unit 152 may be read out at the display frame rate period: tw3 and output to the display unit 160 as display data. This processing is also called a frame rate conversion herein according to an aspect of the present invention. In this case, the time period for displaying the reconstructed image data may increase or decrease (slow-motion or high-speed playback), but no reconstructed image data are left undisplayed. The condition on the relationship between the imaging frame rate period: tw2 and the display frame rate period: tw3 is then not necessarily required.
The operations illustrated in Figs. 4A to 4C may be changed based on a user's instruction or a result of a predetermined judgment performed by the computer 150. In other words, a user can change the operating mode in accordance with the desired frequency of image refreshing.
In the apparatus according to PTL 1, the timing for obtaining reconstructed image data, that is, the measurement condition, depends on the refresh rate of the display. According to this embodiment, on the other hand, the imaging frame rate and the display frame rate may be independent from each other, and the frame rate converting unit 159 may convert the imaging frame rate of reconstructed image data to the display frame rate. Therefore, even when a display apparatus with a different display frame rate is used, reconstructed image data can be obtained without changing the measurement condition. As a result, reconstructed image data can be obtained independently of the display apparatus to be used and without changing the measurement condition, which facilitates a comparison between a plurality of pieces of reconstructed image data. The photoacoustic apparatus may also have a simple apparatus configuration because it can operate at predefined timings, except for the display frame rate T4, independently of the refresh rate of the display apparatus. Advantageously, the display apparatus to be used can be changed easily without any change of the measurement condition.
Next, a second embodiment of the present invention will be described.
Because the apparatus disclosed in PTL 1 generates reconstructed image data for each period of the refresh rate of the display apparatus, the measurement condition cannot be easily changed. According to this embodiment, in a case where a reconstructed image is to be displayed on a display apparatus having a fixed display frame rate, the measurement condition can be changed in accordance with the subject or the region of interest, either in response to a user's designation or automatically.
With reference to Fig. 4A and Figs. 5A and 5B, operations to be performed by a photoacoustic apparatus according to the second embodiment of the present invention will be described in detail.
Fig. 5A is a timing chart illustrating examples of operations according to this embodiment. The timing chart in Fig. 5A is different from the timing chart illustrated in Fig. 4A in the timings indicated by T1 to T3. On the other hand, the display frame rate T4 is the same as that illustrated in Fig. 4A. The operations in the timing chart illustrated in Fig. 5A are effective for improving the S/N ratio of a reconstructed image. The operations are also effective in a case where the region of interest is at a deeper position within a body.
Referring to Fig. 5A, in the time period indicated by T1, in the photoacoustic apparatus, the light source unit 200 emits light at the first cyclic period: tw1, and a photoacoustic signal based on the emitted light is obtained at the first cyclic period: tw1. The first cyclic period: tw1 is the same as that in the time period indicated by T1 in Fig. 4A because, as described above, its length relates to the Maximum Permissible Exposure (MPE) for skin. The length of the first cyclic period: tw1 may be changed as long as the MPE is satisfied. In order to increase the S/N ratio of a reconstructed image, the number of times of averaging is increased from 8 in the first embodiment to 10. As illustrated in Fig. 5A, the signal collecting unit obtains a photoacoustic signal ten times at the first cyclic period: tw1 ((1) to (10)), and the calculating unit averages the obtained photoacoustic signals to obtain an averaged photoacoustic signal A1 in each imaging frame rate period: tw2. Then, in each imaging frame rate period: tw2, the calculating unit calculates reconstructed image data. As a result, though the imaging frame rate period: tw2 is longer than that in Fig. 4A, the frame rate converting unit 159 converts the output period for the reconstructed image data pieces R1, R2, ... to the display frame rate period: tw3 and outputs the image data pieces as display image data. The display unit 160 displays the display image data input in the display frame rate period: tw3.
On the other hand, the operations illustrated in the timing chart in Fig. 5B are effective when a subject or a probe moves fast. Referring to Fig. 5B, in the time period indicated by T1, the number of photoacoustic signals to be averaged is reduced from 8 in Fig. 4A to 6. Reducing the number of times of averaging shortens the time width over which photoacoustic data are averaged (the sampling enabled time). Therefore, motion blurring can be reduced compared with the operation illustrated in Fig. 5A.
In the example illustrated in Fig. 5B, the signal collecting unit obtains a photoacoustic signal six times at the first cyclic period: tw1 ((1) to (6)), and the calculating unit averages the obtained photoacoustic signals to obtain an averaged photoacoustic signal A1 in each imaging frame rate period: tw2. Then, in each imaging frame rate period: tw2, the calculating unit calculates reconstructed image data. As a result, though the imaging frame rate period: tw2 is shorter than that in Fig. 4A, the frame rate converting unit 159 converts the output period for the reconstructed image data pieces R1, R2, ... to the display frame rate period: tw3 and outputs the image data pieces as display image data. The display unit 160 displays the display image data input in the display frame rate period: tw3. The operations illustrated in Fig. 5B are more effective for reducing motion blurring than those in Fig. 5A.
Also according to this embodiment, a user can select an operating mode in accordance with the priority levels of the reduction of motion blurring and the S/N ratio of the resulting photoacoustic image. Alternatively, the computer 150 may be configured to automatically determine the operating mode.
In a case where the computer 150 is configured to automatically set an operating mode, the control unit may detect a motion of the probe by using an accelerometer provided within the probe so that the operating mode can be changed in accordance with the detection result, for example. In a case where the motion of the probe is faster than a predetermined threshold value, the measurement may be executed in the operating mode illustrated in the timing chart in Fig. 5B (with a lower number of times of averaging photoacoustic signals) for reduction of motion blurring. On the other hand, in a case where the motion of the probe is slower than the predetermined threshold value, the measurement may be executed in the operating mode illustrated in the timing chart in Fig. 5A (with a higher number of times of averaging photoacoustic signals) for an improved S/N ratio of the resulting photoacoustic image. In this manner, the operation mode may be automatically determined in accordance with a result of detection of the motion of the probe.
As an alternative method for the computer 150 to change the operating mode automatically, the depth of the region of interest may be used. In a case where the depth of the region of interest is deeper than a predetermined threshold value, priority is given to the S/N ratio of the resulting photoacoustic image, and the measurement is executed in the operating mode illustrated in the timing chart in Fig. 5A (with a higher number of times of averaging photoacoustic signals). On the other hand, in a case where the depth of the region of interest is shallower than the predetermined threshold value, priority is given to reduction of motion blurring, and the measurement is executed in the operating mode illustrated in the timing chart in Fig. 5B (with a lower number of times of averaging photoacoustic signals). In this manner, the operating mode may be determined automatically based on the depth of the region of interest.
The probe may have a pressure sensor, and the pressing force of the probe may be detected to determine whether priority is given to the S/N ratio of a photoacoustic image or the reduction of motion blurring. In other words, if the pressing force of the probe is higher than a predetermined threshold value, it may be determined that the region of interest is in a deeper area and that the probe moves slowly. Therefore, priority is given to the S/N ratio of a photoacoustic image, and the number of photoacoustic signals to be averaged can be increased. On the other hand, if the pressing force of the probe is lower than the predetermined threshold value, it may be determined that the region of interest is in a shallower area and that the probe moves fast. Therefore, priority is given to reduction of motion blurring, and the number of photoacoustic signals to be averaged is reduced. In this manner, the priority may be determined automatically based on a result of detection of the pressing force of the probe.
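The three automatic criteria described above (probe motion, depth of the region of interest, pressing force) can be summarized in a small selection routine; all threshold values, and the averaging counts taken from Figs. 5A and 5B, are illustrative assumptions rather than values prescribed by this embodiment:

```python
def select_averaging_count(probe_speed_mm_s=None, roi_depth_mm=None,
                           pressing_force_n=None):
    """Return a higher averaging count (S/N priority) or a lower one
    (motion-blur priority) from whichever sensor reading is available."""
    HIGH_AVG, LOW_AVG = 10, 6  # counts as in Figs. 5A and 5B

    if probe_speed_mm_s is not None:  # accelerometer within the probe
        return LOW_AVG if probe_speed_mm_s > 5.0 else HIGH_AVG
    if roi_depth_mm is not None:      # depth of the region of interest
        return HIGH_AVG if roi_depth_mm > 20.0 else LOW_AVG
    if pressing_force_n is not None:  # pressure sensor in the probe
        return HIGH_AVG if pressing_force_n > 2.0 else LOW_AVG
    return HIGH_AVG                   # default: S/N priority
```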
According to this embodiment, the frame rate converting unit is provided so that a measurement condition for obtaining reconstructed image data in response to a user's designation or automatically can be changed easily without changing the display frame rate of a display apparatus.
A third embodiment of the present invention is different from the first embodiment in how the photoacoustic signals to be averaged for generating a reconstructed image are selected.
Fig. 6 illustrates time periods T1 to T4 which are identical to those in Fig. 4A. The operations illustrated in Fig. 4A include averaging of photoacoustic signals obtained within a time period equal to the imaging frame rate period: tw2, while, according to this embodiment, photoacoustic signals ((1) to (10)) obtained in a period longer than the imaging frame rate period are averaged. Averaging photoacoustic signals over a period longer than the imaging frame rate period: tw2 can thus provide an improved S/N ratio. On the other hand, using photoacoustic signals obtained in a period longer than the imaging frame rate period: tw2 delays the time at which the reconstructed image data R1 are generated, compared with the operations illustrated in Fig. 4A. In other words, the waiting time from the irradiation of light to the subject to the display of a resulting image is longer than that of the operations illustrated in Fig. 4A.
Conversely to the operations in Fig. 6, only some of the photoacoustic signals obtained in the imaging frame rate period: tw2 may be averaged to generate reconstructed image data. For example, only the photoacoustic signals (7) to (10) of the photoacoustic signals illustrated in Fig. 6 may be averaged, without using the photoacoustic signals (1) to (6). Thus, photoacoustic signals can be averaged over a time period shorter than the imaging frame rate period so that motion blurring can be reduced.
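A minimal sketch of these two averaging variants (array sizes are placeholders; the ten acquisitions correspond to signals (1) to (10) above):

```python
import numpy as np

acquisitions = np.random.randn(10, 128, 1024)  # signals (1)..(10)

long_window = acquisitions.mean(axis=0)        # all 10: window longer than
                                               # tw2 -> better S/N, more delay
short_window = acquisitions[6:].mean(axis=0)   # only (7)-(10): window shorter
                                               # than tw2 -> less motion blur
```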
This processing facilitates changing the measurement condition for obtaining reconstructed image data as illustrated in the second embodiment, and can be implemented by a simplified apparatus configuration.
According to a fourth embodiment, a time for obtaining a photoacoustic signal is changed. Fig. 7 illustrates a timing chart for describing operations according to this embodiment.
Fig. 7 illustrates a case where the imaging frame rate and the display frame rate have an equal frequency, like Fig. 4A according to the first embodiment. This embodiment is also applicable to a case where the imaging frame rate and the display frame rate have different frequencies from each other, as illustrated in Figs. 4B and 4C.
The fourth embodiment illustrated in Fig. 7 is significantly different from the first embodiment in the sampling times for averaging photoacoustic signals, that is, the light emission times of the light source unit 200. According to the first embodiment, the first cyclic period is fixed. According to the fourth embodiment, as indicated by T1 in Fig. 7, the times for obtaining photoacoustic signals are concentrated in a part of the imaging frame rate period, so that a pause period, in which no light irradiation to the subject is performed, occurs within the imaging frame rate period. Also in this example, the first cyclic period and so on may be determined under a condition satisfying the MPE as described above.
In the operations illustrated in Fig. 7, during a first half of the imaging frame rate period, the light source unit 200 emits light at the first cyclic period: tw1, and the photoacoustic apparatus obtains the photoacoustic signals generated upon the irradiation of light at the first cyclic period: tw1. Next, according to this embodiment, the calculating unit averages the photoacoustic signals obtained eight times ((1) to (8)) to obtain an averaged photoacoustic signal A1. The calculating unit further sequentially calculates reconstructed image data pieces R1, R2, ... in the period defined by the second cyclic period based on the averaged photoacoustic signal A1. The frame rate converting unit 159 converts the reconstructed image data pieces R1, R2, ... to the display frame rate period: tw3 and outputs them as display image data. The display unit 160 displays the display image data input in the display frame rate period: tw3.
The operation according to this embodiment obtains and averages photoacoustic signals intensively within a part of the imaging frame rate period, which can reduce motion blurring in the resulting photoacoustic image more than the operation according to the first embodiment. The unnecessary photoacoustic signals that occur in the third embodiment do not occur in this embodiment. As illustrated in Fig. 7, the time period for irradiating light and obtaining photoacoustic signals is a part of the imaging frame rate period, and a time period in which no photoacoustic signal is obtained occurs in the imaging frame rate period. Therefore, within one imaging frame rate period, photoacoustic signals can be obtained and averaged, and furthermore, the reconstruction processing can be performed. As a result, the delay from obtaining a photoacoustic signal to the output of reconstructed image data can be reduced.
The fourth embodiment can reduce the time width for averaging photoacoustic data, which therefore can reduce motion blurring.
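The contrast in averaging time width can be sketched as follows (treating the eight pulses of Fig. 4A as if spaced evenly over the frame is an illustrative assumption, as are the specific numbers):

```python
tw2 = 1.0 / 60.0  # imaging frame rate period: ~16.7 msec
tw1 = 0.1e-3      # first cyclic period: 0.1 msec
n_pulses = 8

def pulse_times(frame_start_s, period_s, n):
    """Emission times of n pulses spaced period_s apart from frame_start_s."""
    return [frame_start_s + i * period_s for i in range(n)]

spread = pulse_times(0.0, tw2 / n_pulses, n_pulses)  # spread over the frame
burst = pulse_times(0.0, tw1, n_pulses)              # burst at frame start

# Time width over which signals are averaged (last - first emission + period):
spread_width = spread[-1] - spread[0] + tw2 / n_pulses  # ~16.7 msec
burst_width = burst[-1] - burst[0] + tw1                # 0.8 msec
```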
According to a fifth embodiment, a measurement condition is determined in accordance with a subject. More specifically, a photoacoustic apparatus will be described which synchronizes a frame rate period with periodical biological activities such as heartbeats or electrocardiographic waveforms.
Fig. 8 is a block diagram illustrating a computer and peripheral components according to the fifth embodiment. Like numbers refer to like parts in Fig. 3 and Fig. 8. Referring to Fig. 8, an electrocardiograph 173 is configured to detect an electric signal transmitted through the heart of a patient, that is, the subject, and output it to the computer 150. The computer 150 determines an imaging frame rate so as to be in synchronization with the output (electrocardiographic waveform) of the electrocardiograph, obtains a photoacoustic signal, and calculates reconstructed image data. This processing will be described in detail with reference to Fig. 9.
Referring to Fig. 9, the imaging frame rate period: tw2 and the display frame rate period: tw3 are different from each other. The computer 150 synchronizes the imaging frame rate period: tw2 with an RR period: tw4 of an electrocardiographic waveform T5 detected by the electrocardiograph 173. More specifically, the imaging frame rate period: tw2 starts with reference to the R wave, which has the maximum amplitude in the electrocardiographic waveform T5.
As illustrated in Fig. 9, the light source unit 200 emits light at the first cyclic period: tw1, and the photoacoustic apparatus obtains the photoacoustic signal generated upon the irradiation of light at the first cyclic period: tw1. However, the photoacoustic signal is obtained in a sampling enabled period SW that starts after a delay time DLY with reference to the R wave. The sampling enabled period SW is equal to the number of times of light emission (eight in this case) multiplied by the first cyclic period: tw1. The first cyclic period: tw1 may be determined based on the limitation due to the MPE as described above.
Next, the signal collecting unit obtains a photoacoustic signal eight times at the first cyclic period: tw1 ((1) to (8)), and the calculating unit averages the obtained photoacoustic signals to calculate the averaged photoacoustic signal A1. The calculating unit further performs the processing for reconstruction, and the reconstructed image data R1 are sequentially calculated in the imaging frame rate period: tw2. The frame rate converting unit 159 outputs the reconstructed image data R1, R2, ... calculated in the imaging frame rate period: tw2 as display image data in the display frame rate period: tw3. The display unit 160 then displays the display image data input in the display frame rate period: tw3.
The times illustrated in Fig. 9 are given for illustration purposes only. In an actual apparatus, the first cyclic period may be about 0.1 to several msec, the imaging frame rate period may be about 0.4 to 2 sec, and the display frame rate may be about 50 to 240 Hz.
The delay time DLY and the sampling enabled time SW, as described above, may be set arbitrarily by a user through the input unit 170, for example.
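Given R-wave times from the electrocardiograph 173, the gated acquisition described above can be sketched as follows (the heart period, DLY value, and pulse count are illustrative assumptions consistent with the ranges mentioned above):

```python
def sampling_windows(r_wave_times_s, dly_s, tw1_s, n_pulses):
    """Each detected R wave starts a frame; acquisition happens in a
    sampling enabled window of SW = n_pulses * tw1, DLY after the R wave."""
    sw = n_pulses * tw1_s
    return [(r + dly_s, r + dly_s + sw) for r in r_wave_times_s]

# ~75 bpm heart rate (R waves every 0.8 s), 8 pulses at tw1 = 0.1 msec.
windows = sampling_windows([0.0, 0.8, 1.6],
                           dly_s=0.25, tw1_s=1e-4, n_pulses=8)
# -> [(0.25, 0.2508), (1.05, 1.0508), (1.85, 1.8508)]
```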
A blood vessel moves periodically in response to heartbeats. Thus, a photoacoustic signal is received in synchronization with an electrocardiographic waveform, that is, a heartbeat, to obtain a reconstructed image. Therefore, motion blurring due to contraction and motion of the blood vessel in response to heartbeats can be reduced. Referring to the electrocardiographic waveform T5 in Fig. 9, the ventricle contracts at the QRS wave, and blood is ejected in the time period between the S wave and the T wave. In other words, this corresponds to the time period when the artery has a high blood pressure. In order to view a reconstructed image of the blood vessel in this time period, the delay time DLY and the sampling enabled time SW may be set to obtain photoacoustic signals in the time period from the S wave to the T wave, as illustrated in Fig. 9. The photoacoustic signals obtained in the set time period can be reconstructed to obtain a clear reconstructed image without motion blurring.
The time (phase within tw4) at which the blood pressure changes relative to the electrocardiographic waveform depends on the distance of the imaged region from the heart, and the magnitude of the blood pressure change also differs (as the distance from the heart increases, the change of the blood pressure decreases). Accordingly, the apparatus may be configured such that a user can adjust the delay time DLY and the sampling enabled time SW to good values while checking the reconstructed image. With the configuration in which a user can adjust the delay time DLY and the sampling enabled time SW, reconstructed image data pieces such as blood vessel images can be obtained at different times, such as atrial expansion times (or under different measurement conditions).
Having described, according to the aforementioned embodiments, that a photoacoustic signal is obtained in synchronization with a heartbeat based on an electrocardiographic waveform, the fifth embodiment of the present invention can apply a biological signal synchronized with a heartbeat instead of the electrocardiographic waveform. More specifically, an arterial blood pressure waveform or a sound wave (heart sound) generated by the heart may be applied instead of the electrocardiographic waveform. The fifth embodiment is also applicable to periodical biological activities other than heartbeats.
According to the fifth embodiment of the present invention, as described above, reconstructed image data are generated at an imaging frame rate period synchronized with, for example, electrocardiographic waveforms, which is different from the display frame rate period. Then, the frame rate converting unit 159 converts the period of the image data to the display frame rate, and the display unit 160 displays the resulting image. With this configuration, a good reconstructed image can be obtained which has less motion blurring of blood vessels due to a periodical physiological phenomenon of a living body, such as the one reflected in an electrocardiographic waveform. In particular, motion blurring of an artery can be reduced so that a clear photoacoustic image of the artery can be obtained.
According to a sixth embodiment according to an aspect of the present invention, a user interface (UI) mainly applying a display screen for displaying a measurement condition will be specifically described. A display screen displaying a measurement condition, which is a user interface corresponding to an example of an input unit configured to receive an input from a user, may be displayed along with a reconstructed image on the liquid crystal display 161 corresponding to the display unit 160. The liquid crystal display 161 may display a plurality of windows so that the reconstructed image and the measurement condition are displayed on different windows. A separate liquid crystal display apparatus may be added and specifically assigned to displaying the user interface. A user can set measurement parameters including the first cyclic period, the second cyclic period, and the third cyclic period through the user interface. A user may input all of those imaging parameters or only some of them.
With reference to Figs. 10A, 10B, and 10C, display screen examples will be specifically described.
Fig. 10A illustrates a display screen example according to this embodiment. Referring to Fig. 10A, an image U1 is a schematic image to be displayed on a display apparatus. The image U1 displays a waveform U101 schematically illustrating the timing of a display frame rate and a number (msec) U102 indicating a display frame rate period DT. The image U1 further displays a waveform U103 schematically illustrating an imaging frame rate and a number (msec) U104 indicating an imaging frame rate period FT. The image U1 further displays a waveform U105 indicating a delay time DLY and a sampling enabled time SW. The image U1 further displays a number (msec) U106 indicating the delay time DLY from the start of the imaging frame rate U103 to the start of the sampling enabled time SW, and a number (msec) U107 indicating the sampling enabled time SW.
A user can change a numerical value by using a mouse or a keyboard to change a measurement condition while checking the displayed image U1. Instead of the mouse or the keyboard, a specially designed knob for changing a parameter may be provided for the adjustment. For the adjustment, the software may be designed such that a waveform or a numerical value on the image U1 changes interactively in response to a user's instruction.
Fig. 10B illustrates another display screen example. Referring to Fig. 10B, an image U2 is a schematic image to be displayed on a display apparatus. The image U2 displays a waveform U201 schematically indicating the timing of the display frame rate. The image U2 displays a number (Hz) U202 indicating a display frame rate DR. The image U2 displays a waveform U203 schematically indicating an imaging frame rate and a number (Hz) U204 indicating an imaging frame rate FR. The image U2 displays a waveform U205 indicating a delay time DLY and a sampling enabled time SW. The image U2 displays a number (°, where one period is equal to 360°) U206 indicating, as a phase PH, the ratio of the delay time DLY from the start of a frame to the start of the sampling enabled time to the period of the imaging frame rate U203, and a number (%) U207 indicating the ratio of the sampling enabled time SW to the period FT of the imaging frame rate U203. On this kind of display screen, the time for starting to obtain a photoacoustic signal and the sampling enabled time can be intuitively identified even when a user changes the imaging frame rate. The display screen may be any combination of the aforementioned images or may use any other representations.
A user can change a measurement condition interactively while checking the displayed image U2.
The display frame rate may generally be fixed in accordance with the display unit 160 in use. Therefore, the waveforms (U101, U201) schematically indicating the timings of the display frame rate and the numbers (U102, U202) indicating the display frame rate in the images (U1, U2) illustrated in Figs. 10A and 10B may be omitted. Thus, the number of displayed items can be reduced so that a user can more easily set and grasp a measurement condition.
Fig. 10C illustrates another display screen example. Fig. 10C illustrates an embodiment which determines a measurement condition (imaging frame rate) in accordance with a subject, as in the fifth embodiment. More specifically, Fig. 10C illustrates a display screen displaying a measurement condition for the photoacoustic apparatus which synchronizes a frame rate period with periodical biological activities such as heartbeats or electrocardiographic waveforms.
Referring to Fig. 10C, an image U3 is a schematic image to be displayed on a display apparatus. The image U3 displays a waveform U301 schematically representing an imaging frame rate and a number (sec) U302 indicating an imaging frame rate period FT. The image U3 further displays a waveform U303 indicating a delay time DLY and a sampling enabled time SW. The image U3 displays a number (°, where one period is equal to 360°) U304 indicating, as a phase PH, the ratio of the time from the start of a frame to the start of the sampling enabled time SW to the period of the imaging frame rate U301, and a number (%) U305 indicating the ratio of the sampling enabled time SW to the period FT of the imaging frame rate U301. The image U3 further displays a waveform U306 indicating an electrocardiographic waveform and a line U307 indicating a trigger level set with respect to the electrocardiographic waveform U306. The trigger level can be set by a user by using a mouse, a keyboard, or a specially designed knob. Alternatively, the computer 150 may automatically set the trigger level, for example to 90% of the peak value of the amplitude. In the example illustrated in Fig. 10C, a trigger is generated at a rising time of the electrocardiographic waveform above the set trigger level. When the trigger is generated, the computer 150 generates an imaging frame rate where one period is the time from one trigger to the next trigger. In other words, a user does not set the imaging frame rate directly. A user can adjust the sampling start phase and the sampling enabled time SW, which are measurement conditions, to desired values while checking the electrocardiographic waveforms so that a desired reconstructed image can be obtained.
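A minimal sketch of the automatic trigger generation described above (the synthetic electrocardiogram trace is a placeholder; a real implementation would also validate the waveform):

```python
import numpy as np

ecg = np.random.randn(5000) * 0.05  # placeholder electrocardiogram trace
ecg[::800] = 1.0                    # synthetic R peaks every 800 samples

level = 0.9 * ecg.max()             # automatic trigger level: 90% of peak
above = ecg >= level
triggers = np.flatnonzero(above[1:] & ~above[:-1]) + 1  # rising crossings

# One imaging frame rate period spans from one trigger to the next.
frame_periods = np.diff(triggers)   # in samples
```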
The sixth embodiment may make it easy for a user to grasp and set a measurement condition. Therefore, advantageously, a user can easily set an optimum measurement condition in accordance with the state of the living body that is the subject.
The light source unit 200 may be configured to irradiate light having a plurality of wavelengths, as described above. When light having a plurality of wavelengths is used, an oxygen saturation can be calculated as functional information. For example, light beams of two wavelengths may be irradiated to the subject alternately at each imaging frame rate, and the corresponding photoacoustic signals are obtained. Thus, reconstructed image data pieces for the two wavelengths can be calculated over two successive imaging frame rate periods so that an oxygen saturation can be calculated. The oxygen saturation calculation may apply the scheme disclosed in PTL 2 or other schemes.
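As an illustrative sketch of a two-wavelength estimate (not the scheme of PTL 2; the molar extinction coefficients are rough literature values for 750 nm and 850 nm, assumed here for illustration), the oxygen saturation can be obtained by per-pixel linear unmixing of absorption images reconstructed at the two wavelengths:

```python
import numpy as np

# Rows: wavelengths (750 nm, 850 nm); columns: [Hb, HbO2] molar extinction
# coefficients (assumed approximate literature values).
EPS = np.array([[1405.0,  518.0],
                [ 691.0, 1058.0]])

def oxygen_saturation(mu_750, mu_850):
    """Unmix Hb/HbO2 per pixel and return sO2 = HbO2 / (Hb + HbO2)."""
    mu = np.stack([mu_750.ravel(), mu_850.ravel()])
    hb, hbo2 = np.linalg.solve(EPS, mu)
    return (hbo2 / (hb + hbo2)).reshape(mu_750.shape)

# mu_750 / mu_850 stand for absorption images from alternating frames.
so2 = oxygen_saturation(np.full((4, 4), 0.9), np.full((4, 4), 1.0))
```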
The plurality of aforementioned embodiments may be implemented by one photoacoustic apparatus, and the photoacoustic apparatus may be switched to provide a function corresponding to one of the embodiments. The photoacoustic apparatus may additionally have a function for performing measurement based on a reflected wave of an ultrasonic wave transmitted from a transducer. In this case, an ultrasonic echo image and a photoacoustic image may be displayed side by side or may be superimposed after registration for display.
Embodiment(s) of the present invention can also be realized by a computer of a system or apparatus that reads out and executes computer executable instructions (e.g., one or more programs) recorded on a storage medium (which may also be referred to more fully as a 'non-transitory computer-readable storage medium') to perform the functions of one or more of the above-described embodiment(s) and/or that includes one or more circuits (e.g., application specific integrated circuit (ASIC)) for performing the functions of one or more of the above-described embodiment(s), and by a method performed by the computer of the system or apparatus by, for example, reading out and executing the computer executable instructions from the storage medium to perform the functions of one or more of the above-described embodiment(s) and/or controlling the one or more circuits to perform the functions of one or more of the above-described embodiment(s). The computer may comprise one or more processors (e.g., central processing unit (CPU), micro processing unit (MPU)) and may include a network of separate computers or separate processors to read out and execute the computer executable instructions. The computer executable instructions may be provided to the computer, for example, from a network or the storage medium. The storage medium may include, for example, one or more of a hard disk, a random-access memory (RAM), a read only memory (ROM), a storage of distributed computing systems, an optical disk (such as a compact disc (CD), digital versatile disc (DVD), or Blu-ray Disc (BD)TM), a flash memory device, a memory card, and the like.
While the present invention has been described with reference to exemplary embodiments, it is to be understood that the invention is not limited to the disclosed exemplary embodiments. The scope of the following claims is to be accorded the broadest interpretation so as to encompass all such modifications and equivalent structures and functions.
This application claims the benefit of Japanese Patent Application No. 2017-094117, filed May 10, 2017, which is hereby incorporated by reference herein in its entirety.
Claims (10)
- A photoacoustic apparatus comprising:
a light irradiating unit configured to irradiate light to a subject;
an acoustic wave receiving unit configured to receive acoustic waves generated in the subject upon irradiation of the light to the subject and to output a reception signal; and
an image generating unit configured to generate an internal image of the subject based on the reception signal; wherein
the light irradiating unit is further configured to irradiate the light to the subject repetitively at a first cyclic period, and
the image generating unit is further configured to combine a plurality of the reception signals obtained by irradiating the light a plurality of times and to generate image data at a frame rate of a second cyclic period based on the combined signal,
the photoacoustic apparatus further comprising a frame rate converting unit configured to convert a display rate of an image based on the image data generated at the frame rate of the second cyclic period to a frame rate of a third cyclic period.
- The photoacoustic apparatus according to Claim 1, wherein the first cyclic period is shorter than the second cyclic period.
- The photoacoustic apparatus according to Claim 1 or 2, wherein the second cyclic period has a length equal to or longer than that of the third cyclic period.
- The photoacoustic apparatus according to any one of Claims 1 to 3, wherein the frame rate of the second cyclic period is synchronized with periodical biological activities.
- The photoacoustic apparatus according to any one of Claims 1 to 4, wherein the frame rate converting unit causes a display unit to display the image at the third cyclic period.
- The photoacoustic apparatus according to Claim 5, further comprising a storage unit configured to store the image data at the frame rate of the second cyclic period,
wherein the frame rate converting unit outputs stored reconstructed image data at the frame rate of the third cyclic period to the display unit.
- The photoacoustic apparatus according to Claim 5 or 6, wherein the frame rate converting unit causes the display unit to display information regarding the third cyclic period.
- The photoacoustic apparatus according to any one of Claims 1 to 7, further comprising an input unit configured to receive at least one input of the first cyclic period, the second cyclic period, and the third cyclic period.
- A photoacoustic image generating method comprising:
irradiating light to a subject repetitively at a first cyclic period;
receiving acoustic waves generated in the subject upon the irradiation of light and outputting a reception signal; and
generating an internal image of the subject based on the reception signal by combining a plurality of the reception signals obtained by irradiating the light to the subject a plurality of times and generating image data at a frame rate of a second cyclic period based on the combined signal; and
converting a display rate of an image based on the image data generated at the frame rate of the second cyclic period to a frame rate of a third cyclic period.
- A photoacoustic image display method, wherein an image generated by the photoacoustic image generating method according to Claim 9 is displayed at the third cyclic period on a display unit.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2017094117A JP2018187234A (en) | 2017-05-10 | 2017-05-10 | Photoacoustic apparatus and photoacoustic image generation method |
JP2017-094117 | 2017-05-10 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2018207713A1 true WO2018207713A1 (en) | 2018-11-15 |
Family
ID=62495842
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2018/017571 WO2018207713A1 (en) | 2017-05-10 | 2018-05-02 | Photoacoustic apparatus and photoacoustic image generating method |
Country Status (2)
Country | Link |
---|---|
JP (1) | JP2018187234A (en) |
WO (1) | WO2018207713A1 (en) |
- 2017-05-10: JP JP2017094117A patent/JP2018187234A/en active Pending
- 2018-05-02: WO PCT/JP2018/017571 patent/WO2018207713A1/en active Application Filing
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20140198606A1 (en) * | 2013-01-15 | 2014-07-17 | Helmholtz Zentrum München Deutsches Forschungszentrum für Gesundheit und Umwelt (GmbH) | System and method for quality-enhanced high-rate optoacoustic imaging of an object |
WO2016031687A1 (en) * | 2014-08-27 | 2016-03-03 | プレキシオン株式会社 | Photoacoustic imaging device |
JP2016047102A (en) | 2014-08-27 | 2016-04-07 | プレキシオン株式会社 | Photoacoustic imaging device |
US20170245763A1 (en) * | 2014-08-27 | 2017-08-31 | Prexion Corporation | Photoacoustic Imaging Device |
JP2015142740A (en) | 2015-02-26 | 2015-08-06 | キヤノン株式会社 | Optical acoustic device, information processor, and display method |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114129130A (en) * | 2021-10-29 | 2022-03-04 | 西安理工大学 | Photoacoustic image back projection reconstruction method based on single address LUT table |
CN114129130B (en) * | 2021-10-29 | 2023-09-22 | 西安理工大学 | Photoacoustic image back projection reconstruction method based on single-address LUT table |
Also Published As
Publication number | Publication date |
---|---|
JP2018187234A (en) | 2018-11-29 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 18728991; Country of ref document: EP; Kind code of ref document: A1 |
| NENP | Non-entry into the national phase | Ref country code: DE |
| 122 | Ep: pct application non-entry in european phase | Ref document number: 18728991; Country of ref document: EP; Kind code of ref document: A1 |