WO2017056462A1 - Photoacoustic measurement device and signal processing method of photoacoustic measurement device - Google Patents
Photoacoustic measurement device and signal processing method of photoacoustic measurement device
- Publication number
- WO2017056462A1 (PCT/JP2016/004307)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- photoacoustic
- filtering process
- region
- wave detection
- artifact
- Prior art date
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/0093—Detecting, measuring or recording by applying one single type of energy and measuring its conversion into another type of energy
- A61B5/0095—Detecting, measuring or recording by applying one single type of energy and measuring its conversion into another type of energy by applying light and detecting acoustic waves, i.e. photoacoustic measurements
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/72—Signal processing specially adapted for physiological signals or for diagnostic purposes
- A61B5/7235—Details of waveform analysis
- A61B5/725—Details of waveform analysis using specific filters therefor, e.g. Kalman or adaptive filters
Definitions
- the present invention relates to a photoacoustic measurement apparatus that generates a photoacoustic image based on a photoacoustic signal generated in a subject, and a signal processing method used therefor.
- in this kind of photoacoustic measurement, pulsed light of an appropriate wavelength (for example, in the visible, near-infrared, or mid-infrared wavelength band) is emitted toward the subject, the photoacoustic wave, an elastic wave generated when an absorbing substance in the subject absorbs the energy of this pulsed light, is detected, and the concentration of the absorbing substance is quantitatively measured from the detected wave.
- the absorbing substance in the subject is, for example, glucose or hemoglobin contained in blood.
- a technique for detecting such a photoacoustic wave and generating a photoacoustic image based on the detection signal is called photoacoustic imaging (PAI) or photoacoustic tomography (PAT).
- a probe having an acoustic wave detection unit for detecting the generated photoacoustic wave is often used.
- the inventors of the present application found that a photoacoustic wave generated at the surface portion of the subject on which the measurement light is incident is detected by the acoustic wave detection unit together with the other signals, and that this causes an artifact (false image). Such artifacts impair the diagnostic performance of the displayed photoacoustic image.
- Patent Document 1 describes a technique for suppressing artifacts by changing processing on a photoacoustic wave detection signal for each region different in the depth direction of a subject.
- Patent Document 2 describes a technique for removing artifacts by performing low-frequency cut processing on a photoacoustic wave detection signal.
- the technique described in Patent Document 1 is premised on elastography measurement and B-mode measurement. In these measurements, it is not possible to identify the region in which artifacts caused by photoacoustic waves generated at the surface portion of the subject on which the measurement light is incident will occur. Therefore, with the technique described in Patent Document 1, such artifacts cannot be accurately reduced or eliminated.
- since the technique described in Patent Document 2 performs the low-frequency cut process without identifying the region where artifacts occur, this technique also cannot accurately reduce or eliminate artifacts caused by photoacoustic waves generated at the surface portion of the subject on which the measurement light is incident.
- the present invention has been made in view of the above problems, and an object thereof is to provide a photoacoustic measurement apparatus capable of accurately reducing or removing artifacts caused by photoacoustic waves generated at the surface portion of a subject on which measurement light is incident, and a signal processing method used therefor.
- the photoacoustic measurement apparatus according to the present invention comprises: a probe having a light emitting unit that emits measurement light toward the subject and an acoustic wave detection unit that is arranged in parallel with the light emitting unit and detects a photoacoustic wave generated in the subject by the emission of the measurement light; an image generation unit that generates a photoacoustic image based on the photoacoustic wave detection signal output by the acoustic wave detection unit; region discriminating means for discriminating an artifact occurrence region and an artifact non-occurrence region in the photoacoustic image based on at least the positional relationship between the light emitting unit and the acoustic wave detection unit; and filter means that performs a first filtering process on a first photoacoustic wave detection signal, that is, the part of the photoacoustic wave detection signal corresponding to the photoacoustic image of the artifact non-occurrence region determined by the region discriminating means, and performs a second filtering process, different from the first filtering process, on a second photoacoustic wave detection signal, that is, the part of the photoacoustic wave detection signal corresponding to the photoacoustic image of the artifact occurrence region.
- the terms “photoacoustic wave detection signal”, “first photoacoustic wave detection signal”, and “second photoacoustic wave detection signal” refer to the signal output from the acoustic wave detection unit upon detecting the photoacoustic wave, and mean not only that signal itself but also a signal (including digitized data) obtained after that signal has undergone some processing.
- the “predetermined frequency” does not mean a fixed frequency of a specific value; it means a frequency determined in advance, and the value itself can be set arbitrarily.
- it is desirable that the filter means include a bandpass filter for performing a bandpass filtering process, and that the bandpass filtering process serving as the second filtering process set its low-frequency-side cut-off frequency higher than that of the bandpass filtering process serving as the first filtering process.
- alternatively, the filter means may include a high-pass filter that performs high-pass filtering, and the high-pass filtering process serving as the second filtering process may set its cutoff frequency higher than that of the high-pass filtering process serving as the first filtering process.
- it is desirable that the region discriminating means discriminate the artifact occurrence region and the artifact non-occurrence region based on a correspondence table in which the positional relationship between the light emitting unit and the acoustic wave detection unit is associated with the boundary between the artifact occurrence region and the artifact non-occurrence region.
- alternatively, the region discriminating means may discriminate the artifact occurrence region and the artifact non-occurrence region by calculating the boundary between them from the positional relationship between the light emitting unit and the acoustic wave detection unit.
- it is desirable that the region discriminating means correct the discrimination according to the sound velocity at the site of the subject through which the photoacoustic wave propagates.
- it is desirable that the filter means perform, on the photoacoustic wave detection signal related to at least one boundary region set within a range including the boundary between the artifact non-occurrence region and the artifact occurrence region, a boundary region filtering process different from both the first filtering process and the second filtering process.
- the above boundary region filtering process is preferably an intermediate filtering process between the first filtering process and the second filtering process.
- it is desirable that the pass characteristic of the boundary region filtering process be determined, according to the depth direction position of the boundary region, based on the pass characteristic of the first filtering process and the pass characteristic of the second filtering process.
- it is also desirable that the photoacoustic measurement apparatus further comprise means for constituting an image relating to at least one boundary region, set within a range including the boundary between the artifact non-occurrence region and the artifact occurrence region, from a signal obtained by weighting and adding the first photoacoustic wave detection signal subjected to the first filtering process and the second photoacoustic wave detection signal subjected to the second filtering process in accordance with the depth direction position of the boundary region.
- the signal processing method according to the present invention is a signal processing method for a photoacoustic measurement apparatus that includes a probe having a light emitting unit that emits measurement light toward the subject and an acoustic wave detection unit that is arranged in parallel with the light emitting unit and detects a photoacoustic wave generated in the subject by the emission of the measurement light, and an image generation unit that generates a photoacoustic image based on the photoacoustic wave detection signal output by the acoustic wave detection unit. In this method, the artifact occurrence region and the artifact non-occurrence region in the photoacoustic image are discriminated based on at least the positional relationship between the light emitting unit and the acoustic wave detection unit, a first filtering process is performed on the first photoacoustic wave detection signal corresponding to the photoacoustic image of the artifact non-occurrence region, and a second filtering process is performed on the second photoacoustic wave detection signal corresponding to the photoacoustic image of the artifact occurrence region.
- in the signal processing method, it is desirable that the filtering processes be bandpass filtering processes, and that the bandpass filtering process serving as the second filtering process set its low-frequency-side cut-off frequency higher than that of the bandpass filtering process serving as the first filtering process.
- alternatively, the filtering processes may be high-pass filtering processes, and the high-pass filtering process serving as the second filtering process may set its cutoff frequency higher than that of the high-pass filtering process serving as the first filtering process.
- in the signal processing method, it is desirable to discriminate the artifact occurrence region and the artifact non-occurrence region based on a correspondence table in which the positional relationship between the light emitting unit and the acoustic wave detection unit is associated with the boundary between the artifact occurrence region and the artifact non-occurrence region.
- alternatively, the artifact occurrence region and the artifact non-occurrence region may be discriminated by a calculation formula that calculates the boundary between them from the positional relationship between the light emitting unit and the acoustic wave detection unit.
- in the signal processing method, it is desirable to correct the discrimination according to the sound velocity at the site of the subject through which the photoacoustic wave propagates.
- it is also desirable to perform, on the photoacoustic wave detection signal related to at least one boundary region set within a range including the boundary between the artifact non-occurrence region and the artifact occurrence region, a boundary region filtering process different from both the first filtering process and the second filtering process.
- the above boundary region filtering process is preferably an intermediate filtering process between the first filtering process and the second filtering process.
- it is desirable that the pass characteristic of the boundary region filtering process be determined, according to the depth direction position of the boundary region, based on the pass characteristic of the first filtering process and the pass characteristic of the second filtering process.
- it is also desirable to constitute an image relating to at least one boundary region, set within a range including the boundary between the artifact non-occurrence region and the artifact occurrence region, from a signal obtained by weighting and adding the first photoacoustic wave detection signal subjected to the first filtering process and the second photoacoustic wave detection signal subjected to the second filtering process according to the position of the boundary region in the depth direction of the subject.
- Diagram explaining the generation of a false image; (A) to (D) are diagrams showing photoacoustic images in which artifact regions are generated due to photoacoustic waves generated at the surface of the subject, and (E) and (F) are schematics showing the positional relationship between the light emitting unit and the transducer array when the photoacoustic images of (A) and (D), respectively, are generated
- Graph showing an example of the relationship between the distance between the light emitting unit and the transducer array and the observable depth
- Schematics showing examples of filter pass characteristics
- Diagram explaining the distance between the light emitting unit and the transducer array
- Schematic showing the overall configuration of a photoacoustic measurement apparatus according to another embodiment of the present invention
- FIG. 1 is a schematic diagram showing an overall configuration of a photoacoustic measurement apparatus 10 according to an embodiment of the present invention.
- the photoacoustic measurement apparatus 10 of this embodiment has a function of generating an ultrasonic image in addition to a photoacoustic image.
- a probe 11, an ultrasonic unit 12, a laser unit 13, a display unit 14, an input unit 15 and the like are provided.
- those components will be sequentially described.
- the probe 11 has a function of emitting measurement light and ultrasonic waves toward the subject M, which is a living body, for example, and a function of detecting an acoustic wave U propagating in the subject M. That is, the probe 11 can emit (transmit) ultrasonic waves to the subject M and detect (receive) reflected ultrasonic waves (reflected acoustic waves) reflected by the subject M and returning.
- acoustic wave is a term that includes ultrasonic waves and photoacoustic waves.
- ultrasonic wave means an elastic wave transmitted by the probe and its reflected wave
- photoacoustic wave means an elastic wave emitted when the absorber 65 absorbs measurement light.
- the acoustic wave emitted from the probe 11 is not limited to the ultrasonic wave, and an acoustic wave having an audible frequency may be used as long as an appropriate frequency is selected according to the test object, measurement conditions, and the like.
- Examples of the absorber 65 in the subject M include blood vessels and metal members.
- the probe 11 is generally prepared for sector scanning, linear scanning, convex scanning, or the like, and an appropriate one is selected and used according to the imaging region.
- an optical fiber 60 is connected to the probe 11 as a connecting portion that guides laser light L, which is measurement light emitted from a laser unit 13 described later, to the light emitting portion 40.
- the probe 11 includes a transducer array 20 serving as the acoustic wave detection unit, a total of two light emitting units 40 disposed on both sides of the transducer array 20 with the transducer array 20 in between, and a housing 50 in which the transducer array 20 and the two light emitting units 40 are accommodated.
- the transducer array 20 also functions as an ultrasonic transmission element.
- the transducer array 20 is connected to a circuit for transmitting ultrasonic waves, a circuit for receiving acoustic waves, and the like via wires not shown.
- the transducer array 20 includes a plurality of ultrasonic transducers that are electroacoustic transducers arranged in parallel in one direction.
- the ultrasonic transducer is a piezoelectric element made of, for example, piezoelectric ceramics or a polymer film such as polyvinylidene fluoride (PVDF).
- the ultrasonic transducer has a function of converting the received acoustic wave U into an electrical signal.
- the transducer array 20 may include an acoustic lens.
- the transducer array 20 in the present embodiment is formed by arranging a plurality of ultrasonic transducers in a one-dimensional manner.
- however, a transducer array in which a plurality of ultrasonic transducers are arranged in a two-dimensional manner may also be used.
- the ultrasonic transducer has a function of transmitting ultrasonic waves as described above. That is, when an alternating voltage is applied to the ultrasonic vibrator, the ultrasonic vibrator generates an ultrasonic wave having a frequency corresponding to the frequency of the alternating voltage. Note that transmission and reception of ultrasonic waves may be separated from each other. That is, for example, ultrasonic waves may be transmitted from a position different from the probe 11, and reflected ultrasonic waves with respect to the transmitted ultrasonic waves may be received by the probe 11.
- the light emitting unit 40 is a part that emits the laser light L guided by the optical fiber 60 toward the subject M.
- the light emitting portion 40 is configured by the tip portion of the optical fiber 60, that is, the end portion far from the laser unit 13 that is a light source of measurement light.
- two light emitting units 40 are disposed on both sides of the transducer array 20 in the elevation direction, for example, with the transducer array 20 interposed therebetween.
- the elevation direction is a direction perpendicular to the arrangement direction and parallel to the detection surface of the transducer array 20 when a plurality of ultrasonic transducers are arranged one-dimensionally.
- the light emitting part may be composed of a light guide plate and a diffusion plate that are optically coupled to the tip of the optical fiber 60.
- a light guide plate can be composed of, for example, an acrylic plate or a quartz plate.
- as the diffusion plate, for example, a lens diffusion plate in which microlenses are randomly arranged on a substrate, a quartz plate in which light-diffusing fine particles are dispersed, or the like can be used.
- a holographic diffusion plate or an engineered diffusion plate may also be used as the lens diffusion plate.
- the laser unit 13 shown in FIG. 1 has, for example, a flash-lamp-excited Q-switched solid-state laser such as a Q-switched alexandrite laser, and emits laser light L as measurement light.
- the laser unit 13 is configured to receive a trigger signal from the control unit 34 of the ultrasonic unit 12 and output the laser light L.
- the laser unit 13 preferably outputs pulsed laser light L having a pulse width of 1 to 100 nsec (nanoseconds).
- the wavelength of the laser light L is appropriately selected according to the light absorption characteristics of the absorber 65 in the subject M to be measured.
- it is preferable that the wavelength belong to the near-infrared wavelength region.
- the near-infrared wavelength region means a wavelength region of about 700 to 850 nm.
- the wavelength of the laser beam L is naturally not limited to this.
- the laser beam L may be a single wavelength or may include a plurality of wavelengths such as 750 nm and 800 nm. When the laser beam L includes a plurality of wavelengths, the light beams having these wavelengths may be emitted at the same time or may be emitted while being switched alternately.
- instead of the alexandrite laser, a YAG-SHG-OPO (second harmonic generation - optical parametric oscillation) laser or a Ti-Sapphire (titanium-sapphire) laser, which can also output laser light in the near-infrared wavelength region, may be used in the laser unit 13.
- the optical fiber 60 guides the laser light L emitted from the laser unit 13 to the two light emitting units 40.
- the optical fiber 60 is not particularly limited, and a known fiber such as a quartz fiber can be used.
- one thick optical fiber may be used, or a bundle fiber in which a plurality of optical fibers are bundled may be used.
- the bundle fiber is arranged so that the laser light L enters from the light incident end face of the bundled fiber portion, and the two branched fiber portions of the bundle fiber are used, each tip portion constituting a light emitting unit 40 as described above.
- the ultrasonic unit 12 includes a reception circuit 21, a reception memory 22, a data separation unit 23, a photoacoustic image generation unit 24, an ultrasonic image generation unit 29, a display control unit 30, a transmission control circuit 33, and a control unit 34.
- the control unit 34 controls each unit of the photoacoustic measurement apparatus 10, and includes a trigger control circuit (not shown) in the present embodiment.
- This trigger control circuit sends a light trigger signal to the laser unit 13 when acquiring a photoacoustic image, for example.
- upon receiving this light trigger signal, the Q-switched solid-state laser of the laser unit 13 turns on the flash lamp of its excitation source and starts exciting the laser rod. While the excited state of the laser rod is maintained, the laser unit 13 can output the laser light L.
- the control unit 34 then transmits a Q switch trigger signal from the trigger control circuit to the laser unit 13. That is, the control unit 34 controls the output timing of the laser light L from the laser unit 13 by this Q switch trigger signal. Further, the control unit 34 transmits a sampling trigger signal to the receiving circuit 21 simultaneously with transmission of the Q switch trigger signal.
- This sampling trigger signal defines the start timing of the photoacoustic signal sampling in the AD converter (Analog-to-Digital converter) of the receiving circuit 21. As described above, by using the sampling trigger signal, it is possible to sample the photoacoustic signal in synchronization with the output of the laser light L.
- the control unit 34 transmits an ultrasonic trigger signal that instructs the transmission control circuit 33 to transmit an ultrasonic wave when acquiring an ultrasonic image.
- the transmission control circuit 33 transmits ultrasonic waves from the probe 11.
- the control unit 34 transmits a sampling trigger signal to the receiving circuit 21 in synchronization with the timing of ultrasonic transmission, and starts sampling of the reflected ultrasonic signal.
- in acquiring images, the position of the probe 11 is gradually changed with respect to the subject M, for example in the above-described elevation direction, so that the subject M is scanned with the laser light L or the ultrasonic waves. Sampling of the photoacoustic signal or the reflected ultrasonic signal is therefore performed while shifting the acoustic wave detection line one line at a time in synchronization with this scanning.
- the scanning may be performed by the operator manually moving the probe 11 or may be performed using an automatic scanning mechanism.
- the reception circuit 21 receives the photoacoustic wave detection signal output from the transducer array 20 of the probe 11 and stores the received detection signal in the reception memory 22.
- the reception circuit 21 typically includes a low noise amplifier, a variable gain amplifier, a low-pass filter, and an AD converter.
- the photoacoustic wave detection signal output from the probe 11 is amplified by the low-noise amplifier, gain-adjusted according to depth by the variable gain amplifier, has its high-frequency components cut by the low-pass filter, is converted into a digital signal by the AD converter, and is stored in the reception memory 22.
- the receiving circuit 21 is composed of, for example, one IC (Integrated Circuit).
- the cutting of high-frequency components by the low-pass filter is for preventing aliasing noise during AD conversion; the cut-off frequency is generally set to about 10 to 30 MHz, roughly half the sampling frequency of the AD conversion.
- the probe 11 outputs a photoacoustic wave detection signal and a reflected ultrasonic detection signal. Therefore, the reception memory 22 stores the digitized photoacoustic wave detection signal and reflected ultrasonic wave detection signal.
- the data separation unit 23 reads out the photoacoustic image data, that is, the digitized photoacoustic wave detection signal from the reception memory 22, and transmits it to the photoacoustic image generation unit 24.
- the data separation unit 23 reads the data for the reflected ultrasound image from the reception memory 22, that is, the digitized reflected ultrasound detection signal, and transmits it to the ultrasound image generation unit 29.
- the photoacoustic image generation unit 24 reconstructs the photoacoustic wave detection signal received from the data separation unit 23 to generate a photoacoustic image. Specifically, the photoacoustic image generation unit 24 adds the photoacoustic wave detection signals based on the signals from the individual ultrasonic transducers, with delay times corresponding to the positions of the ultrasonic transducers in the transducer array 20, to generate a photoacoustic wave detection signal for one line (delay-and-sum method). The photoacoustic image generation unit 24 may instead perform reconstruction by the CBP (Circular Back Projection) method, or by the Hough transform method or the Fourier transform method. The reconstructed photoacoustic wave detection signals for a plurality of lines undergo signal processing such as detection processing and logarithmic conversion processing and are sent to the display control unit 30 as a signal for displaying a photoacoustic image (tomographic image).
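- as a concrete illustration of this delay addition step, a minimal sketch for one image line is given below; it is not the implementation used in the photoacoustic image generation unit 24, and the array geometry, sampling rate, and sound speed are assumed inputs.

```python
import numpy as np

def delay_and_sum_line(rf, fs, pitch, c, depths):
    """Minimal delay-and-sum sketch for one image line.

    rf     : ndarray (n_elements, n_samples), photoacoustic RF data
    fs     : sampling frequency of the AD converter [Hz]
    pitch  : spacing between ultrasonic transducers [m]
    c      : assumed sound speed in the subject [m/s]
    depths : sequence of depths [m] at which the line is reconstructed
    """
    n_el, n_samp = rf.shape
    # lateral transducer positions relative to the line being reconstructed
    x = (np.arange(n_el) - (n_el - 1) / 2.0) * pitch
    line = np.zeros(len(depths))
    for i, z in enumerate(depths):
        # one-way propagation time from a source at depth z to each transducer
        t = np.sqrt(z ** 2 + x ** 2) / c
        idx = np.clip(np.round(t * fs).astype(int), 0, n_samp - 1)
        line[i] = rf[np.arange(n_el), idx].sum()  # delay and add
    return line
```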
- although the photoacoustic wave detection signals for the plurality of lines have undergone processing such as digitization and reconstruction and are not the signals output from the transducer array 20 of the probe 11 themselves, they are based on the photoacoustic wave detection signal output from the transducer array 20 and are therefore still referred to as “photoacoustic wave detection signals”.
- the above-described filtering process is performed in the photoacoustic image generation unit 24. The process will be described in detail later.
- the ultrasonic image generation unit 29 performs, on the reflected ultrasonic detection signal stored in the reception memory 22, basically the same processing as described above for the photoacoustic wave detection signal, and generates a reflected ultrasonic detection signal for a plurality of lines for an ultrasonic image (tomographic image). The ultrasonic image generation unit 29 outputs the reflected ultrasonic detection signal thus generated to the display control unit 30.
- the display control unit 30 causes the display unit 14 to display a photoacoustic image based on the photoacoustic wave detection signal and an ultrasonic image based on the reflected ultrasonic detection signal. These two images are displayed separately or combined and displayed on the display unit 14 as a combined image. In the latter case, the display control unit 30 performs image composition by superimposing a photoacoustic image and an ultrasonic image, for example. Thus, if an ultrasonic image is generated and displayed in addition to the photoacoustic image, a portion that cannot be imaged by the photoacoustic image can be observed in the ultrasonic image.
- FIG. 2 is a schematic diagram illustrating a state in which a photoacoustic wave generated from the blood vessel V, as an example of the absorber 65 illustrated in FIG. 1, and a photoacoustic wave generated at the surface of the subject on which the measurement light is incident are both detected.
- when the measurement light L is irradiated onto the subject M, it is ideally desired to detect only the photoacoustic wave U1 from the blood vessel V. In actuality, however, the photoacoustic wave U2 generated at the subject surface portion 44 on which the measurement light L is incident is also detected. This photoacoustic wave U2 causes an artifact (false image).
- the time from when the photoacoustic wave U2 is generated until it is detected depends on the interval between the light emitting unit 40 and the transducer array 20 (or the individual ultrasonic transducers; the same applies hereinafter), more specifically, on the distance between the arrival region of the measurement light L on the contact plane and the acoustic wave detection unit. That is, the larger the distance between the light emitting unit 40 and the transducer array 20, the longer the distance the photoacoustic wave U2 travels through the subject, and hence the longer the time from its generation until its detection.
- the “contact plane” means a plane that passes through the tip of the probe 11 (that is, the intersection of the probe surface in contact with the subject M and the central axis of the acoustic wave detection unit) and is parallel to the detection surface of the acoustic wave detection unit.
- the “arrival area” means an area where the contact plane and the measurement light L intersect.
- FIGS. 3A to 3F are diagrams showing the photoacoustic image P in which the artifact generation region R2 is generated due to the photoacoustic wave U2 generated on the surface of the subject.
- FIGS. 3E and 3F are schematic views showing the positional relationship between the light emitting unit 40 and the transducer array 20 when the photoacoustic images P of FIGS. 3A and 3D are generated, respectively.
- the boundary B between the artifact non-occurrence region R1 and the artifact occurrence region R2 moves downward, that is, to the deeper side of the subject M, as the distance W1 between the light emitting unit 40 and the transducer array 20 increases.
- this is because the vertical direction of the photoacoustic image P corresponds to the time axis, and the longer the interval W1 between the light emitting unit 40 and the transducer array 20, the longer the time until the signal of the photoacoustic wave U2 is detected.
- FIG. 4 shows an example of the relationship between the boundary B between the artifact non-occurrence region R1 and the artifact occurrence region R2 and the interval W1 between the light emitting unit 40 and the transducer array 20 described above.
- the horizontal axis in FIG. 4 is the interval W1
- the “observable depth” on the vertical axis is the depth of the artifact non-occurrence region R1, that is, the depth position of the boundary B.
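- a rough sense of this relationship can be given by a hedged first-order model (an illustrative assumption, not a formula stated here): with straight-line propagation at a uniform sound speed c, the surface-generated photoacoustic wave U2, created roughly a distance W1 from the acoustic wave detection unit, arrives after about W1/c, while a photoacoustic wave from depth d on the detection axis arrives after about d/c, so only signals from depths shallower than the apparent depth of the surface wave arrive before it and remain artifact-free; the observable depth therefore grows roughly with W1.

```latex
% Assumed first-order model (illustrative only), uniform sound speed c:
t_{U2} \approx \frac{W_1}{c}, \qquad
d_{B} \approx c \, t_{U2} \approx W_1
```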
- the following filtering process is performed by a bandpass filter.
- This filtering process is performed by an arithmetic process based on a program given in advance.
- this band-pass filtering process is performed with different filter pass characteristics for the photoacoustic wave detection signal related to the artifact non-occurrence region R1 (the first photoacoustic wave detection signal) and the photoacoustic wave detection signal related to the artifact occurrence region R2 (the second photoacoustic wave detection signal), among the photoacoustic wave detection signals reconstructed as described above to represent the photoacoustic image.
- the bandpass filtering process for the first photoacoustic wave detection signal is referred to as the first bandpass filtering process (the first filtering process in the present invention), and the bandpass filtering process for the second photoacoustic wave detection signal is referred to as the second bandpass filtering process (the second filtering process in the present invention).
- Fig. 5 shows the basic pass characteristics of the above bandpass filtering process.
- the broken line indicates the pass characteristic in the first bandpass filtering process
- the solid line indicates the pass characteristic in the second bandpass filtering process. That is, the low-frequency-side cut-off frequency fL2 for the second photoacoustic wave detection signal is set higher than the low-frequency-side cut-off frequency fL1 for the first photoacoustic wave detection signal. More specifically, for example, when the detection center frequency of the transducer array 20 of the probe 11 is 6.5 MHz, the low-frequency-side cutoff frequency fL1 for the first photoacoustic wave detection signal is about 1 MHz and the high-frequency-side cutoff frequency fH is about 10 MHz, while the low-frequency-side cut-off frequency fL2 for the second photoacoustic wave detection signal is about 2 MHz and the high-frequency-side cut-off frequency fH is about 10 MHz.
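- to make the two pass characteristics concrete, the sketch below applies region-dependent bandpass filtering with SciPy; only the cutoff values (1 MHz, 2 MHz, 10 MHz) come from the numbers above, while the Butterworth design, the filter order, and the 40 MHz sampling rate are assumptions, and each segment is assumed long enough for zero-phase filtering.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

FS = 40e6  # assumed AD-conversion sampling rate [Hz]

def bandpass(signal, f_low, f_high, fs=FS, order=4):
    """Zero-phase Butterworth band-pass between f_low and f_high [Hz]."""
    sos = butter(order, [f_low, f_high], btype="bandpass", fs=fs, output="sos")
    return sosfiltfilt(sos, signal)

def filter_by_region(rf_line, boundary_idx):
    """First filtering process above boundary B (artifact non-occurrence
    region R1), second filtering process with the higher low-frequency-side
    cutoff below it (artifact occurrence region R2)."""
    out = np.empty_like(rf_line, dtype=float)
    out[:boundary_idx] = bandpass(rf_line[:boundary_idx], 1e6, 10e6)  # fL1 = 1 MHz, fH = 10 MHz
    out[boundary_idx:] = bandpass(rf_line[boundary_idx:], 2e6, 10e6)  # fL2 = 2 MHz, fH = 10 MHz
    return out
```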
- in other words, the second bandpass filtering process for the second photoacoustic wave detection signal removes more of the signal components in the frequency range lower than the predetermined frequency (here, 2 MHz) than the first bandpass filtering process for the first photoacoustic wave detection signal does.
- the “frequency range lower than the predetermined frequency” is not limited to the range below 2 MHz; it means a frequency range of a certain width lying on the lowest-frequency side of the photoacoustic wave detection signal, and the numerical value of the “predetermined frequency” can be set as appropriate.
- the photoacoustic wave U2 that is generated in the subject surface portion 44 and causes artifacts is generated from the entire surface portion of the subject M on which the laser light L is incident.
- for this reason, the detection signal of the photoacoustic wave U2 is concentrated in a relatively low frequency range. Therefore, if, as described above, more of the low-frequency signal components of the second photoacoustic wave detection signal are removed than of the first photoacoustic wave detection signal, artifacts in the artifact occurrence region R2 of the photoacoustic image displayed on the display unit 14 are eliminated or reduced.
- the display unit 14 can display a photoacoustic image having high diagnostic performance as a whole.
- note that if the same processing were also applied to the first photoacoustic wave detection signal, signal components in a low frequency range that are significant for diagnosis might be removed.
- in order to discriminate between the artifact non-occurrence region R1 and the artifact occurrence region R2, it is necessary to know the boundary B between these regions. As described above with reference to FIGS. 3 and 4, the position of the boundary B in the subject depth direction basically corresponds to the interval W1 between the light emitting unit 40 and the transducer array 20. Therefore, in the present embodiment, a correspondence table between the subject depth direction position of the boundary B and the interval W1 is stored for each probe 11 in the internal memory of the photoacoustic image generation unit 24 shown in FIG. 1. The photoacoustic image generation unit 24 refers to this lookup table to determine the boundary B, that is, the artifact non-occurrence region R1 and the artifact occurrence region R2.
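- the correspondence table can be pictured as a small lookup structure like the sketch below; the table values are hypothetical placeholders, not the values actually stored for any probe 11.

```python
import numpy as np

# Hypothetical correspondence table for one probe model:
# interval W1 [mm] -> subject depth of boundary B [mm] (illustrative values only)
W1_MM       = np.array([2.0, 4.0, 6.0, 8.0, 10.0])
BOUNDARY_MM = np.array([3.0, 5.5, 8.0, 10.5, 13.0])

def boundary_depth_mm(w1_mm):
    """Depth of boundary B between the artifact non-occurrence region R1 and
    the artifact occurrence region R2, interpolated from the table."""
    return float(np.interp(w1_mm, W1_MM, BOUNDARY_MM))

def boundary_sample_index(n_samples, depth_per_sample_mm, w1_mm):
    """Convert the boundary depth into a sample index along one image line."""
    idx = int(boundary_depth_mm(w1_mm) / depth_per_sample_mm)
    return min(n_samples - 1, idx)
```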
- the photoacoustic image generation unit 24 constitutes a region discriminating unit and a filter unit in the present invention.
- as the filter means, it is desirable to apply a filter whose side lobes are reduced by a window function such as a Hamming window, Hanning window, Blackman window, or Kaiser window.
- the relationship between the subject depth direction position of the boundary B and the interval W1 changes according to the sound velocity at the observation site, that is, the sound velocity at the site of the subject M where the photoacoustic wave propagates. Therefore, the relationship between the position in the subject depth direction of the boundary B and the interval W1 in the correspondence table may be corrected according to the sound velocity. By this correction, the position of the boundary B in the subject depth direction is corrected, that is, the discrimination between the artifact non-occurrence region R1 and the artifact occurrence region R2 is corrected.
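- one plausible form of this correction, sketched under the assumption that the tabulated boundary depth was derived for a reference sound speed and scales linearly with the sound speed at the observed site (both the reference value and the linear scaling are assumptions), is shown below; boundary_depth_mm is the helper from the previous sketch.

```python
REFERENCE_SOUND_SPEED = 1540.0  # m/s, assumed speed used when the table was built

def corrected_boundary_depth_mm(w1_mm, site_sound_speed):
    """Scale the tabulated boundary depth by the ratio of the site's sound
    speed to the reference speed (assumed linear correction)."""
    return boundary_depth_mm(w1_mm) * (site_sound_speed / REFERENCE_SOUND_SPEED)

# e.g. in fatty tissue (~1450 m/s) the boundary is placed slightly shallower
# than the tabulated value; in muscle (~1580 m/s) slightly deeper.
```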
- the sound speed may be input from the input unit 15 shown in FIG. 1; alternatively, a correspondence between sound speed and observation site may be stored in a storage unit (not shown), and the sound speed may be obtained by referring to that correspondence based on information on the observation site input from the input unit 15.
- the discrimination may also be corrected according to the intensity of the laser light L as measurement light, the pulse width of the laser light L, and the detection characteristics of the transducer array 20.
- the boundary B may also be obtained from the interval W1 input from the input unit 15 shown in FIG. 1.
- the operator may input information specifying the boundary B from the input unit 15 while confirming the display of the artifact non-occurrence region R1 and the artifact generation region R2 shown on the display unit 14.
- the information specifying the boundary B may not be information indicating the position of the boundary B in the subject depth direction itself.
- for example, the distance x from the position of the optical axis at the output end of the light emitting unit 40 to the central axis of the transducer array 20, the distance y from the position of the optical axis at the output end of the light emitting unit 40 to the contact plane S1, and the angle θ of the optical axis may be used. If x, y, and θ are obtained, the distance W1 between the light emitting unit 40 and the transducer array 20 is obtained, and the subject depth direction position of the boundary B can further be obtained therefrom.
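- this geometric step can be sketched as below; the formula assumes that θ is the tilt of the optical axis from the normal to the contact plane, toward the transducer array, which is one plausible reading and not a formula given in the text.

```python
import math

def light_arrival_offset_mm(x_mm, y_mm, theta_deg):
    """Horizontal distance from the central axis of the transducer array 20
    to the point where the measurement light reaches the contact plane S1.

    x_mm      : offset of the optical axis at the light exit end from the array axis
    y_mm      : height of the light exit end above the contact plane
    theta_deg : assumed tilt of the optical axis from the plane normal, toward the array
    """
    return x_mm - y_mm * math.tan(math.radians(theta_deg))

# e.g. an exit point 8 mm off-axis and 5 mm above the plane, tilted 20 degrees
# toward the array, reaches the plane about 6.2 mm from the array axis.
print(round(light_arrival_offset_mm(8.0, 5.0, 20.0), 1))  # 6.2
```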
- the “interval between the arrival region of the measurement light on the contact plane and the acoustic wave detection unit” can be defined as follows. For example, as shown in FIG. 11, when nothing is mounted on the detection surface, let P1 be the intersection of the detection surface and the central axis of the transducer array 20, and let P2 be the point of maximum intensity when the energy profile EP in the measurement light arrival region is approximated by a Gaussian distribution. In this case, the point P1 is the tip of the probe through which the contact plane passes, and the distance W2 between the point P1 and the point P2 can be taken as the “interval between the arrival region and the acoustic wave detection unit”.
- the operator may also check, on the display unit 14, the artifact non-occurrence region R1 and the artifact occurrence region R2 determined based on the correspondence table or the calculation formula described above, and, if a more appropriate region division can be considered, input information specifying the boundary B of that division from the input unit 15 to reset the artifact non-occurrence region R1 and the artifact occurrence region R2.
- instead of the bandpass filter described above, a high-pass filter may be used as the filter means. FIG. 6 shows an example of the basic pass characteristics of such a high-pass filter.
- the broken line indicates the pass characteristic of the first high-pass filtering process (first filtering process in the present invention) for the first photoacoustic wave detection signal
- the solid line indicates the pass characteristic of the second high-pass filtering process (the second filtering process in the present invention) for the second photoacoustic wave detection signal. That is, in this case, the cutoff frequency f2 for the second photoacoustic wave detection signal is set higher than the cutoff frequency f1 for the first photoacoustic wave detection signal.
- the cutoff frequency f1 is about 1 MHz
- the cutoff frequency f2 is about 2 MHz.
- in this case as well, basically the same effect can be obtained as when the bandpass filter whose pass characteristics are shown in FIG. 5 is used.
- a bandpass filter having a pass characteristic shown in FIG. 7 or a pass characteristic shown in FIG. 8 can be applied.
- the broken line indicates the pass characteristic of the first bandpass filtering process for the first photoacoustic wave detection signal
- the solid line indicates the pass characteristic of the second bandpass filtering process for the second photoacoustic wave detection signal.
- the pass characteristic shown in FIG. 7 differs from that shown in FIG. 5 in that the low-frequency-side cutoff of the pass characteristic for the first photoacoustic wave detection signal changes continuously, in a roll-off manner, between fL1 and fL2. In this case as well, since more of the low-frequency signal components of the second photoacoustic wave detection signal can be removed than of the first photoacoustic wave detection signal, basically the same effect can be obtained as when the bandpass filter whose pass characteristics are shown in FIG. 5 is used.
- in FIG. 7, the low-frequency-side cut-off frequency of the first bandpass filtering process for the first photoacoustic wave detection signal changes continuously between fL1 and fL2; alternatively, the cutoff frequency may be changed stepwise between fL1 and fL2.
- the pass characteristic shown in FIG. 8 differs from that shown in FIG. 5 in that the gain of the filtering for the second photoacoustic wave detection signal is set higher than the gain of the filtering for the first photoacoustic wave detection signal. In this case as well, since more of the low-frequency signal components of the second photoacoustic wave detection signal can be removed than of the first photoacoustic wave detection signal, basically the same effect can be obtained as when the bandpass filter whose pass characteristics are shown in FIG. 5 is used. Furthermore, the reduction in the overall signal strength of the second photoacoustic wave detection signal caused by removing more of its low-frequency signal components can be compensated for by increasing the gain of the filtering for the second photoacoustic wave detection signal.
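- the FIG. 8 variant can be sketched by applying an extra gain to the output of the second filtering process; the gain value below is an assumption, and filter_by_region is the sketch given earlier.

```python
SECOND_REGION_GAIN = 1.3  # assumed compensation gain, not a value from the text

def filter_by_region_with_gain(rf_line, boundary_idx):
    """Region-dependent filtering in which the artifact occurrence region R2
    is boosted to compensate for the energy removed by its higher
    low-frequency-side cutoff (cf. the pass characteristic of FIG. 8)."""
    out = filter_by_region(rf_line, boundary_idx)
    out[boundary_idx:] *= SECOND_REGION_GAIN
    return out
```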
- when the filtering characteristics for the photoacoustic wave detection signal related to the artifact non-occurrence region R1 and those for the photoacoustic wave detection signal related to the artifact occurrence region R2 differ from each other as described above, the frequency characteristics of the image are likely to change abruptly at the boundary B between the artifact non-occurrence region R1 and the artifact occurrence region R2. Next, an embodiment in which such a sudden change in frequency characteristics is suppressed will be described.
- in this embodiment, two boundary regions BR1 and BR2 are set, as an example, within a range including the boundary B between the artifact non-occurrence region R1 and the artifact occurrence region R2; the boundary region BR1 belongs to the artifact non-occurrence region R1, and the boundary region BR2 belongs to the artifact occurrence region R2.
- the bandpass filtering performed by the above bandpass filter on the photoacoustic wave detection signals corresponding to the photoacoustic images of the boundary region BR1 and the boundary region BR2 is a boundary region bandpass filtering process that differs from both the first bandpass filtering process for the first photoacoustic wave detection signal corresponding to the photoacoustic image of the artifact non-occurrence region R1 and the second bandpass filtering process for the second photoacoustic wave detection signal corresponding to the photoacoustic image of the artifact occurrence region R2.
- FIG. 10 shows the basic pass characteristics of each of the above bandpass filtering processes. The broken line indicates the pass characteristic for the first photoacoustic wave detection signal, the solid line indicates the pass characteristic for the second photoacoustic wave detection signal, the two-dot chain line indicates the pass characteristic for the photoacoustic wave detection signal corresponding to the photoacoustic image of the boundary region BR1, and the one-dot chain line indicates the pass characteristic for the photoacoustic wave detection signal corresponding to the photoacoustic image of the boundary region BR2.
- as shown there, the boundary region bandpass filtering process for the photoacoustic wave detection signal corresponding to the photoacoustic image of the boundary region BR1 and that for the photoacoustic wave detection signal corresponding to the photoacoustic image of the boundary region BR2 are intermediate between the first bandpass filtering process for the first photoacoustic wave detection signal corresponding to the photoacoustic image of the artifact non-occurrence region R1 and the second bandpass filtering process for the second photoacoustic wave detection signal corresponding to the photoacoustic image of the artifact occurrence region R2.
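- one way such an intermediate pass characteristic could be realized is sketched below, under the assumption that the low-frequency-side cutoff is interpolated linearly in depth between fL1 and fL2; the linear interpolation is an assumed choice, since the text only requires an intermediate characteristic.

```python
def boundary_region_cutoff_hz(depth_mm, boundary_start_mm, boundary_end_mm,
                              f_l1=1e6, f_l2=2e6):
    """Low-frequency-side cutoff for a boundary region lying between the
    artifact non-occurrence region (cutoff f_l1) and the artifact occurrence
    region (cutoff f_l2), interpolated linearly in depth."""
    frac = (depth_mm - boundary_start_mm) / (boundary_end_mm - boundary_start_mm)
    frac = min(1.0, max(0.0, frac))
    return f_l1 + frac * (f_l2 - f_l1)

# With boundary regions spanning 9-11 mm in depth (hypothetical values), a
# region centred at 9.5 mm gets a cutoff of about 1.25 MHz and one centred at
# 10.5 mm about 1.75 MHz, both lying between the two base cutoffs.
```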
- as a result, the frequency characteristics of the photoacoustic image do not change sharply at the boundary B between the artifact non-occurrence region R1 and the artifact occurrence region R2. That is, in the photoacoustic image displayed on the display unit 14, the frequency characteristics change gradually at each boundary between the artifact non-occurrence region R1, the boundary region BR1, the boundary region BR2, and the artifact occurrence region R2, and the diagnostic performance of the photoacoustic image is improved.
- in the above example, two boundary regions BR1 and BR2 are set within a range including the boundary B between the artifact non-occurrence region R1 and the artifact occurrence region R2, but any number of boundary regions, one or more, may be set. When only one boundary region is set, its edge may be aligned with the boundary B. The more boundary regions are set, the smoother the change in frequency characteristics across the region boundaries becomes; however, the time required for the filtering process increases accordingly, so the number should be set with this trade-off in mind.
- in another embodiment, at least one boundary region is likewise set within a range including the boundary between the artifact non-occurrence region R1 and the artifact occurrence region R2; here, as in the example described above, two boundary regions BR1 and BR2 are set. The first photoacoustic wave detection signal corresponding to the photoacoustic image of the artifact non-occurrence region R1 is subjected to a first bandpass filtering process whose low-frequency-side cut-off frequency fL1 is set to 1 MHz, and a first processed photoacoustic wave detection signal is obtained. The second photoacoustic wave detection signal corresponding to the photoacoustic image of the artifact occurrence region R2 is subjected to a second bandpass filtering process whose low-frequency-side cut-off frequency fL2 is set to 2 MHz, and a second processed photoacoustic wave detection signal is obtained.
- the first processed photoacoustic wave detection signal and the second processed photoacoustic wave detection signal are then weighted and added according to the subject depth direction positions of the boundary region BR1 and the boundary region BR2, to obtain photoacoustic wave detection signals for the boundary region BR1 and for the boundary region BR2.
- for the boundary region BR1, a weighting coefficient of 0.7 is given to the first processed photoacoustic wave detection signal and a weighting coefficient of 0.3 to the second processed photoacoustic wave detection signal, and weighted addition is performed to obtain the photoacoustic wave detection signal for the boundary region BR1.
- for the boundary region BR2, a weighting coefficient of 0.3 is given to the first processed photoacoustic wave detection signal and a weighting coefficient of 0.7 to the second processed photoacoustic wave detection signal, and weighted addition is performed to obtain the photoacoustic wave detection signal for the boundary region BR2.
- a photoacoustic image of the boundary region BR1 is generated from the photoacoustic wave detection signal for the boundary region BR1 obtained as described above, and a photoacoustic image of the boundary region BR2 is generated from the photoacoustic wave detection signal for the boundary region BR2.
- for the portion of the artifact non-occurrence region R1 other than the boundary region BR1, a photoacoustic image is generated from the first processed photoacoustic wave detection signal, and for the portion of the artifact occurrence region R2 other than the boundary region BR2, a photoacoustic image is generated from the second processed photoacoustic wave detection signal; these photoacoustic images are then combined to generate and display one photoacoustic image.
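- the weighted addition can be sketched as below, reusing the bandpass helper from the earlier sketch; filtering the whole line with both filters and then mixing by region is an assumed realization, while the 0.7/0.3 weighting coefficients are the ones given above.

```python
def blend_with_boundary_regions(rf_line, br1, br2):
    """Weighted-addition variant: the whole line is passed through both
    band-pass filters and the results are mixed region by region.

    br1, br2 : (start, end) sample indices of boundary regions BR1 and BR2,
               with BR1 assumed to end where BR2 begins (at boundary B).
    """
    first = bandpass(rf_line, 1e6, 10e6)    # first processed signal  (fL1 = 1 MHz)
    second = bandpass(rf_line, 2e6, 10e6)   # second processed signal (fL2 = 2 MHz)

    out = first.copy()                      # artifact non-occurrence region R1
    out[br2[1]:] = second[br2[1]:]          # artifact occurrence region R2
    s1, e1 = br1
    s2, e2 = br2
    out[s1:e1] = 0.7 * first[s1:e1] + 0.3 * second[s1:e1]  # boundary region BR1
    out[s2:e2] = 0.3 * first[s2:e2] + 0.7 * second[s2:e2]  # boundary region BR2
    return out
```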
- in this case as well, the frequency characteristics of the photoacoustic image displayed on the display unit 14 change only gradually at each boundary between the artifact non-occurrence region R1, the boundary region BR1, the boundary region BR2, and the artifact occurrence region R2, and the diagnostic performance of the photoacoustic image is improved.
- moreover, compared with the case of changing the filtering characteristics individually for the photoacoustic wave detection signals of the artifact non-occurrence region R1, the artifact occurrence region R2, the boundary region BR1, and the boundary region BR2, the calculation processing in this embodiment is lighter.
- FIG. 12 shows a photoacoustic measurement apparatus 10 according to another embodiment of the present invention. The photoacoustic measurement apparatus 10 shown in FIG. 12 has a configuration in which the data separation unit 23, the ultrasonic image generation unit 29, and the transmission control circuit 33 are omitted, as compared with the apparatus shown in FIG. 1.
Landscapes
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Molecular Biology (AREA)
- General Health & Medical Sciences (AREA)
- Biophysics (AREA)
- Biomedical Technology (AREA)
- Heart & Thoracic Surgery (AREA)
- Medical Informatics (AREA)
- Veterinary Medicine (AREA)
- Surgery (AREA)
- Animal Behavior & Ethology (AREA)
- Pathology (AREA)
- Public Health (AREA)
- Acoustics & Sound (AREA)
- Artificial Intelligence (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Physiology (AREA)
- Psychiatry (AREA)
- Signal Processing (AREA)
- Ultra Sonic Daignosis Equipment (AREA)
- Investigating Or Analyzing Materials By The Use Of Ultrasonic Waves (AREA)
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2017542731A JP6411667B2 (ja) | 2015-09-29 | 2016-09-21 | Photoacoustic measurement device and signal processing method of photoacoustic measurement device |
EP16850642.6A EP3357430B1 (en) | 2015-09-29 | 2016-09-21 | Photoacoustic measurement device and signal processing method of photoacoustic measurement device |
CN201680052467.8A CN108024794B (zh) | 2015-09-29 | 2016-09-21 | 光声测量装置及光声测量装置的信号处理方法 |
US15/901,041 US11083376B2 (en) | 2015-09-29 | 2018-02-21 | Photoacoustic measurement device and signal processing method of photoacoustic measurement device |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2015191727 | 2015-09-29 | ||
JP2015-191727 | 2015-09-29 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/901,041 Continuation US11083376B2 (en) | 2015-09-29 | 2018-02-21 | Photoacoustic measurement device and signal processing method of photoacoustic measurement device |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2017056462A1 true WO2017056462A1 (ja) | 2017-04-06 |
Family
ID=58423089
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2016/004307 WO2017056462A1 (ja) | 2015-09-29 | 2016-09-21 | Photoacoustic measurement device and signal processing method of photoacoustic measurement device |
Country Status (5)
Country | Link |
---|---|
US (1) | US11083376B2 (zh) |
EP (1) | EP3357430B1 (zh) |
JP (1) | JP6411667B2 (zh) |
CN (1) | CN108024794B (zh) |
WO (1) | WO2017056462A1 (zh) |
-
2016
- 2016-09-21 CN CN201680052467.8A patent/CN108024794B/zh active Active
- 2016-09-21 EP EP16850642.6A patent/EP3357430B1/en active Active
- 2016-09-21 JP JP2017542731A patent/JP6411667B2/ja active Active
- 2016-09-21 WO PCT/JP2016/004307 patent/WO2017056462A1/ja unknown
-
2018
- 2018-02-21 US US15/901,041 patent/US11083376B2/en active Active
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2011217767A (ja) * | 2010-04-02 | 2011-11-04 | Canon Inc | Photoacoustic imaging apparatus and photoacoustic imaging method |
JP2013005957A (ja) * | 2011-06-27 | 2013-01-10 | Fujifilm Corp | Doppler image display method and apparatus |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107345345A (zh) * | 2017-06-30 | 2017-11-14 | 浙江众邦机电科技有限公司 | Cloth detection system and method for a sewing machine |
CN107607473A (zh) * | 2017-08-31 | 2018-01-19 | 华南师范大学 | Photoacoustic three-dimensional imaging apparatus and method with simultaneous multi-point excitation and matched reception |
Also Published As
Publication number | Publication date |
---|---|
CN108024794B (zh) | 2020-12-18 |
EP3357430A1 (en) | 2018-08-08 |
JPWO2017056462A1 (ja) | 2018-02-08 |
US20180177407A1 (en) | 2018-06-28 |
EP3357430A4 (en) | 2018-08-08 |
JP6411667B2 (ja) | 2018-10-24 |
EP3357430B1 (en) | 2019-09-18 |
CN108024794A (zh) | 2018-05-11 |
US11083376B2 (en) | 2021-08-10 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 16850642; Country of ref document: EP; Kind code of ref document: A1 |
| ENP | Entry into the national phase | Ref document number: 2017542731; Country of ref document: JP; Kind code of ref document: A |
| NENP | Non-entry into the national phase | Ref country code: DE |