WO2014050020A1 - Photoacoustic image generation device and photoacoustic image generation method - Google Patents

Photoacoustic image generation device and photoacoustic image generation method

Info

Publication number
WO2014050020A1
WO2014050020A1 (PCT/JP2013/005497)
Authority
WO
WIPO (PCT)
Prior art keywords
photoacoustic
detection
acoustic
coordinates
region
Prior art date
Application number
PCT/JP2013/005497
Other languages
English (en)
Japanese (ja)
Inventor
覚 入澤
剛也 阿部
Original Assignee
富士フイルム株式会社 (FUJIFILM Corporation)
Priority date
Filing date
Publication date
Application filed by 富士フイルム株式会社 (FUJIFILM Corporation)
Publication of WO2014050020A1

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/0093 Detecting, measuring or recording by applying one single type of energy and measuring its conversion into another type of energy
    • A61B 5/0095 Detecting, measuring or recording by applying one single type of energy and measuring its conversion into another type of energy by applying light and detecting acoustic waves, i.e. photoacoustic measurements
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 2576/00 Medical imaging apparatus involving image processing or analysis
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 30/00 ICT specially adapted for the handling or processing of medical images
    • G16H 30/40 ICT specially adapted for the handling or processing of medical images for processing medical images, e.g. editing

Definitions

  • the present invention relates to a photoacoustic image generation apparatus and a photoacoustic image generation method for generating a photoacoustic image based on a photoacoustic wave generated due to light absorption.
  • Photoacoustic spectroscopy irradiates a subject with light of a predetermined wavelength (for example, in the visible, near-infrared, or mid-infrared band); a specific substance in the subject absorbs the energy of this light, and the resulting elastic wave (a photoacoustic wave) is detected to measure the concentration or distribution of that substance (for example, Patent Document 1).
  • the specific substance in the subject is, for example, glucose or hemoglobin contained in blood when the subject is a human body.
  • A technique for detecting a photoacoustic wave and generating a photoacoustic image based on the detection signal is called photoacoustic imaging (PAI) or photoacoustic tomography (PAT).
  • Patent Document 1 discloses a method in which, to generate a one-frame photoacoustic image of an imaging target range divided into a plurality of partial areas, the photoacoustic waves (or photoacoustic signals) are detected in multiple rounds using the acoustic detection elements corresponding to each partial area; all of these photoacoustic signals are temporarily stored in memory, more data than can be sampled in parallel are read from the memory, and phase-matched addition is performed. According to the method of Patent Document 1, a photoacoustic image with higher resolution can be generated even when the number of data that can be sampled in parallel is limited.
  • The present invention has been made in view of the above problems, and an object thereof is to provide a photoacoustic image generation apparatus and a photoacoustic image generation method that can express the structure inside the subject more accurately even when volume data is generated based on photoacoustic signals detected in multiple rounds.
  • To solve the above problems, a photoacoustic image generation apparatus according to the present invention is a device that detects a photoacoustic wave generated in a subject and generates a photoacoustic image based on the photoacoustic signal of that wave, and comprises: an acoustic detection unit having a plurality of acoustic detection elements, which divides the imaging region corresponding to the plurality of acoustic detection elements into a plurality of detection regions and detects the photoacoustic wave for each detection region while sequentially selecting acoustic detection element groups that detect photoacoustic waves in parallel; a coordinate acquisition unit that acquires the coordinates of the acoustic detection unit in space; a coordinate setting unit that sets, for each detection region, representative coordinates that represent the coordinates of that detection region, based on the coordinates of the acoustic detection unit acquired by the coordinate acquisition unit when the photoacoustic wave is detected; and an acoustic signal processing unit that generates volume data of the photoacoustic image by associating, within the photoacoustic image data generated based on the photoacoustic signal, the partial image data for displaying each detection region with the representative coordinates set for that detection region.
  • In the photoacoustic image generation apparatus, the acoustic signal processing unit may generate the partial image data for displaying a certain detection region based on both the photoacoustic signal obtained in that detection region and the photoacoustic signals obtained in the other detection regions.
  • Alternatively, the acoustic signal processing unit may generate the partial image data for displaying a certain detection region based only on the photoacoustic signal obtained in that detection region.
  • The coordinate setting unit may set, as the representative coordinates of a certain detection region, calculated coordinates derived from a plurality of coordinates acquired while the photoacoustic wave is detected in that region. In this case, the coordinate setting unit may obtain the calculated coordinates using the coordinates acquired immediately before and immediately after the period in which the photoacoustic wave is detected.
  • Alternatively, the coordinate setting unit may directly set one of the coordinates acquired while the photoacoustic wave is detected in a certain detection region as the representative coordinates of that region.
  • In the photoacoustic image generation apparatus, the plurality of acoustic detection elements constituting each acoustic detection element group may be contiguous.
  • The coordinate acquisition unit may have a plurality of reading points for reading coordinates, and the coordinate setting unit may set the representative coordinates based on the coordinates read at those reading points. The reading points may be provided so as to correspond to each acoustic detection element group.
  • When the acoustic detection elements are divided into N acoustic detection element groups, the nth group may be composed of the nth, (N+n)th, (2N+n)th, ..., ((Q-2)N+n)th, and ((Q-1)N+n)th acoustic detection elements, where Q is the quotient when the total number of acoustic detection elements included in the acoustic detection unit is divided by N.
  • Preferably, the acoustic detection unit also detects reflected ultrasonic waves of ultrasonic waves transmitted to the subject, and the acoustic signal processing unit generates an ultrasonic image based on the ultrasonic signal of those reflected waves.
  • To solve the above problems, a photoacoustic image generation method according to the present invention is a method for detecting a photoacoustic wave generated in a subject and generating a photoacoustic image based on the photoacoustic signal of that wave, in which: using an acoustic detection unit having a plurality of acoustic detection elements, the imaging region corresponding to the plurality of acoustic detection elements is divided into a plurality of detection regions, and the photoacoustic wave is detected for each detection region while sequentially selecting acoustic detection element groups that detect photoacoustic waves in parallel; representative coordinates representing the coordinates of each detection region are set for each detection region based on the coordinates of the acoustic detection unit in space when the photoacoustic wave is detected; and volume data of the photoacoustic image is generated by associating, within the photoacoustic image data generated based on the photoacoustic signal, the partial image data for displaying each detection region with the representative coordinates set for that detection region.
  • In the photoacoustic image generation method, the partial image data for displaying a certain detection region may be generated based on the photoacoustic signal obtained in that detection region and the photoacoustic signals obtained in the other detection regions, for example based on the photoacoustic signals obtained in a set of detection regions corresponding to the imaging region.
  • Alternatively, in the photoacoustic image generation method, the partial image data for displaying a certain detection region may be generated based only on the photoacoustic signal obtained in that detection region.
  • In the photoacoustic image generation method, calculated coordinates derived from a plurality of coordinates acquired while the photoacoustic wave is detected in a certain detection region may be set as the representative coordinates of that region, or one of the coordinates acquired during that period may be used directly as the representative coordinates.
  • In the photoacoustic image generation method as well, the plurality of acoustic detection elements constituting each acoustic detection element group may be contiguous.
  • In the photoacoustic image generation apparatus and method according to the present invention, the photoacoustic wave is detected for each detection region, representative coordinates are set, and the volume data is generated by associating the partial image data for displaying each detection region with the representative coordinates set for that region. Compared with simply arranging the one-frame photoacoustic image data generated from the photoacoustic signals obtained for each detection region (that is, at different times), this increases the positional accuracy of the photoacoustic image data within the volume data. As a result, even when volume data is generated based on photoacoustic signals detected in multiple rounds, the structure inside the subject can be expressed more accurately.
  • FIG. 1 is a schematic block diagram showing the configuration of the photoacoustic image generation apparatus of the present embodiment.
  • FIG. 2 is a schematic diagram illustrating a configuration of an acoustic detection unit in the probe.
  • As shown in FIG. 1, the photoacoustic image generation apparatus 10 of the present embodiment includes a probe 11, an ultrasonic unit 12, a laser unit 13, a display unit 14, a coordinate acquisition unit (15, 41, and 42), and an input unit 16.
  • In the photoacoustic image generation method of this embodiment, the acoustic detection unit 20 having 128 acoustic detection elements 20c is used; the imaging region corresponding to the plurality of acoustic detection elements 20c is divided into a plurality of detection regions, and the photoacoustic wave is detected for each detection region while sequentially selecting acoustic detection element groups that detect photoacoustic waves in parallel. Representative coordinates representing the coordinates of each detection region are set for each detection region based on the coordinates of the acoustic detection unit 20 in space when the photoacoustic wave is detected, and the volume data of the photoacoustic image is generated by associating, within the photoacoustic image data generated based on the photoacoustic signal, the partial image data for displaying each detection region with the representative coordinates set for that region.
  • The probe 11 includes, for example, an optical fiber 40 that guides the laser light L output from the laser unit 13 to the subject M, and an acoustic detection unit 20 that detects an acoustic wave U from the subject M and generates an electrical signal (acoustic signal) corresponding to the intensity of the detected acoustic wave U.
  • In this specification, "acoustic wave" covers both ultrasonic waves and photoacoustic waves: "ultrasonic wave" means an elastic wave generated in the subject by the vibration of an acoustic wave generator such as a piezoelectric element, together with its reflected wave, while "photoacoustic wave" means an elastic wave generated in the subject by light irradiation.
  • the probe 11 is, for example, a handheld probe, and is configured so that a user can manually scan.
  • the scanning is not limited to manual scanning, and may be performed by a mechanical mechanism.
  • the probe 11 is appropriately selected from a sector scan type, a linear scan type, a convex scan type, and the like according to the subject M to be diagnosed.
  • a magnetic sensor 42 that constitutes a part of the coordinate acquisition unit is built in the probe 11.
  • the acoustic detection unit 20 includes, for example, a backing material, a detection element array 20a, a control circuit for the detection element array 20a, a multiplexer 20b, an acoustic matching layer, and an acoustic lens.
  • the detection element array 20a is a one-dimensional array of 128 acoustic detection elements 20c, and converts actually detected acoustic waves into electrical signals.
  • the number and arrangement of the acoustic detection elements 20c are not limited to this.
  • the number of acoustic detection elements 20c may be 192, or the acoustic detection elements 20c may be two-dimensionally arranged.
  • The acoustic detection element 20c is, for example, a piezoelectric element composed of piezoelectric ceramics or of a polymer film such as polyvinylidene fluoride (PVDF).
  • The multiplexer 20b selectively connects, to the ultrasonic unit 12, each group of acoustic detection elements 20c that detect acoustic waves in parallel.
  • Through the selective connection by the multiplexer 20b, the acoustic detection unit 20 divides the imaging region into a plurality of detection regions corresponding to the acoustic detection element groups that detect photoacoustic waves in parallel, and detects the photoacoustic waves used for generating the photoacoustic image data for each detection region.
  • the acoustic detection element group is a set of acoustic detection elements that detect photoacoustic waves in parallel among the plurality of acoustic detection elements 20c.
  • the imaging region is a region of a subject to be displayed in a one-frame photoacoustic image defined by the detection element array 20a.
  • Each detection region is a region, within the imaging region, corresponding to the subset (fewer than 128) of acoustic detection elements 20c whose photoacoustic signals can be transmitted to the ultrasonic unit 12 in parallel.
  • In the present embodiment, as shown in FIG. 2A, the detection element array 20a is divided into two element regions (element region A and element region B) as the acoustic detection element groups, each composed of, for example, 64 consecutive acoustic detection elements 20c. An element region may partially overlap another element region. The number of acoustic detection elements constituting each element region is preferably equal, but need not be strictly equal.
  • the detection of the photoacoustic wave is performed in order for each element region by sequentially connecting each element region to the ultrasonic unit 12 by selective connection by the multiplexer 20b.
  • the number of channels (ch) of the multiplexer 20b is 64, and data for 64 channels can be sampled in parallel.
  • a photoacoustic wave is detected in the element region A, and after the multiplexer 20b switches the connection to the element region B, the photoacoustic wave is detected in the element region B.
  • a photoacoustic wave signal (photoacoustic signal) detected in each element region is sequentially transmitted to the receiving circuit 21 of the ultrasonic unit 12.
  • In this way, the number of reception channels (ch) can be reduced, which makes it possible to reduce costs. When the probe 11 is scanned, the photoacoustic wave is detected for each element region in turn.
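The element-region switching described above can be sketched as follows. This is a minimal illustration with hypothetical names; the 256-sample record length and the random stand-in data are assumptions, not values from the patent.

```python
import numpy as np

N_ELEMENTS, N_CHANNELS = 128, 64   # 128 elements, 64 channels sampled in parallel

def acquire(element_indices, rng):
    # Stand-in for one laser shot plus parallel sampling on the elements
    # currently connected through the multiplexer (random data for illustration).
    return rng.standard_normal((len(element_indices), 256))

rng = np.random.default_rng(0)
region_a = list(range(0, N_CHANNELS))            # elements 0..63, connected first
region_b = list(range(N_CHANNELS, N_ELEMENTS))   # elements 64..127, after the switch
frames = [acquire(region_a, rng), acquire(region_b, rng)]
# Two 64-channel acquisitions together cover all 128 elements.
```

Two sequential 64-channel frames thus replace one 128-channel acquisition, which is the cost saving the text describes.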
  • The element regions may be formed by dividing the whole set of acoustic detection elements into two as shown in FIG. 2A, or into three, four, or more as shown in FIG. 2B.
  • photoacoustic waves are detected in the order of, for example, the element region A, the element region B, and the element region C by selective connection by the multiplexer 20b.
  • In addition to the aspect in which the acoustic detection elements constituting each element region are contiguous in the arrangement direction as in FIGS. 2A and 2B, the element regions may be divided as shown in FIG. 2C, in which the elements of one acoustic detection element group are separated by those of another group.
  • For example, suppose the detection element array 20a is divided into N element regions (acoustic detection element groups). The nth element region may then be composed of the nth, (N+n)th, (2N+n)th, ..., ((Q-2)N+n)th, and ((Q-1)N+n)th acoustic detection elements, where Q is the quotient when the total number of acoustic detection elements included in the acoustic detection unit is divided by N; any remainder is incorporated into the element regions as evenly as possible. In this case, an acoustic signal is first detected in the element region of elements 1, N+1, 2N+1, ..., then in the element region of elements 2, N+2, 2N+2, ..., and so on.
  • For example, when N = 3 and there are 128 acoustic detection elements, the first element region is composed of the 1st, 4th, 7th, ..., 121st, and 124th elements from the left plus the remaining 127th element, the second element region of the 2nd, 5th, 8th, ..., 122nd, and 125th elements plus the remaining 128th element, and the third element region of the 3rd, 6th, 9th, ..., 123rd, and 126th elements.
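The interleaved assignment above can be expressed compactly. The helper below is a hypothetical illustration (1-based element indices, as in the text) of the rule that region n collects elements n, N+n, 2N+n, ..., with any remainder spread as evenly as possible:

```python
def element_regions(total_elements: int, n_regions: int):
    # Region n (1-based) collects elements n, N+n, 2N+n, ...;
    # remainder elements naturally fall into the lowest-numbered regions,
    # so the region sizes differ by at most one.
    regions = {r: [] for r in range(1, n_regions + 1)}
    for e in range(1, total_elements + 1):
        regions[((e - 1) % n_regions) + 1].append(e)
    return regions

regions = element_regions(128, 3)
# Region 1: 1, 4, ..., 124, 127 (43 elements); region 2: 2, 5, ..., 125, 128
# (43 elements); region 3: 3, 6, ..., 126 (42 elements).
```

With 128 elements and N = 3, Q = 42 with remainder 2, and the two leftover elements (127 and 128) land in regions 1 and 2, matching the example in the text.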
  • the optical fiber 40 guides the laser light L output from the laser unit 13 to the vicinity of the detection element array 20a.
  • the optical fiber 40 is not particularly limited, and a known fiber such as a quartz fiber can be used.
  • The laser light L is guided to the vicinity of the detection element array 20a and is then irradiated onto a range that includes at least the detection region facing the selectively connected element region.
  • A light guide plate or a diffusion plate can also be used in addition to the optical fiber so that the laser light L is emitted uniformly onto the subject.
  • the laser unit 13 includes a light source that emits laser light L, for example, and outputs the laser light L as light to be irradiated on the subject M.
  • the laser unit 13 is configured to output a laser beam L in response to, for example, a trigger signal from the control unit 29 of the ultrasonic unit 12.
  • the laser light L output from the laser unit 13 is guided to the vicinity of the detection element array 20a of the probe 11 using a light guide unit such as an optical fiber 40, for example.
  • the laser unit 13 preferably outputs pulsed light having a pulse width of 1 to 100 nsec as laser light.
  • the laser unit 13 is a Q-switch alexandrite laser.
  • the pulse width of the laser light L is controlled by, for example, a Q switch.
  • the wavelength of the laser light is appropriately determined according to the light absorption characteristics of the substance in the subject to be measured.
  • the wavelength is preferably a wavelength belonging to the near-infrared wavelength region.
  • the near-infrared wavelength region means a wavelength region of about 700 to 850 nm.
  • the wavelength of the laser beam is not limited to this.
  • the laser beam L may be a single wavelength or may include a plurality of wavelengths (for example, 750 nm and 800 nm). Furthermore, when the laser light L includes a plurality of wavelengths, the light of these wavelengths may be irradiated to the subject M at the same time, or may be irradiated while being switched alternately.
  • the laser unit 13 may be a YAG-SHG-OPO laser or a Ti-Sapphire laser that can output laser light in the near-infrared wavelength region in addition to the alexandrite laser.
  • The coordinate acquisition unit acquires, continuously or at successive points in time while the probe 11 is being scanned, the coordinates (hereinafter also simply "coordinates") that define the position and posture of the probe 11 (that is, of the acoustic detection unit 20) in real space.
  • In the present embodiment, the coordinate acquisition unit is a magnetic sensor unit, which includes a coordinate acquisition control unit 15, a magnetic field generation unit 41 such as a transmitter, and a magnetic sensor 42.
  • The magnetic sensor unit can acquire the position (x, y, z) and posture (three angles) of the magnetic sensor relative to the coordinate space of the magnetic field generation unit system (the space defined on the pulsed magnetic field formed by the magnetic field generation unit).
  • the position and orientation of the magnetic sensor are associated with the position and orientation of the probe.
  • the “magnetic sensor position” means the position of the reference point of the magnetic sensor determined based on the magnetic field information acquired by the magnetic sensor.
  • The posture of the magnetic sensor means, for example, the inclination of the space (the space of the magnetic sensor system) whose origin is the reference point of the magnetic sensor. Note that when the scanning of the probe 11 involves only translation, the acquired information may be the relative position alone.
  • the coordinate acquisition control unit 15 sets the coordinates of the probe 11 at that time as the origin in the space of the magnetic field generation unit system, for example.
  • This space is, for example, a three-axis (x, y, z) space when only translation is considered, and a six-axis space with three additional posture angles when rotation is also considered.
  • The origin is preferably set so that the axes of the space are along the array direction of the detection element array 20a (the direction in which the acoustic detection elements 20c are arranged) and the elevation direction (the direction perpendicular to the array direction and parallel to the detection surface of the detection element array 20a).
  • the coordinate acquisition unit may be configured to acquire coordinates using an acceleration sensor, an infrared sensor, or the like in addition to the magnetic sensor unit.
  • The coordinate acquisition unit acquires the coordinates of the probe 11 at a predetermined cycle (coordinate acquisition cycle). In the present embodiment, the coordinate acquisition cycle of the magnetic sensor unit is 5 ms.
  • The acquired coordinates are transmitted to the control means 29 and are used when generating three-dimensional volume data based on the acoustic signal, generating tomographic data from the volume data, or arranging two-dimensional acoustic images in order according to position.
  • In the present embodiment, two magnetic sensors 42 serving as reading points for the coordinate acquisition unit are provided in the probe 11 (FIG. 1): for example, one is arranged in the vicinity of element region A and the other in the vicinity of element region B.
  • The control means 29 sets the representative coordinates based on the coordinates read by the plurality of magnetic sensors at each reading timing, for example by adopting, for each element region, the coordinates read by the nearer magnetic sensor 42, or by taking a weighted average of the coordinates read by the two magnetic sensors 42. Alternatively, a single magnetic sensor 42 may be provided.
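The weighted-average option just mentioned can be sketched as follows. The function name and the inverse-distance weighting rule are illustrative assumptions; the patent only states that a weighted average of the two sensor readings may be used.

```python
def representative_coordinate(sensor_a, sensor_b, dist_a, dist_b):
    # Inverse-distance weighted average of two sensor positions (x, y, z):
    # the sensor closer to the element region contributes more strongly.
    wa, wb = 1.0 / dist_a, 1.0 / dist_b
    return tuple((wa * a + wb * b) / (wa + wb) for a, b in zip(sensor_a, sensor_b))

# An element region equidistant from both sensors gets the midpoint of the
# two readings.
mid = representative_coordinate((0.0, 0.0, 0.0), (10.0, 0.0, 0.0), 5.0, 5.0)
```

For an element region much closer to one sensor, the result approaches that sensor's reading, which reproduces the "adopt the nearer sensor" behaviour in the limit.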
  • The ultrasonic unit 12 includes a reception circuit 21, AD conversion means 22, a reception memory 23, photoacoustic image reconstruction means 24, detection/logarithm conversion means 27, photoacoustic image construction means 28, control means 29, image synthesis means 38, and observation method selection means 39.
  • the ultrasonic unit 12 corresponds to an acoustic signal processing unit in the present invention.
  • the control means 29 controls each part of the photoacoustic image generation apparatus 10, and includes a trigger control circuit 30 in the present embodiment, for example.
  • the trigger control circuit 30 sends a light trigger signal to the laser unit 13 when the photoacoustic image generation apparatus is activated, for example.
  • When the light trigger signal is received, the flash lamp in the laser unit 13 is turned on and excitation of the laser rod begins; the excitation state of the laser rod is maintained, and the laser unit 13 becomes ready to output laser light.
  • The control means 29 then transmits a Qsw trigger signal from the trigger control circuit 30 to the laser unit 13; that is, the control means 29 controls the output timing of the laser light from the laser unit 13 by means of this Qsw trigger signal.
  • The Qsw trigger signal may be transmitted at regular time intervals, or at regular coordinate intervals based on the coordinates obtained from the coordinate acquisition unit.
  • the control unit 29 transmits the sampling trigger signal to the AD conversion unit 22 simultaneously with the transmission of the Qsw trigger signal.
  • the sampling trigger signal serves as a cue for the start timing of the photoacoustic signal sampling in the AD conversion means 22. As described above, by using the sampling trigger signal, it is possible to sample the photoacoustic signal in synchronization with the output of the laser beam.
  • The control means 29 acquires the coordinates of the probe 11 (more precisely, the coordinates of the element region, taking into account the distance between the magnetic sensor and the detection element array 20a) from the coordinate acquisition control unit 15 simultaneously with the transmission of the Qsw trigger signal, and sets, based on these coordinates, the representative coordinates of the detection region in which the photoacoustic wave is being detected at that time. That is, the control means 29 corresponds to the coordinate setting unit in the present invention. This makes it possible to synchronize the three timings of laser light output, photoacoustic wave detection for each detection region, and coordinate setting.
  • Note that the control means 29 instructs the coordinate acquisition unit to acquire the coordinates before transmitting the Qsw trigger signal.
  • the representative coordinates are coordinates representative of the spatial position of the detection area, and are set for each detection area as described above. Information on the set representative coordinates is transmitted to the reception memory 23 and stored in association with the photoacoustic signal obtained in the detection area. The representative coordinates are determined based on the coordinates of the acoustic detection unit 20 acquired by the coordinate acquisition unit when detecting the photoacoustic wave.
  • The representative coordinates may be calculated coordinates derived from a plurality of coordinates acquired while the photoacoustic wave is detected in a certain detection region (for example, an average value, weighted average value, median value, or mode), or may be one of the coordinates acquired during that period, used as-is. In the present embodiment the latter is adopted; the former is described in detail in the third embodiment.
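The "calculated coordinates" variant can be sketched with a small reducer over the coordinate samples collected while one detection region was active. The helper name is hypothetical; the mean and median shown are two of the aggregation rules the text lists.

```python
import statistics

def calculated_coordinate(samples, reducer=statistics.mean):
    # samples: list of (x, y, z) tuples acquired during one detection period.
    # The reducer is applied per axis to yield a single representative tuple.
    return tuple(reducer(axis) for axis in zip(*samples))

# Three readings taken while the probe moved along x during one detection period:
samples = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (2.0, 0.0, 0.0)]
rep_mean = calculated_coordinate(samples)                       # mean per axis
rep_median = calculated_coordinate(samples, statistics.median)  # median per axis
```

The "use one coordinate as-is" variant adopted in this embodiment corresponds to simply picking one element of `samples` instead of reducing them.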
  • The control means 29 can be configured to start transmitting the Qsw trigger signal when a predetermined switch provided on the probe 11 is pressed. With this configuration, the position of the probe 11 when the switch is pressed can be treated as the starting point of the probe scan; likewise, if transmission of the Qsw trigger signal ends when the switch is pressed again, the position of the probe 11 at that time can be treated as the end point of the scan.
  • the receiving circuit 21 receives the photoacoustic signal detected by the probe 11.
  • the photoacoustic signal received by the receiving circuit 21 is transmitted to the AD conversion means 22.
  • the AD conversion means 22 is a sampling means, which samples the photoacoustic signal received by the receiving circuit 21 and converts it into a digital signal.
  • the AD conversion unit 22 includes a sampling control unit and an AD converter.
  • the reception signal received by the reception circuit 21 is converted into a sampling signal digitized by an AD converter.
  • the AD converter is controlled by a sampling control unit, and is configured to perform sampling when the sampling control unit receives a sampling trigger signal.
  • The AD conversion means 22 samples the received signal at a predetermined sampling period based on, for example, an AD clock signal of a predetermined frequency input from the outside.
  • the reception memory 23 stores the photoacoustic signal (that is, the sampling signal) sampled by the AD conversion means 22 and the representative coordinate information transmitted from the control means 29 in association with each other. Then, the reception memory 23 outputs the photoacoustic signal detected by the probe 11 to the photoacoustic image reconstruction unit 24.
  • The photoacoustic image reconstruction means 24 sequentially reads out the photoacoustic signals obtained for each detection region from the reception memory 23 and, based on these signals, generates the signal data of each line of the partial image data for displaying that detection region. Specifically, the photoacoustic image reconstruction means 24 adds the 64-channel data obtained for each detection region with delay times corresponding to the positions of the acoustic detection elements to generate signal data for one line (delay-and-sum method). Instead of the delay-and-sum method, the photoacoustic image reconstruction means 24 may perform reconstruction by the CBP (circular back projection) method, or by the Hough transform or Fourier transform method.
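The delay-and-sum step can be illustrated with a minimal sketch. The element count, delay values, and synthetic pulse below are illustrative assumptions; in practice the delays are computed from the acoustic detection element positions and the speed of sound.

```python
import numpy as np

def delay_and_sum(channels: np.ndarray, delays: np.ndarray, line_len: int):
    # channels: (n_ch, n_samples) sampled photoacoustic signals;
    # delays: per-channel delay in whole samples. Each channel is shifted by
    # its delay so that the wavefront aligns, then the channels are summed
    # into one line of signal data.
    line = np.zeros(line_len)
    for ch, d in zip(channels, delays):
        line += ch[d:d + line_len]
    return line

n_ch, n_samples = 4, 32
delays = np.array([3, 1, 0, 2])
channels = np.zeros((n_ch, n_samples))
for i, d in enumerate(delays):
    channels[i, d + 10] = 1.0   # the same wavefront arrives later on farther elements
line = delay_and_sum(channels, delays, 16)
# After alignment, all four unit pulses coincide at sample 10.
```

The constructive sum at the aligned sample is what turns 64 channels of raw data into one focused line in the actual apparatus.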
  • the detection / logarithm conversion means 27 obtains an envelope of the signal data of each line, and logarithmically converts the obtained envelope.
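The envelope detection and logarithmic conversion can be sketched with NumPy alone. The FFT-based analytic signal below is a stand-in for a library Hilbert transform, and the 60 dB dynamic range is an assumed display parameter, not a value from the patent.

```python
import numpy as np

def analytic_signal(x):
    # FFT-based analytic signal: zero the negative frequencies and double
    # the positive ones, so the magnitude of the result is the envelope.
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    h[1:(n + 1) // 2] = 2.0
    if n % 2 == 0:
        h[n // 2] = 1.0
    return np.fft.ifft(X * h)

def log_envelope(x, dynamic_range_db=60.0):
    # Envelope of each line, then logarithmic compression relative to the peak.
    env = np.abs(analytic_signal(x))
    floor = 10 ** (-dynamic_range_db / 20)
    db = 20.0 * np.log10(np.maximum(env / env.max(), floor))
    return env, db

t = np.arange(1024)
x = np.cos(2 * np.pi * 64 * t / 1024)   # tone with an integer number of cycles
env, db = log_envelope(x)               # envelope is flat at 1, i.e. 0 dB
```

A pure tone is used so the envelope is exactly flat; for real photoacoustic lines the envelope traces the pulse amplitudes before log compression.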
  • the photoacoustic image construction means 28 constructs photoacoustic image data based on the signal data of each line subjected to logarithmic transformation. That is, the photoacoustic image construction unit 28 converts the signal data of each line into image data, and generates partial image data for each detection region.
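The detection/logarithmic conversion step (envelope followed by log compression) can be sketched as below. The Hilbert-transform envelope and the 60 dB dynamic range are assumed typical choices, not values taken from the patent.

```python
import numpy as np
from scipy.signal import hilbert

def envelope_log(line, dynamic_range_db=60.0):
    """Envelope detection followed by logarithmic compression.
    Returns values in [0, 1] spanning `dynamic_range_db` decibels."""
    env = np.abs(hilbert(line))              # envelope of the RF line
    env = env / (env.max() + 1e-12)          # normalize so the peak is 0 dB
    db = 20.0 * np.log10(env + 1e-12)        # convert to decibels
    return np.clip(db / dynamic_range_db + 1.0, 0.0, 1.0)

# A decaying sine burst: the envelope follows the decay, and the
# log-compressed output reaches 1 where the signal is strongest.
t = np.linspace(0, 1, 1000)
rf = np.sin(2 * np.pi * 50 * t) * np.exp(-5 * t)
img_line = envelope_log(rf)
```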
  • the reconstruction of the signal data of each line that is the basis of the partial image data is performed based only on the photoacoustic signal obtained in the detection region related to the partial image data. That is, the generation of partial image data for displaying a certain detection area is performed based only on the photoacoustic signal obtained in the detection area.
  • FIG. 3 shows the case where the partial image data IMa for displaying a detection region is generated from the 64-ch photoacoustic signal Sa obtained in a predetermined detection region by the element region A, and partial image data IMb is likewise generated from the 64-ch photoacoustic signal obtained in the other detection region by the element region B.
  • In FIG. 3, illustration of the reconstruction process in generating the partial image data of B of FIG. 3 from the photoacoustic signal of A of FIG. 3 is omitted.
  • FIG. 4 is a conceptual diagram showing partial image data stored in volume data.
  • the partial image data is arranged in order for each set of detection regions R corresponding to the imaging region.
  • as shown in FIG. 4, even within one set of detection regions, the partial image data IMb related to the detection region facing the element region B is stored in the volume data while being shifted in the scanning direction of the probe 11 relative to the partial image data IMa related to the detection region facing the element region A.
  • here, a set of detection regions corresponding to the imaging region means a combination of detection regions related to different element regions.
  • the set of detection regions R is a combination of a detection region corresponding to the element region A and a detection region corresponding to the element region B.
  • the photoacoustic image construction means 28 constructs a photoacoustic image by converting, for example, a position in the time axis direction of the photoacoustic signal (peak portion) into a position in the depth direction in the photoacoustic image.
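The conversion from the time-axis position of a photoacoustic signal peak to a depth position is, in essence, multiplication by the speed of sound (one-way travel for photoacoustic waves). The sound speed and sampling rate below are assumed typical values, not figures stated in the patent.

```python
SOUND_SPEED_M_S = 1540.0      # assumed soft-tissue sound speed
SAMPLING_RATE_HZ = 40e6       # assumed AD sampling rate

def sample_index_to_depth_mm(index):
    """Depth of a photoacoustic source whose signal peaks at sample `index`.
    Photoacoustic waves travel one way (source -> probe), so no factor 1/2."""
    t = index / SAMPLING_RATE_HZ          # arrival time in seconds
    return SOUND_SPEED_M_S * t * 1e3      # depth in millimetres

# Under these assumptions a peak at sample 520 corresponds to ~20 mm depth.
depth = sample_index_to_depth_mm(520)
```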
  • the observation method selection means 39 is for selecting the display mode of the photoacoustic image.
  • Examples of the volume data display mode for the photoacoustic signal include a mode as a three-dimensional image, a mode as a cross-sectional image, and a mode as a graph on a predetermined axis.
  • the display mode is selected according to the initial setting or the input from the input unit 16 by the user.
  • the image composition unit 38 performs necessary processing (for example, scale correction and coloring according to the voxel value) on the generated volume data.
  • the photoacoustic image data generated according to the selected observation method is the final image (display image) to be displayed on the display means 14.
  • with the photoacoustic image data generation method described above, it is naturally possible for the user to rotate or move the image as necessary after the photoacoustic image data has once been generated.
  • FIG. 5 is a flowchart showing the steps of the photoacoustic image generation method of the present embodiment.
  • FIG. 6 is a timing chart for laser beam emission, photoacoustic signal detection, and coordinate setting in this embodiment.
  • LT is the laser beam emission timing (repetition frequency 15 Hz, that is, a repetition period of approximately 67 ms)
  • AT1, AT2, ..., ATn are the photoacoustic wave detection timings and detection periods in the element region A
  • BT1, BT2, ..., BTn are the photoacoustic wave detection timings and detection periods in the element region B
  • PT is the coordinate acquisition timing by the coordinate acquisition unit.
  • the acoustic detection element group is switched to the elements belonging to the element region A in the detection element array 20a (STEPs 10 and 11)
  • the above procedure is repeated until the scanning of the probe 11 is completed, and when the scanning of the probe 11 is completed, the detection of the photoacoustic wave is completed.
  • the end of scanning of the probe 11 can be determined based on, for example, the fact that a predetermined switch of the probe 11 has been pressed, or the fact that the scanning speed of the probe 11, detected automatically, has become zero.
  • as described above, in this embodiment the photoacoustic wave is detected for each detection region, representative coordinates are set, and the volume data is generated by associating the partial image data for displaying each detection region with the representative coordinates set for that detection region. Therefore, the accuracy of the position of the photoacoustic image data in the volume data can be improved compared with simply arranging, one after another, the frames of the photoacoustic image generated based on the photoacoustic signals obtained for each detection region (that is, at different times). This is because, according to the present invention, accurate volume data can be generated by reflecting the shift of each detection region caused by the scanning of the probe 11 (for example, the shift shown in FIG. 4).
  • in this embodiment, when generating partial image data for displaying a certain detection region, the ultrasonic unit (acoustic signal processing unit) uses both the photoacoustic signal obtained in that detection region and the photoacoustic signal obtained in another detection region.
  • FIG. 7 is a conceptual diagram illustrating a partial image data generation process in the present embodiment.
  • the photoacoustic image generation apparatus 10 of this embodiment also includes a probe 11, an ultrasonic unit 12, a laser unit 13, a display unit 14, a coordinate acquisition unit (15, 41 and 42), and an input unit 16.
  • the ultrasonic unit 12 includes a reception circuit 21, an AD conversion unit 22, a reception memory 23, a photoacoustic image reconstruction unit 24, a detection/logarithm conversion unit 27, a photoacoustic image construction unit 28, a control unit 29, an image synthesis unit 38, and an observation method selection means 39. The ultrasonic unit 12 generates partial image data for displaying a certain detection region based on the photoacoustic signal obtained in that detection region and the photoacoustic signal obtained in another detection region.
  • in this embodiment, the photoacoustic image reconstruction means 24 does not reconstruct the photoacoustic signal until detection of the photoacoustic wave has been completed for the entire detection region.
  • when the photoacoustic signals Sa and Sb have been obtained for a whole set of detection regions (the detection region facing the element region A and the detection region facing the element region B; A in FIG. 7), the photoacoustic image reconstruction means 24 arranges these photoacoustic signals side by side in the order of the element regions and combines them into one (B in FIG. 7).
  • the photoacoustic image reconstruction means 24 then reconstructs the photoacoustic signal using the entire collected photoacoustic signal (128 ch) so as to generate the signal data that forms the basis of the photoacoustic image for one frame.
  • for example, when the photoacoustic signal Sa portion occupies channels 1 to 64 and the photoacoustic signal Sb portion occupies channels 65 to 128 of the total photoacoustic signal (128 ch), reconstruction is performed with channel combinations such as 1 to 64 ch, 2 to 65 ch, 3 to 66 ch, ..., 65 to 128 ch.
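The sliding 64-channel combinations can be enumerated as below; this sketch only lists the channel windows, the reconstruction per window being the delay addition described earlier. The channel counts match the 128-ch/64-ch example above.

```python
def sliding_channel_windows(total_ch=128, aperture=64):
    """Enumerate the channel combinations 1-64, 2-65, ..., 65-128
    (1-based, inclusive); each window yields one reconstructed line."""
    return [(start, start + aperture - 1)
            for start in range(1, total_ch - aperture + 2)]

windows = sliding_channel_windows()
```

For 128 total channels and a 64-channel aperture this yields 65 windows, i.e. 65 reconstructed lines spanning the combined detection regions.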
  • in FIG. 7, illustration of the reconstruction process when generating the photoacoustic image data of C of FIG. 7 from the photoacoustic signal of B of FIG. 7 is omitted.
  • the photoacoustic image reconstruction unit 24 transmits the signal data of each line obtained by these reconstructions to the detection / logarithm conversion unit 27.
  • the photoacoustic image construction means 28 generates the photoacoustic image data IM for one frame based on the signal data received from the detection/logarithm conversion means 27 (C in FIG. 7). However, since the photoacoustic image construction means 28 processes and manages the partial image data IMa and IMb for each detection region, when storing the image data in the volume data it divides the one-frame photoacoustic image data IM into the partial image data IMa and IMb (D in FIG. 7). Thereby, as in the first embodiment, the volume data can be generated by associating the partial image data for displaying each detection region with the representative coordinates set for that detection region.
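Dividing the one-frame image IM back into per-region partial data IMa and IMb before storage might look like the following; the array shapes (128 lines of 256 depth samples, two equal regions) are hypothetical.

```python
import numpy as np

def split_frame(frame, n_regions=2):
    """Split a one-frame image (lines x depth samples) into equal
    per-region partial images so each can be stored at its own
    representative coordinates in the volume data."""
    return np.split(frame, n_regions, axis=0)

frame = np.arange(128 * 256).reshape(128, 256)  # one-frame image IM
im_a, im_b = split_frame(frame)                 # partial data IMa, IMb
```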
  • FIG. 8 is a flowchart showing the steps of the photoacoustic image generation method of the present embodiment. Note that the timing chart regarding the emission of laser light, the detection of photoacoustic signals, and the setting of coordinates in this embodiment is the same as that in FIG.
  • the photoacoustic signal detected in the first detection region and the representative coordinate are associated and stored in the memory (STEP 25).
  • Detection of photoacoustic waves used to generate photoacoustic image data for a frame is started (STEPs 31, 32, 22 and 23). The above procedure is repeated until the scanning of the probe 11 is completed, and when the scanning of the probe 11 is completed, the detection of the photoacoustic wave is completed.
  • also in this embodiment, the photoacoustic wave is detected for each detection region, representative coordinates are set, and the volume data is generated by associating the partial image data for displaying each detection region with the representative coordinates set for that detection region; therefore, the same effect as in the first embodiment can be obtained.
  • this embodiment differs from the first embodiment in that the control unit 29 sets, as the representative coordinates, calculated coordinates computed using a plurality of coordinates acquired while the photoacoustic wave is detected in a certain detection region. Therefore, a detailed description of the same components as those in the first embodiment is omitted unless particularly necessary.
  • FIG. 9 is a timing chart for laser beam emission, photoacoustic signal detection, and coordinate setting in the present embodiment.
  • the photoacoustic image generation apparatus 10 of this embodiment also includes a probe 11, an ultrasonic unit 12, a laser unit 13, a display unit 14, a coordinate acquisition unit (15, 41 and 42), and an input unit 16.
  • based on the coordinates acquired at timings p1 to p4 within the detection period AT1 for detecting the photoacoustic wave in the element region A, the control unit 29 calculates, for example, the average value, weighted average value, median value, or mode value of these coordinates, and sets the calculated value (calculated coordinates) as the representative coordinates.
  • the control unit 29 may also obtain the calculated coordinates using coordinates acquired immediately before or after the period in which the photoacoustic wave is detected in a certain detection region (for example, the timing p0 in FIG. 9).
  • in this way, noise in the coordinate information is further removed, and the accuracy of matching between the representative coordinates and the actual position of the detection region is further improved.
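Reducing the coordinates sampled at p1..p4 to one representative coordinate per axis could be sketched as follows; the (x, y) probe positions are invented for illustration, and the patent's weighted average variant is omitted for brevity.

```python
import statistics

def representative_coordinate(samples, method="mean"):
    """Reduce the coordinates acquired at timings p1..pn during one
    detection period to a single representative coordinate per axis."""
    reducers = {
        "mean": statistics.fmean,
        "median": statistics.median,
        "mode": statistics.mode,
    }
    reduce = reducers[method]
    # samples: list of (x, y) probe positions; reduce each axis separately
    xs, ys = zip(*samples)
    return (reduce(xs), reduce(ys))

# Four hypothetical positions acquired while the probe scans in x:
coords = [(10.0, 0.0), (10.4, 0.0), (10.6, 0.1), (11.0, -0.1)]
rep = representative_coordinate(coords, "mean")
```

Averaging over several samples suppresses jitter in any single position reading, which is the noise-removal effect described above.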
  • the present invention can be applied both to the case where the ultrasonic unit (acoustic signal processing unit) generates partial image data for displaying a certain detection region based only on the photoacoustic signal obtained in that detection region, and to the case where it generates the partial image data based on the photoacoustic signals obtained in that detection region and in the other detection regions.
  • FIG. 10 is a block diagram illustrating a configuration of the photoacoustic image generation apparatus 10 of the present embodiment.
  • This embodiment is different from the first embodiment in that an ultrasonic image is generated in addition to the photoacoustic image. Therefore, a detailed description of the same components as those in the first embodiment will be omitted unless particularly necessary.
  • the photoacoustic image generation apparatus 10 of this embodiment also includes a probe 11, an ultrasonic unit 12, a laser unit 13, a display unit 14, a coordinate acquisition unit (15, 41 and 42), and an input unit 16.
  • the ultrasonic unit 12 of the present embodiment additionally includes a transmission control circuit 33, a data separation means 34, an ultrasonic image reconstruction means 35, a detection/logarithm conversion means 36, and an ultrasonic image construction means 37.
  • the probe 11 performs output (transmission) of ultrasonic waves to the subject and detection (reception) of reflected ultrasonic waves from the subject with respect to the transmitted ultrasonic waves.
  • as the acoustic detection element that transmits and receives ultrasonic waves, the acoustic detection element array described above may be used, or a new acoustic detection element array provided separately in the probe 11 for ultrasonic transmission and reception may be used.
  • transmission and reception of ultrasonic waves may be separated. For example, ultrasonic waves may be transmitted from a position different from the probe 11, and reflected ultrasonic waves with respect to the transmitted ultrasonic waves may be received by the probe 11.
  • the trigger control circuit 30 sends an ultrasonic transmission trigger signal for instructing ultrasonic transmission to the transmission control circuit 33 when generating an ultrasonic image.
  • upon receiving this trigger signal, the transmission control circuit 33 transmits an ultrasonic wave from the probe 11.
  • the probe 11 detects the reflected ultrasonic wave from the subject after transmitting the ultrasonic wave.
  • the reflected ultrasonic waves detected by the probe 11 are input to the AD conversion means 22 via the receiving circuit 21.
  • the trigger control circuit 30 sends a sampling trigger signal to the AD conversion means 22 in synchronization with the timing of ultrasonic transmission, and starts sampling of reflected ultrasonic waves.
  • the reflected ultrasonic waves travel back and forth between the probe 11 and the ultrasonic reflection position, whereas the photoacoustic signal travels one way from its generation position to the probe 11. Since detection of the reflected ultrasonic wave therefore takes twice as long as detection of the photoacoustic signal generated at the same depth position, the sampling clock of the AD conversion means 22 may be half that used when sampling the photoacoustic signal, for example 20 MHz.
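The factor-of-two relation between reflected-ultrasound and photoacoustic detection times can be checked with a small time-of-flight calculation; the 1540 m/s sound speed and 30 mm depth are assumed example values.

```python
SOUND_SPEED_M_S = 1540.0  # assumed tissue sound speed

def photoacoustic_arrival_s(depth_m):
    """One-way travel: the wave is generated at depth and travels to the probe."""
    return depth_m / SOUND_SPEED_M_S

def reflected_us_arrival_s(depth_m):
    """Round trip: the transmitted pulse reaches the reflector and returns."""
    return 2.0 * depth_m / SOUND_SPEED_M_S

# For the same 30 mm depth the echo takes exactly twice as long, which is
# why half the sampling clock covers the same depth range.
t_pa = photoacoustic_arrival_s(0.03)
t_us = reflected_us_arrival_s(0.03)
```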
  • the AD conversion means 22 stores the reflected ultrasonic sampling signal in the reception memory 23. Either sampling of the photoacoustic signal or sampling of the reflected ultrasonic wave may be performed first.
  • the data separating means 34 separates the photoacoustic signal sampling signal and the reflected ultrasonic sampling signal stored in the reception memory 23.
  • the data separation unit 34 inputs a sampling signal of the separated photoacoustic signal to the photoacoustic image reconstruction unit 24.
  • the generation of the photoacoustic image is the same as that in the first embodiment.
  • the data separation unit 34 inputs the separated reflected ultrasound sampling signal to the ultrasound image reconstruction unit 35.
  • the ultrasonic image reconstruction means 35 generates data of each line of the ultrasonic image based on the reflected ultrasonic waves (its sampling signals) detected by the plurality of acoustic detection elements of the probe 11. For the generation of the data of each line, a delay addition method or the like can be used as in the generation of the data of each line in the photoacoustic image reconstruction means 24.
  • the detection / logarithm conversion means 36 obtains the envelope of the data of each line output from the ultrasonic image reconstruction means 35 and logarithmically transforms the obtained envelope.
  • the ultrasonic image construction means 37 generates an ultrasonic image based on the data of each line subjected to logarithmic transformation.
  • the image synthesis means 38 synthesizes, for example, a photoacoustic image and an ultrasonic image.
  • the image composition unit 38 performs image composition by superimposing a photoacoustic image and an ultrasonic image, for example.
  • the synthesized image is displayed on the display means 14. It is also possible to display the photoacoustic image and the ultrasonic image side by side on the display unit 14 without performing image synthesis, or to switch between the photoacoustic image and the ultrasonic image.
  • also in this embodiment, the photoacoustic wave is detected for each detection region, representative coordinates are set, and the volume data is generated by associating the partial image data for displaying each detection region with the representative coordinates set for that detection region; therefore, the same effect as in the first embodiment can be obtained.
  • the photoacoustic measurement device of the present embodiment generates an ultrasonic image in addition to the photoacoustic image. Therefore, by referring to the ultrasonic image, a portion that cannot be imaged in the photoacoustic image can be observed.

Landscapes

  • Life Sciences & Earth Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Molecular Biology (AREA)
  • Pathology (AREA)
  • Engineering & Computer Science (AREA)
  • Biomedical Technology (AREA)
  • Acoustics & Sound (AREA)
  • Medical Informatics (AREA)
  • Biophysics (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Ultrasonic Diagnosis Equipment (AREA)
  • Investigating Or Analyzing Materials By The Use Of Ultrasonic Waves (AREA)

Abstract

The purpose of the invention is to make it possible to depict the internal structure of a subject with greater accuracy even when volume data is generated based on detection of photoacoustic signals performed in multiple steps. To this end, the photoacoustic image generation device (10) comprises: a detection unit (20) which comprises a plurality of acoustic detection elements (20c), divides an imaging region into multiple detection regions (AT1 and BT2), and detects photoacoustic waves for each detection region (AT1 and BT2); a coordinate acquisition unit which acquires coordinates; a control means (29) which sets, for each detection region, representative coordinates representing the coordinates of that detection region based on the coordinates obtained from the acoustic detection unit (20) during detection of the photoacoustic waves; and an acoustic signal processing unit (12) which associates the respective partial image data (IMa and IMb) representing each detection region with the representative coordinates set for each detection region, and generates volume data.
PCT/JP2013/005497 2012-09-28 2013-09-18 Dispositif de génération d'image photo-acoustique, et procédé de génération d'image photo-acoustique WO2014050020A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2012-215434 2012-09-28
JP2012215434A JP2014068701A (ja) 2012-09-28 2012-09-28 光音響画像生成装置および光音響画像生成方法

Publications (1)

Publication Number Publication Date
WO2014050020A1 true WO2014050020A1 (fr) 2014-04-03

Family

ID=50387463

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2013/005497 WO2014050020A1 (fr) 2012-09-28 2013-09-18 Dispositif de génération d'image photo-acoustique, et procédé de génération d'image photo-acoustique

Country Status (2)

Country Link
JP (1) JP2014068701A (fr)
WO (1) WO2014050020A1 (fr)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2016070115A1 (fr) * 2014-10-30 2016-05-06 Seno Medical Instruments, Inc. Système d'imagerie opto-acoustique avec détection de l'orientation relative de source de lumière et de récepteur acoustique au moyen d'ondes acoustiques
WO2017138541A1 (fr) * 2016-02-08 2017-08-17 Canon Kabushiki Kaisha Appareil d'acquisition d'informations et procédé d'affichage
CN114636672A (zh) * 2022-05-11 2022-06-17 之江实验室 一种光声超声复用的采集系统及方法

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6847234B2 (ja) * 2017-08-24 2021-03-24 富士フイルム株式会社 光音響画像生成装置
JP7020954B2 (ja) 2018-02-13 2022-02-16 キヤノン株式会社 画像処理装置、画像処理方法、プログラム

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH10295691A (ja) * 1997-04-25 1998-11-10 Aloka Co Ltd 超音波診断装置
JP2012005623A (ja) * 2010-06-24 2012-01-12 Fujifilm Corp 生体情報画像化装置及び方法
JP2012135610A (ja) * 2010-12-10 2012-07-19 Fujifilm Corp 光音響検査用探触子および光音響検査装置

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH10295691A (ja) * 1997-04-25 1998-11-10 Aloka Co Ltd 超音波診断装置
JP2012005623A (ja) * 2010-06-24 2012-01-12 Fujifilm Corp 生体情報画像化装置及び方法
JP2012135610A (ja) * 2010-12-10 2012-07-19 Fujifilm Corp 光音響検査用探触子および光音響検査装置

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2016070115A1 (fr) * 2014-10-30 2016-05-06 Seno Medical Instruments, Inc. Système d'imagerie opto-acoustique avec détection de l'orientation relative de source de lumière et de récepteur acoustique au moyen d'ondes acoustiques
US10539675B2 (en) 2014-10-30 2020-01-21 Seno Medical Instruments, Inc. Opto-acoustic imaging system with detection of relative orientation of light source and acoustic receiver using acoustic waves
WO2017138541A1 (fr) * 2016-02-08 2017-08-17 Canon Kabushiki Kaisha Appareil d'acquisition d'informations et procédé d'affichage
CN114636672A (zh) * 2022-05-11 2022-06-17 之江实验室 一种光声超声复用的采集系统及方法

Also Published As

Publication number Publication date
JP2014068701A (ja) 2014-04-21

Similar Documents

Publication Publication Date Title
JP5779169B2 (ja) 音響画像生成装置およびそれを用いて画像を生成する際の進捗状況の表示方法
JP5448918B2 (ja) 生体情報処理装置
JP5655021B2 (ja) 光音響画像化方法および装置
JP6525565B2 (ja) 被検体情報取得装置および被検体情報取得方法
JP5626903B2 (ja) カテーテル型の光音響プローブおよびそれを備えた光音響撮像装置
WO2014050020A1 (fr) Dispositif de génération d'image photo-acoustique, et procédé de génération d'image photo-acoustique
JP6177530B2 (ja) ドプラ計測装置およびドプラ計測方法
JP5694991B2 (ja) 光音響画像化方法および装置
JP6545190B2 (ja) 光音響装置、信号処理装置、信号処理方法、プログラム
JP6289050B2 (ja) 被検体情報取得装置、情報処理方法、およびプログラム
JP5683383B2 (ja) 光音響撮像装置およびその作動方法
JP5936559B2 (ja) 光音響画像生成装置および光音響画像生成方法
JP6742745B2 (ja) 情報取得装置および表示方法
JP6742734B2 (ja) 被検体情報取得装置および信号処理方法
WO2013080539A1 (fr) Dispositif générateur d'image photo-acoustique et procédé de générateur d'image photo-acoustique
WO2012114695A1 (fr) Dispositif de génération d'image photoacoustique
JP2002257803A (ja) 超音波撮像方法及び超音波撮像装置
CN118019497A (zh) 图像生成方法、图像生成程序以及图像生成装置
JP5722182B2 (ja) 光音響撮像装置および光音響撮像方法
JP2014161428A (ja) 光音響計測装置および光音響計測方法
JP2013172810A (ja) 光音響画像処理装置、及び方法
JP7077384B2 (ja) 被検体情報取得装置
JP5868458B2 (ja) 測定装置
JP2014023680A (ja) 被検体情報取得装置およびその制御方法ならびに提示方法
JP2014073411A (ja) 被検体情報処理装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13842303

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 13842303

Country of ref document: EP

Kind code of ref document: A1