US20140182383A1 - Object information obtaining device, display method, and non-transitory computer-readable storage medium - Google Patents

Object information obtaining device, display method, and non-transitory computer-readable storage medium

Info

Publication number
US20140182383A1
US20140182383A1 (application Ser. No. 14/134,957)
Authority
US
United States
Prior art keywords
processing
object information
unit
obtaining device
types
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US14/134,957
Other languages
English (en)
Inventor
Koichi Suzuki
Hiroshi Abe
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Canon Inc
Original Assignee
Canon Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Canon Inc filed Critical Canon Inc
Assigned to CANON KABUSHIKI KAISHA reassignment CANON KABUSHIKI KAISHA ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: SUZUKI, KOICHI, ABE, HIROSHI
Publication of US20140182383A1 publication Critical patent/US20140182383A1/en
Priority to US15/700,996 priority Critical patent/US10429233B2/en
Priority to US16/543,419 priority patent/US20190368920A1/en
Current legal status: Abandoned


Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01H: MEASUREMENT OF MECHANICAL VIBRATIONS OR ULTRASONIC, SONIC OR INFRASONIC WAVES
    • G01H9/00: Measuring mechanical vibrations or ultrasonic, sonic or infrasonic waves by using radiation-sensitive means, e.g. optical means
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B5/0093: Detecting, measuring or recording by applying one single type of energy and measuring its conversion into another type of energy
    • A61B5/0095: Detecting, measuring or recording by applying one single type of energy and measuring its conversion into another type of energy by applying light and detecting acoustic waves, i.e. photoacoustic measurements
    • G: PHYSICS
    • G06: COMPUTING OR CALCULATING; COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/00: 2D [Two Dimensional] image generation
    • G06T11/003: Reconstruction from projections, e.g. tomography
    • G: PHYSICS
    • G06: COMPUTING OR CALCULATING; COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00: Image enhancement or restoration
    • G06T5/10: Image enhancement or restoration using non-spatial domain filtering
    • G: PHYSICS
    • G16: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H: HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H30/00: ICT specially adapted for the handling or processing of medical images
    • G: PHYSICS
    • G06: COMPUTING OR CALCULATING; COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00: Indexing scheme for image analysis or image enhancement
    • G06T2207/20: Special algorithmic details
    • G06T2207/20048: Transform domain processing
    • G06T2207/20056: Discrete and fast Fourier transform, [DFT, FFT]
    • G: PHYSICS
    • G06: COMPUTING OR CALCULATING; COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2210/00: Indexing scheme for image generation or computer graphics
    • G06T2210/41: Medical

Definitions

  • the present invention relates to technology to obtain object information based on a photoacoustic wave generated by irradiation of light to an object.
  • Photoacoustic imaging is an optical imaging technique developed based on the photoacoustic effect.
  • In photoacoustic imaging, for example, an object such as a living body is irradiated with pulsed light, and a light absorber such as a blood vessel absorbs energy of the pulsed light to generate a photoacoustic wave.
  • An acoustic wave detecting unit detects the photoacoustic wave generated by the photoacoustic effect. Then, a detection signal output from the acoustic wave detecting unit is analyzed by image processing, for example, and object information is obtained.
  • UBP processing refers to universal back-projection reconstruction processing.
  • An object information obtaining device disclosed in this specification is provided with a light source configured to emit light, an acoustic wave detecting unit configured to detect a photoacoustic wave generated by irradiation of an object with the light, and to output an electric signal in response to detection of the acoustic wave, and a processing unit configured to perform two or more types of processing to photoacoustic signal data based on the electric signal to obtain object information corresponding to each of the two or more types of processing, and to display on a display unit the object information corresponding to at least one processing selected by a user out of the two or more types of processing.
  • FIG. 1 is a view illustrating an object information obtaining device according to this embodiment.
  • FIG. 2 is a view illustrating a processing unit according to this embodiment in detail.
  • FIG. 3 is a view illustrating a flow of a method of obtaining object information according to this embodiment.
  • FIG. 4A is a view illustrating a simulation model according to this embodiment.
  • FIG. 4B is a view illustrating a simulation result of a Fourier domain reconstruction processing according to this embodiment.
  • FIG. 4C is a view illustrating a simulation result of a time domain reconstruction processing according to this embodiment.
  • FIG. 4D is a view illustrating a simulation result of a model base reconstruction processing according to this embodiment.
  • FIG. 5 is a view illustrating a flow of a method of obtaining object information according to Example 1 of the present invention.
  • FIG. 6 is a view illustrating a processing unit according to Example 1 of the present invention in detail.
  • FIG. 7 is a view illustrating a screen displayed on a display according to Example 1 of the present invention.
  • FIG. 8 is a view illustrating a flow of a method of obtaining object information according to Example 2 of the present invention.
  • FIG. 9 is a view illustrating a screen displayed on a display according to Example 2 of the present invention.
  • Object information includes initial sound pressure of a photoacoustic wave generated by a photoacoustic effect, optical energy absorption density derived from the initial sound pressure, an absorption coefficient, density of a substance forming tissue and the like.
  • density of a substance may be determined by levels of oxygen saturation, oxyhemoglobin density, deoxyhemoglobin density, total hemoglobin density and the like.
  • the total hemoglobin density is a sum of the oxyhemoglobin density and the deoxyhemoglobin density.
  • the object information in this embodiment may be not numerical data but distribution information of each position in an object. That is to say, the distribution information such as absorption coefficient distribution and oxygen saturation distribution may be used as the object information.
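  • As a concrete illustration of these relationships, the following minimal Python sketch (the array names are illustrative, not from the patent) derives a total hemoglobin density map and an oxygen saturation map from oxyhemoglobin and deoxyhemoglobin density distributions:

```python
import numpy as np

# Hypothetical voxel maps of oxy- and deoxyhemoglobin density (arbitrary units).
hbo2 = np.random.rand(64, 64, 64)  # oxyhemoglobin density distribution
hb = np.random.rand(64, 64, 64)    # deoxyhemoglobin density distribution

# The total hemoglobin density is the sum of the two densities.
hb_total = hbo2 + hb

# Oxygen saturation is the oxyhemoglobin fraction of the total hemoglobin.
so2 = np.divide(hbo2, hb_total, out=np.zeros_like(hbo2), where=hb_total > 0)
```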
  • Further improvement in the method of displaying the object information obtained only by specific processing (the UBP reconstruction processing disclosed in Non-Patent Document 1) is desired from a diagnostic viewpoint.
  • a real image corresponding to the object might be displayed in a different manner depending on a type of the processing. Therefore, usefulness in diagnosis of an observation object might be different depending on the type of the processing.
  • a virtual image referred to as an artifact might be present in a diagnostic image obtained through the reconstruction processing.
  • the artifact might preclude appropriate diagnosis.
  • Depending on the type of the processing, artifacts also appear differently in a reconstructed image.
  • At least one processing is selected by a user from two or more types of processing to photoacoustic signal data (also referred to as raw data).
  • The user may confirm the object information obtained by the desired processing, and may therefore selectively use the image corresponding to the processing determined to be useful for the symptom under diagnosis.
  • The user may also select the desired processing in consideration of the processing time acceptable to the user. That is to say, according to this embodiment, the user may select the object information corresponding to the desired processing that the user determines to be highly useful within the acceptable processing time.
  • A basic configuration of the object information obtaining device (information obtaining apparatus) according to this embodiment, illustrated in FIG. 1, is first described.
  • the object information obtaining device illustrated in FIG. 1 includes a light source 110 , an optical system 120 , an acoustic wave detecting unit 130 , a processing unit 140 as a computer, an input unit 150 , and a display unit 160 in order to obtain information of a living body 100 as the object.
  • FIG. 2 is a block diagram illustrating relevant parts of a computer, which is an example of a data processing apparatus including the processing unit 140 and peripheral elements of the processing unit 140 .
  • the processing unit 140 is provided with an arithmetic unit 141 and a storage unit 142 .
  • An example of the processing unit 140 includes, but is not limited to, a microprocessor chip, such as a CPU (central processing unit) or MPU (micro processing unit).
  • An example of the storage unit 142 includes, but is not limited to, RAM or ROM memory.
  • the arithmetic unit 141 controls operation of each component forming the object information obtaining device through a data network 200 .
  • The arithmetic unit 141 reads, from the storage unit 142, a program in which the processing steps of the method of obtaining object information described later are saved, and causes the object information obtaining device to execute the method of obtaining object information.
  • The light source 110 is preferably a pulse light source capable of emitting light pulses lasting from a few nanoseconds to a few microseconds. Specifically, the light source 110 is preferably capable of emitting light having a pulse width of approximately 10 nanoseconds in order to efficiently generate the photoacoustic wave.
  • a wavelength of the light which can be emitted by the light source 110 is desirably the wavelength at which the light propagates into the object. Specifically, when the object is a living body, such as a human or animal body, a preferable wavelength is not shorter than 500 nm and not longer than 1500 nm.
  • a laser or a light-emitting diode are examples of a light source that may be used in some embodiments disclosed herein.
  • As the laser, various lasers such as a solid-state laser, a gas laser, a dye laser, and a semiconductor laser may be used.
  • Examples of the laser used in this embodiment include an alexandrite laser, a yttrium-aluminum-garnet laser, a titanium-sapphire laser, and the like.
  • the light emitted from the light source 110 is typically guided to the living body 100 while being shaped into a desired light distribution shape by means of an optical component such as a lens and a mirror.
  • The optical components used to shape the light distribution include, for example, a mirror reflecting the light; a lens collecting or magnifying the light or changing its focal shape; a prism dispersing, refracting, and reflecting the light; an optical fiber propagating the light; a diffusion plate dispersing the light; and other like optical components or combinations thereof. Any type and number of such optical components may be used as long as the object is irradiated with the light emitted from the light source 110 in the desired manner.
  • When the light emitted by the light source 110 may be guided directly to the object as the desired light, it is not necessary to use the optical system 120.
  • the acoustic wave detecting unit 130 is provided with one or more opto-acoustic transducers and a housing enclosing the transducer(s).
  • An opto-acoustic transducer, as used herein, is an element capable of detecting an acoustic wave.
  • The transducer receives an acoustic wave, such as the photoacoustic wave or an ultrasonic echo, and transforms it into an electric signal (an analog signal). Any transducer may be used as long as it is configured to receive the acoustic wave. Examples of transducers include a transducer using a piezoelectric phenomenon, a transducer using optical resonance, a transducer using change in capacitance, and other like transducers.
  • the acoustic wave detecting unit 130 is preferably provided with a plurality of transducers arranged in an array.
  • the processing unit 140 is provided with the arithmetic unit 141 and the storage unit 142 as illustrated in FIG. 2 .
  • the arithmetic unit 141 is typically formed of an arithmetic element such as a CPU, a GPU, an A/D converter, a FPGA (field programmable gate array) card, and an ASIC (application specific integrated circuit) chip. Meanwhile, the arithmetic unit 141 may be formed not only of one arithmetic element but also of a plurality of arithmetic elements. Any arithmetic element may be used to perform the disclosed process.
  • the storage unit 142 is typically formed of a storage medium such as a ROM memory, a RAM memory, a hard disk drive, or a combination thereof. That is, the storage unit 142 may be formed not only of one storage medium but also of a plurality of storage media.
  • the arithmetic unit 141 may make a gain adjustment to increase or decrease an amplification gain according to time that elapses from irradiation of the light to arrival of the acoustic wave at the element of the acoustic wave detecting unit 130 in order to obtain the image having a uniform contrast regardless of a depth in the living body.
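  • A minimal sketch of such a depth-dependent gain adjustment is shown below; it assumes a gain that grows linearly in decibels with the time of flight, which is one common choice and is not specified by the patent:

```python
import numpy as np

def time_gain_compensation(signals, fs, attenuation_db_per_s):
    """Amplify each sample according to the time elapsed since light
    irradiation, so that late-arriving (deep) echoes are boosted and the
    displayed contrast becomes more uniform over depth.

    signals: (n_channels, n_samples) array of detected waveforms
    fs: sampling frequency in Hz
    attenuation_db_per_s: assumed attenuation rate to compensate
    """
    n_samples = signals.shape[-1]
    t = np.arange(n_samples) / fs       # elapsed time of each sample
    gain_db = attenuation_db_per_s * t  # gain growing linearly in dB
    gain = 10.0 ** (gain_db / 20.0)     # dB to amplitude factor
    return signals * gain               # broadcast over all channels
```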
  • the arithmetic unit 141 may control light emission timing of the pulsed light emitted from the light source 110 , and may also control operation start timing of the acoustic wave detecting unit 130 by using the pulsed light as a trigger signal.
  • the arithmetic unit 141 may control display operations of the display unit 160 .
  • the arithmetic unit 141 is preferably configured to simultaneously perform pipeline processing of a plurality of signals when a plurality of detecting signals is obtained from the acoustic wave detecting unit 130 . According to this, time that elapses before the object information is obtained may be shortened.
  • each processing operation performed by the processing unit 140 may be saved in the storage unit 142 as part of the program to be executed by the arithmetic unit 141 .
  • the storage unit 142 in which the program is saved is a non-transitory computer-readable recording medium.
  • the processing unit 140 and the acoustic wave detecting unit 130 may be provided as an integrated unit. Then, the processing unit provided on the acoustic wave detecting unit may perform a part of signal processing, and the processing unit provided outside the acoustic wave detecting unit may perform the remainder of signal processing. In this case, the processing unit provided on the acoustic wave detecting unit and the processing unit provided outside the acoustic wave detecting unit may be collectively referred to as the processing unit according to this embodiment.
  • the input unit 150 is a user interface (I/F) configured to accept an operation (e.g., input) by the user.
  • Information input by the user is input from the input unit 150 to the processing unit 140 .
  • A pointing device such as a mouse, a keyboard, a graphics tablet, and the like may be adopted as the input unit 150.
  • A mechanical device such as a button or a dial provided on a device forming the object information obtaining device, or another I/F device, may also be adopted as the input unit 150.
  • the display unit 160 may also be adapted to function as the input unit 150 .
  • the input unit 150 may be provided as a user I/F disposed separately from the object information obtaining device and connected thereto via the data network 200 .
  • the display unit 160 is a device which displays the object information output from the processing unit 140 .
  • An LCD (liquid crystal display) may be used as the display unit 160.
  • Another type of display, such as a plasma display, an organic EL display, or an FED, may also be used. It is also possible to integrally form the input unit 150 and the display unit 160 by adopting a touch panel display as the display unit 160.
  • the display unit 160 may also be provided separately from the object information obtaining device according to this embodiment.
  • The flow illustrated in FIG. 3 is an example of an algorithm executed by the processing unit 140.
  • the light emitted by the light source 110 is applied to the living body 100 as pulse light 121 through the optical system 120 . Then, a light absorber 101 absorbs the pulse light 121 and a photoacoustic wave 102 is generated by the photoacoustic effect.
  • the acoustic wave detecting unit 130 transforms the photoacoustic wave 102 to the electric signal being the analog signal to output to the processing unit 140 .
  • the arithmetic unit 141 saves the electric signal output from the acoustic wave detecting unit 130 in the storage unit 142 as the photoacoustic signal data.
  • In other words, the data obtained when the electric signal output from the acoustic wave detecting unit 130 is saved in the storage unit 142 serves as the photoacoustic signal data.
  • the photoacoustic signal data may be read from the storage unit 142 by the arithmetic unit 141 to be used in the two or more types of processing to be described later.
  • the electric signal output from the acoustic wave detecting unit 130 is typically amplified and subjected to the A/D conversion to be saved in the storage unit 142 as the photoacoustic signal data.
  • the electric signal output from the acoustic wave detecting unit 130 may also be saved in the storage unit 142 as the photoacoustic signal data after being averaged.
  • the photoacoustic signal data is saved in the storage unit 142 in this manner.
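  • The averaging mentioned above can be sketched as follows (a minimal illustration with hypothetical array shapes); averaging repeated acquisitions suppresses uncorrelated noise by roughly the square root of the number of shots:

```python
import numpy as np

# Hypothetical stack of repeated acquisitions: 30 laser shots,
# 128 detector channels, 1024 time samples per channel.
shots = np.random.randn(30, 128, 1024)

# Averaging over shots keeps the repeatable photoacoustic signal while
# reducing uncorrelated noise by roughly sqrt(30); the result is then
# saved as the photoacoustic signal data.
photoacoustic_signal_data = shots.mean(axis=0)
```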
  • The arithmetic unit 141 may use photoacoustic signal data that includes the same photoacoustic signal data, corresponding to the photoacoustic wave detected at a certain time, in a plurality of types of processing to be described later.
  • In photoacoustic imaging, it is thus possible to apply different types of processing to photoacoustic signal data that includes the same data obtained by detecting the photoacoustic wave at a certain time. According to this, object information at the same time, corresponding to each of the different types of processing, may be obtained.
  • Out of the pieces of object information at the same time obtained by applying each of the two or more types of processing to the photoacoustic signal data including the same data, the object information corresponding to the desired processing may be selectively displayed.
  • the arithmetic unit 141 may also obtain the object information corresponding to each processing by performing the two or more types of processing to the photoacoustic signal data not including the same data.
  • At step S302, the user selects the desired processing from the two or more types of processing by using the input unit 150. Then, the input unit 150 outputs the information of the processing selected by the user to the processing unit 140. At that time, the information of the selected processing is saved in the storage unit 142.
  • An example of the input unit 150 for the user to select the desired processing from the two or more types of processing is hereinafter described. That is, an example of a method of inputting the information of the desired processing by the user is described.
  • the user may select the desired processing by pressing a mechanical button as the input unit 150 corresponding to each of the two or more types of processing.
  • the user may select the desired processing by turning a mechanical dial as the input unit 150 corresponding to each of the two or more types of processing.
  • the user may also select the desired processing by selecting an item indicating the processing displayed on the display unit 160 by means of a pointing device (mouse), the keyboard and the like as the input unit 150 .
  • the display unit 160 may display the items indicating the processing next to one another as icons or display them as a menu.
  • the item related to the processing displayed on the display unit 160 may be always displayed beside the image of the object information or may be configured to be displayed when the user performs some operation by using the input unit 150 .
  • the display unit 160 may be configured such that the item indicating the processing is displayed on the display unit 160 by a click of the mechanical button provided on the mouse as the input unit 150 .
  • the method is not limited to the above-described method and any method may be adopted as long as the user may select the desired processing out of the two or more types of processing.
  • the object information obtaining device is preferably configured such that progress of each processing is visually presented to the user.
  • For example, the color of the item corresponding to the processing may be changed according to a progress status, such as completion of the processing, or the progress status may be displayed in characters in the vicinity of the item.
  • the object information obtaining device is preferably configured such that the progress of the processing may be grasped and the user may optionally stop the processing currently being calculated.
  • Such configuration allows the user to start a different process operation when the user sees the progress bar and determines that the progress of the processing currently being calculated is not convenient (e.g., the processing is taking too long, the processing is not good due to a processing error, the type of processing was chosen in error, etc.).
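  • A minimal sketch of such a progress-reporting, user-cancellable processing job follows (the class and method names are illustrative, not from the patent):

```python
import threading
import time

class CancellableJob:
    """A processing job whose progress can be polled for a progress bar
    and which the user may stop before it completes."""

    def __init__(self, n_steps):
        self.n_steps = n_steps
        self.progress = 0.0            # 0.0 .. 1.0, polled by the UI
        self._stop = threading.Event()

    def cancel(self):
        # Called from the user interface when the user aborts the job.
        self._stop.set()

    def run(self):
        for step in range(self.n_steps):
            if self._stop.is_set():
                return                 # abandon the remaining work
            time.sleep(0.01)           # placeholder for one chunk of work
            self.progress = (step + 1) / self.n_steps

job = CancellableJob(n_steps=100)
worker = threading.Thread(target=job.run)
worker.start()
time.sleep(0.2)
print(f"progress: {job.progress:.0%}")  # value a progress bar would draw
job.cancel()                            # the user stops the processing
worker.join()
```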
  • Image reconstruction processing selected by default may be set in advance in a file in the storage unit 142 .
  • the arithmetic unit 141 may read default processing at the beginning of step S 302 and execute the processing selected by default if the user does not especially select other processing. It is also possible that the user may intentionally select the processing set by default.
  • the desired processing selected by the user may be at least one type of processing.
  • at least two types of processing may be selected from three or more types of processing.
  • the object information obtaining device may be configured such that a plurality of combinations of at least two types of processing may be selected. According to this, the user may select the desired processing with a high degree of freedom and it becomes possible to display the object information useful in the diagnosis.
  • The arithmetic unit 141 obtains the object information by performing the desired processing selected at step S302 on the photoacoustic signal data saved in the storage unit 142.
  • the object information obtained by performing the desired processing is referred to as “object information corresponding to the desired processing”.
  • the arithmetic unit 141 may read the program in which an algorithm of the processing is described stored in the storage unit 142 and apply this processing to the photoacoustic signal data to obtain the object information.
  • three-dimensional voxel data and two-dimensional pixel data as the object information may be obtained by the processing.
  • the processing according to this embodiment is intended to mean every processing performed during transform from the photoacoustic signal data to the object information having a pathological value.
  • the processing according to this embodiment includes signal processing such as probe response correction processing and noise removal processing to generate different photoacoustic signal data based on the photoacoustic signal data stored in the storage unit 142 .
  • The processing according to this embodiment also includes reconstruction processing, such as time domain reconstruction processing, Fourier domain reconstruction processing, and model base reconstruction processing, which generates the object information from the photoacoustic signal data stored in the storage unit 142.
  • the processing according to this embodiment includes image processing such as resolution improvement processing to generate different object information based on the object information generated by the above-described reconstruction processing.
  • the probe response correction processing (hereinafter, referred to as “BD processing”) as the signal processing according to this embodiment is the processing to correct signal deterioration due to band limitation of a probe by applying processing based on a blind deconvolution algorithm to the photoacoustic signal data (refer to Patent Document 1 (Japanese Patent Application Laid-Open No. 2012-135462)).
  • Patent Document 1: Japanese Patent Application Laid-Open No. 2012-135462.
  • Probe response correction has an effect of decreasing the ringing by the acoustic wave detecting unit, thereby decreasing the artifact and improving the resolution.
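  • The BD processing of Patent Document 1 estimates the probe response blindly; as a rough illustration of the underlying deconvolution step only, the following sketch assumes the impulse response is already known and applies a frequency-domain Wiener filter (the function and parameter names are illustrative):

```python
import numpy as np

def wiener_deconvolve(signal, impulse_response, noise_to_signal=1e-2):
    """Correct band-limiting of the probe by deconvolving its impulse
    response in the frequency domain, regularized so that frequency bands
    with little probe response do not blow up."""
    n = len(signal)
    h = np.fft.rfft(impulse_response, n)   # probe transfer function
    s = np.fft.rfft(signal, n)             # detected signal spectrum
    g = np.conj(h) / (np.abs(h) ** 2 + noise_to_signal)  # Wiener filter
    return np.fft.irfft(s * g, n)
```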
  • the noise removal processing (hereinafter, referred to as “wavelet processing”) as the signal processing according to this embodiment is the processing to remove a noise component of the photoacoustic signal data through basis pursuit by a wavelet function of the photoacoustic signal data.
  • A waveform of the signal resulting from the photoacoustic wave is known to be an N-shaped waveform under an ideal condition (refer to Non-Patent Document 2 (Sergey A. Ermilov, Reda Gharieb, Andre Conjusteau, Tom Miller, Ketan Mehta, and Alexander A. Oraevsky, “Data Processing and quasi-3D optoacoustic imaging of tumors in the breast using a linear arc-shaped array of ultrasonic transducers”, Proc.)).
  • the signal resulting from the noise is discriminated from the signal resulting from the photoacoustic wave by applying a discrete wavelet transform to the photoacoustic signal data and removing a coefficient having a small absolute value from a result thereof.
  • the wavelet processing has a large effect when the signal resulting from the photo acoustic wave has the waveform close to the ideal waveform.
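  • A minimal sketch of this kind of wavelet-threshold denoising, using the PyWavelets package, is shown below; the choice of the “db4” wavelet, the decomposition level, and the threshold rule are assumptions, not taken from the patent:

```python
import numpy as np
import pywt

def wavelet_denoise(signal, wavelet="db4", level=4, threshold=0.1):
    """Apply a discrete wavelet transform, remove coefficients with small
    absolute values (taken to be noise), and reconstruct the signal."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    # Heuristic noise scale taken from the finest detail coefficients.
    scale = threshold * np.max(np.abs(coeffs[-1]))
    coeffs = [pywt.threshold(c, scale, mode="hard") for c in coeffs]
    return pywt.waverec(coeffs, wavelet)[: len(signal)]
```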
  • the time domain reconstruction processing (hereinafter, referred to as “TD processing”) as the reconstruction processing is the processing to estimate a sonic wave source by superimposing sonic wave signals in a real space by using a property that the photoacoustic wave is a spherical wave to generate the voxel data (refer to Patent Document 2 (Japanese Patent Application Laid-Open No. 2010-35806)).
  • the TD processing specifically includes UBP processing disclosed in Non-Patent Document 1.
  • the TD processing is performed in the real space, so that an effect of a measurement system is easily introduced as compared to the Fourier domain reconstruction processing and the like to be described later. For example, it is possible to decrease a side-lobe artifact by applying weighted correction processing of a solid angle and the like in consideration of a state of the acoustic wave detecting unit 130 , for example.
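  • A minimal sketch of time-domain reconstruction by plain delay-and-sum follows; UBP additionally back-projects a filtered signal with solid-angle weights, which is omitted here, and all names are illustrative:

```python
import numpy as np

def delay_and_sum(signals, sensor_xy, grid_xy, fs, c=1500.0):
    """For each image point, sum the detector samples whose time of flight
    matches the point-to-sensor distance, treating the photoacoustic wave
    as a spherical wave.

    signals:   (n_sensors, n_samples) detected waveforms
    sensor_xy: (n_sensors, 2) sensor positions in meters
    grid_xy:   (n_points, 2) reconstruction-point positions in meters
    fs:        sampling frequency in Hz
    c:         assumed speed of sound in m/s
    """
    n_sensors, n_samples = signals.shape
    image = np.zeros(len(grid_xy))
    for i, s_pos in enumerate(sensor_xy):
        dist = np.linalg.norm(grid_xy - s_pos, axis=1)      # meters
        idx = np.clip((dist / c * fs).astype(int), 0, n_samples - 1)
        image += signals[i, idx]                            # delay and sum
    return image
```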
  • the Fourier domain reconstruction processing (hereinafter, referred to as “FD processing”) as the reconstruction processing is the processing to estimate the sonic wave source by superimposing the detection signals in a frequency domain by using a Fourier transform and an inverse Fourier transform to generate the voxel data (refer to Japanese Patent Application Laid-Open No. 2010-35806).
  • the processing may be performed in a short time by using a fast Fourier transform.
  • the effect of the measurement system is not easily introduced in a frequency space as compared to the real space.
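  • The speed advantage referred to above comes from the O(N log N) cost of the fast Fourier transform; the following small sketch illustrates only this point (not the FD reconstruction itself) by comparing a naive O(N²) DFT with numpy's FFT:

```python
import time
import numpy as np

n = 1024
x = np.random.randn(n)

# Naive DFT: O(n^2) matrix-vector product.
k = np.arange(n)
dft_matrix = np.exp(-2j * np.pi * np.outer(k, k) / n)
t0 = time.perf_counter()
slow = dft_matrix @ x
t1 = time.perf_counter()

# Fast Fourier transform: O(n log n).
t2 = time.perf_counter()
fast = np.fft.fft(x)
t3 = time.perf_counter()

assert np.allclose(slow, fast)  # same result, very different cost
print(f"naive DFT: {t1 - t0:.5f} s, FFT: {t3 - t2:.6f} s")
```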
  • the model base reconstruction processing (hereinafter, referred to as “MBP processing”) as the reconstruction processing is the processing to estimate the sonic wave source such that difference between a calculation result based on a propagation model of an ideal photoacoustic wave and the photoacoustic signal data is minimum to generate the voxel data (refer to Patent Document 3 (Japanese Patent Application Laid-Open No. 2011-143175)).
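  • The essence of model-based reconstruction, choosing the source image that minimizes the mismatch between a forward propagation model and the measured data, can be sketched as a linear least-squares problem; the random matrix below is a stand-in for a real propagation model, not the patent's model:

```python
import numpy as np
from scipy.sparse.linalg import lsqr

rng = np.random.default_rng(0)
n_measurements, n_pixels = 1024, 256

# Stand-in linear forward model: y = A @ x, where x is the flattened
# initial-pressure image and y the stacked detector samples.
A = rng.standard_normal((n_measurements, n_pixels))
x_true = np.zeros(n_pixels)
x_true[100:110] = 1.0                            # a small absorber
y = A @ x_true + 0.01 * rng.standard_normal(n_measurements)

# Reconstruction: choose x minimizing ||A x - y||^2; limiting the
# iterations acts as a mild implicit regularization.
x_hat = lsqr(A, y, iter_lim=50)[0]
```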
  • The resolution improvement processing (hereinafter referred to as “CF processing”) as the image processing is processing to reduce the artifact generated by the limited viewing angle of the probe by applying a coherent filter to the object information obtained by the above-described reconstruction processing (refer to Patent Document 4 (Japanese Patent Application Laid-Open No. 2011-120765)). According to this, a high-resolution image of the object information may be obtained.
  • This is processing to calculate, for each voxel, a coefficient which is set to 1 when the phase signals of the photoacoustic wave are in phase and to 0 when they are out of phase, and to multiply the image by the distribution of these coefficients.
  • the CF processing is especially effective when sound speed distribution of the object is nearly constant. On the other hand, when variation in the sound speed distribution of the object is large, an effect of improving the image quality by the CF processing might be small.
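  • A common definition of the per-voxel coherence factor, consistent with the 1-when-in-phase / 0-when-out-of-phase behavior described above, is sketched below (Patent Document 4 may define the filter differently):

```python
import numpy as np

def coherence_factor(channel_values):
    """Coherence factor for one voxel: close to 1 when the delayed
    per-channel contributions are in phase, close to 0 when incoherent.

    channel_values: (n_channels,) contributions already delayed to the
    voxel under consideration.
    """
    n = len(channel_values)
    coherent = np.abs(np.sum(channel_values)) ** 2
    incoherent = n * np.sum(np.abs(channel_values) ** 2)
    return coherent / incoherent if incoherent > 0 else 0.0
```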
  • an x-axis corresponds to a horizontal direction and a z-axis corresponds to a vertical direction in FIG. 4A .
  • The simulation assumes a case in which a one-dimensional transducer array along the x-axis direction is arranged at the lowest part of FIG. 4A and detects the photoacoustic wave propagating from the upper side along the z-axis.
  • FIG. 4B illustrates initial sound pressure distribution when the FD processing is performed.
  • FIG. 4C illustrates the initial sound pressure distribution when the TD processing is performed.
  • FIG. 4D illustrates the initial sound pressure distribution when the MBP processing is performed.
  • In the images illustrated in FIGS. 4B and 4C, the connection of the initial sound pressure distribution corresponding to the absorption coefficient distribution extending in the z-axis direction decreases as compared to that in the image illustrated in FIG. 4D.
  • the arithmetic unit 141 calculates display data to be displayed on the display unit 160 based on the voxel data (or the pixel data) of the object information saved in the storage unit 142 to display the display data on the display unit 160 .
  • the display data of desired dimension out of one dimension, two dimensions, and three dimensions may be obtained from the voxel data (or the pixel data).
  • the object information obtaining device may be configured such that the user may set the dimension of the display data by using the input unit 150 .
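  • One common way to derive lower-dimensional display data from the voxel data, shown here as a sketch (the patent does not prescribe a particular projection), is a maximum intensity projection for two dimensions and a line profile for one dimension:

```python
import numpy as np

# Hypothetical reconstructed volume: (z, y, x) voxel data of the
# initial sound pressure.
voxels = np.random.rand(128, 128, 128)

# Two-dimensional display data: maximum intensity projection over depth.
mip_xy = voxels.max(axis=0)

# One-dimensional display data: a profile along x through a chosen line.
profile_x = voxels[64, 64, :]
```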
  • According to the object information obtaining device, it is possible to display on the display unit the object information corresponding to the desired processing selected by the user out of the two or more types of processing. According to this, it is possible to diagnose by using an image that meets the needs of the user, such as processing time and image quality, from among the images at the same time obtained by each image reconstruction.
  • the object information obtaining device may also be configured such that the information of the desired processing is obtained and the object information corresponding to the information of the desired processing is displayed in a state in which the object information corresponding to the desired processing is obtained in advance. That is to say, step S 302 may be executed after step S 303 is executed and step S 304 may be executed thereafter.
  • the image itself obtained after the processing is applied to the photoacoustic signal data may be adopted as the item indicating the processing. That is to say, the user may select the processing by selecting the image. For example, it is possible that the images whose processing is finished are sequentially displayed on the display unit 160 and when the user selects one of a plurality of images, the image is displayed in an enlarged manner. By this method, the user may compare results of a plurality of types of processing and may select the desired image even when the user does not have knowledge of the processing.
  • the object information obtaining device is preferably configured such that the user cannot select the processing not finished yet.
  • The object information obtaining device is preferably further provided with notifying means for notifying the user of whether each processing is finished.
  • the notifying means is preferably configured such that the user may visually recognize whether the processing is the finished processing.
  • When the item indicating the object information is displayed on the display unit 160, it is possible, as the notifying means, to display the item of finished processing and the item of unfinished processing in different colors and the like.
  • It is also possible that the information of the desired processing is obtained and the object information corresponding to the desired processing is displayed on the display unit 160 while object information different from the object information corresponding to the desired processing is already displayed on the display unit 160.
  • the object information corresponding to the desired processing may be displayed so as to be superimposed on the object information displayed in advance or may be displayed next to the same.
  • the display method may be set in advance before shipping or may be set by the user by means of the input unit 150 .
  • The user may grasp the pathological information obtainable from the object information displayed in advance and the pathological information obtainable from the object information corresponding to the desired processing, to diagnose in a comprehensive manner. It is also possible to diagnose in a comprehensive manner by grasping a plurality of pieces of pathological information without a time interval.
  • FIG. 5 is a flow diagram of a method of obtaining object information according to this example.
  • FIG. 6 is a schematic diagram illustrating a computer 140 as a processing unit according to this example in detail and a peripheral device. As illustrated in FIG. 6 , the computer 140 is provided with a CPU 641 , a FPGA 642 , and a GPU 643 as an arithmetic unit, and a ROM 644 and a RAM 645 as a storage unit.
  • the ROM 644 is used as a non-transitory computer-readable recording medium.
  • the CPU 641 controls operation of each component forming an object information obtaining device through a data network 200 , which is similar to that shown in FIG. 2 .
  • the CPU 641 reads a program in which the method of obtaining object information according to this example is described saved in the ROM 644 to allow the object information obtaining device to execute the method of obtaining object information. That is to say, the computer 140 executes a flow illustrated in FIG. 5 .
  • A user operated the input unit 150 to input measurement parameters.
  • the measurement parameter was saved in the RAM 645 as the storage unit.
  • As the measurement parameters, the user set the wavelength of the laser light used in the measurement and the number of times the laser light irradiates a breast 100 of a subject, the object, in one measurement. In this example, the user set the wavelength of the laser light to 797 nm and the number of irradiation times to 30.
  • At step S502, the CPU 641 issues an instruction based on the measurement parameters to the titanium-sapphire laser 110 as the light source, causing it to emit the laser light.
  • The laser light was applied to the breast 100 as pulse light 121 with a pulse width of 50 ns through an optical fiber 120.
  • the breast 100 absorbed the pulse light 121 and a photoacoustic wave reflecting absorption coefficient distribution in the breast 100 was generated.
  • The titanium-sapphire laser 110 in this example includes a flash lamp and a Q-switch as means of exciting an internal laser medium, and the light emission timing was controlled by the instruction from the CPU 641.
  • a CMUT array 130 as an acoustic wave detecting unit transformed the photoacoustic wave to an electric signal and output the electric signal to the processing unit 140 .
  • the CPU 641 instructs the CMUT array 130 to detect the photoacoustic wave in synchronization with the instruction to emit the laser light at step S 502 .
  • ultrasonic gel whose acoustic impedance is close to that of the breast 100 was provided as an acoustic matching medium between the CMUT array 130 and the breast 100 .
  • At step S504, the FPGA 642 amplified the electric signal and performed A/D conversion thereof.
  • The CPU 641 then saved the amplified, A/D-converted signal in the RAM 645 as photoacoustic signal data.
  • At step S505, it was determined whether the measurement of the object was completed. When completed, the procedure shifts to step S506; otherwise, it returns to step S502. In this example, the measurement is completed when the procedure from step S502 to step S504 has been repeated 30 times.
  • A screen displayed on the liquid crystal display 160 as the display unit, which is used in the following steps, is illustrated in FIG. 7.
  • the user may select a desired item from an item 701 corresponding to BD processing, an item 702 corresponding to UBP processing, an item 703 corresponding to MBP processing, and an item 704 corresponding to CF processing.
  • the items 701 to 704 corresponding to each processing and progress bars 711 to 714 indicating a progress situation of each processing are displayed next to one another.
  • When the black bar of a progress bar is at the left end, the progress of the corresponding processing is indicated to be 0%; when it reaches the right end, the progress is indicated to be 100%. The user may grasp the progress situation and the remaining time of the corresponding processing from the position and speed of the progress bar.
  • FIG. 7 illustrates the screen displayed when the item 701 corresponding to the BD processing is selected at step S508, described later, and the item 703 corresponding to the MBP processing is selected at step S512 thereafter. At that time, the black bar of the progress bar 713 corresponding to the MBP processing does not reach the right end, as illustrated in FIG. 7. Therefore, it is understood that the MBP processing is not finished at that time.
  • At step S506, the CPU 641 referred to the RAM 645 and displayed a list of the saved pieces of photoacoustic signal data in a data selection window 720. Then, the user selected one of the pieces of photoacoustic signal data displayed in the data selection window 720.
  • an ID number of the subject and photographing time of the photoacoustic signal data are displayed in the data selection window 720 such that they may be selected.
  • At step S507, the CPU 641 read the measurement parameters corresponding to the photoacoustic signal data selected by the user at step S506 and displayed, in an object information selection window 730, the types of object information that could be displayed. Then, the user selected the item corresponding to the initial sound pressure.
  • the initial sound pressure, an absorption coefficient, and oxygen saturation are displayed in the object information selection window 730 .
  • The oxygen saturation, being a spectral characteristic that requires measurement at a plurality of wavelengths, cannot be selected in this example.
  • the initial sound pressure and the absorption coefficient which may be selected by the user and the oxygen saturation which cannot be selected by the user are displayed in different colors.
  • At step S508, the CPU 641 determines whether the item 701 corresponding to the BD processing is selected by the user. When selected, the procedure shifts to step S509; otherwise, it shifts to step S510. Since the user selected the item 701, the procedure shifts to step S509 in this example.
  • At step S509, the CPU 641 read the photoacoustic signal data selected by the user from the RAM 645 and applied the above-described BD processing to the photoacoustic signal data. Then, the photoacoustic signal data to which the BD processing was applied was saved in the RAM 645.
  • At step S510, the CPU 641 determined whether the item 702 corresponding to the UBP processing was selected. When selected, the procedure shifts to step S511; otherwise, it shifts to step S512. Since the user did not select the item 702 corresponding to the UBP processing, the procedure shifts to step S512 in this example.
  • At step S512, the CPU 641 determined whether the item 703 corresponding to the MBP processing was selected. When selected, the procedure shifts to step S513; otherwise, it shifts to step S510. Since the user selected the item 703, the procedure shifts to step S513 in this example.
  • At step S513, the CPU 641 instructed the GPU 643 to perform the MBP processing. Then, the GPU 643 applied the MBP processing to the photoacoustic signal data to which the BD processing was applied at step S509 to generate three-dimensional voxel data related to the initial sound pressure. The three-dimensional voxel data was saved in the RAM 645.
  • At step S514, the CPU 641 determines whether the item 704 corresponding to the CF processing is selected. When selected, the procedure shifts to step S515; otherwise, it shifts to step S516. Since the user selects the item 704 corresponding to the CF processing, the procedure shifts to step S515 in this example.
  • At step S515, the CPU 641 applied the CF processing to the three-dimensional voxel data related to the initial sound pressure stored in the RAM 645 to generate three-dimensional voxel data related to the initial sound pressure subjected to the CF processing. Then, the three-dimensional voxel data related to the initial sound pressure after the CF processing was saved in the RAM 645. By applying the CF processing at this step, the resolution of the three-dimensional voxel data related to the initial sound pressure was improved.
  • At step S516, the GPU 643 applied scan conversion processing to the three-dimensional voxel data related to the initial sound pressure stored in the RAM 645 to generate display data. Then, the CPU 641 output the display data to the liquid crystal display 160, and the initial sound pressure distribution was displayed in an image display window 740.
  • Thereafter, the procedure shifts to step S508 again to determine whether the item corresponding to each processing is selected; when any item is selected, the processing corresponding to that item is executed.
  • As described above, the user may execute the desired processing to display the object information. Therefore, using the photoacoustic signal data obtained at a certain time, the user may diagnose with an image obtained by the processing that meets the user's needs, such as processing time and image quality.
  • Example 2 of the present invention is described. This example is different from Example 1 in that a plurality of types of processing is started in parallel and desired processing may be selected from finished processing.
  • an object information obtaining device illustrated in FIGS. 1 and 6 was used as in Example 1.
  • a method of obtaining object information of this example is described with reference to a flow illustrated in FIG. 8 . Meanwhile, the flow illustrated in FIG. 8 is executed by a computer 140 .
  • The CPU 641 issued an instruction to the GPU 643 to execute the UBP processing of step S511 on the photoacoustic signal data to which the BD processing had been applied in the steps up to step S509. Further, the CPU 641 issued an instruction to the GPU 643 to execute the MBP processing of step S513 in parallel with step S511.
  • Each type of reconstruction processing is stored in the ROM 644 as a separate thread program, and each is executed by one of a plurality of processors assigned in the GPU 643.
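  • The behavior of this example, launching several reconstructions in parallel and letting each become selectable as soon as it finishes, can be sketched in Python as follows (the example itself assigns the work to GPU processors; the stand-in functions and sleep times below are illustrative only):

```python
import concurrent.futures
import time

def ubp_processing(data):
    time.sleep(0.1)                  # stand-in for the fast reconstruction
    return "UBP image"

def mbp_processing(data):
    time.sleep(1.0)                  # stand-in for the slow reconstruction
    return "MBP image"

data = object()                      # placeholder for photoacoustic signal data
with concurrent.futures.ThreadPoolExecutor() as pool:
    futures = {pool.submit(f, data): f.__name__
               for f in (ubp_processing, mbp_processing)}
    # Results arrive in completion order, so the fast UBP image can be
    # displayed and selected while the MBP processing is still running.
    for fut in concurrent.futures.as_completed(futures):
        print(f"{futures[fut]} finished -> {fut.result()} is selectable")
```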
  • a screen displayed on a liquid crystal display 160 at that time is illustrated in FIG. 9 .
  • When the progress bar corresponding to the UBP processing reaches the right end, the progress is indicated to be 100%, so that the initial sound pressure distribution corresponding to the UBP processing may be selected.
  • The item 702 corresponding to the UBP processing, which may be selected, is displayed with a white background, and the item 703 corresponding to the MBP processing, which cannot yet be selected, is displayed with a gray background, so that the user can visually recognize whether each processing may be selected.
  • At step S510, the CPU 641 determined whether the item 702 corresponding to the UBP processing was selected. When selected, the procedure shifts to step S516; otherwise, it shifts to step S512. In this example, the user selects the item 702 corresponding to the UBP processing, which became selectable first, so the procedure shifts to step S516.
  • Thereafter, the procedure shifts to step S510 again to determine whether the item corresponding to each processing is selected; when any item is selected, the object information corresponding to that item is displayed.
  • The number of finished processing operations increases with time, so the types of processing that the user may select also increase with time in this example. Therefore, a diagnostic method becomes possible in which the object information obtained in a short time by the UBP processing and the like is confirmed first, and the object information obtained by the MBP processing and the like is confirmed when further detail is required during image reading.
  • Embodiments of the present invention can also be realized by a computer of a system or apparatus that reads out and executes computer executable instructions recorded on a storage medium (e.g., non-transitory computer-readable storage medium) to perform the functions of one or more of the above-described embodiment(s) of the present invention, and by a method performed by the computer of the system or apparatus by, for example, reading out and executing the computer executable instructions from the storage medium to perform the functions of one or more of the above-described embodiment(s).
  • the computer may comprise one or more of a central processing unit (CPU), micro processing unit (MPU), or other circuitry, and may include a network of separate computers or separate computer processing units.
  • CPU central processing unit
  • MPU micro processing unit
  • the computer executable instructions may be provided to the computer, for example, from a network or the storage medium.
  • The storage medium may include, for example, one or more of a hard disk, a random-access memory (RAM), a read only memory (ROM), a storage of distributed computing systems, an optical disk (such as a compact disc (CD), digital versatile disc (DVD), or Blu-ray Disc (BD)™), a flash memory device, a memory card, and the like.

Landscapes

  • Health & Medical Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Medical Informatics (AREA)
  • Public Health (AREA)
  • General Health & Medical Sciences (AREA)
  • Surgery (AREA)
  • Acoustics & Sound (AREA)
  • Biomedical Technology (AREA)
  • Molecular Biology (AREA)
  • Pathology (AREA)
  • Animal Behavior & Ethology (AREA)
  • Biophysics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Veterinary Medicine (AREA)
  • Theoretical Computer Science (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Radiology & Medical Imaging (AREA)
  • Epidemiology (AREA)
  • Primary Health Care (AREA)
  • Ultrasonic Diagnosis Equipment (AREA)
  • Investigating Or Analyzing Materials By The Use Of Ultrasonic Waves (AREA)
US14/134,957 2012-12-28 2013-12-19 Object information obtaining device, display method, and non-transitory computer-readable storage medium Abandoned US20140182383A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US15/700,996 US10429233B2 (en) 2012-12-28 2017-09-11 Object information obtaining device, display method, and non-transitory computer-readable storage medium
US16/543,419 US20190368920A1 (en) 2012-12-28 2019-08-16 Object information obtaining device, display method, and non-transitory computer-readable storage medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2012-286685 2012-12-28
JP2012286685 2012-12-28

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US15/700,996 Continuation US10429233B2 (en) 2012-12-28 2017-09-11 Object information obtaining device, display method, and non-transitory computer-readable storage medium

Publications (1)

Publication Number Publication Date
US20140182383A1 true US20140182383A1 (en) 2014-07-03

Family

ID=51015648

Family Applications (3)

Application Number Title Priority Date Filing Date
US14/134,957 Abandoned US20140182383A1 (en) 2012-12-28 2013-12-19 Object information obtaining device, display method, and non-transitory computer-readable storage medium
US15/700,996 Expired - Fee Related US10429233B2 (en) 2012-12-28 2017-09-11 Object information obtaining device, display method, and non-transitory computer-readable storage medium
US16/543,419 Abandoned US20190368920A1 (en) 2012-12-28 2019-08-16 Object information obtaining device, display method, and non-transitory computer-readable storage medium

Family Applications After (2)

Application Number Title Priority Date Filing Date
US15/700,996 Expired - Fee Related US10429233B2 (en) 2012-12-28 2017-09-11 Object information obtaining device, display method, and non-transitory computer-readable storage medium
US16/543,419 Abandoned US20190368920A1 (en) 2012-12-28 2019-08-16 Object information obtaining device, display method, and non-transitory computer-readable storage medium

Country Status (2)

Country Link
US (3) US20140182383A1 (en)
JP (1) JP6415050B2 (ja)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140182384A1 (en) * 2012-12-28 2014-07-03 Canon Kabushiki Kaisha Object Information Acquisition Apparatus, Display Method, and Non-Transitory Computer-Readable Storage Medium
WO2016051734A1 (en) * 2014-09-29 2016-04-07 Canon Kabushiki Kaisha Object-information acquisition apparatus
CN105698917A (zh) * 2016-03-17 2016-06-22 Liaoning Shihua University Infrasonic wave detection device and detection method therefor
US20170265750A1 (en) * 2016-03-15 2017-09-21 Canon Kabushiki Kaisha Information processing system and display control method
WO2018041789A1 (de) * 2016-08-30 2018-03-08 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Method and apparatus for the tomography of sound
CN112419438A (zh) * 2020-12-01 2021-02-26 ShanghaiTech University Image reconstruction method for limited-view compensation and artifact removal in photoacoustic imaging
US11333599B2 (en) * 2016-12-22 2022-05-17 Fujifilm Corporation Photoacoustic image generation apparatus, photoacoustic image generation method, and photoacoustic image generation program

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2018102750A (ja) * 2016-12-27 2018-07-05 Canon Inc Information processing apparatus, information processing method, information processing system, and program
JP7277345B2 (ja) * 2019-11-29 2023-05-18 Canon Medical Systems Corp Image processing apparatus and image processing program

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6891920B1 (en) * 2002-11-29 2005-05-10 Fischer Imaging Corporation Automated background processing mammographic image data
US20070140536A1 (en) * 2005-12-19 2007-06-21 Eastman Kodak Company Medical image processing method and apparatus
US7475358B2 (en) * 2006-02-14 2009-01-06 International Business Machines Corporation Alternate progress indicator displays
US20100094561A1 (en) * 2008-10-03 2010-04-15 Canon Kabushiki Kaisha Apparatus and method for processing biological information
WO2011048596A1 (en) * 2009-10-23 2011-04-28 Boris Melnik System and method for noninvasive tissue examination
WO2012138965A2 (en) * 2011-04-08 2012-10-11 University Of Florida Research Foundation, Inc. Enhanced image reconstruction in photoacoustic tomography
WO2012140865A1 (en) * 2011-04-12 2012-10-18 Canon Kabushiki Kaisha Object information acquiring apparatus and object information acquiring method
US20140182384A1 (en) * 2012-12-28 2014-07-03 Canon Kabushiki Kaisha Object Information Acquisition Apparatus, Display Method, and Non-Transitory Computer-Readable Storage Medium

Family Cites Families (27)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5588032A (en) * 1992-10-14 1996-12-24 Johnson; Steven A. Apparatus and method for imaging with wavefields using inverse scattering techniques
US6567688B1 (en) * 1999-08-19 2003-05-20 The Texas A&M University System Methods and apparatus for scanning electromagnetically-induced thermoacoustic tomography
JP4406226B2 (ja) * 2003-07-02 2010-01-27 Toshiba Corp Biological information imaging apparatus
JP4643153B2 (ja) * 2004-02-06 2011-03-02 Toshiba Corp Non-invasive biological information imaging apparatus
WO2009011934A1 (en) * 2007-07-17 2009-01-22 University Of Florida Research Foundation, Inc. Method and apparatus for tomographic imaging of absolute optical absorption coefficient in turbid media using combined photoacoustic and diffusing light measurements
JP5284129B2 (ja) * 2008-02-06 2013-09-11 Canon Inc Imaging apparatus and analysis method
JP5197217B2 (ja) * 2008-08-05 2013-05-15 Canon Inc Biological information imaging apparatus and image construction method
JP5241465B2 (ja) * 2008-12-11 2013-07-17 Canon Inc Photoacoustic imaging apparatus and photoacoustic imaging method
JP5653057B2 (ja) * 2009-05-27 2015-01-14 Canon Inc Measuring apparatus
JP2011005042A (ja) * 2009-06-26 2011-01-13 Canon Inc Photoacoustic imaging apparatus and photoacoustic imaging method
US9271654B2 (en) * 2009-06-29 2016-03-01 Helmholtz Zentrum Munchen Deutsches Forschungszentrum Fur Gesundheit Und Umwelt (Gmbh) Thermoacoustic imaging with quantitative extraction of absorption map
EP2494923B1 (en) * 2009-10-29 2015-07-29 Canon Kabushiki Kaisha Photo-acoustic device
JP5528083B2 (ja) 2009-12-11 2014-06-25 Canon Inc Image generating apparatus, image generating method, and program
JP5538862B2 (ja) * 2009-12-18 2014-07-02 Canon Inc Image processing apparatus, image processing system, image processing method, and program
JP5451414B2 (ja) 2010-01-18 2014-03-26 Canon Inc Object information processing apparatus and object information processing method
JP5586977B2 (ja) * 2010-02-08 2014-09-10 Canon Inc Object information acquiring apparatus and object information acquiring method
JP5725720B2 (ja) * 2010-02-16 2015-05-27 Canon Inc Object information processing apparatus
JP5645421B2 (ja) * 2010-02-23 2014-12-24 Canon Inc Ultrasonic imaging apparatus and delay control method
JP5441781B2 (ja) * 2010-03-25 2014-03-12 Canon Inc Photoacoustic imaging apparatus, photoacoustic imaging method, and program
JP5496031B2 (ja) * 2010-09-17 2014-05-21 Canon Inc Acoustic wave signal processing apparatus, control method therefor, and control program
CN102834061B (zh) * 2010-12-24 2016-06-08 Konica Minolta Inc Ultrasonic diagnostic apparatus and control method of ultrasonic diagnostic apparatus
JP5661451B2 (ja) 2010-12-27 2015-01-28 Canon Inc Object information acquiring apparatus and object information acquiring method
JP2012205888A (ja) * 2011-03-17 2012-10-25 Canon Inc Object information acquiring apparatus and object information acquiring method
JP5896623B2 (ja) * 2011-05-02 2016-03-30 Canon Inc Object information acquiring apparatus and control method therefor
JP5346987B2 (ja) * 2011-05-17 2013-11-20 Fujifilm Corp Ultrasonic diagnostic apparatus
JP2013078463A (ja) * 2011-10-04 2013-05-02 Canon Inc Acoustic wave acquiring apparatus
JP2013215236A (ja) * 2012-04-04 2013-10-24 Canon Inc Object information acquiring apparatus and object information acquiring method

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6891920B1 (en) * 2002-11-29 2005-05-10 Fischer Imaging Corporation Automated background processing mammographic image data
US20070140536A1 (en) * 2005-12-19 2007-06-21 Eastman Kodak Company Medical image processing method and apparatus
US7475358B2 (en) * 2006-02-14 2009-01-06 International Business Machines Corporation Alternate progress indicator displays
US20100094561A1 (en) * 2008-10-03 2010-04-15 Canon Kabushiki Kaisha Apparatus and method for processing biological information
WO2011048596A1 (en) * 2009-10-23 2011-04-28 Boris Melnik System and method for noninvasive tissue examination
WO2012138965A2 (en) * 2011-04-08 2012-10-11 University Of Florida Research Foundation, Inc. Enhanced image reconstruction in photoacoustic tomography
WO2012140865A1 (en) * 2011-04-12 2012-10-18 Canon Kabushiki Kaisha Object information acquiring apparatus and object information acquiring method
US20140182384A1 (en) * 2012-12-28 2014-07-03 Canon Kabushiki Kaisha Object Information Acquisition Apparatus, Display Method, and Non-Transitory Computer-Readable Storage Medium

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140182384A1 (en) * 2012-12-28 2014-07-03 Canon Kabushiki Kaisha Object Information Acquisition Apparatus, Display Method, and Non-Transitory Computer-Readable Storage Medium
WO2016051734A1 (en) * 2014-09-29 2016-04-07 Canon Kabushiki Kaisha Object-information acquisition apparatus
US20170265750A1 (en) * 2016-03-15 2017-09-21 Canon Kabushiki Kaisha Information processing system and display control method
CN105698917A (zh) * 2016-03-17 2016-06-22 Liaoning Shihua University Infrasonic wave detection device and detection method therefor
WO2018041789A1 (de) * 2016-08-30 2018-03-08 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Method and apparatus for the tomography of sound
US11215501B2 (en) 2016-08-30 2022-01-04 Fraunhofer-Gesellschaft Zur Foerderung Der Angewandten Forschung E.V. Method and apparatus for the tomography of sound
US11333599B2 (en) * 2016-12-22 2022-05-17 Fujifilm Corporation Photoacoustic image generation apparatus, photoacoustic image generation method, and photoacoustic image generation program
CN112419438A (zh) * 2020-12-01 2021-02-26 ShanghaiTech University Image reconstruction method for limited-view compensation and artifact removal in photoacoustic imaging

Also Published As

Publication number Publication date
JP2014140717A (ja) 2014-08-07
US10429233B2 (en) 2019-10-01
US20180010957A1 (en) 2018-01-11
US20190368920A1 (en) 2019-12-05
JP6415050B2 (ja) 2018-10-31

Similar Documents

Publication Publication Date Title
US10429233B2 (en) Object information obtaining device, display method, and non-transitory computer-readable storage medium
JP6399753B2 (ja) Object information acquiring apparatus, display method, and program
JP5528083B2 (ja) Image generating apparatus, image generating method, and program
WO2012011242A1 (en) Image information acquiring apparatus, image information acquiring method and image information acquiring program
CN109475345A (zh) Apparatus, method, and program for displaying ultrasonic images and photoacoustic images
US20180228377A1 (en) Object information acquiring apparatus and display method
US20200085345A1 (en) Object information acquisition apparatus and method of controlling the same
JP6727812B2 (ja) Photoacoustic apparatus, image display method, and program
JP6742734B2 (ja) Object information acquiring apparatus and signal processing method
JPWO2013108375A1 (ja) Object information acquiring apparatus and object information acquiring method
JP6587410B2 (ja) Object information acquiring apparatus and signal processing method
EP3329843A1 (en) Display control apparatus, display control method, and program
JP2015167789A (ja) Object information acquiring apparatus and signal processing method
JP2017164198A (ja) Information processing system and display control method
KR20200067254A (ko) 정보 처리장치, 정보 처리방법, 및 비일시적인 기억매체
JP6469133B2 (ja) Processing apparatus, photoacoustic apparatus, processing method, and program
JP6425438B2 (ja) Object information acquiring apparatus and image processing method
JP2014147825A (ja) Image generating apparatus, image generating method, and program
US20160206246A1 (en) Object information acquiring apparatus and object information acquisition method
US20190271638A1 (en) Photoacoustic image generation apparatus, photoacoustic image generation method, and photoacoustic image generation program
JP6942847B2 (ja) Object information acquiring apparatus and signal processing method
JP2020018467A (ja) Information processing apparatus, information processing method, and program
JP2017012692A (ja) Photoacoustic apparatus, information acquiring apparatus, information acquiring method, and program
JP6381718B2 (ja) Apparatus and image generating method
US20190374110A1 (en) Photoacoustic apparatus and control method

Legal Events

Date Code Title Description
AS Assignment

Owner name: CANON KABUSHIKI KAISHA, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:SUZUKI, KOICHI;ABE, HIROSHI;SIGNING DATES FROM 20131212 TO 20131213;REEL/FRAME:032935/0344

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO PAY ISSUE FEE