WO2018008661A1 - Control device, control method, control system, and program - Google Patents

Control device, control method, control system, and program

Info

Publication number
WO2018008661A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
photoacoustic
ultrasonic
signal
region
Prior art date
Application number
PCT/JP2017/024569
Other languages
English (en)
Japanese (ja)
Inventor
浩 荒井
由香里 中小司
Original Assignee
キヤノン株式会社 (Canon Inc.)
Priority date
Filing date
Publication date
Priority claimed from JP2016229312A (published as JP2018011928A)
Application filed by キヤノン株式会社 (Canon Inc.)
Publication of WO2018008661A1

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B8/08 Detecting organic movements or changes, e.g. tumours, cysts, swellings
    • A61B8/13 Tomography
    • A61B8/14 Echo-tomography

Definitions

  • the disclosure of this specification relates to a control device, a control method, a control system, and a program.
  • An ultrasonic imaging device or a photoacoustic imaging device is used as an imaging device that images a state inside a subject in a minimally invasive manner.
  • it is disclosed that a process for generating a photoacoustic image from a photoacoustic signal is performed for a specific region identified based on an ultrasonic image, and that this process is not performed for regions other than that region.
  • the photoacoustic signal is based on an acoustic wave generated by expansion inside the subject caused by the light irradiated onto the subject. Therefore, if generation of a photoacoustic image from the photoacoustic signal is merely restricted to the region where a photoacoustic image is desired, light is still irradiated onto regions where no photoacoustic image is acquired.
  • the control device disclosed in this specification includes: signal acquisition means for acquiring at least one of an ultrasonic signal and a photoacoustic signal from a probe that outputs the ultrasonic signal by transmitting/receiving ultrasonic waves to/from a subject and outputs the photoacoustic signal by receiving a photoacoustic wave generated by light irradiation of the subject; generation means for generating an ultrasonic image based on the ultrasonic signal; and irradiation control means for controlling the light irradiation of the probe based on the generated ultrasonic image.
  • with a control device capable of controlling different types of imaging, whether or not one type of imaging is performed can be controlled based on an image obtained by the other type, making it possible to perform an examination that is not redundant for the user or the subject.
  • an acoustic wave generated by irradiating a subject with light and expanding inside the subject is referred to as a photoacoustic wave.
  • an acoustic wave transmitted from the transducer or a reflected wave (echo) in which the transmitted acoustic wave is reflected inside the subject is referred to as an ultrasonic wave.
  • an imaging method using ultrasonic waves and an imaging method using photoacoustic waves are used.
  • in the imaging method using ultrasonic waves, for example, ultrasonic waves oscillated from the transducer are reflected by tissue inside the subject according to differences in acoustic impedance, and an image is generated based on the time until each reflected wave reaches the transducer and on the intensity of the reflected wave.
  • An image imaged using ultrasound is hereinafter referred to as an ultrasound image.
  • the user can operate while changing the angle of the probe and observe ultrasonic images of various cross sections in real time. Ultrasound images depict the shapes of organs and tissues and are used to find tumors.
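  • As a worked illustration of the pulse-echo principle described above (standard ultrasound physics rather than anything specific to this disclosure), the depth of a reflecting tissue follows from the round-trip time of its echo and an assumed speed of sound in soft tissue of about 1540 m/s. The function below is a minimal sketch with illustrative names.

```python
SPEED_OF_SOUND_M_S = 1540.0  # typical average assumed for soft tissue

def echo_depth_m(round_trip_time_s: float) -> float:
    """Depth of a reflector from the round-trip time of its echo."""
    # The wave travels to the reflector and back, hence the factor of 2.
    return SPEED_OF_SOUND_M_S * round_trip_time_s / 2.0

# Example: an echo arriving 39 microseconds after transmission
print(echo_depth_m(39e-6))  # ~0.03 m, i.e. a reflector about 3 cm deep
```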
  • the imaging method using photoacoustic waves is a method of generating an image based on photoacoustic waves generated by adiabatic expansion of tissue inside a subject irradiated with light, for example.
  • An image imaged using the photoacoustic wave is hereinafter referred to as a photoacoustic image.
  • the photoacoustic image information related to optical characteristics such as the degree of light absorption of each tissue is depicted.
  • photoacoustic images for example, it is known that blood vessels can be drawn by the optical characteristics of hemoglobin, and its use for evaluating the malignancy of tumors is being studied.
  • various information may be collected by imaging different phenomena on the same part of the subject based on different principles.
  • diagnosis of cancer, for example, is performed by combining morphological information obtained from a CT (Computed Tomography) image with functional information relating to metabolism obtained from a PET (Positron Emission Tomography) image.
  • performing diagnosis using information obtained by imaging different phenomena based on different principles is thus considered effective for improving diagnostic accuracy.
  • an imaging device for obtaining an image obtained by combining the respective characteristics has been studied.
  • both the ultrasonic image and the photoacoustic image are imaged using ultrasonic waves coming from the subject.
  • the user wants to operate the probe in the same manner as in conventional ultrasonic imaging. That is, it is conceivable that the user keeps the probe in contact with the surface of the subject and operates it while observing an image displayed based on the information acquired by the probe. If, at that time, the operation mode related to signal acquisition or image display is switched via, for example, a switch provided on the probe or an input device provided on the console of the imaging device, the user must interrupt the probe operation while observing the image. For this reason, body movement of the subject may occur, or the probe position may shift, during the operation input to the switch or the console input device.
  • An object of the first embodiment is to provide a control device that can switch an image to be displayed without deteriorating operability when a user observes an image.
  • FIG. 10 is a diagram illustrating an example of a system configuration including the control device 101 according to the first embodiment.
  • An imaging system 100 that can generate an ultrasonic image and a photoacoustic image is connected to various external devices via a network 110.
  • Each configuration and various external devices included in the imaging system 100 do not need to be installed in the same facility, and may be connected to be communicable.
  • the imaging system 100 includes a control device 101, a probe 102, a detection unit 103, a display unit 104, and an operation unit 105.
  • the control apparatus 101 is an apparatus that acquires an ultrasonic signal and a photoacoustic signal from the probe 102, controls acquisition of the photoacoustic signal based on, for example, an ultrasonic image, and generates a photoacoustic image based on the control.
  • the control device 101 acquires information related to an examination including imaging of an ultrasonic image and a photoacoustic image from the ordering system 112, and controls the probe 102, the detection unit 103, and the display unit 104 when the examination is performed.
  • the control device 101 outputs the generated ultrasonic image, photoacoustic image, and superimposed image obtained by superimposing the photoacoustic image on the ultrasonic image to the PACS 113.
  • the control device 101 transmits / receives information to / from external devices such as the ordering system 112 and the PACS 113 in accordance with standards such as HL7 (Health level 7) and DICOM (Digital Imaging and Communications in Medicine). Details of processing performed by the control device 101 will be described later.
  • the region in the subject from which an ultrasound image is captured by the imaging system 100 is, for example, a circulatory region, a breast, a liver, or a pancreas.
  • an ultrasound image of a subject to which an ultrasound contrast agent using microbubbles is administered may be captured.
  • the region in the subject from which the photoacoustic image is captured by the imaging system 100 is, for example, a circulatory organ region, a breast, a neck, an abdomen, or a limb including fingers and toes.
  • a blood vessel region including a new blood vessel and a plaque on a blood vessel wall may be set as a target for imaging a photoacoustic image in accordance with the characteristics relating to light absorption in the subject.
  • the region in the subject where a photoacoustic image is captured by the imaging system 100 does not necessarily have to match the region where an ultrasonic image is captured.
  • a photoacoustic image may be captured of a subject to which a dye such as methylene blue or indocyanine green, fine gold particles, or a substance obtained by accumulating or chemically modifying these is administered as a contrast agent.
  • the probe 102 is operated by a user and transmits an ultrasonic signal and a photoacoustic signal to the control device 101.
  • the probe 102 includes a transmission / reception unit 106 and an irradiation unit 107.
  • the probe 102 transmits an ultrasonic wave from the transmission / reception unit 106 and receives the reflected wave by the transmission / reception unit 106. Further, the probe 102 irradiates the subject with light from the irradiation unit 107, and the photoacoustic wave is received by the transmission / reception unit 106.
  • the probe 102 converts the received reflected wave and photoacoustic wave into an electric signal, and transmits it to the control device 101 as an ultrasonic signal and a photoacoustic signal.
  • the probe 102 is controlled so that, when information indicating contact with the subject is received, transmission of ultrasonic waves for acquiring an ultrasonic signal and light irradiation for acquiring a photoacoustic signal are executed.
  • the transmission / reception unit 106 includes at least one transducer (not shown), a matching layer (not shown), a damper (not shown), and an acoustic lens (not shown).
  • the transducer (not shown) is made of a material exhibiting a piezoelectric effect, such as PZT (lead zirconate titanate) or PVDF (polyvinylidene difluoride).
  • the transducer (not shown) may be other than a piezoelectric element; for example, a capacitive micromachined ultrasonic transducer (CMUT) or a transducer using a Fabry-Perot interferometer may be used.
  • the ultrasonic signal is composed of frequency components of 2 to 20 MHz and the photoacoustic signal is composed of frequency components of 0.1 to 100 MHz, and a transducer (not shown) that can detect these frequencies is used.
  • the signal obtained by the transducer (not shown) is a time-resolved signal.
  • the amplitude of the received signal represents a value based on the sound pressure received by the transducer at each time.
  • the transmission / reception unit 106 includes a circuit (not shown) or a control unit for electronic focusing.
  • the array form of transducers (not shown) is, for example, a sector, a linear array, a convex, an annular array, or a matrix array.
  • the transmitting / receiving unit 106 may include an amplifier (not shown) that amplifies a time-series analog signal received by a transducer (not shown).
  • the transmission / reception unit 106 may include an A / D converter that converts a time-series analog signal received by a transducer (not shown) into a time-series digital signal.
  • the transducer (not shown) may be divided into a transmitter and a receiver depending on the purpose of imaging an ultrasonic image. Further, the transducer (not shown) may be divided into an ultrasonic image capturing unit and a photoacoustic image capturing unit.
  • the irradiation unit 107 includes a light source (not shown) for acquiring a photoacoustic signal and an optical system (not shown) that guides pulsed light emitted from the light source (not shown) to the subject.
  • the pulse width of light emitted from a light source (not shown) is, for example, 1 ns or more and 100 ns or less.
  • the wavelength of the light emitted by the light source (not shown) is, for example, 400 nm or more and 1600 nm or less.
  • for imaging blood vessels near the surface of the subject, a wavelength of 400 nm or more and 700 nm or less, which is strongly absorbed by blood vessels, is preferable.
  • for imaging a deep part of the subject, a wavelength of 700 nm or more and 1100 nm or less, which is not easily absorbed by tissues such as water and fat, is preferable.
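  • A minimal sketch of the wavelength guidance above, assuming a hypothetical helper that selects an illustrative band from the target depth; the 5 mm cutoff is an assumption made for the example, not a value from this specification.

```python
def choose_wavelength_band_nm(target_depth_mm: float) -> tuple[int, int]:
    """Return an illustrative (min, max) wavelength band for the target depth."""
    if target_depth_mm <= 5.0:   # near-surface blood vessels (assumed cutoff)
        return (400, 700)        # strongly absorbed by blood vessels
    return (700, 1100)           # less absorbed by water and fat; reaches deeper
```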
  • the light source (not shown) is, for example, a laser or a light emitting diode.
  • the irradiation unit 107 may use a light source that can convert wavelengths in order to acquire a photoacoustic signal using light of a plurality of wavelengths.
  • the irradiation unit 107 may include a plurality of light sources that generate light of different wavelengths, and may be configured to be able to irradiate light of different wavelengths alternately from each light source.
  • the laser is, for example, a solid laser, a gas laser, a dye laser, or a semiconductor laser.
  • a pulsed laser such as an Nd: YAG laser or an alexandrite laser may be used.
  • a Ti:sapphire laser or an OPO (optical parametric oscillator) laser that uses Nd:YAG laser light as excitation light may be used as the light source (not shown).
  • a microwave source may be used as a light source (not shown).
  • optical elements such as lenses, mirrors, and optical fibers are used.
  • the optical system may include a diffusion plate that diffuses the emitted light.
  • the optical system may include a lens or the like so that the beam can be focused.
  • the detection unit 103 acquires information regarding the position and orientation of the probe 102.
  • the detection unit 103 transmits information related to the position of the probe 102 to the control device 101.
  • the detection unit 103 is a motion sensor provided in the probe 102, for example.
  • the detection unit 103 is not necessarily required, and the sensor may be switched ON and OFF as appropriate based on various conditions set prior to the examination.
  • the display unit 104 displays an image captured by the imaging system 100 and information related to the inspection based on control from the control device 101.
  • the display unit 104 provides an interface for receiving user instructions based on control from the control device 101.
  • the display unit 104 is a liquid crystal display, for example.
  • the operation unit 105 transmits information related to user operation input to the control apparatus 101.
  • the operation unit 105 is, for example, a keyboard, a trackball, and various buttons for performing operation inputs related to inspection.
  • the display unit 104 and the operation unit 105 may be integrated as a touch panel display.
  • the control device 101, the display unit 104, and the operation unit 105 do not need to be separate apparatuses, and may be implemented as a single integrated apparatus.
  • the control device 101 may have a plurality of probes.
  • a HIS (Hospital Information System) 111 is a system that supports hospital operations.
  • the HIS 111 includes an electronic medical record system, an ordering system, and a medical accounting system.
  • the ordering system of the HIS 111 transmits order information to the ordering system 112 for each department.
  • the ordering system 112, which will be described later, manages the execution of the order.
  • the ordering system 112 is a system that manages inspection information and manages the progress of each inspection in the imaging apparatus.
  • the ordering system 112 may be configured for each department that performs inspection.
  • the ordering system 112 is, for example, RIS (Radiology Information System) in the radiation department.
  • the ordering system 112 transmits information on examinations performed by the imaging system 100 to the control apparatus 101.
  • the ordering system 112 receives information related to the progress of the inspection from the control device 101.
  • the ordering system 112 transmits information indicating that the inspection is completed to the HIS 111.
  • the ordering system 112 may be integrated into the HIS 111.
  • a PACS (Picture Archiving and Communication System) 113 is a database system that holds images obtained by various imaging devices inside and outside the facility.
  • the PACS 113 includes a storage unit (not shown) that stores medical images together with their imaging conditions, incidental parameters such as image processing parameters including reconstruction parameters, and patient information, and a controller (not shown) that manages the information stored in the storage unit.
  • the PACS 113 stores an ultrasonic image, a photoacoustic image, and a superimposed image output from the control device 101. It is preferable that communication between the PACS 113 and the control device 101 and various images stored in the PACS 113 comply with standards such as HL7 and DICOM. Various images output from the control device 101 are stored with associated information associated with various tags in accordance with the DICOM standard.
  • the Viewer 114 is a terminal for image diagnosis, and reads an image stored in the PACS 113 and displays it for diagnosis.
  • the doctor displays an image on the Viewer 114 for observation, and records information obtained as a result of the observation as an image diagnosis report.
  • the diagnostic imaging report created using the Viewer 114 may be stored in the Viewer 114, or may be output and stored in the PACS 113 or a report server (not shown).
  • the Printer 115 prints an image stored in the PACS 113 or the like.
  • the Printer 115 is, for example, a film printer, and outputs an image stored in the PACS 113 or the like by printing it on a film.
  • FIG. 1 is a diagram illustrating an example of the configuration of the control device 101.
  • the control device 101 includes a CPU 131, ROM 132, RAM 133, DISK 134, USB 135, communication circuit 136, GPU 137, HDMI 138, and probe connector port 139. These are connected via the BUS 130 so that they can communicate with one another.
  • the BUS 130 is a data bus, and is used to transmit / receive data between connected hardware and to transmit commands from the CPU 131 to other hardware.
  • a CPU (Central Processing Unit) 131 is a control circuit that integrally controls the control device 101 and each unit connected thereto.
  • the CPU 131 performs control by executing a program stored in the ROM 132. Further, the CPU 131 executes a display driver which is software for controlling the display unit 104 and performs display control on the display unit 104. Further, the CPU 131 performs input / output control for the operation unit 105.
  • the ROM (Read Only Memory) 132 stores programs and data describing the control procedures executed by the CPU 131.
  • the ROM 132 has a boot program 140 of the control device 101 and various initial data 141.
  • various modules 142 to 150 for realizing the processing of the control apparatus 101 are included. Various modules for realizing the processing of the control apparatus 101 will be described later.
  • a RAM (Random Access Memory) 133 provides a working storage area when the CPU 131 performs control by an instruction program.
  • the RAM 133 has a stack 151 and a work area 152.
  • the RAM 133 stores a program for executing processing in each unit connected to the control device 101 and various parameters used in image processing.
  • the RAM 133 stores a control program executed by the CPU 131, and temporarily stores various data when the CPU 131 executes various controls.
  • the DISK 134 is an auxiliary storage device that stores various data such as ultrasonic images and photoacoustic images.
  • the DISK 134 is, for example, an HDD (Hard Disk Drive) or an SSD (Solid State Drive).
  • the USB (Universal Serial Bus) 135 is a connection unit connected to the operation unit 105.
  • the communication circuit 136 is a circuit for communicating with each unit constituting the imaging system 100 and various external devices connected to the network 110.
  • the communication circuit 136 stores output information in a transfer packet, for example, and outputs the information to an external device via the network 110 by a communication technique such as TCP / IP.
  • the control device 101 may have a plurality of communication circuits in accordance with a desired communication form.
  • the GPU 137 is included in a general-purpose graphics board including a video memory.
  • the GPU 137 executes part or all of the image processing module 145 and performs, for example, a photoacoustic image reconstruction process. By using such an arithmetic device, it is possible to perform operations such as reconstruction processing at high speed without requiring dedicated hardware.
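  • The specification does not fix a particular reconstruction algorithm; the following is a minimal NumPy sketch of delay-and-sum backprojection, one common way to reconstruct the initial sound pressure distribution from per-element photoacoustic signals. A GPU implementation such as the one executed on the GPU 137 would parallelize the per-pixel loop.

```python
import numpy as np

def delay_and_sum(signals, element_x, grid_x, grid_z, fs, c=1540.0):
    """Reconstruct a 2-D initial-pressure map from photoacoustic signals.

    signals   : (n_elements, n_samples) array with t = 0 at the laser pulse
    element_x : (n_elements,) lateral element positions [m]
    grid_x, grid_z : 1-D pixel coordinates [m]
    fs : sampling rate [Hz]; c : assumed speed of sound [m/s]
    """
    image = np.zeros((grid_z.size, grid_x.size))
    for iz, z in enumerate(grid_z):
        for ix, x in enumerate(grid_x):
            # Photoacoustic waves travel one way, from the pixel to each element.
            dist = np.sqrt((element_x - x) ** 2 + z ** 2)
            idx = np.round(dist / c * fs).astype(int)
            valid = np.flatnonzero(idx < signals.shape[1])
            image[iz, ix] = signals[valid, idx[valid]].sum()
    return image
```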
  • HDMI (registered trademark) (High Definition Multimedia Interface) 138 is a connection unit connected to the display unit 104.
  • the probe connector port 139 is a connection port for connecting the probe 102 to the control device 101.
  • the ultrasonic signal and photoacoustic signal output from the probe 102 are acquired by the control device 101 via the port 139.
  • CPU 131 and GPU 137 are examples of processors.
  • the ROM 132, RAM 133, and DISK 134 are examples of memories.
  • the control device 101 may have a plurality of processors. In the first embodiment, the function of each unit of the control device 101 is realized by the processor of the control device 101 executing a program stored in the memory.
  • control device 101 may have a CPU or GPU that performs a specific process exclusively. Further, the control device 101 may have a field-programmable gate array (FPGA) in which specific processing or all processing is programmed. The control device 101 may have both an HDD and an SSD as the DISK 134.
  • the modules 142 to 150 stored in the ROM 132 will now be described. The modules 142 to 150 shown in FIG. 1 are those extracted as the modules for executing processing related to the embodiments of the present invention.
  • the control device 101 may include modules necessary for executing the inspection and operating the control device 101 other than those illustrated.
  • each module may be configured as a combination of one or a plurality of programs.
  • some or all of the modules 143 to 150 may be stored in a memory other than the ROM 132 such as the DISK 134.
  • each module will be described in detail.
  • the inspection control module 142 controls inspection performed in the imaging system 100.
  • the inspection control module 142 acquires inspection order information from the ordering system 112.
  • the examination order includes information on a patient who undergoes an examination and information on imaging procedures.
  • the inspection control module 142 controls the probe 102 and the detection unit 103 based on information on the imaging technique.
  • the examination control module 142 displays information on the examination on the display unit 104 via the output module 150 in order to present information related to the examination to the user.
  • the information on the examination displayed on the display unit 104 includes information on the patient undergoing the examination, information on the imaging technique included in the examination, and an image already generated after imaging.
  • the inspection control module 142 transmits information regarding the progress of the examination to the ordering system 112. For example, when the examination is started by the user, the ordering system 112 is notified of the start, and when imaging by all the imaging techniques included in the examination is completed, it is notified of the completion.
  • the signal acquisition module 143 acquires an ultrasonic signal and a photoacoustic signal from the probe 102. Specifically, the signal acquisition module 143 distinguishes and acquires the ultrasonic signal and the photoacoustic signal from the information acquired from the probe 102, based on information from the inspection control module 142, the image processing module 145, and the position acquisition module 149. For example, when the imaging technique being performed defines the acquisition timing of the ultrasonic signal and of the photoacoustic signal, the two signals are distinguished and acquired from the information received from the probe 102 based on the acquisition timing information obtained from the inspection control module 142.
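  • A hedged sketch of this timing-based separation, assuming hypothetical frame and field names (the specification states only that acquisition timing information from the inspection control module 142 is used to tell the two signals apart):

```python
from dataclasses import dataclass

@dataclass
class Frame:
    timestamp: float   # seconds since the start of acquisition
    samples: bytes     # raw transducer data for one acquisition event

def split_frames(frames, pa_windows):
    """Split probe frames into (ultrasonic, photoacoustic) lists.

    pa_windows: (start, end) times during which the irradiation unit fired,
    so frames received in those windows carry photoacoustic data.
    """
    ultrasonic, photoacoustic = [], []
    for frame in frames:
        in_pa_window = any(s <= frame.timestamp < e for s, e in pa_windows)
        (photoacoustic if in_pa_window else ultrasonic).append(frame)
    return ultrasonic, photoacoustic
```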
  • the signal acquisition module 143 is an example of an acquisition unit that acquires at least one of an ultrasonic signal and a photoacoustic signal from the probe 102.
  • the signal acquisition module 143 has an irradiation control module 144.
  • the irradiation control module 144 controls the light irradiation by the irradiation unit 107 based on the information regarding the imaging conditions acquired from the inspection control module 142 and the result of analysis of the ultrasonic image by the image processing module 145.
  • the image processing module 145 is a module for performing processing for generating an image based on a signal acquired in the imaging system 100.
  • the image processing module 145 includes an ultrasonic image generation module 146, a photoacoustic image generation module 147, and a superimposed image generation module 148.
  • the image processing module 145 stores the images generated by the ultrasonic image generation module 146, the photoacoustic image generation module 147, and the superimposed image generation module 148 in the DISK 134 together with accompanying information.
  • the image is stored in the external device by outputting the image together with the supplementary information to the external device via the output module 150.
  • the ultrasonic image generation module 146 generates an ultrasonic image to be displayed on the display unit 104 from the ultrasonic signal acquired by the signal acquisition module 143.
  • the ultrasonic image generation module 146 generates an ultrasonic image suited to the set mode based on the imaging technique information acquired from the examination control module 142. For example, when the Doppler mode is set as the imaging technique, the ultrasound image generation module 146 generates an image showing the flow velocity inside the subject based on the difference between the frequency of the ultrasound signal acquired by the signal acquisition module 143 and the transmission frequency.
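  • As a worked example of the underlying Doppler relationship (standard ultrasound physics, not a formula quoted from this specification), the axial flow velocity follows from the measured frequency shift:

```python
import math

def flow_velocity_m_s(f_shift_hz, f_transmit_hz, angle_deg, c=1540.0):
    """Axial flow velocity from the Doppler frequency shift.

    v = c * fd / (2 * f0 * cos(theta)), theta being the beam-to-flow angle.
    """
    return c * f_shift_hz / (2.0 * f_transmit_hz * math.cos(math.radians(angle_deg)))

# A 1.3 kHz shift at a 5 MHz transmit frequency and a 60 degree angle:
print(flow_velocity_m_s(1.3e3, 5e6, 60.0))  # ~0.40 m/s
```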
  • the ultrasonic image generated by the ultrasonic image generation module 146 may be generated by any method, such as A-mode, M-mode, or Doppler mode, and may be a harmonic image or an elastography image.
  • the ultrasonic image generation module 146 analyzes the generated ultrasonic image and specifies a region for which a photoacoustic signal is to be acquired and a photoacoustic image generated. For example, the ultrasonic image generation module 146 analyzes the ultrasonic image and identifies an area where a calculus may be present. In this respect, the ultrasonic image generation module 146 functions as a detection unit that analyzes an ultrasonic image and specifies the region from which a photoacoustic image is acquired.
  • the photoacoustic image generation module 147 generates a photoacoustic image based on the photoacoustic signal acquired by the signal acquisition module 143.
  • the photoacoustic image generation module 147 reconstructs, based on the photoacoustic signal, the distribution of acoustic waves at the time of light irradiation (hereinafter referred to as the initial sound pressure distribution).
  • the photoacoustic image generation module 147 obtains the light absorption coefficient distribution in the subject by dividing the reconstructed initial sound pressure distribution by the light fluence distribution of the subject irradiated with the light.
  • the concentration distribution of the substance in the subject is obtained from the absorption coefficient distribution for a plurality of wavelengths by utilizing the fact that the degree of light absorption in the subject varies depending on the wavelength of the light irradiated to the subject.
  • the photoacoustic image generation module 147 acquires the concentration distributions of oxyhemoglobin and deoxyhemoglobin in the subject. Further, the photoacoustic image generation module 147 acquires the oxygen saturation distribution as the ratio of the oxyhemoglobin concentration to the total hemoglobin concentration.
  • the photoacoustic image generated by the photoacoustic image generation module 147 is an image indicating information such as the above-described initial sound pressure distribution, optical fluence distribution, absorption coefficient distribution, substance concentration distribution, and oxygen saturation distribution.
  • the photoacoustic image may be any image generated by combining these.
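  • A minimal sketch of the two-wavelength unmixing implied above: the absorption coefficients measured at two wavelengths form a 2x2 linear system in the two hemoglobin concentrations, and the saturation is the oxyhemoglobin fraction of the total. The extinction coefficients below are rounded literature-style values included purely for illustration.

```python
import numpy as np

EXTINCTION = np.array([
    #   HbO2     Hb     (approximate molar extinction values, illustration only)
    [ 520.0, 1400.0],   # ~750 nm
    [1060.0,  690.0],   # ~850 nm
])

def oxygen_saturation(mu_a_750: float, mu_a_850: float) -> float:
    """Unmix two absorption coefficients into an oxygen saturation value."""
    # Solve EXTINCTION @ [C_HbO2, C_Hb] = [mu_a_750, mu_a_850]
    c_hbo2, c_hb = np.linalg.solve(EXTINCTION, np.array([mu_a_750, mu_a_850]))
    return c_hbo2 / (c_hbo2 + c_hb)   # saturation = oxyhemoglobin / total
```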
  • the image processing module 145 is an example of a generation unit that generates an ultrasonic image based on the ultrasonic signal and generates a photoacoustic image based on the photoacoustic signal.
  • the superimposed image generation module 148 generates a superimposed image in which the photoacoustic image generated by the photoacoustic image generation module 147 is superimposed on the ultrasonic image generated by the ultrasonic image generation module 146.
  • the superimposed image generation module 148 obtains a superimposed image by aligning the ultrasonic image and the photoacoustic image.
  • for this alignment, information regarding the imaging conditions acquired from the inspection control module 142, or the position of the probe 102 acquired from the position acquisition module 149 described later, may be used.
  • the alignment may be performed based on a region that is depicted in common for the ultrasonic image and the photoacoustic image.
  • the position acquisition module 149 acquires information related to the position of the probe 102 based on information from the detection unit 103.
  • the position acquisition module 149 obtains at least one of information on the speed of movement of the probe 102 with respect to the subject, information on the speed of rotation, and information indicating the degree of pressure on the subject based on the change over time of the information on the position. You may get it.
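  • A minimal sketch, assuming the detection unit 103 delivers timestamped probe positions: the movement speed can then be estimated by finite differences over consecutive samples, consistent with deriving it from "the change over time of the information on the position". The sample layout is hypothetical.

```python
def probe_speed_m_s(p0, p1):
    """Speed between two pose samples p = (t [s], x, y, z [m])."""
    dt = p1[0] - p0[0]
    dx, dy, dz = (p1[i] - p0[i] for i in (1, 2, 3))
    return (dx * dx + dy * dy + dz * dz) ** 0.5 / dt
```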
  • the position acquisition module 149 acquires the position information of the probe 102 at regular time intervals, preferably in real time.
  • the position acquisition module 149 may acquire information regarding the probe 102 used for imaging.
  • Information related to the probe 102 includes information such as the type of probe, center frequency, sensitivity, acoustic focus, electronic focus, and observation depth.
  • the position acquisition module 149 appropriately transmits information regarding the position of the probe 102 and information regarding the probe 102 to the inspection control module 142, the image processing module 145, and the output module 150.
  • the output module 150 outputs information for displaying a screen on the display unit 104, and outputs the information to an external device via the network 110.
  • the output module 150 controls the display unit 104 to display information on the display unit 104.
  • the output module 150 displays information on the display unit 104 in response to an input from the inspection control module 142 or the image processing module 145 or a user operation input via the operation unit 105.
  • the output module 150 is an example of a display control unit.
  • the output module 150 outputs information from the control device 101 to an external device such as the PACS 113 via the network 110.
  • the output module 150 outputs the ultrasonic image and the photoacoustic image generated by the image processing module 145 and a superimposed image thereof to the PACS 113.
  • the image output from the output module 150 includes incidental information attached as various tags according to the DICOM standard by the inspection control module 142.
  • the incidental information includes, for example, patient information, information indicating the imaging device that captured the image, an image ID for uniquely identifying the image, and an examination ID for uniquely identifying the examination that captured the image Is included. Further, the incidental information includes information that associates an ultrasonic image captured in the same examination with a photoacoustic image.
  • the information associating the ultrasonic image with the photoacoustic image is, for example, information indicating which of the plurality of frames constituting the ultrasonic image was acquired at the timing closest to that of the photoacoustic image.
  • the position information of the probe 102 acquired by the detection unit 103 may be incidental to each frame of the ultrasonic image and the photoacoustic image. That is, the output module 150 outputs information indicating the position of the probe 102 that has acquired the ultrasonic signal for generating the ultrasonic image, attached to the ultrasonic image. Also, the output module 150 outputs information indicating the position of the probe 102 that has acquired the photoacoustic signal for generating the photoacoustic image, attached to the photoacoustic image.
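  • A hedged sketch, using pydicom, of attaching such incidental information as DICOM tags. The specification does not say which tags the device actually uses; standard patient/study/instance identifiers are shown, and the frame-association comment is purely illustrative.

```python
from pydicom.dataset import Dataset
from pydicom.uid import generate_uid

ds = Dataset()
ds.PatientID = "PAT-0001"                 # patient information (example value)
ds.StudyInstanceUID = generate_uid()      # identifies the examination (study)
ds.SOPInstanceUID = generate_uid()        # uniquely identifies this image
ds.Manufacturer = "ExampleImagingDevice"  # imaging device information (example)
# Hypothetical association: the index of the ultrasonic frame whose
# acquisition timing is closest to this photoacoustic image.
ds.ImageComments = "linked_ultrasound_frame=42"
```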
  • the output module 150 is an example of an output unit.
  • FIG. 2 is a flowchart showing an example of processing of the control apparatus 101 for controlling light irradiation based on the acquired ultrasonic image and acquiring a photoacoustic image.
  • the CPU 131 or the GPU 137 is a main body that realizes the processes by the modules unless otherwise specified.
  • in step S201, the ultrasonic image generation module 146 is executed to acquire an ultrasonic image. Specifically, the inspection control module 142 is first executed prior to the examination, whereby the examination order is acquired from the ordering system 112.
  • the examination order includes information on the patient to be examined, information on the part to be examined, and information on the imaging technique.
  • the user operates the probe 102, and an ultrasonic signal is transmitted from the probe 102 to the control device 101.
  • by executing the signal acquisition module 143, the ultrasonic signal is acquired by the control device 101.
  • by executing the ultrasonic image generation module 146, an ultrasonic image is generated based on the ultrasonic signal.
  • by executing the output module 150, the ultrasonic image is displayed on the display unit 104.
  • the user can further operate the probe 102 while observing the ultrasonic image displayed on the display unit 104.
  • in step S202, the inspection control module 142 is executed to acquire information indicating whether to acquire a photoacoustic signal based on the ultrasonic image.
  • the information indicating whether or not to acquire the photoacoustic signal based on the ultrasonic image is specifically information set in advance by the user or included in the inspection order.
  • whether to acquire a photoacoustic signal may also be determined according to the processing load on the control device 101. For example, when the processing load on the control device 101 is heavy and acquiring the photoacoustic signal would affect the processing for acquiring the ultrasonic image, it may be determined not to acquire a photoacoustic signal. If the photoacoustic signal is to be acquired based on the ultrasonic image, the process proceeds to step S203; if not, the process returns to step S201 and acquisition of the ultrasonic image continues.
  • in step S203, the ultrasonic image generation module 146 is executed to analyze the ultrasonic image.
  • the ultrasound image generation module 146 analyzes the ultrasound image and detects a region of interest defined in advance from the regions depicted in the ultrasound image.
  • an organ such as a calculus, a tumor, or a blood vessel is set in the region of interest.
  • the calculus here includes the case of a virtual image (artifact) that can be depicted by ultrasonic imaging.
  • a computer diagnosis support system can also be used in combination to search for a site of interest.
  • the analysis of the ultrasonic image and the region to be extracted as the region of interest are not limited to the examples described above; any form is acceptable as long as it can detect a region for which acquiring a photoacoustic image in addition to the ultrasonic image is considered beneficial.
  • speckle is regarded as a noise component, and is reduced by a spatial filter such as a moving-average or median filter.
  • alternatively, speckle is reduced by a locally adaptive filter whose mask shape varies based on the Rayleigh-distribution property peculiar to speckle patterns.
  • speckles may be reduced using a multi-resolution filter or a filter by numerical simulation.
  • segmentation for extracting a target region is then performed on the speckle-reduced image by threshold processing and differentiation processing on the density values (a sketch of this simple pipeline follows this list).
  • segmentation using a variable shape model may be performed.
  • segmentation may be performed based on a speckle pattern that is a property unique to an ultrasound image.
  • for nonuniform echoes, texture analysis may be used, for example with feature amounts based on a co-occurrence matrix of density values, or with statistics obtained from the parameters of a log-compressed K distribution as feature amounts. A probability distribution method is one example.
  • information on internal echo, shape, boundary echo, backward echo, and outer shadow may be used.
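  • As referenced above, here is a sketch of the simplest of these pipelines: median filtering to suppress speckle, followed by thresholding of the density values and connected-component labeling to extract candidate regions of interest. The filter size and threshold are illustrative assumptions.

```python
import numpy as np
from scipy.ndimage import label, median_filter

def detect_candidate_regions(image: np.ndarray, threshold: float = 0.6):
    """Label bright connected regions in an image scaled to [0, 1]."""
    despeckled = median_filter(image, size=5)   # spatial filter against speckle
    mask = despeckled > threshold               # threshold on the density value
    labels, n_regions = label(mask)             # connected-component extraction
    return labels, n_regions
```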
  • while the ultrasonic image is analyzed in step S203, the ultrasonic signal may continue to be acquired by the signal acquisition module 143, the ultrasonic image may continue to be generated by the ultrasonic image generation module 146, and the ultrasonic image may continue to be displayed on the display unit 104 by the output module 150.
  • in step S204, by executing the ultrasonic image generation module 146, it is determined whether to acquire a photoacoustic image based on the analysis result of step S203. If a region of interest was extracted in step S203, the process proceeds to step S205; if not, the process returns to step S201 and acquisition of the ultrasonic image continues.
  • in step S205, the irradiation control module 144 is executed to determine whether to irradiate the subject with light. Specifically, the irradiation control module 144 determines whether the probe 102 is in contact with the subject, based on the ultrasonic image generated by the ultrasonic image generation module 146 and the position information of the probe 102 acquired by the position acquisition module 149.
  • the probe 102 may be provided with a sensor (not shown) for detecting contact with the subject, and the irradiation control module 144 may determine contact between the subject and the probe 102 based on information from this sensor.
  • the irradiation control module 144 controls the irradiation unit 107 to emit light when it is determined that the subject and the probe 102 are in contact. When it is determined that they are not in contact, a screen notifying the user of this may be displayed on the display unit 104 via the output module 150.
  • the irradiation control module 144 may further control the irradiation unit 107 to irradiate light when the region extracted as the region of interest in step S203 is depicted in the ultrasonic image generated by the ultrasonic image generation module 146.
  • if it is determined in step S205 that the subject is to be irradiated with light, the process proceeds to step S206. If it is determined that the subject is not to be irradiated, the process returns to step S201 and acquisition of the ultrasonic image continues.
  • in step S206, the photoacoustic signal is acquired from the probe 102 by executing the signal acquisition module 143.
  • in step S207, the photoacoustic image generation module 147 is executed to reconstruct a photoacoustic image from the photoacoustic signal acquired in step S206. Then, by executing the output module 150, the reconstructed photoacoustic image is displayed on the display unit 104. Furthermore, the superimposed image generation module 148 may be executed under the control of the photoacoustic image generation module 147, so that a superimposed image is generated and displayed on the display unit 104 via the output module 150.
  • morphological information inside the subject is depicted in an ultrasonic image such as a B-mode image.
  • functional information such as the amount of hemoglobin in the blood vessel is depicted in the photoacoustic image.
  • a superimposed image generated according to a predetermined overlapping condition is displayed on the display unit 104.
  • the superimposing conditions are, for example, conditions such as a color for displaying each image, a superimposing range, and transparency in a superimposed image in which an ultrasonic image is a base image and a photoacoustic image is a layer image.
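  • A minimal sketch of such superimposing conditions, assuming the ultrasonic image as a grayscale base and the photoacoustic image mapped to a single hue (red here) with a transparency setting; the color and alpha stand in for the user-settable conditions mentioned above.

```python
import numpy as np

def superimpose(us_image: np.ndarray, pa_image: np.ndarray, alpha: float = 0.5):
    """us_image, pa_image: 2-D arrays scaled to [0, 1]; returns an RGB image."""
    base = np.stack([us_image] * 3, axis=-1)    # grayscale -> RGB base layer
    layer = np.zeros_like(base)
    layer[..., 0] = pa_image                    # photoacoustic intensity in red
    weight = alpha * (pa_image > 0)[..., None]  # blend only where PA signal exists
    return (1.0 - weight) * base + weight * layer
```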
  • the control device 101 can thus display the ultrasonic image and the photoacoustic image on the display unit 104 so that the image of the region of interest specified based on the morphological information and the functional information can be referred to in association with each other. This improves the workflow in which a user such as a doctor observes medical images of a subject, such as ultrasonic and photoacoustic images, and makes a diagnosis.
  • the irradiation control module 144 is not limited to the example described above, and may irradiate the subject with light based on a user operation input.
  • the photoacoustic image may be generated by the photoacoustic image generation module 147 and the photoacoustic image may be displayed on the display unit 104 via the output module 150.
  • the photoacoustic image generated by the photoacoustic image generation module 147 and the superimposed image generated by the superimposed image generation module 148 may be appropriately stored in the DISK 134 or the PACS 113.
  • FIG. 3 is a diagram illustrating an example of analysis performed in step S203 illustrated in FIG. 2 and an example of an image displayed in step S207.
  • a case where a region with a possibility of calculus is depicted in an ultrasonic image will be described as an example.
  • FIG. 3A is a diagram schematically illustrating the internal structure of the subject.
  • a region 301 is the region depicted in an ultrasonic image generated based on an ultrasonic signal acquired with the probe in contact with a certain position on the subject. It is assumed that a calculus 302 and a blood vessel 303 exist inside the subject, and that the calculus 302 is located outside the region 301.
  • FIG. 3B is an example of an ultrasonic image 304 generated by imaging the region 301 illustrated in FIG.
  • a blood vessel region 306 corresponding to the blood vessel 303 inside the subject is depicted.
  • an image of the calculus 302 existing outside the region 301 is depicted as a virtual image 305 in the ultrasonic image 304.
  • a virtual image is an image in which a structure that does not originally exist inside the subject is depicted on the image.
  • as a result, an ultrasonic image 304 is obtained as if the calculus 302 were present inside the region 301 depicted by the ultrasonic waves.
  • a user who observes the ultrasound image 304 must determine whether or not the virtual image region 305 is a virtual image. In general, the user often determines whether the image is a virtual image while operating the probe 102 and changing the imaging range of the ultrasonic image.
  • the picked-up ultrasonic image is analyzed to detect, for example, a region with a possibility of calculus.
  • FIG. 3C is an example of a photoacoustic image 307 generated by imaging the region 301 illustrated in FIG. That is, the photoacoustic image 307 is a photoacoustic image generated based on the photoacoustic signal acquired by the processing in steps S204 to S206 illustrated in FIG. In the photoacoustic image 307, a blood vessel region 308 corresponding to the blood vessel 303 inside the subject is depicted. The photoacoustic image 307 does not depict the image caused by the calculus 302 or the influence on the blood vessel region 308.
  • a possible reason why the feature resulting from the calculus 302 is not depicted in the photoacoustic image 307 is that, for example, the laser light emitted from the irradiation unit 107 travels in a straighter line than the ultrasonic waves emitted from the transmission / reception unit 106.
  • when the virtual image region 305, which is an image possibly representing a calculus, is indeed a virtual image, the feature due to the calculus 302 is not drawn at the corresponding position in the photoacoustic image 307. Therefore, the user can refer to the information obtained from the photoacoustic image when determining whether a region considered to contain a calculus is a virtual image.
  • FIG. 3D is a diagram illustrating an example of a superimposed image 309 in which the photoacoustic image 307 is superimposed on the ultrasonic image 304.
  • the photoacoustic image is displayed by displaying the superimposed image 309.
  • FIG. 3D shows an example in which the photoacoustic image 310 of the region corresponding to the region detected as the region of interest in step S203 is superimposed on the ultrasonic image 304 and displayed.
  • the method of displaying the photoacoustic image here, that is, the superimposing method can be set in advance by the user.
  • the photoacoustic image is displayed in a color corresponding to the intensity of the photoacoustic wave.
  • a blood vessel having the characteristic of absorbing the irradiated light and generating a photoacoustic wave is depicted in the photoacoustic image.
  • the photoacoustic image 310 only a blood vessel is depicted on a blood vessel region 311 corresponding to the blood vessel 303, and an image resulting from the calculus 302 is not depicted.
  • the control device 101 can assist the user in diagnosis related to the region of interest. For example, the control device 101 can assist in determining whether or not a region with a possibility of calculus is a virtual image.
  • in step S203, the ultrasonic image may be analyzed to detect a virtual image using the technique described in "Removal of a virtual image in an ultrasonic image using fuzzy image processing" (Medical Imaging Technology, Vol. 14, No. 5, 1996).
  • in step S207, the detected virtual image may be removed before display. The user can determine, using the photoacoustic image, whether the detected virtual image region is in fact a virtual image. Even when the virtual image is removed from the displayed ultrasonic image, the photoacoustic image is displayed in a comparable manner, so the user can visually confirm that the virtual image has been removed from the ultrasonic image.
  • the control device 101 may determine whether or not a region having a possibility of a virtual image detected in the ultrasonic image is a virtual image based on information drawn in the photoacoustic image.
  • the control device 101 analyzes the acquired ultrasonic image to detect a region of interest, and generates a photoacoustic image corresponding to at least the region of interest. Thereby, an image useful for diagnosing a region of interest can be efficiently captured. Further, since the control device 101 emits light when a region of interest is detected, redundant light irradiation can be reduced.
  • Elastography is a method for imaging tissue hardness according to the principle described below.
  • elastography takes into account hardness based on Hooke's law, and measures tissue strain due to externally applied stress. For example, when the probe 102 is pressed from the body surface, there is a property that the softer tissue is deformed more greatly. If the displacement of the tissue before and after the pressurization is measured and differentiated, the strain at each point of the tissue can be obtained.
  • An elastography image is an image of the strain distribution at each point of tissue.
  • an elastography image is a two-dimensional image expressed by changing the hue so that a portion with large strain (a soft portion) appears red and a portion with small strain (a hard portion) appears blue, via an intermediate green.
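  • A minimal sketch of this principle: the strain is the spatial derivative of the axial displacement measured before and after compression, and a simple linear red-green-blue mapping stands in for the hue scheme described above.

```python
import numpy as np

def strain_map(displacement: np.ndarray) -> np.ndarray:
    """displacement: 2-D axial displacement field; strain = d(disp)/dz."""
    return np.abs(np.gradient(displacement, axis=0))

def strain_to_rgb(strain: np.ndarray) -> np.ndarray:
    s = strain / (strain.max() + 1e-12)      # normalize strain to [0, 1]
    red = s                                  # large strain (soft) -> red
    blue = 1.0 - s                           # small strain (hard) -> blue
    green = 1.0 - np.abs(2.0 * s - 1.0)      # intermediate strain -> green
    return np.stack([red, green, blue], axis=-1)
```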
  • for example, when the breast is the subject, adipose tissue is soft, while a site calcified by breast cancer or the like is considered to be hard.
  • knowing the hardness of the tissue in the subject is useful information for diagnosis.
  • a tumor tissue generates a lot of new blood vessels around it, and it is considered useful for diagnosis to use blood vessel information obtained by a photoacoustic image together with an elastography image.
  • a case will now be described in which the photoacoustic image is acquired by controlling the irradiation of light onto the subject based on the result of analyzing an elastography image, which is an example of an ultrasonic image.
  • FIG. 5 is a flowchart showing an example of processing of the control device 101 for controlling light irradiation based on the acquired ultrasonic image and acquiring a photoacoustic image.
  • the CPU 131 or the GPU 137 is a main body that realizes the processes by the modules unless otherwise specified.
  • in step S501, the ultrasonic image generation module 146 is executed to acquire an ultrasonic image. Specifically, the inspection control module 142 is first executed prior to the examination, whereby the examination order is acquired from the ordering system 112.
  • the examination order includes information on the patient to be examined, information on the part to be examined, and information on the imaging technique.
  • the user operates the probe 102, and an ultrasonic signal is transmitted from the probe 102 to the control device 101.
  • by executing the signal acquisition module 143, the ultrasonic signal is acquired by the control device 101.
  • by executing the ultrasonic image generation module 146, an ultrasonic image is generated based on the ultrasonic signal.
  • by executing the output module 150, the ultrasonic image is displayed on the display unit 104.
  • the user can further operate the probe 102 while observing the ultrasonic image displayed on the display unit 104.
  • in step S502, by executing the signal acquisition module 143, it is determined whether imaging by elastography has been performed. For example, while observing the ultrasonic image generated in step S501 and displayed on the display unit 104, the user finds a region to be observed in detail and performs imaging by elastography. For example, the user can switch the operation mode related to signal acquisition of the probe 102 to the elastography imaging mode via the operation unit 105. A switch or the like for switching the operation mode may also be provided on the probe 102. The user switches to the elastography imaging mode and presses the probe 102 against the subject, and an ultrasonic signal is acquired from the probe 102 by the signal acquisition module 143.
  • in step S503, the inspection control module 142 is executed to acquire information indicating whether to acquire a photoacoustic signal based on the ultrasonic image. Since the process of step S503 is the same as that of step S202 illustrated in FIG. 2, the above description applies and is omitted here.
  • when a photoacoustic signal is to be acquired based on the ultrasonic image, the process proceeds to step S504; when it is not, the process returns to step S501 and acquisition of the ultrasonic image continues.
  • in step S504, by executing the ultrasonic image generation module 146, the elastography image acquired in step S502 is analyzed.
  • the ultrasound image generation module 146 analyzes the elastography image, and detects a region of interest defined in advance from the regions drawn on the elastography image.
  • an example of the analysis performed by the ultrasonic image generation module 146 in step S504 will now be described.
  • an example in which an elastography image is analyzed and a region where a hard tissue may be depicted is detected as a region of interest is shown.
  • the elastography image is obtained by imaging the strain distribution at each point of the tissue.
  • for example, a set of pixels having a strain equal to or less than a predetermined value is detected as a region where hard tissue may be depicted.
  • the signal acquisition module 143 or the ultrasonic image generation module 146 provides information indicating the degree of the pressing.
  • in step S504, information on the degree to which the probe 102 presses the subject may also be used, because the ease with which each tissue is displaced changes according to that degree. For example, the greater the degree to which the probe 102 presses the subject, the smaller the predetermined strain value may be set, as sketched below.
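  • A hedged sketch of this pressure-adjusted detection; the linear shrinking of the threshold with pressing degree is an illustrative assumption, not a rule given in this specification.

```python
import numpy as np

def hard_tissue_mask(strain: np.ndarray, base_threshold: float,
                     pressing_degree: float) -> np.ndarray:
    """Pixels whose strain falls below a pressure-adjusted threshold.

    pressing_degree: 0.0 (light contact) .. 1.0 (strong compression); the
    harder the press, the more even stiff tissue deforms, so the threshold
    is reduced.
    """
    threshold = base_threshold * (1.0 - 0.5 * pressing_degree)
    return strain <= threshold
```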
  • while the elastography image is analyzed in step S504, the signal acquisition module 143 may continue to acquire an ultrasonic signal, the ultrasonic image generation module 146 may continue to generate an ultrasonic image such as a B-mode image or an elastography image, and the output module 150 may continue to display the ultrasonic image on the display unit 104.
  • in step S505, by executing the ultrasonic image generation module 146, it is determined whether to acquire a photoacoustic image based on the analysis result of step S504. If a region of interest was extracted in step S504, the process proceeds to step S506; if not, the process returns to step S501 and acquisition of the ultrasonic image continues.
  • in step S506, the irradiation control module 144 is executed to determine whether to irradiate the subject with light. Since the process of step S506 is the same as that of step S205 illustrated in FIG. 2, its description is omitted here.
  • in step S507, the signal acquisition module 143 is executed to acquire a photoacoustic signal from the probe 102.
  • in step S508, a photoacoustic image is generated from the photoacoustic signal acquired in step S507 by executing the photoacoustic image generation module 147. Then, by executing the output module 150, the reconstructed photoacoustic image is displayed on the display unit 104. Furthermore, the superimposed image generation module 148 may be executed under the control of the photoacoustic image generation module 147, so that a superimposed image is generated and displayed on the display unit 104 via the output module 150.
  • a superimposed image obtained by superimposing a photoacoustic image on an elastography image is displayed on the display unit 104.
  • the elastography image is generally expressed as a color image having a hue reflecting the degree of elasticity.
  • an image reflecting the concentration of a specific substance, for example, hemoglobin is expressed as a color image of a hue reflecting the magnitude of the concentration.
  • it is preferable to use different hues for the base image and the layer image for example.
  • the region expressed by the hue of the photoacoustic image and the region expressed by the hue of the elastography image overlap, the user confirms that the region is the region depicted in both images. It is preferable to make it visible.
  • the control device 101 displays the ultrasonic image and the photoacoustic image on the display unit 104 so that the image of the region of interest specified based on the degree of elasticity of the tissue and the function information can be referred to. Can be displayed.
  • the user may determine whether a region that is considered to be a hard tissue when an elastography image is observed is a malignant tumor.
  • the control apparatus 101 can assist a user's judgment by presenting, for example, information on a new blood vessel drawn in the photoacoustic image in a comparable manner.
  • the workflow for a user such as a doctor to observe a medical image such as an ultrasonic image or a photoacoustic image of a subject and perform a diagnosis can be improved.
  • the irradiation control module 144 is not limited to the example described above, and may irradiate the subject with light based on a user operation input.
  • the photoacoustic image may be generated by the photoacoustic image generation module 147 and the photoacoustic image may be displayed on the display unit 104 via the output module 150.
  • the photoacoustic image generated by the photoacoustic image generation module 147 and the superimposed image generated by the superimposed image generation module 148 may be appropriately stored in the DISK 134 or the PACS 113.
  • FIG. 4 is a diagram illustrating an example of analysis performed in step S504 illustrated in FIG.
  • a case where a region having a possibility of a hard tissue is depicted in an elastography image will be described as an example.
  • FIG. 4A is an example of an elastography image 401. It is the figure which illustrated typically the structure inside a subject. In the image 401, a hard tissue region 402 is displayed so as to be distinguishable from surrounding softer tissue.
  • FIG. 4B is an example of the photoacoustic image 403 obtained by imaging the region depicted in the elastography image 401 shown in FIG.
  • a blood vessel region 404 and a blood vessel region 405 are depicted in the photoacoustic image 403.
  • FIG. 4C is an example of a superimposed image 406 in which the photoacoustic image 403 is superimposed on the elastography image 401.
  • the blood vessel region 407 corresponds to the blood vessel region 404 depicted in the photoacoustic image illustrated in FIG.
  • the tissue region 408 corresponds to the tissue region 402 depicted in the elastography image illustrated in FIG.
  • FIG. 4C is an example of a superimposed image in which a photoacoustic image in the vicinity of a tissue region 408 that is considered to be harder than the surrounding tissue in the elastography image is superimposed.
  • the control device 101 analyzes the acquired ultrasonic image to detect a region of interest, and generates a photoacoustic image corresponding to at least the region of interest. Thereby, an image useful for diagnosing a region of interest can be efficiently captured. Further, since the control device 101 emits light when a region of interest is detected, redundant light irradiation can be reduced.
  • the example using the elastography image that expresses the elasticity of the tissue qualitatively has been described, but the present invention is not limited to this.
  • an image generated by quantitative elastic imaging that quantitatively represents the elasticity of the tissue may be used.
  • the propagation of sound waves is the propagation of wave energy, and a force called an acoustic radiation force is generated in the sound wave propagation direction in an object that blocks the propagation of wave energy. Therefore, when a convergent ultrasonic pulse having a high sound pressure and a relatively long duration is radiated to a living body, a minute displacement occurs in the tissue due to the acoustic radiation force.
  • a transverse wave is generated that propagates in a direction perpendicular to the displacement, that is, in a direction perpendicular to the ultrasonic beam. Since the propagation speed of the transverse wave is slower than that of the longitudinal wave, the propagation process of the transverse wave can be imaged by the pulse echo method, and the propagation speed can be obtained. It is considered that the propagation speed of the shear wave is higher as the tissue is harder, and thereby the hardness of the tissue can be quantitatively evaluated.
  • the elastic modulus distribution that is, the quantitative hardness index may be obtained and imaged based on the tissue strain distribution obtained by qualitative elastography and the tissue stress distribution.
  • the tissue stress distribution cannot be directly measured, but may be obtained by anatomical information, simulation, or the like.
  • the photoacoustic image depicts a substance or tissue having a property of absorbing irradiated light and generating an acoustic wave (hereinafter referred to as optical characteristics).
  • tissue features with optical properties can aid in diagnosis. For example, it is said that there are many new blood vessels in the vicinity of the tumor tissue, and there is a possibility that there is a correlation between a region where the density of thin blood vessels is high and the malignancy of the tumor. Further, there is a possibility that the concentration of a substance having optical characteristics is different in a specific lesion tissue as compared with the surrounding normal tissue.
  • a region having a lesion having such characteristics By observing the photoacoustic image, there may be a case where a region having a lesion having such characteristics can be identified.
  • a user observing a photoacoustic image finds a region that may have a lesion, i.e., a region of interest that requires more detailed observation, providing more detailed information about the region of interest It is considered useful for diagnosis.
  • an ultrasonic signal can be acquired and an ultrasonic image can be displayed.
  • FIG. 7 is a flowchart showing an example of processing of the control device 101 for controlling the irradiation of ultrasonic waves based on the acquired photoacoustic image and acquiring the ultrasonic image.
  • the CPU 131 or the GPU 137 is a main body that realizes the processes by the modules unless otherwise specified.
  • the photoacoustic image is generated by executing the photoacoustic image generation module 147.
  • the inspection control module 142 is executed prior to the inspection, whereby the inspection order is acquired from the ordering system 112.
  • the examination order includes information on the patient to be examined, information on the part to be examined, and information on the imaging technique.
  • the user operates the probe 102, and a photoacoustic signal is transmitted from the probe 102 to the control device 101.
  • the signal acquisition module 143 the photoacoustic signal is acquired by the control device 101.
  • the photoacoustic image generation module 147 a photoacoustic image is generated based on the photoacoustic signal.
  • the output module 150 the photoacoustic image is displayed on the display unit 104.
  • the user can further operate the probe 102 while observing the photoacoustic image displayed on the display unit 104.
  • step S702 the inspection control module 142 is executed to acquire information indicating whether to acquire an ultrasonic signal based on the photoacoustic image.
  • the information indicating whether or not to acquire an ultrasonic signal based on the photoacoustic image is specifically information set in advance by the user or included in the inspection order.
  • whether or not to acquire an ultrasonic signal may be determined according to the state of the load of processing performed by the control device 101. For example, when the processing load performed by the control device 101 is heavy and the acquisition of the photoacoustic signal affects the processing of acquiring the ultrasound image, You may determine not to acquire an acoustic signal.
  • it progresses to step S703 and when not acquiring, it returns to step S701 and acquisition of a photoacoustic image is continued.
  • the photoacoustic image is analyzed by executing the photoacoustic image generation module 147.
  • the photoacoustic image generation module 147 analyzes the photoacoustic image and detects a region of interest defined in advance from the region depicted in the photoacoustic image.
  • an example of analysis performed based on the photoacoustic image generation module 147 in step S703 is shown.
  • an example of analyzing an image (hereinafter referred to as an absorption coefficient image) reflecting an absorption coefficient for light of a specific wavelength will be described as an example of a photoacoustic image.
  • an absorption coefficient image reflecting an absorption coefficient for light of a specific wavelength
  • a blood vessel image is drawn on the absorption coefficient image.
  • the density of blood vessels in the region depicted in the photoacoustic image is analyzed. Then, a region where blood vessels are present at a predetermined density or more is detected as a region of interest.
  • a value obtained by dividing the number of pixels having a pixel value equal to or greater than a certain threshold value by the number of pixels constituting the predetermined range is used as the density. To do.
  • the photoacoustic image is analyzed in step S703, the photoacoustic signal is continuously acquired by the signal acquisition module 143, the photoacoustic image is continuously generated by the photoacoustic image generation module 147, and the output module 150. Then, the photoacoustic image may be continuously displayed on the display unit 104.
  • step S704 when the photoacoustic image generation module 147 is executed, it is determined whether or not to acquire an ultrasonic image based on the analysis result in step S703. If the region of interest is extracted in step S703, the process proceeds to step S705. If the region of interest is not extracted, the process returns to step S701 to continue acquiring the ultrasound image. From this viewpoint, the photoacoustic image generation module 147 is an example of an analysis unit.
  • the irradiation control module 144 is executed to determine whether or not the subject is irradiated with ultrasonic waves. Specifically, the irradiation control module 144 determines whether or not the probe 102 is in contact with the subject. The irradiation control module 144 determines contact between the subject and the probe 102 based on the photoacoustic image generated by the photoacoustic image generation module 147 and the position information of the probe 102 acquired by the position acquisition module 149.
  • the probe 102 is provided with a sensor (not shown) for detecting contact with the subject, and the irradiation control module 144 determines contact between the subject and the probe 102 based on information from the sensor (not shown). Also good.
  • the irradiation control module 144 controls the irradiation unit 107 to irradiate ultrasonic waves when it is determined that the subject and the probe 102 are in contact with each other. When it is determined that the subject and the probe 102 are not in contact, a screen for notifying the user that the subject and the probe 102 are not in contact is displayed on the display unit 104 via the output module 150. May be.
  • the irradiation control module 144 further irradiates the ultrasonic wave based on the photoacoustic image generated by the photoacoustic image generation module 147 when the region extracted as the region of interest in step S703 is drawn on the photoacoustic image.
  • the transmitting / receiving unit 106 may be controlled to do so.
  • the probe 102 leaves
  • possibility of performing redundant ultrasonic irradiation can be reduced.
  • the temperature of the probe 102 is equal to or lower than a predetermined value, it may be determined that the ultrasonic wave is irradiated. Due to the characteristics of the transmitter / receiver 106, when the probe 102 is separated from the subject, an air layer is formed between the probe 102 and the subject. The acoustic impedance of air is much larger than the acoustic impedance of the transmission / reception unit 106.
  • the ultrasonic waves are repeatedly reflected in the vicinity of the transmission / reception unit 106, and the temperature of the probe 102 may rise.
  • the probe 102 may be provided with a temperature sensor that measures the temperature, and the irradiation control module 144 may acquire temperature information of the probe 102 from the temperature sensor. If it is determined in step S705 that the subject is to be irradiated with ultrasonic waves, the process proceeds to step S706, and if it is determined not to be irradiated, the process returns to step S701 to continue acquiring the photoacoustic image.
  • step S706 an ultrasonic signal is acquired from the probe 102 by executing the signal acquisition module 143. Then, by executing the ultrasonic image generation module 146, an ultrasonic image is generated from the ultrasonic signal. Then, when the output module 150 is executed, the generated ultrasonic image is displayed on the display unit 104. Furthermore, a superimposition image may be generated by controlling the ultrasonic image generation module 146 to execute the superimposition image generation module 148, and the superimposition image may be displayed on the display unit 104 via the output module 150. .
  • the ultrasonic image generated in step S706 is, for example, a B mode image.
  • step S707 it is determined whether or not imaging by elastography has been performed by executing the ultrasonic image generation module 146. Since the region of interest is depicted in the photoacoustic image, it is considered useful to display the elastography image on the display unit 104 as one piece of detailed information that assists diagnosis of the region of interest. For example, a screen is displayed on the display unit 104 via the output module 150 to notify the user that there is a region with high blood vessel density and that it is useful to perform elastography imaging. The user refers to the notification screen, presses the probe 102 against the subject, and performs elastography imaging. What is the process for determining whether the user has performed elastography photography?
  • step S502 Since it is the same as the process of step S502 illustrated in FIG. 5, description here is abbreviate
  • step S708 by executing the ultrasonic image generation module 146, an elastography image is generated from the ultrasonic signal acquired in step S707. Then, when the output module 150 is executed, the generated elastography image is displayed on the display unit 104. Furthermore, a superimposition image may be generated by controlling the ultrasonic image generation module 146 to execute the superimposition image generation module 148, and the superimposition image may be displayed on the display unit 104 via the output module 150. .
  • control apparatus 101 can refer to the image of the region of interest specified based on the photoacoustic image and the ultrasonic image such as the B-mode image and the elastography image in association with each other. Images can be displayed on the display unit 104. Thereby, the workflow for a user such as a doctor to observe a medical image such as an ultrasonic image or a photoacoustic image of a subject and perform a diagnosis can be improved.
  • the irradiation control module 144 is not limited to the above-described example, and the subject may be irradiated with ultrasonic waves based on a user operation input.
  • an ultrasonic image may be generated by the ultrasonic image generation module 146 and the ultrasonic image may be displayed on the display unit 104 via the output module 150.
  • the elastography image generated by the ultrasonic image generation module 146 and the superimposed image generated by the superimposed image generation module 148 may be appropriately stored in the DISK 134 or the PACS 113.
  • step S707 and step S708 are not necessarily performed.
  • an ultrasonic image such as a B-mode image may be displayed based on the result of analyzing the photoacoustic image.
  • FIG. 6 is a diagram illustrating an example of analysis performed in step S703 illustrated in FIG. 7 and an example of an image displayed in step S708.
  • a region having a high blood vessel density is depicted in the photoacoustic image will be described as an example.
  • FIG. 6A is an example of the photoacoustic image 602 displayed on the display unit 104.
  • a blood vessel region 603 and a blood vessel region 604 are depicted.
  • FIG. 6B is an example of a screen displayed on the display unit 104 when a region considered to have a high blood vessel density is detected as a result of the analysis in step S703.
  • a frame 605 indicates the region of interest detected in step S703. Thereby, the user can visually recognize the region of interest, that is, the region detected when the blood vessel density is high.
  • the notification screen 606 displays information based on the analysis result. For example, a message for informing the user that there is a region with a high blood vessel density and that the elastography image is useful for further observation is displayed on the notification screen 606. For example, a message “Please perform elastography of a blood vessel dense region” is displayed on the notification screen 606.
  • FIG. 6C is an example of an elastography image 607 based on the ultrasonic signal obtained by the elastography imaging performed in step S707.
  • a tissue region 608 that may be a hard tissue is depicted so as to be distinguishable from surrounding tissues.
  • FIG. 6D is an example of a superimposed image 609 obtained by superimposing an elastography image 607 on the photoacoustic image 602.
  • a blood vessel region 610 corresponding to the blood vessel region 603 of the photoacoustic image 602 is depicted.
  • the user can visually recognize the region related to the blood vessel density around the tissue region 608 of the elastography image 607 illustrated in FIG. 6C, and the diagnosis performed by the user can be assisted.
  • the control device 101 analyzes the acquired photoacoustic image, detects a region that may be a lesion as a region of interest, and generates an ultrasound image corresponding to at least the region of interest.
  • the diagnosis can be assisted by controlling to perform elastography imaging for evaluating the hardness of the region of interest. By controlling in this way, the user can perform elastography imaging centering on the region of interest, and the user workflow in the examination can be improved.
  • an elastography image is acquired as an ultrasonic image
  • the present invention is not limited to this.
  • Doppler imaging for measuring blood flow velocity or B-mode imaging for grasping the structure in the subject may be performed.
  • FIG. 8 is a diagram schematically illustrating an inspection state in the fourth embodiment.
  • FIG. 8A shows an example of a state in which the user brings the probe 102 into contact with the subject 803 and acquires an ultrasonic image.
  • An ultrasonic signal from the probe 102 is transmitted to the console 801.
  • the console 801 is an apparatus in which the control device 101, the display unit 104, and the operation unit 105 shown in FIG.
  • the console 801 corresponds to the control device 101 in each embodiment described above.
  • the position information of the position 802 is acquired by the position acquisition module 149 of the console 801 and stored in the RAM 133 for a predetermined period.
  • position information is associated with the ultrasonic image and the photoacoustic image, respectively. It is assumed that the region of interest is detected in the ultrasonic image captured at the position 802.
  • FIG. 8B is an example of the state of the inspection at the time when the console 801 analyzes the ultrasonic image and detects the region of interest in the series of processes of FIG.
  • the user has moved the probe 102 from the position 805 corresponding to the position 802 in FIG.
  • the region of interest from which the photoacoustic image is to be acquired is detected by the console 801, the detected region of interest may not be depicted even if the photoacoustic image is generated based on the photoacoustic signal acquired at the position 804. There is.
  • FIG. 8C is an example of a state in which the probe 102 is moved to a position 806 where the user can draw the region of interest by the guide of the console 801.
  • the position acquisition module 149 of the console 801 the current position of the probe 102 is compared with the target position which is the position of the probe 102 that acquired the ultrasonic signal of the ultrasonic image in which the region of interest is detected.
  • The guide information for guiding the probe 102 to the target position is generated and displayed.
  • the user can acquire the photoacoustic image of the detected region of interest.
  • FIG. 9 is a flowchart illustrating an example of processing for the guide illustrated in FIG.
  • the CPU 131 or the GPU 137 is a main body that realizes the processes by the modules unless otherwise specified.
  • step S901 and step S902 Since the processing of step S901 and step S902 is the same as the processing of step S201 illustrated in FIG. 2, the description here is omitted by using the above description.
  • the position acquisition module 149 is executed, whereby the position information of the probe 102 is acquired. More specifically, the motion sensor, which is an example of the detection unit 103, tracks the position information of the probe 102 and transmits it to the control device 101.
  • the motion sensor is provided or embedded in a portion different from the transmitting / receiving unit 106 of the probe 102 and the light source (not shown).
  • the motion sensor is composed of, for example, a micro electro mechanical system (Micro Electro Mechanical Systems), and provides 9-axis motion sensing including a 3-axis accelerometer, a 3-axis gyroscope, and a 3-axis magnetic compass.
  • Information regarding the movement of the probe 102 sensed by the motion sensor is acquired by the position acquisition module 149 and stored for a certain period of time.
  • step S904 and step S905 are the same as the processing of step S203 and step S204 illustrated in FIG. 2, the description here will be omitted by using the above description. If it is determined in step S905 that the photoacoustic signal is acquired, the process proceeds to step S906. If it is determined that the photoacoustic signal is not acquired, the process returns to step S901 to continue acquiring the photoacoustic image.
  • step S906 the position acquisition module 149 is executed, whereby guide information is displayed on the display unit 104.
  • the position of the probe 102 at the time when the ultrasonic signal for generating the ultrasonic image in which the region of interest is detected in step S904 is acquired is set as the target position.
  • the difference between the current position of the probe 102 indicated by the position information sequentially transmitted from the detection unit 103 and the target position is acquired.
  • guide information is produced
  • the guide information is presented to the user via the output module 150.
  • the guide information is a guide image, for example, and is displayed on the display unit 104.
  • the guide image is an objective index indicating guide information such as a movement direction, a movement amount, an inclination angle, a rotation direction, and a rotation amount for moving the probe 102 to the target position.
  • the guide image may be any image as long as it is an objective index of the guide information.
  • the guide image is an image of an arrow having a size corresponding to the amount of movement or rotation and a direction corresponding to the direction of movement, rotation, or tilt.
  • the guide image is a figure that has a size corresponding to the amount of movement or rotation, and whose shape is deformed according to the direction of movement, rotation, or inclination.
  • the guide image is displayed on the display unit 104 in a manner that does not interfere with observation of the region of interest when the probe 102 is moved to the target position.
  • the guide image is displayed in an area where the ultrasonic image, the photoacoustic image, and the superimposed image are not displayed.
  • the probe 102 while guiding the probe 102 to move to the target position, it is displayed at a position that overlaps the area near the target area, and when the target area is rendered, it is deformed into a shape that cannot be seen. May be displayed.
  • step S907 the irradiation control module 144 is executed to determine whether to irradiate the subject with light. Specifically, in the determination by the irradiation control module 144, it is determined that light is irradiated when the probe 102 has reached the target position and the probe 102 is in contact with the subject. For example, when the current position of the probe 102 indicated by the position information transmitted from the detection unit 103 matches the target position, it is determined that the probe 102 has reached the target position. In another example, when the probe 102 reaches a predetermined range including the target position, it may be determined that the probe 102 has reached the target position.
  • the position of the probe 102 capable of acquiring the region of interest detected in step S904 and the target position are set as a predetermined range. Since the process for determining that the probe 102 is in contact with the subject is the same as the process of step S205 illustrated in FIG. 2, the description here is omitted by using the above description. If it is determined that the subject is to be irradiated with light, the process proceeds to step S908. If it is determined that the object is not irradiated with light, the process returns to step S901 to continue acquiring the ultrasound image.
  • step S908 and step S909 are the same as the processing of step S206 and step S207 illustrated in FIG. 2, respectively, the description here is omitted by using the above description.
  • the target position is set based on the region of interest detected by analyzing the ultrasonic image
  • the target position may be set based on the region of interest detected by analyzing the photoacoustic image.
  • the method of presenting guide information to the user is not limited to the above-described guide image.
  • the guide information may be presented to the user by generating a sound such that the sound generation interval decreases as the probe 102 approaches the target position.
  • control device 101 can efficiently acquire an image depicting a region of interest, thereby improving the user workflow.
  • the analysis for detecting the region of interest may not be performed for all the frames of the acquired image. For example, it is possible to reduce the processing load on the control apparatus 101 by performing the analysis at a predetermined frame interval.
  • the probe 102 when one of the ultrasonic image and the photoacoustic image is analyzed and a region of interest is detected, an example of controlling the probe 102 to capture the other image will be described. did.
  • the present invention is not limited to this.
  • the probe 102 when the region of interest is included in a predetermined range of the range that can be drawn by the probe 102, the probe 102 may be controlled to capture the other image.
  • the probe 102 may be controlled to capture the other image. This makes it possible to obtain a medical image that makes it easier to observe the region of interest when it is later interpreted or attached to an image diagnosis report. In addition, this can reduce the possibility that the other image is inadvertently captured when the region of interest is included in a range in which one image can be drawn unintentionally.
  • the method of superimposing is not limited to the above-described example.
  • the ultrasonic image is a base image and the photoacoustic image is a layer image
  • the photoacoustic image may be superimposed only in the vicinity of the region of interest of the ultrasonic image, or a photoacoustic image in a desired range is superimposed. May be.
  • the transparency of the layer image is appropriately changed according to the purpose.
  • the layer image may be opaque, or the transparency may be increased only in the vicinity of the region of interest.
  • a slider capable of changing the transparency of the layer image may be displayed on the display unit 104 and changed during observation by the user.
  • the display of the superimposed image, the display in which the ultrasonic image and the photoacoustic image are arranged in parallel, and the display of any one of the ultrasonic image and the photoacoustic image may be switched by a user operation input. Good.
  • a region drawn in a past three-dimensional image may be set as a region of interest.
  • a specific region on a CT image taken in the past is set as a region of interest.
  • the coordinate system of the said CT image and the coordinate system of the real space which operates the probe 102 are matched.
  • the control apparatus 101 can acquire an ultrasonic image and a photoacoustic image, and when the probe 102 is positioned at a position where a region of interest can be acquired while acquiring one image, the other image is also acquired. You may control to.
  • the user may be notified that light irradiation is performed by the probe 102.
  • a notification image for notifying that light irradiation is performed by the probe 102 is displayed on the display unit 104.
  • the probe 102 may be provided with an LED light that is turned on during light irradiation.
  • the control device 101 may generate a notification sound during light irradiation. Thereby, the user can know that light is being emitted from the probe 102, and the safety of the user and the subject can be improved.
  • the necessity of obtaining the photoacoustic image is determined based on the analysis result of the ultrasonic image. Therefore, when a region of interest such as a calculus is included in the ultrasonic image, a photoacoustic image is automatically captured. Therefore, a photoacoustic image is captured once for a certain region of interest. Nevertheless, there is a possibility that a photoacoustic image will be taken again if an attempt is made to confirm the region of interest again with an ultrasonic image. That is, a photoacoustic image is taken a plurality of times for the same region of interest, and an unnecessary photoacoustic image is acquired. Therefore, an object of the present modification is to prevent an unnecessary photoacoustic image from being acquired.
  • the irradiation control module 144 restricts light irradiation for acquiring a photoacoustic image based on position information of the probe 102 acquired by the position acquisition module 149, for example.
  • the irradiation control module 144 stores the position information of the probe 102 when it is determined that light irradiation has been performed in the past, and the position information of the current probe 102 matches or is stored with the stored position information. It is determined whether the deviation is within a predetermined threshold.
  • the irradiation control module 144 is the part which has already acquired the photoacoustic image when the position information of the current probe 102 matches the stored position information or the deviation from the stored position information is within a predetermined threshold. Judging that there is, it restricts light irradiation. By restricting the light irradiation, it is possible to prevent the subject from performing unnecessary light irradiation and to prevent acquisition of unnecessary photoacoustic images.
  • the irradiation control module 144 may limit the irradiation of light for acquiring a photoacoustic image based on the ultrasonic image generated by the ultrasonic image generation module 146, for example.
  • the irradiation control module 144 stores the ultrasonic image generated when it is determined that the light irradiation has been performed in the past, and compares it with the currently generated ultrasonic image. If the degree of similarity between the images as a result of the comparison is equal to or greater than a predetermined threshold value, it may be determined that the photoacoustic image has already been acquired and light irradiation may be limited.
  • the irradiation control module 144 performs light irradiation for acquiring a photoacoustic image based on the ultrasonic image generated by the ultrasonic image generation module 146 and the position information of the probe 102 acquired by the position acquisition module 149. It may be limited. For example, the irradiation control module 144 stores an ultrasonic image in association with position information of the probe 102 when it is determined that light irradiation has been performed in the past. Then, the irradiation control module 144 reads an ultrasonic image associated with the current position information of the probe 102 and compares it with the currently generated ultrasonic image. If the degree of similarity between the images as a result of the comparison is equal to or greater than a predetermined threshold value, it may be determined that the photoacoustic image has already been acquired and light irradiation may be limited.
  • the present invention supplies a program that realizes one or more functions of the above-described embodiments to a system or apparatus via a network or a storage medium, and one or more processors in the computer of the system or apparatus read and execute the program This process can be realized. It can also be realized by a circuit (for example, ASIC) that realizes one or more functions.
  • ASIC application specific integrated circuit
  • the control device in each of the above-described embodiments may be realized as a single device, or may be configured to execute the above-described processing by combining a plurality of devices so that they can communicate with each other. included.
  • the above-described processing may be executed by a common server device or server group.
  • the plurality of devices constituting the control device and the control system need only be able to communicate at a predetermined communication rate, and do not need to exist in the same facility or in the same country.
  • a software program that realizes the functions of the above-described embodiments is supplied to a system or apparatus, and the computer of the system or apparatus reads and executes the code of the supplied program. Includes form.
  • the processing according to the embodiment is realized by a computer
  • the program code itself installed in the computer is also one embodiment of the present invention.
  • an OS or the like running on the computer performs part or all of the actual processing, and the functions of the above-described embodiments can be realized by the processing. .
  • Embodiments appropriately combining the above-described embodiments are also included in the embodiments of the present invention.

Landscapes

  • Life Sciences & Earth Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Pathology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Ultra Sonic Daignosis Equipment (AREA)

Abstract

Dans la présente invention, une première image est acquise sur la base d'un premier signal acquis par un premier moyen, la première image est analysée et une région d'intérêt est détectée, l'acquisition d'un second signal par un second moyen est commandée afin d'acquérir une seconde image qui comprend la région d'intérêt sur la base du second signal acquis, et la première image et la seconde image sont affichées dans une unité d'affichage d'une manière permettant la comparaison de la première image et de la seconde image.
PCT/JP2017/024569 2016-07-08 2017-07-05 Dispositif de commande, procédé de commande, système de commande et programme WO2018008661A1 (fr)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2016136108 2016-07-08
JP2016-136108 2016-07-08
JP2016229312A JP2018011928A (ja) 2016-07-08 2016-11-25 制御装置、制御方法、制御システム及びプログラム
JP2016-229312 2016-11-25

Publications (1)

Publication Number Publication Date
WO2018008661A1 true WO2018008661A1 (fr) 2018-01-11

Family

ID=60912804

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/024569 WO2018008661A1 (fr) 2016-07-08 2017-07-05 Dispositif de commande, procédé de commande, système de commande et programme

Country Status (1)

Country Link
WO (1) WO2018008661A1 (fr)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2019122621A (ja) * 2018-01-17 2019-07-25 キヤノン株式会社 被検体情報取得装置および被検体情報取得方法
CN110384480A (zh) * 2018-04-18 2019-10-29 佳能株式会社 被检体信息取得装置、被检体信息处理方法和存储介质
US20210275040A1 (en) * 2020-03-05 2021-09-09 Koninklijke Philips N.V. Ultrasound-based guidance for photoacoustic measurements and associated devices, systems, and methods

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012254284A (ja) * 2011-05-13 2012-12-27 Fujifilm Corp 断層画像生成装置、方法、及びプログラム
JP2013063253A (ja) * 2011-08-31 2013-04-11 Canon Inc 情報処理装置、超音波撮影装置および情報処理方法
JP2013527782A (ja) * 2010-04-22 2013-07-04 ザ ユニバーシティ オブ ワシントン スルー イッツ センター フォー コマーシャライゼーション 結石を検出し、その除去を促進する超音波ベースの方法及び装置
JP2013158531A (ja) * 2012-02-07 2013-08-19 Canon Inc 被検体情報取得装置及び被検体情報取得方法
JP2014136103A (ja) * 2013-01-18 2014-07-28 Fujifilm Corp 光音響画像生成装置および光音響画像生成方法
JP2015065975A (ja) * 2013-09-26 2015-04-13 キヤノン株式会社 被検体情報取得装置およびその制御方法
JP2016097165A (ja) * 2014-11-25 2016-05-30 キヤノン株式会社 被検体情報取得装置およびプローブ

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2013527782A (ja) * 2010-04-22 2013-07-04 ザ ユニバーシティ オブ ワシントン スルー イッツ センター フォー コマーシャライゼーション 結石を検出し、その除去を促進する超音波ベースの方法及び装置
JP2012254284A (ja) * 2011-05-13 2012-12-27 Fujifilm Corp 断層画像生成装置、方法、及びプログラム
JP2013063253A (ja) * 2011-08-31 2013-04-11 Canon Inc 情報処理装置、超音波撮影装置および情報処理方法
JP2013158531A (ja) * 2012-02-07 2013-08-19 Canon Inc 被検体情報取得装置及び被検体情報取得方法
JP2014136103A (ja) * 2013-01-18 2014-07-28 Fujifilm Corp 光音響画像生成装置および光音響画像生成方法
JP2015065975A (ja) * 2013-09-26 2015-04-13 キヤノン株式会社 被検体情報取得装置およびその制御方法
JP2016097165A (ja) * 2014-11-25 2016-05-30 キヤノン株式会社 被検体情報取得装置およびプローブ

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2019122621A (ja) * 2018-01-17 2019-07-25 キヤノン株式会社 被検体情報取得装置および被検体情報取得方法
CN110384480A (zh) * 2018-04-18 2019-10-29 佳能株式会社 被检体信息取得装置、被检体信息处理方法和存储介质
CN110384480B (zh) * 2018-04-18 2023-06-09 佳能株式会社 被检体信息取得装置、被检体信息处理方法和存储介质
US20210275040A1 (en) * 2020-03-05 2021-09-09 Koninklijke Philips N.V. Ultrasound-based guidance for photoacoustic measurements and associated devices, systems, and methods

Similar Documents

Publication Publication Date Title
JP5530592B2 (ja) イメージング・パラメータの記憶法
US9801614B2 (en) Ultrasound diagnostic apparatus, ultrasound image processing method, and non-transitory computer readable recording medium
US20170086795A1 (en) Medical image diagnostic apparatus and medical information display control method
CN105188555B (zh) 超声波诊断装置以及图像处理装置
JP2008086767A (ja) 3次元及び4次元コントラスト撮像のためのシステム及び方法
JP2010000143A (ja) 超音波診断装置及びプログラム
US20190150894A1 (en) Control device, control method, control system, and non-transitory storage medium
JP6661787B2 (ja) 光音響画像評価装置、方法およびプログラム並びに光音響画像生成装置
KR20150106779A (ko) 대상체에 대한 복수의 상이한 영상들을 디스플레이하는 방법 및 장치
US20150173721A1 (en) Ultrasound diagnostic apparatus, medical image processing apparatus and image processing method
WO2018116963A1 (fr) Appareil de commande d'affichage, procédé de commande d'affichage et programme
WO2018008439A1 (fr) Appareil, procédé et programme permettant d'afficher une image ultrasonore et une image photoacoustique
WO2018008661A1 (fr) Dispositif de commande, procédé de commande, système de commande et programme
JP2018011928A (ja) 制御装置、制御方法、制御システム及びプログラム
US11510630B2 (en) Display control apparatus, image display method, and non-transitory computer-readable medium
CN108463174A (zh) 用于表征对象的组织的装置和方法
EP3329843B1 (fr) Appareil de commande d'affichage, programme de commande d'affichage et programme
US11744537B2 (en) Radiography system, medical imaging system, control method, and control program
US20200113541A1 (en) Information processing apparatus, information processing method, and storage medium
WO2018008664A1 (fr) Dispositif de commande, procédé de commande, système de commande et programme
JP7129158B2 (ja) 情報処理装置、情報処理方法、情報処理システムおよびプログラム
KR102106542B1 (ko) 초음파를 이용하여 조직의 탄성을 분석하는 방법 및 장치
JP2017042603A (ja) 被検体情報取得装置
US11599992B2 (en) Display control apparatus, display method, and non-transitory storage medium
WO2020040174A1 (fr) Dispositif de traitement d'image, procédé de traitement d'image et programme

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17824264

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17824264

Country of ref document: EP

Kind code of ref document: A1