WO2018008661A1 - Control device, control method, control system, and program - Google Patents

Control device, control method, control system, and program

Info

Publication number
WO2018008661A1
WO2018008661A1 (PCT/JP2017/024569)
Authority
WO
WIPO (PCT)
Prior art keywords
image
photoacoustic
ultrasonic
signal
region
Prior art date
Application number
PCT/JP2017/024569
Other languages
French (fr)
Japanese (ja)
Inventor
浩 荒井
由香里 中小司
Original Assignee
Canon Inc. (キヤノン株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from JP2016229312A external-priority patent/JP2018011928A/en
Application filed by Canon Inc. (キヤノン株式会社)
Publication of WO2018008661A1 publication Critical patent/WO2018008661A1/en

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 8/00 Diagnosis using ultrasonic, sonic or infrasonic waves
    • A61B 8/08 Detecting organic movements or changes, e.g. tumours, cysts, swellings
    • A61B 8/13 Tomography
    • A61B 8/14 Echo-tomography

Definitions

  • the disclosure of this specification relates to a control device, a control method, a control system, and a program.
  • An ultrasonic imaging device or a photoacoustic imaging device is used as an imaging device that images a state inside a subject in a minimally invasive manner.
  • it is disclosed that processing for generating a photoacoustic image from a photoacoustic signal is performed for a specific region identified based on an ultrasonic image, and is not performed for regions other than this region.
  • the photoacoustic signal is based on an acoustic wave generated by expansion occurring inside the subject due to the light irradiated onto it. Therefore, if only the generation of the photoacoustic image is restricted to the region of interest, light is still irradiated onto regions from which no photoacoustic image is acquired.
  • the control device disclosed in this specification comprises: signal acquisition means for acquiring at least one of an ultrasonic signal and a photoacoustic signal from a probe that outputs the ultrasonic signal by transmitting and receiving ultrasonic waves to and from a subject, and outputs the photoacoustic signal by receiving photoacoustic waves generated by irradiating the subject with light; generation means for generating an ultrasonic image based on the ultrasonic signal; and irradiation control means for controlling the light irradiation of the probe based on the generated ultrasonic image.
  • with a control device capable of controlling different types of imaging, whether or not one type of imaging is performed can be controlled based on an image obtained by the other type, so an examination can be performed without redundant burden on the user or the subject.
  • an acoustic wave generated by irradiating a subject with light and expanding inside the subject is referred to as a photoacoustic wave.
  • an acoustic wave transmitted from the transducer or a reflected wave (echo) in which the transmitted acoustic wave is reflected inside the subject is referred to as an ultrasonic wave.
  • an imaging method using ultrasonic waves and an imaging method using photoacoustic waves are used.
  • in the imaging method using ultrasonic waves, for example, ultrasonic waves oscillated from the transducer are reflected by tissue inside the subject according to differences in acoustic impedance, and an image is formed from the time each reflected wave takes to reach the transducer and from the intensity of the reflected wave.
  • An image imaged using ultrasound is hereinafter referred to as an ultrasound image.
  • the user can operate the probe while changing its angle and observe ultrasonic images of various cross sections in real time. Ultrasound images depict the shapes of organs and tissues and are used to find tumors.
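As a concrete illustration of the pulse-echo principle described above (not part of the patent text): assuming a nominal speed of sound of 1540 m/s in soft tissue, the depth of a reflector follows from half the round-trip echo time.

```python
SPEED_OF_SOUND = 1540.0  # m/s, nominal value for soft tissue (assumed)

def echo_depth(round_trip_time_s: float) -> float:
    """Depth of a reflector from the round-trip time of its echo."""
    return SPEED_OF_SOUND * round_trip_time_s / 2.0

# An echo returning after 65 microseconds corresponds to a depth of ~5 cm.
depth = echo_depth(65e-6)
```

This is the relation an ultrasonic scanner evaluates per channel to place each echo along the beam axis.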
  • the imaging method using photoacoustic waves is a method of generating an image based on photoacoustic waves generated by adiabatic expansion of tissue inside a subject irradiated with light, for example.
  • An image imaged using the photoacoustic wave is hereinafter referred to as a photoacoustic image.
  • the photoacoustic image information related to optical characteristics such as the degree of light absorption of each tissue is depicted.
  • photoacoustic images for example, it is known that blood vessels can be drawn by the optical characteristics of hemoglobin, and its use for evaluating the malignancy of tumors is being studied.
  • various information may be collected by imaging different phenomena on the same part of the subject based on different principles.
  • for example, diagnosis of cancer is performed by combining morphological information obtained from a CT (Computed Tomography) image with functional information relating to metabolism obtained from a PET (Positron Emission Tomography) image.
  • it is therefore considered effective for improving diagnostic accuracy to perform diagnosis using information obtained by imaging different phenomena based on different principles.
  • an imaging device for obtaining an image obtained by combining the respective characteristics has been studied.
  • both the ultrasonic image and the photoacoustic image are imaged using acoustic waves received from the subject.
  • it is conceivable that the user wants to operate the probe in the same manner as for a conventional ultrasonic image; that is, the user touches the surface of the subject with the probe and operates it while observing an image displayed based on the information the probe acquires. If the operation mode related to signal acquisition or image display must then be switched via, for example, a switch provided on the probe or an input device on the console of the imaging device, the user has to interrupt the probe operation and the observation of the image. During such an operation input, body movement of the subject may occur or the probe position may shift.
  • An object of the first embodiment is to provide a control device that can switch an image to be displayed without deteriorating operability when a user observes an image.
  • FIG. 10 is a diagram illustrating an example of a system configuration including the control device 101 according to the first embodiment.
  • An imaging system 100 that can generate an ultrasonic image and a photoacoustic image is connected to various external devices via a network 110.
  • Each configuration and various external devices included in the imaging system 100 do not need to be installed in the same facility, and may be connected to be communicable.
  • the imaging system 100 includes a control device 101, a probe 102, a detection unit 103, a display unit 104, and an operation unit 105.
  • the control apparatus 101 is an apparatus that acquires an ultrasonic signal and a photoacoustic signal from the probe 102, controls acquisition of the photoacoustic signal based on, for example, an ultrasonic image, and generates a photoacoustic image based on the control.
  • the control device 101 acquires information related to an examination including imaging of an ultrasonic image and a photoacoustic image from the ordering system 112, and controls the probe 102, the detection unit 103, and the display unit 104 when the examination is performed.
  • the control device 101 outputs the generated ultrasonic image, photoacoustic image, and superimposed image obtained by superimposing the photoacoustic image on the ultrasonic image to the PACS 113.
  • the control device 101 transmits / receives information to / from external devices such as the ordering system 112 and the PACS 113 in accordance with standards such as HL7 (Health level 7) and DICOM (Digital Imaging and Communications in Medicine). Details of processing performed by the control device 101 will be described later.
  • the region in the subject from which an ultrasound image is captured by the imaging system 100 is, for example, a circulatory region, a breast, a liver, or a pancreas.
  • an ultrasound image of a subject to which an ultrasound contrast agent using microbubbles is administered may be captured.
  • the region in the subject from which the photoacoustic image is captured by the imaging system 100 is, for example, a circulatory region, a breast, the neck, the abdomen, or limbs including fingers and toes.
  • a blood vessel region including a new blood vessel and a plaque on a blood vessel wall may be set as a target for imaging a photoacoustic image in accordance with the characteristics relating to light absorption in the subject.
  • the region in the subject from which a photoacoustic image is captured by the imaging system 100 does not necessarily have to match the region from which an ultrasonic image is captured.
  • a photoacoustic image may be captured of a subject to which a dye such as methylene blue or indocyanine green, gold fine particles, or a substance obtained by accumulating or chemically modifying them has been administered as a contrast agent.
  • the probe 102 is operated by a user and transmits an ultrasonic signal and a photoacoustic signal to the control device 101.
  • the probe 102 includes a transmission / reception unit 106 and an irradiation unit 107.
  • the probe 102 transmits an ultrasonic wave from the transmission / reception unit 106 and receives the reflected wave by the transmission / reception unit 106. Further, the probe 102 irradiates the subject with light from the irradiation unit 107, and the photoacoustic wave is received by the transmission / reception unit 106.
  • the probe 102 converts the received reflected wave and photoacoustic wave into an electric signal, and transmits it to the control device 101 as an ultrasonic signal and a photoacoustic signal.
  • the probe 102 is controlled so that, when information indicating contact with the subject is received, transmission of ultrasonic waves for acquiring an ultrasonic signal and light irradiation for acquiring a photoacoustic signal are executed.
  • the transmission / reception unit 106 includes at least one transducer (not shown), a matching layer (not shown), a damper (not shown), and an acoustic lens (not shown).
  • the transducer (not shown) is made of a material exhibiting a piezoelectric effect, such as PZT (lead zirconate titanate) or PVDF (polyvinylidene difluoride).
  • the transducer (not shown) may be other than a piezoelectric element; for example, a capacitive micromachined ultrasonic transducer (CMUT) or a transducer using a Fabry-Perot interferometer may be used.
  • the ultrasonic signal is composed of frequency components of 2 to 20 MHz and the photoacoustic signal is composed of frequency components of 0.1 to 100 MHz, and a transducer (not shown) that can detect these frequencies is used.
  • the signal obtained by the transducer (not shown) is a time-resolved signal.
  • the amplitude of the received signal represents a value based on the sound pressure received by the transducer at each time.
  • the transmission / reception unit 106 includes a circuit (not shown) or a control unit for electronic focusing.
  • the array form of transducers (not shown) is, for example, a sector, a linear array, a convex, an annular array, or a matrix array.
  • the transmitting / receiving unit 106 may include an amplifier (not shown) that amplifies a time-series analog signal received by a transducer (not shown).
  • the transmission / reception unit 106 may include an A / D converter that converts a time-series analog signal received by a transducer (not shown) into a time-series digital signal.
  • the transducer (not shown) may be divided into a transmitter and a receiver depending on the purpose of imaging an ultrasonic image. Further, the transducer (not shown) may be divided into an ultrasonic image capturing unit and a photoacoustic image capturing unit.
  • the irradiation unit 107 includes a light source (not shown) for acquiring a photoacoustic signal and an optical system (not shown) that guides pulsed light emitted from the light source (not shown) to the subject.
  • the pulse width of light emitted from a light source (not shown) is, for example, 1 ns or more and 100 ns or less.
  • the wavelength of the light which a light source (not shown) injects is a wavelength of 400 nm or more and 1600 nm or less, for example.
  • for imaging blood vessels, a wavelength of 400 nm or more and 700 nm or less, which is strongly absorbed by blood vessels, is preferable.
  • for observing deeper regions, a wavelength of 700 nm or more and 1100 nm or less, which is hard to be absorbed by tissues such as water and fat, is preferable.
  • the light source (not shown) is, for example, a laser or a light emitting diode.
  • the irradiation unit 107 may use a light source that can convert wavelengths in order to acquire a photoacoustic signal using light of a plurality of wavelengths.
  • the irradiation unit 107 may include a plurality of light sources that generate light of different wavelengths, and may be configured to be able to irradiate light of different wavelengths alternately from each light source.
  • the laser is, for example, a solid laser, a gas laser, a dye laser, or a semiconductor laser.
  • a pulsed laser such as an Nd: YAG laser or an alexandrite laser may be used.
  • a Ti:sapphire laser or an OPO (optical parametric oscillator) laser that uses Nd:YAG laser light as excitation light may be used as the light source (not shown).
  • a microwave source may be used as a light source (not shown).
  • optical elements such as lenses, mirrors, and optical fibers are used.
  • the optical system may include a diffusion plate that diffuses the emitted light.
  • the optical system may include a lens or the like so that the beam can be focused.
  • the detection unit 103 acquires information regarding the position and orientation of the probe 102.
  • the detection unit 103 transmits information related to the position of the probe 102 to the control device 101.
  • the detection unit 103 is a motion sensor provided in the probe 102, for example.
  • the detection unit 103 is not necessarily included in the control device 101, and the sensor may be switched between ON and OFF as appropriate based on various conditions set prior to the inspection.
  • the display unit 104 displays an image captured by the imaging system 100 and information related to the inspection based on control from the control device 101.
  • the display unit 104 provides an interface for receiving user instructions based on control from the control device 101.
  • the display unit 104 is a liquid crystal display, for example.
  • the operation unit 105 transmits information related to user operation input to the control apparatus 101.
  • the operation unit 105 is, for example, a keyboard, a trackball, and various buttons for performing operation inputs related to inspection.
  • the display unit 104 and the operation unit 105 may be integrated as a touch panel display.
  • the control apparatus 101, the display unit 104, and the operation unit 105 do not need to be separate apparatuses, and may be implemented as a single integrated apparatus.
  • the control device 101 may have a plurality of probes.
  • a HIS (Hospital Information System) 111 is a system that supports hospital operations.
  • the HIS 111 includes an electronic medical record system, an ordering system, and a medical accounting system.
  • the ordering system of the HIS 111 transmits order information to the ordering system 112 for each department.
  • the ordering system 112, which will be described later, manages the execution of orders.
  • the ordering system 112 is a system that manages inspection information and manages the progress of each inspection in the imaging apparatus.
  • the ordering system 112 may be configured for each department that performs inspection.
  • the ordering system 112 is, for example, RIS (Radiology Information System) in the radiation department.
  • the ordering system 112 transmits information on examinations performed by the imaging system 100 to the control apparatus 101.
  • the ordering system 112 receives information related to the progress of the inspection from the control device 101.
  • the ordering system 112 transmits information indicating that the inspection is completed to the HIS 111.
  • the ordering system 112 may be integrated into the HIS 111.
  • a PACS (Picture Archiving and Communication System) 113 is a database system that holds images obtained by various imaging devices inside and outside the facility.
  • the PACS 113 includes a storage unit (not shown) that stores medical images together with their imaging conditions, incidental parameters such as image processing parameters including reconstruction, and patient information, and a controller (not shown) that manages the information stored in the storage unit.
  • the PACS 113 stores an ultrasonic image, a photoacoustic image, and a superimposed image output from the control device 101. It is preferable that communication between the PACS 113 and the control device 101 and various images stored in the PACS 113 comply with standards such as HL7 and DICOM. Various images output from the control device 101 are stored with associated information associated with various tags in accordance with the DICOM standard.
  • the Viewer 114 is a terminal for image diagnosis, and reads an image stored in the PACS 113 and displays it for diagnosis.
  • the doctor displays an image on the Viewer 114 for observation, and records information obtained as a result of the observation as an image diagnosis report.
  • the diagnostic imaging report created using the Viewer 114 may be stored in the Viewer 114, or may be output and stored in the PACS 113 or a report server (not shown).
  • the Printer 115 prints an image stored in the PACS 113 or the like.
  • the Printer 115 is, for example, a film printer, and outputs an image stored in the PACS 113 or the like by printing it on a film.
  • FIG. 1 is a diagram illustrating an example of the configuration of the control device 101.
  • the control device 101 includes a CPU 131, ROM 132, RAM 133, DISK 134, USB 135, communication circuit 136, GPU 137, HDMI 138, and probe connector port 139. These are communicably connected by the BUS 130.
  • the BUS 130 is a data bus, and is used to transmit / receive data between connected hardware and to transmit commands from the CPU 131 to other hardware.
  • a CPU (Central Processing Unit) 131 is a control circuit that integrally controls the control device 101 and each unit connected thereto.
  • the CPU 131 performs control by executing a program stored in the ROM 132. Further, the CPU 131 executes a display driver which is software for controlling the display unit 104 and performs display control on the display unit 104. Further, the CPU 131 performs input / output control for the operation unit 105.
  • ROM (Read Only Memory) 132 stores programs and data describing the control procedures executed by the CPU.
  • the ROM 132 has a boot program 140 of the control device 101 and various initial data 141.
  • various modules 142 to 150 for realizing the processing of the control apparatus 101 are included. Various modules for realizing the processing of the control apparatus 101 will be described later.
  • a RAM (Random Access Memory) 133 provides a working storage area when the CPU 131 performs control by an instruction program.
  • the RAM 133 has a stack 151 and a work area 152.
  • the RAM 133 stores a program for executing processing in each unit connected to the control device 101 and various parameters used in image processing.
  • the RAM 133 stores a control program executed by the CPU 131, and temporarily stores various data when the CPU 131 executes various controls.
  • the DISK 134 is an auxiliary storage device that stores various data such as ultrasonic images and photoacoustic images.
  • the DISK 134 is, for example, an HDD (Hard Disk Drive) or an SSD (Solid State Drive).
  • a USB (Universal Serial Bus) 135 is a connection unit connected to the operation unit 105.
  • the communication circuit 136 is a circuit for communicating with each unit constituting the imaging system 100 and various external devices connected to the network 110.
  • the communication circuit 136 stores output information in a transfer packet, for example, and outputs the information to an external device via the network 110 by a communication technique such as TCP / IP.
  • the control device 101 may have a plurality of communication circuits in accordance with a desired communication form.
  • the GPU 137 is included in a general-purpose graphics board including a video memory.
  • the GPU 137 executes part or all of the image processing module 145 and performs, for example, a photoacoustic image reconstruction process. By using such an arithmetic device, it is possible to perform operations such as reconstruction processing at high speed without requiring dedicated hardware.
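The text does not detail the reconstruction the GPU 137 accelerates; as a purely illustrative sketch, a naive CPU version of delay-and-sum beamforming for a linear array (all names, grids, and parameters below are assumptions, not the patent's method) conveys the kind of computation involved:

```python
import numpy as np

def delay_and_sum(signals, sensor_x, pixel_x, pixel_z, fs, c=1540.0):
    """Naive delay-and-sum: for every image pixel, sum each channel's
    sample at the acoustic propagation delay from pixel to sensor.
    signals: (channels, samples) array; sensor_x: sensor positions [m];
    pixel_x, pixel_z: image grid coordinates [m]; fs: sampling rate [Hz]."""
    image = np.zeros((len(pixel_z), len(pixel_x)))
    for iz, z in enumerate(pixel_z):
        for ix, x in enumerate(pixel_x):
            for ch, sx in enumerate(sensor_x):
                delay_samples = int(round(np.hypot(x - sx, z) / c * fs))
                if delay_samples < signals.shape[1]:
                    image[iz, ix] += signals[ch, delay_samples]
    return image
```

On a GPU the three nested loops become one parallel map over pixels, which is why such hardware can run reconstruction at interactive rates.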
  • HDMI (registered trademark) (High Definition Multimedia Interface) 138 is a connection unit connected to the display unit 104.
  • the probe connector port 139 is a connection port for connecting the probe 102 to the control device 101.
  • the ultrasonic signal and photoacoustic signal output from the probe 102 are acquired by the control device 101 via the port 139.
  • CPU 131 and GPU 137 are examples of processors.
  • the ROM 132, RAM 133, and DISK 134 are examples of memories.
  • the control device 101 may have a plurality of processors. In the first embodiment, the function of each unit of the control device 101 is realized by the processor of the control device 101 executing a program stored in the memory.
  • control device 101 may have a CPU or GPU that performs a specific process exclusively. Further, the control device 101 may have a field-programmable gate array (FPGA) in which specific processing or all processing is programmed. The control device 101 may have both an HDD and an SSD as the DISK 134.
  • the modules 143 to 150 stored in the ROM 132 will now be described. The modules shown in FIG. 1 are those that execute processing related to the embodiment of the present invention.
  • the control device 101 may include modules necessary for executing the inspection and operating the control device 101 other than those illustrated.
  • each module may be configured as a combination of one or a plurality of programs.
  • some or all of the modules 143 to 150 may be stored in a memory other than the ROM 132 such as the DISK 134.
  • each module will be described in detail.
  • the inspection control module 142 controls inspection performed in the imaging system 100.
  • the inspection control module 142 acquires inspection order information from the ordering system 112.
  • the examination order includes information on a patient who undergoes an examination and information on imaging procedures.
  • the inspection control module 142 controls the probe 102 and the detection unit 103 based on information on the imaging technique.
  • the examination control module 142 displays information on the examination on the display unit 104 via the output module 150 in order to present information related to the examination to the user.
  • the information on the examination displayed on the display unit 104 includes information on the patient undergoing the examination, information on the imaging technique included in the examination, and an image already generated after imaging.
  • the inspection control module 142 transmits information regarding the progress of the inspection to the ordering system 112. For example, when the inspection is started by the user, the system 112 is notified of the start, and when imaging by all the imaging techniques included in the inspection is completed, the system 112 is notified of the completion.
  • the signal acquisition module 143 acquires an ultrasonic signal and a photoacoustic signal from the probe 102. Specifically, the signal acquisition module 143 distinguishes and acquires the ultrasonic signal and the photoacoustic signal from the information acquired from the probe 102, based on information from the inspection control module 142, the image processing module 145, and the position acquisition module 149. For example, when the imaging technique being performed defines the acquisition timing of the ultrasonic signal and the photoacoustic signal, the two signals are distinguished and acquired based on the timing information obtained from the inspection control module 142.
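The patent leaves the exact timing scheme open. Purely as an illustration, if the schedule interleaved one photoacoustic frame after each light pulse among ultrasonic frames, demultiplexing the stream could look like this (the fixed-slot schedule and names are assumptions):

```python
def demultiplex(frames, pa_period=4):
    """Split an interleaved frame stream into ultrasonic and photoacoustic
    lists, assuming every pa_period-th frame follows a light pulse."""
    ultrasonic, photoacoustic = [], []
    for index, frame in enumerate(frames):
        if index % pa_period == 0:
            photoacoustic.append(frame)
        else:
            ultrasonic.append(frame)
    return ultrasonic, photoacoustic
```

A real system would key the split off hardware timestamps or trigger flags rather than a fixed modulus, but the separation step is the same.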
  • the signal acquisition module 143 is an example of an acquisition unit that acquires at least one of an ultrasonic signal and a photoacoustic signal from the probe 102.
  • the signal acquisition module 143 has an irradiation control module 144.
  • the irradiation control module 144 controls the light irradiation by the irradiation unit 107 based on the information regarding the imaging conditions acquired from the inspection control module 142 and the result of analysis of the ultrasonic image by the image processing module 145.
  • the image processing module 145 is a module for performing processing for generating an image based on a signal acquired in the imaging system 100.
  • the image processing module 145 includes an ultrasonic image generation module 146, a photoacoustic image generation module 147, and a superimposed image generation module 148.
  • the image processing module 145 stores the images generated by the ultrasonic image generation module 146, the photoacoustic image generation module 147, and the superimposed image generation module 148 in the DISK 134 together with accompanying information.
  • alternatively, the images are stored in an external device by outputting them together with the accompanying information to that device via the output module 150.
  • the ultrasonic image generation module 146 generates an ultrasonic image to be displayed on the display unit 104 from the ultrasonic signal acquired by the signal acquisition module 143.
  • the ultrasonic image generation module 146 generates an ultrasonic image suited to the set mode based on the imaging technique information acquired from the examination control module 142. For example, when the Doppler mode is set as the imaging technique, the ultrasound image generation module 146 generates an image showing the flow velocity inside the subject based on the difference between the frequency of the ultrasound signal acquired by the signal acquisition module 143 and the transmission frequency.
  • the ultrasonic image generated by the ultrasonic image generation module 146 may be generated by any method such as A mode, M mode, or Doppler mode, and may be a harmonic image or an elastography image.
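For the Doppler mode mentioned above, the standard relation between frequency shift and flow velocity, v = c·Δf / (2·f₀·cos θ), can be sketched as follows (the default angle and speed-of-sound values are illustrative assumptions):

```python
import math

def doppler_velocity(received_hz, transmitted_hz, angle_deg=0.0, c=1540.0):
    """Flow velocity [m/s] from the Doppler shift between the received
    and transmitted frequencies; angle_deg is the beam-to-flow angle."""
    shift = received_hz - transmitted_hz
    return c * shift / (2.0 * transmitted_hz * math.cos(math.radians(angle_deg)))
```

A 1 kHz shift on a 5 MHz carrier thus corresponds to roughly 0.15 m/s of flow along the beam.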
  • the ultrasonic image generation module 146 analyzes the generated ultrasonic image and specifies a region for which a photoacoustic signal is to be acquired and a photoacoustic image generated. For example, the ultrasonic image generation module 146 analyzes the ultrasonic image and identifies an area where a calculus may be present. In this respect, the ultrasonic image generation module 146 functions as a detection unit that analyzes an ultrasonic image and specifies the region from which a photoacoustic image is acquired.
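How the analysis works is not fixed by the text; as a toy stand-in, a brightness-threshold detector returning a bounding box (the threshold and criterion are assumptions, and detecting a real calculus would be considerably more involved):

```python
import numpy as np

def find_candidate_region(us_image, rel_threshold=0.8):
    """Bounding box (y0, y1, x0, x1) of pixels brighter than
    rel_threshold * max, or None when nothing stands out."""
    peak = float(us_image.max())
    if peak <= 0.0:
        return None
    ys, xs = np.nonzero(us_image >= rel_threshold * peak)
    return int(ys.min()), int(ys.max()), int(xs.min()), int(xs.max())
```

The returned box is the kind of region the irradiation control module 144 could use to gate light irradiation.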
  • the photoacoustic image generation module 147 generates a photoacoustic image based on the photoacoustic signal acquired by the signal acquisition module 143.
  • the photoacoustic image generation module 147 reconstructs an acoustic wave distribution (hereinafter referred to as an initial sound pressure distribution) when light is irradiated based on the photoacoustic signal.
  • the photoacoustic image generation module 147 obtains the light absorption coefficient distribution in the subject by dividing the reconstructed initial sound pressure distribution by the fluence distribution of the light irradiated onto the subject.
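The division described above corresponds to the photoacoustic relation p₀ = Γ·μa·Φ solved for the absorption coefficient; a minimal element-wise sketch (the Grüneisen parameter value is illustrative, and the patent's wording omits Γ):

```python
import numpy as np

def absorption_coefficient(p0, fluence, grueneisen=0.2):
    """mu_a = p0 / (Gamma * Phi), applied element-wise to the
    reconstructed initial pressure map and the light fluence map."""
    return np.asarray(p0, dtype=float) / (grueneisen * np.asarray(fluence, dtype=float))
```

In practice the fluence map Φ must itself be estimated or simulated, which is one of the harder parts of quantitative photoacoustics.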
  • the concentration distribution of the substance in the subject is obtained from the absorption coefficient distribution for a plurality of wavelengths by utilizing the fact that the degree of light absorption in the subject varies depending on the wavelength of the light irradiated to the subject.
  • the photoacoustic image generation module 147 acquires the concentration distributions of oxyhemoglobin and deoxyhemoglobin in the subject. Further, the photoacoustic image generation module 147 acquires the oxygen saturation distribution as the ratio of the oxyhemoglobin concentration to the total hemoglobin concentration.
  • the photoacoustic image generated by the photoacoustic image generation module 147 is an image indicating information such as the above-described initial sound pressure distribution, optical fluence distribution, absorption coefficient distribution, substance concentration distribution, and oxygen saturation distribution.
  • the photoacoustic image may be any image generated by combining these.
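The two-wavelength unmixing behind the oxygen saturation estimate reduces to a 2×2 linear solve per pixel; the extinction matrix used in the test is purely illustrative, not real hemoglobin data:

```python
import numpy as np

def oxygen_saturation(mu_a_per_wavelength, extinction_matrix):
    """Solve mu_a = E @ [C_HbO2, C_Hb] for the concentrations, then
    sO2 = C_HbO2 / (C_HbO2 + C_Hb). Rows of E correspond to wavelengths,
    columns to the molar absorption of (HbO2, Hb)."""
    concentrations = np.linalg.solve(
        np.asarray(extinction_matrix, dtype=float),
        np.asarray(mu_a_per_wavelength, dtype=float))
    return float(concentrations[0] / concentrations.sum())
```

With more than two wavelengths the same model is solved by least squares instead of an exact inverse.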
  • the image processing module 145 is an example of a generation unit that generates an ultrasonic image based on the ultrasonic signal and generates a photoacoustic image based on the photoacoustic signal.
  • the superimposed image generation module 148 generates a superimposed image in which the photoacoustic image generated by the photoacoustic image generation module 147 is superimposed on the ultrasonic image generated by the ultrasonic image generation module 146.
  • the superimposed image generation module 148 obtains a superimposed image by aligning the ultrasonic image and the photoacoustic image.
  • information regarding the imaging condition acquired from the inspection control module 142 or the position of the probe 102 acquired from the position acquisition module 149 described later may be used.
  • the alignment may be performed based on a region that is depicted in common for the ultrasonic image and the photoacoustic image.
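One common way to align two images on commonly depicted structure (an assumption for illustration; the patent does not specify the method) is FFT-based cross-correlation, which recovers an integer translation:

```python
import numpy as np

def estimate_shift(reference, moving):
    """Integer (dy, dx) by which `moving` appears shifted relative to
    `reference`, found at the peak of the FFT cross-correlation."""
    spectrum = np.conj(np.fft.fft2(reference)) * np.fft.fft2(moving)
    correlation = np.fft.ifft2(spectrum).real
    dy, dx = np.unravel_index(np.argmax(correlation), correlation.shape)
    h, w = correlation.shape
    # Map peaks in the upper half of the range to negative shifts.
    return (dy - h if dy > h // 2 else dy, dx - w if dx > w // 2 else dx)
```

The superimposed image generation module 148 could then shift the photoacoustic image by the negated estimate before overlaying it on the ultrasonic image.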
  • the position acquisition module 149 acquires information related to the position of the probe 102 based on information from the detection unit 103.
  • the position acquisition module 149 obtains at least one of information on the speed of movement of the probe 102 with respect to the subject, information on the speed of rotation, and information indicating the degree of pressure on the subject based on the change over time of the information on the position. You may get it.
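Deriving motion from the time series of positions reduces to finite differences; a minimal sketch (the sampling layout and units are assumptions):

```python
import numpy as np

def probe_speeds(positions, timestamps):
    """Instantaneous speed [m/s] between successive probe position
    samples; positions is an (n, 3) array, timestamps a length-n array."""
    displacements = np.linalg.norm(
        np.diff(np.asarray(positions, dtype=float), axis=0), axis=1)
    intervals = np.diff(np.asarray(timestamps, dtype=float))
    return displacements / intervals
```

Rotation rate and pressing degree would be obtained analogously from orientation and force samples.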
  • the position acquisition module 149 acquires the position information of the probe 102 at regular time intervals, preferably in real time.
  • the position acquisition module 149 may acquire information regarding the probe 102 used for imaging.
  • Information related to the probe 102 includes information such as the type of probe, center frequency, sensitivity, acoustic focus, electronic focus, and observation depth.
  • the position acquisition module 149 appropriately transmits information regarding the position of the probe 102 and information regarding the probe 102 to the inspection control module 142, the image processing module 145, and the output module 150.
  • the output module 150 outputs information for displaying a screen on the display unit 104, and outputs the information to an external device via the network 110.
  • the output module 150 controls the display unit 104 to display information on the display unit 104.
  • the output module 150 displays information on the display unit 104 in response to an input from the inspection control module 142 or the image processing module 145 or a user operation input via the operation unit 105.
  • the output module 150 is an example of a display control unit.
  • the output module 150 outputs information from the control device 101 to an external device such as the PACS 113 via the network 110.
  • the output module 150 outputs the ultrasonic image and the photoacoustic image generated by the image processing module 145 and a superimposed image thereof to the PACS 113.
  • the image output from the output module 150 includes incidental information attached as various tags according to the DICOM standard by the inspection control module 142.
  • the incidental information includes, for example, patient information, information indicating the imaging device that captured the image, an image ID for uniquely identifying the image, and an examination ID for uniquely identifying the examination in which the image was captured. Further, the incidental information includes information that associates an ultrasonic image captured in the same examination with a photoacoustic image.
  • the information associating the ultrasonic image and the photoacoustic image is information indicating a frame having the closest timing at which the photoacoustic image is acquired, for example, among a plurality of frames constituting the ultrasonic image.
  • the position information of the probe 102 acquired by the detection unit 103 may be incidental to each frame of the ultrasonic image and the photoacoustic image. That is, the output module 150 outputs information indicating the position of the probe 102 that has acquired the ultrasonic signal for generating the ultrasonic image, attached to the ultrasonic image. Also, the output module 150 outputs information indicating the position of the probe 102 that has acquired the photoacoustic signal for generating the photoacoustic image, attached to the photoacoustic image.
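The frame association and incidental information described above can be sketched as follows; the `Frame` container, the key names, and the helper functions are hypothetical illustrations (a real implementation would use actual DICOM tags).

```python
from dataclasses import dataclass

@dataclass
class Frame:
    frame_id: str
    timestamp: float       # acquisition time in seconds
    probe_position: tuple  # probe position reported by the detection unit 103

def closest_ultrasound_frame(us_frames, pa_timestamp):
    """Pick the ultrasound frame whose acquisition time is nearest to the
    photoacoustic acquisition time (the association described above)."""
    return min(us_frames, key=lambda f: abs(f.timestamp - pa_timestamp))

def incidental_info(patient_id, exam_id, image_id, linked):
    """Assemble DICOM-style incidental information as a plain dict;
    the key names are illustrative, not actual DICOM tag keywords."""
    return {
        "PatientID": patient_id,
        "ExamID": exam_id,
        "ImageID": image_id,
        "LinkedFrameID": linked.frame_id,
        "ProbePosition": linked.probe_position,
    }
```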
  • the output module 150 is an example of an output unit.
  • FIG. 2 is a flowchart showing an example of processing of the control apparatus 101 for controlling light irradiation based on the acquired ultrasonic image and acquiring a photoacoustic image.
  • the CPU 131 or the GPU 137 is a main body that realizes the processes by the modules unless otherwise specified.
  • in step S201, the ultrasonic image generation module 146 is executed to acquire an ultrasonic image. Specifically, first, the inspection control module 142 is executed prior to the inspection, whereby the inspection order is acquired from the ordering system 112.
  • the examination order includes information on the patient to be examined, information on the part to be examined, and information on the imaging technique.
  • the user operates the probe 102, and an ultrasonic signal is transmitted from the probe 102 to the control device 101.
  • by executing the signal acquisition module 143, the ultrasonic signal is acquired by the control device 101.
  • by executing the ultrasonic image generation module 146, an ultrasonic image is generated based on the ultrasonic signal.
  • by executing the output module 150, the ultrasonic image is displayed on the display unit 104.
  • the user can further operate the probe 102 while observing the ultrasonic image displayed on the display unit 104.
  • in step S202, the inspection control module 142 is executed to acquire information indicating whether to acquire a photoacoustic signal based on the ultrasonic image.
  • the information indicating whether or not to acquire the photoacoustic signal based on the ultrasonic image is specifically information set in advance by the user or included in the inspection order.
  • whether to acquire a photoacoustic signal may also be determined according to the processing load on the control apparatus 101. For example, when the processing load on the control device 101 is heavy and acquiring the photoacoustic signal would interfere with the process of acquiring the ultrasound image, it may be determined not to acquire the photoacoustic signal. If the photoacoustic signal is to be acquired based on the ultrasound image, the process proceeds to step S203; if not, the process returns to step S201 to continue acquiring the ultrasound image.
  • in step S203, the ultrasonic image generation module 146 is executed to analyze the ultrasonic image.
  • the ultrasound image generation module 146 analyzes the ultrasound image and detects a region of interest defined in advance from the regions depicted in the ultrasound image.
  • for example, a structure such as a calculus, a tumor, or a blood vessel (an organ) is set as the region of interest.
  • the calculus here includes the case of a virtual image (artifact) that can appear in ultrasonic imaging.
  • a computer diagnosis support system can also be used in combination to search for a site of interest.
  • the analysis of the ultrasound image and the region extracted as the region of interest are not limited to the above examples; any form may be used as long as it can detect a region for which acquiring a photoacoustic image in addition to the ultrasound image is considered beneficial.
  • for example, speckle is regarded as a noise component and reduced by a spatial filter such as a moving-average or median filter.
  • speckle may also be reduced by a filter that exploits the Rayleigh distribution peculiar to speckle patterns and whose mask shape is locally variable.
  • speckles may be reduced using a multi-resolution filter or a filter by numerical simulation.
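A minimal version of the median spatial filter mentioned above might look like this; it is a naive numpy implementation for illustration only (production code would use an optimized library routine):

```python
import numpy as np

def median_filter(img, k=3):
    """Simple k x k median filter for speckle reduction; edge pixels use
    a clamped window. A minimal stand-in for the spatial filters above."""
    h, w = img.shape
    out = np.empty_like(img, dtype=float)
    r = k // 2
    for y in range(h):
        for x in range(w):
            y0, y1 = max(0, y - r), min(h, y + r + 1)
            x0, x1 = max(0, x - r), min(w, x + r + 1)
            out[y, x] = np.median(img[y0:y1, x0:x1])
    return out
```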
  • segmentation for extracting the target region is then performed on the speckle-reduced image by threshold processing and differentiation processing on the density values.
  • segmentation using a variable shape model may be performed.
  • segmentation may be performed based on a speckle pattern that is a property unique to an ultrasound image.
  • for a nonuniform echo, texture analysis may use a feature amount based on a co-occurrence matrix of density values, or a statistic obtained from the parameters of a log-compressed K distribution; the latter is an example of a probability-distribution-based method.
  • information on internal echo, shape, boundary echo, backward echo, and outer shadow may be used.
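As a toy illustration of the threshold-based segmentation described above, the following sketch thresholds a (speckle-reduced) image and reports the bounding box of the candidate region of interest; the function name and return convention are assumptions, not the patent's method.

```python
import numpy as np

def detect_region_of_interest(img, threshold):
    """Threshold the (speckle-reduced) image and return the bounding box
    (ymin, ymax, xmin, xmax) of pixels above `threshold`, or None when no
    candidate region exists."""
    mask = img > threshold
    if not mask.any():
        return None
    ys, xs = np.nonzero(mask)
    return (int(ys.min()), int(ys.max()), int(xs.min()), int(xs.max()))
```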
  • while the ultrasonic image is analyzed in step S203, the ultrasonic signal may continue to be acquired by the signal acquisition module 143, the ultrasonic image may continue to be generated by the ultrasonic image generation module 146, and the ultrasonic image may continue to be displayed on the display unit 104 by the output module 150.
  • in step S204, by executing the ultrasonic image generation module 146, it is determined whether to acquire a photoacoustic image based on the analysis result in step S203. If the region of interest was extracted in step S203, the process proceeds to step S205; if not, the process returns to step S201 to continue acquiring the ultrasound image.
  • in step S205, the irradiation control module 144 is executed to determine whether or not to irradiate the subject with light. Specifically, the irradiation control module 144 determines whether or not the probe 102 is in contact with the subject. The irradiation control module 144 determines contact between the subject and the probe 102 based on the ultrasonic image generated by the ultrasonic image generation module 146 and the position information of the probe 102 acquired by the position acquisition module 149.
  • alternatively, the probe 102 may be provided with a sensor (not shown) for detecting contact with the subject, and the irradiation control module 144 may determine contact between the subject and the probe 102 based on information from that sensor.
  • the irradiation control module 144 controls the irradiation unit 107 to emit light when it is determined that the subject and the probe 102 are in contact with each other. When it is determined that they are not in contact, a screen notifying the user of this may be displayed on the display unit 104 via the output module 150.
  • the irradiation control module 144 may further control the irradiation unit 107 to emit light when the region extracted as the region of interest in step S203 is depicted in the ultrasonic image generated by the ultrasonic image generation module 146.
  • if it is determined in step S205 that the subject is to be irradiated with light, the process proceeds to step S206; if not, the process returns to step S201 to continue acquiring the ultrasound image.
  • in step S206, the photoacoustic signal is acquired from the probe 102 by executing the signal acquisition module 143.
  • in step S207, the photoacoustic image generation module 147 is executed to reconstruct a photoacoustic image from the photoacoustic signal acquired in step S206. Then, by executing the output module 150, the reconstructed photoacoustic image is displayed on the display unit 104. Furthermore, by controlling the photoacoustic image generation module 147 to execute the superimposed image generation module 148, a superimposed image may be generated and displayed on the display unit 104 via the output module 150.
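The flow of steps S201 to S207 can be sketched as a control loop; every callback below is a hypothetical stand-in for the corresponding module (signal acquisition, analysis, irradiation control, output), not actual device code.

```python
def imaging_loop(acquire_us_image, pa_enabled, find_roi, probe_in_contact,
                 acquire_pa_image, display, max_iters=100):
    """Sketch of the FIG. 2 flow: acquire ultrasound (S201), check whether
    photoacoustic acquisition is enabled (S202), analyze for a region of
    interest (S203/S204), check probe contact before light irradiation
    (S205), then acquire and display the photoacoustic image (S206/S207)."""
    for _ in range(max_iters):
        us = acquire_us_image()            # S201: ultrasound acquisition
        display(us)
        if not pa_enabled():               # S202: photoacoustic enabled?
            continue
        roi = find_roi(us)                 # S203: analyze ultrasound image
        if roi is None:                    # S204: region of interest found?
            continue
        if not probe_in_contact():         # S205: safe to irradiate light?
            continue
        pa = acquire_pa_image(roi)         # S206: photoacoustic acquisition
        display(pa)                        # S207: reconstruct and display
        return roi, pa
    return None
```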
  • morphological information inside the subject is depicted in an ultrasonic image such as a B-mode image.
  • functional information such as the amount of hemoglobin in the blood vessel is depicted in the photoacoustic image.
  • a superimposed image generated according to a predetermined overlapping condition is displayed on the display unit 104.
  • the superimposing conditions are, for example, conditions such as a color for displaying each image, a superimposing range, and transparency in a superimposed image in which an ultrasonic image is a base image and a photoacoustic image is a layer image.
  • the control apparatus 101 can display the ultrasonic image and the photoacoustic image on the display unit 104 so that the images of the region of interest, specified based on the morphological information and the functional information, can be referred to in association with each other. Thereby, the workflow for a user such as a doctor who observes medical images such as an ultrasonic image or a photoacoustic image of a subject and makes a diagnosis can be improved.
  • the irradiation control module 144 is not limited to the example described above, and may irradiate the subject with light based on a user operation input.
  • in that case, the photoacoustic image may be generated by the photoacoustic image generation module 147 and displayed on the display unit 104 via the output module 150.
  • the photoacoustic image generated by the photoacoustic image generation module 147 and the superimposed image generated by the superimposed image generation module 148 may be appropriately stored in the DISK 134 or the PACS 113.
  • FIG. 3 is a diagram illustrating an example of analysis performed in step S203 illustrated in FIG. 2 and an example of an image displayed in step S207.
  • a case where a region with a possibility of calculus is depicted in an ultrasonic image will be described as an example.
  • FIG. 3A is a diagram schematically illustrating the internal structure of the subject.
  • a region 301 is the region depicted in an ultrasonic image generated from the ultrasonic signal acquired with the probe in contact with a certain position on the subject. It is assumed that a calculus 302 and a blood vessel 303 exist inside the subject. At this time, the calculus 302 is located outside the region 301.
  • FIG. 3B is an example of an ultrasonic image 304 generated by imaging the region 301 illustrated in FIG. 3A.
  • a blood vessel region 306 corresponding to the blood vessel 303 inside the subject is depicted.
  • an image of the calculus 302 existing outside the region 301 is depicted as a virtual image 305 in the ultrasonic image 304.
  • a virtual image is an image in which a structure is depicted at a position where it does not actually exist inside the subject.
  • as a result, an ultrasonic image 304 is obtained as if the calculus 302 were present in the region 301 depicted by the ultrasonic waves traveling in the main direction.
  • a user who observes the ultrasound image 304 must determine whether or not the virtual image region 305 is a virtual image. In general, the user often determines whether the image is a virtual image while operating the probe 102 and changing the imaging range of the ultrasonic image.
  • the captured ultrasonic image is analyzed to detect, for example, a region with a possibility of a calculus.
  • FIG. 3C is an example of a photoacoustic image 307 generated by imaging the region 301 illustrated in FIG. 3A. That is, the photoacoustic image 307 is generated based on the photoacoustic signal acquired by the processing in steps S204 to S206 illustrated in FIG. 2. In the photoacoustic image 307, a blood vessel region 308 corresponding to the blood vessel 303 inside the subject is depicted. The photoacoustic image 307 depicts neither an image caused by the calculus 302 nor any influence of it on the blood vessel region 308.
  • a possible reason why the feature resulting from the calculus 302 is not depicted in the photoacoustic image 307 is, for example, that the laser beam emitted from the irradiation unit 107 travels in a straighter path than the ultrasonic wave emitted from the transmission/reception unit 106.
  • when the virtual image region 305, which is an image possibly representing a calculus, is indeed a virtual image, the feature due to the calculus 302 is not depicted at the corresponding position in the photoacoustic image 307. Therefore, the user can refer to the information obtained from the photoacoustic image when determining whether the region considered to contain a calculus is a virtual image.
  • FIG. 3D is a diagram illustrating an example of a superimposed image 309 in which the photoacoustic image 307 is superimposed on the ultrasonic image 304.
  • the photoacoustic image is displayed together with the ultrasonic image by displaying the superimposed image 309.
  • FIG. 3D shows an example in which the photoacoustic image 310 of the region corresponding to the region detected as the region of interest in step S203 is superimposed on the ultrasonic image 304 and displayed.
  • the method of displaying the photoacoustic image here, that is, the superimposing method can be set in advance by the user.
  • the photoacoustic image is displayed in a color corresponding to the intensity of the photoacoustic wave.
  • a blood vessel having the characteristic of absorbing the irradiated light and generating a photoacoustic wave is depicted in the photoacoustic image.
  • in the photoacoustic image 310, only a blood vessel is depicted, as the blood vessel region 311 corresponding to the blood vessel 303, and no image resulting from the calculus 302 appears.
  • the control device 101 can assist the user in diagnosis related to the region of interest. For example, the control device 101 can assist in determining whether or not a region with a possibility of calculus is a virtual image.
  • in step S203, the ultrasonic image may be analyzed using the technique described in "Removal of a virtual image in an ultrasonic image using fuzzy image processing" (Medical Imaging Technology, Vol. 14, No. 5, 1996) to detect a virtual image.
  • in step S207, the detected virtual image may be removed before display. The user can then determine, using the photoacoustic image, whether the detected region is a virtual image. Even when the virtual image is removed from the displayed ultrasonic image, the photoacoustic image is displayed in a comparable manner, so the user can visually confirm that the virtual image has been removed from the ultrasonic image.
  • the control device 101 may determine whether or not a region having a possibility of a virtual image detected in the ultrasonic image is a virtual image based on information drawn in the photoacoustic image.
  • the control device 101 analyzes the acquired ultrasonic image to detect a region of interest, and generates a photoacoustic image corresponding to at least the region of interest. Thereby, an image useful for diagnosing a region of interest can be efficiently captured. Further, since the control device 101 emits light when a region of interest is detected, redundant light irradiation can be reduced.
  • Elastography is a method for imaging tissue hardness according to the principle described below.
  • elastography evaluates hardness based on Hooke's law by measuring tissue strain caused by externally applied stress. For example, when the probe 102 is pressed from the body surface, softer tissue deforms more. If the displacement of the tissue before and after compression is measured and differentiated, the strain at each point of the tissue can be obtained.
  • An elastography image is an image of the strain distribution at each point of tissue.
  • for example, an elastography image is a two-dimensional image in which the hue varies so that portions with large strain (soft portions) appear red and portions with small strain (hard portions) appear blue, passing through green in between.
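The strain computation underlying elastography, differentiating the measured displacement as described above, can be illustrated in a few lines; the axis convention (depth along axis 0) and units are assumptions.

```python
import numpy as np

def strain_from_displacement(displacement, dz=1.0):
    """Axial strain as the depth derivative of the measured tissue
    displacement (larger strain = softer tissue, as described above)."""
    return np.gradient(displacement, dz, axis=0)
```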
  • when the breast is the subject, for example, adipose tissue is soft, while a site calcified by breast cancer or the like is considered hard.
  • knowing the hardness of the tissue in the subject is useful information for diagnosis.
  • a tumor tissue generates a lot of new blood vessels around it, and it is considered useful for diagnosis to use blood vessel information obtained by a photoacoustic image together with an elastography image.
  • an example will be described in which the photoacoustic image is acquired by controlling the irradiation of light to the subject based on the result of analyzing an elastography image, which is one example of an ultrasonic image.
  • FIG. 5 is a flowchart showing an example of processing of the control device 101 for controlling light irradiation based on the acquired ultrasonic image and acquiring a photoacoustic image.
  • the CPU 131 or the GPU 137 is a main body that realizes the processes by the modules unless otherwise specified.
  • in step S501, the ultrasonic image generation module 146 is executed to acquire an ultrasonic image. Specifically, first, the inspection control module 142 is executed prior to the inspection, whereby the inspection order is acquired from the ordering system 112.
  • the examination order includes information on the patient to be examined, information on the part to be examined, and information on the imaging technique.
  • the user operates the probe 102, and an ultrasonic signal is transmitted from the probe 102 to the control device 101.
  • by executing the signal acquisition module 143, the ultrasonic signal is acquired by the control device 101.
  • by executing the ultrasonic image generation module 146, an ultrasonic image is generated based on the ultrasonic signal.
  • by executing the output module 150, the ultrasonic image is displayed on the display unit 104.
  • the user can further operate the probe 102 while observing the ultrasonic image displayed on the display unit 104.
  • in step S502, by executing the signal acquisition module 143, it is determined whether or not imaging by elastography has been performed. For example, while observing the ultrasonic image generated in step S501 and displayed on the display unit 104, the user finds a region to observe in detail and performs imaging by elastography. For example, the user can switch the operation mode related to signal acquisition of the probe 102 to the elastography imaging mode via the operation unit 105. Note that a switch or the like for switching the operation mode may be provided on the probe 102. The user switches to the elastography imaging mode and performs an operation of pressing the probe 102 against the subject. An ultrasonic signal is then acquired from the probe 102 by the signal acquisition module 143.
  • in step S503, the inspection control module 142 is executed to acquire information indicating whether to acquire a photoacoustic signal based on the ultrasonic image. Since the process of step S503 is the same as that of step S202 illustrated in FIG. 2, its description is omitted here.
  • if the photoacoustic signal is to be acquired based on the ultrasonic image, the process proceeds to step S504; if not, the process returns to step S501 and acquisition of the ultrasonic image continues.
  • in step S504, by executing the ultrasonic image generation module 146, the elastography image acquired in step S502 is analyzed.
  • the ultrasound image generation module 146 analyzes the elastography image, and detects a region of interest defined in advance from the regions drawn on the elastography image.
  • an example of the analysis performed by the ultrasonic image generation module 146 in step S504 will be described.
  • an example in which an elastography image is analyzed and a region where a hard tissue may be depicted is detected as a region of interest is shown.
  • the elastography image is obtained by imaging the strain distribution at each point of the tissue.
  • for example, a set of pixels having strain equal to or less than a predetermined value is detected as a region where hard tissue may be depicted.
  • the signal acquisition module 143 or the ultrasonic image generation module 146 provides information indicating the degree to which the probe 102 presses the subject.
  • in step S504, information on the degree to which the probe 102 presses the subject may also be used. This is because the ease of displacement of each tissue changes according to how strongly the probe 102 presses the subject. For example, the greater the pressing degree, the smaller the predetermined strain value may be made.
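The hard-tissue detection described above (flagging pixels whose strain falls at or below a threshold, with the threshold reduced as the probe presses harder) might be sketched like this; the inverse scaling of the threshold with pressing degree is one possible choice for illustration, not specified by the text.

```python
import numpy as np

def hard_tissue_mask(strain, base_threshold=0.05, pressure=1.0):
    """Pixels with strain at or below a threshold are flagged as possibly
    hard tissue; the threshold shrinks as the pressing degree grows
    (the 1/pressure scaling here is an illustrative assumption)."""
    threshold = base_threshold / max(pressure, 1e-6)
    return strain <= threshold
```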
  • while the elastography image is analyzed, the signal acquisition module 143 may continue to acquire an ultrasonic signal, the ultrasonic image generation module 146 may continue to generate an ultrasonic image such as a B-mode image or an elastography image, and the output module 150 may continue to display the ultrasonic image on the display unit 104.
  • in step S505, by executing the ultrasonic image generation module 146, it is determined whether to acquire a photoacoustic image based on the analysis result in step S504. If a region of interest was extracted in step S504, the process proceeds to step S506; if not, the process returns to step S501 to continue acquiring the ultrasound image.
  • in step S506, the irradiation control module 144 is executed to determine whether to irradiate the subject with light. Since the process of step S506 is the same as that of step S205 illustrated in FIG. 2, its description is omitted here.
  • in step S507, the signal acquisition module 143 is executed to acquire a photoacoustic signal from the probe 102.
  • in step S508, the photoacoustic image is generated from the photoacoustic signal acquired in step S507 by executing the photoacoustic image generation module 147. Then, by executing the output module 150, the reconstructed photoacoustic image is displayed on the display unit 104. Furthermore, by controlling the photoacoustic image generation module 147 to execute the superimposed image generation module 148, a superimposed image may be generated and displayed on the display unit 104 via the output module 150.
  • a superimposed image obtained by superimposing a photoacoustic image on an elastography image is displayed on the display unit 104.
  • the elastography image is generally expressed as a color image having a hue reflecting the degree of elasticity.
  • an image reflecting the concentration of a specific substance, for example hemoglobin, is expressed as a color image whose hue reflects the magnitude of the concentration.
  • it is preferable, for example, to use different hues for the base image and the layer image.
  • when the region expressed by the hue of the photoacoustic image and the region expressed by the hue of the elastography image overlap, it is preferable to make it visible to the user that the region is depicted in both images.
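One way to realize the hue-distinguishable superimposition described above is masked alpha blending, sketched below; the specific blending rule and parameter values are display-setting assumptions, not the patent's prescribed method.

```python
import numpy as np

def superimpose(base_rgb, layer_rgb, layer_mask, alpha=0.5):
    """Alpha-blend the layer image over the base image only where the layer
    has signal, so a region depicted in both images stays visible through
    the mixed hue (alpha and the two hues are display settings)."""
    out = base_rgb.astype(float).copy()
    m = np.asarray(layer_mask, dtype=bool)
    out[m] = (1.0 - alpha) * out[m] + alpha * layer_rgb.astype(float)[m]
    return out
```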
  • the control device 101 can display the ultrasonic image and the photoacoustic image on the display unit 104 so that the images of the region of interest, specified based on the degree of tissue elasticity and the functional information, can be referred to in association with each other.
  • the user may need to determine whether a region considered to be hard tissue when observing an elastography image is a malignant tumor.
  • the control apparatus 101 can assist a user's judgment by presenting, for example, information on a new blood vessel drawn in the photoacoustic image in a comparable manner.
  • the workflow for a user such as a doctor to observe a medical image such as an ultrasonic image or a photoacoustic image of a subject and perform a diagnosis can be improved.
  • the irradiation control module 144 is not limited to the example described above, and may irradiate the subject with light based on a user operation input.
  • in that case, the photoacoustic image may be generated by the photoacoustic image generation module 147 and displayed on the display unit 104 via the output module 150.
  • the photoacoustic image generated by the photoacoustic image generation module 147 and the superimposed image generated by the superimposed image generation module 148 may be appropriately stored in the DISK 134 or the PACS 113.
  • FIG. 4 is a diagram illustrating an example of analysis performed in step S504 illustrated in FIG.
  • a case where a region having a possibility of a hard tissue is depicted in an elastography image will be described as an example.
  • FIG. 4A is an example of an elastography image 401, schematically illustrating the structure inside a subject. In the image 401, a hard tissue region 402 is displayed so as to be distinguishable from the surrounding softer tissue.
  • FIG. 4B is an example of the photoacoustic image 403 obtained by imaging the region depicted in the elastography image 401 shown in FIG. 4A.
  • a blood vessel region 404 and a blood vessel region 405 are depicted in the photoacoustic image 403.
  • FIG. 4C is an example of a superimposed image 406 in which the photoacoustic image 403 is superimposed on the elastography image 401.
  • the blood vessel region 407 corresponds to the blood vessel region 404 depicted in the photoacoustic image illustrated in FIG. 4B.
  • the tissue region 408 corresponds to the tissue region 402 depicted in the elastography image illustrated in FIG. 4A.
  • FIG. 4C is an example of a superimposed image in which a photoacoustic image in the vicinity of a tissue region 408 that is considered to be harder than the surrounding tissue in the elastography image is superimposed.
  • the control device 101 analyzes the acquired ultrasonic image to detect a region of interest, and generates a photoacoustic image corresponding to at least the region of interest. Thereby, an image useful for diagnosing a region of interest can be efficiently captured. Further, since the control device 101 emits light when a region of interest is detected, redundant light irradiation can be reduced.
  • the example using the elastography image that expresses the elasticity of the tissue qualitatively has been described, but the present invention is not limited to this.
  • an image generated by quantitative elastic imaging that quantitatively represents the elasticity of the tissue may be used.
  • the propagation of sound waves is the propagation of wave energy, and a force called an acoustic radiation force is generated in the sound wave propagation direction in an object that blocks the propagation of wave energy. Therefore, when a convergent ultrasonic pulse having a high sound pressure and a relatively long duration is radiated to a living body, a minute displacement occurs in the tissue due to the acoustic radiation force.
  • a transverse wave is generated that propagates in a direction perpendicular to the displacement, that is, in a direction perpendicular to the ultrasonic beam. Since the propagation speed of the transverse wave is slower than that of the longitudinal wave, the propagation process of the transverse wave can be imaged by the pulse echo method, and the propagation speed can be obtained. It is considered that the propagation speed of the shear wave is higher as the tissue is harder, and thereby the hardness of the tissue can be quantitatively evaluated.
  • the elastic modulus distribution, that is, a quantitative hardness index, may be obtained and imaged based on the tissue strain distribution obtained by qualitative elastography and the tissue stress distribution.
  • the tissue stress distribution cannot be directly measured, but may be obtained by anatomical information, simulation, or the like.
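For the quantitative evaluation via shear-wave speed, tissue stiffness is commonly converted with the incompressible-medium approximation E ≈ 3ρc² (shear modulus μ = ρc²). This formula is standard in shear-wave elastography but is an addition here, not stated explicitly in the text above.

```python
def youngs_modulus_from_shear_speed(c_ms, density=1000.0):
    """Young's modulus (Pa) from shear-wave speed c (m/s) via E = 3*rho*c^2,
    the usual incompressible-tissue approximation in quantitative
    shear-wave elastography. density defaults to ~1000 kg/m^3 for soft tissue."""
    return 3.0 * density * c_ms ** 2
```

Consistent with the text, a higher shear-wave speed maps to a harder tissue (larger modulus).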
  • the photoacoustic image depicts a substance or tissue having a property of absorbing irradiated light and generating an acoustic wave (hereinafter referred to as optical characteristics).
  • tissue features with optical properties can aid in diagnosis. For example, it is said that there are many new blood vessels in the vicinity of the tumor tissue, and there is a possibility that there is a correlation between a region where the density of thin blood vessels is high and the malignancy of the tumor. Further, there is a possibility that the concentration of a substance having optical characteristics is different in a specific lesion tissue as compared with the surrounding normal tissue.
  • by observing the photoacoustic image, it may be possible to identify a region having a lesion with such characteristics.
  • when a user observing a photoacoustic image finds a region that may contain a lesion, that is, a region of interest requiring more detailed observation, providing more detailed information about that region of interest is considered useful for diagnosis.
  • for that purpose, an ultrasonic signal can be acquired and an ultrasonic image can be displayed.
  • FIG. 7 is a flowchart showing an example of processing of the control device 101 for controlling the irradiation of ultrasonic waves based on the acquired photoacoustic image and acquiring the ultrasonic image.
  • the CPU 131 or the GPU 137 is a main body that realizes the processes by the modules unless otherwise specified.
  • in step S701, the photoacoustic image is generated by executing the photoacoustic image generation module 147.
  • The examination control module 142 is executed prior to the examination, whereby the examination order is acquired from the ordering system 112.
  • The examination order includes information on the patient to be examined, the part to be examined, and the imaging technique.
  • the user operates the probe 102, and a photoacoustic signal is transmitted from the probe 102 to the control device 101.
  • By executing the signal acquisition module 143, the photoacoustic signal is acquired by the control device 101.
  • By executing the photoacoustic image generation module 147, a photoacoustic image is generated based on the photoacoustic signal.
  • By executing the output module 150, the photoacoustic image is displayed on the display unit 104.
  • the user can further operate the probe 102 while observing the photoacoustic image displayed on the display unit 104.
  • In step S702, the examination control module 142 is executed to acquire information indicating whether to acquire an ultrasonic signal based on the photoacoustic image.
  • This information is, specifically, set in advance by the user or included in the examination order.
  • Alternatively, whether or not to acquire an ultrasonic signal may be determined according to the processing load on the control device 101. For example, when the processing load on the control device 101 is heavy and acquisition of the photoacoustic signal would affect the processing for acquiring the ultrasonic image, it may be determined not to acquire the ultrasonic signal.
  • If the ultrasonic signal is to be acquired, the process proceeds to step S703; if not, the process returns to step S701 and acquisition of the photoacoustic image continues.
  • In step S703, the photoacoustic image is analyzed by executing the photoacoustic image generation module 147.
  • The photoacoustic image generation module 147 analyzes the photoacoustic image and detects a predefined region of interest from the region depicted in the photoacoustic image.
  • An example of the analysis performed by the photoacoustic image generation module 147 in step S703 is described below.
  • Here, the photoacoustic image analyzed is an image reflecting the absorption coefficient for light of a specific wavelength (hereinafter referred to as an absorption coefficient image).
  • A blood vessel image is depicted in the absorption coefficient image.
  • The density of blood vessels in the region depicted in the photoacoustic image is analyzed, and a region where blood vessels are present at a predetermined density or more is detected as the region of interest.
  • For example, the density is defined as the number of pixels in a predetermined range whose pixel value is equal to or greater than a certain threshold, divided by the total number of pixels constituting that range.
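The density computation above can be sketched as follows; the function names and specific threshold values are illustrative assumptions, not values taken from the description:

```python
import numpy as np

def vessel_density(region, pixel_threshold):
    """Fraction of pixels at or above `pixel_threshold`: the count of such
    pixels divided by the total pixel count of the predetermined range."""
    region = np.asarray(region)
    return np.count_nonzero(region >= pixel_threshold) / region.size

def is_region_of_interest(region, pixel_threshold, density_threshold):
    """A region is of interest when blood vessels are present at a
    predetermined density or more."""
    return vessel_density(region, pixel_threshold) >= density_threshold
```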
  • Even while the photoacoustic image is analyzed in step S703, the photoacoustic signal may continue to be acquired by the signal acquisition module 143, the photoacoustic image may continue to be generated by the photoacoustic image generation module 147, and the photoacoustic image may continue to be displayed on the display unit 104 via the output module 150.
  • In step S704, by executing the photoacoustic image generation module 147, it is determined whether to acquire an ultrasonic image based on the analysis result of step S703. If a region of interest was extracted in step S703, the process proceeds to step S705; if not, the process returns to step S701 to continue acquiring the photoacoustic image. In this respect, the photoacoustic image generation module 147 is an example of an analysis unit.
  • In step S705, the irradiation control module 144 is executed to determine whether to irradiate the subject with ultrasonic waves. Specifically, the irradiation control module 144 determines whether the probe 102 is in contact with the subject, based on the photoacoustic image generated by the photoacoustic image generation module 147 and the position information of the probe 102 acquired by the position acquisition module 149.
  • Alternatively, the probe 102 may be provided with a sensor (not shown) for detecting contact with the subject, and the irradiation control module 144 may determine contact between the subject and the probe 102 based on information from this sensor.
  • The irradiation control module 144 controls the irradiation unit 107 to irradiate ultrasonic waves when it determines that the subject and the probe 102 are in contact. When it determines that they are not in contact, a screen notifying the user of this may be displayed on the display unit 104 via the output module 150.
  • The irradiation control module 144 may further control the transmitting/receiving unit 106 to irradiate ultrasonic waves when the region extracted as the region of interest in step S703 is depicted in the photoacoustic image generated by the photoacoustic image generation module 147.
  • Thereby, the possibility of performing redundant ultrasonic irradiation after the probe 102 leaves the region of interest can be reduced.
  • Further, it may be determined that ultrasonic waves are to be irradiated only when the temperature of the probe 102 is equal to or lower than a predetermined value. Owing to the characteristics of the transmitting/receiving unit 106, when the probe 102 is separated from the subject, an air layer forms between the probe 102 and the subject, and the acoustic impedance of air is much larger than that of the transmitting/receiving unit 106.
  • As a result, the ultrasonic waves are repeatedly reflected in the vicinity of the transmitting/receiving unit 106, and the temperature of the probe 102 may rise.
  • For this purpose, the probe 102 may be provided with a temperature sensor, and the irradiation control module 144 may acquire temperature information of the probe 102 from the temperature sensor. If it is determined in step S705 that the subject is to be irradiated with ultrasonic waves, the process proceeds to step S706; if not, the process returns to step S701 to continue acquiring the photoacoustic image.
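A minimal sketch of this gating logic follows; the description gives no numeric temperature limit, so the 43.0 °C default is an illustrative placeholder only:

```python
def may_irradiate_ultrasound(in_contact, probe_temp_c, max_temp_c=43.0):
    """Permit ultrasonic irradiation only when the probe is in contact with
    the subject and its temperature is at or below the predetermined value.

    The 43.0 degC default is a hypothetical placeholder, not a value from
    the description.
    """
    return bool(in_contact) and probe_temp_c <= max_temp_c
```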
  • In step S706, an ultrasonic signal is acquired from the probe 102 by executing the signal acquisition module 143. By executing the ultrasonic image generation module 146, an ultrasonic image is generated from the ultrasonic signal, and by executing the output module 150, the generated ultrasonic image is displayed on the display unit 104. Furthermore, a superimposed image may be generated by causing the ultrasonic image generation module 146 to execute the superimposed image generation module 148, and the superimposed image may be displayed on the display unit 104 via the output module 150.
  • the ultrasonic image generated in step S706 is, for example, a B mode image.
  • In step S707, by executing the ultrasonic image generation module 146, it is determined whether imaging by elastography has been performed. Since the region of interest is depicted in the photoacoustic image, it is considered useful to display an elastography image on the display unit 104 as one piece of detailed information that assists diagnosis of the region of interest. For example, a screen notifying the user that there is a region with a high blood vessel density and that elastography imaging would be useful is displayed on the display unit 104 via the output module 150. The user refers to the notification screen, presses the probe 102 against the subject, and performs elastography imaging.
  • The process for determining whether the user has performed elastography imaging is the same as the process of step S502 illustrated in FIG. 5, so its description is omitted here.
  • In step S708, by executing the ultrasonic image generation module 146, an elastography image is generated from the ultrasonic signal acquired in step S707, and by executing the output module 150, the generated elastography image is displayed on the display unit 104. Furthermore, a superimposed image may be generated by causing the ultrasonic image generation module 146 to execute the superimposed image generation module 148, and the superimposed image may be displayed on the display unit 104 via the output module 150.
  • As described above, the control device 101 can display the image of the region of interest specified based on the photoacoustic image and an ultrasonic image such as a B-mode image or an elastography image on the display unit 104 in association with each other so that they can be compared. Thereby, the workflow of a user, such as a doctor, who observes medical images such as ultrasonic and photoacoustic images of a subject and performs a diagnosis can be improved.
  • The irradiation control module 144 is not limited to the above-described example; based on a user operation input, the subject may be irradiated with ultrasonic waves, an ultrasonic image may be generated by the ultrasonic image generation module 146, and the ultrasonic image may be displayed on the display unit 104 via the output module 150.
  • the elastography image generated by the ultrasonic image generation module 146 and the superimposed image generated by the superimposed image generation module 148 may be appropriately stored in the DISK 134 or the PACS 113.
  • Note that step S707 and step S708 are not necessarily performed; an ultrasonic image such as a B-mode image may simply be displayed based on the result of analyzing the photoacoustic image.
  • FIG. 6 is a diagram illustrating an example of the analysis performed in step S703 of FIG. 7 and an example of the image displayed in step S708.
  • A case where a region having a high blood vessel density is depicted in the photoacoustic image is described as an example.
  • FIG. 6A is an example of the photoacoustic image 602 displayed on the display unit 104.
  • a blood vessel region 603 and a blood vessel region 604 are depicted.
  • FIG. 6B is an example of a screen displayed on the display unit 104 when a region considered to have a high blood vessel density is detected as a result of the analysis in step S703.
  • A frame 605 indicates the region of interest detected in step S703. Thereby, the user can visually recognize the region of interest, that is, the region detected as having a high blood vessel density.
  • The notification screen 606 displays information based on the analysis result. For example, a message informing the user that there is a region with a high blood vessel density and that an elastography image would be useful for further observation is displayed on the notification screen 606, such as "Please perform elastography of the blood vessel dense region."
  • FIG. 6C is an example of an elastography image 607 based on the ultrasonic signal obtained by the elastography imaging performed in step S707.
  • In the elastography image 607, a tissue region 608 that may be hard tissue is depicted so as to be distinguishable from the surrounding tissue.
  • FIG. 6D is an example of a superimposed image 609 obtained by superimposing an elastography image 607 on the photoacoustic image 602.
  • a blood vessel region 610 corresponding to the blood vessel region 603 of the photoacoustic image 602 is depicted.
  • Thereby, the user can visually recognize the region with high blood vessel density around the tissue region 608 of the elastography image 607 illustrated in FIG. 6C, which assists the diagnosis performed by the user.
  • As described above, the control device 101 analyzes the acquired photoacoustic image, detects a region that may contain a lesion as a region of interest, and generates an ultrasonic image corresponding to at least the region of interest.
  • Diagnosis can be assisted by controlling the apparatus to perform elastography imaging for evaluating the hardness of the region of interest. By such control, the user can perform elastography imaging centered on the region of interest, and the user workflow in the examination can be improved.
  • In the above example, an elastography image is acquired as the ultrasonic image, but the present invention is not limited to this; Doppler imaging for measuring blood flow velocity or B-mode imaging for grasping the structure inside the subject may be performed instead.
  • FIG. 8 is a diagram schematically illustrating an inspection state in the fourth embodiment.
  • FIG. 8A shows an example of a state in which the user brings the probe 102 into contact with the subject 803 and acquires an ultrasonic image.
  • An ultrasonic signal from the probe 102 is transmitted to the console 801.
  • The console 801 is an apparatus in which the control device 101, the display unit 104, and the operation unit 105 are integrated.
  • The console 801 corresponds to the control device 101 in each of the above-described embodiments.
  • the position information of the position 802 is acquired by the position acquisition module 149 of the console 801 and stored in the RAM 133 for a predetermined period.
  • The position information is associated with the ultrasonic image and the photoacoustic image, respectively. It is assumed here that the region of interest is detected in the ultrasonic image captured at the position 802.
  • FIG. 8B shows an example of the examination state at the time when the console 801, in the series of processes of FIG. 9, analyzes the ultrasonic image and detects the region of interest.
  • By this time, the user has moved the probe 102 from the position 805, which corresponds to the position 802 in FIG. 8A, to the position 804.
  • Therefore, even though the region of interest for which a photoacoustic image should be acquired has been detected by the console 801, the detected region of interest may not be depicted in a photoacoustic image generated from the photoacoustic signal acquired at the position 804.
  • FIG. 8C shows an example of a state in which the user, guided by the console 801, has moved the probe 102 to a position 806 where the region of interest can be depicted.
  • By executing the position acquisition module 149 of the console 801, the current position of the probe 102 is compared with the target position, which is the position of the probe 102 at which the ultrasonic signal of the ultrasonic image in which the region of interest was detected was acquired.
  • Guide information for guiding the probe 102 to the target position is then generated and displayed.
  • By following this guide, the user can acquire a photoacoustic image of the detected region of interest.
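The comparison and guide-information generation above can be sketched as follows, working in a 2-D probe coordinate plane; the coordinate convention, function name, and arrival tolerance are assumptions for illustration:

```python
import math

def guide_to_target(current_xy, target_xy, reach_tol=1.0):
    """Compare the current probe position with the target position and
    return (distance, unit_direction, reached) as guide information.

    `reach_tol` stands in for the 'predetermined range including the
    target position' used when judging that the probe has arrived.
    """
    dx = target_xy[0] - current_xy[0]
    dy = target_xy[1] - current_xy[1]
    dist = math.hypot(dx, dy)
    if dist == 0.0:
        return 0.0, (0.0, 0.0), True
    # Unit direction vector drives an arrow-style guide image; the
    # distance drives its size.
    return dist, (dx / dist, dy / dist), dist <= reach_tol
```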
  • FIG. 9 is a flowchart illustrating an example of the processing for the guide illustrated in FIG. 8.
  • Unless otherwise specified, the CPU 131 or the GPU 137 is the main component that executes the processing of each module.
  • Since the processing of step S901 and step S902 is the same as the processing of step S201 illustrated in FIG. 2, its description is omitted here.
  • Next, the position acquisition module 149 is executed, whereby the position information of the probe 102 is acquired. More specifically, a motion sensor, which is an example of the detection unit 103, tracks the position of the probe 102 and transmits the position information to the control device 101.
  • The motion sensor is provided on, or embedded in, a portion of the probe 102 different from the transmitting/receiving unit 106 and the light source (not shown).
  • The motion sensor is composed of, for example, micro electro mechanical systems (MEMS), and provides 9-axis motion sensing including a 3-axis accelerometer, a 3-axis gyroscope, and a 3-axis magnetic compass.
  • Information regarding the movement of the probe 102 sensed by the motion sensor is acquired by the position acquisition module 149 and stored for a certain period of time.
  • Since the processing of step S904 and step S905 is the same as the processing of step S203 and step S204 illustrated in FIG. 2, its description is omitted here. If it is determined in step S905 that the photoacoustic signal is to be acquired, the process proceeds to step S906; if not, the process returns to step S901 to continue acquiring the ultrasonic image.
  • step S906 the position acquisition module 149 is executed, whereby guide information is displayed on the display unit 104.
  • The position of the probe 102 at the time when the ultrasonic signal used to generate the ultrasonic image in which the region of interest was detected in step S904 was acquired is set as the target position.
  • the difference between the current position of the probe 102 indicated by the position information sequentially transmitted from the detection unit 103 and the target position is acquired.
  • Based on this difference, guide information is generated.
  • the guide information is presented to the user via the output module 150.
  • the guide information is a guide image, for example, and is displayed on the display unit 104.
  • The guide image is an objective indicator of guide information such as the movement direction, movement amount, tilt angle, rotation direction, and rotation amount for moving the probe 102 to the target position.
  • the guide image may be any image as long as it is an objective index of the guide information.
  • For example, the guide image is an arrow whose size corresponds to the amount of movement or rotation and whose direction corresponds to the direction of movement, rotation, or tilt.
  • In another example, the guide image is a figure whose size corresponds to the amount of movement or rotation and whose shape is deformed according to the direction of movement, rotation, or tilt.
  • the guide image is displayed on the display unit 104 in a manner that does not interfere with observation of the region of interest when the probe 102 is moved to the target position.
  • the guide image is displayed in an area where the ultrasonic image, the photoacoustic image, and the superimposed image are not displayed.
  • Alternatively, while the probe 102 is being guided to the target position, the guide image may be displayed at a position overlapping the area near the target region, and may be deformed into an inconspicuous shape so as not to obstruct the view once the target region is depicted.
  • In step S907, the irradiation control module 144 is executed to determine whether to irradiate the subject with light. Specifically, the irradiation control module 144 determines that light is to be irradiated when the probe 102 has reached the target position and is in contact with the subject. For example, when the current position of the probe 102 indicated by the position information transmitted from the detection unit 103 matches the target position, it is determined that the probe 102 has reached the target position. In another example, it may be determined that the probe 102 has reached the target position when it comes within a predetermined range including the target position.
  • In this case, the range of probe positions, including the target position, from which the region of interest detected in step S904 can be acquired is set as the predetermined range. Since the process for determining that the probe 102 is in contact with the subject is the same as the process of step S205 illustrated in FIG. 2, its description is omitted here. If it is determined that the subject is to be irradiated with light, the process proceeds to step S908; if not, the process returns to step S901 to continue acquiring the ultrasonic image.
  • Since the processing of step S908 and step S909 is the same as the processing of step S206 and step S207 illustrated in FIG. 2, respectively, its description is omitted here.
  • In the above example, the target position is set based on the region of interest detected by analyzing the ultrasonic image, but the target position may instead be set based on the region of interest detected by analyzing the photoacoustic image.
  • The method of presenting the guide information to the user is not limited to the above-described guide image.
  • For example, the guide information may be presented as sound, such that the interval between sounds becomes shorter as the probe 102 approaches the target position.
  • control device 101 can efficiently acquire an image depicting a region of interest, thereby improving the user workflow.
  • Note that the analysis for detecting the region of interest need not be performed for every frame of the acquired image. For example, performing the analysis at a predetermined frame interval can reduce the processing load on the control device 101.
  • In the above-described embodiments, an example was described in which, when one of the ultrasonic image and the photoacoustic image is analyzed and a region of interest is detected, the probe 102 is controlled to capture the other image.
  • However, the present invention is not limited to this.
  • For example, the probe 102 may be controlled to capture the other image only when the region of interest is included in a predetermined portion of the range that the probe 102 can depict.
  • This makes it possible to obtain a medical image in which the region of interest is easier to observe when the image is later interpreted or attached to an image diagnosis report. In addition, it reduces the possibility that the other image is inadvertently captured when the region of interest unintentionally enters the depictable range.
  • the method of superimposing is not limited to the above-described example.
  • For example, with the ultrasonic image as the base image and the photoacoustic image as the layer image, the photoacoustic image may be superimposed only in the vicinity of the region of interest of the ultrasonic image, or a photoacoustic image of a desired range may be superimposed.
  • The transparency of the layer image may be changed appropriately according to the purpose.
  • For example, the layer image may be opaque, or its transparency may be increased only in the vicinity of the region of interest.
  • A slider for changing the transparency of the layer image may be displayed on the display unit 104 so that the user can adjust it during observation.
  • Display of the superimposed image, side-by-side display of the ultrasonic image and the photoacoustic image, and display of either the ultrasonic image or the photoacoustic image alone may be switched by a user operation input.
  • Further, a region depicted in a past three-dimensional image may be set as the region of interest.
  • For example, a specific region on a CT image captured in the past is set as the region of interest.
  • In this case, the coordinate system of the CT image and the coordinate system of the real space in which the probe 102 is operated are aligned.
  • The control device 101 can acquire both an ultrasonic image and a photoacoustic image, and may perform control such that, when the probe 102 is positioned where the region of interest can be acquired while one image is being acquired, the other image is also acquired.
  • the user may be notified that light irradiation is performed by the probe 102.
  • a notification image for notifying that light irradiation is performed by the probe 102 is displayed on the display unit 104.
  • the probe 102 may be provided with an LED light that is turned on during light irradiation.
  • the control device 101 may generate a notification sound during light irradiation. Thereby, the user can know that light is being emitted from the probe 102, and the safety of the user and the subject can be improved.
  • In the above-described embodiments, the necessity of acquiring the photoacoustic image is determined based on the analysis result of the ultrasonic image; therefore, when a region of interest such as a calculus is included in the ultrasonic image, a photoacoustic image is automatically captured. Consequently, even after a photoacoustic image has been captured once for a certain region of interest, a photoacoustic image may be captured again if the user attempts to confirm that region of interest again with the ultrasonic image. That is, photoacoustic images are captured multiple times for the same region of interest, and unnecessary photoacoustic images are acquired. An object of this modification is therefore to prevent unnecessary photoacoustic images from being acquired.
  • the irradiation control module 144 restricts light irradiation for acquiring a photoacoustic image based on position information of the probe 102 acquired by the position acquisition module 149, for example.
  • Specifically, the irradiation control module 144 stores the position information of the probe 102 at the time it determined that light irradiation was performed in the past, and determines whether the current position information of the probe 102 matches the stored position information or deviates from it by no more than a predetermined threshold.
  • When the current position information of the probe 102 matches the stored position information or the deviation is within the predetermined threshold, the irradiation control module 144 judges that a photoacoustic image has already been acquired for that part and restricts the light irradiation. By restricting the light irradiation, unnecessary light irradiation of the subject and acquisition of unnecessary photoacoustic images can be prevented.
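A sketch of this position-based restriction; the 2-D position representation, function name, and tolerance value are illustrative assumptions:

```python
import math

def light_irradiation_allowed(current_pos, past_positions, tol):
    """Return False (restrict irradiation) when the current probe position
    matches a stored past-irradiation position or deviates from one by no
    more than `tol`; otherwise allow irradiation."""
    for past in past_positions:
        if math.dist(current_pos, past) <= tol:
            return False
    return True
```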
  • the irradiation control module 144 may limit the irradiation of light for acquiring a photoacoustic image based on the ultrasonic image generated by the ultrasonic image generation module 146, for example.
  • For example, the irradiation control module 144 stores the ultrasonic image generated at the time it determined that light irradiation was performed in the past, and compares it with the currently generated ultrasonic image. If the similarity between the images is equal to or greater than a predetermined threshold, it may be determined that a photoacoustic image has already been acquired, and the light irradiation may be restricted.
  • Further, the irradiation control module 144 may restrict the light irradiation for acquiring a photoacoustic image based on both the ultrasonic image generated by the ultrasonic image generation module 146 and the position information of the probe 102 acquired by the position acquisition module 149. For example, the irradiation control module 144 stores an ultrasonic image in association with the position information of the probe 102 at the time it determined that light irradiation was performed. The irradiation control module 144 then reads the ultrasonic image associated with the current position information of the probe 102 and compares it with the currently generated ultrasonic image. If the similarity between the images is equal to or greater than a predetermined threshold, it may be determined that a photoacoustic image has already been acquired, and the light irradiation may be restricted.
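The description does not specify a similarity measure; one plausible choice is zero-mean normalized cross-correlation, sketched below. The metric itself and the 0.95 threshold are assumptions for illustration:

```python
import numpy as np

def ncc(img_a, img_b):
    """Zero-mean normalized cross-correlation of two equal-shape images,
    in [-1, 1]; 1 means identical up to brightness and contrast."""
    a = np.asarray(img_a, dtype=float).ravel()
    b = np.asarray(img_b, dtype=float).ravel()
    a = a - a.mean()
    b = b - b.mean()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    if denom == 0.0:
        return 1.0  # Degenerate case: at least one image is constant.
    return float(a @ b / denom)

def already_imaged(current_us, stored_us, sim_threshold=0.95):
    """Judge that a photoacoustic image has already been acquired when the
    current ultrasonic image is sufficiently similar to a stored one."""
    return ncc(current_us, stored_us) >= sim_threshold
```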
  • The present invention can also be realized by supplying a program that implements one or more functions of the above-described embodiments to a system or apparatus via a network or a storage medium, and having one or more processors in the computer of the system or apparatus read and execute the program. It can also be realized by a circuit (for example, an ASIC: application-specific integrated circuit) that implements one or more functions.
  • The control device in each of the above-described embodiments may be realized as a single device, or a plurality of devices capable of communicating with each other may be combined to execute the above-described processing; both configurations are included in embodiments of the present invention.
  • The above-described processing may also be executed by a common server device or server group.
  • The plurality of devices constituting the control device and the control system need only be able to communicate at a predetermined communication rate, and need not exist in the same facility or the same country.
  • The present invention also includes a form in which a software program that implements the functions of the above-described embodiments is supplied to a system or apparatus, and the computer of the system or apparatus reads and executes the code of the supplied program.
  • Accordingly, since the processing according to the embodiments is realized by the computer, the program code itself installed in the computer is also an embodiment of the present invention.
  • Further, an OS or the like running on the computer may perform part or all of the actual processing, and the functions of the above-described embodiments may be realized by that processing.
  • Embodiments appropriately combining the above-described embodiments are also included in the embodiments of the present invention.

Landscapes

  • Life Sciences & Earth Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Pathology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Ultra Sonic Daignosis Equipment (AREA)

Abstract

In the present invention, a first image is acquired on the basis of a first signal acquired by a first means, the first image is analyzed and a region of interest is detected, acquisition of a second signal by a second means is controlled in order to acquire a second image that includes the region of interest on the basis of the acquired second signal, and the first image and the second image are displayed in a display unit in a manner enabling comparison of the first image and the second image.

Description

Control device, control method, control system, and program
 The disclosure of this specification relates to a control device, a control method, a control system, and a program.
 An ultrasonic imaging apparatus or a photoacoustic imaging apparatus is used as an imaging apparatus that images the state inside a subject in a minimally invasive manner. Patent Document 1 discloses that, in an apparatus that acquires an ultrasonic signal and a photoacoustic signal, processing for generating a photoacoustic image from the photoacoustic signal is performed for a specific region identified based on the ultrasonic image, and such processing is not performed for regions other than the specific region.
Japanese Unexamined Patent Publication No. 2015-198688
 The photoacoustic signal is based on an acoustic wave generated by expansion occurring inside the subject due to light irradiated onto the subject. Therefore, merely generating a photoacoustic image from the photoacoustic signal only for the region where a photoacoustic image is desired still results in light being irradiated onto regions for which no photoacoustic image is acquired.
 The control device disclosed in this specification includes: signal acquisition means for acquiring at least one of an ultrasonic signal and a photoacoustic signal from a probe that outputs the ultrasonic signal by transmitting and receiving ultrasonic waves to and from a subject and outputs the photoacoustic signal by receiving photoacoustic waves generated by light irradiation of the subject; generation means for generating an ultrasonic image based on the ultrasonic signal; and irradiation control means for controlling the light irradiation of the probe based on the generated ultrasonic image.
 According to the present invention, in a control device capable of controlling different types of imaging, whether or not to perform the other type of imaging can be controlled based on an image obtained by one type of imaging, so that an examination that is not redundant for the user or the subject can be performed.
A diagram illustrating an example of the configuration of the control device according to the first embodiment. A flowchart illustrating an example of processing performed by the control device according to the first embodiment. A diagram illustrating an example of an image displayed by the control device according to the first embodiment. A diagram illustrating an example of an image displayed by the control device according to the second embodiment. A flowchart illustrating an example of processing performed by the control device according to the second embodiment. A diagram illustrating an example of an image displayed by the control device according to the third embodiment. A flowchart illustrating an example of processing performed by the control device according to the third embodiment. A diagram illustrating an example of an examination performed by the control device according to the fourth embodiment. A flowchart illustrating an example of processing performed by the control device according to the fourth embodiment. A diagram illustrating an example of the configuration of a system including a control device according to an embodiment of the present invention.
 Hereinafter, embodiments of the present invention will be described with reference to the drawings.
 [First Embodiment]
 In this specification, an acoustic wave generated by irradiating a subject with light and the resulting expansion inside the subject is referred to as a photoacoustic wave. An acoustic wave transmitted from a transducer, or a reflected wave (echo) produced when the transmitted acoustic wave is reflected inside the subject, is referred to as an ultrasonic wave.
 As methods for imaging the interior of a subject in a minimally invasive manner, imaging using ultrasonic waves and imaging using photoacoustic waves are used. In ultrasonic imaging, for example, ultrasonic waves emitted from a transducer are reflected by tissue inside the subject according to differences in acoustic impedance, and an image is generated based on the time until each reflected wave returns to the transducer and on the intensity of the reflected wave. An image generated using ultrasonic waves is hereinafter referred to as an ultrasonic image. The user can operate the probe while changing its angle and observe ultrasonic images of various cross sections in real time. Ultrasonic images depict the shapes of organs and tissues and are used, for example, to find tumors. Imaging using photoacoustic waves generates an image based on photoacoustic waves produced, for example, by the adiabatic expansion of tissue inside the subject that has been irradiated with light. An image generated using photoacoustic waves is hereinafter referred to as a photoacoustic image. A photoacoustic image depicts information related to optical characteristics, such as the degree of light absorption of each tissue. It is known that photoacoustic images can depict blood vessels, for example owing to the optical characteristics of hemoglobin, and their use for evaluating the malignancy of tumors is being studied.
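The pulse-echo timing relationship described above can be sketched as follows; the assumed speed of sound (1540 m/s, a common average value for soft tissue) is not specified in this document:

```python
# Depth of a reflector from the round-trip time of its echo.
# 1540 m/s is an assumed average speed of sound in soft tissue.
SPEED_OF_SOUND_M_PER_S = 1540.0

def reflector_depth_m(round_trip_time_s):
    """The echo travels to the reflector and back, so depth = c * t / 2."""
    return SPEED_OF_SOUND_M_PER_S * round_trip_time_s / 2.0
```

For example, an echo arriving 26 microseconds after transmission corresponds to a reflector roughly 2 cm deep.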
 In order to improve the accuracy of diagnosis, various kinds of information are sometimes collected by imaging different phenomena in the same part of a subject based on different principles. For example, a cancer diagnosis may be made by combining morphological information obtained from a CT (computed tomography) image with functional information on metabolism obtained from a PET (positron emission tomography) image. Performing diagnosis using information obtained by imaging different phenomena based on different principles in this way is considered effective for improving diagnostic accuracy.
 For the ultrasonic image and the photoacoustic image described above as well, imaging apparatuses for obtaining images that combine their respective characteristics have been studied. In particular, since both ultrasonic images and photoacoustic images are formed using ultrasonic waves from the subject, ultrasonic imaging and photoacoustic imaging can be performed by the same imaging apparatus. More specifically, the apparatus can be configured so that the reflected waves of the ultrasonic waves transmitted to the subject and the photoacoustic waves are received by the same transducer. This allows the ultrasonic signal and the photoacoustic signal to be acquired with a single probe, realizing an imaging apparatus that captures both ultrasonic images and photoacoustic images without a complicated hardware configuration.
 In an imaging apparatus that captures both ultrasonic images and photoacoustic images, the user may want to operate the probe in the same way as in conventional ultrasonic imaging. That is, the user is expected to bring the probe into contact with the surface of the subject and operate it while observing an image displayed based on the information acquired by the probe. If, at that time, the operation mode for signal acquisition or image display is switched via, for example, a switch provided on the probe or an input device provided on the console of the imaging apparatus, the user must interrupt the probe operation being performed while observing the image. As a result, body movement of the subject may occur, or the probe position may shift, during the operation input to the switch or to the console's input device.
 Consider, for example, the case described above in which an ultrasonic image and a photoacoustic image are observed in combination to evaluate the malignancy of a tumor. Suppose the user, operating the probe while observing the ultrasonic image, finds a site that may be a tumor and wants to acquire a photoacoustic image to collect information on blood vessels. At this point, during the operation input to the switch or console input device described above for switching to the operation mode that displays the photoacoustic image, the probe may shift away from the position at which the possibly tumorous site can be observed. An object of the first embodiment is to provide a control device that can switch the displayed image without degrading the operability with which the user observes images.
 FIG. 10 is a diagram illustrating an example of the configuration of a system including the control device 101 according to the first embodiment. An imaging system 100 capable of generating ultrasonic images and photoacoustic images is connected to various external apparatuses via a network 110. The components included in the imaging system 100 and the various external apparatuses need not be installed in the same facility, as long as they are communicably connected.
 The imaging system 100 includes the control device 101, a probe 102, a detection unit 103, a display unit 104, and an operation unit 105. The control device 101 is a device that acquires the ultrasonic signal and the photoacoustic signal from the probe 102, controls the acquisition of the photoacoustic signal based on, for example, the ultrasonic image, and generates the photoacoustic image based on that control. The control device 101 also acquires, from an ordering system 112, information on an examination that includes the capture of ultrasonic images and photoacoustic images, and controls the probe 102, the detection unit 103, and the display unit 104 while the examination is performed. The control device 101 outputs the generated ultrasonic image, the photoacoustic image, and a superimposed image in which the photoacoustic image is superimposed on the ultrasonic image to a PACS 113. The control device 101 transmits and receives information to and from external apparatuses such as the ordering system 112 and the PACS 113 in accordance with standards such as HL7 (Health Level 7) and DICOM (Digital Imaging and Communications in Medicine). Details of the processing performed by the control device 101 will be described later.
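One simple way to form the superimposed image mentioned above is alpha blending of the photoacoustic image over the ultrasonic image; the blending weight and the function name are illustrative assumptions, as the document does not specify how the two images are combined:

```python
def superimpose(us_image, pa_image, alpha=0.5):
    """Blend a photoacoustic image over an ultrasonic image pixel by pixel.
    alpha is the (assumed) weight given to each photoacoustic pixel."""
    return [[(1.0 - alpha) * u + alpha * p for u, p in zip(us_row, pa_row)]
            for us_row, pa_row in zip(us_image, pa_image)]
```

With alpha = 0.5 the ultrasonic anatomy and the photoacoustic vessel information contribute equally to each displayed pixel.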
 The regions in the subject imaged as ultrasonic images by the imaging system 100 are, for example, the cardiovascular region, breast, liver, and pancreas. The imaging system 100 may also capture ultrasonic images of a subject to which an ultrasound contrast agent using, for example, microbubbles has been administered.
 The regions in the subject imaged as photoacoustic images by the imaging system 100 are, for example, the cardiovascular region, breast, groin, abdomen, and limbs including fingers and toes. In particular, depending on the light-absorption characteristics inside the subject, blood vessel regions including new blood vessels and plaque on blood vessel walls may be targeted for photoacoustic imaging. In the following, the case where a photoacoustic image is captured while an ultrasonic image is being captured is described as an example, but the region of the subject imaged photoacoustically by the imaging system 100 does not necessarily have to coincide with the region imaged ultrasonically. The imaging system 100 may also capture photoacoustic images of a subject to which a dye such as methylene blue or indocyanine green, gold fine particles, or a substance obtained by aggregating or chemically modifying these has been administered as a contrast agent.
 The probe 102 is operated by the user and transmits the ultrasonic signal and the photoacoustic signal to the control device 101. The probe 102 includes a transmission/reception unit 106 and an irradiation unit 107. The probe 102 transmits ultrasonic waves from the transmission/reception unit 106 and receives the reflected waves with the transmission/reception unit 106. The probe 102 also irradiates the subject with light from the irradiation unit 107 and receives the photoacoustic waves with the transmission/reception unit 106. The probe 102 converts the received reflected waves and photoacoustic waves into electrical signals and transmits them to the control device 101 as the ultrasonic signal and the photoacoustic signal. The probe 102 is preferably controlled so that, upon receiving information indicating contact with the subject, it performs the ultrasonic transmission for acquiring the ultrasonic signal and the light irradiation for acquiring the photoacoustic signal.
 The transmission/reception unit 106 includes at least one transducer (not shown), a matching layer (not shown), a damper (not shown), and an acoustic lens (not shown). The transducer (not shown) is made of a material exhibiting the piezoelectric effect, such as PZT (lead zirconate titanate) or PVDF (polyvinylidene difluoride). The transducer (not shown) may be something other than a piezoelectric element, for example a capacitive micro-machined ultrasonic transducer (CMUT) or a transducer using a Fabry-Perot interferometer. Typically, ultrasonic signals consist of frequency components of 2 to 20 MHz and photoacoustic signals of frequency components of 0.1 to 100 MHz, so a transducer (not shown) capable of detecting these frequencies is used. The signal obtained by the transducer (not shown) is a time-resolved signal; the amplitude of the received signal represents a value based on the sound pressure received by the transducer at each instant. The transmission/reception unit 106 includes a circuit (not shown) or a control unit for electronic focusing. The transducers (not shown) are arranged in, for example, a sector, linear array, convex, annular array, or matrix array configuration.
 The transmission/reception unit 106 may include an amplifier (not shown) that amplifies the time-series analog signal received by the transducer (not shown). The transmission/reception unit 106 may also include an A/D converter that converts the time-series analog signal received by the transducer (not shown) into a time-series digital signal. The transducers (not shown) may be divided into those for transmission and those for reception depending on the purpose of the ultrasonic imaging. The transducers (not shown) may also be divided into those for ultrasonic imaging and those for photoacoustic imaging.
 The irradiation unit 107 includes a light source (not shown) for acquiring the photoacoustic signal and an optical system (not shown) that guides the pulsed light emitted from the light source (not shown) to the subject. The pulse width of the light emitted by the light source (not shown) is, for example, 1 ns or more and 100 ns or less. The wavelength of the light emitted by the light source (not shown) is, for example, 400 nm or more and 1600 nm or less. When imaging blood vessels near the surface of the subject at high resolution, a wavelength of 400 nm or more and 700 nm or less, which is strongly absorbed by blood vessels, is preferable. When imaging deep parts of the subject, a wavelength of 700 nm or more and 1100 nm or less, which is not readily absorbed by tissues such as water and fat, is preferable.
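The wavelength guidance above can be summarized in a small helper; the function name and the binary superficial/deep distinction are illustrative assumptions:

```python
def preferred_wavelength_range_nm(target_is_superficial):
    """Return an (assumed) preferred wavelength band in nanometers.
    400-700 nm is strongly absorbed by blood vessels, suiting
    high-resolution imaging near the surface; 700-1100 nm is weakly
    absorbed by water and fat, suiting deeper regions."""
    if target_is_superficial:
        return (400, 700)
    return (700, 1100)
```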
 The light source (not shown) is, for example, a laser or a light-emitting diode. The irradiation unit 107 may use a light source whose wavelength can be converted in order to acquire photoacoustic signals using light of multiple wavelengths. Alternatively, the irradiation unit 107 may include multiple light sources that generate light of mutually different wavelengths and be configured to irradiate light of the different wavelengths alternately from the respective light sources. The laser is, for example, a solid-state laser, gas laser, dye laser, or semiconductor laser. A pulsed laser such as an Nd:YAG laser or an alexandrite laser may be used as the light source (not shown). A Ti:sa laser or an OPO (optical parametric oscillator) laser pumped by Nd:YAG laser light may also be used as the light source (not shown). A microwave source may also be used as the light source (not shown).
 Optical elements such as lenses, mirrors, and optical fibers are used in the optical system (not shown). When the subject is a breast, it is preferable to irradiate the pulsed light with a widened beam diameter, so the optical system (not shown) may include a diffusion plate that diffuses the emitted light. Alternatively, to increase resolution, the optical system (not shown) may include a lens or the like so that the beam can be focused.
 The detection unit 103 acquires information on the position and orientation of the probe 102 and transmits the information on the position of the probe 102 to the control device 101. The detection unit 103 is, for example, a motion sensor provided in the probe 102. The detection unit 103 does not necessarily have to be included in the control device 101, and the sensor may be switched on and off as appropriate based on various conditions set prior to the examination.
 The display unit 104 displays images captured by the imaging system 100 and information on the examination based on control from the control device 101. The display unit 104 provides an interface for receiving user instructions based on control from the control device 101. The display unit 104 is, for example, a liquid crystal display.
 The operation unit 105 transmits information on the user's operation input to the control device 101. The operation unit 105 comprises, for example, a keyboard, a trackball, and various buttons for operation inputs related to the examination.
 Note that the display unit 104 and the operation unit 105 may be integrated as a touch panel display. The control device 101, the display unit 104, and the operation unit 105 need not be separate devices, and may be realized as a console in which these components are integrated. The control device 101 may have multiple probes.
 The HIS (Hospital Information System) 111 is a system that supports hospital operations. The HIS 111 includes an electronic medical record system, an ordering system, and a medical accounting system. The HIS 111 makes it possible to manage everything from the issuance of an examination order to billing in a coordinated manner. The ordering system of the HIS 111 transmits order information to the ordering system 112 of each department, and the execution of the order is managed by the ordering system 112 described below.
 The ordering system 112 is a system that manages examination information and the progress of each examination on the imaging apparatus. The ordering system 112 may be configured for each department that performs examinations. In the radiology department, for example, the ordering system 112 is an RIS (Radiology Information System). In response to an inquiry from the control device 101, the ordering system 112 transmits information on the examination to be performed by the imaging system 100 to the control device 101. The ordering system 112 receives information on the progress of the examination from the control device 101. When the ordering system 112 receives information indicating that the examination is complete from the control device 101, it transmits information indicating that the examination is complete to the HIS 111. The ordering system 112 may be integrated into the HIS 111.
 The PACS (Picture Archiving and Communication System) 113 is a database system that holds images obtained by various imaging apparatuses inside and outside the facility. The PACS 113 has a storage unit (not shown) that stores medical images together with supplementary information such as the imaging conditions of those images, image processing parameters including reconstruction parameters, and patient information, and a controller (not shown) that manages the information stored in the storage unit. The PACS 113 stores the ultrasonic images, photoacoustic images, and superimposed images output from the control device 101. Communication between the PACS 113 and the control device 101, and the various images stored in the PACS 113, preferably comply with standards such as HL7 and DICOM. The various images output from the control device 101 are stored with supplementary information associated with various tags in accordance with the DICOM standard.
 The Viewer 114 is a terminal for image diagnosis; it reads images stored in the PACS 113 or elsewhere and displays them for diagnosis. A doctor displays an image on the Viewer 114, observes it, and records the information obtained from the observation as an image diagnosis report. An image diagnosis report created using the Viewer 114 may be stored in the Viewer 114, or may be output to and stored in the PACS 113 or a report server (not shown).
 The Printer 115 prints images stored in the PACS 113 or elsewhere. The Printer 115 is, for example, a film printer, and outputs an image stored in the PACS 113 or elsewhere by printing it on film.
 FIG. 1 is a diagram illustrating an example of the configuration of the control device 101. The control device 101 has a CPU 131, a ROM 132, a RAM 133, a DISK 134, a USB 135, a communication circuit 136, a GPU 137, an HDMI 138, and a probe connector port 139. These are communicably connected by a BUS 130.
 The BUS 130 is a data bus and is used to transmit and receive data between the connected hardware components and to transmit commands from the CPU 131 to the other hardware.
 The CPU (Central Processing Unit) 131 is a control circuit that controls the control device 101 and the units connected to it in an integrated manner. The CPU 131 performs this control by executing programs stored in the ROM 132. The CPU 131 also executes a display driver, which is software for controlling the display unit 104, and performs display control of the display unit 104. The CPU 131 further performs input/output control of the operation unit 105.
 The ROM (Read Only Memory) 132 stores programs and data in which the procedures of control by the CPU are recorded. The ROM 132 holds a boot program 140 of the control device 101 and various initial data 141. It also holds various modules 142 to 150 for realizing the processing of the control device 101. The various modules for realizing the processing of the control device 101 are described later.
 The RAM (Random Access Memory) 133 provides a working storage area when the CPU 131 performs control according to the instruction programs. The RAM 133 has a stack 151 and a work area 152. The RAM 133 stores programs for executing the processing of the control device 101 and the units connected to it, as well as various parameters used in image processing. The RAM 133 stores the control programs executed by the CPU 131 and temporarily stores various data used when the CPU 131 executes the various controls.
 The DISK 134 is an auxiliary storage device that stores various data such as ultrasonic images and photoacoustic images. The DISK 134 is, for example, an HDD (Hard Disk Drive) or an SSD (Solid State Drive).
 The USB (Universal Serial Bus) 135 is a connection unit that connects to the operation unit 105.
 The communication circuit 136 is a circuit for communicating with the units constituting the imaging system 100 and with the various external apparatuses connected to the network 110. The communication circuit 136, for example, stores the information to be output in transfer packets and outputs them to the external apparatuses via the network 110 using a communication technology such as TCP/IP. The control device 101 may have multiple communication circuits according to the desired form of communication.
 The GPU 137 is included on a general-purpose graphics board that includes video memory. The GPU 137 executes part or all of the image processing module 145 and performs, for example, photoacoustic image reconstruction processing. Using such an arithmetic device makes it possible to perform operations such as reconstruction processing at high speed without requiring dedicated hardware.
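As one concrete example of the reconstruction processing the GPU 137 might execute, photoacoustic images are commonly reconstructed by delay-and-sum beamforming. The following pure-Python sketch of a single pixel's reconstruction is an illustrative assumption (the specification does not name a reconstruction algorithm) and omits the parallelism a GPU would provide:

```python
import math

SPEED_OF_SOUND_M_PER_S = 1500.0  # assumed speed of sound in the subject

def delay_and_sum_pixel(signals, sensor_xs, pixel, fs):
    """Reconstruct one pixel by summing, over all sensors, the sample that
    each sensor recorded at the acoustic delay from the pixel to it.

    signals:   list of per-sensor sampled waveforms
    sensor_xs: x positions (m) of the sensors on the surface z = 0
    pixel:     (x, z) position (m) of the reconstruction point
    fs:        sampling frequency in Hz
    """
    px, pz = pixel
    total = 0.0
    for waveform, sx in zip(signals, sensor_xs):
        distance = math.hypot(px - sx, pz)
        index = int(round(distance / SPEED_OF_SOUND_M_PER_S * fs))
        if index < len(waveform):
            total += waveform[index]
    return total
```

Samples from a true photoacoustic source add coherently at the pixel containing it, while elsewhere the summed samples tend to cancel.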
 The HDMI (registered trademark) (High-Definition Multimedia Interface) 138 is a connection unit that connects to the display unit 104.
 The probe connector port 139 is a connection port for connecting the probe 102 to the control device 101. The ultrasonic signals and photoacoustic signals output from the probe 102 are acquired by the control device 101 via the port 139.
 The CPU 131 and the GPU 137 are examples of processors. The ROM 132, the RAM 133, and the DISK 134 are examples of memories. The control device 101 may have multiple processors. In the first embodiment, the functions of the units of the control device 101 are realized by the processor of the control device 101 executing programs stored in the memory.
 Note that the control device 101 may have a CPU or GPU dedicated to specific processing. The control device 101 may also have an FPGA (Field-Programmable Gate Array) in which specific processing or all of the processing is programmed. The control device 101 may have both an HDD and an SSD as the DISK 134.
 The modules 143 to 150 stored in the ROM 132 will now be described. The modules 143 to 150 shown in FIG. 1 are those extracted for executing the processing related to the embodiments of the present invention. The control device 101 may include modules, other than those illustrated, that are necessary for executing an examination and operating the control device 101. Each module may also be configured as a combination of one or more programs. Some or all of the modules 143 to 150 may be stored in a memory other than the ROM 132, such as the DISK 134. Each module is described in detail below.
 The examination control module 142 controls examinations performed in the imaging system 100. The examination control module 142 acquires examination order information from the ordering system 112. An examination order includes information on the patient undergoing the examination and information on the imaging procedure. The examination control module 142 controls the probe 102 and the detection unit 103 based on the imaging procedure information. In addition, to present examination-related information to the user, the examination control module 142 causes the display unit 104 to display the examination information via the output module 150. The examination information displayed on the display unit 104 includes information on the patient undergoing the examination, information on the imaging procedures included in the examination, and images already generated from completed imaging. The examination control module 142 further transmits information on the progress of the examination to the ordering system 112. For example, when the user starts the examination, the module notifies the system 112 of the start, and when imaging by all imaging procedures included in the examination has been completed, it notifies the system 112 of the completion.
 The signal acquisition module 143 acquires the ultrasonic signal and the photoacoustic signal from the probe 102. Specifically, the signal acquisition module 143 distinguishes the ultrasonic signal from the photoacoustic signal in the information acquired from the probe 102, based on information from the examination control module 142, the image processing module 145, and the position acquisition module 149. For example, when the imaging procedure being executed defines the timings of ultrasonic-signal acquisition and photoacoustic-signal acquisition, the module distinguishes and acquires the ultrasonic signal and the photoacoustic signal from the information acquired from the probe 102 based on the timing information acquired from the examination control module 142. When, as in an example described later, the photoacoustic signal is acquired by controlling the irradiation unit 107 based on the result of analyzing an ultrasonic image with the image processing module 145, the signal acquired in response to that control is taken as the photoacoustic signal and distinguished from the ultrasonic signal. The signal acquisition module 143 is an example of an acquisition unit that acquires at least one of the ultrasonic signal and the photoacoustic signal from the probe 102.
 The signal acquisition module 143 includes the irradiation control module 144. The irradiation control module 144 controls light irradiation by the irradiation unit 107 based on information on the imaging conditions acquired from the examination control module 142 and on the result of the image processing module 145 analyzing the ultrasonic image.
 The image processing module 145 is a module for performing processing that generates images based on the signals acquired in the imaging system 100. The image processing module 145 includes the ultrasonic image generation module 146, the photoacoustic image generation module 147, and the superimposed image generation module 148.
 The image processing module 145 stores the images generated by the ultrasonic image generation module 146, the photoacoustic image generation module 147, and the superimposed image generation module 148 in the DISK 134 together with their incidental information. It also stores images in an external device by outputting them, together with the incidental information, to that device via the output module 150.
 The ultrasonic image generation module 146 generates, from the ultrasonic signal acquired by the signal acquisition module 143, an ultrasonic image to be displayed on the display unit 104. The ultrasonic image generation module 146 generates an ultrasonic image suited to the set mode based on the imaging procedure information acquired from the examination control module 142. For example, when Doppler mode is set as the imaging procedure, the ultrasonic image generation module 146 generates an image showing the flow velocity inside the subject based on the difference between the frequency of the ultrasonic signal acquired by the signal acquisition module 143 and the transmission frequency. The ultrasonic image generated by the ultrasonic image generation module 146 may be generated by any other method, such as A mode, M mode, or Doppler mode, and may be a harmonic image or an elastography image.
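 The Doppler-mode computation mentioned above can be sketched with the standard Doppler relation v = c·Δf / (2·f_tx·cosθ). This is a minimal illustration, not the module's actual code; the function name and the default sound speed of 1540 m/s (the usual soft-tissue convention) are assumptions.

```python
import math

def doppler_velocity(f_tx_hz, f_rx_hz, c_m_s=1540.0, angle_deg=0.0):
    """Axial flow velocity (m/s) from the received-vs-transmitted
    frequency difference: v = c * delta_f / (2 * f_tx * cos(theta)).
    Positive values indicate flow toward the probe; angle_deg is the
    beam-to-flow angle used for the cosine correction."""
    delta_f = f_rx_hz - f_tx_hz
    return (c_m_s * delta_f) / (2.0 * f_tx_hz * math.cos(math.radians(angle_deg)))

# A +2.6 kHz shift on a 4 MHz transmit, beam aligned with the flow:
print(doppler_velocity(4.0e6, 4.0e6 + 2600.0))  # roughly 0.5 m/s toward the probe
```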
 Furthermore, in the first embodiment, the ultrasonic image generation module 146 analyzes the generated ultrasonic image and specifies a region for which a photoacoustic signal should be acquired and a photoacoustic image generated. For example, the ultrasonic image generation module 146 analyzes the ultrasonic image and specifies a region that may contain a calculus. In this respect, the ultrasonic image generation module 146 functions as a detection unit that analyzes the ultrasonic image and specifies the region for which a photoacoustic image is to be acquired.
 The photoacoustic image generation module 147 generates a photoacoustic image based on the photoacoustic signal acquired by the signal acquisition module 143. The photoacoustic image generation module 147 reconstructs, from the photoacoustic signal, the distribution of acoustic waves at the time the light was irradiated (hereinafter referred to as the initial sound pressure distribution). The photoacoustic image generation module 147 obtains the light absorption coefficient distribution inside the subject by dividing the reconstructed initial sound pressure distribution by the optical fluence distribution of the light irradiated onto the subject. In addition, exploiting the fact that the degree of light absorption inside the subject differs depending on the wavelength of the irradiated light, the module obtains the concentration distribution of a substance inside the subject from the absorption coefficient distributions for a plurality of wavelengths. For example, the photoacoustic image generation module 147 obtains the concentration distributions of oxyhemoglobin and deoxyhemoglobin inside the subject. The photoacoustic image generation module 147 further obtains the oxygen saturation distribution as the ratio of the oxyhemoglobin concentration to the deoxyhemoglobin concentration. The photoacoustic image generated by the photoacoustic image generation module 147 is an image showing information such as the initial sound pressure distribution, the optical fluence distribution, the absorption coefficient distribution, the substance concentration distribution, or the oxygen saturation distribution described above. The photoacoustic image may also be any image generated by combining these.
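 The per-pixel chain above (initial pressure → absorption coefficient → hemoglobin concentrations → oxygen saturation) can be sketched as follows. All numerical values (Grüneisen parameter, extinction coefficients) are illustrative assumptions; note also that while the text phrases oxygen saturation as the ratio of oxy- to deoxyhemoglobin, the sketch uses the conventional normalization by total hemoglobin.

```python
def absorption_from_pressure(p0, fluence, grueneisen=0.2):
    """mu_a = p0 / (Gamma * Phi): divide the reconstructed initial sound
    pressure by the local optical fluence. The Grueneisen parameter
    Gamma is assumed known; 0.2 is only an illustrative value."""
    return p0 / (grueneisen * fluence)

def unmix_two_wavelengths(mu_a1, mu_a2, eps):
    """Solve the 2x2 system mu_a(lambda_i) = eps_HbO2_i*C_oxy +
    eps_HbR_i*C_deoxy for the two hemoglobin concentrations, where
    eps = ((eps_HbO2_1, eps_HbR_1), (eps_HbO2_2, eps_HbR_2)) holds the
    (illustrative) molar extinction coefficients at the two wavelengths."""
    (a, b), (c, d) = eps
    det = a * d - b * c
    c_oxy = (mu_a1 * d - mu_a2 * b) / det
    c_deoxy = (a * mu_a2 - c * mu_a1) / det
    return c_oxy, c_deoxy

def oxygen_saturation(c_oxy, c_deoxy):
    # Conventional sO2 = C_HbO2 / (C_HbO2 + C_HbR), i.e. oxyhemoglobin
    # over total hemoglobin.
    return c_oxy / (c_oxy + c_deoxy)
```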
 That is, the image processing module 145 is an example of a generation unit that generates an ultrasonic image based on the ultrasonic signal and generates a photoacoustic image based on the photoacoustic signal.
 The superimposed image generation module 148 generates a superimposed image in which the photoacoustic image generated by the photoacoustic image generation module 147 is superimposed on the ultrasonic image generated by the ultrasonic image generation module 146. The superimposed image generation module 148 aligns the ultrasonic image and the photoacoustic image to obtain the superimposed image. For this alignment, the imaging conditions acquired from the examination control module 142, or information on the position of the probe 102 acquired from the position acquisition module 149 described later, may be used. The alignment may also be performed based on a region depicted in common in the ultrasonic image and the photoacoustic image.
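 Once the two images are registered, superimposition reduces to per-pixel blending. The sketch below assumes registration is already done and uses a single alpha value as a stand-in for the transparency setting; it is an illustration, not the module's actual implementation.

```python
def superimpose(us_image, pa_image, alpha=0.4):
    """Alpha-blend a photoacoustic layer image over an ultrasound base
    image of the same size (both given as lists of rows of 0-255 values).
    alpha = 0 shows only the ultrasound base; alpha = 1 only the layer."""
    return [[round((1.0 - alpha) * u + alpha * p) for u, p in zip(urow, prow)]
            for urow, prow in zip(us_image, pa_image)]

print(superimpose([[100, 0]], [[200, 255]], alpha=0.5))  # [[150, 128]]
```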
 The position acquisition module 149 acquires information on the position of the probe 102 based on information from the detection unit 103. Based on changes in this positional information over time, the position acquisition module 149 may acquire at least one of information on the speed of movement of the probe 102 relative to the subject, information on its speed of rotation, and information indicating the degree of pressure applied to the subject. When the detection unit 103 is transmitting information on the probe 102 to the control device 101, the position acquisition module 149 preferably acquires the position information of the probe 102 at regular time intervals, ideally in real time.
 The position acquisition module 149 may further acquire information on the probe 102 being used for imaging. The information on the probe 102 includes information such as the probe type, center frequency, sensitivity, acoustic focus, electronic focus, and observation depth. The position acquisition module 149 transmits the information on the position of the probe 102 and the information on the probe 102 to the examination control module 142, the image processing module 145, and the output module 150 as appropriate.
 The output module 150 outputs information for displaying screens on the display unit 104, and outputs information to external devices via the network 110.
 The output module 150 controls the display unit 104 to display information on it. The output module 150 causes the display unit 104 to display information in response to input from the examination control module 142 or the image processing module 145, or in response to the user's operation input via the operation unit 105. The output module 150 is an example of a display control unit.
 The output module 150 outputs information from the control device 101 to an external device such as the PACS 113 via the network 110. For example, the output module 150 outputs to the PACS 113 the ultrasonic images and photoacoustic images generated by the image processing module 145, as well as their superimposed images. An image output from the output module 150 includes incidental information attached by the examination control module 142 as various tags conforming to the DICOM standard. The incidental information includes, for example, patient information, information indicating the imaging device that captured the image, an image ID for uniquely identifying the image, and an examination ID for uniquely identifying the examination in which the image was captured. The incidental information also includes information that associates an ultrasonic image and a photoacoustic image captured in the same examination. The information associating the ultrasonic image with the photoacoustic image is, for example, information indicating which of the plurality of frames constituting the ultrasonic image is closest in timing to the acquisition of the photoacoustic image. Furthermore, the position information of the probe 102 acquired by the detection unit 103 may be attached, as incidental information, to each frame of the ultrasonic image and the photoacoustic image. That is, the output module 150 outputs information indicating the position of the probe 102 at which the ultrasonic signal for generating the ultrasonic image was acquired, attached to that ultrasonic image. Likewise, the output module 150 outputs information indicating the position of the probe 102 at which the photoacoustic signal for generating the photoacoustic image was acquired, attached to that photoacoustic image. The output module 150 is an example of an output unit.
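 The frame-association rule ("the ultrasound frame closest in timing to the photoacoustic acquisition") and the bundling of incidental information can be sketched as follows. The dictionary field names are illustrative stand-ins, not actual DICOM tag names or the patent's data format.

```python
def nearest_frame(us_frame_times, pa_time):
    """Index of the ultrasound frame whose acquisition time is closest
    to the photoacoustic acquisition time; this is the frame the
    incidental information would link the photoacoustic image to."""
    return min(range(len(us_frame_times)),
               key=lambda i: abs(us_frame_times[i] - pa_time))

def attach_metadata(image_id, exam_id, patient_id, probe_position, linked_frame):
    """Bundle incidental information with an image record (field names
    are assumptions for illustration only)."""
    return {
        "patient": patient_id,
        "examination_id": exam_id,
        "image_id": image_id,
        "probe_position": probe_position,
        "linked_ultrasound_frame": linked_frame,
    }

# Frames at 0, 33 and 66 ms; a photoacoustic shot at 50 ms links to frame 2.
print(nearest_frame([0.000, 0.033, 0.066], 0.050))  # 2
```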
 FIG. 2 is a flowchart showing an example of processing by the control device 101 for controlling light irradiation based on an acquired ultrasonic image and acquiring a photoacoustic image. In each of the processes described below, unless otherwise noted, the entity that realizes the processing of each module is the CPU 131 or the GPU 137.
 In step S201, the ultrasonic image generation module 146 is executed to acquire an ultrasonic image. Specifically, prior to the examination, the examination control module 142 is first executed to acquire the examination order from the ordering system 112. The examination order includes information on the patient to be examined, the body part to be examined, and the imaging procedure. The user operates the probe 102, and an ultrasonic signal is transmitted from the probe 102 to the control device 101. The signal acquisition module 143 is executed, whereby the control device 101 acquires the ultrasonic signal. The ultrasonic image generation module 146 is executed, whereby an ultrasonic image is generated based on the ultrasonic signal. The output module 150 is executed, whereby the ultrasonic image is displayed on the display unit 104. The user can continue to operate the probe 102 while observing the ultrasonic image displayed on the display unit 104.
 In step S202, the examination control module 142 is executed to acquire information indicating whether to acquire a photoacoustic signal based on the ultrasonic image. This information is, specifically, a setting made in advance by the user, or information included in the examination order. In another example, whether to acquire the photoacoustic signal may be determined according to the processing load on the control device 101. For example, when the processing load on the control device 101 is heavy and acquiring the photoacoustic signal would interfere with the processing for acquiring the ultrasonic image, the examination control module 142 may determine not to acquire the photoacoustic signal. When the photoacoustic signal is to be acquired based on the ultrasonic image, the process proceeds to step S203; otherwise, the process returns to step S201 and acquisition of the ultrasonic image continues.
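 The step-S202 decision combines a request (user preset or examination order) with a load check. A minimal sketch, with an arbitrary illustrative load threshold:

```python
def should_acquire_photoacoustic(user_enabled, order_requests_pa, cpu_load,
                                 load_limit=0.8):
    """Gate photoacoustic acquisition: it must be requested (by a user
    preset or by the examination order) AND the controller must have
    enough processing headroom that ultrasound imaging is not disturbed.
    The 0.8 load threshold is an assumption for illustration."""
    return (user_enabled or order_requests_pa) and cpu_load < load_limit

print(should_acquire_photoacoustic(True, False, 0.3))   # True
print(should_acquire_photoacoustic(True, False, 0.95))  # False: too loaded
```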
 In step S203, the ultrasonic image generation module 146 is executed to analyze the ultrasonic image. For example, the ultrasonic image generation module 146 analyzes the ultrasonic image and detects a predefined region of interest among the regions depicted in the ultrasonic image. The region of interest is set to, for example, a calculus, a tumor, or an organ such as a blood vessel. The calculus case includes virtual images that can appear in ultrasonic imaging. A computer-aided diagnosis system may also be used in combination to search for the site of interest. The analysis of the ultrasonic image and the region extracted as the region of interest are not limited to the examples above; any form may be used as long as it detects a region for which acquiring a photoacoustic image in addition to the ultrasonic image is considered beneficial.
 An example of the analysis performed by the ultrasonic image generation module 146 in step S203 is now described, taking the analysis of a B-mode image, a two-dimensional tomographic image, as an example. An ultrasonic image contains mottled noise called a speckle pattern. Therefore, the analysis first performs processing to reduce speckle. For example, the speckle is treated as a noise component and reduced with a spatial filter such as a moving-average or median filter. Alternatively, the speckle is reduced with a filter whose mask shape is locally variable, exploiting the Rayleigh-distribution property peculiar to speckle patterns. Speckle may also be reduced using a multi-resolution filter or a filter based on numerical simulation. Then, on the speckle-reduced image, segmentation for extracting the target region is performed by threshold processing and differentiation processing on the intensity values. Alternatively, focusing on the shape of the tissue, segmentation using, for example, a deformable model may be performed. As yet another example of the analysis, segmentation may be performed based on the speckle pattern itself, a property unique to ultrasonic images. Methods focusing on the features of the speckle pattern include, for example, texture analysis using feature amounts based on a co-occurrence matrix of intensity values, and a probability-distribution method that uses statistics obtained from the parameters of a log-compressed K distribution as one of the feature amounts for inhomogeneous echoes. In addition, the analysis may use information on internal echoes, shape, boundary echoes, posterior echoes, and lateral shadows.
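 The first two steps named above, median-filter speckle reduction followed by threshold segmentation, can be sketched in a few lines. This is a deliberately minimal illustration on plain lists of pixel values, not the analysis actually used:

```python
def median3x3(img):
    """Simple 3x3 median filter for speckle reduction; border pixels are
    left unchanged. img is a list of rows of intensity values."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            window = sorted(img[yy][xx]
                            for yy in (y - 1, y, y + 1)
                            for xx in (x - 1, x, x + 1))
            out[y][x] = window[4]  # median of the 9 neighbours
    return out

def threshold_segment(img, thresh):
    """Binary segmentation by intensity threshold, the simplest of the
    segmentation steps described in the text."""
    return [[1 if v >= thresh else 0 for v in row] for row in img]

noisy = [[10, 10, 10],
         [10, 255, 10],   # isolated speckle-like spike
         [10, 10, 10]]
print(median3x3(noisy)[1][1])  # 10: the spike is suppressed
```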
 While the ultrasonic image is being analyzed in step S203, the signal acquisition module 143 may continue to acquire the ultrasonic signal, the ultrasonic image generation module 146 may continue to generate ultrasonic images, and the output module 150 may continue to display the ultrasonic images on the display unit 104.
 In step S204, the ultrasonic image generation module 146 is executed to determine, based on the result of the analysis in step S203, whether to acquire a photoacoustic image. When a region of interest has been extracted in step S203, the process proceeds to step S205; when no region of interest has been extracted, the process returns to step S201 and acquisition of the ultrasonic image continues.
 In step S205, the irradiation control module 144 is executed to determine whether to irradiate the subject with light. Specifically, the irradiation control module 144 determines whether the probe 102 is in contact with the subject. The irradiation control module 144 determines contact between the subject and the probe 102 based on the ultrasonic image generated by the ultrasonic image generation module 146 and on the position information of the probe 102 acquired by the position acquisition module 149. The probe 102 may be provided with a sensor (not shown) for detecting contact with the subject, and the irradiation control module 144 may determine contact between the subject and the probe 102 based on information from that sensor. When it is determined that the subject and the probe 102 are in contact, the irradiation control module 144 controls the irradiation unit 107 to irradiate light. When it is determined that the subject and the probe 102 are not in contact, a screen notifying the user of this may be displayed on the display unit 104 via the output module 150. The irradiation control module 144 may further control the irradiation unit 107, based on the ultrasonic image generated by the ultrasonic image generation module 146, to irradiate light only when the region extracted as the region of interest in step S203 is depicted in that ultrasonic image. This reduces the possibility of performing redundant light irradiation when the probe 102 has moved away from a position at which the region analyzed in step S203 can be depicted. When it is determined in step S205 that the subject is to be irradiated with light, the process proceeds to step S206; when it is determined that it is not, the process returns to step S201 and acquisition of the ultrasonic image continues.
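 The two conditions of step S205, probe contact and the region of interest still being depicted, amount to a simple conjunction. A hedged sketch, treating the region of interest and the imaged frame as axis-aligned rectangles (a simplification of whatever geometry the module actually uses):

```python
def roi_in_frame(roi, frame):
    """Overlap test between the region of interest and the currently
    imaged frame, both given as (x0, y0, x1, y1) rectangles."""
    ax0, ay0, ax1, ay1 = roi
    bx0, by0, bx1, by1 = frame
    return ax0 < bx1 and bx0 < ax1 and ay0 < by1 and by0 < ay1

def may_irradiate(probe_in_contact, roi, frame):
    # Irradiate only while the probe touches the subject AND the
    # step-S203 region of interest is still inside the imaged frame,
    # avoiding the redundant irradiation discussed above.
    return probe_in_contact and roi_in_frame(roi, frame)

print(may_irradiate(True, (0, 0, 1, 1), (0.5, 0.5, 2, 2)))  # True
print(may_irradiate(True, (0, 0, 1, 1), (2, 2, 3, 3)))      # False
```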
 In step S206, the signal acquisition module 143 is executed, whereby the photoacoustic signal is acquired from the probe 102.
 In step S207, the photoacoustic image generation module 147 is executed, whereby a photoacoustic image is reconstructed from the photoacoustic signal acquired in step S206. Then, the output module 150 is executed, whereby the reconstructed photoacoustic image is displayed on the display unit 104. Furthermore, the photoacoustic image generation module 147 may cause the superimposed image generation module 148 to execute, generating a superimposed image and displaying it on the display unit 104 via the output module 150.
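 The flow of steps S201 through S207 can be summarized as one pass through a gated pipeline. The sketch below injects the module actions as callables; these hooks are hypothetical names for illustration, not the patent's actual interfaces.

```python
def control_loop_once(acquire_us, pa_requested, analyze, probe_contact,
                      acquire_pa, reconstruct):
    """One pass through the FIG. 2 flow. Returns the reconstructed
    photoacoustic image, or None whenever a gate sends control back to
    ultrasound acquisition (step S201)."""
    us_image = acquire_us()            # S201: acquire ultrasound image
    if not pa_requested():             # S202: photoacoustic wanted?
        return None
    roi = analyze(us_image)            # S203: analyze the image
    if roi is None:                    # S204: region of interest found?
        return None
    if not probe_contact():            # S205: probe touching subject?
        return None
    signal = acquire_pa(roi)           # S206: acquire photoacoustic signal
    return reconstruct(signal)         # S207: reconstruct for display

result = control_loop_once(lambda: "US", lambda: True, lambda img: "ROI",
                           lambda: True, lambda roi: "SIGNAL",
                           lambda sig: "PA image")
print(result)  # PA image
```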
 For example, an ultrasonic image such as a B-mode image depicts morphological information inside the subject, whereas a photoacoustic image depicts functional information such as the amount of hemoglobin in blood vessels. In the first embodiment, a superimposed image generated under predetermined superimposition conditions is displayed on the display unit 104. The superimposition conditions are, for example, the colors in which the respective images are displayed, the superimposition range, and the transparency, in a superimposed image that uses the ultrasonic image as the base image and the photoacoustic image as the layer image.
 In this way, the control device 101 can display the ultrasonic image and the photoacoustic image on the display unit 104 so that the image of the region of interest specified from the morphological information and the functional information can be referred to in correspondence with each other. This improves the workflow with which a user such as a physician observes medical images of the subject, such as ultrasonic images and photoacoustic images, and makes a diagnosis.
 Note that, without being limited to the example described above, the irradiation control module 144 may irradiate the subject with light based on the user's operation input. When light is irradiated, the photoacoustic image generation module 147 may generate a photoacoustic image, and the photoacoustic image may be displayed on the display unit 104 via the output module 150.
 The photoacoustic image generated by the photoacoustic image generation module 147 and the superimposed image generated by the superimposed image generation module 148 may also be stored in the DISK 134 or the PACS 113 as appropriate.
 FIG. 3 shows an example of the analysis performed in step S203 illustrated in FIG. 2 and an example of the image displayed in step S207. Here, a case in which a region that may contain a calculus is depicted in an ultrasonic image is described as an example.
 FIG. 3(a) schematically illustrates the internal structure of the subject. A region 301 is the region depicted in the ultrasonic image generated from an ultrasonic signal acquired with the probe in contact with the subject at a certain position. A calculus 302 and a blood vessel 303 are assumed to exist inside the subject. At this time, the calculus 302 is located outside the region 301.
 FIG. 3(b) is an example of an ultrasonic image 304 generated by imaging the region 301 illustrated in FIG. 3(a). In the ultrasonic image 304, a blood vessel region 306 corresponding to the blood vessel 303 inside the subject is depicted. In addition, an image of the calculus 302, which exists outside the region 301, is depicted in the ultrasonic image 304 as a virtual image 305. A virtual image is an image in which a structure that does not actually exist at that location inside the subject is depicted on the image. There are several possible causes for a virtual image appearing in an ultrasonic image. In the case illustrated in FIG. 3(b), the cause is considered to be the weak ultrasonic waves, called side lobes, that are emitted outside the main-direction ultrasonic waves transmitted from the probe 102. When a reflected wave produced by side-lobe ultrasound reflecting off a structure such as a calculus is detected by the probe 102, an ultrasonic image 304 is generated as if the calculus 302 existed within the region 301 depicted by the main-direction ultrasonic waves. A user observing the ultrasonic image 304 must judge whether the virtual image region 305 is in fact a virtual image. In general, the user often makes this judgment while operating the probe 102 to change the imaging range of the ultrasonic image. In step S203, the captured ultrasonic image is analyzed to detect, for example, a region that may contain a calculus.
FIG. 3C is an example of a photoacoustic image 307 generated by imaging the region 301 illustrated in FIG. 3A. That is, the photoacoustic image 307 is generated based on the photoacoustic signal acquired by the processing of steps S204 to S206 illustrated in FIG. 2. The photoacoustic image 307 depicts a blood vessel region 308 corresponding to the blood vessel 303 inside the subject. It depicts neither an image caused by the calculus 302 nor any influence of the calculus on the blood vessel region 308. One possible reason why features caused by the calculus 302 do not appear in the photoacoustic image 307 is that the laser light emitted from the irradiation unit 107 travels in a straighter path than the ultrasonic waves emitted from the transmission/reception unit 106. Thus, when the region 305, which may depict a calculus, is in fact a virtual image, no feature caused by the calculus 302 appears at the corresponding position in the photoacoustic image 307. The user can therefore refer to information obtained from the photoacoustic image when judging whether a region suspected of containing a calculus is a virtual image.
Although FIG. 3C illustrates generating the photoacoustic image 307 of the same region as the region 301 imaged in the ultrasonic image 304, the present invention is not limited to this. For example, light may be irradiated only in the vicinity of the region detected in step S203 as possibly containing a calculus, and a photoacoustic image of only that vicinity may be generated. By acquiring the photoacoustic signal only in the vicinity of the virtual image region 305, the time during which the subject is irradiated with light and the load on the resources of the control device 101 can be reduced.
FIG. 3D shows an example of a superimposed image 309 in which the photoacoustic image 307 is superimposed on the ultrasonic image 304. For example, in step S207 illustrated in FIG. 2, the photoacoustic image is displayed by displaying the superimposed image 309. FIG. 3D shows an example in which the photoacoustic image 310 of the region corresponding to the region of interest detected in step S203 is superimposed on the ultrasonic image 304. The display method of the photoacoustic image, that is, the superimposing method here, can be set in advance by the user. For example, the photoacoustic image may be displayed in a color corresponding to the intensity of the photoacoustic wave. The photoacoustic image then depicts blood vessels, which have the property of absorbing the irradiated light and generating photoacoustic waves. The photoacoustic image 310 depicts only a blood vessel in the blood vessel region 311 corresponding to the blood vessel 303; no image caused by the calculus 302 appears. The user can thus compare the ultrasonic image and the photoacoustic image for the region of interest that may contain a calculus, and the control device 101 can assist the user in diagnosing that region of interest. For example, the control device 101 can assist the user in judging whether a region that may contain a calculus is a virtual image.
Although an example of detecting a specific feature depicted in an ultrasonic image has been described with reference to FIGS. 2 and 3, the present invention is not limited to this. For example, in step S203 the ultrasonic image may be analyzed to detect virtual images using the technique described in "Removal of virtual images in ultrasonic images using fuzzy image processing" (Medical Imaging Technology, Vol. 14, No. 5, 1996). Further, in step S207, the detected virtual image may be removed before display. The user can judge, using the photoacoustic image, whether the detected region is in fact a virtual image. Even when the ultrasonic image is displayed with the virtual image removed, displaying the photoacoustic image for comparison allows the user to confirm visually that the virtual image has been removed from the ultrasonic image. As described above, the control device 101 may also determine, based on information depicted in the photoacoustic image, whether a region detected in the ultrasonic image as a possible virtual image is in fact a virtual image.
According to the configuration of the first embodiment, the control device 101 analyzes the acquired ultrasonic image to detect a region of interest and generates a photoacoustic image corresponding to at least that region of interest. This enables efficient capture of images useful for diagnosing the region of interest. In addition, since the control device 101 irradiates light only when a region of interest is detected, redundant light irradiation can be reduced.
[Second Embodiment]
In the first embodiment, an example was described in which the irradiation of light onto the subject is controlled based on the result of analyzing a B-mode image as an example of an ultrasonic image. In the second embodiment, an example in which an elastography image is used as the ultrasonic image will be described. The region imaged by elastography can be regarded as a region of interest on which the user is focusing. Generating both an ultrasonic image and a photoacoustic image of the region of interest is useful for diagnosing that region and improves the examination workflow. Details are described below.
Elastography is a method of imaging tissue stiffness according to the following principle. In general, elastography measures the tissue strain caused by externally applied stress, with stiffness understood through the stress-strain relation given by Hooke's law. For example, when the probe 102 is pressed against the body surface, softer tissue deforms more. By measuring the displacement of the tissue before and after compression and differentiating it, the strain at each point in the tissue can be obtained. An elastography image is an image of this strain distribution at each point in the tissue. For example, an elastography image may be a two-dimensional image in which the hue varies from red for regions with large strain (soft regions), through green, to blue for regions with small strain (hard regions).
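The principle above can be sketched numerically: strain is the spatial derivative of the measured displacement, approximated here by finite differences, and each point is then bucketed into a display hue. This is a minimal illustration only; the function names, sample spacing, and bucket limits are assumptions, not values from the text.

```python
# Strain as the finite-difference derivative of axial displacement, mapped
# to the red/green/blue hue scheme described in the text. Bucket limits are
# illustrative assumptions.

def strains(displacements, spacing=1.0):
    """Finite-difference strain between adjacent depth samples."""
    return [(displacements[i + 1] - displacements[i]) / spacing
            for i in range(len(displacements) - 1)]

def hue(strain, soft=0.03, hard=0.01):
    """Map strain magnitude to a display hue: large strain -> soft (red),
    small strain -> hard (blue), intermediate -> green."""
    s = abs(strain)
    if s >= soft:
        return "red"
    if s <= hard:
        return "blue"
    return "green"

# Axial displacement (mm) measured before -> after compression at 4 depths:
disp = [0.00, 0.05, 0.07, 0.075]
eps = strains(disp)
print([hue(e) for e in eps])  # -> ['red', 'green', 'blue']
```

In practice the displacement itself is estimated by correlating pre- and post-compression echo signals, a step omitted here.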
For example, when the breast is the subject, adipose tissue is soft, whereas a site calcified by breast cancer or the like is considered hard. Knowing the stiffness of tissue inside the subject is therefore useful information for diagnosis. Furthermore, tumor tissue is said to generate many new blood vessels around it, so using blood vessel information obtained from a photoacoustic image together with an elastography image is considered useful for diagnosis. The following describes an example in which the irradiation of light onto the subject is controlled based on the result of analyzing an elastography image, which is an example of an ultrasonic image, and a photoacoustic image is acquired.
FIG. 5 is a flowchart showing an example of processing by the control device 101 for controlling light irradiation based on an acquired ultrasonic image and acquiring a photoacoustic image. Unless otherwise noted, the processing of each module described below is carried out by the CPU 131 or the GPU 137.
In step S501, the ultrasonic image generation module 146 is executed to acquire an ultrasonic image. Specifically, the examination control module 142 is first executed prior to the examination, whereby an examination order is acquired from the ordering system 112. The examination order includes information on the patient to be examined, the body part to be examined, and the imaging technique. When the user operates the probe 102, an ultrasonic signal is transmitted from the probe 102 to the control device 101. The signal acquisition module 143 is executed, whereby the control device 101 acquires the ultrasonic signal. The ultrasonic image generation module 146 is executed to generate an ultrasonic image based on the ultrasonic signal. The output module 150 is executed to display the ultrasonic image on the display unit 104. The user can continue to operate the probe 102 while observing the ultrasonic image displayed on the display unit 104.
In step S502, the signal acquisition module 143 is executed to determine whether imaging by elastography has been performed. For example, while observing the ultrasonic image generated in step S501 and displayed on the display unit 104, the user finds a region to observe in detail and performs imaging by elastography. The user can switch the operation mode for signal acquisition of the probe 102 to the elastography imaging mode, for example via the operation unit 105. A switch or the like for switching the operation mode may also be provided on the probe 102. The user switches to the elastography imaging mode and presses the probe 102 against the subject. The signal acquisition module 143 acquires an ultrasonic signal from the probe 102, whereby it is determined that elastography imaging has been performed. The ultrasonic image generation module 146 is executed to obtain the tissue displacement before and after compression from the ultrasonic signal acquired by the signal acquisition module 143 and to generate an elastography image reflecting the elasticity of the tissue. If elastography imaging has been performed, the process proceeds to step S503; otherwise, the process returns to step S501 and acquisition of ultrasonic images continues.
In step S503, the examination control module 142 is executed to acquire information indicating whether a photoacoustic signal is to be acquired based on the ultrasonic image. The processing in step S503 is the same as the processing in step S202 illustrated in FIG. 2, so the description given above applies and is omitted here. If a photoacoustic signal is to be acquired based on the ultrasonic image, the process proceeds to step S504; otherwise, the process returns to step S501 and acquisition of ultrasonic images continues.
In step S504, the ultrasonic image generation module 146 is executed to analyze the elastography image acquired in step S502. For example, the ultrasonic image generation module 146 analyzes the elastography image and detects a predefined region of interest from among the regions depicted in the elastography image.
An example of the analysis performed by the ultrasonic image generation module 146 in step S504 is described below. Here, the elastography image is analyzed to detect, as a region of interest, a region that may depict hard tissue. As described above, an elastography image is an image of the strain distribution at each point in the tissue. For example, the analysis detects a set of pixels whose strain is equal to or less than a predetermined value as a region that may depict hard tissue. When the probe 102 is equipped with a pressure sensor (not shown) that measures the degree to which the probe 102 presses the subject, information indicating that degree of pressing is acquired by the signal acquisition module 143 or the ultrasonic image generation module 146. The analysis in step S504 may then also use this information on the degree of pressing, because the ease with which each tissue deforms changes according to the degree to which the probe 102 presses the subject. For example, the greater the degree to which the probe 102 presses the subject, the smaller the predetermined strain value may be made.
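The pressure-dependent thresholding described above can be sketched as follows. The scaling rule tying the threshold to the measured pressure, and all constants and names, are illustrative assumptions; the text only states that a higher pressing degree may lower the strain threshold.

```python
# Hypothetical sketch of the step S504 analysis: pixels whose strain is at
# or below a threshold are collected as a possible hard-tissue region, and
# the threshold shrinks as the measured probe pressure grows.

def stiff_pixels(strain_map, base_threshold=0.01, pressure=0.0, k=0.5):
    """Return coordinates of pixels with strain <= threshold, where the
    threshold is reduced with probe pressure (arbitrary units)."""
    threshold = base_threshold / (1.0 + k * pressure)
    return [(r, c)
            for r, row in enumerate(strain_map)
            for c, s in enumerate(row)
            if s <= threshold]

strain_map = [[0.030, 0.020],
              [0.008, 0.002]]

print(stiff_pixels(strain_map, pressure=0.0))  # -> [(1, 0), (1, 1)]
print(stiff_pixels(strain_map, pressure=2.0))  # -> [(1, 1)]
```

At higher pressure, only the very low-strain pixel is kept, reflecting that under stronger compression even moderately stiff tissue shows some strain.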
While the elastography image is being analyzed in step S504, the signal acquisition module 143 may continue to acquire ultrasonic signals, the ultrasonic image generation module 146 may continue to generate ultrasonic images such as B-mode images and elastography images, and the output module 150 may continue to display the ultrasonic images on the display unit 104.
In step S505, the ultrasonic image generation module 146 is executed to determine, based on the result of the analysis in step S504, whether a photoacoustic image is to be acquired. If a region of interest was detected in step S504, the process proceeds to step S506; otherwise, the process returns to step S501 and acquisition of ultrasonic images continues.
In step S506, the irradiation control module 144 is executed to determine whether the subject is to be irradiated with light. The processing in step S506 is the same as the processing in step S205 illustrated in FIG. 2, so the description given above applies and is omitted here. If it is determined that light is to be irradiated, the process proceeds to step S507; otherwise, the process returns to step S501 and acquisition of ultrasonic images continues.
In step S507, the signal acquisition module 143 is executed to acquire a photoacoustic signal from the probe 102.
In step S508, the photoacoustic image generation module 147 is executed to reconstruct a photoacoustic image from the photoacoustic signal acquired in step S507. The output module 150 is then executed to display the reconstructed photoacoustic image on the display unit 104. Furthermore, the photoacoustic image generation module 147 may control the superimposed image generation module 148 to generate a superimposed image and display it on the display unit 104 via the output module 150.
For example, in step S508 a superimposed image in which the photoacoustic image is superimposed on the elastography image is displayed on the display unit 104. In general, an elastography image is expressed as a color image whose hue reflects the degree of elasticity. Likewise, a photoacoustic image reflecting the concentration of a specific substance, for example hemoglobin, is expressed as a color image whose hue reflects the magnitude of the concentration. When superimposing color images, it is preferable to use different hues for the base image and the layer image. Further, when a region expressed by the hue of the photoacoustic image overlaps a region expressed by the hue of the elastography image, it is preferable to make it visible to the user that the region is depicted in both images.
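One way to realize the overlay described above is simple per-pixel alpha compositing: the elastography base keeps its own hue, the photoacoustic layer is drawn in a distinct hue, and where both carry signal the blended color differs from either source hue, making the overlap visible. The alpha value and the specific hues below are assumptions for illustration.

```python
# Minimal alpha-compositing sketch for superimposing a photoacoustic layer
# on an elastography base image (8-bit RGB tuples per pixel).

def blend(base_rgb, layer_rgb, alpha=0.5):
    """Alpha-blend one layer pixel over one base pixel. A fully transparent
    layer pixel (None) leaves the base pixel untouched."""
    if layer_rgb is None:
        return base_rgb
    return tuple(round((1 - alpha) * b + alpha * l)
                 for b, l in zip(base_rgb, layer_rgb))

GREEN = (0, 255, 0)      # elastography hue: intermediate stiffness
MAGENTA = (255, 0, 255)  # photoacoustic hue, chosen distinct from the base

print(blend(GREEN, None))     # -> (0, 255, 0): no photoacoustic signal here
print(blend(GREEN, MAGENTA))  # -> (128, 128, 128): overlap is visibly distinct
```

A full implementation would apply this per pixel across the registered images; only the single-pixel rule is shown.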
In this way, the control device 101 can display the ultrasonic image and the photoacoustic image on the display unit 104 so that the image of the region of interest, specified based on the degree of elasticity of the tissue, can be referred to together with the corresponding functional information. For example, the user may need to judge whether a region that appears to be hard tissue in the elastography image is a malignant tumor. The control device 101 can assist this judgment by presenting, in a comparable form, information depicted in the photoacoustic image, such as new blood vessels. This improves the workflow in which a user such as a physician observes medical images of the subject, such as ultrasonic images and photoacoustic images, and makes a diagnosis.
The irradiation control module 144 is not limited to the example described above and may irradiate the subject with light based on a user operation input. When light is irradiated, the photoacoustic image generation module 147 may generate a photoacoustic image, and the photoacoustic image may be displayed on the display unit 104 via the output module 150.
The photoacoustic image generated by the photoacoustic image generation module 147 and the superimposed image generated by the superimposed image generation module 148 may also be stored in the DISK 134 or the PACS 113 as appropriate.
FIG. 4 illustrates an example of the analysis performed in step S504 illustrated in FIG. 5. Here, a case where a region that may be hard tissue is depicted in the elastography image is described as an example.
FIG. 4A is an example of an elastography image 401, schematically illustrating the internal structure of the subject. In the image 401, a hard tissue region 402 is displayed so as to be distinguishable from the surrounding softer tissue.
FIG. 4B is an example of a photoacoustic image 403 obtained by imaging the region depicted in the elastography image 401 shown in FIG. 4A. A blood vessel region 404 and a blood vessel region 405 are depicted in the photoacoustic image 403.
FIG. 4C is an example of a superimposed image 406 in which the photoacoustic image 403 is superimposed on the elastography image 401. The blood vessel region 407 corresponds to the blood vessel region 404 depicted in the photoacoustic image illustrated in FIG. 4B. The tissue region 408 corresponds to the tissue region 402 depicted in the elastography image illustrated in FIG. 4A. FIG. 4C shows an example in which the photoacoustic image of the vicinity of the tissue region 408, which is considered harder than the surrounding tissue in the elastography image, is superimposed. By acquiring the photoacoustic image only in the vicinity of the hard tissue region 402, which is the region of interest, the time during which the subject is irradiated with light and the load on the resources of the control device 101 can be reduced.
According to the configuration of the second embodiment, the control device 101 analyzes the acquired ultrasonic image to detect a region of interest and generates a photoacoustic image corresponding to at least that region of interest. This enables efficient capture of images useful for diagnosing the region of interest. In addition, since the control device 101 irradiates light only when a region of interest is detected, redundant light irradiation can be reduced.
Although the second embodiment has been described using an elastography image that expresses tissue elasticity qualitatively, the present invention is not limited to this. For example, an image generated by quantitative elasticity imaging, which expresses tissue elasticity quantitatively, may be used. The propagation of an acoustic wave is the propagation of wave energy, and an object blocking it experiences a force in the propagation direction called acoustic radiation force. Therefore, when a focused ultrasonic pulse with high sound pressure and relatively long duration is radiated into a living body, the acoustic radiation force produces a minute displacement in the tissue. At the same time, a shear wave is generated that propagates in the direction perpendicular to the displacement, that is, perpendicular to the ultrasonic beam. Since the propagation speed of a shear wave is slower than that of a longitudinal wave, the process of shear wave propagation can be imaged by the pulse-echo method and the propagation speed can be obtained. The shear wave propagation speed is considered to be higher in harder tissue, so the stiffness of the tissue can be evaluated quantitatively.
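The quantitative evaluation from shear wave speed can be illustrated with an approximation commonly used in shear wave elastography (an assumption here, not a formula stated in the text): for soft, nearly incompressible tissue, the shear modulus is mu = rho * c^2 and Young's modulus is approximately E = 3 * mu.

```python
# Illustrative conversion from shear wave speed to a quantitative stiffness
# index, assuming nearly incompressible tissue: mu = rho * c**2, E ~= 3 * mu.

def youngs_modulus_kpa(shear_speed_m_s, density_kg_m3=1000.0):
    """Estimate Young's modulus (kPa) from shear wave speed (m/s)."""
    mu = density_kg_m3 * shear_speed_m_s ** 2   # shear modulus in Pa
    return 3.0 * mu / 1000.0                    # E ~= 3 * mu, in kPa

# Faster shear waves indicate stiffer tissue:
print(youngs_modulus_kpa(1.0))  # -> 3.0  kPa (soft tissue)
print(youngs_modulus_kpa(3.0))  # -> 27.0 kPa (stiffer tissue)
```

The tissue density of about 1000 kg/m^3 is a standard assumption for soft tissue; the quadratic dependence on speed is why even modest speed differences separate soft from hard regions clearly.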
Alternatively, an elastic modulus distribution, that is, a quantitative stiffness index, may be obtained and imaged based on the tissue strain distribution obtained by qualitative elastography and the tissue stress distribution. The tissue stress distribution cannot be measured directly, but it may be obtained from anatomical information, simulation, or the like.
[Third Embodiment]
In the first and second embodiments, examples were described in which the irradiation of light onto the subject is controlled based on the result of analyzing an ultrasonic image, and a photoacoustic image is displayed. In the third embodiment, an example will be described in which the irradiation of an ultrasonic beam onto the subject is controlled based on the result of analyzing a photoacoustic image, and an ultrasonic image is displayed. The following description takes the acquisition of an elastography image, an example of an ultrasonic image, as an example.
As described above, a photoacoustic image depicts substances or tissues having the property of absorbing irradiated light and generating acoustic waves (hereinafter referred to as optical characteristics). Depending on the lesion to be diagnosed, the characteristics of tissue having optical characteristics can aid diagnosis. For example, many new blood vessels are said to exist around tumor tissue, and there may be a correlation between regions with a high density of thin blood vessels and the malignancy of a tumor. In addition, the concentration of a substance having optical characteristics may differ between a specific lesion tissue and the surrounding normal tissue. By observing a photoacoustic image, a region containing a lesion with such characteristics can sometimes be identified. When a user observing a photoacoustic image finds a region that may contain a lesion, that is, a region of interest requiring more detailed observation, providing more detailed information about that region of interest is considered useful for diagnosis. In the third embodiment, when a region of interest is depicted in the photoacoustic image, an ultrasonic signal can be acquired and an ultrasonic image can be displayed.
FIG. 7 is a flowchart showing an example of processing by the control device 101 for controlling the irradiation of ultrasonic waves based on an acquired photoacoustic image and acquiring an ultrasonic image. Unless otherwise noted, the processing of each module described below is carried out by the CPU 131 or the GPU 137.
In step S701, the photoacoustic image generation module 147 is executed to acquire a photoacoustic image. Specifically, the examination control module 142 is first executed prior to the examination, whereby an examination order is acquired from the ordering system 112. The examination order includes information on the patient to be examined, the body part to be examined, and the imaging technique. When the user operates the probe 102, a photoacoustic signal is transmitted from the probe 102 to the control device 101. The signal acquisition module 143 is executed, whereby the control device 101 acquires the photoacoustic signal. The photoacoustic image generation module 147 is executed to generate a photoacoustic image based on the photoacoustic signal. The output module 150 is executed to display the photoacoustic image on the display unit 104. The user can continue to operate the probe 102 while observing the photoacoustic image displayed on the display unit 104.
In step S702, the examination control module 142 is executed to acquire information indicating whether an ultrasonic signal is to be acquired based on the photoacoustic image. This information is, specifically, a prior setting made by the user or information included in the examination order. In another example, whether to acquire an ultrasonic signal may be determined according to the processing load of the control device 101. For example, when the processing load of the control device 101 is high and acquiring an ultrasonic signal would affect the processing for acquiring the photoacoustic image, the examination control module 142 may determine not to acquire the ultrasonic signal. If an ultrasonic signal is to be acquired based on the photoacoustic image, the process proceeds to step S703; otherwise, the process returns to step S701 and acquisition of photoacoustic images continues.
 In step S703, the photoacoustic image is analyzed by executing the photoacoustic image generation module 147. For example, the photoacoustic image generation module 147 analyzes the photoacoustic image and detects a predefined region of interest among the regions depicted in the photoacoustic image.
 An example of the analysis performed by the photoacoustic image generation module 147 in step S703 follows. Here, the photoacoustic image analyzed is an image reflecting the absorption coefficient for light of a specific wavelength (hereinafter, an absorption coefficient image). For example, by irradiating light of a wavelength corresponding to the light absorption of hemoglobin, a blood vessel image is rendered in the absorption coefficient image. In this analysis, for example, the density of blood vessels in the regions depicted in the photoacoustic image is evaluated, and a region in which blood vessels are present at or above a predetermined density is detected as a region of interest. More specifically, for example, the number of pixels within a given range of the photoacoustic image whose values are at or above a certain threshold, divided by the total number of pixels in that range, is used as the density.
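The density computation described above can be sketched as follows. The window size, pixel-value threshold, and density cutoff are illustrative placeholders, not values specified in the text.

```python
import numpy as np

def vessel_density(patch, threshold):
    """Density as defined in the text: the number of pixels at or above
    `threshold`, divided by the total number of pixels in the range."""
    return np.count_nonzero(patch >= threshold) / patch.size

def detect_rois(image, threshold, density_min, window):
    """Slide a non-overlapping window over the absorption-coefficient
    image and return the top-left corners of windows whose vessel
    density is at or above `density_min`."""
    rois = []
    h, w = image.shape
    for y in range(0, h - window + 1, window):
        for x in range(0, w - window + 1, window):
            if vessel_density(image[y:y + window, x:x + window], threshold) >= density_min:
                rois.append((y, x))
    return rois
```

A detected corner could then be turned into the frame 605 drawn on the display in FIG. 6(b).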
 Note that while the photoacoustic image is being analyzed in step S703, the signal acquisition module 143 may continue to acquire photoacoustic signals, the photoacoustic image generation module 147 may continue to generate photoacoustic images, and the output module 150 may continue to display the photoacoustic images on the display unit 104.
 In step S704, the photoacoustic image generation module 147 is executed to determine, based on the result of the analysis in step S703, whether to acquire an ultrasonic image. If a region of interest was detected in step S703, the process proceeds to step S705; if not, the process returns to step S701 and photoacoustic image acquisition continues. In this respect, the photoacoustic image generation module 147 is an example of an analysis unit.
 In step S705, the irradiation control module 144 is executed to determine whether to irradiate the subject with ultrasonic waves. Specifically, the irradiation control module 144 determines whether the probe 102 is in contact with the subject, based on the photoacoustic image generated by the photoacoustic image generation module 147 and on the position information of the probe 102 acquired by the position acquisition module 149. Alternatively, the probe 102 may be provided with a sensor (not shown) for detecting contact with the subject, and the irradiation control module 144 may determine contact between the subject and the probe 102 based on information from that sensor. When the subject and the probe 102 are determined to be in contact, the irradiation control module 144 controls the irradiation unit 107 to emit ultrasonic waves. When they are determined not to be in contact, a screen notifying the user of that fact may be displayed on the display unit 104 via the output module 150. The irradiation control module 144 may further control the transmission/reception unit 106 so that, based on the photoacoustic image generated by the photoacoustic image generation module 147, ultrasonic waves are emitted while the region extracted as the region of interest in step S703 is depicted in that image. This reduces the possibility of redundant ultrasonic irradiation when the probe 102 has moved away from a position at which the region analyzed in step S703 can be depicted. In another example, it may be determined that ultrasonic waves are to be emitted when the temperature of the probe 102 is at or below a predetermined value. Owing to the characteristics of the transmission/reception unit 106, when the probe 102 is separated from the subject, a layer of air forms between the probe 102 and the subject. The acoustic impedance of air is much larger than that of the transmission/reception unit 106, so if an air layer is present, ultrasonic waves are repeatedly reflected near the transmission/reception unit 106 and the temperature of the probe 102 may rise. The probe 102 may be provided with a temperature sensor, and the irradiation control module 144 may acquire the probe temperature from that sensor. If it is determined in step S705 that the subject is to be irradiated with ultrasonic waves, the process proceeds to step S706; otherwise, the process returns to step S701 and photoacoustic image acquisition continues.
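The gating conditions described in step S705 (contact, region-of-interest visibility, and probe temperature) might be combined as in the following sketch; the function name and the temperature limit are assumptions for illustration, not values from the text.

```python
def should_transmit_ultrasound(in_contact, roi_in_view, probe_temp_c,
                               temp_limit_c=43.0):
    """Gate ultrasonic transmission on the three conditions described in
    step S705: probe-subject contact, the region of interest still being
    depictable, and a safe probe temperature.  The 43 degC limit is an
    illustrative value only."""
    return in_contact and roi_in_view and probe_temp_c <= temp_limit_c
```

When any condition fails, the flow corresponds to returning to step S701 without emitting ultrasonic waves.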
 In step S706, an ultrasonic signal is acquired from the probe 102 by executing the signal acquisition module 143. The ultrasonic image generation module 146 is then executed to generate an ultrasonic image from the ultrasonic signal, and the output module 150 is executed to display the generated ultrasonic image on the display unit 104. Furthermore, the ultrasonic image generation module 146 may control the superimposed image generation module 148 to generate a superimposed image and display it on the display unit 104 via the output module 150. The ultrasonic image generated in step S706 is, for example, a B-mode image.
 In step S707, the ultrasonic image generation module 146 is executed to determine whether imaging by elastography has been performed. Since a region of interest is depicted in the photoacoustic image, displaying an elastography image on the display unit 104 is considered useful as one piece of detailed information that assists diagnosis of the region of interest. For example, a screen is displayed on the display unit 104 via the output module 150 notifying the user that a region with high blood vessel density exists and that elastography imaging would be useful. The user refers to this notification, presses the probe 102 against the subject, and performs elastography imaging. The processing for determining whether the user has performed elastography imaging is the same as that of step S502 illustrated in FIG. 5, so the above description applies and is not repeated here. If the user has performed elastography imaging, the process proceeds to step S708; if not, the process returns to step S701 and photoacoustic image acquisition continues.
 In step S708, the ultrasonic image generation module 146 is executed to generate an elastography image from the ultrasonic signal acquired in step S707, and the output module 150 is executed to display the generated elastography image on the display unit 104. Furthermore, the ultrasonic image generation module 146 may control the superimposed image generation module 148 to generate a superimposed image and display it on the display unit 104 via the output module 150.
 In this way, the control device 101 can display the ultrasonic image and the photoacoustic image on the display unit 104 so that the image of the region of interest specified based on the photoacoustic image and an ultrasonic image such as a B-mode image or an elastography image can be referenced in correspondence with each other. This improves the workflow with which a user such as a physician observes medical images of the subject, such as ultrasonic and photoacoustic images, and makes a diagnosis.
 Note that the irradiation control module 144 is not limited to the example described above; it may cause the subject to be irradiated with ultrasonic waves based on a user operation input. When ultrasonic waves are emitted, the ultrasonic image generation module 146 may generate an ultrasonic image and display it on the display unit 104 via the output module 150.
 The elastography image generated by the ultrasonic image generation module 146 and the superimposed image generated by the superimposed image generation module 148 may also be stored in the DISK 134 or the PACS 113 as appropriate.
 Although FIG. 7 has been described using elastography imaging as an example, steps S707 and S708 need not necessarily be performed. For example, an ultrasonic image such as a B-mode image may be displayed based on the result of analyzing the photoacoustic image.
 FIG. 6 illustrates an example of the analysis performed in step S703 of FIG. 7 and an example of the image displayed in step S708. Here, a case in which a region of high blood vessel density is depicted in the photoacoustic image is described as an example.
 FIG. 6(a) is an example of a photoacoustic image 602 displayed on the display unit 104. A blood vessel region 603 and a blood vessel region 604 are depicted in the photoacoustic image 602.
 FIG. 6(b) is an example of a screen displayed on the display unit 104 when a region considered to have high blood vessel density is detected as a result of the analysis in step S703. A frame 605 indicates the region of interest detected in step S703, allowing the user to visually identify the region of interest, that is, the region detected as having high blood vessel density. A notification screen 606 displays information based on the analysis result. For example, a message informing the user that a region with high blood vessel density exists and that an elastography image would be useful for further observation is displayed on the notification screen 606, such as "Please perform elastography of the dense blood vessel region."
 FIG. 6(c) is an example of an elastography image 607 based on the ultrasonic signal obtained by the elastography imaging performed in step S707. In the elastography image 607, a tissue region 608 that may be hard tissue is depicted so as to be distinguishable from the surrounding tissue.
 FIG. 6(d) is an example of a superimposed image 609 obtained by superimposing the elastography image 607 on the photoacoustic image 602. A blood vessel region 610 corresponding to the blood vessel region 603 of the photoacoustic image 602 is depicted. This allows the user to visually assess the blood vessel density around the tissue region 608 of the elastography image 607 illustrated in FIG. 6(c), assisting the user's diagnosis.
 According to the configuration of the third embodiment, the control device 101 analyzes the acquired photoacoustic image, detects a region that may contain a lesion as a region of interest, and generates an ultrasonic image corresponding to at least the region of interest. For example, when diagnosing a region of interest with high blood vessel density in which a tumor is suspected, controlling the system to perform elastography imaging, which evaluates the stiffness of the region of interest, can assist that diagnosis. With this control, the user can perform elastography imaging centered on the region of interest, improving the user's examination workflow.
 In the third embodiment, the case of acquiring an elastography image as the ultrasonic image has been described as an example, but the present invention is not limited to this. For example, Doppler imaging for measuring blood flow velocity, or B-mode imaging for grasping the structure inside the subject, may be performed.
 [Fourth Embodiment]
 In the first to third embodiments, examples were described in which either the ultrasonic image or the photoacoustic image is analyzed and the result is used to control acquisition of the other. A time lag may occur between the moment a region of interest is depicted in one image and the moment the analysis result triggers acquisition of the other. During this time lag, the user may have moved the probe 102 to a position unsuitable for imaging the region of interest. The fourth embodiment describes an example in which the probe 102 is guided based on its position information during the examination so that the region of interest can be imaged appropriately.
 FIG. 8 schematically illustrates an examination in the fourth embodiment. FIG. 8(a) is an example of the user bringing the probe 102 into contact with the subject 803 and acquiring an ultrasonic image. The ultrasonic signal from the probe 102 is transmitted to a console 801. The console 801 is an apparatus integrating the control device 101, the display unit 104, and the operation unit 105 shown in FIG. 10, and corresponds to the control device 101 in each of the embodiments described above. The position acquisition module 149 of the console 801 acquires position information of a position 802 and stores it in the RAM 133 for a predetermined period. When images are output to the PACS 113 by the output module 150, or stored in the DISK 134, the position information is associated with the ultrasonic image and the photoacoustic image, respectively. Assume that a region of interest has been detected in the ultrasonic image captured at the position 802.
 FIG. 8(b) is an example of the state of the examination at the time when, for instance in the series of processes of FIG. 2, the console 801 has analyzed the ultrasonic image and detected a region of interest. The user has moved the probe 102 from a position 805, corresponding to the position 802 in FIG. 8(a), to a position 804. Although the console 801 has detected a region of interest for which a photoacoustic image should be acquired, even if a photoacoustic image is generated based on a photoacoustic signal acquired at the position 804, the detected region of interest may not be depicted in it.
 FIG. 8(c) is an example of the user having moved the probe 102, guided by the console 801, to a position 806 at which the region of interest can be depicted. By executing the position acquisition module 149 of the console 801, the current position of the probe 102 is compared with the target position, that is, the position of the probe 102 at which the ultrasonic signal of the ultrasonic image in which the region of interest was detected was acquired. Guide information for guiding the probe 102 to the target position is thereby generated and displayed, allowing the user to acquire a photoacoustic image of the detected region of interest.
 FIG. 9 is a flowchart illustrating an example of the processing for the guidance illustrated in FIG. 8. Unless otherwise noted, the processing of each module described below is carried out by the CPU 131 or the GPU 137.
 The processing of steps S901 and S902 is the same as that of step S201 illustrated in FIG. 2, so the above description applies and is not repeated here.
 In step S903, the position acquisition module 149 is executed to acquire position information of the probe 102. More specifically, a motion sensor, which is an example of the detection unit 103, tracks the position information of the probe 102 and transmits it to the control device 101. The motion sensor is attached to, or embedded in, a part of the probe 102 other than the transmission/reception unit 106 and the light source (not shown). The motion sensor is composed of, for example, micro-electromechanical systems (MEMS) and provides 9-axis motion sensing with a 3-axis accelerometer, a 3-axis gyroscope, and a 3-axis magnetic compass. Information on the movement of the probe 102 sensed by the motion sensor is acquired by the position acquisition module 149 and stored for a certain period.
 The processing of steps S904 and S905 is the same as that of steps S203 and S204 illustrated in FIG. 2, respectively, so the above description applies and is not repeated here. If it is determined in step S905 that a photoacoustic signal is to be acquired, the process proceeds to step S906; otherwise, the process returns to step S901 and ultrasonic image acquisition continues.
 In step S906, the position acquisition module 149 is executed, whereby guide information is displayed on the display unit 104. Specifically, in generating the guide information, the position of the probe 102 at the time the ultrasonic signal used to generate the ultrasonic image in which the region of interest was detected in step S904 was acquired is first set as the target position. The difference between the target position and the current position of the probe 102, indicated by the position information transmitted successively from the detection unit 103, is then obtained, and the guide information is generated based on that difference. The guide information is presented to the user via the output module 150, for example as a guide image displayed on the display unit 104. The guide image is an objective indicator of guide information such as the movement direction, movement amount, tilt angle, rotation direction, and rotation amount for moving the probe 102 to the target position; any image serving as such an objective indicator may be used. For example, the guide image may be an arrow whose size corresponds to the amount of movement or rotation and whose orientation corresponds to the direction of movement, rotation, or tilt. In another example, the guide image is a figure whose size corresponds to the amount of movement or rotation and whose shape deforms according to the direction of movement, rotation, or tilt. The guide image is displayed on the display unit 104 in a manner that does not interfere with observation of the region of interest once the probe 102 has been moved to the target position. For example, the guide image is displayed in an area where neither the ultrasonic image, the photoacoustic image, nor the superimposed image is displayed. In another example, while the probe 102 is being guided toward the target position, the guide image may be displayed superimposed near the target region, and once the target region is depicted, it may be deformed into a shape that is no longer noticeable.
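One way the difference between the current and target positions could be turned into guide information (a movement direction and amount for the arrow-style guide image) is sketched below; the 2-D coordinates and the function name are illustrative simplifications of the description above.

```python
import math

def guide_vector(current, target):
    """Displacement, straight-line distance, and heading (degrees,
    measured from the +x axis) from the probe's current position to the
    target position.  2-D surface coordinates are an illustrative
    simplification; a real probe pose would also include tilt and
    rotation."""
    dx = target[0] - current[0]
    dy = target[1] - current[1]
    distance = math.hypot(dx, dy)
    heading_deg = math.degrees(math.atan2(dy, dx))
    return (dx, dy), distance, heading_deg
```

The distance could scale the size of the arrow image and the heading its orientation, updated each time new position information arrives from the detection unit 103.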
 In step S907, the irradiation control module 144 is executed to determine whether to irradiate the subject with light. Specifically, the irradiation control module 144 determines that light is to be emitted when the probe 102 has reached the target position and the probe 102 is in contact with the subject. For example, when the current position of the probe 102 indicated by the position information transmitted from the detection unit 103 matches the target position, it is determined that the probe 102 has reached the target position. In another example, it may be determined that the probe 102 has reached the target position when it arrives within a predetermined range including the target position; for example, the positions of the probe 102 from which the region of interest detected in step S904 can be acquired, together with the target position, are taken as the predetermined range. The processing for determining that the probe 102 is in contact with the subject is the same as that of step S205 illustrated in FIG. 2, so the above description applies and is not repeated here. If it is determined that the subject is to be irradiated with light, the process proceeds to step S908; otherwise, the process returns to step S901 and ultrasonic image acquisition continues.
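The "predetermined range" test in step S907 might be realized as a simple distance check, as in this sketch (the function name and tolerance parameter are assumptions for illustration):

```python
def reached_target(current, target, tolerance):
    """Treat the probe as having reached the target when it lies within
    `tolerance` of the target position -- one way to realize the
    'predetermined range including the target position'.  A tolerance of
    zero reduces to the exact-match case described first in the text."""
    dx = current[0] - target[0]
    dy = current[1] - target[1]
    return dx * dx + dy * dy <= tolerance * tolerance
```

Light irradiation would then be permitted only when this check and the contact check both pass.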
 The processing of steps S908 and S909 is the same as that of steps S206 and S207 illustrated in FIG. 2, respectively, so the above description applies and is not repeated here.
 In the fourth embodiment, the case in which the target position is set based on a region of interest detected by analyzing an ultrasonic image has been described as an example, but the present invention is not limited to this. For example, as shown in the third embodiment, the target position may be set based on a region of interest detected by analyzing a photoacoustic image. The method of presenting the guide information to the user is also not limited to the guide image described above; for example, the guide information may be presented by generating a sound whose interval shortens as the probe 102 approaches the target position.
 According to the configuration of the fourth embodiment, the control device 101 can efficiently acquire an image depicting the region of interest, improving the user's workflow.
 [Modification 1]
 In the first to fourth embodiments, the analysis for detecting the region of interest need not be performed on every frame of the acquired images. For example, performing the analysis at a predetermined frame interval can reduce the processing load on the control device 101.
 In the first to fourth embodiments, examples were described in which, when a region of interest is detected by analyzing one of the ultrasonic image and the photoacoustic image, the probe 102 is controlled to capture the other image. The present invention is not limited to this; for example, the probe 102 may be controlled to capture the other image when the region of interest falls within a predetermined part of the range that the probe 102 can depict, such as the range near the center of one image. This yields medical images in which the region of interest is easier to observe when they are later interpreted or attached to an image diagnosis report, and also reduces the possibility that the other image is captured inadvertently when the region of interest unintentionally enters the depictable range of one image.
 In the first to fourth embodiments, examples of displaying a superimposed image were described, but the method of superimposition is not limited to those examples. In an example in which the ultrasonic image is the base image and the photoacoustic image is the layer image, the photoacoustic image may be superimposed only in the vicinity of the region of interest of the ultrasonic image, or a desired range of the photoacoustic image may be superimposed. The transparency of the layer image can be changed as appropriate for the purpose: the layer image may be opaque, or the transparency may be raised only near the region of interest. A slider for changing the transparency of the layer image may be displayed on the display unit 104 so that the user can adjust it during observation. The display of the superimposed image, the side-by-side display of the ultrasonic image and the photoacoustic image, and the display of either image alone may be switchable by user operation input.
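The base/layer blending with adjustable transparency described above corresponds to standard alpha compositing; a minimal sketch, assuming grayscale images of equal shape that are already registered:

```python
import numpy as np

def superimpose(base, layer, alpha):
    """Alpha-blend a layer image (e.g. photoacoustic) onto a base image
    (e.g. B-mode ultrasound).  alpha=0 shows only the base image,
    alpha=1 only the layer; intermediate values correspond to the
    slider-controlled transparency described in the text."""
    return (1.0 - alpha) * base + alpha * layer
```

Restricting the blend to the vicinity of the region of interest amounts to applying this only inside a mask and keeping the base image elsewhere.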
 Further, although the first to fourth embodiments describe detecting the region of interest by analyzing an image acquired in real time, the present invention is not limited to this. For example, a region depicted in a previously captured three-dimensional image may be set as the region of interest. For example, a specific region on a previously captured CT image is set as the region of interest, and the coordinate system of that CT image is registered with the coordinate system of the real space in which the probe 102 is operated. The control device 101 can acquire both ultrasonic images and photoacoustic images, and control may be performed so that, while one image is being acquired, the other image is also acquired when the probe 102 reaches a position from which the region of interest can be imaged.
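Registering the CT coordinate system with the real space in which the probe moves can be modeled as a rigid transform; the check that the probe has reached a position from which the ROI is acquirable might then look like the sketch below. The rigid-registration parameters and the simple distance criterion are illustrative assumptions — the specification does not prescribe how the two coordinate systems are matched.

```python
import numpy as np

def probe_to_ct(probe_pos, rotation, translation):
    """Map a probe position in real-space coordinates into the CT
    image's coordinate system via a precomputed rigid registration."""
    return rotation @ np.asarray(probe_pos, dtype=float) + translation

def roi_acquirable(probe_pos, roi_center, reach, rotation, translation):
    """True if the registered probe position is within `reach` of the
    ROI set on the past CT image (a simple distance criterion)."""
    p = probe_to_ct(probe_pos, rotation, translation)
    return float(np.linalg.norm(p - np.asarray(roi_center, dtype=float))) <= reach
```

While one modality is being acquired, the control device could poll this predicate with the tracked probe position and trigger acquisition of the other modality when it becomes true.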
 Furthermore, in the first to fourth embodiments, the user may be notified that the probe 102 is emitting light. For example, a notification image indicating that light irradiation is in progress is displayed on the display unit 104; in that case, it is preferably displayed near the image of the subject that the user is observing. In another example, the probe 102 may be equipped with an LED that lights up during light irradiation. In yet another example, the control device 101 may emit a notification sound during light irradiation. The user can thereby tell that light is being emitted from the probe 102, improving the safety of both the user and the subject.
 [Modification 2]
 In the above embodiments, whether to acquire a photoacoustic image was determined based on the analysis result of the ultrasonic image. Consequently, a photoacoustic image is captured automatically whenever the ultrasonic image contains a region of interest such as a calculus, so even after a photoacoustic image has been captured once for a given region of interest, attempting to reconfirm that region with an ultrasonic image could trigger another photoacoustic capture. That is, photoacoustic images would be captured multiple times for the same region of interest, and unnecessary photoacoustic images would be acquired. The purpose of this modification is therefore to prevent acquisition of unnecessary photoacoustic images.
 In this modification, the irradiation control module 144 restricts the light irradiation for acquiring a photoacoustic image based on, for example, the position information of the probe 102 acquired by the position acquisition module 149. For example, the irradiation control module 144 stores the position information of the probe 102 at the time it previously decided to irradiate light, and determines whether the current position information of the probe 102 matches the stored position information or deviates from it by no more than a predetermined threshold. If so, the irradiation control module 144 judges that a photoacoustic image has already been acquired for that site and restricts the light irradiation. Restricting the light irradiation prevents unnecessary light exposure of the subject and prevents acquisition of unnecessary photoacoustic images.
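The position-based restriction described in this paragraph — remember the probe position at each past irradiation and veto irradiation when the probe returns within a threshold of any remembered position — can be sketched as follows. The 3-D position representation and the Euclidean distance criterion are assumptions; module 144 could use any position metric.

```python
import math

class IrradiationGate:
    """Sketch of the position-based restriction: remember where light
    was irradiated before and veto irradiation when the probe returns
    within `threshold` of any remembered position."""

    def __init__(self, threshold):
        self.threshold = threshold
        self._irradiated = []  # past (x, y, z) probe positions

    def may_irradiate(self, position):
        # Allow irradiation only if no stored position is within threshold.
        return all(math.dist(position, p) > self.threshold
                   for p in self._irradiated)

    def record(self, position):
        # Call when irradiation is actually performed.
        self._irradiated.append(tuple(position))
```

In use, the controller would consult `may_irradiate` before each candidate light pulse and call `record` whenever a pulse is emitted.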
 The irradiation control module 144 may also restrict the light irradiation for acquiring a photoacoustic image based on, for example, the ultrasonic image generated by the ultrasonic image generation module 146. For example, the irradiation control module 144 stores the ultrasonic image generated when it previously decided to irradiate light and compares it with the currently generated ultrasonic image. If the similarity between the two images is equal to or greater than a predetermined threshold, the module may judge that a photoacoustic image has already been acquired for that site and restrict the light irradiation.
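The image-similarity variant could be sketched with normalized cross-correlation as the similarity measure. The specification does not fix a measure or a threshold; both are illustrative assumptions here.

```python
import numpy as np

def similarity(img_a, img_b):
    """Normalized cross-correlation of two equal-shape grayscale
    images; 1.0 means identical up to brightness/contrast shifts."""
    a = img_a - img_a.mean()
    b = img_b - img_b.mean()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float((a * b).sum() / denom) if denom else 1.0

def may_irradiate(current_image, past_images, threshold=0.9):
    """Veto irradiation when the current ultrasonic image resembles
    any image stored at a past irradiation."""
    return all(similarity(current_image, past) < threshold
               for past in past_images)
```

The combined position-plus-image variant described next would narrow the comparison set to the images stored in association with positions near the current probe position, instead of comparing against every stored image.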
 The irradiation control module 144 may also restrict the light irradiation for acquiring a photoacoustic image based on both the ultrasonic image generated by the ultrasonic image generation module 146 and the position information of the probe 102 acquired by the position acquisition module 149. For example, the irradiation control module 144 stores an ultrasonic image in association with the position information of the probe 102 at the time it previously decided to irradiate light. It then reads out the ultrasonic image associated with the current position information of the probe 102 and compares it with the currently generated ultrasonic image. If the similarity between the two images is equal to or greater than a predetermined threshold, the module may judge that a photoacoustic image has already been acquired for that site and restrict the light irradiation.
 The present invention can also be realized by supplying a program that implements one or more functions of the above embodiments to a system or apparatus via a network or storage medium, and having one or more processors in a computer of that system or apparatus read and execute the program. It can also be realized by a circuit (for example, an ASIC) that implements one or more functions.
 The control device in each of the above embodiments may be realized as a single device, or a plurality of devices may be combined so as to communicate with one another and execute the above processing; both forms are included in the embodiments of the present invention. The above processing may also be executed by a common server device or server group. The plurality of devices constituting the control device and the control system need only be able to communicate at a predetermined communication rate, and need not be located in the same facility or even in the same country.
 Embodiments of the present invention include a form in which a software program that implements the functions of the above embodiments is supplied to a system or apparatus, and a computer of that system or apparatus reads out and executes the code of the supplied program.
 Accordingly, the program code itself installed in a computer to realize the processing of the embodiments is also an embodiment of the present invention. In addition, based on instructions included in the program read by the computer, an OS or the like running on the computer may perform part or all of the actual processing, and the functions of the above embodiments may also be realized by that processing.
 Embodiments obtained by appropriately combining the above embodiments are also included in the embodiments of the present invention.
 The present invention is not limited to the above embodiments, and various changes and modifications can be made without departing from the spirit and scope of the present invention. Accordingly, the following claims are attached to make the scope of the present invention public.
 This application claims priority based on Japanese Patent Application No. 2016-136108 filed on July 8, 2016 and Japanese Patent Application No. 2016-229312 filed on November 25, 2016, the entire contents of which are incorporated herein by reference.

Claims (16)

  1.  A control device comprising:
     signal acquisition means for acquiring at least one of an ultrasonic signal and a photoacoustic signal from a probe that outputs the ultrasonic signal by transmitting and receiving ultrasonic waves to and from a subject, and outputs the photoacoustic signal by receiving a photoacoustic wave generated by light irradiation of the subject;
     generation means for generating an ultrasonic image based on the ultrasonic signal; and
     irradiation control means for controlling the light irradiation of the probe based on the generated ultrasonic image.
  2.  The control device according to claim 1, wherein the generation means generates a photoacoustic image based on a photoacoustic signal acquired under the controlled light irradiation,
     the control device further comprising display control means for causing a display unit to display the generated ultrasonic image and the photoacoustic image so that they can be compared.
  3.  The control device according to claim 2, further comprising detection means for detecting a region of interest from which the photoacoustic signal is to be acquired by analyzing the generated ultrasonic image,
     wherein the irradiation control means controls the light irradiation so as to acquire the photoacoustic signal from the detected region of interest.
  4.  The control device according to any one of claims 1 to 3, further comprising:
     position acquisition means for acquiring position information indicating a position of the probe with respect to the subject; and
     guide means for guiding the probe to a position where the detected region of interest can be imaged.
  5.  The control device according to claim 3 or 4, wherein the irradiation control means performs control so that the light irradiation is carried out when the probe is at a position where a photoacoustic image including the region of interest can be acquired.
  6.  The control device according to any one of claims 3 to 5, further comprising output means for outputting the photoacoustic image acquired in response to detection of the region of interest in association with the analyzed ultrasonic image.
  7.  The control device according to any one of claims 3 to 6, wherein the detection means analyzes the ultrasonic image to detect a region that may be a calculus.
  8.  The control device according to any one of claims 3 to 7, wherein the ultrasonic image is an elastography image reflecting the distribution of strain in the tissue of the subject before and after being pressed by the probe, and the detection means detects a region where the strain is equal to or less than a predetermined value.
  9.  The control device according to any one of claims 1 to 8, wherein the irradiation control means performs control so that the light irradiation is carried out when the probe is in contact with the subject.
  10.  A control device comprising:
      acquisition means for acquiring at least one of an ultrasonic signal and a photoacoustic signal from a probe that outputs the ultrasonic signal by transmitting and receiving ultrasonic waves to and from a subject, and outputs the photoacoustic signal by receiving a photoacoustic wave generated by light irradiation of the subject;
      generation means for generating a photoacoustic image based on the photoacoustic signal; and
      irradiation control means for controlling the transmission of the ultrasonic waves by the probe based on the generated photoacoustic image.
  11.  The control device according to claim 10, further comprising detection means for detecting a region of interest from which the ultrasonic signal is to be acquired by analyzing the generated photoacoustic image,
      wherein the irradiation control means controls the transmission of the ultrasonic waves so as to acquire the ultrasonic signal from the detected region of interest,
      the generation means generates an ultrasonic image based on an ultrasonic signal acquired under the controlled transmission of the ultrasonic waves, and
      the control device further comprises display control means for causing a display unit to display the generated ultrasonic image and the photoacoustic image so that they can be compared.
  12.  The control device according to claim 11, wherein the irradiation control means controls the range of the ultrasonic transmission based on the detected region of interest.
  13.  The control device according to any one of claims 2 to 12, wherein the display control means causes the display unit to display a superimposed image in which the photoacoustic image is superimposed on the ultrasonic image.
  14.  An imaging system comprising:
      a light source for irradiating a subject with light;
      a transducer for transmitting and receiving ultrasonic waves;
      signal acquisition means for acquiring, as an ultrasonic signal, a reflected wave of the ultrasonic waves transmitted by the transducer, and acquiring, as a photoacoustic signal, a photoacoustic wave generated by the light irradiated onto the subject from the light source;
      generation means for generating an ultrasonic image based on the ultrasonic signal; and
      irradiation control means for controlling the irradiation of the light from the light source based on the generated ultrasonic image.
  15.  An imaging system comprising:
      a light source for irradiating a subject with light;
      a transducer for transmitting and receiving ultrasonic waves;
      signal acquisition means for acquiring, as an ultrasonic signal, a reflected wave of the ultrasonic waves transmitted by the transducer, and acquiring, as a photoacoustic signal, a photoacoustic wave generated by the light irradiated onto the subject from the light source;
      generation means for generating a photoacoustic image based on the photoacoustic signal; and
      irradiation control means for controlling the transmission of the ultrasonic waves by the transducer based on the generated photoacoustic image.
  16.  A control method comprising:
      a step of acquiring a first image based on a first signal acquired by a first technique;
      a step of analyzing the first image to detect a region of interest;
      a step of controlling acquisition of a second signal in order to acquire a second image including the region of interest based on the second signal acquired by a second technique different from the first technique; and
      a step of causing a display unit to display the first image and the second image so that they can be compared.
PCT/JP2017/024569 2016-07-08 2017-07-05 Control device, control method, control system, and program WO2018008661A1 (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2016136108 2016-07-08
JP2016-136108 2016-07-08
JP2016229312A JP2018011928A (en) 2016-07-08 2016-11-25 Control device, control method, control system, and program
JP2016-229312 2016-11-25

Publications (1)

Publication Number Publication Date
WO2018008661A1 true WO2018008661A1 (en) 2018-01-11

Family

ID=60912804

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/024569 WO2018008661A1 (en) 2016-07-08 2017-07-05 Control device, control method, control system, and program

Country Status (1)

Country Link
WO (1) WO2018008661A1 (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012254284A (en) * 2011-05-13 2012-12-27 Fujifilm Corp Tomographic image generating device, method, and program
JP2013063253A (en) * 2011-08-31 2013-04-11 Canon Inc Information processing apparatus, ultrasonic imaging apparatus, and information processing method
JP2013527782A (en) * 2010-04-22 2013-07-04 The University of Washington Through Its Center for Commercialization Ultrasound-based method and apparatus for detecting and facilitating the removal of stones
JP2013158531A (en) * 2012-02-07 2013-08-19 Canon Inc Apparatus and method for obtaining subject information
JP2014136103A (en) * 2013-01-18 2014-07-28 Fujifilm Corp Photoacoustic image generation device and photoacoustic image generation method
JP2015065975A (en) * 2013-09-26 2015-04-13 キヤノン株式会社 Subject information acquisition device and control method therefor
JP2016097165A (en) * 2014-11-25 2016-05-30 キヤノン株式会社 Subject information acquisition device and probe

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2019122621A (en) * 2018-01-17 2019-07-25 Canon Inc. Subject information acquiring apparatus and subject information acquiring method
CN110384480A (en) * 2018-04-18 2019-10-29 佳能株式会社 Subject information acquisition device, subject information processing method and storage medium
CN110384480B (en) * 2018-04-18 2023-06-09 佳能株式会社 Subject information acquisition device, subject information processing method, and storage medium
US20210275040A1 (en) * 2020-03-05 2021-09-09 Koninklijke Philips N.V. Ultrasound-based guidance for photoacoustic measurements and associated devices, systems, and methods

Similar Documents

Publication Publication Date Title
JP5530592B2 (en) Storage method of imaging parameters
US9801614B2 (en) Ultrasound diagnostic apparatus, ultrasound image processing method, and non-transitory computer readable recording medium
US20170086795A1 (en) Medical image diagnostic apparatus and medical information display control method
CN105188555B (en) Diagnostic ultrasound equipment and image processing apparatus
JP2008086767A (en) System and method for three-dimensional and four-dimensional contrast imaging
JP2010000143A (en) Ultrasonic diagnostic apparatus and program
US20190150894A1 (en) Control device, control method, control system, and non-transitory storage medium
JP6661787B2 (en) Photoacoustic image evaluation device, method and program, and photoacoustic image generation device
KR20150106779A (en) The method and apparatus for displaying a plurality of different images of an object
US20150173721A1 (en) Ultrasound diagnostic apparatus, medical image processing apparatus and image processing method
WO2018116963A1 (en) Display control apparatus, display control method, and program
WO2018008439A1 (en) Apparatus, method and program for displaying ultrasound image and photoacoustic image
WO2018008661A1 (en) Control device, control method, control system, and program
JP2018011928A (en) Control device, control method, control system, and program
US11510630B2 (en) Display control apparatus, image display method, and non-transitory computer-readable medium
CN108463174A (en) Device and method for the tissue for characterizing object
EP3329843B1 (en) Display control apparatus, display control method, and program
US11744537B2 (en) Radiography system, medical imaging system, control method, and control program
US20200113541A1 (en) Information processing apparatus, information processing method, and storage medium
WO2018008664A1 (en) Control device, control method, control system, and program
JP7129158B2 (en) Information processing device, information processing method, information processing system and program
KR102106542B1 (en) Method and apparatus for analyzing elastography of tissue using ultrasound
JP2017042603A (en) Subject information acquisition apparatus
US11599992B2 (en) Display control apparatus, display method, and non-transitory storage medium
WO2020040174A1 (en) Image processing device, image processing method, and program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17824264

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17824264

Country of ref document: EP

Kind code of ref document: A1