WO2022219783A1 - Phototherapy device, phototherapy method, and phototherapy program - Google Patents
Phototherapy device, phototherapy method, and phototherapy program
- Publication number
- WO2022219783A1 (PCT/JP2021/015612)
- Authority
- WO
- WIPO (PCT)
Classifications
- A61N5/0613—Apparatus adapted for a specific treatment
- A61N5/062—Photodynamic therapy, i.e. excitation of an agent
- A61N5/0601—Apparatus for use inside the body
- A61N5/0603—Apparatus for use inside the body for treatment of body cavities
- A61N2005/0609—Stomach and/or esophagus
- A61N2005/0626—Monitoring, verifying, controlling systems and methods
- A61N2005/0627—Dose monitoring systems and methods
- A61N2005/0628—Dose monitoring systems and methods including a radiation sensor
- A61N2005/0658—Radiation therapy using light characterised by the wavelength of light used
- A61N2005/0659—Radiation therapy using light characterised by the wavelength of light used: infrared
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0012—Biomedical image inspection
- G06T7/10—Segmentation; Edge detection
- G06T7/11—Region-based segmentation
- G06T2207/10064—Fluorescence image
- G06T2207/10068—Endoscopic image
- G06T2207/20081—Training; Learning
- G06T2207/30092—Stomach; Gastric
- G06T2207/30096—Tumor; Lesion
Definitions
- The present invention relates to a phototherapy device, a phototherapy method, and a phototherapy program.
- Photoimmunotherapy (PIT) is a treatment in which an antibody drug that binds to cancer cells is irradiated with near-infrared light.
- The antibody drug irradiated with near-infrared light absorbs the light energy, undergoes molecular vibration, and generates heat; this heat destroys the cancer cells.
- The antibody drug also emits fluorescence when excited, and the intensity of this fluorescence is used as an index of therapeutic efficacy.
- Even when the same amount of therapeutic light is applied, the amount of reaction in the tissue may be non-uniform.
- The rate of reaction progress is likewise non-uniform: there are regions where the reaction progresses quickly and regions where it progresses slowly.
- Because Patent Document 1 and Non-Patent Document 1 evaluate the therapeutic effect from the decrease in fluorescence over the entire light-irradiated area, the therapeutic effect may not be evaluated appropriately.
- A fluorescence image is generally blurred, so local changes are easily hidden by the conventional method of observing the decrease in fluorescence over the entire light-irradiated region.
- Even when the fluorescence of a region where the reaction has progressed has decreased, it may be buried in the fluorescence of unreacted regions with high fluorescence intensity.
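For illustration, the masking effect described above can be sketched numerically. This is a synthetic example (array sizes and intensity values are invented, not from the patent): a small reacted region whose fluorescence has halved barely moves the whole-field average, while a region-restricted average reveals it clearly.

```python
import numpy as np

# Synthetic fluorescence image: a small reacted region whose intensity
# has dropped by 50%, surrounded by unreacted tissue at full intensity.
img = np.full((100, 100), 1000.0)   # unreacted fluorescence level
img[45:55, 45:55] = 500.0           # reacted region, fluorescence halved

# Whole-field average barely moves: the local change is "buried".
global_mean = img.mean()
# Region-of-interest average shows the change clearly.
local_mean = img[45:55, 45:55].mean()

print(round(global_mean, 1))  # 995.0, only a 0.5% drop overall
print(local_mean)             # 500.0, a 50% drop locally
```

This is exactly the problem the boundary-region approach addresses: evaluating fluorescence change per region rather than over the entire irradiated field.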
- The present invention has been made in view of the above, and aims to provide a phototherapy device, a phototherapy method, and a phototherapy program capable of appropriately irradiating a treatment area with light.
- The phototherapy apparatus according to the present invention includes: a therapeutic light emitting device that emits therapeutic light for causing a drug to react; a tissue structure image acquisition unit that acquires a tissue structure image obtained by narrowband light irradiated at the irradiation position of the therapeutic light; a fluorescence image acquisition unit that acquires a fluorescence image obtained by excitation light irradiated at the irradiation position of the therapeutic light; a boundary region determination unit that uses the tissue structure image to determine a boundary region where the tissue structure has changed; a fluorescence intensity change calculation unit that calculates the magnitude of the change in fluorescence intensity in the boundary region; and a display image generation unit that generates a display image for displaying the magnitude of the change in fluorescence intensity.
- The boundary region determination unit detects a temporal change in the tissue structure image and, based on the amount of temporal change, determines the region of the site where the tissue structure has changed as the boundary region.
- The boundary region determination unit determines the region of the site where the tissue structure has changed as the boundary region by comparing the values of the tissue structure image with a preset threshold.
- The boundary region determination unit determines the region of the site where the tissue structure has changed as the boundary region using a feature amount calculated in advance by machine learning.
- The tissue structure image acquisition unit acquires a tissue structure image obtained by narrowband light in the wavelength band of 380 nm to 440 nm.
- The phototherapy apparatus further includes a fluorescence intensity normalization unit that normalizes the fluorescence intensity calculated by the fluorescence intensity change calculation unit using the light intensity of the return light of narrowband light in the wavelength band of 440 nm to 490 nm.
- The tissue structure image acquisition unit acquires a tissue structure image obtained by narrowband light in the wavelength band of 490 nm to 590 nm.
- The tissue structure image acquisition unit acquires a tissue structure image obtained by narrowband light in the wavelength band of 590 nm to 620 nm.
- The tissue structure image acquisition unit acquires a tissue structure image obtained by narrowband light in the wavelength band of 620 nm to 780 nm.
- The phototherapy apparatus further includes a control unit that controls emission of the therapeutic light such that the integrated value of the light irradiation intensity and the irradiation time reaches the set irradiation light amount for the irradiation target area of the therapeutic light.
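The dose control just described treats the integral of irradiation intensity over time as the delivered light amount. A minimal sketch of that control loop follows; the function name, units, and numeric values are illustrative assumptions, not part of the patent disclosure:

```python
def irradiate(set_dose_j_per_cm2, intensity_w_per_cm2, dt_s=0.25):
    """Accumulate delivered dose each time step and stop at the set dose.

    Dose is the time integral of irradiation intensity, so with constant
    intensity the loop runs for set_dose / intensity seconds.
    """
    delivered = 0.0
    elapsed = 0.0
    while delivered < set_dose_j_per_cm2:
        delivered += intensity_w_per_cm2 * dt_s  # rectangle-rule integration
        elapsed += dt_s
    return delivered, elapsed

dose, t = irradiate(set_dose_j_per_cm2=50.0, intensity_w_per_cm2=1.0)
print(dose, t)  # 50.0 50.0 -- 50 J/cm^2 takes 50 s at 1 W/cm^2
```

In a real device the intensity term would come from a sensor reading each time step, so the same accumulation works for non-constant irradiation.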
- The phototherapy method according to the present invention is a method of irradiating a treatment site with therapeutic light that causes a drug to react and confirming the therapeutic effect, and includes: a tissue structure image acquisition step of acquiring a tissue structure image obtained by narrowband light irradiated at the irradiation position of the therapeutic light; a fluorescence image acquisition step of acquiring a fluorescence image obtained by excitation light irradiated at the irradiation position of the therapeutic light; a boundary region determination step of determining, using the tissue structure image, a boundary region where the tissue structure has changed; a fluorescence intensity change calculation step of calculating the magnitude of the change in fluorescence intensity in the boundary region; and a display image generation step of generating a display image for displaying the magnitude of the change in fluorescence intensity.
- The phototherapy program according to the present invention causes a phototherapy device, which generates information for confirming the therapeutic effect of irradiating a treatment site with therapeutic light that causes a drug to react, to execute: a tissue structure image acquisition step of acquiring a tissue structure image obtained by narrowband light irradiated at the irradiation position of the therapeutic light; a fluorescence image acquisition step of acquiring a fluorescence image obtained by excitation light irradiated at the irradiation position of the therapeutic light; a boundary region determination step of determining, using the tissue structure image, a boundary region where the tissue structure has changed; a fluorescence intensity change calculation step of calculating the magnitude of the change in fluorescence intensity in the boundary region; and a display image generation step of generating a display image for displaying the magnitude of the change in fluorescence intensity.
- FIG. 1 is a diagram showing a schematic configuration of an endoscope system according to Embodiment 1 of the present invention.
- FIG. 2 is a block diagram showing a schematic configuration of the endoscope system according to Embodiment 1 of the present invention.
- FIG. 3 is a diagram for explaining the configuration of the distal end of the endoscope according to the first embodiment of the present invention;
- FIG. 4 is a diagram illustrating the configuration of the imaging optical system of the endoscope according to Embodiment 1 of the present invention.
- FIG. 5 is a diagram for explaining an example of the wavelength band of light used as narrow-band light.
- FIG. 6 is a diagram showing an example of the flow of treatment using the endoscope according to the first embodiment of the present invention.
- FIG. 7 is a flowchart illustrating an example of processing of the processing device according to the first embodiment of the present invention.
- FIG. 8 is a diagram for explaining the areas divided by the boundary area determination.
- FIG. 9 is a diagram showing an example of changes in fluorescence intensity when the reaction rate is slow.
- FIG. 10 is a diagram showing an example of changes in fluorescence intensity when the reaction progresses at a high speed.
- FIG. 11 is a block diagram showing a schematic configuration of an endoscope system according to a modification of Embodiment 1 of the present invention.
- FIG. 12 is a diagram illustrating a configuration of an imaging optical system of an endoscope according to a modification of Embodiment 1 of the present invention.
- FIG. 13 is a block diagram showing a schematic configuration of an endoscope system according to Embodiment 2 of the present invention.
- FIG. 14 is a diagram illustrating a configuration of an imaging optical system of an endoscope according to Embodiment 2 of the present invention.
- FIG. 15 is a diagram schematically showing an image obtained by the first image sensor.
- FIG. 16 is a diagram schematically showing an image obtained by the third image sensor.
- FIG. 17 is a diagram for explaining the boundary area set by adding the image shown in FIG. 15 and the image shown in FIG. 16.
- FIG. 18 is a block diagram showing a schematic configuration of an endoscope system according to Embodiment 3 of the present invention.
- FIG. 19 is a diagram illustrating the configuration of an imaging optical system of an endoscope according to Embodiment 3 of the present invention.
- FIG. 20 is a diagram illustrating the configuration of an imaging optical system of an endoscope according to Embodiment 4 of the present invention.
- FIG. 1 is a diagram showing a schematic configuration of an endoscope system according to Embodiment 1 of the present invention.
- FIG. 2 is a block diagram showing a schematic configuration of the endoscope system according to the first embodiment.
- FIG. 3 is a diagram for explaining the configuration of the distal end of the endoscope according to the first embodiment.
- An endoscope system 1 shown in FIGS. 1 and 2 includes: an endoscope 2 that captures an in-vivo image of a subject when its distal end is inserted into the subject; a light source device 3 that generates the illumination light emitted from the distal end of the endoscope 2; a processing device 4 that performs predetermined signal processing on the imaging signal captured by the endoscope 2 and controls the overall operation of the endoscope system 1; a display device 5 that displays the in-vivo image generated by the signal processing of the processing device 4; and a treatment instrument device 6.
- The endoscope 2 includes: a flexible, elongated insertion section 21; an operation section 22 that is connected to the proximal end side of the insertion section 21 and receives input of various operation signals; and a universal cord 23 that extends from the operation section 22 in a direction different from the direction in which the insertion section 21 extends and contains various cables connected to the light source device 3 and the processing device 4.
- The insertion section 21 includes: a distal end section 24 containing an imaging device 244 in which pixels that generate signals by receiving and photoelectrically converting light are arranged two-dimensionally; a bendable bending portion 25 composed of a plurality of bending pieces; and an elongated flexible tubular portion 26 connected to the proximal end side of the bending portion 25.
- The insertion section 21 is inserted into the body cavity of the subject, and the imaging element 244 captures an image of an object such as living tissue at a position that external light cannot reach.
- The operation unit 22 includes: a bending knob 221 for bending the bending portion 25 in the vertical and horizontal directions; a treatment instrument insertion portion 222 for inserting treatment tools, such as a therapeutic light irradiation device, biopsy forceps, an electric scalpel, or an examination probe, into the body cavity of the subject; and a plurality of switches 223 serving as an operation input portion for inputting operation instruction signals to the processing device 4 as well as to peripheral functions such as air supply means, water supply means, and screen display control.
- a treatment instrument inserted from the treatment instrument insertion portion 222 is exposed from the opening via a treatment instrument channel (not shown) of the distal end portion 24 (see FIG. 3).
- The universal cord 23 incorporates at least a light guide 241 and a collective cable 245 that bundles one or more signal lines.
- The universal cord 23 branches at the end opposite to the side connected to the operating portion 22.
- A connector 231 detachable from the light source device 3 and a connector 232 detachable from the processing device 4 are provided at the branched ends of the universal cord 23.
- A part of the light guide 241 extends from the end of the connector 231.
- The universal cord 23 propagates the illumination light emitted from the light source device 3 to the distal end portion 24 via the connector 231 (light guide 241), the operating portion 22, and the flexible tube portion 26.
- The universal cord 23 transmits the image signal captured by the imaging device 244 at the distal end portion 24 to the processing device 4 via the connector 232.
- The collective cable 245 includes signal lines for transmitting imaging signals, signal lines for transmitting drive signals for driving the imaging element 244, and signal lines for sending and receiving information including unique information about the endoscope 2 (imaging element 244).
- Although electrical signals are transmitted through signal lines here, optical signals may be transmitted between the endoscope 2 and the processing device 4 instead.
- The distal end portion 24 includes: a light guide 241 made of glass fiber or the like that forms a light guide path for the light emitted by the light source device 3; an illumination lens 242 provided at the distal end of the light guide 241; an optical system 243 for condensing light; and an imaging element 244 that is provided at the image-forming position of the optical system 243, receives the light condensed by the optical system 243, photoelectrically converts it into an electric signal, and performs predetermined signal processing.
- the optical system 243 is configured using one or more lenses.
- The optical system 243 forms an observation image on the light-receiving surface of the imaging device 244.
- the optical system 243 may have an optical zoom function that changes the angle of view and a focus function that changes the focus.
- The imaging element 244 photoelectrically converts the light from the optical system 243 to generate an electric signal (image signal).
- The imaging device 244 has two image sensors: a first imaging element 244a and a second imaging element 244b.
- The first imaging element 244a and the second imaging element 244b each have a plurality of pixels arranged in a matrix, each pixel having a photodiode that accumulates electric charge corresponding to the amount of light and a capacitor that converts the charge transferred from the photodiode into a voltage level.
- Each pixel of the first imaging element 244a and the second imaging element 244b photoelectrically converts light incident through the optical system 243 to generate an electric signal.
- The generated electric signals are sequentially read out and output as image signals.
- The first imaging element 244a and the second imaging element 244b are realized using, for example, a CCD (Charge-Coupled Device) image sensor or a CMOS (Complementary Metal-Oxide-Semiconductor) image sensor.
- FIG. 4 is a diagram for explaining the configuration of the imaging optical system of the endoscope according to the first embodiment.
- the optical system 243 and the imaging device 244 are provided inside the distal end portion 24 .
- the optical system 243 has an objective lens 243a consisting of one or more optical elements, a dichroic mirror 243b, and a cut filter 243c.
- the cut filter 243c cuts light in the wavelength band of the excitation light.
- the excitation light here corresponds to light in the wavelength band for exciting the antibody drug in PIT.
- the optical system 243 may have a lens or the like in addition to the optical elements described above.
- a beam splitter such as a half mirror may be used instead of the dichroic mirror 243b.
- the light from the subject passes through the objective lens 243a and enters the dichroic mirror 243b.
- It is preferable that the distance from the light passing/turning position in the dichroic mirror 243b to the light-receiving surface of each imaging element be the same.
- The dichroic mirror 243b bends the optical path of light with wavelengths at or above that of the excitation light and passes light with wavelengths below it. In other words, the dichroic mirror 243b bends the optical paths of the excitation light that excites the subject and of the resulting fluorescence. Light passing through the dichroic mirror 243b enters the first imaging element 244a. Of the excitation light and fluorescence whose optical paths are bent by the dichroic mirror 243b, the excitation light is cut by the cut filter 243c, and the fluorescence enters the second imaging element 244b.
- The transmittance of the cut filter 243c for the excitation light is set to, for example, 0.1% or less. By setting the excitation light transmittance of the cut filter 243c to 0.1% or less, fluorescence can be selectively captured during excitation light illumination.
- The first imaging element 244a corresponds to the tissue structure image acquisition section.
- The cut filter 243c and the second imaging element 244b correspond to the fluorescence image acquisition section.
- The endoscope 2 has a memory (not shown) that stores an execution program and a control program for the imaging element 244 to perform various operations, as well as data including identification information of the endoscope 2.
- The identification information includes unique information (ID) of the endoscope 2, model year, spec information, transmission method, and the like.
- The memory may also temporarily store image data and the like generated by the imaging device 244.
- Next, the configuration of the light source device 3 will be described.
- The light source device 3 includes a light source section 31, an illumination control section 32, and a light source driver 33. Under the control of the illumination control section 32, the light source section 31 sequentially switches and emits illumination light toward the subject.
- The light source section 31 is configured using a light source, one or more lenses, and the like, and emits light (illumination light) by driving the light source.
- The light generated by the light source section 31 is emitted from the tip of the distal end section 24 toward the subject via the light guide 241.
- The light source section 31 has a white light source 311, a narrowband light source 312, and an excitation light source 313.
- The white light source 311 emits light (white light) having a wavelength band in the visible range.
- The white light source 311 is implemented using any light source such as an LED light source, a laser light source, a xenon lamp, or a halogen lamp.
- The narrowband light source 312 emits light (narrowband light) having a partial wavelength or wavelength band within the visible range.
- FIG. 5 is a diagram for explaining an example of the wavelength band of light used as narrowband light.
- The narrowband light includes light L_V in the wavelength band of 380 nm to 440 nm, light L_B in the wavelength band of 440 nm to 490 nm, light L_G in the wavelength band of 490 nm to 590 nm, light L_A in the wavelength band of 590 nm to 620 nm, and light L_R in the wavelength band of 620 nm to 780 nm, or a combination of some of these.
- Examples of narrowband light include light consisting of a wavelength band of 380 nm to 440 nm with a central wavelength of 415 nm and a wavelength band of 490 nm to 590 nm with a central wavelength of 540 nm, which is used for NBI (Narrow Band Imaging) observation.
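The band designations above can be captured in a small lookup helper. This is a convenience sketch only; the band names and boundaries come from the text, and ties at a shared boundary (e.g. 440 nm) are resolved here to the shorter-wavelength band by first match:

```python
# Narrowband designations from the text: L_V 380-440, L_B 440-490,
# L_G 490-590, L_A 590-620, L_R 620-780 (all in nm).
BANDS = [("L_V", 380, 440), ("L_B", 440, 490), ("L_G", 490, 590),
         ("L_A", 590, 620), ("L_R", 620, 780)]

def band_of(wavelength_nm):
    """Return the band name containing the wavelength, or None."""
    for name, lo, hi in BANDS:
        if lo <= wavelength_nm <= hi:
            return name
    return None

print(band_of(415))  # L_V, the NBI center wavelength 415 nm
print(band_of(540))  # L_G, the NBI center wavelength 540 nm
```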
- The narrowband light source 312 is implemented using an LED light source, a laser light source, or the like.
- As the excitation light, near-infrared light L_P with a central wavelength of 690 nm is used.
- With light in the wavelength band of 380 nm to 440 nm, blood vessels in the mucosal surface layer can be visualized with high contrast.
- With light in the wavelength band of 490 nm to 590 nm, blood vessels relatively deep below the mucosal surface layer can be visualized with high contrast.
- Light in the wavelength band of 440 nm to 490 nm is used, in addition to visualizing blood vessels, as reference light for generating an image for correcting fluorescence intensity, for example.
- In this case, either the dichroic mirror 243b of the optical system 243 is replaced with a half mirror, or the optical system 243 is left as it is and the electrical signal generated by the second image sensor 244b is used.
- The excitation light source 313 emits excitation light for exciting an excitation target (for example, an antibody drug in the case of PIT).
- The excitation light source 313 is implemented using a light source such as an LED light source or a laser light source.
- For example, near-infrared light L_P is used to excite the antibody drug in PIT.
- The illumination control unit 32 controls the amount of power supplied to the light source unit 31 based on the control signal (light control signal) from the processing device 4, and also controls which light source emits light and the drive timing of the light sources.
- The light source driver 33 causes the light source unit 31 to emit light by supplying current to the light sources.
- The processing device 4 includes an image processing section 41, a synchronization signal generation section 42, an input section 43, a control section 44, and a storage section 45.
- The image processing unit 41 receives, from the endoscope 2, image data captured by the imaging element 244 under illumination light of each color.
- The image processing unit 41 performs A/D conversion on the received signal to generate a digital imaging signal.
- When image data is received as an optical signal from the endoscope 2, the image processing unit 41 performs photoelectric conversion to generate digital image data.
- The image processing unit 41 performs predetermined image processing on the image data received from the endoscope 2 to generate an image and output it to the display device 5, to set an enhancement region determined based on the image, and to calculate temporal changes in fluorescence intensity.
- the image processing unit 41 has a boundary area determination unit 411 , a fluorescence intensity change calculation unit 412 , and a display image generation unit 413 .
- Based on a tissue structure image, which is an image formed by narrow-band light and generated from the imaging signal of the first imaging element 244a, the boundary region determination unit 411 determines the boundary between a portion where the tissue structure has changed and a portion where the tissue structure has not changed or has changed only slightly. By determining this boundary, the boundary region determination unit 411 determines the respective boundary regions of the portion where the tissue structure has changed and the portion where it has not changed or has changed only slightly.
- the fluorescence intensity change calculation unit 412 calculates the temporal change in fluorescence intensity for each boundary region based on the image formed by fluorescence, which is generated by the second imaging element 244b.
- the display image generation unit 413 generates an image by performing predetermined image processing.
- the image may be an image using white light or narrow-band light, an image indicating the boundary determined by the boundary region determination unit 411, an image corresponding to the amount of change calculated by the fluorescence intensity change calculation unit 412, or an image presenting the fluorescence intensity itself as visual information.
- the predetermined image processing includes synchronization processing, gradation correction processing, color correction processing, and the like. Synchronization processing is processing for synchronizing image data of each color component of RGB.
- Gradation correction processing is processing for correcting the gradation of image data.
- Color correction processing is processing for performing color tone correction on image data. Note that the display image generation unit 413 may adjust the gain according to the brightness of the image.
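- Gradation correction of this kind is commonly implemented as a gamma-style remapping of pixel values. The following is a minimal illustrative sketch, not the implementation of this disclosure; the function name, the 8-bit data assumption, and the gamma value are all assumptions made for illustration:

```python
import numpy as np

def gradation_correction(image: np.ndarray, gamma: float = 2.2) -> np.ndarray:
    """Gamma-style gradation correction for an 8-bit image.

    Pixel values are normalized to [0, 1], remapped by the power curve
    1/gamma (brightening mid-tones for gamma > 1), and rescaled to 8 bits.
    """
    normalized = image.astype(np.float64) / 255.0
    corrected = np.power(normalized, 1.0 / gamma)
    return np.clip(corrected * 255.0, 0, 255).astype(np.uint8)
```

In practice the same remapping is often precomputed as a 256-entry lookup table for speed; the closed-form version above is kept for clarity.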
- the image processing unit 41 is configured using a general-purpose processor such as a CPU (Central Processing Unit) or a dedicated processor such as various arithmetic circuits that execute specific functions such as an ASIC (Application Specific Integrated Circuit). Note that the image processing unit 41 may be configured to have a frame memory that holds the R image data, the G image data and the B image data.
- the synchronization signal generation unit 42 generates a clock signal (synchronization signal) that serves as a reference for the operation of the processing device 4, and transmits the generated synchronization signal to the light source device 3, the image processing unit 41, the control unit 44, and the endoscope 2.
- the synchronizing signal generated by the synchronizing signal generator 42 includes a horizontal synchronizing signal and a vertical synchronizing signal. Therefore, the light source device 3, the image processing section 41, the control section 44, and the endoscope 2 operate in synchronization with each other by the generated synchronization signal.
- the input unit 43 is implemented using, for example, a keyboard, a mouse, switches, or a touch panel, and receives inputs of various signals such as operation instruction signals for instructing the operation of the endoscope system 1.
- the input unit 43 may include a switch provided in the operation unit 22 or a portable terminal such as an external tablet computer.
- the control unit 44 performs drive control of each component including the imaging element 244 and the light source device 3, input/output control of information to each component, and the like.
- the control unit 44 refers to control information data (for example, readout timing) for imaging control stored in the storage unit 45, transmits it as a drive signal to the imaging element 244 via a predetermined signal line included in the collective cable 245, and switches between a normal observation mode, in which an image obtained by illumination with white light is observed, and a fluorescence observation mode, in which the fluorescence intensity of an excitation target is calculated.
- the control unit 44 is configured using a general-purpose processor such as a CPU or a dedicated processor such as various arithmetic circuits that execute specific functions such as an ASIC.
- the storage unit 45 stores data including various programs for operating the endoscope system 1 and various parameters necessary for operating the endoscope system 1 .
- the storage unit 45 also stores identification information of the processing device 4 .
- the identification information includes unique information (ID) of the processing device 4, model year, specification information, and the like.
- the storage unit 45 also stores various programs including an image acquisition processing program for executing the image acquisition processing method of the processing device 4 .
- Various programs can be recorded on computer-readable recording media such as hard disks, flash memories, CD-ROMs, DVD-ROMs, flexible disks, etc., and can be widely distributed.
- the various programs described above can also be obtained by downloading via a communication network.
- the communication network here is realized by, for example, an existing public line network, LAN (Local Area Network), WAN (Wide Area Network), etc., and it does not matter whether it is wired or wireless.
- the storage unit 45 having the above configuration is implemented using a ROM (Read Only Memory) in which various programs etc. are pre-installed, and a RAM, hard disk, etc. for storing calculation parameters, data, etc. for each process.
- the display device 5 displays a display image corresponding to the image signal received from the processing device 4 (image processing unit 41) via the video cable.
- the display device 5 is configured using a monitor such as liquid crystal or organic EL (Electro Luminescence).
- the treatment instrument device 6 has a treatment instrument operation section 61 and a flexible treatment instrument 62 extending from the treatment instrument operation section 61 .
- the treatment instrument 62 used for PIT emits light for treatment (hereinafter referred to as therapeutic light).
- the treatment instrument operation section 61 controls emission of therapeutic light from the treatment instrument 62 .
- the treatment instrument operation section 61 has an operation input section 611 .
- the operation input unit 611 is composed of, for example, switches.
- the treatment instrument operating section 61 causes the treatment instrument 62 to emit therapeutic light in response to an input (for example, depression of a switch) to the operation input section 611 .
- the light source that emits the therapeutic light may be provided in the treatment instrument 62 or may be provided in the treatment instrument operation section 61 .
- a light source is implemented using a semiconductor laser, an LED, or the like.
- therapeutic light is light in a wavelength band of 680 nm or more, for example, light with a central wavelength of 690 nm (for example, light L P shown in FIG. 5).
- The illumination optical system provided in the treatment instrument 62 can change the irradiation range of the therapeutic light. Under the control of the treatment instrument operation section 61, this illumination optical system, which is composed of, for example, an optical system with a variable focal length or a DMD (Digital Micromirror Device), can change the spot diameter of the light irradiated onto the subject and the shape of the irradiation range.
- FIG. 6 is a diagram showing an example of the flow of treatment using the endoscope according to the first embodiment of the present invention. FIG. 6 shows an example of implementation of PIT, in which treatment is performed by inserting the insertion portion 21 into the stomach ST.
- the operator inserts the insertion portion 21 into the stomach ST (see (a) in FIG. 6).
- the operator causes the light source device 3 to irradiate white light, and searches for the treatment position while observing the white light image inside the stomach ST displayed by the display device 5 .
- tumors B 1 and B 2 are to be treated as targets of treatment.
- the operator observes the white light image and determines the regions containing the tumors B 1 and B 2 as irradiation regions.
- the operator directs the distal end portion 24 toward the tumor B 1 , protrudes the treatment instrument 62 from the distal end of the endoscope 2, and irradiates the tumor B 1 with therapeutic light (see (b) of FIG. 6). Irradiation of the therapeutic light causes the antibody drug bound to the tumor B1 to react, and the tumor B1 is treated.
- the operator directs the distal end portion 24 toward the tumor B2, protrudes the treatment instrument 62 from the distal end of the endoscope 2 , and irradiates the tumor B2 with therapeutic light (see (c) of FIG . 6). Irradiation of the therapeutic light causes the antibody drug bound to the tumor B2 to react, and the tumor B2 is treated.
- the operator directs the distal end portion 24 toward the tumor B 1 and irradiates the tumor B 1 with excitation light from the distal end of the endoscope 2 (see (d) in FIG. 6).
- the operator confirms the therapeutic effect on tumor B 1 by observing the fluorescence intensity. The operator judges the therapeutic effect from a displayed image, which will be described later.
- the operator directs the distal end portion 24 toward the tumor B 2 and irradiates the tumor B 2 with excitation light from the distal end of the endoscope 2 (see (e) in FIG. 6).
- the operator confirms the therapeutic effect on Tumor B2 by observing the fluorescence intensity.
- the operator repeats additional irradiation of therapeutic light and confirmation of therapeutic effects as necessary.
- FIG. 7 is a flowchart illustrating an example of processing performed by the processing apparatus according to the first embodiment. FIG. 7, like FIG. 6, shows an example of the flow when performing PIT.
- a tissue structure image before treatment is acquired by irradiating the treatment position with narrow-band light from the distal end portion 24 (step S101: tissue structure image acquisition step).
- the processing device 4 generates a tissue structure image based on the imaging signal generated by the first imaging element 244a.
- Next, fluorescence before treatment is detected (step S102: fluorescence detection step).
- the endoscope 2 irradiates the subject with the excitation light, and the antibody drug before treatment is excited to emit fluorescence.
- the processing device 4 acquires an imaging signal (fluorescence image) generated by the second imaging element 244b.
- Next, a drug reaction step is performed (step S103: drug reaction step).
- a treatment that destroys cancer cells by activating the antibody drug by irradiation with near-infrared light, which is therapeutic light, is performed.
- After the treatment, a tissue structure image is acquired again (step S104: tissue structure image acquisition step).
- the processing device 4 generates a tissue structure image based on the imaging signal generated by the first imaging element 244a in the same manner as in step S101.
- Fluorescence after treatment is then detected (step S105: fluorescence detection step). In step S105, the processing device 4 acquires the imaging signal (fluorescence image) generated by the second imaging element 244b in the same manner as in step S102.
- the boundary region determination unit 411 determines the boundary region by detecting the boundary between the region with a fast reaction speed and the region with a slow reaction speed, using the tissue structure image acquired in step S101 and the tissue structure image acquired in step S104 (step S106: boundary region determination step). Note that the boundary region determination step may be performed before the fluorescence detection step, or may be performed simultaneously with the fluorescence detection step.
- the boundary area determination unit 411 determines the boundary area by, for example, either determination processing 1 or determination processing 2 below. Note that a known method other than the determination processes 1 and 2 may be used to determine the boundary area.
- In determination process 1, the boundary region determination unit 411 detects the temporal change between two tissue structure images acquired at different times, and determines, based on the amount of change, the region whose outer edge encloses the site where the tissue structure has changed as the boundary region.
- For example, the boundary region determination unit 411 compares the pixel values (brightness values) of the tissue structure image with a preset threshold to extract the region where the tissue structure has changed, and determines the extracted region, whose outer edge forms the boundary, as the boundary region.
- the threshold here may be a preset brightness value in a normal state (tumor-free state), or may be a brightness value of a tissue structure image acquired before treatment.
- In determination process 2, the boundary region determination unit 411 uses a feature amount learned in advance by machine learning to determine the region where the tissue structure has changed as the boundary region.
- the boundary area determination unit 411 calculates the feature amount of the acquired tissue structure image, and determines the boundary area using the calculated feature amount and the learning model.
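- Determination process 1 can be sketched as simple thresholding of the brightness change between two tissue structure images, followed by taking the outer edge of the changed region. The following is an illustrative sketch only; the function name and the threshold value are assumptions, and the disclosure does not specify an implementation:

```python
import numpy as np

def boundary_region_mask(before: np.ndarray, after: np.ndarray,
                         change_threshold: float = 30.0) -> np.ndarray:
    """Mark pixels whose tissue-structure brightness changed by more than
    the threshold between two acquisitions; the outer edge of the True
    region corresponds to the boundary of the boundary region."""
    diff = np.abs(after.astype(np.float64) - before.astype(np.float64))
    return diff > change_threshold
```

The resulting boolean mask can then be contoured with any standard edge-following routine to obtain the boundary line itself.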
- FIG. 8 is a diagram for explaining the areas divided by the boundary area determination.
- the boundary region determination unit 411 compares the tissue structure images, treats regions with large tissue changes as regions with a fast reaction speed and regions with small tissue changes as regions with a slow reaction speed, and determines the boundary between them as the boundary region. For example, the boundary region determination unit 411 sets the first region ROI 1 as a region with a slow reaction speed and the second region ROI 2 as a region with a fast reaction speed.
- FIG. 9 is a diagram showing an example of changes in fluorescence intensity when the reaction rate is slow.
- FIG. 10 is a diagram showing an example of changes in fluorescence intensity when the reaction progresses at a high speed.
- When the reaction rate is slow (for example, in the first region ROI 1 ), the attenuation rate of the fluorescence intensity Q 1 derived from the antibody drug is small, and high intensity is maintained over time (see FIG. 9).
- When the reaction rate is fast (for example, in the second region ROI 2 ), the attenuation rate of the antibody-drug-derived fluorescence intensity Q 2 is large (see FIG. 10).
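- The distinction between fast and slow decay can be quantified as the fractional attenuation of fluorescence intensity between two acquisitions. The sketch below is illustrative only; the function names and the cutoff value are assumptions, not taken from this disclosure:

```python
def attenuation_rate(intensity_before: float, intensity_after: float) -> float:
    """Fractional drop in fluorescence intensity between two time points."""
    return (intensity_before - intensity_after) / intensity_before

def reaction_speed(intensity_before: float, intensity_after: float,
                   cutoff: float = 0.5) -> str:
    """Label a region by its decay: a large drop (like Q2 in FIG. 10)
    suggests a fast reaction, a small drop (like Q1 in FIG. 9) a slow one."""
    if attenuation_rate(intensity_before, intensity_after) >= cutoff:
        return "fast"
    return "slow"
```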
- the fluorescence intensity change calculation unit 412 calculates a fluorescence intensity change using the fluorescence image acquired in step S102 and the fluorescence image acquired in step S105 (step S107: fluorescence intensity change calculation step).
- the fluorescence intensity change calculation unit 412 calculates a change in fluorescence intensity (difference value in fluorescence intensity between before and after treatment) for each region determined by the boundary region determination unit 411 .
- a known method such as pattern matching may be used to align the images before and after treatment.
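- The per-region calculation in the fluorescence intensity change calculation step can be sketched as the difference of mean intensities inside each boundary-region mask, assuming the before and after frames have already been aligned (for example, by pattern matching). The function and variable names are illustrative assumptions:

```python
import numpy as np

def fluorescence_change(fluo_before: np.ndarray, fluo_after: np.ndarray,
                        region_mask: np.ndarray) -> float:
    """Difference in mean fluorescence intensity (before minus after)
    inside one boundary region. A small value suggests the reaction has
    progressed little in that region."""
    before_mean = fluo_before[region_mask].mean()
    after_mean = fluo_after[region_mask].mean()
    return float(before_mean - after_mean)
```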
- the display image generation unit 413 generates an image to be displayed on the display device 5 (step S108).
- the display image generation unit 413 generates an image that visually expresses changes in fluorescence intensity.
- For example, the display image generation unit 413 generates an image in which visual information corresponding to the change in fluorescence intensity is superimposed on the tissue structure image, an image in which visual information corresponding to the temporal change in fluorescence intensity in each boundary region is superimposed on the tissue structure image together with the boundary line of the boundary region (for example, the first region ROI 1 ), or an image in which the temporal change in the fluorescence intensity of each boundary region (see, for example, FIGS. 9 and 10) is displayed alongside the tissue structure image.
- As the visual information corresponding to the fluorescence intensity, for example, the region where the amount of change in fluorescence intensity is small is given a color that is easy to recognize visually (a hue, color density, or the like that humans can easily distinguish).
- the display image allows, for example, visual recognition of the difference in fluorescence intensity change between different boundary regions (for example, the first region ROI 1 and the second region ROI 2 ).
- the display image generation unit 413 may generate an image consisting only of tissue structure, a white light image, or a fluorescence intensity image (intensity map).
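- Superimposing visual information on the tissue structure image can be sketched as alpha-blending an easily distinguishable color over a boundary region whose fluorescence change is small. This is an illustrative sketch only; the color, alpha, and threshold values are assumptions made for illustration:

```python
import numpy as np

def overlay_small_change(tissue_rgb: np.ndarray, region_mask: np.ndarray,
                         change: float, threshold: float = 10.0,
                         color=(0, 255, 0), alpha: float = 0.4) -> np.ndarray:
    """Blend a distinguishable color over a boundary region whose
    fluorescence change is small, flagging it as a candidate for
    additional therapeutic-light irradiation."""
    out = tissue_rgb.astype(np.float64)
    if change < threshold:
        tint = np.asarray(color, dtype=np.float64)
        out[region_mask] = (1.0 - alpha) * out[region_mask] + alpha * tint
    return out.astype(np.uint8)
```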
- the control unit 44 causes the display device 5 to display the image generated in step S108 (step S109: display step).
- the operator can confirm the therapeutic effect.
- the operator refers to the image to confirm the therapeutic effect, determine whether or not to additionally irradiate the therapeutic light, and determine the portion to be irradiated with the therapeutic light (for example, the first region ROI 1 ).
- the operator operates the input unit 43 to input the determination result.
- In step S110, the control unit 44 determines whether or not to additionally irradiate the therapeutic light. If the control unit 44 determines, based on the input determination result, that additional irradiation of therapeutic light is unnecessary (step S110: No), the process ends. On the other hand, when the control unit 44 determines that additional irradiation of therapeutic light is to be performed (step S110: Yes), the process proceeds to step S111.
- For additional irradiation, for example, the illumination optical system controls the shape of the light irradiation range to match the boundary region, or the operator adjusts the spot diameter and irradiates the therapeutic light.
- the control unit 44 determines whether or not the amount of irradiated light in the region where additional therapeutic light irradiation is performed is within the allowable range (step S111).
- the allowable range is a preset amount of light, and at least an upper limit value is set. This upper limit is a value set to suppress tissue damage due to excessive irradiation.
- the control unit 44 determines whether or not the amount of light (accumulated light amount value) that has been applied to the target area exceeds, for example, an upper limit value.
- When the control unit 44 determines that the irradiated light amount is within the allowable range (at or below the upper limit) (step S111: Yes), the process proceeds to step S112. When the control unit 44 determines that the irradiated light amount exceeds the allowable range (upper limit) (step S111: No), the process proceeds to step S113.
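- The check in step S111 amounts to comparing the cumulative irradiated light amount against the preset upper limit. A minimal sketch (the function name is an assumption; the disclosure only specifies the comparison itself):

```python
def within_allowable_range(accumulated_light: float, upper_limit: float) -> bool:
    """True while the cumulative irradiated light amount is at or below
    the preset upper limit, which is set to suppress tissue damage due to
    excessive irradiation (step S111)."""
    return accumulated_light <= upper_limit
```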
- In step S112, the control unit 44 sets an irradiation region for additional irradiation. After setting the irradiation region, the control unit 44 returns to step S103 and repeats the process.
- In step S113, the control unit 44 outputs an alert indicating that the irradiation light amount exceeds the allowable range. This alert may be displayed as character information on the display device 5, may be output as sound or light, or may be a combination of these. After outputting the alert, the control unit 44 terminates the process.
- As described above, in Embodiment 1, a tissue structure image is acquired with narrow-band light, regions with different reaction speeds (boundary regions) are divided according to tissue changes before and after treatment, and the change in fluorescence intensity in each region is calculated. At this time, by displaying the boundary regions or the change in fluorescence intensity for each boundary region, the operator can determine, for each region, whether additional therapeutic light irradiation is necessary. According to Embodiment 1, since additional irradiation of therapeutic light can be performed on a specific region, light irradiation can be appropriately performed on the treatment region.
- Furthermore, in Embodiment 1, the cumulative light amount of therapeutic light applied to a region is compared with the allowable range, and if the cumulative light amount exceeds the allowable range, an alert to that effect is output. According to Embodiment 1, tissue damage due to excessive irradiation of therapeutic light can be suppressed.
- Note that the first imaging element 244a may be configured using a multiband image sensor to individually acquire light in a plurality of different wavelength bands. For example, by separately acquiring scattered and returned light in the 380 nm to 440 nm band and scattered and returned light in the 490 nm to 590 nm band with a multiband image sensor and generating a narrow-band light image for each, blood vessel images at different depths from the mucosal surface can be acquired, and changes in blood vessels and tissue at each depth can be used to determine the boundary region with higher accuracy.
- FIG. 11 is a block diagram showing a schematic configuration of an endoscope system according to a modification of Embodiment 1 of the present invention. An endoscope system 1A according to this modification includes an endoscope 2A instead of the endoscope 2 of the endoscope system 1 according to Embodiment 1. Since the configuration other than the endoscope 2A is the same, its description is omitted.
- the endoscope 2A includes a distal end portion 24A instead of the distal end portion 24 of the endoscope 2. Since the configuration other than the distal end portion 24A is the same as that of the endoscope 2, the description is omitted.
- The distal end portion 24A includes the light guide 241, the illumination lens 242, an optical system 243A for condensing light, and an imaging element 244A that is provided at the image forming position of the optical system 243A, receives the light condensed by the optical system 243A, photoelectrically converts it into an electric signal, and performs predetermined signal processing.
- FIG. 12 is a diagram for explaining the configuration of the imaging optical system of the endoscope according to the modified example of Embodiment 1 of the present invention.
- the optical system 243A and the imaging element 244A are provided inside the distal end portion 24A.
- the optical system 243A includes an objective lens 2430, a first lens 2431 consisting of one or more optical elements, a second lens 2432 consisting of one or more optical elements, a third lens 2433 consisting of one or more optical elements, a cut filter 2434, and a fourth lens 2435 consisting of one or more optical elements.
- the cut filter 2434 cuts light in the wavelength band of the excitation light.
- the excitation light here corresponds to light in the wavelength band for exciting the antibody drug in PIT.
- the second lens 2432 and the fourth lens 2435 form observation images at positions different from each other on the imaging device 244A and at positions that do not overlap each other.
- the excitation light transmittance of the cut filter 2434 is set to 0.1% or less. By setting the transmittance of the excitation light to 0.1% or less, fluorescence can be selectively taken in, for example, during excitation light illumination.
- the imaging element 244A photoelectrically converts the light from the optical system 243A to generate an electric signal (image signal).
- the image sensor 244A has a plurality of pixels arranged in a matrix, each having a photodiode that accumulates electric charge according to the amount of light, a capacitor that converts the electric charge transferred from the photodiode into a voltage level, and the like.
- Each pixel photoelectrically converts the light from the optical system 243A to generate an electric signal, which is output as an image signal.
- the imaging element 244A is implemented using, for example, a CCD image sensor or a CMOS image sensor.
- Lights L 3 and L 4 from the object pass through the objective lens 2430 and enter the first lens 2431 and the third lens 2433, respectively.
- Light L 3 incident on the first lens 2431 is imaged by the second lens 2432 .
- Light L 4 incident on the third lens 2433 passes through a cut filter 2434 and is imaged by a fourth lens 2435 .
- the second lens 2432 forms an observation image on the first imaging section 244c of the imaging element 244A.
- the fourth lens 2435 forms an observation image on the second imaging section 244d of the imaging element 244A.
- the first imaging section 244c and the second imaging section 244d are formed by dividing the light receiving area of the imaging device into two.
- When performing PIT, the processing device 4 executes processing according to the flow of FIG. 7, with the first imaging element 244a read as the first imaging section 244c and the second imaging element 244b read as the second imaging section 244d.
- As described above, in this modification as well, a tissue structure image is acquired with narrow-band light, regions with different reaction speeds (boundary regions) are divided according to tissue changes before and after treatment, the change in fluorescence intensity in each region is calculated, and the boundary regions or the change in fluorescence intensity for each boundary region is displayed, allowing the operator to determine, for each region, whether additional therapeutic light irradiation is necessary. According to this modification, since additional irradiation of therapeutic light can be performed on a specific region, light irradiation can be appropriately performed on the treatment region.
- FIG. 13 is a block diagram showing a schematic configuration of an endoscope system according to Embodiment 2 of the present invention.
- An endoscope system 1B according to the second embodiment includes an endoscope 2B and a processing device 4A instead of the endoscope 2 and the processing device 4 of the endoscope system 1 according to the first embodiment. Since the configuration other than the endoscope 2B and the processing device 4A is the same, its description is omitted.
- the endoscope 2B includes a distal end portion 24B in place of the distal end portion 24 of the endoscope 2. Since the configuration other than the distal end portion 24B is the same as that of the endoscope 2, the description is omitted.
- The distal end portion 24B includes the light guide 241, the illumination lens 242, an optical system 243B for condensing light, and an imaging element 244B that is provided at the image forming position of the optical system 243B, receives the light condensed by the optical system 243B, photoelectrically converts it into an electric signal, and performs predetermined signal processing.
- FIG. 14 is a diagram explaining the configuration of the imaging optical system of the endoscope according to Embodiment 2 of the present invention.
- the optical system 243B and the imaging element 244B are provided inside the distal end portion 24B.
- the optical system 243B has an objective lens 243a, a dichroic mirror 243b (hereinafter referred to as the first dichroic mirror 243b), a cut filter 243c, and a second dichroic mirror 243d.
- the cut filter 243c cuts light in the wavelength band of the excitation light.
- the second dichroic mirror 243d bends the optical path of light in the wavelength band of the blue component, for example, light in the wavelength band of 490 nm or less, and passes light in the wavelength bands of the other components (for example, the green and red components).
- the optical system 243B may have a lens or the like in addition to the optical elements described above.
- the first dichroic mirror 243b bends the optical path of light (light L 2 ) having a wavelength equal to or greater than the fluorescence wavelength emitted by the subject, and passes light (light L 1 ) having a wavelength less than the fluorescence wavelength.
- the light (light L 1 ) that has passed through the first dichroic mirror 243b enters the second dichroic mirror 243d.
- the cut filter 243c cuts the excitation light from the light (light L 2 ) whose optical path is bent by the first dichroic mirror 243b, and the fluorescence enters the second imaging element 244b.
- the second dichroic mirror 243d bends the optical path of the light (light L 12 ) including the return light of the narrow-band light in the wavelength band of 440 nm or more and 490 nm or less, and passes the light of color components other than the blue component (for example, components with wavelengths longer than 490 nm) (light L 11 ).
- the light (light L 11 ) that has passed through the second dichroic mirror 243d enters the first imaging element 244a.
- the light (light L 12 ) whose optical path is bent by the second dichroic mirror 243d enters the third imaging element 244e.
- the imaging element 244B photoelectrically converts the light from the optical system 243B to generate an electric signal (image signal).
- the imaging element 244B has three imaging elements (first imaging element 244a, second imaging element 244b, and third imaging element 244e).
- the first imaging element 244a to the third imaging element 244e are implemented using, for example, a CCD image sensor or a CMOS image sensor.
- the processing device 4A includes an image processing unit 41A, a synchronization signal generation unit 42, an input unit 43, a control unit 44, and a storage unit 45.
- the image processing unit 41A receives, from the endoscope 2, image data of illumination light of each color captured by the imaging device 244B.
- the image processing unit 41A performs predetermined image processing on the image data received from the endoscope 2B to generate an image and output it to the display device 5, to set a boundary region determined based on the image, and to calculate changes in fluorescence intensity over time.
- the image processing unit 41A has a boundary area determination unit 411, a fluorescence intensity change calculation unit 412, a display image generation unit 413, a specific area intensity calculation unit 414, and a fluorescence intensity normalization unit 415.
- the display image generation unit 413 generates a white light image based on the electrical signals generated by the first imaging element 244a and the third imaging element 244e.
- the specific region intensity calculator 414 calculates the light intensity of a specific wavelength band. In the second embodiment, the intensity of the light (light L 12 ) in the wavelength band of the blue component is calculated. The specific area intensity calculation unit 414 calculates the light intensity of the blue component based on the electrical signal generated by the third imaging element 244e.
- the fluorescence intensity normalization unit 415 normalizes the intensity change by dividing the intensity change calculated by the fluorescence intensity change calculation unit 412 by the light intensity of the blue component calculated by the specific area intensity calculation unit 414 .
- the processing device 4A executes processing according to the flow of FIG. 7 when performing PIT.
- In the fluorescence detection step (step S105), the subject is irradiated with narrow-band light of 440 nm to 490 nm in addition to the excitation light. The specific region intensity calculator 414 therefore calculates the light intensity of the return light of the 440 nm to 490 nm narrow-band light.
- the narrow-band light may be irradiated at a timing different from the fluorescence detection step.
- the fluorescence intensity change normalized by the fluorescence intensity normalization unit 415 is calculated.
- the boundary area determination unit 411 may determine the boundary area based on the electrical signal generated by the first image sensor 244a, based on the electrical signal generated by the third image sensor 244e, or based on the electrical signals generated by both the first imaging element 244a and the third imaging element 244e.
- FIG. 15 is a diagram schematically showing an image obtained by the first image sensor.
- FIG. 16 is a diagram schematically showing an image obtained by the third image sensor.
- the image obtained by the first imaging element 244a is based on an image formed by light in wavelength bands excluding fluorescent components and blue components.
- the image obtained by the third imaging element 244e is based on the image formed by the light in the wavelength band of the blue component.
- the image shown in FIG. 15 is obtained by the first imaging element 244a and the image shown in FIG. 16 is obtained by the third imaging element 244e.
- the X-axis and Y-axis shown in FIGS. 15 and 16 are attached to indicate the relative positional relationship of each image.
- the images shown in FIGS. 15 and 16 are images based on light in different wavelength bands (the wavelength band of the blue component, and the wavelength bands other than the blue component and fluorescence), and depict different tissue structures. Specifically, blood vessels at different depths from the tissue surface are rendered. In FIGS. 15 and 16, it is assumed that tissue structure images are drawn in the photodetection regions R1 and R2, respectively.
- the boundary region determination unit 411 determines boundary regions with different degrees of change in tissue structure from an image obtained by the first imaging element 244a (for example, the image shown in FIG. 15; hereinafter sometimes referred to as the first image) and an image obtained by the third imaging element 244e (for example, the image shown in FIG. 16; hereinafter sometimes referred to as the second image).
- FIG. 17 is a diagram for explaining the boundary area set by adding the image shown in FIG. 15 and the image shown in FIG. 16.
- the boundary area determination unit 411 synthesizes the first image and the second image, extracts the outline of the synthesized image by contour extraction or the like, and determines the extracted outline as the boundary area.
- dashed line R3 is set as the boundary region.
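A minimal numpy sketch of this synthesize-and-extract-outline step, with a simple threshold-plus-neighborhood rule standing in for a full contour-extraction algorithm (the threshold value and all names are illustrative assumptions):

```python
import numpy as np

def boundary_region(first_image, second_image, threshold):
    """Combine two narrow-band tissue-structure images and return the
    outline of the combined structures as a boolean boundary mask.

    `first_image` / `second_image` are 2-D arrays depicting, e.g., vessels
    at different depths; `threshold` separating structure from background
    is an assumed parameter.
    """
    combined = first_image.astype(float) + second_image.astype(float)
    mask = combined > threshold  # pixels belonging to tissue structure
    # A boundary pixel is a structure pixel with at least one
    # 4-neighbour outside the structure.
    padded = np.pad(mask, 1, constant_values=False)
    interior = (padded[:-2, 1:-1] & padded[2:, 1:-1]
                & padded[1:-1, :-2] & padded[1:-1, 2:])
    return mask & ~interior

# Tiny example: a 5x5 block of structure yields its 16-pixel outline only.
img = np.zeros((7, 7))
img[1:6, 1:6] = 1.0
outline = boundary_region(img, img, threshold=1.5)
```

In practice the extracted outline (the dashed line R3 in FIG. 17) would be overlaid on the display image rather than returned as a raw mask.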
- according to the second embodiment, a tissue structure image is acquired with narrow-band light, regions with different reaction speeds (boundary regions) are divided according to changes in the tissue before and after treatment, the change in fluorescence intensity of each region is calculated, and the boundary region or the change in fluorescence intensity for each boundary region is displayed, so that the operator can judge, for each region, whether additional irradiation of therapeutic light is necessary.
- since additional irradiation of therapeutic light can then be performed on such a region, light irradiation can be appropriately performed on the therapeutic region.
- even if the distance between the endoscope 2B (distal end 24B) and the subject changes, the change in fluorescence intensity can be properly grasped.
- the narrow-band light acquired for normalization is not limited to the wavelength band of 440 nm to 490 nm; other wavelength bands may be used.
- light in the wavelength band of 440 nm to 490 nm is little affected by absorption derived from blood components and is dominated by scattered light from living tissue. Since the intensity of the scattered light from the tissue depends only on the distance, this band is suitable for canceling, by division or the like, the variation in fluorescence intensity due to distance.
- FIG. 18 is a block diagram showing a schematic configuration of an endoscope system according to Embodiment 3 of the present invention.
- An endoscope system 1C according to the third embodiment includes a processing device 4A instead of the processing device 4 of the endoscope system 1 according to the first embodiment.
- the distal end portion 24 includes an optical system 243 and an imaging element 244 similar to those in Embodiment 1, but the first imaging element 244a is configured by a multiband image sensor and is described as generating an electrical signal individually for each color component.
- FIG. 19 is a diagram illustrating the configuration of an imaging optical system of an endoscope according to Embodiment 3 of the present invention.
- the light reflected or scattered from the subject includes, for example, narrow-band red component light LR with a central wavelength of 660 nm, amber component light LA with a central wavelength of 590 nm, green component light LG with a central wavelength of 525 nm, blue component light LB with a central wavelength of 480 nm, violet component light LV with a central wavelength of 380 nm, excitation light (for example, light LP shown in FIG. 5), and fluorescence (light LT).
- the light LT, from which the excitation light has been cut by the cut filter 243c, is incident on the second imaging element 244b.
- the specific region intensity calculation unit 414 calculates the light intensity using the electrical signal generated based on the blue component light (light LB) among the electrical signals generated by the first imaging element 244a.
- the processing device 4A executes processing according to the flow of FIG. 7 when performing PIT.
- in the fluorescence intensity change calculation step (step S107), the fluorescence intensity change normalized by the fluorescence intensity normalization unit 415 is calculated.
- the boundary region determination unit 411 may determine the boundary region using an electrical signal based on the blue component light among the electrical signals generated by the first imaging element 244a.
- the boundary region may instead be determined using electrical signals based on light components other than the blue component, or based on the electrical signals of all color components generated by the first imaging element 244a.
- the electrical signals of all color components correspond to electrical signals generated via a plurality of filters included in the multiband image sensor, the filters differing in the wavelength band of light they receive or transmit.
- according to the third embodiment, a tissue structure image is acquired with narrow-band light, regions with different reaction speeds (boundary regions) are divided according to changes in the tissue before and after treatment, the change in fluorescence intensity of each region is calculated, and the boundary region or the change in fluorescence intensity for each boundary region is displayed, so that the operator can judge, for each region, whether additional irradiation of therapeutic light is necessary.
- since additional irradiation of therapeutic light can then be performed on such a region, light irradiation can be appropriately performed on the therapeutic region.
- in the third embodiment, the first imaging element 244a is described as generating electrical signals individually for each color component, but an electrical signal based on the return light of the narrow-band light and an electrical signal based on light components other than the return light may be generated separately.
- FIG. 20 is a block diagram showing a schematic configuration of an endoscope system according to Embodiment 4 of the present invention.
- An endoscope system 1D according to the fourth embodiment has the same configuration as the endoscope system 1 according to the first embodiment.
- the processing device 4 is electrically connected to the treatment instrument device 6 , and the controller 44 controls emission of therapeutic light from the treatment instrument 62 .
- the processing device 4 executes processing according to the flow of FIG. 7 when performing PIT.
- the control unit 44 controls the irradiation range, irradiation timing, and irradiation time of the therapeutic light. Specifically, the control unit 44 sets, for example, the light intensity (output value) and the irradiation time corresponding to the preset irradiation light amount for the irradiation range set by the operator.
- the control unit 44 starts irradiation control of the treatment light with the pressing of the switch of the operation input unit 611 as a trigger.
- when performing additional irradiation, the control unit 44 sets the shape of the irradiation range of the therapeutic light emitted from the treatment instrument 62 according to the target boundary region, and starts irradiation control of the therapeutic light with the pressing of the switch of the operation input unit 611 as a trigger. Note that the control unit 44 may determine whether the cumulative amount of irradiation light in the irradiation target area has exceeded a preset upper limit value, and output an alert if it has.
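The cumulative-dose check described above might be sketched as follows; the class name, units, and limit value are illustrative assumptions rather than the disclosed implementation:

```python
from dataclasses import dataclass, field

@dataclass
class IrradiationController:
    """Tracks cumulative irradiation light amount per boundary region and
    signals an alert when a preset upper limit is exceeded.

    Region names, units (e.g. J/cm^2) and the limit are illustrative.
    """
    dose_limit: float
    cumulative: dict = field(default_factory=dict)

    def irradiate(self, region, intensity, duration):
        # Irradiation light amount = light intensity (output value) x time.
        dose = intensity * duration
        self.cumulative[region] = self.cumulative.get(region, 0.0) + dose
        # Return True (alert) if the cumulative amount exceeds the limit.
        return self.cumulative[region] > self.dose_limit

ctrl = IrradiationController(dose_limit=100.0)
ctrl.irradiate("R3", intensity=10.0, duration=5.0)          # 50 -> no alert
alert = ctrl.irradiate("R3", intensity=10.0, duration=6.0)  # 110 -> alert
```

Keeping a per-region tally rather than a global one matches the per-boundary-region irradiation control described in the text.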
- according to the fourth embodiment, a tissue structure image is acquired with narrow-band light, regions with different reaction speeds (boundary regions) are divided according to changes in the tissue before and after treatment, the change in fluorescence intensity of each region is calculated, and the boundary region or the change in fluorescence intensity for each boundary region is displayed, so that the operator can judge, for each region, whether additional irradiation of therapeutic light is necessary. Since additional irradiation of therapeutic light can then be performed on such a region, light irradiation can be appropriately performed on the therapeutic region.
- furthermore, since the control unit 44 performs emission control of the therapeutic light emitted from the treatment tool 62, the operator does not need to adjust the irradiation range of the therapeutic light according to the boundary region, and the therapeutic light can be applied to a specific area.
- the excitation light and the treatment light may be in the same wavelength band (same center wavelength) or different wavelength bands (center wavelength).
- the treatment light (excitation light) may be emitted from the treatment tool 62 or the excitation light source 313, and either the excitation light source 313 or the treatment tool 62 may be omitted.
- although the light source device 3 is separate from the processing device 4 in the above-described embodiments, the light source device 3 and the processing device 4 may be integrated. Further, although the embodiments describe an example in which the therapeutic light is emitted by the treatment tool, the light source device 3 may instead be configured to emit the therapeutic light.
- the endoscope system according to the present invention has been explained as the endoscope system 1 using the flexible endoscope 2 whose observation target is biological tissue in a subject. However, it can also be applied to endoscope systems using rigid endoscopes, industrial endoscopes for observing material properties, fiberscopes, and optical endoscopes in which a camera head is connected to the eyepiece of an optical viewing tube.
- the phototherapy device, phototherapy method, and phototherapy program according to the present invention are useful for appropriately irradiating the treatment area with light.
- Reference Signs List: 1, 1A to 1D endoscope system; 2, 2A, 2B endoscope; 3 light source device; 4, 4A processing device; 5 display device; 6 treatment instrument device; 21 insertion section; 22 operation section; 23 universal cord; 24 distal end section; 25 bending section; 26 flexible tube section; 31 light source unit; 32 illumination control unit; 33 light source driver; 41 image processing unit; 42 synchronization signal generation unit; 43 input unit; 44 control unit; 45 storage unit; 61 treatment instrument operation unit; 62 treatment instrument; 241 light guide; 242 illumination lens; 243, 243A optical system; 243a, 2430 objective lens; 243b dichroic mirror (first dichroic mirror); 243c, 2434 cut filter; 243d second dichroic mirror; 244, 244A, 244B imaging element; 244a first imaging element; 244b second imaging element; 244c first imaging section; 244d second imaging section; 244e third imaging element; 311 white light source; 312 narrow-band light source; 313 excitation light source; 411 boundary area determination unit; 412 fluorescence intensity change calculation unit; 413 display image generation unit; 414 specific region intensity calculation unit; 415 fluorescence intensity normalization unit; 2431 first lens; 2432 second lens; 2433 third lens; 2435 fourth lens
Description
FIG. 1 is a diagram showing a schematic configuration of an endoscope system according to Embodiment 1 of the present invention. FIG. 2 is a block diagram showing a schematic configuration of the endoscope system according to Embodiment 1. FIG. 3 is a diagram explaining the distal end configuration of the endoscope according to Embodiment 1.
The first imaging element 244a corresponds to a tissue structure image acquisition unit, and the cut filter and the second imaging element 244b correspond to a fluorescence image acquisition unit.
When exciting the PIT antibody drug, near-infrared light LP with a central wavelength of, for example, 690 nm is used.
Light in the wavelength band of 440 nm to 490 nm is used not only for visualizing blood vessels but also, for example, as reference light for generating an image for correcting the fluorescence intensity.
When light in the wavelength band of 620 nm to 780 nm is used, either the dichroic mirror 243b of the optical system 243 is replaced with a half mirror, or the optical system 243 is left as it is and the electrical signal generated by the second imaging element 244b is used.
Therefore, the light source device 3, the image processing unit 41, the control unit 44, and the endoscope 2 operate in synchronization with one another based on the generated synchronization signal.
Here, the illumination optical system of the treatment instrument 62 can change the irradiation range of the therapeutic light. For example, it is configured with an optical system whose focal length can be changed, a DMD (Digital Micromirror Device), or the like, and, under the control of the treatment instrument operation unit 61, can change the spot diameter of the light irradiated onto the subject and the shape of the irradiation range.
The boundary region determination unit 411 detects a temporal change between two tissue structure images acquired at different times, and, based on the amount of the temporal change, determines as the boundary region a region bounded by the outer edge containing the site where the tissue structure has changed. For example, the boundary region determination unit 411 compares the values (luminance values) of the tissue structure image with a preset threshold to extract the region of the site where the tissue structure has changed, and determines the region bounded by the outer edge of the extracted region as the boundary region. The threshold here may be a preset luminance value of a normal state (tumor-free state), or may be the luminance value of a tissue structure image acquired before treatment.
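A minimal sketch of this boundary determination by temporal change: two tissue structure images acquired at different times are compared per pixel against a preset threshold (the threshold value and all names are illustrative assumptions):

```python
import numpy as np

def changed_region(before, after, change_threshold):
    """Detect the site where tissue structure changed between two tissue
    structure images acquired at different times, by thresholding the
    per-pixel luminance change.

    `change_threshold` is an assumed parameter; per the text it could
    equally be a preset 'normal state' luminance value.
    """
    diff = np.abs(after.astype(float) - before.astype(float))
    return diff > change_threshold

# Example: a 2x2 patch darkens after treatment and is flagged as changed.
before = np.full((4, 4), 100.0)
after = before.copy()
after[1:3, 1:3] = 40.0  # tissue structure changed here
region = changed_region(before, after, change_threshold=30.0)
```

The outer edge of the flagged region would then be taken as the boundary of the boundary region, as described above.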
Alternatively, the boundary region determination unit 411 may determine the region of the site where the tissue structure has changed as the boundary region using feature values calculated in advance by machine learning. In that case, the boundary region determination unit 411 calculates feature values of the acquired tissue structure image and determines the boundary region using the calculated feature values and a learning model.
When performing additional irradiation, the therapeutic light is applied, for example, by controlling the illumination optical system so that the shape of the light irradiation range matches the boundary region, or by the operator adjusting the spot diameter.
Next, a modification of Embodiment 1 will be described with reference to FIGS. 11 and 12. FIG. 11 is a block diagram showing a schematic configuration of an endoscope system according to the modification of Embodiment 1 of the present invention. An endoscope system 1A according to this modification includes an endoscope 2A in place of the endoscope 2 of the endoscope system 1 according to Embodiment 1. Since the configuration other than the endoscope 2A is the same, its description is omitted.
Next, Embodiment 2 will be described with reference to FIGS. 13 and 14. FIG. 13 is a block diagram showing a schematic configuration of an endoscope system according to Embodiment 2 of the present invention. An endoscope system 1B according to Embodiment 2 includes an endoscope 2B and a processing device 4A in place of the endoscope 2 and the processing device 4 of the endoscope system 1 according to Embodiment 1. Since the configuration other than the endoscope 2B and the processing device 4A is the same, its description is omitted.
Next, Embodiment 3 will be described with reference to FIGS. 18 and 19. FIG. 18 is a block diagram showing a schematic configuration of an endoscope system according to Embodiment 3 of the present invention. An endoscope system 1C according to Embodiment 3 includes a processing device 4A in place of the processing device 4 of the endoscope system 1 according to Embodiment 1. The distal end portion 24 includes an optical system 243 and an imaging element 244 similar to those in Embodiment 1, but the first imaging element 244a is configured by a multiband image sensor and is described as generating an electrical signal individually for each color component.
Next, Embodiment 4 will be described with reference to FIG. 20. FIG. 20 is a block diagram showing a schematic configuration of an endoscope system according to Embodiment 4 of the present invention. An endoscope system 1D according to Embodiment 4 has the same configuration as the endoscope system 1 according to Embodiment 1. In the endoscope system 1D, the processing device 4 is electrically connected to the treatment instrument device 6, and the control unit 44 controls the emission of therapeutic light from the treatment instrument 62.
A phototherapy method including:
a step of inserting the distal end of an endoscope to a treatment target site;
a step of irradiating the treatment target site with therapeutic light to cause a drug bound to the treatment target site to react;
a step of determining, as a boundary region, a region where the tissue structure has changed, using a tissue structure image obtained with narrow-band light irradiated onto the treatment target site;
a step of calculating a change in fluorescence intensity of the boundary region;
a step of determining, based on the change in fluorescence intensity, whether to perform additional irradiation of therapeutic light;
a step of irradiating the therapeutic light onto a region requiring the additional irradiation; and
a step of calculating a change in fluorescence intensity of the boundary region after the additional irradiation.
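The sequence of steps above can be sketched as a control loop. The device interface and all method names below are hypothetical stand-ins, and a simulated device replaces real hardware:

```python
class SimulatedDevice:
    """Minimal stand-in for the endoscope/treatment-tool interface; every
    method name and the fluorescence model are illustrative assumptions."""
    def __init__(self):
        self.fluorescence = {"R1": 1.0, "R2": 1.0}
    def insert_to_target(self):
        pass
    def irradiate_therapeutic_light(self, region=None):
        # Irradiation makes the bound drug react, reducing fluorescence.
        targets = [region] if region else list(self.fluorescence)
        for r in targets:
            self.fluorescence[r] *= 0.5
    def acquire_tissue_image(self):
        return None  # narrow-band tissue structure image (omitted)
    def determine_boundary_regions(self, image):
        return list(self.fluorescence)
    def fluorescence_change(self, region):
        return 1.0 - self.fluorescence[region]
    def needs_more(self, change):
        return change < 0.8  # assumed sufficiency criterion

def phototherapy_session(device, max_rounds=5):
    """Sketch of the claimed flow: treat, then repeat image -> boundary
    regions -> fluorescence change -> additional irradiation as needed."""
    device.insert_to_target()
    device.irradiate_therapeutic_light()
    changes = {}
    for _ in range(max_rounds):
        image = device.acquire_tissue_image()
        regions = device.determine_boundary_regions(image)
        changes = {r: device.fluorescence_change(r) for r in regions}
        todo = [r for r in regions if device.needs_more(changes[r])]
        if not todo:
            break
        for r in todo:
            device.irradiate_therapeutic_light(region=r)
    return changes

final = phototherapy_session(SimulatedDevice())
```

The per-region loop mirrors the claimed method: additional irradiation is repeated only for boundary regions whose fluorescence change has not yet reached the criterion.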
Claims (12)
- A phototherapy device comprising:
a therapeutic light emitting device that emits therapeutic light for causing a drug to react;
a tissue structure image acquisition unit that acquires a tissue structure image obtained with narrow-band light irradiated onto the irradiation position of the therapeutic light;
a fluorescence image acquisition unit that acquires a fluorescence image obtained with excitation light irradiated onto the irradiation position of the therapeutic light;
a boundary region determination unit that determines, using the tissue structure image, a boundary region where the tissue structure has changed;
a fluorescence intensity change calculation unit that calculates a magnitude of a change in fluorescence intensity of the boundary region; and
a display image generation unit that generates a display image for displaying the magnitude of the change in fluorescence intensity.
- The phototherapy device according to claim 1, wherein the boundary region determination unit detects a temporal change in the tissue structure image and determines, based on an amount of the temporal change, a region of a site where the tissue structure has changed as the boundary region.
- The phototherapy device according to claim 2, wherein the boundary region determination unit determines the region of the site where the tissue structure has changed as the boundary region by comparing values of the tissue structure image with a preset threshold.
- The phototherapy device according to claim 1, wherein the boundary region determination unit determines a region of a site where the tissue structure has changed as the boundary region using feature values calculated in advance by machine learning.
- The phototherapy device according to claim 1, wherein the tissue structure image acquisition unit acquires a tissue structure image obtained with the narrow-band light in a wavelength band of 380 nm to 440 nm.
- The phototherapy device according to claim 1, further comprising a fluorescence intensity normalization unit that normalizes the fluorescence intensity calculated by the fluorescence intensity change calculation unit using the light intensity of return light of narrow-band light in a wavelength band of 440 nm to 490 nm.
- The phototherapy device according to claim 1, wherein the tissue structure image acquisition unit acquires a tissue structure image obtained with the narrow-band light in a wavelength band of 490 nm to 590 nm.
- The phototherapy device according to claim 1, wherein the tissue structure image acquisition unit acquires a tissue structure image obtained with the narrow-band light in a wavelength band of 590 nm to 620 nm.
- The phototherapy device according to claim 1, wherein the tissue structure image acquisition unit acquires a tissue structure image obtained with the narrow-band light in a wavelength band of 620 nm to 780 nm.
- The phototherapy device according to claim 1, further comprising a control unit that performs emission control of the therapeutic light for an irradiation target region of the therapeutic light, with an integrated value of light irradiation intensity and irradiation time as a set irradiation light amount.
- A phototherapy method for confirming a therapeutic effect by irradiating a treatment site with therapeutic light that causes a drug to react, the method including:
a tissue structure image acquisition step of acquiring a tissue structure image obtained with narrow-band light irradiated onto the irradiation position of the therapeutic light;
a fluorescence image acquisition step of acquiring a fluorescence image obtained with excitation light irradiated onto the irradiation position of the therapeutic light;
a boundary region determination step of determining, using the tissue structure image, a boundary region where the tissue structure has changed;
a fluorescence intensity change calculation step of calculating a magnitude of a change in fluorescence intensity of the boundary region; and
a display image generation step of generating a display image for displaying the magnitude of the change in fluorescence intensity.
- A phototherapy program that causes a phototherapy device, which generates information for confirming a therapeutic effect by irradiating a treatment site with therapeutic light that causes a drug to react, to execute:
a tissue structure image acquisition step of acquiring a tissue structure image obtained with narrow-band light irradiated onto the irradiation position of the therapeutic light;
a fluorescence image acquisition step of acquiring a fluorescence image obtained with excitation light irradiated onto the irradiation position of the therapeutic light;
a boundary region determination step of determining, using the tissue structure image, a boundary region where the tissue structure has changed;
a fluorescence intensity change calculation step of calculating a magnitude of a change in fluorescence intensity of the boundary region; and
a display image generation step of generating a display image for displaying the magnitude of the change in fluorescence intensity.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202180089242.0A CN116685376A (zh) | 2021-04-15 | 2021-04-15 | 光治疗装置、光治疗方法及光治疗程序 |
PCT/JP2021/015612 WO2022219783A1 (ja) | 2021-04-15 | 2021-04-15 | 光治療装置、光治療方法および光治療プログラム |
JP2023514280A JP7430845B2 (ja) | 2021-04-15 | 2021-04-15 | 光治療装置、光治療方法および光治療プログラム |
US18/220,362 US20230347169A1 (en) | 2021-04-15 | 2023-07-11 | Phototherapy device, phototherapy method, and computer-readable recording medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2021/015612 WO2022219783A1 (ja) | 2021-04-15 | 2021-04-15 | 光治療装置、光治療方法および光治療プログラム |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US18/220,362 Continuation US20230347169A1 (en) | 2021-04-15 | 2023-07-11 | Phototherapy device, phototherapy method, and computer-readable recording medium |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022219783A1 true WO2022219783A1 (ja) | 2022-10-20 |
Family
ID=83640246
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2021/015612 WO2022219783A1 (ja) | 2021-04-15 | 2021-04-15 | 光治療装置、光治療方法および光治療プログラム |
Country Status (4)
Country | Link |
---|---|
US (1) | US20230347169A1 (ja) |
JP (1) | JP7430845B2 (ja) |
CN (1) | CN116685376A (ja) |
WO (1) | WO2022219783A1 (ja) |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2014221117A (ja) * | 2013-05-13 | 2014-11-27 | 株式会社アライ・メッドフォトン研究所 | 治療進行度モニタ装置及びその方法 |
WO2015045516A1 (ja) * | 2013-09-27 | 2015-04-02 | 富士フイルム株式会社 | 蛍光観察装置、内視鏡システム及びプロセッサ装置並びに作動方法 |
WO2019215905A1 (ja) * | 2018-05-11 | 2019-11-14 | 株式会社島津製作所 | 治療支援装置および治療支援システム |
Also Published As
Publication number | Publication date |
---|---|
US20230347169A1 (en) | 2023-11-02 |
JPWO2022219783A1 (ja) | 2022-10-20 |
CN116685376A (zh) | 2023-09-01 |
JP7430845B2 (ja) | 2024-02-13 |
Legal Events
- 121 — Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 21936979; Country of ref document: EP; Kind code of ref document: A1)
- ENP — Entry into the national phase (Ref document number: 2023514280; Country of ref document: JP; Kind code of ref document: A)
- WWE — Wipo information: entry into national phase (Ref document number: 202180089242.0; Country of ref document: CN)
- NENP — Non-entry into the national phase (Ref country code: DE)
- 122 — Ep: pct application non-entry in european phase (Ref document number: 21936979; Country of ref document: EP; Kind code of ref document: A1)