WO2021024314A1 - Medical treatment assistance device and medical treatment assistance method - Google Patents


Info

Publication number
WO2021024314A1
WO2021024314A1 (PCT/JP2019/030554)
Authority
WO
WIPO (PCT)
Prior art keywords
image
fluorescence
processor
fluorescent agent
unit
Prior art date
Application number
PCT/JP2019/030554
Other languages
French (fr)
Japanese (ja)
Inventor
紘之 妻鳥
Original Assignee
Shimadzu Corporation (株式会社島津製作所)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shimadzu Corporation (株式会社島津製作所)
Priority to JP2021538536A priority Critical patent/JP7306461B2/en
Priority to PCT/JP2019/030554 priority patent/WO2021024314A1/en
Publication of WO2021024314A1 publication Critical patent/WO2021024314A1/en

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 10/00: Other methods or instruments for diagnosis, e.g. instruments for taking a cell sample, for biopsy, for vaccination diagnosis; Sex determination; Ovulation-period determination; Throat striking implements

Definitions

  • The present invention relates to a treatment support device and a treatment support method.
  • A treatment support device is known that captures fluorescence images of a fluorescent agent administered to a patient during treatment such as surgery.
  • Such a treatment support device is disclosed in, for example, Japanese Patent Application Laid-Open No. 2018-51320.
  • Japanese Patent Application Laid-Open No. 2018-51320 discloses a device that uses ICG (indocyanine green) fluorescence angiography to identify perforator blood vessels before plastic surgery.
  • The apparatus disclosed in Japanese Patent Application Laid-Open No. 2018-51320 includes an infrared light source for exciting fluorescence, and the fluorescence signal is detected by a CCD camera. The device captures the entire cycle of perfusion and washout of the ICG fluorescence.
  • In JP-A-2018-51320, a well-vascularized flap is a good candidate for grafting, and the surgeon determines which of the several perforating branches is the best candidate for grafting.
  • The apparatus of JP-A-2018-51320 includes means for processing an image sequence to compute, for the pixel values in the fluorescence images, a time-integrated luminance or a time derivative of the luminance, and means for displaying the time-integrated luminance or the time derivative of the luminance as a color image or a black-and-white image. This device is used to identify and locate perforator vessels prior to surgery.
  • Since a fluorescence image only shows the fluorescence intensity at a certain time, it is difficult to evaluate the quality of blood perfusion (that is, the quality of blood circulation in body tissue) from a fluorescence image at a single time point. Therefore, JP-A-2018-51320 obtains and displays the time-integrated luminance or the time derivative of the luminance for the pixel values in the fluorescence image.
  • Evaluating the quality of blood perfusion is important not only for skin grafting (flap surgery) described above, but also for identifying the area to be treated and confirming the result of treatment. Therefore, when treating a patient, it is desired to be able to appropriately display the state of diffusion of the administered fluorescent agent in order to evaluate the quality of blood perfusion at the site to be treated.
  • The present invention has been made to solve the above-mentioned problems, and one object of the present invention is to provide a treatment support device and a treatment support method capable of appropriately displaying the state of diffusion of a fluorescent agent administered to a subject.
  • The treatment support device includes an excitation light irradiation unit that irradiates excitation light for the fluorescent agent administered to the subject, a fluorescence imaging unit that detects the fluorescence excited by the excitation light and captures fluorescence images of the subject, and an image processing unit having a processor that performs image processing and outputs images to a display unit.
  • The processor is configured to compare the fluorescence images generated at a plurality of time points with the fluorescence images generated before them, detect the region in each fluorescence image in which fluorescence is detected for the first time at that time point, generate a fluorescent agent diffusion image by superimposing the regions from the respective time points in mutually different display modes, and display the generated fluorescent agent diffusion image on the display unit.
  • The treatment support method includes a step of irradiating excitation light for the fluorescent agent administered to the subject, a step of detecting the fluorescence excited by the excitation light and capturing fluorescence images of the subject, a step of detecting, by comparing the fluorescence images generated at a plurality of time points with the fluorescence images generated before them, the region in each fluorescence image in which fluorescence is detected for the first time at that time point, a step of generating a fluorescent agent diffusion image by superimposing the regions from the respective time points in mutually different display modes, and a step of displaying the generated fluorescent agent diffusion image on the display unit.
  • In the present invention, a fluorescent agent diffusion image is generated and displayed in which the regions where fluorescence is first detected at the respective time points are shown in mutually different display modes. Since each such region is considered to be where the presence of the fluorescent agent carried by the bloodstream is first detected, the fluorescent agent diffusion image displays the diffusion of the fluorescent agent over time in a way that distinguishes the individual time points. The state of diffusion of the fluorescent agent administered to the subject can therefore be displayed as a region that grows from one time point to the next. As a result, the state of diffusion of the fluorescent agent administered to the subject can be appropriately displayed.
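The claimed generation of the diffusion image can be sketched in code. The following is a minimal illustration, not the patented implementation: the toy frame data, the fixed intensity `threshold`, and the encoding of each region by the frame index at which it first lit up are all assumptions made for the sake of the example.

```python
import numpy as np

def build_diffusion_image(frames, threshold):
    # Label each pixel with the index of the frame in which its
    # fluorescence intensity first exceeds the threshold; pixels that
    # never fluoresce keep the background label -1.
    first_seen = np.full(frames[0].shape, -1, dtype=int)
    for t, frame in enumerate(frames):
        newly_lit = (frame > threshold) & (first_seen == -1)
        first_seen[newly_lit] = t  # region first detected at time point t
    return first_seen

# Toy 1-D "frames": fluorescence spreads from left to right over three time points.
frames = [np.array([5, 0, 0]), np.array([5, 5, 0]), np.array([5, 5, 5])]
diffusion = build_diffusion_image(frames, threshold=1)
```

A display step can then map the labels 0, 1, 2, ... to distinct colors, which corresponds to superimposing the first-detection regions from the respective time points "in different modes".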
  • the configuration of the treatment support device 100 according to one embodiment will be described with reference to FIGS. 1 to 14.
  • the treatment support device 100 includes a fluorescence imaging device 10 and an image processing unit 20.
  • The treatment support device 100 is a device that supports treatment by imaging the treatment target site 2 with the fluorescence imaging device 10 and displaying the image processed by the image processing unit 20 on the display unit 30.
  • The treatment support provided by the treatment support device 100 consists of displaying, on the display unit 30, an image that visualizes the fluorescence 92 generated from the fluorescent agent 3 administered into the body of the subject 1, thereby giving a doctor or other practitioner information on the treatment target site 2 that cannot be seen directly from the outside.
  • Subject 1 is, for example, a human being, but is not particularly limited.
  • the treatment target site 2 is, for example, the chest, abdomen, back, internal organs (for example, digestive tract, liver, adrenal gland, etc.), but is not particularly limited.
  • The fluorescence imaging device 10 is a device that detects the fluorescence 92 emitted from the fluorescent agent 3 administered to the subject 1 when irradiated with the excitation light 91, and visualizes the treatment target site 2 of the subject 1 based on the fluorescence 92.
  • the image processing unit 20 performs image processing on the fluorescence image captured by the fluorescence imaging device 10 to generate a fluorescent agent diffusion image 85, which will be described later.
  • the image processing unit 20 is configured to output an image to the display unit 30.
  • the image processing unit 20 is composed of a computer including a processor 21 and a storage unit for performing information processing to generate a fluorescent agent diffused image 85 from image data.
  • the processor 21 performs image processing of the fluorescence image.
  • the image processing unit 20 is, for example, a PC (Personal Computer).
  • the image processing unit 20 is electrically connected to, for example, the fluorescence imaging device 10, and acquires an image from the fluorescence imaging device 10.
  • the image processing unit 20 is electrically connected to, for example, the display unit 30 and outputs an image to the display unit 30.
  • the connection between these devices may be either wired or wireless.
  • the display unit 30 is configured to display an image of the treatment target site 2 output from the image processing unit 20.
  • The display unit 30 is, for example, a monitor such as a liquid crystal display.
  • The display unit 30 is, for example, a monitor provided in the operating room or other room where the subject 1 is treated.
  • the display unit 30 may be, for example, a monitor included in the image processing unit 20 or the fluorescence imaging device 10.
  • the fluorescence imaging device 10 includes an imaging unit 11, an arm mechanism 12, and a main body unit 13.
  • the fluorescence imaging device 10 images the treatment target site 2 from the outside of the subject 1 by the imaging unit 11 arranged at a position separated from the subject 1.
  • The arm mechanism 12 (see FIG. 2) has a first end connected to the main body 13 and a second end connected to the imaging unit 11, and is configured to be able to hold the imaging unit 11 at an arbitrary position and orientation within its movable range.
  • the imaging unit 11 includes at least a fluorescence imaging unit 111 that detects the fluorescence 92 excited by the excitation light 91 and captures the fluorescence image 81 of the subject 1.
  • the imaging unit 11 is configured to capture a visible light image 82 based on visible light 93 in addition to the fluorescence image 81 based on fluorescence 92. That is, the imaging unit 11 includes a visible light imaging unit 112 that detects the visible light 93 reflected from the subject 1 and captures the visible light image 82 of the subject 1.
  • the imaging unit 11 is configured to capture a fluorescence image 81 and a visible light image 82 as moving images.
  • the imaging unit 11 generates a fluorescence image 81 and a visible light image 82 in chronological order at a predetermined frame rate.
  • the generated individual fluorescence image 81 and visible light image 82 are frame images constituting each frame of the moving image.
  • the frame rate is, for example, 1 fps (frames per second) or more, preferably 15 fps or more, and more preferably 30 fps or more.
  • the frame rate is set to, for example, 60 fps, and can be changed according to the setting by the user.
  • the fluorescence imaging unit 111 and the visible light imaging unit 112 are configured to generate the fluorescence image 81 at a time interval shorter than the pulsating cycle of the blood flow of the subject 1.
  • The heart rate of an adult human is about 60 to 75 beats per minute, which is roughly one beat per second.
  • 30 fps and 60 fps are time intervals sufficiently shorter than the pulsating cycle of the blood flow of the subject 1.
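As a quick sanity check of this claim, the frame interval at these rates can be compared with the pulse period implied by the heart-rate figure quoted above (75 bpm is taken here as the fast end of the range; the numbers are from the text, the comparison itself is just arithmetic):

```python
# Adult heart rate at the fast end of the 60-75 bpm range quoted above.
heart_rate_bpm = 75
pulse_period_s = 60.0 / heart_rate_bpm  # 0.8 s per beat

# Frame intervals at the frame rates mentioned in the text.
intervals = {fps: 1.0 / fps for fps in (30, 60)}

# Both frame intervals are far shorter than one pulsation cycle.
sampling_fast_enough = all(dt < pulse_period_s for dt in intervals.values())
```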
  • the image pickup unit 11 includes a light receiving unit 11a, an optical system 11b, and an image pickup light source unit 11c.
  • the light receiving unit 11a includes the fluorescence imaging unit 111 and the visible light imaging unit 112 described above.
  • the visible light imaging unit 112 is configured to detect visible light 93.
  • the fluorescence imaging unit 111 is configured to detect fluorescence 92.
  • the visible light imaging unit 112 and the fluorescence imaging unit 111 include, for example, an image sensor such as a CMOS (Complementary Metal Oxide Semiconductor) image sensor or a CCD (Charge Coupled Device) image sensor.
  • the optical system 11b includes a zoom lens 113 and a prism 114.
  • the optical system 11b is configured to separate the visible light 93 reflected from the subject 1 and the fluorescence 92 emitted from the fluorescent agent 3.
  • the detailed configuration of the optical system 11b will be described later.
  • the imaging light source unit 11c includes an excitation light irradiation unit 115 that irradiates the excitation light 91 of the fluorescent agent 3 administered to the subject 1.
  • the excitation light irradiation unit 115 has an excitation light source 116 (see FIG. 3) that generates excitation light 91.
  • the excitation light irradiation unit 115 irradiates the excitation light 91 having a suitable wavelength according to the light absorption characteristics of the fluorescent agent 3.
  • the fluorescent agent 3 is indocyanine green (ICG).
  • the ICG has an absorption peak in the wavelength region of about 750 nm or more and less than about 800 nm.
  • the ICG emits fluorescence 92 having a peak in the wavelength region of about 800 nm or more and less than about 850 nm.
  • the excitation light irradiation unit 115 irradiates the excitation light 91 having a peak at, for example, about 750 nm.
  • the imaging light source unit 11c includes a visible light irradiation unit 117 that irradiates visible light 93 in the visible wavelength region.
  • the visible light irradiation unit 117 has a visible light source 118 (see FIG. 3) that generates visible light 93.
  • The visible light irradiation unit 117 irradiates, for example, white light toward the subject 1 as the visible light 93.
  • White light contains wavelength components over substantially the entire visible wavelength region.
  • The visible light 93 irradiated by the visible light irradiation unit 117 has a peak of emission intensity in the visible wavelength region.
  • Lighting equipment that emits visible-wavelength light, such as a surgical light, is installed in the operating room where the subject is treated. Since the light generated by this lighting equipment can serve as the visible light 93, the imaging light source unit 11c need not include the visible light irradiation unit 117.
  • the image pickup light source unit 11c is provided in an annular shape on the end surface of the image pickup unit 11 so as to surround the optical system 11b.
  • a total of 12 excitation light sources 116 and visible light sources 118 are arranged in an annular shape.
  • These excitation light sources 116 and visible light sources 118 are, for example, light emitting diodes (LEDs).
  • the excitation light source 116 and the visible light source 118 may be a laser light source such as a semiconductor laser.
  • the fluorescence imaging unit 111 and the visible light imaging unit 112 are configured to detect fluorescence 92 and visible light 93, respectively, via a common optical system 11b. Therefore, the imaging unit 11 acquires the fluorescence image 81 and the visible light image 82 at the same imaging position and the same imaging field of view.
  • the fluorescence 92 and the visible light 93 are incident on the zoom lens 113 along the optical axis 94.
  • the zoom lens 113 is moved in a direction along the optical axis 94 by a lens moving mechanism (not shown) in order to focus.
  • the imaging unit 11 can acquire the fluorescence image 81 and the visible light image 82 at an arbitrary magnification within the variable range of the zoom lens 113.
  • the fluorescence 92 and the visible light 93 reach the prism 114 after passing through the zoom lens 113.
  • the prism 114 is configured to separate the visible light 93 reflected from the subject 1 and the fluorescence 92 emitted from the fluorescent agent 3.
  • the fluorescence 92 that has reached the prism 114 passes through the prism 114 and reaches the fluorescence imaging unit 111.
  • the visible light 93 that has reached the prism 114 is reflected by the prism 114 and reaches the visible light imaging unit 112.
  • Since the reflected excitation light 91 from the subject 1 is reflected by the prism 114, the reflected excitation light 91 is prevented from reaching the fluorescence imaging unit 111.
  • The arm mechanism 12 includes a translational support portion 121 that translationally supports the imaging unit 11 and a rotation support portion 122 that rotatably supports the imaging unit 11.
  • The translational support unit 121 supports the imaging unit 11 via the rotation support unit 122, holds the position of the imaging unit 11, and is configured so that the imaging unit 11 can be translated in each of the front-back, left-right, and up-down directions.
  • the rotation support unit 122 is configured so that the image pickup unit 11 can be rotated in each of the left-right and up-down directions.
  • the main body 13 includes a housing 131 and a computer housed in the housing 131.
  • the housing 131 is, for example, a dolly having a box shape for accommodating a computer and being movable by wheels.
  • the main body unit 13 includes a control unit 132, an image generation unit 133, a main body storage unit 134, and an output unit 135.
  • the control unit 132 is composed of, for example, a computer including a processor such as a CPU (Central Processing Unit) and a memory.
  • the computer functions as a control unit 132 of the fluorescence imaging device 10 by executing a program stored in the memory by the processor.
  • The control unit 132 is configured to control the imaging unit 11 (start and stop of imaging, etc.) and the irradiation and stopping of light (excitation light 91, visible light 93) from the imaging light source unit 11c, based on input operations to an operation unit (not shown).
  • The image generation unit 133 is configured to generate, from the detection signals of the imaging unit 11 (the fluorescence imaging unit 111 and the visible light imaging unit 112), the image data of the fluorescence image 81 (see FIG. 4) and the image data of the visible light image 82 (see FIG. 4) captured by the imaging unit 11.
  • the image generation unit 133 includes, for example, a processor such as a GPU (Graphics Processing Unit) or an FPGA (Field-Programmable Gate Array) configured for image processing, and a memory.
  • the main body storage unit 134 is configured to store the captured image generated by the image generation unit 133, a control program, and the like.
  • the main body storage unit 134 includes, for example, a non-volatile memory, an HDD (Hard Disk Drive), and the like.
  • the output unit 135 is configured to output a video signal including the captured image generated by the image generation unit 133 to the image processing unit 20.
  • the output unit 135 is a video output interface such as HDMI (registered trademark) and an interface for connecting other external devices.
  • the output unit 135 is connected to the image processing unit 20 so that the captured image can be output by wire or wirelessly.
  • the fluorescence imaging device 10 acquires the fluorescence image 81 and the visible light image 82 of the subject 1, and outputs the acquired fluorescence image 81 and the visible light image 82 to the image processing unit 20.
  • the fluorescence imaging device 10 outputs an image to the image processing unit 20 in a moving image format.
  • the fluorescence image 81 and the visible light image 82 are sequentially output in chronological order according to a set frame rate as frame images (still images) constituting a moving image. That is, one fluorescent image 81 and one visible light image 82 are output to the image processing unit 20 at regular time intervals.
  • the image processing unit 20 includes a processor 21, a storage unit 22, and an input / output unit 23.
  • The processor 21 is configured to detect, by comparing the fluorescence images 81 generated at a plurality of time points with the fluorescence images 81 generated before them, the region in which the fluorescence 92 is detected for the first time at each time point (hereinafter, the fluorescence start region 84), generate a fluorescent agent diffusion image 85 by superimposing the fluorescence start regions 84 from the respective time points in mutually different display modes, and display the generated fluorescent agent diffusion image 85 on the display unit 30.
  • the fluorescence start region 84 is an example of the "region in the fluorescence image in which fluorescence is detected for the first time" in the claims.
  • the configuration of the image processing unit 20 will be described.
  • the processor 21 is composed of, for example, a CPU, a GPU, an FPGA configured for image processing, or the like.
  • the storage unit 22 includes a volatile and / or non-volatile memory, a storage device such as an HDD, and the like.
  • the storage unit 22 stores a program executed by the processor 21.
  • the storage unit 22 stores various image data such as the image data obtained from the fluorescence imaging device 10 and the fluorescent agent diffusion image 85 generated by the image processing unit 20.
  • the input / output unit 23 receives the input of the video signal including the fluorescence image 81 and the visible light image 82 generated by the fluorescence imaging device 10.
  • the input / output unit 23 outputs an image from the image processing unit 20 to the display unit 30.
  • the input / output unit 23 is a video output interface such as HDMI (registered trademark) and an interface for connecting other external devices.
  • the input / output unit 23 is connected to the fluorescence imaging device 10 (output unit 135) and the display unit 30, respectively, by wire or wirelessly.
  • The image processing unit 20 includes a visible light image acquisition unit 211, a fluorescence image acquisition unit 212, a region extraction unit 213, a pulsation cycle acquisition unit 214, a diffusion image generation unit 215, and an image composition unit 216 as functional blocks.
  • A functional block is a unit of information processing functionality realized by the processor 21 of the image processing unit 20 executing a program. Each of these functional blocks may instead be implemented as separate hardware (a separate processor).
  • the visible light image acquisition unit 211 acquires the visible light image 82 imaged by the visible light image pickup unit 112 of the fluorescence imaging device 10 via the input / output unit 23.
  • the visible light image acquisition unit 211 outputs the acquired visible light image 82 to the image synthesis unit 216.
  • the fluorescence image acquisition unit 212 acquires the fluorescence image 81 imaged by the fluorescence imaging unit 111 of the fluorescence imaging device 10 via the input / output unit 23.
  • the fluorescence image acquisition unit 212 outputs the acquired fluorescence image 81 to the region extraction unit 213.
  • the individual fluorescence images 81 sequentially acquired in chronological order are specified by frame numbers.
  • the frame number of the fluorescent image 81 represents the time of shooting.
  • the fluorescent image 81 of each frame number is an image of the same position of the subject 1 and is an image at a different time of shooting.
  • the fluorescence image 81 at each time point can be rephrased as the fluorescence image 81 of each frame number.
  • the region extraction unit 213 extracts the fluorescence start region 84 in the fluorescence image 81 generated in time series by the fluorescence imaging unit 111.
  • the region extraction unit 213 detects the fluorescence start region 84 by comparing the fluorescence image 81 generated at a plurality of time points with the fluorescence image 81 before the generation time point.
  • the fluorescence start region 84 is a region in which fluorescence 92 is detected for the first time in the fluorescence image 81 generated at each time point after the start of imaging.
  • the fluorescence start region 84 is, for example, individual pixels included in the fluorescence image 81. That is, the region extraction unit 213 extracts the fluorescence start region 84 in the fluorescence image 81 in pixel units.
  • the region extraction unit 213 may extract a group of a plurality of pixels as a fluorescence start region 84.
  • the fluorescence image 81 is an image obtained by detecting the fluorescence 92 generated from the fluorescence agent 3, and the pixel value of the fluorescence image 81 represents the fluorescence intensity.
  • the fluorescent agent 3 diffuses with the passage of time due to the blood flow.
  • When the fluorescent agent 3 passes through a given position in the fluorescence image 81, the pixel value at that position rises from the background level (the state in which no fluorescence 92 is detected). After the fluorescent agent 3 flows away, the pixel value falls and returns to the state in which no fluorescence 92 is detected.
  • the fluorescence start region 84 is a region in which the rising edge of the pixel value is first detected in the time change of the pixel value.
  • In these figures, a region having a pixel value higher than a certain value is shown hatched, and a region having a pixel value equal to or less than that value is shown unshaded.
  • the fluorescence start region 84 extracted from each fluorescence image 81 is shown with hatching.
  • Since the fluorescence start region 84 extracted in the fluorescence image 81 is a region in which the fluorescence 92 is detected for the first time after the start of imaging, the same fluorescence start region 84 is not extracted more than once.
  • the fluorescence start region 84a is extracted from the fluorescence image 81 of a certain frame number (M1).
  • the fluorescence start region 84b is extracted in the surrounding region adjacent to the fluorescence start region 84a of the frame number (M1), reflecting the diffusion of the fluorescent agent 3.
  • the fluorescence start region 84b does not include the fluorescence start region 84a of the frame number (M1).
  • the fluorescence start region 84c is extracted in the surrounding region adjacent to the fluorescence start region 84b of the frame number (M2).
  • the method for extracting the fluorescence start region 84 in the fluorescence image 81 is not particularly limited, but here, three examples shown in FIGS. 7 to 9 will be described.
  • FIGS. 7 to 9 are graphs showing a time intensity curve (TIC) 71 for one pixel in the fluorescence image 81.
  • In each graph, the vertical axis indicates the pixel value (that is, the fluorescence intensity), and the horizontal axis indicates the frame number (that is, the elapsed time).
  • In the first example, the processor 21 extracts the fluorescence start region 84 based on the pixel value of each pixel of the fluorescence image 81 exceeding a predetermined threshold value 72.
  • the region extraction unit 213 extracts pixels whose pixel values exceed the threshold value 72 as the fluorescence start region 84.
  • the threshold value 72 is set to a predetermined value higher than the pixel value (background level) at the time point (region 71a) before the rise of the pixel value.
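The first extraction criterion can be sketched as a threshold crossing on a single pixel's TIC. The TIC values and the numeric threshold below are invented toy numbers, not values from the patent; only the crossing logic mirrors the description.

```python
# Toy time intensity curve (TIC) of one pixel across frames.
tic = [2, 2, 3, 10, 40, 60, 30, 8, 3]   # background level is around 2-3
threshold = 20                           # stands in for threshold value 72

def first_crossing(tic, threshold):
    # Frame number at which the pixel value first exceeds the threshold,
    # i.e. the frame in which this pixel becomes a fluorescence start region.
    for frame, value in enumerate(tic):
        if value > threshold:
            return frame
    return None  # the pixel never fluoresces above the threshold
```

In this toy curve the pixel would be extracted at the frame where its value first rises above the background by more than the threshold margin.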
  • In the second example, the processor 21 extracts the fluorescence start region 84 based on the slope of the time intensity curve 71 of each pixel of the fluorescence image 81 exceeding a predetermined threshold value 73.
  • the region extraction unit 213 extracts pixels whose slope of the time intensity curve 71 exceeds the threshold value 73 as the fluorescence start region 84.
  • the slope of the time intensity curve 71 is, for example, the difference value (change amount) of the pixel values between two adjacent frames.
  • the threshold value 73 is a predetermined value larger than the slope in the region 71a.
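The slope criterion, using the frame-to-frame difference described above, can be sketched the same way. Again the TIC values and the slope threshold are invented for illustration.

```python
tic = [2, 2, 3, 10, 40, 60, 30, 8, 3]   # same toy TIC as in the first example
slope_threshold = 15                     # stands in for threshold value 73

def first_steep_rise(tic, slope_threshold):
    # Frame at which the frame-to-frame difference (the slope of the TIC)
    # first exceeds the slope threshold.
    for frame in range(1, len(tic)):
        if tic[frame] - tic[frame - 1] > slope_threshold:
            return frame
    return None  # no sufficiently steep rise in this pixel's TIC
```

Compared with the plain value threshold, the slope test reacts only to the rising edge, so a pixel with a slowly drifting background would not be extracted.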
  • In the third example, the processor 21 (region extraction unit 213) extracts the fluorescence start region 84 based on the area value of the time intensity curve 71 of each pixel of the fluorescence image 81 exceeding a predetermined threshold value 74.
  • the region extraction unit 213 extracts pixels whose area value of the time intensity curve 71 exceeds the threshold value 74 as the fluorescence start region 84.
  • the area value of the time intensity curve 71 is, for example, an integrated value of pixel values exceeding the background level in each frame. Instead of the area value of the time intensity curve 71, the average value of the time intensity curve 71 (the value obtained by dividing the area value by the number of frames) may be used.
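The area criterion can be sketched as a running integral of the above-background signal. The TIC, the assumed background level, and the area threshold are toy values; the mean-value variant mentioned in the text would simply divide the running sum by the number of frames before comparing.

```python
tic = [2, 2, 3, 10, 40, 60, 30, 8, 3]   # toy TIC, invented values
background = 2                           # assumed background level
area_threshold = 50                      # stands in for threshold value 74

def first_area_crossing(tic, background, area_threshold):
    # Frame at which the running integral of (pixel value - background),
    # clipped at zero, first exceeds the area threshold.
    area = 0.0
    for frame, value in enumerate(tic):
        area += max(value - background, 0)
        if area > area_threshold:
            return frame
    return None  # accumulated fluorescence never reaches the threshold
```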
  • Depending on the frame number of the fluorescence image 81 (that is, the time point) at which the fluorescence start region 84 is extracted from among the fluorescence images 81 acquired in time series, the processor 21 can identify the location and the time point at which the fluorescence 92 was first detected.
  • the fluorescence start region 84 extracted in the fluorescence image 81 of any frame number is not extracted again in the fluorescence image 81 of subsequent frame numbers.
  • the region extraction unit 213 outputs information for identifying the extracted fluorescence start region 84 to the diffusion image generation unit 215.
  • the pulsation cycle acquisition unit 214 acquires the pulsation cycle 60 of the blood flow of the subject 1.
  • The processor 21 is configured to count the number of fluorescence start regions 84 for each generated fluorescence image 81, and to detect the pulsation cycle 60 based on the number of fluorescence start regions 84 in each fluorescence image 81 generated in time series. In the configuration in which the fluorescence start region 84 is extracted in units of one pixel, the number of fluorescence start regions 84 counted for one fluorescence image 81 is the total number of pixels extracted as fluorescence start regions 84 in that image.
  • The fluorescent agent 3 is carried by the bloodstream of the subject 1, branches from the arteries into smaller blood vessels, diffuses through the capillaries into the body tissue, and then flows into the veins via the capillaries.
  • the flow of blood so that it penetrates into the body tissue is called perfusion.
  • the fluorescent agent 3 contained in the blood diffuses in the body tissue contained in the field of view by perfusion. Since the perfusion is caused by the pulsation of blood caused by the beating of the heart, the diffusion of the fluorescent agent 3 also changes periodically in response to the pulsation.
  • FIG. 10 shows a heartbeat waveform 75 detected by an electrocardiograph and a pulse waveform 76 detected by a pulse wave meter.
  • the horizontal axis represents time and the vertical axis represents signal strength.
  • A peak in the heartbeat waveform 75 corresponds to the arrival of the peak of the systolic blood pressure of the heart, and the corresponding peak in the pulse waveform 76 is formed with a time lag due to the propagation time of the blood pressure peak.
  • The inventor of the present application found that the fluorescent agent observed in the fluorescence image 81 diffuses periodically with the pulsation derived from the heartbeat. Therefore, in the present embodiment, the change in each pulsation cycle 60 is visualized from the fluorescence images 81 acquired in time series.
  • FIG. 11 shows a graph 65 showing the change in the number of fluorescence start regions 84 with the passage of time.
  • The horizontal axis of the graph 65 shows the elapsed time (that is, the frame number), and the vertical axis shows the number of fluorescence start regions 84 (the total number of fluorescence start regions 84 included in a frame image). Focusing on the change in the number of fluorescence start regions 84, a peak 66 in the number of fluorescence start regions 84 is formed corresponding to the peaks in waveform 75 and waveform 76 shown in FIG. 10.
  • The fluorescent agent 3 diffuses rapidly together with the blood at the timing corresponding to a peak of the heartbeat or the pulse, and the diffusion rate of the fluorescent agent 3 slows until the next peak arrives. Therefore, a peak 66 in the number of fluorescence start regions 84 is formed at the timing corresponding to a peak of the heartbeat or the pulse. Since the peak 66 in the number of fluorescence start regions 84 is caused by the beating of the heart, one peak is formed for each heartbeat (one pulsation).
  • The processor 21 is configured to acquire the change in the number of fluorescence start regions 84 for each fluorescence image 81 and to detect the pulsation cycle 60 based on the peaks 66 in the number of fluorescence start regions 84. In graph 65, the period between two adjacent peaks 66 is the blood pulsation cycle 60 due to the beating of the heart.
  • The pulsation cycle acquisition unit 214 detects the pulsation cycle 60 based on, for example, the time interval between the vertices of adjacent peaks 66, or the time interval between the rising points or the falling points of adjacent peaks 66.
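Purely as a sketch, peak-to-peak cycle detection on a count sequence might look as follows; the local-maximum rule and the toy counts are illustrative assumptions, not the claimed detection logic:

```python
def detect_peaks(counts):
    """Indices where the count is a local maximum (strictly above the
    previous value, not below the next one)."""
    return [i for i in range(1, len(counts) - 1)
            if counts[i - 1] < counts[i] >= counts[i + 1]]

def cycle_lengths(peak_frames):
    """Pulsation cycle lengths (in frames) as intervals between adjacent peaks."""
    return [b - a for a, b in zip(peak_frames, peak_frames[1:])]

counts = [1, 4, 2, 1, 5, 2, 1, 6, 3]   # hypothetical per-frame start counts
peaks = detect_peaks(counts)           # frames 1, 4, 7
print(cycle_lengths(peaks))            # [3, 3]
```

The interval between adjacent peak vertices corresponds to one pulsation cycle 60 in the text.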
  • The pulsation cycle acquisition unit 214 identifies in which frame number, among the fluorescence images 81 acquired in chronological order, a peak 66 in the number of fluorescence start regions 84 was detected.
  • The pulsation cycle acquisition unit 214 specifies the period from the frame number of the immediately preceding peak 66 to the frame number of the peak 66 detected this time as one pulsation cycle 60.
  • For the first detected peak 66, the period from the start of imaging to the frame number of that peak 66 is specified as one pulsation cycle 60.
  • the pulsation cycle 60 can be specified by a start frame number that is the start point of the cycle and an end frame number that is the end point of the cycle.
  • the fluorescence image 81 having a frame number within the range of the start frame number or more and the end frame number or less is a fluorescence image 81 belonging to the same pulsation cycle 60.
  • FIG. 11 shows an example in which the first peak 66a is detected at frame number (N1), the second peak 66b at frame number (N2), the third peak 66c at frame number (N3), the fourth peak 66d at frame number (N4), and the fifth peak 66e at frame number (N5).
  • In this case, frame number (1) through frame number (N1) belong to cycle 1, frame number (N1 + 1) through frame number (N2) to cycle 2, frame number (N2 + 1) through frame number (N3) to cycle 3, frame number (N3 + 1) through frame number (N4) to cycle 4, and frame number (N4 + 1) through frame number (N5) to cycle 5.
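The frame-to-cycle assignment above can be sketched with a binary search over the peak frame numbers; the function name and the peak values are hypothetical:

```python
import bisect

def cycle_of_frame(frame, peak_frames):
    """Cycle index (1-based) of a frame, given the frame numbers of detected
    peaks: cycle k runs from the frame after peak k-1 through peak k."""
    return bisect.bisect_left(peak_frames, frame) + 1

peaks = [4, 9, 15]               # hypothetical N1, N2, N3
print(cycle_of_frame(1, peaks))  # 1
print(cycle_of_frame(4, peaks))  # 1 (the peak frame N1 itself belongs to cycle 1)
print(cycle_of_frame(5, peaks))  # 2
```

Frames after the last detected peak fall into the not-yet-completed next cycle under this convention.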
  • The diffusion image generation unit 215 (processor 21, see FIG. 5) generates the fluorescent agent diffusion image 85 by superimposing the fluorescence start regions 84 at each time point in different modes.
  • The diffusion image generation unit 215 (processor 21, see FIG. 5) generates the fluorescent agent diffusion image 85 (see FIG. 12) based on the fluorescence start regions 84 extracted by the region extraction unit 213 and the pulsation cycles 60 acquired by the pulsation cycle acquisition unit 214.
  • the fluorescent agent diffusion image 85 is an image in which the extracted fluorescence start regions 84 are displayed in different modes for each pulsation cycle 60.
  • The diffusion image generation unit 215 sequentially acquires the fluorescence start regions 84 (see FIG. 6) extracted by the region extraction unit 213, in chronological order from frame number (1). Further, the diffusion image generation unit 215 acquires the pulsation cycles 60 detected by the pulsation cycle acquisition unit 214. That is, the diffusion image generation unit 215 acquires the frame numbers in which a peak 66 in the number of fluorescence start regions 84 was detected.
  • the diffusion image generation unit 215 collectively displays the fluorescence start regions 84 extracted from the fluorescence images 81 belonging to the same pulsation cycle 60 in the same manner.
  • the diffusion image generation unit 215 changes the display mode of the fluorescence start region 84 extracted from the fluorescence image 81 belonging to the pulsation cycle 60 for each pulsation cycle 60.
  • the diffusion image generation unit 215 (processor 21) generates a fluorescence agent diffusion image 85 in which the fluorescence start regions 84 at each time point in each pulsation cycle 60 are superimposed in different modes.
  • FIG. 12 shows the flow of the generation process of the fluorescent agent diffusion image 85.
  • the region extraction unit 213 acquires the fluorescence start region 84 in order from the frame number (1).
  • The diffusion image generation unit 215 generates a fluorescent agent diffusion image 85 that displays the fluorescence start regions 84 extracted in each fluorescence image 81 in the same aspect 86a until a peak 66 in the number of fluorescence start regions 84 is detected.
  • the frame number (1) to the frame number (N1) are detected by the pulsation cycle acquisition unit 214 as the first pulsation cycle 60 (cycle 1).
  • the diffusion image generation unit 215 generates a fluorescence agent diffusion image 85 that displays the fluorescence start region 84 extracted in each fluorescence image 81 from the frame number (1) to the frame number (N1) in the same aspect 86a.
  • the difference in display mode is expressed by the difference in hatching applied to the fluorescence start region 84.
  • the pulsation cycle acquisition unit 214 detects the frame number (N1 + 1) to the frame number (N2) as the second pulsation cycle 60 (cycle 2).
  • the diffusion image generation unit 215 generates a fluorescence agent diffusion image 85 that displays the fluorescence start region 84 extracted in each fluorescence image 81 belonging to the cycle 2 in the same aspect 86b.
  • the aspect 86b is a display aspect different from the aspect 86a, and the user can visually distinguish the fluorescence start region 84 displayed in the aspect 86a from the fluorescence start region 84 displayed in the aspect 86b.
  • the pulsation cycle acquisition unit 214 detects the frame number (N2 + 1) to the frame number (N3) as the third pulsation cycle 60 (cycle 3).
  • the diffusion image generation unit 215 generates a fluorescence agent diffusion image 85 that displays the fluorescence start region 84 extracted in each fluorescence image 81 belonging to the cycle 3 in the same aspect 86c.
  • Aspect 86c is a display aspect different from the aspects 86a and 86b.
  • the processor 21 (diffusion image generation unit 215) generates a fluorescence agent diffusion image 85 that displays the extracted fluorescence start region 84 in different modes for each pulsation cycle 60.
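One way to think of this per-cycle display is as a label image in which each pixel records the cycle of its fluorescence start; rendering each label in its own color then yields the aspects 86a, 86b, and so on. The sketch below is an illustrative assumption (NumPy and the toy 1-D frames are not from the source):

```python
import bisect
import numpy as np

def diffusion_label_image(frames, threshold, peak_frames):
    """Label image for a fluorescent-agent diffusion image: each pixel
    stores the pulsation cycle (1-based) in which it first exceeded the
    threshold; 0 = never fluoresced."""
    labels = np.zeros(frames[0].shape, dtype=int)
    for frame_no, frame in enumerate(frames, start=1):
        cycle = bisect.bisect_left(peak_frames, frame_no) + 1  # cycle of this frame
        new = (frame > threshold) & (labels == 0)              # first-time pixels only
        labels[new] = cycle
    return labels

# Three 1-D frames; peaks at frames 1, 2, 3 put each new pixel in its own cycle.
frames = [np.array([6, 0, 0]), np.array([6, 6, 0]), np.array([6, 6, 6])]
print(diffusion_label_image(frames, threshold=5, peak_frames=[1, 2, 3]))  # [1 2 3]
```

Because labels are only written where they are still 0, regions from past cycles are never overwritten, matching the additive display described above.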
  • the processor 21 (diffusion image generation unit 215) generates (updates) the fluorescent agent diffusion image 85 for each frame.
  • Each time a fluorescence image 81 included in the current pulsation cycle 60 is generated, the processor 21 is configured to add the fluorescence start region 84 extracted from the latest fluorescence image 81 to the fluorescent agent diffusion image 85.
  • the diffusion image generation unit 215 generates a fluorescence agent diffusion image 85 including a fluorescence start region 84a.
  • the fluorescence start region 84b is extracted in the surrounding region adjacent to the fluorescence start region 84a of the frame number (1).
  • the diffusion image generation unit 215 generates a fluorescent agent diffusion image 85 in which the fluorescence start region 84b is added to the fluorescence start region 84a.
  • the fluorescence start region 84c is extracted in the surrounding region adjacent to the fluorescence start region 84b of the frame number (2).
  • the diffusion image generation unit 215 generates a fluorescence agent diffusion image 85 in which a fluorescence start region 84c is added to the fluorescence agent diffusion image 85 of frame number (2).
  • In this way, each time a fluorescence image 81 is acquired between the start frame number (1) and the end frame number (N1) of the pulsation cycle 60, the diffusion image generation unit 215 adds the fluorescence start region 84 extracted from that fluorescence image 81 to the fluorescent agent diffusion image 85.
  • As a result, the fluorescent agent diffusion image 85 is generated as a moving image in which the fluorescence start region 84 gradually expands with the lapse of time until the pulsation cycle 60 switches to the next cycle.
  • In the fluorescent agent diffusion image 85 at the end frame number of the pulsation cycle 60, all the fluorescence start regions 84 extracted within the period of that pulsation cycle 60 are displayed.
  • Each time a pulsation cycle 60 elapses, the processor 21 is configured to add the fluorescence start regions 84 extracted in the latest pulsation cycle 60 to the fluorescent agent diffusion image 85, in addition to the fluorescence start regions 84 extracted in the past pulsation cycles 60.
  • the diffusion image generation unit 215 generates a fluorescence agent diffusion image 85 including the fluorescence start region 84 of the aspect 86a during the first pulsation cycle 60 (cycle 1).
  • the diffusion image generation unit 215 keeps displaying the fluorescence start region 84 of the aspect 86a even after the first pulsation cycle 60 (cycle 1) elapses.
  • the diffusion image generation unit 215 generates a fluorescence agent diffusion image 85 including the fluorescence start region 84 of the aspect 86b.
  • The diffusion image generation unit 215 displays the fluorescence start region 84 of aspect 86b on the fluorescent agent diffusion image 85 in addition to the fluorescence start region 84 of aspect 86a.
  • Similarly, the diffusion image generation unit 215 displays the fluorescence start region 84 of aspect 86c on the fluorescent agent diffusion image 85 in addition to the fluorescence start regions 84 of aspects 86a and 86b.
  • In this way, the diffusion image generation unit 215 changes the display mode of the fluorescence start region 84 each time a pulsation cycle 60 elapses, and adds the fluorescence start regions 84 belonging to that pulsation cycle 60 to the fluorescent agent diffusion image 85. As a result, as shown in FIG. 12, the fluorescent agent diffusion image 85 is generated as a moving image in which the display mode of the fluorescence start region 84 changes step by step with the passage of each pulsation cycle 60 and the displayed fluorescence start region 84 gradually expands. In the fluorescent agent diffusion image 85, a boundary line 87 corresponding to the timing at which the pulsation cycle 60 switches (the timing at which the display mode switches) is formed.
  • the boundary line 87 is recognizable as a boundary line because the display mode between the adjacent fluorescence start regions 84 is different (the adjacent fluorescence start regions 84 belong to different pulsation cycles 60).
  • the fluorescence agent diffusion image 85 displays a fluorescence start region 84 divided by a number of boundary lines 87 corresponding to the number of pulsation cycles 60 that have elapsed until the end of imaging.
  • In other words, the fluorescent agent diffusion image 85 is an image in which the boundary lines 87 indicating the timings at which the pulsation cycle 60 switches are formed like contour lines.
  • The image synthesis unit 216 (processor 21) performs a process of synthesizing the visible light image 82 acquired by the visible light image acquisition unit 211 and the fluorescent agent diffusion image 85 generated by the diffusion image generation unit 215. Synthesizing includes a process of superimposing a plurality of images. Specifically, as shown in FIG. 14, the image synthesis unit 216 generates a superimposed image 88 by superimposing the fluorescent agent diffusion image 85 on the visible light image 82. The generated superimposed image 88 is therefore an image in which the fluorescent agent diffusion image 85, showing how the fluorescent agent 3 diffuses, is displayed over the visible light image 82 in which the treatment target site 2 actually visible to the user is captured.
  • the fluorescence image 81 and the visible light image 82 imaged by the fluorescence imaging device 10 are images having the same field of view.
  • The fluorescent agent diffusion image 85 is an image generated from the fluorescence image 81 and has the same field of view as the fluorescence image 81. Therefore, the visible light image 82 and the fluorescent agent diffusion image 85 capture the same field of view. Since the two images share the same field of view, the superimposed image 88 can be generated by the simple process of superimposing them, without alignment.
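Since the two images share one field of view, superposition reduces to a per-pixel blend. A minimal sketch, assuming NumPy, 8-bit RGB arrays, and a hypothetical mask marking where the diffusion image has content:

```python
import numpy as np

def superimpose(visible_rgb, diffusion_rgb, mask, alpha=0.5):
    """Blend the diffusion image over the visible-light image, only where
    the mask is True; no registration is needed because both images are
    captured with the same field of view."""
    out = visible_rgb.astype(float).copy()
    out[mask] = (1 - alpha) * out[mask] + alpha * diffusion_rgb.astype(float)[mask]
    return out.astype(np.uint8)

visible = np.full((2, 2, 3), 100, dtype=np.uint8)    # toy background
diffusion = np.full((2, 2, 3), 200, dtype=np.uint8)  # toy overlay color
mask = np.array([[True, False], [False, False]])     # one fluorescing pixel
out = superimpose(visible, diffusion, mask)
print(out[0, 0, 0], out[0, 1, 0])  # 150 100
```

The `alpha` weight is a free choice; a device would likely expose it as a display setting.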
  • the image synthesizing unit 216 outputs the generated superimposed image 88 to the input / output unit 23.
  • the input / output unit 23 outputs the superimposed image 88 acquired from the image composition unit 216 to the display unit 30 and displays it on the screen.
  • the processor 21 causes the display unit 30 to display the generated fluorescent agent diffusion image 85.
  • the processor 21 (image synthesis unit 216) is configured to superimpose the fluorescent agent diffusion image 85 on the visible light image 82 and display it on the display unit 30.
  • Each time the processor 21 acquires the frame images (fluorescence image 81, visible light image 82) of the latest frame from the fluorescence imaging device 10, it generates the fluorescent agent diffusion image 85 of that frame and generates the superimposed image 88 by superimposing the fluorescent agent diffusion image 85 on the visible light image 82.
  • the processor 21 outputs the superimposed image 88 generated for each frame to the display unit 30, so that the superimposed image 88 is displayed as a moving image.
  • FIGS. 15 to 17 are specific examples of the superimposed image 88 including the fluorescent agent diffusion image 85, and show a flap of the subject 1 to be transplanted in a flap procedure (skin graft).
  • A flap is a section of skin, subcutaneous tissue, and deeper tissue that retains blood flow.
  • FIGS. 15 to 17 show a superimposed image 88 in which the fluorescent agent diffusion image 85 is superimposed on the visible light image 82 in which the skin 2a and the subcutaneous tissue 2b of the treatment target site 2 are captured.
  • The processor 21 displays the fluorescence start regions 84 for each pulsation cycle 60 in modes different from each other by at least one of (1) displaying the fluorescence start regions 84 for each pulsation cycle 60 in different gradations, and (2) displaying lines indicating the fluorescence start regions 84 for each pulsation cycle 60 as a contour diagram.
  • the processor 21 displays the fluorescence start region 84 for each pulsation cycle 60 with different gradations.
  • the gradation is a stage (gradation) of color and brightness expressed in an image.
  • the gradation includes a gradual change in color and a gradual change in brightness.
  • the gradations include a stepwise change in brightness from white (255) through gray (126) to black (0).
  • For example, the image processing unit 20 sets the oldest pulsation cycle 60 (cycle 1) to white (255), and gradually reduces the gradation value toward black (0) as the cycle approaches the latest cycle K.
  • the fluorescent agent diffusion image 85 is a color image, and a color is obtained by combining three gradation values of R (red, 0 to 255), G (green, 0 to 255), and B (blue, 0 to 255).
  • the gradation includes a stepwise change in which the colors are continuously changed in a predetermined order such as red, yellow, green, and blue.
  • For example, the processor 21 sets the oldest pulsation cycle 60 (cycle 1) to red and the latest cycle K to blue, and assigns to each of cycles 1 through K a color gradation that shifts stepwise from red toward blue.
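The cycle-to-gradation mappings can be sketched as small helper functions. These are illustrative only: the grayscale ramp matches the white-to-black description, while the color version is a simplified direct red-to-blue ramp rather than the red-yellow-green-blue order mentioned in the text:

```python
def cycle_gray(cycle, total_cycles):
    """Grayscale gradation: oldest cycle (1) -> white (255),
    newest cycle (total_cycles) -> black (0), in equal steps."""
    if total_cycles == 1:
        return 255
    return round(255 * (total_cycles - cycle) / (total_cycles - 1))

def cycle_rgb(cycle, total_cycles):
    """Simplified color gradation: oldest cycle red (255, 0, 0) shifting
    stepwise toward newest cycle blue (0, 0, 255)."""
    t = 0.0 if total_cycles == 1 else (cycle - 1) / (total_cycles - 1)
    return (round(255 * (1 - t)), 0, round(255 * t))

print(cycle_gray(1, 5), cycle_gray(5, 5))  # 255 0
print(cycle_rgb(1, 5), cycle_rgb(5, 5))    # (255, 0, 0) (0, 0, 255)
```

A real device would more likely index into a fixed lookup table of display colors per cycle.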
  • the boundary line 87 is shown by a broken line for convenience in order to make it easy to understand the regions having different gradations.
  • the fluorescent agent diffusion image 85 displays the fluorescence start region 84 extracted in each pulsation cycle 60 so as to be distinguishable and recognizable by the difference in gradation.
  • the processor 21 displays a line indicating the fluorescence start region 84 for each pulsation cycle 60 in a contour diagram.
  • the contour diagram is a diagram showing the distribution status by drawing a line connecting points with the same value.
  • In the fluorescent agent diffusion image 85, a boundary line 87 is formed between the fluorescence start regions 84 belonging to a certain pulsation cycle 60 and the fluorescence start regions 84 belonging to the next pulsation cycle 60.
  • Displaying the lines indicating the fluorescence start regions 84 for each pulsation cycle 60 as a contour diagram means displaying these boundary lines 87 on the image. As the fluorescent agent 3 diffuses, a temporally newer boundary line 87 is formed outside a temporally older boundary line 87, so the fluorescent agent diffusion image 85 becomes a contour-diagram-like image.
  • the processor 21 extracts the boundary line 87 and displays the boundary line 87 on the fluorescent agent diffusion image 85 each time the pulsation cycle 60 elapses. As a result, the fluorescent agent diffusion image 85 shown in FIG. 16 is generated.
  • the fluorescent agent diffusion image 85 displays the fluorescence start region 84 extracted in each pulsation cycle 60 so as to be distinguished and recognizable as a region between two adjacent boundary lines 87.
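Given a per-pixel cycle label image (as in the earlier counting discussion), boundary lines between adjacent cycles can be located where neighboring labels differ. This is one simple convention (marking the left/upper pixel of each differing pair), sketched with NumPy on hypothetical data:

```python
import numpy as np

def cycle_boundaries(labels):
    """Boolean mask of pixels lying on a boundary line: a labeled pixel
    whose right or lower neighbor carries a different nonzero label."""
    b = np.zeros(labels.shape, dtype=bool)
    right_diff = (labels[:, :-1] != labels[:, 1:]) & (labels[:, :-1] > 0) & (labels[:, 1:] > 0)
    lower_diff = (labels[:-1, :] != labels[1:, :]) & (labels[:-1, :] > 0) & (labels[1:, :] > 0)
    b[:, :-1] |= right_diff
    b[:-1, :] |= lower_diff
    return b

labels = np.array([[1, 1, 2],
                   [1, 1, 2]])        # cycle 1 region next to cycle 2 region
print(cycle_boundaries(labels))       # True only in the middle column
```

Drawing the `True` pixels in a distinct color would render the boundary lines 87 described here.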
  • the display mode by gradation and the display mode by lines may be combined.
  • For example, the processor 21 displays the fluorescence start regions 84 for each pulsation cycle 60 in different gradations, and also displays the boundary line 87 of the latest pulsation cycle 60 (cycle K).
  • Each time the processor 21 detects the elapse of a pulsation cycle 60 (the switch to the next pulsation cycle 60), it changes the gradation of the fluorescence start regions 84 belonging to that pulsation cycle 60, draws the boundary line 87 of the latest pulsation cycle 60 (cycle K), and erases the boundary line of the previous pulsation cycle 60 (cycle K - 1).
  • the processor 21 makes the display mode of the boundary line 87 different from the display mode of the fluorescence start region 84 for each pulsation cycle 60.
  • the rules for each display mode of the fluorescence start region 84 shown in FIGS. 15 to 17 are recorded in the storage unit 22 shown in FIG.
  • the processor 21 is configured to be able to accept selection of a display mode, for example, by a user's operation input.
  • the processor 21 generates the fluorescent agent diffusion image 85 in the display mode selected by the user according to the display mode setting information stored in the storage unit 22.
  • the treatment support device 100 implements the treatment support method of the present embodiment.
  • the treatment support method of the present embodiment includes at least the following steps (1) to (5).
  • Step (1): the subject 1 is irradiated with the excitation light 91 of the fluorescent agent 3 administered to the subject 1.
  • Step (2): the fluorescence 92 excited by the excitation light 91 is detected to capture the fluorescence image 81 of the subject 1.
  • Step (3): the fluorescence start region 84 at each time point is detected.
  • Step (4): the processor 21 generates the fluorescent agent diffusion image 85 by superimposing the fluorescence start regions 84 at each time point in different modes.
  • Step (5): the generated fluorescent agent diffusion image 85 is displayed on the display unit 30.
  • Step (1) and step (2) correspond to step 51 in FIG. Step (3) corresponds to step 52 in FIG. Step (4) corresponds to step 53 in FIG. Step (5) corresponds to step 54 in FIG. Further, the treatment support method of FIG. 18 includes additional steps in addition to the above-mentioned steps (1) to (5).
  • the image display process of the treatment support device 100 is started by starting the imaging by the fluorescence imaging device 10 based on the operation input of a user such as a doctor.
  • In step 51, the fluorescence image 81 and the visible light image 82 of the subject 1 are captured. That is, the subject 1 is irradiated with the excitation light 91 from the excitation light irradiation unit 115 of the fluorescence imaging device 10, and the fluorescence 92 excited by the excitation light 91 is detected by the fluorescence imaging unit 111. Further, the subject 1 is irradiated with the visible light 93 from the visible light irradiation unit 117, and the reflected light of the visible light 93 is detected by the visible light imaging unit 112. As a result, the fluorescence image 81 and the visible light image 82 corresponding to one frame are captured. The captured fluorescence image 81 and visible light image 82 are output to the image processing unit 20.
  • In step 52, the processor 21 (region extraction unit 213) extracts the fluorescence start region 84 from the fluorescence image 81 obtained in step 51. Further, the processor 21 (pulsation cycle acquisition unit 214) integrates the number of fluorescence start regions 84 extracted in the fluorescence image 81. After the start of imaging, the processor 21 (pulsation cycle acquisition unit 214) assigns the obtained fluorescence images 81 to the first pulsation cycle 60 (cycle 1) until the first peak 66 in the number of fluorescence start regions 84 is detected.
  • In step 53, the processor 21 (diffusion image generation unit 215) generates the fluorescent agent diffusion image 85 based on the extracted fluorescence start regions 84 and the pulsation cycle 60.
  • the processor 21 generates a fluorescent agent diffusion image 85 displaying the fluorescence start region 84 in a preset display mode according to the setting information stored in the storage unit 22.
  • In step 54, the processor 21 (image synthesis unit 216) superimposes the fluorescent agent diffusion image 85 on the visible light image 82 to generate the superimposed image 88, and outputs the superimposed image 88 to the display unit 30 via the input/output unit 23 (see FIG. 1). As a result, the superimposed image 88 for one frame constituting the moving image is displayed on the display unit 30.
  • In step 55, the processor 21 (pulsation cycle acquisition unit 214) determines whether or not the timing corresponds to the elapse of a pulsation cycle 60. That is, the processor 21 (pulsation cycle acquisition unit 214) plots the number of fluorescence start regions 84 extracted from the fluorescence image 81 of the current frame number on the graph 65 shown in FIG. 11, and determines whether or not a peak 66 in the change of the number of fluorescence start regions 84 is detected. If no peak 66 is detected, the processor 21 advances the process to step 56.
  • In step 56, the processor 21 determines whether or not to end the imaging.
  • the processor 21 determines that the imaging is terminated when, for example, an operation input for ending the imaging is received from the user or when the preset end time is reached.
  • When the imaging is to be ended, the fluorescence imaging device 10 ends the imaging, the processor 21 stops the image processing, and the image display process of FIG. 18 ends.
  • Otherwise, the processor 21 returns the process to step 51, acquires the images of the next frame (fluorescence image 81 and visible light image 82), and performs the processes of steps 51 to 54.
  • When the processor 21 detects a peak 66 in the change of the number of fluorescence start regions 84 in step 55, that is, when a pulsation cycle 60 has elapsed, the processor 21 advances the process to step 57.
  • In step 57, the processor 21 determines the display mode of the fluorescence start regions 84 belonging to the next pulsation cycle 60. That is, the processor 21 determines the display mode of the fluorescence start regions 84 extracted from the next frame number onward, according to the display mode setting information stored in the storage unit 22. Next, the processor 21 determines whether or not to end the imaging in step 56, and if not, processes the fluorescence image 81 and the visible light image 82 of the next frame number in step 51.
  • As a result, the fluorescence start regions 84 extracted from the fluorescence images 81 of the frames after the display mode is determined in step 57 are displayed on the fluorescent agent diffusion image 85 in a mode different from that of the fluorescence start regions 84 belonging to the previous pulsation cycle 60.
  • the display mode determined in step 57 is then applied up to the frame number at which the peak 66 of change in the number of fluorescence initiation regions 84 is detected.
  • In the fluorescent agent diffusion image 85, the fluorescence start regions 84 extracted from the fluorescence images 81 of the respective frames are thus displayed in a different mode for each pulsation cycle 60.
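The control flow of steps 51 through 57 can be condensed into a single loop sketch. Everything below is an illustrative assumption (in-memory 1-D frames instead of a camera, a simple three-point peak rule, mode switching one frame after the detected peak), not the claimed procedure:

```python
def run_display_loop(frames, threshold):
    """Condensed sketch of the FIG. 18 flow: each iteration extracts newly
    fluorescing pixels (step 52), tags them with the current display mode
    (step 53), and switches to the next mode when a peak in the per-frame
    counts is detected (steps 55/57)."""
    seen = set()
    counts = []
    labels = {}  # pixel index -> display mode (one mode per pulsation cycle)
    mode = 1
    for frame in frames:                               # step 51: next frame
        new = {i for i, v in enumerate(frame) if v > threshold} - seen
        seen |= new
        counts.append(len(new))                        # step 52: count starts
        for px in new:
            labels[px] = mode                          # step 53: current mode
        # step 55: the previous count is a peak if above both its neighbors
        if len(counts) >= 3 and counts[-3] < counts[-2] >= counts[-1]:
            mode += 1                                  # step 57: next mode
    return labels, mode

frames = [[6, 0, 0, 0, 0],
          [6, 6, 6, 6, 0],
          [6, 6, 6, 6, 0],
          [6, 6, 6, 6, 6]]
labels, mode = run_display_loop(frames, threshold=5)
print(labels, mode)  # pixel 4 is tagged with mode 2 (second cycle)
```

Display (step 54) is omitted; in a device it would render `labels` over the visible-light frame each iteration.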
  • In this way, the image display processing by the treatment support device 100 is performed. From the fluorescent agent diffusion image 85 (superimposed image 88) displayed on the display unit 30, a doctor or the like can observe how the fluorescent agent 3 diffuses in each pulsation cycle 60.
  • The fluorescent agent diffusion image 85 can therefore provide a doctor or the like with useful information for evaluating the quality of blood perfusion at the treatment target site 2.
  • As described above, the treatment support device of the present embodiment includes the excitation light irradiation unit 115 that irradiates the subject 1 with the excitation light 91 of the fluorescent agent 3 administered to the subject 1, the fluorescence imaging unit 111 that detects the fluorescence 92 excited by the excitation light 91 and captures the fluorescence image 81 of the subject 1, and the image processing unit 20 having the processor 21 that performs image processing and outputs an image to the display unit 30. The processor 21 detects the fluorescence start region 84 at each time point, generates the fluorescent agent diffusion image 85 by superimposing the fluorescence start regions 84 at each time point in different modes, and displays the generated fluorescent agent diffusion image 85 on the display unit 30.
  • Similarly, the treatment support method of the present embodiment includes the steps of irradiating the subject with the excitation light 91 of the administered fluorescent agent, and detecting the fluorescence 92 excited by the excitation light to capture the fluorescence image 81 of the subject.
  • Further, the region where the fluorescence 92 is detected for the first time at each time point is identified by comparing the fluorescence image 81 generated at each of a plurality of time points (frame numbers) with the fluorescence images 81 before that time point.
  • A fluorescent agent diffusion image 85 is then generated in which the regions where fluorescence is first detected at each time point (fluorescence start regions 84) are displayed in modes different from each other. Since the fluorescence start region 84 is considered to be the region where the fluorescent agent 3 diffused by the bloodstream is first detected, the fluorescent agent diffusion image 85 displays, in a distinguishable manner at each time point, the change by which the fluorescent agent 3 diffuses with the passage of time. Therefore, the fluorescent agent diffusion image 85 can display the state of diffusion of the administered fluorescent agent 3 as a state in which the fluorescence start region 84 expands at each time point. Thereby, the state of diffusion of the fluorescent agent 3 administered to the subject 1 can be appropriately displayed.
  • As a result, the fluorescent agent diffusion image 85 can provide a doctor or the like who treats the patient with useful information for evaluating the quality of blood perfusion in the body tissue of the treatment target site. That is, it is possible to provide a fluorescent agent diffusion image 85 that is useful for specifying a region to be treated and a region in which to confirm the result of treatment.
  • In the above embodiment, the processor 21 extracts the fluorescence start region 84 based on the pixel value of each pixel of the fluorescence image 81 exceeding a predetermined threshold value 72. With this configuration, the fluorescence start region 84 can be easily extracted simply by comparing the pixel value of each pixel of the fluorescence image 81 with the threshold value 72.
  • Alternatively, the processor 21 extracts the fluorescence start region 84 based on the slope of the time intensity curve 71 of each pixel of the fluorescence image 81 exceeding a predetermined threshold value 72. With this configuration, the fluorescence start region 84 can be easily extracted from the slope of the time intensity curve 71 of each pixel.
  • Alternatively, the processor 21 extracts the fluorescence start region 84 based on the area value of the time intensity curve 71 of each pixel of the fluorescence image 81 exceeding a predetermined threshold value 72. With this configuration, the fluorescence start region 84 can be easily extracted from the area value of the time intensity curve 71 of each pixel. Instead of the area value of the time intensity curve 71, the average value of the time intensity curve 71 may be used; even in this case, the fluorescence start region 84 can be easily extracted.
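The three extraction criteria (pixel value, slope of the time intensity curve, and area under the time intensity curve) can be contrasted in a short sketch. The per-pixel time intensity curve (TIC) is represented here as a plain list, and the function names are hypothetical:

```python
def starts_by_value(tic, threshold):
    """Frame index at which the pixel's time-intensity curve (TIC)
    first exceeds the threshold, or None."""
    return next((i for i, v in enumerate(tic) if v > threshold), None)

def starts_by_slope(tic, threshold):
    """Frame index at which the frame-to-frame slope of the TIC
    first exceeds the threshold, or None."""
    return next((i + 1 for i in range(len(tic) - 1)
                 if tic[i + 1] - tic[i] > threshold), None)

def starts_by_area(tic, threshold):
    """Frame index at which the cumulative area under the TIC
    first exceeds the threshold, or None."""
    total = 0
    for i, v in enumerate(tic):
        total += v
        if total > threshold:
            return i
    return None

tic = [0, 1, 8, 9, 9]           # toy intensity samples for one pixel
print(starts_by_value(tic, 5))  # 2
print(starts_by_slope(tic, 5))  # 2
print(starts_by_area(tic, 5))   # 2
```

On this toy curve all three criteria agree; in practice the slope and area variants trade noise sensitivity differently, which is presumably why the embodiment offers them as alternatives.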
  • The processor 21 acquires the pulsation cycle 60 of the subject and generates the fluorescent agent diffusion image 85 in which the fluorescence start regions 84 at each time point of each pulsation cycle 60 are superimposed in different modes.
  • the generated fluorescent agent diffusion image 85 is a display that distinguishably displays the change in which the fluorescent agent 3 diffuses with the pulsation derived from the heartbeat for each pulsation cycle 60. Therefore, the fluorescent agent diffusion image 85 can more appropriately display the diffusion of the fluorescent agent 3 administered to the subject as the fluorescence start region 84 expands every pulsation cycle 60.
  • The fluorescence imaging unit 111 is configured to generate fluorescence images 81 at time intervals shorter than the pulsation cycle 60, and the processor 21 is configured to integrate the number of fluorescence start regions 84 for each generated fluorescence image 81 and to detect the pulsation cycle 60 based on the number of fluorescence start regions 84 for each fluorescence image 81 generated in time series. With this configuration, the pulsation cycle 60 of the imaged portion can be directly acquired from the fluorescence images 81 acquired in time series.
  • Since the pulsation cycle 60 is obtained from the fluorescence images 81, it is not necessary to separately provide a device for detecting pulsation such as an electrocardiograph or a pulse wave meter, so the device configuration can be simplified. Further, for example, when pulsation is detected by an electrocardiograph, only the pulsation cycle of the heart is detected, and a time difference arises from the pulsation cycle 60 of the actually imaged portion. According to the above configuration for detecting the pulsation cycle 60 from the fluorescence images 81, the pulsation cycle 60 of the actually imaged portion can be directly detected, so the state of diffusion of the fluorescent agent 3 accompanying the pulsation can be visualized more accurately.
  • the processor 21 is configured to acquire the change in the number of fluorescence start regions 84 for each fluorescence image 81 and to detect the pulsation cycle 60 based on the peak 66 of the number of fluorescence start regions 84. With this configuration, the number of fluorescence start regions 84 (that is, the amount of movement of the fluorescent agent 3) increases sharply in response to the heartbeat (contraction of the ventricles) at every pulsation cycle 60, forming the peak 66, so the pulsation cycle 60 of the actually imaged portion can be accurately detected from this peak 66.
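The peak-based cycle detection described above can be sketched as follows. This is an illustrative Python sketch only, not part of the disclosure: the function name, the simple three-point peak test, and the `min_ratio` noise floor are all hypothetical choices for a per-frame series of fluorescence start region counts.

```python
def detect_pulsation_cycle(counts, frame_interval_s, min_ratio=0.5):
    """Estimate the pulsation cycle (seconds) from per-frame counts of
    newly fluorescent pixels. A frame is a peak when its count exceeds
    both neighbours and is at least `min_ratio` of the global maximum,
    which suppresses small fluctuations between heartbeats."""
    top = max(counts)
    peaks = [i for i in range(1, len(counts) - 1)
             if counts[i] > counts[i - 1]
             and counts[i] >= counts[i + 1]
             and counts[i] >= min_ratio * top]
    if len(peaks) < 2:
        return None
    # The mean frame spacing between successive peaks gives the cycle.
    spacings = [b - a for a, b in zip(peaks, peaks[1:])]
    return sum(spacings) / len(spacings) * frame_interval_s

# Synthetic 60 fps count series: a sharp burst every 60 frames (~1 s cycle).
counts = [500 if i % 60 == 0 else 20 for i in range(300)]
cycle_s = detect_pulsation_cycle(counts, frame_interval_s=1 / 60)
```

On this synthetic series the estimated cycle is about one second, matching the injected burst spacing.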
  • the treatment support device of the above embodiment further includes a visible light imaging unit 112 that detects visible light 93 reflected from the subject 1 and captures a visible light image 82 of the subject 1, and the processor 21 superimposes the fluorescent agent diffusion image 85 on the visible light image 82 and displays the result on the display unit 30.
  • since the fluorescence image 81 and the fluorescent agent diffusion image 85 based on the fluorescence image 81 are images of the fluorescence 92 generated from the fluorescent agent 3, they do not include information on the morphology of the imaging site that can be recognized with the visible light 93.
  • by superimposing the fluorescent agent diffusion image 85 on the visible light image 82, the state of diffusion of the fluorescent agent 3 can be identifiably displayed on the visible light image 82 actually viewed by a user such as a doctor. As a result, it is possible to easily identify the area to be treated and the area in which to confirm the result of treatment.
  • the processor 21 is configured to add, to the fluorescent agent diffusion image 85, the fluorescence start region 84 extracted in the latest pulsation cycle 60 in addition to the fluorescence start regions 84 extracted in past pulsation cycles 60. With this configuration, the fluorescence start regions 84 extracted in past pulsation cycles 60 are not erased from the image, and a fluorescence start region 84 is additionally displayed each time a pulsation cycle 60 elapses, so that how the fluorescent agent 3 diffuses in accordance with the pulsation can be easily identified and displayed.
  • when the latest fluorescence image 81 included in a pulsation cycle 60 is generated, the processor 21 adds the fluorescence start region 84 extracted from that latest fluorescence image 81 to the fluorescent agent diffusion image 85.
  • the processor 21 displays the fluorescence start regions 84 for each pulsation cycle 60 in mutually different modes by at least one of (1) displaying the fluorescence start region 84 for each pulsation cycle 60 in a different gradation and (2) displaying the lines indicating the fluorescence start regions 84 for each pulsation cycle 60 in an isoline (contour-line-like) pattern.
  • with this configuration, the fluorescence start regions 84 for each pulsation cycle 60 can be displayed in a visually easily distinguishable manner, improving convenience for users such as doctors.
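One way to realize the per-cycle gradation display could be sketched as follows. This is a minimal illustrative sketch, not the disclosed implementation: the function name, the gray-level formula, and the "keep the earliest cycle's label" rule are assumptions; region masks are plain sets of pixel coordinates to keep the sketch dependency-free.

```python
def render_diffusion_image(shape, regions_per_cycle):
    """Paint the fluorescence start region of each pulsation cycle in a
    different gray level (older cycles brighter, newer cycles darker,
    as one possible "different mode" per cycle)."""
    h, w = shape
    image = [[0] * w for _ in range(h)]
    n = len(regions_per_cycle)
    for k, region in enumerate(regions_per_cycle):
        level = 255 - k * (200 // max(n - 1, 1))  # distinct gradation per cycle
        for y, x in region:
            if image[y][x] == 0:   # keep the earliest cycle's gradation
                image[y][x] = level
    return image

# Three cycles: the fluorescent region expands outward each heartbeat.
cycles = [{(1, 1)}, {(1, 1), (1, 2)}, {(1, 1), (1, 2), (2, 2)}]
img = render_diffusion_image((4, 4), cycles)
```

Because earlier gradations are preserved, the rendered image shows the diffusion history as concentric bands rather than only the latest extent.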
  • the processor 21 detects the pulsation cycle 60 based on the peak 66 (see FIG. 11) of the number of the fluorescence start regions 84, but the present invention is not limited to this.
  • the processor 21 may instead detect the pulsation cycle 60 based on either the heartbeat waveform 75 (see FIG. 10) detected by an electrocardiograph or the pulse waveform 76 (see FIG. 10) detected by a pulse wave meter.
  • the treatment support device 100 includes a waveform acquisition unit 40 that acquires the waveform of the heartbeat or pulse of the subject 1.
  • the waveform acquisition unit 40 includes, for example, an electrocardiograph and detects the heartbeat waveform 75 shown in FIG.
  • the waveform acquisition unit 40 includes, for example, a pulse wave meter, and detects the pulse waveform 76 shown in FIG.
  • the processor 21 is configured to detect the pulsation cycle 60 based on the heartbeat waveform 75 or the pulse waveform 76 instead of the peak 66 of the number of fluorescence start regions 84.
  • the processor 21 detects, for example, a feature point such as a peak 75a indicating a QRS complex in the heartbeat waveform 75, and detects a pulsation cycle 60 as a time interval between the feature points.
  • the processor 21 detects, as a feature point, the peak 76a of the pulse waveform 76 corresponding to the QRS complex of the heartbeat waveform 75, and detects the pulsation cycle 60 as the time interval between the feature points.
  • the waveform acquisition unit 40 for acquiring the heartbeat or pulse waveform of the subject 1 is provided, and the processor 21 is configured to detect the pulsation cycle 60 based on the heartbeat or pulse waveform (75 or 76). With this configuration, the pulsation cycle 60 can be easily detected simply by acquiring the detection signal from the waveform acquisition unit 40.
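The waveform-based detection above, taking the cycle as the interval between feature points such as R peaks of the QRS complex, can be sketched as follows. Illustrative Python only: the function name and the fixed amplitude threshold are hypothetical, and a real ECG would need a tuned or adaptive threshold.

```python
def cycle_from_waveform(samples, sample_rate_hz, threshold):
    """Estimate the pulsation cycle (seconds) from an ECG-like waveform
    by locating feature points (samples above `threshold` that are local
    maxima) and averaging the intervals between them."""
    peaks = [i for i in range(1, len(samples) - 1)
             if samples[i] >= threshold
             and samples[i] > samples[i - 1]
             and samples[i] >= samples[i + 1]]
    if len(peaks) < 2:
        return None
    intervals = [b - a for a, b in zip(peaks, peaks[1:])]
    return sum(intervals) / len(intervals) / sample_rate_hz

# Synthetic 100 Hz trace with an R-like spike every 80 samples (0.8 s).
trace = [1.0 if i % 80 == 40 else 0.1 for i in range(400)]
ecg_cycle_s = cycle_from_waveform(trace, sample_rate_hz=100, threshold=0.5)
```

On this synthetic trace the estimate recovers the 0.8 s spacing between spikes.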
  • the heartbeat waveform 75 reflects only the heartbeat itself, so there is a time lag relative to the blood pulsation at the treatment target site 2 actually imaged in the fluorescence image 81.
  • the pulse waveform 76 may also have a time lag relative to the blood pulsation at the treatment target site 2, depending on the length of the blood circulation path between the measurement point of the pulse wave meter and the treatment target site 2.
  • in contrast, when the processor 21 detects the pulsation cycle 60 based on the peak 66 of the number of fluorescence start regions 84, the blood pulsation at the actually imaged treatment target site 2 can be detected directly from the fluorescence images 81, so the state of diffusion of the fluorescent agent 3 accompanying the pulsation can be captured more accurately.
  • the image processing unit 20 is a PC provided separately from the fluorescence imaging device 10, but the present invention is not limited to this.
  • the image processing unit 20 may be provided integrally with the fluorescence imaging device 10.
  • in this modification, the image processing unit 20 is provided in the main body unit 13 of the fluorescence imaging device 10.
  • the processor that constitutes the image generation unit 133 of the main body unit 13 may be configured to also function as the processor 21 of the image processing unit 20. That is, the image processing unit 20 and the image generation unit 133 may be provided integrally.
  • in the above embodiment, examples were shown in which the fluorescence start region 84 is extracted based on the pixel value of the fluorescence image 81 exceeding a predetermined threshold value 72 (see FIG. 7), based on the slope of the time intensity curve 71, or based on the area value of the time intensity curve 71, but the present invention is not limited to this. The fluorescence start region 84 may be extracted based on a quantity other than the pixel value, the slope of the time intensity curve 71, and the area value of the time intensity curve 71.
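The pixel-value criterion, one of the extraction criteria mentioned above, can be sketched as follows. This is an illustrative sketch only: the function name and the running-maximum bookkeeping are assumptions, and slope- or area-based criteria on the time intensity curve would replace the threshold test.

```python
def extract_start_region(current, previous_max, threshold):
    """Return the set of pixels whose value first exceeds `threshold` in
    `current`, i.e. pixels above threshold now but never before (tracked
    via the per-pixel running maximum `previous_max`, updated in place)."""
    h, w = len(current), len(current[0])
    region = {(y, x) for y in range(h) for x in range(w)
              if current[y][x] > threshold and previous_max[y][x] <= threshold}
    for y in range(h):
        for x in range(w):
            previous_max[y][x] = max(previous_max[y][x], current[y][x])
    return region

prev = [[0, 0], [0, 0]]
r1 = extract_start_region([[100, 0], [0, 0]], prev, threshold=50)
r2 = extract_start_region([[120, 90], [0, 0]], prev, threshold=50)
```

The second call returns only the newly fluorescent pixel: a pixel that already exceeded the threshold in an earlier frame is not extracted again.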
  • the processor 21 is configured to acquire the change in the number of fluorescence start regions 84 for each fluorescence image 81 and detect the pulsation cycle 60 based on the peak 66 of the number of fluorescence start regions 84.
  • the present invention is not limited to this.
  • the processor 21 may detect the pulsation cycle 60 based on a periodically occurring pattern other than the peak 66 of the number of fluorescence initiation regions 84.
  • an example was shown in which the processor 21 displays, on the display unit 30, the superimposed image 88 in which the fluorescent agent diffusion image 85 is superimposed on the visible light image 82, but the present invention is not limited to this.
  • the processor 21 may display only the fluorescent agent diffusion image 85 alone, without superimposing it on the visible light image 82. Further, the processor 21 may display the visible light image 82 and the fluorescent agent diffusion image 85 side by side on the screen.
  • the treatment support device 100 includes the visible light imaging unit 112, but the present invention is not limited to this. In the present invention, the treatment support device 100 does not have to include the visible light imaging unit 112.
  • in the above embodiment, an example was shown in which the processor 21 displays the fluorescence start regions 84 for each pulsation cycle 60 in mutually different modes by at least one of (1) displaying the fluorescence start region 84 for each pulsation cycle 60 in a different gradation and (2) displaying the line indicating the fluorescence start region 84 for each pulsation cycle 60 in an isoline pattern, but the present invention is not limited to this.
  • the processor 21 may display the fluorescence start region 84 in any display mode as long as the display mode of the fluorescence start region 84 for each pulsation cycle 60 is different.
  • ICG is exemplified as the fluorescent agent 3, but the present invention is not limited to this.
  • a fluorescent agent 3 other than ICG may be used.
  • examples of the fluorescent agent 3 other than ICG include 5-ALA (5-aminolevulinic acid) and IR700. Although 5-ALA itself does not show fluorescence, protoporphyrin IX (PPIX), which is a metabolite of 5-ALA administered to the subject 1, becomes a fluorescent substance; therefore, in the present specification, substances such as 5-ALA are also included in the fluorescent agent 3.
  • the fluorescent agent 3 may be any fluorescent substance used for fluorescent diagnosis of patients and the like.
  • A treatment support device comprising: an excitation light irradiation unit that irradiates the excitation light of a fluorescent agent administered to a subject; a fluorescence imaging unit that detects fluorescence excited by the excitation light and captures a fluorescence image of the subject; and an image processing unit that has a processor performing image processing and outputs an image to a display unit, wherein the processor is configured to detect a region in the fluorescence image in which fluorescence is first detected at each time point by comparing the fluorescence image generated at each of a plurality of time points with the fluorescence images before the generation time point, to generate a fluorescent agent diffusion image by superimposing the regions at each time point in different modes, and to display the generated fluorescent agent diffusion image on the display unit.
  • The treatment support device, wherein the fluorescence imaging unit is configured to generate the fluorescence image at a time interval shorter than the pulsation cycle, and the processor is configured to integrate the number of the regions for each of the generated fluorescence images and to detect the pulsation cycle based on the number of the regions for each fluorescence image generated in time series.
  • The treatment support device according to item 5, further comprising a waveform acquisition unit for acquiring the heartbeat or pulse waveform of the subject, wherein the processor is configured to detect the pulsation cycle based on the waveform of the heartbeat or pulse.
  • Aspect 13: A treatment support method comprising: a step of irradiating the excitation light of a fluorescent agent administered to a subject; a step of detecting fluorescence excited by the excitation light and capturing a fluorescence image of the subject; a step in which a processor detects a region in the fluorescence image in which fluorescence is first detected at each time point by comparing the fluorescence image generated at each of a plurality of time points with the fluorescence images before the generation time point; a step in which the processor generates a fluorescent agent diffusion image by superimposing the regions at each time point in different modes; and a step of displaying the generated fluorescent agent diffusion image on a display unit.
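The method steps above can be sketched end to end as follows. This is an illustrative sketch under assumptions, not the disclosed implementation: frames are nested lists of pixel values, first detection is tested against a per-pixel running maximum, and the per-pixel label records the frame index of first detection, which a display layer would map to distinct gradations or colors.

```python
def build_diffusion_overlay(frames, threshold):
    """For each frame, find pixels where fluorescence is detected for the
    first time (by comparison with all earlier frames via a running
    maximum) and label each such pixel with the index of that frame."""
    h, w = len(frames[0]), len(frames[0][0])
    prev_max = [[0] * w for _ in range(h)]
    label = [[None] * w for _ in range(h)]  # frame index of first detection
    for t, frame in enumerate(frames):
        for y in range(h):
            for x in range(w):
                if frame[y][x] > threshold and prev_max[y][x] <= threshold:
                    label[y][x] = t
                prev_max[y][x] = max(prev_max[y][x], frame[y][x])
    return label

# Three frames in which the fluorescent region spreads across the image.
frames = [
    [[80, 0], [0, 0]],
    [[90, 70], [0, 0]],
    [[95, 85], [60, 0]],
]
labels = build_diffusion_overlay(frames, threshold=50)
```

Each labeled pixel keeps the time point at which fluorescence first appeared there, so the label map encodes the diffusion front at every step; pixels where fluorescence never exceeded the threshold remain unlabeled.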


Abstract

A processor (21) for this medical treatment assistance device (100) compares fluorescence images (81) generated at multiple time points in time so as to detect areas (84) in the respective fluorescence images at the respective time points where fluorescence has been detected for the first time, and also generates a fluorescent agent diffusion image (85) by superimposing the respective areas at the respective time points in different modes.

Description

Treatment support device and treatment support method
The present invention relates to a treatment support device and a treatment support method.
Conventionally, a treatment support device that captures a fluorescence image using a fluorescent agent administered to a patient during treatment such as surgery is known. Such a treatment support device is disclosed in, for example, Japanese Patent Application Laid-Open No. 2018-51320.
Japanese Patent Application Laid-Open No. 2018-51320 discloses a device for identifying perforator blood vessels before surgery using ICG (indocyanine green) fluorescence angiography for plastic surgery. The device disclosed in that publication includes an infrared light source for exciting fluorescence, and the fluorescence signal is detected by a CCD camera. The entire cycle of fluorescence perfusion and washout of the ICG is captured by the contrast imaging device.
According to Japanese Patent Application Laid-Open No. 2018-51320, a well-vascularized flap is a good candidate for grafting, and the surgeon needs to determine which of several perforator branches is the best graft candidate. The device of that publication therefore includes means for processing an image sequence to produce a time-integrated luminance, or a time derivative of the luminance, with respect to the pixel values in the fluorescence images, and means for displaying the time-integrated luminance or the time derivative of the luminance as a color image or a black-and-white image. This device is used to identify and locate perforator blood vessels prior to surgery.
JP-A-2018-51320
Here, since a fluorescence image merely shows the fluorescence intensity at a certain time, it is difficult to evaluate the quality of blood perfusion (that is, the quality of blood circulation in body tissue) from a fluorescence image at a certain time. For this reason, JP-A-2018-51320 presumably obtains and displays the time-integrated luminance or the time derivative of the luminance with respect to the pixel values in the fluorescence images.
However, when, for example, pixel values are integrated over time and displayed, the fluorescence of the ICG is detected continuously both in a location where blood circulation is poor and the ICG tends to stagnate and in a location where more ICG flows because blood circulation is good, so the integrated value is considered to become high in either case. That is, there is a problem in that the state of diffusion of the fluorescent agent accompanying blood perfusion cannot be said to be displayed appropriately.
Evaluating the quality of blood perfusion is important not only for the skin grafting (flap surgery) described above but also for identifying the area to be treated and for confirming the result of treatment. Therefore, when treating a patient, it is desired to be able to appropriately display the state of diffusion of the administered fluorescent agent in order to evaluate the quality of blood perfusion at the site to be treated.
The present invention has been made to solve the above problems, and one object of the present invention is to provide a treatment support device and a treatment support method capable of appropriately displaying the state of diffusion of a fluorescent agent administered to a subject.
To achieve the above object, a treatment support device according to a first aspect of the present invention includes: an excitation light irradiation unit that irradiates the excitation light of a fluorescent agent administered to a subject; a fluorescence imaging unit that detects fluorescence excited by the excitation light and captures a fluorescence image of the subject; and an image processing unit that has a processor performing image processing and outputs an image to a display unit. The processor is configured to detect a region in the fluorescence image in which fluorescence is first detected at each time point by comparing the fluorescence image generated at each of a plurality of time points with the fluorescence images before the generation time point, to generate a fluorescent agent diffusion image by superimposing the regions at each time point in different modes, and to display the generated fluorescent agent diffusion image on the display unit.
A treatment support method according to a second aspect of the present invention includes: a step of irradiating the excitation light of a fluorescent agent administered to a subject; a step of detecting fluorescence excited by the excitation light and capturing a fluorescence image of the subject; a step in which a processor detects a region in the fluorescence image in which fluorescence is first detected at each time point by comparing the fluorescence image generated at each of a plurality of time points with the fluorescence images before the generation time point; a step in which the processor generates a fluorescent agent diffusion image by superimposing the regions at each time point in different modes; and a step of displaying the generated fluorescent agent diffusion image on a display unit.
According to the present invention, as described above, a fluorescent agent diffusion image that displays, in mutually different modes, the regions in which fluorescence was first detected at each time point is generated and displayed. Since each such region is considered to be the region in which the presence of the fluorescent agent diffusing with the bloodstream was first detected, the fluorescent agent diffusion image distinguishably displays, for each time point, the change by which the fluorescent agent diffuses over time. Therefore, the fluorescent agent diffusion image can display the state of diffusion of the fluorescent agent administered to the subject as an expansion of the regions at each time point. As a result, the state of diffusion of the fluorescent agent administered to the subject can be displayed appropriately.
FIG. 1 is a schematic diagram of a treatment support device according to one embodiment.
FIG. 2 is a schematic perspective view of the treatment support device according to one embodiment.
FIG. 3 is a schematic diagram of the imaging unit.
FIG. 4 is a schematic diagram showing the internal structure of the imaging unit.
FIG. 5 is a block diagram for explaining the configuration of the image processing unit of the treatment support device.
FIG. 6 is a diagram for explaining the fluorescence start regions extracted from time-series fluorescence images.
FIG. 7 is a diagram for explaining a first example of the method of extracting the fluorescence start region.
FIG. 8 is a diagram for explaining a second example of the method of extracting the fluorescence start region.
FIG. 9 is a diagram for explaining a third example of the method of extracting the fluorescence start region.
FIG. 10 is a schematic diagram showing a heartbeat waveform and a pulse waveform.
FIG. 11 is a graph showing the change in the number of fluorescence start regions over time.
FIG. 12 is a schematic diagram showing the flow of the generation processing of the fluorescent agent diffusion image.
FIG. 13 is a diagram for explaining the fluorescence start regions extracted in time series and the fluorescence start regions displayed in the fluorescent agent diffusion image.
FIG. 14 is a diagram for explaining the process of superimposing the fluorescent agent diffusion image and the visible light image.
FIG. 15 is a diagram showing a first specific display example of the fluorescent agent diffusion image.
FIG. 16 is a diagram showing a second specific display example of the fluorescent agent diffusion image.
FIG. 17 is a diagram showing a third specific display example of the fluorescent agent diffusion image.
FIG. 18 is a flowchart showing the flow of the image display processing by the image processing unit.
FIG. 19 is a diagram for explaining the waveform acquisition unit in a modification of the treatment support device.
FIG. 20 is a block diagram showing the image processing unit in a modification of the treatment support device.
Hereinafter, embodiments of the present invention will be described with reference to the drawings.
The configuration of the treatment support device 100 according to one embodiment will be described with reference to FIGS. 1 to 14.
(Configuration of the treatment support device)
As shown in FIG. 1, the treatment support device 100 includes a fluorescence imaging device 10 and an image processing unit 20. When treating a subject 1, who is a patient, the treatment support device 100 supports the treatment by imaging the treatment target site 2 with the fluorescence imaging device 10 and displaying the image processed by the image processing unit 20 on the display unit 30.
Specifically, the treatment support by the treatment support device 100 consists in displaying on the display unit 30 an image obtained by imaging and visualizing the fluorescence 92 generated from the fluorescent agent 3 administered into the body of the subject 1, thereby providing a doctor or the like with information on the treatment target site 2 that cannot be directly viewed from the outside.
The subject 1 (patient) is, for example, a human, but is not particularly limited. The treatment target site 2 is, for example, the chest, the abdomen, the back, or an internal organ (for example, the digestive tract, the liver, or an adrenal gland), but is not particularly limited.
The fluorescence imaging device 10 is a device that detects the fluorescence 92 emitted from the fluorescent agent 3 administered to the subject 1 by irradiating the excitation light 91, and visualizes (images) the treatment target site 2 of the subject 1 based on the fluorescence 92.
The image processing unit 20 performs image processing on the fluorescence images captured by the fluorescence imaging device 10 to generate a fluorescent agent diffusion image 85, which will be described later. The image processing unit 20 is configured to output an image to the display unit 30. The image processing unit 20 is composed of a computer including a processor 21 and a storage unit for performing the information processing that generates the fluorescent agent diffusion image 85 from the image data. The processor 21 performs the image processing of the fluorescence images. The image processing unit 20 is, for example, a PC (Personal Computer).
The image processing unit 20 is, for example, electrically connected to the fluorescence imaging device 10 and acquires images from the fluorescence imaging device 10. The image processing unit 20 is, for example, electrically connected to the display unit 30 and outputs images to the display unit 30. The connections between these devices may be either wired or wireless.
The display unit 30 is configured to display an image of the treatment target site 2 output from the image processing unit 20. The display unit 30 is, for example, a monitor such as a liquid crystal display. The display unit 30 is, for example, a monitor installed in the operating room or the like where the treatment of the subject 1 is performed. The display unit 30 may also be, for example, a monitor provided in the image processing unit 20 or the fluorescence imaging device 10.
(Configuration of the fluorescence imaging device)
As shown in FIGS. 1 and 2, the fluorescence imaging device 10 includes an imaging unit 11, an arm mechanism 12, and a main body unit 13.
The fluorescence imaging device 10 images the treatment target site 2 from outside the subject 1 with the imaging unit 11 arranged at a position separated from the subject 1. The arm mechanism 12 (see FIG. 2) has a first end connected to the main body unit 13 and a second end connected to the imaging unit 11, and is configured to be able to hold the imaging unit 11 at an arbitrary position and orientation within its movable range.
The imaging unit 11 includes at least a fluorescence imaging unit 111 that detects the fluorescence 92 excited by the excitation light 91 and captures a fluorescence image 81 of the subject 1. In the present embodiment, the imaging unit 11 is configured to capture a visible light image 82 based on visible light 93 in addition to the fluorescence image 81 based on the fluorescence 92. That is, the imaging unit 11 includes a visible light imaging unit 112 that detects the visible light 93 reflected from the subject 1 and captures the visible light image 82 of the subject 1.
The imaging unit 11 is configured to capture the fluorescence image 81 and the visible light image 82 as moving images. The imaging unit 11 generates the fluorescence images 81 and the visible light images 82 in time series at a predetermined frame rate. Each generated fluorescence image 81 and visible light image 82 is a frame image constituting one frame of the moving image. The frame rate is, for example, 1 fps (frames per second) or more, preferably 15 fps or more, and more preferably 30 fps or more. The frame rate is set to, for example, 60 fps and can be changed according to the user's settings. The fluorescence imaging unit 111 and the visible light imaging unit 112 are configured to generate the fluorescence image 81 at a time interval shorter than the pulsation cycle of the blood flow of the subject 1. In general, the human heart rate is about 60 to 75 beats per minute in adults, which corresponds to about one beat per second. At 30 fps or 60 fps, the frame interval is therefore sufficiently shorter than the pulsation cycle of the blood flow of the subject 1.
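The sampling condition stated above can be checked with a line of arithmetic. This is only an illustrative sketch of the numbers in the text (a 75 beats/min pulse gives a 0.8 s period), not part of the disclosure.

```python
def frame_interval_s(fps):
    """Time between consecutive frames in seconds at a given frame rate."""
    return 1.0 / fps

# 75 beats per minute corresponds to a pulsation period of 0.8 s,
# well above the 16.7 ms (60 fps) and 33.3 ms (30 fps) frame intervals.
pulsation_period_s = 60.0 / 75
```

Both recommended frame rates thus sample each pulsation cycle dozens of times.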
(Imaging unit)
More specifically, the imaging unit 11 includes a light receiving unit 11a, an optical system 11b, and an imaging light source unit 11c.
The light receiving unit 11a includes the fluorescence imaging unit 111 and the visible light imaging unit 112 described above. The visible light imaging unit 112 is configured to detect the visible light 93. The fluorescence imaging unit 111 is configured to detect the fluorescence 92. The visible light imaging unit 112 and the fluorescence imaging unit 111 each include an image sensor such as a CMOS (Complementary Metal Oxide Semiconductor) image sensor or a CCD (Charge Coupled Device) image sensor. As the visible light imaging unit 112, a sensor capable of acquiring the visible light image 82 as a color image is used.
The optical system 11b includes a zoom lens 113 and a prism 114. The optical system 11b is configured to separate the visible light 93 reflected from the subject 1 from the fluorescence 92 emitted from the fluorescent agent 3. The detailed configuration of the optical system 11b is described later.
The imaging light source unit 11c includes an excitation light irradiation unit 115 that irradiates the subject 1 with excitation light 91 for the fluorescent agent 3 administered to the subject 1. The excitation light irradiation unit 115 has an excitation light source 116 (see FIG. 3) that generates the excitation light 91, and irradiates excitation light 91 of a wavelength suited to the light absorption characteristics of the fluorescent agent 3.
As an example, the fluorescent agent 3 is indocyanine green (ICG). ICG has an absorption peak in the wavelength region from about 750 nm to less than about 800 nm. When excited, ICG emits fluorescence 92 having a peak in the wavelength region from about 800 nm to less than about 850 nm. The excitation light irradiation unit 115 therefore irradiates excitation light 91 having a peak at, for example, about 750 nm.
The imaging light source unit 11c also includes a visible light irradiation unit 117 that irradiates visible light 93 in the visible wavelength region. The visible light irradiation unit 117 has a visible light source 118 (see FIG. 3) that generates the visible light 93. The visible light irradiation unit 117 irradiates the subject 1 with, for example, white light as the visible light 93. White light contains wavelength components over substantially the entire visible wavelength region, and the visible light 93 irradiated by the visible light irradiation unit 117 has its emission intensity peak in the visible wavelength region.
An operating room or similar location where a patient is treated is equipped with visible-wavelength lighting such as a surgical (shadowless) light. Since the light generated by such lighting equipment can be used as the visible light 93, the imaging light source unit 11c does not have to include the visible light irradiation unit 117.
As shown in FIG. 3, the imaging light source unit 11c is provided in an annular shape on the end face of the imaging unit 11 so as to surround the optical system 11b. In the example of FIG. 3, a total of twelve excitation light sources 116 and visible light sources 118 are arranged in a ring. The excitation light sources 116 and the visible light sources 118 are, for example, light emitting diodes (LEDs), or may be laser light sources such as semiconductor lasers.
As shown in FIG. 4, the fluorescence imaging unit 111 and the visible light imaging unit 112 are configured to detect the fluorescence 92 and the visible light 93, respectively, via the common optical system 11b. The imaging unit 11 therefore acquires the fluorescence image 81 and the visible light image 82 at the same imaging position and with the same imaging field of view.
Specifically, the fluorescence 92 and the visible light 93 are incident on the zoom lens 113 along the optical axis 94. The zoom lens 113 is moved along the optical axis 94 by a lens moving mechanism (not shown) for focusing. The imaging unit 11 can acquire the fluorescence image 81 and the visible light image 82 at an arbitrary magnification within the variable range of the zoom lens 113.
After passing through the zoom lens 113, the fluorescence 92 and the visible light 93 reach the prism 114. The prism 114 is configured to separate the visible light 93 reflected from the subject 1 from the fluorescence 92 emitted from the fluorescent agent 3.
The fluorescence 92 that reaches the prism 114 passes through the prism 114 and reaches the fluorescence imaging unit 111. The visible light 93 that reaches the prism 114 is reflected by the prism 114 and reaches the visible light imaging unit 112. The reflected component of the excitation light 91 from the subject 1 is also reflected by the prism 114, which prevents it from reaching the fluorescence imaging unit 111.
(Arm mechanism)
As shown in FIG. 2, the arm mechanism 12 includes a translation support portion 121 that supports the imaging unit 11 so as to be translationally movable, and a rotation support portion 122 that supports the imaging unit 11 so as to be rotatable.
The translation support portion 121 supports the imaging unit 11 via the rotation support portion 122, holds the position of the imaging unit 11, and can translate the imaging unit 11 in the front-back, left-right, and up-down directions. The rotation support portion 122 is configured to be able to rotate the imaging unit 11 in the left-right and up-down directions.
(Main body)
As shown in FIG. 2, the main body 13 includes a housing 131 and a computer housed in the housing 131. The housing 131 is, for example, a box-shaped cart that houses the computer and is movable on wheels. As shown in FIG. 1, the main body 13 includes a control unit 132, an image generation unit 133, a main body storage unit 134, and an output unit 135.
The control unit 132 is composed of, for example, a computer including a processor such as a CPU (Central Processing Unit) and a memory. The computer functions as the control unit 132 of the fluorescence imaging device 10 when the processor executes a program stored in the memory. The control unit 132 controls the imaging unit 11 (starting and stopping imaging, etc.) and the irradiation and stopping of light (excitation light 91, visible light 93) from the imaging light source unit 11c, based on input operations to an operation unit (not shown).
The image generation unit 133 is configured to generate, from the detection signals of the imaging unit 11 (the fluorescence imaging unit 111 and the visible light imaging unit 112), the image data of the fluorescence image 81 (see FIG. 4) and the image data of the visible light image 82 (see FIG. 4) captured by the imaging unit 11. The image generation unit 133 includes, for example, a processor such as a GPU (Graphics Processing Unit) or an FPGA (Field-Programmable Gate Array) configured for image processing, and a memory.
The main body storage unit 134 is configured to store the captured images generated by the image generation unit 133, control programs, and the like. The main body storage unit 134 includes, for example, a non-volatile memory, an HDD (Hard Disk Drive), or the like.
The output unit 135 is configured to output a video signal including the captured images generated by the image generation unit 133 to the image processing unit 20. The output unit 135 is a video output interface such as HDMI (registered trademark) or another interface for connecting external devices, and is connected to the image processing unit 20 by wire or wirelessly so that the captured images can be output.
With this configuration, the fluorescence imaging device 10 acquires the fluorescence image 81 and the visible light image 82 of the subject 1 and outputs the acquired images to the image processing unit 20. The fluorescence imaging device 10 outputs the images to the image processing unit 20 in a moving image format. The fluorescence images 81 and the visible light images 82 are sequentially output in time series according to the set frame rate, each as a frame image (still image) constituting the moving image. That is, one fluorescence image 81 and one visible light image 82 are output to the image processing unit 20 at regular time intervals.
(Image processing unit)
As shown in FIG. 5, the image processing unit 20 includes a processor 21, a storage unit 22, and an input/output unit 23. In the present embodiment, the processor 21 compares the fluorescence images 81 generated at a plurality of time points with the fluorescence images 81 generated before each of those time points, thereby detecting, for each time point, the region in the fluorescence image 81 in which the fluorescence 92 is detected for the first time (hereinafter referred to as the fluorescence start region 84). The processor 21 generates a fluorescent agent diffusion image 85 by superimposing the fluorescence start regions 84 at the respective time points in different manners, and causes the display unit 30 to display the generated fluorescent agent diffusion image 85. The fluorescence start region 84 is an example of the "region in the fluorescence image in which fluorescence is detected for the first time" in the claims. The configuration of the image processing unit 20 is described below.
The processor 21 is composed of, for example, a CPU, a GPU, an FPGA configured for image processing, or the like.
The storage unit 22 includes volatile and/or non-volatile memory and a storage device such as an HDD. The storage unit 22 stores the programs executed by the processor 21, as well as various image data such as the image data obtained from the fluorescence imaging device 10 and the fluorescent agent diffusion image 85 generated by the image processing unit 20.
The input/output unit 23 receives the input of the video signal including the fluorescence images 81 and the visible light images 82 generated by the fluorescence imaging device 10, and outputs images from the image processing unit 20 to the display unit 30. The input/output unit 23 is a video output interface such as HDMI (registered trademark) or another interface for connecting external devices, and is connected by wire or wirelessly to the fluorescence imaging device 10 (the output unit 135) and to the display unit 30.
The image processing unit 20 also includes, as functional blocks, a visible light image acquisition unit 211, a fluorescence image acquisition unit 212, a region extraction unit 213, a pulsation cycle acquisition unit 214, a diffusion image generation unit 215, and an image composition unit 216. A functional block means a unit of information processing functionality realized by the processor 21 of the image processing unit 20 executing a program. Each of these functional blocks may instead be composed of separate hardware (a separate processor).
<Visible light image acquisition unit>
The visible light image acquisition unit 211 (processor 21) acquires, via the input/output unit 23, the visible light image 82 captured by the visible light imaging unit 112 of the fluorescence imaging device 10, and outputs the acquired visible light image 82 to the image composition unit 216.
<Fluorescence image acquisition unit>
The fluorescence image acquisition unit 212 (processor 21) acquires, via the input/output unit 23, the fluorescence image 81 captured by the fluorescence imaging unit 111 of the fluorescence imaging device 10, and outputs the acquired fluorescence image 81 to the region extraction unit 213.
When the fluorescence images 81 are captured, the imaging field of view with respect to the subject 1 is assumed to be fixed and not to move. The individual fluorescence images 81 acquired sequentially in time series are identified by frame numbers, each frame number representing the time of capture. The fluorescence images 81 of the respective frame numbers are images of the same position of the subject 1 captured at different times. In this specification, the fluorescence image 81 at each time point is interchangeable with the fluorescence image 81 of each frame number.
<Region extraction unit>
The region extraction unit 213 (processor 21) extracts the fluorescence start region 84 from the fluorescence images 81 generated in time series by the fluorescence imaging unit 111. The region extraction unit 213 (processor 21) detects the fluorescence start region 84 by comparing the fluorescence images 81 generated at a plurality of time points with the fluorescence images 81 generated before each of those time points.
As shown in FIG. 6, the fluorescence start region 84 is the region in which the fluorescence 92 is detected for the first time in the fluorescence image 81 generated at each time point after the start of imaging. The fluorescence start region 84 is, for example, an individual pixel of the fluorescence image 81; that is, the region extraction unit 213 extracts the fluorescence start region 84 in the fluorescence image 81 pixel by pixel. The region extraction unit 213 may instead extract a group of a plurality of pixels as the fluorescence start region 84.
The fluorescence image 81 is obtained by detecting and imaging the fluorescence 92 generated from the fluorescent agent 3, and the pixel values of the fluorescence image 81 represent fluorescence intensity. When the fluorescent agent 3 is administered to the subject 1, the fluorescent agent 3 diffuses over time with the blood flow. When the fluorescent agent 3 passes through a certain position in the fluorescence image 81, the pixel value at that position rises, upon detection of the fluorescence 92, from the low value at which no fluorescence 92 is detected (the background level). Thereafter, as the fluorescent agent 3 flows away, the pixel value falls and returns to the state in which no fluorescence 92 is detected. The fluorescence start region 84 is the region in which this rise of the pixel value is detected for the first time. In FIG. 6, for convenience, regions of the fluorescence image 81 whose pixel values are higher than a certain value are hatched and regions whose pixel values are at or below that value are shown plain; the fluorescence start regions 84 extracted from the individual fluorescence images 81 are likewise shown hatched.
Since the fluorescence start region 84 extracted from a fluorescence image 81 is a region in which the fluorescence 92 is detected for the first time after the start of imaging, the same fluorescence start region 84 is never extracted more than once.
For example, suppose the fluorescence start region 84a is extracted from the fluorescence image 81 of a certain frame number (M1). In the fluorescence image 81 of the next frame number (M2), reflecting the diffusion of the fluorescent agent 3, a fluorescence start region 84b is extracted in the surrounding region adjacent to the fluorescence start region 84a of frame number (M1). The fluorescence start region 84b does not include the fluorescence start region 84a of frame number (M1). Similarly, in the fluorescence image 81 of the next frame number (M3), a fluorescence start region 84c is extracted in the surrounding region adjacent to the fluorescence start region 84b of frame number (M2).
The method of extracting the fluorescence start region 84 from the fluorescence image 81 is not particularly limited; here, the three examples shown in FIGS. 7 to 9 are described. FIGS. 7 to 9 are graphs showing a time intensity curve (TIC) 71 of one pixel in the fluorescence image 81. In each graph, the vertical axis indicates the pixel value (that is, the fluorescence intensity) and the horizontal axis indicates the frame number (that is, the elapsed time).
In the first example, as shown in FIG. 7, the processor 21 (region extraction unit 213) extracts the fluorescence start region 84 based on the pixel value of a pixel of the fluorescence image 81 exceeding a predetermined threshold value 72. That is, the region extraction unit 213 extracts pixels whose pixel values exceed the threshold value 72 as the fluorescence start region 84. The threshold value 72 is set to a predetermined value higher than the pixel value (background level) at time points before the rise of the pixel value (region 71a).
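A minimal sketch of this first example in Python/NumPy, under the assumption that the fluorescence frames arrive as 2-D arrays. The `first_detect` array (a name chosen here for illustration) records, per pixel, the frame number at which the threshold was first exceeded (0 meaning not yet detected), so a pixel is never extracted twice:

```python
import numpy as np

def update_start_regions(frame: np.ndarray, frame_no: int,
                         first_detect: np.ndarray, threshold: float) -> np.ndarray:
    """Mark pixels whose value first exceeds `threshold` in this frame.

    `first_detect` holds the frame number of first detection per pixel
    (0 = not yet detected) and is updated in place. Returns a boolean
    mask of the fluorescence start region extracted from this frame.
    """
    new_pixels = (frame > threshold) & (first_detect == 0)
    first_detect[new_pixels] = frame_no
    return new_pixels

# Toy sequence: fluorescence spreads from the left column to the right.
first_detect = np.zeros((2, 3), dtype=int)
frames = [np.array([[10, 0, 0], [10, 0, 0]]),
          np.array([[10, 10, 0], [10, 10, 0]]),
          np.array([[10, 10, 10], [10, 10, 10]])]
for n, f in enumerate(frames, start=1):
    mask = update_start_regions(f, n, first_detect, threshold=5)
    print(n, int(mask.sum()))  # each frame contributes 2 new pixels
```

Because already-detected pixels are excluded by the `first_detect == 0` condition, the behavior matches the description that the same fluorescence start region is never extracted more than once.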
In the second example, as shown in FIG. 8, the processor 21 (region extraction unit 213) extracts the fluorescence start region 84 based on the slope of the time intensity curve 71 of a pixel exceeding a predetermined threshold value 73. The region extraction unit 213 extracts pixels in which the slope of the time intensity curve 71 exceeds the threshold value 73 as the fluorescence start region 84. Although FIG. 8 is schematic for convenience, the slope of the time intensity curve 71 is, for example, the difference (amount of change) between the pixel values of two adjacent frames. The threshold value 73 is set to a predetermined value larger than the slope in the region 71a.
In the third example, as shown in FIG. 9, the processor 21 (region extraction unit 213) extracts the fluorescence start region 84 based on the area value of the time intensity curve 71 of a pixel exceeding a predetermined threshold value 74. The region extraction unit 213 extracts pixels in which the area value of the time intensity curve 71 exceeds the threshold value 74 as the fluorescence start region 84. The area value of the time intensity curve 71 is, for example, the cumulative sum, over the frames, of the pixel values exceeding the background level. Instead of the area value, the average value of the time intensity curve 71 (the area value divided by the number of frames) may be used.
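The second and third criteria can be sketched in the same per-pixel style; here the slope is taken as the difference between two adjacent frames and the area as the running sum of pixel values above the background level, with illustrative thresholds (all names and values are assumptions, not from the patent):

```python
import numpy as np

def slope_criterion(prev_frame, frame, slope_threshold):
    """Pixels whose frame-to-frame increase exceeds the slope threshold."""
    return (frame.astype(float) - prev_frame) > slope_threshold

def area_criterion(tic_area, frame, background, area_threshold):
    """Accumulate pixel values above background into `tic_area` (in place)
    and return pixels whose accumulated TIC area exceeds the threshold."""
    tic_area += np.clip(frame.astype(float) - background, 0, None)
    return tic_area > area_threshold

prev = np.array([[0.0, 0.0]])
cur = np.array([[8.0, 1.0]])
print(slope_criterion(prev, cur, slope_threshold=5))  # [[ True False]]

tic_area = np.zeros((1, 2))
m1 = area_criterion(tic_area, cur, background=0.5, area_threshold=10)
m2 = area_criterion(tic_area, cur, background=0.5, area_threshold=10)
print(m1, m2)  # the left pixel crosses the area threshold on the second call
```

In a full implementation, either mask would feed the same first-detection bookkeeping as the threshold example, so each pixel is still extracted only once.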
In this way, depending on in which frame number (at which time point) of the time-series fluorescence images 81 a fluorescence start region 84 is extracted, the processor 21 (region extraction unit 213) can identify the position and time at which the fluorescence 92 was first detected. A fluorescence start region 84 extracted from the fluorescence image 81 of one frame number is never extracted again from the fluorescence images 81 of subsequent frame numbers. The region extraction unit 213 outputs information identifying the extracted fluorescence start regions 84 to the diffusion image generation unit 215.
<Pulsation cycle acquisition unit>
As shown in FIG. 5, the pulsation cycle acquisition unit 214 acquires the pulsation cycle 60 of the blood flow of the subject 1. In the present embodiment, the processor 21 (pulsation cycle acquisition unit 214) is configured to count the number of fluorescence start regions 84 for each generated fluorescence image 81 and to detect the pulsation cycle 60 based on the number of fluorescence start regions 84 in each of the fluorescence images 81 generated in time series. In the configuration in which the fluorescence start region 84 is extracted pixel by pixel, the number of fluorescence start regions 84 counted for each fluorescence image 81 is the total number of pixels extracted as fluorescence start regions 84 in that fluorescence image 81.
Here, the fluorescent agent 3 is carried by the blood flow of the subject 1, branches from arteries into smaller blood vessels, diffuses into the capillaries and body tissue, and then flows into the veins via the capillaries. This flow of blood permeating the body tissue is called perfusion. The fluorescent agent 3 contained in the blood diffuses through the body tissue in the imaging field of view by perfusion. Since perfusion is driven by the pulsation of the blood caused by the beating of the heart, the diffusion of the fluorescent agent 3 also varies periodically in response to the pulsation. FIG. 10 shows a heartbeat waveform 75 detected by an electrocardiograph and a pulse waveform 76 detected by a pulse wave meter. In both the waveform 75 and the waveform 76, the horizontal axis represents time and the vertical axis represents signal intensity. At each peak in the waveform 75, the systolic blood pressure peak of the heart arrives, and because of the propagation delay of the blood pressure peak, the corresponding peak in the waveform 76 is formed with a time lag.
From this, the inventor of the present application obtained the finding that the fluorescent agent observed in the fluorescence images 81 diffuses periodically with the pulsation derived from the heartbeat. Therefore, in the present embodiment, the change per pulsation cycle 60 is visualized from the fluorescence images 81 acquired in time series.
FIG. 11 shows a graph 65 of the change in the number of fluorescence start regions 84 over time. The horizontal axis of the graph 65 indicates the elapsed time (that is, the number of frames), and the vertical axis indicates the number of fluorescence start regions 84 (the total number of fluorescence start regions 84 contained in the frame image). Looking at the change in this number, peaks 66 in the number of fluorescence start regions 84 are formed corresponding to the peaks in the waveform 75 and the waveform 76 shown in FIG. 10. That is, the fluorescent agent 3 diffuses rapidly with the blood at the timing corresponding to a heartbeat or pulse peak, and its diffusion slows until the next peak arrives. A peak 66 in the number of fluorescence start regions 84 is therefore formed at the timing corresponding to each heartbeat or pulse peak. Since the peaks 66 originate from the beating of the heart, one peak is formed for each beat (or each pulsation).
Accordingly, the processor 21 (pulsation cycle acquisition unit 214) is configured to acquire the change in the number of fluorescence start regions 84 for each fluorescence image 81 and to detect the pulsation cycle 60 based on the peaks 66 in that number. In the graph 65, the period between two adjacent peaks 66 is the pulsation cycle 60 of the blood caused by the beating of the heart. The pulsation cycle acquisition unit 214 detects the pulsation cycle 60 based on, for example, the time interval between the apexes of adjacent peaks 66, or the time interval between the rising or falling points of adjacent peaks 66.
Specifically, the pulsation cycle acquisition unit 214 identifies in which frame number of the fluorescence images 81, acquired in order along the time series, a peak 66 in the number of fluorescence start regions 84 is detected.
The pulsation cycle acquisition unit 214 identifies the interval from the frame number of the immediately preceding peak 66 to the frame number of the currently detected peak 66 as one pulsation cycle 60. When there is no preceding peak 66, the interval from the start of imaging to the frame number of the currently detected peak 66 is identified as one pulsation cycle 60. A pulsation cycle 60 can be specified by a start frame number at which the cycle begins and an end frame number at which it ends. Fluorescence images 81 whose frame numbers fall within the range from the start frame number to the end frame number inclusive belong to the same pulsation cycle 60.
FIG. 11 shows an example in which, as time passes, the first peak 66a is detected at frame number (N1), the second peak 66b at frame number (N2), the third peak 66c at frame number (N3), the fourth peak 66d at frame number (N4), and the fifth peak 66e at frame number (N5). In this case, frame numbers (1) to (N1) form cycle 1, frame numbers (N1+1) to (N2) form cycle 2, frame numbers (N2+1) to (N3) form cycle 3, frame numbers (N3+1) to (N4) form cycle 4, and frame numbers (N4+1) to (N5) form cycle 5.
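Assuming the per-frame counts of newly detected fluorescence start pixels are available as a list, the peak frame numbers and the cycle assignment described above can be sketched as follows. The simple local-maximum peak finder is illustrative only; a real implementation would likely smooth the counts first:

```python
def find_peaks(counts):
    """Frame numbers (1-based) of simple local maxima in the per-frame
    counts of newly detected fluorescence start pixels."""
    peaks = []
    for i in range(1, len(counts) - 1):
        if counts[i] > counts[i - 1] and counts[i] >= counts[i + 1]:
            peaks.append(i + 1)  # convert 0-based index to frame number
    return peaks

def cycle_of_frame(frame_no, peak_frames):
    """Pulsation cycle index (1-based) of a frame: cycle k runs from the
    frame after peak k-1 up to and including peak k."""
    for k, p in enumerate(peak_frames, start=1):
        if frame_no <= p:
            return k
    return len(peak_frames) + 1  # frames after the last detected peak

counts = [1, 5, 2, 1, 6, 2, 1]  # toy data: peaks at frames 2 and 5
peaks = find_peaks(counts)
print(peaks)                     # [2, 5]
print([cycle_of_frame(n, peaks) for n in range(1, 8)])
# frames 1-2 -> cycle 1, frames 3-5 -> cycle 2, frames 6-7 -> cycle 3
```

This mirrors the (N1), (N2), ... bookkeeping above: each cycle ends at a peak frame and the next cycle starts at the following frame.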
 <Diffusion image generation unit>
 The diffusion image generation unit 215 (processor 21, see FIG. 5) generates the fluorescent agent diffusion image 85 by superimposing the fluorescence start regions 84 at each time point in mutually different modes. In the present embodiment, the diffusion image generation unit 215 generates the fluorescent agent diffusion image 85 (see FIG. 12) based on the fluorescence start regions 84 extracted by the region extraction unit 213 and the pulsation cycles 60 acquired by the pulsation cycle acquisition unit 214. The fluorescent agent diffusion image 85 is an image that displays the extracted fluorescence start regions 84 in a different mode for each pulsation cycle 60.
 Specifically, as shown in FIG. 5, the diffusion image generation unit 215 acquires the fluorescence start regions 84 (see FIG. 6) extracted by the region extraction unit 213 in chronological order, starting from frame number (1). The diffusion image generation unit 215 also acquires the pulsation cycles 60 detected by the pulsation cycle acquisition unit 214; that is, it acquires the frame numbers at which peaks 66 in the number of fluorescence start regions 84 were detected.
 Then, as shown in FIG. 12, the diffusion image generation unit 215 displays the fluorescence start regions 84 extracted from fluorescence images 81 belonging to the same pulsation cycle 60 together in the same mode. For each pulsation cycle 60, the diffusion image generation unit 215 changes the display mode of the fluorescence start regions 84 extracted from the fluorescence images 81 belonging to that cycle. In this way, the diffusion image generation unit 215 (processor 21) generates a fluorescent agent diffusion image 85 in which the fluorescence start regions 84 at the respective time points of each pulsation cycle 60 are superimposed in different modes.
 FIG. 12 shows the flow of the process of generating the fluorescent agent diffusion image 85.
 First, the region extraction unit 213 acquires the fluorescence start regions 84 in order, starting from frame number (1). The diffusion image generation unit 215 generates a fluorescent agent diffusion image 85 that displays, in the same mode 86a, the fluorescence start regions 84 extracted from each fluorescence image 81 until a peak 66 in the number of fluorescence start regions 84 is detected.
 For example, in FIG. 12, the pulsation cycle acquisition unit 214 detects frame number (1) through frame number (N1) as the first pulsation cycle 60 (cycle 1). The diffusion image generation unit 215 generates a fluorescent agent diffusion image 85 that displays, in the same mode 86a, the fluorescence start regions 84 extracted from each of the fluorescence images 81 from frame number (1) to frame number (N1). In FIG. 12, for convenience, differences in display mode are represented by differences in the hatching applied to the fluorescence start regions 84.
 The pulsation cycle acquisition unit 214 detects frame number (N1+1) through frame number (N2) as the second pulsation cycle 60 (cycle 2). The diffusion image generation unit 215 generates a fluorescent agent diffusion image 85 that displays, in the same mode 86b, the fluorescence start regions 84 extracted from each fluorescence image 81 belonging to cycle 2. Mode 86b is a display mode different from mode 86a, so that the user can visually distinguish the fluorescence start regions 84 displayed in mode 86a from those displayed in mode 86b.
 The pulsation cycle acquisition unit 214 detects frame number (N2+1) through frame number (N3) as the third pulsation cycle 60 (cycle 3). The diffusion image generation unit 215 generates a fluorescent agent diffusion image 85 that displays, in the same mode 86c, the fluorescence start regions 84 extracted from each fluorescence image 81 belonging to cycle 3. Mode 86c is a display mode different from both mode 86a and mode 86b.
 The same applies to the pulsation cycles 60 from cycle 3 onward. In this way, the processor 21 (diffusion image generation unit 215) generates a fluorescent agent diffusion image 85 that displays the extracted fluorescence start regions 84 in mutually different modes for each pulsation cycle 60.
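The accumulation of per-cycle fluorescence start regions can be sketched as a label image in which each pixel records the cycle in which it first fluoresced; the per-cycle display modes 86a, 86b, 86c, ... then correspond to rendering each label differently. This is an illustrative sketch only; `build_diffusion_image` and its inputs are hypothetical names, not part of the embodiment.

```python
import numpy as np

def build_diffusion_image(start_region_masks, cycle_of_frame):
    """Accumulate per-frame fluorescence start regions into one label image.

    start_region_masks: one boolean array per frame; True where fluorescence
    was detected for the first time in that frame.
    cycle_of_frame: the pulsation cycle number of each frame.
    Returns an int array: 0 = never fluoresced, k = first fluoresced in cycle k.
    """
    h, w = start_region_masks[0].shape
    diffusion = np.zeros((h, w), dtype=int)
    for mask, cycle in zip(start_region_masks, cycle_of_frame):
        new = mask & (diffusion == 0)   # only pixels not yet assigned a cycle
        diffusion[new] = cycle
    return diffusion
```

Because a pixel keeps the label of the cycle in which it first fluoresced, later frames never overwrite earlier regions, matching the "first detected" semantics of the fluorescence start region 84.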
 <Display processing of fluorescence start regions belonging to the same pulsation cycle>
 The processor 21 (diffusion image generation unit 215) generates (updates) the fluorescent agent diffusion image 85 for each frame. In the present embodiment, each time a fluorescence image 81 within the current pulsation cycle 60 is generated, the processor 21 (diffusion image generation unit 215) adds the fluorescence start region 84 extracted from that latest fluorescence image 81 to the fluorescent agent diffusion image 85, in addition to the already extracted fluorescence start regions 84 of that pulsation cycle 60.
 For example, as shown in FIG. 13, suppose that the fluorescence start region 84a is extracted from the fluorescence image 81 of frame number (1). The diffusion image generation unit 215 generates a fluorescent agent diffusion image 85 containing the fluorescence start region 84a.
 In the fluorescence image 81 of the next frame number (2), the fluorescence start region 84b is extracted in the surrounding region adjacent to the fluorescence start region 84a of frame number (1). The diffusion image generation unit 215 generates a fluorescent agent diffusion image 85 in which the fluorescence start region 84b is added to the fluorescence start region 84a.
 Further, in the fluorescence image 81 of the next frame number (3), the fluorescence start region 84c is extracted in the surrounding region adjacent to the fluorescence start region 84b of frame number (2). The diffusion image generation unit 215 generates a fluorescent agent diffusion image 85 in which the fluorescence start region 84c is added to the fluorescent agent diffusion image 85 of frame number (2).
 In this way, between the start frame number (1) and the end frame number (N1) of a pulsation cycle 60, the diffusion image generation unit 215 adds the fluorescence start region 84 extracted from each newly acquired fluorescence image 81 to the fluorescent agent diffusion image 85. As a result, as shown in FIG. 13, the generated fluorescent agent diffusion image 85 is a moving image in which the fluorescence start region 84 gradually expands over time until the pulsation cycle 60 switches to the next cycle. The fluorescent agent diffusion image 85 at the end frame number of a pulsation cycle 60 therefore displays all of the fluorescence start regions 84 extracted within that pulsation cycle 60.
 Then, as shown in FIG. 12, when the next pulsation cycle 60 (for example, cycle 2) begins, each time a fluorescence image 81 belonging to that pulsation cycle 60 is acquired, a fluorescent agent diffusion image 85 is generated in which a fluorescence start region 84, displayed in a mode different from that of the previous pulsation cycle 60 (for example, cycle 1), gradually expands.
 <Display processing of fluorescence start regions belonging to different pulsation cycles>
 Further, in the present embodiment, each time a pulsation cycle 60 elapses, the processor 21 (diffusion image generation unit 215) adds the fluorescence start regions 84 extracted in the latest pulsation cycle 60 to the fluorescent agent diffusion image 85, in addition to the fluorescence start regions 84 extracted in the past pulsation cycles 60.
 For example, as shown in FIG. 12, during the first pulsation cycle 60 (cycle 1), the diffusion image generation unit 215 generates a fluorescent agent diffusion image 85 containing the fluorescence start regions 84 in mode 86a. Even after the first pulsation cycle 60 (cycle 1) has elapsed, the diffusion image generation unit 215 keeps the fluorescence start regions 84 in mode 86a displayed.
 Next, during the second pulsation cycle 60 (cycle 2), the diffusion image generation unit 215 generates a fluorescent agent diffusion image 85 containing the fluorescence start regions 84 in mode 86b. At this time, the diffusion image generation unit 215 displays the fluorescence start regions 84 in mode 86b on the fluorescent agent diffusion image 85 in addition to the fluorescence start regions 84 in mode 86a. In the third pulsation cycle 60 (cycle 3), the diffusion image generation unit 215 displays the fluorescence start regions 84 in mode 86c on the fluorescent agent diffusion image 85 in addition to the fluorescence start regions 84 in modes 86a and 86b.
 In this way, each time a pulsation cycle 60 elapses, the diffusion image generation unit 215 changes the display mode of the fluorescence start regions 84 and adds the fluorescence start regions 84 belonging to that pulsation cycle 60 to the fluorescent agent diffusion image 85. As a result, as shown in FIG. 12, the generated fluorescent agent diffusion image 85 is a moving image in which the displayed fluorescence start region 84 gradually expands while its display mode changes with each passing pulsation cycle 60. In the fluorescent agent diffusion image 85, boundary lines 87 are formed corresponding to the timing at which the pulsation cycle 60 switches (the timing at which the display mode switches).
 A boundary line 87 is displayed so as to be recognizable as a boundary because the display modes of the adjacent fluorescence start regions 84 on either side of it differ (the adjacent fluorescence start regions 84 belong to different pulsation cycles 60). When the imaging of the fluorescence images 81 ends, the fluorescent agent diffusion image 85 displays the fluorescence start regions 84 divided by a number of boundary lines 87 corresponding to the number of pulsation cycles 60 that have elapsed by the end of imaging. The fluorescent agent diffusion image 85 is therefore an image in which the boundary lines 87 indicating the timing at which the pulsation cycle 60 switches are formed like contour lines.
 <Image composition unit>
 As shown in FIG. 5, the image composition unit 216 (processor 21) performs processing that composites the visible light image 82 acquired by the visible light image acquisition unit 211 with the fluorescent agent diffusion image 85 generated by the diffusion image generation unit 215. Compositing includes processing that superimposes a plurality of images. Specifically, as shown in FIG. 14, the image composition unit 216 generates a superimposed image 88 by superimposing the fluorescent agent diffusion image 85 on the visible light image 82. The resulting superimposed image 88 is therefore an image in which the fluorescent agent diffusion image 85, which shows how the fluorescent agent 3 diffuses, is displayed over the visible light image 82 showing the treatment target site 2 as the user actually sees it.
 As described above, the fluorescence image 81 and the visible light image 82 captured by the fluorescence imaging device 10 are images of the same field of view. The fluorescent agent diffusion image 85 is generated from the fluorescence images 81 and has the same field of view as the fluorescence image 81. The visible light image 82 and the fluorescent agent diffusion image 85 therefore depict the same field of view. Because the two images share the same field of view, the superimposed image 88 can be generated by the simple process of directly superimposing them, without aligning the images to each other.
 As shown in FIG. 5, the image composition unit 216 outputs the generated superimposed image 88 to the input/output unit 23. The input/output unit 23 outputs the superimposed image 88 acquired from the image composition unit 216 to the display unit 30 for on-screen display. The processor 21 thereby causes the display unit 30 to display the generated fluorescent agent diffusion image 85. In the present embodiment, the processor 21 (image composition unit 216) is configured to superimpose the fluorescent agent diffusion image 85 on the visible light image 82 and display the result on the display unit 30. Each time the processor 21 acquires the frame images of the latest frame (fluorescence image 81 and visible light image 82) from the fluorescence imaging device 10, it generates the fluorescent agent diffusion image 85 for that frame and generates a superimposed image 88 in which the fluorescent agent diffusion image 85 is superimposed on the visible light image 82. By outputting the superimposed image 88 generated for each frame to the display unit 30, the processor 21 displays the superimposed image 88 as a moving image.
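Because the two images share the same field of view, the superimposition can be sketched as a direct per-pixel blend with no registration step. This sketch is illustrative only; `superimpose` and the alpha-blend weighting are hypothetical choices, not details of the embodiment.

```python
import numpy as np

def superimpose(visible_rgb, diffusion_rgb, diffusion_mask, alpha=0.5):
    """Overlay a colored diffusion image onto the visible light image.

    Both images cover the same field of view, so pixels are blended
    directly at identical coordinates.
    diffusion_mask: True where a fluorescence start region exists.
    """
    out = visible_rgb.astype(float).copy()
    # blend only where the diffusion image has content; elsewhere keep the visible image
    blend = (1 - alpha) * out[diffusion_mask] \
            + alpha * diffusion_rgb.astype(float)[diffusion_mask]
    out[diffusion_mask] = blend
    return out.astype(np.uint8)
```

With alpha=0.5 the treatment site remains visible beneath the diffusion overlay; other weightings simply shift the emphasis between the two images.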
 (Display modes of the fluorescent agent diffusion image)
 Next, specific display modes of the fluorescent agent diffusion image 85 (superimposed image 88) will be described.
 FIGS. 15 to 17 are specific examples of the superimposed image 88 containing the fluorescent agent diffusion image 85, and show a flap of the subject 1 to be transplanted in flap surgery (skin grafting). A flap is skin, subcutaneous tissue, and deep tissue with blood flow. FIGS. 15 to 17 show superimposed images 88 in which the fluorescent agent diffusion image 85 is superimposed on a visible light image 82 showing the skin 2a and subcutaneous tissue 2b of the treatment target site 2.
 In the present embodiment, the processor 21 (see FIG. 5) displays the fluorescence start regions 84 of the respective pulsation cycles 60 in mutually different modes by at least one of: (1) displaying the fluorescence start regions 84 of the respective pulsation cycles 60 in different gradations; and (2) displaying lines indicating the fluorescence start regions 84 of the respective pulsation cycles 60 in the form of a contour map.
 In the example of FIG. 15, the processor 21 displays the fluorescence start regions 84 of the respective pulsation cycles 60 in different gradations. Here, a gradation is a step (gradation) of color or brightness expressed in an image, and includes stepwise changes in color and stepwise changes in brightness. For example, when the fluorescent agent diffusion image 85 is expressed in 256 levels of grayscale, the gradation includes a stepwise change in brightness from white (255) through gray (126) to black (0).
 For example, when K pulsation cycles 60 from cycle 1 to cycle K have been detected, the image processing unit 20 assigns white (255) to the oldest pulsation cycle 60 (cycle 1) and decreases the gradation value stepwise toward black (0) as the cycle approaches the latest cycle K.
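One way to realize this grayscale assignment is a linear mapping from cycle index to gray level. This is an illustrative sketch under the assumption that the latest cycle K is mapped all the way to black (0); `grayscale_for_cycle` is a hypothetical name.

```python
def grayscale_for_cycle(k, K):
    """Map cycle k (1 <= k <= K) to a gray level: cycle 1 -> white (255),
    decreasing stepwise toward black (0) as k approaches the latest cycle K."""
    if K == 1:
        return 255          # only one cycle: keep it white
    return round(255 * (K - k) / (K - 1))
```

As new cycles are detected, K grows and the levels assigned to earlier cycles shift accordingly, keeping the oldest region brightest.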
 When the fluorescent agent diffusion image 85 is a color image in which colors are expressed by combinations of the gradation values of the three colors R (red, 0 to 255), G (green, 0 to 255), and B (blue, 0 to 255), the gradation includes stepwise changes that vary the color continuously in a predetermined order, such as red, yellow, green, and blue.
 For example, when K pulsation cycles 60 from cycle 1 to cycle K have been detected, the processor 21 assigns red to the oldest pulsation cycle 60 (cycle 1) and blue to the latest cycle K, and assigns color gradations to each cycle between cycle 1 and cycle K so that the color changes stepwise from red toward blue.
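A color assignment that passes through red, yellow, green, and blue in that order can be sketched by interpolating the hue between red (hue 0) and blue (hue 2/3). This is an illustrative sketch; the hue-based interpolation and the name `rgb_for_cycle` are assumptions, not details of the embodiment.

```python
import colorsys

def rgb_for_cycle(k, K):
    """Map cycle k (1 <= k <= K) to an RGB color: cycle 1 -> red, cycle K -> blue,
    with intermediate cycles passing through yellow and green."""
    hue = 0.0 if K == 1 else (2.0 / 3.0) * (k - 1) / (K - 1)  # 0 = red, 2/3 = blue
    r, g, b = colorsys.hsv_to_rgb(hue, 1.0, 1.0)
    return (round(r * 255), round(g * 255), round(b * 255))
```

Interpolating in hue rather than directly in RGB is what yields the intermediate yellow and green steps mentioned above.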
 In FIG. 15, the boundary lines 87 are shown as broken lines for convenience, to make the regions of different gradations easier to distinguish. The fluorescent agent diffusion image 85 displays the fluorescence start regions 84 extracted in the respective pulsation cycles 60 so that they can be distinguished and recognized by their difference in gradation.
 In the example of FIG. 16, the processor 21 displays lines indicating the fluorescence start regions 84 of the respective pulsation cycles 60 in the form of a contour map. A contour map is a diagram that represents a distribution by drawing lines connecting points of equal value. In the fluorescent agent diffusion image 85, as shown in FIG. 12, a boundary line 87 is formed between a fluorescence start region 84 belonging to one pulsation cycle 60 and the fluorescence start region 84 belonging to the next pulsation cycle 60. Displaying the lines indicating the fluorescence start regions 84 of the respective pulsation cycles 60 in the form of a contour map means displaying these boundary lines 87 on the image. As the fluorescent agent 3 diffuses, each temporally newer boundary line 87 is formed outside the temporally older boundary lines 87, so the fluorescent agent diffusion image 85 becomes a contour-map-like image.
 Each time a pulsation cycle 60 elapses, the processor 21 extracts the boundary line 87 and displays it on the fluorescent agent diffusion image 85. As a result, the contour-map-like fluorescent agent diffusion image 85 shown in FIG. 16 is generated. The fluorescent agent diffusion image 85 displays the fluorescence start regions 84 extracted in the respective pulsation cycles 60 so that each can be distinguished and recognized as the region between two adjacent boundary lines 87.
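Given a label image in which each pixel stores the cycle in which it first fluoresced, the boundary lines 87 are simply the pixels where the label differs from a neighbor. The following is an illustrative sketch of such an extraction; `boundary_mask` is a hypothetical name and the neighbor convention is an assumption.

```python
import numpy as np

def boundary_mask(label_image):
    """Mark pixels whose cycle label differs from the right or lower neighbor:
    these pixels form the contour-like boundary lines between cycles."""
    lab = np.asarray(label_image)
    edge = np.zeros(lab.shape, dtype=bool)
    edge[:, :-1] |= lab[:, :-1] != lab[:, 1:]   # horizontal neighbor differs
    edge[:-1, :] |= lab[:-1, :] != lab[1:, :]   # vertical neighbor differs
    return edge
```

Because each newer cycle's region lies outside the older ones, the resulting edges nest like contour lines.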
 The gradation-based display mode and the line-based (contour line) display mode may be combined. In FIG. 17, the processor 21 displays the fluorescence start regions 84 of the respective pulsation cycles 60 in different gradations while displaying the boundary line 87 of the latest pulsation cycle 60 (cycle K) as a boundary line. Each time a pulsation cycle 60 elapses, the processor 21 changes the gradation of the fluorescence start regions 84 belonging to that pulsation cycle 60; in addition, each time the elapse of a pulsation cycle 60 (the switch to the next pulsation cycle 60) is detected, the processor 21 draws the boundary line 87 of the latest pulsation cycle 60 (cycle K) and erases the boundary line of the previous pulsation cycle 60 (cycle K-1). The processor 21 makes the display mode of the boundary line 87 different from the display modes of the fluorescence start regions 84 of the respective pulsation cycles 60.
 The rules for the display modes of the fluorescence start regions 84 shown in FIGS. 15 to 17 are recorded in the storage unit 22 shown in FIG. 5. The processor 21 is configured to be able to accept a selection of the display mode, for example through a user's operation input. The processor 21 generates the fluorescent agent diffusion image 85 in the display mode selected by the user, according to the display mode setting information stored in the storage unit 22.
 (Image display processing operation of the treatment support device)
 Next, the image display processing executed by the treatment support device 100 of the present embodiment will be described with reference to FIG. 18. The treatment support device 100 carries out the treatment support method of the present embodiment, which includes at least the following steps (1) to (5).
 Step (1): Irradiating the subject with the excitation light 91 for the fluorescent agent administered to the subject.
 Step (2): Detecting the fluorescence 92 excited by the excitation light and capturing fluorescence images 81 of the subject.
 Step (3): The processor 21 detects, at each of a plurality of time points (frame numbers), the region of the fluorescence image 81 in which fluorescence 92 is detected for the first time at that time point (the fluorescence start region 84), by comparing the fluorescence image 81 generated at that time point with the fluorescence images 81 generated before it.
 Step (4): The processor 21 generates the fluorescent agent diffusion image 85 by superimposing the fluorescence start regions 84 at the respective time points in different modes.
 Step (5): Displaying the generated fluorescent agent diffusion image 85 on the display unit.
 Steps (1) and (2) correspond to step 51 in FIG. 18, step (3) corresponds to step 52, step (4) to step 53, and step (5) to step 54. The treatment support method of FIG. 18 also includes further steps in addition to steps (1) to (5) described above.
 The image display processing of the treatment support device 100 starts when imaging by the fluorescence imaging device 10 is started based on an operation input by a user such as a physician.
 As shown in FIG. 18, in step 51, the fluorescence image 81 and the visible light image 82 of the subject 1 are captured. That is, the excitation light irradiation unit 115 of the fluorescence imaging device 10 irradiates the subject 1 with the excitation light 91, and the fluorescence 92 excited by the excitation light 91 is detected by the fluorescence imaging unit 111. The visible light irradiation unit 117 irradiates the subject 1 with visible light 93, and the reflected visible light 93 is detected by the visible light imaging unit 112. In this way, a fluorescence image 81 and a visible light image 82 corresponding to one frame are captured and output to the image processing unit 20.
 In step 52, the processor 21 (region extraction unit 213) extracts the fluorescence start region 84 from the fluorescence image 81 obtained in step 51. The processor 21 (pulsation cycle acquisition unit 214) also accumulates the count of fluorescence start regions 84 extracted in the fluorescence image 81. After imaging starts, until the first peak 66 in the number of fluorescence start regions 84 is detected, the processor 21 (pulsation cycle acquisition unit 214) assigns the obtained fluorescence images 81 to the first pulsation cycle 60 (cycle 1).
 In step 53, the processor 21 (diffusion image generation unit 215) generates the fluorescent agent diffusion image 85 based on the extracted fluorescence start regions 84 and the pulsation cycles 60. The processor 21 generates a fluorescent agent diffusion image 85 that displays the fluorescence start regions 84 in a preset display mode, according to the setting information stored in the storage unit 22.
 In step 54, the processor 21 (image composition unit 216) superimposes the fluorescent agent diffusion image 85 on the visible light image 82 to generate the superimposed image 88, and outputs the generated superimposed image 88 to the display unit 30 (see FIG. 1) via the input/output unit 23. As a result, one frame of the superimposed image 88 constituting the moving image is displayed on the display unit 30.
 In step 55, the processor 21 (pulsation cycle acquisition unit 214) determines whether a pulsation cycle 60 has elapsed. That is, the processor 21 plots the number of fluorescence start regions 84 extracted from the fluorescence image 81 of the current frame number on the graph 65 shown in FIG. 11, and determines whether a peak 66 in the change of the number of fluorescence start regions 84 has been detected. If no peak 66 is detected, the processor 21 advances the processing to step 56.
 In step 56, the processor 21 determines whether to end the imaging. The processor 21 determines that the imaging should end, for example, when it receives an operation input from the user to end the imaging or when a preset end time is reached. When the imaging ends, the fluorescence imaging device 10 stops capturing, the processor 21 stops the image processing, and the image display processing of FIG. 18 ends. When the imaging does not end, the processor 21 returns the processing to step 51, acquires the images of the next frame (fluorescence image 81 and visible light image 82), and performs the processing of steps 51 to 54.
 When the loop of steps 51 to 56 is repeated and frame images are displayed one after another, a peak 66 in the change of the number of fluorescence start regions 84 is detected at a certain frame number, as shown in FIG. 11. That is, in step 55, the processor 21 (pulsation cycle acquisition unit 214) detects the peak 66 in the change of the number of fluorescence start regions 84. When the peak 66 is detected, the processor 21 advances the processing to step 57.
 In step 57, the processor 21 determines the display mode of the fluorescence start regions 84 belonging to the next pulsation cycle 60. That is, in accordance with the display-mode setting information stored in the storage unit 22, the processor 21 determines the display mode of the fluorescence start regions 84 to be extracted from the next frame number onward. Next, the processor 21 determines in step 56 whether to end imaging, and if imaging is not to be ended, processes the fluorescence image 81 and the visible light image 82 of the next frame number in step 51.
 As a result, the fluorescence start regions 84 extracted from the fluorescence images 81 of the frames after the display mode is determined in step 57 are displayed in the fluorescent agent diffusion image 85 in a mode different from that of the fluorescence start regions 84 belonging to the preceding pulsation cycle 60. The display mode determined in step 57 is applied up to the frame number at which the next peak 66 in the number of fluorescence start regions 84 is detected. Consequently, in the fluorescent agent diffusion image 85, the fluorescence start regions 84 extracted from the fluorescence image 81 of each frame are displayed in a different mode for each pulsation cycle 60.
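 The per-frame loop of steps 51 to 57 can be sketched as follows. This is a minimal illustration rather than the actual implementation of the processor 21: frames are assumed to arrive as lists of newly extracted fluorescence-start pixels, and `is_count_peak` stands in for whatever peak test the pulsation cycle acquisition unit 214 applies (all names and data layouts are hypothetical):

```python
def run_display_loop(frames, is_count_peak):
    """Sketch of steps 51-57: accumulate fluorescence-start regions frame by
    frame, switching the display mode whenever a pulsation-cycle peak is seen."""
    display_mode = 0      # display mode applied to the current pulsation cycle
    accumulated = []      # (pixel, mode) pairs making up the diffusion image
    counts = []           # per-frame region counts, used for peak detection
    for new_regions in frames:
        counts.append(len(new_regions))                             # steps 52-53
        accumulated.extend((p, display_mode) for p in new_regions)  # step 54
        if is_count_peak(counts):                                   # step 55
            display_mode += 1                                       # step 57
    return accumulated
```

Regions extracted before a peak keep the old mode, and everything extracted after it receives the new one, which reproduces the cycle-by-cycle coloring described above.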
 The image display processing by the treatment support device 100 is performed as described above. From the fluorescent agent diffusion image 85 (superimposed image 88) displayed on the display unit 30, a doctor or other user can observe how the fluorescent agent 3 diffuses with each pulsation cycle 60.
 For example, in the example of FIG. 16, in which the fluorescent agent diffusion image 85 is displayed as a contour map, the larger the spacing between adjacent boundary lines 87, the larger the amount of movement of the fluorescent agent 3 during one pulsation cycle 60, and the smaller the spacing between adjacent boundary lines 87, the smaller that amount of movement. In other words, the degree and direction of diffusion of the fluorescent agent 3 can be evaluated from the density of the fluorescence start regions 84 of the respective pulsation cycles 60 in the fluorescent agent diffusion image 85. The fluorescent agent diffusion image 85 thus provides useful information to a doctor or other user evaluating the quality of blood perfusion at the treatment target site 2.
 (Effects of this embodiment)
 In this embodiment, the following effects can be obtained.
 That is, the treatment support device 100 of this embodiment, as described above, includes an excitation light irradiation unit 115 that irradiates excitation light 91 for the fluorescent agent 3 administered to the subject 1, a fluorescence imaging unit 111 that detects the fluorescence 92 excited by the excitation light 91 and captures fluorescence images 81 of the subject 1, and an image processing unit 20 that has a processor 21 for performing image processing and outputs images to the display unit 30. The processor 21 is configured to detect the fluorescence start region 84 at each of a plurality of time points by comparing the fluorescence image 81 generated at that time point with the fluorescence images 81 generated before it, to generate the fluorescent agent diffusion image 85 by superimposing the fluorescence start regions 84 of the respective time points in different modes, and to display the generated fluorescent agent diffusion image 85 on the display unit 30.
 Further, the treatment support method of this embodiment, as described above, includes: a step of irradiating excitation light 91 for the fluorescent agent administered to the subject; a step of detecting the fluorescence 92 excited by the excitation light and capturing fluorescence images 81 of the subject; a step in which the processor 21 detects, for each of a plurality of time points (frame numbers), the region of the fluorescence image 81 in which the fluorescence 92 is detected for the first time (the fluorescence start region 84) by comparing the fluorescence image 81 generated at that time point with the fluorescence images 81 generated before it; a step in which the processor 21 generates the fluorescent agent diffusion image 85 by superimposing the fluorescence start regions 84 of the respective time points in different modes; and a step of displaying the generated fluorescent agent diffusion image 85 on the display unit.
 With the above configuration, a fluorescent agent diffusion image 85 is generated in which the regions where fluorescence was first detected at the respective time points (the fluorescence start regions 84) are displayed in mutually different modes. Since a fluorescence start region 84 can be regarded as the region where the fluorescent agent 3 carried by the bloodstream was first detected, the fluorescent agent diffusion image 85 displays the progressive diffusion of the fluorescent agent 3 over time in a manner distinguishable by time point. The fluorescent agent diffusion image 85 can therefore present the diffusion of the administered fluorescent agent 3 as the expansion of the fluorescence start regions 84 from one time point to the next, so that the diffusion of the fluorescent agent 3 administered to the subject 1 can be displayed appropriately.
 As a result, the fluorescent agent diffusion image 85 can provide a doctor or other practitioner treating the patient with useful information for evaluating the quality of blood perfusion in the body tissue of the site to be treated. That is, it is possible to provide a fluorescent agent diffusion image 85 that is useful for identifying the region to be treated and the region in which the result of treatment is to be confirmed.
 Further effects are obtained from the configurations of the above embodiment described below.
 Specifically, in the first example of the treatment support device of the above embodiment (see FIG. 7), the processor 21 extracts the fluorescence start region 84 based on the pixel value of a pixel of the fluorescence image 81 exceeding a predetermined threshold value 72. With this configuration, the fluorescence start region 84 can be extracted easily, simply by comparing the pixel value of each pixel of the fluorescence image 81 with the threshold value 72.
 Further, in the second example of the treatment support device of the above embodiment (see FIG. 8), the processor 21 extracts the fluorescence start region 84 based on the slope of the time intensity curve 71 of a pixel of the fluorescence image 81 exceeding a predetermined threshold value 72. With this configuration, the fluorescence start region 84 can be extracted easily from the slope of the time intensity curve 71 of each pixel.
 Further, in the third example of the treatment support device of the above embodiment (see FIG. 9), the processor 21 extracts the fluorescence start region 84 based on the area value of the time intensity curve 71 of a pixel of the fluorescence image 81 exceeding a predetermined threshold value 72. With this configuration, the fluorescence start region 84 can be extracted easily from the area value of the time intensity curve 71 of each pixel. Instead of the area value of the time intensity curve 71, the average value of the time intensity curve 71 may be used; in this case, too, the fluorescence start region 84 can be extracted easily.
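 The three extraction criteria of FIGS. 7 to 9 can be illustrated with a minimal sketch. Here a pixel's time intensity curve (TIC) is assumed to be a plain list of per-frame intensities, and each function returns the frame index at which that pixel would first qualify as part of a fluorescence start region 84 under the corresponding criterion (the function names and data layout are hypothetical, not from the specification):

```python
def start_by_value(tic, threshold):
    """First example (FIG. 7): first frame at which the pixel value itself
    exceeds the threshold."""
    for t, v in enumerate(tic):
        if v > threshold:
            return t
    return None  # fluorescence never reached this pixel

def start_by_slope(tic, threshold):
    """Second example (FIG. 8): first frame at which the frame-to-frame
    increase (slope of the TIC) exceeds the threshold."""
    for t in range(1, len(tic)):
        if tic[t] - tic[t - 1] > threshold:
            return t
    return None

def start_by_area(tic, threshold):
    """Third example (FIG. 9): first frame at which the cumulative sum
    (area under the TIC) exceeds the threshold."""
    area = 0.0
    for t, v in enumerate(tic):
        area += v
        if area > threshold:
            return t
    return None
```

Running all pixels of each frame through one of these tests, and keeping only pixels whose start index equals the current frame, yields the fluorescence start region 84 of that frame.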
 Further, in the treatment support device of the above embodiment, the processor 21 acquires the pulsation cycle 60 of the subject and generates the fluorescent agent diffusion image 85 by superimposing, in different modes, the fluorescence start regions 84 of the respective time points of each pulsation cycle 60. With this configuration, the generated fluorescent agent diffusion image 85 displays the diffusion of the fluorescent agent 3 accompanying the heartbeat-derived pulsation in a manner distinguishable by pulsation cycle 60. The fluorescent agent diffusion image 85 can therefore display the diffusion of the fluorescent agent 3 administered to the subject even more appropriately, as the expansion of the fluorescence start regions 84 with each pulsation cycle 60.
 Further, in the treatment support device of the above embodiment, the fluorescence imaging unit 111 is configured to generate the fluorescence images 81 at time intervals shorter than the pulsation cycle 60, and the processor 21 is configured to count the number of fluorescence start regions 84 for each generated fluorescence image 81 and to detect the pulsation cycle 60 based on the number of fluorescence start regions 84 in each of the fluorescence images 81 generated in time series. With this configuration, the pulsation cycle 60 of the imaged site can be acquired directly from the fluorescence images 81 acquired in time series. Consequently, no separate pulsation-detecting device, such as an electrocardiograph or a pulse wave meter, needs to be provided to acquire the pulsation cycle 60, so the device configuration can be simplified. Moreover, when pulsation is detected with an electrocardiograph, for example, what is detected is only the beating cycle of the heart, which lags the pulsation cycle 60 of the site actually being imaged. According to the above configuration, in which the pulsation cycle 60 is detected from the fluorescence images 81, the pulsation cycle 60 of the site actually being imaged can be detected directly, so the diffusion of the fluorescent agent 3 accompanying the pulsation can be visualized more accurately.
 Further, in the treatment support device of the above embodiment, the processor 21 is configured to acquire the change in the number of fluorescence start regions 84 from one fluorescence image 81 to the next and to detect the pulsation cycle 60 based on the peaks 66 in that number. With this configuration, a peak 66 at which the number of fluorescence start regions 84 increases sharply (that is, a peak 66 in the amount of movement of the fluorescent agent 3) is formed with each pulsation cycle 60 in response to the heartbeat (contraction of the ventricles), so the pulsation cycle 60 of the site actually being imaged can be detected accurately from these peaks 66.
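 One simple way to realize this peak-based detection is to treat the per-frame region counts as a time series and look for local maxima. The sketch below is an assumed illustration, not the algorithm of the pulsation cycle acquisition unit 214; a practical implementation would likely add smoothing and a minimum-prominence test:

```python
def detect_cycle_frames(counts):
    """Frame indices where the per-frame region count forms a local peak
    (rises then falls), taken as pulsation-cycle boundaries."""
    return [t for t in range(1, len(counts) - 1)
            if counts[t - 1] < counts[t] >= counts[t + 1]]

def mean_cycle_length(peak_frames, frame_interval):
    """Average spacing between successive peaks, expressed in the same
    unit as frame_interval (e.g. seconds per frame)."""
    if len(peak_frames) < 2:
        return None  # need at least two peaks to measure a cycle
    gaps = [b - a for a, b in zip(peak_frames, peak_frames[1:])]
    return frame_interval * sum(gaps) / len(gaps)
```

Because the fluorescence images 81 are generated at intervals shorter than the pulsation cycle 60, each cycle spans several frames and the peaks are well separated in this series.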
 Further, the treatment support device of the above embodiment further includes a visible light imaging unit 112 that detects visible light 93 reflected from the subject 1 and captures a visible light image 82 of the subject 1, and the processor 21 is configured to superimpose the fluorescent agent diffusion image 85 on the visible light image 82 and display the result on the display unit 30. Here, because the fluorescence image 81 and the fluorescent agent diffusion image 85 derived from it visualize the fluorescence 92 emitted by the fluorescent agent 3, they contain no information on the morphology of the imaged site as recognizable under visible light 93. By superimposing the fluorescent agent diffusion image 85 on the visible light image 82, the diffusion of the fluorescent agent 3 can be displayed identifiably on the visible light image 82 that a user such as a doctor actually views. As a result, identifying the region to be treated and the region in which the result of treatment is to be confirmed becomes easier.
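 Such a superimposition can be sketched as a per-pixel alpha blend. This is a hypothetical illustration only, assuming the visible light image is grayscale, the diffusion image assigns each pixel either no value or a pulsation-cycle index, and a color table maps cycle indices to display colors:

```python
def overlay(visible, diffusion, colors, alpha=0.5):
    """Blend a grayscale visible-light image (rows of 0-255 values) with a
    diffusion map whose entries are None (no fluorescence yet) or a cycle
    index looked up in `colors` (a list of RGB tuples)."""
    out = []
    for vis_row, dif_row in zip(visible, diffusion):
        row = []
        for v, cycle in zip(vis_row, dif_row):
            if cycle is None:
                row.append((v, v, v))  # background pixel shown unchanged
            else:
                r, g, b = colors[cycle % len(colors)]
                row.append((round((1 - alpha) * v + alpha * r),
                            round((1 - alpha) * v + alpha * g),
                            round((1 - alpha) * v + alpha * b)))
        out.append(row)
    return out
```

Keeping `alpha` below 1 leaves the tissue morphology visible underneath the colored fluorescence start regions, which is the point of the superimposed image 88.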
 Further, in the treatment support device of the above embodiment, the processor 21 is configured so that each time a pulsation cycle 60 elapses, the fluorescence start regions 84 extracted in the latest pulsation cycle 60 are added to the fluorescent agent diffusion image 85 alongside the fluorescence start regions 84 extracted in past pulsation cycles 60. With this configuration, the fluorescence start regions 84 extracted in past pulsation cycles 60 are not erased from the image; instead, new fluorescence start regions 84 are displayed additionally each time a pulsation cycle 60 elapses, so the diffusion of the fluorescent agent 3 accompanying the pulsation can be displayed in an easily identifiable manner.
 Further, in the treatment support device of the above embodiment, the processor 21 is configured so that, when a fluorescence image 81 belonging to the current pulsation cycle 60 is generated, the fluorescence start regions 84 extracted from that latest fluorescence image 81 are added to the fluorescent agent diffusion image 85 alongside the already-extracted fluorescence start regions 84 belonging to the same pulsation cycle 60. With this configuration, each time a fluorescence image 81 (frame image) within the same pulsation cycle 60 is acquired, the new fluorescence start regions 84 extracted from the latest fluorescence image 81 are added, so the gradual diffusion of the fluorescent agent 3 within a pulsation cycle 60 can be imaged. Then, when the next pulsation cycle 60 begins, the gradual expansion of the fluorescence start regions 84 is imaged in a mode different from that of the immediately preceding pulsation cycle 60. As a result, during treatment, an image can be displayed in which not only the cycle-by-cycle diffusion of the fluorescent agent 3 but also its diffusion within each pulsation cycle 60 can be grasped in real time.
 Further, in the treatment support device of the above embodiment, the processor 21 displays the fluorescence start regions 84 of the respective pulsation cycles 60 in mutually different modes by at least one of displaying the fluorescence start regions 84 of each pulsation cycle 60 in different gradations, and displaying the lines indicating the fluorescence start regions 84 of each pulsation cycle 60 as a contour map. With this configuration, the fluorescence start regions 84 of the respective pulsation cycles 60 can be displayed in a visually easily distinguishable manner, which improves convenience for users such as doctors.
 [Modifications]
 It should be noted that the embodiments disclosed herein are exemplary in all respects and should not be considered restrictive. The scope of the present invention is defined not by the description of the above embodiment but by the claims, and further includes all changes (modifications) within the meaning and scope equivalent to the claims.
 For example, in the above embodiment, an example was described in which the processor 21 detects the pulsation cycle 60 based on the peaks 66 in the number of fluorescence start regions 84 (see FIG. 11), but the present invention is not limited to this. In the present invention, the processor 21 may detect the pulsation cycle 60 based on either a heartbeat waveform 75 detected by an electrocardiograph (see FIG. 10) or a pulse waveform 76 detected by a pulse wave meter (see FIG. 10).
 Specifically, in the modification shown in FIG. 19, the treatment support device 100 includes a waveform acquisition unit 40 that acquires the heartbeat or pulse waveform of the subject 1. The waveform acquisition unit 40 includes, for example, an electrocardiograph and detects the heartbeat waveform 75 shown in FIG. 10. Alternatively, the waveform acquisition unit 40 includes, for example, a pulse wave meter and detects the pulse waveform 76 shown in FIG. 10.
 In this modification, the processor 21 is configured to detect the pulsation cycle 60 based on the heartbeat waveform 75 or the pulse waveform 76 instead of the peaks 66 in the number of fluorescence start regions 84. The processor 21 detects, for example, feature points of the heartbeat waveform 75, such as the peaks 75a indicating the QRS complexes, and detects the pulsation cycle 60 as the time interval between the feature points. Alternatively, the processor 21 detects, as feature points of the pulse waveform 76, for example the peaks 76a corresponding to the QRS complexes of the heartbeat waveform 75, and detects the pulsation cycle 60 as the time interval between the feature points.
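 The feature-point approach of this modification can be sketched as follows. This is a simplified, assumed illustration (real R-peak detectors are considerably more robust): feature points are taken as local maxima of the waveform that exceed a fixed amplitude threshold, and the pulsation period is the mean interval between them:

```python
def r_peak_times(waveform, times, threshold):
    """Times of local maxima above `threshold`, taken as feature points
    (e.g. R-peaks of an ECG waveform)."""
    return [times[i] for i in range(1, len(waveform) - 1)
            if waveform[i] > threshold
            and waveform[i - 1] < waveform[i] >= waveform[i + 1]]

def pulsation_period(peak_times):
    """Mean interval between successive feature points."""
    if len(peak_times) < 2:
        return None  # at least two feature points are needed
    gaps = [b - a for a, b in zip(peak_times, peak_times[1:])]
    return sum(gaps) / len(gaps)
```

The same two functions apply to a pulse waveform; only the threshold and the shape of the detected feature points differ.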
 In this modification, as described above, the waveform acquisition unit 40 that acquires the heartbeat or pulse waveform of the subject 1 is provided, and the processor 21 is configured to detect the pulsation cycle 60 based on the heartbeat or pulse waveform (75 or 76). With this configuration, the pulsation cycle 60 can be detected easily, simply by acquiring the detection signal from the waveform acquisition unit 40.
 Note that the heartbeat waveform 75 reflects only the beating of the heart, and there is a time lag between it and the pulsation of the blood at the treatment target site 2 actually imaged in the fluorescence image 81. Similarly, depending on the length of the blood flow path between the measurement point of the pulse wave meter and the treatment target site 2, the pulse waveform 76 may lag the pulsation of the blood at the treatment target site 2. In contrast, in the configuration of the above embodiment, in which the processor 21 detects the pulsation cycle 60 based on the peaks 66 in the number of fluorescence start regions 84, the pulsation of the blood at the treatment target site 2 actually being imaged can be detected directly from the fluorescence images 81, so the diffusion of the fluorescent agent 3 accompanying the pulsation can be captured more accurately.
 Further, in the above embodiment, an example was described in which the image processing unit 20 is a PC provided separately from the fluorescence imaging device 10, but the present invention is not limited to this. In the present invention, the image processing unit 20 may be provided integrally with the fluorescence imaging device 10. For example, in the modification shown in FIG. 20, the image processing unit 20 is provided inside the main body 13 of the fluorescence imaging device 10. Alternatively, for example, the processor constituting the image generation unit 133 of the main body 13 may be configured to also function as the processor 21 of the image processing unit 20; that is, the image processing unit 20 and the image generation unit 133 may be provided integrally.
 Further, in the above embodiment, examples were described in which the fluorescence start region 84 is extracted based on the pixel value of the fluorescence image 81 exceeding a predetermined threshold value 72 (see FIG. 7), based on the slope of the time intensity curve 71 exceeding a predetermined threshold value 72 (see FIG. 8), and based on the area value of the time intensity curve 71 exceeding a predetermined threshold value 72 (see FIG. 9), but the present invention is not limited to these. In the present invention, the fluorescence start region 84 may be extracted based on a quantity other than the pixel value, the slope of the time intensity curve 71, or the area value of the time intensity curve 71.
 Further, in the above embodiment, an example was described in which the processor 21 is configured to acquire the change in the number of fluorescence start regions 84 from one fluorescence image 81 to the next and to detect the pulsation cycle 60 based on the peaks 66 in that number, but the present invention is not limited to this. In the present invention, the pulsation cycle 60 can be detected as long as some periodically occurring pattern, like the peaks 66, can be detected from the change in the number of fluorescence start regions 84. The processor 21 may therefore detect the pulsation cycle 60 based on a periodically occurring pattern other than the peaks 66 in the number of fluorescence start regions 84.
 Further, in the above embodiment, an example was described in which the processor 21 displays on the display unit 30 the superimposed image 88 in which the fluorescent agent diffusion image 85 is superimposed on the visible light image 82, but the present invention is not limited to this. In the present invention, the processor 21 may display the fluorescent agent diffusion image 85 alone, without superimposing it on the visible light image 82. The processor 21 may also display the visible light image 82 and the fluorescent agent diffusion image 85 side by side on the screen.
 Further, in the above embodiment, an example was described in which the treatment support device 100 includes the visible light imaging unit 112, but the present invention is not limited to this. In the present invention, the treatment support device 100 need not include the visible light imaging unit 112.
 Further, in the above embodiment, an example was described in which the processor 21 displays the fluorescence start regions 84 of the respective pulsation cycles 60 in mutually different modes by at least one of (1) displaying the fluorescence start regions 84 of each pulsation cycle 60 in different gradations and (2) displaying the lines indicating the fluorescence start regions 84 of each pulsation cycle 60 as a contour map, but the present invention is not limited to this. In the present invention, the processor 21 may display the fluorescence start regions 84 in any display mode, as long as the display mode differs from one pulsation cycle 60 to the next.
 Further, in the above embodiment, ICG was given as an example of the fluorescent agent 3, but the present invention is not limited to this. In the present invention, a fluorescent agent 3 other than ICG may be used; examples include 5-ALA (5-aminolevulinic acid) and IR700. Although 5-ALA itself does not fluoresce, protoporphyrin IX (PPIX), a metabolite of the 5-ALA administered to the subject 1, acts as the fluorescent substance, so substances such as 5-ALA are also regarded as fluorescent agents 3 in this specification. Besides these, the fluorescent agent 3 may be any fluorescent substance used for fluorescence diagnosis of patients and the like.
 [Aspects]
 It will be understood by those skilled in the art that the exemplary embodiments described above are specific examples of the following aspects.
 (Aspect 1)
 A treatment support device comprising:
 an excitation light irradiation unit that irradiates excitation light for a fluorescent agent administered to a subject;
 a fluorescence imaging unit that detects fluorescence excited by the excitation light and captures fluorescence images of the subject; and
 an image processing unit that has a processor for performing image processing and outputs images to a display unit,
 wherein the processor is configured to detect, at each of a plurality of time points, a region of the fluorescence image in which fluorescence is detected for the first time by comparing the fluorescence image generated at that time point with the fluorescence images generated before it, to generate a fluorescent agent diffusion image by superimposing the regions of the respective time points in different modes, and to display the generated fluorescent agent diffusion image on the display unit.
 (Aspect 2)
 The treatment support device according to Aspect 1, wherein the processor extracts the region based on a pixel value of a pixel of the fluorescence image exceeding a predetermined threshold value.
 (Aspect 3)
 The treatment support device according to Aspect 1, wherein the processor extracts the region based on a slope of a time intensity curve of a pixel of the fluorescence image exceeding a predetermined threshold value.
 (Aspect 4)
 The treatment support device according to Aspect 1, wherein the processor extracts the region based on the area value of the time intensity curve of a pixel of the fluorescence image exceeding a predetermined threshold.
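Aspects 3 and 4 replace the raw pixel-value test with criteria on each pixel's time intensity curve. The sketch below is an illustrative assumption, not the patent's implementation: the slope is approximated by a finite difference between consecutive frames, the area by a rectangle-rule sum over time, and `dt`, `slope_th`, and `area_th` are hypothetical values:

```python
import numpy as np

def time_intensity_masks(stack, dt=0.1, slope_th=100.0, area_th=5.0):
    """stack: (T, H, W) array of fluorescence frames over time.
    Returns two boolean masks: pixels whose time intensity curve slope
    ever exceeds slope_th (cf. Aspect 3), and pixels whose curve area
    over time exceeds area_th (cf. Aspect 4)."""
    slopes = np.diff(stack, axis=0) / dt    # finite-difference slope per step
    slope_mask = (slopes > slope_th).any(axis=0)
    areas = stack.sum(axis=0) * dt          # rectangle-rule area under the curve
    area_mask = areas > area_th
    return slope_mask, area_mask

# Two pixels: the first brightens quickly, the second barely at all
stack = np.array([[[0.0, 0.0]],
                  [[20.0, 1.0]],
                  [[40.0, 2.0]]])
slope_mask, area_mask = time_intensity_masks(stack)
```

Either mask (or both combined) can stand in for the per-frame threshold test when extracting the fluorescence start regions.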
 (Aspect 5)
 The treatment support device according to Aspect 1, wherein the processor acquires a pulsation cycle of the subject and generates the fluorescent agent diffusion image in which the regions at the respective time points of each pulsation cycle are superimposed in mutually different modes.
 (Aspect 6)
 The treatment support device according to Aspect 5, wherein the fluorescence imaging unit is configured to generate the fluorescence images at time intervals shorter than the pulsation cycle, and
 the processor is configured to
  total the number of the regions for each generated fluorescence image, and
  detect the pulsation cycle based on the number of the regions in the fluorescence images generated in time series.
 (Aspect 7)
 The treatment support device according to Aspect 6, wherein the processor is configured to acquire the change in the number of the regions from one fluorescence image to the next and to detect the pulsation cycle based on peaks in the number of the regions.
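Aspects 6 and 7 estimate the pulsation cycle from the per-frame count of newly fluorescent regions. A minimal sketch under stated assumptions (the count series, frame rate, and simple local-maximum peak test are all hypothetical; a real device might use a more robust peak detector):

```python
def detect_pulsation_cycle(region_counts, frame_interval):
    """Estimate the pulsation cycle from the per-frame number of newly
    fluorescent regions: find local maxima (peaks) in the count series
    and average the spacing between consecutive peaks."""
    peaks = [i for i in range(1, len(region_counts) - 1)
             if region_counts[i - 1] < region_counts[i] >= region_counts[i + 1]]
    if len(peaks) < 2:
        return None  # not enough peaks to estimate a cycle
    spacings = [b - a for a, b in zip(peaks, peaks[1:])]
    return sum(spacings) / len(spacings) * frame_interval

# Counts peaking every 5 frames at 30 fps (~0.167 s cycle)
counts = [1, 3, 9, 4, 2, 1, 8, 10, 3, 1, 2, 9, 11, 4, 2]
cycle = detect_pulsation_cycle(counts, 1 / 30)
```

The counts peak whenever a heartbeat pushes the agent into fresh tissue, so the peak spacing tracks the pulsation cycle without any external waveform sensor (the sensor-based alternative is Aspect 8).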
 (Aspect 8)
 The treatment support device according to Aspect 5, further comprising a waveform acquisition unit that acquires a heartbeat or pulse waveform of the subject,
 wherein the processor is configured to detect the pulsation cycle based on the heartbeat or pulse waveform.
 (Aspect 9)
 The treatment support device according to Aspect 1, further comprising a visible light imaging unit that detects visible light reflected from the subject and captures a visible light image of the subject,
 wherein the processor is configured to superimpose the fluorescent agent diffusion image on the visible light image and display the result on the display unit.
 (Aspect 10)
 The treatment support device according to Aspect 5, wherein the processor is configured, each time the pulsation cycle elapses, to add the region extracted in the latest pulsation cycle to the fluorescent agent diffusion image in addition to the regions extracted in past pulsation cycles.
 (Aspect 11)
 The treatment support device according to Aspect 5, wherein the processor is configured, each time a fluorescence image within the pulsation cycle is generated, to add the region extracted from the latest fluorescence image to the fluorescent agent diffusion image in addition to the already-extracted regions within that pulsation cycle.
 (Aspect 12)
 The treatment support device according to Aspect 5, wherein the processor displays the regions of the respective pulsation cycles in mutually different modes by at least one of:
  displaying the regions of the respective pulsation cycles in different gradations; and
  displaying lines indicating the regions of the respective pulsation cycles in the manner of a contour map.
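The gradation display of Aspect 12 can be illustrated with a short sketch. The 8-bit grayscale coding ("newest cycle = brightest") and the mask data are assumptions for illustration, not taken from the patent:

```python
import numpy as np

def compose_diffusion_image(cycle_regions):
    """Superimpose the regions extracted in each pulsation cycle,
    giving each cycle its own gradation (newest = brightest).
    cycle_regions: list of boolean masks, one per pulsation cycle."""
    n = len(cycle_regions)
    image = np.zeros(cycle_regions[0].shape, dtype=np.uint8)
    for k, mask in enumerate(cycle_regions):
        level = int(255 * (k + 1) / n)  # gradation for cycle k
        image[mask] = level             # later cycles overwrite earlier ones
    return image

masks = [np.array([[True, False], [False, False]]),   # cycle 1
         np.array([[False, True], [False, False]])]   # cycle 2
img = compose_diffusion_image(masks)
```

The contour-line alternative named in the aspect would instead trace the boundary of each mask, producing nested isolines rather than filled gradations.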
 (Aspect 13)
 A treatment support method comprising:
 a step of emitting excitation light for a fluorescent agent administered to a subject;
 a step of detecting fluorescence excited by the excitation light and capturing fluorescence images of the subject;
 a step in which a processor detects, for each of a plurality of time points, a region of the fluorescence image in which fluorescence is detected for the first time at that time point by comparing the fluorescence image generated at that time point with the fluorescence images generated before it;
 a step in which the processor generates a fluorescent agent diffusion image by superimposing the regions for the respective time points in mutually different modes; and
 a step of displaying the generated fluorescent agent diffusion image on a display unit.
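The detection and superimposition steps of the method can be tied together as one minimal pipeline. This is a hedged illustration only: the irradiation and imaging steps are replaced by a synthetic frame sequence, and the threshold and time-coded gradation are hypothetical choices:

```python
import numpy as np

def fluorescent_agent_diffusion(frames, threshold):
    """End-to-end sketch: per frame, detect where fluorescence first
    appears (by comparison with earlier frames), then superimpose the
    per-frame regions with different gradations into one image."""
    seen = np.zeros(frames[0].shape, dtype=bool)
    out = np.zeros(frames[0].shape, dtype=np.uint8)
    n = len(frames)
    for t, frame in enumerate(frames):
        new = (frame > threshold) & ~seen  # first-time fluorescence at time t
        out[new] = int(255 * (t + 1) / n)  # time-coded gradation
        seen |= frame > threshold
    return out

# Synthetic stand-in for the captured fluorescence images
frames = [np.array([[60, 0], [0, 0]]),
          np.array([[70, 80], [0, 0]])]
image = fluorescent_agent_diffusion(frames, threshold=50)
```

The resulting image encodes when the agent first reached each pixel, which is the quantity the display step presents to the operator.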
 1 Subject
 3 Fluorescent agent
 20 Image processing unit
 30 Display unit
 40 Waveform acquisition unit
 60 Pulsation cycle
 66 Peak
 71 Time intensity curve
 72, 73, 74 Thresholds
 75 Heartbeat waveform
 76 Pulse waveform
 81 Fluorescence image
 82 Visible light image
 84 (84a, 84b, 84c) Fluorescence start region (region in the fluorescence image where fluorescence is first detected)
 85 Fluorescent agent diffusion image
 91 Excitation light
 92 Fluorescence
 93 Visible light
 100 Treatment support device
 111 Fluorescence imaging unit
 112 Visible light imaging unit
 115 Excitation light irradiation unit

Claims (13)

  1.  A treatment support device comprising:
     an excitation light irradiation unit that emits excitation light for a fluorescent agent administered to a subject;
     a fluorescence imaging unit that detects fluorescence excited by the excitation light and captures fluorescence images of the subject; and
     an image processing unit that has a processor for performing image processing and outputs images to a display unit,
     wherein the processor is configured to detect, for each of a plurality of time points, a region of the fluorescence image in which fluorescence is detected for the first time at that time point by comparing the fluorescence image generated at that time point with the fluorescence images generated before it, to generate a fluorescent agent diffusion image by superimposing the regions for the respective time points in mutually different modes, and to display the generated fluorescent agent diffusion image on the display unit.
  2.  The treatment support device according to claim 1, wherein the processor extracts the region based on the pixel value of a pixel of the fluorescence image exceeding a predetermined threshold.
  3.  The treatment support device according to claim 1, wherein the processor extracts the region based on the slope of the time intensity curve of a pixel of the fluorescence image exceeding a predetermined threshold.
  4.  The treatment support device according to claim 1, wherein the processor extracts the region based on the area value of the time intensity curve of a pixel of the fluorescence image exceeding a predetermined threshold.
  5.  The treatment support device according to claim 1, wherein the processor acquires a pulsation cycle of the subject and generates the fluorescent agent diffusion image in which the regions at the respective time points of each pulsation cycle are superimposed in mutually different modes.
  6.  The treatment support device according to claim 5, wherein the fluorescence imaging unit is configured to generate the fluorescence images at time intervals shorter than the pulsation cycle, and
     the processor is configured to
      total the number of the regions for each generated fluorescence image, and
      detect the pulsation cycle based on the number of the regions in the fluorescence images generated in time series.
  7.  The treatment support device according to claim 6, wherein the processor is configured to acquire the change in the number of the regions from one fluorescence image to the next and to detect the pulsation cycle based on peaks in the number of the regions.
  8.  The treatment support device according to claim 5, further comprising a waveform acquisition unit that acquires a heartbeat or pulse waveform of the subject,
     wherein the processor is configured to detect the pulsation cycle based on the heartbeat or pulse waveform.
  9.  The treatment support device according to claim 1, further comprising a visible light imaging unit that detects visible light reflected from the subject and captures a visible light image of the subject,
     wherein the processor is configured to superimpose the fluorescent agent diffusion image on the visible light image and display the result on the display unit.
  10.  The treatment support device according to claim 5, wherein the processor is configured, each time the pulsation cycle elapses, to add the region extracted in the latest pulsation cycle to the fluorescent agent diffusion image in addition to the regions extracted in past pulsation cycles.
  11.  The treatment support device according to claim 5, wherein the processor is configured, each time a fluorescence image within the pulsation cycle is generated, to add the region extracted from the latest fluorescence image to the fluorescent agent diffusion image in addition to the already-extracted regions within that pulsation cycle.
  12.  The treatment support device according to claim 5, wherein the processor displays the regions of the respective pulsation cycles in mutually different modes by at least one of:
      displaying the regions of the respective pulsation cycles in different gradations; and
      displaying lines indicating the regions of the respective pulsation cycles in the manner of a contour map.
  13.  A treatment support method comprising:
     a step of emitting excitation light for a fluorescent agent administered to a subject;
     a step of detecting fluorescence excited by the excitation light and capturing fluorescence images of the subject;
     a step in which a processor detects, for each of a plurality of time points, a region of the fluorescence image in which fluorescence is detected for the first time at that time point by comparing the fluorescence image generated at that time point with the fluorescence images generated before it;
     a step in which the processor generates a fluorescent agent diffusion image by superimposing the regions for the respective time points in mutually different modes; and
     a step of displaying the generated fluorescent agent diffusion image on a display unit.
PCT/JP2019/030554 2019-08-02 2019-08-02 Medical treatment assistance device and medical treatment assistance method WO2021024314A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2021538536A JP7306461B2 (en) 2019-08-02 2019-08-02 Treatment support device and treatment support method
PCT/JP2019/030554 WO2021024314A1 (en) 2019-08-02 2019-08-02 Medical treatment assistance device and medical treatment assistance method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2019/030554 WO2021024314A1 (en) 2019-08-02 2019-08-02 Medical treatment assistance device and medical treatment assistance method

Publications (1)

Publication Number Publication Date
WO2021024314A1 true WO2021024314A1 (en) 2021-02-11

Family

ID=74502552

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/030554 WO2021024314A1 (en) 2019-08-02 2019-08-02 Medical treatment assistance device and medical treatment assistance method

Country Status (2)

Country Link
JP (1) JP7306461B2 (en)
WO (1) WO2021024314A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113693724A (en) * 2021-08-19 2021-11-26 南京诺源医疗器械有限公司 Irradiation method, device and storage medium suitable for fluorescence image navigation operation

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000023903A (en) * 1998-05-01 2000-01-25 Asahi Optical Co Ltd Electronic endoscope device for fluoroscopy
JP2004313470A (en) * 2003-04-16 2004-11-11 Kao Corp Decayed teeth detector and decayed teeth detection program for realizing the same
JP2007021006A (en) * 2005-07-20 2007-02-01 Hitachi Medical Corp X-ray ct apparatus
JP2010521198A * 2007-03-08 2010-06-24 Sync-RX, Ltd. Imaging and tools for use with moving organs
WO2012147820A1 * 2011-04-28 2012-11-01 Olympus Corporation Fluorescent observation device and image display method therefor
JP2015147048A * 2014-02-07 2015-08-20 Biosense Webster (Israel), Ltd. Synchronizing between image sequences of heart acquired at different heartbeat rates
JP2015527100A * 2012-06-21 2015-09-17 Novadaq Technologies Inc. Angiography and perfusion quantification and analysis techniques
WO2018167816A1 * 2017-03-13 2018-09-20 Shimadzu Corporation Imaging apparatus

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4425098B2 (en) * 2004-09-06 2010-03-03 浜松ホトニクス株式会社 Fluorescence microscope and fluorescence correlation spectroscopy analyzer
JP6734386B2 (en) * 2016-09-28 2020-08-05 パナソニック株式会社 Display system


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113693724A (en) * 2021-08-19 2021-11-26 南京诺源医疗器械有限公司 Irradiation method, device and storage medium suitable for fluorescence image navigation operation
CN113693724B (en) * 2021-08-19 2022-10-14 南京诺源医疗器械有限公司 Irradiation method, device and storage medium suitable for fluorescence image navigation operation

Also Published As

Publication number Publication date
JPWO2021024314A1 (en) 2021-02-11
JP7306461B2 (en) 2023-07-11

Similar Documents

Publication Publication Date Title
JP5081992B2 (en) Method and apparatus for performing intraoperative angiography
US8892190B2 (en) Method and apparatus for performing intra-operative angiography
US6915154B1 (en) Method and apparatus for performing intra-operative angiography
JP5634755B2 (en) Electronic endoscope system, processor device for electronic endoscope, and method for operating electronic endoscope system
KR101647022B1 (en) Apparatus and method for capturing medical image
US11638517B2 (en) Systems and methods for medical imaging using a rolling shutter imager
JP5460488B2 (en) Electronic endoscope system, processor device for electronic endoscope, image retrieval system, and method of operating electronic endoscope system
JP2009226072A (en) Method and device for surgical operation support
JP2019136269A (en) Fluorescent imaging device
JP2023015232A (en) endoscope system
US20230081866A1 (en) Methods and systems for generating simulated intraoperative imaging data of a subject
JP2023088989A (en) Multi-spectral physiologic visualization (mspv) using laser imaging method and system for blood flow and perfusion imaging and quantification in endoscopic design
WO2017122431A1 (en) Image analysis device, image analysis system, and method for actuating image analysis device
US20110267444A1 (en) Endoscope apparatus, method, and computer readable medium
WO2021024314A1 (en) Medical treatment assistance device and medical treatment assistance method
JP4533673B2 (en) Infrared observation system and operation method by infrared observation system
US20230039047A1 (en) Image processing apparatus, image processing method, navigation method and endoscope system
JP5844447B2 (en) Electronic endoscope system, processor device for electronic endoscope, and method for operating electronic endoscope system
CN116744834A (en) Medical image processing device, method, and program
CN116724334A (en) Computer program, learning model generation method, and operation support device
WO2022239495A1 (en) Biological tissue observation system, biological tissue observation device, and biological tissue observation method
WO2020203034A1 (en) Endoscopic system
WO2018216658A1 (en) Image capturing device, image capturing system, and image capturing method
JP2021126153A (en) Medical image processing device and medical observation system
JP2021132695A (en) Medical image processing device, medical observation system, and image processing method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19940661

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2021538536

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19940661

Country of ref document: EP

Kind code of ref document: A1