WO2021024314A1 - Medical treatment assistance device and medical treatment assistance method - Google Patents

Medical treatment assistance device and medical treatment assistance method

Info

Publication number
WO2021024314A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
fluorescence
processor
fluorescent agent
unit
Prior art date
Application number
PCT/JP2019/030554
Other languages
English (en)
Japanese (ja)
Inventor
紘之 妻鳥
Original Assignee
株式会社島津製作所 (Shimadzu Corporation)
Priority date
Filing date
Publication date
Application filed by 株式会社島津製作所 (Shimadzu Corporation)
Priority to PCT/JP2019/030554 (WO2021024314A1)
Priority to JP2021538536A (JP7306461B2)
Publication of WO2021024314A1

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 10/00: Other methods or instruments for diagnosis, e.g. instruments for taking a cell sample, for biopsy, for vaccination diagnosis; Sex determination; Ovulation-period determination; Throat striking implements

Definitions

  • The present invention relates to a treatment support device and a treatment support method.
  • Treatment support devices that capture a fluorescence image of a fluorescent agent administered to a patient during treatment such as surgery are known. Such a device is disclosed, for example, in Japanese Patent Application Laid-Open No. 2018-51320.
  • Japanese Patent Application Laid-Open No. 2018-51320 discloses a device that identifies perforator blood vessels before plastic surgery using ICG (indocyanine green) fluorescence angiography.
  • The apparatus of JP-A-2018-51320 includes an infrared light source for exciting the fluorescence, and the fluorescence signal is detected by a CCD camera. The contrast device captures the entire cycle of perfusion and disappearance of the ICG fluorescence.
  • According to JP-A-2018-51320, a well-vascularized flap is a good candidate for grafting, and the surgeon determines which of several perforator branches is the best candidate for grafting.
  • The apparatus of JP-A-2018-51320 includes means for processing an image sequence to generate, for the pixel values in the fluorescence images, a time-integrated luminance or a time derivative of the luminance, and means for displaying the time-integrated luminance or the time derivative of the luminance as a color image or a black-and-white image. The device is used to identify and locate perforator vessels prior to surgery.
  • Because a fluorescence image shows only the fluorescence intensity at a certain time, it is difficult to evaluate the quality of blood perfusion (that is, the quality of blood circulation in body tissue) from a fluorescence image taken at a single time. For this reason, JP-A-2018-51320 obtains and displays the time-integrated luminance or the time derivative of the luminance for the pixel values in the fluorescence images.
  • Evaluating the quality of blood perfusion is important not only for the skin grafting (flap surgery) described above but also for identifying the area to be treated and for confirming the result of treatment. Therefore, when treating a patient, it is desirable to be able to appropriately display the state of diffusion of the administered fluorescent agent in order to evaluate the quality of blood perfusion at the site to be treated.
  • The present invention has been made to solve the above-mentioned problems, and one object of the present invention is to provide a treatment support device and a treatment support method capable of appropriately displaying the state of diffusion of a fluorescent agent administered to a subject.
  • A treatment support device according to one aspect of the present invention includes an excitation light irradiation unit that irradiates a subject with excitation light for a fluorescent agent administered to the subject, a fluorescence imaging unit that detects the fluorescence excited by the excitation light and captures a fluorescence image of the subject, and an image processing unit having a processor that performs image processing and outputs an image to a display unit. By comparing the fluorescence images generated at a plurality of time points with the fluorescence images generated before each time point, the processor detects the region in the fluorescence image in which fluorescence is detected for the first time at each time point, generates a fluorescent agent diffusion image by superimposing the regions at the respective time points in mutually different display modes, and displays the generated fluorescent agent diffusion image on the display unit.
  • A treatment support method according to another aspect of the present invention includes a step of irradiating a subject with excitation light for a fluorescent agent administered to the subject, a step of detecting the fluorescence excited by the excitation light and capturing a fluorescence image of the subject, a step of detecting the region in which fluorescence is detected for the first time at each time point by comparing the fluorescence images generated at a plurality of time points with the fluorescence images generated before each time point, a step of generating a fluorescent agent diffusion image by superimposing the regions at the respective time points in mutually different display modes, and a step of displaying the generated fluorescent agent diffusion image on the display unit.
  • With the above configurations, a fluorescent agent diffusion image is generated and displayed in which the regions where fluorescence is first detected at each time point are shown in mutually different display modes. Since each such region is considered to be the region where the presence of the fluorescent agent diffused by the bloodstream is first detected, the fluorescent agent diffusion image shows, in a distinguishable manner, how the fluorescent agent diffuses with the passage of time. The state of diffusion of the fluorescent agent administered to the subject can therefore be displayed as a state in which these regions expand from one time point to the next. As a result, the state of diffusion of the fluorescent agent administered to the subject can be appropriately displayed.
  • The configuration of the treatment support device 100 according to one embodiment will be described with reference to FIGS. 1 to 14.
  • The treatment support device 100 includes a fluorescence imaging device 10 and an image processing unit 20.
  • The treatment support device 100 is a device that supports treatment by imaging the treatment target site 2 with the fluorescence imaging device 10 and displaying the image processed by the image processing unit 20 on the display unit 30.
  • The treatment support provided by the treatment support device 100 consists in displaying, on the display unit 30, an image that visualizes the fluorescence 92 generated from the fluorescent agent 3 administered into the body of the subject 1, thereby providing a doctor or other user with information on the treatment target site 2 that cannot be seen directly from the outside.
  • Subject 1 is, for example, a human being, but is not particularly limited.
  • The treatment target site 2 is, for example, the chest, abdomen, back, or an internal organ (for example, the digestive tract, liver, or adrenal gland), but is not particularly limited.
  • The fluorescence imaging device 10 is a device that irradiates the subject 1 with the excitation light 91, detects the fluorescence 92 emitted from the fluorescent agent 3 administered to the subject 1, and visualizes the treatment target site 2 of the subject 1 based on the fluorescence 92.
  • The image processing unit 20 performs image processing on the fluorescence images captured by the fluorescence imaging device 10 to generate a fluorescent agent diffusion image 85, which will be described later.
  • The image processing unit 20 is configured to output images to the display unit 30.
  • The image processing unit 20 is composed of a computer including a processor 21 and a storage unit, and performs the information processing that generates the fluorescent agent diffusion image 85 from the image data.
  • The processor 21 performs the image processing of the fluorescence images.
  • The image processing unit 20 is, for example, a PC (Personal Computer).
  • The image processing unit 20 is electrically connected to the fluorescence imaging device 10 and acquires images from the fluorescence imaging device 10.
  • The image processing unit 20 is also electrically connected to the display unit 30 and outputs images to the display unit 30.
  • The connections between these devices may be either wired or wireless.
  • The display unit 30 is configured to display the image of the treatment target site 2 output from the image processing unit 20.
  • The display unit 30 is, for example, a monitor such as a liquid crystal display.
  • The display unit 30 is, for example, a monitor provided in the operating room or other room where the treatment of the subject 1 is performed.
  • The display unit 30 may also be a monitor included in the image processing unit 20 or in the fluorescence imaging device 10.
  • The fluorescence imaging device 10 includes an imaging unit 11, an arm mechanism 12, and a main body unit 13.
  • The fluorescence imaging device 10 images the treatment target site 2 from outside the subject 1 with the imaging unit 11 arranged at a position separated from the subject 1.
  • The arm mechanism 12 (see FIG. 2) has a first end connected to the main body unit 13 and a second end connected to the imaging unit 11, and is configured so that the imaging unit 11 can be held at an arbitrary position and orientation within its movable range.
  • The imaging unit 11 includes at least a fluorescence imaging unit 111 that detects the fluorescence 92 excited by the excitation light 91 and captures the fluorescence image 81 of the subject 1.
  • The imaging unit 11 is also configured to capture a visible light image 82 based on visible light 93 in addition to the fluorescence image 81 based on the fluorescence 92. That is, the imaging unit 11 includes a visible light imaging unit 112 that detects the visible light 93 reflected from the subject 1 and captures the visible light image 82 of the subject 1.
  • The imaging unit 11 is configured to capture the fluorescence image 81 and the visible light image 82 as moving images.
  • The imaging unit 11 generates the fluorescence images 81 and the visible light images 82 in chronological order at a predetermined frame rate.
  • Each generated fluorescence image 81 and visible light image 82 is a frame image constituting one frame of the moving image.
  • The frame rate is, for example, 1 fps (frames per second) or more, preferably 15 fps or more, and more preferably 30 fps or more.
  • The frame rate is set to, for example, 60 fps, and can be changed according to the user's settings.
  • The fluorescence imaging unit 111 and the visible light imaging unit 112 are configured to generate the fluorescence image 81 at a time interval shorter than the pulsation cycle of the blood flow of the subject 1.
  • The heart rate of an adult human is approximately 60 to 75 beats per minute, that is, roughly one beat per second.
  • Frame rates of 30 fps and 60 fps therefore correspond to time intervals sufficiently shorter than the pulsation cycle of the blood flow of the subject 1.
  • The imaging unit 11 includes a light receiving unit 11a, an optical system 11b, and an imaging light source unit 11c.
  • The light receiving unit 11a includes the fluorescence imaging unit 111 and the visible light imaging unit 112 described above.
  • The visible light imaging unit 112 is configured to detect the visible light 93.
  • The fluorescence imaging unit 111 is configured to detect the fluorescence 92.
  • The visible light imaging unit 112 and the fluorescence imaging unit 111 each include an image sensor such as a CMOS (Complementary Metal Oxide Semiconductor) image sensor or a CCD (Charge Coupled Device) image sensor.
  • The optical system 11b includes a zoom lens 113 and a prism 114.
  • The optical system 11b is configured to separate the visible light 93 reflected from the subject 1 and the fluorescence 92 emitted from the fluorescent agent 3.
  • The detailed configuration of the optical system 11b will be described later.
  • The imaging light source unit 11c includes an excitation light irradiation unit 115 that irradiates the subject with the excitation light 91 of the fluorescent agent 3 administered to the subject 1.
  • The excitation light irradiation unit 115 has an excitation light source 116 (see FIG. 3) that generates the excitation light 91.
  • The excitation light irradiation unit 115 irradiates excitation light 91 of a wavelength suited to the light absorption characteristics of the fluorescent agent 3.
  • In the present embodiment, the fluorescent agent 3 is indocyanine green (ICG).
  • ICG has an absorption peak in the wavelength region from about 750 nm to less than about 800 nm.
  • ICG emits fluorescence 92 having a peak in the wavelength region from about 800 nm to less than about 850 nm.
  • The excitation light irradiation unit 115 therefore irradiates excitation light 91 having a peak at, for example, about 750 nm.
  • The imaging light source unit 11c also includes a visible light irradiation unit 117 that irradiates visible light 93 in the visible wavelength region.
  • The visible light irradiation unit 117 has a visible light source 118 (see FIG. 3) that generates the visible light 93.
  • The visible light irradiation unit 117 irradiates, for example, white light toward the subject 1 as the visible light 93.
  • White light contains wavelength components over substantially the entire visible wavelength region.
  • The visible light 93 irradiated by the visible light irradiation unit 117 has its emission intensity peak in the visible wavelength region.
  • Lighting equipment that emits light in the visible wavelength region, such as a surgical light, is installed in the operating room where the patient is treated. Since the light generated by such lighting equipment can serve as the visible light 93, the imaging light source unit 11c does not have to include the visible light irradiation unit 117.
  • The imaging light source unit 11c is provided in an annular shape on the end surface of the imaging unit 11 so as to surround the optical system 11b.
  • A total of 12 excitation light sources 116 and visible light sources 118 are arranged in this annular shape.
  • These excitation light sources 116 and visible light sources 118 are, for example, light emitting diodes (LEDs).
  • The excitation light source 116 and the visible light source 118 may instead be laser light sources such as semiconductor lasers.
  • The fluorescence imaging unit 111 and the visible light imaging unit 112 are configured to detect the fluorescence 92 and the visible light 93, respectively, via the common optical system 11b. The imaging unit 11 therefore acquires the fluorescence image 81 and the visible light image 82 at the same imaging position and with the same imaging field of view.
  • The fluorescence 92 and the visible light 93 are incident on the zoom lens 113 along the optical axis 94.
  • The zoom lens 113 is moved in the direction along the optical axis 94 by a lens moving mechanism (not shown) in order to focus.
  • The imaging unit 11 can acquire the fluorescence image 81 and the visible light image 82 at an arbitrary magnification within the variable range of the zoom lens 113.
  • The fluorescence 92 and the visible light 93 reach the prism 114 after passing through the zoom lens 113.
  • The prism 114 is configured to separate the visible light 93 reflected from the subject 1 and the fluorescence 92 emitted from the fluorescent agent 3.
  • The fluorescence 92 that reaches the prism 114 passes through the prism 114 and reaches the fluorescence imaging unit 111.
  • The visible light 93 that reaches the prism 114 is reflected by the prism 114 and reaches the visible light imaging unit 112.
  • The reflected excitation light 91 from the subject 1 is likewise reflected by the prism 114, which prevents it from reaching the fluorescence imaging unit 111.
  • The arm mechanism 12 includes a translational support portion 121 that translatably supports the imaging unit 11 and a rotation support portion 122 that rotatably supports the imaging unit 11.
  • The translational support portion 121 supports the imaging unit 11 via the rotation support portion 122, holds the position of the imaging unit 11, and is configured so that the imaging unit 11 can be translated in each of the front-back, left-right, and up-down directions.
  • The rotation support portion 122 is configured so that the imaging unit 11 can be rotated in each of the left-right and up-down directions.
  • The main body unit 13 includes a housing 131 and a computer housed in the housing 131.
  • The housing 131 is, for example, a box-shaped cart that accommodates the computer and is movable on wheels.
  • The main body unit 13 includes a control unit 132, an image generation unit 133, a main body storage unit 134, and an output unit 135.
  • The control unit 132 is composed of, for example, a computer including a processor such as a CPU (Central Processing Unit) and a memory.
  • The computer functions as the control unit 132 of the fluorescence imaging device 10 when the processor executes a program stored in the memory.
  • The control unit 132 is configured to control the imaging unit 11 (start and stop of imaging, etc.) and the start and stop of the irradiation of light (excitation light 91 and visible light 93) from the imaging light source unit 11c, based on input operations to an operation unit (not shown).
  • The image generation unit 133 is configured to generate the image data of the fluorescence image 81 (see FIG. 4) and the image data of the visible light image 82 (see FIG. 4) from the detection signals of the imaging unit 11 (the fluorescence imaging unit 111 and the visible light imaging unit 112).
  • The image generation unit 133 includes, for example, a processor configured for image processing, such as a GPU (Graphics Processing Unit) or an FPGA (Field-Programmable Gate Array), and a memory.
  • The main body storage unit 134 is configured to store the captured images generated by the image generation unit 133, a control program, and the like.
  • The main body storage unit 134 includes, for example, a non-volatile memory, an HDD (Hard Disk Drive), or the like.
  • The output unit 135 is configured to output a video signal containing the captured images generated by the image generation unit 133 to the image processing unit 20.
  • The output unit 135 is a video output interface such as HDMI (registered trademark) or an interface for connecting other external devices.
  • The output unit 135 is connected to the image processing unit 20 by wire or wirelessly so that the captured images can be output.
  • The fluorescence imaging device 10 acquires the fluorescence image 81 and the visible light image 82 of the subject 1 and outputs them to the image processing unit 20.
  • The fluorescence imaging device 10 outputs the images to the image processing unit 20 in a moving image format.
  • The fluorescence images 81 and the visible light images 82 are output sequentially in chronological order at the set frame rate as the frame images (still images) constituting a moving image. That is, one fluorescence image 81 and one visible light image 82 are output to the image processing unit 20 at regular time intervals.
  • The image processing unit 20 includes a processor 21, a storage unit 22, and an input / output unit 23.
  • By comparing the fluorescence images 81 generated at a plurality of time points with the fluorescence images 81 generated before each time point, the processor 21 detects the region in the fluorescence image 81 in which the fluorescence 92 is detected for the first time at each time point (hereinafter referred to as the fluorescence start region 84), generates a fluorescent agent diffusion image 85 by superimposing the fluorescence start regions 84 at the respective time points in mutually different display modes, and is configured to display the generated fluorescent agent diffusion image 85 on the display unit 30.
  • The fluorescence start region 84 is an example of the "region in the fluorescence image in which fluorescence is detected for the first time" in the claims.
  • The configuration of the image processing unit 20 will now be described.
  • The processor 21 is composed of, for example, a CPU, a GPU, an FPGA configured for image processing, or the like.
  • The storage unit 22 includes a volatile and / or non-volatile memory, a storage device such as an HDD, and the like.
  • The storage unit 22 stores the programs executed by the processor 21.
  • The storage unit 22 also stores various image data, such as the image data obtained from the fluorescence imaging device 10 and the fluorescent agent diffusion images 85 generated by the image processing unit 20.
  • The input / output unit 23 receives the video signal containing the fluorescence images 81 and the visible light images 82 generated by the fluorescence imaging device 10.
  • The input / output unit 23 also outputs images from the image processing unit 20 to the display unit 30.
  • The input / output unit 23 is a video output interface such as HDMI (registered trademark) or an interface for connecting other external devices.
  • The input / output unit 23 is connected to the fluorescence imaging device 10 (output unit 135) and to the display unit 30 by wire or wirelessly.
  • The image processing unit 20 includes a visible light image acquisition unit 211, a fluorescence image acquisition unit 212, a region extraction unit 213, a pulsation cycle acquisition unit 214, a diffusion image generation unit 215, and an image composition unit 216 as functional blocks.
  • A functional block here means a unit of information processing functionality realized by the processor 21 of the image processing unit 20 executing a program. Each of these functional blocks may instead be composed of separate hardware (a separate processor).
  • The visible light image acquisition unit 211 acquires the visible light images 82 captured by the visible light imaging unit 112 of the fluorescence imaging device 10 via the input / output unit 23.
  • The visible light image acquisition unit 211 outputs the acquired visible light images 82 to the image composition unit 216.
  • The fluorescence image acquisition unit 212 acquires the fluorescence images 81 captured by the fluorescence imaging unit 111 of the fluorescence imaging device 10 via the input / output unit 23.
  • The fluorescence image acquisition unit 212 outputs the acquired fluorescence images 81 to the region extraction unit 213.
  • The individual fluorescence images 81 acquired sequentially in chronological order are identified by frame numbers.
  • The frame number of a fluorescence image 81 represents its imaging time.
  • The fluorescence images 81 of the respective frame numbers capture the same position on the subject 1 at different imaging times.
  • The fluorescence image 81 at each time point can therefore be restated as the fluorescence image 81 of each frame number.
  • The region extraction unit 213 extracts the fluorescence start regions 84 in the fluorescence images 81 generated in time series by the fluorescence imaging unit 111.
  • The region extraction unit 213 detects the fluorescence start region 84 by comparing the fluorescence images 81 generated at a plurality of time points with the fluorescence images 81 generated before each time point.
  • The fluorescence start region 84 is the region in which the fluorescence 92 is detected for the first time, in the fluorescence image 81 generated at each time point after the start of imaging.
  • The fluorescence start region 84 is, for example, an individual pixel of the fluorescence image 81. That is, the region extraction unit 213 extracts the fluorescence start regions 84 in the fluorescence image 81 in units of pixels.
  • The region extraction unit 213 may instead extract a group of pixels as one fluorescence start region 84.
  • The fluorescence image 81 is an image obtained by detecting the fluorescence 92 generated from the fluorescent agent 3, and the pixel values of the fluorescence image 81 represent the fluorescence intensity.
  • The fluorescent agent 3 diffuses with the passage of time as it is carried by the blood flow.
  • When the fluorescent agent 3 passes a certain position in the fluorescence image 81, the pixel value at that position rises from the state in which no fluorescence 92 is detected (the state in which the pixel value is at the background level). After that, when the fluorescent agent 3 flows away, the pixel value drops and returns to the state in which no fluorescence 92 is detected.
  • The fluorescence start region 84 is thus the region in which the rising edge of the pixel value is detected for the first time in the temporal change of the pixel value.
  • In the illustrated fluorescence images 81, regions whose pixel values are higher than a certain value are hatched, and regions whose pixel values are equal to or less than that value are shown plain.
  • The fluorescence start region 84 extracted from each fluorescence image 81 is shown with hatching.
  • Since the fluorescence start region 84 extracted in a fluorescence image 81 is a region in which the fluorescence 92 is detected for the first time after the start of imaging, the same fluorescence start region 84 is not extracted more than once.
  • For example, the fluorescence start region 84a is extracted from the fluorescence image 81 of a certain frame number (M1).
  • In the next fluorescence image 81, the fluorescence start region 84b is extracted in the surrounding region adjacent to the fluorescence start region 84a of frame number (M1), reflecting the diffusion of the fluorescent agent 3.
  • The fluorescence start region 84b does not include the fluorescence start region 84a of frame number (M1).
  • Similarly, the fluorescence start region 84c is extracted in the surrounding region adjacent to the fluorescence start region 84b of frame number (M2).
  • The method for extracting the fluorescence start region 84 in the fluorescence image 81 is not particularly limited; here, three examples shown in FIGS. 7 to 9 will be described.
  • FIGS. 7 to 9 are graphs showing the time intensity curve (TIC) 71 of one pixel in the fluorescence image 81.
  • In each graph, the vertical axis indicates the pixel value (that is, the fluorescence intensity) and the horizontal axis indicates the frame number (that is, the elapsed time).
  • In the first example, the processor 21 extracts the fluorescence start region 84 based on the pixel value of each pixel of the fluorescence image 81 exceeding a predetermined threshold value 72.
  • That is, the region extraction unit 213 extracts the pixels whose pixel values exceed the threshold value 72 as the fluorescence start region 84.
  • The threshold value 72 is set to a predetermined value higher than the pixel value (background level) before the rise of the pixel value (region 71a).
  • In the second example, the processor 21 extracts the fluorescence start region 84 based on the slope of the time intensity curve 71 of each pixel of the fluorescence image 81 exceeding a predetermined threshold value 73.
  • That is, the region extraction unit 213 extracts the pixels whose TIC slope exceeds the threshold value 73 as the fluorescence start region 84.
  • The slope of the time intensity curve 71 is, for example, the difference (amount of change) between the pixel values of two adjacent frames.
  • The threshold value 73 is a predetermined value larger than the slope in the region 71a.
  • In the third example, the processor 21 (region extraction unit 213) extracts the fluorescence start region 84 based on the area value of the time intensity curve 71 of each pixel of the fluorescence image 81 exceeding a predetermined threshold value 74.
  • That is, the region extraction unit 213 extracts the pixels whose TIC area value exceeds the threshold value 74 as the fluorescence start region 84.
  • The area value of the time intensity curve 71 is, for example, the integrated value of the pixel values exceeding the background level over the frames. Instead of the area value, the average value of the time intensity curve 71 (the area value divided by the number of frames) may be used.
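  • For illustration only (not part of the original disclosure), the following minimal Python/NumPy sketch shows the three extraction criteria together with the rule that a fluorescence start region is extracted only once. The function name, array layout, and threshold values are assumptions.

```python
import numpy as np

def extract_fluorescence_start(frames, extracted_mask, thr_value=30.0,
                               thr_slope=5.0, thr_area=100.0, mode="value"):
    """Return a boolean mask of pixels that start fluorescing in the latest
    frame, excluding pixels already extracted in earlier frames.

    frames: list of 2-D fluorescence images in chronological order.
    extracted_mask: boolean array, True where a pixel was already extracted.
    """
    latest = frames[-1].astype(np.float64)
    if mode == "value":
        # Criterion 1 (FIG. 7): pixel value exceeds threshold 72.
        candidate = latest > thr_value
    elif mode == "slope":
        # Criterion 2 (FIG. 8): frame-to-frame difference exceeds threshold 73.
        prev = frames[-2].astype(np.float64)
        candidate = (latest - prev) > thr_slope
    else:
        # Criterion 3 (FIG. 9): integrated TIC value exceeds threshold 74,
        # taking the first frame as the background level.
        stack = np.stack(frames).astype(np.float64)
        area = np.clip(stack - stack[0], 0.0, None).sum(axis=0)
        candidate = area > thr_area
    # A fluorescence start region is extracted only once (first detection).
    new_start = candidate & ~extracted_mask
    extracted_mask |= new_start
    return new_start
```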
  • By recording in which frame number (that is, at what time point) each fluorescence start region 84 was extracted among the fluorescence images 81 acquired in time series, the processor 21 can identify the location and the time point at which the fluorescence 92 was first detected.
  • A fluorescence start region 84 extracted in the fluorescence image 81 of one frame number is not extracted again in the fluorescence images 81 of subsequent frame numbers.
  • The region extraction unit 213 outputs information identifying the extracted fluorescence start regions 84 to the diffusion image generation unit 215.
  • The pulsation cycle acquisition unit 214 acquires the pulsation cycle 60 of the blood flow of the subject 1.
  • The processor 21 integrates the number of fluorescence start regions 84 for each generated fluorescence image 81 and is configured to detect the pulsation cycle 60 based on the number of fluorescence start regions 84 in each fluorescence image 81 generated in time series. In the configuration in which the fluorescence start regions 84 are extracted in units of single pixels, the number of fluorescence start regions 84 integrated for one fluorescence image 81 is the total number of pixels extracted as fluorescence start regions 84 in that fluorescence image 81.
  • The fluorescent agent 3 is carried by the bloodstream of the subject 1: it branches from the arteries into smaller blood vessels, diffuses into the capillaries and body tissue, and then flows into the veins via the capillaries.
  • The flow of blood as it penetrates the body tissue is called perfusion.
  • The fluorescent agent 3 contained in the blood diffuses by perfusion into the body tissue within the imaging field of view. Since perfusion is driven by the pulsation of blood caused by the beating of the heart, the diffusion of the fluorescent agent 3 also changes periodically in response to the pulsation.
  • FIG. 10 shows a heartbeat waveform 75 detected by an electrocardiograph and a pulse waveform 76 detected by a pulse wave meter.
  • The horizontal axis represents time and the vertical axis represents signal strength.
  • Each peak in the waveform 75 corresponds to the arrival of the systolic blood pressure peak of the heart, and the corresponding peak in the waveform 76 is formed with a time lag due to the propagation delay of the blood pressure peak.
  • The inventor of the present application found that the fluorescent agent observed in the fluorescence images 81 diffuses periodically with the pulsation derived from the heartbeat. Therefore, in the present embodiment, the change in each pulsation cycle 60 is visualized from the fluorescence images 81 acquired in time series.
  • FIG. 11 shows a graph 65 of the change in the number of fluorescence start regions 84 with the passage of time.
  • The horizontal axis of the graph 65 shows the elapsed time (that is, the frame number), and the vertical axis shows the number of fluorescence start regions 84 (the total number of fluorescence start regions 84 contained in the frame image). Looking at this change, peaks 66 in the number of fluorescence start regions 84 are formed corresponding to the peaks in the waveform 75 and the waveform 76 shown in FIG. 10.
  • The fluorescent agent 3 diffuses rapidly together with the blood at the timing corresponding to a heartbeat or pulse peak, and the diffusion rate of the fluorescent agent 3 slows until the next peak arrives. A peak 66 in the number of fluorescence start regions 84 is therefore formed at the timing corresponding to each heartbeat or pulse peak. Since the peaks 66 are caused by the beating of the heart, one peak is formed per heartbeat (one pulsation).
  • The processor 21 acquires the change in the number of fluorescence start regions 84 for each fluorescence image 81 and is configured to detect the pulsation cycle 60 based on the peaks 66 in that number. In the graph 65, the period between two adjacent peaks 66 is the pulsation cycle 60 of the blood caused by the beating of the heart.
  • The pulsation cycle acquisition unit 214 detects the pulsation cycle 60 based on, for example, the time interval between the vertices of adjacent peaks 66 or the time interval between the rising or falling points of adjacent peaks 66.
  • The pulsation cycle acquisition unit 214 identifies in which frame number, among the fluorescence images 81 acquired sequentially in chronological order, a peak 66 in the number of fluorescence start regions 84 was detected.
  • The pulsation cycle acquisition unit 214 specifies the period from the frame number of the immediately preceding peak 66 to the frame number of the newly detected peak 66 as one pulsation cycle 60.
  • For the first peak 66, the period from the start of imaging to the frame number of that peak 66 is specified as one pulsation cycle 60.
  • A pulsation cycle 60 can thus be specified by a start frame number that is the start point of the cycle and an end frame number that is the end point of the cycle.
  • The fluorescence images 81 whose frame numbers lie between the start frame number and the end frame number (inclusive) belong to the same pulsation cycle 60.
  • In the example shown, the first peak 66a is detected at frame number (N1), the second peak 66b at frame number (N2), the third peak 66c at frame number (N3), the fourth peak 66d at frame number (N4), and the fifth peak 66e at frame number (N5).
  • Accordingly, frame number (1) to frame number (N1) form cycle 1, frame number (N1 + 1) to frame number (N2) form cycle 2, frame number (N2 + 1) to frame number (N3) form cycle 3, frame number (N3 + 1) to frame number (N4) form cycle 4, and frame number (N4 + 1) to frame number (N5) form cycle 5.
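  • For illustration only (not part of the original disclosure), the following minimal Python sketch shows how peaks 66 could be detected from the per-frame counts of fluorescence start regions 84 and how a frame number maps to its pulsation cycle 60. The simple local-maximum test, the function names, and the toy data are assumptions.

```python
def detect_peaks(counts):
    """Return the frame indices of peaks 66 in the per-frame count of
    newly extracted fluorescence start regions (graph 65). A peak is a
    local maximum strictly above both of its neighbours."""
    return [i for i in range(1, len(counts) - 1)
            if counts[i - 1] < counts[i] > counts[i + 1]]

def cycle_of_frame(frame_idx, peaks):
    """Map a frame index to its pulsation cycle number: frames up to and
    including the first peak N1 belong to cycle 1, frames N1+1..N2 to
    cycle 2, and so on."""
    return 1 + sum(1 for p in peaks if frame_idx > p)

counts = [0, 2, 9, 4, 3, 8, 2, 1, 7, 3]           # toy data
peaks = detect_peaks(counts)                       # -> [2, 5, 8]
print([cycle_of_frame(i, peaks) for i in range(len(counts))])
# -> [1, 1, 1, 2, 2, 2, 3, 3, 3, 4]
```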
  • The diffusion image generation unit 215 (processor 21, see FIG. 5) generates the fluorescent agent diffusion image 85 by superimposing the fluorescence start regions 84 at the respective time points in mutually different display modes.
  • The diffusion image generation unit 215 generates the fluorescent agent diffusion image 85 (see FIG. 12) based on the fluorescence start regions 84 extracted by the region extraction unit 213 and the pulsation cycles 60 acquired by the pulsation cycle acquisition unit 214.
  • The fluorescent agent diffusion image 85 is an image in which the extracted fluorescence start regions 84 are displayed in a different mode for each pulsation cycle 60.
  • The diffusion image generation unit 215 sequentially acquires the fluorescence start regions 84 (see FIG. 6) extracted by the region extraction unit 213, in chronological order from frame number (1). The diffusion image generation unit 215 also acquires the pulsation cycles 60 detected by the pulsation cycle acquisition unit 214; that is, it acquires the frame numbers at which the peaks 66 in the number of fluorescence start regions 84 were detected.
  • The diffusion image generation unit 215 displays all the fluorescence start regions 84 extracted from the fluorescence images 81 belonging to the same pulsation cycle 60 in the same display mode.
  • The diffusion image generation unit 215 changes the display mode of the fluorescence start regions 84 from one pulsation cycle 60 to the next.
  • In this way, the diffusion image generation unit 215 (processor 21) generates a fluorescent agent diffusion image 85 in which the fluorescence start regions 84 of the respective pulsation cycles 60 are superimposed in mutually different display modes.
  • FIG. 12 shows the flow of the generation process of the fluorescent agent diffusion image 85.
  • The region extraction unit 213 acquires the fluorescence start regions 84 in order from frame number (1).
  • Until the first peak 66 in the number of fluorescence start regions 84 is detected, the diffusion image generation unit 215 generates a fluorescent agent diffusion image 85 in which the fluorescence start regions 84 extracted from each fluorescence image 81 are displayed in the same mode 86a.
  • Frame number (1) to frame number (N1) are detected by the pulsation cycle acquisition unit 214 as the first pulsation cycle 60 (cycle 1).
  • The diffusion image generation unit 215 therefore generates a fluorescent agent diffusion image 85 in which the fluorescence start regions 84 extracted from the fluorescence images 81 of frame number (1) to frame number (N1) are displayed in the same mode 86a.
  • In FIG. 12, the difference in display mode is represented by different hatching applied to the fluorescence start regions 84.
  • The pulsation cycle acquisition unit 214 detects frame number (N1 + 1) to frame number (N2) as the second pulsation cycle 60 (cycle 2).
  • The diffusion image generation unit 215 generates a fluorescent agent diffusion image 85 in which the fluorescence start regions 84 extracted from the fluorescence images 81 belonging to cycle 2 are displayed in the same mode 86b.
  • Mode 86b is a display mode different from mode 86a, and the user can visually distinguish the fluorescence start regions 84 displayed in mode 86a from those displayed in mode 86b.
  • The pulsation cycle acquisition unit 214 detects frame number (N2 + 1) to frame number (N3) as the third pulsation cycle 60 (cycle 3).
  • The diffusion image generation unit 215 generates a fluorescent agent diffusion image 85 in which the fluorescence start regions 84 extracted from the fluorescence images 81 belonging to cycle 3 are displayed in the same mode 86c.
  • Mode 86c is a display mode different from modes 86a and 86b.
  • In this way, the processor 21 (diffusion image generation unit 215) generates a fluorescent agent diffusion image 85 that displays the extracted fluorescence start regions 84 in a different mode for each pulsation cycle 60.
  • The processor 21 (diffusion image generation unit 215) generates (updates) the fluorescent agent diffusion image 85 for each frame.
  • That is, each time the latest fluorescence image 81 is acquired, the processor 21 is configured to add the fluorescence start region 84 extracted from that fluorescence image 81 to the fluorescent agent diffusion image 85.
  • For frame number (1), the diffusion image generation unit 215 generates a fluorescent agent diffusion image 85 containing the fluorescence start region 84a.
  • In the next frame, the fluorescence start region 84b is extracted in the surrounding region adjacent to the fluorescence start region 84a of frame number (1), and the diffusion image generation unit 215 generates a fluorescent agent diffusion image 85 in which the fluorescence start region 84b is added to the fluorescence start region 84a.
  • In the frame after that, the fluorescence start region 84c is extracted in the surrounding region adjacent to the fluorescence start region 84b of frame number (2), and the diffusion image generation unit 215 generates a fluorescent agent diffusion image 85 in which the fluorescence start region 84c is added to the fluorescent agent diffusion image 85 of frame number (2).
  • In this way, each time a fluorescence image 81 is acquired between the start frame number (1) and the end frame number (N1) of the pulsation cycle 60, the diffusion image generation unit 215 adds the fluorescence start region 84 extracted from that image to the fluorescent agent diffusion image 85.
  • The fluorescent agent diffusion image 85 is thus generated as a moving image in which the fluorescence start region 84 gradually expands with the passage of time until the pulsation cycle 60 switches to the next cycle.
  • In the fluorescent agent diffusion image 85 at the end frame number of a pulsation cycle 60, all the fluorescence start regions 84 extracted within that pulsation cycle 60 are displayed.
  • Each time a pulsation cycle 60 elapses, the processor 21 is configured to add the fluorescence start regions 84 extracted in the latest pulsation cycle 60 to the fluorescent agent diffusion image 85, in addition to the fluorescence start regions 84 extracted in the past pulsation cycles 60.
  • During the first pulsation cycle 60 (cycle 1), the diffusion image generation unit 215 generates a fluorescent agent diffusion image 85 containing the fluorescence start regions 84 in mode 86a.
  • The diffusion image generation unit 215 keeps displaying the fluorescence start regions 84 in mode 86a even after the first pulsation cycle 60 (cycle 1) has elapsed.
  • During the second pulsation cycle 60 (cycle 2), the diffusion image generation unit 215 generates a fluorescent agent diffusion image 85 containing the fluorescence start regions 84 in mode 86b.
  • That is, the diffusion image generation unit 215 displays the fluorescence start regions 84 of mode 86b on the fluorescent agent diffusion image 85 in addition to the fluorescence start regions 84 of mode 86a.
  • During the third pulsation cycle 60 (cycle 3), the diffusion image generation unit 215 displays the fluorescence start regions 84 of mode 86c on the fluorescent agent diffusion image 85 in addition to the fluorescence start regions 84 of modes 86a and 86b.
  • In this way, the diffusion image generation unit 215 changes the display mode of the fluorescence start regions 84 each time a pulsation cycle 60 elapses, and adds the fluorescence start regions 84 belonging to the new pulsation cycle 60 to the fluorescent agent diffusion image 85. As a result, as shown in FIG. 12, the fluorescent agent diffusion image 85 is generated as a moving image in which the display mode of the fluorescence start regions 84 changes step by step with the passage of the pulsation cycles 60 and the displayed fluorescence start region 84 gradually expands. In the fluorescent agent diffusion image 85, a boundary line 87 is formed at each timing at which the pulsation cycle 60 switches (the timing at which the display mode switches).
  • The boundary line 87 is recognizable as a boundary because the display modes of the adjacent fluorescence start regions 84 differ (the adjacent fluorescence start regions 84 belong to different pulsation cycles 60).
  • The fluorescent agent diffusion image 85 thus displays the fluorescence start regions 84 divided by a number of boundary lines 87 corresponding to the number of pulsation cycles 60 that have elapsed by the end of imaging.
  • The fluorescent agent diffusion image 85 is therefore an image in which the boundary lines 87 indicating the timings at which the pulsation cycle 60 switches are formed like contour lines.
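  • For illustration only (not part of the original disclosure), the following Python/NumPy sketch shows one way to hold the fluorescent agent diffusion image 85 as a per-pixel label map that records the pulsation cycle 60 in which each pixel first fluoresced. The class and attribute names are assumptions.

```python
import numpy as np

class DiffusionImage:
    """Per-pixel cycle label map backing the diffusion image 85
    (0 = fluorescence not yet detected)."""

    def __init__(self, shape):
        self.cycle_label = np.zeros(shape, dtype=np.int32)
        self.current_cycle = 1

    def add_frame(self, new_start_mask):
        # Add the fluorescence start region of the latest frame; a pixel
        # keeps the label of the cycle in which it was first extracted.
        update = new_start_mask & (self.cycle_label == 0)
        self.cycle_label[update] = self.current_cycle

    def next_cycle(self):
        # Called when a peak 66 is detected, i.e. the pulsation cycle 60
        # switches; regions added from now on get a new display mode, and
        # a boundary line 87 appears between differently labelled regions.
        self.current_cycle += 1
```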
  • The image composition unit 216 (processor 21) performs a process of combining the visible light image 82 acquired by the visible light image acquisition unit 211 with the fluorescent agent diffusion image 85 generated by the diffusion image generation unit 215. Composition includes a process of superimposing a plurality of images. Specifically, as shown in FIG. 14, the image composition unit 216 generates a superimposed image 88 by superimposing the fluorescent agent diffusion image 85 on the visible light image 82. In the generated superimposed image 88, the fluorescent agent diffusion image 85, which shows how the fluorescent agent 3 has diffused, is therefore displayed over the visible light image 82, which captures the treatment target site 2 as it is actually visible to the user.
  • The fluorescence image 81 and the visible light image 82 captured by the fluorescence imaging device 10 are images of the same field of view.
  • The fluorescent agent diffusion image 85 is generated from the fluorescence images 81 and has the same field of view as the fluorescence image 81. The visible light image 82 and the fluorescent agent diffusion image 85 therefore capture the same field of view, so the superimposed image 88 can be generated by simply superimposing them, without any registration between the images.
  • The image composition unit 216 outputs the generated superimposed image 88 to the input / output unit 23.
  • The input / output unit 23 outputs the superimposed image 88 acquired from the image composition unit 216 to the display unit 30, which displays it on its screen.
  • In this way, the processor 21 causes the display unit 30 to display the generated fluorescent agent diffusion image 85.
  • That is, the processor 21 (image composition unit 216) is configured to superimpose the fluorescent agent diffusion image 85 on the visible light image 82 and display the result on the display unit 30.
  • Each time the processor 21 acquires the frame images (fluorescence image 81 and visible light image 82) of the latest frame from the fluorescence imaging device 10, it generates the fluorescent agent diffusion image 85 for that frame and generates a superimposed image 88 by superimposing the fluorescent agent diffusion image 85 on the visible light image 82.
  • The processor 21 outputs the superimposed image 88 generated for each frame to the display unit 30, so that the superimposed image 88 is displayed as a moving image.
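  • For illustration only (not part of the original disclosure), a minimal sketch of generating the superimposed image 88 follows. Because the visible light image 82 and the diffusion image 85 share the same field of view, no registration is needed and a per-pixel blend suffices; the alpha value and function name are assumptions.

```python
import numpy as np

def superimpose(visible_rgb, diffusion_rgb, diffusion_mask, alpha=0.5):
    """Blend the colored diffusion image over the visible light image,
    only where a fluorescence start region has been extracted."""
    out = visible_rgb.astype(np.float64).copy()
    m = diffusion_mask  # boolean, True where the cycle label is nonzero
    out[m] = (1.0 - alpha) * out[m] + alpha * diffusion_rgb.astype(np.float64)[m]
    return np.clip(out, 0, 255).astype(np.uint8)
```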
  • FIGS. 15 to 17 are specific examples of the superimposed image 88 containing the fluorescent agent diffusion image 85, and show a flap of the subject 1 to be transplanted in a flap technique (skin graft).
  • A flap is skin, subcutaneous tissue, and deep tissue that retains its blood flow.
  • FIGS. 15 to 17 show superimposed images 88 in which the fluorescent agent diffusion image 85 is superimposed on a visible light image 82 capturing the skin 2a and the subcutaneous tissue 2b of the treatment target site 2.
  • The processor 21 displays the fluorescence start regions 84 of the respective pulsation cycles 60 in mutually different modes by at least one of (1) displaying the fluorescence start regions 84 of each pulsation cycle 60 in different gradations and (2) displaying lines indicating the fluorescence start regions 84 of each pulsation cycle 60 in the form of a contour map.
  • In the first display mode, the processor 21 displays the fluorescence start regions 84 of each pulsation cycle 60 in different gradations.
  • A gradation is a step of color or brightness expressed in an image.
  • Gradations include stepwise changes in color and stepwise changes in brightness.
  • For example, the gradations include a stepwise change in brightness from white (255) through gray (126) to black (0).
  • In this case, the image processing unit 20 sets the oldest pulsation cycle 60 (cycle 1) to white (255) and gradually reduces the gradation value toward black (0) as the cycles approach the latest cycle K.
  • When the fluorescent agent diffusion image 85 is a color image, a color is expressed by combining the three gradation values of R (red, 0 to 255), G (green, 0 to 255), and B (blue, 0 to 255).
  • The gradations also include stepwise changes in which the color is varied in a predetermined order, such as red, yellow, green, and blue.
  • For example, the processor 21 sets the oldest pulsation cycle 60 (cycle 1) to red and the latest cycle K to blue, and assigns to each of the cycles 1 to K a color gradation stepping from red toward blue.
  • The boundary lines 87 are drawn with broken lines for convenience, to make the regions with different gradations easier to distinguish.
  • The fluorescent agent diffusion image 85 thus displays the fluorescence start regions 84 extracted in the respective pulsation cycles 60 so that they can be distinguished by their difference in gradation.
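  • For illustration only (not part of the original disclosure), the following sketch shows the two gradation rules described above: a grayscale ramp from white (oldest cycle) toward black, and a red-to-blue color ramp from cycle 1 to the latest cycle K. The linear interpolation is an assumption; the embodiment only requires stepwise, mutually distinguishable gradations.

```python
import numpy as np

def grayscale_for_cycle(cycle, total_cycles):
    """Cycle 1 -> 255 (white); the latest cycle approaches 0 (black)."""
    t = (cycle - 1) / max(total_cycles - 1, 1)
    return int(round(255 * (1.0 - t)))

def color_for_cycle(cycle, total_cycles):
    """Cycle 1 -> red (255, 0, 0); the latest cycle K -> blue (0, 0, 255)."""
    t = (cycle - 1) / max(total_cycles - 1, 1)
    return (int(round(255 * (1.0 - t))), 0, int(round(255 * t)))

def colorize(cycle_label, total_cycles):
    """Build an RGB image from the per-pixel cycle label map (0 = background)."""
    rgb = np.zeros(cycle_label.shape + (3,), dtype=np.uint8)
    for c in range(1, total_cycles + 1):
        rgb[cycle_label == c] = color_for_cycle(c, total_cycles)
    return rgb
```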
  • In the second display mode, the processor 21 displays lines indicating the fluorescence start regions 84 of each pulsation cycle 60 in the form of a contour map.
  • A contour map shows a distribution by drawing lines connecting points with equal values.
  • As described above, a boundary line 87 is formed between the fluorescence start regions 84 belonging to one pulsation cycle 60 and the fluorescence start regions 84 belonging to the next pulsation cycle 60.
  • Displaying the lines indicating the fluorescence start regions 84 of each pulsation cycle 60 in the form of a contour map means displaying these boundary lines 87 on the image. As the fluorescent agent 3 diffuses, each temporally newer boundary line 87 is formed outside the temporally older boundary lines 87, so that the fluorescent agent diffusion image 85 takes the form of a contour map.
  • The processor 21 extracts a boundary line 87 and displays it on the fluorescent agent diffusion image 85 each time a pulsation cycle 60 elapses. As a result, the fluorescent agent diffusion image 85 shown in FIG. 16 is generated.
  • This fluorescent agent diffusion image 85 displays the fluorescence start regions 84 extracted in the respective pulsation cycles 60 so that each can be recognized as the region between two adjacent boundary lines 87.
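  • For illustration only (not part of the original disclosure), a sketch of extracting the boundary lines 87 from the cycle label map follows: a pixel lies on a boundary where a 4-neighbour carries a different, nonzero cycle label, i.e. where adjacent fluorescence start regions belong to different pulsation cycles 60. The neighbour test is an assumption.

```python
import numpy as np

def boundary_mask(cycle_label):
    """Boolean mask of pixels on a boundary line 87."""
    b = np.zeros(cycle_label.shape, dtype=bool)
    for dy, dx in ((0, 1), (1, 0)):          # right and down neighbours
        a = cycle_label[: -dy or None, : -dx or None]
        n = cycle_label[dy:, dx:]
        diff = (a != n) & (a > 0) & (n > 0)  # differing, both fluorescing
        b[: -dy or None, : -dx or None] |= diff
        b[dy:, dx:] |= diff
    return b
```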
  • The display mode using gradations and the display mode using lines may also be combined.
  • In that case, the processor 21 displays the fluorescence start regions 84 of each pulsation cycle 60 in different gradations, and additionally displays the boundary line 87 of the latest pulsation cycle 60 (cycle K).
  • That is, when the processor 21 detects the elapse of a pulsation cycle 60 (the switch to the next pulsation cycle 60), it changes the gradation of the fluorescence start regions 84 belonging to that pulsation cycle 60, draws the boundary line 87 of the latest pulsation cycle 60 (cycle K), and erases the boundary line of the previous pulsation cycle 60 (cycle K - 1).
  • The processor 21 makes the display mode of the boundary line 87 different from the display modes of the fluorescence start regions 84 of the respective pulsation cycles 60.
  • The rules for the display modes of the fluorescence start regions 84 shown in FIGS. 15 to 17 are stored in the storage unit 22.
  • The processor 21 is configured to accept a selection of the display mode, for example, through a user's operation input.
  • The processor 21 generates the fluorescent agent diffusion image 85 in the display mode selected by the user, according to the display mode setting information stored in the storage unit 22.
  • The treatment support device 100 implements the treatment support method of the present embodiment.
  • The treatment support method of the present embodiment includes at least the following steps (1) to (5): (1) irradiating the subject 1 with the excitation light 91 of the fluorescent agent 3 administered to the subject 1; (2) detecting the fluorescence 92 excited by the excitation light 91 and capturing the fluorescence images 81 of the subject 1; (3) detecting the fluorescence start region 84 at each time point by comparing the fluorescence images 81 generated at a plurality of time points with the fluorescence images 81 generated before each time point; (4) generating the fluorescent agent diffusion image 85 by superimposing the fluorescence start regions 84 at the respective time points in mutually different display modes; and (5) displaying the generated fluorescent agent diffusion image 85 on the display unit 30.
  • Steps (1) and (2) correspond to step 51 in FIG. 18, step (3) corresponds to step 52, step (4) corresponds to step 53, and step (5) corresponds to step 54. The treatment support method of FIG. 18 also includes additional steps beyond the above steps (1) to (5).
  • The image display processing of the treatment support device 100 is started when imaging by the fluorescence imaging device 10 is started based on an operation input by a user such as a doctor.
  • In step 51, the fluorescence image 81 and the visible light image 82 of the subject 1 are captured. That is, the excitation light irradiation unit 115 of the fluorescence imaging device 10 irradiates the subject 1 with the excitation light 91, and the fluorescence imaging unit 111 detects the fluorescence 92 excited by the excitation light 91. The visible light irradiation unit 117 irradiates the subject 1 with the visible light 93, and the visible light imaging unit 112 detects the reflected visible light 93. As a result, the fluorescence image 81 and the visible light image 82 for one frame are captured and output to the image processing unit 20.
  • In step 52, the processor 21 (region extraction unit 213) extracts the fluorescence start region 84 from the fluorescence image 81 obtained in step 51. The processor 21 (pulsation cycle acquisition unit 214) also integrates the number of fluorescence start regions 84 extracted in the fluorescence image 81. After the start of imaging, the processor 21 (pulsation cycle acquisition unit 214) assigns the obtained fluorescence images 81 to the first pulsation cycle 60 (cycle 1) until the first peak 66 in the number of fluorescence start regions 84 is detected.
  • In step 53, the processor 21 (diffusion image generation unit 215) generates the fluorescent agent diffusion image 85 based on the extracted fluorescence start regions 84 and the pulsation cycles 60.
  • The processor 21 generates the fluorescent agent diffusion image 85 with the fluorescence start regions 84 displayed in the preset display mode, according to the setting information stored in the storage unit 22.
  • In step 54, the processor 21 (image composition unit 216) superimposes the fluorescent agent diffusion image 85 on the visible light image 82 to generate the superimposed image 88 and outputs it to the display unit 30 (see FIG. 1) via the input / output unit 23. As a result, the superimposed image 88 for one frame of the moving image is displayed on the display unit 30.
  • In step 55, the processor 21 (pulsation cycle acquisition unit 214) determines whether a pulsation cycle 60 has elapsed. That is, the processor 21 (pulsation cycle acquisition unit 214) plots the number of fluorescence start regions 84 extracted from the fluorescence image 81 of the current frame number on the graph 65 shown in FIG. 11, and determines whether a peak 66 in the change of that number is detected. If no peak 66 is detected, the processor 21 advances the processing to step 56.
  • In step 56, the processor 21 determines whether to end the imaging.
  • The processor 21 determines that the imaging is to end when, for example, an operation input for ending the imaging is received from the user or a preset end time is reached.
  • In that case, the fluorescence imaging device 10 ends the imaging, the processor 21 stops the image processing, and the image display processing of FIG. 18 ends.
  • Otherwise, the processor 21 returns the processing to step 51, acquires the images of the next frame (fluorescence image 81 and visible light image 82), and performs the processing of steps 51 to 54 again.
  • the processor 21 pulses the peak 66 of the change in the number of the fluorescence start regions 84.
  • the processor 21 advances the process to step 57.
  • step 57 the processor 21 determines the display mode of the fluorescence start region 84 belonging to the next pulsation cycle 60. That is, the processor 21 determines the display mode of the fluorescence start region 84 extracted after the next frame number according to the setting information of the display mode stored in the storage unit 22. Next, the processor 21 determines whether or not to end the imaging in step 56, and if not, processes the fluorescence image 81 and the visible light image 82 having the next frame number in step 51.
As a result, the fluorescence start regions 84 extracted from the fluorescence images 81 of the frames after the display mode is determined in step 57 are displayed in the fluorescent agent diffusion image 85 in a mode different from that of the fluorescence start regions 84 belonging to the previous pulsation cycle 60. The display mode determined in step 57 is then applied up to the frame at which the next peak 66 of the change in the number of fluorescence start regions 84 is detected. In this way, in the fluorescent agent diffusion image 85, the fluorescence start regions 84 extracted from the fluorescence images 81 of the respective frames are displayed in a different manner for each pulsation cycle 60.
As described above, the image display processing by the treatment support device 100 is performed. From the fluorescent agent diffusion image 85 (superimposed image 88) displayed on the display unit 30, a doctor or the like can observe how the fluorescent agent 3 diffuses in each pulsation cycle 60. The fluorescent agent diffusion image 85 thus provides a doctor or the like with useful information for evaluating the quality of blood perfusion at the treatment target site 2.
The treatment support device 100 of the above embodiment includes the excitation light irradiation unit 115 that irradiates the subject 1 with the excitation light 91 of the administered fluorescent agent 3, the fluorescence imaging unit 111 that detects the fluorescence 92 excited by the excitation light 91 and captures the fluorescence image 81 of the subject 1, and the image processing unit 20 that has the processor 21 performing image processing and outputs an image to the display unit 30. The processor 21 detects the fluorescence start region 84 at each time point, generates the fluorescent agent diffusion image 85 by superimposing the fluorescence start regions 84 at the respective time points in modes different from each other, and displays the generated fluorescent agent diffusion image 85 on the display unit 30.
Similarly, the treatment support method of the above embodiment includes a step of irradiating the subject with the excitation light 91 of the administered fluorescent agent, a step of detecting the fluorescence 92 excited by the excitation light and capturing the fluorescence image 81 of the subject, and a step of detecting the region in which the fluorescence 92 is detected for the first time at each time point by comparing the fluorescence image 81 generated at each of a plurality of time points (frame numbers) with the fluorescence images 81 before the generation time point.
With these configurations, a fluorescent agent diffusion image 85 is generated in which the regions where fluorescence is first detected at the respective time points (fluorescence start regions 84) are displayed in modes different from each other. Since each fluorescence start region 84 is considered to be the region where the fluorescent agent 3 diffused by the bloodstream is first detected, the fluorescent agent diffusion image 85 displays, in a distinguishable manner, how the fluorescent agent 3 diffuses with the passage of time. Therefore, the fluorescent agent diffusion image 85 can display the state of diffusion of the administered fluorescent agent 3 as a state in which the fluorescence start regions 84 expand at each time point. Thereby, the state of diffusion of the fluorescent agent 3 administered to the subject 1 can be appropriately displayed.
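Viewed over a whole recorded sequence, this frame-to-frame comparison reduces to a per-pixel "first detection time" map, as the following sketch illustrates (the intensity threshold is an assumed parameter; the numeral 72 used for the threshold in the text is a reference sign, not a value):

```python
import numpy as np

def first_detection_map(frames: np.ndarray, threshold: float) -> np.ndarray:
    """For a stack of fluorescence frames (T, H, W), return per pixel the
    index of the first frame whose intensity exceeds the threshold, or
    -1 where the pixel never fluoresces. The fluorescence start region
    of frame t is then exactly (result == t)."""
    above = frames > threshold                      # (T, H, W) bool
    ever_on = above.any(axis=0)
    first = above.argmax(axis=0).astype(np.int64)   # first True per pixel
    first[~ever_on] = -1
    return first
```

Colour-mapping this map directly would give a static version of the diffusion display; the embodiment instead builds it up frame by frame so the moving image can be shown during imaging.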
Further, the fluorescent agent diffusion image 85 can provide a doctor or the like who treats the patient with useful information for evaluating the quality of blood perfusion in the body tissue of the treatment target site. That is, it is possible to provide a fluorescent agent diffusion image 85 that is useful for specifying a region to be treated and for confirming the result of treatment.
In the above embodiment, the processor 21 extracts the fluorescence start region 84 based on the pixel value of each pixel of the fluorescence image 81 exceeding the predetermined threshold value 72. With this configuration, the fluorescence start region 84 can be extracted easily, simply by comparing the pixel value of each pixel of the fluorescence image 81 with the threshold value 72.
Alternatively, the processor 21 extracts the fluorescence start region 84 based on the slope of the time intensity curve 71 of each pixel of the fluorescence image 81 exceeding the predetermined threshold value 72. With this configuration, the fluorescence start region 84 can be extracted easily from the slope of the time intensity curve 71 of each pixel.
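A sketch of this slope-based variant, under the assumption of a uniform frame interval so that a finite difference can stand in for the slope of the time intensity curve:

```python
import numpy as np

def start_regions_by_slope(frames: np.ndarray,
                           slope_threshold: float) -> np.ndarray:
    """Return a (T, H, W) bool stack; frame t marks the pixels whose
    time-intensity curve first rises faster than slope_threshold
    (intensity units per frame) at frame t."""
    slope = np.diff(frames.astype(np.float64), axis=0, prepend=frames[:1])
    steep = slope > slope_threshold
    first_steep = np.cumsum(steep, axis=0) == 1  # 1 only until the 2nd hit
    return steep & first_steep
```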
Alternatively, the processor 21 extracts the fluorescence start region 84 based on the area value of the time intensity curve 71 of each pixel of the fluorescence image 81 exceeding the predetermined threshold value 72. With this configuration, the fluorescence start region 84 can be extracted easily from the area value of the time intensity curve 71 of each pixel. Instead of the area value of the time intensity curve 71, the average value of the time intensity curve 71 may be used; the fluorescence start region 84 can be extracted easily in this case as well.
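And a sketch of the area-value variant, again assuming uniform frame spacing; the mean-value alternative mentioned in the text only rescales the threshold by the number of frames, as noted in the comment:

```python
import numpy as np

def start_region_by_area(frames: np.ndarray, t: int,
                         area_threshold: float) -> np.ndarray:
    """Mark pixels whose accumulated time-intensity-curve area (a simple
    per-frame sum here) first exceeds area_threshold at frame t.
    For the mean-value variant, compare area_now / (t + 1) instead."""
    area_now = frames[:t + 1].astype(np.float64).sum(axis=0)
    area_prev = frames[:t].astype(np.float64).sum(axis=0)  # zeros at t == 0
    return (area_now > area_threshold) & (area_prev <= area_threshold)
```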
In the above embodiment, the processor 21 acquires the pulsation cycle 60 of the subject and generates the fluorescent agent diffusion image 85 in which the fluorescence start regions 84 of the respective pulsation cycles 60 are superimposed in modes different from each other. The generated fluorescent agent diffusion image 85 thus distinguishably displays, for each pulsation cycle 60, the change in which the fluorescent agent 3 diffuses with the pulsation derived from the heartbeat. Therefore, the fluorescent agent diffusion image 85 can more appropriately display the diffusion of the fluorescent agent 3 administered to the subject as a state in which the fluorescence start regions 84 expand every pulsation cycle 60.
In the above embodiment, the fluorescence imaging unit 111 is configured to generate the fluorescence images 81 at a time interval shorter than the pulsation cycle 60, and the processor 21 is configured to integrate the number of fluorescence start regions 84 for each of the generated fluorescence images 81 and to detect the pulsation cycle 60 based on the number of fluorescence start regions 84 for each fluorescence image 81 generated in time series. With this configuration, the pulsation cycle 60 of the imaged portion can be acquired directly from the fluorescence images 81 acquired in time series.
To detect the pulsation cycle 60, it is therefore not necessary to separately provide a device for detecting pulsation such as an electrocardiograph or a pulse meter, so that the device configuration can be simplified. Further, when pulsation is detected by an electrocardiograph, for example, only the pulsation cycle of the heart is detected, and a time difference arises with respect to the pulsation cycle 60 of the actually imaged portion. According to the above configuration for detecting the pulsation cycle 60 from the fluorescence images 81, the pulsation cycle 60 of the actually imaged portion can be detected directly, so that the state of diffusion of the fluorescent agent 3 accompanying the pulsation can be visualized more accurately.
In the above embodiment, the processor 21 is configured to acquire the change in the number of fluorescence start regions 84 for each fluorescence image 81 and to detect the pulsation cycle 60 based on the peaks 66 of the number of fluorescence start regions 84. With this configuration, the number of fluorescence start regions 84 (that is, the amount of movement of the fluorescent agent 3) increases sharply in response to the heartbeat (contraction of the ventricles) in every pulsation cycle 60 and forms the peak 66, so the pulsation cycle 60 of the actually imaged portion can be detected accurately from this peak 66.
The treatment support device of the above embodiment further includes the visible light imaging unit 112 that detects the visible light 93 reflected from the subject 1 and captures the visible light image 82 of the subject 1, and the processor 21 superimposes the fluorescent agent diffusion image 85 on the visible light image 82 and displays it on the display unit 30.
Since the fluorescence image 81 and the fluorescent agent diffusion image 85 based on it are images of the fluorescence 92 generated from the fluorescent agent 3, they cannot include information on the morphology of the imaged site that can be recognized with the visible light 93. By superimposing the fluorescent agent diffusion image 85 on the visible light image 82, the state of diffusion of the fluorescent agent 3 can be displayed identifiably on the visible light image 82 actually viewed by a user such as a doctor. As a result, it is possible to easily identify the region to be treated and the region in which to confirm the result of treatment.
In the above embodiment, the processor 21 is configured to add, to the fluorescent agent diffusion image 85, the fluorescence start regions 84 extracted in the latest pulsation cycle 60 in addition to the fluorescence start regions 84 extracted in the past pulsation cycles 60. With this configuration, the fluorescence start regions 84 extracted in the past pulsation cycles 60 are not erased from the image, and new fluorescence start regions 84 are additionally displayed each time a pulsation cycle 60 elapses, so the manner in which the fluorescent agent 3 diffuses with the pulsation can be displayed identifiably and easily.
Specifically, every time a fluorescence image 81 included in the current pulsation cycle 60 is generated, the processor 21 adds the fluorescence start region 84 extracted from that latest fluorescence image 81 to the fluorescent agent diffusion image 85.
In the above embodiment, the processor 21 displays the fluorescence start regions 84 of the respective pulsation cycles 60 in modes different from each other, by at least one of displaying the fluorescence start regions 84 of each pulsation cycle 60 in different gradations and displaying the lines indicating the fluorescence start regions 84 of each pulsation cycle 60 in a pattern equivalent to contour lines (isolines). With this configuration, the fluorescence start regions 84 of the respective pulsation cycles 60 can be displayed in a visually easily distinguishable manner. As a result, convenience for users such as doctors can be improved.
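One way to realise such a display, sketched under the assumption of a fixed per-cycle colour gradation (the palette below is hypothetical), is to keep one mask per elapsed pulsation cycle and paint later cycles over earlier ones without erasing them:

```python
import numpy as np

# Hypothetical gradation: one colour per pulsation cycle, dark to bright.
CYCLE_COLOURS = np.array(
    [[0, 64, 0], [0, 128, 0], [0, 192, 0], [0, 255, 0]], dtype=np.uint8)

def render_diffusion_image(cycle_masks: list) -> np.ndarray:
    """Compose a fluorescent agent diffusion image in which the start
    regions of each pulsation cycle keep a distinct gradation and past
    cycles remain visible. cycle_masks: (H, W) bool masks, oldest first."""
    h, w = cycle_masks[0].shape
    img = np.zeros((h, w, 3), dtype=np.uint8)
    for i, mask in enumerate(cycle_masks):
        img[mask] = CYCLE_COLOURS[i % len(CYCLE_COLOURS)]
    return img
```

Drawing the boundary of each mask instead of filling it would give the contour-line style of display.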
In the above embodiment, the processor 21 detects the pulsation cycle 60 based on the peaks 66 (see FIG. 11) of the number of fluorescence start regions 84, but the present invention is not limited to this. For example, the processor 21 may detect the pulsation cycle 60 based on either the heartbeat waveform 75 (see FIG. 10) detected by an electrocardiograph or the pulse waveform 76 (see FIG. 10) detected by a pulse wave meter.
In this case, the treatment support device 100 includes the waveform acquisition unit 40 that acquires the heartbeat or pulse waveform of the subject 1. The waveform acquisition unit 40 includes, for example, an electrocardiograph and detects the heartbeat waveform 75 shown in FIG. 10. Alternatively, the waveform acquisition unit 40 includes, for example, a pulse wave meter and detects the pulse waveform 76 shown in FIG. 10.
The processor 21 is configured to detect the pulsation cycle 60 based on the heartbeat waveform 75 or the pulse waveform 76 instead of the peaks 66 of the number of fluorescence start regions 84. In the case of the heartbeat waveform 75, the processor 21 detects, for example, feature points such as the peaks 75a indicating the QRS complex and detects the pulsation cycle 60 as the time interval between the feature points. In the case of the pulse waveform 76, the processor 21 detects, as feature points, the peaks 76a corresponding to the QRS waves of the heartbeat waveform 75 and detects the pulsation cycle 60 as the time interval between the feature points.
In this modification, the waveform acquisition unit 40 that acquires the heartbeat or pulse waveform of the subject 1 is provided, and the processor 21 is configured to detect the pulsation cycle 60 based on the heartbeat or pulse waveform (75 or 76). With this configuration, the pulsation cycle 60 can be detected easily, simply by acquiring the detection signal from the waveform acquisition unit 40.
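A minimal sketch of deriving the pulsation cycle from such a waveform: rising threshold crossings stand in for the feature points (peaks 75a / 76a in the text), and the median inter-peak interval gives the cycle. Real QRS detectors (e.g. Pan-Tompkins) are far more robust; the relative threshold here is an assumption:

```python
import numpy as np

def pulsation_period(waveform: np.ndarray, fs: float,
                     rel_height: float = 0.6) -> float:
    """Estimate the pulsation cycle in seconds from an ECG or pulse-wave
    trace sampled at fs Hz, using rising threshold crossings as feature
    points and the median interval between them as the cycle length."""
    lo, hi = float(waveform.min()), float(waveform.max())
    above = waveform > lo + rel_height * (hi - lo)
    crossings = np.flatnonzero(~above[:-1] & above[1:])
    if len(crossings) < 2:
        raise ValueError("need at least two feature points")
    return float(np.median(np.diff(crossings))) / fs
```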
Note, however, that the heartbeat waveform 75 reflects only the heartbeat itself, so there is a time lag with respect to the pulsation of blood at the treatment target site 2 actually imaged in the fluorescence image 81. Similarly, the pulse waveform 76 may have a time lag with respect to the pulsation of blood at the treatment target site 2, depending on the length of the blood circulation path between the measurement point of the pulse wave meter and the treatment target site 2. In contrast, when the processor 21 detects the pulsation cycle 60 based on the peaks 66 of the number of fluorescence start regions 84, the pulsation of blood at the actually imaged treatment target site 2 can be detected directly from the fluorescence images 81, so the state of diffusion of the fluorescent agent 3 accompanying the pulsation can be captured more accurately.
In the above embodiment, the image processing unit 20 is a PC provided separately from the fluorescence imaging device 10, but the present invention is not limited to this. The image processing unit 20 may be provided integrally with the fluorescence imaging device 10. In that case, the image processing unit 20 is provided in the main body unit 13 of the fluorescence imaging device 10. For example, the processor that constitutes the image generation unit 133 of the main body unit 13 may be configured to also function as the processor 21 of the image processing unit 20; that is, the image processing unit 20 and the image generation unit 133 may be provided integrally.
In the above embodiment, examples were shown in which the fluorescence start region 84 is extracted based on the pixel value of the fluorescence image 81 exceeding the predetermined threshold value 72 (see FIG. 7), based on the slope of the time intensity curve 71 exceeding the predetermined threshold value 72, or based on the area value of the time intensity curve 71 exceeding the predetermined threshold value 72, but the present invention is not limited to these. The fluorescence start region 84 may be extracted based on a quantity other than the pixel value, the slope of the time intensity curve 71, and the area value of the time intensity curve 71.
In the above embodiment, the processor 21 is configured to acquire the change in the number of fluorescence start regions 84 for each fluorescence image 81 and to detect the pulsation cycle 60 based on the peaks 66 of the number of fluorescence start regions 84, but the present invention is not limited to this. The processor 21 may detect the pulsation cycle 60 based on a periodically occurring pattern other than the peaks 66 of the number of fluorescence start regions 84.
In the above embodiment, an example was shown in which the processor 21 displays, on the display unit 30, the superimposed image 88 in which the fluorescent agent diffusion image 85 is superimposed on the visible light image 82, but the present invention is not limited to this. For example, the processor 21 may display the fluorescent agent diffusion image 85 alone, without superimposing it on the visible light image 82. Further, the processor 21 may display the visible light image 82 and the fluorescent agent diffusion image 85 side by side on the screen.
In the above embodiment, the treatment support device 100 includes the visible light imaging unit 112, but the present invention is not limited to this; the treatment support device 100 does not have to include the visible light imaging unit 112.
In the above embodiment, an example was shown in which the processor 21 displays the fluorescence start regions 84 of the respective pulsation cycles 60 in modes different from each other by at least one of (1) displaying the fluorescence start regions 84 of each pulsation cycle 60 in different gradations and (2) displaying the lines indicating the fluorescence start regions 84 of each pulsation cycle 60 in a contour-line (isoline) pattern, but the present invention is not limited to this. The processor 21 may use any display mode as long as the display modes of the fluorescence start regions 84 differ from one pulsation cycle 60 to another.
In the above embodiment, ICG is exemplified as the fluorescent agent 3, but the present invention is not limited to this; a fluorescent agent 3 other than ICG may be used. Examples of the fluorescent agent 3 other than ICG include 5-ALA (5-aminolevulinic acid) and IR700. Although 5-ALA itself does not show fluorescence, protoporphyrin IX (PpIX), which is a metabolite of 5-ALA administered to the subject 1, becomes a fluorescent substance; in the present specification, such substances are therefore also included in the fluorescent agent 3. In short, the fluorescent agent 3 may be any fluorescent substance used for fluorescence diagnosis of patients and the like.
A treatment support device comprising: an excitation light irradiation unit that irradiates a subject with excitation light of a fluorescent agent administered to the subject; a fluorescence imaging unit that detects the fluorescence excited by the excitation light and captures a fluorescence image of the subject; and an image processing unit that has a processor performing image processing and outputs an image to a display unit, wherein the processor is configured to detect the region in the fluorescence image in which fluorescence is first detected at each time point by comparing the fluorescence images generated at a plurality of time points with the fluorescence images before their generation time points, to generate a fluorescent agent diffusion image by superimposing the regions at the respective time points in modes different from each other, and to display the generated fluorescent agent diffusion image on the display unit.
The treatment support device in which the fluorescence imaging unit is configured to generate the fluorescence images at a time interval shorter than the pulsation cycle, and the processor is configured to integrate the number of the regions for each of the generated fluorescence images and to detect the pulsation cycle based on the number of the regions for each fluorescence image generated in time series.
The treatment support device according to item 5, further comprising a waveform acquisition unit that acquires the heartbeat or pulse waveform of the subject, wherein the processor is configured to detect the pulsation cycle based on the heartbeat or pulse waveform.
(Aspect 13) A treatment support method comprising: a step of irradiating a subject with excitation light of a fluorescent agent administered to the subject; a step of detecting the fluorescence excited by the excitation light and capturing a fluorescence image of the subject; a step in which a processor detects the region in the fluorescence image in which fluorescence is first detected at each time point by comparing the fluorescence image generated at each of a plurality of time points with the fluorescence images before the generation time point; a step in which the processor generates a fluorescent agent diffusion image by superimposing the regions at the respective time points in modes different from each other; and a step of displaying the generated fluorescent agent diffusion image on a display unit.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Medical Informatics (AREA)
  • Engineering & Computer Science (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Pathology (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Investigating, Analyzing Materials By Fluorescence Or Luminescence (AREA)

Abstract

A processor (21) of a treatment support device (100) compares fluorescence images (81) generated at a plurality of time points so as to detect, in the respective fluorescence images, the regions (84) in which fluorescence is detected for the first time at the respective time points, and generates a fluorescent agent diffusion image (85) by superimposing the respective regions at the respective time points in mutually different modes.
PCT/JP2019/030554 2019-08-02 2019-08-02 Dispositif d'assistance à un traitement médical et méthode d'assistance à un traitement médical WO2021024314A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/JP2019/030554 WO2021024314A1 (fr) 2019-08-02 2019-08-02 Dispositif d'assistance à un traitement médical et méthode d'assistance à un traitement médical
JP2021538536A JP7306461B2 (ja) 2019-08-02 2019-08-02 治療支援装置および治療支援方法

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2019/030554 WO2021024314A1 (fr) 2019-08-02 2019-08-02 Dispositif d'assistance à un traitement médical et méthode d'assistance à un traitement médical

Publications (1)

Publication Number Publication Date
WO2021024314A1 true WO2021024314A1 (fr) 2021-02-11

Family

ID=74502552

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/030554 WO2021024314A1 (fr) 2019-08-02 2019-08-02 Dispositif d'assistance à un traitement médical et méthode d'assistance à un traitement médical

Country Status (2)

Country Link
JP (1) JP7306461B2 (fr)
WO (1) WO2021024314A1 (fr)


Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4425098B2 (ja) 2004-09-06 2010-03-03 浜松ホトニクス株式会社 蛍光顕微鏡および蛍光相関分光解析装置
EP3506624B1 (fr) 2016-09-28 2022-12-21 Panasonic Holdings Corporation Système d'affichage

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000023903A (ja) * 1998-05-01 2000-01-25 Asahi Optical Co Ltd 蛍光診断用電子内視鏡装置
JP2004313470A (ja) * 2003-04-16 2004-11-11 Kao Corp 虫歯検出装置及びこれを実現する虫歯検出プログラム
JP2007021006A (ja) * 2005-07-20 2007-02-01 Hitachi Medical Corp X線ct装置
JP2010521198A (ja) * 2007-03-08 2010-06-24 シンク−アールエックス,リミティド 運動する器官と共に使用するイメージング及びツール
WO2012147820A1 (fr) * 2011-04-28 2012-11-01 オリンパス株式会社 Dispositif d'observation fluorescent et son procédé d'affichage d'images
JP2015527100A (ja) * 2012-06-21 2015-09-17 ノバダック テクノロジーズ インコーポレイテッド 血管造影及びかん流の定量化並びに解析手法
JP2015147048A (ja) * 2014-02-07 2015-08-20 バイオセンス・ウエブスター・(イスラエル)・リミテッドBiosense Webster (Israel), Ltd. 異なる心拍数にて獲得された心臓の画像シーケンス間を同期すること
WO2018167816A1 (fr) * 2017-03-13 2018-09-20 株式会社島津製作所 Appareil d'imagerie

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113693724A (zh) * 2021-08-19 2021-11-26 南京诺源医疗器械有限公司 适用于荧光影像导航手术的照射方法、装置及存储介质
CN113693724B (zh) * 2021-08-19 2022-10-14 南京诺源医疗器械有限公司 适用于荧光影像导航手术的照射方法、装置及存储介质

Also Published As

Publication number Publication date
JP7306461B2 (ja) 2023-07-11
JPWO2021024314A1 (fr) 2021-02-11

Similar Documents

Publication Publication Date Title
JP5081992B2 (ja) 術中血管造影を行なうための方法および装置
US8892190B2 (en) Method and apparatus for performing intra-operative angiography
US6915154B1 (en) Method and apparatus for performing intra-operative angiography
JP5634755B2 (ja) 電子内視鏡システム、電子内視鏡用のプロセッサ装置、及び電子内視鏡システムの作動方法
US20230320577A1 (en) Systems and methods for medical imaging using a rolling shutter imager
KR101647022B1 (ko) 의료 영상 획득 장치 및 방법
JP5460488B2 (ja) 電子内視鏡システム、電子内視鏡用のプロセッサ装置、画像検索システム、及び電子内視鏡システムの作動方法
CN108472088A (zh) 用于管理从医学成像导出的数据的方法和系统
JP2009226072A (ja) 手術支援方法及び装置
JP2019136269A (ja) 蛍光撮像装置
US20230081866A1 (en) Methods and systems for generating simulated intraoperative imaging data of a subject
WO2017122431A1 (fr) Dispositif d'analyse d'image, système d'analyse d'image, et procédé d'actionnement de dispositif d'analyse d'image
US20110267444A1 (en) Endoscope apparatus, method, and computer readable medium
WO2021024314A1 (fr) Dispositif d'assistance à un traitement médical et méthode d'assistance à un traitement médical
JP4533673B2 (ja) 赤外観察システム及び赤外観察システムによる作動方法
US20230039047A1 (en) Image processing apparatus, image processing method, navigation method and endoscope system
JP5844447B2 (ja) 電子内視鏡システム、電子内視鏡用のプロセッサ装置、及び電子内視鏡システムの作動方法
CN116744834A (zh) 医疗图像处理装置、方法及程序
CN116724334A (zh) 计算机程序、学习模型的生成方法、以及手术辅助装置
JP2021065293A (ja) 画像処理方法、画像処理装置、画像処理プログラム、教師データ生成方法、教師データ生成装置、教師データ生成プログラム、学習済みモデル生成方法、学習済みモデル生成装置、診断支援方法、診断支援装置、診断支援プログラム、およびそれらのプログラムを記録した記録媒体
WO2022239495A1 (fr) Système d'observation de tissu biologique, dispositif d'observation de tissu biologique et procédé d'observation de tissu biologique
WO2020203034A1 (fr) Système endoscopique
WO2018216658A1 (fr) Appareil de capture d'image, système de capture d'image et procédé de capture d'image
JP2021126153A (ja) 医療用画像処理装置および医療用観察システム
JP2021132695A (ja) 医療用画像処理装置、医療用観察システムおよび画像処理方法

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19940661

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2021538536

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19940661

Country of ref document: EP

Kind code of ref document: A1