WO2022239495A1 - Biological tissue observation system, biological tissue observation device, and biological tissue observation method - Google Patents

Biological tissue observation system, biological tissue observation device, and biological tissue observation method

Info

Publication number
WO2022239495A1
Authority
WO
WIPO (PCT)
Prior art keywords
biological tissue
image
observation system
light
tissue observation
Prior art date
Application number
PCT/JP2022/013701
Other languages
English (en)
Japanese (ja)
Inventor
和樹 池下
信二 勝木
大介 菊地
Original Assignee
Sony Group Corporation (ソニーグループ株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Group Corporation
Publication of WO2022239495A1

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 1/00: Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B 1/04: Such instruments combined with photographic or television appliances
    • A61B 1/045: Control thereof
    • G: PHYSICS
    • G02: OPTICS
    • G02B: OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 21/00: Microscopes
    • G02B 21/06: Means for illuminating specimens
    • G02B 21/36: Microscopes arranged for photographic purposes or projection purposes or digital imaging or video purposes including associated control and data processing arrangements
    • G02B 23/00: Telescopes, e.g. binoculars; Periscopes; Instruments for viewing the inside of hollow bodies; Viewfinders; Optical aiming or sighting devices
    • G02B 23/24: Instruments or systems for viewing the inside of hollow bodies, e.g. fibrescopes

Definitions

  • The present disclosure relates to a biological tissue observation system, a biological tissue observation device, and a biological tissue observation method.
  • NBI: narrow band imaging
  • FI: fluorescence imaging
  • IRI: infra-red imaging (infrared light observation)
  • Non-Patent Document 1 discloses an overview of fluorescence observation using indocyanine green (ICG) as a fluorescent substance and an example of surgery based on such fluorescence observation.
  • ICG: indocyanine green
  • When an image sensor detects the absolute value of the brightness of the observation light (for example, fluorescence), the signal output from the image sensor may saturate. In such a case, it is difficult to observe minute differences or changes in the luminance of observation light whose luminance is at or above a certain value in an observation image based on that signal.
  • When the brightness of the observation light is low, the signal becomes weak and is buried in noise and the like, making it difficult to observe the image itself.
  • The present disclosure therefore proposes a biological tissue observation system, a biological tissue observation device, and a biological tissue observation method capable of capturing minute differences and changes in the observation light without being affected by the brightness of the observation light itself.
  • According to the present disclosure, there is provided a biological tissue observation system including an event vision sensor that detects, as an event, a change in the luminance value of light in a first wavelength band emitted from biological tissue, and an image processing unit that generates a first image based on a plurality of pieces of sensing data output from the event vision sensor at different timings.
  • According to the present disclosure, there is also provided a biological tissue observation device including an event vision sensor that detects, as an event, a change in the luminance value of light in the first wavelength band emitted from biological tissue, and an image processing unit that generates a first image based on a plurality of pieces of sensing data output from the event vision sensor at different timings.
  • According to the present disclosure, there is further provided a biological tissue observation method in which an event vision sensor detects, as an event, a change in the luminance value of light in the first wavelength band emitted from biological tissue, and an image processing unit generates a first image based on a plurality of pieces of sensing data output from the event vision sensor at different timings.
  • FIG. 1 is a block diagram showing an example of the configuration of an EVS 200 used in an embodiment of the present disclosure.
  • FIG. 2 is a block diagram showing an example of the configuration of a pixel 302 located in a pixel array section 300 of the EVS 200 shown in FIG. 1.
  • FIG. 3 is a diagram showing an example of the configuration of a medical observation system 10 according to a first embodiment of the present disclosure.
  • FIG. 4 is a diagram showing an example of the functional configuration of a signal processing section 500 according to the first embodiment of the present disclosure.
  • FIG. 5 is a flowchart of an observation method according to the first embodiment of the present disclosure.
  • FIG. 6 is an explanatory diagram of an example of an observation image (photograph) according to the first embodiment.
  • FIG. 7 is an explanatory diagram for describing the first embodiment of the present disclosure.
  • FIG. 8 is an explanatory diagram for describing Modification 1 of the first embodiment of the present disclosure.
  • FIG. 9 is a flowchart of an observation method according to Modification 2 of the first embodiment of the present disclosure.
  • FIGS. 10 to 12 are explanatory diagrams (parts 1 to 3) for describing Modification 2 of the first embodiment of the present disclosure.
  • FIG. 13 is a diagram showing an example of the configuration of a medical observation system 10a according to a second embodiment of the present disclosure.
  • FIG. 14 is a diagram showing an example of the functional configuration of a signal processing unit 500a according to the second embodiment of the present disclosure.
  • FIG. 15 is a flowchart of an observation method according to the second embodiment of the present disclosure.
  • FIG. 16 is an explanatory diagram of an example of an observation image according to the second embodiment.
  • FIG. 17 is a diagram showing an example of temporal changes in the brightness of each site in cerebral artery bypass surgery.
  • FIG. 18 is a diagram showing an example of a drawing result based on FIG. 17.
  • FIG. 19 is a functional block diagram showing a configuration example of the hardware of an information processing device constituting the medical observation system 10 according to the embodiment.
  • FIG. 20 is a diagram showing an example of the schematic configuration of an endoscope system.
  • FIG. 21 is a block diagram showing an example of the functional configurations of the camera and the CCU shown in FIG. 20.
  • FIG. 22 is a diagram showing an example of the schematic configuration of a microsurgery system.
  • In this specification, a tissue (for example, an organ or epithelial tissue) obtained from a living body (for example, a human body or a plant), or a tissue section or cell that is part of such a tissue, is referred to as biological tissue.
  • Examples of biological tissue include tubular biological tissue in the living body, such as blood vessels and lymphatic vessels.
  • ICG can emit fluorescence having a wavelength around 820 nm (that is, light in the near-infrared band) when irradiated with excitation light having a wavelength around 808 nm.
  • Various fluorescent substances have been proposed for use in fluorescence observation, for example from the viewpoint of accumulating more selectively in lesions such as cancer, or of reducing the effects (side effects) of administration on the subject. Among such fluorescent substances, some emit fluorescence in a wavelength band different from that of ICG, including fluorescent substances that emit light in the visible wavelength band.
  • In fluorescence observation, for example, by observing the presence or absence of fluorescence emitted by a fluorescent substance such as ICG and the temporal change of that fluorescence, the operator (user) can visually observe blood flow and lymphatic flow.
  • Portions where desired media (for example, blood and lymph) flow well through blood vessels and lymph vessels can also be distinguished from portions where they flow poorly.
  • When an image sensor detects the absolute value of the luminance of the fluorescence emitted from the fluorescent substance, its output signal may saturate.
  • In such a case, it is difficult to observe minute differences or changes in the luminance of fluorescence whose luminance is above a certain value in a fluorescence image based on that signal.
  • When the luminance of the fluorescence is low, the signal becomes weak and is buried in noise and the like, making it difficult to observe the fluorescence image itself.
  • Observation may be made possible by adjusting the sensitivity (amplification factor) of the image sensor that detects the fluorescence; however, when the field of view contains both portions whose brightness is so high that the signal saturates and portions too dark to observe, it is difficult to adjust the sensitivity appropriately.
  • EVS: event vision sensor
  • The EVS is a sensor that detects changes in luminance rather than the absolute value of luminance, and has higher sensitivity than general RGB sensors.
  • The EVS has no concept of frame rate: it outputs time stamp information (time) and pixel information (position and amount of change in brightness) whenever a change in brightness exceeding a threshold occurs. Even when the luminance is high, the EVS detects only the change in luminance, so the output signal does not saturate. With the EVS, therefore, minute differences and changes in fluorescence luminance can be captured with high temporal resolution without being affected by the absolute value of the fluorescence luminance.
  • In fluorescence observation, the brightness of the fluorescence may increase over time until the signal from a conventional sensor saturates; it is therefore required that minute changes in the brightness of the fluorescence be observable without being affected by its absolute intensity.
  • In fluorescence observation, the purpose is often to observe the change of the fluorescence over time. By using the EVS, the change in the fluorescence itself can therefore be captured without being affected by other, stationary signals.
  • FIG. 1 is a block diagram showing an example of the configuration of the EVS 200 used in the embodiments of the present disclosure, and FIG. 2 is a block diagram showing an example of the configuration of the pixels 302 located in the pixel array section 300 of the EVS 200 shown in FIG. 1.
  • The EVS 200 has a pixel array section 300 in which a plurality of pixels 302 (see FIG. 2) are arranged in a matrix.
  • Each pixel 302 can generate, as a pixel signal, a voltage corresponding to the photocurrent generated by photoelectric conversion.
  • Each pixel 302 can detect the presence or absence of an event by comparing the change in photocurrent, which corresponds to the amount of change in the luminance of incident light (light emitted from the object), with a predetermined threshold. In other words, a pixel 302 detects an event when the amount of luminance change exceeds the predetermined threshold.
  • The EVS 200 has a drive circuit 211, an arbiter section (arbitration section) 213, a column processing section 214, and a signal processing section 212 as peripheral circuit sections of the pixel array section 300.
  • When detecting an event, each pixel 302 can output to the arbiter unit 213 a request for permission to output event data representing the occurrence of the event. Each pixel 302 then outputs the event data to the drive circuit 211 and the signal processing unit 212 upon receiving from the arbiter unit 213 a response permitting the output. A pixel 302 that has detected an event also outputs the pixel signal generated by photoelectric conversion to the column processing unit 214.
  • The drive circuit 211 can drive each pixel 302 of the pixel array section 300. For example, the drive circuit 211 drives a pixel 302 that has detected an event and output event data, causing the pixel signal of that pixel 302 to be output to the column processing unit 214.
  • The arbiter unit 213 arbitrates the requests for output of event data supplied from the pixels 302, responds based on the arbitration result (permission or non-permission of the output), and can send a reset signal to the pixels 302 to reset event detection.
  • The column processing unit 214 can perform, for each column of the pixel array section 300, processing for converting the analog pixel signals output from the pixels 302 of that column into digital signals.
  • The column processing unit 214 can also perform CDS (correlated double sampling) processing on the digitized pixel signals.
  • The signal processing unit 212 performs predetermined signal processing on the digitized pixel signals supplied from the column processing unit 214 and on the event data output from the pixel array section 300, and can output the processed event data (time stamp information, etc.) and pixel signals.
  • A change in the photocurrent generated by a pixel 302 can be regarded as a change in the amount of light (a luminance change) incident on that pixel 302; an event can therefore also be said to be a luminance change of a pixel 302 exceeding a predetermined threshold. The event data representing the occurrence of an event can include at least positional information, such as coordinates, representing the position of the pixel 302 where the change in the amount of light occurred.
  • Each pixel 302 has a light receiving section 304, a pixel signal generation section 306, and a detection section (event detection section) 308.
  • The light receiving section 304 can photoelectrically convert incident light to generate a photocurrent, and can supply a voltage signal corresponding to the photocurrent to either the pixel signal generation section 306 or the detection section 308 under the control of the drive circuit 211.
  • The pixel signal generation section 306 can generate a pixel signal from the signal supplied from the light receiving section 304, and can supply the generated analog pixel signal to the column processing unit 214 via the vertical signal line VSL (not shown) corresponding to its column of the pixel array section 300.
  • The detection section 308 can detect whether an event has occurred based on whether the amount of change in the photocurrent from the light receiving section 304 has exceeded a predetermined threshold.
  • Events can include, for example, an ON event indicating that the amount of change in photocurrent (the amount of luminance change) has exceeded an upper threshold, and an OFF event indicating that it has fallen below a lower threshold.
  • The detection section 308 may detect only ON events.
  • When an event occurs, the detection section 308 can output to the arbiter unit 213 a request for permission to output event data representing the occurrence of the event, and, upon receiving a response from the arbiter unit 213, can output the event data to the drive circuit 211 and the signal processing unit 212.
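  • As a rough illustration of this per-pixel behavior, the following Python sketch models the detection section: each simulated pixel compares the change in log intensity since its last event against an upper and a lower threshold and emits an ON or OFF event carrying a timestamp and coordinates. This is a conceptual sketch only; the function names, the array-based model, and the use of log intensity are assumptions, not the circuit described above.

      import numpy as np

      def detect_events(ref_log_i, new_log_i, t, on_th=0.2, off_th=0.2):
          # Compare each pixel's change in log intensity against the thresholds.
          # Returns (t, x, y, polarity) tuples and the updated reference levels,
          # which are reset only at pixels where an event fired.
          diff = new_log_i - ref_log_i
          on_y, on_x = np.where(diff > on_th)        # ON event: brightness rose past the upper threshold
          off_y, off_x = np.where(diff < -off_th)    # OFF event: brightness fell past the lower threshold
          events = [(t, int(x), int(y), +1) for y, x in zip(on_y, on_x)]
          events += [(t, int(x), int(y), -1) for y, x in zip(off_y, off_x)]
          ref = ref_log_i.copy()
          fired = (diff > on_th) | (diff < -off_th)
          ref[fired] = new_log_i[fired]              # reset the reference where an event occurred
          return events, ref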
  • FIG. 3 is a diagram showing an example of the configuration of the medical observation system 10 according to this embodiment.
  • The medical observation system 10 mainly includes an EVS 200, a light source (first illumination device) 400, and a signal processing section (image processing section) 500.
  • Each device included in the medical observation system 10 according to this embodiment is described below.
  • The EVS 200 can detect, as an event, a change in the luminance value of the observation light (for example, fluorescence) emitted from the biological tissue that is the subject 700.
  • As described above, the EVS 200 includes a pixel array section 300 having a plurality of pixels 302 arranged in a matrix, and a detection section (event detection section) 308.
  • The EVS 200 can detect changes in the luminance value of observation light in a predetermined wavelength band (first wavelength band), for example fluorescence with a wavelength around 820 nm (that is, light in the near-infrared band).
  • The EVS 200 can thus detect, for example, changes in the brightness of the fluorescence emitted by a fluorescent substance (a drug containing a substance capable of emitting light in the first wavelength band) administered once or multiple times to the biological tissue that is the subject 700.
  • The EVS 200 preferably has a cut filter (CF) 250 that cuts (does not transmit) visible light and excitation light, in order to suitably detect the fluorescence.
  • Alternatively, a prism that separates the observation light (for example, fluorescence) to be detected from the visible light and excitation light and guides it to the EVS 200 may be used.
  • As described later in Modification 1, however, the cut filter 250 for cutting such light may not be provided in some cases.
  • The light source 400 is composed of, for example, an LED (Light Emitting Diode) or a laser, and can irradiate excitation light that excites the fluorescent substance contained in the drug administered to the biological tissue that is the subject 700.
  • The excitation light is light in a wavelength band (second wavelength band) different from the wavelength band (first wavelength band) of the observation light (e.g., fluorescence) detected by the EVS 200.
  • For example, when ICG is used as the fluorescent substance, the excitation light is light having a wavelength near 808 nm; when irradiated with such excitation light, ICG emits fluorescence having a wavelength around 820 nm.
  • The signal processing unit 500 can generate image data (an observation image; first image), which is a fluorescence image, based on a plurality of signals (sensing data) output from the EVS 200 at different timings. A detailed configuration of the signal processing unit 500 is described later.
  • The medical observation system 10 is not limited to the configuration shown in FIG. 3, and may also include other communication devices such as relay devices.
  • The signal processing section 500 is not limited to being an integrated device as shown in FIG. 3, and may consist of a plurality of devices.
  • FIG. 4 is a diagram showing an example of the functional configuration of the signal processing section 500 according to this embodiment.
  • The signal processing unit 500 includes a CPU (Central Processing Unit) and the like, and functions as a processing unit and a control unit.
  • The signal processing unit 500 mainly has a signal acquisition unit 502, a frame memory (storage unit) 504, an image processing unit 506, an image adjustment unit 508, an output unit 510, a control unit 520, an adjustment unit 522, and an input unit 524.
  • Each functional unit included in the signal processing unit 500 according to this embodiment is described below.
  • The signal acquisition unit 502 can acquire the event data (time stamp information, etc.) and pixel signals that are the output signals (sensing data) from the EVS 200, and can output them to the image processing unit 506, the adjustment unit 522, and the like described later.
  • The frame memory 504 is composed of, for example, a magnetic storage device such as an HDD (Hard Disk Drive), a semiconductor storage device, an optical storage device, or a magneto-optical storage device, and can store the image data output from the image processing unit 506 described later.
  • The image processing unit 506 can generate image data (an observation image; first image), which is a fluorescence image, based on a plurality of signals (sensing data) output from the EVS 200 at different timings. Specifically, the image processing unit 506 reads the image data of the preceding frame from the frame memory 504 and adds, or sequentially adds, to it the sensing data subsequently output from the EVS 200 via the signal acquisition unit 502, thereby generating new image data for the current frame. The image processing unit 506 then outputs the newly generated image data of the current frame to the image adjustment unit 508 described later.
  • The image adjustment unit 508 performs brightness adjustment (gain adjustment) and contrast adjustment (gamma adjustment) on the image data of the current frame output from the image processing unit 506, and can output the result to the output unit 510 and the adjustment unit 522 described later.
  • The output unit 510 can output the image data from the image adjustment unit 508 to a display unit (not shown) in order to present the observation image to the user. The output unit 510 may also output the image data to the frame memory 504.
  • The display unit is configured by, for example, an LCD (Liquid Crystal Display) or an organic EL (Electro Luminescence) display.
  • The control unit 520 can control the EVS 200 and the light source 400.
  • For example, the control unit 520 can control the sensitivity (amplification factor) of the EVS 200, the threshold compared with the amount of luminance change when detecting the presence or absence of an event, and the irradiation intensity of the light source 400.
  • The adjustment unit 522 can determine (adjust) the threshold (predetermined threshold) compared with the amount of luminance change when detecting the presence or absence of an event, and can output a command signal based on the determined threshold to the control unit 520 described above.
  • For example, the adjustment unit 522 may determine the threshold based on the frequency with which signals (sensing data) are output from the EVS 200, or based on the dynamic range of luminance or the gradation of the image data (first image) output from the image adjustment unit 508.
  • The adjustment unit 522 may also determine the threshold based on the velocity of the fluorescent substance (fluorescent agent) in the biological tissue that is the subject 700 (more specifically, the flow of the medium in tubular biological tissue in the living body, such as blood flow in a blood vessel), or based on the rate of change of the fluorescence brightness.
  • The adjustment unit 522 may also determine the threshold based on the type, size, position, state, displacement speed, deformation speed, surrounding state, and the like of the biological tissue that is the subject 700, obtained from the image data. In the present embodiment, the adjustment unit 522 may recognize the type, state, surrounding state, and the like of the biological tissue using, for example, an image recognition model obtained by machine learning.
  • The adjustment unit 522 may determine the threshold based on the depth of the blood vessel or organ that is the observation target. Specifically, when the observation target is located at a deep position, the brightness of the fluorescence decreases, so the adjustment unit 522 sets the threshold small so that the EVS 200 can more reliably detect changes in the fluorescence. On the other hand, when the observation target is close to the surface, there are, for example, many blood vessels, and the brightness of the fluorescence is high. Also, when the observation target is a digestive organ, there is a lot of fat around it, and the fat blocks the fluorescence, making fluorescence observation difficult.
  • The adjustment unit 522 may determine the threshold based on the number of administrations, the type, or the concentration of the fluorescent substance (fluorescent agent) input by the user via the input unit 524 described later.
  • For example, the adjustment unit 522 may set the threshold successively smaller as the number of administrations increases.
  • The adjustment unit 522 may also determine the threshold based on the irradiation intensity of the excitation light from the light source 400, for example setting the threshold successively smaller as the irradiation intensity becomes lower. By adjusting the irradiation intensity and the threshold in conjunction with each other in this way, changes in fluorescence luminance can be detected even when the irradiation intensity is low, and the influence of the excitation light on the subject 700 can be suppressed.
  • The input unit 524 can receive input of data and commands from the user, and can output the information input by the received operation to the adjustment unit 522. More specifically, the input unit 524 can be configured with a keyboard, a touch panel, buttons, a microphone, or the like.
  • The signal processing section 500 is not limited to the functional units shown in FIG. 4, and may further include other functional units.
  • The signal processing unit 500 is not limited to being an integrated device as shown in FIG. 4, and may consist of a plurality of separate devices.
  • FIG. 5 is a flowchart of the observation method according to this embodiment
  • FIG. 6 is an explanatory diagram for explaining an example of an observed image (photograph).
  • FIG. 7 is an explanatory diagram for explaining the present embodiment, and more specifically, an explanatory diagram for explaining threshold setting.
  • The observation method according to this embodiment can include a plurality of steps, from step S101 to step S107, the details of which are described below.
  • First, the signal processing unit 500 reads the image data of the previous frame stored in the frame memory 504 (the image data most recently stored in the frame memory 504) (step S101). If the image data in the frame memory 504 has just been reset (erased), step S101 may be omitted. Next, the signal processing unit 500 adds the output signal (sensing data) output from the EVS 200 to the read image data (step S102).
  • The signal processing unit 500 adjusts the brightness (gain) of the image data to which the signal was added in step S102 (step S103), and then adjusts the contrast (gamma) (step S104).
  • The signal processing unit 500 writes the image data adjusted in steps S103 and S104 to the frame memory 504 (step S105).
  • Based on an input from the user, or on whether the image data written to the frame memory 504 in step S105 satisfies preset conditions (for example, the number of additions of the output signal or the observation time), the signal processing unit 500 determines whether the necessary image data has been obtained (step S106). If so (step S106: Yes), the process proceeds to step S107; if not (step S106: No), the process returns to step S101.
  • Finally, the signal processing unit 500 outputs the image data written to the frame memory 504 in step S105 (step S107), and ends the series of processes. A sketch of this loop follows.
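  • As a compact illustration of steps S101 to S107, the following Python sketch accumulates EVS output into frame memory and applies the gain and gamma adjustments on each pass. The evs and frame_memory interfaces, the gain and gamma values, and the stopping condition (a fixed number of additions) are assumptions standing in for details the publication leaves open.

      import numpy as np

      def observe(evs, frame_memory, gain=1.0, gamma=0.8, max_additions=100):
          # Sketch of steps S101-S107: accumulate EVS sensing data into an image.
          for _ in range(max_additions):                   # S106: stop after a preset number of additions
              image = frame_memory.read()                  # S101: read the previous frame
              for t, x, y, polarity in evs.read_events():  # S102: add the output signal
                  image[y, x] += polarity
              image = np.clip(image * gain, 0.0, 1.0)      # S103: brightness (gain) adjustment
              image = image ** gamma                       # S104: contrast (gamma) adjustment
              frame_memory.write(image)                    # S105: write back to the frame memory
          return frame_memory.read()                       # S107: output the observation image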
  • FIG. 6 shows image data (observation images) obtained as described above: the left side of FIG. 6 shows an observation image obtained by this embodiment, and the right side shows an observation image obtained by a conventional fluorescence observation method (more specifically, a method using a conventional image sensor).
  • In the conventional image, the signal from the image sensor saturates wherever the fluorescence luminance exceeds a certain value, so the entire corresponding portion becomes white and minute differences in fluorescence brightness at the observation site cannot be recognized.
  • In the observation image according to the present embodiment, by contrast, only changes in the fluorescence luminance are detected by the EVS 200 and the output signals are added to obtain the image, so the signal from the EVS 200 never saturates. As shown on the left side of FIG. 6, minute differences in fluorescence brightness can therefore be recognized in the observation image.
  • FIG. 8 is an explanatory diagram for explaining Modification 1 of the present embodiment, and more specifically, an explanatory diagram for explaining threshold setting in this modification.
  • The brightness of the fluorescence emitted from the fluorescent substance administered to the subject 700 is weaker than, for example, the visible light reflected from the subject 700 or the excitation light that excites the fluorescent substance. For this reason, in the embodiment described above, the EVS 200 preferably has the CF 250 that cuts (does not transmit) visible light and excitation light in order to suitably detect the fluorescence.
  • However, since a fluorescence image (observation image) can be obtained by integrating the output signal from the EVS 200 as it is, it is not strictly necessary to provide the CF 250 that cuts the visible light and excitation light.
  • The setting of the threshold C when such a CF 250 is not provided is described in this modification.
  • In that case, the EVS 200 detects, in addition to the fluorescence, luminance changes due to, for example, visible light reflected from the subject 700 and the excitation light.
  • The brightness of the fluorescence emitted from the fluorescent substance (fluorescent agent) administered to the subject 700 is very small compared to the visible light reflected from the subject 700, and its changes also tend to be very small. In this modification, it is therefore preferable to set the threshold C small in order to detect changes in the fluorescence luminance.
  • In the image data obtained by this modification, changes in the fluorescence brightness (lower side of FIG. 8) are sequentially added on top of the roughly constant brightness components of the visible light and the excitation light. By setting the threshold C based on the dynamic range (DR) obtained by subtracting the minimum fluorescence luminance value from the maximum fluorescence luminance value, image data that is a fluorescence image having the desired gradation can therefore be obtained.
  • The present embodiment and Modification 1 are not limited to setting the threshold C based on the dynamic range (DR); the threshold C may also be determined based on the event detection frequency.
  • For example, when the event detection frequency is higher than a predetermined frame rate (e.g., 60 fps), the threshold C may be increased to lower the event detection frequency; when it is lower than the predetermined frame rate, the threshold C may be decreased to raise it. A sketch of such a feedback rule follows.
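  • One way to realize this frequency-based rule is a simple feedback loop that nudges the threshold C toward the event rate corresponding to the target frame rate. The target of 60 fps comes from the example above; the step size, the events-per-frame budget, and the clamping bounds are illustrative assumptions.

      def adjust_threshold(c, events_per_sec, target_fps=60.0,
                           events_per_frame=1000.0, step=0.01,
                           c_min=0.05, c_max=1.0):
          # Raise C when events arrive faster than the target rate (detect less
          # sensitively); lower C when they arrive more slowly.
          target_rate = target_fps * events_per_frame
          if events_per_sec > target_rate:
              c += step
          elif events_per_sec < target_rate:
              c -= step
          return min(max(c, c_min), c_max)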
  • FIG. 9 is a flowchart of an observation method according to Modification 2 of this embodiment
  • FIGS. 10 and 11 are explanatory diagrams for explaining Modification 2 of this embodiment
  • FIG. 12 is an explanatory diagram for explaining Modification 2 of the present embodiment, and is an explanatory diagram for explaining threshold setting in this modification.
  • In Modification 2, the image data stored in the frame memory 504 is reset at the timing at which the fluorescent agent is administered for the second and subsequent times, and new output signals from the EVS 200 are then added, making it possible to obtain an observation image that is a fluorescence image of the newly administered agent. According to this modification, it is not necessary to wait for the fluorescent agent remaining in the observation area to wash out before observing again, so an increase in the time required for observation can be avoided.
  • The observation method according to this modification can mainly include a plurality of steps, from step S201 to step S205, the details of which are described below.
  • First, the user administers the fluorescent agent (fluorescent substance) to the subject 700 (for example, tubular biological tissue in the living body, such as a blood vessel) for the first time (step S201).
  • The signal processing unit 500 then sequentially adds the signals output from the EVS 200 to the image data (step S202).
  • The signal processing unit 500 determines whether a new fluorescent agent has been administered, based on an input from the user (step S203). If it determines that a new fluorescent agent has been administered (step S203: Yes), the process proceeds to step S204; if not (step S203: No), the process proceeds to step S205.
  • Step S203 is not limited to determining whether a new fluorescent agent has been administered based on an input from the user. For example, the signal processing unit 500 may perform image analysis on the image data to which the signal was added in step S202 to detect the timing of administration, that is, whether a new fluorescent agent has been administered. More specifically, the timing of the first administration can be detected from a rising edge of the luminance value in the image data to which the signals output from the EVS 200 are added, and the timing of the second and subsequent administrations can be detected from the point at which a luminance value that had been falling starts to rise again.
  • If a new agent has been administered, the signal processing unit 500 clears the output image data stored in the frame memory 504, and returns to the processing of step S202 (step S204).
  • Otherwise, the signal processing unit 500 stores the output image data generated in step S202 in the frame memory 504 (step S205), and ends the series of processes. A sketch of the administration detection and reset appears below.
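  • The image analysis suggested for step S203 might look like the following sketch: track the mean luminance of the accumulated image over time and report an administration when a value that had been falling starts rising again, then clear the frame memory as in step S204. The window size, the epsilon margin, and the frame_memory interface are assumptions.

      import numpy as np

      def administration_detected(history, window=5, eps=1e-3):
          # `history` is a list of mean luminance values of the accumulated
          # fluorescence image, sampled over time. Detect the point at which a
          # luminance curve that had decayed from its peak starts rising again.
          if len(history) < 2 * window:
              return False
          recent = sum(history[-window:]) / window
          earlier = sum(history[-2 * window:-window]) / window
          had_decayed = earlier < max(history[:-window])   # fell from an earlier peak
          return had_decayed and (recent - earlier) > eps  # ...and is rising again

      # Inside the loop of steps S202-S205:
      #   if administration_detected(history):
      #       frame_memory.write(np.zeros_like(image))     # S204: clear the stored image data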
  • V101 schematically shows an output image (fluorescence image) immediately before the first administration of the fluorescent agent.
  • V103 schematically shows an output image (fluorescence image) after the first administration of the fluorescent agent.
  • V105 schematically shows an output image (fluorescence image) immediately after the second administration of the fluorescent agent, obtained by a conventional fluorescence observation method (more specifically, a method using a conventional image sensor); it is assumed that time has passed between V103 and V105.
  • V107 schematically shows an output image (fluorescence image) output after the second administration of the fluorescent agent, obtained by a conventional fluorescence observation method.
  • V109 schematically shows an output image (fluorescence image) immediately after the second administration of the fluorescent agent according to this modified example.
  • V111 schematically shows an output image (fluorescence image) output after the second administration of the fluorescent agent according to this modified example.
  • In this modification, the image data stored in the frame memory 504 is reset at the timing of the second administration of the fluorescent agent, and the EVS 200 then detects fluorescence in the portions where the luminance increases after that administration. The visibility of the site to be observed by fluorescence can thus be ensured in the same manner as in the observation after the first administration.
  • In the above description, the first and second fluorescent agents are the same type of agent (fluorescent substance), but in this modification they may also be different types of agents.
  • The image data stored in the frame memory 504 at the timing of the second administration may also be stored separately rather than discarded, so that the user can compare the fluorescence image before the second administration with the fluorescence image after it.
  • This modification can also be applied, for example, when evaluating blood flow before and after intestinal anastomosis by fluorescence observation using ICG as the fluorescent substance.
  • In the evaluation of blood flow before and after intestinal anastomosis, it is required to recognize the boundary between areas with blood flow and areas without blood flow.
  • The first administration of the fluorescent agent is performed before the intestinal anastomosis, and a blood flow assessment is carried out. Note that it takes roughly several minutes from the administration of the fluorescent agent until the fluorescence it emits can be observed.
  • V201 in FIG. 11 schematically shows the region where fluorescence emitted by the ICG (fluorescent substance) contained in the first administered fluorescent agent was detected. The spread of the region V201 confirms the state of blood circulation (for example, how far the blood flow reaches), and the anastomosis site is determined according to the result of that confirmation.
  • The fluorescence image V210 shown in the lower left of FIG. 11 shows an example of an output image, obtained by a conventional fluorescence observation method (more specifically, a method using a conventional image sensor), presented after the second administration of the fluorescent agent.
  • V211 in the fluorescence image V210 schematically shows the anastomosis site, and V213 schematically shows the region corresponding to the first fluorescence component.
  • V215 schematically shows the region where fluorescence emitted by the ICG contained in the second administered fluorescent agent was detected. In the fluorescence image V210 obtained by the conventional fluorescence observation method, the first fluorescence component remains and its signal continues to be added in the corresponding region V213. It therefore becomes difficult to determine which portion of the fluorescence image V210 corresponds to the second fluorescence component, and hence difficult to compare the first administration (before anastomosis) with the second administration (after anastomosis).
  • The fluorescence image V220 shown in the lower right of FIG. 11 shows an example of an output image, obtained by this modification, presented after the second administration of the fluorescent agent.
  • V221 in V220 schematically shows the anastomosis site, and V223 schematically shows the region corresponding to the second fluorescence component.
  • In this modification, the region V223 corresponding to the second fluorescence component can be visually recognized with the same visibility as in the fluorescence image V200 presented after the first administration. That is, it is possible to determine which portion of the fluorescence image V220 corresponds to the second fluorescence component, and thus to compare the first administration (before anastomosis) with the second administration (after anastomosis).
  • Also in this modification, by setting the threshold C based on the dynamic range (DR) obtained by subtracting the minimum luminance value from the maximum luminance value of the fluorescence of the newly administered fluorescent agent, image data that is a fluorescence image of the newly administered agent can be obtained with the desired gradation.
  • FIG. 13 is a diagram showing an example of the configuration of a medical observation system 10a according to this embodiment.
  • The medical observation system 10a mainly includes an EVS 200, a light source (first and second illumination devices) 400a, a signal processing section 500a, and an RGB sensor 600.
  • Each device included in the medical observation system 10a according to the present embodiment is described below; descriptions of the devices common to the first embodiment are omitted.
  • The light source 400a can irradiate excitation light that excites the fluorescent agent (fluorescent substance) administered to the biological tissue that is the subject 700. The light source 400a can also irradiate the subject 700 with observation light, for example visible light (light having a wavelength of approximately 360 nm to approximately 830 nm; light in a third wavelength band). The light source 400a is not limited to irradiating excitation light and visible light; for example, short-wave infrared light (light having a wavelength of about 900 nm to about 2500 nm) may also be irradiated onto the subject 700. In the present embodiment, the light source 400a may be configured as an integrated device as shown in FIG. 13, or may consist of two or more separate devices, and is not particularly limited.
  • The signal processing unit 500a can generate an observation image of the fluorescence (fluorescence image) based on a plurality of signals (sensing data) output from the EVS 200 at different timings. In doing so, the signal processing unit 500a can perform motion compensation on the fluorescence image based on a visible light image derived from the signals from the RGB sensor 600 described later. A detailed configuration of the signal processing unit 500a is described later.
  • The RGB sensor 600 mainly includes a plurality of pixels (not shown) arranged in a matrix and a peripheral circuit section (not shown) that outputs, as pixel signals, an image based on the light incident on each pixel. The pixel signals output from the RGB sensor 600 are transmitted to the signal processing section 500a described above.
  • The RGB sensor 600 is, for example, a Bayer-array sensor capable of detecting visible light (light in a third wavelength band different from the first wavelength band) such as blue light, green light, and red light, and is preferably an image sensor capable of capturing high-resolution images of 4K or higher. By using such an image sensor, the desired visible light image can be obtained with high resolution.
  • The EVS 200 and the RGB sensor 600 are not limited to being configured as separate devices as shown in FIG. 13.
  • For example, the EVS 200 and the RGB sensor 600 may be provided on two different substrates within one device (camera head), each detecting light dispersed by a prism (not shown) provided in the device.
  • Alternatively, a plurality of pixels on one substrate may be configured to function as the EVS 200 and the RGB sensor 600.
  • The medical observation system 10a may also have an IR sensor (not shown) that detects infrared light, or a short-wave infrared (SWIR) sensor (not shown) such as an InGaAs sensor.
  • By using short-wave infrared light (light having a wavelength of about 900 nm to about 2500 nm), blood vessels located deep inside the body can be accurately captured.
  • The medical observation system 10a is not limited to the configuration shown in FIG. 13, and may include, for example, the various sensors described above.
  • The signal processing unit 500a is not limited to being an integrated device as shown in FIG. 13, and may consist of a plurality of devices.
  • FIG. 14 is a diagram showing an example of the functional configuration of the signal processing section 500a according to this embodiment.
  • The signal processing unit 500a mainly has signal acquisition units 502 and 532, a frame memory (storage unit) 504, an image processing unit 506, an image adjustment unit 508, an output unit 510, a control unit 520, an adjustment unit 522, an input unit 524, a motion detection unit (estimation unit) 534, and a motion compensation unit (correction unit) 536.
  • Each functional unit included in the signal processing unit 500a according to the present embodiment is described below; descriptions of the functional units common to the first embodiment are omitted.
  • The image processing unit 506 adds the signals (sensing data) relating to the current frame output from the EVS 200 to the image data of the preceding frame that has undergone motion compensation in the motion compensation unit 536 described later, thereby generating a new image of the current frame. The image processing unit 506 then outputs the newly generated image data of the current frame to the image adjustment unit 508 described later.
  • The signal acquisition unit 532 can acquire the pixel signals from the RGB sensor 600 and output them to the motion detection unit 534 described later.
  • The motion detection unit 534 estimates the relative motion between the RGB sensor 600 and the subject 700 by comparing image data (a visible light image; second image) based on pixel signals output from the RGB sensor 600 before the current frame with image data based on the pixel signals of the current frame. Based on the known positional relationship between the RGB sensor 600 and the EVS 200, the motion detection unit 534 also estimates the relative motion between the EVS 200 and the subject 700. For example, the motion detection unit 534 calculates a motion vector between the frames corresponding to the plurality of visible light images for each predetermined unit of data (for example, in units of pixels).
  • Motion estimation methods include, for example, the block matching method and the gradient method. The motion detection unit 534 may also estimate the motion of the subject 700 between visible light images using each of a plurality of motion estimation methods and combine the results to improve estimation accuracy. The motion detection unit 534 then outputs the estimated motion of the subject 700 in the image to the motion compensation unit 536 described later.
  • The motion compensation unit 536 reads the fluorescence image data of the frame before the current frame from the frame memory 504 and, based on the motion estimation result output from the motion detection unit 534, corrects the read image data by removing or shifting luminance changes caused by the motion (motion compensation). The motion compensation unit 536 then outputs the corrected image data to the image processing unit 506. A sketch of this processing follows.
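  • For the simplest case of a global translation, the motion detection and compensation described above can be sketched as follows: block matching (sum of absolute differences over candidate shifts) estimates the displacement between two visible-light frames, and the accumulated fluorescence image is shifted accordingly. Real implementations estimate per-block motion vectors; the search range and the wrap-around np.roll shift are simplifying assumptions.

      import numpy as np

      def estimate_translation(prev_gray, curr_gray, search=8):
          # Find the (dy, dx) that best maps the previous grayscale frame onto
          # the current one, by minimizing the sum of absolute differences.
          best, best_sad = (0, 0), np.inf
          for dy in range(-search, search + 1):
              for dx in range(-search, search + 1):
                  shifted = np.roll(np.roll(prev_gray, dy, axis=0), dx, axis=1)
                  sad = np.abs(shifted.astype(float) - curr_gray).mean()
                  if sad < best_sad:
                      best, best_sad = (dy, dx), sad
          return best

      def motion_compensate(fluor_image, dy, dx):
          # Shift the accumulated fluorescence image by the same (dy, dx) so
          # that past events line up with the subject's current position;
          # luminance changes caused purely by motion are thereby cancelled
          # rather than re-accumulated.
          return np.roll(np.roll(fluor_image, dy, axis=0), dx, axis=1)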
  • The signal processing unit 500a is not limited to the functional units shown in FIG. 14, and may further include other functional units.
  • The signal processing unit 500a is not limited to being an integrated device as shown in FIG. 14, and may consist of a plurality of separate devices.
  • FIG. 15 is a flowchart of an observation method according to this embodiment
  • FIG. 16 is an explanatory diagram for explaining an example of an observation image.
  • The observation method according to this embodiment can mainly include a plurality of steps, from step S301 to step S309, the details of which are described below.
  • First, the signal processing unit 500a reads the image data (fluorescence image) of the previous frame stored in the frame memory 504 (the image data most recently stored in the frame memory 504) (step S301).
  • Next, the signal processing unit 500a detects the motion of the subject 700 based on the pixel signals output from the RGB sensor 600 (step S302), and performs motion compensation on the read image data based on the detected motion (step S303).
  • The signal processing unit 500a then adds the signals output from the EVS 200 to the motion-compensated image data (fluorescence image) (step S304).
  • The signal processing unit 500a then executes steps S305 to S309; since these correspond to steps S103 to S107 of the first embodiment, a detailed description is omitted here.
  • In this way, not only the EVS 200 but also the RGB sensor 600, which can capture a visible light image, are used together: the motion of the subject 700 and the like is detected from the visible light image, and, based on the detected motion, motion compensation can be performed to remove the component of the fluorescence change caused by that motion.
  • Motion compensation is not limited to being performed based on the visible light image from the RGB sensor 600.
  • The visible light image (second image) and the fluorescence image (first image) may also be added to generate image data in which the foreground portion is the fluorescence image and the background portion is the visible light image, as in the sketch below.
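  • A minimal sketch of this superposition, assuming both images are already aligned and normalized to [0, 1]: the fluorescence image is rendered as a colored foreground (green here, purely by way of assumption) over the visible-light background, with stronger fluorescence appearing more opaque.

      import numpy as np

      def overlay(visible_rgb, fluor, alpha=0.7, color=(0.0, 1.0, 0.0)):
          # visible_rgb: (H, W, 3) visible light image; fluor: (H, W)
          # fluorescence image; both normalized to [0, 1].
          w = alpha * fluor[..., None]   # opacity follows fluorescence strength
          return np.clip((1.0 - w) * visible_rgb + w * np.asarray(color), 0.0, 1.0)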
  • The above-described embodiments of the present disclosure can be used, for example, to evaluate tissues with blood flow in vivo in anastomosis of blood vessels and of the large intestine, gastric tube preparation, esophageal reconstruction, cholecystectomy, and the like, and to evaluate the state of blood-flowing tissues and blood flow turbulence in cerebral artery clipping and bypass surgery.
  • The embodiments of the present disclosure can also be used to identify target tissues in vivo, such as in the liver, and to identify lymph nodes and lymph vessels in addition to blood flow.
  • As the fluorescent agent flows onward, the lymph nodes light up in sequence, so the lymph node network can be identified from the order of light emission. Since it takes several minutes for the lymph nodes to light up sequentially, it is not necessary to detect the fluorescence with high temporal resolution, and the threshold compared with the amount of luminance change to detect the presence or absence of an event need not be set to a particularly low value.
  • FIG. 17 is a diagram showing an example of temporal changes in brightness of each site in cerebral artery bypass surgery
  • FIG. 18 is a diagram showing an example of the result drawn based on FIG.
  • In cerebral artery bypass surgery, the elapsed time until each site emits fluorescence at its maximum brightness serves as a blood flow evaluation index. The elapsed time to maximum brightness of each site shown in FIG. 17 can then be rendered as an image using, for example, hue and brightness; an example of such an image is shown in FIG. 18, and a sketch of such rendering follows.
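  • The kind of drawing shown in FIG. 18 can be approximated by the sketch below: for each pixel, find the time at which the fluorescence luminance peaks, then encode that time-to-peak as hue and the peak brightness as value in HSV space. The specific color mapping is an assumption; the publication only states that hue and brightness can be used.

      import numpy as np
      import colorsys

      def time_to_peak_image(series):
          # series: (T, H, W) array of fluorescence luminance over time.
          # Returns an (H, W, 3) RGB image: hue encodes the elapsed time until
          # maximum brightness (early = red, late = blue), value encodes that
          # maximum brightness.
          peak_t = series.argmax(axis=0).astype(float)
          peak_v = series.max(axis=0)
          hue = 0.66 * peak_t / max(series.shape[0] - 1, 1)
          val = peak_v / (peak_v.max() + 1e-9)
          rgb = np.empty(series.shape[1:] + (3,))
          for y in range(rgb.shape[0]):          # per-pixel HSV -> RGB (slow but clear)
              for x in range(rgb.shape[1]):
                  rgb[y, x] = colorsys.hsv_to_rgb(hue[y, x], 1.0, val[y, x])
          return rgb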
  • The subject 700 is not limited to living tissue, and may be, for example, a fine mechanical structure; it is not particularly limited.
  • The above-described embodiments of the present disclosure are not limited to applications such as medical care or research, and can be applied to any observation device that performs high-precision analysis using images; the medical observation system 10 described above can therefore also be used as a general observation system (observation device).
  • The medical observation system 10 described above can be used as a rigid endoscope, a flexible endoscope, a microscope, and the like.
  • FIG. 19 is a functional block diagram showing one configuration example of the hardware configuration of the information processing device that configures the medical observation system according to one embodiment of the present disclosure.
  • An information processing device 900 that configures the medical observation system mainly includes a CPU 901, a ROM 902, and a RAM 903.
  • The information processing device 900 further includes a host bus 907, a bridge 909, an external bus 911, an interface 913, an input device 915, an output device 917, a storage device 919, a drive 921, a connection port 923, and a communication device 925.
  • The CPU 901 functions as an arithmetic processing device and a control device, and controls all or part of the operations in the information processing device 900 according to various programs recorded in the ROM 902, the RAM 903, the storage device 919, or a removable recording medium 927.
  • The ROM 902 stores programs, calculation parameters, and the like used by the CPU 901.
  • The RAM 903 temporarily stores programs used by the CPU 901, parameters that change as appropriate during their execution, and the like. These are interconnected by a host bus 907 comprising an internal bus such as a CPU bus. Each component of the signal processing unit 500 shown in FIG. 4 can be implemented by the CPU 901.
  • The host bus 907 is connected via a bridge 909 to an external bus 911 such as a PCI (Peripheral Component Interconnect/Interface) bus.
  • An input device 915, an output device 917, a storage device 919, a drive 921, a connection port 923, and a communication device 925 are connected to the external bus 911 via an interface 913.
  • The input device 915 is an operation means operated by the user, such as a mouse, keyboard, touch panel, buttons, switches, levers, or pedals. The input device 915 may also be, for example, a remote control means (a so-called remote controller) using infrared rays or other radio waves, or an externally connected device 929 such as a mobile phone or PDA that supports operation of the information processing device 900. The input device 915 further includes, for example, an input control circuit that generates an input signal based on the information input by the user using the above operation means and outputs it to the CPU 901. By operating the input device 915, a user of the information processing device 900 can input various data and give instructions for processing operations.
  • The output device 917 is a device capable of visually or audibly notifying the user of acquired information. Such devices include display devices such as CRT display devices, liquid crystal display devices, plasma display devices, EL display devices, and lamps; audio output devices such as speakers and headphones; and printer devices.
  • The output device 917 outputs, for example, the results obtained by various processes performed by the information processing device 900. Specifically, a display device displays these results as text or images, while an audio output device converts an audio signal consisting of reproduced audio data, acoustic data, or the like into an analog signal and outputs it. The output unit 510 shown in FIG. 4 can be implemented by the output device 917.
  • the storage device 919 is a data storage device configured as an example of the storage unit of the information processing device 900 .
  • the storage device 919 is composed of, for example, a magnetic storage device such as a HDD (Hard Disk Drive), a semiconductor storage device, an optical storage device, a magneto-optical storage device, or the like.
  • the storage device 919 stores programs executed by the CPU 901 and various data.
  • the frame memory 504 shown in FIG. 4 can be implemented by either the storage device 919 or the RAM 903, or a combination thereof.
  • the drive 921 is a recording medium reader/writer, and is built in or externally attached to the information processing apparatus 900 .
  • the drive 921 reads information recorded on a removable recording medium 927 such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, and outputs the information to the RAM 903 .
  • The drive 921 can also write to a mounted removable recording medium 927 such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory.
  • the removable recording medium 927 is, for example, DVD media, HD-DVD media, Blu-ray (registered trademark) media, or the like.
  • the removable recording medium 927 may be a compact flash (registered trademark) (CF: CompactFlash (registered trademark)), a flash memory, an SD memory card (Secure Digital memory card), or the like. Also, the removable recording medium 927 may be, for example, an IC card (Integrated Circuit card) equipped with a contactless IC chip, an electronic device, or the like.
  • The connection port 923 is a port for connecting a device directly to the information processing device 900.
  • Examples of the connection port 923 include a USB (Universal Serial Bus) port, IEEE1394 port, SCSI (Small Computer System Interface) port, and the like.
  • Other examples of the connection port 923 include an RS-232C port, an optical audio terminal, an HDMI (registered trademark) (High-Definition Multimedia Interface) port, and the like.
  • the communication device 925 is, for example, a communication interface configured with a communication device or the like for connecting to a communication network (network) 931 .
  • the communication device 925 is, for example, a communication card for wired or wireless LAN (Local Area Network), Bluetooth (registered trademark), or WUSB (Wireless USB).
  • the communication device 925 may be a router for optical communication, a router for ADSL (Asymmetric Digital Subscriber Line), a modem for various types of communication, or the like.
  • the communication device 925 can transmit and receive signals and the like to and from the Internet and other communication devices according to a predetermined protocol such as TCP/IP.
  • The communication network 931 connected to the communication device 925 is a network connected by wire or wirelessly, and may be, for example, the Internet, a home LAN, infrared communication, radio wave communication, or satellite communication.
  • An example of the hardware configuration capable of realizing the functions of the information processing device 900 constituting the signal processing unit 500 of the medical observation system 10 according to the embodiment of the present disclosure has been shown above with reference to FIG. 19.
  • Each component described above may be configured using general-purpose members, or may be configured by hardware specialized for the function of each component. Therefore, it is possible to appropriately change the hardware configuration to be used according to the technical level at which the present embodiment is implemented.
  • In addition to the components shown in FIG. 19, various components corresponding to the information processing device 900 constituting the signal processing unit 500 of the medical observation system 10 are of course provided.
  • It is possible to create a computer program for realizing the functions of the information processing device 900 described above, and a computer-readable recording medium storing such a computer program can also be provided. Recording media include, for example, magnetic disks, optical disks, magneto-optical disks, and flash memories. The computer program may also be distributed, for example, via a network without using a recording medium. The number of computers that execute the computer program is not particularly limited; for example, the program may be executed by a plurality of computers (for example, a plurality of servers) in cooperation with each other.
  • a medical imaging system is a medical system using imaging technology, such as an endoscope system or a microscope system.
  • FIG. 20 is a diagram showing an example of a schematic configuration of an endoscope system 5000 to which technology according to the present disclosure can be applied.
  • FIG. 21 is a diagram showing an example of the configuration of an endoscope 5001 and a CCU (Camera Control Unit) 5039.
  • FIG. 20 illustrates a state in which an operator (for example, a doctor) 5067 who is a surgical participant is performing surgery on a patient 5071 on a patient bed 5069 using an endoscope system 5000 .
  • The endoscope system 5000 includes an endoscope 5001 as a medical imaging device, a CCU 5039, a light source device 5043, a recording device 5053, an output device 5055, and a support device 5027 that supports the endoscope 5001.
  • In endoscopic surgery, an insertion aid called a trocar 5025 punctures the body wall of the patient 5071, and the scope 5003 connected to the endoscope 5001 and the surgical instrument 5021 are inserted into the body of the patient 5071 via the trocar 5025.
  • the surgical instrument 5021 is, for example, an energy device such as an electric scalpel, forceps, or the like.
  • A surgical image, which is a medical image of the inside of the body of the patient 5071 captured by the endoscope 5001, is displayed on the display device 5041.
  • the operator 5067 uses the surgical instrument 5021 to treat the surgical target while viewing the surgical image displayed on the display device 5041 .
  • the medical images are not limited to surgical images, and may be diagnostic images captured during diagnosis.
  • the endoscope 5001 is an imaging unit for imaging the inside of the body of a patient 5071.
  • The camera 5005 of the endoscope 5001 includes a zoom optical system 50052 that enables optical zoom, a focus optical system 50053 that enables focus adjustment by changing the focal length of the imaging unit, and a light receiving element 50054.
  • the endoscope 5001 converges light on the light receiving element 50054 through the connected scope 5003 to generate pixel signals, and outputs the pixel signals to the CCU 5039 through the transmission system.
  • the scope 5003 is an insertion portion that has an objective lens at its tip and guides light from the connected light source device 5043 into the body of the patient 5071 .
  • The scope 5003 is, for example, a rigid scope in the case of a rigid endoscope and a flexible scope in the case of a flexible endoscope.
  • The scope 5003 may be a forward-viewing scope or an oblique-viewing scope.
  • the pixel signal may be a signal based on a signal output from a pixel, such as a RAW signal or an image signal.
  • a memory may be installed in the transmission system connecting the endoscope 5001 and the CCU 5039, and the parameters relating to the endoscope 5001 and the CCU 5039 may be stored in the memory.
  • the memory may be arranged, for example, on the connection part of the transmission system or on the cable.
  • the parameters of the endoscope 5001 at the time of shipment and the parameters changed when the power is supplied may be stored in the memory of the transmission system, and the operation of the endoscope may be changed based on the parameters read from the memory.
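  • As a concrete illustration only, the following minimal Python sketch shows how shipment-time parameters and power-on overrides stored in the transmission-system memory could be merged before configuring the endoscope. The memory layout, the `ScopeParams` fields, and all function names are hypothetical, not taken from this disclosure.

```python
from dataclasses import dataclass

@dataclass
class ScopeParams:
    """Hypothetical parameters held in the transmission-system memory."""
    white_balance_gain: float  # written at shipment
    gamma: float               # may be rewritten when power is supplied

def load_params(memory: dict) -> ScopeParams:
    """Merge power-on overrides over shipment defaults read from memory."""
    shipped = memory["shipment"]
    powered = memory.get("power_on", {})
    return ScopeParams(
        white_balance_gain=powered.get("white_balance_gain",
                                       shipped["white_balance_gain"]),
        gamma=powered.get("gamma", shipped["gamma"]),
    )

# Example: gamma was rewritten when power was supplied.
memory = {"shipment": {"white_balance_gain": 1.0, "gamma": 2.2},
          "power_on": {"gamma": 2.4}}
print(load_params(memory))  # ScopeParams(white_balance_gain=1.0, gamma=2.4)
```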
  • an endoscope and a transmission system may be collectively referred to as an endoscope.
  • the light receiving element 50054 is a sensor that converts received light into pixel signals, and is, for example, a CMOS (Complementary Metal Oxide Semiconductor) type imaging element.
  • the light-receiving element 50054 is preferably an imaging element having a Bayer array and capable of color imaging.
  • The light receiving element 50054 is preferably an image sensor having a number of pixels corresponding to a resolution of, for example, 4K (3840 horizontal pixels × 2160 vertical pixels), 8K (7680 horizontal pixels × 4320 vertical pixels), or square 4K (3840 or more horizontal pixels × 3840 or more vertical pixels).
  • the light receiving element 50054 may be a single sensor chip or a plurality of sensor chips.
  • a prism may be provided to separate the incident light into predetermined wavelength bands, and each wavelength band may be imaged by a different light-receiving element.
  • a plurality of light receiving elements may be provided for stereoscopic viewing.
  • the light receiving element 50054 may be a sensor including an arithmetic processing circuit for image processing in a chip structure, or may be a ToF (Time of Flight) sensor.
  • The transmission system is, for example, an optical fiber cable or wireless transmission. In the case of wireless transmission, it suffices that the pixel signals generated by the endoscope 5001 can be transmitted, and the endoscope 5001 and the CCU 5039 may be connected wirelessly.
  • the endoscope 5001 may transmit not only the pixel signal but also information related to the pixel signal (for example, processing priority of the pixel signal, synchronization signal, etc.) at the same time.
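  • As a rough sketch of such a pixel signal and its associated information (processing priority, synchronization signal), the hypothetical container below is illustrative only; none of the field names come from this disclosure.

```python
from dataclasses import dataclass
from typing import Literal

@dataclass
class PixelSignal:
    """Hypothetical unit of data sent from the endoscope 5001 to the CCU 5039."""
    kind: Literal["RAW", "IMAGE"]  # RAW sensor output or a developed image signal
    width: int
    height: int
    payload: bytes                 # packed pixel values
    priority: int = 0              # processing priority sent with the signal
    sync: int = 0                  # synchronization counter sent with the signal

sig = PixelSignal("RAW", 3840, 2160, b"\x00" * 16, priority=1, sync=42)
print(sig.kind, sig.width, sig.height, sig.priority, sig.sync)
```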
  • the endoscope may be configured by integrating a scope and a camera, or by providing a light-receiving element at the tip of the scope.
  • The CCU 5039 is a control device that comprehensively controls the connected endoscope 5001 and light source device 5043 and is, for example, an information processing device as shown in FIG. 21. The CCU 5039 may also centrally control the connected display device 5041, recording device 5053, and output device 5055. For example, the CCU 5039 controls the irradiation timing and irradiation intensity of the light source device 5043 and the type of illumination light source.
  • The CCU 5039 performs image processing, such as development processing (for example, demosaicing) and correction processing, on the pixel signals output from the endoscope 5001, and outputs the processed pixel signals (for example, an image) to an external device such as the display device 5041. The CCU 5039 also transmits a control signal to the endoscope 5001 to control driving of the endoscope 5001.
  • the control signal is, for example, information about imaging conditions such as magnification and focal length of the imaging unit.
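  • To make the development step concrete, here is a minimal sketch assuming a Bayer (RGGB) RAW input; it uses crude 2×2 block averaging as a stand-in for a production demosaicing algorithm, so it only shows the shape of the processing (RAW in, RGB out).

```python
import numpy as np

def demosaic_rggb(raw: np.ndarray) -> np.ndarray:
    """Average each 2x2 RGGB cell into one RGB pixel (illustrative only)."""
    r = raw[0::2, 0::2]
    g = (raw[0::2, 1::2] + raw[1::2, 0::2]) / 2.0  # two green samples per cell
    b = raw[1::2, 1::2]
    return np.stack([r, g, b], axis=-1)

# A RAW frame twice the 4K size in each direction yields a 4K RGB frame.
raw = np.random.randint(0, 4096, (2160 * 2, 3840 * 2)).astype(np.float64)
print(demosaic_rggb(raw).shape)  # (2160, 3840, 3)
```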
  • The CCU 5039 may have an image down-conversion function and may be configured to output a high-resolution (e.g., 4K) image to the display device 5041 and a low-resolution (e.g., HD) image to the recording device 5053 at the same time.
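  • The simultaneous dual output could look like the following sketch, which block-averages a 4K frame down to full HD for the recorder while passing the full-resolution frame to the display; the factor-of-2 downscale and the function names are illustrative assumptions.

```python
import numpy as np

def downconvert(frame: np.ndarray, factor: int = 2) -> np.ndarray:
    """Block-mean downscale; factor 2 turns 4K (2160x3840) into full HD."""
    h, w, c = frame.shape
    return frame[:h - h % factor, :w - w % factor].reshape(
        h // factor, factor, w // factor, factor, c).mean(axis=(1, 3))

frame_4k = np.zeros((2160, 3840, 3))
display_out = frame_4k                    # high-resolution path to the monitor
recorder_out = downconvert(frame_4k, 2)   # low-resolution path to the recorder
print(display_out.shape, recorder_out.shape)  # (2160, 3840, 3) (1080, 1920, 3)
```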
  • The CCU 5039 is connected to external devices (e.g., a recording device, a display device, an output device, and a support device) via an IP converter that converts signals into a predetermined communication protocol (e.g., IP (Internet Protocol)).
  • the connection between the IP converter and the external device may be configured by a wired network, or part or all of the network may be configured by a wireless network.
  • The IP converter on the CCU 5039 side may have a wireless communication function and may transmit the received video to an IP switcher or to the IP converter on the output side via a wireless communication network such as the fifth-generation mobile communication system (5G) or the sixth-generation mobile communication system (6G).
  • the light source device 5043 is a device capable of emitting light in a predetermined wavelength band, and includes, for example, a plurality of light sources and a light source optical system that guides light from the plurality of light sources.
  • the light source is, for example, a xenon lamp, an LED light source, or an LD light source.
  • the light source device 5043 has, for example, LED light sources corresponding to the three primary colors R, G, and B, and emits white light by controlling the output intensity and output timing of each light source. Further, the light source device 5043 may have a light source capable of irradiating special light used for special light observation separately from the light source for irradiating normal light used for normal light observation.
  • Special light is light in a predetermined wavelength band different from normal light that is light for normal light observation.
  • Normal light is, for example, white light or green light.
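  • As an illustration of synthesizing white light from the R, G, and B sources described above, the sketch below equalizes the combined output by scaling each drive level by an assumed source efficiency; the efficiency numbers are invented placeholders, and output-timing control is omitted.

```python
# Assumed relative efficiencies of the R, G, and B LED sources (placeholders).
LED_EFFICIENCY = {"R": 0.8, "G": 1.0, "B": 0.6}

def drive_levels(target=(1.0, 1.0, 1.0)):
    """Per-source drive levels that balance the combined output to white."""
    levels = {ch: t / LED_EFFICIENCY[ch] for ch, t in zip("RGB", target)}
    peak = max(levels.values())
    return {ch: lv / peak for ch, lv in levels.items()}  # normalize to <= 1.0

print(drive_levels())  # {'R': 0.75, 'G': 0.6, 'B': 1.0}
```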
  • In narrow-band light observation, which is a type of special light observation, blue light and green light are irradiated alternately, and the wavelength dependence of light absorption in body tissue is exploited to image specific tissue, such as blood vessels on the mucosal surface, with high contrast.
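  • A hypothetical illumination schedule for this alternation might look like the sketch below. The 415 nm and 540 nm values are the commonly cited narrow bands for this technique, not values specified in this disclosure, and both callback hooks are assumptions.

```python
from itertools import cycle

NBI_BANDS_NM = cycle([415, 540])  # blue narrow band, then green narrow band

def capture_nbi_pair(set_wavelength, grab_frame):
    """Capture one blue frame and one green frame via injected callbacks."""
    frames = {}
    for _ in range(2):
        band = next(NBI_BANDS_NM)
        set_wavelength(band)  # assumed light-source control hook
        frames[band] = grab_frame()
    return frames

pair = capture_nbi_pair(lambda nm: None, lambda: "frame")
print(sorted(pair))  # [415, 540]
```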
  • In fluorescence observation, which is a type of special light observation, excitation light that excites a drug injected into the body tissue is irradiated, and the fluorescence emitted by the body tissue or by the drug serving as a marker is received to obtain a fluorescence image.
  • For example, a drug such as indocyanine green (ICG) injected into the body tissue is irradiated with infrared light in its excitation wavelength band, and the fluorescence of the drug is received, so that the structure of the body tissue and the affected area can be visualized easily. Fluorescence from an agent (for example, 5-ALA) administered to the body may also be observed.
  • the light source device 5043 sets the type of irradiation light under the control of the CCU 5039 .
  • the CCU 5039 may have a mode in which normal light observation and special light observation are alternately performed by controlling the light source device 5043 and the endoscope 5001 .
  • information based on pixel signals obtained by special light observation is preferably superimposed on pixel signals obtained by normal light observation.
  • The special light observation may be infrared light observation, in which infrared light is irradiated to look deeper than the surface of an organ, or multispectral observation utilizing hyperspectral spectroscopy. Photodynamic therapy may also be used in combination.
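  • One plausible form of the superimposition preferred above is an alpha blend of a normalized fluorescence map into one channel of the normal-light image, as sketched below; mapping fluorescence to green mimics common ICG displays, but the channel choice and the alpha value are illustrative assumptions.

```python
import numpy as np

def overlay_fluorescence(normal_rgb, fluor, alpha=0.5):
    """Blend a normalized fluorescence map into the green channel."""
    out = normal_rgb.astype(np.float64).copy()
    mask = fluor / max(float(fluor.max()), 1e-9)  # normalize to [0, 1]
    out[..., 1] = (1 - alpha) * out[..., 1] + alpha * 255.0 * mask
    return out.clip(0, 255).astype(np.uint8)

normal = np.full((4, 4, 3), 128, np.uint8)       # gray normal-light image
fluor = np.zeros((4, 4)); fluor[1:3, 1:3] = 1.0  # fluorescing region
print(overlay_fluorescence(normal, fluor)[2, 2])  # [128 191 128]: greener pixel
```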
  • a recording device 5053 is a device for recording pixel signals (for example, an image) obtained from the CCU 5039, and is, for example, a recorder.
  • The recording device 5053 records the image acquired from the CCU 5039 on an HDD, an SSD, or an optical disk.
  • the recording device 5053 may be connected to a hospital network and accessible from equipment outside the operating room. Also, the recording device 5053 may have an image down-conversion function or an image up-conversion function.
  • the display device 5041 is a device capable of displaying an image, such as a display monitor.
  • a display device 5041 displays a display image based on pixel signals obtained from the CCU 5039 .
  • the display device 5041 may function as an input device that enables line-of-sight recognition, voice recognition, and gesture-based instruction input by being equipped with a camera and a microphone.
  • the output device 5055 is a device for outputting information acquired from the CCU 5039, such as a printer.
  • the output device 5055 prints on paper a print image based on the pixel signals acquired from the CCU 5039, for example.
  • the support device 5027 is an articulated arm including a base portion 5029 having an arm control device 5045 , an arm portion 5031 extending from the base portion 5029 , and a holding portion 5032 attached to the tip of the arm portion 5031 .
  • the arm control device 5045 is configured by a processor such as a CPU, and operates according to a predetermined program to control driving of the arm section 5031 .
  • the support device 5027 controls parameters such as the length of each link 5035 constituting the arm portion 5031 and the rotation angle and torque of each joint 5033 by means of the arm control device 5045 .
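  • To illustrate how link lengths and joint angles determine the pose of the holding portion, here is a forward-kinematics sketch for a planar two-link arm; the planar simplification, the link lengths, and the omission of joint torque are all assumptions made for illustration, since the actual geometry of the arm portion 5031 is not specified here.

```python
import math

def planar_tip_position(link_lengths, joint_angles):
    """Tip position of a planar serial arm from cumulative joint angles."""
    x = y = theta = 0.0
    for length, angle in zip(link_lengths, joint_angles):
        theta += angle                 # joint rotation accumulates
        x += length * math.cos(theta)  # advance along the rotated link
        y += length * math.sin(theta)
    return x, y

# Two links of 0.30 m and 0.25 m with joint angles of 45 and -22.5 degrees.
print(planar_tip_position([0.30, 0.25], [math.pi / 4, -math.pi / 8]))
```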
  • the support device 5027 functions as an endoscope support arm that supports the endoscope 5001 during surgery. Thereby, the support device 5027 can take the place of the scopist who is an assistant holding the endoscope 5001 .
  • the support device 5027 may be a device that supports a microscope device 5301, which will be described later, and can also be called a medical support arm.
  • the control of the support device 5027 may be an autonomous control method by the arm control device 5045, or may be a control method in which the arm control device 5045 controls based on the user's input.
  • The control of the support device 5027 may be a master-slave method in which the support device 5027, as a slave device (replica device) serving as a patient cart, is controlled based on the movement of a master device (primary device) serving as an operator console at the user's hand. The control of the support device 5027 may also be performed remotely from outside the operating room.
  • FIG. 22 is a diagram illustrating an example of a schematic configuration of a microsurgery system to which technology according to the present disclosure can be applied.
  • the same reference numerals are given to the same configurations as those of the endoscope system 5000, and duplicate descriptions thereof will be omitted.
  • FIG. 22 schematically shows an operator 5067 performing an operation on a patient 5071 on a patient bed 5069 using a microsurgery system 5300 .
  • FIG. 22 omits illustration of the cart 5037 in the configuration of the microsurgery system 5300, and also shows a simplified microscope device 5301 instead of the endoscope 5001.
  • the microscope device 5301 in this description may refer to the microscope section 5303 provided at the tip of the link 5035 or may refer to the entire configuration including the microscope section 5303 and the support device 5027 .
  • an image of a surgical site captured by a microscope device 5301 is enlarged and displayed on a display device 5041 installed in the operating room.
  • The display device 5041 is installed at a position facing the operator 5067, and the operator 5067 performs various treatments on the affected area, such as resection, while observing the state of the surgical site through the image displayed on the display device 5041. Microsurgery systems are used, for example, in ophthalmic surgery and brain surgery.
  • the support device 5027 can support other observation devices or other surgical tools instead of the endoscope 5001 or the microscope section 5303 at its distal end.
  • As the other surgical tools, for example, forceps, tweezers, a pneumoperitoneum tube for insufflation, or an energy treatment instrument that incises tissue or seals blood vessels by cauterization can be applied.
  • the technology according to the present disclosure may be applied to a support device that supports components other than such a microscope section.
  • The technology according to the present disclosure can be suitably applied to the CCU 5039, the endoscope 5001, or the microscope unit 5303 among the configurations described above. Specifically, by applying the above-described EVS 200 to the CCU 5039, the endoscope 5001, or the microscope unit 5303, minute changes in observation light can be captured without being affected by the absolute value of the luminance of the observation light.
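  • The luminance-independence claimed above can be sketched as follows: an event vision sensor thresholds changes in log luminance, so the same relative change fires the same events at low and high absolute brightness, and the image processing unit can accumulate event batches output at different timings into a first image. The 15 % threshold and the frame-difference model below are illustrative stand-ins for the asynchronous pixel circuit.

```python
import numpy as np

def events_from_frames(prev, curr, threshold=0.15):
    """Emit +1/-1 per pixel where |log(curr) - log(prev)| exceeds threshold."""
    dlog = np.log(curr + 1e-6) - np.log(prev + 1e-6)
    return np.sign(dlog) * (np.abs(dlog) > threshold)

def accumulate(event_batches):
    """Sum event batches output at different timings into one image."""
    return np.sum(event_batches, axis=0)

# A 20 % relative change produces identical events at luminance 10 and 1000.
lo = events_from_frames(np.full((2, 2), 10.0), np.full((2, 2), 12.0))
hi = events_from_frames(np.full((2, 2), 1000.0), np.full((2, 2), 1200.0))
print(np.array_equal(lo, hi), accumulate([lo, hi]).tolist())
```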
  • The above-described embodiment of the present disclosure may include, for example, an observation method executed by the medical observation system 10 as described above, a program for operating the medical observation system 10, and a non-transitory tangible medium on which the program is recorded. The program may also be distributed via a communication line (including wireless communication) such as the Internet.
  • each step in the processing method of the embodiment of the present disclosure described above does not necessarily have to be processed in the described order.
  • each step may be processed in an appropriately changed order.
  • each step may be partially processed in parallel or individually instead of being processed in chronological order.
  • The processing of each step need not be performed in accordance with the described method; it may be performed, for example, by another method by another functional unit.
  • each component of each device illustrated is functionally conceptual and does not necessarily need to be physically configured as illustrated.
  • The specific form of distribution and integration of each device is not limited to that shown in the figures; all or part of each device can be functionally or physically distributed or integrated in arbitrary units according to various loads and usage conditions.
  • the present technology can also take the following configuration.
  • (1) A biological tissue observation system comprising: an event vision sensor that detects, as an event, a change in the luminance value of light in a first wavelength band emitted from living tissue; and an image processing unit configured to generate a first image based on a plurality of pieces of sensing data output from the event vision sensor at different timings.
  • The biological tissue observation system according to (6) above, wherein the image processing unit sequentially adds, to the stored sensing data, the plurality of sensing data output after the output of that sensing data.
  • A biological tissue observation system. (9) The biological tissue observation system according to any one of (3) to (8) above, wherein the event vision sensor has: a pixel array section having a plurality of pixels arranged in a matrix; and an event detection unit that detects, in each pixel, that the amount of luminance change exceeds a predetermined threshold.
  • the adjustment unit adjusts the predetermined threshold based on the frequency of output of the sensing data.
  • the adjustment unit adjusts the predetermined threshold value based on a dynamic range or gradation of luminance of the first image.
  • the adjustment unit adjusts the predetermined threshold based on the number of administrations of the drug.
  • the adjustment unit reduces the predetermined threshold value as the number of administrations of the drug increases.
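  • As a rough illustration of the threshold-adjustment rules in the configurations above, the sketch below lowers the event-detection threshold with each additional administration of the drug and clamps it at a floor; the decay factor and floor value are invented for illustration.

```python
def adjusted_threshold(base, administrations, decay=0.8, floor=0.02):
    """Reduce the event threshold as the number of drug administrations grows."""
    return max(floor, base * decay ** administrations)

for n in range(4):
    print(n, round(adjusted_threshold(0.15, n), 4))
# 0 0.15 / 1 0.12 / 2 0.096 / 3 0.0768
```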
  • The biological tissue observation system according to any one of (1) to (27) above, which is any one of an endoscope, an exoscope, and a microscope.
  • (29) A biological tissue observation device comprising: an event vision sensor that detects, as an event, a change in the luminance value of light in a first wavelength band emitted from living tissue; and an image processing unit configured to generate a first image based on a plurality of pieces of sensing data output from the event vision sensor at different timings.
  • (30) A biological tissue observation method comprising: detecting, by an event vision sensor, as an event, a change in the luminance value of light in the first wavelength band emitted from living tissue; and generating, by an image processing unit, a first image based on a plurality of pieces of sensing data output from the event vision sensor at different timings.

Landscapes

  • Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Optics & Photonics (AREA)
  • Surgery (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Public Health (AREA)
  • Analytical Chemistry (AREA)
  • Pathology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Veterinary Medicine (AREA)
  • Chemical & Material Sciences (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Biophysics (AREA)
  • Astronomy & Astrophysics (AREA)
  • Multimedia (AREA)
  • Investigating, Analyzing Materials By Fluorescence Or Luminescence (AREA)

Abstract

The invention relates to a biological tissue observation system comprising: an event vision sensor (200) that detects, as an event, a change in the luminance value of light in a first wavelength band emitted from biological tissue; and an image processing unit (500) that generates a first image on the basis of a plurality of pieces of sensing data output at different timings from the event vision sensor.
PCT/JP2022/013701 2021-05-14 2022-03-23 Système d'observation de tissu biologique, dispositif d'observation de tissu biologique et procédé d'observation de tissu biologique WO2022239495A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021-082557 2021-05-14
JP2021082557 2021-05-14

Publications (1)

Publication Number Publication Date
WO2022239495A1 true WO2022239495A1 (fr) 2022-11-17

Family

ID=84028182

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/013701 WO2022239495A1 (fr) 2021-05-14 2022-03-23 Système d'observation de tissu biologique, dispositif d'observation de tissu biologique et procédé d'observation de tissu biologique

Country Status (1)

Country Link
WO (1) WO2022239495A1 (fr)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019145516A1 (fr) * 2018-01-26 2019-08-01 Prophesee Procédé et appareil de traitement d'un signal provenant d'un capteur basé sur un événement
JP2020025263A (ja) * 2018-07-31 2020-02-13 ソニーセミコンダクタソリューションズ株式会社 積層型受光センサ及び電子機器

Similar Documents

Publication Publication Date Title
US11788966B2 (en) Imaging system
US11642004B2 (en) Image processing device, image processing method and recording medium
JP5127639B2 (ja) 内視鏡システム、およびその作動方法
US10904437B2 (en) Control apparatus and control method
US8996087B2 (en) Blood information measuring method and apparatus
JP7088185B2 (ja) 医療用システム、医療用装置および制御方法
JP7095693B2 (ja) 医療用観察システム
WO2022044897A1 (fr) Système d'imagerie médicale, dispositif d'imagerie médicale et procédé de fonctionnement
US20240315562A1 (en) Medical imaging systems and methods
WO2021095697A1 (fr) Appareil de traitement d'informations, procédé et programme de génération
JP2002345739A (ja) 画像表示装置
WO2020008920A1 (fr) Système d'observation médicale, dispositif d'observation médicale, et procédé d'entraînement de dispositif d'observation médicale
WO2021140923A1 (fr) Dispositif de génération d'images médicales, procédé de génération d'images médicales, et programme de génération d'images médicales
WO2022239495A1 (fr) Système d'observation de tissu biologique, dispositif d'observation de tissu biologique et procédé d'observation de tissu biologique
US11523729B2 (en) Surgical controlling device, control method, and surgical system
JP7257544B2 (ja) 情報表示システムおよび情報表示方法
WO2020009127A1 (fr) Système d'observation médicale, dispositif d'observation médicale et procédé de commande de dispositif d'observation médicale
US20220022728A1 (en) Medical system, information processing device, and information processing method
JP7480779B2 (ja) 医療用画像処理装置、医療用画像処理装置の駆動方法、医療用撮像システム、及び医療用信号取得システム
WO2022239339A1 (fr) Dispositif de traitement d'informations médicales, système d'observation médicale et procédé de traitement d'informations médicales
WO2020184228A1 (fr) Dispositif de traitement d'images médicales, procédé d'entraînement de dispositif de traitement d'images médicales et système d'observation médicale
WO2022249572A1 (fr) Dispositif de traitement d'image, procédé de traitement d'image et support d'enregistrement
US20240016364A1 (en) Surgery system, surgery control device, control method, and program
WO2022044898A1 (fr) Système d'imagerie médicale, dispositif d'imagerie médicale et procédé de fonctionnement
JP2024082826A (ja) 撮像装置および撮像装置の作動方法、並びにプログラム

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22807200

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 22807200

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP