WO2024009204A1 - Controller for a surgical microscope and method - Google Patents


Info

Publication number
WO2024009204A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
target area
controller
spectral
region
Application number
PCT/IB2023/056886
Other languages
French (fr)
Inventor
George Themelis
Original Assignee
Leica Instruments (Singapore) Pte. Ltd.
Leica Microsystems Cms Gmbh
Application filed by Leica Instruments (Singapore) Pte. Ltd., Leica Microsystems Cms Gmbh filed Critical Leica Instruments (Singapore) Pte. Ltd.
Publication of WO2024009204A1 publication Critical patent/WO2024009204A1/en

Classifications

    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B21/00 Microscopes
    • G02B21/0004 Microscopes specially adapted for specific applications
    • G02B21/0012 Surgical microscopes
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B21/00 Microscopes
    • G02B21/16 Microscopes adapted for ultraviolet illumination; Fluorescence microscopes
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/62 Systems in which the material investigated is excited whereby it emits light or causes a change in wavelength of the incident light
    • G01N21/63 Systems in which the material investigated is excited whereby it emits light or causes a change in wavelength of the incident light optically excited
    • G01N21/64 Fluorescence; Phosphorescence
    • G01N2021/6417 Spectrofluorimetric devices
    • G01N2021/6421 Measuring at two or more wavelengths

Definitions

  • the invention relates to a controller for a surgical microscope and to a surgical microscope.
  • the invention further relates to a method for obtaining an image with a surgical microscope.
  • Fluorophores are introduced into a patient's body to mark specific substances and structures, such as blood vessels or cancer cells.
  • the fluorophores are used to highlight substances and structures that would otherwise be invisible or only hardly visible to the unaided eye.
  • For example, indocyanine green (ICG) binds to plasma proteins in blood and can be used to highlight blood vessels (angiography).
  • Other fluorophores e.g. aminolevulinic acid (5-ALA), bind to cancer cells, and can thus be used to highlight the position of cancerous tissue.
  • the fluorescence imaging is performed in a single narrow spectral band, e.g. in the near infrared region when using ICG as fluorophore, and produces a monochrome image.
  • In the monochrome image it is impossible to distinguish the diagnostic fluorescence signal, i.e. fluorescence signals originating from the used fluorophore, from unwanted fluorescence signals, i.e. signals from any other source, purely based on the fluorescence intensity.
  • Augmented Reality (AR) platforms that merge a fluorescence image with a reflectance or white light image of the operating theater typically display the fluorescence image in a single color. It is therefore not possible to distinguish diagnostic fluorescence signals from unwanted fluorescence signals with these platforms.
  • the proposed controller for a surgical microscope is configured to control an excitation unit of the surgical microscope to emit excitation light for exciting fluorophores located in a target area of a patient.
  • the controller is configured to control an optical detection unit of the surgical microscope to receive light from the target area, and to separate the received light into at least two spectral channels.
  • a first spectral channel corresponds to a first wavelength band
  • a second spectral channel corresponds to a second wavelength band.
  • the controller is configured to generate at least one image comprising pixels of the target area based on the received light, and to determine a first intensity for each pixel based on the first spectral channel and a second intensity for each pixel based on the second spectral channel.
  • the controller is further configured to determine at least one first image region based on the first and second intensities, and at least one second image region based on the first and second intensities.
  • the first image region corresponds to a region of the target area exhibiting fluorescence caused by the excited fluorophores.
  • the second image region corresponds to a region of the target area exhibiting fluorescence caused by a fluorescence source other than the excited fluorophores.
  • the first and second regions may overlap.
  • the overlap of the first and second regions corresponds to a region of the target area exhibiting fluorescence caused by the excited fluorophores as well as at least one fluorescence source other than the excited fluorophores.
  • the first and second regions may be as small as a single pixel of the image of the target area.
  • the controller may be configured to determine for each pixel whether the pixel corresponds to a region of the target area exhibiting fluorescence caused by the excited fluorophores, and/or whether the pixel corresponds to a region of the target area exhibiting fluorescence caused by a fluorescence source other than the excited fluorophores.
  • the diagnostic fluorescence signal is the fluorescence caused by the excited fluorophores, e.g. the position of blood vessels indicated by the presence of ICG or the presence of cancerous tissue indicated by the presence of 5-ALA.
  • the unwanted fluorescence signal is fluorescence caused by a fluorescence source other than the excited fluorophores, e.g. autofluorescence of bone or tissue.
  • the controller is configured to determine the first and second image regions by spectral unmixing.
  • Spectral unmixing refers to all methods that allow for the separation of different fluorescence sources based on the intensities detected in the different spectral channels. Spectral unmixing can be performed quickly and reliably, so the first and second image regions can be determined rapidly. Methods for spectral unmixing include but are not limited to linear unmixing, principal component analysis, learning unsupervised means of spectra, support vector machines, neural networks, a (spectral) phasor approach, and Monte Carlo unmixing algorithms.
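  The simplest of these methods, linear unmixing, can be sketched as follows. The reference spectra below are illustrative assumptions (not values from this application), and the clipped least-squares step stands in for a full non-negative solver:

```python
import numpy as np

# Hypothetical reference spectra: relative intensity of each fluorescence
# source in the first and second wavelength bands.
# Rows: wavelength bands; columns: source 0 = diagnostic fluorophore,
# source 1 = autofluorescence.
reference = np.array([
    [1.0, 0.9],   # relative intensity in the first band
    [0.2, 0.0],   # relative intensity in the second band
])

def unmix_pixel(i1, i2):
    """Estimate the contribution of each source to a pixel from the two
    measured band intensities via least squares, clipping negatives."""
    measured = np.array([i1, i2])
    weights, *_ = np.linalg.lstsq(reference, measured, rcond=None)
    return np.clip(weights, 0.0, None)  # abundances cannot be negative

# A pixel emitting in both bands is attributed to the diagnostic fluorophore;
# a pixel with an empty second band is attributed to autofluorescence.
diag = unmix_pixel(1.0, 0.2)
auto = unmix_pixel(0.9, 0.0)
```

  In practice the reference spectra would be measured for the fluorophore in use (e.g. 5-ALA or ICG) and for the expected autofluorescence sources.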
  • the controller is configured to control an illumination unit of the surgical microscope to emit illumination light for illuminating the target area, and to generate at least one reflectance image based on the received light.
  • the reflectance image, also called white light image, shows the target area as the unaided human eye would see it.
  • the reflectance image can be used to generate a composite image comprising the reflectance image and the first and/or second image regions. Such a composite image can help the surgeon determine the position of the first and/or second image regions in the target area.
  • the controller is configured to control the optical detection unit to separate the received light into at least three spectral channels.
  • a third spectral channel corresponds to a third wavelength band.
  • the third wavelength band is complementary to the first and second wavelength bands.
  • the controller is further configured to generate the reflectance image based on at least the third spectral channel. This allows the first and second wavelength bands to be processed separately from the reflectance image. For example, the first and second spectral channels may be enhanced or reduced in intensity before they are displayed to the surgeons.
  • the third wavelength band is complementary to the first and second wavelength band, i.e. the third wavelength band and the first and second wavelength bands are non-overlapping.
  • the third wavelength band may encompass all optical wavelengths except for the first and second wavelength bands.
  • the controller is configured to determine the first and second image regions using machine learning.
  • the controller uses machine learning to distinguish the diagnostic fluorescence signal and an unwanted fluorescence signal.
  • additional information such as the reflectance image of the target area is used to determine the first and second image regions using machine learning.
  • the controller may be configured to perform image detection facilitated by machine learning in order to detect a non-biological object.
  • the controller may be configured to perform a semantic segmentation of the image of the target area using machine learning. For example, certain areas may be determined to be a non-biological object, e.g. a glove or a surgical instrument.
  • the result of the image segmentation may then be used to exclude certain regions of the image of the target area, e.g. the aforementioned non-biological object, from being considered emitters of the diagnostic fluorescence signal.
  • the use of machine learning can greatly aid the determination of the diagnostic fluorescence signal and thereby improve the reliability of the controller.
  • Machine learning techniques include but are not limited to support vector machines and neural networks. Most machine learning techniques require either supervised or unsupervised training using an appropriate training dataset. The choice of the training dataset depends on the specific task of the machine learning technique used. In the above example of detecting a non-biological object in the image of the target area, the appropriate training dataset would consist of images, in particular images captured by means of a surgical microscope, of various foreign objects inside a patient's body.
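  How a segmentation result could gate the fluorescence determination may be sketched as below. The masks are made-up placeholders for the outputs of spectral unmixing and of a trained segmentation model:

```python
import numpy as np

# Hypothetical per-pixel masks: `fluorescence_mask` marks pixels that the
# spectral analysis attributes to the diagnostic fluorophore, and
# `non_biological` is the (assumed) output of a trained segmentation model
# marking a non-biological object such as a glove or a surgical instrument.
fluorescence_mask = np.array([
    [1, 1, 0],
    [0, 1, 1],
], dtype=bool)
non_biological = np.array([
    [0, 0, 0],
    [0, 1, 1],
], dtype=bool)

# Pixels on a non-biological object cannot emit the diagnostic signal,
# so they are excluded from the first image region.
first_image_region = fluorescence_mask & ~non_biological
```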
  • the controller is configured to generate a composite image from the first image region and the image of the target area, and to control an output unit of the surgical microscope to display the image of the target area and/or the composite image.
  • the composite image aids the surgeon in determining the position of the first and/or second image regions in the target area, thereby reducing mental load during microsurgery and allowing the surgeons to perform their task more efficiently.
  • the first and second regions are preferably highlighted such that they are easily identified.
  • the controller is configured to generate the composite image from the first image region, the second image region, and the image of the target area.
  • the first image region has a first color
  • the second image region has a second color.
  • the second color is different from the first color. Displaying the different regions in different colors helps the surgeon to differentiate between the diagnostic fluorescence signal and the unwanted fluorescence signal. Displaying the second image region, i.e. the region emitting what the controller has determined to be an unwanted fluorescence signal, further allows the surgeon to verify the determination by the controller, thereby giving the surgeon all the information to make informed decisions during microsurgery.
  • the controller is configured to generate the composite image from the first image region, the second image region, and the image of the target area.
  • the first image region or the second image region is a blinking overlay over the image of the target area. Displaying the second region as a blinking overlay helps the surgeon to differentiate between the diagnostic fluorescence signal and the unwanted fluorescence signal. In particular this allows the surgeon to verify the determination by the controller during microsurgery.
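  The colored and blinking overlays described above can be sketched as a simple compositing step. The specific colors and the even-frame blinking rule are illustrative choices, not taken from this application:

```python
import numpy as np

def composite(reflectance, first_region, second_region, frame=0):
    """Overlay the two image regions on the reflectance image in distinct
    colors; the second region blinks, i.e. is only drawn on even frames."""
    out = reflectance.copy()
    first_color = np.array([0, 255, 0], dtype=np.uint8)     # e.g. green
    second_color = np.array([255, 0, 255], dtype=np.uint8)  # e.g. magenta
    out[first_region] = first_color
    if frame % 2 == 0:  # blinking overlay for the second region
        out[second_region] = second_color
    return out

# Tiny 2x2 example: one diagnostic pixel, one unwanted-fluorescence pixel.
reflectance = np.zeros((2, 2, 3), dtype=np.uint8)
first = np.array([[True, False], [False, False]])
second = np.array([[False, True], [False, False]])
```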
  • the first and second wavelength bands do not overlap.
  • the first and second wavelength bands complement each other allowing a more reliable determination of the first and second image regions.
  • the invention also relates to a surgical microscope comprising a controller as described above, an excitation unit configured to emit excitation light for exciting fluorophores located in a target area of a patient, and an optical detection unit configured to receive light from the target area, and to separate the received light into at least two spectral channels.
  • a first spectral channel corresponds to the first wavelength band
  • a second spectral channel corresponds to the second wavelength band.
  • the surgical microscope comprises the controller described above, and therefore has the same advantages as the controller.
  • the surgical microscope further comprises an illumination unit configured to emit illumination light for illuminating the target area.
  • the illumination unit can in particular be used in generating the reflectance image.
  • the surgical microscope comprises an output unit.
  • the output unit is one of a screen, an eye piece, an augmented-reality set and a virtual-reality set.
  • the output unit may be configured to display the image of the target area and/or the composite image.
  • the optical detection unit comprises at least one microscope objective that is directed at the target area, and configured to receive the light from the target area.
  • a microscope objective may provide magnification allowing the surgeon to see smaller details of the target area than would be visible to the unaided eye. In other words, magnification allows the surgeon to perform microsurgery, e.g. surgery of small blood vessels or nerves with a diameter of 1 mm or less.
  • the optical detection unit comprises at least two detector elements and beam splitting means that are configured to direct received light having a wavelength in the first wavelength band onto a first detector element, and to direct received light having a wavelength in the second wavelength band onto a second detector element.
  • the beam splitting element and the first and second detector elements are used as means for generating the first and second spectral channels. Compared to other means for generating the first and second spectral channels, using the beam splitting element and the first and second detector elements is easy to implement, cost-effective, and reliable.
  • the optical detection unit is configured to separate the received light into at least three spectral channels.
  • a third spectral channel corresponds to a third wavelength band.
  • the third wavelength band is complementary to the first and second wavelength bands.
  • the optical detection unit comprises a multispectral camera or a hyperspectral camera configured to generate the first and second spectral channels.
  • a multispectral camera is configured to capture a limited number of wavelength bands, typically around 10 or fewer. Each of these wavelength bands may be a spectral channel of the surgical microscope.
  • a hyperspectral camera is configured to capture tens or hundreds of wavelength bands per pixel. In other words, hyperspectral images have a very high spectral resolution. More spectral channels allow for a much finer differentiation of the sources of fluorescence in the image of the target area based on their emission spectrum and thereby increase the sensitivity and reliability of the surgical microscope.
  • the invention further relates to a method for obtaining an image with a surgical microscope, comprising the following steps: Exciting fluorophores in a target area of a patient. Receiving light from the target area. Separating the received light into at least two spectral channels, a first spectral channel corresponding to a first wavelength band, and a second spectral channel corresponding to a second wavelength band. Generating at least one image of the target area based on the received light, the image comprising pixels. Determining a first intensity for each pixel based on the first spectral channel, and a second intensity for each pixel based on the second spectral channel.
  • the method has the same advantages as the controller described above and can be supplemented using the features of the dependent claims directed at the controller.
  • Figure 1 is a schematic view of a surgical microscope according to an embodiment
  • Figure 2 is a flowchart of the method for obtaining an image of the target area with the surgical microscope
  • Figure 3 shows three schematic diagrams, each showing the spectrum of a different component of the detection light
  • Figure 4 is a schematic diagram of the spectra of the different component of the detection light illustrating the state of the art.
  • Figure 1 is a schematic view of a surgical microscope 100 according to an embodiment.
  • the surgical microscope 100 is adapted to provide a surgeon with a magnified image of a target area 102 inside a patient's body, e.g. during microsurgery.
  • the surgical microscope 100 is adapted to obtain a fluorescence image of the target area 102, that is, an image generated from fluorescence light emitted by fluorophores located in the target area 102.
  • An illumination unit 104 of the surgical microscope 100 is directed at the target area 102 and configured to illuminate the target area 102 using white light.
  • An excitation unit 106 of the surgical microscope 100 is configured to emit excitation light towards the target area 102 for exciting the fluorophores.
  • the excitation unit 106 may in particular comprise a source of coherent light, e.g. a white light laser, a continuous wave laser or a pulsed laser having a single emission wavelength.
  • An optical detection unit 108 of the surgical microscope 100 has at least one microscope objective 110 directed at the target area 102 and configured to receive detection light 112 from the target area 102.
  • the detection light 112 comprises multiple components.
  • the components of the detection light 112 are fluorescence light emitted by the excited fluorophores, fluorescence light emitted by sources other than the excited fluorophores, and reflectance light caused by the white light illumination.
  • the fluorescence light emitted by the excited fluorophores is a wanted or diagnostic fluorescence signal while the fluorescence light emitted by other sources is an unwanted fluorescence signal.
  • the optical detection unit 108 is configured, by way of example, to generate three spectral channels, i.e. to separately detect detection light 112 in a first, a second, and a third wavelength band.
  • the three wavelength bands are described below with reference to Figure 3 in more detail.
  • the optical detection unit 108 comprises, by way of example, an arrangement of beam splitting elements 114a, 114b, e.g. dichroic elements or an acousto-optic tunable filter (AOTF), and detector elements 116a, 116b, 116c.
  • the three spectral channels may be generated by other means, e.g. a multispectral or hyperspectral camera.
  • a first beam splitting element 114a is arranged in the beam path of the detection light 112 following the microscope objective 110.
  • the first beam splitting element 114a is configured to direct detection light 112 of the first wavelength band onto a first detector element 116a.
  • the remaining detection light 112 is directed at a second beam splitting element 114b.
  • the second beam splitting element 114b is configured to direct detection light 112 of the second wavelength band onto a second detector element 116b.
  • the then remaining detection light 112 is directed at a third detector element 116c.
  • the surgical microscope 100 further comprises a controller 118.
  • the controller 118 is connected to the illumination unit 104, the excitation unit 106, the optical detection unit 108, and an output unit 120.
  • the controller 118 is configured to control said elements of the surgical microscope 100.
  • the controller 118 is configured to perform a method for obtaining an image of the target area 102 that is described below with reference to Figure 2.
  • the image or images of the target area 102 generated by the controller 118 are displayed to the surgeon by the output unit 120 of the surgical microscope 100.
  • the output unit 120 is shown, by way of example, to be a monitor.
  • the output unit 120 shows, by way of example, a composite image of the target area 102 comprising a reflectance or white light image of the target area 102, a first image region 122 and a second image region 124.
  • the first image region 122 corresponds to a diagnostic fluorescence signal, in this example a tumor marked by 5-ALA.
  • the second image region 124 corresponds to an unwanted fluorescence signal, in this example bone autofluorescence.
  • the first and second image regions 122, 124 are distinguished by a different hatching. In an actual embodiment the first and second image regions 122, 124 may be distinguished by a different color. Alternatively, the first image region 122 or the second image region 124 may be displayed as a blinking overlay over the reflectance image.
  • Figure 2 is a flowchart of the method for obtaining an image of the target area 102 with the surgical microscope 100 described above.
  • The process is started in step S200.
  • In step S202, the controller 118 controls the excitation unit 106 to emit excitation light towards the target area 102 in order to excite the fluorophores located in the target area 102.
  • In step S204, the controller 118 controls the optical detection unit 108 to receive light from the target area 102.
  • the received light is split into at least two, in the present embodiment three spectral channels.
  • the first spectral channel is generated by the first detector element and corresponds to the first wavelength band
  • the second spectral channel is generated by the second detector element and corresponds to the second wavelength band
  • the third spectral channel is generated by the third detector element and corresponds to the third wavelength band.
  • In step S206, the controller 118 generates at least one image of the target area 102 based on the received light, i.e. the detection light, the image comprising pixels.
  • In step S208, the controller 118 determines a first intensity for each pixel based on the first spectral channel, and a second intensity for each pixel based on the second spectral channel.
  • the controller 118 generates a first image corresponding to the detection light 112 received by the first detector element, and a second image corresponding to the detection light 112 received by the second detector element.
  • the first and second image are monochromatic images corresponding to the intensity of detection light 112 received in the first and second wavelength bands, respectively.
  • each pixel of the first and second images stores information about the intensity of light received in the first and second wavelength bands, respectively.
  • the controller 118 generates a third image of the target area 102 based on the light received in the third wavelength band.
  • the third image may in particular be a color image of the target area 102. That is, the third image is a reflectance image.
  • the controller 118 may generate a single image of the target area 102.
  • each pixel of the single image comprises information about the intensity of light received in the three detection channels.
  • In step S210, the controller 118 determines the first image region 122 based on the first and second intensities.
  • In step S212, the controller 118 determines the second image region 124 based on the first and second intensities.
  • the determination of the first and second image regions 122, 124 is described in more detail below with reference to Figure 3.
  • In step S214, the controller 118 generates the image of the target area 102 based on the first and second image regions 122, 124, and controls the output unit 120 to output the image of the target area 102 to the surgeon.
  • the controller 118 may generate a composite image comprising the reflectance image of the target area 102, and the first and second image regions 122, 124.
  • the controller 118 may generate a composite image comprising the first and second image regions 122, 124 that can be displayed to the surgeon as an AR overlay.
  • In step S216, the process is ended.
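  The steps S206 to S214 above can be condensed into a toy sketch. The thresholding rule is a deliberately simplified stand-in for spectral unmixing, under the assumption (cf. Figure 3) that the diagnostic fluorophore emits noticeably in both bands while an unwanted source emits mainly in the first band; the threshold value and highlight colors are illustrative:

```python
import numpy as np

def obtain_image(channel1, channel2, reflectance, threshold=0.1):
    """Per-pixel intensities from two spectral channels are turned into
    the first/second image regions and composited onto the reflectance
    image. A toy decision rule replaces full spectral unmixing."""
    fluorescent = channel1 > threshold
    first_region = fluorescent & (channel2 > threshold)  # diagnostic signal
    second_region = fluorescent & ~first_region          # unwanted signal
    out = reflectance.copy()
    out[first_region] = [0, 255, 0]      # highlight diagnostic region
    out[second_region] = [255, 0, 255]   # highlight unwanted region
    return first_region, second_region, out

# One pixel strong in both bands (diagnostic), one strong only in band 1.
c1 = np.array([[1.0, 0.9]])
c2 = np.array([[0.2, 0.0]])
white = np.full((1, 2, 3), 128, dtype=np.uint8)
first, second, image = obtain_image(c1, c2, white)
```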
  • Figure 3 shows three schematic diagrams 300, 302, 304, each showing the spectrum of a different component of the detection light 112.
  • the abscissa of each diagram 300, 302, 304 denotes wavelength.
  • the ordinate of each diagram 300, 302, 304 denotes intensity.
  • the first and second wavelength bands are indicated in the diagrams 300, 302, 304 by dashed rectangles 306a, 306b.
  • a first diagram 300 shows the spectrum 308 of the fluorescence light emitted by the fluorophores, i.e. the spectrum of the diagnostic fluorescence signal.
  • the diagnostic fluorescence signal has its maximum 310 in the first wavelength band.
  • the intensity of the diagnostic fluorescence signal in the second wavelength band is about a fifth of the diagnostic fluorescence signal at its maximum.
  • a second diagram 302 shows the spectrum 312 of autofluorescence light emitted by a first tissue, i.e. the spectrum of a first unwanted fluorescence signal.
  • the first unwanted fluorescence signal has its maximum 314 in the first wavelength band.
  • the intensity of the first unwanted fluorescence signal in the second wavelength band is almost zero.
  • a third diagram 304 shows the spectrum 316 of autofluorescence light emitted by a second tissue, i.e. the spectrum of a second unwanted fluorescence signal.
  • the second unwanted fluorescence signal has its maximum 318 between the first wavelength band and the second wavelength band.
  • the intensity of the second unwanted fluorescence signal in the first wavelength band and in the second wavelength band is about equal.
  • each source of fluorescence light has its characteristic spectrum 308, 312, 316 that can be determined by the intensities detected in the first and second wavelength bands, i.e. the first and second detection channels. It is therefore possible to determine for each pixel the source of the fluorescence light received based on the first and second detection channels.
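  This per-pixel source determination can be illustrated with a nearest-signature classifier. The band intensities in the table below are illustrative values read off the schematic spectra of Figure 3, normalized to the first band, and are assumptions rather than measured data:

```python
import numpy as np

# Characteristic intensities of the three sources in the first and second
# wavelength bands (illustrative values inspired by Figure 3).
signatures = {
    "diagnostic": (1.0, 0.2),          # maximum in band 1, ~1/5 in band 2
    "autofluorescence 1": (1.0, 0.0),  # band 2 almost zero
    "autofluorescence 2": (0.5, 0.5),  # about equal in both bands
}

def classify_pixel(i1, i2):
    """Assign a pixel to the source whose normalized signature is closest
    (Euclidean distance) to the measured, normalized band intensities."""
    norm = np.hypot(i1, i2) or 1.0   # guard against an all-dark pixel
    measured = np.array([i1, i2]) / norm

    def distance(sig):
        s = np.array(sig) / np.hypot(*sig)
        return np.linalg.norm(measured - s)

    return min(signatures, key=lambda name: distance(signatures[name]))
```

  Normalizing both the measurement and the signatures makes the classification depend only on the spectral shape, not on the absolute brightness, which is exactly what a single narrow band (Figure 4) cannot provide.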
  • Figure 4 is a schematic diagram of the spectra of the different components of the detection light 112.
  • Figure 4 illustrates the state of the art in which only a single narrow wavelength band is used to capture the diagnostic fluorescence signal.
  • the abscissa of the diagram 400 denotes wavelength.
  • the ordinate of the diagram 400 denotes intensity.
  • the single narrow wavelength band is indicated in the diagram 400 by a dashed rectangle 402.
  • the spectrum 308 of the diagnostic fluorescence signal is shown as a solid curve.
  • the spectra 312, 316 of the two unwanted fluorescence signals are shown as dashed lines.
  • the three sources of fluorescence light can only be distinguished by their intensity in the single narrow wavelength band. However, since it is not easily possible to normalize the three signals, it is not possible to reliably distinguish the signals by their intensity in the single narrow wavelength band alone.
  • aspects described in the context of a method step also represent a description of a corresponding block or item or feature of a corresponding apparatus.

Abstract

A controller (118) for a surgical microscope (100) controls an excitation unit (106) of the surgical microscope (100) to emit excitation light for exciting fluorophores located in a target area (102) of a patient. The controller (118) controls an optical detection unit (108) of the surgical microscope (100) to receive light from the target area (102), and to separate the received light into at least two spectral channels. A first spectral channel corresponds to a first wavelength band, and a second spectral channel corresponds to a second wavelength band. The controller (118) generates at least one image comprising pixels of the target area (102) based on the received light, and determines a first intensity for each pixel based on the first spectral channel, and a second intensity for each pixel based on the second spectral channel. The controller (118) further determines at least one first image region (122) based on the first and second intensities, and at least one second image region (124) based on the first and second intensities. The first image region (122) corresponds to a region of the target area (102) exhibiting fluorescence caused by the excited fluorophores. The second image region (124) corresponds to a region of the target area (102) exhibiting fluorescence caused by a fluorescence source other than the excited fluorophores.

Description

Controller for a surgical microscope and method
Technical field
The invention relates to a controller for a surgical microscope and to a surgical microscope. The invention further relates to a method for obtaining an image with a surgical microscope.
Background
In microsurgery, i.e. surgery aided by a surgical microscope, fluorescence imaging becomes increasingly more important. Fluorophores are introduced into a patient's body to mark specific substances and structures, such as blood vessels or cancer cells. The fluorophores are used to highlight substances and structures that would otherwise not or only hardly be visible to the unaided eye. For example, indocyanine green (ICG) binds to plasma proteins in blood and can be used to highlight blood vessels (angiography). Other fluorophores, e.g. aminolevulinic acid (5-ALA), bind to cancer cells, and can thus be used to highlight the position of cancerous tissue.
Typically, the fluorescence imaging is performed in a single narrow spectral band, e.g. in the near infrared region when using ICG as fluorophore, and produces a monochrome image. In the monochrome image it is impossible to distinguish diagnostic fluorescence signal, i.e. fluorescence signals originating from the used fluorophore, from unwanted fluorescence signals, i.e. signals from any other source, purely based on the fluorescence intensity.
Other applications exist within the context of microsurgery that utilize a broader spectral band for fluorescence imaging, such as 5-ALA fluorescence imaging. In these applications, a color image is produced and a surgeon can in theory distinguish diagnostic fluorescence signals from unwanted fluorescence signals by color. For example, in the case of 5-ALA fluorescence imaging the diagnostic fluorescence signal is pink, while e.g. bone fluorescence is white. Other objects may have virtually any color and may in particular exhibit the color of the diagnostic fluorescence signal. It is therefore not easy to distinguish diagnostic fluorescence signals from unwanted fluorescence signals, especially when the perceived color of the diagnostic fluorescence signal and the unwanted fluorescence signal are similar. In some cases, it is simply not possible to distinguish diagnostic from unwanted fluorescence, e.g. soft tissue autofluorescence appears of similar color as 5-ALA fluorescence, especially at weak fluorescence intensities. Even in the cases where it is possible to use the color information to distinguish diagnostic from unwanted fluorescence, this task is an additional burden for the surgeon.
Augmented Reality (AR) platforms that merge a fluorescence image with a reflectance or white light image of the operating theater typically display the fluorescence image in a single color. It is therefore not possible to distinguish diagnostic fluorescence signals from unwanted fluorescence signals with these platforms.
Summary
It is therefore an object to provide a controller for a surgical microscope and a method for obtaining an image with a surgical microscope that allows a surgeon to easily distinguish diagnostic fluorescence signals from unwanted fluorescence signals.
The aforementioned object is achieved by the subject-matter of the independent claims. Advantageous embodiments are defined in the dependent claims and the following description.
The proposed controller for a surgical microscope is configured to control an excitation unit of the surgical microscope to emit excitation light for exciting fluorophores located in a target area of a patient. The controller is configured to control an optical detection unit of the surgical microscope to receive light from the target area, and to separate the received light into at least two spectral channels. A first spectral channel corresponds to a first wavelength band, and a second spectral channel corresponds to a second wavelength band. The controller is configured to generate at least one image of the target area based on the received light, the image comprising pixels, and to determine a first intensity for each pixel based on the first spectral channel and a second intensity for each pixel based on the second spectral channel. The controller is further configured to determine at least one first image region based on the first and second intensities, and at least one second image region based on the first and second intensities. The first image region corresponds to a region of the target area exhibiting fluorescence caused by the excited fluorophores. The second image region corresponds to a region of the target area exhibiting fluorescence caused by a fluorescence source other than the excited fluorophores.
The first and second regions may overlap. The overlap of the first and second regions corresponds to a region of the target area exhibiting fluorescence caused by the excited fluorophores as well as at least one fluorescence source other than the excited fluorophores. The first and second regions may be as small as a single pixel of the image of the target area. In particular, the controller may be configured to determine for each pixel whether the pixel corresponds to a region of the target area exhibiting fluorescence caused by the excited fluorophores, and/or whether the pixel corresponds to a region of the target area exhibiting fluorescence caused by a fluorescence source other than the excited fluorophores.
The received light is separated into at least two distinct spectral channels. This provides additional spectral information about the received light compared to using a single narrow spectral band. Different sources of fluorescence have different fluorescence spectra. For example, fluorescein has an emission maximum at 560 nm, 5-ALA has an emission maximum at 630 nm, and ICG has an emission maximum at 830 nm. However, all fluorophores emit light at wavelengths other than their emission maxima. By comparing the intensities in the different spectral channels to the known spectrum of a particular fluorophore, the presence of that particular fluorophore can be determined. Likewise, other sources of fluorescence, e.g. autofluorescence of tissue or bone, have their own distinct spectral profiles and can therefore be determined by comparing the intensities in the different spectral channels with the known spectral profile. Thus, the additional spectral information is used to distinguish between a diagnostic fluorescence signal and an unwanted fluorescence signal. The diagnostic fluorescence signal is the fluorescence caused by the excited fluorophores, e.g. the position of blood vessels indicated by the presence of ICG or the presence of cancerous tissue indicated by the presence of 5-ALA. The unwanted fluorescence signal is the fluorescence caused by a fluorescence source other than the excited fluorophores, e.g. autofluorescence of bone or tissue. By distinguishing between a diagnostic fluorescence signal and an unwanted fluorescence signal, additional mental load is taken off the surgeon, allowing them to perform their task more efficiently.
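The comparison of per-pixel channel intensities with the known spectrum of a particular fluorophore can be sketched as follows; this is a minimal illustration, and the expected band-to-band ratio and tolerance are assumptions, not values from the description:

```python
# Illustrative assumption: the fluorophore of interest emits roughly
# five times more intensity in the first wavelength band than in the
# second one. Both constants are hypothetical, for demonstration only.
EXPECTED_RATIO = 5.0
TOLERANCE = 0.2  # relative deviation from the expected ratio

def matches_fluorophore(i1, i2):
    """Check whether a pixel's intensities in the first and second
    spectral channels are consistent with the known spectrum of the
    fluorophore (here reduced to a single band-to-band ratio)."""
    if i2 <= 0:
        return False  # no emission in band 2: cannot match this fluorophore
    ratio = i1 / i2
    return abs(ratio - EXPECTED_RATIO) / EXPECTED_RATIO <= TOLERANCE
```

A pixel with intensities 100 and 20 in the two bands would match, while a pixel with equal intensities in both bands would not.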
In a preferred embodiment, the controller is configured to determine the first and second image regions in real-time. Real time in the context of this document means in particular that no post-processing takes place. This allows the information about the first and second region, i.e. the position of blood vessels or cancerous tissue, to be passed to the surgeon without any noticeable delay. For example, the first and second regions may be presented to the surgeon as an overlay in an AR environment, thus expanding the surgeon's vision, thereby aiding the surgeon during microsurgery.
In another preferred embodiment, the controller is configured to determine the first and second image regions by spectral unmixing. Spectral unmixing refers to all methods that allow for the separation of different fluorescence sources based on the intensities detected in the different spectral channels. Spectral unmixing can be performed quickly and reliably, allowing the first and second image regions to be determined quickly. Methods for spectral unmixing include but are not limited to linear unmixing, principal component analysis, learning unsupervised means of spectra, support vector machines, neural networks, a (spectral) phasor approach, and Monte Carlo unmixing algorithms.
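Linear unmixing, the simplest of the methods listed above, can be sketched as follows; the endmember spectra (the per-channel response of each fluorescence source) are illustrative assumptions:

```python
import numpy as np

# Hypothetical endmember matrix: rows are spectral channels, columns
# are fluorescence sources (e.g. diagnostic fluorophore and one
# autofluorescence source). Values are assumed for illustration.
A = np.array([
    [0.8, 0.9],   # channel 1 response of source 1 and source 2
    [0.2, 0.05],  # channel 2
    [0.1, 0.4],   # channel 3
])

def linear_unmix(pixel):
    """Estimate the abundance of each fluorescence source in one pixel
    by least squares, clipping negative abundances to zero."""
    x, *_ = np.linalg.lstsq(A, np.asarray(pixel, dtype=float), rcond=None)
    return np.clip(x, 0.0, None)
```

For a pixel synthesized as `A @ [2.0, 1.0]`, the routine recovers abundances close to `[2.0, 1.0]`.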
In another preferred embodiment, the controller is configured to control an illumination unit of the surgical microscope to emit illumination light for illuminating the target area, and to generate at least one reflectance image based on the received light. The reflectance image, also called white light image, shows the target area as the unaided human eye would see it. The reflectance image can be used to generate a composite image comprising the reflectance image and the first and/or second image regions. Such a composite image can help the surgeon determine the position of the first and/or second image regions in the target area.
In another preferred embodiment, the controller is configured to control the optical detection unit to separate the received light into at least three spectral channels. A third spectral channel corresponds to a third wavelength band. The third wavelength band is complementary to the first and second wavelength bands. The controller is further configured to generate the reflectance image based on at least the third spectral channel. This allows the first and second wavelength bands to be processed separately from the reflectance image. For example, the first and second spectral channels may be enhanced or reduced in intensity before they are displayed to the surgeon. In this embodiment, the third wavelength band is complementary to the first and second wavelength bands, i.e. the third wavelength band and the first and second wavelength bands are non-overlapping. In particular, the third wavelength band may encompass all optical wavelengths except for the first and second wavelength bands. In such an embodiment, the reflectance image would not display the first and second wavelength bands. In another preferred embodiment, the controller is configured to determine the first and second image regions based on the reflectance image. In this embodiment, additional information in the form of the reflectance image is used to determine the first and second image regions. For example, the controller may be configured to detect a non-biological object, e.g. a glove or a surgical instrument, in the reflectance image of the target area using image detection, and to determine that any fluorescence light emitted by the glove cannot be the diagnostic fluorescence signal. Using the additional information in the form of the reflectance image enhances the reliability with which the controller can determine the first and second image regions.
In another preferred embodiment, the controller is configured to determine the first and second image regions using machine learning. In this embodiment, the controller uses machine learning to distinguish between the diagnostic fluorescence signal and an unwanted fluorescence signal. In particular, additional information such as the reflectance image of the target area is used to determine the first and second image regions using machine learning. For example, the controller may be configured to perform image detection facilitated by machine learning in order to detect the non-biological object. In other words, the controller may be configured to perform a semantic segmentation of the image of the target area using machine learning. For example, certain areas may be determined to be a non-biological object, e.g. a glove or a surgical instrument. The result of the image segmentation may then be used to exclude certain regions of the image of the target area, e.g. the aforementioned non-biological object, from being emitters of the diagnostic fluorescence signal. Thus, the use of machine learning can greatly aid the determination of the diagnostic fluorescence signal and thereby improve the reliability of the controller.
Machine learning techniques include but are not limited to support vector machines and neural networks. Most machine learning techniques require either supervised or unsupervised training using an appropriate training dataset. The choice of the training dataset depends on the specific task of the machine learning technique used. In the above example of detecting a non-biological object in the image of the target area, the appropriate training dataset would consist of images, in particular images captured by means of a surgical microscope, of various foreign objects inside a patient's body.
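Once a semantic segmentation of the target area is available, excluding non-biological regions from the candidate diagnostic region reduces to a mask operation; the following sketch assumes a hypothetical label convention in which label 0 denotes biological tissue:

```python
import numpy as np

def exclude_non_biological(first_region_mask, segmentation):
    """Remove pixels labelled as non-biological (e.g. glove or
    surgical instrument) from the candidate first image region.
    `segmentation` holds per-pixel class labels; the convention that
    label 0 means biological tissue is an assumption for this sketch."""
    return first_region_mask & (segmentation == 0)
```

A pixel that fluoresces in the right band but lies on a segmented glove is thereby dropped from the first image region.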
In another preferred embodiment, the controller is configured to generate a composite image from the first image region and the image of the target area, and to control an output unit of the surgical microscope to display the image of the target area and/or the composite image. The composite image aids the surgeon in determining the position of the first and/or second image regions in the target area, thereby reducing mental load during microsurgery and allowing the surgeons to perform their task more efficiently. In the composite image, the first and second regions are preferably highlighted such that they are easily identified.
In another preferred embodiment, the controller is configured to generate the composite image from the first image region, the second image region, and the image of the target area. The first image region has a first color, and the second image region has a second color. The second color is different from the first color. Displaying the different regions in different colors helps the surgeon to differentiate between the diagnostic fluorescence signal and the unwanted fluorescence signal. Displaying the second image region, i.e. the region emitting what the controller has determined to be an unwanted fluorescence signal, further allows the surgeon to verify the determination by the controller, thereby giving the surgeon all the information to make informed decisions during microsurgery.
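A color-coded composite image might be generated along the following lines; the chosen colors and blending factor are illustrative, not taken from the description:

```python
import numpy as np

# Hypothetical colors (RGB): green for the first (diagnostic) region,
# magenta for the second (unwanted) region.
FIRST_COLOR = np.array([0, 255, 0], dtype=float)
SECOND_COLOR = np.array([255, 0, 255], dtype=float)

def composite(reflectance, first_mask, second_mask, alpha=0.5):
    """Blend colored overlays for the first and second image regions
    into an RGB reflectance image (H x W x 3, uint8)."""
    out = reflectance.astype(float)
    out[first_mask] = (1 - alpha) * out[first_mask] + alpha * FIRST_COLOR
    out[second_mask] = (1 - alpha) * out[second_mask] + alpha * SECOND_COLOR
    return out.astype(np.uint8)
```

Pixels outside both masks keep their reflectance color, so the surgeon retains the white light context around the highlighted regions.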
In another preferred embodiment, the controller is configured to generate the composite image from the first image region, the second image region, and the image of the target area. The first image region or the second image region is a blinking overlay over the image of the target area. Displaying the second region as a blinking overlay helps the surgeon to differentiate between the diagnostic fluorescence signal and the unwanted fluorescence signal. In particular this allows the surgeon to verify the determination by the controller during microsurgery.
In another preferred embodiment, the first and second wavelength bands do not overlap. In this embodiment, the first and second wavelength bands complement each other, allowing a more reliable determination of the first and second image regions.
The invention also relates to a surgical microscope comprising a controller as described above, an excitation unit configured to emit excitation light for exciting fluorophores located in a target area of a patient, and an optical detection unit configured to receive light from the target area, and to separate the received light into at least two spectral channels. A first spectral channel corresponds to the first wavelength band, and a second spectral channel corresponds to the second wavelength band.
The surgical microscope comprises the controller described above, and therefore has the same advantages as the controller.
In a preferred embodiment, the surgical microscope further comprises an illumination unit configured to emit illumination light for illuminating the target area. The illumination unit can in particular be used in generating the reflectance image.
In another preferred embodiment, the surgical microscope comprises an output unit. In particular, the output unit is one of a screen, an eye piece, an augmented-reality set and a virtual-reality set. The output unit may be configured to display the image of the target area and/or the composite image.
In another preferred embodiment, the optical detection unit comprises at least one microscope objective that is directed at the target area, and configured to receive the light from the target area. A microscope objective may provide magnification, allowing the surgeon to see smaller details of the target area than would be visible to the unaided eye. In other words, magnification allows the surgeon to perform microsurgery, e.g. surgery of small blood vessels or nerves with a diameter of 1 mm or less.
In another preferred embodiment, the optical detection unit comprises at least two detector elements and beam splitting means that are configured to direct received light having a wavelength in the first wavelength band onto a first detector element, and to direct received light having a wavelength in the second wavelength band onto a second detector element. In this embodiment, the beam splitting element and the first and second detector elements are used as means for generating the first and second spectral channels. Compared to other means for generating the first and second spectral channels, using the beam splitting element and the first and second detector elements is easy to implement, cost-effective, and reliable.
In another preferred embodiment, the optical detection unit is configured to separate the received light into at least three spectral channels. A third spectral channel corresponds to a third wavelength band. In particular, the third wavelength band is complementary to the first and second wavelength bands. An optical arrangement that separates the received light into three complementary wavelength bands can easily be realized by means of beam splitters. Such an optical arrangement has the advantage that the received light is distributed among the detector elements, and thus little to no received light is lost, as would for example be the case with a multispectral camera.
In another preferred embodiment, the optical detection unit comprises a multispectral camera or a hyperspectral camera configured to generate the first and second spectral channels. A multispectral camera is configured to capture a limited number of wavelength bands, typically around ten or fewer. Each of these wavelength bands may be a spectral channel of the surgical microscope. A hyperspectral camera is configured to capture tens or hundreds of wavelength bands per pixel. In other words, hyperspectral images have a very high spectral resolution. More spectral channels allow for a much finer differentiation of sources of fluorescence in the image of the target area based on their emission spectrum and thereby increase the sensitivity and reliability of the surgical microscope.
The invention further relates to a method for obtaining an image with a surgical microscope, comprising the following steps: Exciting fluorophores in a target area of a patient. Receiving light from the target area. Separating the received light into at least two spectral channels, a first spectral channel corresponding to a first wavelength band, and a second spectral channel corresponding to a second wavelength band. Generating at least one image of the target area based on the received light, the image comprising pixels. Determining a first intensity for each pixel based on the first spectral channel, and a second intensity for each pixel based on the second spectral channel. Determining at least one first image region based on the first and second intensities, the first image region corresponding to a region of the target area exhibiting fluorescence caused by the excited fluorophores. Determining at least one second image region based on the first and second intensities, the second image region corresponding to a region of the target area exhibiting fluorescence caused by a fluorescence source other than the excited fluorophores.
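The image-processing part of the method (generating per-pixel intensities and deriving the first and second image regions from them) might be sketched, for the simple case of a single known fluorophore, as follows; the band-to-band ratio test and all thresholds are illustrative assumptions:

```python
import numpy as np

def obtain_regions(channel1, channel2, fluor_ratio=5.0, tol=2.0, min_intensity=10):
    """Sketch of determining the first and second image regions from
    two spectral-channel intensity images of equal shape. The
    diagnostic fluorophore is assumed to emit roughly `fluor_ratio`
    times more in band 1 than in band 2; all parameters are
    hypothetical defaults."""
    c1 = channel1.astype(float)
    c2 = channel2.astype(float)
    # Only pixels with appreciable signal in either band are fluorescent at all.
    fluorescent = np.maximum(c1, c2) >= min_intensity
    ratio = c1 / np.maximum(c2, 1e-6)
    # First region: fluorescent pixels whose ratio matches the fluorophore.
    first = fluorescent & (np.abs(ratio - fluor_ratio) <= tol)
    # Second region: fluorescent pixels from any other source.
    second = fluorescent & ~first
    return first, second
```

Dark pixels end up in neither region, so the two masks partition only the fluorescent part of the image.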
The method has the same advantages as the controller described above and can be supplemented using the features of the dependent claims directed at the controller.
Short Description of the Figures
Hereinafter, specific embodiments are described referring to the drawings, wherein: Figure 1 is a schematic view of a surgical microscope according to an embodiment;
Figure 2 is a flowchart of the method for obtaining an image of the target area with the surgical microscope;
Figure 3 shows three schematic diagrams, each showing the spectrum of a different component of the detection light; and
Figure 4 is a schematic diagram of the spectra of the different component of the detection light illustrating the state of the art.
Detailed Description
Figure 1 is a schematic view of a surgical microscope 100 according to an embodiment.
The surgical microscope 100 is adapted to provide a surgeon with a magnified image of a target area 102 inside a patient's body, e.g. during microsurgery. In particular, the surgical microscope 100 is adapted to obtain a fluorescence image of the target area 102, that is, an image generated from fluorescence light emitted by fluorophores located in the target area 102.
An illumination unit 104 of the surgical microscope 100 is directed at the target area 102 and configured to illuminate the target area 102 using white light. An excitation unit 106 of the surgical microscope 100 is configured to emit excitation light towards the target area 102 for exciting the fluorophores. The excitation unit 106 may in particular comprise a source of coherent light, e.g. a white light laser, a continuous wave laser or a pulsed laser having a single emission wavelength. An optical detection unit 108 of the surgical microscope 100 has at least one microscope objective 110 directed at the target area 102 and configured to receive detection light 112 from the target area 102. The detection light 112 comprises multiple components. Among the components of the detection light 112 are fluorescence light emitted by the excited fluorophores, fluorescence light emitted by sources other than the excited fluorophores, and reflectance light caused by the white light illumination. The fluorescence light emitted by the excited fluorophores is a wanted or diagnostic fluorescence signal while the fluorescence light emitted by other sources is an unwanted fluorescence signal.
The optical detection unit 108 is configured, by way of example, to generate three spectral channels, i.e. to separately detect detection light 112 in a first, a second, and a third wavelength band. The three wavelength bands are described below with reference to Figure 3 in more detail. In order to generate the three spectral channels, the optical detection unit 108 comprises, by way of example, an arrangement of beam splitting elements 114a, 114b, e.g. dichroic elements or an acousto-optic tunable filter (AOTF), and detector elements 116a, 116b, 116c. Alternatively, the three spectral channels may be generated by other means, e.g. a multispectral or hyperspectral camera.
A first beam splitting element 114a is arranged in the beam path of the detection light 112 following the microscope objective 110. The first beam splitting element 114a is configured to direct detection light 112 of the first wavelength band onto a first detector element 116a. The remaining detection light 112 is directed at a second beam splitting element 114b. The second beam splitting element 114b is configured to direct detection light 112 of the second wavelength band onto a second detector element 116b. The then remaining detection light 112 is directed at a third detector element 116c.
The surgical microscope 100 further comprises a controller 118. The controller 118 is connected to the illumination unit 104, the excitation unit 106, the optical detection unit 108, and an output unit 120. The controller 118 is configured to control said elements of the surgical microscope 100. In particular, the controller 118 is configured to perform a method for obtaining an image of the target area 102 that is described below with reference to Figure 2.
The image or images of the target area 102 generated by the controller 118 are displayed to the surgeon by the output unit 120 of the surgical microscope 100. In this embodiment, the output unit 120 is shown, by way of example, as a monitor. In Figure 1 the output unit 120 shows, by way of example, a composite image of the target area 102 comprising a reflectance or white light image of the target area 102, a first image region 122 and a second image region 124. The first image region 122 corresponds to a diagnostic fluorescence signal, in this example a tumor marked by 5-ALA. The second image region 124 corresponds to an unwanted fluorescence signal, in this example bone autofluorescence. In Figure 1, the first and second image regions 122, 124 are distinguished by a different hatching. In an actual embodiment the first and second image regions 122, 124 may be distinguished by a different color. Alternatively, the first image region 122 or the second image region 124 may be displayed as a blinking overlay over the reflectance image.
Figure 2 is a flowchart of the method for obtaining an image of the target area 102 with the surgical microscope 100 described above.
The process is started in step S200. In step S202 the controller 118 controls the excitation unit 106 to emit excitation light towards the target area 102 in order to excite the fluorophores located in the target area 102. In step S204 the controller 118 controls the optical detection unit 108 to receive light from the target area 102. The received light is split into at least two spectral channels, three in the present embodiment. In the present embodiment, the first spectral channel is generated by the first detector element and corresponds to the first wavelength band, the second spectral channel is generated by the second detector element and corresponds to the second wavelength band, and the third spectral channel is generated by the third detector element and corresponds to the third wavelength band.
In step S206 the controller 118 generates at least one image of the target area 102 based on the received light, i.e. the detection light, the image comprising pixels. In step S208 the controller 118 determines a first intensity for each pixel based on the first spectral channel, and a second intensity for each pixel based on the second spectral channel. In the present embodiment, the controller 118 generates a first image corresponding to the detection light 112 received by the first detector element, and a second image corresponding to the detection light 112 received by the second detector element. The first and second images are monochromatic images corresponding to the intensity of detection light 112 received in the first and second wavelength bands, respectively. That is, each pixel of the first and second images stores information about the intensity of light received in the first and second wavelength bands, respectively. Further, in this embodiment, the controller 118 generates a third image of the target area 102 based on the light received in the third wavelength band. The third image may in particular be a color image of the target area 102. That is, the third image is a reflectance image. Alternatively, the controller 118 may generate a single image of the target area 102. In this alternative embodiment, each pixel of the single image comprises information about the intensity of light received in the three detection channels.
In step S210 the controller 118 determines the first image region 122 based on the first and second intensities, and in step S212 the controller 118 determines the second image region 124 based on the first and second intensities. The determination of the first and second image regions 122, 124 is described in more detail below with reference to Figure 3.
In step S214 the controller 118 generates the image of the target area 102 based on the first and second image regions 122, 124, and controls the output unit 120 to output the image of the target area 102 to the surgeon. In case the output unit 120 is a monitor, a digital eyepiece or the like, the controller 118 may generate a composite image comprising the reflectance image of the target area 102, and the first and second image regions 122, 124. In case the output unit 120 is an AR set or the like, the controller 118 may generate a composite image comprising the first and second image regions 122, 124 that can be displayed to the surgeon as an AR overlay. In step S216 the process is ended.
Figure 3 shows three schematic diagrams 300, 302, 304, each showing the spectrum of a different component of the detection light 112.
The abscissa of each diagram 300, 302, 304 denotes wavelength. The ordinate of each diagram 300, 302, 304 denotes intensity. The first and second wavelength bands are indicated in the diagrams 300, 302, 304 by dashed rectangles 306a, 306b.
A first diagram 300 shows the spectrum 308 of the fluorescence light emitted by the fluorophores, i.e. the spectrum of the diagnostic fluorescence signal. The diagnostic fluorescence signal has its maximum 310 in the first wavelength band. The intensity of the diagnostic fluorescence signal in the second wavelength band is about a fifth of the diagnostic fluorescence signal at its maximum.
A second diagram 302 shows the spectrum 312 of autofluorescence light emitted by a first tissue, i.e. the spectrum of a first unwanted fluorescence signal. The first unwanted fluorescence signal has its maximum 314 in the first wavelength band. The intensity of the first unwanted fluorescence signal in the second wavelength band is almost zero.
A third diagram 304 shows the spectrum 316 of autofluorescence light emitted by a second tissue, i.e. the spectrum of a second unwanted fluorescence signal. The second unwanted fluorescence signal has its maximum 318 between the first wavelength band and the second wavelength band. The intensity of the second unwanted fluorescence signal in the first wavelength band and in the second wavelength band is about equal.
As can be seen from the three diagrams 300, 302, 304, each source of fluorescence light has its characteristic spectrum 308, 312, 316 that can be determined by the intensities detected in the first and second wavelength bands, i.e. the first and second detection channels. It is therefore possible to determine for each pixel the source of the fluorescence light received based on the first and second detection channels.
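The per-pixel source determination described above might be sketched as follows; the numeric two-band signatures are read off the schematic spectra of Figure 3 and are approximate illustrations only:

```python
import numpy as np

# Approximate two-band signatures of the three sources of Figure 3:
# the diagnostic signal peaks in band 1 with about a fifth of the peak
# in band 2 (diagram 300); the first unwanted signal is almost absent
# from band 2 (diagram 302); the second unwanted signal is about equal
# in both bands (diagram 304).
SIGS = np.array([
    [1.0, 0.2],   # diagnostic fluorescence
    [1.0, 0.0],   # first unwanted signal
    [0.5, 0.5],   # second unwanted signal
])
SIGS = SIGS / np.linalg.norm(SIGS, axis=1, keepdims=True)

def classify_pixels(band1, band2):
    """Return, for two intensity images of equal shape, the per-pixel
    index (0, 1 or 2) of the signature with the highest cosine
    similarity to the measured two-band intensities."""
    stacked = np.stack([band1, band2], axis=-1).astype(float)
    norms = np.linalg.norm(stacked, axis=-1, keepdims=True)
    unit = stacked / np.maximum(norms, 1e-9)
    scores = unit @ SIGS.T  # cosine similarity with each signature
    return np.argmax(scores, axis=-1)
```

A pixel with intensities (100, 20) is assigned to the diagnostic signal, (100, 0) to the first unwanted signal, and (50, 50) to the second unwanted signal.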
Figure 4 is a schematic diagram of the spectra of the different components of the detection light 112.
Figure 4 illustrates the state of the art in which only a single narrow wavelength band is used to capture the diagnostic fluorescence signal. The abscissa of the diagram 400 denotes wavelength. The ordinate of the diagram 400 denotes intensity. The single narrow wavelength band is indicated in the diagram 400 by a dashed rectangle 402.
In Figure 4 the spectrum 308 of the diagnostic fluorescence signal is shown as a solid curve. The spectra 312, 316 of the two unwanted fluorescence signals are shown as dashed lines. As can be seen, the three sources of fluorescence light differ only in their intensity in the single narrow wavelength band. However, since it is not easily possible to normalize the three signals, it is not possible to distinguish the signals by their intensity in the single narrow wavelength band alone.
Identical or similarly acting elements are designated with the same reference signs in all Figures. As used herein, the term "and/or" includes any and all combinations of one or more of the associated listed items and may be abbreviated as "/".
Individual features of the embodiments and all combinations of individual features of the embodiments among each other as well as in combination with individual features or feature groups of the preceding description and/or claims are considered disclosed.
Although some aspects have been described in the context of an apparatus, it is clear that these aspects also represent a description of the corresponding method, where a block or device corresponds to a method step or a feature of a method step.
Analogously, aspects described in the context of a method step also represent a description of a corresponding block or item or feature of a corresponding apparatus.
List of Reference Signs
100 surgical microscope
102 target area
104 illumination unit
106 excitation unit
108 optical detection unit
110 objective
112 detection light
114a, 114b beam splitting element
116a, 116b, 116c detector element
118 controller
120 output unit
122, 124 image region
300, 302, 304 diagram
306a, 306b rectangle
308 spectrum
310 maximum
312 spectrum
314 maximum
316 spectrum
318 maximum
400 diagram
402 rectangle

Claims

1. A controller (118) for a surgical microscope (100), configured to control an excitation unit (106) of the surgical microscope (100) to emit excitation light for exciting fluorophores located in a target area (102) of a patient; control an optical detection unit (108) of the surgical microscope (100) to receive light from the target area (102), and to separate the received light into at least two spectral channels, a first spectral channel corresponding to a first wavelength band, and a second spectral channel corresponding to a second wavelength band; generate at least one image of the target area (102) based on the received light, the image comprising pixels; determine a first intensity for each pixel based on the first spectral channel, and a second intensity for each pixel based on the second spectral channel; determine at least one first image region (122) based on the first and second intensities, the first image region (122) corresponding to a region of the target area (102) exhibiting fluorescence caused by the excited fluorophores; and determine at least one second image region (124) based on the first and second intensities, the second image region (124) corresponding to a region of the target area (102) exhibiting fluorescence caused by a fluorescence source other than the excited fluorophores.

2. The controller (118) according to claim 1, configured to determine the first and second image regions (122, 124) in real-time.

3. The controller (118) according to claim 1 or 2, configured to determine the first and second image regions (122, 124) by spectral unmixing.

4. The controller (118) according to any one of the preceding claims, configured to control an illumination unit (104) of the surgical microscope (100) to emit illumination light for illuminating the target area (102), and to generate at least one reflectance image based on the received light.
5. The controller (118) according to claim 4, configured to control the optical detection unit (108) to separate the received light into at least three spectral channels; wherein a third spectral channel corresponds to a third wavelength band; wherein the third wavelength band is complementary to the first and second wavelength bands; and wherein the controller (118) is configured to generate the reflectance image based on at least the third spectral channel.

6. The controller (118) according to claim 4 or 5, configured to determine the first and second image regions (122, 124) based on the reflectance image.

7. The controller (118) according to any one of the preceding claims, configured to determine the first and second image regions (122, 124) using machine learning.

8. The controller (118) according to any one of the preceding claims, configured to generate a composite image from the first image region (122) and the image of the target area (102), and to control an output unit (120) of the surgical microscope (100) to display the image of the target area (102) and/or the composite image.

9. The controller (118) according to claim 8, configured to generate the composite image from the first image region (122), the second image region (124), and the image of the target area (102); wherein the first image region (122) has a first color, and the second image region (124) has a second color, the second color being different from the first color.

10. The controller according to claim 8 or 9, configured to generate the composite image from the first image region (122), the second image region (124), and the image of the target area (102); wherein the first image region (122) or the second image region (124) is a blinking overlay over the image of the target area (102).

11. The controller according to any one of the preceding claims, wherein the first and second wavelength bands do not overlap.
12. A surgical microscope (100) comprising a controller (118) according to any one of the preceding claims, an excitation unit (106) configured to emit excitation light for exciting fluorophores located in a target area (102) of a patient, and an optical detection unit (108) configured to receive light from the target area (102), and to separate the received light into at least two spectral channels, a first spectral channel corresponding to a first wavelength band, and a second spectral channel corresponding to a second wavelength band.

13. The surgical microscope (100) according to claim 12, comprising an output unit (120); wherein the output unit (120) is in particular one of a screen, an eyepiece, an augmented-reality set and a virtual-reality set.

14. The surgical microscope (100) according to claim 12 or 13, wherein the optical detection unit (108) comprises at least one microscope objective (110) that is directed at the target area (102), and configured to receive the light from the target area (102).

15. The surgical microscope (100) according to any one of claims 12 to 14, wherein the optical detection unit (108) comprises at least two detector elements and beam splitting means that are configured to direct received light having a wavelength in the first wavelength band onto a first detector element, and to direct received light having a wavelength in the second wavelength band onto a second detector element.

16. The surgical microscope (100) according to any one of claims 12 to 15, wherein the optical detection unit (108) comprises a multispectral camera or a hyperspectral camera configured to generate the first and second spectral channels.
17. A method for obtaining an image with a surgical microscope (100), comprising the following steps: a) exciting fluorophores in a target area (102) of a patient; b) receiving light from the target area (102); c) separating the received light into at least two spectral channels, a first spectral channel corresponding to a first wavelength band, and a second spectral channel corresponding to a second wavelength band; d) generating at least one image of the target area (102) based on the received light, the image comprising pixels; e) determining a first intensity for each pixel based on the first spectral channel, and a second intensity for each pixel based on the second spectral channel; f) determining at least one first image region (122) based on the first and second intensities, the first image region (122) corresponding to a region of the target area (102) exhibiting fluorescence caused by the excited fluorophores; and g) determining at least one second image region (124) based on the first and second intensities, the second image region (124) corresponding to a region of the target area (102) exhibiting fluorescence caused by a fluorescence source other than the excited fluorophores.
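The claims name spectral unmixing as one way to separate fluorescence from the excited fluorophores (first image region) from fluorescence of other sources such as autofluorescence (second image region). As a minimal sketch only, assuming per-pixel linear mixing of two known reference spectra sampled in the two spectral channels, least-squares unmixing could look as follows; the function name, reference signatures, and threshold are assumptions, not part of the patent.

```python
import numpy as np

def unmix_two_channels(ch1, ch2, fluor_sig, auto_sig, threshold=0.1):
    """Per-pixel linear spectral unmixing of two spectral channels.

    ch1, ch2   : 2-D intensity images (first and second spectral channel)
    fluor_sig  : (2,) reference signature of the target fluorophore
    auto_sig   : (2,) reference signature of other fluorescence sources
    Returns boolean masks for the fluorophore region and the
    other-source region.
    """
    # Per-pixel measurement vectors: shape (n_pixels, 2).
    y = np.stack([ch1.ravel(), ch2.ravel()], axis=1)
    # Mixing matrix: columns are the two reference spectra.
    m = np.stack([fluor_sig, auto_sig], axis=1)  # shape (2, 2)
    # Least-squares abundances a solving m @ a ~= y.T for every pixel.
    a, *_ = np.linalg.lstsq(m, y.T, rcond=None)
    fluor_ab = a[0].reshape(ch1.shape)
    auto_ab = a[1].reshape(ch1.shape)
    region_fluor = fluor_ab > threshold   # first image region (122)
    region_other = auto_ab > threshold    # second image region (124)
    return region_fluor, region_other
```

With more than two spectral channels (as in claim 5), the same least-squares formulation applies with a taller mixing matrix, and the system becomes overdetermined rather than exactly solvable.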
PCT/IB2023/056886 2022-07-04 2023-07-03 Controller for a surgical microscope and method WO2024009204A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102022116642.2 2022-07-04
DE102022116642 2022-07-04

Publications (1)

Publication Number Publication Date
WO2024009204A1 (en) 2024-01-11

Family

ID=87426893

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2023/056886 WO2024009204A1 (en) 2022-07-04 2023-07-03 Controller for a surgical microscope and method

Country Status (1)

Country Link
WO (1) WO2024009204A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140378843A1 (en) * 2012-01-20 2014-12-25 The Trustees Of Dartmouth College Method And Apparatus For Quantitative Hyperspectral Fluorescence And Reflectance Imaging For Surgical Guidance
US20190377170A1 (en) * 2016-02-12 2019-12-12 Massachusetts Institute Of Technology Method and apparatus for imaging unsectioned tissue specimens
US20210318241A1 (en) * 2018-07-24 2021-10-14 Sony Corporation Information processing apparatus, information processing method, information processing system, program, and microscope system
US20210397870A1 (en) * 2019-02-13 2021-12-23 Ventana Medical Systems, Inc. Systems and methods for computing the contributions of autofluorescence in multichannel image

Similar Documents

Publication Publication Date Title
JP6062405B2 (en) Surgical microscope, microscopy method, and use of surgical microscope to observe infrared fluorescence
US11800980B2 (en) Augmented reality surgical microscope and microscopy method
CN107072520B (en) Endoscope system for parallel imaging with visible and infrared wavelengths
US9906739B2 (en) Image pickup device and image pickup method
US9513219B2 (en) Fluoroscopy apparatus and image display method therefor
JP6785948B2 (en) How to operate medical image processing equipment, endoscopic system, and medical image processing equipment
EP2543308B1 (en) Fluoroscopy system
JP6017219B2 (en) Fluorescence observation apparatus and fluorescence observation system
EP2950129B1 (en) Optical filter system and fluorescence observation system
EP3110314B1 (en) System and method for specular reflection detection and reduction
WO2020036121A1 (en) Endoscope system
JP7009636B2 (en) Endoscope system
EP3858215A1 (en) Fluorescent observation camera system
EP3009098A1 (en) Microscope system for surgery
WO2009120228A1 (en) Image processing systems and methods for surgical applications
EP3841955A1 (en) Medical image processing apparatus and endoscope system, and operation method for medical image processing device
US9854963B2 (en) Apparatus and method for identifying one or more amyloid beta plaques in a plurality of discrete OCT retinal layers
JP6859554B2 (en) Observation aids, information processing methods, and programs
WO2024009204A1 (en) Controller for a surgical microscope and method
JP7291314B2 (en) Medical imaging device and method of operating a medical imaging device
US20140240673A1 (en) Image processing apparatus, ophthalmologic imaging apparatus, ophthalmologic imaging system, ophthalmologic imaging method, and non-transitory tangible medium having program stored thereon
WO2019083020A1 (en) Medical image processor and endoscope device
JP2001128926A (en) Method and device for fluorescent character display

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23744568

Country of ref document: EP

Kind code of ref document: A1