WO2022002399A1 - Multicore fiber endoscope for phase imaging based on intensity recording using deep neural networks - Google Patents

Multicore fiber endoscope for phase imaging based on intensity recording using deep neural networks Download PDF

Info

Publication number
WO2022002399A1
WO2022002399A1 PCT/EP2020/068593
Authority
WO
WIPO (PCT)
Prior art keywords
sample
imaging
multicore
camera
examined
Prior art date
Application number
PCT/EP2020/068593
Other languages
French (fr)
Inventor
Eirini KAKKAVA
Navid BORHANI
Demetri Psaltis
Christophe Moser
Original Assignee
Ecole Polytechnique Federale De Lausanne (Epfl)
Priority date
Filing date
Publication date
Application filed by Ecole Polytechnique Federale De Lausanne (Epfl) filed Critical Ecole Polytechnique Federale De Lausanne (Epfl)
Priority to PCT/EP2020/068593 priority Critical patent/WO2022002399A1/en
Publication of WO2022002399A1 publication Critical patent/WO2022002399A1/en

Links

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/00002 Operational features of endoscopes
    • A61B1/00004 Operational features of endoscopes characterised by electronic signal processing
    • A61B1/00009 Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope
    • A61B1/000095 Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope for image enhancement
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/00002 Operational features of endoscopes
    • A61B1/00004 Operational features of endoscopes characterised by electronic signal processing
    • A61B1/00009 Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope
    • A61B1/000096 Operational features of endoscopes characterised by electronic signal processing of image signals during a use of endoscope using artificial intelligence
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/00163 Optical arrangements
    • A61B1/00165 Optical arrangements with light-conductive means, e.g. fibre optics
    • A61B1/00167 Details of optical fibre bundles, e.g. shape or fibre distribution
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/06 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor with illuminating arrangements
    • A61B1/07 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor with illuminating arrangements using light-conductive means, e.g. optical fibres
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B23/00 Telescopes, e.g. binoculars; Periscopes; Instruments for viewing the inside of hollow bodies; Viewfinders; Optical aiming or sighting devices
    • G02B23/24 Instruments or systems for viewing the inside of hollow bodies, e.g. fibrescopes
    • G02B23/2407 Optical details
    • G02B23/2461 Illumination
    • G02B23/2469 Illumination using optical fibres
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B23/00 Telescopes, e.g. binoculars; Periscopes; Instruments for viewing the inside of hollow bodies; Viewfinders; Optical aiming or sighting devices
    • G02B23/24 Instruments or systems for viewing the inside of hollow bodies, e.g. fibrescopes
    • G02B23/26 Instruments or systems for viewing the inside of hollow bodies, e.g. fibrescopes using light guides
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/10 Beam splitting or combining systems
    • G02B27/1006 Beam splitting or combining systems for splitting or combining different wavelengths
    • G02B27/1013 Beam splitting or combining systems for splitting or combining different wavelengths for colour or multispectral image sensors, e.g. splitting an image into monochromatic image components on respective sensors
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/50 Optics for phase object visualisation
    • G02B27/52 Phase contrast optics
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B6/00 Light guides; Structural details of arrangements comprising light guides and other optical elements, e.g. couplings
    • G02B6/04 Light guides; Structural details of arrangements comprising light guides and other optical elements, e.g. couplings formed by bundles of fibres
    • G02B6/06 Light guides; Structural details of arrangements comprising light guides and other optical elements, e.g. couplings formed by bundles of fibres the relative position of the fibres being the same at both ends, e.g. for transporting images

Definitions

  • the present invention relates to an endoscopic system that performs phase imaging through a conventional multicore fiber (MCF).
  • MCF multicore fiber
  • the MCF endoscope captures bright-field (BF) images of desired samples, and then Deep Neural Networks (DNNs) are used to translate the recorded BF images into phase.
  • the DNNs are initially trained with pairs consisting of BF images and the corresponding phase image of the same location that is acquired using digital holographic microscopy (DHM).
  • DHM digital holographic microscopy
  • the endoscope is bending-insensitive and demonstrates the use of deep learning as a novel platform to add phase imaging modality to conventional MCF endoscopes.
  • MCFs multicore fibers
  • MCFs consist of thousands of individual single-mode fiber (SMF) cores that are either fused into a single glass rod or leached cores in a grid, held in a common cladding.
  • SMF single-mode fiber
  • Each core of the MCF delivers the light locally from one end to the other, leading to the formation of an image.
  • Each core of the MCF can be considered a pixel of the image; therefore, the core spacing and the number of available cores/pixels determine the resolution of the endoscope.
  • the average core spacing, and thus the resolution, of commercially available endoscopes is about 4.5 µm.
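The pixel-per-core argument above can be made quantitative with a small sketch. This is our own illustration, not taken from the patent: each core acts as one image pixel, so Nyquist sampling limits the smallest resolvable feature to roughly twice the core spacing, and any distal magnification M (e.g. from a GRIN lens) improves the sample-plane resolution by 1/M.

```python
# Hypothetical illustration (not from the patent): in a multicore fiber
# bundle each core acts as one image pixel, so Nyquist sampling limits
# the smallest resolvable feature to roughly twice the core spacing.

def mcf_resolution_limit(core_spacing_um: float, magnification: float = 1.0) -> float:
    """Approximate smallest resolvable feature in the sample plane (µm).

    With a distal magnification M (e.g. from a GRIN lens), the sample is
    magnified onto the facet, improving sample-plane resolution by 1/M.
    """
    return 2.0 * core_spacing_um / magnification

# Bare fiber with the 4.5 µm average core spacing quoted above:
print(mcf_resolution_limit(4.5))        # 9.0 µm
# With a hypothetical 2.5x GRIN magnifier at the distal facet:
print(mcf_resolution_limit(4.5, 2.5))   # 3.6 µm
```

This also illustrates the field-of-view trade-off mentioned below: the magnified case resolves finer features but images a proportionally smaller sample area.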
  • the production of the MCFs has been extensively studied with the aim to minimize the distance between the individual cores to increase resolution, while maintaining low crosstalk between the cores to prevent image blurring (see K. L. Reichenbach and C.
  • Graded-index (GRIN) lenses can be added at the distal MCF facet to magnify the sample image on the facet, thus improving the resolution of the final endoscope without significantly increasing the final probe size, but resulting in a reduced field of view (see W. Gobel, J. N. D. Kerr, A. Nimmerjahn, and F. Helmchen, "Miniaturized two-photon microscope based on a flexible coherent fiber bundle and a gradient-index lens objective," Optics Letters 29, 2521 (2004)).
  • DPC digital phase conjugation
  • TM transmission matrix
  • DNNs can achieve super resolution imaging through an MCF probe with high accuracy, removing the effect of core discretization from the acquired image (see J. Shao, J. Zhang, R. Liang, and K. Barnard, "Fiber bundle imaging resolution enhancement using deep learning," Opt. Express, OE 27, 15880-15890 (2019)).
  • DNNs can be used to recover different images related to the dataset once the model has been trained with it.
  • Imaging biological samples is of great importance for clinical diagnosis and treatment.
  • Cells or even tissues are samples of low contrast for a conventional BF microscope.
  • special chemical protocols to induce contrast mechanisms have been used to prepare biological samples in microscopy.
  • the use of dye molecules that bind to certain parts of the cell or tissue can offer contrast upon illumination of the sample generated by either absorption or fluorescence. Nevertheless, for practical applications, chemical staining of the living tissues is not usually suggested because of possible toxicity effects or intervention in the normal activity of the cells.
  • Phase imaging presents an alternative label-free microscopy method for biological sample inspection (see Y. Park, C. Depeursinge, and G. Popescu, "Quantitative phase imaging in biomedicine," Nature Photonics 12, 578-589 (2016)).
  • phase contrast endoscopy based on HiLo microscopy has been proposed.
  • HiLo endoscopy relies on illumination of a sample at high oblique angles using LED light sources from the side of the imaging fiber bundle. The detected light comes from the multiple scattering in the tissues that creates a transmission-like illumination towards the MCF imaging probe in the center.
  • the oblique angle illumination from the LED generates an image of gradient phase similar to DIC (differential interference contrast) microscopy. Iterative algorithm schemes are necessary for the removal of the MCF structure from the final image (see T. N. Ford, K. K. Chu, and J. Mertz, "Phase-gradient microscopy in thick tissue with oblique back-illumination," Nat Methods 9, 1195-1197 (2012)).
  • HiLo endoscopy DIC images provide a contrast mechanism, but they cannot give quantitative information concerning the phase variations of the inspected sample, and the resolution of the final image cannot be further enhanced with the proposed image processing algorithms. Consequently, novel technologies for phase imaging through the available MCFs are desired.
  • the present invention is related to an endoscopic system for phase imaging, comprising
  • a multicore waveguide, preferably a multicore fiber bundle (MCF), having a distal end and a proximal end, wherein in operation the distal end is directed to a sample to be examined and the proximal end is the end remote from the sample to be examined,
  • MCF multicore fiber bundle
  • an optical system comprising at least one first light source, preferably an LED, for illuminating the sample to be examined,
  • a first camera that is provided at the proximal side of the multicore waveguide and is capable of receiving light from said at least one first light source through said multicore waveguide so as to capture a bright field intensity image of the sample to be examined
  • the processing unit comprises a deep neural network, preferably of the Unet-type and generative adversarial network (GAN) type, that is trained so as to provide phase images.
  • GAN generative adversarial network
  • an endoscope is used that comprises a multicore waveguide, preferably a multicore fiber bundle (MCF).
  • MCF multicore fiber bundle
  • Said multicore waveguide has a distal end and a proximal end, wherein in operation the distal end is directed to a sample to be examined and the proximal end is the end remote from the sample to be examined.
  • the multicore waveguide is selected from the group consisting of a multicore waveguide made of individual cores fused in a common cladding, and a leached multicore waveguide.
  • the system is bending tolerant.
  • the endoscopic device of the present invention preferably includes a flexible MCF (multicore fiber) probe that relays the image from the area of interest at the distal fiber side to the proximal side for inspection.
  • MCF multicore fiber
  • the sample to be examined is illuminated with at least one first light source.
  • said light source is an LED, preferably with a wavelength in the range of 500-700 nm.
  • the light from the first light source, after having come into contact with the sample to be examined, is coupled into the multicore waveguide and led therethrough to the proximal end thereof, where a first camera is provided. Said camera thus receives the light from said at least one first light source through said multicore waveguide and captures a bright field intensity image of the sample to be examined.
  • Said image is processed in a processing unit, which can be any conventional processing unit such as a desktop computer, laptop etc.
  • Said processing unit comprises a deep neural network, pref erably of the Unet-type and generative adversarial network (GAN) type, that is trained so as to provide phase images.
  • GAN generative adversarial network
  • phase imaging can be obtained by generating holographic images of the sample to be examined, and using these holographic images for training the DNN.
  • the optical system of the device further comprises a second camera that is provided at the distal side of the multicore waveguide.
  • the optical system further comprises a second light source for transmitting light to the camera at the proximal side of the multicore waveguide, wherein said light transmitted from the second light source has a different wavelength than the light transmitted from the first light source.
  • said second light source is a laser having a wavelength in the range from 500-600 nm.
  • the illumination light sources, i.e. the first and second light source, can be delivered through the endoscope.
  • the endoscopic device described herein can be altered to collect:
  • the device can include:
    - a bandpass filter that separates the illumination wavelength from the fluorescence signal, which can be tuned in different parts of the spectrum depending on the collected signal and can be removed for bright field imaging;
    - a dichroic beam splitter, in the case of core-by-core illumination as suggested above, for separating illumination by the core and fluorescence collected by the multicore fiber.
  • the fluorescence imaging module can further comprise:
  • the fluorescence imaging module can further comprise a pinhole placed at the proximal 4f system to achieve confocal fluorescence measurements for higher contrast and sectioning.
  • the collection of bright-field images as described herein can be further improved for better contrast and sectioning by including into the device of the present invention a pinhole placed at the proximal 4f system to achieve confocal imaging, as also suggested for the fluorescence module above.
  • the endoscopic device of the present invention operates with said first light source and said first camera, together with the processing unit containing the respectively trained DNN.
  • the present invention can be employed with any known en doscopic device which is supplemented with the required compo nents (in particular the processing unit with the specifically trained DNN).
  • Training of the DNN may be performed using the same device, in which case it additionally possesses said second light source and second camera.
  • training of the DNN can be performed in a separate device that is equipped with the above first and second light source and first and second camera.
  • the at least first and optionally second light sources are arranged such that they illuminate the sample to be examined, positioned at the distal end of the multicore waveguide, at an angle of between 0° and 45°.
  • the optical system further comprises an imaging system between the proximal end of the multicore waveguide and the first camera, said imaging system being preferably a 4f magnifying optical system.
  • the optical system further comprises an imaging system at the distal end of the multicore waveguide, between said multicore waveguide and said sample to be examined, said imaging system being preferably a 4f magnifying optical system.
  • said 4f magnification system covers the camera chip area with the full proximal facet of the probe to achieve optimal sampling in terms of the sensor pixels.
  • the light for the illumination of the sample to be examined can be delivered by either the cladding or by mounted SMFs (single-mode fibers) at four points, but not limited to four points, around the central MCF core.
  • the light will be multiply scattered by the biological tissue, generating a uniform illumination behind the area of interest that mimics illumination in transmission for the sample.
  • the illumination wavelength can be tuned to lower wavelengths to achieve higher levels of backscattered light and, at the same time, partial sectioning.
  • the wavelength must always lie in a range where no phototoxicity or damage occurs to the biological sample.
  • the MCF can potentially be used in contact mode, using the bare facet of the fiber to touch the area of interest (i.e. the sample to be examined).
  • Another approach would be the addition of magnifying optics, preferably a GRIN lens mounted on the active area of the MCF for imaging (meaning the area that contains the individual cores).
  • the GRIN lens is a good option because it does not increase the final size of the endoscope and it can also provide a better sectioning and thus image quality for the final image generated at the proximal camera.
  • the processing unit of the system is optimized to generate phase images from the intensity images captured by the fiber, while removing at the same time the discretization artefact that the MCF structure creates on the image.
  • the proposed processing unit is based on modern DNNs that are trained to receive the BF (brightfield) intensity image recorded by the proximal first camera, which is sampled by the individual fiber cores, and render the corresponding phase image, nicely resolved without sampling distortion.
  • the present invention is furthermore related to a method for endoscopic examination of a sample with a device according to the present invention, comprising the steps: a) providing a sample to be examined, preferably a biological sample, b) illuminating said sample to be examined with the first light source of the device and generating a brightfield image with a first camera that receives the light from the sample transmitted through the multicore waveguide, c) processing the brightfield image with the processing unit using the deep neural network of the processing unit, preferably of the Unet-type and generative adversarial network (GAN) type, to obtain phase images of the sample to be examined by phase imaging.
  • GAN generative adversarial network
  • the phase imaging is selected from the group consisting of quantitative imaging and qualitative imaging.
  • the image collection of the sample to be examined at the distal side of the multicore waveguide is performed by a method selected from the group consisting of direct contact of the probe facet on the sample, by means of a microscope objective assembly configured at the facet, by a GRIN rod lens attached on the active area of the imaging probe, and by a high numerical aperture multicore waveguide in front of the multicore waveguide's proximal facet to distribute the information in the speckle pattern probed by the multicore waveguide.
  • the method comprises an additional step of training the deep neural network for phase imaging, using holographic images of the sample to be examined that have been generated with the second light source and second camera of the device of the present invention.
  • that training step may be performed with the same device before conducting endoscopy, or alternatively with a separate device.
  • the information from the holographic image is used for training the deep neural network, and the correspondingly trained DNN is implemented into the processing unit of the device with which endoscopy is carried out.
  • intentional perturbations are included in the system during the step of training the deep neural network for the phase imaging so that any additional noise in the measurements gets absorbed and learned by the deep neural network algorithm.
  • image discretization due to the sampling by the individual cores of the endoscopic device of the present invention can be removed using:
  • the brightfield image recorded by the endoscopic device of the present invention can be translated into a corresponding phase map using the processing unit using deep learning based on: a Unet-type deep neural network model that is initially trained by image pairs of pixelated brightfield images of the sample and their corresponding phase images recorded using digital holographic microscopy, or a GAN-type deep neural network model that is initially trained by image pairs of pixelated brightfield images of the sample and their corresponding phase images recorded using digital holographic microscopy, for a further improved image quality at the expense of training time.
  • the deep learning algorithms used for phase imaging as described herein can be also used for further improvements of the imaging capabilities of the endoscopic device of the present invention, such as:
  • Fig. 1 shows a schematic description of a device according to the present invention
  • Fig. 2 shows the optical setup for dataset acquisition accord ing to the present invention
  • Fig. 3a shows an image of a recorded hologram
  • Fig. 3b shows a BF image after MCF endoscopy for the same sample area
  • Fig. 4 shows a DNN architecture suitable for the present invention
  • Fig. 5a shows a holographically-extracted phase image generated according to the present invention
  • Fig. 5b shows an MCF output generated according to the present invention
  • Fig. 5c shows a reconstructed phase image from Fig. 5b generated according to the present invention using DNNs.
  • In FIG. 1 a schematic description of a device according to the present invention is shown.
  • a multicore imaging fiber bundle (080) is used to deliver images of a sample (090), such as biological tissue, to a camera sensor (010).
  • the recorded images are generated using the endoscope as a bright-field (BF) microscope.
  • the DNN model on the computer (200) is trained for the specific endoscope that is used, using an image dataset collected by the optical setup described in figure 2.
  • the BF images relayed through the fiber bundle (080) and captured on a camera (010) are then translated into the corresponding phase images on the computer (200) in a continuous feed based on the camera frames.
  • FIG. 1 illustrates the final working principle of the phase endoscope according to the present invention and comprises the fiber measurement part shown in Fig. 2, which optionally includes a magnification system (021, 031) between the sample (090) and the fiber bundle (080), or only the fiber bundle (080) itself, and a second 4f magnification system (023, 032) which collects the fiber output and generates an image on the camera (010).
  • In FIG 2 the optical setup used for acquisition of the image datasets needed to train the DNNs to translate BF microscopy images delivered by the MCF to phase images is shown. This optical setup is used in a calibration step in order to obtain the DNN model and realize the final phase endoscope presented in figure 1.
  • a 532 nm solid state laser (001) is used to implement the digital phase microscope part of the setup which is used to obtain the phase images of the desired sample, which are used as ground truth for the DNN model.
  • the laser beam of the laser (001) is expanded and collimated by two lenses (020 and 030).
  • a pinhole (040) is also used to clean the beam before collimation as shown in the figure 2.
  • a half-wave plate (050), placed after the laser (001) and a polarizing beam splitter (071) are integrated in the beam path after expansion to control the power ratio between the illumination and reference path of the holographic setup.
  • the tissue sample (090) under study is always located in the illumination path of the setup and mounted on a 3D motorized stage (100).
  • in the illumination path, the light after the sample is collected and magnified, before reaching the detector (011), by two consecutive 4f systems which result in a total magnification of 32 (021, 031, 022, 036).
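The total magnification of 32 quoted above arises because consecutive 4f relays multiply: each relay magnifies by f2/f1. A minimal sketch; the focal lengths below are hypothetical, chosen only to reproduce one possible 4x followed by 8x cascade, not taken from the patent.

```python
# Illustration (hypothetical focal lengths): cascaded 4f relays multiply.
# Each stage is a (f1, f2) lens pair with magnification f2 / f1.

def total_magnification(stages):
    """Product of the per-stage magnifications f2/f1."""
    m = 1.0
    for f1, f2 in stages:
        m *= f2 / f1
    return m

# e.g. a 4x relay (50 mm -> 200 mm) followed by an 8x relay (25 mm -> 200 mm):
print(total_magnification([(50, 200), (25, 200)]))  # 32.0
```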
  • a beamsplitter (070) is also placed in the illumination path in order to split the image of the sample created by the 4f system (021, 031) into two planes, indicated in figure 2 as "image plane 1" and "image plane 2".
  • the "image plane 2" is located at the focal plane of the second 4f system (022, 036) which is used to further magnify the sample on the camera (011).
  • the "image plane 1" is exactly at the facet of the fiber bundle (080) and magnifies the sample image about 5x.
  • the laser light (001) is only used for the holographic imaging of the sample and it does not participate in the image formation through the endoscope (080). Specifically, a longpass filter (101) is placed after the beamsplitter (070) to block the 532 nm laser.
  • another half-wave plate (051) ensures that the two paths have the same polarization when they interfere at the camera (011) plane, generating a digital hologram of the sample (090).
  • the reference and illumination paths are combined on the camera by means of a beamsplitter (072).
  • as the light in the illumination path travels through the sample, it gets modulated in phase by the differences in the refractive index among the parts of the tissue.
  • this information related to the phase differences of a sample is not obvious in a plain intensity image.
  • the phase information is encoded in the intensity interference pattern, i.e. the hologram of the sample.
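How interference encodes phase in a recordable intensity can be shown with a toy numerical sketch. This is our own illustration, not the patent's reconstruction method: for a unit-amplitude object wave O = exp(i·phi) and an on-axis plane reference R = 1, the recorded intensity |R + O|² = 2 + 2·cos(phi), so the object phase appears directly in the intensity fringes.

```python
import numpy as np

# Toy sketch: the hologram intensity I = |R + O|^2 contains the cross
# term 2*Re(conj(R)*O) that encodes the object phase phi as fringes.
rng = np.random.default_rng(0)
phi = rng.uniform(-np.pi, np.pi, size=(64, 64))  # synthetic object phase map
obj = np.exp(1j * phi)                           # unit-amplitude object wave
ref = np.ones_like(obj)                          # on-axis plane reference wave

hologram = np.abs(ref + obj) ** 2                # what the camera records
# Here I = 1 + 1 + 2*cos(phi), so the cos(phi) term can be isolated:
recovered_cos = hologram / 2.0 - 1.0
assert np.allclose(recovered_cos, np.cos(phi))
```

In practical DHM the reference is tilted (off-axis holography) so that the full, sign-unambiguous phase can be recovered from a single shot by Fourier-domain filtering; the on-axis case above only yields cos(phi) and is used here purely to show the intensity encoding.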
  • a second light source (002) is also used to illuminate the sample (090): an LED emitting at 625 nm for endoscopy.
  • the LED is spatially filtered twice to achieve the best collimation.
  • a 40x microscope objective lens (024) focuses the emitted light from the LED on a pinhole (041) and another lens (033) is used for collimation.
  • another system of lenses (034, 035) and a second pinhole (042) are used to further clean and demagnify the beam so that the red and green illumination beams have the same diameter when they reach the sample. Alignment of the red and green beams is performed using mirrors (060-065) to result in co-propagating beams.
  • the long-pass filter (101) at 630 nm allows only the light from the LED to propagate through the MCF, while another band-pass filter (102) at 532 nm allows only the laser light to reach the CCD1 and blocks any light coming from the LED.
  • the LED (002) is used to illuminate the sample in a transmission configuration for simplicity in order to demon strate the proof of concept of the technique.
  • Multiple LED sources can be incorporated around the imaging core of the fiber in order to illuminate the surface of the sample, as also presented in the previously mentioned work of Ford et al. and also demonstrated in several commercial endoscopes.
  • Careful alignment is performed between the area of the sample recorded by the two cameras (010 and 011) so that the image of the endoscope and the image of the hologram correspond to the same sample location, which is a necessity for training the DNN model afterwards.
  • a 3D motorized stage (100) is used to raster scan the sample so images from different locations are recorded for training.
  • one camera (011) records digital holograms of areas of the sample, which are used to generate the ground truth for the DNN training, while the second camera (010) records the image of the sample as delivered by the MCF in conventional BF mode, which is then used as an input to the DNN. Both cameras are needed for the training step of the endoscope. After the dataset for the training is collected, only the camera (010) will be used in practice, as shown in the schematic representation of the device in Figure 1.
  • In FIG 3a the hologram of a 10 µm thick histological slice of a mouse liver tissue is presented as recorded by the camera (011). The same location is imaged by the MCF endoscope in BF mode using the LED illumination and is recorded by the camera (010) in Figure 3b.
  • In FIG 4a the schematic representation of the training process for training a generative adversarial network (GAN) DNN to map a conventional BF MCF image to phase is depicted.
  • GAN generative adversarial network
  • In FIG 4b the architecture of the DNN used for phase imaging is described in more detail.
  • a GAN architecture is constructed in Keras API and it consists of a generator part which has a Unet structure with skip connections and a discriminator part, which is a VGG-type classifier.
  • a custom loss function is used for the image reconstruction by the Unet generator, which is the summation of two parts: a Sobel filter-based error (SFE) and the mean squared error (MSE).
  • SFE Sobel filter-based error
  • MSE mean squared error
  • the SFE is added to the loss function to preserve the high frequencies of the image that tend to be smoothed out by the MSE.
  • for the discriminator, the loss function is binary cross-entropy, as suggested in the literature.
  • the learning rate of the Adam optimizer is selected to be 10⁻⁴.
  • the size of the intensity image inputs and the phase image outputs is 256x256 pixels and, for memory and computational time efficiency, a batch size of 10 and 400 epochs were applied.
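The composite SFE + MSE generator loss described above can be sketched in plain NumPy. This is our own minimal illustration, not the authors' Keras implementation: the 1:1 weighting between the two terms and the 'valid' convolution padding are assumptions the patent does not specify.

```python
import numpy as np

# Sketch of the composite generator loss: mean squared error (MSE) plus a
# Sobel-filter-based error (SFE) that penalises differences in image
# gradients, preserving high frequencies that plain MSE smooths out.
# Weighting and padding choices here are assumptions, not from the patent.

KX = np.array([[1, 0, -1], [2, 0, -2], [1, 0, -1]], dtype=float)  # Sobel x
KY = KX.T                                                          # Sobel y

def _conv2(img, k):
    """'valid' 2-D correlation of img with a 3x3 kernel k."""
    h, w = img.shape
    out = np.zeros((h - 2, w - 2))
    for i in range(3):
        for j in range(3):
            out += k[i, j] * img[i:i + h - 2, j:j + w - 2]
    return out

def sfe_mse_loss(pred, target, sfe_weight=1.0):
    """MSE plus Sobel-gradient MSE between prediction and ground truth."""
    mse = np.mean((pred - target) ** 2)
    sfe = np.mean((_conv2(pred, KX) - _conv2(target, KX)) ** 2
                  + (_conv2(pred, KY) - _conv2(target, KY)) ** 2)
    return mse + sfe_weight * sfe

# Identical images give zero loss:
img = np.random.default_rng(0).random((32, 32))
assert sfe_mse_loss(img, img) == 0.0
```

A constant intensity offset only contributes through the MSE term, since the Sobel kernels sum to zero and so ignore the DC component; this is exactly why the SFE term targets edges and fine structure.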
  • the generator receives the MCF BF images of the sample and attempts to generate the corresponding phase map.
  • FIG. 5 illustrates the results of the phase imaging through the commercial MCF endoscope using DNNs to transform the BF image to phase.
  • Fig. 5a shows the extracted phase from the hologram recorded by the camera (011) of a liver tissue sample of 10 µm thickness.
  • Fig. 5b shows the corresponding BF image that the MCF endoscope would normally show, without further processing, for a sample like this, as recorded by the camera (010).
  • Fig. 5c shows how the DNN can translate the BF image (Fig. 5b) to a high contrast image of phase.
  • the images shown in this figure are part of the test set of the dataset meaning that the network had not been trained on them.
  • the reconstruction of the test set is of high quality, which is verified by the MSE metric as well as by the structural similarity index (SSI) metric.
  • the average MSE and SSI for the dataset were calculated to be 0.003 and 0.92, respectively.
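The two quality metrics quoted above can be sketched as follows. MSE is standard; for the structural similarity index we show the single-window global form of SSIM with the usual constants as an illustration only, since the patent does not specify which SSI variant was used.

```python
import numpy as np

# Sketch of the two reconstruction-quality metrics quoted above. The
# global single-window SSIM form (constants c1, c2 from the common SSIM
# formulation) is our assumption; the patent does not name the variant.

def mse(a, b):
    """Mean squared error between two images."""
    return np.mean((a - b) ** 2)

def global_ssim(a, b, data_range=1.0):
    """Single-window structural similarity over the whole image."""
    c1 = (0.01 * data_range) ** 2
    c2 = (0.03 * data_range) ** 2
    mu_a, mu_b = a.mean(), b.mean()
    var_a, var_b = a.var(), b.var()
    cov = ((a - mu_a) * (b - mu_b)).mean()
    return ((2 * mu_a * mu_b + c1) * (2 * cov + c2)) / (
        (mu_a ** 2 + mu_b ** 2 + c1) * (var_a + var_b + c2))

# Identical images: MSE 0, SSIM 1.
img = np.linspace(0.0, 1.0, 64).reshape(8, 8)
assert mse(img, img) == 0.0
assert abs(global_ssim(img, img) - 1.0) < 1e-9
```

Practical evaluations usually compute SSIM over a sliding local window and average the resulting map, which is more sensitive to local structure than the global form shown here.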


Abstract

The present invention relates to an endoscopic system for phase imaging, comprising a multicore waveguide (080), an optical system comprising at least one first light source (002) for illuminating the sample to be examined, a first camera (010) that is provided at the proximal side of the multicore waveguide (080) to capture a bright field intensity image of the sample (090) to be examined, and a processing unit (200), wherein the processing unit (200) comprises a deep neural network, preferably of the Unet-type and generative adversarial network (GAN) type, that is trained so as to provide phase images. The present invention further relates to a method using said endoscopic system.

Description

MULTICORE FIBER ENDOSCOPE FOR PHASE IMAGING BASED ON INTENSITY
RECORDING USING DEEP NEURAL NETWORKS
FIELD OF THE INVENTION
The present invention relates to an endoscopic system that performs phase imaging through a conventional multicore fiber (MCF). The MCF endoscope captures bright-field (BF) images of desired samples, and then Deep Neural Networks (DNNs) are used to translate the recorded BF images into phase. The DNNs are initially trained with pairs consisting of BF images and the corresponding phase image of the same location, acquired using digital holographic microscopy (DHM). The endoscope is bending-insensitive and demonstrates the use of deep learning as a novel platform to add a phase imaging modality to conventional MCF endoscopes.
BACKGROUND ART
Fiber endoscopy is widely used in current clinical practice because it can provide information from confined places inside the human body without the need for surgical intervention. One of the most common fiber types of which current endoscopes are made is the MCF (multicore fiber). MCFs consist of thousands of individual single-mode fiber (SMF) cores, either fused into a single glass rod or leached cores in a grid, held in a common cladding. Each core of the MCF delivers the light locally from one end to the other, leading to the formation of an image. Each core of the MCF can be considered as a pixel of the image; therefore, the core spacing and the number of available cores/pixels determine the resolution of the endoscope. The average core spacing, and thus resolution, of commercially available endoscopes is 4.5 µm. The production of MCFs has been extensively studied with the aim of minimizing the distance between the individual cores to increase resolution, while maintaining low crosstalk between the cores to prevent image blurring (see K. L. Reichenbach and C. Xu, "Numerical analysis of light propagation in image fibers or coherent fiber bundles," Optics Express 15, 2151 (2007)).
Although tighter arrangements of the MCF cores are limited, other options for increasing the final image resolution can be adopted. Graded-index (GRIN) lenses can be added at the distal MCF facet to magnify the sample image on the facet, thus improving the resolution of the final endoscope without significantly increasing the final probe size, but resulting in a reduced field of view (see W. Gobel, J. N. D. Kerr, A. Nimmerjahn, and F. Helmchen, "Miniaturized two-photon microscope based on a flexible coherent fiber bundle and a gradient-index lens objective," Optics Letters 29, 2521 (2004)).
Another solution that has been proposed to further improve the resolution of an MCF fiber is the generation of a diffraction-limited focused spot at the far end of the probe using wavefront shaping. In this case the size of the spot is determined by the numerical aperture (NA) of the individual cores of the MCF and can be significantly smaller than the core spacing. Therefore, point-scanning microscopy techniques can be used to achieve finer image formation through MCFs. To do so, digital phase conjugation (DPC) and transmission matrix (TM) methods have been proposed to deliver a focused spot through an MCF system for imaging (see D. B. Conkey, N. Stasio, E. E. Morales-Delgado, M. Romito, C. Moser, and D. Psaltis, "Lensless two-photon imaging through a multicore fiber with coherence-gated digital phase conjugation," J. Biomed. Opt. 21, 045002 (2016) and D. B. Conkey, E. Kakkava, T. Lanvin, D. Loterie, N. Stasio, E. Morales-Delgado, C. Moser, and D. Psaltis, "High power, ultrashort pulse control through a multi-core fiber for ablation," Optics Express 25, 11491 (2017)). However, DPC and TM are calibration-based techniques using digital holography and suffer from instabilities and perturbations of the system.
Apart from optical approaches, computational means have been proposed to enhance the imaging capabilities of an MCF endoscope. Iterative algorithms and compressive sensing are some examples for improving the image quality through MCFs (see J. Shao, W.-C. Liao, R. Liang, and K. Barnard, "Resolution enhancement for fiber bundle imaging using maximum a posteriori estimation," Opt. Lett. 43, 1906-1909 (2018) and J. Shin, B. T. Bosworth, and M. A. Foster, "Compressive fluorescence imaging using a multi-core fiber and spatially dependent scattering," Optics Letters 42, 109 (2017)). Recent advances in computational means have shown great potential for complex imaging through MCFs using deep learning approaches. DNNs can achieve super-resolution imaging through an MCF probe with high accuracy, removing the effect of core discretization from the acquired image (see J. Shao, J. Zhang, R. Liang, and K. Barnard, "Fiber bundle imaging resolution enhancement using deep learning," Opt. Express 27, 15880-15890 (2019)). In contrast to conventional image processing techniques, for which the algorithm optimization is performed on a single image example, a trained DNN can recover many different images related to its training dataset.
Imaging biological samples is of great importance for clinical diagnosis and treatment. Cells, or even tissues, are samples of low contrast for a conventional BF microscope. In order to observe and study the various biological properties of a sample, special chemical protocols that induce contrast mechanisms have been used to prepare biological samples in microscopy. The use of dye molecules that bind to certain parts of the cell or tissue can offer contrast upon illumination of the sample, generated by either absorption or fluorescence. Nevertheless, for practical applications, chemical staining of living tissues is usually not suggested because of possible toxicity effects or intervention in the normal activity of the cells.
Phase imaging presents an alternative, label-free microscopy method for biological sample inspection (see Y. Park, C. Depeursinge, and G. Popescu, "Quantitative phase imaging in biomedicine," Nature Photonics 12, 578-589 (2018)).
Considering the advantages of phase imaging in biomedicine, fiber endoscopy could greatly benefit from the realization of a probe that provides both BF and phase contrast imaging capabilities. Phase contrast endoscopy has been proposed using HiLO microscopy. HiLO endoscopy relies on illumination of a sample at high oblique angles using LED light sources from the side of the imaging fiber bundle. The detected light comes from multiple scattering in the tissue, which creates a transmission-like illumination towards the MCF imaging probe in the center. The oblique-angle illumination from the LED generates an image of gradient phase similar to DIC (differential interference contrast) microscopy. Iterative algorithm schemes are necessary for the removal of the MCF structure from the final image (see T. N. Ford, K. K. Chu, and J. Mertz, "Phase-gradient microscopy in thick tissue with oblique back-illumination," Nat Methods 9, 1195-1197 (2012)). Despite the impressive results of HiLO endoscopy, DIC imaging provides a contrast mechanism but cannot give quantitative information concerning the phase variations of the inspected sample, and the resolution of the final image cannot be further enhanced with the proposed image processing algorithms. Consequently, novel technologies for phase imaging through the available MCFs are desired.
SUMMARY OF THE INVENTION
The present invention is related to an endoscopic system for phase imaging, comprising
- a multicore waveguide, preferably a multicore fiber bundle (MCF), having a distal end and a proximal end, wherein in op eration the distal end is directed to a sample to be examined and the proximal end is the end remote from the sample to be examined,
- an optical system comprising at least one first light source, preferably a LED, for illuminating the sample to be examined,
- a first camera that is provided at the proximal side of the multicore waveguide and is capable of receiving light from said at least one first light source through said multicore waveguide so as to capture a bright field intensity image of the sample to be examined, and
- a processing unit,
characterized in that the processing unit comprises a deep neural network, preferably of the Unet-type and generative adversarial network (GAN) type, that is trained so as to provide phase images.
It has been found according to the present invention that an improvement in endoscopy can be obtained when using a deep neural network (DNN) that not only provides super-resolution, but additionally includes further image modalities. This can be achieved by training the DNN also with phase imaging details.
Endoscopes are generally known in the art. According to the present invention, an endoscope is used that comprises a multicore waveguide, preferably a multicore fiber bundle (MCF). Said multicore waveguide has a distal end and a proximal end, wherein in operation the distal end is directed to a sample to be examined and the proximal end is the end remote from the sample to be examined.
Such multicore waveguides are known in the art. According to a preferred embodiment of the present invention, the multicore waveguide is selected from the group consisting of a multicore waveguide made of individual cores fused in a common cladding, and a leached multicore waveguide.
Preferably, the system is bending tolerant. Thus, the endoscopic device of the present invention preferably includes a flexible MCF (multicore fiber) probe that relays the image from the area of interest at the distal fiber side to the proximal side for inspection.
The sample to be examined is illuminated with at least one first light source. Preferably, said light source is an LED, preferably with a wavelength in the range of 500-700 nm. The light from the first light source, after having come into contact with the sample to be examined, is coupled into the multicore waveguide and led therethrough to the proximal end thereof, where a first camera is provided. Said camera thus receives the light from said at least one first light source through said multicore waveguide and captures a bright field intensity image of the sample to be examined.
Said image is processed in a processing unit, which can be any conventional processing unit such as a desktop computer, laptop etc. Said processing unit comprises a deep neural network, preferably of the Unet-type and generative adversarial network (GAN) type, that is trained so as to provide phase images. Thus, in contrast to the prior art as described in the article by Shao et al. discussed above, the endoscopic system of the present invention not only processes the generated images for super-resolution, but also for phase imaging.
According to the present invention, phase imaging can be obtained by generating holographic images of the sample to be examined, and using these holographic images for training the DNN.
Thus, according to a preferred embodiment of the present invention, the optical system of the device further comprises a second camera that is provided at the distal side of the multicore waveguide. The optical system further comprises a second light source for transmitting light to the camera at the proximal side of the multicore waveguide, wherein said light transmitted from the second light source has a different wavelength than the light transmitted from the first light source.
Preferably, said second light source is a laser having a wavelength in the range from 500-600 nm.
The light from the second light source contacts the sample to be examined, and is afterwards transmitted to the second camera, where holographic images are generated as known in the art.
According to a preferred embodiment of the present invention, the light of the illumination light sources (i.e. the first and second light source) can be delivered through the endoscope
- via cladding illumination realized in a compact system around the camera chip collection part
- by addition of a number of single-mode fibers attached around the multicore fiber core
- by means of core-by-core illumination using a scanning system at the proximal side to steer the light from core to core.
According to another embodiment of the present invention, the endoscopic device described herein can be altered to collect:
- the bright-field images in a semi-transmission configuration as described in the details of the invention
- the endogenous fluorescence of the sample under study
- the fluorescence signal coming from a chemically stained and treated sample.
Such alterations can be made in a manner as known to the skilled person.
For the realization of fluorescence collection as an additional imaging module of the endoscopic device of the present invention, the device can include:
- a bandpass filter that separates the illumination wavelength from the fluorescence signal, that can be tuned to different parts of the spectrum depending on the collected signal, and that can be removed for bright field imaging
- a dichroic beam splitter, in the case of core-by-core illumination as suggested above, for separating illumination by the core and fluorescence collected by the multicore fiber.
According to a preferred embodiment of the present invention, the fluorescence imaging module can further comprise:
- a usual CCD camera, or
- a photomultiplier for the collection of the weak endogenous fluorescence of the biological samples, instead of the camera sensor, which has lower sensitivity, or
- a highly sensitive camera with low noise for low photon count situations.
Preferably, the fluorescence imaging module can further comprise a pinhole placed at the proximal 4f system to achieve confocal fluorescence measurements for higher contrast and sectioning.
According to another preferred embodiment of the present invention, the collection of bright-field images as described herein can be further improved for better contrast and sectioning by including into the device of the present invention a pinhole placed at the proximal 4f system to achieve confocal imaging, as also suggested for the fluorescence module above.
The endoscopic device of the present invention operates with said first light source and said first camera, together with the processing unit containing the respectively trained DNN. Accordingly, the present invention can be employed with any known endoscopic device which is supplemented with the required components (in particular the processing unit with the specifically trained DNN).
Training of the DNN may be performed using the same device, in which case it additionally possesses said second light source and second camera. Alternatively, training of the DNN can be performed in a separate device that is equipped with the above first and second light source and first and second camera.
According to another preferred embodiment of the present invention, the at least first and optionally second light sources are arranged such that they illuminate the sample to be examined, positioned at the distal end of the multicore waveguide, at an angle of between 0° and 45°.
According to another preferred embodiment of the present invention, the optical system further comprises an imaging system between the proximal end of the multicore waveguide and the first camera, said imaging system being preferably a 4f magnifying optical system.
According to another preferred embodiment of the present invention, the optical system further comprises an imaging system at the distal end of the multicore waveguide, between said multicore waveguide and said sample to be examined, said imaging system being preferably a 4f magnifying optical system. Preferably, said 4f magnification system images the full proximal facet of the probe onto the camera chip area to achieve optimal sampling in terms of the sensor pixels.
According to another preferred embodiment of the present invention, the light for the illumination of the sample to be examined can be delivered either by the cladding or by SMFs (single-mode fibers) mounted at four points, but not limited to four points, around the central MCF core. In each case, the light will be multiply scattered by the biological tissue, generating a uniform illumination behind the area of interest that mimics illumination in transmission for the sample. The illumination wavelength can be tuned to lower wavelengths to achieve higher levels of backscattered light and, at the same time, partial sectioning. However, the wavelength always needs to be in a range in which no phototoxicity and damage occur at the biological sample. For other applications that is not necessary.
The MCF can potentially be used in contact mode, using the bare facet of the fiber to touch the area of interest (i.e. the sample to be examined). Another approach would be the addition of magnifying optics, preferably a GRIN lens mounted on the active area of the MCF (meaning the area that contains the individual cores) for imaging. The GRIN lens is a good option because it does not increase the final size of the endoscope and it can also provide better sectioning, and thus image quality, for the final image generated at the proximal camera.
The processing unit of the system is optimized to generate phase images from the intensity images captured by the fiber, while at the same time removing the discretization artefact that the MCF structure creates on the image. The proposed processing unit is based on modern DNNs that are trained to receive the BF (bright-field) intensity image recorded by the proximal first camera, which is sampled by the individual fiber cores, and render the corresponding phase image, nicely resolved without sampling distortion.
The present invention is furthermore related to a method for endoscopic examination of a sample with a device according to the present invention, comprising the steps:
a) providing a sample to be examined, preferably a biological sample,
b) illuminating said sample to be examined with the first light source of the device and generating a brightfield image with a first camera that receives the light from the sample transmitted through the multicore waveguide,
c) processing the brightfield image with the processing unit using the deep neural network of the processing unit, preferably of the Unet-type and generative adversarial network (GAN) type, so as to obtain phase images of the sample to be examined.
Preferably, the phase imaging is selected from the group consisting of quantitative imaging and qualitative imaging.
According to a preferred embodiment of the present invention, the image collection of the sample to be examined at the distal side of the multicore waveguide is performed by a method selected from the group consisting of direct contact of the probe facet on the sample, by means of a microscope objective assembly configured at the facet, by a GRIN rod lens attached on the active area of the imaging probe, and by a high numerical aperture multicore waveguide in front of the multicore waveguide's proximal facet to distribute the information in the speckle pattern probed by the multicore waveguide.
According to the present invention, the method comprises an additional step of training the deep neural network for phase imaging, using holographic images of the sample to be examined that have been generated with the second light source and second camera of the device of the present invention. As described above, that training step may be performed with the same device before conducting endoscopy, or alternatively with a separate device. In the latter case, the information from the holographic image is used for training the deep neural network, and the correspondingly trained DNN is implemented into the processing unit of the device with which endoscopy is carried out.
Preferably, intentional perturbations are included in the system during the step of training the deep neural network for the phase imaging so that any additional noise in the measurements gets absorbed and learned by the deep neural network algorithm.
According to a preferred embodiment of the present invention, the processing unit
- receives images propagated through the multicore waveguide and captured in a camera as described above and removes the pixelation artefact caused by core sampling, and
- receives images propagated through the multicore waveguide and captured in a camera as described above in the bright field imaging configuration, removes the pixelation artefact caused by core sampling and translates the recorded image into its corresponding phase map.
According to another preferred embodiment of the present invention, image discretization due to the sampling by the individual cores of the endoscopic device of the present invention can be removed using:
- conventional image processing algorithms based on iterative optimization or single-shot spatial filtering, or
- a specifically trained deep neural network for super-resolution.
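As an illustration of the single-shot spatial filtering option, the discretization introduced by the core lattice can be suppressed with a low-pass filter whose cutoff sits at the Nyquist frequency of the core spacing. The numpy sketch below is a minimal example under stated assumptions (circular frequency mask, hypothetical function name), not the specific implementation of this invention:

```python
import numpy as np

def remove_core_pixelation(img, core_spacing_px):
    """Suppress spatial frequencies above the core-sampling limit of the MCF.

    core_spacing_px: average core-to-core distance in camera pixels (assumption).
    """
    spectrum = np.fft.fftshift(np.fft.fft2(img))
    n, m = img.shape
    # Nyquist radius of the core lattice, in frequency-index units
    cutoff = min(n, m) / (2.0 * core_spacing_px)
    y, x = np.ogrid[:n, :m]
    mask = (y - n / 2) ** 2 + (x - m / 2) ** 2 <= cutoff ** 2
    return np.real(np.fft.ifft2(np.fft.ifftshift(spectrum * mask)))
```

A trained super-resolution network replaces this fixed filter with a learned mapping, which is why it can recover detail that the low-pass filter necessarily discards.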
According to another preferred embodiment of the present invention, the brightfield image recorded by the endoscopic device of the present invention can be translated into a corresponding phase map by the processing unit using deep learning based on:
- a Unet-type deep neural network model that is initially trained with image pairs of pixelated brightfield images of the sample and their corresponding phase images recorded using digital holographic microscopy, or,
- a GAN-type deep neural network model that is initially trained with image pairs of pixelated brightfield images of the sample and their corresponding phase images recorded using digital holographic microscopy, for further improved image quality at the expense of training time.
The deep learning algorithms used for phase imaging as described herein can also be used for further improvements of the imaging capabilities of the endoscopic device of the present invention, such as:
- super-resolution, to reveal features beyond the limit of the core spacing that characterizes the proposed endoscope
- digital translation between various imaging modalities in addition to phase imaging, such as H&E-stained images, fluorescence or confocal images, simply by properly training the deep neural network with corresponding image pairs.
BRIEF DESCRIPTION OF THE DRAWINGS
The present invention will now be described in detail with reference to non-limiting examples and drawings.
Fig. 1 shows a schematic description of a device according to the present invention;
Fig. 2 shows the optical setup for dataset acquisition according to the present invention;
Fig. 3a shows an image of a recorded hologram;
Fig. 3b shows a BF image after MCF endoscopy for the same sample area;
Fig. 4 shows a DNN architecture suitable for the present invention;
Fig. 5a shows a holographically-extracted phase image generated according to the present invention;
Fig. 5b shows a MCF output generated according to the present invention;
Fig. 5c shows a reconstructed phase image from Fig. 5b generated according to the present invention using DNNs.
DETAILED DESCRIPTION
In Figure 1, a schematic description of a device according to the present invention is shown. A multicore imaging fiber bundle (080) is used to deliver images of a sample (090), such as a biological tissue, to a camera sensor (010). The recorded images are generated using the endoscope as a bright-field (BF) microscope. The DNN model on the computer (200) is trained for the specific endoscope that is used, using an image dataset collected by the optical setup described in Figure 2. The BF images relayed through the fiber bundle (080) and captured on a camera (010) are then translated into the corresponding phase images on the computer (200) in a continuous feed based on the camera frames. Fig. 1 illustrates the final working principle of the phase endoscope according to the present invention and comprises the fiber measurement part shown in Fig. 2, which optionally includes a magnification system (021, 031) between the sample (090) and the fiber bundle (080), or only the fiber bundle (080) itself, and a second 4f magnification system (023, 032) which collects the fiber output and generates an image on the camera (010).
In Figure 2, the optical setup used for acquisition of the image datasets needed to train the DNNs to translate BF microscopy images delivered by the MCF to phase images is shown. This optical setup is used in a calibration step in order to obtain the DNN model and realize the final phase endoscope presented in Figure 1.
A 532 nm solid state laser (001) is used to implement the digital phase microscope part of the setup, which is used to obtain the phase images of the desired sample; these are used as ground truth for the DNN model. The laser beam of the laser (001) is expanded and collimated by two lenses (020 and 030). A pinhole (040) is also used to clean the beam before collimation, as shown in Figure 2. A half-wave plate (050), placed after the laser (001), and a polarizing beam splitter (071) are integrated in the beam path after expansion to control the power ratio between the illumination and reference path of the holographic setup. The tissue sample (090) under study is always located in the illumination path of the setup and mounted on a 3D motorized stage (100).
In the illumination path, the light after the sample is collected and magnified, before reaching the detector (011), by two consecutive 4f systems which result in a total magnification of 32 (021, 031, 022, 036). A beamsplitter (070) is also placed in the illumination path in order to split the image of the sample created by the 4f system (021, 031) into two planes, indicated in Figure 2 as "image plane 1" and "image plane 2". The "image plane 2" is located at the focal plane of the second 4f system (022, 036), which is used to further magnify the sample on the camera (011). The "image plane 1" is exactly at the facet of the fiber bundle (080) and magnifies the sample image about 5x. The laser light (001) is only used for the holographic imaging of the sample and does not participate in the image formation through the endoscope (080). Specifically, a longpass filter (101) is placed after the beamsplitter (070) to block the 532 nm laser.
In the reference path, another half-wave plate (051) ensures that the two paths have the same polarization when they interfere at the camera (011) plane, generating a digital hologram of the sample (090). The reference and illumination paths are combined on the camera by means of a beamsplitter (072). When the light in the illumination path travels through the sample, it gets modulated in phase by the differences in the refractive index among the parts of the tissue. In a conventional intensity image on the camera, this information related to the phase differences of a sample is not obvious. However, when the light wavefront after the sample interferes with a reference field on the camera (011), the phase information is encoded in the intensity interference pattern: the hologram of the sample.
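For reference, the standard way to extract the phase from such an off-axis hologram is Fourier filtering: isolate one interference sideband in the hologram's spectrum, shift it back to the center, and take the angle of the inverse transform. The numpy sketch below is a generic illustration of this step; the carrier coordinates and filter radius are assumptions that depend on the actual setup:

```python
import numpy as np

def extract_phase(hologram, carrier):
    """Recover the object-beam phase from an off-axis hologram.

    carrier: (ky, kx) position of the chosen sideband, in pixels
    relative to the center of the fftshifted spectrum (assumption).
    """
    spectrum = np.fft.fftshift(np.fft.fft2(hologram))
    n = hologram.shape[0]
    ky, kx = carrier
    r = n // 8  # sideband filter radius (assumption)
    y, x = np.ogrid[:n, :n]
    mask = (y - (n // 2 + ky)) ** 2 + (x - (n // 2 + kx)) ** 2 <= r * r
    side = np.where(mask, spectrum, 0)
    side = np.roll(side, (-ky, -kx), axis=(0, 1))  # demodulate: sideband to DC
    field = np.fft.ifft2(np.fft.ifftshift(side))
    return np.angle(field)
```

In the setup of Figure 2 this reconstruction supplies the ground-truth phase maps; the endoscope itself never needs it at inference time.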
A second light source (002) is also used to illuminate the sample (090): an LED emitting at 625 nm for endoscopy. The LED is spatially filtered twice to achieve the best collimation. A 40x microscope objective lens (024) focuses the emitted light from the LED on a pinhole (041), and another lens (033) is used for collimation. Afterwards, another system of lenses (034, 035) and a second pinhole (042) are used to further clean and demagnify the beam so that the red and green illumination beams have the same diameter when they reach the sample. Alignment of the red and green beams is performed using mirrors (060-065) to result in co-propagating beams. The long-pass filter (101) at 630 nm allows only the light from the LED to propagate through the MCF, while a band-pass filter (102) at 532 nm allows only the laser light to reach the camera (011) and blocks any light coming from the LED. After the image of the sample, generated by the red light illumination at the "image plane 1" (as described above), has propagated through the MCF, it is collected and magnified by a 4f system (023, 032) and finally recorded on a second camera (010). This camera saves the intensity image of the same area of the sample as the one for which the hologram is recorded.
In Figure 2, the LED (002) is used to illuminate the sample in a transmission configuration for simplicity, in order to demonstrate the proof of concept of the technique. Multiple LED sources can be incorporated around the imaging core of the fiber in order to illuminate the surface of the sample, as presented in the previously mentioned work of Ford et al. and as demonstrated in several commercial endoscopes.
Careful alignment is performed between the areas of the sample recorded by the two cameras (010 and 011), so that the image of the endoscope and the image of the hologram correspond to the same sample location, which is a necessity for training the DNN model afterwards. Finally, a 3D motorized stage (100) is used to raster-scan the sample so that images from different locations are recorded for training. To summarize, one camera (011) records digital holograms of areas of the sample, which are used to generate the ground truth for the DNN training, while the second camera (010) records the image of the sample as delivered by the MCF in conventional BF mode, which is then used as an input to the DNN. Both cameras are needed for the training step of the endoscope. After the dataset for the training is collected, only the camera (010) will be used in practice, as shown in the schematic representation of the device in Figure 1.
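The aligned recordings from the two cameras can then be tiled into input/target pairs for training. The sketch below is a hypothetical helper (the function name and the fixed-grid cropping are assumptions; the dataset preparation is not specified in this detail) showing the pairing logic only:

```python
import numpy as np

def make_training_pairs(bf_images, phase_images, size=256):
    """Tile co-registered BF/phase image pairs into aligned training patches."""
    pairs = []
    for bf, phase in zip(bf_images, phase_images):
        h, w = bf.shape
        for y in range(0, h - size + 1, size):
            for x in range(0, w - size + 1, size):
                pairs.append((bf[y:y + size, x:x + size],
                              phase[y:y + size, x:x + size]))
    return pairs
```

Each tuple then serves as one (input, ground-truth) example for network training.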
In Figure 3a, the hologram of a 10 µm thick histological slice of a mouse liver tissue is presented, as recorded by the camera (011). The same location is imaged by the MCF endoscope in BF mode using the LED illumination and is recorded by the camera (010) in Figure 3b.
In Figure 4a, the schematic representation of the training process for training a generative adversarial network (GAN) DNN to map a conventional BF MCF image to phase is depicted. In Figure 4b, the architecture of the DNN used for phase imaging is described in more detail. A GAN architecture is constructed in the Keras API; it consists of a generator part, which has a Unet structure with skip connections, and a discriminator part, which is a VGG-type classifier. A custom loss function is used for the image reconstruction by the Unet generator, which is the summation of two parts: a Sobel filter-based error (SFE) and the mean squared error (MSE). The SFE is added in the loss function to preserve the high frequencies of the image, which tend to be smoothed out by the MSE. For the discriminator, the loss function is binary cross-entropy, as suggested in the literature. The learning rate of the Adam optimizer is selected to be 10⁻⁴. The size of the intensity image inputs and the phase image outputs is 256x256 pixels, and for memory and computational time efficiency, a batch size of 10 and 400 epochs were applied. During the training process, the generator receives the MCF BF images of the sample and attempts to generate the corresponding phase map. Then the discriminator receives the predicted phase map from the generator, compares it with the ground truth image recorded on the holographic setup, and is trained to classify the ground truth as real and the generator output as fake. This way the generator is pushed to create more realistic data to fool the discriminator, and vice versa. In the end, once the GAN is trained, the generator is used to provide an accurate phase mapping through the endoscope.
Figure 5 illustrates the results of the phase imaging through the commercial MCF endoscope using DNNs to transform the BF image to phase. Fig. 5a shows the extracted phase from the hologram recorded by the camera (011) of a liver tissue sample of 10 µm thickness. Fig. 5b shows the corresponding BF image that the MCF endoscope would normally show without further processing for a sample like this, as recorded by the camera (010).
Finally, the Fig. 4c shows how the DNN can translate the BF im age (Fig. 4b) to a high contrast image of phase. The images shown in this figure, are part of the test set of the dataset meaning that the network had not been trained on them. The re construction of the test set is of high quality, which is veri fied by the MSE metric as well as using the structural similari ty index (SSI) metric. The average MSE and SSI for the dataset were calculated 0.003 and 0.92 respectively.
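The composite generator loss described above, the summation of a Sobel filter-based error and the MSE, can be sketched as follows. This is a minimal NumPy illustration, not the disclosed Keras implementation; the function names and the unit relative weighting of the two terms are assumptions:

```python
import numpy as np

# 3x3 Sobel kernels for horizontal and vertical gradients
KX = np.array([[1, 0, -1], [2, 0, -2], [1, 0, -1]], dtype=float)
KY = KX.T

def _filt3(img, kernel):
    """Same-size 3x3 filtering with edge padding (no SciPy dependency)."""
    p = np.pad(img, 1, mode="edge")
    out = np.zeros_like(img, dtype=float)
    for i in range(3):
        for j in range(3):
            out += kernel[i, j] * p[i:i + img.shape[0], j:j + img.shape[1]]
    return out

def sfe_mse_loss(pred, target, w_sfe=1.0):
    """Sum of the mean squared error and a Sobel filter-based error (SFE).
    The SFE penalizes differences in image gradients, preserving the high
    frequencies that plain MSE tends to smooth out.
    `w_sfe` is an assumed relative weight, not taken from the disclosure."""
    mse = np.mean((pred - target) ** 2)
    sfe = np.mean((_filt3(pred, KX) - _filt3(target, KX)) ** 2
                  + (_filt3(pred, KY) - _filt3(target, KY)) ** 2)
    return mse + w_sfe * sfe

# Identical images incur zero loss; a constant offset is caught by the MSE
# term only, since the Sobel kernels sum to zero and ignore constant shifts.
rng = np.random.default_rng(0)
phase_map = rng.random((64, 64))
loss_same = sfe_mse_loss(phase_map, phase_map)
loss_diff = sfe_mse_loss(phase_map + 0.1, phase_map)
```

In a full training loop this loss would drive the Unet generator while the discriminator is trained separately with binary cross-entropy, as the description states.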

Claims

1. An endoscopic system for phase imaging, comprising
- a multicore waveguide (080), preferably a multicore fiber bundle (MCF), having a distal end and a proximal end, wherein in operation the distal end is directed to a sample (090) to be examined and the proximal end is the end remote from the sample to be examined,
- an optical system comprising at least one first light source (002), preferably an LED, for illuminating the sample to be examined,
- a first camera (010) that is provided at the proximal side of the multicore waveguide (080) and is capable of receiving light from said at least one first light source through said multicore waveguide (080) so as to capture a bright field intensity image of the sample (090) to be examined, and
- a processing unit (200), characterized in that the processing unit (200) comprises a deep neural network, preferably of the Unet-type and generative adversarial network (GAN) type, that is trained so as to provide phase images.
2. The device according to claim 1, wherein the optical system further comprises a second camera (011) that is provided at the distal side of the multicore waveguide (080).
3. The device according to any of the preceding claims, wherein the optical system further comprises a second light source (001), preferably a laser, for transmitting light to the camera at the proximal side of the multicore waveguide (080), wherein said light transmitted from the second light source (001) has a different wavelength than the light transmitted from the first light source (002).
4. The device according to claim 3, wherein the second camera (011) and the second light source (001) are capable of generating holographic images of the sample (090) to be examined.
5. The device according to any of the preceding claims, wherein the at least first and optionally second light sources (001, 002) are arranged such that they illuminate the sample (090) to be examined and positioned at the distal end of the multicore waveguide (080) under an angle of between 0° and 45°.
6. The device according to any of the preceding claims, wherein the multicore waveguide (080) is selected from the group consisting of a multicore waveguide made of individual cores fused in a common cladding, and a leached multicore waveguide.
7. The device according to any of the preceding claims, wherein the optical system further comprises an imaging system (023, 032) between the proximal end of the multicore waveguide (080) and the first camera (010), said imaging system (023, 032) being preferably a 4f magnifying optical system.
8. The device according to any of the preceding claims, wherein the optical system further comprises an imaging system (021, 022, 031, 036) at the distal end of the multicore waveguide (080), said imaging system (021, 022, 031, 036) being preferably a 4f magnifying optical system.
9. A method for endoscopic examination of a sample with a device according to any of claims 1 to 8, comprising the steps:
a) providing a sample (090) to be examined, preferably a biological sample,
b) illuminating said sample (090) to be examined with the first light source (002) of the device and generating a brightfield image with a first camera (010) that receives the light from the sample (090) and transmitted through the multicore waveguide (080),
c) processing the brightfield image with the processing unit (200) using the deep neural network of the processing unit (200), preferably of the Unet-type and generative adversarial network (GAN) type, with phase imaging to obtain phase images of the sample (090) to be examined.
10. The method according to claim 9, wherein the phase imaging is selected from the group consisting of a quantitative imaging and a qualitative imaging.
11. The method according to claim 9 or 10, wherein the method comprises an additional step of training the deep neural network for phase imaging, using holographic images of the sample (090) to be examined that have been generated with the second light source (001) and second camera (011) of the device of the present invention.
12. The method according to any of claims 9 to 11, wherein intentional perturbations are included in the system during the step of training the deep neural network for the phase imaging so that any additional noise in the measurements gets absorbed and learned by the deep neural network algorithm.
13. The method according to any of claims 9 to 12, wherein the image collection of the sample (090) to be examined at the distal side of the multicore waveguide (080) is performed by a method selected from the group consisting of direct contact of the probe facet on the sample (090), by means of a microscope objective assembly configured at the facet, by a GRIN rod lens attached on the active area of the imaging probe, and by a high numerical aperture multicore waveguide in front of the multicore waveguide's proximal facet to distribute the information in the speckle pattern probed by the multicore waveguide (080).
14. The method according to any of claims 9 to 13, wherein the processing unit (200)
- receives images propagated through the multicore waveguide (080) and captured in the first camera (010) and removes the pixelation artefact caused by core sampling, and
- receives images propagated through the multicore waveguide (080) and captured in the first camera (010) in the brightfield imaging configuration, removes the pixelation artefact caused by core sampling and translates the recorded image into its corresponding phase map.
PCT/EP2020/068593 2020-07-02 2020-07-02 Multicore fiber endoscope for phase imaging based on intensity recording using deep neural networks WO2022002399A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/EP2020/068593 WO2022002399A1 (en) 2020-07-02 2020-07-02 Multicore fiber endoscope for phase imaging based on intensity recording using deep neural networks


Publications (1)

Publication Number Publication Date
WO2022002399A1 true WO2022002399A1 (en) 2022-01-06

Family

ID=71452240


Country Status (1)

Country Link
WO (1) WO2022002399A1 (en)


Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10346740B2 (en) * 2016-06-01 2019-07-09 Kla-Tencor Corp. Systems and methods incorporating a neural network and a forward physical model for semiconductor applications
WO2020095071A1 (en) * 2018-11-09 2020-05-14 Cancer Research Technology Limited Methods of characterising and imaging with an optical system



Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116327103A (en) * 2023-05-30 2023-06-27 北京大学第三医院(北京大学第三临床医学院) Large-visual-angle laryngoscope based on deep learning
CN116327103B (en) * 2023-05-30 2023-07-21 北京大学第三医院(北京大学第三临床医学院) Large-visual-angle laryngoscope based on deep learning


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
Ref document number: 20736329; Country of ref document: EP; Kind code of ref document: A1
NENP Non-entry into the national phase
Ref country code: DE
122 Ep: pct application non-entry in european phase
Ref document number: 20736329; Country of ref document: EP; Kind code of ref document: A1