WO2017195163A1 - System, method and apparatus for retinal absorption phase and dark field imaging with oblique illumination - Google Patents

System, method and apparatus for retinal absorption phase and dark field imaging with oblique illumination

Info

Publication number
WO2017195163A1
Authority
WO
WIPO (PCT)
Prior art keywords
eye
illumination
light
imaging
phase
Prior art date
Application number
PCT/IB2017/052803
Other languages
French (fr)
Inventor
Timothé LAFOREST
Dino CARPENTRAS
Christophe Moser
Mathieu KÜNZI
Original Assignee
Ecole Polytechnique Federale De Lausanne (Epfl)
Priority date
Filing date
Publication date
Application filed by Ecole Polytechnique Federale De Lausanne (Epfl) filed Critical Ecole Polytechnique Federale De Lausanne (Epfl)
Priority to EP22153500.8A priority Critical patent/EP4008237A1/en
Priority to JP2018559733A priority patent/JP6994472B2/en
Priority to EP17729922.9A priority patent/EP3454719A1/en
Priority to US16/300,937 priority patent/US11179033B2/en
Priority to CN201780033204.7A priority patent/CN109414162A/en
Publication of WO2017195163A1 publication Critical patent/WO2017195163A1/en
Priority to US17/514,604 priority patent/US11911107B2/en
Priority to JP2021200525A priority patent/JP7235355B2/en
Priority to JP2023021276A priority patent/JP2023055993A/en

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00 Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/10 Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B3/12 Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for looking at the eye fundus, e.g. ophthalmoscopes
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00 Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/0008 Apparatus for testing the eyes; Instruments for examining the eyes provided with illuminating means
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00 Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/10 Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B3/14 Arrangements specially adapted for eye photography

Definitions

  • the present invention relates to high resolution quantitative and qualitative absorption, phase and dark field imaging of the retina by the use of oblique illumination.
  • Standard photography relies on the difference in absorption of different features for providing contrast. This is also the case in conventional funduscopy, where blood vessels, photoreceptors and other retinal structures are observed thanks to their different reflectivity values that provide intensity modulation at the sensor plane. This is not the case for most of the cells lying in the inner retina (ganglion, nuclear and plexiform layers), whose absorption and scattering values are so low that they show almost no contrast even at high resolution. Furthermore, the intensity modulation of these features is negligible with respect to the background modulation signal (due to underlying features) in combination with noise. Even with optical coherence tomography (OCT), the weak contrast of these cells makes the retina appear as a smooth layer, almost free of features.
  • OCT optical coherence tomography
  • processing the images also allows correcting the eye's aberrations by optimizing Fourier properties of the images (U.S. Pat. No. 8,731,272; Z. Phillips, M. Chen, L. Waller, "Quantitative Phase Microscopy with Simultaneous Aberration Correction," Optics in the Life Sciences Congress, OSA Technical Digest (online), Optical Society of America, 2017, paper JTu5A.2).
  • transscleral illumination, i.e. light is provided to the fundus through the sclera, to obtain higher contrast dark field images of the retina (A. Schalenbourg, L. Zografos, "Pitfalls in colour photography of choroidal tumours," Eye. 2013, Vol. 27(2), pp. 224-229).
  • In U.S. Pat. No. 7,387,385 (Figure 21), U.S. Pat. Pub. No. 2007/0159600, and U.S. Pat. Pub. No. 2007/0030448 (Figure 22), a transscleral illumination with several differing wavelengths (red, green, blue) is used to make one image with several wavelengths simultaneously. The resulting image is used to diagnose choroidal tumors.
  • transscleral illumination allows collecting only light coming from underneath the ~100 um thick first layer of the retina. This is because the high reflectance (at and near specular) coming from the surface is blocked by the eye pupil.
  • the tumors absorb much more light than healthy tissue because of the intense cellular and vasculature activity in tumorous tissue. Because the near specular reflected light is blocked, the transscleral image of the tumor has more contrast than that obtained with transpupillary illumination and hence allows a better diagnostic of the spatial extent of the tumor (A. Schalenbourg, L. Zografos "Pitfalls in colour photography of choroidal tumours," Eye. 2013, Vol. 27(2): pp. 224-229).
  • by "point" we mean "point source like", such as a small area. It can be an area larger than the area given by the diffraction limit.
  • phase image information, in either a quantitative or non-quantitative manner, of the inside of the eye without the use of a scanning system.
  • This phase contrast concerns tissue such as, but not limited to, the fundus and retina. Accordingly, there is a need to obtain phase information from the biological material above the photoreceptors to obtain improved contrast and improved image resolution, and additionally to derive functional information that exists in the large body of research on quantitative phase imaging in biology.
  • phase imaging we mean an imaging technique for which a well-known relationship (such as but not limited to linear or logarithmic) exists between the grayscale pixel value of the camera and the corresponding phase that the physical sample imparted on the light traversing it.
  • Phase unwrapping techniques can be also used to remove the effect of phase periodicity and obtain more detailed images.
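As an illustrative sketch of the phase unwrapping step mentioned above (not taken from the disclosure), a wrapped phase map can be unwrapped with a standard 2D algorithm; the choice of scikit-image and the synthetic example are assumptions, not requirements of the method.

```python
# Minimal sketch: 2D phase unwrapping of a wrapped phase map (values in (-pi, pi]).
import numpy as np
from skimage.restoration import unwrap_phase

# Synthetic wrapped phase: a smooth ramp exceeding 2*pi, wrapped into (-pi, pi]
yy, xx = np.mgrid[0:256, 0:256]
true_phase = 0.1 * xx                       # smooth phase ramp (radians)
wrapped = np.angle(np.exp(1j * true_phase))

unwrapped = unwrap_phase(wrapped)           # removes the 2*pi periodicity artifacts
# recovered up to a constant offset:
print(np.allclose(unwrapped - unwrapped[0, 0], true_phase - true_phase[0, 0], atol=1e-6))
```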
  • one example of such contrast is phase gradient contrast, and it is given in T. N. Ford, K. K. Chu and J. Mertz, "Phase-gradient microscopy in thick tissue with oblique back-illumination," Nat. Methods, 9, 12 (2012).
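The following minimal sketch (an illustration, not the patented implementation) shows how a phase-gradient-type contrast image can be formed from two frames acquired with opposite oblique illumination directions, in the spirit of the differential phase contrast literature cited above; the function and variable names are assumptions.

```python
import numpy as np

def phase_gradient_contrast(img_left, img_right, eps=1e-6):
    """Normalized difference of two images taken with opposite oblique illuminations.

    To first order, the result is proportional to the phase gradient of the sample
    along the illumination axis, while the common absorption background cancels.
    """
    a = img_left.astype(float)
    b = img_right.astype(float)
    return (a - b) / (a + b + eps)   # eps avoids division by zero in dark regions
```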
  • phase and absorption reconstruction using Waller's method (L. Tian and L. Waller, "Quantitative differential phase contrast imaging in an LED array microscope," Opt. Exp. 23(9), pp. 11394-11403 (2015)) already consists of a deconvolution process.
  • the method can be used to obtain phase and absorption information also in the case of a known or an unknown aberrated pupil (Z. Phillips, M. Chen, L. Waller “Quantitative Phase Microscopy with Simultaneous Aberration Correction” Optics in the Life Science 2017).
  • High resolution imaging of the living retina is performed by aberration correction. This task is done computationally as presented in U.S. Pat. No. 8,731,272, or using hardware devices.
  • In vivo imaging of the retina is usually performed by a scanning system, U.S. Pat. No. 4,213,678, possibly coupled to adaptive optics, European Pat. No. 1427328 Al, or using a camera flood illumination system, also sometimes coupled to adaptive optics, as for instance in U.S. Pat. Pub. No. 2004/0189941, U.S. Pat. No. 7,364,296.
  • AOCSLO Adaptive Optics Confocal Scanning Laser Ophthalmoscope
  • OCT Optical Coherence Tomography
  • E.M. Wells-Gray, R. J. Zawadzki, S. C. Finn, C. Greiner, J. S. Werner, S. S. Choi, N. Doble, "Performance of a combined optical coherence tomography and scanning laser ophthalmoscope with adaptive optics for human retinal imaging applications," Proc. SPIE, vol 9335, pp , 2015.) provides a lateral resolution of about 1.5 um, and 2 um axially. This value is limited by the numerical aperture provided by the eye pupil.
  • a wavefront shaping method has been used by Vellekoop and Mosk to perform focusing of light through a highly scattering medium (I. M.
  • a method for imaging a tissue of an eye includes the steps of providing oblique illumination to the eye by a plurality of light emitting areas of a light delivery device, the plurality of light emitting areas being independently controllable and arranged to direct light towards at least one of a retina and an iris of the eye, causing an output beam from light backscattered from the at least one of the retina and the iris by the oblique illumination, and capturing the output beam with an imaging system to provide a sequence of images of a fundus of the eye.
  • the method further preferably includes a step of retrieving a phase and absorption contrast image from the sequence of images of the fundus, and the sequence of images of the fundus of the step of capturing is obtained by sequentially turning on one or more of the plurality of light emitting areas at a time in the step of providing the oblique illumination.
  • the tissue of the eye is part of a living eye of a human or an animal.
  • the oblique illumination is at least one of a transpupillary illumination, a transscleral illumination, and a transepidermal illumination.
  • the light delivery device is configured for at least one of the following illumination modalities: no contact between the light delivering device and the face of the patient whose eye is imaged, the light delivering device in contact with the skin surrounding the eye, the light delivering device in contact with a sclera of the eye, and the light delivering device in contact with a cornea of the eye.
  • a system for imaging a tissue of an eye preferably includes a light delivering device having a plurality of light emitting areas, the light emitting areas directed towards the tissue of the eye for providing oblique illumination, an output beam caused by light backscattered off a fundus of the eye of the oblique illumination from the plurality of emitting areas, and an imaging system configured to capture the output beam and to provide a sequence of images of the fundus of the eye.
  • the system further preferably includes a controller configured to individually control the plurality of light emitting areas of the light delivering device, to sequentially turn on one of the plurality of light emitting areas at a time, for capturing the sequence of images by the imaging system, and the imaging system is further configured to retrieve a quantitative phase contrast image, a quantitative absorption image, a quantitative phase and absorption image, a qualitative phase contrast image, a qualitative absorption image, a qualitative phase and absorption image, and a dark field image from the fundus of the eye.
  • the imaging system preferably further includes a scanning system and a detector, the scanning system having a collection pupil that is either centered or shifted with respect to a center of a pupil of the eye, and the detector includes at least one of a single pixel detector, a line camera, a two-dimensional multipixel device and a split detector.
  • FIG. 1 shows a scheme of the working principle of the system presented in U.S. Pat. No. 7,387,385, U.S. Pat. Pub. No. 2007/0159600 according to the background art.
  • a waveguiding component is put in contact with the sclera to provide illumination. The user has to hold this waveguiding component;
  • FIG. 2 shows a scheme of the working principle of the system presented in U.S. Pat. Pub No. 2007/0159600 according to the background art.
  • the method uses one illumination point, or two illumination points where the two sources provide illumination simultaneously;
  • FIG. 3 shows an illumination system presented in T. N Ford, K. K Chu and J. Mertz, "Phase-gradient microscopy in thick tissue with oblique back-illumination,” according to the background art, Nat. methods, 9, 12 (2012) and Int. Pat. Pub. No. WO 2013/148360.
  • Light is guided and delivered at the surface of the scattering media.
  • a second waveguiding component is present for providing the symmetrical illumination.
  • FIGs. 4 A and 4B schematically show two different representations of a transmission microscopy phase imaging method using incoherent darkfield illumination, presented in L. Tian and L. Waller, "Quantitative differential phase contrast imaging in an LED array microscope,” Opt. Exp. 23,9,11394-11403(2015), according to the background art;
  • FIG. 5 shows the in vivo darkfield imaging of the retina using a modified AOSLO system with offset aperture, image adapted from Toco Y. P. Chui, Dean A. VanNasdale, and Stephen A. Burns, the use of forward scatter to improve retinal vascular imaging with an adaptive optics scanning laser ophthalmoscope, Biomed. Opt. Express 3, 2537-2549 (2012), according to the background art;
  • FIG. 6 shows in vivo darkfield imaging of the retina using a modified AOSLO system with split detector, image adapted from A. Guevara-Torres, D. R. Williams, and J. B. Schallek, Imaging translucent cell bodies in the living mouse retina without contrast agents, Biomed. Opt. Express 6, 2106-2119 (2015), according to the background art;
  • FIG. 7 shows a flood illumination adaptive optic system with transpupillary illumination, as shown in U.S. Pat. Pub. No. 2004/0189941, according to the background art;
  • FIG. 8 shows a review based on state of the art OCT retinal imaging systems, table from Jonnal R.S., Kocaoglu O.P., Zawadzki R.J., Liu Z., Miller D.T., Werner J.S., A Review of Adaptive Optics Optical Coherence Tomography: Technical Advances, Scientific Applications, and the Future. Invest Ophthalmol Vis Sci. 2016 Jul 1;57(9):OCT51-OCT68, showing the cells not yet being imaged in vivo, according to the background art;
  • FIG. 9A schematically shows the transscleral illumination method according to an aspect of the present invention. Light passes through the sclera before illuminating the eye fundus;
  • FIG. 9B schematically shows the proposed transscleral illumination method according to an aspect of the present invention: light is shined directly on the scleral tissue.
  • Light can be delivered by direct contact of the source or the light waveguiding component, or in a non-contact manner with an optical beam (collimated or not). Some examples of a multiplicity of light positions/areas are symbolized by discs.
  • the scattering properties of the sclera produce a diffused beam that illuminates the fundus with a high angle. In case of physical contact with the sclera, local anesthesia can be used to make the measurement more comfortable to the patient;
  • FIG. 10A schematically shows a trans-epidermal illumination method according to an aspect of the present invention. Light passes through the skin and the sclera before illuminating the eye fundus;
  • FIG. 10B schematically shows a trans-epidermal illumination method according to an aspect of the present invention.
  • Light is shined on the eye lid and from there, scattered through different layers up to the inside of the eye.
  • Light can be delivered by direct contact (of the source or a waveguiding component) or with an optical beam (collimated or not). Some examples of a multiplicity of light positions/areas are symbolized by discs. Many point sources that are spatially separated provide different angles of illumination. Also, contact with the skin can be more comfortable for a patient as it does not require anesthesia;
  • FIG. 11 A schematically shows a pupillary illumination method according to an aspect of the present invention.
  • Light is shined on the inside layers of the eye after passing through the pupil.
  • the light is scattered on the inside of the eye and illuminates the eye fundus.
  • Light can be delivered by direct contact (of the source or a waveguiding component) on the cornea or in a non-contact manner with an optical beam (collimated or not);
  • FIG. 11B schematically shows a pupillary illumination method with an example of point light entering the pupil, according to an aspect of the present invention;
  • FIG. 12A schematically shows pupillary darkfield illumination according to an aspect of the present invention.
  • Light is shined on the extremity of the pupil (in a single point or in an annular shape).
  • Light illuminates the upper layer of the retina without illuminating the background;
  • FIG. 12B schematically shows pupillary darkfield illumination according to an aspect of the present invention.
  • the light pattern can be an annulus or a restricted portion of the annulus;
  • FIG. 13 schematically shows temporal illumination according to an aspect of the present invention: The light passes through the temporal tissues (skin) before reaching the sclera of the eye and the eye fundus;
  • FIG. 14A schematically shows oblique illumination of the eye fundus with a focused beam according to an aspect of the present invention
  • FIG. 14B schematically shows oblique illumination of the eye fundus with a collimated beam according to an aspect of the present invention
  • FIG. 14C schematically shows oblique illumination of the eye fundus with a diverging beam according to an aspect of the present invention
  • FIG. 15 schematically shows annular illumination using the side of the eyeball as a scattering layer for providing high numerical aperture, according to an aspect of the present invention;
  • FIG. 16 schematically shows an exemplary apparatus for non-contact illumination according to an aspect of the present invention.
  • FIG. 17 schematically shows a top view of an apparatus for contact illumination, according to another aspect of the present invention.
  • At least one light beam is in contact with the sclera.
  • the light beam can be brought via a light waveguide such as but not limited to a multimode fiber.
  • the imaging lens that images the fundus through the eye lens is also in contact with the eye via an index matching gel placed between the cornea and the imaging lens. Note that this figure differs from the apparatus shown in FIG. 2 in the number of illumination points (more than two), and the way of illumination. Here the beams are switched on sequentially (one or multiple at the same time), while in FIG. 2, the two points are shined simultaneously;
  • FIG. 18 schematically shows an apparatus for non-contact illumination, according to another aspect of the present invention, a top view of FIG. 17. Note that this figure differs from FIG. 2 in the number of illumination points (more than two), and the way of illumination. Here the beams are switched on sequentially (one or multiple at the same time), while in FIG. 2, the two points are shined simultaneously;
  • FIG. 19 schematically shows an apparatus for non-contact illumination according to an aspect of the present invention.
  • a rotating wheel is pierced with a hole on its periphery.
  • a light beam illuminates the whole surface of the wheel in such a way that the light passes only through the hole and illuminates only one point on the sclera.
  • the illumination point can be on the skin;
  • FIG. 20 schematically shows an apparatus for non-contact illumination according to an aspect of the present invention.
  • a rotating wheel holds a fiber and a lens that focuses light on the sclera or on the skin;
  • FIG. 21 schematically shows an apparatus for non-contact illumination, based on the example of the apparatus shown in FIG. 19 with the imaging system, according to an aspect of the present invention
  • FIG. 22 schematically shows an apparatus for non-contact illumination, based on the example of the apparatus shown in FIG. 19 with the imaging system, according to an aspect of the present invention
  • FIG. 23 schematically shows an apparatus for contact illumination according to an aspect of the present invention.
  • a patch is put in contact with the patient's skin.
  • the patch is connected to several fibers whose distal ends are the illumination points (area) on the patient's eyelids.
  • the patch can be composed of a removable (consumable) protection part in contact to the skin (one per patient);
  • FIG. 24 schematically shows an illustration of a continuous light emitting device having the arc shape of the eye.
  • the light emitting device is held by a flexible electronic circuit
  • FIG. 25 schematically shows a simplified view of the printed circuit board (PCB) system with its electronic components, according to an aspect of the present invention
  • FIG. 26 shows a photograph of two prototypes placed on a subject's left eye.
  • Four (4) light-emitting diodes (LEDs) shine light from the top lid and four (4) other LEDs shine light from the bottom lid of the eye;
  • FIG. 27 shows a photograph of a subject positioned on an ophthalmic head mount with two prototype light sources placed on top and bottom of the eye lid respectively. In the image, one LED is switched ON;
  • FIG. 28A shows a photograph of a designed prototype holding four (4) red surface-mount device (SMD) LEDs having a dimension smaller than 1 mm3.
  • the illuminating device is a flexible PCB;
  • FIG. 28B shows a photograph of a prototype holding four (4) red LEDs having a diameter of 5 mm.
  • the LED connectors are fixed to a threaded tube that can be screwed to the optical system;
  • FIG. 29 is a simplified representation of a schematic of the device, according to an aspect of the present invention.
  • the light beam is first reshaped by the modulating device and then is sent to the retina with a high numerical aperture illumination method.
  • the backscattered light is collected from the pupil and measured by a detector.
  • Beam 1 forward scattered light.
  • Beam 2 collected backscattered light;
  • FIG. 30 shows a representation of the concept validation for focusing using dark background and a reflective bead.
  • a generic wavefront is shined on the surface. Since only the bead can reflect light, only a small amount of power comes back;
  • FIG. 31 shows a representation of the concept validation for focusing using dark background and a reflective bead. An optimized wavefront is shined on the surface. Light is focused on the bead reflecting back all the scattered light;
  • FIG. 32 shows a representation of the optical scheme used for focusing scattered light on a bead, according to an aspect of the present invention
  • FIG. 33A-33D shows intensity enhancement of a single bead using the presented method, according to an aspect.
  • FIG. 33A shows the reflectance before running the algorithm, while FIG. 33B shows the final result.
  • the same profile along one dimension is plotted for low NA in FIG. 33C and for maximum NA (open diaphragm) in FIG. 33D;
  • FIG. 34 is a schematic representation of the backscattered light's angular distribution with oblique illumination, according to an aspect of the present invention.
  • FIG. 35 is a schematic representation of how the beam is scattered at the surface of a scattering media, according to an aspect of the present invention.
  • the case of oblique illumination is represented, showing also an anisotropic scattering (the scattered beam is not symmetric with respect to the perpendicular to the surface);
  • FIG. 36 shows a schematic representation of the angular distribution of the emerging beams at different distances from the illuminating beam, according to an aspect of the present invention. The larger the distance, the more the emerging beam resembles a Lambertian distribution;
  • FIG. 37 shows a schematic representation of an illumination beam shined on the eye tissue, according to an aspect of the present invention. After travelling through the different layers it emerges as a scattered beam. After passing through the transparent retinal layers it is scattered back by deeper tissues (e.g. choroid). The backscattered light presents an oblique mean distribution and, passing again through the retina, the cells contained in the retina alter the phase of the light passing through them. After travelling through the eye it is collected by the eye lens and sent outside as a collimated beam;
  • FIG. 38 is a graph representing the two-dimensional (2D) Monte Carlo simulation of angular distribution of the backscattered light.
  • the illumination beam is considered to impinge with an angle of 45°.
  • the chosen scattering parameters are for choroidal tissue;
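As a rough illustration of the kind of Monte Carlo estimate shown in FIG. 38 (a sketch only: the scattering coefficient, anisotropy and albedo below are placeholder values, not the choroidal parameters used in the patent), a 2D photon random walk can histogram the exit angles of backscattered light for a 45° incident beam:

```python
import numpy as np

rng = np.random.default_rng(0)

def backscatter_angles(n_photons=20000, mu_s=10.0, g=0.8, albedo=0.95, incidence_deg=45.0):
    """2D Monte Carlo sketch of the angular distribution of light backscattered by a
    semi-infinite scattering layer under oblique incidence (illustrative parameters)."""
    exit_angles = []
    for _ in range(n_photons):
        theta = np.deg2rad(incidence_deg)       # propagation angle w.r.t. surface normal
        x, z = 0.0, 0.0                         # z > 0 is inside the tissue
        dirx, dirz = np.sin(theta), np.cos(theta)
        while True:
            step = rng.exponential(1.0 / mu_s)  # free path between scattering events
            x += dirx * step
            z += dirz * step
            if z <= 0:                          # photon re-emerges at the surface
                exit_angles.append(np.arctan2(dirx, -dirz))
                break
            if rng.random() > albedo:           # photon absorbed
                break
            # Henyey-Greenstein deflection in 2D, random left/right sign
            u = rng.random()
            cos_t = (1 + g**2 - ((1 - g**2) / (1 - g + 2*g*u))**2) / (2*g)
            dtheta = np.arccos(np.clip(cos_t, -1, 1)) * rng.choice((-1, 1))
            ang = np.arctan2(dirx, dirz) + dtheta
            dirx, dirz = np.sin(ang), np.cos(ang)
    return np.array(exit_angles)                # histogram this to see the oblique mean
```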
  • FIG. 39 shows a schematic representation of a flow chart for performing a measurement, according to an aspect of the present invention.
  • the optical system is positioned on the patient by following the embodiments detailed in FIGs. 9A to 28. After this step, each point is illuminated with one or more wavelengths covering a spectrum approximately between 400 nm and 1200 nm, and an image of the fundus is acquired through the eye lens. Each image is acquired sequentially.
  • the patient's pupil may be dilated, but is not limited to this case.
  • In the case of dark field images (minimum 1 illumination point), the image captured is directly the dark field image.
  • If the chosen method is phase imaging (minimum 2 illumination points), the acquired images first need to be processed to obtain a qualitative or quantitative phase image. When all the pictures have been acquired, the images can be post-processed;
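A minimal sketch of the acquisition loop described in this flow follows; `turn_on`, `turn_off` and `grab_frame` are hypothetical placeholders for the LED driver and camera trigger, not an actual API of the device.

```python
import numpy as np

def acquire_sequence(illumination_points, turn_on, turn_off, grab_frame):
    """Sequentially switch on one light emitting area at a time, grab a fundus image
    for each, and return the image stack for later phase/absorption reconstruction."""
    frames = []
    for point in illumination_points:
        turn_on(point)                 # illuminate the fundus obliquely from one point
        frames.append(grab_frame())    # camera exposure synchronized with the LED
        turn_off(point)
    return np.stack(frames)            # shape: (n_points, height, width)
```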
  • FIGs. 40A, 40B, and 40C show representations that illustrate a method for determining the angular spectrum of an illumination point. For every illumination point, a different shadow is cast off the vessels. The spatial frequency spectrum of the illumination can be determined by using the image of the shadows obtained through the eye lens. The knowledge of the spatial frequency spectrum of each illumination point is then used in the phase retrieval algorithm to provide a quantitative phase image. In addition, the aberration of the eye-lens system is also inferred by the iterative algorithm. This works because there are multiple presentations of the shadows. The more images of shadows at different angles, the more precise the phase image and aberration correction. The principle or method is illustrated for two different illumination points as shown in FIGs. 40A and 40B. FIG. 40C shows a typical transscleral illumination image of a human retina, exhibiting vessel shadows;
  • FIG. 41 is a schematic representation showing the stitching of different pictures in Fourier space, according to an aspect of the present invention. Since tilted illumination is equivalent to a shift in Fourier space, stitching, in the Fourier domain, pictures with different illumination angles is equivalent to obtaining a single picture with a larger Fourier domain. This results in a higher resolution image;
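A highly simplified sketch of the Fourier-space stitching idea of FIG. 41 is given below (for illustration only: it pastes each image spectrum at the shift given by its illumination angle and ignores the phase retrieval and overlap handling a full reconstruction would require; `shifts`, given in pixels per illumination angle, is an assumed input).

```python
import numpy as np

def stitch_fourier(images, shifts, upsample=2):
    """Place the spectrum of each tilted-illumination image at its corresponding
    offset in a larger Fourier canvas and return the higher-resolution intensity."""
    h, w = images[0].shape
    H, W = upsample * h, upsample * w
    canvas = np.zeros((H, W), dtype=complex)
    weight = np.zeros((H, W))
    for img, (dy, dx) in zip(images, shifts):
        spec = np.fft.fftshift(np.fft.fft2(img))
        y0, x0 = H // 2 - h // 2 + int(dy), W // 2 - w // 2 + int(dx)
        assert 0 <= y0 and y0 + h <= H and 0 <= x0 and x0 + w <= W, "shift too large"
        canvas[y0:y0 + h, x0:x0 + w] += spec   # paste shifted spectrum patch
        weight[y0:y0 + h, x0:x0 + w] += 1
    canvas /= np.maximum(weight, 1)            # average where patches overlap
    return np.abs(np.fft.ifft2(np.fft.ifftshift(canvas)))
```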
  • FIG. 42 shows a schematic view of a system or device used to perform ex-vivo measurements, according to an aspect of the present invention. Samples used for validation and proof of concept of phase imaging;
  • FIGs. 43A, 43B, and 43C depict measurements of the samples illustrated in FIG. 42. Comparison with digital holography providing quantitative phase measurement and confocal microscopy providing intensity measurement;
  • FIG. 44 depicts different phase measurements scanning a thick pig retina sample (180 um) in depth. The pictures show the different layers of the retina;
  • FIG. 45 shows a graph that represents a computation of cell density for different layers based on the scan of FIG. 44;
  • FIG. 46 shows a schematic representation of an optical system of the indirect ophthalmoscope used for the proof of concept measurements
  • FIG. 47 shows a schematic representation of a system for in-vivo imaging, according to an aspect of the present invention.
  • the LED used for transscleral illumination is synchronized with the acquisition of the camera.
  • the pupil plane is conjugated to the plane of the diaphragm D. In this way light scattered around the eye is filtered from the final picture.
  • the camera plane is conjugated with a plane in the retina, whose depth can be adjusted thanks to the Badal system;
  • FIG. 48 shows a schematic representation of an example of an optical system designed with an adaptive optics loop to correct for the aberrations of the eye, according to an aspect of the present invention. It integrates a wavefront sensor (WFS) and a deformable mirror (DM);
  • WFS wavefront sensor
  • DM deformable mirror
  • FIG. 49 shows on the left side a picture taken with the transepidermal method with a bottom center point of illumination, in the center a picture taken with the transepidermal method with a bottom left point of illumination, and on the right side the difference of the two showing the phase contrast;
  • FIG. 50 shows a phase gradient image obtained with the method according to an aspect of the present invention, with corresponding cross-section showing gradients for the vessels, and the optic disc;
  • FIG. 51 shows on the left side a picture taken with the transepidermal method with a bottom right point of illumination, in the center a picture taken with the transepidermal method with a bottom left point of illumination, and on the right side difference of the two showing the phase contrast;
  • FIG. 52 shows an example of a darkfield trans-epidermal image acquired with the system of FIG. 47;
  • FIG. 53 shows a schematic representation of an optical system for an interferometric measurement with scattered light, according to an aspect of the present invention.
  • a broadband light source, e.g. a superluminescent light emitting diode (SLD)
  • SLD superluminescent light emitting diode
  • the reference arm illuminates a mirror that can translate to scan the sample in depth.
  • the object arm illuminates the eye with the methods described in FIGs. 4 to 26.
  • the back-scattered light is collected by the pupil and interferes with the reference beam.
  • a Fourier domain method can be implemented (not shown in the picture) by decomposing the spectral content of the scattered beam to retrieve depth in the retina;
  • FIG. 54 shows a schematic representation of an operational scheme or method for a scanning system using transscleral illumination, according to an aspect of the present invention.
  • FIG. 55 depicts signals in the form of chronograms illustrating a lock-in acquisition, according to an aspect of the present invention.
  • a device for retinal imaging that can establish a phase and absorption contrast image thanks to oblique illumination of the retinal layers.
  • the device can be used for ex-vivo and in-vivo imaging. In the first part of this section, the ex-vivo case is described.
  • the phase and absorption contrast image can include, but is not limited to, a quantitative phase contrast image, a quantitative absorption image, a quantitative phase and absorption image, a qualitative phase contrast image, a qualitative absorption image, a qualitative phase and absorption image, and a dark field image.
  • the phase and absorption contrast image can include, but is not limited to, a one-dimensional image, a two-dimensional image, a three-dimensional image, or a multidimensional image.
  • An ex-vivo sample from the eye can include but is not limited to an entire eye, an untreated piece of an eye, a fixed piece of an eye, a stained piece of an eye, and an in-vitro sample.
  • phase contrast is obtained by oblique illumination generated by scattering in the deep layers of the eye.
  • the sample is illuminated with a light source at an oblique angle.
  • the source is placed on the same side as the imaging system, making a reflection configuration.
  • a scattering layer is placed behind the phase sample to provide a backscattered illumination.
  • the scattering layer providing the back illumination is the choroid of the eye.
  • the light source is scattered by a diffusing plate before reaching the sample.
  • a back reflective layer is added below the sample. For the reconstruction process, see below.
  • a light source can include, but is not limited to a light emitting diode, a super luminescent diode, a quantum dot source, a lamp, a blackbody radiation source, a low temporal coherence source, low spatial coherence source, and a laser source.
  • phase contrast is obtained by oblique illumination generated by scattering in the deep layers of the eye.
  • the features of the method, device, system can be simplified in the following categories: illumination type, light delivering device, image acquisition system, reconstruction process.
  • the light is passing through the sclera, the choroid and the retina.
  • the transmitted and scattered light illuminates the fundus.
  • no or very little light is entering the pupil-lens.
  • the light delivering device is in contact with the sclera.
  • transscleral in combination with transpupillary lighting can be used.
  • the light is passing through the skin layer near the eye, sclera, the choroid and the retina.
  • the transmitted and scattered light illuminates the fundus.
  • no or very little light is entering the pupil-lens.
  • light is passing through the pupil, directed to the side of the eye.
  • the rays are scattered and reflected towards the fundus.
  • light is passing through the pupil and directed on the fundus with a certain angle, generating also angled backscattered light.
  • light is passing through the pupil and shined on an area close to the imaged region.
  • Light scatters in the deep layer providing, behind the imaged region, an angled illumination.
  • light is passing through the temple.
  • the transmitted and scattered light illuminates the fundus. No light is entering the pupil's lens.
  • light is transmitted through the pupil or skin and sclera illuminating directly the imaged area of the retina and not its background, thus providing dark field contrast.
  • the wavefront is manipulated before entering in the eye.
  • the feedback light is collected through the lens of the eye.
  • Several schemes are then possible, for example the focusing of light on the fundus to compensate for the scattering to obtain a spot size smaller than the resolution of the eye-pupil (0.24 NA), the focusing being obtained with an iterative process, using the feedback light through the eye-pupil as criterion of optimization, and also the scanning of the eye's fundus by using a well-known memory effect in scattering media by adding a phase gradient to the wavefront.
  • the scanning pattern can include either an optimized focus spot or a speckle pattern.
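A minimal sketch of the iterative feedback optimization mentioned above is given here (stepwise optimization of one modulator segment at a time, using the light collected back through the eye pupil as the figure of merit); `set_segment_phase` and `measure_feedback` are hypothetical interfaces to a wavefront modulator and a detector, not part of the disclosure.

```python
import numpy as np

def optimize_wavefront(set_segment_phase, measure_feedback, n_segments, n_steps=8):
    """Stepwise sequential wavefront optimization: for each modulator segment, test a
    set of phase values and keep the one maximizing the feedback intensity."""
    phases = np.zeros(n_segments)
    test_values = np.linspace(0, 2 * np.pi, n_steps, endpoint=False)
    for i in range(n_segments):
        scores = []
        for phi in test_values:
            set_segment_phase(i, phi)       # apply trial phase to segment i
            scores.append(measure_feedback())
        phases[i] = test_values[int(np.argmax(scores))]
        set_segment_phase(i, phases[i])     # keep the best phase for this segment
    return phases
```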
  • the light delivering device can be designed for contact and non-contact.
  • the light delivery device for contact is presented in the following embodiments:
  • the light delivering device is made of a flexible electronic circuit that integrates the light emitting device and the electronic wires that bring the driving signals for the light delivering devices.
  • the light delivering device is in contact with the skin or sclera and the light delivering device in contact with the skin has a removable protection patch (one for each patient).
  • the parts that are in contact with the face of the patient whose eye is analyzed, which can include but are not limited to a head holder, a chin holder and the light delivering device, are covered with a removable disposable part.
  • a removable part can include but is not limited to a layer of paper, a stack of paper layers, a polymeric layer.
  • the shape of the flexible electronic circuit of the light delivering device is designed in an ergonomic way.
  • the flexible electronic circuit - patch is held with a head-mounted frame (e.g. a glasses frame) placed on the subject.
  • a head-mounted frame, e.g. a glasses frame
  • the contactless light delivering device is made of point sources which are imaged on the illumination surface (cornea, sclera or skin).
  • the contactless light delivering device consists of a circular source whose light is shined on the illumination surface (cornea, sclera or skin).
  • the contactless light delivering device consists of an annular source whose light is shined on the illumination surface (cornea, sclera or skin).
  • Illumination is provided thanks to a single or a combination of light sources in the wavelength range of 400 nm to 1200 nm such as but not limited to: pulsed or continuous laser source, light emitting diode, super luminescent diode, quantum dot source, a lamp, a black body radiation source, and a laser source.
  • Light is delivered by placing the source in direct contact with the tissue (sclera or skin) or guided from the source to the tissue or imaging the source on the illumination surface (cornea, sclera or skin).
  • Waveguiding components include but are not limited to: multimode fibers, capillary waveguides, lensed multimode fibers, single mode fibers and photonic crystal fibers.
  • Light beam can be converging, diverging or collimated, depending on the chosen illumination technique.
  • Light can be, but is not limited to, linearly polarized, circularly polarized, non-polarized (meaning that it does not present any known preferential polarization), or a mixture of different polarizations.
  • the image acquisition process is different depending on the required imaging modality: dark field or phase/absorption.
  • dark field imaging can be performed with just one illumination point without image processing.
  • a wider field of view is obtained by stitching together images obtained for different imaging areas.
  • An image of the retina is formed on the camera thanks to a series of lenses and mirrors.
  • a lens or a mirror is translated to change the focal plane in the retina.
  • a tunable lens that is used to change the focal plane in the retina.
  • a cylindrical lens is rotated and translated to compensate for the eye's astigmatism.
  • Two independent cylindrical lenses are translated to compensate for the eye's astigmatism.
  • Patient's prescription glasses are used to compensate for eye aberrations.
  • a deformable mirror is used to compensate for eye's aberration.
  • a wavefront sensor is conjugated to the pupil's plane to measure eye's aberrations.
  • a camera is conjugated to the pupil's plane to measure the illumination function.
  • a diaphragm is conjugated to the retina to select a small area of the retina for measuring the illumination function.
  • a camera is conjugated to the cornea to observe when the patient's eye is at the right position.
  • polarization optics can be used to stop light's backreflection at the different surfaces.
  • the illumination profile of the retina is not an even function in the range of collection NA.
  • the light used for illumination passes through the retina and its phase and intensity is affected by its optical properties.
  • the modulated light is recorded on a camera for different illumination functions. Pictures are processed together to reconstruct the phase and absorption image. The image quality is improved by increasing resolution and contrast via image processing. Anatomical features are extracted and analyzed to detect possible abnormalities.
  • Phase imaging requires at least two pictures captured with two different illumination points.
  • With a reconstruction algorithm it is possible to obtain a qualitative or quantitative phase image.
  • Such reconstruction algorithms known in the art can be, but not limited to, those described in L. Tian and L. Waller, "Quantitative differential phase contrast imaging in an LED array microscope," Opt. Exp. 23,9,11394-11403(2015), in Z. Phillips, M. Chen, L. Waller “Quantitative Phase Microscopy with Simultaneous Aberration Correction” Optics in the Life Science 2017 or in Int. Pat. Pub. No. WO 2015/179452, S. B. Mehta and C. J. R.
  • Phase imaging is based on the interference of the beam with itself due to phase differences in the object plane. This interference, in the case of a non-even illumination, results in a modulation of the intensity at the camera plane.
  • the intensity values in the image are related to the phase gradient in the image plane. Since this technique requires the illumination beam to be transmitted through the sample, it appeared to be impossible for thick biological media.
  • the backscattered light will show a different angular distribution depending on the distance from the beam incidence position, as seen in FIG. 36.
  • This uneven angle distribution results in a tilted averaged illumination, which can be used for providing oblique illumination.
  • a similar effect can be observed if the shining beam is not perpendicular to the surface, and the backscattered beam will present directionality, as shown in FIGs. 34 and 35.
  • This effect can be used in eyes such as but not limited to the human eye: when light is shined on the fundus with a certain angle, (e.g. when passing through the sclera) it travels through the transparent retinal layers and scatters in the deeper layers (e.g. pigmented epithelium and choroid). Here light scatters back, maintaining an oblique direction, and passing through the upper layers of the retina it is affected by retinal absorption and phase.
  • phase gradient imaging algorithm T. N Ford, K. K Chu and J. Mertz, "Phase-gradient microscopy in thick tissue with oblique back-illumination,” Nat. methods, 9, 12 (2012), S. B. Mehta and C. J. R.
  • FIG. 49 illustrates the principle for two different illumination points.
  • the human in-vivo fundus image is taken with transscleral illumination.
  • the shadow in FIG. 40 exhibits an effect of "twin" image for the vessels' tree, as also illustrated in images 55 and 56, due to the almost transparent layer between the vessels and the layer where the shadows are cast.
  • where N_i{ } is the i-th renormalization method
  • and Low{ } is a 2D lowpass filter. Together with that, it is convenient in many cases to also subtract from the resulting image its average, removing the zero-frequency component in Fourier space.
  • a normalization method can include, but is not limited to, the relationship defined in N_1{ }, the relationship defined in N_2{ } and the relationship defined in N_3{ }.
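One possible concrete form of such a renormalization is sketched below, under the assumption that Low{ } is implemented as a Gaussian low-pass filter; the filter width is an illustrative value, not a parameter specified by the disclosure.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def normalize_image(img, sigma=50.0):
    """Divide the image by a 2D low-pass estimate of its background (Low{I}) and
    subtract the mean, removing the zero-frequency component in Fourier space."""
    img = img.astype(float)
    low = gaussian_filter(img, sigma)      # Low{I}: slowly varying background
    out = img / (low + 1e-9)
    return out - out.mean()
```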
  • Ã and Φ̃ are the Fourier transforms of the absorption and phase profiles respectively, and A_1, G_1 and B_1 are functions dependent on the illumination function and the pupil's aberration in measurement 1, so that the Fourier transform of each measured image satisfies I_i = A_i + G_i·Ã + B_i·Φ̃. From this we can obtain the two profiles as:
  • Ã = [B_2·(I_1 − A_1) − B_1·(I_2 − A_2)] / (G_1·B_2 − G_2·B_1) and Φ̃ = [G_1·(I_2 − A_2) − G_2·(I_1 − A_1)] / (G_1·B_2 − G_2·B_1)
  • Each one of these methods requires a deconvolution step in which the DPC image is deconvolved with the transfer function. This can be performed with different methods such as, but not limited to: direct inversion, Wiener filtering, conjugate gradient minimization, maximum likelihood methods, and blind deconvolution methods.
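For illustration, a Tikhonov-regularized (Wiener-like) inversion of a DPC image given a precomputed transfer function could look as sketched below; the transfer function itself depends on the illumination function and the pupil and is assumed to be supplied, and the regularization weight is an arbitrary example value.

```python
import numpy as np

def tikhonov_deconvolve(dpc_image, transfer, alpha=1e-2):
    """Recover a (qualitative) phase map from a DPC image in the Fourier domain:
        PHI(u) = conj(H(u)) * DPC(u) / (|H(u)|^2 + alpha)
    where H is the DPC transfer function and alpha the regularization weight."""
    dpc_hat = np.fft.fft2(dpc_image)
    phi_hat = np.conj(transfer) * dpc_hat / (np.abs(transfer) ** 2 + alpha)
    return np.real(np.fft.ifft2(phi_hat))
```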
  • phase and absorption retrieval algorithm can include, but is not limited to: a Waller method with or without renormalization, a Waller modified method with or without renormalization, and a phase retrieval algorithm.
  • This method completely ignores the angular distribution of the backscattered light, allowing for the reconstruction without any other study on the illuminated surfaces.
  • Reconstruction is then provided by means of, but not limited to, inverse filtering, least-square filtering, constrained least-square filtering, Tikhonov regularization, blind deconvolution, or iterative filters, and it can be applied in either the spatial or the Fourier domain.
  • the image of FIG. 49 is obtained with this approximation, reconstructing the phase image with Tikhonov regularization.
  • Reconstruction can then be obtained by using the same techniques as mentioned in method 1.
  • the image shown in FIG. 51 is obtained with the function S(u) obtained from Monte-Carlo simulations and then reconstructing the phase image with Tikhonov regularization.
  • the parameters used for the simulations were obtained from Rovati et al. (L. Rovati, S. Cattini, N. Zambelli, F. Viola, and G. Staurenghi, "In-vivo diffusing-wave-spectroscopy measurements of the ocular fundus," Optics Express Vol. 15, Issue 7, pp. 4030-4038 (2007)) and from Curcio et al. (C. A. Curcio, J. D. Messinger, K. R. Sloan, A. Mitra, G. McGwin, and R. F. Spaide, "Human Chorioretinal Layer Thicknesses Measured in Macula-wide, High-Resolution Histologic Sections," Invest Ophthalmol Vis Sci. 2011 Jun; 52(7): 3943-3954).
  • Image quality can also be improved in the reconstruction process. Indeed, if the aberrations at the pupil plane are known the reconstruction restores the original image. If the aberrations are not known, it is still possible to estimate them using a blind-deconvolution approach, as in Phillips (Z. Phillips, M. Chen, L. Waller “Quantitative Phase Microscopy with Simultaneous Aberration Correction” Optics in the Life Science 2017).
  • an improved pattern recognition algorithm can be run for feature extraction.
  • Feature extraction is meant to work on retinal features such as, but not limited to, cells, nuclei and microvasculature present in the different retinal layers, for example the inner limiting membrane (ILM), retinal nerve fiber layer (RNFL), ganglion cell layer (GCL), inner plexiform layer (IPL), inner nuclear layer (INL), outer plexiform layer (OPL), outer nuclear layer (ONL), and external limiting membrane (ELM).
  • Feature extraction is performed with, but is not limited to, edge detection, corner detection, blob detection, ridge detection, the scale-invariant feature transform, or the Hough transform, or for instance based on deep learning software.
  • the deep learning software can be trained with pathologic and nonpathologic images (in vivo or ex vivo).
  • the feature extraction can be applied to medical information extraction in order to help the clinician analyze the data.
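As an illustrative sketch of classical feature extraction on a reconstructed phase or absorption image (an edge map plus blob candidates for cell nuclei), scikit-image is used as an assumed toolbox; the thresholds and scales are placeholder values, not values from the disclosure.

```python
import numpy as np
from skimage.feature import canny, blob_log

def extract_features(img, edge_sigma=2.0):
    """Return an edge map and a list of blob candidates (y, x, sigma) from a
    normalized retinal phase/absorption image."""
    img = (img - img.min()) / (np.ptp(img) + 1e-9)           # normalize to [0, 1]
    edges = canny(img, sigma=edge_sigma)                      # edge detection
    blobs = blob_log(img, min_sigma=2, max_sigma=10, threshold=0.05)
    return edges, blobs
```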
  • the illumination function can be estimated by placing a camera in a plane that is conjugate to the pupil's plane in case of curved surface (as in the case of the eye) or to the Fourier plane if the sample surface is flat (as in the case of a flat mounted ex-vivo sample).
  • a diaphragm in a plane conjugate to the sample's plane allows for a better selection of the scattering profile.
  • the measured illumination function is averaged over the samples area but, thanks to the diaphragm, this area can be limited and the illumination function can be measured locally.
  • a small aperture allows approximating a curved surface as locally flat. Under this consideration there is no more difference between placing the camera at the pupil or at the Fourier plane.
  • the image obtained at the pupil camera is the illumination function averaged over the area limited by the diaphragm aperture. If the illumination function needs to be measured pointwise over different areas of the sample the aperture can be replaced by a lens array. In this configuration, each lens creates the image on the camera of the local illumination function.
  • A limiting factor in the image quality is given by optical aberrations. They can be due to the eye's aberrations and to aberrations of the optical system. Aberrations usually produce a damping of high frequencies, resulting in a poorer image resolution.
  • a wavefront sensor can be part of the system. Many different devices can be used to perform this task, but the main ones used are Shack-Hartmann wavefront sensor and Tscherning wavefront sensor.
  • Changing the focal distance of the method, system, and device is useful both in case of eye defocus (myopia/hyperopia) and in the case in which a change of the focal plane is required for example but not limited to the performance of a stack of different planes.
  • This task can be performed in different ways, for example but not limited to translation of the focusing element (lens or curved mirror), translation of mirrors for increasing the path length (badal system), change of focal distance in a tunable lens, change in a deformable mirror.
  • myopia/hyperopia of a patient can be corrected in the measurement using prescription glasses or contact lenses of the patient.
  • Another common aberration in human eyes is due to astigmatism. This consists of a difference in the focal distance of the lens along two different axes. Because of that, astigmatism can be compensated by using, for example, translation of two independent cylindrical lenses, a deformable mirror, or prescription glasses or contact lenses of the patient.
  • both low and higher order aberrations can be compensated by placing a deformable mirror in a plane conjugate to the pupil's plane.
  • This configuration is more robust if coupled with a wavefront sensor in which the sensor is placed after the deformable mirror. In this way, the wavefront sensor can be used in a closed loop with the deformable mirror to compensate for aberrations.
  • dark-field illumination can be used in the present method, system, and device.
  • an illumination method is used that employs a range of illumination angles that is different from the range of collection angles.
  • transscleral illumination, associated with transpupillary collection, is considered a dark-field illumination.
  • transscleral FIGs. 9A, 9B
  • transepidermal FIGS. 10A, 10B
  • pupillary oblique illumination FIGS. 11A, 11B
  • pupillary direct illumination FIGS. 12A, 12B
  • through the temple FIG. 13
  • a scanning acquisition system can include, but is not limited to a confocal scanning acquisition, an optical coherence tomography acquisition, a shifted pupil scanning acquisition, a split detector acquisition, and a lock-in scanning acquisition.
  • a detector can include but is not limited to a single pixel detector, a line camera and a 2D detector.
  • a single pixel detector can include but is not limited to a photodiode, an avalanche photodiode, a photomultiplier tubes, a micro photomultipliers tube, a lock-in single pixel detector and a split detector composed of single pixel detectors.
  • a two-dimensional detector can include, but is not limited to a lock-in multipixel detector, a CMOS camera, an sCMOS camera, a CCD camera, and a 2D split detector.
  • FIGs. 9A and 9B represent the transscleral illumination method: light 16 is shined directly on the scleral tissue 9.
  • Light can be delivered by direct contact of the source or the light waveguiding component, or in a non-contact manner with an optical beam (collimated, focused, diverging or with structured illumination). Some examples of light position are symbolized by discs 45.
  • the scattering properties of sclera 9 and underneath layers 10, 11 produce a diffused beam 19 that illuminates the fundus with a high angle. In case of physical contact with the sclera, local anesthesia can be used to make the measurement more comfortable to the patient.
  • Structured illumination can include but is not limited to a sinusoidal phase pattern, a sinusoidal intensity pattern, a light pattern modulated in intensity with a micromirror array, a light pattern modulated in phase and/or with a spatial light modulator.
  • Waveguides and waveguiding components can include but are not limited to: single mode fibers, multimode fibers, capillary waveguides, lensed multimode fibers and photonics crystal fibers.
  • FIGs. 10A and 10B represent the transepidermal illumination method: light 16 is shined on the upper 14 and/or lower 15 eyelid and from there, scattered 19 through different layers up to the inside of the eye 1.
  • Light can be delivered by direct contact (of the source or a waveguiding component) 27 or with an optical beam 16 (collimated or not). Some examples of light positions are symbolized by discs 45. Many point sources that are spatially separated provide different angles of illumination. Also contact with the skin can be more comfortable for a patient as it does not require anesthesia.
  • FIGs. 11A and 11B represent the pupillary illumination method: light 17 is shined on the inside layers of the eye after passing through the pupil 4 and the lens 5. The light is scattered on the inside of the eye after back reflection from the focal point 28 and illuminates the eye fundus.
  • Light can be delivered by direct contact (of the source or a waveguiding component) on the cornea 3 or in a non-contact manner with an optical beam (collimated or not).
  • Another illumination method is based on direct illumination of the eye fundus. Once light reaches the eye fundus the backscattered light is modulated by the retina and then collected for imaging purposes. In this configuration light can shine either on the background of the imaged area (brightfield) or only on the side of it (darkfield).
  • Referring to FIGs. 12A and 12B, light is sent through the pupil directly onto the imaged retinal area. However, light is sent at such an angle that it does not reach the RPE at the back of the imaged retinal area. In this way the background appears dark. The light collected from this retinal area is not given by modulation of the background light, but by diffraction from the retinal features.
  • FIG. 13 shows that the eye fundus can also be illuminated through transtemporal illumination. Light is shined on the patient's temple and from there it scatters into the eye.
  • the beam shape can be modified thanks to, but not limited to, optical methods, wavelength choice, wavefront shaping.
  • a transepidermal illumination method is represented.
  • Light 16 is shined on the upper 14 and lower 15 eyelid either simultaneously or sequentially or in any combination and from there, scattered 19 through different layers towards the inside of the eye 1.
  • Light is delivered by direct contact of the light source 27 to the skin 15.
  • a transparent or scattering media can be present between source and skin, to expand the illuminated area and decrease the power density on the skin.
  • Illustrative examples of light source positions are symbolized by discs 45. Many point sources that are spatially separated provide different angles of illumination to the inside of the eye. Also contact with the skin can be more comfortable for a patient as it does not require anesthetic lubricant as opposed to the case of a light source in contact with the eye (sclera, cornea).
  • the transscleral illumination system is connected to a master driver board.
  • the board provides driving signals for all the LEDs connected to it, as well as the trigger signal for the imaging device in order to synchronize the illumination to the acquisition system.
  • By turning on different LEDs, a different illumination spectrum, both in terms of emitting wavelength and angular spectrum, can be generated.
  • By changing the driving current it is possible to change total intensity, shape of the power spectrum and spatial distribution of light.
  • Configuration 2 (contact PCB): Referring to FIG. 24, a similar illumination principle to configuration 1 is shown, i.e. transepidermal illumination before passing through the sclera.
  • the light emitting device and its flexible part has a continuous light source on top and bottom of the eye, following the arc shape of the eyelids.
  • the continuous source is composed of pixels, each one of those can be switched ON or OFF independently.
  • the light emitting device is a LED, having an encapsulation of a transparent material (such as but not limited to epoxy and Polydimethylsiloxane) with a diameter of a few millimeters.
  • the LEDs are placed in contact with the skin of the eyelids of the patient. The number of LEDs is not limited to 4.
  • when the illumination is provided in a non-contact fashion, the beam illuminating the eye or the surrounding tissues can be focused, collimated or diverging.
  • light directed to a scattering tissue is provided by, but is not limited to a light beam 16 and a rotating wheel 39 pierced with a small hole 41.
  • a light beam illuminates the whole surface of the wheel, in such a way that light passes only through the hole 40 and illuminates only one point on the sclera 9.
  • the illumination point can be on the skin 14, 15 surrounding the eye, or even on the lateral side of the eye, referring to FIGs 4, 8 and 9.
  • a fiber 18 which can be, but is not limited to a single mode or multimode fiber.
  • a rotating wheel 39 holds a fiber 18 and a lens 22 that focuses light on the sclera 9 or on the skin 14, 15.
  • the fiber holder is designed in such a way that the fiber can rotate freely, without introducing stress in the fiber.
  • the disc holding the fibers is spun a limited number of times, preventing the fiber from getting coiled around the rotating arm.
  • Another solution for preventing coiling consists of rotating the disc from the sides (and so removing any rotating arm).
  • Another embodiment consists of a series of light sources arranged on a fixed structure shaped like, but not limited to, a circle (annulus).
  • the light beams 74 can be separated, as shown in FIGs. 16 and 18.
  • the apparatus is configured to send the light in a non-contact manner, referring to FIGs. 10A to 14C.
  • FIG. 54 shows an exemplary schematic view of a scanning system for inspecting eye 541, according to an aspect of the present invention.
  • the use of a scanning system for signal collection allows for a better depth selectivity of the imaged layer of the sample. It can be used for collecting either phase/absorption or darkfield information.
  • the system uses a scanning element, such as but not limited to a two-axis scanning mirror 543 and 545 for scanning the collection beam along the imaged area on eye 541.
  • Other mirrors 540, 544, 548 and 549 are used to reflect the light towards a detector.
  • a diaphragm or a pinhole 546 is then placed in a plane conjugate to the imaging plane. In this way depth selection is improved.
  • the use of a scanning system can be also combined with hardware adaptive optics.
  • the signal is collected by a single pixel detector 547 such as but not limited to photodiode, avalanche photodiode, photomultiplier tube (PMT), micro-PMT which can be used for a standard acquisition or in a lock-in mode.
  • a lock-in acquisition can be efficiently integrated to the system.
  • the lock-in acquisition can be performed using a lock-in camera, for instance in flood illumination, or a single detector, for instance in a scanning system.
  • the output of the camera/detector is then the DPC image.
  • several DPC signals can be integrated to produce an output that is an average value of the DPC signal. This increases the SNR.
  • removing the background at an early stage of the readout chain provides a more efficient use of the digital resources.
  • a first embodiment includes the viewing (i.e.
  • Another embodiment is to place markers on the surface of the eye, such as embedded within a contact lens or by displaying non-contact markers or any high resolution eye tracker, for the purpose of obtaining feedback to always illuminate the same area to provide a constant speckle pattern.
  • Another embodiment for image reconstruction is based on scanning a single focus spot. Due to wavefront shaping, it is possible to transform the speckle pattern into a single spot (of the same size as the original speckle). Similarly, in this method, the memory effect is used to scan the spot. By collecting the reflected intensity at each point, it is possible to reconstruct the intensity profile of the entire fundus image.
  • the main difficulty of this embodiment lies in the focusing part, since the limited resolution caused by the pupil does not allow for the measurement of the transmission matrix.
  • An alternative is the use of an iterative process, such as, but not limited to, a genetic algorithm (GA). (D. Conkey, A. Brown, A. Caravaca-Aguirre, and R.
  • FIG. 53 shows the optical principle scheme for optical coherence tomography measurement with scattered light.
  • a broadband light source 53 e.g. SLD
  • the reference 50 illuminates a mirror 54 that can translate to scan the sample in depth.
  • the object arm illuminates the eye 1 through the skin 8 and/or the sclera 9, the choroid 10 and the retina 11.
  • back-scattered light is collected by the pupil 4 and interferes with the reference beam.
  • the interfering beam passes through an imaging optics blocks 49 and is recorded by a detector 48.
  • Elements of the system can be combined with other imaging modalities, for example but not limited to OCT, fluorescence imaging, and magnetic resonance imaging (MRI), in a single platform to obtain and merge medical information helping with the diagnostic, so as to establish a multimodal retinal imaging platform.
  • Phase imaging of the retina with aspects of the present invention can be performed with infrared light.
  • the human eye is not sensitive to infrared light.
  • some retinal functions can be imaged by stimulating the retina with visible wavelengths, for example through the pupil or the sclera, to perform functional retina imaging.
  • the response of the photoreceptors to different wavelengths can be studied.
  • the functional analysis method can include but is not limited to a deep learning algorithm.
  • An ophthalmic imaging system can include, but is not limited to, an optical coherence tomography system, an eye-fundus imaging system, a slit illumination imaging system, a fluorescein angiography imaging system, an indocyanine green angiography imaging system, a fundus autofluorescence imaging system, a corneal topography imaging system, an endothelial cell-layer photography imaging system, and a specular microscopy system, to provide multimodal imaging of the eye tissues.
  • the same imaging method can be applied for imaging the anterior part of the eye.
  • the anterior eye tissue can include, but is not limited to, the eye lens, the endothelium, and the cornea. Light scattered from the eye fundus or from the pupil is transmitted through these layers and modulated in intensity and phase.
  • the use of an imaging system whose focal plane is not the retina, but rather the front layers of the eye e.g. the corneal endothelium
  • each point (one at a time or multiple together) is illuminated with one or more wavelengths selected in a spectrum approximately between 400 nm and 1200 nm, and an image of the fundus is acquired through the eye lens. Each image is acquired sequentially.
  • the patient's pupil may or may not be dilated.
  • for dark field images (minimum 1 illumination point), the image captured is directly the dark field image.
  • for phase imaging (minimum 2 illumination points), the acquired images first need to be processed to obtain a qualitative or quantitative phase image. When all the pictures have been acquired, the images are post-processed.
  • With respect to measurements related to in vivo phase imaging, for a proof of principle demonstration an indirect ophthalmoscope has been built, as shown in FIG. 46.
  • the aspherical lens 30 allows a field of view in the fundus of about 60°.
  • the camera objective 33 focuses at the image plane of the lens 30 and the camera 32 records the inverted image 31 of the eye's fundus.
  • FIGs. 49, 50, 51 show two dark field images 35, 36 for two individuals and their corresponding phase gradient images 37 obtained by subtracting the two dark field images, using the relation of Equation (1). The illuminating wavelength was 643 nm.
  • FIG. 49 shows, on the left, a picture taken with transepidermal illumination from a bottom-center illumination point; in the center, a picture taken with transepidermal illumination from a bottom-left illumination point; and, on the right, the difference of the two, showing the phase contrast.
  • FIG. 50 shows the phase gradient image 37 with a line profile 38 showing, for example, the gradient of the optic disc.
  • FIG. 51 shows, on the left, a picture taken with transepidermal illumination from a bottom-right illumination point; in the center, a picture taken with transepidermal illumination from a bottom-left illumination point; and, on the right, the difference of the two, showing the phase contrast.
  • Referring to FIG. 47, in a second step another ophthalmoscope has been built in order to obtain a smaller field of view. It includes a stage for adjusting the focus, a fixation target for the patient, and two telescopes formed by the eye lens and the first, second and third lenses (from left to right). In addition, a diaphragm is placed at the pupil plane to filter the beam. Finally, a high sensitivity camera records the retinal image.
  • FIG. 52 shows an example of a 2x2 mm2 field of view of a retina with trans- epidermal LED (peak wavelength at 870 nm) illumination.
  • In a third step, a system has been designed to correct the aberrations using a wavefront sensor and a deformable mirror in closed loop, using an aberration correction method.
  • An aberration correction method can include but is not limited to a deformable mirror, a spatial light modulator, a Badal system, a tunable lens, a series of cylindrical lenses, a Waller method, a modified Waller method, and a blind deconvolution algorithm.
  • With respect to measurements related to ex vivo phase imaging, referring to FIG. 42, a microscope has been built having similar parameters to the in-vivo imaging case. It includes an oblique illumination being scattered first by a scattering plate, a reflecting layer that provides the backscattered light, a microscope objective, an imaging lens and a camera.
  • FIG. 43 shows the ex-vivo measurement results, including a resolution assessment and a comparison of the image obtained with the invention against a digital holographic microscope providing a quantitative phase image.
  • FIG. 43 also shows a comparison with images acquired with a confocal microscope providing intensity images.
  • the reflected light is collected by the objective, and passes through a diaphragm in order to artificially decrease the NA of the detection to mimic the limited resolution of the eye pupil.
  • the sample of microbeads is used to reproduce the situation of high reflectivity features that the detection system cannot resolve.
  • FIG. 33 shows the results of the focusing process performed for one 10 μm diameter bead, and a detection NA of 0.02.
  • the procedure is as follows: an image is recorded at the maximum resolution (curve before optimization in FIG. 33D), then the diaphragm is closed to optimize the wavefront (FIG. 33C). The low resolution PSF before and after optimization is shown in FIG. 33C. Finally, the diaphragm is opened to record the optimized high resolution PSF (curve after optimization in FIG. 33D). Two-dimensional images before (FIG. 33A) and after optimization (FIG. 33B) are shown.
  • in the case of multiple beads, the focusing is not as good as in the one bead case, so a method is developed to discriminate the beads thanks to the shape of their PSF. If two beads are closer than the resolution distance, the collected image would be similar to the PSF. Again, the ratio between the maximum and the total energy will change depending on the distance between the centers. This parameter can be used to discriminate between the one bead and the multiple beads cases; a short numerical sketch of this ratio metric is given at the end of this list.
  • Various applications can be performed with the present device, system, and method.
  • Applications include quantitative phase imaging of the retinal layer on top of the photoreceptors, between the inner and external limiting membranes, for example inner limiting membrane (ILM), retinal nerve fiber layer (RNFL), ganglion cell layer (GCL), inner plexiform layer (IPL), inner nuclear layer (INL), outer plexiform layer (OPL), outer nuclear layer (ONL), external limiting membrane (ELM).
  • the proposed method can provide dark field images of the choroid and RPE (retinal pigmented epithelium), allowing for imaging of choroidal tumors with enhanced contrast and of the choroidal microvasculature.
  • the vision process of the eye is determined by the very first layers of the retina. Before reaching the photoreceptor cells, light entering the eye needs to pass through a layer, having a thickness of approximately 100 μm, of ganglion cells and neurons forming the retina. These cells are phase objects and are therefore difficult to see with standard imaging techniques. Indeed, phase imaging methods usually need the illumination system to be on the opposite side of the sample with respect to the imaging system, making this impossible to perform in-vivo. However, the possibility of performing phase imaging from one side using properties of scattering media has been shown.
  • a system for performing qualitative as well as quantitative imaging in the fundus of the eye with oblique illumination.
  • the use of different illumination points, through the pupil, on the sclera itself or directly on the skin covering the sclera, provides, thanks to the scattering properties of the eye, oblique back-illumination, allowing for phase contrast images.
  • phase contrast images can be used to reconstruct pictures containing only phase or absorption information.
  • the same illumination scheme can be used for collecting dark field images from the pupil.
  • the use of incoherent illumination allows doubling the resolution of the recovered image compared to coherent imaging.
  • it has been shown how phase contrast can be obtained and how the absolute absorption and phase profiles can be obtained in two dimensions (2D) and three dimensions (3D).
  • different illumination modes have been shown to provide phase contrast, together with different apparatus for providing this illumination and acquiring a picture.
  • Algorithms have been discussed for reconstructing 2D and 3D phase and absorption profiles.
  • secondary information that can be obtained with this technique, as well as different improvements, has been discussed.
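
As referenced in the bullet on discriminating single from multiple beads, the following is a minimal sketch of the peak-to-total energy ratio, assuming a background-subtracted image patch centered on the detected spot; the function name and normalization are illustrative, not part of this disclosure.

```python
import numpy as np

def peak_to_total_ratio(patch: np.ndarray) -> float:
    """Ratio between the maximum pixel value and the total energy of a
    background-subtracted spot image; a single unresolved bead yields a
    higher ratio than two closely spaced beads, so the value can be
    thresholded to discriminate the two cases."""
    patch = patch.astype(np.float64)
    total = patch.sum()
    return float(patch.max() / total) if total > 0 else 0.0
```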

Landscapes

  • Life Sciences & Earth Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Biophysics (AREA)
  • Ophthalmology & Optometry (AREA)
  • Engineering & Computer Science (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Physics & Mathematics (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Eye Examination Apparatus (AREA)
  • Microscopes, Condenser (AREA)

Abstract

A method for imaging a tissue of an eye, the method including the steps of providing oblique illumination to the eye by a plurality of light emitting areas of a light delivery device, the plurality of light emitting areas being independently controllable and arranged to direct light towards at least one of a retina and an iris of the eye, causing an output beam from light backscattered from the at least one of the retina and the iris by the oblique illumination, capturing the output beam with an imaging system to provide a sequence of images of a fundus of the eye, and retrieving a phase and absorption contrast image from the sequence of images of the fundus, wherein the sequence of images of the fundus of the step of capturing is obtained by sequentially turning on one or more of the plurality of light emitting areas at a time in the step of providing the oblique illumination.

Description

SYSTEM, METHOD AND APPARATUS FOR RETINAL ABSORPTION PHASE AND DARK FIELD IMAGING WITH OBLIQUE ILLUMINATION
CROSS-REFERENCE TO RELATED APPLICATIONS
(1) The present application claims priority to the International patent applications PCT/IB2016/052787 filed on May 13, 2016 and PCT/IB2016/056806 filed on November 11, 2016, both of these patent applications herewith incorporated by reference in their entirety.
FIELD OF THE INVENTION
(2) The present invention relates to high resolution quantitative and qualitative absorption, phase and dark field imaging of the retina by the use of oblique illumination.
BACKGROUND
(3) Standard photography relies on the difference in absorption of different features for providing contrast. This is also the case in conventional funduscopy where blood vessels, photoreceptors and other retinal structures are observed thanks to their different reflectivity values that provide intensity modulation at the sensor plane. This is not the case for most of the cells lying in the inner retina (ganglions, nuclear and plexiform layers), whose absorption and scattering values are so low to show almost no contrast even at high resolution. Furthermore, the intensity modulation of these features is negligible with respect to the background modulation signal (due to underlying features) in combination with noise. Even with optical coherence tomography (OCT), the weak contrast of these cells make the retina appear as a smooth layer, almost free of features.
(4) Tian et al. showed that images of a phase sample (a sample with weakly absorptive features) acquired with different angles of illumination can be used to obtain a single phase image of the sample (L. Tian and L. Waller, "Quantitative differential phase contrast imaging in an LED array microscope," Opt. Exp. 23,9, pp. 11394-11403 (2015)). In their method, where illumination is provided in transmission, they used a Wiener-filter-based algorithm to reconstruct the phase image. Such a method is not directly applicable to living biological media, since transmission illumination is, in general, not possible. A solution to this problem has been proposed by Ford et al. (T. N Ford, K. K Chu and J. Mertz, "Phase-gradient microscopy in thick tissue with oblique back-illumination," Nat. Methods, 9, 12 (2012)), where the light from the deep layer of the sample is used as a secondary source illuminating the top layers in a transmission fashion. For providing oblique illumination, Ford et al. shined light on the sample at a point p1, showing that the area at a certain distance d was back illuminated in an oblique fashion. By subtracting the pictures obtained with two opposite illumination points it was possible to reconstruct a phase contrast image of the sample.
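As a minimal illustration of the subtraction step just described, the following sketch forms a qualitative phase-gradient image from two registered images acquired with opposite oblique illumination points; the normalized-difference convention used here is a common differential phase contrast choice, not necessarily the exact formula of the cited work or of Equation (1) of this disclosure.

```python
import numpy as np

def phase_gradient_contrast(i_left: np.ndarray, i_right: np.ndarray,
                            eps: float = 1e-6) -> np.ndarray:
    """Qualitative phase-gradient (DPC-style) image from two images
    acquired with opposite oblique illumination points.

    The normalized difference cancels the common absorption background
    and keeps the asymmetric, phase-gradient-dependent part of the signal.
    """
    i_left = i_left.astype(np.float64)
    i_right = i_right.astype(np.float64)
    return (i_left - i_right) / (i_left + i_right + eps)

# Usage (hypothetical images): dpc = phase_gradient_contrast(img_left, img_right)
```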
(5) Dark field images of the retina have been studied for the case of illumination through the pupil. In these studies, an enhanced contrast of the vasculature has been shown (D. Scoles, Y. N. Sulai and A. Dubra "In vivo dark-field imaging of the retinal pigment epithelium cell mosaic," Biomed. Opt. Exp.4,9, pp. 1710-1723 (2013), T. Y. P. Chui, D. A. VanNasdale, and S. A. Burns, "The use of forward scatter to improve retinal vascular imaging with an adaptive optics scanning laser ophthalmoscope," Biomed. Opt. Exp.3, 10, pp. 2537-2549 (2012)).
(6) By combining several dark field images of a sample can lead to an image containing the partial derivative of the phase information by subtracting two images taken from asymmetric illumination. This has been shown either in transmission for a microscope (Z. Liu, S. Liu and L. Waller "Real-time brightfield, darkfield, and phase contrast imaging in a light emitting diode array microscope," J. of Biomed. Opt. 19, 10, 106002 (2014)) or in reflection using oblique back-illumination in an endoscope (Int. Pat. Pub. No. WO
2013/148360, T. N Ford, K. K Chu and J. Mertz, "Phase-gradient microscopy in thick tissue with oblique back-illumination," Nat. Methods, 9, 12 (2012)). Moreover, knowing the angle of illumination of the sample, i.e. its spectrum, quantitative phase information can be recovered using a weak object transfer function model (S. B. Mehta and C. J. R. Sheppard, "Quantitative phase-gradient imaging at high resolution with asymmetric illumination-based differential phase contrast," Opt. Lett. 34,13, pp. 1924-1926 (2009), L. Tian and L. Waller, "Quantitative differential phase contrast imaging in an LED array microscope," Opt. Exp. 23,9, pp. 11394-11403 (2015)) or a Fourier ptychography algorithm (Int. Pat. Pub. No. WO 2015/179452, G. Zheng, R. Horstmeyer, and C. Yang, "Wide-field, high-resolution Fourier ptychographic microscopy"). A similar approach can also be used for reconstructing a 3D picture of the sample ("Microscopy refocusing and dark-field imaging by using a simple LED array," G. Zheng, C. Kolner, and C. Yang, Optics Letters, pp. 3987-3989 (2011)).
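Below is a compact sketch of a Tikhonov-regularized weak-object transfer-function inversion of the kind referenced above, assuming the phase transfer function of each illumination pattern has been computed beforehand from the known angular spectrum; the regularization constant and function names are illustrative and not the specific implementation of the cited works.

```python
import numpy as np

def quantitative_phase(dpc_images, transfer_functions, alpha=1e-3):
    """Tikhonov-regularized inversion of DPC images into a quantitative
    phase map, following the weak-object transfer-function approach.

    dpc_images         : list of 2D arrays (normalized DPC images)
    transfer_functions : list of 2D complex arrays H_i (same shape),
                         one phase transfer function per illumination
    alpha              : regularization constant (assumed value)
    """
    num = np.zeros_like(transfer_functions[0], dtype=complex)
    den = np.zeros_like(transfer_functions[0], dtype=float)
    for i_dpc, h in zip(dpc_images, transfer_functions):
        num += np.conj(h) * np.fft.fft2(i_dpc)   # cross-correlate in Fourier space
        den += np.abs(h) ** 2                    # accumulate transfer-function power
    phase_spectrum = num / (den + alpha)         # regularized deconvolution
    return np.real(np.fft.ifft2(phase_spectrum))
```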
(7) Finally, processing the images also allows correcting the eye's aberrations by optimizing Fourier properties of the images (U.S. Pat. No. 8,731,272; Z. Phillips, M. Chen, L. Waller, "Quantitative Phase Microscopy with Simultaneous Aberration Correction," Optics in the Life Sciences Congress, OSA Technical Digest (online), Optical Society of America, 2017, paper JTu5A.2).
(8) Few attempts have been reported to obtain dark field images with higher contrast than that obtained via the trans-pupil illumination described above.
(9) One study uses transscleral illumination, i.e. light is provided to the fundus via illumination through the sclera, to obtain higher contrast dark field images of the retina (A. Schalenbourg, L. Zografos, "Pitfalls in colour photography of choroidal tumours," Eye. 2013, Vol. 27(2), pp. 224-229). In U.S. Pat. No. 7,387,385 in Figure 21, U.S. Pat. Pub. No. 2007/0159600, and U.S. Pat. Pub. No. 2007/0030448 in Figure 22, a transscleral illumination with several differing wavelengths (red, green, blue) is used to make one image with several wavelengths simultaneously. This resulting image is used to diagnose choroidal tumors. Contrary to full field illumination through the eye lens (called transpupillary illumination), transscleral illumination allows collecting only light coming from underneath the approximately 100 μm thick first layer of the retina. This is because the high reflectance (at and near specular) coming from the surface is blocked by the eye pupil. The tumors absorb much more light than healthy tissue because of the intense cellular and vasculature activity in tumorous tissue. Because the near specular reflected light is blocked, the transscleral image of the tumor has more contrast than that obtained with transpupillary illumination and hence allows a better diagnostic of the spatial extent of the tumor (A. Schalenbourg, L. Zografos, "Pitfalls in colour photography of choroidal tumours," Eye. 2013, Vol. 27(2), pp. 224-229).
(10) The transscleral methods above are described using one illumination point or sometimes with two illumination points where the two point sources provide illumination simultaneously. Here, by point, we mean "point source like" such as a small area. It can be an area larger than the area given by the diffraction limit.
(11) However, none of the techniques described above provides phase image information, in either a quantitative or non-quantitative manner, of the inside of the eye without the use of a scanning system. This phase contrast applies to, but is not limited to, the fundus and the retina. Accordingly, there is a need to obtain phase information from the biological material above the photoreceptors to obtain improved contrast and improved image resolution, and additionally to derive functional information that exists from the large body of research in quantitative phase imaging in biology.
(12) By quantitative phase imaging we mean an imaging technique for which a well-known relationship (such as but not limited to linear or logarithmic) exists between the grayscale pixel value of the camera and the corresponding phase that the physical sample imparted on the light traversing it. Phase unwrapping techniques can be also used to remove the effect of phase periodicity and obtain more detailed images.
(13) In contrast, the absence of an absolute relationship between the grayscale pixel value of the camera and the corresponding phase that the physical sample imparted on the light traversing it is termed as non-quantitative or qualitative phase imaging. An example of qualitative phase imaging is phase gradient contrast and it is given in T. N Ford, K. K Chu and J. Mertz, "Phase-gradient microscopy in thick tissue with oblique back-illumination," Nat. methods, 9, 12 (2012).
(14) Retinal image refinement has been obtained in different ways. One of the main characteristics of the human retina is the presence of cone cells. At high magnifications cone cells appear like bright spots on a dark background, allowing for guiding-star reconstruction algorithms (N. Meitav and E. N. Ribak, "Estimation of the ocular point spread function by retina modeling," Optics Letters, Vol. 37(9) (2012) and N. D. Shemonski, F. A. South, T. Z. Liu, S. G. Adie, P. S. Carney and S. A. Boppart, "Computational high-resolution optical imaging of the living human retina," Nature Photonics (2015)). Another method takes advantage of natural movements of the eye (saccades). While most studies try to suppress this phenomenon, which leads to a lower resolution in case of averaging, Meitav and Ribak used averaging after finding the relative shift of each image (N. Meitav and E. N. Ribak, "Improving retinal image resolution with iterative weighted shift-and-add," J. Opt. Soc. Am. Vol. 28(7) (2011)). This has been done using image correlation and allows aligning all the images while performing the average. Due to eye motion, each image presents a different aberrated PSF (point spread function); the resulting averaged image shows an averaged PSF, filtering out the highest order aberrations.
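To illustrate the registration-and-averaging step just described, the following is a minimal sketch assuming a stack of already acquired fundus frames; estimating the per-frame shift with skimage's phase_cross_correlation is one possible choice and is not the specific iterative weighted procedure of Meitav and Ribak.

```python
import numpy as np
from scipy.ndimage import shift as nd_shift
from skimage.registration import phase_cross_correlation

def shift_and_add(frames):
    """Align a stack of retinal frames to the first frame by
    cross-correlation and return their average (simple shift-and-add)."""
    reference = frames[0].astype(np.float64)
    accumulator = reference.copy()
    for frame in frames[1:]:
        frame = frame.astype(np.float64)
        estimated_shift, _, _ = phase_cross_correlation(reference, frame)
        accumulator += nd_shift(frame, estimated_shift)  # undo the saccade shift
    return accumulator / len(frames)
```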
(15) One can then obtain a PSF that is much more similar to the diffraction limited one. The effect of eye aberrations can be further decreased by deconvolving the picture with different PSFs and estimating the best one. This process has been performed by Hillmann et al. (D. Hillmann, H. Spahr, C. Hain, H. Sudkamp, G. Franke, C. Pfaffle, C. Winter, and G. Huttmann, "Aberration-free volumetric high-speed imaging of in vivo retina," Scientific Reports, Vol. 6 (2016)). In their work, deconvolution is obtained with different PSFs based on the typical eye aberrations. Entropy is used as a measurement of image quality for estimating the best correction and producing high resolution pictures. Furthermore, the phase and absorption reconstruction using Waller's method (L. Tian and L. Waller, "Quantitative differential phase contrast imaging in an LED array microscope," Opt. Exp. 23,9, pp. 11394-11403 (2015)) already consists of a deconvolution process. The method can be used to obtain phase and absorption information also in the case of a known or an unknown aberrated pupil (Z. Phillips, M. Chen, L. Waller, "Quantitative Phase Microscopy with Simultaneous Aberration Correction," Optics in the Life Sciences 2017).
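The following is a compact sketch of the deconvolve-and-score idea attributed to Hillmann et al. above, assuming the candidate PSFs have been generated elsewhere (for example from typical ocular aberrations); Richardson-Lucy deconvolution and Shannon entropy are used here as generic stand-ins, not the exact processing of that work.

```python
import numpy as np
from skimage.restoration import richardson_lucy
from skimage.measure import shannon_entropy

def best_deconvolution(image, candidate_psfs):
    """Deconvolve with each candidate PSF and keep the result with the
    lowest entropy, used here as a simple sharpness metric (sharper,
    less aberrated images tend to have lower entropy)."""
    image = image.astype(np.float64)
    image /= image.max()                       # richardson_lucy expects values in [0, 1]
    best_img, best_score = None, np.inf
    for psf in candidate_psfs:                 # candidate PSFs assumed normalized
        restored = richardson_lucy(image, psf)
        score = shannon_entropy(restored)
        if score < best_score:
            best_img, best_score = restored, score
    return best_img
```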
(16) High resolution imaging of the living retina is performed by aberration correction. This task is done computationally, as presented in U.S. Pat. No. 8,731,272, or using hardware devices. In vivo imaging of the retina is usually performed by a scanning system, U.S. Pat. No. 4,213,678, possibly coupled to adaptive optics, European Pat. No. 1427328 A1, or using a camera flood illumination system, also sometimes coupled to adaptive optics, as for instance in U.S. Pat. Pub. No. 2004/0189941 and U.S. Pat. No. 7,364,296.
(17) The resolution in optical imaging of the eye is mostly limited by three factors: the pupil's numerical aperture (0.24 maximum (R. K. Wang and V. V. Tuchin, "Advanced Biophotonics: Tissue Optical Sectioning," CRC Press, 2014)), the lens' aberrations and intraocular scattering. Compensation for the last two effects has been proposed, leading to the so-called Adaptive Optics Confocal Scanning Laser Ophthalmoscope (AOCSLO) (A. Roorda, F. Romero-Borja, W. Donnelly, H. Queener, T. Hebert, and M. Campbell, "Adaptive optics scanning laser ophthalmoscopy," Opt. Express, vol. 10, pp. 405-412, 2002.)
(18) This system, sometimes coupled with Optical Coherence Tomography (OCT) (E.M. Wells-Gray, R. J. Zawadzki, S. C. Finn, C. Greiner, J. S. Werner, S. S. Choi, N. Doble, "Performance of a combined optical coherence tomography and scanning laser ophthalmoscope with adaptive optics for human retinal imaging applications," Proc. SPIE, vol. 9335, 2015), provides a lateral resolution of about 1.5 μm, and 2 μm axially. This value is limited by the numerical aperture provided by the eye pupil. A wavefront shaping method has been used by Vellekoop and Mosk to perform focusing of light through a highly scattering medium (I. M. Vellekoop, A. Lagendijk and A. P. Mosk, "Exploiting disorder for perfect focusing," Nature Photonics, vol. 4, pp. 320-322, 2010). Further works showed how scattering media can be used as optical elements to provide a high numerical aperture (NA = 0.85) (Y. Choi, T.D. Yang, C. Fang-Yen, P. Kang, K.J. Lee, R.R. Dasari, M.S. Feld, and W. Choi, "Overcoming the Diffraction Limit Using Multiple Light Scattering in a Highly Disordered Medium," Phys. Rev. Lett., vol. 107, no. 2, pp. 023902, 2011 and I. N.
Papadopoulos, S. Farahi, C. Moser, D. Psaltis, "Increasing the imaging capabilities of multimode fibers by exploiting the properties of highly scattering media," Optics Letters, vol. 38, pp. 2776-2778, 2013.). Using the memory effect, it is possible to scan this spot for reconstructing a picture of the scanned object (C.-L. Hsieh, Y. Pu, R. Grange, G.Laporte, and D. Psaltis, "Imaging through turbid layers by scanning the phase conjugated second harmonic radiation from a nanoparticle," Opt. Express 18, 20723-20731 (2010)). Another technique consists in scanning the sample directly with the speckle pattern and reconstructing the original image by a phase retrieval algorithm (X. Yang, Y. Pu, and D. Psaltis, "Imaging blood cells through scattering biological tissue using speckle scanning microscopy," Opt. Express 22, 3405-3413 (2014) and H. Yilmaz, E. G. van Putten, J. Bertolotti, A. Lagendijk, W. L. Vos, and A. P. Mosk, "Speckle correlation resolution enhancement of wide-field fluorescence imaging," Optica 2, 424-429 (2015)).
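As a rough consistency check (an illustration, not taken from the cited works), the lateral resolution of about 1.5 μm quoted above for the pupil-limited case follows from the Rayleigh criterion, assuming a mid-visible wavelength of roughly 550 nm and the maximal ocular numerical aperture of 0.24:

```latex
\delta_{\mathrm{lateral}} \approx \frac{0.61\,\lambda}{\mathrm{NA}}
  \approx \frac{0.61 \times 0.55~\mu\mathrm{m}}{0.24}
  \approx 1.4~\mu\mathrm{m}
```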
(19) Accordingly, as discussed above, in the background art of imaging instruments for ophthalmology, the retina is always illuminated by passing through the lens of the eye. Then, the reflected light is again collected via the lens. Also, as discussed above, this presents several drawbacks and complications, and novel and substantially improved methods, systems, and devices for ophthalmology are desired.
SUMMARY OF THE INVENTION
(20) According to one aspect of the present invention, a method for imaging a tissue of an eye is provided. Preferably, the method includes the steps of providing oblique illumination to the eye by a plurality of light emitting areas of a light delivery device, the plurality of light emitting areas being independently controllable and arranged to direct light towards at least one of a retina and an iris of the eye, causing an output beam from light backscattered from the at least one of the retina and the iris by the oblique illumination, and capturing the output beam with an imaging system to provide a sequence of images of a fundus of the eye.
(21) Moreover, the method further preferably includes a step of retrieving a phase and absorption contrast image from the sequence of images of the fundus, and the sequence of images of the fundus of the step of capturing is obtained by sequentially turning on one or more of the plurality of light emitting areas at a time in the step of providing the oblique illumination.
(22) According to another aspect of the present invention, the tissue of the eye is part of a living eye of a human or an animal, and the oblique illumination is at least one of a transpupillary illumination, a transscleral illumination, and a transepidermal illumination. In addition, the light delivery device is configured for at least one of the following illumination modalities: no contact between the light delivering device and the face of the patient, the light delivering device in contact with the skin surrounding the eye, the light delivering device in contact with a sclera of the eye, and the light delivering device in contact with a cornea of the eye.
(23) According to still another aspect of the present invention, a system for imaging a tissue of an eye is provided. The system preferably includes a light delivering device having a plurality of light emitting areas, the light emitting areas directed towards the tissue of the eye for providing oblique illumination, an output beam caused by light backscattered off a fundus of the eye of the oblique illumination from the plurality of emitting areas, and an imaging system configured to capture the output beam and to provide a sequence of images of the fundus of the eye.
(24) Moreover, the system further preferably includes a controller configured to individually control the plurality of light emitting areas of the light delivering device, to sequentially turn on one of the plurality of light emitting areas at a time, for capturing the sequence of images by the imaging system, and the imaging system is further configured to retrieve a quantitative phase contrast image, a quantitative absorption image, a quantitative phase and absorption image, a qualitative phase contrast image, a qualitative absorption image, a qualitative phase and absorption image, and a dark field image from the fundus of the eye.
(25) According to yet another aspect of the present invention, the imaging system preferably further includes a scanning system and a detector, the scanning system having a collection pupil that is either centered or shifted with respect to a center of a pupil of the eye, and the detector includes at least one of a single pixel detector, a line camera, a two-dimensional multipixel device and a split detector.
(26) The above and other objects, features and advantages of the present invention and the manner of realizing them will become more apparent, and the invention itself will best be understood from a study of the following description with reference to the attached drawings showing some preferred embodiments of the invention.
BRIEF DESCRIPTION OF THE SEVERAL VIEWS OF THE DRAWINGS
(27) The accompanying drawings, which are incorporated herein and constitute part of this specification, illustrate the presently preferred embodiments of the invention, and together with the general description given above and the detailed description given below, serve to explain features of the invention.
(28) FIG. 1 shows a scheme of the working principle of the system presented in U.S. Pat. No. 7,387,385, U.S. Pat. Pub. No. 2007/0159600 according to the background art. A waveguiding component is put in contact with the sclera to provide illumination. The user has to hold this waveguiding component;
(29) FIG. 2 shows a scheme of the working principle of the system presented in U.S. Pat. Pub No. 2007/0159600 according to the background art. The method uses one illumination point, or two illumination points where the two sources provide illumination simultaneously;
(30) FIG. 3 shows an illumination system presented in T. N Ford, K. K Chu and J. Mertz, "Phase-gradient microscopy in thick tissue with oblique back-illumination," according to the background art, Nat. methods, 9, 12 (2012) and Int. Pat. Pub. No. WO 2013/148360. Light is guided and delivered at the surface of the scattering media. Here, thanks to scattering, some of the light travels back up to the surface, emerging with an asymmetric angle distribution. A second waveguiding component is present for providing the symmetrical illumination.
(31) FIGs. 4 A and 4B schematically show two different representations of a transmission microscopy phase imaging method using incoherent darkfield illumination, presented in L. Tian and L. Waller, "Quantitative differential phase contrast imaging in an LED array microscope," Opt. Exp. 23,9,11394-11403(2015), according to the background art;
(32) FIG. 5 shows the in vivo darkfield imaging of the retina using a modified AOSLO system with offset aperture, image adapted from Toco Y. P. Chui, Dean A. VanNasdale, and Stephen A. Burns, the use of forward scatter to improve retinal vascular imaging with an adaptive optics scanning laser ophthalmoscope, Biomed. Opt. Express 3, 2537-2549 (2012), according to the background art;
(33) FIG. 6 shows in vivo darkfield imaging of the retina using a modified AOSLO system with split detector, image adapted from A. Guevara-Torres, D. R. Williams, and J. B. Schallek, Imaging translucent cell bodies in the living mouse retina without contrast agents, Biomed. Opt. Express 6, 2106-2119 (2015), according to the background art;
(34) FIG. 7 shows a flood illumination adaptive optic system with transpupillary illumination, as shown in U.S. Pat. Pub. No. 2004/0189941, according to the background art;
(35) FIG. 8 shows a review based on state of the art OCT retinal imaging systems, table from Jonnal R.S., Kocaoglu O.P., Zawadzki R.J., Liu Z., Miller D.T., Werner J.S., A Review of Adaptive Optics Optical Coherence Tomography: Technical Advances, Scientific Applications, and the Future, Invest Ophthalmol Vis Sci. 2016 Jul 1;57(9):OCT51-68, showing the cells not yet being imaged in vivo, according to the background art;
(36) FIG. 9A schematically shows the transscleral illumination method according to an aspect of the present invention. Light passes through the sclera before illuminating the eye fundus;
(37) FIG. 9B schematically shows the proposed transscleral illumination method according to an aspect of the present invention: light is shined directly on the scleral tissue. Light can be delivered by direct contact of the source or the light waveguiding component, or in a non-contact manner with an optical beam (collimated or not). Some examples of a multiplicity of light positions or areas are symbolized by discs. The scattering properties of the sclera produce a diffused beam that illuminates the fundus with a high angle. In case of physical contact with the sclera, local anesthesia can be used to make the measurement more comfortable to the patient;
(38) FIG. 10A schematically shows a trans-epidermal illumination method according to an aspect of the present invention. Light passes through the skin and the sclera before illuminating the eye fundus;
(39) FIG. 10B schematically shows a trans-epidermal illumination method according to an aspect of the present invention. Light is shined on the eye lid and from there, scattered through different layers up to the inside of the eye. Light can be delivered by direct contact (of the source or a waveguiding component) or with an optical beam (collimated or not). Some examples of a multiplicity of light position, areas are symbolized by discs. Many point sources that are spatially separated provide different angles of illumination. Also contact with the skin can be more comfortable for a patient as it does not require anesthesia;
(40) FIG. 11 A schematically shows a pupillary illumination method according to an aspect of the present invention. Light is shined on the inside layers of the eye after passing through the pupil. The light is scattered on the inside of the eye and illuminating the eye fundus. Light can be delivered by direct contact (of the source or a waveguiding component) on the cornea or in a non-contact manner with an optical beam (collimated or not);
(41) FIG. 11B schematically shows a pupillary illumination method with an example of point light entering in the pupil, according to an aspect of the present invention;
(42) FIG. 12A schematically shows pupillary darkfield illumination according to an aspect of the present invention. Light is shined on the extremity of the pupil (in a single point or in an annular shape). Light illuminates the upper layer of the retina without illuminating the background;
(43) FIG. 12B schematically shows pupillary darkfield illumination according to an aspect of the present invention. The light pattern can be an annulus or a restricted portion of the annulus;
(44) FIG. 13 schematically shows temporal illumination according to an aspect of the present invention: The light passes through the temporal tissues (skin) before reaching the sclera of the eye and the eye fundus;
(45) FIG. 14A schematically shows oblique illumination of the eye fundus with a focused beam according to an aspect of the present invention;
(46) FIG. 14B schematically shows oblique illumination of the eye fundus with a collimated beam according to an aspect of the present invention;
(47) FIG. 14C schematically shows oblique illumination of the eye fundus with a diverging beam according to an aspect of the present invention;
(48) FIG. 15 schematically shows annular illumination using the side of the eyeball as a scattering layer for providing high numerical aperture, according to an aspect of the present invention;
(49) FIG. 16 schematically shows an exemplary apparatus for non-contact illumination according to an aspect of the present invention. Four (4) beams shining at four (4) different illumination points, as a means to illustrate the method, are represented;
(50) FIG. 17 schematically shows a top view of an apparatus for contact illumination, according to another aspect of the present invention. At least one light beam is in contact with the sclera. The light beam can be brought via a light waveguide such as but not limited to a multimode fiber. In addition, the imaging lens that images the fundus through the eye lens is also in contact with the sclera via an index matching gel placed between the cornea and the imaging lens. Note that this figure differs from the apparatus shown in FIG. 2 in the number of illumination points (more than two), and the way of illumination. Here the beams are switched on sequentially (one or multiple at the same time), while in FIG. 2, the two points are shined simultaneously;
(51) FIG. 18 schematically shows an apparatus for non-contact illumination, according to another aspect of the present invention, a top view of FIG. 17. Note that this figure differs from FIG. 2 in the number of illumination points (more than two), and the way of illumination. Here the beams are switched on sequentially (one or multiple at the same time), while in FIG. 2, the two points are shined simultaneously;
(52) FIG. 19 schematically shows an apparatus for non-contact illumination according to an aspect of the present invention. A rotating wheel is pierced with a hole on its periphery. A light beam illuminates the whole surface of the wheel in such a way that the light passes only through the hole and illuminates only one point on the sclera. Alternatively, the illumination point can be on the skin;
(53) FIG. 20 schematically shows an apparatus for non-contact illumination according to an aspect of the present invention. A rotating wheel holds a fiber and a lens that focuses light on the sclera or on the skin;
(54) FIG. 21 schematically shows an apparatus for non-contact illumination, based on the example of the apparatus shown in FIG. 19 with the imaging system, according to an aspect of the present invention;
(55) FIG. 22 schematically shows an apparatus for non-contact illumination, based on the example of the apparatus shown in FIG. 19 with the imaging system, according to an aspect of the present invention;
(56) FIG. 23 schematically shows an apparatus for contact illumination according to an aspect of the present invention. A patch is put in contact with the patient's skin. The patch is connected to several fibers whose distal ends are the illumination points (area) on the patient's eyelids. The patch can be composed of a removable (consumable) protection part in contact to the skin (one per patient);
(57) FIG. 24 schematically shows an illustration of a continuous light emitting device having the arc shape of the eye. The light emitting device is held by a flexible electronic circuit;
(58) FIG. 25 schematically shows a simplified view of the printed circuit board (PCB) system with its electronic components, according to an aspect of the present invention;
(59) FIG. 26 shows a photograph of two prototypes placed on a subject's left eye. Four (4) light-emitting diodes (LEDs) can shine light from the top lid and four (4) other LEDs shine light from the bottom lid of the eye;
(60) FIG. 27 shows a photograph of a subject positioned on an ophthalmic head mount with two prototype light sources placed on the top and bottom eyelid, respectively. In the image, one LED is switched ON;
(61) FIG. 28A shows a photograph of a designed prototype holding four (4) red surface-mount device (SMD) LEDs having a dimension smaller than 1 mm3. The illuminating device is a flexible PCB;
(62) FIG. 28B shows a photograph of a prototype holding four (4) red LEDs having a diameter of 5 mm. The LED connectors are fixed to a threaded tube that can be screwed to the optical system;
(63) FIG. 29 is a simplified representation of a schematic of the device, according to an aspect of the present invention. The light beam is first reshaped by the modulating device and then is sent to the retina with a high numerical aperture illumination method. The backscattered light is collected from the pupil and measured by a detector. Beam 1 : forward scattered light. Beam 2: collected backscattered light;
(64) FIG. 30 shows a representation of the concept validation for focusing using dark background and a reflective bead. A generic wavefront is shined on the surface. Since only the bead can reflect light only a small amount of power is coming back;
(65) FIG. 31 shows a representation of the concept validation for focusing using dark background and a reflective bead. An optimized wavefront is shined on the surface. Light is focused on the bead reflecting back all the scattered light;
(66) FIG. 32 shows a representation of the optical scheme used for focusing scattered light on a bead, according to an aspect of the present invention;
(67) FIGs. 33A-33D show intensity enhancement of a single bead using the presented method, according to an aspect. FIG. 33A shows the reflectance before running the algorithm, while FIG. 33B shows the final result. The same profile along one dimension is plotted for low NA in FIG. 33C and for maximum NA (open diaphragm) in FIG. 33D;
(68) FIG. 34 is a schematic representation of the backscattered light's angular distribution with oblique illumination, according to an aspect of the present invention;
(69) FIG. 35 is a schematic representation of how the beam is scattered at the surface of a scattering medium, according to an aspect of the present invention. The case of oblique illumination is represented, showing also an anisotropic scattering (the scattered beam is not symmetric with respect to the perpendicular to the surface);
(70) FIG. 36 shows a schematic representation of angular distribution of the emerging beams at different distances from the illuminating beam, according to an aspect of the present invention. The more the distance increases the more the emerging beam will look like a Lambertian distribution;
(71) FIG. 37 shows a schematic representation of an illumination beam shined on the eye tissue, according to an aspect of the present invention. After travelling through the different layers it emerges as a scattered beam. After passing through the transparent retinal layers it is scattered back by deeper tissues (e.g. the choroid). The backscattered light presents an oblique mean distribution and, passing again through the retina, the cells contained in the retina alter the phase of the light passing through them. After travelling through the eye it is collected by the eye lens and sent outside as a collimated beam;
(72) FIG. 38 is a graph representing the two-dimensional (2D) Monte Carlo simulation of angular distribution of the backscattered light. The illumination beam is considered to impinge with an angle of 45°. The chosen scattering parameters are for choroidal tissue;
(73) FIG. 39 shows a schematic representation of a flow chart for performing a measurement, according to an aspect of the present invention. The optical system is positioned on the patient by following the embodiments detailed in FIGs. 9A to 28. After this step, each point is illuminated with one or more wavelengths covering a spectrum approximately between 400 nm and 1200 nm, and an image of the fundus is acquired through the eye lens. Each image is acquired sequentially. The patient's pupil may be dilated, but is not limited to this case. In the case of dark field images (minimum 1 illumination point) the image captured is directly the dark field image. If the chosen method is phase imaging (minimum 2 illumination points) the acquired images need first to be processed to obtain a qualitative or quantitative phase image. When all the pictures have been acquired, the images can be post-processed;
(74) FIGs. 40A, 40B, and 40C show representations that illustrate a method for determining the angular spectrum of an illumination point. For every illumination point, a different shadow is cast by the vessels. The spatial frequency spectrum of the illumination can be determined by using the image of the shadows obtained through the eye lens. The knowledge of the spatial frequency spectrum of each illumination point is then used in the phase retrieval algorithm to provide a quantitative phase image. In addition, the aberration of the eye-lens system is also inferred by the iterative algorithm. This works because there are multiple presentations of the shadows. The more images of shadows at different angles, the more precise the phase image and aberration correction. The principle or method is illustrated for two different illumination points as shown in FIGs. 40A and 40B. FIG. 40C shows a typical transscleral illumination image of a human retina, exhibiting vessel shadows;
(75) FIG. 41 is a schematic representation showing the stitching of different pictures in Fourier space, according to an aspect of the present invention. Since tilted illumination is equivalent to a shift in Fourier space, stitching pictures with different illumination angles in the Fourier domain is equivalent to obtaining a single picture with a larger Fourier domain. This results in a higher resolution image;
(76) FIG. 42 shows a schematic view of a system or device used to perform ex-vivo measurements, according to an aspect of the present invention. Samples used for validation and proof of concept of phase imaging;
(77) FIGs. 43A, 43B, and 43C depict measurements of the samples illustrated in FIG. 42. Comparison with digital holography providing quantitative phase measurement and confocal microscopy providing intensity measurement;
(78) FIG. 44 depicts different phase measurements scanning a thick pig retina sample (180 μm) in depth. The pictures show the different layers of the retina;
(79) FIG. 45 shows a graph that represents a computation of cells density for different layer based on the scan of FIG. 44;
(80) FIG. 46 shows a schematic representation of an optical system of the indirect ophthalmoscope used for the proof of concept measurements;
(81) FIG. 47 shows a schematic representation of a system for in-vivo imaging, according to an aspect of the present invention. The LED used for transscleral illumination is synchronized with the acquisition of the camera. The pupil plane is conjugated to the plane of the diaphragm D. In this way light scattered around the eye is filtered from the final picture. The camera plane is conjugated with a plane in the retina, whose depth can be adjusted thanks to the Badal system;
(82) FIG. 48 shows a schematic representation of an example of an optical system designed with an adaptive optics loop to correct for the aberrations of the eye, according to an aspect of the present invention. It integrates a wavefront sensor (WFS) and a deformable mirror (DM);
(83) FIG. 49 shows on the left side a picture taken with the transepidermal method with a bottom center point of illumination, in the center a picture taken with the
transepidermal method with a bottom left point of illumination, and on the right side the difference of the two showing the phase contrast;
(84) FIG. 50 shows a phase gradient image obtained with the method according to an aspect of the present invention, with corresponding cross-section showing gradients for the vessels, and the optic disc;
(85) FIG. 51 shows on the left side a picture taken with the transepidermal method with a bottom right point of illumination, in the center a picture taken with the transepidermal method with a bottom left point of illumination, and on the right side difference of the two showing the phase contrast;
(86) FIG. 52 shows an example of a darkfield trans-epidermal image acquired with the system of FIG. 47;
(87) FIG. 53 shows a schematic representation of an optical system for an interferometric measurement with scattered light, according to an aspect of the present invention. A broadband light source (e.g. a superluminescent light emitting diode: SLD) is split into a reference and an object arm. The reference illuminates a mirror that can translate to scan the sample in depth. The object arm illuminates the eye with the methods described in FIGs. 4 to 26. After illuminating the fundus, the back-scattered light is collected by the pupil and interferes with the reference beam. Alternatively, a Fourier domain method can be implemented (not shown in the picture) by decomposing the spectral content of the scattered beam to retrieve depth in the retina;
(88) FIG. 54 shows a schematic representation of an operational scheme or method for a scanning system using transscleral illumination, according to an aspect of the present invention; and
(89) FIG. 55 depicts signals in the form of chronograms illustrating a lock-in acquisition, according to an aspect of the present invention.
(90) Herein, identical reference numerals are used, where possible, to designate identical elements that are common to the figures. Also, the images are simplified for illustration purposes and may not be depicted to scale.
DETAILED DESCRIPTION OF THE SEVERAL EMBODIMENTS
(91) According to one aspect of the present invention, a device for retinal imaging is provided that can establish a phase and absorption contrast image thanks to oblique illumination of the retinal layers. According to an aspect of the invention, the device can be used for ex-vivo and in-vivo imaging. In the first part of this section, the ex-vivo implementation is summarized, and the in-vivo implementation is detailed in the second part.
(92) The phase and absorption contrast image can include, but is not limited to, a quantitative phase contrast image, a quantitative absorption image, a quantitative phase and absorption image, a qualitative phase contrast image, a qualitative absorption image, a qualitative phase and absorption image, and a dark field image. Also, the phase and absorption contrast image can include but is not limited to a one dimensional image, a two dimensional image, a three dimensional image, or a multidimensional image.
(93) An ex-vivo sample from the eye can include but is not limited to an entire eye, an untreated piece of an eye, a fixed piece of an eye, a stained piece of an eye, and an in-vitro sample.
(94) Regarding the ex-vivo imaging, instead of using direct illumination of the sample, according to an aspect of the present invention, phase contrast is obtained by oblique illumination generated by scattering in the deep layers of the eye. In a first embodiment, the sample is illuminated with a light source at an oblique angle. In a second embodiment, the source is placed on the same side as the imaging system, making a reflection configuration. In a third embodiment, a scattering layer is placed behind the phase sample to provide a backscattered illumination. In a fourth embodiment, the scattering layer providing the back illumination is the choroid of the eye. In a fifth embodiment, the light source is scattered by a diffusing plate before reaching the sample. In a sixth embodiment, a back reflective layer is added below the sample. For the reconstruction process, see below.
(95) A light source can include, but is not limited to a light emitting diode, a super luminescent diode, a quantum dot source, a lamp, a blackbody radiation source, a low temporal coherence source, low spatial coherence source, and a laser source.
(96) Regarding the in-vivo imaging, instead of using direct illumination of the eye fundus, phase contrast is obtained by oblique illumination generated by scattering in the deep layers of the eye. The features of the method, device and system can be grouped into the following categories: illumination type, light delivering device, image acquisition system, and reconstruction process.
(97) The illumination types can be presented by the following non-limiting and non-exclusive embodiments:
(98) In a first embodiment, the light passes through the sclera, the choroid and the retina. The transmitted and scattered light illuminates the fundus. In this variant, no or very little light is entering the pupil-lens. The light delivering device is in contact with the sclera. In another variant, transscleral illumination in combination with transpupillary lighting can be used.
(99) In a second embodiment, the light passes through the skin layer near the eye, the sclera, the choroid and the retina. The transmitted and scattered light illuminates the fundus. In this variant, no or very little light is entering the pupil-lens.
(100) In a third embodiment, light is passing through the pupil, directed to the side of the eye. Here the rays are scattered and reflected towards the fundus. (101) In a fourth embodiment, light is passing through the pupil and directed on the fundus with a certain angle, generating also angled backscattered light.
(102) In a fifth embodiment, light is passing through the pupil and shined on an area close to the imaged region. Light scatters in the deep layer providing, behind the imaged region, an angled illumination.
(103) In a sixth embodiment, light is passing through the temple. The transmitted and scattered light illuminates the fundus. No light is entering the pupil's lens.
(104) In a seventh embodiment, light is transmitted through the pupil or skin and sclera illuminating directly the imaged area of the retina and not its background, thus providing dark field contrast.
(105) In an eighth embodiment, the wavefront is manipulated before entering the eye. The feedback light is collected through the lens of the eye. Several schemes are then possible: for example, focusing of light on the fundus to compensate for the scattering and obtain a spot size smaller than the resolution of the eye pupil (0.24 NA), the focusing being obtained with an iterative process that uses the feedback light through the eye pupil as the optimization criterion; and also scanning of the eye's fundus by using the well-known memory effect in scattering media, obtained by adding a phase gradient to the wavefront. The scanning pattern can include either an optimized focus spot or a speckle pattern.
(106) Once the data of the scan are recorded, they are processed with a phase retrieval algorithm or other digital means to reconstruct a super-resolved image, i.e. an image having the resolution of the illumination speckle pattern.
(107) The light delivering device, according to an aspect of the present invention, can be designed for contact and non-contact. The light delivery device for contact is presented in the following embodiments:
(108) The light delivering device is made of a flexible electronic circuit that integrates the light emitting device and the electronic wires that bring the driving signals for the light delivering devices.
(109) The light delivering device is in contact with the skin or sclera and the light delivering device in contact with the skin has a removable protection patch (one for each patient).
(110) The parts that are in contact with the face of the patient whose eye is analyzed, which can include but are not limited to a head holder, a chin holder and the light delivering device, are covered with a removable disposable part. A removable part can include but is not limited to a layer of paper, a stack of paper layers, and a polymeric layer.
(111) The shape of the flexible electronic circuit of the light delivering device is designed in an ergonomic way.
(112) The flexible electronic circuit (patch) is held with a head-mounted frame (e.g. a spectacle frame) placed on the subject.
(113) An absorptive or reflecting layer is placed on the back of the illumination device for suppressing backscattered light.
(114) The light delivering device for non-contact illumination is presented in the following embodiments:
(115) The contactless light delivering device is made of point sources which are imaged on the illumination surface (cornea, sclera or skin).
(116) The contactless light delivering device consists in a circular source whose light is shined on the illumination surface (cornea, sclera or skin).
(117) The contactless light delivering device consists of an annular source whose light is shined on the illumination surface (cornea, sclera or skin).
(118) Illumination is provided thanks to a single light source or a combination of light sources in the wavelength range of 400 nm to 1200 nm, such as but not limited to: a pulsed or continuous laser source, a light emitting diode, a super luminescent diode, a quantum dot source, a lamp, and a black body radiation source. Light is delivered by placing the source in direct contact with the tissue (sclera or skin), by guiding it from the source to the tissue, or by imaging the source on the illumination surface (cornea, sclera or skin). Waveguiding components include but are not limited to: multimode fibers, capillary waveguides, lensed multimode fibers, single mode fibers and photonic crystal fibers. The light beam can be converging, diverging or collimated, depending on the chosen illumination technique. Light can be, but is not limited to, linearly polarized, circularly polarized, non-polarized (meaning that it does not present any known preferential polarization), or a mixture of different polarizations.
(119) In Fourier domain, oblique illumination with a plane wave is equivalent to a shift towards higher spatial frequencies, meaning a higher spatial resolution. In addition, shining light on the fundus with higher angles will also produce a more oblique back illumination, providing higher contrast.
(120) The image acquisition process is different depending on the required imaging modality: dark field or phase/absorption. For dark field, imaging can be performed with just one illumination point without image processing. A wider field of view is obtained by stitching together images obtained for different imaging areas.
(121) For the image acquisition system, the following operational steps or scheme can be performed: An image of the retina is formed on the camera thanks to a series of lenses and mirrors. A lens or a mirror is translated to change the focal plane in the retina. A tunable lens can be used to change the focal plane in the retina. A cylindrical lens is rotated and translated to compensate for the eye's astigmatism. Two independent cylindrical lenses are translated to compensate for the eye's astigmatism. The patient's prescription glasses are used to compensate for eye aberrations. A deformable mirror is used to compensate for the eye's aberrations. A wavefront sensor is conjugated to the pupil plane to measure the eye's aberrations. A camera is conjugated to the pupil plane to measure the illumination function. A diaphragm is conjugated to the retina to select a small area of the retina for measuring the illumination function. A camera is conjugated to the cornea to observe when the patient's eye is at the right position. Moreover, polarization optics can be used to block back-reflections at the different surfaces.
(122) For the reconstruction, the following operational steps can be performed: The illumination profile of the retina (backscattered light) is not an even function over the range of the collection NA. The light used for illumination passes through the retina and its phase and intensity are affected by the retina's optical properties. The modulated light is recorded on a camera for different illumination functions. The pictures are processed together to reconstruct the phase and absorption image. The image quality is improved by increasing resolution and contrast via image processing. Anatomical features are extracted and analyzed to detect possible abnormalities.
(123) Phase imaging requires at least two pictures captured with two different illumination points. By using a reconstruction algorithm, it is possible to obtain a qualitative or quantitative phase image. Such reconstruction algorithms known in the art can be, but are not limited to, those described in L. Tian and L. Waller, "Quantitative differential phase contrast imaging in an LED array microscope," Opt. Exp. 23, 9, 11394-11403 (2015), in Z. Phillips, M. Chen, L. Waller, "Quantitative Phase Microscopy with Simultaneous Aberration Correction," Optics in the Life Sciences 2017, in Int. Pat. Pub. No. WO 2015/179452, or in S. B. Mehta and C. J. R. Sheppard, "Quantitative phase-gradient imaging at high resolution with asymmetric illumination-based differential phase contrast," Opt. Lett. 34, 13, 1924-1926 (2009). The proposed illumination schemes, discussed as exemplary embodiments, can be combined to record the images to process. (124) Phase imaging is based on the interference of the beam with itself due to phase differences in the object plane. This interference, in the case of a non-even illumination, results in a modulation of the intensity at the camera plane. By recording two pictures with opposite illumination profiles (S1(u) = S2(-u)) and subtracting their intensities, the background signal is removed, leaving only the phase information. One obtains an image which is the differential phase contrast (Z. Liu, S. Liu and L. Waller, "Real-time brightfield, darkfield, and phase contrast imaging in a light emitting diode array microscope," J. of Biomed. Opt. 19, 10, 106002 (2014), T. N Ford, K. K Chu and J. Mertz, "Phase-gradient microscopy in thick tissue with oblique back-illumination," Nat. Methods, 9, 12 (2012)). The principle is shown in the representations of FIGs. 3, 4A and 4B.
(125) If the two images acquired with complementary illumination angles are I0 and I1, the differential phase contrast image is computed by
Idiff = (I0 - I1) / (I0 + I1)    (1)
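As a minimal illustration (not part of the disclosed apparatus), Equation (1) can be computed directly from two frames captured with complementary illumination; the Python/NumPy sketch below assumes floating-point images of equal size, and the small constant added to the denominator to avoid division by zero in dark regions is an implementation choice only.

```python
import numpy as np

def differential_phase_contrast(i0: np.ndarray, i1: np.ndarray, eps: float = 1e-12) -> np.ndarray:
    """Compute the DPC image of Equation (1) from two frames acquired with
    complementary oblique illumination. `eps` guards against division by zero
    in dark regions and is an implementation detail, not part of the method."""
    i0 = i0.astype(np.float64)
    i1 = i1.astype(np.float64)
    return (i0 - i1) / (i0 + i1 + eps)
```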
(126) The intensity values in the image are related to the phase gradient in the image plane. Since this technique requires the illumination beam to be transmitted through the sample, it initially appeared impossible to apply to thick biological media.
(127) It has been shown how the properties of a scattering medium, such as a biological tissue, can be used to provide a form of back illumination of the sample (T. N Ford, K. K Chu and J. Mertz, "Phase-gradient microscopy in thick tissue with oblique back-illumination," Nat. Methods, 9, 12 (2012)). Indeed, if a beam of light is shined perpendicularly on a scattering medium, the backscattered light shows a different angular distribution depending on the distance from the beam incidence position, see FIG. 3. This uneven angular distribution results in a tilted average illumination, which can be used to provide oblique illumination. A similar effect is observed if the incident beam is not perpendicular to the surface, in which case the backscattered beam also presents directionality, as shown in FIGs. 34 and 35.
(128) This effect can be used in eyes such as, but not limited to, the human eye: when light is shined on the fundus with a certain angle (e.g. when passing through the sclera), it travels through the transparent retinal layers and scatters in the deeper layers (e.g. pigmented epithelium and choroid). There, light scatters back, maintaining an oblique direction, and, while passing through the upper layers of the retina, it is affected by retinal absorption and phase.
(129) By acquiring pictures sequentially for at least two different complementary illumination angles (e.g. +90 and -90 degrees), it is possible to use a phase gradient imaging algorithm (T. N Ford, K. K Chu and J. Mertz, "Phase-gradient microscopy in thick tissue with oblique back-illumination," Nat. Methods, 9, 12 (2012), S. B. Mehta and C. J. R. Sheppard, "Quantitative phase-gradient imaging at high resolution with asymmetric illumination-based differential phase contrast," Opt. Lett. 34, 13, 1924-1926 (2009), L. Tian and L. Waller, "Quantitative differential phase contrast imaging in an LED array microscope," Opt. Exp. 23, 9, 11394-11403 (2015), see e.g. FIG. 4) or a Fourier ptychography algorithm (Int. Pat. Pub. No. WO 2015/179452, G. Zheng, R. Horstmeyer, and C. Yang, "Wide-field, high-resolution Fourier ptychographic microscopy") and reconstruct a qualitative or quantitative phase gradient image. This reconstruction process can then be generalized to any kind of illumination by introducing the transfer functions of phase and absorption already introduced by Tian and Waller. Indeed, the intensity on the camera can be written as:
I1(u) = B1 δ(u) + H1(u) μ(u) + G1(u) φ(u)
(130) Here δ is the Dirac delta function and B1, H1 and G1 are transfer functions that depend only on S. Since, if S is known, the only unknowns are μ and φ, acquiring at least two pictures makes it possible to obtain the two unknown functions. FIG. 49 illustrates the principle for two different illumination points. The human in-vivo fundus image is taken with transscleral illumination. The shadow in FIG. 40 exhibits a "twin" image effect for the vessel tree, as also illustrated in images 55 and 56, due to the almost transparent layer between the vessels and the layer where the shadows are cast.
(131) Next, some details of the mathematical background for the aspects of the present invention are given. When an image I1 and a second picture I2 are acquired with two different illumination patterns, they can be renormalized using different methods, such as but not limited to:
N1{Ii} = Ii / Low{Ii}    (5)
Where Ni{·} is the i-th renormalization method and Low{·} is a 2D low-pass filter. Together with that, it is convenient in many cases to also subtract the average of the resulting image, removing the zero-frequency component in Fourier space.
(132) A normalization method can include but is not limited to the relationship defined in N1{·}, the relationship defined in N2{·} and the relationship defined in N3{·}.
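As an illustration of this renormalization step, the sketch below assumes that the low-pass operator Low{·} is implemented as a Gaussian blur and that the frame is divided by it before the mean is subtracted; the filter choice, the kernel width and the function name are assumptions made only for this example.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def renormalize(image: np.ndarray, sigma: float = 50.0) -> np.ndarray:
    """Flatten the slowly varying illumination background by dividing the frame
    by a low-pass filtered copy (a Gaussian blur stands in for Low{.}), then
    subtract the mean to remove the zero-frequency component in Fourier space."""
    image = image.astype(np.float64)
    low = gaussian_filter(image, sigma)
    flattened = image / np.maximum(low, 1e-12)
    return flattened - flattened.mean()
```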
(133) Regarding phase and absorption reconstruction, from the study of Tian and Waller (L. Tian and L. Waller, "Quantitative differential phase contrast imaging in an LED array microscope," Opt. Exp. 23,9,11394-11403(2015)) we can write the image in Fourier as:
I1(u) = B1 δ(u) + H1(u) μ(u) + G1(u) φ(u)
Where μ and φ are the Fourier transforms of the absorption and phase profile respectively, and H1, G1 and B1 are functions that depend on the illumination function and on the pupil aberration in measurement 1. From this we can obtain the two profiles as:
φ = [H2 I1 - H1 I2 + (B2 H1 - B1 H2) δ] / (H2 G1 - H1 G2)
μ = [G2 I1 - G1 I2 + (B2 G1 - B1 G2) δ] / (G2 H1 - G1 H2)
If more than 2 images are acquired, we can define a phase DPC image between image i and image j as:
Pij = Hj Ii - Hi Ij + (Bj Hi - Bi Hj) δ    (9)    And for absorption as:
Aij = Gj Ii - Gi Ij + (Bj Gi - Bi Gj) δ    (10)    And the transfer function as:
Tij = Hj Gi - Hi Gj    (11)    (so that Pij = Tij φ and Aij = -Tij μ)
Thanks to that, it is possible to reconstruct using different DPC images, for example using Wiener filtering as:
φ = Σij Tij* Pij / (Σij |Tij|² + α)    (12)
μ = -Σij Tij* Aij / (Σij |Tij|² + α)    (13)
where α is a regularization constant and * denotes complex conjugation.
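For illustration only, a possible implementation of the Tikhonov/Wiener-type combination of Equation (12) is sketched below; the FFT conventions, the regularization constant and the function name are assumptions, and the transfer functions are assumed to be precomputed 2D arrays matching the image size. The absorption map of Equation (13) would be obtained analogously from the Aij images.

```python
import numpy as np

def reconstruct_phase(dpc_images, transfer_functions, alpha: float = 1e-3) -> np.ndarray:
    """Tikhonov/Wiener-style inversion: combine several DPC images, each with
    its own phase transfer function, into a single phase map. `dpc_images` and
    `transfer_functions` are lists of same-shape 2D arrays; `alpha` is the
    regularization constant of Equation (12)."""
    num = np.zeros(dpc_images[0].shape, dtype=np.complex128)
    den = np.full(dpc_images[0].shape, alpha, dtype=np.complex128)
    for dpc, tf in zip(dpc_images, transfer_functions):
        dpc_f = np.fft.fft2(dpc)          # DPC image in the Fourier domain
        num += np.conj(tf) * dpc_f
        den += np.abs(tf) ** 2
    return np.real(np.fft.ifft2(num / den))
```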
(134) Regarding the illumination function estimation, three different variants are presented. In the flat approximation method, as previously mentioned reconstruction is possible only by knowing the different H and G. That can be obtained from the studies of Waller and Tian by knowing the pupil function (aberration) and the illumination profile S.
(135) The method for obtaining phase and absorption using the DPC images obtained by equations (9) and (10) and with the transfer function (11) is what is here called the Modified Waller Method (MWM). In the special case where B1 = B2, H1 = H2 and G1 = -G2, it can be simplified into the Waller Method (WM) for obtaining phase, with the DPC image given by:
Pij = Ii - Ij (14) and transfer function
Tij = 2Gi    (15)
(136) Each one of these methods requires a deconvolution step in which the DPC image is deconvolved with the transfer function. This can be performed with different methods such as but not limited to: direct inversion, Wiener filtering, conjugate gradient minimization, the maximum likelihood method, and blind deconvolution methods.
(137) These methods for retrieving phase and absorption are here called "phase and absorption retrieval algorithm." A phase and absorption retrieval algorithm can include, but is not limited to: a Waller method with or without renormalization, a Waller modified method with or without renormalization, and a phase retrieval algorithm.
(138) The first method consists in approximating S(u) = h(a·ux + b·uy), where h(u) represents the Heaviside function depending on the single coordinate u, and a and b are two coefficients chosen to determine which half space will be equal to 1. This method completely ignores the angular distribution of the backscattered light, allowing for reconstruction without any further study of the illuminated surfaces. Reconstruction is then provided by means of, but not limited to, inverse filtering, least-square filtering, constrained least-square filtering, Tikhonov regularization, blind deconvolution and iterative filters, and it can be applied in either the spatial or the Fourier domain. The image of FIG. 49 is obtained with this approximation and then reconstructing the phase image with Tikhonov regularization.
(139) Regarding the ramp approximation method, this method is similar to the previous one but, instead of using a Heaviside approximation, the illumination function is chosen as S(u) = a·ux + b·uy + q, with a, b and q arbitrary values. The images can then be reconstructed with the same methods as proposed in the flat approximation method.
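As a short illustration of the flat and ramp approximations, the sketch below builds the two assumed illumination functions S(u) on a discrete spatial-frequency grid; the grid conventions, coefficients and function names are illustrative assumptions only.

```python
import numpy as np

def frequency_grid(shape):
    """Return the (uy, ux) spatial-frequency coordinates of an image of `shape`."""
    uy, ux = np.meshgrid(np.fft.fftfreq(shape[0]), np.fft.fftfreq(shape[1]), indexing="ij")
    return uy, ux

def illumination_flat(shape, a: float, b: float) -> np.ndarray:
    """Flat approximation: S(u) = h(a*ux + b*uy), a Heaviside step equal to 1 on
    one half space of the angular spectrum and 0 on the other."""
    uy, ux = frequency_grid(shape)
    return (a * ux + b * uy >= 0).astype(np.float64)

def illumination_ramp(shape, a: float, b: float, q: float) -> np.ndarray:
    """Ramp approximation: S(u) = a*ux + b*uy + q, with a, b and q arbitrary."""
    uy, ux = frequency_grid(shape)
    return a * ux + b * uy + q
```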
(140) Next, regarding the angular scattering information method, this method is based on a precise knowledge of the function S(u). This function is obtained with, but not limited to, Monte-Carlo scattering simulations or experimental measurements. Experimental results can be obtained with, but not limited to, a camera conjugated to the pupil plane. Reconstruction can then be obtained by using the same techniques as mentioned in the first method. The image shown in FIG. 51 is obtained with the function S(u) obtained from Monte-Carlo simulations and then reconstructing the phase image with Tikhonov regularization. The parameters used for the simulations were obtained from Rovati et al. (L. Rovati, S. Cattini, N. Zambelli, F. Viola, and G. Staurenghi, "In-vivo diffusing-wave-spectroscopy measurements of the ocular fundus," Optics Express Vol. 15, Issue 7, pp. 4030-4038 (2007)) and from Curcio et al. (C. A. Curcio, J. D. Messinger, K. R. Sloan, A. Mitra, G. McGwin, and R. F. Spaide, "Human Chorioretinal Layer Thicknesses Measured in Macula-wide, High-Resolution Histologic Sections," Invest Ophthalmol Vis Sci. 2011 Jun; 52(7): 3943-3954).
(141) After image reconstruction, resolution is improved by means of, but not limited to, Meitav's or Shemonski's guiding-star method, Meitav's shift-and-add method, Hillman's deconvolution-entropy method, blind deconvolution algorithms or iterative filters. In the case of the guiding-star algorithm, the deconvolving function is first estimated at the photoreceptor layer by physical focusing. Subsequently, the phase retinal layer is set in focus, images are acquired and the same filter is applied to improve image quality. Filters can be applied on either the raw image or the DPC (IDPC) image.
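As one example of such post-processing, the sketch below implements a basic shift-and-add style registration by FFT cross-correlation; it is only an illustrative whole-pixel variant under the assumption of purely translational motion between frames, not Meitav's exact algorithm.

```python
import numpy as np

def shift_and_add(frames) -> np.ndarray:
    """Register every frame to the first one by locating the peak of their
    cross-correlation (computed via FFT), shift it by a whole number of pixels
    with np.roll (circular boundaries), and average the aligned stack."""
    ref = frames[0].astype(np.float64)
    ref_f = np.fft.fft2(ref)
    accum = np.zeros_like(ref)
    for frame in frames:
        frame = frame.astype(np.float64)
        corr = np.fft.ifft2(np.conj(np.fft.fft2(frame)) * ref_f)
        dy, dx = np.unravel_index(np.argmax(np.abs(corr)), corr.shape)
        accum += np.roll(frame, shift=(dy, dx), axis=(0, 1))
    return accum / len(frames)
```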
(142) Image quality can also be improved in the reconstruction process. Indeed, if the aberrations at the pupil plane are known the reconstruction restores the original image. If the aberrations are not known, it is still possible to estimate them using a blind-deconvolution approach, as in Phillips (Z. Phillips, M. Chen, L. Waller "Quantitative Phase Microscopy with Simultaneous Aberration Correction" Optics in the Life Science 2017).
(143) Once the phase image has been extracted, an improved pattern recognition algorithm can be run for feature extraction. Feature extraction is meant to work on retinal features such as, but not limited to, cells, nuclei and microvasculature present in the different retinal layers, for example the inner limiting membrane (ILM), retinal nerve fiber layer (RNFL), ganglion cell layer (GCL), inner plexiform layer (IPL), inner nuclear layer (INL), outer plexiform layer (OPL), outer nuclear layer (ONL), and external limiting membrane (ELM). (144) Feature extraction is performed with, but not limited to, edge detection, corner detection, blob detection, ridge detection, the scale-invariant feature transform, and the Hough transform, for instance based on deep learning software. In such a case, the deep learning software can be trained with pathologic and non-pathologic images (in vivo or ex vivo). Furthermore, the feature extraction can be applied to medical information extraction in order to help the clinician analyze the data.
(145) Regarding the measurement of the illumination function, the illumination function can be estimated by placing a camera in a plane that is conjugate to the pupil plane in the case of a curved surface (as in the case of the eye), or to the Fourier plane if the sample surface is flat (as in the case of a flat-mounted ex-vivo sample). A diaphragm in a plane conjugate to the sample plane allows for a better selection of the scattering profile. Indeed, the measured illumination function is averaged over the sample area but, thanks to the diaphragm, this area can be limited and the illumination function can be measured locally. Furthermore, a small aperture allows approximating a curved surface as locally flat. Under this consideration there is no longer any difference between placing the camera at the pupil plane or at the Fourier plane.
(146) Due to this configuration, the image obtained at the pupil camera is the illumination function averaged over the area limited by the diaphragm aperture. If the illumination function needs to be measured pointwise over different areas of the sample the aperture can be replaced by a lens array. In this configuration, each lens creates the image on the camera of the local illumination function.
(147) Regarding aberration measurement, a limiting factor in the image quality is given by optical aberrations. They can be due to the eye's aberrations and to aberrations of the optical system. Aberrations usually produce a damping of high frequencies, resulting in poorer image resolution. In order to compensate for this effect, for example by physical correction or post-processing, a wavefront sensor can be part of the system. Many different devices can be used to perform this task, but the main ones used are the Shack-Hartmann wavefront sensor and the Tscherning wavefront sensor.
(148) In both cases light is sent to the retina and a camera is conjugated to this plane. In the Shack-Hartmann case light is sent in order to produce a point, while for Tscherning a pattern similar to a grid of points is imaged on the retina. In the Shack-Hartmann sensor a lens array is then placed in a plane conjugated to the pupil in order to produce several images of the same point on the camera. Since the lens array is conjugated to the pupil plane, the image of each spot is translated from the center of its lens by a distance proportional to the local slope of the wavefront. In this way it is possible to reconstruct the wavefront by measuring the spot field. The Tscherning wavefront sensor performs the same measurement but produces the spot field directly on the retina.
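To illustrate how the measured spot displacements can be turned into a wavefront, the sketch below performs a least-squares style integration of the two slope maps in the Fourier domain; it assumes a full square lenslet grid with periodic boundaries, which is a simplification with respect to a practical reconstructor, and the function name is an assumption.

```python
import numpy as np

def reconstruct_wavefront(slope_x: np.ndarray, slope_y: np.ndarray, pitch: float) -> np.ndarray:
    """Integrate Shack-Hartmann slope maps (spot displacement divided by the
    lenslet focal length) into a wavefront by Fourier-domain least squares.
    `pitch` is the lenslet spacing; piston is set to zero, being unobservable."""
    ny, nx = slope_x.shape
    kx = 2j * np.pi * np.fft.fftfreq(nx, d=pitch)[np.newaxis, :]
    ky = 2j * np.pi * np.fft.fftfreq(ny, d=pitch)[:, np.newaxis]
    denom = kx ** 2 + ky ** 2
    denom[0, 0] = 1.0                      # avoid dividing by zero at DC
    w_hat = (kx * np.fft.fft2(slope_x) + ky * np.fft.fft2(slope_y)) / denom
    w_hat[0, 0] = 0.0                      # remove the undetermined piston term
    return np.real(np.fft.ifft2(w_hat))
```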
(149) Because such a system requires illumination of the retina, this light could disturb the retinal image used for reconstruction when the two are used at the same time. Because of that, it is convenient to use different wavelengths: one for illumination and the other for wavefront sensing. Filters and dichroic mirrors are then used to prevent light of one system from entering the other.
(150) Regarding physical aberration correction, aberrations can be compensated in post-processing but, if the damping effect is too strong, the dynamic range of the camera is not enough to record the information. It is then more convenient to physically compensate the effect of aberrations when the eye aberrations are too strong. The physical correction can be performed in different ways.
(151) Changing the focal distance of the method, system, and device is useful both in the case of eye defocus (myopia/hyperopia) and in the case in which a change of the focal plane is required, for example but not limited to the acquisition of a stack of different planes. This task can be performed in different ways, for example but not limited to translation of the focusing element (lens or curved mirror), translation of mirrors to increase the path length (Badal system), change of the focal distance of a tunable lens, or a change of a deformable mirror.
(152) Furthermore, myopia/hyperopia of a patient can be corrected in the measurement using prescription glasses or contact lenses of the patient.
(153) Another common aberration in human eyes is astigmatism. It consists in a difference in the focal distance of the lens along two different axes. Because of that, astigmatism can be compensated by using, for example, translation of two independent cylindrical lenses, a deformable mirror, or the prescription glasses or contact lenses of the patient.
(154) Referring to FIG. 48, both low and higher order aberrations can be compensated by placing a deformable mirror in a plane conjugate to the pupil's plane. This configuration is more robust if coupled with a wavefront sensor in which the sensor is placed after the deformable mirror. In this way, the wavefront sensor can be used in a closed loop with the deformable mirror to compensate for aberrations.
(155) Moreover, dark-field illumination can be used in the present method, system, and device. With dark-field illumination, the range of illumination angles is different from the range of collection angles. In this way, transscleral illumination associated with transpupillary collection is considered a dark-field illumination.
(156) When light is incident on the fundus, it is scattered by many different layers. The first retinal layers provide relatively intense backscattering. By using different angles for illumination and collection, most of the light scattered by the first retinal layers will not be collected by the pupil. Deeper layers scatter light as well, as seen in OCT images or with de-centered collection (T. Y. P. Chui, D. A. VanNasdale, and S. A. Burns, "The use of forward scatter to improve retinal vascular imaging with an adaptive optics scanning laser ophthalmoscope," Biomed. Opt. Exp. 3, 10, 2537-2549 (2012)), an article in which dark-field illumination through the pupil, in illumination and collection, is used to observe the retinal and choroidal microvasculature.
(157) In order to provide oblique back-illumination of the fundus, the following illumination can be used: transscleral (FIGs. 9A, 9B), transepidermal (FIGs. 10A, 10B), pupillary oblique illumination (FIGs. 11A, 11B), pupillary direct illumination (FIGs. 12A, 12B) or through the temple (FIG. 13).
(158) With respect to dark-field collection, a collection method is described that collects only the light diffracted by the sample and not from its background. Dark-field imaging of a retinal layer is obtained by avoiding illuminating the background of the imaged area, thanks to high-angle illumination, or by filtering part of the background layer in a conjugate plane. This can be performed with flood illumination, a scanning method, or a mixture of the two, for example but not limited to flood illumination combined with a scanning system for collection.
(159) A scanning acquisition system can include, but is not limited to, a confocal scanning acquisition, an optical coherence tomography acquisition, a shifted pupil scanning acquisition, a split detector acquisition, and a lock-in scanning acquisition. A detector can include but is not limited to a single pixel detector, a line camera and a 2D detector. A single pixel detector can include but is not limited to a photodiode, an avalanche photodiode, a photomultiplier tube, a micro photomultiplier tube, a lock-in single pixel detector and a split detector composed of single pixel detectors. A two-dimensional detector can include, but is not limited to, a lock-in multipixel detector, a CMOS camera, an sCMOS camera, a CCD camera, and a 2D split detector.
(160) Next, different illumination methods and systems are described, with different configurations that serve as non-limiting and non-exclusive embodiments.
(161) Configuration 1 (transscleral) (162) Referring to FIGs. 9A and 9B, these figures represent the transscleral illumination method: light 16 is shined directly on the scleral tissue 9. Light can be delivered by direct contact of the source or of the light waveguiding component, or in a non-contact manner with an optical beam (collimated, focused, diverging or with structured illumination). Some examples of light positions are symbolized by discs 45. The scattering properties of the sclera 9 and the underlying layers 10, 11 produce a diffused beam 19 that illuminates the fundus at a high angle. In case of physical contact with the sclera, local anesthesia can be used to make the measurement more comfortable for the patient.
(163) Structured illumination can include but is not limited to a sinusoidal phase pattern, a sinusoidal intensity pattern, a light pattern modulated in intensity with a micromirror array, and a light pattern modulated in phase and/or in intensity with a spatial light modulator.
(164) Waveguides and waveguiding components can include but are not limited to: single mode fibers, multimode fibers, capillary waveguides, lensed multimode fibers and photonics crystal fibers.
(165) Configuration 2 (transepidermal)
(166) Referring to FIGs. 10A and 10B, these figures represent the transepidermal illumination method: light 16 is shined on the upper 14 and/or lower 15 eyelid and from there, scattered 19 through different layers up to the inside of the eye 1. Light can be delivered by direct contact (of the source or a waveguiding component) 27 or with an optical beam 16 (collimated or not). Some examples of light positions are symbolized by discs 45. Many point sources that are spatially separated provide different angles of illumination. Also contact with the skin can be more comfortable for a patient as it does not require anesthesia.
(167) Configuration 3 (transpupillary from the side)
(168) Referring to FIGs. 11A and 11B, these figures represent the pupillary illumination method: light 17 is shined on the inside layers of the eye after passing through the pupil 4 and the lens 5. The light is scattered on the inside of the eye after back reflection from the focal point 28, illuminating the eye fundus. Light can be delivered by direct contact (of the source or a waveguiding component) on the cornea 3 or in a non-contact manner with an optical beam (collimated or not).
(169) Configuration 4 (transpupillary direct brightfield)
(170) Another illumination method is based on direct illumination of the eye fundus. Once light reaches the eye fundus the backscattered light is modulated by the retina and then collected for imaging purposes. In this configuration light can shine either on the background of the imaged area (brightfield) or only on the side of it (darkfield).
(171) Configuration 5 (transpupillary direct darkfield)
(172) Referring to FIGs. 12A and 12B, in these figures it is shown that light is sent through the pupil directly onto the imaged retinal area. However, light is sent at such an angle that it does not reach the RPE behind the imaged retinal area. In this way the background appears dark. The light collected from this retinal area is not given by modulation of the background light, but by diffraction from the retinal features.
(173) Configuration 6 (Mertz-like)
(174) Light is shined through the pupil and focused on the RPE without shining directly on the imaged retinal area or its background. Light scatters inside RPE and choroid, reaching the layer on the back of the imaged retinal area. From here light is backscattered and shined on the retina providing illumination.
(175) Configuration 7 (trans-temporal)
(176) Referring to FIG. 13, this figure shows the eye fundus can be illuminated also through transtemporal illumination. Light is shined on the patient's temple and from here it scatters into the eye.
(177) Configuration 8 (beam shape) (178) Referring to FIGs. 14A, 14B, 14C, these figures show that, using
Configurations 1 to 7, the beam shape can be modified thanks to, but not limited to, optical methods, wavelength choice, and wavefront shaping.
(179) Next, different exemplary illumination methods and systems are discussed, as non-limiting and non-exclusive embodiments.
(180) Configuration 1 (contact pcb)
(181) Referring to FIGs. 23, 24, 25, 26, 27, 28A, a transepidermal illumination method is represented. Light 16 is shined on the upper 14 and lower 15 eyelid either simultaneously or sequentially or in any combination and from there is scattered 19 through the different layers towards the inside of the eye 1. Light is delivered by direct contact of the light source 27 with the skin 15. A transparent or scattering medium can be present between the source and the skin, to expand the illuminated area and decrease the power density on the skin. Illustrative examples of light source positions are symbolized by discs 45. Many point sources that are spatially separated provide different angles of illumination to the inside of the eye. Also, contact with the skin can be more comfortable for a patient as it does not require anesthetic lubricant, as opposed to the case of a light source in contact with the eye (sclera, cornea).
(182) In the apparatus of FIGs. 25 and 28A, the transscleral illumination system is connected to a master driver board. The board provides the driving signals for all the LEDs connected to it, as well as the trigger signal for the imaging device in order to synchronize the illumination with the acquisition system. By turning on different LEDs, a different illumination spectrum, both in terms of emitting wavelength and angular spectrum, can be generated. By changing the driving current it is possible to change the total intensity, the shape of the power spectrum and the spatial distribution of light.
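A possible sequencing of this LED/camera synchronization is sketched below; the DriverBoard and Camera objects and every method called on them are hypothetical placeholders, used only to illustrate the order of operations (set the current, enable one LED, trigger the camera, read the frame, disable the LED), not an actual device API.

```python
def acquire_sequence(board, camera, led_indices, current_ma=20.0, exposure_ms=10.0):
    """Turn on one LED (or LED group) at a time, trigger the camera and store
    the frame together with the illumination index. `board` and `camera` are
    placeholder objects standing in for the master driver board and the imager."""
    frames = []
    for idx in led_indices:
        board.set_current(idx, current_ma)   # choose the driving current of this LED
        board.enable(idx)                    # switch the selected LED on
        camera.trigger()                     # synchronize the exposure with the illumination
        frames.append((idx, camera.read_frame(timeout_ms=exposure_ms + 100)))
        board.disable(idx)                   # switch the LED off before the next one
    return frames
```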
(183) Configuration 2 (contact pcb) (184) Referring to FIG. 24, a similar illumination principle to configuration 1 is shown, i.e. transepidermal illumination before passing through the sclera. The light emitting device and its flexible part have a continuous light source on top and bottom of the eye, following the arc shape of the eyelids. The continuous source is composed of pixels, each of which can be switched ON or OFF independently.
(185) Configuration 3 (contact led)
(186) Referring to FIG. 28B, a similar illumination principle to configuration 1 is shown, i.e. transepidermal illumination before passing through the sclera. The light emitting device is a LED, having an encapsulation of a transparent material (such as but not limited to epoxy and Polydimethylsiloxane) with a diameter of a few millimeters. The LEDs are placed in contact with the skin of the eyelids of the patient. The number of LEDs is not limited to 4.
(187) Configuration 4 (non contact)
(188) The illumination is provided in a non-contact fashion; the beam illuminating the eye or the surrounding tissues can be focused, collimated or diverging.
(189) Configuration 5 (wheel)
(190) In the apparatus of FIGs. 19, 21 and 22, light directed to a scattering tissue is provided by, but is not limited to, a light beam 16 and a rotating wheel 39 pierced with a small hole 41. A light beam illuminates the whole surface of the wheel, in such a way that light passes only through the hole 40 and illuminates only one point on the sclera 9. Alternatively, the illumination point can be on the skin 14, 15 surrounding the eye, or even on the lateral side of the eye, referring to FIGs. 4, 8 and 9.
(191) Configuration 6 (wheel and fiber)
(192) In the apparatus of FIG. 20, light on scattering tissues is provided by a fiber 18 which can be, but is not limited to, a single mode or multimode fiber. A rotating wheel 39 holds the fiber 18 and a lens 22 that focuses light on the sclera 9 or on the skin 14, 15. The fiber holder is designed in such a way that the fiber can rotate freely, without introducing stress in the fiber. Furthermore, the disc holding the fibers is spun a limited number of times, preventing the fiber from getting coiled around the rotating arm. Another solution for preventing coiling consists of rotating the disc from the sides (thus removing any rotating arm).
(193) Another embodiment consists in a series of light sources arranged on a fixed structure shaped like but not limited to a circle (annulus). Alternatively, the light beams 74 can be separated, as shown in FIGs. 16 and 18. In these previous examples, the apparatus is configured to send the light in a non-contact manner, referring to FIGs. 10A to 14C.
(194) Configuration 7 (patch):
(195) In the apparatus of FIG. 23, light is delivered to the patient's skin, in contact with the skin thanks to a patch 46. The patch is connected to several fibers which bring several illumination points 45 to the patient's eyelids. Every illumination point requires one optical fiber 18. The patch can be composed of a consumable protection part in contact with the skin. The patch is connected to the split light source thanks to optical connectors 47.
(196) Configuration 8 (contact sclera):
(197) In the apparatus of FIG. 17, light is delivered to the patient's sclera 9, in contact. The light comes out of multiple optical fibers 18 or optical waveguides. In addition, the objective of the imaging system 21 is almost in contact with the cornea 3, with an index-matching, bio-compatible gel 65 in between the two. Note that the principle shown in FIG. 17 differs from FIG. 2 in the number of illumination points (more than two) and in the way of illumination. Here the beams are switched on sequentially, while in FIG. 2 the two points are shined simultaneously.
(198) FIG. 54 shows an exemplary schematic view of a scanning system for inspecting an eye 541, according to an aspect of the present invention. The use of a scanning system for signal collection allows for a better depth selectivity of the imaged layer of the sample. It can be used for collecting either phase/absorption or darkfield information. The system uses a scanning element, such as but not limited to a two-axis scanning mirror 543 and 545, for scanning the collection beam along the imaged area of the eye 541. Other mirrors 540, 544, 548 and 549 are used to reflect the light towards a detector. A diaphragm or a pinhole 546 is then placed in a plane conjugate to the imaging plane. In this way depth selection is improved. Compared to a standard SLO system, only the detection arm is needed because the illumination 542 is provided through the sclera and not through the pupil. The use of a scanning system can also be combined with hardware adaptive optics. The signal is collected by a single pixel detector 547, such as but not limited to a photodiode, avalanche photodiode, photomultiplier tube (PMT), or micro-PMT, which can be used for a standard acquisition or in a lock-in mode.
(199) Referring to FIG. 55, a lock-in acquisition can be efficiently integrated into the system. The lock-in acquisition can be performed using a lock-in camera, for instance in flood illumination, or a single detector, for instance in a scanning system. The output of the camera/detector is then directly the DPC image. In addition, several DPC signals can be integrated to produce an output that is an average value of the DPC signal, which increases the SNR. Finally, removing the background at an early stage of the readout chain provides a more efficient use of the digital resources.
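A minimal software model of this lock-in demodulation is sketched below; it assumes that the detector samples are available together with the +1/-1 reference that toggles the two complementary illumination points with a 50% duty cycle, and the factor of two used to recover the full difference amplitude is part of that assumption.

```python
import numpy as np

def lockin_dpc(samples: np.ndarray, reference: np.ndarray) -> float:
    """Demodulate a detector trace recorded while two complementary illumination
    points are toggled by the +1/-1 square-wave `reference`. Multiplying by the
    reference and averaging keeps the difference (DPC) component while the
    common background averages out; the factor 2 restores the full I0 - I1
    amplitude for a 50% duty cycle."""
    samples = samples.astype(np.float64)
    return float(2.0 * np.mean(samples * reference))
```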
(200) Wavefront shaping is further explained with respect to FIGs. 29, 30, and 31. The different illumination schemes provide, at the retinal surface, a speckle pattern with a speckle grain size smaller than the photoreceptor diameter, which is a few micrometers (D. Mustafi, A. H. Engel, and K. Palczewski, "Structure of cone photoreceptors," Progress in Retinal and Eye Research, Vol. 28, No. 4, pp. 289-302, 2009). After the high resolution speckle pattern is shined on the fundus, it is possible to reconstruct a high resolution image of the fundus in several embodiments. (201) A first embodiment includes the viewing (i.e. collecting the digital image) of the speckle pattern through the pupil of the eye (and so with a much lower resolution). Due to the scattering medium's memory effect, it is possible to shift the speckle pattern and collect a picture for every shift. The resulting collection of images is then used in a phase retrieval algorithm to reconstruct an image of the fundus with the same resolution as the original high resolution projected speckle pattern. Another embodiment is to place markers on the surface of the eye, such as markers embedded within a contact lens, or to display non-contact markers, or to use any high resolution eye tracker, for the purpose of obtaining feedback to always illuminate the same area and thus provide a constant speckle pattern.
(202) Another embodiment for image reconstruction is based on scanning a single focused spot. Due to wavefront shaping, it is possible to transform the speckle pattern into a single spot (of the same size as the original speckle grain). Similarly, in this method, the memory effect is used to scan the spot. By collecting the reflected intensity at each point, it is possible to reconstruct the intensity profile of the entire fundus image. The main difficulty of this embodiment lies in the focusing part, since the limited resolution caused by the pupil does not allow for the measurement of the transmission matrix. An alternative is the use of an iterative process, such as, but not limited to, a genetic algorithm (GA) (D. Conkey, A. Brown, A. Caravaca-Aguirre, and R. Piestun, "Genetic algorithm optimization for focusing through turbid media in noisy environments," Opt. Express, vol. 20, pp. 4840-4849, 2012). A GA can provide focusing by maximizing a parameter that measures how close the pattern is to the ideal case (perfect focusing). This class of algorithms is the most efficient for the targeted application because of its fast convergence time, i.e. only about 1000 iterations are necessary for focusing light with an acceptable contrast (I. M. Vellekoop, "Feedback-based wavefront shaping," Opt. Express 23, 12189-12206 (2015)).
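The sketch below shows a simplified feedback-based optimization loop in the spirit of the cited iterative approaches; for brevity it perturbs one SLM segment at a time and keeps the change only if the metric improves, rather than running a full genetic algorithm, and the set_slm_phase and measure_feedback callables are placeholders for the actual hardware interfaces.

```python
import numpy as np

def optimize_wavefront(set_slm_phase, measure_feedback, n_segments=256, n_iterations=1000, rng=None):
    """Iterative wavefront shaping: randomly perturb the phase of one SLM
    segment, keep the perturbation only if the feedback metric (e.g. the total
    intensity reflected through the pupil) improves. `set_slm_phase(phases)`
    uploads a phase vector and `measure_feedback()` returns the scalar metric;
    both are placeholders for the real devices."""
    rng = rng if rng is not None else np.random.default_rng()
    phases = np.zeros(n_segments)
    set_slm_phase(phases)
    best = measure_feedback()
    for _ in range(n_iterations):
        trial = phases.copy()
        segment = rng.integers(n_segments)
        trial[segment] = (trial[segment] + rng.uniform(-np.pi, np.pi)) % (2 * np.pi)
        set_slm_phase(trial)
        metric = measure_feedback()
        if metric > best:              # keep only improvements of the focus metric
            best, phases = metric, trial
        else:
            set_slm_phase(phases)      # revert to the best pattern found so far
    return phases, best
```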
(203) In the case of the eye fundus, the unique properties of this tissue can be used to provide a parameter for focusing. Cone photoreceptors (1-1.25 μm diameter (D. Mustafi, A. H. Engel, and K. Palczewski, "Structure of cone photoreceptors," Progress in Retinal and Eye Research, Vol. 28, No. 4, pp. 289-302, 2009)) appear much brighter than the background because of their waveguiding properties (B. Vohnsen, "Photoreceptor waveguides and effective retinal image quality," J. Opt. Soc. Am. A 24, 597-607 (2007); B. Vohnsen, I. Iglesias, and P. Artal, "Guided light and diffraction model of human-eye photoreceptors," J. Opt. Soc. Am. A 22, 2318-2328 (2005)), and their sparse distribution can be used for beating the resolution limit. It is straightforward to see that the maximization of the total reflectance coincides with the focusing of the light on the brightest photoreceptor. Another parameter that can be maximized is the total intensity in the area containing just one photoreceptor divided by the background intensity. When two or more points contribute to the generation of a PSF, its maximum-to-energy ratio is smaller than in the ideal case.
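A possible form of the maximum-to-energy parameter described above is sketched below; the exact normalization and the small constant guarding against an empty frame are implementation assumptions.

```python
import numpy as np

def focus_metric(image: np.ndarray) -> float:
    """Ratio between the peak value and the total energy of the collected
    pattern: it is largest when the light is concentrated on a single bright
    point (one photoreceptor or bead) and drops when several points contribute."""
    image = image.astype(np.float64)
    return float(image.max() / (image.sum() + 1e-12))
```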
(204) Regarding interference imaging, FIG. 53 shows the optical principle of an optical coherence tomography measurement with scattered light. A broadband light source 53 (e.g. SLD) is split into a reference arm 50 and an object arm 51. The reference arm 50 illuminates a mirror 54 that can be translated to scan the sample in depth. The object arm illuminates the eye 1 through the skin 8 and/or the sclera 9, the choroid 10 and the retina 11. After illuminating the fundus, back-scattered light is collected by the pupil 4 and interferes with the reference beam. After recombining the two beams thanks to a beam splitter 42, the interfering beam passes through an imaging optics block 49 and is recorded by a detector 48.
(205) Elements of the system can be combined with other imaging modalities, for example but not limited to OCT, fluorescence imaging, and magnetic resonance imaging (MRI), in a single platform to obtain and merge medical information helping with the diagnosis, thereby establishing a multimodal retinal imaging platform. In particular, the use of a scanning system for image acquisition makes the system more compatible with scanning laser ophthalmoscope and OCT technology.
(206) Phase imaging of the retina with aspects of the present invention can be performed with infrared light. The human eye is not sensitive to infrared light. As a consequence, for the living retina, some retinal functions can be imaged by stimulating the retina with visible wavelengths, for example through the pupil or the sclera, to perform functional retina imaging. For instance, the response of the photoreceptors to different wavelengths can be studied. The functional analysis method, as used herein, can include but is not limited to a deep learning algorithm.
(207) An ophthalmic imaging system can include, but is not limited to, an optical coherence tomography system, an eye-fundus imaging system, a slit illumination imaging system, a fluorescein angiography imaging system, an indocyanine green angiography imaging system, a fundus autofluorescence imaging system, a corneal topography imaging system, an endothelial cell-layer photography imaging system, and a specular microscopy system, to provide multimodal imaging of the eye tissues.
(208) The same imaging method can be applied to imaging the anterior part of the eye. The anterior eye tissue can include, but is not limited to, the eye lens, the endothelium, and the cornea. Light scattered from the eye fundus or from the pupil is transmitted through these layers and modulated in intensity and phase. The use of an imaging system whose focal plane is not the retina, but rather the front layers of the eye (e.g. the corneal endothelium), would allow recording of images containing phase and absorption information. In the same way, by recording two or more of these images using different illuminations it is possible to reconstruct the absorption and phase profile.
(209) Next, a general protocol for the imaging of the eye is described, with respect to FIG. 38. The optical system is positioned on the patient by following the embodiments in FIGs. 9A to 29. Next, the imaging system is aligned with the eye of the patient. After this step, each point (one at a time or multiple together) is illuminated with one or more wavelengths selected in a spectrum approximately between 400 nm and 1200 nm, and an image of the fundus is acquired through the eye lens. Each image is acquired sequentially. The patient's pupil may or may not be dilated. In the case of dark field images (minimum 1 illumination point), the image captured is directly the dark field image. If the chosen method is phase imaging (minimum 2 illumination points), the acquired images first need to be processed to obtain a qualitative or quantitative phase image. When all the pictures have been acquired, the images are post-processed.
(210) A series of experimental tests have been performed that have shown operability, proof of principle, and substantially improved results over the background art.
(211) With respect to measurements related to in vivo phase imaging, for a proof of principle demonstration, an indirect ophthalmoscope has been built, as shown in FIG. 46. The aspherical lens 30 allows a field of view in the fundus of about 60°. The camera objective 33 focuses at the image plane of the lens 30 and the camera 32 records the inverted image 31 of the eye's fundus. FIGs. 49, 50, 51 show two dark field images 35, 36 for two individuals and their corresponding phase gradient images 37 obtained by subtracting the two dark field images, using the relation of Equation (1). The illuminating wavelength was 643 nm. The left side of FIG. 49 shows a picture taken with transepidermal illumination from a bottom-center point of illumination, the center shows a picture taken with transepidermal illumination from a bottom-left point of illumination, and the right side shows the difference of the two, showing the phase contrast. FIG. 18 shows the phase gradient image 37 with a line profile 38 showing, for example, the gradient of the optic disc. Next, in FIG. 51, the left side shows a picture taken with transepidermal illumination from a bottom-right point of illumination, the center shows a picture taken with transepidermal illumination from a bottom-left point of illumination, and the right side shows the difference of the two, showing the phase contrast.
(212) Referring to FIG. 47, in a second step another ophthalmoscope has been built in order to obtain a smaller field of view. It includes a stage for adjusting the focus, a fixation target for the patient, and two telescopes formed by the eye lens, the first lens, the second lens, and the third lens (from left to right). In addition, a diaphragm is placed at the pupil plane to filter the beam. Finally, a high sensitivity camera records the retinal image. FIG. 52 shows an example of a 2x2 mm2 field of view of a retina with transepidermal LED (peak wavelength at 870 nm) illumination.
(213) Referring to FIG. 48, in a third step a system has been designed to correct the aberrations using a wavefront sensor and a deformable mirror in closed loop using an aberration correction method.
(214) An aberration correction method, as used herein, can include but is not limited to a deformable mirror, a spatial light modulator, a Badal system, a tunable lens, a series of cylindrical lenses, a Waller method, a modified Waller method, and a blind deconvolution algorithm.
(215) With respect to measurements related to ex vivo phase imaging, referring to FIG. 42, a microscope has been built having parameters similar to the in-vivo imaging case. It includes an oblique illumination that is scattered first by a scattering plate, a reflecting layer that provides the backscattered light, a microscope objective, an imaging lens and a camera. FIG. 43 shows the ex-vivo measurement results, with the resolution assessment and a comparison of the image obtained with the invention against a digital holographic microscope providing a quantitative phase image. FIG. 43 also shows a comparison with images acquired with a confocal microscope providing intensity images. FIGs. 44 and 45 show the results of a scan in depth of a pig retina, with the different layers of the retina, and the computation of the cell density for the nuclear layers. (216) Next, measurements have been performed for the operability and proof of principle demonstration using wavefront shaping. For the proof of principle demonstration, we used a static sample of microbeads and a liquid crystal based spatial light modulator to optimize the feedback light. This system is shown in FIG. 32. The linearly polarized, collimated laser beam illuminates the SLM before passing through a 400 μm thick scattering layer. Next, the scattered beam illuminates the sample through a 0.25 NA objective. The reflected light is collected by the objective and passes through a diaphragm in order to artificially decrease the NA of the detection to mimic the limited resolution of the eye pupil. The sample of microbeads is used to reproduce the situation of high reflectivity features that the detection system cannot resolve.
(217) FIG. 33 shows the results of the focusing process performed for one 10 μm diameter bead and a detection NA of 0.02. The procedure is as follows: an image was recorded with the maximum resolution (curve before optimization in FIG. 33D), then the diaphragm was closed to optimize the wavefront (FIG. 33C). The low resolution PSF before and after optimization is shown in FIG. 33C. Finally, we opened the diaphragm to record the optimized high resolution PSF (curve after optimization in FIG. 33D). Two-dimensional images before (FIG. 33A) and after optimization (FIG. 33B) are shown. For a sample of several beads, the focusing is not as good as in the one-bead case, so we developed a method to discriminate the beads thanks to the shape of their PSF. If two beads are closer than the resolution distance, the collected image would be similar to the PSF. However, the ratio between the maximum and the total energy will change depending on the distance between the centers. This parameter can be used to discriminate between the one-bead and multiple-bead cases.
(218) Various applications can be performed with the present device, system, and method. Applications include quantitative phase imaging of the retinal layer on top of the photoreceptors, between the inner and external limiting membranes, for example inner limiting membrane (ILM), retinal nerve fiber layer (RNFL), ganglion cell layer (GCL), inner plexiform layer (IPL), inner nuclear layer (INL), outer plexiform layer (OPL), outer nuclear layer (ONL), external limiting membrane (ELM).
(219) Next, the proposed method can provide dark field images of the choroid and RPE (retinal pigmented epithelium), allowing for imaging of choroidal tumors with enhanced contrast and of the choroidal microvasculature.
(220) Finally, recording two dark field images allows obtaining phase gradient information of the retinal layers.
(221) In sum, with the aspects of the present invention, the vision process of the eye is determined by the very first layers of the retina. Before reaching the photoreceptor cells, light entering the eye needs to pass through a layer, approximately 100 μm thick, of ganglion cells and neurons forming the retina. These cells are phase objects and so are difficult to see with standard imaging techniques. Indeed, phase imaging methods usually need the illumination system to be on the opposite side of the sample with respect to the imaging system, making this impossible to perform in-vivo. However, the possibility of performing phase imaging from one side using the properties of scattering media has been shown.
(222) According to aspects of the present invention, a system is proposed for performing qualitative as well as quantitative imaging of the fundus of the eye with oblique illumination. The use of different illumination points, through the pupil, on the sclera itself or directly on the skin covering the sclera, provides, thanks to the scattering properties of the eye, oblique back-illumination, allowing for phase contrast images. These phase contrast images can be used to reconstruct pictures containing only phase or absorption information. Furthermore, the same illumination scheme can be used for collecting dark field images from the pupil. Moreover, the use of incoherent illumination allows doubling the resolution of the recovered image compared to coherent imaging.
(223) In the present application, it has been shown that phase contrast can be obtained and how the absolute absorption and phase profiles can be obtained in two dimensions (2D) and three dimensions (3D). With several embodiments, different illumination modes have been shown to provide phase contrast, together with different apparatus for providing this illumination and acquiring a picture. Algorithms have been discussed for reconstructing 2D and 3D phase and absorption profiles. In addition, secondary information that can be obtained with this technique, as well as different improvements, has been discussed.
(224) While the invention has been disclosed with reference to certain preferred embodiments, numerous modifications, alterations, and changes to the described embodiments, and equivalents thereof, are possible without departing from the sphere and scope of the invention. Accordingly, it is intended that the invention not be limited to the described embodiments, and be given the broadest reasonable interpretation in accordance with the language of the appended claims.
REFERENCES
(225) U.S. Pat. No. 7,387,385
(226) U.S. Pat. Pub. No. 2007/0159600
(227) U.S. Pat. Pub. No. 2007/0030448
(228) U.S. Pat. No. 3,954,392
(229) U.S. Pat. No. 4,200,362
(230) Medibell Medical Vision Technologies Ltd.- Panoret 1000-Wide-Angle Digital Retinal Camera, printed at least as early as Oct. 2002.
(231) A. Schalenbourg, L. Zografos "Pitfalls in colour photography of choroidal tumours." Eye. 2013;27(2):224-229
(232) Devrim Toslak, Damber Thapa, Yanjun Chen, Muhammet Kazim Erol, R. V. Paul Chan, and Xincheng Yao, "Trans-palpebral illumination: an approach for wide-angle fundus photography without the need for pupil dilation," Opt. Lett. Vol. 41, pp. 2688-2691 (2016)
(233) D. Scoles, Y. N. Sulai and A. Dubra, "In vivo dark-field imaging of the retinal pigment epithelium cell mosaic," Biomed. Opt. Exp. Vol. 4, 9, pp. 1710-1723 (2013)
(234) T. Y. P. Chui, D. A. VanNasdale, and S. A. Burns, "The use of forward scatter to improve retinal vascular imaging with an adaptive optics scanning laser ophthalmoscope," Biomed. Opt. Exp. Vol. 3, 10, pp. 2537-2549 (2012)
(235) T. Y. P. Chui, T. J. Gast, and S. A. Burns, "Imaging of Vascular Wall Fine Structure in the Human Retina Using Adaptive Optics Scanning Laser Ophthalmoscopy," Invest Ophthalmol Vis Sci. vol. 54, pp. 7115-7124 (2013)
(236) T. N Ford, K. K Chu and J. Mertz, "Phase-gradient microscopy in thick tissue with oblique back-illumination," Nat. Methods, Vol. 9, 12 (2012)
(237) S. B. Mehta and C. J. R. Sheppard, "Quantitative phase-gradient imaging at high resolution with asymmetric illumination-based differential phase contrast," Opt. Lett. 34, 13, pp. 1924-1926 (2009)
(238) L. Tian and L. Waller, "Quantitative differential phase contrast imaging in an LED array microscope," Opt. Exp. 23, 9, pp. 11394-11403 (2015)
(239) Z. Liu, S. Liu, and L. Waller, "Real-time brightfield, darkfield, and phase contrast imaging in a light emitting diode array microscope," Journal of Biomed. Opt. 19, 10, 106002 (2014)
(240) Int. Pat. Pub. No. WO 2013/148360
(241) Int. Pat. Pub. No. WO 2015/179452
(242) G. Zheng, R. Horstmeyer, and C. Yang, "Wide-field, high-resolution Fourier ptychographic microscopy," Nature Photonics, Vol. 7, Iss. 9, 2013, pp. 739-745.
(243) U.S. Pat. No. 8,731,272
(244) U.S. Pat. Pub. No. 2004/0189941
(245) European Pat. No. EP 1427328
(246) U.S. Pat. No. 7,364,296
(247) M. Vellekoop and A. P. Mosk, "Focusing coherent light through opaque strongly scattering media," Opt. Lett. 32, 2309-2311 (2007)
(248) H. Yilmaz, E. G. van Putten, J. Bertolotti, A. Lagendijk, W. L. Vos, and A. P. Mosk, "Speckle correlation resolution enhancement of wide-field fluorescence imaging," Optica 2, 424-429 (2015)
(249) U.S. Pat. No. 8,717,574

Claims

1. A method for imaging a tissue of an eye, the method comprising the steps of: providing oblique illumination to the eye by a plurality of light emitting areas of a light delivery device, the plurality of light emitting areas being independently controllable and arranged to direct light towards at least one of a retina and an iris of the eye;
causing an output beam from light backscattered from the at least one of the retina and the iris by the oblique illumination;
capturing the output beam with an imaging system to provide a sequence of images of a fundus of the eye; and
retrieving a phase and absorption contrast image from the sequence of images of the fundus,
wherein the sequence of images of the fundus of the step of capturing is obtained by sequentially turning on one or more of the plurality of light emitting areas at a time in the step of providing the oblique illumination.
2. The method of claim 1, wherein the tissue of the eye is part of a living eye of a human or an animal,
wherein the oblique illumination is at least one of a transpupillary illumination, a transscleral illumination, and a transepidermal illumination, and
the light delivery device is configured for at least one of the following illumination modalities:
no contact between the light delivery device and a face of a patient;
the light delivery device is in contact with a skin surrounding the eye;
the light delivery device is in contact with a sclera of the eye; and
the light delivery device is in contact with a cornea of the eye.
3. The method of claim 1, wherein the tissue of the eye includes an ex vivo sample from the eye of a human or an animal.
4. The method of claim 1, wherein the oblique illumination is formed by at least one of a diverging beam, a collimated beam, a focused beam, and a structured illumination.
5. The method of claim 1, wherein the tissue of the eye includes at least one of an in vivo retina of a human, an ex vivo retina of a human, and an in vivo retina of an animal, and
wherein the step of capturing includes at least one modality of a darkfield illumination, a darkfield collection, a focused coherent illumination by wavefront shaping, and an oblique optical coherence tomography by a low coherence source.
6. The method of claim 1, wherein in the step of retrieving, the phase and absorption contrast image is obtained by a phase and absorption retrieval algorithm.
7. The method of claim 1, wherein in the step of capturing, the sequence of images is captured with at least one of a 2D single frame acquisition and a 2D lock-in acquisition.
8. The method of claim 1, wherein the tissue of the eye is an anterior eye tissue, and
wherein in the step of providing the oblique illumination, the illumination is obtained by backreflection from at least one of the fundus of the eye and the iris of the eye.
9. The method of claim 1, further comprising the steps of:
measuring at least one of aberrations of the eye and an illumination function with at least one of a wavefront sensor and a pupil camera; and
correcting the aberrations with an aberration correction method.
10. The method of claim 1, wherein functional information is retrieved from data of the sequence of images.
11. A system for imaging a tissue of an eye, the system comprising: a light delivering device having a plurality of light emitting areas, the light emitting areas being directed towards the tissue of the eye for providing oblique illumination, an output beam being caused by light of the oblique illumination from the plurality of light emitting areas that is backscattered off a fundus of the eye;
an imaging system configured to capture the output beam and to provide a sequence of images of the fundus of the eye; and
a controller configured to individually control the plurality of light emitting areas of the light delivering device, to sequentially turn on one of the plurality of light emitting areas at a time, for capturing the sequence of images by the imaging system,
wherein the imaging system is configured to retrieve a quantitative phase contrast image, a quantitative absorption image, a quantitative phase and absorption image, a qualitative phase contrast image, a qualitative absorption image, a qualitative phase and absorption image, and a dark field image from the fundus of the eye.
12. The system of claim 11, wherein light from the light delivering device lies within a transmission range of the sclera, choroid, and skin, in a wavelength range between 400 nm and 1200 nm, and
wherein the light delivering device uses light coming from one or more different types of light sources.
13. The system of claim 11, wherein the imaging system includes a scanning system and a detector,
the scanning system having a collection pupil that is either centered or shifted with respect to a center of a pupil of the eye, and
the detector includes at least one of a single pixel detector, a line camera, a two-dimensional multipixel device, and a split detector.
14. The system of claim 11, wherein the sequence of images is acquired by imaging the tissue of the eye on a two-dimensional multipixel device.
15. The system of claim 11, wherein the light delivering device comprises a plurality of waveguides.
16. The system of claim 11, wherein parts in contact with a face of a patient are covered with a removable disposable part.
17. The system of claim 11, further comprising:
an optical system including a different ophthalmic imaging system to provide multimodal imaging of the eye tissues.
18. The method of claim 1, wherein in the step of capturing, the sequence of images is captured with a scanning acquisition system.
PCT/IB2017/052803 2016-05-13 2017-05-12 System, method and apparatus for retinal absorption phase and dark field imaging with oblique illumination WO2017195163A1 (en)

Priority Applications (8)

Application Number Priority Date Filing Date Title
EP22153500.8A EP4008237A1 (en) 2016-05-13 2017-05-12 System, method and apparatus for retinal absorption phase and dark field imaging with oblique illumination
JP2018559733A JP6994472B2 (en) 2016-05-13 2017-05-12 Systems, methods, and equipment for retinal absorption, phase and darkfield imaging with tilted illumination
EP17729922.9A EP3454719A1 (en) 2016-05-13 2017-05-12 System, method and apparatus for retinal absorption phase and dark field imaging with oblique illumination
US16/300,937 US11179033B2 (en) 2016-05-13 2017-05-12 System, method and apparatus for retinal absorption phase and dark field imaging with oblique illumination
CN201780033204.7A CN109414162A (en) 2016-05-13 2017-05-12 For retinal absorption phase under oblique illumination and the system of dark-field imaging, method and apparatus
US17/514,604 US11911107B2 (en) 2016-05-13 2021-10-29 System, method and apparatus for retinal absorption phase and dark field imaging with oblique illumination
JP2021200525A JP7235355B2 (en) 2016-05-13 2021-12-10 Systems, methods, and apparatus for retinal absorption, phase and darkfield imaging with oblique illumination
JP2023021276A JP2023055993A (en) 2016-05-13 2023-02-15 System, method and apparatus for retinal absorption, phase and dark field imaging with oblique illumination

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
IB2016052787 2016-05-13
IBPCT/IB2016/052787 2016-05-13
IBPCT/IB2016/056806 2016-11-11
IB2016056806 2016-11-11

Related Child Applications (2)

Application Number Title Priority Date Filing Date
US16/300,937 A-371-Of-International US11179033B2 (en) 2016-05-13 2017-05-12 System, method and apparatus for retinal absorption phase and dark field imaging with oblique illumination
US17/514,604 Continuation US11911107B2 (en) 2016-05-13 2021-10-29 System, method and apparatus for retinal absorption phase and dark field imaging with oblique illumination

Publications (1)

Publication Number Publication Date
WO2017195163A1 true WO2017195163A1 (en) 2017-11-16

Family

ID=59055236

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2017/052803 WO2017195163A1 (en) 2016-05-13 2017-05-12 System, method and apparatus for retinal absorption phase and dark field imaging with oblique illumination

Country Status (5)

Country Link
US (2) US11179033B2 (en)
EP (2) EP3454719A1 (en)
JP (3) JP6994472B2 (en)
CN (1) CN109414162A (en)
WO (1) WO2017195163A1 (en)

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020121243A1 (en) 2018-12-12 2020-06-18 Ecole Polytechnique Federale De Lausanne (Epfl) Ophthalmic system and method for clinical device using transcleral illumination with multiple points source
WO2021058367A1 (en) * 2019-09-26 2021-04-01 Ecole Polytechnique Federale De Lausanne (Epfl) System and methods for differential imaging using a lock-in camera
EP3884843A1 (en) 2020-03-27 2021-09-29 Ecole Polytechnique Federale De Lausanne (Epfl) Multi-modal retinal imaging platform
EP3744228A4 (en) * 2018-01-22 2021-10-13 Shenzhen Thondar Technology Co., Ltd Retinal digital imaging system, retinal digital imaging instrument, and retinal digital imaging method
JP7005808B1 (en) 2020-06-29 2022-01-24 オプトメッド オーワイジェイ Contact devices for eye examination devices, eye examination devices, and methods for contacting the eye with the eye examination device.
EP4190450A1 (en) 2021-12-02 2023-06-07 Scienion GmbH Imaging apparatus for imaging a nozzle section of a droplet dispenser device, dispenser apparatus including the imaging apparatus, and applications thereof
WO2023209245A1 (en) 2022-04-30 2023-11-02 Earlysight Sa Method and use of transscleral optical imaging for detecting a disease
CN116990320A (en) * 2023-09-27 2023-11-03 江西驰宇光电科技发展有限公司 Dark field imaging method and device for defect detection
EP4327723A1 (en) * 2022-08-26 2024-02-28 EarlySight SA Retina tissue image acquisition

Families Citing this family (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11974809B2 (en) * 2017-06-13 2024-05-07 The Board Of Trustees Of The University Of Illinois Non-mydriatic, non-contact system and method for performing widefield fundus photographic imaging of the eye
US11867625B2 (en) * 2019-02-03 2024-01-09 Bar Ilan University System and method for imaging via scattering medium
CN112051239B (en) * 2019-06-05 2024-04-12 中国科学院上海光学精密机械研究所 Imaging method based on dynamic scattering system under condition of limited detection area
KR102278782B1 (en) * 2019-09-16 2021-07-20 주식회사 스몰머신즈 Active variable speckle illumination wide-field high-resolution imaging appatatus and imaging method using the same
EP4041054A4 (en) * 2019-09-30 2023-12-27 The Regents of the University of Colorado, a body corporate Systems and methods for imaging and characterizing objects including the eye using non-uniform or speckle illumination patterns
US20230072066A1 (en) * 2019-11-25 2023-03-09 Optos Plc Choroidal Imaging
CN111369510B (en) * 2020-02-28 2022-07-01 四川大学华西医院 Method for automatically estimating choroid thickness
JP7214270B2 (en) * 2020-03-31 2023-01-30 国立大学法人東北大学 STATE ESTIMATING DEVICE AND METHOD OF THE INTERNAL TISSUE OF EYE
EP4164470A1 (en) * 2020-06-15 2023-04-19 Akkolens International B.V. Apparatus and method to size the accommodative structure of the eye
NL2026025B1 (en) * 2020-06-15 2022-02-17 Akkolens Int B V Apparatus and method to measure accommodative structure of the eye
US20220039654A1 (en) * 2020-08-10 2022-02-10 Welch Allyn, Inc. Eye imaging devices
CN112965261B (en) * 2021-02-23 2022-10-28 山东仕达思生物产业有限公司 Method for quickly and effectively intelligently correcting microscope optical axis based on machine vision and implementation system thereof
WO2023089401A1 (en) * 2021-11-19 2023-05-25 Alcon Inc. Ophthalmic procedure contact lens with enhanced vitreous visualization
CN114063275A (en) * 2022-01-17 2022-02-18 北京九辰智能医疗设备有限公司 Corneal endothelial cell imaging system, method, apparatus and storage medium
WO2023175544A1 (en) 2022-03-17 2023-09-21 Ricoh Company, Ltd. Method of manufacturing laminate for battery, apparatus for manufacturing laminate for battery, method of manufacturing member for battery, and apparatus for manufacturing member for battery
WO2023182011A1 (en) * 2022-03-24 2023-09-28 株式会社ニコン Image processing method, image processing device, ophthalmological device, and program
CN117398059A (en) * 2023-12-12 2024-01-16 中国科学院长春光学精密机械与物理研究所 Retina imaging method based on differential phase contrast imaging

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3954329A (en) * 1972-09-25 1976-05-04 Retina Foundation Wide-angle opthalmoscope employing transillumination
US5099354A (en) * 1988-09-14 1992-03-24 Washington University Kit for converting a slit lamp biomicroscope into a single aperture confocal scanning biomicroscope
US5822036A (en) * 1996-07-24 1998-10-13 Research Development Foundation Eye imaging unit having a circular light guide
CA2353921C (en) * 1998-12-10 2009-03-10 Carl Zeiss Jena Gmbh System and method for the non-contacting measurement of the axis length and/or cornea curvature and/or anterior chamber depth of the eye, preferably for intraocular lens calculation
JP2003000548A (en) 2001-06-18 2003-01-07 Konan Medical Inc Anterior ocular segment observation system
AUPS219002A0 (en) * 2002-05-08 2002-06-06 Lion Eye Institute, The Digital hand-held imaging device
DE102011082363B4 (en) * 2010-10-28 2018-03-29 Oculus Optikgeräte GmbH Illumination system, ophthalmic analyzer and method
US8237835B1 (en) * 2011-05-19 2012-08-07 Aeon Imaging, LLC Confocal imaging device using spatially modulated illumination with electronic rolling shutter detection

Patent Citations (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4200362A (en) 1972-09-25 1980-04-29 Retina Foundation Ophthalmoscope with uniform illumination
US3954392A (en) 1973-08-02 1976-05-04 Allied Chemical Corporation Copper phthalocyanine N-Di-N-butylaminoalkyl sulfonamide, quaternized solutions of the quaternary compound, and paper dyeing therewith
US4213678A (en) 1977-09-29 1980-07-22 Retina Foundation Scanning ophthalmoscope for examining the fundus of the eye
US20040189941A1 (en) 2001-08-12 2004-09-30 Bucourt Samuel Henri Device for measuring aberrations in an eye-type system
EP1427328A1 (en) 2001-08-30 2004-06-16 University of Rochester Adaptive optics in a scanning lase ophtalmoscope
US7364296B2 (en) 2002-06-12 2008-04-29 University Of Rochester Method and apparatus for improving both lateral and axial resolution in ophthalmoscopy
US7387385B2 (en) 2003-01-21 2008-06-17 Leica Microsystems (Schweiz) Ag Surgical microscope
US20070159600A1 (en) 2003-04-08 2007-07-12 Medibell Medical Vision Technologies, Ltd. Transcleral opthalmic illumination method and system
US20070030448A1 (en) 2003-10-22 2007-02-08 Detlef Biernat Illumination unit for fundus cameras and/or ophthalmoscopes
US20090153798A1 (en) * 2005-07-22 2009-06-18 Manfred Dick Device and method for monitoring, documenting and/or diagnosing the fundus
EP1964510A1 (en) * 2007-02-27 2008-09-03 National University of Ireland Galway Imaging of phase objects
US8717574B2 (en) 2009-11-10 2014-05-06 California Institute Of Technology Turbidity suppression by optical phase conjugation using a spatial light modulator
US8731272B2 (en) 2011-01-24 2014-05-20 The Board Of Trustees Of The University Of Illinois Computational adaptive optics for interferometric synthetic aperture microscopy and other interferometric imaging
WO2013148360A1 (en) 2012-03-30 2013-10-03 Trustees Of Boston University Phase contrast microscopy with oblique back-illumination
WO2015179452A1 (en) 2014-05-19 2015-11-26 The Regents Of The University Of California Fourier ptychographic microscopy with multiplexed illumination

Non-Patent Citations (47)

* Cited by examiner, † Cited by third party
Title
"Panoret 1000-Wide-Angle Digital Retinal Camera", MEDIBELL MEDICAL VISION TECHNOLOGIES, October 2002 (2002-10-01)
A. GUEVARA-TORRES ET AL: "Imaging translucent cell bodies in the living mouse retina without contrast agents", BIOMED. OPT. EXPRESS, vol. 6, 2015, pages 2106 - 2119
A. ROORDA ET AL: "Adaptive optics scanning laser ophthalmoscopy", OPT. EXPRESS, vol. 10, 2002, pages 405 - 412
A. SCHALENBOURG; L. ZOGRAFOS: "Pitfalls in colour photography of choroidal tumours", EYE, vol. 27, no. 2, 2013, pages 224 - 229
B. VOHNSEN: "Photoreceptor waveguides and effective retinal image quality", J. OPT. SOC. AM. A, vol. 24, 2007, pages 597 - 607
B. VOHNSEN; I. IGLESIAS; P. ARTAL: "Guided light and diffraction model of human-eye photoreceptors", J. OPT. SOC. AM. A, vol. 22, 2005, pages 2318 - 2328
C. A. CURCIO ET AL: "Human Chorioretinal Layer Thicknesses Measured in Macula-wide, High-Resolution Histologic Sections", INVEST OPHTHALMOL VIS SCI., vol. 52, no. 7, June 2011 (2011-06-01), pages 3943 - 3954
C.-L. HSIEH ET AL: "Imaging through turbid layers by scanning the phase conjugated second harmonic radiation from a nanoparticle", OPT. EXPRESS, vol. 18, 2010, pages 20723 - 20731
D. CONKEY ET AL: "Genetic algorithm optimization for focusing through turbid media in noisy environments", OPT. EXPRESS, vol. 20, 2012, pages 4840 - 4849
D. HILLMANN ET AL: "Aberration-free volumetric high-speed imaging of in vivo retina", SCIENTIFIC REPORTS, vol. 6, 2016
D. MUSTAFI, A; H. ENGEL, KRZYSZTOF PALCZEWSKI: "Structure of cone photoreceptors", PROGRESS IN RETINAL AND EYE RESEARCH, vol. 28, no. 4, 2009, pages 289 - 302, XP026323048, DOI: doi:10.1016/j.preteyeres.2009.05.003
D. MUSTAFI, A; H. ENGEL: "Krzysztof Palczewski ''Structure of cone photoreceptors", PROGRESS IN RETINAL AND EYE RESEARCH, vol. 28, no. 4, 2009, pages 289 - 302
D. SCOLES; Y. N. SULAI; A. DUBRA: "In vivo dark-field imaging of the retinal pigment epithelium cell mosaic", BIOMED. OPT. EXP., vol. 4, no. 9, 2013, pages 1710 - 1723
D. SCOLES; Y. N. SULAI; A. DUBRA: "In vivo dark-field imaging of the retinal pigmentepithelium cell mosaic", BIOMED. OPT. EXP., vol. 4, no. 9, 2013, pages 1710 - 1723
DATABASE MEDLINE [online] US NATIONAL LIBRARY OF MEDICINE (NLM), BETHESDA, MD, US; 4 May 2015 (2015-05-04), TIAN LEI ET AL: "Quantitative differential phase contrast imaging in an LED array microscope.", XP002774005, Database accession no. NLM25969234 *
DEVRIM TOSLAK ET AL: "Trans-palpebral illumination: an approach for wide-angle fundus photography without the need for pupil dilation", OPT. LETT., vol. 41, 2016, pages 2688 - 2691
E.M. WELLS-GRAY ET AL: "Performance of a combined optical coherence tomography and scanning laser ophthalmoscope with adaptive optics for human retinal imaging applications", PROC. SPIE, vol. 9335, 2015, XP060049336, DOI: doi:10.1117/12.2079772
G. ZHENG; C. KOLNER; C. YANG, OPTICS LETTERS, 2011, pages 3987 - 3989
G. ZHENG; R. HORSTMEYER; C. YANG: "Wide-field, high-resolution Fourier ptychographic microscopy", NATURE PHOTONICS, vol. 7, no. 9, 2013, pages 739 - 745, XP055181687, DOI: doi:10.1038/nphoton.2013.187
H. YILMAZ ET AL: "Speckle correlation resolution enhancement of wide-field fluorescence imaging", OPTICA, vol. 2, 2015, pages 424 - 429
I. M. VELLEKOOP: "Feedback-based wavefront shaping", OPT. EXPRESS, vol. 23, 2015, pages 12189 - 12206
I. M. VELLEKOOP; A. LAGENDIJK; A. P. MOSK: "Exploiting disorder for perfect focusing", NATURE PHOTONICS, vol. 4, 2010, pages 320 - 322, XP009149181, DOI: doi:10.1038/nphoton.2010.3
I. N. PAPADOPOULOS ET AL: "Increasing the imaging capabilities of multimode fibers by exploiting the properties of highly scattering media", OPTICS LETTERS, vol. 38, 2013, pages 2776 - 2778, XP001583744, DOI: doi:http://dx.doi.org/10.1364/OL.38.002776
JONNAL R.S. ET AL: "A Review of Adaptive Optics Optical Coherence Tomography: Technical Advances, Scientific Applications, and the Future", INVEST OPHTHALMOL VIS SCI, vol. 57, no. 9, 1 July 2016 (2016-07-01), pages 51 - 68
L. ROVATI ET AL: "In-vivo diffusing-wave-spectroscopy measurements of the ocular fundus", OPTICS EXPRESS, vol. 15, no. 7, 2007, pages 4030 - 4038
L. TIAN; L. WALLER: "Quantitative differential phase contrast imaging in an LED array microscope", OPT. EXP., vol. 23, no. 9, 2015, pages 11394 - 11403
M. VELLEKOOP; A. P. MOSK: "Focusing coherent light through opaque strongly scattering media", OPT. LETT., vol. 32, 2007, pages 2309 - 2311, XP001506776, DOI: doi:10.1364/OL.32.002309
N. D. SHEMONSKI ET AL: "Computational high-resolution optical imaging of the living human retina", NATURE PHOTONICS, 2015
N. MEITAV; E. N. RIBAK: "Estimation of the ocular point spread function by retina modeling", OPTICS LETTER, vol. 37, no. 9, 2012, XP001575325, DOI: doi:10.1364/OL.37.001466
N. MEITAV; E. N. RIBAK: "Improving retinal image resolution with iterative weighted shift-and-add", J. OPT. SOC. AM., vol. 28, no. 7, 2011
OPTICS EXPRESS 04 MAY 2015, vol. 23, no. 9, 4 May 2015 (2015-05-04), pages 11394 - 11403, ISSN: 1094-4087 *
R. K. WANG; V. V. TUCHIN: "Advanced Biophotonics: Tissue Optical Sectioning", 2014, CRC PRESS
S. B. MEHTA; C. J. R. SHEPPARD: "Quantitative phase-gradient imaging at high resolution with asymmetric illumination-based differential phase contrast", OPT. LETT, vol. 34, no. 13, 2009, pages 1924 - 1926, XP001524101
S. B. MEHTA; C. J. R. SHEPPARD: "Quantitative phase-gradient imaging at high resolution with asymmetric illumination-based differential phase contrast", OPT. LETT., vol. 34, no. 13, 2009, pages 1924 - 1926, XP001524101
SCHMITT J M ET AL: "Differential absorption imaging with optical coherence tomography", JOURNAL OF THE OPTICAL SOCIETY OF AMERICA. A, OPTICS AND IMAGE SCIENCE, OPTICAL SOCIETY OF AMERICA, US, vol. 15, no. 9, 1 September 1998 (1998-09-01), pages 2288 - 2296, XP002474885, ISSN: 0740-3232, DOI: 10.1364/JOSAA.15.002288 *
T. N FORD; K. K CHU; J. MERTZ: "Phase-gradient microscopy in thick tissue with oblique back-illumination", NAT. METHODS, vol. 9, 2012, pages 12
T. N FORD; K. K CHU; J. MERTZ: "Phase-gradient microscopy in thick tissue with oblique back-illumination,'' according to the background art", NAT. METHODS, vol. 9, 2012, pages 12
T. Y. P. CHUI; D. A. VANNASDALE; S. A. BURNS: "The use of forward scatter to improve retinal vascular imaging with an adaptive optics scanning laser ophthalmoscope", BIOMED. OPT. EXP., vol. 3, no. 10, 2012, pages 2537 - 2549, XP055276415, DOI: doi:10.1364/BOE.3.002537
T. Y. P. CHUI; D. A. VANNASDALE; S. A. BURNS: "The use of forward scatter to improve retinal vascular imaging with an adaptive opticsscanning laser ophthalmoscope", BIOMED. OPT. EXP., vol. 3, no. 10, 2012, pages 2537 - 2549, XP055276415, DOI: doi:10.1364/BOE.3.002537
T. Y. P. CHUI; T. J. GAST; S. A. BURNS: "Imaging of Vascular Wall Fine Structure in the Human Retina Using Adaptive Optics Scanning Laser Ophthalmoscopy", INVEST OPHTHALMOL VISSCI., vol. 54, 2013, pages 7115 - 7124
TOCO Y. P. CHUI ET AL: "the use of forward scatter to improve retinal vascular imaging with an adaptive optics scanning laser ophthalmoscope", BIOMED. OPT. EXPRESS, vol. 3, 2012, pages 2537 - 2549
X. YANG; Y. PU; D. PSALTIS: "Imaging blood cells through scattering biological tissue using speckle scanning microscopy", OPT. EXPRESS, vol. 22, 2014, pages 3405 - 3413
Y. CHOI ET AL: "Overcoming the Diffraction Limit Using Multiple Light Scattering in a Highly Disordered Medium", PHYS. REV. LETT., vol. 107, no. 2, 2011, pages 023902
Z. LIU; S. LIU; L. WALLER: "Real-time brightfield, darkfield, and phase contrast imaging in a light emitting diode array microscope", J. OF BIOMED. OPT., vol. 19, no. 10, 2014, pages 106002, XP060047118, DOI: doi:10.1117/1.JBO.19.10.106002
Z. LIU; S. LIUAND; L. WALLER: "Real-time brightfield, darkfield, and phase contrast imaging in a light emitting diode array microscope", JOURNAL OF BIOMED. OPT., vol. 19, no. 10, 2014, pages 106002, XP060047118, DOI: doi:10.1117/1.JBO.19.10.106002
Z. PHILLIPS; M. CHEN; L. WALLER: "Optics in the Life Sciences Congress, OSA Technical Digest (online", 2017, OPTICAL SOCIETY OF AMERICA, article "Quantitative Phase Microscopy with Simultaneous Aberration Correction"
Z. PHILLIPS; M. CHEN; L. WALLER: "Quantitative Phase Microscopy with Simultaneous Aberration Correction", OPTICS IN THE LIFE SCIENCE, 2017

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3744228A4 (en) * 2018-01-22 2021-10-13 Shenzhen Thondar Technology Co., Ltd Retinal digital imaging system, retinal digital imaging instrument, and retinal digital imaging method
WO2020121243A1 (en) 2018-12-12 2020-06-18 Ecole Polytechnique Federale De Lausanne (Epfl) Ophthalmic system and method for clinical device using transcleral illumination with multiple points source
WO2021058367A1 (en) * 2019-09-26 2021-04-01 Ecole Polytechnique Federale De Lausanne (Epfl) System and methods for differential imaging using a lock-in camera
EP3884843A1 (en) 2020-03-27 2021-09-29 Ecole Polytechnique Federale De Lausanne (Epfl) Multi-modal retinal imaging platform
WO2021191331A1 (en) 2020-03-27 2021-09-30 Ecole Polytechnique Federale De Lausanne (Epfl) Multi-modal retinal imaging platform
JP2022025004A (en) * 2020-06-29 2022-02-09 オプトメッド オーワイジェイ Contact arrangement for eye examining instrument, eye examining instrument, and method of contacting between eye and eye examining instrument
JP7005808B1 (en) 2020-06-29 2022-01-24 オプトメッド オーワイジェイ Contact devices for eye examination devices, eye examination devices, and methods for contacting the eye with the eye examination device.
US11478144B2 (en) 2020-06-29 2022-10-25 Optomed Oyj Contact arrangement for eye examining instrument, eye examining instrument and method of contacting between eye and eye examining instrument
EP4190450A1 (en) 2021-12-02 2023-06-07 Scienion GmbH Imaging apparatus for imaging a nozzle section of a droplet dispenser device, dispenser apparatus including the imaging apparatus, and applications thereof
WO2023209245A1 (en) 2022-04-30 2023-11-02 Earlysight Sa Method and use of transscleral optical imaging for detecting a disease
EP4327723A1 (en) * 2022-08-26 2024-02-28 EarlySight SA Retina tissue image acquisition
WO2024042230A1 (en) 2022-08-26 2024-02-29 Earlysight Sa Retina tissue image acquisition
CN116990320A (en) * 2023-09-27 2023-11-03 江西驰宇光电科技发展有限公司 Dark field imaging method and device for defect detection
CN116990320B (en) * 2023-09-27 2023-12-19 江西驰宇光电科技发展有限公司 Dark field imaging method and device for defect detection

Also Published As

Publication number Publication date
JP2023055993A (en) 2023-04-18
US20190290124A1 (en) 2019-09-26
US20220117485A1 (en) 2022-04-21
JP7235355B2 (en) 2023-03-08
JP6994472B2 (en) 2022-02-04
JP2022043142A (en) 2022-03-15
US11179033B2 (en) 2021-11-23
CN109414162A (en) 2019-03-01
EP4008237A1 (en) 2022-06-08
JP2019518511A (en) 2019-07-04
US11911107B2 (en) 2024-02-27
EP3454719A1 (en) 2019-03-20

Similar Documents

Publication Publication Date Title
US11911107B2 (en) System, method and apparatus for retinal absorption phase and dark field imaging with oblique illumination
Williams Imaging single cells in the living retina
US8226236B2 (en) Method and apparatus for imaging in an eye
JP4621496B2 (en) Line scan ophthalmoscope
JP2019518511A5 (en)
US9517009B2 (en) Structured illumination ophthalmoscope
LaRocca et al. Optimization of confocal scanning laser ophthalmoscope design
US20150272438A1 (en) Imaging retinal intrinsic optical signals
CN115334953A (en) Multi-modal retinal imaging platform
US20220390369A1 (en) Systems And Methods For Imaging And Characterizing Objects Including The Eye Using Non-Uniform Or Speckle Illumination Patterns
Schramm et al. 3D retinal imaging and measurement using light field technology
Salas Manipulation of the illumination of an Adaptive Optics Flood Illumination Ophthalmoscope for functional imaging of the retina in-vivo
Xiao Adaptive optics in full-field spatially incoherent interferometry and its retinal imaging
WO2021256132A1 (en) Ophthalmic device, method for controlling ophthalmic device, and program
Laforest et al. Quantitative phase imaging of retinal cells
Weber Transillumination techniques in ophthalmic imaging
Mayne Dynamic Aperture Imaging with an Adaptive Optics Scanning Laser Ophthalmoscope as an Approach to Studying Light Scatter in the Retina
Chen Line-Field Spectral Domain Optical Coherence Tomography: Design and Biomedical Applications
Geng Wavefront Sensing and High Resolution Adaptive Optics Imaging in the Living Rodent Eye
Hong Investigations into high resolution imaging of the aqueous outflow system and cornea
Mozaffari Structural and Functional Adaptive Optics Retinal Imaging
Mazlin Tomographie optique cohérente pour l’imagerie in vivo de la cornée
Logean On Phase Contrast Imaging of the Inner Retina
Gevaert The three kings= Les rois mages/from the Collection de choeurs of FA Gevaert; the English text by Stewart A. Trench.
Wade High-resolution, in vivo imaging of the human cone photoreceptor mosaic

Legal Events

Date Code Title Description
ENP Entry into the national phase

Ref document number: 2018559733

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17729922

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2017729922

Country of ref document: EP

Effective date: 20181213