WO2016189095A1 - Apparatus for imaging at least one object - Google Patents

Apparatus for imaging at least one object

Info

Publication number
WO2016189095A1
Authority
WO
WIPO (PCT)
Prior art keywords
optical system
spread function
light
point spread
imaging
Prior art date
Application number
PCT/EP2016/061926
Other languages
French (fr)
Inventor
Marcelo NOLLMANN
Jean-Bernard FICHE
Laura OUDJEDI
Original Assignee
Institut National De La Sante Et De La Recherche Medicale (Inserm)
Centre National De La Recherche Scientifique
Université De Montpellier
Priority date
Filing date
Publication date
Application filed by Institut National De La Sante Et De La Recherche Medicale (Inserm), Centre National De La Recherche Scientifique, Université De Montpellier filed Critical Institut National De La Sante Et De La Recherche Medicale (Inserm)
Publication of WO2016189095A1 publication Critical patent/WO2016189095A1/en

Classifications

    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B5/00 Optical elements other than lenses
    • G02B5/18 Diffraction gratings
    • G02B5/1842 Gratings for image generation
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/62 Systems in which the material investigated is excited whereby it emits light or causes a change in wavelength of the incident light
    • G01N21/63 Systems in which the material investigated is excited whereby it emits light or causes a change in wavelength of the incident light optically excited
    • G01N21/64 Fluorescence; Phosphorescence
    • G01N21/645 Specially adapted constructive features of fluorimeters
    • G01N21/6456 Spatial resolved fluorescence measurements; Imaging
    • G01N21/6458 Fluorescence microscopy
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B21/00 Microscopes
    • G02B21/02 Objectives
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/0075 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 with means for altering, e.g. increasing, the depth of field or depth of focus
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/10 Beam splitting or combining systems
    • G02B27/1066 Beam splitting or combining systems for enhancing image performance, like resolution, pixel numbers, dual magnifications or dynamic range, by tiling, slicing or overlapping fields of view
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/10 Beam splitting or combining systems
    • G02B27/1086 Beam splitting or combining systems operating by diffraction only
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/17 Systems in which incident light is modified in accordance with the properties of the material investigated
    • G01N21/25 Colour; Spectral properties, i.e. comparison of effect of material on the light at two or more different wavelengths or wavelength bands
    • G01N21/27 Colour; Spectral properties, i.e. comparison of effect of material on the light at two or more different wavelengths or wavelength bands using photo-electric detection; circuits for computing concentration
    • G01N21/274 Calibration, base line adjustment, drift correction
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B21/00 Microscopes
    • G02B21/36 Microscopes arranged for photographic purposes or projection purposes or digital imaging or video purposes including associated control and data processing arrangements
    • G02B21/365 Control or image processing arrangements for digital or video microscopes
    • G02B21/367 Control or image processing arrangements for digital or video microscopes providing an output produced by processing a plurality of individual source images, e.g. image tiling, montage, composite images, depth sectioning, image comparison
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/42 Diffraction optics, i.e. systems including a diffractive element being designed for providing a diffractive effect
    • G02B27/4205 Diffraction optics, i.e. systems including a diffractive element being designed for providing a diffractive effect having a diffractive optical element [DOE] contributing to image formation, e.g. whereby modulation transfer function MTF or optical aberrations are relevant
    • G02B27/4211 Diffraction optics, i.e. systems including a diffractive element being designed for providing a diffractive effect having a diffractive optical element [DOE] contributing to image formation, e.g. whereby modulation transfer function MTF or optical aberrations are relevant, correcting chromatic aberrations

Definitions

  • the invention relates to an apparatus for imaging at least one object.
  • Fluorescence microscopy has proven to be a powerful tool in biology and physical sciences.
  • the use of visible light as a probe enables direct, non-invasive observation of the organization of biological structures or materials at the micrometer scale with high specificity.
  • smSRM stands for single-molecule super-resolution microscopy.
  • multifocus microscopy is also named multi-focal microscopy.
  • Diffractive Fourier optics is used to create an instant focal series of two-dimensional wide-field images arranged in an array and recorded in a single camera frame.
  • the multi-focus grating is placed in the Fourier plane and splits the emission light from the sample into a set of diffractive orders.
  • a geometrical distortion in the grating introduces a phase-shift that is dependent on diffractive order.
  • the phase shift is calibrated to remove the out-of-focus phase error of a specific plane z, creating an instant focal series.
  • the chromatic dispersion is removed by a multi-panel, blazed chromatic correction grating combined with a multi-faceted prism. In this way, the entire imaging volume is recorded simultaneously and no mechanical movement is required. Acquisition speed is limited by exposure time and the camera readout-rate for a single frame.
  • a specially designed optical module corrects for the chromatic dispersion introduced by the multi-focus grating, allowing this diffractive optics method to be used to image light across the visible spectrum.
  • the invention aims at improving the optical resolution of the above-mentioned apparatus with no additional dose of illumination.
  • the invention concerns an apparatus for imaging at least one object, the apparatus comprising a device adapted to achieve multi-focal microscopy on the at least one object comprising:
  • a first optical system adapted to propagate the emitted light towards the at least one object, the first optical system producing a first point spread function in a plane perpendicular to the direction of propagation of the light, the first point spread function being symmetrical with relation to a central point,
  • the apparatus further comprises a second optical system adapted to modify the first point spread function, the point spread function of both optical systems combined producing a second point spread function in the plane perpendicular to the direction of propagation of the light, the second point spread function being asymmetrical with relation to the central point.
  • the main principle of the invention is the combination of the engineering of the point spread function and multi-focal plane microscopy to obtain instant, super-resolved, thick three-dimensional volumes.
  • a change in the design of the device adapted to achieve multi-focal microscopy is introduced so that the point spread function in each image plane is re-engineered to be asymmetric.
  • the use of the apparatus makes it possible to obtain a much larger depth of field, to use a larger variety of fluorophores, and to achieve a larger axial resolution.
  • the apparatus makes it possible to carry out single-molecule tracking, as each fluorophore can be imaged with considerably smaller excitation intensity.
  • the apparatus might incorporate one or several of the following features, taken in any technically admissible combination:
  • the second point spread function has an elliptical shape defining a maximum radius and a minimum radius, the ratio between the minimum radius and the maximum radius being greater than or equal to 0.1%.
  • the ratio between the minimum radius and the maximum radius being greater than or equal to 10%.
  • the second optical system is an astigmatism system for which a tangential focal plane and a sagittal focal plane are defined, the distance between the tangential focal plane and the sagittal focal plane being comprised between 20 mm and 2000 mm.
  • the second optical system comprises a cylindrical lens.
  • the second optical system is a cylindrical lens.
  • the cylindrical lens has a focal length comprised between 100 mm and 10⁴ mm.
  • a focal plane is defined for the first optical system, the distance between the cylindrical lens and the focal plane being less than or equal to 1000 mm.
  • the second optical system is a spatial light modulator adapted to generate a double-helix phase mask.
  • the second optical system is adapted to achieve a self-bending point spread function technique.
  • figure 1 shows schematically an example of an apparatus for imaging at least one object,
  • figure 2 shows schematically another example of an apparatus for imaging at least one object,
  • figure 3 shows schematically part of the apparatus for imaging at least one object according to figure 2,
  • figure 4 is an optical representation of the apparatus for imaging at least one object according to figure 2.
  • a sample 10 and an apparatus 12 for imaging at least one object of the sample 10 are represented in figure 1.
  • the sample 10 is an assembly of objects.
  • the sample 10 is a biological sample.
  • each object is an organism.
  • the sample 10 comprises an assembly of organisms and a substrate supporting the organisms.
  • the organism is acellular or cellular.
  • Cellular organisms are usually divided into prokaryotes and eukaryotes.
  • Archaea (also called archaebacteria) and bacteria are examples of prokaryotic organisms.
  • Eukaryotic organisms are either unicellular or multicellular.
  • protozoa, amoeba, algae or yeast are unicellular eukaryotic organisms.
  • Multicellular eukaryotic organisms include, for instance, human and non-human mammals, fungi, plants, protists or chromists.
  • the apparatus 12 is adapted to image at least one organism of the sample 10.
  • the apparatus 12 comprises a device 14, a second optical system 16 and a sensor 18.
  • the device 14 is adapted to achieve multi-focal microscopy on the at least one organism.
  • Multi-focal microscopy is a technique which relies on the use of a chirped diffraction grating placed in the Fourier plane of a microscope. Emitted light is diffracted into several central orders, for instance, nine central orders, which, after chromatic correction, are separately imaged on a single camera detector. The spacing between the different focal planes can be adjusted by carefully designing the diffraction grating. With an epifluorescence microscope, the device 14 enables high-resolution three-dimensional imaging in multiple colors at speeds that are around 10 times to 30 times faster than other conventional epifluorescence systems.
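The single-frame recording described above can be pictured as a tiling of the focal-plane images onto one detector. The following is an illustrative Python sketch, not part of the patent: the 3×3 tile layout and the synthetic data are assumptions.

```python
import numpy as np

def tile_focal_planes(stack, grid=(3, 3)):
    """Arrange a stack of focal-plane images into one camera-frame mosaic.

    stack has shape (n_planes, h, w); each diffractive order of the
    multifocus grating is imaged onto one tile of the detector.
    """
    n, h, w = stack.shape
    rows, cols = grid
    assert n == rows * cols, "plane count must match the tile grid"
    frame = np.zeros((rows * h, cols * w), dtype=stack.dtype)
    for k in range(n):
        r, c = divmod(k, cols)
        frame[r * h:(r + 1) * h, c * w:(c + 1) * w] = stack[k]
    return frame

# nine synthetic 64x64 focal planes recorded as one 192x192 frame
stack = np.random.rand(9, 64, 64)
frame = tile_focal_planes(stack)
```

The spacing dz between the planes is set by the grating design; here only the detector layout is modeled.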
  • the device 14 comprises a light unit 20, a first optical system 22, a multifocus grating 24 and a correction unit 26.
  • the light unit 20 is adapted to emit light.
  • the direction of light propagation is defined as the Z-direction, the two other directions X and Y being perpendicular to the direction of light propagation.
  • the other directions X and Y are labeled transverse directions.
  • the second transverse direction Y is not contained in the plane of the sheet.
  • the first optical system 22 is adapted to propagate the emitted light towards the at least one organism of the sample 10.
  • the first optical system 22 produces a first point spread function PSF1 in a plane perpendicular to the direction of propagation of the light Z.
  • the point spread function PSF describes the response of an imaging system to a point source or point object.
  • a more general term for the PSF is a system's impulse response, the PSF being the impulse response of a focused optical system.
  • the PSF in many contexts can be thought of as the extended blob in an image that represents an unresolved object.
  • the degree of spreading (blurring) of the point object is a measure for the quality of an imaging system.
  • In non-coherent imaging systems, such as fluorescent microscopes, telescopes or optical microscopes, the image formation process is linear in power and described by linear system theory. This means that when two objects A and B are imaged simultaneously, the result is equal to the sum of the independently imaged objects. In other words: the imaging of A is unaffected by the imaging of B and vice versa, owing to the non-interacting property of photons.
  • the image of a complex object can then be seen as a convolution of the true object and the PSF.
  • In coherent imaging systems, by contrast, image formation is linear in the complex field. Recording the intensity image then can lead to cancellations or other non-linear effects.
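The linearity-in-power property stated above can be checked numerically. A minimal sketch, assuming a toy Gaussian stand-in for the PSF and FFT-based circular convolution (none of these numbers come from the patent):

```python
import numpy as np

# toy object: two point emitters A and B on a 65x65 grid
h = w = 65
obj_A = np.zeros((h, w)); obj_A[20, 20] = 1.0
obj_B = np.zeros((h, w)); obj_B[44, 40] = 1.0

# symmetric Gaussian standing in for the first point spread function PSF1
y, x = np.mgrid[-8:9, -8:9]
psf = np.exp(-(x**2 + y**2) / (2 * 2.0**2))
psf /= psf.sum()

# embed the kernel at the image centre and build its transfer function
psf_full = np.zeros((h, w))
psf_full[h//2 - 8:h//2 + 9, w//2 - 8:w//2 + 9] = psf
otf = np.fft.fft2(np.fft.ifftshift(psf_full))

def image(obj):
    """Incoherent image formation: circular convolution of the object with the PSF."""
    return np.real(np.fft.ifft2(np.fft.fft2(obj) * otf))

# linearity in power: imaging A and B together equals the sum of separate images
together = image(obj_A + obj_B)
separate = image(obj_A) + image(obj_B)
```

Because the normalized PSF conserves flux, the total intensity of the joint image also equals the sum of the two emitters' intensities.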
  • the first point spread function PSF1 is symmetrical with relation to a central point. This means that the first point spread function PSF1 exhibits a property of rotational symmetry.
  • the first optical system 22 comprises a first lens 28, a mirror 30 and an objective 32.
  • the first lens 28 is adapted to shape the light emitted by the light unit 20.
  • the word “lens” designates either a lens system such as a doublet or a single lens.
  • the mirror 30 is adapted to send the light to the objective 32.
  • the objective 32 is adapted to send the light received from the mirror 30 to the sample 10 and to collect the light emitted by the sample 10.
  • the objective 32 is a microscope objective.
  • the objective 32 is an epifluorescence microscope objective.
  • the multifocus grating 24 is adapted to split and to shift the focus of the sample 10 emission light to form an instant focal series, in which each focal plane corresponds to a diffractive order of the multifocus grating 24.
  • a multifocus grating is a set of repeated identical diffractive elements.
  • the shape of the diffractive element depends on the specific experiment which is carried out.
  • the multifocus grating 24 is located at the Fourier plane of the objective 32.
  • the correction unit 26 is adapted to correct the chromatic dispersion introduced by the multifocus grating 24.
  • dispersion is the phenomenon in which the phase velocity of a wave depends on its frequency. Media having this common property may be termed dispersive media. Sometimes the term chromatic dispersion is used for specificity. Although the term is used in the field of optics to describe light and other electromagnetic waves, dispersion in the same sense can apply to any sort of wave motion such as acoustic dispersion in the case of sound and seismic waves, in gravity waves (ocean waves), and for telecommunication signals propagating along transmission lines (such as coaxial cable) or optical fiber.
  • in a variant, the device 14 is deprived of a multifocus grating and of a correction unit.
  • An example of such a device 14 is described in the article by S. Geissbuehler, A. Sharipov, A. Godinat, N. L. Bocchio, P. A. Sandoz, A. Huss, N. A. Jensen, S. Jakobs, J. Enderlein, F. Gisou van der Goot, E. A. Dubikovskaya, T. Lasser, and M. Leutenegger, entitled "Live-cell multiplane three-dimensional super-resolution optical fluctuation imaging", Nat. Commun., volume 5, page 5830, 2014.
  • the second optical system 16 is adapted to modify the first point spread function PSF1.
  • the point spread function of both optical systems combined produces a second point spread function PSF2 in the plane perpendicular to the direction of propagation of the light Z.
  • the second point spread function PSF2 is asymmetrical with relation to the central point.
  • the sensor 18 is a camera.
  • the camera is a CCD camera.
  • a charge-coupled device is a device for the movement of electrical charge, usually from within the device to an area where the charge can be manipulated, for example conversion into a digital value. This is achieved by "shifting" the signals between stages within the device one at a time. CCDs move charge between capacitive bins in the device, with the shift allowing for the transfer of charge between bins.
  • the camera is an electron multiplied CCD camera, also named emCCD camera
  • the camera is a sCMOS camera.
  • Light is emitted by the light unit 20.
  • the mirror 30 reflects the light to the objective 32, which focuses the light in the back focal plane of the objective 32. This results in a wide-field illumination of the sample 10. The organisms which are stimulated by the light then emit fluorescence light, which is collected by the objective 32.
  • the collected light passes through the multifocus grating 24 and the correction unit 26 so as to achieve multi-focal microscopy.
  • the light is then modified by the second optical system 16, which is adapted to modify the first point spread function PSF1.
  • the second point spread function PSF2 is obtained when the sensor 18 receives the light from the second optical system 16.
  • in this context, the point spread function of the system is evaluated by using the light which is emitted by the sample 10.
  • not only the lateral position (position in the plane perpendicular to the light direction Z) of the fluorophore but also the distance of the fluorophore to the focal plane can be inferred. This does not require the collection of light from the same fluorophore in different planes.
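One way to see how a single image can carry axial information is to measure the x and y widths of the emitter image. The moment-based estimator and the elliptical-Gaussian test image below are illustrative assumptions, not the patent's fitting procedure:

```python
import numpy as np

def elliptical_gaussian(shape, sx, sy):
    """Synthetic single-emitter image with distinct x and y widths (pixels)."""
    y, x = np.mgrid[0:shape[0], 0:shape[1]]
    cy, cx = (shape[0] - 1) / 2.0, (shape[1] - 1) / 2.0
    return np.exp(-((x - cx) ** 2 / (2 * sx ** 2) + (y - cy) ** 2 / (2 * sy ** 2)))

def psf_widths(img):
    """Estimate x and y widths of an emitter image from its second central moments."""
    img = img / img.sum()
    y, x = np.mgrid[0:img.shape[0], 0:img.shape[1]]
    cx, cy = (img * x).sum(), (img * y).sum()
    sx = np.sqrt((img * (x - cx) ** 2).sum())
    sy = np.sqrt((img * (y - cy) ** 2).sum())
    return sx, sy

img = elliptical_gaussian((41, 41), sx=3.0, sy=1.5)
sx, sy = psf_widths(img)
ellipticity = sy / sx   # deviates from 1 as the emitter leaves the focal plane
```

With a calibrated relation between ellipticity and defocus, the widths measured in one plane suffice to place the emitter along Z.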
  • PALM stands for photo-activated localization microscopy and designates a technique according to which the fluorophores are light-activated in a random way.
  • STORM stands for stochastic optical reconstruction microscopy and designates a technique according to which the fluorophores are light-activated in a random way.
  • the proposed apparatus 12 which combines the device 14 and the second optical system 16 for engineering the point spread function enables the detection of single molecules in three dimensions, at nanometer resolution, and with very large depths of field (> 4 micrometers).
  • the apparatus 12 permits higher labeling densities without requiring additional doses of light.
  • Such an apparatus 12 therefore improves speed, signal quality, resolution and depth of field in single-molecule super-resolution microscopy at no additional illumination dose, by combining multi-focal microscopy with shaping of the point spread function.
  • the apparatus 12 allows for a much larger depth of field, for the use of a larger variety of fluorophores, and for a larger axial resolution.
  • it will allow for single-molecule tracking as each fluorophore can be imaged with a considerably smaller excitation intensity.
  • the apparatus 12 is expected to be able to image deeper samples, get better axial resolution for smaller depths of fields, and be usable even with low-emitting fluorophores, such as the commonly used photoactivatable proteins.
  • the second point spread function PSF2 has an elliptical shape defining a maximum radius and a minimum radius, the ratio between the minimum radius and the maximum radius being greater than or equal to 0.1%.
  • the ratio between the minimum radius and the maximum radius is greater than or equal to 10%.
  • the ratio between the minimum radius and the maximum radius is less than or equal to 30%.
  • Such a shape for the second point spread function PSF2 increases the performance of the apparatus 12.
  • the degree of asymmetry introduced in the second point spread function PSF2 is advantageously chosen such that single-molecules can be localized at high-resolution at half the spacing between planes.
  • the resulting degree of asymmetry depends on the brightness of the fluorophore, the excitation power and the distance between the focal planes.
  • the use of the multi-focal microscopy in the apparatus 12 enables simultaneous imaging of nine focal planes separated by a distance dz in the sample 10, but a larger number of planes can be used.
  • for an emitter, n × dz is the position of its closest focal plane (n) of the multi-focal microscopy, Δz denoting the axial distance of the emitter to this plane.
  • the emitter is imaged on the sensor 18 with a signal to noise ratio S(n × dz + Δz).
  • the response of the imaging system without any emitter inside is recorded in the same imaging conditions as for the fluorescent sample.
  • the standard deviation between pixel intensities in this response corresponds to the standard deviation of the background.
  • the resolution of localization-based microscopy techniques depends on their localization precision. In order to get a localization precision under 50 nm, which is typically the resolution required to perform SR imaging, it is desired to have a signal to noise ratio higher than 2.
  • the signal to noise ratio of the image of an emitter located in the middle of two planes is preferably greater than the computed limit of 2.
  • under this condition, the localization precision of the apparatus 12 will be less than 50 nm throughout the imaging volume of the designed multi-focal technique.
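As an illustration of the 50 nm criterion, the classical Thompson et al. (2002) approximation (a standard formula from the localization-microscopy literature, not one given in the patent) relates localization precision to photon count, pixel size, PSF width and background. The numbers below are hypothetical imaging conditions:

```python
import math

def thompson_precision(s, a, N, b):
    """Thompson et al. (2002) localization-precision estimate, in nm.

    s: PSF standard deviation (nm), a: pixel size (nm),
    N: detected photons, b: background noise (std dev, photons/pixel).
    """
    var = s ** 2 / N + a ** 2 / (12 * N) + 8 * math.pi * s ** 4 * b ** 2 / (a ** 2 * N ** 2)
    return math.sqrt(var)

# hypothetical imaging conditions (not values from the patent)
sigma_loc = thompson_precision(s=150.0, a=100.0, N=1000, b=10.0)
meets_sr_criterion = sigma_loc < 50.0   # the 50 nm target discussed above
```

Raising the background term b or lowering N quickly degrades sigma_loc, which is why the mid-plane signal-to-noise ratio bounds the usable plane spacing.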
  • the light unit 20 is a laser.
  • a laser is a device that emits light through a process of optical amplification based on the stimulated emission of electromagnetic radiation.
  • the term "laser” originated as an acronym for "light amplification by stimulated emission of radiation”.
  • a laser differs from other sources of light in that it emits light coherently.
  • Spatial coherence allows a laser to be focused to a tight spot, enabling applications such as laser cutting and lithography. Spatial coherence also allows a laser beam to stay narrow over great distances (collimation), enabling applications such as laser pointers.
  • Lasers can also have high temporal coherence, which allows them to emit light with a very narrow spectrum, for instance, they can emit a single color of light.
  • Temporal coherence can be used to produce pulses of light as short as a femtosecond.
  • the light unit 20 comprises two excitation lasers.
  • the first optical system 22 according to the embodiment of figure 2 comprises more elements than the first optical system 22 according to the embodiment of figure 1.
  • the first optical system 22 further comprises a tube lens 40, a first relay lens 42, a dichroic mirror 44 and a second relay lens 46.
  • the first lens 28 is a doublet.
  • a doublet is a type of lens made up of two simple lenses paired together. Such an arrangement allows more optical surfaces, thicknesses, and formulations, especially as the space between lenses may be considered an "element.” With additional degrees of freedom, optical designers have more latitude to correct more optical aberrations more thoroughly.
  • the tube lens 40 is also a doublet.
  • the tube lens 40 is adapted to form an image. This image is called "primary image” and labeled PI in the remainder of the specification.
  • the two relay lenses 42 and 46 are adapted to create a conjugate pupil plane, or Fourier plane, and the final image plane.
  • the first relay lens 42 is used to relay the image and defines the Fourier plane on which the grating 24 will be positioned.
  • the first relay lens 42 can also be used to modify the magnification of the final image registered on sensor 18.
  • Both the first relay lens 42 and the second relay lens 46 are doublets.
  • the focal length of the first relay lens 42 is equal to 150 mm.
  • the focal length f₂ of the second relay lens 46 is equal to 200 mm.
  • the dichroic mirror 44 is adapted to separate the different light beams with different chromatic dispersion generated by the multi-focus grating 24.
  • the multifocus grating 24 is placed between the first relay lens 42 and the chromatic correction grating 50. More precisely, the multifocus grating 24 is placed in the Fourier plane of the first relay lens 42 (at a distance equal to the focal length of the first relay lens 42).
  • the correction unit 26 comprises a chromatic correction grating 50 and a prism 52.
  • the second optical system 16 is an astigmatism system for which a tangential focal plane and a sagittal focal plane are defined, the distance between the tangential focal plane and the sagittal focal plane being comprised between 20 mm and 2000 mm.
  • the second optical system 16 is a cylindrical lens.
  • a cylindrical lens is a lens which focuses light which passes through onto a line instead of onto a point, as a spherical lens would.
  • the curved face or faces of a cylindrical lens are sections of a cylinder, and focus the image passing through it onto a line parallel to the intersection of the surface of the lens and a plane tangent to it.
  • the lens compresses the image in the direction perpendicular to this line, and leaves it unaltered in the direction parallel to it (in the tangent plane).
  • the axis on which the cylindrical lens is adapted to act is the first transverse axis X.
  • the second optical system 16 has a focal length comprised between 100 mm and 10⁴ mm.
  • the focal length of such a cylindrical lens determines the strength of the astigmatism added and consequently the axial range in which the individual emitter is imaged with a satisfying signal to noise ratio so that the individual emitter can be localized with the required precision.
  • the focal length of the second optical system 16 is equal to 500 mm.
  • a parameter C is comprised between two values A and B when the parameter C is greater than or equal to A and less than or equal to B. Furthermore, the cylindrical lens is positioned at a specific position in the apparatus 12.
  • the cylindrical lens is positioned between the second relay lens 46 and the surface of the sensor 18 where the final image FI is formed.
  • a focal plane is defined for the first optical system 22, the distance between the cylindrical lens and the focal plane being less than or equal to 1000 mm.
  • the distance between the second relay lens 46 and the second optical system 16 is equal to 20 mm.
  • the tailoring of the point spread function is achieved by adding a weak cylindrical lens in the infinity path of the microscope. This makes it possible to obtain the axial position of the organism, since the ellipticity of the second point spread function PSF2 directly depends on the axial position of the organism.
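The dependence of ellipticity on axial position can be modeled with a standard astigmatic-defocus curve; the functional form and numbers below are illustrative assumptions, not calibration values from the patent:

```python
import numpy as np

# assumed astigmatic-defocus model: the cylindrical lens separates the
# x and y focal planes by +/- gamma
sigma0 = 1.3     # in-focus width, pixels (illustrative)
d = 400.0        # depth parameter, nm (illustrative)
gamma = 250.0    # half-separation of the two focal planes, nm (illustrative)

def widths(z):
    """PSF widths along x and y for an emitter at axial position z (nm)."""
    sx = sigma0 * np.sqrt(1.0 + ((z - gamma) / d) ** 2)
    sy = sigma0 * np.sqrt(1.0 + ((z + gamma) / d) ** 2)
    return sx, sy

def z_from_ellipticity(e, z_grid=np.linspace(-400.0, 400.0, 1601)):
    """Invert the monotonic ellipticity(z) = sy/sx curve by grid lookup."""
    sx, sy = widths(z_grid)
    return z_grid[np.argmin(np.abs(sy / sx - e))]

# round trip for an emitter 200 nm above the focal plane
sx, sy = widths(200.0)
z_est = z_from_ellipticity(sy / sx)
```

In practice the curve is measured by scanning a bead through focus; the grid lookup here simply stands in for that calibration.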
  • any optical system adapted to break the symmetry of the first point spread function is to be considered since it leads to the same improvement for the apparatus 12.
  • the second optical system 16 is a spatial light modulator adapted to generate a double-helix phase mask.
  • Such a technique relies on the encoding of the z position of the emitter in the angular orientation of two lobes in the point spread function.
  • an adequate phase mask (in reflection or transmission configuration) is placed at the back focal plane of the objective, inducing a double-helix point spread function.
  • the two lobes rotate depending on the axial position of the emitter along the direction of propagation of light Z.
  • the rotation of the two lobes should be limited to 180° between two focal planes, and the distance between the two lobes should be large enough to separate them.
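A sketch of how the z position could be decoded from the lobe orientation, assuming a hypothetical linear angle-to-z calibration over one inter-plane spacing (the real mapping must be measured for a given phase mask):

```python
import numpy as np

def lobe_angle(p1, p2):
    """Orientation (degrees, folded into [0, 180)) of the line joining the two lobes."""
    dy, dx = p2[0] - p1[0], p2[1] - p1[1]
    return np.degrees(np.arctan2(dy, dx)) % 180.0

# hypothetical linear calibration: the lobe pair rotates through 180 degrees
# over one inter-plane spacing dz
dz = 440.0  # nm, illustrative

def z_from_angle(theta_deg):
    return (theta_deg / 180.0) * dz

# lobe centroids detected at (row, col) pixel positions (hypothetical values)
theta = lobe_angle((10.0, 10.0), (14.0, 14.0))
z = z_from_angle(theta)
```

Folding the angle into [0, 180) reflects the constraint stated above: beyond a half-turn between planes the axial position would become ambiguous.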
  • the second optical system 16 is adapted to achieve a self-bending point spread function technique.
  • the present invention considerably increases the depth of field, the speed of acquisition, and the resolution of these systems.
  • the Applicant has developed a three-dimensional super-resolution microscopy method that enables deep imaging in cells.
  • This technique relies on the effective combination of multifocus microscopy and astigmatic three-dimensional single-molecule localization microscopy.
  • the optical system and the fabrication process of a multifocus grating are described.
  • two strategies for localizing emitters with the imaging method are presented and compared with a previously described deep three-dimensional localization algorithm.
  • the performance of the method is demonstrated by imaging the nuclear envelope of eukaryotic cells, reaching a depth of field of about 4 μm.
  • the optical system is positioned at the imaging output of a standard inverted microscope (Zeiss Axiovert 200) in wide field fluorescence microscopy configuration.
  • the multifocus grating splits the fluorescence signal coming from the sample into a set of diffractive orders, their number and intensity being defined by the shape of the diffraction pattern.
  • a geometrical distortion is applied to the grating pattern in order to get a constant focus step dz between each diffraction order.
  • the multifocus grating also introduces chromatic dispersion and is therefore not by itself suitable for high-resolution broadband imaging. This effect is compensated by adding a chromatic correction grating after the multifocus grating.
  • the chromatic correction grating is placed at a distance where the diffractive orders are well separated, without overlap, and each diffraction order goes through a different blazed grating, specifically designed to reverse the dispersion introduced by the multifocus grating.
  • the multifocus grating (MFG) splits the fluorescence light into nine diffractive orders.
  • this optical element introduces astigmatism in the emitted light wavefront and makes it possible to break the axial symmetry of the PSF.
  • the diameter of the cylindrical lens has to be larger than the space occupied by the different beams coming from the different focal planes in order to induce the same amount of deformation to each diffractive order.
  • the cylindrical lens introduces astigmatism on the PSF of the emitters imaged on the camera chip.
  • a 561 nm excitation laser (Sapphire 561-100 CW, Coherent, UK) and a 405 nm activation laser (Stradus 405-100, Vortran, USA) were combined and collimated by a series of dichroic mirrors and achromatic lenses, individually controlled by an acousto-optic tunable filter (AOTFnC-400.650-TN, AAoptics, France) and focused onto the back focal plane of the objective through the rear port of the microscope.
  • the fluorescence signal emitted from the sample was collected by the objective lens, separated from the excitation wavelengths through a four-band dichroic mirror (zt405/488/561/638rpc, Chroma, USA) and filtered using a bandpass filter (ET600/50m, Chroma, USA).
  • the design and performance of the multifocus grating control the spacing between the imaged planes and the transmission efficiency in a specific spectral range.
  • a binary grating adapted for SMLM was designed and fabricated.
  • the grating function was optimized for an equal distribution of light between nine different diffraction orders using the pixel flipper algorithm.
  • the grating pattern was then calculated, taking into account the emission range of Cy3b and the desired focus step separating each of the nine planes.
  • the numerically generated grating pattern file was sent to a mask printer (DWL 200, Heidelberg, Germany) that uses direct laser writing on a chromium plate coated with photoresist to generate the photolithography mask. After development, this mask was loaded into a stepper (FPEA-3000 i4, Canon, Japan) and the grating pattern was transferred onto a fused silica wafer (500 μm thick) coated with 1 μm-thick ECI positive photoresist by UV exposure.
  • the wafer was then developed and etched by RIE-ICP with CHF3 gas.
  • the etching time was adjusted to reach the exact design depth according to the spectral range selected for the multifocus grating.
  • the newly fabricated multifocus grating was characterized by scanning electron microscopy to ensure that the dimensions and aspect ratio of the grating pattern correspond to the expected theoretical values.
  • atomic force microscopy (AFM) was used to confirm the scanning electron microscopy (SEM) observations and to estimate with nanometer precision the etch depth of the grating.
  • a depth of 663 nm was measured. This depth corresponds to a phase shift of π at 606 nm, very close to the center of the emission filter (600 +/- 25 nm).
  • a HeNe laser was shone through the grating and the intensities in the different diffractive orders were measured.
  • the diffraction efficiency was 63%, close to the maximal theoretical diffraction efficiency of 67% for binary gratings.
  • the light distribution heterogeneity was 5%, and was calculated as the ratio between the standard deviation of the intensities of diffractive orders and the mean intensity of diffractive orders.
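The heterogeneity figure above is a simple std/mean ratio. As an illustration only, with hypothetical per-order intensities (the measured values are not reproduced here), it can be computed as:

```python
import numpy as np

# Hypothetical intensities of the nine diffractive orders (arbitrary units);
# these stand in for the measured values, which are not reproduced here.
order_intensities = np.array([0.98, 1.02, 1.05, 0.95, 1.00, 1.03, 0.97, 1.01, 0.99])

# Heterogeneity = standard deviation of the order intensities / their mean.
heterogeneity = order_intensities.std() / order_intensities.mean()
print(f"light distribution heterogeneity: {heterogeneity:.1%}")
```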
  • the homogeneous distribution of light between diffractive orders was confirmed by the homogeneity of intensity of a sample of beads imaged in different panels.
  • Such an image is the z-projection of a stack of images acquired while scanning a sample of fluorescent beads (Invitrogen TetraSpeck 0.1 μm, Thermo Fisher Scientific, USA) along the optical axis using a piezoelectric stage (Nano F-100, Mad City Labs, USA).
  • the entropy of the images in the different panels during z scan was measured.
  • the axial position of the minimum entropy in each panel corresponds to the focal plane of each panel.
  • the average distance between minimum entropy positions gives the average distance between planes imaged by the microscope: 360 nm.
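This entropy-based focal-plane calibration can be sketched with a synthetic z-stack for a single panel; all numbers below (stack size, blur model, plane index) are hypothetical, not the actual calibration data:

```python
import numpy as np

def image_entropy(img, bins=64):
    """Shannon entropy (bits) of the image's intensity histogram."""
    hist, _ = np.histogram(img, bins=bins)
    p = hist[hist > 0] / hist.sum()
    return float(-(p * np.log2(p)).sum())

# Synthetic z-stack for one panel: a bead image that blurs as it moves away
# from focus (plane index 10 here). An in-focus spot concentrates intensity in
# few pixels, so its histogram entropy is minimal at the focal plane.
yy, xx = np.mgrid[-32:32, -32:32]
z_focus = 10
stack = []
for z in range(21):
    sigma = 1.5 + 0.5 * abs(z - z_focus)      # blur grows with defocus
    spot = np.exp(-(xx**2 + yy**2) / (2.0 * sigma**2))
    stack.append(spot / spot.sum())           # total intensity conserved
entropies = [image_entropy(img) for img in stack]

# The focal plane of the panel is where the entropy is minimal.
focal_index = int(np.argmin(entropies))
```

Averaging the spacing between the minimum-entropy positions of the nine panels then gives the inter-plane distance (360 nm in the measurement above).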
  • the cylindrical lens positioned before the camera modifies the wavefront of the fluorescence light of single emitters. As stated before, this translates into an asymmetric PSF in the imaging plane; the ratio between the PSF width in x (wx) and in y (wy) is directly related to the axial position of the emitter.
  • the position of single emitters was determined by fitting their PSF with an asymmetric two-dimensional Gaussian.
  • the center of this Gaussian corresponds to the lateral position of the emitter, while the axial position is encoded in the PSF asymmetry.
  • a careful calibration was performed by scanning a sample of fluorescent beads immobilized on a glass slide along the optical axis with a piezoelectric stage. A bright isolated bead was then selected on each panel of the image stack and a fitting technique was used to infer the widths of its asymmetric PSF along the x and y directions (wx and wy).
  • the widths were fitted by a polynomial function of the third degree. These fitted functions were used as calibrations to compute the z position of every emitter localized in each plane.
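The calibration and z-lookup steps can be sketched as follows; the calibration coefficients and measured widths are hypothetical stand-ins for illustration, not the actual calibration values:

```python
import numpy as np

# Hypothetical astigmatic calibration: PSF widths wx(z) and wy(z) (nm) vary in
# opposite senses with defocus z (nm); the coefficients are illustrative only.
z_cal = np.linspace(-400.0, 400.0, 81)
wx_cal = 250.0 + 0.45 * z_cal + 2e-4 * z_cal**2 + 1e-7 * z_cal**3
wy_cal = 250.0 - 0.45 * z_cal + 2e-4 * z_cal**2 - 1e-7 * z_cal**3

# Third-degree polynomial fits serve as the calibration curves.
px = np.polyfit(z_cal, wx_cal, 3)
py = np.polyfit(z_cal, wy_cal, 3)

def z_from_widths(wx, wy):
    """Infer the axial position whose calibrated (wx, wy) best matches."""
    z_grid = np.linspace(-400.0, 400.0, 4001)
    err = (np.polyval(px, z_grid) - wx) ** 2 + (np.polyval(py, z_grid) - wy) ** 2
    return float(z_grid[np.argmin(err)])

# An emitter measured with wx > wy maps to a positive defocus here.
z_hat = z_from_widths(340.0, 165.0)
```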
  • the localizations of a large number of beads (>10) from the calibration stack also enabled the calculation of the geometrical transformation (translation, rotation etc.) between the different panels. Once this transformation is calculated, it can be used to align the localizations extracted from the stack of images acquired during the axial scanning of the beads sample.
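A least-squares rigid fit (rotation plus translation, i.e. the Kabsch algorithm) is one way to estimate such a panel-to-panel transformation from matched bead localizations; the sketch below uses synthetic positions and is not the Applicant's exact procedure:

```python
import numpy as np

def fit_rigid_transform(src, dst):
    """Least-squares rotation + translation mapping src points onto dst
    (Kabsch algorithm); src, dst are (N, 2) arrays of bead localizations."""
    sc, dc = src.mean(0), dst.mean(0)
    H = (src - sc).T @ (dst - dc)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))       # avoid reflections
    R = Vt.T @ np.diag([1.0, d]) @ U.T
    t = dc - R @ sc
    return R, t

# Hypothetical bead positions in a reference panel, and the same beads seen in
# another panel rotated by 1 degree and shifted by (5, -3) pixels.
rng = np.random.default_rng(0)
beads = rng.uniform(0, 512, size=(12, 2))
theta = np.deg2rad(1.0)
R_true = np.array([[np.cos(theta), -np.sin(theta)], [np.sin(theta), np.cos(theta)]])
panel = beads @ R_true.T + np.array([5.0, -3.0])

R, t = fit_rigid_transform(beads, panel)
aligned = beads @ R.T + t    # reference localizations mapped into the panel
```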
  • the reconstructed trajectories, recorded over about 4 μm, show that the Applicant's method was able to localize emitters over an extended axial depth of field.
  • the Applicant has evaluated the performance of multiple-plane detection and the optical system by imaging fluorescent beads on a glass slide at different excitation powers, as represented in Figures 5 to 7.
  • the first excitation power P1 is strictly lower than the second power P2, and the second power P2 is strictly lower than the third excitation power P3.
  • Only the plane corresponding to the position closest to the emitters focal plane is displayed.
  • the colormap corresponds to the log of the signal. All images are represented using the same color scale.
  • Figures 8 to 11 are graphs illustrating the comparison of localization precisions for (•) MFM according to the prior art, (o) an asymmetric Gaussian fit and (x) cross-correlation localization methods.
  • Figure 8 represents the lateral localization precision as a function of excitation power. Arrows indicate the powers corresponding to the images shown in Figures 5 to 7.
  • Figure 9 represents the lateral localization precision as a function of the number of detected photons.
  • Figure 10 represents the axial localization precision as a function of excitation power.
  • Figure 11 represents the axial localization precision as a function of the number of detected photons.
  • filled symbols represent localization precisions measured for single Cy3b molecules, and grayed areas symbolize the typical range of detected photons associated with single-molecule emission.
  • the lateral localization precision is comparable in both approaches for different numbers of photons collected, as represented in Figures 8 and 9, despite the localization of emitters in a single plane in the developed optical system as opposed to multiple-plane detection in prior art MFM.
  • the analysis time for the developed optical system was five times shorter than for conventional MFM, independently of the number of frames analyzed (from 10 to 10000).
  • Drift correction was performed post-acquisition by three-dimensional tracking of a fluorescent bead attached to the surface of the coverslip.
  • the reconstructed image shows a homogeneous distribution of lamina around the nuclear envelope of an S2 cell. Spots of higher density appear on this image. Furthermore, the imaging depth was over 4 μm.
  • the Applicant measured a localization precision of about 30 nm in the lateral direction and about 70 nm in the axial direction. This estimate is in good agreement with the localization precision measured for single Cy3b molecules deposited on a coverslip (see Figures 9 and 11).
  • the Applicant performed two-color imaging of the Fab-7 genomic locus in chromosome 3R in S2 cells and of a large, extended (~300 kbp) chromatin domain called the bithorax complex (BX-C).
  • Fab-7 was labelled with Cy3b using a 4 kbp FISH probe.
  • BX-C was labeled with Alexa 647 using the oligopaint DNA hybridization method.
  • the Fab-7 sequence is part of BX-C; it would thus be expected to appear within the volume defined by BX-C. Cy3b imaging was performed using the same experimental conditions as those described in the previous section.
  • Alexa 647 was excited at 643 nm and activated at 405 nm. 10000 frames were sequentially acquired for each color. As expected, the two different genomic targets exhibited very different sizes and shapes. BX-C was about 500 nm in size, consistent with previous measurements (Beliveau, 2015). Fab-7 was within the BX-C volume, as expected.
  • the Applicant measured a size of about 60 nm in the lateral direction and about 100 nm in the axial direction.
  • the size of the Fab-7 locus imaged under identical conditions but with conventional two-dimensional STORM is about 60 nm (data not shown).
  • the resolution of the setup under real biological conditions is at least 60 nm in the lateral direction and about 100 nm in the axial direction.
  • the Applicant has combined multifocus microscopy with PSF engineering to enable the fast detection of single emitters in thick samples.
  • the Applicant demonstrated the ability of this technique to detect single emitters over an extended axial imaging depth (greater than 4 μm) while maintaining high lateral and axial localization precisions even for low emitter intensities.
  • Cy3b super-resolution imaging of antibody-labeled lamin was used to demonstrate three-dimensional super-resolution imaging with the developed optical setup, even though it is well known that this dye is not the best suited for super-resolution. Further optimization of the multifocus grating and chromatic correction grating for far-red dyes (i.e. Alexa 647 or Cy5) should thus improve the performance of the developed optical setup and allow for better resolution. Overall, these improvements will be key for numerous biological applications.
  • Previous multifocus microscopy methods used the detection of single emitters in multiple planes to achieve three-dimensional nanometer localization. This method is computationally intensive, as it requires the alignment and assembly of the nine imaging planes into a three-dimensional volume, and the non-linear fitting of the PSF of each detected emitter. Instead, the developed optical setup requires only the detection and localization of emitters in a single imaging plane; this allows for an increase in the distance between MFM planes to reach thicker axial imaging depths. Importantly, this method also allows for a considerable increase in image reconstruction speed without sacrificing localization precision, as it requires the fitting of the emitter PSF in a single plane to yield a three-dimensional localization.
  • the invention opens the door to further combination of MFM and other PSF engineering methods.
  • Combining MFM with adaptive optics would enable three-dimensional super-resolution imaging at high penetration depths, and would further lead to an improvement in the photon budget to increase localization precision and extend imaging depth.
  • Use of other PSF engineering methods such as the double helix PSF or adaptations to use other sensors could lead to a considerable increase in the axial and lateral imaging ranges. Excitingly, these future improvements have the potential to further empower superresolution methods by enabling the imaging of a larger variety of biological specimens.

Abstract

The invention concerns an apparatus (12) for imaging at least one object, the apparatus (12) comprising: - a device (14) adapted to achieve multi-focal microscopy on the at least one object comprising: - a light unit (20) adapted to emit light, and - a first optical system (22) adapted to propagate the emitted light towards the at least one object, the first optical system (22) producing a first point spread function in a plane perpendicular to the direction of propagation of the light, the first point spread function being symmetrical with relation to a central point, - a second optical system (16) adapted to modify the first point spread function, the point spread function of both optical systems combined producing a second point spread function in the plane perpendicular to the direction of propagation of the light, the second point spread function being asymmetrical with relation to the central point.

Description

APPARATUS FOR IMAGING AT LEAST ONE OBJECT
TECHNICAL FIELD OF THE INVENTION
The invention relates to an apparatus for imaging at least one object.
BACKGROUND OF THE INVENTION
Fluorescence microscopy has proven to be a powerful tool in biology and physical sciences. The use of visible light as a probe enables direct, non-invasive observation of the organization of biological structures or materials at the micrometer scale with high specificity.
However, the maximum resolution attainable in standard fluorescence microscopy is intrinsically limited by the diffraction limit of light and is several orders of magnitude lower than for X-ray or electron tomography, typically around 250 nanometers (nm).
It is therefore desirable to develop methods or devices enabling an increase in the attainable optical resolution.
A method termed single-molecule super-resolution microscopy (smSRM) that relies on the direct detection of single molecules has been developed. Such method is able to increase the resolution of conventional fluorescence microscopes by a factor of about 10 in the lateral direction. Such methodology has been extended to the third dimension.
However, most of these methods are slow and are limited to a very close distance away from the water-glass interface or provide a reduced depth of field, typically between 500 nm and 1000 nm.
In this regard, it is known from an abstract by Sara Abrahamsson, Cori Bargmann and Mats Gustafsson, entitled "Expanding the capabilities of multifocus microscopy (MFM)" and presented at the conference Focus on Microscopy 2013 (Maastricht, The Netherlands, March 24-27, 2013), that multifocus microscopy (MFM), also named multi-focal microscopy, is an imaging method that enables researchers to study quickly moving living samples over extended three-dimensional volumes. Diffractive Fourier optics is used to create an instant focal series of two-dimensional wide-field images arranged in an array and recorded in a single camera frame.
The multi-focus grating is placed in the Fourier plane and splits the emission light from the sample into a set of diffractive orders. A geometrical distortion in the grating introduces a phase shift that is dependent on diffractive order. The phase shift is calibrated to remove the out-of-focus phase error of a specific plane z, creating an instant focal series. The chromatic dispersion is removed by a multi-panel, blazed chromatic correction grating combined with a multi-faceted prism. In this way, the entire imaging volume is recorded simultaneously and no mechanical movement is required. Acquisition speed is limited by exposure time and the camera readout rate for a single frame. A specially designed optical module corrects for the chromatic dispersion introduced by the multi-focus grating and allows this diffractive optics method to be used to image light across the visible spectrum.
SUMMARY OF THE INVENTION
The invention aims at improving the optical resolution of the above-mentioned apparatus with no additional dose of illumination.
To this end, the invention concerns an apparatus for imaging at least one object, the apparatus comprising a device adapted to achieve multi-focal microscopy on the at least one object comprising:
- a light unit adapted to emit light, and
- a first optical system adapted to propagate the emitted light towards the at least one object, the first optical system producing a first point spread function in a plane perpendicular to the direction of propagation of the light, the first point spread function being symmetrical with relation to a central point,
The apparatus further comprises a second optical system adapted to modify the first point spread function, the point spread function of both optical systems combined producing a second point spread function in the plane perpendicular to the direction of propagation of the light, the second point spread function being asymmetrical with relation to the central point.
The main principle of the invention is the combination of the engineering of the point spread function and multi-focal plane microscopy to obtain instant, super-resolved, thick three-dimensional volumes. A change in the design of the device adapted to achieve multi-focal microscopy is introduced so that the point spread function in each image plane is re-engineered to be asymmetric.
Thus, for each of the nine planes, it becomes possible to infer not only the lateral position of the fluorophore but also the distance of the fluorophore to the focal plane. This does not require the collection of light from the same fluorophore in different planes.
Because only a single plane is needed to detect single molecules in three dimensions, the use of the apparatus enables a much larger depth of field, the use of a larger variety of fluorophores, and a larger axial resolution.
In addition, the use of the apparatus makes it possible to carry out single-molecule tracking, as each fluorophore can be imaged with a considerably smaller excitation intensity.
According to further aspects of the invention, which are advantageous but not compulsory, the apparatus might incorporate one or several of the following features, taken in any technically admissible combination:
- the second point spread function has an elliptical shape defining a maximum radius and a minimum radius, the ratio between the minimum radius and the maximum radius being greater than or equal to 0.1%.
- the ratio between the minimum radius and the maximum radius being greater than or equal to 10%.
- the second optical system is an astigmatism system for which a tangential focal plane and a sagittal focal plane are defined, the distance between the tangential focal plane and the sagittal focal plane being between 20 mm and 2000 mm.
- the second optical system comprises a cylindrical lens.
- the second optical system is a cylindrical lens.
- the cylindrical lens has a focal length between 100 mm and 104 mm.
- a focal plane is defined for the first optical system, the distance between the cylindrical lens and the focal plane being less than or equal to 1000 mm.
- the second optical system is a spatial light modulator adapted to generate a double-helix phase mask.
- the second optical system is adapted to achieve a self-bending point spread function technique.
BRIEF DESCRIPTION OF THE DRAWINGS
The invention will be better understood on the basis of the following description which is given in correspondence with the annexed figures and as an illustrative example, without restricting the object of the invention. In the annexed figures:
- figure 1 shows schematically an example of apparatus for imaging at least one object,
- figure 2 shows schematically another example of apparatus for imaging at least one object,
- figure 3 shows schematically part of the apparatus for imaging at least one object according to figure 2,
- figure 4 is an optical representation of the apparatus for imaging at least one object according to figure 2, and
- figures 5 to 11 show experimental results obtained when using an apparatus for imaging according to figure 2.
DETAILED DESCRIPTION OF SOME EMBODIMENTS
A sample 10 and an apparatus 12 for imaging at least one object of the sample 10 are represented in figure 1.
The sample 10 is an assembly of objects.
According to the specific example of figure 1 , the sample 10 is a biological sample. For instance, each object is an organism.
The sample 10 comprises an assembly of organisms and a substrate supporting the organisms.
Depending on the case, the organism is acellular or cellular. Among cellular organisms, prokaryotes are usually distinguished from eukaryotes. Archaea (also called archaebacteria) and bacteria are examples of prokaryotic organisms. Eukaryotic organisms are either unicellular or multicellular. For example, protozoa, amoebae, algae and yeasts are unicellular eukaryotic organisms. Multicellular eukaryotic organisms are, for instance, human and non-human mammals, fungi, plants, protists or chromists.
The apparatus 12 is adapted to image at least one organism of the sample 10.
The apparatus 12 comprises a device 14, a second optical system 16 and a sensor 18.
The device 14 is adapted to achieve multi-focal microscopy on the at least one organism.
Multi-focal microscopy (also known by the acronym MFM) is a technique which relies on the use of a chirped diffraction grating placed in the Fourier plane of a microscope. Emitted light is diffracted into several central orders, for instance, nine central orders, which, after chromatic correction, are separately imaged on a single camera detector. The spacing between the different focal planes can be adjusted by carefully designing the diffraction grating. With an epifluorescence microscope, the device 14 enables high-resolution three-dimensional imaging in multiple colors at speeds that are around 10 times to 30 times faster than other conventional epifluorescence systems.
The device 14 comprises a light unit 20, a first optical system 22, a multifocus grating 24 and a correction unit 26.
The light unit 20 is adapted to emit light.
In the following, the notions of "upstream" and "downstream" are defined in reference to the sense of light propagation.
Furthermore, the direction of light propagation is defined as the Z-direction, the two other directions X and Y being perpendicular to the direction of light propagation. In the following, the directions X and Y are called transverse directions. In figure 1, the second transverse direction Y is not contained in the plane of the sheet.
The first optical system 22 is adapted to propagate the emitted light towards the at least one organism of the sample 10.
The first optical system 22 produces a first point spread function PSF1 in a plane perpendicular to the direction of propagation of the light Z.
The point spread function PSF describes the response of an imaging system to a point source or point object. A more general term for the PSF is a system's impulse response, the PSF being the impulse response of a focused optical system. The PSF in many contexts can be thought of as the extended blob in an image that represents an unresolved object.
In functional terms, it is the spatial domain version of the transfer function of the imaging system. It is a useful concept in Fourier optics, astronomical imaging, electron microscopy and other imaging techniques such as 3D microscopy (like in confocal laser scanning microscopy) and fluorescence microscopy.
The degree of spreading (blurring) of the point object is a measure for the quality of an imaging system. In non-coherent imaging systems such as fluorescent microscopes, telescopes or optical microscopes, the image formation process is linear in power and described by linear system theory. This means that when two objects A and B are imaged simultaneously, the result is equal to the sum of the independently imaged objects. In other words: the imaging of A is unaffected by the imaging of B and vice versa, owing to the non-interacting property of photons.
The image of a complex object can then be seen as a convolution of the true object and the PSF. However, when the detected light is coherent, image formation is linear in the complex field. Recording the intensity image then can lead to cancellations or other non-linear effects.
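This convolution picture of incoherent image formation can be illustrated with a short sketch, using a symmetric Gaussian as a stand-in for the first point spread function PSF1 (all parameters hypothetical):

```python
import numpy as np

# Incoherent image formation: the recorded image is the true object convolved
# with the PSF. A symmetric Gaussian here stands in for PSF1.
yy, xx = np.mgrid[-16:16, -16:16]
psf = np.exp(-(xx**2 + yy**2) / (2.0 * 2.0**2))
psf /= psf.sum()                     # PSF normalized to unit total intensity

obj = np.zeros((32, 32))
obj[10, 10] = 1.0                    # two point emitters, imaged simultaneously
obj[10, 20] = 1.0

# Circular convolution via FFT (adequate while the object stays away from the
# edges of the field).
image = np.real(np.fft.ifft2(np.fft.fft2(obj) * np.fft.fft2(np.fft.ifftshift(psf))))
```

Because the process is linear in power, the image of both emitters equals the sum of their individually imaged blobs, and the total intensity is conserved.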
The first point spread function PSF1 is symmetrical with relation to a central point. This means that the first point spread function PSF1 exhibits a property of rotational symmetry.
In the example of figure 1 , the first optical system 22 comprises a first lens 28, a mirror 30 and an objective 32.
The first lens 28 is adapted to shape the light emitted by the light unit 20. In the present specification, the word "lens" designates either a lens system such as a doublet or a single lens.
The mirror 30 is adapted to send the light to the objective 32. The objective 32 is adapted to send the light received from the mirror 30 to the sample 10 and to collect the light emitted by the sample 10.
According to the example of figure 1, the objective 32 is a microscope objective.
Preferably, the objective 32 is an epifluorescence microscope objective.
The multifocus grating 24 is adapted to split and to shift the focus of the sample 10 emission light to form an instant focal series, in which each focal plane corresponds to a diffractive order of the multifocus grating 24.
As an example, a multifocus grating is an array of repeated identical diffractive elements. The shape of the diffractive element depends on the specific experiment which is carried out.
The multifocus grating 24 is located at the Fourier plane of the objective 32.
The correction unit 26 is adapted to correct the chromatic dispersion introduced by the multifocus grating 24.
In optics, dispersion is the phenomenon in which the phase velocity of a wave depends on its frequency. Media having this common property may be termed dispersive media. Sometimes the term chromatic dispersion is used for specificity. Although the term is used in the field of optics to describe light and other electromagnetic waves, dispersion in the same sense can apply to any sort of wave motion such as acoustic dispersion in the case of sound and seismic waves, in gravity waves (ocean waves), and for telecommunication signals propagating along transmission lines (such as coaxial cable) or optical fiber.
According to another embodiment, the device 14 is deprived of a multifocus grating and of a correction unit. An example of such a device 14 is described in the article by S. Geissbuehler, A. Sharipov, A. Godinat, N. L. Bocchio, P. A. Sandoz, A. Huss, N. A. Jensen, S. Jakobs, J. Enderlein, F. Gisou van der Goot, E. A. Dubikovskaya, T. Lasser, and M. Leutenegger, entitled "Live-cell multiplane three-dimensional super-resolution optical fluctuation imaging", Nat. Commun., volume 5, page 5830, 2014.
The second optical system 16 is adapted to modify the first point spread function PSF1 .
The point spread function of both optical systems combined produces a second point spread function PSF2 in the plane perpendicular to the direction of propagation of the light Z.
The second point spread function PSF2 is asymmetrical with relation to the central point.
The sensor 18 is a camera. For instance, the camera is a CCD camera.
A charge-coupled device (CCD) is a device for the movement of electrical charge, usually from within the device to an area where the charge can be manipulated, for example conversion into a digital value. This is achieved by "shifting" the signals between stages within the device one at a time. CCDs move charge between capacitive bins in the device, with the shift allowing for the transfer of charge between bins.
In a specific embodiment, the camera is an electron-multiplied CCD camera, also named emCCD camera.
According to another embodiment, the camera is a sCMOS camera.
Operation of the apparatus 12 is now described by illustrating an example of using the apparatus 12 for imaging at least one organism.
Light is emitted by the light unit 20.
Light passes through the first lens 28 and is transmitted to the mirror 30.
The mirror 30 reflects the light to the objective 32, which focuses the light in the back focal plane of the objective 32. This results in a wide-field illumination of the sample 10. Then the organisms which are stimulated by light emit fluorescence light, which is collected by the objective 32.
The collected light passes through the multifocus grating 24 and the correction unit 26 so as to achieve a multi-focal microscopy.
The light is then modified by the second optical system 16 which is adapted to modify the first point spread function PSF1 .
The second point spread function PSF2 is obtained when the sensor 18 receives the light from the second optical system 16.
The combination of the engineering of the point spread function with multi-focal microscopy makes it possible to obtain instant, super-resolved, thick three-dimensional volumes. More precisely, a change in the original multi-focal microscopy design is introduced so that the PSF in each image plane is re-engineered to render it asymmetric.
In this example, the point spread function of the system is evaluated by using the light which is emitted by the sample 10.
Thus, for each of the nine planes, not only the lateral position (the position in the plane perpendicular to the light direction Z) of the fluorophore but also the distance of the fluorophore to the focal plane can be inferred. This does not require the collection of light from the same fluorophore in different planes.
Such a property would also be valid if a different number of planes were considered, notably a larger one. The strong sensitivity of multi-focal microscopy makes the apparatus 12 suitable for imaging single molecules.
Furthermore, the multi-focal microscopy is compatible with PALM technique. PALM stands for photo-activated localization microscopy and designates a technique according to which the fluorophores are light-activated in a random way.
Similarly, the multi-focal microscopy is compatible with STORM technique. STORM stands for stochastic optical reconstruction microscopy and designates a technique according to which the fluorophores are light-activated in a random way.
The proposed apparatus 12 which combines the device 14 and the second optical system 16 for engineering the point spread function enables the detection of single molecules in three dimensions, at nanometer resolution, and with very large depths of field (> 4 micrometers).
In addition to the increased depth of field, the use of the multi-focal microscopy is considerably faster than methods known in the prior art. A factor of 10 to 50 can be reached.
The apparatus 12 permits higher labeling densities without requiring additional doses of light.
Such an apparatus 12 therefore makes it possible to obtain speed, signal quality, resolution and depth of field improvements in single-molecule super-resolution microscopy at no additional illumination dose, by combining multi-focal microscopy with shaping of the point spread function.
In other words, the apparatus 12 allows for a much larger depth of field, for the use of a larger variety of fluorophores, and for a larger axial resolution. In addition, it will allow for single-molecule tracking as each fluorophore can be imaged with a considerably smaller excitation intensity.
Moreover, the apparatus 12 is expected to be able to image deeper samples, get better axial resolution for smaller depths of fields, and be usable even with low-emitting fluorophores, such as the commonly used photoactivatable proteins.
According to a specific embodiment, the second point spread function PSF2 has an elliptical shape defining a maximum radius and a minimum radius, the ratio between the minimum radius and the maximum radius being greater than or equal to 0.1%.
According to a more specific embodiment, the ratio between the minimum radius and the maximum radius is greater than or equal to 10%.
According to another embodiment, the ratio between the minimum radius and the maximum radius is less than or equal to 30%. Such a shape for the second point spread function PSF2 increases the performance of the apparatus 12.
More generally, the degree of asymmetry introduced in the second point spread function PSF2 is advantageously chosen such that single-molecules can be localized at high-resolution at half the spacing between planes. The resulting degree of asymmetry depends on the brightness of the fluorophore, the excitation power and the distance between the focal planes.
The use of multi-focal microscopy in the apparatus 12 enables the simultaneous imaging of nine focal planes separated by a distance dz in the sample 10, but a larger number of planes can be used.
We consider a single organism as a single emitter located at an axial position z along the direction of propagation of light Z:
z = n × dz + Δz

where:
• n × dz is the position of its closest focal plane (n) of the multi-focal microscopy, and
• Δz ≤ dz/2.
The emitter is imaged on the sensor 18 with a signal to noise ratio S(n × dz + Δz). We define the signal to noise ratio as the ratio between the maximum intensity of the single-molecule signal and the standard deviation of the background.
Usually, the response of the imaging system is recorded without any emitter inside, under the same imaging conditions as for the fluorescent sample. The standard deviation of the pixel intensities then corresponds to the standard deviation of the background.
The resolution of localization based microscopy techniques depends on their localization precision. In order to get a localization precision under 50 nm, which is typically the resolution required to perform SR imaging, it is desired to have a signal to noise ratio higher than 2.
As a consequence, for the localization precision in all three dimensions to be below 50 nm, the signal to noise ratio of the image of an emitter located exactly in the middle of two planes is preferably greater than the computed limit of 2:
S(n × dz + dz/2) > 2
In such conditions, the localization precision of our apparatus 12 will be less than 50 nm all throughout the imaging volume of the designed multi-focal technique.
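A minimal sketch of this decomposition and of the worst-case criterion, assuming a caller-supplied model of S as a function of axial position (all names are illustrative, not from the patent):

```python
import math

def decompose_axial_position(z, dz):
    """Split an axial position z into the index n of the nearest focal
    plane (planes spaced by dz) and the residual offset delta_z,
    with |delta_z| <= dz / 2 as in the text."""
    n = round(z / dz)
    delta_z = z - n * dz
    return n, delta_z

def snr(peak_intensity, background_std):
    """Signal to noise ratio as defined in the text: maximum intensity of
    the single-molecule signal over the standard deviation of the background."""
    return peak_intensity / background_std

def localizable(z, dz, snr_at):
    """Check the criterion S(n * dz + dz/2) > 2 at the worst-case position,
    i.e. midway between two focal planes; snr_at models S(.)."""
    n, _ = decompose_axial_position(z, dz)
    return snr_at(n * dz + dz / 2) > 2
```

With dz = 0.36 μm (close to the 360 nm plane spacing measured later in the experimental section), an emitter at z = 0.5 μm is assigned to plane n = 1 with an offset of 0.14 μm.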
Another embodiment of apparatus 12 is described in reference to figures 2 to 4. Each remark made in reference to the embodiment of figure 1 applies to the embodiment illustrated by figures 2 to 4.
According to the example of figure 2, the light unit 20 is a laser.
A laser is a device that emits light through a process of optical amplification based on the stimulated emission of electromagnetic radiation. The term "laser" originated as an acronym for "light amplification by stimulated emission of radiation". A laser differs from other sources of light in that it emits light coherently. Spatial coherence allows a laser to be focused to a tight spot, enabling applications such as laser cutting and lithography. Spatial coherence also allows a laser beam to stay narrow over great distances (collimation), enabling applications such as laser pointers. Lasers can also have high temporal coherence, which allows them to emit light with a very narrow spectrum, for instance, they can emit a single color of light. Temporal coherence can be used to produce pulses of light as short as a femtosecond.
In the specific example of figure 2, the light unit 20 is two excitation lasers.
The two excitation lasers are adapted to emit light at wavelengths of 488 nanometers (nm) and 561 nanometers, respectively.
The first optical system 22 according to the embodiment of figure 2 comprises more elements than the first optical system 22 according to the embodiment of figure 1.
In addition to the first lens 28, the mirror 30 and the objective 32, in the example of figure 2, the first optical system 22 further comprises a tube lens 40, a first relay lens 42, a dichroic mirror 44 and a second relay lens 46.
According to the embodiment of figure 2, the first lens 28 is a doublet.
In optics, a doublet is a type of lens made up of two simple lenses paired together. Such an arrangement allows more optical surfaces, thicknesses, and formulations, especially as the space between lenses may be considered an "element." With additional degrees of freedom, optical designers have more latitude to correct more optical aberrations more thoroughly.
The tube lens 40 is also a doublet.
The tube lens 40 is adapted to form an image. This image is called "primary image" and labeled PI in the remainder of the specification.
The two relay lenses 42 and 46 are adapted to create a conjugate pupil plane, or Fourier plane, and the final image plane.
The first relay lens 42 is used to relay the image and defines the Fourier plane on which the grating 24 will be positioned. The first relay lens 42 can also be used to modify the magnification of the final image registered on sensor 18.
Both the first relay lens 42 and the second relay lens 46 are doublets. The focal length of the first relay lens 42 is equal to 150 mm.
The focal length f2 of the second relay lens 46 is equal to 200 mm.
The dichroic mirror 44 is adapted to separate the different light beams with different chromatic dispersion generated by the multi-focus grating 24.
The multifocus grating 24 is placed between the first relay lens 42 and the chromatic correction grating 50. More precisely, the multifocus grating 24 is placed in the Fourier plane of the first relay lens 42 (at a distance equal to the focal length of the first relay lens 42).
In the example of figure 2, the correction unit 26 comprises a chromatic correction grating 50 and a prism 52.
In addition, the second optical system 16 is an astigmatism system for which a tangential focal plane and a sagittal focal plane are defined, the distance between the tangential focal plane and the sagittal focal plane being comprised between 20 mm and 2000 mm.
Generally, light rays lying in the tangential and sagittal planes are refracted differently, and the two sets of rays intersect the chief ray at different image points, termed the tangential focal plane and the sagittal focal plane.
More specifically, as illustrated, the second optical system 16 is a cylindrical lens. A cylindrical lens is a lens which focuses light which passes through onto a line instead of onto a point, as a spherical lens would. The curved face or faces of a cylindrical lens are sections of a cylinder, and focus the image passing through it onto a line parallel to the intersection of the surface of the lens and a plane tangent to it. The lens compresses the image in the direction perpendicular to this line, and leaves it unaltered in the direction parallel to it (in the tangent plane).
According to the example of figure 2, the axis on which the cylindrical lens is adapted to act is the first transverse axis X.
Along the first transverse axis X, the second optical system 16 has a focal length comprised between 10⁰ mm and 10⁴ mm.
The focal length of such a cylindrical lens determines the strength of the astigmatism added and consequently the axial range in which the individual emitter is imaged with a satisfying signal to noise ratio so that the individual emitter can be localized with the required precision.
In the specific example illustrated, the focal length of the second optical system 16 is equal to 500 mm.
By the expression "comprised between", it should be understood that a parameter C is comprised between two values A and B when the parameter C is superior or equal to A and inferior or equal to B. Furthermore, the cylindrical lens is positioned at a specific position in the apparatus 12.
The cylindrical lens is positioned between the second relay lens 46 and the surface of the sensor 18 where the final image Fl is formed.
More precisely, a focal plane is defined for the first optical system 22, the distance between the cylindrical lens and the focal plane being inferior or equal to 1000 mm.
In the specific example illustrated, the distance between the second relay lens 46 and the second optical system 16 is equal to 20 mm.
Using an astigmatism system is an easy way to tailor the point spread function. As illustrated, the tailoring of the point spread function is achieved by adding a weak cylindrical lens in the infinity path of the microscope. This makes it possible to obtain the axial position of the organism, since the ellipticity of the second point spread function PSF2 directly depends on the axial position of the organism.
However, any optical system adapted to break the symmetry of the first point spread function is to be considered since it leads to the same improvement for the apparatus 12.
For instance, the second optical system 16 is a spatial light modulator adapted to generate a double-helix phase mask.
Such a technique relies on the encoding of the z position of the emitter in the angular orientation of two lobes in the point spread function. By introducing a double-helix phase mask thanks to a spatial light modulator in the imaging path, it is possible to engineer the point spread function so as to give the point spread function the axial shape of a double helix. In the plane perpendicular to the direction of propagation of light Z, the double helix appears as two lobes.
More generally, an adequate phase mask (in reflection or transmission configuration) is placed at the back focal plane of the objective, inducing a double-helix point spread function. The two lobes rotate depending on the axial position of the emitter along the direction of propagation of light Z.
Preferably, so as to obtain the best performance, the rotation of the two lobes should be limited to 180° between two focal planes and the distance between the two lobes should be sufficient to separate them.
According to another embodiment, the second optical system 16 is adapted to achieve a self-bending point spread function technique.
A recent development using a "self-bending PSF" has enabled a depth of field down to 3 μm but requires imaging on two different channels (two cameras, or splitting the camera chip). For instance, it is proposed to introduce an astigmatic lens in the optical path to create an asymmetry between the X and Y transverse directions. This modification is key, as it permits the detection of single molecules in between planes and thus allows for an increased resolution between planes and for the detection of single molecules over a wider depth of field.
As single-molecule super-resolution microscopy is an increasingly popular and useful tool in the biological sciences, the present invention considerably increases the depth of field, the speed of acquisition, and the resolution of these systems.
The embodiments and alternative embodiments considered here-above can be combined to generate further embodiments of the invention.
EXPERIMENTAL SECTION
As explained before, the Applicant has developed a three-dimensional super-resolution microscopy method that enables deep imaging in cells. This technique relies on the effective combination of multifocus microscopy and astigmatic three-dimensional single-molecule localization microscopy. In this section, the optical system and the fabrication process of a multifocus grating are described. Then, two strategies for localizing emitters with the imaging method are presented and compared with a previously described deep three-dimensional localization algorithm. Finally, the performance of the method is demonstrated by imaging the nuclear envelope of eukaryotic cells, reaching a depth of field of about 4 μm.
In this section, the effective combination of multiple plane imaging using MFM and axial localization of single emitters by PSF engineering is described. This implementation enables deep three-dimensional super-resolved imaging with high localization precision in the three directions with fast localization algorithm processing.
Optical setup of the experiment
The optical system is positioned at the imaging output of a standard inverted microscope (Zeiss Axiovert 200) in wide field fluorescence microscopy configuration. A first relay lens L1 (f = 150 mm) is used to form a secondary Fourier plane, conjugate of the objective pupil plane, where a multifocus grating is positioned. The multifocus grating splits the fluorescence signal coming from the sample into a set of diffractive orders, their number and intensity being defined by the shape of the diffraction pattern. To create a multifocus image, a geometrical distortion is applied to the grating pattern in order to get a constant focus step dz between each diffraction order. However, due to its diffractive nature, the multifocus grating also introduces chromatic dispersion and is therefore not by itself suitable for high-resolution broadband imaging. This effect is compensated by adding a chromatic correction grating after the multifocus grating. The chromatic correction grating is placed at a distance where the diffractive orders are well separated, without overlap, and each diffraction order goes through a different blazed grating, specifically designed to reverse the dispersion introduced by the multifocus grating. The MFG (multifocus grating) splits the fluorescence light into nine diffractive orders. The chromatic correction module composed of the CCG (chromatic correction grating) and a multifacet prism compensates for chromatic dispersion induced by the MFG and redirects light to the camera. More precisely, the multifacet prism is positioned right after the chromatic correction grating and, combined with a second relay lens L2 (f = 200 mm), enables the refocusing of each diffractive order to its corresponding position on the chip of an enhanced CCD camera (IXON DU-897, Andor, Ireland) thus creating an instant focal series. 
In the Applicant's implementation of the multiple-focus microscope, a long focal length (f = 1000 mm) cylindrical lens is positioned in front of the enhanced CCD camera. This optical element introduces astigmatism in the emitted light wavefront and makes it possible to break the axial symmetry of the PSF. The diameter of the cylindrical lens has to be larger than the space occupied by the different beams coming from the different focal planes in order to induce the same amount of deformation in each diffractive order. The cylindrical lens introduces astigmatism on the PSF of the emitters imaged on the camera chip.
For STORM experiments, a 561 nm excitation laser (Sapphire 561-100 CW, Coherent, UK) and a 405 nm activation laser (Stradus 405-100, Vortran, USA) were combined and collimated by a series of dichroic mirrors and achromatic lenses, individually controlled by an acousto-optic tunable filter (AOTFnC-400.650-TN, AAoptics, France) and focused onto the back focal plane of the objective through the rear port of the microscope. The fluorescence signal emitted from the sample was collected by the objective lens, separated from the excitation wavelengths through a four-band dichroic mirror (zt405/488/561/638rpc, Chroma, USA) and filtered using a bandpass filter (ET600/50m, Chroma, USA).
MFG fabrication and characterization
The design and performance of the multifocus grating control the spacing between the imaged planes and the transmission efficiency in a specific spectral range. For the purpose of this experiment, a binary grating adapted for SMLM was designed and fabricated. The grating function was optimized for an equal distribution of light between nine different diffraction orders using the pixel flipper algorithm. The grating pattern was then calculated, taking into account the emission range of Cy3b and the desired focus step separating each of the nine planes.
The numerically generated grating pattern file was sent to a mask printer (DWL 200, Heidelberg, Germany) that uses direct laser writing on a chromium plate coated with photoresist to generate the photolithography mask. After development, this mask was loaded into a stepper (FPEA-3000 i4, Canon, Japan) and the grating pattern was transferred onto a fused silica wafer (500 μm thick) coated with 1 μm-thick ECI positive photoresist by UV exposure.
The wafer was then developed and etched by RIE-ICP with CHF3 gas. The etching time was adjusted to reach the exact design depth according to the spectral range selected for the multifocus grating.
The newly fabricated multifocus grating was characterized by scanning electron microscopy (SEM) to ensure that the dimensions and aspect ratio of the grating pattern correspond to the expected theoretical values. In addition, atomic force microscopy (AFM) was used to confirm the SEM observations and to estimate the etch depth of the grating with nanometer precision. In the present experiment, a depth of 663 nm was measured. This depth corresponds to a phase shift of π at 606 nm, very close to the center of our emission filter (600 +/- 25 nm).
In order to evaluate the diffraction efficiency of the grating and the homogeneity of the light distribution, a HeNe laser was shone through the grating and the intensities in the different diffractive orders were measured. The diffraction efficiency was 63%, close to the maximal theoretical diffraction efficiency of 67% for binary gratings. The light distribution heterogeneity was 5%, calculated as the ratio between the standard deviation of the intensities of the diffractive orders and their mean intensity. The homogeneous distribution of light between diffractive orders was confirmed by the homogeneity of intensity of a sample of beads imaged in the different panels. Such an image is the z projection of a stack of images acquired while scanning a sample of fluorescent beads (Invitrogen TetraSpeck 0.1 μm, Thermo Fisher Scientific, USA) along the optical axis using a piezoelectric stage (Nano F-100, Mad City Labs, USA). The entropy of the images in the different panels during the z scan was measured. The axial position of the minimum entropy in each panel corresponds to the focal plane of that panel. The average distance between minimum-entropy positions gives the average distance between the planes imaged by the microscope: 360 nm.
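The two quantities used in this characterization — the heterogeneity of the diffractive-order intensities and the entropy-based detection of each panel's focal plane — can be sketched in plain Python (illustrative helpers, not the Applicant's code):

```python
import math

def intensity_heterogeneity(order_intensities):
    """Heterogeneity as defined in the text: standard deviation of the
    diffractive-order intensities divided by their mean."""
    n = len(order_intensities)
    mean = sum(order_intensities) / n
    var = sum((i - mean) ** 2 for i in order_intensities) / n
    return math.sqrt(var) / mean

def image_entropy(pixels):
    """Shannon entropy (bits) of the normalized pixel-intensity
    distribution; sharp (in-focus) images have lower entropy."""
    total = sum(pixels)
    return -sum((p / total) * math.log2(p / total) for p in pixels if p > 0)

def focal_plane_index(stack):
    """Index of the z-slice with minimum entropy, taken as the focal
    plane of a panel during the z scan."""
    entropies = [image_entropy(s) for s in stack]
    return entropies.index(min(entropies))
```

A perfectly even split of light across the nine orders would give a heterogeneity of 0%; the fabricated grating measured 5%.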
Careful microfabrication enabled the fabrication of a multifocus grating with transmission properties optimized for single Cy3b imaging and axial spacing optimized for deep three-dimensional super-resolution imaging.

Image analysis
The cylindrical lens positioned before the camera modifies the wavefront of the fluorescence light of single emitters. As stated before, this translates into an asymmetric PSF in the imaging plane, in which the ratio between the PSF width in x (wx) and in y (wy) is directly related to the axial position of the emitter.
The position of single emitters was determined by fitting their PSF with an asymmetric two-dimensional Gaussian. The center of this Gaussian corresponds to the lateral position of the emitter, while the axial position is encoded in the PSF asymmetry. In order to establish the relation between the PSF asymmetry and the z position of single emitters, a careful calibration was performed by scanning a sample of fluorescent beads immobilized on a glass slide along the optical axis with a piezoelectric stage. A bright isolated bead was then selected on each panel of the image stack and its PSF widths along the x and y directions (wx and wy) were estimated. The widths were fitted by a polynomial function of the third degree. These fitted functions were used as calibrations to compute the z position of every emitter localized in each plane. The localizations of a large number of beads (>10) from the calibration stack also enabled the calculation of the geometrical transformation (translation, rotation, etc.) between the different panels. Once this transformation is calculated, it can be used to align the localizations extracted from the stack of images acquired during the axial scanning of the bead sample. The reconstructed trajectories, recorded along about 4 μm, show that the Applicant's method was able to localize emitters over an extended axial depth of field.
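A simplified sketch of the axial lookup: the patent fits third-degree polynomials to wx(z) and wy(z); to keep this illustration dependency-free, a tabulated calibration is inverted by a least-squares match instead (the function names and the calibration table are hypothetical):

```python
def estimate_z(wx_obs, wy_obs, calibration):
    """Invert a tabulated astigmatic calibration: return the z whose
    calibrated widths (wx, wy) are closest, in the least-squares sense,
    to the observed widths."""
    best_z, best_err = None, float("inf")
    for z, wx, wy in calibration:
        err = (wx - wx_obs) ** 2 + (wy - wy_obs) ** 2
        if err < best_err:
            best_z, best_err = z, err
    return best_z

# hypothetical calibration table: with astigmatism, wx grows while wy
# shrinks as the emitter moves along z
calibration = [(z / 10.0, 1.0 + z / 10.0, 2.0 - z / 10.0) for z in range(0, 11)]
```

An observed width pair (wx, wy) = (1.3, 1.7) then maps back to z = 0.3 in the units of the table.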
An alternative three-dimensional localization algorithm was also implemented in this experiment. Although this method still relies on fitting the asymmetric PSF to determine the lateral position of the emitter, the axial localization principle is different. It relies on the cross-correlation of the PSF with a library of PSFs acquired during calibration and corresponding to different axial positions of the emitter. Each acquired PSF is cross-correlated with all the PSFs in the library to obtain a profile of the cross-correlation coefficient as a function of z position. The axial position of the emitter is inferred by finding the maximum in the cross-correlation profile by interpolation. During the calibration acquisition, a library of PSFs was recorded for every panel. The reconstructed image was then calculated with our localization algorithm. The transformation between different panels can be calculated using the calibration stack and used to align localizations from different panels. The trajectories of the beads during the calibration scan reconstructed using the cross-correlation algorithm span about 4 μm.

Localization precision
Next, the Applicant evaluated the performance of multiple-plane detection and of the optical system by imaging fluorescent beads on a glass slide at different excitation powers, as represented on Figures 5 to 7. In other words, Figures 5 to 7 show images of fluorescent beads at three different excitation powers P1 (Figure 5), P2 (Figure 6) and P3 (Figure 7). The first excitation power P1 is strictly inferior to the second power P2 and the second power P2 is strictly inferior to the third excitation power P3. Only the plane corresponding to the position closest to the emitters' focal plane is displayed. The colormap corresponds to the log of the signal. All images are represented using the same color scale.
Figures 8 to 11 are graphs comparing the localization precisions of (Δ) MFM according to the prior art, (o) the asymmetric Gaussian fit and (x) the cross-correlation localization methods. Figure 8 represents the lateral localization precision as a function of excitation power. Arrows indicate the powers corresponding to the images shown in Figures 5 to 7. Figure 9 represents the lateral localization precision as a function of the number of detected photons. Figure 10 represents the axial localization precision as a function of excitation power. Figure 11 represents the axial localization precision as a function of the number of detected photons. In Figures 9 and 11, filled symbols represent localization precisions measured for single Cy3b molecules, and grayed areas symbolize the typical range of detected photons associated with single-molecule emission.
200 frames were acquired for each condition using a 50 ms integration time. The localization precisions were computed as the standard deviation of the localized positions of the same emitter over multiple localizations. These acquisitions were done in two different experimental conditions. First, the Applicant imaged beads with the developed optical system and localized single emitters by nonlinear fitting of the astigmatic PSF as previously described. Second, beads were imaged with the MFM apparatus of the prior art and their three-dimensional positions were fitted on the three-dimensional image stack. This second fitting required the localization of the same emitter in several planes. The lateral localization precision is comparable in both approaches for different numbers of collected photons, as represented on Figures 8 and 9, despite the localization of emitters in a single plane in the developed optical system as opposed to multiple-plane detection in MFM of the prior art. The analysis time for the developed optical system was five times shorter than for conventional MFM, independently of the number of frames analyzed (from 10 to 10000).
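The precision figure used throughout this section — the standard deviation of repeated localizations of the same emitter along one axis — can be computed as follows (a population standard deviation is assumed here; the patent does not state the estimator):

```python
import math

def localization_precision(positions):
    """Localization precision as defined in the text: standard deviation
    of multiple localized positions of the same emitter along one axis."""
    n = len(positions)
    mean = sum(positions) / n
    return math.sqrt(sum((p - mean) ** 2 for p in positions) / n)
```

Applied per axis over the 200 frames of each condition, this yields the lateral and axial precision curves of Figures 8 to 11.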
The axial localization precision is slightly degraded in 3D-MF-SMLM as compared to MFM of the prior art, as represented on Figures 10 and 11. This degraded localization precision (about 40%) can be explained by the use of a single plane for the former method and of the signal from multiple planes for the latter. To improve the axial localization precision of the fitting methods, a cross-correlation localization algorithm was applied. To get an estimate of the localization precision in biological imaging conditions, the localization precision of antibodies labelled with single Cy3b dyes was measured by STORM, with and without the cylindrical lens. Antibodies (anti-rabbit, 2 nM final concentration) were immobilized on a clean coverslip and immersed in a freshly prepared STORM buffer. We estimated the localization precision to be about 20 nm in the lateral direction (see Figure 9) and about 70 nm in the axial direction (see Figure 11) with the developed optical setup. Comparable values were obtained with MFM according to the prior art (see Figures 9 and 11). These values of the localization precision should enable SMLM in biological samples.
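A bare-bones sketch of the cross-correlation localization mentioned above: each observed PSF is compared against a calibrated library and the z of the best match is returned (the patent additionally interpolates around the maximum for sub-step precision; all names are illustrative):

```python
import math

def ncc(a, b):
    """Normalized cross-correlation coefficient between two PSF images,
    given as flattened intensity lists of equal length."""
    ma = sum(a) / len(a)
    mb = sum(b) / len(b)
    da = [x - ma for x in a]
    db = [x - mb for x in b]
    num = sum(x * y for x, y in zip(da, db))
    den = math.sqrt(sum(x * x for x in da) * sum(y * y for y in db))
    return num / den

def axial_position(psf, library, z_values):
    """Cross-correlate the observed PSF with every calibrated PSF and
    return the z associated with the best-matching library entry."""
    coeffs = [ncc(psf, ref) for ref in library]
    return z_values[coeffs.index(max(coeffs))]
```

The library would be recorded per panel during the bead-scan calibration, one entry per axial step of the piezoelectric stage.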
Biological imaging using the developed optical setup
In order to ensure the robustness and test the performance of the developed optical setup, three-dimensional super-resolution imaging of the nuclear envelope of Drosophila S2 cells was achieved by labeling the nuclear lamina. S2 Drosophila cultured cells were seeded onto coverslips coated with poly-L-lysine. Cells were labeled, post fixation and permeabilization, with a primary mouse anti-lamin antibody (ADL101, DSHB, USA) and a secondary anti-mouse antibody labelled with a single Cy3b molecule. Excitation was performed at 561 nm, and photoactivation at 405 nm. The acquisition time was 50 ms and 4000 frames were acquired. The images were analysed with the fitting method and aligned with our alignment Matlab routine. Drift correction was performed post-acquisition by three-dimensional tracking of a fluorescent bead attached to the surface of the coverslip. The reconstructed image shows a homogeneous distribution of lamina around the nuclear envelope of an S2 cell. Spots of higher density appear on this image. Furthermore, the imaging depth was over 4 μm.
From this data, the Applicant measured a localization precision of about 30 nm in the lateral direction and about 70 nm in the axial direction. This estimate is in good agreement with the localization precision measured for single Cy3b molecules deposited on a coverslip (see Figures 9 and 11).
In order to obtain an independent estimation of the resolution of our experimental setup, the Applicant performed two-color imaging of the Fab-7 genomic locus in chromosome 3R in S2 cells and of a large, extended (about 300 kbp) chromatin domain called the bithorax complex (BX-C). Fab-7 was labelled with Cy3b using a 4 kbp FISH probe. BX-C was labeled with Alexa 647 using the oligopaint DNA hybridization method. The Fab-7 sequence is part of BX-C; thus it would be expected to appear within the volume defined by BX-C. Cy3b imaging was performed using the same experimental conditions as the ones described in the last section. Alexa 647 was excited at 643 nm and activated at 405 nm. 10000 frames were sequentially acquired for each color. As expected, the two different genomic targets exhibited very different sizes and shapes. BX-C was about 500 nm in size, consistent with previous measurements (Beliveau, 2015). Fab-7 was within the BX-C volume, as expected.
From line profiles on the images of Fab-7, the Applicant measured a size of about 60 nm in the lateral direction and about 100 nm in the axial direction. The size of the Fab-7 locus imaged under identical conditions but with conventional two-dimensional STORM is about 60 nm (data not shown). Thus, the resolution of our setup under real biological conditions is at least 60 nm in the lateral direction and about 100 nm in the axial direction.
Conclusions and discussion
In this experiment, the Applicant has combined multifocus microscopy with PSF engineering to enable the fast detection of single emitters in thick samples. The Applicant demonstrated the ability of this technique to detect single emitters over an extended axial imaging depth (greater than 4 μm) while maintaining high lateral and axial localization precisions even for low emitter intensities.
General methodologies to design and construct multifocus gratings were described before. Here, the Applicant presented the construction and characterization of a binary grating with about 400 nm plane spacing, ideal for cellular imaging with organic dyes. With this configuration, the Applicant was able to image the nuclear envelope of eukaryotic cells with an imaging depth of over 4 μm. The use of multi-phase gratings with higher transmission efficiencies (from 67% to 92%) and with larger inter-plane distances (for instance 800 nm) should enable the further extension of the imaging depth up to about 10 μm. Cy3b super-resolution imaging of antibody-labeled lamin was used to demonstrate three-dimensional super-resolution imaging with the developed optical setup, even though it is well known that this dye is not the best suited for super-resolution. Further optimization of the multifocus grating and chromatic correction grating for far-red dyes (e.g. Alexa 647 or Cy5) should thus improve the performance of the developed optical setup and allow for better resolution. Overall, these improvements will be key for numerous biological applications.
Previous multifocus microscopy methods used the detection of single emitters in multiple planes to achieve three-dimensional nanometer localization. This approach is computationally intensive, as it requires the alignment and assembly of the nine imaging planes into a three-dimensional volume, and the non-linear fitting of the PSF of each detected emitter. Instead, the developed optical setup requires only the detection and localization of emitters in a single imaging plane, which allows for an increase in the distance between MFM planes to reach thicker axial imaging depths. Importantly, this method also allows for a considerable increase in image reconstruction speed without sacrificing localization precision, as it requires the fitting of the emitter PSF in a single plane to yield a three-dimensional localization.
Finally, the invention opens the door to further combinations of MFM and other PSF engineering methods. Combination of MFM with adaptive optics would enable three-dimensional superresolution imaging at high penetration depths, and will further lead to an improvement in the photon budget to increase localization precision and extend imaging depth. Use of other PSF engineering methods such as the double-helix PSF, or adaptations to use other sensors, could lead to a considerable increase in the axial and lateral imaging ranges. Excitingly, these future improvements have the potential to further empower superresolution methods by enabling the imaging of a larger variety of biological specimens.

Claims

1. - Apparatus (12) for imaging at least one object, the apparatus (12) comprising:
- a device (14) adapted to achieve multi-focal microscopy on the at least one object comprising:
- a light unit (20) adapted to emit light, and
- a first optical system (22) adapted to propagate the emitted light towards the at least one object, the first optical system (22) producing a first point spread function in a plane perpendicular to the direction of propagation of the light, the first point spread function being symmetrical with relation to a central point,
characterized in that the apparatus (12) further comprises:
- a second optical system (16) adapted to modify the first point spread function, the point spread function of both optical systems combined producing a second point spread function in the plane perpendicular to the direction of propagation of the light, the second point spread function being asymmetrical with relation to the central point.
2. - Apparatus according to claim 1, wherein the second point spread function has an elliptical shape defining a maximum radius and a minimum radius, the ratio between the minimum radius and the maximum radius being superior or equal to 0.1 %.
3. - Apparatus according to claim 2, wherein the ratio between the minimum radius and the maximum radius is superior or equal to 10%.
4. - Apparatus according to any one of claims 1 to 3, wherein the second optical system (16) is an astigmatism system for which a tangential focal plane and a sagittal focal plane are defined, the distance between the tangential focal plane and the sagittal focal plane being between 20 mm and 2000 mm.
5. - Apparatus according to any one of claims 1 to 4, wherein the second optical system (16) comprises a cylindrical lens.
6. - Apparatus according to claim 5, wherein the second optical system (16) is a cylindrical lens.
7. - Apparatus according to claim 6, wherein the cylindrical lens has a focal length between 10⁰ mm and 10⁴ mm.
8. - Apparatus according to claim 6 or 7, wherein a focal plane is defined for the first optical system (22), the distance between the cylindrical lens and the focal plane being less than or equal to 1000 mm.
9. - Apparatus according to any one of claims 1 to 3, wherein the second optical system (16) is a spatial light modulator adapted to generate a double-helix phase mask.
10. - Apparatus according to any one of claims 1 to 3, wherein the second optical system (16) is adapted to achieve a self-bending point spread function technique.
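The astigmatic configuration of claims 4 to 8 works because the cylindrical lens splits the tangential and sagittal focal planes, so the spot is elongated along one axis above focus and along the other below focus; the ratio of the two Gaussian widths then encodes the axial position. The following is a minimal sketch of that calibration-and-lookup idea; the width model and the parameter values (sigma0, depth scale z_r, focal-plane offset gamma) are illustrative assumptions, not values from the patent.

```python
import numpy as np

def astigmatic_sigmas(z, sigma0=1.0, z_r=400.0, gamma=250.0):
    """Gaussian PSF widths along x and y for an astigmatic system (z in nm).

    The two focal planes are offset by +/- gamma along z, so sx is minimal
    above the nominal focus and sy is minimal below it.
    """
    sx = sigma0 * np.sqrt(1.0 + ((z - gamma) / z_r) ** 2)
    sy = sigma0 * np.sqrt(1.0 + ((z + gamma) / z_r) ** 2)
    return sx, sy

def z_from_ellipticity(sx, sy, z_grid):
    """Recover z by matching the measured sx/sy ratio on a calibration curve."""
    cal = np.array([astigmatic_sigmas(z) for z in z_grid])  # shape (N, 2)
    idx = np.argmin(np.abs(cal[:, 0] / cal[:, 1] - sx / sy))
    return z_grid[idx]

z_grid = np.linspace(-600.0, 600.0, 2401)  # calibration range, 0.5 nm steps
sx, sy = astigmatic_sigmas(150.0)
print(z_from_ellipticity(sx, sy, z_grid))  # 150.0
```

In a real localization pipeline the widths sx and sy would come from fitting an elliptical Gaussian to each detected spot, and the calibration curve would be measured with beads scanned through focus rather than computed from a model.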
PCT/EP2016/061926 2015-05-26 2016-05-26 Apparatus for imaging at least one object WO2016189095A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
EP15305787 2015-05-26
EP15305787.2 2015-05-26

Publications (1)

Publication Number Publication Date
WO2016189095A1 true WO2016189095A1 (en) 2016-12-01

Family

ID=53284187

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2016/061926 WO2016189095A1 (en) 2015-05-26 2016-05-26 Apparatus for imaging at least one object

Country Status (1)

Country Link
WO (1) WO2016189095A1 (en)


Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140146159A1 (en) * 2012-11-28 2014-05-29 The Penn State Research Foundation Z-microscopy


Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
"Expanding the capabilities of multifocus microscopy (MFM)", CONFERENCE FOCUS ON MICROSCOPY 2013, 24 March 2013 (2013-03-24)
BLANCHARD P M ET AL: "SIMULTANEOUS MULTIPLANE IMAGING WITH A DISTORTED DIFFRACTION GRATING", APPLIED OPTICS, OPTICAL SOCIETY OF AMERICA, WASHINGTON, DC; US, vol. 38, no. 32, 10 November 1999 (1999-11-10), pages 6692 - 6699, XP000893705, ISSN: 0003-6935, DOI: 10.1364/AO.38.006692 *
ROBERT S FISCHER ET AL: "Microscopy in 3D: a biologist's toolbox", TRENDS IN CELL BIOLOGY, vol. 21, no. 12, 16 December 2011 (2011-12-16), pages 682 - 691, XP028120981, ISSN: 0962-8924, [retrieved on 20110930], DOI: 10.1016/J.TCB.2011.09.008 *
S. GEISSBUEHLER; A. SHARIPOV; A. GODINAT; N. L. BOCCHIO; P. A. SANDOZ; A. HUSS; N. A. JENSEN; S. JAKOBS; J. ENDERLEIN; F. GISOU VA: "Live-cell multiplane three-dimensional super-resolution optical fluctuation imaging", NAT. COMMUN., vol. 5, 2014, pages 5830
SARA ABRAHAMSSON ET AL: "Fast multicolor 3D imaging using aberration-corrected multifocus microscopy", NATURE METHODS, vol. 10, no. 1, 9 December 2012 (2012-12-09), GB, pages 60 - 63, XP055223709, ISSN: 1548-7091, DOI: 10.1038/nmeth.2277 *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108982455A (en) * 2018-07-31 2018-12-11 浙江大学 A kind of multifocal light slice fluorescent microscopic imaging method and device
CN108982455B (en) * 2018-07-31 2020-08-18 浙江大学 Multi-focus light section fluorescence microscopic imaging method and device
TWI727393B (en) * 2019-08-14 2021-05-11 國立陽明交通大學 Laser and imaging integration system and method thereof
WO2021099242A1 (en) * 2019-11-20 2021-05-27 Carl Zeiss Smt Gmbh Device and method for measuring substrates for semiconductor lithography

Similar Documents

Publication Publication Date Title
Papagiakoumou et al. Scanless two-photon excitation with temporal focusing
Ströhl et al. Frontiers in structured illumination microscopy
US10795144B2 (en) Microscopy with structured plane illumination and point accumulation for imaging and nanoscale topography
JP6166776B2 (en) High resolution imaging of extended volume
Muller Introduction to confocal fluorescence microscopy
Baumgart et al. Scanned light sheet microscopy with confocal slit detection
JP7192048B2 (en) Composition and method for light sheet microscopy
US8705172B2 (en) Microscopy method and microscope with enhanced resolution
EP2107363B1 (en) Method of fluorescence-microscopically imaging a structure in a sample with high three-dimensional spatial resolution
US10247934B2 (en) Method for examining a specimen by means of light sheet microscopy
Erdelyi et al. Correcting chromatic offset in multicolor super-resolution localization microscopy
US20200150446A1 (en) Method and System for Improving Lateral Resolution in Optical Scanning Microscopy
Birk Super-resolution microscopy: a practical guide
Oudjedi et al. Astigmatic multifocus microscopy enables deep 3D super-resolved imaging
US10018817B2 (en) Adaptive optics for imaging through highly scattering media in oil reservoir applications
Roider et al. 3D image scanning microscopy with engineered excitation and detection
Hajj et al. PSF engineering in multifocus microscopy for increased depth volumetric imaging
WO2016189095A1 (en) Apparatus for imaging at least one object
WO2013176549A1 (en) Optical apparatus for multiple points of view three-dimensional microscopy and method
Birk et al. Super-resolution microscopy with very large working distance by means of distributed aperture illumination
Thibon et al. Resolution enhancement in laser scanning microscopy with deconvolution switching laser modes (D-SLAM)
Zeng et al. Advances in three‐dimensional super‐resolution nanoscopy
Owen et al. Super-resolution imaging by localization microscopy
Zhao et al. Large field of view correction by using conjugate adaptive optics with multiple guide stars
Wang et al. Adaptive optics in super-resolution microscopy

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16727369

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16727369

Country of ref document: EP

Kind code of ref document: A1