WO2017161055A2 - Super-resolution imaging of extended objects - Google Patents

Super-resolution imaging of extended objects

Info

Publication number
WO2017161055A2
Authority
WO
WIPO (PCT)
Prior art keywords
sub
image
super
match
illumination spot
Prior art date
Application number
PCT/US2017/022600
Other languages
French (fr)
Other versions
WO2017161055A3 (en)
Inventor
Jiun-Yann Yu
Carol J. COGSWELL
Simeng CHEN
Robert H. Cormack
Jian Xing
Original Assignee
Regents Of The University Of Colorado, A Body Corporate
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Regents Of The University Of Colorado, A Body Corporate filed Critical Regents Of The University Of Colorado, A Body Corporate
Priority to EP17767483.5A priority Critical patent/EP3430460A4/en
Priority to US16/085,524 priority patent/US10663750B2/en
Publication of WO2017161055A2 publication Critical patent/WO2017161055A2/en
Publication of WO2017161055A3 publication Critical patent/WO2017161055A3/en

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/58Optics for apodization or superresolution; Optical synthetic aperture systems
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/62Systems in which the material investigated is excited whereby it emits light or causes a change in wavelength of the incident light
    • G01N21/63Systems in which the material investigated is excited whereby it emits light or causes a change in wavelength of the incident light optically excited
    • G01N21/64Fluorescence; Phosphorescence
    • G01N21/645Specially adapted constructive features of fluorimeters
    • G01N21/6456Spatial resolved fluorescence measurements; Imaging
    • G01N21/6458Fluorescence microscopy
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B21/00Microscopes
    • G02B21/0004Microscopes specially adapted for specific applications
    • G02B21/002Scanning microscopes
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B21/00Microscopes
    • G02B21/36Microscopes arranged for photographic purposes or projection purposes or digital imaging or video purposes including associated control and data processing arrangements
    • G02B21/365Control or image processing arrangements for digital or video microscopes
    • G02B21/367Control or image processing arrangements for digital or video microscopes providing an output produced by processing a plurality of individual source images, e.g. image tiling, montage, composite images, depth sectioning, image comparison
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/0075Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 with means for altering, e.g. increasing, the depth of field or depth of focus
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00Geometric image transformations in the plane of the image
    • G06T3/40Scaling of whole images or parts thereof, e.g. expanding or contracting
    • G06T3/4053Scaling of whole images or parts thereof, e.g. expanding or contracting based on super-resolution, i.e. the output image resolution being higher than the sensor resolution

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Chemical & Material Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • Multimedia (AREA)
  • Pathology (AREA)
  • Immunology (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • General Health & Medical Sciences (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Biochemistry (AREA)
  • Theoretical Computer Science (AREA)
  • Microscopes, Condenser (AREA)
  • Investigating, Analyzing Materials By Fluorescence Or Luminescence (AREA)
  • Image Processing (AREA)

Abstract

Apparatus and methods for super-resolution imaging of extended objects utilize a scanned illumination source with a small spot size. Sub-images composed of highly-overlapping point spread functions are captured, and each sub-image is iteratively compared to a series of brightness combinations of template point spread functions from a dictionary, itself composed of highly-overlapping point spread functions. Each sub-image is associated with a best-match template combination solution, and the best-match reconstructions created from these best-match solutions are combined into a super-resolution image of the object.

Description

SUPER-RESOLUTION IMAGING OF EXTENDED OBJECTS
BACKGROUND OF THE INVENTION
FIELD OF THE INVENTION
The present invention relates to apparatus and methods for super-resolution imaging of 2D and 3D extended objects through optimization decoding of spatially
overlapping point spread functions (PSFs).
DISCUSSION OF RELATED ART
Advances in optical microscopy continue to take center stage due to their potential for non-invasive imaging in such diverse applications as live-cell dynamics, particle tracking, and nano-scale materials fabrication. In particular, biological research communities have long been eager for high-speed three-dimensional "super-resolution" optical imaging (i.e. image resolutions beyond the diffraction limit) - techniques that enable the observation of mechanical and biochemical dynamics in living cells down to nanometer scales. However, if this goal to observe ever-smaller objects is to be achieved, two fundamental limits in current optical imaging theory must be addressed: (1) diffraction (which maintains that object features smaller than ~250nm cannot be resolved) and (2) out-of-focus blur that arises from objects above and below the plane of best focus. Recently, microscopes developed to overcome the first of these limits (i.e. diffraction) were so significant to biological fluorescence imaging studies that they led to the awarding of the 2014 Nobel Prize in Chemistry. However, these new microscope approaches, based on either stimulated emission depletion (STED) or particle localization (PALM or STORM), have major problems for 3D imaging at speed.
The second fundamental limit of optical imaging (the out-of-focus blur) has led to the development of microscopes such as laser-scanning confocal systems over the past 30 years. However, these instruments still do not meet the goal of imaging an extended 3D sample volume at high speed (i.e. in real time or faster). This is because they require through-depth image acquisitions at each experimental time point in order to remove out-of-focus features from each acquired image plane. In addition, confocal-like systems provide minimal resolution improvement beyond the diffraction limit.
Nearly all existing high- and super-resolution imaging techniques rely on the objective lens forming a tightly focused point spread function over a flat plane, which is the most photon-efficient approach for pursuing precise two-dimensional imaging. However, such an approach has two drawbacks: (1) it provides rather limited (i.e. no super-resolution) information about the third dimension, as these tightly-focused PSFs are, in most cases, axially invariant near their best-focus plane, and (2) image features rapidly go out of focus when observing objects that are thicker than the plane of best focus.
One use of super-resolution microscopy is the examination of continuous biological structures. Existing devices generally require photo-switchable fluorophores, stimulated emission depletion, or specially designed illumination patterns.
A need remains in the art for apparatus and methods for improved super-resolution imaging of 2D and 3D extended objects.
SUMMARY OF THE INVENTION
An object of the present invention is to provide apparatus and methods for improved super-resolution imaging of 2D and 3D extended objects. An embodiment first scans an object using an illumination spot which subdivides the field of view into a series of small sub-images and then applies a numerical optimization procedure to
differentiate the overlapping PSFs to super-resolution (sub-diffraction-limit) accuracy in the resulting small sub-images.
The illumination spot size is approximately the size of a diffraction-limited focused beam (or slightly larger), and ensures that each recorded sub-image is composed of only the PSFs produced by sample features that fall within the illumination beam.
This invention demonstrates two-dimensional and three-dimensional super-resolution imaging for extended object features when observed using microscopes or other optical instruments such as endoscopes. A method of super-resolution imaging an extended object includes the steps of providing an illumination source configured to illuminate the object with an
illumination spot having an illumination spot area at the object, scanning the illumination spot across the object, capturing a series of sub-images of the object wherein the sub-image detection region area substantially exceeds the illumination spot area, providing a dictionary of templates comprising PSFs, comparing each sub-image to combinations of highly-overlapping PSF templates from the dictionary, while varying intensities of individual PSF templates, to find a best-match solution for that sub-image; and creating a best-match reconstruction of each sub-image from the best-match solution for that sub-image. Generally the best-match reconstructions of sub-images are then combined to form a super-resolution image of the object. In some cases overlapping sub-images are captured, in order to improve final image quality.
One embodiment of the invention utilizes a non-negative least-squares optimization algorithm to find best-match solutions. In addition, a phase plate or equivalent may be placed in the illumination path in order to produce depth-encoded sub-images of the object having point spread functions (PSFs) which vary according to depth range. Once depth range is decoded, super-resolution in the third, axial dimension is possible. Axial super-resolution can be as good as X-Y super-resolution.
Fluorescent objects may be super-resolved by this method.
BRIEF DESCRIPTION OF THE DRAWINGS
Figure 1 is a block diagram illustrating an embodiment of the present invention.
Figures 2A and 2B illustrate an overview of a method of super-resolution imaging of an extended object according to the present invention. Figure 2A shows the illumination spot and associated sub-image capture. Figure 2B is a schematic block diagram showing how sub-images are acquired and processed.
Figures 3A-3C are diagrams showing the process of iteratively matching an acquired sub-image to a combination of PSF templates from a dictionary. Figure 4 illustrates three possible PSF template dictionaries where the modeled PSFs are spaced progressively closer together, along with the resulting reconstructed sub-objects.
Figures 5A-C illustrate super-resolution imaging of a two-dimensional object compared to traditional fluorescent microscope imaging. Figure 5A shows an image captured by a traditional fluorescence microscope at NA 0.5. Figure 5B shows a super-resolved image of the same object at the same NA. Figure 5C shows a traditional image at NA 1.4.
Figures 6A-C illustrate super-resolution imaging of a three-dimensional object compared to traditional fluorescent microscope imaging. Figure 6A shows a three-dimensional rendering of a super-resolved fluorescent biological object produced according to the present invention. Figure 6B shows a super-resolved side-view of the same object. Figure 6C is a side-view image of the same object assembled from a confocal laser scanning microscope image stack.
DETAILED DESCRIPTION OF THE INVENTION
Figure 1 is a block diagram illustrating an embodiment of the present invention. This embodiment comprises a microscope 100 and optionally includes a phase plate such as a circular caustic. The phase plate is useful for 3D imaging, as a convenient way to form a set of axially asymmetric PSFs. Note however that asymmetric sets of PSFs can be produced in other ways. The optics of the imaging system itself will often result in such PSFs. For example, such PSFs could be produced by lenses such as axicons or cylindrical lenses. Also note that a phase plate or equivalent is not required for two-dimensional super-resolution imaging. Briefly, a narrow beam 124 is focused onto object 102 and scanned over object 102 by an illumination source with scanning apparatus 116. Beam splitter 104 provides scanning illumination 124 to object 102 and then allows the image light to travel the imaging path to camera 120. Scanning apparatus 116 might comprise, for example, XY scanning mirrors and other optics controlled by processor 140. In this
embodiment, a laser may be used, but incoherent light works as well.
Sub-images 204 (see Figure 2) are relayed through lens 108, lens 110, phase plate 112 (if used), and lens 114 to image on camera 120. Camera 120 is placed at the back focal plane of lens 114, and phase plate 112 is placed at the front focal plane of lens 114. Phase plate 112 is conjugate 130 with the objective 118 back focal plane. EMF 106 is an optional fluorescent filter that is useful when imaging fluorescent objects.
Camera 120 provides the sub-images to processor 140. Processor 140 iteratively compares each sub-image to combinations of PSF templates within a modeled dictionary of templates (see Figures 2-4) and each sub-image is matched and associated with a best-match solution. Processor 140 then creates best-match reconstructions from the best-match solutions and combines the reconstructions associated with the sub-images to form a super-resolution image of the object. The embodiment of Figure 1 optionally utilizes a phase plate 112 to, for example, generate ring-shaped PSFs as described in U.S. Pat. No. 9,325,971 ("Engineered Point Spread Function for Simultaneous Extended Depth of Field and 3D Ranging," incorporated herein by reference). In this configuration, it achieves super-resolution imaging in all three dimensions rather than just two-dimensional super-resolution imaging as in 2D embodiments. Briefly, the diameter of each ring is related to its depth in the object.
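The combination step just described (processor 140 merging the per-sub-image reconstructions into one super-resolution image) can be carried out in many ways; the following is a minimal sketch in Python/NumPy, assuming each decoded sub-image has already been reconstructed on an up-sampled grid and that simple averaging of overlapping regions is an acceptable merging rule. The function name and the averaging rule are illustrative assumptions, not details taken from the patent.

```python
# Minimal sketch, not from the patent: merge per-sub-image reconstructions
# into one super-resolution image, averaging wherever reconstructions overlap.
import numpy as np

def combine_reconstructions(reconstructions, corners, out_shape):
    """reconstructions: list of 2D arrays (up-sampled best-match reconstructions)
    corners: list of (row, col) upper-left positions of each reconstruction
             on the final super-resolution grid
    out_shape: shape of the combined image"""
    image = np.zeros(out_shape)
    weight = np.zeros(out_shape)
    for recon, (r, c) in zip(reconstructions, corners):
        h, w = recon.shape
        image[r:r + h, c:c + w] += recon
        weight[r:r + h, c:c + w] += 1.0
    return image / np.maximum(weight, 1.0)  # average where sub-images overlap

# Tiny synthetic example: two overlapping 4x4 reconstructions on an 8x8 grid.
tiles = [np.full((4, 4), 1.0), np.full((4, 4), 3.0)]
print(combine_reconstructions(tiles, [(0, 0), (2, 2)], (8, 8)))
```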
This process is not limited to microscopes. Endoscopes, telescopes, etc. can use the same process. Further, various kinds of illumination or excitation may be used depending upon the object to be imaged. Embodiments of the present invention use alternative designs of the conventional optical imaging system so that they preferably produce image PSFs according to the following criteria:
1. For imaging 2D extended objects, PSFs preferably comprise a set of best-focus Airy disc PSFs that can be detected digitally using numerical optimization with no ambiguity, even though they are highly-overlapping.
2. For imaging 3D extended objects, axial shifts of the PSFs ideally form a range of unique shapes (such as rings) that can be precisely correlated to sample depth and can be detected digitally using numerical optimization with no ambiguity. This has advantages over traditional Airy disc PSFs where identical shapes appear when moving in either direction from the best-focus plane.
3. For imaging 3D extended objects, photons radiating from the original object are generally redirected into these PSFs in such a way as to maximize the uniformity of their intensities over extended depths, rather than concentrating all photons into a "best focus" spot as in traditional Airy disc PSFs. A goal is increased dynamic range.
4. For imaging 3D extended objects, PSFs are preferably designed to contain maximum information content and high signal-to-noise ratio (SNR) over depths as much as 20 times thicker than the imaging lens depth of field (DOF).
5. The size of illumination (the diameter of the scanning illumination spot) for one recorded image is limited to a relatively small region that is only a few times greater (e.g. 1.5 - 2 times greater) than a diffraction-limited illumination spot. This ensures that the PSF overlapping can be well conditioned for digital processing. This small illumination region (or an array of small regions) is then scanned in a two-dimensional pattern to build up an image of the entire 2D or 3D object.
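As one way to picture how the small illumination region is stepped over the object, the sketch below generates a simple raster grid of spot centres with partial overlap between neighbouring positions. The raster layout, the 50% overlap default, and the function name are illustrative assumptions (the embodiment of Figure 2B shows a spiral pattern, which would serve the same purpose).

```python
# Minimal sketch, not from the patent: raster grid of illumination-spot centres
# covering the field of view, stepped finer than the spot diameter so that
# neighbouring sub-images overlap (a spiral pattern is an equally valid choice).
import numpy as np

def raster_scan_positions(field_size_um, spot_diameter_um, overlap=0.5):
    """Return an (N, 2) array of (x, y) spot centres in micrometres."""
    step = spot_diameter_um * (1.0 - overlap)      # e.g. 50% overlap between steps
    coords = np.arange(0.0, field_size_um + 1e-9, step)
    xs, ys = np.meshgrid(coords, coords)
    return np.column_stack([xs.ravel(), ys.ravel()])

positions = raster_scan_positions(field_size_um=10.0, spot_diameter_um=0.5)
print(positions.shape[0], "sub-images to acquire and decode")
```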
Figures 2A and 2B illustrate an overview of a method of super-resolution imaging of an extended object according to the present invention. (See Figures 3A-3C for more detail.) Figure 2A illustrates the illumination spot size 220 at the object (left image) as compared to the sub-image detection region 204 (center image). The right-hand image is a schematic diagram illustrating the comparison between the illumination spot area 226 (A_E) and detection region area 228 (A_D). Note that for convenience, "detection region area" is defined as the area the detection region would cover projected back to the object (i.e. the detection region at the detector divided by the magnification power of the imaging system 100). The sub-image detection region area is considerably larger than the illumination spot area, for example about an order of magnitude bigger. Ratios A_D/A_E of 2:1 work to some extent. A ratio of 9:1 has been found to work well.
Limiting the size of the illuminated sample region (the illumination spot size 220) is essential to achieve super-resolution. As an example, a standard model of a fluorescence microscope formulates the image as a linear superposition of the PSFs of individual point-like fluorescent photo-emitters. This leads to a series of linear equations:
I(s_k) = Σ_{i=1}^{n} x_i · PSF_i(s_k),   for k = 1, 2, ..., p          (1)
where k denotes the indices of individual pixels, I(s_k) denotes the intensity detected by the camera pixel at position s_k, x_i denotes the brightness of the i-th photo-emitter (i.e. the unknowns we are trying to determine), n is the total number of individual photo-emitters, PSF_i denotes the intensity point spread function formed by photo-emitter i, and p is the total number of pixels.
Since Eq. (1) is a linear system, if the detection is noiseless, and if all of the n PSFs are known a priori and are linearly independent, we only need the intensities detected by n individual pixels to solve for x exactly. The brightness of each of the photo-emitters can be recovered, even though their combined image PSFs are highly overlapped (spaced much closer together than the Rayleigh two-point resolution criterion for optical imaging). Since physical systems do have detector readout noise and photon shot noise, it is favorable to have the number of detections (i.e., the number of pixels in the detection region, p) larger than the number of unknowns (i.e., the number of photo-emitters, n) in order to solve for the brightness coefficients x_i accurately. Figure 2A shows that the total number of photo-emitters (unknowns) is approximately proportional to the illumination area A_E, while the total number of pixels in the recorded sub-image PSFs (i.e. detections) is proportional to the image detection area A_D. An estimate of numerical problem determination is given by dividing the detection area by the illumination area:
Determination ratio ≡ (# of detections) / (# of unknowns) ≈ A_D / A_E          (2)
The smaller the illumination area, the higher the determination ratio (Eq. 2), and therefore the more 'determined' the equation system. This has been confirmed by experiments showing that the higher the determination ratio, the greater the amount of super-resolution achievable while keeping image noise artifacts to a minimum.
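To make Eqs. (1) and (2) concrete, the sketch below simulates one sub-image as a linear superposition of emitter PSFs and evaluates the determination ratio. The Gaussian stand-in for the Airy-disc PSF, the 21x21-pixel detection region, the PSF width, and the emitter count are illustrative assumptions rather than parameters taken from the patent.

```python
# Minimal sketch, not from the patent: build the linear system of Eq. (1) for one
# sub-image and evaluate the determination ratio of Eq. (2).
# Assumptions: Gaussian stand-in for the Airy-disc PSF, arbitrary sizes and units.
import numpy as np

rng = np.random.default_rng(0)
pix = np.linspace(-2.0, 2.0, 21)                 # 21x21-pixel detection region
yy, xx = np.meshgrid(pix, pix, indexing="ij")
sigma = 0.25                                     # PSF width (arbitrary units)

def psf(x0, y0):
    """Gaussian stand-in for the PSF of a photo-emitter at (x0, y0)."""
    return np.exp(-((xx - x0) ** 2 + (yy - y0) ** 2) / (2.0 * sigma ** 2))

spot_radius = 0.5                                # illuminated (emitting) region
emitters = rng.uniform(-spot_radius, spot_radius, size=(6, 2))
brightness = rng.uniform(0.5, 1.5, size=len(emitters))   # the unknowns x_i

# Eq. (1): each pixel intensity is a linear superposition of the emitter PSFs.
A = np.column_stack([psf(x0, y0).ravel() for x0, y0 in emitters])
I_sub = A @ brightness                           # noiseless sub-image, flattened

# Eq. (2): determination ratio ~ detection area / illumination area.
detection_area = (pix[-1] - pix[0]) ** 2
illumination_area = np.pi * spot_radius ** 2
print("determination ratio ~", detection_area / illumination_area)
print("detections (pixels):", I_sub.size, " unknowns (emitters):", brightness.size)
```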
Figure 2B is a schematic block diagram illustrating the process of how sub-images 204 are acquired through scanning 202, iteratively associated with PSF templates from a dictionary of templates 206 using a numerical optimization procedure 208, and the set of best-match solutions 230 are reconstructed 210 and combined to generate a super-resolution image 212. Specifically, an acquired sub-image 204 is compared to combinations of highly-overlapping PSF templates from dictionary 206 until a best-match solution 230 is found. Generally the PSF templates are fixed in position once the dictionary is provided. The PSF template intensities are varied (including being set to zero) while process 208 is performed for each sub-image 204. Experiments have shown that capturing overlapping sub-images 204 results in better final images 212.
Note that the dictionary of templates is preferably calibrated to the set of PSFs produced by the imaging system so that the resulting reconstructed image is accurate for that imaging system. For example, in a two-dimensional imaging configuration of the embodiment of Figure 1, the sub-images 204 (see Figure 2) might be composed of overlapping Airy discs. In a fluorescence system, the photo-emitters have real, non-negative brightness. 202 illustrates an example of a scanning pattern performed at the object 102, in this case a spiral pattern. Generally, the scanning is done in steps, with a sub-image captured at each step. 204 is an example of a sub-image acquired during the scanning process. 206 shows a dictionary of templates which may be iteratively compared to each sub-image. 208 shows the image comparison process, in this case a non-negative least-squares optimization process. This process finds a best-match solution 230 for each sub-image and then creates a best-match
reconstruction 210 of sub-image 204 from the best-match solution 230. 212 shows the reconstructed image formed from combining best-match sub-image
reconstructions 210.
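One concrete way to carry out the non-negative least-squares comparison step 208 is sketched below: a simulated sub-image is decoded against a dictionary of overlapping Gaussian templates using SciPy's NNLS solver. The Gaussian templates, the 9x9 template grid, the noise level, and the emitter positions are illustrative assumptions; the patent specifies only that a non-negative least-squares optimization is one suitable algorithm.

```python
# Minimal sketch, not from the patent: decode one simulated sub-image by
# non-negative least squares against a dictionary of overlapping PSF templates.
# Assumptions: Gaussian PSF templates, a 9x9 template grid, SciPy's NNLS solver.
import numpy as np
from scipy.optimize import nnls

pix = np.linspace(-2.0, 2.0, 21)                 # detection region pixel grid
yy, xx = np.meshgrid(pix, pix, indexing="ij")
sigma = 0.25                                     # template PSF width

def template(x0, y0):
    """Flattened Gaussian template PSF centred at (x0, y0)."""
    return np.exp(-((xx - x0) ** 2 + (yy - y0) ** 2) / (2.0 * sigma ** 2)).ravel()

# Dictionary 206: a 9x9 grid of template centres spaced finer than a pixel.
centres = np.linspace(-0.4, 0.4, 9)
dictionary = np.column_stack([template(cx, cy) for cy in centres for cx in centres])

# Simulated sub-image 204: two emitters spaced well below the Rayleigh limit
# for this PSF width, plus a little detection noise.
truth = 1.0 * template(-0.10, 0.0) + 0.7 * template(0.15, 0.05)
sub_image = truth + np.random.default_rng(1).normal(0.0, 0.01, truth.shape)

# Process 208: vary the (non-negative) template intensities to find the best match.
coeffs, residual = nnls(dictionary, sub_image)
print("non-zero templates:", int(np.count_nonzero(coeffs > 1e-3)),
      " residual:", round(float(residual), 4))

# Best-match reconstruction 210: the recovered intensities laid out on the
# sub-pixel template grid form a super-resolved map of the emitters.
recon_map = coeffs.reshape(9, 9)
```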
Figures 3A-3C are diagrams showing in more detail the process of iteratively comparing an acquired sub-image 204 to combinations of highly-overlapping PSF templates from a dictionary 206 to find a best-match solution.
Figure 3A shows sub-object 302, which represents the ground-truth of this portion of the object. 204 is the acquired sub-image from sub-object 302. 306A is an intermediate solution after one iteration of process 208 (which is comparing solutions 306A-C to acquired sub-image 204). 308A represents what the reconstruction of sub-image 204 would look like after this one iteration. In general the process would not reconstruct intermediate sub-images - this is shown for clarity. Figure 3B shows the result after 51 iterations. 306B is an intermediate solution and 308B is how the intermediate reconstructed sub-object would now appear. As the process continues through more iterations, the reconstructed sub-objects 308A-C more closely resemble the original ground truth sub-object 302. After 1001 iterations (Figure 3C), best-match solution 306C (produced by choosing a combination of PSF template intensities in dictionary 206) becomes very similar to sub-image 204.
Therefore, by using the PSF template positions and intensities from this best-match solution, a super-resolved best-match reconstruction 308C is produced that closely resembles the original ground truth sub-object 302.
Figure 4 shows the process and results of selecting different template dictionaries 206 for the sub-image comparison and matching process. 206A is a relatively sparse PSF template dictionary, having an array of nine PSFs. 206B is a 7x7 array PSF dictionary and 206C is a 9x9 array PSF dictionary. In general, denser PSF dictionaries 206 result in more highly-resolved final images 212. Thus, image 212C is better than image 212B, which is better than image 212A. However, there are diminishing returns with denser PSF dictionaries, and processing time is increased. Thus, it is sometimes desirable to use a sparser PSF dictionary if quick results are desired or the object is less complex. In some embodiments, a user might select a sparse PSF template dictionary for a quick initial look at the object and then use a denser PSF dictionary for higher-resolution imaging if warranted. During the iterative process, the intensity combination of the template PSFs is varied. Preferably, the iterative process zeroes in on the best match solution efficiently.
In the present invention, details finer than the camera pixel size can be resolved by varying the design of the PSF dictionaries used as inputs to the optimization algorithm without imposing additional information (such as interpolation) to the acquired images. 206A, 206B, and 206C show examples of three possible dictionaries in which the template PSF spacing is made successively smaller. Here each square represents one camera pixel, the dots represent the centers of the template PSFs, and the dashed circles represent the full width at half maximum (FWHM) diameters of a selected number of PSFs to show how they increase in overlap as the PSF spacing is reduced. This template spacing in each dictionary is what determines the upper bound of the resolution of the final image after processing. Images 212A, 212B, and 212C show reconstructed image results from ~900 (scanned) experimentally-acquired sub-images of fluorescein-labeled actin filaments in a BPAE cell. The objective lens is 63x, NA 1.4. Here, image 212C shows that a super-resolution of ~50nm in XY can be achieved with the present invention as compared to the five-times worse (~250nm) maximum (diffraction-limited) capability of the same objective when used in a standard fluorescence microscope configuration.
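As a rough illustration of how the choice of template spacing trades processing cost for resolution, the sketch below builds three dictionaries of increasing density, loosely mirroring 206A-206C. The grid sizes, the extent of the template grid, and the Gaussian stand-in for the PSF are assumptions made for the demonstration, not values from the patent.

```python
# Minimal sketch, not from the patent: build PSF template dictionaries with
# successively finer template spacing, loosely mirroring dictionaries 206A-206C.
# Assumptions: Gaussian stand-in PSF, arbitrary grid extent and pixel geometry.
import numpy as np

pix = np.linspace(-2.0, 2.0, 21)
yy, xx = np.meshgrid(pix, pix, indexing="ij")
sigma = 0.25                                     # PSF width; FWHM = 2.355 * sigma

def make_dictionary(grid_points, extent=0.4):
    """Return (templates, spacing) for a grid_points x grid_points template grid
    spanning [-extent, extent]; one flattened Gaussian template per centre."""
    centres = np.linspace(-extent, extent, grid_points)
    cols = [np.exp(-((xx - cx) ** 2 + (yy - cy) ** 2) / (2 * sigma ** 2)).ravel()
            for cy in centres for cx in centres]
    return np.column_stack(cols), centres[1] - centres[0]

for n in (3, 7, 9):                              # sparse, medium, dense dictionaries
    D, spacing = make_dictionary(n)
    print(f"{n}x{n} dictionary: {D.shape[1]} templates, spacing {spacing:.3f} "
          f"(upper bound on reconstruction resolution), "
          f"FWHM/spacing = {2.355 * sigma / spacing:.1f}")
```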
Figures 5A-C show experimental image results to confirm the super-resolution capabilities of the new invention for a two-dimensional continuous object. Figure 5A is a traditional fluorescence microscope image and Figure 5B is a super-resolution image generated according to the present invention. Both images are captured with a wide-field microscope with an NA 0.5 objective, but the super-resolution image (Figure 5B) includes much more detail. Figure 5C shows a traditional fluorescence microscope image of the same object using a higher resolution objective lens (NA = 1.4), to demonstrate that the super-resolution image of Figure 5B is accurate (rather than containing artifacts). (Note the contrast has been inverted to black on white for ease of viewing.) The object is fluorescently-labeled mitochondria in a BPAE cell. Image insets show higher magnification regions for ease of resolution comparison.
Figures 6A-C show super-resolution in all three dimensions is possible for continuous 3D objects using the present invention with phase plate 112 inserted in the imaging path to produce axially asymmetric ring-shaped PSFs (as described in U.S. Pat. No. 9,325,971). Figure 6A shows a three-dimensional rendering of a super-resolved fluorescent biological object produced according to the present invention. Figure 6B shows a side-view of the same object acquired using the new invention to illustrate its axial super-resolution capabilities as compared to a side-view image assembled from a confocal image stack (Figure 6C). Here features along the vertical Z axis show that the axial super-resolution capabilities of the invention are similar to its lateral (X-Y axis) resolution capabilities (i.e. 100nm in all three dimensions). This is unlike the confocal image of Figure 6C, which shows the biological features elongated (smeared) along the Z axis, corresponding to its much poorer axial resolution. The object is a 15 µm thick, fluorescently-labeled Tetrahymena cell. Objective is 63x, NA 1.4.
What is claimed is:

Claims

1. The method of super-resolution imaging an extended object comprising the steps of:
(a) providing an illumination source configured to illuminate the object with an illumination spot having an illumination spot area at the object;
(b) scanning the illumination spot across the object;
(c) capturing a series of sub-images of the object wherein the sub-image detection region area substantially exceeds the illumination spot area;
(d) providing a dictionary of templates comprising PSFs; (e) comparing each sub-image to combinations of highly-overlapping PSF templates from the dictionary, while varying intensities of individual PSF templates, to find a best-match solution for that sub-image; and
(f) creating a best-match reconstruction of each sub-image from the best-match solution for that sub-image.
2. The method of claim 1 further comprising the step of combining best-match reconstructions from step (f) to form a super-resolution image of the object.
3. The method of claim 1 wherein step (c) captures overlapping sub-images.
4. The method of claim 1 wherein step (e) utilizes a non-negative least-squares optimization algorithm.
5. The method of claim 1 further including the step of:
(b)(1) applying a phase adjustment to an imaging path prior to step (c); wherein the phase adjustment produces depth-encoded sub-images of the object having point spread functions (PSFs) which vary according to depth range.
6. The method of claim 5 further including the step of decoding depth range within sub-images based upon the phase adjustment.
7. The method of claim 6 further including the step of super-resolving axial information within sub-images based upon the phase adjustment.
8. The method of claim 1 wherein the object includes fluorescent photo-emitters excited by the illumination spot.
9. The method of claim 1 wherein the detection region area is approximately an order of magnitude greater than the illumination spot area.
10. Apparatus for generating a super-resolution image of an extended object comprising: an illumination source configured to illuminate the object with an illumination spot having an illumination spot area at the object; scanning apparatus configured to scan the illumination spot across the object; a camera configured to capture a series of sub-images of the object wherein the area of the detected region substantially exceeds the area of the illumination spot; and a processor configured to -
- provide a dictionary of templates comprising PSFs,
- compare each sub-image to combinations of highly-overlapping PSF templates from the dictionary to find a best-match solution for that sub-image, and
- create a best-match reconstruction of each sub-image from the best-match solution.
11. The apparatus of claim 10 wherein the processor is further configured to combine best-match reconstructions and form a super-resolution image of the object.
12. The apparatus of claim 10 further comprising a phase mask disposed in an imaging path, the phase mask configured to produce depth-encoded sub-images of the object having point spread functions (PSFs) which vary according to depth range.
13. The apparatus of claim 12 wherein the processor is further configured to decode depth-range within sub-images.
14. The apparatus of claim 12 wherein the phase mask is a circular caustic.
15. The apparatus of claim 10 wherein the illumination spot is configured to excite fluorescent particles of the object.
PCT/US2017/022600 2016-03-15 2017-03-15 Super-resolution imaging of extended objects WO2017161055A2 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP17767483.5A EP3430460A4 (en) 2016-03-15 2017-03-15 Super-resolution imaging of extended objects
US16/085,524 US10663750B2 (en) 2016-03-15 2017-03-15 Super-resolution imaging of extended objects

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201662308799P 2016-03-15 2016-03-15
US62/308,799 2016-03-15

Publications (2)

Publication Number Publication Date
WO2017161055A2 true WO2017161055A2 (en) 2017-09-21
WO2017161055A3 WO2017161055A3 (en) 2018-08-23

Family

ID=59851192

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2017/022600 WO2017161055A2 (en) 2016-03-15 2017-03-15 Super-resolution imaging of extended objects

Country Status (3)

Country Link
US (1) US10663750B2 (en)
EP (1) EP3430460A4 (en)
WO (1) WO2017161055A2 (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2020127647A1 (en) * 2018-12-19 2020-06-25 Abberior Instruments Gmbh Fluorescent light microscopy with increased axial resolution
CN111479097A (en) * 2020-03-25 2020-07-31 清华大学 Scattering lens imaging system based on deep learning
US11307397B2 (en) * 2017-10-02 2022-04-19 Carl Zeiss Microscopy Gmbh High-resolution confocal microscope

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2014144820A1 (en) * 2013-03-15 2014-09-18 The Regents Of The University Of Colorado 3-d localization and imaging of dense arrays of particles
DE102019107267A1 (en) * 2019-03-21 2020-09-24 Carl Zeiss Microscopy Gmbh Process for high-resolution scanning microscopy
EP3979301A4 (en) * 2019-05-31 2023-06-28 Hamamatsu Photonics K.K. Semiconductor device examination method and semiconductor device examination device
US11567010B2 (en) * 2019-12-02 2023-01-31 Bioaxial Sas Dark tracking, hybrid method, conical diffraction microscopy, and dark addressing

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7772569B2 (en) 2008-04-01 2010-08-10 The Jackson Laboratory 3D biplane microscopy

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070177799A1 (en) * 2006-02-01 2007-08-02 Helicos Biosciences Corporation Image analysis
CN102177444B (en) * 2008-10-10 2014-07-09 皇家飞利浦电子股份有限公司 Practical SPECT calibration method for quantification of nuclides with high-energy contributions
US8237786B2 (en) * 2009-12-23 2012-08-07 Applied Precision, Inc. System and method for dense-stochastic-sampling imaging
US8941720B2 (en) * 2011-02-02 2015-01-27 National Tsing Hua University Method of enhancing 3D image information density
WO2014144820A1 (en) * 2013-03-15 2014-09-18 The Regents Of The University Of Colorado 3-d localization and imaging of dense arrays of particles
DE102013022538B3 (en) * 2013-09-03 2018-12-13 Georg-August-Universität Göttingen Stiftung Öffentlichen Rechts Method for creating a microscope image and microscopy device
WO2015157769A1 (en) * 2014-04-11 2015-10-15 The Regents Of The University Of Colorado, A Body Corporate Scanning imaging for encoded psf identification and light field imaging

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7772569B2 (en) 2008-04-01 2010-08-10 The Jackson Laboratory 3D biplane microscopy

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3430460A4

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11307397B2 (en) * 2017-10-02 2022-04-19 Carl Zeiss Microscopy Gmbh High-resolution confocal microscope
WO2020127647A1 (en) * 2018-12-19 2020-06-25 Abberior Instruments Gmbh Fluorescent light microscopy with increased axial resolution
CN111479097A (en) * 2020-03-25 2020-07-31 清华大学 Scattering lens imaging system based on deep learning

Also Published As

Publication number Publication date
WO2017161055A3 (en) 2018-08-23
US20190121155A1 (en) 2019-04-25
EP3430460A4 (en) 2019-11-06
US10663750B2 (en) 2020-05-26
EP3430460A2 (en) 2019-01-23

Similar Documents

Publication Publication Date Title
US10663750B2 (en) Super-resolution imaging of extended objects
US20220254538A1 (en) Fourier ptychographic imaging systems, devices, and methods
US10419665B2 (en) Variable-illumination fourier ptychographic imaging devices, systems, and methods
CN107850767B (en) Light sheet microscope for simultaneously imaging a plurality of object planes
CN110262026B (en) Aperture scanning Fourier ptychographic imaging
US10613312B2 (en) Scanning imaging for encoded PSF identification and light field imaging
US10996452B2 (en) High-resolution 2D microscopy with improved section thickness
US8946619B2 (en) Talbot-illuminated imaging devices, systems, and methods for focal plane tuning
Wang et al. Optical ptychography for biomedical imaging: recent progress and future directions
US20190075247A1 (en) System for generating a synthetic 2d image with an enhanced depth of field of a biological sample
EP2871511A1 (en) Spinning disk confocal using paired microlens disks
FR3057068A1 (en) SAMPLE OBSERVATION DEVICE AND SAMPLE OBSERVATION METHOD
De Luca et al. Re-scan confocal microscopy (RCM) improves the resolution of confocal microscopy and increases the sensitivity
Guo et al. Deep learning-enabled whole slide imaging (DeepWSI): oil-immersion quality using dry objectives, longer depth of field, higher system throughput, and better functionality
WO2013176549A1 (en) Optical apparatus for multiple points of view three-dimensional microscopy and method
US20230221541A1 (en) Systems and methods for multiview super-resolution microscopy
Jeong et al. High-speed dual-beam, crossed line-scanning fluorescence microscope with a point confocal resolution
US20090161210A1 (en) Microscopy system with revolvable stage
WO2020160146A1 (en) Fast volumetric imaging system and process for fluorescent tissue structures and activities
CN117369106B (en) Multi-point confocal image scanning microscope and imaging method
Li THE NOISE AND INFLUENCE ON FLUORESCENCE MICROSCOPY
Koho Bioimage informatics in STED super-resolution microscopy
Ikoma Computational Fluorescence Microscopy for Three Dimensional Reconstruction
Huelsnitz Very high numerical aperture three-dimensional imaging with three objectives
Grinbaum Multi-Height-Based Lensfree On-Chip Microscopy for Biomedical Applications

Legal Events

Date Code Title Description
NENP Non-entry into the national phase

Ref country code: DE

WWE Wipo information: entry into national phase

Ref document number: 2017767483

Country of ref document: EP

ENP Entry into the national phase

Ref document number: 2017767483

Country of ref document: EP

Effective date: 20181015

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17767483

Country of ref document: EP

Kind code of ref document: A2