GB2477844A - Highly spatially resolved optical spectroscopy/microscopy


Info

Publication number
GB2477844A
GB2477844A (application GB1101356A / GB201101356A)
Authority
GB
United Kingdom
Prior art keywords
sample
diffraction
light
microscopy
quantum
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
GB1101356A
Other versions
GB201101356D0 (en)
Inventor
Frank Michael Ohnesorge
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual
Publication of GB201101356D0
Publication of GB2477844A
Legal status: Withdrawn


Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01Q SCANNING-PROBE TECHNIQUES OR APPARATUS; APPLICATIONS OF SCANNING-PROBE TECHNIQUES, e.g. SCANNING PROBE MICROSCOPY [SPM]
    • G01Q30/00 Auxiliary means serving to assist or improve the scanning probe techniques or apparatus, e.g. display or data processing devices
    • G01Q30/02 Non-SPM analysing devices, e.g. SEM [Scanning Electron Microscope], spectrometer or optical microscope
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B82 NANOTECHNOLOGY
    • B82Y SPECIFIC USES OR APPLICATIONS OF NANOSTRUCTURES; MEASUREMENT OR ANALYSIS OF NANOSTRUCTURES; MANUFACTURE OR TREATMENT OF NANOSTRUCTURES
    • B82Y20/00 Nanooptics, e.g. quantum optics or photonic crystals
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/62 Systems in which the material investigated is excited whereby it emits light or causes a change in wavelength of the incident light
    • G01N21/63 Systems in which the material investigated is excited whereby it emits light or causes a change in wavelength of the incident light optically excited
    • G01N21/64 Fluorescence; Phosphorescence
    • G01N21/6428 Measuring fluorescence of fluorescent products of reactions or of fluorochrome labelled reactive substances, e.g. measuring quenching effects, using measuring "optrodes"
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/62 Systems in which the material investigated is excited whereby it emits light or causes a change in wavelength of the incident light
    • G01N21/63 Systems in which the material investigated is excited whereby it emits light or causes a change in wavelength of the incident light optically excited
    • G01N21/64 Fluorescence; Phosphorescence
    • G01N21/645 Specially adapted constructive features of fluorimeters
    • G01N21/6456 Spatial resolved fluorescence measurements; Imaging
    • G01N21/6458 Fluorescence microscopy
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N33/00 Investigating or analysing materials by specific methods not covered by groups G01N1/00 - G01N31/00
    • G01N33/48 Biological material, e.g. blood, urine; Haemocytometers
    • G01N33/50 Chemical analysis of biological material, e.g. blood, urine; Testing involving biospecific ligand binding methods; Immunological testing
    • G01N33/58 Chemical analysis of biological material, e.g. blood, urine; Testing involving biospecific ligand binding methods; Immunological testing involving labelled substances
    • G01N33/588 Chemical analysis of biological material, e.g. blood, urine; Testing involving biospecific ligand binding methods; Immunological testing involving labelled substances with semiconductor nanocrystal label, e.g. quantum dots
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01Q SCANNING-PROBE TECHNIQUES OR APPARATUS; APPLICATIONS OF SCANNING-PROBE TECHNIQUES, e.g. SCANNING PROBE MICROSCOPY [SPM]
    • G01Q60/00 Particular types of SPM [Scanning Probe Microscopy] or microscopes; Essential components thereof
    • G01Q60/02 Multiple-type SPM, i.e. involving more than one SPM techniques
    • G01Q60/06 SNOM [Scanning Near-field Optical Microscopy] combined with AFM [Atomic Force Microscopy]
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01Q SCANNING-PROBE TECHNIQUES OR APPARATUS; APPLICATIONS OF SCANNING-PROBE TECHNIQUES, e.g. SCANNING PROBE MICROSCOPY [SPM]
    • G01Q60/00 Particular types of SPM [Scanning Probe Microscopy] or microscopes; Essential components thereof
    • G01Q60/18 SNOM [Scanning Near-Field Optical Microscopy] or apparatus therefor, e.g. SNOM probes
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B21/00 Microscopes
    • G02B21/0004 Microscopes specially adapted for specific applications
    • G02B21/002 Scanning microscopes
    • G02B21/0024 Confocal scanning microscopes (CSOMs) or confocal "macroscopes"; Accessories which are not restricted to use with CSOMs, e.g. sample holders
    • G02B21/0052 Optical details of the image generation
    • G02B21/0072 Optical details of the image generation details concerning resolution or correction, including general design of CSOM objectives
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B21/00 Microscopes
    • G02B21/0004 Microscopes specially adapted for specific applications
    • G02B21/002 Scanning microscopes
    • G02B21/0024 Confocal scanning microscopes (CSOMs) or confocal "macroscopes"; Accessories which are not restricted to use with CSOMs, e.g. sample holders
    • G02B21/0052 Optical details of the image generation
    • G02B21/0076 Optical details of the image generation arrangements using fluorescence or luminescence
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B21/00 Microscopes
    • G02B21/16 Microscopes adapted for ultraviolet illumination; Fluorescence microscopes
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B21/00 Microscopes
    • G02B21/36 Microscopes arranged for photographic purposes or projection purposes or digital imaging or video purposes including associated control and data processing arrangements
    • G02B21/365 Control or image processing arrangements for digital or video microscopes
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/58 Optics for apodization or superresolution; Optical synthetic aperture systems
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/62 Systems in which the material investigated is excited whereby it emits light or causes a change in wavelength of the incident light
    • G01N21/63 Systems in which the material investigated is excited whereby it emits light or causes a change in wavelength of the incident light optically excited
    • G01N21/64 Fluorescence; Phosphorescence
    • G01N2021/6417 Spectrofluorimetric devices
    • G01N2021/6423 Spectral mapping, video display

Abstract

Highly spatially resolved optical microscopy or spectroscopy of a sample 1.8 is performed at rapid imaging rates beyond the diffraction limit (4, figure 1b). The method is computer-assisted: the originally formed diffraction image is reconstructed by back-transforming computer software after the diffraction image data have been recorded directly by a large array 3 of very small pixel sensors that quantitatively measures the light intensity profile.

Description

Concept for non-scanning Fraunhofer- and/or Fresnel-regime optical microscopy and spatially resolved optical spectroscopy - e.g. utilizing the FTIR technique - in special cases beyond the lateral diffraction limit.
Description:
The invention is about a technical principle and an apparatus (Fig. 1), in several versions, by means of which fast (digital-video time scale) optical spectroscopy with a lateral resolution below/beyond the general diffraction limit (i.e. < λ_light/2) could be realized in the optical far field (or also in the Fresnel regime / spherical-wave approximation), particularly in certain special cases. The invention is also about application examples for a new kind of digital data storage as well as for a microbiological analysis method. In combination with ordinary FTIR (Fourier transform infrared spectroscopy), all mentioned fields of application benefit optimally from the enormously accelerated data acquisition rate achieved this way.
The task of the present invention is thus to find ways to circumvent/extend/overcome the lateral diffraction limit in non-scanning optical (color) microscopy / laterally resolved optical spectroscopy under certain special circumstances. This problem is proposed to be solved using a concept for (numeric) computer-assisted digital optical microscopy as outlined mainly in claims 1-3, which in particular relies on the special provision of knowing the sample topography/geometry (not the color details) in advance and/or on mutually incoherently luminescing sample details.
When micrographically viewing an illuminated sample through an objective lens (magnifying glass), two limitations occur which convolute the lateral light intensity distribution representing the micrographic image in a very complicated manner, a convolution which usually cannot practically be back-computed/deconvoluted when looking on a lateral scale beyond the diffraction limit: Firstly, on the length scale of the illuminating light source's coherence length, all resolved scatterers (light points) mutually interfere and form an interference pattern in the objective lens's focal plane. Secondly, on the length scale of the diffraction limit and of the light source's coherence length, each one of these scatterers (light points of non-zero spatial/lateral extent - sample details) exhibits its own single diffraction pattern (also forming in the focal plane of the objective lens), and these patterns then additionally interfere among each other as mentioned above, if the light source's coherence length is sufficiently long. This situation is to be regarded in full analogy to diffraction by an optical grating or a double slit, where the diffraction of the single slit is always superimposed (Fig. 1b - "double slit experiment"). Hereby, the said spatial diffraction limit is determined by the wavelength λ of the illuminating light and the numerical aperture NA = n sin α of the objective lens; Abbe's resolution limit thus amounts to roughly g = λ/NA, where g is the average separation of the light points which can at best be resolved, all this if only 1st-order interference maxima are "caught" by the objective lens's acceptance angle α. It is emphasized that this actually is the definition of Abbe's microscopical diffraction limit: the first-order diffraction peaks have to be "caught" within the acceptance angle α as given by the numerical aperture of the viewing optics.
However, if the image formation is done, firstly, by a high-resolution light pixel detector array such as a CCD camera instead of basically a frosted glass screen - the former can actually quantitatively measure the light intensity profiles of the optical image, so it should no longer be necessary that the first diffraction order gets caught within the acceptance angle α of the viewing optics for the smallest structures to be resolved - and, secondly, the image formation is in particular done already at a place in the optical path where the mere diffracted image still forms (i.e. directly behind the illuminated sample or in the objective lens's focal plane), then it should be possible to overcome respectively circumvent the diffraction limit as defined by Abbe even in wide-area ("non-scanning") optical far-field imaging, in particular in certain specialized cases as will be described. But even in the general case, it should be possible to numerically determine the complete diffracted intensity profile from quantitatively measuring a large part of the intensity profile of the 0th-order diffraction peak, as indicated in Fig. 1d for the case of a double-slit experiment; the shape of the intensity profile of the 0th-order peak already depends on both the width of the slits and their separation, so both can be obtained by fitting a large part of the 0th-order peak to the intensity profile equation I(θ) = I0 · [sin((πa/λ)·sinθ) / ((πa/λ)·sinθ)]² · cos²[(πd/λ)·sinθ], where I is the light intensity as a function of the observation angle θ, a is the width of each slit, d is the distance between the slits, and the observation angle θ corresponds to the above acceptance angle α for viewing of the 1st-order diffraction maximum. Now, if the complete diffracted intensity profile can already be numerically determined from only a (large) portion of the 0th-order diffraction peak, a smaller acceptance angle α' (i.e. a numerical aperture of n sin α') would already be sufficient to resolve the separation d (and the width a) of the slits. Thus, with an objective lens possessing a numerical aperture of n sin α, not only d can be resolved but also (paired) sample detail separations smaller than d, depending on how much of the 0th-order diffraction peak needs to be measured for the computer to be able to fit the whole intensity profile curve according to the above equation (Fig. 1d).
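As a purely illustrative numerical sketch of this fitting step (not part of the original disclosure), the following Python snippet generates a noisy 0th-order double-slit profile restricted to a reduced acceptance angle and fits it to the intensity equation given above; the wavelength, slit dimensions and noise level are assumed values chosen only for demonstration.

```python
# Hedged sketch: a large part of the 0th-order double-slit peak already
# encodes both slit width a and separation d. All numbers are assumptions.
import numpy as np
from scipy.optimize import curve_fit

lam = 633e-9                       # illumination wavelength (assumed HeNe line)
a_true, d_true = 0.4e-6, 1.0e-6    # assumed slit width and separation

def double_slit(theta, I0, a, d):
    """Fraunhofer double-slit intensity profile used in the text."""
    x = (np.pi * a / lam) * np.sin(theta)
    envelope = np.where(np.abs(x) < 1e-12, 1.0, np.sin(x) / x) ** 2
    return I0 * envelope * np.cos((np.pi * d / lam) * np.sin(theta)) ** 2

# "Measured" intensities restricted to a reduced acceptance angle alpha'
# that does NOT reach the 1st-order maximum belonging to separation d.
alpha_prime = 0.25                                  # rad, < arcsin(lam/d)
theta = np.linspace(-alpha_prime, alpha_prime, 400)
rng = np.random.default_rng(0)
data = double_slit(theta, 1.0, a_true, d_true) * (1 + 0.01 * rng.standard_normal(theta.size))

popt, _ = curve_fit(double_slit, theta, data, p0=(1.0, 0.3e-6, 0.8e-6))
print("fitted a = %.3g m, d = %.3g m" % (popt[1], popt[2]))
```

With a reasonable starting guess, the fit recovers both a and d even though the 1st-order maximum lies outside the measured angular range, which is exactly the point argued above.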
However, this is very demanding on both the measurement/data quality and the numerical software, especially in the Fresnel case (when viewing the double slit/grating in the regime of the spherical-wave approximation instead of the far-field/plane-wave Fraunhofer regime), where the above equation would have to be replaced by the Fresnel equations. Further, for many sample details, such as a grating with lattice constant g, this task obviously becomes much more difficult and demanding than for a double slit (i.e. a pair of sample details at small separation) as described by Fig. 1d. Thus, this deconvolution may only be possible if the sample topography/geometry below the diffraction limit is known in advance, which pre-information could be provided by scanning probe microscopy; another intermediate possibility for reaching a resolution only slightly below the diffraction limit could be to simultaneously image with conventional high-resolution optical microscopy, such that only 2 or 3 sample details below the diffraction limit would have to be expected within that small area of approximately λ/2 x λ/2. Obviously, the numerical analysis would be largely simplified if the mutual interference of the sample details were "switched off" and the analysis of the diffracted intensity profile were reduced to mere "single slit" (single aperture / opaque disc, respectively) diffraction, being otherwise basically the casting of a shadow.
Thus, when viewing a sample through an ordinary objective lens ("magnifying glass"), the mutual interference of the sample's light points (analogy to an optical grating) is - in the diffraction-limited case - usually viewed only up to the first interference order, while higher orders do not get collimated, i.e. are no longer "caught" by the objective lens. This basically represents the definition of the numerical aperture NA = n sin α, where α is the maximum acceptance angle of the objective lens, which coincides with the angle under which the first diffraction order (e.g. of a diffraction grating as a sample) of the smallest sample separations occurs. This diffraction pattern of the mutually interfering sample-detail light points forms in the objective lens's focal plane, while the real-space image of the sample gets reconstructed in the image plane of the objective lens. However, since the NA of the objective lens cuts out all diffraction orders higher than the first, the resolution already goes down.
Secondly, there is the diffraction of the light points themselves, which have a non-zero lateral extent, in full analogy to single-slit diffraction. However, the first-order diffraction minimum of this Airy diffraction pattern (the diffraction pattern of a small circular aperture) usually lies much further out than the first-order diffraction maximum of the mutual interference of the plurality of light points (e.g. an array of quantum troughs or a grating). Therefore, from this Airy diffraction pattern only the 0th-order maximum gets viewed by an ordinary objective lens within its acceptance angle α = arcsin(λ/(g·n)), where g is the average separation of these maximally resolved light points/quantum troughs.
About the mutually interfering light points (the sample details reflecting/scattering the illuminating light), in principle nothing can be done - since the resulting convolution of the lateral intensity distribution (i.e. the image information on the sample) is far too complicated to be deconvoluted/back-computed - other than reducing the illuminating light source's coherence length below the average separation distance of the sample details and/or viewing only in first diffraction order, which further reduces the effective numerical aperture NA and thus raises the diffraction limit originating from the diffraction of the single sample-detail light sources (the "single slit" analogy). When viewing the sample additionally through an ocular (eye piece), the magnification will greatly increase, but the combined numerical aperture will go down (e.g. by only viewing first-order diffraction and basically cutting out higher orders using an aperture), and thus the diffraction limit - i.e. the lateral size of the smallest viewable sample detail - will eventually rise further.
Now, when in particular viewing a plurality of luminescing quantum dots, the first limitation - mutual interference of the light emanating from these quantum dots/troughs - will mostly and to a great extent not occur, as the luminescing light (portions) will mostly lose their phase in the luminescing (resonance) process, and thus the light (portions) emanating from the various quantum troughs will be mutually incoherent. This leaves the second limitation, which is the diffraction limit caused by the diffraction of the single sample details (again the "single slit" analogy), i.e. the quantum dots. But this limitation can be lifted, or at least improved, if instead of an image-forming eye-piece lens a computer software is used to back-transform/deconvolute the diffracted image of the sample, as recorded directly by a 2-dimensional light pixel sensor array (e.g. a CCD chip), into a real-space image.
Theoretically, even a sample's mutually interfering point light sources (sample details) forming an interference pattern in the objective lens's focal plane should in principle be back-transformable using such computer software, especially - but not necessarily - if the sample geometry/topography is known; practically, however, this deconvolution does not, in my opinion, appear possible yet if the sample geometry/topography is not known.
Hence, by limiting the range of samples of interest in particular to pluralities of luminescing nanoparticles (also molecules, such as in particular dye molecules) - quantum troughs in a wider sense - the optical resolution problem is reduced to deconvoluting/back-transforming the partially (but only scalarly, additively) superimposed diffraction patterns which each single luminescing nanoparticle/quantum trough generates. This wavelength-dependent deconvolution/back-transform can be performed particularly conveniently if the topography/geometry of the single nanoparticles is known from other ultra-resolution microscopy methods such as scanning probe microscopy, resulting in spatially ultra-highly resolved spectroscopic (color) images.
The second contribution to the well-known spatial diffraction limit in optics - the mutual interference of the light emanating from the nanoparticles as point light sources - is literally switched off when dealing with mutually incoherently luminescing/fluorescing quantum troughs, due to the statistically varying optical resonance relaxation times of the various nanoparticles forming said quantum troughs.
However, even if the quantum troughs also exhibited, for instance, irradiated light portions which are coherent - e.g. portions of reflected light - and these nanometric light sources then partially mutually interfered, a wavelength-dependent full deconvolution/back-transformation of the diffracted (color) image should still be possible theoretically, although much more difficult practically, at least if the geometry/topography of the said nanometric light sources (sample details) is known in advance, e.g. from scanning probe microscopy or electron microscopy.
On the other hand, in the case of mutually incoherently luminescing quantum troughs, the geometrical/topographical detail structure of the sample is in principle no longer needed for the back-computation of the color images from the diffracted color images to be possible; this knowledge of the sample geometry would only significantly simplify the deconvolution/back-computation for obtaining spectroscopic color images beyond the diffraction limit.
It is hereby emphasized that, in my opinion, the diffraction limit cannot be beaten in principle when using a non-scanning microscopy technique, whereas all the near-field scanning microscopy techniques obviously use the simple trick of a scanned illumination spot which is smaller than the said diffraction limit, and thus in the latter the diffraction limit is circumvented.
However, the Abbe definition of the diffraction limit has to be considered carefully: it is based on the definition of the numerical aperture as essentially the angle under which the first-order diffraction maximum of the smallest sample details' average separations would appear. Now, if this numerical aperture can be enlarged in special cases by various tricks - either by essentially using an extreme-high-resolution light pixel array (e.g. CCD) camera very close to the sample, quantitatively measuring light intensity profiles, or by essentially switching off the mutual interference of the sample details - this per-definition unbeatable diffraction limit can be extended, e.g. by catching higher diffraction orders of the always present "Airy diffraction pattern" originating from the sample details. The second and simpler method is of course the special case of only considering samples whose geometry/topography is known on that small scale, e.g. from scanning probe microscopy, since then the "convolution" of the image structure as induced by the diffraction limit can be back-calculated/back-transformed/de-convoluted for the various wavelengths, and thus a color micrograph, respectively a laterally resolved spectroscopic image, of the sample can be obtained below the diffraction limit.
Therefore, the task of the present invention is a concept for computer-aided optical non-scanning color microscopy (spatially resolved optical spectroscopy) which should be able to overcome the diffraction limit as commonly defined, at least in certain specialized cases, especially regarding certain optical sample properties such as the degree of (in-)coherence of light portions emitted from various sample details, and/or in particular if topographical pre-information about the sample is known. One proposed trick is extending the common diffraction limit by effectively achieving a higher numerical aperture, either by exploiting certain optical sample properties and/or by optimizing the optical path of the microscope, replacing at least the ocular part by a high-resolution light pixel array camera suitably located in the optical path; the other trick is enabling or simplifying the back-computation/reconstruction of the small-scale (non-scanning far-/Fresnel-field) optical image, which is usually blurred by the diffraction effects described by the diffraction limit, simply by knowing the sample geometry on that small scale from other techniques such as scanned probe microscopies.
The diffraction limit of optics says that 2 point-like (structural) details cannot be resolved (separately) if their (light) diffraction patterns overlap too closely to be resolved (i.e. separately visible), e.g. on a photographic film or a frosted glass screen. Figs. 1b, 2a, 2b explain the definition of the diffraction limit. This case occurs roughly if the structural detail sizes to be imaged become smaller than half the wavelength of the light used for observation/microscopy, of course still depending on the numerical aperture of the imaging optics (objective lens(es)).
If now, however, this image-displaying frosted glass screen is replaced by a CCD sensor, which is able to quantitatively measure the "shape" (i.e. the lateral intensity profile) of the diffraction peak of a structural detail of the sample - e.g. the Airy diffraction pattern intensity distribution of a small disc of diameter a (for λ_light/2 < a or slightly > a [18] - Mie scattering), or its dipole respectively multipole radiation characteristics (for λ_light/2 >> a - Rayleigh scattering) - then the situation is different, and it could then be said that theoretically, in special cases, there is no diffraction limit of approximately λ_light/2 anymore, even if one stays within the picture of linear optics; e.g. in aperture-less microscopy of the diffraction image, if the detector pixel array were almost infinitely large (large effective numerical aperture). Hereby it is remarked that for λ_light > a, completely pronounced diffraction minima will not occur anymore. The structures can nevertheless be deconvoluted, since the CCD camera measures the intensity profiles of the diffraction peaks quantitatively - note that no frosted glass screen is used here. A further physical (deeper, fundamental) reason for this statement is, however, that an array of arbitrarily small (nanometric) objects (especially metallic ones), if irradiated by light of any wavelength λ, will always - due to non-linear optical (electromagnetic) effects - also emit light again (generally they will luminesce, or fluoresce, phosphoresce), however incoherently with respect to each other, and they will also emit scattered electromagnetic radiation at wavelengths of fractions of the above λ (Fourier expansion plus multipole expansion of the light emission by a scatterer < or << λ), since such a nanometric (in particular metallic) scatterer is - in a wider sense - always an antenna too. For these "new", much shorter wavelengths than λ, i.e. those λ/i, i = 1...∞, the diffraction limit holds again in the picture of linear optics and aperture-using microscopy. For the case λ_light/2 >> a (Rayleigh scattering), instead of the Airy diffraction pattern the radiation characteristics of an antenna (Hertz dipole in the simplest case, or multipole) has to be applied, i.e. roughly proportional to a²·sin²θ/r² (Fig. 1c).
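To make the two single-scatterer profiles named above concrete, the following hedged sketch evaluates the Airy intensity of a small disc (the λ/2 ≲ a, "Mie-like" case) and the Hertz-dipole characteristic (the λ/2 >> a, Rayleigh case); the wavelength and disc diameter are assumed example values, not parameters from the patent.

```python
# Hedged sketch of the two single-scatterer profiles named in the text.
import numpy as np
from scipy.special import j1

lam = 500e-9
theta = np.linspace(1e-6, 0.5, 1000)          # observation angle in rad (avoid 0)

def airy_intensity(theta, a):
    """Far-field Airy pattern of a circular aperture/disc of diameter a."""
    x = (np.pi * a / lam) * np.sin(theta)
    return (2.0 * j1(x) / x) ** 2

def dipole_intensity(theta, r=1.0):
    """Hertz-dipole radiation characteristic, theta measured from the dipole axis."""
    return np.sin(theta) ** 2 / r ** 2

I_mie = airy_intensity(theta, a=2e-6)          # a a few lambda: first minimum visible
I_rayleigh = dipole_intensity(theta)           # a << lambda: smooth, no diffraction minima
```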
For the case of two neighboring microscopically small discs which are to be imaged, the diffraction patterns of these discs would be two overlapping "Airy diffraction pattern functions" (Figs. 1b, 2a, b).
If now the geometrical shape of these microscopic structural details is known, e.g. two little round discs, it can of course be calculated that the resulting diffraction pattern (Fourier space: 2-dimensional Fourier transformation of the lateral absorption function in (x,y)) of two such small discs (e.g. quantum dots) is the interfering overlap of two such Airy diffraction pattern functions (Figs. 2a and 2b show in this case the envelope of the total intensity; see also Fig. 1b, which is simply the double-slit experiment), and of course it can be back-calculated (inverse-transformed) to the diffracting (scattering) structural details (real space). In the envelope of these interfering Airy diffraction pattern functions the near-field information should still be contained (hypothesis!), even in the far field, at least in the Fresnel regime. Here, the reason would be that partially incoherent light emanates from the (little) discs, because of which the Airy diffraction patterns also - to a small extent - overlap non-interferingly, i.e. a small portion of them adds as scalars, also in the general case. For the case of quantum troughs luminescing completely independently of each other, which thus also radiate again completely incoherently with respect to each other, there is - except for the Airy diffraction patterns of the single discs themselves - no interference of the point light sources with respect to each other. Then the Airy diffraction pattern intensity profiles (for λ_light/2 <= a or slightly > a [18] - Mie scattering), respectively the dipole/multipole radiation characteristics (for λ_light/2 >> a, Rayleigh scattering), add up as scalars (as opposed to vectors when they would mutually interfere) and can thus be deconvoluted more simply by subtraction and single Fourier back-transformation (of the single Airy-disc diffraction patterns) afterwards (for λ_light/2 <= a or slightly > a [18], Mie scattering). For the case λ_light/2 >> a (Rayleigh scattering), the Hertz dipole characteristics have to be calculated back to the scatterer (dipole/multipole).
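The distinction between coherent (vector/field) addition and incoherent (scalar/intensity) addition drawn in this paragraph can be illustrated with a minimal one-dimensional model; the geometry and numbers below are assumptions made only for the demonstration.

```python
# Hedged numerical sketch: two coherent point scatterers interfere (field
# amplitudes add), whereas two mutually incoherent luminescing quantum
# troughs add as scalar intensities. 1-D far-field phase model, assumed values.
import numpy as np

lam = 500e-9
d = 1.0e-6                                # separation of the two emitters
theta = np.linspace(-0.5, 0.5, 2001)      # observation angles (rad)

phase = 2 * np.pi * d * np.sin(theta) / lam   # relative far-field phase

I_coherent   = np.abs(1 + np.exp(1j * phase)) ** 2    # field sum -> fringes
I_incoherent = 1.0 + 1.0 * np.ones_like(theta)        # intensity sum -> no fringes

# The incoherent case carries no mutual-interference term, which is why the
# text argues it can be deconvoluted by simple subtraction of the single-
# emitter profiles.
```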
Thus, if, by means of a CCD sensor (e.g. a video camera), the intensity profile of the diffraction image of a sample whose details emit at least partially mutually incoherently is quantitatively measured, then in principle this profile could theoretically be deconvoluted by a numerical computer using certain additional information about the optical imaging/transformation (mapping of the imaging profile and transfer function of the microscope optics (lenses) with regard to an infinitesimally small light point - i.e. the so-called "point spread function"). The resolution then depends only on the light sensitivity and the dynamic range of the pixels of the CCD camera, as well as on the lateral pixel size relative to the optical magnification of the optical image, but also on the total number of pixels. For highest accuracy, non-linear effects - i.e. the light emission of the sample's nanometric scatterer(s) at additional other, especially smaller, wavelengths than the one the sample is irradiated with (which, however, still mostly dominates also in the scattered light) - would have to be considered when back-calculating/deconvoluting the optical image.
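A minimal sketch of the point-spread-function-based deconvolution mentioned above, assuming a simple Gaussian PSF and a Wiener-type inverse filter (one standard choice among many, not the method prescribed by the patent):

```python
# Hedged sketch: recorded image modelled as object convolved with the PSF,
# restored with a regularised inverse filter in Fourier space. PSF shape and
# the regularisation constant eps are illustrative assumptions.
import numpy as np

def wiener_deconvolve(image, psf, eps=1e-3):
    """Divide by the PSF's transfer function with a small regulariser eps."""
    H = np.fft.fft2(psf, s=image.shape)
    G = np.fft.fft2(image)
    F_est = G * np.conj(H) / (np.abs(H) ** 2 + eps)
    return np.real(np.fft.ifft2(F_est))

# Toy object: two sub-resolution bright points.
obj = np.zeros((128, 128)); obj[60, 60] = obj[60, 66] = 1.0
y, x = np.mgrid[:128, :128]
psf = np.exp(-((x - 64) ** 2 + (y - 64) ** 2) / (2 * 4.0 ** 2))
psf /= psf.sum()

blurred = np.real(np.fft.ifft2(np.fft.fft2(obj) * np.fft.fft2(np.fft.ifftshift(psf))))
restored = wiener_deconvolve(blurred, np.fft.ifftshift(psf))
```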
Due to lens imperfections (even just the finite lens diameter of course already represents a limitation/imperfection), a (simple) pin-hole camera (camera obscura) would have advantages; thus, best would of course be an imaging method which can completely leave out [avoid] any apertures, i.e. the case in which the pixel array is just of sample size and of course has extremely small pixels, where the pixel size then represents the resolution limit for direct near-field imaging in real space. Ideally, however, the detector pixel array is much larger, with extremely many and extremely small pixels, and the diffracted/scattered image ("Fourier/reciprocal space") of the sample is imaged in the Fresnel regime, taking account of higher diffraction orders (equivalent to smaller sample structure details) - i.e. with very high effective numerical aperture. The real-space image is then obtained by numerical (Fourier) back-transformation, possibly numerically corrected for the spherical-wave approximation in the Fresnel regime (for λ_light/2 <= a, Mie scattering). Again, in the case λ_light/2 >> a, (Rayleigh) scattering theory has to be applied, i.e. in the simplest case dipole characteristics for every single quantum dot.
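For the numerical back-transformation from the recorded diffraction plane to the sample plane, one standard option is the angular-spectrum propagator sketched below (valid in the Fresnel regime and beyond); grid size, wavelength and distance are assumed inputs, and, as the text implies, the complex field (amplitude and phase) must be available, e.g. from an interferometric recording.

```python
# Hedged sketch of numerical back-propagation by the angular-spectrum method.
import numpy as np

def back_propagate(field, lam, z, dx):
    """Propagate a complex 2-D field by -z, i.e. back towards the sample plane."""
    n = field.shape[0]
    fx = np.fft.fftfreq(n, d=dx)                      # spatial frequencies (1/m)
    FX, FY = np.meshgrid(fx, fx)
    kz = 2 * np.pi * np.sqrt(np.maximum(0.0, 1 / lam ** 2 - FX ** 2 - FY ** 2))
    return np.fft.ifft2(np.fft.fft2(field) * np.exp(-1j * kz * z))

# Caveat noted in the text: a pixel camera records only |field|^2, so the phase
# must either be supplied (interferometric recording) or recovered iteratively.
```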
For the case that the microscope's optical beam path goes through an optical apparatus (with apertures and lenses), arbitrary structural details which shall be viewed in the optical far field beyond/below the diffraction limit generally cannot practically be deconvoluted because of the interference, even if the point spread function of the system was, or could be, determined accurately, and thus for instance lens aberrations could be corrected in a digital image acquisition system. However, optical super-resolution in the far field by deconvolution should nevertheless be possible with restrictions, if sufficient additional information/data are available, for instance from combination with other microscopy methods. For instance, if in the simplest case it is known that only 2 small discs of known diameter and position (size and distance beyond/below the diffraction limit) are to be imaged, then these 2 Airy diffraction pattern functions (i.e. the diffraction pattern / diffracted intensity profile of an opaque (dark) disc), respectively the dipole characteristics, can be relatively easily de-convoluted/back-calculated; if it is three well-defined discs within the diffraction limit, it is naturally already much more complicated, especially if their positions are unknown, and so forth. The more structural details (and the more ill-defined in shape) within/below/beyond the diffraction limit, the more difficult the de-convolution of course gets, if not impossible, since there are too many unknowns. The more unknowns, the more additional information / boundary values are needed (e.g. from scanning probe microscopies) in order to still make the de-convolution possible. As mentioned, for highest accuracy the non-linear optical effects in light scattering on nanometric structural details (especially metallic nanoparticles as nanometric "antennas") have to be taken into account, i.e. besides the irradiated wavelength, the other emitted wavelengths - especially the smaller ones - have to be considered as well.
Here, the setup of the present invention is initially restricted to spatially ultra-highly resolved spectroscopy (beyond/below the lateral diffraction limit) of a known sample geometry of incoherent point light sources, in which case the de-convolution becomes much simpler and non-ambiguous; basically, the de-convolution here reduces to a simple subtraction in "Fourier/reciprocal space", especially if the many structural details are not too dense (on the sample surface) - the "entanglement" [folding] of the 0th-order diffraction maxima of course becomes stronger the closer the structural details lie next to each other within the diffraction limit. Thus, if the pixel detector is color sensitive, e.g. by splitting the imaging light via a prism (as is usually the case in commercial high-quality video cameras) into three or more beam paths of different color which are directed onto three or more CCD sensors (one each for red, yellow/green and blue), then optical spectroscopy can be performed with extremely high spatial resolution, essentially only by using a high-quality video camera and digital image processing (de-convolution/subtraction) - Fig. 2c. Alternatively, there are also color CCD arrays, whose spatial resolution is of course lower in principle (roughly 1/3, because of 3 CCD pixels per image pixel), and possibly the spectroscopy can be performed by means of a tunable interference filter in front of the pixel detector; this latter version would, however, partly sacrifice the fast data acquisition rate advantage emphasized in the present invention [by roughly a factor of three due to the serial acquisition of the 3 colors].
By constant (e.g. back-and-forth) movement of the CCD sensor, the lateral resolution of the CCD sensor itself could even be optimized to sub-pixel resolution (which is used in state-of-the-art high-resolution video microscopy, as mentioned by V. Moy); however, the de-convolution/back-calculation would then be immensely more difficult, and more computation time would be needed.
Obviously, FTIR (Fourier transform infrared spectroscopy) using the setup in Fig. 1 would be the spectroscopy method of choice for optimizing a very fast data acquisition rate.
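As a reminder of the FTIR principle invoked here (a generic sketch, not a description of the patented apparatus): the spectrum is obtained as the Fourier transform of the interferogram recorded over the optical path difference, as in the assumed two-line example below.

```python
# Hedged FTIR sketch: spectrum = Fourier transform of the interferogram.
# The path-difference range and the two emission lines are assumed values.
import numpy as np

opd = np.linspace(-0.2e-3, 0.2e-3, 4096)          # optical path difference (m)
wavenumbers = np.array([1.0e5, 1.5e5])            # two emission lines (1/m)
interferogram = sum(np.cos(2 * np.pi * k * opd) for k in wavenumbers)

spectrum = np.abs(np.fft.rfft(interferogram))
freq_axis = np.fft.rfftfreq(opd.size, d=opd[1] - opd[0])   # wavenumber axis (1/m)
# Peaks in `spectrum` appear at the two wavenumbers defined above.
```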
This way, spatially resolved luminescence spectroscopy on a 2-dimensional array of quantum dots is possible, and it is possible in the "far field" (Fig. 2d, exaggeratedly drawn in near vicinity to the sample), i.e. without having to employ the very slow scanning optical near-field microscopy for each spectroscopic picture; a scanning probe microscope is needed only once for the initial characterization of especially the geometry and number, and ideally also the exact positions, of the quantum troughs (often also named quantum dots - not quite correctly, because their extent is mostly much larger than the Fermi wavelength of the electrons in the material). This works especially because the luminescence of the quantum troughs makes them independent (incoherent) light sources with respect to each other, i.e. they do not interfere with each other and the single Airy diffraction patterns (for λ_light/2 <= a, "Mie"), respectively the dipole/multipole characteristics (for λ_light/2 >> a, Rayleigh), add scalarly. By means of this, a new kind of optical memory can be read out (Figs. 2d, 3), which is based on (2-dimensional) arrays of quantum troughs and thus allows extreme storage density (about 100 times higher than realized so far, taking into account typical quantum trough dimensions/sizes of about 5 nm (plus roughly 10 nm mean distance) and the nowadays customary structure widths in microelectronics fabrication of at best roughly 45 nm for conventional microprocessors, DRAMs for instance - not to mention CDs or DVDs). Moreover, quantum troughs can also be arranged in 3 dimensions (3-dimensional arrays), not just in 2 dimensions [1,1a]. Since the spectroscopy method of the present invention is also based on interferometry/phase contrast, quantum troughs in somewhat more deeply lying (buried) layers can also be read out - of course provided they were deposited layer by layer and each layer was geometrically characterized beforehand by means of scanning probe microscopy (e.g. AFM).
Writing" of such 3-dimensional arrays of quantum troughs would have to be realized in kind of shift register, as suggested in Fig. 3a,b for 2 dimensions, or scanning by means of scanning probe techniques (of course also only for 2 dimensions, for 3 dimensions by means of confocal interferometric methods). In the latter case, only very slow writing would be provided, which, however, could be accelerated by the many probe tips' millipede-concept [2], but still provides the fast,,areal", i.e. not scanning optical read-out of the quantum trough arrays according to the present invention. A further method to electrically contact the quantum troughs, would be a vertically arranged quantum wire array in the electrically insulating substrate layer, according to the manufacturing method in [3] or [4] (CNT-array), as indicated in Fig.3a/'ll.
The following Gedankenexperiment shall explain that fundamentally/theoretically, optical microscopy/spectroscopy beyond (below) the diffraction limit should be possible in principle not only in the near field, even if practically difficult to realize (the main issue in this present patent application): As widely known, optical near-field microscopy provides optical resolution far beyond/below the diffraction limit. There, the light coming from the sample (in reflection or transmission) is recorded using an extremely sensitive photo detector (photon counter) through a tiny (<< light wavelength) aperture which is scanned across the sample; this light is recorded as a function of the lateral position of the aperture on the sample, and thus a near-field optical image of the sample is recorded. This has been proven many times experimentally, also theoretically (e.g. since [5]), demonstrated many times, and lateral resolutions down to a few nm have been achieved - also using commercial instruments. If a suitable matrix of pixels were at hand, i.e. a pixel array with extremely small but extremely sensitive light detectors placed at very small distances to each other, such that the sample could be directly prepared on that pixel array, then equivalently a near-field image of the sample would be recorded without having to raster-scan the sample (patent claims 9-11). Of course, the pixel array has to be roughly of the same size as the sample itself, and each pixel detector would be a near-field optical sensor/detector replacing the usual scanning near-field optical probe tip. In a customary scanning near-field optical microscope, usually an extremely sharpened tip-end of a monomode glass fiber is used as a scanning aperture, which picks up, out of the near-field regime, the initially non-propagating, exponentially decaying electromagnetic field (not quite a "light wave") and transmits it via the monomode fiber over longer distances (O(m)) to the photo detector (e.g. a photomultiplier). That means the light is indeed propagating again after tunneling through the "layer thickness" of the aperture, even though extremely damped by the aperture (diameter < λ); this is widely realized that way, it is just necessary to use an extremely sensitive photon counter (photomultiplier) as a detector. As an explanation: the electromagnetic wave cannot exist/propagate in the aperture (diameter < λ) and in the near-field regime (distance from the sample < λ) itself; in other words, in those regimes a few λ away from the "antenna"/the scattering particle (the light-emitting "point" of the electromagnetic field), this electromagnetic field is a quasi-static field oscillating in time; but this field can in fact tunnel through those regimes (the amplitude of the electromagnetic field vector - more accurately speaking, the electric and the magnetic field vectors, which are out of phase - decays roughly exponentially while passing through the aperture); behind the aperture, i.e. many λ away, no matter whether in a suitable monomode fiber or in free space, the light wave can exist again and thus can also propagate. Now, if it were possible to bundle extremely many of such sharpened fiber tips, such that the bundle's cross-section is roughly of the same size as the sample's surface area, a non-scanning optical near-field microscope would be at hand, similar to patent claim 12, while of course an extremely sensitive photo detector would have to be placed at each fiber's back end.
Such fiber bundles are possible in principle; the problem here would be of a merely geometrical nature, because then the fiber length range (close to the probe tips) in which the fiber is (much) thinner than needed for undamped light propagation (of a certain wavelength, e.g. 633 nm) would be very long, the propagating intensity fraction would be damped away rather rapidly, and too little intensity would arrive at the detector - all this is of course a question of the detector's sensitivity and of stray-light intensities, i.e. theoretically (in principle) possible but practically - to the best of my knowledge - not feasible at present. Thus, a shadow-throwing microscope in the far field would be at hand, just with the difference that the propagating light is "mediated" via monomode fibers to the detectors. In the present invention it is now suggested to simply omit this (strongly damping) hypothetical fiber bundle and to directly record, using an extremely sensitive pixel array (for instance one as suggested in [6]), the admittedly extremely small intensity variations, which, however, still have to be modulated on top of the propagating intensity portion (non-linear optical effects, thus Fourier expansion in the wave vector). Of course, the near-field information then becomes drastically "entangled" by the interference, but should - in principle - still be present in the far field, or at least in the Fresnel regime, even though minutely small, at least because of the admittedly very small incoherent luminescence portions in the reflected/transmitted light. These incoherent intensity fractions emanating from the quantum troughs thus add scalarly - e.g. those of the quantum troughs luminescing incoherently with respect to each other in the scattered light; for λ_light/2 <= a or slightly > a (Mie) it is the Airy diffraction pattern of a small disc a, for λ_light/2 >> a (Rayleigh) it is the multipole radiation characteristics, or in approximation the Hertz dipole characteristics.
Quantum mechanical effects (light-dependent modulation of the current through quantum wires) should in principle (theoretically) be able to reach the sensitivity of photomultipliers, perhaps comparable to highly sensitive liquid-nitrogen-cooled CCD arrays, which can already detect single photons. The latter make this possible, traded off against the disadvantages of extensive cooling (which reduces thermal noise) as well as relatively large, thus more sensitive but slow, pixel detectors. A quantum electronic detector alone would not know these disadvantages. Especially, a very large (perhaps also hemispherical) light pixel array detector with extremely many and extremely small pixels would be suggested, which would hence in fact provide a very large effective numerical aperture for recording the sample's diffraction image (for λ_light/2 <= a or slightly > a [18], Mie). For λ_light/2 >> a, it should provide sufficient intensity profiling of the dipole radiation characteristics of the single scatterers/quantum dots.
Problem and conceptually suggested solution: Optical microscopy, and thus optical spectroscopy, is diffraction limited in its lateral (spatial) resolution; the diffraction limit amounts to about half the wavelength of the imaging light, additionally dependent only on the numerical aperture of the imaging objective (lens(es)). Scanning probe microscopies and electron microscopy overcome this (optical) resolution limit (electron microscopy of course only via the much shorter (de Broglie) wavelength of the imaging irradiation), but provide no optical spectroscopy (color) data. Near-field scanning optical microscopy is an exception, which can also provide optical spectroscopic data; however, it is extremely slow (one image on the order of minutes to an hour).
The concept invented here shall overcome these limitations: Optically spectroscopic images can be provided at the lateral resolution of a scanning probe microscope and at the same time at the spectroscopic resolution (color) of a light microscope, at a maximum picture rate comparable to that of (digital) video microscopy. The concept is based on using spatially highly resolved geometrical (topographical) pre-information from scanning probe microscopies or similar methods in order to back-calculate (to de-convolute, possibly by mere subtraction in Fourier space), mathematically using a computer in real time, the subsequent light microscope's color images of the same sample, "blurred" by the (lateral) diffraction limit. In the simplest case, at low spatial resolution near the lateral diffraction limit of λ/2, this should actually also work for actual real-space images [17].
However, in the concept primarily proposed here (Fig. 1), "merely" the diffraction image of the sample shall be recorded optically, shall then, by means of a computer, be compared with the Fourier-transformed, respectively scattering-theory-transformed (all sample quantum dots regarded as mutually incoherent dipoles), scanning probe image of the sample, and shall afterwards finally be back-transformed by a computer (not lens-optically). This concept is very useful for the read-out of a new kind of quantum trough memory cells of extreme storage density - the sample just has to be characterized topographically once (e.g. via scanning probe microscopies or electron microscopy or similar techniques) and can then be re-written (electronically) over and over again and rapidly read out optically over and over again. In this case of many quantum troughs luminescing independently of each other as mutually incoherent light sources, the mutual interference of these point light sources will not occur, and the many Airy diffraction pattern functions (for λ/2 <= a or slightly > a [18]), respectively the dipole/multipole radiation intensity profile characteristics (for λ/2 >> a), will add up as scalars. The de-convolution then reduces to a simple subtraction of these many Airy diffraction pattern profiles, respectively dipole profile characteristics, and their subsequent one-by-one Fourier back-transformation, respectively scattering-theoretical back-calculation, afterwards.
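The "subtraction" de-convolution for mutually incoherent quantum troughs can also be phrased as a small linear problem, since the recorded intensity is then a scalar sum of known single-trough profiles with unknown per-trough brightnesses; the following hedged sketch uses a Gaussian stand-in for the Airy/dipole profile and invented positions purely for illustration.

```python
# Hedged sketch: recover per-trough (per-colour) brightnesses from a measured
# intensity profile, given known trough positions/profiles (e.g. from AFM).
import numpy as np

n_pix, n_dots = 256, 3
detector_x = np.linspace(-1.0, 1.0, n_pix)             # normalised detector coordinate
dot_pos = np.array([-0.1, 0.0, 0.12])                   # assumed known trough positions

def single_profile(x, x0, width=0.3):
    """Stand-in for the known Airy/dipole intensity profile of one trough."""
    return np.exp(-((x - x0) / width) ** 2)

A = np.stack([single_profile(detector_x, x0) for x0 in dot_pos], axis=1)   # (n_pix, n_dots)

true_brightness = np.array([0.2, 1.0, 0.6])              # per-trough luminescence (one colour)
measured = A @ true_brightness + 0.01 * np.random.default_rng(1).standard_normal(n_pix)

recovered, *_ = np.linalg.lstsq(A, measured, rcond=None)
print(recovered)      # approximately the per-trough brightnesses of this colour channel
```

Per colour channel, solving this least-squares system at the known trough positions plays the role of the subtraction-and-back-transformation described above.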
State of the art: Traditional interference/phase-contrast microscopy provides, vertically, the very high spatial resolution of interferometry (i.e. far in the sub-nm regime), however with a lateral spatial resolution which is diffraction limited, i.e. roughly λ/2.
An interference-microscopical method aiming at a lateral optical resolution beyond/below the diffraction limit is proposed in EP0776457B1 [7] and references therein. A further optical microscopy method, which however is based on a special kind of laterally varying fluorescence excitation, in which the optical diffraction limit is circumvented, is described in T. Klar et al. [8], as well as its technical basic concepts in [9]. The latter does not affect the present invention, for one because in my understanding it [8,9] is also a slow (line-/pixel-wise) scanning method, and for another because, in my understanding, it is based on a locally defined and varying fluorescence excitation (slope steepness of the fluorescence excitation as a function of position (x,y) "<" diffraction limit) - it is thus related to scanning near-field optical microscopy, since the latter scans a tiny light spot (< λ/2 of the fluorescing - emanating - wavelength, the diffraction limit), while in Hell et al. [8,9] (by pulsed illumination) a steep (< λ/2 of the fluorescing/emanating light, < diffraction limit) fluorescence-exciting light intensity slope is scanned [across the sample] (remark: if I understood [8,9] correctly, the incident exciting laser wavelength there is probably significantly smaller than the fluorescing outgoing wavelength), although the data recording is performed in the far field (just as is mostly the case for near-field scanning optical microscopy, NSOM, as well).
Nevertheless, it is a sensational progress if, so to speak, optical near-field microscopy can be realized omitting the near-field probe tip. The first imaging method [7], apparently also based on complicated mathematical back-calculation, could perhaps, in combination with the presently invented concept, provide a significant improvement of the microscopic image quality. However, it is unnecessary for the spectroscopy concept proposed here. A concept similar to the presently invented one is proposed in [17]; there, however, the aberrations and limitations of the lens and aperture system shall be reduced via their point spread function by de-convoluting the real-space image (whereas the present invention concerns the de-convolution of the Fourier/diffracted-space image), but also using highly resolved scanning probe microscopy data.
In my understanding, M. Gustafsson is closest to truly "beating" the diffraction limit in optical far-field fluorescence microscopy using a non-scanning wide-area technique and achieves sensational image clarity. Gustafsson projects a spatially structured illumination onto the sample, where this (periodic) structuring is of course diffraction limited, but is actually optimized down to that very diffraction limit, thus having a spatial frequency of 1/(λ/2). Now he demonstrates a resolution at least 2 times better than the diffraction limit of about λ/2 by regular optical fluorescence microscopy, where he additionally reconstructs the image by back-calculation of the Moire patterns which form by "mixing" of the spatial frequencies of the structured illumination with the sample's actual spatial frequencies representing the actual sample details of interest. Further, he claims that, when saturating the fluorescent dye molecules and thus also obtaining higher harmonics of the incident and/or fluorescent light (from multi-photon effects), even much higher resolution can be achieved.
However, in my understanding, the "linear case" for theoretically achieving a factor of 2 better than the regular diffraction limit would work only for periodic sample structures, not for statistically scattered sample details. Thus, in my understanding, already the demonstrated resolution of about 2 times better than the diffraction limit as defined by the incoming exciting (laser) light might be due to higher-harmonic effects, since: the diffraction-limited structured illumination in the linear case would have a sinusoidal structure. If this spatial frequency were to "mix" with the sample's spatial frequencies, the (fluorescent) light structure (structured by the sample details) would have to possess smaller/shorter periodicities than the sinusoidal structure at the diffraction limit, since these 2 frequencies can only "mix" if non-linear effects are involved; i.e. the "light structure" directly emanating from the sample has to contain higher harmonics already, otherwise these 2 spatial frequencies would not mix; they would only add linearly as two exactly equal spatial frequencies, resulting in another pattern with the same periodicity and cycling phases, but would not contain any sum or difference frequencies. Thus, in my understanding, the increased resolution might already be due to shorter-wavelength components, in particular the first harmonic possessing half the wavelength. But I admit that I do not fully understand the very complex mathematical transformation here.
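The purely mathematical part of this argument - that two patterns which merely add produce no sum or difference spatial frequencies, whereas a multiplicative (non-linear) coupling does - can be checked numerically; the frequencies below are arbitrary assumed values and the snippet makes no claim about the physics of fluorescence itself.

```python
# Hedged sketch: additive superposition vs multiplicative "mixing" of two
# spatial frequencies. Only the product contains Moire (sum/difference) terms.
import numpy as np

x = np.linspace(0, 1, 4096, endpoint=False)
f_illum, f_sample = 200.0, 230.0                     # cycles per unit length (assumed)

illum  = np.cos(2 * np.pi * f_illum  * x)
sample = np.cos(2 * np.pi * f_sample * x)

spec_linear = np.abs(np.fft.rfft(illum + sample))    # peaks only at 200 and 230
spec_mixed  = np.abs(np.fft.rfft(illum * sample))    # peaks at |230-200| = 30 and 430
```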
FTIR (Fourier transform infrared spectroscopy) is a widely used and long-established technique for realizing particularly fast spectroscopy in the infrared regime [19, e.g. Wikipedia].
Fabrication of 2-dimensional and also 3-dimensional (ordered) arrays of quantum dots / quantum troughs, for instance using the Langmuir-Blodgett / Langmuir-Schaefer technique, has been demonstrated for instance in [1a, 12, 12a-c, 22].
Solution with descriptive explanations of the patent claims: Referring to major patent claims 1 and 2: The present invention relates to a basic concept for a computer-aided optical color "non-scanning" microscopy (spatially resolved optical spectroscopy) method which should enable a lateral resolution better than the usual lateral diffraction limit, particularly under certain conditions, especially regarding the degree of mutual (in-)coherence of light portions emitted from small sample details and/or in particular if the sample topography/geometry (but not the color details) is known in advance, e.g. from scanning probe microscopy. The lateral diffraction limit is thus proposed to be circumvented/extended/overcome for an optical color image by quantitatively recording the diffracted (color) image (i.e. its 2-dimensional intensity profile/map) of a small sample (e.g. in the simplest case a double slit, or two minute opaque discs, or a grating with a separation/lattice constant smaller than λ/2) directly or in the focal plane of a microscope's objective lens, using a (color-sensitive) high-resolution light pixel sensor array (e.g. a CCD camera), and then transforming this diffracted (color) image back to a real-space color image using numerical computer software, instead of allowing the real-space image to form in the image plane in the far field of a microscope's objective lens. (It is, however, remarked that this conventional optical real-space image can of course nevertheless be recorded simultaneously for obtaining additional information about the sample structure.) Hereby, depending on the distance from the sample at which this diffracted color image has been recorded, the appropriate equations have to be integrated into this numerical back-transforming/back-computing software, i.e. the Fourier transformation in the Fraunhofer (far-field / plane-wave) approximation and the Fresnel equations in the spherical-wave approximation closer to the sample, respectively. For very small, especially nanometric metallic sample details, scattering theory and non-linear optics might have to be considered as well. The smaller the sample details are and the more of them are found within sample area elements of roughly (λ/2)² beyond the lateral diffraction limit, the more topographical/geometrical pre-information about these sample details would obviously be necessary for the back-computation of the diffracted image to be successful. In one preferred embodiment, the spectroscopy part is performed using the FTIR technique.
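For reference, the two propagation regimes named above correspond to the standard scalar-diffraction integrals written below (textbook forms, not reproduced from the patent text), with U(x',y',0) the field directly behind the sample, z the recording distance and k = 2π/λ:

```latex
% Standard scalar-diffraction expressions for the Fresnel (spherical-wave
% approximation) and Fraunhofer (far-field) regimes; not taken from the patent.
\[
  U_{\mathrm{Fresnel}}(x,y,z) \approx \frac{e^{ikz}}{i\lambda z}
  \iint U(x',y',0)\,
  \exp\!\Big\{\tfrac{ik}{2z}\big[(x-x')^2+(y-y')^2\big]\Big\}\,dx'\,dy'
\]
\[
  U_{\mathrm{Fraunhofer}}(x,y,z) \approx \frac{e^{ikz}\,e^{\frac{ik}{2z}(x^2+y^2)}}{i\lambda z}
  \iint U(x',y',0)\,
  \exp\!\Big\{-i\tfrac{2\pi}{\lambda z}\,(x x'+y y')\Big\}\,dx'\,dy'
\]
```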
Background of claims 1-3 is further largely described and illustrated in the introduction.
Referring to patent claim 3: Microscopy/spectroscopy on "quantum dots" (here representing any kind of nm-scale luminescence, fluorescence or phosphorescence objects, i.e. natural or artificial atoms/molecules/nano-particles) beyond/below the diffraction limit by means of optimized optical video microscopy based on the principles of major claims 1 and 2, in the Fresnel regime or even in the far field: Video microscopy according to Fig. 1 (here interferometry-supported, which is however not essentially necessary for the present functional principle) could provide a direct image or diffracted image (scattering in the case of λ/2 >> a) of the array of "quantum dots" at a lateral resolution below (beyond) the diffraction limit. Especially if all lenses (except possibly one large diverging lens in front of the CCD detector) and apertures are omitted, this here invented interferometric (Michelson/Linnik-type) imaging procedure should provide extremely high optical resolution below the diffraction limit; however, then 1. a very high light intensity is necessary and 2. a very large CCD array - which thus provides a very high numerical aperture - with extremely small and extremely many pixels is needed, whose light sensitivity must be extremely high; for instance a liquid nitrogen cooled conventional CCD camera would be an option, but especially also the "artificial retina" [high density light pixel array] proposed in [6]. Ideally, the detector pixel array is, however, very much larger than the sample, with extremely many, extremely small pixels, and the diffraction image of the sample is imaged in the Fresnel regime taking account of higher diffraction orders, i.e. of smaller structural details of the sample, i.e. imaging with very high effective numerical aperture. The real space image is then obtained essentially by (numerical) Fourier back-transformation, possibly (numerically) corrected for the spherical wave approximation in the Fresnel regime, all this in the case of λ/2 <= a or slightly > a [18]. In the case λ/2 >> a, the quantum dots' Airy diffraction pattern of a small disc with diameter a is replaced by its approximate dipole radiation characteristics; then the de-convolution is no longer a Fourier back-transformation but a (numerical) scattering-theoretical back-calculation to the dipole/multipole moment p ≈ q·a of the quantum dots.
In case the resolution is not sufficient to image the quantum dots directly, due to unavoidable lens aberrations (even for Fresnel lenses [14]) and due to finite lens and aperture diameters, then the diffraction "image" from the CCD sensor has to be de-convoluted mathematically, i.e. the "blurred" interfering superimpositions of the many Airy-disc diffracted intensity profiles (in the ideal case, if the sample structures all possess the geometry of little discs of diameter a; for λ/2 <= a or slightly > a [18], respectively dipole radiation characteristics for λ/2 >> a) are computed out / subtracted out; Fig. 1b, 2a and 2b. The image information is always contained in the quantitative diffracted intensity profile of an arbitrary sample image, even in the far field (hypothesis I, illustrated in Fig. 2d, exaggeratedly drawn for the regime "close" to the sample); the only question is whether one can always back-calculate/de-convolute to the (correct) sample geometry, which, for the case of many unknowns (various structural details, non-periodic distances, all beyond the diffraction limit) will become arbitrarily (very) complicated, or respectively the detector (pixel array) has to become infinitely large in order to quantitatively measure the Airy diffraction patterns respectively the dipole radiation profile characteristics. This back-calculating/de-convoluting to the sample's geometry is of course more simply possible either if the pixel detector array were infinitely large (Fourier back-transformation of the diffracted "image" respectively scattering-theoretical back-calculation of the scattered "image"), or if the sample consists of only 2 "Airy" sample discs of known diameter and position (diameter and distance non-trivially smaller than the diffraction limit); having 3 such "Airy" sample discs makes the situation already much more complicated, etc. Here, it is particularly intended to apply the presently invented technique to periodic 2-dimensional arrays of identical quantum dots of diameter a, which thus are luminescing independently (incoherently) of each other, so that their Airy diffraction patterns (for λ/2 <= a or slightly > a [18]) respectively their dipole characteristics for λ/2 >> a add only incoherently (scalarly) and do not mutually interfere with each other. Hence, the de-convolution reduces to a subtraction of the quantum dots' Airy diffraction pattern intensity profiles respectively dipole characteristics from the blurred diffracted/scattered image at the positions of the quantum troughs. By Fourier back-transformation, for various wavelengths, of the separate differences, the real space color image of the separate quantum troughs is obtained, in the case λ/2 <= a or slightly > a [18]. In the case λ/2 >> a, the Airy diffraction patterns of the quantum dots are replaced by the approximate dipole characteristics; then the de-convolution is no longer a Fourier back-transformation, but the scattering-theoretical back-calculation to the dipole moment p ≈ q·a of the quantum dots.
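The reduction of the de-convolution to a subtraction (or, equivalently, a linear fit) for mutually incoherent emitters can be illustrated with a small numerical sketch; the Airy intensity profile used below is the textbook (2·J1(x)/x)² pattern of a circular aperture, and all positions, sizes and brightnesses are invented for the illustration, not taken from the patent text:

```python
# Sketch: two incoherent emitters closer than the blur width; since their intensities add
# as scalars, the per-emitter brightness follows from a linear least-squares fit against
# the known (e.g. scanning-probe-determined) positions.  All numbers are illustrative.
import numpy as np
from scipy.special import j1

def airy_intensity(r, scale):
    x = np.where(r == 0.0, 1e-12, r) / scale
    return (2.0 * j1(x) / x) ** 2

n = 256
yy, xx = np.mgrid[0:n, 0:n]
positions = [(128, 120), (128, 136)]        # (row, col) of the two emitters, assumed known
true_brightness = [1.0, 0.6]                # the unknown "colour"/luminescence information

blurred = sum(b * airy_intensity(np.hypot(xx - c, yy - r), scale=8.0)
              for (r, c), b in zip(positions, true_brightness))   # incoherent (scalar) sum

profiles = [airy_intensity(np.hypot(xx - c, yy - r), scale=8.0).ravel()
            for r, c in positions]
A = np.stack(profiles, axis=1)
recovered, *_ = np.linalg.lstsq(A, blurred.ravel(), rcond=None)
print(np.round(recovered, 3))               # ~[1.0, 0.6]: per-emitter brightness recovered
```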
If the geometrical/topographic structure of the "quantum dot" array is known in advance, e.g. from scanning force microscopy, then the de-convolution/subtraction is in particular also possible for the 3 different color components (Fig. 2c), by means of which the spectroscopic information about the quantum trough luminescence (representing any kind of luminescence, fluorescence, phosphorescence on the nm-scale) is then, in principle (theoretically), available at the lateral resolution of the scanning force microscope providing the topographical pre-information, for each new spectroscopic situation again using the fast far field optical (here invented) procedure, without having to employ a slow (scanning) near field method each time. For highest accuracy, however, non-linear optical effects in light scattering on nanometric (especially metallic) scatterers have to be considered, since during scattering of intense electromagnetic radiation at a nanometric "antenna" (like for instance a metallic nano-particle), there are - besides the dominating incoming wavelength - also many other wavelengths in the scattered light, especially shorter ones, which would allow higher resolution even within the diffraction limit as expressed with respect to the incoming wavelength.
Then, however, the de-convolution again becomes much more complicated, since for every sample detail (as known from scanning probe microscopy), i.e. practically for each nano-particle, complicated scattering theory has to be applied and computed into the complete image.
By means of a phase contrast method (interferometry), which in principle (theoretically) provides a vertical (DC / wide bandwidth) resolution of about a 10th of an Ångström (or, for modulation methods, correspondingly better, scaling with the inverse square root of the detection bandwidth in Hertz), also 3-dimensional arrays of quantum troughs can be (luminescence-)spectroscopically measured - i.e. can be read out -, since the phase of the (laser) light wave can be "moved through" vertically (through the sample) at this high spatial vertical resolution.
Thus, a much higher storage density is reachable than with just 2-dimensional arrays of quantum troughs, which are the only ones accessible to the highest-resolving scanning probe methods; the 3-dimensional array can of course also be characterized topographically by scanning probes, namely each layer once, while the 3-dimensional array is built up layer by layer.
The here invented concept is fundamentally based on, at best aperture-lessly, throwing a "shadow" [absorption chart as a function of position on the sample] of a sample, after illumination (in reflection or transmission) by means of a possibly expanded/shaped laser beam, onto a pixel detector array, which either possesses a sufficiently high pixel density (with sufficiently small pixels) or, respectively, is large enough (with sufficiently many pixels) that, by means of a single diverging lens, the expanded "shadow" can be imaged interferometrically (i.e. with phase contrast) after Fourier back-transformation respectively scattering-theoretical back-calculation with an optical lateral resolution far below the "λ/2" diffraction limit (quantum dots with 5 nm diameter and roughly 10 nm distance).
The interferometry is here - primarily for 2-dimensional imaging - only needed to cancel out the incoming (illuminating / luminescence-exciting) light in the dark field and to filter out (i.e. extract) the small signals of the incoherently luminescing quantum troughs. For this, besides the smallest possible pixel size, the high light sensitivity of the detector pixels is essential, and therefore it is also essential that they allow as high a dynamic range as possible, in case the dynamic range of the light falling onto the detector cannot be sufficiently compensated interferometrically, i.e. in case it is not possible to measure sufficiently on a "dark fringe".
Fig. 1: The Gaussian intensity profile of a "color laser" (red-green-blue or tunable; nowadays there is already white "laser light") - depending on sample size via a beam expander or compressor (or not) - falls onto a beam splitter (polarizing or not), from there for one path onto a mobile mirror of adjustable reflectivity (e.g. by means of an adjustable absorber in front of it) and for the other path onto the sample interferometry cavity. The latter can be formed between an objective lens (partly reflective at the interface side towards the sample) of high numerical aperture [of course] and the reflecting sample, or as well between the back-reflecting end of a very short monomode fiber and the reflecting sample. A third possibility is omitting the objective or the glass fiber completely and simply forming an interferometer cavity (tunable or of suitable bandwidth in the relevant wavelength range) by means of a very thinly "silvered", i.e. "mirrored" (not necessarily using silver coatings; reflectivity adjusted such that it is comparable to the expected sample reflectivity) plate whose normal is parallel to that of the sample, in close distance to the sample. This way, almost all apertures except the sample size itself would be avoided; however, the laser then has to illuminate the sample with very high intensity on its whole surface area, which in turn causes an increased portion of unwanted reflections at possible other interfaces; furthermore, the detector then has to become very large, in order to take account of higher diffraction orders, respectively in order to evaluate the 0th diffraction order quantitatively more accurately (for λ/2 <= a or slightly > a [18]), respectively in order to quantitatively measure the superimposed dipole profile characteristics sufficiently precisely (for λ/2 >> a). For the case of using fiber optic interferometry, this part of the here invented setup (the fiber optical interferometry component for the mere measurement of smallest vertical distances, not for the image forming microscopy) is based on the invention in [10].
Ideally, in this embodiment, the sample plane is exactly the focal plane of the objective lens focusing the laser beam, so that the light spot focused onto the sample (the focus can be relatively small, but does not have to be, since a larger spot means a larger imaged sample area) is reflected exactly back into the incoming beam path (i.e. into itself). In the case of illumination by means of a monomode fiber whose end surface is supposed to serve as a (partly reflective) mirror for the (interferometry) cavity, the fiber ideally ends in a rod lens (graded index lens) whose focal plane again is exactly the sample plane.
In the proposed setup in Fig. 1, three light rays interfere at the pixel detector; however, the 3rd ray, the reference ray coming from the mobile mirror, can basically be omitted, as it only serves for the optimal adjustment of the detector signal onto a dark fringe when averaged over the detector / image area, which is necessary for a good signal to noise ratio. This "dark fringe" can also be adjusted directly by accurate adjustment of the distance/thickness (for the corresponding, e.g. 3, wavelengths/colors) of the upper, partly mirrored surface of the interferometer cavity, which is formed by the sample surface and the exit face of the objective lens / glass fiber, as well as by the accurate adjustment of the reflected intensities (sample and objective / glass fiber exit face). This, however, is very complicated and elaborate - and of course has to be readjusted for different sample reflectivities -, in particular if this interferometer cavity is supposed to be filled with a liquid (e.g. for increasing the numerical aperture and/or for applications in biology), and thus, in Fig. 1 the (3rd) reference beam is introduced. Its possible but very elaborate elimination would have the significant advantage that light of small coherence length (only a few interferometer cavity widths) could be used instead of a coherence length of the order of the length scale of the whole beam path (mirror - beam splitter - sample - detector), e.g. an LED instead of a laser diode as a light source, by means of which stray interferences of the useful light with unwanted reflections would be largely reduced, which becomes the more important, the more the signal to noise ratio, i.e. ultimately the resolution, has to be optimized for the corresponding application. Further, relatively short coherence lengths allow the application of the FTIR method as the spectroscopy technique.
By suitable usage of λ/4 waveplates or equivalent, the polarizations are adjusted such that only the three desired laser beams interfere at the pixel array detector and stray reflections are largely eliminated. By means of the mirror position, the relative phases of the laser beams are adjusted such that the measurement is performed almost 100% on a dark fringe (averaged across the sample area); the only photons that arrive at the CCD array / pixel detector are the minute deviations of the light beam intensity profile from the incoming laser's ideal Gaussian intensity profile caused by the sample structure (Fig. 2d, exaggeratedly drawn for the regime near the sample). For the case of the fiber optic version it is remarked that 1. the effective inner diameter of a monomode fiber for 633 nm is about 4 µm, i.e. roughly the same size as the expected sample size (for 5 nm quantum dots at 10 nm distance this would already be almost 100 kbits, even if just one quantum level is used) and 2. this monomode fiber must be very short (<< O(1 m)), since in an ideal monomode fiber deviations from the ideal Gaussian profile are rapidly damped. For large memory cell arrays, e.g. 1 cm² - which corresponds already, even if realized only in 2 dimensions, to a storage amount of about 400 Gbits (at (15 nm)² per quantum trough and only one used quantum level) - one has to scan line-wise (laterally in 4 µm steps), or a large objective lens or multimode fiber of larger diameter has to be used, or just the aperture-less shadow-throwing technique mentioned above (patent claims 9-12) using a very large detector pixel array - e.g. the "artificial retina" (light pixel detector array of highest density) of 1 cm² from [6].
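As a quick plausibility check of the storage-density figures quoted above (a back-of-the-envelope sketch, not part of the original text), the following few lines reproduce the "almost 100 kbits within the 4 µm fiber field" and "about 400 Gbits per cm² at (15 nm)² per quantum trough" estimates:

```python
# Back-of-the-envelope check of the storage figures quoted in the text
# (one bit per quantum trough, only one quantum level used).
import math

fiber_diameter = 4e-6                   # m, effective monomode fiber core for 633 nm
dot_pitch = 10e-9                       # m, 5 nm dots at roughly 10 nm spacing
fiber_area = math.pi * (fiber_diameter / 2) ** 2
bits_in_fiber_field = fiber_area / dot_pitch ** 2
print(f"{bits_in_fiber_field / 1e3:.0f} kbits")   # ~126 kbits, i.e. of order 100 kbits

cell_pitch = 15e-9                      # m, (15 nm)^2 per quantum trough
array_area = 1e-4                       # m^2, i.e. 1 cm^2
bits_per_cm2 = array_area / cell_pitch ** 2
print(f"{bits_per_cm2 / 1e9:.0f} Gbits")          # ~444 Gbits, i.e. about 400 Gbits
```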
If not measuring on a "perfect" dark fringe, i.e. when using short coherence lengths of the illuminating light, then because of the Gaussian intensity profile of the laser beam a larger signal results in the middle of the beam than at the "edges" (quotation marks because a Gaussian beam does not really have an edge) of the beam. This has to be additionally compensated, e.g. when using a CCD array by a corresponding bias voltage of the single pixels, i.e. compensation of the Gaussian intensity profile by corresponding pre-adjustment of the sensitivity of the pixels, increasing from the middle of the beam towards the outer end of the beam. Correspondingly profiled absorber plates can also be envisioned; however, they would certainly introduce disturbances (reflections, phase shifts) into the system and thus would further reduce the already very small useful signal. A further possibility for this compensation lies in milling the diverging lens (refractive lens or "suitable" Fresnel lens [14]) behind the pinhole into such a shape that it converts the known Gaussian beam profile exactly into a homogeneous light intensity distribution at the CCD detector, i.e. such that it perfectly compensates the Gaussian beam profile - of course the small intensity variations of the useful signal remain contained, the more so, the larger the aperture (the pinhole) in Fig. 1 is. Reshaping the Gaussian beam profile into a homogeneous distribution will, however, introduce new laterally varying phase shifts to be corrected, e.g. by means of a "variable" wave-plate with laterally varying retardation, similar to the above mentioned tailored absorber.
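A minimal sketch of the pixel-wise compensation idea (assumed beam waist, grid and clipping level; nothing here is prescribed by the text): dividing the recorded frame by the known Gaussian illumination profile, which is equivalent to pre-setting each pixel's gain to the inverse of that profile, leaves only the small sample-induced deviations, provided the gain is bounded at the dim beam edges so that noise is not amplified without limit.

```python
# Pixel-wise compensation of the Gaussian illumination envelope (illustrative numbers only).
import numpy as np

def gaussian_profile(n=512, waist=150.0):
    y, x = np.mgrid[0:n, 0:n] - n / 2.0
    return np.exp(-2.0 * (x ** 2 + y ** 2) / waist ** 2)    # peak normalised to 1

profile = gaussian_profile()
gain = 1.0 / np.clip(profile, 0.05, None)        # bounded per-pixel gain map

rng = np.random.default_rng(0)
useful = 1e-3 * rng.standard_normal(profile.shape)     # tiny sample-induced deviations
recorded = profile * (1.0 + useful)                    # Gaussian envelope times (1 + signal)

flattened = recorded * gain
inside = profile > 0.05
print(np.std(flattened[inside] - 1.0))           # ~1e-3: only the useful deviations remain
```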
The here primarily proposed method to achieve spectroscopic (color) resolution is simply based on the technology of customary high quality color video cameras, namely to split the useful signal via a prism (or a "suitable" [14] optical grating) into the spectral colors and to record them with e.g. 3 or more pixel detectors (e.g. highly sensitive black-and-white CCD cameras, optimized for the corresponding wavelength regime). Fig. 1 - inset.
Writing" onto such 2-3-dimensional arrays of quantum troughs would have to be of the kind of a 2 dimensional shift register, as suggested in Fig. 3, there however drawn for just 1 dimension, or scanned by means of scanning probe techniques. In the latter case, one would have only relatively slow writing, which, however, can be accelerated by the millipede concept using many probe tips [2], but still the here invented fast large area optical (simultaneous) read-out of the quantum trough arrays is available.
The presently invented technical principle is thus based on the mathematical back-computation ("de-convolution") of the "blurred" microscopic far field diffraction image (possibly only a Fresnel-regime image - i.e. the intermediate range between near and far field, in which propagating spherical waves are present, but the plane wave approximation as in the far field is not yet applicable - also for the case of using "suitable" Fresnel lenses [14]), which - because of incoherent light portions - in its diffracted intensity distribution must also contain the information about structural details below/beyond the diffraction limit of λ/2 (Fig. 2d), by using partial or complete additional data on the sample's geometry/topography, which was obtained beforehand by other highly resolving microscopy methods. Initially it appears of course trivial, or respectively only of little use, to derive again - by means of a complicated procedure - an already completely known sample topography (e.g. from atomically resolving scanning force microscopy) from the blurred optical far field image, but the light image of course contains much further spectroscopic-optical information about the sample properties, i.e. the colors, which are of course not contained in the force microscopy image.
(Remark: There are of course also highly spatially resolved scanning probe spectroscopies, but these are firstly very slow, e.g. scanning optical near field microscopy/spectroscopy; moreover, these in general deliver additional, completely different spectroscopic information, e.g. electrical or magnetic as well as elasticity-dependent effects.) And furthermore, the presently invented method will of course, due to the omission of the time-consuming raster-scanning procedure, primarily be much faster, i.e. on the time scale of digital video microscopy.
The here invented apparative method for the highest spatially resolved (<< λ/2) fast spectroscopy on an array of luminescing quantum troughs is based firstly on the principle of a highly resolving (laser) interference microscope (Michelson-/Linnik- or Fizeau-type respectively, also in a fiber optics version) - by means of which the incoming light is eliminated at the detector in the dark field - under avoidance of lenses/apertures as far as possible; secondly on a highest resolving pixel array detector (e.g. a CCD camera) with extremely small and extremely many pixels - by means of which a very large effective numerical aperture for recording the diffraction image is provided, respectively the dipole characteristics of the quantum dots can be measured sufficiently quantitatively -; as well as thirdly on a fast digital image acquisition and image processing procedure which - at video data rate - subtracts the sample's topography image known from scanning probe microscopy (more precisely its Fourier transform, i.e. its diffraction image of the interference of plane or spherical waves respectively - multipole expansion) "online" from the blurred optical (and e.g. split into the 3 fundamental colors) far field light image and then Fourier back-transforms these 3 images. In the Fresnel regime (intermediate range between near field and far field, at and up to a distance of about 100λ from the sample, scatterer size of order λ), which gives a stronger signal for the present invention's concept, spherical waves have to be regarded, and thus higher terms in the multipole expansion have to be included as well, not just plane waves. These, for instance, 3 images obtained in this way then contain the correct color distribution of the sample at the spatial resolution of the supporting scanning probe microscopy (with which the sample geometry has to be determined only once) and at the time resolution and spectroscopic resolution of the here invented optimized video microscopy. Finally, 4.
the de-convolution is here relatively simply solvable in the form of a subtraction [in Fourier space], since the quantum troughs should be luminescing mutually independently and thus should represent mutually incoherent point light sources; hence, the many Airy diffraction pattern intensity profiles (for λ/2 <= a or slightly > a [18], "Mie") respectively the dipole radiation characteristics (for λ/2 >> a, "Rayleigh") should simply add up scalarly and should not interfere mutually. The Airy diffraction pattern intensity profiles respectively the dipole characteristics of the single quantum dots are "simply" subtracted from the "blurred" optical diffraction/scattered image for the corresponding incoming wavelength at those positions determined by scanning force microscopy, and what is left over is the color information, however still in the form of Airy diffraction pattern profiles respectively dipole radiation characteristics of the quantum troughs. Every single quantum dot (its luminescence) can then be obtained in real space by Fourier back-transformation for the corresponding luminescence wavelengths for the case λ/2 <= a or slightly > a [18] ("Mie"), possibly (numerically) corrected for the spherical wave approximation in the Fresnel regime. For the case λ/2 >> a, the Hertz dipole/multipole characteristics have to be back-calculated to the scatterer size and form. Even in the case that the quantum dots were coherent (e.g. phase-conservingly reflecting) light sources, there should be a small fraction of incoherent light intensity, which then also adds up scalarly, even though the major portion of the (reflected) intensity interferes.
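Purely as a schematic sketch of the processing chain just described (all arrays below are placeholders; in practice the "known topography diffraction" would be computed from the scanning-probe image for each illumination wavelength, and the back-transformation would carry the Fresnel correction discussed above):

```python
# Schematic per-colour pipeline: subtract the pre-computed topography diffraction pattern
# from the measured diffraction intensity for that colour, then Fourier back-transform.
# Placeholder data only; this is not the patented algorithm in any validated form.
import numpy as np

def back_transform_colour(measured_diffraction, known_topography_diffraction):
    difference = measured_diffraction - known_topography_diffraction
    return np.abs(np.fft.ifft2(np.fft.ifftshift(difference)))

n = 256
rng = np.random.default_rng(1)
measured = {c: rng.random((n, n)) for c in ("red", "green", "blue")}     # detector frames
topography = {c: rng.random((n, n)) for c in ("red", "green", "blue")}   # pre-computed maps

colour_images = {c: back_transform_colour(measured[c], topography[c]) for c in measured}
print({c: img.shape for c, img in colour_images.items()})
```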
In principle (theoretically speaking) this method should also work with usual digital light microscopy; the interference microscopy, as well as the possible additional digital calibration of the optical apparatus by evaluation of the point spread function of the image-forming system for the correction of lens/aperture aberrations, can be employed as an option to increase the expected very small useful signal/contrast [17]. For all variants of the present invention deviating from usual video microscopy, the aim is always to omit as many lenses/apertures as possible; this holds even for the ("suitable" [14]) Fresnel lenses.
Referring to patent claim 4: Principle as in patent claims 1-3, further specified in that the light source is a tunable laser. The interferometer cavity above the sample is adjusted, in particular with respect to its width and possibly also its reflectivity (e.g. by means of electrically controllable/polarizable liquid crystal coatings), simultaneously with the tuning of the laser wavelength; the wavelength dependence of the detector pixels is calibrated and taken account of in the mathematical analysis. The Faraday isolator is either also simultaneously wavelength-tuned (for highest accuracy) or is a broad band isolator. This method provides highest accuracy (signal to noise ratio), but sacrifices speed of the achievable image rate, which would not be too much of a problem for the read-out of quantum troughs as memory cells, since always only one (total) image (of all quantum troughs) is necessary (i.e. it would still be fast enough). Sacrificing speed, however, would be a significant disadvantage when recording fluorescence microscopical movies on biological samples in vitro (see claim 13), unless, in this case, the image rate is limited to the corresponding extent anyhow by the (sometimes) necessarily simultaneously running, much slower scanning probe microscopy; and unless anyway only 1 or 2 or a few known fluorescence wavelengths are observed simultaneously.
Referring to patent claim 5: Principle as in patent claims 1-3, further specified in that the light source is a white (pulsed) laser.
The Faraday isolator in this case necessarily is a broad band isolator. The pixel array detector is in this case that of a commercial color video camera, either with a color CCD array, or the white light is for instance split by a prism into 3 or more partial beams and recorded on several (wavelength calibrated/tuned) pixel array detectors. This method provides the highest image rate, but sacrifices signal to noise ratio, particularly because the interferometer cavity above the sample can only be adjusted for one wavelength. This interferometer cavity, however, does possess a certain bandwidth, which is only necessary if the luminescence is to be imaged, i.e. if one wants to obtain a color image. Particularly for the fluorescence microscopy which is relevant in "in vitro" biology (claim 13), in principle (theoretically), the very narrow bandwidth of the interferometer cavity should be sufficient, since there only 2, or in the case of some few fluorescence molecules only a few, wavelengths play a role, which firstly are relatively adjacent to each other and secondly can be selected at the detector / several detectors by filters; i.e. specific (perhaps also differential/non-linear) analysis of the light absorbed by the fluorescence molecules with respect to the light emitted by them.
Referring to patent claim 6: As in patent claims 1-3, further specified in that the complete beam path is built up in a fiber optical manner. The whole (Michelson-/Fizeau-) interferometer is then built up merely fiber-optically; then reflections at interfaces/transitions are further significantly minimized, furthermore there are no beam paths going through free space (i.e. no stray reflections/refraction by air currents) and thus the signal to noise ratio of the apparatus is further improved. All necessary phase shift adjustments, e.g. for the 3rd reference beam, can, as in [10], be realized by stress birefringence of the fiber, i.e. by bending of the fiber in a plane at an appropriate angle to the polarization plane of the light; the λ/4 waveplates for the 90° polarization rotation (twice 45° polarization rotation in the same direction on the light path back and forth [through the bent fiber portion]) can as well be realized by bending of the fiber in a suitable plane as in [10]. The polarizing beam splitter will possibly still be a more or less customary polarizing glass beam splitter cube; however, at its faces the 4 fibers will be connected via an integrated system (e.g. commercially available from [11]), which eliminates reflections and free space beam paths. By using rod lenses (graded index lenses) in the coupling optics to and from the monomode fibers, these rod lenses could be directly glued to the beam splitter cube (by means of index matching epoxy - [11a]), which practically eliminates all reflections at interfaces, and unguided beams are reduced to the (small) space within the beam splitter cube. In the future there will certainly be beam splitter cubes with fiber optic connectors realized on a single chip.
Such a system would perhaps have a worse signal to noise ratio (which would still have to be evaluated in the case of realization by means of integrated optics - here by usage of "suitable" [14] Fresnel lenses, while in read-write heads of commercial CD-/DVD read-write devices "normal" Fresnel lenses are used), but the advantage of highest compactness, usability, portability and pricing.
Referring to patent claim 7: As patent claim 6, further specified in that even the beam splitter is realized purely in light wave guide optics, preferably by means of integrated optics, with the advantage of highest compactness, portability and cost effectiveness in case of suitable micro technical manufacturing methods. Regular commercial fiber-optic polarizing and non-polarizing beam splitters (fused fibers out of two or more glass fibers) have many disadvantages, for instance that stray reflections, if they occur, cannot be eliminated (or it is extremely difficult to eliminate them) via the above mentioned λ/4 "trick", and that there are strong back reflections into the light source, which destabilize its intensity and which also the Faraday isolator can only eliminate to a certain extent.
Referring to patent claim 8: As patent claims 1-3, further specified in that there are of course quantum troughs that have no or only little topographical structure (embedded local material alterations, as typical for quantum troughs that are realized in semiconductor structures), which then cannot be ideally characterized by AFM. In this case, however, practically always a suitable scanning probe method can be found, particularly for the geometrical (but possibly also for the electronic) characterization of the quantum troughs, for instance scanning elasticity/capacitance/conductivity/magnetic force probe microscopy, or as well scanning near field optical microscopy itself, which then provide the additional data which are necessary for the de-convolution/subtraction/back-calculation of the non-scanning optical far field image.
Referring to patent claim 9: Non-scanning ("wide area") near field optical microscopy far beyond the diffraction limit, simply by usage of a pixel detector array with extremely small pixels and extremely small pixel separation. For instance the "artificial retina" (light pixel array of extreme density) from [6] could be used here (pixel size about 5 nm, mean pixel separation down to 10-30 nm) - Fig. 3a/II. The sample, e.g. molecules, bacteria, cells, but also quantum dots, could be directly prepared onto the pixel array chip, and by almost trivial shadow throwing onto the pixel detectors they are imaged in the near field. For thin samples not even a lot of light is necessary, since quantum effects are always extremely sensitive. The resolution is limited simply by the pixel size and separation and should be able to reach the resolution of a scanning near field optical microscope, here however in a non-scanning, wide area near field technique. Such an "artificial retina" (light pixel array of highest density) as suggested in [6], with extremely small and extremely many pixels, would solve all resolution problems in color microscopy / spatially resolved spectroscopy, the spatial resolution being only limited by the pixel size (quantum wire diameters are on the nm-scale, mean separations of about 10 nm are thinkable as aimed at in [6] and [3]); the time resolution would be limited by the circuitry to the pixels of the "artificial retina" [6], since quantum electronic effects in quantum wires (ohmic resistance practically 0) occur instantaneously. Spectroscopic resolution should already be possible in a single quantum wire, since such quantum mechanical mechanisms (excitation events of quantum mechanical states) always depend on the photon energy - i.e. on the light frequency -, but to quantify the latter, the electronic properties of the quantum wires from [3] would have to be studied much more accurately.
Referring to patent claim 10: Aperture- and lens-less microscopy as in patent claim 9, characterized in that the pixel detector array is located approximately at a distance of 10-100 λ from the sample, so that the diffraction image of the sample is recorded in the Fresnel regime; that the ordinary (CCD-) pixel detector array or the quantum wire pixel array as in patent claim 9 is of much larger area than the sample itself and thus provides a very high effective numerical aperture for the mere diffraction "image" / scattered "image"; that in the case λ/2 <= a or slightly > a [18] ("Mie") the real space image is obtained by Fourier back-transformation for various wavelengths, corrected (numerically) for the Fresnel spherical wave approximation, or is obtained in further approximation by mere Fourier back-transformation for various wavelengths; and that in the case λ/2 >> a ("Rayleigh") the scatterer [form] is derived by back-calculation of the superimposed dipole radiation characteristics.
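One common numerical way to "correct for the Fresnel spherical wave approximation" when back-transforming a field recorded 10-100 λ away is angular-spectrum propagation; the patent does not prescribe a particular algorithm, so the following is only an assumed illustration, and it further assumes that a complex field (rather than just the intensity) is available:

```python
# Angular-spectrum back-propagation sketch (illustrative wavelength, pixel pitch and
# distance; the field is propagated forward by z and then back by -z to show the inversion).
import numpy as np

def angular_spectrum_propagate(field, wavelength, pixel_pitch, distance):
    """Propagate a complex field by `distance` (negative = back towards the sample)."""
    n = field.shape[0]
    fx = np.fft.fftfreq(n, d=pixel_pitch)
    FX, FY = np.meshgrid(fx, fx)
    kz_sq = (1.0 / wavelength) ** 2 - FX ** 2 - FY ** 2
    propagating = kz_sq > 0.0
    kz = np.sqrt(np.where(propagating, kz_sq, 0.0))
    transfer = np.where(propagating, np.exp(2j * np.pi * kz * distance), 0.0)
    return np.fft.ifft2(np.fft.fft2(field) * transfer)

wavelength = 633e-9     # m, illustrative
pixel_pitch = 0.5e-6    # m, illustrative (all sampled spatial frequencies propagate here)
z = 50 * wavelength     # detector placed in the Fresnel regime, 10-100 lambda from the sample

rng = np.random.default_rng(2)
object_field = rng.random((256, 256)).astype(complex)
at_detector = angular_spectrum_propagate(object_field, wavelength, pixel_pitch, z)
recovered = angular_spectrum_propagate(at_detector, wavelength, pixel_pitch, -z)
print(np.allclose(recovered, object_field, atol=1e-9))     # True: propagation inverted
```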
Referring to patent claim 11: As patent claim 10, characterized in that the pixel detector array is hemi-spherically concave and the sample is located at its center point.
Referring to patent claim 12: Analogous to claims 9-11, wherein a non-scanning near field microscope with a 2-dimensional array of many near field probe tips would be in complete analogy to patent claim 9, which in principle can be achieved by bundling many customary monomode fibers sharpened into near field apertures and transmitting the light in each fiber towards a photo-multiplier / photon counter, as described in the "Gedankenexperiment" above; this basically is the parallel usage of many optical near field microscopes, which would make the raster-scanning of the sample obsolete. The problem is a geometrical one, because when bundling many finely sharpened glass fiber tips, that region of the fibers in which their diameter is much smaller than would be necessary for almost un-damped light propagation (at a certain wavelength, e.g. 633 nm) would be fairly long; thus the light collected by the near field optical tips would be extremely damped before it hits the detector, and the signals might therefore no longer be detectable, especially because of stray light intensities - single photon counters would be (commercially) available, though.
Referring to patent claim 13: Time resolution and "color" of the optical (far field) microscopy at the simultaneous spatial resolution of the scanning probe techniques in biology / crystallography / physical chemistry etc.: Using the here invented spectroscopic technique, of course also fluorophores (always representative for luminescence, fluorescence and phosphorescence, which can be regarded as fully equivalent for the here invented concept) in biology/crystallography/physical chemistry can be spectroscopically imaged. These too - just like quantum troughs - are regarded as mutually independent (incoherent) point light sources, which do not interfere with each other; their Airy diffraction pattern intensity profiles for λ <= a or slightly > a ("Mie") respectively their dipole radiation characteristics for λ >> a ("Rayleigh") will thus add up as scalars in the diffracted/scattered image. For instance, fluorophores that are bound directly to or in the immediate vicinity (on a molecular scale) of proteins often are indicators for the function of such bio-molecules (or as well for re-crystallization events e.g. in Langmuir-Blodgett films). The AFM could localize these luminescence or fluorescence particles (e.g. metallic nano-particles with or without attached fluorescence molecules) and the here invented optical spectroscopy is then able to prove biochemical functions e.g. on a cell / bacterium / virus surface, all with the spatial resolution of the scanning force microscope and the color of the optical diffraction/scattering imaging microscopy. The time resolution can be better than that of scanning probe techniques and can be almost equal to that of optical video microscopy, since nowadays' computers are very powerful, while the scanning probe microscopy just provides markers/marker-images in time containing the necessary additional (image) information on their own time scale of (nowadays) up to 10 images per second.
As an example, this method is proposed e.g. for the imaging of the surface of living cells/bacteria/viruses in vitro on a molecular scale and of the physiological processes on them: Scanning force microscopy already provides "movies" with a resolution down to about 10 nm laterally at up to one frame per second [13]; based on that spatial resolution, the here invented fast (much faster than the 1 image per second of scanning force microscopy in biology) spectroscopy could then follow luminescence / fluorescence markers at a much higher image rate and thus could image (kinematic and) dynamic biochemical processes in color and could identify them.
In fact monoclonal antibodies labeled with 5 nm gold (or other metallic) nano particles (which are also available with attached fluorescence molecules) against certain proteins could be imaged by means of the here invented optical (color) microscopy on the surface of viruses (e.g. vaccination strains), while these viruses attach to cell surfaces or while newly formed (progeny) virions leave the (host) cell through the cell wall, and thus certain viruses can be identified unambiguously.
Equivalently, of course, certain proteins in the cell membrane could generally be labeled unambiguously by such (fluorescence-)labeled monoclonal antibodies and thus biochemical processes can be followed on a molecular scale in a well defined manner, and all that with the time resolution of the optical microscopy, provided the spatial changes within (below/beyond) the diffraction limit (above the diffraction limit optical microscopy sees them anyway, of course) are only slow (of the order of the time resolution of the AFM, about 1 image frame per second). Spatially fixed processes, e.g. protein motion / enzymatic activity, which for instance can quench the fluorescence of a marker molecule, can of course be imaged with the time resolution of the here invented optical microscopy / spatially resolved spectroscopy - i.e. for instance with the time resolution of a highest quality color camera; there are high speed cameras with over 100000 images per second (depending on the number of pixels, of course) [reference: Slomotec: http://www.slomotec.de/productview.php?article=99]. The "artificial retina" (light pixel array of highest density) as hypothetically suggested in [6] could perhaps be even faster in the future due to the quantum effects exploited there. All this, of course, given extremely fast, efficient computer / numerical software technology. A few single well-defined marker (illuminated) points, i.e. a few single (2-3 pieces) fluorophores placed within the diffraction limit, can of course (in principle/theoretically) be followed directly - even in 3 dimensions - with the here invented method, without needing another supporting highly resolving ("black and white") microscopy like scanning probe microscopy (as mentioned above).
The "back-calculation" to the lateral resolution of the optical (far field) microscopy would then tolerate relatively slow and relatively minor spatial changes (of the order of magnitude of the scanning force microscopy resolution) of the structural details, while the scanning force microscopy is refreshing these data continuously at 1 image frame per second. Spatial changes of these structural details may as well be of the order of a few AFM spatial resolutions (i.e. some 10 nm), since, as mentioned above, it should be sufficient for the back-calculation of the spectroscopy image of well-defined "points" ("Airy discs") to know their number and their approximate position within the optical diffraction limit. The fast numerical (real-time) procedures necessary for this are not initially the subject of the present invention; they exist, however, possibly in an adaptable manner, e.g. in the field of image / object recognition in electron microscopy or also in [17].
It is remarked that the here invented spatially resolved spectroscopy can of course be combined not only with AFM but also with other highly resolving microscopy techniques, such as electron microscopy or photonic force microscopy, which is claimed [20] to provide 3d-images for instance also from the inside of a cell in vitro. The latter would be interesting here as a supporting high resolution scanning probe method, since the here invented method could also provide 3-dimensional spatial information due to the possibility of using a phase contrast procedure.
Referring to patent claim 14: The here invented concept from patent claims 1-3 to overcome the diffraction limitation of wave optical imaging techniques is of course basically applicable to all wave optical microscopies / telescopies, such as electron microscopy (where of course magnetic electron beam optics is used) or such as imaging using infrared light (KBr lenses or "suitable" Fresnel lenses [14]) or microwaves (directed microwave/radar optics, "suitable" [14] Fresnel lenses or parabolic mirrors) - the electronically readable pixel detector just has to be suitable/sensitive for the corresponding wavelength.
In the case of microwaves, the laser in Fig. 1 is replaced by a maser, all lenses/apertures are largely omitted (perhaps rudimentary beam shaping by directed microwave optics, i.e. "suitable" Fresnel lenses [14] or parabolic mirrors respectively, polarization rotations by Faraday (or Kerr) effects), and since then parallel "light" is used, the sample can also be located at a large distance. This would lead to the concept of a highly resolving radar telescope, with the imaging mechanism according to Fig. 2d, i.e. by circumvention/"extension" of the wave optical diffraction limit by numerical back-computation/de-convolution, possibly also in order to obtain spectroscopic data, in that case perhaps by means of pre-information about the geometrically known sample/objects to be observed / to be imaged.
In the case of a telescope, of course, the interferometer cavity on the sample's side can hardly be realized, only in rare special cases - it will be restricted to the reference mirror in Fig. 1, used to adjust to a dark fringe and to level out the Gaussian intensity profile (see patent claim 3). Under circumstances - depending on the requirements for sensitivity and resolution capability and depending on the expected contrast from the "sample" - one could possibly omit the whole interferometric amplification, which, as mentioned above, is in principle not necessarily needed for circumventing/"extending" the diffraction limit of the wave optical imaging mechanism. In principle (theoretically) a suitable electronically/digitally readable pixel detector array is sufficient, as well as suitable beam shaping and suitable numerical software and of course sufficient pre-information on the "sample" or the far away observed object; it is just that the signal strength of the "sub-diffraction limit contrast" will practically rarely suffice to be made visible without further "tricks". Microwave scanners analogous to x-ray scanners can obviously also be envisioned this way: different materials are differently transparent for microwaves just as for x-rays, i.e. a microwave "x-ray"-like apparatus can be realized that way too.
Referring to patent claim 15: The here suggested microscopy method with ultrahigh spatial resolution is combined with the long known Fourier transform infrared spectroscopy (FTIR): The laser light source (1.1) is replaced by an optimally collimated ("ideally" parallel light) light source which emits non-coherent infrared light. An ultra-highly spatially resolved spectrum is recorded by performing, pixel-wise, the Fourier transformation from time space into frequency space of the intensity signal, which is time-dependent because of the periodic modulation of the vertical - in beam direction - position of the mirror (1.3) or of the CCD array (3). Furthermore, such a spatially resolved spectrum is possibly obtained by a 3-dimensional Fourier transformation: i.e. firstly a 1-dimensional Fourier transform, from time space into frequency space, of the intensity signal which is time-dependent because of the periodic mirror position modulation, and afterwards secondly a 2-dimensional Fourier transform of the laterally ultra-highly resolved diffracted "intensity images" for all frequencies of the spectrum.
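A minimal FTIR-style sketch of the per-pixel Fourier transformation mentioned in this claim (mirror scan range, wavenumbers and the two-line test spectrum are invented illustration values; in the claimed setup one such interferogram would be recorded and transformed for every detector pixel):

```python
# One detector pixel records an interferogram while the mirror is scanned; an FFT along
# the scan (optical path difference) axis yields the infrared spectrum at that pixel.
import numpy as np

n_steps = 4096
opd = np.linspace(0.0, 2.0e-3, n_steps)                  # optical path difference in metres

def interferogram(wavenumbers_cm, weights, opd):
    """Sum of cosines, one per spectral line (wavenumbers given in 1/cm)."""
    return sum(w * np.cos(2.0 * np.pi * (s * 100.0) * opd)
               for s, w in zip(wavenumbers_cm, weights))

signal = interferogram([1000.0, 1600.0], [1.0, 0.5], opd)   # this pixel "sees" two IR lines

spectrum = np.abs(np.fft.rfft(signal))
sigma_cm = np.fft.rfftfreq(n_steps, d=opd[1] - opd[0]) / 100.0   # wavenumber axis in 1/cm
peaks = np.sort(sigma_cm[np.argsort(spectrum)[-2:]])
print(np.round(peaks))                                     # ~[1000., 1600.]: lines recovered
```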
Referring to claim 16: Using at least one of the microscopy/spectroscopy concepts of claims 1-15, it is here proposed to use structured illumination of the sample just like in reference [21]. However, here it is now proposed to use the casting of an optical near field shadow from a nanometrically patterned structure onto the sample. In particular, a periodic 2-dimensional array of quantum troughs is proposed to be imaged while being illuminated with such a cast shadow generated by another 2-dimensional periodic array of quantum dots - with slightly different spatial periodicity - in near field separation to the "sample quantum trough array". The method proposed in [21] might then as well reach a spatial resolution beyond the diffraction limit, as might the here invented concept.
Drawings: Fig. 1: A high resolution CCD camera (with a big dynamic range) could read out the quantum troughs spectroscopically (i.e. image their "color-excited state") by quantitatively de-convoluting/subtracting the (approximately) Airy diffraction patterns of each "quantum dot", particularly if their position and also their shape is known (characterization by scanning probe microscopy). The here invented interferometric set-up in Fig. 1, however, could already theoretically in principle provide lateral optical resolution beyond the diffraction limit, at least in certain special cases specified in the patent claims. In particular, by computer aided image processing (de-convolution by appropriate numerical "scanning" of an Airy function across the whole image, under the provision that the quantum dots are little discs of known size, i.e. a kind of cross-correlation of the diffraction image with one (or suitable groups of) Airy functions) a real space image of the array of tiny (about 5 nm) quantum dots could be generated.
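The "numerical scanning of an Airy function across the whole image" mentioned above can be read as a matched-filter cross-correlation; the following sketch (grid, Airy scale and dot position are invented illustration values) locates a single disc-shaped emitter by correlating the blurred image with the known Airy intensity kernel:

```python
# Matched-filter localisation sketch: cross-correlate the recorded (blurred) image with
# the known Airy intensity profile of a single "quantum dot".  The kernel is radially
# symmetric, so the correlation can be computed as a convolution.
import numpy as np
from scipy.signal import fftconvolve
from scipy.special import j1

def airy_intensity(r, scale):
    x = np.where(r == 0.0, 1e-12, r) / scale
    return (2.0 * j1(x) / x) ** 2

n = 255                                       # odd size so the kernel has a central pixel
yy, xx = np.mgrid[0:n, 0:n]
kernel = airy_intensity(np.hypot(xx - (n - 1) / 2, yy - (n - 1) / 2), scale=6.0)

true_position = (128, 103)                    # (row, col) of the dot, "unknown" to the fit
image = 0.8 * airy_intensity(np.hypot(xx - true_position[1], yy - true_position[0]), scale=6.0)

correlation = fftconvolve(image, kernel, mode="same")
estimate = np.unravel_index(np.argmax(correlation), correlation.shape)
print(true_position, estimate)                # the correlation peak recovers the dot position
```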
By using a phase contrast method (interferometry), which has a vertical resolution of the order of 0.1 Angstrom, also 3-dimensional arrays (Fig. 1b) of quantum troughs can be spectroscopically investigated this way - i.e. read out -, because the light phase can be "moved through" vertically at this high resolution. This way, a much higher storage density than with just 2-dimensional quantum trough arrays is possible (only the latter are accessible by high resolution scanning probe techniques).
The Gaussian intensity profile of a color laser (red-green-blue or tunable) falls on a beam splitter (polarizing or not); from there one beam falls onto a movable mirror with adjustable reflectivity (e.g. by means of an adjustable absorber in front of it - e.g. via electrically controllable liquid crystals) and the other beam falls onto the sample interferometer cavity, which is adjustable/modulatable with respect to distance (and also reflectivity). The latter can be formed between an (at the sample side partly reflecting) high numerical aperture objective and the (reflecting) sample (which possibly can be scanned in 3 dimensions, perhaps also with a rotation as for a HDD/DVD/CD) or between the reflecting end of a very short mono-/multi-mode fiber and the (reflecting) sample. By suitable usage of λ/4 waveplates or equivalent, the polarizations are adjusted such that 3 laser beams come to interference on the detector, e.g. the CCD camera. By means of the position of the mirror, the relative phases of the laser beams are adjusted such that the measurement is performed almost 100% on a dark fringe, i.e. the only photons that hit the CCD array are those caused by the sample structure causing minute deviations of the light beam intensity profile from the ideal Gaussian profile of the incoming laser (Fig. 2d). For the case of the fiber optic version it is remarked that 1. the [effective] inner diameter of a monomode fiber for 633 nm is about 4 µm, i.e. roughly of the same size as the expected sample area (at 5 nm quantum dots in 10 nm distance this would already result in almost 100 kbits, if only one quantum level is used), and that 2. this monomode fiber has to be very short (of order << O(1 m)), since in an ideal monomode fiber deviations from the Gaussian profile are damped rapidly. For large memory cell arrays, e.g. (1 cm)², which, even realized only in 2 dimensions, would already correspond to a storage size of 400 Gigabits (at (15 nm)² per quantum trough and only one used quantum level), a line-wise raster scan is then needed (roughly in 4 µm steps, i.e. roughly the inner diameter of the monomode fiber), or a large objective lens has to be used. This objective ought to be a "suitable" [14] Fresnel lens, since the effect exploited by the present invention should be much more significant in the Fresnel approximation (spherical wave approximation - Fresnel diffraction optics, scatterer size of order λ), in the regime close to the sample and up to about λ away from the sample, than in the far field approximation (plane wave fronts - Fraunhofer diffraction optics); in particular, such a here invented system could be manufactured integrated on a chip, analogous to the laser write/read heads of a CD/DVD player/burner, using "suitable" Fresnel lenses [14].
For further improvement of the signal to noise ratio by further reduction of stray reflections, also the beam splitter can be realized fiber optically, and thus the whole system, including the "3rd" reference beam (if used at all), as described in patent claim 1a - phase and polarization adjustment by means of stress birefringence of the fiber as in [10].
Fig. 1 inset: The here primarily proposed method in order to obtain spectroscopic (color) resolution is simply based on the technology of commercially available high quality color video cameras, namely to split the useful-signal beam coming from the beam splitter by a prism (or a "suitable" optical grating [14]) into the spectral colors and to record them with e.g. 3 or more pixel detectors (e.g. highly sensitive black-and-white cameras, optimized for the corresponding wavelength regime), possibly via a wavelength filter.
Fig. 1b: Diffraction at a double slit.
Fig. 1c: Hertz dipole radiation characteristics.
Fig. 1d: Abbe's (spatial) diffraction limit for optical microscopy. The numerical aperture NA = n·sin α determines the resolution limit of the microscope, i.e. the smallest sample detail separation d which can be observed, defined by the acceptance angle α of the objective lens under which, in the limiting case, the 1st order diffraction/interference peak (of this "double slit" or "grating" diffraction) is viewed. However, if the 0th order diffraction/interference peak's intensity profile is quantitatively measured, for instance up to a viewing angle θ = α', so accurately that the whole double slit diffraction curve can be numerically fitted to it, then already an objective lens with the smaller numerical aperture NA = n·sin α' would be sufficient to resolve the slit separation d, which would then be well below the diffraction limit for optical microscopy as defined by Abbe. Note that the single slit diffraction - here in the drawing the slit width a < slit separation d - is always superimposed, leading to a total intensity profile of the form I(θ) = I0 · [sin((πa/λ)·sinθ) / ((πa/λ)·sinθ)]² · cos²[(πd/λ)·sinθ], where, as described in the text, θ is the viewing angle and λ the wavelength of the illumination. (A short numerical evaluation of this intensity formula is sketched after the Fig. 2a description below.) If the mutual interference of the two light sources were "switched off" (e.g. if the two light sources were two mutually independently luminescing/fluorescing quantum troughs), then the two-light-source interference would disappear and only the single slit diffraction patterns (or Airy diffraction discs in the case of circular light sources) of the two independent light sources would be left over, and their separate intensity profiles would add scalarly; they would not interfere anymore.
Fig. 2a: Diffraction limit: 2 Airy functions, if the "quantum dots" are small discs, at the "conventional" (Sparrow's definition) diffraction limit.
The diffraction pattern, which corresponds to every single,,quantum dot" (e.g. roughly disc geometry) can be reconstructed (calculated/de-convoluted); if the disc separation is slightly above the diffraction limit, the maxima of the diffraction patterns of the single discs could be even resolved separately on a frosted glass screen/photographic film.
Furthermore, a CCD-camera can quantify the intensity profile (as a function of x and y, the lateral extent) of the diffracted light resulting from the superimposition of the diffraction patterns from single quantum dots.
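As announced in the Fig. 1d description above, a short numerical evaluation of the quoted intensity formula (with invented example values a = 100 nm, d = 250 nm, λ = 633 nm, i.e. a slit separation below λ/2) shows that the quantitative value of the 0th-order profile at a fixed viewing angle still depends measurably on the separation d, which is what the proposed numerical fit exploits:

```python
# Double-slit intensity profile I(theta) = I0 * sinc^2((a/lam)*sin(theta))
#                                            * cos^2((pi*d/lam)*sin(theta)),
# evaluated at one viewing angle for two different sub-(lambda/2) slit separations.
import numpy as np

def intensity(theta, a, d, lam):
    s = np.sin(theta)
    return np.sinc(a * s / lam) ** 2 * np.cos(np.pi * d * s / lam) ** 2   # I0 set to 1

theta = np.deg2rad(30.0)            # still well inside the 0th diffraction order here
for d in (200e-9, 250e-9):
    print(d, intensity(theta, a=100e-9, d=d, lam=633e-9))
# The two printed values differ clearly, i.e. the measured 0th-order profile encodes d.
```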
Fig. 2b: Below (beyond) the diffraction limit: Superimposed intensity profiles, if the quantum troughs are very close to each other (Airy1, Airy2), i.e. far beyond (below) the conventional (Sparrow's definition) diffraction limit. By knowing the 2 Airy functions Airy1 and Airy2 precisely, because the disc geometry of the "quantum dots" is precisely known, for instance from scanning force microscopy, a computer can easily "de-convolute" (not really de-convolute but rather just subtract in the simpler case). The same holds of course for arbitrary geometries of the structural details to be imaged, since the single diffraction image is always (in the far field) the Fourier transform of the light absorption (as a function of x, y) of the structural details (in the Fresnel regime, i.e. scatterer size of order λ and detector distance of finitely many (roughly 100) λ, with consideration of higher order terms in the multipole expansion - spherical wave approximation). General de-convolution, meaning direct video microscopy without SPM support, can also be envisioned, although much pre-information from other highly resolving microscopies is then needed (e.g. how many structural details, their mean size and distance etc., accurate imaging transfer functions / point spread functions of the lens system that may be used).
Fig. 2c: Spectroscopy: The resulting diffraction pattern of the 2 "quantum dots" in close vicinity, i.e. the diffracted light, is split into for instance red-yellow-blue by means of a prism and recorded by a corresponding CCD array sensor each. The "de-convolution" (subtraction) is then performed for each color (wavelength). Thus, the spectroscopy can be spatially resolved at each quantum dot by back-calculation.
Fig. 2d: 2-dimensional array of quantum troughs (drawn here in projection only, of course), illuminated by the Gaussian beam profile of a laser. Even below/beyond the diffraction limit, lateral modulations occur in the intensity profile of the "shadow", which will hardly show maxima and minima as drawn here in an exaggerated manner for a regime in close vicinity to the sample, where entanglement [folding] of the intensity variations caused by the diffraction at the sample structure details does not yet occur (only such maxima/minima would be visible to the eye on a frosted glass screen / photographic film); there are, however, measurable deviations from the ideal Gaussian profile and from the diffraction-limited blurred shadow, respectively. A CCD camera can of course quantitatively measure the intensity incident on the [single] pixels, and not only distinguish between bright and dark - a photographic film has of course the same capability, but it cannot amplify for the eye (and particularly cannot de-convolute) and turn the many densely packed folded/entangled inflection points into maxima and minima. The PC which is hooked up to the electronic pixel detecting camera is capable of this, though. An ultra-highly resolved "regular" photograph could of course even in retrospect be scanned in (via the here invented concept, similarly ultra-highly resolved) and be de-convoluted by a PC, if the transfer functions of the (two) lens systems used are known, if the film processing method had taken account of these (ultra fine) photographic grains, and if the scanner exhibits the same resolution capability or the negative / the film format was large enough.
Fig. 3a: Large ordered, or alternatively statistically distributed, array of quantum troughs (e.g. metallic islands in the nm size regime) between two electrodes, optionally connected (each via tunneling contacts) by means of a resistor cascade just as in a shift register / CCD array / DRAM, or contacted by means of the quantum wire array from [3] or as suggested in [6], respectively (Fig. 3a/II).
Manufacturing of 2-dimensional arrays of "quantum dots" by positioning of e.g. 5 nm colloidal Au or Ag (or many other materials) nanoparticles - either statistically deposited from suspension onto a suitable substrate, or positioned with the AFM, or, at best, by using Langmuir-Blodgett films [12, 12a, 12b], where such "nanospheres" can be linked chemically to the amphiphilic (e.g. lipid) molecules ([12, 22] and references therein, [12a, b]) or can be enveloped by these amphiphilic molecules. Other (e.g. "imprinting") methods can also be envisioned [12c]. Transfer of the crystalline or partly crystalline LB films onto a suitable substrate generates an ordered layer of, for instance, gold nanoparticles. Accordingly, Langmuir-Blodgett deposition of multilayers leads to a 3-dimensional array of such nanospheres (quantum troughs) [1a].
Fig. 3b: Optical (spectroscopic) read-out (or write-in) of the stored information (luminescence excitation states of the quantum troughs) in the far field, supported by interferometry by means of the set-up invented here (Fig. 1). Furthermore, a single quantum wire has been generated at the otherwise electrically insulating AFM probe tip (procedure as in [3]), by means of which the quantum troughs can be written to (charged/excited) electronically; in this way they can also be read out electronically as well as optically, the latter because the current through a quantum wire is (has been found to be) light sensitive [6].
Legend:
1 Prism or "suitable" [14] optical grating
1.1 Laser
1.2 Faraday isolator + beam shaper/expander
1.3 Mirror
1.4 Beam splitter
1.5 λ/4 wave plate
1.6 Very short monomode glass fiber or strong objective lens with high numerical aperture
1.7 Optical cavity
1.8 Sample
1.9 Laser intensity profile - incident
1.10 Laser intensity profile - reflected
1.11 Resulting intensity profile in the dark field / in destructive interference
2 Expander (diverging) lens, possibly with an aperture (pinhole) in front of it
3 High-resolution pixel camera (e.g. CCD camera chip)
4 Disc-shaped structural details (>> "Airy discs")
5 Resulting diffracted intensity profile with dispersion (superposition of 2 Airy functions with dispersion spreading/splitting, where the 2 diffracting structural details lie within/beyond/below Sparrow's definition of the diffraction limit)
6 Quantum troughs, e.g. metal film islands
6a Quantum troughs which are loaded with 1, 2 or 3 electrons. Note: a quantum trough charged with 3 electrons will exhibit a resonance at a different wavelength than the same quantum trough charged with only one electron - for various reasons.
7 Electrodes for "linear charging" of the quantum troughs
8 Electrically insulating layer (e.g. insulating DLC or SiO2)
9 Electrically conducting substrate (e.g. highly doped Si wafer)
10 Spectroscopy respectively microscopy laser (focus diameter/beam waist respectively beam diameter expanded/tailored to sample size)
11 Ideal Gaussian intensity profile of the laser (dotted line in the range where it deviates)
12 Deviation from the ideal Gaussian profile (drawn exaggeratedly: below/beyond the diffraction limit, the intensity profile I(x,y) will not exhibit minima/maxima but remains a monotonic function; it will, however, measurably deviate locally from the perfect Gaussian profile)
13 "Resistive wire" - potentiometric conducting lead with well-defined R and C (not just stray capacitances/resistances)
14 Electrically insulating substrate (e.g. SiO2 layer/wafer)
15 Tunneling contacts
16 Electrically insulating DLC layer with embedded vertical quantum wires (manufacturing method e.g. as in [3] or in [6]) - on average every quantum trough is contacted by one or a few quantum wires
17 Wiring matrix - as in a DRAM/FlashRAM/shift register etc., respectively as suggested in [6]
18 AFM detection laser
19 AFM cantilever spring with probe tip
20 Single quantum wire (possibly a few parallel quantum wires) vertically embedded in an otherwise electrically insulating (e.g. diamond) AFM probe tip
21 Protective resistor (of suitable size)
22 Highly sensitive (pico- to femto-) amperemeter (e.g. IVC plus electrometer voltmeter)
23 Circuit for optional electronic read-out of the quantum troughs by means of the quantum wire in the AFM tip
24 Circuit for optional electronic "loading" of the quantum troughs by means of the quantum wire in the AFM probe tip (with small alteration also for optional read-out of the quantum troughs by means of the quantum wire in the AFM probe tip)

References:
1. P.M. Petroff, G. Medeiros-Ribeiro, MRS Bulletin 21(4), 50 (1996).
1a. Xuehua Zhou, Chunyan Liu, Zhiying Zhang, Long Jiang, Jinru Li, "Formation of a 3-dimensional (3D) structure of nanoparticles using the Langmuir-Blodgett method", Chemistry Letters 33(6), 710 (2004).
2. US5835477, G. Binnig, H. Rohrer, P. Vettiger, "Mass-storage applications of local probe arrays".
3. EP1096569A1, F. Ohnesorge et al.
4. US6566704B2, Wun-bong Choi et al.
5. H.A. Bethe, Phys. Rev. 66(7,8), 163 (Oct. 1944); C.L. Pekeris, Phys. Rev. 66(11,12), 351 (1944).
6. Patent application at the DPMA, Az. 102008015118.1-33, of 10.03.2008, F. Ohnesorge, and GB0903401.8.
7. EP0776457B1, C.H.F. Velzel et al.
8. T.A. Klar, S. Jakobs, M. Dyba, A. Egner, S. Hell, PNAS 97(15), 8206 (2000).
9. DE10154699A1, S. Hell et al.
10. D. Rugar, H.J. Mamin, R. Erlandsson, J.E. Stern, B.D. Terris, Rev. Sci. Instr. 59(11), 2337 (1988).
11. e.g. the company Schäfter-Kirchhoff.
11a. e.g. the company Epotec.
12. C.P. Collier, R.J. Saykally, J.J. Shiang, S.E. Henrichs, J.R. Heath, Science 277, 1978 (1997).
12a. J.R. Heath, C.M. Knobler, D.V. Leff, J. Phys. Chem. B 101, 189 (1997).
12b. US6159620A, J. Heath, D. Leff, G. Markovic.
12c. US6294401, J.M. Jacobson, B.N. Hubert, B. Ridley, B. Nivi, S. Fuller.
13. Dissertation, F. Ohnesorge, June 1994, LMU München.
14. Note that there is a danger of confusion here: a Fresnel lens (diffractive lens) usually also works with Fraunhofer diffraction (i.e. in the Fraunhofer regime / plane-wave approximation of diffraction), but possibly - depending on the size scales - also in the Fresnel regime of diffraction (spherical-wave approximation). Furthermore, it is emphasized that, for diffraction-limited optics, any refractive lens can be replaced by a Fresnel lens. In the concept invented here, a "normal" Fresnel lens would unwantedly filter away the information below/beyond the diffraction limit. A Fresnel lens "suitable" for the concept invented here would have to contain much finer gratings than the usual wavelength order of magnitude, in order to take account of higher orders / shorter-wavelength contributions (non-linear optical effects); or the Fresnel lens would have to have suitably shaped gratings, i.e. roughly Gaussian shaped, in order not to mix information of higher order (possibly shorter-wavelength contributions) with the "ringing" from diffraction at cornered or arbitrarily shaped edges of a customary grating. Evaluation of the point spread function could provide "relief" here only partly, and then only theoretically, since in fact two diffraction limits are then superimposed: one occurs in imaging the sample itself and the other at the diffractive lens; this blurred image will hardly ever be reconstructable by computation, at least not to my present imagination. A refractive lens does not have this limitation in principle, but of course has - as any lens - further aberrations (e.g. the deviation of the lens curvature from a polynomial of 4th or perhaps higher order). The lens error of "finite diameter", i.e. finite numerical aperture, both lens types have in common.
15. The concept invented here was already proposed by me in Sept. 1996 in a research fund application (confidential, of course not published) to the Alexander v. Humboldt foundation, and I therefore claim authorship (Urheberrecht) as of that date. Furthermore, in part, in the deemed-withdrawn patent application Az. 10019037.5 at the DPMA of 18.04.2000 (not published, considered withdrawn).
16. E. Hecht, "Optics", 2nd Ed., Addison-Wesley 1987.
17. A. Lewis, US6900435B1, 2005.
18. Where, roughly, for λ > a fully developed diffraction minima no longer occur. The mutually incoherently luminescing structure details can nevertheless be deconvoluted, since the CCD (or other) camera quantitatively measures the intensity profiles of the diffraction peaks and thus also the 0th-order diffraction peak - since no frosted glass screen is used.
19. FTIR, e.g. Wikipedia.
20. H. Hörber et al., photonic force microscopy.
21. M.G.L. Gustafsson, PNAS 102(37), 13081-13086 (2005).
22. S. Chen, Langmuir 17, 2878-2884 (2001).
Abbreviations:
AFM - atomic force microscopy
DLC - diamond-like carbon
FTIR - Fourier transform infrared spectroscopy
SPM - scanning probe microscopy
SNOM/NSOM - scanning near-field optical microscopy / near-field scanning optical microscopy
LB film / LB technique - Langmuir-Blodgett film / Langmuir-Blodgett technique
Quantum dot - relates to the geometrical shape of a (nearly) 0-dimensional object, while "quantum trough" describes the physics of particle(s) (e.g. electrons) in a 0-dimensional potential box.
GB1101356A 2010-02-10 2011-01-26 Highly spatially resolved optical spectroscopy/microscopy Withdrawn GB2477844A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102010007676A DE102010007676A1 (en) 2010-02-10 2010-02-10 Concept for laterally resolved Fourier transform infrared spectroscopy below/beyond the diffraction limit - applications for optical (but also electronic) fast reading of ultra-small memory cells in the form of luminescent quantum wells - as well as in biology/crystallography
GB1011618A GB2477817A (en) 2010-02-10 2010-07-12 Spectroscopy on mutually incoherently luminescing quantum dots, optical read out of quantum troughs, and fabrication of a writable array of quantum troughs

Publications (2)

Publication Number Publication Date
GB201101356D0 GB201101356D0 (en) 2011-03-09
GB2477844A true GB2477844A (en) 2011-08-17

Family

ID=42712183

Family Applications (2)

Application Number Title Priority Date Filing Date
GB1011618A Withdrawn GB2477817A (en) 2010-02-10 2010-07-12 Spectroscopy on mutually incoherently luminescing quantum dots, optical read out of quantum troughs, and fabrication of a writable array of quantum troughs
GB1101356A Withdrawn GB2477844A (en) 2010-02-10 2011-01-26 Highly spatially resolved optical spectroscopy/microscopy

Family Applications Before (1)

Application Number Title Priority Date Filing Date
GB1011618A Withdrawn GB2477817A (en) 2010-02-10 2010-07-12 Spectroscopy on mutually incoherently luminescing quantum dots, optical read out of quantum troughs, and fabrication of a writable array of quantum troughs

Country Status (2)

Country Link
DE (1) DE102010007676A1 (en)
GB (2) GB2477817A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102016112750A1 (en) * 2016-07-12 2018-01-18 Net Se Opto-electronic measuring device for a colorimeter
EP4109067A1 (en) * 2021-06-25 2022-12-28 INTEL Corporation Methods and apparatus to calibrate spatial light modulators

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9664614B2 (en) 2011-07-11 2017-05-30 University Of Limerick Method for high resolution sum-frequency generation and infrared microscopy
EP2941662B1 (en) 2013-01-04 2019-03-13 University of Limerick Differential infra red nanoscopy system and method
US10451553B2 (en) 2014-04-03 2019-10-22 Hitachi High-Technologies Corporation Fluorescence spectrometer
CN104360425B (en) * 2014-11-24 2017-02-22 京东方科技集团股份有限公司 Optical film layer, light emitting device and display device
DE102017129837A1 (en) * 2017-12-13 2019-06-13 Leibniz-Institut für Photonische Technologien e. V. Combined examination with imaging and laser measurement
CN110132895A (en) * 2019-06-21 2019-08-16 陕西科技大学 A kind of measuring device and measuring method of liquid refractivity
CN112666135B (en) * 2020-11-26 2023-04-21 中国科学技术大学 Three-dimensional microscopic imaging device and method

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5835477A (en) 1996-07-10 1998-11-10 International Business Machines Corporation Mass-storage applications of local probe arrays
US6159620A (en) 1997-03-31 2000-12-12 The Regents Of The University Of California Single-electron solid state electronic device
US6294401B1 (en) 1998-08-19 2001-09-25 Massachusetts Institute Of Technology Nanoparticle-based electrical, chemical, and mechanical structures and methods of making same
US6900435B1 (en) 1999-02-14 2005-05-31 Nanoptics, Inc. Deconvolving far-field images using scanned probe data
EP1096569A1 (en) 1999-10-29 2001-05-02 Ohnesorge, Frank, Dr. Quantum wire array, uses thereof, and methods of making the same
KR100360476B1 (en) 2000-06-27 2002-11-08 삼성전자 주식회사 Vertical nano-size transistor using carbon nanotubes and manufacturing method thereof
US7298461B2 (en) * 2001-10-09 2007-11-20 Ruprecht-Karls-Universitat Far field light microscopical method, system and computer program product for analysing at least one object having a subwavelength size
DE10154699B4 (en) 2001-11-09 2004-04-08 MAX-PLANCK-Gesellschaft zur Förderung der Wissenschaften e.V. Method and device for stimulating an optical transition in a spatially limited manner
US8353061B2 (en) * 2008-05-02 2013-01-08 Ofs Fitel, Llc Near-field scanning optical microscopy with nanoscale resolution from microscale probes

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5995227A (en) * 1994-08-19 1999-11-30 Velzel; Christiaan H. F. Method and interference microscope for microscoping an object in order to obtain a resolution beyond the diffraction limit (high resolution)
US5874726A (en) * 1995-10-10 1999-02-23 Iowa State University Research Foundation Probe-type near-field confocal having feedback for adjusting probe distance
US20040150818A1 (en) * 1999-05-17 2004-08-05 Armstrong Robert L. Optical devices and methods employing nanoparticles, microcavities, and semicontinuous metal films
WO2005089369A2 (en) * 2004-03-17 2005-09-29 Thomas Jeff Adamo Apparatus for imaging using an array of lenses
US20070212681A1 (en) * 2004-08-30 2007-09-13 Benjamin Shapiro Cell canaries for biochemical pathogen detection
US20080193034A1 (en) * 2007-02-08 2008-08-14 Yu Wang Deconvolution method using neighboring-pixel-optical-transfer-function in fourier domain
US20090095882A1 (en) * 2007-09-18 2009-04-16 Northwestern University Near-field nano-imager
DE102009031481A1 (en) * 2008-07-03 2010-02-11 Ohnesorge, Frank, Dr. High-space resolved spectroscopy method for scanning e.g. molecule, involves providing array with camera designed as charge-coupled device camera/color video camera in version with multiple detector arrays for component areas without lens
US20100019296A1 (en) * 2008-07-24 2010-01-28 Cha Dae-Kil Image sensor having nanodot

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Ki Young Kim, "Recent Optical and Photonic Technologies", January 2010, INTECH, Croatia, pp 299 to 316, see especially last paragraph, page 307, *
Proceedings of the 2003 IEEE Conference on Computer Vision and Pattern Recognition Workshop, Ryu et al, "Application of Structured Illumination in Nano-Scale Vision" *

Also Published As

Publication number Publication date
GB2477817A (en) 2011-08-17
GB201101356D0 (en) 2011-03-09
DE102010007676A1 (en) 2011-08-11
GB201011618D0 (en) 2010-08-25

Similar Documents

Publication Publication Date Title
GB2477844A (en) Highly spatially resolved optical spectroscopy/microscopy
Kino et al. Confocal scanning optical microscopy and related imaging systems
US7142308B2 (en) Microscopy
JP5120873B2 (en) Spectroscopic measurement apparatus and spectral measurement method
US20190317012A1 (en) Rapid multiplexed infrared 3d nano-tomography
JP6578278B2 (en) Three-dimensional focus adjustment apparatus and method for microscope
US10012495B2 (en) Optical telemetry device
US8209767B1 (en) Near field detection for optical metrology
US9134242B2 (en) Method and apparatus for retrieval of amplitude and phase of nonlinear electromagnetic waves
Wasserroth et al. Graphene as a local probe to investigate near-field properties of plasmonic nanostructures
DE102009031481A1 (en) High-space resolved spectroscopy method for scanning e.g. molecule, involves providing array with camera designed as charge-coupled device camera/color video camera in version with multiple detector arrays for component areas without lens
Hauler et al. Direct phase mapping of the light scattered by single plasmonic nanoparticles
Dereux et al. Subwavelength mapping of surface photonic states
JPH0949851A (en) Method for drawing physical characteristic of workpiece
Abbasian et al. Microsphere-assisted quantitative phase microscopy: a review
Bauer Probe-based nano-interferometric reconstruction of tightly focused vectorial light fields
Kim et al. Amplitude and phase measurements of highly focused light in optical data storage systems
Berguiga et al. Sensing nanometer depth of focused optical fields with scanning surface plasmon microscopy
Martinez-Marrades et al. Characterization of plasmonic nanoantennas by holographic microscopy and scanning near-field microscopy
JP7159260B2 (en) Apparatus and method for characterizing surface and subsurface structures
Atalay et al. Analysing one isolated single walled carbon nanotube in the near-field domain with selective nanovolume Raman spectroscopy
Kennedy et al. Nanoscale Spectroscopy in the Infrared with Applications to Biology
Bek Apertureless SNOM: A new tool for nano-optics
Sti Building a Combined Brightfield and Confocal Quantum Diamond Microscope for Imaging of Magnetic Samples
Helseth et al. Fundamental limits of optical microrheology

Legal Events

Date Code Title Description
WAP Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1)