US20100195873A1 - Quantitative differential interference contrast (dic) devices for computed depth sectioning - Google Patents
- Publication number
- US20100195873A1 (U.S. application Ser. No. 12/690,952)
- Authority
- US
- United States
- Prior art keywords
- wavefront
- light
- image
- structured
- wavefront sensor
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/17—Systems in which incident light is modified in accordance with the properties of the material investigated
- G01N21/41—Refractivity; Phase-affecting properties, e.g. optical path length
- G01N21/45—Refractivity; Phase-affecting properties, e.g. optical path length using interferometric methods; using Schlieren methods
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/24—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
- G01B11/2441—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures using interferometry
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B21/00—Microscopes
- G02B21/0004—Microscopes specially adapted for specific applications
- G02B21/0092—Polarisation microscopes
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B21/00—Microscopes
- G02B21/06—Means for illuminating specimens
- G02B21/08—Condensers
- G02B21/14—Condensers affording illumination for phase-contrast observation
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B21/00—Microscopes
- G02B21/18—Arrangements with more than one light path, e.g. for comparing two specimens
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B21/00—Microscopes
- G02B21/36—Microscopes arranged for photographic purposes or projection purposes or digital imaging or video purposes including associated control and data processing arrangements
- G02B21/365—Control or image processing arrangements for digital or video microscopes
- G02B21/367—Control or image processing arrangements for digital or video microscopes providing an output produced by processing a plurality of individual source images, e.g. image tiling, montage, composite images, depth sectioning, image comparison
Definitions
- Embodiments of the present invention generally relate to quantitative differential interference contrast (DIC) devices with wavefront sensors. More specifically, certain embodiments relate to quantitative DIC devices with wavefront sensors used in applications such as microscopy or photography, and that are adapted to compute depth sectioning of specimens and other objects.
- DIC: quantitative differential interference contrast
- DIC microscopes render excellent contrast for optically transparent biological samples without the need to introduce exogenous contrast agents into the samples. Due to their noninvasive nature, DIC microscopes are widely used in biology laboratories.
- FIG. 1 is a schematic drawing of a conventional DIC device 10 that operates by interfering slightly displaced duplicate image light fields of polarized light.
- the conventional DIC device 10 includes an illumination source 20 providing polarized light to an object 30 .
- the light fields are transmitted through the object 30 and are laterally displaced with respect to each other along the x-direction.
- a net phase lag (typically π/2) is then introduced on one of the transmitted image light fields.
- the two light fields are allowed to interfere with each other at the image plane 40 . More simply, the process is equivalent to duplicating the transmitted image light field, laterally displacing a copy slightly and detecting the interference of the two light fields at image plane 40 .
- conventional DIC devices have several limitations.
- One major limitation is that conventional DIC devices translate phase variations into amplitude (intensity) variations.
- the DIC intensity image I DIC (x,y) is a sine function of the differential phase (schematically, I DIC (x,y) = B(x,y) + C(x,y)sin(Δφ(x,y))), so the phase information cannot be interpreted directly from the intensity of the DIC image.
- the B(x,y) and C(x,y) terms both contain amplitude information, so the DIC image contains entangled amplitude and phase information. Therefore, phase variations cannot be easily disentangled from amplitude (intensity) variations that arise from absorption and/or scattering by an object.
- conventional DIC devices do not distinguish between the effects of absorption and phase variation. As a consequence of this entanglement of amplitude and phase information and nonlinear phase gradient response, conventional DIC devices are inherently qualitative and do not provide quantitative phase measurements
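The entanglement described above can be illustrated with a toy calculation. The sketch below uses the representative response I_DIC = B + C·sin(Δφ) from the preceding discussion; the exact response of any given instrument differs in detail.

```python
import numpy as np

def dic_intensity(B, C, dphi):
    """Representative conventional-DIC response: intensity is a sine
    function of the differential phase, with B and C carrying amplitude."""
    return B + C * np.sin(dphi)

# Two physically different points, with different amplitude terms and
# different differential phase, can produce identical DIC intensity, so
# phase cannot be read off the intensity image directly.
i1 = dic_intensity(B=1.0, C=0.5, dphi=np.pi / 6)   # 1.0 + 0.5 * 0.5 = 1.25
i2 = dic_intensity(B=1.25, C=0.5, dphi=0.0)        # 1.25 + 0       = 1.25
print(np.isclose(i1, i2))  # True: amplitude and phase are entangled
```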
- conventional DIC devices use polarized light and depend on that polarization in their phase-imaging strategies. Because polarized light must be used, conventional DIC devices generate images of birefringent objects (e.g., potato starch storage granules) that typically suffer from significant artifacts.
- Embodiments of the present invention relate to methods of using a quantitative DIC device(s) to compute depth sections of an object.
- An object introduced into the quantitative DIC device alters a light field and induces an image wavefront having an amplitude and phase gradient.
- the light detector at the back of the wavefront sensor measures the distribution of light passing through structured apertures in the wavefront sensor.
- the wavefront sensor uses the light distribution to separately measure the amplitude and phase gradient of the wavefront in two orthogonal directions.
- the quantitative DIC device numerically reconstructs the image wavefront from the amplitude and phase gradient, and computationally propagates the reconstructed wavefront to planes at different depths through the thickness of the object.
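The reconstruction step can be sketched numerically: given measured phase-gradient maps in two orthogonal directions, a Fourier-domain least-squares integration recovers the phase map. This is one common approach (the text elsewhere refers generically to "unwrapping methods"), not necessarily the patent's exact algorithm.

```python
import numpy as np

def integrate_gradients(gx, gy, dx=1.0):
    """Least-squares integration of two orthogonal gradient maps (gx, gy)
    into a single phase map, using the Fourier derivative theorem:
    if d(phi)/dx <-> (2*pi*i*fx)*Phi, the least-squares solution is
    Phi = (fx*Gx + fy*Gy) / (2*pi*i*(fx^2 + fy^2))."""
    ny, nx = gx.shape
    FX, FY = np.meshgrid(np.fft.fftfreq(nx, d=dx), np.fft.fftfreq(ny, d=dx))
    denom = 2j * np.pi * (FX**2 + FY**2)
    denom[0, 0] = 1.0  # avoid division by zero; the DC (piston) term is arbitrary
    Gx, Gy = np.fft.fft2(gx), np.fft.fft2(gy)
    phi = np.fft.ifft2((FX * Gx + FY * Gy) / denom).real
    return phi - phi.mean()  # zero-mean phase (piston removed)
```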
- One embodiment is directed to a method for computing depth sectioning of an object using a quantitative differential interference contrast device having a wavefront sensor with one or more structured apertures, a light detector and a transparent layer between the one or more structured apertures and the light detector.
- the method comprises receiving light by the light detector through the one or more structured apertures.
- the method further comprises measuring an amplitude of an image wavefront based on the received light and measuring a phase gradient in two orthogonal directions of the image wavefront based on the received light.
- the processor reconstructs the image wavefront using the measured amplitude and phase gradient and propagates the reconstructed wavefront to a first plane intersecting an object at a first depth.
- Another embodiment is directed to a wavefront sensor comprising an aperture layer having one or more structured apertures, a light detector and a transparent layer between the aperture layer and the light detector.
- the light detector measures the amplitude of the wavefront and the phase gradient in two orthogonal directions based on the light received through the structured apertures.
- the wavefront sensor comprises an aperture layer having one or more structured apertures, a light detector and a transparent layer between the aperture layer and the light detector.
- the light detector measures the amplitude of the wavefront and the phase gradient in two orthogonal directions based on the light received through the structured apertures.
- the processor reconstructs the wavefront using the measured amplitude and phase gradient.
- FIG. 1 is a schematic drawing of a conventional DIC device.
- FIG. 2 is a schematic drawing of a quantitative DIC device, according to an embodiment of the invention.
- FIG. 3 is a schematic drawing of a side view of components of a quantitative DIC device in a first configuration, according to an embodiment of the invention.
- FIG. 4 is a schematic drawing of a side view of components of a quantitative DIC device in a second configuration, according to an embodiment of the invention.
- FIG. 5 is a schematic drawing of a side view of components of a quantitative DIC device in a third configuration, according to an embodiment of the invention.
- FIG. 6( a ) is a schematic drawing of a side view of components of a quantitative DIC device having a SAI wavefront sensor, according to an embodiment of the invention.
- FIG. 6( b ) is a schematic drawing of a side view of components of a quantitative DIC device having a SAI wavefront sensor, according to an embodiment of the invention.
- FIG. 6( c ) is a schematic drawing of a side view of components of a quantitative DIC device having a SAI wavefront sensor, according to an embodiment of the invention.
- FIG. 7( a )( 1 ) is a schematic drawing of a perspective view of a two-dimensional structured aperture in the form of a ‘plus’ sign configuration, according to an embodiment of the invention.
- FIG. 7( a )( 2 ) is a schematic drawing of a perspective view of a two-dimensional structured aperture in the form of a ‘plus’ sign configuration, according to an embodiment of the invention.
- FIGS. 7( b ), 7 ( c ), and 7 ( d ) are images taken by a scanning electron microscope of two-dimensional structured apertures, according to embodiments of the invention.
- FIG. 8 is a schematic drawing of a side view of components of a quantitative DIC device having a SAI wavefront sensor, in accordance with embodiments of the invention.
- FIG. 9 is a schematic drawing of a side view of components of a quantitative DIC device having a Shack-Hartmann wavefront sensor, in accordance with an embodiment of the invention.
- FIG. 10( a ) is a schematic drawing of a top view of components of an intensity OFM device including light transmissive regions in the form of a one-dimensional array of single light transmissive regions.
- FIG. 10( b ) is a schematic drawing of a top view of components of a quantitative DIC device having an OFM wavefront sensor, according to an embodiment of the invention.
- FIG. 11 is a schematic diagram illustrating the propagation approach, according to embodiments of the invention.
- FIG. 12 is a schematic diagram of the propagation approach using a processor, according to embodiments of the invention.
- FIGS. 13( a ) and 13 ( b ) are schematic drawings of the focusing approach taken by a traditional microscope.
- FIG. 14 is a flow chart of a computed depth sectioning method using a quantitative DIC device having a wavefront sensor, according to embodiments of the invention.
- FIG. 15( a ) is a side view of components of the wavefront sensor, according to an embodiment of the invention.
- FIG. 15( b ) is a top view of components of the wavefront sensor in FIG. 15( a ), according to an embodiment of the invention.
- FIG. 15( c ) is a sectional view of components of the wavefront sensor in FIG. 15( a ) through the center of structured aperture, according to an embodiment of the invention.
- FIG. 16( a ) is an intensity/amplitude image taken of a starfish embryo using a quantitative DIC microscope having an SAI wavefront sensor, according to an embodiment of the invention.
- FIG. 16( b ) is an image based on phase gradient in the x direction taken of a starfish embryo using a quantitative DIC device having an SAI wavefront sensor, according to an embodiment of the invention.
- FIG. 16( c ) is an image based on phase gradient in the y direction taken of a starfish embryo using a quantitative DIC device having an SAI wavefront sensor, according to an embodiment of the invention.
- FIGS. 16( d ), 16 ( e ) and 16 ( f ) show the results of applying unwrapping algorithms to the raw amplitude, differential phase x, and differential phase y data of FIGS. 16( a ), 16 ( b ) and 16 ( c ), according to embodiments of the invention.
- FIG. 17( a ) is an image of potato starch storage granules in immersion oil taken by a conventional transmission microscope.
- FIG. 17( b ) is an image of potato starch storage granules in immersion oil taken by a conventional DIC microscope.
- FIG. 17( c ) is an intensity image of potato starch storage granules in immersion oil taken by a quantitative DIC device in a microscope system, according to an embodiment of the invention.
- FIG. 17( d ) is an artifact-free x-direction phase image of potato starch storage granules in immersion oil taken by a quantitative DIC device in a microscope system, according to an embodiment of the invention.
- FIG. 17( e ) is an artifact-free y-direction phase image of potato starch storage granules in immersion oil taken by a quantitative DIC device in a microscope system, according to an embodiment of the invention.
- FIG. 18 shows a block diagram of subsystems that may be present in computer devices that are used in quantitative DIC device, according to embodiments of the invention.
- Some embodiments include a simple and quantitative DIC device with a wavefront sensor that can be used in applications such as microscopy, photography, or other imaging applications.
- Wavefront sensors of embodiments of the invention can be in any suitable form.
- the wavefront sensor can be in the form of a single pixel (element) wavefront sensor.
- the wavefront sensor can be in the form of a one dimensional wavefront sensor array having sensor elements (e.g., pixels) located along a single direction.
- the wavefront sensor can be in the form of a two-dimensional wavefront sensor array comprising sensor elements located along two orthogonal directions.
- quantitative DIC devices of embodiments of the invention provide advantages because they can separately measure the amplitude and phase gradient of an image wavefront in two orthogonal directions. With this information, the quantitative DIC device has sufficient data to numerically reconstruct an image wavefront and propagate it to other planes.
- Quantitative DIC devices of embodiments of the invention also provide advantages because they do not require polarized light as part of their imaging technique. Since these quantitative DIC devices are not dependent on the polarization of the light (illumination), these devices can use unpolarized light to generate artifact-free DIC images for both birefringent and homogenous objects. Also, an ordinary light source can be used such as the light source used in a conventional microscope. Another advantage of the quantitative DIC devices of embodiments of the invention is that they integrate DIC functionality onto a simple wavefront sensor. This integration is advantageous over conventional DIC devices which use bulky optical elements to provide DIC functionality. For this reason, embodiments of the present invention are more compact, less expensive, and simpler in use and design than conventional DIC devices.
- FIG. 2 is a schematic drawing of a quantitative DIC device 100 , according to an embodiment of the invention.
- the quantitative DIC device 100 includes an illumination source 20 providing unpolarized light to an object 30 .
- the quantitative DIC device 100 employs a phase comparison that selectively combines and interferes the light fields of unpolarized light at two adjacent points of the image with a separation a.
- the quantitative DIC device 100 uses the phase comparison to generate the image 42 of the object 30 at the image plane 40 . Quantitative DIC devices employing this phase comparison can separately measure the amplitude and phase gradient of the light scattered by, or transmitted through, the object 30 .
- the first configuration includes a quantitative DIC device with a single pixel wavefront sensor.
- raster scanning can be employed to measure two-dimensional data about the image wavefront.
- the second configuration includes a two-dimensional wavefront sensor array which can measure the two-dimensional data about the image wavefront at the same time, without the need for raster scanning.
- the first and second configurations use a wavefront relay system to project the image wavefront from the object to the wavefront sensor.
- the third configuration eliminates the wavefront relay system of the previous two configurations.
- the quantitative DIC devices of these configurations do not depend on the polarization of light as part of their imaging method. These quantitative DIC devices can generate artifact-free DIC images, even for birefringent objects, when unpolarized light is used.
- FIG. 3 is a schematic drawing of a side view of components of a quantitative DIC device 100 in a first configuration, according to an embodiment of the invention.
- the quantitative DIC device 100 includes a single pixel wavefront sensor 110 .
- An illumination source 20 provides light to an object 30 being imaged by the quantitative DIC device 100 .
- the object 30 modulates or otherwise alters the light and induces an image wavefront 120 .
- the quantitative DIC device 100 also includes a wavefront relay system 130 (e.g., one or more lenses) in communication with the single pixel wavefront sensor 110 .
- the wavefront relay system 130 projects or otherwise relays the image wavefront 120 generated by the object 30 onto the single pixel wavefront sensor 110 .
- the single pixel wavefront sensor 110 measures the local intensity and/or slope of the projected image wavefront 120 induced by a point of the object 30 , which conjugates with the single pixel wavefront sensor 110 .
- the quantitative DIC device 100 further includes a raster scanning device 140 for scanning the object 30 or scanning the single pixel wavefront sensor 110 to generate two-dimensional maps of the local intensity and/or slope of the image wavefront 120 induced by the object 30 .
- the quantitative DIC device 100 further includes a host computer 150 having a processor 152 in communication with a computer readable medium (CRM) 154 .
- the host computer 150 is in communication with the single pixel wavefront sensor 110 to receive wavefront data.
- One or more components of the quantitative DIC device 100 can be located within a body, which can be a multi-layer structure or a single, monolithic structure.
- the quantitative DIC device 100 measures the two-dimensional amplitude and the phase gradient in two orthogonal directions of the image wavefront 120 based on the measured intensity distribution.
- the total transmission of the interference is proportional to the average image intensity at the aperture plane.
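Since the total transmission is proportional to the average image intensity at the aperture plane, the local wavefront amplitude can be estimated (up to a calibration constant) from the summed detector counts under the aperture's projection. A minimal sketch:

```python
import numpy as np

def aperture_amplitude(spot_image):
    """Estimate the local wavefront amplitude at a structured aperture.
    The total transmission is proportional to the average image intensity
    at the aperture plane, so (up to an assumed calibration constant) the
    amplitude is the square root of the summed detector counts."""
    return np.sqrt(spot_image.sum())
```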
- the offsets ( Δs and Δt) of the zero-order interference spot are related to the wavelength-normalized phase gradients ( θ x and θ y ) at the aperture, respectively, through the spacer (transparent layer) thickness (H) and refractive index (n) as: θ x = nΔs/H and θ y = nΔt/H.
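The offset-to-gradient conversion stated above can be sketched as a small-angle relation (assuming theta = n * offset / H; the function and variable names here are illustrative, not from the patent):

```python
def offset_to_phase_gradient(ds, dt, H, n):
    """Convert the zero-order spot offsets (ds, dt) into the
    wavelength-normalized phase gradients (theta_x, theta_y), using the
    small-angle relation theta = n * offset / H for a spacer (transparent
    layer) of thickness H and refractive index n."""
    return n * ds / H, n * dt / H

# Example: a 1 um spot offset through a 50 um spacer of index 1.5
# corresponds to a normalized phase gradient of 1.5 * 1 / 50 = 0.03.
```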
- the quantitative DIC device 100 can numerically reconstruct the two-dimensional image wavefront 120 associated with the object 30 using the measured two-dimensional amplitude and phase gradient information.
- the quantitative DIC device 100 can then computationally propagate the image wavefront 120 to one or more z-planes at different depths through the thickness of the object 30 .
- the quantitative DIC device 100 can generate an intensity image, a phase gradient image in the x-direction, and a phase gradient image in the y-direction of the object 30 , a reconstructed image, and propagated two dimensional images at different depths through the thickness of the object 30 .
- the two dimensional images are cross-sectional images of the object 30 .
- the quantitative DIC device 100 can also combine the two-dimensional wavefront data to generate three-dimensional data and images of the object 30 .
- the computed depth sectioning method will be described in further detail in following sections.
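The computational propagation step can be realized with the angular spectrum method, one standard way to refocus a reconstructed complex wavefront to planes at different depths (a sketch under that assumption, not the patent's exact code):

```python
import numpy as np

def angular_spectrum_propagate(field, wavelength, dx, dz):
    """Propagate a sampled complex field by a distance dz using the
    angular spectrum method: multiply the field's 2D spectrum by the
    free-space transfer function exp(i * kz * dz)."""
    ny, nx = field.shape
    FX, FY = np.meshgrid(np.fft.fftfreq(nx, d=dx), np.fft.fftfreq(ny, d=dx))
    arg = 1.0 / wavelength**2 - FX**2 - FY**2
    kz = 2 * np.pi * np.sqrt(np.maximum(arg, 0.0))
    tf = np.exp(1j * kz * dz) * (arg > 0)  # evanescent components dropped
    return np.fft.ifft2(np.fft.fft2(field) * tf)

# Depth sectioning: propagate the reconstructed wavefront to several
# z-planes through the object's thickness and keep the intensity images,
# e.g. sections = [abs(angular_spectrum_propagate(w, 0.5e-6, 1e-6, z))**2
#                  for z in (-2e-6, 0.0, 2e-6)]
```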
- An illumination source 20 can refer to any suitable device or other source of light such as ambient light.
- the light provided by illumination source 20 can be of any suitable wavelength and intensity.
- the light can include polarized and/or unpolarized light.
- the quantitative DIC device 100 can generate artifact-free DIC images of birefringent specimens or other objects 30 in samples.
- Suitable illumination sources 20 are naturally and commercially available.
- the illumination source 20 can be a component of the quantitative DIC device 100 . In other embodiments, the illumination source can be separate from the quantitative DIC device 100 .
- An illumination source 20 can be placed in any suitable location and positioned in any suitable direction to provide appropriate light to the object 30 .
- multiple illumination sources 20 provide light in one or more directions.
- a camera system including a quantitative DIC device may have a first illumination source 20 that provides light in a direction from the wavefront sensor to the object 30 such as from a flash and a second illumination source 20 that provides light in another direction.
- a single illumination source provides light in a single direction.
- a microscope system comprising a quantitative DIC device 100 may have a single illumination source 20 positioned to provide light in the negative z-direction.
- a wavefront relay system 130 can refer to a device or combination of devices configured to relay (e.g., project) the image wavefront 120 induced by the object 30 onto a wavefront sensor such as the single pixel wavefront sensor 110 in FIG. 3 or the wavefront sensor array 210 shown in FIGS. 4 and 5 .
- a wavefront relay system 130 includes one or more lenses. The light may be relayed in any suitable manner.
- the wavefront relay system 130 projects the image wavefront 120 from the object 30 onto a single pixel wavefront sensor 110 .
- a raster scanning device 140 can refer to any suitable device for raster scanning the object 30 or raster scanning the wavefront sensor.
- the raster scanning device 140 can be a component of the quantitative DIC device 100 in some embodiments and can be a separate device in other embodiments.
- the host computer 150 is a component of the quantitative DIC device 100 . In other embodiments, the host computer 150 can be a separate device. Although the processor 152 and CRM 154 are shown as components of the quantitative DIC device 100 , in other embodiments the processor 152 and/or CRM 154 can be components of the wavefront sensor.
- a processor can refer to any suitable device for processing the functions of the quantitative DIC device 100 .
- the processor 152 receives signals with wavefront data associated with the intensity distribution and/or slope data of the image wavefront 120 measured by the single pixel wavefront sensor 110 .
- the wavefront data may include a two-dimensional map of the intensity distribution and/or slope of the image wavefront 120 measured by the single pixel wavefront sensor 110 by employing the raster scanning device 140 , the amplitude and phase gradient data associated with the image wavefront 120 , the wavelength(s) of the light, and/or other information about the light received by the single pixel wavefront sensor 110 .
- the processor 152 executes code stored on the CRM 154 to perform some of the functions of the quantitative DIC device 100 . These functions include interpreting the light distribution and/or slope data measured by the single pixel wavefront sensor 110 , measuring the center of a projection through a structured aperture, separating the projections from structured apertures, determining offsets of the projections or the focal points, determining the amplitude and phase gradient of the image wavefront 120 in two orthogonal directions using the light distribution data, reconstructing the image wavefront using the amplitude and phase gradient data in two orthogonal directions, propagating the reconstructed wavefront from the detector z plane to one or more z planes, generating one or more two-dimensional images based on intensity, phase gradient in the x-direction, phase gradient in the y-direction, the reconstructed wavefront, and/or the propagated wavefronts, combining two-dimensional image data to generate three-dimensional data and images of the object 30 , displaying one or more images of the object 30 , and other functions associated with computed depth sectioning.
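The "measuring the center of a projection" function above typically reduces to a center-of-mass (centroid) calculation on the detected spot; the offsets (Δs, Δt) are then this centroid minus the centroid recorded for a flat reference wavefront. A minimal sketch (illustrative names, not from the patent):

```python
import numpy as np

def spot_centroid(intensity):
    """Center-of-mass estimate of a projected spot's position, in pixel
    coordinates (x, y). Spot offsets are this centroid minus the
    centroid recorded for a flat (reference) wavefront."""
    ys, xs = np.indices(intensity.shape)
    total = intensity.sum()
    return (xs * intensity).sum() / total, (ys * intensity).sum() / total
```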
- a CRM 154 can refer to a memory that stores data and may be in any suitable form including a memory chip, etc.
- the CRM 154 stores the code for performing some functions of the quantitative DIC device 100 .
- the code is executable by the processor 152 .
- the CRM 154 comprises a) code for interpreting the light distribution data received from the single pixel wavefront sensor 110 , b) code for generating local slope data from the light distribution data, c) code for determining the amplitude of the image wavefront 120 and determining the phase gradient of the image wavefront 120 in two orthogonal directions using the light distribution data, d) code for reconstructing the image wavefront using the amplitude data and the phase gradient data in two orthogonal directions, e) code for propagating the reconstructed wavefront from the detector z plane to one or more z planes, f) code for generating one or more two-dimensional images based on intensity, phase gradient in the x-direction, phase gradient in the y-direction, the reconstructed wavefront, and/or the propagated wavefronts, g) code for combining two-dimensional image data to generate three-dimensional data and images of the object 30 , h) code for displaying one or more images of the object 30 , and i) any other suitable code for computed depth sectioning.
- the quantitative DIC device 100 may also include a display communicatively coupled to the processor 152 . Any suitable display may be used. In one embodiment, the display may be a part of the quantitative DIC device 100 . The display may provide information such as the image of the object 30 to a user of the quantitative DIC device 100 . In addition, the quantitative DIC device 100 may also have an input device communicatively coupled to the processor 152 .
- FIG. 4 is a schematic drawing of a side view of components of a quantitative DIC device 100 in a second configuration, according to an embodiment of the invention.
- the second configuration of the quantitative DIC device 100 includes a two-dimensional wavefront sensor array 210 of sensor elements for measuring two-dimensional data about the image wavefront 120 in a single snapshot reading, without the need for raster scanning.
- the quantitative DIC device 100 includes an illumination source 20 for providing light to an object 30 being imaged.
- the object 30 modulates or otherwise alters the light and induces an image wavefront 120 .
- a single illumination source 20 providing light in a single direction is shown in the illustrated embodiment, multiple illumination sources can be used providing light in one or more directions.
- the quantitative DIC device 100 also includes a wavefront relay system 130 in communication with the wavefront sensor array 210 .
- the wavefront relay system 130 projects or otherwise relays the image wavefront 120 generated by the object 30 onto the wavefront sensor array 210 .
- Each sensor element (pixel) of the wavefront sensor array 210 measures the local intensity and slope of the image wavefront 120 induced by a point of the object 30 which conjugates with the sensor element (pixel).
- the quantitative DIC device 100 naturally measures two-dimensional maps of the local intensity and slope of the image wavefront 120 modulated by the object 30 at the same time.
- a raster scanning device is not shown in the illustrated embodiment, another embodiment can include a raster scanning device 140 to raster scan the object 30 or the wavefront sensor array 210 to form more densely sampled images.
- the quantitative DIC device 100 also includes a host computer 150 having a processor 152 in communication with a computer readable medium (CRM) 154 .
- the host computer 150 is in communication with the wavefront sensor array 210 to receive wavefront data.
- One or more components of the quantitative DIC device 100 can be located within a body, which can be a multi-layer structure or a single, monolithic structure.
- the quantitative DIC device 100 measures the two-dimensional amplitude and phase gradient of the image wavefront 120 based on the measured intensity distribution using Eqs. (2) and (3). Using an unwrapping method, the quantitative DIC device 100 can reconstruct the two-dimensional image wavefront 120 associated with the object 30 using the measured two-dimensional amplitude and phase gradient information. The quantitative DIC device 100 can computationally propagate the reconstructed image wavefront 120 to one or more z-planes at different depths through the thickness of the object 30 .
- the quantitative DIC device 100 can generate an intensity image, a phase gradient image in the x-direction, and a phase gradient image in the y-direction of the object 30 , a reconstructed image, and propagated two dimensional images at different depths through the thickness of the object 30 .
- the two dimensional images are cross-sectional images of the object 30 .
- the quantitative DIC device 100 can also combine the two-dimensional wavefront data to generate three-dimensional data and images of the object 30 .
- the computed depth sectioning method will be described in further detail in following sections.
- the host computer 150 is a component of the quantitative DIC device 100 . In other embodiments, the host computer 150 can be a separate device. Although the processor 152 and CRM 154 are shown as components of the quantitative DIC device 100 , in other embodiments the processor 152 and/or CRM 154 can be components of the wavefront sensor.
- the processor 152 receives signals with wavefront data associated with the intensity distribution and slope of the image wavefront 120 measured by the wavefront sensor array 210 .
- the wavefront data may include the two-dimensional map of the intensity distribution and slope of the image wavefront 120 measured by the wavefront sensor array 210 , the amplitude and phase gradient information associated with the image wavefront 120 , the wavelength(s) of the light, and/or other information about the light received by the wavefront sensor array 210 .
- the processor 152 executes code stored on the CRM 154 to perform some of the functions of the quantitative DIC device 100 such as interpreting the intensity distribution and/or slope data measured by the wavefront sensor array 210 , generating amplitude and phase gradient information associated with the image wavefront 120 induced by the object 30 , reconstructing an image wavefront 120 using the amplitude and phase gradient information, computing depth sectioning of the object by numerically propagating the image wavefront 120 back to multiple z-planes through the depth of the object 30 to generate two-dimensional image information about the object 30 , and generating three-dimensional information about the object 30 from the two-dimensional information at multiple z-planes of the object 30 .
- a CRM 154 can refer to any computer readable medium (e.g., memory) that stores data and may be in any suitable form including a memory chip, etc.
- the CRM 154 stores the code for performing some functions of the quantitative DIC device 100 .
- the code is executable by the processor.
- the CRM 154 includes a) code for interpreting the light distribution data received from the wavefront sensor array 210 , b) code for generating local slope data from the light distribution data, c) code for determining the amplitude of the image wavefront 120 and the phase gradient of the image wavefront 120 in two orthogonal directions using the light distribution data, d) code for reconstructing the image wavefront using the amplitude and phase gradient data in two orthogonal directions, e) code for propagating the reconstructed wavefront from the detector z plane to one or more z planes, f) code for generating one or more two-dimensional images based on intensity, phase gradient in the x-direction, phase gradient in the y-direction, the reconstructed wavefront, and/or the propagated wavefronts, g) code for combining two-dimensional image data to generate three-dimensional data and images of the object 30 , h) code for displaying one or more images of the object 30 , and i) any other suitable code for computed depth sectioning of the object 30 .
- the quantitative DIC device 100 may also include a display communicatively coupled to the processor 152 . Any suitable display may be used. In one embodiment, the display may be a part of the quantitative DIC device 100 . The display may provide information such as the image of the object 30 to a user of the quantitative DIC device 100 .
- FIG. 5 is a schematic drawing of a side view of components of a quantitative DIC device 100 in a third configuration, according to an embodiment of the invention.
- the third configuration of the quantitative DIC device 100 eliminates the wavefront relay system 130 included in the first and second configurations.
- the quantitative DIC device 100 includes a two-dimensional wavefront sensor array 210 of sensor elements for measuring two-dimensional data about the image wavefront 120 at the same time.
- a quantitative DIC device 100 in the third configuration comprises a single pixel wavefront sensor 110 and a raster scanning device 140 .
- a quantitative DIC device 100 in the third configuration comprises a one-dimensional OFM wavefront sensor array that uses the optofluidic microscope (OFM) scanning scheme shown in FIGS. 10( a ) and 10 ( b ).
- the sensor elements detect time-varying data, in the form of line scans, as the object 30 passes through a fluid channel. The line scans are compiled to generate two-dimensional data.
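- The compilation of line scans into two-dimensional data can be sketched as follows. This is an illustrative reconstruction under an assumed geometry (uniform flow speed, apertures staggered along the flow direction); the function and its parameters are hypothetical:

```python
import numpy as np

def compile_line_scans(traces, aperture_pitch, flow_speed, dt):
    """Assemble OFM-style time traces into a two-dimensional image.

    traces: (n_apertures, n_samples) array; each row is the signal from one
    aperture as the object flows past. Apertures are staggered along the
    flow, so each trace is delayed by (aperture_pitch / flow_speed).
    """
    n_ap, n_t = traces.shape
    shift = int(round(aperture_pitch / flow_speed / dt))  # samples per aperture
    width = n_t - shift * (n_ap - 1)
    image = np.empty((n_ap, width))
    for i in range(n_ap):
        # Undo the per-aperture time delay so rows line up spatially.
        start = i * shift
        image[i] = traces[i, start:start + width]
    return image
```

- after the delay compensation, each row of the returned array corresponds to one line scan across the object, and the rows together form the two-dimensional image.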
- the quantitative DIC device 100 includes an illumination source 20 providing light to an object 30 being imaged.
- the object 30 modulates or otherwise alters the light, which induces the image wavefront 120 .
- Although a single illumination source 20 providing light in a single direction is shown in the illustrated embodiment, multiple illumination sources providing light in one or more directions can be used.
- the quantitative DIC device 100 also includes a host computer 150 having a processor 152 in communication with a computer readable medium (CRM) 154 .
- the host computer 150 is in communication with the wavefront sensor array 210 to receive wavefront data.
- Each sensor element (pixel) of the wavefront sensor array 210 measures the local intensity and slope of the image wavefront 120 induced by a point of the object 30 that is conjugate with that sensor element (pixel).
- the quantitative DIC device 100 naturally measures two-dimensional maps of the local intensity and slope of the image wavefront 120 modulated by the object 30 at the same time.
- Although a raster scanning device is not shown in the illustrated embodiment, another embodiment can include a raster scanning device 140 to raster scan the object 30 or the wavefront sensor array 210 to form more densely sampled images.
- the quantitative DIC device 100 measures the two-dimensional amplitude and phase gradient of the image wavefront 120 based on the measured intensity distribution using Eqs. (2) and (3). Using an unwrapping method, the quantitative DIC device 100 can reconstruct the two-dimensional image wavefront 120 associated with the object 30 using the measured two-dimensional amplitude and phase gradient information. The quantitative DIC device 100 can computationally propagate the reconstructed image wavefront 120 to one or more z-planes at different depths through the thickness of the object 30 .
- the quantitative DIC device 100 can generate an intensity image, a phase gradient image in the x-direction, and a phase gradient image in the y-direction of the object 30 , a reconstructed image, and propagated two dimensional images at different depths through the thickness of the object 30 .
- the two dimensional images are cross-sectional images of the object 30 .
- the quantitative DIC device 100 can also combine the two-dimensional wavefront data to generate three-dimensional data and images of the object 30 .
- the computing depth sectioning method will be described in further detail in following sections.
- the host computer 150 is a component of the quantitative DIC device 100 . In other embodiments, the host computer 150 can be a separate device. Although the processor 152 and CRM 154 are shown as components of the quantitative DIC device 100 , in other embodiments the processor 152 and/or CRM 154 can be components of the wavefront sensor.
- the processor 152 receives signals with wavefront data associated with the intensity distribution and slope of the image wavefront 120 measured by the wavefront sensor array 210 .
- the wavefront data may include the two-dimensional map of the intensity distribution and slope of the image wavefront 120 measured by the wavefront sensor array 210 , the amplitude and phase gradient information associated with the image wavefront 120 , the wavelength(s) of the light, and/or other information about the light received by the wavefront sensor array 210 .
- the wavefront data further includes time-varying information from an OFM wavefront sensor, which can be in the form of line scans.
- the processor 152 executes code stored on the CRM 154 to perform some of the functions of the quantitative DIC device 100 such as interpreting the intensity distribution and/or slope data measured by the wavefront sensor array 210 , generating amplitude and phase gradient information associated with the image wavefront 120 induced by the object 30 , reconstructing an image wavefront 120 using the amplitude and phase gradient information, computing depth sectioning of the object by numerically propagating the image wavefront 120 back to multiple z-planes through the depth of the object 30 to generate two-dimensional image information about the object 30 , and generating three-dimensional information about the object 30 from the two-dimensional information at multiple z-planes of the object 30 .
- a CRM 154 can refer to any computer readable medium (e.g., memory) that stores data and may be in any suitable form including a memory chip, etc.
- the CRM 154 stores the code for performing some functions of the quantitative DIC device 100 .
- the code is executable by the processor.
- the CRM 154 comprises a) code for interpreting the light distribution data received from the wavefront sensor array 210 , b) code for generating local slope data from the light distribution data, c) code for determining the amplitude of the image wavefront 120 and the phase gradient of the image wavefront 120 in two orthogonal directions using the light distribution data, d) code for reconstructing the image wavefront using the amplitude and phase gradient data in two orthogonal directions, e) code for propagating the reconstructed wavefront from the detector z plane to one or more z planes, f) code for generating one or more two-dimensional images based on intensity, phase gradient in the x-direction, phase gradient in the y-direction, the reconstructed wavefront, and/or the propagated wavefronts, g) code for combining two-dimensional image data to generate three-dimensional data and images of the object 30 , h) code for displaying one or more images of the object 30 , and i) any other suitable code for computed depth sectioning of the object 30 .
- the quantitative DIC device 100 may also include a display communicatively coupled to the processor 152 . Any suitable display may be used. In one embodiment, the display may be a part of the quantitative DIC device 100 . The display may provide information such as the image of the object 30 to a user of the quantitative DIC device 100 .
- the quantitative DIC device 100 of some embodiments may also include a body which incorporates one or more components of the quantitative DIC device 100 .
- the body can be a multi-layer structure or a single, monolithic structure.
- the body is a multi-layer structure.
- the body forms or includes a fluid channel having a first surface.
- the body also includes an opaque or semi-opaque aperture layer that is an inner surface layer of the fluid channel.
- the opaque or semi-opaque aperture layer has light transmissive regions 222 in it.
- the opaque or semi-opaque aperture layer can be a thin metallic layer in some cases.
- the body may optionally include a transparent protective layer (not shown) that covers the opaque or semi-opaque aperture layer to isolate the opaque or semi-opaque aperture layer from the fluid and the object 30 moving through the fluid channel.
- imaging can be done in many ways.
- object scanning can be replaced by wavefront sensor scanning by the raster scanning device.
- the two-dimensional raster scanning can be replaced by the one-dimensional (1D) optofluidic microscope (OFM) scanning scheme described in U.S. patent application Ser. Nos. 11/686,095, 11/743,581, and 12/638,518, which are hereby incorporated by reference in their entirety for all purposes.
- a quantitative DIC device can be in the form of a wavefront sensing chip for use with a microscope, camera or other imaging device.
- the wavefront sensing chip can be communicatively coupled to the imaging device through a port in the imaging device or the chip can be placed in a housing of the imaging device that accepts the chip.
- the device can be provided with quantitative DIC functionality such as the ability to capture phase gradient images and compute depth sectioning.
- a quantitative DIC device can be utilized as a wavefront sensing component of an adaptive optics device.
- the adaptive optics device operates by measuring distortions in the wavefront and compensating for them with a spatial phase modulator such as a deformable mirror or liquid crystal array.
- Wavefront sensors of embodiments of the invention can be used in various wavefront sensing applications including adaptive optics, optical testing, adaptive microscopy, retina imaging, etc.
- An example of using wavefront sensors in adaptive microscopy can be found in Booth, M. J., Neil, M. A. A., Juskaitis, R., and Wilson, T., Proceedings of the National Academy of Sciences of the United States of America 99, 5788 (April 2002), which is herein incorporated by reference in its entirety for all purposes.
- An example of using wavefront sensors in retinal imaging can be found in Liang, J. Z., Williams, D. R., and Miller, D. T., Journal of the Optical Society of America A: Optics, Image Science, and Vision 14, 2884 (November 1997), which is herein incorporated by reference in its entirety for all purposes.
- the wavefront sensor which can measure the local intensity and slope of the wavefront modulated by the object (sample) at the same time, can be implemented in several different ways. Three types of wavefront sensors are described below. The first type uses a structured aperture interference (SAI) scheme. The second type uses a Shack-Hartmann scheme. The third type uses the optofluidic microscope (OFM) scheme. These types of wavefront sensors can be used in either a single sensor element (pixel) configuration or in a wavefront sensor array (one or two dimensions) of sensor elements. If a single pixel wavefront sensor is used, a raster scanning device can be used to scan the object or the wavefront sensor to measure two-dimensional image data. If a two-dimensional array of wavefront sensors is used, the array can capture the two-dimensional image in a single snapshot reading. Although three types of wavefront sensors are described below, any suitable type of wavefront sensor can be used in embodiments of quantitative DIC devices.
- Wavefront sensors of embodiments can be implemented in conjunction with an imaging system (e.g., a camera system, microscope system, etc.) to provide capabilities such as computing depth sectioning of an object.
- an imaging system e.g., a camera system, microscope system, etc.
- one or more wavefront sensors can be coupled to the imaging device (e.g., a microscope or camera) or inserted within an imaging device to provide the additional capabilities.
- Structured Aperture Interference (SAI) Wavefront Sensors
- the local slope of the image wavefront can be measured by looking at the offset of the zero diffraction order on the back detector array (light detector) and the local intensity (amplitude) of the image wavefront can be measured by integrating the diffraction signal on the back detector array.
- FIGS. 6( a ), 6 ( b ) and 6 ( c ) are schematic drawings of a side view of components of a quantitative DIC device 100 having a SAI wavefront sensor, according to embodiments of the invention.
- the SAI wavefront sensor can be a single pixel wavefront sensor 110 or a wavefront sensor array 210 of one or two dimensions.
- the SAI wavefront sensor 110 / 210 is a single pixel wavefront sensor 110 or can be a component of a wavefront sensor array 210 of one or two dimensions.
- the SAI wavefront sensor 110 / 210 comprises an aperture layer 220 (e.g., a metal film), a light detector 230 and a transparent layer 240 between the aperture layer 220 and the light detector 230 .
- the aperture layer 220 includes a first light transmissive region 222 ( a ) and a second light transmissive region 222 ( b ).
- the two light transmissive regions 222 ( a ) and 222 ( b ) are located at a distance D away from the light detector 230 and are separated from each other by a spacing Δx.
- the transparent layer 240 between the light detector 230 and the aperture layer 220 can include one or more layers of transparent material such as water or a viscous polymer (e.g., SU-8 resin), or can be a vacuum or gas-filled space.
- An illumination source 20 provides illumination (light) to an object 30 which modulates or otherwise alters the light inducing an image wavefront.
- the light detector 230 measures the distribution of light received from the light transmissive regions 222 ( a ) and 222 ( b ).
- a light detector 230 can refer to any suitable device capable of detecting light and generating signals with wavefront data about the intensity, wavelength, wavefront slope, phase gradient in one or more orthogonal directions, and/or other information about the light being detected.
- the signals may be in the form of electrical current that results from the photoelectric effect.
- suitable light detectors 230 include a charge coupled device (CCD) or a linear or two-dimensional array of photodiodes (e.g., avalanche photodiodes (APDs)).
- a light detector 230 could also be a complementary metal-oxide-semiconductor (CMOS) or photomultiplier tubes (PMTs). Other suitable light detectors 230 are commercially available.
- the light detector 230 comprises one or more light detecting elements 232 .
- the light detecting elements 232 can be of any suitable size (e.g., 1-4 microns) and any suitable shape (e.g., circular or square).
- the light detecting elements 232 can be arranged in any suitable form such as a one-dimensional array, a two-dimensional array, and a multiplicity of one-dimensional and/or two-dimensional arrays.
- the arrays can have any suitable orientation or combination of orientations.
- the light detecting elements 232 can be arranged in the same form as the light transmissive regions 222 ( a ) and 222 ( b ) and correspond to the light transmissive regions 222 ( a ) and 222 ( b ).
- the light detector 230 also comprises a first surface 230 ( a ).
- the object 30 being imaged is homogeneous.
- the object 30 includes a feature 250 having a refractive index variation from the other, homogeneous portions of the object 30 .
- the object 30 in FIG. 6( c ) could be a cell and the feature 250 may be a nucleus having a different refractive index than other portions of the cell.
- light transmissive region 222 ( a ) collects a reference beam of light and light transmissive region 222 ( b ) collects a sample beam of light.
- the transparent layer (spacer) 240 has a refractive index of n and a thickness of D.
- the vertical plane wave is incident on the two light transmissive regions 222 ( a ) and 222 ( b ) of the aperture layer 220 and the interference pattern 280 ( a ) is centered or substantially centered on the light detector 230 .
- when the sample and reference beams exit the two light transmissive regions 222 ( a ) and 222 ( b ) with the same phase, the centroid 270 ( a ) of the light intensity distribution 280 ( a ) of their Young's interference 260 ( a ) is centered or substantially centered on the light detector 230 .
- a phase difference induced by the object 30 between the light transmissive regions 222 ( a ) and 222 ( b ) will shift the centroid 270 of the interference pattern 280 to one side.
- the phase difference Δφ is directly related to the offset Δs of the interference pattern.
- likewise, the phase gradient induced by the object 30 is directly related to the offset Δs of the interference pattern.
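- As an illustrative sketch (an assumed small-angle form, not the patent's exact equation), with aperture spacing Δx, spacer refractive index n, spacer thickness D, and wavelength λ, the phase difference and phase gradient relate to the centroid offset as:

```latex
\Delta\phi \;\approx\; \frac{2\pi\, n\, \Delta x}{\lambda}\,\frac{\Delta s}{D},
\qquad\text{so that}\qquad
\frac{\partial\phi}{\partial x} \;\approx\; \frac{\Delta\phi}{\Delta x}
\;=\; \frac{2\pi\, n\, \Delta s}{\lambda\, D}.
```

- a larger offset Δs therefore corresponds to a proportionally larger local phase gradient, which is what makes the readout quantitative.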
- the reference and sample beams carry different phases.
- the sample beam passes through a homogeneous portion of the object 30 .
- the centroid 270 ( b ) of the light intensity distribution 280 ( b ) of their Young's interference 260 ( b ) shifts on the light detector 230 by an offset Δs 1 .
- the sample beam passes through a heterogeneous portion of the object 30 and the reference beam passes through a homogeneous portion of the object 30 .
- the centroid 270 ( c ) of the light intensity distribution 280 ( c ) of their Young's interference 260 ( c ) shifts on the light detector 230 by an offset Δs 2 .
- in some embodiments, Δs 2 may be greater than Δs 1 ; in other embodiments, Δs 2 may be smaller than Δs 1 .
- the quantitative DIC device 100 can measure the phase difference and the phase gradient of the image wavefront using the measured offset. In addition, the quantitative DIC device 100 can measure the local amplitude by integrating the intensity distribution measured by the light detector 230 . By measuring the intensity and phase gradient of the light modulated by the object 30 through independent aspects of the interference pattern, the quantitative DIC device 100 can separate the amplitude and phase information.
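- A minimal sketch of this decoding, assuming a one-dimensional interference pattern, small tilt angles, and hypothetical sensor parameters (the constants below are illustrative, not from the patent):

```python
import numpy as np

# Hypothetical structured-aperture sensor parameters (assumptions).
WAVELENGTH = 0.5e-6       # illumination wavelength (m)
N_SPACER = 1.6            # refractive index of the transparent spacer
D_SPACER = 20e-6          # aperture-to-detector distance (m)
APERTURE_SPACING = 2e-6   # slit separation (m)
PIXEL = 1e-6              # detector pixel size (m)

def centroid_offset(intensity):
    """Centroid of a 1D interference pattern relative to the pattern center."""
    x = (np.arange(intensity.size) - (intensity.size - 1) / 2) * PIXEL
    return np.sum(x * intensity) / np.sum(intensity)

def phase_difference(intensity):
    """Small-angle estimate of the phase difference between the apertures."""
    ds = centroid_offset(intensity)
    return 2 * np.pi * N_SPACER * APERTURE_SPACING * ds / (WAVELENGTH * D_SPACER)

def local_amplitude(intensity):
    """Local wavefront amplitude from the integrated diffraction signal."""
    return np.sqrt(np.sum(intensity))
```

- the centroid position encodes the wavefront slope while the integrated (total) intensity encodes the local amplitude, so the two quantities are read out independently.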
- the one-dimensional Young's experiment with two slits can be generalized to two dimensions by utilizing varieties of two-dimensional structured apertures (e.g., four holes, a rose-shaped aperture, a ring or Fresnel zone plate, or a single hole).
- a two-dimensional structured aperture can refer to one or more light transmissive regions 222 in the aperture layer of a wavefront sensor 110 / 210 where the light transmissive regions extend in two orthogonal directions (e.g., in the x- and y-directions).
- the SAI wavefront sensor 110 / 210 can measure the local slope and phase gradient of the image wavefront in two orthogonal directions (e.g., x-direction and y-direction) at the same time.
- This aperture-based phase decoding scheme can be referred to as SAI wavefront sensing.
- Some embodiments of quantitative DIC devices 100 have an SAI wavefront sensor 110 / 210 that employs one or more two-dimensional structured apertures to measure the amplitude and differential phase gradient in two orthogonal directions at the same time.
- the light transmissive regions of the two-dimensional structured aperture are separated by a thickness D from the light detector.
- the SAI wavefront sensor 110 / 210 is generally located at the image plane of the imaging system.
- the structured aperture selectively transmits and combines the light fields from two directions on the image to create an interference pattern on the light detector. The total transmission of the interference pattern is proportional to the average image intensity at the light transmissive region.
- wavefront sensors 110 / 210 can measure the spatial phase gradient of the light field.
- the wavefront sensors 110 / 210 of some embodiments measure:
- G x (x, y) is the two-dimensional phase gradient in the x-direction
- G y (x, y) is the two-dimensional phase gradient in the y-direction
- A(x, y) is the two-dimensional amplitude of the detected wavefront.
- a quantitative DIC device 100 with a wavefront sensor 110 / 210 can mathematically reconstruct (unwrap) the detected wavefront by combining the measured data appropriately.
- One unwrapping method is given by Eq. (9).
- the unwrapping methods should all return the same answer if the signal to noise ratio (SNR) of the measurements approaches infinity.
- the unwrapping methods vary in their performance based on the quantity and type of noise present in the measurements.
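- As one example of such an unwrapping method (a least-squares Fourier-integration sketch assuming periodic boundaries; not necessarily the patent's Eq. (9)), the two measured phase-gradient maps can be combined as follows:

```python
import numpy as np

def integrate_gradients(gx, gy):
    """Least-squares reconstruction of a phase map from its x/y gradients
    via Fourier integration (Frankot-Chellappa style, periodic boundaries)."""
    ny, nx = gx.shape
    fx = np.fft.fftfreq(nx)[None, :]
    fy = np.fft.fftfreq(ny)[:, None]
    # In Fourier space, d/dx maps to multiplication by (2*pi*i*fx), so the
    # least-squares solution divides the combined gradient spectrum by
    # 2*pi*i*(fx^2 + fy^2); the DC term (mean phase) is unrecoverable.
    denom = (2j * np.pi) * (fx ** 2 + fy ** 2)
    num = fx * np.fft.fft2(gx) + fy * np.fft.fft2(gy)
    with np.errstate(divide="ignore", invalid="ignore"):
        phi_hat = np.where(denom != 0, num / denom, 0.0)
    return np.real(np.fft.ifft2(phi_hat))
```

- the reconstructed phase is defined only up to an additive constant, since the gradients carry no information about the mean phase.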
- Embodiments of the SAI wavefront sensor 110 / 210 can use any suitable type of two-dimensional structured aperture.
- the quantitative DIC device 100 can use a two-dimensional structured aperture in the form of a ‘plus’ sign configuration with four light transmissive regions (e.g., holes) arranged in orthogonal x and y directions.
- An exemplary quantitative DIC device that uses a four-hole structured aperture for differential phase imaging can be found in Lew, Matthew, Cui, Xiquan, Heng, Xin, Yang, Changhuei, Interference of a four-hole aperture for on-chip quantitative two-dimensional differential phase imaging, Optics Letters, Vol. 32, No.
- Other two-dimensional structured apertures include a single pedal-shaped aperture, a ring or Fresnel zone plate, and other suitable arrangements of light transmissive regions extending in orthogonal directions.
- a quantitative DIC device 100 has a SAI wavefront sensor 110 / 210 that employs a two-dimensional structured aperture of four light transmissive regions (e.g., holes) in the form of a ‘plus’ sign to measure the differential phase and amplitude of an image wavefront in the x-direction and y-direction at the same time.
- the two long axes of the ‘plus’ sign are in the orthogonal x and y directions respectively.
- the offsets offset x (x, y) and offset y (x, y) of the zero-order interference spot are related to the net wavefront gradients G x (x, y) and G y (x, y) at the light transmissive region, respectively.
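- Under a small-angle approximation (an assumed form; the exact expressions of Eqs. (2) and (3) may differ), with spacer refractive index n and thickness D, these relations can be written as:

```latex
G_x(x,y) \;\approx\; \frac{2\pi\, n}{\lambda\, D}\,\mathrm{offset}_x(x,y),
\qquad
G_y(x,y) \;\approx\; \frac{2\pi\, n}{\lambda\, D}\,\mathrm{offset}_y(x,y).
```

- measuring the two offsets of the same zero-order spot therefore yields both orthogonal phase-gradient components simultaneously.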
- FIGS. 7( a )( 1 ) and 7 ( a )( 2 ) are schematic drawings of a perspective view of a two-dimensional structured aperture 300 in the form of a ‘plus’ sign configuration, according to embodiments of the invention.
- the quantitative DIC device 100 comprises an aperture layer 220 having a two-dimensional structured aperture 300 of four light transmissive regions 222 extending in both x and y directions in a ‘plus’ sign configuration.
- the quantitative DIC device 100 can measure the differential phase and amplitude in both x and y directions.
- the quantitative DIC device 100 also has a light detector 230 located at a distance D from the aperture layer 220 .
- the quantitative DIC device 100 also includes a transparent layer 240 with a thickness D located between the aperture layer 220 and the light detector 230 .
- the transparent layer 240 can be comprised of one or more layers of transparent material such as water or a viscous polymer (e.g., SU-8 resin) or can be a vacuum or gas-filled space. Any suitable spacing Δx between the light transmissive regions 222 in the ‘plus’ sign configuration can be used. Some examples of suitable spacing Δx are 1 μm, 2 μm, or 3 μm.
- the light detector 230 receives light passing through light transmissive regions 222 in two-dimensional structured aperture 300 .
- the illumination source 20 projects a light field perpendicular to the first surface 220 ( a ).
- the illumination source 20 projects a light field at an angle with respect to the first surface 220 ( a ).
- the light field projected at different angles in FIG. 7( a )( 1 ) and FIG. 7( a )( 2 ) results in different projections 310 ( a ) and 310 ( b ) respectively onto the light detector 230 .
- FIGS. 7( b ), 7 ( c ), and 7 ( d ) are images taken by a scanning electron microscope of two-dimensional structured apertures 300 , according to embodiments of the invention.
- FIG. 7( b ) illustrates a two-dimensional structured aperture 300 in the form of a single pedal-shaped aperture.
- FIG. 7( c ) and FIG. 7( d ) illustrate two-dimensional structured apertures 300 of light transmissive regions 222 in the form of a ‘plus’ sign configuration.
- the spacing between the light transmissive regions 222 in the structured aperture 300 in FIG. 7( c ) is shorter than the spacing between the light transmissive regions 222 in the structured aperture 300 shown in FIG. 7( d ).
- FIG. 7( h ) is a schematic drawing of a Fresnel-zone plate structured aperture with a circular frame.
- FIG. 8 is a schematic drawing of a side view of components of a quantitative DIC device 100 having a SAI wavefront sensor 110 / 210 , in accordance with embodiments of the invention.
- the SAI wavefront sensor 110 / 210 has an aperture layer 220 having an array 325 of three structured apertures 300 .
- the SAI wavefront sensor 110 / 210 also includes a light detector 230 and a transparent layer 240 with a thickness D located between the aperture layer 220 and the light detector 230 .
- the transparent layer 240 can be comprised of one or more layers of transparent material such as water or a viscous polymer (e.g., SU-8 resin) or can be a vacuum or gas-filled space.
- the quantitative DIC device 100 is a component of a camera system with a first illumination source 20 ( a ) providing light in the z-direction and a second illumination source 20 ( b ) (e.g., a flash) providing light from another direction.
- the light from the additional illumination source reflects off the object 30 .
- the object 30 modulates or otherwise alters the light from both illumination sources 20 ( a ) and 20 ( b ) inducing an image wavefront 120 .
- the second illumination source 20 ( b ) may be eliminated in some embodiments, such as where the quantitative DIC device 100 is a component of a microscope system.
- other illumination sources may provide light from other directions.
- the quantitative DIC device 100 also includes a wavefront relay system 130 (e.g., one or more lenses) in communication with the wavefront sensor 110 / 210 .
- the wavefront relay system 130 relays the wavefront 120 induced by the object 30 to the SAI wavefront sensor 110 / 210 .
- the quantitative DIC device 100 can also include a host computer 150 having a processor 152 and a computer readable medium 154 .
- the illumination sources 20 ( a ) and 20 ( b ) provide light to the object 30 inducing the image wavefront 120 .
- the wavefront relay system 130 relays the image wavefront 120 to the structured apertures 300 of the SAI wavefront sensor 110 / 210 .
- the light passes through the structured apertures.
- the SAI wavefront sensor 110 / 210 measures the offsets in the x and y directions of the zero diffraction order of the light distribution read by the light detector 230 .
- the SAI wavefront sensor 110 / 210 measures the two dimensional phase gradient in the x and y directions based on the offsets using Eqns. (2) and (3).
- the SAI wavefront sensor 110 / 210 also measures the amplitude of the image wavefront by integrating the intensity readings over the light detector 230 .
- the quantitative DIC device 100 can then reconstruct the image wavefront 120 using an unwrapping method.
- the quantitative DIC device 100 can also propagate the reconstructed wavefront to one or more parallel planes intersecting the object 30 in order to compute depth sectioning of the object 30 .
- the quantitative DIC device 100 can also compile the two-dimensional information about the reconstructed and propagated wavefronts to generate three-dimensional information about the object 30 .
- the quantitative DIC device 100 can generate two or three-dimensional images of the object 30 based on the amplitude, phase gradient in a first direction, and/or phase gradient in a second direction orthogonal to the first direction.
- the quantitative DIC device 100 also includes an x-axis, a y-axis, and a z-axis.
- the x-axis and the y-axis lie in the plane of the surface 230 ( a ) of the light detector 230 .
- the z-axis is orthogonal to the plane of the surface 230 ( a ).
- the Shack-Hartmann wavefront sensor includes a microlens array in which each microlens has the same focal length. Each microlens focuses the portion of the local wavefront across it into a focal spot on a photosensor array. The local slope of the wavefront can then be calculated from the position of the focal spot on the light detector.
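- The focal-spot-to-slope computation can be sketched as follows. This is an illustrative implementation with hypothetical parameters (subaperture grid, focal length, pixel size), not the patent's own code:

```python
import numpy as np

def shack_hartmann_slopes(frame, n_sub, focal_length, pixel_size):
    """Estimate local wavefront slopes from a Shack-Hartmann frame.

    frame is divided into n_sub x n_sub square subapertures (one per
    microlens); the focal-spot centroid shift within each subaperture
    gives the local slope: slope = centroid_offset / focal_length.
    """
    side = frame.shape[0] // n_sub
    coords = (np.arange(side) - (side - 1) / 2) * pixel_size
    sx = np.zeros((n_sub, n_sub))
    sy = np.zeros((n_sub, n_sub))
    for i in range(n_sub):
        for j in range(n_sub):
            sub = frame[i * side:(i + 1) * side, j * side:(j + 1) * side]
            total = sub.sum()
            # Intensity-weighted centroid relative to the subaperture center.
            cy = (coords * sub.sum(axis=1)).sum() / total
            cx = (coords * sub.sum(axis=0)).sum() / total
            sx[i, j] = cx / focal_length
            sy[i, j] = cy / focal_length
    return sx, sy
```

- the resulting slope maps play the same role as the phase-gradient maps of the SAI sensor and can feed the same reconstruction and depth-sectioning steps.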
- the principles of the Shack-Hartmann sensors are further described in Platt, Ben C. and Shack, Roland, History and Principles of Shack-Hartmann Wavefront Sensing, Journal of Refractive Surgery 17, S573-S577 (September/October 2001) and in Wikipedia, Shack-Hartmann, at http://en.wikipedia.org/wiki/Shack-Hartmann (last visited Jan. 21, 2009), which are hereby incorporated by reference in their entirety for all purposes.
- Microlenses can refer to small lenses, generally with diameters of less than about a millimeter and can be as small as about 10 μm. Microlenses can be of any suitable shape (e.g., circular, hexagonal, etc.). Microlenses can also have any suitable surface configuration.
- a microlens is a structure with one plane surface and one spherical convex surface to refract light.
- a microlens has two flat and parallel surfaces and the focusing action is obtained by a variation of refractive index across the microlens.
- a microlens is a micro-Fresnel lens having a set of concentric curved surfaces which focus light by refraction.
- a microlens is a binary-optic microlens with grooves having stepped edges.
- a microlens array can refer to a one or two-dimensional array of one or more microlenses.
- a Shack-Hartmann wavefront sensor comprises an aperture layer having a microlens array with one or more microlenses located within the light transmissive regions (e.g., holes) in the aperture layer.
- the Shack-Hartmann wavefront sensor 110 / 210 also includes a light detector 230 and a transparent layer 240 with a thickness D located between the aperture layer 220 and the light detector 230 .
- the transparent layer 240 can be comprised of one or more layers of transparent material such as water or a viscous polymer (e.g., SU-8 resin) or can be a vacuum or gas-filled space.
- FIG. 9 is a schematic drawing of a side view of components of a quantitative DIC device 100 having a Shack-Hartmann wavefront sensor 110 / 210 , in accordance with an embodiment of the invention.
- the Shack-Hartmann wavefront sensor 110 / 210 has an aperture layer 220 comprising a microlens array 320 having three microlenses 330 .
- the quantitative DIC device 100 is a component of a camera system with a first illumination source 20 ( a ) providing light in the direction of the z-axis and a second illumination source 20 ( b ) (e.g., a flash) providing light from another direction.
- the light from the additional illumination source reflects off the object 30 .
- the object 30 alters the light from both illumination sources 20 ( a ) and 20 ( b ) inducing an image wavefront 120 .
- other illumination sources or a single illumination source may be used.
- the quantitative DIC device 100 also includes a wavefront relay system 130 (e.g., one or more lenses) in communication with the wavefront sensor 110 / 210 .
- the quantitative DIC device 100 can also include a host computer 150 having a processor 152 and a computer readable medium 154 .
- the illumination sources 20 provide light to the object 30 inducing the wavefront 120 .
- the wavefront relay system 130 relays the image wavefront 120 induced by the object 30 to the array 320 of microlenses 330 .
- Each microlens 330 concentrates light to focal spots on the light detector 230 .
- the Shack-Hartmann wavefront sensor 110 / 210 measures offsets of the positions of the focal spots on the light detector 230 .
- the Shack-Hartmann wavefront sensor 110 / 210 measures the two-dimensional phase gradient in the x-direction and y-direction and the amplitude as given by Eqns. (2) and (3).
- the Shack-Hartmann wavefront sensor 110 / 210 also measures the amplitude of the image wavefront by integrating the intensity readings over the light detector 230 .
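The offset-to-gradient conversion performed by a Shack-Hartmann sensor can be sketched numerically. The following is an illustrative sketch only, not the patent's Eqns. (2) and (3): it assumes small tilt angles, treats the spacer thickness D as the lens-to-detector distance, and assumes a 0.5 μm illumination wavelength; the function name and units are hypothetical.

```python
import numpy as np

def shack_hartmann_slopes(spot_offsets_um, spacer_um, wavelength_um=0.5):
    """Convert focal-spot offsets on the light detector (um) into local
    wavefront phase gradients (rad/um). Small-angle approximation:
    tilt = offset / spacer thickness D; gradient = k * tilt."""
    k = 2.0 * np.pi / wavelength_um  # wavenumber of the illumination
    return k * np.asarray(spot_offsets_um, dtype=float) / spacer_um

# example: a 2 um spot shift measured across a 100 um spacer
gx = shack_hartmann_slopes([2.0], 100.0)
```

Under the small-angle approximation, a focal spot displaced by δ across a spacer of thickness D corresponds to a local tilt δ/D, and multiplying by the wavenumber k gives the local phase gradient.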
- the quantitative DIC device 100 can then reconstruct the image wavefront 120 using an unwrapping method and propagate the wavefront from the plane at the surface 230 ( a ) of the light detector 230 to any number of parallel planes intersecting the object 30 to determine image data at different depths through the thickness of the object 30 .
- the quantitative DIC device 100 can also compile the two-dimensional information about the wavefront 120 to generate three-dimensional information about the object 30 .
- the quantitative DIC device 100 can generate two or three-dimensional images of the object 30 based on the amplitude, phase gradient in a first direction, and/or phase gradient in a second direction orthogonal to the first direction.
- the quantitative DIC device 100 also includes an x-axis, a y-axis, and a z-axis.
- the x-axis and the y-axis lie in the plane of the surface 230 ( a ) of the light detector 230 .
- the z-axis is orthogonal to the plane of the surface 230 ( a ).
- the above compact and lensless two-dimensional differential phase measurement scheme can be deployed in an OFM imaging scheme as well.
- the intensity OFM device then becomes an on-chip, quantitative differential interference contrast optofluidic microscope, which can improve image quality while providing high throughput in a compact and inexpensive device.
- the quantitative DIC device has an OFM wavefront sensor which includes an aperture layer with an array of two-dimensional structured apertures, a light detector, and a transparent layer with a thickness D between the aperture layer and the light detector.
- the OFM wavefront sensor can determine the amplitude of the image wavefront of an object and determine the phase gradient of the image wavefront in two orthogonal directions of the object. Young's double slit experiment provides a basis for this technique.
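The Young's double-slit basis can be illustrated with a short numerical sketch. The geometry, names, and values below are hypothetical (not the patent's device): a phase difference between two apertures translates the far-field fringe pattern laterally, and that translation is what the light detector registers as an offset.

```python
import numpy as np

def fringe_center(dphi, wavelength=0.5, slit_sep=2.0, screen_dist=100.0):
    """Central-fringe position of a two-aperture (Young-type) pattern;
    a phase difference dphi between the apertures shifts the whole
    pattern laterally. All lengths in micrometers."""
    x = np.linspace(-20.0, 20.0, 40001)
    k = 2.0 * np.pi / wavelength
    # far-field two-slit intensity, one aperture carrying extra phase dphi
    intensity = np.cos(k * slit_sep * x / (2.0 * screen_dist) + dphi / 2.0) ** 2
    return x[np.argmax(intensity)]

# the fringe offset is proportional to the aperture phase difference
shift = fringe_center(np.pi / 4) - fringe_center(0.0)
```

Measuring the fringe offset therefore gives direct access to the phase difference, i.e., the local phase gradient, between the two aperture locations.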
- FIG. 10( a ) is a schematic drawing of a top view of components of an intensity OFM device 400 including light transmissive regions 222 in the form of a one-dimensional array 410 of single light transmissive regions 222 .
- the intensity OFM device 400 also includes a body 420 forming or including a fluid channel 430 .
- the light transmissive regions 222 are located in the aperture layer 440 of the body 420 .
- the intensity OFM device 400 also includes a light detector 230 (shown in FIG. 4) having elements for taking time varying readings of the light received through the light transmissive regions 222 as the object 30 travels through the fluid channel 430 .
- the intensity OFM device 400 can use the time varying readings to reconstruct an image of the object 30 based on light intensity detected by the light detector 230 .
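The compilation of time varying readings into an image can be sketched as follows. This is a simplified illustration under assumed conditions (constant flow speed, one y line per aperture, wrap-around shifts via np.roll); all names and parameters are hypothetical, not the patent's reconstruction:

```python
import numpy as np

def ofm_image(time_traces, aperture_x_um, flow_speed_um_s, dt_s):
    """Compile a 2-D image from per-aperture time traces. Each aperture
    samples one y line of the object; an aperture displaced by x along
    the channel sees the object x / v seconds later, so shifting each
    trace by that transit delay aligns the lines into an image."""
    shifts = [int(round(x / (flow_speed_um_s * dt_s))) for x in aperture_x_um]
    # np.roll wraps around; adequate for traces padded with background
    rows = [np.roll(np.asarray(t), -s) for t, s in zip(time_traces, shifts)]
    return np.vstack(rows)
```

The key design point is that a skewed aperture array trades acquisition time for y resolution: the channel flow itself provides the x scan.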
- FIG. 10( b ) is a schematic drawing of a top view of components of a quantitative DIC device 100 having an OFM wavefront sensor 210 , according to an embodiment of the invention.
- the quantitative DIC device 100 includes a body 420 comprising an OFM wavefront sensor 210 and forming or including a fluid channel 430 .
- the body 420 can be a multi-layer structure or a single, monolithic structure.
- the body 420 is a multi-layer structure having an opaque or semi-opaque aperture layer 440 that is an inner surface layer of the fluid channel 430 .
- the opaque or semi-opaque aperture layer 440 has light transmissive regions 222 in it.
- the opaque or semi-opaque aperture layer 440 can be a thin metallic layer in some cases.
- the body 420 may optionally include a transparent protective layer (not shown) that covers the opaque or semi-opaque aperture layer 440 to isolate the opaque or semi-opaque aperture layer 440 from the fluid and the object 30 moving through the fluid channel 430 of the quantitative DIC device 100 having an OFM wavefront sensor 210 .
- the fluid channel 430 may have any suitable dimensions.
- the width and/or height of the fluid channel 430 may each be less than about 10, 5, or 1 micron.
- the fluid channel 430 may be sized based on the size of the objects 30 being imaged by the quantitative DIC device 100 .
- the height of the fluid channel 430 may be 10 micron where the objects 30 being imaged are 8 micron in order to keep the objects 30 close to the opaque or semi-opaque aperture layer 440 , which may help improve the quality of the image.
- the flow of the fluid in the fluid channel 430 is generally in the direction of the x-axis.
- the OFM wavefront sensor 210 includes a one-dimensional array 450 of structured apertures 300 .
- Each structured aperture 300 is a ‘plus’ sign configuration of light transmissive regions 222 extending in orthogonal x and y directions. In other embodiments, other configurations (e.g., rose-shaped, ring-shaped, single hole, etc.) can be used.
- a microlens is located inside one or more of the light transmissive regions for focusing the light.
- the OFM wavefront sensor 210 also includes a light detector 230 (shown in FIG. 8 ) having elements (e.g., pixels) for taking time varying readings of the light it receives from the light transmissive regions 222 as the object 30 moves through the fluid channel 430 .
- the OFM wavefront sensor 210 also includes a transparent layer (shown in FIG. 8 ) with a thickness, D between the aperture layer 440 and the light detector 230 .
- the transparent layer 240 can be one or more layers of transparent material such as water or a viscous polymer (e.g., SU-8 resin) or can be a vacuum or gas-filled space. Any suitable spacing Δx between the light transmissive regions 222 in the structured apertures 300 can be used. Some examples of suitable spacing Δx are 1 μm, 2 μm, or 3 μm.
- the quantitative DIC device 100 also includes an illumination source 20 (shown in FIG. 8 ) located outside the opaque or semi-opaque aperture layer 440 .
- Illumination sources such as those shown in FIG. 8 can provide light to the fluid channel 430 .
- an object 30 in the fluid is illuminated by the illumination source.
- the object 30 alters (e.g., blocks, reduces the intensity of, and/or modifies the wavelength of) the light passing through it, reflecting off of it, or refracting through it to the light transmissive regions 222 .
- the elements in the light detector 230 detect light transmitted through the light transmissive regions 222 .
- the quantitative DIC device 100 also includes an x-axis and a y-axis that lie in the plane of the inner surface of the light detector 230 proximal to the fluid channel 430 .
- the x-axis lies along a longitudinal axis of the fluid channel 430 .
- the y-axis is orthogonal to the x-axis in the plane of the inner surface of the light detector 230 .
- the light transmissive regions 222 in the opaque or semi-opaque aperture layer 440 can be of any suitable shape and any suitable dimension.
- the light transmissive regions 222 are holes.
- the holes may be etched, for example, into the opaque or semi-opaque aperture layer 440 (e.g., a thin metallic layer).
- the light transmissive regions 222 may be in the form of one or more slits.
- a slit can refer to an elongated opening such as a narrow rectangle.
- Each slit may have any suitable dimension.
- the slits may have uniform dimensions or may have variable dimensions.
- the slits can be oriented at any suitable angle or angles with respect to the x-axis of the fluid channel 430 .
- the light transmissive regions 222 in the one-dimensional array 450 of structured apertures 300 collectively extend from one lateral surface 430 ( a ) to another lateral surface 430 ( b ) of the fluid channel 430 .
- the one-dimensional array 450 is located at an angle, θ, with respect to the x-axis.
- the angle, θ, can be any suitable angle.
- although the illustrated embodiment includes a one-dimensional array, other embodiments may include an OFM wavefront sensor 210 with other suitable formation(s) of structured apertures 300 , such as a slit, a two-dimensional array, or a multiplicity of one-dimensional and/or two-dimensional arrays.
- the formations of structured apertures 300 can be in any suitable orientation or combination of orientations.
- the light detector 230 takes time varying readings of the light it receives from the light transmissive regions 222 as the object 30 moves through the fluid channel 430 .
- the quantitative DIC device 100 uses the time varying readings to determine a two dimensional light intensity distribution generated by ‘plus’ sign configurations of light transmissive regions 222 .
- the quantitative DIC device 100 uses the light intensity distribution to determine the interference in orthogonal directions x and y to determine the offsets.
- the quantitative DIC device 100 also determines the differential phase (gradient) in orthogonal directions x and y based on the determined interference.
- the quantitative DIC device 100 also determines the amplitude by summing the intensity of the light detected over an area of the light detector 230 mapping to a particular set of light transmissive regions 222 .
- Examples of methods of measuring the amplitude and differential phase in two orthogonal directions of the sample wavefront quantitatively can be found in Cui, Xiquan, Lew, Matthew, Yang, Changhuei, Quantitative differential interference contrast microscopy based on structured-aperture interference, Appl. Phys. Lett. 93, 091113 (2008), which is hereby incorporated by reference in its entirety for all purposes.
- structured apertures convert the phase gradient of the image wavefront into a measurable form, the offset of the projection of the light field measured by the light detector.
- a microlens is used to convert the phase gradient of the image wavefront into a movement of the focal point on the light detector.
- Computed depth sectioning refers to a technique for determining images at different depths through the thickness of an object 30 using a quantitative DIC device 100 with a wavefront sensor 110 / 210 . Any type of wavefront sensor 110 / 210 can be used. In some embodiments, polarization effects are ignored. For simplicity, these embodiments assume that the light field is linearly polarized and that no interactions depolarize or otherwise disrupt the polarization of the light field.
- the approach to the computed depth sectioning technique is primarily based on two concepts.
- the light field at any given plane z can be fully described by a complete set of spatially varying amplitude and phase information.
- a light field at plane z can be described by Eq. (11) as ψ(x, y, z) = A(x, y, z)exp(iφ(x, y, z)), where:
- ψ(x, y, z) is the light field at plane z
- A(x, y, z) is the amplitude at plane z
- φ(x, y, z) is the phase at plane z.
- a second concept is Huygens' principle, which states that the light field at an earlier or later (higher or lower z value) plane can be calculated from the light field at plane z.
- a known function (f) connects the light fields at the two planes according to Eq. (12): ψ(x, y, z+Δz) = f(ψ(x, y, z)), where:
- ψ(x, y, z+Δz) is the light field at plane (z+Δz).
- the function (f) is well known and studied in electromagnetic theory. For example, this function f is described in Kraus, John Daniel, Fleisch, Daniel A., Electromagnetics with Applications (5th Ed.), Chapters 4-16 (1999), which is herein incorporated by reference in its entirety for all purposes. This computation assumes the absence of unknown scattering objects between plane z and plane (z+Δz).
- this second concept underlies phase imaging in embodiments of the invention. It implies that if one can measure the phase and amplitude distribution at the plane of the sensor (plane z), one can calculate and render the light field distributions at different heights (different (z+Δz) planes) above the sensor. The light field distributions are, in effect, images at those chosen planes. According to this treatment, if a wavefront sensor can measure the two-dimensional phase and amplitude data at a light detector plane z, the quantitative DIC device can use this data to numerically reconstruct the image and numerically propagate it to any plane z+Δz above or below the plane z of the light detector.
- the propagation of the light field is governed entirely by Maxwell's equations. If one can measure the phase and amplitude distribution of the light field at the sensor, one can take that information and calculate the light field distribution at any given plane above the sensor (or below the sensor). The amplitude distribution of the computed light field is equivalent to the traditional microscope image taken at the focal plane set at z+Δz. This treatment is strictly true only if no unknown object is present between planes z and z+Δz.
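One standard way to realize the plane-to-plane function f of Eq. (12) numerically is the angular spectrum method for scalar monochromatic fields. The sketch below is an assumption about how such a propagation could be implemented, not the patent's algorithm; names and units are hypothetical:

```python
import numpy as np

def angular_spectrum_propagate(field, dz, wavelength, pixel_size):
    """Propagate a sampled scalar field psi(x, y, z) to plane z + dz
    via the angular spectrum of plane waves: FFT, multiply by the
    free-space transfer function, inverse FFT."""
    ny, nx = field.shape
    FX, FY = np.meshgrid(np.fft.fftfreq(nx, d=pixel_size),
                         np.fft.fftfreq(ny, d=pixel_size))
    arg = (1.0 / wavelength) ** 2 - FX ** 2 - FY ** 2
    fz = np.sqrt(np.maximum(arg, 0.0))            # axial spatial frequency
    H = np.exp(2j * np.pi * fz * dz) * (arg > 0)  # drop evanescent waves
    return np.fft.ifft2(np.fft.fft2(field) * H)
```

Passing a negative dz back-propagates the field toward the object, which is the direction needed for computed depth sectioning; evanescent components are discarded, consistent with the assumption of no unknown scatterer between the planes.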
- FIG. 11 illustrates a schematic diagram illustrating this propagation approach, according to embodiments of the invention.
- an illumination source 20 provides light.
- the light distribution of a first wavefront 120 ( a ) is ψ(x, y, z+Δz) at plane z+Δz.
- plane z refers to a plane perpendicular to the z-axis and coinciding with the surface 230 ( a ) of the light detector 230 of wavefront sensor 110 / 210 .
- Plane z+Δz refers to a plane parallel to plane z and at a distance Δz from plane z.
- the light detector 230 of the wavefront sensor 110 / 210 measures the light distribution ψ(x, y, z) of the second (detected) wavefront 120 ( b ) at plane z.
- the light detector 230 measures the two-dimensional amplitude and two-dimensional phase gradient data in two orthogonal directions, associated with the second (detected) wavefront 120 ( b ) at plane z based on the measured light distribution ψ(x, y, z).
- a processor 152 of the wavefront sensor 110 / 210 or of the host computer 150 numerically reconstructs a third (reconstructed) wavefront 120 ( c ) having the light distribution ψ(x, y, z) using the measured phase and amplitude information of the detected wavefront 120 ( b ).
- the processor 152 of the wavefront sensor 110 / 210 or of the host computer 150 calculates and renders a light distribution ψcalculated(x, y, z+Δz) of a fourth (propagated) wavefront 120 ( d ) at a plane z+Δz based on the reconstructed light distribution ψ(x, y, z).
- the processor 152 of the wavefront sensor 110 / 210 or of the host computer 150 numerically propagates the reconstructed wavefront 120 ( c ) from plane z to the plane z+Δz to generate the fourth (propagated) wavefront 120 ( d ).
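The reconstruction step, forming ψ(x, y, z) = A·exp(iφ) from the measured amplitude and the two orthogonal phase gradients, can be sketched with a least-squares Fourier integration of the gradients. This is one common choice of integrator, offered as an assumption rather than the patent's unwrapping method; names, boundary conditions, and units are hypothetical:

```python
import numpy as np

def reconstruct_wavefront(amplitude, grad_x, grad_y, pixel_size):
    """Rebuild psi = A * exp(i*phi) from measured amplitude and the two
    orthogonal phase gradients, integrating the gradients with a
    least-squares Fourier solver (periodic boundaries assumed)."""
    ny, nx = amplitude.shape
    FX, FY = np.meshgrid(np.fft.fftfreq(nx, d=pixel_size),
                         np.fft.fftfreq(ny, d=pixel_size))
    denom = (2j * np.pi) * (FX ** 2 + FY ** 2)
    denom[0, 0] = 1.0  # avoid divide-by-zero; piston term is arbitrary
    phi_hat = (FX * np.fft.fft2(grad_x) + FY * np.fft.fft2(grad_y)) / denom
    phi_hat[0, 0] = 0.0
    phi = np.real(np.fft.ifft2(phi_hat))
    return amplitude * np.exp(1j * phi)
```

The least-squares formulation makes the two gradient maps mutually consistent even in the presence of measurement noise, at the cost of an undetermined constant (piston) phase, which does not affect the rendered images.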
- the imaging of an unknown but weak scatterer can be performed computationally using the same mathematical frame work as described above—by ignoring the presence of the scatterer during the calculation and back-computing the light field at z+ ⁇ z.
- the quantitative DIC device can image an object by computationally back propagating a reconstructed image of the object at plane z to parallel planes above and below the plane z.
- FIG. 12 is a schematic diagram of the propagation approach using a processor, according to embodiments of the invention.
- the illumination source generates a uniform wavefront 120 ( e ) associated with a uniform light distribution of ψ(x, y, z+Δz′) at plane z+Δz′.
- Plane z+Δz′ refers to a plane between the illumination source 20 and the object 30 , that is parallel to plane z, and that is at a distance Δz′ from plane z.
- the object 30 induces a first (induced) wavefront 120 ( a ) associated with a light distribution of ψ(x, y, z+Δz).
- Plane z+Δz refers to a plane parallel to plane z and at a distance Δz from plane z.
- the light detector 230 of the wavefront sensor 110 / 210 measures the light distribution ψ(x, y, z) of the second (detected) image wavefront 120 ( b ) at plane z.
- the light detector 230 measures the amplitude and phase gradient of the second (detected) image wavefront 120 ( b ) at plane z based on the light distribution ψ(x, y, z).
- plane z refers to a plane perpendicular to the z-axis and coinciding with the surface 230 ( a ) of the light detector 230 of wavefront sensor 110 / 210 .
- a processor 152 of the wavefront sensor 110 / 210 or of the host computer 150 numerically reconstructs the light distribution ψ(x, y, z) of a third (reconstructed) image wavefront 120 ( c ) based on the measured phase and amplitude information of the detected image wavefront 120 ( b ).
- the processor 152 of the wavefront sensor 110 / 210 or of the host computer 150 calculates and renders a computed light distribution ψcalculated(x, y, z+Δz) of a first (propagated) image wavefront 120 ( d ) at a plane z+Δz based on the reconstructed light distribution ψ(x, y, z).
- the processor 152 of the wavefront sensor 110 / 210 or of the host computer 150 numerically propagates the reconstructed image wavefront 120 ( c ) from plane z to the plane z+Δz to generate a first propagated image wavefront 120 ( d ) that approximates the first induced wavefront 120 ( a ).
- the processor 152 of the wavefront sensor 110 / 210 or of the host computer 150 also calculates and renders a computed light distribution ψcalculated(x, y, z+Δz′) of a second propagated image wavefront 120 ( f ) at a plane z+Δz′ based on the reconstructed wavefront 120 ( c ) or based on the first propagated image wavefront 120 ( d ).
- the second propagated image wavefront 120 ( f ) approximates the wavefront 120 ( e ) associated with the uniform light distribution ψ(x, y, z+Δz′).
- the processor 152 of the wavefront sensor 110 / 210 or of the host computer 150 can numerically propagate the reconstructed image wavefront 120 ( c ) or any previously propagated image wavefronts to other z-planes at any height through the thickness of the object 30 .
- the processor 152 of the wavefront sensor 110 / 210 or of the host computer 150 can compute the depth sectioning of the object 30 by propagating image wavefronts at different heights through the object 30 .
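The depth sectioning just described reduces to propagating the reconstructed field to a list of z offsets and keeping the amplitude image at each plane. The following self-contained sketch (hypothetical names; scalar angular-spectrum propagation assumed, not the patent's specific algorithm) illustrates the loop:

```python
import numpy as np

def propagate(field, dz, wavelength, pixel_size):
    """Scalar angular-spectrum step from plane z to plane z + dz."""
    ny, nx = field.shape
    FX, FY = np.meshgrid(np.fft.fftfreq(nx, d=pixel_size),
                         np.fft.fftfreq(ny, d=pixel_size))
    arg = (1.0 / wavelength) ** 2 - FX ** 2 - FY ** 2
    H = np.exp(2j * np.pi * np.sqrt(np.maximum(arg, 0.0)) * dz) * (arg > 0)
    return np.fft.ifft2(np.fft.fft2(field) * H)

def depth_sections(psi_z, dz_list, wavelength, pixel_size):
    """Amplitude images at planes z + dz for each dz: the numeric
    analog of refocusing a conventional microscope through the
    specimen, one section per requested dz."""
    return {dz: np.abs(propagate(psi_z, dz, wavelength, pixel_size))
            for dz in dz_list}
```

Because each section is computed from a single measured wavefront, the whole z stack is obtained without any mechanical refocusing.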
- the amplitude distribution of the computed light field is equivalent to the traditional microscope image taken at the focal plane set at z+Δz. In fact, it can be identical. In the traditional microscope, the calculation is performed optically by the optical elements. By adjusting the optics (e.g., lenses), one can bring different planes into focus, but effectively what one is doing is making slight adjustments to the optical computing process.
- FIGS. 13( a ) and 13 ( b ) are schematic drawings of a focusing approach taken by a traditional microscope.
- the microscope is adjusting the optics to bring plane z+Δz into focus.
- the microscope is adjusting the optics to bring plane z+Δz′ into focus.
- the illumination source generates a uniform wavefront 120 ( e ) associated with a uniform light distribution of ψ(x, y, z+Δz′) at plane z+Δz′.
- Plane z+Δz′ refers to a plane between the illumination source 20 and the object 30 , that is parallel to plane z, and that is at a distance Δz′ from plane z.
- the object 30 induces a first (induced) wavefront 120 ( a ) associated with a light distribution of ψ(x, y, z+Δz).
- Plane z+Δz refers to a plane parallel to plane z and at a distance Δz from plane z.
- the optical elements 500 are placed at a distance away from the sensor 510 to calculate a light distribution of ψcalculated(x, y, z+Δz) to focus an image wavefront 120 ( f ) on plane z+Δz.
- the illumination source generates a uniform wavefront 120 ( e ) associated with a uniform light distribution of ψ(x, y, z+Δz′) at plane z+Δz′.
- Plane z+Δz′ refers to a plane between the illumination source 20 and the object 30 , that is parallel to plane z, and that is at a distance Δz′ from plane z.
- the object 30 induces a first (induced) wavefront 120 ( a ) associated with a light distribution of ψ(x, y, z+Δz).
- Plane z+Δz refers to a plane parallel to plane z and at a distance Δz from plane z.
- the optical elements 500 are placed at a distance away from the sensor 510 to calculate a light distribution of ψcalculated(x, y, z+Δz′) to focus an image wavefront 120 ( f ) on plane z+Δz′.
- the process may not be perfect: one may not achieve a good image if the scatterer is thick and/or highly scattering, because the assumption that the scatterer is ignorable in the computation process may then be violated.
- this problem can affect computation-based depth sectioning using a quantitative DIC device and optical-based sectioning using a traditional microscope equally.
- the axiom of ‘no free lunch’ can apply equally in both situations.
- the distortion may be nominally tolerable. In practical situations, one can typically deal with a tissue sample about 100 microns thick before distortion starts becoming significant.
- a sensor capable of spatially measuring amplitude and phase is required to compute depth sectioning of the object based on the propagation approach.
- the wavefront sensor embodiments have the capability of measuring the two-dimensional amplitude and phase required to compute depth sectioning of the object.
- the signal to noise ratio of the sensor measurements may need to be high in some cases. Otherwise the computed images may be poor in quality.
- modifications, additions, or omissions may be made to the quantitative DIC device 100 of any configuration or to the wavefront sensors 110 or 210 of any type (e.g., SAI wavefront sensor, Shack-Hartmann wavefront sensor, and OFM wavefront sensor) without departing from the scope of the disclosure.
- the components of the quantitative DIC device 100 of any configuration or to the wavefront sensors 110 or 210 of any type may be integrated or separated according to particular needs.
- the processor 152 may be a component of the light detector 230 .
- the operations of the quantitative DIC device 100 may be performed by more, fewer, or other components and the operations of the wavefront sensors 110 or 210 may be performed by more, fewer, or other components.
- operations of the quantitative DIC device 100 or wavefront sensors 110 / 210 may be performed using any suitable logic comprising software, hardware, other logic, or any suitable combination of the preceding.
- FIG. 14 is a flow chart of a computed depth sectioning method using a quantitative DIC device 100 having a wavefront sensor 110 / 210 , according to embodiments of the invention.
- the quantitative DIC device 100 used in this method can have any suitable type of wavefront sensor 110 / 210 .
- Suitable types of wavefront sensors 110 / 210 include a SAI wavefront sensor, Shack-Hartmann wavefront sensor, OFM wavefront sensor, or other suitable wavefront sensor.
- any suitable configuration of wavefront sensor 110 / 210 can be used.
- the quantitative DIC device 100 can have a single pixel wavefront sensor 110 , a one dimensional array of sensor elements 210 , a two-dimensional wavefront sensor array 210 , etc.
- if the wavefront sensor 110 / 210 is a single pixel wavefront sensor, a raster scanning device can be employed to scan the wavefront sensor or the object to get a two-dimensional reading at a single time. If the wavefront sensor 110 / 210 is a two-dimensional wavefront sensor array, the wavefront sensor can take a two-dimensional reading of the image wavefront at the same time. If the wavefront sensor 110 / 210 is a one-dimensional array, the wavefront sensor 110 / 210 can read time varying data in the form of line scans and compile the line scans to generate two-dimensional wavefront data.
- the method begins with an illumination source or sources 20 providing light (step 600 ). Without the object 30 being present, the light generates an initialization wavefront with a light distribution of ψ(x, y, z+Δz′) at plane z+Δz′ of the illumination source 20 .
- the illumination source or sources 20 can provide light in any suitable direction(s). If the quantitative DIC device 100 is a component of a camera system, multiple illumination sources 20 such as a flash, ambient light, etc. may provide light from multiple directions. If the quantitative DIC device 100 is a component of a microscope, a single illumination source 20 may provide light in a single direction along the z-axis toward the wavefront sensor.
- a wavefront relay system 130 relays or projects the wavefront to the wavefront sensor. In other embodiments, the wavefront relay system 130 is eliminated and the wavefront is projected directly to the wavefront sensor 110 / 210 .
- the wavefront sensor 110 / 210 measures the light distribution ψ(x, y, z) of the initialization wavefront 120 at plane z (step 602 ).
- the initialization wavefront is received through the structured apertures 300 by the light detecting elements 232 of the light detector 230 .
- if the wavefront sensor is a single pixel wavefront sensor 110 , the light detector 230 uses a raster scanning device 140 to scan the wavefront sensor 110 / 210 or the object 30 to generate a reading of the two-dimensional light intensity distribution at a single time.
- if the wavefront sensor is a two-dimensional sensor array, the light detector 230 can read the two-dimensional light intensity distribution in a snapshot reading without raster scanning.
- if the wavefront sensor is a one-dimensional sensor array using an OFM scheme (OFM wavefront sensor), the light detector 230 reads time varying data as the object 30 passes through the fluid channel.
- the time varying data can be in the form of line scans which can be compiled to generate the two dimensional light distribution.
- the processor 152 of the wavefront sensor 110 / 210 or of the host computer 150 separates the light projection/distribution of each structured aperture 300 from the light projection/distribution from other structured apertures 300 (step 604 ). Once separated, the light distributions/projections can be used to map the light detecting elements 232 of the light detector 230 to the structured apertures 300 .
- any suitable technique for separating the projections/distributions can be used. In one embodiment, separation can be performed by suppressing the crosstalk from adjacent projections of the wavefront 120 through adjacent structured apertures 300 .
- the processor 152 can determine the maximum intensity values of the light distribution. The processor 152 can then determine the light detecting elements 232 that have read the maximum intensity values. The processor 152 can determine the midpoint on the light detector 230 between the light detecting elements 232 reading the maximum intensity values. The processor 152 can then use the midpoint to separate the light detecting elements reading the light distribution from particular structured apertures 300 . Alternatively, a predefined number of light detecting elements 232 around each light detecting element 232 with the maximum intensity value can define the light detecting elements 232 associated with each structured aperture 300 .
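The maxima-and-midpoints separation described above can be sketched in one dimension: find local intensity maxima, cut at midpoints between adjacent maxima, and assign each resulting window of light detecting elements to one structured aperture. The function name and the 50%-of-peak threshold are illustrative assumptions, not the patent's parameters:

```python
import numpy as np

def separate_projections(profile):
    """Assign detector pixels to apertures: find local maxima of a 1-D
    intensity profile, then split at midpoints between adjacent maxima.
    Returns (start, stop) index windows, one per detected projection."""
    p = np.asarray(profile, dtype=float)
    peaks = [i for i in range(1, len(p) - 1)
             if p[i] > p[i - 1] and p[i] >= p[i + 1]
             and p[i] > 0.5 * p.max()]  # assumed crosstalk threshold
    cuts = [0] + [(a + b) // 2 for a, b in zip(peaks, peaks[1:])] + [len(p)]
    return [(cuts[j], cuts[j + 1]) for j in range(len(peaks))]
```

Cutting at midpoints suppresses crosstalk between adjacent projections as long as the aperture spacing keeps neighboring spots from overlapping significantly.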
- the processor 152 predicts the center of each projection/distribution (step 606 ). Any method can be used to predict the initial centers. In one embodiment, the processor 152 may determine that the center of each projection/distribution is the light detecting element having the highest intensity value.
- the object 30 is introduced (step 608 ). In other embodiments, steps 602 through 606 can be performed after step 614 and before step 616 .
- the object 30 can be introduced using any suitable technique. For example, the object 30 may be injected with a fluid sample into an input port of the quantitative DIC device 100 .
- in FIGS. 15( a ), 15 ( b ), and 15 ( c ), a wavefront sensor 210 is shown with an image wavefront induced by an object, according to an embodiment of the invention.
- FIG. 15( a ) is a side view of components of the wavefront sensor 210 , according to an embodiment of the invention.
- the wavefront sensor 210 includes an aperture layer 220 having an array of three structured apertures 300 ( a ), 300 ( b ) and 300 ( c ) along the x-axis.
- FIG. 15( b ) is a top view of components of the wavefront sensor 210 in FIG. 15( a ), according to an embodiment of the invention.
- FIG. 15( c ) is a sectional view of components of the wavefront sensor 210 in FIG. 15( a ) through the center of structured aperture 300 ( c ), according to an embodiment of the invention.
- although three structured apertures 300 ( a ), 300 ( b ) and 300 ( c ) are shown, any number of structured apertures can be used.
- any type of structured aperture (e.g., four hole, single hole, etc.) may be used.
- the wavefront sensor 210 also includes a light detector 230 having a surface 230 ( a ) and a transparent layer 240 between the light detector 230 and the aperture layer 220 .
- the transparent layer 240 has a thickness D.
- although the illustrated example shows the light detector 230 comprising a two-dimensional array of light detecting elements 232 , any suitable number or configuration of light detecting elements 232 can be used. For example, a single light detecting element 232 can be used.
- the wavefront sensor measures the light distribution ψ(x, y, z) of the wavefront induced by the object 30 (step 610 ).
- the wavefront is measured at plane z of the light detector corresponding to surface 230 ( a ) of the light detector 230 .
- the wavefront 120 is received through the structured apertures 300 ( a ), 300 ( b ) and 300 ( c ) in the aperture layer 220 by the light detecting elements 232 of the light detector 230 .
- the light detector 230 can take a two-dimensional snapshot reading of the light distribution or can use a raster scanning device 140 to scan the wavefront sensor to get a two-dimensional reading at a single time.
- the light detector 230 can read time varying intensity information through structured apertures 300 of an OFM wavefront sensor. In this case, the light distribution is compiled from line scans of the time varying information.
- the processor 152 of the wavefront sensor or the host computer separates the light distribution/projection (step 612 ).
- the processor 152 separates the light distributions 800 ( a ), 800 ( b ) and 800 ( c ) projected from the light transmissive regions 222 of a particular structured aperture 300 from the light distributions projected from the light transmissive regions 222 of other structured apertures 300 .
- the processor 152 separates the light distribution 800 ( a ) associated with structured aperture 300 ( a ) from the light distributions 800 ( b ) and 800 ( c ) associated with the structured apertures 300 ( b ) and 300 ( c ).
- the separation of the light distributions/projections can be used to map the light detecting elements 232 of the light detector 230 to the light transmissive region(s) of the structured apertures 300 .
- any suitable method of separating the projections/distributions 800 ( a ), 800 ( b ) and 800 ( c ) can be used.
- separation can be performed by suppressing the crosstalk from adjacent projections of the wavefront 120 through adjacent structured apertures 300 ( a ), 300 ( b ) and 300 ( c ).
- the processor 152 of the quantitative DIC device 100 determines the maximum intensity values 820 ( a ), 820 ( b ), 820 ( c ), and 820 ( d ) of the light distributions 800 ( a ), 800 ( b ) and 800 ( c ) in both orthogonal x- and y-directions.
- the processor 152 determines the light detecting elements 232 reading the maximum intensity values 820 ( a ), 820 ( b ) and 820 ( c ).
- the processor 152 can determine the midpoint on the light detector 230 between the light detecting elements 232 reading the maximum intensity values 820 ( a ), 820 ( b ), 820 ( c ), and 820 ( d ).
- the processor 152 can then use the midpoint to separate the light detecting elements reading the light distribution from particular structured apertures 300 .
- a predefined number of light detecting elements 232 around the light detecting element 232 reading the maximum intensity value can define the light detecting elements 232 associated with each structured aperture 300 . For example, all light detecting elements 232 within three light detecting elements of the one reading the maximum intensity value can be associated with the light distribution from a particular structured aperture 300 .
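The windowing scheme described above can be sketched as follows. This is a minimal illustration, assuming the light detector's readings arrive as a 2D intensity array; the function name `separate_projections` and the local-maximum test are hypothetical helpers, not from the patent.

```python
import numpy as np

def separate_projections(intensity, half_width=3):
    """Crudely separate aperture projections by windowing a fixed number
    of detector elements around each local intensity maximum.
    intensity: 2D array of light-detector readings."""
    maxima = []
    h, w = intensity.shape
    # A pixel is a local maximum if it is at least as bright as every
    # neighbor within half_width (ties would be double-counted here).
    for i in range(half_width, h - half_width):
        for j in range(half_width, w - half_width):
            patch = intensity[i - half_width:i + half_width + 1,
                              j - half_width:j + half_width + 1]
            if intensity[i, j] == patch.max() and intensity[i, j] > 0:
                maxima.append((i, j))
    # The window of detector elements around each maximum defines the
    # light distribution associated with one structured aperture.
    return {(i, j): intensity[i - half_width:i + half_width + 1,
                              j - half_width:j + half_width + 1].copy()
            for (i, j) in maxima}
```

With `half_width=3` this matches the "within three light detecting elements" example; suppressing crosstalk from adjacent apertures would need a more careful assignment than this sketch.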
- the processor 152 also predicts the center of the projections associated with the structured apertures (step 614 ). Any method can be used to predict the centers. In one embodiment, the processor 152 may determine that the center of each separated projection/distribution is the light detecting element having the highest intensity value.
- the processor 152 determines the x and y offsets (step 616 ).
- the processor 152 can determine the offsets from the change in position of the center in both the x and y directions of each projection before and after an object is introduced.
- the processor 152 determines the offset 810 ( a ) at structured aperture 300 ( a ) in the x-direction, offset 810 ( b ) at structured aperture 300 ( b ) in the x-direction and offset 810 ( c ) at structured aperture 300 ( c ) in the x-direction.
- the processor 152 also determines an offset 810 ( d ) at aperture 300 ( c ) in the y-direction. Although not shown, the processor 152 also determines offsets in the y-direction at apertures 300 ( a ) and 300 ( b ). In another embodiment, the processor 152 may determine the offsets from a change in the position of another portion of each projection before and after the object is introduced.
- the processor 152 determines the phase gradient in two orthogonal directions (step 618 ).
- the processor 152 can determine the local phase gradient in two orthogonal directions based on the offsets in the x and y directions using Eqns. (2) and (3).
- wavefront sensors having structured apertures 300 (e.g., four-hole aperture or single-hole aperture sensors)
- the wavefront sensors of some embodiments measure:
- G x (x, y) is the two-dimensional phase gradient in the x-direction
- G y (x, y) is the two-dimensional phase gradient in the y-direction
- A(x, y) is the two-dimensional amplitude of the detected wavefront.
- the processor 152 can measure the net wavefront gradient G x (x, y) and G y (x, y) at each aperture respectively based on Eq. (10):
- D is the transparent layer (spacer) thickness
- the offset in the x-direction is offset x (x,y) and the offset in the y-direction is offset y (x,y).
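To make the offset-to-gradient step concrete: the spot displaced by offset_x across a spacer of thickness D implies a wavefront tilt of roughly offset_x/D. The sketch below assumes the simple small-angle form G ≈ k·offset/D, since the patent's Eq. (10) is not reproduced in this excerpt; the function name is hypothetical.

```python
import numpy as np

def phase_gradients(offset_x, offset_y, D, wavelength):
    """Estimate local wavefront phase gradients from spot offsets.
    Small-angle sketch: a spot shifted by offset_x over spacer
    thickness D implies tilt ~ offset_x / D, hence a phase gradient
    of k * offset_x / D (assumed form of the patent's Eq. (10))."""
    k = 2 * np.pi / wavelength           # wavenumber
    G_x = k * np.asarray(offset_x) / D   # rad per unit length, x-direction
    G_y = k * np.asarray(offset_y) / D   # rad per unit length, y-direction
    return G_x, G_y
```

Offsets and D must share one length unit; the result is then in radians per that unit.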
- the processor 152 also measures the amplitude of the image wavefront 120 (step 620 ).
- the processor 152 measures the amplitude by summing up all the intensity values in each separated projection/distribution associated with each structured aperture 300 . With the amplitude and phase gradient information, the quantitative DIC device 100 has sufficient data to reconstruct the image wavefront at the plane z at the light detector 230 .
- the quantitative DIC device 100 can mathematically reconstruct (unwrap) the detected wavefront by combining the measured data appropriately using an unwrapping method (step 622 ).
- One unwrapping method is given by Eq. (9):
- Numerous approaches exist for reconstructing a field distribution from its measured gradients (unwrapping). Some examples of suitable unwrapping methods include the Affine transformation method, the least squares method, the Frankot-Chellappa method, etc. All unwrapping methods should return the same answer as the signal-to-noise ratio (SNR) of the measurements approaches infinity; in practice, they vary in performance depending on the quantity and type of noise present in the measurements.
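One of the named methods, the Frankot-Chellappa algorithm, integrates the two measured gradient maps in the Fourier domain. A minimal sketch follows (hypothetical function name; gradients assumed in radians per pixel on a periodic grid):

```python
import numpy as np

def frankot_chellappa(gx, gy):
    """Reconstruct a phase surface from its x/y gradient maps using
    Frankot-Chellappa FFT-based least-squares integration."""
    h, w = gx.shape
    wx = np.fft.fftfreq(w) * 2 * np.pi      # angular frequencies, x (rad/pixel)
    wy = np.fft.fftfreq(h) * 2 * np.pi      # angular frequencies, y (rad/pixel)
    WX, WY = np.meshgrid(wx, wy)
    GX, GY = np.fft.fft2(gx), np.fft.fft2(gy)
    denom = WX**2 + WY**2
    denom[0, 0] = 1.0                       # avoid division by zero at DC
    PHI = (-1j * WX * GX - 1j * WY * GY) / denom
    PHI[0, 0] = 0.0                         # mean level (piston) is unrecoverable
    return np.real(np.fft.ifft2(PHI))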
- the processor 152 propagates the reconstructed wavefront at plane z to one or more planes z+ ⁇ z intersecting the object 30 (step 624 ).
- the function (f) is well known and studied in electromagnetic theory. For example, this function f is described in Kraus, John Daniel and Fleisch, Daniel A., Electromagnetics with Applications (5th Ed.), Chapters 4-16 (1999).
- the processor 152 may propagate the reconstructed wavefront to any number of planes (z+ ⁇ z) at different depths through the object 30 .
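The propagation step can be realized with the standard angular spectrum method, one common implementation of the propagation function f mentioned above. A sketch, assuming a uniform sampling grid (function name hypothetical):

```python
import numpy as np

def angular_spectrum_propagate(field, dz, wavelength, dx):
    """Propagate a complex field by distance dz using the angular
    spectrum method. dx is the grid spacing; all lengths share one unit."""
    h, w = field.shape
    fx = np.fft.fftfreq(w, d=dx)            # spatial frequencies, x
    fy = np.fft.fftfreq(h, d=dx)            # spatial frequencies, y
    FX, FY = np.meshgrid(fx, fy)
    # Longitudinal spatial frequency; evanescent components are suppressed.
    arg = (1.0 / wavelength) ** 2 - FX**2 - FY**2
    kz = 2 * np.pi * np.sqrt(np.maximum(arg, 0.0))
    H = np.exp(1j * kz * dz) * (arg > 0)
    return np.fft.ifft2(np.fft.fft2(field) * H)
```

Calling this repeatedly with different dz values yields the stack of planes (z+Δz) through the object described above.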
- the object 30 may have a dimension h along the z axis and may be located adjacent to the surface 230 ( a ) of the light detector 230 .
- particular depths are used to image a region of the object 30 .
- the processor 152 may propagate the reconstructed wavefront to a plane at the middle of the object 30 .
- the processor 152 generates two-dimensional and three-dimensional images (step 626 ). In one embodiment, the processor 152 generates two-dimensional images based on the reconstructed wavefront, the intensity distribution, the phase gradient distribution in the x-direction, and/or the phase gradient distribution in the y-direction. The processor 152 can also combine the two-dimensional images to generate a three-dimensional image of the object 30 or portions of the object 30 .
- an output device (e.g., a printer, display, etc.) of the quantitative DIC device 100 can output various forms of data.
- the quantitative DIC device 100 can output a two-dimensional local intensity image map, a two-dimensional phase gradient image map in the x-direction, a two-dimensional phase gradient image map in the y-direction, a two-dimensional reconstructed image, a two-dimensional propagated image, and/or a three-dimensional image.
- the quantitative DIC device can also further analyze the two-dimensional wavefront data.
- the quantitative DIC device 100 can use the intensity data to analyze biological properties of the object 30 .
- FIG. 16( a ) is an intensity/amplitude image taken of a starfish embryo using a quantitative DIC microscope having an SAI wavefront sensor, according to an embodiment of the invention.
- FIG. 16( b ) is an image based on phase gradient in the x direction taken of a starfish embryo using a quantitative DIC device having an SAI wavefront sensor, according to an embodiment of the invention.
- FIG. 16( c ) is an image based on phase gradient in the y direction taken of a starfish embryo using a quantitative DIC device having an SAI wavefront sensor, according to an embodiment of the invention.
- FIGS. 16( d ), 16 ( e ) and 16 ( f ) showcase the results of some of the unwrapping algorithms applied to the raw amplitude, x-direction differential phase, and y-direction differential phase data of FIGS. 16( a ), 16 ( b ) and 16 ( c ), according to embodiments of the invention.
- FIG. 16( d ) is an image reconstructed using the least squares unwrapping method applied to the raw amplitude/intensity, phase gradient in the x-direction and phase gradient in the y-direction shown in FIGS. 16( a ), 16 ( b ) and 16 ( c ) respectively.
- FIG. 16( e ) is an image reconstructed using the Frankot Chellappa unwrapping method applied to the raw amplitude/intensity, phase gradient in the x-direction and phase gradient in the y-direction shown in FIGS. 16( a ), 16 ( b ) and 16 ( c ) respectively.
- FIG. 16( f ) is an image reconstructed using the Affine transformation unwrapping method applied to the raw amplitude/intensity, phase gradient in the x-direction and phase gradient in the y-direction shown in FIGS. 16( a ), 16 ( b ) and 16 ( c ) respectively.
- FIG. 17( a ) is an image of potato starch storage granules in immersion oil taken by a conventional transmission microscope.
- FIG. 17( b ) is an image of potato starch storage granules in immersion oil taken by a conventional DIC microscope.
- FIG. 17( c ) is an intensity image of potato starch storage granules in immersion oil taken by a quantitative DIC device in a microscope system, according to an embodiment of the invention.
- FIG. 17( d ) is an artifact-free x-direction phase image of potato starch storage granules in immersion oil taken by a quantitative DIC device in a microscope system, according to an embodiment of the invention.
- FIG. 17( e ) is an artifact-free y-direction phase image of potato starch storage granules in immersion oil taken by a quantitative DIC device 100 in a microscope system, according to an embodiment of the invention.
- Birefringent objects such as the potato starch storage granules in FIGS. 17( a )- 17 ( e ) can alter the polarization of the two displaced fields in a conventional DIC microscope, such that the subsequent combination of the two fields in FIG. 1 is no longer describable by Eq. (1). This can give rise to the Maltese-cross-like pattern artifacts in the resulting conventional DIC images.
- FIG. 17( b ) shows Maltese-cross-like pattern artifacts in the conventional DIC image of the potato starch storage granules.
- a quantitative DIC device 100 uses unpolarized light and does not rely on polarization for image processing.
- the quantitative DIC device 100 can image birefringent samples (e.g., potato starch storage granules) without artifacts.
- FIGS. 17( d ) and 17 ( e ) show images of birefringent samples taken by a quantitative DIC device in a microscope system.
- the images of the potato starch storage granules (i.e., birefringent samples) are free of artifacts.
- the SAI wavefront sensor, a Shack-Hartmann wavefront sensor, or an OFM wavefront sensor can be placed into a camera system.
- One advantage of placing the wavefront sensor into a camera is that the camera can detect the phase gradient of the projected object wavefront in addition to the intensity information about the projected object wavefront.
- wavefront sensors 110/210 can be used with broadband illumination and/or monochromatic illumination. The discussion above applies to a monochromatic light field distribution in which k is well defined at each point on the image plane. However, wavefront sensing can also be used with a broadband light source and in situations where k at any given point may be a mix of different wave vectors. In this regard, wavefront sensors 110/210 can be used with broadband illumination, monochromatic illumination with mixed k, and broadband illumination with mixed k.
- An example of using broadband illumination with wavefront sensors 110/210 can be found in Cui, Xiquan, Lew, Matthew, Yang, Changhuei, "Quantitative differential interference contrast microscopy based on structured-aperture interference," Applied Physics Letters Vol. 93 (9), 091113 (2008), which is hereby incorporated by reference in its entirety for all purposes.
- the diffraction spot size in a SAI wavefront sensor and the focal spot size in the Shack-Hartmann wavefront sensor of a quantitative DIC device 100 can be used to determine the spread of wave vector k at any given image point.
- the quantitative DIC device 100 in this embodiment can render images where the extent of scattering is plotted.
- the quantitative DIC device 100 can determine the proportionality of the phase gradient response of the wavefront sensor 110 / 210 .
- the quantitative DIC device 100 measures the interference pattern as the wavefront sensor 110 / 210 is illuminated by a suitable illumination source (e.g., a collimated He—Ne laser beam) with a light having suitable properties (e.g., 632.8 nm wavelength, 25 mm beam diameter, and 4 mW power) and with a range of incident angles.
- the total transmission and the offsets of the zero-order spot in both x and y directions can be computed with a suitable method such as a least-squares 2D Gaussian fit.
- the relationship between the offsets of the zero order spot and the normalized phase gradient can be approximately linear.
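Given that approximately linear relationship, a calibration can estimate the response slope by fitting the measured zero-order spot offsets against the known normalized phase gradients of the calibration beams. A minimal sketch (hypothetical helper, not from the patent):

```python
import numpy as np

def fit_offset_response(gradients, offsets):
    """Least-squares linear fit of the offset-vs-phase-gradient response.
    gradients: known normalized phase gradients of calibration beams.
    offsets:   measured zero-order spot offsets at those gradients."""
    g = np.asarray(gradients, dtype=float)
    s = np.asarray(offsets, dtype=float)
    slope, intercept = np.polyfit(g, s, 1)  # degree-1 polynomial fit
    return slope, intercept
```

Inverting the fitted line then converts a measured offset back into an estimated phase gradient during imaging.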
- the quantitative DIC device 100 estimates the
- FIG. 18 shows a block diagram of subsystems that may be present in computer devices that are used in quantitative DIC device 100 , according to embodiments of the invention.
- the host computer 150 or wavefront sensor 110 / 210 may use any suitable combination of components in FIG. 18 .
- The various components previously described in the Figures may operate using one or more computer devices to facilitate the functions described herein. Any of the elements in the Figures may use any suitable number of subsystems to facilitate the functions described herein. Examples of such subsystems or components are shown in FIG. 18.
- the subsystems shown in FIG. 18 are interconnected via a system bus 775 . Additional subsystems such as a printer 774 , keyboard 778 , fixed disk 779 (or other memory comprising computer readable media), monitor 776 , which is coupled to display adapter 782 , and others are shown.
- Peripherals and input/output (I/O) devices which couple to I/O controller 771 , can be connected to the computer system by any number of means known in the art, such as serial port 777 .
- serial port 777 or external interface 781 can be used to connect the computer apparatus to a wide area network such as the Internet, a mouse input device, or a scanner.
- the interconnection via system bus allows the central processor 152 to communicate with each subsystem and to control the execution of instructions from system memory 772 or the fixed disk 779 , as well as the exchange of information between subsystems.
- the system memory 772 and/or the fixed disk 779 may embody a computer readable medium. Any of these elements may be present in the previously described features.
- a computer readable medium according to an embodiment of the invention may comprise code for performing any of the functions described above.
- an output device (e.g., the printer 774) of the quantitative DIC device 100 can output various forms of data.
- the quantitative DIC device 100 can output a two-dimensional local intensity image map, a two-dimensional phase gradient image map in the x-direction, a two-dimensional phase gradient image map in the y-direction, a two-dimensional reconstructed image, a two-dimensional propagated image, and/or a three-dimensional image.
- the quantitative DIC device 100 can also further analyze the two-dimensional wavefront data.
- the quantitative DIC device 100 can use the intensity data to analyze biological properties of the object 30 .
- any of the software components or functions described in this application may be implemented as software code to be executed by a processor using any suitable computer language such as, for example, Java, C++ or Perl using, for example, conventional or object-oriented techniques.
- the software code may be stored as a series of instructions, or commands on a computer readable medium, such as a random access memory (RAM), a read only memory (ROM), a magnetic medium such as a hard-drive or a floppy disk, or an optical medium such as a CD-ROM.
- Any such computer readable medium may reside on or within a single computational apparatus, and may be present on or within different computational apparatuses within a system or network.
Description
- This is a non-provisional application that claims benefit of the filing date of U.S. Provisional Patent Application No. 61/205,487 entitled “Quantitative differential interference contrast (DIC) microscopy and its computed depth sectioning ability” filed on Jan. 21, 2009. That provisional application is hereby incorporated by reference in its entirety for all purposes.
- This non-provisional application is related to the following co-pending and commonly-assigned patent applications, which are hereby incorporated by reference in their entirety for all purposes:
-
- U.S. patent application Ser. No. 11/125,718 entitled “Optofluidic Microscope Device” filed on May 9, 2005.
- U.S. patent application Ser. No. 11/686,095 entitled “Optofluidic Microscope Device” filed on Mar. 14, 2007.
- U.S. patent application Ser. No. 11/743,581 entitled “On-chip Microscope/Beam Profiler based on Differential Interference Contrast and/or Surface Plasmon Assisted Interference” filed on May 2, 2007.
- U.S. patent application Ser. No. 12/398,098 entitled “Methods of Using Optofluidic Microscope Devices” filed Mar. 4, 2009.
- U.S. patent application Ser. No. 12/398,050 entitled “Optofluidic Microscope Device with Photosensor Array” filed on Mar. 4, 2009.
- U.S. patent application Ser. No. 12/638,518 entitled “Techniques for Improving Optofluidic Microscope Devices” filed on Dec. 15, 2009.
- U.S. patent application Ser. No. 12/435,165 entitled “Quantitative Differential Interference Contrast (DIC) Microscopy and Photography based on Wavefront Sensors” filed May 4, 2009.
- Embodiments of the present invention generally relate to quantitative differential interference contrast (DIC) devices with wavefront sensors. More specifically, certain embodiments relate to quantitative DIC devices with wavefront sensors used in applications such as microscopy or photography, and that are adapted to compute depth sectioning of specimens and other objects.
- DIC microscopes render excellent contrast for optically transparent biological samples without the need to introduce exogenous contrast agents into the samples. Due to their noninvasive nature, DIC microscopes are widely used in biology laboratories.
- Conventional DIC microscopes and other conventional DIC devices typically operate by first creating two identical illumination light fields through polarization selection.
FIG. 1 is a schematic drawing of a conventional DIC device 10 that operates by interfering slightly displaced duplicate image light fields of polarized light. The conventional DIC device 10 includes an illumination source 20 providing polarized light to an object 30. As illustrated, the light fields are transmitted through the object 30 and are laterally displaced with respect to each other along the x-direction. A net phase lag (typically π/2) is then introduced on one of the transmitted image light fields. The two light fields are allowed to interfere with each other at the image plane 40. More simply, the process is equivalent to duplicating the transmitted image light field, laterally displacing a copy slightly and detecting the interference of the two light fields at image plane 40.
- Mathematically, this implies that the observed DIC intensity image 42 from the conventional DIC device 10 with a magnification factor of M is given by Eq. (1):
- where B(x, y) = |ψ(x−Δ/2, y)|² + |ψ(x+Δ/2, y)|²,
C(x, y) = 2|ψ(x−Δ/2, y)||ψ(x+Δ/2, y)|, and ψ(x, y) is the image wavefront as relayed by the microscope for each light field, ψ_DIC(x, y) is the DIC image wavefront, and Δ = Ma is the relative displacement of the images associated with the light fields. The last expression in Eq. (1) is valid only in situations where the phase difference is small. - However, conventional DIC devices have several limitations. One major limitation is that conventional DIC devices translate phase variations into amplitude (intensity) variations. As shown in Eq. (1), the DIC intensity image I_DIC(x, y) is a sine function of the differential phase, so the phase information cannot be interpreted directly from the intensity of the DIC image. Also, the B(x, y) and C(x, y) terms both contain amplitude information, so the DIC image contains entangled amplitude and phase information. Therefore, phase variations cannot be easily disentangled from amplitude (intensity) variations that arise from absorption and/or scattering by an object. In other words, conventional DIC devices do not distinguish between the effects of absorption and phase variation. As a consequence of this entanglement of amplitude and phase information and the nonlinear phase gradient response, conventional DIC devices are inherently qualitative and do not provide quantitative phase measurements.
- Another limitation of conventional DIC devices is that they use polarized light and depend on the polarization in their phase-imaging strategies. Since polarized light must be used, conventional DIC devices generate images of birefringent objects (e.g., potato starch storage granules) that typically suffer from significant artifacts.
- Embodiments of the present invention relate to methods of using a quantitative DIC device(s) to compute depth sections of an object. An object introduced into the quantitative DIC device alters a light field and induces an image wavefront having an amplitude and phase gradient. The light detector at the back of the wavefront sensor measures the distribution of light passing through structured apertures in the wavefront sensor. The wavefront sensor uses the light distribution to separately measure the amplitude and phase gradient of the wavefront in two orthogonal directions. The quantitative DIC device numerically reconstructs the image wavefront from the amplitude and phase gradient, and computationally propagates the reconstructed wavefront to planes at different depths through the thickness of the object.
- One embodiment is directed to a method for computing depth sectioning of an object using a quantitative differential interference contrast device having a wavefront sensor with one or more structured apertures, a light detector and a transparent layer between the one or more structured apertures and the light detector. The method comprises receiving light by the light detector through the one or more structured apertures. The method further comprises measuring an amplitude of an image wavefront based on the received light and measuring a phase gradient in two orthogonal directions of the image wavefront based on the received light. Then, the processor reconstructs the image wavefront using the measured amplitude and phase gradient and propagates the reconstructed wavefront to a first plane intersecting an object at a first depth.
- Another embodiment is directed to a wavefront sensor comprising an aperture layer having one or more structured apertures, a light detector and a transparent layer between the aperture layer and the light detector. The light detector measures the amplitude of the wavefront and the phase gradient in two orthogonal directions based on the light received through the structured apertures.
- Another embodiment is directed to a quantitative DIC device comprising a wavefront sensor and a processor communicatively coupled to the wavefront sensor. The wavefront sensor comprises an aperture layer having one or more structured apertures, a light detector and a transparent layer between the aperture layer and the light detector. The light detector measures the amplitude of the wavefront and the phase gradient in two orthogonal directions based on the light received through the structured apertures. The processor reconstructs the wavefront using the measured amplitude and phase gradient.
- These and other embodiments of the invention are described in further detail below.
-
FIG. 1 is a schematic drawing of a conventional DIC device. -
FIG. 2 is a schematic drawing of a quantitative DIC device, according to an embodiment of the invention. -
FIG. 3 is a schematic drawing of a side view of components of a quantitative DIC device in a first configuration, according to an embodiment of the invention. -
FIG. 4 is a schematic drawing of a side view of components of a quantitative DIC device in a second configuration, according to an embodiment of the invention. -
FIG. 5 is a schematic drawing of a side view of components of a quantitative DIC device in a third configuration, according to an embodiment of the invention. -
FIG. 6( a) is a schematic drawing of a side view of components of a quantitative DIC device having a SAI wavefront sensor, according to an embodiment of the invention. -
FIG. 6( b) is a schematic drawing of a side view of components of a quantitative DIC device having a SAI wavefront sensor, according to an embodiment of the invention. -
FIG. 6( c) is a schematic drawing of a side view of components of a quantitative DIC device having a SAI wavefront sensor, according to an embodiment of the invention. -
FIG. 7( a)(1) is a schematic drawing of a perspective view of a two-dimensional structured aperture in the form of a ‘plus’ sign configuration, according to an embodiment of the invention. -
FIG. 7( a)(2) is a schematic drawing of a perspective view of a two-dimensional structured aperture in the form of a ‘plus’ sign configuration, according to an embodiment of the invention. -
FIGS. 7( b), 7(c), and 7(d) are images taken by a scanning electron microscope of two-dimensional structured apertures, according to embodiments of the invention. -
FIG. 8 is a schematic drawing of a side view of components of a quantitative DIC device having a SAI wavefront sensor, in accordance with embodiments of the invention. -
FIG. 9 is a schematic drawing of a side view of components of a quantitative DIC device having a Shack-Hartmann wavefront sensor, in accordance with an embodiment of the invention. -
FIG. 10( a) is a schematic drawing of a top view of components of an intensity OFM device including light transmissive regions in the form of a one-dimensional array of single light transmissive regions. -
FIG. 10( b) is a schematic drawing of a top view of components of a quantitative DIC device having an OFM wavefront sensor, according to an embodiment of the invention. -
FIG. 11 is a schematic diagram illustrating the propagation approach, according to embodiments of the invention. -
FIG. 12 is a schematic diagram of the propagation approach using a processor, according to embodiments of the invention. -
FIGS. 13( a) and 13(b) are schematic drawings of a focusing approach taken by a traditional microscope. -
FIG. 14 is a flow chart of a computed depth sectioning method using a quantitative DIC device having a wavefront sensor, according to embodiments of the invention. -
FIG. 15( a) is a side view of components of the wavefront sensor, according to an embodiment of the invention. -
FIG. 15( b) is a top view of components of the wavefront sensor inFIG. 15( a), according to an embodiment of the invention. -
FIG. 15( c) is a sectional view of components of the wavefront sensor inFIG. 15( a) through the center of structured aperture, according to an embodiment of the invention. -
FIG. 16( a) is an intensity/amplitude image taken of a starfish embryo using a quantitative DIC microscope having an SAI wavefront sensor, according to an embodiment of the invention. -
FIG. 16( b) is an image based on phase gradient in the x direction taken of a starfish embryo using a quantitative DIC device having an SAI wavefront sensor, according to an embodiment of the invention. -
FIG. 16( c) is an image based on phase gradient in the y direction taken of a starfish embryo using a quantitative DIC device having an SAI wavefront sensor, according to an embodiment of the invention. -
FIGS. 16( d), 16(e) and 16(f) showcase the results of some of the unwrapping algorithms applied to the raw amplitude, x-direction differential phase, and y-direction differential phase data of FIGS. 16( a), 16(b) and 16(c), according to embodiments of the invention. -
FIG. 17( a) is an image of potato starch storage granules in immersion oil taken by a conventional transmission microscope. -
FIG. 17( b) is an image of potato starch storage granules in immersion oil taken by a conventional DIC microscope. -
FIG. 17( c) is an intensity image of potato starch storage granules in immersion oil taken by a quantitative DIC device in a microscope system, according to an embodiment of the invention. -
FIG. 17( d) is an artifact-free x-direction phase image of potato starch storage granules in immersion oil taken by a quantitative DIC device in a microscope system, according to an embodiment of the invention. -
FIG. 17( e) is an artifact-free y-direction phase image of potato starch storage granules in immersion oil taken by a quantitative DIC device in a microscope system, according to an embodiment of the invention. -
FIG. 18 shows a block diagram of subsystems that may be present in computer devices that are used in quantitative DIC device, according to embodiments of the invention. - Embodiments of the present invention will be described below with reference to the accompanying drawings. Some embodiments include a simple and quantitative DIC device with a wavefront sensor that can be used in applications such as microscopy, photography, or other imaging applications.
- Wavefront sensors of embodiments of the invention can be in any suitable form. For example, the wavefront sensor can be in the form of a single pixel (element) wavefront sensor. In another example, the wavefront sensor can be in the form of a one dimensional wavefront sensor array having sensor elements (e.g., pixels) located along a single direction. In another example, the wavefront sensor can be in the form of a two-dimensional wavefront sensor array comprising sensor elements located along two orthogonal directions.
- In general, quantitative DIC devices of embodiments of the invention provide advantages because they can separately measure the amplitude and phase gradient of an image wavefront in two orthogonal directions. With this information, the quantitative DIC device has sufficient data to numerically reconstruct an image wavefront and propagate it to other planes.
- Quantitative DIC devices of embodiments of the invention also provide advantages because they do not require polarized light as part of their imaging technique. Since these quantitative DIC devices are not dependent on the polarization of the light (illumination), these devices can use unpolarized light to generate artifact-free DIC images for both birefringent and homogenous objects. Also, an ordinary light source can be used such as the light source used in a conventional microscope. Another advantage of the quantitative DIC devices of embodiments of the invention is that they integrate DIC functionality onto a simple wavefront sensor. This integration is advantageous over conventional DIC devices which use bulky optical elements to provide DIC functionality. For this reason, embodiments of the present invention are more compact, less expensive, and simpler in use and design than conventional DIC devices.
- Quantitative DIC devices of embodiments of the invention operate by selectively combining and interfering light fields of unpolarized light at two adjacent points as illustrated in
FIG. 2. FIG. 2 is a schematic drawing of a quantitative DIC device 100, according to an embodiment of the invention. The quantitative DIC device 100 includes an illumination source 20 providing unpolarized light to an object 30. In this illustrated embodiment, the quantitative DIC device 100 employs a phase comparison that selectively combines and interferes the light fields of unpolarized light at two adjacent points of the image with a separation a. The quantitative DIC device 100 uses the phase comparison to generate the image 42 of the object 30 at the image plane 40. Quantitative DIC devices employing this phase comparison can separately measure the amplitude and phase gradient of the light scattered by, or transmitted through, the object 30. - I. Quantitative DIC Device Configurations
- Three configurations of quantitative DIC devices are described below. The first configuration includes a quantitative DIC device with a single pixel wavefront sensor. In this configuration, raster scanning can be employed to measure two-dimensional data about the image wavefront. The second configuration includes a two-dimensional wavefront sensor array which can measure the two-dimensional data about the image wavefront at the same time, without the need for raster scanning. The first and second configurations use a wavefront relay system to project the image wavefront from the object to the wavefront sensor. The third configuration eliminates the wavefront relay system of the previous two configurations.
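The raster-scanning approach of the first configuration can be sketched as follows. This is an illustrative sketch only; the function name and the `read_sensor(x, y)` callback, which stands in for one reading of the single pixel wavefront sensor at one scan position, are hypothetical and not part of the described embodiments.

```python
import numpy as np

def raster_scan_maps(read_sensor, nx, ny, step):
    """Assemble two-dimensional intensity and slope maps by stepping a
    single pixel wavefront sensor (or, equivalently, the object) across
    an nx-by-ny grid of scan positions.

    read_sensor(x, y) is assumed to return a tuple
    (intensity, slope_x, slope_y) measured at position (x, y)."""
    intensity = np.zeros((ny, nx))
    slope_x = np.zeros((ny, nx))
    slope_y = np.zeros((ny, nx))
    for j in range(ny):
        for i in range(nx):
            # one reading of the single pixel wavefront sensor per position
            intensity[j, i], slope_x[j, i], slope_y[j, i] = read_sensor(i * step, j * step)
    return intensity, slope_x, slope_y
```

A two-dimensional wavefront sensor array, as in the second and third configurations, replaces this loop with a single snapshot reading.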
- In some embodiments, the quantitative DIC devices of these configurations do not depend on the polarization of light as part of their imaging method. These quantitative DIC devices can generate artifact-free DIC images, even for birefringent objects, if unpolarized light is used.
- A. First Configuration
-
FIG. 3 is a schematic drawing of a side view of components of a quantitative DIC device 100 in a first configuration, according to an embodiment of the invention. In this configuration, the quantitative DIC device 100 includes a single pixel wavefront sensor 110. An illumination source 20 provides light to an object 30 being imaged by the quantitative DIC device 100. The object 30 modulates or otherwise alters the light and induces an image wavefront 120. The quantitative DIC device 100 also includes a wavefront relay system 130 (e.g., one or more lenses) in communication with the single pixel wavefront sensor 110. The wavefront relay system 130 projects or otherwise relays the image wavefront 120 generated by the object 30 onto the single pixel wavefront sensor 110. The single pixel wavefront sensor 110 measures the local intensity and/or slope of the projected image wavefront 120 induced by a point of the object 30, which conjugates with the single pixel wavefront sensor 110. - The
quantitative DIC device 100 further includes a raster scanning device 140 for scanning the object 30 or scanning the single pixel wavefront sensor 110 to generate two-dimensional maps of the local intensity and/or slope of the image wavefront 120 induced by the object 30. The quantitative DIC device 100 further includes a host computer 150 having a processor 152 in communication with a computer readable medium (CRM) 154. The host computer 150 is in communication with the single pixel wavefront sensor 110 to receive wavefront data. One or more components of the quantitative DIC device 100 can be located within a body, which can be a multi-layer structure or a single, monolithic structure. - The
quantitative DIC device 100 measures the two-dimensional amplitude and the phase gradient in two orthogonal directions of the image wavefront 120 based on the measured intensity distribution. The total transmission of the interference is proportional to the average image intensity at the aperture plane. The offsets (Δs and Δt) of the zero-order interference spot are related to the wavelength-normalized phase gradients (θx and θy) at the aperture, respectively, through the spacer (transparent layer) thickness (H) and refractive index (n) as:
θx ≈ n·Δs/H  (2)

θy ≈ n·Δt/H  (3)
- Using an unwrapping method, the
quantitative DIC device 100 can numerically reconstruct the two-dimensional image wavefront 120 associated with the object 30 using the measured two-dimensional amplitude and phase gradient information. The quantitative DIC device 100 can then computationally propagate the image wavefront 120 to one or more z-planes at different depths through the thickness of the object 30. The quantitative DIC device 100 can generate an intensity image, a phase gradient image in the x-direction, a phase gradient image in the y-direction of the object 30, a reconstructed image, and propagated two-dimensional images at different depths through the thickness of the object 30. In some cases, the two-dimensional images are cross-sectional images of the object 30. The quantitative DIC device 100 can also combine the two-dimensional wavefront data to generate three-dimensional data and images of the object 30. The computed depth sectioning method will be described in further detail in following sections. - An
illumination source 20 can refer to any suitable device or other source of light such as ambient light. The light provided by the illumination source 20 can be of any suitable wavelength and intensity. Also, the light can include polarized and/or unpolarized light. In embodiments where unpolarized light is used, the quantitative DIC device 100 can generate artifact-free DIC images of birefringent specimens or other objects 30 in samples. Suitable illumination sources 20 are available naturally and commercially. In some embodiments, the illumination source 20 can be a component of the quantitative DIC device 100. In other embodiments, the illumination source can be separate from the quantitative DIC device 100. - An
illumination source 20 can be placed in any suitable location and positioned in any suitable direction to provide appropriate light to the object 30. In some embodiments, multiple illumination sources 20 provide light in one or more directions. For example, a camera system including a quantitative DIC device may have a first illumination source 20 that provides light in a direction from the wavefront sensor to the object 30, such as from a flash, and a second illumination source 20 that provides light in another direction. In other embodiments, a single illumination source provides light in a single direction. For example, a microscope system comprising a quantitative DIC device 100 may have a single illumination source 20 positioned to provide light in the negative z-direction. - A
wavefront relay system 130 can refer to a device or combination of devices configured to relay (e.g., project) the image wavefront 120 induced by the object 30 onto a wavefront sensor such as the single pixel wavefront sensor 110 in FIG. 3 or the wavefront sensor array 210 shown in FIGS. 4 and 5. In one example, a wavefront relay system 130 includes one or more lenses. The light may be relayed in any suitable manner. In FIG. 3, the wavefront relay system 130 projects the image wavefront 120 from the object 30 onto a single pixel wavefront sensor 110. - A
raster scanning device 140 can refer to any suitable device for raster scanning the object 30 or raster scanning the wavefront sensor. The raster scanning device 140 can be a component of the quantitative DIC device 100 in some embodiments and can be a separate device in other embodiments. - In the illustrated embodiment, the
host computer 150 is a component of the quantitative DIC device 100. In other embodiments, the host computer 150 can be a separate device. Although the processor 152 and CRM 154 are shown as components of the quantitative DIC device 100, in other embodiments the processor 152 and/or CRM 154 can be components of the wavefront sensor. - A processor (e.g., a microprocessor) can refer to any suitable device for processing the functions of the
quantitative DIC device 100. In the illustrated embodiment, the processor 152 receives signals with wavefront data associated with the intensity distribution and/or slope data of the image wavefront 120 measured by the single pixel wavefront sensor 110. The wavefront data may include a two-dimensional map of the intensity distribution and/or slope of the image wavefront 120 measured by the single pixel wavefront sensor 110 by employing the raster scanning device 140, the amplitude and phase gradient data associated with the image wavefront 120, the wavelength(s) of the light, and/or other information about the light received by the single pixel wavefront sensor 110. The processor 152 executes code stored on the CRM 154 to perform some of the functions of the quantitative DIC device 100. These functions include interpreting the light distribution and/or slope data measured by the single pixel wavefront sensor 110, measuring the center of a projection through a structured aperture, separating the projections from structured apertures, determining offsets of the projections or the focal points, determining the amplitude and phase gradient of the image wavefront 120 in two orthogonal directions using the light distribution data, reconstructing the image wavefront using the amplitude and phase gradient data in two orthogonal directions, propagating the reconstructed wavefront from the detector z plane to one or more z planes, generating one or more two-dimensional images based on intensity, phase gradient in the x-direction, phase gradient in the y-direction, the reconstructed wavefront, and/or the propagated wavefronts, combining two-dimensional image data to generate three-dimensional data and images of the object 30, displaying one or more images of the object 30, and other functions associated with computed depth sectioning and image processing. - A
CRM 154 can refer to a memory that stores data and may be in any suitable form including a memory chip, etc. The CRM 154 stores the code for performing some functions of the quantitative DIC device 100. The code is executable by the processor 152. In one embodiment, the CRM 154 comprises a) code for interpreting the light distribution data received from the single pixel wavefront sensor 110, b) code for generating local slope data from the light distribution data, c) code for determining the amplitude of the image wavefront 120 and determining the phase gradient of the image wavefront 120 in two orthogonal directions using the light distribution data, d) code for reconstructing the image wavefront using the amplitude data and the phase gradient data in two orthogonal directions, e) code for propagating the reconstructed wavefront from the detector z plane to one or more z planes, f) code for generating one or more two-dimensional images based on intensity, phase gradient in the x-direction, phase gradient in the y-direction, the reconstructed wavefront, and/or the propagated wavefronts, g) code for combining two-dimensional image data to generate three-dimensional data and images of the object 30, h) code for displaying one or more images of the object 30, and i) any other suitable code for computed depth sectioning and image processing. The CRM 154 may also include code for performing any of the signal processing or other software-related functions that may be created by those of ordinary skill in the art. The code may be in any suitable programming language including C, C++, Pascal, etc. - Although not shown, the
quantitative DIC device 100 may also include a display communicatively coupled to the processor 152. Any suitable display may be used. In one embodiment, the display may be a part of the quantitative DIC device 100. The display may provide information such as the image of the object 30 to a user of the quantitative DIC device 100. In addition, the quantitative DIC device 100 may also have an input device communicatively coupled to the processor 152. - B. Second Configuration
-
FIG. 4 is a schematic drawing of a side view of components of a quantitative DIC device 100 in a second configuration, according to an embodiment of the invention. The second configuration of the quantitative DIC device 100 includes a two-dimensional wavefront sensor array 210 of sensor elements for measuring two-dimensional data about the image wavefront 120 in a single snapshot reading, without the need for raster scanning. - In the illustrated example, the
quantitative DIC device 100 includes an illumination source 20 for providing light to an object 30 being imaged. The object 30 modulates or otherwise alters the light and induces an image wavefront 120. Although a single illumination source 20 providing light in a single direction is shown in the illustrated embodiment, multiple illumination sources can be used providing light in one or more directions. - The
quantitative DIC device 100 also includes a wavefront relay system 130 in communication with the wavefront sensor array 210. The wavefront relay system 130 projects or otherwise relays the image wavefront 120 generated by the object 30 onto the wavefront sensor array 210. Each sensor element (pixel) of the wavefront sensor array 210 measures the local intensity and slope of the image wavefront 120 induced by a point of the object 30 which conjugates with the sensor element (pixel). In this case, the quantitative DIC device 100 naturally measures two-dimensional maps of the local intensity and slope of the image wavefront 120 modulated by the object 30 at the same time. Although a raster scanning device is not shown in the illustrated embodiment, another embodiment can include a raster scanning device 140 to raster scan the object 30 or the wavefront sensor array 210 to form more densely sampled images. - The
quantitative DIC device 100 also includes a host computer 150 having a processor 152 in communication with a computer readable medium (CRM) 154. The host computer 150 is in communication with the wavefront sensor array 210 to receive wavefront data. One or more components of the quantitative DIC device 100 can be located within a body, which can be a multi-layer structure or a single, monolithic structure. - The
quantitative DIC device 100 measures the two-dimensional amplitude and phase gradient of the image wavefront 120 based on the measured intensity distribution using Eqs. (2) and (3). Using an unwrapping method, the quantitative DIC device 100 can reconstruct the two-dimensional image wavefront 120 associated with the object 30 using the measured two-dimensional amplitude and phase gradient information. The quantitative DIC device 100 can computationally propagate the reconstructed image wavefront 120 to one or more z-planes at different depths through the thickness of the object 30. The quantitative DIC device 100 can generate an intensity image, a phase gradient image in the x-direction, a phase gradient image in the y-direction of the object 30, a reconstructed image, and propagated two-dimensional images at different depths through the thickness of the object 30. In some cases, the two-dimensional images are cross-sectional images of the object 30. The quantitative DIC device 100 can also combine the two-dimensional wavefront data to generate three-dimensional data and images of the object 30. The computed depth sectioning method will be described in further detail in following sections. - In the illustrated embodiment, the
host computer 150 is a component of the quantitative DIC device 100. In other embodiments, the host computer 150 can be a separate device. Although the processor 152 and CRM 154 are shown as components of the quantitative DIC device 100, in other embodiments the processor 152 and/or CRM 154 can be components of the wavefront sensor. - In the illustrated embodiment, the
processor 152 receives signals with wavefront data associated with the intensity distribution and slope of the image wavefront 120 measured by the wavefront sensor array 210. The wavefront data may include the two-dimensional map of the intensity distribution and slope of the image wavefront 120 measured by the wavefront sensor array 210, the amplitude and phase gradient information associated with the image wavefront 120, the wavelength(s) of the light, and/or other information about the light received by the wavefront sensor array 210. The processor 152 executes code stored on the CRM 154 to perform some of the functions of the quantitative DIC device 100 such as interpreting the intensity distribution and/or slope data measured by the wavefront sensor array 210, generating amplitude and phase gradient information associated with the image wavefront 120 induced by the object 30, reconstructing an image wavefront 120 using the amplitude and phase gradient information, computing depth sectioning of the object by numerically propagating the image wavefront 120 back to multiple z-planes through the depth of the object 30 to generate two-dimensional image information about the object 30, and generating three-dimensional information about the object 30 from the two-dimensional information at multiple z-planes of the object 30. - A
CRM 154 can refer to any computer readable medium (e.g., memory) that stores data and may be in any suitable form including a memory chip, etc. The CRM 154 stores the code for performing some functions of the quantitative DIC device 100. The code is executable by the processor. In one embodiment, the CRM 154 includes a) code for interpreting the light distribution data received from the wavefront sensor array 210, b) code for generating local slope data from the light distribution data, c) code for determining the amplitude of the image wavefront 120 and determining the phase gradient of the image wavefront 120 in two orthogonal directions using the light distribution data, d) code for reconstructing the image wavefront using the amplitude and phase gradient data in two orthogonal directions, e) code for propagating the reconstructed wavefront from the detector z plane to one or more z planes, f) code for generating one or more two-dimensional images based on intensity, phase gradient in the x-direction, phase gradient in the y-direction, the reconstructed wavefront, and/or the propagated wavefronts, g) code for combining two-dimensional image data to generate three-dimensional data and images of the object 30, h) code for displaying one or more images of the object 30, and i) any other suitable code for computed depth sectioning and image processing. The CRM 154 may also include code for performing any of the signal processing or other software-related functions that may be created by those of ordinary skill in the art. The code may be in any suitable programming language including C, C++, Pascal, etc. - Although not shown, the
quantitative DIC device 100 may also include a display communicatively coupled to the processor 152. Any suitable display may be used. In one embodiment, the display may be a part of the quantitative DIC device 100. The display may provide information such as the image of the object 30 to a user of the quantitative DIC device 100. - C. Third Configuration
-
FIG. 5 is a schematic drawing of a side view of components of a quantitative DIC device 100 in a third configuration, according to an embodiment of the invention. The third configuration of the quantitative DIC device 100 eliminates the wavefront relay system 130 included in the first and second configurations. In the illustrated embodiment, the quantitative DIC device 100 includes a two-dimensional wavefront sensor array 210 of sensor elements for measuring two-dimensional data about the image wavefront 120 at the same time. In another embodiment, a quantitative DIC device 100 in the third configuration comprises a single pixel wavefront sensor 110 and a raster scanning device 140. In yet another embodiment, a quantitative DIC device 100 in the third configuration comprises a one-dimensional OFM wavefront sensor array that uses the optofluidic microscope (OFM) scanning scheme shown in FIGS. 10(a) and 10(b). In this embodiment, the sensor elements detect time varying data as the object 30 passes through a fluid channel in the form of line scans. The line scans are compiled to generate two-dimensional data. - In the illustrated example shown in
FIG. 5, the quantitative DIC device 100 includes an illumination source 20 providing light to an object 30 being imaged. The object 30 modulates or otherwise alters the light, which induces the image wavefront 120. Although a single illumination source 20 providing light in a single direction is shown in the illustrated embodiment, multiple illumination sources can be used providing light in one or more directions. - The
quantitative DIC device 100 also includes a host computer 150 having a processor 152 in communication with a computer readable medium (CRM) 154. The host computer 150 is in communication with the wavefront sensor array 210 to receive wavefront data. - Each sensor element (pixel) of the
wavefront sensor array 210 measures the local intensity and slope of the image wavefront 120 induced by a point of the object 30 which conjugates with the sensor element (pixel). In this case, the quantitative DIC device 100 naturally measures two-dimensional maps of the local intensity and slope of the image wavefront 120 modulated by the object 30 at the same time. Although a raster scanning device is not shown in the illustrated embodiment, another embodiment can include a raster scanning device 140 to raster scan the object 30 or the wavefront sensor array 210 to form more densely sampled images. - The
quantitative DIC device 100 measures the two-dimensional amplitude and phase gradient of the image wavefront 120 based on the measured intensity distribution using Eqs. (2) and (3). Using an unwrapping method, the quantitative DIC device 100 can reconstruct the two-dimensional image wavefront 120 associated with the object 30 using the measured two-dimensional amplitude and phase gradient information. The quantitative DIC device 100 can computationally propagate the reconstructed image wavefront 120 to one or more z-planes at different depths through the thickness of the object 30. The quantitative DIC device 100 can generate an intensity image, a phase gradient image in the x-direction, a phase gradient image in the y-direction of the object 30, a reconstructed image, and propagated two-dimensional images at different depths through the thickness of the object 30. In some cases, the two-dimensional images are cross-sectional images of the object 30. The quantitative DIC device 100 can also combine the two-dimensional wavefront data to generate three-dimensional data and images of the object 30. The computed depth sectioning method will be described in further detail in following sections. - In the illustrated embodiment, the
host computer 150 is a component of the quantitative DIC device 100. In other embodiments, the host computer 150 can be a separate device. Although the processor 152 and CRM 154 are shown as components of the quantitative DIC device 100, in other embodiments the processor 152 and/or CRM 154 can be components of the wavefront sensor. - In the illustrated embodiment, the
processor 152 receives signals with wavefront data associated with the intensity distribution and slope of the image wavefront 120 measured by the wavefront sensor array 210. The wavefront data may include the two-dimensional map of the intensity distribution and slope of the image wavefront 120 measured by the wavefront sensor array 210, the amplitude and phase gradient information associated with the image wavefront 120, the wavelength(s) of the light, and/or other information about the light received by the wavefront sensor array 210. In some embodiments, the wavefront data further includes time varying information from an OFM wavefront sensor, which can be in the form of line scans. The processor 152 executes code stored on the CRM 154 to perform some of the functions of the quantitative DIC device 100 such as interpreting the intensity distribution and/or slope data measured by the wavefront sensor array 210, generating amplitude and phase gradient information associated with the image wavefront 120 induced by the object 30, reconstructing an image wavefront 120 using the amplitude and phase gradient information, computing depth sectioning of the object by numerically propagating the image wavefront 120 back to multiple z-planes through the depth of the object 30 to generate two-dimensional image information about the object 30, and generating three-dimensional information about the object 30 from the two-dimensional information at multiple z-planes of the object 30. - A
CRM 154 can refer to any computer readable medium (e.g., memory) that stores data and may be in any suitable form including a memory chip, etc. The CRM 154 stores the code for performing some functions of the quantitative DIC device 100. The code is executable by the processor. In one embodiment, the CRM 154 comprises a) code for interpreting the light distribution data received from the wavefront sensor array 210, b) code for generating local slope data from the light distribution data, c) code for determining the amplitude of the image wavefront 120 and determining the phase gradient of the image wavefront 120 in two orthogonal directions using the light distribution data, d) code for reconstructing the image wavefront using the amplitude data and the phase gradient data in two orthogonal directions, e) code for propagating the reconstructed wavefront from the detector z plane to one or more z planes, f) code for generating one or more two-dimensional images based on intensity, phase gradient in the x-direction, phase gradient in the y-direction, the reconstructed wavefront, and/or the propagated wavefronts, g) code for combining two-dimensional image data to generate three-dimensional data and images of the object 30, h) code for displaying one or more images of the object 30, and i) any other suitable code for computed depth sectioning and image processing. The CRM 154 may also include code for performing any of the signal processing or other software-related functions that may be created by those of ordinary skill in the art. The code may be in any suitable programming language including C, C++, Pascal, etc. - Although not shown, the
quantitative DIC device 100 may also include a display communicatively coupled to the processor 152. Any suitable display may be used. In one embodiment, the display may be a part of the quantitative DIC device 100. The display may provide information such as the image of the object 30 to a user of the quantitative DIC device 100. - In addition, the
quantitative DIC device 100 of some embodiments may also include a body which incorporates one or more components of the quantitative DIC device 100. The body can be a multi-layer structure or a single, monolithic structure. - In an embodiment where the
quantitative DIC device 100 includes an OFM wavefront sensor, the body is a multi-layer structure. The body forms or includes a fluid channel having a first surface. The body also includes an opaque or semi-opaque aperture layer that is an inner surface layer of the fluid channel. The opaque or semi-opaque aperture layer has light transmissive regions 222 in it. The opaque or semi-opaque aperture layer can be a thin metallic layer in some cases. The body may optionally include a transparent protective layer (not shown) that covers the opaque or semi-opaque aperture layer to isolate the opaque or semi-opaque aperture layer from the fluid and the object 30 moving through the fluid channel. - In general, imaging can be done in many ways. For example, object scanning can be replaced by wavefront sensor scanning by the raster scanning device. As another example, the two-dimensional raster scanning can be replaced by the one-dimensional (1D) optofluidic microscope (OFM) scanning scheme described in U.S. patent application Ser. Nos. 11/686,095, 11/743,581, and 12/638,518, which are hereby incorporated by reference in their entirety for all purposes. An embodiment of a quantitative DIC device that uses a one-dimensional OFM scanning scheme is also described in detail in sections below.
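The compilation of OFM line scans into two-dimensional data described above can be sketched as follows. This sketch is illustrative only: the function name, the assumption of a known constant object speed, and the delay-and-align strategy (each aperture displaced along the flow direction sees the object after a delay of position divided by speed) are assumptions of this sketch, not the specific scheme of the embodiments.

```python
import numpy as np

def compile_ofm_image(traces, aperture_offsets, speed, dt):
    """Assemble a two-dimensional image from OFM time traces.

    traces           : (n_apertures, n_samples) array; one time trace per
                       aperture as the object flows past it
    aperture_offsets : (n_apertures,) positions of the apertures along
                       the flow direction
    speed, dt        : assumed object speed and sampling interval

    An aperture displaced by x along the channel sees the object delayed
    by x / speed, so shifting each trace by that delay aligns all rows
    into one two-dimensional image."""
    delays = np.round(aperture_offsets / (speed * dt)).astype(int)
    n_ap, n_t = traces.shape
    width = n_t - delays.max()
    image = np.empty((n_ap, width))
    for i in range(n_ap):
        # shift each row so all apertures refer to the same object position
        image[i] = traces[i, delays[i]:delays[i] + width]
    return image
```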
- The quantitative DIC devices of embodiments of the invention can be implemented in various ways. In one embodiment, a quantitative DIC device can be in the form of a wavefront sensing chip for use with a microscope, camera or other imaging device. The wavefront sensing chip can be communicatively coupled to the imaging device through a port in the imaging device or the chip can be placed in a housing of the imaging device that accepts the chip. By coupling the wavefront sensing chip to the imaging device, the device can be provided with quantitative DIC functionality such as the ability to capture phase gradient images and compute depth sectioning. For example, a quantitative DIC device can be utilized as a wavefront sensing component of an adaptive optics device. The adaptive optics device operates by measuring distortions in the wavefront and compensating for them with a spatial phase modulator such as a deformable mirror or liquid crystal array.
- II. Wavefront Sensor Types
- Wavefront sensors of embodiments of the invention can be used in various wavefront sensing applications including adaptive optics, optical testing, adaptive microscopy, retina imaging, etc. An example of using wavefront sensors in adaptive microscopy can be found in Booth, M. J., Neil, M. A. A., Juskaitis, R., and Wilson, T., Proceedings of the National Academy of Sciences of the United States of America 99, 5788 (April 2002), which is herein incorporated by reference in its entirety for all purposes. An example of using wavefront sensors in retinal imaging can be found in Liang, J. Z., Williams, D. R., and Miller, D. T., Journal of the Optical Society of America A: Optics, Image Science, and Vision 14, 2884 (November 1997), which is herein incorporated by reference in its entirety for all purposes.
- The wavefront sensor, which can measure the local intensity and slope of the wavefront modulated by the object (sample) at the same time, can be implemented in several different ways. Three types of wavefront sensors are described below. The first type uses a structured aperture interference (SAI) scheme. The second type uses a Shack-Hartmann scheme. The third type uses the optofluidic microscope (OFM) scheme. These types of wavefront sensors can be used in either a single sensor element (pixel) configuration or in a wavefront sensor array (one or two dimensions) of sensor elements. If a single pixel wavefront sensor is used, a raster scanning device can be used to scan the object or the wavefront sensor to measure two-dimensional image data. If a two-dimensional array of wavefront sensors is used, the array can capture the two-dimensional image in a single snapshot reading. Although three types of wavefront sensors are described below, any suitable type of wavefront sensor can be used in embodiments of quantitative DIC devices.
- Wavefront sensors of embodiments can be implemented in conjunction with an imaging system (e.g., a camera system, microscope system, etc.) to provide capabilities such as computing depth sectioning of an object. In these implementations, one or more wavefront sensors can be coupled to the imaging device (e.g., a microscope or camera) or inserted within an imaging device to provide the additional capabilities.
- A. Structured Aperture Interference (SAI) Wavefront Sensor
- The basic concept of the SAI wavefront sensor is similar to the Young's double-slit phase decoding scheme described in U.S. patent application Ser. Nos. 11/743,581, 11/686,095, and 12/638,518, which are hereby incorporated by reference in their entirety for all purposes. Generally, the local slope of the image wavefront can be measured by looking at the offset of the zero diffraction order on the back detector array (light detector), and the local intensity (amplitude) of the image wavefront can be measured by integrating the diffraction signal on the back detector array.
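The decoding of one sensor element's diffraction pattern described above can be sketched as follows. This is an illustrative sketch only: the function name and the centroid-based estimate of the zero-order spot offset are assumptions of this sketch, and the conversion uses the small-angle relation θ ≈ n·offset/H for a spacer of thickness H and refractive index n, consistent with Eqs. (2) and (3).

```python
import numpy as np

def decode_sai(pattern, pixel_pitch, H, n):
    """Decode one structured-aperture interference pattern.

    Integrating the pattern gives the local intensity; the centroid
    shift of the zero-order spot from the on-axis position, scaled by
    the spacer refractive index n and divided by the spacer thickness H,
    estimates the wavelength-normalized phase gradients (small-angle
    approximation)."""
    total = pattern.sum()
    ys, xs = np.indices(pattern.shape)
    cx = (xs * pattern).sum() / total        # centroid, in pixels
    cy = (ys * pattern).sum() / total
    x0 = (pattern.shape[1] - 1) / 2.0        # on-axis (zero-slope) position
    y0 = (pattern.shape[0] - 1) / 2.0
    theta_x = n * (cx - x0) * pixel_pitch / H
    theta_y = n * (cy - y0) * pixel_pitch / H
    return total, theta_x, theta_y
```

Applying this per sensor element across an array (or across raster-scan positions) yields the two-dimensional intensity and phase gradient maps used for reconstruction.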
-
FIGS. 6(a), 6(b) and 6(c) are schematic drawings of a side view of components of a quantitative DIC device 100 having a SAI wavefront sensor, according to embodiments of the invention. In the illustrated examples, the SAI wavefront sensor 110/210 is a single pixel wavefront sensor 110 or a component of a wavefront sensor array 210 of one or two dimensions. - The SAI wavefront sensor 110/210 comprises an aperture layer 220 (e.g., a metal film), a light detector 230, and a transparent layer 240 between the aperture layer 220 and the light detector 230. The aperture layer 220 includes a first light transmissive region 222(a) and a second light transmissive region 222(b). The two light transmissive regions 222(a) and 222(b) are located at a distance D away from the light detector 230 and are separated from each other by a spacing Δx. The transparent layer 240 between the light detector 230 and the aperture layer 220 can include one or more layers of transparent material such as water or a viscous polymer (e.g., SU-8 resin), or can be a vacuum or gas-filled space. An illumination source 20 provides illumination (light) to an object 30, which modulates or otherwise alters the light, inducing an image wavefront. The light detector 230 measures the distribution of light received from the light transmissive regions 222(a) and 222(b). - A light detector 230 (e.g., photosensor) can refer to any suitable device capable of detecting light and generating signals with wavefront data about the intensity, wavelength, wavefront slope, phase gradient in one or more orthogonal directions, and/or other information about the light being detected. The signals may be in the form of electrical current resulting from the photoelectric effect. Some examples of suitable
light detectors 230 include a charge coupled device (CCD) and a linear or two-dimensional array of photodiodes (e.g., avalanche photodiodes (APDs)). A light detector 230 could also be a complementary metal-oxide-semiconductor (CMOS) sensor or a photomultiplier tube (PMT). Other suitable light detectors 230 are commercially available. - The
light detector 230 comprises one or more light detecting elements 232. The light detecting elements 232 can be of any suitable size (e.g., 1-4 microns) and any suitable shape (e.g., circular or square). The light detecting elements 232 can be arranged in any suitable form such as a one-dimensional array, a two-dimensional array, or a multiplicity of one-dimensional and/or two-dimensional arrays. The arrays can have any suitable orientation or combination of orientations. In some cases, the light detecting elements 232 can be arranged in the same form as, and correspond to, the light transmissive regions 222(a) and 222(b). The light detector 230 also comprises a first surface 230(a). - In
FIGS. 6(a) and 6(b), the object 30 being imaged is homogeneous. In FIG. 6(c), the object 30 includes a feature 250 having a refractive index variation from other homogeneous portions of the object 30. As an example, the object 30 in FIG. 6(c) could be a cell and the feature 250 may be a nucleus having a different refractive index than other portions of the cell. - In
FIGS. 6(a), 6(b) and 6(c), light transmissive region 222(a) collects a reference beam of light and light transmissive region 222(b) collects a sample beam of light. In these examples, the transparent layer (spacer) 240 has a refractive index of n and a thickness of D. When a vertical plane wave is incident on the two light transmissive regions 222(a) and 222(b) of the aperture layer 220, the interference pattern 280 will be centered on the light detector 230. In FIG. 6(a), a vertical plane wave is incident on the two light transmissive regions 222(a) and 222(b) of the aperture layer 220, and the interference pattern 280(a) is centered or substantially centered on the light detector 230. In this illustrated example, the sample and reference beams exit the two light transmissive regions 222(a) and 222(b) with the same phase, so the centroid 270(a) of the light intensity distribution 280(a) of their Young's interference 260(a) is centered or substantially centered on the light detector 230. - If an
object 30 is introduced, the phase difference induced by the object 30 between the light transmissive regions 222(a) and 222(b) will shift the centroid 270 of the interference pattern 280 to one side. The phase difference Δφ is directly related to the offset Δs of the interference pattern:
- Δφ = 2πnΔxΔs/(λD), (4)
- when Δs<<D, where λ is the wavelength of the light. In addition, the differential phase dφ/dx (phase gradient) induced by the object 30 is directly related to the offset Δs of the interference pattern:
- dφ/dx = Δφ/Δx = 2πnΔs/(λD). (5)
- In
FIGS. 6(b) and 6(c), the reference and sample beams carry different phases. In FIG. 6(b), the sample beam passes through a homogeneous portion of the object 30. In this case, the centroid 270(b) of the light intensity distribution 280(b) of their Young's interference 260(b) shifts on the light detector 230 by an offset Δs1. In FIG. 6(c), the sample beam passes through a heterogeneous portion of the object 30 and the reference beam passes through a homogeneous portion of the object 30. In this case, the centroid 270(c) of the light intensity distribution 280(c) of their Young's interference 260(c) shifts on the light detector 230 by an offset Δs2. In some embodiments, Δs2 may be greater than Δs1; in other embodiments, Δs2 may be smaller than Δs1. - Using Eqns. 4 and 5, the
quantitative DIC device 100 can measure the phase difference and the phase gradient of the image wavefront from the measured offset. In addition, the quantitative DIC device 100 can measure the local amplitude by integrating the intensity distribution measured by the light detector 230. By measuring the intensity and phase gradient of the light modulated by the object 30 through independent aspects of the interference pattern, the quantitative DIC device 100 can separate the amplitude and phase information. - The one-dimensional Young's experiment with two slits can be generalized to two dimensions by utilizing a variety of two-dimensional structured apertures, e.g., four holes, a rose shape, a ring or Fresnel zone plate, or a single hole. A two-dimensional structured aperture can refer to one or more light
transmissive regions 222 in the aperture layer of a wavefront sensor 110/210 where the light transmissive regions extend in two orthogonal directions (e.g., in the x- and y-directions). With two-dimensional structured apertures, the SAI wavefront sensor 110/210 can measure the local slope and phase gradient of the image wavefront in two orthogonal directions (e.g., the x-direction and y-direction) at the same time. This aperture-based phase decoding scheme can be referred to as SAI wavefront sensing. - Some embodiments of
quantitative DIC devices 100 have an SAI wavefront sensor 110/210 that employs one or more two-dimensional structured apertures to measure the amplitude and the differential phase (phase gradient) in two orthogonal directions at the same time. In these embodiments, the light transmissive regions of the two-dimensional structured aperture are separated by a thickness D from the light detector. The SAI wavefront sensor 110/210 is generally located at the image plane of the imaging system. The structured aperture selectively transmits and combines the light fields from two directions on the image to create an interference pattern on the light detector. The total transmission of the interference pattern is proportional to the average image intensity at the light transmissive region. - Generally,
wavefront sensors 110/210 (e.g., four-hole or single-hole aperture sensors) of some embodiments can measure the spatial phase gradient of the light field. Mathematically, the wavefront sensors 110/210 of some embodiments measure: -
Gx(x,y) = kx(x,y)/k0 ≈ (dφ(x,y)/dx)/k0, (6) -
Gy(x,y) = ky(x,y)/k0 ≈ (dφ(x,y)/dy)/k0, and (7) -
A(x,y), (8) - where Gx(x, y) is the two-dimensional phase gradient in the x-direction, Gy(x, y) is the two-dimensional phase gradient in the y-direction, and A(x, y) is the two-dimensional amplitude of the detected wavefront. As such, a
quantitative DIC device 100 with a wavefront sensor 110/210 can mathematically reconstruct (unwrap) the detected wavefront by combining the measured data appropriately. One unwrapping method is given by Eq. (9): -
- φ(x,y) = k0[∫0^x Gx(x′,0)dx′ + ∫0^y Gy(x,y′)dy′]. (9)
- Numerous approaches exist for reconstructing the field distribution (unwrapping). The unwrapping methods should all return the same answer if the signal-to-noise ratio (SNR) of the measurements approaches infinity; in practice, the methods vary in performance depending on the quantity and type of noise present in the measurements.
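As a hedged sketch of one such unwrapping method (the array names, grid spacings, and choice of integration path are illustrative assumptions, not from the patent), the measured gradients can be path-integrated numerically, for example along the first row using Gx and then down each column using Gy:

```python
import numpy as np

def unwrap_wavefront(A, Gx, Gy, dx, dy, k0):
    """Rebuild psi = A*exp(i*phi) from normalized phase gradients Gx, Gy.

    Uses one simple integration path: along the top row with Gx,
    then down each column with Gy. With noisy data, different paths
    give different answers; this is the simplest choice.
    """
    phi = np.zeros_like(A, dtype=float)
    # phase along the top row from the x-gradient
    phi[0, 1:] = k0 * np.cumsum(Gx[0, :-1]) * dx
    # phase down each column from the y-gradient
    phi[1:, :] = phi[0, :] + k0 * np.cumsum(Gy[:-1, :], axis=0) * dy
    return A * np.exp(1j * phi)
```

For a wavefront with constant gradients (a tilted plane wave), this cumulative-sum integration recovers the linear phase ramp exactly; noise robustness is where the more elaborate unwrapping methods mentioned above differ.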
- Embodiments of the
SAI wavefront sensor 110/210 can use any suitable type of two-dimensional structured aperture. For example, the quantitative DIC device 100 can use a two-dimensional structured aperture in the form of a 'plus' sign configuration with four light transmissive regions (e.g., holes) arranged in the orthogonal x and y directions. An exemplary quantitative DIC device that uses a four-hole structured aperture for differential phase imaging can be found in Lew, Matthew, Cui, Xiquan, Heng, Xin, Yang, Changhuei, Interference of a four-hole aperture for on-chip quantitative two-dimensional differential phase imaging, Optics Letters, Vol. 32, No. 20, 2963 (2007), which is hereby incorporated by reference in its entirety for all purposes. Some other examples of two-dimensional structured apertures include a single petal-shaped aperture, a ring or Fresnel zone plate, and other suitable arrangements of light transmissive regions extending in orthogonal directions. - In some embodiments, a
quantitative DIC device 100 has a SAI wavefront sensor 110/210 that employs a two-dimensional structured aperture of four light transmissive regions (e.g., holes) in the form of a 'plus' sign to measure the differential phase and amplitude of an image wavefront in the x-direction and y-direction at the same time. The two long axes of the 'plus' sign lie in the orthogonal x and y directions respectively. By placing the SAI wavefront sensor in the image plane of an imaging system, the four light transmissive regions selectively transmit and combine light fields from four adjacent points on the image to create an interference pattern read by the light detector. The total transmission of the interference pattern is proportional to the average image intensity at the light transmissive region. Given the spacer thickness D, the offsets offsetx(x, y) and offsety(x, y) of the zero-order interference spot are related to the net wavefront gradients Gx(x, y) and Gy(x, y) at the light transmissive region respectively:
- offsetx(x,y) ≈ D·Gx(x,y)/n, and offsety(x,y) ≈ D·Gy(x,y)/n. (10)
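As an illustration of how a single structured-aperture pixel readout might be decoded (the function and variable names are hypothetical, not from the patent), the sketch below locates the centroid of the sampled interference pattern, converts the x and y offsets into wavefront gradients with the small-angle relation offset ≈ D·G/n, and integrates the pattern for the local intensity:

```python
import numpy as np

def sai_pixel_readout(pattern, pixel_pitch, D, n):
    """Estimate (intensity, Gx, Gy) for one structured-aperture pixel.

    pattern: 2-D array of detector intensities for this pixel's region.
    pixel_pitch: detector element size, same length units as spacer D.
    Returns the integrated intensity and the two wavefront gradients
    inferred from the centroid offset of the interference pattern.
    """
    total = pattern.sum()                          # local intensity signal
    ys, xs = np.indices(pattern.shape)
    # centroid of the interference pattern, in length units
    cx = (xs * pattern).sum() / total * pixel_pitch
    cy = (ys * pattern).sum() / total * pixel_pitch
    # offsets relative to the pattern center (zero-gradient position)
    x0 = (pattern.shape[1] - 1) / 2 * pixel_pitch
    y0 = (pattern.shape[0] - 1) / 2 * pixel_pitch
    dsx, dsy = cx - x0, cy - y0
    # invert the small-angle relation offset ~ D * G / n
    return total, n * dsx / D, n * dsy / D
```

The same centroid-plus-integration decoding applies whether the pixel belongs to a single-pixel sensor that is raster scanned or to one element of a two-dimensional wavefront sensor array.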
FIGS. 7(a)(1) and 7(a)(2) are schematic drawings of a perspective view of a two-dimensional structured aperture 300 in the form of a 'plus' sign configuration, according to embodiments of the invention. In the illustrated examples, the quantitative DIC device 100 comprises an aperture layer 220 having a two-dimensional structured aperture 300 of four light transmissive regions 222 extending in both the x and y directions in a 'plus' sign configuration. Using the two-dimensional structured aperture 300, the quantitative DIC device 100 can measure the differential phase and amplitude in both the x and y directions. The quantitative DIC device 100 also has a light detector 230 located at a distance D from the aperture layer 220. The quantitative DIC device 100 also includes a transparent layer 240 with a thickness D located between the aperture layer 220 and the light detector 230. The transparent layer 240 can be comprised of one or more layers of transparent material such as water or a viscous polymer (e.g., SU-8 resin) or can be a vacuum or gas-filled space. Any suitable spacing Δx between the light transmissive regions 222 in the 'plus' sign configuration can be used. Some examples of suitable spacings Δx are 1 μm, 2 μm, and 3 μm. - In
FIGS. 7(a)(1) and 7(a)(2), the light detector 230 receives light passing through the light transmissive regions 222 in the two-dimensional structured aperture 300. In FIG. 7(a)(1), the illumination source 20 projects a light field perpendicular to the first surface 220(a). In FIG. 7(a)(2), the illumination source 20 projects a light field at an angle with respect to the first surface 220(a). The light fields projected at different angles in FIG. 7(a)(1) and FIG. 7(a)(2) result in different projections 310(a) and 310(b) respectively onto the light detector 230. -
FIGS. 7(b), 7(c), and 7(d) are images taken by a scanning electron microscope of two-dimensional structured apertures 300, according to embodiments of the invention. FIG. 7(b) illustrates a two-dimensional structured aperture 300 in the form of a single petal-shaped aperture. FIG. 7(c) and FIG. 7(d) illustrate two-dimensional structured apertures 300 of light transmissive regions 222 in the form of a 'plus' sign configuration. The spacing between the light transmissive regions 222 in the structured aperture 300 in FIG. 7(c) is shorter than the spacing between the light transmissive regions 222 in the structured aperture 300 shown in FIG. 7(d). FIGS. 7(e), 7(f), and 7(g) are images of the resulting interference patterns of the two-dimensional structured apertures 300 from FIGS. 7(b), 7(c), and 7(d) respectively, according to embodiments of the invention. FIG. 7(h) is a schematic drawing of a Fresnel zone plate structured aperture with a circular frame. -
FIG. 8 is a schematic drawing of a side view of components of a quantitative DIC device 100 having a SAI wavefront sensor 110/210, in accordance with embodiments of the invention. The SAI wavefront sensor 110/210 has an aperture layer 220 with an array 325 of three structured apertures 300. The SAI wavefront sensor 110/210 also includes a light detector 230 and a transparent layer 240 with a thickness D located between the aperture layer 220 and the light detector 230. The transparent layer 240 can be comprised of one or more layers of transparent material such as water or a viscous polymer (e.g., SU-8 resin) or can be a vacuum or gas-filled space. - In the illustrated embodiment, the
quantitative DIC device 100 is a component of a camera system with a first illumination source 20(a) providing light in the z-direction and a second illumination source 20(b) (e.g., a flash) providing light from another direction. In this example, the light from the second illumination source 20(b) reflects off the object 30. The object 30 modulates or otherwise alters the light from both illumination sources 20(a) and 20(b), inducing an image wavefront 120. In other embodiments, such as when the quantitative DIC device 100 is a component of a microscope system, the second illumination source 20(b) may be eliminated. In yet other embodiments, other illumination sources may provide light from other directions. - The
quantitative DIC device 100 also includes a wavefront relay system 130 (e.g., one or more lenses) in communication with the wavefront sensor 110/210. The wavefront relay system 130 relays the wavefront 120 induced by the object 30 to the SAI wavefront sensor 110/210. In other embodiments, the quantitative DIC device 100 can also include a host computer 150 having a processor 152 and a computer readable medium 154. - In operation, the illumination sources 20(a) and 20(b) provide light to the
object 30, inducing the image wavefront 120. The wavefront relay system 130 relays the image wavefront 120 to the structured apertures 300 of the SAI wavefront sensor 110/210. The light passes through the structured apertures 300. The SAI wavefront sensor 110/210 measures the offsets in the x and y directions of the zero diffraction order of the light distribution read by the light detector 230. The SAI wavefront sensor 110/210 measures the two-dimensional phase gradient in the x and y directions based on the offsets using Eqns. (2) and (3). The SAI wavefront sensor 110/210 also measures the amplitude of the image wavefront by integrating the intensity readings over the light detector 230. The quantitative DIC device 100 can then reconstruct the image wavefront 120 using an unwrapping method. The quantitative DIC device 100 can also propagate the reconstructed wavefront to one or more parallel planes intersecting the object 30 in order to compute depth sectioning of the object 30. The quantitative DIC device 100 can also compile the two-dimensional information about the reconstructed and propagated wavefronts to generate three-dimensional information about the object 30. The quantitative DIC device 100 can generate two- or three-dimensional images of the object 30 based on the amplitude, the phase gradient in a first direction, and/or the phase gradient in a second direction orthogonal to the first direction. - The
quantitative DIC device 100 also includes an x-axis, a y-axis, and a z-axis. The x-axis and the y-axis lie in the plane of the surface 230(a) of the light detector 230. The z-axis is orthogonal to the plane of the surface 230(a). - B. Shack-Hartmann Wavefront Sensor
- The Shack-Hartmann wavefront sensor includes a microlens array in which each microlens has the same focal length. Each microlens focuses the local portion of the wavefront incident across it and forms a focal spot on a photosensor array. The local slope of the wavefront can then be calculated from the position of the focal spot on the light detector. The principles of Shack-Hartmann sensors are further described in Platt, Ben C. and Shack, Roland, History and Principles of Shack-Hartmann Wavefront Sensing, Journal of Refractive Surgery 17, S573-S577 (September/October 2001) and in Wikipedia, Shack-Hartmann, at http://en.wikipedia.org/wiki/Shack-Hartmann (last visited Jan. 21, 2009), which are hereby incorporated by reference in their entirety for all purposes.
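A minimal sketch of this decoding step, assuming the focal-spot centers have already been localized (the function name and array shapes are illustrative, not from the cited references):

```python
import numpy as np

def shack_hartmann_slopes(spots, refs, focal_length):
    """Convert focal-spot positions into local wavefront slopes.

    spots, refs: arrays of shape (N, 2) holding the measured and the
    reference (plane-wave) spot centers for N microlenses, in the same
    length units as focal_length. For small angles, the local slope of
    the wavefront across a lenslet is (spot displacement) / (focal length).
    """
    spots = np.asarray(spots, dtype=float)
    refs = np.asarray(refs, dtype=float)
    return (spots - refs) / focal_length   # shape (N, 2): [slope_x, slope_y]
```

The per-lenslet slopes produced this way play the same role as the Gx and Gy gradient maps of the SAI sensor and can be fed to the same unwrapping step.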
- Microlenses can refer to small lenses, generally with diameters of less than about a millimeter; they can be as small as about 10 μm. Microlenses can be of any suitable shape (e.g., circular, hexagonal, etc.) and can have any suitable surface configuration. In one example, a microlens is a structure with one plane surface and one spherical convex surface to refract light. In another example, a microlens has two flat and parallel surfaces, and the focusing action is obtained by a variation of refractive index across the microlens. In another example, a microlens is a micro-Fresnel lens having a set of concentric curved surfaces which focus light by refraction. In yet another example, a microlens is a binary-optic microlens with grooves having stepped edges.
- A microlens array can refer to a one- or two-dimensional array of one or more microlenses. A Shack-Hartmann wavefront sensor comprises an aperture layer having a microlens array with one or more microlenses located within the light transmissive regions (e.g., holes) in the aperture layer. The Shack-Hartmann wavefront sensor 110/210 also includes a light detector 230 and a transparent layer 240 with a thickness D located between the aperture layer 220 and the light detector 230. The transparent layer 240 can be comprised of one or more layers of transparent material such as water or a viscous polymer (e.g., SU-8 resin) or can be a vacuum or gas-filled space. -
FIG. 9 is a schematic drawing of a side view of components of a quantitative DIC device 100 having a Shack-Hartmann wavefront sensor 110/210, in accordance with an embodiment of the invention. In the illustrated embodiment, the Shack-Hartmann wavefront sensor 110/210 has an aperture layer 220 comprising a microlens array 320 having three microlenses 330. - In the illustrated embodiment, the
quantitative DIC device 100 is a component of a camera system with a first illumination source 20(a) providing light in the direction of the z-axis and a second illumination source 20(b) (e.g., a flash) providing light from another direction. In this example, the light from the second illumination source 20(b) reflects off the object 30. The object 30 alters the light from both illumination sources 20(a) and 20(b), inducing an image wavefront 120. In other embodiments, other illumination sources or a single illumination source may be used. The quantitative DIC device 100 also includes a wavefront relay system 130 (e.g., one or more lenses) in communication with the wavefront sensor 110/210. In other embodiments, the quantitative DIC device 100 can also include a host computer 150 having a processor 152 and a computer readable medium 154. - In operation, the
illumination sources 20 provide light to the object 30, inducing the wavefront 120. The wavefront relay system 130 relays the image wavefront 120 induced by the object 30 to the array 320 of microlenses 330. Each microlens 330 concentrates light to a focal spot on the light detector 230. The Shack-Hartmann wavefront sensor 110/210 measures the offsets of the positions of the focal spots on the light detector 230. The Shack-Hartmann wavefront sensor 110/210 measures the two-dimensional phase gradient in the x-direction and y-direction from the offsets as given by Eqns. (2) and (3). The Shack-Hartmann wavefront sensor 110/210 also measures the amplitude of the image wavefront by integrating the intensity readings over the light detector 230. The quantitative DIC device 100 can then reconstruct the image wavefront 120 using an unwrapping method and propagate the wavefront from the plane at the surface 230(a) of the light detector 230 to any number of parallel planes intersecting the object 30 to determine image data at different depths through the thickness of the object 30. The quantitative DIC device 100 can also compile the two-dimensional information about the wavefront 120 to generate three-dimensional information about the object 30. The quantitative DIC device 100 can generate two- or three-dimensional images of the object 30 based on the amplitude, the phase gradient in a first direction, and/or the phase gradient in a second direction orthogonal to the first direction. - The
quantitative DIC device 100 also includes an x-axis, a y-axis, and a z-axis. The x-axis and the y-axis lie in the plane of the surface 230(a) of the light detector 230. The z-axis is orthogonal to the plane of the surface 230(a). - C. Optofluidic Microscope (OFM) Wavefront Sensor
- The above compact and lensless two-dimensional differential phase measurement scheme can be deployed in the OFM imaging scheme as well. By replacing the single light transmissive regions of an intensity OFM device with two-dimensional structured apertures, the intensity OFM device becomes an on-chip, quantitative differential interference contrast optofluidic microscope, which can improve image quality while providing high throughput in a compact and inexpensive device. The quantitative DIC device has an OFM wavefront sensor which includes an aperture layer with an array of two-dimensional structured apertures, a light detector, and a transparent layer with a thickness D between the aperture layer and the light detector. The OFM wavefront sensor can determine the amplitude of the image wavefront of an object and the phase gradient of the image wavefront in two orthogonal directions. Young's double-slit experiment provides a basis for this technique.
-
FIG. 10(a) is a schematic drawing of a top view of components of an intensity OFM device 400 including light transmissive regions 222 in the form of a one-dimensional array 410 of single light transmissive regions 222. The intensity OFM device 400 also includes a body 420 forming or including a fluid channel 430. The light transmissive regions 222 are located in the aperture layer 440 of the body 420. The intensity OFM device 400 also includes a light detector 230 (shown in FIG. 4) having elements for taking time-varying readings of the light received through the light transmissive regions 222 as the object 30 travels through the fluid channel 430. The intensity OFM device 400 can use the time-varying readings to reconstruct an image of the object 30 based on the light intensity detected by the light detector 230. -
FIG. 10(b) is a schematic drawing of a top view of components of a quantitative DIC device 100 having an OFM wavefront sensor 210, according to an embodiment of the invention. The quantitative DIC device 100 includes a body 420 comprising an OFM wavefront sensor 210 and forming or including a fluid channel 430. The body 420 can be a multi-layer structure or a single, monolithic structure. In the illustrated example, the body 420 is a multi-layer structure having an opaque or semi-opaque aperture layer 440 that is an inner surface layer of the fluid channel 430. The opaque or semi-opaque aperture layer 440 has light transmissive regions 222 in it and can be a thin metallic layer in some cases. The body 420 may optionally include a transparent protective layer (not shown) that covers the opaque or semi-opaque aperture layer 440 to isolate it from the fluid and the object 30 moving through the fluid channel 430. - The
fluid channel 430 may have any suitable dimensions. For example, the width and/or height of the fluid channel 430 may each be less than about 10, 5, or 1 micron. In some embodiments, the fluid channel 430 may be sized based on the size of the objects 30 being imaged by the quantitative DIC device 100. For example, the height of the fluid channel 430 may be 10 microns where the objects 30 being imaged are 8 microns, in order to keep the objects 30 close to the opaque or semi-opaque aperture layer 440, which may help improve the quality of the image. In most embodiments, the flow of the fluid in the fluid channel 430 is generally in the direction of the x-axis. - The
OFM wavefront sensor 210 includes a one-dimensional array 450 of structured apertures 300. Each structured aperture 300 is in a 'plus' sign configuration of light transmissive regions 222 extending in the orthogonal x and y directions. In other embodiments, other configurations (e.g., rose-shaped, ring-shaped, single hole, etc.) can be used. In one embodiment, a microlens is located inside one or more of the light transmissive regions for focusing the light. - The
OFM wavefront sensor 210 also includes a light detector 230 (shown in FIG. 8) having elements (e.g., pixels) for taking time-varying readings of the light it receives from the light transmissive regions 222 as the object 30 moves through the fluid channel 430. The OFM wavefront sensor 210 also includes a transparent layer 240 (shown in FIG. 8) with a thickness D between the aperture layer 440 and the light detector 230. The transparent layer 240 can be one or more layers of transparent material such as water or a viscous polymer (e.g., SU-8 resin) or can be a vacuum or gas-filled space. Any suitable spacing Δx between the light transmissive regions 222 in the structured apertures 300 can be used. Some examples of suitable spacings Δx are 1 μm, 2 μm, and 3 μm. - The
quantitative DIC device 100 also includes an illumination source 20 (shown in FIG. 8) located outside the opaque or semi-opaque aperture layer 440. Illumination sources such as those shown in FIG. 8 can provide light to the fluid channel 430. As a fluid flows through the fluid channel 430, an object 30 in the fluid is illuminated by the illumination source. The object 30 alters (e.g., blocks, reduces the intensity of, and/or modifies the wavelength of) the light passing through, reflecting off, or refracting from it before the light reaches the light transmissive regions 222. The elements in the light detector 230 detect the light transmitted through the light transmissive regions 222. - The
quantitative DIC device 100 also includes an x-axis and a y-axis that lie in the plane of the inner surface of the light detector 230 proximal to the fluid channel 430. The x-axis lies along a longitudinal axis of the fluid channel 430. The y-axis is orthogonal to the x-axis in the plane of the inner surface of the light detector 230. - The
light transmissive regions 222 in the opaque or semi-opaque aperture layer 440 can be of any suitable shape and any suitable dimension. In the illustrated example, the light transmissive regions 222 are holes. The holes may be etched, for example, into the opaque or semi-opaque aperture layer 440 (e.g., a thin metallic layer). In another embodiment, the light transmissive regions 222 may be in the form of one or more slits. A slit can refer to an elongated opening such as a narrow rectangle. Each slit may have any suitable dimension. The slits may have uniform dimensions or variable dimensions. The slits can be oriented at any suitable angle or angles with respect to the x-axis of the fluid channel 430. - In the illustrated embodiment, the
light transmissive regions 222 in the one-dimensional array 450 of structured apertures 300 collectively extend from one lateral surface 430(a) to the other lateral surface 430(b) of the fluid channel 430. The one-dimensional array 450 is located at an angle θ with respect to the x-axis. The angle θ can be any suitable angle. Although the illustrated embodiment includes a one-dimensional array, other embodiments may include an OFM wavefront sensor 210 with other suitable formations of structured apertures 300, such as a slit, a two-dimensional array, or a multiplicity of one-dimensional and/or two-dimensional arrays. In addition, the formations of structured apertures 300 can be in any suitable orientation or combination of orientations. - In operation, the
light detector 230 takes time-varying readings of the light it receives from the light transmissive regions 222 as the object 30 moves through the fluid channel 430. The quantitative DIC device 100 uses the time-varying readings to determine the two-dimensional light intensity distribution generated by the 'plus' sign configurations of light transmissive regions 222. The quantitative DIC device 100 uses the light intensity distribution to determine the interference in the orthogonal x and y directions and thereby the offsets. The quantitative DIC device 100 also determines the differential phase (phase gradient) in the orthogonal x and y directions based on the determined offsets. The quantitative DIC device 100 also determines the amplitude by summing the intensity of the light detected over an area of the light detector 230 mapping to a particular set of light transmissive regions 222. Examples of methods of quantitatively measuring the amplitude and the differential phase in two orthogonal directions of the sample wavefront can be found in Cui, Xiquan, Lew, Matthew, Yang, Changhuei, Quantitative differential interference contrast microscopy based on structured-aperture interference, Appl. Phys. Lett. 93, 091113 (2008), which is hereby incorporated by reference in its entirety for all purposes. - In the SAI and OFM wavefront sensors, structured apertures convert the phase gradient of the image wavefront into a measurable form: the offset of the projection of the light field measured by the light detector. In a Shack-Hartmann wavefront sensor, a microlens is used to convert the phase gradient of the image wavefront into a movement of the focal point on the light detector. An advantage of using a SAI or OFM wavefront sensor is that the simple structured apertures make it possible to build a more compact and cost-effective wavefront sensor than the Shack-Hartmann wavefront sensor with its microlenses.
In addition, the spacing between the structured apertures in a SAI wavefront sensor can be much shorter than the spacing between the microlenses of the Shack-Hartmann wavefront sensor. Shorter spacing can provide higher spatial resolution and denser wavefront sampling, which can be especially beneficial when detecting complex wavefronts generated by many biological samples.
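The OFM read-out described above (time-varying readings from apertures at different positions along the flow) can be sketched as follows, assuming a constant flow speed so that each aperture's trace is delayed by its channel position divided by the speed; the function name and the uniform-flow assumption are illustrative, not from the patent:

```python
import numpy as np

def ofm_reassemble(traces, aperture_x, flow_speed, dt):
    """Reassemble a 2-D image from per-aperture time traces.

    traces: array (n_apertures, n_samples); trace i is the reading of
    the aperture at flow-direction position aperture_x[i] over time.
    Because the apertures sit at different x positions along the flow,
    trace i lags by aperture_x[i] / flow_speed; shifting each trace by
    that delay aligns all rows into object coordinates.
    """
    n_ap, n_t = traces.shape
    shifts = np.round(np.asarray(aperture_x) / (flow_speed * dt)).astype(int)
    width = n_t - shifts.max()
    image = np.empty((n_ap, width))
    for i, s in enumerate(shifts):
        image[i] = traces[i, s:s + width]   # delay-compensated row
    return image
```

For a quantitative DIC OFM, the same delay compensation would be applied separately to the amplitude and to the two phase-gradient traces recovered from each structured aperture.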
- IV. Computed Depth Sectioning
- A. Approach
- Computed depth sectioning refers to a technique for determining images at different depths through the thickness of an
object 30 using a quantitative DIC device 100 with a wavefront sensor 110/210. Any type of wavefront sensor 110/210 can be used. In some embodiments, polarization effects are ignored; for simplicity, these embodiments assume that the light field is linearly polarized and that no interaction in the light field depolarizes or in any other way disrupts its polarization. - The computed depth sectioning technique is based primarily on two concepts. According to a first concept, the light field at any given plane z can be fully described by a complete set of spatially varying amplitude and phase information. In other words, a light field at plane z can be described by Eq. (11) as:
-
ψ(x,y,z)=A(x,y,z)exp(iφ(x,y,z)), (11) - where ψ(x, y, z) is the light field at plane z, A(x, y, z) is the amplitude at plane z, and φ(x, y, z) is the phase at plane z. A second concept is Huygens' principle, which states that the light field at an earlier or later (higher or lower z value) plane can be calculated from the light field at plane z. In other words, a known function (f) connects the two light fields according to Eq. (12):
-
ψ(x,y,z+Δz)=f(ψ(x,y,z),Δz), (12) - where ψ(x, y, z+Δz) is the light field at plane (z+Δz). The function (f) is well known and studied in electromagnetic theory. For example, this function f is described in Kraus, John Daniel, Fleisch, Daniel A., Electromagnetics with Applications (5th Ed), Chapters 4-16 (1999), which is herein incorporated by reference in its entirety for all purposes. This computation assumes the absence of unknown scattering objects between plane z and plane (z+Δz).
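For numerical work, the function (f) of Eq. (12) is commonly realized with the angular-spectrum method, which advances each plane-wave component of the field by its own phase factor. The following is an illustrative sketch only; the function name, the sampling parameters, and the choice to drop evanescent components are assumptions, not specifics of this disclosure:

```python
import numpy as np

def propagate(psi, dz, wavelength, pixel_pitch):
    """Sketch of Eq. (12), psi(x, y, z+dz) = f(psi(x, y, z), dz),
    using the angular-spectrum method."""
    k0 = 2 * np.pi / wavelength
    ny, nx = psi.shape
    kx = 2 * np.pi * np.fft.fftfreq(nx, d=pixel_pitch)
    ky = 2 * np.pi * np.fft.fftfreq(ny, d=pixel_pitch)
    kt2 = kx[np.newaxis, :] ** 2 + ky[:, np.newaxis] ** 2
    # Axial wavenumber of each plane-wave component; evanescent
    # components (kx^2 + ky^2 > k0^2) are suppressed.
    kz = np.sqrt(np.maximum(k0 ** 2 - kt2, 0.0))
    transfer = np.exp(1j * kz * dz) * (kt2 <= k0 ** 2)
    return np.fft.ifft2(np.fft.fft2(psi) * transfer)
```

A negative dz back-propagates the field, so a wavefront reconstructed at the detector plane z can be carried to planes above or below that plane.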
- These two concepts are powerful when applied to phase imaging in embodiments of the invention. They imply that if one can measure the phase and amplitude distribution at the plane of the sensor (plane z), one can calculate and render the light field distributions at different heights (different (z+Δz) planes) above the sensor. The light field distributions are, in effect, images at those chosen planes. According to this treatment, if a wavefront sensor can measure the two-dimensional phase and amplitude data at a light detector plane z, the quantitative DIC device can use this data to numerically reconstruct the image and numerically propagate it to any plane z+Δz above or below the plane z of the light detector.
- The propagation of the light field is governed entirely by Maxwell's equations. If one can measure the phase and amplitude distribution of the light field at the sensor, one can take that information and calculate the light field distribution at any given plane above (or below) the sensor. The amplitude distribution of the computed light field is equivalent to the traditional microscope image taken with the focal plane set at z+Δz. This treatment is strictly true only if no unknown object is present between planes z and z+Δz.
-
FIG. 11 is a schematic diagram illustrating this propagation approach, according to embodiments of the invention. In the illustrated example, an illumination source 20 provides light. The light distribution of a first wavefront 120(a) is ψ(x, y, z+Δz) at plane z+Δz. In this example, plane z refers to a plane perpendicular to the z-axis and coinciding with the surface 230(a) of the light detector 230 of the wavefront sensor 110/210. Plane z+Δz refers to a plane parallel to plane z and at a distance Δz from plane z. - In
FIG. 11, the light detector 230 of the wavefront sensor 110/210 measures the light distribution ψ(x, y, z) of the second (detected) wavefront 120(b) at plane z. The light detector 230 measures the two-dimensional amplitude and the two-dimensional phase gradient data in two orthogonal directions, associated with the second (detected) wavefront 120(b) at plane z, based on the measured light distribution ψ(x, y, z). A processor 152 of the wavefront sensor 110/210 or of the host computer 150 numerically reconstructs a third (reconstructed) wavefront 120(c) having the light distribution ψ(x, y, z) using the measured phase and amplitude information of the detected wavefront 120(b). The processor 152 of the wavefront sensor 110/210 or of the host computer 150 calculates and renders a light distribution ψcalculated(x, y, z+Δz) of a fourth (propagated) wavefront 120(d) at a plane z+Δz based on the reconstructed light distribution ψ(x, y, z). That is, the processor 152 of the wavefront sensor 110/210 or of the host computer 150 numerically propagates the reconstructed wavefront 120(c) from plane z to the plane z+Δz to generate the fourth (propagated) wavefront 120(d). - The imaging of an unknown but weak scatterer (e.g., a transparent object) can be performed computationally using the same mathematical framework as described above: the presence of the scatterer is ignored during the calculation and the light field at z+Δz is back-computed. Using this framework and ignoring the presence of the object, the quantitative DIC device can image an object by computationally back-propagating a reconstructed image of the object at plane z to parallel planes above and below the plane z.
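The back-propagation just described amounts to applying Eq. (12) repeatedly with different values of Δz and keeping the amplitude of each computed field. A minimal sketch follows; the function and parameter names are assumptions, and the propagate callable stands in for any implementation of Eq. (12):

```python
import numpy as np

def depth_section(psi_z, h, n, propagate):
    """Carry the reconstructed field psi_z (at the detector plane z) to n
    planes spaced h/n apart; the amplitude at each plane is the image
    focused at that depth (Eq. (12) applied with n different distances)."""
    images = []
    for k in range(1, n + 1):
        psi_k = propagate(psi_z, h * k / n)   # field at plane z + h*k/n
        images.append(np.abs(psi_k))          # amplitude image at that depth
    return images
```

Stacking the returned amplitude maps along a third axis yields a three-dimensional rendering of the sectioned object.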
-
FIG. 12 is a schematic diagram of the propagation approach using a processor, according to embodiments of the invention. In this example, the illumination source generates a uniform wavefront 120(e) associated with a uniform light distribution of ψ(x, y, z+Δz′) at plane z+Δz′. Plane z+Δz′ refers to a plane between the illumination source 20 and the object 30 that is parallel to plane z and at a distance Δz′ from plane z. At plane z+Δz, the object 30 induces a first (induced) wavefront 120(a) associated with a light distribution of ψ(x, y, z+Δz). Plane z+Δz refers to a plane parallel to plane z and at a distance Δz from plane z. The light detector 230 of the wavefront sensor 110/210 measures the light distribution ψ(x, y, z) of the second (detected) image wavefront 120(b) at plane z. The light detector 230 measures the amplitude and phase gradient of the second (detected) image wavefront 120(b) at plane z based on the light distribution ψ(x, y, z). In this example, plane z refers to a plane perpendicular to the z-axis and coinciding with the surface 230(a) of the light detector 230 of the wavefront sensor 110/210. - A
processor 152 of the wavefront sensor 110/210 or of the host computer 150 numerically reconstructs the light distribution ψ(x, y, z) of a third (reconstructed) image wavefront 120(c) based on the measured phase and amplitude information of the detected image wavefront 120(b). The processor 152 of the wavefront sensor 110/210 or of the host computer 150 calculates and renders a computed light distribution ψcalculated(x, y, z+Δz) of a first (propagated) image wavefront 120(d) at a plane z+Δz based on the reconstructed light distribution ψ(x, y, z). That is, the processor 152 of the wavefront sensor 110/210 or of the host computer 150 numerically propagates the reconstructed image wavefront 120(c) from plane z to the plane z+Δz to generate a first propagated image wavefront 120(d) that approximates the first induced wavefront 120(a). The processor 152 of the wavefront sensor 110/210 or of the host computer 150 also calculates and renders a computed light distribution ψcalculated(x, y, z+Δz′) of a second propagated image wavefront 120(f) at a plane z+Δz′ based on the reconstructed wavefront 120(c) or based on the first propagated image wavefront 120(d). The second propagated image wavefront 120(f) approximates the image wavefront 120(e) associated with the uniform light distribution ψ(x, y, z+Δz′). In the same way, the processor 152 of the wavefront sensor 110/210 or of the host computer 150 can numerically propagate the reconstructed image wavefront 120(c), or any previously propagated image wavefront, to other z-planes at any height through the thickness of the object 30. The processor 152 of the wavefront sensor 110/210 or of the host computer 150 can compute the depth sectioning of the object 30 by propagating image wavefronts to different heights through the object 30. - The amplitude distribution of the computed light field is equivalent to the traditional microscope image taken with the focal plane set at z+Δz. In fact, it can be identical.
In a traditional microscope, the calculation is performed optically by the optical elements. By adjusting the optics (e.g., lenses), one can bring different planes into focus, but effectively one is making slight adjustments to the optical computing process.
-
FIGS. 13(a) and 13(b) are schematic drawings of the focusing approach taken by a traditional microscope. In this approach, the optical elements (e.g., lenses) are moved to bring different planes into focus. In FIG. 13(a), the microscope adjusts the optics to bring plane z+Δz into focus. In FIG. 13(b), the microscope adjusts the optics to bring plane z+Δz′ into focus. - In
FIG. 13(a), the illumination source generates a uniform wavefront 120(e) associated with a uniform light distribution of ψ(x, y, z+Δz′) at plane z+Δz′. Plane z+Δz′ refers to a plane between the illumination source 20 and the object 30 that is parallel to plane z and at a distance Δz′ from plane z. At plane z+Δz, the object 30 induces a first (induced) wavefront 120(a) associated with a light distribution of ψ(x, y, z+Δz). Plane z+Δz refers to a plane parallel to plane z and at a distance Δz from plane z. The optical elements 500 are placed at a distance away from the sensor 510 to calculate a light distribution of ψcalculated(x, y, z+Δz) and focus an image wavefront 120(f) on plane z+Δz. - In
FIG. 13(b), the illumination source generates a uniform wavefront 120(e) associated with a uniform light distribution of ψ(x, y, z+Δz′) at plane z+Δz′. Plane z+Δz′ refers to a plane between the illumination source 20 and the object 30 that is parallel to plane z and at a distance Δz′ from plane z. At plane z+Δz, the object 30 induces a first (induced) wavefront 120(a) associated with a light distribution of ψ(x, y, z+Δz). Plane z+Δz refers to a plane parallel to plane z and at a distance Δz from plane z. The optical elements 500 are placed at a distance away from the sensor 510 to calculate a light distribution of ψcalculated(x, y, z+Δz′) and focus an image wavefront 120(f) on plane z+Δz′. - In some cases, the process may not be perfect: if the scatterer is thick and/or highly scattering, the assumption that the scatterer is ignorable in the computation process may be violated, and a good image may not be achieved. However, this problem affects computation-based depth sectioning using a quantitative DIC device and optics-based sectioning using a traditional microscope equally. The axiom of ‘no free lunch’ applies equally in both situations. For thin tissue sections or cell samples, the distortion may be nominally tolerable. In practical situations, one can typically deal with a tissue sample about 100 microns thick before distortion starts becoming significant.
- A sensor capable of spatially measuring amplitude and phase is required to compute depth sectioning of the object based on the propagation approach. The wavefront sensor embodiments described herein are capable of measuring the two-dimensional amplitude and phase required to compute depth sectioning of the object. In addition, the signal to noise ratio of the sensor measurements may need to be high in some cases; otherwise, the computed images may be of poor quality.
- Modifications, additions, or omissions may be made to the
quantitative DIC device 100 of any configuration or to the wavefront sensors 110/210 without departing from the scope of the disclosure. The components of the quantitative DIC device 100 of any configuration or of the wavefront sensors 110/210 may be integrated or separated according to particular needs. For example, the processor 152 may be a component of the light detector 230. Moreover, the operations of the quantitative DIC device 100 may be performed by more, fewer, or other components, and the operations of the quantitative DIC device 100 or wavefront sensors 110/210 may be performed using any suitable logic comprising software, hardware, other logic, or any suitable combination of the preceding. - B. Flow Chart
-
FIG. 14 is a flow chart of a computed depth sectioning method using a quantitative DIC device 100 having a wavefront sensor 110/210, according to embodiments of the invention. The quantitative DIC device 100 used in this method can have any suitable type of wavefront sensor 110/210, including a SAI wavefront sensor, a Shack-Hartmann wavefront sensor, an OFM wavefront sensor, or another suitable wavefront sensor. In addition, any suitable configuration of the wavefront sensor 110/210 can be used. For example, the quantitative DIC device 100 can have a single pixel wavefront sensor 110, a one-dimensional array of sensor elements 210, a two-dimensional wavefront sensor array 210, etc. If the wavefront sensor 110/210 is a single pixel wavefront sensor, a raster scanning device can be employed to scan the wavefront sensor or the object to generate a two-dimensional reading. If the wavefront sensor 110/210 is a two-dimensional wavefront sensor array, the wavefront sensor can take a two-dimensional reading of the image wavefront at a single time. If the wavefront sensor 110/210 is a one-dimensional array, the wavefront sensor 110/210 can read time-varying data in the form of line scans and compile the line scans to generate two-dimensional wavefront data. - The method begins with an illumination source or
sources 20 providing light (step 600). Without the object 30 being present, the light generates an initialization wavefront with a light distribution of ψ(x, y, z+Δz′) at plane z+Δz′ of the illumination source 20. The illumination source or sources 20 can provide light in any suitable direction(s). If the quantitative DIC device 100 is a component of a camera system, multiple illumination sources 20 such as a flash, ambient light, etc. may provide light from multiple directions. If the quantitative DIC device 100 is a component of a microscope, a single illumination source 20 may provide light in a single direction along the z-axis toward the wavefront sensor. In one embodiment, a wavefront relay system 130 relays or projects the wavefront to the wavefront sensor. In other embodiments, the wavefront relay system 130 is eliminated and the wavefront is projected directly to the wavefront sensor 110/210. - Next, the
wavefront sensor 110/210 measures the light distribution ψ(x, y, z) of the initialization wavefront 120 at plane z (step 602). The initialization wavefront is received through the structured apertures 300 by the light detecting elements 232 of the light detector 230. If the wavefront sensor is a single pixel wavefront sensor 110, the light detector 230 uses a raster scanning device 140 to scan the wavefront sensor 110/210 or the object 30 to generate a reading of the two-dimensional light intensity distribution. If the wavefront sensor is a two-dimensional sensor array, the light detector 230 can read the two-dimensional light intensity distribution in a snapshot reading without raster scanning. If the wavefront sensor is a one-dimensional sensor array using an OFM scheme (OFM wavefront sensor), the light detector 230 reads time-varying data as the object 30 passes through the fluid channel. The time-varying data can be in the form of line scans, which can be compiled to generate the two-dimensional light distribution. - The
processor 152 of the wavefront sensor 110/210 or of the host computer 150 separates the light projection/distribution of each structured aperture 300 from the light projections/distributions of the other structured apertures 300 (step 604). Once separated, the light distributions/projections can be used to map the light detecting elements 232 of the light detector 230 to the structured apertures 300. - Any suitable technique for separating the projections/distributions can be used. In one embodiment, separation can be performed by suppressing the crosstalk from adjacent projections of the
wavefront 120 through adjacent structured apertures 300. In this embodiment, the processor 152 can determine the maximum intensity values of the light distribution. The processor 152 can then determine the light detecting elements 232 that have read the maximum intensity values. The processor 152 can determine the midpoint on the light detector 230 between the light detecting elements 232 reading the maximum intensity values. The processor 152 can then use the midpoint to separate the light detecting elements reading the light distribution from particular structured apertures 300. Alternatively, a predefined number of light detecting elements 232 around each light detecting element 232 with the maximum intensity value can define the light detecting elements 232 associated with each structured aperture 300. - Next, the
processor 152 predicts the center of each projection/distribution (step 606). Any method can be used to predict the initial centers. In one embodiment, the processor 152 may determine that the center of each projection/distribution is the light detecting element having the highest intensity value. - The
object 30 is introduced (step 608). In other embodiments, steps 602 through 606 can be performed after step 614 and before step 616. The object 30 can be introduced using any suitable technique. For example, the object 30 may be injected with a fluid sample into an input port of the quantitative DIC device 100. - Once introduced, the
object 30 alters the light from the illumination source 20 and induces an image wavefront 120 at z+Δz having a light distribution ψ(x, y, z+Δz). Referring to FIGS. 15(a), 15(b), and 15(c), a wavefront sensor 210 is shown with an image wavefront induced by an object, according to an embodiment of the invention. FIG. 15(a) is a side view of components of the wavefront sensor 210, according to an embodiment of the invention. The wavefront sensor 210 includes an aperture layer 220 having an array of three structured apertures 300(a), 300(b) and 300(c) along the x-axis. FIG. 15(b) is a top view of components of the wavefront sensor 210 in FIG. 15(a), according to an embodiment of the invention. FIG. 15(c) is a sectional view of components of the wavefront sensor 210 in FIG. 15(a) through the center of structured aperture 300(c), according to an embodiment of the invention. Although three structured apertures 300(a), 300(b) and 300(c) are shown, any number of structured apertures can be used. In addition, any type of structured aperture (e.g., four hole, single hole, etc.) may be used. - In
FIGS. 15(a), 15(b), and 15(c), the wavefront sensor 210 also includes a light detector 230 having a surface 230(a) and a transparent layer 240 between the light detector 230 and the aperture layer 220. The transparent layer 240 has a thickness D. Although the illustrated example shows the light detector 230 comprising a two-dimensional array of light detecting elements, any suitable number or configuration of light detecting elements 232 can be used. For example, a single light detecting element 232 can be used. - Returning to
FIG. 14, the wavefront sensor measures the light distribution ψ(x, y, z) of the wavefront induced by the object 30 (step 610). The wavefront is measured at plane z, corresponding to the surface 230(a) of the light detector 230. - In the illustrated embodiment shown in
FIGS. 15(a), 15(b), and 15(c), the wavefront 120 is received through the structured apertures 300(a), 300(b) and 300(c) in the aperture layer 220 by the light detecting elements 232 of the light detector 230. The light detector 230 can take a two-dimensional snapshot reading of the light distribution or can use a raster scanning device 140 to scan the wavefront sensor and generate a two-dimensional reading. In another embodiment, the light detector 230 can read time-varying intensity information through the structured apertures 300 of an OFM wavefront sensor. In this case, the light distribution is compiled from line scans of the time-varying information. - Returning to
FIG. 14, the processor 152 of the wavefront sensor or the host computer separates the light distribution/projection (step 612). In the illustrated embodiment shown in FIGS. 15(a), 15(b), and 15(c), the processor 152 separates each of the light distributions 800(a), 800(b) and 800(c), projected from the light transmissive region of a particular structured aperture 300, from the light distributions projected from the light transmissive regions of the other structured apertures 300. For example, the processor 152 separates the light distribution 800(a) associated with structured aperture 300(a) from the light distributions 800(b) and 800(c) associated with the structured apertures 300(b) and 300(c). The separation of the light distributions/projections can be used to map the light detecting elements 232 of the light detector 230 to the light transmissive region(s) of the structured apertures 300. - Any suitable method of separating the projections/distributions 800(a), 800(b) and 800(c) can be used. In one embodiment, separation can be performed by suppressing the crosstalk from adjacent projections of the
wavefront 120 through adjacent structured apertures 300(a), 300(b) and 300(c). The processor 152 of the quantitative DIC device 100 determines the maximum intensity values 820(a), 820(b), 820(c), and 820(d) of the light distributions 800(a), 800(b) and 800(c) in both the orthogonal x- and y-directions. The processor 152 then determines the light detecting elements 232 reading the maximum intensity values 820(a), 820(b), 820(c), and 820(d). The processor 152 can determine the midpoint on the light detector 230 between the light detecting elements 232 reading the maximum intensity values 820(a), 820(b), 820(c), and 820(d). The processor 152 can then use the midpoints to separate the light detecting elements reading the light distribution from particular structured apertures 300. Alternatively, a predefined number of light detecting elements 232 around the light detecting element 232 reading the maximum intensity value can define the light detecting elements 232 associated with each structured aperture 300. For example, all light detecting elements 232 within three light detecting elements of the one reading the maximum intensity value can be associated with the light distribution from a particular structured aperture 300. - Returning to
FIG. 14, the processor 152 also predicts the center of the projections associated with the structured apertures (step 614). Any method can be used to predict the centers. In one embodiment, the processor 152 may determine that the center of each separated projection/distribution is the light detecting element having the highest intensity value. - Next, the
processor 152 determines the x and y offsets (step 616). The processor 152 can determine the offsets from the change in position of the center of each projection, in both the x and y directions, before and after an object is introduced. In FIGS. 15(a), 15(b), and 15(c), the processor 152 determines the offset 810(a) at structured aperture 300(a) in the x-direction, the offset 810(b) at structured aperture 300(b) in the x-direction, and the offset 810(c) at structured aperture 300(c) in the x-direction. The processor 152 also determines an offset 810(d) at aperture 300(c) in the y-direction. Although not shown, the processor 152 also determines offsets in the y-direction at apertures 300(a) and 300(b). In another embodiment, the processor 152 may determine the offsets from a change in the position of another portion of each projection before and after the object is introduced. - Returning to
FIG. 14, the processor 152 determines the phase gradient in two orthogonal directions (step 618). The processor 152 can determine the local phase gradient in two orthogonal directions based on the offsets in the x and y directions using Eqns. (2) and (3). Generally, wavefront sensors having structured apertures 300 (e.g., four hole aperture or single hole aperture sensors) of some embodiments can measure the spatial phase gradient of the light field. Mathematically, the wavefront sensors of some embodiments measure: -
Gx(x,y)=kx(x,y)/k0≈(dφ(x,y)/dx)/k0, (6) -
Gy(x,y)=ky(x,y)/k0≈(dφ(x,y)/dy)/k0, and (7) -
A(x,y), (8) - where Gx(x, y) is the two-dimensional phase gradient in the x-direction, Gy(x, y) is the two-dimensional phase gradient in the y-direction, and A(x, y) is the two-dimensional amplitude of the detected wavefront. To determine the phase gradient in two orthogonal directions based on the offsets, the
processor 152 can measure the net wavefront gradient Gx(x, y) and Gy(x, y) at each aperture, respectively, based on Eq. (10):
-
Gx(x,y)=offsetx(x,y)/√(offsetx(x,y)²+offsety(x,y)²+D²), and Gy(x,y)=offsety(x,y)/√(offsetx(x,y)²+offsety(x,y)²+D²), (10)
- where D is the transparent layer (spacer) thickness, the offset in the x-direction is offsetx(x,y) and the offset in the y-direction is offsety(x,y).
- The
processor 152 also measures the amplitude of the image wavefront 120 (step 620). The processor 152 measures the amplitude by summing all the intensity values in each separated projection/distribution associated with each structured aperture 300. With the amplitude and phase gradient information, the quantitative DIC device 100 has sufficient data to reconstruct the image wavefront at the plane z of the light detector 230. - The
quantitative DIC device 100 can mathematically reconstruct (unwrap) the detected wavefront by combining the measured data appropriately using an unwrapping method (step 622). One unwrapping method is given by Eq. (9):
-
φ(x,y)≈k0(∫0→x Gx(x′,y)dx′+∫0→y Gy(x,y′)dy′), (9)
- Numerous approaches for reconstructing a field distribution exist (unwrapping). Some examples of suitable unwrapping methods include the Affine transformation method, the least squares method, the Frankot Chellappa methods of wrapping, etc. The unwrapping methods should all return the same answer if the signal to noise ratio (SNR) of the measurements approaches infinity. The unwrapping methods may vary in their performance based on the quantity and type of noise present in the measurements.
- The
processor 152 propagates the reconstructed wavefront at plane z to one or more planes z+Δz intersecting the object 30 (step 624). The processor 152 propagates the reconstructed wavefront having a distribution ψ(x, y, z) to the other planes based on Eq. (12): ψ(x, y, z+Δz)=f(ψ(x, y, z), Δz), where ψ(x, y, z+Δz) is the light distribution at a plane (z+Δz) intersecting the object 30. The function (f) is well known and studied in electromagnetic theory. For example, this function f is described in Kraus, John Daniel, Fleisch, Daniel A., Electromagnetics with Applications (5th Ed), Chapters 4-16 (1999). - The
processor 152 may propagate the reconstructed wavefront to any number of planes (z+Δz) at different depths through the object 30. In one embodiment, the object 30 may have a dimension h along the z-axis and may be located adjacent to the surface 230(a) of the light detector 230. The processor 152 may propagate the reconstructed wavefront n times (e.g., 100, 1000, 2000, etc.) through the planes z+Δzk, k=1 to n, where Δzk=hk/n. In this case, the reconstructed wavefront is propagated to n depths starting near the surface 230(a) at k=1 and incrementally increasing the depth by h/n. In other embodiments, particular depths are used to image a region of the object 30. For example, the processor 152 may propagate the reconstructed wavefront to a plane at the middle of the object 30. - The
processor 152 generates two-dimensional and three-dimensional images (step 626). In one embodiment, the processor 152 generates two-dimensional images based on the reconstructed wavefront, the intensity distribution, the phase gradient distribution in the x-direction, and/or the phase gradient distribution in the y-direction. The processor 152 can also combine the two-dimensional images to generate a three-dimensional image of the object 30 or portions of the object 30. - In some embodiments, an output device (e.g., a printer, display, etc.) of the
quantitative DIC device 100 can output various forms of data. For example, the quantitative DIC device 100 can output a two-dimensional local intensity image map, a two-dimensional phase gradient image map in the x-direction, a two-dimensional phase gradient image map in the y-direction, a two-dimensional reconstructed image, a two-dimensional propagated image, and/or a three-dimensional image. The quantitative DIC device can also further analyze the two-dimensional wavefront data. For example, the quantitative DIC device 100 can use the intensity data to analyze biological properties of the object 30. - Modifications, additions, or omissions may be made to the method without departing from the scope of the disclosure. The method may include more, fewer, or other steps. Additionally, steps may be performed in any suitable order without departing from the scope of the disclosure.
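The core computations of steps 616 through 622 can be summarized in code. The sketch below is illustrative only: the intensity-weighted centroid, the small-angle relation G ≈ offset/D, and the FFT-based Frankot-Chellappa integration are plausible realizations consistent with the description above, but the names, signatures, and parameters are assumptions rather than the disclosure's own implementation.

```python
import numpy as np

def centroid(block):
    """Intensity-weighted center (sub-pixel) of one separated projection."""
    ys, xs = np.indices(block.shape)
    total = block.sum()
    return (xs * block).sum() / total, (ys * block).sum() / total

def offset_to_gradient(ref_block, obj_block, D, pixel_pitch):
    """Steps 616-618: spot offset between the initialization and object
    readings, converted to a wavefront gradient with G = offset / D
    (small-angle approximation of Eq. (10))."""
    x0, y0 = centroid(ref_block)
    x1, y1 = centroid(obj_block)
    gx = (x1 - x0) * pixel_pitch / D
    gy = (y1 - y0) * pixel_pitch / D
    return gx, gy

def frankot_chellappa(gx, gy):
    """Step 622: least-squares integration of a gradient field into a phase
    map (FFT-based Frankot-Chellappa method); the constant offset is arbitrary."""
    ny, nx = gx.shape
    wx = 2 * np.pi * np.fft.fftfreq(nx)[np.newaxis, :]
    wy = 2 * np.pi * np.fft.fftfreq(ny)[:, np.newaxis]
    denom = wx ** 2 + wy ** 2
    denom[0, 0] = 1.0  # avoid division by zero at the DC term
    phi_hat = -1j * (wx * np.fft.fft2(gx) + wy * np.fft.fft2(gy)) / denom
    phi_hat[0, 0] = 0.0  # DC (mean) phase is unrecoverable from gradients
    return np.real(np.fft.ifft2(phi_hat))
```

Feeding per-aperture gradient maps Gx and Gy to frankot_chellappa yields φ(x, y)/k0; multiplying by k0 and combining with the summed amplitude of step 620 gives the reconstructed wavefront ψ(x, y, z).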
- The
quantitative DIC device 100 of some embodiments can be used in various applications such as biological sample imaging. FIG. 16(a) is an intensity/amplitude image taken of a starfish embryo using a quantitative DIC microscope having an SAI wavefront sensor, according to an embodiment of the invention. FIG. 16(b) is an image based on the phase gradient in the x-direction taken of a starfish embryo using a quantitative DIC device having an SAI wavefront sensor, according to an embodiment of the invention. FIG. 16(c) is an image based on the phase gradient in the y-direction taken of a starfish embryo using a quantitative DIC device having an SAI wavefront sensor, according to an embodiment of the invention. FIGS. 16(d), 16(e) and 16(f) showcase some of the unwrapping algorithms applied to the raw amplitude, differential phase x, and differential phase y data of FIGS. 16(a), 16(b) and 16(c), according to embodiments of the invention. FIG. 16(d) is an image reconstructed by applying the least squares unwrapping method to the raw amplitude/intensity, the phase gradient in the x-direction, and the phase gradient in the y-direction shown in FIGS. 16(a), 16(b) and 16(c), respectively. FIG. 16(e) is an image reconstructed by applying the Frankot-Chellappa unwrapping method to the raw amplitude/intensity, the phase gradient in the x-direction, and the phase gradient in the y-direction shown in FIGS. 16(a), 16(b) and 16(c), respectively. FIG. 16(f) is an image reconstructed by applying the Affine transformation unwrapping method to the raw amplitude/intensity, the phase gradient in the x-direction, and the phase gradient in the y-direction shown in FIGS. 16(a), 16(b) and 16(c), respectively. With the wavefront computed (reconstructed), we can then perform computation-based sectioning via the approach outlined in the first section. -
FIG. 17(a) is an image of potato starch storage granules in immersion oil taken by a conventional transmission microscope. FIG. 17(b) is an image of potato starch storage granules in immersion oil taken by a conventional DIC microscope. FIG. 17(c) is an intensity image of potato starch storage granules in immersion oil taken by a quantitative DIC device in a microscope system, according to an embodiment of the invention. FIG. 17(d) is an artifact-free x-direction phase image of potato starch storage granules in immersion oil taken by a quantitative DIC device in a microscope system, according to an embodiment of the invention. FIG. 17(e) is an artifact-free y-direction phase image of potato starch storage granules in immersion oil taken by a quantitative DIC device 100 in a microscope system, according to an embodiment of the invention. - Birefringent objects such as the potato starch storage granules in
FIGS. 17(a)-17(e) can alter the polarization of the two displaced fields in a conventional DIC microscope, such that the subsequent combination of the two fields in FIG. 2 is no longer describable by Eq. (1). This can give rise to the Maltese-cross-like pattern artifacts in the resulting conventional DIC images. FIG. 17(b) shows Maltese-cross-like pattern artifacts in the conventional DIC image of the potato starch storage granules. - In some embodiments, a
quantitative DIC device 100 uses unpolarized light and does not rely on polarization for image processing. In these examples, the quantitative DIC device 100 can image birefringent samples (e.g., potato starch storage granules) without artifacts. FIGS. 17(d) and 17(e) show images of birefringent samples taken by a quantitative DIC device in a microscope system. In these figures, the images of the potato starch storage granules (i.e., birefringent samples) are free of artifacts. Also, the dark absorption spots at the centers of the potato starch granules in the intensity image in FIG. 17(c) do not appear in the phase images in FIGS. 17(d) and 17(e). This indicates that the quantitative DIC device 100 can separate the intensity variations of the image wavefront from the phase variations. This is advantageous over conventional DIC devices, which cannot distinguish between the effects of absorption and phase variation and cannot provide quantitative phase measurements. - In one embodiment, the SAI wavefront sensor, a Shack-Hartmann wavefront sensor, or an OFM wavefront sensor can be placed into a camera system. One advantage of placing the wavefront sensor into a camera is that the camera can detect the phase gradient of the projected object wavefront in addition to the intensity information about the projected object wavefront.
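The separation of intensity variations from phase variations described above can be illustrated with a small numeric sketch. This is illustrative only and not tied to any particular sensor geometry; the simulated field and feature positions are invented for the example:

```python
import numpy as np

# Simulated image-plane wavefront: an absorbing spot (amplitude dip) at one
# location and a phase bump (optical path delay) at another.
n = 128
y, x = np.mgrid[0:n, 0:n]
amplitude = 1.0 - 0.8 * np.exp(-((x - 40)**2 + (y - 64)**2) / (2 * 6.0**2))
phase = 2.0 * np.exp(-((x - 90)**2 + (y - 64)**2) / (2 * 10.0**2))
field = amplitude * np.exp(1j * phase)

# What a quantitative wavefront sensor reports at each pixel:
intensity = np.abs(field)**2            # local intensity map
gy, gx = np.gradient(np.angle(field))   # phase-gradient maps (y then x)

# The absorbing spot appears only in the intensity map; the phase bump
# appears only in the phase-gradient maps, so the two effects are separable.
```

A conventional DIC image mixes these two contributions into a single contrast signal, which is why it cannot provide quantitative phase measurements.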
- In some embodiments,
wavefront sensors 110/210 can be used with broadband illumination and/or monochromatic illumination. Wavefront sensors 110/210 apply to a monochromatic light field distribution in which k is well defined at each point on the image plane. However, wavefront sensing can also be used with a broadband light source and in situations where k at any given point may be a mix of different wave vectors. In this regard, wavefront sensors 110/210 can be used with broadband light illumination, monochromatic illumination with mixed k, and broadband light illumination with mixed k. An example of using broadband illumination with wavefront sensors 110/210 can be found in Cui, Xiquan, Lew, Matthew, and Yang, Changhuei, "Quantitative differential interference contrast microscopy based on structured-aperture interference," Applied Physics Letters, Vol. 93 (9), 091113 (2008), which is hereby incorporated by reference in its entirety for all purposes. - In one embodiment, the diffraction spot size in an SAI wavefront sensor and the focal spot size in the Shack-Hartmann wavefront sensor of a
quantitative DIC device 100 can be used to determine the spread of the wave vector k at any given image point. The quantitative DIC device 100 in this embodiment can render images in which the extent of scattering is plotted. - In one embodiment, the
quantitative DIC device 100 can determine the proportionality of the phase gradient response of the wavefront sensor 110/210. The quantitative DIC device 100 measures the interference pattern as the wavefront sensor 110/210 is illuminated by a suitable illumination source (e.g., a collimated He-Ne laser beam) with light having suitable properties (e.g., 632.8 nm wavelength, 25 mm beam diameter, and 4 mW power) over a range of incident angles. The total transmission and the offsets of the zero-order spot in both the x and y directions can be computed with a suitable method such as a least-squares 2D Gaussian fit. The relationship between the offsets of the zero-order spot and the normalized phase gradient can be approximately linear. The quantitative DIC device 100 estimates the constant that represents this approximately linear relationship between the offsets of the zero-order spot and the normalized phase gradient. - V. Computer Devices
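The calibration described above — locating the zero-order spot by a least-squares 2D Gaussian fit and fitting the approximately linear offset-versus-gradient relationship — can be sketched in software as follows. This is an illustrative sketch only; `gaussian_2d`, `spot_offset`, and `calibrate` are hypothetical helper names, and a symmetric Gaussian spot shape is assumed:

```python
import numpy as np
from scipy.optimize import curve_fit

def gaussian_2d(coords, amp, x0, y0, sigma, offset):
    """Symmetric 2D Gaussian, flattened for use with curve_fit."""
    x, y = coords
    return (amp * np.exp(-((x - x0)**2 + (y - y0)**2) / (2 * sigma**2))
            + offset).ravel()

def spot_offset(image):
    """Locate the zero-order spot centre by a least-squares 2D Gaussian fit."""
    ny, nx = image.shape
    y, x = np.mgrid[0:ny, 0:nx]
    p0 = (image.max(), nx / 2, ny / 2, nx / 8, image.min())  # initial guess
    popt, _ = curve_fit(gaussian_2d, (x, y), image.ravel(), p0=p0)
    return popt[1], popt[2]  # fitted centre (x0, y0)

def calibrate(offsets, phase_gradients):
    """Fit the approximately linear offset-vs-gradient relationship;
    returns (slope, intercept)."""
    slope, intercept = np.polyfit(phase_gradients, offsets, 1)
    return slope, intercept
```

In use, `spot_offset` would be applied to the interference pattern recorded at each known incident angle, and `calibrate` would then estimate the proportionality constant relating spot offset to normalized phase gradient.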
-
FIG. 18 shows a block diagram of subsystems that may be present in computer devices used in the quantitative DIC device 100, according to embodiments of the invention. For example, the host computer 150 or wavefront sensor 110/210 may use any suitable combination of components in FIG. 18. - The various components previously described in the Figures may operate using one or more computer devices to facilitate the functions described herein. Any of the elements in the Figures may use any suitable number of subsystems to facilitate the functions described herein. Examples of such subsystems or components are shown in
FIG. 18. The subsystems shown in FIG. 18 are interconnected via a system bus 775. Additional subsystems such as a printer 774, keyboard 778, fixed disk 779 (or other memory comprising computer readable media), monitor 776, which is coupled to display adapter 782, and others are shown. Peripherals and input/output (I/O) devices, which couple to I/O controller 771, can be connected to the computer system by any number of means known in the art, such as serial port 777. For example, serial port 777 or external interface 781 can be used to connect the computer apparatus to a wide area network such as the Internet, a mouse input device, or a scanner. The interconnection via the system bus allows the central processor 152 to communicate with each subsystem and to control the execution of instructions from system memory 772 or the fixed disk 779, as well as the exchange of information between subsystems. The system memory 772 and/or the fixed disk 779 may embody a computer readable medium. Any of these elements may be present in the previously described features. A computer readable medium according to an embodiment of the invention may comprise code for performing any of the functions described above. - In some embodiments, an output device (e.g., the printer 774) of the
quantitative DIC device 100 can output various forms of data. For example, the quantitative DIC device 100 can output a two-dimensional local intensity image map, a two-dimensional phase gradient image map in the x-direction, a two-dimensional phase gradient image map in the y-direction, a two-dimensional reconstructed image, a two-dimensional propagated image, and/or a three-dimensional image. The quantitative DIC device 100 can also further analyze the two-dimensional wavefront data. For example, the quantitative DIC device 100 can use the intensity data to analyze biological properties of the object 30. - It should be understood that the present invention as described above can be implemented in the form of control logic using computer software in a modular or integrated manner. Based on the disclosure and teachings provided herein, a person of ordinary skill in the art will know and appreciate other ways and/or methods to implement the present invention using hardware and a combination of hardware and software.
- Any of the software components or functions described in this application may be implemented as software code to be executed by a processor using any suitable computer language such as, for example, Java, C++ or Perl, using, for example, conventional or object-oriented techniques. The software code may be stored as a series of instructions or commands on a computer readable medium, such as a random access memory (RAM), a read only memory (ROM), a magnetic medium such as a hard drive or a floppy disk, or an optical medium such as a CD-ROM. Any such computer readable medium may reside on or within a single computational apparatus, and may be present on or within different computational apparatuses within a system or network.
- A recitation of “a”, “an” or “the” is intended to mean “one or more” unless specifically indicated to the contrary.
- The above description is illustrative and is not restrictive. Many variations of the disclosure will become apparent to those skilled in the art upon review of the disclosure. The scope of the disclosure should, therefore, be determined not with reference to the above description, but instead should be determined with reference to the pending claims along with their full scope or equivalents.
- One or more features from any embodiment may be combined with one or more features of any other embodiment without departing from the scope of the disclosure. Further, modifications, additions, or omissions may be made to any embodiment without departing from the scope of the disclosure. The components of any embodiment may be integrated or separated according to particular needs without departing from the scope of the disclosure.
- All patents, patent applications, publications, and descriptions mentioned above are herein incorporated by reference in their entirety for all purposes. None is admitted to be prior art.
Claims (19)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US12/690,952 US8660312B2 (en) | 2009-01-21 | 2010-01-21 | Quantitative differential interference contrast (DIC) devices for computed depth sectioning |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US20548709P | 2009-01-21 | 2009-01-21 | |
US12/690,952 US8660312B2 (en) | 2009-01-21 | 2010-01-21 | Quantitative differential interference contrast (DIC) devices for computed depth sectioning |
Publications (2)
Publication Number | Publication Date |
---|---|
US20100195873A1 true US20100195873A1 (en) | 2010-08-05 |
US8660312B2 US8660312B2 (en) | 2014-02-25 |
Family
ID=42397753
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/690,952 Active 2031-07-12 US8660312B2 (en) | 2009-01-21 | 2010-01-21 | Quantitative differential interference contrast (DIC) devices for computed depth sectioning |
Country Status (4)
Country | Link |
---|---|
US (1) | US8660312B2 (en) |
EP (1) | EP2380055A4 (en) |
CN (1) | CN102292662A (en) |
WO (1) | WO2010090849A1 (en) |
Cited By (21)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20100309457A1 (en) * | 2009-06-03 | 2010-12-09 | Xiquan Cui | Wavefront Imaging Sensor |
US20110075254A1 (en) * | 2006-05-02 | 2011-03-31 | Xiquan Cui | Surface Wave Enabled Darkfield Aperture |
US20110085219A1 (en) * | 2009-10-13 | 2011-04-14 | California Institute Of Technology | Holographically Illuminated Imaging Devices |
US20110170105A1 (en) * | 2008-03-04 | 2011-07-14 | Xiquan Cui | Techniques for Improving Optofluidic Microscope Devices |
US20110205352A1 (en) * | 2010-02-23 | 2011-08-25 | California Institute Of Technology | High resolution imaging devices with wide field and extended focus |
US20110226972A1 (en) * | 2009-09-21 | 2011-09-22 | California Institute Of Technology | Reflective Focusing and Transmissive Projection Device |
WO2013023988A1 (en) * | 2011-08-16 | 2013-02-21 | Hseb Dresden Gmbh | Measurement method for height profiles of surfaces |
US8411282B2 (en) | 2006-05-02 | 2013-04-02 | California Institute Of Technology | On-chip phase microscope/beam profiler based on differential interference contrast and/or surface plasmon assisted interference |
US8525091B2 (en) | 2008-05-05 | 2013-09-03 | California Institute Of Technology | Wavefront imaging devices comprising a film with one or more structured two dimensional apertures and their applications in microscopy and photography |
EP2635871A2 (en) * | 2010-11-07 | 2013-09-11 | Council for Scientific and Industrial Research | On-chip 4d lightfield microscope |
US8536545B2 (en) | 2010-09-09 | 2013-09-17 | California Institute Of Technology | Delayed emission detection devices and methods |
US8822894B2 (en) | 2011-01-07 | 2014-09-02 | California Institute Of Technology | Light-field pixel for detecting a wavefront based on a first intensity normalized by a second intensity |
US20140285634A1 (en) * | 2013-03-15 | 2014-09-25 | Digimarc Corporation | Cooperative photography |
US8946619B2 (en) | 2011-04-20 | 2015-02-03 | California Institute Of Technology | Talbot-illuminated imaging devices, systems, and methods for focal plane tuning |
US20150124513A1 (en) * | 2013-11-05 | 2015-05-07 | Postech Academy-Industry Foundation | Light incident angle controllable electronic device and manufacturing method thereof |
US9041938B2 (en) | 2006-05-02 | 2015-05-26 | California Institute Of Technology | Surface wave assisted structures and systems |
US9086536B2 (en) | 2011-03-09 | 2015-07-21 | California Institute Of Technology | Talbot imaging devices and systems |
US9350977B1 (en) * | 2013-03-11 | 2016-05-24 | Stc.Unm | Rotating point-spread function (PSF) design for three-dimensional imaging |
US20160291343A1 (en) * | 2013-03-11 | 2016-10-06 | Sudhakar Prasad | Rotating point-spread function (psf) design for three-dimensional imaging |
EP3505641A1 (en) * | 2015-12-18 | 2019-07-03 | Paris Sciences et Lettres - Quartier Latin | Optical device for measuring the position of an object |
US10498939B2 (en) * | 2009-06-16 | 2019-12-03 | Nri R&D Patent Licensing, Llc | Small-profile lensless optical microscopy imaging and tomography instruments and elements for low cost and integrated microscopy |
Families Citing this family (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2010090849A1 (en) | 2009-01-21 | 2010-08-12 | California Institute Of Technology | Quantitative differential interference contrast (dic) devices for computed depth sectioning |
JP6112824B2 (en) * | 2012-02-28 | 2017-04-12 | キヤノン株式会社 | Image processing method and apparatus, and program. |
US20140181630A1 (en) * | 2012-12-21 | 2014-06-26 | Vidinoti Sa | Method and apparatus for adding annotations to an image |
US11468557B2 (en) | 2014-03-13 | 2022-10-11 | California Institute Of Technology | Free orientation fourier camera |
CA2966926A1 (en) | 2014-12-22 | 2016-06-30 | California Institute Of Technology | Epi-illumination fourier ptychographic imaging for thick samples |
CN107407799B (en) * | 2015-03-13 | 2020-09-18 | 加州理工学院 | Correction of aberrations in incoherent imaging systems using fourier stack imaging techniques |
US9810862B2 (en) | 2015-08-21 | 2017-11-07 | SA Photonics, Inc. | Free space optical (FSO) system |
WO2017035095A1 (en) * | 2015-08-21 | 2017-03-02 | SA Photonics, Inc. | Free space optical (fso) system |
US9973274B1 (en) | 2015-10-07 | 2018-05-15 | SA Photonics, Inc. | Fast tracking free space optical module |
US11092795B2 (en) | 2016-06-10 | 2021-08-17 | California Institute Of Technology | Systems and methods for coded-aperture-based correction of aberration obtained from Fourier ptychography |
US10386289B2 (en) | 2016-12-23 | 2019-08-20 | miDiagnostics NV | Method and system for determining features of objects in a suspension |
US10446369B1 (en) * | 2017-06-14 | 2019-10-15 | National Technology & Engineering Solutions Of Sandia, Llc | Systems and methods for interferometric end point detection for a focused ion beam fabrication tool |
WO2019018851A1 (en) | 2017-07-21 | 2019-01-24 | California Institute Of Technology | Ultra-thin planar lens-less camera |
WO2019033110A1 (en) * | 2017-08-11 | 2019-02-14 | California Institute Of Technology | Lensless 3-dimensional imaging using directional sensing elements |
US10733419B2 (en) | 2017-08-29 | 2020-08-04 | Georgia Tech Research Corporation | Systems and methods for cell membrane identification and tracking, and technique automation using the same |
US10754140B2 (en) | 2017-11-03 | 2020-08-25 | California Institute Of Technology | Parallel imaging acquisition and restoration methods and systems |
EP3712596A4 (en) * | 2017-11-14 | 2021-11-24 | Nikon Corporation | Quantitative phase image generating method, quantitative phase image generating device, and program |
CN112449090A (en) * | 2019-09-03 | 2021-03-05 | 睿镞科技(北京)有限责任公司 | System, method and apparatus for generating depth image |
EP3828617A1 (en) * | 2019-11-26 | 2021-06-02 | Siemens Healthcare Diagnostics Inc. | Method for the digital colouring of cells |
WO2022226199A1 (en) | 2021-04-23 | 2022-10-27 | SA Photonics, Inc. | Wavefront sensor with inner detector and outer detector |
Citations (57)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4438330A (en) * | 1982-03-01 | 1984-03-20 | Itek Corporation | Wavefront sensor employing a modulation reticle |
US4692027A (en) * | 1985-07-03 | 1987-09-08 | Itek Corporation | Spatial processing for single or dual shear wavefront sensor |
US4737621A (en) * | 1985-12-06 | 1988-04-12 | Adaptive Optics Assoc., Inc. | Integrated adaptive optical wavefront sensing and compensating system |
US4980716A (en) * | 1988-04-28 | 1990-12-25 | Canon Kabushiki Kaisha | Focus detecting device |
US4981362A (en) * | 1989-10-16 | 1991-01-01 | Xerox Corporation | Particle concentration measuring method and device |
US5124927A (en) * | 1990-03-02 | 1992-06-23 | International Business Machines Corp. | Latent-image control of lithography tools |
US5196350A (en) * | 1991-05-29 | 1993-03-23 | Omnigene, Inc. | Ligand assay using interference modulation |
US5233174A (en) * | 1992-03-11 | 1993-08-03 | Hughes Danbury Optical Systems, Inc. | Wavefront sensor having a lenslet array as a null corrector |
US5300766A (en) * | 1993-04-13 | 1994-04-05 | Eastman Kodak Company | Scanning scene-based wavefront sensor having a linear image sensor array and a pupil sensor array |
US5362653A (en) * | 1989-09-12 | 1994-11-08 | Carr Robert J G | Examination of objects of macromolecular size |
US5426505A (en) * | 1992-11-25 | 1995-06-20 | Ciba-Geigy Corporation | Interferometric apparatus for monitoring changes of the refractive index of fluid samples in capillary tubes |
US5795755A (en) * | 1994-07-05 | 1998-08-18 | Lemelson; Jerome H. | Method of implanting living cells by laser poration at selected sites |
US5798262A (en) * | 1991-02-22 | 1998-08-25 | Applied Spectral Imaging Ltd. | Method for chromosomes classification |
US5973316A (en) * | 1997-07-08 | 1999-10-26 | Nec Research Institute, Inc. | Sub-wavelength aperture arrays with enhanced light transmission |
US6130419A (en) * | 1996-07-10 | 2000-10-10 | Wavefront Sciences, Inc. | Fixed mount wavefront sensor |
US6143247A (en) * | 1996-12-20 | 2000-11-07 | Gamera Bioscience Inc. | Affinity binding-based system for detecting particulates in a fluid |
US6499499B2 (en) * | 2001-04-20 | 2002-12-31 | Nanostream, Inc. | Flow control in multi-stream microfluidic devices |
US20030142291A1 (en) * | 2000-08-02 | 2003-07-31 | Aravind Padmanabhan | Portable scattering and fluorescence cytometer |
US20030174992A1 (en) * | 2001-09-27 | 2003-09-18 | Levene Michael J. | Zero-mode metal clad waveguides for performing spectroscopy with confined effective observation volumes |
US20030203502A1 (en) * | 2002-04-30 | 2003-10-30 | Frederic Zenhausern | Near-field transform spectroscopy |
US6753131B1 (en) * | 1996-07-22 | 2004-06-22 | President And Fellows Of Harvard College | Transparent elastomeric, contact-mode photolithography mask, sensor, and wavefront engineering element |
US20040156610A1 (en) * | 1997-05-16 | 2004-08-12 | Btg International Limited | Optical devices and methods of fabrication thereof |
US20040175734A1 (en) * | 1998-08-28 | 2004-09-09 | Febit Ferrarius Biotechnology Gmbh | Support for analyte determination methods and method for producing the support |
US20040190116A1 (en) * | 2001-08-31 | 2004-09-30 | Lezec Henri Joseph | Optical transmission apparatus with directionality and divergence control |
US20040224380A1 (en) * | 2002-04-01 | 2004-11-11 | Fluidigm Corp. | Microfluidic particle-analysis systems |
US20050007603A1 (en) * | 2002-01-24 | 2005-01-13 | Yoel Arieli | Spatial wavefront analysis and 3d measurement |
US20050088735A1 (en) * | 2003-10-22 | 2005-04-28 | Olszak Artur G. | Multi-axis imaging system with single-axis relay |
US20050161594A1 (en) * | 2004-01-02 | 2005-07-28 | Hollingsworth Russell E. | Plasmon enhanced near-field optical probes |
US20050271548A1 (en) * | 2004-06-04 | 2005-12-08 | California Institute Of Technology, Office Of Technology Transfer | Optofluidic microscope device |
US20060003145A1 (en) * | 2004-02-04 | 2006-01-05 | Hansen Carl L | Ultra-smooth microfabricated pores on a planar substrate for integrated patch-clamping |
US6987255B2 (en) * | 2003-08-25 | 2006-01-17 | The Boeing Company | State space wavefront reconstructor for an adaptive optics control |
US20060013031A1 (en) * | 2001-10-26 | 2006-01-19 | Vitra Bioscience, Inc. | Assay systems with adjustable fluid communication |
US7045781B2 (en) * | 2003-01-17 | 2006-05-16 | Ict, Integrated Circuit Testing Gesellschaft Fur Halbleiterpruftechnik Mbh | Charged particle beam apparatus and method for operating the same |
US20060175528A1 (en) * | 2003-06-20 | 2006-08-10 | Heriot-Watt University | Phase-diversity wavefront sensor |
US7113268B2 (en) * | 2004-01-12 | 2006-09-26 | The Boeing Company | Scintillation tolerant optical field sensing system and associated method |
US20070069999A1 (en) * | 2005-09-28 | 2007-03-29 | Rockwell Scientific Company | Spatial light modulator employing voltage gradient pixels, and associated methods |
US20070172745A1 (en) * | 2006-01-26 | 2007-07-26 | Smith Bruce W | Evanescent wave assist features for microlithography |
US20070207061A1 (en) * | 2004-06-04 | 2007-09-06 | California Institute Of Technology | Optofluidic microscope device |
US7271885B2 (en) * | 2004-03-25 | 2007-09-18 | Perkinelmer Las, Inc. | Plasmon resonance measuring method and apparatus |
US7283229B2 (en) * | 2001-01-25 | 2007-10-16 | Precision System Science Co., Ltd. | Small object identifying device and its identifying method |
US20070258096A1 (en) * | 2006-05-02 | 2007-11-08 | California Institute Of Tecnology | On-chip phase microscope/beam profiler based on differential interference contrast and/or surface plasmon assisted interference |
US7399445B2 (en) * | 2002-01-11 | 2008-07-15 | Canon Kabushiki Kaisha | Chemical sensor |
US20090079992A1 (en) * | 2007-09-25 | 2009-03-26 | Carl Zeiss Smt Ag | Method And System for Measuring a Surface of an Object |
US20090225319A1 (en) * | 2008-03-04 | 2009-09-10 | California Institute Of Technology | Methods of using optofluidic microscope devices |
US20090276188A1 (en) * | 2008-05-05 | 2009-11-05 | California Institute Of Technology | Quantitative differential interference contrast (dic) microscopy and photography based on wavefront sensors |
US7641856B2 (en) * | 2004-05-14 | 2010-01-05 | Honeywell International Inc. | Portable sample analyzer with removable cartridge |
US7671987B2 (en) * | 2000-08-02 | 2010-03-02 | Honeywell International Inc | Optical detection system for flow cytometry |
US7738695B2 (en) * | 2002-10-25 | 2010-06-15 | Institut Pasteur | Method and device for 3 dimensional imaging of suspended micro-objects providing high-resolution microscopy |
US20100309457A1 (en) * | 2009-06-03 | 2010-12-09 | Xiquan Cui | Wavefront Imaging Sensor |
US7864333B1 (en) * | 2008-12-03 | 2011-01-04 | Itt Manufacturing Enterprises, Inc. | Polarization modulated image conjugate piston sensing and phase retrieval system |
US20110075254A1 (en) * | 2006-05-02 | 2011-03-31 | Xiquan Cui | Surface Wave Enabled Darkfield Aperture |
US20110085219A1 (en) * | 2009-10-13 | 2011-04-14 | California Institute Of Technology | Holographically Illuminated Imaging Devices |
US20110170105A1 (en) * | 2008-03-04 | 2011-07-14 | Xiquan Cui | Techniques for Improving Optofluidic Microscope Devices |
US20110181884A1 (en) * | 2008-03-04 | 2011-07-28 | California Institute Of Technology | Optofluidic microscope device with photosensor array |
US8120765B2 (en) * | 2008-02-14 | 2012-02-21 | Hamamatsu Photonics K.K. | Observation device |
US20120211644A1 (en) * | 2011-01-07 | 2012-08-23 | California Institute Of Technology | Light-field pixel |
US20120250027A1 (en) * | 2006-05-02 | 2012-10-04 | California Institute Of Technology | Surface Wave Assisted Structures and Systems |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6597438B1 (en) | 2000-08-02 | 2003-07-22 | Honeywell International Inc. | Portable flow cytometry |
JP2003207454A (en) | 2002-01-15 | 2003-07-25 | Minolta Co Ltd | Transmission light-detecting apparatus |
US8080382B2 (en) | 2003-06-13 | 2011-12-20 | University Of Pittsburgh - Of The Commonwealth System Of Higher Education | Monitoring immunologic, hematologic and inflammatory diseases |
EP1787156A1 (en) | 2004-06-11 | 2007-05-23 | Nicholas Etienne Ross | Automated diagnosis of malaria and other infections |
WO2010090849A1 (en) | 2009-01-21 | 2010-08-12 | California Institute Of Technology | Quantitative differential interference contrast (dic) devices for computed depth sectioning |
-
2010
- 2010-01-21 WO PCT/US2010/021561 patent/WO2010090849A1/en active Application Filing
- 2010-01-21 CN CN2010800050975A patent/CN102292662A/en active Pending
- 2010-01-21 US US12/690,952 patent/US8660312B2/en active Active
- 2010-01-21 EP EP10738939A patent/EP2380055A4/en not_active Withdrawn
Patent Citations (74)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4438330A (en) * | 1982-03-01 | 1984-03-20 | Itek Corporation | Wavefront sensor employing a modulation reticle |
US4692027A (en) * | 1985-07-03 | 1987-09-08 | Itek Corporation | Spatial processing for single or dual shear wavefront sensor |
US4737621A (en) * | 1985-12-06 | 1988-04-12 | Adaptive Optics Assoc., Inc. | Integrated adaptive optical wavefront sensing and compensating system |
US4980716A (en) * | 1988-04-28 | 1990-12-25 | Canon Kabushiki Kaisha | Focus detecting device |
US5362653A (en) * | 1989-09-12 | 1994-11-08 | Carr Robert J G | Examination of objects of macromolecular size |
US4981362A (en) * | 1989-10-16 | 1991-01-01 | Xerox Corporation | Particle concentration measuring method and device |
US5124927A (en) * | 1990-03-02 | 1992-06-23 | International Business Machines Corp. | Latent-image control of lithography tools |
US5798262A (en) * | 1991-02-22 | 1998-08-25 | Applied Spectral Imaging Ltd. | Method for chromosomes classification |
US5196350A (en) * | 1991-05-29 | 1993-03-23 | Omnigene, Inc. | Ligand assay using interference modulation |
US5233174A (en) * | 1992-03-11 | 1993-08-03 | Hughes Danbury Optical Systems, Inc. | Wavefront sensor having a lenslet array as a null corrector |
US5426505A (en) * | 1992-11-25 | 1995-06-20 | Ciba-Geigy Corporation | Interferometric apparatus for monitoring changes of the refractive index of fluid samples in capillary tubes |
US5300766A (en) * | 1993-04-13 | 1994-04-05 | Eastman Kodak Company | Scanning scene-based wavefront sensor having a linear image sensor array and a pupil sensor array |
US5795755A (en) * | 1994-07-05 | 1998-08-18 | Lemelson; Jerome H. | Method of implanting living cells by laser poration at selected sites |
US6130419A (en) * | 1996-07-10 | 2000-10-10 | Wavefront Sciences, Inc. | Fixed mount wavefront sensor |
US6753131B1 (en) * | 1996-07-22 | 2004-06-22 | President And Fellows Of Harvard College | Transparent elastomeric, contact-mode photolithography mask, sensor, and wavefront engineering element |
US6143247A (en) * | 1996-12-20 | 2000-11-07 | Gamera Bioscience Inc. | Affinity binding-based system for detecting particulates in a fluid |
US20040156610A1 (en) * | 1997-05-16 | 2004-08-12 | Btg International Limited | Optical devices and methods of fabrication thereof |
US5973316A (en) * | 1997-07-08 | 1999-10-26 | Nec Research Institute, Inc. | Sub-wavelength aperture arrays with enhanced light transmission |
US20040175734A1 (en) * | 1998-08-28 | 2004-09-09 | Febit Ferrarius Biotechnology Gmbh | Support for analyte determination methods and method for producing the support |
US20030142291A1 (en) * | 2000-08-02 | 2003-07-31 | Aravind Padmanabhan | Portable scattering and fluorescence cytometer |
US7671987B2 (en) * | 2000-08-02 | 2010-03-02 | Honeywell International Inc | Optical detection system for flow cytometry |
US7283229B2 (en) * | 2001-01-25 | 2007-10-16 | Precision System Science Co., Ltd. | Small object identifying device and its identifying method |
US6499499B2 (en) * | 2001-04-20 | 2002-12-31 | Nanostream, Inc. | Flow control in multi-stream microfluidic devices |
US20040190116A1 (en) * | 2001-08-31 | 2004-09-30 | Lezec Henri Joseph | Optical transmission apparatus with directionality and divergence control |
US20030174992A1 (en) * | 2001-09-27 | 2003-09-18 | Levene Michael J. | Zero-mode metal clad waveguides for performing spectroscopy with confined effective observation volumes |
US20060013031A1 (en) * | 2001-10-26 | 2006-01-19 | Vitra Bioscience, Inc. | Assay systems with adjustable fluid communication |
US7399445B2 (en) * | 2002-01-11 | 2008-07-15 | Canon Kabushiki Kaisha | Chemical sensor |
US20050007603A1 (en) * | 2002-01-24 | 2005-01-13 | Yoel Arieli | Spatial wavefront analysis and 3d measurement |
US20040224380A1 (en) * | 2002-04-01 | 2004-11-11 | Fluidigm Corp. | Microfluidic particle-analysis systems |
US6858436B2 (en) * | 2002-04-30 | 2005-02-22 | Motorola, Inc. | Near-field transform spectroscopy |
US20030203502A1 (en) * | 2002-04-30 | 2003-10-30 | Frederic Zenhausern | Near-field transform spectroscopy |
US7738695B2 (en) * | 2002-10-25 | 2010-06-15 | Institut Pasteur | Method and device for 3 dimensional imaging of suspended micro-objects providing high-resolution microscopy |
US7045781B2 (en) * | 2003-01-17 | 2006-05-16 | Ict, Integrated Circuit Testing Gesellschaft Fur Halbleiterpruftechnik Mbh | Charged particle beam apparatus and method for operating the same |
US20060175528A1 (en) * | 2003-06-20 | 2006-08-10 | Heriot-Watt University | Phase-diversity wavefront sensor |
US6987255B2 (en) * | 2003-08-25 | 2006-01-17 | The Boeing Company | State space wavefront reconstructor for an adaptive optics control |
US20050088735A1 (en) * | 2003-10-22 | 2005-04-28 | Olszak Artur G. | Multi-axis imaging system with single-axis relay |
US20050161594A1 (en) * | 2004-01-02 | 2005-07-28 | Hollingsworth Russell E. | Plasmon enhanced near-field optical probes |
US7250598B2 (en) * | 2004-01-02 | 2007-07-31 | Hollingsworth Russell E | Plasmon enhanced near-field optical probes |
US7113268B2 (en) * | 2004-01-12 | 2006-09-26 | The Boeing Company | Scintillation tolerant optical field sensing system and associated method |
US20060003145A1 (en) * | 2004-02-04 | 2006-01-05 | Hansen Carl L | Ultra-smooth microfabricated pores on a planar substrate for integrated patch-clamping |
US7271885B2 (en) * | 2004-03-25 | 2007-09-18 | Perkinelmer Las, Inc. | Plasmon resonance measuring method and apparatus |
US7641856B2 (en) * | 2004-05-14 | 2010-01-05 | Honeywell International Inc. | Portable sample analyzer with removable cartridge |
US20070207061A1 (en) * | 2004-06-04 | 2007-09-06 | California Institute Of Technology | Optofluidic microscope device |
US7773227B2 (en) * | 2004-06-04 | 2010-08-10 | California Institute Of Technology | Optofluidic microscope device featuring a body comprising a fluid channel and having light transmissive regions |
US7751048B2 (en) * | 2004-06-04 | 2010-07-06 | California Institute Of Technology | Optofluidic microscope device |
US20050271548A1 (en) * | 2004-06-04 | 2005-12-08 | California Institute Of Technology, Office Of Technology Transfer | Optofluidic microscope device |
US20100296094A1 (en) * | 2004-06-04 | 2010-11-25 | Changhuei Yang | Optofluidic microscope device |
US20070069999A1 (en) * | 2005-09-28 | 2007-03-29 | Rockwell Scientific Company | Spatial light modulator employing voltage gradient pixels, and associated methods |
US20070172745A1 (en) * | 2006-01-26 | 2007-07-26 | Smith Bruce W | Evanescent wave assist features for microlithography |
US8189204B2 (en) * | 2006-05-02 | 2012-05-29 | California Institute Of Technology | Surface wave enabled darkfield aperture |
US20120250027A1 (en) * | 2006-05-02 | 2012-10-04 | California Institute Of Technology | Surface Wave Assisted Structures and Systems |
US8411282B2 (en) * | 2006-05-02 | 2013-04-02 | California Institute Of Technology | On-chip phase microscope/beam profiler based on differential interference contrast and/or surface plasmon assisted interference |
US20070258096A1 (en) * | 2006-05-02 | 2007-11-08 | California Institute Of Tecnology | On-chip phase microscope/beam profiler based on differential interference contrast and/or surface plasmon assisted interference |
US20110063623A1 (en) * | 2006-05-02 | 2011-03-17 | California Institute Of Technology | On-chip phase microscope/beam profiler based on differential interference contrast and/or surface plasmon assisted interference |
US7768654B2 (en) * | 2006-05-02 | 2010-08-03 | California Institute Of Technology | On-chip phase microscope/beam profiler based on differential interference contrast and/or surface plasmon assisted interference |
US7982883B2 (en) * | 2006-05-02 | 2011-07-19 | California Institute Of Technology | On-chip phase microscope/beam profiler based on differential interference contrast and/or surface plasmon assisted interference |
US20120026509A1 (en) * | 2006-05-02 | 2012-02-02 | California Institute Of Technology | On-chip phase microscope/beam profiler based on differential interference contrast and/or surface plasmon assisted interference |
US20110075254A1 (en) * | 2006-05-02 | 2011-03-31 | Xiquan Cui | Surface Wave Enabled Darkfield Aperture |
US20090079992A1 (en) * | 2007-09-25 | 2009-03-26 | Carl Zeiss Smt Ag | Method And System for Measuring a Surface of an Object |
US8120765B2 (en) * | 2008-02-14 | 2012-02-21 | Hamamatsu Photonics K.K. | Observation device |
US8325349B2 (en) * | 2008-03-04 | 2012-12-04 | California Institute Of Technology | Focal plane adjustment by back propagation in optofluidic microscope devices |
US20090225319A1 (en) * | 2008-03-04 | 2009-09-10 | California Institute Of Technology | Methods of using optofluidic microscope devices |
US20110170105A1 (en) * | 2008-03-04 | 2011-07-14 | Xiquan Cui | Techniques for Improving Optofluidic Microscope Devices |
US8314933B2 (en) * | 2008-03-04 | 2012-11-20 | California Institute Of Technology | Optofluidic microscope device with photosensor array |
US20110181884A1 (en) * | 2008-03-04 | 2011-07-28 | California Institute Of Technology | Optofluidic microscope device with photosensor array |
US8039776B2 (en) * | 2008-05-05 | 2011-10-18 | California Institute Of Technology | Quantitative differential interference contrast (DIC) microscopy and photography based on wavefront sensors |
US8525091B2 (en) * | 2008-05-05 | 2013-09-03 | California Institute Of Technology | Wavefront imaging devices comprising a film with one or more structured two dimensional apertures and their applications in microscopy and photography |
US20120061554A1 (en) * | 2008-05-05 | 2012-03-15 | The General Hospital Corporation | Quantitative differential interference contrast (dic) microscopy and photography based on wavefront sensors |
US20090276188A1 (en) * | 2008-05-05 | 2009-11-05 | California Institute Of Technology | Quantitative differential interference contrast (dic) microscopy and photography based on wavefront sensors |
US7864333B1 (en) * | 2008-12-03 | 2011-01-04 | Itt Manufacturing Enterprises, Inc. | Polarization modulated image conjugate piston sensing and phase retrieval system |
US8416400B2 (en) * | 2009-06-03 | 2013-04-09 | California Institute Of Technology | Wavefront imaging sensor |
US20100309457A1 (en) * | 2009-06-03 | 2010-12-09 | Xiquan Cui | Wavefront Imaging Sensor |
US20110085219A1 (en) * | 2009-10-13 | 2011-04-14 | California Institute Of Technology | Holographically Illuminated Imaging Devices |
US20120211644A1 (en) * | 2011-01-07 | 2012-08-23 | California Institute Of Technology | Light-field pixel |
Non-Patent Citations (2)
Title |
---|
Lew, Matthew, et al., "Interference of a four-hole aperture for on-chip quantitative two-dimensional differential phase imaging," Optics Letters, Vol. 32, No. 20, pp. 2963-2965 (Oct. 15, 2007). * |
LEW, Matthew, et al., "Interference of a four-hole aperture for on-chip quantitative two-dimensional differential phase imaging," Optics Letters, Vol. 32, No. 20, pp. 2963-2965 (October 2007). * |
Cited By (34)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8189204B2 (en) | 2006-05-02 | 2012-05-29 | California Institute Of Technology | Surface wave enabled darkfield aperture |
US20110075254A1 (en) * | 2006-05-02 | 2011-03-31 | Xiquan Cui | Surface Wave Enabled Darkfield Aperture |
US9041938B2 (en) | 2006-05-02 | 2015-05-26 | California Institute Of Technology | Surface wave assisted structures and systems |
US8411282B2 (en) | 2006-05-02 | 2013-04-02 | California Institute Of Technology | On-chip phase microscope/beam profiler based on differential interference contrast and/or surface plasmon assisted interference |
US20110170105A1 (en) * | 2008-03-04 | 2011-07-14 | Xiquan Cui | Techniques for Improving Optofluidic Microscope Devices |
US8325349B2 (en) | 2008-03-04 | 2012-12-04 | California Institute Of Technology | Focal plane adjustment by back propagation in optofluidic microscope devices |
US8525091B2 (en) | 2008-05-05 | 2013-09-03 | California Institute Of Technology | Wavefront imaging devices comprising a film with one or more structured two dimensional apertures and their applications in microscopy and photography |
US8416400B2 (en) | 2009-06-03 | 2013-04-09 | California Institute Of Technology | Wavefront imaging sensor |
US20100309457A1 (en) * | 2009-06-03 | 2010-12-09 | Xiquan Cui | Wavefront Imaging Sensor |
US10498939B2 (en) * | 2009-06-16 | 2019-12-03 | Nri R&D Patent Licensing, Llc | Small-profile lensless optical microscopy imaging and tomography instruments and elements for low cost and integrated microscopy |
US20110226972A1 (en) * | 2009-09-21 | 2011-09-22 | California Institute Of Technology | Reflective Focusing and Transmissive Projection Device |
US8633432B2 (en) | 2009-09-21 | 2014-01-21 | California Institute Of Technology | Reflective focusing and transmissive projection device |
US8767216B2 (en) | 2009-10-13 | 2014-07-01 | California Institute Of Technology | Holographically illuminated imaging devices |
US20110085219A1 (en) * | 2009-10-13 | 2011-04-14 | California Institute Of Technology | Holographically Illuminated Imaging Devices |
US20110205339A1 (en) * | 2010-02-23 | 2011-08-25 | California Institute Of Technology | Nondiffracting beam detection devices for three-dimensional imaging |
US20110205352A1 (en) * | 2010-02-23 | 2011-08-25 | California Institute Of Technology | High resolution imaging devices with wide field and extended focus |
US8970671B2 (en) | 2010-02-23 | 2015-03-03 | California Institute Of Technology | Nondiffracting beam detection devices for three-dimensional imaging |
US9357202B2 (en) | 2010-02-23 | 2016-05-31 | California Institute Of Technology | High resolution imaging devices with wide field and extended focus |
US8536545B2 (en) | 2010-09-09 | 2013-09-17 | California Institute Of Technology | Delayed emission detection devices and methods |
EP2635871A4 (en) * | 2010-11-07 | 2014-06-04 | Council Scient Ind Res | On-chip 4d lightfield microscope |
EP2635871A2 (en) * | 2010-11-07 | 2013-09-11 | Council for Scientific and Industrial Research | On-chip 4d lightfield microscope |
US8822894B2 (en) | 2011-01-07 | 2014-09-02 | California Institute Of Technology | Light-field pixel for detecting a wavefront based on a first intensity normalized by a second intensity |
US9086536B2 (en) | 2011-03-09 | 2015-07-21 | California Institute Of Technology | Talbot imaging devices and systems |
US8946619B2 (en) | 2011-04-20 | 2015-02-03 | California Institute Of Technology | Talbot-illuminated imaging devices, systems, and methods for focal plane tuning |
US9671602B2 (en) | 2011-08-16 | 2017-06-06 | Hseb Dresden Gmbh | Measurement method for height profiles of surfaces using a differential interference contrast image |
WO2013023988A1 (en) * | 2011-08-16 | 2013-02-21 | Hseb Dresden Gmbh | Measurement method for height profiles of surfaces |
US9350977B1 (en) * | 2013-03-11 | 2016-05-24 | Stc.Unm | Rotating point-spread function (PSF) design for three-dimensional imaging |
US20160291343A1 (en) * | 2013-03-11 | 2016-10-06 | Sudhakar Prasad | Rotating point-spread function (psf) design for three-dimensional imaging |
US9823486B2 (en) * | 2013-03-11 | 2017-11-21 | Stc. Unm | Rotating point-spread function (PSF) design for three-dimensional imaging |
US20140285634A1 (en) * | 2013-03-15 | 2014-09-25 | Digimarc Corporation | Cooperative photography |
US9554123B2 (en) * | 2013-03-15 | 2017-01-24 | Digimarc Corporation | Cooperative photography |
US20150124513A1 (en) * | 2013-11-05 | 2015-05-07 | Postech Academy-Industry Foundation | Light incident angle controllable electronic device and manufacturing method thereof |
US9653158B2 (en) * | 2013-11-05 | 2017-05-16 | Postech Academy-Industry Foundation | Light incident angle controllable electronic device and manufacturing method thereof |
EP3505641A1 (en) * | 2015-12-18 | 2019-07-03 | Paris Sciences et Lettres - Quartier Latin | Optical device for measuring the position of an object |
Also Published As
Publication number | Publication date |
---|---|
EP2380055A4 (en) | 2012-07-11 |
US8660312B2 (en) | 2014-02-25 |
CN102292662A (en) | 2011-12-21 |
WO2010090849A1 (en) | 2010-08-12 |
EP2380055A1 (en) | 2011-10-26 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US8660312B2 (en) | Quantitative differential interference contrast (DIC) devices for computed depth sectioning | |
US8525091B2 (en) | Wavefront imaging devices comprising a film with one or more structured two dimensional apertures and their applications in microscopy and photography | |
US10606055B2 (en) | Aperture scanning Fourier ptychographic imaging | |
US10419665B2 (en) | Variable-illumination fourier ptychographic imaging devices, systems, and methods | |
US8822894B2 (en) | Light-field pixel for detecting a wavefront based on a first intensity normalized by a second intensity | |
US8416400B2 (en) | Wavefront imaging sensor | |
EP3488221B1 (en) | An integrated lens free imaging device | |
US9086536B2 (en) | Talbot imaging devices and systems | |
CN103959040B (en) | Optical coherence tomography system is attached on smart mobile phone | |
US20150160450A1 (en) | Embedded pupil function recovery for fourier ptychographic imaging devices | |
US20070148792A1 (en) | Wafer measurement system and apparatus | |
JP2013228735A (en) | Device and method for holographic reflection imaging | |
Isikman et al. | Modern Trends in Imaging VIII: Lensfree Computational Microscopy Tools for Cell and Tissue Imaging at the Point‐of‐Care and in Low‐Resource Settings | |
US12098991B2 (en) | Method and apparatus for detecting nanoparticles and biological molecules | |
Isikman et al. | Lensfree computational microscopy tools for cell and tissue imaging at the point-of-care and in low-resource settings | |
Coherence-gated wavefront sensing | | |
EP4191344A1 (en) | Imaging device and method for holographic imaging of samples | |
Dehghan et al. | Diffraction of correlated biphotons through transparent samples | |
Luo | Novel biomedical imaging systems | |
Koklu | High numerical aperture subsurface imaging | |
Shi | Multi-spectral Laser Scanning Confocal Microscopy with Structured Illumination |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: CALIFORNIA INSTITUTE OF TECHNOLOGY, CALIFORNIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:CUI, XIQUAN;YANG, CHANGHUEI;SIGNING DATES FROM 20100217 TO 20100219;REEL/FRAME:024074/0582 |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
FEPP | Fee payment procedure |
Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY |
|
FPAY | Fee payment |
Year of fee payment: 4 |
|
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YR, SMALL ENTITY (ORIGINAL EVENT CODE: M2552); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY Year of fee payment: 8 |