US20190235224A1 - Computational microscopes and methods for generating an image under different illumination conditions
- Publication number
- US20190235224A1 (Application No. US 15/775,389)
- Authority
- US
- United States
- Prior art keywords
- image
- sample
- illumination
- microscope
- controller
- Prior art date
- Legal status
- Abandoned
Classifications
- G02B21/367—Control or image processing arrangements for digital or video microscopes providing an output produced by processing a plurality of individual source images, e.g. image tiling, montage, composite images, depth sectioning, image comparison
- G02B21/02—Objectives
- G02B21/06—Means for illuminating specimens
- G02B21/245—Devices for focusing using auxiliary sources, detectors
- G02B21/36—Microscopes arranged for photographic purposes or projection purposes or digital imaging or video purposes including associated control and data processing arrangements
- G02B7/09—Mountings, adjusting means, or light-tight connections, for lenses with mechanism for focusing or varying magnification adapted for automatic focusing or varying magnification
- G02B7/36—Systems for automatic generation of focusing signals using image sharpness techniques, e.g. image processing techniques for generating autofocus signals
- G06K9/00134
- G06K9/00147
- G06V20/693—Acquisition (microscopic objects, e.g. biological cells or cellular parts)
- G06V20/698—Matching; Classification (microscopic objects, e.g. biological cells or cellular parts)
- H04N23/54—Mounting of pick-up tubes, electronic image sensors, deviation or focusing coils
- H04N23/55—Optical parts specially adapted for electronic image sensors; Mounting thereof
- H04N23/56—Cameras or camera modules comprising electronic image sensors provided with illuminating means
- H04N5/2253
- H04N5/2254
- H04N5/2256
Definitions
- the present disclosure relates generally to computational microscopy and, more specifically, to systems and methods for generating an image under different illumination conditions.
- a new generation of microscope technology, known as computational microscopy, has begun to emerge; it makes use of advanced image-processing algorithms (usually with hardware modifications) to overcome limitations of conventional microscopes.
- a computational microscope can, in some cases, produce high-resolution digital images of samples without using expensive optical lenses.
- a computational microscope may open the door for additional capabilities based on computer vision, sharing of data, etc.
- Disclosed systems and methods relate to the field of computational microscopy. Certain disclosed embodiments are directed to systems and methods for focusing a microscope using images acquired under a plurality of illumination conditions. The disclosed embodiments also include systems and methods for acquiring images under a plurality of illumination conditions to generate a high-resolution image of a sample.
- a microscope for constructing an image of a sample using image information acquired under a plurality of different illumination conditions.
- the microscope may include at least one image capture device configured to capture, at a first image resolution, images of a sample.
- the microscope may further include a lens with a first numerical aperture.
- the microscope may also include an illumination assembly including at least one light source configured to illuminate the sample, wherein a maximal incidence angle of illumination represents a second numerical aperture which is at least 1.5 times the first numerical aperture.
- the microscope may further include at least one controller programmed to: cause the illumination assembly to illuminate the sample at a series of different illumination conditions; acquire from the at least one image capture device a plurality of images of the sample, wherein the plurality of images includes at least one image for each illumination condition; determine, from the at least one image, image data of the sample for each illumination condition; and generate, from the determined image data for each illumination condition and in a non-iterative process, a reconstructed image of the sample, the reconstructed image having a second image resolution higher than the first image resolution.
- a microscope for constructing an image of a sample using image information acquired under a plurality of different illumination conditions.
- the microscope may include at least one image capture device configured to capture, at a first image resolution, images of a sample.
- the microscope may also include a lens with a first numerical aperture.
- the microscope may further include an illumination assembly including at least one light source configured to illuminate the sample, wherein a maximal incidence angle of illumination represents a second numerical aperture which is at least 1.5 times the first numerical aperture.
- the microscope may further include at least one controller programmed to cause the illumination assembly to illuminate the sample at a series of different illumination angles; acquire from the at least one image capture device a plurality of images of the sample, wherein the plurality of images includes at least one image for each illumination angle; determine, from the at least one image, image data of the sample for each illumination angle, wherein the image data includes phase information of the sample under each illumination condition; and generate, from the image data for the series of different illumination angles, a reconstructed image of the sample, the reconstructed image having a second image resolution higher than the first image resolution.
- a method for constructing an image of a sample using image information acquired under a plurality of different illumination conditions.
- the method may include illuminating a sample at a series of different illumination conditions, wherein an illumination of the sample is at an incidence angle representing a numerical aperture which is at least 1.5 times a numerical aperture associated with an image capture device; acquiring, from the image capture device, a plurality of images of the sample captured at a first image resolution, wherein the plurality of images includes at least one image for each illumination condition; determining, from the at least one image, image data of the sample for each illumination condition, wherein the image data includes phase information of the sample under each illumination condition; and generating, from the determined image data for each illumination condition and in a non-iterative process, a reconstructed image of the sample, the reconstructed image having a second image resolution higher than the first image resolution.
- a non-transitory computer-readable storage medium may store program instructions that, when executed by at least one controller, perform any of the methods described herein.
- FIG. 1 is a diagrammatic representation of an exemplary microscope, consistent with the disclosed embodiments
- FIG. 2A is a diagrammatic representation of the optical paths of two beam pairs when the microscope of FIG. 1 is out of focus, consistent with the disclosed embodiments;
- FIG. 2B is a diagrammatic representation of the optical paths of two beam pairs when the microscope of FIG. 1 is in focus, consistent with the disclosed embodiments;
- FIG. 3A is a diagrammatic representation of an exemplary image shown on a display when the microscope of FIG. 1 is out of focus, consistent with the disclosed embodiments;
- FIG. 3B is a diagrammatic representation of an exemplary image shown on a display when the microscope of FIG. 1 is in focus, consistent with the disclosed embodiments;
- FIG. 4 is a flowchart showing an exemplary process for focusing an image of a sample using images acquired under a plurality of illumination conditions, consistent with the disclosed embodiments;
- FIG. 5 is a representation of an exemplary process for constructing an image of a sample using images acquired under a plurality of illumination conditions, consistent with disclosed embodiments;
- FIG. 6A is a diagrammatic representation of a configuration for determining phase information of a sample under a plurality of illumination conditions, consistent with the disclosed embodiments;
- FIG. 6B is a diagrammatic representation of another configuration for determining phase information of a sample under a plurality of illumination conditions, consistent with the disclosed embodiments;
- FIG. 6C is a diagrammatic representation of another configuration for determining phase information of a sample under a plurality of illumination conditions, consistent with the disclosed embodiments;
- FIG. 6D is a diagrammatic representation of another configuration for determining phase information of a sample under a plurality of illumination conditions, consistent with the disclosed embodiments;
- FIG. 6E is a diagrammatic representation of another configuration for determining phase information of a sample under a plurality of illumination conditions, consistent with the disclosed embodiments;
- FIG. 6F is a diagrammatic representation of another configuration for determining phase information of a sample under a plurality of illumination conditions, consistent with the disclosed embodiments;
- FIG. 7 is a flow diagram showing the implementation of the process of FIG. 5 using the configuration of FIG. 6A , consistent with the disclosed embodiments;
- FIG. 8 is a diagrammatic representation of the numerical aperture of the microscope of FIG. 1 , consistent with the disclosed embodiments;
- FIG. 9A is an illustration in Fourier-plane of image data acquired under a single illumination condition, consistent with the disclosed embodiments.
- FIG. 9B is an illustration in Fourier-plane of image data acquired under a plurality of different illumination conditions, consistent with the disclosed embodiments.
- FIG. 10 is a flowchart showing an exemplary process for reconstructing an image of a sample using images acquired under a plurality of illumination conditions, consistent with the disclosed embodiments.
- Disclosed embodiments provide microscopes and methods that use one or more cameras to provide high-resolution images of a sample which may be located on a stage.
- the microscope may use images of the sample captured under a plurality of illumination conditions.
- the plurality of illumination conditions may include different illumination angles.
- the microscope may identify, in the captured images, multiple occurrences of the sample corresponding to the plurality of illumination conditions. The microscope may estimate a shift between the occurrences and determine a degree to which the microscope is out of focus. This aspect of the disclosure is described in detail with reference to FIGS. 2-4 .
- the microscope may capture multiple images of the sample under each illumination condition, aggregate image data from these images, and construct a high-resolution image from the image data.
- the microscope may aggregate the image data in the Fourier plane and then use inverse Fourier transform to reconstruct the high-resolution image. This aspect of the disclosure is described in detail with reference to FIGS. 5-10 .
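- As an overview, the flow just described (capture per illumination condition, recover image data, aggregate in the Fourier plane, inverse-transform) can be sketched in Python. The argument names `illuminate`, `capture_image`, `compute_phase`, and `place_in_spectrum` are hypothetical placeholders for the hardware- and algorithm-specific steps detailed in the remainder of this disclosure, not APIs it defines:

```python
import numpy as np

def reconstruct(conditions, illuminate, capture_image, compute_phase,
                place_in_spectrum, hr_shape):
    """Illustrative sketch: aggregate per-condition Fourier data into a
    combined complex image, then inverse-transform it (cf. FIGS. 5-10)."""
    combined = np.zeros(hr_shape, dtype=complex)
    for condition in conditions:
        illuminate(condition)                     # e.g., select an LED / angle
        images = capture_image(condition)         # low-resolution image(s)
        field = compute_phase(images, condition)  # complex field: amplitude + phase
        spectrum = np.fft.fftshift(np.fft.fft2(field))
        place_in_spectrum(combined, spectrum, condition)
    # The inverse Fourier transform of the combined complex image yields
    # the reconstructed high-resolution image.
    return np.fft.ifft2(np.fft.ifftshift(combined))
```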
- FIG. 1 is a diagrammatic representation of a microscope 100 consistent with the exemplary disclosed embodiments.
- the term “microscope” refers to any device or instrument for magnifying an object which is smaller than can easily be observed by the naked eye, i.e., creating an image of an object for a user where the image is larger than the object.
- One type of microscope may be an “optical microscope” that uses light in combination with an optical system for magnifying an object.
- An optical microscope may be a simple microscope having one or more magnifying lenses.
- Another type of microscope may be a “computational microscope” that includes an image sensor and image-processing algorithms to enhance or magnify the object's size or other properties.
- the computational microscope may be a dedicated device or created by incorporating software and/or hardware with an existing optical microscope to produce high-resolution digital images.
- microscope 100 includes an image capture device 102 , a focus actuator 104 , a controller 106 connected to memory 108 , an illumination assembly 110 , and a user interface 112 .
- An example usage of microscope 100 may be capturing images of a sample 114 mounted on a stage 116 located within the field-of-view (FOV) of image capture device 102 , processing the captured images, and presenting on user interface 112 a magnified image of sample 114 .
- Image capture device 102 may be used to capture images of sample 114 .
- the term “image capture device” includes a device that records the optical signals entering a lens as an image or a sequence of images.
- the optical signals may be in the near-infrared, infrared, visible, and ultraviolet spectrums.
- Examples of an image capture device include a CCD camera, a CMOS camera, a photo sensor array, a video camera, a mobile phone equipped with a camera, etc.
- Some embodiments may include only a single image capture device 102 , while other embodiments may include two, three, or even four or more image capture devices 102 .
- image capture device 102 may be configured to capture images in a defined field-of-view (FOV).
- image capture devices 102 may have overlap areas in their respective FOVs.
- Image capture device 102 may have one or more image sensors (not shown in FIG. 1 ) for capturing image data of sample 114 .
- image capture device 102 may be configured to capture images at an image resolution higher than 10 megapixels, higher than 12 megapixels, higher than 15 megapixels, or higher than 20 megapixels.
- image capture device 102 may also be configured to have a pixel size smaller than 5 micrometers, smaller than 3 micrometers, or smaller than 1.6 micrometers.
- microscope 100 includes focus actuator 104 .
- focus actuator refers to any device capable of converting input signals into physical motion for adjusting the relative distance between sample 114 and image capture device 102 .
- Various focus actuators may be used, including, for example, linear motors, electrostrictive actuators, electrostatic motors, capacitive motors, voice coil actuators, magnetostrictive actuators, etc.
- focus actuator 104 may include an analog position feedback sensor and/or a digital position feedback element. Focus actuator 104 is configured to receive instructions from controller 106 in order to make light beams converge to form a clear and sharply defined image of sample 114 .
- focus actuator 104 may be configured to adjust the distance by moving image capture device 102 .
- focus actuator 104 may be configured to adjust the distance by moving stage 116 , or by moving both image capture device 102 and stage 116 .
- Microscope 100 may also include controller 106 for controlling the operation of microscope 100 according to the disclosed embodiments.
- Controller 106 may comprise various types of devices for performing logic operations on one or more inputs of image data and other data according to stored or accessible software instructions providing desired functionality.
- controller 106 may include a central processing unit (CPU), support circuits, digital signal processors, integrated circuits, cache memory, or any other types of devices for image processing and analysis such as graphic processing units (GPUs).
- the CPU may comprise any number of microcontrollers or microprocessors configured to process the imagery from the image sensors.
- the CPU may include any type of single- or multi-core processor, mobile device microcontroller, etc.
- processors may be used, including, for example, processors available from manufacturers such as Intel®, AMD®, etc. and may include various architectures (e.g., x86 processor, ARM®, etc.).
- the support circuits may be any number of circuits generally well known in the art, including cache, power supply, clock and input-output circuits.
- controller 106 may be associated with memory 108 used for storing software that, when executed by controller 106 , controls the operation of microscope 100 .
- memory 108 may also store electronic data associated with operation of microscope 100 such as, for example, captured or generated images of sample 114 .
- memory 108 may be integrated into the controller 106 .
- memory 108 may be separated from the controller 106 .
- memory 108 may refer to multiple structures or computer-readable storage mediums located at controller 106 or at a remote location, such as a cloud server.
- Memory 108 may comprise any number of random access memories, read only memories, flash memories, disk drives, optical storage, tape storage, removable storage and other types of storage.
- Microscope 100 may include illumination assembly 110 .
- illumination assembly refers to any device or system capable of projecting light to illuminate sample 114 .
- Illumination assembly 110 may include any number of light sources, such as light emitting diodes (LEDs), lasers, and lamps configured to emit light.
- illumination assembly 110 may include only a single light source.
- illumination assembly 110 may include four, sixteen, or even more than a hundred light sources organized in an array or a matrix.
- illumination assembly 110 may use one or more light sources, located at a surface parallel to sample 114 , to illuminate sample 114 .
- illumination assembly 110 may use one or more light sources located at a surface perpendicular or at an angle to sample 114 .
- illumination assembly 110 may be configured to illuminate sample 114 in a series of different illumination conditions.
- illumination assembly 110 may include a plurality of light sources arranged in different illumination angles, such as a two-dimensional arrangement of light sources.
- the different illumination conditions may include different illumination angles.
- FIG. 1 depicts a beam 118 projected from a first illumination angle α1;
- a beam 120 projected from a second illumination angle α2 may have the same value but opposite sign.
- first illumination angle α1 may be separated from second illumination angle α2.
- both angles originate from points within the acceptance angle of the optics.
- illumination assembly 110 may include a plurality of light sources configured to emit light in different wavelengths.
- the different illumination conditions may include different wavelengths.
- illumination assembly 110 may be configured to use a number of light sources at predetermined times.
- the different illumination conditions may include different illumination patterns. Accordingly and consistent with the present disclosure, the different illumination conditions may be selected from a group including: different durations, different intensities, different positions, different illumination angles, different illumination patterns, different wavelengths, or any combination thereof.
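- In software, such a series of illumination conditions can be represented as a simple list of parameter records, as in the sketch below; the field names and example values are assumptions for illustration, not a layout prescribed by this disclosure:

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass(frozen=True)
class IlluminationCondition:
    """One entry in a series of illumination conditions (hypothetical layout)."""
    angle_deg: Tuple[float, float]       # (elevation, azimuth) illumination angle
    wavelength_nm: float = 530.0         # illumination wavelength
    intensity: float = 1.0               # relative source intensity
    duration_ms: float = 10.0            # illumination duration
    pattern: Optional[str] = None        # e.g., name of an LED-matrix pattern

# Example series combining different angles and wavelengths.
conditions = [
    IlluminationCondition(angle_deg=(20.0, azimuth), wavelength_nm=wl)
    for azimuth in (0.0, 90.0, 180.0, 270.0)
    for wl in (470.0, 530.0, 630.0)
]
```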
- microscope 100 may include, be connected with, or in communication with (e.g., over a network or wirelessly, e.g., via Bluetooth) user interface 112 .
- the term “user interface” refers to any device suitable for presenting a magnified image of sample 114 or any device suitable for receiving inputs from one or more users of microscope 100 .
- FIG. 1 illustrates two examples of user interface 112 .
- the first example is a smartphone or a tablet wirelessly communicating with controller 106 over a Bluetooth, cellular, or Wi-Fi connection, directly or through a remote server.
- the second example is a PC display physically connected to controller 106 .
- user interface 112 may include user output devices, including, for example, a display, tactile device, speaker, etc.
- user interface 112 may include user input devices, including, for example, a touchscreen, microphone, keyboard, pointer devices, cameras, knobs, buttons, etc. With such input devices, a user may be able to provide information inputs or commands to microscope 100 by typing instructions or information, providing voice commands, selecting menu options on a screen using buttons, pointers, or eye-tracking capabilities, or through any other suitable techniques for communicating information to microscope 100 .
- User interface 112 may be connected (physically or wirelessly) with one or more processing devices, such as controller 106 , to provide and receive information to or from a user and process that information.
- processing devices may execute instructions for responding to keyboard entries or menu selections, recognizing and interpreting touches and/or gestures made on a touchscreen, recognizing and tracking eye movements, receiving and interpreting voice commands, etc.
- Microscope 100 may also include or be connected to stage 116 .
- Stage 116 includes any horizontal rigid surface where sample 114 may be mounted for examination.
- Stage 116 may include a mechanical connector for retaining a slide containing sample 114 in a fixed position.
- the mechanical connector may use one or more of the following: a mount, an attaching member, a holding arm, a clamp, a clip, an adjustable frame, a locking mechanism, a spring or any combination thereof.
- stage 116 may include a translucent portion or an opening for allowing light to illuminate sample 114 .
- light transmitted from illumination assembly 110 may pass through sample 114 and towards image capture device 102 .
- stage 116 and/or sample 114 may be moved using motors or manual controls in the XY plane to enable imaging of multiple areas of the sample.
- FIG. 2A and FIG. 2B depict a closer view of microscope 100 in two cases. Specifically, FIG. 2A illustrates the optical paths of two beam pairs when microscope 100 is out of focus, and FIG. 2B illustrates the optical paths of two beam pairs when microscope 100 is in focus.
- image capture device 102 includes an image sensor 200 and a lens 202 .
- lens 202 may be referred to as an objective lens of microscope 100 .
- image sensor refers to a device capable of detecting and converting optical signals into electrical signals. The electrical signals may be used to form an image or a video stream based on the detected signals.
- Examples of image sensor 200 may include semiconductor charge-coupled devices (CCD), active pixel sensors in complementary metal-oxide-semiconductor (CMOS), or N-type metal-oxide-semiconductor (NMOS, Live MOS).
- lens may refer to a ground or molded piece of glass, plastic, or other transparent material with opposite surfaces either or both of which are curved, by means of which light rays are refracted so that they converge or diverge to form an image.
- the term “lens” also refers to an element containing one or more lenses as defined above, such as in a microscope objective.
- the lens is positioned at least generally transversely of the optical axis of image sensor 200 .
- Lens 202 may be used for concentrating light beams from sample 114 and directing them towards image sensor 200 .
- image capture device 102 may include a fixed lens or a zoom lens.
- When sample 114 is located at focal-plane 204 , the image projected from lens 202 is completely focused.
- the term “focal-plane” is used herein to describe a plane that is perpendicular to the optical axis of lens 202 and passes through the lens's focal point.
- the distance between focal-plane 204 and the center of lens 202 is called the focal length and is represented by D 1 .
- sample 114 may not be completely flat, and there may be small differences between focal-plane 204 and various regions of sample 114 . Accordingly, the distance between focal-plane 204 and sample 114 or a region of interest (ROI) of sample 114 is marked as D 2 .
- the distance D 2 corresponds to the degree to which an image of sample 114 , or an image of an ROI of sample 114 , is out of focus.
- distance D 2 may be between 0 and about 3 mm. In some embodiments, D 2 may be greater than 3 mm.
- when distance D 2 equals zero, the image of sample 114 (or the image of an ROI of sample 114 ) is completely focused. In contrast, when D 2 has a value other than zero, the image of sample 114 (or the image of an ROI of sample 114 ) is out of focus.
- FIG. 2A depicts a case where the image of sample 114 is out of focus.
- the image of sample 114 may be out of focus when the beams of light received from sample 114 do not converge on image sensor 200 .
- FIG. 2A depicts a beam pair 206 and a beam pair 208 . Neither pair converges on image sensor 200 .
- the optical paths below sample 114 are not shown.
- beam pair 206 may correspond with beam 120 projected from illumination assembly 110 at illumination angle α2;
- beam pair 208 may correspond with beam 118 projected from illumination assembly 110 at illumination angle α1.
- beam pair 206 may hit image sensor 200 concurrently with beam pair 208.
- beam pair 206 and beam pair 208 may sequentially contact image sensor 200.
- in the sequential case, image sensor 200 may start recording information associated with, for example, beam pair 206 after completing the recording of information associated with beam pair 208.
- D 2 is the distance between focal-plane 204 and sample 114 , and it corresponds to the degree to which sample 114 is out of focus. In one example, D 2 may have a value of 50 micrometers.
- Focus actuator 104 is configured to change distance D 2 by converting input signals from controller 106 into physical motion. In some embodiments, in order to focus the image of sample 114 , focus actuator 104 may move image capture device 102 . In this example, to focus the image of sample 114 , focus actuator 104 may move image capture device 102 up by 50 micrometers. In other embodiments, in order to focus the image of sample 114 , focus actuator 104 may move stage 116 down; therefore, in this example, instead of moving image capture device 102 up by 50 micrometers, focus actuator 104 may move stage 116 down by 50 micrometers.
- FIG. 2B illustrates a case where the image of sample 114 is in focus.
- both beam pairs 206 and 208 converge on image sensor 200 , and distance D 2 equals zero.
- focusing the image of sample 114 may require adjusting the relative distance between image capture device 102 and sample 114 .
- the relative distance may be represented by D 1 −D 2 ; when distance D 2 equals zero, the relative distance between image capture device 102 and sample 114 equals distance D 1 , which means that the image of sample 114 is focused.
- lens 202 has a fixed focal length, i.e., distance D 1 is constant. Therefore, the missing parameter needed to focus the image of sample 114 is distance D 2 .
- the present disclosure provides a microscope and a method for determining the value of distance D 2 .
- FIG. 3A and FIG. 3B illustrate how microscope 100 may determine the value of distance D 2 using images acquired under a plurality of different illumination conditions.
- FIG. 3A illustrates an exemplary image (or two images overlaid on top of each other) shown on user interface 112 when the image of sample 114 is out of focus
- FIG. 3B illustrates an exemplary image shown on user interface 112 when the image of sample 114 is in focus.
- FIG. 3A shows user interface 112 displaying information obtained from image sensor 200 that corresponds with the case illustrated in FIG. 2A .
- user interface 112 displays a first representation 300 of sample 114 and a second representation 302 of sample 114 .
- Each representation corresponds to a different illumination condition.
- first representation 300 may correspond to first illumination angle α1;
- second representation 302 may correspond to second illumination angle α2.
- both first representation 300 and second representation 302 are displayed together as part of a captured image because the light projected from the first illumination angle α1 may hit image sensor 200 concurrently with the light projected from the second illumination angle α2.
- first representation 300 is captured as a first image and second representation 302 is captured as a second image. Both images may be overlaid on top of each other and shown as a single image or used for calculations together.
- controller 106 may be configured to identify the relative positions of the two (or more) representations using at least one common image feature of sample 114 .
- the term “image feature” refers to an identifiable element in a digital image, such as a line, a point, a spot, an edge, a region of similar brightness, a similar shape, an area of the image, etc., or another distinguishing characteristic of the pixels that comprise the image of sample 114 .
- first representation 300 and second representation 302 include a sharp protrusion on the upper side of the representation.
- controller 106 may identify a first occurrence 304 of the sharp protrusion and a second occurrence 306 of the sharp protrusion as a common image feature of sample 114 .
- first occurrence 304 may be associated with the first illumination angle α1, and second occurrence 306 may be associated with the second illumination angle α2.
- controller 106 may estimate an amount of shift between the occurrences.
- the shift between first occurrence 304 and second occurrence 306 is represented by D 3 .
- the shift between first occurrence 304 and second occurrence 306 may be measured by counting the number of pixels between two occurrences of the same one or more image features.
- in principle, the values of shift D 3 measured by comparing multiple image features in the first and second representations should be substantially identical. However, as often happens in real-life applications, there may be significant variation in the measured values of shift D 3 when estimating shifts of a plurality of image features.
- controller 106 may apply statistical calculations on the measured values.
- the statistical calculations may include one or more of the following operations: a mean, a median, an average, a mode, a percentile or a standard deviation.
- Controller 106 may additionally apply these statistical calculations when determining a plurality of shift values or a vector shift when using more than two illumination conditions, such as more than two illumination angles.
- controller 106 may determine distance D 2 using the distance L between the illumination sources, the distance Z between the illumination source plane and the current focal-plane, and the measured shift D 3 .
- the distance D 2 may be calculated using the following linear equation: D 2 = (D 3 × Z) / L.
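- A minimal numerical sketch of this estimation is shown below: the shift D 3 is measured by phase correlation (one possible estimator consistent with counting pixel offsets between occurrences), a median can aggregate multiple measurements, and the linear relation above converts the shift into distance D 2 . The variable names and the object-space unit convention are assumptions for illustration:

```python
import numpy as np

def estimate_shift_px(img_a, img_b):
    """Estimate the (dy, dx) shift in pixels between two occurrences of
    the sample using phase correlation."""
    f = np.fft.fft2(img_a) * np.conj(np.fft.fft2(img_b))
    corr = np.fft.ifft2(f / (np.abs(f) + 1e-12))
    peak = np.array(np.unravel_index(np.argmax(np.abs(corr)), corr.shape), float)
    dims = np.array(corr.shape, float)
    peak[peak > dims / 2] -= dims[peak > dims / 2]  # wrap to signed shifts
    return peak

def defocus_from_shift(d3, source_separation_l, source_distance_z):
    """D2 = (D3 * Z) / L, per the linear relation above; D3 must first be
    converted to object-space units (pixel size / magnification)."""
    return d3 * source_distance_z / source_separation_l

# Statistical step: aggregate shifts measured from several image features,
# e.g., d3 = np.median(list_of_feature_shifts)
```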
- controller 106 may also determine the direction of the required adjustment. For example, in some cases focal-plane 204 may be below sample 114 (as illustrated in FIG. 2A ), and controller 106 would need to increase the relative distance between image capture device 102 and sample 114 to focus the image. But in other cases focal-plane 204 may be above sample 114 , and controller 106 may need to decrease the relative distance between image capture device 102 and sample 114 to focus the image. In one embodiment, controller 106 may determine the direction of the required adjustment using a two-step process.
- controller 106 may instruct focus actuator 104 to move image capture device 102 up by 0.3 mm and check whether the focus of the image has improved. If it has improved, controller 106 may instruct focus actuator 104 to continue moving image capture device 102 up by an additional 0.7 mm; but if it has not improved, controller 106 may instruct focus actuator 104 to move image capture device 102 down by 1.3 mm. In another embodiment, controller 106 may determine the direction of the required adjustment using a one-step process, for example, by purposefully introducing a known separation between sample 114 and focal-plane 204 .
- controller 106 may also determine the direction of the required adjustment using a one-step process by, for example, measuring the direction of the shift of features between representations 300 and 302 and using knowledge of the illumination conditions to determine whether the sample is above or below the focal-plane.
- FIG. 3B depicts user interface 112 displaying information obtained from image sensor 200 that corresponds with the case illustrated in FIG. 2B .
- user interface 112 displays a representation 308 of sample 114 .
- user interface 112 displays a single representation because the image of sample 114 is in focus. That is, first occurrence 304 and second occurrence 306 merged into representation 308 after controller 106 adjusted the relative distance between image capture device 102 and sample 114 .
- controller 106 may determine that the quality of the image is not sufficient. For example, the level of sharpness associated with an image of sample 114 may be below a predefined threshold. The level of sharpness may vary due to, for example, unintentional movement of microscope 100 , a change of the ROI of sample 114 , and more. To improve the quality of the image, controller 106 may refocus microscope 100 . In addition, controller 106 may determine a plurality of shift values that correspond with a plurality of portions of a field of view of image capture device 102 to determine three-dimensional information. The three-dimensional information may include tilt information between microscope 100 and sample 114 , the 3D shape of the object, and/or the field curvature of lens 202 . Controller 106 may use the tilt information when reconstructing the image of sample 114 to improve its sharpness. Additional examples regarding the reconstruction of the image of sample 114 are provided below with reference to FIGS. 5-10 .
- FIG. 4 is a flowchart showing an exemplary process 400 for focusing an image of sample 114 using two images captured when sample 114 is illuminated under two illumination conditions.
- Process 400 may be adapted to focus an image of sample 114 using a single image captured when sample 114 is illuminated under two illumination conditions, or using one or more images when sample 114 is illuminated under more than two illumination conditions.
- the steps of process 400 may be performed by an autofocus microscope.
- the term “autofocus microscope” refers to any device for magnifying sample 114 with the capability to focus the image of sample 114 (or the image of an ROI of sample 114 ) in an automatic or semiautomatic manner.
- controller 106 may cause illumination assembly 110 to illuminate sample 114 under a first illumination condition.
- controller 106 may acquire, from image capture device 102 , a first image of sample 114 illuminated under the first illumination condition.
- controller 106 may cause illumination assembly 110 to illuminate sample 114 using a single light source located within a numerical aperture of image capture device 102 .
- controller 106 may cause illumination assembly 110 to illuminate sample 114 using a plurality of light sources located within the numerical aperture of image capture device 102 .
- controller 106 may cause illumination assembly 110 to illuminate sample 114 under a second illumination condition different from the first illumination condition.
- controller 106 may acquire, from image capture device 102 , a second image of sample 114 illuminated under the second illumination condition.
- the illumination conditions may include at least one of: different illumination angles, different illumination patterns, different wavelengths, or a combination thereof.
- the illumination conditions may include a first illumination angle and a second illumination angle symmetrically located with respect to an optical axis of image capture device 102 .
- the illumination conditions may include a first illumination angle and a second illumination angle asymmetrically located with respect to an optical axis of image capture device 102 .
- the illumination conditions may include a first illumination angle and a second illumination angle within the numerical aperture of image capture device 102 .
- first illumination angle α1 is greater than second illumination angle α2; thus, the first illumination angle and the second illumination angle are asymmetrically located with respect to an optical axis of image capture device 102 .
- controller 106 may determine an amount of shift D 3 between one or more image features present in the first image of sample 114 and a corresponding one or more image features present in the second image of sample 114 .
- controller 106 may determine a plurality of shift values based on multiple image features and calculate an overall shift associated with shift D 3 .
- the overall shift may be a mean, a median, or a mode of the plurality of shift values.
- controller 106 may determine a size of the distance change based on a magnitude of shift D 3 .
- controller 106 may also determine a direction of the distance change based on a direction of shift D 3 , or by purposely introducing a known separation between the sample and the focal plane.
- focal-plane 204 may be below sample 114 (as illustrated in FIG. 2A ), and in other cases, focal-plane 204 may be above sample 114 . These different cases may require a different direction of the distance change to focus microscope 100 .
- controller 106 may calculate a distance from focal-plane 204 to sample 114 based on the shift (e.g., shift D 3 in the lateral direction).
- controller 106 may, where the amount of determined shift D 3 is non-zero, cause focus actuator 104 to change distance D 2 between sample 114 and focal-plane 204 .
- focus actuator 104 may move image capture device 102 to adjust distance D 2 between sample 114 and focal-plane 204 , or move stage 116 to adjust the distance between sample 114 and focal-plane 204 .
- controller 106 may cause focus actuator 104 to reduce the distance between sample 114 and focal-plane 204 to substantially zero, for example, as illustrated in FIG. 2B .
- controller 106 may cause focus actuator 104 to change the distance in a first direction (e.g., up).
- controller 106 may determine that the amount of shift D 3 has increased after the change in the first direction, and cause focus actuator 104 to change the distance in a second direction (e.g., down).
- controller 106 may repeat steps 402 to 410 to determine an amount of a new shift after adjusting distance D 2 between sample 114 and focal-plane 204 . If the amount of the new shift is still non-zero or above a predefined threshold, controller 106 may cause focus actuator 104 to change distance D 2 between sample 114 and focal-plane 204 again. In some embodiments, controller 106 may readjust distance D 2 between sample 114 and focal-plane 204 until shift D 3 is substantially zero or below the predefined threshold. When the amount of the new shift is below a predetermined threshold, controller 106 may store the amount of determined shift for future focus compensation calculations. After completing process 400 , microscope 100 is completely focused.
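- The closed loop of process 400 can be sketched as follows. The callables `capture_pair`, `measure_shift`, `shift_to_distance`, and `move_stage` are hypothetical stand-ins for the two-illumination capture, steps 402-410, the D 2 calculation, and focus actuator 104; the sign convention of the move is an assumption:

```python
def autofocus(capture_pair, measure_shift, shift_to_distance, move_stage,
              threshold_px=0.5, max_iters=10):
    """Closed-loop sketch of process 400: measure shift D3, convert it to
    distance D2, move, and repeat until the shift is substantially zero."""
    for _ in range(max_iters):
        img_1, img_2 = capture_pair()       # images under the two illumination conditions
        d3 = measure_shift(img_1, img_2)    # signed shift (pixels)
        if abs(d3) < threshold_px:
            return d3                       # store residual shift for future compensation
        d2 = shift_to_distance(d3)          # e.g., D2 = (D3 * Z) / L
        move_stage(-d2)                     # sign convention is an assumption
    raise RuntimeError("autofocus did not converge")
```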
- microscope 100 may acquire a plurality of focused images to generate a high-resolution image of sample 114 .
- the high-resolution image of sample 114 may be sent to a display (e.g., a screen or phone), stored in memory, sent for further processing or sent over a network.
- controller 106 may use the determined distance D 2 to perform calculations for computational correction of focus, along with physical motion of stage 116 or without causing stage 116 to move. Furthermore, in some embodiments, stage 116 and/or sample 114 may be moved using motors or manual controls in the XY plane to enable imaging of multiple areas of sample 114 .
- controller 106 may acquire images at a first image resolution and generate a reconstructed image of sample 114 having a second (enhanced) image resolution.
- image resolution is a measure of the degree to which the image represents the fine details of sample 114 .
- the quality of a digital image may also be related to the number of pixels and the range of brightness values available for each pixel.
- generating the reconstructed image of sample 114 is based on images having an image resolution lower than the enhanced image resolution.
- the enhanced image resolution may have at least 2 times, 5 times, 10 times, or 100 times more pixels than the lower-resolution images.
- the first image resolution of the captured images may be referred to hereinafter as low-resolution and may have a value between 2 megapixels and 25 megapixels, between 10 megapixels and 20 megapixels, or about 15 megapixels.
- the second image resolution of the reconstructed image may be referred to hereinafter as high-resolution and may have a value higher than 40 megapixels, higher than 100 megapixels, higher than 500 megapixels, or higher than 1000 megapixels.
- FIG. 5 is an illustration of an exemplary process 500 for reconstructing an image of sample 114 , consistent with disclosed embodiments.
- controller 106 may acquire from image capture device 102 a plurality of low resolution images of sample 114 .
- the plurality of images includes at least one image for each illumination condition.
- the different illumination conditions may include at least one of: different illumination angles, different illumination patterns, different wavelengths, or a combination thereof.
- the total number (N) of the plurality of different illumination conditions is between 2 and 10, between 5 and 50, between 10 and 100, between 50 and 1000, or more than 1000.
- controller 106 may determine image data of sample 114 associated with each illumination condition. For example, controller 106 may apply a Fourier transform on images acquired from image capture device 102 to obtain Fourier transformed images.
- the Fourier transform is an image processing tool which is used to decompose an image into its sine and cosine components.
- the input of the transformation may be an image in the normal image space (also known as real-plane), while the output of the transformation may be a representation of the image in the frequency domain (also known as a Fourier-plane).
- controller 106 may use other transformations, such as a Laplace transform, a Z transform, a Gelfand transform, or a Wavelet transform.
- controller 106 may use a Fast Fourier Transform (FFT) algorithm to compute the Discrete Fourier Transform (DFT) by factorizing the DFT matrix into a product of sparse (mostly zero) factors.
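- For example, using NumPy's standard FFT routines (generic library calls, not patent-specific code), a captured image can be moved between the real plane and the Fourier-plane as follows:

```python
import numpy as np

image = np.random.rand(256, 256)                 # stand-in for a captured low-resolution image
spectrum = np.fft.fftshift(np.fft.fft2(image))   # centered Fourier-plane representation
amplitude, phase = np.abs(spectrum), np.angle(spectrum)
image_back = np.fft.ifft2(np.fft.ifftshift(spectrum)).real  # inverse transform recovers the image
```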
- controller 106 may aggregate the image data determined from images captured under a plurality of illumination conditions to form a combined complex image.
- One way for controller 106 to aggregate the image data is by locating in the Fourier-plane overlapping regions in the image data.
- Another way for controller 106 to aggregate the image data is by determining the intensity and phase for the acquired low-resolution images per illumination condition.
- the image data corresponding to the different illumination conditions does not necessarily include overlapping regions.
- this method has a significant advantage in reducing the number of illumination conditions needed to reconstruct an image at a given resolution, thereby increasing the acquisition speed of the image information.
- FIGS. 6A-6F illustrate different configurations of microscope 100 for determining phase information under a variety of illumination conditions.
- controller 106 may generate a reconstructed high-resolution image of sample 114 .
- controller 106 may apply the inverse Fourier transform to obtain the reconstructed image.
- the reconstructed high-resolution image of sample 114 may be shown on a display (e.g., user interface 112 ).
- the reconstructed high-resolution image of sample 114 may be used to identify at least one element of sample 114 in the reconstructed image.
- the at least one element of sample 114 may include any organic or nonorganic material identifiable using a microscope.
- the at least one element examples include, but are not limited to, biomolecules, whole cells, portions of cells such as various cell components (e.g., cytoplasm, mitochondria, nucleus, chromosomes, nucleoli, nuclear membrane, cell membrane, Golgi apparatus, lysosomes), cell-secreted components (e.g., proteins secreted to intercellular space, proteins secreted to body fluids, such as serum, cerebrospinal fluid, urine), microorganisms, and more.
- the reconstructed image may be used in the following procedures: blood cell recognition, identification of chromosomes and karyotypes, detection of parasitic infections, identification of tissues suspected as malignant, and more.
- microscope 100 may include illumination assembly 110 , focus actuator 104 , lens 202 , and image sensor 200 .
- controller 106 may acquire a group of images from different focal-planes for each illumination condition. Therefore, controller 106 may use the information from the different focal-planes to determine the phase information under each illumination condition.
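- One way to recover phase from such a focus stack is the transport-of-intensity (TIE) approach mentioned in connection with FIG. 7 below. The following sketch solves the TIE under a simplifying near-uniform-intensity assumption, with illustrative parameter names; it is one possible implementation, not the specific algorithm of this disclosure:

```python
import numpy as np

def tie_phase(i_minus, i_0, i_plus, dz_um, wavelength_um, pixel_um, eps=1e-9):
    """Recover phase from a three-plane focus stack via the transport-of-
    intensity equation, simplified by assuming near-uniform intensity:
    laplacian(phi) ~= -(k / I) * dI/dz, solved with a Fourier Poisson solver."""
    k = 2.0 * np.pi / wavelength_um                 # wavenumber
    didz = (i_plus - i_minus) / (2.0 * dz_um)       # axial intensity derivative
    rhs = -k * didz / np.maximum(i_0, eps)          # right-hand side of Poisson eq.
    ny, nx = i_0.shape
    fy = np.fft.fftfreq(ny, d=pixel_um)
    fx = np.fft.fftfreq(nx, d=pixel_um)
    fxx, fyy = np.meshgrid(fx, fy)                  # spatial-frequency grids
    freq_sq = (2.0 * np.pi) ** 2 * (fxx ** 2 + fyy ** 2)
    phi_hat = np.fft.fft2(rhs) / -(freq_sq + eps)   # inverse Laplacian in Fourier space
    phi_hat[0, 0] = 0.0                             # DC term undefined; set zero-mean phase
    return np.real(np.fft.ifft2(phi_hat))
```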
- FIG. 7 describes a detailed example process of how controller 106 may use the configuration of FIG. 6A to generate the reconstructed high-resolution image of sample 114 .
- microscope 100 may include illumination assembly 110 , lens 202 , a beam splitter 600 , a first image sensor 200 A, and a second image sensor 200 B.
- first image sensor 200 A and second image sensor 200 B may capture different types of images
- controller 106 may combine the information from first image sensor 200 A and second image sensor 200 B to determine the phase information under each illumination condition.
- image sensor 200 A may capture Fourier-plane images
- second image sensor 200 B may capture real-plane images.
- controller 106 may acquire, for each illumination condition, a Fourier-plane image from first image sensor 200 A and a real-plane image from second image sensor 200 B.
- controller 106 may combine information from the Fourier-plane image and the real-plane image to determine the phase information under each illumination condition.
- first image sensor 200 A may be configured to capture focused images while second image sensor 200 B is configured to capture unfocused images. It is also possible to add more sensors; for example, three different sensors may be configured to capture images in three different focal-planes.
- microscope 100 may include a light source 602 , a beam splitter 600 , lens 202 , and image sensor 200 .
- light source 602 may project a light beam (coherent or at least partially coherent) towards beam splitter 600 ; the beam splitter generates two light beams that travel through two different optical paths and create an interference pattern.
- in some embodiments, the interference pattern is created on sample 114 ;
- in other embodiments, the interference pattern is created on image sensor 200 .
- controller 106 may identify, for each illumination condition, the interference pattern between the two light beams traveling through the different optical paths, and determine, from the interference pattern, the phase associated with each illumination condition.
- microscope 100 may include illumination assembly 110 , lens 202 , an optical element 604 , and at least one image sensor 200 .
- optical element 604 is configured to impose some form of modulation on the light received from sample 114 . The modulation may be imposed on the phase, the frequency, the amplitude, or the polarization of the beam.
- microscope 100 may include a dynamic optical element, such as spatial light modulator (SLM), that may dynamically change the modulation. Controller 106 may use the different information caused by the dynamic optical element to determine the phase information under each illumination condition.
- microscope 100 may include a fixed optical element, such as phase-shift mask, beam splitter 600 , first image sensor 200 A, and second image sensor 200 B. Controller 106 may combine information from first image sensor 200 A and second image sensor 200 B to determine the phase information under each illumination condition.
- controller 106 may determine phase information under each illumination condition independently.
- FIG. 7 is a flow diagram showing the process of FIG. 5 using the configuration of FIG. 6A . The process begins when controller 106 causes illumination assembly 110 to illuminate sample 114 at a first illumination condition (block 700 ). Next, controller 106 may acquire an image when sample 114 is in focal-plane 204 (block 702 ). Then, controller 106 may cause focus actuator 104 to change the distance between image capture device 102 and sample 114 (block 704 ), and acquire an additional image when sample 114 is not in focal-plane 204 (block 706 ).
- the distance between image capture device 102 and sample 114 may constitute a distance from lens 202 to sample 114 , a distance from image sensor 200 to sample 114 , or a sum of the distance from lens 202 to sample 114 and the distance from image sensor 200 to sample 114 .
- controller 106 may determine whether there is sufficient data to determine the phase information (decision block 708 ). If there is insufficient data to determine the phase information, controller 106 may repeat the steps in blocks 704 - 708 until there is sufficient data to determine the phase information of sample 114 under the current illumination condition.
- the phase may be calculated using methods such as transport of intensity (TIE), error-reduction algorithms, hybrid input-output (HIO), or optimization algorithms such as gradient descent.
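- As a concrete instance of the error-reduction family listed above, the sketch below alternates between enforcing a measured real-plane amplitude and a measured Fourier-plane amplitude (amplitudes as could be captured with the two-sensor configuration of FIG. 6B). It is an illustrative assumption, not the specific algorithm of this disclosure:

```python
import numpy as np

def error_reduction(amp_real, amp_fourier, n_iter=200):
    """Alternately enforce the measured real-plane amplitude and the
    measured Fourier-plane amplitude, keeping the evolving phase."""
    field = amp_real.astype(complex)                 # start with zero phase
    for _ in range(n_iter):
        spectrum = np.fft.fft2(field)
        spectrum = amp_fourier * np.exp(1j * np.angle(spectrum))
        field = np.fft.ifft2(spectrum)
        field = amp_real * np.exp(1j * np.angle(field))
    return field     # complex field; np.angle(field) is the recovered phase
```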
- controller 106 may transform the image captured from real space in focal-plane 204 to a Fourier space (block 710 ). Thereafter, controller 106 may add image data associated with the current illumination condition to a combined complex image (block 712 ). If controller 106 has collected all of the image data associated with all of the illumination conditions and added it to the combined complex image (decision block 714 ), then controller 106 may transform the combined complex image into the image-plane to generate a reconstructed image of sample 114 . But if not all of the image data associated with all of the illumination conditions has been collected, controller 106 may cause illumination assembly 110 to illuminate sample 114 under another illumination condition (block 716 ) and then repeat steps 702 - 714 .
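- Block 712 can be sketched as pasting the Fourier data of the current illumination condition into the combined complex image at an offset set by its illumination angle; the coordinate convention and helper below are assumptions for illustration:

```python
import numpy as np

def add_to_combined(combined, lowres_field, center_row, center_col):
    """Block 712 (sketch): write the Fourier data of one illumination
    condition into the combined complex image at the offset implied by
    its illumination angle. Each point is written once (non-iterative)."""
    spec = np.fft.fftshift(np.fft.fft2(lowres_field))
    h, w = spec.shape
    r0, c0 = center_row - h // 2, center_col - w // 2   # top-left corner of the patch
    combined[r0:r0 + h, c0:c0 + w] = spec
    return combined

# Decision block 714 / final step (sketch): once all conditions are added,
# reconstruction = np.fft.ifft2(np.fft.ifftshift(combined))
```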
- FIG. 8 is a schematic illustration that identifies the numerical aperture (NA) of microscope 100 .
- microscope 100 may include a lens (e.g., lens 202 ) with a first numerical aperture.
- the first numerical aperture may be less than 1, less than 0.8, less than 0.6, less than 0.4, less than 0.2, or less than 0.1; alternatively, it may be more than 1.
- illumination assembly 110 may illuminate sample 114 at an incidence angle of illumination.
- the term “incidence angle of illumination” refers to an angle (e.g., α2) formed between the optical axis of the lens and a light beam projected from illumination assembly 110 .
- the maximal incidence angle of illumination represents a second numerical aperture (NA 2 ), which may be more than 2×NA 1 , more than 2.5×NA 1 , more than 3.5×NA 1 , or more than 5×NA 1 .
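- Using the standard relation NA = n·sin(θ) between an incidence angle and a numerical aperture, these ratios are easy to check numerically (the sample values below are illustrative, not values from this disclosure):

```python
import math

n_medium = 1.0                      # refractive index of the medium (air)
theta_max_deg = 60.0                # maximal incidence angle of illumination (example)
na_illum = n_medium * math.sin(math.radians(theta_max_deg))  # second NA, ~0.87
na_lens = 0.25                      # first numerical aperture of lens 202 (example)
print(na_illum / na_lens)           # ~3.5, satisfying the >1.5x (and >2x) ratios
assert na_illum >= 1.5 * na_lens
```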
- FIG. 9A and FIG. 9B depict two schematic illustrations in Fourier-plane.
- FIG. 9A is an illustration in Fourier-plane of image data acquired under a single illumination condition
- FIG. 9B is an illustration in Fourier-plane of image data acquired under a plurality of different illumination conditions.
- the illustration of FIG. 9A includes an image in Fourier-plane with a first circular area 900 whose radius is equal to NA 1 , and a second, theoretical circular area 902 whose radius is equal to NA 2 .
- the radius of second circular area 902 is at least 1.5 times the radius of first circular area 900 .
- First circular area 900 is associated with image data of sample 114 when the illumination angle equals 0 degrees.
- the non-iterative process of generating the reconstructed image includes using image data associated with a single illumination condition for each point in the combined complex image.
- FIG. 9B illustrates this embodiment. Specifically, FIG. 9B depicts a plurality of circles in Fourier-plane. Each circle represents a point in the combined complex image and is associated with image data acquired under a single illumination condition. For example, each circle in FIG. 9B is associated with a different illumination angle. Since the radius of circular area 902 is at least 1.5 times the radius of first circular area 900 , the system in some embodiments is not limited to a first ‘order’ of additional areas around area 900 , but can have additional ‘orders’ of circular areas farther away from first circular area 900 .
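- The center of each circle in FIG. 9B follows from its illumination angle: tilting the illumination shifts the sample spectrum by sin(θ)/λ in spatial frequency, while the radius of each circle remains NA 1 /λ. A bookkeeping sketch of this geometry (the grid of angles is an example, not a prescribed pattern):

```python
import numpy as np

wavelength_um = 0.53                 # illumination wavelength (example)
na_lens = 0.25                       # NA1: sets the radius of each circle

def circle_center(theta_x_deg, theta_y_deg):
    """Spatial-frequency offset (cycles/um) of the Fourier-plane circle
    produced by a given illumination angle."""
    return (np.sin(np.radians(theta_x_deg)) / wavelength_um,
            np.sin(np.radians(theta_y_deg)) / wavelength_um)

# Example 5x5 grid of illumination angles spanning first and second 'orders'.
centers = [circle_center(tx, ty)
           for tx in (-40, -20, 0, 20, 40)
           for ty in (-40, -20, 0, 20, 40)]
radius = na_lens / wavelength_um     # each circle's radius in the Fourier-plane
```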
- the illumination of sample 114 may result from light sources located at a surface parallel to sample 114 and/or from light sources located at a surface perpendicular to sample 114 .
- light sources may be included on illumination assembly 110 .
- light sources may be located at any appropriate location of microscope 100 and illuminate sample 114 at any appropriate angle.
- light sources may be positioned on other surfaces, such as on the surface of a hemisphere or cube positioned under stage 116 .
- FIG. 10 is a flowchart showing an exemplary process 1000 for constructing a high-resolution image of sample 114 using image information acquired under a plurality of different illumination conditions.
- controller 106 may cause the illumination assembly to illuminate sample 114 at a series of different illumination conditions.
- the illumination conditions may include at least one of: different illumination angles, different illumination patterns, different wavelengths, or a combination thereof.
- controller 106 may acquire from image capture device 102 a plurality of images of sample 114 , and the plurality of images may include at least one image for each illumination condition.
- the acquired images may be captured at an image resolution higher than 12 megapixels, with a pixel size smaller than 2 micrometers in both the horizontal and vertical dimensions.
- high-resolution sensors with small pixel sizes are low in cost and provide a large field of view, but they require significant effort in handling the low signal-to-noise ratios that result from the small pixel size, and in optimizing processing speed given the high memory bandwidths and large fields of view involved.
- the disclosed method is capable of operating with such sensors, whereas other methods cannot.
- controller 106 may determine, from the at least one image, image data of sample 114 for each illumination condition.
- controller 106 may transform the at least one image from real space to Fourier space, aggregate the image data of the sample in the Fourier space to form a combined complex image, and transform the combined complex image back to real space to generate the reconstructed image of sample 114.
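As an illustration of this transform-aggregate-invert flow, the sketch below assumes that a complex (intensity and phase) low-resolution field has already been determined per illumination condition, and that each condition maps to a known pixel offset in the Fourier plane; `stitch_spectrum`, `fields`, `offsets_px`, and `up` are illustrative names and assumptions, not part of the disclosure.

```python
import numpy as np

def stitch_spectrum(fields, offsets_px, up=4):
    """Place each low-resolution complex field's spectrum at its
    illumination-dependent offset in a larger Fourier canvas, then
    inverse-transform once (no iterations)."""
    h, w = fields[0].shape
    H, W = up * h, up * w
    canvas = np.zeros((H, W), dtype=complex)
    filled = np.zeros((H, W), dtype=bool)
    for field, (dy, dx) in zip(fields, offsets_px):
        spec = np.fft.fftshift(np.fft.fft2(field))
        y0, x0 = H // 2 + dy - h // 2, W // 2 + dx - w // 2
        region = (slice(y0, y0 + h), slice(x0, x0 + w))
        fresh = ~filled[region]            # one illumination condition per point
        canvas[region][fresh] = spec[fresh]
        filled[region] |= fresh
    return np.abs(np.fft.ifft2(np.fft.ifftshift(canvas))) ** 2
```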
- determining image data of sample 114 for each illumination condition may include determining phase information of sample 114 under each illumination condition independently. As discussed above with reference to FIGS. 6A-6F, determining the phase information under each illumination condition may be implemented using different configurations of microscope 100.
- controller 106 may acquire, from image capture device 102, a group of first images from different focal planes for each illumination condition and determine, from the group of first images, phase information under each illumination condition independently.
- controller 106 may acquire, from first image sensor 200A, a first image for each illumination condition; acquire, from second image sensor 200B, a second image different from the first image for each illumination condition; and combine information from the first image and the second image to determine phase information under each illumination condition independently.
- controller 106 may identify, for each illumination condition, an interference pattern between the first and second light beams and determine, from the interference pattern, phase information associated with each illumination condition independently.
- controller 106 may acquire, for each illumination condition, a first image from first image sensor 200A, and a second image from second image sensor 200B, wherein the second image is modulated differently from the first image; and combine information from the first image and the second image to determine phase information under each illumination condition.
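For the configuration in which a group of images is acquired from different focal planes, one standard way to recover phase, shown here only as an assumed example and not necessarily the method of the disclosure, is the transport-of-intensity equation under a uniform-intensity approximation (sign conventions vary between formulations):

```python
import numpy as np

def tie_phase(I_minus, I_plus, dz, wavelength, pixel_size, eps=1e-9):
    """Estimate phase from two images captured a small defocus 2*dz apart."""
    k = 2.0 * np.pi / wavelength
    dIdz = (I_plus - I_minus) / (2.0 * dz)             # axial intensity derivative
    I0 = max(0.5 * (I_plus + I_minus).mean(), eps)     # mean in-focus intensity
    ny, nx = dIdz.shape
    fy = np.fft.fftfreq(ny, d=pixel_size)
    fx = np.fft.fftfreq(nx, d=pixel_size)
    q2 = (2 * np.pi) ** 2 * (fx[None, :] ** 2 + fy[:, None] ** 2)
    q2[0, 0] = np.inf                                  # drop the undetermined DC term
    # Invert the Laplacian in Fourier space: phase = inv_lap((k / I0) * dI/dz)
    return np.real(np.fft.ifft2(np.fft.fft2((k / I0) * dIdz) / q2))
```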
- controller 106 may generate, from the determined image data for each illumination condition, a reconstructed image of sample 114 .
- the reconstructed image may have a second image resolution higher than the first image resolution.
- controller 106 may generate the reconstructed image in a non-iterative process.
- the term “generate a reconstructed image in a non-iterative process” refers to a process in which the reconstructed image is not compared to the acquired images, nor are the acquired images compared to one another.
- the non-iterative process may include using image data associated with a single illumination condition for each point in the combined complex image, as depicted in FIG. 9B.
- controller 106 may determine the intensity and phase information of sample 114 for each illumination condition. Thereafter, controller 106 may use the intensity and phase information to place each piece of the puzzle (i.e., the image data determined under each illumination condition) in its proper position.
- using this non-iterative process decreases the computation time needed to reconstruct the high-resolution image. It is possible, but not mandatory, that determining the phase information or other information for each illumination condition independently will be done using an iterative process; however, generating the final high-resolution image from the information determined under the multiple illumination conditions will be done in a non-iterative process. In this case, the overlap between regions in Fourier space can still be reduced or eliminated.
- controller 106 may cause the reconstructed image to be shown on a display (step 1010) or identify at least one element of sample 114 in the reconstructed image (step 1012).
- controller 106 may confirm the quality of the reconstructed image before using it. For example, controller 106 may generate the reconstructed image using a first set of constructing parameters and determine that the reconstructed image is not of a desired quality. In one example, the determination that the reconstructed image is not of the desired quality is based on the level of sharpness of the reconstructed image or parts of it, or on a comparison with expected or known results based on prior knowledge. Thereafter, controller 106 may generate a second reconstructed image using a second set of constructing parameters. In addition, controller 106 may acquire another set of images of sample 114 after changing the focus of microscope 100, as described above with reference to FIG. 4.
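A common sharpness proxy for such a quality check is the variance of a discrete Laplacian. The snippet below is an assumed example of how a threshold test might look, not the disclosed criterion; the `threshold` value is hypothetical and would be tuned per application.

```python
import numpy as np

def sharpness(image):
    """Variance of a discrete Laplacian: higher values indicate sharper images."""
    lap = (-4.0 * image
           + np.roll(image, 1, axis=0) + np.roll(image, -1, axis=0)
           + np.roll(image, 1, axis=1) + np.roll(image, -1, axis=1))
    return float(lap.var())

def acceptable(reconstructed, threshold=1e-3):
    # Hypothetical acceptance test on the reconstructed image.
    return sharpness(reconstructed) >= threshold
```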
- Programs based on the written description and disclosed methods are within the skill of an experienced developer.
- the various programs or program modules can be created using any of the techniques known to one skilled in the art or can be designed in connection with existing software.
- program sections or program modules can be designed in or by means of .NET Framework, .NET Compact Framework (and related languages, such as Visual Basic, C, etc.), Java, C++, Objective-C, Python, MATLAB, CUDA, HTML, HTML/AJAX combinations, XML, or HTML with included Java applets.
- One or more of such software sections or modules can be integrated into a computer system or existing e-mail or browser software.
Landscapes
- Physics & Mathematics (AREA)
- Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Optics & Photonics (AREA)
- Multimedia (AREA)
- Analytical Chemistry (AREA)
- Chemical & Material Sciences (AREA)
- Signal Processing (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Biomedical Technology (AREA)
- General Health & Medical Sciences (AREA)
- Molecular Biology (AREA)
- Theoretical Computer Science (AREA)
- Microscopes, Condenser (AREA)
- Automatic Focus Adjustment (AREA)
- Image Processing (AREA)
Abstract
Systems and methods are disclosed for constructing an image of a sample using a plurality of images acquired under multiple illumination conditions. In one implementation, a microscope may include an image capture device configured to capture images at a first image resolution, an illumination assembly, and a controller. The controller may cause the illumination assembly to illuminate the sample at a series of different illumination conditions. The controller may further acquire from the image capture device a plurality of images of the sample, wherein the plurality of images includes at least one image for each illumination condition. The controller may also determine, from the at least one image, image data of the sample for each illumination condition. The controller may further generate, from the determined image data for each illumination condition and in a non-iterative process, a reconstructed image of the sample. The reconstructed image may have a second image resolution higher than the first image resolution.
Description
- This application claims the benefit of priority of United States Provisional Patent Application No. 62/253,723, filed on Nov. 11, 2015; United States Provisional Patent Application No. 62/253,726, filed on Nov. 11, 2015; and United States Provisional Patent Application No. 62/253,734, filed on Nov. 11, 2015. All of the foregoing applications are incorporated herein by reference in their entirety.
- The present disclosure relates generally to computational microscopy and, more specifically, to systems and methods for generating an image under different illumination conditions.
- Today's commercial microscopes rely on expensive and delicate optical lenses and typically need additional hardware to share and process acquired images. Moreover, scanning optical microscopy requires additional expensive equipment, such as accurate mechanics and scientific cameras. A new generation of microscope technology, known as computational microscopy, has begun to emerge; it makes use of advanced image-processing algorithms (usually with hardware modifications) to overcome limitations of conventional microscopes. A computational microscope can, in some cases, produce high-resolution digital images of samples without using expensive optical lenses. In addition, a computational microscope may open the door to additional capabilities based on computer vision, sharing of data, etc.
- Disclosed systems and methods relate to the field of computational microscopy. Certain disclosed embodiments are directed to systems and methods for focusing a microscope using images acquired under a plurality of illumination conditions. The disclosed embodiments also include systems and methods for acquiring images under a plurality of illumination conditions to generate a high-resolution image of a sample.
- Consistent with disclosed embodiments, a microscope for constructing an image of a sample using image information acquired under a plurality of different illumination conditions is provided. The microscope may include at least one image capture device configured to capture, at a first image resolution, images of a sample. The microscope may further include a lens with a first numerical aperture. The microscope may also include an illumination assembly including at least one light source configured to illuminate the sample, wherein a maximal incidence angle of illumination represents a second numerical aperture which is at least 1.5 times the first numerical aperture. The microscope may further include at least one controller programmed to: cause the illumination assembly to illuminate the sample at a series of different illumination conditions; acquire from the at least one image capture device a plurality of images of the sample, wherein the plurality of images includes at least one image for each illumination condition; determine, from the at least one image, image data of the sample for each illumination condition; and generate, from the determined image data for each illumination condition and in a non-iterative process, a reconstructed image of the sample, the reconstructed image having a second image resolution higher than the first image resolution.
- Also consistent with disclosed embodiments, a microscope for constructing an image of a sample using image information acquired under a plurality of different illumination conditions is provided. The microscope may include at least one image capture device configured to capture, at a first image resolution, images of a sample. The microscope may also include a lens with a first numerical aperture. The microscope may further include an illumination assembly including at least one light source configured to illuminate the sample, wherein a maximal incidence angle of illumination represents a second numerical aperture which is at least 1.5 times the first numerical aperture. The microscope may further include at least one controller programmed to: cause the illumination assembly to illuminate the sample at a series of different illumination angles; acquire from the at least one image capture device a plurality of images of the sample, wherein the plurality of images includes at least one image for each illumination angle; determine, from the at least one image, image data of the sample for each illumination angle, wherein the image data includes phase information of the sample under each illumination condition; and generate, from the image data for the series of different illumination angles, a reconstructed image of the sample, the reconstructed image having a second image resolution higher than the first image resolution.
- Consistent with the disclosed embodiments, a method is provided for constructing an image of a sample using image information acquired under a plurality of different illumination conditions. The method may include illuminating a sample at a series of different illumination conditions, wherein an illumination of the sample is at an incidence angle representing a numerical aperture which is at least 1.5 times a numerical aperture associated with an image capture device; acquiring, from the image capture device, a plurality of images of the sample captured at a first image resolution, wherein the plurality of images includes at least one image for each illumination condition; determining, from the at least one image, image data of the sample for each illumination condition, wherein the image data includes phase information of the sample under each illumination condition; and generating, from the determined image data for each illumination condition and in a non-iterative process, a reconstructed image of the sample, the reconstructed image having a second image resolution higher than the first image resolution.
- Additionally, a non-transitory computer-readable storage medium may store program instructions that are executed by at least one controller to perform any of the methods described herein.
- The foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the claims.
- The accompanying drawings, which are incorporated in and constitute a part of this disclosure, illustrate various disclosed embodiments. In the drawings:
- FIG. 1 is a diagrammatic representation of an exemplary microscope, consistent with the disclosed embodiments;
- FIG. 2A is a diagrammatic representation of the optical paths of two beam pairs when the microscope of FIG. 1 is out of focus, consistent with the disclosed embodiments;
- FIG. 2B is a diagrammatic representation of the optical paths of two beam pairs when the microscope of FIG. 1 is in focus, consistent with the disclosed embodiments;
- FIG. 3A is a diagrammatic representation of an exemplary image shown on a display when the microscope of FIG. 1 is out of focus, consistent with the disclosed embodiments;
- FIG. 3B is a diagrammatic representation of an exemplary image shown on a display when the microscope of FIG. 1 is in focus, consistent with the disclosed embodiments;
- FIG. 4 is a flowchart showing an exemplary process for focusing an image of a sample using images acquired under a plurality of illumination conditions, consistent with the disclosed embodiments;
- FIG. 5 is a representation of an exemplary process for constructing an image of a sample using images acquired under a plurality of illumination conditions, consistent with disclosed embodiments;
- FIG. 6A is a diagrammatic representation of a configuration for determining phase information of a sample under a plurality of illumination conditions, consistent with the disclosed embodiments;
- FIG. 6B is a diagrammatic representation of another configuration for determining phase information of a sample under a plurality of illumination conditions, consistent with the disclosed embodiments;
- FIG. 6C is a diagrammatic representation of another configuration for determining phase information of a sample under a plurality of illumination conditions, consistent with the disclosed embodiments;
- FIG. 6D is a diagrammatic representation of another configuration for determining phase information of a sample under a plurality of illumination conditions, consistent with the disclosed embodiments;
- FIG. 6E is a diagrammatic representation of another configuration for determining phase information of a sample under a plurality of illumination conditions, consistent with the disclosed embodiments;
- FIG. 6F is a diagrammatic representation of another configuration for determining phase information of a sample under a plurality of illumination conditions, consistent with the disclosed embodiments;
- FIG. 7 is a flow diagram showing the implementation of the process of FIG. 5 using the configuration of FIG. 6A, consistent with the disclosed embodiments;
- FIG. 8 is a diagrammatic representation of the numerical aperture of the microscope of FIG. 1, consistent with the disclosed embodiments;
- FIG. 9A is an illustration in Fourier-plane of image data acquired under a single illumination condition, consistent with the disclosed embodiments;
- FIG. 9B is an illustration in Fourier-plane of image data acquired under a plurality of different illumination conditions, consistent with the disclosed embodiments; and
- FIG. 10 is a flowchart showing an exemplary process for reconstructing an image of a sample using images acquired under a plurality of illumination conditions, consistent with the disclosed embodiments.
- The following detailed description refers to the accompanying drawings. Wherever possible, the same reference numbers are used in the drawings and the following description to refer to the same or similar parts. While several illustrative embodiments are described herein, modifications, adaptations and other implementations are possible. For example, substitutions, additions or modifications may be made to the components illustrated in the drawings, and the illustrative methods described herein may be modified by substituting, reordering, removing, or adding steps to the disclosed methods. Accordingly, the following detailed description is not limited to the disclosed embodiments and examples. Instead, the proper scope is defined by the appended claims.
- Disclosed embodiments provide microscopes and methods that use one or more cameras to provide high-resolution images of a sample which may be located on a stage. In various embodiments, the microscope may use images of the sample captured under a plurality of illumination conditions. For example, the plurality of illumination conditions may include different illumination angles. In one aspect of the disclosure, the microscope may identify, in the captured images, multiple occurrences of the sample corresponding to the plurality of illumination conditions. The microscope may estimate a shift between the occurrences and determine a degree to which the microscope is out of focus. This aspect of the disclosure is described in detail with reference to FIGS. 2-4. In another aspect of the disclosure, the microscope may capture multiple images of the sample under each illumination condition, aggregate image data from these images, and construct a high-resolution image from the image data. In one example, the microscope may aggregate the image data in the Fourier plane and then use an inverse Fourier transform to reconstruct the high-resolution image. This aspect of the disclosure is described in detail with reference to FIGS. 5-10.
- FIG. 1 is a diagrammatic representation of a microscope 100 consistent with the exemplary disclosed embodiments. The term “microscope” refers to any device or instrument for magnifying an object which is smaller than easily observable by the naked eye, i.e., creating an image of an object for a user where the image is larger than the object. One type of microscope may be an “optical microscope” that uses light in combination with an optical system for magnifying an object. An optical microscope may be a simple microscope having one or more magnifying lenses. Another type of microscope may be a “computational microscope” that includes an image sensor and image-processing algorithms to enhance or magnify the object's size or other properties. The computational microscope may be a dedicated device or created by incorporating software and/or hardware with an existing optical microscope to produce high-resolution digital images. As shown in FIG. 1, microscope 100 includes an image capture device 102, a focus actuator 104, a controller 106 connected to memory 108, an illumination assembly 110, and a user interface 112. An example usage of microscope 100 may be capturing images of a sample 114 mounted on a stage 116 located within the field-of-view (FOV) of image capture device 102, processing the captured images, and presenting on user interface 112 a magnified image of sample 114.
- Image capture device 102 may be used to capture images of sample 114. In this specification, the term “image capture device” includes a device that records the optical signals entering a lens as an image or a sequence of images. The optical signals may be in the near-infrared, infrared, visible, and ultraviolet spectrums. Examples of an image capture device include a CCD camera, a CMOS camera, a photo sensor array, a video camera, a mobile phone equipped with a camera, etc. Some embodiments may include only a single image capture device 102, while other embodiments may include two, three, or even four or more image capture devices 102. In some embodiments, image capture device 102 may be configured to capture images in a defined field-of-view (FOV). Also, when microscope 100 includes several image capture devices 102, image capture devices 102 may have overlap areas in their respective FOVs. Image capture device 102 may have one or more image sensors (not shown in FIG. 1) for capturing image data of sample 114. In other embodiments, image capture device 102 may be configured to capture images at an image resolution higher than 10 megapixels, higher than 12 megapixels, higher than 15 megapixels, or higher than 20 megapixels. In addition, image capture device 102 may also be configured to have a pixel size smaller than 5 micrometers, smaller than 3 micrometers, or smaller than 1.6 micrometers.
- In some embodiments, microscope 100 includes focus actuator 104. The term “focus actuator” refers to any device capable of converting input signals into physical motion for adjusting the relative distance between sample 114 and image capture device 102. Various focus actuators may be used, including, for example, linear motors, electrostrictive actuators, electrostatic motors, capacitive motors, voice coil actuators, magnetostrictive actuators, etc. In some embodiments, focus actuator 104 may include an analog position feedback sensor and/or a digital position feedback element. Focus actuator 104 is configured to receive instructions from controller 106 in order to make light beams converge to form a clear and sharply defined image of sample 114. In the example illustrated in FIG. 1, focus actuator 104 may be configured to adjust the distance by moving image capture device 102. However, in other embodiments, focus actuator 104 may be configured to adjust the distance by moving stage 116, or by moving both image capture device 102 and stage 116.
- Microscope 100 may also include controller 106 for controlling the operation of microscope 100 according to the disclosed embodiments. Controller 106 may comprise various types of devices for performing logic operations on one or more inputs of image data and other data according to stored or accessible software instructions providing desired functionality. For example, controller 106 may include a central processing unit (CPU), support circuits, digital signal processors, integrated circuits, cache memory, or any other types of devices for image processing and analysis, such as graphics processing units (GPUs). The CPU may comprise any number of microcontrollers or microprocessors configured to process the imagery from the image sensors. For example, the CPU may include any type of single- or multi-core processor, mobile device microcontroller, etc. Various processors may be used, including, for example, processors available from manufacturers such as Intel®, AMD®, etc., and may include various architectures (e.g., x86 processor, ARM®, etc.). The support circuits may be any number of circuits generally well known in the art, including cache, power supply, clock, and input-output circuits.
- In some embodiments, controller 106 may be associated with memory 108 used for storing software that, when executed by controller 106, controls the operation of microscope 100. In addition, memory 108 may also store electronic data associated with operation of microscope 100, such as, for example, captured or generated images of sample 114. In one instance, memory 108 may be integrated into controller 106. In another instance, memory 108 may be separated from controller 106. Specifically, memory 108 may refer to multiple structures or computer-readable storage mediums located at controller 106 or at a remote location, such as a cloud server. Memory 108 may comprise any number of random access memories, read only memories, flash memories, disk drives, optical storage, tape storage, removable storage, and other types of storage.
- Microscope 100 may include illumination assembly 110. The term “illumination assembly” refers to any device or system capable of projecting light to illuminate sample 114. Illumination assembly 110 may include any number of light sources, such as light emitting diodes (LEDs), lasers, and lamps configured to emit light. In one embodiment, illumination assembly 110 may include only a single light source. Alternatively, illumination assembly 110 may include four, sixteen, or even more than a hundred light sources organized in an array or a matrix. In some embodiments, illumination assembly 110 may use one or more light sources located at a surface parallel to sample 114 to illuminate sample 114. In other embodiments, illumination assembly 110 may use one or more light sources located at a surface perpendicular or at an angle to sample 114.
- In addition, illumination assembly 110 may be configured to illuminate sample 114 in a series of different illumination conditions. In one example, illumination assembly 110 may include a plurality of light sources arranged at different illumination angles, such as a two-dimensional arrangement of light sources. In this case, the different illumination conditions may include different illumination angles. For example, FIG. 1 depicts a beam 118 projected from a first illumination angle α1, and a beam 120 projected from a second illumination angle α2. In some embodiments, first illumination angle α1 and second illumination angle α2 may have the same value but opposite sign. In other embodiments, first illumination angle α1 may be separated from second illumination angle α2; however, both angles originate from points within the acceptance angle of the optics. In another example, illumination assembly 110 may include a plurality of light sources configured to emit light at different wavelengths. In this case, the different illumination conditions may include different wavelengths. In yet another example, illumination assembly 110 may be configured to use a number of light sources at predetermined times. In this case, the different illumination conditions may include different illumination patterns. Accordingly and consistent with the present disclosure, the different illumination conditions may be selected from a group including: different durations, different intensities, different positions, different illumination angles, different illumination patterns, different wavelengths, or any combination thereof.
- Consistent with disclosed embodiments, microscope 100 may include, be connected with, or be in communication with (e.g., over a network or wirelessly, e.g., via Bluetooth) user interface 112. The term “user interface” refers to any device suitable for presenting a magnified image of sample 114 or any device suitable for receiving inputs from one or more users of microscope 100. FIG. 1 illustrates two examples of user interface 112. The first example is a smartphone or a tablet wirelessly communicating with controller 106 over a Bluetooth, cellular, or Wi-Fi connection, directly or through a remote server. The second example is a PC display physically connected to controller 106. In some embodiments, user interface 112 may include user output devices, including, for example, a display, tactile device, speaker, etc. In other embodiments, user interface 112 may include user input devices, including, for example, a touchscreen, microphone, keyboard, pointer devices, cameras, knobs, buttons, etc. With such input devices, a user may be able to provide information inputs or commands to microscope 100 by typing instructions or information, providing voice commands, selecting menu options on a screen using buttons, pointers, or eye-tracking capabilities, or through any other suitable techniques for communicating information to microscope 100. User interface 112 may be connected (physically or wirelessly) with one or more processing devices, such as controller 106, to provide and receive information to or from a user and process that information. In some embodiments, such processing devices may execute instructions for responding to keyboard entries or menu selections, recognizing and interpreting touches and/or gestures made on a touchscreen, recognizing and tracking eye movements, receiving and interpreting voice commands, etc.
- Microscope 100 may also include or be connected to stage 116. Stage 116 includes any horizontal rigid surface where sample 114 may be mounted for examination. Stage 116 may include a mechanical connector for retaining a slide containing sample 114 in a fixed position. The mechanical connector may use one or more of the following: a mount, an attaching member, a holding arm, a clamp, a clip, an adjustable frame, a locking mechanism, a spring, or any combination thereof. In some embodiments, stage 116 may include a translucent portion or an opening for allowing light to illuminate sample 114. For example, light transmitted from illumination assembly 110 may pass through sample 114 and towards image capture device 102. In some embodiments, stage 116 and/or sample 114 may be moved using motors or manual controls in the XY plane to enable imaging of multiple areas of the sample.
- FIG. 2A and FIG. 2B depict a closer view of microscope 100 in two cases. Specifically, FIG. 2A illustrates the optical paths of two beam pairs when microscope 100 is out of focus, and FIG. 2B illustrates the optical paths of two beam pairs when microscope 100 is in focus.
- As shown in FIGS. 2A and 2B, image capture device 102 includes an image sensor 200 and a lens 202. In microscopy, lens 202 may be referred to as an objective lens of microscope 100. The term “image sensor” refers to a device capable of detecting and converting optical signals into electrical signals. The electrical signals may be used to form an image or a video stream based on the detected signals. Examples of image sensor 200 may include semiconductor charge-coupled devices (CCD), active pixel sensors in complementary metal-oxide-semiconductor (CMOS), or N-type metal-oxide-semiconductor (NMOS, Live MOS). The term “lens” may refer to a ground or molded piece of glass, plastic, or other transparent material with opposite surfaces either or both of which are curved, by means of which light rays are refracted so that they converge or diverge to form an image. The term “lens” also refers to an element containing one or more lenses as defined above, such as in a microscope objective. The lens is positioned at least generally transversely of the optical axis of image sensor 200. Lens 202 may be used for concentrating light beams from sample 114 and directing them towards image sensor 200. In some embodiments, image capture device 102 may include a fixed lens or a zoom lens.
- When sample 114 is located at focal-plane 204, the image projected from lens 202 is completely focused. The term “focal-plane” is used herein to describe a plane that is perpendicular to the optical axis of lens 202 and passes through the lens's focal point. The distance between focal-plane 204 and the center of lens 202 is called the focal length and is represented by D1. In some cases, sample 114 may not be completely flat, and there may be small differences between focal-plane 204 and various regions of sample 114. Accordingly, the distance between focal-plane 204 and sample 114 or a region of interest (ROI) of sample 114 is marked as D2. The distance D2 corresponds with the degree to which an image of sample 114 or an image of the ROI of sample 114 is out of focus. For example, distance D2 may be between 0 and about 3 mm. In some embodiments, D2 may be greater than 3 mm. When distance D2 equals zero, the image of sample 114 (or the image of the ROI of sample 114) is completely focused. In contrast, when D2 has a value other than zero, the image of sample 114 (or the image of the ROI of sample 114) is out of focus.
- FIG. 2A depicts a case where the image of sample 114 is out of focus. For example, the image of sample 114 may be out of focus when the beams of light received from sample 114 do not converge on image sensor 200. FIG. 2A depicts a beam pair 206 and a beam pair 208. Neither pair converges on image sensor 200. For the sake of simplicity, the optical paths below sample 114 are not shown. Consistent with the present disclosure, beam pair 206 may correspond with beam 120 projected from illumination assembly 110 at illumination angle α2, and beam pair 208 may correspond with beam 118 projected from illumination assembly 110 at illumination angle α1. In addition, beam pair 206 may hit image sensor 200 concurrently with beam pair 208. The term “concurrently” in this context means that image sensor 200 has recorded information associated with two or more beam pairs during coincident or overlapping time periods, either where one begins and ends during the duration of the other, or where a later one starts before the completion of the other. In other embodiments, beam pair 206 and beam pair 208 may sequentially contact image sensor 200. The term “sequentially” means that image sensor 200 has started recording information associated with, for example, beam pair 206 after the completion of recording information associated with, for example, beam pair 208.
- As discussed above, D2 is the distance between focal-plane 204 and sample 114, and it corresponds with the degree to which sample 114 is out of focus. In one example, D2 may have a value of 50 micrometers. Focus actuator 104 is configured to change distance D2 by converting input signals from controller 106 into physical motion. In some embodiments, in order to focus the image of sample 114, focus actuator 104 may move image capture device 102. In this example, to focus the image of sample 114, focus actuator 104 may move image capture device 102 up by 50 micrometers. In other embodiments, in order to focus the image of sample 114, focus actuator 104 may move stage 116 down. Therefore, in this example, instead of moving image capture device 102 up by 50 micrometers, focus actuator 104 may move stage 116 down by 50 micrometers.
- FIG. 2B illustrates a case where the image of sample 114 is in focus. In this case, both beam pairs 206 and 208 converge on image sensor 200, and distance D2 equals zero. In other words, focusing the image of sample 114 (or the image of the ROI of sample 114) may require adjusting the relative distance between image capture device 102 and sample 114. The relative distance may be represented by D1-D2, and when distance D2 equals zero, the relative distance between image capture device 102 and sample 114 equals distance D1, which means that the image of sample 114 is focused. In the embodiment illustrated in FIGS. 2A and 2B, lens 202 has a fixed focal length, i.e., distance D1 is constant. Therefore, the missing parameter needed to focus the image of sample 114 is distance D2. The present disclosure provides a microscope and a method for determining the value of distance D2.
- FIG. 3A and FIG. 3B illustrate how microscope 100 may determine the value of distance D2 using images acquired under a plurality of different illumination conditions. Specifically, FIG. 3A illustrates an exemplary image (or two images overlaid on top of each other) shown on user interface 112 when the image of sample 114 is out of focus, and FIG. 3B illustrates an exemplary image shown on user interface 112 when the image of sample 114 is in focus.
- FIG. 3A shows user interface 112 displaying information obtained from image sensor 200 that corresponds with the case illustrated in FIG. 2A. As shown, user interface 112 displays a first representation 300 of sample 114 and a second representation 302 of sample 114. Each representation corresponds to a different illumination condition. For example, first representation 300 may correspond to first illumination angle α1, and second representation 302 may correspond to second illumination angle α2. In one embodiment, both first representation 300 and second representation 302 are displayed together as part of a captured image because the light from the first illumination angle α1 may hit image sensor 200 concurrently with the light projected from the second illumination angle α2. In another embodiment, first representation 300 is captured as a first image and second representation 302 is captured as a second image. Both images may be overlaid on top of each other and shown as a single image, or used for calculations together.
- In some embodiments, controller 106 may be configured to identify the relative positions of the two (or more) representations using at least one common image feature of sample 114. As used herein, the term “image feature” refers to an identifiable element in a digital image, such as a line, a point, a spot, an edge, a region of similar brightness, a similar shape, an area of the image, etc., or another distinguishing characteristic of the pixels that comprise the image of sample 114. The changes between the two (or more) representations may be distinguishable with the naked eye and/or with the aid of image analysis algorithms that include feature detection or use a region of interest, which may be part or all of the image, as the input features, such as the Marr-Hildreth algorithm, the scale-invariant feature transform (SIFT) algorithm, the speeded up robust features (SURF) algorithm, digital image correlation (DIC), cross-correlation, etc. As shown in FIG. 3A, both first representation 300 and second representation 302 include a sharp protrusion on the upper side of the representation. Accordingly, controller 106 may identify a first occurrence 304 of the sharp protrusion and a second occurrence 306 of the sharp protrusion as a common image feature of sample 114. Consistent with the present disclosure, first occurrence 304 may be associated with the first illumination angle α1 and second occurrence 306 may be associated with the second illumination angle α2.
- After identifying multiple occurrences of at least one image feature of sample 114 associated with a plurality of illumination conditions, controller 106 may estimate an amount of shift between the occurrences. In FIG. 3A, the shift between first occurrence 304 and second occurrence 306 is represented by D3. The shift between first occurrence 304 and second occurrence 306 may be measured by counting the number of pixels between two occurrences of the same one or more image features. In theory, the values of shift D3 obtained by comparing multiple image features in the first and second representations should be substantially identical. However, as often happens in real-life applications, there may be significant variation in the measured values of shift D3 when estimating shifts of a plurality of image features. These variations may be caused by a tilt of microscope 100, a non-flat sample, field curvature of lens 202, and more. Therefore, in order to estimate shift D3 between first representation 300 and second representation 302, controller 106 may apply statistical calculations to the measured values. The statistical calculations may include one or more of the following operations: a mean, a median, an average, a mode, a percentile, or a standard deviation. Controller 106 may additionally apply these statistical calculations when determining a plurality of shift values or a shift vector when using more than two illumination conditions, such as more than two illumination angles.
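One concrete way to estimate such a shift, consistent with the cross-correlation option mentioned above (the exact estimator used by controller 106 is not specified here), is phase correlation between the two representations:

```python
import numpy as np

def estimate_shift(rep_a, rep_b):
    """Estimate the translation between two representations of the sample
    (e.g., occurrences 304 and 306) by phase correlation."""
    cross_power = np.fft.fft2(rep_a) * np.conj(np.fft.fft2(rep_b))
    cross_power /= np.abs(cross_power) + 1e-12        # keep phase only
    corr = np.abs(np.fft.ifft2(cross_power))
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Interpret peaks past the midpoint as negative shifts
    shift = [p if p <= s // 2 else p - s for p, s in zip(peak, corr.shape)]
    return tuple(shift)                               # (dy, dx) in pixels, like D3
```

A robust variant could aggregate per-feature shifts with a median, as described in the paragraph above.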
- In one embodiment, after estimating shift D3 between first representation 300 and second representation 302, controller 106 may determine distance D2 using the distance L between the illumination sources, the distance Z between the illumination source plane and the current focal plane, and the measured shift D3. In one example, distance D2 may be calculated using a linear equation of the form:
- D2 = (Z / L) × D3
- In order for controller 106 to reduce the distance between sample 114 and focal-plane 204, controller 106 may also determine the direction of the required adjustment. For example, in some cases focal-plane 204 may be below sample 114 (as illustrated in FIG. 2A), and controller 106 would need to increase the relative distance between image capture device 102 and sample 114 to focus the image. But in other cases focal-plane 204 may be above sample 114, and controller 106 may need to decrease the relative distance between image capture device 102 and sample 114 to focus the image. In one embodiment, controller 106 may determine the direction of the required adjustment using a two-step process. For example, assuming D2 has a value of 1 mm, controller 106 may instruct focus actuator 104 to move image capture device 102 up by 0.3 mm and check whether the focus of the image has improved. If it has improved, controller 106 may instruct focus actuator 104 to continue moving image capture device 102 up by an additional 0.7 mm; but if it has not improved, controller 106 may instruct focus actuator 104 to move image capture device 102 down by 1.3 mm. In another embodiment, controller 106 may determine the direction of the required adjustment using a one-step process, for example, by purposefully introducing a known separation between sample 114 and focal-plane 204. The known separation may correspond with a known shift, and a change to the known shift may indicate the size and direction of the actual shift D3. In yet another embodiment, controller 106 may determine the direction of the required adjustment using a one-step process by, for example, measuring the direction of the shift of features between the object images.
- FIG. 3B depicts user interface 112 displaying information obtained from image sensor 200 that corresponds with the case illustrated in FIG. 2B. As shown, user interface 112 displays a representation 308 of sample 114. In this example, user interface 112 displays a single representation because the image of sample 114 is in focus. That is, first occurrence 304 and second occurrence 306 were merged into representation 308 after controller 106 adjusted the relative distance between image capture device 102 and sample 114.
- In some embodiments, controller 106 may determine that the quality of the image is not sufficient. For example, the level of sharpness associated with an image of sample 114 may be below a predefined threshold. The level of sharpness may vary due to, for example, unintentional movement of microscope 100, a change of the ROI of sample 114, and more. To improve the quality of the image, controller 106 may refocus microscope 100. In addition, controller 106 may determine a plurality of shift values that correspond with a plurality of portions of the field of view of image capture device 102 to determine three-dimensional information. The three-dimensional information may include tilt information between microscope 100 and sample 114, the 3D shape of the object, and/or the field curvature of lens 202. Controller 106 may use the tilt information when reconstructing the image of sample 114 to improve the sharpness of the image of sample 114. Additional examples regarding the reconstruction of the image of sample 114 are provided below with reference to FIGS. 5-10.
- FIG. 4 is a flowchart showing an exemplary process 400 for focusing an image of sample 114 using two images captured when sample 114 is illuminated under two illumination conditions. Process 400, however, may be adapted to focus an image of sample 114 using a single image captured when sample 114 is illuminated under two illumination conditions, or using one or more images when sample 114 is illuminated under more than two illumination conditions. The steps of process 400 may be performed by an autofocus microscope. The term “autofocus microscope” refers to any device for magnifying sample 114 with the capability to focus the image of sample 114 (or the image of the ROI of sample 114) in an automatic or semiautomatic manner. In the following description, reference is made to certain components of microscope 100 for purposes of illustration. It will be appreciated, however, that other implementations are possible and that other components may be utilized to implement the example process.
- At step 402, controller 106 may cause illumination assembly 110 to illuminate sample 114 under a first illumination condition. At step 404, controller 106 may acquire, from image capture device 102, a first image of sample 114 illuminated under the first illumination condition. In some embodiments, controller 106 may cause illumination assembly 110 to illuminate sample 114 using a single light source located within a numerical aperture of image capture device 102. Alternatively, controller 106 may cause illumination assembly 110 to illuminate sample 114 using a plurality of light sources located within the numerical aperture of image capture device 102.
- At step 406, controller 106 may cause illumination assembly 110 to illuminate sample 114 under a second illumination condition different from the first illumination condition. Next, at step 408, controller 106 may acquire, from image capture device 102, a second image of sample 114 illuminated under the second illumination condition. In some embodiments, the illumination conditions may include at least one of: different illumination angles, different illumination patterns, different wavelengths, or a combination thereof. For example, the illumination conditions may include a first illumination angle and a second illumination angle symmetrically located with respect to an optical axis of image capture device 102. Alternatively, the illumination conditions may include a first illumination angle and a second illumination angle asymmetrically located with respect to an optical axis of image capture device 102. Alternatively, the illumination conditions may include a first illumination angle and a second illumination angle within the numerical aperture of image capture device 102. In the example depicted in FIG. 1, first illumination angle α1 is greater than second illumination angle α2; thus, the first illumination angle and the second illumination angle are asymmetrically located with respect to the optical axis of image capture device 102.
- At step 410, controller 106 may determine an amount of shift D3 between one or more image features present in the first image of sample 114 and a corresponding one or more image features present in the second image of sample 114. In some embodiments, controller 106 may determine a plurality of shift values based on multiple image features and calculate an overall shift associated with shift D3. For example, the overall shift may be a mean, a median, or a mode of the plurality of shift values. In other embodiments, controller 106 may determine a size of the distance change based on a magnitude of shift D3. In addition, controller 106 may also determine a direction of the distance change based on a direction of shift D3, or by purposely introducing a known separation between the sample and the focal plane. As discussed above, in some cases focal-plane 204 may be below sample 114 (as illustrated in FIG. 2A), and in other cases focal-plane 204 may be above sample 114. These different cases may require a different direction of the distance change to focus microscope 100. In some embodiments, controller 106 may calculate a distance from focal-plane 204 to sample 114 based on the shift (e.g., shift D3 in the lateral direction).
- At step 412, controller 106 may, where the amount of determined shift D3 is non-zero, cause focus actuator 104 to change distance D2 between sample 114 and focal-plane 204. As discussed above, focus actuator 104 may move image capture device 102 to adjust distance D2 between sample 114 and focal-plane 204, or move stage 116 to adjust the distance between sample 114 and focal-plane 204. In some embodiments, controller 106 may cause focus actuator 104 to reduce the distance between sample 114 and focal-plane 204 to substantially zero, for example, as illustrated in FIG. 2B. In some embodiments, when focus actuator 104 changes distance D2 in a first direction (e.g., up), controller 106 may determine that the amount of shift D3 has increased after the change in the first direction, and cause focus actuator 104 to change the distance in a second direction (e.g., down).
- In some embodiments, controller 106 may repeat steps 402 to 410 to determine an amount of a new shift after adjusting distance D2 between sample 114 and focal-plane 204. If the amount of the new shift is still non-zero, or above a predefined threshold, controller 106 may cause focus actuator 104 to change distance D2 between sample 114 and focal-plane 204 again. In some embodiments, controller 106 may readjust distance D2 between sample 114 and focal-plane 204 until shift D3 is substantially zero or below the predefined threshold. When the amount of the new shift is below a predetermined threshold, controller 106 may store the amount of determined shift for future focus compensation calculations. After completing process 400, microscope 100 is completely focused. Thereafter, and according to another aspect of the disclosure, microscope 100 may acquire a plurality of focused images to generate a high-resolution image of sample 114. As shown in FIG. 1, the high-resolution image of sample 114 may be sent to a display (e.g., a screen or phone), stored in memory, sent for further processing, or sent over a network.
- In some embodiments, controller 106 may use the determined distance D2 to perform calculations for computational correction of focus, along with physical motion of stage 116 or without causing stage 116 to move. Furthermore, in some embodiments, stage 116 and/or sample 114 may be moved using motors or manual controls in the XY plane to enable imaging of multiple areas of sample 114.
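Putting the pieces together, the following control-loop sketch mirrors steps 402-412. It is a hypothetical illustration, not the disclosed controller logic: `capture_pair`, `move_focus`, and `z_over_l` stand in for the hardware interfaces and the Z/L geometry factor, the shift estimator reuses the `estimate_shift` sketch shown earlier, and units are left abstract.

```python
def autofocus(capture_pair, move_focus, z_over_l, threshold_px=0.5, max_iters=10):
    # Hypothetical loop over steps 402-412: measure shift D3, convert it to a
    # distance estimate D2, move, and flip direction if the shift grows.
    direction = 1.0
    for _ in range(max_iters):
        dy, dx = estimate_shift(*capture_pair())      # steps 402-410
        d3 = (dy * dy + dx * dx) ** 0.5
        if d3 < threshold_px:                         # in focus: shift near zero
            return True
        move_focus(direction * z_over_l * d3)         # step 412: D2 ~ (Z / L) * D3
        dy, dx = estimate_shift(*capture_pair())
        if (dy * dy + dx * dx) ** 0.5 > d3:           # shift increased: reverse
            direction = -direction
    return False
```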
- Consistent with the present disclosure,
controller 106 may acquire images at a first image resolution and generate a reconstructed image ofsample 114 having a second (enhanced) image resolution. The term “image resolution” is a measure of the degree to which the image represents the fine details ofsample 114. For example, the quality of a digital image may also be related to the number of pixels and the range of brightness values available for each pixel. In some embodiments, generating the reconstructed image ofsample 114 is based on images having an image resolution lower than the enhanced image resolution. The enhanced image resolution may have at least 2 times, 5 times, 10 times, or 100 times more pixels than the lower image resolution images. For example, the first image resolution of the captured images may be referred to hereinafter as low-resolution and may have a value between 2 megapixels and 25 megapixels, between 10 megapixels and 20 megapixels, or about 15 megapixels. Whereas, the second image resolution of the reconstructed image may be referred to hereinafter as high-resolution and may have a value higher than 40 megapixels, higher than 100 megapixels, higher than 500 megapixels, or higher than 1000 megapixels. -
FIG. 5 is an illustration of anexemplary process 500 for reconstructing an image ofsample 114, consistent with disclosed embodiments. Atstep 502,controller 106 may acquire from image capture device 102 a plurality of low resolution images ofsample 114. The plurality of images includes at least one image for each illumination condition. As mentioned above, the different illumination conditions may include at least one of: different illumination angles, different illumination patterns, different wavelengths, or a combination thereof. In some embodiments, the total number (N) of the plurality of different illumination conditions is between 2 to 10, between 5 to 50, between 10 to 100, between 50 to 1000, or more than 1000. - At
step 504,controller 106 may determine image data ofsample 114 associated with each illumination condition. For example,controller 106 may apply a Fourier transform on images acquired fromimage capture device 102 to obtain Fourier transformed images. The Fourier transform is an image processing tool which is used to decompose an image into its sine and cosine components. The input of the transformation may be an image in the normal image space (also known as real-plane), while the output of the transformation may be a representation of the image in the frequency domain (also known as a Fourier-plane). Consistent with the present disclosure, the output of a transformation, such as the Fourier transform, is also referred to as “image data.” Alternatively,controller 106 may use other transformations, such as a Laplace transform, a Z transform, a Gelfand transform, or a Wavelet transform. In order to rapidly and efficiently convert the captured images into images in the Fourier-plane,controller 106 may use a Fast Fourier Transform (FFT) algorithm to compute the Discrete Fourier Transform (DFT) by factorizing the DFT matrix into a product of sparse (mostly zero) factors. - At
step 506,controller 106 may aggregate the image data determined from images captured under a plurality of illumination conditions to form a combined complex image. One way forcontroller 106 to aggregate the image data is by locating in the Fourier-plane overlapping regions in the image data. Another way forcontroller 106 to aggregate the image data is by determining the intensity and phase for the acquired low-resolution images per illumination condition. In this way, the image data, corresponding to the different illumination conditions, does not necessarily include overlapping regions. By eliminating or reducing the amount of overlap needed, this method has a great advantage in reducing the number of illumination conditions needed in order to reconstruct an image with a certain resolution, and therefore increasing the acquisition speed of the image information.FIGS. 6A-6F illustrate different configurations ofmicroscope 100 for determining phase information under a variety of illumination conditions. - At
step 508,controller 106 may generate a reconstructed high-resolution image ofsample 114. For example,controller 106 may apply the inverse Fourier transform to obtain the reconstructed image. In one embodiment, depicted inFIG. 5 , the reconstructed high-resolution image ofsample 114 may be shown on a display (e.g., user interface 112). In another embodiment, the reconstructed high-resolution image ofsample 114 may be used to identify at least one element ofsample 114 in the reconstructed image. The at least one element ofsample 114 may include any organic or nonorganic material identifiable using a microscope. Examples of the at least one element include, but are not limited to, biomolecules, whole cells, portions of cells such as various cell components (e.g., cytoplasm, mitochondria, nucleus, chromosomes, nucleoli, nuclear membrane, cell membrane, Golgi apparatus, lysosomes), cell-secreted components (e.g., proteins secreted to intercellular space, proteins secreted to body fluids, such as serum, cerebrospinal fluid, urine), microorganisms, and more. In some embodiments, the reconstructed image may be used in the following procedures: blood cell recognition, identification of chromosomes and karyotypes, detection of parasitic infections, identification of tissues suspected as malignant, and more. - The present disclosure provides several ways to determine the phase information under each illumination condition. According to one embodiment that may be implemented in the configuration of
FIG. 6A, microscope 100 may include illumination assembly 110, focus actuator 104, lens 202, and image sensor 200. In this embodiment, controller 106 may acquire a group of images from different focal planes for each illumination condition, and may use the information from the different focal planes to determine the phase information under each illumination condition. FIG. 7 describes a detailed example process of how controller 106 may use the configuration of FIG. 6A to generate the reconstructed high-resolution image of sample 114.
- According to another embodiment, which may be implemented in the configuration of
FIG. 6B, microscope 100 may include illumination assembly 110, lens 202, a beam splitter 600, a first image sensor 200A, and a second image sensor 200B. In this embodiment, first image sensor 200A and second image sensor 200B may capture different types of images, and controller 106 may combine the information from the two sensors to determine the phase information under each illumination condition. In one example, first image sensor 200A may capture Fourier-plane images and second image sensor 200B may capture real-plane images. Accordingly, controller 106 may acquire, for each illumination condition, a Fourier-plane image from first image sensor 200A and a real-plane image from second image sensor 200B, and may combine information from the two images to determine the phase information under that illumination condition. In another example, first image sensor 200A may be configured to capture focused images and second image sensor 200B may be configured to capture unfocused images. Additional sensors may also be added; for example, three different sensors may be configured to capture images in three different focal planes.
- According to another embodiment, which may be implemented in the configurations of
FIG. 6C and FIG. 6D, microscope 100 may include a light source 602, a beam splitter 600, lens 202, and image sensor 200. In this embodiment, light source 602 may project a light beam (coherent or at least partially coherent) towards beam splitter 600, which generates two light beams that travel through two different optical paths and create an interference pattern. In the configuration of FIG. 6C, the interference pattern is created on sample 114; in FIG. 6D, the interference pattern is created on image sensor 200. In the case presented in FIG. 6D, controller 106 may identify, for each illumination condition, the interference pattern between the two light beams traveling through the different optical paths, and may determine, from the interference pattern, the phase associated with each illumination condition.
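- One standard way to extract phase from such an interference pattern, assuming an off-axis geometry in which the two beams meet at a known angle (so the spectrum contains a well-separated carrier order), is Fourier-domain sideband filtering. The sketch below is illustrative only; the carrier location and window size are assumed inputs rather than values from the disclosure:

```python
import numpy as np

def phase_from_interferogram(interferogram: np.ndarray,
                             carrier: tuple[int, int],
                             radius: int) -> np.ndarray:
    """Recover a wrapped phase map from an off-axis interferogram
    by isolating one interference order in the Fourier domain.

    carrier : (row, col) offset of the +1 order in the centered
              spectrum, set by the angle between the two beams
    radius  : half-width of the square window isolating that order
    """
    spec = np.fft.fftshift(np.fft.fft2(interferogram))
    cy, cx = interferogram.shape[0] // 2, interferogram.shape[1] // 2
    ry, rx = cy + carrier[0], cx + carrier[1]
    # Cut out the +1 order and re-center it, which removes the carrier.
    window = spec[ry - radius:ry + radius, rx - radius:rx + radius]
    centered = np.zeros_like(spec)
    centered[cy - radius:cy + radius, cx - radius:cx + radius] = window
    field = np.fft.ifft2(np.fft.ifftshift(centered))
    return np.angle(field)  # wrapped phase in (-pi, pi]
```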
- According to yet another embodiment, which may be implemented in the configurations of FIG. 6E and FIG. 6F, microscope 100 may include illumination assembly 110, lens 202, an optical element 604, and at least one image sensor 200. In this embodiment, optical element 604 is configured to impose some form of modulation on the light received from sample 114. The modulation may be imposed on the phase, frequency, amplitude, or polarization of the beam. In the configuration illustrated in FIG. 6E, microscope 100 may include a dynamic optical element, such as a spatial light modulator (SLM), that may dynamically change the modulation. Controller 106 may use the differing information produced by the dynamic optical element to determine the phase information under each illumination condition. Alternatively, in the configuration illustrated in FIG. 6F, microscope 100 may include a fixed optical element, such as a phase-shift mask, together with beam splitter 600, first image sensor 200A, and second image sensor 200B. Controller 106 may combine information from first image sensor 200A and second image sensor 200B to determine the phase information under each illumination condition.
- In one embodiment,
controller 106 may determine the phase information under each illumination condition independently. FIG. 7 is a flow diagram showing the process of FIG. 5 using the configuration of FIG. 6A. The process begins when controller 106 causes illumination assembly 110 to illuminate sample 114 at a first illumination condition (block 700). Next, controller 106 may acquire an image when sample 114 is in focal-plane 204 (block 702). Then, controller 106 may cause focus actuator 104 to change the distance between image capture device 102 and sample 114 (block 704), and acquire an additional image when sample 114 is not in focal-plane 204 (block 706). In some embodiments, the distance between image capture device 102 and sample 114 may constitute a distance from lens 202 to sample 114, a distance from image sensor 200 to sample 114, or a sum of the distance from lens 202 to sample 114 and the distance from image sensor 200 to sample 114. Thereafter, controller 106 may determine whether there is sufficient data to determine the phase information (decision block 708). If there is insufficient data, controller 106 may repeat the steps in blocks 704-708 until there is sufficient data to determine the phase information of sample 114 under the current illumination condition. The phase may be calculated using methods such as the transport-of-intensity equation (TIE), error-reduction algorithms, the hybrid input-output (HIO) algorithm, optimization algorithms such as gradient descent, and others.
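- As an illustration of the focal-stack approach of FIG. 6A, the sketch below recovers phase from two images defocused by -dz and +dz using the transport-of-intensity equation. It assumes nearly uniform intensity, paraxial conditions, and a small regularization constant to stabilize the FFT-based Poisson solve; all names and parameter values are illustrative:

```python
import numpy as np

def tie_phase(img_under: np.ndarray, img_over: np.ndarray,
              dz: float, wavelength: float, pixel: float,
              eps: float = 1e-6) -> np.ndarray:
    """Estimate phase from two images defocused by -dz and +dz via
    the transport-of-intensity equation (TIE).

    With nearly uniform intensity I0, the TIE reduces to the Poisson
    equation laplacian(phi) = -(k / I0) * dI/dz, solved here in the
    Fourier domain.
    """
    k = 2 * np.pi / wavelength                 # wavenumber
    dI_dz = (img_over - img_under) / (2 * dz)  # axial intensity derivative
    I0 = 0.5 * (img_over + img_under).mean()

    ny, nx = dI_dz.shape
    fy = np.fft.fftfreq(ny, d=pixel)
    fx = np.fft.fftfreq(nx, d=pixel)
    FX, FY = np.meshgrid(fx, fy)
    lap = -4 * np.pi ** 2 * (FX ** 2 + FY ** 2)  # Fourier symbol of Laplacian

    rhs = -(k / I0) * dI_dz
    phi_hat = np.fft.fft2(rhs) / (lap - eps)     # regularized inversion
    return np.real(np.fft.ifft2(phi_hat))
```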
- The example process of FIG. 7 may continue when controller 106 transforms the image captured in focal-plane 204 from real space to Fourier space (block 710). Thereafter, controller 106 may add the image data associated with the current illumination condition to a combined complex image (block 712). If controller 106 has collected the image data associated with all of the illumination conditions and added it to the combined complex image (decision block 714), controller 106 may transform the combined complex image back to the real-plane to generate a reconstructed image of sample 114. If not, controller 106 may cause illumination assembly 110 to illuminate sample 114 under another illumination condition (block 716) and then repeat blocks 702-714.
- FIG. 8 is a schematic illustration that identifies the numerical aperture (NA) of microscope 100. Consistent with the present disclosure, microscope 100 may include a lens (e.g., lens 202) with a first numerical aperture. The term "numerical aperture" refers to the index of refraction of the medium (e.g., n1) multiplied by the sine of the maximal angle (e.g., θ1) formed between the optical axis of the lens and the cone of light beams over which the lens can accept light, i.e., NA1 = n1·sin θ1. For example, the first numerical aperture may be less than 1, less than 0.8, less than 0.6, less than 0.4, less than 0.2, or less than 0.1, or it may be more than 1. In addition, illumination assembly 110 may illuminate sample 114 at an incidence angle of illumination. The term "incidence angle of illumination" refers to an angle (e.g., θ2) formed between the optical axis of the lens and a light beam projected from illumination assembly 110. In some embodiments, the maximal incidence angle of illumination represents a second numerical aperture, i.e., NA2 = n2·sin(max θ2), which is at least 1.5 times the first numerical aperture. For example, the second numerical aperture may be more than 2×NA1, more than 2.5×NA1, more than 3.5×NA1, or more than 5×NA1.
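- A short worked example of these definitions follows. The angles and refractive index are illustrative assumptions, and the closing synthetic-aperture bound NA1 + NA2 is the relation commonly used for angular-illumination imaging rather than a statement from the disclosure:

```python
import math

n_medium = 1.0                     # air (illustrative)
theta1 = math.radians(17)          # lens acceptance half-angle (assumed)
theta2_max = math.radians(45)      # maximal illumination angle (assumed)

NA1 = n_medium * math.sin(theta1)      # lens numerical aperture, ~0.29
NA2 = n_medium * math.sin(theta2_max)  # illumination numerical aperture, ~0.71

print(f"NA1 = {NA1:.3f}, NA2 = {NA2:.3f}, ratio = {NA2 / NA1:.2f}")
# ratio ~ 2.4, satisfying the "at least 1.5 times" condition above.
print(f"synthetic NA bound ~ {NA1 + NA2:.3f}")  # ~1.0 for these values
```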
- FIG. 9A and FIG. 9B depict two schematic illustrations in the Fourier-plane. Specifically, FIG. 9A is a Fourier-plane illustration of image data acquired under a single illumination condition, and FIG. 9B is a Fourier-plane illustration of image data acquired under a plurality of different illumination conditions. The illustration of FIG. 9A includes a first circular area 900 whose radius is equal to NA1 and a second, theoretical circular area 902 whose radius is equal to NA2. As shown in the figure, the radius of second circular area 902 is at least 1.5 times the radius of first circular area 900. First circular area 900 is associated with image data of sample 114 when the illumination angle equals 0 degrees. In one embodiment, the non-iterative process of generating the reconstructed image includes using image data associated with a single illumination condition for each point in the combined complex image. FIG. 9B illustrates this embodiment. Specifically, FIG. 9B depicts a plurality of circles in the Fourier-plane; each circle represents a point in the combined complex image and is associated with image data acquired under a single illumination condition. For example, each circle in FIG. 9B is associated with a different illumination angle. Since the radius of circular area 902 is at least 1.5 times the radius of first circular area 900, the system in some embodiments is not limited to a first 'order' of additional areas around area 900, but can have additional 'orders' of circular areas further away from first circular area 900. This is important for achieving higher resolution in the final image; the method is not limited to increasing the numerical aperture by only a factor of 2. As described above with reference to FIG. 1, the illumination of sample 114 may result from light sources located at a surface parallel to sample 114 and/or from light sources located at a surface perpendicular to sample 114. For example, as shown in FIG. 1, light sources may be included on illumination assembly 110. Further, in some embodiments, light sources may be located at any appropriate location of microscope 100 and may illuminate sample 114 at any appropriate angle. Light sources may also be positioned on other surfaces, such as on the surface of a hemisphere or cube positioned under stage 116.
- FIG. 10 is a flowchart showing an exemplary process 1000 for constructing a high-resolution image of sample 114 using image information acquired under a plurality of different illumination conditions. At step 1002, controller 106 may cause the illumination assembly to illuminate sample 114 at a series of different illumination conditions. As described above, the illumination conditions may include at least one of: different illumination angles, different illumination patterns, different wavelengths, or a combination thereof. At step 1004, controller 106 may acquire from image capture device 102 a plurality of images of sample 114, the plurality of images including at least one image for each illumination condition. In some embodiments, the acquired images are captured at an image resolution higher than 12 megapixels with a pixel size smaller than 2 micrometers in both the horizontal and vertical dimensions. High-resolution sensors with small pixels are low in cost and provide a large field of view, but they require significant effort in working with low signal-to-noise ratios (a result of the small pixel size) and in optimizing speed, given the high memory bandwidths and large fields of view involved. The method described herein, however, is capable of operating with such sensors, whereas other methods cannot.
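- To make the sensor figures concrete, the back-of-the-envelope sketch below computes the sensor size and object-side field of view implied by a sensor of more than 12 megapixels with pixels smaller than 2 micrometers; the 4:3 pixel layout and the magnification are assumptions for illustration only:

```python
# Illustrative sensor: 4096 x 3072 pixels (~12.6 MP), 1.85 um pitch.
px_h, px_v = 4096, 3072
pitch_um = 1.85

sensor_w_mm = px_h * pitch_um / 1000    # ~7.6 mm
sensor_h_mm = px_v * pitch_um / 1000    # ~5.7 mm

magnification = 4                       # assumed low-magnification objective
fov_w_mm = sensor_w_mm / magnification  # ~1.9 mm object-side field of view
fov_h_mm = sensor_h_mm / magnification  # ~1.4 mm

print(f"sensor {sensor_w_mm:.1f} x {sensor_h_mm:.1f} mm, "
      f"FOV {fov_w_mm:.2f} x {fov_h_mm:.2f} mm")
```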
- At step 1006, controller 106 may determine, from the at least one image, image data of sample 114 for each illumination condition. In some embodiments, in order to determine the image data of sample 114 for each illumination condition, controller 106 may transform the at least one image from real space to Fourier space, aggregate the image data of the sample in Fourier space to form a combined complex image, and transform the combined complex image back to the image space to generate the reconstructed image of sample 114. Consistent with some embodiments, determining image data of sample 114 for each illumination condition may include determining phase information of sample 114 under each illumination condition independently. As discussed above with reference to FIGS. 6A-6F, determining the phase information under each illumination condition may be implemented using different configurations of microscope 100.
- In a first embodiment,
controller 106 may acquire, from image capture device 102, a group of first images from different focal planes for each illumination condition and determine, from the group of first images, phase information under each illumination condition independently. In a second embodiment, controller 106 may acquire, from first image sensor 200A, a first image for each illumination condition; acquire, from second image sensor 200B, a second image different from the first image for each illumination condition; and combine information from the first image and the second image to determine phase information under each illumination condition independently. In a third embodiment, controller 106 may identify, for each illumination condition, an interference pattern between the first and second light beams and determine, from the interference pattern, phase information associated with each illumination condition independently. In a fourth embodiment, controller 106 may acquire, for each illumination condition, a first image from first image sensor 200A and a second image from second image sensor 200B, wherein the second image is modulated differently from the first image, and combine information from the first image and the second image to determine phase information under each illumination condition.
- At step 1008, controller 106 may generate, from the determined image data for each illumination condition, a reconstructed image of sample 114, the reconstructed image having a second image resolution higher than the first image resolution. In some embodiments, controller 106 may generate the reconstructed image in a non-iterative process. The term "generating a reconstructed image in a non-iterative process" refers to a process in which the reconstructed image is not compared to the acquired images, nor are the acquired images compared to one another. The non-iterative process may include using image data associated with a single illumination condition for each point in the combined complex image, as depicted in FIG. 9B. In order to reconstruct an image in a non-iterative process, controller 106 may determine the intensity and phase information of sample 114 for each illumination condition. Thereafter, controller 106 may use the intensity and phase information to place each piece of the puzzle (i.e., the image data determined under each illumination condition) in its proper location. As one skilled in the art would recognize, this non-iterative process decreases the computation time needed to reconstruct the high-resolution image. It is possible, but not mandatory, for the phase information or other information for each illumination condition to be determined independently using an iterative process; the final high-resolution image, however, is generated from the information determined under the multiple illumination conditions in a non-iterative process. In this case, the overlap between regions in Fourier space can still be reduced or eliminated.
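- Putting steps 1002-1008 together, the following end-to-end sketch assembles the combined complex image non-iteratively: each condition's intensity and phase form a complex field, its spectrum is placed at the Fourier-plane offset implied by the illumination angle, each Fourier point takes data from a single condition, and one inverse transform produces the reconstruction. The offset mapping, tile size, and upsampling factor are illustrative assumptions:

```python
import numpy as np

def reconstruct_noniterative(intensities, phases, offsets, upsample=4):
    """Non-iterative reconstruction from per-condition intensity and
    phase maps.

    intensities, phases : lists of (h, w) arrays, one per condition
    offsets             : list of (row, col) Fourier-plane offsets
                          derived from each illumination angle
    """
    h, w = intensities[0].shape
    H, W = h * upsample, w * upsample
    combined = np.zeros((H, W), dtype=complex)
    filled = np.zeros((H, W), dtype=bool)

    for inten, phase, (dy, dx) in zip(intensities, phases, offsets):
        field = np.sqrt(inten) * np.exp(1j * phase)   # complex field
        spec = np.fft.fftshift(np.fft.fft2(field))
        r0 = H // 2 + dy - h // 2
        c0 = W // 2 + dx - w // 2
        region = (slice(r0, r0 + h), slice(c0, c0 + w))
        new = ~filled[region]            # one condition per Fourier point
        combined[region][new] = spec[new]
        filled[region] = True

    # Single inverse transform yields the high-resolution amplitude.
    return np.abs(np.fft.ifft2(np.fft.ifftshift(combined)))
```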
- After controller 106 generates the reconstructed image of sample 114, it may cause the reconstructed image to be shown on a display (step 1010) or identify at least one element of sample 114 in the reconstructed image (step 1012). In some embodiments, controller 106 may confirm the quality of the reconstructed image before using it. For example, controller 106 may generate the reconstructed image using a first set of constructing parameters and determine that the reconstructed image is not in a desired quality. In one example, the determination that the reconstructed image is not in the desired quality is based on a level of sharpness of the reconstructed image or parts of it, or on a comparison with expected or known results based on prior knowledge. Thereafter, controller 106 may generate a second reconstructed image using a second set of constructing parameters. In addition, controller 106 may acquire another set of images of sample 114 after changing the focus of microscope 100, as described above with reference to FIG. 4.
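- One common sharpness measure that could back such a quality check is the variance of a discrete Laplacian; the function below is an illustrative assumption, not a metric specified in the disclosure:

```python
import numpy as np

def sharpness(image: np.ndarray) -> float:
    """Sharpness score: variance of a 4-neighbor discrete Laplacian.
    Higher values indicate a sharper (better-focused) reconstruction.
    """
    lap = (-4 * image
           + np.roll(image, 1, axis=0) + np.roll(image, -1, axis=0)
           + np.roll(image, 1, axis=1) + np.roll(image, -1, axis=1))
    return float(lap.var())

# A controller might regenerate the image with a second parameter set
# whenever the score falls below an empirically chosen threshold.
```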
- The foregoing description has been presented for purposes of illustration. It is not exhaustive and is not limited to the precise forms or embodiments disclosed. Modifications and adaptations will be apparent to those skilled in the art from consideration of the specification and practice of the disclosed embodiments. Additionally, although aspects of the disclosed embodiments are described as being stored in memory, one skilled in the art will appreciate that these aspects can also be stored on other types of computer-readable media, such as secondary storage devices, for example, hard disks, floppy disks, CD-ROM, other forms of RAM or ROM, USB media, DVD, or other optical drive media.
- Computer programs based on the written description and disclosed methods are within the skill of an experienced developer. The various programs or program modules can be created using any of the techniques known to one skilled in the art or can be designed in connection with existing software. For example, program sections or program modules can be designed in or by means of the .NET Framework, the .NET Compact Framework (and related languages, such as Visual Basic, C, etc.), Java, C++, Objective-C, Python, MATLAB, CUDA, HTML, HTML/AJAX combinations, XML, or HTML with included Java applets. One or more of such software sections or modules can be integrated into a computer system or existing e-mail or browser software.
- Moreover, while illustrative embodiments have been described herein, the scope of this disclosure includes any and all embodiments having equivalent elements, modifications, omissions, combinations (e.g., of aspects across various embodiments), adaptations, and/or alterations, as would be appreciated by those skilled in the art based on the present disclosure. The limitations in the claims are to be interpreted broadly based on the language employed in the claims and are not limited to examples described in the present specification or during the prosecution of the application. The examples are to be construed as non-exclusive. Furthermore, the steps of the disclosed routines may be modified in any manner, including by reordering steps and/or inserting or deleting steps. It is intended, therefore, that the specification and examples be considered as illustrative only, with a true scope and spirit being indicated by the following claims and their full scope of equivalents.
Claims (23)
1. A microscope for constructing an image of a sample using image information acquired under a plurality of different illumination conditions, the microscope comprising:
at least one image capture device configured to capture at a first image resolution images of a sample;
a lens with a first numerical aperture;
an illumination assembly including at least one light source configured to illuminate the sample;
at least one controller programmed to:
cause the illumination assembly to illuminate the sample at a series of different illumination conditions;
acquire from the at least one image capture device a plurality of images of the sample, wherein the plurality of images includes at least one image for each illumination condition;
determine, from the at least one image, image data of the sample for each illumination condition; and
generate, from the determined image data for each illumination condition and in a non-iterative process, a reconstructed image of the sample, the reconstructed image having a second image resolution higher than the first image resolution.
2. The microscope of claim 1 , wherein the at least one controller is further programmed to cause the reconstructed image to be shown on a display.
3. The microscope of claim 1 , wherein the at least one controller is further programmed to identify at least one element of the sample in the reconstructed image.
4. The microscope of claim 1 , wherein the different illumination conditions include at least one of: different illumination angles, different illumination patterns, different wavelengths, or a combination thereof.
5. The microscope of claim 1 , wherein determining image data of the sample for each illumination condition includes determining phase information of the sample under each illumination condition independently.
6. The microscope of claim 5 , wherein in order to determine the phase information under each illumination condition, the at least one controller is programmed to:
acquire, from the at least one image capture device, a group of first images from different focal planes for each illumination condition; and
determine, from the group of first images, the phase information under each illumination condition independently.
7. The microscope of claim 5 , wherein the at least one image capture device includes a beam splitter, a first image sensor configured to capture first images, and a second image sensor configured to capture second images, and wherein in order to determine the phase information under each illumination condition, the at least one controller is programmed to:
acquire, from the at least one image capture device, a first image and a second image under each illumination condition, wherein the second image differs from the first image; and
combine information from the first image and the second image to determine the phase information under each illumination condition independently.
8. The microscope of claim 5 further includes a beam splitter configured to generate a first light beam that travels towards the at least one image capture device via a first optical path, and a second light beam that travels towards the at least one image capture device via a second optical path, wherein in order to determine the phase information under each illumination condition, the at least one controller is programmed to:
identify, for each illumination condition, an interference pattern between the first and second light beams; and
determine, from the interference pattern, the phase information under each illumination condition independently.
9. The microscope of claim 5 further includes an optical element configured to modulate light reflected from the sample, wherein in order to determine the phase information under each illumination condition, the at least one controller is programmed to:
acquire, from the at least one image capture device, a first image and a second image for each illumination condition, wherein the second image is modulated differently from the first image; and
combine information from the first image and the second image to determine the phase information under each illumination condition independently.
10. The microscope of claim 1 , wherein in order to determine the image data of the sample for each illumination condition the at least one controller is further programmed to:
transform the at least one image from a real-plane to a Fourier-plane;
aggregate the image data of the sample in the Fourier-plane to form a combined complex image; and
transform the combined complex image data back to the image space to generate the reconstructed image of the sample.
11. The microscope of claim 10 , wherein the non-iterative process of generating the reconstructed image includes using image data associated with a single illumination condition for each point in the combined complex image.
12. The microscope of claim 1 , wherein the at least one image capture device is configured to capture images at the first image resolution higher than 12 megapixels and with a pixel size smaller than 2 micrometers.
13. A microscope for constructing an image of a sample using image information acquired under a plurality of different illumination conditions, the microscope comprising:
at least one image capture device configured to capture at a first image resolution images of a sample;
a lens with a first numerical aperture;
an illumination assembly including at least one light source configured to illuminate the sample;
at least one controller programmed to:
cause the illumination assembly to illuminate the sample at a series of different illumination conditions;
acquire from the at least one image capture device a plurality of images of the sample, wherein the plurality of images includes at least one image for each illumination condition;
determine, from the at least one image, image data of the sample for each illumination condition, wherein the image data includes phase information of the sample under each illumination condition; and
generate, from the image data for the series of different illumination conditions, a reconstructed image of the sample, the reconstructed image having a second image resolution higher than the first image resolution.
14. The microscope of claim 13 , wherein the at least one controller is further programmed to cause the reconstructed image to be shown on a display or to identify at least one element of the sample in the reconstructed image.
15. The microscope of claim 13 , wherein the at least one controller is further programmed to:
generate the reconstructed image using a first set of constructing parameters;
determine that the reconstructed image is not in a desired quality; and
generate a second reconstructed image using a second set of constructing parameters.
16. The microscope of claim 15 , wherein the determination that the reconstructed image is not in the desired quality is based on a level of sharpness of the reconstructed image.
17. A method for constructing an image of a sample using image information acquired under a plurality of different illumination conditions, the method comprising:
illuminating a sample at a series of different illumination conditions;
acquiring, from an image capture device, a plurality of images of the sample captured at a first image resolution, wherein the plurality of images includes at least one image for each illumination condition;
determining, from the at least one image, image data of the sample for each illumination condition, wherein the image data includes phase information of the sample under each illumination condition; and
generating, from the determined image data for each illumination condition and in a non-iterative process, a reconstructed image of the sample, the reconstructed image having a second image resolution higher than the first image resolution.
18. The method of claim 17 , wherein the different illumination conditions include different illumination angles.
19. The method of claim 17 , wherein the different illumination conditions include different illumination patterns.
20. The method of claim 17 , wherein the different illumination conditions include different wavelengths.
21. The microscope of claim 1 , wherein a maximal incidence angle of illumination represents a second numerical aperture which is at least 1.5 times the first numerical aperture.
22. The microscope of claim 13 , wherein a maximal incidence angle of illumination represents a second numerical aperture which is at least 1.5 times the first numerical aperture.
23. The method of claim 17 , wherein an illumination of the sample is at an incidence angle representing a maximal numerical aperture which is at least 1.5 times a numerical aperture associated with an image capture device.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/775,389 US20190235224A1 (en) | 2015-11-11 | 2016-11-10 | Computational microscopes and methods for generating an image under different illumination conditions |
Applications Claiming Priority (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201562253726P | 2015-11-11 | 2015-11-11 | |
US201562253734P | 2015-11-11 | 2015-11-11 | |
US201562253723P | 2015-11-11 | 2015-11-11 | |
PCT/IB2016/001725 WO2017081542A2 (en) | 2015-11-11 | 2016-11-10 | Computational microscopes and methods for generating an image under different illumination conditions |
US15/775,389 US20190235224A1 (en) | 2015-11-11 | 2016-11-10 | Computational microscopes and methods for generating an image under different illumination conditions |
Publications (1)
Publication Number | Publication Date |
---|---|
US20190235224A1 true US20190235224A1 (en) | 2019-08-01 |
Family
ID=57543083
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/775,389 Abandoned US20190235224A1 (en) | 2015-11-11 | 2016-11-10 | Computational microscopes and methods for generating an image under different illumination conditions |
US15/775,370 Active 2037-05-18 US10705326B2 (en) | 2015-11-11 | 2016-11-10 | Autofocus system for a computational microscope |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/775,370 Active 2037-05-18 US10705326B2 (en) | 2015-11-11 | 2016-11-10 | Autofocus system for a computational microscope |
Country Status (5)
Country | Link |
---|---|
US (2) | US20190235224A1 (en) |
EP (2) | EP3374816A2 (en) |
JP (1) | JP2018537709A (en) |
CN (2) | CN108351504A (en) |
WO (2) | WO2017081539A1 (en) |
Families Citing this family (34)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10652444B2 (en) | 2012-10-30 | 2020-05-12 | California Institute Of Technology | Multiplexed Fourier ptychography imaging systems and methods |
US9864184B2 (en) | 2012-10-30 | 2018-01-09 | California Institute Of Technology | Embedded pupil function recovery for fourier ptychographic imaging devices |
US10679763B2 (en) | 2012-10-30 | 2020-06-09 | California Institute Of Technology | Fourier ptychographic imaging systems, devices, and methods |
JP2016534389A (en) | 2013-07-31 | 2016-11-04 | カリフォルニア インスティチュート オブ テクノロジー | Aperture scanning Fourier typography imaging |
WO2015027188A1 (en) | 2013-08-22 | 2015-02-26 | California Institute Of Technoloby | Variable-illumination fourier ptychographic imaging devices, systems, and methods |
US11468557B2 (en) | 2014-03-13 | 2022-10-11 | California Institute Of Technology | Free orientation fourier camera |
US10162161B2 (en) | 2014-05-13 | 2018-12-25 | California Institute Of Technology | Ptychography imaging systems and methods with convex relaxation |
EP3238135B1 (en) | 2014-12-22 | 2020-02-05 | California Institute Of Technology | Epi-illumination fourier ptychographic imaging for thick samples |
EP3248208B1 (en) | 2015-01-21 | 2019-11-27 | California Institute of Technology | Fourier ptychographic tomography |
CA2970053A1 (en) | 2015-01-26 | 2016-08-04 | California Institute Of Technology | Multi-well fourier ptychographic and fluorescence imaging |
CA2979392A1 (en) | 2015-03-13 | 2016-09-22 | California Institute Of Technology | Correcting for aberrations in incoherent imaging system using fourier ptychographic techniques |
US9993149B2 (en) | 2015-03-25 | 2018-06-12 | California Institute Of Technology | Fourier ptychographic retinal imaging methods and systems |
WO2016187591A1 (en) | 2015-05-21 | 2016-11-24 | California Institute Of Technology | Laser-based fourier ptychographic imaging systems and methods |
US11092795B2 (en) | 2016-06-10 | 2021-08-17 | California Institute Of Technology | Systems and methods for coded-aperture-based correction of aberration obtained from Fourier ptychography |
US10568507B2 (en) | 2016-06-10 | 2020-02-25 | California Institute Of Technology | Pupil ptychography methods and systems |
US10558029B2 (en) | 2016-10-27 | 2020-02-11 | Scopio Labs Ltd. | System for image reconstruction using a known pattern |
DE102016123154A1 (en) * | 2016-11-30 | 2018-05-30 | Carl Zeiss Microscopy Gmbh | Determination of the arrangement of a sample object by means of angle-selective illumination |
US10477097B2 (en) | 2017-01-03 | 2019-11-12 | University Of Connecticut | Single-frame autofocusing using multi-LED illumination |
DE102017111718A1 (en) * | 2017-05-30 | 2018-12-06 | Carl Zeiss Microscopy Gmbh | Method for generating and analyzing an overview contrast image |
DE102017115658A1 (en) * | 2017-07-12 | 2019-01-17 | Carl Zeiss Microscopy Gmbh | Flickering at angle-variable illumination |
US10754140B2 (en) | 2017-11-03 | 2020-08-25 | California Institute Of Technology | Parallel imaging acquisition and restoration methods and systems |
FR3076911A1 (en) * | 2018-01-16 | 2019-07-19 | Francois Perraut | MICROSCOPE LIGHTING DEVICE FOR MEASURING A FAULT OF FOCUS |
FR3076912A1 (en) * | 2018-01-16 | 2019-07-19 | Francois Perraut | MICROSCOPE LIGHTING DEVICE FOR PERFORMING THE IMAGE FOCUS OF AN OBJECT |
DE102018107356A1 (en) | 2018-03-28 | 2019-10-02 | Carl Zeiss Microscopy Gmbh | Autofocus with angle-variable illumination |
WO2019185174A1 (en) * | 2018-03-29 | 2019-10-03 | Leica Microsystems Cms Gmbh | Apparatus and method, particularly for microscopes and endoscopes, using baseline estimation and half-quadratic minimization for the deblurring of images |
CN108398775B (en) * | 2018-04-24 | 2019-11-22 | 清华大学 | The focusing method and device of fluorescence microscope system |
DE102018128083A1 (en) * | 2018-11-09 | 2020-05-14 | Leica Microsystems Cms Gmbh | Microscopic transmitted light contrast method |
EP3899621A4 (en) * | 2018-12-21 | 2022-09-28 | Scopio Labs Ltd. | Compressed acquisition of microscopic images |
US11356593B2 (en) * | 2019-05-08 | 2022-06-07 | University Of Connecticut | Methods and systems for single frame autofocusing based on color- multiplexed illumination |
US11523046B2 (en) * | 2019-06-03 | 2022-12-06 | Molecular Devices, Llc | System and method to correct for variation of in-focus plane across a field of view of a microscope objective |
KR102484708B1 (en) * | 2019-11-18 | 2023-01-05 | 단국대학교 산학협력단 | Correction method of three-dimensional image based on multi-wavelength light |
JP2023537892A (en) * | 2020-08-07 | 2023-09-06 | ナノトロニクス イメージング インコーポレイテッド | A Deep Learning Model for Autofocusing Microscopy Systems |
CN112799225A (en) * | 2020-12-31 | 2021-05-14 | 安徽大学 | Smart phone microscopic system based on intensity transmission equation |
CN114967100B (en) * | 2022-08-02 | 2022-11-18 | 杭州德适生物科技有限公司 | Chromosome karyotype analysis micro-shooting device and parameter correction method thereof |
Family Cites Families (28)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE2423136C3 (en) | 1974-05-13 | 1982-07-29 | Fa. Carl Zeiss, 7920 Heidenheim | Device for automatic focusing of stereo microscopes |
JPS58153327A (en) | 1982-03-08 | 1983-09-12 | Toshiba Corp | Inspecting device for pattern |
DE3328821C2 (en) | 1983-08-10 | 1986-10-02 | Fa. Carl Zeiss, 7920 Heidenheim | Auto focus for microscopes |
JPH0769162B2 (en) | 1990-04-23 | 1995-07-26 | 大日本スクリーン製造株式会社 | Automatic focusing device for optical inspection system |
IL111229A (en) | 1994-10-10 | 1998-06-15 | Nova Measuring Instr Ltd | Autofocusing microscope |
EP0795137B1 (en) * | 1994-11-29 | 1998-08-26 | Minnesota Mining And Manufacturing Company | Transparent decorative article having an etched appearing/prismatic image thereon |
US6677565B1 (en) | 1998-08-18 | 2004-01-13 | Veeco Tucson Inc. | High speed autofocus and tilt for an optical imaging system |
JP2005045164A (en) | 2003-07-25 | 2005-02-17 | Toshiba Corp | Automatic focusing device |
US20050213037A1 (en) * | 2004-03-29 | 2005-09-29 | Erkin Abdullayev | Method and apparatus of cornea examination |
US7075046B2 (en) * | 2004-07-28 | 2006-07-11 | University Of Vermont And State Agricultural College | Objective lens reference system and method |
JP4790420B2 (en) * | 2006-01-12 | 2011-10-12 | オリンパス株式会社 | Microscope with automatic focusing mechanism and adjustment method thereof |
JP4822925B2 (en) | 2006-04-28 | 2011-11-24 | 日本電子株式会社 | Transmission electron microscope |
US7700903B2 (en) * | 2006-06-09 | 2010-04-20 | Wdi Wise Device Inc. | Method and apparatus for the auto-focussing infinity corrected microscopes |
DE102006027836B4 (en) | 2006-06-16 | 2020-02-20 | Carl Zeiss Microscopy Gmbh | Microscope with auto focus device |
CN102099916B (en) * | 2008-07-25 | 2013-07-31 | 康奈尔大学 | Light field image sensor, method and applications |
JP5717296B2 (en) | 2010-01-27 | 2015-05-13 | 国立大学法人北海道大学 | Diffraction microscopy |
EP2373043A1 (en) | 2010-03-29 | 2011-10-05 | Swiss Medical Technology GmbH | Optical stereo device and autofocus method therefor |
WO2011161594A1 (en) * | 2010-06-24 | 2011-12-29 | Koninklijke Philips Electronics N.V. | Autofocus based on differential measurements |
DE102011077236A1 (en) | 2011-06-08 | 2012-12-13 | Carl Zeiss Microlmaging Gmbh | Autofocus method for microscope and microscope with autofocus device |
DE102011082756A1 (en) | 2011-09-15 | 2013-03-21 | Leica Microsystems (Schweiz) Ag | Autofocusing method and device for a microscope |
GB201201140D0 (en) | 2012-01-24 | 2012-03-07 | Phase Focus Ltd | Method and apparatus for determining object characteristics |
US10652444B2 (en) | 2012-10-30 | 2020-05-12 | California Institute Of Technology | Multiplexed Fourier ptychography imaging systems and methods |
US10679763B2 (en) * | 2012-10-30 | 2020-06-09 | California Institute Of Technology | Fourier ptychographic imaging systems, devices, and methods |
WO2015134924A1 (en) * | 2014-03-07 | 2015-09-11 | The Regents Of The University Of California | Partially coherent phase recovery |
CN104200449B (en) * | 2014-08-25 | 2016-05-25 | 清华大学深圳研究生院 | A kind of FPM method based on compressed sensing |
CN104796609B (en) * | 2015-04-17 | 2018-01-05 | 南京理工大学 | Large visual field high resolution micro imaging method based on optimal Hadamard coding |
CN104932092B (en) * | 2015-06-15 | 2017-09-08 | 上海交通大学 | Auto-focusing microscope and its focusing method based on eccentric pencil method |
US10754140B2 (en) * | 2017-11-03 | 2020-08-25 | California Institute Of Technology | Parallel imaging acquisition and restoration methods and systems |
2016
- 2016-11-10 EP EP16810021.2A patent/EP3374816A2/en not_active Withdrawn
- 2016-11-10 WO PCT/IB2016/001703 patent/WO2017081539A1/en active Application Filing
- 2016-11-10 US US15/775,389 patent/US20190235224A1/en not_active Abandoned
- 2016-11-10 JP JP2018524197A patent/JP2018537709A/en not_active Withdrawn
- 2016-11-10 CN CN201680065388.0A patent/CN108351504A/en active Pending
- 2016-11-10 CN CN201680064749.XA patent/CN108351506B/en active Active
- 2016-11-10 US US15/775,370 patent/US10705326B2/en active Active
- 2016-11-10 EP EP16834094.1A patent/EP3374817B1/en active Active
- 2016-11-10 WO PCT/IB2016/001725 patent/WO2017081542A2/en active Application Filing
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20040113050A1 (en) * | 2002-12-17 | 2004-06-17 | Olszak Artur G. | Method and apparatus for limiting scanning imaging array data to characteristics of interest |
US20060126952A1 (en) * | 2004-11-19 | 2006-06-15 | Ntt Docomo, Inc. | Image decoding apparatus, image decoding program, image decoding method, image encoding apparatus, image encoding program, and image encoding method |
US20140347515A1 (en) * | 2011-10-14 | 2014-11-27 | Panasonic Corporation | Imaging lens and imaging device using same |
US20150054979A1 (en) * | 2013-08-22 | 2015-02-26 | California Institute Of Technology | Variable-illumination fourier ptychographic imaging devices, systems, and methods |
US20160131891A1 (en) * | 2013-09-06 | 2016-05-12 | Canon Kabushiki Kaisha | Image processing method, image processing apparatus, image pickup apparatus, and non-transitory computer-readable storage medium |
US20150331228A1 (en) * | 2014-05-13 | 2015-11-19 | California Institute Of Technology | Ptychography imaging systems and methods with convex relaxation |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11482021B2 (en) | 2017-10-19 | 2022-10-25 | Scopio Labs Ltd. | Adaptive sensing based on depth |
US11409095B2 (en) | 2017-11-20 | 2022-08-09 | Scopio Labs Ltd. | Accelerating digital microscopy scans using empty/dirty area detection |
US11549955B2 (en) | 2017-11-20 | 2023-01-10 | Scopio Labs Ltd. | Multi/parallel scanner |
US11828927B2 (en) | 2017-11-20 | 2023-11-28 | Scopio Labs Ltd. | Accelerating digital microscopy scans using empty/dirty area detection |
US11650405B2 (en) | 2019-11-15 | 2023-05-16 | Scopio Labs Ltd. | Microscope and method for computational microscopic layer separation |
DE102021102990A1 (en) | 2021-02-09 | 2022-08-11 | Carl Zeiss Microscopy Gmbh | Method and device for determining the optimal position of the focal plane for microscopy of a sample |
Also Published As
Publication number | Publication date |
---|---|
US20180329194A1 (en) | 2018-11-15 |
CN108351504A (en) | 2018-07-31 |
EP3374817A1 (en) | 2018-09-19 |
WO2017081542A2 (en) | 2017-05-18 |
EP3374817B1 (en) | 2020-01-08 |
WO2017081539A1 (en) | 2017-05-18 |
CN108351506B (en) | 2020-12-04 |
WO2017081542A3 (en) | 2017-07-06 |
US10705326B2 (en) | 2020-07-07 |
EP3374816A2 (en) | 2018-09-19 |
CN108351506A (en) | 2018-07-31 |
JP2018537709A (en) | 2018-12-20 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10705326B2 (en) | Autofocus system for a computational microscope | |
US10558029B2 (en) | System for image reconstruction using a known pattern | |
US10101572B2 (en) | Variable focal length lens system with multi-level extended depth of field image processing | |
US20180348500A1 (en) | Scanning microscope with real time response | |
US11482021B2 (en) | Adaptive sensing based on depth | |
US10679763B2 (en) | Fourier ptychographic imaging systems, devices, and methods | |
US11828927B2 (en) | Accelerating digital microscopy scans using empty/dirty area detection | |
CN111164646A (en) | Multi-step image alignment method for large offset die-to-die inspection | |
US11650405B2 (en) | Microscope and method for computational microscopic layer separation | |
CN103168265A (en) | Imaging systems and associated methods thereof | |
US20210149170A1 (en) | Method and apparatus for z-stack acquisition for microscopic slide scanner | |
Cao et al. | Method based on bioinspired sample improves autofocusing performances | |
Buat et al. | Active chromatic depth from defocus for industrial inspection | |
JP2015102694A (en) | Alignment device, microscopic system, alignment method, and alignment program | |
US20240137632A1 (en) | ENGINEERED POINT SPREAD FUNCTION (ePSF) OBJECTIVE LENSES | |
Chiang et al. | Design of a hand-held automatic focus digital microscope by using CMOS image sensor | |
US20230186483A1 (en) | Method for determining boundaries of a z-stack of images of an object, corresponding optical instrument and computer program therefor | |
Liang et al. | Computational label-free microscope through a custom-built high-throughput objective lens and Fourier ptychography | |
Cheng et al. | High-speed all-in-focus 3D imaging technology based on the liquid lens focus scanning | |
JP2023125282A (en) | Analysis method and analysis apparatus | |
Wang et al. | Image-based autofocusing algorithm applied in image fusion process for optical microscope |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: SCOPIO LABS LTD., ISRAEL Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:SMALL, ERAN;LESHEM, BEN;MADAR, ITTAY;AND OTHERS;REEL/FRAME:045775/0222 Effective date: 20170125 |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |