AU2011293269B2 - Microscopy imaging device with advanced imaging properties - Google Patents

Microscopy imaging device with advanced imaging properties

Info

Publication number
AU2011293269B2
Authority
AU
Australia
Prior art keywords
microscope
imaging
image
optical
objective lens
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
AU2011293269A
Other versions
AU2011293269A1 (en)
Inventor
Laurie Burns
Eric Cocker
Abbas El Gamal
Kunal Ghosh
Tatt Wei Ho
Mark J. Schnitzer
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Leland Stanford Junior University
Original Assignee
Leland Stanford Junior University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Family has litigation
First worldwide family litigation filed litigation Critical https://patents.darts-ip.com/?family=44838762&utm_source=google_patent&utm_medium=platform_link&utm_campaign=public_patent_search&patent=AU2011293269(B2) "Global patent litigation dataset" by Darts-ip is licensed under a Creative Commons Attribution 4.0 International License.
Application filed by Leland Stanford Junior University filed Critical Leland Stanford Junior University
Publication of AU2011293269A1 publication Critical patent/AU2011293269A1/en
Application granted granted Critical
Publication of AU2011293269B2 publication Critical patent/AU2011293269B2/en
Priority to AU2015272003A priority Critical patent/AU2015272003B2/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B21/00Microscopes
    • G02B21/0004Microscopes specially adapted for specific applications
    • G02B21/002Scanning microscopes
    • G02B21/0024Confocal scanning microscopes (CSOMs) or confocal "macroscopes"; Accessories which are not restricted to use with CSOMs, e.g. sample holders
    • G02B21/0052Optical details of the image generation
    • G02B21/0076Optical details of the image generation arrangements using fluorescence or luminescence
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/0059Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
    • A61B5/0071Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence by measuring fluorescence emission
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/20Surgical microscopes characterised by non-optical aspects
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01NINVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/62Systems in which the material investigated is excited whereby it emits light or causes a change in wavelength of the incident light
    • G01N21/63Systems in which the material investigated is excited whereby it emits light or causes a change in wavelength of the incident light optically excited
    • G01N21/64Fluorescence; Phosphorescence
    • G01N21/645Specially adapted constructive features of fluorimeters
    • G01N21/6456Spatial resolved fluorescence measurements; Imaging
    • G01N21/6458Fluorescence microscopy
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B21/00Microscopes
    • G02B21/0004Microscopes specially adapted for specific applications
    • G02B21/0008Microscopes having a simple construction, e.g. portable microscopes
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B21/00Microscopes
    • G02B21/0004Microscopes specially adapted for specific applications
    • G02B21/002Scanning microscopes
    • G02B21/0024Confocal scanning microscopes (CSOMs) or confocal "macroscopes"; Accessories which are not restricted to use with CSOMs, e.g. sample holders
    • G02B21/008Details of detection or image processing, including general computer control
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B21/00Microscopes
    • G02B21/06Means for illuminating specimens
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B21/00Microscopes
    • G02B21/16Microscopes adapted for ultraviolet illumination ; Fluorescence microscopes
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B21/00Microscopes
    • G02B21/36Microscopes arranged for photographic purposes or projection purposes or digital imaging or video purposes including associated control and data processing arrangements
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B21/00Microscopes
    • G02B21/36Microscopes arranged for photographic purposes or projection purposes or digital imaging or video purposes including associated control and data processing arrangements
    • G02B21/361Optical details, e.g. image relay to the camera or image sensor
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B21/00Microscopes
    • G02B21/36Microscopes arranged for photographic purposes or projection purposes or digital imaging or video purposes including associated control and data processing arrangements
    • G02B21/362Mechanical details, e.g. mountings for the camera or image sensor, housings
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B21/00Microscopes
    • G02B21/36Microscopes arranged for photographic purposes or projection purposes or digital imaging or video purposes including associated control and data processing arrangements
    • G02B21/365Control or image processing arrangements for digital or video microscopes
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/10Beam splitting or combining systems
    • G02B27/14Beam splitting or combining systems operating by reflection only
    • G02B27/141Beam splitting or combining systems operating by reflection only using dichroic mirrors
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B3/00Simple or compound lenses
    • G02B3/0087Simple or compound lenses with index gradient
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • G06T7/0012Biomedical image inspection
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/56Cameras or camera modules comprising electronic image sensors; Control thereof provided with illuminating means
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/69Control of means for changing angle of the field of view, e.g. optical zoom objectives or electronic zooming
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/30Devices for illuminating a surgical field, the devices having an interrelation with other surgical devices or with a surgical procedure
    • A61B2090/306Devices for illuminating a surgical field, the devices having an interrelation with other surgical devices or with a surgical procedure using optical fibres
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B2503/00Evaluating a particular growth phase or type of persons or animals
    • A61B2503/40Animals
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B2503/00Evaluating a particular growth phase or type of persons or animals
    • A61B2503/42Evaluating a particular growth phase or type of persons or animals for laboratory research
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B2562/00Details of sensors; Constructional details of sensor housings or probes; Accessories for sensors
    • A61B2562/02Details of sensors specially adapted for in-vivo measurements
    • A61B2562/0233Special features of optical sensors or probes classified in A61B5/00
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B2562/00Details of sensors; Constructional details of sensor housings or probes; Accessories for sensors
    • A61B2562/04Arrangements of multiple sensors of the same type
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B2576/00Medical imaging apparatus involving image processing or analysis
    • A61B2576/02Medical imaging apparatus involving image processing or analysis specially adapted for a particular organ or body part
    • A61B2576/026Medical imaging apparatus involving image processing or analysis specially adapted for a particular organ or body part for the brain
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/0033Features or image-related aspects of imaging apparatus classified in A61B5/00, e.g. for MRI, optical tomography or impedance tomography apparatus; arrangements of imaging apparatus in a room
    • A61B5/004Features or image-related aspects of imaging apparatus classified in A61B5/00, e.g. for MRI, optical tomography or impedance tomography apparatus; arrangements of imaging apparatus in a room adapted for image acquisition of a particular organ or body part
    • A61B5/0042Features or image-related aspects of imaging apparatus classified in A61B5/00, e.g. for MRI, optical tomography or impedance tomography apparatus; arrangements of imaging apparatus in a room adapted for image acquisition of a particular organ or body part for the brain
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/0059Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
    • A61B5/0082Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence adapted for particular medical purposes
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/48Other medical applications
    • A61B5/4887Locating particular structures in or on the body
    • A61B5/489Blood vessels
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30004Biomedical image processing
    • G06T2207/30016Brain
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30004Biomedical image processing
    • G06T2207/30101Blood vessel; Artery; Vein; Vascular
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H30/00ICT specially adapted for the handling or processing of medical images
    • G16H30/40ICT specially adapted for the handling or processing of medical images for processing medical images, e.g. editing
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/50Constructional details
    • H04N23/55Optical parts specially adapted for electronic image sensors; Mounting thereof

Abstract

Systems, methods and devices are implemented for microscope imaging solutions. One embodiment of the present disclosure is directed toward an epifluorescence microscope. The microscope includes an image capture circuit including an array of optical sensors. An optical arrangement is configured to direct excitation light of less than about 1 mW to a target object in a field of view that is at least 0.5 mm²

Description

MICROSCOPY IMAGING DEVICE WITH ADVANCED IMAGING PROPERTIES

Related Documents

This patent document claims benefit under 35 U.S.C. § 119 to U.S. Provisional Patent Application Serial No. 61/377,591, entitled "Microscopy Imaging Device with Advanced Imaging Properties" and filed on August 27, 2010; this patent document and the Appendices filed in the underlying provisional application, including the references cited therein, are fully incorporated herein by reference.

Overview of Certain Embodiments

Aspects of the present disclosure relate generally to microscopy imaging devices, for example, miniature epifluorescence imaging devices. Optical microscopes are often designed as instruments of substantial size and expense. The role of imaging in biomedicine has grown, and miniaturized integration of the light microscope facilitates the advancement of many new applications. For instance, mass-producible, tiny microscopes can be useful for imaging of cells in freely behaving animals, and particularly in the brain, which is useful for understanding how cellular dynamics relate to animal behavior. Although not limited thereto, aspects of the present disclosure relate to miniature (<2 g), integrated fluorescence microscopes made from mass-producible parts, including a semiconductor light source and image sensor, allowing imaging across ~0.5 mm² areas. Such devices can be configured for high-speed observation of cellular dynamics with sufficient image quality and/or resolution that such observation is useful for viewing dynamics of the brains of active mice at frame acquisition rates up to 100 Hz. The use of a miniature microscope can be useful for a variety of different applications (e.g., tracking Ca2+ spiking concurrently in up to >200 Purkinje neurons extending over 9 cerebellar microzones).

Aspects of the present disclosure are directed toward epifluorescence microscopes. The microscope includes an image capture circuit with an array of optical sensors. An optical arrangement is configured to direct excitation light of less than about 1 mW over an area that is at least 0.5 mm² encompassed within a field of view which comprises a target object, and to direct epi-fluorescence emission caused by the excitation light to the array of optical sensors. The optical arrangement and array of optical sensors are each sufficiently close to the target object to provide at least 2.5 µm resolution for an image of the field of view.

Certain embodiments of the present disclosure are directed to an epifluorescence microscope that has an optical light source configured to produce excitation light from an energy source that provides less than 6 mW, wherein the excitation light is directed over an area that is at least 0.5 mm² encompassed within a field of view which comprises a target object. The microscope includes an imaging circuit including a sensor array, and an objective lens configured to operate sufficiently close to the optical light source, the image sensor array and the target object to provide at least 2.5 µm image resolution for the field of view that is at least 0.5 mm².

Other embodiments of the present disclosure relate to epifluorescence microscopes that occupy less than a cubic inch. Such a microscope includes an optical excitation arrangement configured to direct light toward a field of view containing an imaging target.
An imaging circuit including an optical sensor array is configured to generate image data from fluorescence caused by an interaction between the directed light and the imaging target. An optical arrangement is configured to direct the fluorescence to the optical sensor array with sufficient intensity and focus for the image data to depict an area of over 0.20 mm² at a resolution of at least 3 µm. In other embodiments the intensity and focus for the image data are sufficient to provide at least 2.5 µm image resolution for the field of view that is at least 0.5 mm².

Consistent with other embodiments of the present disclosure, an imaging device includes a portable housing that is less than a cubic inch in size. The portable housing contains several elements including an excitation source configured to provide excitation light. A structure is also included, the structure being configured to provide an optical pathway having a first end and a second end. The structure includes an objective lens at the first end of the optical pathway; one or more excitation elements that are configured and arranged to direct the excitation light to the objective lens; and one or more emission elements that are configured and arranged to provide a focal plane at the second end of the optical pathway from epifluorescent emission light received from the objective lens. An imaging circuit includes an array of optical sensors positioned at the focal plane and configured and arranged to capture an image of the target object from the epifluorescent emission light, the image having sufficient field of view to capture multiple individual capillary blood vessels and sufficient resolution to distinguish the individual capillary blood vessels from one another.

Certain aspects of the present disclosure are exemplified in a number of illustrated implementations and applications, some of which are shown in the figures and characterized in the claims section that follows. The above overview is not intended to describe each illustrated embodiment or every implementation of the present disclosure.

Brief Description of the Drawings

Aspects of the present disclosure may be more completely understood in consideration of the detailed description of various embodiments of the present disclosure that follows in connection with the accompanying drawings, in which:

FIG. 1 depicts a block diagram of an epifluorescence microscope device, consistent with an embodiment of the present disclosure;

FIG. 2 depicts a block diagram of an epifluorescence microscope device with an external optical source, consistent with an embodiment of the present disclosure;

FIG. 3 shows a cross-section of a miniature fluorescence microscope, consistent with an embodiment of the present disclosure;

FIG. 4 depicts an objective lens and ray propagation therein, consistent with an embodiment of the present disclosure;

FIG. 5 depicts an optical ray trace diagram of an imaging pathway with two lens elements and additional spectral-filtering components, consistent with an embodiment of the present disclosure; and

FIG. 6 depicts a block diagram for a microscope system, consistent with an embodiment of the present disclosure.

While the present disclosure is amenable to various modifications and alternative forms, specifics thereof have been shown by way of example in the drawings and will be described in further detail.
It should be understood, however, that the intention is not to limit the disclosure to the particular embodiments described. On the contrary, the intention is to cover all modifications, equivalents, and alternatives falling within the spirit and scope of the present disclosure.

Detailed Description

The present disclosure is believed to be applicable to a variety of different types of devices and processes, and the present disclosure has been found to be particularly suited for epifluorescent imaging applications. While the present disclosure is not necessarily limited to such applications, various aspects of the present disclosure may be appreciated through a discussion of various examples using this context.
Consistent with certain example embodiments of the present disclosure, epifluorescent imaging is facilitated through the use of a microscope device and system. For instance, particular aspects of the device and/or system allow the use of ultra-low levels of excitation light, which are used to generate epi-fluorescence in a target object or cell. Some aspects allow for imaging of a large field of view with a high resolution. Still further aspects are directed toward the high-speed capture of images, which can be viewed in real time or near real time. While these points of facilitation are not limiting, they are relevant to a number of different embodiments of the present disclosure.

A particular aspect relates to the proximity between an optical source of excitation light and the target object or cell for imaging. For epifluorescent imaging, the interaction between the excitation light and the target object causes the generation of imaging fluorescence. The excitation light is directed toward the target object and has a specific wavelength configured for absorption by fluorophores, fluorescent markers or fluorescent probes. The fluorophores then emit light at different (e.g., longer) wavelengths. The amount of absorbed light is related to the amount of excitation light delivered to the target object. In this manner, the amount of fluorescence generated is correlated to the amount of excitation light. Although various light delivery mechanisms can help reduce the attenuation of light as it travels through a medium, the attenuation of light will increase as the distance of travel through a medium increases. Also, when using air and other media, the composition of the medium and other dispersive attributes can play significant roles in the delivery and/or attenuation of the light, whereas the reduction of the optical path length (mainly resulting in the reduction of travel of light through air) contributes little to decreasing attenuation. The design of the microscope device and system allows for the placement of the optical source of the excitation light in close proximity to the target object, thereby facilitating the use of a short optical path. This is particularly useful for facilitating the use of an optical source of low power and/or capturing images using low levels of light.

Various fluorescence sources can be used consistent with one or more embodiments discussed herein. The mention of a particular source of fluorescence does not necessarily preclude use of other sources of fluorescence (e.g., genetically-encoded fluorescent proteins, such as GFP, GCaMP, and variants thereof).

Other aspects of the present disclosure relate to the integration of optics, filters, and camera into a single housing, which can be particularly useful for the elimination of the fiber-bundle and all of its associated limitations.
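As a back-of-the-envelope illustration of the attenuation argument above (this sketch and its attenuation coefficient are illustrative assumptions, not figures from the disclosure), the excitation power surviving a path through an attenuating medium can be modeled with a simple exponential (Beer-Lambert-style) falloff, which is why a short optical path lets a low-power source deliver adequate excitation:

    import math

    def delivered_power_mw(source_mw, path_mm, mu_per_mm):
        # Beer-Lambert-style exponential falloff: longer travel through an
        # attenuating medium leaves less excitation power at the target.
        return source_mw * math.exp(-mu_per_mm * path_mm)

    # Illustrative (assumed) attenuation coefficient; not a value from the disclosure.
    MU = 0.06  # per mm
    for path_mm in (5.0, 15.0, 30.0):
        p = delivered_power_mw(6.0, path_mm, MU)
        print(f"{path_mm:4.1f} mm path -> {p:.2f} mW delivered from a 6 mW source")

Under these assumed numbers, shortening the optical path noticeably raises the power delivered to the target for the same source, consistent with the emphasis on placing the source close to the target.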
Yet other aspects relate to the proximity of a target object or cell relative to an image sensor for capturing image data from epifluorescent light. Image resolution and imaging times are related to the amount of epifluorescent light that can be collected and detected by an image sensor. Attenuation of the epifluorescent light due to properties of the optical path between the target object and the image sensor can be undesirable. Careful design of the microscope device and system allows for placement of the image sensor in close proximity to the target object, thereby facilitating the use of a short optical path.

Also in accordance with the present disclosure, the proximity of an objective lens of a microscope device is set relative to a target object during imaging of the target object. Large distances between an objective lens and the target object can have a detrimental effect on the amount of the excitation light received at the target object as well as the amount of fluorescence received at and collected by the objective lens. Accordingly, setting the proximity of the objective lens relative to the target object can be advantageous.

Embodiments of the present disclosure relate to a microscope device and system that captures image data for a relatively large field of view, the image data providing high resolution of a target object. One such embodiment of the present disclosure includes an image capture circuit, with an array of sensor elements or pixels, which is provided to image the field of view. The sensor elements detect epi-fluorescence for different portions of the field of view. The sensor elements can be configured with sufficient sensitivity and proximity to the target object to facilitate image capture and generation.

Other embodiments of the present disclosure relate to the length of exposure times for image capture. As fluorophores are excited, they can begin to lose their ability to fluoresce, which is sometimes referred to as photobleaching. Moreover, epi-fluorescence imaging involves the absorption of the excitation light by the target object. Some of this absorbed light is converted into heat. This generated heat can place limits on the exposure time, e.g., the heating of biological material/cells can cause cell damage and even death. The exposure time, however, can be increased if the intensity of excitation light is decreased. The intensity of the excitation light can be reduced if, for example, the optical coupling between the target object and the image sensor is improved. Phototoxic effects can be more damaging than localized heating. Aspects of the present disclosure lessen or eliminate these effects, which adversely impact image capture and related processing of the data.

Particular embodiments of the present disclosure relate to the adjustment of excitation light intensity in conjunction with the adjustment of exposure time to improve image quality or to image for a particular goal (e.g., image capture rate, resolution, field-of-view size or imaging depth).

According to other aspects of the present disclosure, relatively low optical zooms are used in connection with high-resolution imaging of a field of view for target objects of small size. Constraints on the optical zoom required for a particular level of imaging can be lessened through the careful design and application of a microscope device and system consistent with various aspects discussed herein.
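The exposure-time tradeoff described above (lower excitation intensity requires a longer exposure for the same collected signal, while reducing photobleaching and heating rates) can be summarized with a constant-photon-dose estimate. The sketch below is only illustrative; the target signal and photon-yield figures are assumptions, not values from the disclosure.

    def exposure_for_target_signal(target_photons, intensity_mw_per_mm2, yield_photons_per_mw_s):
        """Exposure time (s) needed to collect a target fluorescence signal, assuming
        collected photons scale linearly with excitation intensity x exposure time."""
        rate = intensity_mw_per_mm2 * yield_photons_per_mw_s  # photons per second
        return target_photons / rate

    # Assumed numbers for illustration only.
    TARGET = 1.0e4          # photons per pixel for an acceptable image (assumed)
    YIELD = 2.0e5           # collected photons per (mW/mm^2) per second (assumed)
    for intensity in (1.0, 0.5, 0.25):   # mW/mm^2
        t = exposure_for_target_signal(TARGET, intensity, YIELD)
        print(f"intensity {intensity:.2f} mW/mm^2 -> exposure {t*1e3:.1f} ms per frame")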
Embodiments of the present disclosure relate to the real-time imaging of target objects using a microscope device and/or system consistent with aspects discussed herein. In certain of these embodiments, the imaging rate is increased by reducing the field of view while holding a constant resolution, the image capture time is reduced by reducing the exposure time, and/or the frame rate achievable for such real-time imaging is correlated to the size of the full field of view as well as the desired image resolution. Another factor optionally implemented therewith includes the type and responsiveness of the image sensor that is used. Still other factors relate to the ability to transmit and process the image data for display, should it be desirable to view the images in real time.

Still other embodiments of the present disclosure relate to the facilitation of in vivo or in vitro epifluorescent imaging. For instance, in vivo imaging of a live subject can be particularly useful for correlating external stimuli and other factors with the captured images. This correlation can be used, for example, as a diagnostic/research tool by associating properties of the captured images with the external stimuli. Real-time imaging at high frame rates can further provide such correlation as a function of time.

An embodiment of the present disclosure is directed toward a microscope device and/or system having a modular design that facilitates detaching and reattaching various components of the microscope device. The detachment and reattachment can be used to replace the modular components with new and/or different modular components. For instance, the light source can be replaced with a new light source having the same or different optical and electrical properties. The array of optical sensors and/or the optical direction elements (e.g., mirrors, filters and lenses) can also be removed and replaced. If desired, the optical sensor can also be removed and replaced.

In certain other embodiments consistent with the instant disclosure, one or more of the imaging devices includes a synchronization circuit for interfacing to an external optical data processing (recording and/or configuring) system. The synchronization circuit includes logic circuitry (e.g., a programmable or semi-programmable chip such as a microcontroller or ASIC) that is configured and arranged to communicate a frame reference/active signal. In a typical application, a frame active signal would provide synchronization information, e.g., as defined in an IEEE communications standard, for and with the data communicated between the imaging device and the external system. Such an optical-data recording/configuring system can be used to install software, configure set-up parameters for experiments and procedures, provide visual feedback during such experiments and procedures, and record the optical data for manipulation and further study.

In yet further embodiments, the instant disclosure is directed to methods of using the imaging devices which are described herein. Certain of the devices include a base plate acting as a foundational structure which provides support/stability and also allows for microscope (re)alignment. These methods include the steps of attaching and reattaching the epifluorescence microscope to the base plate, allowing the microscope alignment to be precise. Such precision should be sufficient for repeated imaging of a common imaging location, e.g., during chronic experiments.
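Referring back to the real-time imaging discussion above, the achievable frame rate is, to first order, set by the sensor's fixed pixel readout rate, so reducing the field of view at constant resolution raises the frame rate roughly in proportion to the reduction in pixels read. A rough sketch using the 640 x 480 pixel, ~36 Hz full-frame figures reported later in this document; the inverse-scaling model is a simplification that ignores per-row and per-frame overheads.

    def max_frame_rate_hz(full_rows, full_cols, full_frame_hz, roi_rows, roi_cols):
        # Assume a fixed pixel readout rate, so frame rate scales inversely with
        # the number of pixels read out per frame (a simplification).
        pixel_rate = full_rows * full_cols * full_frame_hz
        return pixel_rate / (roi_rows * roi_cols)

    full = max_frame_rate_hz(480, 640, 36.0, 480, 640)
    roi = max_frame_rate_hz(480, 640, 36.0, 240, 320)   # quarter-area region of interest
    print(f"full frame: {full:.0f} Hz, quarter-area ROI: {roi:.0f} Hz")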
Turning now to the figures, FIG. 1 depicts a block diagram of an epifluorescence microscope device, consistent with an embodiment of the present disclosure. The epifluorescence microscope device 100 includes a number of components within the dimensions 120 and 122. Not shown is a further dimension, which extends perpendicular to the dimensions 120 and 122. Although not necessarily limited thereto, each of these dimensions can be less than an inch. Consistent with other embodiments, the dimensions are slightly larger, e.g., on the order of a few centimeters.

The epifluorescence microscope device 100 includes an optical source 102. This optical source 102 generates excitation light 104. In a particular implementation, the optical source 102 is a light-emitting-diode (LED) or an organic light-emitting-diode (OLED). The excitation light 104 is directed by an optical arrangement 124 to a target object 114, for imaging thereof. The optical arrangement can include one or more of objective lens 112, (dichroic) mirror 110, excitation filter 108 and an emission filter (not depicted). Epifluorescent light 116 from the target object 114 is directed from/by the objective lens to an image capture circuit 118. The epifluorescence microscope device 100 is configured to direct light from and capture image data for a field of view 126.

In various embodiments of the present disclosure, the microscope device 100 can also include one or more of an image-focusing optical element (e.g., an achromatic lens) and an emission filter. These and other elements can help control optical properties of the microscope device 100.
Consistent with one embodiment, the depicted elements are each integrated into a relatively small area, e.g., within a single housing having dimensions 120, 122. Such integration of the various components can be particularly useful for reducing the length of the optical pathway from the optical source 102 to the target object 114 and back to the image capture circuit 118. The reduction of this optical pathway can be part of the configuration parameters that facilitate a number of different properties and capabilities of the microscope device 100. For example, in certain embodiments the microscope can provide images with a resolution of 1 µm for an imaging field of view of up to 1 mm² in area.

A particular example embodiment is configured with an array of optical sensors 118. An optical arrangement 124 is configured to direct excitation light 104 of less than about 1 mW (various embodiments provide for a higher excitation power, e.g., 100 mW) to a target object 114 in a field of view 126 that is at least 0.5 mm², and to direct epi-fluorescence emission 116 caused by the excitation light 104 to the array of optical sensors 118. In various embodiments, the field of view 126 can be at least 1 mm². The optical arrangement 124 and array of optical sensors 118 are each configured sufficiently close to the target object 114 to provide at least 2.5 µm resolution for an image of the field of view 126. In other embodiments, the optical arrangement 124 and array of optical sensors 118 can be configured to provide at least 1 µm resolution. In certain embodiments, the excitation optical power at the specimen is variable and can be in the range of 100 µW-100 mW, depending upon the particular configuration and imaging constraints.

Consistent with an embodiment of the present disclosure, the optical source 102 can deliver light of up to 37 lumens or 6 mW. It is not, however, necessarily a requirement that the optical source 102 provide light of such intensity. Moreover, the amount of light received by the target object is less than (relative to an attenuation factor) the amount of light provided by the optical source 102. For instance, the attenuation of one embodiment results in 6 mW at the light source corresponding to 1 mW excitation power delivered at the target object. Similarly, to deliver 100 mW of excitation power at the specimen the light source can be configured to provide up to 600 mW.

Although FIG. 1 depicts the various components as being within the dimensions 120, 122, other embodiments are possible. For instance, FIG. 2 depicts a block diagram of an epifluorescence microscope device with an external optical source, consistent with an embodiment of the present disclosure. The epifluorescence microscope device 200 includes an external optical source 214. This external optical source 214 is coupled to an optical arrangement 250 that includes a number of components within the dimensions 216 and 218.
Not shown is a further dimension, which extends perpendicular to the dimensions 216 and 218. Although not necessarily limited thereto, each of these dimensions can be less than an inch. Consistent with other embodiments, the dimensions are on the order of a few centimeters.

Consistent with one embodiment of the present disclosure, the external optical source 214 is coupled to the optical arrangement 250 via a fiber optic cable 212. Excitation light from the external optical source 214 and the fiber optic cable 212 passes through (optional) excitation filter 208. A (dichroic) mirror 204 and objective lens 206 direct the excitation light to the target object 210. In particular, the excitation light is directed toward field of view 220. The excitation light causes fluorophores in the target object 210 to fluoresce with epifluorescent light. This epifluorescent light is directed by (dichroic) mirror 204 and objective lens 206 to optical sensor 202.

In various embodiments of the present disclosure, the microscope device 200 can also include one or more of an image-focusing optical element (e.g., an achromatic lens) and an emission filter in the imaging pathway. These and other elements (not shown in FIG. 2) can help control optical properties of the microscope device 200.

Although the optical source 214 is not located in close proximity to the optical arrangement 250, the amount of excitation light that is delivered to the target object 210 can still be set at a low level due to the proximity between the target object 210, the objective lens 206 and/or the optical sensor 202. In particular, this close proximity can be particularly useful for providing efficient optical coupling between the target object and the optical sensor. Thus, the epi-fluorescence can be of a lower intensity relative to the image properties. Moreover, a lower level of excitation intensity at the target object 210 can allow for longer exposure to the excitation light before photobleaching, heating or other adverse effects become a factor.

The following discussion provides details of an experimental embodiment. Although the experimental embodiment provides examples and details regarding various parameters and results, these aspects are not necessarily limiting to the various other embodiments of the present disclosure. The experimental embodiment was configured and arranged to provide a small epi-fluorescence microscope. The microscope included a specially-integrated arrangement that integrated the light source, optics, filters, and camera into a single housing.

The level of integration and the resulting size scale for the miniature fluorescence microscopes can be configured for use in a multitude of applications. A particularly challenging application relates to in vivo brain imaging, e.g., in a mouse or similar organism. In at least one such application, the microscope is designed to be mounted on the head of a mouse for in vivo brain imaging during awake behavior. In order to be configured for this and other applications, the microscope was designed with stringent physical size and mass requirements, e.g., so as to be easily borne by the mouse during awake and active behavior. For instance, given that an adult mouse is approximately 25 g in mass, the microscope was designed to be 3 g or less. Other design considerations revolved around the image quality, reliability and speed.
One embodiment was configured for high-speed, cellular-level brain imaging. The cost and simplicity of large-scale manufacturing was another factor in the design of the fluorescent microscope. Particular embodiments were configured and designed as an integrated device that was mass-producible at low cost (e.g., scalable and amenable to mass production).

FIG. 3 shows a cross-section of a miniature fluorescence microscope that was designed consistent with such considerations and other embodiments of the present disclosure. The vertical arrows denote the excitation (down arrow) and emission (up arrow) pathways. A single housing 300 contains the light source 314 and image capture circuit 302, as well as a fluorescence filter set (emission filter 306 and excitation filter 316) and micro-optics (collector lens 312, dichroic mirror 310, achromatic lens 308, objective lens 318 and focusing mechanism 304). This integration of the light source and camera with the filter set and microscope optics facilitates high-resolution image capture in various applications, such as in vivo imaging.

Consistent with one embodiment, a solid-state light-emitting-diode (LED), which is small, amenable to integration with collection optics, and mass-producible at low cost, is used for the excitation light source. A Complementary-Metal-Oxide-Semiconductor (CMOS) image sensor is used for the camera.

In a particular experimental embodiment of the present disclosure, the LED light source shown in FIG. 3 can be implemented using a blue LED 314 mounted on a custom 6 mm X 6 mm printed circuit board (PCB), which also includes a heatsink. A drum micro-lens 312 is used to collect illumination, which then passes through a 4 mm X 4 mm excitation filter 316, deflects off a dichroic mirror 310, and enters the imaging pathway. A gradient refractive index (GRIN) objective micro-lens 318 focuses illumination onto the sample. Fluorescence emissions from the sample return through the objective 318, the dichroic 310, a 4 mm X 4 mm emission filter 306, and an achromatic doublet tube lens 308 that focuses the image onto the CMOS image sensor 302 (640 X 480 pixels), mounted on an 8.4 mm X 8.4 mm PCB with power and signal conditioning electronics. The LED light source, CMOS camera, and the optical components are integrated into a microscope housing 300 with a modular design that permits individual components, such as the excitation LED and CMOS camera chip, to be replaced for different application needs. Moreover, a memory circuit can be integrated to store image data. The modular aspect allows the memory circuit to be removed and replaced without removing the microscope from the imaging target (e.g., the microscope could remain affixed to an organism). Thus, captured images are stored locally and then retrieved by removal of the memory circuit, which can be configured to interface with an external device, such as a laptop computer.

In example embodiments, the microscope housing is fabricated using Polyetheretherketone (PEEK) and has built-in mechanical image focusing capabilities permitting focusing to sub-micron accuracy by adjustment of the camera position. Other materials (e.g., bio-compatible and solvent-resistant materials) can also be used consistent with various desired applications.
The microscope can be plugged into a computer via external data acquisition PCBs, with a standard USB interface, providing real-time image acquisition, display, and camera and light source control.

Embodiments of the present disclosure are directed toward the design and control of the imaging pathway of an epifluorescence microscope. The imaging pathway includes an objective lens along with other optical conditioning and directing components. The additional components can include, for example, spectral-filtering components and/or an achromatic doublet imaging tube lens.

FIG. 4 depicts an objective lens and ray propagation therein, consistent with an embodiment of the present disclosure. In a particular embodiment, the objective lens 402 is a GRIN objective lens. The GRIN objective lens is a cylindrical lens with a radially decreasing refractive index profile that results in rays 406, originating from target object 404, propagating in a sinusoidal path, as shown in FIG. 4. A GRIN lens can be particularly useful due to its small form factor and ease of integration with other micro-optics and/or for reducing optical path length relative to other types of objective lenses.

In one experimental embodiment of the present disclosure, the GRIN objective lens used to collect fluorescence emission from the specimen is 2 mm in diameter with a pitch length of 0.245. A pitch length of 1 corresponds to one full sinusoidal path of ray propagation; thus a pitch length of 0.245 results in light rays that are close to being considered collimated light rays, as shown in FIG. 4. The objective numerical aperture (NA) is 0.45. Collected fluorescence emission is passed through the dichroic mirror and the miniature emission filter, and the fluorescence image is then focused by an achromatic lens, with a focal length of 15 mm, onto the CMOS image sensor.

FIG. 5 depicts an optical ray trace diagram of an imaging pathway with the two lens elements and the additional spectral-filtering components, consistent with an embodiment of the present disclosure. The rays show how points in the specimen plane are imaged onto the CMOS camera. Light rays (502, 504, 506, 508, 510) are traced from five distinct point sources in the specimen plane to imaged points on the CMOS camera. The design of the imaging pathway and the optical ray trace simulations were performed using software modeling. The light rays emanating from target object 512 pass through the GRIN objective lens 514. The GRIN objective lens 514 collimates the light rays. The light rays are then directed by dichroic mirror 516 toward achromatic lens 518. Emission filter 520 filters out undesired light wavelengths, such as reflected excitation light. The light rays then strike sensor array/camera 522, where they are recorded and used to generate an image of the target object 512.

The optical magnification provided by the imaging pathway and optical elements can be configured according to the desired application. Moreover, the need for optical magnification can be offset by the proximity of the objective lens to the target object as well as the proximity between the target object, objective lens and the image capture circuit, resulting in embodiments where low optical magnifications (1-4x) can permit imaging large specimen fields-of-view greater than 1 mm² while still providing high spatial resolution of at least 1 µm.
Consistent with experiments and related embodiments, the microscope optical magnification range is between 4.5-5.5x. The working distance, that is, the distance from the near surface of the objective to the point in the specimen plane that is in focus, is about 150-200 µm or about 50-250 µm (these dimensions can depend on the exact positioning of the focal plane). The performance of an optical design can be evaluated by its resolving capabilities, and one measure of this is the full-width-half-maximum (FWHM) of the optical point-spread function. The on-axis, lateral spatial resolution of the imaging pathway computed in this manner was approximately 1.2 µm, degrading to approximately 1.6 µm at the periphery of the field-of-view. This measurement, however, is not necessarily limiting, as the spatial resolution achievable is also a function of various factors including, but not limited to, the camera pixel size.
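As a side note on the GRIN objective described above, the near-collimation at a 0.245 pitch can be checked with the paraxial ray equation for a gradient-index rod, in which the ray height follows a sinusoid and a pitch of 1 corresponds to one full period. The sketch below is a generic paraxial model; the rod length and entry slope are assumed values for illustration, not parameters taken from the disclosure.

    import math

    def grin_exit_ray(r0_mm, slope0, pitch, length_mm):
        """Paraxial ray at the exit face of a GRIN rod.

        Inside the rod the ray height obeys r'' = -g^2 * r, so it traces a
        sinusoid; a pitch of 1 is one full period, so the accumulated phase is
        2*pi*pitch and g = phase / length."""
        phase = 2.0 * math.pi * pitch
        g = phase / length_mm
        r = r0_mm * math.cos(phase) + (slope0 / g) * math.sin(phase)
        slope = -r0_mm * g * math.sin(phase) + slope0 * math.cos(phase)
        return r, slope

    # Ray leaving an on-axis point at the front face with an (assumed) slope of
    # 0.2 inside the rod; the 4 mm rod length is also an assumption.
    for pitch in (0.25, 0.245):
        r, s = grin_exit_ray(0.0, 0.2, pitch, length_mm=4.0)
        print(f"pitch {pitch}: exit height {r:.3f} mm, exit slope {s:+.4f} (0 = collimated)")

A quarter-pitch (0.25) rod fully collimates rays from a point on its front face; at 0.245 the residual exit slope is only a few percent of the entry slope, which is why the text describes the rays as close to collimated.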
Aspects of the present disclosure relate to properties of the illumination pathway between the target object, the excitation source and the image sensors. For instance, careful design of the illumination pathway can provide efficient and uniform excitation of the specimen under observation. The coupling of the excitation light source to the illumination pathway can be useful for providing sufficient and well-controlled illumination to excite the specimen. In one experimental implementation, a blue LED with the spectral peak of illumination at around 470 nm was used as the excitation light source. The LED was mounted on a 6 mm X 6 mm PCB that was equipped with a heat sink. The heat sink helps to keep the LED junction temperature stable during operation.

LED illumination output is, to first order, linear with drive current only over a local region (the actual transfer function is a curve). The output also exhibits temperature dependence. The experimental results showed that drive currents of 20-30 mA were sufficient to deliver the required illumination power at the specimen. This drive current was approximately one fiftieth (1/50) of the maximum rating for the drive current of the LED (e.g., the maximum drive current is 1 A and typical drive currents are 20 mA). For a given drive current, the LED junction generally reached an equilibrium temperature approximately 60 s after LED turn-on, and the LED illumination output stabilized. In certain embodiments, the LED light output can be stabilized in real time over temperature variations via intrinsic or external temperature measurement coupled with a feed-forward or feedback system. For instance, data received from a temperature sensor (e.g., a temperature-sensitive resistor or temperature-sensing diode) and/or current sensor can be used to control the amount of electrical power provided to the LED. In certain embodiments, a control circuit for providing such control can be calibrated during manufacturing or at a point thereafter.

Consistent with an experimental embodiment, the LED illumination is collected by a drum lens, passed through a miniature fluorescence excitation filter, and then reflected off a dichroic mirror that directs the illumination into the GRIN objective lens and to the specimen. The system was designed for collection and delivery of light to the specimen to achieve spatially uniform, homogeneous illumination at an average optical power density across the specimen field-of-view. This can be accomplished by approximating Kohler illumination. In Kohler illumination, the light source and the specimen planes are on separate sets of conjugate planes, ensuring that the light source is not imaged onto the specimen, and yielding even illumination of the specimen at an average optical power density.
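The LED stabilization approach mentioned above (a temperature or output measurement feeding back onto the drive current) can be sketched as a simple feedback loop. The hardware-access functions below are hypothetical placeholders, and the gain, setpoint and simulated warm-up behavior are assumptions; only the control structure is meant to be illustrative.

    def stabilize_led(read_feedback, set_current_ma, setpoint, start_ma=25.0,
                      gain_ma_per_unit=5.0, steps=100, min_ma=5.0, max_ma=30.0):
        """Minimal integral-style feedback loop: nudge the LED drive current so the
        measured feedback signal (e.g., a normalized photodiode or temperature
        reading) tracks the setpoint. read_feedback() and set_current_ma() are
        hypothetical hardware hooks supplied by the caller."""
        current = start_ma
        for _ in range(steps):
            error = setpoint - read_feedback()
            current = min(max_ma, max(min_ma, current + gain_ma_per_unit * error))
            set_current_ma(current)
        return current

    # Toy simulation standing in for real hardware: output sags as the LED warms up.
    state = {"current": 25.0, "warmup": 0.0}
    def fake_read():
        state["warmup"] = min(1.0, state["warmup"] + 0.02)           # junction heating
        return state["current"] / 25.0 * (1.0 - 0.10 * state["warmup"])
    def fake_set(ma):
        state["current"] = ma

    final = stabilize_led(fake_read, fake_set, setpoint=1.0)
    print(f"drive current settled near {final:.1f} mA")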
According to an experimental embodiment, the fluorescence filter set is configured to separate the excitation illumination from the fluorescence emission. The filter set includes three parts: the excitation filter, dichroic mirror, and emission filter. The spectral profiles of the filters and dichroic were configured to allow blue excitation and green emission. These spectral profiles are amenable to imaging a broad palette of synthetic fluorescent probes, such as fluorescein and its reactive derivatives, as well as genetically encoded fluorescent proteins, such as the green fluorescent protein (GFP). For a particular experimental implementation, the specific spectral characteristics and dimensions of the filter set were as follows. The excitation filter was a bandpass filter with a spectrum of 480/40 nm and a dimension of 4 mm X 4 mm X 1.05 mm, the emission filter was also a bandpass filter with a spectrum of 535/50 nm and a similar dimension of 4 mm X 4 mm X 1.05 mm, and the dichroic mirror had a long-pass spectral profile, passing wavelengths above 506 nm, and with a dimension of 4 mm X 4.8 mm X 1.05 mm. In other embodiments, the filter set can be configured to permit multiple-wavelength excitation for excitation and imaging of multiple fluorescent markers with different excitation/emission spectra.

Embodiments of the present disclosure are directed toward the use of a CMOS image sensor. CMOS image sensors are digital imaging sensors that are designed and fabricated in CMOS. This can be particularly useful for providing image sensors that can be mass-produced at low costs. Moreover, the use of CMOS technology can be useful for providing a solution that operates at both low power and at high speed. The CMOS image sensors can be implemented with digital pixels, where conversion from photons to bits is done directly at the pixel level with a per-pixel analog-to-digital converter and dynamic memory. This can be particularly useful for high-speed imaging applications and the implementation of still and video rate imaging applications that benefit from high-speed capture, such as dynamic range enhancement.

In a particular implementation a CMOS image sensor was used that had a resolution of 640 X 480 pixels, each pixel having dimensions of 5.6 µm X 5.6 µm. The CMOS image sensor was packaged in a 5.6 mm X 5.8 mm chip-scale package. The sensor output was in a serialized digital low-voltage differential signaling (LVDS) format. Such an LVDS format is particularly useful for facilitating interfacing with a minimum number of interconnects, which can be an important consideration for minimizing the number of wires attached to the microscope.
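For the filter set described above (480/40 nm excitation, 506 nm long-pass dichroic, 535/50 nm emission), a quick bookkeeping check confirms that the two pass bands do not overlap and sit on opposite sides of the dichroic cut-on. The sketch below simply encodes the usual center/bandwidth convention; it is an aid to reading the specification, not part of the disclosure.

    def bandpass(center_nm, width_nm):
        # "480/40" style specification: center wavelength and full bandwidth.
        return (center_nm - width_nm / 2.0, center_nm + width_nm / 2.0)

    excitation = bandpass(480, 40)      # 460-500 nm, blue excitation
    emission = bandpass(535, 50)        # 510-560 nm, green emission
    dichroic_cut_on = 506               # long-pass: reflects below, transmits above

    overlap = max(0.0, min(excitation[1], emission[1]) - max(excitation[0], emission[0]))
    print(f"excitation band {excitation} nm, emission band {emission} nm")
    print(f"band overlap: {overlap} nm")
    print(f"excitation reflected by dichroic: {excitation[1] <= dichroic_cut_on}")
    print(f"emission transmitted by dichroic: {emission[0] >= dichroic_cut_on}")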
Experimental characterizations of the sensor, shown in Table 1, are briefly described as follows. Pixel read noise was estimated by calculating the standard deviation of pixel intensity in 1000 image frames, acquired in full darkness and with sufficiently brief exposure such that the noise contribution from dark current shot noise was insignificant. Dark current, and dark signal non-uniformity (DSNU), the variation in dark current across the array of pixels due to device mismatches, were estimated by capturing 1000 frames in the dark with sufficiently long exposure times, and then averaging the frames into a single image, with the objective of ideally averaging out temporal noise. Dark current and dark signal non-uniformity were then found from the mean and standard deviation of the pixels in the averaged image. With these experimentally-characterized sensor specifications, and other known electronic properties of the sensor, the CMOS image sensor was analytically modeled to estimate imaging fidelity for a range of incident photon flux densities.

Table 1
Package size: 5.6 X 5.8 mm²
Array size: 640 X 480 pixels
Pixel size: 5.6 X 5.6 µm²
Frame rate: 36 fps
Pixel read noise: 10 e-
Dark current (room temp.): 900 e-/s
Dark signal non-uniformity: 30 e-/s
Full well capacity: 52,000 e-

The experimental results are illustrative and not meant to be limiting. For instance, the frame rate/image capture speed of Table 1 (36 Hz) is to be understood in the context of the specific experimental parameters. For instance, the captured field of view (FOV) was at least 0.5 mm², although it could be up to 1 mm² or even more. Smaller FOVs would allow for higher frame rates (e.g., 370 µm X 370 µm at 100 Hz).

One application consistent with embodiments of the present disclosure relates to in vivo mouse brain imaging experiments. Since photon flux densities incident on the sensor plane for typical in vivo mouse brain imaging experiments are on the order of 10¹¹ photons/cm²/sec, which corresponds to 20,000 electrons/pixel/sec, the CMOS image sensor operates in the photon shot noise limited regime for in vivo mouse brain imaging experiments. Thus, the CMOS image sensor's pixel read noise and dark current numbers, relevant considerations for applications where imaging is performed in low-light conditions, have a negligible impact on imaging fidelity. Along with an estimated sensor dynamic range of 60 dB, which is believed to be more than sufficient for capturing the range of signal intensities observed in in vivo brain imaging datasets, the imaging performance metrics of the CMOS image sensor were shown to be well-suited to serving the application needs.
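The characterization procedure described above (read noise from the per-pixel standard deviation over many brief dark frames; dark current and DSNU from the mean and pixel-to-pixel spread of an averaged long-exposure dark stack) can be summarized as follows, together with a rough shot-noise check at the reported signal level of about 20,000 electrons/pixel/sec. The synthetic frames and the numpy-based estimators below are a generic illustration under assumed parameters, not the exact analysis used for Table 1.

    import numpy as np

    rng = np.random.default_rng(0)

    def read_noise_e(short_dark_stack):
        # Std-dev over many brief dark frames, per pixel, then averaged: read noise.
        return float(np.mean(np.std(short_dark_stack, axis=0)))

    def dark_current_and_dsnu(long_dark_stack, exposure_s):
        # Average the stack to suppress temporal noise, then take the mean (dark
        # current) and pixel-to-pixel std-dev (DSNU), both per second of exposure.
        avg = np.mean(long_dark_stack, axis=0)
        return float(np.mean(avg) / exposure_s), float(np.std(avg) / exposure_s)

    # Synthetic stacks generated from the Table 1 figures (10 e- read noise,
    # 900 e-/s dark current, 30 e-/s DSNU) purely to exercise the estimators.
    shape = (1000, 64, 64)
    short_stack = rng.normal(0.0, 10.0, shape)
    per_pixel_dark = rng.normal(900.0, 30.0, shape[1:])
    long_stack = per_pixel_dark + rng.normal(0.0, 10.0, shape)     # 1 s exposure

    print(f"read noise ~ {read_noise_e(short_stack):.1f} e-")
    i_dark, dsnu = dark_current_and_dsnu(long_stack, exposure_s=1.0)
    print(f"dark current ~ {i_dark:.0f} e-/s, DSNU ~ {dsnu:.0f} e-/s")

    # Shot-noise check at ~20,000 e-/pixel/s with an assumed 30 ms exposure.
    signal = 20000 * 0.03
    snr = signal / np.sqrt(signal + 10.0**2 + 900.0 * 0.03)
    print(f"SNR at 20,000 e-/s, 30 ms exposure: {snr:.1f} (shot-noise term dominates)")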
Embodiments of the present disclosure relate to communication of image data, control signals and/or power to the microscope device. For many applications, the intrusiveness of the microscope is a relevant consideration. This aspect can be adversely affected by the number of wires used to provide the communication and/or power to the microscope device. Accordingly, various aspects of the present disclosure are directed toward reducing the number of wires between the microscope and an external system, which can provide control and/or image storage and processing functions. Consistent with a particular experimental implementation, a two-wire I2C interface is used to communicate control information with the microscope device. The I2C interface defines the wires as SCLK and SDATA and communicates using a serial interface, thereby providing a low wire-count solution; an illustrative control transaction is sketched below. In certain embodiments, an additional rotational element (e.g., a commutator) can be used to facilitate movement and to lessen or eliminate torsional strain on the connection wires. Various other protocols and communication solutions are possible.
Consistent with a particular embodiment of the present disclosure, the input power supply is stepped down and regulated by a low-dropout voltage regulator (LDO) before being delivered to the image sensor. An input clock signal (162 MHz) is transmitted to and restored by a clock buffer before being sent to the image sensor. The received clock signal is then used to internally generate a 27 MHz master clock signal. The image data output of the sensor is in a 10-bit digitized format and transmitted over a two-wire serial LVDS protocol. The present disclosure, however, is not necessarily limited to any particular communication protocol or power providing mechanism.
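By way of illustration only, a two-wire I2C control transaction of the kind described above might look as follows on a host running Linux. The bus number, device address, and register map are hypothetical placeholders, not values taken from this disclosure; the sketch uses the smbus2 Python package.

```python
from smbus2 import SMBus

I2C_BUS = 1          # hypothetical host I2C bus number
SENSOR_ADDR = 0x36   # hypothetical 7-bit device address of the image sensor
REG_EXPOSURE = 0x10  # hypothetical register: coarse exposure setting
REG_GAIN = 0x11      # hypothetical register: analog gain setting

def configure_sensor(exposure_code: int, gain_code: int) -> None:
    """Write exposure and gain settings over the two-wire (SCLK/SDATA) interface."""
    with SMBus(I2C_BUS) as bus:
        bus.write_byte_data(SENSOR_ADDR, REG_EXPOSURE, exposure_code & 0xFF)
        bus.write_byte_data(SENSOR_ADDR, REG_GAIN, gain_code & 0xFF)

def read_register(reg: int) -> int:
    """Read back a single 8-bit register for verification."""
    with SMBus(I2C_BUS) as bus:
        return bus.read_byte_data(SENSOR_ADDR, reg)

if __name__ == "__main__":
    configure_sensor(exposure_code=0x40, gain_code=0x08)
    print(hex(read_register(REG_EXPOSURE)))
```

Only two signal wires (plus power) are needed for control, which is the property relied upon here to keep the tether thin and flexible.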
FIG. 6 depicts a block diagram for a microscope system, consistent with an embodiment of the present disclosure. Two of the electronically active components of the microscope 600 include the optical excitation source 602 and the sensor array 604. In certain embodiments, the microscope 600 receives power and control signals from an external interface module 650. This allows various circuits and components (e.g., power supply, memory storage and/or image processing) to be located remote from the microscope. Interface module 650 can be designed to function as a stand-alone component or for connection with another device, such as a computer.
In certain embodiments, interface module 650 is configured to provide microscope data acquisition and control and is external to the microscope imaging device. In other embodiments, interface module 650 (with or without input/output (I/O) interface 616) can be integrated with the microscope device 600, e.g., for applications where the weight/size does not preclude such integration.
According to one embodiment of the present disclosure, the interface module 650 includes an input/output (I/O) interface 606 (a transmitter/receiver/transceiver circuit). This I/O interface 606 can be used to provide power and control and to transmit image data to and from the microscope 600. For instance, power can be provided from one or more power regulators 610; control signals can be provided from a control interface 614; driver signals 608 can be provided for powering the optical excitation source 602; and image data can be communicated to an (image) data processing unit or circuit 612. Accordingly, microscope 600 can also be configured with one or more transmitter/receiver/transceiver circuits to allow for communication with the interface module 650.
In one embodiment of the present disclosure, I/O interface 606 is connected to the microscope 600 using wired connections. The wired connections can transmit power and communication signals using any number of different protocols. Particular applications (e.g., in vivo imaging of active organisms) benefit from the wired connection being light, flexible and otherwise amenable to movement by the object of the imaging. Thus, certain embodiments implement communication protocols and solutions with low pin/wire counts.
Consistent with other embodiments of the present disclosure, I/O interface 606 is designed to use wireless communications. Wireless control of the microscopy imaging device and wireless data transfer can be particularly useful when several moving imaging objects are being imaged in parallel in sufficiently close proximity to each other. In one, non-limiting, instance, I/O interface 606 can use magnetic field induction, such as near-field communications derived from ISO/IEC 14443. Near-field communications also allow for power to be provided to the microscope wirelessly, e.g., through inductive coupling. Other wireless communication protocols and solutions are also possible.
Consistent with various embodiments, the interface module 650 is designed with an input/output (I/O) interface 616 that interfaces with another device, such as a laptop/desktop computer. This I/O interface 616 could also include a display screen for presenting images captured from microscope 600. Consistent with certain embodiments, I/O interface 616 can be integrated as part of the interface module 650 or provided as a separate component (e.g., connected via wired or wireless communication links).
The various examples discussed herein for the I/O interfaces 606 and 616 are not limiting. The I/O interfaces can be custom designed or implemented consistent with existing communication protocols.
In certain embodiments, memory 618 can be used to store image data and/or software instructions for execution by data processing unit or circuit 612, which can be implemented using specialized processors (e.g., field-programmable gate arrays) or general-purpose microprocessors configured to execute the specialized software instructions. The memory 618 can include circuits providing non-volatile memory (e.g., flash) and/or volatile memory (e.g., volatile random access memory (RAM)).
A specific embodiment of the present disclosure is implemented using two printed circuit boards (PCBs) that are contained within the microscope 600. The first PCB 602 includes a light-emitting diode (LED). The second PCB 604 includes a complementary metal-oxide-semiconductor (CMOS) imaging/camera chip. These PCBs are both connected to a custom external system 650 via nine thin and flexible wires (2 wires to the LED PCB 602 and 7 to the camera PCB 604) that are encased in a single polyvinyl chloride (PVC) sheath of outer diameter 1.5 mm. The external system 650 interfaces with a computer via a general-purpose USB imaging data acquisition adapter. This configuration can be particularly useful for enabling real-time microscope control and data acquisition as well as immediate display of images.
An Inter-Integrated Circuit (I2C) serial communication interface is provided using an I2C controller 614. The I2C interface can be used to control the operation and function of the (CMOS) imaging/camera chip that is part of PCB 604. The image data output from the imaging/camera chip is serialized and transmitted according to a digital low-voltage differential signaling (LVDS) format.
Consistent with the various embodiments discussed herein, experimental fluorescence microscopes can be fabricated, assembled, and tested. The microscope fabrication, assembly, and testing processes can be implemented in a distributed and streamlined manner. Camera and LED PCBs can be fabricated separately, while lenses and filters are produced or procured independently. The microscope housing can be fabricated as a kit of individual parts to facilitate manufacturing thereof.
With or without imaging optics, the camera PCB can be tested for power, camera control, and the presence of valid output data. Testing of the LED PCB can include driving the LED while the illumination output is monitored. Once fully assembled, the microscope housing is designed to maintain the optical parts in alignment with the LED and camera PCBs. The microscope housing was made of black Polyetheretherketone (PEEK), which is lightweight, chemically resistant, stiff, and machinable. Although the black housing absorbed the majority of stray light, a thin layer of black felt or other absorbent material can be affixed (e.g., glued) in locations prone to light reflection. A threaded interface between the part of the housing holding the camera PCB and the microscope body is configured to provide fine adjustment of the spacing between the two for setting the specimen plane that is in focus in the acquired image. The modular nature of the microscope design permits removing and interchanging various parts as required, for example, camera and LED PCBs and the filter and dichroic set.
Experimental microscopes manufactured consistent with this method were tested for a variety of features. Table 2 depicts various specifications for the experimentally fabricated miniature fluorescence microscope used for in vivo imaging of the brain of an active mouse and without image alignment.

Table 2
Dimensions: 8.4 x 13 x 22 mm³
Mass: 2 g
Resolution: 2.5 µm
Field-of-view: 0.48 mm²
Photon flux: 3 x 10^11 ph/cm²/s
SNR: 37 dB
Imaging duration: 40-50 mins

Simulated microscope resolution, based on the modulation transfer function (MTF) of the microscope, was determined to be 2.3 µm. Measured microscope resolution, as stated in Table 2 above, was empirically estimated to be approximately 2.5 µm. Microscope resolution was measured by imaging a Siemens star resolution test pattern.
In order to test the resolution capabilities of the experimental microscope design, a sharp edge, a slanted bar, was used as the synthetic scene and imaged with the virtual microscope. The average edge response, or line spread function, was then derived at different cross-sections of the digital image of the slanted bar, and the MTF was then calculated. The results indicate that the Nyquist rate, as determined by the camera pixel pitch, is 89 cycles/mm. This corresponds to a 2.2 µm feature size in the specimen plane. The MTF10, that is, the resolution at which the contrast degrades to 10% of the ideal contrast, was shown to be 2.3 µm.
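These figures can be cross-checked from the sensor pixel pitch and the nominal optical magnification quoted elsewhere in this disclosure (5.6 µm pixels, approximately 5x magnification); the calculation below is a consistency check rather than an additional measurement, and assumes the quoted frequencies are referred to the image (sensor) plane:

```latex
f_{\mathrm{Nyq}} = \frac{1}{2p} = \frac{1}{2 \times 5.6\,\mu\mathrm{m}} \approx 89\ \mathrm{cycles/mm},
\qquad
\frac{2p}{M} = \frac{11.2\,\mu\mathrm{m}}{5} \approx 2.2\,\mu\mathrm{m}
```

Here p is the pixel pitch and M the optical magnification, so 2.2 µm is the full Nyquist period referred to the specimen plane; the 2.3 µm MTF10 value sits just above this sampling limit, as expected for a camera-limited system.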
A number of variations are possible from the expressly-discussed embodiments of the present disclosure. For instance, the microscope can be configured to include a local power supply, such as a battery. In other instances, an array of microscopes can be arranged to capture respective images of target objects.
Particular embodiments relate to in vivo imaging of an organism. Various embodiments discussed hereafter relate to the imaging of the cerebellar vermis to study microcirculation concurrently with locomotion and other mouse behaviors, by mounting an integrated microscope on the cranium. Notwithstanding, the present disclosure is not so limited and can be applied to a variety of different fields and applications.
In particular experimental embodiments relating to in vivo brain imaging, the miniature microscope was fixed over the mouse brain (in multiple experiments), including for a mouse exhibiting vigorous locomotor activity. The microscope was attached while the mouse was anesthetized, and imaging commenced about 15-60 min after removal from the anesthesia. Using the cranially-attached microscope, multiple video clips of mouse behavior and the correlated microcirculation in the vermis can be captured for various behaviors. For instance, the mouse walking about the behavioral arena represents a first behavior, while the mouse running on an exercise wheel represents a second behavior. In an experimental implementation, microcirculation was recorded using the integrated microscope, which captured images at 100 Hz following an intravenous injection of FITC-dextran. This fluorescent dye labeled the blood plasma, allowing erythrocytes to be seen in dark relief. Individual erythrocytes were witnessed flowing through the capillaries. To reduce the possibility of photo-induced alterations in physiology, the duration and mean power of continuous illumination were limited to <5 min and <600 µW for each imaging session. At least 2 min were allowed to elapse between imaging sessions, and the total imaging duration over the course of an experiment was generally around 45 min. Frame acquisition rates were around 100 Hz for the cerebellar vasculature and microcirculation imaging experiments, and 30-46 Hz for calcium imaging studies.
Although several in vivo applications are discussed herein, the devices and methods of the present disclosure can be used for other imaging solutions, such as morphology determinations, drug screening and other applications.
Consistent with one embodiment, the use of the integrated microscopes discussed herein facilitates the identification of phenotypes for various organisms. This is facilitated by the high-resolution imaging that can be used to identify distinguishing characteristics. For instance, phenotypes can be identified for wild-type and erbb3 mutant zebrafish with fluorescence immunolabeling of myelin basic protein with Alexa-488. The spinal cords and posterior lateral nerves can be imaged and used to distinguish wild-type fish. In erbb3 fish, Schwann cells fail to develop along the posterior lateral nerve.
Consistent with another embodiment, the use of integrated microscopes facilitates accurate cell counting assays in well plates. For instance, a base concentration (C0 of approximately 4.0 x 10^5 cells/mL) of live MCF7 human breast cancer cells labeled with carboxyfluorescein can be diluted, with 8 sample wells for each of 6 concentrations.
Optionally, an automated algorithm can be used to provide for fast and efficient counting of cells in the images. Consistent with one embodiment, the automated algorithm uses successive stages of analysis within a custom cell counting algorithm. A logic circuit such as a (computer) processor circuit (e.g., including a memory circuit/medium for providing the processing instructions) performs contrast equalization on a (raw) fluorescence image of live MCF7 human breast cancer cells labeled with carboxyfluorescein. The processor circuit next converts the resulting image to binary format, on which an initial segmentation is performed. Single cells are then identified and counted. Iterative rounds of morphological filtering allow segmentation of the clusters of multiple cells that remain after the initial segmentation into individual cells. A sketch of such a counting pipeline is given below.
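The following is a minimal sketch of the staged counting pipeline just described, written with NumPy and SciPy; the threshold choices, the assumed single-cell area, and the cluster-splitting step are illustrative stand-ins rather than the exact algorithm used in the experiments.

```python
import numpy as np
from scipy import ndimage as ndi

def count_cells(raw: np.ndarray,
                single_cell_area: float = 80.0,   # assumed typical cell area, in pixels
                erosion_iterations: int = 3) -> int:
    """Staged count: contrast equalization, binarization, segmentation,
    then morphological filtering to split remaining clusters."""
    # 1) Contrast equalization (simple percentile stretch as a stand-in).
    lo, hi = np.percentile(raw, (1, 99))
    img = np.clip((raw - lo) / max(hi - lo, 1e-9), 0.0, 1.0)

    # 2) Convert to binary format (global threshold; Otsu or similar could be used).
    binary = img > img.mean() + 2 * img.std()

    # 3) Initial segmentation: label connected components and measure their areas.
    labels, n = ndi.label(binary)
    areas = ndi.sum(binary, labels, index=np.arange(1, n + 1))

    # 4) Count objects whose area is consistent with a single cell.
    singles = int(np.sum((areas > 0.3 * single_cell_area) &
                         (areas < 1.8 * single_cell_area)))

    # 5) Morphological filtering to break apart larger clusters and count their cells.
    clusters = np.isin(labels, np.flatnonzero(areas >= 1.8 * single_cell_area) + 1)
    eroded = ndi.binary_erosion(clusters, iterations=erosion_iterations)
    _, n_clustered = ndi.label(eroded)
    return singles + int(n_clustered)
```

In an assay of the kind described above, counts from such a pipeline would be compared against the known dilution series (eight wells at each of six concentrations) to assess accuracy.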
Embodiments of the present disclosure are directed toward using a microscopy imaging device as part of a larger optical system. For instance, a microscopy imaging device can be embedded in vivo to facilitate long-term, chronic imaging. This can be facilitated by providing mobile power sources and control/processing circuitry. These and other elements can be integrated within the housing of the microscopy imaging device or connected externally (e.g., using a wired connection to a control unit located elsewhere on the subject). In another instance, a microscopy imaging device can be used in connection with specialized optical devices, for example, to facilitate in vivo endoscopy or to monitor the subject during surgical procedures.
The various embodiments described above and shown in the figures are provided by way of illustration only and should not be construed to limit the disclosure. Based on the above discussion and illustrations, those skilled in the art will readily recognize that various modifications and changes may be made to the present disclosure without strictly following the exemplary embodiments and applications illustrated and described herein. For instance, applications other than in vivo imaging may be amenable to implementation using similar approaches. In addition, one or more of the above example embodiments and implementations may be implemented with a variety of approaches, including digital and/or analog circuitry and/or software-based approaches. These approaches are implemented in connection with various example embodiments of the present disclosure. Such modifications and changes do not depart from the true scope of the present disclosure, including that set forth in the following claims.
As discussed above, specific applications and background details relative to the present disclosure are discussed above, in the description below and throughout the references cited herein. The embodiments in the Appendices may be implemented in connection with one or more of the above-described embodiments and implementations, as well as with those shown in the figures and described below. Reference may be made to the Appendices filed in the underlying provisional application, which are fully incorporated herein by reference.

APPENDIX A

3.1

[Figure 3.1 here: CAD cross-section labeling the CMOS camera, emission filter, focusing mechanism, LED, achromat lens, dichroic mirror, collector lens, excitation filter, objective, and the excitation and emission light paths.]
Figure 3.1. Computer-aided-design (CAD) drawing of a cross-section of the miniature fluorescence microscope.

Figure 3.1 above shows a cross-section of the miniature fluorescence microscope that was designed embodying the design philosophy and guiding principles outlined above. The characteristic epi-fluorescence architecture at the core of modern fluorescence microscopes has been retained, as is evident from the blue and green arrows that denote the excitation and emission pathways, respectively, except for the notable fact that the light source and camera - benchtop components in a conventional fluorescence microscope - are now integrated with a miniature fluorescence filter set and micro-optics into a single device. Key to facilitating "miniaturization by integration" is being able to integrate the light source and camera with the filter set and microscope optics. Leveraging advances in semiconductor electronics has permitted such integration, and in doing so, has enabled the realization of the design goal that was articulated. A solid-state light-emitting diode (LED) - small, amenable to integration with collection optics, and mass-producible at low cost - is used for the excitation light source. A Complementary Metal-Oxide-Semiconductor (CMOS) image sensor is used for the camera. CMOS image sensors have advanced considerably in recent years, driven by needs in mobile imaging, and the use of such an image sensor capitalizes on the several benefits that have been brought about by their ubiquitous use in cellular phone cameras - miniature form factor, amenability to integration with electronics, and availability in large numbers at low cost - all well-suited to the design and development of the miniature fluorescence microscope.
The LED light source shown in Figure 3.1 is a blue LED mounted on a custom 6 mm x 6 mm printed circuit board (PCB) with a heatsink. A drum micro-lens is used to collect illumination, which then passes through a 4 mm x 4 mm excitation filter, deflects off a dichroic mirror, and enters the imaging pathway. A gradient refractive index (GRIN) objective micro-lens focuses illumination onto the sample. Fluorescence emissions from the sample return through the objective, the dichroic, a 4 mm x 4 mm emission filter, and an achromatic doublet tube lens that focuses the image onto the CMOS image sensor (640 x 480 pixels), mounted on a custom 8.4 mm x 8.4 mm PCB with power and signal conditioning electronics. The LED light source, CMOS camera, and all optical components are integrated into a microscope housing with an inherently modular design that permits individual components, such as the excitation LED and CMOS camera chip, to be replaced for different application needs. The microscope housing is fabricated in Polyetheretherketone (PEEK) and has built-in mechanical image focusing capabilities permitting focusing to sub-micron accuracy by adjustment of the camera position. The microscope can be plugged into a computer via external data acquisition PCBs, with a standard USB interface, providing real-time image acquisition and display.
In the following sub-sections, the design of the constituent microscope modules and components is discussed in detail with relevant performance metrics.
3.1.1
The imaging pathway is typically at the heart of any microscope design, and it is perhaps natural to begin with the design of this particular module of the microscope. As was alluded to earlier, the imaging pathway (without the spectral-filtering components) is comprised of two lens elements - a GRIN objective lens and an achromatic doublet imaging tube lens. GRIN lenses are cylindrical lenses with a radially-decreasing refractive index profile that results in rays propagating in a sinusoidal path, as shown in Figure 3.2 below. The choice of a GRIN lens as the objective exploits its miniature form factor and ease of integration with other micro-optics - attributes that were harnessed in fiber-optic telecommunications and that initially drove the development of such lenses - and its endoscope-like geometry for access to deep imaging targets.

[Figure 3.2 here: sinusoidal ray paths within the GRIN lens, with the specimen at the lens face.]
Figure 3.2. Ray propagation in a Gradient Refractive Index (GRIN) lens.

The specific GRIN objective lens used to collect fluorescence emission from the specimen is 2 mm in diameter with a pitch length of 0.245. A pitch length of 1 corresponds to one full sinusoidal path of ray propagation; thus a pitch length of 0.245 results in collimated light rays as shown in Figure 3.2. The objective numerical aperture (NA) is 0.45. Collected fluorescence emission is passed through the dichroic mirror and the miniature emission filter, and the fluorescence image is then focused by an achromatic lens, with a focal length of 15 mm, onto the CMOS image sensor.
Figure 3.3 below is an optical raytrace diagram of the entire imaging pathway with the two lens elements and the additional spectral-filtering components. The colored groups of rays show how points in the specimen plane are imaged onto the CMOS camera. Rays are traced from five distinct point sources in the specimen plane to imaged points on the CMOS camera. The design of the imaging pathway and all optical raytrace simulations were performed using ZEMAX software.

[Figure 3.3 here: labeled elements include the GRIN objective, dichroic mirror, emission filter, achromat, and camera.]
Figure 3.3. Optical raytrace schematic of the microscope imaging pathway.

The microscope magnification ranges between 4.5-5.5x, and the working distance, that is, the distance from the back surface of the objective to the point in the specimen plane that is in focus, is about 150-200 µm, depending on the exact positioning of the focal plane. The performance of an optical design is often evaluated by its resolving capabilities, and one measure of this is the full-width-half-maximum (FWHM) of the optical point-spread function. The on-axis, lateral spatial resolution of the imaging pathway computed in this manner is approximately 1.2 µm, degrading to approximately 1.6 µm at the periphery of the field-of-view. However, it should be noted that the spatial resolution achievable is also a function of the camera pixel size.

3.1.2
Proper design of the illumination pathway is critical to ensuring efficient and uniform excitation of the specimen under observation. The excitation light source itself is perhaps the most integral component of the illumination pathway, determining whether sufficient and well-controlled illumination to excite the specimen can be provided. As mentioned previously, a blue LED with the spectral peak of illumination at around 470 nm was used as the excitation light source. Figure 3.4 below shows a rendition of the LED and Figure 3.5 shows its measured spectral profile.
The LED was mounted on a custom 6 mm x 6 mm PCB that was equipped with a heat sink. The purpose of the heat sink is to keep the LED junction temperature stable during operation. LED illumination output is, to first order, linear with drive current, but exhibits a strong temperature dependence. Drive currents of 20-30 mA were sufficient to deliver the required illumination power at the specimen, approximately a tenth of the maximum drive current that the LED was rated for. Typically, for a given drive current, the LED junction reached an equilibrium temperature in approximately 60 s after LED turn-on and the LED illumination output stabilized.

[Figure 3.4 here: rendition of the LED package; scale bar 1 mm.]
Figure 3.4. Light-emitting-diode (LED) excitation light source.

[Figure 3.5 here: measured LED spectrum; horizontal axis Wavelength (nm).]
Figure 3.5. Measured LED illumination output spectral profile.

LED illumination is collected by a drum lens, passed through a miniature fluorescence excitation filter, and then reflected off a dichroic mirror that directs the illumination into the GRIN objective lens and to the specimen. The goal in the design of the LED illumination collection and delivery to the specimen was to approximate Kohler illumination at the specimen, that is, to achieve spatially uniform, homogeneous illumination at an average optical power density across the specimen field-of-view. In Kohler illumination the light source and the specimen planes are on separate sets of conjugate planes, ensuring that the light source is not imaged onto the specimen and yielding even illumination of the specimen at an average optical power density.

3.1.3
The fluorescence filter set is the core component of the epi-fluorescence architecture, separating the excitation illumination from the fluorescence emission and comprising three parts: the excitation filter, dichroic mirror, and emission filter. A standard fluorescence filter set comprising these three parts was used for the microscope, except that the filter set used was a miniature version of that used in a conventional benchtop fluorescence microscope. The spectral profiles of the filters and dichroic were chosen to allow blue excitation and green emission - a common set of spectral profiles that is amenable to imaging a broad palette of synthetic fluorescent probes, such as fluorescein and its reactive derivatives, as well as genetically-encoded fluorescent proteins, such as the green fluorescent protein (GFP). The specific spectral characteristics and dimensions of the filter set are as follows. The excitation filter is a bandpass filter with a spectrum of 480/40 nm and a dimension of 4 mm x 4 mm x 1.05 mm, the emission filter is also a bandpass filter with a spectrum of 535/50 nm and a similar dimension of 4 mm x 4 mm x 1.05 mm, and the dichroic mirror has a longpass spectral profile, passing wavelengths above 506 nm, and with a dimension of 4 mm x 4.8 mm x 1.05 mm.

3.1.4
The use of a CMOS image sensor for the camera is a key enabling means toward achieving the articulated design goal and realizing the miniature microscope. CMOS image sensors are digital imaging sensors that are designed and fabricated in CMOS - the technology commonly used in the design of most integrated circuits for a large number of applications today. The transition from analog, film-based imaging to digital electronic imaging was not brought about by CMOS digital image sensors, though.
Charge-Coupled Device (CCD) technology was first used to demonstrate the physical means of converting photons into electrons in arrays of pixels in silicon and generating a digitized image of a scene - ushering in the era of digital imaging, and indeed, recognized for its role in doing so with one-half of the 2009 Nobel Prize in Physics awarded to two of the pioneers of CCD technology, Willard Boyle and George Smith. The advent of CMOS imaging, however, was a significant milestone in the evolution of digital imaging. In contrast to the specialized semiconductor foundries that are required for fabricating CCD digital image sensors, CMOS digital image sensors can be fabricated in conventional semiconductor foundries. As a result, CMOS image sensors can be mass-produced at very low cost. Furthermore, the use of CMOS technology provides an enabling platform with several inherent performance benefits. Digital imaging sensors with low power and high speed can be realized, and additional circuits can be integrated at the pixel level or chip level to incorporate signal processing capabilities and implement novel architectures for demanding application needs. Exemplifying such capabilities was the design and development of CMOS image sensors with digital pixels - where conversion from photons to bits is done directly at the pixel level with a per-pixel analog-to-digital converter and dynamic memory - enabling, for example, imaging at up to 10,000 frames/s for high-speed imaging applications and the implementation of still and video-rate imaging applications that benefit from high-speed capture, such as dynamic range enhancement [70]. High dynamic range imaging requirements are often encountered in infrared (IR) imaging, and CMOS readout circuits and architectures for high dynamic range imaging at high speeds, and with high signal fidelity, have been demonstrated. CMOS image sensors have also had an impact on the life sciences, in one work used to enable an integrated, low-cost de novo DNA sequencing platform. CMOS image sensors are being used today in a plethora of applications, and remarkably sophisticated custom image sensors are being designed with capabilities that are likely to engender even more applications. A recent work embodies the state of the art in CMOS image sensor device, circuit, and architecture, demonstrating imaging with a novel multiple-aperture architecture with submicron pixel structure.
Although designing and fabricating custom CMOS image sensors, such as those referenced above, affords the capability and flexibility to tailor the image sensor to the target application, the proliferation of CMOS image sensors in mobile imaging devices has created a thriving market of commercially-available CMOS image sensors that are particularly attractive since they usually represent proven designs, and consequently, a quicker route to validation and incorporation into an imaging system. The low cost of production and low power consumption of CMOS image sensors have made them ubiquitous in cellular phones and personal digital assistants (PDAs). Mobile imaging needs continue to drive CMOS image sensor development, and indeed, several of the ensuing advances, such as miniature form factor and amenability to integration with other electronics on miniature PCBs, in addition to their availability at low cost, are well-aligned with the design objectives of the miniature microscope.
Based on these considerations, a commercially-available CMOS image sensor was selected to form the centerpiece of the camera that would be integrated into the miniature fluorescence microscope. The CMOS image sensor chosen has a resolution of 640 x 480 pixels, each pixel having dimensions of 5.6 µm x 5.6 µm, and is packaged in an especially miniature 5.6 mm x 5.8 mm chip-scale package. The sensor output is in a serialized digital low-voltage differential signaling (LVDS) format, facilitating interfacing with a minimum number of interconnects - an important consideration that minimizes the number of wires attached to the microscope. Figure 3.6 below shows the packaged CMOS image sensor photographed next to a U.S. dime.

[Figure 3.6 here: photograph of the packaged sensor beside a U.S. dime; scale bar 5 mm.]
Figure 3.6. CMOS image sensor selected for integration with the microscope.

Miniature form factor and amenability to integration with other electronic components are necessary but certainly not sufficient conditions in determining the potential of using a commercially-available CMOS image sensor for the camera. Characterization of imaging performance and evaluation of application fit are also crucial. Table 3.1 below lists the key specifications and characterization results pertinent to imaging performance of the CMOS image sensor that was chosen.
Table 3.1. CMOS image sensor specifications and imaging performance characterization results.
Package size: 5.6 x 5.8 mm²
Array size: 640 x 480 pixels
Pixel size: 5.6 x 5.6 µm²
Frame rate*: 36 fps
Pixel read noise: 10 e⁻
Dark current (room temp.): 900 e⁻/s
Dark signal non-uniformity: 30 e⁻/s
Full well capacity: 52,000 e⁻
*Higher frame rates possible over smaller windows-of-interest (e.g., 100 fps at 300 x 300 pixels)

Experimental characterization of the sensor to determine some of the above-listed results is briefly described as follows. Pixel read noise was estimated by calculating the standard deviation of pixel intensity in 1000 image frames, acquired in full darkness and with sufficiently brief exposure such that the noise contribution from dark current shot noise was insignificant. Dark current and dark signal non-uniformity (DSNU), the variation in dark current across the array of pixels due to device mismatches, were estimated by capturing 1000 frames in the dark with sufficiently long exposure times, and then averaging all the frames into a single image, with the objective of ideally averaging out temporal noise. Dark current and dark signal non-uniformity were then found from the mean and standard deviation of all pixels in the averaged image. With these experimentally-characterized sensor specifications, and other known electronic properties of the sensor, the CMOS image sensor was analytically modeled to estimate imaging fidelity for a range of incident photon flux densities. The reader is referred to Appendix I for details on the analytical modeling.
Since photon flux densities incident on the sensor plane for typical in vivo mouse brain imaging experiments are on the order of 10^11 photons/cm²/sec (determined empirically from measurements and imaging data acquired in previous in vivo mouse brain imaging experiments using the fiber-bundle-based fiberscope), which corresponds to 20,000 electrons/pixel/sec, the CMOS image sensor operates in the photon shot-noise-limited regime for in vivo mouse brain imaging experiments - the primary application for the miniature fluorescence microscope. Thus, the CMOS image sensor's pixel read noise and dark current numbers, important considerations for applications where imaging is performed in low-light conditions, have a negligible impact on imaging fidelity. Appendix I provides estimates of the fidelity with which fluorescence emission signals from the brain can be imaged - limited by the Poisson statistics of signal shot noise - with the peak signal-to-noise ratio (SNR) estimated to be 47 dB. Along with an estimated sensor dynamic range of 60 dB, which is more than sufficient for capturing the range of signal intensities observed in in vivo brain imaging datasets, the imaging performance metrics of the CMOS image sensor are well-suited to serving the application needs.
The importance of such sensor characterization, analytical modeling, and evaluation of imaging performance in the context of targeted applications cannot be emphasized enough, since the particular choice of the CMOS image sensor for use as the camera is a key design decision. While CMOS image sensors in general do facilitate "miniaturization by integration", evaluating whether the chosen sensor has the requisite imaging performance before fabrication of the microscopes is critical - especially if such microscopes are to be mass-produced.
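A minimal sketch of the dark-frame characterization described above follows; the frame counts and estimation steps mirror the text, while the conversion from digital numbers to electrons is an assumed placeholder and the synthetic input frames are for illustration only.

```python
import numpy as np

def characterize_dark_performance(short_dark: np.ndarray,
                                  long_dark: np.ndarray,
                                  t_exp_long_s: float,
                                  e_per_dn: float = 1.0):
    """Estimate read noise, dark current, and DSNU from dark-frame stacks.

    short_dark: (N, H, W) frames at negligible exposure (read noise dominates)
    long_dark:  (N, H, W) frames at long exposure t_exp_long_s seconds
    e_per_dn:   assumed conversion factor, electrons per digital number
    """
    # Read noise: temporal standard deviation per pixel, averaged over the array.
    read_noise_e = float(np.mean(np.std(short_dark, axis=0))) * e_per_dn

    # Average the long-exposure frames to suppress temporal noise, leaving the
    # fixed spatial pattern associated with dark current.
    mean_frame_e = long_dark.mean(axis=0) * e_per_dn

    dark_current_e_per_s = float(mean_frame_e.mean()) / t_exp_long_s
    dsnu_e_per_s = float(mean_frame_e.std()) / t_exp_long_s
    return read_noise_e, dark_current_e_per_s, dsnu_e_per_s

# Example with 1000 small synthetic frames standing in for real dark captures.
rng = np.random.default_rng(0)
short = rng.normal(0.0, 10.0, size=(1000, 32, 32))
long = rng.normal(900.0, 40.0, size=(1000, 32, 32))
print(characterize_dark_performance(short, long, t_exp_long_s=1.0))
```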
The CMOS image sensor was integrated on a custom 8.4 mm x 8.4 mm PCB with power and signal conditioning circuits. The PCB itself is designed to be mounted on top of the miniature microscope with its vertical position adjustable in order to permit fine focusing by adjustments to the imaging plane. Figure 3.7 below shows a schematic of the PCB with the CMOS image sensor and integrated circuit components for power regulation and conditioning, and for conditioning of the image sensor clock signal.
[Figure 3.7 here: block diagram showing the SCLK and SDATA control inputs, the CLK input and clock buffer, the LDO power regulation, the CMOS image sensor, and the 10-bit pixel output transmitted as 162 Mbps serial LVDS.]
Figure 3.7. Schematic of PCB with CMOS image sensor and power and signal conditioning integrated circuits.

The PCB has the minimum number of I/Os required for interfacing with the CMOS image sensor and acquiring imaging data from the microscope. SCLK and SDATA input connections are used to control the sensor over a two-wire I2C interface. The input power supply is stepped down and regulated by a low-dropout voltage regulator (LDO) before being delivered to the sensor. The input CLK signal (162 MHz) is restored by the CLK buffer before being sent to the sensor, which uses it to internally generate a 27 MHz master CLK signal. The output of the sensor, as described earlier, is in a 10-bit digitized format and transmitted over a two-wire serial LVDS protocol.

3.2
Independently designing constituent microscope modules, such as the imaging and illumination pathways, and characterizing and evaluating key microscope components, such as the camera, allows for optimization and evaluation of the respective parts of the microscope - but independent of each other and without the kind of holistic design perspective that would assist in the evaluation of the microscope as an integrated device. Akin to determining whether the imaging performance of the camera - that is, the CMOS image sensor - would be sufficient for the application needs before fabrication of the microscope, it is not a leap of the imagination to go a step further and ask how the microscope itself would perform as an integrated device, before fabrication. One approach to answering this question is to model the microscope. Such an approach enables adopting a modeling-based microscope design methodology, analogous to that used in the design of integrated circuits before mass fabrication. Not only does this permit optimization of microscope performance during the design process, but it also allows simulating and evaluating "plug-and-play" operation across such microscopes before mass fabrication.

3.2.1
Since imaging performance forms the basis of evaluation for any microscope, and in the particular case of these miniature, integrated microscopes that lack an eyepiece the captured digital image serves as the only means of specimen readout, modeling the microscope as an imaging system pipeline from input, the specimen, to final output, the captured digital image of the specimen, provides a framework to implement the comprehensive modeling-based design methodology described above. The process of imaging a specimen can be modeled and simulated, allowing for evaluation of the microscope's overall performance, and different parts of the microscope can at the same time be co-designed and optimized in a holistic manner. In particular, microscope imaging optics and camera co-design, with the objective of optimizing overall device imaging performance, or a system merit function, can enable the designer to explore trading off complexity in the optics against sophistication in the imaging electronics and processing. For example, leveraging advances in image sensors and image processing algorithms to compensate for deficiencies in the imaging optics and correct for optical aberrations may permit the use of simple, compact, and low-cost imaging optics without sacrificing final image quality.
The following section describes the development of this image-based microscope modeling framework and associated tool flow. The tool flow facilitates the construction of a "virtual" microscope that can be used to image synthetic specimen scenes and permits image-based evaluation of the miniature, integrated microscope.

3.2.2
While commercial optics design tools such as ZEMAX and Code V permit analytical optimization and evaluation of the microscope imaging optics, they do not readily lend themselves to image-based analyses with user-defined multi-spectral scenes, and, by themselves, are decoupled from any subsequent steps required to convert the optical image to a digital image. Thus any tool developed to model the microscope as an imaging system pipeline, that is, as an integrated imaging device, must, at a minimum, perform two steps. First, the user must be able to specify a model of the specimen (synthetic scene) with known illumination characteristics as determined by typical imaging conditions for the particular application, and, given this synthetic scene, the process of forming an optical image using the designed imaging optics must be modeled. Second, the process of capturing this optical image using the camera and generating a digital image must be modeled. Such a tool flow would thus enable image-based evaluation of the integrated microscope design, providing feedback to the designer and facilitating, for example, investigation of design trade-offs in the imaging optics and sensor in an iterative manner until an optimized integrated microscope device design has been achieved. The option to incorporate custom post-acquisition image processing routines into the tool flow, either to correct for deficiencies in the imaging system pipeline or to perform further image-based processing, is also a feature that would enhance the utility and applicability of such a modeling-based microscope design methodology and associated tool flow.
The tool flow developed to perform the above steps is built around a core set of open-source MATLAB-based tools, the Image Systems Evaluation Toolbox (ISET), previously developed with the objective of modeling a typical digital camera imaging system pipeline and evaluating camera image quality. ISET provides a platform to specify a user-defined radiometric representation of a scene, a model of the imaging optics, and a model of the digital image capture device, and can be used to simulate conversion of the scene to an irradiance optical image, and conversion of the optical image to a digital image, using the optics model and camera model, respectively. The integrated microscope modeling tool flow developed here leverages this basic framework for modeling an imaging system pipeline, using the optics design tool (ZEMAX) to create and specify a model of the imaging optics based on the ZEMAX optical raytrace design, and VCAM, the virtual camera simulator in ISET, to create a virtual camera model based on the parameters obtained from the CMOS image sensor characterization and specifications. Given the open-source nature of the development environment, the platform is readily amenable to integrating any custom post-processing algorithms based on the raw digital images. Figure 3.8 below illustrates this comprehensive, image-based integrated microscope modeling tool flow.
[Figure 3.8 here: flow diagram - Input: virtual specimen (radiometric scene) -> ZEMAX optics model (form optical image) -> VCAM* camera model (form digital image) -> Output: rendered image.]
Figure 3.8. Tool flow for image-based modeling of the miniature microscope. *VCAM: Virtual Camera, digital camera simulator.

The steps and processes involved in the tool flow are described in detail below.
Step 1: Creating the scene
The first step in modeling the integrated microscope is creating a model of the specimen, which becomes the input to the imaging system pipeline. The distance of the specimen plane from the first lens surface and the specimen field-of-view dimensions are user inputs, and the number of pixels used to represent the discretized scene can also be specified. Given knowledge of typical illumination intensities and spectral properties (i.e., average photon fluxes and distribution of wavelengths), a radiometric representation of the synthetic specimen target is then created where each pixel is expressed in units of photons/sec/nm/sr/m².
Standard optical targets are often a basis for evaluating general imaging performance before simulating synthetic specimen scenes. As an example, the normalized digital image of a USAF 1951 target or EIA 1956 resolution chart can be used as an input scene with geometric dimensions and illumination characteristics approximating specimen imaging conditions. In creating such a scene, the user would provide the dimensions of the field-of-view, the distance of the scene from the optics, and the number of pixels representing the scene FOV. A smaller number of pixels reduces computation time; however, a minimum number of pixels is required such that the discretization of the scene is fine enough for accurate computations that involve the scene further downstream. The user would also have the option of defining the distribution of photon fluxes over a spectrum of wavelengths, the radiometric representation of the scene, which would serve to model scene illumination conditions. In this manner, any normalized digital image that is a representation of the specimen or imaging target can be converted into a multi-spectral radiometric scene with defined geometric dimensions.
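A minimal sketch of this scene-creation step is shown below in Python rather than the MATLAB/ISET environment described in the text. The monochromatic 550 nm illumination and the 10^11 photons/cm²/sec flux level follow the simulation conditions quoted later; for simplicity the sketch tracks photon flux density rather than the full spectral radiance units (photons/sec/nm/sr/m²) used by ISET, and the array handling is illustrative.

```python
import numpy as np

def make_radiometric_scene(normalized_img: np.ndarray,
                           fov_m: float = 500e-6,
                           mean_flux_photons_cm2_s: float = 1e11,
                           wavelength_nm: float = 550.0):
    """Turn a normalized (0-1) target image into a single-wavelength radiometric scene.

    Returns the per-pixel photon flux density at the specimen plane, the scene
    pixel pitch in metres, and the wavelength.  A multi-spectral scene would
    carry one such plane per wavelength sample.
    """
    img = np.clip(normalized_img.astype(float), 0.0, 1.0)
    # Scale so the scene's mean flux matches the specified illumination level.
    scale = mean_flux_photons_cm2_s / max(img.mean(), 1e-12)
    scene_flux = img * scale
    pixel_pitch_m = fov_m / img.shape[0]
    return scene_flux, pixel_pitch_m, wavelength_nm

# Example: a crude bar pattern standing in for a USAF 1951 chart.
bars = np.zeros((256, 256))
bars[:, ::16] = 1.0
scene, pitch, wl = make_radiometric_scene(bars)
print(scene.mean(), pitch, wl)
```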
Step 2: Forming the optical image
The next step in the modeling process is forming an optical image of the multi-spectral radiometric scene using the designed microscope optics. A model of the microscope optics is generated and extracted from the optics design package (i.e., ZEMAX) and is essentially an equivalent lens representation of the imaging pathway. The model is comprised of three main components: 1) equivalent paraxial optics parameters for the pathway, e.g., effective focal length and magnification; 2) geometrical distortion and relative illumination parameters; and 3) field-point- and wavelength-dependent point spread functions (PSFs).
The optical image is computed in three consecutive stages corresponding to the above model components. An ideal, unaberrated geometric image is first computed from the extracted paraxial optics parameters. Geometrical distortion and relative illumination parameters, which account for changes in the image shape and geometry and for the drop-off in illumination at off-axis points due to effects such as vignetting and pupil aberrations, respectively, are extracted at each specified field point in the object plane and used in a least-squares fit with an empirically-derived nth-order polynomial to generate a map of these parameters across the entire field-of-view. An intermediate, distorted optical image is created in this process. The final optical image is formed by convolving the field-point- and wavelength-dependent PSFs with this intermediate image. In an ideal linear, shift-invariant imaging system, the PSF is the system impulse response function, and as such, convolution of field points in the intermediate image with the PSF results in a final, filtered optical image that captures the effects of diffraction and optical non-idealities. However, most optical systems are not shift-invariant, and thus the final transformation assumes linear, rotationally-symmetric, and shift-variant optics. The intermediate image is therefore split into several rotationally-symmetric isoplanatic image sections corresponding to the number of field points specified, each image section is convolved with local PSFs interpolated from the PSFs extracted at the boundary field points of that image section, and the results are then summed. These operations are iterated over all wavelengths to transform the multispectral scene radiance into a multispectral optical image irradiance (photons/sec/nm/m²) incident on the camera plane.
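The patchwise, shift-variant convolution described above can be sketched as follows; the annular section boundaries and the PSFs passed in are placeholders for the field-point data that would actually be exported from ZEMAX, and per-section PSF interpolation is omitted for brevity.

```python
import numpy as np
from scipy.signal import fftconvolve

def shift_variant_blur(image: np.ndarray, psfs: list, radii: list) -> np.ndarray:
    """Approximate shift-variant optics by blurring annular isoplanatic sections.

    psfs:  one 2-D PSF per section, ordered from the field centre outward
    radii: outer radius (in pixels) of each annular section, same length as psfs
    """
    h, w = image.shape
    yy, xx = np.mgrid[0:h, 0:w]
    r = np.hypot(yy - h / 2.0, xx - w / 2.0)

    out = np.zeros_like(image, dtype=float)
    inner = 0.0
    for psf, outer in zip(psfs, radii):
        mask = (r >= inner) & (r < outer)           # annular isoplanatic section
        blurred = fftconvolve(image * mask, psf / psf.sum(), mode="same")
        out += blurred                              # sum the filtered sections
        inner = outer
    return out

def gaussian_psf(sigma: float, size: int = 15) -> np.ndarray:
    """Gaussian stand-in for a field-point PSF."""
    ax = np.arange(size) - size // 2
    return np.exp(-(ax[:, None] ** 2 + ax[None, :] ** 2) / (2 * sigma ** 2))

# Example: a sharper on-axis PSF and a broader off-axis PSF.
img = np.random.rand(256, 256)
result = shift_variant_blur(img, [gaussian_psf(1.0), gaussian_psf(2.0)], [80, 1e9])
```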
Whereas the concept of integrating the ZEMAX optics design with the ISET imaging system modeling platform allows leveraging of standard, well-developed optics design packages for designing and modeling the microscope imaging optics, the actual integration is non-trivial and some of the mechanistic details deserve mention.
Extraction of the three main model components is done within the ZEMAX environment and automated through a ZEMAX script (.ZPL). The script generates the model parameters by simulating the particular lens configuration specified for a given number of wavelengths and object plane field points (or paraxial image heights). The first component of the model, extracting the paraxial image parameters for computation of the ideal geometric image, is straightforward with standard .ZPL programming functions. Extracting relative illumination and geometric distortion parameters at each field point is also readily performed using .ZPL programming functions. However, since generating a map of these parameters across the field-of-view involves a least-squares fit at a later stage in the computation process, the minimum number of field points that the user specifies in ZEMAX must be greater than or equal to the order of the polynomial used in the least-squares fits. A greater number of field points also helps in improving the accuracy of the final step in the optical image formation process. Since PSFs are generated and extracted from ZEMAX at each field point, and convolved with isoplanatic image sections corresponding to the number of field points, specifying several field points results in the generation of more "local" PSFs and finer isoplanatic image sections, providing a better approximation of the shift-variant nature of the optical system. However, this does come at the expense of an increase in computational time.
The PSFs at each field point are calculated using the Huygens PSF method in ZEMAX, with PSF size and grid spacing parameters determined by the extent of the PSF spread and the scene pixel pitch spacing, respectively. The latter is especially important since the PSF sampling and scene pixel pitch must be consistent (ideally, the scene pixel pitch after magnification in the imaging plane must be smaller than or equal to the PSF sampling density) for the 2D convolution between the PSFs and the corresponding isoplanatic image sections to be accurate. All the optics model parameters extracted as above from ZEMAX are output to a set of files which are then imported into the ISET modeling platform, where the conversion from scene to final optical image is performed using the computational steps and procedures discussed earlier. Some of these considerations pertaining to integration of a custom optics design with the ISET platform have also been discussed elsewhere.
Step 3: Forming the digital image
The final step in modeling the integrated microscope is simulating capture of the optical image and its conversion to a digital image. A model of the digital imaging device (i.e., camera) is required, and such a model can be constructed within the ISET platform using known and experimentally-derived camera parameters such as pixel and array size, pixel quantum efficiency, conversion gain, analog-to-digital (A/D) converter specifications, and camera noise parameters. In order to simulate camera capture and the digital image formation process, the conversion of incident photon flux density to sensor signal must be accurately modeled.
Photon flux density is first converted into current density using the pixel quantum efficiency, then integrated in space over the pixel photodiode area and integrated in time over a user-specified camera exposure time, yielding a per-pixel signal count in electrons (charge) that is ideally directly proportional to incident light intensity until saturation. The pixel conversion gain is used to convert the collected charge to volts for readout, and the resolution of the camera A/D converter and the full-scale range of the pixel are used to convert the analog pixel intensities to digital pixel values, thereby completing the transformation of the optical irradiance image to a raw digital image. Figure 3.9 below summarizes the sequence of steps; a numerical sketch of this conversion chain is also given at the end of this sub-section.

[Figure 3.9 here: photon flux density (photons/cm²·sec) -> quantum efficiency -> current density (A/cm²) -> integration in space/time -> charge (Coul) -> conversion gain -> voltage (V) -> quantization -> digital count.]
Figure 3.9. Conversion of photon flux density to a digital pixel value.

The digital image formation process described above is an ideal process. Conversion of the optical image to a digital image is a non-ideal process and often a significant source of additive noise in any imaging system. As such, it is both instructive and important to model the noise at this stage in the microscope imaging system pipeline. Noise modeling enables the rendering of realistic digital images, allows for estimation of imaging signal fidelity, and provides downstream post-processing algorithms, such as deconvolution algorithms to correct for optical system aberrations, with reliable inputs that serve as good tests of the efficacy and performance of these algorithms in the presence of noise. An additive noise model, incorporating the camera noise parameters and the presence of signal photon shot noise, is therefore used to generate digital images with noise. The reader is referred to Appendix I for details on the modeling of noise in the digital image formation process.
The tool flow described above was used to model the miniature microscope at various stages of its design, providing design feedback and allowing optimization of microscope imaging performance. The tool flow facilitated the adoption of a modeling-based microscope design methodology, specifically aiding in the evaluation and design optimization of the microscope imaging optics. Indeed, the imaging pathway design presented in Section 3.1.1 is the result of converging on an optimized design based on using the described tool flow for image-based modeling of different candidate optical designs with the chosen CMOS image sensor. Since the tool flow developed assumes an illuminated scene, that is, creates a radiometric representation of the specimen using known specimen illumination characteristics, the illumination pathway is not modeled and thus its design remains independent. However, the illumination pathway design can be easily prototyped to validate the specimen illumination assumptions that are made.
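A minimal numerical sketch of the photon-to-digital-number chain of Figure 3.9 follows. The pixel size, 10-bit quantization, and full-well capacity match values quoted for the chosen sensor, while the quantum efficiency, conversion gain, and full-scale voltage are illustrative assumptions.

```python
import numpy as np

ELEMENTARY_CHARGE = 1.602e-19  # Coulombs

def photon_flux_to_dn(flux_photons_cm2_s: np.ndarray,
                      t_exp_s: float = 0.030,
                      pixel_pitch_cm: float = 5.6e-4,
                      qe: float = 0.6,                   # assumed quantum efficiency
                      full_well_e: float = 52000.0,
                      conv_gain_uV_per_e: float = 30.0,  # assumed conversion gain
                      adc_bits: int = 10,
                      v_full_scale: float = 1.6,         # assumed full-scale swing (V)
                      rng=np.random.default_rng(0)) -> np.ndarray:
    """Follow Figure 3.9: flux -> current density -> charge -> voltage -> digital count."""
    # Quantum efficiency: photon flux density to photocurrent density, A/cm^2.
    current_density = flux_photons_cm2_s * qe * ELEMENTARY_CHARGE
    # Integrate over the photodiode area and exposure time -> charge, then electrons.
    charge_coul = current_density * (pixel_pitch_cm ** 2) * t_exp_s
    electrons = charge_coul / ELEMENTARY_CHARGE
    # Photon shot noise (Poisson) and full-well saturation.
    electrons = np.minimum(rng.poisson(electrons).astype(float), full_well_e)
    # Conversion gain: electrons to volts.
    volts = electrons * conv_gain_uV_per_e * 1e-6
    # Quantization by the 10-bit ADC.
    dn = np.clip(np.round(volts / v_full_scale * (2 ** adc_bits - 1)),
                 0, 2 ** adc_bits - 1)
    return dn.astype(np.uint16)

# Example: a uniform 1e11 photons/cm^2/s patch, as in the in vivo imaging conditions.
print(photon_flux_to_dn(np.full((4, 4), 1e11)).mean())
```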
3.2.3
This section provides simulation results from image-based modeling of the microscope using the tool flow. Results of simulating synthetic scenes using a model of the final miniature microscope design are presented to illustrate the steps in the tool flow. While simulated optical and digital images from the "virtual" miniature microscope assist in qualitatively evaluating overall microscope imaging performance, the tool flow is also used to quantitatively evaluate imaging performance of the final miniature microscope design, in terms of imaging fidelity and spatial resolution.
Using the virtual microscope to image a USAF 1951 resolution target
To illustrate the tool flow, a standard USAF 1951 resolution test target is used as the synthetic specimen scene. Figure 3.10 below shows the radiometric representation of this scene - the result of the first step in the tool flow - with illumination such that the photon flux density incident on the focal plane is 10^11 photons/cm²/sec, typical of in vivo mouse brain imaging experiments. Illumination is assumed to be monochromatic at a wavelength of 550 nm, within the band of fluorescence emission wavelengths. Scene dimensions are 500 µm x 500 µm.

[Figure 3.10 here: radiometric scene rendering of the USAF 1951 target.]
Figure 3.10. Radiometric representation of USAF 1951 synthetic scene.

The next step is modeling conversion of the radiometric scene to an optical image irradiance using a model of the designed microscope imaging optics. Figure 3.11 below shows the result of this step - the second step in the tool flow - where optical image formation is modeled using the imaging pathway design presented in Section 3.1.1. As expected, the on-axis spatial resolution is sufficiently high to be able to resolve lines on the order of 1-2 µm that are on the outer grid at the center of the resolution target (the FWHM of the point-spread function is 1.2 µm, as reported in Section 3.1.1). Optical aberrations are evident at the periphery of the field-of-view.

[Figure 3.11 here: simulated optical image of the target; scale bar 500 µm.]
Figure 3.11. Optical image irradiance of the USAF 1951 resolution target. Optical image dimensions are approximately 2.5 mm x 2.5 mm, corresponding to an optical magnification of 5x. The imaging optics model used is based on the design of the imaging pathway presented in Section 3.1.1.

The final step in modeling imaging of the scene with the virtual microscope is modeling conversion of the optical image irradiance to a digital image using a model of the camera based on the CMOS image sensor that was chosen (Section 3.1.4). Figure 3.12 below shows the result of this step assuming a 30 ms exposure time. Each pixel in the digital image is a 10-bit digital number - the final result of the process of sensing photons incident on the sensor, collecting charge proportional to the incident number of photons, converting the charge collected per pixel to voltage, and then digitizing the pixel voltage with a 10-bit analog-to-digital converter. The average image SNR is approximately 35 dB. Imaging fidelity is limited by photon shot noise, as would be expected from the analytical sensor modeling results presented in Section 3.1.4. Of course, given that the process of modeling digital image formation in the tool flow is based on the same analytical model and sensor input parameters presented and discussed in Section 3.1.4 and Appendix I, the average SNR of the simulated image is also consistent with what would be expected from the analytical sensor modeling.

[Figure 3.12 here: simulated digital image of the target; scale bar 500 µm.]
Figure 3.12. Digitized image of USAF 1951 resolution target. Image is 3.6 mm x 2.7 mm, corresponding to the CMOS image sensor active area.

Using the virtual microscope to estimate microscope spatial resolution
One of the most important metrics for evaluating performance of any microscope is its resolving power, that is, the achievable imaging spatial resolution.
As was alluded to previously in Section 3.1.1, the achievable imaging spatial resolution of the miniature microscope is a function of both the imaging optics and the camera - specifically, the pixel pitch of the CMOS image sensor. Although the FWHM of the point-spread function of the microscope imaging optics is 1.2 µm, since the CMOS image sensor pixel pitch is 5.6 µm and the optical magnification is approximately 5x, the achievable imaging spatial resolution is bounded by the Nyquist frequency determined by the camera pixel pitch. Therefore, imaging spatial resolution is limited by the camera, and not the optics. Given that the FWHM of the optical point-spread function no longer serves as a suitable metric for evaluating microscope resolution, the resolving capabilities of the microscope have to be assessed by other means. Although imaging resolution targets such as the USAF 1951 test target are helpful, calculating the Modulation Transfer Function (MTF) of the microscope provides another means for quantifying microscope resolution. For an optical system, the MTF is the magnitude of the Fourier transform of the point-spread function and is thus a filter in the frequency domain providing a measure of the degradation in contrast over frequencies. For an integrated imaging system with imaging optics and a camera, such as the miniature microscope, the MTF of the integrated imaging system can be analogously calculated from the line spread function (LSF) at a particular cross-section of a sharp edge that has been imaged through the system. Such a microscope system MTF captures the effects of both the optical MTF and the sensor MTF, and provides a measure of the resolving capabilities of the microscope and the imaging spatial resolution that can be achieved.

Figure 3.13. Modulation transfer function (MTF) of the miniature microscope calculated from a simulated image of a slanted bar. The plot compares the microscope MTF with the optics-only MTF; the Nyquist rate of 89 cycles/mm and the simulated MTF10 of 2.3 µm are indicated.

Figure 3.13 above is a plot of the microscope MTF (compared with the MTF of the imaging optics) calculated at the center of the field-of-view. A sharp edge - a slanted bar - was used as the synthetic scene and imaged with the virtual microscope. The average edge response, or line spread function, was then derived at different cross-sections of the digital image of the slanted bar and the MTF was then calculated, as discussed above, with Figure 3.13 showing the MTF calculated from the LSF at the center of the FOV. As is indicated on the plot, the Nyquist rate, as determined by the camera pixel pitch, is 89 cycles/mm, which corresponds to a 2.2 µm feature size in the specimen plane. The MTF10, that is, the resolution at which the contrast degrades to 10% of the ideal contrast, is 2.3 µm, which can be regarded as a measure of the microscope's expected resolving capability.

The image-based modeling framework and associated tool flow was therefore valuable not only in facilitating and enabling the modeling-based design methodology that was articulated, and that is imperative in establishing a systematic and methodical means of designing such integrated microscopes, but also in allowing for quantification of the expected microscope imaging performance. The simulations and results provided by a model are of course dependent on the validity of the model.
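Before turning to model validation, the slanted-edge procedure described above can be sketched numerically. The fragment below synthesizes an idealized blurred edge rather than using a simulated microscope image; the edge blur width is a stand-in value, while the 5.6 µm pixel pitch and 5x magnification are the nominal design values quoted above. The MTF is obtained as the magnitude of the Fourier transform of the line spread function, itself the derivative of the edge response.

```python
import numpy as np

# Nominal design values: 5.6 um pixel pitch, ~5x optical magnification.
pix_um, mag = 5.6, 5.0
nyquist = 1.0 / (2 * pix_um * 1e-3)                        # cycles/mm at the sensor
print("Nyquist: %.0f cycles/mm -> ~%.1f um feature size in the specimen plane"
      % (nyquist, 2 * pix_um / mag))

# Synthesize a blurred edge response (ESF) sampled at the pixel pitch.
x = (np.arange(256) - 128) * pix_um                        # sensor-plane position, um
blur_um = 9.0                                              # stand-in combined optics + pixel blur
esf = 0.5 * (1.0 + np.tanh(x / blur_um))

# LSF = derivative of the edge response; MTF = |FFT(LSF)|, normalized at DC.
lsf = np.gradient(esf, pix_um)
mtf = np.abs(np.fft.rfft(lsf))
mtf /= mtf[0]
freq = np.fft.rfftfreq(len(lsf), d=pix_um * 1e-3)          # cycles/mm (sensor plane)

# MTF10: frequency at which the contrast falls to 10% of the ideal contrast.
f10 = freq[np.argmax(mtf < 0.1)]
print("MTF10 at ~%.0f cycles/mm (sensor) -> ~%.1f um in the specimen plane"
      % (f10, 1e3 / (2 * f10) / mag))
```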
Several sanity checks were performed during the course of development of the image-based modeling framework and tool flow, for example, intentionally perturbing model parameters and assessing their effects, but ultimately, the validity of the model is best corroborated by comparing results obtained from the model to those obtained experimentally, from the fabricated microscope.

3.3 Appendix I:

This Appendix provides further details on modeling noise in a CMOS image sensor and the analytical model that is referred to in Sections 3.1.4, 3.2.2, and 3.2.3. The noise model developed here for the conversion from photon flux density to sensor signal is an input-referred, additive noise model that incorporates both temporal noise, which accounts for the variation in each sensor pixel signal over time, and fixed pattern noise (FPN), which accounts for the spatial variation in signal across pixels in the array (under temporally and spatially uniform illumination). Temporal noise arises from various sources including signal and dark photon shot noise, sensor read noise, and reset noise. FPN arises from inherent non-uniformities over pixels, due to device and readout circuit mismatches, and has an offset component and a gain component, the latter being signal-dependent. Image sensors often perform common on-chip noise suppression techniques. One such technique is Correlated Double Sampling (CDS), where each pixel output is sampled twice, once right after pixel reset and a second time after integration, and the two samples are differenced before signal readout. Through this operation CDS reduces reset noise and the offset component of FPN. Assuming CDS is always performed and ignoring gain FPN (significant only under high illumination), the image sensor per-pixel additive noise model, with the pixel photocurrent as the input signal, can be simplified as shown in Figure 3.14.

Figure 3.14. CMOS image sensor signal and additive noise model. The pixel photocurrent and dark current are converted to collected charge, to which the noise charges Q_shot, Q_read, and Q_dsnu are added before conversion to an output voltage Vo via the conversion gain g.
In the model of Figure 3.14:

$i_{ph}$ is the pixel photocurrent and $i_{dc}$ is the pixel dark current, both in A.

$Q(i)$ is the function representing conversion of total pixel current to total collected charge,
$$Q(i) = \frac{i\,t_{int}}{q} \ \text{electrons for } 0 \le \frac{i\,t_{int}}{q} \le Q_{max}, \qquad Q(i) = Q_{max} \ \text{electrons for } \frac{i\,t_{int}}{q} > Q_{max},$$
where $q$ is the electron charge, $t_{int}$ is the integration time (camera exposure), and $Q_{max}$ is the well capacity.

$Q_{shot}$ is the r.v. representing the noise charge due to integration (Poisson-distributed shot noise) and is approximated as Gaussian with zero mean and variance $\frac{1}{q}(i_{ph} + i_{dc})\,t_{int}$ electrons$^2$.

$Q_{read}$ is the r.v. representing the noise charge due to camera read noise (noise due to readout circuits and quantization) and has zero mean and standard deviation $\sigma_{read}$ electrons.

$Q_{dsnu}$ is the r.v. representing the noise charge due to Dark Signal Non-Uniformity (the component of offset FPN not canceled by CDS) and is assumed to have zero mean and standard deviation $\sigma_{dsnu} = \sigma_{dc}\,t_{int}/q$ electrons, where $\sigma_{dc}$ is the pixel-to-pixel standard deviation of the dark current.

$g$ is the sensor conversion gain in V/electron.

Thus the total input-referred average noise power is
$$\overline{Q_{n}^{2}} = \frac{(i_{ph} + i_{dc})\,t_{int}}{q} + \sigma_{read}^{2} + \sigma_{dsnu}^{2} \quad \text{electrons}^2.$$

Assuming the input signal is not varying during integration, the sensor Signal-to-Noise Ratio (SNR) is then given by
$$\mathrm{SNR} = 10\,\log_{10}\!\left(\frac{\left(i_{ph}\,t_{int}/q\right)^{2}}{\dfrac{(i_{ph} + i_{dc})\,t_{int}}{q} + \sigma_{read}^{2} + \sigma_{dsnu}^{2}}\right).$$

SNR can therefore be estimated using the above equation for any given input-signal photocurrent, corresponding to a particular incident photon flux, and specified integration time, given knowledge of the sensor read noise, dark current, and DSNU.
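As a numerical illustration, the SNR expression above can be evaluated directly for a given photocurrent and integration time. In the sketch below the dark current, read noise, DSNU, and well capacity are placeholder values rather than the datasheet parameters used in Section 3.1.4.

```python
import numpy as np

Q_E = 1.602e-19    # electron charge (C)

def sensor_snr_db(i_ph, t_int, i_dc=1e-15, sigma_read=10.0, sigma_dc=1e-16, q_max=20000.0):
    """Input-referred SNR (dB) of the additive pixel noise model described above.

    i_ph, i_dc : photocurrent and dark current (A); t_int : integration time (s)
    sigma_read : read noise (rms electrons); sigma_dc : dark-current non-uniformity (A)
    q_max      : well capacity (electrons) -- the collected charge saturates here
    """
    signal_e = np.minimum(i_ph * t_int / Q_E, q_max)        # collected signal charge
    shot_var = (i_ph + i_dc) * t_int / Q_E                  # photon + dark shot noise
    dsnu_var = (sigma_dc * t_int / Q_E) ** 2                # residual offset FPN (DSNU)
    return 10.0 * np.log10(signal_e**2 / (shot_var + sigma_read**2 + dsnu_var))

# Example: 30 ms exposure over a range of photocurrents.
for i_ph in (1e-15, 1e-14, 1e-13):
    print("i_ph = %.0e A  ->  SNR = %.1f dB" % (i_ph, sensor_snr_db(i_ph, 0.030)))
```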
4.1

Figure 4.1. A typical fabricated miniature fluorescence microscope with the main components (CMOS camera, filters, LED, and objective) labeled. The scale bar (5 mm) applies to the main photograph and all insets.

Figure 4.1 above depicts a typical fabricated microscope fully assembled with all constituent components. The insets show, clockwise from top, the camera PCB with the CMOS image sensor mounted on top, the LED PCB with mounted LED and integrated heat sink, and the fluorescence filter and dichroic set.
Figure 4.2. A fabricated, fully-assembled microscope photographed next to a U.S. dime.

Figure 4.3. Several fabricated, fully-assembled microscopes photographed in the palm of a hand.

Figure 4.2 and Figure 4.3 above are photographs of fabricated, fully-assembled microscopes, providing a perspective on the level of miniaturization achieved.
Table 4.1. Fabricated miniature fluorescence microscope specifications and characterized imaging performance.

Specification       | Miniature microscope  | Fiberscope
Dimensions          | 8.4 x 13 x 22 mm3     | Benchtop
Mass                | 2 g                   | -
Resolution          | 2.5 µm                | -
Field-of-view       | 0.48 mm2              | -
Photon flux         | 3 x 10^11 ph/cm2/s    | 6 x 10^10 ph/cm2/s
SNR (same Pexc)     | 37 dB                 | 30 dB
SNR (reduced Pexc)  | 30 dB                 | -
Imaging duration    | 40-50 min             | 10 min
Mouse motion        | Active behavior       | Limited
Image alignment     | Not necessary         | Necessary

Table 4.1 above lists specifications of the miniature fluorescence microscopes, characterized imaging performance of a typical fabricated microscope, and comparisons to the performance of the fiber-bundle-based fiberscope in the context of the in vivo mouse brain imaging application that the miniature fluorescence microscopes were primarily designed for. The listed specifications show that the design objectives have been met. The microscope is small enough to be borne on the head of a mouse during active behavior. Imaging performance is also sufficient for high-speed, cellular-level brain imaging, and indeed many of the performance gains expected have been achieved (see Section 4.1.2).
4.1.1

Comparing specifications and experimentally-characterized imaging performance of the fabricated microscope to what was expected from the design of the microscope and simulations of its imaging performance provides feedback on the design process and corroboration of the validity of the image-based modeling framework that was developed and used to simulate microscope imaging performance.

Resolution

Perhaps the most important performance metric worthy of comparison is resolution. Simulated microscope resolution, based on the MTF of the microscope (Section 3.2.3, Figure 3.13), was determined to be 2.3 µm. Measured microscope resolution, as stated in Table 4.1 above, was empirically estimated to be approximately 2.5 µm. Microscope resolution was measured by imaging a Siemens star resolution test pattern.

Field-of-view

Field-of-view, determined by the CMOS image sensor array size and the optical magnification, is also consistent with expectations, with the FOV reported in Table 4.1 corresponding to the specimen FOV area imaged with an optical magnification of 4.5x.

Imaging fidelity (SNR)

Earlier, it was mentioned that typical in vivo brain imaging signals are on the order of 10^11 photons/cm2/s, and that the expected imaging fidelity in terms of SNR for signals of that magnitude was around 35 dB (analytical sensor modeling and simulations, Section 3.1.4 and Section 3.2.3). Table 4.1 above reports an SNR of 37 dB for signals where the mean photon flux density incident on the sensor is approximately 3 x 10^11 photons/cm2/s. The slightly lower than expected SNR (for a mean signal of 3 x 10^11 photons/cm2/s the analytical sensor modeling would predict an SNR of 39-40 dB) can most likely be attributed to simplifications in the noise modeling, for example, not incorporating the additional noise component due to gain FPN that is manifested at high signal levels. From the above, it is heartening to conclude that measurement results are, in general, found to agree with simulation results and expectations.

4.1.2

Table 4.1 also compares the specifications and imaging performance of the miniature fluorescence microscope with the fiber-bundle-based fiberscope. Elimination of the fiber-bundle was expected to realize several performance benefits. The expected areas of improvement are summarized below and the fulfillment of those expectations is discussed.

Mouse motion

One of the principal drawbacks of the fiber-bundle was that it was rigid and not very flexible, limiting the range of mouse behavior. With the miniature microscope, the range of mouse behaviors observed is remarkably rich. Although there is a bundle of wires encased in a polyvinyl chloride (PVC) sheath that connects the microscope to the data acquisition and control setup, this cable carrying all electrical signals is very thin (with an overall diameter of 1.5 mm) and flexible. As will be evident in the discussion of experimental results from in vivo mouse brain imaging in the next section, one of the primary design goals from the perspective of the neuroscience application - enabling free and active mouse behavior during in vivo brain imaging - has certainly been achieved.
Image alignment

Although a commutator attempts to provide the mouse with more behavioral freedom by allowing the fiber-bundle to rotate as the mouse moves, this results in a rotating field-of-view that necessitates computational realignment of the rotating set of raw images by offline image registration. Furthermore, since the illumination field does not rotate with the image and is always slightly non-uniform, a time-dependent illumination non-uniformity is present in the computationally processed, rotationally-aligned images. With the miniature microscope, since the entire microscope is mounted on the mouse, captured images are always in the same frame of reference and therefore such limitations no longer exist. Furthermore, images are already digitized when transmitted over the electrical cable. The field-of-view does not rotate, computational realignment of images is not required, and illumination non-uniformities associated with a rotating FOV are absent.

Resolution

The fiber-bundle - a collection of single-core, step-index optical fibers - is pixelated, constraining the achievable spatial resolution. Although the measured resolution of the miniature microscope is indeed better than that achieved with the fiberscope, it is important to note that the resolution of the miniature microscope is limited by the camera and has the potential for further scaling towards the optics-limited resolution of 1.2 µm by simply leveraging advances in CMOS image sensor technology and current trends in pixel scaling, without any re-design of the microscope imaging optics. Therefore, whereas the resolution of the miniature fluorescence microscope is scalable to 1.2 µm (and beyond with optical re-design and/or resolution enhancement techniques), the resolution achievable with the fiberscope remains inherently constrained by the pixelation of the fiber-bundle.

Field-of-view

The specimen field-of-view area that can be imaged with the miniature microscope is 4-5x larger than what could be imaged with the fiberscope. Whereas for the fiberscope the specimen FOV area that can be imaged is constrained by the fiber-bundle diameter, the FOV area that can be imaged by the miniature microscope is determined by the CMOS image sensor active area and the optical magnification.
Photon throughput

A major limitation of the fiberscope was the loss of emission photon throughput due to the fiber-bundle. Empirical characterization of the photon flux levels at various points in the system during typical brain imaging experiments had revealed a total of about 5x loss in photon throughput from the end of the fiber-bundle that was close to the specimen to the EMCCD camera. With the elimination of the fiber-bundle, the expected improvement in photon throughput, approximately 5x, has been achieved with the miniature microscope. Thus, as reported in Table 4.1, with the same level of excitation power, the 5x improvement in emission photon throughput results in an approximately 7 dB improvement in imaging fidelity (SNR). Gains from the improvement in emission photon throughput can also be realized with reductions in excitation power, thereby permitting longer imaging durations. Reduced excitation powers mitigate the effects of photobleaching and phototoxicity. For example, an approximately 5x reduction in excitation power, yielding a still satisfactory imaging fidelity of approximately 30 dB, allows for almost five times longer imaging durations, enabling the generation of much larger datasets.

From the above, it is evident that the miniature fluorescence microscope not only possesses the requisite imaging performance for in vivo brain imaging, but indeed performs unequivocally better in several aspects compared to the fiberscope, previously the state-of-the-art in fluorescence microscopy for in vivo brain imaging in freely behaving mice. The total mass of a microscope is less than two-thirds of the mass budget of 3 g that had been specified - small enough to be borne by a mouse during active behavior. The fabrication of a batch of fully-functional microscopes has served to lend credence to the assertion that the design of the miniature microscope is scalable to eventual mass-production. Therefore, the design goal that was articulated at the outset - miniaturizing the benchtop fluorescence microscope by integrating the light source, optics, filters, and camera into a single device - with the associated objectives of eliminating the fiber-bundle, and thereby realizing several performance gains, in addition to yielding an inherently scalable design, has been attained.
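The correspondence noted above between a 5x gain in emission photon throughput and an approximately 7 dB gain in SNR follows from shot-noise-limited statistics, as the following minimal check illustrates (assuming the signal is purely shot-noise limited).

```python
import math

# Shot-noise-limited SNR (as defined in Appendix I) scales as 10*log10(N_e),
# so a 5x increase in collected photoelectrons N_e adds 10*log10(5) dB.
print("SNR gain for 5x photon throughput: %.1f dB" % (10 * math.log10(5)))
```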
4.2

The overarching inspiration for the development of the miniature fluorescence microscopes was the goal of equipping neuroscientists with technology that would enable in vivo brain imaging, with fluorescence microscopy, in a freely behaving mouse, in their quest to correlate causal cellular processes with animal behavior. Two specific experimental paradigms were of particular scientific interest and served to validate the performance of the microscopes. The first was the desire to image vasculature and microcirculation in the cerebellum - the region of the brain that is involved in precisely coordinating motor control and in motor learning - during free and active mouse motor behaviors. Such a paradigm seeks to study hemodynamics at the cellular level in order to better understand the linkages between hemodynamics, brain activity, and behavior, and to inform research on cerebrovascular diseases. The second was the desire to image calcium dynamics in Purkinje neurons in the cerebellum, also as the mouse engaged in free and active behavior. This paradigm exemplifies the kind of studies correlating neural activity with corresponding animal behavior that neuroscientists are interested in. The scientific objectives of these two experimental paradigms and results from each will be elaborated upon and discussed in the sub-sections that follow. However, before proceeding with those discussions, the methods of performing in vivo brain imaging in a behaving mouse with a miniature fluorescence microscope are briefly presented.

There are two steps that need to be performed before the miniature fluorescence microscope can be used for in vivo brain imaging. The first step is preparing the mouse for imaging and creating an optical window into the brain region being imaged, over which the microscope can be placed. The next step is fluorescent labeling of specific targets, where the fluorescent label and labeling strategy depend on the nature of the experiment and the structures and processes being studied. For all vasculature and microcirculation imaging experiments, the blood plasma is fluorescently labeled, whereas for all calcium imaging experiments a fluorescent indicator that binds to Ca2+ is used. After mouse surgical preparation and fluorescent labeling, the mouse is always first imaged under anesthesia to determine the appropriate brain field-of-view to be imaged and the focal depth. With the mouse under anesthesia (isoflurane), the miniature microscope is positioned above the optical window and lowered towards the brain using a translation stage, until the fluorescently-labeled surface structures are visible under weak illumination (typically 90-200 µW illumination power at the specimen plane). After locating a suitable recording site and focal depth, the illumination is turned off and the microscope is fixed to the custom metal head plate mounted on the cranium using Cerebond adhesive and dental acrylic. The mouse is then allowed to recover from anesthesia before being placed into a 45 cm x 45 cm x 15 cm behavioral arena made of transparent acrylic. A thin layer of bedding, a few food pellets, and an exercise wheel are typically placed inside the arena to provide a comfortable environment for the mouse.
Brain imaging, with the miniature microscope fixed over the mouse brain, commences once the mouse exhibits vigorous locomotor activity, typically 15-60 min after removal from the anesthesia. To minimize the possibility of photo-induced alterations in physiology, the duration and mean power of continuous illumination are typically limited to <5 min and <600 µW for each imaging session. At least 2 min are allowed to elapse between imaging sessions, and the total imaging duration over the course of an experiment is typically around 45 min. Frame acquisition rates are typically 100 Hz for the cerebellar vasculature and microcirculation imaging experiments, and 30-46 Hz for calcium imaging studies. For recording of the corresponding mouse behavior during in vivo brain imaging, either an external video-rate monochrome CMOS camera with a high-resolution lens situated over the arena, or a video-rate color CCD camera placed adjacent to the arena, is used. The overhead camera is used with two sets of infrared LED arrays for arena illumination. The color camera is used with dim room lighting.

The LED on the miniature microscope providing excitation illumination to the brain region being imaged is powered either by an external programmable constant current source or a digital pulse-width-modulation-based (PWM) current source. On the imaging day, the LED is calibrated to determine the illumination power delivered to the specimen over a range of mean drive currents. Real-time image acquisition and display is crucial in facilitating microscope focusing. Imaging data acquired during the course of an experiment is spooled to the hard disk and then transferred to external disks after the experiment for backup and analyses. Figure 4.4 below depicts the experimental setup (with the room light turned on for purposes of taking the photograph).

Figure 4.4. Behavioral arena and experimental setup for in vivo brain imaging experiments.

4.2.1

Bulk hemodynamic responses in the cerebellar cortex are known to arise during motor behavior. How individual capillaries respond during motor activity remains unstudied but must be examined towards understanding the physiological underpinnings of the hemodynamic signals widely used for functional brain mapping. Addressing these objectives forms the basis for the need to perform high-speed, cellular-level imaging of cerebellar vasculature and microcirculation in awake behaving mice. Experiments were performed on a total of seven mice. Figure 4.5 below is a representative image of cortical vasculature from a movie. Erythrocytes flowing through the vasculature appear dark against the fluorescently-labeled blood plasma.

Figure 4.5. A single frame from a movie of microcirculation.

Figure 4.5 above is a full field-of-view (800 µm x 600 µm) single-frame image of vasculature captured with the mouse under anesthesia and serves to illustrate the kind of images of cortical vasculature, with vessels of varying diameters, that can be captured by the miniature microscope. However, of particular scientific interest is imaging vasculature with much smaller diameters (capillaries) in the cerebellar cortex and at high speeds (100 Hz) during free and active motor behavior, such that erythrocyte flow speed and changes in vessel diameters can be estimated over different behaviors.
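Erythrocyte flow speed can be estimated from such high-speed image sequences in several ways; the analysis in this work used custom MATLAB routines (see the Methods in Appendix B). Purely as an illustration of one generic, kymograph-style approach - not the analysis actually used - the sketch below estimates speed by cross-correlating intensity profiles sampled along a capillary centerline in successive frames. The pixel size and frame rate in the example are hypothetical.

```python
import numpy as np

def flow_speed_um_per_s(profiles, um_per_px, frame_rate_hz, max_shift_px=20):
    """Estimate erythrocyte flow speed from intensity profiles sampled along a
    capillary centerline in consecutive frames (one row of `profiles` per frame).

    The displacement between successive profiles is found by 1-D cross-correlation;
    |shift| (pixels) * um_per_px * frame_rate gives the speed magnitude in um/s.
    """
    shifts = []
    for a, b in zip(profiles[:-1], profiles[1:]):
        a = a - a.mean()
        b = b - b.mean()
        corr = np.correlate(b, a, mode="full")
        lags = np.arange(-len(a) + 1, len(a))
        keep = np.abs(lags) <= max_shift_px
        shifts.append(lags[keep][np.argmax(corr[keep])])
    return abs(float(np.mean(shifts))) * um_per_px * frame_rate_hz

# Synthetic demo: a texture drifting 2 px/frame along a 200-px centerline at 100 Hz.
rng = np.random.default_rng(0)
base = rng.random(400)
profiles = np.array([base[10 + 2 * k: 210 + 2 * k] for k in range(50)])
print("estimated speed: %.0f um/s"
      % flow_speed_um_per_s(profiles, um_per_px=1.25, frame_rate_hz=100.0))
```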
Quantifying modulations in blood flow speed and vessel diameters in individual capillaries during free and active mouse behavior seeks to address fundamental questions as to how hemodynamics are altered - at the level of individual vessels - during behavior. Towards that end, data from three experiments focusing on imaging lobules V and VI of the vermis, areas of the cerebellar cortex implicated in coordinating the hind- and forelimbs, were collected and analyzed. Figure 4.6 below is a frame from a composite video from an imaging session during one of the experiments, showing the behaving mouse and the corresponding brain area being imaged.

Figure 4.6. A snapshot from a composite video of a behaving mouse and the corresponding brain FOV being imaged. (Left) The mouse, with the miniature microscope mounted on the cranium, running on an exercise wheel in the behavioral arena. (Right) The corresponding brain FOV being imaged in real-time.
The experiment used for illustration in Figure 4.6 above was conducted in the manner described in the introduction to Section 4.2. The mouse engaged in a range of behaviors during the experiment, which lasted approximately 45 minutes, for example running on an exercise wheel as depicted in the snapshot above. The corresponding 370 µm x 370 µm brain FOV shown on the right was imaged at 100 Hz in real-time.

Modulations in erythrocyte flow speeds and vessel diameters over the three experiments (three mice) were analyzed across three behavioral states: periods when the mouse was awake but resting; walking about the behavioral arena; or running on the exercise wheel. Analyses revealed that although both locomotor states evoked significant hemodynamic changes compared to when the mice were at rest, running evoked greater increases in flow speeds and capillary diameters than walking. The experimental results reported above not only validate the performance of the miniature microscope for in vivo brain imaging in a mouse during active behavior, but also serve to demonstrate its use as an enabling tool for performing scientific studies seeking to test hypotheses on associations between hemodynamics, brain activity, and behavior.

4.2.2

The second in vivo brain imaging experiment serves to demonstrate how the miniature fluorescence microscopes can be used as enabling tools in studies seeking to correlate neural circuit activity with animal behavior - a longstanding goal in neuroscience. The dendritic Ca2+-spiking activity of cerebellar Purkinje neurons in the vermis was examined across different behavioral states for a total of four mice. After injection of the fluorescent Ca2+ indicator into the cerebellar cortex, the miniature fluorescence microscope placed on the mouse's cranium could provide records of Ca2+ spiking from up to 206 individual neurons concurrently. These spikes represent the Ca2+ component of Purkinje neurons' complex (Na+ and Ca2+) action potentials. Figure 4.7 below shows a snapshot from a composite video of Ca2+-spiking activity and the corresponding mouse behavior during an imaging session.

Figure 4.7. A snapshot from a composite video of Ca2+-spiking activity in Purkinje cells and corresponding mouse behavior. (Left) The mouse, with the miniature microscope mounted on the cranium, running on an exercise wheel in the behavioral arena. The video of mouse behavior was taken with a color CCD camera under dim room lighting. (Right) The corresponding brain FOV being imaged in real-time.

The experiment that is used for illustration in Figure 4.7 above was conducted in the manner described in the introduction to Section 4.2. The mouse engaged in a range of behaviors over the course of the experiment (approximately 45 minutes in duration), with wheel running again depicted in the left panel of the figure above. The corresponding brain FOV that is shown (right panel) is approximately 450 µm x 450 µm, imaged at 40 Hz, and is normalized based on the baseline fluorescence level. Temporal changes in fluorescence intensity in a neuron are a result of changes in intracellular Ca2+ concentration, that is, they correspond to its Ca2+-spiking activity. Thus, analyses of fluorescence signals from individual neurons over time permitted extraction of neuronal spike rates for all neurons in the brain FOV over different mouse behavioral states.
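The study identified individual cells and extracted their activity traces with established computational methods (see reference 7 in Appendix B). Purely as an illustration of the final step - converting fluorescence traces into per-state spike rates - the sketch below applies a naive threshold-crossing event detector to ΔF/F traces and tallies event rates by behavioral state; the threshold rule and all parameter values are simplifications, not the methods used in the study.

```python
import numpy as np

def spike_rates_by_state(dff, state_labels, frame_rate_hz, thresh_sd=3.0):
    """Per-cell event rates (events/s) grouped by behavioral state.

    dff          : array (n_cells, n_frames) of dF/F traces
    state_labels : array (n_frames,) of labels such as 'rest' or 'run'
    An 'event' is counted at each upward crossing of a threshold placed
    thresh_sd standard deviations above the cell's mean -- a simplistic
    stand-in for proper Ca2+ spike detection.
    """
    thresh = dff.mean(axis=1, keepdims=True) + thresh_sd * dff.std(axis=1, keepdims=True)
    above = dff > thresh
    onsets = above[:, 1:] & ~above[:, :-1]
    rates = {}
    for state in np.unique(state_labels):
        mask = state_labels[1:] == state
        seconds = mask.sum() / frame_rate_hz
        rates[state] = onsets[:, mask].sum(axis=1) / max(seconds, 1e-9)
    return rates

# Synthetic demo at 40 Hz: 3 cells with more events injected during 'run' frames.
rng = np.random.default_rng(1)
n_frames = 4000
labels = np.array(["rest"] * 2000 + ["run"] * 2000)
dff = 0.01 * rng.standard_normal((3, n_frames))
for c in range(3):
    dff[c, rng.choice(2000, 10, replace=False)] += 0.05            # rest-period events
    dff[c, 2000 + rng.choice(2000, 30, replace=False)] += 0.05     # run-period events
rates = spike_rates_by_state(dff, labels, frame_rate_hz=40.0)
print({state: np.round(r, 2) for state, r in rates.items()})
```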
In general, neurons exhibited higher spike rates during locomotor behavioral states, as compared to when the mouse was at rest. More broadly, such analyses of spike rates with single-neuron specificity over a population of neurons provide readouts of circuit-level neural activity with single-neuron precision, generating the underlying population statistics that permit study of patterns of neural activity with behavior.

In summary, the miniature fluorescence microscopes were successfully used as tools for research and experimentation in the primary application that they were designed for - in vivo brain imaging in a freely behaving mouse. As discussed earlier, significant performance gains were achieved over the previous fiber-bundle-based fiberscope. Enabling freer mouse movement and consequently a much broader palette of mouse behaviors, combined with the ability to perform corresponding in vivo brain imaging for durations that are long enough to gather statistically-relevant datasets during those behaviors, has dramatically expanded the scope of investigation that is possible. With an inherently scalable design that is amenable to mass-production, these microscopes can facilitate large-scale studies seeking to elucidate the neurobiological and neural circuit basis of behavior.
APPENDIX B

Prior methods for light microscopy in behaving animals have required tabletop optical instruments that appeared incompatible with most standard rodent behavioral assays and did not permit imaging many mice in parallel. The fluorescence microscope described here clears both of these challenges, since it is a miniaturized device that is integrated with all optical components and sufficiently small to be mounted on the cranium of a freely behaving mouse.

The microscope design capitalizes on recent advances in semiconductor optoelectronics that have led to mass availability of inexpensive but high-quality optical parts, including tiny but bright light-emitting diodes (LEDs) and complementary metal-oxide-semiconductor (CMOS) image sensors 11. This allowed us to incorporate all optical parts from light source to camera within one package ~2.4 cm3 in volume (Fig. 1a) (Methods). The microscope's illumination source is a blue LED that resides on a custom 6 mm x 6 mm printed circuit board (PCB) (Fig. 1b, lower right inset). A drum lens collects the LED's emissions, which then pass through a 4 mm x 4 mm optical excitation filter (Fig. 1b, lower left inset), deflect off a dichroic mirror, and enter the imaging pathway. A gradient refractive index (GRIN) objective lens focuses illumination onto the sample. Fluorescence emissions from the sample return through the objective, the dichroic, an emission filter, and an achromatic doublet tube lens that focuses the image onto a CMOS image sensor (640 x 480 pixels) mounted on a custom 8.4 mm x 8.4 mm PCB (Fig. 1b, upper right inset). The microscope housing, fabricated in polyetheretherketone (PEEK), permits focusing to sub-micron accuracy by adjustment of the camera position. The electronics allow full-frame image acquisition at 36 Hz, or up to 100 Hz over sub-regions of 300 x 300 pixels (Fig. S1). The field of view is up to 600 µm x 800 µm wide, depending on the focal position, and the lateral optical resolution is 2.5 µm (Methods). The optical design readily permits further gains in field of view and resolution to be realized as CMOS image sensors progress.

We first assessed the microscope's performance in live animals by imaging cerebral microcirculation in awake behaving mice (n = 7 mice). We focused on lobules V and VI of the vermis (Fig. 2a), areas of cerebellar cortex implicated in coordinating hind- and forelimbs 12-14. Bulk hemodynamic responses in cerebellar cortex are known to arise during motor behavior 6,15. How individual capillaries respond during motor activity remains unstudied but must be examined towards understanding the physiological basis for the hemodynamic signals widely used for functional brain mapping 16. We tracked erythrocyte flow speeds and capillary diameters in individual vessels across three behavioral states: periods when the mouse was awake but resting; walking about its enclosure; or running on an exercise wheel (Fig. 2b-h). Detailed analysis focused on a subset of mice and revealed both locomotor states evoked increases in flow speeds (running: +83 ± 20 µm/s; walking: +21 ± 11 µm/s; mean ± s.e.m.) (Fig. 2e,g) and capillary diameters (running: +0.52 ± 0.12 µm; walking: +0.27 ± 0.08 µm) (Fig. 2f,h) compared to resting periods. These changes were all significant (n = 3 mice; 97 vessel locations; p ≤ 10-; Wilcoxon signed rank tests), except for the rise in flow speed evoked by walking (p = 0.13). We had expected all capillaries within a field of view would uniformly undergo increased speeds and diameters during locomotor behavior. Surprisingly, only a spatially scattered minority of capillaries exhibited substantial up-regulation of erythrocyte speeds and vessel diameters during locomotion (Fig. 2e-h). This indicates capillaries lying within the same vascular bed are controlled non-uniformly, even for vessels separated by only tens of microns, and that changes in a subset of vessels appear to dominate aggregate effects. Future work may identify the mechanisms underlying this unexpected spatial precision in vessel regulation, towards improved understanding of cerebrovascular disorders and functional human brain imaging.

We next applied the integrated microscope to studies of dendritic Ca2+ spiking by Purkinje neurons in the cerebellar vermis of freely behaving mice. These Ca2+ spikes are elicited by climbing fiber input from the inferior olive and represent the Ca2+ component of Purkinje neurons' complex (Na+ and Ca2+) action potentials 12,17 that are thought to be crucial for cerebellar motor learning 3,4. An integrated microscope secured to the cranium was used to record Ca2+ spiking concurrently from up to >200 individual neurons following injection of the cell-permeant fluorescent Ca2+ indicator Oregon-Green-BAPTA-1-AM into cerebellar cortex (n = 4 mice) (Fig. 3a,b). This is a much larger set of Purkinje neurons than has been monitored previously in behaving animals 18,19.
Moreover, unlike multi-electrode recordings that have monitored Purkinje neurons spaced 100-200 µm apart due to electrode separations 18,19, our optical recordings were of cells densely tiling a single patch of cerebellar cortex. This sampling configuration, combined with the sheer number of neurons monitored concurrently, led us to the discovery of a previously unreported form of large-scale concerted activity within individual microzones, basic cerebellar divisions that are reproducible between subjects and each mapped to a specific part of the body 20. After using established computational methods to identify individual Purkinje neurons and extract their Ca2+ activity traces from the image data 7 (Fig. 3a,b) (Methods), we focused our initial analysis on identifying and characterizing the microzones. To find the microzones' boundaries, we examined pairwise cross-correlograms of Purkinje neurons' Ca2+-spiking activity. This revealed a spatial clustering of Purkinje neurons into local microzones of 7-36 cells with significant pairwise synchrony 17,21 (Fig. 3c). The microscope's wide field of view extended across up to 9 microzones in individual recordings. By using video records to classify the mouse's behavior into periods of rest, grooming, locomotion, or other behavior (Methods), we observed that microzones maintained their stable identities across different behaviors, as in prior work 7, and had elevated pairwise correlations with cells in neighboring as compared to distal microzones (Fig. 3c). We analyzed data from three mice in detail and found nearly all cell pairs in each microzone were significantly correlated (p < 0.01, likelihood ratio test) during rest (84% of pairs) and locomotion (98%). Only 9% of cell pairs in distinct microzones were correlated during rest, rising to 19% during locomotion. Correlation coefficients for intrazone cell pairs rose from r = 0.12 ± 0.004 (mean ± s.e.m.) during rest to r = 0.21 ± 0.004 during locomotion (n = 3985 intrazone pairs; p < 10-5, Wilcoxon signed rank test). By comparison, mean correlation coefficients for interzone cell pairs were much smaller and also similar under both behavioral conditions (r = 0.01 ± 0.001; n = 21,434 interzone pairs), but were significantly higher for cell pairs from adjacent microzones (r = 0.02-0.04; n = 6314 pairs) (Fig. 3c).

Beyond these pairwise correlations, our analysis yielded a striking discovery regarding microzones' physiological dynamics during motor behavior. Predominantly during motor activity, large cohorts of up to >30 Purkinje neurons within individual microzones fired simultaneous Ca2+ spikes (Fig. 4a). Our analysis of these large-scale events used two alternative threshold values of spiking synchrony to define microzone activation: either >35% or >50% of visible neurons within a microzone were required to spike within a 50 ms interval. Using either definition and in all behavioral conditions, microzone activation occurred vastly (by orders of magnitude) above the rates expected by chance given cells' individual spike rates (Methods). Whereas cells' mean spike rates rose 20 ± 10% (s.e.m.) during locomotion as compared to rest (n = 336 cells; p < 10-50, Wilcoxon signed rank test), the mean rate of microzone activation during locomotion (with >50% of cells in a microzone spiking together) rose to 740 ± 360% of the resting value (n = 16 microzones; p < 10-3) (Figs. 4b,c).
This value is based on a conservative overestimate of the rate of synchronized activity during rest, since we did not classify very brief or slight movements as occurring during movement periods. Close inspection of the videos often revealed a subtle movement of the mouse accompanying the synchronized Ca2+ activity that arose during periods classified as rest. During continuous motor behavior, microzone activation represented a substantial fraction of all recorded Ca2+ spikes; the cells' median percentage of Ca2+-spiking activity that arose during microzone activation was as great as >20% (Fig. 4d). These findings indicate large-scale synchronization of neuronal activity within individual microzones might be a basic dynamical motif for cerebellar control of motor behavior. Although ~50-100 µm wide in the medial-lateral dimension, microzones extend millimeters in the rostral-caudal dimension 22, suggesting that during microzone activation hundreds to thousands of Purkinje neurons could be acting simultaneously with the subset observed via the integrated microscope. Thus, the microzone activation uncovered here prompts many questions regarding the extent and mechanisms underlying the ensemble activation, the information so encoded, and its role in motor behavior.
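The definition of microzone activation used above - at least 35% or 50% of a microzone's visible neurons spiking within a 50 ms interval - can be illustrated with a short sketch. The event-counting logic and the shuffle-based chance estimate below are simplified stand-ins for the analysis described in the Methods; the spike raster, firing rates, and window settings in the example are synthetic.

```python
import numpy as np

def microzone_activation_rate(raster, frame_rate_hz, window_ms=50.0, frac=0.5):
    """Rate (events/s) at which at least `frac` of a microzone's cells spike
    within a `window_ms` interval, given a binary raster (n_cells, n_frames)."""
    n_cells, n_frames = raster.shape
    win = max(1, int(round(window_ms * frame_rate_hz / 1000.0)))
    # Number of distinct cells that spiked at least once in each sliding window.
    cum = np.cumsum(raster, axis=1)
    spiked = (cum[:, win - 1:] - np.hstack([np.zeros((n_cells, 1)), cum[:, :-win]])) > 0
    active = spiked.sum(axis=0) >= frac * n_cells
    events, i = 0, 0
    while i < active.size:                    # count non-overlapping activations
        if active[i]:
            events += 1
            i += win
        else:
            i += 1
    return events / (n_frames / frame_rate_hz)

# Synthetic microzone: 20 cells at 40 Hz with a few injected synchronous events.
rng = np.random.default_rng(2)
n_cells, n_frames, rate_hz = 20, 4000, 40.0
raster = (rng.random((n_cells, n_frames)) < 0.01).astype(int)        # ~0.4 spikes/s/cell
raster[:, 1000::400] = 1                                             # synchronous events
observed = microzone_activation_rate(raster, rate_hz)
# Chance level: destroy synchrony by shuffling each cell's spike times.
shuffled = np.array([rng.permutation(row) for row in raster])
chance = microzone_activation_rate(shuffled, rate_hz)
print("observed %.2f events/s vs. chance %.3f events/s" % (observed, chance))
```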
As posited by the famous Marr-Albus-Ito hypothesis, Purkinje neurons' Ca2+ spikes play a crucial role in motor learning and encode motor errors sent via these cells' climbing fiber inputs 3,4,23.
Theoretical and experimental exploration of this idea has focused on the dynamics of single Purkinje neurons. This may be a useful conceptual simplification, but due to their multiplicity and anatomical connectivity Purkinje neurons seem unlikely to promote learning through their individual dynamics. Concerted Ca2+ spiking within Purkinje neuron microzones offers a more plausible physiological substrate for encoding motor information that promotes learning. Our data thus suggest a re-formulation of the Marr-Albus-Ito theory in which coherent microzone activation represents the basic unit for encoding motor errors conveyed from the inferior olive, a brainstem area where excitatory neurons can synchronize via gap junctions. During animal inactivity, individual Purkinje neurons exhibit ongoing Ca2+ spikes, whereas microzone activation appears to stop or persist at a much reduced rate. Therefore, our re-formulation potentially resolves a longstanding problem in the Marr-Albus-Ito theory regarding why motor error signals would persist in the absence of motor behavior; this problem may not exist if microzones convey the error signals. When combined with behavioral assays for cerebellar motor adaptation, use of the integrated microscope for fluorescence Ca2+ imaging should permit simultaneous, direct tests of both the original and re-formulated versions of the theory.

Prior approaches to fluorescence imaging in a miniaturized format have required accessory, tabletop optical instrumentation 9,25-27. Optofluidic chips for image production have been incompatible with fluorescence contrast and required specialized assumptions about the specimen, such as that it was flowing at certain speeds at zero optical working distance 28-30. Lensless imaging devices for cytometry applications do not produce direct images but rather involve deciphering diffraction patterns of scattered light produced by cells immobilized at a fixed working distance 31. Owing to its general-purpose capabilities for fluorescence imaging, the integrated microscope stands in distinction to all prior miniaturized imaging systems. Modified designs will also permit dark-field or phase-contrast imaging. The reliance on mass-producible micro-optic and semiconductor components makes the integrated microscope amenable to broad dissemination. When multiple copies of the microscope are used together with genetically encoded fluorescent Ca2+ indicators that permit chronic imaging studies 32,33,
it should be feasible to perform long-term brain imaging in substantial numbers of mice in parallel. This will promote large-scale studies in basic neuroscience and of animal models of brain disease, by easing the acquisition of rich data sets and enabling many candidate therapeutics to be tested quickly. Brain areas where cellular-level epi-fluorescence imaging can work well in live rodents include the cerebellum, olfactory bulb 34, hippocampus 35, and neocortex 36. Beyond brain imaging, the integrated microscope is likely to be an enabling technology for a wide range of in vivo and in vitro applications. The latter usages might involve portable fluorescence assays, high-throughput image-based screens, or imaging inside other instruments such as incubators. The microscope might also be combined with other integrated components used in biotechnology, such as microfluidic or gene chips. To explore potential in vitro imaging applications, we created an array of four integrated microscopes (Fig. S2). We examined wildtype zebrafish intermingled with erbb3 mutants, which have deficits in ErbB receptor signaling that disrupt nerve myelination in the peripheral but not the central nervous system 37,38.
Images taken by the four microscopes clearly revealed the phenotypes of the mutant fish, including the absence of the posterior lateral line nerve and dorsal nerve roots that were apparent in wildtype fish (Fig. S2b). These data provide proof-of-concept evidence that arrays of integrated microscopes can underlie parallel screening strategies. Today, genetic or chemical screens using small model organisms often rely on fluorescent markers, but screening is typically done serially on a conventional microscope, which can be unsatisfactory for large sample sets or samples that must be monitored continuously. We also explored how an array of integrated microscopes might facilitate the widely used cell counting assays done in standard-format 96-well plates. Commercial cell counters often rely on imaging to determine cell numbers in well plates 40,41
and range in size from benchtop to floor-standing instruments. We tested whether the four-microscope array could provide accurate cell counts by using cultured samples of live MCF7 human breast cancer cells fluorescently labeled with carboxyfluorescein. Across ~2.5 orders of magnitude in cell density, use of the microscope array and an image segmentation algorithm (Figs. S2c, S3) yielded counts accurate to 4-16% (s.e.m.) (Fig. S2d), comparable to the accuracy of commercial counters that use digital imaging 42. This points the way to counters based on larger arrays of microscopes with much higher throughput than conventional counters.

Modern understanding of technology recognizes miniaturized integration as a pivotal advance that facilitates low-cost production and generally leads to improved performance and unanticipated new applications. This has occurred in diverse arenas including telecommunications, computing, and genomics. Integrated technologies usually progress rapidly, which motivated our choice of a design that can capitalize on upcoming improvements to CMOS image sensors. As these cover larger areas with pixels of finer size, the existing optical pathway (Fig. 1a) will support resolution as fine as 1.5 µm over fields as broad as 1.15 mm. Advances in microlenses, such as for diffraction-limited imaging, should further boost image attributes. Since CMOS technology underpins most modern electronics, 'intelligent' integrated microscopes seem likely to emerge with sensors having built-in electronic computational capabilities to facilitate rapid analysis, screening, or diagnostic evaluations.
Methods Summary

Microscope design. We made 8 miniaturized integrated microscopes of identical designs that were ~8.4 mm x 13 mm x 22 mm in volume and <2 g in mass when assembled. Each microscope had a blue-emitting LED, a dichroic and filter set for fluorescence imaging, and a CMOS image sensor. The housings were fabricated in polyetheretherketone.

Brain imaging. All procedures were approved by the Stanford APLAC. We used male CD-1 wildtype mice 7-14 weeks old. Surgery was done under isoflurane anesthesia as described 9. In brief, 1-6 days prior to imaging the skull was exposed and cleaned above the cerebellum. A metal plate allowing cranial access was fixed to the skull with dental acrylic. On imaging day, a craniotomy (1.5-2.5 mm diameter) was opened, filled with agarose, and covered by a coverslip. For Ca2+ imaging, cerebellar cortex was fluorescently labeled by bolus-loading the Ca2+ indicator Oregon-Green-488-BAPTA-1-acetoxymethyl 6,9.
For circulatory studies, we fluorescently labeled the blood plasma by tail-vein injection of fluorescein-dextran. We positioned the microscope above the coverslip. After locating a suitable recording site, we fixed the microscope to the metal head plate. Imaging began once the mouse exhibited vigorous locomotor activity, ~15-60 min after removal from isoflurane. Analysis was done using custom routines written in MATLAB.

Zebrafish imaging. 50 zebrafish, 5 days post-fertilization and comprising an unknown mix of wild type and Schwann cell-deficient ErbB mutants, were fixed, fluorescently immunostained for myelin basic protein, and placed on slides. Four microscopes in an array were positioned over the fish. All LEDs were set to deliver ~600 µW to the specimen. We took images in sets of four, along with background images to correct for any illumination non-uniformities or background fluorescence originating from the slides.
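The background images mentioned above lend themselves to a simple flat-field style correction: subtracting the slide/background fluorescence and normalizing by the illumination profile. The sketch below shows one generic way such a correction could be applied; it is illustrative only and not the processing actually used in the study.

```python
import numpy as np

def correct_image(raw, background, eps=1e-6):
    """Subtract slide/background fluorescence and divide by the normalized
    illumination profile estimated from the same background frame."""
    raw = raw.astype(float)
    background = background.astype(float)
    flat = background / max(background.mean(), eps)      # illumination profile, mean ~1
    return (raw - background) / np.clip(flat, eps, None)

# Toy example: vignetted illumination plus one bright specimen feature.
yy, xx = np.mgrid[0:128, 0:128]
illum = 1.0 - 0.5 * (((xx - 64) ** 2 + (yy - 64) ** 2) / 64.0 ** 2)
background = 100.0 * illum
feature_mask = (np.abs(xx - 64) < 5) & (np.abs(yy - 40) < 30)
raw = background + 500.0 * illum * feature_mask
corrected = correct_image(raw, background)
feature = corrected[12:68, 60:68]
print("corrected feature: mean %.0f, spread %.2f" % (feature.mean(), feature.std()))
```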
Cell counting. MCF7 cancer cells were labeled with carboxyfluorescein diacetate, succinimidyl ester, serially diluted to 6 different concentrations, placed into the wells of a 96-well plate, and imaged by the 4-microscope array. Counting was done by image analysis.
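Figure S3 describes the counting algorithm as contrast equalization, binarization, and iterative rounds of erosion and dilation to split clusters of cells. The sketch below is a simplified stand-in for that procedure - threshold, label connected components, and iteratively erode over-sized components until they split - with all thresholds and size limits chosen arbitrarily for the toy example; it is not the algorithm used in the study.

```python
import numpy as np
from scipy import ndimage

def count_cells(image, thresh_sd=2.0, min_area_px=20, max_single_area_px=200):
    """Threshold, label connected components, and iteratively erode over-sized
    components (likely clusters) until they split into separate cells."""
    mask = image > image.mean() + thresh_sd * image.std()
    labels, n = ndimage.label(mask)
    count = 0
    for i in range(1, n + 1):
        blob = labels == i
        area = int(blob.sum())
        if area < min_area_px:
            continue                                     # reject specks
        if area <= max_single_area_px:
            count += 1                                   # single cell
            continue
        eroded, k = blob, 1                              # cluster: erode until it splits
        while k == 1 and eroded.any():
            eroded = ndimage.binary_erosion(eroded)
            _, k = ndimage.label(eroded)
        count += max(k, 1)
    return count

# Toy image: 11 Gaussian "cells", two of which overlap into a single cluster.
rng = np.random.default_rng(3)
img = rng.normal(10.0, 1.0, (256, 256))
yy, xx = np.mgrid[0:256, 0:256]
centers = [(40, 40), (40, 128), (40, 216), (128, 40), (128, 128), (128, 216),
           (216, 40), (216, 128), (216, 216), (80, 176), (80, 192)]
for cy, cx in centers:
    img += 50.0 * np.exp(-((xx - cx) ** 2 + (yy - cy) ** 2) / (2.0 * 4.0 ** 2))
print("counted cells:", count_cells(img))
```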
Figure Legends

Figure 1 | Design and validation of an integrated fluorescence microscope. a. Computer-assisted design of an integrated microscope in cross-section. Blue and green arrows mark illumination and emission pathways, respectively. b. Assembled integrated microscope. Insets show, clockwise from bottom left: filter cube holding dichroic mirror and excitation and emission filters; printed circuit board (PCB) holding the complementary metal-oxide-semiconductor (CMOS) camera chip; PCB holding the light emitting diode (LED) illumination source. Scale bar is 5 mm and applies to all panels and insets in (a,b).

Figure 2 | Microcirculatory dynamics and regulation studied in freely behaving mice. a. Microvasculature in cerebellar cortex of a freely behaving mouse, after intravascular injection of fluorescein-dextran dye. The image shown is the standard deviation of a 10 s movie, a computation that highlights vasculature (Methods). Colored dots mark the locations of corresponding dynamical measurements in (c, d). Scale bars in (a, b) are 50 µm. b. Map of erythrocyte flow speeds for vessels of (a). c, d. Erythrocyte flow speeds, (c), and vessel diameter changes, (d), for the 4 vessels marked in (a). Blue shading marks periods of movement about the cage. Red shading indicates running on an exercise wheel. White shading marks when the mouse rested or barely moved. Black vertical lines separate different records from the same mouse and specimen field. e, f. Erythrocyte speeds, (e), and vessel diameters, (f), compared between rest and periods of motor activity. Each datum represents a single location in the vasculature. Data points above the diagonal indicate up-regulation in speed or diameter during motor activity (blue points, movement about the cage; red points, wheel running). Shaded blue and red areas demarcate a 1 s.d. estimate of measurement fluctuations calculated using the data below the diagonal. g, h. Cumulative histograms of changes in vessel flow speeds, (g), and diameters, (h), during walking (blue) or running (red), compared to resting periods. (Insets) Mean ± s.e.m. changes compared to rest. (*) Indicates significant difference between walking and running (p < 10-2); (†) indicates significant difference from rest (p < 10-3), using Wilcoxon signed-rank tests. Histogram portions above and to the right of areas enclosed by colored dashed lines represent data for vessel locations lying above the color-corresponding shaded areas in (e, f). Data in (e-h) comprise 97 measurement locations from 3 mice.

Figure 3 | Ca2+-spiking dynamics of cerebellar Purkinje neurons during motor behavior. a. Contours of 206 Purkinje neuron dendritic trees identified in a freely behaving mouse, superimposed on an averaged fluorescence image of the cerebellar surface after injection of the Ca2+ indicator Oregon-Green-BAPTA-1-AM. Each color indicates one of nine identified microzones. Filled contours mark neurons whose activity is shown in b. Scale bar is 100 µm. b. Relative changes in fluorescence (ΔF/F) from filled neurons marked in (a), numbered as in (c). Black dots mark detected Ca2+ spikes. Scale bars: 1 s (horizontal); 3% ΔF/F (vertical). c. Spike train correlation coefficients for pairs of neurons, for three different behaviors: resting, grooming, and locomotion. Colored outlines indicate local microzones identified by cluster analysis of the correlation coefficients and correspond to the microzones shown in (a).
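As noted in the legend of Figure 2 above, the displayed vasculature image is the per-pixel standard deviation of a 10 s movie: pixels over vessels fluctuate as erythrocytes pass and therefore appear bright in the projection. A minimal sketch of this computation, using a synthetic movie in place of real data, follows.

```python
import numpy as np

def std_projection(movie):
    """Per-pixel standard deviation over time for a movie shaped (frames, y, x).
    Pixels over vessels fluctuate as erythrocytes pass, so they appear bright."""
    return movie.astype(float).std(axis=0)

# Synthetic 10 s movie at 100 Hz with one fluctuating "vessel" column.
rng = np.random.default_rng(4)
movie = rng.normal(100.0, 1.0, (1000, 64, 64))
movie[:, :, 32] += 20.0 * rng.random((1000, 64))
proj = std_projection(movie)
print("vessel column std %.1f vs. background %.1f" % (proj[:, 32].mean(), proj[:, 31].mean()))
```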
Figure 4 | Cerebellar microzones exhibit large-scale, synchronized Ca2+ spiking during motor behavior. a. Ca2+ spike (black dots) raster plots for neurons of Fig. 3a. Colored shading indicates the mouse's behavior (peach, grooming; green, locomotion; gray, resting; other small movements, blue). Microzone rasters (colored dots) show Ca2+ spiking by >35% (open dots) or >50% (closed dots) of neurons identified in each microzone. Inset expands the marked interval. Scale bars are 5 s. b. Mean ± s.e.m. rates of individual neuronal spiking (top) and synchronized microzone activation (bottom: >35% cells synchronized, unfilled bars; >50% cells, solid bars) reveal significant differences between behaviors (orange, grooming; green, locomotion; gray, rest) (p < 10-2, neuronal spike rates; p < 10-4, rates of synchronized activation; Wilcoxon signed rank tests). c. Spike rates for individual cells (small data points) and synchronized microzonal activation (>35% cells, large open points; >50% cells, large solid points) plotted for periods of grooming vs. rest (yellow points) or locomotion vs. rest (green points). d. Cumulative histograms of percentages of cells' spikes occurring during synchronized activation (>35% activation, open points; >50% activation, solid points; locomotion, green; grooming, yellow; resting, gray). Data of (b-d) are from 3 mice, 336 cells and 16 microzones.

Figure S1 | Schematic of electronics for real-time image acquisition and control. Two printed circuit boards (PCBs) are contained within each integrated microscope (Fig. 1), one for the light emitting diode (LED) and one for the complementary metal-oxide-semiconductor (CMOS) camera chip. These boards are both connected to a custom external PCB via nine thin and flexible wires (2 wires to the LED board and 7 to the camera) that are all encased in a single polyvinyl chloride (PVC) sheath of outer diameter 1.5 mm. The external PCB interfaces with a computer via a general-purpose USB imaging data acquisition adapter PCB (Aptina Imaging, San Jose, CA), enabling real-time microscope control and data acquisition as well as immediate display of images. The labeled components and signaling pathways shown here provide an overview of the circuitry responsible for providing power, controlling the LED, controlling the CMOS camera, acquiring images, and transferring data. Abbreviations: PD, flash programming device; OSC, quartz crystal oscillator; I2C, two-wire Inter-Integrated Circuit serial communication interface.

Figure S2 | An array of integrated microscopes can facilitate parallel imaging applications. a. Schematic of four microscopes assembled in an array. Scale bar is 1 cm. b. Multiple microscopes reveal phenotypes of wild-type and erbb3 mutant zebrafish after fluorescence immunolabeling of myelin basic protein with Alexa-488. White arrowheads mark spinal cords. Yellow arrowheads mark the posterior lateral nerve in wildtype fish. In erbb3 fish Schwann cells fail to develop on this nerve. Each image was taken by a different microscope and has undergone subtraction of background fluorescence. Scale bar of 50 µm applies to all panels. c. Integrated microscopes enable accurate cell counting assays in 96-well plates. A base concentration (C0 ~ 4.0 x 10^5 cells/mL) of live MCF7 human breast cancer cells labeled with carboxyfluorescein was diluted, with 8 sample wells for each of 6 concentrations. An automated algorithm counted cells in the images (Fig. S3). The panels show counting results,
The panels show counting results, WO 2012/027586 PCT/US2011/049180 80 with individual cells identified at 4 different concentrations shown in color. The raw images were taken by four different microscopes. Scale bar of 100 pim applies to all images. d. Measured cell concentrations versus the expectation based on sample dilution. A linear fit (solid line) has r 2 = 0.995. Error bars are s.e.m. over the 8 samples from each dilution. Figure S3 I An iterative image segmentation algorithm for cell counting. Successive stages of analysis within a custom cell counting algorithm (Methods). The raw image used here for illustration led to the cell count shown in the far left panel of Fig. S2c. (a) A raw fluorescence image (top) of live MCF7 human breast cancer cells labeled with carboxyfluorescein underwent contrast equalization (middle). The image was then converted to binary format and underwent an intial segmentation (bottom), in which areas identified as single cells are marked in color. Further analysis of the area within the white rectangle is illustrated in (b). The scale bar is 100 pm. (b) Iterative rounds of morphological filtering segmented the clusters of multiple cells that remained after the initial segmentation (top) into individual cells (middle and bottom). The area shown corresponds to the sub-region contained within the white rectangle of (a, lower). Results from two iterations of erosion and dilation are shown. All cells in view were counted by the second iteration. The scale bar is 50 im. Movie 1 I Mouse behavior and microcirculation in the cerebellar vermis recorded concurrently in a mouse with an integrated microscope mounted on the cranium. This movie presents the simultaneous video clips of mouse behavior and microcirculation in the vermis for two example behaviors. The first example shows the mouse walking about the behavioral arena. The second example shows the mouse running on an exercise wheel.
Behavioral data (left panel) were recorded at 30 Hz with an overhead camera and infrared illumination. Microcirculation (right panel) was recorded using the integrated microscope at 100 Hz following an intravenous injection of FITC-dextran. This fluorescent dye brightly labels the blood plasma, allowing erythrocytes to be seen in dark relief. Individual erythrocytes are apparent flowing through the capillaries. Scale bar is 100 µm.
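The Figure 4 legend describes flagging imaging frames in which more than 35% (or 50%) of a microzone's identified neurons emit a Ca2+ spike together, and then comparing per-behavior activation rates with Wilcoxon signed-rank tests. A minimal Python sketch of that style of thresholded-synchrony analysis follows; it is illustrative only, and the raster layout, event definition, and placeholder data are assumptions rather than the analysis code behind the figure.

Python sketch:

import numpy as np
from scipy.stats import wilcoxon

def synchronized_frames(spikes, threshold=0.35):
    # spikes: boolean array of shape (n_cells, n_frames) holding Ca2+ spike
    # onsets for one microzone. A frame counts as a synchronized activation
    # when more than `threshold` of the cells spike in that frame.
    fraction_active = spikes.mean(axis=0)
    return fraction_active > threshold

def activation_rate(sync_frames, frame_rate_hz):
    # Rate of synchronized activations (events/s); rising edges are counted
    # so that a sustained activation is treated as a single event.
    padded = np.concatenate(([0], sync_frames.astype(int)))
    onsets = np.flatnonzero(np.diff(padded) == 1)
    return onsets.size / (sync_frames.size / frame_rate_hz)

# Paired comparison of activation rates across microzones for two behaviors
# (placeholder data; one rate per microzone per behavior).
rng = np.random.default_rng(0)
rates_locomotion = rng.gamma(2.0, 0.05, size=16)
rates_rest = rng.gamma(2.0, 0.02, size=16)
stat, p = wilcoxon(rates_locomotion, rates_rest)
print(f"Wilcoxon signed-rank: statistic={stat:.1f}, p={p:.3g}")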
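Figure S3 outlines a counting pipeline of contrast equalization, binarization, an initial segmentation, and iterative erosion to split remaining clusters of cells. The sketch below, written against scikit-image, shows one way such a pipeline can be assembled; the area threshold, structuring-element radius, and iteration count are illustrative assumptions, not the parameters used in the patent.

Python sketch:

import numpy as np
from skimage import exposure, filters, measure, morphology

def count_cells(image, max_single_cell_area=400, iterations=2):
    # Contrast equalization followed by Otsu thresholding.
    equalized = exposure.equalize_adapthist(image)
    binary = equalized > filters.threshold_otsu(equalized)
    binary = morphology.remove_small_objects(binary, min_size=20)

    labels = measure.label(binary)
    count = 0
    for region in measure.regionprops(labels):
        if region.area <= max_single_cell_area:
            count += 1  # small component: accept as a single cell
            continue
        # Large component: treat as a cluster and erode iteratively until
        # it splits into pieces or shrinks to single-cell size.
        cluster = labels == region.label
        for _ in range(iterations):
            cluster = morphology.binary_erosion(cluster, morphology.disk(3))
            if measure.label(cluster).max() > 1 or cluster.sum() <= max_single_cell_area:
                break
        count += max(measure.label(cluster).max(), 1)
    return count

Counts from this kind of routine, applied to each of the 8 wells per dilution, could then be regressed against the expected concentrations with an ordinary least-squares fit, which is the comparison for which the Figure S2 legend reports r^2 = 0.995.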
[Patent drawing sheets (figure panels for Figs. 1-4 and S1-S3; only the panel labels are recoverable from the text layer): microscope cross-section (CMOS camera, focusing mechanism, achromat, dichroic mirror, excitation filter, objective); behavioral and microcirculation plots (time (s), wheel speed (mm/s), vessel diameter (µm), flow speed, resting/grooming/locomotion, cell index); electronics schematic (LED PCB, field-programmable gate array, CMOS camera PCB, from/to computer, power regulators); zebrafish panels (wild type, erbb3 mutant); cell-counting panels (C0, C0/2, C0/4, C0/8; corrected concentration (cells/µL)); segmentation panels (fluorescence image, after contrast equalization, initial cell segmentation, iterations 1 and 2).]

Claims (31)

1. An epifluorescence microscope comprising: an image capture circuit including an array of optical sensors; an optical arrangement configured (i) to direct excitation light of less than about 1 mW over an area that is at least 0.5 mm² encompassed within a field of view which comprises a target object, and (ii) to direct epi-fluorescence emission caused by the excitation light to the array of optical sensors, the optical arrangement and array of optical sensors each being sufficiently close to the target object to provide at least 2.5 µm resolution for an image of the field of view.
2. The microscope of claim 1, wherein the optical arrangement includes an objective lens, a light-emitting diode, and a CMOS image sensor array, each contained within an integrated housing of the microscope that is less than a cubic inch in size.
3. The microscope of claim 1, wherein the optical arrangement includes an objective lens, a light-emitting diode, and a CMOS image sensor array, each contained within an integrated housing, wherein the optical arrangement and array of optical sensors weighs less than 2 grams.
4. The microscope of claim 1, wherein optical magnification at the image capture circuit and of the target object is less than or equal to 5.
5. The microscope of claim 1, wherein the optical arrangement is configured to provide self-alignment between a light source providing the excitation light, the array of optical sensors and an objective lens.
6. The microscope of claim 1, wherein the optical arrangement is configured to provide focusing of an image by adjustment of a distance between the array of optical sensors and an objective lens during real-time imaging.
7. The microscope of claim 1, further including at least one optical filter element and light source generating the excitation light, and wherein the at least one optical filter element, the light source and the array of optical sensors is configured to detach and reattach to the microscope.
8. The microscope of claim 1, wherein the optical arrangement and the array of optical sensors are each sufficiently close to the target object to provide at least 2.5 µm resolution for generation of images of the field of view at a rate of at least 36 Hz.
9. The microscope of claim 1, wherein the microscope, including the image capture circuit and the optical arrangement, weighs 3 grams or less.
10. The microscope of claim 2, wherein the microscope, including the image capture circuit and the optical arrangement, weighs less than 2 grams, and wherein the light emitting diode contained within the housing is configured to provide the excitation light.
11. The microscope of claim 1, wherein the optical arrangement includes an objective lens and a light source emitting the excitation light and is contained within a housing of the microscope that is less than a cubic inch in size; and wherein the image capture circuit and the optical arrangement, including the light source, weigh less than 2 grams.
12. The microscope of claim 11, wherein the excitation light provided by the light source is directed to the objective lens by one or more excitation elements of the optical arrangement.
13. The microscope of claim 12, wherein the objective lens is configured to direct the excitation light to the target object and focus the excitation light onto the field of view.
14. The microscope of claim 13, wherein the optical arrangement further includes one or more emission elements configured to provide a focal plane from epifluorescent emission light received from the target object via the objective lens.
15. The microscope of claim 14, wherein the image capture circuit is provided at the focal plane and configured to capture the image of the field of view that includes multiple individual capillary blood vessels with sufficient resolution to distinguish individual capillary blood vessels from one another.
16. The microscope of claim 15, wherein the image of the field of view has sufficient resolution to distinguish individual erythrocytes flowing through the individual capillary blood vessels.
17. The microscope of claim 15, wherein the one or more emission elements includes an achromatic lens located between the objective lens and the image capture circuit, wherein the achromatic lens is configured to receive collimated epifluorescent light from the objective lens and focus the collimated epifluorescent light onto the focal plane at which the image capture circuit is provided.
18. The microscope of claim 11, wherein the target object is a brain of a mouse and the microscope is configured to be mounted on a head of the mouse and provide in vivo brain imaging during awake behavior, wherein preferably the awake behavior includes locomotion of the mouse, and the image of the field of view includes multiple individual capillary blood vessels of the brain of the mouse with sufficient resolution to distinguish individual capillary blood vessels from one another during the locomotion.
19. The microscope of claim 15, wherein a distance from the objective lens to the field of view of the target object includes a distance, such that the image of the field of view is in focus, of 50-250 µm.
20. The microscope of claim 11, wherein a distance from the objective lens to the image capture circuit is less than 1 inch.
21. The microscope of claim 13, wherein the light source is configured to provide the excitation light from the light source within the housing to the target object without use of a fiber optic.
22. The microscope of claim 11, wherein no fiber optics are provided within the housing.
23. The microscope of claim 11, wherein the housing is formed from a black material or includes a layer of absorbent material.
24. The microscope of claim 15, wherein the housing comprises a threaded interface between a portion of the housing holding the image capture circuit and a portion of the housing holding the objective lens, the threaded interface configured to provide fine adjustment of a distance between the image capture circuit and the objective lens to adjust the focal plane.
25. The microscope of claim 15, wherein the optical arrangement includes a dichroic mirror configured to reflect the excitation light from the light source to the objective lens and to pass the epifluorescent emission light from the objective lens to the image capture circuit.
26. The microscope of claim 11, further comprising a synchronization circuit configured to interface with an external device that displays the image of the field of view, wherein preferably the external device is configured to provide control of the light source.
27. The microscope of claim 2, wherein the objective lens is a gradient refractive index (GRIN) lens, wherein preferably the GRIN lens has at least one of the following characteristics: a diameter of about 2mm, a pitch length of about 0.245, or a numerical aperture of about 0.45.
28. The microscope of claim 12, wherein the microscope is configured and arranged to capture video images of the field of view of the target object.
29. A method of using the epifluorescence microscope of claim 11, comprising the steps of attaching and reattaching the epifluorescence microscope to a base plate of a supportive structure for allowing precise microscope alignment for repeated imaging of a common imaging location during chronic experiments.
30. An epifluorescence microscope comprising: an optical light source configured to produce excitation light from an energy source that provides less than 6 mW, wherein the excitation light is directed over an area that is at least 0.5 mm² encompassed within a field of view which comprises a target object; an imaging circuit means including a sensor array; and an objective lens configured to operate sufficiently close to the optical light source, the image sensor array and the target object to provide at least a 2.5 µm resolution image of the field of view that is at least 0.5 mm².
31. The microscope of claim 30, further comprising a housing that is less than one cubic inch in size, the housing containing the imaging circuit and the optical light source, wherein preferably the objective lens is configured to direct collimated excitation light from the target object. THE BOARD OF TRUSTEES OF THE LELAND STANFORD JUNIOR UNIVERSITY WATERMARK PATENT AND TRADEMARK ATTORNEYS P37172AU00
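As an illustrative, non-authoritative consistency check on the numbers recited above (it is not part of the claims), the claimed 2.5 µm resolution, magnification of at most 5, and 0.5 mm² field of view can be related to the image sensor through Nyquist sampling: features of object-plane size δ imaged at magnification M subtend Mδ at the sensor, so the pixel pitch p must satisfy p ≤ Mδ/2, and covering an object-plane area A_FOV requires a sensor area of at least M²·A_FOV. Taking M = 5 and δ = 2.5 µm from the claims:

\[
p \;\le\; \frac{M\,\delta}{2} \;=\; \frac{5 \times 2.5\ \mu\mathrm{m}}{2} \;=\; 6.25\ \mu\mathrm{m},
\qquad
A_{\mathrm{sensor}} \;\ge\; M^{2} A_{\mathrm{FOV}} \;=\; 25 \times 0.5\ \mathrm{mm}^{2} \;=\; 12.5\ \mathrm{mm}^{2}.
\]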
AU2011293269A 2010-08-27 2011-08-25 Microscopy imaging device with advanced imaging properties Active AU2011293269B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU2015272003A AU2015272003B2 (en) 2010-08-27 2015-12-21 Microscopy imaging device with advanced imaging properties

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US37759110P 2010-08-27 2010-08-27
US61/377,591 2010-08-27
PCT/US2011/049180 WO2012027586A2 (en) 2010-08-27 2011-08-25 Microscopy imaging device with advanced imaging properties

Related Child Applications (1)

Application Number Title Priority Date Filing Date
AU2015272003A Division AU2015272003B2 (en) 2010-08-27 2015-12-21 Microscopy imaging device with advanced imaging properties

Publications (2)

Publication Number Publication Date
AU2011293269A1 AU2011293269A1 (en) 2013-04-04
AU2011293269B2 true AU2011293269B2 (en) 2015-10-01

Family

ID=44838762

Family Applications (2)

Application Number Title Priority Date Filing Date
AU2011293269A Active AU2011293269B2 (en) 2010-08-27 2011-08-25 Microscopy imaging device with advanced imaging properties
AU2015272003A Active AU2015272003B2 (en) 2010-08-27 2015-12-21 Microscopy imaging device with advanced imaging properties

Family Applications After (1)

Application Number Title Priority Date Filing Date
AU2015272003A Active AU2015272003B2 (en) 2010-08-27 2015-12-21 Microscopy imaging device with advanced imaging properties

Country Status (8)

Country Link
US (7) US9195043B2 (en)
EP (2) EP2609742A4 (en)
JP (1) JP2013545116A (en)
CN (1) CN103765289B (en)
AU (2) AU2011293269B2 (en)
CA (3) CA2943966C (en)
GB (1) GB2483963B (en)
WO (1) WO2012027586A2 (en)

Families Citing this family (47)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2609742A4 (en) 2010-08-27 2015-07-08 Univ Leland Stanford Junior Microscopy imaging device with advanced imaging properties
US20120105949A1 (en) * 2010-11-02 2012-05-03 Eric B Cummings Additive Manufacturing-Based Compact Epifluorescence Microscope
WO2013033954A1 (en) 2011-09-09 2013-03-14 深圳市大疆创新科技有限公司 Gyroscopic dynamic auto-balancing ball head
US20140043462A1 (en) 2012-02-10 2014-02-13 Inscopix, Inc. Systems and methods for distributed video microscopy
US9955871B2 (en) * 2012-03-21 2018-05-01 Korea Electrotechnology Research Institute Transmitted light detection type measurement apparatus for skin autofluorescence
US9723990B2 (en) * 2012-03-21 2017-08-08 Korea Electro Technology Research Institute Transmitted light detection type measurement apparatus for skin autofluorescence
MX337140B (en) 2012-04-03 2016-02-12 Illumina Inc Integrated optoelectronic read head and fluidic cartridge useful for nucleic acid sequencing.
EP2660639B1 (en) * 2012-05-02 2016-04-13 Centre National De La Recherche Scientifique Method and apparatus for single-particle localization using wavelet analysis
WO2014072831A2 (en) * 2012-11-02 2014-05-15 Cesacar Holding, S.L. Fluorescence coloring for eye surgery
US20150309295A1 (en) * 2012-11-05 2015-10-29 Inscopix, Inac. Miniaturized imaging devices, systems and methods
GB2511483B (en) * 2013-01-15 2016-11-23 Coolled Ltd LED Illumination
JP6166049B2 (en) * 2013-01-25 2017-07-19 浜松ホトニクス株式会社 Photodetection device and photodetection method
CN103676122A (en) * 2013-06-24 2014-03-26 张晨 Small fluorescent/bright field optical imaging system, optical imaging method and purpose thereof
CN104251811A (en) * 2013-06-28 2014-12-31 西门子医疗保健诊断公司 Digital microscope and image identification method
JP6239881B2 (en) * 2013-07-10 2017-11-29 浜松ホトニクス株式会社 Image acquisition apparatus and image acquisition method
US8903568B1 (en) 2013-07-31 2014-12-02 SZ DJI Technology Co., Ltd Remote control method and terminal
CN104459098B (en) * 2013-09-23 2016-11-23 西门子医疗保健诊断公司 A kind of diagnostic equipment gathering medical science sample image
JP2016541026A (en) 2013-10-08 2016-12-28 エスゼット ディージェイアイ テクノロジー カンパニー リミテッドSz Dji Technology Co.,Ltd Apparatus and method for stabilization and vibration reduction
US20170055837A1 (en) * 2014-05-05 2017-03-02 Health Research, Inc. Clinical Intravital Microscope
US10016136B2 (en) 2014-06-20 2018-07-10 Optomak, Inc. Image relaying cannula with detachable self-aligning connector
CN105629255B (en) * 2014-11-03 2019-02-12 信泰光学(深圳)有限公司 Rangefinder
US10520792B2 (en) * 2014-11-03 2019-12-31 Sintai Optical (Shenzhen) Co., Ltd. Range finder
US10292592B2 (en) 2014-11-13 2019-05-21 The Board Of Trustees Of The Leland Stanford Junior University Method and apparatus for optical recording of biological parameters in freely moving animals
CN107111013B (en) * 2014-12-08 2020-11-06 Trw汽车美国有限责任公司 Compact modulation transfer function evaluation system
US11112592B2 (en) * 2015-04-24 2021-09-07 Board Of Supervisors Of Louisiana State University And Agricultural And Mechanical College Fine focus microscope control
CN108351301B (en) * 2015-09-02 2021-03-09 英思克斯公司 System and method for color imaging
DE102015116488A1 (en) 2015-09-29 2017-03-30 Carl Zeiss Microscopy Gmbh Microscope and lighting device and lighting set for a microscope
CN108474736B (en) 2015-11-05 2022-01-25 英思克斯公司 System and method for optogenetic imaging
WO2020206362A1 (en) 2019-04-04 2020-10-08 Inscopix, Inc. Multi-modal microscopic imaging
US9846300B2 (en) 2016-01-08 2017-12-19 Optomak, Inc. Microscope with multiple image sensors for fluorescence imaging of multiple locations and/or wavelengths
US9791683B2 (en) 2016-01-08 2017-10-17 Optomak, Inc. Microscope with multiple illumination channels for optogenetic stimulation and fluorescence imaging
US10274712B2 (en) 2016-01-08 2019-04-30 Optomak, Inc. Microscope for fluorescence imaging with variable focus
JP6242447B2 (en) * 2016-08-04 2017-12-06 オリンパス株式会社 Microscope system
US10386623B2 (en) 2016-09-13 2019-08-20 Inscopix, Inc. Adapter for microscopic imaging
US10568695B2 (en) * 2016-09-26 2020-02-25 International Business Machines Corporation Surgical skin lesion removal
KR102621221B1 (en) * 2017-05-16 2024-01-08 리서치 디벨럽먼트 파운데이션 Devices and methods for identifying endometrial tissue
JP7159216B2 (en) * 2017-05-19 2022-10-24 ザ ロックフェラー ユニヴァーシティ Imaging signal extractor and method of using same
WO2019010348A1 (en) * 2017-07-06 2019-01-10 The Johns Hopkins University A miniature microscope for multi-contrast optical imaging in animals
DE102017127931A1 (en) * 2017-11-27 2019-05-29 Henke-Sass, Wolf Gmbh Optics arrangement for an endoscope and endoscope with such an optical arrangement
JP6969459B2 (en) * 2018-03-15 2021-11-24 オムロン株式会社 The sensor head
CN108982444A (en) * 2018-07-04 2018-12-11 浙江大学 A kind of short-wave infrared fluorescence microimaging systems of LED excitation
CN109342369A (en) * 2018-10-26 2019-02-15 中国科学院苏州生物医学工程技术研究所 The big visual field bio-imaging that is quickly detected for circulating tumor cell, scanning, analytical equipment
DE102018127339A1 (en) * 2018-11-01 2020-05-07 Leibniz-Institut für Neurobiologie Magdeburg Image capture device, system and method for image capture
CN110018558A (en) * 2019-05-24 2019-07-16 福州大学 A kind of Portable fluorescence microscope and its working method
KR102252113B1 (en) 2019-09-19 2021-05-17 한국과학기술연구원 Neural probe structure for measuring multiple fluorescence signals and manufacturing method thereof
EP3805838B1 (en) * 2019-10-10 2023-12-06 Leica Instruments (Singapore) Pte. Ltd. Microscope and related apparatuses, methods and computer programs
WO2021216958A1 (en) * 2020-04-24 2021-10-28 The Regents Of The University Of California Devices and methods for two-dimension (2d)-based protein and particle detection

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090072171A1 (en) * 2003-08-15 2009-03-19 Massachusetts Institute Of Technology Systems and methods for volumetric tissue scanning microscopy
US20090237501A1 (en) * 2008-03-19 2009-09-24 Ruprecht-Karis-Universitat Heidelberg Kirchhoff-Institut Fur Physik method and an apparatus for localization of single dye molecules in the fluorescent microscopy

Family Cites Families (122)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US88145A (en) 1869-03-23 Improvement in blind-wiring machine
US3762801A (en) 1972-02-24 1973-10-02 Polaroid Corp Compact three component objective lenses
US5212593A (en) 1992-02-06 1993-05-18 Svg Lithography Systems, Inc. Broad band optical reduction system using matched multiple refractive element materials
IL108352A (en) 1994-01-17 2000-02-29 Given Imaging Ltd In vivo video camera system
US5798222A (en) 1995-07-17 1998-08-25 Guava Technologies, Inc. Apparatus for monitoring substances in organisms
US5907425A (en) 1995-12-19 1999-05-25 The Board Of Trustees Of The Leland Stanford Junior University Miniature scanning confocal microscope
US6396941B1 (en) 1996-08-23 2002-05-28 Bacus Research Laboratories, Inc. Method and apparatus for internet, intranet, and local viewing of virtual microscope slides
WO1998055026A1 (en) 1997-06-05 1998-12-10 Kairos Scientific Inc. Calibration of fluorescence resonance energy transfer in microscopy
US6324418B1 (en) 1997-09-29 2001-11-27 Boston Scientific Corporation Portable tissue spectroscopy apparatus and method
KR20010040418A (en) 1998-01-26 2001-05-15 자밀라 제트. 허벡 Fluorescence imaging endoscope
EP1079890A4 (en) 1998-05-08 2008-12-03 Genetronics Inc Electrically induced vessel vasodilation
CA2280398C (en) * 1998-10-26 2009-01-20 Lothar Lilge A semiconductor based excitation illuminator for fluorescence and phosphorescence microscopy
US6653651B1 (en) * 1998-12-09 2003-11-25 Carl D. Meinhart Micron resolution particle image velocimeter
US6907390B1 (en) * 1998-12-14 2005-06-14 Smiths Detection Inc. Miniaturized opto-electronic magnifying system
US6005720A (en) 1998-12-22 1999-12-21 Virtual Vision, Inc. Reflective micro-display system
US20030170908A1 (en) 2000-07-28 2003-09-11 Bright Frank V. Method for making microsensor arrays for detecting analytes
RU2182328C2 (en) 2000-02-17 2002-05-10 Институт молекулярной биологии им. В.А. Энгельгардта РАН Fluorescent microscope
US20060008799A1 (en) 2000-05-22 2006-01-12 Hong Cai Rapid haplotyping by single molecule detection
DE60119930T2 (en) 2000-07-10 2007-01-18 University Health Network, Toronto METHOD AND DEVICE FOR HIGHLY RESOLVING COHERENT OPTICAL FIGURE
WO2002007587A2 (en) 2000-07-14 2002-01-31 Xillix Technologies Corporation Compact fluorescent endoscopy video system
US8036731B2 (en) 2001-01-22 2011-10-11 Spectrum Dynamics Llc Ingestible pill for diagnosing a gastrointestinal tract
US6780584B1 (en) 2000-09-27 2004-08-24 Nanogen, Inc. Electronic systems and component devices for macroscopic and microscopic molecular biological reactions, analyses and diagnostics
US6818907B2 (en) 2000-10-17 2004-11-16 The President And Fellows Of Harvard College Surface plasmon enhanced illumination system
FR2820828B1 (en) 2001-02-09 2003-05-02 Commissariat Energie Atomique SAMPLE OBSERVATION DEVICE BY FLUORESCENCE, IN PARTICULAR SEQUENTIALLY
US6790672B2 (en) 2001-02-19 2004-09-14 Board Of Regents The University Of Texas System Encoded molecular sieve particle-based sensors
EP1372486A2 (en) 2001-03-09 2004-01-02 Lucid, Inc. System and method for macroscopic and confocal imaging of tissue
WO2002075370A2 (en) 2001-03-19 2002-09-26 Weinstein Ronald S Miniaturized microscope array digital slide scanner
US7864380B2 (en) 2001-03-19 2011-01-04 Dmetrix, Inc. Slide-borne imaging instructions
US6649402B2 (en) 2001-06-22 2003-11-18 Wisconsin Alumni Research Foundation Microfabricated microbial growth assay method and apparatus
US7297494B2 (en) 2001-06-25 2007-11-20 Georgia Tech Research Corporation Activatable probes and methods for in vivo gene detection
US7439478B2 (en) 2001-07-06 2008-10-21 Palantyr Research, Llc Imaging system, methodology, and applications employing reciprocal space optical design having at least one pixel being scaled to about a size of a diffraction-limited spot defined by a microscopic optical system
US7105795B2 (en) * 2001-07-06 2006-09-12 Palantyr Research, Llc Imaging system, methodology, and applications employing reciprocal space optical design
US6850362B2 (en) * 2001-08-08 2005-02-01 Atto Bioscience Inc. Microscope optical system with a stationary sample stage and stationary viewing ports suited for viewing various fields of view of a sample
US7326938B2 (en) * 2001-08-23 2008-02-05 D.N.R. Imaging Systems Ltd. Optical system and method for inspecting fluorescently labeled biological specimens
WO2003023479A1 (en) 2001-09-07 2003-03-20 Board Of Regents, The University Of Texas System Multimodal miniature microscope
US6885492B2 (en) 2001-11-08 2005-04-26 Imaginative Optics, Inc. Spatial light modulator apparatus
US6643071B2 (en) 2001-12-21 2003-11-04 Lucent Technologies Inc. Graded-index lens microscopes
JP2003260025A (en) 2002-03-08 2003-09-16 Olympus Optical Co Ltd Capsule endoscope
JP4084061B2 (en) * 2002-03-18 2008-04-30 独立行政法人科学技術振興機構 High stability optical microscope
JP3789104B2 (en) 2002-05-13 2006-06-21 株式会社日立ハイテクノロジーズ Element distribution observation method and apparatus
US7193775B2 (en) 2002-05-30 2007-03-20 Dmetrix, Inc. EPI-illumination system for an array microscope
US6987259B2 (en) 2002-05-30 2006-01-17 Dmetrix, Inc. Imaging system with an integrated source and detector array
US20040091465A1 (en) 2002-06-26 2004-05-13 Zachary Yim Therapeutic antiangiogenic compositions and methods
US7154598B2 (en) 2002-07-12 2006-12-26 Decision Biomarkers, Inc. Excitation and imaging of fluorescent arrays
EP1527333A1 (en) * 2002-08-01 2005-05-04 Sensor Technologies LLC Fluorescence correlation spectroscopy instrument
AU2003261362A1 (en) 2002-08-02 2004-02-23 University Of Massachusetts Lipid binding molecules and methods of use
US7023622B2 (en) 2002-08-06 2006-04-04 Dmetrix, Inc. Miniature microscope objective lens
JP4395859B2 (en) * 2003-01-07 2010-01-13 三星電機株式会社 Camera module for portable terminals
US7672043B2 (en) 2003-02-21 2010-03-02 Kla-Tencor Technologies Corporation Catadioptric imaging system exhibiting enhanced deep ultraviolet spectral bandwidth
US7151260B2 (en) * 2003-03-03 2006-12-19 Advanced Fuel Research, Inc. Analyzer for measuring multiple gases
US8206946B2 (en) 2003-03-24 2012-06-26 Mary Beth Tabacco Fluorescent virus probes for identification of bacteria
CN100446717C (en) 2003-04-25 2008-12-31 奥林巴斯株式会社 Capsule endoscope and capsule endoscope system
EP1618830A4 (en) 2003-04-25 2010-06-23 Olympus Corp Radio-type in-subject information acquisition system and outside-subject device
JPWO2004096022A1 (en) 2003-04-25 2006-07-13 オリンパス株式会社 Wireless intra-subject information acquisition system and intra-subject introduction device
US20110259744A1 (en) 2003-04-30 2011-10-27 Moyle William R Sensors for biomolecular detection and cell classification
WO2005009126A1 (en) * 2003-07-23 2005-02-03 Essen Instruments, Inc. Examination systems for biological samples
WO2005043114A2 (en) 2003-10-22 2005-05-12 Ron Broide Oligonucleotide probe sets and uses thereof
WO2005057193A1 (en) 2003-11-18 2005-06-23 Applied Materials Israel, Ltd Inspection system with auto-focus
US8082024B2 (en) * 2004-01-16 2011-12-20 Alfano Robert R Micro-scale compact device for in vivo medical diagnosis combining optical imaging and point fluorescence spectroscopy
JP2005294801A (en) * 2004-03-11 2005-10-20 Advanced Lcd Technologies Development Center Co Ltd Laser crystallization apparatus and laser crystallization method
WO2005112895A2 (en) 2004-05-20 2005-12-01 Spectrum Dynamics Llc Ingestible device platform for the colon
DE102004034970A1 (en) 2004-07-16 2006-02-02 Carl Zeiss Jena Gmbh Scanning microscope and use
JP4587742B2 (en) 2004-08-23 2010-11-24 株式会社日立ハイテクノロジーズ Charged particle beam microscopic method and charged particle beam application apparatus
US8878924B2 (en) * 2004-09-24 2014-11-04 Vivid Medical, Inc. Disposable microscope and portable display
WO2006050355A2 (en) * 2004-11-01 2006-05-11 David Jones A compact portable rugged fluorescence microscope sealed against dust and water
FR2881225B1 (en) 2005-01-21 2007-10-26 Cypher Science Sarl PORTABLE DETECTION APPARATUS FOR FIELD DETECTION OF FLUORESCENT-MARKING ELEMENTS
US20060164510A1 (en) * 2005-01-24 2006-07-27 Doron Adler Sensor with narrow mounting profile
US8788021B1 (en) 2005-01-24 2014-07-22 The Board Of Trustees Of The Leland Stanford Junior Univerity Live being optical analysis system and approach
US7307774B1 (en) 2005-01-24 2007-12-11 The Board Of Trustees Of The Leland Standford Junior University Micro-optical analysis system and approach therefor
US8346346B1 (en) 2005-01-24 2013-01-01 The Board Of Trustees Of The Leland Stanford Junior University Optical analysis system and approach therefor
US7738086B2 (en) 2005-05-09 2010-06-15 The Trustees Of Columbia University In The City Of New York Active CMOS biosensor chip for fluorescent-based detection
US7224539B2 (en) 2005-05-13 2007-05-29 Schaack David F Providing optical systems having improved properties to users of catalog (stock) lenses
FR2887457B1 (en) * 2005-06-23 2007-10-05 Fond Bettencourt Schueller TRANSCUTANE TARGETING VACCINATION
WO2007102839A2 (en) * 2005-10-27 2007-09-13 Applera Corporation Optoelectronic separation of biomolecules
WO2007067733A2 (en) * 2005-12-09 2007-06-14 Massachusetts Institute Of Technology Compositions and methods to monitor rna delivery to cells
WO2007082102A2 (en) * 2006-01-12 2007-07-19 Optimedica Corporation Optical delivery systems and methods of providing adjustable beam diameter, spot size and/or spot shape
US8487326B2 (en) * 2006-04-24 2013-07-16 Cree, Inc. LED device having a tilted peak emission and an LED display including such devices
US8771261B2 (en) * 2006-04-28 2014-07-08 Topcon Medical Laser Systems, Inc. Dynamic optical surgical system utilizing a fixed relationship between target tissue visualization and beam delivery
CN101063657A (en) * 2006-04-29 2007-10-31 中国科学院上海生命科学研究院 Method and system for screen selecting combination of ligand with receptor in active somatic cell
US8045263B2 (en) 2006-06-30 2011-10-25 The General Hospital Corporation Device and method for wide-field and high resolution imaging of tissue
JP2010500617A (en) * 2006-08-04 2010-01-07 イコニシス インコーポレーテッド Automatic microscope and method for dynamic scanning
US10335038B2 (en) * 2006-08-24 2019-07-02 Xenogen Corporation Spectral unmixing for in-vivo imaging
JP5256201B2 (en) * 2006-08-24 2013-08-07 エージェンシー フォー サイエンス, テクノロジー アンド リサーチ Compact optical detection system
GB0617945D0 (en) 2006-09-12 2006-10-18 Ucl Business Plc Imaging apparatus and methods
US20080070311A1 (en) * 2006-09-19 2008-03-20 Vanderbilt University Microfluidic flow cytometer and applications of same
US7535991B2 (en) * 2006-10-16 2009-05-19 Oraya Therapeutics, Inc. Portable orthovoltage radiotherapy
WO2008048612A2 (en) 2006-10-17 2008-04-24 Hnuphotonics Miniature microscope camera
US20090208965A1 (en) * 2006-10-25 2009-08-20 Ikonisys, Inc. Automated method for detecting cancers and high grade hyperplasias
US20100074486A1 (en) 2006-11-22 2010-03-25 Max-Planck-Gesellschaft Zur Forderung Der Wissenschaften E.V., A Corporation Of Germany Reconstruction and visualization of neuronal cell structures with bright-field mosaic microscopy
US20090093551A1 (en) * 2006-12-08 2009-04-09 Bhatia Sangeeta N Remotely triggered release from heatable surfaces
US20080141795A1 (en) * 2006-12-13 2008-06-19 Gagnon Daniel F Micro-surface inspection tool
DE102007018922A1 (en) * 2007-02-12 2008-08-14 Leica Microsystems Cms Gmbh microscope
CA2690633C (en) 2007-06-15 2015-08-04 Historx, Inc. Method and system for standardizing microscope instruments
WO2009002273A1 (en) 2007-06-26 2008-12-31 Agency For Science, Technology And Research Imaging chamber with window and micro-needle platform magnetically biased toward each other
US8068899B2 (en) 2007-07-03 2011-11-29 The Board Of Trustees Of The Leland Stanford Junior University Method and system of using intrinsic-based photosensing with high-speed line scanning for characterization of biological thick tissue including muscle
US9411149B2 (en) * 2007-07-17 2016-08-09 The Board Of Trustees Of The Leland Stanford Junior University Microendoscopy with corrective optics
EP2225600A2 (en) * 2007-09-05 2010-09-08 Chroma Technology Corporation Light source
JP5643101B2 (en) * 2007-10-25 2014-12-17 ワシントン・ユニバーシティWashington University Scattering medium imaging method, imaging apparatus, and imaging system
US7801271B2 (en) 2007-12-23 2010-09-21 Oraya Therapeutics, Inc. Methods and devices for orthovoltage ocular radiotherapy and treatment planning
CN101952762B (en) * 2008-01-02 2012-11-28 加利福尼亚大学董事会 High numerical aperture telemicroscopy apparatus
US8179525B2 (en) 2008-03-31 2012-05-15 Jawaharial Nehru Centre For Advanced Scientific Research Mirror mounted inside filter block of a fluorescence microscope to perform SERS and method thereof
EP2291637B1 (en) * 2008-05-20 2020-01-08 NanoTemper Technologies GmbH Method and device for measuring thermo-optical characteristics of particles in a solution
US8184298B2 (en) * 2008-05-21 2012-05-22 The Board Of Trustees Of The University Of Illinois Spatial light interference microscopy and fourier transform light scattering for cell and tissue characterization
US8983581B2 (en) 2008-05-27 2015-03-17 Massachusetts Institute Of Technology System and method for large field of view, single cell analysis
US20090322800A1 (en) * 2008-06-25 2009-12-31 Dolby Laboratories Licensing Corporation Method and apparatus in various embodiments for hdr implementation in display devices
US7940886B2 (en) 2008-10-03 2011-05-10 Siemens Medical Solutions Usa, Inc. 3D medical anatomical image system using 2D images
JP2010117705A (en) 2008-10-14 2010-05-27 Olympus Corp Microscope for virtual-slide creating system
US8551730B2 (en) 2008-10-24 2013-10-08 The Regents Of The University Of California Use of a reference source with adaptive optics in biological microscopy
US8525925B2 (en) * 2008-12-29 2013-09-03 Red.Com, Inc. Modular digital camera
US9685592B2 (en) * 2009-01-14 2017-06-20 Cree Huizhou Solid State Lighting Company Limited Miniature surface mount device with large pin pads
US20100259805A1 (en) * 2009-04-13 2010-10-14 Molecular Devices, Inc. Methods and systems for reducing scanner image distortion
JP5307629B2 (en) 2009-05-22 2013-10-02 オリンパス株式会社 Scanning microscope equipment
US8174761B2 (en) * 2009-06-10 2012-05-08 Universitat Heidelberg Total internal reflection interferometer with laterally structured illumination
US20110061139A1 (en) * 2009-09-04 2011-03-10 Ahmet Oral Method to measure 3 component of the magnetic field vector at nanometer resolution using scanning hall probe microscopy
WO2011091283A1 (en) * 2010-01-22 2011-07-28 Board Of Regents, The University Of Texas System Systems, devices and methods for imaging and surgery
JP4941566B2 (en) * 2010-01-27 2012-05-30 ブラザー工業株式会社 Process cartridge
JP2011250835A (en) 2010-05-31 2011-12-15 Olympus Corp Endoscope system
US20110312712A1 (en) * 2010-06-17 2011-12-22 Geneasys Pty Ltd Genetic analysis loc for pcr amplification of nucleic acids from whole blood
WO2012002893A1 (en) 2010-06-30 2012-01-05 Ge Healthcare Bio-Sciences Corp A system for synchronization in a line scanning imaging microscope
EP2609742A4 (en) 2010-08-27 2015-07-08 Univ Leland Stanford Junior Microscopy imaging device with advanced imaging properties
JP6009841B2 (en) 2011-07-12 2016-10-19 オリンパス株式会社 Optical observation device
US9575304B2 (en) 2012-06-25 2017-02-21 Huron Technologies International Inc. Pathology slide scanners for fluorescence and brightfield imaging and method of operation

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090072171A1 (en) * 2003-08-15 2009-03-19 Massachusetts Institute Of Technology Systems and methods for volumetric tissue scanning microscopy
US20090237501A1 (en) * 2008-03-19 2009-09-24 Ruprecht-Karis-Universitat Heidelberg Kirchhoff-Institut Fur Physik method and an apparatus for localization of single dye molecules in the fluorescent microscopy

Also Published As

Publication number Publication date
GB2483963A (en) 2012-03-28
JP2013545116A (en) 2013-12-19
CA2943966C (en) 2019-02-19
EP2609742A4 (en) 2015-07-08
US20210267458A1 (en) 2021-09-02
US9498135B2 (en) 2016-11-22
CN103765289A (en) 2014-04-30
EP2609742A2 (en) 2013-07-03
CA2943966A1 (en) 2012-03-01
US9629554B2 (en) 2017-04-25
AU2015272003B2 (en) 2017-10-19
US20160029893A1 (en) 2016-02-04
CA2809581C (en) 2016-11-15
US11259703B2 (en) 2022-03-01
WO2012027586A3 (en) 2013-11-14
GB201114767D0 (en) 2011-10-12
US20160004063A1 (en) 2016-01-07
CA3030720A1 (en) 2012-03-01
US20120062723A1 (en) 2012-03-15
CA2809581A1 (en) 2012-03-01
US10813552B2 (en) 2020-10-27
WO2012027586A2 (en) 2012-03-01
EP3358387A1 (en) 2018-08-08
US9474448B2 (en) 2016-10-25
US20230200656A1 (en) 2023-06-29
US20160033752A1 (en) 2016-02-04
US9195043B2 (en) 2015-11-24
GB2483963B (en) 2015-02-04
CN103765289B (en) 2017-05-17
AU2011293269A1 (en) 2013-04-04
US20170296060A1 (en) 2017-10-19

Similar Documents

Publication Publication Date Title
AU2011293269B2 (en) Microscopy imaging device with advanced imaging properties
Scott et al. Imaging cortical dynamics in GCaMP transgenic rats with a head-mounted widefield macroscope
Ghosh et al. Miniaturized integration of a fluorescence microscope
JP2019148801A (en) Method for using epi-illumination fluorescence microscope, method for using imaging device, and epi-illumination fluorescence microscope
JP2016118792A (en) Microscopy imaging device with advanced imaging properties
Keller et al. Visualizing whole-brain activity and development at the single-cell level using light-sheet microscopy
Ovečka et al. Multiscale imaging of plant development by light-sheet fluorescence microscopy
Stirman et al. A multispectral optical illumination system with precise spatiotemporal control for the manipulation of optogenetic reagents
CN1550039A (en) Imaging system and methodology employing reciprocal space optical design
Scherrer et al. A novel optical design enabling lightweight and large field-of-view head-mounted microscopes
US11422355B2 (en) Method and system for acquisition of fluorescence images of live-cell biological samples
Ghosh Miniature and Mass-Producible Fluorescence Microscopes for Biomedical Imaging
JP2019082708A (en) Microscopy imaging device with advanced imaging properties, epifluorescence microscope device, and method of using epifluorescence microscope device
US20230218173A1 (en) Endoscopic imaging and patterned stimulation at cellular resolution
Zhang et al. A Systematically Optimized Miniaturized Mesoscope (SOMM) for large-scale calcium imaging in freely moving mice
Dong Development of an Imaging Chamber for Measurement of Spatially Resolved Photosynthetic Quantum Yield
Wang FEASIBILITY EXPERIMENTS TOWARD A HIGH-THROUGHPUT MICROSCOPE PLATFORM FOR NEUROSCIENCE RESEARCH IN C. ELEGANS
Kwan Toward reconstructing spike trains from large‐scale calcium imaging data
Osman et al. A head-mountable microscope for high-speed fluorescence brain imaging
CN115453734A (en) Microscopic imaging system, control method and device thereof, and computer readable storage medium
Burns Development and Application of a Miniature, Integrated Fluorescence Microscope for in Vivo Brain Imaging
Ghaye Image processing on reconfigurable hardware for continuous monitoring of fluorescent biomarkers in cell cultures
Aharoni Miniscope-LFOV: A large field of view, single cell resolution, miniature microscope for wired and wire-free imaging of neural dynamics in freely behaving animals

Legal Events

Date Code Title Description
FGA Letters patent sealed or granted (standard patent)