CN111610621B - Bimodal microscopic imaging system and method - Google Patents

Bimodal microscopic imaging system and method

Info

Publication number
CN111610621B
CN111610621B (application CN202010059510.1A)
Authority
CN
China
Prior art keywords: image, light, optical diffraction, sample, diffraction tomography
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010059510.1A
Other languages
Chinese (zh)
Other versions
CN111610621A (en)
Inventor
施可彬 (Kebin Shi)
陈良怡 (Liangyi Chen)
董大山 (Dashan Dong)
黄小帅 (Xiaoshuai Huang)
李柳菊 (Liuju Li)
毛珩 (Heng Mao)
王爱民 (Aimin Wang)
Current Assignee
Peking University
Original Assignee
Peking University
Priority date
Filing date
Publication date
Application filed by Peking University
Priority to CN202010059510.1A
Publication of CN111610621A
Priority to PCT/CN2021/071393
Application granted
Publication of CN111610621B
Legal status: Active

Classifications

    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B21/00 Microscopes
    • G02B21/0004 Microscopes specially adapted for specific applications
    • G02B21/002 Scanning microscopes
    • G02B21/0024 Confocal scanning microscopes (CSOMs) or confocal "macroscopes"; Accessories which are not restricted to use with CSOMs, e.g. sample holders
    • G02B21/0036 Scanning details, e.g. scanning stages
    • G02B21/0048 Scanning details, e.g. scanning stages scanning mirrors, e.g. rotating or galvanomirrors, MEMS mirrors
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/62 Systems in which the material investigated is excited whereby it emits light or causes a change in wavelength of the incident light
    • G01N21/63 Systems in which the material investigated is excited whereby it emits light or causes a change in wavelength of the incident light optically excited
    • G01N21/64 Fluorescence; Phosphorescence
    • G01N21/645 Specially adapted constructive features of fluorimeters
    • G01N21/6456 Spatial resolved fluorescence measurements; Imaging
    • G01N21/6458 Fluorescence microscopy
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B21/00 Microscopes
    • G02B21/0004 Microscopes specially adapted for specific applications
    • G02B21/002 Scanning microscopes
    • G02B21/0024 Confocal scanning microscopes (CSOMs) or confocal "macroscopes"; Accessories which are not restricted to use with CSOMs, e.g. sample holders
    • G02B21/0052 Optical details of the image generation
    • G02B21/0076 Optical details of the image generation arrangements using fluorescence or luminescence
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B21/00 Microscopes
    • G02B21/06 Means for illuminating specimens
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B21/00 Microscopes
    • G02B21/36 Microscopes arranged for photographic purposes or projection purposes or digital imaging or video purposes including associated control and data processing arrangements
    • G02B21/361 Optical details, e.g. image relay to the camera or image sensor

Landscapes

  • Physics & Mathematics (AREA)
  • Analytical Chemistry (AREA)
  • General Physics & Mathematics (AREA)
  • Chemical & Material Sciences (AREA)
  • Optics & Photonics (AREA)
  • Health & Medical Sciences (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Multimedia (AREA)
  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biochemistry (AREA)
  • General Health & Medical Sciences (AREA)
  • Immunology (AREA)
  • Pathology (AREA)
  • Microscopes, Condenser (AREA)
  • Investigating, Analyzing Materials By Fluorescence Or Luminescence (AREA)

Abstract

Embodiments of the application disclose a bimodal microscopic imaging system and a bimodal microscopic imaging method. The bimodal microscopic imaging system comprises an optical diffraction tomography subsystem and a structured-light-illumination fluorescence imaging subsystem. The optical diffraction tomography subsystem performs label-free optical diffraction tomography based on a first laser to acquire an optical diffraction tomography image of a sample; the structured-light-illumination fluorescence imaging subsystem performs fluorescence imaging based on a second laser to acquire a structured-light-illuminated fluorescence image of the sample. The system comprises a first light source and a second light source which are independent of each other, the first light source emitting the first laser and the second light source emitting the second laser.

Description

Bimodal microscopic imaging system and method
Technical Field
The application relates to the technical field of microscopic imaging, in particular to a bimodal microscopic imaging system and a bimodal microscopic imaging method.
Background
Since the beginning of the twenty-first century, the rapid development of biology and medicine has continuously deepened our understanding of the essence of life and greatly advanced human civilization. As biomedical research has progressed, frontier studies have moved inside the cell, seeking to decipher life processes at the molecular scale. Optical microscopy is a powerful tool of modern molecular biology, and its development has repeatedly advanced how humans observe and understand living phenomena. An optical microscope uses the interaction of light with matter to reveal the microstructure of an object. Since its invention, the optical microscope has been the most common instrument in the biomedical field; conventional lenses and visible-light illumination support roughly 80% of microscopic studies. Because cells are optically transparent, only optical microscopy enables non-invasive, label-free imaging of living cells.
Fluorescence microscopy is the principal imaging modality of molecular biology. However, the high photon flux and phototoxicity of the excitation light limit the total number of images that can be acquired, so the interactions and dynamic processes of intracellular organelles cannot yet be fully revealed. Long-term, high-resolution imaging of living cells therefore remains a major challenge in biological research. Three-dimensional fluorescence imaging requires an even greater excitation photon flux because of the limited axial scanning speed, and the total duration of three-dimensional imaging is severely constrained by photobleaching. Moreover, because fluorescence spectra are broad and the number of spectral channels is limited, fluorescence imaging can label only a few molecular species simultaneously. Auxiliary modalities such as electron microscopy can resolve many organelles, but they provide only static snapshots.
Optical diffraction tomography (ODT) microscopy is a technique for non-invasive, label-free three-dimensional imaging of cells and tissues. By combining quantitative phase imaging with scattering theory, ODT can sense morphological changes at the nanometer scale and perform long-term, high-resolution, non-destructive imaging of living cells. As a powerful tool for observing cellular dynamics, it holds great promise for studies of cell metabolism, pathology, and tumor diagnosis. However, its complicated optical configuration and still-immature algorithms have kept it from large-scale adoption in biomedical research. Two problems remain to be solved. First, diffraction tomography produces large data volumes and requires complex computation, making continuous observation of living processes difficult. Second, although it achieves label-free imaging, ODT has limited chemically selective imaging capability: its morphological characterization lacks chemical specificity, which limits how conclusive its findings can be.
Optical diffraction tomography features low photon flux and low phototoxicity, and can therefore effectively mitigate the problems encountered in fluorescence imaging. In existing optical diffraction tomography systems, however, fluorescence imaging has been lacking as an auxiliary channel: most structures in a diffraction tomogram are uncalibrated, and only morphological analysis is possible. In conventional optical diffraction tomography, differential calibration against wide-field fluorescence imaging has been performed only for lipid droplets, chromosomes, and mitochondria.
It is therefore desirable to provide a bimodal microscopic imaging method that combines optical diffraction tomography with structured-light-illumination super-resolution fluorescence imaging, using the super-resolution fluorescence channel to assist the optical diffraction tomography channel in co-localized imaging.
Disclosure of Invention
One of the embodiments of the application provides a bimodal microscopic imaging system, which comprises an optical diffraction tomography subsystem and a structured light illumination fluorescence imaging subsystem; the optical diffraction tomography subsystem is used for performing label-free optical diffraction tomography based on the first laser to acquire an optical diffraction tomography image of the sample; the structured light illumination fluorescence imaging subsystem is used for performing fluorescence imaging based on second laser to acquire a structured light illumination fluorescence image of the sample; the dual-mode microscopic imaging system comprises a first light source and a second light source which are independent of each other, wherein the first light source is used for emitting the first laser, and the second light source is used for emitting the second laser.
One embodiment of the present application provides a bimodal microscopic imaging method, including: respectively generating a first laser and a second laser by utilizing mutually independent light sources; obtaining an optical diffraction tomography image of a sample based on the first laser with an optical diffraction tomography subsystem; acquiring a structured light illumination fluorescence image of the sample based on the second laser by using a structured light illumination fluorescence imaging subsystem; generating a bimodal fusion image of the sample based on the optical diffraction tomography image and the structured light illuminated fluorescence image.
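The four steps of this method can be sketched as a small orchestration function. This is a hypothetical illustration only: the function names, the stub acquisitions, and the channel-stacking fusion rule are ours, not the patent's.

```python
import numpy as np

def acquire_bimodal_frame(acquire_odt, acquire_sim, fuse):
    """Sketch of the claimed method: the first step (independent lasers)
    happens inside the two subsystems; here we acquire the two images
    and fuse them into one bimodal frame."""
    odt_image = acquire_odt()   # label-free tomogram (first laser)
    sim_image = acquire_sim()   # structured-light fluorescence (second laser)
    return fuse(odt_image, sim_image)

# Stub subsystems returning same-sized images, fused as a 2-channel stack.
rng = np.random.default_rng(0)
frame = acquire_bimodal_frame(
    acquire_odt=lambda: rng.random((64, 64)),
    acquire_sim=lambda: rng.random((64, 64)),
    fuse=lambda a, b: np.stack([a, b], axis=0),
)
frame.shape  # (2, 64, 64)
```

Passing the subsystems in as callables keeps the sketch self-contained; real hardware drivers would replace the stubs.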
One of the embodiments of the present application provides a dual-modality microscopic imaging apparatus, including a processor, where the processor is configured to execute the dual-modality microscopic imaging method according to any one of the embodiments of the present application.
One of the embodiments of the present application provides a computer-readable storage medium, where the storage medium stores computer instructions, and when the computer instructions in the storage medium are read by a computer, the computer executes the bimodal microscopic imaging method according to any one of the embodiments of the present application.
Drawings
The present application is further illustrated by exemplary embodiments, described in detail with reference to the accompanying drawings. These embodiments are not limiting; in them, like numerals indicate like structures, wherein:
FIG. 1 is a schematic structural diagram of a dual-modality microscopic imaging system according to some embodiments of the present application;
FIG. 2 is a block diagram of a dual modality microscopic imaging system, shown in accordance with some embodiments of the present application;
FIG. 3 is a schematic diagram of a control subsystem of a dual-modality microscopic imaging system, shown in accordance with some embodiments of the present application;
FIG. 4 is a timing diagram for control of a dual modality microscopic imaging system, according to some embodiments of the present application;
FIG. 5 is an exemplary flow chart of a method of dual modality microscopy imaging according to some embodiments of the present application;
FIG. 6 is a flow chart of a diffraction tomography reconstruction algorithm according to some embodiments of the present application;
FIG. 7a is an exemplary flow chart illustrating the determination of a target swept wave-vector according to some embodiments of the present application;
FIG. 7b is a schematic illustration of the determination of a target swept wave-vector using the VISA method according to some embodiments of the present application;
FIG. 8 is a schematic illustration of a shift in swept wave vectors according to some embodiments of the present application;
FIG. 9 is a reconstructed image without scan-wave-vector iteration according to some embodiments of the present application;
FIG. 10 is a reconstructed image with scan-wave-vector iteration performed according to some embodiments of the present application;
FIG. 11a is a schematic diagram of an optical diffraction tomography image reconstruction algorithm according to some embodiments of the present application;
FIG. 11b is a schematic diagram of a structured light illumination fluorescence image reconstruction algorithm according to some embodiments of the present application;
FIG. 12 is an optical diffraction tomography image of patient fibroblasts obtained using the optical diffraction tomography subsystem 210, according to some embodiments of the present application;
FIG. 13 is a graph of optical diffraction tomography of immobilized INS-1 cells according to some embodiments of the present application;
FIGS. 14a and 14b are comparisons of wide-field fluorescence images of fixed primary hepatocytes with optical diffraction tomography images according to some embodiments of the present application;
FIG. 15 is a result of sequential imaging of COS-7 cells using the optical diffraction tomography subsystem 210, according to some embodiments of the present application;
FIG. 16 is an optical diffraction tomography image of a pollen tube growth process according to some embodiments of the present application;
FIG. 17 is an optical diffraction tomography image of a nematode embryonic development process according to some embodiments of the present application;
FIG. 18 is an optical diffraction tomography-fluorescence co-localization image collected of six major organelles in a COS-7 viable cell using a dual-modality microscopic imaging system, according to some embodiments of the present application;
FIG. 19 is a lateral resolution characterization diagram of an optically diffractive tomography-structured light illuminated fluorescence dual-modality microscopy imaging system, according to some embodiments of the present application;
FIG. 20 is a COS-7 cell mitosis bimodal fluorescence co-localization image acquired by a bimodal microscopy imaging system according to some embodiments of the present application;
FIG. 21 is an optical diffraction tomography-fluorescence co-localization image of an organelle in a COS-7 living cell that cannot be resolved by optical diffraction tomography in a dual-modality microscopy imaging system according to some embodiments of the present application;
FIG. 22 is a low refractive index vesicle appearing in an optical diffraction tomographic image of a live COS-7 cell according to some embodiments of the present application;
FIG. 23 is a low refractive index vesicle appearing in an optical diffraction tomographic image when a bimodal microscopic imaging system according to some embodiments of the present application images different types of cells;
FIG. 24 is an optical diffraction tomographic image of human mesenchymal stem cells and the correlation of the number of low refractive index vesicles therein with their cellular senescence phenotype according to some embodiments of the present application;
FIG. 25 is an illustration of mitochondrial interaction with other organelles in COS-7 cells, in accordance with some embodiments of the present application;
FIG. 26 is an illustration of the interactions of low refractive index vesicles with organelles in COS-7 cells according to some embodiments of the present disclosure;
FIG. 27 is a visualization of the DBs transport pathway and its role in organelle interactions, shown according to some embodiments of the present application;
fig. 28 is an analysis of motion artifacts that may be produced by different microscopes in optical diffraction tomography of fast moving lysosomes according to some embodiments of the present application;
FIG. 29 is a two-dimensional coherent transfer function (CTF) measurement of a microscope according to some embodiments of the present application;
FIG. 30 is an image of a viewing of vacuoles of budding yeast by an ODT subsystem, according to some embodiments of the present application;
FIG. 31 is a graph showing the correlation of LC3-EGFP tag structures with DBs in COS-7 cells according to some embodiments of the present application;
FIG. 32 is a histogram of the size of LE/LY structures observed in COS-7 cells overexpressing different protein markers by the optical diffraction tomography subsystem shown in some embodiments herein;
FIG. 33 is a spatial frequency domain reconstructed based on optical diffraction tomography according to some embodiments of the present application;
FIG. 34 is a schematic illustration of a three-dimensional global view of an organelle using an optical diffraction tomography subsystem, according to some embodiments of the present application;
FIG. 35 is a schematic diagram of a super-resolution fluorescence assisted diffraction tomography (SR-FACT) system according to some embodiments of the present application.
Wherein 100 is a bimodal microscopic imaging system, 101 is a first light source, 102 is a first acousto-optic modulator (AOM), 103 is a first half-wave plate (HWP), 104 is a first polarizing beam splitter Prism (PBS), 105 is a Single Mode Fiber (SMF), 106 is a lens, 107 is a Galvanometer (GM), 108 is a sleeve lens, 109 is a microscope Objective (OBJ), 110 is a sample, 111 is a microscope objective, 112 is a first Dichroic Mirror (DM), 113 is a lens, 114 is a second camera, 115 is a single mode fiber, 116 is a lens, 117 is a first camera, 118 is a polarization independent beam splitter prism (BS), 119 is a lens, 120 is a lens, 121 is a second dichroic mirror, 122 is a lens, 123 is a Polarization Rotator (PR), 124 is a lens, 125 is a spatial filter (Mask), 126 is a lens, 127 is a second polarizing beam splitter prism, 128 is a second half-wave plate, 129 is a Spatial Light Modulator (SLM), 130 is a lens, 131 is a single mode fiber, 132 is a second acousto-optic modulator, 133 is a second light source, 134 is a coupler, 135 is a coupler, and 136 is a coupler.
Detailed Description
To more clearly illustrate the technical solutions of the embodiments of the present application, the drawings used in describing the embodiments are briefly introduced below. The drawings in the following description are merely examples or embodiments of the application; based on them, a person skilled in the art could apply the application to other similar scenarios without inventive effort. Unless otherwise apparent from the context or otherwise indicated, like reference numerals in the figures refer to the same structure or operation.
It should be understood that "system", "device", "unit" and/or "module" as used herein is a method for distinguishing different components, elements, parts, portions or assemblies at different levels. However, other words may be substituted by other expressions if they accomplish the same purpose.
As used in this application and in the claims, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly dictates otherwise. In general, the terms "comprises" and "comprising" merely indicate that the explicitly identified steps and elements are included; those steps and elements do not form an exclusive list, and a method or apparatus may include other steps or elements.
Flow charts are used herein to illustrate operations performed by systems according to embodiments of the present application. It should be understood that the operations need not be performed exactly in the order shown; steps may instead be processed in reverse order or simultaneously. Moreover, other operations may be added to the flows, or one or more operations may be removed from them.
The present application relates generally to a bimodal microscopic imaging system. In some embodiments, the bimodal microscopic imaging system may be a super-resolution fluorescence-assisted diffraction computed tomography (SR-FACT) system (FIG. 35). The system includes an Optical Diffraction Tomography (ODT) subsystem and a structured light illumination fluorescence imaging (SIM) subsystem. Combining the two subsystems effectively compensates for the weaknesses of each modality: the lack of chemically specific imaging in diffraction tomography and the limited speed and time range of structured light illumination fluorescence imaging. The optical diffraction tomography subsystem may be used to acquire an optical diffraction tomography image of the sample, and the structured light illuminated fluorescence imaging subsystem may be used to acquire a structured light illuminated fluorescence image of the sample. The sample 110 may include, but is not limited to, biological tissue, biological macromolecules, proteins, cells, microorganisms, or other substances.
FIG. 1 is a schematic structural diagram of a bimodal microscopy imaging system, such as a super-resolution fluorescence assisted diffraction tomography (SR-FACT) system, according to some embodiments of the present application. FIG. 2 is a block diagram of a dual modality microscopic imaging system, shown in accordance with some embodiments of the present application. FIG. 35 is a simplified illustration of the hardware architecture of the SR-FACT system. As shown in fig. 1-2, the dual-modality microscopy imaging system 100 may include an optical diffraction tomography subsystem 210, a structured light illumination fluorescence imaging subsystem 220, a control subsystem 230, and a processor 240.
The Optical Diffraction Tomography subsystem 210 may be a system that images using Optical Diffraction Tomography (ODT) microscopy. When a light wave passes through a weakly scattering sample, scattering (or diffraction) by the inhomogeneous sample changes the amplitude and phase of the light field; the amplitude and phase of the scattered (or diffracted) field therefore naturally carry structural information about the sample. Optical diffraction tomography is an imaging technique that reconstructs the three-dimensional refractive-index distribution of a sample by inverting this scattering (or diffraction) process. In embodiments of the present application, the optical diffraction tomography subsystem may perform label-free optical diffraction tomography based on the first laser to acquire an optical diffraction tomography image of the sample.
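For intuition, the first-order relation behind quantitative phase imaging, on which diffraction tomography builds, can be sketched as follows. The function and the example numbers (561 nm probe, 5 μm thickness, water immersion) are illustrative assumptions, not the patent's reconstruction algorithm.

```python
import numpy as np

def phase_to_mean_ri(delta_phi, wavelength, thickness, n_medium):
    """Convert an unwrapped optical phase delay (radians) to the
    thickness-averaged refractive index of a weakly scattering sample,
    using delta_phi = (2*pi/wavelength) * (n_sample - n_medium) * thickness."""
    return n_medium + delta_phi * wavelength / (2 * np.pi * thickness)

# e.g. a 561 nm probe, a 5 um-thick cell region in water, 6.3 rad delay
n = phase_to_mean_ri(6.3, 0.561e-6, 5e-6, 1.333)  # ~ 1.4455
```

Full ODT goes further: many such phase maps at different illumination angles are combined through the Fourier diffraction theorem into a 3-D refractive-index volume.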
In some embodiments, the optical diffraction tomography subsystem 210 may include the first light source 101, the first acousto-optic modulator 102, the first half-wave plate 103, the first polarization splitting prism 104, the galvanometer 107, the polarization-independent splitting prism 127 (which may also be referred to as a beam splitter), the first camera 117, and other devices. The first light source 101 may be used to emit first laser light. In some embodiments, the optical diffraction tomography subsystem 210 may also include one or more lenses, single mode fibers, and/or couplers, among others. In an embodiment of the present application, as shown in fig. 1, the optical diffraction tomography subsystem 210 may include the first light source 101, the first acousto-optic modulator (AOM)102, the first Half Wave Plate (HWP)103, the first polarization splitting Prism (PBS)104, the Single Mode Fiber (SMF)105, the lens 106, the Galvanometer (GM)107, the sleeve lens 108, the microscope Objective (OBJ)109, the microscope objective 111, the first Dichroic Mirror (DM)112, the single mode fiber 115, the lens 116, the first camera 117, the polarization independent splitting prism (BS)118, the lens 119, the lens 120, the second dichroic mirror 121, the lens 122, and the like.
In the embodiment of the present application, the first laser light emitted from the first light source 101 is modulated by the first acousto-optic modulator 102 into +1st-order diffracted light, which passes through the first half-wave plate 103 and is split by the first polarization beam splitter prism 104 into a first split beam and a second split beam. In some embodiments, rotating the first half-wave plate 103 adjusts the splitting ratio between the first and second split beams.
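The splitting-ratio adjustment follows Malus's law: rotating a half-wave plate by θ rotates linear polarization by 2θ, so the PBS transmits a cos²(2θ) fraction of the power. A minimal sketch (the function name is ours, not the patent's):

```python
import numpy as np

def pbs_split_fractions(hwp_angle_deg):
    """Fraction of power sent into each arm of a polarizing beam splitter
    when a half-wave plate at angle theta rotates the (initially p-polarized)
    input by 2*theta: transmitted fraction = cos^2(2*theta) (Malus's law)."""
    t = np.cos(np.radians(2.0 * hwp_angle_deg)) ** 2
    return t, 1.0 - t  # (transmitted arm, reflected arm)

pbs_split_fractions(0.0)    # all power transmitted
pbs_split_fractions(22.5)   # equal 50/50 split
```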
In one aspect, the first split beam may be coupled into the single-mode fiber 105 through the coupler 134, collimated by the lens 106 after leaving the fiber, and directed by the galvanometer 107 onto the sample 110 from multiple angles to obtain sample light carrying sample information. Specifically, the galvanometer 107 converts the collimated beam into a two-dimensionally scanned beam of varying deflection, and the sleeve lens 108 focuses the scanned beam onto the back focal plane of the microscope objective 109, so that collimated beams illuminate the sample 110 from different directions. In some embodiments, the galvanometer 107 may comprise a pair of scanning galvanometer mirrors with orthogonal deflection directions mounted at the two ends of a 4-f relay (e.g., AC254-100-A x 2, f = 100 mm). In some embodiments, the microscope objective 109 may be an illumination objective. In some embodiments, the sample light may be the light that, arriving as collimated beams from different directions, is transmitted (e.g., scattered or diffracted) through the sample 110. The sample light is collected by the microscope objective 111, reflected by the first dichroic mirror 112, passes through the lens 122 and the second dichroic mirror 121, and is re-collimated by the lenses 120 and 119 to form the signal light; after the second dichroic mirror 121, the beam is expanded by the lens pair 120 and 119.
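The geometry of this angular scan can be sketched numerically. Assuming the 180 mm sleeve-lens focal length given later in the text and a 60x illumination objective under the common 180 mm tube-length convention (so f_obj ≈ 3 mm; these conventions are our assumption, not stated here), the beam deflection after the galvanometer maps to an illumination angle on the sample:

```python
import numpy as np

F_TUBE = 180e-3        # m, sleeve (tube) lens focal length, from the text
F_OBJ = 180e-3 / 60.0  # m, assumed 60x objective, 180 mm tube convention

def illumination_angle(galvo_deflection_rad):
    """Map the beam deflection after the galvanometer pair to the
    illumination angle on the sample: the sleeve lens focuses the tilted
    collimated beam to a spot at height h in the objective's back focal
    plane, and an aplanatic objective emits a collimated beam at
    asin(h / f_obj)."""
    h = F_TUBE * np.tan(galvo_deflection_rad)
    return np.arcsin(h / F_OBJ)

np.degrees(illumination_angle(np.radians(0.5)))  # ~31.6 deg tilt on sample
```

This is why a small galvanometer deflection suffices to sweep the illumination over the large angular range needed to fill the Ewald sphere.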
On the other hand, the second split beam is coupled into the single-mode fiber 115 through the coupler 135. To ensure that the first and second split beams reach the first camera 117 simultaneously, the second split beam may pass through a delay path before being coupled. In embodiments of the present application, the second split beam serves as the reference light: after leaving the single-mode fiber it is first collimated by the lens 116 and finally combined with the collimated signal light by the polarization-independent beam splitter prism 118, forming off-axis holographic fringes at an off-axis angle that are recorded by the first camera 117.
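The off-axis fringes recorded by the first camera encode the complex sample field, which can be recovered by Fourier-domain demodulation. The sketch below simulates a hologram of a known phase object and recovers it; this is the textbook digital-holography step, not necessarily the patent's exact pipeline, and all names are ours.

```python
import numpy as np

def extract_field(hologram, carrier):
    """Recover the complex sample field from an off-axis hologram by
    shifting the interference order carrying the sample field to the
    center of the spectrum and low-pass filtering (square images only)."""
    n = hologram.shape[0]
    y, x = np.mgrid[0:n, 0:n]
    # multiply by the reference carrier to bring obj*conj(ref) to baseband
    demod = hologram * np.exp(-2j * np.pi * (carrier[0] * x + carrier[1] * y) / n)
    spec = np.fft.fftshift(np.fft.fft2(demod))
    r = np.hypot(x - n // 2, y - n // 2)
    spec[r > n // 8] = 0.0          # keep only the baseband lobe
    return np.fft.ifft2(np.fft.ifftshift(spec))

# Demo: record a hologram of a known phase object, then recover its phase.
n, carrier = 256, (48, 48)
y, x = np.mgrid[0:n, 0:n]
phase = 1.2 * np.exp(-((x - 128) ** 2 + (y - 128) ** 2) / (2 * 30 ** 2))
obj = np.exp(1j * phase)                                   # phase object
ref = np.exp(-2j * np.pi * (carrier[0] * x + carrier[1] * y) / n)  # tilted reference
holo = np.abs(obj + ref) ** 2                              # camera intensity
recovered = np.angle(extract_field(holo, carrier))         # peak ~1.2 rad
```

The off-axis carrier is what separates the desired term from the DC and twin terms in the spectrum, allowing single-shot field retrieval.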
In embodiments of the present application, the optical diffraction tomography subsystem may be an off-axis holographic microscope based on a Mach-Zehnder (MZ) interferometer, retrofitted onto a commercial microscope (Olympus IX73). The first light source 101 can be a 561 nm single-longitudinal-mode laser, model MSL-FN-561-50mW, manufactured by Changchun New Industries Optoelectronics Technology Co., Ltd.; the first acousto-optic modulator can be one manufactured by the Chongqing acousto-optic device company of China Electronics Technology Group Corporation; the first half-wave plate may be a Thorlabs AHWP10M-600; the first polarization beam splitter prism may be a Thorlabs CCM1-PBS251; couplers 134 and 135 may be Thorlabs PAF2-7A; single-mode fibers 105 and 115 may be polarization-maintaining single-mode fibers (PM460-HP HA, FC/APC, made in Shanghai); the focal length of lens 106 may be 40 mm; the galvanometer 107 can be a Thorlabs GVS211/M; the focal length of sleeve lens 108 may be 180 mm; the microscope objective 109 can be an Olympus LUMPlanFLN 60x/1.0 W; the microscope objective 111 can be an Olympus ApoN 100x/1.49 Oil; the first dichroic mirror 112 can be a Chroma ZT405/488/561/640-phase R; the focal length of lens 122 may be 200 mm; the second dichroic mirror may be a Thorlabs DMLP550L; the focal length of lens 120 may be 180 mm; the focal length of lens 119 may be 250 mm; the focal length of lens 116 may be 100 mm; the polarization-independent beam splitter prism 118 may be a Thorlabs BS013; the first camera may be an sCMOS camera (Hamamatsu ORCA-Flash4.0 V3, C13440), which provides sufficient total photon flux within a 50 μs exposure time.
In embodiments of the present application, the optical diffraction tomography subsystem 210 may be used to acquire optical diffraction tomography images. In order to better describe the characteristics of the optical diffraction tomography subsystem 210 of the present embodiment, specific experimental results will be described below.
The optical diffraction tomography subsystem 210 of the present embodiment has label-free imaging capability. FIG. 12 shows optical diffraction tomographic images of patient fibroblasts obtained using the optical diffraction tomography subsystem 210: the left panel shows control cells; the middle panel, cells of a first patient; the right panel, cells of a second patient. Compared with the control-group morphology, the patient cells exhibit fewer and shorter mitochondria, enlarged lysosomes, and abnormal structures. In conventional pathological imaging, ethical requirements forbid exogenous fluorescent labeling of living human cells, which hinders pathological research at the organelle scale. As FIG. 12 shows, the optical diffraction tomography subsystem 210 has excellent label-free imaging capability.
The optical diffraction tomography subsystem 210 of the embodiment of the application has better resolution and three-dimensional imaging capability. FIG. 13 shows the results of optical diffraction tomography of fixed INS-1 cells, where (a) is the original hologram; (b) is the phase image; (c) is a three-dimensional rendering of the refractive index distribution; (d) is a wide-field fluorescence image; and (e) is the single-layer diffraction tomographic image, acquired with the optical diffraction tomography subsystem 210, corresponding to the fluorescence image. The scale bars in (b), (d), and (e) are 10 μm. Comparing (b) with (e) shows that optical diffraction tomography has better resolution and three-dimensional imaging capability than traditional phase imaging. Furthermore, comparing (d) with (e) shows that the optical diffraction tomography image collected by the optical diffraction tomography subsystem 210 can be used to characterize not only lipid droplets but also other unlabeled structures in the cell. Further, FIG. 14 compares wide-field fluorescence images of fixed primary hepatocytes with the corresponding optical diffraction tomographic images, where (a) shows wide-field fluorescence images at different depths and (b) shows single-layer diffraction tomographic images at the corresponding depths; the scale bars in (a) and (b) are each 10 μm. The lipid droplet structures at different depths in the optical diffraction tomography images correspond to those in the wide-field fluorescence images, verifying the three-dimensional imaging capability of optical diffraction tomography.
The optical diffraction tomography subsystem 210 of the present embodiment also has fast, long-term, non-destructive imaging capability. Specifically, because the optical diffraction tomography subsystem 210 does not require staining or labeling of the cells, its cytotoxicity is low and it is suitable for long-term cell imaging. For example, FIG. 15 shows the results of imaging COS-7 cells continuously at 10 s intervals for 83 min using the optical diffraction tomography subsystem 210. In FIG. 15, (a) is a Z-plane image of the cells at time 00:49:30, where a Z plane is a transverse plane perpendicular to the optical axis, selected according to the content of the three-dimensional image; (b) shows enlarged images, at four different moments, of the area indicated by the dashed box in (a), showing chromosome segregation and aggregation, nuclear membrane formation, and chromatin condensation into nuclei; (c) is an image of the cells in another Z plane (0.86 μm below the plane in (a)) at time 00:00:00, and (d) is an enlarged image of the area indicated by the dashed box in (c); (e) is a cell image in a third Z plane (1.72 μm below the plane in (a)) at time 00:43:10; (f) shows the area indicated by the dashed box in (e) at two different time points, showing that tubular organelles stretch and twist during cell division and are arranged radially outside the nucleus after division; (g) is an enlarged image of another cell, showing a plane of the nuclear region before division.
Five time points are shown: the nucleus and associated nucleolar structures are clearly visible (0'00"); one region of the nuclear membrane (arrow) is deformed by many incoming cellular structures (27'30"); another region of the nuclear membrane begins to rupture (arrow, 29'10"); chromosomes appear (31'10"); and the chromosomes arrange themselves in petal-like shapes (38'10"). The scale bars in FIG. 15 are 5 μm (a, c, e) and 2 μm (b, d, f, g). In FIG. 15(a), a chromosome-like double structure can be observed in the nuclear region of dividing cells. During mitosis, the chromosomes first separate and then form two large, closely connected, high-density plaques. In FIG. 15(b), the chromosomes are pulled apart and then aggregate and fuse to form new nuclei. As shown in FIG. 15(c-d), various structures with different shapes, densities, and dynamics can also be observed in the cytoplasm. For example, complex filament structures can be observed at the centrosome, while in other areas bright vesicles, large dark vesicles, and black vacuole-like vesicles aggregate. Thanks to the label-free nature of optical diffraction tomography, the dynamics of other organelles in the cytoplasm can be observed at the same time: as shown in FIG. 15(e-f), a vermicular tubular structure, identified as tubular mitochondria, can be observed. Tubular mitochondria stretch and twist during division and align radially outside the nuclear membrane after division. In some embodiments, as shown in FIG. 15(g), the nucleus and associated nucleolar structures rotate, after which many incoming organelles may attach to an area of the nuclear membrane and deform it.
Nuclear membrane disintegration was then observed in the region opposite the initial invagination site, followed by the appearance of chromosomal structures, which finally arranged themselves in a petal-like array. Although the optical diffraction tomography module reveals these unique spatiotemporal dynamics of subcellular structures, their identity remains to be explored by co-localization with the fluorescent organelle markers provided by simultaneous Hessian SIM imaging. In addition, the optical diffraction tomography subsystem 210 can also be used to observe other cell-biological processes, such as pollen tube growth (as shown in FIG. 16) and nematode embryo development (as shown in FIG. 17). FIG. 16 shows optical diffraction tomographic images of the pollen tube growth process: axial (z-direction) extremum projections at times 00:00, 03:20, 06:40, and 09:50, in order from left to right. FIG. 17 shows optical diffraction tomographic images of nematode embryo development: axial extremum projections at times 00:00, 04:20, 07:30, and 09:35, in order from left to right. The test results of FIGS. 15-17 show that the optical diffraction tomography subsystem 210 has fast, long-term, non-destructive imaging capability.
The structured light illumination fluorescence imaging subsystem 220 can be a system that employs structured illumination for fluorescence imaging. Traditional fluorescence microscopy is limited by the imaging bandwidth of the microscope, and its resolution is limited by the optical diffraction limit. Fine structural information of the sample corresponds to higher spatial frequencies, which cannot be imaged when they exceed the cutoff frequency of the optical transfer function of the microscope system. The structured light illumination fluorescence imaging subsystem 220 can load the illumination light with a known spatial frequency through a grating; the high-frequency information of the sample then mixes with the illumination light to generate lower spatial frequencies, i.e., the moire fringe effect. The moire fringes shift spatial high-frequency information that could otherwise not be collected to within the cutoff frequency of the optical transfer function, where it can be collected by the imaging system; post-processing (e.g., deconvolution) then restores it and improves the resolution of the microscopy system. In some embodiments, the structured light illumination fluorescence imaging subsystem 220 can include linear and/or non-linear structured light illumination imaging. In embodiments of the present application, the structured light illumination fluorescence imaging subsystem may be configured to perform super-resolution fluorescence imaging based on the second laser to obtain a structured light illumination fluorescence image of the sample.
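The moire frequency-mixing principle can be illustrated with a short numerical sketch; the cutoff and spatial frequencies below are arbitrary illustrative values, not parameters of the actual system:

```python
import numpy as np

# A sample frequency f_s above the system cutoff f_c mixes with the
# illumination frequency f_i; the difference frequency f_s - f_i falls
# inside the passband and can be collected.
f_c = 2.0    # hypothetical optical-transfer-function cutoff
f_s = 3.25   # fine sample detail, not directly observable (f_s > f_c)
f_i = 1.75   # spatial frequency loaded onto the illumination by the grating

n = 4096
x = np.linspace(0.0, 16.0, n, endpoint=False)
sample = np.cos(2 * np.pi * f_s * x)
illum = 1.0 + np.cos(2 * np.pi * f_i * x)   # non-negative intensity pattern
emission = sample * illum                   # fluorescence ~ sample x illumination

# Locate the strongest in-band spectral component of the emission.
spec = np.abs(np.fft.rfft(emission))
freqs = np.fft.rfftfreq(n, d=x[1] - x[0])
peak_in_band = freqs[np.argmax(np.where(freqs < f_c, spec, 0.0))]
print(peak_in_band)   # 1.5, i.e. f_s - f_i, now inside the passband
```

The sample frequency itself is invisible to the system, but its mixing product with the illumination lands inside the passband; SIM reconstruction later shifts this component back to its true frequency, which is what extends the effective bandwidth.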
In some embodiments, the structured light illumination fluorescence imaging subsystem 220 may include a second light source 133, a second acousto-optic modulator 132, a second polarization beam splitter prism 127, a second half-wave plate 128, a spatial light modulator 129, a spatial filter 125, a polarization rotator 123, a second camera 114, and the like. In some embodiments, the structured light illumination fluorescence imaging subsystem may further include lenses, single-mode fibers, dichroic mirrors, and/or couplers. In an embodiment of the present application, as shown in FIG. 1, the structured light illumination fluorescence imaging subsystem 220 may include a second light source 133, a second acousto-optic modulator 132, a coupler 136, a single-mode fiber 131, a lens 130, a second polarization beam splitter prism 127, a second half-wave plate 128, a spatial light modulator 129, a lens 126, a spatial filter 125, a lens 124, a polarization rotator 123, a second dichroic mirror 121, a lens 122, a first dichroic mirror 112, a microscope objective 111, a lens 113, a second camera 114, and the like.
In the embodiment of the present application, after the second laser light emitted by the second light source 133 is modulated in intensity and on-off timing by the second acousto-optic modulator 132, the +1st-order diffracted light is coupled into the single-mode fiber 131 through the coupler 136. After spatial filtering by the single-mode fiber 131, the second laser light is collimated by lens 130. After the collimated, linearly polarized laser light passes through the structured-light module composed of the second polarization beam splitter prism 127, the second half-wave plate 128, and the spatial light modulator 129, the structured-light modulation pattern on the spatial light modulator 129 is loaded onto the light field, i.e., structured light is formed. The structured light is focused by lens 126 onto the spatial filter 125, which passes the desired ±1st-order diffraction while blocking stray light generated by the spatial light modulator 129; the light is then re-collimated by lens 124. At the point where the beams converge, the liquid crystal polarization rotator rotates the polarization of the illumination light (forming the excitation light) according to the fringe direction, so that the light remains S-polarized with respect to the sample and can form interference fringes. The excitation light is reflected by the second dichroic mirror 121 to the lens 122 and guided by the first dichroic mirror 112 to the microscope objective 111 and onto the sample 110; the ±1st-order beams are focused on the back focal plane of the microscope objective 111, interfere at the sample 110 as diffraction-limited modulation fringes, and excite fluorescence. The excited fluorescence is collected by the microscope objective 111, guided through the first dichroic mirror 112 to the lens 113 (e.g., a microscope tube lens), and received by the second camera 114.
In one embodiment, the structured light illumination fluorescence imaging subsystem may be an ultrafast, long-term Hessian structured light illumination microscope (Hessian-SIM) system.
In some embodiments, the objective (e.g., microscope objective 111), first dichroic mirror 112, and/or second dichroic mirror 121 in the bimodal microscopy imaging system 100 may be shared by the optical diffraction tomography subsystem 210 and the structured light illumination fluorescence imaging subsystem 220. For example, the sample light generated in the optical diffraction tomography subsystem 210 and the fluorescence generated in the structured light illumination fluorescence imaging subsystem 220 may both be collected by the microscope objective 111. As another example, the first dichroic mirror 112 may be used to separate the sample light from the fluorescence. For another example, the second dichroic mirror 121 may be used to separate the sample light from the excitation light. For another example, the first dichroic mirror 112 and the second dichroic mirror 121 can together direct the sample light to the detection path of the optical diffraction tomography subsystem, and direct the excitation light of the structured light illumination fluorescence imaging subsystem to the objective lens to act on the sample.
In some embodiments, the first light source 101 and the second light source 133 may be two mutually independent light sources in the dual-modality microscopic imaging system. In some embodiments, the first laser light and the second laser light may be emitted by the same combined light source (e.g., a combination of the first light source 101 and the second light source 133). In the embodiment of the present application, the first laser and the second laser used by the optical diffraction tomography subsystem and the structured light illumination fluorescence imaging subsystem are emitted by the first light source 101 and the second light source 133, respectively, so that fluorescence imaging can be controlled independently; this effectively reduces the phototoxicity of the first and second lasers to the sample (e.g., cells), reduces photobleaching of the sample, and facilitates fast co-localized imaging. In some embodiments, the wavelengths of the first and second lasers may be the same. In some embodiments, the first laser light and the second laser light differ in wavelength. Specifically, the first light source 101 may include a single longitudinal mode laser emitting first laser light with a wavelength of 561 nm, and the second light source 133 may include a single longitudinal mode laser emitting second laser light with a wavelength of 488 nm.
In some embodiments, the structured light illuminated fluorescence imaging subsystem 220 can also acquire another structured light illuminated fluorescence image of the sample based on the third laser. Wherein the second laser light and the third laser light have different wavelengths. For example, the wavelength of the second laser may be 488 nm; the wavelength of the third laser light may be 498nm, 475nm, etc. In some embodiments, the third laser light may be emitted by the second light source. For example, the second light source may be caused to emit the second laser light or the third laser light by adjusting the second light source. In some embodiments, the third laser light may also be emitted by a separate third light source. For example, after performing fluorescence imaging using the second laser light emitted by the second light source, the second light source may be replaced with a third light source, and fluorescence imaging may be performed using the third laser light emitted by the third light source. The process of acquiring another structured light illumination fluorescence image of the sample based on the third laser is similar to the process of acquiring the structured light illumination fluorescence image of the sample based on the second laser, and is not described in detail here. In some alternative embodiments, the structured light illuminated fluorescence imaging subsystem 220 can also acquire more structured light illuminated fluorescence images of the sample based on more lasers of different wavelengths (e.g., a fourth laser, a fifth laser, etc.). By adopting the lasers with different wavelengths, the structured light illumination fluorescence images with different resolutions can be obtained, so that more information of the sample can be obtained, and the sample can be observed more comprehensively and accurately.
In some embodiments, the optical diffraction tomography image of the sample acquired by the optical diffraction tomography subsystem 210 can be a two-dimensional image and/or a three-dimensional image. In some embodiments, the structured light illumination fluorescence image of the sample acquired by the structured light illumination fluorescence imaging subsystem 220 can be a two-dimensional image and/or a three-dimensional image. In one embodiment, the optical diffraction tomography image may be a three-dimensional image and the structured light illumination fluorescence image may be a two-dimensional image. In particular, label-free optical diffraction tomography can be assisted by structured light illumination fluorescence imaging. The bimodal microscopy imaging system 100 may perform fluorescence imaging and label-free optical diffraction tomography at a speed of no less than 0.3 Hz (e.g., 0.3 Hz, 0.4 Hz, 0.5 Hz, 0.7 Hz, 0.8 Hz, etc.) over a field of view of no less than 80 μm x 40 μm (e.g., 80 μm x 40 μm, 100 μm x 50 μm, etc.). The lateral resolution of the obtained optical diffraction tomography image can be no worse than 200 nm (e.g., 200 nm, 190 nm, 180 nm, etc.), and the longitudinal resolution can be no worse than 560 nm (e.g., 560 nm, 540 nm, 500 nm, etc.); the lateral resolution of the resulting structured light illumination fluorescence image may be no worse than 100 nm (e.g., 100 nm, 95 nm, 90 nm, etc.).
The control subsystem 230 may be used to control various components of the dual-modality microscopic imaging system 100. In some embodiments, the control subsystem 230 may control one or more of the first light source 101, the first acousto-optic modulator (AOM)102, the Galvanometer (GM)107, the second camera 114, the first camera 117, the second acousto-optic modulator 132, the second light source 133, and so on. For example, the control subsystem 230 may control the first light source 101 to emit the first laser light. For another example, the control subsystem 230 may control the second light source 133 to emit the second laser light. As another example, the control subsystem 230 may control the galvanometer 107 to rotate. As another example, the control subsystem 230 may control the exposure of the second camera 114 and/or the first camera 117.
In some embodiments, the control subsystem 230 may control the timing of the operation of the optical diffraction tomography subsystem 210 and the structured light illuminated fluorescence imaging subsystem 220 to achieve simultaneous or alternating label-free optical diffraction tomography and fluorescence imaging. For more details regarding control subsystem 230, reference may be made to FIG. 3 and its associated description.
The processor 240 may be used to process information/data in the dual-modality microscopic imaging procedure. In some embodiments, the processor 240 may determine a bimodal fusion image of the same location of the sample based on the optical diffraction tomography image and the structured light illumination fluorescence image. The bimodal fusion image may carry both morphological information and class label information of the sample. The morphological information of the sample may include the size and shape of the sample, and the like. The class label information of the sample may include the type of the labeled portion of the sample (e.g., which specific organelles are labeled).
In some embodiments, processor 240 may include microcontrollers, microprocessors, Reduced Instruction Set Computers (RISC), Application Specific Integrated Circuits (ASIC), application specific instruction set processors (ASIP), Central Processing Units (CPU), Graphics Processing Units (GPU), Physical Processing Units (PPU), microcontroller units, Digital Signal Processors (DSP), Field Programmable Gate Array (FPGA), Advanced RISC Machines (ARM), programmable logic devices, any circuit or processor capable of executing one or more functions, and the like, or any combination thereof.
FIG. 3 is a schematic diagram of a control subsystem of a dual-modality microscopic imaging system, shown in accordance with some embodiments of the present application. In some embodiments, the control subsystem 230 may be implemented by one or more of a data acquisition card (DAQ), a single-chip microcomputer, a Field Programmable Gate Array (FPGA), and the like. For example only, as shown in FIG. 3, the control subsystem 230 may include a control computer (PC) and a data acquisition card (DAQ). Four analog output channels of the data acquisition card respectively control the scanning of the two-axis galvanometer, the Z-axis piezoelectric displacement stage, and the liquid crystal polarization rotator, where the Z axis represents the direction perpendicular to the plane of the slide carrying the sample (e.g., biological cells). Specifically, the data acquisition card may output control voltages at the analog output ports AO0 and AO1 to the galvanometer servo circuit to control the galvanometer angle, and thereby control the deflection of the beam passing through the galvanometer 107 along the x and y directions, where x and y are two orthogonal directions parallel to the sample slide. In some embodiments, the data acquisition card may output a control voltage at analog output AO2 to control the Z-axis piezoelectric displacement stage, which may be used to move the sample along the Z axis. The data acquisition card can output a control voltage at analog output port AO3 to control a signal generator; the liquid crystal polarization rotator is driven by an amplitude-modulated 20 kHz square-wave signal generated by the signal generator, with the modulation amplitude set by the data acquisition card.
The spatial light modulator 129 and the two cameras are controlled by four programmable digital output ports of the data acquisition card: PFI0.0 and PFI0.1 control the switching and triggering of the spatial light modulator 129, while PFI0.2 and PFI0.3 output external trigger signals for the cameras. Specifically, the digital output port PFI0.2 may output a delayed, duty-cycle-adjustable square-wave signal to the external trigger port of the camera, so that the camera exposes synchronously during the galvanometer scan. The scan voltage may be sine and cosine sawtooth waveforms in the range of ±205 mV, and the camera trigger may use a 5 V TTL level. Meanwhile, the camera exposure output port can output a 5 V TTL exposure signal to the first acousto-optic modulator 102 to control the exposure time. For more details on the control subsystem 230 and the control timing, reference may be made to FIG. 4 and its associated description.
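A minimal sketch of the AO0/AO1 galvanometer drive implied by this description, assuming one voltage sample per camera frame and one full ring on the objective back focal plane for the oblique frames (the 240-frame count, the normal-incidence first and last frames, and the ±205 mV range follow the text; everything else is illustrative):

```python
import numpy as np

N_FRAMES = 240       # holograms per acquisition group (from the text)
AMPLITUDE_V = 0.205  # +/-205 mV scan range (from the text)

def galvo_waveforms(n_frames=N_FRAMES, amp=AMPLITUDE_V):
    # First and last frames: normal incidence, 0 V on both axes.
    # Remaining frames: cosine on AO0 (x), sine on AO1 (y), tracing a ring.
    ao0 = np.zeros(n_frames)
    ao1 = np.zeros(n_frames)
    theta = np.linspace(0.0, 2.0 * np.pi, n_frames - 2, endpoint=False)
    ao0[1:-1] = amp * np.cos(theta)
    ao1[1:-1] = amp * np.sin(theta)
    return ao0, ao1

ao0, ao1 = galvo_waveforms()
```

In the actual system these samples would be written to the DAQ analog outputs once per frame trigger; the sketch only captures the waveform shape.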
In some embodiments, the bimodal microscopy imaging system may be a super-resolution fluorescence assisted diffraction tomography (SR-FACT) system. According to the Rayleigh criterion, the resolution of the bimodal microscopic imaging system is as follows:
$$d = \frac{0.82\,\lambda}{\mathrm{NA}}$$
In this formula, λ represents the illumination wavelength and NA represents the numerical aperture of the optical system. In an optical diffraction tomography imaging system, rotational illumination extends the lateral spatial-frequency limit. Theoretically, the fringe field in an optical diffraction tomography system produces a frequency shift in the detection frequency domain, extending the lateral bandwidth according to $k_{\parallel,\mathrm{odt}} = k_{\parallel,\mathrm{det}} + k_{\parallel,\mathrm{ill}}$, where $k_{\parallel,\mathrm{det}}$ and $k_{\parallel,\mathrm{ill}}$ represent the maximum lateral wave vectors admitted by the detection and illumination objectives, respectively; the numerical aperture of the illumination objective is 1.0. Although a detection objective with a numerical aperture of 1.45 has also been used, the maximum usable detection numerical aperture is still limited by the lower refractive index (RI) of the immersion solution. If the live cells are kept in PBS buffer (n ≈ 1.333), the effective numerical aperture of the optical diffraction tomography system can be $\mathrm{NA}_{\mathrm{eff}} = \mathrm{NA}_{\mathrm{det}} + \mathrm{NA}_{\mathrm{ill}} \approx 2.33$.
When laser light with a wavelength of 561 nm is used for illumination, the theoretical lateral resolution of the optical diffraction tomography system is about 197 nm, which was confirmed experimentally with COS-7 cells (see FIG. 19).
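The quoted figure can be checked with simple arithmetic, taking a coherent-imaging Rayleigh coefficient of 0.82 (an assumption chosen to match the ~197 nm value quoted here) and the numerical apertures given earlier:

```python
# Back-of-envelope check of the lateral-resolution figure.
WAVELENGTH_NM = 561.0
NA_ILL = 1.0     # illumination objective (60x/1.0W)
NA_DET = 1.333   # detection limited by the PBS buffer refractive index

na_eff = NA_DET + NA_ILL                       # effective NA, about 2.33
lateral_res_nm = 0.82 * WAVELENGTH_NM / na_eff
print(round(lateral_res_nm))                   # about 197 nm
```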
The axial resolution of the optical diffraction tomography system is complicated by the non-uniform distribution of spatial frequencies along the longitudinal direction (see FIG. 33c). The longitudinal spatial-frequency bandwidth depends essentially on the transverse frequency: in the low-$k_{\parallel}$ region the frequency support vanishes along the $k_{\perp}$ direction (the missing cone), so the actual point spread function has a large longitudinal full width at half maximum (FWHM). When imaging living cells, whose structures contain high transverse frequencies, the usable longitudinal bandwidth increases (see FIG. 29c), where the longitudinal frequency bandwidth can be written as:

$$k_{\perp,\mathrm{odt}} = k_{\perp,\mathrm{det}} + k_{\perp,\mathrm{ill}}$$

$$k_{\perp,\mathrm{det(ill)}} = \frac{n_{0} - \sqrt{n_{0}^{2} - \mathrm{NA}_{\mathrm{det(ill)}}^{2}}}{\lambda}$$

In these formulas, $n_{0} = 1.33$ and $\lambda = 561\ \mathrm{nm}$, so a simple estimate of the longitudinal resolution is obtained:

$$\delta z_{\mathrm{odt}} = \frac{1}{k_{\perp,\mathrm{odt}}} \approx 560\ \mathrm{nm}$$
FIG. 4 is a control timing diagram of a dual-modality microscopic imaging system, according to some embodiments of the present application. In particular, the control sequence of the dual-modality microscopic imaging system may be implemented by the control subsystem 230. In embodiments of the present application, the dual-modality microscopic imaging system 100 may perform dual-modality microscopic imaging of the sample (e.g., long-term dynamic observation) over multiple time periods, with a certain time interval (e.g., 0.5 s, 1 s, 3 s, etc.) between two adjacent periods. Within one time period, as shown in FIG. 4, the dual-modality microscopy imaging system 100 may perform optical diffraction tomography (ODT) of the sample based on the optical diffraction tomography subsystem 210 and structured light illumination fluorescence imaging (SIM) of the sample based on the structured light illumination fluorescence imaging subsystem 220.
When optical diffraction tomography is performed, the data acquisition card (DAQ) in the control subsystem 230 can output X-axis and Y-axis scanning voltages for the galvanometer 107 (e.g., sine-cosine sawtooth voltages in the range of ±205 mV) to the galvanometer servo circuit at analog output ports AO0 and AO1, respectively, to deflect the beam in the X and Y directions. Meanwhile, the digital output port PFI0.2 may output a first-camera trigger signal (e.g., a 5 V TTL level) to the first camera 117 (the ODT camera), and the exposure output port of the first camera may output an exposure signal (e.g., a 5 V TTL signal) to the first acousto-optic modulator 102, turning on the first acousto-optic modulator 102 to release the first laser, trigger the exposure of the first camera 117, and control the exposure time. As shown in FIG. 4, the data for each time period may consist of 240 off-axis holograms at different illumination angles. The maximum frame rate of the first camera is 200 fps at 1024x1024 pixels; the practical frame rate is 196 fps, so acquiring a single group of data takes about 1.225 s. The illumination light for the first and last images is at normal incidence to the sample plane; for the remaining images, the galvanometer scans the focus of the beam in a ring on the back focal plane of the objective lens.
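The acquisition-time figure follows directly from the frame count and frame rate given here:

```python
# Timing sanity check for one ODT acquisition group:
# 240 off-axis holograms at the practical frame rate of 196 fps.
N_HOLOGRAMS = 240
FRAME_RATE_FPS = 196.0

group_time_s = N_HOLOGRAMS / FRAME_RATE_FPS   # about 1.225 s per group
frame_period_ms = 1000.0 / FRAME_RATE_FPS     # about 5.1 ms per hologram
```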
In some embodiments, the first camera 117 may be an sCMOS camera whose rows enter the exposure state line by line (rolling shutter). To increase the data acquisition speed, the sCMOS camera can be operated in a synchronous-exposure mode. Specifically, the rising edge of the camera's exposure trigger is synchronized with the start of the exposure of rows 1023 and 1024; after all rows from 513 to 1535 have entered the exposure state, the camera outputs a square-wave switching signal of adjustable width to the first acousto-optic modulator 102 (the ODT AOM in the figure), so that the system is exposed synchronously for a set time. After the exposure ends, the sCMOS chip reads out data row by row from the middle toward the top and bottom, and then the next exposure cycle begins. To minimize the effects of noise and vibration accumulating over time in the system, the on-time of the first acousto-optic modulator 102 may be set to 50 μs.
In some embodiments, the normal-incidence images in each set of data collected by the optical diffraction tomography subsystem 210 may be used for sequence checking during data processing, to reduce timing errors caused by occasional frame drops during high-speed acquisition by the camera. Meanwhile, in order to eliminate the effect of nonuniform amplitude and residual nonuniform phase imposed on the illumination beam by imperfections of the optical system, a set of background data can be collected in a sample-free area of the sample slide each time the sample is changed, for use in data processing.
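The background correction can be sketched as a complex-field division; the fields below are synthetic stand-ins for the reconstructed holographic fields, so this is only an illustration of the principle, not the actual processing code:

```python
import numpy as np

rng = np.random.default_rng(0)
shape = (64, 64)

# Hypothetical slowly varying system error: nonuniform amplitude and phase
# imposed on the illumination by optical-system imperfections.
aberration = (1.0 + 0.1 * rng.standard_normal(shape)) * \
             np.exp(1j * 0.3 * rng.standard_normal(shape))
true_field = np.exp(1j * 0.8) * np.ones(shape)   # ideal sample response

background_field = aberration            # recorded in a sample-free area
sample_field = true_field * aberration   # recorded with the sample in place

corrected = sample_field / background_field   # system error divides out
phase = np.angle(corrected)                   # uniform 0.8 rad remains
```

Dividing the sample field by the background field cancels any multiplicative system error that is common to both recordings, which is why one background set per sample suffices.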
When the optical diffraction tomography subsystem 210 has finished acquiring data for a time period, data acquisition by the structured light illumination fluorescence imaging subsystem 220 can be performed. Specifically, the data acquisition card (DAQ) may provide a start signal to the spatial light modulator 129 (SLM) each time structured light illumination fluorescence imaging is performed; this SLM start signal ensures that the first image of each set of acquired data is correct, avoiding a situation in which several sets of data become misaligned because one or more images are missing from a set during continuous shooting. In some embodiments, the spatial light modulator 129 may receive multiple (e.g., 9) trigger signals to form multiple (e.g., 9) structured-light modulation patterns, and the corresponding (e.g., 9) structured light illumination fluorescence images are obtained in sequence. After the spatial light modulator 129 is triggered, the data acquisition card may output a trigger signal to the second camera 114 (the SIM camera), and the exposure output port of the second camera 114 may output a trigger signal to the second acousto-optic modulator 132, turning on the second acousto-optic modulator 132 to release the second laser and expose the second camera 114. After one set of exposures is complete, the data, consisting of 9 structured light illumination fluorescence images, is read out by the CMOS chip of the second camera 114. In some embodiments, the exposure time of the second camera 114 may be longer than that of the first camera; for example, it may be set to 30 ms to ensure an adequate fluorescence image signal-to-noise ratio while avoiding bleaching of the cell sample.
In some embodiments, to obtain three sets of structured-light modulation patterns with different orientations, a different drive voltage may be applied to the polarization rotator 123 every three exposures of the second camera 114, changing the rotation angle of the polarization rotator 123 for each fringe orientation.
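The resulting 9-frame exposure plan (three orientations times three phase steps, with the rotator voltage switched every three frames) can be sketched as follows; the voltage values are placeholders, not the actual drive levels:

```python
ORIENTATION_VOLTAGES = [1.0, 2.0, 3.0]  # hypothetical rotator drive levels
PHASE_STEPS = 3                         # phase shifts per fringe orientation

def sim_frame_plan():
    # One entry per raw SIM exposure; the rotator voltage changes only at
    # the start of each new orientation, i.e. every three exposures.
    plan = []
    for orientation, voltage in enumerate(ORIENTATION_VOLTAGES):
        for phase in range(PHASE_STEPS):
            plan.append({"frame": len(plan),
                         "orientation": orientation,
                         "phase_step": phase,
                         "rotator_voltage": voltage})
    return plan

plan = sim_frame_plan()
print(len(plan))   # 9 raw images per structured-light group
```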
FIG. 5 illustrates an exemplary flow chart of a method of dual modality microscopy imaging according to some embodiments of the present application. In some embodiments, the dual-modality microscopic imaging method 500 may be performed by the dual-modality microscopic imaging system 100. As shown in fig. 5, the bimodal microscopy imaging method 500 may include:
step 510, respectively generating a first laser and a second laser by using mutually independent light sources. In some embodiments, the first laser light is generated by a first light source of the optical diffraction tomography subsystem 210; the second laser light is generated by a second light source of the structured light illumination fluorescence imaging subsystem 220. The first light source and the second light source are two independent lasers in the dual-mode microscopic imaging system 100. In some embodiments, the wavelengths of the first and second lasers may be the same. In some embodiments, the wavelengths of the first and second lasers may be different. For example, the wavelength of the first laser light may be 561nm, and the wavelength of the second laser light may be 488 nm.
Step 520, acquiring an optical diffraction tomography image of the sample based on the first laser by using an optical diffraction tomography subsystem. In some embodiments, the optical diffraction tomography subsystem 210 may utilize the first laser to act on the sample to obtain a first imaging data set (e.g., the raw optical diffraction tomography data shown in fig. 6) of the sample, and specific embodiments may refer to the description above regarding the optical diffraction tomography subsystem 210 and will not be described herein again. Wherein the first imaging data set of the sample may comprise sample information.
In some embodiments, the processor 240 may process the first imaging data set of the sample using a diffraction tomography reconstruction algorithm to generate an optical diffraction tomography image. FIG. 6 is a flow chart of a diffraction tomography reconstruction algorithm according to some embodiments of the present application. FIG. 11a is a schematic diagram of an optical diffraction tomography image reconstruction algorithm according to some embodiments of the present application. As shown in fig. 6 and fig. 11a, two sets of data, a sample-free background hologram set and holograms of the sample at a plurality of time points, are input as the raw data of optical diffraction tomography. Background holograms are acquired at l different illumination angles in the absence of the sample; with the sample present, the live cell sample is imaged continuously, acquiring one hologram at each of the same illumination angles at every time point. In some embodiments, the diffraction tomography reconstruction algorithm applied to the first imaging data set of the sample may include the following steps: holographic processing, wave vector calculation, sequence checking, Rytov approximation, spectrum stitching, and inverse filtering. For convenience of explanation, these steps are described below in the order of the reconstruction algorithm flow.
In some embodiments, the holographic processing step may be used to extract a hologram based on the first imaging data set, where the hologram may comprise an amplitude image and a phase image. In some embodiments, the holographic processing step may employ techniques including, but not limited to, on-axis digital holography, off-axis digital holography, heterodyne holography, phase-shifting holography, and the like, to extract the hologram. By way of example only, in the holographic processing step, off-axis digital holography may be used to derive the hologram (i.e., a two-dimensional light field amplitude image and phase image) from the first imaging data set.
In off-axis digital holography, the reference light does not propagate coaxially with the sample light; instead there is an off-axis angle θ. The off-axis angle introduces a spatial carrier frequency

α = sin θ / λ

to the reference light on the holographic image plane, where λ represents the wavelength and θ represents the off-axis angle. The reference light at the holographic image plane can then be written as:

R(x, y) = r·e^{-i2παx}  (1)

In formula (1), r represents the amplitude of the reference beam at the hologram plane.
The reference light and the sample light

O(x, y) = o(x, y)·e^{iφ(x, y)}

interfere to form the hologram, which can be represented as:

I(x, y) = |R + O|^2 = r^2 + o^2(x, y) + 2r·o(x, y)·cos[2παx + φ(x, y)]  (2)

where o(x, y) represents the amplitude of the sample light at the holographic image plane and φ(x, y) represents the phase distribution of the sample light at the holographic image plane. The hologram thus appears as interference fringes modulated in the x direction, and the phase variation within the fringes directly reflects the phase distribution of the sample light.
When holographic reproduction is performed, the original reference light generates the restored light field:

R(x, y)·I(x, y) = r(r^2 + o^2)·e^{-i2παx} + r^2·o(x, y)·e^{iφ(x, y)} + r^2·o(x, y)·e^{-i[4παx + φ(x, y)]}  (3)

Due to the off-axis angle, a virtual image is generated in the incoming direction of the original sample light of the hologram, while a real image is generated behind the holographic image plane in the direction at twice the off-axis angle from the original sample light; the real and virtual images are therefore separated spatially.
Further, in the off-axis digital holography technique, the acquired off-axis hologram can be represented discretely as:

I(mΔp, nΔp) = r^2 + o^2(mΔp, nΔp) + 2r·o(mΔp, nΔp)·cos[2πα·mΔp + φ(mΔp, nΔp)]  (4)

In formula (4), m denotes the m-th pixel in the x direction, n denotes the n-th pixel in the y direction, and Δp denotes the pixel size.
In off-axis digital holographic reproduction, the digital hologram is first multiplied by the reference phase e^{-i2πα·mΔp}, the DC term and the conjugate term are then filtered out by low-pass spatial filtering, and the result is finally normalized by the reference light image I_Ref(mΔp, nΔp) to yield the amplitude and phase images:

o(mΔp, nΔp)·e^{iφ(mΔp, nΔp)} = DFT^{-1}{L·DFT[I(mΔp, nΔp)·e^{-i2πα·mΔp}]} / I_Ref(mΔp, nΔp)  (5)

where L denotes a low-pass matrix and DFT{·} denotes the discrete Fourier transform. After the amplitude image and the phase image are obtained, the three-dimensional light field can be computed using the Fresnel diffraction formula, and the corresponding phase distribution is obtained from the argument of the complex light field.
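A minimal NumPy sketch of this demodulation, illustrating the carrier multiplication, low-pass matrix, and reference normalization described above. The function name and the cycles-per-pixel carrier convention are illustrative choices, not part of the patent:

```python
import numpy as np

def demodulate_off_axis(hologram, i_ref, carrier, cutoff):
    """Recover amplitude and phase images from one off-axis hologram.

    hologram : 2-D intensity image I(m*dp, n*dp)
    i_ref    : reference-light intensity image used for normalization
    carrier  : (cx, cy) spatial carrier frequency in cycles per pixel
    cutoff   : low-pass radius in cycles per pixel
    """
    ny, nx = hologram.shape
    yy, xx = np.mgrid[0:ny, 0:nx]
    # Shift the modulation term to the spectrum centre (reference-phase multiply).
    demod = hologram * np.exp(-2j * np.pi * (carrier[0] * xx + carrier[1] * yy))
    spec = np.fft.fftshift(np.fft.fft2(demod))
    # Low-pass matrix L: keep only frequencies below the cutoff.
    fy = np.fft.fftshift(np.fft.fftfreq(ny))
    fx = np.fft.fftshift(np.fft.fftfreq(nx))
    FX, FY = np.meshgrid(fx, fy)
    L = (FX ** 2 + FY ** 2) <= cutoff ** 2
    field = np.fft.ifft2(np.fft.ifftshift(spec * L))
    field = field / np.maximum(i_ref, 1e-12)  # normalize by the reference image
    return np.abs(field), np.angle(field)
```

With a carrier well outside the low-pass radius, the DC and conjugate terms fall outside the filter and only the sample term survives.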
It should be noted that, since the sample wavefront to be recorded often has a certain bandwidth, too small an off-axis angle cannot sufficiently separate the terms in the spatial spectrum, causing background noise to appear. Assume the sample light O(x, y) has a cutoff frequency u_m, i.e.:

FT{O(x, y)} = Õ(u, v) = 0,  for √(u^2 + v^2) > u_m  (6)

In formula (6), u represents the spatial frequency in the x direction and v the spatial frequency in the y direction.
where FT{·} denotes the Fourier transform. The spatial spectrum of the hologram obtained from equation (2) is then:

FT{I(x, y)} = r^2·δ(u, v) + Õ(u, v) ⋆ Õ(u, v) + r·Õ(u − α, v) + r·Õ*(−u − α, −v)  (7)

where δ(u, v) represents a Dirac function on the spatial frequency coordinates and ⋆ represents the autocorrelation operation. The autocorrelation makes the cutoff frequency of the second term in equation (7) equal to 2u_m, while the cutoff frequencies of the last two terms remain u_m. It follows that, for the terms in equation (7) not to overlap on the spatial frequency spectrum, the spatial carrier frequency must satisfy:

α ≥ 3u_m  (8)

From this, the condition that the off-axis angle in the off-axis digital holography technique should satisfy is obtained as:

θ ≥ arcsin(3u_m·λ)  (9)
In some embodiments, since the off-axis angle is not known exactly, in the holographic processing step the angle between the reference light and the illumination light carried in the signal light of each hologram may be determined from the distance between the DC term and the modulation term measured in the hologram frequency domain. The frequency-domain displacement recorded for the jth hologram is denoted d_j = (d_{x,j}, d_{y,j}), in units of pixels. Since an equally spaced annular illumination scanning mode is used, the illumination components cancel on average over one ring, and the reference wave vector can be obtained with sub-pixel precision by averaging the l displacements:

k_ref = (1/l)·Σ_{j=1}^{l} d_j  (10)

Meanwhile, the lateral component of the illumination light wave vector at the jth illumination angle (corresponding to the jth hologram) can be estimated as:

k⊥,j^(0) = (2π/(N·Δp))·(d_j − k_ref)  (11)

In formula (11), k⊥,j^(0) is the initial estimate of the lateral component of the illumination light wave vector at the jth illumination angle, and N is the number of pixels along one dimension of the hologram.
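The peak location and ring averaging just described can be sketched as follows (hypothetical helper names; displacements are measured in FFT pixels, and the carrier is assumed to lie in the +x half-plane to resolve the conjugate-peak ambiguity of a real-valued hologram):

```python
import numpy as np

def modulation_peak(hologram, exclude=3):
    """Locate the off-axis modulation peak in the hologram spectrum,
    in pixels relative to the DC term."""
    spec = np.abs(np.fft.fftshift(np.fft.fft2(hologram)))
    ny, nx = spec.shape
    cy, cx = ny // 2, nx // 2
    # Suppress the DC term and, for a real hologram, the conjugate half-plane;
    # here the carrier is assumed to lie in the +x half-plane.
    spec[cy - exclude:cy + exclude + 1, cx - exclude:cx + exclude + 1] = 0
    spec[:, :cx] = 0
    py, px = np.unravel_index(np.argmax(spec), spec.shape)
    return np.array([px - cx, py - cy], dtype=float)

def estimate_wave_vectors(holograms):
    """Average the peak displacements of an equally spaced annular scan to get
    the reference carrier, then subtract it to obtain the per-angle initial
    lateral illumination wave-vector estimates (in pixel units)."""
    peaks = np.array([modulation_peak(h) for h in holograms])
    k_ref = peaks.mean(axis=0)   # illumination components cancel on the ring
    return k_ref, peaks - k_ref
```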
In some embodiments, the wave vector calculation step may be used to determine the target scanning wave vector based on the phase image and to generate the unwrapped phase image. In particular, in high-speed diffraction tomography, a beam scanned at high speed is inevitably affected by small mechanical jitter of the opto-mechanical devices. In long-term diffraction tomography, since the repeatability of the illumination direction at different time points is difficult to guarantee, using the same reconstruction parameters at different time points introduces errors. In some embodiments, the wave vector calculation step therefore includes a vector iterative search algorithm that solves for the scanning wave vectors at the different angles, i.e., the target scanning wave vectors. As shown in figs. 7a-b, the process 700 of determining a target scanning wave vector using the vector iterative search algorithm may include the following four steps:
Step 710, multiplying the phase image extracted in the holographic processing step by a digital phase-shift term to obtain a frequency-shifted phase image and a preliminary estimated scanning wave vector.

In step 710, the phase image extracted in the holographic processing step is multiplied by the digital phase-shift term e^{-i k⊥,j^(0)·r}, yielding the frequency-shifted phase image and the preliminary estimated scanning wave vector, where k⊥,j^(0) is the initial estimate of the lateral component of the illumination light wave vector at the jth illumination angle obtained in the holographic processing step. How the preliminary estimated scanning wave vector is used to update the scanning wave vector is explained in step 730.
Step 720, unwrapping the frequency-shifted phase image using a least-squares-based phase unwrapping algorithm to obtain an unwrapped phase image with a low residual slope;
Step 730, performing a linear fit to the slope of the unwrapped phase image in the orthogonal directions to obtain an updated scanning wave vector. In some embodiments, the updated scanning wave vector is obtained from the fitted slope together with the preliminary estimated scanning wave vector.

In step 730, the slope a_j = (a_{x,j}, a_{y,j}) of the unwrapped phase image of the hologram at the jth illumination angle in the orthogonal (x, y) directions is obtained by linear fitting, and the scanning wave vector is updated:

k⊥,j^(n+1) = k⊥,j^(n) + a_j  (12)

In formula (12), k⊥,j^(n+1) is the new scanning wave vector for the hologram at the jth illumination angle, and k⊥,j^(n) is the preliminary estimated scanning wave vector at the jth illumination angle obtained in step 710.
Step 740, iterating with the updated scanning wave vector until the slope |a_j| falls below a preset threshold, yielding the target scanning wave vector. In some embodiments, the preset threshold may be set to 0.00001·2π.
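Steps 710-740 can be sketched as a small iteration loop. A plain row/column `np.unwrap` stands in here for the least-squares phase unwrapping of step 720, and the function name and units (rad/pixel) are illustrative:

```python
import numpy as np

def refine_wave_vector(phase_img, k0, tol=1e-5 * 2 * np.pi, max_iter=20):
    """Vector-iterative-search sketch: repeatedly remove the current wave-vector
    estimate from the wrapped phase, unwrap, fit the residual phase slope in x
    and y, and fold the slope back into the estimate until it drops below tol."""
    ny, nx = phase_img.shape
    yy, xx = np.mgrid[0:ny, 0:nx]
    k = np.asarray(k0, dtype=float)
    for _ in range(max_iter):
        # frequency shift by the current estimate (digital phase-shift term)
        shifted = np.angle(np.exp(1j * (phase_img - k[0] * xx - k[1] * yy)))
        # simple stand-in for the least-squares unwrapping of step 720
        unwrapped = np.unwrap(np.unwrap(shifted, axis=1), axis=0)
        ax = np.polyfit(np.arange(nx), unwrapped.mean(axis=0), 1)[0]
        ay = np.polyfit(np.arange(ny), unwrapped.mean(axis=1), 1)[0]
        k += np.array([ax, ay])          # update step, cf. Eq. (12)
        if max(abs(ax), abs(ay)) < tol:
            break
    return k
```

On clean synthetic data the residual slope vanishes after the first update, so the loop converges in two iterations.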
FIG. 8 shows the scanning wave vector k⊥,j at a certain angle. In the figure, "x" represents the scanning wave vector calculated from the background hologram, and "+" represents the scanning wave vectors calculated from 100 sets of sample holograms at an interval of 1 s. As shown in fig. 8, in the wave vector calculation step the average deviation of the scanning wave vector of the sample data from that of the background data is 0.018(7) μm^-1; this offset corresponds to 0.2% of k_m, or a scanning angle difference of 0.05°. Meanwhile, the standard deviation of the fluctuation of the illumination scanning wave vectors over the 100 consecutive time points corresponds to a fluctuation of 0.07% of k_m. Because of the high scanning speed (e.g., about 200 Hz) and the long recording time, mechanical jitter and instability of the galvanometer can cause the scanning wave vector to deviate from the specified angle; the resulting wave vector error then causes misalignment in the subsequent spectrum stitching step, which seriously degrades the contrast and resolution of the reconstructed image. Iterating the scanning wave vector with the vector iterative search (VISA) algorithm (fig. 7b) reduces this error. Fig. 9 is a reconstructed image without the scanning wave vector iteration, and fig. 10 a reconstructed image with it. Comparing figs. 9 and 10, the scanning wave vector iteration makes the background of the reconstructed image more uniform and improves its contrast.
In some embodiments, the diffraction tomography reconstruction algorithm may further comprise a sequence checking step, performed after the wave vector calculation step. The holograms include sample holograms and background holograms, and the sequence checking step may include: comparing the scanning wave vectors of a group of sample holograms at the same time point with the scanning wave vectors of the background holograms to obtain the scanning error of that group; when the scanning error is larger than a set threshold, marking the group of sample holograms as a sequence anomaly; and controlling the grouping of the first imaging data set and the spectrum stitching step based on the sequence anomalies. The wave vector calculation step accurately solves the phase image of each hologram obtained in the holographic processing step and yields its scanning wave vector. By comparing the scanning wave vectors of a group of holograms at the same time point with those of the background holograms, groups whose scanning error exceeds the set threshold can be marked. When controlling the grouping of the first imaging data set and the spectrum stitching step, the sample hologram data of the groups marked as sequence anomalies may simply be excluded. In this embodiment, a single frame may occasionally be lost due to file input/output errors during the storage of high-throughput data.
In the sequence checking step, the accurate scanning wave vector data corresponding to each hologram can thus be used to judge, under a given difference threshold, whether the image sequence is abnormal; the result is fed back to the optical diffraction tomography subsystem, which controls the grouping of the first imaging data set and the spectrum stitching step accordingly, ensuring that long-term data reconstruction is unaffected by occasional frame loss.
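A minimal sketch of this sequence check, assuming each hologram's scanning wave vector has already been computed (function name and array layout are illustrative):

```python
import numpy as np

def check_sequence(sample_kvecs, background_kvecs, threshold):
    """Flag groups of sample holograms whose scanning wave vectors deviate
    from the background set by more than `threshold` (e.g. after a dropped
    frame shifts the acquisition sequence).

    sample_kvecs     : (n_groups, l, 2) array, one wave vector per hologram
    background_kvecs : (l, 2) array from the sample-free background scan
    returns          : boolean array of length n_groups (True = anomaly)
    """
    err = np.linalg.norm(sample_kvecs - background_kvecs[None], axis=-1)
    return err.max(axis=1) > threshold
```

Groups flagged `True` would simply be excluded from the spectrum stitching step.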
In some embodiments, the Rytov approximation step may be used to determine the Rytov phase field based on the amplitude image and the unwrapped phase image. By way of example only, under the Rytov approximation the light field may be described by a complex phase ψ(r); the total light field u(r) and the incident light field u_0(r), described by complex phases, can be expressed as:

u(r) = e^{ψ(r)}  (13)

u_0(r) = e^{ψ_0(r)}  (14)

ψ(r) = ψ_0(r) + ψ_s(r)  (15)

where the complex phase ψ(r) of the total light field and the complex phase ψ_0(r) of the incident light field have the forms:

ψ(r) = ln[a(r)] + iφ(r)  (16)

ψ_0(r) = ln[a_0(r)] + iφ_0(r)  (17)

Here a(r) and φ(r) represent the light field amplitude and phase distribution, respectively; the light field amplitude may be determined from the amplitude image and the phase distribution from the unwrapped phase image. ψ_s(r) is the complex phase of the scattered light field.

The scattered field can now be expressed as:

u_s(r) = u(r) − u_0(r) = e^{ψ_0(r)}·[e^{ψ_s(r)} − 1]  (18)
The time-harmonic form of the Helmholtz equation is:

[∇^2 + k^2(r)]·u(r) = 0  (19)

In formula (19), k(r) is the wavenumber depending on the spatial distribution of the refractive index, u(r) is the spatial distribution of the optical field, and ∇ is the Hamilton (nabla) operator.
Substituting formula (13) into formula (19) gives:

[∇^2 + k_m^2]·e^{ψ(r)} = −f(r)·e^{ψ(r)}  (20)

In formula (20), k_m represents the average wavenumber in the medium and f(r) represents the scattering potential, where

f(r) = k^2(r) − k_m^2 = k_m^2·[n^2(r)/n_m^2 − 1]

with n(r) the refractive index distribution and n_m the average refractive index of the medium. The term ∇^2 e^{ψ(r)} can be expanded as:

∇^2 e^{ψ(r)} = {∇^2 ψ(r) + [∇ψ(r)]^2}·e^{ψ(r)}  (21)

Substituting formula (21) into formula (20):

∇^2 ψ(r) + [∇ψ(r)]^2 + k_m^2 + f(r) = 0  (22)
Equation (22) is a nonlinear inhomogeneous differential equation for the complex phase ψ(r). Similarly, the complex phase ψ_0(r) of the incident light field satisfies the homogeneous equation:

∇^2 ψ_0(r) + [∇ψ_0(r)]^2 + k_m^2 = 0  (23)

Substituting formula (15) into formula (22), the complex phase ψ_s(r) of the scattered light field satisfies:

∇^2 ψ_s(r) + ∇^2 ψ_0(r) + [∇ψ_s(r) + ∇ψ_0(r)]^2 + k_m^2 + f(r) = 0  (24)

Substituting the homogeneous equation (23) into equation (24) gives:

∇^2 ψ_s(r) + 2∇ψ_0(r)·∇ψ_s(r) = −[∇ψ_s(r)]^2 − f(r)  (25)
Considering the mathematical transformation

∇^2[u_0(r)·ψ_s(r)] = ψ_s(r)·∇^2 u_0(r) + 2∇u_0(r)·∇ψ_s(r) + u_0(r)·∇^2 ψ_s(r)  (26)

and using ∇^2 u_0(r) = −k_m^2·u_0(r) together with ∇u_0(r) = u_0(r)·∇ψ_0(r), one can further obtain:

[∇^2 + k_m^2]·[u_0(r)·ψ_s(r)] = u_0(r)·[∇^2 ψ_s(r) + 2∇ψ_0(r)·∇ψ_s(r)]  (27)

Comparison with equation (25) yields:

[∇^2 + k_m^2]·[u_0(r)·ψ_s(r)] = −u_0(r)·{[∇ψ_s(r)]^2 + f(r)}  (28)

In the Rytov approximation, the phase gradient term [∇ψ_s(r)]^2 is assumed small and may be omitted. The resulting ψ_R(r) is called the Rytov phase, and it satisfies the wave equation:

[∇^2 + k_m^2]·[u_0(r)·ψ_R(r)] = −u_0(r)·f(r)  (29)
Under the Rytov approximation, if the total field u(r) and the incident field u_0(r) are known, the Rytov phase is:

ψ_R(r) = ln[u(r)/u_0(r)]  (30)

Here the logarithm of a complex number is taken, so equation (30) can be further expanded as:

ψ_R(r) = ln[a(r)/a_0(r)] + i[φ(r) − φ_0(r)]  (31)

In formula (31), a(r) represents the total light field amplitude, a_0(r) the incident light field amplitude, φ(r) the phase distribution of the total light field, and φ_0(r) the phase distribution of the incident light field. Equation (31) is therefore the equation for determining the Rytov phase field from the amplitude image obtained in the holographic processing step and the unwrapped phase image obtained in the wave vector calculation step: evaluating it with the amplitude image and the low-residual-slope unwrapped phase image yields the Rytov phase field.
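Computing the Rytov phase field from the amplitude and unwrapped phase images is then a pixel-wise operation; a minimal sketch (illustrative function name):

```python
import numpy as np

def rytov_phase(amp, amp0, phase_unwrapped, phase0):
    """psi_R(r) = ln[a(r)/a0(r)] + i[phi(r) - phi0(r)], cf. Eq. (31).
    amp / phase_unwrapped come from the sample hologram processing;
    amp0 / phase0 from the corresponding background hologram."""
    return np.log(amp / amp0) + 1j * (phase_unwrapped - phase0)
```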
In some embodiments, the spectrum stitching step may be used for stitching in the frequency domain based on the Rytov phase field: the two-dimensional spectra are stitched into a three-dimensional spectrum. In some embodiments, the spectrum stitching step stitches the two-dimensional spectra of the Rytov phase field together with the complex transmission spectra parameterized by the target scanning wave vectors, obtaining a three-dimensional scattering spectrum and a three-dimensional complex transmission spectrum. In some embodiments, spectrum stitching may be achieved using frequency-domain interpolation (e.g., nearest-neighbor interpolation or linear interpolation). Taking nearest-neighbor interpolation as an example, each two-dimensional grid point of a two-dimensional spectrum is assigned directly to the closest three-dimensional grid point of the three-dimensional spectrum, so no additional interpolation or spectrum-spreading calculation is required and essentially no extra computation time is needed. For diffraction tomography microscopy applications requiring large amounts of data processing, the nearest-neighbor-based algorithm enables fast image processing and has been shown to give good computational accuracy under holographic bandwidth reconstruction.
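A simplified nearest-neighbor stitching sketch, assuming a shared lateral frequency axis and accumulating-then-averaging overlapping contributions (the complex-transmission-spectrum bookkeeping is omitted here for brevity; names are illustrative):

```python
import numpy as np

def stitch_spectrum(spec2d_list, kvecs, freqs, kz_grid, km):
    """Map each 2-D spectrum sample at lateral frequency (u, v), which lies on
    the Ewald sphere, to the nearest 3-D grid point at
    (u - kxj, v - kyj, sqrt(km^2 - u^2 - v^2) - kzj)."""
    n, nz = len(freqs), len(kz_grid)
    vol = np.zeros((n, n, nz), complex)
    hits = np.zeros((n, n, nz))
    U, V = np.meshgrid(freqs, freqs, indexing="ij")
    for spec, (kxj, kyj, kzj) in zip(spec2d_list, kvecs):
        kz_sq = km ** 2 - U ** 2 - V ** 2
        valid = kz_sq > 0                      # propagating components only
        kz_s = np.sqrt(np.maximum(kz_sq, 0.0))
        ix = np.abs(freqs[None, None, :] - (U - kxj)[:, :, None]).argmin(-1)
        iy = np.abs(freqs[None, None, :] - (V - kyj)[:, :, None]).argmin(-1)
        iz = np.abs(kz_grid[None, None, :] - (kz_s - kzj)[:, :, None]).argmin(-1)
        np.add.at(vol, (ix[valid], iy[valid], iz[valid]), spec[valid])
        np.add.at(hits, (ix[valid], iy[valid], iz[valid]), 1)
    return np.where(hits > 0, vol / np.maximum(hits, 1), 0)
```

`np.add.at` is used instead of fancy-indexed `+=` so that repeated hits on the same grid point accumulate correctly.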
In some embodiments, an inverse filtering step may be used to filter the spectrum stitching result to obtain the optical diffraction tomography image. During image acquisition, transmission, reconstruction, and storage, various factors, such as relative motion between the optical diffraction tomography subsystem and the sample, atmospheric turbulence, aberrations of the subsystem, and random environmental noise, cause blurring, distortion, and additive noise in the image (the spectrum stitching result); the inverse filtering step restores the image. Specifically, the inverse filtering step divides the three-dimensional scattering spectrum obtained in the spectrum stitching step by the three-dimensional complex transmission spectrum according to the Wiener inverse filtering principle to obtain the optical diffraction tomography image. In some embodiments, the spectrum stitching result may be multiplied by the complex conjugate of the three-dimensional complex transmission spectrum and then divided by its squared magnitude, which suppresses the influence of noise as far as possible and yields a clearer optical diffraction tomography image.
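The conjugate-multiply-and-divide operation described here can be sketched as follows (α plays the role of the regularization constant; the function name is illustrative):

```python
import numpy as np

def wiener_inverse_filter(scatter_spec, ctf_spec, alpha=1e-3):
    """Divide the stitched 3-D scattering spectrum by the stitched complex
    transmission spectrum: multiply by its conjugate and divide by its squared
    magnitude plus a regularization term, then inverse-transform."""
    filtered = scatter_spec * np.conj(ctf_spec) / (np.abs(ctf_spec) ** 2 + alpha)
    return np.fft.ifftn(np.fft.ifftshift(filtered))
```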
In some embodiments, the first imaging data set may be processed by the processor 240 with a complex deconvolution diffraction tomography three-dimensional reconstruction algorithm to generate the optical diffraction tomography image. This algorithm stitches the complex propagation spectra synchronously in the frequency-domain stitching step and realizes the division normalization of the three-dimensional scattering spectrum by the three-dimensional complex transmission spectrum in the inverse filtering step. Specifically, owing to the low-pass filtering effect of the far-field imaging system, the field actually measured at the jth illumination angle, u_R,j^meas(x, y), is the convolution of the actual scattered field u_R,j(x, y, z) with the amplitude point spread function h(x, y) of the coherent imaging system (as shown in fig. 29, where panels (b) and (c) show the amplitude and phase of the two-dimensional coherent transfer function calculated from (a)):

u_R,j^meas(x, y) = u_R,j(x, y, z) ∗ [h(x, y)·G(x, y, z)]  (32)

where G(x, y, z) is a far-field propagation term, which under the scalar approximation can be described by a Green function:

G_±(x, y, z) = exp(±i·k_m·√(x^2 + y^2 + z^2)) / (4π·√(x^2 + y^2 + z^2))  (33)

In equation (33), ± distinguishes between forward and backward propagation.
In the spatial frequency domain, equation (32) can be rewritten as:

Ũ_R,j^meas(k_x, k_y, k_z) = Ũ_R,j(k_x, k_y, k_z)·C(k_x, k_y, k_z)  (34)

In formula (34), Ũ_R,j^meas and Ũ_R,j represent the three-dimensional Fourier transforms of u_R,j^meas and u_R,j, respectively, and C(k_x, k_y, k_z) is the coherent transfer function, which can be expressed as:

C(k_x, k_y, k_z) = F.T.₃D{h(x, y)·G(x, y, z)}  (35)

In formula (35), F.T. represents the Fourier transform. Because the wave vector magnitude k_m is constant during the propagation of a monochromatic wave, the three-dimensional coherent transfer function C(k_x, k_y, k_z) can be regarded as the distribution of the two-dimensional coherent transfer function C(k_x, k_y) on the three-dimensional spherical surface

k_z = ±√(k_m^2 − k_x^2 − k_y^2).
Taking the coherent transfer function into account, the three-dimensional spectral component of the scattering potential acquired under the jth illumination direction k_j, denoted F̂_j(k_x, k_y, k_z), is related to the true spectral component F̃(k_x, k_y, k_z) by:

F̂_j(k_x, k_y, k_z) = F̃(k_x, k_y, k_z)·C_j(k_x, k_y, k_z)  (36)

In equation (36), C_j is the coherent transfer function translated along the jth scanning wave vector, and the spectral coordinates satisfy:

(k_x, k_y, k_z) = (u − k_{x,j}, v − k_{y,j}, √(k_m^2 − u^2 − v^2) − k_{z,j})  (37)

where (u, v) is the lateral spatial frequency of the two-dimensional measurement. Imaging at the image plane (z_0 = 0), the Fourier diffraction theorem under the Rytov approximation can be rewritten (up to constant factors depending on the Fourier convention) as:

F̂_j(k_x, k_y, k_z) = (k_{sz}/(iπ))·Ψ̃_{R,j}(u, v)  (38)

with k_{sz} = √(k_m^2 − u^2 − v^2) and Ψ̃_{R,j} the two-dimensional spectrum of the Rytov phase of the jth measurement.
In order to spectrally stitch the scattering images at the different illumination angles (j = 1, 2, …) and normalize the overlap, the scattering potential can be reconstructed based on the Wiener inverse filtering principle. The three-dimensional scattering potential is then solved as:

F̃(k) = [Σ_j F̂_j(k)·C_j*(k)] / [Σ_j |C_j(k)|^2 + α]  (39)

In formula (39), F̂_j(k) is the mapping of the spectrum of the two-dimensional Rytov phase into the three-dimensional spectrum according to equation (37); C_j(k) is the three-dimensional coherent transfer function translated back along the scanning wave vector k_j; and α is a regularization parameter used in the Wiener inverse filtering to reduce the noise generated by the division.
In some embodiments, after the two-dimensional Rytov phase field is obtained through the Rytov approximation step, the stitching and filtering may be performed in the frequency domain according to equations (37) and (39). With the nearest-neighbor interpolation method, each two-dimensional spectrum sample is mapped onto the nearest three-dimensional spectrum grid point. During this mapping, the spacing of the three-dimensional grid points in the K_x and K_y directions coincides with the spacing of the two-dimensional grid points, so interpolation is only needed in the K_z direction. The smaller the K_z grid spacing, the higher the accuracy of the nearest-neighbor interpolation; but decreasing the K_z spacing increases the total number of K_z grid points, which can significantly increase the computation time of the subsequent inverse filtering step. Balancing computational efficiency against imaging accuracy, it is therefore preferable to set the K_z grid spacing to 1-2 times the K_x and K_y grid spacing, which improves computational efficiency while preserving the reconstruction quality of the optical diffraction tomography image.
Step 530, acquiring a structured light illumination fluorescence image of the sample based on the second laser using the structured light illumination fluorescence imaging subsystem. In some embodiments, the structured light illumination fluorescence subsystem may utilize the second laser acting on the sample to obtain the second imaging data set of the sample; specific embodiments are described above with respect to the structured light illumination fluorescence subsystem and are not repeated here. The second imaging data set of the sample may comprise sample information with a fluorescent marker.
In some embodiments, the structured light illumination fluorescence imaging subsystem may acquire a structured light illumination fluorescence image of the sample from the second imaging data set using a structured light illumination fluorescence image reconstruction algorithm. Fig. 11 is a flow chart of a structured light illumination fluorescence image reconstruction algorithm. The reconstruction first calculates reconstruction parameters from the image data and then obtains a high-resolution fluorescence image through spectral component separation and stitching. Specifically, the structured light illumination fluorescence image reconstruction algorithm may include: calculating reconstruction parameters from the image data, spectral component separation, spectrum stitching, and inverse Fourier transform, yielding the high-resolution fluorescence image. In the structured light illumination fluorescence subsystem, coherent illumination beams in two directions interfere to form sinusoidal structured light:

I_i(r) = 1 + γ·cos(2π·p·r + φ)  (40)

where γ is the interference (modulation) depth, p is the spatial frequency of the structured light, and φ is the initial phase of the structured light.
For the structured light illumination fluorescence imaging subsystem, the fluorescence intensity is proportional to the illumination intensity, and the intensity image of wide-field imaging is:

I_e(r) = [I_i(r)·D(r)] ∗ p(r)  (41)

where D(r) represents the fluorescent molecule density distribution, ∗ represents the convolution operation, and p(r) represents the intensity point spread function of the system. In terms of the two-dimensional coherence function

C₂D(k_x, k_y) = F.T.₂D{h(x, y)}  (42)

p(r) can be defined as:

p(r) = |h(r)|^2  (43)

whose Fourier transform OTF(k) = F.T.₂D{p(r)} is the optical transfer function, representing the spectral transfer characteristic of the intensity imaging system.
Equation (41) can be expanded in the frequency domain as:

Ĩ_e(k) = OTF(k)·[D̃(k) + (γ/2)·e^{iφ}·D̃(k − p) + (γ/2)·e^{−iφ}·D̃(k + p)]  (44)

where D̃(k) is the spatial frequency spectrum of the fluorescent molecule density distribution. The spectrum of the structured light illumination fluorescence image thus comprises the original spectrum and the spectra of the shifted terms ±p, all three filtered at the cutoff frequency of the optical transfer function OTF(k). In order to separate the three parts of the spectral information, three measurements with different φ are required, giving the spectra:

Ĩ_{e,n}(k) = OTF(k)·[D̃(k) + (γ/2)·e^{iφ_n}·D̃(k − p) + (γ/2)·e^{−iφ_n}·D̃(k + p)],  n = 1, 2, 3  (45)

from which the spectral components S_n(k) are separated by inverting the 3×3 mixing matrix whose nth row is [1, (γ/2)e^{iφ_n}, (γ/2)e^{−iφ_n}]:

[S_0(k), S_+(k), S_−(k)]ᵀ = M^{-1}·[Ĩ_{e,1}(k), Ĩ_{e,2}(k), Ĩ_{e,3}(k)]ᵀ  (46)
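The three-phase component separation can be sketched as a pixel-wise 3×3 inversion (illustrative function name; `phases` are the known pattern phases φ_n):

```python
import numpy as np

def separate_components(spectra, phases, gamma):
    """Separate S0, S+, S- from three raw SIM spectra taken at known pattern
    phases. Each measured spectrum is
      I_n(k) = S0(k) + (gamma/2) e^{+i phi_n} S+(k) + (gamma/2) e^{-i phi_n} S-(k),
    so the components follow by inverting the 3x3 mixing matrix."""
    M = np.array([[1, gamma / 2 * np.exp(1j * p), gamma / 2 * np.exp(-1j * p)]
                  for p in phases])
    Minv = np.linalg.inv(M)
    stack = np.stack(spectra)                  # shape (3, ny, nx)
    comps = np.tensordot(Minv, stack, axes=1)  # pixel-wise matrix-vector product
    return comps[0], comps[1], comps[2]        # S0, S+, S-
```

The matrix is invertible whenever the three phases are distinct, e.g. 0, 2π/3, 4π/3.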
In a real two-dimensional image, the measurements are typically repeated at three pattern angles θ_1, θ_2, θ_3, i.e., along directions p_1, p_2, p_3, to achieve a uniform super-resolution result in all directions of the two-dimensional plane. The nine spectral components can be represented as:

S_{m,0}(k) = OTF(k)·D̃(k),  m = 1, 2, 3

S_{m,±}(k) = (γ/2)·e^{±iφ_m}·OTF(k)·D̃(k ∓ p_m),  m = 1, 2, 3  (47)-(53)
After the above spectral components are solved, the high-resolution fluorescence density distribution spectrum can be obtained using Wiener inverse filtering:

δ(k) = [Σ_{m,n} OTF*(k + n·p_m)·S_{m,n}(k + n·p_m)] / [Σ_{m,n} |OTF(k + n·p_m)|^2 + α]  (54)

where n ∈ {−1, 0, +1} and α is a regularization parameter introduced in the Wiener filtering to reduce noise. Equation (54) shows that the final spectrum contains a total of 7 spectral components, six of which are uniformly distributed in six directions to extend the spectral range. The final fluorescence density distribution image (the structured light illumination fluorescence image) is then obtained as:

D(r) = I.F.T.₂D{δ(k)}  (55)
Step 540, generating a bimodal fusion image of the sample based on the optical diffraction tomography image and the structured light illumination fluorescence image. In particular, step 540 may be performed by the processor 240. In some embodiments, the processor 240 may employ co-localization image fusion techniques to fuse the optical diffraction tomography image and the structured light illumination fluorescence image into a bimodal fusion image. The bimodal fusion image may carry both morphological information and class label information of the sample. The morphological information may include the size and shape of the sample, and the like; the class label information may include the kind of the labeled portion of the sample (e.g., which organelles specifically). In some embodiments, the processor 240 may select, from the three-dimensional optical diffraction tomography image, the two-dimensional tomographic slice that most closely resembles the two-dimensional structured light illumination fluorescence image for bimodal fusion. In some embodiments, the bimodal fusion image may be the result of fusing, into the optical diffraction tomography image, the fluorescence information displayed by the structured light illumination fluorescence image (such as a fluorescently labeled organelle) after marking it. The manner of marking may include, but is not limited to, highlighting, display in different colors, and the like. In some embodiments, the processor 240 may further process the optical diffraction tomography image, the structured light illumination fluorescence image, and/or the bimodal fusion image of the sample, for example by image segmentation (e.g., segmenting each organelle in the image) or quantitative statistics (e.g., tracking the path of a certain organelle). In some embodiments, the processor 240 may perform this processing online or in real time.
In some embodiments, the processor 240 may perform offline processing or post-processing of the images described above. In some embodiments, the images may be processed by other processors external to and/or associated with the dual-modality microscopy imaging system 100.
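The co-location fusion of step 540 can be sketched as follows: the ODT slice is shown as a grayscale background and the fluorescence channel is painted on top in a distinct color (green here), which is one of the "display in different colors" marking manners mentioned above. The array names and the normalization are assumptions, not the system's actual pipeline.

```python
import numpy as np

def fuse_bimodal(odt_slice, fluor_image, color=(0.0, 1.0, 0.0)):
    """Overlay a 2-D fluorescence image on a 2-D ODT slice as an RGB image."""
    def norm(img):
        img = img.astype(float)
        rng = img.max() - img.min()
        return (img - img.min()) / rng if rng > 0 else np.zeros_like(img)

    gray = norm(odt_slice)
    fl = norm(fluor_image)
    rgb = np.stack([gray, gray, gray], axis=-1)   # grayscale ODT background
    tint = np.array(color) * fl[..., None]        # colored fluorescence layer
    # additive blend, clipped to the displayable [0, 1] range
    return np.clip(rgb + tint, 0.0, 1.0)
```

In practice the two modalities would first be registered to the same pixel grid; this sketch assumes that registration has already been done.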
To better illustrate the dual-modality microscopic imaging system and the bimodal fusion image of the embodiments of the present application, specific experimental results are described below as an example.
The embodiment of the application adopts a bimodal microscopic imaging system as shown in figures 1-2. The dual-modality microscopic imaging system includes an optical diffraction tomography subsystem 210 and a structured light illuminated fluorescence imaging subsystem 220, which can be combined into a bimodal microscopy imaging system through at least the first dichroic mirror 112 and the second dichroic mirror 121. In an embodiment of the present application, the dual-modality microscopy imaging system may include all of the components of the optical diffraction tomography subsystem 210 and the structured light illuminated fluorescence imaging subsystem 220. Specifically, the dual-modality microscopic imaging system may include a first light source 101, a first acousto-optic modulator (AOM) 102, a first half wave plate (HWP) 103, a first polarization beam splitter (PBS) 104, a single mode fiber (SMF) 105, a lens 106, a galvanometer (GM) 107, a tube lens 108, a microscope objective (OBJ) 109, a sample 110, a microscope objective 111, a first dichroic mirror (DM) 112, a lens 113, a second camera 114, a single mode fiber 115, a lens 116, a first camera 117, a polarization-independent beam splitter (BS) 118, a lens 119, a lens 120, a second dichroic mirror 121, a lens 122, a polarization rotator (PR) 123, a lens 124, a spatial filter (Mask) 125, a lens 126, a second polarization beam splitter 127, a second half wave plate 128, a spatial light modulator (SLM) 129, a lens 130, a single mode fiber 131, a second acousto-optic modulator 132, a second light source 133, a coupler 134, a coupler 135, and a coupler 136.
In the optical diffraction tomography subsystem 210, the imaging optical path may be based on a commercial microscope (OLYMPUS IX73) equipped with a high numerical aperture imaging objective (i.e., microscope objective 111, OLYMPUS ApoN, 100×, NA = 1.45). The first light source 101 can be a single longitudinal mode laser with a wavelength of 561 nm, manufactured by Changchun New Industries Optoelectronics Technology Co., Ltd., model MSL-FN-561-50mW; the first acousto-optic modulator is manufactured by Chongqing Acousto-Optic-Electronic Co., Ltd. of China Electronics Technology Group; the first half wave plate is a Thorlabs AHWP10M-600; the first polarization beam splitter is a Thorlabs CCM1-PBS251; couplers 134 and 135 are Thorlabs PAF2-7A; the single mode fibers 105 and 115 are polarization maintaining single mode fibers made by a Shanghai manufacturer, model PM460-HP, FC/APC; the lens 106 has a focal length of 40 mm and may specifically be a Thorlabs AC254-040-A; the galvanometer 107 is a Thorlabs GVS211/M; the tube lens 108 has a focal length of 180 mm and may specifically be a Thorlabs AC508-180-A; the microscope objective 109 is an OLYMPUS LUMPlanFLN 60×/1.0 W; the microscope objective 111 is an OLYMPUS ApoN 100×/1.49 Oil; the first dichroic mirror 112 is a Chroma ZT405/488/561/640-phase R; the lens 122 has a focal length of 200 mm; the second dichroic mirror is a Thorlabs DMLP550L; the lens 120 has a focal length of 180 mm and may specifically be a Thorlabs TTL180-A; the lens 119 has a focal length of 250 mm and may specifically be a Thorlabs AC508-250-A; the lens 116 has a focal length of 100 mm; the polarization-independent beam splitter 118 is a Thorlabs BS013; the first camera is an sCMOS camera (Hamamatsu ORCA-Flash4.0 V3, C13440).
In the structured light illuminated fluorescence imaging subsystem 220, the second light source is a single longitudinal mode laser with a wavelength of 488 nm (Coherent Sapphire 488LP-200); the coupler 136 is a Lightpath coupler with f = 10 mm; the lens 130 is a Thorlabs AC254-100-A, f = 100 mm; the second half wave plate 128 is a Thorlabs AHWP10M-600; the second polarization beam splitter 127 is a Thorlabs CM1-PBS25; the second acousto-optic modulator 132 is an AA Opto-Electronic AOTF; the spatial light modulator 129 is a Forth Dimension Displays SXGA-3DM; the lens 126 is a Thorlabs AC508-500-A, f = 500 mm; the lens 124 is a Thorlabs AC508-300-A, f = 200 mm; the second camera 114 is an sCMOS camera with a peak quantum efficiency of 82% (Hamamatsu ORCA-Flash4.0 V2, C11440), used to detect fluorescence emission.
In some embodiments, because the optical diffraction tomography subsystem combines information from multiple raw images to reconstruct one volumetric image, movement of structures in living cells can lead to motion artifacts and reduced resolution. As shown in fig. 28, if a lysosome (LY) moves a distance exceeding the spatial resolution of the system within the capture time required to reconstruct one frame, the result is reduced contrast and/or motion artifacts, and the LY may even disappear from the final optical diffraction tomography image. Therefore, a higher theoretical resolution can adversely affect the performance of the system during live cell imaging, and the spatial resolution must be matched to a corresponding temporal resolution to achieve maximum resolution in live cell optical diffraction tomography. Previous optical diffraction tomography microscopes did not achieve very high temporal resolution, which may explain why these microscopes failed to detect lysosomes in living cells despite their high theoretical spatial resolution. Similarly, tubular endoplasmic reticulum structures had never been observed in living cells with optical diffraction tomography microscopes, and such observation was even considered impossible, possibly for the same reason, as endoplasmic reticulum tubules and junctions undergo rapid movement in living cells.
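The matching condition described above reduces to a back-of-the-envelope check: a structure moving at speed v is blurred (or lost) if it travels farther than the spatial resolution during the acquisition time of one reconstructed frame. The numbers in the comment are illustrative, not measured values from the system.

```python
def motion_blurred(speed_um_per_s, frame_time_s, resolution_um):
    """True if the displacement within one reconstructed frame exceeds
    the spatial resolution, i.e. motion artifacts are expected."""
    return speed_um_per_s * frame_time_s > resolution_um

# e.g. a lysosome moving at 1 um/s with 0.5 s per volume and 200 nm
# resolution travels 0.5 um > 0.2 um, so motion artifacts are expected.
```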
The bimodal microscopic imaging system of the embodiment of the application can perform high-speed cell imaging based on fluorescence co-localization. The total acquisition cycle of the structured light illuminated fluorescence imaging subsystem 220 and the optical diffraction tomography subsystem 210 is 1.49 s, which is fast enough that the dual-modality microscopy imaging system can alternately detect the same structure in a living cell with the two imaging subsystems. FIG. 18 shows optical diffraction tomography-fluorescence co-localization images of six major organelles collected in a live COS-7 cell using the dual-modality microscopic imaging system. In the three columns on the left side, the first column is the structured light illuminated fluorescence reconstruction image, the second column is the optical diffraction tomography image, and the third column is the bimodal fusion image; the three columns on the right side are, in sequence, enlarged views of the structured light illuminated fluorescence reconstruction image, the optical diffraction tomography image, and the bimodal fusion image for the area indicated by the dashed box in the first-column image.
(a) Co-localization of the tubular endoplasmic reticulum in the optical diffraction tomography image with the KDEL-EGFP-labeled structured light illuminated fluorescence image; (b) co-localization of mitochondria in the optical diffraction tomography image with the MitoTracker Green-labeled structured light illuminated fluorescence image; (c) co-localization of lipid droplets in the optical diffraction tomography image with the LipidSpot 488-labeled structured light illuminated fluorescence image; (d) co-localization of lysosomes in the optical diffraction tomography image with the LysoView 488-labeled structured light illuminated fluorescence image; (e) co-localization of the nuclear membrane in the optical diffraction tomography image with the LaminA-EGFP-labeled structured light illuminated fluorescence image; (f) co-localization of chromosomes in the optical diffraction tomography image with the H2B-EGFP-labeled structured light illuminated fluorescence image. Scale bar: 5 μm (left), 1 μm (right).
From the analysis of FIGS. 18 and 28, it can be concluded that, owing to the increased spatio-temporal resolution of the optical diffraction tomography subsystem, the dynamics of endoplasmic reticulum tubules can be observed by the bimodal microscopy system and accurately correspond to the fluorescence imaging results of KDEL-EGFP labeling (as shown in FIG. 18(a)). The long, twisted structures appearing in the optical diffraction tomography image were determined to be mitochondria by the co-localization data of the mitochondrial marker MitoTracker Green (as shown in FIG. 18(b)). The structured light illuminated fluorescence imaging subsystem 220 can resolve the internal structure of mitochondria, while the optical diffraction tomography subsystem 210 can simultaneously provide the three-dimensional dynamics of all mitochondria in the whole cell, offering far more biodynamic information than two-dimensional fluorescence imaging. Moreover, because the optical diffraction tomography subsystem 210 is free of the phototoxicity and photobleaching effects that limit imaging time, the dual-modality microscopic imaging system can perform continuous three-dimensional imaging of living cells on the order of hours (as shown in FIG. 15).
In addition, lipid droplets, late endosomes or lysosomes, nuclear membranes, and chromosomes within living cells can also be observed by the bimodal microscopy imaging system. Specifically, the bright vesicle structures in the optical diffraction tomography image were confirmed to be lipid droplets by the fluorescence co-localization image of the dye LipidSpot 488, which accumulates in lipid droplets (FIG. 18(c)); meanwhile, darker vesicles in the optical diffraction tomography image were confirmed to be late endosomes or lysosomes by the fluorescence co-localization image of LysoView 488, which labels acidic lysosomes (FIG. 18(d)); near the nucleus, the continuous membrane-like structure in the optical diffraction tomography image was confirmed to be the nuclear membrane by co-localization with the LaminA-EGFP-labeled fluorescence image (FIG. 18(e)); and the irregular structures within the nuclear membrane were confirmed to be chromosomes by co-localization with H2B-EGFP-labeled chromatin (FIG. 18(f)).
In addition to the conventional organelles described above, the dual-modality microscopic imaging system of the example of the present application can also observe dark vacuoles whose RI is even lower than the cytoplasmic RI of COS-7 cells (as shown in FIG. 15(d)). Because optical diffraction tomography can measure the spatiotemporal distribution of mass density within living cells, these vacuolar structures contain much less material than the cytosol, similar to vacuoles in plants and yeast. However, compared with the 5-10 μm central acidic vacuoles in plants and yeast, these vacuoles were small (1.56 ± 0.01 μm, n = 5162), numerous (43 ± 2 vacuoles per planar COS-7 cell, 119 cells), and not labeled by fluorescent dyes for acidic compartments (as shown in FIG. 18(d)). As shown in FIG. 30, yeast vacuoles observed under the same optical diffraction tomography microscope were larger and darker than those in mammalian cells; these structures were named dark-vacuole bodies (DBs).
Actin microfilaments can also be observed by the bimodal microscopic imaging system of the embodiment of the application. In particular, FIG. 19 shows the lateral resolution of the optical diffraction tomography-structured light illuminated fluorescence bimodal microscopy imaging system. (a) Optical diffraction tomography-structured light illuminated fluorescence bimodal image of actin labeled with LifeAct-EGFP: left, structured light illuminated fluorescence super-resolution image; middle, optical diffraction tomography image at the corresponding depth; right, co-localized image. Scale bar: 5 μm. (b) Axial cross-section of the refractive index variation along the solid line in (a). Scale bar: 0.5 μm. (c) Refractive index change and fluorescence intensity distribution along the solid line in (a). (d) Lateral full width at half maximum (FWHM) measured and counted 50 consecutive times perpendicular to the actin microfilaments in (c). (e) Refractive index variation profile along the dotted line in (b). (f) Axial full width at half maximum (FWHM) measured and counted 50 consecutive times perpendicular to the actin microfilaments in (e). Actin microfilaments can be observed in FIG. 19(a). As shown in FIG. 19(d-f), by fitting the intensity distribution of LifeAct-EGFP along the actin filament cross-section with a Gaussian function, the resolution of the Hessian structured light illumination microscope subsystem is found to be 100 nm, and the full width at half maximum (FWHM) of actin microfilaments measured by the ODT subsystem is 200 nm, exceeding the conventional lateral resolution obtained by fluorescence microscopy at this wavelength (as in FIG. 33).
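An FWHM measurement of the kind used for the cross-section profiles above can be sketched as follows. This sketch locates the half-maximum crossings of a sampled 1-D intensity profile by linear interpolation (a direct estimate rather than the Gaussian fit used in the experiment); the input arrays are hypothetical line profiles.

```python
import numpy as np

def fwhm(x, y):
    """Full width at half maximum of a single-peaked profile y(x)."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    half = (y.max() + y.min()) / 2.0
    idx = np.where(y >= half)[0]          # samples above half maximum
    i0, i1 = idx[0], idx[-1]
    # interpolate the left and right half-maximum crossings
    xl = np.interp(half, [y[i0 - 1], y[i0]], [x[i0 - 1], x[i0]])
    xr = np.interp(half, [y[i1 + 1], y[i1]], [x[i1 + 1], x[i1]])
    return xr - xl
```

For a Gaussian profile, the returned value should approach 2·sqrt(2·ln 2)·σ ≈ 2.355σ, the relation used when converting a fitted σ to a resolution figure.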
FIG. 20 shows bimodal fluorescence co-localized images of COS-7 cell mitosis acquired by the bimodal microscopy imaging system. The first row is the structured light illuminated fluorescence reconstruction image, the second row is the optical diffraction tomography image, the third row is the bimodal fusion image, and the four columns correspond to four moments during imaging. Scale bar: 2 μm. It can be concluded from the figure that during structured light illuminated fluorescence imaging, non-negligible phototoxicity blocked the spindle-shaped distribution of chromosomes during the pre-mitotic phase; this is in sharp contrast to the complete mitotic process imaged by optical diffraction tomography alone (FIG. 15). In addition, in a similar experiment, it was observed that the phototoxicity generated by structured light illuminated fluorescence imaging arrested COS-7 cells at a later stage. FIG. 34 is a schematic diagram showing three-dimensional whole-cell observation of organelles using the optical diffraction tomography subsystem. (a) Three different Z-planes of a typical COS-7 cell; arrows indicate mitochondria, lysosomes (LYs), and lipid droplets (LDs), respectively. (b) The average percentage of LDs, LYs, and mitochondria in an axial volume of 0.86 μm (10 Z-planes) relative to the total area of all LDs, LYs, and mitochondria within the cell (averaged over 13 cells). Scale bar, 5 μm. Center line, median; box limits, 75% and 25%; whiskers, maximum and minimum.
FIG. 21 shows optical diffraction tomography-fluorescence co-localization images, acquired using the bimodal microscopy imaging system, of organelles in a live COS-7 cell that cannot be resolved by optical diffraction tomography. The first column is the structured light illuminated fluorescence reconstruction image, the second column is the optical diffraction tomography image, the third column is the bimodal fusion image, and the right three columns are, in sequence, enlarged views of the structured light illuminated fluorescence reconstruction image, the optical diffraction tomography image, and the bimodal fusion image for the area indicated by the dotted box in the first-column image. (a) Co-localization result with the Golgi-EGFP-labeled Golgi apparatus structured light illuminated fluorescence image; (b) co-localization result with the Pex11-EGFP-labeled peroxisome structured light illuminated fluorescence image. Scale bar: 5 μm (left), 1 μm (right). In the bimodal co-localized images labeled for the Golgi apparatus and peroxisomes, no obvious structure in the optical diffraction tomography image corresponds to the fluorescence image. The reason is that these organelles do not produce a large refractive index difference with the cytoplasm, so Golgi bodies and peroxisomes cannot yet be observed with the bimodal microscopic imaging system.
In summary, the six major traditional organelles (the endoplasmic reticulum (ER), mitochondria, lipid droplets (LDs), late endosomes (LEs)/lysosomes (LYs), the nuclear membrane, and chromosomes), as well as actin microfilaments and a new organelle, can be observed by the bimodal microscopy imaging system. This allows the dual-modality imaging system to perform high-speed, continuous three-dimensional imaging of these structures without labels and without the limitation of phototoxicity.
The bimodal microscopic imaging system of the embodiment of the application can also observe and analyze the low-refractive-index vesicles (DBs) appearing during imaging, whose refractive index is significantly lower than that of the surrounding cytoplasm. Specifically, FIG. 22 shows low refractive index vesicles appearing in optical diffraction tomography images of live COS-7 cells. The low refractive index vesicles are not co-localized with Rab7-EGFP-labeled lysosomes (a) or LAMP1-EGFP-labeled late lysosomes (b); they are partially co-localized with Rab5a-EGFP-labeled vesicles (c), EEA1-EGFP-labeled vesicles (d), FYVE-EGFP-labeled vesicles (e), and Rab9a-EGFP-labeled vesicles (f) (g). (h) Left: ratio of DBs associated with Rab5a/EEA1/FYVE/Rab9a/Rab7/LAMP1 fluorescence to the total DB pool. Right: ratio of Rab5a/EEA1/FYVE/Rab9a/Rab7/LAMP1-associated DBs to the total pool of the corresponding fluorescent vesicles. (i) Left: diameters of DBs with and without fluorescent labels, and of fluorescent vesicles other than DBs with visible optical diffraction tomography structure. Right: refractive index differences between these structures and their surroundings, including fluorescently labeled and unlabeled DBs, and fluorescent vesicles other than DBs with visible optical diffraction tomography structure. The first column is the structured light illuminated fluorescence reconstruction image, the second column is the optical diffraction tomography image, the third column is the bimodal fusion image, and the right three columns are enlarged views of the area indicated by the dotted box in the first-column image. Scale bar: 5 μm (left), 1 μm (right). As can be seen from FIG. 22, proteins and lipids on DB membranes (FIG. 22(a)-(g)) can be analyzed using the bimodal microscopy imaging system by imaging cells exogenously expressing different fluorescent markers (Rab5a/EEA1/FYVE/Rab9a/Rab7/LAMP1). A total of 61 ± 3% of DBs (average diameter about 1.5 μm, measured by the external fluorescence ring) were associated with the early endosome marker Rab5a-EGFP, whereas a large fraction of Rab5a-EGFP-labeled vesicles showed RI values higher than DBs (66 ± 4%, FIG. 22(h)). As shown in FIG. 22(i), DBs co-localized with LE/LY markers increased in size (mean diameter 1.8-2.3 μm) downstream of the intracellular transport pathway. Although Rab9a-EGFP labeled 60 ± 6% of all vacuolated vesicles, these accounted for only 12 ± 1% of all Rab9a-EGFP-labeled vesicles. Approximately 31%-35% of the vacuolated vesicles were labeled with Rab7-EGFP or LAMP1-EGFP, while these co-localized vesicles represented only a small population of Rab7-EGFP/LAMP1-EGFP vesicles (approximately 11%-14%). Rab7 and LAMP1 are more strongly correlated with LEs/LYs than Rab9a [37, 38], so 31%-35% of DBs may be characterized as LEs/LYs. Likewise, the 61 ± 3% of DBs overlapping Rab5a-EGFP likely correspond to a population similar to early endosomes. 82%-91% of the vacuolated vesicles overlapped with EEA1-EGFP- and FYVE-EGFP-labeled structures, indicating enrichment of phosphatidylinositol 3-phosphate on DBs. FIG. 22(g) shows co-localization between DBs and aquaporins, which can facilitate water transport across the plasma and endosomal membranes. Aquaporin-2-EGFP (AQP2-EGFP) is co-localized with the LE/LY compartment but does not overlap with DBs, so DBs represent organelles with a molecular composition different from that of endosomes, despite partially sharing proteins and lipids. It can be concluded from FIG. 22(a-b) that the low refractive index vesicles are hardly co-localized with late endosomal or lysosomal markers such as Rab7 or LAMP1, but significantly overlap with the fluorescence images of markers associated with early endosomal origin such as EEA1, Rab5, and FYVE (FIG. 22(c-f)). At the same time, the co-localization with the Rab9a marker suggests that the low refractive index vesicles may be involved in late endosome-to-Golgi network trafficking.
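Co-localization percentages of the kind quoted above can be sketched as a pixel-level overlap fraction between two binary segmentations. This is a simplification: the reported figures count labeled objects rather than pixels, and the mask names are hypothetical.

```python
import numpy as np

def colocalized_fraction(mask_a, mask_b):
    """Fraction of mask_a pixels that also lie in mask_b
    (e.g. fraction of DB area overlapping a fluorescence mask)."""
    a = np.asarray(mask_a, dtype=bool)
    b = np.asarray(mask_b, dtype=bool)
    total = a.sum()
    return float(np.logical_and(a, b).sum()) / total if total else 0.0
```

An object-level version would label connected components (e.g. with a labeling routine) and count DBs that share at least one pixel with the fluorescence mask, matching the "ratio of DBs associated with fluorescence" statistic more closely.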
FIG. 23 shows low refractive index vesicles appearing in optical diffraction tomography images when the bimodal microscopic imaging system images different cell types. (a) Human fibroblasts; (b) human umbilical vein endothelial cells; (c) rat insulinoma INS-1 cells; (d) mouse dorsal root ganglion nerve cells. The second row in (a) is a time-lapse image of the low refractive index vesicle corresponding to the dashed box area in the top image. Scale bar: 5 μm (upper) and 1 μm (lower). It follows that such low refractive index vesicles are widely found in mammalian cells; for example, the bimodal microscopy imaging system can observe them in human fibroblasts, human umbilical vein endothelial cells, rat insulinoma INS-1 cells, and mouse dorsal root ganglion nerve cells.
FIG. 24 shows optical diffraction tomography images of human mesenchymal stem cells and the correlation between the number of low refractive index vesicles and the cell senescence phenotype. WT-hMSCs (wild-type human bone marrow mesenchymal stem cells, used as an isogenic control), together with heterozygous (LMNA G608G/+) and homozygous (LMNA G608G/G608G) HGPS-hMSCs, form a model of Hutchinson-Gilford progeria syndrome (HGPS), and WRN-deficient (WRN-/-) WS-hMSCs serve as a model of Werner syndrome (WS). (a-d) Optical diffraction tomography images of the four cell types. (e) Statistics of the distribution density of low refractive index vesicles for each cell line. Scale bar: 5 μm (left), 1 μm (right). Mann-Whitney rank sum test: *p < 0.05, **p < 0.01, ***p < 0.001. Thus, low refractive index vesicles also exist in human mesenchymal stem cells, and their number and shape differ slightly among cell types; in particular, the number of low refractive index vesicles in mesenchymal stem cells of the premature senescence phenotype is significantly increased compared with the control group, indicating that the function of the low refractive index vesicles may be related to the normal operation of cells.
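A per-cell-line density comparison like the one in (e) can be sketched with a Mann-Whitney rank sum test. The density values below are made up for illustration only; this NumPy-only version uses the normal approximation without tie correction (a real analysis would typically use `scipy.stats.mannwhitneyu`).

```python
import math
import numpy as np

def mann_whitney_u(x, y):
    """U statistic and two-sided normal-approximation p-value."""
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    n1, n2 = len(x), len(y)
    # count pairwise wins of x over y; ties count half
    u = np.sum(x[:, None] > y[None, :]) + 0.5 * np.sum(x[:, None] == y[None, :])
    mu = n1 * n2 / 2.0
    sigma = math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12.0)
    z = (u - mu) / sigma
    p = math.erfc(abs(z) / math.sqrt(2.0))   # two-sided tail probability
    return float(u), p

control = [38, 41, 45, 39, 44, 42, 40, 43]    # hypothetical DBs per cell
progeria = [55, 61, 58, 64, 57, 60, 62, 59]   # hypothetical DBs per cell
u_stat, p_value = mann_whitney_u(control, progeria)
```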
The bimodal microscopic imaging system of the embodiment of the application can perform label-free, rapid, long-term observation of organelle interaction processes. Organelles are cellular compartments that retain local imprints of molecules and signals and exchange information and material with other organelles when cellular contacts form, which is critical to many cellular functions and behaviors. From an evolutionary point of view, both the endoplasmic reticulum and mitochondria are ancient eukaryotic endomembrane systems. However, in contrast to the numerous studies on the role of the endoplasmic reticulum in coordinating organelle interactions, there have been few studies on the interaction of mitochondria with different organelles, since mitochondria are subject to phototoxicity during fluorescence imaging. By continuously monitoring mitochondria in living cells with minimal phototoxicity, it can be seen that mitochondria actively change shape, location, fate, and possibly function according to their interactions with other organelles. In particular, FIG. 25 shows the interaction of mitochondria with other organelles in COS-7 cells. (a) The nuclear membrane remains in contact with mitochondria and interacts continuously during half an hour of imaging; (b-c) contact of the endoplasmic reticulum with mitochondria causes mitochondrial fission (b) or lateral expansion (c); (d) mitochondrial interaction with lipid droplets, and mitochondrial fission resulting from contact with lysosomes; (e-g) different modes of interaction between mitochondria and low refractive index vesicles: contact results in mitochondria changing shape (e), moving with the low refractive index vesicle (f), or undergoing fission (g). Scale bar: 5 μm (a), 2 μm (a, zoom-in), 1 μm (b, c, e, f), 2 μm (d, g).
By fast, long-term imaging of mitochondrial dynamics in living cells with the optical diffraction tomography subsystem 210, it can be seen that mitochondria alter their shape, location, behavior, and function through interaction with other organelles; for example, mitochondria near the nuclear membrane are longer and more uniform in shape than those randomly distributed in the cytoplasm (FIG. 25(a)). In the figure, nucleus-associated mitochondria are arranged along the cell nucleus and change shape together with the nuclear membrane. Such localized mitochondria may assist important nuclear processes such as the transport of mRNA out of the nucleus. As another example, after contacting mitochondria, lipid droplets are quickly pushed away, and neither shows an obvious morphological change; in contrast, contact of lysosomes with mitochondria results in mitochondrial fission (FIG. 25(d)).
The bimodal microscopic imaging system of the embodiments of the present application can also observe the interaction of the low refractive index vesicles with mitochondria. Upon contact with a low refractive index vesicle, mitochondria can be observed to change shape (FIG. 25(e)) or break apart (FIG. 25(g)); thus, rather than forming a continuous network that interacts with other organelles (like the ER), mitochondria appear to interact with different organelles under a variety of conditions through "one-to-one" contacts. In some cases, the low refractive index vesicles also drag mitochondria along with them (FIG. 25(f)).
The dual-modality imaging system of the embodiments of the present application may provide a complete map of organelle interactions, because the optical diffraction tomography subsystem 210 can detect a total number of organelles in 3D (e.g., mitochondria, LDs, and LEs/LYs) exceeding what can be detected with a 2D microscope using only one Z-plane. FIG. 33 illustrates this with two maximum illumination angles that are mirror images of each other. In the top diagrams of (a) and (b), the green and orange lines represent illumination beams at the maximum NA of the illumination objective. As shown in (c), the spatial frequency domain bandwidth is extended: the top panel of (c) shows the lateral spatial frequency bandwidth extension, and the bottom panel of (c) shows the uneven distribution of the longitudinal frequency bandwidth. k∥,ill and k∥,det represent the lateral projections of the maximum NA of the illumination objective and the detection objective, respectively. From FIG. 33 it can be concluded that although the structural dynamics inside different mitochondria can be resolved by a two-dimensional Hessian structured light illuminated microscope system, this method provides information extracted from only one axial plane. In contrast, the ODT subsystem, with its label-free imaging capability, provides a 3D map of the total mitochondria in a cell, covering an area 3 times the largest mitochondrial area detectable in an axial volume of about 0.86 μm.
The bimodal microscopic imaging system of the embodiments of the present application can also observe the interaction of the low refractive index vesicles with organelles other than mitochondria. Specifically, FIG. 26 shows interactions of low refractive index vesicles with intracellular organelles in COS-7 cells. (a) Fusion of two low refractive index vesicles; (b) contact between the endoplasmic reticulum and a low refractive index vesicle; (c) simultaneous contact of the endoplasmic reticulum with a low refractive index vesicle and a lipid droplet; (d) simultaneous contact of a low refractive index vesicle, mitochondria, and the endoplasmic reticulum; (e) the same low refractive index vesicle interacting with the nuclear membrane at one depth (top image) while interacting with lipid droplets and mitochondria at another depth (bottom image). Scale bar: 1 μm. Examples include the fusion of two low refractive index vesicles (FIG. 26(a)) and interaction with the endoplasmic reticulum (FIG. 26(b)). In fact, the endoplasmic reticulum typically acts as a bridge between low refractive index vesicles and other organelles. For example, a low refractive index vesicle and a lipid droplet contacted the endoplasmic reticulum from both sides, and the three remained in contact for more than 2 min (FIG. 26(c)); the endoplasmic reticulum carried a low refractive index vesicle and contacted mitochondria for more than 1 min (FIG. 26(d)). In some cases, a low refractive index vesicle may also interact with multiple other organelles simultaneously: as in FIG. 26(e), a low refractive index vesicle at one depth was in contact with the nuclear membrane while interacting with lipid droplets and mitochondria at another depth.
The phenomena above all illustrate the pivotal role of the low refractive index vesicles and the endoplasmic reticulum in the cell, and the bimodal microscopic imaging system of the embodiment of the application provides strong support for observing these new phenomena.
The bimodal microscopic imaging system of the embodiments of the present application can also study the role of DBs in tissue organelle interactions. In particular, fig. 27 is a visualization of the DBs transport pathway and its role in tissue organelle interactions. Where (a) the Z-plane (left) of COS-7 cells is shown at the 00:49:30 time point, and the corresponding schematic shows the distribution of DBs near the nuclear and plasma membranes. (b) Representative examples of large DB formed from the cell periphery due to pinocytosis. (c) Two representative examples of sequential DB-DB fusion events, arrows indicate DBs. (d) Representative examples of DB conversion to LE/LY in COS-7 cells are shown on the left, and corresponding intensity distributions at different time points approximated by Gaussian functions are shown on the right. (e) A representative example of the biosynthesis of DB from a region close to the nuclear membrane (top), followed by fusion of DB to the plasma membrane after about 27 minutes (bottom). The clipping of the DB at different time points is shown on the left side, while the corresponding intensity curves approximated by gaussian functions at different time points are shown on the right side. (f-g) DB-mitochondria (f, arrow indicates DB) or LD-mitochondria (g, arrow indicates LD) contact representative examples. (h) Distribution of duration of LD-mitochondrial contact (left) and DB-mitochondrial contact (right). (i) Sparse (i) and (ii) dense cells surrounded by DB nuclear membrane. (j) Histogram of contact time between DB and nuclear membrane in sparse (left) and dense (right) regions. (k) Representative examples of multiorganelle complexes formed by DB collisions with tubular ER and LD. (l) Representative examples of multiorganelle complexes formed by DB, mitochondria, and tubular ER. 
(m) A representative example of a DB bridging different organelles, where the same DB interacts with the nuclear membrane in one Z-plane while interacting with an LD and a mitochondrion 0.68 μm away (lower plane). (n) Representative crops of a DB that interacts with a mitochondrion, an LY and an LD in sequence to form a DB-mitochondrion-LD-LY quaternary complex, and then dissociates from the LD, the mitochondrion and the LY in sequence. To better illustrate this process, a schematic diagram is shown below the optical diffraction tomography images. Scale bars: (a) 5 μm; (b-g, i, k-n) 1 μm.
As shown in fig. 27, cells can be imaged for long periods with the dual-modality microscopic imaging system to observe the characteristics of DBs. Specifically, while large vacuoles may originate from pinocytosis near the plasma membrane, most normal-sized vacuoles appear in the region near the nuclear membrane (fig. 27a and 27b). During the life cycle of living cells, these vacuoles fuse with each other and increase in size (fig. 27c). Finally, a few DBs (3 out of 26, from 2 cells) slowly transformed into LEs/LYs (fig. 27d), while most DBs collapsed into the plasma membrane. The duration of DB-mitochondrion contact follows an exponential distribution with a time constant of about 61 s; as a control, the average duration of LD-mitochondrion contact is significantly shorter (about 37 s) (fig. 27f-fig. 27h). These data indicate a strong interaction between DBs and mitochondria. In addition, DBs were frequently observed to interact with the nuclear membrane (fig. 27i). Fig. 27j shows the intervals of DB-nuclear-membrane contact for perinuclear regions with either a sparse (i) or a dense (ii) DB distribution; both duration distributions can be fitted by Gaussian functions, indicating that DB-nuclear-membrane contact is controlled by multiple interaction processes. Moreover, since another peak appears at about 105 s in the histogram of contact times between DBs and the nuclear membrane in region ii, the interaction in region ii is much stronger than that in region i. DBs often play a central role in the formation of multiorganelle complexes. For example, a DB and an LD attached to different sides of an ER tubule, forming a multiorganelle complex for more than 2 minutes before separating (fig. 27k). A DB wrapped in ER tubules can also attach to a mitochondrion and remain on it for at least one minute (fig. 27l).
A single DB can also connect to multiple organelles at the same time. For example, in one focal plane, a DB was tightly bound to the nuclear membrane, and the morphology of both organelles changed over time; in another focal plane 0.68 μm away, the same DB interacted with an LD and a mitochondrion on two different sides (fig. 27m). In another example, a DB contacted a mitochondrion (7'50"), an LY (9'45") and an LD (9'50") in sequence to form a multiorganelle complex lasting 40 s, before dissociating from the LD (10'30"), the mitochondrion (11'00") and the LY (13'15") (fig. 27n). Taken together, these data suggest that DBs may be central hubs that coordinate organelle interactions and organize multiorganelle complexes.
The bimodal microscopic imaging system of the embodiments of the present application can also be used to examine co-localization between DBs and LC3-EGFP-labeled autophagosomes. The relationship between LC3-EGFP-labeled structures and DBs in COS-7 cells is depicted in fig. 31, where (a-b) some LC3-EGFP-labeled structures are fluorescent rings co-localized with the outer membrane of LYs (a) or of large vacuoles (b), and (c-d) most LC3-EGFP-labeled structures are fluorescent puncta that overlap neither with distinct optical diffraction tomography structures (c) nor with LE/LY structures (d). All images shown in fig. 31 are representative of three similar experiments. Scale bar, 1 μm. As fig. 31 shows, most LC3-EGFP puncta overlapped neither with distinct optical diffraction tomography structures in COS-7 cells (fig. 31c) nor with LEs/LYs (fig. 31d), although ring-shaped LC3-EGFP co-localized with the outer membrane of LEs/LYs or of large DBs was occasionally seen (fig. 31a-fig. 31b).
The bimodal microscopic imaging system of the embodiments of the present application can also be used to detect non-specific effects caused by over-expression of foreign proteins. For example, fig. 32 shows histograms of the size of LE/LY structures observed by the optical diffraction tomography subsystem in COS-7 cells overexpressing different protein markers, where all distributions can be fitted by Gaussian functions. In LysoView 488-labeled cells, the average size of LE/LY structures was 1.77 ± 0.01 μm, smaller than in LAMP1-EGFP-overexpressing cells (2.00 ± 0.01 μm) but larger than in Rab7-EGFP- (1.55 ± 0.01 μm) and Rab9a-EGFP-overexpressing cells (1.67 ± 0.01 μm).
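The Gaussian fitting of size histograms described above can be sketched with a short script. This is an illustration only, not the code used in the embodiments; the function names and the synthetic data are our own assumptions:

```python
import numpy as np
from scipy.optimize import curve_fit

def gaussian(x, amp, mu, sigma):
    """Gaussian model used to fit organelle size distributions."""
    return amp * np.exp(-((x - mu) ** 2) / (2.0 * sigma ** 2))

def fit_size_distribution(sizes_um, bins=30):
    """Fit a Gaussian to a histogram of LE/LY diameters (micrometers).

    Returns the fitted mean and its standard error, analogous to the
    "mean ± error" values quoted in the text (e.g. 1.77 ± 0.01 μm).
    """
    counts, edges = np.histogram(sizes_um, bins=bins)
    centers = 0.5 * (edges[:-1] + edges[1:])
    p0 = [counts.max(), sizes_um.mean(), sizes_um.std()]
    popt, pcov = curve_fit(gaussian, centers, counts, p0=p0)
    mu, mu_err = popt[1], np.sqrt(pcov[1, 1])
    return mu, mu_err

# Synthetic demo: 1000 diameters drawn around 1.77 μm (illustrative data)
rng = np.random.default_rng(0)
mu, mu_err = fit_size_distribution(rng.normal(1.77, 0.3, 1000))
```

On real data, `sizes_um` would be the measured LE/LY diameters for one marker condition, and the fitted means of the different conditions would be compared as in fig. 32.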
In the present application, the imaging conditions for each image are shown in the following table.
[Table of imaging conditions, rendered in the original patent as images BDA0002373980640000361 and BDA0002373980640000371.]
In summary, because optical diffraction tomography combines information from multiple raw images to reconstruct one frame, the motion of any structure in a living cell may cause motion blur and reduced resolution, similar to SIM reconstruction. For example, movement of an LE/LY over a distance greater than the system's spatial resolution (about 200 nm) can cause motion blur and reduced image contrast, which requires the acquisition time of a reconstructed frame to be less than 1.38 s. Higher spatial resolution or longer exposure times would spread the LE/LY signal over a larger area, causing the structure to be lost in background noise. Likewise, because ER tubules and junctions also move rapidly in living cells, such structures had never previously been observed with optical diffraction tomography microscopes in living cells, and were even considered impossible to observe. Therefore, to achieve the maximum resolution attainable in live-cell optical diffraction tomography, the spatial resolution must be matched to a corresponding temporal resolution, a requirement ignored in previous designs. To maintain contrast and resolution under limited illumination, a fast optical diffraction tomography microscope must have sufficiently high sensitivity. In addition to careful design and alignment of the optical path, this requires an sCMOS camera with a large full-well capacity and mechanically oscillating scanning mirrors, which introduce less optical distortion than digital micromirror devices. Furthermore, the problems of illumination-angle jitter during high-speed mechanical scanning and of misaligned stitching of Ewald-sphere caps in long-term live-cell optical diffraction tomography are addressed with the VISA algorithm, which accurately determines the scanning wave vector for each illumination angle and minimizes the stitching error.
All of this contributes to generating sufficient photon flux during short exposures, and the optical diffraction tomography technique of the embodiments of the present application achieves imaging performance superior to the prior art.
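The wave-vector determination on which the VISA step depends can be illustrated with a simplified off-axis-holography example: locating the first-order carrier peak in the 2D Fourier spectrum of a single interferogram. This is a sketch under our own assumptions (the function name and the simple peak-picking approach are illustrative), not a reproduction of the patent's algorithm:

```python
import numpy as np

def estimate_illumination_wavevector(hologram, dc_mask_radius=10):
    """Estimate the illumination wave vector (in pixel-frequency units)
    from one off-axis hologram by locating the +1-order carrier peak
    in its centered 2D Fourier spectrum."""
    spectrum = np.fft.fftshift(np.fft.fft2(hologram))
    mag = np.abs(spectrum)
    ny, nx = mag.shape
    cy, cx = ny // 2, nx // 2
    # Suppress the zero-order (DC) term so the carrier peak dominates.
    yy, xx = np.mgrid[0:ny, 0:nx]
    mag[(yy - cy) ** 2 + (xx - cx) ** 2 <= dc_mask_radius ** 2] = 0
    # Keep only the upper half-plane to pick a single sideband.
    mag[cy:, :] = 0
    py, px = np.unravel_index(np.argmax(mag), mag.shape)
    return py - cy, px - cx  # wave vector relative to spectrum center

# Synthetic fringe pattern with a known carrier (ky, kx) = (-40, 25)
ny = nx = 256
y, x = np.mgrid[0:ny, 0:nx]
holo = 1.0 + np.cos(2 * np.pi * (-40 * y + 25 * x) / nx)
ky, kx = estimate_illumination_wavevector(holo)
```

In a real system the recovered wave vector would be tracked from frame to frame, which is how jitter of the illumination angle during high-speed scanning could be detected and compensated.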
Rapid optical diffraction tomography microscopes have several unique advantages over fluorescence microscopes. Optical diffraction tomography can image cells, structures and processes susceptible to phototoxicity (e.g., cell mitosis) (fig. 15). By contrast, the phototoxicity generated by SR fluorescence imaging arrested COS-7 cells at a late stage of mitosis (fig. 34), consistent with the significant toxicity of fluorescence microscopy in H2B-EGFP in vivo imaging during Caenorhabditis elegans embryonic development. In addition, non-specific effects caused by overexpression of foreign proteins can easily be detected with an optical diffraction tomography microscope. For example, compared with cells loaded only with LysoView 488, the size of LE/LY structures observed by optical diffraction tomography was significantly increased in LAMP1-EGFP-overexpressing cells and significantly decreased in Rab7-EGFP-overexpressing cells (fig. 32). Recently, membraneless organelles and solid-liquid phase separation within living cells have emerged as a common mechanism mediating important biological processes. Compared with fluorescence microscopy, which highlights only specific proteins or organelles, optical diffraction tomography faithfully reflects image changes of cellular condensates caused by phase separation, since its signal intensity is related to the spatial distribution of intracellular material density. Indeed, the condensation of chromatin and the reappearance of the nuclear membrane during mitosis of living cells can be clearly observed (fig. 15), resembling a phase separation process.
Finally, optical diffraction tomography can provide a comprehensive map of the organelle interactome, since the total number of organelles (e.g., mitochondria, LDs and LEs/LYs) that can be detected in 3D by optical diffraction tomography exceeds what a 2D microscope can detect in a single Z-plane (fig. 33). Furthermore, the same cell can be imaged essentially indefinitely, providing continuous information about long-term cellular processes and visualizing rare structures and intermediates.
On the other hand, fluorescent Hessian SIM is also indispensable. With higher resolution and contrast, Hessian SIM provides finer details, including the interior of mitochondria and its dynamics in living cells. Multicolor fluorescence SR imaging in SR-FACT further enables observation of organelles that are not visible under the optical diffraction tomography microscope (e.g., the Golgi apparatus and peroxisomes). In addition, with specific labeling, fluorescence super-resolution imaging can also highlight key proteins, lipids or molecules at the spatiotemporal moments of structural and dynamic changes. Finally, by imaging fluorescently labeled probes, SR-FACT can incorporate functional dynamics such as Ca2+, voltage and cAMP into the cellular landscape. However, adding low-phototoxicity SR fluorescence microscopy to optical diffraction tomography microscopy is not easy. Volumetric SR fluorescence SIM requires intense illumination for excitation, leading to extensive photobleaching and phototoxicity, and its temporal resolution is limited by the speed of mechanically changing the axial focal plane. Both drawbacks are incompatible with optical diffraction tomography, making correlated live-cell SR imaging in three dimensions impractical. Thus, the Hessian 2D-SIM is used to help identify and interpret the structures seen by the optical diffraction tomography module; it has been demonstrated to reduce the photon dose by a factor of 10 compared with conventional 2D-SIM.
The combination of optical diffraction tomography and Hessian SIM naturally leads to the observation of novel structures, as best demonstrated by the discovery of DBs. Several lines of evidence suggest that DBs may represent previously unappreciated cellular machinery. First, while DBs share some endosomal markers with conventional endosomal fractions, their vesicular lumina are pH-neutral and mostly liquid-rich; both properties differ from endosomes. Second, high-resolution optical diffraction tomography over as long as one hour revealed the biogenesis of DBs in the organelle- and biomaterial-rich region around the nucleus, followed by their eventual collapse into the plasma membrane, also unlike endosomes, which mainly follow the opposite, inward endocytic transport route. Finally, water transporters including AQP-2 were found to be present mainly in endosomes and almost absent in DBs, again demonstrating their uniqueness. Without SR-FACT it would be difficult to obtain a high-resolution, dual-modality image dataset spanning different time scales, even by combining SR fluorescence imaging with electron microscopy.
DBs may be important organizers of the organelle interactome through close interactions with other organelles, including mitochondria and the nuclear membrane (fig. 27(f)-(j)). In particular, it was demonstrated that different organelles can interact sequentially with a DB as a cornerstone to form a multiorganelle complex (fig. 27(k)-(n)). Since membrane contact sites are usually equipped with specific lipid and protein tethers at the interface, the data of the embodiments of the present application indicate that lipids and proteins of different domains may be present on the same DB. Thus, DBs may host material and information exchange between different organelles, some of which may eventually be transported to the plasma membrane. Interestingly, different types of aging hMSCs, whose senescence is known to be driven by nuclear-membrane changes, showed phenotype-associated increases in the number of DBs. These data indicate that nuclear information may be transferred to DBs directly or indirectly through DB-nuclear-membrane contact. Occasionally, larger vacuoles (2-3 μm in diameter) are observed in mammalian cells under abnormal conditions, such as nutritional deficiency, chemical exposure, bacterial toxin treatment or inhibition of PI5 kinase. These vacuoles may represent the small population of DBs present in normal COS-7 cells, enlarged under pathological stress conditions, which is also consistent with the importance of normal DB transport in maintaining cell function.
In summary, SR-FACT is a tool that provides both a holistic view of three-dimensional organelle interactions in living cells and highlights the specific organelles, molecules and signaling pathways involved. With its dual-mode correlated SR imaging capability, SR-FACT can reveal phenomena that cannot be observed with either imaging modality alone, often leading to unexpected observations in well-studied processes. With minimal phototoxicity and no special requirements on the labeling method, it also represents a new generation of user-friendly super-resolution microscopes that can generate a wealth of structural and dynamic information and help expand the understanding of the cell biology of living cells.
In experiments contemplated by some embodiments of the present application, cells can be cultured and prepared as follows. COS-7 cells can be cultured to about 75% confluency in high-glucose DMEM (GIBCO, 21063029) supplemented with 10% fetal bovine serum (FBS) (GIBCO) and 1% 100 mM sodium pyruvate solution (Sigma-Aldrich, S8636) in a 5% CO2 incubator at 37 °C. HUVECs can be isolated and cultured to about 75% confluency either in M199 medium (Thermo Fisher Scientific, 31100035) supplemented with fibroblast growth factor, heparin and 20% FBS, or in ECM medium containing endothelial cell growth supplement (ECGS) and 10% FBS (GIBCO), in both cases at 37 °C and 5% CO2. INS-1 cells were cultured to 75% confluency in RPMI 1640 medium containing 10% FBS (GIBCO), 1% 100 mM sodium pyruvate, and 0.1% 55 mM 2-mercaptoethanol (GIBCO, 21985023) in a 5% CO2 incubator at 37 °C. Human fibroblasts can be cultured to about 75% confluency in high-glucose DMEM (GIBCO, 21063029) supplemented with 20% FBS (GIBCO) in an incubator at 37 °C and 5% CO2. All hMSCs can be cultured in hMSC medium containing 90% α-MEM with glutamine (Gibco), 10% FBS (cells, A77E01F), 1% penicillin/streptomycin (Gibco), and 1 ng/mL FGF2 (Joint Protein Central). Dorsal root ganglion (DRG) neurons were isolated from P10 rats. Excess roots can be removed from the isolated DRGs, which are then digested with Dispase II (Roche, 10888700)/collagenase type II (Worthington Biochemical, LS004176) at 37 °C for 30 minutes, followed by centrifugation at room temperature for another 35 minutes.
DRG neuronal cell bodies were seeded onto coverslips coated with 30 mg/ml poly-L-ornithine (Sigma, RNBG3346) and 5 μg/ml laminin (Roche, 11243217001), and cultured in Neurobasal-A (GIBCO, 21103049) supplemented with 2% B-27 supplement (GIBCO, A3582801), 2 mM GlutaMAX (GIBCO, 35050061) and 1% penicillin/streptomycin (GIBCO, 15140122) in an incubator at 37 °C and 5% CO2. After 48 h of in vitro culture, the DRG neurons were ready for imaging. For SR-FACT imaging experiments, cells were seeded on coverslips (Thorlabs, CG15XH).
In some embodiments, for mitochondrial labeling, COS-7 cells can be incubated for 15 minutes with 250 nM MitoTracker™ Green FM (Thermo Fisher Scientific, M7514) in HBSS containing Ca2+ and Mg2+ but without phenol red (Thermo Fisher Scientific, 14025076), then washed and imaged. For labeling LDs, COS-7 cells were incubated with 1× LipidSpot™ 488 (Biotium, 70065-T) in complete cell culture medium at 37 °C for 30 min in the dark, then washed and imaged. For labeling LEs/LYs, COS-7 cells can be incubated with 1× LysoView™ 488 (Biotium, 70067-T) in complete cell culture medium at 37 °C for 15-30 min in the dark, then imaged without washing.
In some embodiments, COS-7 cells can be transfected with LifeAct-EGFP/KDEL-EGFP/LaminA-EGFP/H2B-EGFP/LAMP1-EGFP/β-1,4-galactosyltransferase 1 (B4GALT1)-EGFP/Pex11a-EGFP/LC3-EGFP/Rab7-EGFP/Rab5a-EGFP/Rab9a-EGFP/FYVE-EGFP/EEA1-EGFP/AQP2-EGFP using Lipofectamine™ 2000 (Thermo Fisher Scientific, 11668019) according to the manufacturer's instructions. Cells were imaged in a stage-top incubator 24-36 hours after transfection.
In some embodiments, to obtain clean coverslips for live-cell imaging, the coverslips may be immersed in 10% powdered precision cleaner (Alconox, 1104-1) and sonicated for 20 minutes. After rinsing in deionized water, the coverslips are sonicated in acetone for 15 minutes and then in 1 M NaOH or KOH for another 20 minutes. Finally, the coverslips may be rinsed with deionized water and sonicated 3 times for at least 5 minutes each. The washed coverslips are stored in 95-100% ethanol at 4 °C.
In some embodiments, ImageJ (Fiji) may be used to analyze the images during imaging data analysis and statistics. To analyze DBs (fig. 24), a threshold may be applied to each optical diffraction tomography image stack for segmentation, and the density of DBs calculated in the Z-plane of individual cells with clearly visible nuclear and plasma membrane structures. To analyze other organelles (fig. 34), the optical diffraction tomography dataset can be manually annotated and the LDs, LEs/LYs and mitochondria segmented. The areas of LDs, LEs/LYs and mitochondria in an axial volume of 0.86 μm (10 Z-planes centered on the layer containing the largest number of organelles) can be calculated to match one Z-plane of the 2D SIM, together with their percentage relative to the total area of LDs, LEs/LYs and mitochondria throughout the cell. The movements of LYs (fig. 28) and DBs (fig. 27) can be tracked manually using the ImageJ plugin TrackMate. MATLAB (MathWorks), OriginPro (OriginLab), Igor Pro (WaveMetrics) and Illustrator (Adobe) can be used to analyze the data and prepare the final figures. Mean results can be shown as mean ± standard error (mean ± SEM) for the number of experiments indicated. Statistical significance can be assessed using the Mann-Whitney rank-sum test (*, ** and *** indicate p < 0.05, p < 0.01 and p < 0.001, respectively). In some embodiments, the above data (or image) analysis and statistical methods can be implemented by other relevant instructions or software modules. In some embodiments, the above data (or image) analysis and statistical methods may be performed automatically by invoking relevant computer instructions, which may be stored in a computer-readable storage medium; when the computer instructions in the storage medium are read by a computer, they may be executed automatically by the dual-mode microscopic imaging system 100.
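The thresholding-based segmentation and the Mann-Whitney significance test described above can be sketched as follows. The embodiments use ImageJ/TrackMate and MATLAB for this; the script below is an illustrative stand-in with our own function names and synthetic data:

```python
import numpy as np
from scipy.stats import mannwhitneyu

def segment_and_measure(stack, threshold):
    """Apply a global threshold to an optical diffraction tomography
    image stack (Z, Y, X) and return the per-plane area (in pixels) of
    the segmented structures. Connected-component labelling and the
    manual curation of the actual workflow are omitted for brevity."""
    mask = stack > threshold
    return mask.reshape(mask.shape[0], -1).sum(axis=1)

def compare_groups(a, b, alpha=0.05):
    """Mann-Whitney rank-sum test, as used for the significance
    assessment in the text; returns (U statistic, p-value, significant?)."""
    stat, p = mannwhitneyu(a, b, alternative="two-sided")
    return stat, p, p < alpha

# Synthetic demo: compare per-cell DB densities of two groups
rng = np.random.default_rng(1)
dense = rng.normal(10.0, 1.0, 50)
sparse = rng.normal(5.0, 1.0, 50)
_, p, sig = compare_groups(dense, sparse)
```

In practice, `segment_and_measure` would be run per cell on the reconstructed refractive-index stacks, and `compare_groups` on the resulting per-cell quantities for the two conditions being compared.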
The beneficial effects that may be brought by the embodiments of the present application include, but are not limited to: (1) the bimodal microscopic imaging system can simultaneously perform label-free optical diffraction tomography three-dimensional imaging and super-resolution fluorescence two-dimensional imaging over a field of view of 80 μm × 40 μm at a rate of at least 0.3 Hz; (2) the bimodal microscopic imaging system compensates for the respective shortcomings of optical diffraction tomography and structured-light-illumination fluorescence imaging, enables comprehensive, rapid and long-term studies of cellular metabolism in living cells, and allows various organelles to be observed and identified; (3) the bimodal microscopic system can be used to observe the existence of a novel endosome, namely the low-refractive-index vesicle, to characterize its biochemical function, and to study its relationship with cellular aging; (4) the diffraction tomography reconstruction algorithm with complex deconvolution effectively handles reconstruction interruptions caused by fluctuations and occasional anomalies of the imaging system in actual imaging, has high fault tolerance and robustness, and can process the long-term, high-throughput diffraction tomography data of biological cell imaging in a timely manner; (5) based on the fusion of the two modes, parallel imaging characterization of the local specificity and the global morphology of a sample can be realized. It should be noted that different embodiments may produce different advantages; in different embodiments, the advantages produced may be any one or a combination of the above, or any other advantage that may be obtained.
Having thus described the basic concept, it will be apparent to those skilled in the art that the foregoing detailed disclosure is to be considered merely illustrative and not restrictive of the broad application. Various modifications, improvements and adaptations to the present application may occur to those skilled in the art, although not explicitly described herein. Such alterations, modifications, and improvements are intended to be suggested herein and are intended to be within the spirit and scope of the exemplary embodiments of the present application.
Also, this application uses specific language to describe embodiments of the application. Reference throughout this specification to "one embodiment," "an embodiment," and/or "some embodiments" means that a particular feature, structure, or characteristic described in connection with at least one embodiment of the present application is included in at least one embodiment of the present application. Therefore, it is emphasized and should be appreciated that two or more references to "an embodiment" or "one embodiment" or "an alternative embodiment" in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, some features, structures, or characteristics of one or more embodiments of the present application may be combined as appropriate.
Moreover, those skilled in the art will appreciate that aspects of the present application may be illustrated and described in terms of several patentable species or situations, including any new and useful combination of processes, machines, manufacture, or materials, or any new and useful improvement thereon. Accordingly, various aspects of the present application may be embodied entirely in hardware, entirely in software (including firmware, resident software, micro-code, etc.) or in a combination of hardware and software. The above hardware or software may be referred to as a "data block," "module," "engine," "unit," "component," or "system." Furthermore, aspects of the present application may be represented as a computer product, including computer readable program code, embodied in one or more computer readable media.
The computer storage medium may comprise a propagated data signal with the computer program code embodied therewith, for example, on baseband or as part of a carrier wave. The propagated signal may take any of a variety of forms, including electromagnetic, optical, and the like, or any suitable combination. A computer storage medium may be any computer-readable medium that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code located on a computer storage medium may be propagated over any suitable medium, including radio, cable, fiber optic cable, RF, or the like, or any combination of the preceding.
Computer program code required for the operation of various portions of the present application may be written in any one or more programming languages, including object-oriented programming languages such as Java, Scala, Smalltalk, Eiffel, JADE, Emerald, C++, C#, VB.NET and Python; conventional procedural programming languages such as C, Visual Basic, Fortran 2003, Perl, COBOL 2002, PHP and ABAP; dynamic programming languages such as Python, Ruby and Groovy; or other programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any network, such as a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet), or in a cloud computing environment, or as a service, such as software as a service (SaaS).
Additionally, unless explicitly stated in the claims, the order of processing elements and sequences, the use of numerical letters or other designations in the present application is not intended to limit the order of the processes and methods in the present application. While various presently contemplated embodiments of the invention have been discussed in the foregoing disclosure by way of example, it is to be understood that such detail is solely for that purpose and that the appended claims are not limited to the disclosed embodiments, but, on the contrary, are intended to cover all modifications and equivalent arrangements that are within the spirit and scope of the embodiments herein. For example, although the system components described above may be implemented by hardware devices, they may also be implemented by software-only solutions, such as installing the described system on an existing server or mobile device.
Similarly, it should be noted that in the preceding description of embodiments of the application, various features are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding the understanding of one or more of the embodiments. This method of disclosure, however, is not intended to require more features than are expressly recited in the claims. Indeed, the embodiments may be characterized as having less than all of the features of a single embodiment disclosed above.
Some embodiments have been described using numbers to describe components, attributes, and quantities, it being understood that such numbers used in the description of the embodiments have been modified in some instances by the use of the modifier "about", "approximately" or "substantially". Unless otherwise indicated, "about", "approximately" or "substantially" indicates that the number allows a variation of ± 20%. Accordingly, in some embodiments, the numerical parameters used in the specification and claims are approximations that may vary depending upon the desired properties of the individual embodiments. In some embodiments, the numerical parameter should take into account the specified significant digits and employ a general digit preserving approach. Notwithstanding that the numerical ranges and parameters setting forth the broad scope of the range are approximations, in the specific examples, such numerical values are set forth as precisely as possible within the scope of the range.
The entire contents of each patent, patent application, patent application publication, and other material cited in this application, such as articles, books, specifications, publications, and documents, are hereby incorporated by reference into this application, except for any application history document that is inconsistent with or conflicts with the contents of this application, and any document (currently or later appended to this application) that limits the broadest scope of the claims of this application. It is noted that if the description, definition, and/or use of a term in material attached to this application is inconsistent with or contrary to what is stated in this application, the description, definition, and/or use of the term in this application shall prevail.
Finally, it should be understood that the embodiments described herein are merely illustrative of the principles of the embodiments of the present application. Other variations are also possible within the scope of the present application. Thus, by way of example, and not limitation, alternative configurations of the embodiments of the present application can be viewed as being consistent with the teachings of the present application. Accordingly, the embodiments of the present application are not limited to only those embodiments explicitly described and illustrated herein.

Claims (18)

1. A bimodal microscopic imaging system is characterized by comprising an optical diffraction tomography subsystem and a structured light illumination fluorescence imaging subsystem;
the optical diffraction tomography subsystem is used for performing label-free optical diffraction tomography based on the first laser to acquire an optical diffraction tomography image of the sample;
the structured light illumination fluorescence imaging subsystem is used for performing fluorescence imaging based on second laser to acquire a structured light illumination fluorescence image of the sample; wherein,
the dual-mode microscopic imaging system comprises a first light source and a second light source which are mutually independent, wherein the first light source is used for emitting the first laser, and the second light source is used for emitting the second laser;
the optical diffraction tomography subsystem comprises the first light source, a first acousto-optic modulator, a first half-wave plate, a first polarization beam splitter prism, a galvanometer, a beam splitter and a first camera; the first laser passes through at least the first acousto-optic modulator and the first half-wave plate and is split by the first polarization beam splitter prism into a first split light and a second split light; the first split light, after at least the action of the galvanometer, irradiates the sample from multiple angles so as to obtain sample light carrying sample information; the sample light and the second split light, after being combined by at least the beam splitter, are received by the first camera;
the structured light illumination fluorescence imaging subsystem comprises the second light source, a second acousto-optic modulator, a second polarization beam splitter prism, a second half-wave plate, a spatial light modulator, a spatial filter, a polarization rotator and a second camera; the second laser forms structured light after passing through at least the second acousto-optic modulator, the second polarization beam splitter prism, the second half-wave plate and the spatial light modulator; the structured light forms excitation light after passing through at least the spatial filter and the polarization rotator; the excitation light acts on the sample and excites the sample to generate fluorescence; the second camera is used for receiving the fluorescence;
the system further includes a first dichroic mirror for separating the sample light and the fluorescent light, and a second dichroic mirror for separating the sample light and the excitation light.
2. The dual modality microscopic imaging system of claim 1, wherein the system further includes an objective lens, wherein,
the objective lens, the first dichroic mirror, and the second dichroic mirror are shared by the optical diffraction tomography subsystem and the structured light illuminated fluorescence imaging subsystem;
the first dichroic mirror and the second dichroic mirror are further for directing the sample light to the beam splitter in the optical diffraction tomography subsystem and directing the excitation light to the objective lens to act on the sample in the structured light illuminated fluorescence imaging subsystem;
the first dichroic mirror is further to direct the fluorescent light to the second camera in the structured light illuminated fluorescence imaging subsystem.
3. The dual-modality microscopic imaging system of claim 2, the system further comprising:
at least one lens for collimating the first laser light, the first split light, the second split light, the sample light, the second laser light, the structured light, the excitation light and/or the fluorescence light;
at least one optical fiber for transmitting the first split light, the second split light and/or the second laser light.
4. The dual modality microscopic imaging system of any of claims 1-3, wherein the optical diffraction tomography image is a three dimensional image and the structured light illuminated fluorescence image is a two dimensional image.
5. The dual modality microscopic imaging system of any of claims 1 through 3,
the structured light illumination fluorescence imaging subsystem is further used for acquiring another structured light illumination fluorescence image of the sample based on a third laser; wherein the third laser and the second laser have different wavelengths.
6. The dual modality microscopic imaging system of any of claims 1-3, wherein the first laser light is a different wavelength than the second laser light.
7. The dual modality microscopic imaging system of any of claims 1 through 3,
the system is used for performing the fluorescence imaging and the label-free optical diffraction tomography over a field of view of not less than 80 micrometers × 40 micrometers at a rate of not less than 0.3 Hz;
the lateral resolution of the optical diffraction tomography image is not less than 200 nm, and the axial resolution is not less than 560 nm;
the lateral resolution of the structured light illumination fluorescence image is not less than 100 nm.
8. The dual modality microscopic imaging system of any of claims 1-3, wherein the system further comprises a processor to:
determining a bimodal fusion image of the sample at the same localization position based on the optical diffraction tomography image and the structured light illumination fluorescence image.
9. The dual modality microscopic imaging system of claim 8,
the system uses the fluorescence imaging to assist the label-free optical diffraction tomography;
the bimodal fusion image has both morphological information and class label information of the sample.
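As an illustration of the fused output described in claims 8 and 9, the fusion can be pictured as overlaying class labels derived from thresholded fluorescence channels onto the label-free refractive-index morphology. The sketch below is hypothetical (the function name and the simple thresholding rule are assumptions, not the patented fusion method):

```python
import numpy as np

def fuse_bimodal(odt_slice, fluor_channels, threshold=0.5):
    """Hypothetical fusion sketch: morphology comes from the label-free ODT
    refractive-index slice; class labels come from thresholded SIM channels.
    Later channels overwrite earlier ones where their masks overlap."""
    labels = np.zeros(odt_slice.shape, dtype=int)
    for cls, channel in enumerate(fluor_channels, start=1):
        labels[channel > threshold] = cls
    return odt_slice, labels
```

Each fluorescence channel contributes one class label while the refractive-index slice supplies the morphology, matching the claim that the fusion image carries both morphological information and class label information.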
10. The dual modality microscopic imaging system of any of claims 1-3, the system further comprising a control subsystem for:
controlling the working time sequence of the optical diffraction tomography subsystem and the structured light illumination fluorescence imaging subsystem, so that the label-free optical diffraction tomography and the fluorescence imaging are performed simultaneously or alternately.
11. A dual-modality microscopic imaging method applied to the dual-modality microscopic imaging system according to any one of claims 1 to 10, comprising:
respectively generating a first laser and a second laser by utilizing mutually independent light sources;
acquiring, with an optical diffraction tomography subsystem, an optical diffraction tomography image of the sample based on the first laser; wherein the acquiring includes: acting on the sample with the first laser to obtain a first imaging data set of the sample; and processing the first imaging data set with a diffraction tomography reconstruction algorithm to generate the optical diffraction tomography image; the diffraction tomography reconstruction algorithm comprising: a holographic processing step for extracting a hologram based on the first imaging data set, the hologram comprising an amplitude image and a phase image; a wave vector calculation step for determining a target scanning wave vector based on the phase image and generating an unwrapped phase image; a Rytov approximation step for determining a Rytov phase field based on the amplitude image and the unwrapped phase image; a spectrum stitching step for performing stitching in the frequency domain based on the Rytov phase field; and an inverse filtering step for filtering the spectrum stitching result to obtain the optical diffraction tomography image;
acquiring a structured light illumination fluorescence image of the sample based on the second laser by using a structured light illumination fluorescence imaging subsystem;
generating a bimodal fusion image of the sample based on the optical diffraction tomography image and the structured light illumination fluorescence image.
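The reconstruction steps recited in claim 11 can be sketched in outline. This is a deliberately simplified illustration, not the patented implementation: off-axis holographic demodulation is reduced to a complex ratio of sample and background fields, and all function names are hypothetical:

```python
import numpy as np

def extract_hologram(sample_field, background_field):
    # Holographic processing step (simplified): off-axis demodulation is
    # replaced here by a direct complex ratio of the two fields, yielding
    # the amplitude image and the (wrapped) phase image of the claim.
    field = sample_field / background_field
    return np.abs(field), np.angle(field)

def rytov_phase_field(amplitude, unwrapped_phase):
    # Rytov approximation step: the complex Rytov phase is
    #   psi = ln(A) + i * phi_unwrapped,
    # whose 2-D spectrum is later stitched in the frequency domain.
    return np.log(amplitude) + 1j * unwrapped_phase
```

The wave vector calculation, spectrum stitching, and inverse filtering steps would operate between and after these two helpers; sketches of those steps appear under the later claims that detail them.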
12. The dual-modality microscopic imaging method of claim 11, wherein the acquiring, with the optical diffraction tomography subsystem, an optical diffraction tomography image of the sample based on the first laser light includes:
acting on the sample with the first laser to obtain a first imaging data set of the sample;
processing the first imaging data set with at least a complex deconvolution diffraction tomography three-dimensional reconstruction algorithm to generate the optical diffraction tomography image.
13. The dual-modality microscopic imaging method of claim 11, wherein the wave vector calculating step includes:
determining the target scanning wave vector using a vector iterative search algorithm.
14. The method of dual-modality microscopic imaging of claim 13, wherein the determining the target scan wave-vector using a vector iterative search algorithm comprises:
multiplying the phase image extracted by the holographic processing step by a digital phase-shift term to obtain a frequency-shifted phase image and a preliminarily estimated scanning wave vector;
unwrapping the frequency-shifted phase image using a phase unwrapping algorithm based on a least squares method to obtain the unwrapped phase image;
performing a linear fit to the slope of the unwrapped phase image in two orthogonal directions, and obtaining an updated scanning wave vector based on the fitting result and the preliminarily estimated scanning wave vector; and
iterating repeatedly based on the updated scanning wave vector until the slope falls below a preset threshold, so as to obtain the target scanning wave vector.
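The vector iterative search of claim 14 can be illustrated with a small sketch. The function name is hypothetical, and `np.unwrap` stands in for the least-squares phase unwrapper recited in the claim; removing the current tilt estimate, unwrapping the residual, fitting its slope in two orthogonal directions, and updating the estimate repeats until the residual slope drops below the threshold:

```python
import numpy as np

def estimate_scan_wavevector(wrapped_phase, k0=(0.0, 0.0), tol=1e-6, max_iter=50):
    """Iteratively refine the tilt wave vector of a wrapped phase image."""
    ny, nx = wrapped_phase.shape
    y, x = np.mgrid[0:ny, 0:nx]
    kx, ky = k0  # preliminarily estimated scanning wave vector
    for _ in range(max_iter):
        # remove the current tilt estimate (the digital phase-shift term)
        shifted = np.angle(np.exp(1j * (wrapped_phase - kx * x - ky * y)))
        # unwrap the residual along both orthogonal directions
        unwrapped = np.unwrap(np.unwrap(shifted, axis=1), axis=0)
        # linear fit of the residual slope in each direction
        sx = np.polyfit(x[0], unwrapped.mean(axis=0), 1)[0]
        sy = np.polyfit(y[:, 0], unwrapped.mean(axis=1), 1)[0]
        kx, ky = kx + sx, ky + sy  # updated scanning wave vector
        if max(abs(sx), abs(sy)) < tol:
            break
    return kx, ky
```

On a synthetic phase ramp the estimate converges in two passes, since the first slope fit already captures the tilt exactly.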
15. The dual-modality microscopic imaging method of claim 11, wherein the hologram includes a sample hologram and a background hologram, the diffraction tomography reconstruction algorithm further includes a sequence checking step, the sequence checking step including:
comparing the scanning wave vectors of a group of sample holograms at the same time point with the scanning wave vectors of the background holograms to obtain a scanning error for the group of sample holograms;
marking a sequence anomaly for the group of sample holograms when the scanning error is larger than a set threshold; and
controlling the grouping of the first imaging data sets and the spectrum stitching step based on the sequence anomaly.
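The sequence checking step of claim 15 amounts to comparing per-group scan wave vectors against the background reference and flagging groups whose error exceeds the threshold. A minimal sketch (function name and error metric are assumptions; the claim does not specify the norm used):

```python
import numpy as np

def check_scan_sequence(sample_kvecs, background_kvecs, threshold):
    """Flag hologram groups whose scan wave vectors drift from the
    background reference (simplified sequence check)."""
    diffs = np.asarray(sample_kvecs, float) - np.asarray(background_kvecs, float)
    errors = np.linalg.norm(diffs, axis=1)   # scanning error per group
    anomalies = errors > threshold           # sequence-anomaly marks
    return errors, anomalies
```

The anomaly marks would then steer how the first imaging data set is grouped and which groups enter the spectrum stitching step.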
16. The dual-modality microscopic imaging method of claim 11, wherein the spectrum stitching step is used for:
stitching the two-dimensional spectrum of the Rytov phase field with a complex transmission spectrum parameterized by the target scanning wave vector to obtain a three-dimensional scattering spectrum and a three-dimensional complex transmission spectrum; and
the inverse filtering step is used for: dividing the three-dimensional scattering spectrum by the three-dimensional complex transmission spectrum based on the Wiener inverse filtering principle to obtain the optical diffraction tomography image.
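The Wiener-style division in claim 16 regularizes the spectral ratio so that near-zeros of the transmission spectrum do not amplify noise. A sketch (function name and the scalar noise-to-signal constant `nsr` are assumptions, not taken from the patent):

```python
import numpy as np

def wiener_divide(scatter_spectrum, transfer_spectrum, nsr=1e-3):
    """Wiener-regularized division of the 3-D scattering spectrum by the
    complex transmission spectrum: S * conj(H) / (|H|^2 + nsr)."""
    H = transfer_spectrum
    return scatter_spectrum * np.conj(H) / (np.abs(H) ** 2 + nsr)
```

Where `|H|` is large the result approaches the plain ratio `S / H`; where `|H|` is near zero the `nsr` term suppresses the output instead of dividing by noise.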
17. A dual-modality microscopic imaging apparatus comprising a processor, wherein the processor is configured to perform the dual-modality microscopic imaging method of any one of claims 11-16.
18. A computer readable storage medium storing computer instructions which, when read by a computer, cause the computer to perform the method of dual modality microscopic imaging of any one of claims 11 to 16.
CN202010059510.1A 2020-01-19 2020-01-19 Bimodal microscopic imaging system and method Active CN111610621B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202010059510.1A CN111610621B (en) 2020-01-19 2020-01-19 Bimodal microscopic imaging system and method
PCT/CN2021/071393 WO2021143707A1 (en) 2020-01-19 2021-01-13 Dual-modality microscopic imaging system and method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010059510.1A CN111610621B (en) 2020-01-19 2020-01-19 Bimodal microscopic imaging system and method

Publications (2)

Publication Number Publication Date
CN111610621A CN111610621A (en) 2020-09-01
CN111610621B true CN111610621B (en) 2022-04-08

Family

ID=72195888

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010059510.1A Active CN111610621B (en) 2020-01-19 2020-01-19 Bimodal microscopic imaging system and method

Country Status (2)

Country Link
CN (1) CN111610621B (en)
WO (1) WO2021143707A1 (en)

Families Citing this family (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111610621B (en) * 2020-01-19 2022-04-08 北京大学 Bimodal microscopic imaging system and method
CN113009680B (en) * 2021-03-11 2021-11-05 广东粤港澳大湾区协同创新研究院 Multi-channel imaging system and method for super-resolution imaging
CN113702288B (en) * 2021-08-18 2022-07-01 北京大学 Bimodal microscopic imaging system and imaging method thereof
CN113777767B (en) * 2021-09-14 2022-06-10 北京大学长三角光电科学研究院 Optical tomography microscopic imaging system and method for rapidly and continuously rotating sample
CN113779180A (en) * 2021-09-29 2021-12-10 北京雅丁信息技术有限公司 Regional DRG grouping simulation method
CN113933277B (en) * 2021-10-15 2023-08-22 深圳大学 High-density three-dimensional single-molecule positioning super-resolution microscopic imaging system and method
CN113768472B (en) * 2021-11-10 2022-03-22 华中科技大学 Three-dimensional image acquisition device with fluorescent marker and method
CN114324245B (en) * 2021-11-15 2024-01-30 西安电子科技大学 Quantitative phase microscopic device and method based on partially coherent structured light illumination
CN114190890A (en) * 2021-11-26 2022-03-18 长沙海润生物技术有限公司 Wound surface imaging device and imaging method thereof
CN114326075B (en) * 2021-12-10 2023-12-19 肯维捷斯(武汉)科技有限公司 Digital microscopic imaging system and microscopic detection method for biological sample
CN114813518A (en) * 2022-02-17 2022-07-29 山东大学 Mark-free streaming detection device and method based on single-camera dual-mode imaging
CN114755200B (en) * 2022-03-21 2022-11-08 北京大学长三角光电科学研究院 Visual monitoring system and method based on photodynamic therapy
CN114820761B (en) * 2022-05-07 2024-05-10 北京毅能博科技有限公司 XY direction included angle measurement and motion compensation method based on image microscopic scanning platform
CN115015176B (en) * 2022-05-18 2023-08-29 北京大学长三角光电科学研究院 Optical diffraction tomography enhancement method and device
CN114668582B (en) * 2022-05-30 2022-08-19 季华实验室 Ophthalmologic light source operation system
CN115308184A (en) * 2022-09-05 2022-11-08 中国科学院苏州生物医学工程技术研究所 Active structured light illuminated super-resolution microscopic imaging method and system
CN115541578B (en) * 2022-09-28 2023-10-24 佐健(上海)生物医疗科技有限公司 High-flux super-resolution cervical cell pathological section rapid scanning analysis system
CN115901623B (en) * 2022-11-03 2023-07-11 北京大学长三角光电科学研究院 High-speed optical diffraction chromatography microscopic imaging system
CN115980989A (en) * 2023-01-09 2023-04-18 南开大学 Single-frame quantitative phase tomography system and method based on microlens array
CN117268288B (en) * 2023-07-26 2024-03-08 北京大学长三角光电科学研究院 Optical diffraction tomography laser scanning method and device and electronic equipment
CN117030674A (en) * 2023-10-09 2023-11-10 中国科学院生物物理研究所 Structured light illuminated three-dimensional super-resolution imaging system and method

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2010062364A1 (en) * 2008-10-31 2010-06-03 University Of Maine System Board Of Trustees Nanoscale imaging of molecular positions and anisotropies
CN106056613A (en) * 2016-06-02 2016-10-26 南方医科大学 Magnetic resonance phase unwrapping method based on pixel classification and local surface fitting
CN108169173A (en) * 2017-12-29 2018-06-15 南京理工大学 A kind of big visual field high-resolution three dimensional diffraction chromatography micro imaging method
CN108665411A (en) * 2018-03-09 2018-10-16 北京超维景生物科技有限公司 A kind of method and system of image reconstruction
CN208399380U (en) * 2018-06-07 2019-01-18 中国科学院苏州生物医学工程技术研究所 A kind of Structured Illumination super-resolution micro imaging system
CN110658195A (en) * 2019-10-25 2020-01-07 浙江大学 Frequency shift unmarked super-resolution microscopic chip and imaging method thereof
CN209946009U (en) * 2019-03-22 2020-01-14 中国科学院苏州生物医学工程技术研究所 Optical coherence tomography and two-photon fluorescence synchronous imaging system
CN111060485A (en) * 2019-12-31 2020-04-24 苏州栢科昇科技有限公司 Microorganism multi-modal imaging system and microorganism multi-modal imaging detection method

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2011121523A2 (en) * 2010-03-28 2011-10-06 Ecole Polytechnique Federale De Lausanne (Epfl) Complex index refraction tomography with sub λ/6-resolution
CN101984928B (en) * 2010-09-29 2012-06-13 北京大学 Multi-mode molecular tomography system
CN103948399B (en) * 2013-07-25 2016-03-23 合肥工业大学 Based on the 3-D supersonic imaging method of non-diffraction ripple under sector scanning mode
WO2015189174A2 (en) * 2014-06-10 2015-12-17 Carl Zeiss Meditec, Inc. Improved frequency-domain interferometric based imaging systems and methods
CN205750291U (en) * 2016-04-13 2016-11-30 苏州大学 A kind of hologram three-dimensional display device based on spatial light modulator
KR101865624B1 (en) * 2016-06-10 2018-06-11 주식회사 토모큐브 3D Refractive Index Tomogram and Structured Illumination Microscopy System using Wavefront Shaper and Method thereof
KR102426103B1 (en) * 2016-07-22 2022-07-28 주식회사 내일해 An Improved Holographic Reconstruction Apparatus and Method
CN106227016B (en) * 2016-07-28 2019-03-12 东南大学 A kind of non-iterative complex amplitude modulation holographic projection methods
US10131133B2 (en) * 2016-10-17 2018-11-20 Purdue Research Foundation Methods for forming optically heterogeneous phantom structures and phantom structures formed thereby
US20180164562A1 (en) * 2016-10-25 2018-06-14 Stereo Display, Inc. Confocal microscopy system with vari-focus optical element
CN111610621B (en) * 2020-01-19 2022-04-08 北京大学 Bimodal microscopic imaging system and method

Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
"simultaneous dual-contrast three-dimensional imaging in live cells via optical diffraction tomography and fluorescence";Chen Liu etc.;《PHOTONICS Research》;20190930;第1042-1050 *
"Super-resolution three-dimensional fluorescence and optical diffraction tomography of live cells using structured illumination generated by a digital micromirror device";Seungwoo Shin etc.;《SCIENTIFIC REPORTS》;20181230;第1-8页 *
"基于波前重构的计算显微成像方法与应用研究";卞殷旭;《中国博士学位论文全文数据库工程科技II辑》;20190430;正文第2章至第4章 *
"活细胞超灵敏结构光超高分辨率显微镜";黄小帅等;《中国科学基金》;20181230;第367-375页 *
Seungwoo Shin etc.."Super-resolution three-dimensional fluorescence and optical diffraction tomography of live cells using structured illumination generated by a digital micromirror device".《SCIENTIFIC REPORTS》.2018, *

Also Published As

Publication number Publication date
WO2021143707A1 (en) 2021-07-22
CN111610621A (en) 2020-09-01

Similar Documents

Publication Publication Date Title
CN111610621B (en) Bimodal microscopic imaging system and method
Dong et al. Super-resolution fluorescence-assisted diffraction computational tomography reveals the three-dimensional landscape of the cellular organelle interactome
Park et al. Quantitative phase imaging in biomedicine
Li et al. Transport of intensity diffraction tomography with non-interferometric synthetic aperture for three-dimensional label-free microscopy
Galiani et al. Super-resolution microscopy reveals compartmentalization of peroxisomal membrane proteins
Kim et al. Correlative three-dimensional fluorescence and refractive index tomography: bridging the gap between molecular specificity and quantitative bioimaging
Boden et al. Volumetric live cell imaging with three-dimensional parallelized RESOLFT microscopy
US8848199B2 (en) Tomographic phase microscopy
US9164479B2 (en) Systems and methods of dual-plane digital holographic microscopy
Kemper et al. Label-free quantitative in vitro live cell imaging with digital holographic microscopy
US20090290156A1 (en) Spatial light interference microscopy and fourier transform light scattering for cell and tissue characterization
Thouvenin et al. Dynamic multimodal full-field optical coherence tomography and fluorescence structured illumination microscopy
Holanová et al. Optical imaging and localization of prospective scattering labels smaller than a single protein
WO2023221741A1 (en) Transport of intensity diffraction tomography microscopic imaging method based on non-interferometric synthetic aperture
KR101855366B1 (en) Method and system for 3d label-free imaging and quantification of lipid doplets in live hepatocytes
Lévesque et al. Sample and substrate preparation for exploring living neurons in culture with quantitative-phase imaging
US8508746B2 (en) Interferometric systems having reflective chambers and related methods
Weigel et al. Resolution in the ApoTome and the confocal laser scanning microscope: comparison
Mirsky et al. Dynamic tomographic phase microscopy by double six-pack holography
Sergeev et al. Determination of membrane protein transporter oligomerization in native tissue using spatial fluorescence intensity fluctuation analysis
Hsiao et al. Spinning disk interferometric scattering confocal microscopy captures millisecond timescale dynamics of living cells
US20130169969A1 (en) Spatial Light Interference Tomography
Zhang et al. Advanced imaging techniques for tracking drug dynamics at the subcellular level
Kang et al. Mapping nanoscale topographic features in thick tissues with speckle diffraction tomography
Creath Dynamic quantitative phase images of pond life, insect wings, and in vitro cell cultures

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant