CN116593434A - Line illumination modulation dual-color tomography system - Google Patents

Line illumination modulation dual-color tomography system

Info

Publication number: CN116593434A
Application number: CN202310411101.7A
Authority: CN (China)
Prior art keywords: pixel row, image, line, channel, light
Other languages: Chinese (zh)
Inventors: 袁菁, 龚辉, 丁章恒, 赵江江, 骆清铭
Assignee (current and original): Hust-Suzhou Institute For Brainsmatics
Application filed by Hust-Suzhou Institute For Brainsmatics
Legal status: Pending

Classifications

    • A61B5/0071: Measuring for diagnostic purposes using light, by measuring fluorescence emission
    • A61B5/0073: Measuring for diagnostic purposes using light, by tomography, i.e. reconstruction of 3D images from 2D projections
    • G01N21/6428: Measuring fluorescence of fluorescent products of reactions or of fluorochrome-labelled reactive substances, e.g. using measuring "optrodes"
    • G01N21/6456: Spatially resolved fluorescence measurements; Imaging
    • G01N21/6458: Fluorescence microscopy
    • G02B21/0032: Confocal scanning microscopes; Optical details of illumination, e.g. light sources, pinholes, beam splitters, slits, fibers
    • G02B21/0064: Confocal scanning microscopes; Optical details of the image generation, multi-spectral or wavelength-selective arrangements, e.g. wavelength fan-out, chromatic profiling
    • G02B21/0076: Confocal scanning microscopes; Optical details of the image generation arrangements using fluorescence or luminescence
    • G02B21/008: Confocal scanning microscopes; Details of detection or image processing, including general computer control
    • G02B21/367: Control or image processing arrangements for digital or video microscopes providing an output produced by processing a plurality of individual source images, e.g. image tiling, montage, composite images, depth sectioning, image comparison
    • G06N3/0464: Neural networks; Convolutional networks [CNN, ConvNet]
    • G06N3/08: Neural networks; Learning methods
    • G06T3/4038: Image mosaicing, e.g. composing plane images from plane sub-images
    • G06T5/80: Image enhancement or restoration; Geometric correction
    • G06T7/30: Image analysis; Determination of transform parameters for the alignment of images, i.e. image registration
    • G01N2021/6439: Measuring fluorescence of fluorochrome-labelled reactive substances, with indicators, stains, dyes, tags, labels, marks
    • G01N2021/6463: Specially adapted constructive features of fluorimeters; Optics

Landscapes

  • Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Chemical & Material Sciences (AREA)
  • Analytical Chemistry (AREA)
  • General Health & Medical Sciences (AREA)
  • Theoretical Computer Science (AREA)
  • Optics & Photonics (AREA)
  • Pathology (AREA)
  • Molecular Biology (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Immunology (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Biochemistry (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Computation (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Data Mining & Analysis (AREA)
  • Surgery (AREA)
  • Computing Systems (AREA)
  • Artificial Intelligence (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Medical Informatics (AREA)
  • Computational Linguistics (AREA)
  • Multimedia (AREA)
  • Animal Behavior & Ethology (AREA)
  • Veterinary Medicine (AREA)
  • Public Health (AREA)
  • Chemical Kinetics & Catalysis (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • Radiology & Medical Imaging (AREA)
  • Investigating, Analyzing Materials By Fluorescence Or Luminescence (AREA)

Abstract

The application discloses a line illumination modulation dual-color tomography system, which belongs to the technical field of multicolor tomography and comprises the following components. A line illumination modulation module comprises two monochromatic light sources and a modulation light path; each light beam is modulated into a line light spot focused on the focal plane of the objective lens, the position parameter of each line light spot lies at the center of a pixel row or on the intersection line of adjacent pixel rows, and the distance between the position parameters of the two line light spots is an integer multiple of 0.5 pixel. An imaging module scans and images continuously along a first direction with a multi-element detector having n rows of pixels to obtain n mixed images, each mixed image corresponding to one pixel row and containing the signals of 2 channels, where n ≥ 3 and n is a positive integer. An image demodulation module subtracts the mixed images of two symmetric pixel rows, eliminating the signal of one channel from the mixed image and demodulating a monochromatic image that contains the signal of the other channel. The application achieves simultaneous dual-color tomography on a single detector.

Description

Line illumination modulation dual-color tomography system
Technical Field
The application relates to the technical field of multicolor imaging, and in particular to a line illumination modulation dual-color tomography system.
Background
Different structures in biological tissue can be selectively labeled with fluorophores of different colors, and multicolor microscopic imaging can then acquire the multicolor fluorescent signals simultaneously, allowing the spatial relationships and interactions among cells, organelles and molecules in the tissue to be analyzed more effectively. Current multicolor imaging methods typically use dichroic mirrors to spatially separate the signal of each channel and direct it to a different monochrome camera for detection, but this approach has several problems: images acquired with an ordinary wide-field microscope have no optical sectioning (tomographic) capability and are disturbed by strong defocused background signals; for fluorophores whose emission spectra overlap, the dichroic mirrors cannot completely separate the fluorescent signals of different channels, so crosstalk exists between channels; when several monochrome cameras are used for detection, the images of the channels are offset laterally and their detection focal planes differ axially, so the raw images require complex registration; each channel requires its own camera, so the number of channels available for multicolor imaging is limited by the size and complexity of the system; and the manufacturing cost of the system grows with the number of cameras.
Disclosure of Invention
In order to overcome the problems of existing multicolor imaging methods, an embodiment of the application provides a line illumination modulation dual-color tomography system. The technical scheme is as follows:
The line illumination modulation dual-color tomography system comprises:
a line illumination modulation module, comprising two monochromatic light sources and a modulation light path, the wavelengths of the light beams emitted by the two monochromatic light sources being different; each light beam passes through the modulation light path to form a line light spot focused on the focal plane of the objective lens; the illumination intensity of the line light spot on the focal plane of the objective lens follows a Gaussian distribution along a first direction, the first direction being perpendicular to the extending direction of the line light spot; the position parameter of each line light spot lies at the center of a pixel row or on the intersection line of adjacent pixel rows; the distance between the position parameters of the two line light spots is an integer multiple of the object-space width corresponding to 0.5 pixel, and the line light spots of the two channels are superposed to form dual-color line illumination light;
an imaging module, configured to scan and image continuously along the first direction with a multi-element detector having n rows of pixels under the dual-color line illumination light, obtaining n mixed images, each mixed image corresponding to one pixel row and containing the signals of 2 channels, where n ≥ 3 and n is a positive integer;
and an image demodulation module, configured to subtract the mixed images of two symmetric pixel rows and demodulate a monochromatic image from them, the two symmetric pixel rows being symmetric about the position parameter of the line light spot of one channel and the monochromatic image corresponding to the signal of the other channel.
Further, the image demodulation module includes:
the channel selection unit is used for selecting one channel as a target channel and the other channel as an auxiliary channel;
a pixel row determining unit configured to determine a first pixel row and a second pixel row symmetrical with respect to a position parameter of the auxiliary channel;
an image correction unit configured to perform offset correction on the mixed image of the second pixel row;
and an image demodulation unit configured to obtain the monochromatic image of the target channel from the mixed image of the first pixel row and the offset-corrected mixed image of the second pixel row.
Further, the image correction unit includes:
a convolution kernel acquisition subunit configured to acquire an offset-correction convolution kernel between the first pixel row and the second pixel row;
and a convolution kernel correction subunit configured to perform offset correction on the mixed image of the second pixel row using the offset-correction convolution kernel.
Further, the convolution kernel acquisition subunit is configured to:
acquiring a monochromatic image of the first pixel row and a monochromatic image of the second pixel row with only the auxiliary-channel illumination beam switched on;
performing a Fourier transform on the monochromatic image of the first pixel row and the monochromatic image of the second pixel row to obtain a frequency-domain image of the first pixel row and a frequency-domain image of the second pixel row;
and dividing the frequency-domain image of the first pixel row by the frequency-domain image of the second pixel row and then performing an inverse Fourier transform to obtain the offset-correction convolution kernel.
Further, the convolution kernel acquisition subunit is configured to:
calculating the effective point spread function of the first pixel row and the effective point spread function of the second pixel row;
performing a Fourier transform on the effective point spread function of the first pixel row and the effective point spread function of the second pixel row to obtain the optical transfer function of the first pixel row and the optical transfer function of the second pixel row;
and dividing the optical transfer function of the first pixel row by the optical transfer function of the second pixel row and then performing an inverse Fourier transform to obtain the offset-correction convolution kernel.
Further, the image correction unit includes:
a translation parameter obtaining subunit, configured to obtain a translation parameter between the first pixel row and the second pixel row;
and the translation correction subunit is used for carrying out offset correction on the mixed image of the second pixel row by utilizing the translation parameters.
Further, the modulation light path includes a shaping light path for shaping each light beam into a line beam, a position adjustment light path for adjusting the position parameter of each line light spot, and a projection light path for superposing the line light spots to form the dual-color line illumination light.
Further, the position adjustment light path includes a second dichroic mirror, a third dichroic mirror, a first reflecting mirror, a second reflecting mirror, a third reflecting mirror and a fourth reflecting mirror; the second dichroic mirror splits the incident beam into two paths, one path passing sequentially through the second dichroic mirror and the third dichroic mirror, the other path passing sequentially through the second dichroic mirror, the first reflecting mirror, the second reflecting mirror and the third dichroic mirror, and the light emerging from the third dichroic mirror passes sequentially through the third reflecting mirror and the fourth reflecting mirror.
Further, the imaging module includes:
a scanning unit configured to scan and image continuously along the first direction with a multi-element detector having n rows of pixels, where n ≥ 3;
an image block acquisition unit configured to acquire the strip image block of the i-th pixel row in each frame image of a sample obtained in time sequence;
and a stitching unit configured to stitch, in sequence, the strip image blocks of the i-th pixel row in each frame image of the sample to obtain the mixed image of the i-th pixel row, where i ∈ {1, ..., n}.
Further, the line illumination modulation dual-color tomography system further comprises a driving module for driving the line illumination modulation module and the sample to perform relative movement in three directions perpendicular to each other.
The technical scheme provided by the embodiments of the application has at least the following beneficial effects:
The application provides a line illumination modulation dual-color tomography system that uses a single multi-element detector for scanning imaging, which reduces system cost and complexity compared with imaging methods that require several cameras. Detection imaging with multiple pixel rows places the images of the channels in a natural registration relationship, so no additional registration processing is needed. The demodulated image eliminates the defocused background signal and therefore inherently has tomographic (optical sectioning) capability. In fluorescence imaging, the line illumination modulation dual-color tomography system avoids emission-spectrum crosstalk by using illumination light of different wavelengths to excite fluorescent signals of different colors separately.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings required for the description of the embodiments will be briefly described below, and it is apparent that the drawings in the following description are only some embodiments of the present application, and other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a functional block diagram of the line illumination modulation dual-color tomography system of the present application;
FIG. 2 is a schematic diagram of the optical path structure of the line illumination modulation dual-color tomography system of the present application;
FIG. 3 is a schematic diagram of the line illumination light distribution of the present application;
FIG. 4 is a functional block diagram of an imaging module of the present application;
FIG. 5 is a schematic diagram of a sample imaging acquisition process of the present application;
FIG. 6 is a schematic representation of three-dimensional imaging of the present application;
fig. 7 is a functional block diagram of an image correction unit of the present application;
fig. 8 is a functional block diagram of another image correction unit of the present application.
Detailed Description
The present application will be described in further detail with reference to the drawings and examples, in order to make the objects, technical solutions and advantages of the present application more apparent. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the application. In addition, the technical features of the embodiments of the present application described below may be combined with each other as long as they do not collide with each other.
As shown in fig. 1, an embodiment of the present application provides a line illumination modulation dual-color tomography system, which includes a line illumination modulation module 10, an imaging module 20, and an image demodulation module 30.
The line illumination modulation module 10 includes two monochromatic light sources and a modulation light path, and the wavelengths of the light beams emitted by the two monochromatic light sources are different. Each light beam is shaped by the modulation light path into a line light spot focused on the focal plane of the objective lens, and the illumination intensity of the line light spot on the focal plane of the objective lens follows a Gaussian distribution along a first direction, the first direction being perpendicular to the extending direction of the line light spot. The line light spots formed after modulation of the different light beams have different position parameters; the position parameter of each line light spot lies at the center of a pixel row or on the intersection line of adjacent pixel rows, and the distance between the position parameters of the two line light spots is an integer multiple of the object-space width corresponding to 0.5 pixel.
Further, the imaging module 20 includes a multi-element detector with n rows of pixels for multi-line detection imaging of the sample signal excited by the dual-color line illumination light. Because the line light spot of each channel excites only a signal of one color on the sample, n mixed images under the dual-color line illumination can be obtained by continuous scanning imaging along the first direction with the multi-element detector having n rows of pixels; each mixed image corresponds to one pixel row and contains the signals of 2 channels, where n ≥ 3 and n is a positive integer.
Further, the image demodulation module 30 subtracts the mixed images of two pixel rows that are symmetric about the line-spot position parameter of one channel and demodulates from them a monochromatic image corresponding to the signal of the other channel.
The line illumination modulation dual-color tomography system provided by the embodiment of the application uses a single multi-element detector for scanning imaging, reducing system cost and complexity compared with imaging methods that require several cameras. Detection imaging with multiple pixel rows places the images of the channels in a natural registration relationship, so simultaneous dual-color imaging is achieved without additional registration processing. Image demodulation is performed by subtracting the mixed images of two pixel rows symmetric about the line-spot position parameter of one channel, so the demodulated image eliminates the defocused background signal and inherently has tomographic capability. In fluorescence imaging, the line illumination modulation dual-color tomography system avoids emission-spectrum crosstalk by using illumination light of different wavelengths to excite fluorescent signals of different colors separately.
The monochromatic light source in the line illumination modulation dual-color tomography system refers to a light source of a single wavelength; it may be a monochromatic laser source, a monochromatic LED (light-emitting diode) source, or a monochromatic source obtained by combining a broad-spectrum source with a narrow-band filter. In addition, the line illumination modulation dual-color tomography system is suitable not only for fluorescence microscopic imaging but also for various non-fluorescence microscopic imaging scenarios (i.e., the illumination wavelength is the same as the detection wavelength); the application is not limited in this respect.
Further, the modulation light path includes a shaping light path for shaping each light beam into a line beam, a position adjustment light path for adjusting the position parameter of each line light spot, and a projection light path for superposing the line light spots to form the dual-color line illumination light.
Taking two laser light sources as an example, the optical path structure of the line illumination modulation dual-color tomography system in fluorescence microscopic imaging is described below. As shown in fig. 2, the shaping light path includes a first dichroic mirror 3 for combining the 2 laser beams, and a first beam expander 4, a second beam expander 5 and a cylindrical lens 12 arranged in sequence along the beam propagation direction after combination. The position adjustment light path includes a second dichroic mirror 6, a third dichroic mirror 7, a first reflecting mirror 8, a second reflecting mirror 9, a third reflecting mirror 10 and a fourth reflecting mirror 11. The projection light path includes an illumination tube lens 13, an objective lens 14 and a fourth dichroic mirror 17 for superposing the line light spots to form the dual-color line illumination light (in non-fluorescence imaging, the fourth dichroic mirror may be replaced by a half-reflecting, half-transmitting beam splitter, a combination of a polarizing beam splitter and a quarter-wave plate, or another device that reflects the illumination light and transmits detection light of the same wavelength as the illumination light). The line illumination modulation dual-color tomography system of the application further comprises an imaging light path, specifically including the objective lens 14, the fourth dichroic mirror 17, an emission filter 18 (not required in non-fluorescence imaging), a detection tube lens 19 and a multi-element detector 20.
Specifically, the laser beams emitted by the laser light sources of the two channels are first combined by the first dichroic mirror 3 and expanded by the first beam expander 4 and the second beam expander 5. The illumination beams of the different channels are then separated by the second dichroic mirror 6 into two paths: one path passes through the second dichroic mirror 6 and the third dichroic mirror 7, while the other path passes sequentially through the second dichroic mirror 6, the first reflecting mirror 8, the second reflecting mirror 9 and the third dichroic mirror 7, where the two paths are combined again. The light emerging from the third dichroic mirror 7 passes sequentially through the third reflecting mirror 10 and the fourth reflecting mirror 11. Finally, the combined illumination beams are shaped into line beams by the cylindrical lens 12 and projected onto the focal plane of the objective lens through the illumination tube lens 13, the fourth dichroic mirror 17 and the objective lens 14, where they excite fluorescent signals from the sample 15 on the three-dimensional motorized translation stage 16. The fluorescent signals pass sequentially through the objective lens 14, the fourth dichroic mirror 17, the emission filter 18 and the detection tube lens 19, and finally reach the multi-element detector 20 for imaging.
Specifically, adjusting the angle of either or both of the third reflecting mirror 10 and the fourth reflecting mirror 11 shifts the line-spot position parameters of both channels (denoted channel a and channel b) simultaneously, while adjusting the angle of any one or more of the second dichroic mirror 6, the first reflecting mirror 8, the second reflecting mirror 9 and the third dichroic mirror 7 shifts only the line-spot position parameter of the corresponding channel (denoted channel b) that travels along that path. Therefore, during alignment, the angle of the third reflecting mirror 10 and/or the fourth reflecting mirror 11 is adjusted first to displace the line light spot of channel a along the x direction until its position parameter lies at the center of a pixel row or on the intersection line of adjacent pixel rows. The angle of any one or more of the second dichroic mirror 6, the third dichroic mirror 7, the first reflecting mirror 8 and the second reflecting mirror 9 is then adjusted to displace the line light spot of channel b along the x direction until its position parameter also lies at the center of a pixel row or on the intersection line of adjacent pixel rows, and the distance between the position parameters of the two line light spots is an integer multiple of the object-space width corresponding to 0.5 pixel. By adjusting the position adjustment light path in this order, the line light spots of the two channels acquire a controlled misalignment along the x direction, which separates the line-spot positions of the different channels in x.
In other embodiments, the second dichroic mirror 6 and the third dichroic mirror 7 may be replaced with PBS (Polarization beam splitter, polarizing beam splitter) to perform the function of splitting light. Specifically, a half-wavelength glass slide is inserted before the laser beam combining of different wavelengths for adjusting the polarization state of the laser, so that the polarization states of the laser beams of different wavelengths after the beam combining are orthogonal, and then the light splitting and the subsequent light combining of the laser beams of different wavelengths can be realized through the PBS.
In other embodiments, the modulation light path may use linear LEDs with different emission wavelengths to directly generate line beams of different wavelengths, which are then projected onto the focal plane of the objective lens through the projection light path. The position parameters of the line light spots at the objective focal plane can be changed by adjusting the positions of the different linear LEDs.
As shown in fig. 3, the illumination intensities of the line illumination beams of channel 1 and channel 2 follow Gaussian distributions along the x direction, the imaging area of the multi-element detector is 8 pixel rows, and the line light spots focused on the focal plane of the objective lens by the different beams through the modulation light path have different position parameters. As shown in fig. 3 (A), the brightest line of the channel-1 line light spot along the x direction lies on the intersection line of the 4th and 5th pixel rows, i.e., the line-spot position parameter of channel 1 lies on the intersection line of the 4th and 5th pixel rows; similarly, the line-spot position parameter of channel 2 lies on the intersection line of the 5th and 6th pixel rows. As shown in fig. 3 (B), the line-spot position parameter of channel 1 lies at the center of the 4th pixel row, and the line-spot position parameter of channel 2 lies at the center of the 5th pixel row. As shown in fig. 3 (C), the line-spot position parameter of channel 1 lies on the intersection line of the 4th and 5th pixel rows, and the line-spot position parameter of channel 2 lies at the center of the 5th pixel row.
The multi-element detector in this embodiment may be an area CCD (Charge-coupled device) or an area CMOS (Complementary Metal Oxide Semiconductor) camera having Sub-array or ROI (Region of interest) functions, or a linear CCD or linear CMOS camera having an area mode function may be used.
Further, referring to fig. 4, the imaging module 20 includes:
a scanning unit 201, configured to continuously scan and image along a first direction by using a multi-element detector with n rows of pixels, where n is greater than or equal to 3;
an image block acquisition unit 202, which acquires the strip image block of the i-th pixel row in each frame image of a sample obtained in time sequence;
a stitching unit 203, which sequentially stitches the strip image blocks of the i-th pixel row in each frame image of the sample to obtain the mixed image of the i-th pixel row, where i ∈ {1, ..., n}.
Fig. 5 is a schematic diagram of the sample imaging acquisition process of the application, and the function of the imaging module 20 is described below in conjunction with fig. 5. To facilitate subsequent image demodulation, the imaging area of the multi-element detector in this embodiment is n pixel rows arranged along the x direction, and the sample also moves along the x direction, so that the sample is successively scanned and imaged by the different pixel rows, enabling simultaneous dual-color imaging.
During imaging acquisition, the three-dimensional motorized translation stage 16 drives the sample 15 at a uniform speed along the x direction, and the single-frame exposure time of the multi-element detector equals the time for the sample 15 to move by the object-space width of one pixel. If the image corresponding to any pixel row in one frame is taken as one strip image block, then as each part of the sample is imaged continuously it produces a sequence of strip image blocks of that pixel row in successive frames. Stitching the strip image blocks obtained in time sequence from continuous imaging yields the mixed images corresponding to the n pixel rows; each mixed image corresponds to one pixel row and contains the signals of the two channels, where n ≥ 3 and n is a positive integer. Specifically, for the i-th of the n pixel rows, the image block acquisition unit 202 acquires the strip image blocks of the i-th pixel row in each frame image of the sample obtained in time sequence, and the stitching unit 203 stitches them in sequence to obtain the mixed image of the i-th pixel row.
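The stitching step described above can be illustrated with a minimal sketch (Python/NumPy is used here only for illustration; the array layout, the function name and the optional per-row realignment are assumptions, not part of the patent):

    import numpy as np

    def build_mixed_images(frames: np.ndarray) -> np.ndarray:
        """Assemble one mixed image per pixel row from a continuous scan.

        frames: array of shape (n_frames, n_rows, n_cols); each frame is one
        exposure, during which the sample moves by one object-space pixel
        width along the scan (x) direction.
        Returns an array of shape (n_rows, n_frames, n_cols), where entry [i]
        is the mixed image of the i-th pixel row, obtained by stitching the
        strip image blocks of row i from successive frames.
        """
        n_frames, n_rows, n_cols = frames.shape
        mixed = np.empty((n_rows, n_frames, n_cols), dtype=frames.dtype)
        for i in range(n_rows):
            # strip image block of the i-th pixel row in every frame,
            # stitched in acquisition order along the scan direction
            mixed[i] = frames[:, i, :]
        # The per-row images are naturally co-registered up to a known shift
        # of i pixels along the scan axis (row i sees a given sample point i
        # frames later); np.roll(mixed[i], -i, axis=0) would compensate for
        # this deterministic shift if rows are to be compared directly.
        return mixed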
Further, the line illumination modulation dual color tomography system of the present application further comprises a driving module 40 for driving the line illumination modulation module and the sample to perform relative movement in three directions perpendicular to each other.
Specifically, fig. 6 is a three-dimensional imaging schematic of the application. The sample is divided into four layers, and each layer is divided into 4 sample strips. The three-dimensional motorized translation stage 16 drives the sample 15 along the x direction to image the first sample strip, then moves by one sample-strip width along the y direction, and then moves along the -x direction to image the second sample strip; this process is repeated to complete the imaging of the third and fourth sample strips. After the scanning imaging of the first layer is completed, the three-dimensional motorized translation stage 16 moves the sample 15 a certain distance along the z direction so that the focal plane shifts from the first layer to the second layer; the stage then again drives the sample 15 along the x direction to image the second layer of the sample. Repeating this cycle achieves three-dimensional imaging of the sample.
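A minimal sketch of this serpentine scan order, assuming hypothetical `stage` and `camera` control interfaces and arbitrary example values for the strip width and layer step (none of these names or numbers come from the patent):

    def scan_sample_3d(stage, camera, n_layers=4, n_strips=4,
                       strip_width_um=250.0, layer_step_um=50.0):
        """Serpentine acquisition over sample strips and layers.

        `stage` (x/y/z motion) and `camera` (continuous line acquisition)
        are hypothetical interfaces used only to show the scan order.
        """
        direction = +1
        for layer in range(n_layers):
            for strip in range(n_strips):
                camera.start_continuous_acquisition()
                stage.scan_x(direction)            # uniform motion along +x or -x
                camera.stop_continuous_acquisition()
                if strip < n_strips - 1:
                    stage.move_y(strip_width_um)   # step one strip width in y
                direction = -direction             # reverse x for the next strip
            if layer < n_layers - 1:
                stage.move_z(layer_step_um)        # refocus to the next layer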
The demodulation principle and process are described below mainly for the case of fluorescence imaging. The demodulation principle and process for non-fluorescence imaging are essentially the same; the main difference is the detection wavelength, which makes the effective PSF of the system slightly different, and this difference is negligible, so the method still applies.
Similar to the imaging principle of a line-scanning confocal microscope, the mixed image corresponding to each pixel row in this method is the convolution of the effective PSF (point spread function) of the system, which is the product of the illumination-system PSF and the detection-system PSF, with the fluorescent signal distribution on the sample.
Under dual-color illumination, the mixed image contains the fluorescent signals of 2 channels. Taking the fluorescent signal of a single channel as an example, the effective PSF of the jth channel in the ith pixel row is:

h_{i,j}(x, y, z) = h_illu,j(x - b_{i,j}, y, z) × h_det,j(x, y, z)   (1)

where h_illu,j denotes the illumination PSF and h_det,j denotes the detection PSF, i denotes the index of the pixel row on the imaging area, j denotes the index of the fluorescent-protein type and likewise the channel index, x, y and z denote the object-space coordinates in the three directions, and b_{i,j} denotes the distance between the ith pixel row and the line-spot position parameter of the jth channel illumination line.
Thus, the signal of the jth channel contained in the mixed image obtained by the ith pixel row of the camera can be expressed as:

I_{i,j}(x, y, z) = h_{i,j}(x, y, z) * f_j(x, y, z)   (2)

where f_j(x, y, z) denotes the relative concentration distribution of the jth fluorescent protein on the sample and "*" denotes the convolution operation. As can be seen from equation (2), the fluorescent signal of the sample is jointly modulated by the illumination-system PSF and the detection-system PSF.
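A minimal numerical sketch of the image-formation model in equations (1) and (2) follows. The Gaussian forms chosen for the illumination and detection PSFs, the sigma values and the function names are illustrative assumptions only:

    import numpy as np
    from numpy.fft import fftn, ifftn, ifftshift

    def effective_psf(shape, b_ij, sigma_illu=2.0, sigma_det=1.0):
        """Toy 2D effective PSF h_{i,j}: a Gaussian line-illumination profile
        offset by b_ij along x, multiplied by a Gaussian detection PSF
        (equation (1)); the Gaussian shapes and sigmas are assumptions."""
        ny, nx = shape
        X, Y = np.meshgrid(np.arange(nx) - nx // 2, np.arange(ny) - ny // 2)
        h_illu = np.exp(-((X - b_ij) ** 2) / (2.0 * sigma_illu ** 2))  # line along y
        h_det = np.exp(-(X ** 2 + Y ** 2) / (2.0 * sigma_det ** 2))
        return h_illu * h_det

    def mixed_image(f1, f2, b_i1, b_i2):
        """Mixed image of one pixel row: sum over the two channels of
        h_{i,j} * f_j (equation (2)), using FFT-based circular convolution."""
        out = np.zeros(f1.shape, dtype=float)
        for f, b in ((f1, b_i1), (f2, b_i2)):
            h = ifftshift(effective_psf(f.shape, b))   # move PSF centre to origin
            out += ifftn(fftn(f) * fftn(h)).real
        return out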
Further, referring to fig. 1, the image demodulation module 30 includes:
a channel selecting unit 301, configured to select one channel as a target channel and the other channel as an auxiliary channel;
a pixel row determining unit 302 for determining a first pixel row and a second pixel row symmetrical with respect to the position parameter of the auxiliary channel;
an image correction unit 303, configured to perform offset correction on the mixed image of the second pixel row;
and an image demodulation unit 304, configured to obtain the monochromatic image of the target channel from the mixed image of the first pixel row and the offset-corrected mixed image of the second pixel row.
When the two channels illuminate simultaneously, the mixed image contains the fluorescent signals of both channels. First, one channel is selected as the target channel and the other as the auxiliary channel by the channel selection unit 301. In equation (1), the intensity distribution of the illumination PSF or the detection PSF in each z plane is rotationally symmetric, so for either channel the effective PSFs of two pixel rows symmetric about that channel's line-spot position parameter have approximately the same shape (the two symmetric pixel rows have the same imaging focal depth and the same modulation intensity), and the images of that channel acquired by the two symmetric pixel rows are therefore also approximately the same.
In addition, because the line light spots of the two channels are mutually offset, the two pixel rows that are symmetric about the line-spot position parameter of the auxiliary channel receive modulations of different intensity from the line light spot of the target channel. Therefore, the pixel row determination unit 302 selects the two pixel rows symmetric about the line-spot position parameter of the auxiliary channel as the first pixel row and the second pixel row, and subtracting the mixed images of the first and second pixel rows then demodulates from the mixed images a monochromatic image corresponding to the fluorescent signal of the target channel.
Specifically, the subtraction process includes: performing offset correction on the mixed image of the second pixel row by the image correction unit 303, and then obtaining the monochromatic image of the target channel by the image demodulation unit 304 from the mixed image of the first pixel row and the offset-corrected mixed image of the second pixel row.
Taking the illumination case shown in fig. 3 (A) as an example, the line-spot position parameter of channel 1 lies on the intersection line of the 4th and 5th pixel rows, and the line-spot position parameter of channel 2 lies on the intersection line of the 5th and 6th pixel rows. Since the 5th and 6th pixel rows are symmetric about the line-spot center of channel 2, processing the mixed images of the 5th and 6th pixel rows can cancel the signal of channel 2 and thereby yield the monochromatic image of channel 1.
In some embodiments, the pixel-row pairs consisting of the 4th and 7th pixel rows, or of the 3rd and 8th pixel rows, which are also symmetric about the line-spot position parameter of channel 2, can be selected for mixed-image demodulation to obtain the monochromatic image of channel 1.
In some embodiments, the monochromatic images of channel 1 demodulated from these pixel-row pairs may also be added together to improve the signal-to-noise ratio of the demodulated image.
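As a simple illustration of this summation (the variable names below are hypothetical and the demodulated images are assumed to be arrays of identical shape):

    import numpy as np

    # g_56, g_47, g_38: channel-1 images demodulated from the symmetric row
    # pairs (5, 6), (4, 7) and (3, 8); adding them raises the signal-to-noise
    # ratio of the final channel-1 image (an average could be used instead
    # if the original intensity scale is to be preserved).
    def combine_demodulated(demodulated_images):
        return np.sum(np.stack(demodulated_images, axis=0), axis=0)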
Similarly, in fig. 3 (B) the 4th and 6th pixel rows are symmetric about the line-spot position parameter of channel 2, so processing the mixed images of the 4th and 6th pixel rows cancels the signal of channel 2 and yields the monochromatic image of channel 1. In fig. 3 (C) the 4th and 6th pixel rows are likewise symmetric about the line-spot position parameter of channel 2, so processing the mixed images of the 4th and 6th pixel rows cancels the signal of channel 2 and yields the monochromatic image of channel 1.
Further, according to equation (2), the mixed images corresponding to the 5th and 6th pixel rows can be expressed as:

I_5(x, y, z) = h_{5,1}(x, y, z) * f_1(x, y, z) + h_{5,2}(x, y, z) * f_2(x, y, z)   (3)

I_6(x, y, z) = h_{6,1}(x, y, z) * f_1(x, y, z) + h_{6,2}(x, y, z) * f_2(x, y, z)   (4)

In equations (3) and (4), the shapes of the effective PSFs of channel 2 on the 5th and 6th pixel rows, h_{5,2}(x, y, z) and h_{6,2}(x, y, z), are symmetric in three-dimensional space about the line-spot center of channel 2 (the illumination case shown in fig. 3 (A)); therefore the channel-2 components in the mixed images of the 5th and 6th pixel rows are approximately the same apart from a slight offset. To eliminate the channel-2 signal from the mixed image as completely as possible, the mixed image must be offset-corrected.
Further, as shown in fig. 7, the image correction unit 303 includes:
a convolution kernel acquisition subunit 3031, configured to acquire the offset-correction convolution kernel between the first pixel row and the second pixel row;
a convolution kernel correction subunit 3032, configured to perform offset correction on the mixed image of the second pixel row using the offset-correction convolution kernel.
In some embodiments, the offset-correction convolution kernel between the first pixel row and the second pixel row of the auxiliary channel may be calculated theoretically.
Specifically, this can be realized as follows:
S101, calculating the effective point spread function of the first pixel row and the effective point spread function of the second pixel row;
S102, performing a Fourier transform on the effective point spread function of the first pixel row and the effective point spread function of the second pixel row to obtain the optical transfer function of the first pixel row and the optical transfer function of the second pixel row;
S103, dividing the optical transfer function of the first pixel row by the optical transfer function of the second pixel row and then performing an inverse Fourier transform to obtain the offset-correction convolution kernel.
In some embodiments, the offset-correction convolution kernel may also be obtained experimentally.
Specifically, this can be realized as follows:
S201, acquiring a monochromatic image of the first pixel row and a monochromatic image of the second pixel row with only the auxiliary-channel illumination beam switched on;
S202, performing a Fourier transform on the monochromatic image of the first pixel row and the monochromatic image of the second pixel row to obtain a frequency-domain image of the first pixel row and a frequency-domain image of the second pixel row;
S203, dividing the frequency-domain image of the first pixel row by the frequency-domain image of the second pixel row and then performing an inverse Fourier transform to obtain the offset-correction convolution kernel.
The demodulation process is described in detail below, taking the illumination case shown in fig. 3 (A) as an example, with channel 1 as the target channel and channel 2 as the auxiliary channel, and selecting the mixed images of the 5th and 6th pixel rows for demodulation.
To acquire the offset-correction convolution kernel of channel 2, only the laser of channel 2 is switched on to illuminate the sample; the camera then acquires monochromatic images containing only the channel-2 fluorescent signal, and the monochromatic images acquired by the 5th and 6th pixel rows can be expressed respectively as:

I_{5,2}(x, y, z) = h_{5,2}(x, y, z) * f_2(x, y, z)   (5)

I_{6,2}(x, y, z) = h_{6,2}(x, y, z) * f_2(x, y, z)   (6)
Performing a Fourier transform on the original spatial-domain monochromatic images yields the frequency-domain images, expressed respectively as:

O_{5,2}(u, v, w) = H_{5,2}(u, v, w) × F_2(u, v, w)   (7)

O_{6,2}(u, v, w) = H_{6,2}(u, v, w) × F_2(u, v, w)   (8)

where H_{5,2}(u, v, w), H_{6,2}(u, v, w) and F_2(u, v, w) denote the Fourier transforms of h_{5,2}(x, y, z), h_{6,2}(x, y, z) and f_2(x, y, z), respectively. Dividing the frequency-domain image of the 5th pixel row by that of the 6th pixel row and then performing an inverse Fourier transform yields the offset-correction convolution kernel:

K_56(x, y, z) = IFT[ O_{5,2}(u, v, w) / O_{6,2}(u, v, w) ] = IFT[ H_{5,2}(u, v, w) / H_{6,2}(u, v, w) ]   (9)

where IFT[·] denotes the inverse Fourier transform.
When the offset-correction convolution kernel is applied to an image, the difference between the images before and after convolution is mainly a displacement along the x direction, and the signal intensity at each position of the image does not change appreciably. The offset-correction convolution kernel does not change with the sample position as long as the illumination beams do not move relative to the camera.
After the offset-correction convolution kernel is obtained, it is used to perform offset correction on the mixed image of the second pixel row. Specifically, convolving the original spatial-domain image acquired by the 6th pixel row with the offset-correction convolution kernel yields the offset-corrected spatial-domain image of the 6th pixel row:

I_6'(x, y, z) = I_6(x, y, z) * K_56(x, y, z)   (10)

Finally, subtracting the offset-corrected image of the 6th pixel row from the original spatial-domain image acquired by the 5th pixel row yields an image containing only the channel-1 fluorescent signal, i.e.:

g_1(x, y, z) = I_5(x, y, z) - I_6(x, y, z) * K_56(x, y, z) = [h_{5,1}(x, y, z) - h_{6,1}(x, y, z) * K_56(x, y, z)] * f_1(x, y, z)   (11)

where g_1(x, y, z) denotes the demodulated image of channel 1.
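A minimal sketch of equations (10) and (11), reusing the circular-convolution kernel from the previous sketch (the function and variable names are illustrative only):

    import numpy as np

    def demodulate_target_channel(I_row1, I_row2, K):
        """Demodulate the target-channel image from two mixed images.

        I_row1, I_row2: mixed images of the two pixel rows symmetric about the
        auxiliary channel's line-spot position parameter (the 5th and 6th rows
        in the example above).
        K: offset-correction convolution kernel between the two rows, defined
        for circular (FFT-based) convolution.
        Returns g = I_row1 - I_row2 * K (equation (11)): the auxiliary-channel
        signal cancels and the defocused background is suppressed.
        """
        I2_corrected = np.fft.ifftn(np.fft.fftn(I_row2) * np.fft.fftn(K)).real
        return I_row1 - I2_corrected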
In equation (11), the demodulated monochromatic image contains the modulation terms h_{5,1}(x, y, z) and h_{6,1}(x, y, z), which, following equation (1), can be written as:

h_{5,1}(x, y, z) = h_illu,1(x - b_{5,1}, y, z) × h_det,1(x, y, z)   (12)

h_{6,1}(x, y, z) = h_illu,1(x - b_{6,1}, y, z) × h_det,1(x, y, z)   (13)

where b_{5,1} and b_{6,1} are the distances between the 5th and 6th pixel rows, respectively, and the line-spot position parameter of channel 1. From these two formulas, near the focal plane h_{5,1}(x, y, z) and h_{6,1}(x, y, z) are mutually offset and their values differ significantly at different x coordinates. At out-of-focus positions, by contrast, the illumination PSF decays rapidly to uniform illumination, i.e.:

h_illu,1(x, y, z) ≈ constant, when z is at an out-of-focus position   (14)

In addition, the convolution kernel K_56(x, y, z) does not change the intensity values of the object it convolves. Therefore, at out-of-focus positions the modulation term in equation (11) becomes:

h_{5,1}(x, y, z) - h_{6,1}(x, y, z) * K_56(x, y, z) ≈ 0, when z is at an out-of-focus position   (15)
This formula shows that in the monochromatic image demodulated by equation (11), the defocused background is suppressed, so the image has tomographic (optical sectioning) capability. The line illumination modulation dual-color tomography system provided by the application can therefore achieve dual-color imaging while improving the axial resolution and tomographic capability of the imaging system, making it suitable for dual-color three-dimensional imaging of thick samples.
The above embodiments provide a method of acquiring a three-dimensional offset-correction convolution kernel and using it for dual-color signal demodulation in three-dimensional space. In some embodiments, dual-color demodulation may instead be performed on a two-dimensional plane to increase the demodulation speed.
For a single channel, the illumination PSFs corresponding to the different pixel rows of the camera differ only by a shift along the x direction, so the effective PSF changes mainly along x, and the offset-correction convolution kernel of equation (9) can be further approximated and simplified as:

K_56(x, y) = IFT[ O_{5,2}(u, v) / O_{6,2}(u, v) ] = IFT[ H_{5,2}(u, v) / H_{6,2}(u, v) ]   (16)

where O_{5,2}(u, v) and O_{6,2}(u, v) are the Fourier transforms of I_{5,2}(x, y) and I_{6,2}(x, y), respectively, and I_{5,2}(x, y) and I_{6,2}(x, y) are the two-dimensional channel-2 images acquired by the 5th and 6th pixel rows of the camera when only the channel-2 laser illuminates the sample; H_{5,2}(u, v) and H_{6,2}(u, v) are the Fourier transforms of h_{5,2}(x, y) and h_{6,2}(x, y), respectively, where h_{5,2}(x, y) and h_{6,2}(x, y) denote the two-dimensional effective PSFs (the focal-plane PSFs) of channel 2 at the 5th and 6th pixel rows. Therefore, with single-channel illumination only a single-layer two-dimensional image needs to be acquired; the two-dimensional offset-correction convolution kernel is then calculated with equation (16), and the two-dimensional image of channel 1 under dual-channel illumination can be demodulated with this kernel:

g_1(x, y) = I_5(x, y) - I_6(x, y) * K_56(x, y) = [h_{5,1}(x, y) - h_{6,1}(x, y) * K_56(x, y)] * f_1(x, y)   (17)

where I_5(x, y) and I_6(x, y) denote the two-dimensional mixed images acquired by the 5th and 6th pixel rows of the camera under dual-channel illumination; h_{5,1}(x, y) and h_{6,1}(x, y) denote the two-dimensional effective PSFs (the focal-plane PSFs) of channel 1 at the 5th and 6th pixel rows; and f_1(x, y) denotes the relative concentration distribution of the first fluorescent protein on the two-dimensional plane.
In some embodiments, the offset correction of the mixed image corresponding to the second pixel row may also be implemented by a translation operation. Referring to fig. 8, the image correction unit 303 further includes:
a translation parameter acquisition subunit 3033, configured to acquire the translation parameter between the first pixel row and the second pixel row;
and a translation correction subunit 3034, configured to perform offset correction on the mixed image of the second pixel row using the translation parameter.
Specifically, the translation can be applied directly to the mixed image of the 6th pixel row, and the monochrome image of channel 1 is then demodulated by subtracting the translated 6th-row image from the 5th-row image, namely:
g_1(x, y) = I_5(x, y) − I_6(x + d_1, y) (18)
where d_1 is the offset distance between the fluorescence signals of channel 2 in the mixed images corresponding to the 5th and 6th pixel rows.
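The translation-based variant of equation (18) can be sketched as follows. The helper assumes that x corresponds to the last array axis and that d_1 is given in pixels (sub-pixel values handled by interpolation); these conventions and the function name are assumptions for illustration.

```python
# Sketch of equation (18): g1(x, y) = I5(x, y) - I6(x + d1, y).
# Assumes the last array axis is the x (scan) direction and d1 is in pixels.
import numpy as np
from scipy.ndimage import shift

def demodulate_channel1_shift(I5: np.ndarray, I6: np.ndarray, d1: float) -> np.ndarray:
    # Sampling I6 at x + d1 shifts the image content by -d1 along x
    # (scipy.ndimage.shift convention: output[i] = input[i - shift]).
    I6_shifted = shift(I6, shift=(0.0, -d1), order=1, mode="nearest")
    return np.clip(I5 - I6_shifted, 0.0, None)
```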
The above embodiment describes the monochrome image demodulation process for channel 1; the demodulation process for channel 2 is similar. Referring to the illumination situation shown in fig. 3 (a), an offset-correction convolution kernel between the mixed images of the 4th and 5th pixel rows is first acquired under channel-1 monochrome illumination, and this kernel is then used to eliminate the offset between the channel-1 fluorescence signals in the mixed images of the 4th and 5th pixel rows under dual-channel illumination. Since the channel-2 line illumination intensity at the 5th pixel row is greater than that at the 4th pixel row, subtracting the offset-corrected mixed image of the 4th pixel row from the mixed image of the 5th pixel row demodulates the monochrome image of channel 2.
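Under the same assumptions as the sketches above, and reusing the hypothetical helper offset_correction_kernel_2d introduced there, the symmetric channel-2 demodulation could look like the following; the names I41/I51 (channel-1-only calibration images of rows 4 and 5) and I4/I5 (dual-channel mixed images of the same rows) are illustrative.

```python
# Channel-2 demodulation, symmetric to channel 1: pixel rows 4 and 5 are taken
# about the channel-1 line position. Reuses the helper sketched above.
import numpy as np
from scipy.signal import fftconvolve

def demodulate_channel2_2d(I4: np.ndarray, I5: np.ndarray,
                           I41: np.ndarray, I51: np.ndarray) -> np.ndarray:
    # Kernel from channel-1-only calibration images: align row 4's channel-1 signal to row 5.
    K45 = offset_correction_kernel_2d(I51, I41)
    I4_corrected = fftconvolve(I4, K45, mode="same")
    # Row 5 carries more channel-2 signal than row 4, so the difference is the channel-2 image.
    return np.clip(I5 - I4_corrected, 0.0, None)
```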
Because synchronous dual-color tomography is realized on a single multi-element detector, the images of all channels are naturally registered to one another and no additional registration processing is needed; a monochrome image containing only a single channel's fluorescence signal can be obtained by processing the mixed images with the demodulation algorithm, which greatly reduces system complexity.
The foregoing is merely illustrative of the present application and is not intended to limit it; various modifications, equivalent substitutions, improvements, and the like may be made within the spirit and principles of the present application.

Claims (10)

1. A line illumination modulated dual color tomography system, the line illumination modulated dual color tomography system comprising:
the line illumination modulation module comprises two monochromatic light sources and a modulation light path, the wavelengths of the light beams emitted by the two monochromatic light sources being different; each light beam passes through the modulation light path to form a line light spot focused on the focal plane of the objective lens, the illumination intensity of the line light spot on the focal plane of the objective lens is Gaussian distributed in a first direction, the first direction is perpendicular to the extending direction of the line light spot, and the position parameter of each line light spot is located at the center of a pixel row or on the boundary line between adjacent pixel rows; the distance between the position parameters of the two line light spots is an integer multiple of half a pixel width in object space, and the line light spots of the two channels are superimposed to form dual-color line illumination light;
the imaging module is used for continuously scanning and imaging along the first direction with a multi-element detector having n pixel rows to obtain n mixed images under the dual-color line illumination light, wherein each mixed image corresponds to one pixel row, each mixed image contains the signals of both channels, and n is a positive integer greater than or equal to 3;
and the image demodulation module is used for performing subtraction processing on the mixed images of two symmetrical pixel rows to demodulate a monochrome image from the mixed images, wherein the two symmetrical pixel rows are symmetrical about the position parameter of the line light spot of one channel, and the monochrome image corresponds to the signal of the other channel.
2. The line illumination modulated dual color tomography system of claim 1, wherein the image demodulation module comprises:
the channel selection unit is used for selecting one channel as a target channel and the other channel as an auxiliary channel;
a pixel row determining unit configured to determine a first pixel row and a second pixel row symmetrical with respect to a position parameter of the auxiliary channel;
an image correction unit configured to perform offset correction on the mixed image of the second pixel row;
and an image demodulation unit configured to obtain the monochrome image of the target channel from the mixed image of the first pixel row and the offset-corrected mixed image of the second pixel row.
3. The line illumination modulation dual color tomography system of claim 2, wherein the image correction unit comprises:
a convolution kernel acquisition subunit configured to acquire an offset-correction convolution kernel between the first pixel row and the second pixel row;
and a convolution kernel correction subunit configured to perform offset correction on the mixed image of the second pixel row by using the offset-correction convolution kernel.
4. The line illumination modulated dual color tomography system of claim 3, wherein the convolution kernel acquisition subunit is configured to:
acquire a monochrome image of the first pixel row and a monochrome image of the second pixel row while only the auxiliary-channel illumination beam is switched on;
perform Fourier transforms on the monochrome image of the first pixel row and the monochrome image of the second pixel row to obtain a frequency domain image of the first pixel row and a frequency domain image of the second pixel row;
and divide the frequency domain image of the first pixel row by the frequency domain image of the second pixel row, and then perform an inverse Fourier transform to obtain the offset-correction convolution kernel.
5. The line illumination modulated dual color tomography system of claim 3, wherein the convolution kernel acquisition subunit is configured to:
calculating an effective point spread function of the first pixel row and an effective point spread function of the second pixel row;
performing Fourier transformation on the effective point spread function of the first pixel row and the effective point spread function of the second pixel row to obtain an optical transfer function of the first pixel row and an optical transfer function of the second pixel row;
and dividing the optical transfer function of the first pixel row by the optical transfer function of the second pixel row, and then performing an inverse Fourier transform to obtain the offset-correction convolution kernel.
6. The line illumination modulation dual color tomography system of claim 2, wherein the image correction unit comprises:
a translation parameter obtaining subunit, configured to obtain a translation parameter between the first pixel row and the second pixel row;
and a translation correction subunit configured to perform offset correction on the mixed image of the second pixel row by using the translation parameter.
7. The line illumination modulated dual color tomography system of any of claims 1 to 6, wherein the modulation optical path comprises a shaping optical path for shaping the beam into a line beam, a position adjustment optical path for adjusting a position parameter of each line spot, and a projection optical path for superimposing the line spots to form dual color line illumination.
8. The line illumination modulation dual color tomography system of claim 7, wherein the position adjustment optical path comprises a second dichroic mirror, a third dichroic mirror, a first mirror, a second mirror, a third mirror, and a fourth mirror; the second dichroic mirror splits the incident light beam into two optical paths, one optical path passing through the second dichroic mirror and the third dichroic mirror in sequence, the other optical path passing through the second dichroic mirror, the first mirror, the second mirror, and the third dichroic mirror in sequence; the light exiting the third dichroic mirror passes through the third mirror and the fourth mirror in sequence.
9. The line illumination modulated dual color tomography system of any of claims 1 to 6, wherein the imaging module comprises:
a scanning unit configured to continuously scan and image along a first direction with a multi-element detector having n pixel rows, where n is greater than or equal to 3;
an image block acquisition unit configured to acquire the strip image block of the i-th pixel row from each frame image of a sample obtained in time sequence;
and a splicing unit configured to splice the strip image blocks of the i-th pixel row from the successive frame images in sequence to obtain the mixed image of the i-th pixel row, where i ∈ {1, 2, …, n}.
10. The line illumination modulated dual color tomography system of any of claims 1 to 6, further comprising a drive module for driving relative movement between the line illumination modulation module and the sample in three mutually perpendicular directions.
CN202310411101.7A 2023-04-17 2023-04-17 Line illumination modulation double-color tomography system Pending CN116593434A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310411101.7A CN116593434A (en) 2023-04-17 2023-04-17 Line illumination modulation double-color tomography system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310411101.7A CN116593434A (en) 2023-04-17 2023-04-17 Line illumination modulation double-color tomography system

Publications (1)

Publication Number Publication Date
CN116593434A true CN116593434A (en) 2023-08-15

Family

ID=87588908

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310411101.7A Pending CN116593434A (en) 2023-04-17 2023-04-17 Line illumination modulation double-color tomography system

Country Status (1)

Country Link
CN (1) CN116593434A (en)

Similar Documents

Publication Publication Date Title
EP2817670B1 (en) Multi-focal structured illumination microscopy systems and methods
JP2021165851A (en) Configuration for light sheet microscopy, and method therefor
US10809514B2 (en) Low resolution slide imaging and slide label imaging and high resolution slide imaging using dual optical paths and a single imaging sensor
US10156711B2 (en) Multi-focal structured illumination microscopy systems and methods
US10955650B2 (en) Two pass macro image
US11106026B2 (en) Scanning microscope for 3D imaging using MSIA
US11086113B2 (en) Multi-focal structured illumination microscopy systems and methods
JP2005292839A (en) Slit confocal microscope and its operation method
ES2928577T3 (en) 2D and 3D Fixed Z-Scan
WO2018226836A1 (en) Multi-focal structured illumination microscopy systems and methods
US20200358946A1 (en) Methods and Systems for Single Frame Autofocusing Based on Color-Multiplexed Illumination
CN116593434A (en) Line illumination modulation double-color tomography system
CN116539575A (en) Line illumination modulation polychrome tomography system
CN111650739B (en) Single-frame exposure rapid three-dimensional fluorescence imaging system and method based on DMD
AU2019373533B2 (en) High-throughput optical sectioning imaging method and imaging system
CN116609305A (en) Linear illumination modulation multicolor imaging system
US20220373777A1 (en) Subpixel line scanning
JP2019204026A (en) Microscope system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination