CN113267480B - High-precision real-time drift correction method and system based on phase image - Google Patents

High-precision real-time drift correction method and system based on phase image

Info

Publication number
CN113267480B
Authority
CN
China
Prior art keywords
image
drift
images
fluorescence
phase
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110703134.XA
Other languages
Chinese (zh)
Other versions
CN113267480A (en)
Inventor
黄振立
商明涛
周志伟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huazhong University of Science and Technology
Original Assignee
Huazhong University of Science and Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huazhong University of Science and Technology filed Critical Huazhong University of Science and Technology
Priority to CN202110703134.XA priority Critical patent/CN113267480B/en
Publication of CN113267480A publication Critical patent/CN113267480A/en
Application granted granted Critical
Publication of CN113267480B publication Critical patent/CN113267480B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/62 Systems in which the material investigated is excited whereby it emits light or causes a change in wavelength of the incident light
    • G01N21/63 Systems in which the material investigated is excited whereby it emits light or causes a change in wavelength of the incident light optically excited
    • G01N21/64 Fluorescence; Phosphorescence
    • G01N21/645 Specially adapted constructive features of fluorimeters
    • G01N21/6456 Spatial resolved fluorescence measurements; Imaging
    • G01N21/6458 Fluorescence microscopy
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/62 Systems in which the material investigated is excited whereby it emits light or causes a change in wavelength of the incident light
    • G01N21/63 Systems in which the material investigated is excited whereby it emits light or causes a change in wavelength of the incident light optically excited
    • G01N21/64 Fluorescence; Phosphorescence
    • G01N21/6402 Atomic fluorescence; Laser induced fluorescence
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N2201/00 Features of devices classified in G01N21/00
    • G01N2201/12 Circuits of general importance; Signal processing
    • G01N2201/127 Calibration; base line adjustment; drift compensation

Landscapes

  • Health & Medical Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Biochemistry (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Chemical & Material Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • General Health & Medical Sciences (AREA)
  • General Physics & Mathematics (AREA)
  • Immunology (AREA)
  • Pathology (AREA)
  • Optics & Photonics (AREA)
  • Investigating, Analyzing Materials By Fluorescence Or Luminescence (AREA)
  • Microscopes, Condenser (AREA)

Abstract

The invention discloses a high-precision real-time drift correction method and system based on phase images, and belongs to the technical field of fluorescence imaging. The method comprises the following steps: S1, before imaging, acquire three reference images and two experimental images; S2, perform template matching between the two experimental images and the three reference images one by one, obtain the correlations ζ_u and ζ_d of the two experimental images from the matching results, and fit the linear relation k between the correlation ζ and the Z-axis position; S3, acquire one initial image I_0, then acquire a drift image I_z after every N frames of fluorescence images, obtain the three-dimensional drift amount by template matching against the three reference images, and apply one compensation correction, repeating until the required number of fluorescence image frames has been acquired. The method requires no special sample preparation: real-time drift correction is achieved simply by acquiring fluorescence images and drift images alternately, the procedure is very simple, and only minimal modification of the imaging system is needed.

Description

High-precision real-time drift correction method and system based on phase image
Technical Field
The invention belongs to the field of fluorescence imaging, and particularly relates to a high-precision real-time drift correction method and system based on a phase image.
Background
In the field of fluorescence imaging, many imaging scenarios require the acquisition of a large number of fluorescence image frames for analysing and studying a sample. The exposure time of each frame is generally 10-50 ms, so acquiring a large number of frames for research means that the whole imaging process lasts a long time. During this continuous acquisition, factors such as mechanical vibration and thermal expansion of the imaging system cause the sample to drift, that is, the sample and the objective lens move relative to each other, so that structures imaged at different times appear at different positions in the fluorescence images. This degrades the imaging quality, and it is therefore necessary to correct drift during the imaging process.
The background is further illustrated with super-resolution imaging, a microscopic imaging method that substantially breaks through the diffraction limit and improves the resolution of an optical imaging system by nearly one order of magnitude. The method reconstructs a super-resolution image through sparse activation of molecules and high-precision localization of single molecules; the process generally requires acquiring thousands or even tens of thousands of fluorescence image frames, and the whole imaging process lasts tens of seconds or even tens of minutes. Since the localization precision of single molecules, and hence the resulting imaging resolution, is on the nanometre scale, the system must remain extremely stable during imaging. As mentioned above, mechanical vibration and thermal expansion of the imaging system cause the sample to drift, and defocusing of the sample directly reduces the single-molecule localization precision, thereby degrading the quality of the final reconstructed super-resolution image. The drift of the sample can reach tens or even hundreds of nanometres, which is fatal for super-resolution localization imaging with nanometre resolution.
Existing drift correction methods include the following. The fiducial-based correction method requires fiducial markers such as fluorescent beads to be placed on the slide, which makes sample preparation more complicated, and the random distribution of the fiducials makes it difficult to guarantee a suitable number of fiducial points in the field of view. Other methods estimate the three-dimensional position of the sample by analysing structural features of the sample: the method based on diffraction images of intracellular circular microstructures requires circular microstructures of suitable size to be present in the field of view, and the method based on cross-correlation analysis of the backscattered laser speckle pattern of the sample requires the EM to be repeatedly inserted and removed during imaging in order to obtain the backscattered laser spot, which makes the imaging process very complex. Therefore, drift correction methods in the prior art often place special requirements on the imaging system, on sample preparation or on the imaging procedure, and increase the difficulty of imaging.
Disclosure of Invention
In view of the above defects or improvement needs of the prior art, the present invention provides a high-precision real-time drift correction method and system based on phase images, aiming to solve the technical problem that drift correction in long-duration fluorescence imaging scenarios in the prior art places high demands on the imaging system or on sample preparation.
To achieve the above object, according to one aspect of the present invention, there is provided a phase image-based high-precision real-time drift correction method, including the steps of:
S1, before imaging, acquire one reference image R_f at the focal plane position, one reference image R_p and one experimental image I_u at a position Z_u above the focal plane, and one reference image R_n and one experimental image I_d at a position Z_d below the focal plane;
S2, perform template matching between the two experimental images I_u, I_d and the three reference images R_f, R_p, R_n one by one, obtain the correlations ζ_u and ζ_d of the two experimental images from the matching results, and fit the linear relation k between the correlation ζ and the Z-axis position;
S3, start fluorescence imaging of the sample. Acquire one initial image I_0 and perform template matching between it and the reference image R_f to obtain the relative displacement between the initial image and R_f; after acquiring N frames of fluorescence images, acquire a drift image I_z and perform template matching between it and the reference image R_f to obtain the relative displacement between the drift image and R_f. The difference of the two relative displacements is the transverse drift amount between the drift image and the initial image. Perform template matching between the drift image as well as the initial image and the three reference images of S1 one by one, obtain the correlation ζ_z of the drift image and the correlation ζ_0 of the initial image from the matching results, and compute the longitudinal drift amount Δz = (ζ_z − ζ_0) / k;
and perform one compensation correction according to the transverse drift amount and the longitudinal drift amount, repeating until the number of acquired fluorescence image frames meets the requirement.
In this technical scheme, the initial three-dimensional position of the sample is obtained from the first initial image frame by template matching; a drift image is then acquired after every N frames of fluorescence images, the three-dimensional position of the sample after drift is obtained by template matching, and from it the three-dimensional drift amount of the sample, so that one compensation can be applied. These steps are repeated until the number of acquired fluorescence image frames meets the requirement. Correction and imaging alternate throughout the imaging process, the correction requires no special sample preparation, real-time drift correction is achieved simply by acquiring fluorescence images and drift images alternately, the whole procedure is very simple, and only minimal modification of the imaging system is required.
According to another aspect of the invention, a high-precision real-time drift correction system based on phase images is provided, which comprises a three-dimensional nano displacement stage for carrying the sample;
the two symmetrically distributed LEDs illuminate the sample by the LED light from different illumination angles to form bright field images at different angles and acquire phase images;
the laser module generates exciting light to be projected onto the sample so as to excite fluorescence to form a fluorescence image;
the detection module is used for acquiring a phase image and a fluorescence image;
the image processing module executes the method to obtain the transverse drift amount and the longitudinal drift amount;
and the control module is used for controlling the illumination of the two LEDs, the acquisition of the detection module and the movement of the three-dimensional nanometer displacement platform, and also receiving the transverse drift amount and the longitudinal drift amount so as to control the three-dimensional nanometer displacement platform to carry out drift compensation.
In this technical scheme, phase images with higher contrast are obtained through the two symmetric LEDs, which reduces the dependence on the sample structure and makes the method applicable to more structures; and because the LEDs, the three-dimensional nano displacement stage and the detection module are controlled by the same control module, phase images and fluorescence images can be acquired alternately more conveniently, and real-time drift compensation is performed during imaging.
Drawings
FIG. 1 is a flow chart of a drift correction method in the present application;
FIG. 2 is a first schematic diagram of a drift correction system of the present application;
fig. 3 is a second schematic diagram of the drift correction system of the present application.
In the figures: 1. three-dimensional nano translation stage; 2. sample; 3. LED; 4. base; 5. excitation light; 6. detection module; 7. control module; 8. condenser lens; 9. objective lens; 10. detector; 11. beam splitter.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention. In addition, the technical features involved in the embodiments of the present invention described below may be combined with each other as long as they do not conflict with each other.
As shown in fig. 1, the present invention provides a high-precision real-time drift correction method based on phase images, which includes the following steps:
S1, before imaging, acquire one reference image R_f at the focal plane position, one reference image R_p and one experimental image I_u at a position Z_u above the focal plane, and one reference image R_n and one experimental image I_d at a position Z_d below the focal plane;
S2, perform template matching between the two experimental images I_u, I_d and the three reference images R_f, R_p, R_n one by one, obtain the correlations ζ_u and ζ_d of the two experimental images from the matching results, and fit the linear relation k between the correlation ζ and the Z-axis position;
S3, start fluorescence imaging of the sample. Acquire one initial image I_0 and perform template matching between it and the reference image R_f to obtain the relative displacement between the initial image and R_f; after acquiring N frames of fluorescence images (the value of N depends on how frequently a drift correction needs to be performed, and is generally on the order of 100 frames), acquire a drift image I_z and perform template matching between it and the reference image R_f to obtain the relative displacement between the drift image and R_f. The difference of the two relative displacements is the transverse drift amount between the drift image and the initial image. Perform template matching between the drift image as well as the initial image and the three reference images of S1 one by one, obtain the correlation ζ_z of the drift image and the correlation ζ_0 of the initial image from the matching results, and compute the longitudinal drift amount Δz = (ζ_z − ζ_0) / k;
and perform one compensation correction according to the transverse drift amount and the longitudinal drift amount, repeating until the number of acquired fluorescence image frames meets the requirement.
In this method, a drift image is acquired after every N frames of fluorescence images, the three-dimensional position of the sample after drift is then obtained by template matching, and from it the three-dimensional drift amount of the sample, so that one compensation is applied; this is repeated until the number of acquired fluorescence image frames meets the requirement. Correction and imaging alternate throughout the imaging process, no special sample preparation is needed, only fluorescence images and drift images need to be acquired alternately during imaging, the whole procedure is very simple, and only minimal modification of the imaging system is required. The method can be applied to super-resolution imaging scenarios as well as to other fluorescence imaging scenarios with long continuous acquisition.
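To make the alternation between fluorescence acquisition and drift correction concrete, the following is a minimal Python sketch of the acquisition loop under the scheme described above. The camera, stage and correction callables (acquire_phase_image, acquire_fluorescence_frame, estimate_drift, stage.move_relative) are hypothetical placeholders for illustration only; they are not interfaces defined in this application.

```python
def imaging_with_drift_correction(n_total_frames, N,
                                  acquire_fluorescence_frame,
                                  acquire_phase_image,
                                  estimate_drift, stage, alpha=0.9):
    """Alternately acquire N fluorescence frames and one drift (phase) image,
    then compensate the measured three-dimensional drift on the stage.
    All callables and the 'stage' object are hypothetical interfaces."""
    frames = []
    I0 = acquire_phase_image()                 # initial image I_0
    while len(frames) < n_total_frames:
        for _ in range(min(N, n_total_frames - len(frames))):
            frames.append(acquire_fluorescence_frame())
        Iz = acquire_phase_image()             # drift image I_z
        dx, dy, dz = estimate_drift(I0, Iz)    # three-dimensional drift amount
        # compensate with a coefficient smaller than 1 (see the discussion below)
        stage.move_relative(-alpha * dx, -alpha * dy, -alpha * dz)
    return frames
```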
Specifically, the reference images, the experimental images, the initial image and the drift image are all phase images. A phase image is obtained as follows: two bright-field images L_1 and L_2 are captured from different illumination angles, and the phase image is computed as P = (L_1 − L_2) / (L_1 + L_2).
Calculating the phase image from the two bright-field frames improves the image contrast and reduces the dependence on the sample structure, making the method applicable to more structures; a good correction effect can still be guaranteed for structures whose bright-field contrast is low.
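As an illustration of this step, here is a minimal NumPy sketch that computes the phase image as the normalized difference of the two bright-field frames (the differential-phase-contrast form assumed above); the function name and the regularization constant eps are illustrative choices, not part of the patent.

```python
import numpy as np

def dpc_phase_image(L1, L2, eps=1e-6):
    """Phase image from two bright-field frames L1, L2 taken under opposite
    illumination angles, as the normalized difference (L1 - L2) / (L1 + L2).
    eps avoids division by zero in dark regions (illustrative choice)."""
    L1 = np.asarray(L1, dtype=np.float64)
    L2 = np.asarray(L2, dtype=np.float64)
    return (L1 - L2) / (L1 + L2 + eps)
```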
Further, the linear relation k in S2 is the slope between the correlation ζ and the Z-axis position, k = (ζ_u − ζ_d) / (Z_u − Z_d).
Further, the correlation is defined as ζ_i = [max(C_{p,i}) − max(C_{n,i})] / max(C_{f,i}), where C_{p,i} denotes the template matching result of R_p and I_i, C_{n,i} denotes the template matching result of R_n and I_i, C_{f,i} denotes the template matching result of R_f and I_i, i denotes u, d, z or 0, and max(·) denotes the maximum value of the matching matrix. It can be seen that in this application the experimental image I_u and the reference image R_p are at the same Z-axis position Z_u, and the experimental image I_d and the reference image R_n are at the same Z-axis position Z_d; two phase images acquired at the same Z-axis position do not differ in content, but because of noise the values of corresponding pixels differ slightly, so performing template matching between these corresponding phase images makes the fitted slope k more accurate.
Specifically, the template matching result of the experimental image I_u and the reference image R_p is C_{p,u}, that of I_u and R_n is C_{n,u}, and that of I_u and R_f is C_{f,u}, so the correlation of the experimental image I_u is ζ_u = [max(C_{p,u}) − max(C_{n,u})] / max(C_{f,u}).
The template matching result of the experimental image I_d and the reference image R_p is C_{p,d}, that of I_d and R_n is C_{n,d}, and that of I_d and R_f is C_{f,d}, so the correlation of the experimental image I_d is ζ_d = [max(C_{p,d}) − max(C_{n,d})] / max(C_{f,d}).
The template matching result of the drift image I_z and the reference image R_p is C_{p,z}, that of I_z and R_n is C_{n,z}, and that of I_z and R_f is C_{f,z}, so the correlation of the drift image I_z is ζ_z = [max(C_{p,z}) − max(C_{n,z})] / max(C_{f,z}).
The template matching result of the initial image I_0 and the reference image R_p is C_{p,0}, that of I_0 and R_n is C_{n,0}, and that of I_0 and R_f is C_{f,0}, so the correlation of the initial image I_0 is ζ_0 = [max(C_{p,0}) − max(C_{n,0})] / max(C_{f,0}).
When the axial drift amount is obtained by template matching, the maximum value of the template matching result represents the similarity of the two images. This similarity is related to the distance between the Z-axis positions of the two images: the larger the distance, the smaller the similarity. However, the similarity peak decreases regardless of whether the image drifts upwards or downwards, so from the similarity alone the direction of the Z-axis displacement between the drift image and the initial image cannot be judged during correction. A correlation parameter ζ related to the similarity is therefore constructed, with the general form ζ = [max(C_p) − max(C_n)] / max(C_f).
In this way ζ is close to linear in the Z-axis position, which establishes the correspondence between the Z-axis position and the similarity in the template matching results. Specifically, before imaging starts, the two experimental images are template-matched with the three reference images, the two ζ values are obtained, and the Z-axis positions corresponding to the two experimental images are recorded; since ζ and the Z-axis position are linearly related, the correspondence between the Z-axis position and the similarity is established by calculating the slope between them. During optical imaging, the axial drift amount is then obtained simply by computing the two ζ values of the drift image and the initial image and dividing their difference by the slope k.
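A minimal sketch of how the correlation ζ, the slope k and the axial drift could be computed from the template-matching results, assuming ζ takes the ratio form given above (maxima of the matching matrices against R_p, R_n and R_f); the function and variable names are illustrative.

```python
import numpy as np

def correlation_zeta(C_p, C_n, C_f):
    """zeta = (max C_p - max C_n) / max C_f from the matching matrices of one
    image against the reference images R_p, R_n and R_f."""
    return (np.max(C_p) - np.max(C_n)) / np.max(C_f)

def fit_slope_k(zeta_u, zeta_d, Z_u, Z_d):
    """Slope of the (approximately linear) relation between zeta and the Z-axis
    position, from the two experimental images I_u (at Z_u) and I_d (at Z_d)."""
    return (zeta_u - zeta_d) / (Z_u - Z_d)

def axial_drift(zeta_z, zeta_0, k):
    """Longitudinal drift between the drift image and the initial image."""
    return (zeta_z - zeta_0) / k
```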
More specifically, template matching of two images is performed with an existing method: the image R is scanned over the image I from the upper left corner to the lower right corner of I, the overlapping region of the two images is evaluated at each position, and a two-dimensional matrix C is obtained, in which the maximum value and the pixel where it is located represent, respectively, the similarity and the relative displacement between the two images. In this application, image R corresponds to the reference images R_f, R_p, R_n, and image I corresponds to the experimental images I_u, I_d, the initial image I_0 and the drift image I_z. The template matching is computed as the zero-mean normalized cross-correlation
C(x, y) = Σ_{x',y'} [R(x', y') − mean(R)] · [I(x + x', y + y') − mean(I_R)] / sqrt( Σ_{x',y'} [R(x', y') − mean(R)]² · Σ_{x',y'} [I(x + x', y + y') − mean(I_R)]² ),
where mean(R) denotes the average value of all pixels of the image R, I_R denotes the overlapping portion of image I and image R, mean(I_R) denotes the average value of I_R, (x, y) are the coordinates of a pixel in image I and traverse the whole image, and (x', y') are coordinates within R and I_R and likewise traverse the whole overlap. Since the amount of computation in template matching is large, as mentioned above, it is preferable to scan only a sub-region of pixels near the central position and to cut out only part of the pixels for the calculation, which greatly reduces the amount of computation; in that case the formula above is evaluated with R and I replaced by the scanned sub-regions R' and I'.
Template matching requires the similarity of the two images to be evaluated pixel by pixel at every relative scanning position, so the amount of computation is very large. In conventional super-resolution imaging, the drift of the sample between two corrections generally does not exceed a few tens of nanometres, and the pixel size of the system is generally larger than this drift, i.e. the drift does not exceed one pixel. Therefore, it is preferable to scan only the region of pixels near the centre position and to cut out only an image of part of the pixels for the calculation, which greatly reduces the amount of computation. In other imaging scenarios, the scanning and calculation regions can be adapted to the actual drift so as to reduce the amount of computation and increase the correction speed.
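The following sketch restricts the template matching to a small central sub-region, as suggested above. It relies on OpenCV's cv2.matchTemplate with the TM_CCOEFF_NORMED method, which computes the zero-mean normalized cross-correlation written above; the crop sizes correspond to the illustrative values mentioned later in the description (a 100 × 100 pixel template, a search range of ±7 pixels) and would need adjusting to the actual drift range.

```python
import numpy as np
import cv2

def center_crop(img, size):
    """Crop a size x size window around the image centre."""
    cy, cx = img.shape[0] // 2, img.shape[1] // 2
    h = size // 2
    return img[cy - h: cy - h + size, cx - h: cx - h + size]

def match_cropped(R, I, template_size=100, search=7):
    """Zero-mean normalized cross-correlation of a central crop of the
    reference image R against a slightly larger central crop of image I,
    evaluating only (2*search+1)^2 relative shifts."""
    templ = center_crop(np.asarray(R, dtype=np.float32), template_size)
    window = center_crop(np.asarray(I, dtype=np.float32), template_size + 2 * search)
    # C has shape (2*search+1, 2*search+1); its maximum gives the similarity and
    # the position of the maximum gives the integer-pixel relative displacement.
    return cv2.matchTemplate(window, templ, cv2.TM_CCOEFF_NORMED)
```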
Further, since the drift does not exceed one pixel and the template matching result is a two-dimensional matrix, the relative displacement read directly from it has at best single-pixel precision. Therefore, in S3, a two-dimensional Gaussian fit is applied to the template matching result between the initial image and the reference image R_f and to the template matching result between the drift image and R_f before the relative displacements are extracted, so that the transverse drift amount is obtained with sub-pixel precision.
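A sketch of the sub-pixel peak extraction by two-dimensional Gaussian fitting of the matching matrix, using scipy.optimize.curve_fit; the window radius and the initial guesses are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import curve_fit

def subpixel_displacement(C, r=2):
    """Fit a 2-D Gaussian around the maximum of the matching matrix C and
    return the peak position (dx, dy) relative to the centre of C, in pixels,
    with sub-pixel precision."""
    iy, ix = np.unravel_index(np.argmax(C), C.shape)
    # small window around the integer peak, clipped to the matrix edges
    y0, y1 = max(iy - r, 0), min(iy + r + 1, C.shape[0])
    x0, x1 = max(ix - r, 0), min(ix + r + 1, C.shape[1])
    win = C[y0:y1, x0:x1].astype(np.float64)
    ys, xs = np.mgrid[y0:y1, x0:x1]

    def gauss2d(coords, A, xc, yc, sx, sy, offset):
        x, y = coords
        return (A * np.exp(-(x - xc) ** 2 / (2 * sx ** 2)
                           - (y - yc) ** 2 / (2 * sy ** 2)) + offset).ravel()

    p0 = (win.max() - win.min(), float(ix), float(iy), 1.0, 1.0, float(win.min()))
    popt, _ = curve_fit(gauss2d, (xs, ys), win.ravel(), p0=p0, maxfev=5000)
    xc, yc = popt[1], popt[2]
    return xc - (C.shape[1] - 1) / 2.0, yc - (C.shape[0] - 1) / 2.0
```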
Further, in S3, the transverse drift amount and the longitudinal drift amount are multiplied by a coefficient smaller than 1 before compensation. This is because the measured drift amount carries a calculation error, and overestimating it would cause the sample to oscillate around the target position; multiplying the drift amount by a coefficient smaller than 1 before compensating ensures that the sample always approaches the target position. The preferred range of the coefficient is 0.7 to 0.9.
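A small illustration of the damped compensation described above; the coefficient alpha stands for the 0.7-0.9 factor, and the sign convention (moving the stage opposite to the measured drift) is an assumption about how the stage is driven.

```python
def compensation_step(drift_xyz, alpha=0.9):
    """Scale the measured drift by a coefficient smaller than 1 before moving
    the stage, so that an overestimated drift does not make the sample
    oscillate around the target position; returns the stage correction."""
    dx, dy, dz = drift_xyz
    return (-alpha * dx, -alpha * dy, -alpha * dz)
```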
The invention also provides a high-precision real-time drift correction system based on phase images, which comprises:
the three-dimensional nanometer displacement platform 1 is used for bearing a sample 2;
two symmetrically distributed LEDs 3, which illuminate the sample 2 with LED light from different illumination angles to form bright-field images at the different illumination angles, from which phase images are acquired;
the laser module generates exciting light 5 to be projected onto the sample 2 so as to excite fluorescence to form a fluorescence image;
the detection module 6 is used for acquiring a phase image and a fluorescence image;
the image processing module executes the method to obtain the transverse drift amount and the longitudinal drift amount;
and the control module 7 is used for controlling the illumination of the two LEDs 3, the acquisition of the detection module 6 and the movement of the three-dimensional nanometer displacement table 1, and also receiving the transverse drift amount and the longitudinal drift amount so as to control the three-dimensional nanometer displacement table 1 to perform drift compensation.
The LEDs 3, the three-dimensional nano displacement stage 1 and the detection module 6 are controlled by the same control module 7, so that phase images and fluorescence images can be acquired alternately more conveniently for drift compensation. Before imaging begins, the control module 7 also controls the three-dimensional nano translation stage 1 to move the sample 2 to the focal plane position and to the positions above and below the focal plane, and the three reference images and two experimental images are acquired, so that the image processing module can obtain the transverse drift amount and the longitudinal drift amount by the method described above.
Because the bright-field image is obtained with LED illumination, it is difficult to guarantee a good correction effect for structures with low contrast; therefore differential phase contrast imaging is used to obtain the phase image for correction, which makes the method applicable to sample structures with low contrast. In differential phase contrast imaging, when the numerical aperture of the LED illumination is large, both low-frequency and high-frequency structures obtain a good contrast improvement, and the improvement is most pronounced for structures whose phase gradient is perpendicular to the symmetry axis of the LED array. Here the phase image is acquired with two symmetric LEDs, and the symmetry axis of the two LEDs is at 45° to the horizontal direction, so that the contrast of both x-axis and y-axis structures is improved by single-axis phase imaging. In addition, to obtain a high-quality phase image, the distance between the two LEDs needs to be adjusted. Specifically, the two LEDs 3 are soldered onto a base 4, and the distance between them is determined by the positions of the solder points; the farther each LED is from the centre of the base, the larger its illumination numerical aperture, but if the distance of the beam from the optical axis exceeds the clear aperture of the condenser lens 8, the light can no longer be collected by the objective lens 9. The distance between the two LEDs is therefore adjusted according to this principle to make the illumination numerical aperture as large as possible. Moreover, when acquiring LED images, the mean value of the bright-field image is calculated and compared with that of the corresponding reference image in order to verify that the LEDs respond correctly.
Further, as shown in fig. 2 and 3, the detection module 6 may include one detector 10 or two detectors 10. With one detector 10, the LED light and the fluorescence are collected by the same detector 10, so their wavelengths must be relatively close, and the wavelength difference Δλ between them should satisfy 0 ≤ Δλ ≤ 20 nm, i.e. the emission spectra of the two essentially overlap. With two detectors 10, the detection module further includes a beam splitter 11 for separating the LED light and the fluorescence, so their wavelengths must be distinguishable, and the wavelength difference Δλ should satisfy Δλ ≥ 60 nm, i.e. the emission spectra of the two must not overlap at all.
In a typical configuration of this application, with Z_u = +300 nm (and Z_d correspondingly below the focal plane), a straight-line distance between the two LEDs of about 4 mm and a system pixel size of 100 nm, only a sub-region of 15 × 15 pixels near the centre position is scanned and only a 100 × 100 pixel image is cut out for the template matching calculation; the transverse drift amount and the longitudinal drift amount are multiplied by a coefficient of 0.9 before the drift correction is applied. A correction precision of about 5 nm in all three dimensions can be achieved, and one drift correction takes about 240 ms, so drift correction can be performed once every 1-2 s with an acceptable time overhead. Compared with the prior art, the system is simple and easy to implement, sample preparation is simple, the method is applicable to more structures, and the robustness is very high.
It will be understood by those skilled in the art that the foregoing is only a preferred embodiment of the present invention, and is not intended to limit the invention, and that any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the scope of the present invention.

Claims (10)

1. A high-precision real-time drift correction method based on a phase image is characterized by comprising the following steps:
S1, before imaging, acquiring one reference image R_f at the focal plane position, one reference image R_p and one experimental image I_u at a position Z_u above the focal plane, and one reference image R_n and one experimental image I_d at a position Z_d below the focal plane;
S2, performing template matching between the experimental image I_u and each of the three reference images R_f, R_p, R_n one by one, performing template matching between the experimental image I_d and each of the three reference images R_f, R_p, R_n one by one, obtaining the correlations ζ_u and ζ_d of the two experimental images I_u, I_d from the matching results, and fitting the linear relation k between the correlation ζ and the Z-axis position;
S3, starting fluorescence imaging of the sample, acquiring one initial image I_0, performing template matching between the initial image and the reference image R_f to obtain the relative displacement between the initial image and R_f, acquiring a drift image I_z after acquiring N frames of fluorescence images, and performing template matching between the drift image and the reference image R_f to obtain the relative displacement between the drift image and R_f, the difference of the two relative displacements being the transverse drift amount between the drift image and the initial image; performing template matching between the drift image as well as the initial image and the three reference images of S1 one by one, obtaining the correlation ζ_z of the drift image and the correlation ζ_0 of the initial image from the matching results, the longitudinal drift amount being Δz = (ζ_z − ζ_0) / k;
and performing one compensation according to the transverse drift amount and the longitudinal drift amount until the number of acquired fluorescence image frames meets the requirement.
2. The phase image-based high-precision real-time drift correction method according to claim 1, wherein the reference image, the experimental image, the initial image and the drift image are phase images.
3. The phase image-based high-precision real-time drift correction method according to claim 2, wherein the phase image is obtained by the following specific steps: capturing two bright-field images L_1 and L_2 from different illumination angles, the phase image being P = (L_1 − L_2) / (L_1 + L_2).
4. The phase image-based high-precision real-time drift correction method according to claim 1, wherein the linear relation k in S2 is the slope between the correlation ζ and the Z-axis position, k = (ζ_u − ζ_d) / (Z_u − Z_d).
5. The phase image-based high-precision real-time drift correction method according to claim 4, wherein the correlation is ζ_i = [max(C_{p,i}) − max(C_{n,i})] / max(C_{f,i}), where C_{p,i} denotes the template matching result of R_p and I_i, C_{n,i} denotes the template matching result of R_n and I_i, C_{f,i} denotes the template matching result of R_f and I_i, and i denotes u, d, z, 0.
6. The phase image-based high-precision real-time drift correction method according to claim 1, wherein in S3 the relative displacements are obtained after two-dimensional Gaussian fitting of the template matching result between the initial image and the reference image R_f and of the template matching result between the drift image and the reference image R_f, so that the transverse drift amount is obtained with sub-pixel precision.
7. The method for high-precision real-time drift correction based on phase images according to claim 1, wherein in step S3, the compensation is performed after the lateral drift amount and the longitudinal drift amount are multiplied by a coefficient smaller than 1.
8. A high-precision real-time drift correction system based on phase images is characterized by comprising:
the three-dimensional nano displacement platform is used for bearing a sample;
the two symmetrically distributed LEDs illuminate the sample by the LED light from different illumination angles to form bright field images at different angles and acquire phase images;
the laser module generates exciting light to be projected onto the sample so as to excite fluorescence to form a fluorescence image;
the detection module is used for acquiring a phase image and a fluorescence image;
an image processing module for performing the method of any one of claims 1-7 to obtain an amount of lateral drift and an amount of longitudinal drift;
and the control module is used for controlling the illumination of the two LEDs, the acquisition of the detection module and the movement of the three-dimensional nanometer displacement platform, and also receiving the transverse drift amount and the longitudinal drift amount so as to control the three-dimensional nanometer displacement platform to carry out drift compensation.
9. The phase image-based high-precision real-time drift correction system according to claim 8, wherein the symmetry axes of the two LEDs are at an angle of 45 ° to the horizontal.
10. The phase image-based high-precision real-time drift correction system according to claim 9, wherein the detection module comprises one detector, the LED light and the fluorescence are collected by the detector, and the difference Δλ between the wavelength of the LED light and the wavelength of the fluorescence satisfies 0 ≤ Δλ ≤ 20 nm;
or the detection module comprises two detectors and a beam splitter, the LED light and the fluorescence are split by the beam splitter onto the two detectors and collected respectively, and the difference Δλ between the wavelength of the LED light and the wavelength of the fluorescence satisfies Δλ ≥ 60 nm.
CN202110703134.XA 2021-06-24 2021-06-24 High-precision real-time drift correction method and system based on phase image Active CN113267480B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110703134.XA CN113267480B (en) 2021-06-24 2021-06-24 High-precision real-time drift correction method and system based on phase image

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110703134.XA CN113267480B (en) 2021-06-24 2021-06-24 High-precision real-time drift correction method and system based on phase image

Publications (2)

Publication Number Publication Date
CN113267480A CN113267480A (en) 2021-08-17
CN113267480B true CN113267480B (en) 2022-05-20

Family

ID=77235817

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110703134.XA Active CN113267480B (en) 2021-06-24 2021-06-24 High-precision real-time drift correction method and system based on phase image

Country Status (1)

Country Link
CN (1) CN113267480B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116503471B (en) * 2023-06-28 2023-08-29 南开大学 Single-molecule positioning imaging drift correction method and system for K-time neighbor position cloud picture

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6242282B2 (en) * 2014-04-30 2017-12-06 日本電子株式会社 Drift amount calculation device, drift amount calculation method, and charged particle beam device
AU2017294789B2 (en) * 2016-07-14 2022-06-16 Commonwealth Scientific And Industrial Research Organisation Apparatus for measuring spectra
CN109035162B (en) * 2018-07-06 2021-10-19 南京大学 Picture drift correction method and system based on pixel reconstruction
CN109685877B (en) * 2018-12-27 2022-11-25 重庆大学 Micro-nano CT focus drift correction method based on adaptive projection image characteristic region matching
CN111879798B (en) * 2020-06-19 2023-02-24 中国人民解放军战略支援部队信息工程大学 Nano CT projection position drift correction method and device based on acquisition sequence subdivision

Also Published As

Publication number Publication date
CN113267480A (en) 2021-08-17

Similar Documents

Publication Publication Date Title
CN109269438B (en) Structured light illumination microscopic measurement method for detecting multilayer complex micro-nano structure
US10477097B2 (en) Single-frame autofocusing using multi-LED illumination
US10462351B2 (en) Fast auto-focus in imaging
JP4806630B2 (en) A method for acquiring optical image data of three-dimensional objects using multi-axis integration
JP6637653B2 (en) Microscope and SPIM microscopy method
US10502943B2 (en) Microscope system and autofocusing method
JP5784435B2 (en) Image processing apparatus, fluorescence microscope apparatus, and image processing program
CN111912835B (en) LIBS device and LIBS method with ablation measuring function
CN110849289A (en) Double-camera parallel confocal differential microscopic 3D morphology measurement device and method
CN107690595A (en) For by be illuminated with different light angles come image record apparatus and method
CN113267480B (en) High-precision real-time drift correction method and system based on phase image
US11067510B2 (en) System and method for estimating and compensating for sample drift during data acquisition in fluorescence microscopy
CN116183568B (en) High-fidelity reconstruction method and device for three-dimensional structured light illumination super-resolution microscopic imaging
JP2007140322A (en) Optical apparatus
CN109995998B (en) Automatic focusing method suitable for scanning/transmission electron microscope imaging
CN108700732A (en) Method for the height and position for determining object
US20190285401A1 (en) Determining the arrangement of a sample object by means of angle-selective illumination
CN107490566A (en) Airy beam mating plate illumination microscopic imaging device based on binary optical elements
Yi et al. A parallel differential confocal method for highly precise surface height measurements
US10211024B1 (en) System and method for axial scanning based on static phase masks
CN110231320B (en) Sub-millisecond real-time three-dimensional super-resolution microscopic imaging system
CN110455797A (en) Metallographic microscope matrix normalization bearing calibration
TWI286197B (en) 2/3-dimensional synchronous image capturing system and method
JP4381687B2 (en) Total reflection fluorescence microscope
KR102484708B1 (en) Correction method of three-dimensional image based on multi-wavelength light

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant