CN108051897B - Microscopic imaging system and real-time focusing method - Google Patents


Info

Publication number
CN108051897B
CN108051897B (application CN201810045883.6A)
Authority
CN
China
Prior art keywords
image
sample
light source
focusing
focusing light
Prior art date
Legal status
Active
Application number
CN201810045883.6A
Other languages
Chinese (zh)
Other versions
CN108051897A (en)
Inventor
梅蓉 (Mei Rong)
郭亮 (Guo Liang)
余乐 (Yu Le)
Current Assignee
Ningbo Sunny Instruments Co Ltd
Original Assignee
Ningbo Sunny Instruments Co Ltd
Priority date
Filing date
Publication date
Application filed by Ningbo Sunny Instruments Co Ltd filed Critical Ningbo Sunny Instruments Co Ltd
Priority to CN201810045883.6A
Publication of CN108051897A
Application granted
Publication of CN108051897B
Legal status: Active

Classifications

    • G: Physics
    • G02: Optics
    • G02B: Optical elements, systems or apparatus
    • G02B 7/00: Mountings, adjusting means, or light-tight connections, for optical elements
    • G02B 7/28: Systems for automatic generation of focusing signals
    • G02B 7/36: Systems for automatic generation of focusing signals using image sharpness techniques, e.g. image processing techniques for generating autofocus signals
    • G02B 21/00: Microscopes
    • G02B 21/06: Means for illuminating specimens
    • G02B 21/36: Microscopes arranged for photographic purposes or projection purposes or digital imaging or video purposes including associated control and data processing arrangements

Abstract

The invention relates to a microscopic imaging system and a real-time focusing method. The microscopic imaging system comprises: a main light source; a condenser lens; an imaging unit, the condenser lens being positioned between the main light source and the imaging unit; a stage located between the imaging unit and the condenser lens; an imaging camera for acquiring the image formed by the imaging unit; a central control unit connected to the imaging camera; a driving device for adjusting the distance between the imaging unit and the stage; and focusing light sources, even in number and at least two, positioned on the same side of the condenser lens as the main light source. In a defocused state, the system applies autocorrelation processing to the defocused image, generates a light intensity distribution map from the resulting defocused autocorrelation image, and obtains a pixel difference between the primary and secondary intensity maxima, thereby achieving real-time focusing of the microscopic imaging system.

Description

Microscopic imaging system and real-time focusing method
Technical Field
The present disclosure relates to imaging systems, and particularly to a microscopic imaging system and a real-time focusing method.
Background
During microscopic imaging of a sliced specimen, the objective lens must keep the specimen clearly imaged at all times, so focusing is a prerequisite in a microscopic imaging system. Because the slice surface is uneven and the optical axis of the microscopic imaging system is not perfectly perpendicular to the stage, among other reasons, the imaging surface can drift beyond the depth of focus, causing defocus and blurred imaging. The traditional scanning technique scans multiple times along the Z axis to obtain multiple pictures and finds the optimal MTF from the gray-level difference or edge sharpness of the pictures; the Z-axis position corresponding to that value is the focal position. Each focusing operation is carried out independently of the sample position, but computing the MTF value for each scanned image takes a long time, and the more scans are needed, the more time is consumed, seriously affecting working efficiency.
Disclosure of Invention
The invention aims to provide a microscopic imaging system and a real-time focusing method that solve the problem of low focusing efficiency in microscopic imaging systems.
To achieve the above object, the present invention provides a microscopic imaging system comprising:
a primary light source;
a condenser lens;
an imaging unit, wherein the condenser lens is positioned between the main light source and the imaging unit;
a stage located between the imaging unit and the condenser;
an imaging camera for acquiring the image formed by the imaging unit;
the central control unit is connected with the imaging camera;
a driving device for adjusting the distance between the imaging unit and the stage; and
focusing light sources, even in number and at least two, wherein the focusing light sources and the main light source are positioned on the same side of the condenser lens.
According to one aspect of the invention, the number of focusing light sources is two, and the two focusing light sources are located on the focal plane of the condenser lens;
the focusing light sources are arranged symmetrically on the two sides of the optical axis of the condenser lens.
According to one aspect of the invention, the number of focusing light sources is four, and the four focusing light sources are located on the focal plane of the condenser lens;
each pair of focusing light sources is arranged symmetrically on the two sides of the optical axis of the condenser lens.
According to an aspect of the present invention, the imaging unit includes an objective lens and a converging lens;
the distance d between the two symmetrically arranged focusing light sources, the focal length f of the condenser lens, and the numerical aperture NA of the objective lens satisfy the formula: d / (2f) = tan(arcsin(NA)).
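As a quick numerical illustration of this constraint (a sketch only; the focal length and numerical aperture below are example values, not taken from the patent), the required source spacing d can be computed as:

```python
import math

def source_spacing(f_mm: float, na: float) -> float:
    """Spacing d between the two symmetric focusing light sources,
    from the constraint d / (2 f) = tan(arcsin(NA))."""
    return 2.0 * f_mm * math.tan(math.asin(na))

# Example values (assumed, not from the patent): f = 50 mm condenser
# focal length, NA = 0.25 objective numerical aperture.
print(round(source_spacing(50.0, 0.25), 2))  # -> 25.82 (mm)
```

The spacing grows with both the condenser focal length and the objective NA, which matches the intent of placing the sources at the edge of the objective's acceptance cone.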
According to one aspect of the invention, the focusing light source is an LED light source.
According to one aspect of the invention, the focusing light source is a single-color green cold light source or a single-color blue cold light source.
According to one aspect of the invention, the drive means is a linear drive means.
In order to achieve the above object, the present invention provides a real-time focusing method, including:
S1, turning off the main light source, turning on the focusing light sources, capturing an image of the sample on the stage, and obtaining from this image the distance between the sample image and a sample offset image in the autocorrelation image;
S2, moving the sample on the stage to the focal position of the objective lens based on the defocus relation curve and the obtained distance.
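A minimal sketch of this two-step loop, assuming a previously calibrated defocus relation curve (the calibration numbers below are invented for illustration):

```python
import numpy as np

# Hypothetical calibration data for the defocus relation curve:
# defocus amount (um) versus measured peak spacing (pixels).
curve_defocus_um = np.array([0.0, 5.0, 10.0, 15.0, 20.0, 25.0, 30.0])
curve_pixel_diff = np.array([5.0, 17.0, 29.0, 41.0, 53.0, 65.0, 77.0])

def defocus_from_pixel_diff(measured_px: float) -> float:
    """S2: look the measured spacing up on the defocus relation curve."""
    return float(np.interp(measured_px, curve_pixel_diff, curve_defocus_um))

# S1 would yield the spacing between the sample image and a sample offset
# image from the autocorrelated camera frame; assume 41 px was measured.
print(defocus_from_pixel_diff(41.0))  # -> 15.0
```

The driving device would then move the stage (or objective) by the returned defocus amount to restore focus, with no Z-axis scanning required.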
According to one aspect of the invention, the defocus relation curve is generated by the following steps:
S01, turning on the main light source so that the sample on the stage is located at the focal position of the objective lens;
S02, turning off the main light source, turning on the focusing light sources, adjusting the distance between the objective lens and the stage by a set first interval, and acquiring a defocused image of the sample;
S03, generating from the defocused image a defocused autocorrelation image containing the sample image and two sample offset images, and obtaining the distance between the sample image and a sample offset image;
S04, repeating steps S02-S03, and obtaining the defocus relation curve from the distances between the sample image and the sample offset image and the first interval.
According to one aspect of the present invention, before step S01 the position of the focusing light source must also be corrected, which includes:
S001, turning on the main light source so that the sample on the stage is located at the focal position of the objective lens;
S002, turning off the main light source, turning on the focusing light sources, and acquiring an in-focus image of the sample;
S003, adjusting the distance between the objective lens and the stage by a second interval, and acquiring a defocused image of the sample;
S004, generating from the defocused image a defocused autocorrelation image containing a sample image and two sample offset images, and correcting the position of the focusing light source according to the positions of the sample image and the two sample offset images.
According to one aspect of the invention, the second spacing is in the range of 1 micron to 30 microns.
With the above scheme, by employing the focusing light sources the microscopic imaging system can, in a defocused state, apply autocorrelation processing to the defocused image, generate a light intensity distribution map from the resulting defocused autocorrelation image, and obtain the pixel difference between the primary intensity maximum and the secondary intensity maxima in that map. Obtaining the pixel difference from the primary and secondary maxima is fast and accurate, so real-time focusing of the microscopic imaging system is achieved, focusing time is saved, and the focusing precision and efficiency of the system are ensured.
According to the scheme of the invention, the focusing light source is simple in structure and convenient to assemble and disassemble. With this arrangement, installing the focusing light sources changes the microscopic imaging system only slightly, ensuring both the imaging stability and the structural stability of the system. At the same time, the focusing light source of the invention is low-cost and easy to implement.
Drawings
FIG. 1 schematically illustrates a block diagram of a microscopic imaging system according to one embodiment of the present invention;
FIG. 2 schematically illustrates an in-focus image of a sample of a microscopy imaging system in accordance with one embodiment of the invention;
FIG. 3 schematically illustrates a sample defocus image of a microscopy imaging system according to one embodiment of the present invention;
FIG. 4 schematically illustrates a light intensity profile of an out-of-focus image of a microscopy imaging system in accordance with one embodiment of the invention;
FIG. 5 schematically illustrates a graph of the defocus relation curve of a microscopy imaging system according to one embodiment of the invention.
Detailed Description
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings that are required to be used in the embodiments will be briefly described below. It is apparent that the drawings in the following description are only some embodiments of the present invention, and that other drawings may be obtained from these drawings without inventive effort for a person of ordinary skill in the art.
In describing embodiments of the present invention, the terms "longitudinal," "transverse," "upper," "lower," "front," "rear," "left," "right," "vertical," "horizontal," "top," "bottom," "inner," "outer," and the like are used in terms of orientation or positional relationship shown in the drawings for convenience of description and simplicity of description only, and do not denote or imply that the devices or elements in question must have a particular orientation, be constructed and operated in a particular orientation, so that the above terms are not to be construed as limiting the invention.
The present invention will be described in detail below with reference to the drawings and the specific embodiments, which are not described in detail herein, but the embodiments of the present invention are not limited to the following embodiments.
As shown in fig. 1, the microscopic imaging system of the present invention includes a main light source 1, a condenser lens 2, an imaging unit 3, a focusing light source 4, a stage 5, an imaging camera 6, a central control unit 7, and a driving device 8, according to an embodiment of the present invention. In the present embodiment, the optical axis of the condenser lens 2 and the optical axis of the main light source 1 overlap each other, and the condenser lens 2 is located above the main light source 1. The optical axis of the imaging unit 3 coincides with the optical axis of the condenser lens 2, and the imaging unit 3 is located above the condenser lens 2. The stage 5 is located between the imaging unit 3 and the condenser lens 2. The optical axis of the imaging camera 6 coincides with the optical axis of the imaging unit 3, and the imaging camera 6 is located above the imaging unit 3. The central control unit 7 is connected to the imaging camera 6 and the driving device 8, respectively. An image taken by the imaging camera 6 can be acquired by the central control unit 7, and the distance between the stage 5 and the imaging unit 3 is adjusted by the driving device 8 controlled by the central control unit 7, enabling the imaging unit 3 to perform focusing. In the present embodiment, the driving device 8 is a linear driving device, and may be a linear motor or a ball screw transmission mechanism.
As shown in fig. 1, according to one embodiment of the present invention, the imaging unit 3 includes an objective lens 31 and a converging lens 32. In the present embodiment, the optical axis of the objective lens 31 coincides with the optical axis of the condenser lens 2, and the objective lens 31 is located above the condenser lens 2. The optical axis of the converging lens 32 coincides with the optical axis of the objective lens 31, and the converging lens 32 is located above the objective lens 31. In the present embodiment, the sample A on the stage 5 is imaged by the objective lens 31, and the resulting image is transmitted to the imaging camera 6 by the converging lens 32.
As shown in fig. 1, according to one embodiment of the present invention, the focusing light source 4 is located below the condenser lens 2. The focusing light sources 4 have an even number, and there are at least two focusing light sources 4. In the present embodiment, the number of focusing light sources 4 is two. The two focusing light sources 4 are respectively located on focal planes below the condenser lens 2. In the present embodiment, the focusing light source 4 is an LED light source, a single green cold light source, or a single blue cold light source.
As shown in fig. 1, according to one embodiment of the present invention, the focusing light sources 4 are disposed on opposite sides of the main light source 1. In the present embodiment, a space is left between the position of each focusing light source 4 and the optical axis of the condenser lens 2. The two focusing light sources 4 are disposed opposite each other, and the straight-line distance between each focusing light source 4 and the optical axis of the main light source 1 is equal, i.e. the two focusing light sources 4 are arranged symmetrically on both sides of the optical axis of the condenser lens 2. In the present embodiment, denoting the distance between the two focusing light sources 4 symmetric about the optical axis of the condenser lens 2 as d, the focal length of the condenser lens 2 as f, and the numerical aperture of the objective lens 31 as NA, the arrangement satisfies the formula d / (2f) = tan(arcsin(NA)). With this arrangement, when the sample A on the stage 5 is in a defocused state, the central control unit 7 can acquire the image of sample A formed by the imaging unit 3 through the imaging camera 6 and perform autocorrelation processing on the acquired image to obtain the sample image of sample A and the surrounding sample offset images, ensuring the focusing effect of the microscopic imaging system.
According to another embodiment of the present invention, there are four focusing light sources 4, each located on the focal plane below the condenser lens 2. In the present embodiment, the focusing light source 4 is an LED light source, a single-color green cold light source, or a single-color blue cold light source.
According to another embodiment of the invention, the focusing light sources 4 are arranged on opposite sides of the main light source 1, with a space between each focusing light source 4 and the optical axis of the condenser lens 2. The focusing light sources 4 are grouped in pairs: the two light sources of a pair are disposed opposite each other at equal straight-line distances from the optical axis of the main light source 1, i.e. symmetrically on both sides of the optical axis of the condenser lens 2, and each pair is arranged symmetrically about that axis. The four focusing light sources 4 may be arranged in a line or at annular intervals about the optical axis of the condenser lens 2. As before, denoting the distance between the two focusing light sources 4 of a symmetric pair as d, the focal length of the condenser lens 2 as f, and the numerical aperture of the objective lens 31 as NA, the arrangement satisfies the formula d / (2f) = tan(arcsin(NA)). With this arrangement, when the sample A on the stage 5 is in a defocused state, the central control unit 7 can acquire the image of sample A formed by the imaging unit 3 through the imaging camera 6 and perform autocorrelation processing on it to obtain the sample image of sample A and the surrounding sample offset images, ensuring the focusing effect of the microscopic imaging system. Of course, there may also be six, eight, or more focusing light sources 4, arranged in the same manner, which is not repeated here.
To explain the invention further, the real-time focusing method of the imaging system will now be described in detail with reference to the accompanying drawings.
According to one embodiment of the present invention, the real-time focusing method of the present invention includes:
S1, turning off the main light source 1, turning on the focusing light sources 4, capturing an image of the sample on the stage 5, and obtaining from this image the distance between the sample image and a sample offset image in the autocorrelation image;
S2, moving the sample on the stage 5 to the focal position of the objective lens 31 based on the defocus relation curve and the obtained distance.
According to one embodiment of the present invention, in the process of focusing the sample on the stage 5 in real time according to the real-time focusing method of the present invention, the position of the focusing light source 4 needs to be adjusted first, so that the acquired image is easy to measure, and the accuracy of the focusing process is ensured.
According to one embodiment of the present invention, the step of correcting the position of the focusing light source 4 includes:
s001. the main light source 1 is turned on to position the sample on the stage 5 at the focal position of the objective lens 31. In the present embodiment, the main light source 1 is turned on to image the sample a on the stage 5, and the central control unit 7 acquires a sample image of the sample a through the imaging camera 6. The central control unit 7 controls the driving device 8 to move according to a preset MTF curve, adjusts the distance between the objective table 5 and the objective lens 31, enables the sample A on the objective table 5 to be positioned at the focus position of the objective lens 31, and the imaging camera 6 acquires a clear sample image. In the present embodiment, the driving device 8 may drive the stage 5 or the objective lens 31 to move in the vertical direction, so that the sample a on the stage 5 is located at the focal position of the objective lens 31.
S002, turning off the main light source 1, turning on the focusing light source 4, and obtaining an in-focus image of the sample. In the present embodiment, the main light source 1 is turned off, the focusing light source 4 is turned on, and an in-focus image of the sample a on the stage 5 at the focal position of the objective lens 31 is acquired by the imaging camera 6. In the present embodiment, the main light source 1 is turned off, the focusing light source 4 positioned at the focal plane position below the condenser lens 2 is turned on, and the focusing light source 4 emits oblique light oblique to the optical axis of the condenser lens 2. The position of the focusing light source 4 is fine-tuned so that the in-focus image acquired in the imaging camera 6 is the sharpest (i.e. the bright spot in fig. 2).
S003. the distance between the objective lens 31 and the stage 5 is adjusted at a second interval, and an out-of-focus image of the sample is acquired. In the present embodiment, the stage 5 is moved relative to the objective lens 31 by the driving device 8 at a second interval in the range of 1 to 30 micrometers. An image of the sample a on the stage 5 is taken by the imaging camera 6, and the image obtained at this time is an out-of-focus image of the sample a when it is deviated from the focal position of the objective lens 31.
S004, generating an out-of-focus autocorrelation image with a sample image and two sample offset images according to the out-of-focus image, and correcting the position of the focusing light source 4 according to the position between the sample image and the two sample offset images. In the present embodiment, the central control unit 7 acquires an out-of-focus image captured by the imaging camera 6, and performs an autocorrelation process on the acquired out-of-focus image, thereby acquiring an out-of-focus autocorrelation image having three bright spots. As can be seen from comparing fig. 2 and fig. 3, fig. 3 has three bright spots, wherein the middle bright spot is a sample image of the sample a, which corresponds to the bright spot in fig. 2, i.e. the in-focus image of the sample a, and the bright spots on both sides are sample offset images generated by the sample a in an out-of-focus offset state.
In this embodiment, the defocused image can be expressed by the following formula:
z[x] = s[x] + s[x - x0]
where z[x] denotes the defocused image, s[x] denotes one part of the defocused image, i.e. the sample image (in-focus image) of sample A, s[x - x0] denotes the other part of the defocused image, i.e. the sample offset image of sample A, and x0 denotes the offset (i.e. the second interval).
From the above formula, the defocused image also satisfies a convolution relationship between two functions, which can be expressed as:
z[x] = s[x] * h[x]
where z[x] denotes the defocused image, s[x] the sample image (in-focus image) of sample A, and h[x] the transfer function.
From this relation it can be seen that, to recover the offset x0, it suffices to apply autocorrelation processing to the defocused image to obtain the defocused autocorrelation image; the autocorrelation of the defocused image is expressed as:
R[z[x]] = R[s[x]] * R[h[x]] = R[s[x]] * (2δ[x] + δ[x - x0] + δ[x + x0])
where R[z[x]] denotes the autocorrelation of the defocused image, R[s[x]] the autocorrelation of the sample image (in-focus image) of sample A, R[h[x]] the autocorrelation of the transfer function, δ[x] the Dirac delta function, δ[x - x0] the Dirac delta shifted right by x0, and δ[x + x0] the Dirac delta shifted left by x0.
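The offset-recovery idea can be sketched in one dimension with NumPy (all values below are illustrative; the real system works on camera frames):

```python
import numpy as np

# A defocused frame z[x] = s[x] + s[x - x0]: the in-focus bright spot plus
# a copy shifted by x0 pixels, as produced by the oblique focusing light.
x0 = 25                                        # offset to be recovered
x = np.arange(256)
spot = np.exp(-0.5 * ((x - 100) / 3.0) ** 2)   # in-focus spot s[x]
z = spot + np.roll(spot, x0)                   # defocused image z[x]

# Autocorrelation via the Wiener-Khinchin theorem: R = IFFT(|FFT(z)|^2).
R = np.fft.ifft(np.fft.fft(z) * np.conj(np.fft.fft(z))).real
R = np.fft.fftshift(R)                         # put zero lag at the centre

# R has a primary maximum at zero lag (the 2*delta term) and secondary
# maxima at lags +-x0, exactly as in the relation above.
center = len(R) // 2
side = center + 10 + int(np.argmax(R[center + 10:]))  # skip the central lobe
print(side - center)  # -> 25
```

The spacing between the central peak and a side peak directly yields x0, i.e. the quantity the method relates to the defocus amount.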
In the present embodiment, the positions of the three bright spots (one sample image and two sample offset images) are obtained from the defocused autocorrelation image, together with their relation to the CCD image plane in the imaging camera 6, i.e. the angle between the line through the centers of the three bright spots and the CCD image plane. Based on this relation, the position of the focusing light source 4 is adjusted until the line through the three bright-spot centers is parallel to the CCD image plane: the two sample offset images in the defocused autocorrelation image are aligned with the center of the sample image, and the line through their centers is parallel to the CCD image plane, i.e. parallel to the edge of the defocused autocorrelation image. This makes the subsequent measurement of the pixel offset between the two sample offset images and the sample image convenient and the calculation simpler.
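The tilt measurement in this correction step can be sketched numerically: fit a line through the three bright-spot centroids and read off its angle relative to the CCD rows (all coordinates below are invented example values):

```python
import math
import numpy as np

# Hypothetical centroids (column, row) of the two sample offset images
# and the sample image in the defocused autocorrelation image.
spots = np.array([[100.0, 118.0],   # left offset image
                  [128.0, 128.0],   # sample image (centre)
                  [156.0, 138.0]])  # right offset image

# Fit row = a*col + b; a nonzero slope a means the line through the spots
# is tilted relative to the CCD image plane, so the focusing light source
# position should be adjusted until the tilt is driven toward zero.
a, b = np.polyfit(spots[:, 0], spots[:, 1], 1)
tilt_deg = math.degrees(math.atan(a))
print(round(tilt_deg, 1))  # tilt angle in degrees; 0.0 means aligned
```

Once the fitted tilt is zero, the peak spacing can be measured along a single image row, which is what makes the later pixel-difference measurement simple.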
According to one embodiment of the present invention, once the correction of the position of the focusing light source 4 is completed, the defocus relation curve is obtained as follows:
s01, the main light source 1 is turned on, so that the sample on the object stage 5 is located at the focus position of the objective lens 31. In the present embodiment, when correction of the position of the focus light source 4 is completed, the focus light source 4 is turned off, and the main light source 1 is turned back on. The sample a on the stage 5 is imaged, and the central control unit 7 acquires a sample image of the sample a through the imaging camera 6. The central control unit 7 controls the driving device 8 to move according to a preset MTF curve, adjusts the distance between the objective table 5 and the objective lens 31, enables the sample A on the objective table 5 to be positioned at the focus position of the objective lens 31, and the imaging camera 6 acquires a clear sample image.
S02, turning off the main light source 1, turning on the focusing light source 4, adjusting the distance between the objective lens 31 and the objective table 5 at a set first interval, and acquiring an out-of-focus image of the sample. In the present embodiment, the main light source 1 is turned off, the focusing light source 4 is turned on, and an in-focus image of the sample a on the stage 5 at the focal position of the objective lens 31 is acquired by the imaging camera 6 (see fig. 2).
S03, generating a defocused autocorrelation image with a sample image and two sample offset images according to the defocused image, and acquiring the distance between the sample image and the sample offset images. In the present embodiment, the stage 5 is moved by the drive device 8 at a first interval with respect to the objective lens 31. An image of the sample a on the stage 5 is taken by the imaging camera 6, and the image obtained at this time is an out-of-focus image of the sample a when it is deviated from the focal position of the objective lens 31. In the present embodiment, the first interval is set to 1 μm. The position of the focusing light source 4 is fixed, and the central control unit 7 controls the driving device 8 to drive the stage 5 or the objective lens 31 to move by 1 μm in the vertical direction. The central control unit 7 acquires a sample defocus image of the sample a through the imaging camera 6.
In the present embodiment, the central control unit 7 acquires an out-of-focus image captured by the imaging camera 6, and performs an autocorrelation process on the acquired out-of-focus image, thereby acquiring an out-of-focus autocorrelation image having three bright spots. The middle bright spot is a sample image of the sample a, which corresponds to the bright spot in fig. 2, i.e. the in-focus image of the sample a, and the bright spots on both sides are sample offset images generated by the sample a in an out-of-focus offset state.
According to one embodiment of the present invention, the step S03 includes:
S031, performing a Fourier transform on the acquired defocused image to obtain its power spectrum;
S032, performing an inverse Fourier transform on the power spectrum to obtain an intermediate image;
S033, performing autocorrelation processing on the intermediate image to generate a defocused light intensity distribution map;
S034, obtaining the distance between the two side bright spots (the two sample offset images) and the middle bright spot (the sample image) from the pixel difference between the primary intensity maximum and the secondary intensity maxima in the defocused light intensity distribution map.
In this embodiment, the process of generating the defocused autocorrelation image by autocorrelation of the defocused image is the same as described above and is not repeated. Two sample offset images appear on either side of the sample image in the defocused autocorrelation image. A defocused light intensity distribution map is generated from the defocused autocorrelation image (see fig. 4): secondary intensity maxima C, corresponding to the sample offset images (the side bright spots), appear on both sides of the primary intensity maximum B, which corresponds to the sample image (the middle bright spot). The distance between the side bright spots and the middle bright spot is obtained by calculating the distance between the primary maximum B and a secondary maximum C. In this embodiment, the distance between B and only one secondary maximum C may be used, or the distances between B and both secondary maxima C may be obtained and averaged; alternatively, only the distance between the two secondary maxima C may be measured.
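The peak-spacing measurement of step S034 can be sketched on a synthetic one-dimensional intensity profile (the peak positions, widths, and `guard` parameter below are invented for illustration):

```python
import numpy as np

def peak_distance(profile, guard=10):
    """Pixel distance between the primary maximum B and the secondary
    maxima C, averaging the two sides as the text suggests."""
    b = int(np.argmax(profile))              # primary maximum B
    left = profile[:max(b - guard, 0)]       # exclude the central lobe
    right = profile[b + guard:]
    d_left = b - int(np.argmax(left)) if left.size else None
    d_right = int(np.argmax(right)) + guard if right.size else None
    dists = [d for d in (d_left, d_right) if d is not None]
    return sum(dists) / len(dists)

# Synthetic profile: primary peak B at 128, secondary peaks C at 128 +- 25.
x = np.arange(256)
g = lambda c, a: a * np.exp(-0.5 * ((x - c) / 3.0) ** 2)
profile = g(128, 2.0) + g(103, 1.0) + g(153, 1.0)
print(peak_distance(profile))  # -> 25.0
```

Averaging the two B-to-C distances, as the embodiment allows, reduces the effect of noise on a single side peak.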
S04, repeating steps S02-S03, and acquiring a defocus relation curve from the distance between the sample image and the sample offset image and the first interval.
According to one embodiment of the present invention, in step S04 the central control unit 7 controls the driving device 8 to move the objective lens 31 or the stage 5 several times (two, three or more times) in the same direction at a distance interval of 1 μm, and acquires an out-of-focus image of sample A through the imaging camera 6 after each movement of the driving device 8. Out-of-focus autocorrelation images (two, three or more) are generated from the acquired out-of-focus images, and a corresponding number of defocus intensity distribution maps are generated from them.
In the present embodiment, the central control unit 7 acquires 30 out-of-focus images of sample A through the imaging camera 6, i.e., the total travel of the driving device 8 in the same direction is 30 μm. All 30 acquired defocused images are autocorrelated to generate defocus autocorrelation images, from which 30 intensity distribution maps are obtained. From these 30 maps, the pixel differences between the secondary intensity maximum C and the primary intensity maximum B (i.e., the distance between the sample image and one sample offset image) are acquired. As shown in fig. 5, coordinate points D are plotted in rectangular coordinates with the defocus amount (i.e., the travel of the driving device 8 at the first interval, or the distance of sample A from the focal point of the objective lens 31) against the pixel difference, and the plotted coordinate points D are fitted with a curve to generate the defocus relation curve E.
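The calibration described here, 30 (defocus, pixel difference) pairs fitted with a straight line, is a plain least-squares fit. A sketch with stand-in measurements (the slope 2.5 and offset 4.0 are purely illustrative, not values from the patent):

```python
import numpy as np

# Illustrative calibration data: defocus amounts at 1 um steps and the
# B-to-C pixel difference measured at each step (values are stand-ins).
defocus_um = np.arange(1, 31, dtype=float)       # 30 steps of 1 um
pixel_diff = 2.5 * defocus_um + 4.0              # stand-in measurements

# Straight-line fit of defocus against pixel difference: the defocus
# relation curve E of fig. 5, y = k*x + b.
k, b = np.polyfit(pixel_diff, defocus_um, 1)
```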
In this embodiment, the defocus relation curve E generated by fitting satisfies:
y = kx + b
where y is the defocus amount (i.e. the travel of the driving device 8 at the first interval) and x is the pixel difference (i.e. the pixel difference between the secondary intensity maximum C and the primary intensity maximum B, or equivalently the distance between the sample image and one sample offset image). The values of k and b are solved by fitting the curve to the measured pixel differences. Once k and b are known, substituting a measured pixel difference x into the formula yields the corresponding defocus amount.
According to one embodiment of the invention, after the fitting is completed and the defocus relation curve has been obtained, the microscopic imaging system of the invention can focus on sample A in real time. In step S1 of this embodiment, the main light source 1 is turned off and the focusing light source 4 is turned on during real-time focusing. The central control unit 7 acquires an image of sample A on the stage 5 through the imaging camera 6 and performs autocorrelation processing on the acquired image. From the autocorrelated image, the central control unit 7 obtains the distance between the sample image and the sample offset image (i.e., the pixel difference between the secondary intensity maximum C and the primary intensity maximum B).
According to one embodiment of the present invention, in step S2, the defocus amount between sample A and the objective lens 31 in the current microscopic imaging system, that is, the distance of sample A from the focal point of the objective lens 31 (equivalently, the travel of the driving device 8 at the first interval), is obtained from the measured distance between the sample image and the sample offset image and the defocus relation curve E. According to the obtained defocus amount, the central control unit 7 controls the driving device 8 to adjust the distance between the objective lens 31 and the stage 5, so that sample A on the stage 5 always remains at the focal point of the objective lens 31, realizing real-time focusing of the imaging system.
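Step S2 then reduces to evaluating the fitted relation and commanding the drive. A sketch in which `move_stage` stands in for the central control unit 7 commanding the driving device 8 (all names are illustrative):

```python
def drive_to_focus(pixel_diff, k, b, move_stage):
    """Convert a measured pixel difference x into a defocus amount via
    y = k*x + b, then command the drive to cancel it (step S2)."""
    defocus = k * pixel_diff + b     # defocus amount y
    move_stage(-defocus)             # move objective/stage to cancel it
    return defocus
```

Repeating this measure-and-correct step keeps the sample at the focal point as it drifts, which is the real-time focusing loop the embodiment describes.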
According to the invention, the focusing light source 4 has a simple structure and is easy to install and remove. With this arrangement, installing the focusing light source 4 changes the microscopic imaging system only slightly, preserving both the imaging stability and the structural stability of the system. At the same time, the focusing light source 4 is inexpensive and easy to implement.
According to the invention, by adopting the focusing light source 4, the microscopic imaging system can obtain the pixel difference between bright points in the autocorrelation of an image acquired in the out-of-focus state, and real-time focusing is realized by converting the obtained pixel difference into a defocus amount, ensuring both the focusing precision and the focusing efficiency of the microscopic imaging system.
The foregoing is merely exemplary of embodiments of the invention; devices and arrangements not explicitly described in this disclosure should be understood as realizable by general-purpose devices and methods known in the art.
The above description covers only some embodiments of the present invention and is not intended to limit it; those skilled in the art can make various modifications and variations. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention shall fall within its protection scope.

Claims (10)

1. A microscopic imaging system, comprising:
a main light source (1);
a condenser lens (2);
an imaging unit (3), wherein the condenser lens (2) is positioned between the main light source (1) and the imaging unit (3);
a stage (5), the stage (5) being located between the imaging unit (3) and the condenser lens (2);
an imaging camera (6), the imaging camera (6) being adapted to acquire an image of the imaging unit (3);
-a central control unit (7) connected to the imaging camera (6);
a driving device (8) for adjusting the distance between the imaging unit (3) and the stage (5); characterized by further comprising:
the focusing light sources (4) are even in number and at least two, and the focusing light sources (4) and the main light source (1) are positioned on the same side of the condenser lens (2);
the imaging unit (3) comprises an objective lens (31) and a converging lens (32);
the distance d between the two symmetrically arranged focusing light sources (4), the focal length f of the condenser lens (2) and the numerical aperture NA of the objective lens (31) satisfy the formula: d/(2f) = tan(arcsin(NA));
the optical axis of the objective lens (31) coincides with the optical axis of the condenser lens (2).
2. The microscopic imaging system according to claim 1, wherein the number of focusing light sources (4) is two, and the two focusing light sources (4) are respectively located on the focal plane of the condenser lens (2);
the focusing light sources (4) are symmetrically arranged on both sides of the optical axis of the condenser lens (2).
3. The microscopic imaging system according to claim 1, wherein the number of the focusing light sources (4) is four, and the four focusing light sources (4) are respectively located on the focal plane of the condenser lens (2);
every two focusing light sources (4) are symmetrically arranged on both sides of the optical axis of the condenser lens (2).
4. Microscopic imaging system according to claim 1, characterized in that the focusing light source (4) is an LED light source.
5. The microscopic imaging system according to claim 1, wherein the focusing light source (4) is a single green cold light source or a single blue cold light source.
6. A microscopic imaging system according to claim 1, characterized in that the driving means (8) is a linear driving means.
7. A real-time focusing method employing the microscopic imaging system of any one of claims 1 to 6, comprising:
s1, turning off a main light source (1), turning on a focusing light source (4), capturing an image of a sample on an objective table (5), and acquiring the distance between a sample image and a sample offset image in an autocorrelation image according to the image;
s2, controlling the focus position of the sample on the objective table (5) on the objective lens (31) based on the defocusing relation curve according to the distance.
8. The real-time focusing method according to claim 7, wherein the defocus relation is generated by the steps of:
s01, turning on a main light source (1) to enable a sample on the objective table (5) to be located at the focus position of the objective lens (31);
s02, turning off a main light source (1), turning on a focusing light source (4), adjusting the distance between an objective lens (31) and an objective table (5) at a set first interval, and acquiring an out-of-focus image of a sample;
s03, generating a defocused autocorrelation image with the sample image and two sample offset images according to the defocused image, and acquiring the distance between the sample image and the sample offset image;
s04, repeating the steps S02-S03, and acquiring the defocus relation curve according to the distance between the sample image and the sample offset image and the first interval.
9. The real-time focusing method according to claim 8, wherein before step S01, the position of the focusing light source (4) is further required to be corrected, comprising:
s001, turning on a main light source (1) to enable a sample on the objective table (5) to be located at the focus position of the objective lens (31);
s002, turning off the main light source (1), turning on the focusing light source (4), and obtaining an in-focus image of the sample;
s003, adjusting the distance between the objective lens (31) and the objective table (5) at a second interval, and acquiring an out-of-focus image of the sample;
s004, generating an out-of-focus autocorrelation image with a sample image and two sample offset images according to the out-of-focus image, and correcting the position of the focusing light source (4) according to the position between the sample image and the two sample offset images.
10. The method of claim 9, wherein the second interval is in a range of 1 micron to 30 microns.
CN201810045883.6A 2018-01-17 2018-01-17 Microscopic imaging system and real-time focusing method Active CN108051897B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810045883.6A CN108051897B (en) 2018-01-17 2018-01-17 Microscopic imaging system and real-time focusing method


Publications (2)

Publication Number Publication Date
CN108051897A CN108051897A (en) 2018-05-18
CN108051897B true CN108051897B (en) 2023-06-23

Family

ID=62126855

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810045883.6A Active CN108051897B (en) 2018-01-17 2018-01-17 Microscopic imaging system and real-time focusing method

Country Status (1)

Country Link
CN (1) CN108051897B (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109031641A (en) * 2018-08-21 2018-12-18 北京青燕祥云科技有限公司 A kind of digitalized sliced sheet system and pathological section analysis method
CN111443477B (en) * 2020-04-13 2022-12-20 腾讯科技(深圳)有限公司 Microscope auto-focusing method, microscope system, medical device, and storage medium
CN116430568A (en) * 2020-04-13 2023-07-14 腾讯科技(深圳)有限公司 Microscope system, microscope auto-focusing method, and medical apparatus
CN113296255B (en) * 2021-05-24 2022-09-30 南京凯泽瑞兹光电科技有限公司 Automatic focusing and imaging device for pathological section of scanning microscope
CN114205519A (en) * 2021-11-09 2022-03-18 南京泰立瑞信息科技有限公司 Rapid parfocal method and device of amplification imaging system
CN114815211B (en) * 2022-04-19 2023-05-30 大连工业大学 Microscope automatic focusing method based on image processing

Citations (1)

Publication number Priority date Publication date Assignee Title
CN106980175A (en) * 2017-05-10 2017-07-25 暨南大学 The non-fluorescence imaging dicing method and device being conjugated based on annular off-axis illumination focal plane

Family Cites Families (10)

Publication number Priority date Publication date Assignee Title
JP2974393B2 (en) * 1990-10-25 1999-11-10 オリンパス光学工業株式会社 Microscope image input device
JP2000035675A (en) * 1998-07-17 2000-02-02 Sony Corp Reduction stepper and its autofocusing method
JP2009217222A (en) * 2008-03-06 2009-09-24 Takashi Goto Observation base with reflection type transmissive illumination auxiliary device
US9389408B2 (en) * 2010-07-23 2016-07-12 Zeta Instruments, Inc. 3D microscope and methods of measuring patterned substrates
JP2013088570A (en) * 2011-10-17 2013-05-13 Olympus Corp Microscope apparatus
WO2016048851A1 (en) * 2014-09-22 2016-03-31 Gallager Scott M Continuous particle imaging and classification system
CN104317041B (en) * 2014-09-30 2016-11-02 无锡微焦科技有限公司 A kind of self-focusing light path system
CN104777604A (en) * 2015-04-16 2015-07-15 浙江大学 Positionable microscopic imaging system on basis of USB microscopic probe and stepping scanning table
CN107390356A (en) * 2017-08-28 2017-11-24 电子科技大学 The method focused on automatically based on representation of laser facula
CN208026981U (en) * 2018-01-17 2018-10-30 宁波舜宇仪器有限公司 A kind of micro imaging system




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant