CN117434707A - Rapid focusing method and device for microscopic optical system - Google Patents


Publication number
CN117434707A
CN117434707A (application number CN202311453974.0A)
Authority
CN
China
Prior art keywords
light source
module
image
focal plane
offset
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311453974.0A
Other languages
Chinese (zh)
Inventor
林灵杰
黄炜宸
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenjian Qingying Technology Xiamen Co ltd
Original Assignee
Shenjian Qingying Technology Xiamen Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenjian Qingying Technology Xiamen Co ltd filed Critical Shenjian Qingying Technology Xiamen Co ltd
Priority to CN202311453974.0A priority Critical patent/CN117434707A/en
Publication of CN117434707A publication Critical patent/CN117434707A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B21/00Microscopes
    • G02B21/24Base structure
    • G02B21/241Devices for focusing
    • G02B21/244Devices for focusing using image analysis techniques
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B21/00Microscopes
    • G02B21/06Means for illuminating specimens
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B21/00Microscopes
    • G02B21/36Microscopes arranged for photographic purposes or projection purposes or digital imaging or video purposes including associated control and data processing arrangements
    • G02B21/365Control or image processing arrangements for digital or video microscopes
    • G02B21/367Control or image processing arrangements for digital or video microscopes providing an output produced by processing a plurality of individual source images, e.g. image tiling, montage, composite images, depth sectioning, image comparison
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B7/00Mountings, adjusting means, or light-tight connections, for optical elements
    • G02B7/28Systems for automatic generation of focusing signals
    • G02B7/36Systems for automatic generation of focusing signals using image sharpness techniques, e.g. image processing techniques for generating autofocus signals

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Optics & Photonics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Chemical & Material Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Multimedia (AREA)
  • Microscopes, Condenser (AREA)

Abstract

The invention discloses a rapid focusing method and device for a microscopic optical system. The device comprises a light source control module, a microscopic magnification optical module, a CCD camera acquisition module, a focal plane calculation module, and a motor drive module. The microscopic magnification optical module comprises a lens barrel and an objective lens, and the CCD camera acquisition module comprises an area-array CCD camera. The CCD camera acquisition module is connected to the focal plane calculation module, the focal plane calculation module to the motor drive module, the motor drive module to the microscopic magnification optical module, the light source control module to the CCD camera acquisition module, and the CCD camera acquisition module to the microscopic magnification optical module. Through time-synchronized, time-division imaging under different light sources, the device directly computes the sharpest focal plane position, and the drive motor moves the observation target of the slide specimen layer to that position, achieving rapid and convenient microscopic focusing.

Description

Rapid focusing method and device for microscopic optical system
Technical Field
The invention relates to the technical field of microscopic optical systems, and in particular to a rapid focusing method and device for a microscopic optical system.
Background
The microscope is an important optical instrument, and its optical system is the key component for microscope imaging, mainly comprising the objective lens, the eyepiece, the focusing mechanism, and the illumination system.
The objective lens is the main imaging component of the microscope; it focuses light from the sample through the lens to form a magnified image, and its magnification determines the imaging capability of the microscope. The eyepiece is an auxiliary imaging component used to further magnify the image formed by the objective so that it appears clearer; its magnification is typically 10x or 15x. The focusing mechanism is the key part for adjusting the microscope's focus, enabling the objective to image at different focal distances; it generally consists of a coarse adjustment mechanism and a fine adjustment mechanism, and the focus is adjusted by turning an adjustment knob. The illumination system is a prerequisite for microscopic imaging; it provides a light source so that the sample reflects or transmits light.
In the field of digital microscopic imaging, rapid focusing is the key to acquiring sharp, highly magnified microscopic images, and focusing the microscopic optical system is one of the central tasks of digital slide scanning. When the observation target of the slide specimen layer lies on the focal plane of the optical axis, light transmitted through the specimen forms a sharp image on the area-array CCD camera; if the observation target layer is not on the focal plane, i.e. it is out of focus, the image acquired by the CCD camera is blurred and diagnostic image information is lost, making diagnosis difficult for doctors. Moreover, because of differences in slide preparation technique and mechanical error of the motorized stage, the height of the slide specimen layer is not uniform across microscopic fields of view.
For these reasons, high-precision digital slide scanning requires a method for rapidly focusing on each field of view. The existing maximum-contrast method, diffraction-effect estimation method, ranging method, and phase-difference method have difficulty meeting the requirements of high-precision digital slide scanning in terms of real-time speed, accuracy, and cost, so the present invention provides a rapid focusing method and device for a microscopic optical system to solve these problems.
Disclosure of Invention
Aiming at the defects in the prior art, the invention aims to provide a rapid focusing method and device for a microscopic optical system, which can improve the real-time performance of multi-field focusing, thereby improving the digital scanning speed of a slide.
To achieve the above purpose, the invention provides a rapid focusing device for a microscopic optical system, comprising a light source control module, a microscopic magnification optical module, a CCD camera acquisition module, a focal plane calculation module, and a motor drive module. The microscopic magnification optical module comprises a lens barrel and an objective lens, and the CCD camera acquisition module comprises an area-array CCD camera. The CCD camera acquisition module is connected to the focal plane calculation module, the focal plane calculation module to the motor drive module, the motor drive module to the microscopic magnification optical module, the light source control module to the CCD camera acquisition module, and the CCD camera acquisition module to the microscopic magnification optical module. Through time-synchronized, time-division imaging under different light sources, the device directly computes the sharpest focal plane position, and the drive motor moves the observation target of the slide specimen layer to that position, achieving rapid and convenient microscopic focusing.
Further, a fast focusing method of a micro optical system includes the following steps:
step one: initialize the system and synchronize the timing of the CCD camera acquisition module and the light source module;
step two: start the light source control module, which switches the lamps on and off according to the frame-rate timing so that the main light source and the left and right light sources are lit cyclically;
step three: start the CCD camera acquisition module; light passes through the microscopic magnification optical module so that, under illumination by the left and right light sources respectively, image1 (formed by the left light source) and image2 (formed by the right light source) of the target sample are acquired on the CCD camera's sensor array;
step four: start the focal plane calculation module to compute the pixel offset between the contents of the acquired image1 and image2, and calculate the height difference between the current sample height and the focal plane from the parameters of the linear equation fitted to historical focal plane deviations and image offsets;
step five: start the motor drive module to move the sample stage to the corresponding position according to its height difference from the target focal plane.
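The five steps above amount to a single control cycle. The following is a minimal illustrative sketch, not the patent's implementation: the capture, offset-estimation, and motor callables, and the coefficients a and b of the fitted line, are placeholders standing in for the real modules.

```python
def focus_step(capture_left, capture_right, estimate_offset, move_z, a, b):
    """One focusing cycle: capture under the left and right light sources,
    estimate the pixel offset between the two images (step four), convert
    it to a height difference via the fitted line dz = a * x + b, and
    drive the sample stage by that amount (step five)."""
    image1 = capture_left()              # step three: image under left light
    image2 = capture_right()             # step three: image under right light
    x = estimate_offset(image1, image2)  # step four: pixel offset
    dz = a * x + b                       # step four: height difference
    move_z(dz)                           # step five: move the sample stage
    return dz
```

Here `capture_left`, `capture_right`, `estimate_offset`, and `move_z` are hypothetical stand-ins for the camera acquisition, focal plane calculation, and motor drive modules.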
Further, in the first step, the light source control based on timing synchronization includes:
(1) The light source control module receives, triggered on rising edges, the timing synchronization signal generated by the imaging CCD at the frame rate;
(2) At the initial moment, the main light source, left light source, and right light source are lit in sequence for durations T0, T1, and T2 respectively;
(3) The light source control module repeats this cycle indefinitely in the above order.
Further, in the second step, the primary light source identification includes:
(1) White LED lamps are selected as main light sources, and green LED lamps are selected as left and right light sources;
(2) Randomly select N pixel points on each image (default N = 400, but not limited to 400) and average their red R-channel values, where p_i denotes a sampled pixel point and R(p_i) the red R-channel value of that pixel;
(3) If the average is greater than or equal to 100, the image was formed by the main light source; if it is less than 100, the image is not an image of the main light source.
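The red-channel test above can be sketched as follows. This is an illustrative sketch under the assumption that pixels are given as (R, G, B) tuples; the defaults mirror the text (N = 400, threshold 100, white main light vs. green side lights).

```python
import random

def is_main_light_image(pixels, n=400, threshold=100):
    """Return True if an image was likely formed by the white main light
    source. `pixels` is a sequence of (R, G, B) tuples; up to n randomly
    sampled red-channel values are averaged and compared with the
    threshold (the green side lights yield a low red response)."""
    sample = random.sample(list(pixels), min(n, len(pixels)))
    mean_red = sum(p[0] for p in sample) / len(sample)
    return mean_red >= threshold
```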
Further, in the second step, the determination of the left and right light sources includes:
(1) Once an image acquired after a rising-edge trigger of the timing synchronization signal has been identified as a main-light-source image, the timing positions of the subsequent main, left, and right light sources within the synchronization signal can be calibrated, so that the light source corresponding to the image at any timing position can be determined;
(2) After this one-time calibration, the main-light-source identification calibration is repeated once every 90 timing periods (not limited to 90; any multiple of 3 may be used), which strengthens the module's resistance to interference.
Further, the target-region optimization selection strategy aims to reduce the image area used in calculation, to improve focusing speed, and to find optimal regions with salient features for the calculation, to improve focusing accuracy. It comprises:
(1) On the image formed by the main light source, divide the central region into 5x5 blocks (10% of the width and height on each of the top, bottom, left, and right sides is reserved as the non-central region; the width and height of each block image are w and h respectively);
(2) Randomly select 400 pixel points in each block region, compute the standard deviation of their three-channel pixel values, and select the 6 regions with the largest standard deviation, i.e. those with the most salient features, as the optimal regions to be matched.
Further, heterologous image matching under non-rigid transformation includes:
(1) The images acquired under the left and right light sources are converted by a normalization-based adaptive gray transform into gray images Gray1 and Gray2. The result is unaffected by global brightness change, i.e. uniformly brightening or darkening either image has no effect on it. In the transform, (x, y) are the pixel coordinates, R(x, y), G(x, y), and B(x, y) are the values of the red, green, and blue channels of the pixel, and the output is the gray value of the pixel after conversion;
(2) Extract the 6 optimal regions to be matched from the gray image of the left light source; perform an operator convolution on each pixel of each region, select the n pixel points (x, y) with the largest convolution values as a set Mn (default n = 10), and construct a matrix T1 from the gray values of the pixel set Mn; similarly, extract a matrix T2 from the gray image of the right light source;
(3) Compute the similarity of T1 and T2. First, because they are imaged under different light sources, T1 and T2 may differ by a brightness offset; this is eliminated by subtracting each matrix's mean value from its elements, so the calculation proceeds on the per-pixel deviations and the influence of the brightness difference is removed. Second, for any offset (dx, dy) between T1 and T2, a similarity function R(dx, dy) is computed on these mean-subtracted values. The final similarity value is scaled to the range [-1, 1]: R equals 1.0 for two identical images, a larger similarity value indicates that the two are more similar, and completely uncorrelated images give R = -1.
Further, the image offset calculation includes:
(1) Best-match offset optimization: to quickly find the best offset (dx, dy) of the left and right light sources, i.e. the one maximizing R(dx, dy), first downsample the gray images Gray1 and Gray2 by a factor c (default c = 10), traverse the extraction matrix T2 over the scaled gray image Gray2', compute the similarity with T1, and take the position (dx1, dy1) of the maximum R as the preliminary offset position; second, starting from the scaled-back preliminary offset (dx1 × c, dy1 × c), traverse and optimize over a small range of the w × h area on the original Gray2 image to obtain the relative offset position (dx2, dy2) with the largest similarity, which reduces computation time; finally, superpose the preliminary offset on the relative offset to obtain the final image offset position (dx3, dy3), that is, (dx3, dy3) = (dx1 × c + dx2, dy1 × c + dy2).
(2) Consistency verification (mean/standard-deviation filtering): compute the image offset values of the 25 regions separately, then calculate their mean and standard deviation; if the standard deviation is larger than 1% of the mean, focusing is considered to have failed; if it is smaller than or equal to 1% of the mean, focusing is considered successful and the mean is used as the final image pixel offset value.
Further, the focal plane offset function fitting and calculating includes:
(1) The distance Δz along the z axis between the sample and the objective lens is found to be linearly related to the calculated left-right image offset value;
(2) Letting the pixel offset value of the image be x, there is a function Δz = a·x + b;
(3) The values of the slope a and the intercept b are obtained by least-squares curve fitting, giving the functional relation between the image pixel offset x and the required z-axis motor movement step Δz;
(4) With this function, the focal plane position can be calculated quickly.
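Steps (1) through (3) amount to an ordinary least-squares line fit over calibration pairs of pixel offset and known defocus height. A minimal sketch (function and variable names are illustrative):

```python
def fit_focal_plane_line(offsets, heights):
    """Ordinary least-squares fit of dz = a * x + b from calibration pairs
    of image pixel offset x (in `offsets`) and known defocus height dz
    (in `heights`); returns the slope a and intercept b."""
    n = len(offsets)
    mean_x = sum(offsets) / n
    mean_y = sum(heights) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(offsets, heights))
    den = sum((x - mean_x) ** 2 for x in offsets)
    a = num / den
    b = mean_y - a * mean_x
    return a, b
```

Once fitted, Δz = a·x + b directly gives the required z-axis motor step for any measured offset x, which is what makes the one-step focusing of step (4) possible.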
Compared with the prior art, the invention has the beneficial effects that:
1. The focusing method and device for the microscopic optical system can be realized by merely adding a pair of point light sources on the two sides of the optical axis, so the cost is low; same-color light sources are supported, which well solves the color-difference problem of the different-color light sources used in other methods, saves several image preprocessing steps, and avoids more complex matching-detection preprocessing, thereby achieving a faster focusing effect;
2. The invention calculates the theoretical focal plane by looking up the offset obtained from time-division point-light-source illumination imaging, and compared with focal-plane search methods it has one-step rapid focusing capability;
3. The imaging-offset calculation based on the convolution matching method has strong real-time performance, a large effective interval, and a threshold for judging whether focusing succeeded; it effectively solves the prior art's problems of the time consumption caused by an insufficient effective interval (which forces illumination by multiple groups of light sources at different spacings) and of poor threshold adaptivity.
Drawings
In order to more clearly illustrate the solution of the present invention, a brief description will be given below of the drawings required for the description of the embodiments of the present invention, it being apparent that the drawings in the following description are some embodiments of the present invention, and that other drawings may be obtained from these drawings without the exercise of inventive effort for a person of ordinary skill in the art.
FIG. 1 is a schematic diagram of a system provided by the present invention;
FIG. 2 is a schematic diagram of the principles provided by the present invention;
FIG. 3 is a schematic imaging view provided by the present invention;
FIG. 4 is a schematic diagram of timing synchronization-based light source control provided by the present invention;
FIG. 5 is a schematic diagram of primary light source identification provided by the present invention;
FIG. 6 is a schematic diagram of a primary light source imaging provided by the present invention;
FIG. 7 is a schematic diagram of heterologous image matching under non-rigid transformation provided by the present invention;
FIG. 8 is a schematic view of the fitting and calculation of a focal plane offset function provided by the present invention;
fig. 9 is a schematic flow chart provided by the present invention.
Detailed Description
The preferred embodiments of the present invention will be described in detail below with reference to the accompanying drawings so that the advantages and features of the present invention can be more easily understood by those skilled in the art, thereby making clear and defining the scope of the present invention. It will be apparent that the described embodiments are only some, but not all, embodiments of the invention. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to fall within the scope of the invention.
The terms "comprising" and "having" and any variations thereof in the description of the invention and the claims and the description of the drawings above are intended to cover a non-exclusive inclusion. The terms first, second and the like in the description and in the claims or in the above-described figures, are used for distinguishing between different objects and not necessarily for describing a sequential or chronological order.
Referring to figs. 1-9, a rapid focusing device of a microscopic optical system comprises a light source control module, a microscopic magnification optical module, a CCD camera acquisition module, a focal plane calculation module, and a motor drive module. The microscopic magnification optical module comprises a lens barrel and an objective lens, and the CCD camera acquisition module comprises an area-array CCD camera. The CCD camera acquisition module is connected to the focal plane calculation module, the focal plane calculation module to the motor drive module, the motor drive module to the microscopic magnification optical module, the light source control module to the CCD camera acquisition module, and the CCD camera acquisition module to the microscopic magnification optical module. Through time-synchronized, time-division imaging under different light sources, the device directly computes the sharpest focal plane position, and the drive motor moves the observation target of the slide specimen layer to that position, achieving rapid and convenient microscopic focusing.
As an improvement of the technical scheme, the rapid focusing method of the micro-optical system comprises the following steps:
step one: initialize the system and synchronize the timing of the CCD camera acquisition module and the light source module;
step two: start the light source control module, which switches the lamps on and off according to the frame-rate timing so that the main light source and the left and right light sources are lit cyclically;
step three: start the CCD camera acquisition module; light passes through the microscopic magnification optical module so that, under illumination by the left and right light sources respectively, image1 (formed by the left light source) and image2 (formed by the right light source) of the target sample are acquired on the CCD camera's sensor array;
step four: start the focal plane calculation module to compute the pixel offset between the contents of the acquired image1 and image2, and calculate the height difference between the current sample height and the focal plane from the parameters of the linear equation fitted to historical focal plane deviations and image offsets;
step five: start the motor drive module to move the sample stage to the corresponding position according to its height difference from the target focal plane.
Heterologous same-color light source image acquisition
With heterologous different-color light sources, the light sources are easy to distinguish from the RGB values of the image, but grayscale conversion is poorly consistent, and matching detection between different-color sources is tedious and ill-suited to practical focusing scenarios. Heterologous same-color light sources image with good consistency and save both the per-source matching detection and the subsequent time-consuming image preprocessing, but have the drawback that the light source corresponding to an image cannot be distinguished directly from RGB values.
The invention distinguishes the left and right light sources by their position in the timing sequence, using a timing synchronization technique, thereby solving the problem of identifying the light source corresponding to each image.
As an improvement of the above-mentioned technical solution, in the first step, the light source control based on the timing synchronization includes:
(1) The light source control module receives, triggered on rising edges, the timing synchronization signal generated by the imaging CCD at the frame rate;
(2) At the initial moment, the main light source, left light source, and right light source are lit in sequence for durations T0, T1, and T2 respectively;
(3) The light source control module repeats this cycle indefinitely in the above order.
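The cycle in steps (2) and (3) is simply an endlessly repeated main/left/right sequence. A sketch (the pair-based representation is an assumption for illustration):

```python
import itertools

def light_source_schedule(t0, t1, t2):
    """Endless schedule of (source, lit_duration) pairs in the order
    main -> left -> right, repeated forever, per steps (2) and (3).
    The durations t0, t1, t2 correspond to T0, T1, T2 in the text."""
    return itertools.cycle([("main", t0), ("left", t1), ("right", t2)])
```

A driver would pull one entry from this iterator per rising edge of the CCD synchronization signal.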
As an improvement of the above technical solution, in the second step, the main light source identification includes:
(1) White LED lamps are selected as main light sources, and green LED lamps are selected as left and right light sources;
(2) Randomly select N pixel points on each image (default N = 400, but not limited to 400) and average their red R-channel values, where p_i denotes a sampled pixel point and R(p_i) the red R-channel value of that pixel;
(3) If the average is greater than or equal to 100, the image was formed by the main light source; if it is less than 100, the image is not an image of the main light source.
As an improvement of the above-described technical solution, in the second step, the left and right light source determination includes:
(1) Once an image acquired after a rising-edge trigger of the timing synchronization signal has been identified as a main-light-source image, the timing positions of the subsequent main, left, and right light sources within the synchronization signal can be calibrated, so that the light source corresponding to the image at any timing position can be determined;
(2) After this one-time calibration, the main-light-source identification calibration is repeated once every 90 timing periods (not limited to 90; any multiple of 3 may be used), which strengthens the module's resistance to interference.
Heterologous image offset matching under non-rigid transformation:
After heterologous image acquisition, the images Image1 and Image2 formed by the left and right light sources are similar, but are not related by a rigid offset transformation. The image offset must therefore be calculated on selected target regions extracted from Image1 and Image2.
as an improvement of the above technical solution, the target area optimization selection strategy aims at reducing the image calculation area to increase the focusing speed, and searching the optimal area with obvious characteristics to calculate to increase the focusing accuracy, which includes:
(1) On the image imaged by the main light source, 5x5 blocking is carried out in the central area (10% of the width and the height are reserved as non-central areas respectively in the upper, lower, left and right directions, and the width and the height of the blocking image are respectively w and h);
(2) And randomly selecting 400 pixel points for each block area, calculating standard deviation for three-channel pixel values, and selecting 6 areas with the largest standard deviation as the optimal areas to be matched with the most obvious features.
As an improvement of the above technical solution, heterologous image matching under non-rigid transformation includes:
(1) The images acquired under the left and right light sources are converted by a normalization-based adaptive gray transform into gray images Gray1 and Gray2. The result is unaffected by global brightness change, i.e. uniformly brightening or darkening either image has no effect on it. In the transform, (x, y) are the pixel coordinates, R(x, y), G(x, y), and B(x, y) are the values of the red, green, and blue channels of the pixel, and the output is the gray value of the pixel after conversion;
(2) Extract the 6 optimal regions to be matched from the gray image of the left light source; perform an operator convolution on each pixel of each region, select the n pixel points (x, y) with the largest convolution values as a set Mn (default n = 10), and construct a matrix T1 from the gray values of the pixel set Mn; similarly, extract a matrix T2 from the gray image of the right light source;
(3) Compute the similarity of T1 and T2. First, because they are imaged under different light sources, T1 and T2 may differ by a brightness offset; this is eliminated by subtracting each matrix's mean value from its elements, so the calculation proceeds on the per-pixel deviations and the influence of the brightness difference is removed. Second, for any offset (dx, dy) between T1 and T2, a similarity function R(dx, dy) is computed on these mean-subtracted values. The final similarity value is scaled to the range [-1, 1]: R equals 1.0 for two identical images, a larger similarity value indicates that the two are more similar, and completely uncorrelated images give R = -1.
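The similarity described in step (3) — mean subtraction for brightness invariance, a value scaled to [-1, 1], and 1.0 for identical content — matches the properties of zero-mean normalized cross-correlation. The patent's exact formula is not reproduced in this text, so the sketch below uses standard NCC as a stand-in with those same properties.

```python
import math

def similarity(t1, t2):
    """Zero-mean normalized cross-correlation of two equal-length pixel
    value sequences. Subtracting each sequence's mean removes any
    brightness offset between the two light sources; the result lies in
    [-1, 1] and equals 1.0 for identical content."""
    m1 = sum(t1) / len(t1)
    m2 = sum(t2) / len(t2)
    d1 = [v - m1 for v in t1]              # per-pixel brightness deviations
    d2 = [v - m2 for v in t2]
    num = sum(a * b for a, b in zip(d1, d2))
    den = math.sqrt(sum(a * a for a in d1) * sum(b * b for b in d2))
    return num / den if den else 0.0
```

Because only deviations from the mean enter the formula, adding a constant brightness to either input leaves the score unchanged, which is exactly the invariance the method relies on.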
As an improvement of the above-described technical solution, the image shift calculation includes:
(1) Best-match offset optimization: to quickly find the best offset (dx, dy) of the left and right light sources, i.e. the one maximizing R(dx, dy), first downsample the gray images Gray1 and Gray2 by a factor c (default c = 10), traverse the extraction matrix T2 over the scaled gray image Gray2', compute the similarity with T1, and take the position (dx1, dy1) of the maximum R as the preliminary offset position; second, starting from the scaled-back preliminary offset (dx1 × c, dy1 × c), traverse and optimize over a small range of the w × h area on the original Gray2 image to obtain the relative offset position (dx2, dy2) with the largest similarity, which reduces computation time; finally, superpose the preliminary offset on the relative offset to obtain the final image offset position (dx3, dy3), that is, (dx3, dy3) = (dx1 × c + dx2, dy1 × c + dy2).
(2) Consistency verification (mean/standard-deviation filtering): compute the image offset values of the 25 regions separately, then calculate their mean and standard deviation; if the standard deviation is larger than 1% of the mean, focusing is considered to have failed; if it is smaller than or equal to 1% of the mean, focusing is considered successful and the mean is used as the final image pixel offset value.
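Steps (1) and (2) combine a coarse search on downsampled images, a fine local search, and a mean/standard-deviation acceptance test. The following is a 1-D sketch: the patent searches (dx, dy) jointly, and the `fine_radius` parameter and function names are illustrative assumptions.

```python
def coarse_to_fine_offset(score, coarse_range, c=10, fine_radius=5):
    """Two-stage offset search: scan offsets at stride c (the downsampling
    factor, default 10 per the text) to get a preliminary offset, then
    refine within +/- fine_radius around it at full resolution.
    `score(d)` returns the similarity (e.g. NCC) at offset d."""
    d1 = max(range(-coarse_range, coarse_range + 1),
             key=lambda d: score(d * c))            # preliminary offset dx1
    base = d1 * c                                   # scale back to full res
    return max(range(base - fine_radius, base + fine_radius + 1),
               key=score)                           # final offset dx3

def consistent_offset(offsets, tol=0.01):
    """Consistency verification: accept the mean of the per-region offsets
    only if their standard deviation is within tol (1%) of the mean;
    otherwise report focusing failure by returning None."""
    n = len(offsets)
    mean = sum(offsets) / n
    std = (sum((o - mean) ** 2 for o in offsets) / n) ** 0.5
    return mean if std <= tol * abs(mean) else None
```

The coarse stage touches only 1/c of the candidate positions, which is where the stated reduction in computation time comes from.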
As an improvement of the above technical solution, the fitting and calculating of the focal plane offset function includes:
(1) The distance Δz along the z axis between the sample and the objective lens is found to be linearly related to the calculated left-right image offset value;
(2) Letting the pixel offset value of the image be x, there is a function Δz = a·x + b;
(3) The values of the slope a and the intercept b are obtained by least-squares curve fitting, giving the functional relation between the image pixel offset x and the required z-axis motor movement step Δz;
(4) With this function, the focal plane position can be calculated quickly.
The working principle and usage of the invention are as follows:
When the system is used, it is initialized and the CCD camera acquisition module and the light source module are synchronized in timing. The light source control module is started, turning the left and right lamps on and off according to the frame-rate timing so as to cyclically light the main light source and the left and right light sources. The CCD camera acquisition module is started; light passes through the microscopic amplification optical module, ensuring that the target sample, under illumination by the left and right light sources respectively, produces image1 (generated by the left light source) and image2 (generated by the right light source) on the acquisition array surface of the CCD camera. The focal plane calculation module is started to calculate the pixel offset between the contents of the acquired image1 and image2, and the height difference between the current sample height and the focal plane is calculated from the parameters of the linear unary first-order equation relating historical focal plane deviation to image offset. The motor driving module is then started to drive the sample stage to the corresponding position according to the height difference from the target focal plane.
The method and device achieve rapid focusing of a micro-optical system by merely adding a pair of point light sources on the two sides of the optical axis, at low cost. Same-color light sources are supported, which avoids the color-difference problems of methods using light sources of different colors, saves several image preprocessing steps, and avoids more complex matching-detection preprocessing, achieving a faster focusing effect. The theoretical focal plane is calculated by looking up the offset of a pair of point light sources imaged under time-shared illumination; compared with focal plane optimum-seeking methods, this provides one-step fast focusing capability. The convolution-matching-based imaging offset calculation offers strong real-time performance, a large effective interval and threshold, and the ability to judge whether focusing succeeded, effectively solving the problems in the related art of time consumption and poor threshold adaptivity caused by adopting multiple groups of light sources with different spacings due to an insufficient effective interval. Rapid focusing can thus be realized and the real-time performance of multi-field focusing improved, thereby increasing the digital scanning speed of the slide.
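The workflow above reduces to a single control loop. In this sketch, capture, move_z and compute_offset are hypothetical hooks standing in for the light-source/camera, motor-driving and focal-plane-calculation modules:

```python
def autofocus_once(capture, move_z, compute_offset, a, b):
    """One autofocus cycle: acquire the two side-lit frames, estimate the
    pixel offset, convert it to a z step via dz = a*x + b, and move the stage.
    All three callables are illustrative stand-ins for the hardware modules."""
    image1 = capture("left")    # frame lit by the left point light source
    image2 = capture("right")   # frame lit by the right point light source
    x, ok = compute_offset(image1, image2)  # pixel offset + consistency flag
    if not ok:
        return None  # consistency check failed: do not move the stage blindly
    dz = a * x + b
    move_z(dz)
    return dz
```

Because the offset-to-displacement relation is linear, one capture pair suffices per field of view, which is what makes multi-field scanning fast.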
The foregoing is merely illustrative of the present invention and is not to be construed as limiting thereof; although the invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some or all of the technical features thereof can be replaced with equivalents; all equivalent structures or equivalent flow changes made by the specification and the attached drawings of the invention or directly or indirectly applied to other related technical fields are included in the protection scope of the invention.

Claims (9)

1. A rapid focusing device of a micro-optical system, characterized in that: the microscopic optical focusing device comprises a light source control module, a microscopic amplification optical module, a CCD camera acquisition module, a focal plane calculation module and a motor driving module, wherein the microscopic amplification optical module comprises a lens barrel and an objective lens, the CCD camera acquisition module comprises an array CCD acquisition camera, the CCD camera acquisition module is connected with the focal plane calculation module, the focal plane calculation module is connected with the motor driving module, the motor driving module is connected with the microscopic amplification optical module, the light source control module is connected with the CCD camera acquisition module, and the CCD camera acquisition module is connected with the microscopic amplification optical module; the sharpest focal plane position can be directly calculated through same-color time-shared asynchronous light-source imaging, and the driving motor moves the observation target of the slide sample layer to the focal plane position, realizing rapid and convenient microscopic optical focusing.
2. The method for rapid focusing of a micro-optical system according to claim 1, comprising the steps of:
step one: initializing a system, and synchronizing time sequences of the CCD camera acquisition module and the light source module;
step two: starting a light source control module, and respectively turning on and off a left lamp and a right lamp according to a frame rate time sequence to circularly light a main light source and the left and right light sources;
step three: starting the CCD camera acquisition module; light passes through the microscopic amplification optical module, ensuring that the target sample, under illumination by the left and right light sources respectively, produces image1 (generated by the left light source) and image2 (generated by the right light source) on the acquisition array surface of the CCD camera;
step four: starting the focal plane calculation module to calculate the pixel offset between the contents of the acquired image1 and image2, and calculating the height difference between the current sample height and the focal plane from the parameters of the linear unary first-order equation relating historical focal plane deviation to image offset;
step five: and starting a motor driving module to drive the sample object stage to move to the corresponding position according to the height difference from the target focal plane.
3. The method of claim 2, wherein in the first step, the timing synchronization-based light source control includes:
(1) The light source control module adopts a rising edge triggering mode to receive and sense a time sequence synchronous signal generated by the imaging CCD according to the frame rate;
(2) At the initial moment, sequentially lighting a main light source, a left light source and a right light source, wherein the lighting time periods are respectively T0, T1 and T2;
(3) The light source control module repeats the cycle all the time in the above order.
4. A method of fast focusing a micro-optical system according to claim 3, wherein in the second step, the main light source identification includes:
(1) White LED lamps are selected as main light sources, and green LED lamps are selected as left and right light sources;
(2) N pixel points are randomly selected on each image (default N = 400, but not limited to 400) and their red (R) channel values are averaged, where pi denotes a pixel point and R(pi) the red-channel value of that pixel;
(3) If the average is greater than or equal to 100, the image is judged to have been formed by the main light source; if it is less than 100, the image is not an image of the main light source.
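The main-light-source test above (white LED frames have a high red channel; green LED frames do not) can be sketched as follows; the function name and the default threshold of 100 come from the text, the rest is illustrative:

```python
import numpy as np

def is_main_light(img_rgb, n=400, threshold=100, rng=None):
    """Classify a frame as main (white LED) vs side (green LED) light source
    from the mean red-channel value of n randomly sampled pixels."""
    rng = rng or np.random.default_rng()
    h, w = img_rgb.shape[:2]
    ys = rng.integers(0, h, n)
    xs = rng.integers(0, w, n)
    return img_rgb[ys, xs, 0].mean() >= threshold  # channel 0 = red
```

Sampling n pixels instead of the full frame keeps the per-frame classification cost constant regardless of sensor resolution.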
5. The method of claim 4, wherein in the second step, the determination of the left and right light sources comprises:
(1) After the image acquired following a rising-edge trigger of the timing synchronization signal is identified as coming from the main light source, the timing positions of the subsequent main, left and right light sources within the synchronization signal can be calibrated, so that the light source corresponding to the image at any timing position can be determined;
(2) After the above calibration has been performed once, the main-light-source identification and calibration is repeated once every 90 timing periods (not limited to 90; any multiple of 3 suffices), enhancing the anti-interference capability of the module.
6. The method for rapid focusing of a micro-optical system according to claim 5, wherein the target-area optimization selection strategy reduces the image calculation area to improve the focusing speed and searches for optimal areas with obvious features to improve the focusing accuracy, comprising:
(1) On the image formed by the main light source, the central area is divided into 5×5 blocks (10% of the width and height on each of the top, bottom, left and right sides is reserved as non-central area; the width and height of each block image are w and h respectively);
(2) For each block area, 400 pixel points are randomly selected and the standard deviation of the three-channel pixel values is calculated; the 6 areas with the largest standard deviation, i.e. the most obvious features, are selected as the optimal areas to be matched.
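A sketch of this block-scoring strategy; the 5×5 grid, 10% margins, 400 sampled pixels and top-6 selection follow the text, while the function name and return format are illustrative:

```python
import numpy as np

def select_regions(img, grid=5, margin=0.1, n_pix=400, k=6, rng=None):
    """Split the central (1 - 2*margin) area into a grid x grid lattice and
    keep the k blocks whose sampled pixel values have the largest std."""
    rng = rng or np.random.default_rng()
    H, W = img.shape[:2]
    y0, x0 = int(H * margin), int(W * margin)
    h = (H - 2 * y0) // grid
    w = (W - 2 * x0) // grid
    scored = []
    for gy in range(grid):
        for gx in range(grid):
            block = img[y0 + gy * h: y0 + (gy + 1) * h,
                        x0 + gx * w: x0 + (gx + 1) * w]
            ys = rng.integers(0, block.shape[0], n_pix)
            xs = rng.integers(0, block.shape[1], n_pix)
            scored.append((block[ys, xs].std(), gy, gx))
    scored.sort(reverse=True)            # highest-variance blocks first
    return [(gy, gx) for _, gy, gx in scored[:k]]
```

High-variance blocks carry the texture that the later convolution matching needs; flat blocks would give an ill-conditioned similarity peak.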
7. The method of claim 6, wherein the non-rigidly transformed alien image matching comprises:
(1) The images acquired under the left and right light sources are subjected to an adaptive gray-level transformation based on normalization to obtain gray images Gray1 and Gray2; the result is unaffected by global brightness changes, i.e. uniformly brightening or darkening either image has no influence on the result, and the transformation formula is as follows:
where (x, y) are the pixel coordinates, R(x, y), G(x, y) and B(x, y) are the values of the red, green and blue channels of the pixel, and Gray(x, y) is the gray value of the pixel after transformation;
(2) The 6 optimal areas to be matched are extracted from the gray image of the left light source; operator convolution is calculated for each pixel of each area, the n pixel points (x, y) with the largest convolution values are selected as a set Mn (default n = 10), and a matrix T1 is constructed from the gray values of the pixel set Mn; similarly, a matrix T2 is extracted from the gray image of the right light source;
(3) T1 and T2 are subjected to a similarity calculation; first, since imaging under different light sources may introduce a brightness deviation between T1 and T2, a brightness-deviation elimination is applied: the mean of each matrix is subtracted and the per-pixel deviation from the mean is used in the calculation, eliminating the influence of the brightness difference, with the specific formula as follows:
second, for any offset (dx, dy) between T1 and T2, the similarity function is designed as follows:
the final similarity value is scaled to the range [-1, 1]; R equals 1.0 for two identical images, a larger similarity value indicates that the two are more similar, and completely uncorrelated images give R = -1.
8. The method of claim 7, wherein the image shift calculation comprises:
(1) Best-match offset optimization: to quickly find the offset (dx, dy) between the left and right light sources that maximizes R(dx, dy), first, c-times downsampling (c defaults to 10) is applied to the gray images Gray1 and Gray2; the matrix T2 is extracted while traversing the scaled gray image Gray2', its similarity with T1 is calculated, and the position (dx1, dy1) where R is largest is taken as the preliminary offset position; second, a small-range traversal of the w×h region on the original Gray2 image is performed around the scaled-up preliminary position (dx1·c, dy1·c), and the relative offset position (dx2, dy2) with the largest R is obtained, thereby reducing computation time; finally, the preliminary offset is superimposed on the relative offset to obtain the final image offset position (dx3, dy3), i.e., (dx3, dy3) = (dx1·c + dx2, dy1·c + dy2);
(2) Consistency verification (mean/standard-deviation filtering): the image offset values of the 25 areas are calculated separately, and their mean and standard deviation are computed; when the standard deviation is larger than 1% of the mean, focusing is considered to have failed; when it is smaller than or equal to 1% of the mean, focusing is considered successful and the mean is used as the final image pixel offset value.
9. The method of claim 8, wherein the fitting and calculating of the focal plane offset function comprises:
(1) The distance Δz between the z-axis sample and the objective lens is found to be linearly related to the calculated left/right image offset value;
(2) Let the image pixel offset value be x; then Δz = a·x + b;
(3) The slope a and the intercept b are obtained by least-squares curve fitting, giving the functional relation between the image pixel offset x and the required z-axis motor movement step Δz;
(4) The focal plane position can then be calculated quickly from this function.
CN202311453974.0A 2023-11-03 2023-11-03 Rapid focusing method and device for microscopic optical system Pending CN117434707A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311453974.0A CN117434707A (en) 2023-11-03 2023-11-03 Rapid focusing method and device for microscopic optical system


Publications (1)

Publication Number Publication Date
CN117434707A true CN117434707A (en) 2024-01-23

Family

ID=89558126




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination