CN113933981A - Automatic focusing method based on optical image definition and related equipment - Google Patents

Automatic focusing method based on optical image definition and related equipment

Info

Publication number
CN113933981A
Authority
CN
China
Prior art keywords
focusing
foc
image
definition
optical
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010611672.1A
Other languages
Chinese (zh)
Inventor
郏东耀
李玉娟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Huihuang Yaoqiang Technology Co ltd
Original Assignee
Shenzhen Huihuang Yaoqiang Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Huihuang Yaoqiang Technology Co ltd filed Critical Shenzhen Huihuang Yaoqiang Technology Co ltd
Priority to CN202010611672.1A priority Critical patent/CN113933981A/en
Publication of CN113933981A publication Critical patent/CN113933981A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B21/00Microscopes
    • G02B21/24Base structure
    • G02B21/241Devices for focusing
    • G02B21/244Devices for focusing using image analysis techniques
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B7/00Mountings, adjusting means, or light-tight connections, for optical elements
    • G02B7/28Systems for automatic generation of focusing signals
    • G02B7/36Systems for automatic generation of focusing signals using image sharpness techniques, e.g. image processing techniques for generating autofocus signals
    • G02B7/38Systems for automatic generation of focusing signals using image sharpness techniques, e.g. image processing techniques for generating autofocus signals measured at different points on the optical axis, e.g. focussing on two or more planes and comparing image data
    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B13/00Viewfinders; Focusing aids for cameras; Means for focusing for cameras; Autofocus systems for cameras
    • G03B13/32Means for focusing
    • G03B13/34Power focusing
    • G03B13/36Autofocus systems
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/10Image enhancement or restoration by non-spatial domain filtering
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/20Image enhancement or restoration by the use of local operators
    • G06T5/70
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/13Edge detection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/90Determination of colour characteristics

Abstract

The embodiment of the application belongs to the field of automatic control and relates to an automatic focusing method, an automatic focusing device, computer equipment and a storage medium based on optical image definition. The method comprises: initializing the positions of an objective lens and an objective table; adjusting the focusing distance up or down along the z axis by a set step length and continuously collecting 3 optical focusing images; calculating the definition of each optical focusing image; and comparing the definitions and adjusting the focusing distance according to the comparison result until the definition of the optical focusing image meets the set requirement. The method sets row-by-row scanning on the X axis and Y axis of a microscope scanning platform, designs and implements an automatic focusing algorithm on the Z axis, and improves the algorithm in two respects, namely the image definition calculation method and the focusing plane search algorithm, thereby improving focusing accuracy and efficiency.

Description

Automatic focusing method based on optical image definition and related equipment
Technical Field
The present disclosure relates to the field of automation control, and in particular, to an auto-focusing method based on optical image sharpness and related apparatus.
Background
Autofocus technology was originally developed to replace the manual focusing that photography constantly requires, and was therefore first applied in photographic systems. Traditional automatic focusing technology estimates the proper focal plane position from the distance between the lens and the target; this approach is very rough and performs poorly. With the continuous development of image processing technology and its application in various fields, researchers have tried to evaluate the focusing effect in real time by measuring the definition of the image and to adjust the focus continuously until the final focusing effect is achieved. This approach is popular at present, and many scholars at home and abroad have put forward their own research results.
The edge-feature-based image definition evaluation algorithm proposed by Yanse culvert can obtain parameters corresponding to the subjective definition by evaluating the current frame image, but it is easily affected by noise when the image background is complex and the target object is very small. Zhangfengshou et al. proposed an improved Sobel gradient focusing algorithm, and the gray-level difference focusing evaluation function proposed by Lezha peak et al. can effectively filter noise and enhance stability. The search algorithm is a key technology of the depth-from-focus method; it is essentially a one-dimensional extremum optimization problem, namely finding the position corresponding to the highest point of the focusing evaluation function curve, i.e., the optimal focusing position.
However, the above methods generally have the following problems: first, their noise resistance is poor, which affects the accuracy of the focusing result; second, interference planes formed by impurities and garbage cells cannot be removed well; in addition, acquiring and comparing a large number of images near the focusing plane makes focusing slow and inefficient.
Disclosure of Invention
Based on the above, the present application provides an automatic focusing method based on optical image sharpness and related devices, so as to solve the technical problems of slow focusing speed and low efficiency in the prior art.
An auto-focusing method based on optical image sharpness, the method comprising:
initializing the positions of an objective lens and an objective table;
adjusting the focusing distance up or down along the z axis by a set step length, and continuously collecting 3 optical focusing images;
calculating the definition of each optical focusing image;
and comparing the definition, and adjusting the focusing distance according to the comparison result until the definition of the optical focusing image meets the set requirement.
An optical image sharpness based auto-focusing apparatus, the apparatus comprising:
the initial module is used for initializing the positions of the objective lens and the objective table;
the acquisition module is used for adjusting the focusing distance up or down along the z axis by a set step length and continuously acquiring 3 optical focusing images;
the calculating module is used for calculating the definition of each optical focusing image;
and the adjusting module is used for comparing the definition and adjusting the focusing distance according to the comparison result until the definition of the optical focusing image meets the set requirement.
A computer device comprising a memory and a processor, and a computer program stored in the memory and executable on the processor, the processor implementing the steps of the above-mentioned method for optical image sharpness based auto-focusing when executing the computer program.
A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the steps of the above-mentioned method for optical image sharpness-based autofocus.
According to the automatic focusing method and device based on the optical image definition, the computer equipment and the storage medium, the plurality of images are continuously collected by setting the step length, then the definition of the collected optical focusing image is calculated in real time, the definition is compared, and the focusing distance is adjusted according to the comparison result, so that the technical problems of low focusing speed and low efficiency in the prior art are solved.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings needed to be used in the description of the embodiments of the present invention will be briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art that other drawings can be obtained according to these drawings without inventive labor.
FIG. 1 is a schematic diagram of an application environment of an auto-focusing method based on optical image sharpness;
FIG. 2 is a schematic flow chart of an automatic focusing method based on optical image sharpness;
FIG. 3 is a flow chart of an auto-focusing method based on optical image sharpness;
FIG. 4 is a schematic diagram of an auto-focus device based on optical image sharpness;
FIG. 5 is a diagram of a computer device in one embodiment.
Detailed Description
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs; the terminology used in the description of the application herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application; the terms "including" and "having," and any variations thereof, in the description and claims of this application and the description of the above figures are intended to cover non-exclusive inclusions. The terms "first," "second," and the like in the description and claims of this application or in the above-described drawings are used for distinguishing between different objects and not for describing a particular order.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the application. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is explicitly and implicitly understood by one skilled in the art that the embodiments described herein can be combined with other embodiments.
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application is further described in detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit the present application. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The automatic focusing method based on the optical image definition provided by the embodiment of the invention can be applied to the application environment shown in fig. 1. The application environment may include a terminal 102, a network for providing a communication link medium between the terminal 102 and the server 104, and a server 104, wherein the network may include various connection types, such as wired, wireless communication links, or fiber optic cables, among others.
A user may use the terminal 102 to interact with the server 104 over a network to receive or send messages, etc. The terminal 102 may have installed thereon various communication client applications, such as a web browser application, a shopping application, a search application, an instant messaging tool, a mailbox client, social platform software, and the like.
The terminal 102 may be various electronic devices having a display screen and supporting web browsing, including but not limited to a smart phone, a tablet computer, an e-book reader, an MP3 player (Moving Picture Experts Group Audio Layer III, mpeg compression standard Audio Layer 3), an MP4 player (Moving Picture Experts Group Audio Layer IV, mpeg compression standard Audio Layer 4), a laptop portable computer, a desktop computer, and the like.
The server 104 may be a server that provides various services, such as a background server that provides support for pages displayed on the terminal 102.
It should be noted that the automatic focusing method based on the optical image definition provided in the embodiment of the present application is generally executed by a server/terminal, and accordingly, an automatic focusing apparatus based on the optical image definition is generally disposed in the server/terminal device.
It should be understood that the number of terminals, networks and servers in fig. 1 is merely illustrative. There may be any number of terminal devices, networks, and servers, as desired for implementation.
The terminal 102 communicates with the server 104 through the network. The terminal 102 acquires the collected optical focusing images, calculates their definition, compares the definitions of several images, and controls the distance between the objective lens and the glass slide according to the definition until the definition of the image meets the requirement and focusing is completed. The definition calculation may also be performed on the server 104: the terminal 102 collects images under the control of the server 104, sends them to the server 104 for definition calculation, and then performs focusing under the control of the server 104. The terminal 102 and the server 104 are connected through a network, which may be wired or wireless; the terminal 102 may be, but is not limited to, various personal computers, notebook computers, smart phones, tablet computers and portable wearable devices, and the server 104 may be implemented by an independent server or by a server cluster composed of multiple servers.
In one embodiment, as shown in fig. 2, an auto-focusing method based on optical image sharpness is provided, which is described by taking the method as an example applied to the terminal in fig. 1, and includes the following steps:
In step 202, the positions of the objective lens and the stage are initialized.
Before starting, a microscope scanning platform hardware system needs to be built: a biological microscope serves as the observation body, a motorized stage performs the scanning and focusing control, and a camera completes the hardware. The optical microscope and the industrial camera are organically combined through photoelectric conversion.
A cervical squamous epithelial cell slide is placed on the microscope stage, the X-axis and Y-axis positions are initialized, and coarse focusing is performed with a hill-climbing search algorithm to determine a small focusing interval as the initialized position. The hill-climbing algorithm is a local-preference method that adopts a heuristic approach; it is an improvement on depth-first search and uses feedback information to help generate the decision for the next solution. As the improvement of the focusing plane search algorithm, the hill-climbing search is used for coarse focusing to determine a small focusing interval, and the automatic focusing procedure described below is then used for fine focusing, which improves focusing and positioning efficiency.
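For illustration only, the coarse-focusing stage can be sketched as follows. The Python fragment below is a minimal hill-climbing search along the Z axis; the helper names move_z, capture and sharpness are assumptions (the patent defines no software interface), and the fragment only shows the climb-until-the-score-drops idea and the resulting small bracketing interval, not the actual implementation.

```python
# Minimal hill-climbing coarse-focus sketch; move_z, capture and sharpness are assumed helpers.
def coarse_focus(z_start, coarse_step, move_z, capture, sharpness):
    """Climb along the Z axis until the sharpness score stops improving and
    return a small interval [z_lo, z_hi] that brackets the focal plane."""
    z, direction = z_start, +1
    move_z(z)
    best = sharpness(capture())
    while True:
        z_next = z + direction * coarse_step
        move_z(z_next)
        score = sharpness(capture())
        if score > best:            # still climbing: accept the move
            z, best = z_next, score
        elif direction == +1:       # first drop: try the opposite direction once
            direction = -1
        else:                       # dropped on both sides: the peak is bracketed
            return z - coarse_step, z + coarse_step
```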
In step 204, the focusing distance is adjusted up or down along the z axis by a set step length, and 3 optical focusing images are continuously acquired.
Setting the step length means that the server controls the microscope objective of the hardware system to move up or down along the Z axis; the moving direction of the objective is unchanged within each optical image acquisition period, and each time the objective moves by one step, the camera acquires an optical image of the cervical squamous epithelial cells as an optical focusing image.
In step 206, the sharpness of each optically focused image is calculated.
The definition of each acquired optical focusing image is calculated.
Optionally, noise points in the optical focusing image are first filtered out through Retinex, and edge extraction is then performed on the denoised optical focusing image through the lifting wavelet transform to obtain a first edge image based on pixel mutation points. The Retinex algorithm is a commonly used image enhancement method built on scientific experiments and scientific analysis, and a pixel mutation point refers to a pixel point where the gradient of the gray value changes most. Edge detection is then performed on the first edge image through morphology to obtain a plurality of second edge images; finally, the second edge images are normalized and the normalized second edge images are weighted to obtain the definition of the optical focusing image.
The principle of edge-feature-based optical image definition judgment is as follows: the optical image after Retinex processing is first decomposed by wavelet transform, through which both the low-frequency and the high-frequency edges of the optical image can be observed; finally, the edge features of the optical image are used as the basis for an accurate calculation so as to obtain a correct judgment result.
Compared with the conventional wavelet transform, the lifting wavelet used in this embodiment is an improved wavelet that no longer has to rely on the Fourier transform. A low-frequency image is extracted from the optical focusing image by the lifting wavelet transform, and pixel mutation points are then detected in the low-frequency image. A pixel mutation point, which may also be called a signal mutation point, is a point where the gray value of a pixel changes abruptly; its gray value differs greatly from that of the surrounding points and can clearly represent the image edge. Specifically, the position where the gradient of the pixel gray value changes most is calculated and defined as a mutation point. The detection is done by examining the gray values around each pixel point, thereby finding the pixel mutation points and achieving edge detection.
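As a rough illustration of the mutation-point idea, the sketch below uses the standard 2-D discrete wavelet transform from PyWavelets as a stand-in for the lifting wavelet transform described above (a simplification, not the patent's transform), takes the low-frequency subband, and keeps the pixels whose gray-value gradient is largest as a binary edge map; the percentile threshold is an assumed parameter.

```python
import numpy as np
import pywt  # PyWavelets; dwt2 is used here as a stand-in for the lifting wavelet transform

def mutation_point_edges(gray, wavelet="haar", percentile=90.0):
    """Low-frequency subband -> gray-value gradient -> keep the strongest
    transitions ("pixel mutation points") as a binary first edge image."""
    low, _details = pywt.dwt2(gray.astype(np.float32), wavelet)
    gy, gx = np.gradient(low)                    # gradients along rows and columns
    grad = np.hypot(gx, gy)                      # gradient magnitude
    threshold = np.percentile(grad, percentile)  # keep only the sharpest transitions
    return (grad >= threshold).astype(np.uint8)
```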
After the edge image based on pixel mutation points is obtained, a clearer optical image needs to be obtained through morphological detection. Morphology is a method for extracting image edge features, and applying it to the sharpness detection of optical images gives a good judgment result.
Edge feature detection is performed with structuring elements, which come in many types and can be roughly classified as rectangular, linear, square and so on. Because structuring elements of the same type also differ in scale, the scale of the edge detection is affected accordingly. In general, the scale is proportional to the denoising performance; in short, the smaller the scale, the weaker the denoising ability but the more edge detail is captured, and vice versa. In this embodiment, because the cervical squamous epithelial cells to be detected are roughly circular or elliptical, an elliptical structuring element is taken as an example, and because the cell size is uncertain, elliptical structuring elements of several scales can be selected for edge detection.
The concrete expression is: B = {Bi, i = 1, 2, …, n}, where n denotes the scale of the structuring element; the larger n, the larger the ellipse. A second edge image is obtained from the detection with the structuring element of each scale, and since multiple scales can be selected for detection, multiple second edge images can be obtained.
Then, in the process of image edge detection, the detection results may need to be weighted, and a clear edge image and the definition of that image are finally obtained by synthesis. Specifically, the second edge images are normalized to edge matrices and, according to equation (1):
C = (k1A'1 + k2A'2 + … + knA'n)/n (1)
the edge matrices are weighted to obtain the definition, where C refers to the definition, k1, k2, …, kn refer to the edge matrix weights, A'1, A'2, …, A'n refer to the edge matrices, and n refers to the number of second edge images obtained.
There are many morphological operations, for example: the erosion operation shrinks the image, while the dilation operation enlarges it; the opening operation removes small protrusions and smooths the image, and the closing operation eliminates small holes and smooths the contour lines of the image. To further improve the accuracy of the operation, two or more of these operations can be combined to detect the edges of the optical image effectively.
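A possible reading of the multi-scale morphological step and formula (1) is sketched below with OpenCV. The morphological gradient with elliptical structuring elements of several scales stands in for the second edge images, the weights ki are assumed to be uniform, and the weighted combination is reduced to a single score with a mean, which is an interpretation rather than something stated in the text.

```python
import cv2
import numpy as np

def morphological_sharpness(gray, scales=(3, 5, 7), weights=None):
    """Sketch of formula (1): one second edge image per elliptical structuring-element
    scale, normalized, then combined as C = (k1*A1' + k2*A2' + ... + kn*An') / n."""
    weights = weights if weights is not None else [1.0] * len(scales)  # k_i assumed uniform
    acc = np.zeros(gray.shape, dtype=np.float32)
    for k, s in zip(weights, scales):
        kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (s, s))
        edge = cv2.morphologyEx(gray, cv2.MORPH_GRADIENT, kernel).astype(np.float32)
        edge /= edge.max() + 1e-6              # normalized edge matrix A_i'
        acc += k * edge
    C = acc / len(scales)                      # weighted combination per formula (1)
    return float(C.mean())                     # reduced to a scalar sharpness score
```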
Optionally, the sharpness of the optically focused image may also be calculated by: filtering noise points in the optical focusing image through Gaussian filtering; performing edge detection on the optical focusing image after the noise points are filtered through a Laplace algorithm to obtain a third edge image; and calculating the local variance information of the third edge image as the definition of the optical focusing image.
Specifically, the image is first convolved with a Gaussian filter, as in formula (2), to smooth the image and filter out isolated noise points:
fg(m, n) = g(m, n) * f(m, n) (2)
where f(m, n) is the gray-scale image of the original optical focusing image, fg(m, n) is the original gray image after Gaussian filtering, and g(m, n) is the Gaussian filter function, as shown in formula (3):
g(m, n) = (1/(2πσ^2)) exp(-(m^2 + n^2)/(2σ^2)) (3)
where σ is the width of the Gaussian filter.
Then, considering that the edges of squamous epithelial cells are mostly approximately elliptical, a non-directional Laplacian gradient operator is applied to complete the edge detection of the image, as shown in formula (4):
h(m, n) = ∂^2fg(m, n)/∂m^2 + ∂^2fg(m, n)/∂n^2 (4)
where h(m, n) is the value after the Laplacian processing;
Finally, using the characteristic that a focused, clear image has larger gray-level differences than an unfocused one, the local variance information FGLOG is adopted to describe the degree of image sharpness and the in-focus state of the lens, as shown in formula (5):
FGLOG = (1/(M·N)) Σm Σn [h(m, n) - μ]^2 (5)
where μ is the global average of h(m, n) after processing and M·N is the image size. The local variance information reflects the degree of continuity of adjacent pixel points: the clearer the image, the more discontinuous its pixels; the blurrier the image, the more similar its pixels. At each position the lens can be moved vertically to capture an image, and the definition (i.e., the local variance information) is then calculated; the larger the value, the clearer the image.
In the image definition calculation, the method combines the Laplacian and the local variance into a joint focusing algorithm, which shortens the calculation time and achieves fast focusing.
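Under the same caveat, formulas (2) to (5) can be condensed into a few lines of Python with OpenCV: Gaussian smoothing, a non-directional Laplacian, then the variance of the response around its global mean as the sharpness measure. The kernel size being derived from sigma and the use of the whole processed image for the variance are assumptions of this sketch.

```python
import cv2
import numpy as np

def f_glog(gray, sigma=1.0):
    """Sketch of formulas (2)-(5): f_g = g * f, h = Laplacian(f_g),
    F_GLOG = mean((h - mean(h))^2)."""
    smoothed = cv2.GaussianBlur(gray.astype(np.float32), (0, 0), sigma)  # formula (2)
    h = cv2.Laplacian(smoothed, cv2.CV_32F)                              # formula (4)
    return float(np.mean((h - h.mean()) ** 2))                          # formula (5)
```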
In step 208, the definitions are compared, and the focusing distance is adjusted according to the comparison result until the definition of the optical focusing image meets the set requirement.
FIG. 3 shows the flowchart for adjusting the focusing distance according to the comparison result. Specifically, after the definition is obtained, it can be denoted by Foc, and the definitions of the different optical focusing images are denoted Foc1, Foc2 and Foc3.
The set step length is either a first focusing step length or a second focusing step length. The obtained definitions are compared, where the definitions are those of the 3 optical focusing images continuously acquired while the focusing distance is adjusted up or down along the z axis by the first focusing step length w1, recorded as Foc1, Foc2 and Foc3. A derivative judgment threshold T and a focusing interval threshold σ are set. When Foc1 < Foc2 and Foc2 > Foc3 and |Foc1 - Foc3| is greater than the focusing interval threshold σ, reverse focusing is carried out in the focusing interval [Foc1, Foc3] according to a second focusing step length w2 and 2 optical focusing images are continuously acquired, where the second focusing step length w2 is smaller than the first focusing step length w1.
Optionally, the second focusing step length w2 is one third to one half of the first focusing step length w1.
Preferably, the second focusing step length w2 is one third of the first focusing step length w1.
Then, the definitions of the 2 newly acquired optical focusing images are compared with that of the clearest optical focusing image acquired last time, and the operations of adjusting the focusing distance and acquiring optical focusing images according to the definition comparison result are repeated until |Foc1 - Foc3| is not greater than the focusing interval threshold σ, the clearest optical focusing image acquired last time serving as Foc1 or Foc3; if Foc1 and Foc3 turn out to be equal, the definition comparison is carried out with Foc1 and Foc3 respectively.
When |Foc1 - Foc3| is not greater than the focusing interval threshold σ, a focusing function is fitted by the least squares method: y = ax^2 + bx + c, where x is the focusing distance, y is the definition, and a, b and c are parameters. The plane to be focused Pc is then solved, and the derivatives Foc'm and Foc'n are obtained at a set distance, for example 20 μm, above and below the plane Pc respectively. When Foc'm is greater than the derivative judgment threshold T and Foc'n is greater than the derivative judgment threshold T, the current solution is not a local optimal solution; the current plane to be focused is recorded as the optimal focusing plane and focusing is finished. Otherwise, the current solution is a local optimal solution, and the operation of adjusting the focusing distance according to the first focusing step length is executed again until the derivative judgment threshold is satisfied and focusing is finished.
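The least-squares step above can be illustrated with NumPy. The sketch fits y = ax^2 + bx + c to (focusing distance, definition) samples, takes the vertex as the candidate plane Pc, and compares the absolute derivative 2ax + b evaluated a set distance above and below Pc with the derivative judgment threshold T. The variable names, the use of the absolute value, and the 20 micrometre default (assuming z is expressed in metres) are assumptions of the sketch.

```python
import numpy as np

def parabola_focus_check(zs, focs, T, delta=20e-6):
    """Fit y = a*z^2 + b*z + c by least squares, locate the candidate plane P_c at
    the vertex and test the derivative |2*a*z + b| at P_c +/- delta against T."""
    a, b, c = np.polyfit(zs, focs, 2)           # least-squares parabola fit
    z_c = -b / (2.0 * a)                        # vertex: candidate focusing plane P_c
    d_above = abs(2.0 * a * (z_c + delta) + b)  # Foc'_m
    d_below = abs(2.0 * a * (z_c - delta) + b)  # Foc'_n
    # for a fitted parabola the two magnitudes coincide; both are kept
    # to mirror the check described in the text
    not_local_optimum = d_above > T and d_below > T
    return z_c, not_local_optimum
```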
A local optimal solution means that only a certain point or region of the image reaches its highest definition while the image as a whole is not clear; the desired clear image therefore corresponds to a solution that is not a local optimum.
Further, if Foc1 < Foc2 < Foc3, the current search direction is correct, and the operations of adjusting the focusing distance according to the first focusing step length, acquiring optical focusing images and calculating the definition of each optical focusing image are repeated until Foc1 < Foc2 and Foc2 > Foc3.
If Foc1 > Foc2 < Foc3, or Foc1 > Foc2 and Foc2 < Foc3, the focusing distance is adjusted in the reverse direction according to the first focusing step length, and the operations of acquiring optical focusing images and calculating the definition are repeated until Foc1 < Foc2 and Foc2 > Foc3.
The cells on the glass slide are automatically focused in the above manner. After a clear image is obtained, the motorized stage is finally controlled to scan row by row along the X axis and the Y axis, and the Z-axis automatic focusing operation is repeated until the scanning of the whole glass slide is completed.
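Putting the pieces together, one simplified reading of the three-image search loop is sketched below; it re-samples three images per round rather than reusing the previously sharpest one, which is a simplification of the procedure described above, and the helper names are again assumptions. Its output can be handed to the parabola check sketched earlier.

```python
def fine_focus(z0, w1, sigma, move_z, capture, sharpness, w2_ratio=1.0 / 3.0, max_rounds=50):
    """Simplified fine-focus loop: acquire three images a step apart, compare their
    definitions Foc1, Foc2, Foc3, and narrow the search with the smaller step w2."""
    w2 = w1 * w2_ratio          # second focusing step length, e.g. one third of w1
    step, z = w1, z0
    for _ in range(max_rounds):
        zs = [z, z + step, z + 2 * step]
        focs = []
        for zi in zs:
            move_z(zi)
            focs.append(sharpness(capture()))
        f1, f2, f3 = focs
        if f1 < f2 > f3:                        # peak bracketed
            if abs(f1 - f3) <= sigma:           # interval small enough:
                return zs, focs                 # hand over to the least-squares check
            step = -w2 if step > 0 else w2      # reverse with the smaller step w2
            z = zs[2]
        elif f1 < f2 < f3:                      # definition still rising: keep going
            z = zs[2]
        else:                                   # definition falling: reverse the search
            step = -step
            z = zs[0]
    return zs, focs                             # give up after max_rounds rounds
```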
The automatic focusing method based on optical image definition can not only realize automatic stepping scanning of the target object but also perform automatic focusing for each field of view. The invention sets row-by-row and column-by-column scanning on the X and Y axes of the microscope scanning platform, designs and implements an automatic focusing algorithm on the Z axis, improves the algorithm in two respects, namely the image definition calculation method and the focusing plane search algorithm, and thus improves focusing accuracy and efficiency.
It should be understood that, although the steps in the flowchart of fig. 2 are shown in the order indicated by the arrows, they are not necessarily performed in that order. Unless explicitly stated otherwise herein, these steps are not strictly limited to the order shown and may be performed in other orders. Moreover, at least a portion of the steps in fig. 2 may include multiple sub-steps or stages that are not necessarily performed at the same time but may be performed at different times, and the order of their performance is not necessarily sequential; they may be performed in turn or alternately with other steps or with at least a portion of the sub-steps or stages of other steps.
In one embodiment, as shown in fig. 4, an optical image sharpness based automatic focusing apparatus is provided, and the optical image sharpness based automatic focusing apparatus corresponds to the optical image sharpness based automatic focusing method in the above embodiment one to one. The automatic focusing device based on optical image definition comprises:
an initialization module 402 initializes the position of the objective lens and the stage.
And the acquisition module 404 is configured to adjust the focus distance up or down along the z-axis by a set step size, and continuously acquire 3 optically focused images.
And a calculating module 406 for calculating the sharpness of each optically focused image.
And the adjusting module 408 is used for comparing the definition and adjusting the focusing distance according to the comparison result until the definition of the optical focusing image meets the set requirement.
Further, the adjusting module 408 includes:
a comparison submodule for comparing the obtained definitions, wherein the definitions are those of the 3 optical focusing images continuously acquired by adjusting the focusing distance up or down along the z axis according to the first focusing step length, recorded as Foc1, Foc2 and Foc3;
The setting submodule is used for setting a derivative judgment threshold value and a focusing interval threshold value;
a first processing submodule for, when Foc1 < Foc2 and Foc2 > Foc3 and |Foc1 - Foc3| is greater than the focusing interval threshold, carrying out reverse focusing in the focusing interval [Foc1, Foc3] according to a second focusing step length and continuously acquiring 2 optical focusing images, wherein the second focusing step length is smaller than the first focusing step length;
an updating submodule for comparing the definitions of the 2 newly acquired optical focusing images with that of the clearest optical focusing image acquired last time, and repeating the operations of adjusting the focusing distance and acquiring optical focusing images according to the definition comparison result until |Foc1 - Foc3| is not greater than the focusing interval threshold;
a fitting submodule for, when |Foc1 - Foc3| is not greater than the focusing interval threshold, fitting a focusing function by the least squares method, solving the plane Pc to be focused, and obtaining Foc'm and Foc'n at set distances above and below the plane Pc respectively;
a first judgment submodule for, when Foc'm is greater than the derivative judgment threshold and Foc'n is greater than the derivative judgment threshold, recording the plane to be focused as the optimal focusing plane and finishing focusing;
and a second judgment submodule for, otherwise, re-executing the operation of adjusting the focusing distance according to the first focusing step length until the derivative judgment threshold is satisfied, and finishing focusing.
Further, the adjusting module 408 further includes:
a second processing submodule for, if Foc1 < Foc2 < Foc3 or Foc1 > Foc2 and Foc2 < Foc3, determining that the current search direction is correct and repeating the operations of acquiring optical focusing images with the focusing distance adjusted according to the first focusing step length and calculating the definition of each optical focusing image until Foc1 < Foc2 and Foc2 > Foc3;
a third processing submodule for, if Foc1 > Foc2 < Foc3, adjusting the focusing distance in the reverse direction according to the first focusing step length and repeating the operations of acquiring optical focusing images and calculating the definition until Foc1 < Foc2 and Foc2 > Foc3.
Further, the calculation module 406 includes:
the first denoising submodule is used for filtering noise points in the optical focusing image through Retinex;
an edge extraction submodule for carrying out edge extraction on the optical focusing image after the noise points are filtered, through the lifting wavelet transform, to obtain a first edge image based on pixel mutation points, wherein a pixel mutation point refers to a pixel point with the maximum gradient change of the gray value;
the edge detection submodule is used for carrying out edge detection on the first edge image through morphology to obtain a plurality of second edge images;
and the normalization submodule is used for normalizing the second edge images and weighting the normalized second edge images to obtain the definition of the optical focusing image.
Further, the calculating module 406 further includes:
the second denoising submodule is used for filtering noise points in the optical focusing image through Gaussian filtering;
the detection submodule is used for carrying out edge detection on the optical focusing image after the noise point is filtered through a Laplace algorithm to obtain a third edge image;
and the calculation submodule is used for calculating the local variance information of the third edge image as the definition of the optical focusing image.
The automatic focusing device based on the optical image definition not only can realize automatic step scanning of a target object, but also can automatically focus each visual field. The invention sets line-by-line and column-by-column scanning on the X axis and the Y axis of the microscope scanning platform, designs and realizes the automatic focusing algorithm on the Z axis, improves the automatic focusing algorithm according to two aspects of different image definition calculation methods and focusing plane search algorithms, and improves the focusing accuracy and efficiency.
In one embodiment, a computer device is provided, which may be a terminal, and its internal structure diagram may be as shown in fig. 5. The computer device includes a processor, a memory, a network interface, a display screen, and an input device connected by a system bus. Wherein the processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device comprises a nonvolatile storage medium and an internal memory. The non-volatile storage medium stores an operating system and a computer program. The internal memory provides an environment for the operation of an operating system and computer programs in the non-volatile storage medium. The network interface of the computer device is used for communicating with an external terminal through a network connection. The computer program is executed by a processor to implement an auto-focusing method based on optical image sharpness. The display screen of the computer equipment can be a liquid crystal display screen or an electronic ink display screen, and the input device of the computer equipment can be a touch layer covered on the display screen, a key, a track ball or a touch pad arranged on the shell of the computer equipment, an external keyboard, a touch pad or a mouse and the like.
Those skilled in the art will appreciate that the architecture shown in fig. 5 is merely a block diagram of some of the structures associated with the disclosed aspects and is not intended to limit the computing devices to which the disclosed aspects apply, as particular computing devices may include more or less components than those shown, or may combine certain components, or have a different arrangement of components.
In one embodiment, a computer-readable storage medium is provided, on which a computer program is stored, and the computer program when executed by a processor implements the steps of the optical image sharpness based auto-focusing method in the above-described embodiments, such as the steps 202 to 208 shown in fig. 2, or the processor implements the functions of the modules/units of the optical image sharpness based auto-focusing apparatus in the above-described embodiments, such as the functions of the modules 402 to 408 shown in fig. 4.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by hardware instructions of a computer program, which can be stored in a non-volatile computer-readable storage medium, and when executed, can include the processes of the embodiments of the methods described above. Any reference to memory, storage, database, or other medium used in the embodiments provided herein may include non-volatile and/or volatile memory, among others. Non-volatile memory can include read-only memory (ROM), Programmable ROM (PROM), Electrically Programmable ROM (EPROM), Electrically Erasable Programmable ROM (EEPROM), or flash memory. Volatile memory can include Random Access Memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in a variety of forms such as Static RAM (SRAM), Dynamic RAM (DRAM), Synchronous DRAM (SDRAM), Double Data Rate SDRAM (DDRSDRAM), Enhanced SDRAM (ESDRAM), Synchronous Link DRAM (SLDRAM), Rambus Direct RAM (RDRAM), direct bus dynamic RAM (DRDRAM), and memory bus dynamic RAM (RDRAM).
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned function distribution may be performed by different functional units and modules according to needs, that is, the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-mentioned functions.
The technical features of the above embodiments can be arbitrarily combined, and for the sake of brevity, all possible combinations of the technical features in the above embodiments are not described, but should be considered as the scope of the present specification as long as there is no contradiction between the combinations of the technical features.
The above-mentioned embodiments only express several embodiments of the present application, and the description thereof is more specific and detailed, but not construed as limiting the scope of the invention. It should be noted that, for those skilled in the art, without departing from the spirit and scope of the present invention, several changes, modifications and equivalent substitutions of some technical features may be made, and these changes or substitutions do not make the essence of the same technical solution depart from the spirit and scope of the technical solution of the embodiments of the present invention. Therefore, the protection scope of the present patent shall be subject to the appended claims.

Claims (10)

1. An auto-focusing method based on optical image sharpness, the method comprising:
initializing the positions of an objective lens and an objective table;
adjusting the focusing distance up or down along the z axis by a set step length, and continuously collecting 3 optical focusing images;
calculating the definition of each optical focusing image;
and comparing the definition, and adjusting the focusing distance according to the comparison result until the definition of the optical focusing image meets the set requirement.
2. The method of claim 1, wherein the comparing the sharpness and adjusting the focus distance according to the comparison until the sharpness of the optically focused image meets a set requirement comprises:
comparing the obtained definition, wherein the definition is adjusted up or down along the z-axis according to the first focusing step lengthThe clarity grade of 3 optically focused images continuously acquired by the focusing distance is recorded as Foc1、Foc2、Foc3
Setting a derivative judgment threshold value and a focusing interval threshold value;
when Foc1 < Foc2 and Foc2 > Foc3 and |Foc1 - Foc3| is greater than the focusing interval threshold, carrying out reverse focusing in the focusing interval [Foc1, Foc3] according to a second focusing step length, and continuously collecting 2 optical focusing images, wherein the second focusing step length is smaller than the first focusing step length;
carrying out definition comparison between the definitions of the 2 newly acquired optical focusing images and the clearest optical focusing image acquired last time, and repeating the operations of adjusting the focusing distance and acquiring optical focusing images according to the definition comparison result until |Foc1 - Foc3| is not greater than the focusing interval threshold;
when |Foc1 - Foc3| is not greater than the focusing interval threshold, fitting a focusing function by a least squares method, solving a plane Pc to be focused, and obtaining Foc'm and Foc'n at set distances above and below the plane Pc respectively;
when Foc'm is greater than the derivative judgment threshold and Foc'n is greater than the derivative judgment threshold, recording the plane to be focused as the optimal focusing plane, and finishing focusing;
otherwise, the operation of adjusting the focusing distance according to the first focusing step length is executed again until a derivative judgment threshold value is met, and focusing is finished.
3. The method of claim 2, wherein before the step of, when Foc1 > Foc2 and Foc2 < Foc3 and |Foc1 - Foc3| is greater than the focusing interval threshold, carrying out reverse focusing in the focusing interval [Foc1, Foc3] according to the second focusing step length, the method further comprises:
if Foc1<Foc2<Foc3Or Foc1>Foc2And Foc2<Foc3If the current searching direction is correct, the operations of collecting the optical focusing images and calculating the definition of each optical focusing image according to the focusing distance adjusted by the first focusing step length are repeated until Foc1<Foc2And Foc2>Foc3
if Foc1 > Foc2 < Foc3, adjusting the focusing distance in the reverse direction according to the first focusing step length, and repeating the operations of collecting optical focusing images and calculating the definition until Foc1 < Foc2 and Foc2 > Foc3.
4. The method of any of claims 2-3, wherein the second focus step is one third of the first focus step.
5. The method of claim 1, wherein said separately calculating the sharpness of each of said optically focused images comprises:
filtering noise points in the optical focusing image through Retinex;
performing edge extraction on the optical focusing image after the noise points are filtered through lifting wavelet transformation to obtain a first edge image based on pixel mutation points, wherein the pixel mutation points refer to pixel points with maximum gray value gradient change;
performing edge detection on the first edge image through morphology to obtain a plurality of second edge images;
normalizing the second edge image, and performing weighting processing on the normalized second edge image to obtain the definition of the optical focusing image.
6. The method according to claim 5, wherein the normalizing the second edge image and weighting the normalized second edge image to obtain the definition of the optical focusing image comprises:
normalizing the second edge image to an edge matrix and according to a formula:
C = (k1A'1 + k2A'2 + … + knA'n)/n
weighting the edge matrix according to the above formula to obtain the definition, wherein C refers to the definition, k1, k2, …, kn refer to the edge matrix weights, A'1, A'2, …, A'n refer to the edge matrices, and n refers to the number of second edge images obtained.
7. The method of claim 1, wherein said calculating the sharpness of each of said optically focused images comprises:
filtering out noise points in the optical focusing image through Gaussian filtering;
performing edge detection on the optical focusing image after the noise points are filtered through a Laplace algorithm to obtain a third edge image;
and calculating the local variance information of the third edge image as the definition of the optical focusing image.
8. An automatic focusing apparatus based on optical image sharpness, comprising:
the initial module is used for initializing the positions of the objective lens and the objective table;
the acquisition module is used for adjusting the focusing distance up or down along the z axis by a set step length and continuously acquiring 3 optical focusing images;
the calculating module is used for calculating the definition of each optical focusing image;
and the adjusting module is used for comparing the definition and adjusting the focusing distance according to the comparison result until the definition of the optical focusing image meets the set requirement.
9. A computer device comprising a memory and a processor, the memory storing a computer program, wherein the processor implements the steps of the method of any one of claims 1 to 7 when executing the computer program.
10. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the method of any one of claims 1 to 7.
CN202010611672.1A 2020-06-29 2020-06-29 Automatic focusing method based on optical image definition and related equipment Pending CN113933981A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010611672.1A CN113933981A (en) 2020-06-29 2020-06-29 Automatic focusing method based on optical image definition and related equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010611672.1A CN113933981A (en) 2020-06-29 2020-06-29 Automatic focusing method based on optical image definition and related equipment

Publications (1)

Publication Number Publication Date
CN113933981A (en) 2022-01-14

Family

ID=79272890

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010611672.1A Pending CN113933981A (en) 2020-06-29 2020-06-29 Automatic focusing method based on optical image definition and related equipment

Country Status (1)

Country Link
CN (1) CN113933981A (en)

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110996002A (en) * 2019-12-16 2020-04-10 深圳大学 Microscope focusing method, device, computer equipment and storage medium

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
卢东兴 et al., "Judgment and analysis of optical image sharpness based on edge features", Laser Journal (激光杂志), vol. 37, no. 7, 31 December 2016, pages 23-26 *
周佳琳, "Cervical cell classification and detection system based on an improved SSD network", China Masters' Theses Full-text Database, Medicine and Health Sciences (中国优秀硕士学位论文全文数据库 医药卫生科技辑), no. 1, 15 January 2020, pages 9-18 *
韦玉科; 李江平; 段仰广; 卢博生, "An automatic focusing algorithm for tongue image acquisition based on image processing", Journal of Shandong University (Engineering Science) (山东大学学报(工学版)), vol. 41, no. 04, 16 August 2011

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114967093A (en) * 2022-06-16 2022-08-30 华东师范大学 Automatic focusing method and system based on microscopic hyperspectral imaging platform
CN114967093B (en) * 2022-06-16 2023-05-12 华东师范大学 Automatic focusing method and system based on microscopic hyperspectral imaging platform
CN117078662A (en) * 2023-10-11 2023-11-17 杭州睿影科技有限公司 Detection method and device for laminated battery, image processing equipment and medium

Similar Documents

Publication Publication Date Title
EP3968280A1 (en) Target tracking method and apparatus, storage medium and electronic device
CN111507333B (en) Image correction method and device, electronic equipment and storage medium
Huang et al. Joint blur kernel estimation and CNN for blind image restoration
CN111626933A (en) Accurate and rapid microscopic image splicing method and system
CN113933981A (en) Automatic focusing method based on optical image definition and related equipment
Ali et al. Robust focus volume regularization in shape from focus
CN111931641A (en) Pedestrian re-identification method based on weight diversity regularization and application thereof
Nayef et al. Metric-based no-reference quality assessment of heterogeneous document images
CN114390201A (en) Focusing method and device thereof
Ali et al. Energy minimization for image focus volume in shape from focus
Anand et al. Pore detection in high-resolution fingerprint images using deep residual network
CN108647605B (en) Human eye gaze point extraction method combining global color and local structural features
CN108876776B (en) Classification model generation method, fundus image classification method and device
Lee et al. Optimizing image focus for 3D shape recovery through genetic algorithm
CN110120009B (en) Background blurring implementation method based on salient object detection and depth estimation algorithm
Lin et al. Hierarchical complementary residual attention learning for defocus blur detection
Lin et al. Defocus blur parameters identification by histogram matching
Qu et al. Method of feature pyramid and attention enhancement network for pavement crack detection
Zhang et al. Coarse-to-fine multiscale fusion network for single image deraining
Sang et al. MoNET: no-reference image quality assessment based on a multi-depth output network
Huang et al. A no reference image quality assessment method based on RepVGG
Chen et al. Autofocus window selection algorithm based on saliency detection
Tang et al. Joint enhancement and denoising method using non-subsampled shearlet transform for low-light images
Liu et al. Crowd counting via an inverse attention residual network
CN113869363B (en) Mountain climbing focusing searching method based on image evaluation network and image evaluation function

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination