CN112541939A - Method and device for obtaining position of clear focal plane of sample

Info

Publication number
CN112541939A
Authority
CN
China
Prior art keywords
image
definition
images
target object
sample
Prior art date
Legal status
Granted
Application number
CN202011551318.0A
Other languages
Chinese (zh)
Other versions
CN112541939B (en)
Inventor
侯剑平 (Hou Jianping)
王超 (Wang Chao)
赵万里 (Zhao Wanli)
王聪 (Wang Cong)
刘聪 (Liu Cong)
Current Assignee
Autobio Experimental Instrument Zhengzhou Co Ltd
Original Assignee
Autobio Experimental Instrument Zhengzhou Co Ltd
Priority date
Filing date
Publication date
Application filed by Autobio Experimental Instrument Zhengzhou Co Ltd filed Critical Autobio Experimental Instrument Zhengzhou Co Ltd
Priority to CN202011551318.0A
Publication of CN112541939A
Application granted
Publication of CN112541939B
Legal status: Active (current)

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G06T7/30: Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T7/33: Determination of transform parameters for the alignment of images, i.e. image registration, using feature-based methods
    • G06T7/10: Segmentation; Edge detection
    • G06T7/13: Edge detection
    • G06T7/70: Determining position or orientation of objects or cameras
    • G06T7/73: Determining position or orientation of objects or cameras using feature-based methods
    • G06T2207/00: Indexing scheme for image analysis or image enhancement
    • G06T2207/10: Image acquisition modality
    • G06T2207/10056: Microscopic image
    • G06T2207/30: Subject of image; Context of image processing
    • G06T2207/30004: Biomedical image processing
    • G06T2207/30024: Cell structures in vitro; Tissue sections in vitro

Abstract

The invention discloses a method and a device for acquiring the clear focal plane position of a sample, which enable the position of the clear focal plane of the sample to be found automatically.

Description

Method and device for obtaining position of clear focal plane of sample
Technical Field
The invention relates to the technical field of optical imaging, in particular to a method and a device for acquiring a clear focal plane position of a sample.
Background
In the medical field, a collected sample usually needs to be observed through a microscope, and a diagnosis or a test result is given according to the condition of the sample. In the prior art, the microscope is operated manually: by observing the images formed while the microscope is moved, the operator finds the focal plane position at which the degree of definition of the pathological criterion object in the microscopic image meets the requirement. However, this approach is highly subjective, depends on manual experience, and involves a heavy workload.
Disclosure of Invention
The invention aims to provide a method and a device for acquiring a clear focal plane position of a sample, which are used for automatically finding out the position of the clear focal plane of the sample.
In order to achieve the purpose, the invention provides the following technical scheme:
a method for acquiring a sample clear focal plane position comprises the following steps:
controlling the distance from a microscope lens to a sample to be different distance values, and acquiring an image of the sample through the microscope at each distance value;
and traversing the image corresponding to each distance value, selecting the image with the highest first definition from the images corresponding to the distance values, and outputting the distance value corresponding to the image, wherein the first definition represents the definition degree of the target object image in the image.
Preferably, traversing the image corresponding to each distance value, and selecting the image with the highest first definition from the images corresponding to the distance values includes: and sequentially comparing the first definition of the images corresponding to the two adjacent distance values, and selecting the image with the first definition larger than that of the image corresponding to the two adjacent distance values.
Preferably, comparing the first definitions of the images corresponding to two adjacent distance values in sequence, and selecting the image with the first definition greater than the first definition of the image corresponding to two adjacent distance values includes:
for the images corresponding to the two adjacent distance values, screening out matched target object images from the images corresponding to the two adjacent distance values;
and for each image of the images corresponding to the two adjacent distance values, obtaining the first definition of the image according to a matching target object image included in the image, wherein the matching target object image is a target object image corresponding to the same target object in the two images.
Preferably, for two adjacent distance value corresponding images, the step of screening out a matching target object image from the two adjacent distance value corresponding images includes:
for the images corresponding to the two adjacent distance values, finding out a target object image which is closest to the target object image in the other frame of image from one frame of image;
and judging whether the two target images are matched target images or not according to the characteristic data of the target image found in one frame of image and the characteristic data of the corresponding target image in the other frame of image.
Preferably, the feature data of the target object image includes: the distance between the central pixels of the two target object images, the area of the target object images, the height or width of the target object images, the average value of the pixel values of the target object images or the definition of the target object images.
Preferably, traversing the image corresponding to each distance value, and selecting the image with the highest first definition from the images corresponding to the distance values includes:
traversing the image corresponding to each distance value, and selecting an image with the highest second definition from the images corresponding to the distance values, wherein the second definition represents the definition of the image;
and sequentially comparing the first definition of the images corresponding to the two adjacent distance values from the image with the highest second definition, and selecting the image with the first definition which is greater than the first definition of the image corresponding to the two adjacent distance values.
Preferably, the detecting of the target object image from the image includes:
carrying out edge detection on the image, and detecting a closed contour from the image;
and judging whether the area surrounded by the closed contour corresponds to the target object image or not according to the characteristic data of the closed contour.
Preferably, the characteristic data of the closed contour includes a contour roundness, which is calculated according to the following formula:
[Roundness formulas not reproduced: they appear only as images in the original document.]
where p denotes the center pixel of the contour, p_i denotes any pixel on the contour, and F denotes the area of the region enclosed by the contour.
Preferably, the first definition of the image is a sum of definitions of target images included in the image, and the definition of the target image is calculated according to the following formula:
[Formula image not reproduced: the per-pixel response d(x, y) is defined only as an image in the original document.]
S1 = ∑∑ d(x, y) / k1
where S1 denotes the definition of a target object image in the image, g(x, y) denotes the pixel value at pixel (x, y), and k1 denotes the number of pixels contained in the target object image.
A sample clear focal plane position acquisition device is used for executing the sample clear focal plane position acquisition method.
According to the above technical scheme, in the method and the device for acquiring the clear focal plane position of a sample provided by the invention, the distance from the microscope lens to the sample is controlled to take different distance values, and an image of the sample is acquired through the microscope at each distance value; the images corresponding to the distance values are then traversed, the image with the highest first definition is selected from them, and the distance value corresponding to that image is output, the first definition characterizing the degree of definition of the target object image in the image. The method and the device thus find the position of the clear focal plane of the sample automatically.
Drawings
In order to illustrate the embodiments of the present invention or the technical solutions in the prior art more clearly, the drawings used in the description of the embodiments or the prior art are briefly introduced below. It is obvious that the drawings in the following description show only some embodiments of the present invention, and that those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 is a flowchart of a method for obtaining a clear focal plane position of a sample according to an embodiment of the present invention;
fig. 2 is a flowchart of a method for comparing a first sharpness of an image corresponding to two adjacent distance values according to an embodiment of the present invention.
Detailed Description
In order to make those skilled in the art better understand the technical solution of the present invention, the technical solution in the embodiments of the present invention is described clearly and completely below with reference to the drawings in the embodiments of the present invention. It is obvious that the described embodiments are only a part of the embodiments of the present invention, not all of them. All other embodiments obtained by a person skilled in the art on the basis of the embodiments of the present invention without creative effort shall fall within the protection scope of the present invention.
Referring to fig. 1, fig. 1 is a flowchart of a method for obtaining a sample clear focal plane position according to an embodiment of the present invention, and it can be seen that the method for obtaining a sample clear focal plane position includes the following steps:
s10: controlling the distance from a microscope lens to a sample to be different distance values, and acquiring an image of the sample through the microscope at each distance value.
The microscope lens is controlled to move so that the distance from the microscope lens to the sample takes different distance values, and an image of the sample is acquired through the microscope at each distance value, so that an image corresponding to each distance value is obtained.
S11: and traversing the image corresponding to each distance value, selecting the image with the highest first definition from the images corresponding to the distance values, and outputting the distance value corresponding to the image, wherein the first definition represents the definition degree of the target object image in the image.
For each obtained image corresponding to a distance value, the first definition of the image is obtained according to the target object image in the image. By comparing the first definitions of the images, the image with the highest first definition is selected from them, and the distance value corresponding to that image is output. The position of the sample relative to the microscope lens at this distance value is the position of the clear focal plane of the sample.
The method for acquiring the position of the clear focal plane of the sample of this embodiment finds the position of the clear focal plane of the sample automatically. Compared with the prior art, manual operation is no longer required, and the problems of strong subjectivity, dependence on manual experience and heavy workload are solved.
The method for acquiring the clear focal plane position of the sample is described in detail below with reference to specific embodiments. The method for acquiring the position of the clear focal plane of the sample comprises the following steps:
s10: controlling the distance from a microscope lens to a sample to be different distance values, and acquiring an image of the sample through the microscope at each distance value.
Preferably, the microscope lens is controlled to move in one direction, from a position farther from the sample to a position closer to the sample, and an image of the sample is acquired through the microscope at each of the different distance values. In practical application, the microscope lens can be driven by a pulse sequence; each pulse corresponds to one distance value and one acquired frame, so that a series of frames respectively corresponding to the different distance values is obtained.
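As an illustration of this acquisition loop, the following is a minimal Python sketch. The `stage` and `camera` objects and their `move_pulses()`, `position()` and `grab()` methods are hypothetical placeholders; the patent does not name any concrete motion-control or camera interface.

```python
# Minimal sketch of the acquisition step S10. The stage/camera objects and
# their methods are hypothetical placeholders, not part of the patent.
def acquire_focus_stack(stage, camera, n_steps, step_pulses=1):
    """Drive the lens through n_steps pulse-controlled positions and grab
    one frame per position; returns (distance_values, frames)."""
    distance_values, frames = [], []
    for _ in range(n_steps):
        stage.move_pulses(step_pulses)            # hypothetical: advance the lens by one pulse step
        distance_values.append(stage.position())  # hypothetical: current lens-to-sample distance
        frames.append(camera.grab())              # hypothetical: acquire one frame at this distance
    return distance_values, frames
```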
S11: and traversing the image corresponding to each distance value, selecting the image with the highest first definition from the images corresponding to the distance values, and outputting the distance value corresponding to the image.
The first sharpness characterizes a degree of sharpness of an image of the object in the image. First, for each image obtained, it is necessary to detect a target object image from the image. Optionally, the target object image may be detected from the image through the following processes, specifically including: and carrying out edge detection on the image, detecting a closed contour from the image, and judging whether an area surrounded by the closed contour corresponds to the target object image or not according to the characteristic data of the closed contour.
For example, if the target object for observation is a cell, such as a white blood cell, it can be determined whether the region surrounded by the contour corresponds to the target object image according to the roundness of the contour. Specifically, the roundness of the profile can be calculated according to the following formula:
[Roundness formulas not reproduced: they appear only as images in the original document.]
where p denotes the center pixel of the contour, p_i denotes any pixel on the contour, and F denotes the area of the region enclosed by the contour.
Optionally, whether the region surrounded by the contour corresponds to the target object image may also be determined according to the aspect ratio of the contour. Specifically, the aspect ratio of the profile can be calculated according to the following formula:
Obj_HWRatio = Obj_boundingboxH / Obj_boundingboxW
where Obj_HWRatio denotes the aspect ratio of the contour, Obj_boundingboxH denotes the length (height) of the contour's bounding box, and Obj_boundingboxW denotes its width.
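The contour detection and feature extraction described above might look roughly as follows in Python with OpenCV. Because the patent's roundness formula is reproduced only as an image, the common measure 4πF / perimeter² is substituted here as a stand-in; the aspect ratio follows Obj_HWRatio, and the dictionary layout is an assumption for illustration.

```python
# Sketch of closed-contour detection and contour feature extraction,
# assuming OpenCV 4.x and a grayscale input image. The roundness below is
# a stand-in measure (4*pi*F / perimeter^2), not the patent's image-only formula.
import cv2
import numpy as np

def detect_candidate_objects(gray):
    edges = cv2.Canny(gray, 50, 150)                  # edge detection
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    candidates = []
    for cnt in contours:
        area = cv2.contourArea(cnt)                   # F: area enclosed by the contour
        perimeter = cv2.arcLength(cnt, True)
        if perimeter == 0:
            continue
        roundness = 4.0 * np.pi * area / perimeter ** 2   # stand-in roundness measure
        x, y, w, h = cv2.boundingRect(cnt)
        aspect_ratio = h / w if w else 0.0            # bounding-box height over width
        candidates.append({
            "contour": cnt,
            "area": area,
            "roundness": roundness,
            "aspect_ratio": aspect_ratio,
            "bbox": (x, y, w, h),
            "center": (x + w / 2.0, y + h / 2.0),     # used later for matching
        })
    return candidates
```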
Optionally, for each image corresponding to the distance value, the first definitions of the images corresponding to two adjacent distance values may be sequentially compared, and the image with the first definition greater than the first definitions of the images corresponding to two adjacent distance values is selected as the image with the highest first definition.
For example, assume that N different distance values are selected while the microscope lens is controlled to move toward the sample, and N frames of images are obtained correspondingly. If the first definition of the i-th frame image is greater than that of the (i-1)-th frame image and also greater than that of the (i+1)-th frame image, the i-th frame image is judged to be the image with the highest first definition, and the distance value corresponding to the i-th frame image is output. If the first definition of the i-th frame image does not meet this requirement, the comparison continues: the first definition of the (i-1)-th frame image is compared with that of its next adjacent frame, and the first definition of the (i+1)-th frame image is compared with that of its next adjacent frame, and so on, until an image is found whose first definition is greater than the first definitions of the images corresponding to its two adjacent distance values. Alternatively, if the first definition of the 1st frame image is greater than that of the other frame images, the distance value corresponding to the 1st frame image is output; or, if the first definition of the N-th frame image is greater than that of the other frame images, the distance value corresponding to the N-th frame image is output.
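A minimal sketch of this adjacent-frame comparison follows, assuming the first definition of every frame is already available in a list `scores` aligned with `distance_values`; precomputing a per-frame score is a simplification of the pairwise, matching-based computation described below.

```python
# Minimal sketch of the adjacent-frame peak search described above.
def select_sharpest_distance(distance_values, scores):
    n = len(scores)
    if n == 1:
        return distance_values[0]
    # look for a frame whose first definition exceeds both of its neighbours
    for i in range(1, n - 1):
        if scores[i] > scores[i - 1] and scores[i] > scores[i + 1]:
            return distance_values[i]
    # boundary cases described above: the first or last frame may be sharpest
    if scores[0] >= max(scores[1:]):
        return distance_values[0]
    if scores[-1] >= max(scores[:-1]):
        return distance_values[-1]
    return distance_values[scores.index(max(scores))]  # fallback for plateaus
```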
Preferably, the first definitions of the images corresponding to two adjacent distance values may be compared according to the following process, and an image with a first definition greater than the first definitions of the images corresponding to two adjacent distance values is selected, please refer to fig. 2, which specifically includes the following steps:
s110: and screening out a matched target object image from the images corresponding to the two adjacent distance values.
The matching target object image is a target object image corresponding to the same target object in the two images. Optionally, the matching target object image may be screened out from the images corresponding to two adjacent distance values through the following process, including the following steps:
s1100: and for the images corresponding to the two adjacent distance values, finding out a target object image which is closest to the target object image in the other frame of image from one frame of image.
Optionally, the target object image closest to a given target object image in the other frame image may be found from one frame image according to the Euclidean distance between the two target object images. The distance between a target object image of one frame image and a target object image of another frame image can be calculated according to the following formula:
d = sqrt((p1.x - p2.x)^2 + (p1.y - p2.y)^2)
where d denotes the distance between the two target object images, (p1.x, p1.y) denotes the center pixel position of the target object image in one frame image, and (p2.x, p2.y) denotes the center pixel position of the target object image in the other frame image.
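A sketch of this nearest-object search is given below; it assumes each detected target object is represented by a dictionary with a `center` field holding its center pixel position, as in the earlier detection sketch. This data layout is an illustration, not part of the patent.

```python
# Sketch of step S1100: for a target object detected in one frame, find the
# target object with the smallest Euclidean distance in the other frame.
import math

def nearest_object(obj, other_objects):
    p1x, p1y = obj["center"]
    best, best_d = None, float("inf")
    for cand in other_objects:
        p2x, p2y = cand["center"]
        d = math.sqrt((p1x - p2x) ** 2 + (p1y - p2y) ** 2)  # Euclidean distance d
        if d < best_d:
            best, best_d = cand, d
    return best, best_d
```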
S1101: and judging whether the two target images are matched target images or not according to the characteristic data of the target image found in one frame of image and the characteristic data of the corresponding target image in the other frame of image.
Optionally, the feature data of the target object image may be, but is not limited to, a distance between center pixels of two target object images, an area of the target object image, a height or a width of the target object image, an average value of pixel values of the target object image, or a sharpness of the target object image. The definition of the target object image can be calculated according to the following formula:
[Formula image not reproduced: the per-pixel response d(x, y) is defined only as an image in the original document.]
S1 = ∑∑ d(x, y) / k1
where S1 denotes the definition of the target object image, g(x, y) denotes the pixel value at pixel (x, y), and k1 denotes the number of pixels contained in the target object image.
Conditions can be set for the various feature data to judge whether a contour corresponds to a target object image or whether two target object images correspond to the same target object. For example, if the target object for observation is white blood cells, the threshold for the contour roundness may be set to 0.8; the contour area range may be set to greater than 1000 and less than 4500; the ratio of the short side to the long side of the contour may be set to greater than 80%; and the average pixel value of the region enclosed by the contour may be set to greater than 70.
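Using the example thresholds above, a candidate filter for white blood cells might look like the following sketch; the candidate dictionary layout follows the earlier detection sketch and is an assumption.

```python
# Example filter using the thresholds stated above for white blood cells.
def is_wbc_candidate(cand, mean_pixel_value):
    x, y, w, h = cand["bbox"]
    short_long = min(w, h) / max(w, h) if max(w, h) else 0.0
    return (cand["roundness"] > 0.8         # contour roundness threshold
            and 1000 < cand["area"] < 4500  # contour area range
            and short_long > 0.8            # short side / long side greater than 80%
            and mean_pixel_value > 70)      # mean pixel value inside the contour
```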
During fine-grained focusing, the edge of the target object first becomes sharp and then gradually blurs; while blurring, the edge produces a halo whose gradient is large. In practical application, therefore, the maximum inscribed rectangle of the target object's contour is calculated, and the definition of the target object image is computed over this maximum inscribed rectangle.
S111: and for each image of the images corresponding to the two adjacent distance values, obtaining the first definition of the image according to the matching target object image included in the image.
The first definition of the image is the sum of the definitions of the matching target object images included in the image. For example, if M pairs of matching target object images are tracked across the images corresponding to the two adjacent distance values, then for each of these two images the definition of each of the M matched target object images it contains is calculated, and the sum of these M definitions is taken as the first definition of the image.
In this way, a tracking approach is combined with the comparison of the first definitions of two adjacent images: the first definition of each image is obtained from the target object images matched across the two adjacent images, which helps to improve the accuracy of the processing result.
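A sketch of this first-definition computation under stated assumptions: the per-pixel response d(x, y) is given in the patent only as an image formula, so a simple absolute finite-difference gradient is substituted here, and the object region is approximated by its bounding box rather than the maximum inscribed rectangle described above.

```python
# Sketch of the first definition of a frame as the sum of the definitions
# of its matched target object images. The gradient response is a stand-in
# for the patent's image-only d(x, y) formula.
import numpy as np

def object_definition(gray, bbox):
    x, y, w, h = bbox
    g = gray[y:y + h, x:x + w].astype(np.float64)
    d_sum = np.abs(np.diff(g, axis=0)).sum() + np.abs(np.diff(g, axis=1)).sum()
    k1 = g.size                          # number of pixels in the object image
    return d_sum / k1                    # S1 = sum of d(x, y) over the object / k1

def first_definition(gray, matched_objects):
    return sum(object_definition(gray, obj["bbox"]) for obj in matched_objects)
```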
Further preferably, in the method for obtaining the sample clear focal plane position of this embodiment, when the images corresponding to the distance values are traversed in step S11 and the image with the highest first definition is selected from them, the images corresponding to the distance values may first be traversed and the image with the highest second definition selected from them; then, starting from the image with the highest second definition, the first definitions of the images corresponding to two adjacent distance values are compared in sequence, and the image whose first definition is greater than the first definitions of the images corresponding to its two adjacent distance values is selected. The second definition characterizes the definition of the image as a whole.
According to this approach, coarse-grained focusing is performed first: based on the overall definition of the images, the image with the highest second definition is selected from the images corresponding to the different distance values. Then, starting from the image with the highest second definition, the first definitions of the images corresponding to two adjacent distance values are compared in sequence as fine-grained focusing. This improves both the efficiency and the accuracy of obtaining the clear focal plane position of the sample.
Alternatively, the second sharpness of the image may be calculated according to the following formula:
[Formula image not reproduced: the per-pixel response d(x, y) is defined only as an image in the original document.]
S2 = ∑∑ d(x, y) / k2
where S2 denotes the second definition of the image, g(x, y) denotes the pixel value at pixel (x, y), and k2 denotes the number of pixels contained in the image.
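A sketch of the coarse-focusing stage under the same assumption about d(x, y): the second definition S2 is computed over the whole frame, and the frame with the highest S2 serves as the starting point for the fine-grained, first-definition comparison.

```python
# Sketch of the coarse-focusing stage; the gradient response is the same
# stand-in used earlier for the patent's image-only d(x, y) formula.
import numpy as np

def second_definition(gray):
    g = gray.astype(np.float64)
    d_sum = np.abs(np.diff(g, axis=0)).sum() + np.abs(np.diff(g, axis=1)).sum()
    return d_sum / g.size                # S2 = sum of d(x, y) over the image / k2

def coarse_focus_index(frames):
    scores = [second_definition(f) for f in frames]
    return int(np.argmax(scores))        # index of the frame with the highest S2
```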
The method for obtaining the position of the clear focal plane of the sample finds the position of the clear focal plane of the sample automatically; moreover, it includes a coarse-and-fine two-stage focusing mode, which improves the efficiency and the accuracy of obtaining the clear focal plane position of the sample.
Correspondingly, the embodiment of the invention also provides a device for acquiring the position of the clear focal plane of the sample, which is used for executing the method for acquiring the position of the clear focal plane of the sample.
The sample clear focal plane position acquisition device of this embodiment first controls the distance from the microscope lens to the sample to take different distance values and acquires an image of the sample through the microscope at each distance value; it then traverses the images corresponding to the distance values, selects the image with the highest first definition, and outputs the distance value corresponding to that image, the first definition characterizing the degree of definition of the target object image in the image. The device thus finds the position of the clear focal plane of the sample automatically.
The method and the device for acquiring the clear focal plane position of the sample provided by the invention are described in detail above. The principles and embodiments of the present invention are explained herein using specific examples, which are presented only to assist in understanding the method and its core concepts. It should be noted that, for those skilled in the art, it is possible to make various improvements and modifications to the present invention without departing from the principle of the present invention, and those improvements and modifications also fall within the scope of the claims of the present invention.

Claims (10)

1. A method for acquiring a clear focal plane position of a sample is characterized by comprising the following steps:
controlling the distance from a microscope lens to a sample to be different distance values, and acquiring an image of the sample through the microscope at each distance value;
and traversing the image corresponding to each distance value, selecting the image with the highest first definition from the images corresponding to the distance values, and outputting the distance value corresponding to the image, wherein the first definition represents the definition degree of the target object image in the image.
2. The method for obtaining the position of the clear focal plane of the sample according to claim 1, wherein traversing each image corresponding to the distance value, and selecting the image with the highest first definition from the images corresponding to the distance values comprises: and sequentially comparing the first definition of the images corresponding to the two adjacent distance values, and selecting the image with the first definition larger than that of the image corresponding to the two adjacent distance values.
3. The method for obtaining the position of the clear focal plane of the sample according to claim 2, wherein the first definitions of the images corresponding to two adjacent distance values are sequentially compared, and the selecting of the image with the first definition greater than the first definition of the image corresponding to the two adjacent distance values comprises:
for the images corresponding to the two adjacent distance values, screening out matched target object images from the images corresponding to the two adjacent distance values;
and for each image of the images corresponding to the two adjacent distance values, obtaining the first definition of the image according to a matching target object image included in the image, wherein the matching target object image is a target object image corresponding to the same target object in the two images.
4. The method for acquiring the clear focal plane position of the sample according to claim 3, wherein for the two adjacent distance value corresponding images, the step of screening out the matching target object image from the two adjacent distance value corresponding images comprises:
for the images corresponding to the two adjacent distance values, finding out a target object image which is closest to the target object image in the other frame of image from one frame of image;
and judging whether the two target images are matched target images or not according to the characteristic data of the target image found in one frame of image and the characteristic data of the corresponding target image in the other frame of image.
5. The method according to claim 4, wherein the feature data of the target object image includes: the distance between the central pixels of the two target object images, the area of the target object images, the height or width of the target object images, the average value of the pixel values of the target object images or the definition of the target object images.
6. The method for obtaining the position of the sharp focal plane of the sample according to any one of claims 1 to 5, wherein traversing each image corresponding to the distance value, and selecting the image with the highest first definition from the images corresponding to the distance values comprises:
traversing the image corresponding to each distance value, and selecting an image with the highest second definition from the images corresponding to the distance values, wherein the second definition represents the definition of the image;
and sequentially comparing the first definition of the images corresponding to the two adjacent distance values from the image with the highest second definition, and selecting the image with the first definition which is greater than the first definition of the image corresponding to the two adjacent distance values.
7. The method according to claim 1, wherein detecting a target object image from the image comprises:
carrying out edge detection on the image, and detecting a closed contour from the image;
and judging whether the area surrounded by the closed contour corresponds to the target object image or not according to the characteristic data of the closed contour.
8. The method according to claim 7, wherein the characteristic data of the closed contour includes a contour circularity, which is calculated according to the following formula:
[Roundness formulas not reproduced: they appear only as images in the original document.]
where p denotes the center pixel of the contour, p_i denotes any pixel on the contour, and F denotes the area of the region enclosed by the contour.
9. The method according to claim 1, wherein the first sharpness of the image is a sum of sharpness of target images included in the image, and the sharpness of the target image is calculated according to the following formula:
[Formula image not reproduced: the per-pixel response d(x, y) is defined only as an image in the original document.]
S1 = ∑∑ d(x, y) / k1
where S1 denotes the sharpness of a target object image in the image, g(x, y) denotes the pixel value at pixel (x, y), and k1 denotes the number of pixels contained in the target object image.
10. A sample clear focal plane position acquisition apparatus for executing the sample clear focal plane position acquisition method according to any one of claims 1 to 9.
CN202011551318.0A 2020-12-24 2020-12-24 Method and device for acquiring clear focal plane position of sample Active CN112541939B (en)

Priority Applications (1)

Application Number: CN202011551318.0A
Priority Date: 2020-12-24
Filing Date: 2020-12-24
Title: Method and device for acquiring clear focal plane position of sample
Granted Publication: CN112541939B (en)

Publications (2)

Publication Number Publication Date
CN112541939A true CN112541939A (en) 2021-03-23
CN112541939B CN112541939B (en) 2023-06-27

Family

ID=75017294

Family Applications (1)

Application Number: CN202011551318.0A
Title: Method and device for acquiring clear focal plane position of sample
Priority Date / Filing Date: 2020-12-24
Status: Active, granted as CN112541939B (en)

Country Status (1)

Country Link
CN (1) CN112541939B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108345085A (en) * 2017-01-25 2018-07-31 广州康昕瑞基因健康科技有限公司 Focus method and focusing system
CN109116541A (en) * 2018-09-10 2019-01-01 广州鸿琪光学仪器科技有限公司 Microscope focusing method, device, computer equipment and storage medium
US20190025193A1 (en) * 2017-07-24 2019-01-24 Visiongate, Inc. Apparatus and method for measuring microscopic object velocities in flow
CN110996002A (en) * 2019-12-16 2020-04-10 深圳大学 Microscope focusing method, device, computer equipment and storage medium
CN112099217A (en) * 2020-08-18 2020-12-18 宁波永新光学股份有限公司 Automatic focusing method for microscope

Also Published As

Publication number Publication date
CN112541939B (en) 2023-06-27

Similar Documents

Publication Publication Date Title
US11100634B2 (en) Method and system for imaging a cell sample
US11721018B2 (en) System and method for calculating focus variation for a digital microscope
US10462351B2 (en) Fast auto-focus in imaging
KR101891364B1 (en) Fast auto-focus in microscopic imaging
CN111462076B (en) Full-slice digital pathological image fuzzy region detection method and system
JP5461630B2 (en) Method for determining focus position and vision inspection system
WO2012169088A1 (en) Image processing apparatus, image processing method and image processing system
CN111462075B (en) Rapid refocusing method and system for full-slice digital pathological image fuzzy region
CN110987886B (en) Full-automatic microscopic image fluorescence scanning system
CN115201092B (en) Method and device for acquiring cell scanning image
CN112541939B (en) Method and device for acquiring clear focal plane position of sample
US20170169557A1 (en) Process and device for direct measurements of plant stomata
CN114693912B (en) Endoscopy system having eyeball tracking function, storage medium, and apparatus
CN115861220A (en) Cold-rolled strip steel surface defect detection method and system based on improved SSD algorithm
CN114972084A (en) Image focusing accuracy evaluation method and system
JP3338625B2 (en) Object identification device
CN111612776A (en) Automatic pathological gross specimen size measuring method based on image edge recognition
Rizvandi et al. Machine vision detection of isolated and overlapped nematode worms using skeleton analysis
CN115079396B (en) Chromosome karyotype analysis microscopic shooting device and method based on 100-fold objective lens
CN117608069A (en) Automatic focusing method, system, device and storage medium
CN114998376A (en) Smear sample coating edge scanning defining method and system
Clark et al. Automatic Vision-based Disease Detection in Blood Samples by a Medical Robot

Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination
GR01: Patent grant