CN111242880A - Multi-depth-of-field image superposition method, equipment and medium for microscope - Google Patents

Multi-depth-of-field image superposition method, equipment and medium for microscope

Info

Publication number
CN111242880A
CN111242880A (application CN201911387695.2A)
Authority
CN
China
Prior art keywords
image
feature
depth
images
mapping
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201911387695.2A
Other languages
Chinese (zh)
Other versions
CN111242880B (en)
Inventor
张春旺
曹江中
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangzhou Youai Intelligent Technology Co Ltd
Guangzhou Micro Shot Technology Co ltd
Original Assignee
Guangzhou Youai Intelligent Technology Co Ltd
Guangzhou Micro Shot Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangzhou Youai Intelligent Technology Co Ltd, Guangzhou Micro Shot Technology Co ltd filed Critical Guangzhou Youai Intelligent Technology Co Ltd
Priority to CN201911387695.2A priority Critical patent/CN111242880B/en
Publication of CN111242880A publication Critical patent/CN111242880A/en
Application granted granted Critical
Publication of CN111242880B publication Critical patent/CN111242880B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/50Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10056Microscopic image
    • G06T2207/10061Microscopic image from scanning electron microscope
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20212Image combination
    • G06T2207/20221Image fusion; Image merging

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)
  • Microscopes, Condensers (AREA)

Abstract

The invention provides a multi-depth-of-field image superposition method for a microscope. A plurality of microscopic images of the same sample, acquired by the microscope at different stage heights according to a preset acquisition sequence, are obtained; two microscopic images are screened from the image sequence according to the preset acquisition sequence as superposition operation images, and depth-of-field superposition processing is performed on the two superposition operation images to obtain a depth-of-field superposed image. It is then judged whether the screened image sequence still contains microscopic images: if so, one microscopic image is screened from the remaining microscopic images according to the preset acquisition sequence, this non-primary screened image and the depth-of-field superposed image are taken as the superposition operation images, and the image processing step is executed again; if not, the depth-of-field superposed image is output as the multi-depth-of-field superposition result image. The method makes the whole image superposition process efficient and accurate, and reduces time cost and labor cost.

Description

Multi-depth-of-field image superposition method, equipment and medium for microscope
Technical Field
The present invention relates to the field of image processing, and in particular, to a method, an apparatus, and a medium for superimposing multiple depth-of-field images for a microscope.
Background
In the field of medical detection, biological section samples processed by smearing, spreading, grinding and the like are sheet-like, yet they still have a small physical thickness, so not all sample details in the field of view can be presented clearly at a single depth of field of a microscope. At present, image superposition of multi-depth-of-field microscope images is mainly realized in the industry through algorithms such as image target extraction, image definition evaluation and image fusion, including variable-step hill-climbing search methods that require the stage to be highly flat, pixel-point fusion methods with low superposition efficiency, block fusion methods with a coarse superposition effect, and stage-flatness learning superposition methods that need complicated preparation work before use. In these multi-depth-of-field image superposition methods, image superposition efficiency and superposition effect are mutually exclusive, and the complexity of the preparation work is increased to balance the two, wasting debugging time and labor cost.
Disclosure of Invention
In order to overcome the defects of the prior art, one object of the present invention is to provide a multi-depth-of-field image superposition method for a microscope, which addresses the problems that, in conventional multi-depth-of-field image superposition methods, image superposition efficiency and superposition effect are mutually exclusive, the complexity of the preparation work is often increased to balance the two, and debugging time and labor cost are wasted.
The second object of the present invention is to provide an electronic device that addresses the same problems: in conventional multi-depth-of-field image superposition methods, superposition efficiency and superposition effect are mutually exclusive, and the complexity of the preparation work is often increased to balance the two, wasting debugging time and labor cost.
The invention also aims to provide a computer-readable storage medium that addresses the same problems: in traditional multi-depth-of-field image superposition methods, superposition efficiency and superposition effect are mutually exclusive, and the complexity of the preparation work is increased to balance them, wasting debugging time and labor cost.
One of the purposes of the invention is realized by adopting the following technical scheme:
a multi-depth-of-field image superimposing method for a microscope, comprising the steps of:
acquiring an image sequence, namely obtaining a plurality of microscopic images of the same sample acquired by the microscope at different stage heights according to a preset acquisition sequence, so as to obtain an image sequence containing the plurality of microscopic images;
primarily screening images, namely screening two microscopic images from the image sequence according to the preset acquisition sequence as superposition operation images, and taking the image sequence as the screened image sequence;
image processing, namely performing depth-of-field superposition processing on the two superposition operation images to obtain a depth-of-field superposed image;
image judging, namely judging whether the screened image sequence still contains microscopic images; if so, executing the image re-screening step, and if not, outputting the depth-of-field superposed image as the multi-depth-of-field superposition result image;
and image re-screening, namely screening one microscopic image from the remaining microscopic images in the screened image sequence according to the preset acquisition sequence as a non-primary screened image, taking both the non-primary screened image and the depth-of-field superposed image as superposition operation images, and returning to the image processing step (a minimal sketch of this loop follows these steps).
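For illustration only, the following is a minimal Python sketch of the loop described above, assuming a fuse_pair function that implements the image processing step detailed later; the function and variable names are illustrative and do not come from the patent.

```python
from typing import Callable, List
import numpy as np

def stack_depth_of_field(images: List[np.ndarray],
                         fuse_pair: Callable[[np.ndarray, np.ndarray], np.ndarray]) -> np.ndarray:
    """Fold a height-ordered image sequence into one multi-depth-of-field result."""
    if len(images) < 2:
        raise ValueError("at least two microscopic images are required")
    # Primary screening: the first two images in the preset acquisition order.
    result = fuse_pair(images[0], images[1])
    # Image judging / re-screening: while images remain, superpose the next
    # one with the running depth-of-field superposed image.
    for img in images[2:]:
        result = fuse_pair(result, img)
    return result
```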
Further, the image processing comprises the following sub-steps:
edge feature extraction processing, namely performing edge feature extraction on the two superposition operation images to obtain corresponding feature point position information, and generating two edge feature mapping images corresponding to the superposition operation images from the feature point position information;
mean filtering processing, namely performing mean filtering on the two edge feature mapping images according to preset filtering templates to obtain two feature mean mapping image groups corresponding to the two edge feature mapping images, wherein the preset filtering templates comprise a preset first filtering template, a preset second filtering template and a preset third filtering template, and each feature mean mapping image group comprises a first feature mean mapping image, a second feature mean mapping image and a third feature mean mapping image;
feature value comparison, namely comparing the feature values of the first feature mean mapping images in the two feature mean mapping image groups to obtain a first feature weight image, comparing the feature values of the second feature mean mapping images to obtain a second feature weight image, and comparing the feature values of the third feature mean mapping images to obtain a third feature weight image;
guided filtering processing, namely performing guided filtering on the first feature weight image according to a preset first guided filtering template to obtain a first feature guide mapping image, performing guided filtering on the second feature weight image according to a preset second guided filtering template to obtain a second feature guide mapping image, and performing guided filtering on the third feature weight image according to a preset third guided filtering template to obtain a third feature guide mapping image;
weight image generation, namely comparing the pixel values of all pixel points across the first, second and third feature guide mapping images and selecting the maximum pixel value at each point to obtain the weight image;
and depth-of-field superposed image generation, namely generating the depth-of-field superposed image from the pixel values of the weight image and the pixel values of the two superposition operation images.
Further, the edge feature extraction processing is to detect a changed edge or a discontinuous region in the superposition operation image to obtain corresponding feature point position information, and describe a binary image of the edge or the region according to the feature point position information to obtain an edge feature mapping image.
Furthermore, normalization processing is included before the edge feature extraction processing, in which pixel normalization is performed on the two superposition operation images respectively.
Further, the normalization process is to map the pixel values of the two superposed operation images from an integer domain to a floating point domain between 0 and 1.
Further, the microscopic images in the image sequence are preprocessed when the pixel size of the microscopic images exceeds a preset processing threshold.
Further, the preprocessing is to reconstruct a pixel size of the microscopic image.
The second purpose of the invention is realized by adopting the following technical scheme:
an electronic device, comprising: a processor;
a memory; and a program, wherein the program is stored in the memory and configured to be executed by the processor, the program comprising instructions for performing the multi-depth-of-field image superposition method for a microscope of the present application.
The third purpose of the invention is realized by adopting the following technical scheme:
a computer-readable storage medium having stored thereon a computer program, characterized in that: the computer program is executed by a processor to perform the multi-depth-of-field image superposition method for a microscope of the present application.
Compared with the prior art, the invention has the following beneficial effects. The multi-depth-of-field image superposition method for a microscope comprises: acquiring an image sequence, namely obtaining a plurality of microscopic images of the same sample acquired by the microscope at different stage heights according to a preset acquisition sequence, so as to obtain an image sequence containing the plurality of microscopic images; primarily screening images, namely screening two microscopic images from the image sequence according to the preset acquisition sequence as superposition operation images and taking the image sequence as the screened image sequence; image processing, namely performing depth-of-field superposition processing on the two superposition operation images to obtain a depth-of-field superposed image; image judging, namely judging whether the screened image sequence still contains microscopic images, and if so, executing the image re-screening step, otherwise outputting the depth-of-field superposed image as the multi-depth-of-field superposition result image; and image re-screening, namely screening one microscopic image from the remaining microscopic images in the screened image sequence according to the preset acquisition sequence as a non-primary screened image, taking both the non-primary screened image and the depth-of-field superposed image as superposition operation images, and returning to the image processing step. By sequentially performing depth-of-field superposition on the different microscopic images, the whole image superposition process is efficient and accurate, and time cost and labor cost are reduced.
The foregoing description is only an overview of the technical solutions of the present invention, and in order to make the technical solutions of the present invention more clearly understood and to implement them in accordance with the contents of the description, the following detailed description is given with reference to the preferred embodiments of the present invention and the accompanying drawings. The detailed description of the present invention is given in detail by the following examples and the accompanying drawings.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the invention and together with the description serve to explain the invention without limiting the invention. In the drawings:
fig. 1 is a schematic flow chart of a multi-depth-of-field image superimposing method for a microscope according to the present invention.
Detailed Description
The present invention will be further described with reference to the accompanying drawings and the detailed description, and it should be noted that any combination of the embodiments or technical features described below can be used to form a new embodiment without conflict.
As shown in fig. 1, a multi-depth image superimposing method for a microscope of the present application includes the following steps:
and acquiring an image sequence, namely acquiring a plurality of microscopic images of the same sample on the loading platform with different heights, which are acquired by the microscope according to a preset acquisition sequence, so as to obtain the image sequence containing the plurality of microscopic images. In this embodiment, the preset collection sequence is to collect microscopic images of the sample according to the height of the stage from high to low, that is, the height of the stage is freely adjusted while the stage is not moved, the height of the stage is adjusted from high to low until the stage reaches the preset lowest point, and a microscopic image of the sample corresponding to the height is collected when the stage is adjusted to one height.
And (3) primarily screening the images, namely screening two microscopic images in the image sequence according to a preset acquisition sequence to serve as an overlapping operation image, and taking the image sequence as a screened image sequence. In this embodiment, the order of acquiring the images is determined by the height of the object platform, that is, the microscopic images acquired when the object platform is at the highest position and the second highest position are selected as the superposition operation image, which is exemplified as the first superposition operation image and the second superposition operation image in this embodiment.
And image processing, namely performing depth of field superposition processing on the two superposed operation images to obtain a depth of field superposed image. In this embodiment, the image processing specifically includes:
and normalization processing, namely respectively carrying out pixel normalization processing on the two superposed operation images to obtain normalized superposed operation images corresponding to the two thresholds. In the present embodiment, the pixel values of the superposition operation image are mapped from the integer domain to the floating-point domain between 0 and 1.
And edge feature extraction processing, namely performing edge feature extraction on the two normalized superposition operation images to obtain corresponding feature point position information, and generating two edge feature mapping images corresponding to the superposition operation images from the feature point position information. Specifically: the changed edges or discontinuous regions in the two normalized superposition operation images are detected to obtain the corresponding feature point position information, and a binary image of the edges or regions is described from the feature point position information to obtain the edge feature mapping images. In this embodiment, the pixel value of each edge feature mapping image is calculated according to the following formula (1):
[Formula (1), given as an image in the original publication, defines the pixel value of the edge feature mapping image]
where f is the pixel value of the normalized superposition operation image, x is the row coordinate of a pixel in the normalized superposition operation image, and y is its column coordinate.
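Formula (1) is only available as an image, so the sketch below uses an absolute-Laplacian response as a stand-in edge-feature detector. It matches the description of detecting changed edges or discontinuous regions, but it is an assumption and may differ from the patent's actual formula.

```python
import numpy as np
from scipy.ndimage import laplace

def edge_feature_map(f: np.ndarray) -> np.ndarray:
    """Stand-in for formula (1): absolute Laplacian response of the
    normalized superposition operation image f(x, y)."""
    return np.abs(laplace(f))
```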
And mean filtering processing, namely performing mean filtering on the two edge feature mapping images according to preset filtering templates to obtain two feature mean mapping image groups corresponding to the two edge feature mapping images, wherein the preset filtering templates comprise a preset first filtering template, a preset second filtering template and a preset third filtering template, and each feature mean mapping image group comprises three feature mean mapping images, namely a first feature mean mapping image, a second feature mean mapping image and a third feature mean mapping image. In this embodiment, the preset filtering templates differ only in size: the first filtering template is the largest, the second is medium-sized and the third is the smallest, i.e. preset first filtering template > preset second filtering template > preset third filtering template. The pixel value of a feature mean mapping image is calculated as in formula (2):
y(i, j) = (1 / (n × n)) · Σ x(i + s, j + t),  the sum running over the n × n template centred on (i, j)    (2)
where x is the pixel value of the edge feature mapping image, y is the pixel value of the feature mean mapping image, i is the row index, j is the column index, and n is the size of the preset filtering template.
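A sketch of the mean filtering step of formula (2); the three template sizes below are placeholders chosen only to satisfy first > second > third.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def mean_filter_group(edge_map: np.ndarray, sizes=(31, 15, 7)) -> list:
    """Apply three mean (box) filters of decreasing template size to one edge
    feature mapping image, yielding its feature mean mapping image group."""
    return [uniform_filter(edge_map, size=n) for n in sizes]
```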
And feature value comparison, namely comparing the feature values of the first feature mean mapping images in the two feature mean mapping image groups to obtain a first feature weight image, comparing the feature values of the second feature mean mapping images to obtain a second feature weight image, and comparing the feature values of the third feature mean mapping images to obtain a third feature weight image. In this embodiment, the feature mean mapping image group corresponding to the superposition operation image acquired first is taken as the reference: at each pixel, if the feature value in the reference first feature mean mapping image is greater than the feature value in the other group's first feature mean mapping image, the weight is set to 1, otherwise it is set to 0, which gives the corresponding first feature weight image; the second and third feature mean mapping images of the two groups are processed in the same way to obtain the second and third feature weight images respectively.
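A sketch of the feature value comparison under the reading above: the weight is 1 where the reference group (from the earlier-acquired image) has the larger feature value and 0 otherwise; the handling of ties is an assumption.

```python
import numpy as np

def feature_weight_images(reference_group: list, other_group: list) -> list:
    """Pixel-wise comparison of corresponding feature mean mapping images,
    producing the first, second and third feature weight images."""
    return [(ref > oth).astype(np.float32)
            for ref, oth in zip(reference_group, other_group)]
```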
And guided filtering processing, namely performing guided filtering on the first feature weight image according to a preset first guided filtering template to obtain a first feature guide mapping image, performing guided filtering on the second feature weight image according to a preset second guided filtering template to obtain a second feature guide mapping image, and performing guided filtering on the third feature weight image according to a preset third guided filtering template to obtain a third feature guide mapping image. In this embodiment, the preset guided filtering templates are likewise set by size: the first guided filtering template is the largest, the second is medium-sized and the third is the smallest, i.e. preset first guided filtering template > preset second guided filtering template > preset third guided filtering template. Formula (3) is used in this step:
q_i = Σ_j W_ij(I) · p_j    (3)
where p denotes the input image, q the output image, I the guide image, and W_ij(I) the weights of the weighted-average operation, which are determined by the guide image I; i and j are image indices. In this embodiment the guide image I may be a separate image or the input image p itself.
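A box-filter implementation of the guided filter behind formula (3) (the standard He et al. formulation); the window radius and the regularization term eps are not specified here, so they are left as arguments.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def guided_filter(guide: np.ndarray, src: np.ndarray, radius: int, eps: float) -> np.ndarray:
    """Guided filtering of a feature weight image `src` with guide image `guide`."""
    size = 2 * radius + 1
    mean_I = uniform_filter(guide, size)          # local means of the guide
    mean_p = uniform_filter(src, size)            # local means of the input
    cov_Ip = uniform_filter(guide * src, size) - mean_I * mean_p
    var_I = uniform_filter(guide * guide, size) - mean_I * mean_I
    a = cov_Ip / (var_I + eps)                    # linear coefficients per window
    b = mean_p - a * mean_I
    mean_a = uniform_filter(a, size)
    mean_b = uniform_filter(b, size)
    return mean_a * guide + mean_b                # q = a*I + b, averaged over windows
```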
And generating a weight image, comparing pixel values of all pixel points in the first feature guide mapping image, the second feature guide mapping image and the third feature guide mapping image, and selecting a maximum pixel value to obtain the weight image.
And generating a depth-of-field superposed image according to the pixel values of the weight image and the pixel values of the two superposition operation images. Specifically, in this embodiment, each pixel value of the weight image is multiplied in turn by the corresponding pixel value of the superposition operation image acquired first, giving the superposition component image of the first-acquired image; each pixel value of the later-acquired superposition operation image is multiplied in turn by one minus the corresponding pixel value of the weight image, giving the superposition component image of the later-acquired image; and the pixel values of the two superposition component images are added in turn to obtain the pixel values of the depth-of-field superposed image, thereby generating the depth-of-field superposed image. In this embodiment each of the two superposition operation images may be a microscopic image or a depth-of-field superposed image; when one of them is a depth-of-field superposed image, it is treated as the first-acquired superposition operation image. In addition, when the pixel size of a microscopic image in the image sequence exceeds a preset processing threshold, the pixel size of that microscopic image is reconstructed.
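A sketch combining the weight image generation and the weighted superposition described in the two paragraphs above; np.maximum.reduce takes the pixel-wise maximum of the three feature guide mapping images.

```python
import numpy as np

def weighted_superposition(img_a: np.ndarray, img_b: np.ndarray, guided_maps: list) -> np.ndarray:
    """img_a: earlier-acquired superposition operation image (or the running
    depth-of-field superposed image); img_b: later-acquired image."""
    weight = np.maximum.reduce(guided_maps)       # weight image: pixel-wise maximum
    if img_a.ndim == 3:
        weight = weight[..., None]                # broadcast over colour channels
    return weight * img_a + (1.0 - weight) * img_b
```

Under the naming of the first sketch, fuse_pair would chain normalization, edge feature extraction, mean filtering, feature value comparison and guided filtering before calling this function.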
The present invention provides an electronic device, including: a processor;
a memory; and a program, wherein the program is stored in the memory and configured to be executed by the processor, the program comprising instructions for performing the multi-depth-of-field image superposition method for a microscope of the present application.
The present invention also provides a computer-readable storage medium having stored thereon a computer program, characterized in that: the computer program is executed by a processor to perform the multi-depth-of-field image superposition method for a microscope of the present application.
The multi-depth-of-field image superposition method for a microscope of the present application thus comprises: acquiring an image sequence, namely obtaining a plurality of microscopic images of the same sample acquired by the microscope at different stage heights according to a preset acquisition sequence, so as to obtain an image sequence containing the plurality of microscopic images; primarily screening images, namely screening two microscopic images from the image sequence according to the preset acquisition sequence as superposition operation images and taking the image sequence as the screened image sequence; image processing, namely performing depth-of-field superposition processing on the two superposition operation images to obtain a depth-of-field superposed image; image judging, namely judging whether the screened image sequence still contains microscopic images, and if so, executing the image re-screening step, otherwise outputting the depth-of-field superposed image as the multi-depth-of-field superposition result image; and image re-screening, namely screening one microscopic image from the remaining microscopic images in the screened image sequence according to the preset acquisition sequence as a non-primary screened image, taking both the non-primary screened image and the depth-of-field superposed image as superposition operation images, and returning to the image processing step. By sequentially performing depth-of-field superposition on the different microscopic images, the whole image superposition process is efficient and accurate, and time cost and labor cost are reduced.
The foregoing is merely a preferred embodiment of the invention and is not intended to limit the invention in any manner; those skilled in the art can readily practice the invention as shown and described in the drawings and detailed description herein; however, those skilled in the art should appreciate that they can readily use the disclosed conception and specific embodiments as a basis for designing or modifying other structures for carrying out the same purposes of the present invention without departing from the scope of the invention as defined by the appended claims; meanwhile, any changes, modifications and equivalent evolutions made to the above embodiments according to the substantive technology of the present invention still fall within the protection scope of the technical solution of the present invention.

Claims (9)

1. A multi-depth image superimposing method for a microscope, comprising the steps of:
acquiring an image sequence, namely acquiring a plurality of microscopic images of the same sample on loading platforms with different heights, which are acquired by a microscope according to a preset acquisition sequence to obtain the image sequence containing the plurality of microscopic images;
primarily screening images, namely screening two microscopic images in the image sequence according to the preset acquisition sequence to serve as superposition operation images, and taking the image sequence as a screened image sequence;
image processing, namely performing depth-of-field superposition processing on the two superposed operation images to obtain a depth-of-field superposed image;
judging the image, namely judging whether the screened image sequence still contains microscopic images, if so, executing the image re-screening step, and if not, outputting the depth-of-field superposed image as the multi-depth-of-field superposition result image;
and image re-screening, namely screening one microscopic image from the remaining microscopic images in the screened image sequence according to the preset acquisition sequence as a non-primary screened image, taking both the non-primary screened image and the depth-of-field superposed image as superposition operation images, and returning to the image processing step.
2. A multi-depth image superimposing method for a microscope according to claim 1, wherein: the image processing comprises the following sub-steps:
performing edge feature extraction processing, namely performing edge feature extraction processing on the two superposed operation images to obtain corresponding feature point position information, and generating two edge feature mapping images corresponding to the superposed operation images according to the feature point position information;
performing mean filtering processing, namely performing mean filtering processing on the two edge feature mapping images according to a preset filtering template to obtain two feature mean mapping image groups corresponding to the two edge feature mapping images, wherein the preset filtering template comprises a preset first filtering template, a preset second filtering template and a preset third filtering template, and each feature mean mapping image group comprises a first feature mean mapping image, a second feature mean mapping image and a third feature mean mapping image;
comparing feature values, namely comparing the feature values of a first feature mean mapping image in the two feature mean mapping image groups to obtain a first feature weight image, comparing the feature values of a second feature mean mapping image in the two feature mean mapping image groups to obtain a second feature weight image, and comparing the feature values of a third feature mean mapping image in the two feature mean mapping image groups to obtain a third feature weight image;
the guiding filtering processing is carried out, the first characteristic weight image is subjected to guiding filtering processing according to a preset first guiding filtering template to obtain a first characteristic guiding mapping image, the second characteristic weight image is subjected to guiding filtering processing according to a preset second guiding filtering template to obtain a second characteristic guiding mapping image, and the third characteristic weight image is subjected to guiding filtering processing according to a preset third guiding filtering template to obtain a third characteristic guiding mapping image;
generating a weight image, comparing pixel values of all pixel points in the first feature guide mapping image, the second feature guide mapping image and the third feature guide mapping image, and selecting a maximum pixel value to obtain the weight image;
and generating a depth-of-field superposed image according to each pixel value of the weight image and each pixel value of the two superposed operation images.
3. A multi-depth image superimposing method for a microscope according to claim 2, wherein: the edge feature extraction processing is to detect a changed edge or a discontinuous region in the superposition operation image to obtain corresponding feature point position information, and describe a binary image of the edge or the region according to the feature point position information to obtain an edge feature mapping image.
4. A multi-depth image superimposing method for a microscope according to claim 2, wherein: normalization processing is further included before the edge feature extraction processing, in which pixel normalization processing is performed on the two superposition operation images respectively.
5. A multi-depth image superimposing method for a microscope according to claim 4, wherein: and the normalization processing is to map the pixel values of the two superposed operation images from an integer domain to a floating point domain between 0 and 1.
6. A multi-depth image superimposing method for a microscope according to claim 1, wherein: and when the pixel size of the microscopic image in the image sequence exceeds a preset processing threshold value, preprocessing the microscopic image.
7. A multi-depth image superimposing method for a microscope according to claim 6, wherein: the preprocessing is to reconstruct the pixel size of the microscopic image.
8. An electronic device, characterized by comprising: a processor;
a memory; and a program, wherein the program is stored in the memory and configured to be executed by the processor, the program comprising instructions for carrying out the method of any one of claims 1-7.
9. A computer-readable storage medium having stored thereon a computer program, characterized in that: the computer program is executed by a processor for performing the method according to any of claims 1-7.
CN201911387695.2A 2019-12-30 2019-12-30 Multi-depth-of-field image superposition method, equipment and medium for microscope Active CN111242880B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911387695.2A CN111242880B (en) 2019-12-30 2019-12-30 Multi-depth-of-field image superposition method, equipment and medium for microscope

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911387695.2A CN111242880B (en) 2019-12-30 2019-12-30 Multi-depth-of-field image superposition method, equipment and medium for microscope

Publications (2)

Publication Number Publication Date
CN111242880A (en) 2020-06-05
CN111242880B CN111242880B (en) 2023-05-02

Family

ID=70871852

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911387695.2A Active CN111242880B (en) 2019-12-30 2019-12-30 Multi-depth-of-field image superposition method, equipment and medium for microscope

Country Status (1)

Country Link
CN (1) CN111242880B (en)

Patent Citations (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5793888A (en) * 1994-11-14 1998-08-11 Massachusetts Institute Of Technology Machine learning apparatus and method for image searching
US20100014718A1 (en) * 2008-04-17 2010-01-21 Biometricore, Inc Computationally Efficient Feature Extraction and Matching Iris Recognition
US20100103194A1 (en) * 2008-10-27 2010-04-29 Huawei Technologies Co., Ltd. Method and system for fusing images
CN102129676A (en) * 2010-01-19 2011-07-20 中国科学院空间科学与应用研究中心 Microscopic image fusing method based on two-dimensional empirical mode decomposition
CN101930606A (en) * 2010-05-14 2010-12-29 深圳市海量精密仪器设备有限公司 Field depth extending method for image edge detection
CN102609931A (en) * 2012-02-01 2012-07-25 广州市明美光电技术有限公司 Field depth expanding method and device of microscopic image
CN103308452A (en) * 2013-05-27 2013-09-18 中国科学院自动化研究所 Optical projection tomography image capturing method based on depth-of-field fusion
CN104463817A (en) * 2013-09-12 2015-03-25 华为终端有限公司 Image processing method and device
CN103473542A (en) * 2013-09-16 2013-12-25 清华大学 Multi-clue fused target tracking method
CN103499879A (en) * 2013-10-16 2014-01-08 北京航空航天大学 Method of acquiring microscopic image with super field depth
CN104200450A (en) * 2014-08-25 2014-12-10 华南理工大学 Infrared thermal image resolution enhancing method
CN106339998A (en) * 2016-08-18 2017-01-18 南京理工大学 Multi-focus image fusion method based on contrast pyramid transformation
CN106327442A (en) * 2016-08-22 2017-01-11 上海奥通激光技术有限公司 Multispectral micro-imaging field depth extension method and system
CN107610218A (en) * 2017-08-25 2018-01-19 武汉工程大学 A kind of plane data acquisition methods towards stereochemical structure site three-dimensional image reconstruction
CN108550130A (en) * 2018-04-23 2018-09-18 南京邮电大学 A kind of multiple dimensioned transmission plot fusion method of image pyramid model
CN109102465A (en) * 2018-08-22 2018-12-28 周泽奇 A kind of calculation method of the content erotic image auto zoom of conspicuousness depth of field feature
CN109523480A (en) * 2018-11-12 2019-03-26 上海海事大学 A kind of defogging method, device, computer storage medium and the terminal of sea fog image
CN109741322A (en) * 2019-01-08 2019-05-10 南京蓝绿物联科技有限公司 A kind of visibility measurement method based on machine learning
CN110390659A (en) * 2019-08-01 2019-10-29 易普森智慧健康科技(深圳)有限公司 Total focus image imaging method and device applied to bright field microscope

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
XIAOSONG LI et al.: "Multi-focus Image Fusion Based on the Filtering Techniques and Block Consistency Verification" *
YONGXIN ZHANG et al.: "Multi-focus image fusion with alternating guided filtering" *
夏翔 (Xia Xiang): "Research on extended depth-of-field technologies in leucorrhea microscopic imaging" *
章学静 (Zhang Xuejing): "Research on pixel-level image enhancement and registration algorithms" *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116152132A (en) * 2023-04-19 2023-05-23 山东仕达思医疗科技有限公司 Depth of field superposition method, device and equipment for microscope image
CN116152132B (en) * 2023-04-19 2023-08-04 山东仕达思医疗科技有限公司 Depth of field superposition method, device and equipment for microscope image

Also Published As

Publication number Publication date
CN111242880B (en) 2023-05-02

Similar Documents

Publication Publication Date Title
CN108022233A (en) A kind of edge of work extracting method based on modified Canny operators
CN108898610A (en) A kind of object contour extraction method based on mask-RCNN
CN112200045B (en) Remote sensing image target detection model establishment method based on context enhancement and application
CN107767387B (en) Contour detection method based on variable receptive field scale global modulation
CN110189290A (en) Metal surface fine defects detection method and device based on deep learning
CN112819748B (en) Training method and device for strip steel surface defect recognition model
CN111127417B (en) Printing defect detection method based on SIFT feature matching and SSD algorithm improvement
CN110781937B (en) Point cloud feature extraction method based on global visual angle
CN111597941B (en) Target detection method for dam defect image
CN114781514A (en) Floater target detection method and system integrating attention mechanism
CN108090492B (en) Contour detection method based on scale clue suppression
CN112991374A (en) Canny algorithm-based edge enhancement method, device, equipment and storage medium
CN113252103A (en) Method for calculating volume and mass of material pile based on MATLAB image recognition technology
CN111242880A (en) Multi-depth-of-field image superposition method, equipment and medium for microscope
JP2011165170A (en) Object detection device and program
CN113191235B (en) Sundry detection method, sundry detection device, sundry detection equipment and storage medium
CN111462056A (en) Workpiece surface defect detection method, device, equipment and storage medium
CN112801141B (en) Heterogeneous image matching method based on template matching and twin neural network optimization
CN111062916B (en) Definition evaluation method and device for microscopic image
CN111027512B (en) Remote sensing image quayside ship detection and positioning method and device
CN114067186B (en) Pedestrian detection method and device, electronic equipment and storage medium
CN115482178A (en) Multi-focus image fusion method and system based on significant feature difference
CN115588109A (en) Image template matching method, device, equipment and application
CN105654108A (en) Classifying method, inspection method, and inspection apparatus
CN108629788B (en) Image edge detection method, device and equipment and readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant