CN114463231A - Intelligent splicing method, device, medium and equipment for microscope images - Google Patents

Intelligent splicing method, device, medium and equipment for microscope images Download PDF

Info

Publication number
CN114463231A
CN114463231A (application CN202111533437.8A)
Authority
CN
China
Prior art keywords
image
splicing
matching
current
spliced
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111533437.8A
Other languages
Chinese (zh)
Inventor
栗远
黄炳根
陈进
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Motic China Group Co Ltd
Original Assignee
Motic China Group Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Motic China Group Co Ltd filed Critical Motic China Group Co Ltd
Priority to CN202111533437.8A priority Critical patent/CN114463231A/en
Publication of CN114463231A publication Critical patent/CN114463231A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/50Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00Geometric image transformations in the plane of the image
    • G06T3/14Transformations for image registration, e.g. adjusting or mapping for alignment of images
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00Geometric image transformations in the plane of the image
    • G06T3/40Scaling of whole images or parts thereof, e.g. expanding or contracting
    • G06T3/4038Image mosaicing, e.g. composing plane images from plane sub-images
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/13Edge detection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/136Segmentation; Edge detection involving thresholding
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70Circuitry for compensating brightness variation in the scene
    • H04N23/73Circuitry for compensating brightness variation in the scene by influencing the exposure time
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/222Studio circuitry; Studio devices; Studio equipment
    • H04N5/262Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
    • H04N5/265Mixing
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10056Microscopic image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30168Image quality inspection

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Microscopes, Condenser (AREA)
  • Studio Devices (AREA)

Abstract

The invention discloses a method, a device, a medium and equipment for intelligently splicing microscope images acquired with a rolling-shutter camera. The intelligent splicing method for electronic-rolling-shutter microscope images comprises: an automatic splicing workflow, based on the characteristics of the electronic rolling shutter, that runs while the microscope stage is moving; real-time detection of image motion blur; calculation and dynamic display of the various states of the splicing process, coordinating the moving speed of the stage so that splicing remains effective; and a double-projection automatic splicing algorithm. The invention intelligently evaluates and automatically splices images from an electronic-rolling-shutter camera while the stage is moving; it is simple to use and convenient to operate, effectively extends the application range of electronic-rolling-shutter cameras, and allows a conventional microscope fitted with an ordinary digital camera to perform the digital-slide scanning function of a dedicated slide scanner.

Description

Intelligent splicing method, device, medium and equipment for microscope images
Technical Field
The invention relates to the technical field of optical microscope image processing, and in particular to a method, device, medium and equipment for intelligently splicing microscope images acquired with a rolling-shutter camera.
Background
A conventional digital camera forms an image by shutter exposure: the shutter controls the effective exposure time of the photosensitive chip. Exposure of a digital camera chip is performed in one of two main modes, global-shutter exposure and rolling-shutter exposure. With a global shutter, the whole scene is exposed at the same time: every pixel of the camera starts and stops exposing simultaneously, after which the value of every pixel of the image sensor is read out to form a complete image. The advantage of global-shutter exposure is that all pixels are exposed at the same time, each pixel is exposed uniformly and stably, and no smear occurs. A rolling shutter, by contrast, exposes the photosensitive chip line by line: when imaging starts, the chip is scanned and exposed row after row, and the pixel values of the image sensor are read out row after row until every pixel has been scanned, exposed and read, forming a complete image. Most digital cameras today use economical electronic-rolling-shutter sensors.
Patent document CN111279673A discloses "image stitching with electronic rolling shutter correction". Its capture system includes multiple electronic-rolling-shutter image sensors; it determines a parallax correction map by compensating along epipolar lines, determines a warp map from the parallax correction map and an electronic-rolling-shutter correction map, and applies the electronic-rolling-shutter correction to the output of the parallax correction through the warp map. This series of corrections compensates for the distortion in the images acquired by the rolling-shutter cameras, and the images from two or more camera sensors are stitched into a composite image. This approach requires multiple cameras, which increases both the complexity of the system configuration and the cost.
Patent document CN201380059462 discloses a "system and method for acquiring images by using a rolling-shutter camera while asynchronously sequencing microscope devices". The system acquires images with a rolling-shutter camera mounted on a microscope and uses a stepping motor to move the stage. Its characteristic is that stage movement is synchronized with the rolling shutter's end-of-exposure pulse and the stepping motor, and only the image data whose exposure period is shared with the rolling-shutter exposure signal is received.
Patent document CN202011053664 discloses a fast splicing and fusion method for digital microscope images, comprising four steps: acquiring a microscopic image from a multi-modal imaging system, preprocessing the image, registering the images, and fusing them to obtain the final spliced image. The system also uses a stepping motor to drive a motorized stage; the overlapping area of two images is obtained from the displacement of the stepping motor, and precise matching and similarity evaluation are then performed on the overlap with a Fourier-transform method. The drawback of this method is that a motorized stage is required to obtain the image displacement, which increases system cost and reduces adaptability; moreover, Fourier-transform-based image matching is computationally complex and increases the amount of calculation.
In summary, there is a need for a method, apparatus, medium and device for intelligently splicing microscope images acquired with a rolling shutter that is suitable for both manual and motorized optical microscopes, requires little computation and achieves high splicing precision.
Disclosure of Invention
The invention aims to provide an intelligent splicing method for microscope images acquired with a rolling shutter.
In order to achieve the purpose, the invention adopts the following technical scheme:
the intelligent splicing method for microscope images comprises the following steps:
S1, exposing the camera continuously to obtain a real-time dynamic image of the microscope, the camera being an electronic-rolling-shutter camera;
S2, performing real-time motion-blur detection on the acquired real-time dynamic image and judging whether the image quality meets a preset evaluation standard; if so, proceeding to S3, otherwise returning to S1;
S3, extracting a first image and a second image, wherein the first image is the currently acquired dynamic image and the second image is extracted from the already spliced large image at the position corresponding to the current dynamic image;
and S4, projecting the result of the preliminary splicing of the current image and the previously spliced image onto the horizontal and vertical directions of the image to obtain projection histograms in the two directions, and performing horizontal and vertical one-dimensional correlation analyses on the projection histograms to obtain the horizontal and vertical offsets of the image, thereby splicing the images.
Further, in S1, the stage has a feed rate and a fast-forward rate;
when the current real-time dynamic image has been spliced, the stage is switched from the feed rate to the fast-forward rate and moves quickly;
and when the stage moves to the boundary of the currently spliced large image, the stage is switched back from the fast-forward rate to the feed rate until the image splicing of the next field to be spliced is completed.
Further, S2 includes:
S21, converting the current image from an RGB image to a grayscale image;
S22, sampling the converted grayscale image, performing edge detection, extracting image sharpness parameters and normalizing them;
edge detection is performed with a Laplace edge-detection algorithm; an improved Laplace operator is introduced which takes the sum of the absolute values of the second-order partial derivatives, so that for a two-dimensional image function f(x, y) the modified Laplacian of the image is

ML(x, y) = |∂²f(x, y)/∂x²| + |∂²f(x, y)/∂y²|;

differences are used instead of derivatives for the calculation and, taking into account the texture variations of the microscopic image, the second-order partial differences are computed with a variable step between pixels:

ML(x, y) = |2f(x, y) − f(x−step, y) − f(x+step, y)| + |2f(x, y) − f(x, y−step) − f(x, y+step)|;

the detection parameter is obtained as

Σ_x Σ_y ML(x, y), summed over the pixels for which ML(x, y) > T,

where T is a threshold and only modified-Laplacian values greater than T take part in the accumulation. The resulting detection parameter is compared with a detection threshold; if the detection threshold is reached, the image quality is judged to meet the preset evaluation standard and the detection parameter is stored in a detection-parameter library; otherwise, the image quality is judged not to meet the preset evaluation standard.
Furthermore, the variable step and the threshold T are chosen in positive correlation with the noise level of the image sequence.
Further, S3 includes:
S31, detecting the image features, texture features, morphological features and spatial-relationship features of the current image that contain rich local information, forming a data structure and a descriptor for the detected feature points, and constructing the basic data elements for image matching and splicing;
S32, after the feature points and feature descriptors of the current image are obtained, performing feature matching with the preceding image; if the matching succeeds, proceeding to step S34, otherwise proceeding to step S33; the feature library is the set of feature points of all preceding images;
S33, performing feature matching against the feature points of the feature library, matching progressively outward from the current image position to the feature points of the surrounding neighboring images; if the matching succeeds, proceeding to step S34; otherwise the current image has moved outside the range of the spliced image, the status display is refreshed, and the user is prompted to return the stage to the position of the existing spliced image;
S34, extracting good matching points from the feature points of the current image and storing them in the feature library for matching of subsequent images;
S35, calculating the coverage of the successfully matched feature points, determining the matching area and the position information of the current image, calculating the displacement between the image and the previously spliced image, and determining the absolute offset of the image at that position within the spliced large image, thereby extracting the first image and the second image and entering the double-projection automatic splicing algorithm.
S4 adopts a double-projection automatic splicing algorithm: according to the result of the preliminary splicing, the current image and the previously spliced image are further projected onto the horizontal and vertical directions of the image to obtain projection histograms in the two directions, and horizontal and vertical one-dimensional correlation analyses are performed on these histograms to obtain the horizontal and vertical offsets of the image, with which the images are spliced. The splicing precision thus reaches sub-pixel level and the splicing accuracy is improved, while reducing the two-dimensional correlation analysis to two one-dimensional correlation analyses greatly improves splicing efficiency.
Further, let the horizontal projection difference histogram of the first image be the reference histogram T_x, of size K, and let the horizontal projection difference histogram of the second image be the search histogram S_x, of size M. T_x is superimposed on S_x and translated over it; the part of S_x covered by T_x is the sub-histogram S_x^i, where i, the coordinate in S_x of the first covered point, serves as the reference point. The similarity measure is

R_x(i) = Σ_{m=1..K} S_x^i(m)·T_x(m) / Σ_{m=1..K} [S_x^i(m)]²,

where the numerator is the cross-correlation of the sub-histogram S_x^i with the reference histogram T_x and the denominator is the energy of S_x^i. Normalizing it gives

R_x(i) = Σ_{m=1..K} S_x^i(m)·T_x(m) / { [Σ_{m=1..K} (S_x^i(m))²]^(1/2) · [Σ_{m=1..K} (T_x(m))²]^(1/2) }.

When R_x(i) is maximal, i is the correct matching position along the x axis.

Similarly, let the vertical projection difference histogram of the first image be the reference histogram T_y, of size L, and let the vertical projection difference histogram of the second image be the search histogram S_y, of size N. T_y is superimposed on S_y and translated over it; the part of S_y covered by T_y is the sub-histogram S_y^j, where j, the coordinate in S_y of the first covered point, serves as the reference point. The similarity measure is

R_y(j) = Σ_{n=1..L} S_y^j(n)·T_y(n) / Σ_{n=1..L} [S_y^j(n)]²,

where the numerator is the cross-correlation of the sub-histogram S_y^j with the reference histogram T_y and the denominator is the energy of S_y^j. Normalizing it gives

R_y(j) = Σ_{n=1..L} S_y^j(n)·T_y(n) / { [Σ_{n=1..L} (S_y^j(n))²]^(1/2) · [Σ_{n=1..L} (T_y(n))²]^(1/2) }.

When R_y(j) is maximal, j is the correct matching position along the y axis.
Further, a dynamic display of the current image and the corresponding operation prompts are shown on a human-computer interaction interface so that the user can operate the microscope and complete the image splicing; the dynamic display of the current image includes one or more of: the current image being a blurred image, the current image being a focused image, the offset distance and positioning of the current image relative to the previously spliced image, the matching area of the current image and the preceding image, the current image and the preceding image having no matching area, and the position of the current image among the spliced surrounding images.
It is still another object of the present invention to provide an apparatus for intelligent stitching of microscope images, the apparatus comprising:
a microscope having a stage movable along an X/Y axis and a focusing unit;
a camera, the camera being an electronic rolling shutter camera;
a computer device, a control unit and a display unit, the computer device being provided with a processor and a human-computer interaction unit; the processor is connected with the camera and is used for executing the intelligent splicing method for microscope images described above; and the interactive interface of the human-computer interaction unit displays the dynamic display of the image and the operation prompts during execution.
It is still another object of the present invention to provide a computer-readable storage medium in which at least one instruction or at least one program is stored; the at least one instruction or program is loaded and executed by a processor to implement the intelligent splicing method for microscope images described above.
It is a further object of the present invention to provide a computer device comprising a processor and a memory, wherein the memory stores at least one instruction or at least one program, and the at least one instruction or program is loaded by the processor to execute the intelligent splicing method for microscope images described above.
By adopting the above technical scheme, the invention has the following advantages over the background art:
1. The invention needs only one camera to obtain real-time images. During image acquisition it does not depend on the degree of automation of the stage, so it is universal and can be adapted to most optical microscopes; if an ordinary digital camera is added to a conventional microscope, the system can also perform the digital-slide scanning function of a dedicated slide scanner. In the double-projection automatic splicing algorithm, high-precision image splicing is reduced from a two-dimensional spatial correlation analysis to two one-dimensional correlation analyses, which speeds up splicing and improves its accuracy and precision.
2. After the current image is spliced, the stage can move quickly, crossing most of the currently spliced field of view, until it is near the next field to be spliced; at that point the moving speed must be reduced so as to lessen the influence of rolling-shutter exposure on the moving image, obtain an image that meets the requirements, and complete the splicing of the next field. Repeating this produces a rhythmic splicing process that achieves efficient image acquisition while guaranteeing high image quality.
3. The improved Laplace operator prevents deviation in the judgement of image focus; at the same time, a variable step between pixels and a threshold are introduced into the calculation, which markedly improves the detection effect. The detection result is unimodal and unbiased, shows a very obvious trend near the focal plane, and has high sensitivity.
4. The invention optimizes feature-library matching in the image-matching method: feature points of neighboring images are matched progressively outward from the current image position, which greatly reduces the time spent traversing the feature library.
Drawings
FIG. 1 is a flow chart of the intelligent splicing method for microscope images.
Fig. 2 is a comparison diagram of the global shutter exposure and rolling shutter exposure principles.
Fig. 3 is a flow chart of real-time detection of image motion blur.
Fig. 4 is a flow chart of image feature extraction and image matching.
FIG. 5 is a flow chart of a double projection auto-stitching algorithm.
FIG. 6 is a screen display diagram showing various states in the dynamic splicing process.
FIG. 7 is a schematic diagram of the computer apparatus of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
Embodiment 1
Referring to fig. 1, the invention discloses an intelligent splicing method for microscope images which, as shown in the flow chart, comprises 4 core steps:
S1, image acquisition: while the stage is moving, the rolling shutter of the camera keeps exposing continuously and the processor acquires the real-time dynamic image of the exposure;
S2, detection of image motion blur: converting the image, performing edge detection with the Laplace edge-detection algorithm, and judging whether the image quality meets the preset evaluation standard; if so, proceeding to S3, otherwise returning to S1;
S3, image processing and matching: extracting a first image and a second image, where the first image is the currently acquired dynamic image and the second image is extracted from the already spliced large image at the position corresponding to the current dynamic image;
S4, double-projection automatic splicing: the result of the preliminary splicing of the current image and the previously spliced image is projected onto the horizontal and vertical directions of the image to obtain projection histograms in the two directions, and horizontal and vertical one-dimensional correlation analyses are performed on these histograms to obtain the horizontal and vertical offsets of the image, with which the images are spliced.
In S1, the stage has a feed rate and a fast-forward rate. When the current real-time dynamic image has been spliced, the stage switches from the feed rate to the fast-forward rate and moves quickly; when the stage reaches the boundary of the currently spliced large image, it switches back from the fast-forward rate to the feed rate until the splicing of the next field to be spliced is completed.
To understand why different stage speeds are advantageous, refer to fig. 2, a schematic comparison of global-shutter and rolling-shutter exposure. A global-shutter camera starts exposing lines 1 through N simultaneously when exposure begins and finishes exposing them simultaneously when exposure ends, so all pixels of the whole image share the same exposure period. In a rolling-shutter camera, the exposure start and end times of each line of the image sensor are different: the second line starts and ends exposure a fixed delay after the first line, the third line a fixed delay after the second, and so on, until the Nth line starts and finishes its exposure. This exposure mechanism has no effect on a static image but is affected by movement: the faster the movement, the larger the effect, and the direct result is a smearing of the image.
Therefore, since the rolling-shutter camera keeps exposing while the microscope stage moves, the stage speed has a large influence on image quality. For the acquired images to satisfy the quality and splicing requirements, the moving speed of the stage must be reduced while images for splicing are being acquired. After the current image has been spliced, the stage can move quickly across most of the currently spliced field of view to the vicinity of the next field to be spliced; there the speed must be reduced to lessen the effect of rolling-shutter exposure on the moving image, obtain an image that meets the requirements, and complete the splicing of the next field. Repeating this produces a rhythmic splicing process. Throughout, the real-time motion-blur detection of the electronic-rolling-shutter images calculates and dynamically displays the states of the current image and coordinates the adjustment of the stage speed, guaranteeing image quality while increasing acquisition speed.
The real-time image motion-blur detection method of S2 comprises: dividing a target region of the image acquired while the microscope stage moves, calculating the image focusing quality with the introduced improved Laplace operator, judging the degree of image blur, and judging, according to the evaluation criteria, whether the current image meets the quality threshold.
Specifically, it comprises:
S21, converting the current image from an RGB image to a grayscale image;
and S22, sampling the converted grayscale image, performing edge detection, extracting image sharpness parameters and normalizing them.
Many image edge-detection methods exist in the prior art, such as the Roberts, Prewitt, Sobel, Laplace and Canny operators; this implementation adopts an optimized Laplace edge-detection algorithm. The Laplace operator is a second-order derivative, so it extracts high-frequency components and can thereby detect sharpened edges; the linear differential Laplace operator can be used as an estimator of the high-frequency content. However, the second-order partial derivatives in the x and y directions may have opposite signs and cancel each other, which would bias the focus judgement of the image. We therefore introduce an improved Laplace operator that takes the sum of the absolute values of the second-order partial derivatives:
ML(x, y) = |∂²f(x, y)/∂x²| + |∂²f(x, y)/∂y²|.
Differences are generally used instead of derivatives in the calculation. Earlier edge-detection functions usually approximate the Laplacian with a 3 × 3 operator; here, to account for the texture variations of microscopic images, the second-order partial differences are computed with a variable step between pixels.
Namely:
ML(x, y) = |2f(x, y) − f(x−step, y) − f(x+step, y)| + |2f(x, y) − f(x, y−step) − f(x, y+step)|,

and the detection parameter is obtained as

Σ_x Σ_y ML(x, y), summed over the pixels for which ML(x, y) > T,

where T is a threshold: only modified-Laplacian values greater than T take part in the accumulation.
When the Laplace operator is used conventionally as an edge-detection function, the roles of the step and the threshold are neglected. Introducing them markedly improves the detection effect: the result is unimodal and unbiased, shows a very obvious trend near the focal plane, and has high sensitivity.
It should be noted that larger step and threshold values do not automatically give a better focusing effect: for image sequences with little noise interference the step and threshold can be smaller, while for sequences with stronger noise and brightness-variation interference they can be somewhat larger. For example, the step and threshold can be smaller for a high-power objective and larger for a low-power objective.
The Laplace-operator coefficients are adjusted according to the type of slide under the microscope to achieve the best detection effect. The software interface is refreshed with the detection-result parameters, and the sharpness and focus of the current image are dynamically displayed in graphic form. The detection parameter is compared with the detection threshold; if the threshold is reached, the image quality is judged to meet the preset evaluation standard, the detection parameter is stored in the detection-parameter library, and the process moves on to image feature extraction and matching; otherwise, the image quality is judged not to meet the standard and, after the interface state is refreshed, control returns to microscope operation.
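For illustration only, the sketch below computes this modified-Laplacian detection parameter with NumPy; the default step and threshold values are arbitrary assumptions and the function is the author's, not the patent's.

```python
import numpy as np

def modified_laplacian_score(gray, step=2, T=10.0):
    """Sum of thresholded modified-Laplacian values (the detection parameter described above).

    gray : 2-D grayscale array, step : variable pixel step, T : accumulation threshold.
    """
    g = gray.astype(np.float64)
    # |2f(x,y) - f(x-step,y) - f(x+step,y)| + |2f(x,y) - f(x,y-step) - f(x,y+step)|
    ml_x = np.abs(2 * g[step:-step, :] - g[:-2 * step, :] - g[2 * step:, :])
    ml_y = np.abs(2 * g[:, step:-step] - g[:, :-2 * step] - g[:, 2 * step:])
    ml = ml_x[:, step:-step] + ml_y[step:-step, :]
    return float(ml[ml > T].sum())   # only values above the threshold T accumulate
```

In use, this score of each incoming frame would be compared against the calibrated detection threshold mentioned above, and only frames that pass would proceed to feature matching.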
S3 implements image feature extraction and image matching. The method comprises: extracting, from the image regions rich in local information, the geometric morphological features, image density features and image texture features together with the data structures and descriptors that describe them, and constructing the basic data elements for image recognition, decision and intelligent splicing, which form the feature library. During matching, feature matching is performed between the current image and the previously spliced image, or within the feature library; the displacement between the image and the previously spliced image and its position within the spliced large image are calculated, and the absolute offset of the image movement is determined.
S3 includes:
S31, detecting the image features, texture features, morphological features and spatial-relationship features of the current image that contain rich local information, forming a data structure and a descriptor for the detected feature points, and constructing the basic data elements for image matching and splicing;
S32, after the feature points and feature descriptors of the current image are obtained, performing feature matching with the preceding image; if the matching succeeds, proceeding to step S34; otherwise the image has no overlapping area with the preceding image and the process proceeds to step S33, where feature matching is performed against the feature points of the feature library; the feature library is the set of feature points of all preceding images and grows as the splicing extends;
S33, performing feature matching against the feature points of the feature library, matching progressively outward from the current image position to the feature points of the surrounding neighboring images; if the matching succeeds, proceeding to step S34; otherwise the status is refreshed and the user is reminded, or the stage is driven, to return to the position of the existing spliced image;
S34, extracting good matching points from the feature points of the current image and storing them in the feature library for matching of subsequent images; the good matching points can be extracted with the Harris corner-detection method, with characteristic parameters set for the number of neighboring corners to keep, the corner quality level and the minimum distance between two corners;
S35, calculating the coverage of the successfully matched feature points, determining the matching area and the position information of the current image, and calculating the displacement between the image and the previously spliced image and its position within the spliced large image, thereby extracting the first image and the second image and entering the double-projection automatic splicing flow, as illustrated in the sketch after this list.
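As a sketch of steps S31 to S35 under stated assumptions (not the patent's implementation): Harris-style corners are detected with the three characteristic parameters named in S34, while the ORB descriptors and brute-force matcher used to pair them are the author's illustrative choices.

```python
import cv2
import numpy as np

def detect_harris_keypoints(gray, max_corners=200, quality_level=0.01, min_distance=10):
    """Harris-style corners, parameterized by the three characteristic parameters named in S34."""
    corners = cv2.goodFeaturesToTrack(gray, maxCorners=max_corners,
                                      qualityLevel=quality_level,
                                      minDistance=min_distance,
                                      useHarrisDetector=True)
    if corners is None:
        return []
    return [cv2.KeyPoint(float(x), float(y), 15) for [[x, y]] in corners]

def coarse_offset(current_gray, previous_gray):
    """Estimate the coarse (dx, dy) displacement of the current frame relative to
    the preceding spliced image from matched feature points; None means matching failed."""
    orb = cv2.ORB_create()               # descriptor choice is an illustrative assumption
    kp1 = detect_harris_keypoints(current_gray)
    kp2 = detect_harris_keypoints(previous_gray)
    if not kp1 or not kp2:
        return None
    kp1, des1 = orb.compute(current_gray, kp1)
    kp2, des2 = orb.compute(previous_gray, kp2)
    if des1 is None or des2 is None:
        return None
    matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(des1, des2)
    if len(matches) < 4:
        return None                      # fall back to searching the feature library (step S33)
    shifts = [np.subtract(kp2[m.trainIdx].pt, kp1[m.queryIdx].pt) for m in matches]
    dx, dy = np.median(np.array(shifts), axis=0)   # robust preliminary offset
    return float(dx), float(dy)
```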
The double-projection automatic splicing algorithm of S4 comprises: according to the result of the preliminary splicing, further projecting the current image and the previously spliced image onto the horizontal and vertical directions of the image to obtain projection histograms in the two directions, and performing horizontal and vertical one-dimensional correlation analyses on these histograms to obtain the horizontal and vertical offsets of the image, with which the images are spliced. The splicing precision thus reaches sub-pixel level and the splicing accuracy is improved, while reducing the two-dimensional correlation analysis to two one-dimensional correlation analyses greatly improves splicing efficiency.
Meanwhile, during splicing, the image memory is dynamically expanded as the spliced image gradually grows: the image size is expanded adaptively and the display scale of the spliced image is adjusted accordingly. In other words, dynamically expanding the image memory means that a small memory block is requested first and, as the spliced image grows and the memory becomes insufficient, a new, larger block is requested. The memory in use is therefore always appropriate, which improves system performance and avoids wasting precious system memory that other software processes could use.
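A minimal sketch of such adaptive memory growth, assuming a NumPy array as the backing store and non-negative paste coordinates; the geometric growth factor and the class interface are illustrative assumptions, not the patent's design.

```python
import numpy as np

class GrowingCanvas:
    """Splicing canvas that reallocates a larger buffer only when needed."""
    def __init__(self, height=1024, width=1024, channels=3):
        self.buf = np.zeros((height, width, channels), dtype=np.uint8)

    def paste(self, tile, y, x):
        h, w = tile.shape[:2]
        if y + h > self.buf.shape[0] or x + w > self.buf.shape[1]:
            new_h = max(self.buf.shape[0], y + h) * 2   # grow geometrically
            new_w = max(self.buf.shape[1], x + w) * 2
            bigger = np.zeros((new_h, new_w, self.buf.shape[2]), dtype=self.buf.dtype)
            bigger[:self.buf.shape[0], :self.buf.shape[1]] = self.buf
            self.buf = bigger                            # the old, smaller block is released
        self.buf[y:y + h, x:x + w] = tile
```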
To appreciate the superiority of the double-projection automatic splicing algorithm of this embodiment, a feasible baseline method is described first.
High-precision, sub-pixel splicing of the two images can be achieved with a two-dimensional correlation analysis, which proceeds as follows: let a reference map T (of size K × L) be superimposed on and translated over a search map S (of size M × N); the part of the search image covered by T is called the sub-map S_{i,j}, where i and j, the coordinates in S of the lower-left pixel of the sub-map, serve as the reference point, with 1 ≤ i ≤ M − K + 1 and 1 ≤ j ≤ N − L + 1.

T and S_{i,j} can now be compared: if their contents are identical, the difference between T and S_{i,j} is zero. The following measure can therefore be used to quantify the similarity of T and S_{i,j}:

D(i, j) = Σ_{m=1..K} Σ_{n=1..L} [S_{i,j}(m, n) − T(m, n)]²,

which expands to

D(i, j) = Σ_{m=1..K} Σ_{n=1..L} [S_{i,j}(m, n)]² − 2 Σ_{m=1..K} Σ_{n=1..L} S_{i,j}(m, n)·T(m, n) + Σ_{m=1..K} Σ_{n=1..L} [T(m, n)]².

The third term on the right is the total energy of the reference map T, a constant independent of (i, j); the first term is the energy of the sub-image under the T overlay, which varies slowly with position (i, j); and the second term is the cross-correlation of the sub-image with the reference map, which varies with (i, j) and is largest when T and S_{i,j} match. The following correlation function can therefore be used as the similarity measure:

R(i, j) = Σ_{m=1..K} Σ_{n=1..L} S_{i,j}(m, n)·T(m, n) / Σ_{m=1..K} Σ_{n=1..L} [S_{i,j}(m, n)]²,

normalized as

R(i, j) = Σ_{m,n} S_{i,j}(m, n)·T(m, n) / { [Σ_{m,n} (S_{i,j}(m, n))²]^(1/2) · [Σ_{m,n} (T(m, n))²]^(1/2) },

which is the correlation coefficient of the reference map T and the sub-map S_{i,j}. By the Schwarz inequality, R(i, j) takes values between 0 and 1 and reaches its maximum (equal to 1) only when the ratio S_{i,j}(m, n)/T(m, n) is constant. According to the similarity principle, the larger R(i, j), the closer the two are, and when R(i, j) is maximal, (i, j) is the correct matching position.
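For reference, this exhaustive two-dimensional normalized correlation corresponds to standard template matching. The OpenCV-based sketch below illustrates only the baseline: it returns an integer-pixel location, and the use of cv2.matchTemplate is the author's shortcut, not the patent's method.

```python
import cv2
import numpy as np

def match_2d_ncc(reference_T, search_S):
    """Exhaustive 2-D normalized cross-correlation; returns the best-matching location."""
    # TM_CCORR_NORMED computes the normalized correlation coefficient described above.
    response = cv2.matchTemplate(search_S.astype(np.float32),
                                 reference_T.astype(np.float32),
                                 cv2.TM_CCORR_NORMED)
    _, _, _, max_loc = cv2.minMaxLoc(response)
    return max_loc   # (x, y) offset of the best match in the search image
```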
A two-dimensional correlation analysis of the images can yield a high-precision, sub-pixel splicing result, but it requires a large amount of computation time. This implementation therefore provides an optimization: the two-dimensional correlation analysis is decomposed into two one-dimensional correlation analyses, which greatly improves splicing performance and speed. The method is as follows: the gray values of the first image and the second image are each projected onto the horizontal and the vertical direction, giving four one-dimensional projection sequences; these are differenced and organized as one-dimensional gray-projection difference histograms, so that one-dimensional correlation analysis can be carried out. Correlating the horizontal projection difference histogram of the first image with that of the second image yields the high-precision best coupling point, i.e. the sub-pixel x offset; correlating the vertical projection difference histogram of the first image with that of the second image yields the high-precision best coupling point, i.e. the sub-pixel y offset.
The specific method comprises the following steps:
Set the horizontal projection difference histogram of the first image as the reference histogram T_x, of size K, and the horizontal projection difference histogram of the second image as the search histogram S_x, of size M. T_x is superimposed on S_x and translated over it; the part of S_x covered by T_x is the sub-histogram S_x^i, where i, the coordinate in S_x of the first covered point, serves as the reference point, with 1 ≤ i ≤ M.
T_x and S_x^i can now be compared: if their contents are identical, the difference between T_x and S_x^i is zero. The similarity of T_x and S_x^i can therefore be measured as follows:
D_x(i) = Σ_{m=1..K} [S_x^i(m) − T_x(m)]²,

which expands to

D_x(i) = Σ_{m=1..K} [S_x^i(m)]² − 2 Σ_{m=1..K} S_x^i(m)·T_x(m) + Σ_{m=1..K} [T_x(m)]².
The third term on the right is the total energy of the reference histogram T_x, a constant independent of i; the first term is the energy of the sub-histogram covered by T_x, which varies slowly with position i; and the second term is the cross-correlation of the sub-histogram with the reference histogram, which varies with i and is largest when T_x and S_x^i match. The similarity measure can therefore be taken as the following correlation function:
R_x(i) = Σ_{m=1..K} S_x^i(m)·T_x(m) / Σ_{m=1..K} [S_x^i(m)]²,
normalization is as follows:
R_x(i) = Σ_{m=1..K} S_x^i(m)·T_x(m) / { [Σ_{m=1..K} (S_x^i(m))²]^(1/2) · [Σ_{m=1..K} (T_x(m))²]^(1/2) }.
The above formula is the correlation coefficient of the reference histogram T_x and the sub-histogram S_x^i. By the Schwarz inequality, R_x(i) takes values between 0 and 1 and reaches its maximum (equal to 1) only when the ratio S_x^i(m)/T_x(m) is constant. According to the similarity principle, the larger R_x(i), the closer the two are, and when R_x(i) is maximal, i is the correct matching position along the x axis.
Similarly, let the vertical projection difference histogram of the first image be the reference histogram T_y (of size L) and superimpose it on the vertical projection difference histogram of the second image, the search histogram S_y (of size N). The part of S_y covered by T_y is called the sub-histogram S_y^j, where j, the coordinate in S_y of the first covered point, serves as the reference point, and it is easy to see that 1 ≤ j ≤ L.
T_y and S_y^j can now be compared: if their contents are identical, the difference between T_y and S_y^j is zero. The similarity of T_y and S_y^j can therefore be measured as follows:
D_y(j) = Σ_{n=1..L} [S_y^j(n) − T_y(n)]²,

which expands to

D_y(j) = Σ_{n=1..L} [S_y^j(n)]² − 2 Σ_{n=1..L} S_y^j(n)·T_y(n) + Σ_{n=1..L} [T_y(n)]².
The third term on the right is the total energy of the reference histogram T_y, a constant independent of j; the first term is the energy of the sub-histogram covered by T_y, which varies slowly with position j; and the second term is the cross-correlation of the sub-histogram with the reference histogram, which varies with j and is largest when T_y and S_y^j match. The similarity measure can therefore be taken as the following correlation function:
R_y(j) = Σ_{n=1..L} S_y^j(n)·T_y(n) / Σ_{n=1..L} [S_y^j(n)]²,
normalization is as follows:
R_y(j) = Σ_{n=1..L} S_y^j(n)·T_y(n) / { [Σ_{n=1..L} (S_y^j(n))²]^(1/2) · [Σ_{n=1..L} (T_y(n))²]^(1/2) }.
The above equation is the correlation coefficient of the reference histogram T_y and the sub-histogram S_y^j. By the Schwarz inequality, R_y(j) takes values between 0 and 1 and reaches its maximum (equal to 1) only when the ratio S_y^j(n)/T_y(n) is constant. According to the similarity principle, the larger R_y(j), the closer the two are, and when R_y(j) is maximal, j is the correct matching position along the y axis.
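The NumPy sketch below illustrates the double-projection idea: row and column gray-value projections are differenced, and each one-dimensional normalized correlation is maximized over integer shifts. It assumes the second image is a search region at least as large as the first, extracted from the spliced canvas around the expected position; the sub-pixel refinement of the peak achieved by the patent is omitted, and all function names are the author's.

```python
import numpy as np

def projection_diff_histograms(gray):
    """Horizontal and vertical gray-value projections, differenced into 1-D histograms."""
    horiz = gray.sum(axis=0).astype(np.float64)   # one value per column -> x direction
    vert = gray.sum(axis=1).astype(np.float64)    # one value per row    -> y direction
    return np.diff(horiz), np.diff(vert)

def best_1d_shift(t, s):
    """Slide reference histogram t over search histogram s; return the shift with maximal
    normalized correlation R."""
    k, m = len(t), len(s)
    best_shift, best_r = 0, -1.0
    for i in range(m - k + 1):
        sub = s[i:i + k]
        denom = np.sqrt((sub ** 2).sum()) * np.sqrt((t ** 2).sum())
        r = (sub * t).sum() / denom if denom > 0 else 0.0
        if r > best_r:
            best_shift, best_r = i, r
    return best_shift, best_r

def double_projection_offset(first_gray, second_gray):
    """Estimate (dx, dy) between the two images from two 1-D correlations."""
    tx, ty = projection_diff_histograms(first_gray)
    sx, sy = projection_diff_histograms(second_gray)
    dx, _ = best_1d_shift(tx, sx)
    dy, _ = best_1d_shift(ty, sy)
    return dx, dy
```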
According to the obtained x and y matching offsets, the current image is fused into the large spliced image. The local fusion can adopt a fade-in/fade-out gradient fusion method (see patent document CN202011053664), i.e. the fusion uses an optimized nonlinear interpolation and a configurable overlap parameter alpha for the fusion region, so as to obtain the best fusion effect.
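As an illustration of the fade-in/fade-out idea only: the sketch below uses a simple linear ramp over the overlap, not the optimized nonlinear interpolation with the alpha parameter referenced above, and the ramp direction is an assumption.

```python
import numpy as np

def fade_blend_horizontal(left_img, right_img, overlap):
    """Blend two tiles of equal height that overlap by `overlap` columns with a fade ramp."""
    l = left_img.astype(np.float64)
    r = right_img.astype(np.float64)
    w = np.linspace(1.0, 0.0, overlap)[None, :, None]   # weight of the left tile across the overlap
    blended = w * l[:, -overlap:] + (1.0 - w) * r[:, :overlap]
    return np.concatenate([l[:, :-overlap], blended, r[:, overlap:]], axis=1).astype(np.uint8)
```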
The fused large spliced image and the related parameters are then stored, the dynamic display state is refreshed, and the next round of operation begins.
Embodiment 2
The invention further aims to provide an intelligent splicing device for microscope images, which comprises a microscope, a camera and computer equipment.
The microscope has a stage movable along the X/Y axes and a focusing unit. The stage can be motorized or manual, and the focusing unit can be a manually rotated focusing knob or an automatic focusing unit; this application does not limit them in detail.
The camera is an electronic-rolling-shutter camera; an ordinary digital camera can be used.
The computer equipment has a processor and a human-computer interaction unit, and can be a desktop computer, a notebook computer or a smart mobile device. The processor is connected with the camera and executes the intelligent splicing method for microscope images described in embodiment 1; the interactive interface of the human-computer interaction unit shows the dynamic display of the image and the operation prompts during execution.
As shown in fig. 6, the human-computer interface displays the software screen in the various states of the dynamic splicing process: box A marks the previous frame image; box B marks the current frame image; box C indicates that the current dynamic image is not sharp enough, and the microscope focus must then be fine-tuned manually to obtain a clear dynamic image; box D indicates that no splice point can be found for the current image, and the X/Y axes can then be moved to search for a splice point again.
Embodiment 3
An embodiment of the present application further provides a computer-readable storage medium, in which at least one instruction or at least one program is stored, and the at least one instruction or the at least one program is loaded and executed by a processor to implement the method for intelligently stitching microscope images according to embodiment 1.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by a computer program instructing the relevant hardware; the program can be stored in a non-volatile computer-readable storage medium and, when executed, can include the processes of the method embodiments described above. Any reference to memory, storage, a database or another medium provided herein and used in the embodiments may include non-volatile and/or volatile memory. Non-volatile memory can include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM) or flash memory. Volatile memory can include random-access memory (RAM) or external cache memory. By way of illustration and not limitation, RAM is available in many forms, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchronous-link DRAM (SLDRAM), Rambus direct RAM (RDRAM) and direct Rambus dynamic RAM (DRDRAM).
Embodiment 4
Referring to fig. 7, another object of the invention is to provide a computer device comprising a processor and a memory. The processor of the computer device provides computing and control capabilities. The memory of the computer device comprises a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system, a computer program and a database. The computer program is loaded by the processor and executes the intelligent splicing method for microscope images described in embodiment 1. The internal memory provides the environment in which the operating system and the computer program in the non-volatile storage medium run. The database of the computer device stores data such as interface access methods. The network interface of the computer device communicates with external terminals through a network connection.
It will be evident to those skilled in the art that the invention is not limited to the details of the foregoing illustrative embodiments, and that the present invention may be embodied in other specific forms without departing from the spirit or essential attributes thereof. The present embodiments are therefore to be considered in all respects as illustrative and not restrictive, the scope of the invention being indicated by the appended claims rather than by the foregoing description, and all changes which come within the meaning and range of equivalency of the claims are therefore intended to be embraced therein. Any reference sign in a claim should not be construed as limiting the claim concerned.
Furthermore, it should be understood that although the present description refers to embodiments, not every embodiment contains only a single independent technical solution; this manner of description is adopted only for clarity. Those skilled in the art should take the description as a whole, and the technical solutions of the embodiments may be combined as appropriate to form other embodiments understandable to those skilled in the art.

Claims (10)

1. An intelligent splicing method for microscope images, characterized by comprising the following steps:
S1, exposing the camera continuously to obtain a real-time dynamic image of the microscope, the camera being an electronic-rolling-shutter camera;
S2, performing real-time motion-blur detection on the acquired real-time dynamic image and judging whether the image quality meets a preset evaluation standard; if so, proceeding to S3, otherwise returning to S1;
S3, extracting a first image and a second image, wherein the first image is the currently acquired dynamic image and the second image is extracted from the already spliced large image at the position corresponding to the current dynamic image;
and S4, projecting the result of the preliminary splicing of the current image and the previously spliced image onto the horizontal and vertical directions of the image to obtain projection histograms in the two directions, and performing horizontal and vertical one-dimensional correlation analyses on the projection histograms to obtain the horizontal and vertical offsets of the image, thereby splicing the images.
2. The intelligent stitching method for microscope images as claimed in claim 1, characterized in that:
in S1, the stage has a feed rate and a fast-forward rate;
when the current real-time dynamic image has been spliced, the stage is switched from the feed rate to the fast-forward rate and moves quickly;
and when the stage moves to the boundary of the currently spliced large image, the stage is switched back from the fast-forward rate to the feed rate until the image splicing of the next field to be spliced is completed.
3. The method for intelligently stitching microscope images as claimed in claim 1, wherein S2 includes:
S21, converting the current image from an RGB image to a grayscale image;
S22, sampling the converted grayscale image, performing edge detection, extracting image sharpness parameters and normalizing them;
edge detection is performed with a Laplace edge-detection algorithm; an improved Laplace operator is introduced which takes the sum of the absolute values of the second-order partial derivatives, so that for a two-dimensional image function f(x, y) the modified Laplacian of the image is

ML(x, y) = |∂²f(x, y)/∂x²| + |∂²f(x, y)/∂y²|;

differences are used instead of derivatives for the calculation and, taking into account the texture variations of the microscopic image, the second-order partial differences are computed with a variable step between pixels:

ML(x, y) = |2f(x, y) − f(x−step, y) − f(x+step, y)| + |2f(x, y) − f(x, y−step) − f(x, y+step)|;

the detection parameter is obtained as

Σ_x Σ_y ML(x, y), summed over the pixels for which ML(x, y) > T,

where T is a threshold and only modified-Laplacian values greater than T take part in the accumulation; the resulting detection parameter is compared with a detection threshold, and if the detection threshold is reached the image quality is judged to meet the preset evaluation standard and the detection parameter is stored in a detection-parameter library; otherwise, the image quality is judged not to meet the preset evaluation standard.
4. The intelligent stitching method for microscope images as claimed in claim 3, characterized in that:
the variable step and the threshold T are chosen in positive correlation with the noise level of the image sequence.
5. The method for intelligently stitching microscope images as claimed in claim 1, wherein S3 comprises:
S31, detecting the image features, texture features, morphological features and spatial-relationship features of the current image that contain rich local information, forming a data structure and a descriptor for the detected feature points, and constructing the basic data elements for image matching and splicing;
S32, after the feature points and feature descriptors of the current image are obtained, performing feature matching with the preceding image; if the matching succeeds, proceeding to step S34, otherwise proceeding to step S33; the feature library is the set of feature points of all preceding images;
S33, performing feature matching against the feature points of the feature library, matching progressively outward from the current image position to the feature points of the surrounding neighboring images; if the matching succeeds, proceeding to step S34; otherwise refreshing the status and reminding the user, or controlling the stage, to return to the position of the existing spliced image;
S34, extracting good matching points from the feature points of the current image and storing them in the feature library for matching of subsequent images;
S35, calculating the coverage of the successfully matched feature points, determining the matching area and the position information of the current image, and calculating the displacement between the image and the previously spliced image and its position within the spliced large image, thereby extracting the first image and the second image.
6. The intelligent stitching method for microscope images as claimed in claim 1, wherein S4 specifically comprises:
setting the horizontal projection difference histogram of the first image as the reference histogram T_x, of size K, and the horizontal projection difference histogram of the second image as the search histogram S_x, of size M; superimposing T_x on S_x and translating it, the part of S_x covered by T_x being the sub-histogram S_x^i, where i, the coordinate in S_x of the first covered point, serves as the reference point; the similarity measure is

R_x(i) = Σ_{m=1..K} S_x^i(m)·T_x(m) / Σ_{m=1..K} [S_x^i(m)]²,

wherein the numerator is the cross-correlation of the sub-histogram S_x^i with the reference histogram T_x and the denominator is the energy of S_x^i; normalized, it becomes

R_x(i) = Σ_{m=1..K} S_x^i(m)·T_x(m) / { [Σ_{m=1..K} (S_x^i(m))²]^(1/2) · [Σ_{m=1..K} (T_x(m))²]^(1/2) };

when R_x(i) is maximal, i is the correct matching position along the x axis;

similarly, setting the vertical projection difference histogram of the first image as the reference histogram T_y, of size L, and the vertical projection difference histogram of the second image as the search histogram S_y, of size N; superimposing T_y on S_y and translating it, the part of S_y covered by T_y being the sub-histogram S_y^j, where j, the coordinate in S_y of the first covered point, serves as the reference point; the similarity measure is

R_y(j) = Σ_{n=1..L} S_y^j(n)·T_y(n) / Σ_{n=1..L} [S_y^j(n)]²,

wherein the numerator is the cross-correlation of the sub-histogram S_y^j with the reference histogram T_y and the denominator is the energy of S_y^j; normalized, it becomes

R_y(j) = Σ_{n=1..L} S_y^j(n)·T_y(n) / { [Σ_{n=1..L} (S_y^j(n))²]^(1/2) · [Σ_{n=1..L} (T_y(n))²]^(1/2) };

and when R_y(j) is maximal, j is the correct matching position along the y axis.
7. The intelligent stitching method for microscope images as claimed in claim 1, characterized in that:
displaying a dynamic display of the current image and the corresponding operation prompts on a human-computer interaction interface for the user to operate and complete the image splicing; the dynamic display of the current image comprises one or more of: the current image being a blurred image, the current image being a focused image, the offset distance and positioning of the current image relative to the previously spliced image, the matching area of the current image and the preceding image, the current image and the preceding image having no matching area, and the position of the current image among the spliced surrounding images.
8. An intelligent splicing device for microscope images, characterized in that the device comprises:
a microscope having a stage movable along an X/Y axis and a focusing unit;
a camera, the camera being an electronic rolling shutter camera;
a computer device, a control unit and a display unit, the computer device being provided with a processor and a human-computer interaction unit; the processor is connected with the camera and is used for executing the intelligent splicing method for microscope images as claimed in any one of claims 1 to 7; and the interactive interface of the human-computer interaction unit displays the dynamic display of the image and the operation prompts during execution.
9. A computer-readable storage medium characterized by: the storage medium stores at least one instruction or at least one program, and the at least one instruction or the at least one program is loaded and executed by a processor to implement the intelligent splicing method for microscope images according to any one of claims 1 to 7.
10. A computer device, characterized in that: the computer device comprises a processor and a memory, wherein at least one instruction or at least one program is stored in the memory, and the at least one instruction or program is loaded by the processor to execute the intelligent splicing method for microscope images as claimed in any one of claims 1 to 7.
CN202111533437.8A 2021-12-15 2021-12-15 Intelligent splicing method, device, medium and equipment for microscope images Pending CN114463231A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111533437.8A CN114463231A (en) 2021-12-15 2021-12-15 Intelligent splicing method, device, medium and equipment for microscope images

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111533437.8A CN114463231A (en) 2021-12-15 2021-12-15 Intelligent splicing method, device, medium and equipment for microscope images

Publications (1)

Publication Number Publication Date
CN114463231A true CN114463231A (en) 2022-05-10

Family

ID=81405552

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111533437.8A Pending CN114463231A (en) 2021-12-15 2021-12-15 Intelligent splicing method, device, medium and equipment for microscope images

Country Status (1)

Country Link
CN (1) CN114463231A (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116630164A (en) * 2023-07-21 2023-08-22 中国人民解放军国防科技大学 Real-time splicing method for massive microscopic images
CN116630164B (en) * 2023-07-21 2023-09-26 中国人民解放军国防科技大学 Real-time splicing method for massive microscopic images
CN116978005A (en) * 2023-09-22 2023-10-31 南京凯视迈科技有限公司 Microscope image processing system based on attitude transformation
CN116978005B (en) * 2023-09-22 2023-12-19 南京凯视迈科技有限公司 Microscope image processing system based on attitude transformation

Similar Documents

Publication Publication Date Title
JP4139853B2 (en) Image processing apparatus, image processing method, and image processing program
CN114463231A (en) Intelligent splicing method, device, medium and equipment for microscope images
US10026183B2 (en) Method, system and apparatus for determining distance to an object in a scene
US20140253785A1 (en) Auto Focus Based on Analysis of State or State Change of Image Content
WO2007077283A1 (en) Method and device for controlling auto focusing of a video camera by tracking a region-of-interest
JP2000182034A (en) Automatic image synthesizing system
CN106031148B (en) Imaging device, method of auto-focusing in an imaging device and corresponding computer program
EP4057623A1 (en) Subject detection method and apparatus, electronic device, and computer-readable storage medium
CN110033461B (en) Mobile phone anti-shake function evaluation method based on target displacement estimation
CN110166680B (en) Device imaging method and device, storage medium and electronic device
CN113645406B (en) Scanning focusing method and terminal
CN114390201A (en) Focusing method and device thereof
US11778327B2 (en) Image reconstruction method and device
CN114339042A (en) Image processing method and device based on multiple cameras and computer readable storage medium
US11803978B2 (en) Generating composite image from multiple images captured for subject
CN117114997B (en) Image stitching method and device based on suture line search algorithm
CN110992408B (en) Digital section processing method and system based on pathological microscope
US20160142616A1 (en) Direction aware autofocus
CN116456191A (en) Image generation method, device, equipment and computer readable storage medium
CN112839168B (en) Method for automatically adjusting camera imaging resolution in AOI detection system
RU2647645C1 (en) Method of eliminating seams when creating panoramic images from video stream of frames in real-time
CN112634298B (en) Image processing method and device, storage medium and terminal
CN113204107B (en) Three-dimensional scanning microscope with double objective lenses and three-dimensional scanning method
CN112866545B (en) Focusing control method and device, electronic equipment and computer readable storage medium
US20170374285A1 (en) Image processing method for movement detection and compensation

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination