CN111626936A - Rapid panoramic stitching method and system for microscopic images - Google Patents

Rapid panoramic stitching method and system for microscopic images

Info

Publication number
CN111626936A
Authority
CN
China
Prior art keywords
image
microscopic image
local
optical flow
microscopic
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010439003.0A
Other languages
Chinese (zh)
Other versions
CN111626936B (en)
Inventor
谷秀娟
向北海
许会
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hunan Guokezhitong Technology Co ltd
Original Assignee
Hunan Guokezhitong Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hunan Guokezhitong Technology Co ltd filed Critical Hunan Guokezhitong Technology Co ltd
Priority to CN202010439003.0A priority Critical patent/CN111626936B/en
Publication of CN111626936A publication Critical patent/CN111626936A/en
Application granted granted Critical
Publication of CN111626936B publication Critical patent/CN111626936B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00Geometric image transformations in the plane of the image
    • G06T3/40Scaling of whole images or parts thereof, e.g. expanding or contracting
    • G06T3/4038Image mosaicing, e.g. composing plane images from plane sub-images
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/50Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/70Denoising; Smoothing
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • G06T7/0012Biomedical image inspection
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/20Analysis of motion
    • G06T7/246Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/20Analysis of motion
    • G06T7/269Analysis of motion using gradient-based methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10016Video; Image sequence
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10056Microscopic image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20081Training; Learning
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20084Artificial neural networks [ANN]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30004Biomedical image processing

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Medical Informatics (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Radiology & Medical Imaging (AREA)
  • Quality & Reliability (AREA)
  • Image Processing (AREA)
  • Image Analysis (AREA)
  • Studio Devices (AREA)
  • Stereoscopic And Panoramic Photography (AREA)

Abstract

The invention discloses a rapid panoramic stitching method for microscopic images. An end-to-end network performs coarse-to-fine optical flow estimation with high computational speed and accurate flow estimates, fully meeting the requirements of image acquisition and computation. A spatial transformer network then predicts a transformation matrix directly from the optical flow feature map and spatially transforms the local microscopic images according to it, so that the overlap regions of adjacent local microscopic images are accurately aligned. The overlap regions of the aligned images are blended with fade-in/fade-out linear fusion, which effectively eliminates seams and ghosting in the fused image and achieves seamless stitching. Finally, bilateral filtering performs panoramic rendering on the panoramic fused microscopic image to remove residual fusion seams and produce a seamlessly stitched panorama. Compared with the prior art, the proposed panoramic stitching method stitches quickly and accurately and meets the requirements of practical application.

Description

Rapid panoramic stitching method and system for microscopic images
Technical Field
The invention relates to the technical field of microscopic image processing, and in particular to a rapid panoramic stitching method and system for microscopic images.
Background
In disease diagnosis and pathological research, traditional microscopy suffers from numerous operating steps, heavy workload, difficult resource sharing, and the impossibility of long-term storage; digital pathology slide technology was developed to address these problems. A pathological section scanner scans and photographs cells or tissue sections on a glass slide to obtain microscopic images, which subsequent software stitches into a panorama and automatically analyses, enabling the analysis and diagnosis of cell or tissue images.
In existing pathological section scanners, the field of view and the resolution are inversely related in the optical domain, and obtaining a large-field, high-resolution image places very strict demands on the microscope's optical system; image stitching is the most common technique for solving this problem. It acquires high-resolution images of different regions of a slice and finally fuses them into a panorama, constructing a complete large-scale, high-resolution microscopic image. During acquisition, however, too large a displacement between two adjacent images often leaves too few matching feature points, causing stitching to fail.
Disclosure of Invention
The invention provides a rapid panoramic stitching method and system for microscopic images, overcoming defects of the prior art such as stitching failure caused by too few feature points.
In order to achieve the above object, the present invention provides a rapid panoramic stitching method for microscopic images, comprising:
controlling a microscope objective to acquire local microscopic images of a pathological section along a pre-planned acquisition path;
performing optical flow estimation on the local microscopic image acquired at the current position and the one acquired at the previous position with a pre-trained end-to-end network to obtain an optical flow feature map;
converting the optical flow feature map with a pre-trained spatial transformer network to obtain an initial matrix, multiplying the initial matrix by the transformation matrix obtained at the previous position to obtain the transformation matrix at the current position, and aligning the two local microscopic images according to that matrix to obtain a first aligned microscopic image;
linearly fusing the overlap region of the first aligned microscopic image with fade-in/fade-out linear fusion to obtain a first fused microscopic image;
controlling the microscope objective or stage to move to the next position along the pre-planned acquisition path and acquire a local microscopic image, then performing optical flow estimation and alignment on the local microscopic image acquired at the current position and the one acquired at the previous position to obtain a second aligned microscopic image, and linearly fusing the second aligned microscopic image with the first fused microscopic image to obtain a second fused microscopic image;
repeating the optical flow estimation, alignment, and linear fusion until all local microscopic images are fused into a panoramic fused microscopic image;
and performing panoramic rendering on the panoramic fused microscopic image with bilateral filtering to obtain a panoramic stitched microscopic image.
In order to achieve the above object, the present invention further provides a rapid panoramic stitching system for microscopic images, comprising:
an image acquisition module for controlling the microscope objective to acquire local microscopic images of a pathological section along a pre-planned acquisition path;
an optical flow estimation module for performing optical flow estimation on the local microscopic image acquired at the current position and the one acquired at the previous position with a pre-trained end-to-end network to obtain an optical flow feature map;
an image alignment module for converting the optical flow feature map with a pre-trained spatial transformer network to obtain an initial matrix, multiplying the initial matrix by the transformation matrix obtained at the previous position to obtain the transformation matrix at the current position, and aligning the two local microscopic images according to that matrix to obtain a first aligned microscopic image;
a fusion module for linearly fusing the overlap region of the first aligned microscopic image with fade-in/fade-out linear fusion to obtain a first fused microscopic image;
a loop module for controlling the microscope objective or stage to move to the next position along the pre-planned acquisition path and acquire a local microscopic image, then performing optical flow estimation and alignment on the local microscopic image acquired at the current position and the one acquired at the previous position to obtain a second aligned microscopic image, and linearly fusing the second aligned microscopic image with the first fused microscopic image to obtain a second fused microscopic image; the optical flow estimation, alignment, and linear fusion are repeated until all local microscopic images are fused into a panoramic fused microscopic image;
and a rendering module for performing panoramic rendering on the panoramic fused microscopic image with bilateral filtering to obtain a panoramic stitched microscopic image.
To achieve the above object, the present invention further provides a computer device comprising a memory and a processor, the memory storing a computer program; the processor implements the steps of the above method when executing the computer program.
Compared with the prior art, the invention has the beneficial effects that:
the rapid panoramic stitching method for the microscopic images provided by the invention can realize coarse-to-fine optical flow estimation by utilizing an end-to-end network, has high calculation speed and accurate optical flow estimation, and can completely meet the requirements of image acquisition and calculation; combining a spatial transformation network, wherein the spatial transformation network can directly predict a transformation matrix from the optical flow characteristic diagram, and performs spatial transformation on the local microscopic images according to the transformation matrix so as to accurately align the overlapping areas of the adjacent local microscopic images; then, linear fusion is carried out on the overlapped area of the aligned microscopic images by adopting a gradually-in and gradually-out linear fusion mode, so that gaps and double images of the fused images can be effectively eliminated, and seamless splicing is realized; and finally, performing panoramic rendering on the panoramic fusion microscopic image by utilizing bilateral filtering so as to eliminate the spliced fusion gap and generate a seamless spliced image. Compared with the prior art, the panoramic stitching method provided by the invention has the advantages of high stitching speed and high accuracy, and can meet the requirements of image acquisition and stitching.
Drawings
To illustrate the embodiments of the present invention or the technical solutions of the prior art more clearly, the drawings used in their description are briefly introduced below. Obviously, the drawings described below show only some embodiments of the invention; those skilled in the art can derive other drawings from them without creative effort.
FIG. 1 is a flow chart of a method for rapid panoramic stitching of microscopic images according to the present invention;
FIG. 2 is a block diagram of an end-to-end network in an embodiment of the present invention;
FIG. 3 is a flowchart illustrating the operation of a spatial transform network according to an embodiment of the present invention;
FIG. 4 is a network structure diagram of the refinement module in an embodiment of the present invention.
The implementation, functional features and advantages of the objects of the present invention will be further explained with reference to the accompanying drawings.
Detailed Description
The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the invention. All other embodiments obtained by a person skilled in the art from these embodiments without creative effort fall within the protection scope of the present invention.
In addition, the technical solutions of the embodiments may be combined with one another, but only insofar as a person skilled in the art can realise the combination; when technical solutions contradict each other or a combination cannot be realised, that combination should be deemed not to exist and falls outside the protection scope of the present invention.
The invention provides a rapid panoramic stitching method of microscopic images, which comprises the following steps of:
101: controlling the microscope objective or stage to acquire local microscopic images of the pathological section along a pre-planned acquisition path;
The microscope objective acquires one local microscopic image at each dwell position on the acquisition path.
The acquisition path may be a top-to-bottom serpentine path, a left-to-right serpentine path, or any other path, as long as all cells in the pathological section are covered.
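As an illustration of such a path plan, the serpentine ordering can be sketched in a few lines of Python; the grid-of-positions model and the function name are assumptions for illustration, not part of the patent:

```python
def serpentine_path(rows, cols):
    """Visit a rows x cols grid of stage positions in serpentine order.

    Even-numbered rows run left to right, odd-numbered rows run right
    to left, so the stage never jumps back across the slide.
    """
    path = []
    for r in range(rows):
        cols_order = range(cols) if r % 2 == 0 else range(cols - 1, -1, -1)
        for c in cols_order:
            path.append((r, c))
    return path
```

For a 2x3 grid this yields (0,0), (0,1), (0,2), (1,2), (1,1), (1,0); any ordering works as long as consecutive stops overlap.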
102: performing optical flow estimation on the local microscopic image acquired at the current position and the one acquired at the previous position with a pre-trained end-to-end network to obtain an optical flow feature map;
An end-to-end network replaces a multi-stage process with a single deep neural network. For example, conventional optical flow estimation comprises three stages: feature extraction, feature matching, and optical flow computation. An end-to-end network predicts the output optical flow directly from the two input images.
Optical flow estimation means that, given two frames, the motion vectors of corresponding points between the previous and the next frame are estimated; these vectors are the instantaneous velocities of the pixel motion of a moving object projected onto the imaging plane.
103: converting the optical flow feature map with a pre-trained spatial transformer network to obtain an initial matrix, multiplying the initial matrix by the transformation matrix obtained at the previous position to obtain the transformation matrix at the current position, and aligning the two local microscopic images according to that matrix to obtain a first aligned microscopic image;
Spatial transformer networks (STNs) are a convolutional neural network module proposed by Jaderberg et al. They improve the classification accuracy of a convolutional model by transforming the input images to reduce the effect of spatial variability in the data, rather than by changing the network structure. STNs are robust and spatially invariant to translation, scaling, rotation, perturbation, bending, and the like.
The transformation matrix includes translation, scaling, rotation, and other parameters.
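The accumulation of transforms in step 103 (multiplying the matrix predicted for the current pair by the transform of the previous position) can be sketched with homogeneous 3x3 matrices in numpy; the helper below is an illustration under that convention, not the patent's implementation:

```python
import numpy as np

def affine(tx=0.0, ty=0.0, angle=0.0, scale=1.0):
    """Build a 3x3 homogeneous matrix from translation, rotation, scaling."""
    c, s = np.cos(angle) * scale, np.sin(angle) * scale
    return np.array([[c, -s, tx],
                     [s,  c, ty],
                     [0.0, 0.0, 1.0]])

# transform of the previous position (maps frame 1 into panorama coordinates)
T_prev = affine(tx=100.0)
# initial matrix predicted between the current and previous frames
T_init = affine(tx=95.0)
# transform of the current position, composed by matrix multiplication as in step 103
T_curr = T_init @ T_prev
```

Composing by matrix multiplication means that, in the pure-translation case, the offsets simply accumulate along the acquisition path.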
104: linearly fusing the overlap region of the first aligned microscopic image with fade-in/fade-out linear fusion to obtain a first fused microscopic image;
Fade-in/fade-out linear fusion assigns linearly varying weights to the pixels of the two images in the overlap region; as the weight grows from 0 to 1, the pixel values transition from the left image to the right image across the overlap.
105: controlling the microscope objective or stage to move to the next position along the pre-planned acquisition path and acquire a local microscopic image I2, then performing the optical flow estimation of step 102 and the alignment of step 103 on the local microscopic image I2 acquired at the current position and the local microscopic image I1 acquired at the previous position to obtain a second aligned microscopic image, and performing the linear fusion of step 104 on the second aligned microscopic image and the first fused microscopic image to obtain a second fused microscopic image;
After the local microscopic image I2 acquired at the current position and the local microscopic image I1 acquired at the previous position are fused, the fused microscopic image is stored; I2 itself is also kept and is released only after it has been fused with the local microscopic image acquired at the next position.
106: repeating the optical flow estimation, alignment, and linear fusion of step 105 until all local microscopic images are fused into a panoramic fused microscopic image;
Optical flow estimation, alignment, and linear fusion are carried out while the local microscopic images are being acquired, so the panoramic fused microscopic image is obtained as soon as acquisition finishes.
107: performing panoramic rendering on the panoramic fused microscopic image with bilateral filtering to obtain a panoramic stitched microscopic image.
Bilateral filtering is a non-linear filter whose basic idea is to represent the intensity of a pixel by a Gaussian-weighted average of the intensity values of the surrounding pixels. Its weights combine the Euclidean (spatial) distance between pixels and the radiometric (intensity) difference in the pixel domain, and both are considered simultaneously when computing the centre pixel (ref: Tomasi C, Manduchi R. Bilateral filtering for gray and color images. ICCV, 1998: 839-846).
The panoramic rendering smooths all overlap regions in the panoramic fused microscopic image.
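A naive (unoptimised) bilateral filter following this definition, a spatial Gaussian multiplied by an intensity-difference Gaussian and normalised at each pixel, might look as follows; the parameter names and default values are illustrative, not from the patent:

```python
import numpy as np

def bilateral_filter(img, radius=2, sigma_s=2.0, sigma_r=30.0):
    """Naive bilateral filter for a 2-D grayscale image.

    Each output pixel is a weighted mean of its neighbourhood, where the
    weight is a spatial Gaussian times an intensity-difference Gaussian.
    """
    h, w = img.shape
    pad = np.pad(img.astype(float), radius, mode='edge')
    out = np.empty((h, w), dtype=float)
    ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    spatial = np.exp(-(ys**2 + xs**2) / (2 * sigma_s**2))
    for i in range(h):
        for j in range(w):
            patch = pad[i:i + 2 * radius + 1, j:j + 2 * radius + 1]
            rangew = np.exp(-(patch - img[i, j])**2 / (2 * sigma_r**2))
            wgt = spatial * rangew
            out[i, j] = (wgt * patch).sum() / wgt.sum()
    return out
```

Because the weights are normalised, flat regions pass through unchanged while edges are preserved, which is why it can smooth fusion seams without blurring cell boundaries.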
In one embodiment, before step 101 the method further includes:
001: pre-scanning the pathological section to obtain its blank regions and target region;
The target region is the region that contains cells.
002: planning the acquisition path within the target region.
The purpose of the pre-scan is to exclude the blank regions of the pathological section, improving efficiency by reducing the workload.
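Steps 001-002 amount to classifying pre-scan tiles as blank or tissue. A minimal sketch, assuming a bright and nearly uniform slide background; all thresholds and names are illustrative, not values from the patent:

```python
import numpy as np

def tissue_mask(lowres, blank_level=240.0, min_std=5.0, tile=4):
    """Mark tiles of a low-resolution pre-scan that likely contain tissue.

    A tile is treated as blank when it is both bright (near the slide
    background level) and nearly uniform; everything else is kept as
    the target region for path planning.
    """
    h, w = lowres.shape
    rows, cols = h // tile, w // tile
    mask = np.zeros((rows, cols), dtype=bool)
    for r in range(rows):
        for c in range(cols):
            t = lowres[r * tile:(r + 1) * tile, c * tile:(c + 1) * tile]
            mask[r, c] = not (t.mean() > blank_level and t.std() < min_std)
    return mask
```

The acquisition path of step 002 would then visit only the True tiles.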
In the next embodiment, for step 101, a high-magnification digital microscope acquires single local microscopic images in different fields of view of the pathological section. When acquiring them, it must be ensured that adjacent local microscopic images share an overlap region, and acquisition continues until the union of the individual fields of view covers the cell or tissue sample area of the original pathological section.
The microscope objective has a magnification of 20X, 40X, or 100X.
The path in this embodiment is a left-to-right serpentine path.
The microscope objective or stage moves along the left-to-right serpentine path, acquiring local microscopic images of the pathological section as individual fields of view that overlap one another.
In existing pathological section scanners, both the scanning-and-stitching speed and the quality of the generated large microscopic image are key factors, and a balance must be struck between them so that the best stitched panorama is obtained as fast as possible. The overlap produced by each movement of the microscope objective or stage is therefore set to 20-30% of the area of a single local microscopic image.
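Given that the overlap is set to 20-30% of a single field, the stage step between adjacent positions follows directly. A small helper (units of pixels assumed; the function is illustrative):

```python
def step_size(field_width, overlap_frac=0.25):
    """Stage step so adjacent fields of view overlap by overlap_frac (20-30%)."""
    if not 0.0 < overlap_frac < 1.0:
        raise ValueError("overlap fraction must lie in (0, 1)")
    return int(round(field_width * (1.0 - overlap_frac)))
```

For a 1000-pixel-wide field and 25% overlap, the stage advances 750 pixels per stop.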
In another embodiment, for step 102, the step of performing optical flow estimation on the local microscopic image acquired at the current position and the local microscopic image acquired at the previous position by using a pre-trained end-to-end network (a structure diagram of the end-to-end network in this embodiment is shown in fig. 2) to obtain an optical flow feature map includes:
201: local microscopic image I2 (Im) acquired at current position by using pre-trained FlowNet1 network (optical flow estimation network)age2) and a local microscopic Image I1(Image1) acquired at the previous position to carry out rough optical flow estimation to obtain a rough optical flow characteristic diagram F1(Flow1) from the rough optical Flow feature F1Carrying out position transformation on the local microscopic image I2 to obtain a first transformation image W1(Warped1), calculating a local microscopic image I1 and a first transformed image W1Obtaining a first brightness error BE from the brightness difference1(Brightness Error1);
Using a pre-trained FlowNet2 network to perform local microscopic image I1, local microscopic image I2 and rough optical flow characteristic diagram F1The first converted image W1And brightness error BE1Performing fine optical flow estimation to obtain fine optical flow characteristic diagram F2(Flow2) from the fine optical Flow feature map F2The position of the local microscopic image I2 is transformed to obtain a second transformed image W2(Warped2), calculating a local microscopic image I1 and a second transformed image W2Obtaining a second brightness error BE from the brightness difference2(Brightness Error2);
Using a pre-trained FlowNet2 network to perform local microscopic image I1, local microscopic image I2 and fine optical flow characteristic diagram F2The second converted image W2And a second luminance error BE2And performing fine optical flow estimation to obtain an optical flow feature map F (Featuremap).
Considering the inaccuracy of optical flow estimation, the invention uses the optical flow characteristic diagram to transform the first transformed image W obtained after the local microscopic image I2 is transformed1And a second transformed image W2. First transformed image W1And a second transformed image W2Since there is still a certain deviation from the local microscopic image I1, it is necessary to perform the local microscopic image I1 and the first transformed image W1The second converted image W2Is subtracted to obtain a first brightness error BE1And a second luminance error BE2And the method is used for subsequent accurate optical flow estimation.
In a certain embodiment, the FlowNet1 network comprises, in order, 9 convolutional layers and 1 refinement module (an adjustment module, which may also be called a decoding module);
the 9 convolutional layers perform high-level feature extraction on the pre-stacked local microscopic images to obtain a feature map;
and the refinement module performs upsampling with deconvolution layers to obtain the coarse optical flow feature map.
The network structure of the refinement module is shown in fig. 4. It comprises 4 deconvolution layers (deconv4, deconv3, deconv2, deconv1). The input of deconv4 is the feature map output by the conv6 convolutional layer; the input of each of the last three deconvolution layers has two parts, the first being the output of the previous deconvolution layer and the second being the feature map output of a convolutional layer in the end-to-end network (conv5_1, conv4_1, conv3_1), so that high-level and low-level information is fused and a coarse-to-fine mechanism is introduced. The input of the FlowNet1 network is two 3-channel images; before being fed to the network they are stacked into a 6-channel tensor, which passes through the 9 convolutional layers for high-level feature extraction; finally the refinement module repeatedly upsamples the feature map from the preceding convolutional layer and outputs a 2-channel coarse optical flow feature map of the same size as the input images.
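The 6-channel input is just a channel-wise concatenation of the two frames. In numpy, with stand-in arrays and no actual network (shapes are illustrative):

```python
import numpy as np

# two 3-channel local microscopic images (random stand-ins for I1 and I2)
h, w = 64, 64
img1 = np.zeros((h, w, 3), dtype=np.float32)
img2 = np.ones((h, w, 3), dtype=np.float32)

# stacked along the channel axis: the 6-channel tensor fed to FlowNet1
stacked = np.concatenate([img1, img2], axis=-1)
```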
In another embodiment, the FlowNet2 network comprises, in order, 9 convolutional layers, 1 correlation layer, and 1 refinement module;
the first 3 convolutional layers perform feature extraction on each input local microscopic image to obtain a first feature map;
the correlation layer performs a correlation operation on the feature maps of the different local microscopic images, combining their features into a feature fusion map;
the last 6 convolutional layers perform high-level feature extraction on the feature fusion map to obtain a second feature map;
and the refinement module performs upsampling with deconvolution layers to obtain the fine optical flow feature map.
The input of the FlowNet2 network is two 3-channel images. Each passes through the first 3 convolutional layers for feature extraction, yielding a first feature map; the two first feature maps then enter the correlation layer, where the correlation operation merges their features into a feature fusion map; the feature fusion map passes through the next 6 convolutional layers for high-level feature extraction, yielding a second feature map; finally, the refinement module repeatedly upsamples the feature map from the preceding convolutional layer and outputs a 2-channel fine optical flow feature map of the same size as the input images.
In the next embodiment, the formula of the correlation operation is:
Figure BDA0002503366730000111
in the formula, c (x)1,x2) Is the correlation value, x, of two feature maps1,x2Respectively corresponding points on the two characteristic graphs; f. of1And f2Two characteristic graphs are obtained; k is the boundary value of the region to be compared; o is the position value of any point in the area to be compared;<>the correlation operation is the multiplication of the pixels at the corresponding positions, and the larger the correlation value is, the closer the characteristic diagram is represented.
The correlation operation is used for comparing the correlation between the two input feature maps.
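As a concrete illustration, the patch-based correlation above can be sketched in a few lines of NumPy (an illustrative sketch, not the patent's implementation; the function name and the brute-force loops are our own):

```python
import numpy as np

def correlation(f1, f2, x1, x2, k):
    """Correlation c(x1, x2) between two feature-map patches.

    f1, f2: (H, W, C) feature maps; x1, x2: (row, col) centre points;
    k: half-size of the region to be compared, so offsets o range over
    [-k, k] in each direction. Each term is the dot product (pixelwise
    multiplication, summed) of the feature vectors at x1 + o and x2 + o.
    """
    c = 0.0
    for dy in range(-k, k + 1):
        for dx in range(-k, k + 1):
            p1 = f1[x1[0] + dy, x1[1] + dx]   # feature vector at x1 + o
            p2 = f2[x2[0] + dy, x2[1] + dx]   # feature vector at x2 + o
            c += float(np.dot(p1, p2))        # <.,.> summed over the patch
    return c

# all-ones feature maps with C = 2 channels: each of the 9 offsets in a
# 3x3 patch contributes a dot product of 2, so the correlation is 18
f = np.ones((5, 5, 2))
score = correlation(f, f, (2, 2), (2, 2), 1)   # -> 18.0
```

In the network this score is evaluated over many candidate displacements of x2 relative to x1, so higher values flag the best-matching patch positions.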
In a further embodiment, the step of carrying out position transformation on the local microscopic image I2 according to the coarse optical flow feature map F1 to obtain a first transformed image W1 comprises:
obtaining the offset value of each pixel from the coarse optical flow feature map F1 (the value at each position in the optical flow map is a two-dimensional vector representing the instantaneous velocity of pixel motion), and offsetting each corresponding pixel in the local microscopic image I2 according to the offset value to obtain the first transformed image W1.
In this embodiment, the step of carrying out position transformation on the local microscopic image I2 according to the fine optical flow feature map F2 to obtain a second transformed image W2 comprises:
obtaining the offset value of each pixel from the fine optical flow feature map F2, and offsetting each corresponding pixel in the local microscopic image I2 according to the offset value to obtain the second transformed image W2.
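The position transformation described above — offsetting every pixel of I2 by its flow vector — might be sketched as follows (a simplified nearest-neighbour version with border clamping; real warping implementations typically use bilinear sampling):

```python
import numpy as np

def warp_by_flow(image, flow):
    """Shift each pixel of `image` by the per-pixel offset stored in `flow`.

    image: (H, W) or (H, W, C) array; flow: (H, W, 2) array whose last axis
    holds the (dy, dx) offset for each pixel. Each output pixel gathers the
    value at its offset source location, rounded to the nearest pixel and
    clamped at the image borders.
    """
    h, w = image.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w]
    src_y = np.clip(np.round(ys + flow[..., 0]).astype(int), 0, h - 1)
    src_x = np.clip(np.round(xs + flow[..., 1]).astype(int), 0, w - 1)
    return image[src_y, src_x]

# a uniform flow of (0, 1) shifts the image one pixel to the left
img = np.arange(16).reshape(4, 4)
flow = np.zeros((4, 4, 2))
flow[..., 1] = 1
warped = warp_by_flow(img, flow)
```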
In another embodiment, for step 103, the spatial transformation network includes a local network (localization network), a grid generator (Grid generator) and a sampler (Sampler), and the workflow of the spatial transformation network is shown in Fig. 3.
The spatial transformation network first generates a transformation matrix with 9 parameters through a simple regression network for transforming the original image; each point on the target image is then mapped to the original image according to the transformation matrix, and finally the sampler samples pixel values from the original image into the target image. The spatial transformation network requires no key-point calibration and can adaptively perform spatial transformation and alignment on the data (including translation, scaling, rotation and other geometric transformations).
Converting the optical flow characteristic diagram by using a pre-trained space transformation network to obtain an initial matrix, performing matrix multiplication operation on the initial matrix and a transformation matrix obtained at the previous position to obtain a transformation matrix at the current position, and aligning two local microscopic images according to the transformation matrix at the current position to obtain a first aligned microscopic image, wherein the method comprises the following steps of:
301: carrying out multilayer convolution operation on the optical flow characteristic diagram by utilizing a pre-trained local network, carrying out characteristic full connection, regressing and outputting an initial matrix, and carrying out matrix multiplication operation on the initial matrix and a transformation matrix obtained at the previous position to obtain a transformation matrix with 9 parameters;
the local network is essentially a simple regression network.
The 9 parameters include translation, scaling, rotation, etc. The parameters of the resulting transformation matrix vary with the input optical flow feature map.
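Treating the 9 parameters as a 3×3 homogeneous matrix, chaining the initial matrix with the transform obtained at the previous position is a single matrix multiplication. A sketch under that assumption (the helper `make_transform` is our own illustrative construction, not the patent's regression network):

```python
import numpy as np

def compose_transform(initial, previous):
    """Transform at the current position = initial matrix (from the local
    network) multiplied with the transform obtained at the previous position.
    Both are 3x3 homogeneous matrices (9 parameters)."""
    return initial @ previous

def make_transform(tx=0.0, ty=0.0, s=1.0, theta=0.0):
    """Build a 3x3 matrix combining translation (tx, ty), uniform scaling s
    and rotation theta -- the kinds of parameters the text mentions."""
    c, si = np.cos(theta), np.sin(theta)
    return np.array([[s * c, -s * si, tx],
                     [s * si,  s * c, ty],
                     [0.0,     0.0,   1.0]])

# two unit translations along x compose into a translation by (2, 0)
T = compose_transform(make_transform(tx=1.0), make_transform(tx=1.0))
```

Composing matrices this way accumulates the alignment across the acquisition path, so each new tile is placed in the panorama's global coordinate frame.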
302: fixing the local microscopic image I1 acquired at the previous position, recording the local microscopic image I2 acquired at the current position as the original image, setting an empty microscopic image as the microscopic image to be aligned with I1 and marking it as the target image, and calculating, with a pre-trained grid generator, the coordinate position in the original image of each coordinate position in the target image according to the transformation matrix, to obtain the mapping relation T(G) of the overlap region between the target image and the original image;
303: and searching corresponding coordinate positions of all coordinate positions of the target image in the original image by using a pre-trained sampler according to the mapping relation T (G), and correspondingly copying pixels in the original image into the target image by adopting a bilinear interpolation mode to obtain a first alignment microscopic image.
Bilinear interpolation, also known as twice linear interpolation, is mathematically the extension of linear interpolation to an interpolation function of two variables; its core idea is to perform linear interpolation in each of the two directions in turn.
The reason for using bilinear interpolation is that the coordinates mapped from the target image onto the original image may be fractional, so the pixel value there must be obtained by interpolating between the surrounding integer-coordinate pixels of the original image.
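A minimal sketch of the bilinear interpolation used by the sampler (illustrative only; border handling here simply clamps to the last row/column):

```python
import numpy as np

def bilinear_sample(image, y, x):
    """Pixel value at fractional coordinates (y, x), obtained by linear
    interpolation in each of the two directions between the four
    surrounding integer-coordinate pixels."""
    h, w = image.shape[:2]
    y0, x0 = int(np.floor(y)), int(np.floor(x))
    y1, x1 = min(y0 + 1, h - 1), min(x0 + 1, w - 1)
    wy, wx = y - y0, x - x0
    top = (1 - wx) * image[y0, x0] + wx * image[y0, x1]   # interpolate along x (top row)
    bot = (1 - wx) * image[y1, x0] + wx * image[y1, x1]   # interpolate along x (bottom row)
    return (1 - wy) * top + wy * bot                      # then interpolate along y

img = np.array([[0.0, 10.0],
                [20.0, 30.0]])
v = bilinear_sample(img, 0.5, 0.5)   # centre of the four pixels -> 15.0
```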
In a further embodiment, for step 104, in order to eliminate the fused seam after stitching and generate a seamless stitched image, the overlap region of the first aligned microscopic image is linearly fused in a gradually-in and gradually-out manner. In the step of obtaining the first fused microscopic image, the formula of the linear fusion is:
I(i,j) = αI1(i,j) + (1-α)I2(i,j),  0 ≤ α ≤ 1   (2)

in the formula, I is the fused pixel value; I1 and I2 are respectively the original pixel values of the corresponding overlap regions of the two local microscopic images; (i, j) are the pixel coordinates; α is the weight coefficient,

α = dis(I1(i,j), edge1) / w

where w denotes the width of the overlap region; edge1 denotes the inner edge of image I1 adjoining the overlap region; and dis denotes the distance from pixel position I1(i,j) to the inner edge edge1.
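A sketch of the gradually-in, gradually-out fusion of formula (2), assuming the overlap strip is indexed so that column 0 lies nearest image I1 and α falls linearly from 1 to 0 across the overlap width (the exact normalisation by w - 1 is our assumption):

```python
import numpy as np

def blend_overlap(i1_overlap, i2_overlap):
    """Linear fusion of the overlap region of two aligned images.

    i1_overlap, i2_overlap: (H, W) strips of the two images over their
    common overlap, with column 0 nearest I1. The weight alpha for I1
    decreases linearly across the overlap, so I1 fades out while I2
    fades in, hiding the seam."""
    w = i1_overlap.shape[1]
    dis = np.arange(w, dtype=float)      # distance across the overlap
    alpha = 1.0 - dis / (w - 1)          # weight for I1: 1 at its side, 0 at the far side
    return alpha * i1_overlap + (1.0 - alpha) * i2_overlap

a = np.full((2, 5), 100.0)   # constant strip from I1
b = np.full((2, 5), 0.0)     # constant strip from I2
fused = blend_overlap(a, b)  # each row ramps 100, 75, 50, 25, 0
```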
In an embodiment, after step 104, an automatic color gradation process is further performed on the panoramic stitched microscopic image to remove the influence of background light and white balance, and the black background and impurities are removed with a soft-edged airbrush tool.
The automatic color gradation process automatically maps the brightest and darkest pixels in each channel to white and black respectively, and then redistributes the pixel values in between proportionally, so that the image contrast is enhanced and the tonal levels become distinct. An edge-smoothing operation can make the background and edges appear more natural.
The invention also provides a rapid panoramic stitching system of the microscopic image, which comprises:
the image acquisition module is used for controlling the microscope objective to acquire a local microscopic image of the pathological section according to a pre-planned acquisition path;
the optical flow estimation module is used for carrying out optical flow estimation on the local microscopic image acquired at the current position and the local microscopic image acquired at the previous position by utilizing a pre-trained end-to-end network to obtain an optical flow characteristic diagram;
the image alignment module is used for converting the optical flow characteristic diagram by utilizing a pre-trained space transformation network to obtain an initial matrix, carrying out matrix multiplication operation on the initial matrix and a transformation matrix obtained at the previous position to obtain a transformation matrix at the current position, and aligning two local microscopic images according to the transformation matrix at the current position to obtain a first aligned microscopic image;
the fusion module is used for performing linear fusion on the overlapping area of the first alignment microscopic image in a gradually-in and gradually-out linear fusion mode to obtain a first fusion microscopic image;
the circulation module is used for controlling the microscope objective or the microscope objective stage to move to the next position and acquiring a local microscopic image according to a pre-planned acquisition path, then carrying out optical flow estimation and alignment on the local microscopic image acquired at the current position and the local microscopic image acquired at the previous position to obtain a second aligned microscopic image, and carrying out linear fusion on the second aligned microscopic image and the first fused microscopic image to obtain a second fused microscopic image; the processes of optical flow estimation, alignment and linear fusion are repeated until all the local microscopic images are fused to obtain a panoramic fusion microscopic image;
and the rendering module is used for performing panoramic rendering on the panoramic fusion microscopic image by utilizing bilateral filtering to obtain a panoramic splicing microscopic image.
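For reference, the bilateral filtering used by the rendering module smooths noise while preserving edges by weighting each neighbour with both a spatial Gaussian and an intensity (range) Gaussian. A brute-force single-channel sketch (parameter values are illustrative; production code would use an optimized library routine):

```python
import numpy as np

def bilateral_filter(image, radius=2, sigma_s=2.0, sigma_r=25.0):
    """Brute-force bilateral filter on a single-channel image: each output
    pixel is a weighted average of its neighbours, with weights that fall
    off with both spatial distance and intensity difference, so edges are
    preserved while flat regions are smoothed."""
    h, w = image.shape
    img = image.astype(float)
    pad = np.pad(img, radius, mode='edge')
    out = np.zeros_like(img)
    ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    spatial = np.exp(-(ys**2 + xs**2) / (2 * sigma_s**2))   # spatial kernel
    for y in range(h):
        for x in range(w):
            patch = pad[y:y + 2 * radius + 1, x:x + 2 * radius + 1]
            rng = np.exp(-(patch - img[y, x])**2 / (2 * sigma_r**2))  # range kernel
            wgt = spatial * rng
            out[y, x] = (wgt * patch).sum() / wgt.sum()
    return out

# a constant image is unchanged: all range weights are 1, so the weighted
# average of a constant neighbourhood is the constant itself
flat = np.full((6, 6), 80.0)
smoothed = bilateral_filter(flat)
```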
The invention also relates to a computer device comprising a memory and a processor, wherein the memory stores a computer program, and the processor implements the steps of the method when executing the computer program.
The above description is only a preferred embodiment of the present invention, and is not intended to limit the scope of the present invention, and all modifications and equivalents of the present invention, which are made by the contents of the present specification and the accompanying drawings, or directly/indirectly applied to other related technical fields, are included in the scope of the present invention.

Claims (10)

1. A quick panoramic stitching method for microscopic images is characterized by comprising the following steps:
controlling a microscope objective to acquire a local microscopic image of the pathological section according to a pre-planned acquisition path;
performing optical flow estimation on a local microscopic image acquired at the current position and a local microscopic image acquired at the previous position by using a pre-trained end-to-end network to obtain an optical flow characteristic diagram;
converting the optical flow characteristic diagram by using a pre-trained space transformation network to obtain an initial matrix, performing matrix multiplication operation on the initial matrix and a transformation matrix obtained at the previous position to obtain a transformation matrix at the current position, and aligning two local microscopic images according to the transformation matrix at the current position to obtain a first aligned microscopic image;
performing linear fusion on the overlapping area of the first alignment microscopic image by adopting a gradually-in and gradually-out linear fusion mode to obtain a first fusion microscopic image;
controlling a microscope objective or a microscope objective stage to move to the next position according to a pre-planned acquisition path and acquiring a local microscopic image, then carrying out optical flow estimation and alignment on the local microscopic image acquired at the current position and the local microscopic image acquired at the previous position to obtain a second aligned microscopic image, and carrying out linear fusion on the second aligned microscopic image and the first fused microscopic image to obtain a second fused microscopic image;
the processes of optical flow estimation, alignment and linear fusion are repeated until all the local microscopic images are fused to obtain a panoramic fusion microscopic image;
and performing panoramic rendering on the panoramic fusion microscopic image by utilizing bilateral filtering to obtain a panoramic splicing microscopic image.
2. The method for fast panorama stitching of microscopic images according to claim 1, wherein the step of obtaining the optical flow feature map by performing optical flow estimation on the local microscopic image acquired at the current position and the local microscopic image acquired at the previous position by using a pre-trained end-to-end network comprises:
carrying out coarse optical flow estimation on the local microscopic image I2 acquired at the current position and the local microscopic image I1 acquired at the previous position by utilizing a pre-trained FlowNet1 network to obtain a coarse optical flow feature map F1; carrying out position transformation on the local microscopic image I2 according to the coarse optical flow feature map F1 to obtain a first transformed image W1; calculating the brightness difference between the local microscopic image I1 and the first transformed image W1 to obtain a first brightness error BE1;
performing fine optical flow estimation on the local microscopic image I1, the local microscopic image I2, the coarse optical flow feature map F1, the first transformed image W1 and the first brightness error BE1 by utilizing a pre-trained FlowNet2 network to obtain a fine optical flow feature map F2; carrying out position transformation on the local microscopic image I2 according to the fine optical flow feature map F2 to obtain a second transformed image W2; calculating the brightness difference between the local microscopic image I1 and the second transformed image W2 to obtain a second brightness error BE2;
performing fine optical flow estimation on the local microscopic image I1, the local microscopic image I2, the fine optical flow feature map F2, the second transformed image W2 and the second brightness error BE2 by utilizing the pre-trained FlowNet2 network to obtain the optical flow feature map F.
3. The method for rapid panoramic stitching of microscopic images according to claim 2, wherein the FlowNet1 network comprises, in sequence, 9 convolutional layers and 1 refinement module;
the 9 convolutional layers are used for carrying out high-level feature extraction on the local micro-images stacked in advance to obtain a feature map;
the refinement module is used for performing an upsampling operation by using deconvolution layers to obtain the coarse optical flow feature map.
4. The method for rapid panoramic stitching of microscopic images according to claim 2, wherein the FlowNet2 network comprises, in sequence, 9 convolutional layers, 1 correlation layer and 1 refinement module;
the first 3 convolutional layers are used for carrying out feature extraction on each input local microscopic image to obtain a first feature map;
the correlation layer is used for performing correlation operation on the feature images of different local microscopic images so as to combine the features of the different local microscopic images to obtain a feature fusion image;
the last 6 convolutional layers are used for carrying out high-level feature extraction on the feature fusion graph to obtain a second feature graph;
the refinement module is used for performing an upsampling operation by using deconvolution layers to obtain the fine optical flow feature map.
5. The method for rapid panoramic stitching of microscopic images according to claim 4, wherein the formula of the correlation operation is:
c(x1, x2) = Σ_{o ∈ [-k,k]×[-k,k]} ⟨f1(x1 + o), f2(x2 + o)⟩

in the formula, c(x1, x2) is the correlation value of the two feature maps; x1 and x2 are respectively the corresponding points on the two feature maps; f1 and f2 are the two feature maps; k is the boundary value of the region to be compared; o is the position of any point in the region to be compared; ⟨·,·⟩ is the correlation operation.
6. The method for rapid panoramic stitching of microscopic images according to claim 2, wherein the step of carrying out position transformation on the local microscopic image I2 according to the coarse optical flow feature map F1 to obtain a first transformed image W1 comprises:
obtaining an offset value of each pixel from the coarse optical flow feature map F1, and offsetting each corresponding pixel in the local microscopic image I2 according to the offset value to obtain the first transformed image W1.
7. The method for rapid panorama stitching of microscopic images according to claim 1, wherein the spatial transformation network comprises a local network, a grid generator and a sampler;
converting the optical flow characteristic diagram by using a pre-trained space transformation network to obtain an initial matrix, performing matrix multiplication operation on the initial matrix and a transformation matrix obtained at the previous position to obtain a transformation matrix at the current position, and aligning two local microscopic images according to the transformation matrix at the current position to obtain a first aligned microscopic image, wherein the method comprises the following steps of:
carrying out multilayer convolution operation on the optical flow characteristic diagram by utilizing a pre-trained local network, carrying out characteristic full connection, regressing and outputting an initial matrix, and carrying out matrix multiplication operation on the initial matrix and a transformation matrix obtained at the previous position to obtain a transformation matrix with 9 parameters;
fixing a local microscopic image I1 acquired at the previous position, recording a local microscopic image I2 acquired at the current position as an original image, setting an empty microscopic image as a microscopic image aligned with I1 and marking as a target image, and calculating the coordinate position of each coordinate position in the target image in the original image according to a transformation matrix by using a pre-trained grid generator to obtain the mapping relation of the overlapping area of the target image and the original image;
and searching corresponding coordinate positions of all coordinate positions of the target image in the original image according to the mapping relation by using the pre-trained sampler, and correspondingly copying pixels in the original image to the target image by adopting a bilinear interpolation mode to obtain a first alignment microscopic image.
8. The method for fast panorama stitching of microscopic images according to claim 1, wherein the step of obtaining the first fused microscopic image by linearly fusing the overlapping area of the first aligned microscopic image by a gradually-in and gradually-out linear fusion method is characterized in that the formula of the linear fusion is as follows:
I(i,j) = αI1(i,j) + (1-α)I2(i,j),  0 ≤ α ≤ 1   (2)

in the formula, I is the fused pixel value; I1 and I2 are respectively the original pixel values of the corresponding overlap regions of the two local microscopic images; (i, j) are the pixel coordinates; and α is the weight coefficient.
9. A rapid panoramic stitching system for microscopic images is characterized by comprising:
the image acquisition module is used for controlling the microscope objective to acquire a local microscopic image of the pathological section according to a pre-planned acquisition path;
the optical flow estimation module is used for carrying out optical flow estimation on the local microscopic image acquired at the current position and the local microscopic image acquired at the previous position by utilizing a pre-trained end-to-end network to obtain an optical flow characteristic diagram;
the image alignment module is used for converting the optical flow characteristic diagram by utilizing a pre-trained space transformation network to obtain an initial matrix, carrying out matrix multiplication operation on the initial matrix and a transformation matrix obtained at the previous position to obtain a transformation matrix at the current position, and aligning two local microscopic images according to the transformation matrix at the current position to obtain a first aligned microscopic image;
the fusion module is used for performing linear fusion on the overlapping area of the first alignment microscopic image in a gradually-in and gradually-out linear fusion mode to obtain a first fusion microscopic image;
the circulation module is used for controlling the microscope objective or the microscope objective stage to move to the next position and acquiring a local microscopic image according to a pre-planned acquisition path, then carrying out optical flow estimation and alignment on the local microscopic image acquired at the current position and the local microscopic image acquired at the previous position to obtain a second aligned microscopic image, and carrying out linear fusion on the second aligned microscopic image and the first fused microscopic image to obtain a second fused microscopic image; the processes of optical flow estimation, alignment and linear fusion are repeated until all the local microscopic images are fused to obtain a panoramic fusion microscopic image;
and the rendering module is used for performing panoramic rendering on the panoramic fusion microscopic image by utilizing bilateral filtering to obtain a panoramic splicing microscopic image.
10. A computer device comprising a memory and a processor, the memory storing a computer program, wherein the processor when executing the computer program implements the steps of the method of any of claims 1-8.
CN202010439003.0A 2020-05-22 2020-05-22 Quick panoramic stitching method and system for microscopic images Active CN111626936B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010439003.0A CN111626936B (en) 2020-05-22 2020-05-22 Quick panoramic stitching method and system for microscopic images

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010439003.0A CN111626936B (en) 2020-05-22 2020-05-22 Quick panoramic stitching method and system for microscopic images

Publications (2)

Publication Number Publication Date
CN111626936A true CN111626936A (en) 2020-09-04
CN111626936B CN111626936B (en) 2023-05-12

Family

ID=72272566

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010439003.0A Active CN111626936B (en) 2020-05-22 2020-05-22 Quick panoramic stitching method and system for microscopic images

Country Status (1)

Country Link
CN (1) CN111626936B (en)

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111815690A (en) * 2020-09-11 2020-10-23 湖南国科智瞳科技有限公司 Method, system and computer equipment for real-time splicing of microscopic images
CN112750078A (en) * 2020-12-28 2021-05-04 广州市明美光电技术有限公司 Microscopic image real-time splicing method and storage medium based on electric platform
CN113537238A (en) * 2021-07-05 2021-10-22 上海闪马智能科技有限公司 Information processing method and image recognition device
CN114187334A (en) * 2021-10-12 2022-03-15 武汉兰丁云医学检验实验室有限公司 Adjacent slice image superposition and alignment method based on HE staining, Ki67 and P16 combination
WO2022133683A1 (en) * 2020-12-21 2022-06-30 京东方科技集团股份有限公司 Mixed reality display method, mixed reality device, and storage medium
WO2022213734A1 (en) * 2021-04-06 2022-10-13 北京车和家信息技术有限公司 Method and apparatus for fusing traffic markings, and storage medium and electronic device
CN116309036A (en) * 2022-10-27 2023-06-23 杭州图谱光电科技有限公司 Microscopic image real-time stitching method based on template matching and optical flow method
CN116343206A (en) * 2023-05-29 2023-06-27 山东科技大学 Automatic splicing identification method for marine plankton analysis microscope images
CN116978005A (en) * 2023-09-22 2023-10-31 南京凯视迈科技有限公司 Microscope image processing system based on attitude transformation

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090041314A1 (en) * 2007-08-02 2009-02-12 Tom Vercauteren Robust mosaicing method. notably with correction of motion distortions and tissue deformations for a vivo fibered microscopy
WO2009055913A1 (en) * 2007-10-30 2009-05-07 Cedara Software Corp. System and method for image stitching
WO2010105015A2 (en) * 2009-03-11 2010-09-16 The University Of North Carolina At Chapel Hill Methods, systems, and computer readable media for microscopy tracking
US20120275658A1 (en) * 2011-02-28 2012-11-01 Hurley Neil F Petrographic image analysis for determining capillary pressure in porous media
CN109191380A (en) * 2018-09-10 2019-01-11 广州鸿琪光学仪器科技有限公司 Joining method, device, computer equipment and the storage medium of micro-image
CN111007661A (en) * 2019-12-02 2020-04-14 湖南国科智瞳科技有限公司 Microscopic image automatic focusing method and device based on deep learning

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090041314A1 (en) * 2007-08-02 2009-02-12 Tom Vercauteren Robust mosaicing method. notably with correction of motion distortions and tissue deformations for a vivo fibered microscopy
WO2009055913A1 (en) * 2007-10-30 2009-05-07 Cedara Software Corp. System and method for image stitching
WO2010105015A2 (en) * 2009-03-11 2010-09-16 The University Of North Carolina At Chapel Hill Methods, systems, and computer readable media for microscopy tracking
US20120275658A1 (en) * 2011-02-28 2012-11-01 Hurley Neil F Petrographic image analysis for determining capillary pressure in porous media
CN109191380A (en) * 2018-09-10 2019-01-11 广州鸿琪光学仪器科技有限公司 Joining method, device, computer equipment and the storage medium of micro-image
CN111007661A (en) * 2019-12-02 2020-04-14 湖南国科智瞳科技有限公司 Microscopic image automatic focusing method and device based on deep learning

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
谢安宁; 张宏伟; 赵志刚; 孟智勇; 王增国: "Infrared panoramic image stitching method based on a three-dimensional rotation model" *
霍春宝; 童帅; 赵立辉; 崔汉峰: "Microscopic panorama stitching based on SIFT feature matching" *

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111815690B (en) * 2020-09-11 2020-12-08 湖南国科智瞳科技有限公司 Method, system and computer equipment for real-time splicing of microscopic images
CN111815690A (en) * 2020-09-11 2020-10-23 湖南国科智瞳科技有限公司 Method, system and computer equipment for real-time splicing of microscopic images
WO2022133683A1 (en) * 2020-12-21 2022-06-30 京东方科技集团股份有限公司 Mixed reality display method, mixed reality device, and storage medium
CN112750078A (en) * 2020-12-28 2021-05-04 广州市明美光电技术有限公司 Microscopic image real-time splicing method and storage medium based on electric platform
WO2022213734A1 (en) * 2021-04-06 2022-10-13 北京车和家信息技术有限公司 Method and apparatus for fusing traffic markings, and storage medium and electronic device
CN113537238B (en) * 2021-07-05 2022-08-05 上海闪马智能科技有限公司 Information processing method and image recognition device
CN113537238A (en) * 2021-07-05 2021-10-22 上海闪马智能科技有限公司 Information processing method and image recognition device
CN114187334A (en) * 2021-10-12 2022-03-15 武汉兰丁云医学检验实验室有限公司 Adjacent slice image superposition and alignment method based on HE staining, Ki67 and P16 combination
CN116309036A (en) * 2022-10-27 2023-06-23 杭州图谱光电科技有限公司 Microscopic image real-time stitching method based on template matching and optical flow method
CN116309036B (en) * 2022-10-27 2023-12-29 杭州图谱光电科技有限公司 Microscopic image real-time stitching method based on template matching and optical flow method
CN116343206A (en) * 2023-05-29 2023-06-27 山东科技大学 Automatic splicing identification method for marine plankton analysis microscope images
CN116343206B (en) * 2023-05-29 2023-08-08 山东科技大学 Automatic splicing identification method for marine plankton analysis microscope images
CN116978005A (en) * 2023-09-22 2023-10-31 南京凯视迈科技有限公司 Microscope image processing system based on attitude transformation
CN116978005B (en) * 2023-09-22 2023-12-19 南京凯视迈科技有限公司 Microscope image processing system based on attitude transformation

Also Published As

Publication number Publication date
CN111626936B (en) 2023-05-12

Similar Documents

Publication Publication Date Title
CN111626936B (en) Quick panoramic stitching method and system for microscopic images
CN107369148B (en) Based on the multi-focus image fusing method for improving SML and Steerable filter
Mann et al. Virtual bellows: Constructing high quality stills from video
CN109584156A (en) Micro- sequence image splicing method and device
US20150170405A1 (en) High resolution free-view interpolation of planar structure
US20090209833A1 (en) System and method for automatic detection of anomalies in images
CN111861880A (en) Image super-fusion method based on regional information enhancement and block self-attention
CN111553841B (en) Real-time video splicing method based on optimal suture line updating
Fanous et al. GANscan: continuous scanning microscopy using deep learning deblurring
CN112116543A (en) Image restoration method, system and device based on detection type generation framework
JP6479178B2 (en) Image processing apparatus, imaging apparatus, microscope system, image processing method, and image processing program
EP3709258B1 (en) Generating composite image from multiple images captured for subject
US20230237617A1 (en) Microscope-based super-resolution
CN111738964A (en) Image data enhancement method based on modeling
CN117173012A (en) Unsupervised multi-view image generation method, device, equipment and storage medium
Rong et al. Mosaicing of microscope images based on SURF
CN114881907B (en) Optical microscopic image multi-depth-of-field focus synthesis method and system and image processing method
CN115060367B (en) Whole-slide data cube acquisition method based on microscopic hyperspectral imaging platform
CN112203023B (en) Billion pixel video generation method and device, equipment and medium
Kostrzewa et al. B4MultiSR: a benchmark for multiple-image super-resolution reconstruction
Gherardi et al. Real-time whole slide mosaicing for non-automated microscopes in histopathology analysis
Qian et al. Extending depth of field and dynamic range from differently focused and exposed images
Chen et al. Color image-guided very low-resolution depth image reconstruction
Krishna et al. GloFlow: Whole slide image stitching from video using optical flow and global image alignment
Wang et al. Defocus deblur microscopy via head-to-tail cross-scale fusion

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant