CN116309036B - Microscopic image real-time stitching method based on template matching and optical flow method - Google Patents

Microscopic image real-time stitching method based on template matching and optical flow method

Info

Publication number
CN116309036B
CN116309036B
Authority
CN
China
Prior art keywords
image
current
initial reference
reference image
optical flow
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202211325386.4A
Other languages
Chinese (zh)
Other versions
CN116309036A (en)
Inventor
周海洋
陈庆
余飞鸿
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hangzhou Touptek Photoelectric Technology Co ltd
Original Assignee
Hangzhou Touptek Photoelectric Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hangzhou Touptek Photoelectric Technology Co ltd filed Critical Hangzhou Touptek Photoelectric Technology Co ltd
Priority to CN202211325386.4A
Publication of CN116309036A
Application granted
Publication of CN116309036B
Legal status: Active
Anticipated expiration

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00Geometric image transformations in the plane of the image
    • G06T3/40Scaling of whole images or parts thereof, e.g. expanding or contracting
    • G06T3/4038Image mosaicing, e.g. composing plane images from plane sub-images
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00Geometric image transformations in the plane of the image
    • G06T3/14Transformations for image registration, e.g. adjusting or mapping for alignment of images
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/50Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/20Analysis of motion
    • G06T7/269Analysis of motion using gradient-based methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74Image or video pattern matching; Proximity measures in feature spaces
    • G06V10/75Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
    • G06V10/751Comparing pixel values or logical combinations thereof, or feature values having positional relevance, e.g. template matching
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74Image or video pattern matching; Proximity measures in feature spaces
    • G06V10/761Proximity, similarity or dissimilarity measures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2200/00Indexing scheme for image data processing or generation, in general
    • G06T2200/32Indexing scheme for image data processing or generation, in general involving image mosaicing
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10004Still image; Photographic image
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20212Image combination
    • G06T2207/20221Image fusion; Image merging
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02PCLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/30Computing systems specially adapted for manufacturing

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Health & Medical Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Computing Systems (AREA)
  • Databases & Information Systems (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Image Processing (AREA)

Abstract

The invention discloses a real-time microscopic image stitching method based on template matching and an optical flow method, comprising the following steps: selecting an initial reference image and placing it at the center of the background image; slowly moving the stage by hand in the X or Y direction while the images captured by the camera are fed in continuously, and selecting a sharp image as the current input image; performing, in sequence, template-matching initial positioning, optical-flow fine matching, and deletion of mismatched points by neighborhood similarity and position constraints to obtain the offset of the current image relative to the initial reference image as the current matching result; evaluating the current matching result and deciding whether to perform the fusion operation; fusing the images with a fast optimal-seam method and a weighted-average method; and evaluating the current image quality and deciding whether to update the initial reference image. The real-time microscopic image stitching method effectively improves registration accuracy and completes image stitching efficiently while preserving real-time performance.

Description

Microscopic image real-time stitching method based on template matching and optical flow method
Technical Field
The invention relates to the technical field of image processing and computer vision, and in particular to a real-time microscopic image stitching method based on template matching and an optical flow method.
Background
The microscope is a highly precise optical instrument that magnifies tiny objects through an optical system to reveal their microstructure, allowing object characteristics to be recognized and studied at the microscopic level. In practice, the magnification and the field of view of a microscope are in constant tension: often only a small portion of the slice under observation can be captured at once, which greatly inconveniences microscopic observation and manipulation, especially in biological and medical microscopy. With advances in computer science and digital image processing, image stitching can be used to combine multiple images with overlapping regions into a single high-resolution image with a larger field of view.
Current image stitching systems mainly follow two processing schemes. The first acquires and stitches separately: an image sequence with overlapping regions is captured first, and an image registration algorithm then stitches the acquired sequence. This scheme suffers from a large amount of computation, poor real-time performance, and a poor user experience. In one unmanned aerial vehicle remote-sensing image stitching method based on stitching quality evaluation, the longitude, latitude, and altitude of the UAV are recorded as each remote-sensing image is acquired; a de-jitter deblurring algorithm removes motion blur from the remote-sensing images; stitching quality evaluation identifies the images in the sequence that do not meet the stitching requirements; the UAV re-acquires those images using the recorded longitude, latitude, and altitude; the newly acquired images replace or supplement the corresponding originals in the sequence; and image registration and stitching are performed again.
The second scheme controls the movement of the stage for real-time scanning and stitching. Chinese patent publication CN110907453A discloses a large-field polarized-microscope rock-slice image stitching method for oil and gas exploration, which can obtain a large-field polarized-microscope slice image. The method comprises the following steps: (1) fix the slice stitching acquisition device on the polarized-microscope stage; (2) fix a rock slice on the device's glass slide; (3) adjust the transverse and longitudinal screws of the device so that the initial acquisition field lies at the uppermost corner of the sample; (4) move the slide at equal intervals by rotating the transverse screw, acquiring a polarized-microscope image before each movement; (5) after a row of images has been acquired, move the slide longitudinally by rotating the longitudinal screw, then rotate the transverse screw in the direction opposite to that of step (4), moving the slide at equal intervals and acquiring a polarized-microscope image before each movement; (6) repeat step (5) to collect a matrix of polarized-microscope images of the slice sample; (7) stitch the acquired images to obtain the large-field polarized-microscope slice image. In such schemes the stage is moved in two main ways. The first uses a motorized scanning stage, combining hardware and software to control the stage motion and acquire whole-slide images; however, because of motor tolerances, the stitching accuracy is low, the cost is high, and the approach is difficult to popularize. The second is manual acquisition: the stage is moved by hand, images are stitched in real time as they are acquired, and the position of the current microscope field within the panorama can be previewed. With manual acquisition the operator can move the stage at any time to examine a region of interest, giving high autonomy, a strong sense of interaction, and low cost.
Disclosure of Invention
The invention aims to provide a real-time microscopic image stitching method based on template matching and an optical flow method, which effectively improves registration accuracy and completes image stitching efficiently while preserving real-time performance.
The invention provides the following technical scheme:
a microscopic image real-time splicing method based on template matching and an optical flow method comprises the following steps:
step one: place the slices to be stitched on the stage, align them with the light-transmitting hole, and focus until the image is sharp; then enable the stitching function, which automatically creates a background image on the interface. The camera feeds in images, the sharpness of the images in the image buffer queue is evaluated, and the first sharpest image is selected as the initial reference image and placed at the center of the background image;
step two: slowly move the stage by hand in the X or Y direction while the images captured by the camera are fed in continuously in a loop, and select a sharp image as the current input image (a sharpness-selection sketch is given after this overview);
step three: sequentially performing template matching initial positioning, optical flow precision matching, neighborhood similarity and position constraint deletion of mismatching points to obtain the offset of the current image relative to the initial reference image, and taking the offset as a current matching result;
step four: evaluating the current matching result, and judging whether to execute fusion operation or not: if the fusion operation can be executed, the fifth step is carried out, otherwise, the second step is returned;
step five: perform image fusion with a fast optimal-seam method and a weighted-average method;
step six: evaluating the current image quality and judging whether to update the initial reference image: if yes, updating the initial reference image, otherwise, reserving the initial reference image; and then, the second step to the sixth step are circulated until the scanning splicing is finished.
Different areas of the slice are observed by operating the microscope stage; the current images are registered and fused in real time, and both the current local field-of-view image and the stitched panorama can be viewed in real time, finally yielding a complete panorama of the microscopic slice. This addresses the problems of poor real-time performance and low registration accuracy in microscopic image registration.
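Steps one and two both come down to picking the sharpest frame from a short buffer of camera images. The patent does not name the sharpness metric it uses, so the sketch below assumes the common variance-of-Laplacian measure; the helper names are illustrative only, not part of the patented method.

```python
import cv2
import numpy as np

def sharpness(image):
    """Assumed sharpness metric: variance of the Laplacian (higher means sharper)."""
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY) if image.ndim == 3 else image
    return cv2.Laplacian(gray, cv2.CV_64F).var()

def pick_sharpest(frames):
    """Return the sharpest frame from a small buffer, e.g. the first few captured images."""
    return max(frames, key=sharpness)
```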
The third step is specifically as follows:
(3.1) obtain an initial position estimate through template matching, compute an offset from it as the coarse matching result, and compute the overlap region of the current image and the initial reference image from the offset (this can also be expressed as a fast coarse match by phase correlation, yielding a coarse offset relative to the reference image and an approximate overlap region of the two images);
(3.2) decide, according to the size of the overlap region, whether to proceed directly to step (3.3): if the overlap region is no larger than the set size threshold, use the overlap region itself as the region feature block and proceed directly to step (3.3); if it is larger than the set size threshold, compute the gradient of the overlap region and its integral image, and search the overlap region for a feature-rich region feature block to use in step (3.3);
(3.3) calculating the accurate offset of the regional feature blocks of the current image and the initial reference image by a modified optical flow method to obtain a fine matching result;
and (3.4) combine the coarse matching result from step (3.1) with the fine matching result from step (3.3) to obtain the precise offset of the current image relative to the initial reference image and, from it, the offset relative to the starting point; together these form the current matching result.
When the initial reference image is placed on the background image, the coordinate (a, b) of the upper-left corner on the canvas, expressed in the background-image coordinate system, is taken as the starting point. If the offset of the current image relative to the initial reference image is (c, d) and its offset relative to the starting point is (e, f), then e = a + c and f = b + d.
The step (3.1) is specifically as follows:
(3.1.1) selecting a template in the current image;
(3.1.2) traversing the initial reference image by the moving template, calculating a normalized cross-correlation coefficient, and solving the translation amount between the current image and the initial reference image as a rough matching result;
(3.1.3) calculating the overlapping area of the current image and the initial reference image according to the offset.
Wherein, the step (3.1.2) is specifically as follows:
The template is slid over the reference image and the normalized cross-correlation coefficient is computed for each candidate position:

R(x, y) = Σ_{x',y'} T(x', y')·I(x + x', y + y') / √( Σ_{x',y'} T(x', y')² · Σ_{x',y'} I(x + x', y + y')² )   (1)

In formula (1), T denotes the template image, I the input image, (x', y') coordinates within the template, and (x, y) the displacement of the template relative to the input image. The offset is taken at the position where the normalized cross-correlation coefficient is largest.
The overlap position follows from the offset. Let the original image size be (W_0, H_0) and the initial displacement obtained by template matching be (x_0, y_0). The overlap region in the initial reference image is the rectangle with top-left corner (max(0, x_0), max(0, y_0)) and size (W_0 − |x_0|) × (H_0 − |y_0|); the corresponding overlap region in the current image has top-left corner (max(0, −x_0), max(0, −y_0)) and the same size.
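As a concrete illustration of step (3.1), the sketch below uses OpenCV's normalized cross-correlation template matching to estimate the coarse offset (x_0, y_0) and then derives the overlap rectangles of two equally sized images. The choice of a central crop of the current image as the template, and the OpenCV matching mode, are assumptions for illustration rather than details fixed by the patent.

```python
import cv2
import numpy as np

def coarse_match(current, reference, template_frac=0.5):
    """Coarse offset (x0, y0) of `current` relative to `reference` via NCC template matching."""
    h, w = current.shape[:2]
    th, tw = int(h * template_frac), int(w * template_frac)
    ty, tx = (h - th) // 2, (w - tw) // 2
    template = current[ty:ty + th, tx:tx + tw]               # assumed: central crop as template
    result = cv2.matchTemplate(reference, template, cv2.TM_CCORR_NORMED)
    _, _, _, max_loc = cv2.minMaxLoc(result)                 # position of the best correlation
    return max_loc[0] - tx, max_loc[1] - ty                  # offset of current w.r.t. reference

def overlap_regions(size, offset):
    """Overlap rectangles (x, y, w, h) in the reference image and in the current image."""
    W0, H0 = size
    x0, y0 = offset
    ow, oh = W0 - abs(x0), H0 - abs(y0)
    ref_rect = (max(0, x0), max(0, y0), ow, oh)
    cur_rect = (max(0, -x0), max(0, -y0), ow, oh)
    return ref_rect, cur_rect
```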
Step (3.2) is specifically as follows: to further improve registration efficiency, if the overlap region of the current image and the reference image is still large, i.e. its size exceeds the set size threshold, the Laplacian of the overlap region is computed, giving a gradient matrix as the second derivative of the image; the integral image of the gradient matrix is then computed, and the region feature block of the threshold size with the richest detail in the overlap region is located for step (3.3). If the overlap region is no larger than the set threshold, the overlap region itself is used as the region feature block and step (3.3) is performed directly.
For example, if the overlap region is larger than 800×800, the region feature block of the set threshold size must be computed and searched for.
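Step (3.2) looks for the most detail-rich block inside a large overlap region by combining a Laplacian response with an integral image, so the sum over any block can be read off in constant time. A minimal sketch, assuming the 800×800 block size mentioned above and an arbitrary search stride; neither the stride nor the function name comes from the patent.

```python
import cv2
import numpy as np

def richest_block(overlap, block=800, stride=50):
    """Top-left corner of the block with the largest summed |Laplacian| inside `overlap`."""
    gray = cv2.cvtColor(overlap, cv2.COLOR_BGR2GRAY) if overlap.ndim == 3 else overlap
    grad = np.abs(cv2.Laplacian(gray, cv2.CV_64F))     # second-derivative (detail) response
    integral = cv2.integral(grad)                      # (H+1) x (W+1) summed-area table
    H, W = gray.shape
    best, best_xy = -1.0, (0, 0)
    for y in range(0, H - block + 1, stride):
        for x in range(0, W - block + 1, stride):
            s = (integral[y + block, x + block] - integral[y, x + block]
                 - integral[y + block, x] + integral[y, x])   # block sum in O(1)
            if s > best:
                best, best_xy = s, (x, y)
    return best_xy
```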
The step (3.3) is specifically as follows:
(3.3.1) extract feature points with a modified FAST algorithm: the 24 pixels at radius 3 around a candidate feature point P are used as the detection template, and a threshold T is set. With I_p the gray value of P, if the detection template contains 14 contiguous pixels whose gray values each differ from I_p by more than T, P is taken as a feature point;
(3.3.2) performing optical flow tracking by using an LK optical flow method, and minimizing a loss function to meet optical flow constraint conditions to obtain a matching point pair;
and (3.3.3) delete mismatched point pairs using neighborhood similarity and position constraints, then compute the displacement with a random sample consensus (RANSAC) algorithm to obtain the precise offset between the region feature blocks of the current image and the initial reference image as the fine matching result.
The modified FAST algorithm provided by the invention is fast and mitigates the effect of image jitter on the number of extracted feature points.
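The modified FAST test described in (3.3.1) checks, on a 24-pixel detection template of radius 3 around a candidate P, for a run of 14 contiguous pixels whose gray values differ from I_p by more than T. The patent does not list the exact template layout, so the sketch below samples 24 angles on a radius-3 circle (integer rounding means some neighbouring samples coincide); it only illustrates the contiguous-run test, not the full detector.

```python
import numpy as np

# Assumed template: 24 angular samples on a circle of radius 3 around the candidate pixel.
# Integer rounding makes some neighbouring samples coincide; the exact layout is illustrative.
OFFSETS = [(int(round(3 * np.cos(2 * np.pi * k / 24))),
            int(round(3 * np.sin(2 * np.pi * k / 24)))) for k in range(24)]

def is_feature_point(gray, x, y, T=20, run=14):
    """Modified FAST test (step 3.3.1); (x, y) must lie at least 3 pixels inside the image."""
    Ip = int(gray[y, x])
    diff = [abs(int(gray[y + dy, x + dx]) - Ip) > T for dx, dy in OFFSETS]
    diff = diff + diff                # wrap around so runs may cross the start of the circle
    longest = count = 0
    for d in diff:
        count = count + 1 if d else 0
        longest = max(longest, count)
    return longest >= run
```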
In step (3.3.2), optical flow tracking is performed with the LK optical flow method and a loss function is minimized subject to the optical flow constraint, yielding the matching point pairs. Specifically:
Assuming constant brightness, every brightness value E(x, y, t) in the image at time t has a corresponding point in the image at time t + Δt, i.e.:
E(x, y, t) = E(x + Δx, y + Δy, t + Δt)   (5)
Expanding the right side of (5) with the Taylor formula gives the two-dimensional optical flow constraint:
E_x·u + E_y·v + E_t = 0   (6)
where E_x, E_y and E_t are the partial derivatives of the brightness with respect to x, y and t, and (u, v) is the optical flow. The loss function is defined as:
ε = Σ (E_x·u_m + E_y·v_m + E_t)²   (7)
where u_m = x' − x and v_m = y' − y.
The smaller the loss function, the better the registration. Because the optical flow method is derived from the Taylor expansion and replaces u and v with x' − x and y' − y, the result becomes inaccurate when the image motion is large; in some cases the match can deviate considerably from the true value. The invention therefore performs a preliminary estimate before optical flow tracking, converting a large motion into a small one: the initial estimate is coarse, but the high precision of the optical flow then further refines the accuracy.
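A sketch of step (3.3.2) with OpenCV's pyramidal LK tracker: the coarse offset obtained earlier is used as the initial flow estimate, which is the "large motion converted into small motion" idea, leaving only a small residual for the optical flow to refine. Window size, pyramid level and termination criteria are assumed values, not taken from the patent.

```python
import cv2
import numpy as np

def lk_refine(img1, img2, points, coarse_offset):
    """Track `points` from grayscale img1 into img2 with LK optical flow, seeded with a coarse
    displacement guess (e.g. from template matching) so only a small residual motion remains."""
    p0 = np.float32(points).reshape(-1, 1, 2)
    p1_init = p0 + np.float32(coarse_offset).reshape(1, 1, 2)   # initial guess for every point
    p1, status, err = cv2.calcOpticalFlowPyrLK(
        img1, img2, p0, p1_init.copy(),
        flags=cv2.OPTFLOW_USE_INITIAL_FLOW,
        winSize=(21, 21), maxLevel=3,
        criteria=(cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 30, 0.01))
    good = status.ravel() == 1
    return p0[good].reshape(-1, 2), p1[good].reshape(-1, 2)     # matched point pairs
```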
Because the optical flow method often produces mismatches, a mismatched-point rejection method is introduced: a position constraint is added on top of the similarity measure. The position constraint is computed as follows:
Given the feature point pair set P = (P_1, P_2, ..., P_N), where P_i = ((x_i, y_i), (x_i', y_i')) and (x_i, y_i) and (x_i', y_i') are the paired points in the target image and the initial reference image respectively, take any pair P_A from the set and another pair P_B from the set, and compute the cosine D_θ of the angle between the vectors AB and A'B', and their length difference D_L:
D_θ = (AB · A'B') / (|AB| · |A'B'|)   (8)
D_L = | |AB| − |A'B'| |   (9)
In formulas (8) and (9), (x_A, y_A) and (x_A', y_A') denote a feature point in the target image and its matched point in the reference image, and (x_B, y_B) and (x_B', y_B') denote another, different point pair. The drawn pair is judged consistent if the cosine D_θ and the length difference D_L satisfy the similarity criterion:
C_i = 1 if (1 − D_θ) < T_θ and D_L < T_L, and C_i = 0 otherwise   (10)
In formula (10), the thresholds are T_θ = 0.10 and T_L = 3.0. Eight pairs are drawn at random and the values C_i are summed:
C = Σ_{i=1}^{8} C_i   (11)
In formula (11), C is the number of the 8 randomly drawn points whose positional relationship matches correctly; the larger C, the more reliable the registration of pair A. The threshold is set to 6: if C is greater than 6, pair A is regarded as a correct match; otherwise it is rejected.
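A sketch of the position-constraint check of formulas (8)–(11) as reconstructed above: for a candidate pair A, eight other pairs B are drawn at random, the angle cosine D_θ and the length difference D_L between the vectors A→B in the two images are tested against T_θ and T_L, and A is kept only when more than 6 of the 8 draws are consistent. The sampling mechanics and function name are assumptions for illustration.

```python
import numpy as np

def keep_pair(pairs, a, n_draws=8, t_theta=0.10, t_len=3.0, c_min=6):
    """Position-constraint test for one pair a = ((x, y), (x', y')) taken from `pairs`."""
    (xa, ya), (xa_, ya_) = a
    others = [p for p in pairs if p is not a]
    idx = np.random.choice(len(others), size=min(n_draws, len(others)), replace=False)
    c = 0
    for i in idx:
        (xb, yb), (xb_, yb_) = others[i]
        v1 = np.array([xb - xa, yb - ya], dtype=float)       # A -> B in the target image
        v2 = np.array([xb_ - xa_, yb_ - ya_], dtype=float)   # A' -> B' in the reference image
        n1, n2 = np.linalg.norm(v1), np.linalg.norm(v2)
        if n1 == 0 or n2 == 0:
            continue
        d_theta = float(np.dot(v1, v2) / (n1 * n2))          # cosine of the angle, formula (8)
        d_len = abs(n1 - n2)                                  # length difference, formula (9)
        if (1.0 - d_theta) < t_theta and d_len < t_len:       # similarity criterion, formula (10)
            c += 1                                            # formula (11): count consistent draws
    return c > c_min                                          # keep A only if C > 6
```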
The method for deleting the mismatching point pairs by utilizing the neighborhood similarity and the position constraint in the step (3.3.3) comprises the following steps:
(3.3.3.1) computing similarity of feature point neighbors;
(3.3.3.2) calculating the position information of the feature points and the adjacent points thereof, namely the distance and the angle;
and (3.3.3.3) eliminating the matching point pairs by combining the similarity and the position information of the feature point neighborhood.
After the mismatched points are removed, a random sample consensus (RANSAC) algorithm is used to compute the spatial transformation between the two images, unifying them into one coordinate system.
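After rejection of mismatches, the remaining matches are combined by random sample consensus to obtain the displacement. The sketch below assumes a pure-translation model, which suits a manually translated stage; an affine or homography RANSAC (e.g. OpenCV's estimateAffinePartial2D) would be a drop-in alternative.

```python
import numpy as np

def ransac_translation(src, dst, iters=100, inlier_tol=1.0):
    """Estimate the (dx, dy) displacement between matched point sets with a simple RANSAC loop."""
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    displacements = dst - src
    best_inliers = None
    for _ in range(iters):
        cand = displacements[np.random.randint(len(displacements))]   # one sample defines a model
        err = np.linalg.norm(displacements - cand, axis=1)
        inliers = err < inlier_tol
        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    return displacements[best_inliers].mean(axis=0)   # refined offset from the inlier set
```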
In step four, the current matching result is evaluated and whether to perform the fusion operation is decided as follows:
(4.1) compute the sharpness of the current image; if it is at least the set threshold multiple of the reference image's sharpness, the current image is fused; if it is below that multiple, the current image is considered too blurry and is not fused;
(4.2) check the displacement: if Δx ≤ 0.1×W and Δy ≤ 0.1×H, where Δx and Δy are the horizontal and vertical displacements and W and H are the image width and height, the current displacement is considered too small and no fusion is performed; otherwise fusion proceeds;
(4.3) check whether the paste position exceeds the current background image; if so, enlarge the background, then paste the image at the corresponding position in the background image and fuse it.
In step (4.1) the threshold multiple is set to 0.9: if the sharpness of the current image exceeds 0.9 times that of the reference image, the current image is fused; if it is below 0.9 times, the current image is considered too blurry and fusion is deferred.
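Step four reduces to two numeric checks before fusing (plus the canvas-expansion check in (4.3)): the current frame must be at least 0.9 times as sharp as the reference, and it must have moved by more than 0.1 of the width or height. A minimal sketch, reusing the hypothetical sharpness() helper from the earlier example.

```python
def should_fuse(current, reference, offset, ratio=0.9, min_shift=0.1):
    """Steps (4.1)-(4.2): decide whether the current frame is worth fusing."""
    dx, dy = offset
    h, w = current.shape[:2]
    if sharpness(current) < ratio * sharpness(reference):        # (4.1): too blurry, skip
        return False
    if abs(dx) <= min_shift * w and abs(dy) <= min_shift * h:    # (4.2): barely moved, skip
        return False
    return True
```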
In step five, image fusion with the fast optimal-seam method and the weighted-average method is performed as follows:
(5.1) transform the current image and the initial reference image into the same coordinate system; downsample the images to be fused by a factor of eight into a small-scale space, compute the optimal seam there to obtain the optimal-seam mask of the image, and then upsample the mask to obtain the stitching mask at the original image scale;
(5.2) apply a weighted-average fusion algorithm to further process the optimal seam at the original image scale and fuse the images.
The step (5.1) is specifically as follows:
(5.1.1) before calculating the optimal suture line, firstly performing eight-time downsampling on the picture to be fused, and transforming the image into a small-scale space;
(5.1.2) starting from the first row of the overlap-region image, each point of that row serves as the starting point of a path, which is traversed from top to bottom; each starting column generates one candidate seam line. Let the current traversal point be P_n(x, y); compute the difference intensity values of the 3 points adjacent to P_n(x, y) in the next row. According to the seam criterion, the difference intensity consists of a color difference and an image structure difference:
E(x, y) = E_color(x, y)² + E_geometry(x, y)   (12)
In formula (12), E_color(x, y) denotes the color difference intensity and E_geometry(x, y) the structural difference intensity. The structural difference intensity E_geometry(x, y) is computed from the image gradient:
E_geometry(x, y) = (S_x × (I_1(x, y) − I_2(x, y)))² + (S_y × (I_1(x, y) − I_2(x, y)))²   (13)
In formula (13), S_x and S_y are the Sobel operators:
S_x = [−1 0 1; −2 0 2; −1 0 1],  S_y = [−1 −2 −1; 0 0 0; 1 2 1]   (14)
The 3 difference intensity values are compared and the point with the smallest value is selected as the next point on the seam; the line connecting the two points is that segment of the seam path, and its intensity value is the intensity of that seam segment. This process is repeated down to the last row; the difference intensities of all segments of each seam are summed, the accumulated intensities of all paths are compared, and the path with the smallest total intensity is the optimal seam.
(5.1.3) after the optimal seam is obtained, the pixel values on its two sides are set to 0 and 1 respectively to obtain the optimal-seam mask of the image; the mask is then upsampled to obtain the stitching mask at the original image scale.
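A compact sketch of step (5.1): the two aligned overlap images are downsampled 8×, one candidate seam is traced per starting column by always stepping to the cheapest of the three neighbours in the next row, the cheapest complete path is kept, and the 0/1 mask is upsampled back to the original scale with nearest-neighbour interpolation. The energy follows formulas (12)–(14) as reconstructed above (squared color difference plus Sobel-based structure difference); the greedy per-column search mirrors the text rather than a dynamic-programming formulation.

```python
import cv2
import numpy as np

def seam_mask(overlap1, overlap2, scale=8):
    """Optimal-seam mask for two aligned overlap images (1 = take the overlap2 side)."""
    to_gray = lambda im: cv2.cvtColor(im, cv2.COLOR_BGR2GRAY) if im.ndim == 3 else im
    small1 = to_gray(cv2.resize(overlap1, None, fx=1 / scale, fy=1 / scale)).astype(np.float32)
    small2 = to_gray(cv2.resize(overlap2, None, fx=1 / scale, fy=1 / scale)).astype(np.float32)
    diff = small1 - small2
    energy = (diff ** 2
              + cv2.Sobel(diff, cv2.CV_32F, 1, 0) ** 2
              + cv2.Sobel(diff, cv2.CV_32F, 0, 1) ** 2)   # color term + structure term
    h, w = energy.shape
    best_path, best_cost = None, np.inf
    for start in range(w):                                # one candidate seam per starting column
        x, cost, path = start, float(energy[0, start]), [start]
        for y in range(1, h):
            cands = [c for c in (x - 1, x, x + 1) if 0 <= c < w]
            x = min(cands, key=lambda c: energy[y, c])    # cheapest of the 3 neighbours below
            cost += energy[y, x]
            path.append(x)
        if cost < best_cost:
            best_cost, best_path = cost, path
    mask_small = np.zeros((h, w), np.uint8)
    for y, x in enumerate(best_path):
        mask_small[y, x:] = 1                             # right of the seam comes from overlap2
    return cv2.resize(mask_small, (overlap1.shape[1], overlap1.shape[0]),
                      interpolation=cv2.INTER_NEAREST)
```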
In step (5.2), the weighted-average fusion algorithm further processes the stitching seam and fuses the images as follows:
Assume the current image to be fused and the initial reference image are arranged left and right. The overlap region around the optimal seam is divided into three parts: the first part A_1 lies to the left of the optimal seam line, the second part A_seam is a transition band of width d at the edge of the seam line, and the third part A_2 lies to the right of the optimal seam line.
Fusion then uses the gradual-in gradual-out (fade-in/fade-out) method, with the calculation formula:
f(x, y) = f_1(x, y) for (x, y) ∈ A_1;  f(x, y) = w_1·f_1(x, y) + w_2·f_2(x, y) for (x, y) ∈ A_seam;  f(x, y) = f_2(x, y) for (x, y) ∈ A_2   (15)
In the above, f_1(x, y) denotes the left image, f_2(x, y) the right image, and the weights w_i satisfy w_1 + w_2 = 1, with
w_1 = (x_seam + d − x_i) / (2d),  w_2 = (x_i − x_seam + d) / (2d)
where x_i is the position of the current point, x_seam the position of the seam line, and d the preset width of the seam-line edge.
In the region (0, x_seam − d), more than d to the left of the seam, the left image is taken directly; in the region (x_seam + d, x_max), more than d to the right of the seam, the right image is taken directly; in the transition region (x_seam − d, x_seam + d) on both sides of the seam, fade-in/fade-out fusion is applied. From x_seam − d to x_seam + d, w_1 decreases gradually from 1 to 0 while w_2 increases from 0 to 1, so the contribution of f_1(x, y) decreases and that of f_2(x, y) increases until, on leaving the seam region, the result transitions smoothly into image f_2(x, y).
In step six, the current image quality is evaluated and whether to update the reference image is decided as follows:
(6.1) the current image is adopted as the new initial reference image only if its sharpness is greater than 0.75 times the sharpness of the initial reference image;
(6.2) the current image is adopted as the new initial reference image only if its offset from the initial reference image is greater than 1/10 of the image size.
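Step six's update rule as described: the current image replaces the initial reference only when it is both sharp enough (more than 0.75 times the reference sharpness) and far enough away (offset larger than 1/10 of the image size). A minimal sketch reusing the hypothetical sharpness() helper; treating the two conditions as a joint requirement follows the embodiment in the description.

```python
def should_update_reference(current, reference, offset, sharp_ratio=0.75, shift_frac=0.1):
    """Decide whether the current image becomes the new initial reference image (step six)."""
    dx, dy = offset
    h, w = reference.shape[:2]
    sharp_enough = sharpness(current) > sharp_ratio * sharpness(reference)
    moved_enough = abs(dx) > shift_frac * w or abs(dy) > shift_frac * h
    return sharp_enough and moved_enough
```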
Compared with the prior art, the invention has the following beneficial effects. It is implemented as the software function of a microscopic image stitching system, so the manufacturing cost is low. The image registration algorithm based on template matching and the optical flow method balances speed against precision and registers microscopic images in real time, so stitching is both fast and accurate. The image fusion algorithm based on the optimal-seam method and weighted-average fusion removes ghosting and visible seams and improves the visual quality of the joint. Through screening and error correction of the reference image and the current image, the system runs smoothly, can return and re-stitch when an unexpected stitching error occurs, and has high fault tolerance.
Drawings
Fig. 1 is a flow chart of a method for real-time stitching of microscopic images.
Fig. 2 is a block diagram of a real-time stitching system for microscopic images.
Fig. 3 is two images to be matched.
Fig. 4 is an enlarged view of the stitching result and details of the two images to be matched in fig. 3.
Fig. 5 is a graph of the results of scan stitching of large pathological sections.
Detailed Description
Embodiments of the present invention will be described in detail below with reference to the accompanying drawings.
The invention provides a manual real-time stitching method for microscope video streams, whose flow chart is shown in fig. 1; it comprises the following steps:
step one: placing the slice to be spliced on an objective table, aligning the slice with the position of the light-transmitting hole, focusing until the image is clear, and opening the splicing function at the moment, wherein the original image size is (W 0 ,H 0 ) Automatically creating a 9W size on an interface 0 ×9H 0 The camera is used for inputting the image, the definition judgment is carried out on the image in the image buffer queue, the first clearest image is selected as an initial reference image, and the initial reference image is arranged in the center of the background image.
Step two: rotate the hand wheel to move the stage in the X and Y directions of the microscope and observe different areas of the slice. The image sensor continuously captures the current image; the sharpness of the first 6 frames is computed and the sharpest one is selected as the current image to be registered.
Step three: downscale the current image to be registered by a factor of five and obtain, through template-matching initial positioning, the coarse offset relative to the initial reference image and the approximate overlap region of the two images.
The method comprises the following sub-steps:
(3.1) first selecting a template in the current image:
(3.2) traverse the reference image, computing the normalized cross-correlation coefficient for each candidate position:
R(x, y) = Σ_{x',y'} T(x', y')·I(x + x', y + y') / √( Σ_{x',y'} T(x', y')² · Σ_{x',y'} I(x + x', y + y')² )   (1)
In formula (1), T denotes the template image, I the input image, (x', y') coordinates within the template, and (x, y) the displacement of the template relative to the input image. The offset (x_0, y_0) is finally taken at the position with the largest normalized cross-correlation coefficient.
(3.3) obtain the overlap position from the offset: with original image size (W_0, H_0) and initial displacement (x_0, y_0), the overlap region in the initial reference image is the rectangle with top-left corner (max(0, x_0), max(0, y_0)) and size (W_0 − |x_0|) × (H_0 − |y_0|), and the corresponding overlap region in the current image has top-left corner (max(0, −x_0), max(0, −y_0)) and the same size.
step four: after the overlapping area is obtained, in order to further improve the registration efficiency, if the size of the overlapping area of the current image and the reference image is still larger (more than 800×800), calculating laplace in the overlapping area of the current image, and obtaining a second derivative of the matrix image to obtain a gradient matrix. Then, a gradient matrix integral graph is calculated, and an 800×800-sized region feature block with the most abundant details in the overlapped region is searched. If the overlap area is less than 800×800, the overlap area is directly registered in the next step.
Step five: compute the precise offset between the two region feature blocks with the modified optical flow method, and combine the offsets from coarse positioning and fine matching to obtain the precise offset of the current image relative to the initial reference image and its offset relative to the starting point.
The method comprises the following steps:
(5.1) improve the FAST feature-point extraction operator to overcome the sensitivity of the FAST detector to image jitter: the 24 pixels at radius 3 around a point P are used as the detection template, and if 14 contiguous pixels have gray values that each differ from the gray value I_p of P by more than 20 (the threshold T), P is regarded as a feature point.
and (5.2) performing optical flow tracking by using an LK optical flow method, and minimizing a loss function to meet optical flow constraint conditions to obtain a matching point pair.
Assuming constant luminance, for each luminance E (x, y, t) in the image at time t there will be a corresponding point in the image at time t+ [ delta ] t, namely:
E(x,y,t)=E(x+Δx,y+△y,t+△t) (5)
applying a taylor formula to develop on the right side of (5) to obtain a two-dimensional optical flow constraint:
defining a loss function as:
wherein u is m =x'-x,v m =y'-y,
The smaller the loss function, the better the registration. Since the optical flow method is developed based on the Taylor formula and replaces the optical flow method with x '-x and y' -yAnd->The calculation result is inaccurate when the image motion is large. In some cases it may even lead to a match result that differs considerably from the true value. We performed a preliminary estimate before optical flow tracking, converting larger motion into smaller motion, which was inaccurate, but the high optical flow accuracy could further optimize the accuracy.
(5.3) delete mismatched points using neighborhood similarity and position constraints. The feature-point similarity is obtained mainly by computing the similarity of the neighborhoods of the paired feature points: the pixel values in the two neighborhoods are differenced to give the similarity of the point pair.
Because of the high similarity of microscopic images and a large number of complex microstructures, the invention performs mismatching point elimination by combining similarity measurement and position constraint conditions.
The optical flow method often produces mismatches, which are handled by introducing a mismatched-point rejection method: a position constraint is added on top of the similarity measure. The position constraint is computed as follows:
Given the feature point pair set P = (P_1, P_2, ..., P_N), where P_i = ((x_i, y_i), (x_i', y_i')) and (x_i, y_i) and (x_i', y_i') are the paired points in the target image and the initial reference image respectively, take any pair P_A from the set and another pair P_B from the set, and compute the cosine D_θ of the angle between the vectors AB and A'B', and their length difference D_L:
D_θ = (AB · A'B') / (|AB| · |A'B'|)   (8)
D_L = | |AB| − |A'B'| |   (9)
In formulas (8) and (9), (x_A, y_A) and (x_A', y_A') denote a feature point in the target image and its matched point in the reference image, and (x_B, y_B) and (x_B', y_B') denote another, different point pair. The drawn pair is judged consistent if the cosine D_θ and the length difference D_L satisfy the similarity criterion:
C_i = 1 if (1 − D_θ) < T_θ and D_L < T_L, and C_i = 0 otherwise   (10)
In formula (10), the thresholds are T_θ = 0.10 and T_L = 3.0. Eight pairs are drawn at random and the values C_i are summed:
C = Σ_{i=1}^{8} C_i   (11)
In formula (11), C is the number of the 8 randomly drawn points whose positional relationship matches correctly; the larger C, the more reliable the registration of pair A. The threshold is set to 6: if C is greater than 6, pair A is regarded as a correct match; otherwise it is rejected.
(5.4) after the mismatched points are removed, a random sample consensus (RANSAC) algorithm is used to compute the spatial transformation between the two images, unifying them into one coordinate system.
Step six: judging whether the current image meets the fusion condition, if yes, sending the current image into a fusion thread, and if not, not sending the current image into the fusion thread.
The method comprises the following sub-steps:
(6.1) compute the offset between the current image and the last fused image; if it is greater than 1/10 of the image size, this part of the fusion condition is met.
(6.2) calculating the sharpness of the current image, and if the sharpness of the current image is greater than 0.9 times the sharpness of the initial reference image, the current image is eligible for fusion.
It is also checked whether the paste position exceeds the current background image; if so, the background is enlarged, the image is pasted at the corresponding position in the background image, and fusion is performed.
Step seven: fusion is performed by an optimal suture-based method.
The method comprises the following sub-steps:
(7.1) firstly, performing eight-time downsampling on the picture to be fused, and transforming the image into a small-scale space;
(7.2) starting from the first row of the overlap-region image, each point of that row serves as the starting point of a path, which is traversed from top to bottom; each starting column generates one candidate seam line.
Let the current traversal point be P_n(x, y); compute the difference intensity values of the 3 points adjacent to P_n(x, y) in the next row. According to the seam criterion, the difference intensity consists of a color difference and an image structure difference:
E(x, y) = E_color(x, y)² + E_geometry(x, y)   (12)
In formula (12), E_color(x, y) denotes the color difference intensity and E_geometry(x, y) the structural difference intensity. The structural difference intensity E_geometry(x, y) is computed from the image gradient:
E_geometry(x, y) = (S_x × (I_1(x, y) − I_2(x, y)))² + (S_y × (I_1(x, y) − I_2(x, y)))²   (13)
In formula (13), S_x and S_y are the Sobel operators:
S_x = [−1 0 1; −2 0 2; −1 0 1],  S_y = [−1 −2 −1; 0 0 0; 1 2 1]   (14)
The 3 difference intensity values are compared and the point with the smallest value is selected as the next point on the seam; the line connecting the two points is that segment of the seam path, and its intensity value is the intensity of that seam segment. This process is repeated down to the last row; the difference intensities of all segments of each seam are summed, the accumulated intensities of all paths are compared, and the path with the smallest total intensity is the optimal seam.
And (7.3) after obtaining the optimal seam, respectively setting the pixel values at two sides of the seam to be 0 and 1 to obtain an optimal seam mask of the image, and then upsampling the mask to obtain the splicing mask of the original image scale.
(7.4) adopting a weighted average fusion algorithm to further process the optimal splice seam of the original image scale.
Assume the two images to be fused are arranged left and right. The overlap region around the optimal seam is divided into three parts: the first part A_1 lies to the left of the optimal seam line, the second part A_seam is a transition band of width d at the edge of the seam line, and the third part A_2 lies to the right of the optimal seam line. Fusion then uses the gradual-in gradual-out method, with the calculation formula:
f(x, y) = f_1(x, y) for (x, y) ∈ A_1;  f(x, y) = w_1·f_1(x, y) + w_2·f_2(x, y) for (x, y) ∈ A_seam;  f(x, y) = f_2(x, y) for (x, y) ∈ A_2   (15)
In formula (15), f_1(x, y) denotes the left image, f_2(x, y) the right image, and the weights w_i satisfy w_1 + w_2 = 1, with
w_1 = (x_seam + d − x_i) / (2d),  w_2 = (x_i − x_seam + d) / (2d)
where x_i is the position of the current point, x_seam the position of the seam line, and d the preset width.
In the region (0, x_seam − d), more than d to the left of the seam, the left image is taken directly; in the region (x_seam + d, x_max), more than d to the right of the seam, the right image is taken directly; in the transition region (x_seam − d, x_seam + d) on both sides of the seam, fade-in/fade-out fusion is applied. From x_seam − d to x_seam + d, w_1 decreases gradually from 1 to 0 while w_2 increases from 0 to 1, so the contribution of f_1(x, y) decreases and that of f_2(x, y) increases until, on leaving the seam region, the result transitions smoothly into image f_2(x, y).
Step eight: evaluating the current image quality and judging whether to update the initial reference image: if yes, updating the initial reference image, otherwise, reserving the initial reference image; and then, the second step to the sixth step are circulated until the scanning splicing is finished.
In this embodiment, when the current image sharpness is greater than 0.75 times the original reference image sharpness and the current image offset from the original reference image is greater than 1/10 of the image size, the current image is taken as a new reference image and the offset of the new reference image from the initial point is accumulated.
Fig. 2 is a schematic diagram of the real-time image stitching system based on template matching and the optical flow method provided by the invention. By function, the system is divided into an image acquisition module and an image stitching module. The image acquisition module, comprising the camera, microscope, and stage, captures slice images under the microscope. The acquired current image is transmitted to the image stitching system in real time, registered according to the real-time microscopic image stitching method of the invention to obtain the image displacement, pasted at the corresponding position in the panorama, and fused, finally yielding a panorama of the whole slice. The image stitching part consists mainly of two stages: image registration and image fusion.
Fig. 3 is two images to be matched.
Fig. 4 is a graph of a stitching result of the two images in fig. 3 and details thereof, and as can be seen from fig. 4, the stitching algorithm provided by the invention has high registration accuracy and basically eliminates stitching seams.
Fig. 5 is a large pathological section scanning mosaic image spliced by the mosaic algorithm provided by the invention, and as can be seen from fig. 5, the mosaic algorithm can splice images with very large fields of view, and the mosaic accuracy is higher.
The foregoing is merely a preferred embodiment of the present invention and is not intended to limit it in any way. Any simple modification, equivalent variation, or adaptation of the above embodiment that follows the technical principles of the invention still falls within the scope of its technical solution, provided it does not depart from the basic idea of the invention.

Claims (6)

1. The microscopic image real-time splicing method based on the template matching and optical flow method is characterized by comprising the following steps of:
step one: placing the slices to be spliced on an objective table, aligning the slices to the position of a light-transmitting hole, focusing until the images are clear, opening the splicing function, automatically creating a background image on an interface, transmitting the images into the camera, judging the definition of the images in an image cache queue, selecting the first sharpest image as an initial reference image, and placing the first sharpest image in the central position of the background image;
step two: slowly moving the object stage in the X direction or the Y direction manually, continuously and circularly inputting the images acquired by the camera, and selecting a clear image as a current image to be input;
step three: sequentially performing template matching initial positioning, optical flow precision matching, neighborhood similarity and position constraint deletion of mismatching points to obtain the offset of the current image relative to the initial reference image, and taking the offset as a current matching result;
step four: evaluating the current matching result, and judging whether to execute fusion operation or not: if the fusion operation can be executed, the fifth step is carried out, otherwise, the second step is returned;
step five: performing image fusion by a rapid optimal stitching method and a weighted average method;
step six: evaluating the current image quality and judging whether to update the initial reference image: if yes, updating the initial reference image, otherwise, reserving the initial reference image; then, the second step to the sixth step are circulated until the scanning splicing is finished;
the third step is as follows:
(3.1) obtaining initial position estimation through template matching, calculating an offset according to the initial position estimation to obtain a rough matching result, and calculating an overlapping area of the current image and the initial reference image according to the offset;
(3.2) determining whether to proceed directly to step (3.3) according to the size of the overlapping region: if the size of the overlapped area is smaller than or equal to the size setting threshold, directly performing the step (3.3) by taking the overlapped area as an area characteristic block; if the size of the overlapped area is larger than the size setting threshold, calculating the gradient of the overlapped area, calculating an integral graph, and searching an area feature block with relatively rich features in the overlapped area for carrying out the step (3.3);
(3.3) calculating the accurate offset of the regional feature blocks of the current image and the initial reference image by an optical flow method to obtain a fine matching result;
(3.4) synthesizing the rough matching result obtained in the step (3.1) and the fine matching result obtained in the step (3.3) to obtain the accurate offset of the current image relative to the initial reference image and the offset slightly deviated from the initial point, and taking the accurate offset and the offset slightly deviated from the initial point as the current matching result;
the step (3.3) is specifically as follows:
(3.3.1) extracting feature points by a modified FAST algorithm: 24 pixels with the radius of 3 around the feature point P are used as the detection template, and a threshold value T is set, wherein the gray value of the point P is I_p; if the detection template has 14 contiguous pixels whose gray values differ from I_p by more than T, the point is regarded as a feature point;
(3.3.2) performing optical flow tracking by using an optical flow method, and minimizing a loss function to meet optical flow constraint conditions so as to obtain a matching point pair;
(3.3.3) deleting the mismatching point pairs by utilizing neighborhood similarity and position constraint, and then calculating displacement by using a random sampling coincidence algorithm to obtain the accurate offset of the regional characteristic blocks of the current image and the initial reference image as a fine matching result;
in the fifth step, the method for performing image fusion by a rapid optimal stitching line method and a weighted average method comprises the following steps:
(5.1) transforming the current image and the initial reference image into the same coordinate system, firstly performing eight times downsampling on the image to be fused, transforming the image into a small-scale space, then calculating an optimal joint to obtain an optimal joint mask of the image, and then upsampling the optimal joint mask to obtain a splicing mask of the original image scale;
(5.2) adopting a weighted average fusion algorithm to further process the optimal splice seam of the original image scale for image fusion;
the method for further processing the splicing seams by adopting a weighted average fusion algorithm to fuse the images in the step (5.2) comprises the following steps:
assuming that the current image to be fused and the initial reference image are distributed left and right, the overlap region of the optimal seam is divided into three parts: the first part A_1 is the left side of the optimal seam line of the overlap region, the second part A_seam is a transition region of width d at the edge of the seam line, and the third part A_2 is the right side of the optimal seam line of the overlap region;
then image fusion is carried out by a gradual-in gradual-out method, with the calculation formula:
f(x, y) = f_1(x, y) for (x, y) ∈ A_1;  f(x, y) = w_1·f_1(x, y) + w_2·f_2(x, y) for (x, y) ∈ A_seam;  f(x, y) = f_2(x, y) for (x, y) ∈ A_2
in the above formula, f_1(x, y) denotes the left image, f_2(x, y) the right image, and the weights w_i satisfy w_1 + w_2 = 1, with
w_1 = (x_seam + d − x_i) / (2d),  w_2 = (x_i − x_seam + d) / (2d)
wherein x_i denotes the position of the current point, x_seam the position of the seam line, and d the width of the seam edge;
in the region (0, x_seam − d), more than d to the left of the seam, the left image is taken directly; in the region (x_seam + d, x_max), more than d to the right of the seam, the right image is taken directly; in the transition region (x_seam − d, x_seam + d) on both sides of the seam, fade-in fade-out fusion is performed; from x_seam − d to x_seam + d, w_1 gradually changes from 1 to 0 and w_2 gradually changes from 0 to 1, so the proportion of f_1(x, y) gradually decreases and that of f_2(x, y) gradually increases until, leaving the seam region, the result transitions smoothly into image f_2(x, y).
2. The method for splicing microscopic images in real time based on the template matching and optical flow method according to claim 1, wherein the step (3.1) is specifically:
(3.1.1) selecting a template in the current image;
(3.1.2) traversing the initial reference image by the moving template, calculating a normalized cross-correlation coefficient, and solving the translation amount between the current image and the initial reference image as a rough matching result;
(3.1.3) calculating the overlapping area of the current image and the initial reference image according to the offset.
3. The method for splicing microscopic images in real time based on the template matching and optical flow method according to claim 1, wherein the step (3.2) is specifically: if the size of the overlapped area is larger than the size set threshold, calculating Laplacian of the overlapped area, obtaining a gradient matrix by calculating a second derivative of the matrix image, calculating an integral diagram of the gradient matrix, and searching an area feature block with the size of the size set threshold with the most abundant details in the overlapped area, and performing step (3.3).
4. The method for real-time stitching microscopic images based on a template matching and optical flow method according to claim 1, wherein the method for deleting the mismatching point pairs by using neighborhood similarity and position constraint in the step (3.3.3) is as follows:
(3.3.3.1) computing similarity of feature point neighbors;
(3.3.3.2) calculating the position information of the feature points and the adjacent points thereof, namely the distance and the angle;
and (3.3.3.3) eliminating the matching point pairs by combining the similarity and the position information of the feature point neighborhood.
5. The method for real-time stitching microscopic images based on the template matching and optical flow method according to claim 1, wherein in the fourth step, the method for evaluating the current matching result to determine whether to execute the fusion operation is as follows:
(4.1) calculating the definition of the current image, and if the definition of the current image is greater than or equal to the definition of the reference image with the set threshold multiple, performing image fusion on the current image; if the definition of the current image is smaller than the definition of the reference image with the set threshold multiple, the current image is not subjected to image fusion;
(4.2) judging the displacement: if Δx ≤ 0.1×W and Δy ≤ 0.1×H, the current displacement is considered too small and image fusion is not performed, wherein Δx is the displacement in the horizontal direction, Δy is the displacement in the vertical direction, W is the image width, and H is the image height; otherwise, image fusion is performed;
and (4.3) judging whether the position of the map exceeds the current background map, if so, expanding the background, then attaching the image to the corresponding position in the background map, and then fusing the image.
6. The method for splicing microscopic images in real time based on the template matching and optical flow method according to claim 1, wherein the method for evaluating the current image quality and judging whether to update the reference image in the step six is as follows:
(6.1) updating the current image to be the initial reference image when the sharpness of the current image is greater than 0.75 times the sharpness of the initial reference image;
(6.2) updating the current image to the initial reference image when the current image is offset from the initial reference image by an image size greater than 1/10.
CN202211325386.4A 2022-10-27 2022-10-27 Microscopic image real-time stitching method based on template matching and optical flow method Active CN116309036B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211325386.4A CN116309036B (en) 2022-10-27 2022-10-27 Microscopic image real-time stitching method based on template matching and optical flow method

Publications (2)

Publication Number Publication Date
CN116309036A CN116309036A (en) 2023-06-23
CN116309036B true CN116309036B (en) 2023-12-29

Family

ID=86800171

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211325386.4A Active CN116309036B (en) 2022-10-27 2022-10-27 Microscopic image real-time stitching method based on template matching and optical flow method

Country Status (1)

Country Link
CN (1) CN116309036B (en)

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109064397B (en) * 2018-07-04 2023-08-01 广州希脉创新科技有限公司 Image stitching method and system based on camera earphone
CN109934772B (en) * 2019-03-11 2023-10-27 影石创新科技股份有限公司 Image fusion method and device and portable terminal
US11055828B2 (en) * 2019-05-09 2021-07-06 Adobe Inc. Video inpainting with deep internal learning

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009157701A (en) * 2007-12-27 2009-07-16 Shimadzu Corp Method and unit for image processing
JP2013122639A (en) * 2011-12-09 2013-06-20 Hitachi Kokusai Electric Inc Image processing device
CN104751465A (en) * 2015-03-31 2015-07-01 中国科学技术大学 ORB (oriented brief) image feature registration method based on LK (Lucas-Kanade) optical flow constraint
CN110428367A (en) * 2019-07-26 2019-11-08 北京小龙潜行科技有限公司 A kind of image split-joint method and device
CN110475123A (en) * 2019-08-30 2019-11-19 杭州图谱光电科技有限公司 A kind of manual real-time joining method for microscope video flowing
CN111626936A (en) * 2020-05-22 2020-09-04 湖南国科智瞳科技有限公司 Rapid panoramic stitching method and system for microscopic images
CN112365518A (en) * 2020-12-08 2021-02-12 杭州电子科技大学 Image splicing method based on optimal suture line self-selection area gradual-in and gradual-out algorithm
CN114693720A (en) * 2022-02-28 2022-07-01 苏州湘博智能科技有限公司 Design method of monocular vision odometer based on unsupervised deep learning
CN115205114A (en) * 2022-06-24 2022-10-18 长春理工大学 High-resolution image splicing improved algorithm based on ORB (object-oriented bounding box) features

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Microscopic panorama stitching based on SIFT feature matching; Huo Chunbao; Tong Shuai; Zhao Lihui; Cui Hanfeng; Journal of Liaoning Technical University (Natural Science Edition) (01); full text *
Image stitching method based on an improved optimal seam line; Zhang Xiang, Wang Wei, Xiao Di; Computer Engineering and Design; Vol. 39, No. 7; full text *
Application of feature-point-based image stitching technology in animation; Huang Mei; Tang Kun; Xiao Jianxin; Journal of Natural Science of Hunan Normal University (01); full text *

Also Published As

Publication number Publication date
CN116309036A (en) 2023-06-23

Similar Documents

Publication Publication Date Title
US7027628B1 (en) Automated microscopic image acquisition, compositing, and display
CN111311666B (en) Monocular vision odometer method integrating edge features and deep learning
CN108648240A (en) Based on a non-overlapping visual field camera posture scaling method for cloud characteristics map registration
CN112067233B (en) Six-degree-of-freedom motion capture method for wind tunnel model
CN107038683B (en) Panoramic imaging method for moving object
CN110736747B (en) Method and system for positioning under cell liquid-based smear mirror
CN111179170B (en) Rapid panoramic stitching method for microscopic blood cell images
CN109900274B (en) Image matching method and system
CN110263716B (en) Remote sensing image super-resolution land cover mapping method based on street view image
CN103700082B (en) Image split-joint method based on dual quaterion relative orientation
CN116703723A (en) High-resolution microscopic image scanning and stitching method based on microscope system
CN116563377A (en) Mars rock measurement method based on hemispherical projection model
CN115063477A (en) Infrared and visible light double-channel synchronous imaging real-time registration fusion acquisition method and device
Tang et al. Content-based 3-D mosaics for representing videos of dynamic urban scenes
CN116309036B (en) Microscopic image real-time stitching method based on template matching and optical flow method
CN117635421A (en) Image stitching and fusion method and device
JP2007323616A (en) Image processor and processing method therefor
CN111260561A (en) Rapid multi-graph splicing method for mask defect detection
CN116402685A (en) Microscopic image stitching method based on objective table motion information
CN115616018A (en) Positioning method and device for scanning electron microscope, electronic equipment and storage medium
CN114913064A (en) Large parallax image splicing method and device based on structure keeping and many-to-many matching
Chidambaram Edge Extraction of Color and Range Images
Pokorný Mapping 2D Skeleton sequences from speed climbing videos onto a virtual reference wall
CN118379459B (en) Bridge disease visualization method based on three-dimensional reconstruction of nerve radiation field
CN116630164B (en) Real-time splicing method for massive microscopic images

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant