CN108537833B - Infrared image rapid splicing method - Google Patents
- Publication number: CN108537833B (application CN201810350881.8A)
- Authority: CN (China)
- Legal status: Active (the status is an assumption, not a legal conclusion)
Classifications
- G06T7/33 — determination of transform parameters for image registration using feature-based methods
- G06T3/4038 — image mosaicing, e.g. composing plane images from plane sub-images
- G06T5/50 — image enhancement or restoration using two or more images
- G06V10/462 — salient features, e.g. scale-invariant feature transform [SIFT]
- G06T2200/32 — indexing scheme: image mosaicing
- G06T2207/20056 — discrete and fast Fourier transform [DFT, FFT]
- G06T2207/20221 — image fusion; image merging
Abstract
The invention relates to a rapid infrared image splicing method, belonging to the technical field of infrared image splicing. The method comprises the following steps: reading in a library of images to be spliced; analyzing, from the shooting information, the position of each image within the image array; and splicing the images in that order. During image registration, adjacent images are treated as having a weak perspective relation, so only the relative translation between them is considered. A multi-stage matching strategy is adopted: a Fourier registration method performs coarse matching, a template matching method performs fine matching, and if these methods fail, feature-point matching with the RANSAC algorithm is used. Finally, adjacent images are progressively fused with a Laplacian pyramid fusion algorithm. The method can splice infrared images in large batches, processes them rapidly, is robust and accurate, and is easy to popularize and apply.
Description
Technical Field
The invention belongs to the field of infrared image splicing, and in particular relates to a method for rapidly splicing infrared images shot by an infrared camera.
Background
With the development of infrared thermal imaging technology, infrared image processing has become an indispensable component of infrared technology. Among its techniques, infrared image splicing, an important means of expanding the field of view, has become a research hotspot in image processing.
Compared with a visible-light camera, an infrared camera, owing to its special imaging mechanism, produces images with low resolution, a narrow dynamic range, and considerable noise. These problems make the image registration step of image stitching more difficult. Therefore, for a specific application scene, besides designing the stitching algorithm around the images themselves, other information acquired by the infrared camera during shooting must be combined to assist registration. Moreover, some application scenarios (such as aerial photography) generate a huge number of images: scanning a single area can yield hundreds of frames. This demands a stitching algorithm with fast processing and strong robustness, one whose stitching task cannot fail merely because one or a few images cannot be registered.
A typical image stitching method involves the following steps. First, feature points in the images to be stitched are detected with Harris, SIFT, or similar feature-point detection algorithms; the feature points of the two images are matched, the point pairs are purified with the RANSAC algorithm, and the transformation matrix between the two images is computed. The images are then remapped with this transformation matrix and fused by an image fusion method; to prevent a grey-level step when two images with too large a grey-level difference are fused, the image grey levels are generally adjusted before fusion. In addition, when stitching many images, registration errors accumulate as the number of stitched images grows, so a bundle adjustment algorithm is needed to adjust the transformation matrices computed by registration and eliminate the error; bundle adjustment, however, iterates continually over every transformation matrix and is time-consuming. With so many steps and such complex algorithms, this stitching approach makes the task slow. How to overcome these defects of the prior art is therefore an urgent problem in infrared image splicing.
Disclosure of Invention
The invention aims to remedy the defects of the prior art by providing a rapid infrared image splicing method that uses information generated during shooting to splice infrared images quickly, accurately, and robustly.
In order to achieve the purpose, the technical scheme adopted by the invention is as follows:
a method for quickly splicing aerial infrared images comprises the following steps:
step 1, reading: reading in an image library to be spliced and shooting information generated in the shooting process;
step 2, manufacturing an image array: analyzing the positions of all images to be spliced in the image library to be spliced according to the shooting information in the step 1, and then making all the images to be spliced into an image array according to the positions of all the images to be spliced;
step 3, coarse matching of adjacent images: two adjacent images in the image array obtained in step 2 are regarded as having a weak perspective relation, and the relative translation amount of the two adjacent images is calculated by an image registration method based on the Fourier transform;
step 4, fine matching of adjacent images: according to the relative translation amount obtained in step 3, the overlap region of the two adjacent images is determined, the region is then expanded by 20% on each side, and three template image blocks are selected from left to right within the expanded region and denoted the left, middle and right template image blocks;
Template matching is performed on the two adjacent images using the maximum-similarity criterion to obtain the best matching point of each template image block. The best matching points of the three template image blocks are denoted (x1, y1), (x2, y2), (x3, y3), and the absolute differences of their coordinates are computed: W1 = |x1 - x2|, W2 = |x1 - x3|, W3 = |x2 - x3|, H1 = |y1 - y2|, H2 = |y1 - y3|, H3 = |y2 - y3|. The three matching points are then screened. If W1, W2 and W3 are all no more than 5% of the image width and the corresponding H1, H2 and H3 are all no more than 5% of the image height, the three points do not differ greatly, the matching succeeds, and the three points are averaged as the fine-matching result. If two of W1, W2, W3 are no more than 5% of the image width and the corresponding H values are no more than 5% of the image height, two pairs of points agree and the matching succeeds; either agreeing pair is averaged as the matching result. If one of W1, W2, W3 is no more than 5% of the image width and the corresponding H is no more than 5% of the image height, one pair of points agrees and the matching succeeds; that pair is averaged as the matching result. In all other cases the three best matching points differ too much and the matching fails. If the matching succeeds, go to step 5; if it fails, go to step 6;
step 5, judging the matching result: if the difference between the fine-matching result of step 4 and the coarse-matching result of step 3 is smaller than the threshold, accept the matching result and go to step 7; otherwise reject it and go to step 6;
step 6, feature-point matching of two adjacent images: Harris feature-point detection is performed on the two images to be matched, the detected feature-point descriptors are matched by Euclidean distance to obtain matching points, mismatched points are eliminated with the RANSAC algorithm, and the relative translation amount between the two images to be registered is calculated;
and 7, fusing adjacent images: fusing adjacent images by using a Laplacian pyramid fusion algorithm;
and 8, repeating the steps 3 to 7 until the whole image library is spliced.
Further, preferably, in step 1, the shooting information generated in the shooting process includes a flying speed of the aircraft, a flying altitude, a heading angle, a pitch angle, a longitude and latitude where the shooting is performed, and shooting time.
Further, preferably, the splicing is performed in order from top to bottom and from left to right.
Further, preferably, the relative translation amount of the two adjacent images is calculated by the Fourier registration method as follows: compute the cross-power spectrum of the two images, whose phase equals the phase difference between the images; apply the inverse Fourier transform to this phase difference to obtain a two-dimensional impulse function; the position of the impulse peak is the relative translation amount between the two registered images.
Further, preferably, in step 4 the maximum-similarity criterion uses the correlation function R(i, j) as the similarity measure, calculated as:
R(i, j) = [ Σx=1..X Σy=1..Y s(i + x, j + y) · T(x, y) ] / sqrt( [ Σx=1..X Σy=1..Y s(i + x, j + y)² ] · [ Σx=1..X Σy=1..Y T(x, y)² ] )
where s(x, y) is the expanded overlap region, T(x, y) is the template image block, X is the length of the template image block, Y is its width, i is the position of the template image block along the x-axis, and j its position along the y-axis; the (i, j) that maximizes R(i, j) is the best matching point.
Further, in step 4, the length of each template image block is 1/6 of the length of the expanded overlap region, and its width is 1/6 of the width of that region.
Further, in step 5, the threshold is preferably 5% to 10%. For example, with a threshold of 5%, the fine-matching result of step 4 must differ from the coarse-matching result of step 3 by less than 5% of the image width horizontally and less than 5% of the image height vertically.
Compared with the prior art, the invention has the beneficial effects that:
(1) the method makes full use of the shooting information to determine the splicing order and positions, splices the images in order, and treats the image projection relation as weak perspective, which speeds up splicing and reduces its complexity;
(2) a multi-stage matching strategy is adopted: a fast Fourier registration method first performs coarse matching and computes the overlap, a template matching method then performs fine registration, and if registration still fails a more accurate feature-point matching method is used, balancing the speed and robustness of image registration while improving its accuracy;
(3) the Laplacian pyramid fusion method eliminates the influence of pixel misalignment.
Drawings
FIG. 1 is a schematic flow chart of a method for rapidly stitching aerial infrared images according to the present invention;
FIG. 2 is a schematic view of an aerial image capture mode;
FIG. 3 is a schematic view of an array of images obtained by aerial photography;
FIG. 4 is a schematic diagram of a method for selecting a template by a template matching method;
FIG. 5 is a flow chart of screening best match points;
fig. 6 is a graph showing the splicing result of the embodiment.
Detailed Description
The present invention will be described in further detail with reference to examples.
It will be appreciated by those skilled in the art that the following examples merely illustrate the invention and should not be taken as limiting its scope. Where particular techniques or conditions are not specified, the examples follow the techniques or conditions described in the literature of the art or the product specifications. Materials and equipment whose manufacturers are not indicated are conventional, commercially available products.
Example 1
A method for quickly splicing aerial infrared images is shown in figure 1 and comprises the following steps:
step 1, reading: reading in an image library to be spliced and shooting information generated in the shooting process; shooting information generated in the shooting process comprises the flying speed of an aircraft, the flying height, the course angle, the pitch angle, the longitude and latitude during shooting and shooting time;
step 2, manufacturing an image array: analyzing the positions of all images to be spliced in the image library to be spliced according to the shooting information in the step 1, and then making all the images to be spliced into an image array according to the positions (rows and columns) of all the images to be spliced; when splicing, the splicing sequence is from top to bottom and from left to right.
Step 3, coarse matching of adjacent images: the two adjacent images in the image array obtained in step 2 are regarded as having a weak perspective relation, i.e. the two images are assumed to differ only by a translation, and the relative translation amount of the two adjacent images is calculated by an image registration method based on the Fourier transform;
step 4, fine matching of adjacent images: according to the relative translation amount obtained in step 3, the overlap region of the two adjacent images is determined, the region is then expanded by 20% on each side, and three template image blocks are selected from left to right within the expanded region and denoted the left, middle and right template image blocks; the length of each template image block is 1/6 of the length of the expanded overlap region, and its width is 1/6 of the width of that region;
Template matching is performed on the two adjacent images using the maximum-similarity criterion to obtain the best matching point of each template image block. The best matching points of the three template image blocks are denoted (x1, y1), (x2, y2), (x3, y3), and the absolute differences of their coordinates are computed: W1 = |x1 - x2|, W2 = |x1 - x3|, W3 = |x2 - x3|, H1 = |y1 - y2|, H2 = |y1 - y3|, H3 = |y2 - y3|. The three matching points are then screened. If W1, W2 and W3 are all no more than 5% of the image width and the corresponding H1, H2 and H3 are all no more than 5% of the image height, the three points do not differ greatly, the matching succeeds, and the three points are averaged as the fine-matching result. If two of W1, W2, W3 are no more than 5% of the image width and the corresponding H values are no more than 5% of the image height, two pairs of points agree and the matching succeeds; either agreeing pair is averaged as the matching result. If one of W1, W2, W3 is no more than 5% of the image width and the corresponding H is no more than 5% of the image height, one pair of points agrees and the matching succeeds; that pair is averaged as the matching result. In all other cases the three best matching points differ too much and the matching fails. If the matching succeeds, go to step 5; if it fails, go to step 6;
step 5, judging the matching result: if the difference between the fine-matching result of step 4 and the coarse-matching result of step 3 is smaller than the threshold, accept the matching result and go to step 7; otherwise reject it and go to step 6;
step 6, feature-point matching of two adjacent images: Harris feature-point detection is performed on the two images to be matched, the detected feature-point descriptors are matched by Euclidean distance to obtain matching points, mismatched points are eliminated with the RANSAC algorithm, and the relative translation amount between the two images to be registered is calculated;
step 7, fusing adjacent images: adjacent images are fused with the Laplacian pyramid fusion algorithm. Because the two adjacent images are treated as weakly perspective and registration carries some error, the registered images are usually misaligned by several pixels; fusing adjacent images with the Laplacian pyramid fusion algorithm effectively eliminates the influence of this pixel misalignment.
And 8, repeating the steps 3 to 7 until the whole image library is spliced.
The specific method for calculating the relative translation amount of the two adjacent images by the Fourier registration method is as follows: compute the cross-power spectrum of the two images, whose phase equals the phase difference between the images; apply the inverse Fourier transform to this phase difference to obtain a two-dimensional impulse function; the position of the impulse peak is the relative translation amount between the two registered images.
In step 4, the maximum-similarity criterion uses the correlation function R(i, j) as the similarity measure, calculated as:
R(i, j) = [ Σx=1..X Σy=1..Y s(i + x, j + y) · T(x, y) ] / sqrt( [ Σx=1..X Σy=1..Y s(i + x, j + y)² ] · [ Σx=1..X Σy=1..Y T(x, y)² ] )
where s(x, y) is the expanded overlap region, T(x, y) is the template image block, X is the length of the template image block, Y is its width, i is the position of the template image block along the x-axis, and j its position along the y-axis; the (i, j) that maximizes R(i, j) is the best matching point.
In step 5, the threshold value is 5% -10%.
Examples of applications
The present invention will be described in further detail below with reference to examples of aerial images and the accompanying drawings, but embodiments of the present invention are not limited to aerial images.
As shown in fig. 1, a method for quickly stitching aerial infrared images includes the following steps:
step 1, reading: reading in an image library to be spliced and shooting information generated in the shooting process;
the manner of image acquisition in the image library is shown in fig. 2: the aircraft flies forwards (the flying direction is vertical to the paper surface in fig. 2), the infrared camera faces the ground in a plane (the paper surface) vertical to the flying direction and is obliquely swept at an angle theta to shoot, the sweep generates images in the longitudinal direction, and the number of the images in the longitudinal direction is 9 fixed pairs; the airplane flies to generate images in the transverse direction, and the transverse sheets extend infinitely; and reading the aerial photographing information such as latitude and longitude generated during aerial photographing carried in the image head, photographing time and the like. The specific aerial photographing information comprises the flying speed, flying height, course angle, pitch angle, current longitude and latitude and current photographing time of the aircraft.
Step 2, manufacturing an image array: analyzing the positions of all images to be spliced in the image library to be spliced according to the shooting information in the step 1, and then making all the images to be spliced into an image array according to the positions of all the images to be spliced; the resulting image array is shown in fig. 3. When splicing, the splicing sequence is from top to bottom and from left to right.
Step 3, coarse matching of adjacent images: regarding two adjacent images in the image array obtained in the step 2 as a weak perspective relation, and calculating the relative translation amount of the two adjacent images by adopting an image registration method based on Fourier transform; the method comprises the following specific steps:
the images that are sequentially adjacent are preliminarily registered. Because the images are obtained by long-distance aerial photography, projection exists between two adjacent images theoreticallyAnd (3) shadow relation, for aerial images with shooting distance of dozens of kilometers, two adjacent images can be regarded as weak perspective relation, and the default two images only have translation relation. Therefore, a Fourier registration method is adopted to find out the relative displacement relation of two adjacent images. The specific implementation method comprises the following steps: let f1(x, y) is f2(x, y) are adjacent images, f1And f2Corresponding Fourier transform F1(u, v) and F2(u, v), calculating the cross-power spectra of the two images according to the following formula:
wherein F1(u, v) and F2(u, v) are each F1(u, v) and F2(u, v) complex conjugation;
the phase difference of the cross power spectrum is equivalent to the phase difference between the images, the Fourier inverse transformation is carried out on the formula to obtain a two-dimensional pulse function, and the pulse peak position is found out, wherein the position is the relative translation x between the two registered images0And y0。
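The cross-power-spectrum computation above can be sketched in a few lines of numpy. This is an illustrative phase-correlation sketch under our own naming, not the patent's code:

```python
import numpy as np

def phase_correlation(img1, img2):
    """Return the (dy, dx) translation of img1 relative to img2 via the
    pulse peak of the inverse-transformed cross-power spectrum."""
    F1 = np.fft.fft2(img1)
    F2 = np.fft.fft2(img2)
    cross = F1 * np.conj(F2)
    cross /= np.abs(cross) + 1e-12        # keep only the phase difference
    pulse = np.real(np.fft.ifft2(cross))  # two-dimensional pulse function
    dy, dx = np.unravel_index(np.argmax(pulse), pulse.shape)
    h, w = pulse.shape
    if dy > h // 2:                       # wrap shifts past half the image
        dy -= h
    if dx > w // 2:
        dx -= w
    return int(dy), int(dx)
```

For real aerial frames the peak location is only meaningful where the overlap genuinely behaves as a pure translation — exactly the weak-perspective assumption of step 3.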
Step 4, fine matching of adjacent images: according to the relative translation amount obtained in step 3, the overlap region of the two adjacent images is determined, the region is then expanded by 20% on each side, and three template image blocks are selected from left to right within the expanded region and denoted the left, middle and right template image blocks; the length of each template image block is 1/6 of the length of the expanded overlap region, and its width is 1/6 of the width of that region. As shown in FIG. 4, the three template image blocks T1, T2, T3 (left, middle and right), each of size X × Y, are selected in image 1;
Template matching is performed on the two adjacent images using the maximum-similarity criterion to obtain the best matching point of each template image block; the criterion uses the correlation function R(i, j) as the similarity measure, calculated as:
R(i, j) = [ Σx=1..X Σy=1..Y s(i + x, j + y) · T(x, y) ] / sqrt( [ Σx=1..X Σy=1..Y s(i + x, j + y)² ] · [ Σx=1..X Σy=1..Y T(x, y)² ] )
where s(x, y) is the expanded overlap region of image 2, T(x, y) is the template image block, X is the length of the template image block, Y is its width, i is the position of the template image block along the x-axis, and j its position along the y-axis; the (i, j) that maximizes R(i, j) is the best matching point.
The best matching points of the three template image blocks are denoted (x1, y1), (x2, y2), (x3, y3), and the absolute differences of their coordinates are computed: W1 = |x1 - x2|, W2 = |x1 - x3|, W3 = |x2 - x3|, H1 = |y1 - y2|, H2 = |y1 - y3|, H3 = |y2 - y3|. The three matching points are then screened, as shown in FIG. 5. If W1, W2 and W3 are all no more than 5% of the image width and the corresponding H1, H2 and H3 are all no more than 5% of the image height, the three points do not differ greatly, the matching succeeds, and the three points are averaged as the fine-matching result. If two of W1, W2, W3 are no more than 5% of the image width and the corresponding H values are no more than 5% of the image height, two pairs of points agree and the matching succeeds; either agreeing pair is averaged as the matching result. If one of W1, W2, W3 is no more than 5% of the image width and the corresponding H is no more than 5% of the image height, one pair of points agrees and the matching succeeds; that pair is averaged as the matching result. In all other cases the three best matching points differ too much and the matching fails. If the matching succeeds, go to step 5; if it fails, go to step 6;
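The screening flow of FIG. 5 reduces to checking pairwise agreement among the three points. A compact sketch in our own formulation of the rule above, not the patent's code:

```python
from itertools import combinations

def screen_matches(points, img_w, img_h, tol=0.05):
    """Screen three best-match points (x, y). A pair 'agrees' when its
    coordinate differences are within tol of the image width and height.
    Returns the averaged match, or None when no pair agrees."""
    agreeing = [
        (p, q) for p, q in combinations(points, 2)
        if abs(p[0] - q[0]) <= tol * img_w and abs(p[1] - q[1]) <= tol * img_h
    ]
    if not agreeing:
        return None                    # matching failed -> step 6 fallback
    # all three points agree -> average all three; otherwise average one pair
    pts = points if len(agreeing) == 3 else agreeing[0]
    return (sum(p[0] for p in pts) / len(pts),
            sum(p[1] for p in pts) / len(pts))
```

A `None` result triggers the feature-point fallback of step 6.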
step 5, judging the matching result: if the fine-matching result of step 4 differs from the coarse-matching result of step 3 by less than 5% of the image width and less than 5% of the image height, accept the matching result and go to step 7; otherwise reject it and go to step 6;
step 6, feature-point matching of two adjacent images: Harris feature-point detection is performed on the two images to be matched, the detected feature-point descriptors are matched by Euclidean distance to obtain matching points, mismatched points are eliminated with the RANSAC algorithm, and the relative translation amount between the two registered images is calculated;
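Because the model in step 6 is a pure translation, RANSAC degenerates to sampling a single putative match, taking its displacement as the model, and counting inliers. A sketch given putative point correspondences (illustrative; Harris detection and descriptor matching are assumed to have already produced `pts1`/`pts2`):

```python
import numpy as np

def ransac_translation(pts1, pts2, iters=200, tol=2.0, seed=0):
    """Robustly estimate the translation pts2 - pts1 from putative
    matches; returns the inlier-averaged translation and inlier mask."""
    rng = np.random.default_rng(seed)
    diffs = np.asarray(pts2, float) - np.asarray(pts1, float)
    best_t, best_count = diffs[0], -1
    for _ in range(iters):
        t = diffs[rng.integers(len(diffs))]   # 1-point minimal sample
        count = int((np.linalg.norm(diffs - t, axis=1) < tol).sum())
        if count > best_count:
            best_count, best_t = count, t
    mask = np.linalg.norm(diffs - best_t, axis=1) < tol
    return diffs[mask].mean(axis=0), mask
```

Averaging over the inlier set refines the sampled one-point model before the translation is handed to the fusion step.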
step 7, fusing adjacent images: adjacent images are fused with the Laplacian pyramid fusion algorithm, with the fusion weight of the fused region decreasing from the two sides toward the centre. Because the two adjacent images are treated as weakly perspective and registration carries some error, the registered images are usually misaligned by several pixels; fusing them with the Laplacian pyramid fusion algorithm effectively eliminates the influence of this pixel misalignment.
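A minimal Laplacian-pyramid fusion sketch (ours, using naive 2x decimation and zero-order upsampling rather than the Gaussian filtering a production blender would use; the residual construction makes the pyramid collapse exactly when the weight map is constant):

```python
import numpy as np

def downsample(img):
    return img[::2, ::2]

def upsample(img, shape):
    up = np.repeat(np.repeat(img, 2, axis=0), 2, axis=1)
    return up[:shape[0], :shape[1]]

def laplacian_pyramid(img, levels):
    pyr, cur = [], img.astype(float)
    for _ in range(levels - 1):
        down = downsample(cur)
        pyr.append(cur - upsample(down, cur.shape))  # band-pass residual
        cur = down
    pyr.append(cur)                                  # coarsest level
    return pyr

def pyramid_blend(img1, img2, weight, levels=4):
    """Blend each pyramid level with a per-pixel weight map (e.g. a ramp
    across the overlap, decreasing from the sides toward the centre),
    then collapse the fused pyramid back to an image."""
    p1 = laplacian_pyramid(img1, levels)
    p2 = laplacian_pyramid(img2, levels)
    wp = [np.asarray(weight, float)]
    for _ in range(levels - 1):
        wp.append(downsample(wp[-1]))                # weight at each scale
    fused = [w * a + (1 - w) * b for a, b, w in zip(p1, p2, wp)]
    out = fused[-1]
    for lap in reversed(fused[:-1]):                 # collapse the pyramid
        out = upsample(out, lap.shape) + lap
    return out
```

Blending the band-pass levels separately is what hides the few-pixel misalignment a direct weighted average would turn into ghosting.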
And 8, repeating the steps 3 to 7 until the whole image library is spliced.
The result of splicing an image library containing 72 images (9 rows by 8 columns) according to the above embodiment is shown in FIG. 6.
The foregoing shows and describes the general principles, essential features and advantages of the invention. It will be understood by those skilled in the art that the invention is not limited to the embodiments described above, which merely illustrate its principle; various changes and modifications may be made without departing from the spirit and scope of the invention, and such changes and modifications fall within the scope of the invention as claimed. The scope of the invention is defined by the appended claims and their equivalents.
Claims (7)
1. A method for quickly splicing aerial infrared images is characterized by comprising the following steps:
step 1, reading: reading in an image library to be spliced and shooting information generated in the shooting process;
step 2, manufacturing an image array: analyzing the positions of all images to be spliced in the image library to be spliced according to the shooting information in the step 1, and then making all the images to be spliced into an image array according to the positions of all the images to be spliced;
step 3, coarse matching of adjacent images: treating two adjacent images in the image array obtained in step 2 as related by a weak perspective transformation, and calculating their relative translation with an image registration method based on the Fourier transform;
step 4, fine matching of adjacent images: according to the relative translation obtained in step 3, determining the overlapping area of the two adjacent images, expanding this area by 20% toward each side, and then selecting three image blocks from left to right in the expanded area, denoted the left, middle and right template image blocks;
carrying out template matching on the two adjacent images using the maximum similarity criterion to obtain the best matching point of each template image block; the best matching points of the three template image blocks are denoted (x1, y1), (x2, y2) and (x3, y3); the absolute differences of their horizontal and vertical coordinates are then calculated: W1 = |x1 - x2|, W2 = |x1 - x3|, W3 = |x2 - x3|, H1 = |y1 - y2|, H2 = |y1 - y3|, H3 = |y2 - y3|; these three matching points are then screened: if W1, W2 and W3 are all less than or equal to 5% of the length of the image to be spliced and the corresponding H1, H2 and H3 are all less than or equal to 5% of its width, the three points are mutually consistent, matching succeeds, and the three points are averaged as the fine matching result; if two of W1, W2 and W3 are less than or equal to 5% of the length of the image to be spliced and the corresponding H values are less than or equal to 5% of its width, the corresponding pairs of points are consistent, matching succeeds, and either consistent pair is averaged as the matching result; if one of W1, W2 and W3 is less than or equal to 5% of the length of the image to be spliced and the corresponding H is less than or equal to 5% of its width, one pair of points is consistent, matching succeeds, and that pair is averaged as the matching result; in all other cases the three best matching points differ substantially and matching fails; proceed to step 5 if matching succeeds and to step 6 if it fails;
step 5, judging the matching result: if the difference between the fine matching result of step 4 and the coarse matching result of step 3 is smaller than the threshold value, accepting the matching result and proceeding to step 7; otherwise rejecting it and proceeding to step 6;
step 6, feature point matching of two adjacent images: carrying out Harris feature point detection on the two adjacent images to be matched, matching the detected feature point descriptors by Euclidean distance to obtain matching points, eliminating mismatches with the RANSAC algorithm, and calculating the relative translation between the two registered images;
step 7, fusing adjacent images: fusing adjacent images using the Laplacian pyramid fusion algorithm;
step 8, repeating steps 3 to 7 until the whole image library is spliced.
2. The method for rapidly splicing infrared images according to claim 1, wherein in step 1, the shooting information generated in the shooting process comprises the aircraft's flight speed, flight altitude, heading angle and pitch angle, and the latitude, longitude and time of shooting.
3. The infrared image fast stitching method as claimed in claim 1, wherein during stitching, the stitching sequence is from top to bottom and from left to right.
4. The infrared image fast stitching method according to claim 1, wherein the specific method for calculating the relative translation of two adjacent images using the Fourier registration method is as follows: calculate the cross-power spectrum of the two images, whose phase equals the phase difference between the images; apply an inverse Fourier transform to this phase difference to obtain a two-dimensional impulse function, and locate the impulse peak, whose position gives the relative translation between the two registered images.
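The procedure of claim 4 can be sketched in a few lines of NumPy; the function name, the cyclic-shift assumption, and the integer-only (no subpixel) result are illustrative simplifications.

```python
import numpy as np

def phase_correlation(img_a, img_b):
    """Return (dy, dx) such that img_b is approximately np.roll(img_a,
    (dy, dx), axis=(0, 1)), via the normalized cross-power spectrum."""
    Fa, Fb = np.fft.fft2(img_a), np.fft.fft2(img_b)
    cross = np.conj(Fa) * Fb
    cross /= np.abs(cross) + 1e-12             # keep only the phase difference
    pulse = np.real(np.fft.ifft2(cross))       # 2-D impulse at the shift
    dy, dx = np.unravel_index(np.argmax(pulse), pulse.shape)
    # Map peaks past the midpoint back to negative shifts.
    if dy > img_a.shape[0] // 2:
        dy -= img_a.shape[0]
    if dx > img_a.shape[1] // 2:
        dx -= img_a.shape[1]
    return int(dy), int(dx)
```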
5. The infrared image fast stitching method as claimed in claim 1, wherein in step 4, the maximum similarity criterion uses a correlation function R(i, j) as the similarity measure, where S(x, y) is the expanded overlapping area, T(x, y) is the template image block, X and Y are the length and width of the template image block, and i and j are the offsets of the template image block along the x-axis and y-axis; the (i, j) that maximizes R(i, j) is the best matching point.
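The exact formula for R(i, j) does not survive on this page, so the sketch below substitutes plain normalized cross-correlation as the similarity measure; the formula choice and all names are assumptions, not the patent's definition.

```python
import numpy as np

def best_match(S, T):
    """Slide template T over search region S and return the offset (i, j)
    that maximizes a normalized cross-correlation score in [-1, 1]."""
    X, Y = T.shape
    H, W = S.shape
    tn = (T - T.mean()) / (T.std() + 1e-12)    # zero-mean, unit-std template
    best, best_ij = -np.inf, (0, 0)
    for i in range(H - X + 1):
        for j in range(W - Y + 1):
            patch = S[i:i + X, j:j + Y]
            pn = (patch - patch.mean()) / (patch.std() + 1e-12)
            r = (tn * pn).mean()               # NCC score for this offset
            if r > best:
                best, best_ij = r, (i, j)
    return best_ij
```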
6. The infrared image fast stitching method as claimed in claim 1, wherein in step 4, the length of each template image block is 1/6 of the length of the expanded overlapping area, and the width is 1/6 of the width of the expanded overlapping area.
7. The infrared image fast stitching method as claimed in claim 1, wherein in step 5, the threshold is 5% -10%.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810350881.8A CN108537833B (en) | 2018-04-18 | 2018-04-18 | Infrared image rapid splicing method |
Publications (2)
Publication Number | Publication Date |
---|---|
CN108537833A CN108537833A (en) | 2018-09-14 |
CN108537833B true CN108537833B (en) | 2022-06-21 |
Family
ID=63477627
Families Citing this family (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109285118B (en) * | 2018-09-26 | 2023-03-07 | 电子科技大学 | Infrared thermal image splicing method with additional accessory layer |
CN109712070A (en) * | 2018-12-04 | 2019-05-03 | 天津津航技术物理研究所 | A kind of infrared panoramic image split-joint method based on graph cut |
CN110264569A (en) * | 2019-05-30 | 2019-09-20 | 南京航空航天大学 | A kind of infrared image three-dimensional reconstruction algorithm |
CN110689511B (en) * | 2019-09-19 | 2023-02-03 | 华东师范大学 | Optimization method for large-visual-field image splicing path |
CN113129213A (en) * | 2020-01-14 | 2021-07-16 | 中国计量大学 | Automatic splicing and fusing method for digital holographic subaperture phase diagram |
CN111260561A (en) * | 2020-02-18 | 2020-06-09 | 中国科学院光电技术研究所 | Rapid multi-graph splicing method for mask defect detection |
CN113916149B (en) * | 2021-09-10 | 2024-06-04 | 北京工业大学 | Cylindrical straight gear appearance vision measuring device based on multi-eye CCD industrial camera and image processing method |
CN115272077B (en) * | 2022-07-29 | 2023-06-06 | 西安羚控电子科技有限公司 | Image stitching method and system based on vision fusion |
CN116930262B (en) * | 2023-09-15 | 2023-11-28 | 佰聆数据股份有限公司 | Defect detection method and system for realizing power equipment based on multi-terminal interaction |
CN117323002A (en) * | 2023-11-30 | 2024-01-02 | 北京万特福医疗器械有限公司 | Neural endoscopic surgery visualization system based on mixed reality technology |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103714548B (en) * | 2013-12-27 | 2017-01-11 | 西安电子科技大学 | Infrared image and visible image registration method based on visual attention |
CN104881841B (en) * | 2015-05-20 | 2019-01-04 | 南方电网科学研究院有限责任公司 | Aerial high-voltage power tower image splicing method based on edge features and point features |
TWI607901B (en) * | 2015-11-06 | 2017-12-11 | 財團法人工業技術研究院 | Image inpainting system area and method using the same |
CN107784632A (en) * | 2016-08-26 | 2018-03-09 | 南京理工大学 | A kind of infrared panorama map generalization method based on infra-red thermal imaging system |
CN106981077B (en) * | 2017-03-24 | 2020-12-25 | 中国人民解放军国防科学技术大学 | Infrared image and visible light image registration method based on DCE and LSS |
CN107563438B (en) * | 2017-08-31 | 2019-08-30 | 西南交通大学 | A kind of multi-modal Remote Sensing Images Matching Method and system of fast robust |
CN107833179A (en) * | 2017-09-05 | 2018-03-23 | 云南电网有限责任公司昆明供电局 | The quick joining method and system of a kind of infrared image |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN108537833B (en) | Infrared image rapid splicing method | |
US10462362B2 (en) | Feature based high resolution motion estimation from low resolution images captured using an array source | |
CN105957007B (en) | Image split-joint method based on characteristic point plane similarity | |
CN109410207B (en) | NCC (non-return control) feature-based unmanned aerial vehicle line inspection image transmission line detection method | |
CN106462944B (en) | High-resolution panorama VR generator and method | |
CN105205781B (en) | Transmission line of electricity Aerial Images joining method | |
CN110956661B (en) | Method for calculating dynamic pose of visible light and infrared camera based on bidirectional homography matrix | |
US20120230602A9 (en) | Multi-bank TDI approach for high-sensitivity scanners | |
CN106991690A (en) | A kind of video sequence synchronous method based on moving target timing information | |
CN112801870B (en) | Image splicing method based on grid optimization, splicing system and readable storage medium | |
Ghosh et al. | Quantitative evaluation of image mosaicing in multiple scene categories | |
CN107240082B (en) | Splicing line optimization method and equipment | |
CN104392416A (en) | Video stitching method for sports scene | |
CN110909750B (en) | Image difference detection method and device, storage medium and terminal | |
KR20110073386A (en) | Method and device for the invariant-affine recognition of shapes | |
WO2009026522A1 (en) | Multi-bank tdi approach for high-sensitivity scanners | |
CN115456870A (en) | Multi-image splicing method based on external parameter estimation | |
Punnappurath et al. | Rolling shutter super-resolution | |
CN108520541B (en) | Calibration method of wide-angle camera | |
Sun et al. | Image stitching method of aerial image based on feature matching and iterative optimization | |
CN115239667A (en) | Tire qualification detection method based on multi-view camera | |
CN108460786A (en) | A kind of high speed tracking of unmanned plane spot | |
Shen et al. | An improved image mosaic algorithm for UAV | |
Majumdar et al. | Registration and mosaicing for images obtained from UAV | |
CN108830781A (en) | A kind of wide Baseline Images matching line segments method under Perspective transformation model |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||