CN115908136A - Real-time incremental splicing method for aerial images of unmanned aerial vehicle - Google Patents

Real-time incremental splicing method for aerial images of unmanned aerial vehicle

Info

Publication number
CN115908136A
CN115908136A (Application CN202211514624.6A)
Authority
CN
China
Prior art keywords
image
longitude
latitude
spliced
images
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211514624.6A
Other languages
Chinese (zh)
Inventor
魏倩茹
牛犇
侯晓松
马浩铭
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Northwestern Polytechnical University
Original Assignee
Northwestern Polytechnical University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Northwestern Polytechnical University filed Critical Northwestern Polytechnical University
Priority to CN202211514624.6A priority Critical patent/CN115908136A/en
Publication of CN115908136A publication Critical patent/CN115908136A/en
Pending legal-status Critical Current

Classifications

    • Y — GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 — TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02T — CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T 10/00 — Road transport of goods or passengers
    • Y02T 10/10 — Internal combustion engine [ICE] based vehicles
    • Y02T 10/40 — Engine management systems

Abstract

The invention discloses a real-time incremental splicing method for aerial images of an unmanned aerial vehicle, comprising the following steps: creating a new map template; monitoring the aerial image storage folder of the unmanned aerial vehicle and updating the image splicing sequence in real time; setting splicing route parameters; splicing the images; generating a new map; and correcting the center offset. The method achieves fast, real-time, incremental, robust and efficient splicing of multi-strip aerial images captured by an unmanned aerial vehicle over an unknown area.

Description

Real-time incremental splicing method for aerial images of unmanned aerial vehicle
Technical Field
The invention relates to unmanned aerial vehicle technology, and in particular to a real-time incremental splicing method for aerial images of an unmanned aerial vehicle.
Background
Quickly splicing an aerial image sequence captured by an unmanned aerial vehicle into a panoramic image has great application value and research significance, both in the military field (aerial enemy reconnaissance, border patrol, etc.) and in the civil field (aerial surveying and mapping, aerial traffic surveys, aerial disaster investigation, etc.). In many specific tasks, constructing a panoramic image through fast image splicing is essential for expanding the field of view and obtaining more comprehensive information, and image splicing based on unmanned aerial vehicle aerial image sequences has become a major research hotspot.
The unmanned aerial vehicle aerial image splicing method mainly comprises four steps: image geometric correction, image preprocessing, image registration and image fusion.
Image geometric correction. Because an unmanned aerial vehicle is small and relatively unstable, tilt and shake inevitably occur during shooting. The captured aerial images are therefore first geometrically corrected to remove the distortions produced during imaging, such as edge distortion and center offset, generating new images that meet the splicing requirements.
Image preprocessing. Image preprocessing unifies the images to be spliced into an established coordinate system.
Image registration. Image registration extracts matching information between the images and, according to a matching strategy, finds the optimal matching pairs in the extracted information; these pairs serve as the basis for aligning and unifying the images.
Image fusion. Image fusion mosaics the registered images and smooths the boundaries so that the splicing transitions look natural.
Geometric correction is difficult: if the image distortion is severe, the splicing effect suffers. The key to successful image splicing is image registration.
At present, mainstream image splicing algorithms are mainly based on feature extraction and matching: a deep learning algorithm or a feature description operator detects and extracts feature key points from the images, the key points are then matched between images according to a specific matching strategy, and the corresponding homography matrix is computed, so that multiple images can be transformed into the same view and spliced.
A great deal of research on unmanned aerial vehicle aerial image splicing has been carried out in China and abroad. One approach proposes an image registration algorithm based on the Fourier-Mellin transform and feature point matching: initial registration parameters are computed by the Fourier-Mellin transform, extracted feature points are then mapped between images according to these initial parameters to complete point matching and optimization, yielding one-to-one matching point pairs, and the final registration parameters are computed with least squares.
Another approach adopts the ORB feature description operator for feature extraction, proposes a fast and robust feature matching strategy based on descriptor similarity, models the spatial affine transformation between image sequences captured by the unmanned aerial vehicle, introduces a robust Bayesian framework to estimate the transformation, and fuses the images with a gradual fade-in/fade-out method.
A VGG-style regression homography network has also been proposed, which learns the homography between two images: a neural network produces the homography as an intermediate variable and registration is performed with it directly, without separating feature point detection from transformation estimation.
Finally, a fast splicing method for aerial image sequences of small unmanned aerial vehicles estimates the homography between image and ground using bundle adjustment with a small number of ground control points, computes orthorectification parameters, and then corrects and merges the images according to those parameters to generate a spliced panoramic image of the whole scene.
An unmanned aerial vehicle aerial image sequence is characterized by a large number of images, multiple strips, small displacement between adjacent frames and high overlap, and because the unmanned aerial vehicle inevitably tilts and shakes during flight, a certain affine distortion exists between the captured images and the real scene. Existing splicing techniques cannot meet the demands of the unmanned aerial vehicle image sequence splicing task: when the flight distance is long and the number of images to be spliced grows, the splicing time increases sharply, and as errors accumulate the later splicing results become very unsatisfactory, with severe distortion; real-time operation, incremental operation, robustness and accuracy cannot be satisfied simultaneously.
Disclosure of Invention
The invention mainly aims to provide a real-time incremental splicing method for aerial images of an unmanned aerial vehicle, which solves the problems of the existing aerial image sequence splicing technology of the unmanned aerial vehicle.
The technical scheme adopted by the invention is as follows: an unmanned aerial vehicle aerial image real-time incremental splicing method comprises the following steps:
creating a new map template;
monitoring an aerial image storage folder of the unmanned aerial vehicle, and updating an image splicing sequence in real time;
setting parameters of a splicing route;
image splicing;
generating a new map;
and correcting the center offset.
Further, the creating a new map template includes:
obtaining affine transformation parameter information of the new map, using the WGS 84 geographic coordinate system, from the original map information or from given map scanning area information: the longitude and latitude coordinates of the top-left vertex of the new map and the longitude and latitude span parameters of the new map (the span parameters can be adjusted as required to set different map precision levels); the affine transformation parameters allow mutual conversion between pixel coordinates and longitude and latitude coordinates on the image, and the R, G and B three-band matrices of the map are initialized to 0;
converting the latitude and longitude coordinates into pixel coordinates as shown in the following formula:
col = (longitude - transform[0]) / transform[1]
row = (latitude - transform[3]) / transform[5]
wherein col is the column coordinate of the pixel point, row is the row coordinate, longitude is the longitude value of the pixel point, latitude is its latitude value, and transform holds the affine transformation parameters: transform[0] is the longitude of the upper-left corner, transform[1] is the horizontal resolution of the image (the longitude span of one pixel), transform[2] and transform[4] are rotation terms (0 for a north-up image), transform[3] is the latitude of the upper-left corner, and transform[5] is the vertical resolution of the image (the latitude span of one pixel); in the above formula the image is north-up, i.e. transform[2] and transform[4] are both 0.
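The conversion above, together with its inverse, can be sketched as follows; the helper names and the sample transform values are illustrative, not taken from the patent.

```python
# Minimal sketch of the pixel <-> longitude/latitude conversion implied by
# the affine transform parameters (GDAL-style geotransform layout assumed):
# transform = (lon_ul, lon_span_per_px, 0, lat_ul, 0, lat_span_per_px).
# Note lat_span_per_px is negative for a north-up image (latitude decreases
# as the row index grows).

def latlon_to_pixel(transform, longitude, latitude):
    """Convert a longitude/latitude pair to (col, row) pixel coordinates."""
    col = (longitude - transform[0]) / transform[1]
    row = (latitude - transform[3]) / transform[5]
    return int(round(col)), int(round(row))

def pixel_to_latlon(transform, col, row):
    """Inverse conversion: (col, row) back to (longitude, latitude)."""
    longitude = transform[0] + col * transform[1]
    latitude = transform[3] + row * transform[5]
    return longitude, latitude

# Example: a map whose upper-left corner is at (108.9 E, 34.3 N) with a
# span of 1e-5 degrees per pixel (values are illustrative).
transform = (108.9, 1e-5, 0.0, 34.3, 0.0, -1e-5)
col, row = latlon_to_pixel(transform, 108.905, 34.295)
```

Because the rotation terms transform[2] and transform[4] are 0 for a north-up map, the two axes decouple and each coordinate converts independently.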
Still further, the setting of the splicing route parameter includes:
setting the flight line angle parameter, the adjacent-frame moving distance parameter and the spliced image span parameter; combining the flight control data of the current image to be spliced and of the previous two frames, judging whether the unmanned aerial vehicle is flying in a uniform straight line along the preset route; if so, splicing the image, otherwise skipping it directly, thereby eliminating invalid interference images captured by the unmanned aerial vehicle before splicing starts and during turns.
Still further, the image stitching comprises:
and (3) image rotation correction: the orientation of the images to be spliced is not all positive north up, but along the current flight line direction, the orientation can be obtained by the uav _ yaw parameter in the flight control data, a rotation matrix is constructed by taking the central point of the images as the rotation center according to the uav _ yaw angle, the images to be spliced are corrected to be positive north up in a lossless manner, and the missing background is filled with black (0, 0);
the rotation matrix configuration is shown as follows:
Figure 100002_DEST_PATH_IMAGE001
wherein the content of the first and second substances,
Figure 137569DEST_PATH_IMAGE002
is a rotation angle (clockwise is positive) and->
Figure DEST_PATH_IMAGE003
Is a rotation center;
and (3) measuring geographic information: performing SURF feature point calculation and matching on a current image to be spliced and a previous image, adopting a KnnMatch matching strategy, enabling connecting lines between correctly matched feature points to be approximately parallel according to the sequence characteristics of an unmanned aerial vehicle aerial image, adopting an RANSAC algorithm to remove wrong matching points, calculating a homography matrix, converting the central point coordinate of the previous image into the current image to be spliced, and calculating the longitude and latitude span parameters of the image to be spliced according to the pixel coordinates and the longitude and latitude coordinates of the converted point and the two points at the central point of the image to be spliced;
the longitude and latitude span parameters are calculated as shown in the following formulas:

lon_span = (lon_m − lon_c) / (col_m − col_c)
lat_span = (lat_m − lat_c) / (row_m − row_c)

wherein (col_c, row_c) and (lon_c, lat_c) are the pixel coordinates and the longitude and latitude values of the center point of the image to be spliced, and (col_m, row_m) and (lon_m, lat_m) are the pixel coordinates and the longitude and latitude values of the feature-matched image center point after transformation into the image to be spliced;
image mosaic: calculating the specific position of the image to be spliced on the new map according to the longitude and latitude span parameters of the image to be spliced, the pixel coordinates and the longitude and latitude coordinates of the image center point, updating R, G and B three-band matrix information of the region, and removing the splicing seams;
the splicing seams are removed as shown in the following formula:

new_map(p) = I(p),         if the pixel point p belongs to the splicing area
new_map(p) = new_map(p),   otherwise

wherein I is the image to be spliced mapped onto the new map and p is a pixel point: if p belongs to the splicing area, its pixel value is updated; otherwise it is kept unchanged;
and repeating the steps until all the images are spliced.
Still further, the new map generation includes:
and after all the images are spliced, writing the R, G and B three-band matrix data into a new map file to generate a final jigsaw result in a GeoTIFF format.
Still further, the center shift correction includes:
the unmanned aerial vehicle is required to keep stable flight in the scanning process, namely the deviation between a pitch angle (pitch) and a roll angle is as small as possible, and the maximum deviation is not more than 3 degrees; finally, the uniform offset correction is carried out according to the bands.
The invention has the advantages that:
the method can realize fast, real-time, incremental, robust and efficient splicing of the aerial images of the unmanned aerial vehicle with multiple strips in an unknown regional environment;
the algorithm is modularized, and the splicing effect can be improved by subsequently improving a feature extraction method or a feature matching strategy without redesigning the algorithm flow and other modules;
the algorithm can automatically eliminate invalid interference images according to the set splicing route parameters.
The image splicing effect depends only on adjacent frames, so no accumulated error is introduced and the strips do not influence each other.
The generated map file has a higher precision level than the original map and can be used for other subsequent tasks.
The algorithm can correct the center offset to a certain extent, and the robustness of the algorithm is further improved.
In addition to the above-described objects, features and advantages, the present invention has other objects, features and advantages. The present invention will be described in further detail below with reference to the drawings.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this application, illustrate embodiments of the invention and, together with the description, serve to explain the invention and not to limit the invention.
FIG. 1 is a diagram of an example of image input according to the present invention;
FIG. 2 is a schematic diagram of an original map of the present invention;
FIG. 3 is an overall flow chart of the algorithm of the present invention;
FIG. 4 is a diagram of an example image rotation correction of the present invention;
FIG. 5 is a graph of SURF feature point calculation and matching results of the present invention;
FIG. 6 is a mosaic of images of the present invention;
fig. 7 is a graph of the image stitching result of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and do not limit the invention.
The input required by the algorithm is the unmanned aerial vehicle aerial image sequence and the corresponding flight control data, as shown in fig. 1. A frame is extracted every 0.5 s; the aerial images sent back by the unmanned aerial vehicle in real time and the corresponding flight control data are stored under corresponding file paths with uniform, aligned naming. The flight control data must include the longitude and latitude coordinates at the moment of shooting (i.e. the longitude and latitude of the center point of the aerial image) and the camera yaw angle (yaw), pitch angle (pitch) and roll angle (roll) at the moment of shooting. In addition, an original map can be used to specify the range of the image splicing area, as shown in fig. 2. The original map is not necessary: the range of the splicing area can instead be specified by giving the longitude and latitude coordinates of its upper-left and lower-right corners.
The overall flow design of the algorithm is shown in fig. 3.
(1) Creating a new map template (new_map.tif). Using the WGS 84 geographic coordinate system, the affine transformation parameter information of the new map (north-up) is obtained from the original map information or from given map area information (the longitude and latitude coordinates of the upper-left and lower-right corners of the map area): the longitude and latitude coordinates of the top-left vertex of the new map and the longitude and latitude span parameters of the new map (different map precision levels can be set by adjusting the span parameters as needed). The affine transformation parameters allow mutual conversion between pixel coordinates and longitude and latitude coordinates on the image. The R, G and B three-band matrices of the map are initialized to all black (0).
The longitude and latitude coordinates are converted into pixel coordinates as shown in formula (one):

col = (longitude - transform[0]) / transform[1]
row = (latitude - transform[3]) / transform[5]     formula (one)

wherein col is the column coordinate of the pixel, row is the row coordinate, longitude is the longitude value of the pixel, latitude is its latitude value, and transform holds the affine transformation parameters: transform[0] is the longitude of the upper-left corner, transform[1] is the horizontal resolution of the image (the longitude span of one pixel), transform[2] and transform[4] are rotation terms (0 for a north-up image), transform[3] is the latitude of the upper-left corner, and transform[5] is the vertical resolution of the image (the latitude span of one pixel). In formula (one) the image is north-up, i.e. transform[2] and transform[4] are both 0.
(2) Monitoring the aerial image storage folder of the unmanned aerial vehicle and updating the image splicing sequence in real time, realizing real-time incremental splicing.
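Step (2) does not prescribe a monitoring mechanism; one hypothetical way to watch the storage folder and grow the splicing sequence incrementally is a periodic scan, which suffices because frames arrive only every 0.5 s. The function and parameter names below are ours, not the patent's.

```python
# Illustrative polling sketch: call process(path) once for each new image
# file that appears in the folder, in sorted (arrival-order) filename order.
import os
import time

def watch_folder(folder, process, poll_interval=0.5, max_polls=None):
    """Scan `folder` repeatedly; hand each newly seen image file to `process`.

    max_polls=None polls forever (the normal watcher mode); a finite value
    is convenient for testing.
    """
    seen = set()
    polls = 0
    while max_polls is None or polls < max_polls:
        for name in sorted(os.listdir(folder)):
            if name not in seen and name.lower().endswith(('.jpg', '.png', '.tif')):
                seen.add(name)
                process(os.path.join(folder, name))
        polls += 1
        if max_polls is None or polls < max_polls:
            time.sleep(poll_interval)
    return seen
```

A production watcher might instead use inotify or a similar OS facility, but a 0.5 s poll matches the frame rate stated in the description.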
(3) Setting the splicing route parameters: the flight line angle parameter, the adjacent-frame moving distance parameter and the spliced image span parameter (images are spliced at a span of 10%). Combining the flight control data of the current image to be spliced and of the previous two frames, it is judged whether the unmanned aerial vehicle is flying at constant speed along the preset route; if so, the image is spliced, otherwise it is skipped directly, thereby eliminating invalid interference images captured before splicing starts and during turns.
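The constant-speed, on-route check of step (3) might look like the following sketch; the record fields and thresholds are assumptions, not values from the patent.

```python
# Hypothetical frame filter: accept the current image only if, judged from
# its flight-control record and the two preceding frames, the UAV is flying
# straight at roughly constant speed along the preset route.
import math

def accept_for_splicing(prev2, prev1, cur,
                        route_angle_deg, angle_tol_deg=5.0,
                        dist_ratio_tol=0.3, min_move=1e-6):
    """Each record is a dict with 'lat', 'lon', 'yaw' (degrees)."""
    def step(a, b):
        # Crude planar displacement in degrees; fine for a consistency check.
        return math.hypot(b['lon'] - a['lon'], b['lat'] - a['lat'])

    d1 = step(prev2, prev1)
    d2 = step(prev1, cur)
    if d1 < min_move or d2 < min_move:
        return False                      # hovering / not yet moving
    # Constant-speed check: adjacent displacements should be similar.
    if abs(d2 - d1) / max(d1, d2) > dist_ratio_tol:
        return False
    # Straight-line check: heading must stay near the preset route angle.
    for rec in (prev2, prev1, cur):
        if abs((rec['yaw'] - route_angle_deg + 180) % 360 - 180) > angle_tol_deg:
            return False                  # e.g. the UAV is turning
    return True
```

Rejecting frames whose heading drifts from the route angle is what discards the turning images between strips.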
(4) Image splicing algorithm:
1. Image rotation correction, as shown in fig. 4. The orientation of the images to be spliced is not always north-up but follows the current flight line direction, given by the uav_yaw parameter in the flight control data. Since the new map is north-up, a rotation matrix is constructed with the image center point as the rotation center according to the uav_yaw angle, the image to be spliced is losslessly corrected to north-up, and the missing background is filled with black (0, 0, 0).
The rotation matrix is constructed as shown in formula (two):

M = [ cos θ   −sin θ   (1 − cos θ)·c_x + sin θ·c_y ]
    [ sin θ    cos θ   −sin θ·c_x + (1 − cos θ)·c_y ]     formula (two)

wherein θ is the rotation angle (clockwise is positive) and (c_x, c_y) is the rotation center.
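Formula (two) can be exercised numerically as below; the helper is an illustrative NumPy sketch (clockwise-positive angle, image coordinates with y pointing down), not the patent's implementation.

```python
import numpy as np

def rotation_matrix(theta_deg, cx, cy):
    """2x3 affine matrix rotating by theta (clockwise positive, image
    coordinates with y down) about the center (cx, cy)."""
    t = np.deg2rad(theta_deg)
    c, s = np.cos(t), np.sin(t)
    return np.array([
        [c, -s, (1 - c) * cx + s * cy],
        [s,  c, -s * cx + (1 - c) * cy],
    ])

def apply(M, x, y):
    """Apply the 2x3 affine matrix to a point (x, y)."""
    return M @ np.array([x, y, 1.0])

# The rotation center is a fixed point of the transform, which is exactly
# what keeps the image center in place during the north-up correction.
M = rotation_matrix(30.0, cx=320.0, cy=240.0)
```

In practice such a matrix would be handed to a warp routine (e.g. an affine warp with the output canvas enlarged so the rotation is lossless and the uncovered background stays black).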
2. Geographic information measurement. SURF feature points are computed and matched between the current image to be spliced and a previous image (a previous image with about 70% overlap with the image to be spliced is selected), as shown in fig. 5, using the KnnMatch matching strategy. Owing to the characteristics of an unmanned aerial vehicle aerial image sequence, the connecting lines between correctly matched feature points are approximately parallel; the RANSAC algorithm is used to remove wrong matching points and the homography matrix is calculated. The center point coordinate of the previous image is transformed into the current image to be spliced, and the longitude and latitude span parameters of the image to be spliced are calculated from the pixel coordinates and the longitude and latitude coordinates of the transformed point and of the center point of the image to be spliced.
The longitude and latitude span parameters are calculated as shown in formula (three):

lon_span = (lon_m − lon_c) / (col_m − col_c)
lat_span = (lat_m − lat_c) / (row_m − row_c)     formula (three)

wherein (col_c, row_c) and (lon_c, lat_c) are the pixel coordinates and the longitude and latitude values of the center point of the image to be spliced, and (col_m, row_m) and (lon_m, lat_m) are the pixel coordinates and the longitude and latitude values of the feature-matched image center point after transformation into the image to be spliced.
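A numerical sketch of the geographic measurement of step 2, under the assumption that formula (three) divides the latitude/longitude difference between the two center points by their pixel offset; the homography and all coordinates below are made-up values, not data from the patent.

```python
import numpy as np

def transfer_point(H, x, y):
    """Map a point through a 3x3 homography (as produced by RANSAC fitting)."""
    v = H @ np.array([x, y, 1.0])
    return v[0] / v[2], v[1] / v[2]

def span_parameters(center_px, center_ll, matched_px, matched_ll):
    """Degrees per pixel along columns and rows, from the current image
    center and the previous image's center transferred into this image."""
    lon_span = (matched_ll[0] - center_ll[0]) / (matched_px[0] - center_px[0])
    lat_span = (matched_ll[1] - center_ll[1]) / (matched_px[1] - center_px[1])
    return lon_span, lat_span

# Illustrative numbers: the previous center lands 100 px left of and 200 px
# above the current center, and the flight-control fixes differ accordingly.
H = np.array([[1.0, 0.0, -100.0],
              [0.0, 1.0, -200.0],
              [0.0, 0.0, 1.0]])
mx, my = transfer_point(H, 320.0, 240.0)        # -> (220.0, 40.0)
lon_span, lat_span = span_parameters(
    (320.0, 240.0), (108.900, 34.300),           # current center px / lat-lon
    (mx, my),       (108.899, 34.302))           # transferred previous center
```

Because only the two flight-control fixes and one transferred point are needed, the span estimate stays local to the current pair of frames, which is what prevents error accumulation across the sequence.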
3. Image mosaicing, as shown in fig. 6. The specific position of the image to be spliced on the new map is calculated from its longitude and latitude span parameters and the pixel and longitude/latitude coordinates of its center point; the R, G and B three-band matrix information of that region is updated and the splicing seams are removed.
The splicing seams are removed as shown in formula (four):

new_map(p) = I(p),         if the pixel point p belongs to the splicing area
new_map(p) = new_map(p),   otherwise     formula (four)

wherein I is the image to be spliced mapped onto the new map and p is a pixel point: if p belongs to the splicing area, its pixel value is updated; otherwise it is kept unchanged.
4. Steps 1 to 3 are repeated until all the images are spliced.
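Steps 1 to 3 update the new map region by region. One hedged reading of the seam-removal rule in formula (four) is that only valid (non-black) pixels of the rotation-corrected image overwrite the map, so the black borders introduced by rotation never erase earlier content; that reading is our interpretation and can be sketched as:

```python
import numpy as np

def paste_region(new_map, img, top, left):
    """Write img into new_map at (top, left), but only where img has valid
    (non-black) pixels. The 'splicing area = non-black pixels' reading of
    formula (four) is an interpretation, not stated verbatim in the text."""
    h, w = img.shape[:2]
    region = new_map[top:top + h, left:left + w]   # view into the map
    mask = img.any(axis=2)                         # True where pixel != (0, 0, 0)
    region[mask] = img[mask]
    return new_map

new_map = np.full((4, 4, 3), 7, dtype=np.uint8)    # toy map, every band = 7
img = np.zeros((2, 2, 3), dtype=np.uint8)
img[0, 0] = (1, 2, 3)                              # one valid pixel
paste_region(new_map, img, top=1, left=1)
```

Masked assignment on a NumPy view updates the map in place, so the three-band matrices never need to be copied per frame.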
(5) Generating the new map. After all images are spliced, the R, G and B three-band matrix data are written into the new map file (new_map.tif), generating the final mosaic result in GeoTIFF format, as shown in fig. 7.
(6) Correcting the center offset. The unmanned aerial vehicle should fly stably during scanning, i.e. the deviations of the pitch angle (pitch) and the roll angle (roll) should be as small as possible, with a maximum of 3 degrees; otherwise the unmanned aerial vehicle tilts and shakes severely, the captured images suffer serious center offset and edge distortion, and the images show large affine distortion relative to the real scene, seriously degrading the splicing effect. In practice, when the deviation angle is small the edge distortion is slight and has little influence on the splicing effect, but the center offset (i.e. the longitude and latitude coordinates in the flight control data not corresponding to the image center point) still affects the splicing between strips. Therefore a uniform offset correction is finally performed strip by strip, further improving the splicing effect.
Aiming at the problems of the existing unmanned aerial vehicle aerial image sequence splicing technology and the characteristics of such image sequences, the invention provides an incremental, real-time splicing algorithm with the following properties.
Incremental real-time splicing: splicing can start while the unmanned aerial vehicle is performing its scanning task, all images need not be input at once, and the splicing progress can be displayed in real time.
Universality and robustness: the algorithm can handle splicing tasks in different areas and is not limited by flight time or flight distance.
Only the unmanned aerial vehicle aerial images and the corresponding flight control data are needed; ground control points and the like are not required.
The final result is a map file in GeoTIFF format, which can be used for other tasks (such as road network extraction and path planning), has a higher precision level than a satellite map, and allows the map precision level to be adjusted as required.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and should not be taken as limiting the scope of the present invention, which is intended to cover any modifications, equivalents, improvements, etc. within the spirit and scope of the present invention.

Claims (6)

1. The real-time incremental splicing method for the aerial images of the unmanned aerial vehicle is characterized by comprising the following steps of: creating a new map template;
monitoring an aerial image storage folder of the unmanned aerial vehicle, and updating an image splicing sequence in real time;
setting splicing route parameters;
image splicing;
generating a new map;
and correcting the center offset.
2. The real-time incremental stitching method for aerial images of unmanned aerial vehicle of claim 1, wherein the creating of the new map template comprises:
obtaining affine transformation parameter information of the new map, using the WGS 84 geographic coordinate system, from the original map information or from given map scanning area information: the longitude and latitude coordinates of the top-left vertex of the new map and the longitude and latitude span parameters of the new map (the span parameters can be adjusted as required to set different map precision levels); the affine transformation parameters allow mutual conversion between pixel coordinates and longitude and latitude coordinates on the image, and the R, G and B three-band matrices of the map are initialized to 0;
converting the latitude and longitude coordinates into pixel coordinates as shown in the following formula:
col = (longitude - transform[0]) / transform[1]
row = (latitude - transform[3]) / transform[5]
wherein col is the column coordinate of the pixel, row is the row coordinate, longitude is the longitude value of the pixel, latitude is its latitude value, and transform holds the affine transformation parameters: transform[0] is the longitude of the upper-left corner, transform[1] is the horizontal resolution of the image (the longitude span of one pixel), transform[2] and transform[4] are rotation terms (0 for a north-up image), transform[3] is the latitude of the upper-left corner, and transform[5] is the vertical resolution of the image (the latitude span of one pixel); in the above formula the image is north-up, i.e. transform[2] and transform[4] are both 0.
3. The real-time incremental stitching method for the aerial images of the unmanned aerial vehicle as claimed in claim 1, wherein the setting of the stitching route parameters comprises:
setting the flight line angle parameter, the adjacent-frame moving distance parameter and the spliced image span parameter; combining the flight control data of the current image to be spliced and of the previous two frames, judging whether the unmanned aerial vehicle is flying in a uniform straight line along the preset route; if so, splicing the image, otherwise skipping it directly, thereby eliminating invalid interference images captured by the unmanned aerial vehicle before splicing starts and during turns.
4. The real-time incremental stitching method for the aerial images of the unmanned aerial vehicle of claim 1, wherein the image stitching comprises:
and (3) image rotation correction: the orientation of the images to be spliced is not all positive north up, but along the current flight line direction, the orientation can be obtained by the uav _ yaw parameter in the flight control data, a rotation matrix is constructed by taking the central point of the images as the rotation center according to the uav _ yaw angle, the images to be spliced are corrected to be positive north up in a lossless manner, and the missing background is filled with black (0, 0);
the rotation matrix is constructed as shown in the following formula:

M = [ cos θ   −sin θ   (1 − cos θ)·c_x + sin θ·c_y ]
    [ sin θ    cos θ   −sin θ·c_x + (1 − cos θ)·c_y ]

wherein θ is the rotation angle (clockwise is positive) and (c_x, c_y) is the rotation center;
and (3) geographic information measurement: performing SURF feature point calculation and matching on a current image to be spliced and a previous image, adopting a KnnMatch matching strategy, enabling connecting lines between correctly matched feature points to be approximately parallel according to the sequence characteristics of an unmanned aerial vehicle aerial image, adopting an RANSAC algorithm to remove wrong matching points, calculating a homography matrix, converting the central point coordinate of the previous image into the current image to be spliced, and calculating the longitude and latitude span parameters of the image to be spliced according to the pixel coordinates and the longitude and latitude coordinates of the converted point and the two points at the central point of the image to be spliced;
the longitude and latitude span parameters are calculated as follows:

lon_span = (lon_2 − lon_1) / (x_2 − x_1)
lat_span = (lat_2 − lat_1) / (y_2 − y_1)

wherein (x_1, y_1) is the center point of the image to be stitched, (lon_1, lat_1) are its longitude and latitude values, (x_2, y_2) is the feature-matched image center point (the center of the previous image transformed into the current image), and (lon_2, lat_2) are its longitude and latitude values;
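The span computation above reduces to two finite differences (degrees per pixel along each axis); a minimal sketch, with all names illustrative:

```python
def latlon_span(center_px, center_ll, matched_px, matched_ll):
    """Degrees of longitude/latitude per pixel, from the stitched image's
    center point and the previous image's center transformed into it.
    center_px, matched_px: (x, y) pixel coordinates.
    center_ll, matched_ll: (lon, lat) in degrees."""
    (x1, y1), (lon1, lat1) = center_px, center_ll
    (x2, y2), (lon2, lat2) = matched_px, matched_ll
    if x2 == x1 or y2 == y1:
        raise ValueError("the two points must differ along both pixel axes")
    lon_span = (lon2 - lon1) / (x2 - x1)  # degrees of longitude per pixel in x
    lat_span = (lat2 - lat1) / (y2 - y1)  # degrees of latitude per pixel in y
    return lon_span, lat_span
```

With a north-up image, lat_span normally comes out negative because pixel rows increase downward while latitude increases northward.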
image mosaicing: calculating the specific position of the image to be stitched on the new map from its longitude and latitude span parameters and the pixel and longitude/latitude coordinates of its center point, updating the R, G and B band matrix information of that region, and removing the stitching seams;
the removal of the stitching seam is performed as follows:

P(x, y) = P_new(x, y), if (x, y) belongs to the stitching region
P(x, y) = P_old(x, y), otherwise

that is, if a pixel point belongs to the stitching region its pixel value is updated, otherwise it is kept unchanged;
and repeating the steps until all the images are spliced.
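The pixel-update rule amounts to a masked write into the mosaic's band matrices. A minimal NumPy sketch follows; the mask construction is an assumption (here "belongs to the stitching region" means the warped tile has valid, non-background pixels there, with pure black treated as the rotation fill from the correction step):

```python
import numpy as np

def update_mosaic(mosaic, new_tile, y0, x0):
    """Write new_tile (H, W, 3) into mosaic (rows, cols, 3) at offset
    (y0, x0), updating only pixels inside the stitching region; pure-black
    pixels (0, 0, 0) are treated as background fill and left unchanged."""
    h, w = new_tile.shape[:2]
    region = mosaic[y0:y0 + h, x0:x0 + w]        # a view: writes hit mosaic
    mask = np.any(new_tile != 0, axis=2)         # valid (non-background) pixels
    region[mask] = new_tile[mask]                # update only the stitching region
    return mosaic
```

Repeating this for each incoming frame is what makes the stitching incremental: every accepted image only touches its own region of the running mosaic.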
5. The real-time incremental stitching method for aerial images of an unmanned aerial vehicle according to claim 1, wherein the new map generation comprises:
after all the images are stitched, writing the R, G and B band matrix data into a new map file to generate the final mosaic result in GeoTIFF format.
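Writing a GeoTIFF requires a geotransform tying pixel coordinates to longitude/latitude. A sketch of constructing one from the mosaic's top-left corner and the span parameters, using GDAL's six-element convention (the actual file write, e.g. via `gdal.GetDriverByName('GTiff')`, is omitted here):

```python
def build_geotransform(top_left_lon, top_left_lat, lon_span, lat_span):
    """GDAL-style geotransform with no rotation terms, mapping pixel
    (col, row) to (lon, lat). lat_span is degrees per row moving down,
    so for a north-up mosaic it is normally negative."""
    return (top_left_lon, lon_span, 0.0, top_left_lat, 0.0, lat_span)

def pixel_to_lonlat(gt, col, row):
    """Apply the geotransform to a pixel position."""
    lon = gt[0] + col * gt[1] + row * gt[2]
    lat = gt[3] + col * gt[4] + row * gt[5]
    return lon, lat
```

The same geotransform is what lets each new frame's position on the mosaic be computed from its center longitude/latitude and the span parameters.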
6. The real-time incremental stitching method for aerial images taken by unmanned aerial vehicles according to claim 1, wherein the correction of center shift comprises:
the unmanned aerial vehicle is required to maintain stable flight during the scanning process, i.e. the deviations of the pitch angle and the roll angle are kept as small as possible, with a maximum deviation of no more than 3 degrees; finally, a uniform offset correction is performed band by band.
CN202211514624.6A 2022-11-30 2022-11-30 Real-time incremental splicing method for aerial images of unmanned aerial vehicle Pending CN115908136A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211514624.6A CN115908136A (en) 2022-11-30 2022-11-30 Real-time incremental splicing method for aerial images of unmanned aerial vehicle

Publications (1)

Publication Number Publication Date
CN115908136A true CN115908136A (en) 2023-04-04

Family

ID=86474347

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211514624.6A Pending CN115908136A (en) 2022-11-30 2022-11-30 Real-time incremental splicing method for aerial images of unmanned aerial vehicle

Country Status (1)

Country Link
CN (1) CN115908136A (en)

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination