CN114216485B - Image calibration method for aerial surveying and mapping of unmanned aerial vehicle - Google Patents

Image calibration method for aerial surveying and mapping of unmanned aerial vehicle Download PDF

Info

Publication number
CN114216485B
CN114216485B CN202210164317.3A CN202210164317A CN114216485B CN 114216485 B CN114216485 B CN 114216485B CN 202210164317 A CN202210164317 A CN 202210164317A CN 114216485 B CN114216485 B CN 114216485B
Authority
CN
China
Prior art keywords
image
mapping
corner
attitude angle
images
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN202210164317.3A
Other languages
Chinese (zh)
Other versions
CN114216485A (en
Inventor
刘晓夏
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangzhou Juntian Technology Co ltd
Original Assignee
Guangzhou Juntian Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangzhou Juntian Technology Co ltd filed Critical Guangzhou Juntian Technology Co ltd
Priority to CN202210164317.3A priority Critical patent/CN114216485B/en
Publication of CN114216485A publication Critical patent/CN114216485A/en
Application granted granted Critical
Publication of CN114216485B publication Critical patent/CN114216485B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C25/00Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C11/00Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C11/04Interpretation of pictures

Abstract

The invention relates to the technical field of unmanned aerial vehicle surveying and mapping, in particular to an image calibration method for unmanned aerial vehicle aerial photography surveying and mapping. The method comprises the following steps: carrying out affine transformation on each obtained initial mapping image by using the target attitude angle to obtain each first mapping image; obtaining fusion error distribution characteristics corresponding to each first coincident image; calculating a corner matching error estimator when each first coincident image is subjected to corner matching; constructing an affine transformation error distribution characteristic model according to the fusion error difference distribution characteristic of each first coincident image and the corner matching error estimator; calibrating the result by utilizing the fusion error distribution characteristics, the angular point matching error estimator and the affine transformation error distribution characteristic model of each first coincident image to obtain the accuracy degree of the attitude angle; and if the attitude angle accuracy is less than or equal to the threshold, adjusting the target attitude angle until the final attitude angle accuracy is greater than the threshold. The invention improves the accuracy of the mapping result.

Description

Image calibration method for aerial surveying and mapping of unmanned aerial vehicle
Technical Field
The invention relates to the technical field of unmanned aerial vehicle surveying and mapping, in particular to an image calibration method for unmanned aerial vehicle aerial photography surveying and mapping.
Background
The realization of the unmanned aerial vehicle aerial survey mapping technology has great significance for the requirements of a small-range and high-standard graph. However, the mapping result generally has some errors, which cause the coordinate of each position in the mapping result to be different from the actual coordinate, and the difference of the coordinate of one position can reach several meters or more than ten meters; when the accuracy requirement of the mapping result is high, such errors cannot be ignored, and error calibration is required.
One of the reasons for errors in the mapping results is: when a plurality of images collected by a camera are fused to obtain a surveying and mapping result, affine transformation needs to be carried out on the images collected by the camera, so that the collected images are in the same visual angle; obtaining the attitude angle of the camera by carrying out affine transformation, wherein the obtaining mode is as follows: acquiring an attitude angle of the camera through a holder control system of the camera; however, errors often exist between the acquired attitude angle of the camera and the true attitude angle of the camera, so that the finally obtained mapping result is inconsistent with the real scene, and the accuracy of the mapping result is influenced.
Disclosure of Invention
In order to solve the problem that the mapping result obtained by the aerial photography mapping of the unmanned aerial vehicle is inaccurate, the invention aims to provide an image calibration method for the aerial photography mapping of the unmanned aerial vehicle, and the adopted technical scheme is as follows:
the invention discloses an image calibration method for unmanned aerial vehicle aerial surveying and mapping, which comprises the following steps:
acquiring initial mapping images corresponding to all the visual field areas acquired in the aerial photography mapping process, and performing affine transformation on all the initial mapping images by using the target attitude angle to obtain all the first mapping images;
fusing the first superposed images to obtain fusion error distribution characteristics corresponding to the first superposed images, wherein the first superposed images are any two images with superposed visual fields in the first mapping images;
calculating a corner matching error estimator when each first coincident image is subjected to corner matching;
constructing an affine transformation error distribution characteristic model according to the fusion error distribution characteristics corresponding to the first coincident images and the corresponding corner matching error estimators;
calibrating an affine transformation result by using the fusion error distribution characteristics corresponding to the first coincident images, the corresponding angular point matching error estimator and the affine transformation error distribution characteristic model to obtain a first attitude angle accuracy degree; if the accuracy degree of the first attitude angle is less than or equal to the threshold value, adjusting the target attitude angle; carrying out affine transformation on each initial surveying image again by using the adjusted target attitude angle to obtain a corresponding second surveying image; and obtaining a second attitude angle accuracy degree according to each second mapping image, if the second attitude angle accuracy degree is less than or equal to a threshold value, continuing to adjust the target attitude angle until the final attitude angle accuracy degree is greater than the threshold value, and obtaining a mapping result by using the final target attitude angle.
Preferably, the method for obtaining the fusion error distribution characteristic corresponding to each first registration image includes:
obtaining all corner points of the two first mapping images corresponding to the first coincident image by using a corner point detection algorithm;
matching the corner points of the two first mapping images corresponding to the first superposed image by using a corner point matching algorithm to obtain a corresponding corner point pair;
and fitting fusion error distribution characteristics corresponding to the first coincident image according to the coordinates of the two corner points corresponding to each corner point pair.
Preferably, the method for calculating the corner matching error estimator when the first registration image is subjected to corner matching includes:
obtaining descriptors of two corner points in a corner point pair corresponding to the first coincident image;
calculating cosine similarity of descriptors corresponding to two corner points in the corner point pair;
and calculating the mean value of the cosine similarity corresponding to all the corner pairs corresponding to the first coincident image to obtain the corner matching error estimator corresponding to the first coincident image.
Preferably, the method for constructing an affine transformation error distribution feature model according to the fusion error distribution feature corresponding to each first registration image and the corresponding matching error estimator includes:
clustering the fusion vectors in the fusion error distribution characteristics corresponding to the first coincident image by using a clustering algorithm to obtain a plurality of categories;
acquiring the mean value of the fusion vectors in each category and the covariance matrix of the fusion vectors in each category corresponding to the first coincidence image;
constructing a multi-dimensional Gaussian model corresponding to each category according to the mean value of the fusion vector corresponding to each category in the first coincidence image and the covariance matrix of the fusion vector;
constructing a Gaussian mixture model corresponding to the first superposed image according to the multi-dimensional Gaussian models corresponding to the categories corresponding to the first superposed image;
and constructing an affine transformation error distribution characteristic model according to the Gaussian mixture model corresponding to each first coincident image and the corner matching error estimator corresponding to each first coincident image.
Preferably, the formula of the affine transformation error distribution characteristic model is as follows:
Figure 389904DEST_PATH_IMAGE001
wherein the content of the first and second substances,
Figure 694328DEST_PATH_IMAGE002
to map the distributed feature model of affine transformation errors,
Figure 615011DEST_PATH_IMAGE003
for the Gaussian mixture model corresponding to the ith first mapping image and the jth first mapping image,
Figure 88848DEST_PATH_IMAGE004
gaussian mixture model corresponding to ith first mapping image and mth first mapping image, and acquisition method and device thereof
Figure 231117DEST_PATH_IMAGE003
In the same way, Q is the number of all the first mapping images,
Figure 651251DEST_PATH_IMAGE005
in order to be a coefficient of rationality,
Figure 387126DEST_PATH_IMAGE006
estimating the amount of corner matching error corresponding to the ith first mapping image and the jth first mapping image,
Figure 913922DEST_PATH_IMAGE007
and estimating the corresponding corner matching error of the ith first mapping image and the mth first mapping image.
Preferably, the formula of the gaussian mixture model corresponding to the first registration image is as follows:
Figure 51642DEST_PATH_IMAGE008
wherein the content of the first and second substances,
Figure 123634DEST_PATH_IMAGE003
for the Gaussian mixture model corresponding to the ith first mapping image and the jth first mapping image,
Figure 471439DEST_PATH_IMAGE009
the number of the vector classes to be fused,
Figure 677293DEST_PATH_IMAGE010
multidimensional Gaussian models of the ith first mapping image and the pth category in the jth first mapping image.
Preferably, the calculation formula of the corner matching error estimator corresponding to the first registration image is as follows:
Figure 777842DEST_PATH_IMAGE011
wherein the content of the first and second substances,
Figure 676527DEST_PATH_IMAGE012
estimating the amount of corner matching error corresponding to the ith first mapping image and the jth first mapping image,
Figure 246049DEST_PATH_IMAGE013
and averaging the cosine similarity of all the corner point pairs in the ith first mapping image and the jth first mapping image.
The invention has the following beneficial effects:
the method comprises the steps of fusing images with coincident views in each first mapping image to obtain fusion error distribution characteristics corresponding to each fused first coincident image, then obtaining corner matching error estimators when corner matching is carried out on each first coincident image, and finally constructing an affine transformation error distribution characteristic model according to the fusion error distribution characteristics corresponding to each first coincident image and the corresponding corner matching error estimators; the method comprises the steps of calibrating a current affine transformation result by utilizing a constructed affine transformation error distribution characteristic model to obtain the accuracy degree of an attitude angle, adjusting a target attitude angle if the accuracy degree of the attitude angle is less than or equal to a threshold value until the accuracy degree of the final attitude angle is greater than the threshold value, and judging that the final target attitude angle meets the standard. According to the method, the error of the attitude angle of the camera is eliminated according to the error distribution characteristics in the image fusion process, the calibration of the affine transformation result of the image is realized, and the accuracy of the mapping result is improved.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions and advantages of the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, it is obvious that the drawings in the following description are only some embodiments of the present invention, and other drawings can be obtained by those skilled in the art without creative efforts.
Fig. 1 is a flowchart of an image calibration method for unmanned aerial vehicle aerial surveying and mapping according to an embodiment of the present invention.
Detailed Description
To further illustrate the technical means and functional effects of the present invention adopted to achieve the predetermined purpose, the following detailed description of an image calibration method for aerial surveying and mapping of an unmanned aerial vehicle according to the present invention is provided with the accompanying drawings and the preferred embodiments.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs.
The main conception of the invention is as follows: according to the error distribution characteristics in the image fusion process and the constructed affine transformation error distribution characteristic model, the result of affine transformation of the image by using the target attitude angle is calibrated, and the target attitude angle is continuously adjusted, so that errors caused by inaccurate camera attitude angle can be eliminated to the maximum extent, and the accuracy of the mapping result is improved.
The following describes a specific scheme of the image calibration method for unmanned aerial vehicle aerial surveying and mapping in detail with reference to the accompanying drawings.
An embodiment of an image calibration method for unmanned aerial vehicle aerial surveying and mapping comprises the following steps:
as shown in fig. 1, an image calibration method for unmanned aerial vehicle aerial surveying and mapping of the present embodiment includes the following steps.
And step S1, obtaining initial mapping images corresponding to each field of view region acquired in the aerial photography mapping process, and carrying out affine transformation on each initial mapping image by using the target attitude angle to obtain each first mapping image.
When unmanned aerial vehicle aerial photography survey and drawing, many images are gathered along the airline to unmanned aerial vehicle, gather after many images to fuse these images, obtain the overhead view panorama in an area, can show the actual coordinate in every position on this overhead view panorama, mapping result promptly.
The error of the mapping result is mainly generated from the process of image fusion, the images are image data of different positions in the mapping area, the mapping result can be obtained by only fusing the images together and then performing some processing, and the process of the image fusion is specifically as follows:
since the camera on the unmanned aerial vehicle collects images from an oblique viewing angle, each image is not a top viewing angle, and in order to obtain a top panoramic view of an area, affine transformation is firstly performed on the images, namely, the images are changed from the oblique viewing angle to the top viewing angle by using an affine transformation algorithm. When affine transformation is performed on an image, attitude data of a camera during image acquisition is required, and the attitude data refers to an attitude angle (e.g., an elevation angle) of the camera, which can be obtained by a pan-tilt control system of the camera. In this embodiment, the acquired camera pose angle is recorded as a target pose angle, and the target pose angle is used for performing affine transformation on an image.
After all the images are changed from an oblique view angle to a top view angle, a common view field exists between some images, for two images with the common view field, angular points of the two images are obtained by using an angular point detection algorithm, then the angular points on the two images are matched by using an angular point matching algorithm, the two images are spliced and fused according to the matching result of the angular points, and then the two images are fused together. And finally fusing all the images together by using the same method to further obtain a mapping result.
The process of image fusion in this embodiment is the prior art, and is not described herein again.
According to the above, the images are subjected to affine transformation by using the target attitude angles and then fused together through corner matching to obtain a fusion result, in the process, the real attitude angle of the camera and the obtained target attitude angle may have a certain difference, and the corner matching between the images may also have errors, so that the combination of the two errors can cause errors in the mapping result.
In the embodiment, all initial mapping images are acquired first, and then affine transformation is performed on all the initial mapping images by using the target attitude angle, so that the image of the squint viewing angle is transformed into the image of the overlook viewing angle, and a corresponding first mapping image is obtained.
And step S2, fusing the first coincident images to obtain fusion error distribution characteristics corresponding to the first coincident images, wherein the first coincident images are any two images with coincident view fields in the first mapping images.
In the present embodiment, step S1 acquires all the initial mapping images, and these initial mapping images become the top view angle image, i.e., the first mapping image, by affine transformation. After affine transformation, angular point matching is needed, and the two matched angular points are processed to enable coordinates of the two angular points to be equal, so that image fusion is realized, and the method specifically comprises the following steps:
if two images are acquired at adjacent positions, the two first mapping images have overlapped visual field areas, and any two first mapping images with overlapped visual field areas are recorded as a first overlapped image in the embodiment. In this embodiment, assuming that the ith first mapping image and the jth first mapping image have overlapping fields of view, a SIFT corner point detection algorithm is respectively used for the two first mapping images to obtain all corner points of the ith first mapping image and all corner points on the jth first mapping image; then, using a normalized product correlation algorithm (NCC algorithm) to match the corners on the two graphs, a plurality of corner pairs are obtained, and this embodiment assumes that N corner pairs are obtained.
One corner point of the corner point pair is from the ith first mapping image and corresponds to a pixel coordinate on the ith first mapping image; the other corner point is from the jth first mapping image and corresponds to a pixel coordinate on the jth first mapping image. If it is desired to fuse the two images together, it is first necessary to let the ith image undergo translation and scaling so that the pixel coordinates of the two corner points in each corner point pair are equal. The two images are then fused into one large image using a pyramid-based image fusion algorithm.
In this embodiment, the method for making the pixel coordinates of two corner points in each corner point pair equal specifically includes: let two corner point coordinates of the ith pair of the corner point pairs respectively be
Figure 599801DEST_PATH_IMAGE014
And
Figure 712114DEST_PATH_IMAGE015
Figure 375176DEST_PATH_IMAGE014
and
Figure 684191DEST_PATH_IMAGE016
are respectively regarded as two-dimensional vectors, in this embodiment
Figure 231847DEST_PATH_IMAGE014
And
Figure 323300DEST_PATH_IMAGE015
the values of (a) are known; assuming that S is an unknown 2 × 2 scaling matrix and D is an unknown two-dimensional displacement vector, this embodiment constructs a method of translation and scaling according to S and D:
Figure 563788DEST_PATH_IMAGE017
the above equation is a linear model containing unknown parameters S and D, and this embodiment uses all pairs of corner points in the first coincident image as sample data, i.e. set M
Figure 858635DEST_PATH_IMAGE018
And fitting values of S and D according to the sample data by using a RANSAC algorithm.
In the embodiment, any subset in the set M is obtained, and then the subset is also used as sample data, and a group of scaling matrixes and displacement vectors corresponding to the subset are fitted again by using a least square method; for example, in this embodiment, K subsets are randomly selected from the set M, and then K groups of scaling matrices and displacement vectors corresponding to the K subsets can be obtained. In this embodiment, for convenience of subsequent analysis, a group of scaling matrices and displacement vectors are spliced into a one-dimensional vector (i.e., the scaling matrix is flattened into a one-dimensional vector and then spliced with the corresponding displacement vector), and this embodiment records such a vector as a fusion vector. In this embodiment, K +1 fusion vectors are obtained, which include K fusion vectors that can be obtained by K subsets, and one fusion vector that is obtained as a whole.
This embodiment considers that ideally, if there is no error, then the K +1 fused vectors should be the same; however, due to errors, these fused vectors may be inconsistent and different, and this embodiment refers to such errors as fusion errors, and these errors are caused by superposition of errors in affine transformation and errors in corner matching.
This embodiment refers to the set of K +1 fusion vectors as the fusion error distribution characteristics corresponding to the ith and jth first mapping images
Figure 69036DEST_PATH_IMAGE019
. Then the same embodiment can be obtainedAnd fusion error distribution characteristics when all the first coincidence images are fused.
Step S3, a corner matching error estimator is calculated when each first registration image is subjected to corner matching.
The angular points of the two images need to be matched before the first mapping image is fused, but on one hand, because angular point detection has errors, for example, some images have low quality, some angular points cannot be detected; on the other hand, errors occur when some corner points are matched again, for example, two corner points which are not matched together are matched together, or the corner points which should be matched together are not matched together; these two factors add up to cause corner matching errors. In this embodiment, the following step is to calculate the magnitude of the corner matching error, specifically:
in this embodiment, first, all corner point pairs matched on the ith first mapping image and the jth first mapping image, that is, all corner point pairs corresponding to the first registration image are obtained, each corner point corresponds to one descriptor, the descriptor is a vector and is used for describing features of the corner point, the same corner points have the same descriptor, and a SIFT corner point descriptor is used in this embodiment.
Then obtaining descriptors of two corner points in each corner point pair, and obtaining cosine similarity of the two descriptors, namely cosine similarity corresponding to the corner point pairs; calculating the mean value of the cosine similarity corresponding to all the angle point pairs, and recording the mean value as
Figure 624782DEST_PATH_IMAGE020
In this embodiment
Figure 144494DEST_PATH_IMAGE021
Wherein
Figure 175904DEST_PATH_IMAGE022
And estimating the error quantity of corner matching corresponding to the ith first mapping image and the jth first mapping image, wherein the error quantity is used for representing the error quantity when the corners are matched.
This embodiment considers that in an ideal case, the corner points matched as a pair shouldThe descriptors have the same descriptor, but in practice the descriptors are different due to the presence of matching errors; the larger the matching error, the larger the descriptor difference of the corner points matched into a pair,
Figure 65363DEST_PATH_IMAGE013
the smaller, i.e.
Figure 616561DEST_PATH_IMAGE006
The larger, and therefore the more available
Figure 198852DEST_PATH_IMAGE022
To represent the magnitude of the error of the corner matching, i.e. the matching error estimator.
And step S4, constructing an affine transformation error distribution characteristic model according to the fusion error distribution characteristics corresponding to the first coincident images and the corresponding corner matching error estimators.
Considering that affine transformation of an oblique-view image into an overhead-view image using a target attitude angle is also likely to have errors, but such errors cannot be determined; the present embodiment can know that the distribution of the error is the same for each image, that is, the distribution characteristics of the affine transformation error of the ith first mapping image and the distribution characteristics of the affine transformation error of the jth first mapping image are the same, because the error depends only on the attitude angle of the camera. The fusion error is a superposition result of the matching error and the affine transformation error, in other words, the fusion vectors in the error distribution feature are not gathered together and have a difference due to the existence of the matching error and the affine transformation error. According to this embodiment, the distribution of affine transformation errors is analyzed, specifically:
firstly, acquiring fusion error distribution characteristics corresponding to the ith first mapping image and the jth first mapping image
Figure 717558DEST_PATH_IMAGE019
Which is a collection of multiple fused vectors. The embodiment utilizes a mean shift clustering algorithm pair
Figure 410707DEST_PATH_IMAGE019
The fusion vectors in (1) are clustered to obtain a plurality of corresponding categories; the present embodiment assumes that P classes (P may be equal to 1) are obtained, each class is a set of some fused vectors, and the fused vectors in the same class are distributed together in a set, which indicates that the error between the fused vectors is small.
Then, the P-th category is obtained from the P categories, the mean value of the fusion vector in the category and the covariance matrix of the fusion vector in the category are obtained, and a multi-dimensional Gaussian model is constructed by the mean value and the covariance matrix
Figure 46438DEST_PATH_IMAGE023
(the dimensions of this Gaussian model are the same as those of the fusion vector), then based on
Figure 924265DEST_PATH_IMAGE019
Constructing a Gaussian mixture model by using multi-dimensional Gaussian models corresponding to various categories
Figure 540054DEST_PATH_IMAGE024
Wherein
Figure 912260DEST_PATH_IMAGE025
The multidimensional Gaussian models of the pth category in the ith first mapping image and the jth first mapping image are obtained, and P is the number of the fusion vector categories; in this embodiment, the gaussian mixture model is used to represent the distribution of the fusion error corresponding to the ith first mapping image and the jth first mapping image.
Finally, in this embodiment, according to the gaussian mixture models corresponding to all the first coincidence images in all the first mapping images and the corresponding angular point matching error estimators, a distribution characteristic model of mapping affine transformation errors is constructed as follows:
Figure 562684DEST_PATH_IMAGE026
wherein the content of the first and second substances,
Figure 80253DEST_PATH_IMAGE003
the Gaussian mixture model corresponding to the ith first mapping image and the jth first mapping image represents the distribution condition of the corresponding fusion error;
Figure 822819DEST_PATH_IMAGE004
for the Gaussian mixture model corresponding to the ith and mth first mapping images, the distribution of the corresponding fusion error is represented, and the acquisition method and the system thereof
Figure 592192DEST_PATH_IMAGE003
The same is true; q is the number of all first mapping images,
Figure 487336DEST_PATH_IMAGE005
in order to be a coefficient of rationality,
Figure 926539DEST_PATH_IMAGE022
estimating the amount of corner matching error corresponding to the ith first mapping image and the jth first mapping image,
Figure 782499DEST_PATH_IMAGE007
estimating the corresponding corner matching error of the ith first mapping image and the mth first mapping image;
Figure 480197DEST_PATH_IMAGE027
the ith first mapping image and the jth first mapping image are two different images;
Figure 950886DEST_PATH_IMAGE028
it is explained that the ith first mapping image and the mth first mapping image are two different images.
Figure 951203DEST_PATH_IMAGE029
Is a rationality coefficient, namely the rationality of the fusion of the two images and is used for reflecting whether the two images can be fused or notAnd (6) mixing. When the ith first mapping image and the jth first mapping image have the same visual field and can be fused, and the ith first mapping image and the mth first mapping image have the same visual field and can be fused
Figure 887935DEST_PATH_IMAGE030
(ii) a When the ith first mapping image and the jth first mapping image do not have the same visual field and cannot be merged, or the ith first mapping image and the mth first mapping image do not have the same visual field and cannot be merged
Figure 405635DEST_PATH_IMAGE031
In the embodiment, the fusion error of the ith first mapping image and the jth first mapping image is the superposition of the corner matching error and the affine transformation error of the ith first mapping image and the jth first mapping image; similarly, the fusion error of the ith first mapping image and the mth first mapping image is the superposition of the matching error and the affine transformation error of the ith first mapping image and the mth first mapping image; the present embodiment considers that the affine transformation errors of the ith first mapping image and the jth first mapping image are the same as the affine transformation error distributions of the ith first mapping image and the mth first mapping image.
If only affine transformation error exists, the distribution of the fusion error of the ith first mapping image and the jth first mapping image is the same as that of the fusion error of the ith first mapping image and the mth first mapping image, namely the corresponding Gaussian mixture model is the same and is equal to the affine transformation error; if only the corner matching error exists, the distribution of the fusion error of the ith first mapping image and the jth first mapping image
Figure 885158DEST_PATH_IMAGE003
And the fusion error distribution of the ith first mapping image and the mth first mapping image
Figure 181010DEST_PATH_IMAGE004
The method is different because the images are different and the error distribution conditions of corner matching are different; when the two are present at the same time,
Figure 854306DEST_PATH_IMAGE003
and
Figure 34751DEST_PATH_IMAGE004
the same part may represent affine transformation errors to some extent, and the different part may represent corner matching errors to some extent.
Figure 227835DEST_PATH_IMAGE032
The representation simultaneously appears
Figure 445321DEST_PATH_IMAGE003
And
Figure 232011DEST_PATH_IMAGE004
the distribution of the fusion error is characterized by
Figure 75202DEST_PATH_IMAGE003
And
Figure 634710DEST_PATH_IMAGE004
the distribution situation of the fusion errors of the same part, namely the probability that only affine errors exist, can describe the distribution situation of the affine transformation errors to a certain extent.
If it is not
Figure 147731DEST_PATH_IMAGE012
The smaller the angle point matching error estimation quantity of the ith first mapping image and the jth first mapping image is, the smaller the angle point matching error estimation quantity is, and the more accurate the angle point matching result of the two images is; in the same way, if
Figure 811930DEST_PATH_IMAGE007
The smaller the angle point matching error estimates of the ith first mapping image and the mth first mapping image are, the more accurate the angle point matching results of the two images are.Then when
Figure 943966DEST_PATH_IMAGE033
The smaller the matching error between the ith and jth first mapping images and between the ith and mth first mapping images is, the smaller the matching error is, and then
Figure 252587DEST_PATH_IMAGE032
The more the distribution of affine transformation errors can be described.
This embodiment is as follows
Figure 795564DEST_PATH_IMAGE034
Is a weight pair
Figure 665169DEST_PATH_IMAGE032
The result obtained by the weighted summation is the distribution characteristic F of the affine transformation error. In the formula
Figure 991108DEST_PATH_IMAGE035
Is a normalized coefficient.
The F constructed in this embodiment may also be regarded as a gaussian mixture model, the argument of which is a fusion vector, and when a fusion vector is input to F, a probability may be obtained, and the larger the obtained probability value is, the more accurate the fusion vector is, the smaller the error is, and the closer to the true value is. The distribution characteristic models of mapping affine transformation errors constructed under different mapping conditions are different, and the constructed models are different because unmanned aerial vehicle equipment or natural environments in different mappings are different.
Step S5, calibrating an affine transformation result by using the fusion error distribution characteristics corresponding to each first coincidence image, the corresponding corner matching error estimator and the affine transformation error distribution characteristic model to obtain a first attitude angle accuracy degree; if the accuracy degree of the first attitude angle is less than or equal to the threshold value, adjusting the target attitude angle; carrying out affine transformation on each initial surveying image again by using the adjusted target attitude angle to obtain a corresponding second surveying image; and obtaining a second attitude angle accuracy degree according to each second mapping image, if the second attitude angle accuracy degree is less than or equal to a threshold value, continuing to adjust the target attitude angle until the final attitude angle accuracy degree is greater than the threshold value, and obtaining a mapping result by using the final target attitude angle.
In this embodiment, the initially obtained target attitude angle is often inaccurate, so that the target attitude angle needs to be adjusted, the adjusted attitude angle is used to perform affine transformation on the ith image again, the oblique-view image is changed into the top-view image, and the affine transformation result is corrected, where the specific method is as follows:
the attitude angle of the camera is the same when the camera collects each initial mapping image, that is, the target attitude angle is the same, and this embodiment records the initially obtained target attitude angle as
Figure 13291DEST_PATH_IMAGE036
The value is read from the camera-head control system, but it is subject to possible errors depending on the accuracy of the head system on the one hand and on environmental disturbances on the other hand; therefore, if there is an error, the present embodiment will be right
Figure 9060DEST_PATH_IMAGE036
And performing corresponding adjustment. The present example considers the true value to be
Figure 257638DEST_PATH_IMAGE037
Figure 980744DEST_PATH_IMAGE038
Is a scalar quantity, in this embodiment
Figure 375209DEST_PATH_IMAGE039
Is a value randomly sampled from a standard normal distribution.
In this embodiment, firstly, the affine transformation is calibrated by using the fusion error distribution characteristics corresponding to each first registration image, the corner matching error estimator and the affine transformation error distribution characteristic model, specifically: inputting the corner matching error estimator corresponding to each first coincidence image and all the fusion vectors in the corresponding fusion error distribution characteristics into F to obtain a plurality of probabilities, and calculating the mean value of the probabilities, wherein the mean value of the probabilities represents the accuracy corresponding to the current target attitude angle, namely the accuracy of the attitude angle, and the accuracy of the attitude angle obtained by each first coincidence image is recorded as the accuracy of the first attitude angle in the embodiment; if the accuracy degree of the first attitude angle is greater than the threshold value, judging that the current target attitude angle is close to the true value; if the accuracy degree of the first attitude angle is smaller than or equal to the threshold value, the current target attitude angle is judged to be inaccurate, and therefore the target attitude angle needs to be adjusted.
This embodiment will be described
Figure 666513DEST_PATH_IMAGE040
The adjusted target attitude angle is used as the adjusted target attitude angle, affine transformation is carried out on all initial mapping images again by using the adjusted target attitude angle to obtain corresponding second mapping images, then fusion error distribution characteristics and corner matching error estimators of the ith second mapping image and the jth second mapping image are obtained again according to the methods of the step S2 and the step S3, namely, each second superposed image corresponds to one fusion error distribution characteristic and one corner matching error estimator, and the second superposed image is two images with superposed visual fields in each second mapping image; and calibrating by using the fusion error distribution characteristics corresponding to the second superposed images and the corresponding angular point matching error estimators and combining an affine transformation error distribution characteristic model, and calculating the accuracy of the second attitude angle.
Judging whether the accuracy degree of the obtained second attitude angle is greater than a threshold value, and if so, judging that the current adjusted target attitude angle is close to a true value; if the target attitude angle is less than or equal to the threshold value, the target attitude angle is continuously adjusted, namely the target attitude angle is randomly sampled from the standard normal distribution again to obtain the target attitude angle again
Figure 527022DEST_PATH_IMAGE038
By using
Figure 804551DEST_PATH_IMAGE038
The target attitude angle is adjusted, affine transformation is performed on the initial surveying and mapping image by using the adjusted target attitude angle until the final calibration result (the accuracy degree of the attitude angle) is greater than a threshold, and the threshold can be set according to actual needs in the embodiment.
What this embodiment is expected to obtain
Figure 676692DEST_PATH_IMAGE041
The accuracy degree of the corresponding attitude angle can be made as large as possible, so that the fusion vector obtained during fusion among all images has the error as small as possible, the interference of the error of the attitude angle is avoided, and the mapping result obtained by fusion is more accurate.
Therefore, when the accuracy degree of the obtained attitude angle is greater than the threshold value, the target attitude angle at the moment can be judged to be close to the true value, namely the result of affine transformation is close to the true condition; in this embodiment, affine transformation may be performed on each initial mapping image by using the finally obtained target attitude angle (the attitude angle close to the true value after calibration is determined) to obtain the final mapping result.
The embodiment is equivalent to continuously adjusting the target attitude angle, continuously performing corresponding affine transformation on all initial mapping images, and continuously calibrating the result of affine transformation until the accuracy degree of the attitude angle is greater than the threshold value.
In the embodiment, images with coincident views in each first mapping image are fused to obtain fusion error distribution characteristics corresponding to each fused first coincident image, then, the corner matching error estimator when each first coincident image is subjected to corner matching is obtained, and finally, an affine transformation error distribution characteristic model is constructed according to the fusion error distribution characteristics corresponding to each first coincident image and the corresponding corner matching error estimator; in this embodiment, the current affine transformation result is calibrated by using the constructed affine transformation error distribution feature model to obtain the accuracy of the attitude angle, and if the accuracy of the attitude angle is less than or equal to the threshold, the target attitude angle is adjusted until the accuracy of the final attitude angle is greater than the threshold, and it is determined that the final target attitude angle meets the standard. According to the method and the device, the error of the attitude angle of the camera is eliminated according to the error distribution characteristics in the image fusion process, the calibration of the image affine transformation result is realized, and the accuracy of the mapping result is improved.
It should be noted that: the above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like that fall within the spirit and principle of the present invention are intended to be included therein.

Claims (7)

1. An image calibration method for aerial surveying and mapping of unmanned aerial vehicles, characterized in that the method comprises the following steps:
acquiring initial mapping images corresponding to all the visual field areas acquired in the aerial photography mapping process, and performing affine transformation on all the initial mapping images by using the target attitude angle to obtain all the first mapping images;
fusing the first superposed images to obtain fusion error distribution characteristics corresponding to the first superposed images, wherein the first superposed images are any two images with superposed visual fields in the first mapping images;
calculating a corner matching error estimator when each first coincident image is subjected to corner matching;
constructing an affine transformation error distribution characteristic model according to the fusion error distribution characteristics corresponding to the first coincident images and the corresponding corner matching error estimators;
calibrating an affine transformation result by using the fusion error distribution characteristics corresponding to the first coincident images, the corresponding angular point matching error estimator and the affine transformation error distribution characteristic model to obtain a first attitude angle accuracy degree; if the accuracy degree of the first attitude angle is less than or equal to the threshold value, adjusting the target attitude angle; carrying out affine transformation on each initial surveying image again by using the adjusted target attitude angle to obtain a corresponding second surveying image; and obtaining a second attitude angle accuracy degree according to each second mapping image, if the second attitude angle accuracy degree is less than or equal to a threshold value, continuing to adjust the target attitude angle until the final attitude angle accuracy degree is greater than the threshold value, and obtaining a mapping result by using the final target attitude angle.
2. The image calibration method for unmanned aerial vehicle aerial surveying and mapping according to claim 1, wherein the method of obtaining the fusion error distribution characteristic corresponding to each first registration image comprises:
obtaining all corner points of the two first mapping images corresponding to the first coincident image by using a corner point detection algorithm;
matching the corner points of the two first mapping images corresponding to the first superposed image by using a corner point matching algorithm to obtain a corresponding corner point pair;
and fitting fusion error distribution characteristics corresponding to the first coincident image according to the coordinates of the two corner points corresponding to each corner point pair.
3. The method of claim 2, wherein the method of calculating the estimate of the corner matching error when the first registered image is corner matched comprises:
obtaining descriptors of two corner points in a corner point pair corresponding to the first coincident image;
calculating cosine similarity of descriptors corresponding to two corner points in the corner point pair;
and calculating the mean value of the cosine similarity corresponding to all the corner pairs corresponding to the first coincident image to obtain the corner matching error estimator corresponding to the first coincident image.
4. The image calibration method for unmanned aerial vehicle aerial surveying and mapping according to claim 1, wherein the method for constructing the affine transformation error distribution feature model according to the fusion error distribution feature corresponding to each first registration image and the corresponding matching error estimator comprises:
clustering the fusion vectors in the fusion error distribution characteristics corresponding to the first coincident image by using a clustering algorithm to obtain a plurality of categories;
acquiring the mean value of the fusion vectors in each category and the covariance matrix of the fusion vectors in each category corresponding to the first coincidence image;
constructing a multi-dimensional Gaussian model corresponding to each category according to the mean value of the fusion vector corresponding to each category in the first coincidence image and the covariance matrix of the fusion vector;
constructing a Gaussian mixture model corresponding to the first superposed image according to the multi-dimensional Gaussian models corresponding to the categories corresponding to the first superposed image;
and constructing an affine transformation error distribution characteristic model according to the Gaussian mixture model corresponding to each first coincident image and the corner matching error estimator corresponding to each first coincident image.
5. The image calibration method for unmanned aerial vehicle aerial surveying and mapping of claim 4, wherein the affine transformation error distribution feature model has a formula as follows:
Figure 209850DEST_PATH_IMAGE001
wherein the content of the first and second substances,
Figure 292076DEST_PATH_IMAGE002
to map the distributed feature model of affine transformation errors,
Figure 484547DEST_PATH_IMAGE003
for the Gaussian mixture model corresponding to the ith first mapping image and the jth first mapping image,
Figure 804670DEST_PATH_IMAGE004
gaussian mixture model corresponding to ith first mapping image and mth first mapping image, and acquisition method and device thereof
Figure 811678DEST_PATH_IMAGE003
In the same way, Q is the number of all the first mapping images,
Figure 115621DEST_PATH_IMAGE005
in order to be a coefficient of rationality,
Figure 872968DEST_PATH_IMAGE006
estimating the amount of corner matching error corresponding to the ith first mapping image and the jth first mapping image,
Figure 329488DEST_PATH_IMAGE007
and estimating the corresponding corner matching error of the ith first mapping image and the mth first mapping image.
6. The image calibration method for unmanned aerial vehicle aerial surveying and mapping of claim 5, wherein the formula of the Gaussian mixture model corresponding to the first coincidence image is as follows:
Figure 992550DEST_PATH_IMAGE008
wherein the content of the first and second substances,
Figure 190314DEST_PATH_IMAGE003
for the Gaussian mixture model corresponding to the ith first mapping image and the jth first mapping image,
Figure 580713DEST_PATH_IMAGE009
the number of the vector classes to be fused,
Figure 406586DEST_PATH_IMAGE010
is as followsThe multi-dimensional Gaussian models of the p-th category in the i first mapping image and the j first mapping image.
7. The method according to claim 3, wherein the calculation formula of the corner matching error estimator corresponding to the first registration image is as follows:
Figure 725703DEST_PATH_IMAGE011
wherein the content of the first and second substances,
Figure 269817DEST_PATH_IMAGE006
estimating the amount of corner matching error corresponding to the ith first mapping image and the jth first mapping image,
Figure 621164DEST_PATH_IMAGE012
and averaging the cosine similarity of all the corner point pairs in the ith first mapping image and the jth first mapping image.
CN202210164317.3A 2022-02-23 2022-02-23 Image calibration method for aerial surveying and mapping of unmanned aerial vehicle Expired - Fee Related CN114216485B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210164317.3A CN114216485B (en) 2022-02-23 2022-02-23 Image calibration method for aerial surveying and mapping of unmanned aerial vehicle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210164317.3A CN114216485B (en) 2022-02-23 2022-02-23 Image calibration method for aerial surveying and mapping of unmanned aerial vehicle

Publications (2)

Publication Number Publication Date
CN114216485A CN114216485A (en) 2022-03-22
CN114216485B true CN114216485B (en) 2022-04-29

Family

ID=80709242

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210164317.3A Expired - Fee Related CN114216485B (en) 2022-02-23 2022-02-23 Image calibration method for aerial surveying and mapping of unmanned aerial vehicle

Country Status (1)

Country Link
CN (1) CN114216485B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114511471B (en) * 2022-04-18 2022-07-01 广州骏天科技有限公司 Image optimization method and system based on gray level co-occurrence matrix

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH10340348A (en) * 1997-06-09 1998-12-22 Ricoh Co Ltd Image aligning method, facsimile character recognizing method and recording medium
CN105303567A (en) * 2015-10-16 2016-02-03 浙江工业大学 Image registration method integrating image scale invariant feature transformation and individual entropy correlation coefficient
CN106408597A (en) * 2016-09-08 2017-02-15 西安电子科技大学 Neighborhood entropy and consistency detection-based SAR (synthetic aperture radar) image registration method
CN107610164A (en) * 2017-09-11 2018-01-19 北京空间飞行器总体设计部 A kind of No. four Image registration methods of high score based on multiple features mixing
CN109285184A (en) * 2018-08-29 2019-01-29 三峡大学 Three-dimensional point cloud initial registration algorithm based on center of gravity and centroid transformation
CN112611361A (en) * 2020-12-08 2021-04-06 华南理工大学 Method for measuring installation error of camera of airborne surveying and mapping pod of unmanned aerial vehicle
WO2021141920A1 (en) * 2020-01-06 2021-07-15 Intuitive Surgical Operations, Inc. System and method for inter-arm registration
CN113256679A (en) * 2021-05-13 2021-08-13 湖北工业大学 Electronic image stabilization algorithm based on vehicle-mounted rearview mirror system

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2677464B1 (en) * 2012-05-16 2018-05-02 IMEC vzw Feature detection in numeric data

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH10340348A (en) * 1997-06-09 1998-12-22 Ricoh Co Ltd Image aligning method, facsimile character recognizing method and recording medium
CN105303567A (en) * 2015-10-16 2016-02-03 浙江工业大学 Image registration method integrating image scale invariant feature transformation and individual entropy correlation coefficient
CN106408597A (en) * 2016-09-08 2017-02-15 西安电子科技大学 Neighborhood entropy and consistency detection-based SAR (synthetic aperture radar) image registration method
CN107610164A (en) * 2017-09-11 2018-01-19 北京空间飞行器总体设计部 A kind of No. four Image registration methods of high score based on multiple features mixing
CN109285184A (en) * 2018-08-29 2019-01-29 三峡大学 Three-dimensional point cloud initial registration algorithm based on center of gravity and centroid transformation
WO2021141920A1 (en) * 2020-01-06 2021-07-15 Intuitive Surgical Operations, Inc. System and method for inter-arm registration
CN112611361A (en) * 2020-12-08 2021-04-06 华南理工大学 Method for measuring installation error of camera of airborne surveying and mapping pod of unmanned aerial vehicle
CN113256679A (en) * 2021-05-13 2021-08-13 湖北工业大学 Electronic image stabilization algorithm based on vehicle-mounted rearview mirror system

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
RLSAK:A recursive least square approximation with k-means for transformation model estimation in image registration techniques;Sisu Ganesh等;《2013 International Conference on Human Computer Interactions》;20140930;1-4 *

Also Published As

Publication number Publication date
CN114216485A (en) 2022-03-22

Similar Documents

Publication Publication Date Title
US6795590B1 (en) SAR and FLIR image registration method
US7376262B2 (en) Method of three dimensional positioning using feature matching
US9691003B2 (en) Keypoint descriptor generation by complex wavelet analysis
Pizarro et al. Relative Pose Estimation for Instrumented, Calibrated Imaging Platforms.
CN115187798A (en) Multi-unmanned aerial vehicle high-precision matching positioning method
CN114216485B (en) Image calibration method for aerial surveying and mapping of unmanned aerial vehicle
Ribera et al. Estimating phenotypic traits from UAV based RGB imagery
CN114022560A (en) Calibration method and related device and equipment
CN114529615B (en) Radar calibration method, device and storage medium
CN111856445B (en) Target detection method, device, equipment and system
Krüger Robust and efficient map-to-image registration with line segments
CN112767457A (en) Principal component analysis-based plane point cloud matching method and device
Hasheminasab et al. Multiscale image matching for automated calibration of UAV-based frame and line camera systems
Mehrdad et al. Toward real time UAVS’image mosaicking
Layek et al. Remote distance measurement from a single image by automatic detection and perspective correction
CN114972451A (en) Rotation-invariant SuperGlue matching-based remote sensing image registration method
CN114898144A (en) Automatic alignment method based on camera and millimeter wave radar data
Bazin et al. Particle filter approach adapted to catadioptric images for target tracking application
CN113554754A (en) Indoor positioning method based on computer vision
Knuth et al. Maximum-likelihood localization of a camera network from heterogeneous relative measurements
Taiana et al. 3d tracking by catadioptric vision based on particle filters
CN116452791B (en) Multi-camera point defect area positioning method, system, device and storage medium
Domke et al. Integration of visual and inertial information for egomotion: a stochastic approach
Qiu et al. Image moment extraction based aerial photo selection for UAV high-precision geolocation without GPS
Fitzgibbon et al. Learning priors for calibrating families of stereo cameras

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20220429

CF01 Termination of patent right due to non-payment of annual fee