CN106971404A - A robust SURF registration method for UAV color remote sensing images - Google Patents
A robust SURF registration method for UAV color remote sensing images
- Publication number: CN106971404A (application CN201710163760.8A)
- Authority: CN (China)
- Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- G06T2207/10024: Color image (G: Physics; G06: Computing; G06T: Image data processing or generation, in general; G06T2207/00: Indexing scheme for image analysis or image enhancement; G06T2207/10: Image acquisition modality)
- G06T2207/10028: Range image; depth image; 3D point clouds
Abstract
The present invention relates to a robust SURF registration method for UAV color remote sensing images: in the feature point description stage, a color space transformation is applied at each feature point to obtain the point's own color information, which is then expressed in the descriptor. The matching effect is good, and no crossed match lines appear in the result image.
Description
Technical Field
The invention relates to unmanned aerial vehicle (UAV) image registration technology, and in particular to a remote sensing image registration method.
Background
The SURF (Speeded-Up Robust Features) algorithm is a robust local feature detection algorithm. It combines a detector with high repeatability and a highly distinctive descriptor, and offers strong robustness and high speed; it reaches sub-pixel precision and yields feature descriptors that remain invariant under scale change, brightness change, rotation, occlusion, noise, and similar conditions, so it is now widely used in image registration. However, UAV remote sensing images are small in size and large in number, so the requirement on registration accuracy is extremely strict. How to enhance the significance of the feature points of a color image and thereby further improve registration accuracy, especially for the accurate registration of color remote sensing images with high feature similarity, is a difficult problem in urgent need of a solution.
Some researchers have proposed processing color remote sensing images with the SURF algorithm after intensity normalization of the RGB components, and others have proposed SURF-based color invariant features. The first method is limited in how it describes color invariant features and cannot make full use of the color information; the second is time-consuming and has poor real-time performance. Neither meets practical requirements.
Disclosure of Invention
Technical problem to be solved
The invention provides a robust SURF registration method for UAV color remote sensing images, addressing the problem that when a UAV color remote sensing image is registered with the SURF algorithm, using only image brightness information and neglecting color information reduces the registration accuracy.
Technical scheme
A robust SURF unmanned aerial vehicle color remote sensing image registration method is characterized by comprising the following steps:
step 1: perform gray-scale processing on the acquired UAV color remote sensing reference image1 and the image2 to be registered to obtain grayscale images, apply Gaussian smoothing to the grayscale images to complete the construction of the scale space, extract feature points using the determinant of the Hessian feature detection matrix, and record all feature points of image1 and image2 respectively as the target set {p_i}, i = 1, 2, ..., n, and the reference set {q_j}, j = 1, 2, ..., m;
Step 2: for the resulting target set {p_i} and reference set {q_j}, determine a square region with side length 20s centered on each feature point, where s represents the sampling interval at the corresponding scale; divide the surrounding neighborhood information into 4 × 4 sub-regions, sample each sub-region at 5 × 5 points, compute with the Haar wavelet the responses and magnitudes dx, |dx|, dy, |dy| of each sub-region in the vertical and horizontal directions, and sum the responses ∑dx, ∑|dx|, ∑dy, ∑|dy| over the 5 × 5 sampling points, obtaining the 64-dimensional SURF feature descriptor of the feature point:

V = (∑dx, ∑|dx|, ∑dy, ∑|dy|)

The SURF descriptor of a feature point in the horizontal or vertical direction is defined as:

V_gray = (V_1, V_2, V_3, V_4, ..., V_16)

where V_i, i = 1, 2, ..., 16, denotes the description vectors of the 16 sub-regions; from this, the 64-dimensional SURF feature descriptors of all feature points in the target set {p_i} and the reference set {q_j} are obtained;
step 3: a feature point with coordinates (x, y) is represented in RGB color space as R(x,y), G(x,y), B(x,y) and is converted by a color space transformation into Y(x,y), I(x,y), Q(x,y) in YIQ color space;

The transformation from RGB to YIQ color space (the standard NTSC matrix) is:

Y = 0.299 R + 0.587 G + 0.114 B
I = 0.596 R - 0.274 G - 0.322 B
Q = 0.211 R - 0.523 G + 0.312 B

Appending these components to the original algorithm's descriptor gives the new color descriptor:

V_color = (i_1, i_2, ..., i_64, Y(x,y), I(x,y), Q(x,y))

which is normalized as:

V_ncolor = V_color / |V_color|

where i_k, k = 1, 2, ..., 64, denotes the 64-dimensional description vector of a feature point in the classic SURF algorithm; from this, the 67-dimensional description vectors of all feature points in the target set {p_i} and the reference set {q_j} are obtained;
step 4: bidirectional feature point matching

Step 4a: for a feature point p1 in the reference image1, use the Euclidean distance formula to find in image2, in the forward direction, the nearest Euclidean distance d1_nt and the second nearest Euclidean distance d1_snt, attained at the feature points p2_nt and p2_snt respectively;

Step 4b: compute the ratio T_1 of the nearest distance d1_nt to the second nearest distance d1_snt, i.e. T_1 = d1_nt / d1_snt, and compare T_1 with the judgment threshold T: if T_1 < T, go to step 4c; otherwise jump out;

Step 4c: for the feature point p2_nt, use the Euclidean distance formula to find in image1 the reverse nearest distance d2_nt and second nearest distance d2_snt, attained at the feature points p1_nt and p1_snt respectively;

Step 4d: compute the ratio T_2 = d2_nt / d2_snt of the nearest distance to the second nearest distance, and compare T_2 with the judgment threshold T: if T_2 < T and the retrieved feature point p1_nt is the same feature point as the original feature point p1, the pair is regarded as a correct matching point pair; otherwise return to step 4a and judge the other feature points of the reference image1;

All feature points in the target set of the reference image are traversed in turn, finally achieving bidirectional matching purification;
step 5: solve the transformation matrix parameters with the RANSAC algorithm

Step 5a: take the feature point pairs m_i' ←→ m_i, i = 1, 2, ..., n, obtained after the bidirectional matching and purification of step 4;

Step 5b: randomly select 4 matching point pairs from the matching point pair set and compute a transformation matrix H_i from their coordinate values;

Step 5c: based on the Euclidean distance, find the point pairs in the matching point pair set that satisfy d(H_i m_i, m_i') < t, where H_i m_i denotes a feature point of image2 transformed by the matrix into the position coordinates of image1, m_i' denotes the position coordinates of the feature point in image1 corresponding to m_i, and d(H_i m_i, m_i') denotes the Euclidean distance of the two coordinate vectors; take the point pairs meeting the condition as inliers and record the number of inliers satisfying the H_i constraint;

Step 5d: repeat steps 5b and 5c n times, recording the number of inliers each time;

Step 5e: select the H_best with the largest number of inliers, take all matching point pairs satisfying d(H_best m_i, m_i') < t as the final inliers, i.e. the correct matching point pairs, and remove the pairs that do not satisfy d(H_best m_i, m_i') < t, i.e. the outliers;

Step 5f: with the N matching point pairs obtained by the random sample consensus algorithm, form the matrix A from the 2N equations in the coordinates x_1, y_1, x_2, y_2 of the matching point pairs, and compute the singular value decomposition A = U D V^T, where U and V are the singular vectors of A and D holds the singular values on its diagonal in descending order; reconstructing the last column of V into a 3 × 3 matrix gives the parameter estimate of the desired perspective transformation matrix;
step 6: registration by bicubic interpolation

Let (x, y) be the coordinates of any point on the reference image1 with gray value g(x, y), let (u, v) be the coordinates of the corresponding point on the image2 to be registered with gray value f(u, v), and let [·] denote taking the integer part; the 16 pixels in the 4 × 4 neighborhood of the corresponding point in the image to be registered form the matrix B, whose elements f(u + i, v + j), i = -1, 0, 1, 2; j = -1, 0, 1, 2, are defined as the gray values at the coordinates (u + i, v + j) of the image to be registered;

The gray value of the point to be interpolated in the reference image is then calculated as:

g(x, y) = f(u, v) = A B C

where A and C are the row and column vectors of interpolation weights built from S(w), the basis function of the bicubic interpolation, which is an approximation of sin(πx)/(πx); interpolating image2 into the reference image1 by bicubic interpolation achieves the final high-precision registration.
The judgment threshold T in step 4b takes a value between 0.4 and 0.8.

t in step 5c is set to 4.

n in step 5d satisfies 100 ≤ n ≤ 200.
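Taken together, steps 1-6 map naturally onto standard library primitives. The following is a minimal end-to-end sketch using OpenCV equivalents, not the patented implementation itself: SURF detection on grayscale images, a forward ratio test (the full bidirectional purification is detailed later), RANSAC with t = 4, and bicubic resampling. SURF requires an opencv-contrib build, and the file names and the threshold T = 0.6 are illustrative choices, not values from the patent.

```python
import cv2
import numpy as np

img1 = cv2.imread("image1.png")                  # reference image (placeholder name)
img2 = cv2.imread("image2.png")                  # image to be registered
g1 = cv2.cvtColor(img1, cv2.COLOR_BGR2GRAY)      # step 1: gray-scale processing
g2 = cv2.cvtColor(img2, cv2.COLOR_BGR2GRAY)

surf = cv2.xfeatures2d.SURF_create()             # Hessian-based detector, steps 1-2
kp1, d1 = surf.detectAndCompute(g1, None)        # target set {p_i}
kp2, d2 = surf.detectAndCompute(g2, None)        # reference set {q_j}

matcher = cv2.BFMatcher(cv2.NORM_L2)
knn = matcher.knnMatch(d2, d1, k=2)              # image2 -> image1, nearest two
good = [m for m, n in knn if m.distance < 0.6 * n.distance]  # ratio test, T = 0.6

src = np.float32([kp2[m.queryIdx].pt for m in good])     # m_i in image2
dst = np.float32([kp1[m.trainIdx].pt for m in good])     # m_i' in image1
H, mask = cv2.findHomography(src, dst, cv2.RANSAC, 4.0)  # step 5, t = 4

registered = cv2.warpPerspective(                # step 6: bicubic resampling
    img2, H, (img1.shape[1], img1.shape[0]), flags=cv2.INTER_CUBIC)
```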
Advantageous effects
In the proposed robust SURF registration method for UAV color remote sensing images, a color space transformation is applied at each feature point in the feature point description stage to obtain the point's own color information, which is then expressed in the descriptor. The matching effect is good, and no crossed match lines appear in the image.
Drawings
Fig. 1 shows the UAV color remote sensing images image1 and image2; fig. 1(a) is the reference image1 and fig. 1(b) is the image2 to be registered.
Fig. 2 shows the registration result of the original SURF algorithm.
Fig. 3 shows the robust SURF registration result of the present invention.
Detailed Description
The invention will now be further described with reference to the following examples and drawings:
Because the components of the RGB color space are closely tied to the brightness level and the acquired color remote sensing images are usually very sensitive to brightness, the RGB color space is not well suited to color image processing. The YIQ color space (the Y component represents the brightness information of the image, while the I and Q components carry the color information) and the RGB color space can be converted into each other directly by matrix multiplication; the transformation is linear and fast. Moreover, YIQ extracts the luminance component while separating out the color information, requires little computation, and has good clustering properties, so it can be used effectively for color image processing. In this method, the feature points are converted from the RGB space to the YIQ space in the feature point description stage, the feature points are represented in the YIQ model, and their color information is added to the descriptor: a three-dimensional color information vector is appended to the 64-dimensional descriptor of the original classic SURF, forming a 67-dimensional descriptor that enhances the distinctiveness of the color image feature point descriptors. Accurate registration of the UAV color remote sensing images is then achieved by combining RANSAC (random sample consensus), bidirectional matching, and bicubic interpolation. The method mainly comprises the following steps:
Step one: detect feature points with the SURF algorithm to obtain the target set and the reference set

Let the UAV color remote sensing image1 be the reference image, with its feature point description subset as the target set {p_i}, i = 1, 2, ..., n, and let the other color remote sensing image2 be the image to be registered, with its feature point description subset as the reference set {q_j}, j = 1, 2, ..., m;

Perform gray-scale processing on the acquired UAV color remote sensing images image1 and image2 to obtain grayscale images, apply Gaussian smoothing to complete the construction of the scale space, and extract feature points with the determinant of the Hessian feature detection matrix; record all feature points of image1 and image2 respectively as the target set {p_i}, i = 1, 2, ..., n, and the reference set {q_j}, j = 1, 2, ..., m, as detailed below:
For a point X of the grayscale image I(x, y) obtained after gray-scale processing, the Hessian matrix at scale σ is defined as:

H(X, σ) = | L_xx(X, σ)  L_xy(X, σ) |
          | L_xy(X, σ)  L_yy(X, σ) |    (1)

where L_xx(X, σ) is the result of convolving the second-order Gaussian derivative ∂²g(σ)/∂x² with the image I(x, y) at the point X:

L_xx(X, σ) = ∂²g(σ)/∂x² * I(X)    (2)

where * denotes the two-dimensional convolution operation; similarly, L_xy(X, σ) and L_yy(X, σ) are the convolutions of the second-order Gaussian derivatives ∂²g(σ)/∂x∂y and ∂²g(σ)/∂y² with I(x, y) at the point X.
The determinant of the Hessian matrix is:

Det(H) = L_xx(X, σ) L_yy(X, σ) - L_xy²(X, σ)    (3)
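As a concrete illustration of equations (2) and (3), the Hessian response can be computed with Gaussian derivative filters. This is a minimal sketch that uses exact Gaussian derivatives rather than SURF's box-filter approximation; `image` is assumed to be a 2-D float grayscale array, and the function name is illustrative.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def hessian_determinant(image, sigma):
    # L_xx, L_yy, L_xy: convolution of the image with second-order
    # Gaussian derivatives at scale sigma; order = (derivative order
    # along rows, derivative order along columns).
    Lxx = gaussian_filter(image, sigma, order=(0, 2))
    Lyy = gaussian_filter(image, sigma, order=(2, 0))
    Lxy = gaussian_filter(image, sigma, order=(1, 1))
    return Lxx * Lyy - Lxy ** 2   # Det(H), equation (3)
```

Feature points are then the local maxima of this response across position and scale.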
Different scales σ correspond to different spatial resolutions; images smoothed with different values of σ differ, and the scale space is constructed as follows:

inter-group (octave) scale:

σ_octave = σ_0 · 2^(octave-1)    (4)

intra-group scale: defined analogously from the intra-group scale coordinate (equation (5)), where:

- σ: the scale space coordinate, a numerical characterization of the scale space;
- s: the intra-group scale space coordinate, a numerical characterization of the intra-group scale space;
- σ_0: the initial scale, determined by camera performance, typically chosen as 1.6;
- S: the number of layers per group (4 layers);
- octave: the number of groups (4 groups);

The scale space is established from equations (4) and (5), and the local extrema of the determinant response are searched in the image scale space with a three-dimensional non-maximum suppression search technique, completing the detection of the feature points;
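In practice this whole detection stage is available as a library primitive. A hedged sketch with OpenCV's SURF follows (available only in opencv-contrib builds; the Hessian threshold value and file names are illustrative assumptions, not values from the patent):

```python
import cv2

gray1 = cv2.imread("image1.png", cv2.IMREAD_GRAYSCALE)  # placeholder file names
gray2 = cv2.imread("image2.png", cv2.IMREAD_GRAYSCALE)

# 4 octaves x 4 layers, as in the scale-space construction above.
surf = cv2.xfeatures2d.SURF_create(hessianThreshold=400,
                                   nOctaves=4, nOctaveLayers=4)
kp1, desc1 = surf.detectAndCompute(gray1, None)  # target set {p_i}
kp2, desc2 = surf.detectAndCompute(gray2, None)  # reference set {q_j}
```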
Step two: describe the feature points and add color information to them

2.1 Feature point description

For the resulting target set {p_i} and reference set {q_j}, determine a square region with side length 20s (s represents the sampling interval at the corresponding scale) centered on each feature point, divide the surrounding neighborhood information into 4 × 4 sub-regions, sample each sub-region at 5 × 5 points, compute with the Haar wavelet the responses and magnitudes dx, |dx|, dy, |dy| of each sub-region in the vertical and horizontal directions, and sum the responses ∑dx, ∑|dx|, ∑dy, ∑|dy| over the 5 × 5 sampling points, obtaining the 64-dimensional (4 × 4 × 4) SURF feature descriptor of the feature point:

V = (∑dx, ∑|dx|, ∑dy, ∑|dy|)    (6)

Assuming the coordinates of a feature point are (x, y), the SURF descriptor of the feature point in one direction (horizontal or vertical) can be defined as:

V_gray = (V_1, V_2, V_3, V_4, ..., V_16)    (7)

where V_i (i = 1, 2, ..., 16) represents the descriptors of the 16 sub-regions. With the above method, the 64-dimensional SURF feature descriptors of all feature points in the target set {p_i} and the reference set {q_j} are obtained.
2.2 Adding color information

The color information of a feature point is added to the original descriptor. A feature point with coordinates (x, y) is represented in RGB color space as R(x,y), G(x,y), B(x,y) and is converted by a color space transformation into Y(x,y), I(x,y), Q(x,y) in YIQ color space. The RGB color space is the most common in daily life; however, the color remote sensing images we acquire are usually too sensitive to brightness, and the components of the RGB color space are closely tied to brightness, so the RGB color space is not suitable for color image processing. YIQ and RGB can be converted into each other directly by matrix multiplication; the transformation is linear, so the real-time performance is high. Moreover, the YIQ color space extracts the luminance component while separating out the color information, requires little computation, and has good clustering properties, so it can be used effectively for color image processing; this is the root of our improvement.

The transformation from RGB to YIQ color space (the standard NTSC matrix) is:

Y = 0.299 R + 0.587 G + 0.114 B
I = 0.596 R - 0.274 G - 0.322 B
Q = 0.211 R - 0.523 G + 0.312 B    (8)

Appending these components to the original algorithm's descriptor gives the new color descriptor:

V_color = (V_1, V_2, V_3, V_4, ..., V_16, Y(x,y), I(x,y), Q(x,y))    (9)

To make the expression clearer, equation (6) is rewritten as:

V = (i_1, i_2, i_3, i_4)    (10)

and equation (9) is then redefined as:

V_color = (i_1, i_2, i_3, i_4, ..., i_64, Y(x,y), I(x,y), Q(x,y))    (11)

For better robustness to rotation, scale, and illumination, equation (11) is normalized:

V_ncolor = V_color / |V_color|    (12)

where |V_color| is the modulus of equation (11), given by:

|V_color| = sqrt( i_1² + i_2² + ... + i_64² + Y(x,y)² + I(x,y)² + Q(x,y)² )    (13)

where i_k (k = 1, 2, ..., 64) represents the 64-dimensional descriptor of a feature point in the classic SURF algorithm. As equation (11) shows, the color information of the feature point itself is loaded on top of the original 64-dimensional feature point descriptor, forming a 67-dimensional descriptor; compared with the original algorithm the running time changes little, while the added color information further improves the accuracy.

By analogy, the 67-dimensional descriptors of all feature points in the target set {p_i} and the reference set {q_j} are finally obtained.
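A minimal sketch of the 67-dimensional color descriptor of equations (8)-(12): the standard NTSC RGB-to-YIQ matrix is applied to the feature point's color, the three components are appended to the 64-dimensional SURF vector, and the result is L2-normalized. Function and variable names are illustrative; `rgb` is assumed to be the (R, G, B) triple sampled at the feature point.

```python
import numpy as np

# Standard NTSC RGB -> YIQ transformation, equation (8).
RGB2YIQ = np.array([[0.299,  0.587,  0.114],
                    [0.596, -0.274, -0.322],
                    [0.211, -0.523,  0.312]])

def color_descriptor(surf64, rgb):
    yiq = RGB2YIQ @ np.asarray(rgb, dtype=float)  # (Y, I, Q) of the point
    v = np.concatenate([surf64, yiq])             # 64 + 3 = 67 dims, eq. (11)
    return v / np.linalg.norm(v)                  # normalization, eq. (12)
```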
Step three: feature point matching

For each feature point of the target set in the reference image1, query the reference set for the nearest-neighbor Euclidean distance d_nt and the second-nearest-neighbor Euclidean distance d_snt; if

d_nt / d_snt < T    (14)

the match between the feature point of the target set and its queried nearest neighbor in the reference set is kept according to equation (14); otherwise the matching pair is removed. T in the formula is the judgment threshold and takes a value between 0.4 and 0.8.

The above steps yield the matching point pairs under the ratio criterion of the nearest-neighbor to the second-nearest-neighbor Euclidean distance. The probability of mismatches among these matching point pairs is still high, so a purification process is required; the bidirectional matching method operates as follows:

For clarity, the descriptors of equation (12) in image1 and image2 are written as v1 = (v1_1, v1_2, v1_3, v1_4, ..., v1_67) and v2 = (v2_1, v2_2, v2_3, v2_4, ..., v2_67); the Euclidean distance formula gives:

d(v1, v2) = sqrt( ∑_{i=1}^{67} (v1_i - v2_i)² )    (15)

where d(v1, v2) denotes the Euclidean distance between the two vectors v1 and v2, and v1_i, v2_i (i = 1, 2, ..., 67) denote the component values of the vectors v1 and v2;

the nearest and second-nearest distances are denoted d_nt and d_snt, and T_i is the ratio of the nearest distance to the second-nearest distance, expressed as:

T_i = d_nt / d_snt    (16)
Bidirectional matching point pair purification is performed according to equations (15) and (16) in the following specific steps (a sketch follows the list):

(1) for a feature point p1 in the reference image1, use the Euclidean distance formula to find in image2 the feature points at the nearest and second-nearest Euclidean distances, p2_nt and p2_snt, with d1_nt and d1_snt describing the forward nearest and second-nearest distances; the forward direction is defined as searching in the reference set from the feature points of the reference image target set, i.e. a feature point of the target set is fixed and the reference set is traversed for a point meeting the condition; searching in the target set from the reference set is defined as the reverse direction;

(2) compute the ratio T_1 of the nearest distance to the second-nearest distance with equation (16), i.e. T_1 = d1_nt / d1_snt; if T_1 < T, go to step (3), otherwise jump out;

(3) for p2_nt, find the corresponding feature points in image1 with equation (15); the feature points at the nearest and second-nearest distances are p1_nt and p1_snt, with d2_nt and d2_snt describing the reverse nearest and second-nearest distances;

(4) compute the ratio T_2 of the nearest distance to the second-nearest distance with equation (16), i.e. T_2 = d2_nt / d2_snt; if T_2 < T and the retrieved feature point p1_nt is the same feature point as the original feature point p1, the pair is regarded as a correct matching point pair; otherwise return to step (1) and judge the other feature points of the reference image1;

All feature points in the reference image target set are traversed in turn, finally achieving bidirectional matching purification.
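A minimal sketch of this bidirectional purification, assuming `desc1` and `desc2` are (n1, 67) and (n2, 67) arrays of the color descriptors of image1 and image2 with at least two rows each; brute-force distances are used for clarity, and T is the judgment threshold of equation (14).

```python
import numpy as np

def bidirectional_match(desc1, desc2, T=0.6):
    # All pairwise Euclidean distances, equation (15).
    d = np.linalg.norm(desc1[:, None, :] - desc2[None, :, :], axis=2)
    matches = []
    for i in range(len(desc1)):
        fwd = np.argsort(d[i])                 # forward search in image2
        if d[i, fwd[0]] / d[i, fwd[1]] >= T:   # ratio test T_1, eq. (16)
            continue
        j = fwd[0]                             # p2_nt
        rev = np.argsort(d[:, j])              # reverse search in image1
        if d[rev[0], j] / d[rev[1], j] < T and rev[0] == i:
            matches.append((i, j))             # p1_nt coincides with p1
    return matches
```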
Step four: obtain the transformation matrix parameters with the RANSAC algorithm

First, solve for the transformation matrix. Given two corresponding points m' and m on the color remote sensing images image1 and image2 and the perspective transformation matrix H, we have m' ~ Hm, where ~ denotes equality up to scale. Let the coordinates of m and m' be (x_1, y_1) and (x_2, y_2), written in homogeneous form as (x_1, y_1, z_1) and (x_2', y_2', z_2) with x_2'/z_2 = x_2 and y_2'/z_2 = y_2; then equations (20) and (21) follow on the basis of homogeneous coordinates. For simplicity of calculation and without loss of generality, let z_1 = 1; from (20) and (21):

x_1 H_11 + y_1 H_12 + H_13 - x_1 x_2 H_31 - y_1 x_2 H_32 - x_2 H_33 = 0    (22)

x_1 H_21 + y_1 H_22 + H_23 - x_1 y_2 H_31 - y_1 y_2 H_32 - y_2 H_33 = 0    (23)

from which the linear system in the entries of H is deduced.

Since the last entry H_33 of the perspective transformation model is usually set to 1, the transformation model consists of eight parameters, and at least four pairs of non-collinear matching points are needed to determine the values of the parameters of the perspective transformation model. However, errors are always present, both in the actual calculation process and in the objective conditions: because of noise interference, the localization error of feature point detection, the error of the selected model, mismatches, and similar factors, estimating the eight parameters from only four pairs of non-collinear matching points gives large errors and cannot guarantee the registration accuracy; the RANSAC algorithm solves exactly these problems.

The RANSAC algorithm serves as a selection mechanism that ensures the number and quality of the extracted matching point pairs as far as possible, so that the transformation model can be estimated accurately. The RANSAC computation of the transformation matrix parameters proceeds in the following steps (a sketch follows the list):
(1) given the two color remote sensing images image1 and image2, take the feature point pairs m_i' ←→ m_i, i = 1, 2, ..., n, after bidirectional matching and purification;

(2) randomly select 4 matching point pairs from the matching point pair set and compute a transformation matrix H from their coordinate values;

(3) based on the Euclidean distance, find the point pairs in the matching point pair set that satisfy d(H m_i, m_i') < t, where H m_i denotes a feature point of image2 transformed by the matrix into the position coordinates of image1, m_i' denotes the position coordinates of the feature point in image1 corresponding to m_i, and d(H m_i, m_i') denotes the Euclidean distance of the two coordinate vectors; take the point pairs meeting the condition as inliers, record the number of inliers satisfying the H constraint, and select t = 4;

(4) repeat steps (2) and (3) n times (100 ≤ n ≤ 200), recording the number of inliers each time;

(5) select the H_best with the largest number of inliers, take all matching point pairs satisfying d(H_best m_i, m_i') < t as the final inliers, i.e. the correct matching point pairs, and remove the pairs that do not satisfy d(H_best m_i, m_i') < t, i.e. the outliers;

(6) with the N matching point pairs obtained by the random sample consensus algorithm, let A denote the matrix formed by the 2N equations in the matching point pairs and the corresponding transformation matrix H, and compute the singular value decomposition A = U D V^T, where U and V are the singular vectors of A and D holds the singular values on its diagonal in descending order; reconstructing the last column of V into a 3 × 3 matrix gives the parameter estimate of the desired perspective transformation matrix, whose parameters minimize the error in the least-squares sense;
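A minimal sketch of steps (1)-(6), assuming `src` holds the points m_i of image2 and `dst` the corresponding m_i' of image1 as (n, 2) float arrays; the DLT rows follow equations (22)-(23), and the final refit over all inliers is the SVD A = U D V^T of step (6). Function names and the seed are illustrative.

```python
import numpy as np

def fit_homography(src, dst):
    # Build the DLT matrix from equations (22)-(23) and take the right
    # singular vector of the smallest singular value (last column of V).
    rows = []
    for (x1, y1), (x2, y2) in zip(src, dst):
        rows.append([x1, y1, 1, 0, 0, 0, -x1 * x2, -y1 * x2, -x2])
        rows.append([0, 0, 0, x1, y1, 1, -x1 * y2, -y1 * y2, -y2])
    _, _, Vt = np.linalg.svd(np.asarray(rows))   # A = U D V^T
    return Vt[-1].reshape(3, 3)                  # 3 x 3 perspective matrix

def ransac_homography(src, dst, t=4.0, n_iter=200, seed=0):
    rng = np.random.default_rng(seed)
    hom = np.hstack([src, np.ones((len(src), 1))])     # homogeneous m_i
    best_H, best_mask = None, np.zeros(len(src), dtype=bool)
    for _ in range(n_iter):                            # steps (2)-(4)
        idx = rng.choice(len(src), 4, replace=False)
        H = fit_homography(src[idx], dst[idx])
        proj = hom @ H.T
        proj = proj[:, :2] / proj[:, 2:3]
        mask = np.linalg.norm(proj - dst, axis=1) < t  # d(H m_i, m_i') < t
        if mask.sum() > best_mask.sum():
            best_H, best_mask = H, mask                # keep H_best
    if best_mask.sum() >= 4:                           # step (6): refit on inliers
        best_H = fit_homography(src[best_mask], dst[best_mask])
    return best_H, best_mask
```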
Step five: achieve registration by bicubic interpolation

Once the transformation matrix is obtained, a mapping is needed to achieve registration. In the mapping process, because pixels are stored at integer coordinate values and are discrete, one situation inevitably arises: a point (x_1, y_1) originally on the integer grid (all coordinates integers) is mapped to a point (x_2, y_2) that no longer lies on the grid points, so interpolation is essential.

Bicubic interpolation estimates the gray value of the point to be interpolated from the gray information of the sixteen pixels in the 4 × 4 neighborhood of its corresponding point in the image to be registered; it considers not only the influence of the gray values of the four directly adjacent points but also the influence of the rate of change of the gray value between neighboring points. The method can be adopted whenever the sixteen pixels of the 4 × 4 neighborhood near the corresponding point of the point to be interpolated exist in the image to be registered.
Let (x, y) be the coordinates of any point on the reference image with gray value g(x, y), let (u, v) be the coordinates of the corresponding point on the image to be registered with gray value f(u, v), and let [·] denote taking the integer part; the 16 pixels in the 4 × 4 neighborhood of the corresponding point in the image to be registered form the matrix B, whose elements f(u + i, v + j) (i = -1, 0, 1, 2; j = -1, 0, 1, 2) are defined as the gray values at the coordinates (u + i, v + j) of the image to be registered.

The gray value of the point to be interpolated in the reference image is then calculated as:

g(x, y) = f(u, v) = A B C    (26)

where A and C are the row and column vectors of interpolation weights built from S(w), the basis function of the bicubic interpolation, which is an approximation of sin(πx)/(πx). The image2 image is interpolated into the reference image1 by bicubic interpolation, achieving high-precision registration.

Because bicubic interpolation uses the gray information of the pixels in a larger neighborhood, the interpolation accuracy is further improved and the edge details of the image are well preserved; the visual effect is more ideal, and it is essentially the most accurate of the existing interpolation algorithms.
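A minimal sketch of this final stage: the piecewise-cubic basis S(w) commonly used as the sin(πx)/(πx) approximation (the classic a = -1 kernel; the patent does not spell out its exact coefficients, so this variant is an assumption), plus the equivalent one-call warp with OpenCV's bicubic flag. `img1`, `img2`, and `H` are assumed to come from the preceding steps.

```python
import cv2

def S(w):
    # Bicubic basis function, a piecewise-cubic approximation of
    # sin(pi*w)/(pi*w); this is the classic a = -1 variant (assumed).
    w = abs(w)
    if w < 1:
        return 1 - 2 * w**2 + w**3
    if w < 2:
        return 4 - 8 * w + 5 * w**2 - w**3
    return 0.0

# In practice the whole g(x, y) = A B C interpolation over the 4 x 4
# neighborhood is what cv2.INTER_CUBIC performs during the warp; H maps
# image2 coordinates into image1 coordinates, as in step 5c.
registered = cv2.warpPerspective(img2, H, (img1.shape[1], img1.shape[0]),
                                 flags=cv2.INTER_CUBIC)
```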
In the following example, the same two UAV color remote sensing images, image1 and image2, are registered with both the original SURF algorithm and the robust SURF algorithm and verified in an accuracy simulation experiment; the matching effect is good, and the robust SURF algorithm produces no crossed match lines (see fig. 2 and fig. 3). Image2 is actually related to image1 by an affine transformation: scaling by a factor of 2, rotation by 10 degrees, a horizontal shift of 239 pixels, and a vertical shift of 28 pixels.
Example:

First, the theoretical value H_true is easily obtained from the geometric transformation relation between the color remote sensing images image1 and image2. Second, image1 and image2 are registered with the original SURF algorithm to obtain H_surf. Finally, image1 and image2 are registered with the robust SURF algorithm to obtain H_rob.
Without loss of generality, eight groups of UAV color remote sensing images were selected and tested, and the results of the 8 groups of experiments were averaged. The statistics of the transformation matrix parameters are shown in Table 1, the affine transformation matrix parameter comparison table: the maximum mean deviation of the original SURF algorithm is 0.218 higher than that of the robust algorithm, while the six parameters are otherwise close, so the robust algorithm is clearly superior to the SURF algorithm.

Table 1: Affine transformation matrix parameter comparison table
Claims (4)
1. A robust SURF unmanned aerial vehicle color remote sensing image registration method is characterized by comprising the following steps:
step 1: perform gray-scale processing on the acquired UAV color remote sensing reference image1 and the image2 to be registered to obtain grayscale images, apply Gaussian smoothing to the grayscale images to complete the construction of the scale space, extract feature points using the determinant of the Hessian feature detection matrix, and record all feature points of image1 and image2 respectively as the target set {p_i}, i = 1, 2, ..., n, and the reference set {q_j}, j = 1, 2, ..., m;
Step 2: for the resulting target set {p_i} and reference set {q_j}, determine a square region with side length 20s centered on each feature point, where s represents the sampling interval at the corresponding scale; divide the surrounding neighborhood information into 4 × 4 sub-regions, sample each sub-region at 5 × 5 points, compute with the Haar wavelet the responses and magnitudes dx, |dx|, dy, |dy| of each sub-region in the vertical and horizontal directions, and sum the responses ∑dx, ∑|dx|, ∑dy, ∑|dy| over the 5 × 5 sampling points, obtaining the 64-dimensional SURF feature descriptor of the feature point:

V = (∑dx, ∑|dx|, ∑dy, ∑|dy|)

The SURF descriptor of a feature point in the horizontal or vertical direction is defined as:

V_gray = (V_1, V_2, V_3, V_4, ..., V_16)

where V_i, i = 1, 2, ..., 16, denotes the description vectors of the 16 sub-regions; from this, the 64-dimensional SURF feature descriptors of all feature points in the target set {p_i} and the reference set {q_j} are obtained;
step 3: a feature point with coordinates (x, y) is represented in RGB color space as R(x,y), G(x,y), B(x,y) and is converted by a color space transformation into Y(x,y), I(x,y), Q(x,y) in YIQ color space;

The transformation from RGB to YIQ color space (the standard NTSC matrix) is:

Y = 0.299 R + 0.587 G + 0.114 B
I = 0.596 R - 0.274 G - 0.322 B
Q = 0.211 R - 0.523 G + 0.312 B

Appending these components to the original algorithm's descriptor gives the new color descriptor:

V_color = (i_1, i_2, ..., i_64, Y(x,y), I(x,y), Q(x,y))

which is normalized as:

V_ncolor = V_color / |V_color|

where i_k, k = 1, 2, ..., 64, denotes the 64-dimensional description vector of a feature point in the classic SURF algorithm; from this, the 67-dimensional description vectors of all feature points in the target set {p_i} and the reference set {q_j} are obtained;
step 4: bidirectional feature point matching

Step 4a: for a feature point p1 in the reference image1, use the Euclidean distance formula to find in image2, in the forward direction, the nearest Euclidean distance d1_nt and the second nearest Euclidean distance d1_snt, attained at the feature points p2_nt and p2_snt respectively;

Step 4b: compute the ratio T_1 of the nearest distance d1_nt to the second nearest distance d1_snt, i.e. T_1 = d1_nt / d1_snt, and compare T_1 with the judgment threshold T: if T_1 < T, go to step 4c; otherwise jump out;

Step 4c: for the feature point p2_nt, use the Euclidean distance formula to find in image1 the reverse nearest distance d2_nt and second nearest distance d2_snt, attained at the feature points p1_nt and p1_snt respectively;

Step 4d: compute the ratio T_2 = d2_nt / d2_snt of the nearest distance to the second nearest distance, and compare T_2 with the judgment threshold T: if T_2 < T and the retrieved feature point p1_nt is the same feature point as the original feature point p1, the pair is regarded as a correct matching point pair; otherwise return to step 4a and judge the other feature points of the reference image1;

All feature points in the target set of the reference image are traversed in turn, finally achieving bidirectional matching purification;
step 5: solve the transformation matrix parameters with the RANSAC algorithm

Step 5a: take the feature point pairs m_i' ←→ m_i, i = 1, 2, ..., n, obtained after the bidirectional matching and purification of step 4;

Step 5b: randomly select 4 matching point pairs from the matching point pair set and compute a transformation matrix H_i from their coordinate values;

Step 5c: based on the Euclidean distance, find the point pairs in the matching point pair set that satisfy d(H_i m_i, m_i') < t, where H_i m_i denotes a feature point of image2 transformed by the matrix into the position coordinates of image1, m_i' denotes the position coordinates of the feature point in image1 corresponding to m_i, and d(H_i m_i, m_i') denotes the Euclidean distance of the two coordinate vectors; take the point pairs meeting the condition as inliers and record the number of inliers satisfying the H_i constraint;

Step 5d: repeat steps 5b and 5c n times, recording the number of inliers each time;

Step 5e: select the H_best with the largest number of inliers, take all matching point pairs satisfying d(H_best m_i, m_i') < t as the final inliers, i.e. the correct matching point pairs, and remove the pairs that do not satisfy d(H_best m_i, m_i') < t, i.e. the outliers;

Step 5f: with the N matching point pairs obtained by the random sample consensus algorithm, form the matrix A from the 2N equations in the coordinates x_1, y_1, x_2, y_2 of the matching point pairs, and compute the singular value decomposition A = U D V^T, where U and V are the singular vectors of A and D holds the singular values on its diagonal in descending order; reconstructing the last column of V into a 3 × 3 matrix gives the parameter estimate of the desired perspective transformation matrix;
step 6: registration by bicubic interpolation

Let (x, y) be the coordinates of any point on the reference image1 with gray value g(x, y), let (u, v) be the coordinates of the corresponding point on the image2 to be registered with gray value f(u, v), and let [·] denote taking the integer part; the 16 pixels in the 4 × 4 neighborhood of the corresponding point in the image to be registered form the matrix B, whose elements f(u + i, v + j), i = -1, 0, 1, 2; j = -1, 0, 1, 2, are defined as the gray values at the coordinates (u + i, v + j) of the image to be registered;

The gray value of the point to be interpolated in the reference image is then calculated as:

g(x, y) = f(u, v) = A B C

where A and C are the row and column vectors of interpolation weights built from S(w), the basis function of the bicubic interpolation, which is an approximation of sin(πx)/(πx); interpolating image2 into the reference image1 by bicubic interpolation achieves the final high-precision registration.
2. The robust SURF UAV color remote sensing image registration method according to claim 1, characterized in that the judgment threshold T in step 4b takes a value between 0.4 and 0.8.
3. The robust SURF UAV color remote sensing image registration method according to claim 1, characterized in that t in step 5c is set to 4.
4. The robust SURF UAV color remote sensing image registration method according to claim 1, characterized in that n in step 5d satisfies 100 ≤ n ≤ 200.
Priority Applications (1)
- CN201710163760.8A, priority date 2017-03-20, filing date 2017-03-20: A robust SURF registration method for UAV color remote sensing images (CN106971404A)
Publications (1)
- CN106971404A, published 2017-07-21

Family ID: 59329507

Family Applications (1)
- CN201710163760.8A, filed 2017-03-20, status pending: CN106971404A (CN)
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103426186A (en) * | 2013-09-05 | 2013-12-04 | 山东大学 | Improved SURF fast matching method |
CN104867126A (en) * | 2014-02-25 | 2015-08-26 | 西安电子科技大学 | Method for registering synthetic aperture radar image with change area based on point pair constraint and Delaunay |
CN103914847A (en) * | 2014-04-10 | 2014-07-09 | 西安电子科技大学 | SAR image registration method based on phase congruency and SIFT |
CN105303567A (en) * | 2015-10-16 | 2016-02-03 | 浙江工业大学 | Image registration method integrating image scale invariant feature transformation and individual entropy correlation coefficient |
CN105809693A (en) * | 2016-03-10 | 2016-07-27 | 西安电子科技大学 | SAR image registration method based on deep neural networks |
CN106210731A (en) * | 2016-07-01 | 2016-12-07 | 兰州理工大学 | Coloured image reversible data concealing method based on bicubic interpolation extension |
Non-Patent Citations (1)
Title |
---|
Zhang Erlei et al.: "An improved SURF color remote sensing image registration algorithm", Chinese Journal of Liquid Crystals and Displays (液晶与显示) * |
Cited By (16)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11244197B2 (en) | 2017-08-31 | 2022-02-08 | Southwest Jiaotong University | Fast and robust multimodal remote sensing image matching method and system |
WO2019042232A1 (en) * | 2017-08-31 | 2019-03-07 | 西南交通大学 | Fast and robust multimodal remote sensing image matching method and system |
CN108037132A (en) * | 2017-12-25 | 2018-05-15 | 华南理工大学 | A kind of visual sensor system and method for the detection of dry cell pulp layer paper winding defect |
CN108037132B (en) * | 2017-12-25 | 2023-06-16 | 华南理工大学 | Visual sensor system and method for detecting winding defect of dry battery slurry layer paper |
CN109087297A (en) * | 2018-08-10 | 2018-12-25 | 成都工业职业技术学院 | A kind of MR method for registering images based on adaptive neighborhood selection |
CN109102534A (en) * | 2018-08-29 | 2018-12-28 | 长光卫星技术有限公司 | Optical remote sensing image registration method and system under the conditions of haze weather |
CN109102534B (en) * | 2018-08-29 | 2020-09-01 | 长光卫星技术有限公司 | Optical remote sensing image registration method and system under haze weather condition |
CN109711444A (en) * | 2018-12-18 | 2019-05-03 | 中国科学院遥感与数字地球研究所 | A kind of new remote sensing image matching method based on deep learning |
CN110009558A (en) * | 2019-01-17 | 2019-07-12 | 柳州康云互联科技有限公司 | A kind of normalized method of easy image color |
CN110060285A (en) * | 2019-04-29 | 2019-07-26 | 中国水利水电科学研究院 | A kind of remote sensing image registration method and system based on SURF algorithm |
CN111806702A (en) * | 2020-06-30 | 2020-10-23 | 鑫喆喆 | Parachute jumping mechanism pop-up platform and method based on signal detection |
CN111914855A (en) * | 2020-07-31 | 2020-11-10 | 西安电子科技大学 | Priori feature point sparsization method of ultra-large digital image map |
CN111914855B (en) * | 2020-07-31 | 2024-04-05 | 西安电子科技大学 | Priori feature point sparsification method for oversized digital image map |
CN113470085A (en) * | 2021-05-19 | 2021-10-01 | 西安电子科技大学 | Image registration method based on improved RANSAC |
CN113470085B (en) * | 2021-05-19 | 2023-02-10 | 西安电子科技大学 | Improved RANSAC-based image registration method |
CN115457222A (en) * | 2022-09-14 | 2022-12-09 | 北京建筑大学 | Method for geographic registration of three-dimensional model in geographic information system |
Legal Events
- PB01: Publication
- SE01: Entry into force of request for substantive examination
- WD01: Invention patent application deemed withdrawn after publication (application publication date: 2017-07-21)