CN106971404A - Robust SURF registration method for UAV color remote sensing images - Google Patents

Robust SURF registration method for UAV color remote sensing images

Info

Publication number
CN106971404A
Authority
CN
China
Prior art keywords
point
characteristic point
image1
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201710163760.8A
Other languages
Chinese (zh)
Inventor
王保平
戈艺萌
张飞
顾漪
张修飞
Other inventors have requested not to be named
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Northwestern Polytechnical University
Original Assignee
Northwestern Polytechnical University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Northwestern Polytechnical University
Priority to CN201710163760.8A
Publication of CN106971404A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10024 Color image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10028 Range image; Depth image; 3D point clouds

Landscapes

  • Image Analysis (AREA)

Abstract

The present invention relates to a robust SURF registration method for UAV color remote sensing images: in the feature point description stage, a color space transformation is applied at each feature point to obtain the color information of the feature point itself, which is then expressed in the descriptor. The matching performance is good, and no crossed matching lines appear in the images.

Description

Robust SURF registration method for UAV color remote sensing images
Technical field
The present invention relates to UAV image registration techniques, and in particular to a remote sensing image registration method.
Background technology
The SURF (Speeded Up Robust Features) algorithm is a robust local feature detection algorithm. It offers not only a detector with high repeatability and a descriptor with strong distinctiveness, but also strong robustness and high computational speed; it reaches sub-pixel accuracy, and the feature descriptors extracted from an image remain well invariant under scaling, brightness changes, rotation, occlusion and noise. The algorithm is therefore now widely used in image registration. However, because UAV remote sensing images have a small frame size and are numerous, the requirements on registration accuracy are extremely demanding. How to strengthen the saliency of color image feature points and thereby further improve registration accuracy, especially for the accurate registration of color remote sensing images with highly similar features, has remained an urgent problem to be solved.
Some researchers have proposed normalizing the intensities of the RGB components before processing color remote sensing images with the SURF algorithm, as well as a method of SURF-based color-invariant features. However, the first method is limited in describing color-invariant features and cannot make full use of the color information, while the second is time-consuming, has poor real-time performance and cannot meet practical requirements.
Summary of the invention
The technical problem to be solved
For the problem that, when the SURF algorithm is used to register UAV color remote sensing images, only the luminance information of the image is used and the color information is ignored, which reduces the registration accuracy, the present invention proposes a robust SURF registration method for UAV color remote sensing images.
Technical solution
A robust SURF registration method for UAV color remote sensing images, characterized by the following steps:
Step 1: Convert the acquired UAV color remote sensing reference image image1 and image to be registered image2 to grayscale images, apply Gaussian smoothing to the grayscale images to complete the scale space construction, and extract feature points using the determinant of the Hessian matrix as the feature detection operator; denote all feature points obtained from image1 and image2 as the target set {p_i}, i = 1, 2, ..., n and the reference set {q_j}, j = 1, 2, ..., m, respectively;
Step 2: For the target set {p_i} and the reference set {q_j} obtained, take a square region of side 20s centered on each feature point, where s is the sampling interval at the given scale-space level. Divide the surrounding neighborhood into 4 × 4 sub-regions and sample each sub-region at 5 × 5 points; with Haar wavelets, compute the horizontal and vertical responses and their absolute values dx, |dx|, dy, |dy| for each sub-region, and sum them over the 5 × 5 samples to obtain Σdx, Σ|dx|, Σdy, Σ|dy|. This yields the 64-dimensional SURF feature descriptor of the feature point:

V = (Σdx, Σ|dx|, Σdy, Σ|dy|)

The SURF descriptor of the feature point in the horizontal or vertical direction is then defined as:

V_gray = (V_1, V_2, V_3, V_4, ..., V_16)

where V_i, i = 1, 2, ..., 16 are the description vectors of the 16 sub-regions. This gives the 64-dimensional SURF feature descriptors of all feature points in the target set {p_i} and the reference set {q_j};
Step 3: For a feature point with coordinates (x, y), expressed in RGB color space as R_(x,y), G_(x,y), B_(x,y), apply a color space transformation into YIQ color space, giving Y_(x,y), I_(x,y), Q_(x,y);

The transformation from RGB to YIQ color space is:

$$\begin{bmatrix} Y \\ I \\ Q \end{bmatrix} = \begin{bmatrix} 0.299 & 0.587 & 0.114 \\ 0.596 & -0.274 & -0.322 \\ 0.211 & -0.523 & 0.312 \end{bmatrix} \begin{bmatrix} R \\ G \\ B \end{bmatrix}$$

The descriptor of the original algorithm is extended, giving the new color descriptor:

$$V_{color} = (i_1, i_2, i_3, i_4, \dots, i_{64}, Y_{(x,y)}, I_{(x,y)}, Q_{(x,y)})$$

The above is normalized:

$$\bar{V}_{color} = \left(\frac{i_1}{|V_{color}|}, \frac{i_2}{|V_{color}|}, \dots, \frac{i_{64}}{|V_{color}|}, \frac{Y_{(x,y)}}{|V_{color}|}, \frac{I_{(x,y)}}{|V_{color}|}, \frac{Q_{(x,y)}}{|V_{color}|}\right), \quad |V_{color}| = \sqrt{\sum_{k=1}^{64} i_k^2 + Y_{(x,y)}^2 + I_{(x,y)}^2 + Q_{(x,y)}^2}$$

where i_k, k = 1, 2, ..., 64 are the 64 description-vector components of the feature point in the classical SURF algorithm. This yields the 67-dimensional description vector of every feature point in the target set {p_i} and the reference set {q_j};
Step 4: Bidirectional matching of feature points

Step 4a: Using the Euclidean distance formula, compute for feature point p1 in the reference image image1 its forward nearest Euclidean distance d1_nt and second-nearest Euclidean distance d1_snt in image2; the corresponding feature points are p2_nt and p2_snt, respectively;

Step 4b: Compute the ratio T_1 of the nearest distance d1_nt to the second-nearest distance d1_snt:

$$T_1 = \frac{d1_{nt}}{d1_{snt}}$$

Compare T_1 with the judgment threshold T: if T_1 < T, proceed to step 4c; otherwise, reject the pair and move on;

Step 4c: For the corresponding feature point p2_nt, use the Euclidean distance formula to compute its reverse nearest distance d2_nt and second-nearest distance d2_snt in image1; the corresponding feature points are p1_nt and p1_snt, respectively;

Step 4d: Compute the ratio T_2 of the nearest distance to the second-nearest distance:

$$T_2 = \frac{d2_{nt}}{d2_{snt}}$$

Compare T_2 with the judgment threshold T: if T_2 < T and the feature point p1_nt found is the same point as the original feature point p1, the pair is accepted as a correct matched pair; otherwise, return to step 4a and judge a new feature point among the remaining feature points of the reference image image1;

All feature points of the reference image's target set are traversed in turn, completing the bidirectional matching purification;
Step 5: Estimate the transformation matrix parameters with the RANSAC algorithm

Step 5a: Collect the feature point pairs m_i' ↔ m_i, i = 1, 2, ..., n remaining after the bidirectional matching purification of step 4;

Step 5b: Randomly select 4 matched pairs from the set of matched pairs and use their coordinate values to solve for the transformation matrix H;

Step 5c: Using Euclidean distance as the criterion, find in the matched-pair set the pairs satisfying d(H_i m_i, m_i') < t, where H_i m_i is the position coordinate to which the feature point in image2 is mapped in image1 by the matrix transformation, m_i' is the position coordinate of the corresponding feature point in image1, and d(H_i m_i, m_i') is the Euclidean distance between the two coordinate vectors. Record the pairs that satisfy the condition as candidate inliers, together with the number of inliers satisfying the constraint of H_i;

Step 5d: Repeat steps 5b and 5c n times, recording the number of inliers each time;

Step 5e: Select the H_best with the largest inlier count and find all matched pairs satisfying d(H_best m_i, m_i') < t; these are kept as the final inliers, i.e. the correct matched pairs, while pairs not satisfying d(H_best m_i, m_i') < t are mismatches and are rejected as outliers;

Step 5f: With the N matched pairs obtained by the RANSAC algorithm, form the matrix A from the 2N equations in x_1, y_1, x_2, y_2 and compute its singular value decomposition A = U D V^T, where U and V contain the singular vectors of A and D holds the singular values in descending order on its diagonal; the last column of V, reshaped into a 3×3 matrix, is the required estimate of the perspective transformation matrix parameters;
Step 6: Bicubic interpolation achieves the registration

Let (x, y) be any point on the reference image image1 with gray value g(x, y), let (u, v) be its corresponding point on the image to be registered image2 with gray value f(u, v), and let [·] denote rounding to an integer. The matrix B is formed from the 16 pixels in the 4 × 4 neighborhood of the corresponding point in the image to be registered:

$$B = \begin{bmatrix} f([u]-1,[v]-1) & f([u]-1,[v]) & f([u]-1,[v]+1) & f([u]-1,[v]+2) \\ f([u],[v]-1) & f([u],[v]) & f([u],[v]+1) & f([u],[v]+2) \\ f([u]+1,[v]-1) & f([u]+1,[v]) & f([u]+1,[v]+1) & f([u]+1,[v]+2) \\ f([u]+2,[v]-1) & f([u]+2,[v]) & f([u]+2,[v]+1) & f([u]+2,[v]+2) \end{bmatrix}$$

The element f([u]+i, [v]+j), i = -1, 0, 1, 2; j = -1, 0, 1, 2 of matrix B is defined as the gray value at the point with coordinates ([u]+i, [v]+j) on the image to be registered;

The gray value of the interpolated point in the reference image is then computed as:

g(x, y) = f(u, v) = ABC

where:

$$a = u - [u], \quad b = v - [v], \quad C = \begin{bmatrix} s(1+a) & s(a) & s(1-a) & s(2-a) \end{bmatrix}^T, \quad A = \begin{bmatrix} s(1+b) & s(b) & s(1-b) & s(2-b) \end{bmatrix}$$

$$s(w) = \begin{cases} 1 - 2|w|^2 + |w|^3, & |w| < 1 \\ 4 - 8|w| + 5|w|^2 - |w|^3, & 1 \le |w| < 2 \\ 0, & |w| \ge 2 \end{cases}$$

s(w) is the basis function of bicubic interpolation, an approximation of the sinc function sin(πw)/(πw). Through bicubic interpolation, image2 is interpolated into the reference image image1, achieving the final high-accuracy registration.
The judgment threshold T in step 4b is 0.4-0.8.
The threshold t in step 5c is chosen as 4.
In step 5d, 100 ≤ n ≤ 200.
Beneficial effects

The robust SURF registration method for UAV color remote sensing images proposed by the present invention applies, in the feature point description stage, a color space transformation at each feature point to obtain the color information of the feature point itself, which is then expressed in the descriptor. The matching performance is good, and no crossed matching lines appear in the images.
Brief description of the drawings
Fig. 1 shows the UAV color remote sensing images image1 and image2 of the present invention, where Fig. 1(a) is the reference image image1 and Fig. 1(b) is the image to be registered image2.
Fig. 2 shows the registration result of the original SURF algorithm.
Fig. 3 shows the registration result of the robust SURF method of the present invention.
Embodiment
The invention will now be further described in conjunction with the embodiments and the accompanying drawings:
Because the components of the RGB color space are closely related to illumination intensity, and the color remote sensing images typically acquired are overly sensitive to brightness, the RGB color space is not well suited to color image processing. The YIQ color space (the Y component carries the luminance information of the image, while the I and Q components carry the color information) converts to and from the RGB color space directly by a single matrix multiplication; the transformation is linear, so real-time performance is high. Furthermore, YIQ separates out the luminance component while the color information is also isolated, the amount of computation is small and the clustering properties are good, so it can be used efficiently for color image processing. In the feature point description stage, this method transforms each feature point from RGB space to YIQ space, expresses the feature point in the YIQ model, and adds the point's own color information to the descriptor: a three-dimensional color information vector is added on the basis of the classical 64-dimensional SURF descriptor to form a 67-dimensional descriptor, strengthening the distinctiveness of the color image feature descriptors. RANSAC (Random Sample Consensus), bidirectional matching and a bicubic interpolation algorithm are then combined to achieve accurate registration of the UAV color remote sensing images. The main steps are as follows:
Step 1: SURF feature point detection to obtain the target set and the reference set

Let the UAV color remote sensing image image1 be the reference image, with its feature descriptor set as the target set {p_i}, i = 1, 2, ..., n; the other color remote sensing image image2 is the image to be registered, with its feature descriptor set as the reference set {q_j}, j = 1, 2, ..., m;

Convert the acquired UAV color remote sensing images image1 and image2 to grayscale, apply Gaussian smoothing to complete the scale space construction, and extract feature points using the determinant of the Hessian matrix as the feature detection operator; all feature points of image1 and image2 form the target set {p_i}, i = 1, 2, ..., n and the reference set {q_j}, j = 1, 2, ..., m, respectively. In detail:
For a point X = (x, y) in the grayscale image I(x, y) obtained after gray-level conversion, the Hessian matrix at scale σ is defined as:

$$H(X, \sigma) = \begin{bmatrix} L_{xx}(X,\sigma) & L_{xy}(X,\sigma) \\ L_{xy}(X,\sigma) & L_{yy}(X,\sigma) \end{bmatrix} \tag{1}$$

where L_xx(X, σ) is the convolution of the Gaussian second-order derivative at scale σ with the image I(x, y) at point X:

$$L_{xx}(X,\sigma) = \frac{\partial^2 g(\sigma)}{\partial x^2} * I(x, y) \tag{2}$$

where * denotes two-dimensional convolution; similarly, L_xy(X, σ) and L_yy(X, σ) are the convolutions of the corresponding Gaussian second-order derivatives at scale σ with I(x, y) at point X.

The determinant of the Hessian matrix is:

$$\det(H) = L_{xx}(X,\sigma) L_{yy}(X,\sigma) - L_{xy}^2(X,\sigma) \tag{3}$$
Different scales σ correspond to different spatial resolutions; Gaussian smoothing with different values of σ yields different smoothed images, from which the scale space is built:

Scale between octaves:

$$\sigma_{octave} = \sigma_0 \cdot 2^{octave-1} \tag{4}$$

Scale within an octave (formula (5)), where:

σ: the scale-space coordinate, a numerical state marker of the scale space;
s: the in-octave scale coordinate, a numerical state marker of the scale space within an octave;
σ_0: the initial scale, determined by the camera properties, typically 1.6;
S: the number of layers per octave (4 layers);
octave: the octave index (4 octaves);

The scale space is built by formulas (4) and (5); the detection of feature points is then completed by searching for local extrema of the determinant response in the three-dimensional scale space with non-maximum suppression;
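For reference, the detection of this step can be sketched with OpenCV as follows, assuming a build that still ships the non-free xfeatures2d module (for example, opencv-contrib-python compiled with non-free support); the file names and the Hessian threshold value are illustrative only:

```python
import cv2

img1 = cv2.imread("image1.png")   # reference image (illustrative file name)
img2 = cv2.imread("image2.png")   # image to be registered

# Step 1: convert to grayscale before scale-space construction and detection
gray1 = cv2.cvtColor(img1, cv2.COLOR_BGR2GRAY)
gray2 = cv2.cvtColor(img2, cv2.COLOR_BGR2GRAY)

# Hessian-determinant based detector; threshold 400 is an illustrative value
surf = cv2.xfeatures2d.SURF_create(hessianThreshold=400)
kp1, desc1 = surf.detectAndCompute(gray1, None)   # target set {p_i}
kp2, desc2 = surf.detectAndCompute(gray2, None)   # reference set {q_j}
```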
Step 2: Feature point description and addition of color information

2.1 Feature point description

For the target set {p_i} and the reference set {q_j} obtained, take a square region of side 20s (s is the sampling interval at the given scale-space level) centered on each feature point. Divide the surrounding neighborhood into 4 × 4 sub-regions and sample each sub-region at 5 × 5 points; with Haar wavelets, compute the horizontal and vertical responses and their absolute values dx, |dx|, dy, |dy| for each sub-region, and sum them over the 5 × 5 samples to obtain Σdx, Σ|dx|, Σdy, Σ|dy|, yielding the 64-dimensional (4 × 4 × 4) SURF feature descriptor of the feature point:

V = (Σdx, Σ|dx|, Σdy, Σ|dy|) (6)

Assuming a feature point at coordinates (x, y), the SURF descriptor of this point in one direction (horizontal or vertical) may be defined as:

V_gray = (V_1, V_2, V_3, V_4, ..., V_16) (7)

where V_i (i = 1, 2, ..., 16) is the descriptor of the corresponding one of the 16 sub-regions. Using this method, the 64-dimensional SURF feature descriptors of all feature points in the target set {p_i} and the reference set {q_j} are obtained.
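The 4 × 4 × 4 = 64 layout of formulas (6) and (7) can be illustrated with the following minimal sketch; the dx and dy inputs stand in for Haar wavelet responses already sampled around the feature point, and the Gaussian weighting and orientation alignment of the full SURF pipeline are omitted for brevity:

```python
import numpy as np

def surf_descriptor_64(dx, dy):
    """Sketch of formulas (6)-(7): dx, dy are 20x20 arrays of Haar
    responses, i.e. 5x5 samples in each of the 4x4 sub-regions, reduced
    to the 64-dimensional descriptor V_gray."""
    desc = []
    for i in range(4):
        for j in range(4):
            sx = dx[5*i:5*i+5, 5*j:5*j+5]
            sy = dy[5*i:5*i+5, 5*j:5*j+5]
            # per sub-region: (sum dx, sum |dx|, sum dy, sum |dy|)
            desc += [sx.sum(), np.abs(sx).sum(), sy.sum(), np.abs(sy).sum()]
    return np.asarray(desc)   # 4 x 4 sub-regions x 4 sums = 64 components
```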
2.2 Adding the color information

The color information of the feature point itself is added to the original descriptor. For a feature point with coordinates (x, y), expressed in RGB color space as R_(x,y), G_(x,y), B_(x,y), a color space transformation is applied into YIQ color space, giving Y_(x,y), I_(x,y), Q_(x,y). RGB is the most widely used color space in daily life; however, the color remote sensing images typically acquired are overly sensitive to brightness, and the components of the RGB color space are closely related to illumination intensity, so RGB is not well suited to color image processing. YIQ and RGB convert to each other directly by a single matrix multiplication; the transformation is linear, so real-time performance is high. Furthermore, the YIQ color space separates out the luminance component while the color information is also isolated, the amount of computation is small and the clustering properties are good, so it can be used efficiently for color image processing; this is also the root of our improvement.

The transformation from RGB to YIQ color space is:

$$\begin{bmatrix} Y \\ I \\ Q \end{bmatrix} = \begin{bmatrix} 0.299 & 0.587 & 0.114 \\ 0.596 & -0.274 & -0.322 \\ 0.211 & -0.523 & 0.312 \end{bmatrix} \begin{bmatrix} R \\ G \\ B \end{bmatrix} \tag{8}$$

The descriptor of the original algorithm is extended, giving the new color descriptor:

V_color = (V_1, V_2, V_3, V_4, ..., V_16, Y_(x,y), I_(x,y), Q_(x,y)) (9)

To make the statement clearer, formula (6) is rewritten as

V = (i_1, i_2, i_3, i_4) (10)

Formula (9) is then redefined as:

V_color = (i_1, i_2, i_3, i_4, ..., i_64, Y_(x,y), I_(x,y), Q_(x,y)) (11)

For better robustness to rotation, scale and illumination, formula (11) is normalized:

$$\bar{V}_{color} = \left(\frac{i_1}{|V_{color}|}, \frac{i_2}{|V_{color}|}, \dots, \frac{i_{64}}{|V_{color}|}, \frac{Y_{(x,y)}}{|V_{color}|}, \frac{I_{(x,y)}}{|V_{color}|}, \frac{Q_{(x,y)}}{|V_{color}|}\right) \tag{12}$$

where |V_color| is the modulus of formula (11):

$$|V_{color}| = \sqrt{\sum_{k=1}^{64} i_k^2 + Y_{(x,y)}^2 + I_{(x,y)}^2 + Q_{(x,y)}^2} \tag{13}$$

and i_k (k = 1, 2, ..., 64) are the 64 descriptor components of the feature point in the classical SURF algorithm. From formula (11) it can be seen that the color information of the feature point itself is loaded onto the basis of the original 64-dimensional feature descriptor, forming a 67-dimensional descriptor; compared with the original algorithm the time cost changes little, but the added color information allows the accuracy to be improved further.

In the same way, the 67-dimensional descriptors of each feature point in the target set {p_i} and the reference set {q_j} are finally obtained.
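A minimal sketch of formulas (8) to (13), assuming the RGB values of the feature point are available and scaled to [0, 1]:

```python
import numpy as np

# NTSC RGB -> YIQ transformation matrix of formula (8)
RGB2YIQ = np.array([[0.299,  0.587,  0.114],
                    [0.596, -0.274, -0.322],
                    [0.211, -0.523,  0.312]])

def color_descriptor_67(desc64, rgb):
    """Append the feature point's own YIQ values to its 64-D SURF
    descriptor (formula (11)) and normalize (formulas (12)-(13))."""
    y, i, q = RGB2YIQ @ np.asarray(rgb, dtype=float)
    v_color = np.concatenate([desc64, [y, i, q]])   # 67 components
    return v_color / np.linalg.norm(v_color)
```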
Step 3: Feature point matching

For each feature point of the target set in the reference image image1, query the reference set for its nearest-neighbor Euclidean distance d_nt and second-nearest-neighbor Euclidean distance d_snt. If they satisfy:

$$\frac{d_{nt}}{d_{snt}} < T \tag{14}$$

then the match formed by this target-set feature point and the nearest-neighbor feature point it found in the reference set satisfying formula (14) is retained; otherwise the matched pair is rejected. T in the formula is the judgment threshold, taking a value between 0.4 and 0.8.

The above step yields the matched pairs under the criterion of the ratio of nearest-neighbor to second-nearest-neighbor Euclidean distance. The probability of mismatches among these pairs is still high, so purification is required; here the method of bidirectional matching is used, as follows:

To make the statement clearer, the descriptors of formula (12) in image1 and image2 are written as v1 = (v1_1, v1_2, v1_3, v1_4, ..., v1_67) and v2 = (v2_1, v2_2, v2_3, v2_4, ..., v2_67); by the Euclidean distance formula:

$$d(v1, v2) = \sqrt{\sum_{i=1}^{67} (v1_i - v2_i)^2} \tag{15}$$

where d(v1, v2) is the Euclidean distance between the two vectors v1 and v2, and v1_i, v2_i (i = 1, 2, ..., 67) are the components of v1 and v2;

The nearest and second-nearest distances are written d_nt and d_snt, and T_i is the ratio of the nearest to the second-nearest distance:

$$T_i = \frac{d_{nt}}{d_{snt}} \tag{16}$$

The matched pairs are purified by bidirectional matching according to formulas (15) and (16), as follows:
(1) For feature point p1 in the reference image image1, use the Euclidean distance formula to find in image2 the feature points at its nearest and second-nearest Euclidean distances, p2_nt and p2_snt, with d1_nt and d1_snt denoting the forward nearest and second-nearest distances. The direction of searching from a target-set feature point of the reference image into the reference set is defined as forward, i.e. a feature point of the target set is fixed and the reference set is traversed for points meeting the condition; the direction of searching from the reference set into the target set is defined as reverse;

(2) Compute the ratio T_1 of the nearest to the second-nearest distance using formula (16), i.e. T_1 = d1_nt / d1_snt;

If T_1 < T, go to step (3); otherwise, reject the point and move on;

(3) For the corresponding feature point p2_nt, use formula (15) to find in image1 the feature points at its nearest and second-nearest distances, p1_nt and p1_snt, with d2_nt and d2_snt denoting the reverse nearest and second-nearest distances;

(4) Compute the ratio T_2 = d2_nt / d2_snt of the nearest to the second-nearest distance using formula (16);

If T_2 < T and the feature point p1_nt found is the same point as the original feature point p1, the pair is accepted as a correct matched pair; otherwise, return to step (1) and judge a new feature point among the remaining feature points of the reference image image1;

All feature points of the reference image's target set are traversed in turn, completing the bidirectional matching purification.
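The bidirectional purification of steps (1) to (4) can be sketched as follows; the brute-force distance computation is for illustration only, and T is the judgment threshold of formula (14):

```python
import numpy as np

def bidirectional_match(desc1, desc2, T=0.6):
    """desc1: (n, 67) descriptors of image1 (target set); desc2: (m, 67)
    descriptors of image2 (reference set); returns index pairs (i, j)
    passing the ratio test in both directions with mutual consistency."""
    def ratio_nn(query, train):
        # Euclidean distances from every query descriptor to every train one
        d = np.linalg.norm(query[:, None, :] - train[None, :, :], axis=2)
        order = np.argsort(d, axis=1)
        nt, snt = order[:, 0], order[:, 1]
        rows = np.arange(len(query))
        return nt, d[rows, nt] / d[rows, snt]   # nearest index, ratio T_i

    fwd_nn, fwd_ratio = ratio_nn(desc1, desc2)  # forward: image1 -> image2
    bwd_nn, bwd_ratio = ratio_nn(desc2, desc1)  # reverse: image2 -> image1
    matches = []
    for i, j in enumerate(fwd_nn):
        # keep the pair only if both ratio tests pass and the reverse
        # nearest neighbor of p2_nt is the original point p1
        if fwd_ratio[i] < T and bwd_ratio[j] < T and bwd_nn[j] == i:
            matches.append((i, j))
    return matches
```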
Step 4: The RANSAC algorithm estimates the transformation matrix parameters

First the transformation matrix is solved. Given corresponding points m' and m on the color remote sensing images image1 and image2, the perspective transformation matrix H satisfies m' ~ Hm, where ~ denotes equality up to scale. Let the coordinates of m and m' be (x_1, y_1) and (x_2, y_2) respectively, written in homogeneous form as (x_1, y_1, z_1) and (x_2', y_2', z_2), where x_2'/z_2 = x_2 and y_2'/z_2 = y_2; then

$$\begin{bmatrix} x_2' \\ y_2' \\ z_2 \end{bmatrix} = \begin{bmatrix} H_{11} & H_{12} & H_{13} \\ H_{21} & H_{22} & H_{23} \\ H_{31} & H_{32} & H_{33} \end{bmatrix} \begin{bmatrix} x_1 \\ y_1 \\ z_1 \end{bmatrix}$$

From the homogeneous coordinates, (20) and (21) follow:

$$x_2 = \frac{x_2'}{z_2} = \frac{H_{11}x_1 + H_{12}y_1 + H_{13}z_1}{H_{31}x_1 + H_{32}y_1 + H_{33}z_1} \tag{20}$$

$$y_2 = \frac{y_2'}{z_2} = \frac{H_{21}x_1 + H_{22}y_1 + H_{23}z_1}{H_{31}x_1 + H_{32}y_1 + H_{33}z_1} \tag{21}$$

Without loss of generality and for simplicity of calculation, set z_1 = 1; from (20) and (21):

$$x_1 H_{11} + y_1 H_{12} + H_{13} - x_1 x_2 H_{31} - y_1 x_2 H_{32} - x_2 H_{33} = 0 \tag{22}$$

$$x_1 H_{21} + y_1 H_{22} + H_{23} - x_1 y_2 H_{31} - y_1 y_2 H_{32} - y_2 H_{33} = 0 \tag{23}$$

From this it follows that each matched pair contributes two rows to a linear system in the entries of H:

$$\begin{bmatrix} x_1 & y_1 & 1 & 0 & 0 & 0 & -x_1 x_2 & -y_1 x_2 & -x_2 \\ 0 & 0 & 0 & x_1 & y_1 & 1 & -x_1 y_2 & -y_1 y_2 & -y_2 \end{bmatrix} \begin{bmatrix} H_{11} \\ H_{12} \\ \vdots \\ H_{33} \end{bmatrix} = 0 \tag{24}$$
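A minimal sketch of formulas (22) to (24) and of the SVD solution used in step (6) below; pts1 holds the points m = (x1, y1) and pts2 the corresponding points m' = (x2, y2), with m' ~ H m:

```python
import numpy as np

def dlt_homography(pts1, pts2):
    """Estimate the perspective transformation from N >= 4 matched pairs:
    each pair contributes the two rows of formula (24) to the 2N x 9
    matrix A, and A = U D V^T gives H as the last column of V."""
    rows = []
    for (x1, y1), (x2, y2) in zip(pts1, pts2):
        rows.append([x1, y1, 1, 0, 0, 0, -x1*x2, -y1*x2, -x2])  # formula (22)
        rows.append([0, 0, 0, x1, y1, 1, -x1*y2, -y1*y2, -y2])  # formula (23)
    A = np.asarray(rows)                 # 2N x 9
    _, _, Vt = np.linalg.svd(A)          # A = U D V^T
    H = Vt[-1].reshape(3, 3)             # last column of V, reshaped to 3x3
    return H / H[2, 2]                   # normalize so that H33 = 1
```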
Since the last entry H_33 of the perspective transformation model is usually set to 1, the transformation model consists of eight parameters, so at least four non-collinear matched pairs are needed to solve for the values of the parameters of the perspective transformation model. In practice, however, errors are unavoidable, both in the actual computation and in the objective conditions: noise, localization errors of the feature point detector, model errors, mismatches and similar factors mean that estimating the eight parameters from only four non-collinear matched pairs gives large errors and cannot guarantee registration accuracy. The RANSAC algorithm solves exactly this problem.

As a selection mechanism, RANSAC not only preserves as far as possible the number and quality of the extracted matched pairs, but also estimates the transformation model accurately. The steps by which the RANSAC algorithm estimates the transformation matrix parameters are as follows:
(1) Given the two color remote sensing images image1 and image2, find the feature point pairs after bidirectional matching purification, m_i' ↔ m_i, i = 1, 2, ..., n;

(2) From the set of matched pairs, randomly select 4 matched pairs and use their coordinate values to solve for the transformation matrix H;

(3) Using Euclidean distance as the criterion, find in the matched-pair set the pairs satisfying d(Hm_i, m_i') < t, where Hm_i is the position coordinate to which the feature point in image2 is mapped in image1 by the matrix transformation, m_i' is the position coordinate of the corresponding feature point in image1, and d(Hm_i, m_i') is the Euclidean distance between the two coordinate vectors. Record the qualifying pairs as candidate inliers, together with the number of inliers satisfying the constraint of H; t is chosen as 4;

(4) Repeat steps (2) and (3) n times (100 ≤ n ≤ 200), recording the number of inliers each time;

(5) Select the H_best with the largest inlier count and find all matched pairs satisfying d(H_best m_i, m_i') < t; these are kept as the final inliers, i.e. the correct matched pairs, while pairs not satisfying d(H_best m_i, m_i') < t are mismatches and are rejected as outliers;

(6) With the N matched pairs obtained by the RANSAC algorithm, the matrix formed by the 2N equations relating the matched pairs and the transformation matrix H is denoted A; its singular value decomposition A = U D V^T is computed, where U and V contain the singular vectors of A and D holds the singular values in descending order on its diagonal. The last column of V, reshaped into a 3×3 matrix, is the required estimate of the perspective transformation matrix; the parameters of this matrix minimize the error in the least squares sense;
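The RANSAC loop of steps (1) to (6) can then be sketched as follows, reusing dlt_homography from the sketch above; the threshold t = 4 and 100 <= n_iter <= 200 follow the values stated in this patent:

```python
import numpy as np

def ransac_homography(pts1, pts2, t=4.0, n_iter=150, rng=np.random):
    """pts1, pts2: (N, 2) arrays of bidirectionally purified matches with
    pts2 ~ H pts1 (at least 4 pairs assumed); returns the re-estimated H
    and the boolean inlier mask."""
    best_inliers = np.zeros(len(pts1), dtype=bool)
    ones = np.ones((len(pts1), 1))
    for _ in range(n_iter):
        idx = rng.choice(len(pts1), 4, replace=False)      # step (2)
        H = dlt_homography(pts1[idx], pts2[idx])
        proj = np.hstack([pts1, ones]) @ H.T               # map pts1 by H
        proj = proj[:, :2] / proj[:, 2:3]
        inliers = np.linalg.norm(proj - pts2, axis=1) < t  # step (3)
        if inliers.sum() > best_inliers.sum():             # steps (4)-(5)
            best_inliers = inliers
    # step (6): re-estimate H from all final inliers by SVD
    return dlt_homography(pts1[best_inliers], pts2[best_inliers]), best_inliers
```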
Step 5: Bicubic interpolation achieves the registration

After the transformation matrix is obtained, the mapping must be carried out to complete the registration. During mapping, because pixels are stored at integer coordinates and are discrete values, a situation inevitably arises: a point (x_1, y_1) originally on the integer grid (with all-integer coordinates) is mapped to (x_2, y_2), which does not fall on a grid point; the quality of the interpolation therefore becomes particularly important.

The bicubic interpolation method estimates the gray value of the interpolated point from the gray information of the sixteen pixels in the 4×4 neighborhood of its corresponding point in the image to be registered; it accounts not only for the influence of the gray values of the four directly adjacent points, but also for the influence of the rate of change of gray value between adjacent points. The method can be applied whenever the sixteen pixels of the 4×4 neighborhood around the corresponding point exist in the image to be registered.
Let (x, y) be any point on the reference image with gray value g(x, y), let (u, v) be its corresponding point on the image to be registered with gray value f(u, v), and let [·] denote rounding to an integer. The matrix of the 16 pixels in the 4×4 neighborhood of the corresponding point in the image to be registered is B:

$$B = \begin{bmatrix} f([u]-1,[v]-1) & f([u]-1,[v]) & f([u]-1,[v]+1) & f([u]-1,[v]+2) \\ f([u],[v]-1) & f([u],[v]) & f([u],[v]+1) & f([u],[v]+2) \\ f([u]+1,[v]-1) & f([u]+1,[v]) & f([u]+1,[v]+1) & f([u]+1,[v]+2) \\ f([u]+2,[v]-1) & f([u]+2,[v]) & f([u]+2,[v]+1) & f([u]+2,[v]+2) \end{bmatrix} \tag{25}$$

The element f([u]+i, [v]+j) (i = -1, 0, 1, 2; j = -1, 0, 1, 2) of matrix B is defined as the gray value at the point with coordinates ([u]+i, [v]+j) on the image to be registered.

The gray value of the interpolated point in the reference image is then computed as:

g(x, y) = f(u, v) = ABC (26)

where:

$$a = u - [u], \quad b = v - [v], \quad C = \begin{bmatrix} s(1+a) & s(a) & s(1-a) & s(2-a) \end{bmatrix}^T, \quad A = \begin{bmatrix} s(1+b) & s(b) & s(1-b) & s(2-b) \end{bmatrix}$$

$$s(w) = \begin{cases} 1 - 2|w|^2 + |w|^3, & |w| < 1 \\ 4 - 8|w| + 5|w|^2 - |w|^3, & 1 \le |w| < 2 \\ 0, & |w| \ge 2 \end{cases}$$

s(w) is the basis function of bicubic interpolation, an approximation of the sinc function sin(πw)/(πw). Through bicubic interpolation, image2 is interpolated into the reference image image1, achieving the final high-accuracy registration.
Bicubic interpolation uses the gray information of pixels in a larger neighborhood, further improving the interpolation accuracy, and it also preserves the edge details of the image well, giving a better visual result; it is essentially the most accurate of the existing interpolation algorithms.
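A minimal sketch of the interpolation g(x, y) = f(u, v) = ABC with the basis function s(w); boundary handling is omitted, and the weight vectors are arranged to match the row/column layout of matrix B above (a-based weights along the u rows, b-based weights along the v columns):

```python
import numpy as np

def s(w):
    """Bicubic basis function, an approximation of the sinc function."""
    w = abs(w)
    if w < 1:
        return 1 - 2*w**2 + w**3
    if w < 2:
        return 4 - 8*w + 5*w**2 - w**3
    return 0.0

def bicubic(f, u, v):
    """Interpolate the gray value of image f (a 2-D array) at the
    non-integer position (u, v), assumed at least 2 pixels inside f."""
    iu, iv = int(np.floor(u)), int(np.floor(v))
    a, b = u - iu, v - iv
    B = f[iu-1:iu+3, iv-1:iv+3].astype(float)       # 4x4 neighborhood
    wu = np.array([s(1+a), s(a), s(1-a), s(2-a)])   # weights along u (rows)
    wv = np.array([s(1+b), s(b), s(1-b), s(2-b)])   # weights along v (cols)
    return wu @ B @ wv
```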
In the following example, a simulation experiment verifies the registration accuracy of the original SURF algorithm and the robust SURF algorithm on the same two UAV color remote sensing images image1 and image2; the matching performance is good, and the robust SURF algorithm produces no crossed matching lines (see Figs. 2 and 3). image1 and image2 are in fact related by an affine transformation with a scale factor of 2, a rotation of 10 degrees, a horizontal offset of 239 pixels and a vertical offset of 28 pixels.
Example:
First, from the geometric transformation relation between the color remote sensing images image1 and image2 given above, the theoretical value H_true is easily obtained. Next, image1 and image2 are registered with the original SURF algorithm, giving the original H_surf. Finally, image1 and image2 are registered with the robust SURF algorithm, giving H_rob.
Without loss of generality, eight groups of UAV color remote sensing images are then chosen and tested separately; the results of the 8 groups of experiments are averaged, and statistics are computed for each parameter of the transformation matrix, as shown in Table 1, a comparison of the affine transformation matrix parameters. The maximum mean deviation of the SURF algorithm is 0.218, higher than the maximum mean deviation of 0.134 of the robust method; in the closeness of the six parameters, the robust algorithm is clearly superior to the SURF algorithm.
Table 1. Comparison of affine transformation matrix parameters

Claims (4)

1. A robust SURF registration method for UAV color remote sensing images, characterized by the following steps:
Step 1: Convert the acquired UAV color remote sensing reference image image1 and image to be registered image2 to grayscale images, apply Gaussian smoothing to the grayscale images to complete the scale space construction, and extract feature points using the determinant of the Hessian matrix as the feature detection operator; denote all feature points obtained from image1 and image2 as the target set {p_i}, i = 1, 2, ..., n and the reference set {q_j}, j = 1, 2, ..., m, respectively;
Step 2: For the target set {p_i} and the reference set {q_j} obtained, take a square region of side 20s centered on each feature point, where s is the sampling interval at the given scale-space level; divide the surrounding neighborhood into 4 × 4 sub-regions, each sampled at 5 × 5 points, and with Haar wavelets compute the horizontal and vertical responses and their absolute values dx, |dx|, dy, |dy| of each sub-region, summing them over the 5 × 5 samples to obtain Σdx, Σ|dx|, Σdy, Σ|dy|; this yields the 64-dimensional SURF feature descriptor of the feature point:

V = (Σdx, Σ|dx|, Σdy, Σ|dy|)

The SURF descriptor of the feature point in the horizontal or vertical direction is then defined as:

V_gray = (V_1, V_2, V_3, V_4, ..., V_16)

where V_i, i = 1, 2, ..., 16 are the description vectors of the 16 sub-regions; this gives the 64-dimensional SURF feature descriptors of all feature points in the target set {p_i} and the reference set {q_j};
Step 3: For a feature point with coordinates (x, y), expressed in RGB color space as R_(x,y), G_(x,y), B_(x,y), apply a color space transformation into YIQ color space, giving Y_(x,y), I_(x,y), Q_(x,y);

The transformation from RGB to YIQ color space is:

$$\begin{bmatrix} Y \\ I \\ Q \end{bmatrix} = \begin{bmatrix} 0.299 & 0.587 & 0.114 \\ 0.596 & -0.274 & -0.322 \\ 0.211 & -0.523 & 0.312 \end{bmatrix} \begin{bmatrix} R \\ G \\ B \end{bmatrix}$$

The descriptor of the original algorithm is extended, giving the new color descriptor:

$$V_{color} = (i_1, i_2, i_3, i_4, \dots, i_{64}, Y_{(x,y)}, I_{(x,y)}, Q_{(x,y)})$$

The above formula is normalized:

$$\bar{V}_{color} = \left(\frac{i_1}{|V_{color}|}, \frac{i_2}{|V_{color}|}, \dots, \frac{i_{64}}{|V_{color}|}, \frac{Y_{(x,y)}}{|V_{color}|}, \frac{I_{(x,y)}}{|V_{color}|}, \frac{Q_{(x,y)}}{|V_{color}|}\right)$$

$$|V_{color}| = \sqrt{\sum_{k=1}^{64} i_k^2 + Y_{(x,y)}^2 + I_{(x,y)}^2 + Q_{(x,y)}^2}$$

where i_k, k = 1, 2, ..., 64 are the 64 description-vector components of the feature point in the classical SURF algorithm; this yields the 67-dimensional description vector of each feature point in the target set {p_i} and the reference set {q_j};
Step 4: Bidirectional matching of feature points

Step 4a: Using the Euclidean distance formula, compute for feature point p1 in the reference image image1 its forward nearest Euclidean distance d1_nt and second-nearest Euclidean distance d1_snt in image2; the corresponding feature points are p2_nt and p2_snt, respectively;

Step 4b: Compute the ratio T_1 of the nearest distance d1_nt to the second-nearest distance d1_snt:

$$T_1 = \frac{d1_{nt}}{d1_{snt}}$$

Compare T_1 with the judgment threshold T: if T_1 < T, proceed to step 4c; otherwise, reject the pair and move on;

Step 4c: For the corresponding feature point p2_nt, use the Euclidean distance formula to compute its reverse nearest distance d2_nt and second-nearest distance d2_snt in image1; the corresponding feature points are p1_nt and p1_snt, respectively;

Step 4d: Compute the ratio T_2 of the nearest distance to the second-nearest distance:

$$T_2 = \frac{d2_{nt}}{d2_{snt}}$$

Compare T_2 with the judgment threshold T: if T_2 < T and the feature point p1_nt found is the same point as the original feature point p1, the pair is accepted as a correct matched pair; otherwise, return to step 4a and judge a new feature point among the remaining feature points of the reference image image1;

All feature points of the reference image's target set are traversed in turn, completing the bidirectional matching purification;
Step 5: Estimate the transformation matrix parameters with the RANSAC algorithm

Step 5a: Collect the feature point pairs m_i' ↔ m_i, i = 1, 2, ..., n remaining after the bidirectional matching purification of step 4;

Step 5b: Randomly select 4 matched pairs from the set of matched pairs and use their coordinate values to solve for the transformation matrix H;

Step 5c: Using Euclidean distance as the criterion, find in the matched-pair set the pairs satisfying d(H_i m_i, m_i') < t, where H_i m_i is the position coordinate to which the feature point in image2 is mapped in image1 by the matrix transformation, m_i' is the position coordinate of the corresponding feature point in image1, and d(H_i m_i, m_i') is the Euclidean distance between the two coordinate vectors; record the qualifying pairs as candidate inliers, together with the number of inliers satisfying the constraint of H_i;

Step 5d: Repeat steps 5b and 5c n times, recording the number of inliers each time;

Step 5e: Select the H_best with the largest inlier count and find all matched pairs satisfying d(H_best m_i, m_i') < t; these are kept as the final inliers, i.e. the correct matched pairs, while pairs not satisfying d(H_best m_i, m_i') < t are mismatches and are rejected as outliers;

Step 5f: With the N matched pairs obtained by the RANSAC algorithm, form the matrix A from the 2N equations in x_1, y_1, x_2, y_2 and compute its singular value decomposition A = U D V^T, where U and V contain the singular vectors of A and D holds the singular values in descending order on its diagonal; the last column of V, reshaped into a 3×3 matrix, is the required estimate of the perspective transformation matrix parameters;
Step 6: Bicubic interpolation achieves the registration

Let (x, y) be any point on the reference image image1 with gray value g(x, y), let (u, v) be its corresponding point on the image to be registered image2 with gray value f(u, v), and let [·] denote rounding to an integer; the matrix B is formed from the 16 pixels in the 4 × 4 neighborhood of the corresponding point in the image to be registered:

$$B = \begin{bmatrix} f([u]-1,[v]-1) & f([u]-1,[v]) & f([u]-1,[v]+1) & f([u]-1,[v]+2) \\ f([u],[v]-1) & f([u],[v]) & f([u],[v]+1) & f([u],[v]+2) \\ f([u]+1,[v]-1) & f([u]+1,[v]) & f([u]+1,[v]+1) & f([u]+1,[v]+2) \\ f([u]+2,[v]-1) & f([u]+2,[v]) & f([u]+2,[v]+1) & f([u]+2,[v]+2) \end{bmatrix}$$

The element f([u]+i, [v]+j), i = -1, 0, 1, 2; j = -1, 0, 1, 2 of matrix B is defined as the gray value at the point with coordinates ([u]+i, [v]+j) on the image to be registered;

The gray value of the interpolated point in the reference image is then computed as:

g(x, y) = f(u, v) = ABC

where:

$$a = u - [u], \quad b = v - [v], \quad C = \begin{bmatrix} s(1+a) & s(a) & s(1-a) & s(2-a) \end{bmatrix}^T, \quad A = \begin{bmatrix} s(1+b) & s(b) & s(1-b) & s(2-b) \end{bmatrix}$$

$$s(w) = \begin{cases} 1 - 2|w|^2 + |w|^3, & |w| < 1 \\ 4 - 8|w| + 5|w|^2 - |w|^3, & 1 \le |w| < 2 \\ 0, & |w| \ge 2 \end{cases}$$

s(w) is the basis function of bicubic interpolation, an approximation of the sinc function sin(πw)/(πw); through bicubic interpolation, image2 is interpolated into the reference image image1, achieving the final high-accuracy registration.
2. The robust SURF registration method for UAV color remote sensing images according to claim 1, characterized in that the judgment threshold T in step 4b is 0.4-0.8.
3. The robust SURF registration method for UAV color remote sensing images according to claim 1, characterized in that t in step 5c is chosen as 4.
4. The robust SURF registration method for UAV color remote sensing images according to claim 1, characterized in that 100 ≤ n ≤ 200 in step 5d.
CN201710163760.8A 2017-03-20 2017-03-20 Robust SURF registration method for UAV color remote sensing images Pending CN106971404A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710163760.8A 2017-03-20 2017-03-20 Robust SURF registration method for UAV color remote sensing images

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710163760.8A 2017-03-20 2017-03-20 Robust SURF registration method for UAV color remote sensing images

Publications (1)

Publication Number Publication Date
CN106971404A 2017-07-21

Family

ID=59329507

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710163760.8A Pending CN106971404A (en) 2017-03-20 2017-03-20 Robust SURF registration method for UAV color remote sensing images

Country Status (1)

Country Link
CN (1) CN106971404A (en)



Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103426186A (en) * 2013-09-05 2013-12-04 山东大学 Improved SURF fast matching method
CN104867126A (en) * 2014-02-25 2015-08-26 西安电子科技大学 Method for registering synthetic aperture radar image with change area based on point pair constraint and Delaunay
CN103914847A (en) * 2014-04-10 2014-07-09 西安电子科技大学 SAR image registration method based on phase congruency and SIFT
CN105303567A (en) * 2015-10-16 2016-02-03 浙江工业大学 Image registration method integrating image scale invariant feature transformation and individual entropy correlation coefficient
CN105809693A (en) * 2016-03-10 2016-07-27 西安电子科技大学 SAR image registration method based on deep neural networks
CN106210731A (en) * 2016-07-01 2016-12-07 兰州理工大学 Coloured image reversible data concealing method based on bicubic interpolation extension

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
ZHANG Erlei et al.: "An improved SURF registration algorithm for color remote sensing images", Chinese Journal of Liquid Crystals and Displays *

Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11244197B2 (en) 2017-08-31 2022-02-08 Southwest Jiaotong University Fast and robust multimodal remote sensing image matching method and system
WO2019042232A1 (en) * 2017-08-31 2019-03-07 西南交通大学 Fast and robust multimodal remote sensing image matching method and system
CN108037132A (en) * 2017-12-25 2018-05-15 华南理工大学 A kind of visual sensor system and method for the detection of dry cell pulp layer paper winding defect
CN108037132B (en) * 2017-12-25 2023-06-16 华南理工大学 Visual sensor system and method for detecting winding defect of dry battery slurry layer paper
CN109087297A (en) * 2018-08-10 2018-12-25 成都工业职业技术学院 A kind of MR method for registering images based on adaptive neighborhood selection
CN109102534A (en) * 2018-08-29 2018-12-28 长光卫星技术有限公司 Optical remote sensing image registration method and system under the conditions of haze weather
CN109102534B (en) * 2018-08-29 2020-09-01 长光卫星技术有限公司 Optical remote sensing image registration method and system under haze weather condition
CN109711444A (en) * 2018-12-18 2019-05-03 中国科学院遥感与数字地球研究所 A kind of new remote sensing image matching method based on deep learning
CN110009558A (en) * 2019-01-17 2019-07-12 柳州康云互联科技有限公司 A kind of normalized method of easy image color
CN110060285A (en) * 2019-04-29 2019-07-26 中国水利水电科学研究院 A kind of remote sensing image registration method and system based on SURF algorithm
CN111806702A (en) * 2020-06-30 2020-10-23 鑫喆喆 Parachute jumping mechanism pop-up platform and method based on signal detection
CN111914855A (en) * 2020-07-31 2020-11-10 西安电子科技大学 Priori feature point sparsization method of ultra-large digital image map
CN111914855B (en) * 2020-07-31 2024-04-05 西安电子科技大学 Priori feature point sparsification method for oversized digital image map
CN113470085A (en) * 2021-05-19 2021-10-01 西安电子科技大学 Image registration method based on improved RANSAC
CN113470085B (en) * 2021-05-19 2023-02-10 西安电子科技大学 Improved RANSAC-based image registration method
CN115457222A (en) * 2022-09-14 2022-12-09 北京建筑大学 Method for geographic registration of three-dimensional model in geographic information system

Similar Documents

Publication Publication Date Title
CN106971404A Robust SURF registration method for UAV color remote sensing images
Buch et al. Pose estimation using local structure-specific shape and appearance context
CN109410255A (en) A kind of method for registering images and device based on improved SIFT and hash algorithm
CN106960451A (en) A kind of method for lifting the weak texture region characteristic point quantity of image
CN106991695A (en) A kind of method for registering images and device
CN104318570A (en) Self-adaptation camouflage design method based on background
CN105427298A (en) Remote sensing image registration method based on anisotropic gradient dimension space
CN106447704A (en) A visible light-infrared image registration method based on salient region features and edge degree
CN107392215A (en) A kind of multigraph detection method based on SIFT algorithms
Perret et al. Directed connected operators: Asymmetric hierarchies for image filtering and segmentation
CN101388115A (en) Depth image autoegistration method combined with texture information
CN104504723A (en) Image registration method based on remarkable visual features
CN112396643A (en) Multi-mode high-resolution image registration method with scale-invariant features and geometric features fused
CN108765476A (en) Polarized image registration method
CN106127748A (en) A kind of characteristics of image sample database and method for building up thereof
CN102446356A (en) Parallel and adaptive matching method for acquiring remote sensing images with homogeneously-distributed matched points
Jansson et al. Exploring the ability of CNNs to generalise to previously unseen scales over wide scale ranges
Cui et al. A new image registration scheme based on curvature scale space curve matching
CN104361573B (en) The SIFT feature matching algorithm of Fusion of Color information and global information
Roy et al. LGVTON: A landmark guided approach to virtual try-on
Chen et al. Learning continuous implicit representation for near-periodic patterns
CN105894494A (en) Three-line-array stereo aerial survey camera parallel spectrum band registration method based on GPU technology
CN117351078A (en) Target size and 6D gesture estimation method based on shape priori
Matusiak et al. Unbiased evaluation of keypoint detectors with respect to rotation invariance
Khoshelham et al. A Model‐Based Approach to Semi‐Automated Reconstruction of Buildings from Aerial Images

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication (application publication date: 20170721)