CN105427298A - Remote sensing image registration method based on anisotropic gradient scale space


Info

Publication number
CN105427298A
Authority
CN
China
Prior art keywords
remote sensing images
registration
scale space
Prior art date
Legal status
Granted
Application number
CN201510770880.5A
Other languages
Chinese (zh)
Other versions
CN105427298B (en)
Inventor
马文萍
闻泽联
焦李成
武越
马进
任琛
Current Assignee
Xidian University
Original Assignee
Xidian University
Priority date
Filing date
Publication date
Application filed by Xidian University
Priority to CN201510770880.5A
Publication of CN105427298A
Application granted
Publication of CN105427298B

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 - Image acquisition modality
    • G06T 2207/10032 - Satellite or aerial image; Remote sensing

Landscapes

  • Image Analysis (AREA)

Abstract

The invention discloses a remote sensing image registration method based on an anisotropic gradient scale space, which mainly addresses the low correct matching rate obtained when the brightness of a remote sensing image pair changes nonlinearly to a large degree. The method is implemented as follows: (1) inputting a remote sensing image pair; (2) constructing the scale space of anisotropic diffusion; (3) computing the gradient magnitude images; (4) detecting feature points; (5) generating the main orientations of the feature points; (6) generating a descriptor for each feature point; (7) matching the feature points; (8) deleting wrongly matched feature point pairs; and (9) registering the reference image and the image to be registered. Because feature point detection, main-orientation generation and descriptor generation are all carried out on the gradient magnitude images of the anisotropic scale space, the method copes well with large nonlinear brightness changes between images and can be applied to the registration of complex multi-source and multispectral remote sensing images.

Description

Remote sensing image registration method based on an anisotropic gradient scale space
Technical field
The invention belongs to the technical field of image processing and further relates to a remote sensing image registration method based on an anisotropic gradient scale space. The invention can be applied to the registration of multispectral and multi-source remote sensing images acquired at different times.
Background art
Image registration is the process of geometrically aligning images of the same scene that share an overlapping region and were taken at different times, from different viewpoints, or with the same or different sensors. It is one of the image processing techniques that has developed rapidly in recent years and is widely used in many fields, such as aeronautics and space technology, image mosaicking, geographic information systems, image fusion, three-dimensional reconstruction, target recognition and change detection. With the development of remote sensing technology, the number of remote sensing images produced by sensors with different physical characteristics keeps increasing, so the joint use of various images for data extraction and analysis has become an important means in the remote sensing field. At the same time, because of the differences in physical characteristics and imaging modes between sensors, images with different geometric properties and different resolutions must be strictly registered before data application and data fusion.
Current remote sensing image registration methods fall into two classes: area-based (intensity-based) methods and feature-based methods. Commonly used area-based methods include cross-correlation (CC), invariant moments, FFT-based phase correlation and mutual information (MI). Intensity-based methods, however, have the following shortcomings: 1. they are sensitive to intensity changes, especially nonlinear illumination changes, which greatly degrades their performance; 2. their computational complexity is high; 3. they are sensitive to rotation, target deformation and occlusion. To overcome these shortcomings, feature-based registration methods have been proposed. Feature-based methods first extract features such as edges, corners, contours and region centers from the images, and then use the correspondence between features to determine the optimal alignment. These salient features greatly reduce the amount of image information to be processed, so the computation is lighter, the speed is faster, and the methods are robust to intensity changes. Point features are currently the most commonly used and most efficient type of feature. Corners, inflection points and crossings in a two-dimensional image are salient features; they are invariant to translation, rotation and scaling and are hardly affected by illumination conditions. Only about 0.05% of the pixels of an image are needed to represent the information of the whole image, so the information content is high, the computation is fast, and real-time processing becomes possible. For these reasons, feature-based registration methods have also been widely applied in the field of remote sensing image registration.
The classical SIFT algorithm has the advantage of being invariant to rotation, scaling, partial viewpoint change and illumination change for visible-light images. However, as the number of detected feature points increases, the time spent by SIFT on generating feature point main orientations and descriptors grows sharply, which limits its practical application. Because of its advantages, many improved algorithms based on SIFT have been proposed by scholars at home and abroad. Nevertheless, when SIFT-based methods are used to match remote sensing images, many incorrect matches appear and the correct matching rate drops rapidly. The reason is that differences in acquisition time, spectrum and sensor cause the pixel intensities of the same area to differ significantly between the images, and the intensity mapping between corresponding pixels may be linear, nonlinear, or even exhibit intensity reversal.
The paper "Uniform Robust Scale-Invariant Feature Matching for Optical Remote Sensing Images" published by Sedaghat et al. (IEEE Transactions on Geoscience and Remote Sensing, 2011, pages 4516-4527) proposes a UR-SIFT algorithm. This algorithm better controls the quantity, quality and distribution of the feature points extracted in the scale space and improves the registration accuracy for some remote sensing images with local transformations, but its shortcoming is that it cannot accurately register remote sensing image pairs whose brightness changes nonlinearly to a large degree.
The paper "A Novel Coarse-to-Fine Scheme for Automatic Image Registration Based on SIFT and Mutual Information" published by Gong et al. (IEEE Transactions on Geoscience and Remote Sensing, 2014, pages 4328-4338) discloses a coarse-to-fine registration method. The method first uses the SIFT algorithm to obtain an initial transformation between the images and then combines this initial transformation with a mutual information method to obtain a more accurate registration. Its shortcoming is that, being based on mutual information, its computational complexity is high, and its registration accuracy depends heavily on the initial solution obtained by SIFT; when the image brightness changes nonlinearly to a large degree, SIFT cannot provide a good initial solution, and the mutual information stage that relies on this initial solution cannot achieve accurate registration.
The patent application "Remote sensing image registration method based on mutual-information block selection and SIFT features" filed by Xidian University (application number CN201410379927, publication number CN104200461A) proposes a remote sensing image registration method based on SIFT and mutual information. The method proceeds as follows: randomly choose sub-image pairs from the reference remote sensing image and the remote sensing image to be registered and compute the mutual information of every pair; sort the mutual information values in descending order; choose the first n sub-image pairs with the largest mutual information; extract SIFT features from these n pairs and perform coarse matching; delete wrong matches and perform fine matching; compute the registration parameters and the mutual information value; and choose the registration parameters with the largest mutual information as the final registration parameters. Although this method accelerates image registration, its shortcoming is that feature point detection, main-orientation generation and descriptor generation still use the SIFT method, so the correct matching rate drops rapidly when the brightness of the remote sensing image pair changes nonlinearly to a large degree; moreover, the method uses the traditional nearest-neighbor to second-nearest-neighbor distance ratio as the matching criterion, so when the remote sensing images contain many repetitive features the number of correctly matched feature points also drops rapidly.
Summary of the invention
The object of the invention is to overcome the above shortcomings of the prior art and to propose a remote sensing image registration method based on an anisotropic gradient scale space, so as to improve the accuracy of feature matching and achieve the registration of remote sensing image pairs whose brightness changes nonlinearly to a large degree.
The idea of the invention is as follows: first, the anisotropic scale spaces of the reference remote sensing image and the remote sensing image to be registered are built according to the nonlinear diffusion principle; then the Harris operator is used to detect corners on the gradient magnitude images of these anisotropic scale spaces; next, feature point descriptors that are robust to nonlinear brightness changes are generated; finally, an improved feature point matching criterion is used to match the feature points and obtain the final registration result of the remote sensing image pair.
The steps of the present invention are as follows:
(1) inputting a reference remote sensing image and a remote sensing image to be registered;
(2) constructing the scale space of anisotropic diffusion:
(2a) computing the scale value of each layer of the anisotropic scale space of the reference image and of the image to be registered;
(2b) converting the scale values into time scale values;
(2c) applying Gaussian filtering with standard deviation σ_0 to the input reference image and image to be registered to obtain layer 0 of their anisotropic scale spaces;
(2d) initializing the layer index i of the anisotropic scale space to zero;
(2e) computing, according to the following formula, the diffusion coefficient matrix of layer i of the anisotropic scale space of the reference image and of the image to be registered:

c^i = \frac{1}{1 + \left( \left| \nabla I_g^i \right| / K \right)^2}

where c^i denotes the diffusion coefficient matrix of layer i of the anisotropic scale space of the reference image or of the image to be registered, I_g^i denotes layer i image I^i after Gaussian filtering with standard deviation 1, ∇I_g^i denotes the gradient of the filtered image, |·| denotes the modulus, and K denotes the contrast factor, taken as the gradient value at the 70th percentile of the histogram of the gradient magnitudes |∇I_g^i|;
(2f) performing row diffusion on layer i of the anisotropic scale space of the reference image and of the image to be registered according to the following formula:

I_1^{i+1} = \left( I - 2(t_{i+1} - t_i) \cdot A_1(I^i) \right)^{-1} I^i

where I_1^{i+1} denotes the image obtained after row diffusion of layer i image I^i, I denotes the identity matrix whose number of rows and columns equals the number of columns of the reference image or of the image to be registered, t_i and t_{i+1} denote the time scale values of layers i and i+1 of the anisotropic scale space, A_1(I^i) denotes the matrix encoding the diffusion coefficients c^i of layer i along the row direction, I^i denotes layer i of the anisotropic scale space, and (·)^{-1} denotes matrix inversion;
(2g) performing column diffusion on layer i of the anisotropic scale space of the reference image and of the image to be registered according to the following formula:

I_2^{i+1} = \left( I - 2(t_{i+1} - t_i) \cdot A_2(I^i) \right)^{-1} I^i

where I_2^{i+1} denotes the image obtained after column diffusion of layer i image I^i, I denotes the identity matrix whose number of rows and columns equals the number of rows of the reference image or of the image to be registered, t_i and t_{i+1} denote the time scale values of layers i and i+1 of the anisotropic scale space, A_2(I^i) denotes the matrix encoding the diffusion coefficients c^i of layer i along the column direction, I^i denotes layer i of the anisotropic scale space, and (·)^{-1} denotes matrix inversion;
(2h) computing layer i+1 of the anisotropic scale space of the reference image and of the image to be registered according to the following formula:

I^{i+1} = \frac{1}{2} \left( I_1^{i+1} + I_2^{i+1} \right)

where I^{i+1} denotes layer i+1 of the anisotropic scale space of the reference image or of the image to be registered, I_1^{i+1} denotes the result of row diffusion of layer i, and I_2^{i+1} denotes the result of column diffusion of layer i;
(2i) judging whether i ≥ N−2 holds; if it holds, the anisotropic scale spaces of the reference image and of the image to be registered have been obtained; otherwise, setting i = i+1 and returning to step (2e); here i denotes the layer index of the anisotropic scale space and N denotes the total number of layers of the anisotropic scale space;
(3) computing the gradient magnitude images:
using the Sobel operator, computing the gradient magnitude image of each layer of the anisotropic scale space of the reference image and of the image to be registered;
(4) computing the differences of the gradient magnitude images:
(4a) using the Sobel operator, computing the difference of each gradient magnitude layer along the positive x-axis according to the following formula:

GI_x^n = \begin{bmatrix} -1 & 0 & 1 \\ -2 & 0 & 2 \\ -1 & 0 & 1 \end{bmatrix} \oplus GI^n

where GI_x^n denotes the difference of the gradient magnitude layer GI^n along the positive x-axis (horizontal, pointing right), GI^n denotes layer n of the gradient magnitude image of the anisotropic scale space of the reference image or of the image to be registered, ⊕ denotes the correlation operation, n = 0, 1, ..., N−1, and N denotes the total number of layers of the anisotropic scale space;
(4b) using the Sobel operator, computing the difference of each gradient magnitude layer along the positive y-axis according to the following formula:

GI_y^n = \begin{bmatrix} -1 & -2 & -1 \\ 0 & 0 & 0 \\ 1 & 2 & 1 \end{bmatrix} \oplus GI^n

where GI_y^n denotes the difference of the gradient magnitude layer GI^n along the positive y-axis (vertical, pointing down), GI^n denotes layer n of the gradient magnitude image of the anisotropic scale space of the reference image or of the image to be registered, ⊕ denotes the correlation operation, n = 0, 1, ..., N−1, and N denotes the total number of layers of the anisotropic scale space;
(5) computing the gradient magnitude of the gradient magnitude images according to the following formula:

GGI^n = \sqrt{ (GI_x^n)^2 + (GI_y^n)^2 }

where GGI^n denotes the gradient magnitude of the gradient magnitude layer GI^n, GI_x^n and GI_y^n denote its differences along the positive x-axis and the positive y-axis, √ denotes the square-root operation, n = 0, 1, ..., N−1, and N denotes the total number of layers of the anisotropic scale space;
(6) computing the gradient angle of the gradient magnitude images according to the following formula:

AGGI^n = \arctan\left( GI_y^n / GI_x^n \right)

where AGGI^n denotes the gradient angle of the gradient magnitude layer GI^n, GI_x^n and GI_y^n denote its differences along the positive x-axis and the positive y-axis, arctan(·) denotes the four-quadrant arctangent, n = 0, 1, ..., N−1, and N denotes the total number of layers of the anisotropic scale space;
(7) detecting feature points:
using the Harris corner detector, detecting feature points on the gradient magnitude images of the anisotropic scale spaces of the reference image and of the image to be registered to obtain the feature point sets

C_{IR} = \{ c_{IR}^1, c_{IR}^2, \ldots, c_{IR}^R \}, \quad C_{IS} = \{ c_{IS}^1, c_{IS}^2, \ldots, c_{IS}^S \}

where C_IR denotes the set of feature points detected on the gradient magnitude image of the anisotropic scale space of the reference image, C_IS denotes the set of feature points detected on the gradient magnitude image of the anisotropic scale space of the image to be registered, R denotes the total number of feature points detected on the reference image, and S denotes the total number of feature points detected on the image to be registered;
(8) generating the main orientations of the feature points:
(8a) dividing the horizontal orientation angle range [0, 2π] into 36 equal intervals;
(8b) for each feature point, using the gradient angles AGGI^n(X) of the pixels in the circular neighborhood of radius 6σ_n around it to determine the angular interval to which each pixel of the circular region belongs, where σ_n denotes the scale value of layer n of the anisotropic scale space of the reference image or of the image to be registered, AGGI^n(X) denotes the gradient angle of the pixel at position X in the circular neighborhood, n = 0, 1, ..., N−1, and N denotes the total number of layers of the anisotropic scale space;
(8c) accumulating the gradient magnitudes GGI^n(X) of all pixels falling into each angular interval to form a gradient orientation histogram, where GGI^n(X) denotes the gradient magnitude of the pixel at position X in the circular neighborhood;
(8d) taking the orientations whose histogram values are greater than 0.8 times the maximum value as the main orientations of the feature point;
(9) generating the feature point descriptors:
(9a) rotating the circular neighborhood of radius ρ around a feature point counterclockwise by θ_m; dividing the neighborhood radially into 3 intervals, with the inner circle of radius 0.25ρ, the middle circle of radius 0.73ρ and the outer circle of radius ρ, and dividing [0, 2π] into 8 angular intervals; keeping the inner circle as a whole, the circular neighborhood around the feature point is thus divided into 17 subregions of equal area at different positions, where ρ = 12σ_n, σ_n denotes the scale value of layer n of the anisotropic scale space at which the feature point lies, n = 0, 1, ..., N−1, N denotes the total number of layers of the anisotropic scale space, and θ_m denotes the main orientation of the feature point;
(9b) converting the Cartesian coordinates of the pixels in the circular neighborhood into log-polar coordinates, with the angular axis horizontal to the right and divided into 8 intervals within [0, 2π], and the log-radius axis vertical downward and non-uniformly divided into 3 intervals, where ρ denotes the radius of the circular neighborhood around the feature point;
(9c) for every subregion of the log-polar grid, computing a gradient orientation histogram from the gradient magnitudes GGI^n(X) and gradient angles AGGI^n(X) of all its pixels, the angle range [0, 2π] being divided into 8 intervals; each subregion thus yields an 8-dimensional orientation vector, and concatenating the vectors of the 17 subregions in order yields a 136-dimensional feature point descriptor, where GGI^n(X) and AGGI^n(X) denote the gradient magnitude and gradient angle of the pixel at position X in the circular neighborhood;
(9d) using the same procedure as steps (9a), (9b) and (9c), generating the descriptor set D_IR of the feature point set C_IR and the descriptor set D_IS of the feature point set C_IS, D_{IR} = \{ d_{IR}^1, d_{IR}^2, \ldots, d_{IR}^{L_1} \}, D_{IS} = \{ d_{IS}^1, d_{IS}^2, \ldots, d_{IS}^{L_2} \}, where D_IR and D_IS denote the descriptor sets of the feature points detected on the gradient magnitude images of the anisotropic scale spaces of the reference image and of the image to be registered, and L_1 and L_2 denote the corresponding total numbers of descriptors;
(10) matching the feature points:
(10a) initializing the descriptor index k of the feature points of the image to be registered to 1;
(10b) computing, according to the following formula, the vector formed by the Euclidean distances between the k-th descriptor of the image to be registered and the descriptor of every feature point of the reference image:

d_k = [ d_{k,1}, d_{k,2}, \ldots, d_{k,j}, \ldots, d_{k,L_1} ], \quad d_{k,j} = \left\| d_{IS}^k - d_{IR}^j \right\|

where d_k denotes the vector of Euclidean distances between descriptor d_IS^k of the image to be registered and every descriptor of the reference image, d_{k,j} denotes the Euclidean distance between descriptor d_IS^k and descriptor d_IR^j of the reference image, d_IS^k denotes the k-th descriptor of the feature points of the image to be registered, d_IR^j denotes the j-th descriptor of the feature points of the reference image, j = 1, 2, ..., L_1, L_1 denotes the number of descriptors of the reference image, and ||·|| denotes the norm operation;
(10c) sorting the L_1 elements of the vector d_k in ascending order to obtain the vector s_k;
(10d) if s_{k,0}/s_{k,1} ≤ T_l, matching the feature point of descriptor d_IS^k with the reference feature point whose descriptor lies at Euclidean distance s_{k,0}; if T_l < s_{k,0}/s_{k,1} ≤ T_h, matching the feature point of descriptor d_IS^k with the reference feature points corresponding to the first E elements of s_k; if s_{k,0}/s_{k,1} > T_h, matching the feature point of descriptor d_IS^k with no feature point of the reference image;
where s_{k,0} denotes the first element of s_k, s_{k,1} denotes the second element of s_k, T_l and T_h denote the two thresholds on the nearest-neighbor to second-nearest-neighbor distance ratio, and E is an integer in [2, 3];
(10e) judging whether k ≥ L_2 holds; if it holds, the feature point matching relation C_{IR,IS} between the image to be registered and the reference image has been obtained, and step (11) is performed; otherwise, setting k = k+1 and returning to step (10b);
where k denotes the descriptor index of the feature points of the image to be registered, C_{IR,IS} denotes the set of matched pairs between the feature points detected on the gradient magnitude image of the anisotropic scale space of the reference image and those detected on the gradient magnitude image of the anisotropic scale space of the image to be registered, and L_3 denotes the total number of point pairs matched in the matching stage;
(11) deleting wrongly matched feature point pairs:
using the random sample consensus (RANSAC) algorithm, deleting the wrongly matched point pairs and obtaining the geometric transformation between the image to be registered and the reference image;
(12) registering the reference image and the image to be registered:
according to the geometric transformation, registering the reference image and the image to be registered using bilinear interpolation.
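Steps (11) and (12) can be realised with standard tools; the sketch below assumes OpenCV (not named above) and a projective transformation model, with RANSAC rejecting mismatched pairs and bilinear resampling producing the registered image.

```python
import numpy as np
import cv2  # assumed tool choice; the text above does not name a library

def register_pair(ref_img, sen_img, matches, ref_pts, sen_pts):
    """Sketch of steps (11)-(12): RANSAC removes wrongly matched pairs and the
    image to be registered is resampled onto the reference frame with bilinear
    interpolation. A projective transformation model is an assumption here."""
    src = np.float32([sen_pts[k] for k, j in matches])   # points in the image to be registered
    dst = np.float32([ref_pts[j] for k, j in matches])   # corresponding reference-image points
    H, inlier_mask = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)
    registered = cv2.warpPerspective(
        sen_img, H, (ref_img.shape[1], ref_img.shape[0]),
        flags=cv2.INTER_LINEAR)                          # bilinear interpolation
    return H, inlier_mask, registered
```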
Compared with the prior art, the present invention has the following advantages:
First, because the invention detects feature points on the gradient magnitude images of the anisotropic scale space and uses the gradient magnitude and gradient angle of these images to compute the main orientations and descriptors of the feature points, it overcomes the inability of the prior art to cope with large nonlinear brightness changes between remote sensing images, and thus increases the correct matching rate of the remote sensing image pair.
Second, because the invention does not use a Gaussian weighting window when generating the main orientations and descriptors of the feature points, it overcomes the high computational complexity of the prior art in these stages and thus improves the efficiency of the algorithm.
Third, because the invention sets two thresholds on the nearest-neighbor to second-nearest-neighbor distance ratio in the feature point matching stage, it overcomes the small number of correctly matched feature points obtained by the prior art with a single threshold, and thus increases the number of correct matches and the registration accuracy.
Brief description of the drawings
Fig. 1 is the flow chart of the present invention;
Fig. 2 is a schematic diagram of the prior-art partition of the circular neighborhood around a feature point and of the log-polar grid;
Fig. 3 is the flow chart of the random sample consensus (RANSAC) algorithm of step 11 of the present invention;
Fig. 4 shows the registration result of the first multispectral remote sensing image pair in the simulation experiments of the present invention;
Fig. 5 shows the registration result of the second multispectral remote sensing image pair in the simulation experiments of the present invention;
Fig. 6 shows the registration result of the multi-source remote sensing image pair in the simulation experiments of the present invention.
Detailed description of the embodiments
The present invention is described in further detail below with reference to the accompanying drawings.
With reference to Fig. 1, the steps of the present invention are as follows.
Step 1, input the remote sensing image pair.
Input the reference remote sensing image and the remote sensing image to be registered.
Step 2, construct the scale space of anisotropic diffusion.
(2a) Compute the scale value of each layer of the anisotropic scale space of the reference image and of the image to be registered:

\sigma_e(m, u) = \sigma_0 \cdot 2^{\, m + u/L}

where σ_e denotes the scale value of layer e of the anisotropic scale space of the reference image or of the image to be registered, σ_0 denotes the base scale value, the anisotropic scale space has G octaves (groups) of L layers each, m denotes the octave index, m = 0, 1, ..., G−1, u denotes the layer index within an octave, u = 0, 1, ..., L−1, e = 0, 1, 2, ..., N−1, and N denotes the total number of layers of the anisotropic scale space.
(2b) Convert the scale values into time scale values:

t_e = \frac{1}{2} \sigma_e^2(m, u)

where t_e denotes the time scale value of layer e of the anisotropic scale space of the reference image or of the image to be registered, σ_e denotes the scale value of layer e, e = 0, 1, 2, ..., N−1, N denotes the total number of layers of the anisotropic scale space, m denotes the octave index, m = 0, 1, ..., G−1, G denotes the total number of octaves, u denotes the layer index within an octave, u = 0, 1, ..., L−1, and L denotes the number of layers per octave.
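As an illustration of steps (2a) and (2b), the following Python sketch computes the per-layer scale values and their time-scale equivalents; the base scale σ_0 = 1.6 and the choice of G = 4 octaves with L = 4 layers each are assumed example values, not values fixed by the method.

```python
import numpy as np

def scale_and_time_values(sigma0=1.6, G=4, L=4):
    """Sketch of steps (2a)-(2b): the scale value of every layer of the
    anisotropic scale space and its time-scale equivalent. sigma0, G and L
    are example values, not values fixed by the method."""
    sigmas = np.array([sigma0 * 2.0 ** (m + u / L)   # sigma_e(m, u) = sigma_0 * 2^(m + u/L)
                       for m in range(G) for u in range(L)])
    times = 0.5 * sigmas ** 2                        # t_e = sigma_e^2 / 2
    return sigmas, times
```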
(2c) Apply Gaussian filtering with standard deviation σ_0 to the input reference image and image to be registered to obtain layer 0 of their anisotropic scale spaces.
(2d) Initialize the layer index i of the anisotropic scale space to zero.
(2e) Compute, according to the following formula, the diffusion coefficient matrix of layer i of the anisotropic scale space of the reference image and of the image to be registered:

c^i = \frac{1}{1 + \left( \left| \nabla I_g^i \right| / K \right)^2}

where c^i denotes the diffusion coefficient matrix of layer i of the anisotropic scale space of the reference image or of the image to be registered, I_g^i denotes layer i image I^i after Gaussian filtering with standard deviation 1, ∇I_g^i denotes the gradient of the filtered image, |·| denotes the modulus, and K denotes the contrast factor, taken as the gradient value at the 70th percentile of the histogram of the gradient magnitudes |∇I_g^i|.
(2f) Perform one-dimensional diffusion along the rows of layer i image I^i of the anisotropic scale space of the reference image and of the image to be registered:
First step: compute the matrix A_{1,h}(I^i) that encodes row h of the diffusion coefficient matrix c^i of layer i,
where A_{1,h}(I^i) denotes the matrix encoding row h of the diffusion coefficient matrix c^i of layer i image I^i, c^i denotes the diffusion coefficient matrix of layer i, c^i_{h,J} denotes the element in row h and column J of c^i, J = 0, 1, ..., Q−1, the column coordinates of the left and right neighbors of the element in row h and column w of c^i are used to build the matrix, w = 0, 1, ..., Q−1, and Q denotes the number of columns of the reference image or of the image to be registered.
Second step: solve the anisotropic row diffusion equation with the Thomas algorithm according to the following formula:

I_{1,h}^{i+1} = \left( I - 2(t_{i+1} - t_i) \cdot A_{1,h}(I^i) \right)^{-1} I_h^i

where I_{1,h}^{i+1} denotes the result of diffusing row h of layer i image I^i of the anisotropic scale space, I denotes the identity matrix whose size equals the number of columns of the input reference image or image to be registered, t_i and t_{i+1} denote the time scale values of layers i and i+1 of the anisotropic scale space, A_{1,h}(I^i) denotes the matrix encoding row h of the diffusion coefficient matrix c^i, I_h^i denotes row h of layer i image I^i, and (·)^{-1} denotes matrix inversion.
Third step: repeat the first and second steps of this step for h = 0, 1, ..., P−1 to obtain I_1^{i+1}, where I_1^{i+1} denotes the result of the one-dimensional diffusion along the rows of layer i image I^i, and P denotes the number of rows of the reference image or of the image to be registered.
(2g) Perform one-dimensional diffusion along the columns of layer i image I^i of the anisotropic scale space of the reference image and of the image to be registered:
First step: compute the matrix A_{2,v}(I^i) that encodes column v of the diffusion coefficient matrix c^i of layer i,
where A_{2,v}(I^i) denotes the matrix encoding column v of the diffusion coefficient matrix c^i of layer i image I^i, c^i denotes the diffusion coefficient matrix of layer i, c^i_{M,v} denotes the element in row M and column v of c^i, M = 0, 1, ..., P−1, the row coordinates of the upper and lower neighbors of the element in row f and column v of c^i are used to build the matrix, f = 0, 1, ..., P−1, and P denotes the number of rows of the reference image or of the image to be registered.
Second step: solve the anisotropic column diffusion equation with the Thomas algorithm according to the following formula:

I_{2,v}^{i+1} = \left( I - 2(t_{i+1} - t_i) \cdot A_{2,v}(I^i) \right)^{-1} I_v^i

where I_{2,v}^{i+1} denotes the result of diffusing column v of layer i image I^i of the anisotropic scale space, I denotes the identity matrix whose size equals the number of rows of the input reference image or image to be registered, t_i and t_{i+1} denote the time scale values of layers i and i+1 of the anisotropic scale space, A_{2,v}(I^i) denotes the matrix encoding column v of the diffusion coefficient matrix c^i, I_v^i denotes column v of layer i image I^i, and (·)^{-1} denotes matrix inversion.
Third step: repeat the first and second steps of this step for v = 0, 1, ..., Q−1 to obtain I_2^{i+1}, where I_2^{i+1} denotes the result of the one-dimensional diffusion along the columns of layer i image I^i, and Q denotes the number of columns of the reference image or of the image to be registered.
(2h) Compute layer i+1 of the anisotropic scale space of the reference image and of the image to be registered according to the following formula:

I^{i+1} = \frac{1}{2} \left( I_1^{i+1} + I_2^{i+1} \right)

where I^{i+1} denotes layer i+1 of the anisotropic scale space of the reference image or of the image to be registered, I_1^{i+1} denotes the result of row diffusion of layer i, and I_2^{i+1} denotes the result of column diffusion of layer i.
(2i) Judge whether i ≥ N−2 holds; if it holds, end the iteration and obtain the anisotropic scale spaces of the reference image and of the image to be registered; otherwise, set i = i+1 and return to step (2e); here i denotes the layer index of the anisotropic scale space and N denotes the total number of layers of the anisotropic scale space.
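The following Python sketch illustrates one update of the anisotropic scale space, steps (2e) to (2h), assuming an additive-operator-splitting style discretisation in which SciPy's banded solver stands in for the Thomas algorithm; the averaging of neighbouring diffusion coefficients and the helper names are illustrative assumptions rather than the exact matrices A_{1,h} and A_{2,v} described above.

```python
import numpy as np
from scipy.ndimage import gaussian_filter
from scipy.linalg import solve_banded

def diffusion_coefficient(I, K_percentile=70):
    """Step (2e): Perona-Malik style coefficient; K is the 70th percentile of
    the gradient magnitudes of the Gaussian-smoothed layer (std 1)."""
    Ig = gaussian_filter(I, sigma=1.0)
    gy, gx = np.gradient(Ig)
    mag = np.hypot(gx, gy)
    K = np.percentile(mag[mag > 0], K_percentile) if np.any(mag > 0) else 1.0
    return 1.0 / (1.0 + (mag / K) ** 2)

def implicit_1d(line, c_line, tau):
    """Solve (I - 2*tau*A) x = line for one row or column, A being the
    tridiagonal 1-D diffusion matrix built from the coefficients c_line."""
    n = len(line)
    w = 0.5 * (c_line[:-1] + c_line[1:])      # assumed averaging of neighbouring coefficients
    ab = np.zeros((3, n))                     # banded storage for solve_banded
    ab[0, 1:] = -2.0 * tau * w                # upper diagonal
    ab[2, :-1] = -2.0 * tau * w               # lower diagonal
    ab[1, :] = 1.0
    ab[1, :-1] += 2.0 * tau * w               # main diagonal, right-neighbour term
    ab[1, 1:] += 2.0 * tau * w                # main diagonal, left-neighbour term
    return solve_banded((1, 1), ab, line)

def diffusion_step(I, t_i, t_next):
    """Steps (2e)-(2h): row diffusion, column diffusion, then averaging."""
    c = diffusion_coefficient(I)
    tau = t_next - t_i
    I1 = np.array([implicit_1d(I[h, :], c[h, :], tau) for h in range(I.shape[0])])
    I2 = np.array([implicit_1d(I[:, v], c[:, v], tau) for v in range(I.shape[1])]).T
    return 0.5 * (I1 + I2)                    # I^{i+1} = (I_1^{i+1} + I_2^{i+1}) / 2
```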
Step 3, compute the gradient magnitude images.
Using the Sobel operator, compute the gradient magnitude image of each layer of the anisotropic scale space of the reference image and of the image to be registered.
Compute, according to the following formula, the difference of each layer of the anisotropic scale space along the positive x-axis:

I_x^n = \begin{bmatrix} -1 & 0 & 1 \\ -2 & 0 & 2 \\ -1 & 0 & 1 \end{bmatrix} \oplus I^n

where I^n denotes layer n of the anisotropic scale space of the reference image or of the image to be registered, I_x^n denotes the difference of the image I^n, whose coordinate origin is at the upper-left corner, along the positive x-axis (horizontal, pointing right), ⊕ denotes the correlation operation, n = 0, 1, ..., N−1, and N denotes the total number of layers of the anisotropic scale space.
Compute, according to the following formula, the difference of each layer of the anisotropic scale space along the positive y-axis:

I_y^n = \begin{bmatrix} -1 & -2 & -1 \\ 0 & 0 & 0 \\ 1 & 2 & 1 \end{bmatrix} \oplus I^n

where I^n denotes layer n of the anisotropic scale space of the input reference image or image to be registered, I_y^n denotes the difference of the image I^n, whose coordinate origin is at the upper-left corner, along the positive y-axis (vertical, pointing down), ⊕ denotes the correlation operation, n = 0, 1, ..., N−1, and N denotes the total number of layers of the anisotropic scale space.
Compute, according to the following formula, the gradient magnitude of each layer of the anisotropic scale space:

GI^n = \sqrt{ (I_x^n)^2 + (I_y^n)^2 }

where GI^n denotes the gradient magnitude of layer n image I^n of the anisotropic scale space, I_x^n and I_y^n denote its differences along the positive x-axis and the positive y-axis, n = 0, 1, ..., N−1, and N denotes the total number of layers of the anisotropic scale space.
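A minimal Python sketch of step 3 is given below, assuming SciPy correlation with "nearest" boundary handling (the boundary treatment is not specified above).

```python
import numpy as np
from scipy.ndimage import correlate

SOBEL_X = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
SOBEL_Y = np.array([[-1, -2, -1], [0, 0, 0], [1, 2, 1]], dtype=float)

def gradient_magnitude_layer(I_n):
    """Step 3: Sobel differences I_x^n, I_y^n of one scale-space layer and the
    gradient magnitude GI^n (origin at the top-left corner, y pointing down)."""
    ix = correlate(I_n, SOBEL_X, mode='nearest')
    iy = correlate(I_n, SOBEL_Y, mode='nearest')
    return np.hypot(ix, iy)                    # GI^n = sqrt(I_x^2 + I_y^2)

# gradient magnitude image of the whole scale space (one result per layer)
# GI = [gradient_magnitude_layer(layer) for layer in scale_space_layers]
```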
Step 4, compute the differences of the gradient magnitude images.
Using the Sobel operator, compute, according to the following formula, the difference of each gradient magnitude layer along the positive x-axis:

GI_x^n = \begin{bmatrix} -1 & 0 & 1 \\ -2 & 0 & 2 \\ -1 & 0 & 1 \end{bmatrix} \oplus GI^n

where GI_x^n denotes the difference of the gradient magnitude layer GI^n along the positive x-axis (horizontal, pointing right), GI^n denotes layer n of the gradient magnitude image of the anisotropic scale space of the reference image or of the image to be registered, ⊕ denotes the correlation operation, n = 0, 1, ..., N−1, and N denotes the total number of layers of the anisotropic scale space.
Using the Sobel operator, compute, according to the following formula, the difference of each gradient magnitude layer along the positive y-axis:

GI_y^n = \begin{bmatrix} -1 & -2 & -1 \\ 0 & 0 & 0 \\ 1 & 2 & 1 \end{bmatrix} \oplus GI^n

where GI_y^n denotes the difference of the gradient magnitude layer GI^n along the positive y-axis (vertical, pointing down), GI^n denotes layer n of the gradient magnitude image of the anisotropic scale space of the input reference image or image to be registered, ⊕ denotes the correlation operation, n = 0, 1, ..., N−1, and N denotes the total number of layers of the anisotropic scale space.
Step 5, compute the gradient magnitude of the gradient magnitude images.
Compute the gradient magnitude of the gradient magnitude images according to the following formula:

GGI^n = \sqrt{ (GI_x^n)^2 + (GI_y^n)^2 }

where GGI^n denotes the gradient magnitude of the gradient magnitude layer GI^n, GI_x^n and GI_y^n denote its differences along the positive x-axis and the positive y-axis, √ denotes the square-root operation, n = 0, 1, ..., N−1, and N denotes the total number of layers of the anisotropic scale space.
Step 6, compute the gradient angle of the gradient magnitude images.
Compute the gradient angle of the gradient magnitude images according to the following formula:

AGGI^n = \arctan\left( GI_y^n / GI_x^n \right)

where AGGI^n denotes the gradient angle of the gradient magnitude layer GI^n, GI_x^n and GI_y^n denote its differences along the positive x-axis and the positive y-axis, arctan(·) denotes the four-quadrant arctangent, n = 0, 1, ..., N−1, and N denotes the total number of layers of the anisotropic scale space.
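Steps 4 to 6 repeat the same Sobel differencing on the gradient magnitude layers; a self-contained sketch, assuming the angles are mapped into [0, 2π) to match the orientation histograms used later, is given below.

```python
import numpy as np
from scipy.ndimage import correlate

def gradient_of_magnitude(GI_n):
    """Steps 4-6: Sobel differences GI_x^n and GI_y^n of the gradient magnitude
    layer GI^n, its gradient magnitude GGI^n and its gradient angle AGGI^n."""
    sx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    sy = np.array([[-1, -2, -1], [0, 0, 0], [1, 2, 1]], dtype=float)
    gx = correlate(GI_n, sx, mode='nearest')          # GI_x^n
    gy = correlate(GI_n, sy, mode='nearest')          # GI_y^n
    ggi = np.hypot(gx, gy)                            # GGI^n
    aggi = np.mod(np.arctan2(gy, gx), 2 * np.pi)      # AGGI^n, mapped into [0, 2*pi)
    return ggi, aggi
```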
Step 7, detect feature points.
Using the Harris corner detector, detect feature points on the gradient magnitude images of the anisotropic scale spaces of the reference image and of the image to be registered.
Compute, according to the following formula, the Harris matrix of each pixel of each gradient magnitude layer:

u(X, \sigma_n) = \sigma_n^2 \, g_{2\sigma_n} * \begin{bmatrix} (GI_x^n)^2 & GI_x^n \cdot GI_y^n \\ GI_y^n \cdot GI_x^n & (GI_y^n)^2 \end{bmatrix}

where u(X, σ_n) denotes the Harris matrix of the pixel at position X of the gradient magnitude layer GI^n, X denotes the pixel coordinates, σ_n denotes the scale value of layer n of the anisotropic scale space, g_{2σ_n} denotes a Gaussian function whose standard deviation along both the x-axis and the y-axis is 2σ_n, * denotes convolution, GI_x^n and GI_y^n denote the differences of the gradient magnitude layer GI^n along the positive x-axis and the positive y-axis, n = 0, 1, ..., N−1, and N denotes the total number of layers of the anisotropic scale space.
Compute the Harris function of each pixel of each gradient magnitude layer:

R(X, \sigma_n) = \det\left( u(X, \sigma_n) \right) - D \cdot \operatorname{tr}\left( u(X, \sigma_n) \right)^2

where R(X, σ_n) denotes the Harris function of the pixel at position X of the gradient magnitude layer GI^n, u(X, σ_n) denotes the Harris matrix of that pixel, X denotes the pixel coordinates, σ_n denotes the scale value of layer n, det(·) denotes the determinant of a matrix, D denotes a constant, tr(·) denotes the trace (sum of the main-diagonal elements) of a matrix, n = 0, 1, ..., N−1, and N denotes the total number of layers of the anisotropic scale space.
Take the points for which R(X, σ_n) > T_th and R(X, σ_n) is greater than the Harris function of every point in the 8-neighborhood as feature points, obtaining the sets of feature points detected on the gradient magnitude images of the anisotropic scale spaces of the reference image and of the image to be registered:

C_{IR} = \{ c_{IR}^1, c_{IR}^2, \ldots, c_{IR}^R \}, \quad C_{IS} = \{ c_{IS}^1, c_{IS}^2, \ldots, c_{IS}^S \}

where R(X, σ_n) denotes the Harris function of the pixel at position X of the gradient magnitude layer GI^n, T_th denotes the Harris function threshold, X denotes the pixel coordinates, σ_n denotes the scale value of layer n, C_IR denotes the set of feature points detected on the gradient magnitude image of the anisotropic scale space of the reference image, C_IS denotes the set of feature points detected on the gradient magnitude image of the anisotropic scale space of the image to be registered, R denotes the total number of feature points detected on the reference image, S denotes the total number of feature points detected on the image to be registered, n = 0, 1, ..., N−1, and N denotes the total number of layers of the anisotropic scale space.
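A sketch of the scale-adapted Harris detection of step 7 on one gradient magnitude layer follows; the constant D = 0.04, the default threshold rule and the "nearest" boundary handling are illustrative assumptions.

```python
import numpy as np
from scipy.ndimage import correlate, gaussian_filter, maximum_filter

SOBEL_X = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
SOBEL_Y = np.array([[-1, -2, -1], [0, 0, 0], [1, 2, 1]], dtype=float)

def harris_points(GI_n, sigma_n, D=0.04, T_th=None):
    """Step 7 sketch: scale-adapted Harris response on one gradient magnitude
    layer GI^n. D = 0.04 and the default threshold rule are assumptions; the
    text above only names them as a constant D and a threshold T_th."""
    gx = correlate(GI_n, SOBEL_X, mode='nearest')        # GI_x^n
    gy = correlate(GI_n, SOBEL_Y, mode='nearest')        # GI_y^n
    s = 2.0 * sigma_n                                    # integration scale 2*sigma_n
    a = sigma_n ** 2 * gaussian_filter(gx * gx, s)       # entries of u(X, sigma_n)
    b = sigma_n ** 2 * gaussian_filter(gx * gy, s)
    c = sigma_n ** 2 * gaussian_filter(gy * gy, s)
    R = (a * c - b * b) - D * (a + c) ** 2               # det(u) - D * tr(u)^2
    if T_th is None:
        T_th = 0.01 * R.max()                            # assumed threshold choice
    is_peak = R == maximum_filter(R, size=3)             # greater than its 8-neighbourhood
    ys, xs = np.nonzero((R > T_th) & is_peak)
    return list(zip(xs.tolist(), ys.tolist()))           # feature point coordinates (x, y)
```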
Step 8, generate the main orientations of the feature points.
Divide the horizontal orientation angle range [0, 2π] into 36 equal intervals.
For each feature point, use the gradient angles AGGI^n(X) of the pixels inside the circular neighborhood of radius 6σ_n around it to determine the angular interval to which each pixel of the circular region belongs, where σ_n denotes the scale value of layer n of the anisotropic scale space of the reference image or of the image to be registered, AGGI^n(X) denotes the gradient angle of the pixel at position X in the circular neighborhood, n = 0, 1, ..., N−1, and N denotes the total number of layers of the anisotropic scale space.
Accumulate the gradient magnitudes GGI^n(X) of all pixels falling into each angular interval to form a gradient orientation histogram, where GGI^n(X) denotes the gradient magnitude of the pixel at position X in the circular neighborhood.
Take the orientations whose histogram values are greater than 0.8 times the maximum value as the main orientations of the feature point.
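A sketch of the main-orientation computation of step 8 for a single feature point at integer pixel position (x0, y0); the per-pixel loop is written for clarity rather than speed.

```python
import numpy as np

def main_orientations(x0, y0, sigma_n, GGI_n, AGGI_n, bins=36):
    """Step 8 sketch: a 36-bin orientation histogram over the circular
    neighbourhood of radius 6*sigma_n; every bin above 0.8x the peak
    yields one main orientation of the feature point."""
    r = int(np.ceil(6 * sigma_n))
    h, w = GGI_n.shape
    hist = np.zeros(bins)
    for y in range(max(0, y0 - r), min(h, y0 + r + 1)):
        for x in range(max(0, x0 - r), min(w, x0 + r + 1)):
            if (x - x0) ** 2 + (y - y0) ** 2 <= r * r:
                ang = AGGI_n[y, x] % (2 * np.pi)
                k = int(ang / (2 * np.pi) * bins) % bins
                hist[k] += GGI_n[y, x]                  # accumulate gradient magnitudes
    peak = hist.max()
    return [(k + 0.5) * 2 * np.pi / bins                # bin centre taken as the orientation
            for k in range(bins) if peak > 0 and hist[k] > 0.8 * peak]
```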
Step 9, generating feature point descriptor.
(9a) Rotate the circular neighborhood of radius ρ around the feature point counterclockwise by θ_m degrees, obtaining the circular neighborhood of the same radius shown in Fig. 2(a). Along the radial direction the neighborhood is divided into 3 intervals: the inner circle radius is 0.25ρ, the middle circle radius is 0.73ρ, and the outer circle radius is ρ; along the angular direction [0, 2π] is divided into 8 intervals. The inner circle is taken as a whole, so the circular neighborhood around the feature point is divided into 17 sub-regions of equal area at different positions, wherein the value of ρ is 12σ_n, σ_n denotes the scale value of the n-th layer of the anisotropic metric space of the reference remote sensing image or of the remote sensing image subject to registration where the feature point lies, n = 0, 1, ..., N-1, N denotes the total number of layers of the anisotropic metric space of the reference remote sensing image and the remote sensing image subject to registration, and θ_m denotes the principal direction of the feature point.
(9b) Convert the Cartesian coordinates of the pixels in the circular neighborhood of the feature point to log-polar coordinates. The log-polar angular direction is horizontal to the right, the numbered regions of Fig. 2(a) corresponding to the numbered regions of Fig. 2(b), and the range [0, 2π] is divided into 8 intervals; the log-polar radial (log-length) direction is vertically downward and is non-uniformly divided into 3 intervals, wherein ρ denotes the radius of the circular neighborhood around the feature point.
(9c) For every sub-region of the log-polar grid, compute a gradient orientation histogram from the gradient amplitudes GGI_n(X) and gradient directions AGGI_n(X) of all pixels in that sub-region; each sub-region thus yields an 8-dimensional gradient direction vector, and concatenating the gradient direction vectors of the 17 sub-regions in order yields a 136-dimensional descriptor of the feature point. The angles of the gradient orientation histogram divide the range [0, 2π] into 8 intervals; GGI_n(X) denotes the gradient amplitude and AGGI_n(X) the gradient angle of the pixel at position X in the circular neighborhood around the feature point.
(9d) With the same procedure as steps (9a), (9b) and (9c), generate the descriptor set D_IR of the feature point set C_IR and the descriptor set D_IS of the feature point set C_IS, $D_{IR} = \{\, d_{IR}^1, d_{IR}^2, \ldots, d_{IR}^{L_1} \,\}$, $D_{IS} = \{\, d_{IS}^1, d_{IS}^2, \ldots, d_{IS}^{L_2} \,\}$, wherein D_IR denotes the descriptor set of the feature points detected on the gradient amplitude image of the anisotropic metric space of the reference remote sensing image, D_IS denotes the descriptor set of the feature points detected on the gradient amplitude image of the anisotropic metric space of the remote sensing image subject to registration, L_1 denotes the total number of descriptors of the feature points detected on the gradient amplitude image of the anisotropic metric space of the reference remote sensing image, and L_2 denotes the total number of descriptors of the feature points detected on the gradient amplitude image of the anisotropic metric space of the remote sensing image subject to registration.
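Step 9 can likewise be summarized in code. The sketch below is illustrative only; names such as log_polar_descriptor, the sampling loop, and the absence of any normalization are choices made here rather than patent text. It builds the 17-sub-region log-polar grid (one inner disc of radius 0.25ρ plus two rings split into 8 angular sectors) and accumulates an 8-bin orientation histogram per sub-region, giving the 136-dimensional descriptor.

import numpy as np

def log_polar_descriptor(ggi, aggi, x, y, sigma_n, theta_m):
    # 17 sub-regions: 1 inner disc (r <= 0.25*rho) + 2 rings x 8 angular sectors,
    # each holding an 8-bin orientation histogram -> 17 * 8 = 136 dimensions.
    rho = 12 * sigma_n
    radii = np.array([0.25 * rho, 0.73 * rho, rho])
    desc = np.zeros((17, 8))
    rows, cols = ggi.shape
    r_int = int(np.ceil(rho))
    for dy in range(-r_int, r_int + 1):
        for dx in range(-r_int, r_int + 1):
            r, c = x + dy, y + dx
            dist = np.hypot(dy, dx)
            if dist > rho or not (0 <= r < rows and 0 <= c < cols):
                continue
            # rotate the sample position by -theta_m (counterclockwise rotation of the patch)
            ang = (np.arctan2(dy, dx) - theta_m) % (2 * np.pi)
            ring = int(np.searchsorted(radii, dist))       # 0, 1 or 2
            if ring == 0:
                region = 0                                 # inner disc kept as a whole
            else:
                region = 1 + (ring - 1) * 8 + int(ang / (2 * np.pi) * 8) % 8
            grad_ang = (aggi[r, c] - theta_m) % (2 * np.pi)  # gradient angle relative to theta_m
            desc[region, int(grad_ang / (2 * np.pi) * 8) % 8] += ggi[r, c]
    return desc.ravel()                                    # 136-dimensional feature descriptor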
Step 10, Feature Points Matching.
Match the feature points detected on the gradient amplitude image of the anisotropic metric space of the reference remote sensing image with the feature points detected on the gradient amplitude image of the anisotropic metric space of the remote sensing image subject to registration:
(10a) Initialize the descriptor index k of the feature points of the remote sensing image subject to registration to 1.
(10b) According to the following formula, compute the vector formed by the Euclidean distances between the k-th descriptor of the feature points of the remote sensing image subject to registration and the descriptors of all feature points of the reference remote sensing image:
$d_k = [\, d_{k,1}, d_{k,2}, \ldots, d_{k,j}, \ldots, d_{k,L_1} \,]$
Wherein, d_k denotes the vector formed by the Euclidean distances between the k-th descriptor of the feature points of the remote sensing image subject to registration and each descriptor of the reference remote sensing image, d_{k,j} denotes the Euclidean distance between the k-th descriptor d_IS^k of the feature points of the remote sensing image subject to registration and the j-th descriptor d_IR^j of the feature points of the reference remote sensing image, d_IS^k denotes the k-th descriptor of the feature points of the remote sensing image subject to registration, d_IR^j denotes the j-th descriptor of the feature points of the reference remote sensing image, j = 1, 2, ..., L_1, L_1 denotes the number of descriptors of the feature points of the reference remote sensing image, and || · || denotes the norm operation.
(10c) Sort the L_1 elements of the vector d_k in ascending order to obtain the vector s_k.
(10d) If s_{k,0}/s_{k,1} ≤ T_l, the feature point corresponding to descriptor d_IS^k is matched with the feature point of the reference remote sensing image whose descriptor has Euclidean distance s_{k,0} to d_IS^k; if T_l < s_{k,0}/s_{k,1} ≤ T_h, the feature point corresponding to descriptor d_IS^k is matched with the feature points of the reference remote sensing image whose descriptors correspond to the first E elements of s_k; if s_{k,0}/s_{k,1} > T_h, the feature point corresponding to descriptor d_IS^k is not matched with any feature point of the reference remote sensing image;
Wherein, s_{k,0} denotes the first element of the vector s_k, s_{k,1} denotes the second element of the vector s_k, T_l and T_h denote the nearest-neighbor to second-nearest-neighbor distance ratio thresholds, and E takes an integer value in [2, 3].
(10e) Judge whether k ≥ L_2 holds; if it holds, the feature point matching relation C_IR,IS between the remote sensing image subject to registration and the reference remote sensing image is obtained, and step (11) is performed; otherwise, set k = k + 1 and go back to step (10b);
Wherein, k denotes the descriptor index of the feature points of the remote sensing image subject to registration, C_IR,IS denotes the set of matching relations between the feature points detected on the gradient amplitude image of the anisotropic metric space of the reference remote sensing image and the feature points detected on the gradient amplitude image of the anisotropic metric space of the remote sensing image subject to registration, and L_3 denotes the total number of point pairs matched in the feature point matching stage.
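A compact sketch of the dual-threshold nearest-neighbor matching of step 10, written in Python/NumPy under the same reading of step (10d) as above (the ratio of nearest to second-nearest distance is compared with T_l and T_h; when the ratio lies between the two thresholds the first E candidates are kept and left for RANSAC to prune). The function name and returned data layout are illustrative, not taken from the patent.

import numpy as np

def dual_threshold_match(D_IS, D_IR, T_l=0.6, T_h=0.97, E=2):
    # D_IS: (L2, 136) descriptors of the remote sensing image subject to registration.
    # D_IR: (L1, 136) descriptors of the reference remote sensing image.
    # Returns candidate pairs (k, j) of descriptor indices.
    pairs = []
    for k, d in enumerate(D_IS):
        dist = np.linalg.norm(D_IR - d, axis=1)   # vector d_k of Euclidean distances
        order = np.argsort(dist)                  # sorting d_k gives s_k
        s = dist[order]
        ratio = s[0] / (s[1] + 1e-12)             # nearest / second-nearest distance
        if ratio <= T_l:                          # unambiguous match
            pairs.append((k, int(order[0])))
        elif ratio <= T_h:                        # ambiguous: keep the first E candidates;
            for j in order[:E]:                   # RANSAC removes the wrong ones later
                pairs.append((k, int(j)))
        # ratio > T_h: no match for this feature point
    return pairs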
Step 11, the feature point pairs of deletion error coupling.
With reference to Fig. 3, use the random sample consensus (RANSAC) algorithm to delete the wrongly matched point pairs and obtain the geometric transformation relation from the remote sensing image subject to registration to the reference remote sensing image; the concrete steps are as follows:
(11a) Initialize the loop counter Num to 0, and initialize the optimal consensus set C to contain 0 matching point pairs.
(11b) Set Num = Num + 1 and judge whether Num > 1000 holds; if it holds, perform step (11i); otherwise, perform step (11c).
(11c) Randomly select 3 different matching feature point pairs from the feature point matching relation C_IR,IS;
Wherein, C_IR,IS denotes the set of matching relations between the feature points detected on the gradient amplitude image of the anisotropic metric space of the reference remote sensing image and the feature points detected on the gradient amplitude image of the anisotropic metric space of the remote sensing image subject to registration.
(11d) Using the affine transformation model, compute the transformation matrix T1 satisfied by the 3 different matching feature point pairs.
(11e) Compute the consensus set Con of point pairs in the feature point matching relation C_IR,IS that are consistent with the transformation matrix T1;
Wherein, C_IR,IS denotes the set of matching relations between the feature points detected on the gradient amplitude image of the anisotropic metric space of the reference remote sensing image and the feature points detected on the gradient amplitude image of the anisotropic metric space of the remote sensing image subject to registration.
(11f) Judge whether the number of point pairs contained in the consensus set Con is greater than the consensus-pair threshold TH; if it is, perform step (11g); otherwise, perform step (11b);
Wherein, TH denotes the consensus-pair threshold, its value is 0.1 times L_3, and L_3 denotes the total number of point pairs matched in the feature point matching stage.
(11g) Judge whether the number of point pairs contained in the consensus set Con is greater than the number of point pairs in the current optimal consensus set; if it is, perform step (11h); otherwise, perform step (11b).
(11h) Replace the current optimal consensus set C with the consensus set Con, use the affine transformation model to compute the transformation matrix T2 satisfied by the consensus set C, and perform step (11b).
(11i) Output the current optimal consensus set C and the transformation matrix T2.
The output optimal consensus set C contains the correctly matched feature point pairs of the reference remote sensing image and the remote sensing image subject to registration; the transformation matrix T2 is the obtained geometric transformation relation T from the remote sensing image subject to registration to the reference remote sensing image.
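Step 11 is a standard RANSAC loop with an affine model, and the sketch below mirrors steps (11a)-(11i). The inlier reprojection tolerance tol is not specified by the patent and is only an assumed value here; point coordinates are taken as (x, y) = (column, row), and the function names are illustrative.

import numpy as np

def estimate_affine(src, dst):
    # Least-squares affine transform mapping src (N, 2) points onto dst (N, 2):
    # [x' y'] = [x y 1] @ T with T of shape (3, 2).
    A = np.hstack([src, np.ones((len(src), 1))])
    T, *_ = np.linalg.lstsq(A, dst, rcond=None)
    return T

def ransac_affine(src, dst, n_iter=1000, tol=3.0):
    # src, dst: (L3, 2) coordinates of the candidate matches in the image subject
    # to registration and in the reference image. TH = 0.1 * L3 as in step (11f).
    L3 = len(src)
    TH = 0.1 * L3
    best_inliers = np.zeros(L3, dtype=bool)
    rng = np.random.default_rng(0)
    for _ in range(n_iter):                              # Num = 1 .. 1000
        idx = rng.choice(L3, size=3, replace=False)      # 3 distinct matching pairs
        T1 = estimate_affine(src[idx], dst[idx])
        proj = np.hstack([src, np.ones((L3, 1))]) @ T1
        err = np.linalg.norm(proj - dst, axis=1)
        inliers = err < tol                              # consensus set Con of T1
        if inliers.sum() > TH and inliers.sum() > best_inliers.sum():
            best_inliers = inliers                       # Con replaces the optimal set C
    if best_inliers.sum() < 3:
        raise RuntimeError("no consensus set found")
    T2 = estimate_affine(src[best_inliers], dst[best_inliers])
    return T2, best_inliers                              # transformation T and correct matches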
Step 12, register the reference image and the image subject to registration.
According to the geometric transformation relation T from the remote sensing image subject to registration to the reference remote sensing image, register the reference remote sensing image and the remote sensing image subject to registration using bilinear interpolation.
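A minimal sketch of the bilinear resampling of step 12, assuming the affine parameters T produced by the RANSAC sketch above (a (3, 2) matrix mapping coordinates of the image subject to registration, taken as (x, y) = (column, row), into the reference frame); the zero fill outside the image is an assumption made here.

import numpy as np

def warp_bilinear(sensed, T, out_shape):
    # Resample the image subject to registration into the reference coordinate
    # frame with bilinear interpolation.
    H, W = out_shape
    M = np.vstack([T.T, [0.0, 0.0, 1.0]])     # full 3x3 matrix: [x' y' 1]^T = M [x y 1]^T
    Minv = np.linalg.inv(M)                   # inverse mapping, reference -> sensed
    ys, xs = np.mgrid[0:H, 0:W].astype(float)
    src = Minv @ np.stack([xs.ravel(), ys.ravel(), np.ones(H * W)])
    sx, sy = src[0].reshape(H, W), src[1].reshape(H, W)
    x0, y0 = np.floor(sx).astype(int), np.floor(sy).astype(int)
    fx, fy = sx - x0, sy - y0
    valid = (x0 >= 0) & (y0 >= 0) & (x0 < sensed.shape[1] - 1) & (y0 < sensed.shape[0] - 1)
    x0c = np.clip(x0, 0, sensed.shape[1] - 2)
    y0c = np.clip(y0, 0, sensed.shape[0] - 2)
    out = ((1 - fx) * (1 - fy) * sensed[y0c, x0c] +
           fx * (1 - fy) * sensed[y0c, x0c + 1] +
           (1 - fx) * fy * sensed[y0c + 1, x0c] +
           fx * fy * sensed[y0c + 1, x0c + 1])            # blend of the four neighbours
    out[~valid] = 0.0                                     # pixels falling outside the sensed image
    return out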
The effect of the present invention is further described below in conjunction with the simulation figures.
1. Simulation conditions:
The simulation experiments of the present invention were carried out on an Intel(R) Core(TM) i3 CPU M380 2.53 GHz machine under the Windows 7 system, on the Matlab 2010a platform.
2. Simulation contents:
In the simulation experiments, the number of groups G of the metric space of the reference remote sensing images and the remote sensing images subject to registration is 3, and the number of layers per group L is 3, so the total number of layers N of the anisotropic metric space is 9; the base scale value σ_0 is 1.6; the input reference and sensed remote sensing image data are converted to floating-point values between 0 and 1; the Harris function threshold T_th is 0.005; the feature point matching thresholds T_l and T_h are set to 0.6 and 0.97 respectively; and the geometric transformation model between images is an affine transformation. The test remote sensing images input to the simulation experiments fall into two classes: (1) multispectral remote sensing image pairs, and (2) multi-source remote sensing image pairs.
Fig. 4(a) and Fig. 4(b) are the first multispectral remote sensing image pair P-A, wherein Fig. 4(a) is the reference image, of size 761 × 748 pixels, sensor type Landsat-7 ETM+, band 5, acquired on 2000/7/24; Fig. 4(b) is the image subject to registration, of size 761 × 748 pixels, sensor type Landsat 4-5 TM, band 3, acquired on 1999/7/6.
Fig. 5(a) and Fig. 5(b) are the second multispectral remote sensing image pair P-B, wherein Fig. 5(a) is the reference image, of size 761 × 748 pixels, sensor type Landsat-7 ETM+, band 3, acquired on 2003/4/12; Fig. 5(b) is the image subject to registration, of size 761 × 748 pixels, sensor type Landsat 4-5 TM, band 5, acquired on 2006/06/15.
Fig. 6(a) and Fig. 6(b) are the multi-source remote sensing image pair P-C, wherein Fig. 6(a) is the reference image, of size 800 × 800 pixels, sensor type ALOS-PALSAR, acquired on 2010/6/5; Fig. 6(b) is the image subject to registration, of size 800 × 800 pixels, sensor type Landsat ETM+, acquired on 1999/6/26.
The simulation experiments first compare against the classical SIFT algorithm, and also compare the dual-threshold nearest-neighbor matching criterion proposed herein with the classical single-threshold nearest-neighbor matching criterion. Each test image pair is therefore processed with two comparison methods in addition to the method proposed herein: the SIFT algorithm, and the Harris anisotropic gradient metric space + nearest-neighbor distance ratio matching + RANSAC algorithm (Harris-NDGSS-NNDR-RANSAC), abbreviated SIFT and HNNR respectively. The complete algorithm proposed herein is the Harris anisotropic gradient metric space + dual-threshold nearest-neighbor distance ratio matching + RANSAC algorithm (Harris-NDGSS-DNNDR-RANSAC), abbreviated HNDR. The three methods are made to produce, as far as possible, the same number of feature points on the same image pair; the simulation results are shown in the following table.
3. Registration performance assessment criteria:
The number N_c of correctly matched feature point pairs and the root-mean-square error RMSE are used as the criteria for assessing registration performance. N_c is the number of correctly matched feature point pairs obtained after applying the random sample consensus (RANSAC) algorithm.
The root-mean-square error RMSE is computed according to the following formula:
$RMSE = \sqrt{ \dfrac{1}{N_{GT}} \sum_{r=1}^{N_{GT}} \left[ (x_r - \tilde{x}_r)^2 + (y_r - \tilde{y}_r)^2 \right] }$
Wherein, x_r and y_r denote the row and column coordinates of the r-th point manually selected on the reference remote sensing image, \tilde{x}_r and \tilde{y}_r denote the row and column coordinates of the r-th point manually selected on the remote sensing image subject to registration after transformation by the geometric transformation relation T from the remote sensing image subject to registration to the reference remote sensing image, r = 1, 2, ..., N_GT, and N_GT denotes the total number of true matching point pairs manually selected from the reference remote sensing image and the remote sensing image subject to registration.
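The RMSE criterion can be computed directly from the formula above; a small sketch follows (the function name is illustrative, and T is the affine parameter matrix as in the RANSAC sketch).

import numpy as np

def registration_rmse(ref_pts, sensed_pts, T):
    # ref_pts, sensed_pts: (N_GT, 2) manually selected ground-truth point pairs, one (x, y) per row.
    # T: estimated (3, 2) affine parameters mapping sensed coordinates to reference coordinates.
    proj = np.hstack([sensed_pts, np.ones((len(sensed_pts), 1))]) @ T
    return np.sqrt(np.mean(np.sum((ref_pts - proj) ** 2, axis=1)))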
4. Simulation results and analysis:
As can be seen from Fig. 4(a) and Fig. 4(b), the brightness of most corresponding regions of the reference remote sensing image and the remote sensing image subject to registration is reversed. As shown in the table below, the SIFT algorithm cannot achieve correct registration, whereas the HNNR and HNDR algorithms based on the anisotropic gradient metric space detect more correct matching point pairs, demonstrating that the proposed algorithm can register image pairs with brightness reversal. Fig. 5(a) and Fig. 5(b) are also a multispectral remote sensing image pair, in which the brightness of corresponding regions is reversed or changes nonlinearly; as shown in the table below, the SIFT-based method can also achieve registration, but the HNNR and HNDR algorithms based on the anisotropic gradient metric space detect more correct matching point pairs, again demonstrating that the proposed algorithm can register image pairs whose brightness changes nonlinearly or is reversed. Fig. 6(a) and Fig. 6(b) are a multi-source remote sensing image pair in which the brightness of corresponding regions mainly changes linearly; as shown in the table below, all three methods achieve correct registration, and the SIFT algorithm also obtains many correct matches, because SIFT mainly detects image texture structure and images with well-preserved texture edges yield more valuable feature points, which shows that the SIFT algorithm remains usable for remote sensing image registration tasks within a certain range. Fig. 4(c), Fig. 5(c) and Fig. 6(c) show the registration results after transforming the remote sensing images subject to registration into the coordinate system of the reference remote sensing images, and Fig. 4(d), Fig. 5(d) and Fig. 6(d) show the fusion of the reference remote sensing images with the transformed remote sensing images subject to registration.
The registration results obtained with the HNNR algorithm and the HNDR algorithm on the test image pairs are shown in the table below. As can be seen from the table, the number of correctly matched feature points of the HNDR algorithm is always larger than that of the HNNR algorithm. The difference between the two is that HNDR uses the dual-threshold nearest-neighbor distance ratio matching criterion proposed herein for feature point matching, whereas HNNR uses the classical nearest-neighbor distance ratio matching criterion, which demonstrates that the dual-threshold nearest-neighbor distance ratio matching method exploits the feature point descriptor information more effectively. Comparison of the root-mean-square errors RMSE shows that the proposed method achieves sub-pixel registration accuracy. In summary, the HNDR algorithm not only completes the registration of multi-source remote sensing images well, but also overcomes the low accuracy of existing algorithms on multispectral remote sensing images with large nonlinear brightness changes or even brightness reversal.
Wherein, Pair denotes the tested remote sensing image pair, Method denotes the method used to match the image pair, N_IR and N_IS denote the numbers of feature points detected on the reference remote sensing image and on the remote sensing image subject to registration respectively, N_c denotes the number of correctly matched feature point pairs, and RMSE denotes the root-mean-square error.

Claims (5)

1. A remote sensing image registration method based on an anisotropic gradient metric space, comprising the steps of:
(1) inputting a reference remote sensing image and a remote sensing image subject to registration;
(2) constructing the metric space of anisotropic diffusion:
(2a) computing the scale value of each layer of the anisotropic metric space of the reference remote sensing image and of the remote sensing image subject to registration;
(2b) converting the scale values into time scale values;
(2c) applying Gaussian filtering with standard deviation σ_0 to the input reference remote sensing image and remote sensing image subject to registration, to obtain the 0-th layer image of the anisotropic metric space of the reference remote sensing image and of the remote sensing image subject to registration;
(2d) initializing the layer index i of the anisotropic metric space to zero;
(2e) computing, according to the following formula, the diffusion coefficient matrix of the i-th layer image of the anisotropic metric space of the reference remote sensing image and of the remote sensing image subject to registration respectively:
$c_i = \dfrac{1}{1 + \left( \dfrac{\left| \nabla I_{g,i} \right|}{K} \right)^2}$
Wherein, c_i denotes the diffusion coefficient matrix of the i-th layer image of the anisotropic metric space of the reference remote sensing image or of the remote sensing image subject to registration, I_{g,i} denotes the i-th layer image I_i of the anisotropic metric space of the reference remote sensing image or of the remote sensing image subject to registration after Gaussian filtering with standard deviation 1, |∇I_{g,i}| denotes the gradient amplitude of that Gaussian-filtered image, | · | denotes the modulus operation, K denotes the contrast factor, and the value of K is the gradient magnitude at the 70th percentile of the statistical histogram of the gradient amplitude |∇I_{g,i}|;
(2f) according to the following formula, row diffusion is done to the i-th tomographic image with reference to remote sensing images and remote sensing images anisotropy metric space subject to registration:
$I_1^{\,i+1} = \bigl( I - 2\,(t_{i+1} - t_i)\, A_1(I^{\,i}) \bigr)^{-1} I^{\,i}$
Wherein, represent along the i-th tomographic image I with reference to remote sensing images or remote sensing images anisotropy metric space subject to registration irow diffusion after image, I represents the unit square formation that line number and columns are all identical with reference remote sensing images or the columns of remote sensing images subject to registration, t iand t i+1represent respectively with reference to remote sensing images or i-th layer of remote sensing images anisotropy metric space and the time measure value of the i-th+1 layer subject to registration, A 1(I i) presentation code is with reference to the i-th tomographic image coefficient of diffusion c of remote sensing images or remote sensing images anisotropy metric space subject to registration imatrix, I irepresent the i-th tomographic image with reference to remote sensing images or remote sensing images anisotropy metric space subject to registration, () -1represent inverse matrix operation;
(2g) according to the following formula, row diffusion is done to the i-th tomographic image with reference to remote sensing images and remote sensing images anisotropy metric space subject to registration:
$I_2^{\,i+1} = \bigl( I - 2\,(t_{i+1} - t_i)\, A_2(I^{\,i}) \bigr)^{-1} I^{\,i}$
Wherein, represent along the i-th tomographic image I with reference to remote sensing images or remote sensing images anisotropy metric space subject to registration irow diffusion after image, I represents the unit square formation that line number and columns are all identical with reference remote sensing images or the line number of remote sensing images subject to registration, t iand t i+1represent respectively with reference to remote sensing images or i-th layer of remote sensing images anisotropy metric space and the time measure value of the i-th+1 layer subject to registration, A 2(I i) presentation code is with reference to the i-th tomographic image coefficient of diffusion c of remote sensing images or remote sensing images anisotropy metric space subject to registration imatrix, I irepresent the i-th tomographic image with reference to remote sensing images or remote sensing images anisotropy metric space subject to registration, () -1represent inverse matrix operation;
(2h) according to the following formula, the i-th+1 tomographic image of computing reference remote sensing images and remote sensing images anisotropy metric space subject to registration:
$I^{\,i+1} = \dfrac{1}{2} \bigl( I_1^{\,i+1} + I_2^{\,i+1} \bigr)$
Wherein, I i+1represent the i-th+1 tomographic image with reference to remote sensing images or remote sensing images anisotropy metric space subject to registration, represent the image of the i-th tomographic image after space diffusion with reference to remote sensing images or remote sensing images anisotropy metric space subject to registration, represent the image of the i-th tomographic image after row diffusion with reference to remote sensing images or remote sensing images anisotropy metric space subject to registration;
(2i) judge whether i >=N-2 sets up, if set up, obtain the anisotropy metric space with reference to remote sensing images and remote sensing images subject to registration, otherwise, make i=i+1, perform step (2e); Wherein, i represents the sequence number of anisotropy metric space layer, and N represents the sum with reference to remote sensing images and remote sensing images anisotropy metric space layer subject to registration;
(3) compute gradient magnitude image:
Use Sobel Operator Sobel, the gradient amplitude image of computing reference remote sensing images and remote sensing images anisotropy metric space subject to registration;
(4) difference of compute gradient magnitude image:
(4a) according to the following formula, Sobel Operator Sobel is used, the difference of the x-axis positive dirction of the gradient amplitude image of computing reference remote sensing images and remote sensing images anisotropy metric space subject to registration:
$GI_x^n = \begin{bmatrix} -1 & 0 & 1 \\ -2 & 0 & 2 \\ -1 & 0 & 1 \end{bmatrix} \oplus GI_n$
Wherein, GI_x^n denotes the difference of the gradient amplitude image GI_n of the anisotropic metric space of the reference remote sensing image or of the remote sensing image subject to registration along the horizontally rightward positive x-axis direction, GI_n denotes the n-th layer of the gradient amplitude image of the anisotropic metric space of the reference remote sensing image or of the remote sensing image subject to registration, ⊕ denotes the convolution operation, n = 0, 1, ..., N-1, and N denotes the total number of layers of the anisotropic metric space of the reference remote sensing image and the remote sensing image subject to registration;
(4b) according to the following formula, Sobel Operator Sobel is used, the difference of the y-axis positive dirction of the gradient amplitude image of computing reference remote sensing images and remote sensing images anisotropy metric space subject to registration:
$GI_y^n = \begin{bmatrix} -1 & -2 & -1 \\ 0 & 0 & 0 \\ 1 & 2 & 1 \end{bmatrix} \oplus GI_n$
Wherein, GI_y^n denotes the difference of the gradient amplitude image GI_n of the anisotropic metric space of the reference remote sensing image or of the remote sensing image subject to registration along the vertically downward positive y-axis direction, GI_n denotes the n-th layer of the gradient amplitude image of the anisotropic metric space of the input reference remote sensing image or remote sensing image subject to registration, ⊕ denotes the convolution operation, n = 0, 1, ..., N-1, and N denotes the total number of layers of the anisotropic metric space of the reference remote sensing image and the remote sensing image subject to registration;
(5) according to the following formula, the gradient amplitude of compute gradient magnitude image:
$GGI_n = \sqrt{ (GI_x^n)^2 + (GI_y^n)^2 }$
Wherein, GGI_n denotes the gradient amplitude of the n-th layer GI_n of the gradient amplitude image of the anisotropic metric space of the reference remote sensing image or of the remote sensing image subject to registration, GI_x^n denotes its difference along the horizontally rightward positive x-axis direction, GI_y^n denotes its difference along the vertically downward positive y-axis direction, √ denotes the square-root operation, n = 0, 1, ..., N-1, and N denotes the total number of layers of the anisotropic metric space of the reference remote sensing image and the remote sensing image subject to registration;
(6) according to the following formula, the gradient angle of compute gradient magnitude image:
$AGGI_n = \arctan\!\left( \dfrac{GI_y^n}{GI_x^n} \right)$
Wherein, AGGI_n denotes the gradient angle of the n-th layer GI_n of the gradient amplitude image of the anisotropic metric space of the reference remote sensing image or of the remote sensing image subject to registration, GI_x^n denotes its difference along the horizontally rightward positive x-axis direction, GI_y^n denotes its difference along the vertically downward positive y-axis direction, arctan(·) denotes the four-quadrant arctangent operation, n = 0, 1, ..., N-1, and N denotes the total number of layers of the anisotropic metric space of the reference remote sensing image and the remote sensing image subject to registration;
(7) unique point is detected:
Use Harris corner detection operator, the gradient amplitude image of the anisotropy metric space with reference to remote sensing images and remote sensing images subject to registration detect unique point, obtains unique point set:
$C_{IR} = \{\, c_{IR}^1, c_{IR}^2, \ldots, c_{IR}^R \,\}, \qquad C_{IS} = \{\, c_{IS}^1, c_{IS}^2, \ldots, c_{IS}^S \,\}$
Wherein, C_IR denotes the set of feature points detected on the gradient amplitude image of the anisotropic metric space of the reference remote sensing image, C_IS denotes the set of feature points detected on the gradient amplitude image of the anisotropic metric space of the remote sensing image subject to registration, R denotes the total number of feature points detected on the gradient amplitude image of the anisotropic metric space of the reference remote sensing image, and S denotes the total number of feature points detected on the gradient amplitude image of the anisotropic metric space of the remote sensing image subject to registration;
(8) generating feature point principal direction:
(8a) by horizontal direction angle 36 deciles in [0,2 π];
(8b) be 6 σ by the radius of unique point set ncircle shaped neighborhood region region in the gradient direction AGGI of pixel n(X), the decile angular range at pixel place in border circular areas is determined, wherein, σ nrepresent the scale-value of the n-th layer of reference remote sensing images or remote sensing images anisotropy metric space subject to registration, AGGI n(X) around representation feature point, in circle shaped neighborhood region, position coordinates is the gradient angle of the pixel of X, n=0,1 ..., N-1, N represent the sum with reference to remote sensing images and remote sensing images anisotropy metric space layer subject to registration;
(8c) the cumulative gradient magnitude GGI being positioned at all pixels of each decile angular range respectively n(X), gradient orientation histogram is formed, wherein, GGI n(X) around representation feature point, in circle shaped neighborhood region, position coordinates is the gradient amplitude of the pixel of X, n=0,1 ..., N-1, N represent the sum with reference to remote sensing images and remote sensing images anisotropy metric space layer subject to registration;
(8d) by gradient direction corresponding for the numerical value being greater than maximal value 0.8 times in gradient orientation histogram, as the principal direction of unique point;
(9) generating feature point descriptor:
(9a) be that the circle shaped neighborhood region of unique point of ρ is rotated counterclockwise θ by radius mdegree, ρ is divided into 3 intervals along radial direction by circle shaped neighborhood region, the inner circle radius of neighbourhood is 0.25 ρ, and the middle circle radius of neighbourhood is 0.73 ρ, and the cylindrical radius of neighbourhood is ρ, circle shaped neighborhood region incites somebody to action [0 along angle direction, 2 π] be divided into 8 intervals, integrally, around unique point, circle shaped neighborhood region is divided the subregion of the area equation in order to 17 diverse locations to inner circle, wherein, the value of ρ is 12 σ n, σ nthe scale-value of the reference remote sensing images at representation feature point place or the n-th layer of remote sensing images anisotropy metric space subject to registration, n=0,1 ..., N-1, N represent the sum with reference to remote sensing images and remote sensing images anisotropy metric space layer subject to registration, θ mrepresent the principal direction of this unique point;
(9b) Cartesian coordinates of pixel in unique point circle shaped neighborhood region is converted to log-polar, log-polar angle direction level, is divided into 8 intervals to the right in the scope of [0,2 π], log-polar logarithm length direction straight down, non-ly in scope be divided into 3 intervals, wherein, the radius of circle shaped neighborhood region around ρ representation feature point;
(9c) all pixels in log-polar grid in every sub regions are according to its gradient amplitude GGI nand gradient direction AGGI (X) n(X) compute gradient direction histogram, each subregion defines the gradient direction vector of one 8 dimension, the gradient direction vector splicing 17 subregions successively just defines the descriptor of the unique point of one 136 dimension, wherein, the angle of gradient orientation histogram is [0,2 π] scope in be divided into 8 intervals, GGI n(X) around representation feature point, in circle shaped neighborhood region, position coordinates is the gradient amplitude of the pixel of X, AGGI n(X) around representation feature point, in circle shaped neighborhood region, position coordinates is the gradient angle of the pixel of X;
(9d) same procedure of step (9a), step (9b), step (9c) is adopted, generating feature point set C iRdescriptor set D iRwith unique point set C iSdescriptor set D iS, D I R = { d I R 1 , d I R 2 , ... , d I R L 1 } , D I S = { d I S 1 , d I S 2 , ... , d I S L 2 } , Wherein, D iRrepresent the descriptor set of the unique point that the gradient amplitude image with reference to the anisotropy metric space of remote sensing images detects, D iSrepresent the descriptor set of the unique point that the gradient amplitude image of the anisotropy metric space of remote sensing images subject to registration detects, L 1represent the descriptor sum of the unique point that the gradient amplitude image with reference to the anisotropy metric space of remote sensing images detects, L 2represent the descriptor sum of the unique point that the gradient amplitude image of the anisotropy metric space of remote sensing images subject to registration detects;
(10) Feature Points Matching:
(10a) the descriptor sequence number k of Characteristics of The Remote Sensing Images point subject to registration is initialized as 1;
(10b) descriptor of Characteristics of The Remote Sensing Images subject to registration point according to the following formula, is calculated the vector formed with the Euclidean distance of the descriptor with reference to each unique point of remote sensing images:
$d_k = [\, d_{k,1}, d_{k,2}, \ldots, d_{k,j}, \ldots, d_{k,L_1} \,]$
Wherein, d_k denotes the vector formed by the Euclidean distances between the k-th descriptor of the feature points of the remote sensing image subject to registration and each descriptor of the reference remote sensing image, d_{k,j} denotes the Euclidean distance between the k-th descriptor d_IS^k of the feature points of the remote sensing image subject to registration and the j-th descriptor d_IR^j of the feature points of the reference remote sensing image, d_IS^k denotes the k-th descriptor of the feature points of the remote sensing image subject to registration, d_IR^j denotes the j-th descriptor of the feature points of the reference remote sensing image, j = 1, 2, ..., L_1, L_1 denotes the number of descriptors of the feature points of the reference remote sensing image, and || · || denotes the norm operation;
(10c) to vectorial d kin L 1individual element sorts from small to large and obtains vectorial s k;
(10d) if s_{k,0}/s_{k,1} ≤ T_l, the feature point corresponding to descriptor d_IS^k is matched with the feature point of the reference remote sensing image whose descriptor has Euclidean distance s_{k,0} to d_IS^k; if T_l < s_{k,0}/s_{k,1} ≤ T_h, the feature point corresponding to descriptor d_IS^k is matched with the feature points of the reference remote sensing image whose descriptors correspond to the first E elements of s_k; if s_{k,0}/s_{k,1} > T_h, the feature point corresponding to descriptor d_IS^k is not matched with any feature point of the reference remote sensing image;
Wherein, s_{k,0} denotes the first element of the vector s_k, s_{k,1} denotes the second element of the vector s_k, T_l and T_h denote the nearest-neighbor to second-nearest-neighbor distance ratio thresholds, and E takes an integer value in [2, 3];
(10e) k>=L is judged 2whether set up, if set up, obtain remote sensing images subject to registration and the Feature Points Matching relation C with reference to remote sensing images iR, IS, perform step (11); Otherwise, make k=k+1, perform step (10b);
Wherein, k represents the descriptor sequence number of Characteristics of The Remote Sensing Images subject to registration point, C iR, ISrepresent the pair relationhip set of the unique point that the gradient amplitude image with reference to the anisotropy metric space of remote sensing images detects and the unique point detected on the gradient amplitude image of the anisotropy metric space of remote sensing images subject to registration, L 3the point of representation feature Point matching stage match is to sum;
(11) feature point pairs of deletion error coupling:
Use the consistent RANSAC algorithm of random sampling, the matching double points of deletion error, obtain remote sensing images subject to registration to the geometric transform relation with reference to remote sensing images;
(12) registration reference picture and image subject to registration:
According to geometric transform relation, use bilinear interpolation method registration with reference to remote sensing images and remote sensing images subject to registration.
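Steps (2a)-(2i) of claim 1 amount to an additive-operator-splitting style semi-implicit diffusion. The following Python/NumPy/SciPy sketch is only an illustration under the stated formulas: it uses dense linear solves per image line for clarity (a tridiagonal solver would normally be used), and the function names, the simple averaging of neighbouring conductivities and the Neumann boundary treatment are assumptions rather than patent text.

import numpy as np
from scipy.ndimage import gaussian_filter

def diffusion_coefficient(I):
    # Step (2e): smooth the layer with a sigma = 1 Gaussian, take its gradient
    # magnitude, and set the contrast factor K to its 70th percentile.
    Ig = gaussian_filter(I, 1.0)
    gy, gx = np.gradient(Ig)
    mag = np.hypot(gx, gy)
    K = np.percentile(mag, 70) + 1e-12
    return 1.0 / (1.0 + (mag / K) ** 2)

def _semi_implicit_1d(u, c, tau):
    # Solve (I - 2*tau*A(c)) v = u line by line along the last axis,
    # the semi-implicit step of (2f)/(2g). Dense solve for clarity only.
    v = np.empty_like(u)
    n = u.shape[-1]
    for row in range(u.shape[0]):
        w = 0.5 * (c[row, :-1] + c[row, 1:])      # averaged conductivities between neighbours
        A = np.zeros((n, n))
        A[np.arange(n - 1), np.arange(1, n)] = w
        A[np.arange(1, n), np.arange(n - 1)] = w
        A[np.arange(n), np.arange(n)] = -(np.r_[w, 0.0] + np.r_[0.0, w])
        v[row] = np.linalg.solve(np.eye(n) - 2.0 * tau * A, u[row])
    return v

def anisotropic_metric_space(img, sigma0=1.6, G=3, L=3):
    # N = G*L layers; layer 0 is the sigma0-smoothed input (2c); each further layer
    # is one semi-implicit step to the next time value, averaging a row pass and a
    # column pass as in (2f)-(2h).
    N = G * L
    sigmas = [sigma0 * 2 ** (m + u / L) for m in range(G) for u in range(L)]
    times = [0.5 * s ** 2 for s in sigmas]
    layers = [gaussian_filter(img.astype(float), sigma0)]
    for i in range(N - 1):
        Ii = layers[-1]
        c = diffusion_coefficient(Ii)
        tau = times[i + 1] - times[i]
        rows = _semi_implicit_1d(Ii, c, tau)        # diffusion along rows
        cols = _semi_implicit_1d(Ii.T, c.T, tau).T  # diffusion along columns
        layers.append(0.5 * (rows + cols))
    return layers, sigmas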
2. The remote sensing image registration method based on an anisotropic gradient metric space according to claim 1, characterized in that the scale value of each layer of the anisotropic metric space of the reference remote sensing image and of the remote sensing image subject to registration described in step (2a) is obtained according to the following formula:
$\sigma_e(m, u) = \sigma_0 \cdot 2^{\,m + u/L}$
Wherein, σ_e denotes the scale value of the e-th layer of the anisotropic metric space of the reference remote sensing image or of the remote sensing image subject to registration, σ_0 denotes the base scale value of the anisotropic metric space of the reference remote sensing image and of the remote sensing image subject to registration, the anisotropic metric space has G groups in total with L layers per group, m denotes the group index of the anisotropic metric space, m = 0, 1, ..., G-1, u denotes the layer index within a group, u = 0, 1, ..., L-1, e = 0, 1, 2, ..., N-1, and N denotes the total number of layers of the anisotropic metric space of the reference remote sensing image and the remote sensing image subject to registration.
3. The remote sensing image registration method based on an anisotropic gradient metric space according to claim 1, characterized in that the conversion of the scale values into time scale values described in step (2b) is carried out according to the following formula:
$t_e = \dfrac{1}{2}\, \sigma_e^2(m, u)$
Wherein, t_e denotes the time scale value of the e-th layer of the anisotropic metric space of the reference remote sensing image or of the remote sensing image subject to registration, σ_e denotes the scale value of the e-th layer of the anisotropic metric space of the reference remote sensing image or of the remote sensing image subject to registration, e = 0, 1, 2, ..., N-1, N denotes the total number of layers of the anisotropic metric space of the reference remote sensing image and the remote sensing image subject to registration, m denotes the group index of the anisotropic metric space, m = 0, 1, ..., G-1, G denotes the total number of groups of the anisotropic metric space of the reference remote sensing image and the remote sensing image subject to registration, u denotes the layer index within a group, u = 0, 1, ..., L-1, and L denotes the number of layers per group of the anisotropic metric space of the reference remote sensing image and the remote sensing image subject to registration.
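As a worked example with the parameter values used in the simulation section (σ_0 = 1.6, G = 3, L = 3, hence N = 9), the formulas of claims 2 and 3 give the nine scale values σ_e ≈ 1.60, 2.02, 2.54, 3.20, 4.03, 5.08, 6.40, 8.06, 10.16 and the corresponding time scale values t_e = σ_e²/2 ≈ 1.28, 2.03, 3.23, 5.12, 8.13, 12.90, 20.48, 32.51, 51.61 for e = 0, 1, ..., 8.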
4. The remote sensing image registration method based on an anisotropic gradient metric space according to claim 1, characterized in that the computation of the gradient amplitude image of the anisotropic metric space of the reference remote sensing image and of the remote sensing image subject to registration using the Sobel operator described in step (3) proceeds as follows:
First step: according to the following formula, compute the difference of each layer image of the anisotropic metric space of the reference remote sensing image and of the remote sensing image subject to registration along the positive x-axis direction:
$I_x^n = \begin{bmatrix} -1 & 0 & 1 \\ -2 & 0 & 2 \\ -1 & 0 & 1 \end{bmatrix} \oplus I_n$
Wherein, I_n denotes the n-th layer image of the anisotropic metric space of the reference remote sensing image or of the remote sensing image subject to registration, I_x^n denotes the difference of the image I_n, whose coordinate origin is at the upper left corner, along the horizontally rightward positive x-axis direction, ⊕ denotes the convolution operation, n = 0, 1, ..., N-1, and N denotes the total number of layers of the anisotropic metric space of the reference remote sensing image and the remote sensing image subject to registration;
Second step: according to the following formula, compute the difference of each layer image of the anisotropic metric space of the reference remote sensing image and of the remote sensing image subject to registration along the positive y-axis direction:
$I_y^n = \begin{bmatrix} -1 & -2 & -1 \\ 0 & 0 & 0 \\ 1 & 2 & 1 \end{bmatrix} \oplus I_n$
Wherein, I_n denotes the n-th layer image of the anisotropic metric space of the input reference remote sensing image or remote sensing image subject to registration, I_y^n denotes the difference of the image I_n, whose coordinate origin is at the upper left corner, along the vertically downward positive y-axis direction, ⊕ denotes the convolution operation, n = 0, 1, ..., N-1, and N denotes the total number of layers of the anisotropic metric space of the reference remote sensing image and the remote sensing image subject to registration;
Third step: according to the following formula, compute the gradient amplitude of each layer image of the anisotropic metric space of the reference remote sensing image and of the remote sensing image subject to registration:
$GI_n = \sqrt{ (I_x^n)^2 + (I_y^n)^2 }$
Wherein, GI_n denotes the gradient amplitude of the n-th layer image I_n of the anisotropic metric space of the reference remote sensing image or of the remote sensing image subject to registration, I_x^n denotes the difference of I_n along the horizontally rightward positive x-axis direction, I_y^n denotes the difference of I_n along the vertically downward positive y-axis direction, n = 0, 1, ..., N-1, and N denotes the total number of layers of the anisotropic metric space of the reference remote sensing image and the remote sensing image subject to registration.
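A brief sketch of claim 4 and of steps (4)-(6) of claim 1, assuming the Sobel kernels reconstructed above (the y kernel as the transpose of the x kernel); the choice of scipy.ndimage.convolve and of the 'nearest' boundary mode is an implementation assumption rather than patent text.

import numpy as np
from scipy.ndimage import convolve

SOBEL_X = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
SOBEL_Y = SOBEL_X.T    # [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]

def gradient_amplitude_image(I_n):
    # Claim 4: per-layer gradient amplitude image GI_n of the metric space layer I_n.
    Ix = convolve(I_n, SOBEL_X, mode='nearest')
    Iy = convolve(I_n, SOBEL_Y, mode='nearest')
    return np.hypot(Ix, Iy)

def gradient_of_gradient_image(GI_n):
    # Steps (4)-(6) of claim 1: gradient amplitude GGI_n and gradient angle AGGI_n
    # (four-quadrant arctangent mapped to [0, 2*pi)) of the gradient amplitude image.
    Gx = convolve(GI_n, SOBEL_X, mode='nearest')
    Gy = convolve(GI_n, SOBEL_Y, mode='nearest')
    return np.hypot(Gx, Gy), np.arctan2(Gy, Gx) % (2 * np.pi)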
5. The remote sensing image registration method based on an anisotropic gradient metric space according to claim 1, characterized in that the detection of feature points on the gradient amplitude image of the anisotropic metric space of the reference remote sensing image and of the remote sensing image subject to registration using the Harris corner detection operator described in step (7) is carried out according to the following steps:
First step: according to the following formula, compute the Harris matrix of each pixel of each layer of the gradient amplitude image of the anisotropic metric space of the reference remote sensing image and of the remote sensing image subject to registration:
$u(X, \sigma_n) = \sigma_n^2 \; g_{2\sigma_n} * \begin{bmatrix} (GI_x^n)^2 & GI_x^n \cdot GI_y^n \\ GI_y^n \cdot GI_x^n & (GI_y^n)^2 \end{bmatrix}$
Wherein, u(X, σ_n) denotes the Harris matrix of the pixel at position X in the n-th layer GI_n of the gradient amplitude image of the anisotropic metric space of the reference remote sensing image or of the remote sensing image subject to registration, X denotes the pixel position coordinate, σ_n denotes the scale value of the n-th layer of the anisotropic metric space of the reference remote sensing image or of the remote sensing image subject to registration, g_{2σ_n} is a Gaussian function whose standard deviation in both the x-axis and y-axis directions is 2σ_n, * denotes the convolution operation, GI_x^n denotes the difference of the n-th layer GI_n of the gradient amplitude image along the horizontally rightward positive x-axis direction, GI_y^n denotes its difference along the vertically downward positive y-axis direction, n = 0, 1, ..., N-1, and N denotes the total number of layers of the anisotropic metric space of the reference remote sensing image and the remote sensing image subject to registration;
Second step: compute the Harris function of each pixel of each layer of the gradient amplitude image of the anisotropic metric space of the reference remote sensing image and of the remote sensing image subject to registration:
$R(X, \sigma_n) = \det\bigl(u(X, \sigma_n)\bigr) - D \cdot \mathrm{tr}\bigl(u(X, \sigma_n)\bigr)^2$
Wherein, R(X, σ_n) denotes the Harris function of the pixel at position X in the n-th layer GI_n of the gradient amplitude image of the anisotropic metric space of the reference remote sensing image or of the remote sensing image subject to registration, u(X, σ_n) denotes the Harris matrix of that pixel, X denotes the pixel position coordinate, σ_n denotes the scale value of the n-th layer of the anisotropic metric space of the reference remote sensing image and of the remote sensing image subject to registration, det(·) denotes the determinant of a matrix, D denotes an arbitrary constant, tr(·) denotes the sum of the main-diagonal elements of a matrix, n = 0, 1, ..., N-1, and N denotes the total number of layers of the anisotropic metric space of the reference remote sensing image and the remote sensing image subject to registration;
Third step: take as feature points the points for which R(X, σ_n) > T_th and R(X, σ_n) is greater than the Harris function of every point in its 8-neighborhood, obtaining the sets of feature points detected on the gradient amplitude images of the anisotropic metric space of the reference remote sensing image and of the remote sensing image subject to registration:
$C_{IR} = \{\, c_{IR}^1, c_{IR}^2, \ldots, c_{IR}^R \,\}, \qquad C_{IS} = \{\, c_{IS}^1, c_{IS}^2, \ldots, c_{IS}^S \,\}$
Wherein, R(X, σ_n) denotes the Harris function of the pixel at position X in the n-th layer GI_n of the gradient amplitude image of the anisotropic metric space of the reference remote sensing image or of the remote sensing image subject to registration, T_th denotes the Harris function threshold, X denotes the pixel position coordinate, σ_n denotes the scale value of the n-th layer of the anisotropic metric space of the reference remote sensing image or of the remote sensing image subject to registration, C_IR denotes the set of feature points detected on the gradient amplitude image of the anisotropic metric space of the reference remote sensing image, C_IS denotes the set of feature points detected on the gradient amplitude image of the anisotropic metric space of the remote sensing image subject to registration, R denotes the total number of feature points detected on the gradient amplitude image of the anisotropic metric space of the reference remote sensing image, S denotes the total number of feature points detected on the gradient amplitude image of the anisotropic metric space of the remote sensing image subject to registration, n = 0, 1, ..., N-1, and N denotes the total number of layers of the anisotropic metric space of the reference remote sensing image and the remote sensing image subject to registration.
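Claim 5 is a scale-adapted Harris detector applied to each layer of the gradient amplitude image. The sketch below follows the formulas above; the Harris constant D = 0.04 is an assumed value (the claim only calls D an arbitrary constant), the threshold T_th = 0.005 is the value used in the simulation section, and the use of a 3 × 3 maximum filter for the 8-neighborhood comparison is an implementation choice made here.

import numpy as np
from scipy.ndimage import convolve, gaussian_filter, maximum_filter

def harris_on_gradient_image(GI_n, sigma_n, D=0.04, T_th=0.005):
    # Scale-adapted Harris detection on one layer GI_n of the gradient amplitude image.
    sobel_x = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    gx = convolve(GI_n, sobel_x, mode='nearest')
    gy = convolve(GI_n, sobel_x.T, mode='nearest')
    w = 2.0 * sigma_n                                    # Gaussian window g_{2*sigma_n}
    Ixx = sigma_n ** 2 * gaussian_filter(gx * gx, w)
    Ixy = sigma_n ** 2 * gaussian_filter(gx * gy, w)
    Iyy = sigma_n ** 2 * gaussian_filter(gy * gy, w)
    R = (Ixx * Iyy - Ixy ** 2) - D * (Ixx + Iyy) ** 2    # det(u) - D * tr(u)^2
    local_max = (R == maximum_filter(R, size=3))         # maximum over the 8-neighbourhood
    ys, xs = np.nonzero((R > T_th) & local_max)
    return list(zip(ys.tolist(), xs.tolist()))           # feature point positions (row, col)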
CN201510770880.5A 2015-11-12 2015-11-12 Remote sensing image registration method based on anisotropic gradient metric space Active CN105427298B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510770880.5A CN105427298B (en) 2015-11-12 2015-11-12 Remote sensing image registration method based on anisotropic gradient metric space

Publications (2)

Publication Number Publication Date
CN105427298A true CN105427298A (en) 2016-03-23
CN105427298B CN105427298B (en) 2018-03-06

Family

ID=55505479

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510770880.5A Active CN105427298B (en) 2015-11-12 2015-11-12 Remote sensing image registration method based on anisotropic gradient metric space

Country Status (1)

Country Link
CN (1) CN105427298B (en)

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103198483A (en) * 2013-04-07 2013-07-10 西安电子科技大学 Multiple time phase remote sensing image registration method based on edge and spectral reflectivity curve
CN104794678A (en) * 2015-05-04 2015-07-22 福建师范大学 Automatic registration method for high-spatial-resolution remote-sensing images based on SIFI feature points

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
JUN KONG et al.: "IR remote sensing image registration based on multi-scale feature extraction", 2014 International Joint Conference on Neural Networks *
ZHANG Wanqiang et al.: "Multi-scale segmentation method for remote sensing images based on gradient scale space and its application", Journal of Anhui Agricultural Sciences *

Cited By (33)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106023183A (en) * 2016-05-16 2016-10-12 西北工业大学 Real-time line segment matching method
WO2018076137A1 (en) * 2016-10-24 2018-05-03 深圳大学 Method and device for obtaining hyper-spectral image feature descriptor
CN109003293A (en) * 2017-06-07 2018-12-14 北京航空航天大学 Inhibit the SAR image registration method of model based on anisotropy spot
US11244197B2 (en) 2017-08-31 2022-02-08 Southwest Jiaotong University Fast and robust multimodal remote sensing image matching method and system
CN107563438A (en) * 2017-08-31 2018-01-09 西南交通大学 The multi-modal Remote Sensing Images Matching Method and system of a kind of fast robust
WO2019042232A1 (en) * 2017-08-31 2019-03-07 西南交通大学 Fast and robust multimodal remote sensing image matching method and system
CN107563438B (en) * 2017-08-31 2019-08-30 西南交通大学 A kind of multi-modal Remote Sensing Images Matching Method and system of fast robust
US11107199B2 (en) 2017-09-30 2021-08-31 Institute Of Remote Sensing And Digital Earth, Chinese Academy Of Sciences Automatic cross-platform geometric correction method for moon-based earth observation image
CN107657597B (en) * 2017-10-19 2020-09-08 中国科学院遥感与数字地球研究所 Automatic geometric correction method for cross-platform moon-based earth observation image
CN107657597A (en) * 2017-10-19 2018-02-02 中国科学院遥感与数字地球研究所 Cross-platform moon base earth observation image automatic geometric correction method
CN107909018B (en) * 2017-11-06 2019-12-06 西南交通大学 Stable multi-mode remote sensing image matching method and system
CN107909018A (en) * 2017-11-06 2018-04-13 西南交通大学 A kind of sane multi-modal Remote Sensing Images Matching Method and system
CN108009595A (en) * 2017-12-25 2018-05-08 北京航空航天大学 A kind of image-recognizing method of feature based stipulations
CN108009595B (en) * 2017-12-25 2018-10-12 北京航空航天大学 A kind of image-recognizing method of feature based stipulations
CN108346162A (en) * 2018-03-26 2018-07-31 西安电子科技大学 Remote sensing image registration method based on structural information and space constraint
CN110464379A (en) * 2018-05-11 2019-11-19 深圳市理邦精密仪器股份有限公司 A kind of fetus head circumference measurement method, device and terminal device
CN110501728B (en) * 2018-05-16 2022-03-29 清华大学 Frequency discrimination method and device for time hopping signal of positioning base station
CN110501728A (en) * 2018-05-16 2019-11-26 清华大学 The frequency discrimination method and frequency discrimination device of signal when locating base station is jumped
CN110096540A (en) * 2019-04-16 2019-08-06 湖北地信科技集团股份有限公司 Surveying and mapping data conversion method, equipment, storage medium and device
CN110458876A (en) * 2019-08-08 2019-11-15 哈尔滨工业大学 Multidate POLSAR method for registering images based on SAR-SIFT feature
CN110458876B (en) * 2019-08-08 2023-01-31 哈尔滨工业大学 Multi-temporal POLSAR image registration method based on SAR-SIFT features
CN111028201A (en) * 2019-11-13 2020-04-17 东北大学 Fundus blood vessel image segmentation system and method based on multi-scale level set
CN111028201B (en) * 2019-11-13 2023-10-27 东北大学 Fundus blood vessel image segmentation system and method based on multi-scale level set
CN111125414A (en) * 2019-12-26 2020-05-08 盐城禅图智能科技有限公司 Automatic searching method for specific target of remote sensing image of unmanned aerial vehicle
CN111125414B (en) * 2019-12-26 2023-08-18 郑州航空工业管理学院 Automatic searching method for specific target of unmanned aerial vehicle remote sensing image
CN111223133B (en) * 2020-01-07 2022-10-11 上海交通大学 Registration method of heterogeneous images
CN111223133A (en) * 2020-01-07 2020-06-02 上海交通大学 Registration method of heterogeneous images
CN112784761A (en) * 2021-01-26 2021-05-11 哈尔滨理工大学 Special texture background remote sensing image matching method
CN113240743B (en) * 2021-05-18 2022-03-25 浙江大学 Heterogeneous image pose estimation and registration method, device and medium based on neural network
CN113240743A (en) * 2021-05-18 2021-08-10 浙江大学 Heterogeneous image pose estimation and registration method, device and medium based on neural network
CN113379808B (en) * 2021-06-21 2022-08-12 昆明理工大学 Method for registration of multiband solar images
CN113379808A (en) * 2021-06-21 2021-09-10 昆明理工大学 Method for registration of multiband solar images
WO2024000950A1 (en) * 2022-07-01 2024-01-04 浙江工商大学 Rapid image registration method and apparatus for multi-spectral camera

Also Published As

Publication number Publication date
CN105427298B (en) 2018-03-06

Similar Documents

Publication Publication Date Title
CN105427298A (en) Remote sensing image registration method based on anisotropic gradient dimension space
Ye et al. Fast and robust matching for multimodal remote sensing image registration
Zuo et al. A robust approach to reading recognition of pointer meters based on improved mask-RCNN
CN104200461B (en) The remote sensing image registration method of block and sift features is selected based on mutual information image
CN101556692A (en) Image mosaic method based on neighborhood Zernike pseudo-matrix of characteristic points
CN103914847A (en) SAR image registration method based on phase congruency and SIFT
CN107563438A (en) The multi-modal Remote Sensing Images Matching Method and system of a kind of fast robust
CN104134208B (en) Using geometry feature from slightly to the infrared and visible light image registration method of essence
CN107993258A (en) A kind of method for registering images and device
CN107392215A (en) A kind of multigraph detection method based on SIFT algorithms
CN106919944A (en) A kind of wide-angle image method for quickly identifying based on ORB algorithms
CN105069811A (en) Multi-temporal remote sensing image change detection method
CN103295239A (en) Laser-point cloud data automatic registration method based on plane base images
CN104599258A (en) Anisotropic characteristic descriptor based image stitching method
CN106971404A (en) A kind of robust SURF unmanned planes Color Remote Sensing Image method for registering
CN105631872B (en) Remote sensing image registration method based on multi-characteristic points
CN108765476A (en) A kind of polarization image method for registering
CN111462198B (en) Multi-mode image registration method with scale, rotation and radiation invariance
CN112163622B (en) Global and local fusion constrained aviation wide-baseline stereopair line segment matching method
CN102324045B (en) Invariant-moment target recognition method based on Radon transformation and polar harmonic transformation
CN109308715A (en) A kind of optical imagery method for registering combined based on point feature and line feature
Huang et al. Correlation and local feature based cloud motion estimation
CN107490356A (en) A kind of noncooperative target rotary shaft and rotation angle measuring method
CN107967477A (en) A kind of improved SIFT feature joint matching process
CN105488541A (en) Natural feature point identification method based on machine learning in augmented reality system

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant