CN101251926B - Remote sensing image registration method based on local configuration covariance matrix - Google Patents

Remote sensing image registration method based on local configuration covariance matrix

Info

Publication number
CN101251926B
Authority
CN
China
Prior art keywords
image
registration
local
zoom factor
covariance matrix
Prior art date
Legal status
Expired - Fee Related
Application number
CN2008101023293A
Other languages
Chinese (zh)
Other versions
CN101251926A (en)
Inventor
王鹏波
杨威
陈杰
徐华平
周荫清
Current Assignee
Beihang University
Original Assignee
Beihang University
Priority date
Filing date
Publication date
Application filed by Beihang University filed Critical Beihang University
Priority to CN2008101023293A priority Critical patent/CN101251926B/en
Publication of CN101251926A publication Critical patent/CN101251926A/en
Application granted granted Critical
Publication of CN101251926B publication Critical patent/CN101251926B/en

Landscapes

  • Image Analysis (AREA)

Abstract

The invention provides a remote sensing image registration method based on a local configuration covariance matrix. The method combines corner features, local segmented images, and local edge contours as local features to extract control points; introduces the local configuration covariance matrix into the registration process to automatically extract the rotation and scaling factors between images; and combines an adaptive local-window selection technique to reduce the effect of rotation and scaling on registration, improving the robustness of the registration process. Accurate registration between remote sensing images under large rotation and scaling is thereby achieved without human intervention.

Description

Remote sensing image registration method based on a local configuration covariance matrix
Technical field
The invention belongs to the field of image processing and relates to an image registration method, specifically a remote sensing image registration method based on a local configuration covariance matrix.
Background technology
Image registration is a fundamental problem of image processing: the process of matching and overlaying two or more images of the same scene taken at different times, by different sensors, or from different viewpoints. Strictly speaking, the goal of image registration is to find the best mapping relationship between the input images.
At present, image registration methods fall mainly into three classes:
1. Registration methods based on global gray-level statistics;
Registration methods based on global gray-level statistics use the gray-level statistics of the images themselves to measure the similarity between images and thereby register them. Their great advantage is a simple principle and easy implementation, and they give good results when the images are similar and the affine error is small. For heterogeneous remote sensing images, however, the gray-level characteristics differ considerably and the results are unsatisfactory. Moreover, because an iterative search is used, the processing speed drops sharply when the affine error is large, and the search may even fail to converge.
2. Registration methods based on phase correlation;
Registration methods based on phase correlation rest on the Fourier phase correlation technique: frequency-domain information is used for correlation and the best match is searched for, so that image registration can be performed automatically without control points. The algorithm is insensitive to noise, and because an off-the-shelf FFT algorithm can be used it has a certain advantage in processing speed. In practice, however, it requires a large overlap between the images to be registered and similar gray-level characteristics, which limits its use in remote sensing image registration.
3. Registration methods based on image features;
Feature-based registration methods form the other major class of image registration methods. As shown in Figure 1, after each image is read from its image channel, the features in the images are first extracted, local similarity matching is carried out with the extracted features, the mapping relationship between the images is established from the feature matches, and finally the matched image pair is obtained by image resampling.
Comparatively speaking, because the first two classes of methods depend too much on the gray-level characteristics of the images, their application to remote sensing image registration is limited; feature-based registration has therefore become the common approach for remote sensing image registration, and its core is establishing the mapping relationship between image features.
In establishing feature correspondences, the traditional approach is to judge whether two points in the images are consistent from statistics such as local correlation or mutual information, but this is strongly affected by the rotation and scaling factors: as they increase, the correlation drops rapidly. Similarity tests using affine invariants depend on extracting completely closed edge contours, which is usually difficult for remote sensing images and especially for radar images. Similarity matching based on the aggregation of point features requires the point features extracted from the input images to be highly consistent, which is equally difficult for remote sensing images. How to establish the mapping relationship between input images effectively has therefore become the difficult point of remote sensing image registration.
Summary of the invention
The present invention proposes a remote sensing image registration method based on a local configuration covariance matrix. It combines corner features, local segmented images and local edge contours as local features to extract control points, introduces the local configuration covariance matrix into the registration process to automatically extract the rotation and scaling factors, and combines an adaptive local-window selection technique to reduce the influence of rotation and scaling on registration, improving the robustness of the registration process and achieving accurate registration between remote sensing images under large rotation and scaling without human intervention.
The method first extracts the corner features of the input images; with the corner features as reference, it extracts the local structure features near each corner and uses the covariance matrix of the local structure features to extract the rotation and scaling factors between the input images, reducing the influence of rotation and scaling and improving the precision of corresponding-point matching. It then uses the detected corresponding points to estimate the affine model parameters. Finally, it resamples the images according to the estimated transformation model parameters to obtain the registered remote sensing image pair. Overall, the processing flow comprises six parts: feature selection and extraction, automatic extraction of the rotation and scaling factors, selection of the optimum window size, corner feature matching, affine model parameter estimation, and image resampling.
A remote sensing image registration method based on a local configuration covariance matrix comprises the following steps:
Step 1: read in each image from its image channel, use the Harris corner detection algorithm to detect image corners, and judge whether a corner exists from the corner response function value of each point within a moving search window;
If the corner response function (CRF) value of a point is the maximum within its local neighborhood, the point is set as a candidate corner; otherwise the point is not a corner. After all points in the image have been judged, thresholding is used to extract the final corners.
Step 2: apply the between-class-variance automatic threshold method to perform local threshold segmentation of the reference image and the image to be registered; use the extracted optimal threshold to segment the local image, obtain a local binary image, and obtain the local edge contour from the extracted local binary image.
Step 3: automatic extraction of the rotation and scaling factors. A rotation and scaling factor extraction method based on the local configuration covariance matrix is adopted: using the binary images obtained by threshold segmentation, the covariance matrices of the reference image and of the image to be registered are computed separately, and from the local structure distributions of the two images the rotation factor and scaling factor between them are extracted.
Step 4: choose the analysis window size by comparing the ratio of the window sizes selected in the reference image and in the image to be registered with the scaling factor extracted from the input images, and extract the optimum window.
Step 5: use the extracted rotation and scaling factors and the optimum window size to resample the edge contours extracted in step 2, reducing the influence of rotation and scaling on feature matching; Freeman-code the resampled edge features and measure the similarity between two segments by chain-code correlation; reject mismatched points using the relative positions within each point set and the mutual support between the point sets.
Step 6: after feature point matching is completed, estimate the parameters of the affine model by least squares; after the transformation parameters are determined, resample the input image pair according to the affine parameters to obtain the mutually matched image pair.
In step 3, the rotation and scaling factors are extracted automatically from the local segmented images. Threshold segmentation divides the image into bright and dark classes; the covariance matrices and rotation and scaling factors of the two parts are then computed separately, the correspondence between the images is determined, and the corresponding rotation and scaling factors are extracted.
In step 3, when determining the correspondence between images, two possible correspondences are compared: bright and dark regions of the reference image matched to bright and dark regions of the image to be registered, and bright and dark regions of the reference image matched to dark and bright regions of the image to be registered. The rotation-angle difference and the ratio of the scaling factors are examined, and the correspondence with the smaller rotation-angle difference and with the two axis scaling factors in closer ratio is taken as the correspondence between the two images.
In step 4, the scaling factor extracted by the local configuration covariance matrix is computed, the rotation and scaling factors are calculated together with the ratio of the window sizes selected in the reference image and the image to be registered, and the difference between the two quantities is compared for different window sizes in the image to be registered. If the difference reaches a minimum, the size of this window in the image to be registered is taken as the optimum window size; otherwise the window size of the image to be registered is modified, the scaling factor extracted by the local configuration covariance matrix is recomputed, and the analysis window size is thus corrected adaptively.
The advantages of the remote sensing image registration method based on a local configuration covariance matrix of the present invention are:
(1) The present invention achieves high-precision automatic registration between remote sensing images under large rotation and scaling, strengthens the robustness of remote sensing image registration, improves the efficiency of remote sensing data processing, and lays a foundation for the automatic processing of remote sensing data.
(2) The present invention combines corner features, local segmented images and local edge contours as local features to realize the automatic extraction of corresponding points;
(3) The present invention introduces the local configuration covariance matrix into the registration process to realize the automatic extraction of the rotation and scaling factors between the reference image and the image to be registered, and at the same time combines an adaptive local-window selection technique to reduce the influence of rotation and scaling, improving the robustness of the registration process.
Description of drawings
Fig. 1 is the flow chart of a feature-based registration method;
Fig. 2 is the flow chart of the remote sensing image registration method based on a local configuration covariance matrix of the present invention;
Fig. 3 is the flow chart of the corner feature extraction step of the method of the present invention;
Fig. 4 is the flow chart of the edge feature extraction step of the method of the present invention;
Fig. 5 is the flow chart of the optimum window size extraction step of the method of the present invention;
Fig. 6 is the flow chart of the corner feature matching step of the method of the present invention;
Fig. 7a is a schematic diagram of the rectangular coordinate system used in the method of the present invention;
Fig. 7b is a schematic diagram of the intrinsic coordinate system used in the method of the present invention;
Fig. 8a shows the variation of the rotation factor extracted at different corner points;
Fig. 8b shows the variation of the scaling factor extracted at different corner points;
Fig. 9 shows the analysis result of the optimum window size extraction;
Fig. 10a shows the variation of the inter-segment similarity Dkl before local correction;
Fig. 10b shows the variation of the inter-segment similarity Dkl after local correction;
Fig. 11 shows the registration result of the method of the present invention.
Embodiment
The present invention is described in further detail below with reference to the drawings and embodiments.
The present invention addresses the difficulty of establishing mapping relationships between remote sensing image features under large rotation and scaling. It proposes a remote sensing image registration method based on a local configuration covariance matrix that realizes the automatic extraction of the rotation and scaling factors between remote sensing images, effectively increases the matching precision between remote sensing image features, strengthens the robustness of remote sensing image registration, and achieves accurate registration between remote sensing images under large rotation and scaling without human intervention.
The method first extracts the corner features of the input images; with the corner features as reference, it extracts the local structure features near each corner and uses the covariance matrix of the local structure features to extract the rotation and scaling factors between the input images, reducing the influence of rotation and scaling and improving the precision of corresponding-point matching. It then uses the detected corresponding points to estimate the affine model parameters. Finally, it resamples the images according to the estimated transformation model parameters to obtain the registered remote sensing image pair. Overall, the processing flow comprises six parts: feature selection and extraction, automatic extraction of the rotation and scaling factors, selection of the optimum window size, corner feature matching, affine model parameter estimation, and image resampling.
As shown in Figure 2, the remote sensing image registration method based on a local configuration covariance matrix comprises the following steps:
Step 1: feature selection and extraction.
Feature selection and extraction are the prerequisite of remote sensing image registration; they directly affect the registration result and may even make matching impossible. In principle, a complete edge contour is a good image feature: it has good localization ability and mapping relationships can be established from it easily, so for optical images edge contours alone allow good registration. For radar images, however, strong speckle noise makes it difficult to extract complete edge contours, and registration based directly on edge features gives poor results. To address this problem effectively, the present invention combines corner features and local segmented images with local edge contour features, making full use of the good localization of corner features and the ease with which local structure features establish mapping relationships; moreover, since corner features are usually located where the gray level changes abruptly, the difficulty of extracting local structure features is effectively reduced and the robustness of registration is improved.
Corner feature extraction: the Harris corner detection algorithm is adopted here. The algorithm judges whether a corner exists from the gray-level gradients in all directions within a moving search window. If the gray level changes very little in every direction within the window, the search region contains no corner feature; if it changes markedly in every direction, the feature in the search region is a corner. To judge the gray-level variation effectively, the corner response function CRF is defined, in two forms, the Harris form and the Nobel form:
$$\mathrm{CRF} = \begin{cases} \det M - k\cdot \mathrm{trace}^2(M) & \text{(Harris)} \\[4pt] \dfrac{\mathrm{trace}(M)}{\det M} & \text{(Nobel)} \end{cases} \qquad (1)$$
where M is the gradient matrix:

$$M = G \otimes \begin{bmatrix} I_x^2 & I_x I_y \\ I_x I_y & I_y^2 \end{bmatrix} \qquad (2)$$

Here $I_x$ is the gradient in the x direction, $I_y$ the gradient in the y direction, $G$ a Gaussian template, $\otimes$ the convolution operation, det the matrix determinant, trace the matrix trace, and $k$ a constant coefficient, usually taken as 0.04.
Whether the CRF value of each point is the maximum within its local neighborhood is then judged (for the Nobel corner response function, the minimum). If so, the point is set as a candidate corner; if not, the point is not a corner. Finally, thresholding is used to extract the final corners.
Fig. 3 gives the flow chart of corner extraction.
In practice, the templates

$$\begin{bmatrix} -1 & 0 & 1 \\ -1 & 0 & 1 \\ -1 & 0 & 1 \end{bmatrix} \quad\text{and}\quad \begin{bmatrix} -1 & -1 & -1 \\ 0 & 0 & 0 \\ 1 & 1 & 1 \end{bmatrix}$$

are moved over the image along the azimuth and range directions respectively, the gradient value at each position is computed, and the gradient images in the two directions are obtained. The gradient products at each position are then computed, giving three images whose pixel values correspond to $I_x^2$, $I_y^2$ and $I_x I_y$. Each of these images is convolved with the Gaussian template, the corner response function (CRF) value of each point is computed according to Eq. (1), and it is judged whether the CRF value of each point is the maximum within its local neighborhood (e.g. 5 × 5). If not, the point is not a corner; if so, it is set as a candidate corner. Finally, thresholding is used to retain a number of the best points as the final result.
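To make the corner-detection step concrete, the following is a minimal NumPy/SciPy sketch of a Harris detector of the kind described above. The Prewitt-style gradient templates match Step 1; the Gaussian width, the 5 × 5 non-maximum-suppression neighborhood, k = 0.04 and the relative threshold are illustrative choices rather than values fixed by the patent.

```python
import numpy as np
from scipy.ndimage import convolve, gaussian_filter, maximum_filter

def harris_corners(img, k=0.04, sigma=1.0, nms_size=5, thresh_rel=0.01):
    """Sketch of the Harris corner detection described in Step 1 (parameter values illustrative)."""
    img = img.astype(np.float64)
    # Gradient templates along the two image directions (Prewitt-style, as in the text)
    kx = np.array([[-1, 0, 1], [-1, 0, 1], [-1, 0, 1]], dtype=np.float64)
    ky = kx.T
    Ix = convolve(img, kx)
    Iy = convolve(img, ky)
    # Smooth the three product images with a Gaussian template (the G (x) operation of Eq. (2))
    Sxx = gaussian_filter(Ix * Ix, sigma)
    Syy = gaussian_filter(Iy * Iy, sigma)
    Sxy = gaussian_filter(Ix * Iy, sigma)
    # Harris corner response: CRF = det(M) - k * trace(M)^2
    det_m = Sxx * Syy - Sxy ** 2
    trace_m = Sxx + Syy
    crf = det_m - k * trace_m ** 2
    # Keep points that are local maxima of CRF and exceed a threshold
    local_max = (crf == maximum_filter(crf, size=nms_size))
    corners = np.argwhere(local_max & (crf > thresh_rel * crf.max()))
    return corners  # array of (row, col) corner candidates
```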
Step 2: local structure extraction. On the basis of the corner feature extraction, the local structure feature of each corner is further extracted. From the definition of a corner and the corner extraction method, corner features are usually located where the gray level changes abruptly, so the local region around a corner exhibits obvious gray-level variation; a simple image segmentation method is therefore sufficient to divide the local image into two classes and extract the local structure. The between-class-variance automatic threshold method is used here for threshold segmentation.
As shown in Figure 4, over the gray-level range of the image {0, 1, ..., l-1} a threshold t is selected that divides it into two classes, C0: {0, 1, ..., t} and C1: {t+1, t+2, ..., l-1}. First, the histogram of the local image is computed and normalized; the probability of each gray level is:
$$p_i = n_i / N,\qquad p_i \ge 0,\qquad \sum_{i=0}^{l-1} p_i = 1 \qquad (3)$$

where $n_i$ is the number of pixels with gray level i, N the total number of pixels, and $p_i$ the probability of pixels with that gray level.
The between-class variance $\sigma_B^2$ of the two classes is then:

$$\sigma_B^2 = \omega_0(\mu_0 - \mu_T)^2 + \omega_1(\mu_1 - \mu_T)^2 = \omega_0\omega_1(\mu_1 - \mu_0)^2 \qquad (4)$$

where $\omega_0$ and $\omega_1$ are the occurrence probabilities of class $C_0$ and class $C_1$:

$$\omega_0 = \sum_{i=0}^{t} p_i,\qquad \omega_1 = \sum_{i=t+1}^{l-1} p_i \qquad (5)$$

and $\mu_0$, $\mu_1$ and $\mu_T$ are the means of class $C_0$, class $C_1$ and the whole gray-level range:

$$\mu_0 = \frac{\sum_{i=0}^{t} i\,p_i}{\omega_0},\qquad \mu_1 = \frac{\sum_{i=t+1}^{l-1} i\,p_i}{\omega_1},\qquad \mu_T = \sum_{i=0}^{l-1} i\,p_i \qquad (6)$$
Different segmentation thresholds are set, the between-class variance of the two classes is computed for each according to Eqs. (4), (5) and (6), and with maximum between-class variance as the criterion the optimal threshold $t^*$ is obtained as:

$$t^* = \arg\max_{0 \le t \le l-1} \sigma_B^2 \qquad (7)$$

This threshold is used to complete the binary segmentation of the image. Edge features are then extracted by comparing the value of each point of the binary image with the values in its surrounding neighborhood: where the pixel value changes there is an edge feature, and where it does not change there is none. Finally, by analyzing the relative positions of the corner and the edges, the edge feature nearest to the corner feature is extracted as the local structure feature.
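The between-class-variance threshold of Eqs. (3)-(7) and the neighborhood-based edge extraction can be sketched as follows; the 256-level histogram and the 4-neighborhood edge test are assumptions made for illustration.

```python
import numpy as np

def otsu_threshold(patch, levels=256):
    """Between-class-variance (Otsu) threshold for a local patch, following Eqs. (3)-(7)."""
    hist, _ = np.histogram(patch, bins=levels, range=(0, levels))
    p = hist / hist.sum()                              # Eq. (3): normalized histogram
    best_t, best_var = 0, -1.0
    for t in range(levels - 1):
        w0 = p[:t + 1].sum()                           # Eq. (5)
        w1 = 1.0 - w0
        if w0 == 0 or w1 == 0:
            continue
        mu0 = np.sum(np.arange(t + 1) * p[:t + 1]) / w0             # Eq. (6)
        mu1 = np.sum(np.arange(t + 1, levels) * p[t + 1:]) / w1
        var_b = w0 * w1 * (mu1 - mu0) ** 2             # second form of Eq. (4)
        if var_b > best_var:
            best_var, best_t = var_b, t                # Eq. (7): maximize between-class variance
    return best_t

def local_binary_and_edges(patch, levels=256):
    """Binarize a local patch and mark edge pixels (any pixel whose 4-neighbourhood changes value)."""
    t = otsu_threshold(patch, levels)
    binary = (patch > t).astype(np.uint8)
    edges = np.zeros_like(binary)
    edges[1:-1, 1:-1] = (
        (binary[1:-1, 1:-1] != binary[:-2, 1:-1]) | (binary[1:-1, 1:-1] != binary[2:, 1:-1]) |
        (binary[1:-1, 1:-1] != binary[1:-1, :-2]) | (binary[1:-1, 1:-1] != binary[1:-1, 2:])
    ).astype(np.uint8)
    return binary, edges
```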
Step 3: automatic extraction of the rotation and scaling factors
When a large rotation and scaling error exists between the input images, the local similarity between the two images drops sharply as the error increases, which makes feature matching much more difficult. The rotation and scaling factors between the input images can be extracted effectively from their local covariance matrices, which in turn reduces the influence of rotation and scaling and improves the robustness of feature matching in the registration process.
Any image has an intrinsic coordinate system: in this coordinate system the covariance matrix of the image is a diagonal matrix, and the eigenvalues of the covariance matrix reflect the energy distribution of the image. Taking a star-shaped target as an example, as shown in Fig. 7a there is a star target in the rectangular coordinate system; its intrinsic coordinate system is shown in Fig. 7b, in which the image is symmetric about the axes, its covariance matrix is diagonal, and the eigenvalues reflect the distribution of the image energy. By comparing the intrinsic coordinate systems of the local structures at the corners of the input images, the rotation factor between the reference image and the image to be registered can be extracted, and by comparing the eigenvalues of the covariance matrices the scaling factor between the two images can be obtained. The extraction of the rotation and scaling factors is derived in detail below.
Suppose $I_i$ and $T_j$ are corresponding corners in the reference image and the image to be registered. Then an affine transformation f exists between the local neighborhood $I_0$ of the reference-image corner $I_i$ and the local neighborhood $T_0$ of the corner $T_j$ of the image to be registered, such that

$$T_0 = f(I_0) = A\,I_0 \qquad (8)$$

where A is a 2 × 2 affine transformation matrix.
Before the local scale and rotation factors are extracted, threshold segmentation is applied to the local images according to their local gray-level distribution: points with gray level above the threshold are set to 1 and points below it to 0, forming characteristic images denoted I and T. Although the gray levels of multi-mode remote sensing images differ greatly, the basic structural features of the images are consistent, so the affine relationship still holds between the characteristic images.
A rotation and scaling factor extraction method based on the local configuration covariance matrix is adopted. Using the binary images obtained by threshold segmentation, the covariance matrices $\Sigma_I$ and $\Sigma_T$ of the reference image I and of the image to be registered T are computed respectively; the two covariance matrices satisfy

$$\Sigma_T = A\,\Sigma_I\,A' \qquad (9)$$

where $\Sigma_I = \frac{1}{N}\sum_{j=1}^{N} \vec r_{ij}\vec r_{ij}'$, $\Sigma_T = \frac{1}{N}\sum_{j=1}^{N} \vec r_{tj}\vec r_{tj}'$, N is the number of pixels in the local region, $\vec r_j = \begin{bmatrix} x \\ y \end{bmatrix}$ is the position vector of each point, and the prime denotes transposition.
Because the covariance matrices $\Sigma_I$ and $\Sigma_T$ are normal matrices, there exist orthogonal matrices $U_1$, $U_2$ and diagonal matrices $\Lambda_1$, $\Lambda_2$ such that

$$\Sigma_I = \frac{1}{N}\sum_{j=1}^{N}\vec r_{ij}\vec r_{ij}' = U_1\Lambda_1U_1',\qquad \Sigma_T = \frac{1}{N}\sum_{j=1}^{N}\vec r_{tj}\vec r_{tj}' = U_2\Lambda_2U_2' \qquad (10)$$

where

$$U_1 = \begin{bmatrix} \cos\theta_1 & \sin\theta_1 \\ -\sin\theta_1 & \cos\theta_1 \end{bmatrix},\quad U_2 = \begin{bmatrix} \cos\theta_2 & \sin\theta_2 \\ -\sin\theta_2 & \cos\theta_2 \end{bmatrix},\quad \Lambda_1 = \begin{bmatrix} \lambda_{11} & 0 \\ 0 & \lambda_{12} \end{bmatrix}\ (\lambda_{11}>\lambda_{12}),\quad \Lambda_2 = \begin{bmatrix} \lambda_{21} & 0 \\ 0 & \lambda_{22} \end{bmatrix}\ (\lambda_{21}>\lambda_{22})$$
Substituting Eq. (10) into Eq. (9) gives

$$U_2\Lambda_2U_2' = A\,U_1\Lambda_1U_1'\,A' \qquad (11)$$

Letting $B_2 = U_2\Lambda_2^{0.5}$ and $B_1 = U_1\Lambda_1^{0.5}$, we have

$$A = B_2 B_1^{-1} \qquad (12)$$

In this way, from the local structure distributions of the reference image and the image to be registered, the rotation factor $\theta = \theta_2 - \theta_1$ and the scaling factors

$$\lambda_i = \lambda_{2i} / \lambda_{1i} \qquad (13)$$

between the images can be extracted.
When the rotation and scaling factors are actually extracted, the correspondence between the characteristic images must be determined first. In determining it, two possible correspondences are compared: bright and dark regions of the reference image matched to bright and dark regions of the image to be registered, and bright and dark regions of the reference image matched to dark and bright regions of the image to be registered. The rotation-angle difference and the ratio of the scaling factors in the two cases are compared, and the correspondence with the smaller rotation-angle difference and with the two axis scaling factors $\lambda_1$ and $\lambda_2$ in closer ratio is taken as the correspondence between the two images.
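A compact sketch of the covariance-based extraction of Eqs. (9)-(13) is given below; the eigenvalue ordering, the use of arctan2 for the axis angle, and the centring of the coordinates are implementation choices not spelled out in the text.

```python
import numpy as np

def structure_covariance(binary_patch):
    """Covariance matrix of the foreground pixel coordinates of a local binary patch (cf. Eqs. (9)-(10))."""
    ys, xs = np.nonzero(binary_patch)
    pts = np.stack([xs - xs.mean(), ys - ys.mean()])   # centred position vectors r_j
    return pts @ pts.T / pts.shape[1]

def rotation_and_scale(patch_ref, patch_warp):
    """Rotation angle and per-axis scaling factors between two local binary patches (Eqs. (10)-(13))."""
    cov_i = structure_covariance(patch_ref)
    cov_t = structure_covariance(patch_warp)
    # Eigen-decomposition cov = U diag(lam) U', eigenvalues sorted in descending order
    lam1, u1 = np.linalg.eigh(cov_i)
    lam2, u2 = np.linalg.eigh(cov_t)
    lam1, u1 = lam1[::-1], u1[:, ::-1]
    lam2, u2 = lam2[::-1], u2[:, ::-1]
    theta1 = np.arctan2(u1[1, 0], u1[0, 0])            # orientation of the intrinsic axes
    theta2 = np.arctan2(u2[1, 0], u2[0, 0])
    rotation = theta2 - theta1                          # rotation factor between the patches
    scale = lam2 / lam1                                 # Eq. (13): per-axis eigenvalue ratios
    return rotation, scale
```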
Fig. 8a shows the rotation angles estimated at different corner points of the input images in this example, and Fig. 8b the corresponding scaling factors. As can be seen, the extracted rotation factors vary within 1.2 degrees and the scaling factors within 0.14; both are highly consistent and reflect the rotation and scaling relationship between the input images.
Step 4: selection of the optimum window size
When a large rotation and scaling error exists between two images, the performance of common local matching criteria such as the correlation coefficient, cross-correlation coefficient and mutual information drops sharply. The degradation has two main causes:
1. degradation caused by the rotation and scaling factors themselves;
2. degradation caused, under the influence of the scaling factor, by the difference in image content within windows of the same size. Therefore, when a large rotation and scaling error exists between two images, the rotation and scaling factors must be extracted from the image content and the analysis window size adjusted adaptively to eliminate the influence of the rotation and scaling error.
The basic principle of the adaptive adjustment of the analysis window size is as follows. When the selected window size is optimal, the image content in the two windows is consistent; the ratio of the eigenvalues of the local configuration covariance matrices of the two images then reflects the scaling factor between them, and the ratio of the analysis window sizes in the two images reflects the same scaling factor. In theory the two ratios coincide when the optimum window size is reached. The analysis window size can therefore be chosen by comparing the ratio p of the window sizes selected in the reference image and in the image to be registered with the scaling factor λ extracted from the local configuration covariance matrices of the input images. As shown in Figure 5, the analysis proceeds as follows:
Compute the scaling factor λ extracted by the local configuration covariance matrix, then compute the rotation and scaling factors together with the ratio p of the window sizes selected in the reference image and the image to be registered, and compare the difference between λ and p for different window sizes in the image to be registered. If the difference between λ and p reaches a minimum, take the size of this window in the image to be registered as the optimum window size; otherwise modify the window size of the image to be registered, recompute the scaling factor extracted by the local configuration covariance matrix, and thus adaptively correct the analysis window size.
If the local texture in the reference-image window and in the window of the image to be registered is the same, the ratio of the selected window sizes exactly reflects the scaling factor between the two images, and the scaling factor computed from this ratio agrees with the scaling factor extracted according to Eq. (13).
If the local texture in the reference-image window and in the window of the image to be registered differs, then neither the scaling factor obtained from the ratio of the selected window sizes nor the one extracted according to Eq. (13) correctly describes the scaling between the local images, and a certain deviation exists between the two results.
Whether the optimum window size has been obtained can therefore be judged by comparing the difference between the scaling factors extracted in the two ways: the optimum window size is reached when the difference is smallest. Fig. 9 shows the analysis result of the optimum window size extraction: the selected optimum window fluctuates between 23 and 25, while the final registration shows the theoretical optimum window between the two images to be 23. The extracted optimum window sizes are thus highly consistent and reflect the rotation and scaling relationship between the input images well.
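The adaptive window search of Step 4 can be sketched as a simple loop over candidate window sizes; `extract_scale` stands for any routine implementing Eq. (13), and the square-window cropping is an assumption made for illustration.

```python
import numpy as np

def crop(img, center, size):
    """Square window of the given size centred on a corner point."""
    r, c = center
    h = size // 2
    return img[r - h:r + h + 1, c - h:c + h + 1]

def best_window_size(ref_img, warp_img, corner_ref, corner_warp,
                     ref_win, candidate_sizes, extract_scale):
    """Adaptive window-size search (sketch): choose the window in the image to be registered
    whose size ratio p to the reference window best matches the scaling factor extracted
    from the local structure covariance matrices (extract_scale implements Eq. (13))."""
    best_size, best_diff = None, np.inf
    for w in candidate_sizes:
        p = w / ref_win                                  # ratio of the selected window sizes
        lam = extract_scale(crop(ref_img, corner_ref, ref_win),
                            crop(warp_img, corner_warp, w))
        if abs(lam - p) < best_diff:                     # keep the window minimising |lambda - p|
            best_diff, best_size = abs(lam - p), w
    return best_size
```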
Step 5: corner feature matching
The corner features are matched using the local edge contour features. First, the edge contours extracted in step 2 are resampled using the extracted rotation and scaling factors and the optimum window size, reducing the influence of rotation and scaling on feature matching.
Then, the resampled local edge features are Freeman-coded according to the spatial relationship between the points. The similarity between two chain-coded segments of length N is measured by:
$$D_{kl} = \frac{1}{N}\sum_{j=0}^{N-1}\cos\frac{\pi}{4}\left(a_{k+j} - b_{l+j}\right) \qquad (14)$$

where $a_{k+j}$ and $b_{l+j}$ are the Freeman code values of the two segments. $D_{kl}$ takes values between 0 and 1: the better the two segments match, the closer its value is to 1; the more the two segments differ, the closer it is to 0; when the two segments match exactly, its value is 1.
The similarity of two segments is computed; when the similarity $D_{kl}$ between the two segments satisfies the following conditions, the corresponding pair of corners is regarded as a pair of corresponding points:

$$D_{kl} = \max_{k\in A}(D_{kl}),\qquad D_{kl} = \max_{l\in B}(D_{kl}),\qquad D_{kl} > D_{th} \qquad (15)$$

where A is the corner set extracted from the reference image, B the corner set extracted from the image to be registered, and $D_{th}$ the curve similarity threshold.
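As an illustration of the chain-code similarity of Eqs. (14)-(15), the sketch below Freeman-codes an ordered, 8-connected contour and evaluates $D_{kl}$; the particular code-to-direction mapping is an assumption.

```python
import numpy as np

def freeman_code(points):
    """8-direction Freeman chain code of an ordered list of 8-connected (row, col) contour points."""
    # Each step direction mapped to codes 0..7 (assumed order: E, NE, N, NW, W, SW, S, SE)
    dirs = {(0, 1): 0, (-1, 1): 1, (-1, 0): 2, (-1, -1): 3,
            (0, -1): 4, (1, -1): 5, (1, 0): 6, (1, 1): 7}
    return [dirs[(p2[0] - p1[0], p2[1] - p1[1])] for p1, p2 in zip(points[:-1], points[1:])]

def chain_similarity(a, b, k, l, n):
    """Similarity D_kl of Eq. (14) between two chain-coded segments of length n."""
    a_seg = np.array(a[k:k + n])
    b_seg = np.array(b[l:l + n])
    return np.mean(np.cos(np.pi / 4.0 * (a_seg - b_seg)))
```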
The above steps achieve a coarse matching of the corner features between the remote sensing images and establish point-to-point mapping relationships. However, because of noise and gray-level differences between the images, mismatched point pairs inevitably exist; to improve the precision of registration it is necessary to further reject the wrong matches on the basis of the coarse matching.
To this end, the relative positions within each point set and the mutual support between the point sets are used to reject mismatched points. First, the initial matches described above are sorted by the value of the similarity test. Considering that 4 pairs of control points describe the local spatial structure of the control points well, control-point groups of 4 pairs are formed from all control points in order, and all such groups are searched sequentially. Assuming the affine transformation model holds between them, when the 4 pairs of control points are not spatially collinear the affine matrix estimated from them has a very small error with respect to the true affine matrix. The estimated affine matrix is applied to the control points of the image to be registered, and the average position error of the 4 control-point pairs is used to judge whether they are correct: when the average position error is below a preset threshold, the correct control points are considered found and the search stops. When the average position error exceeds the threshold, or the 4 pairs are spatially collinear, they are considered mismatched and the search continues until all the initially extracted control points have been analyzed.
On the basis of the 4 correct control-point pairs thus determined, the remaining control-point pairs are tested one by one. Each control-point pair to be tested is added to the correct pairs in turn, the affine matrix is re-estimated from the correct pairs, and the average position error of the control points is recomputed. When the average position error exceeds the preset threshold, the newly added pair is considered a wrong match and is deleted from the initial matches; otherwise the newly added pair is considered correct, and the search continues over the remaining pairs until all correct control-point pairs have been extracted. Fig. 6 gives the flow chart of the corner feature matching process.
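The mismatch rejection just described can be sketched as follows; for brevity the candidate 4-pair groups are enumerated exhaustively rather than in similarity order, and the error threshold is illustrative.

```python
import numpy as np
from itertools import combinations

def estimate_affine(ref_pts, warp_pts):
    """Least-squares affine parameters mapping ref_pts onto warp_pts (cf. Eqs. (16)-(18))."""
    H = np.hstack([np.ones((len(ref_pts), 1)), np.asarray(ref_pts, float)])   # rows [1, x_r, y_r]
    params, *_ = np.linalg.lstsq(H, np.asarray(warp_pts, float), rcond=None)  # 3x2 parameter matrix
    return params

def mean_position_error(params, ref_pts, warp_pts):
    H = np.hstack([np.ones((len(ref_pts), 1)), np.asarray(ref_pts, float)])
    return np.mean(np.linalg.norm(H @ params - np.asarray(warp_pts, float), axis=1))

def reject_mismatches(ref_pts, warp_pts, err_thresh=1.0):
    """Sketch of the Step 5 outlier rejection: find one consistent 4-pair seed group,
    then keep the remaining pairs that stay consistent with it (threshold illustrative)."""
    n = len(ref_pts)
    for group in combinations(range(n), 4):             # candidate control-point groups of 4 pairs
        ref4 = [ref_pts[i] for i in group]
        warp4 = [warp_pts[i] for i in group]
        if np.linalg.matrix_rank(np.hstack([np.ones((4, 1)), np.asarray(ref4, float)])) < 3:
            continue                                     # skip (nearly) collinear groups
        params = estimate_affine(ref4, warp4)
        if mean_position_error(params, ref4, warp4) < err_thresh:
            inliers = list(group)
            for i in range(n):                           # test the remaining pairs one by one
                if i in inliers:
                    continue
                trial = inliers + [i]
                params = estimate_affine([ref_pts[j] for j in trial],
                                         [warp_pts[j] for j in trial])
                if mean_position_error(params, [ref_pts[j] for j in trial],
                                       [warp_pts[j] for j in trial]) < err_thresh:
                    inliers = trial                      # accept the newly added pair
            return inliers
    return []
```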
As shown in Figs. 10a and 10b, Fig. 10a is the curve of the similarity $D_{kl}$ without extraction of the rotation and scaling factors and the optimum window size, and Fig. 10b the curve after they are extracted. The results show that eliminating the influence of rotation and scaling between the local images greatly improves the robustness of corner feature matching.
Step 6: model parameter estimation and image resampling
After feature point matching is completed, the parameters of the affine model are estimated by least squares. The estimation proceeds as follows.
Suppose that N control-point pairs have been extracted between the input images in the matching step, namely $\{((x_{r1}, y_{r1}), (x_{w1}, y_{w1})), ((x_{r2}, y_{r2}), (x_{w2}, y_{w2})), \ldots, ((x_{rN}, y_{rN}), (x_{wN}, y_{wN}))\}$. Since the images satisfy an affine relationship, any control-point pair satisfies:

$$\begin{bmatrix} x_{wi} \\ y_{wi} \end{bmatrix} = \begin{bmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{bmatrix}\begin{bmatrix} x_{ri} \\ y_{ri} \end{bmatrix} + \begin{bmatrix} \Delta x \\ \Delta y \end{bmatrix} = \begin{bmatrix} 1 & 0 & x_{ri} & y_{ri} & 0 & 0 \\ 0 & 1 & 0 & 0 & x_{ri} & y_{ri} \end{bmatrix}\begin{bmatrix} \Delta x \\ \Delta y \\ a_{11} \\ a_{12} \\ a_{21} \\ a_{22} \end{bmatrix} \qquad (16)$$
Substituting all the extracted control points into this equation gives the following system:

$$Y = \begin{bmatrix} x_{w1} \\ y_{w1} \\ \vdots \\ x_{wN} \\ y_{wN} \end{bmatrix} = \begin{bmatrix} 1 & 0 & x_{r1} & y_{r1} & 0 & 0 \\ 0 & 1 & 0 & 0 & x_{r1} & y_{r1} \\ & & & \vdots & & \\ 1 & 0 & x_{rN} & y_{rN} & 0 & 0 \\ 0 & 1 & 0 & 0 & x_{rN} & y_{rN} \end{bmatrix}\begin{bmatrix} \Delta x \\ \Delta y \\ a_{11} \\ a_{12} \\ a_{21} \\ a_{22} \end{bmatrix} = H\cdot\begin{bmatrix} \Delta x \\ \Delta y \\ a_{11} \\ a_{12} \\ a_{21} \\ a_{22} \end{bmatrix} \qquad (17)$$

After the position coordinates of all control-point pairs are substituted into the system, the estimate of the affine parameters is:

$$\begin{bmatrix} \Delta x \\ \Delta y \\ a_{11} \\ a_{12} \\ a_{21} \\ a_{22} \end{bmatrix} = (H^T\cdot H)^{-1}\cdot H^T\cdot Y \qquad (18)$$

where the superscript T denotes transposition and the superscript −1 matrix inversion.
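A direct transcription of Eqs. (16)-(18) into NumPy might look like the sketch below; in practice the normal equations could equally be solved with a standard least-squares routine instead of an explicit inverse.

```python
import numpy as np

def affine_from_control_points(ref_pts, warp_pts):
    """Affine parameters (dx, dy, a11, a12, a21, a22) from N control-point pairs via Eqs. (16)-(18)."""
    ref = np.asarray(ref_pts, float)
    warp = np.asarray(warp_pts, float)
    n = len(ref)
    H = np.zeros((2 * n, 6))
    Y = np.zeros(2 * n)
    for i, ((xr, yr), (xw, yw)) in enumerate(zip(ref, warp)):
        H[2 * i] = [1, 0, xr, yr, 0, 0]      # row for the x equation of Eq. (16)
        H[2 * i + 1] = [0, 1, 0, 0, xr, yr]  # row for the y equation
        Y[2 * i], Y[2 * i + 1] = xw, yw
    # Eq. (18): least-squares solution of the over-determined system
    params = np.linalg.inv(H.T @ H) @ H.T @ Y
    dx, dy, a11, a12, a21, a22 = params
    return dx, dy, a11, a12, a21, a22
```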
After the transformation parameters are determined, the input image pair can be resampled according to the affine parameters to obtain the mutually matched image pair; in this processing algorithm the image resampling is realized by bilinear interpolation.
First, the coefficient matrix H of Eq. (17) and the matrix $H^TH$ are computed from the positions of the points in the reference image. The matrix $H^TH$ is then decomposed into a nonsingular diagonal matrix D and a nonsingular unit lower-triangular matrix Λ:

$$D = \mathrm{diag}(d_1, d_2, d_3, d_4, d_5, d_6),\qquad \Lambda = \begin{bmatrix} 1 & & & & & \\ \lambda_{21} & 1 & & & & \\ \lambda_{31} & \lambda_{32} & 1 & & & \\ \lambda_{41} & \lambda_{42} & \lambda_{43} & 1 & & \\ \lambda_{51} & \lambda_{52} & \lambda_{53} & \lambda_{54} & 1 & \\ \lambda_{61} & \lambda_{62} & \lambda_{63} & \lambda_{64} & \lambda_{65} & 1 \end{bmatrix} \qquad (19)$$

where $x_{ij} = h_{ij} - \sum_{k=1}^{j-1} x_{ik}\lambda_{jk}$, $\lambda_{ij} = x_{ij}/d_j$, and $d_i = h_{ii} - \sum_{k=1}^{i-1} x_{ik}\lambda_{ik}$.
The inverses $D^{-1}$ and $\Lambda^{-1}$ of D and Λ are then computed:

$$D^{-1} = \mathrm{diag}\!\left(\tfrac{1}{d_1}, \tfrac{1}{d_2}, \tfrac{1}{d_3}, \tfrac{1}{d_4}, \tfrac{1}{d_5}, \tfrac{1}{d_6}\right),\qquad \Lambda^{-1} = \begin{bmatrix} 1 & & & & & \\ \rho_{21} & 1 & & & & \\ \rho_{31} & \rho_{32} & 1 & & & \\ \rho_{41} & \rho_{42} & \rho_{43} & 1 & & \\ \rho_{51} & \rho_{52} & \rho_{53} & \rho_{54} & 1 & \\ \rho_{61} & \rho_{62} & \rho_{63} & \rho_{64} & \rho_{65} & 1 \end{bmatrix} \qquad (20)$$

where $\rho_{ij} = -\sum_{k=j}^{i-1}\lambda_{ik}\rho_{kj}$ and $\rho_{ii} = 1$ ($j = 1, 2, \ldots, n-1$; $i = j+1, j+2, \ldots, n$).
In this way the inverse of $H^TH$ can be obtained from $D^{-1}$ and $\Lambda^{-1}$; combined with H and the control-point positions in the image to be registered, the affine transformation parameters can be computed.
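The bilinear resampling mentioned above can be sketched as follows; mapping the reference grid through Eq. (16) and clipping at the image border are assumptions about details the text leaves open.

```python
import numpy as np

def resample_bilinear(src, params, out_shape):
    """Resample the image to be registered onto the reference grid using the estimated affine
    parameters (dx, dy, a11, a12, a21, a22) and bilinear interpolation, as described in Step 6."""
    dx, dy, a11, a12, a21, a22 = params
    h, w = out_shape
    yr, xr = np.mgrid[0:h, 0:w]
    # Map each reference-grid position into the image to be registered (Eq. (16))
    xw = dx + a11 * xr + a12 * yr
    yw = dy + a21 * xr + a22 * yr
    x0 = np.clip(np.floor(xw).astype(int), 0, src.shape[1] - 2)
    y0 = np.clip(np.floor(yw).astype(int), 0, src.shape[0] - 2)
    fx = np.clip(xw - x0, 0.0, 1.0)
    fy = np.clip(yw - y0, 0.0, 1.0)
    # Bilinear blend of the four surrounding pixels
    out = (src[y0, x0] * (1 - fx) * (1 - fy) + src[y0, x0 + 1] * fx * (1 - fy)
           + src[y0 + 1, x0] * (1 - fx) * fy + src[y0 + 1, x0 + 1] * fx * fy)
    return out
```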
Table 1 gives the registration results and Figure 11 the curve of the root-mean-square position error of the control points. The reference-image X and Y coordinates give the control-point position in the reference image; the X and Y coordinates of the image to be registered give its position in that image; the resampled X and Y coordinates give the position obtained after transforming the control points of the image to be registered with the extracted affine parameters; and the root-mean-square position error is the distance between the resampled position and the position in the reference image. As Table 1 and Figure 11 show, with a rotation error of 25 degrees and a scaling factor of 1.4, the root-mean-square position error of the 10 pairs of corresponding points after correction is at most 0.94 pixel, i.e. a registration error better than 1 pixel is achieved.
Table 1: Registration results
| No. | Reference image X | Reference image Y | Image to be registered X | Image to be registered Y | Resampled X | Resampled Y | RMS position error (pixels) |
|---|---|---|---|---|---|---|---|
| 0 | 652 | 856 | 943 | 564 | 651.78 | 855.95 | 0.23 |
| 1 | 620 | 679 | 989 | 390 | 619.96 | 678.87 | 0.14 |
| 2 | 498 | 655 | 889 | 317 | 498.54 | 654.95 | 0.54 |
| 3 | 188 | 520 | 665 | 64 | 188.78 | 520.30 | 0.84 |
| 4 | 298 | 758 | 664 | 325 | 298.11 | 757.20 | 0.81 |
| 5 | 45 | 874 | 386 | 324 | 45.88 | 873.67 | 0.94 |
| 6 | 316 | 782 | 670 | 355 | 316.22 | 781.85 | 0.27 |
| 7 | 104 | 549 | 576 | 54 | 103.94 | 548.82 | 0.19 |
| 8 | 442 | 442 | 928 | 100 | 442.21 | 441.87 | 0.25 |
| 9 | 447 | 591 | 869 | 237 | 446.64 | 590.91 | 0.37 |

Claims (4)

1. A remote sensing image registration method based on a local configuration covariance matrix, characterized in that it comprises the steps:
Step 1: read in each image from its image channel, use the Harris corner detection algorithm to detect image corners, and judge whether a corner exists from the corner response function value of each point within a moving search window;
if the corner response function (CRF) value of a point is the maximum within its local neighborhood, the point is set as a candidate corner; otherwise the point is not a corner; after all points in the image have been judged, thresholding is used to extract the final corners;
Step 2: apply the between-class-variance automatic threshold method to perform local threshold segmentation of the reference image and the image to be registered;
use the extracted optimal threshold to segment the local image, obtain a local binary image, and obtain the local edge contour from the extracted local binary image;
Step 3: automatic extraction of the rotation and scaling factors; the automatic extraction of the rotation and scaling factors is realized on the basis of the local configuration covariance matrix and specifically comprises: using the binary images obtained by threshold segmentation, computing the covariance matrices of the reference image and of the image to be registered respectively, and extracting the rotation factor and scaling factor between the images from the local structure distributions of the reference image and the image to be registered;
Step 4: choose the analysis window size by comparing the ratio of the window sizes selected in the reference image and in the image to be registered with the scaling factor extracted from the input images, and extract the optimum window;
Step 5: use the extracted rotation and scaling factors and the optimum window size to resample the edge contours extracted in step 2, reducing the influence of rotation and scaling on feature matching; Freeman-code the resampled edge features and measure the similarity between two segments by chain-code correlation; reject mismatched points using the relative positions within each point set and the mutual support between the point sets;
Step 6: after feature point matching is completed, estimate the parameters of the affine model by least squares; after the transformation parameters are determined, resample the input image pair according to the affine parameters to obtain the mutually matched image pair.
2. The remote sensing image registration method based on a local configuration covariance matrix according to claim 1, characterized in that:
in step 3, the rotation and scaling factors are extracted automatically from the local segmented images; threshold segmentation divides the image into bright and dark classes, the covariance matrices and rotation and scaling factors of the two parts are then computed separately, the correspondence between the images is determined, and the corresponding rotation and scaling factors are extracted.
3. The remote sensing image registration method based on a local configuration covariance matrix according to claim 2, characterized in that:
in step 3, when determining the correspondence between images, two possible correspondences are compared: bright and dark regions of the reference image matched to bright and dark regions of the image to be registered, and bright and dark regions of the reference image matched to dark and bright regions of the image to be registered; the rotation-angle difference and the ratio of the scaling factors are examined, and the correspondence with the smaller rotation-angle difference and with the two axis scaling factors in closer ratio is taken as the correspondence between the two images.
4. The remote sensing image registration method based on a local configuration covariance matrix according to claim 1, characterized in that:
in step 4, the analysis window size is chosen and the optimum window extracted as follows: compute the scaling factor λ extracted by the local configuration covariance matrix, then compute the rotation and scaling factors and the ratio p of the window sizes selected in the reference image and the image to be registered, and compare the difference between λ and p for different window sizes in the image to be registered; if the difference reaches a minimum, take the size of this window in the image to be registered as the optimum window size; otherwise modify the window size of the image to be registered and recompute the scaling factor extracted by the local configuration covariance matrix, thereby adaptively correcting the analysis window size.
CN2008101023293A 2008-03-20 2008-03-20 Remote sensing image registration method based on local configuration covariance matrix Expired - Fee Related CN101251926B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN2008101023293A CN101251926B (en) 2008-03-20 2008-03-20 Remote sensing image registration method based on local configuration covariance matrix

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN2008101023293A CN101251926B (en) 2008-03-20 2008-03-20 Remote sensing image registration method based on local configuration covariance matrix

Publications (2)

Publication Number Publication Date
CN101251926A CN101251926A (en) 2008-08-27
CN101251926B true CN101251926B (en) 2011-08-17

Family

ID=39955305

Family Applications (1)

Application Number Title Priority Date Filing Date
CN2008101023293A Expired - Fee Related CN101251926B (en) 2008-03-20 2008-03-20 Remote sensing image registration method based on local configuration covariance matrix

Country Status (1)

Country Link
CN (1) CN101251926B (en)

Families Citing this family (27)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101403743B (en) * 2008-10-31 2012-07-18 广东威创视讯科技股份有限公司 Automatic separating method for X type overlapping and adhering chromosome
CN101794446B (en) * 2010-02-11 2011-12-14 东南大学 Line search type detection method of image corner point
CN101887582B (en) * 2010-06-03 2011-12-14 西北工业大学 Curve corner point detection method based on difference accumulated values and three-point chain code differences
CN101872480B (en) * 2010-06-09 2012-01-11 河南理工大学 Automatic detection method for position and dimension of speckled characteristic in digital image
CN101922930B (en) * 2010-07-08 2013-11-06 西北工业大学 Aviation polarization multi-spectrum image registration method
CN101916441B (en) * 2010-08-06 2012-02-29 西北工业大学 Freeman chain code-based method for matching curves in digital image
CN101976256A (en) * 2010-11-01 2011-02-16 重庆大学 Double nearest neighbour first searching method in point feature image registration
CN102129669B (en) * 2011-02-24 2012-07-11 武汉大学 Least square area network color-homogenizing method of aerial remote sensing image
CN102136142B (en) * 2011-03-16 2013-03-13 内蒙古科技大学 Nonrigid medical image registration method based on self-adapting triangular meshes
CN102169584B (en) * 2011-05-28 2013-04-03 西安电子科技大学 Remote sensing image change detection method based on watershed and treelet algorithms
CN102446356A (en) * 2011-09-24 2012-05-09 中国测绘科学研究院 Parallel and adaptive matching method for acquiring remote sensing images with homogeneously-distributed matched points
CN103217429B (en) * 2012-01-19 2017-06-06 昆山思拓机器有限公司 Soft board detection partition position-alignment correction method
CN102722732B (en) * 2012-05-30 2015-01-14 清华大学 Image set matching method based on data second order static modeling
CN103324948B (en) * 2013-07-01 2016-04-27 武汉大学 The sane matching process of a kind of low altitude remote sensing image based on line features
CN104021559B (en) * 2014-06-17 2017-04-19 西安电子科技大学 Image registration method based on mutual information and Harris corner point detection
CN104463866B (en) * 2014-12-04 2018-10-09 无锡日联科技有限公司 A kind of local shape matching process based on profile stochastical sampling
CN106056598A (en) * 2016-05-27 2016-10-26 哈尔滨工业大学 Line segment detection and image segmentation fusion-based satellite high-resolution image building contour extraction method
CN107962456B (en) * 2016-10-19 2019-11-12 电子科技大学中山学院 A kind of more electrical axis alignment detection devices of Novel numerical control machine
CN108734706B (en) * 2018-05-21 2022-07-19 东南大学 Rotor winding image detection method fusing regional distribution characteristics and edge scale angle information
CN109255801B (en) * 2018-08-03 2022-02-22 百度在线网络技术(北京)有限公司 Method, device and equipment for tracking edges of three-dimensional object in video and storage medium
CN109255796B (en) * 2018-09-07 2022-01-28 浙江大丰实业股份有限公司 Safety analysis platform for stage equipment
CN109741306B (en) * 2018-12-26 2021-07-06 北京石油化工学院 Image processing method applied to dangerous chemical storehouse stacking
CN109711418B (en) * 2019-01-29 2020-12-01 浙江大学 Contour corner detection method for object plane image
CN110880003B (en) * 2019-10-12 2023-01-17 中国第一汽车股份有限公司 Image matching method and device, storage medium and automobile
CN112233789A (en) * 2020-10-12 2021-01-15 辽宁工程技术大学 Regional feature fusion type hypertensive retinopathy classification method
CN113326856B (en) * 2021-08-03 2021-12-03 电子科技大学 Self-adaptive two-stage feature point matching method based on matching difficulty
CN115082472B (en) * 2022-08-22 2022-11-29 江苏东跃模具科技有限公司 Quality detection method and system for hub mold casting molding product

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1545061A (en) * 2003-11-20 2004-11-10 上海交通大学 Multi-source image registering method on the basis of contour under rigid body transformation
CN1799068A (en) * 2003-07-08 2006-07-05 佳能株式会社 Image registration method improvement
CN101097601A (en) * 2006-06-26 2008-01-02 北京航空航天大学 Image rapid edge matching method based on angle point guiding

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1799068A (en) * 2003-07-08 2006-07-05 佳能株式会社 Image registration method improvement
CN1545061A (en) * 2003-11-20 2004-11-10 上海交通大学 Multi-source image registering method on the basis of contour under rigid body transformation
CN101097601A (en) * 2006-06-26 2008-01-02 北京航空航天大学 Image rapid edge matching method based on angle point guiding

Also Published As

Publication number Publication date
CN101251926A (en) 2008-08-27

Similar Documents

Publication Publication Date Title
CN101251926B (en) Remote sensing image registration method based on local configuration covariance matrix
CN107301661B (en) High-resolution remote sensing image registration method based on edge point features
CN104050681B (en) A kind of road vanishing Point Detection Method method based on video image
CN101976437B (en) High-resolution remote sensing image variation detection method based on self-adaptive threshold division
CN106548462B (en) Non-linear SAR image geometric correction method based on thin-plate spline interpolation
CN101980293B (en) Method for detecting MTF of hyperspectral remote sensing system based on edge image
CN101667293A (en) Method for conducting high-precision and steady registration on diversified sensor remote sensing images
JP2004516533A (en) Synthetic aperture radar and forward-looking infrared image superposition method
Mikhail et al. Detection and sub-pixel location of photogrammetric targets in digital images
CN101923711A (en) SAR (Synthetic Aperture Radar) image change detection method based on neighborhood similarity and mask enhancement
CN106682678B (en) Image corner detection and classification method based on support domain
CN104992403B (en) Hybrid operator image redirection method based on visual similarity measurement
CN104899892A (en) Method for quickly extracting star points from star images
CN103871039A (en) Generation method for difference chart in SAR (Synthetic Aperture Radar) image change detection
CN101853509A (en) SAR (Synthetic Aperture Radar) image segmentation method based on Treelets and fuzzy C-means clustering
CN109461132A (en) SAR image automatic registration method based on feature point geometric topological relation
CN106067172A (en) A kind of underwater topography image based on suitability analysis slightly mates and mates, with essence, the method combined
CN104680536B (en) The detection method changed to SAR image using improved non-local mean algorithm
CN105426872A (en) Face age estimation method based on correlation Gaussian process regression
CN108550146A (en) A kind of image quality evaluating method based on ROI
CN106682689A (en) Image matching method based on multiscale Fourier-Mellin transform
CN106980809A (en) A kind of facial feature points detection method based on ASM
CN105488541A (en) Natural feature point identification method based on machine learning in augmented reality system
CN103324921B (en) A kind of mobile identification method based on interior finger band and mobile identification equipment thereof
CN105374047A (en) Improved bilateral filtering and clustered SAR based image change detection method

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20110817

Termination date: 20160320

CF01 Termination of patent right due to non-payment of annual fee