CN105160686B - A low-altitude multi-view remote sensing image matching method based on an improved SIFT operator - Google Patents


Info

Publication number
CN105160686B
CN105160686B (application CN201510688554.XA)
Authority
CN
China
Prior art keywords
image
point
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201510688554.XA
Other languages
Chinese (zh)
Other versions
CN105160686A (en)
Inventor
邵振峰
李从敏
周维勋
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wuhan University WHU
Original Assignee
Wuhan University WHU
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Wuhan University WHU filed Critical Wuhan University WHU
Priority to CN201510688554.XA priority Critical patent/CN105160686B/en
Publication of CN105160686A publication Critical patent/CN105160686A/en
Application granted granted Critical
Publication of CN105160686B publication Critical patent/CN105160686B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10032Satellite or aerial image; Remote sensing

Abstract

The invention discloses a low-altitude multi-view remote sensing image matching method based on an improved SIFT operator. First, feature points are detected in the first image and the second image with an optimized DoG operator; then an improved SIFT descriptor based on local-region sampling describes each feature point to form a feature vector; finally, initial matches are formed with the NNDR strategy and purified with a RANSAC algorithm based on the epipolar geometric constraint. The invention effectively solves the multi-view and weak-texture problems present in low-altitude remote sensing image matching.

Description

A low-altitude multi-view remote sensing image matching method based on an improved SIFT operator
Technical field
The invention belongs to the field of remote sensing image processing and relates to a remote sensing image matching method, in particular to a low-altitude multi-view remote sensing image matching method based on an improved SIFT operator.
Background technology
Image matching has always been a research emphasis and focus in photogrammetry and remote sensing. As a key step in image orientation, orthophoto production, automatic image registration and stitching, and three-dimensional reconstruction, it directly affects the accuracy and quality of subsequent products. Compared with traditional remote sensing images, low-altitude remote sensing images are acquired under more complex conditions. Images taken from different viewpoints in particular suffer from severe geometric distortion, similar textures, texture fracture, shadows and occlusion, which greatly reduce the similarity of the same target across images with overlapping regions and make low-altitude image matching very difficult. Obtaining reliable and stable matches from low-altitude multi-view images with severe geometric distortion is therefore of great significance.
Image matching algorithms fall broadly into two classes: matching based on image gray values and matching based on image features. Gray-based methods operate directly on the pixels within a window and establish correspondences through the gray-value correlation of those pixels. Although such methods achieve high matching accuracy when the radiometric and geometric differences between images are small, they are sensitive to gray-value changes and cannot cope with the geometric distortion, similar textures, texture fracture, shadows and occlusion found in low-altitude multi-view images. Feature-based methods match geometric features such as points, lines and regions extracted from the images, and can compensate well for the shortcomings of gray-based methods.
At present, the most widely used feature-based method is the SIFT (Scale Invariant Feature Transform) algorithm proposed by Lowe. It handles the rotation, scaling and translation components of image geometric deformation well, but still cannot produce satisfactory results on images with large viewpoint differences. To improve matching across viewpoints, researchers have studied and extended the SIFT operator in various ways, proposing multi-view image matching methods such as ASIFT (Affine-SIFT), ISIFT (Iterative-SIFT) and MM-SIFT (Multi-resolution MSER based SIFT). ASIFT samples the longitude, latitude and scale parameters of the camera affine model to simulate viewpoint changes, builds an affine image space, and then performs SIFT feature detection and matching on all simulated images. The method is fully affine invariant and obtains good matches even under large viewpoint changes, but its time complexity grows accordingly, which limits its application. To balance matching quality and time efficiency, ISIFT iteratively estimates a geometric transformation model for one of the images and matches only the single simulated image against the other, greatly improving matching efficiency; however, when the geometric distortion between images is large, the algorithm fails because the geometric model cannot be solved. MM-SIFT exploits the fact that corresponding local regions in the images satisfy the same affine model: elliptical local regions in the two images are normalized to circles, and SIFT matching is performed within the normalized regions. This greatly improves time efficiency, and some correct matches are still obtained under large viewpoint changes, but the method depends heavily on the extracted local regions: the MSER operator used for extraction has no optimal threshold, and existing region segmentation algorithms remain immature for multi-view images.
Summary of the invention
To address the shortcomings of existing image matching techniques, the object of the invention is to provide a low-altitude multi-view image matching method based on an improved SIFT operator that is time-efficient and robust to image geometric distortion.
The technical solution adopted by the invention is a low-altitude multi-view remote sensing image matching method based on an improved SIFT operator, characterized by comprising the following steps:
Step 1: detect feature points in the first image and the second image using a DoG operator with an adaptive threshold;
Step 2: describe the feature points with the improved SIFT descriptor;
For the feature points extracted from the first and second images, the gray values sampled in a circular region and in an elliptical region of each point's neighborhood are used respectively to describe the point, finally forming 128-dimensional feature vectors;
Step 3: obtain the initial matching point set using the ratio of the nearest-neighbor distance to the second-nearest-neighbor distance between feature vectors (NNDR, Nearest Neighbor Distance Ratio);
Step 4: purify the initial matching point set with a RANSAC algorithm based on the epipolar geometric constraint to obtain the final matches.
Preferably, the DoG operator with an adaptive threshold in step 1 detects the feature points in the first and second images according to an adaptive threshold T_contrast computed from the gray values within a window centered on each feature point, whose expression is:

$$T\_contrast = \frac{1}{4\cdot r^2}\sum_{i=m-r}^{m+r}\sum_{j=n-r}^{n+r}\sqrt{\frac{1}{N^2}\sum_{k=i-\frac{N-1}{2}}^{i+\frac{N-1}{2}}\sum_{l=j-\frac{N-1}{2}}^{j+\frac{N-1}{2}}\left(x_{kl}-\bar{x}_N\right)};$$

where $(m, n)$ are the coordinates of the feature point in the image, $N$ is the window size, $x_{kl}$ is the gray value of a pixel in the DoG scale space, $\bar{x}_N$ is the mean gray value of the pixels in the window, and $r$ is determined by the scale of the image and can be calculated as follows:

$$r = 4\cdot\sigma_0\cdot 2^{\frac{2S_i}{M_I}};$$

where $S_i$ is the scale of the current feature point, $M_I$ is the number of images in the DoG scale space, and $\sigma_0$ is the image smoothing degree.
Preferably, the gray values sampled in a circular region in step 2 describe a feature point as follows: for an arbitrary feature point $(x_i, y_i)$ of the first image, the circular-region sample points $(x_i^{jk}, y_i^{jk})$ in its neighborhood are:

$$x_i^{jk} = x_i + r_{ij}\sin(\theta_k);\qquad y_i^{jk} = y_i + r_{ij}\cos(\theta_k);$$

where $r_{ij}$ is the sampling radius and $\theta_k$ the sampling angle.
Preferably, the gray values sampled in an elliptical region in step 2 describe a feature point as follows: for an arbitrary feature point $(x'_i, y'_i)$ of the second image, the elliptical-region sample points $(x_i'^{jk}, y_i'^{jk})$ in its neighborhood are:

$$x_i'^{jk} = x'_i + a_{ij}\cos(\theta_k)\cos(\theta_0) + b_{ij}\cos(\theta_k)\sin(\theta_0);$$
$$y_i'^{jk} = y'_i + a_{ij}\sin(\theta_k)\sin(\theta_0) - b_{ij}\cos(\theta_k)\cos(\theta_0);$$

where $a_{ij}$ is the major semi-axis of the sampling ellipse, $b_{ij}$ the minor semi-axis, $b_{ij} = a_{ij}/\gamma$, $\gamma \ge 1$, $\theta_0$ the ellipse orientation, and $\theta_k$ the sampling angle.
Preferably, the gradients of the sample points are computed as follows:

$$d_\theta(x_i^{jk}, y_i^{jk}) = I(x_i^{jk}, y_i^{jk}) - I(x_i^{j(k-1)}, y_i^{j(k-1)});$$
$$d_\gamma(x_i^{jk}, y_i^{jk}) = I(x_i^{jk}, y_i^{jk}) - I(x_i^{(j-1)k}, y_i^{(j-1)k});$$
$$m(x_i^{jk}, y_i^{jk}) = \sqrt{d_\theta^2(x_i^{jk}, y_i^{jk}) + d_\gamma^2(x_i^{jk}, y_i^{jk})};$$

where $d_\theta(\cdot)$ and $d_\gamma(\cdot)$ are the gradient values of the point in the $\theta$ and $\gamma$ directions, $m(\cdot)$ is the gradient magnitude of the point, and the gray value $I(\cdot)$ of a sample point is obtained by bilinear interpolation.
Preferably, the elliptical parameters γ and θ0 are determined by an iterative strategy, with the following specific steps:
Step 2.1: randomly select M feature points from the first image and record their feature vectors as D;
Step 2.2: when performing elliptical sampling in the second image, set γ = γ_s and θ0 = 0, obtain the feature vectors of all feature points, record them as D', match the first image and the second image with the nearest-distance method, and record the number of correct matches as n;
Step 2.3: vary the values of γ and θ0, where γ ∈ [γ_s, γ_e] and θ0 ∈ [0°, 90°], obtaining D' and the corresponding number of correct matches n_i for each of the simulated cases;
Step 2.4: compare the matching results of all cases; when max(n_i) is obtained, the corresponding γ* and θ* are the optimal solution;
Step 2.5: select the feature points of the max(n_i) case in the first image as the initial points and repeat steps 2.1 to 2.5 until the values of γ* and θ* stabilize.
Preferably, after the neighborhood sampling in step 2 is completed, the gray values of the sample points are obtained by bilinear interpolation.
Compared with the prior art, the invention has the following features and beneficial effects:
1. An adaptive threshold is used in the feature point extraction stage so that the DoG operator adapts to images with different textures, safeguarding the subsequent matching results;
2. In the feature description stage, elliptical sampling simulates the viewpoint change, increasing the similarity of the same target across images taken from different viewpoints and markedly improving matching in multi-view and weakly textured regions;
3. Only local regions are simulated rather than processing the whole image, which improves matching accuracy while preserving time efficiency.
Brief description of the drawings
Fig. 1: flowchart of the embodiment of the invention;
Fig. 2: schematic diagram of circular-region sampling in the embodiment of the invention;
Fig. 3: schematic diagram of elliptical-region sampling in the embodiment of the invention.
Embodiment
For ease of understanding and implementation by those of ordinary skill in the art, the invention is described in further detail below with reference to the drawings and an embodiment. It should be understood that the embodiment described here serves only to illustrate and explain the invention, not to limit it.
In the low-altitude multi-view remote sensing image matching method based on an improved SIFT operator proposed by the invention, feature points are first extracted on the reference image and the image to be matched with the optimized DoG operator; the features are then described with the SIFT descriptor based on local-region simulation; the feature points are next matched with the ratio of the nearest to the second-nearest distance; and the initial matches are purified with RANSAC based on epipolar geometry to obtain the final matching points.
In a specific implementation, the flow can be run automatically with computer software. To describe the technical solution of the invention in detail, see Fig. 1; the low-altitude multi-view remote sensing image matching method based on an improved SIFT operator provided by the invention comprises the following steps:
Step 1: detect feature points in the first image and the second image using the DoG operator with an adaptive threshold;
To adapt to images with different levels of detail, the invention does not use the fixed threshold of the original DoG; instead, the threshold T_contrast is computed adaptively from the gray values within a window centered on the feature point:

$$T\_contrast = \frac{1}{4\cdot r^2}\sum_{i=m-r}^{m+r}\sum_{j=n-r}^{n+r}\sqrt{\frac{1}{N^2}\sum_{k=i-\frac{N-1}{2}}^{i+\frac{N-1}{2}}\sum_{l=j-\frac{N-1}{2}}^{j+\frac{N-1}{2}}\left(x_{kl}-\bar{x}_N\right)};$$

where $(m, n)$ are the coordinates of the feature point in the image, $N$ is the window size, $x_{kl}$ is the gray value of a pixel in the DoG scale space, $\bar{x}_N$ is the mean gray value of the pixels in the window, and $r$ is determined by the scale of the image and can be calculated as follows:

$$r = 4\cdot\sigma_0\cdot 2^{\frac{2S_i}{M_I}};$$

where $S_i$ is the scale of the current feature point, $M_I$ is the number of images in the DoG scale space, and $\sigma_0$ is the image smoothing degree.
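As a concrete illustration, the adaptive threshold above can be sketched in NumPy. The function names are illustrative and not part of the patent; the claimed formula prints the inner term without a square, which this sketch interprets as the local standard deviation (an assumption), and the scale-dependent radius is rounded to an integer for indexing.

```python
import numpy as np

def adaptive_contrast_threshold(dog, m, n, r, N):
    """Adaptive DoG contrast threshold (sketch of the claimed formula).

    Averages a local gray-value spread statistic over the
    (2r+1) x (2r+1) neighbourhood of the candidate point (m, n) in one
    DoG layer `dog`; N is the odd size of the inner window.
    """
    h = (N - 1) // 2
    acc = 0.0
    for i in range(m - r, m + r + 1):
        for j in range(n - r, n + r + 1):
            win = dog[i - h:i + h + 1, j - h:j + h + 1]
            acc += win.std()          # sqrt of mean squared deviation
    return acc / (4.0 * r ** 2)

def window_radius(s_i, m_levels, sigma0=1.6):
    """r = 4 * sigma_0 * 2^(2 S_i / M_I), rounded to an integer radius."""
    return int(round(4.0 * sigma0 * 2.0 ** (2.0 * s_i / m_levels)))
```

A point is kept as a feature only when its DoG response exceeds the threshold computed for its own neighbourhood, instead of a single global contrast cutoff.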
Step 2: describe the feature points with the improved SIFT descriptor;
For the feature points extracted from the first and second images, the gray values sampled in a circular region and in an elliptical region of each point's neighborhood are used respectively to describe the points, finally forming 128-dimensional feature vectors;
The research object of the invention is low-altitude multi-view remote sensing imagery. Because of the complexity of the imaging conditions, severe geometric distortion exists between images taken from different viewpoints, so a whole image does not satisfy a single affine transformation; however, corresponding local planar regions of overlapping multi-view images can be represented by the same affine transformation. In image geometry, the relationship between a circle and an ellipse can characterize an affine transformation. Therefore, in the feature description stage of the invention, the description region of a feature point is a circular region in one image and a simulated elliptical region in the other, so as to simulate the viewpoint change of the images and increase the similarity of the gray values in the feature neighborhoods, thereby improving the matching of multi-view images.
When the feature points of one of the images are described with a circular region, the size of the local circular description region of each feature point $(x_i, y_i)$ is represented by a radius $w_i$ determined from $\sigma_i$, the scale at which the feature point lies.
Unlike the original SIFT, which directly uses the gray values at neighboring integer coordinates, the invention obtains the gray values in the neighborhood by sampling and interpolation, as shown in Fig. 2. The radius and angle sampling expressions for the local circular description region are:

$$r_{ij} = j\,w_i/s;\qquad \theta_k = k\pi/180;$$

where $s$ is the total number of radius samples, $j = 1, \dots, s$, and $k$ is the sampling angle, $k = 1, \dots, 360°/n, \dots, 360°$.
From the sampling radius and angle, the coordinates of the sample points $(x_i^{jk}, y_i^{jk})$ can be calculated:

$$x_i^{jk} = x_i + r_{ij}\sin(\theta_k);\qquad y_i^{jk} = y_i + r_{ij}\cos(\theta_k);$$

where $r_{ij}$ is the sampling radius and $\theta_k$ the sampling angle. The gray value of each sample point is then obtained by bilinear interpolation and denoted $I(x_i^{jk}, y_i^{jk})$. Bilinear interpolation is a common method and is not described further here.
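The circular sampling and bilinear interpolation just described can be sketched as follows. The names `bilinear` and `circular_samples` are illustrative, and the description-region radius `w` is passed in as a parameter because the exact radius formula is not reproduced in this text.

```python
import numpy as np

def bilinear(img, x, y):
    """Bilinear interpolation of img at real-valued (x, y) = (col, row)."""
    x0, y0 = int(np.floor(x)), int(np.floor(y))
    dx, dy = x - x0, y - y0
    return ((1 - dx) * (1 - dy) * img[y0, x0] +
            dx * (1 - dy) * img[y0, x0 + 1] +
            (1 - dx) * dy * img[y0 + 1, x0] +
            dx * dy * img[y0 + 1, x0 + 1])

def circular_samples(img, xi, yi, w, s=4, n_angles=36):
    """Gray values on s concentric rings of radius j*w/s around (xi, yi),
    sampled every 360/n_angles degrees, per the circular scheme of Fig. 2."""
    vals = np.empty((s, n_angles))
    for j in range(1, s + 1):
        r = j * w / s
        for k in range(n_angles):
            theta = 2 * np.pi * k / n_angles
            vals[j - 1, k] = bilinear(img, xi + r * np.sin(theta),
                                      yi + r * np.cos(theta))
    return vals
```

The returned (rings × angles) array is the raw material for the gradient computation below.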
When the feature points of the other image are described with an elliptical region, the content covered by a feature point's elliptical region should match the content covered by the circular region of its corresponding point as closely as possible, to ensure the matching effect. For each feature point $(x'_i, y'_i)$, the sampling range of the elliptical region is determined by three parameters: the major semi-axis $a_{ij}$, the minor semi-axis $b_{ij}$, and the ellipse orientation $\theta_0$. $a_{ij}$ is computed in the same way as $r_{ij}$; setting $b_{ij} = a_{ij}/\gamma$, the parameters controlling the elliptical region are $\gamma$ and $\theta_0$. The elliptical sampling is shown in Fig. 3; the coordinates of each sample point $(x_i'^{jk}, y_i'^{jk})$ can be calculated as:

$$x_i'^{jk} = x'_i + a_{ij}\cos(\theta_k)\cos(\theta_0) + b_{ij}\cos(\theta_k)\sin(\theta_0);$$
$$y_i'^{jk} = y'_i + a_{ij}\sin(\theta_k)\sin(\theta_0) - b_{ij}\cos(\theta_k)\cos(\theta_0);$$

where $j = 1, \dots, s$, $\theta_0 \in [0°, 180°]$, $\theta_k = k\pi/180$, $k = 1, \dots, 360°/n, \dots, 360°$.
After the neighborhood sampling is completed, the gray values of the elliptical-region sample points are likewise obtained by bilinear interpolation.
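A minimal sketch of the elliptical sample-point coordinates, transcribing the printed formulas verbatim; the function and parameter names are illustrative, not from the patent.

```python
import numpy as np

def elliptic_samples(xi, yi, a_radii, theta0, gamma, n_angles=36):
    """Sample coordinates on nested ellipses around (xi, yi): major
    semi-axis a_ij from `a_radii`, minor semi-axis b_ij = a_ij / gamma,
    orientation theta0 (radians), as printed in the description/claims."""
    pts = []
    for a in a_radii:
        b = a / gamma
        for k in range(n_angles):
            t = 2 * np.pi * k / n_angles   # sampling angle theta_k
            x = xi + a * np.cos(t) * np.cos(theta0) + b * np.cos(t) * np.sin(theta0)
            y = yi + a * np.sin(t) * np.sin(theta0) - b * np.cos(t) * np.cos(theta0)
            pts.append((x, y))
    return pts
```

Feeding these coordinates to the same `bilinear` interpolation used for the circular region yields the elliptical-region gray values.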
Unlike the original SIFT, the gradient is computed not from neighboring pixels such as $(x_i, y_i)$, $(x_i, y_i-1)$, $(x_i-1, y_i)$, but from the neighborhood sample points above. Taking the circular description region as an example, the specific gradient expressions are:

$$d_\theta(x_i^{jk}, y_i^{jk}) = I(x_i^{jk}, y_i^{jk}) - I(x_i^{j(k-1)}, y_i^{j(k-1)});$$
$$d_\gamma(x_i^{jk}, y_i^{jk}) = I(x_i^{jk}, y_i^{jk}) - I(x_i^{(j-1)k}, y_i^{(j-1)k});$$
$$m(x_i^{jk}, y_i^{jk}) = \sqrt{d_\theta^2(x_i^{jk}, y_i^{jk}) + d_\gamma^2(x_i^{jk}, y_i^{jk})};$$

where $d_\theta(\cdot)$ and $d_\gamma(\cdot)$ are the gradient values of the point in the $\theta$ and $\gamma$ directions, and $m(\cdot)$ is the gradient magnitude. From these gray-value gradients, the gradient orientation histogram is computed as in the original SIFT, and the feature vector of each feature point is obtained.
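The ring-based gradients above can be sketched on a (rings × angles) array of sampled gray values. Treating the innermost ring's radial predecessor as itself (zero radial gradient at j = 1) and taking the orientation as atan2 of the two differences are assumptions of this sketch, not statements of the patent.

```python
import numpy as np

def ring_gradients(samples):
    """Gradient magnitude/orientation from ring samples of shape
    (rings, angles): angular difference d_theta = I(j,k) - I(j,k-1)
    and radial difference d_gamma = I(j,k) - I(j-1,k) replace the
    pixel differences of the original SIFT."""
    d_theta = samples - np.roll(samples, 1, axis=1)
    d_gamma = samples - np.vstack([samples[:1], samples[:-1]])
    mag = np.hypot(d_theta, d_gamma)        # sqrt(d_theta^2 + d_gamma^2)
    ori = np.arctan2(d_theta, d_gamma)      # orientation convention assumed
    return mag, ori
```

The magnitudes and orientations are then binned into the usual SIFT orientation histograms to form the 128-dimensional vector.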
The SIFT descriptor based on local-region simulated sampling described above can effectively solve the geometric deformation problem under large viewpoint changes. The key lies in how well the elliptical region simulates the circular region, that is, in the parameters $\gamma$ and $\theta_0$ that control the elliptical region. The invention solves for $\gamma$ and $\theta_0$ with an iterative strategy to ensure the matching effect, with the following specific steps:
Step 2.1: randomly select M feature points from the first image and record their feature vectors as D;
Step 2.2: when performing elliptical sampling in the second image, set γ = γ_s and θ0 = 0, obtain the feature vectors of all feature points, record them as D', match the first image and the second image with the nearest-distance method, and record the number of correct matches as n;
Step 2.3: vary the values of γ and θ0, where γ ∈ [γ_s, γ_e] and θ0 ∈ [0°, 90°], obtaining D' and the corresponding number of correct matches n_i for each of the simulated cases;
Step 2.4: compare the matching results of all cases; when max(n_i) is obtained, the corresponding γ* and θ* are the optimal solution;
Step 2.5: select the feature points of the max(n_i) case in the first image as the initial points and repeat steps 2.1 to 2.5 until the values of γ* and θ* stabilize.
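Steps 2.1 to 2.5 amount to a grid search over (γ, θ0) repeated until the optimum stabilizes. A sketch follows, with one full NNDR matching run abstracted into a caller-supplied `match_count` function (an assumption for illustration; in the patent each trial re-describes and re-matches the feature points).

```python
import itertools

def search_ellipse_params(match_count, gammas, thetas, max_rounds=5):
    """Iterative grid search for (gamma*, theta0*).

    match_count(gamma, theta0) returns the number of correct matches
    n_i for that trial pair; the search repeats over the candidate
    grids until the best parameters stop changing.
    """
    best = None
    for _ in range(max_rounds):
        scores = {(g, t): match_count(g, t)
                  for g, t in itertools.product(gammas, thetas)}
        cand = max(scores, key=scores.get)
        if cand == best:        # gamma*, theta0* have stabilized
            break
        best = cand
    return best
```

Because each trial is a full matching run, the candidate grids are kept small in practice; the patent bounds them by [γ_s, γ_e] and [0°, 90°].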
Step 3: match the feature points with the NNDR strategy
Matching uses the ratio of the nearest-neighbor to the second-nearest-neighbor distance (NNDR, Nearest Neighbor Distance Ratio). In a specific implementation, the Euclidean distance between each pair of feature descriptors is computed; for each feature, if the ratio of the smallest to the second-smallest distance to its two nearest features is below a threshold Δ, the pair is considered a correct match;
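A minimal NNDR sketch under the Euclidean distance described above; the default ratio 0.8 is illustrative, since the patent leaves the threshold Δ unspecified.

```python
import numpy as np

def nndr_match(desc1, desc2, ratio=0.8):
    """NNDR matching: accept (i, j) when the nearest-neighbor distance
    is below `ratio` times the second-nearest distance."""
    matches = []
    for i, d in enumerate(desc1):
        dists = np.linalg.norm(desc2 - d, axis=1)
        j, j2 = np.argsort(dists)[:2]       # nearest and second nearest
        if dists[j] < ratio * dists[j2]:
            matches.append((i, j))
    return matches
```

The accepted pairs form the initial matching point set passed to the epipolar purification of step 4.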
Step 4: reject mismatches with RANSAC based on the epipolar geometric constraint
After feature matching is completed, some mismatches remain in the initial result. The conventional approach uses RANSAC (Random Sample Consensus) to estimate the affine transformation matrix between the images and reject mismatches, but for images with large viewpoint differences a whole image does not satisfy a single affine transformation. Therefore, in the specific implementation, the epipolar geometric constraint is used instead: mismatches are rejected by estimating the fundamental matrix between the image pair.
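A sketch of the purification step: given a fundamental matrix F (e.g. estimated by RANSAC over minimal point samples), matches are kept when their Sampson distance to the epipolar constraint x2ᵀ F x1 = 0 is small. Using the Sampson form and a pixel threshold are illustrative choices; the patent only specifies rejecting mismatches via the fundamental matrix.

```python
import numpy as np

def epipolar_inliers(F, pts1, pts2, thresh=1.0):
    """Boolean inlier mask for matched points under fundamental matrix F.

    pts1, pts2: (n, 2) arrays of corresponding pixel coordinates.
    Keeps a match when its Sampson distance to the epipolar constraint
    is below `thresh` (assumes the denominator is nonzero)."""
    ones = np.ones((len(pts1), 1))
    x1 = np.hstack([pts1, ones])
    x2 = np.hstack([pts2, ones])
    Fx1 = x1 @ F.T            # epipolar lines of pts1 in image 2
    Ftx2 = x2 @ F             # epipolar lines of pts2 in image 1
    num = np.sum(x2 * Fx1, axis=1) ** 2
    den = Fx1[:, 0] ** 2 + Fx1[:, 1] ** 2 + Ftx2[:, 0] ** 2 + Ftx2[:, 1] ** 2
    return num / den < thresh ** 2
```

Inside a RANSAC loop, F is re-estimated from random minimal samples and the F with the largest inlier set under this test is retained, which matches the purification described above.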
It should be understood that the parts not elaborated in this specification belong to the prior art.
It should also be understood that the above description of the preferred embodiment is relatively detailed and should therefore not be regarded as limiting the scope of patent protection of the invention. Under the inspiration of the invention, those of ordinary skill in the art may make substitutions or variations without departing from the scope protected by the claims of the invention, all of which fall within the protection scope of the invention; the claimed scope of the invention is determined by the appended claims.

Claims (5)

1. A low-altitude multi-view remote sensing image matching method based on an improved SIFT operator, characterized by comprising the following steps:
Step 1: detect feature points in the first image and the second image using a DoG operator with an adaptive threshold;
Step 2: describe the feature points with the improved SIFT descriptor;
For the feature points extracted from the first and second images, the gray values sampled in a circular region and the gray values sampled in an elliptical region of each point's neighborhood are used respectively to describe the point, finally forming 128-dimensional feature vectors;
wherein the gray values sampled in the elliptical region describe a feature point as follows: for an arbitrary feature point $(x'_i, y'_i)$ of the second image, the elliptical-region sample points $(x_i'^{jk}, y_i'^{jk})$ in its neighborhood are:
$$x_i'^{jk} = x'_i + a_{ij}\cos(\theta_k)\cos(\theta_0) + b_{ij}\cos(\theta_k)\sin(\theta_0);$$
$$y_i'^{jk} = y'_i + a_{ij}\sin(\theta_k)\sin(\theta_0) - b_{ij}\cos(\theta_k)\cos(\theta_0);$$
where $a_{ij}$ is the major semi-axis of the sampling ellipse, $b_{ij}$ the minor semi-axis, $b_{ij} = a_{ij}/\gamma$, $\gamma \ge 1$, $\theta_0$ the ellipse orientation, and $\theta_k$ the sampling angle;
the elliptical parameters γ and θ0 are determined by an iterative strategy, with the following specific steps:
Step 2.1: randomly select M feature points from the first image and record their feature vectors as D;
Step 2.2: when performing elliptical sampling in the second image, set γ = γ_s and θ0 = 0, obtain the feature vectors of all feature points, record them as D', match the first image and the second image with the nearest-distance method, and record the number of correct matches as n;
Step 2.3: vary the values of γ and θ0, where γ ∈ [γ_s, γ_e] and θ0 ∈ [0°, 90°], obtaining D' and the corresponding number of correct matches n_i for each of the simulated cases;
Step 2.4: compare the matching results of all cases; when max(n_i) is obtained, the corresponding γ* and θ* are the optimal solution, where * is a generic superscript;
Step 2.5: select the feature points of the max(n_i) case in the first image as the initial points and repeat steps 2.1 to 2.5 until the values of γ* and θ* stabilize;
Step 3: obtain the initial matching point set using the ratio of the nearest-neighbor distance to the second-nearest-neighbor distance between feature vectors;
Step 4: purify the initial matching point set with a RANSAC algorithm based on the epipolar geometric constraint to obtain the final matching points.
2. The low-altitude multi-view remote sensing image matching method based on an improved SIFT operator according to claim 1, characterized in that: the DoG operator with an adaptive threshold in step 1 detects the feature points in the first and second images according to an adaptive threshold T_contrast computed from the gray values within a window centered on each feature point, whose expression is:
$$T\_contrast = \frac{1}{4\cdot r^2}\sum_{i=m-r}^{m+r}\sum_{j=n-r}^{n+r}\sqrt{\frac{1}{N^2}\sum_{k=i-\frac{N-1}{2}}^{i+\frac{N-1}{2}}\sum_{l=j-\frac{N-1}{2}}^{j+\frac{N-1}{2}}\left(x_{kl}-\bar{x}_N\right)};$$
where $(m, n)$ are the coordinates of the feature point in the image, $N$ is the window size, $x_{kl}$ is the gray value of a pixel in the DoG scale space, $\bar{x}_N$ is the mean gray value of the pixels in the window, and $r$ is determined by the scale of the image, calculated as follows:
$$r = 4\cdot\sigma_0\cdot 2^{\frac{2S_i}{M_I}};$$
where $S_i$ is the scale of the current feature point, $M_I$ is the number of images in the DoG scale space, and $\sigma_0$ is the image smoothing degree.
3. The low-altitude multi-view remote sensing image matching method based on an improved SIFT operator according to claim 1, characterized in that: the gray values sampled in the circular region in step 2 describe a feature point as follows: for an arbitrary feature point $(x_i, y_i)$ of the first image, the circular-region sample points $(x_i^{jk}, y_i^{jk})$ in its neighborhood are:
$$x_i^{jk} = x_i + r_{ij}\sin(\theta_k);$$
$$y_i^{jk} = y_i + r_{ij}\cos(\theta_k);$$
where $r_{ij}$ is the sampling radius and $\theta_k$ the sampling angle.
4. The low-altitude multi-view remote sensing image matching method based on an improved SIFT operator according to claim 1, characterized in that: the gradient of the region-sampled feature points in step 2 is computed as follows:
$$d_\theta(x_i^{jk}, y_i^{jk}) = I(x_i^{jk}, y_i^{jk}) - I(x_i^{j(k-1)}, y_i^{j(k-1)});$$
$$d_\gamma(x_i^{jk}, y_i^{jk}) = I(x_i^{jk}, y_i^{jk}) - I(x_i^{(j-1)k}, y_i^{(j-1)k});$$
$$m(x_i^{jk}, y_i^{jk}) = \sqrt{d_\theta^2(x_i^{jk}, y_i^{jk}) + d_\gamma^2(x_i^{jk}, y_i^{jk})};$$
Wherein $d_\theta(x_i^{jk}, y_i^{jk})$ denotes the gradient of the feature point $(x_i^{jk}, y_i^{jk})$ in the $\theta$ direction, $d_\gamma(x_i^{jk}, y_i^{jk})$ denotes its gradient in the $\gamma$ direction, and $m(x_i^{jk}, y_i^{jk})$ denotes its gradient magnitude, from which the gradient direction of the feature point is likewise obtained; the gray value of each sampled point is computed by bilinear interpolation and is denoted $I(*)$, where $*$ is a generic argument.
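A minimal sketch of the gradient computation defined by the expressions above, applied to a polar grid of sampled gray values. The array layout and function name are illustrative: ring index $j$ runs along rows, angle index $k$ along columns, and the $k-1$ neighbour wraps cyclically around each ring:

```python
import numpy as np

def ring_gradients(samples_gray):
    """Compute angular and radial gradients on a polar sample grid.

    samples_gray: array of shape (n_rings, n_angles) holding the gray
    value I(x_i^{jk}, y_i^{jk}) at each sample.  Following the claim:
      d_theta = I(j, k) - I(j, k-1)   (previous sample on the same ring)
      d_gamma = I(j, k) - I(j-1, k)   (same angle on the inner ring)
      m       = sqrt(d_theta**2 + d_gamma**2)
    Values are returned for rings j >= 1, since d_gamma needs an inner ring.
    """
    g = np.asarray(samples_gray, dtype=float)
    d_theta = g[1:, :] - np.roll(g, 1, axis=1)[1:, :]  # cyclic k-1 neighbour
    d_gamma = g[1:, :] - g[:-1, :]                     # inner-ring neighbour
    m = np.sqrt(d_theta ** 2 + d_gamma ** 2)
    return d_theta, d_gamma, m

g = np.array([[1.0, 2.0, 3.0, 4.0],
              [2.0, 4.0, 6.0, 8.0]])
d_t, d_g, m = ring_gradients(g)
print(m.shape)  # (1, 4)
```

The gradient direction mentioned in the claim would be obtained from the same pair $(d_\theta, d_\gamma)$ per sample.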
5. The low-altitude multi-view remote sensing image matching method based on an improved SIFT operator according to claim 1, characterized in that after neighborhood point sampling is completed in step 2, the gray value of each sampled point is obtained by bilinear interpolation.
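A minimal sketch of the bilinear interpolation this claim refers to, fetching a gray value at a non-integer sample coordinate from its four neighbouring pixels. The function name and the row/column indexing convention are illustrative choices:

```python
import numpy as np

def bilinear(img, x, y):
    """Bilinearly interpolate the gray value of img at real coordinates (x, y).

    img is indexed as img[row, col] = img[y, x]; (x, y) must lie strictly
    inside the image so that all four neighbouring pixels exist.
    """
    x0, y0 = int(np.floor(x)), int(np.floor(y))
    dx, dy = x - x0, y - y0
    # Weighted sum of the four surrounding pixels.
    return ((1 - dx) * (1 - dy) * img[y0, x0]
            + dx * (1 - dy) * img[y0, x0 + 1]
            + (1 - dx) * dy * img[y0 + 1, x0]
            + dx * dy * img[y0 + 1, x0 + 1])

img = np.array([[0.0, 10.0],
                [20.0, 30.0]])
print(bilinear(img, 0.5, 0.5))  # 15.0
```

At the exact center of a 2x2 patch the result is simply the mean of the four pixels, which gives a quick sanity check.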
CN201510688554.XA 2015-10-21 2015-10-21 Low-altitude multi-view remote sensing image matching method based on an improved SIFT operator Active CN105160686B (en)

Publications (2)

Publication Number Publication Date
CN105160686A CN105160686A (en) 2015-12-16
CN105160686B true CN105160686B (en) 2017-08-25

Family

ID=54801528


Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105469415A (en) * 2015-12-28 2016-04-06 云南师范大学 Multi-view remote sensing image fusion method
CN107909094A (en) * 2017-10-31 2018-04-13 天津大学 Adaptive-threshold multi-target SIFT matching algorithm implementation method
CN108021886B (en) * 2017-12-04 2021-09-14 西南交通大学 Method for matching local significant feature points of repetitive texture image of unmanned aerial vehicle
CN109579847B (en) * 2018-12-13 2022-08-16 歌尔股份有限公司 Method and device for extracting key frame in synchronous positioning and map construction and intelligent equipment
CN109784223B (en) * 2018-12-28 2020-09-01 珠海大横琴科技发展有限公司 Multi-temporal remote sensing image matching method and system based on convolutional neural network
CN110856112B (en) * 2019-11-14 2021-06-18 深圳先进技术研究院 Crowd-sourcing perception multi-source information fusion indoor positioning method and system
CN111739079B (en) * 2020-06-18 2022-10-11 东华理工大学 Multisource low-altitude stereopair fast matching method based on semantic features
CN115019181B (en) * 2022-07-28 2023-02-07 北京卫星信息工程研究所 Remote sensing image rotating target detection method, electronic equipment and storage medium

Citations (2)

Publication number Priority date Publication date Assignee Title
CN101140624A (en) * 2007-10-18 2008-03-12 清华大学 Image matching method
CN103400384A (en) * 2013-07-22 2013-11-20 西安电子科技大学 Large-viewing-angle image matching method combining region matching and point matching


Non-Patent Citations (3)

Title
Min Chen et al., "Invariant matching method for different viewpoint angle images," Applied Optics, vol. 52, no. 1, pp. 96-104, Jan. 2013. *
Min Chen et al., "Robust Affine-Invariant Line Matching for High Resolution Remote Sensing Images," Photogrammetric Engineering & Remote Sensing, vol. 79, no. 8, Aug. 2013 (cited at p. 754, left column para. 3 and right column para. 1). *
Chen Yueling, "Large-viewing-angle image matching method combining region matching and point matching," China Master's Theses Full-text Database, Information Science and Technology, no. 11, Nov. 2014 (cited at pp. 11, 13, 27, 29, 30, 34 and Fig. 3.2). *



Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant