CN109523585B - Multisource remote sensing image feature matching method based on direction phase consistency - Google Patents

Multisource remote sensing image feature matching method based on direction phase consistency

Info

Publication number
CN109523585B
Authority
CN
China
Prior art keywords
points
point
image
matching
phase consistency
Prior art date
Legal status
Active
Application number
CN201811375054.0A
Other languages
Chinese (zh)
Other versions
CN109523585A (en)
Inventor
杨宇辉
李欣
杨博
Current Assignee
Wuhan University WHU
Original Assignee
Wuhan University WHU
Priority date
Filing date
Publication date
Application filed by Wuhan University WHU filed Critical Wuhan University WHU
Priority to CN201811375054.0A
Publication of CN109523585A
Application granted
Publication of CN109523585B
Active


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/30Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T7/33Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)
  • Image Analysis (AREA)

Abstract

The invention provides a multisource remote sensing image feature matching method based on directional phase consistency, which achieves high-precision matching of multisource remote sensing images. The method comprises the steps of: performing orthorectification on a reference image and a target image; partitioning the reference image into blocks and extracting feature points; calculating directional phase consistency energy maps of the reference image and the target image in multiple directions to obtain a directional phase consistency vector for each pixel; constructing feature description vectors of the feature points from a dense combination of the directional phase consistency vectors; and matching the images and removing gross errors and mismatched points. The invention provides an effective multisource remote sensing image matching method that can obtain reliable, high-precision homonymous (corresponding) points between images from various sources.

Description

Multisource remote sensing image feature matching method based on direction phase consistency
Technical Field
The invention belongs to the field of remote sensing image processing, and particularly relates to a multisource remote sensing image feature matching method based on directional phase consistency.
Background
Image matching is the process of identifying homonymous (corresponding) points between two or more images by means of a matching algorithm, and is an important preliminary step for image registration, image fusion, target recognition and change detection. In the remote sensing field, image matching can be divided into two types. The first is connection (tie) point matching, which obtains homonymous image points between two or more images with a certain degree of overlap; an adjustment equation is established from the image point coordinates and a suitable mathematical model, the imaging model parameters of the images (such as RPC model coefficients or rigorous imaging model coefficients) are corrected, registration between the images is achieved, and the relative positioning accuracy of the images is improved. The second is control point matching, which obtains homonymous image points between a digital orthophoto map (DOM) and a satellite image; an adjustment equation is established using the image point coordinates on the DOM, whose object-space coordinates are known, together with a suitable mathematical model, the imaging model parameters of the satellite image are corrected, and its absolute positioning accuracy is improved.
In matching heterogeneous remote sensing images, the differing imaging characteristics of different sensors generally produce significant nonlinear radiometric differences between the images, so traditional matching algorithms based on gray-level correlation or gradient features have difficulty obtaining reliable homonymous points. These differences in sensor imaging characteristics can also cause features to be shifted or missing on different images, which makes it difficult for matching algorithms based on point and line features to extract the same point and line features from the two images, again reducing the reliability of matching. Therefore, to meet the high-precision matching requirements of multisource remote sensing images, the invention provides a multisource remote sensing image feature matching method based on directional phase consistency.
Disclosure of Invention
The invention aims to solve the difficulty of matching between heterogeneous remote sensing images, and provides a multisource remote sensing image matching method based on combined directional phase consistency energy features, which is characterized by high matching precision and reliable results.
The invention provides a multisource remote sensing image feature matching method based on directional phase consistency, which achieves high-precision matching of multisource remote sensing images based on directional phase consistency and comprises the following steps:
step 1, performing orthorectification on the reference image and the target image;
step 2, performing block partitioning and feature point extraction only on the reference image;
step 3, calculating directional phase consistency energy maps of the reference image and the target image in multiple directions, respectively, to obtain the directional phase consistency vector of each pixel;
step 4, constructing the feature description vectors of the feature points based on a dense combination of the directional phase consistency vectors;
step 5, image matching, including determining the homonymous point of each feature point through sliding-window search and the correlation coefficient of the feature description vectors, and extracting matching point pairs;
step 6, performing gross error elimination to remove mismatched points.
Moreover, step 2 is implemented by first dividing the reference image into grid regions of equal size, then performing non-maximum suppression within each grid region, retaining several points whose response values exceed a preset threshold and are local maxima as feature corners, and taking these feature corners as the feature points to be matched.
Moreover, in step 3, the directional phase consistency vector of each pixel is a one-dimensional vector composed of the phase consistency energy in each direction.
Furthermore, the implementation of constructing the feature description vector is as follows,
taking the point to be described (x_0, y_0) as the center, selecting points at intervals of D in the row and column directions within a preset radius, obtaining s points in total, and recording this group of points as [(x_0, y_0), (x_1, y_1), …, (x_{s-1}, y_{s-1})];
taking each point of the group as a center, determining an accumulation domain of size t × t, accumulating the directional phase consistency vectors of all pixels within each accumulation range to obtain s accumulated vectors in total, and finally concatenating the s vectors into a one-dimensional vector V_f as the feature description vector of the point (x_0, y_0).
And in step 5, after the homonymous points of the characteristic points are determined, peak fitting is carried out to obtain the matching point coordinates of the sub-pixel level on the target image.
In step 6, the mismatching points are removed by the least square adjustment method.
The method takes remote sensing images acquired by two different types of sensor as input, eliminates the geometric deformation caused by rotation, scale differences and imaging-angle differences between the images through image pre-correction, performs block partitioning and feature point extraction on the reference image to control the number, position and distribution of matching points, calculates the directional phase consistency energy maps of the target image and the reference image, constructs feature vectors for the feature points and for all pixels within the search range, determines homonymous points by the normalized cross-correlation coefficient, and finally removes gross error points by a least squares adjustment with an affine transformation model to obtain the correct matching points. The invention provides an effective multisource remote sensing image matching method that can obtain reliable, high-precision homonymous points between optical, infrared, multispectral and SAR images. The invention has the following advantages:
1. Feature offset and feature loss are common between heterogeneous remote sensing images, so it is difficult to extract the same feature points from both images, and matching based on feature points extracted from both images easily fails. The method extracts feature points only from the reference image, which not only reduces the amount of computation but also avoids the situation where homonymous image points cannot be obtained on some images because of the low repeatability of feature points, greatly improving the reliability of matching.
2. Pre-correcting the images overcomes the scale and rotation differences between heterogeneous remote sensing images and improves matching efficiency and reliability.
3. Directional phase consistency filters the image with filters in different directions, so the phase energy of the image in each direction is obtained directly and the image features are better preserved. Compared with feature descriptors constructed by phase-consistency orientation weighting, the steps of weighting the energy in each direction and computing the gradient orientation are omitted, so the computation is faster.
4. Sparse descriptors in the prior art, such as the SIFT (scale-invariant feature transform) and SURF (speeded-up robust features) operators, lose a large amount of information and are not suitable for matching heterogeneous images with large feature differences; the dense feature description adopted by the method can well overcome the matching problems caused by slight geometric deformation and feature offset between the matching regions.
Drawings
FIG. 1 is a schematic flow chart of an embodiment of the present invention.
Fig. 2 is a schematic diagram of point selection in a process of constructing a feature description vector according to an embodiment of the present invention.
Detailed Description
The following detailed description of the invention refers to the accompanying drawings and examples.
The invention provides a multisource remote sensing image feature matching method based on directional phase consistency, which comprises: pre-correcting the images before matching; in the feature point extraction stage, performing block partitioning and feature point extraction only on the reference image; calculating the phase consistency energy map of each direction; constructing the feature vectors of the feature points; image matching, in which homonymous points are determined by sliding-window search and correlation coefficients; and eliminating mismatched points.
Referring to fig. 1, a flow of the multi-source remote sensing image feature matching method based on directional phase consistency according to the embodiment includes the following steps:
Step 1, pre-correcting the reference image and the target image. The main purpose of this step is to eliminate the significant geometric distortions caused by rotation, scale differences and imaging angle between the images. This step can also be called image orthorectification: using the imaging model parameters of the remote sensing image together with existing DEM data or an average elevation, the image is projected from image space to object space to eliminate the rotation and imaging-angle differences between the images, and the scale differences caused by differing image resolutions are eliminated by image resampling.
Pre-correcting the images before matching eliminates the rotation and scale differences between them and strengthens the matching constraints.
Step 2, partitioning the reference image into blocks and extracting feature points.
The invention overcomes the prejudice of the prior art and performs block partitioning and feature point extraction only on the reference image during the feature point extraction stage; feature point extraction on the target image is not required.
The embodiment selects the Harris algorithm to extract feature corners. The Harris algorithm considers that, at a feature corner, sliding a window in any direction should produce a large gray-level variation. The principle of the algorithm is as follows:
E_{x,y} = \sum_{u,v} w_{u,v}\,\big[I_{x+u,\,y+v} - I_{u,v}\big]^2 \approx \sum_{u,v} w_{u,v}\,\big[x\,G_x + y\,G_y + O(x^2, y^2)\big]^2
where E_{x,y} represents the Harris response of the pixel at image coordinates (x, y); w_{u,v} is a Gaussian weight function; u, v denote the offsets in the X and Y directions on the image, respectively; I denotes the image matrix, I_{x+u,y+v} is the image gray level at coordinates (x + u, y + v) and I_{u,v} is the image gray level at image coordinates (u, v); G_x, G_y are the gradient values of the image in the X and Y directions, respectively; and O(x^2, y^2) is a second-order small term in x and y.
The calculation formula is as follows:
\mathrm{Det} = (w \otimes G_x^2)\,(w \otimes G_y^2) - (w \otimes G_x G_y)^2
\mathrm{Tr} = w \otimes G_x^2 + w \otimes G_y^2
R = \mathrm{Det} - k\,\mathrm{Tr}^2
where G_x and G_y denote the gradients of the original image in the x and y directions, respectively; w is a Gaussian convolution template used to suppress the influence of image noise; R is the resulting Harris response of the image, computed from the determinant Det and trace Tr of the autocorrelation matrix, and the magnitude of R indicates the corner strength; k is a fixed constant with an empirical value of 0.04.
To ensure a uniform distribution of feature points, the reference image is divided into a certain number of grid regions of equal size, non-maximum suppression is then performed within each region, and the Z points whose response values exceed the threshold and are local maxima are retained as feature corners, which are taken as the feature points to be matched. In specific implementation, the number of grid regions, the threshold and the value of Z can be preset; the number of grids and the size of Z are determined by the adjustment model and the accuracy requirement. A 5 × 5 grid is generally adopted, 3-5 feature points are retained in each grid cell for matching, and the threshold can be set to 0.7 times the maximum R value.
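For illustration, the following is a minimal sketch of step 2 in Python (NumPy/SciPy assumed), using the suggested values above (5 × 5 grid, k = 0.04, threshold 0.7 × max R, a few points per cell); the non-maximum suppression is simplified here to keeping the Z strongest responses per grid cell, and the function names are illustrative rather than taken from the patent.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, sobel

def harris_response(img, sigma=1.0, k=0.04):
    """R = Det - k * Tr^2 of the Gaussian-smoothed autocorrelation matrix."""
    img = img.astype(np.float64)
    gx = sobel(img, axis=1)                      # gradient G_x (along columns)
    gy = sobel(img, axis=0)                      # gradient G_y (along rows)
    a = gaussian_filter(gx * gx, sigma)          # w ⊗ G_x^2
    b = gaussian_filter(gy * gy, sigma)          # w ⊗ G_y^2
    c = gaussian_filter(gx * gy, sigma)          # w ⊗ G_x G_y
    det, tr = a * b - c * c, a + b
    return det - k * tr ** 2

def block_feature_points(img, grid=(5, 5), z=4, rel_thresh=0.7):
    """Keep at most z strong corners per grid cell whose R exceeds the threshold."""
    r = harris_response(img)
    thresh = rel_thresh * r.max()
    h, w = r.shape
    pts = []
    for i in range(grid[0]):
        for j in range(grid[1]):
            r0, r1 = i * h // grid[0], (i + 1) * h // grid[0]
            c0, c1 = j * w // grid[1], (j + 1) * w // grid[1]
            cell = r[r0:r1, c0:c1]
            for idx in np.argsort(cell, axis=None)[::-1][:z]:   # strongest first
                y, x = np.unravel_index(idx, cell.shape)
                if cell[y, x] > thresh:
                    pts.append((r0 + y, c0 + x))                # (row, col) of corner
    return pts
```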
Step 3, calculating the directional phase consistency energy maps of the reference image and the target image in multiple directions, respectively, to obtain the directional phase consistency vector of each pixel.
The directional phase consistency is calculated in a certain direction o, and the calculation formula is as follows:
PC_o(x, y) = \frac{\sum_n W_o(x, y)\,\big\lfloor A_{no}(x, y)\,\Delta\Phi_{no}(x, y) - T_o \big\rfloor}{\sum_n A_{no}(x, y) + \varepsilon}
where PC_o(x, y) is the phase consistency energy of the pixel at image coordinates (x, y) in direction o, with values ranging from 0 to 1; n denotes the scale, and generally 3 or more scales are used; W_o is a Gaussian weight function; A_{no} is the amplitude obtained by filtering at scale n with a Log-Gabor filter oriented in direction o; ΔΦ_{no} is the phase difference function; T_o is the noise estimate; and ε is a very small constant that prevents the divisor from approaching 0.
The phase difference function at scale n in direction o preferably used in the embodiment is expressed as follows:

\Delta\Phi_{no}(x, y) = \cos\!\big(\phi_{no}(x, y) - \bar{\phi}_o(x, y)\big) - \big|\sin\!\big(\phi_{no}(x, y) - \bar{\phi}_o(x, y)\big)\big|

where φ_{no}(x, y) is the phase value at image coordinates (x, y) at scale n in direction o, and \bar{\phi}_o(x, y) is the weighted average phase value over all scales in direction o at image coordinates (x, y).
For a pair of input images of size M × N, the phase consistency energy is calculated in each direction. The number of directions is preferably four, i.e., the direction o takes the values 0, π/4, π/2 and 3π/4, yielding a group of directional phase consistency energy maps, each also of size M × N. Any point (x, y) on the image can then be described by a one-dimensional vector V_p(x, y) composed of the phase consistency energies PC_o(x, y) in each direction, i.e., the directional phase consistency vector, whose length equals the number of directions.
In the embodiment, V_p(x, y) = (PC_0(x, y), PC_{π/4}(x, y), PC_{π/2}(x, y), PC_{3π/4}(x, y)).
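A simplified sketch of this computation is given below, assuming frequency-domain Log-Gabor filters; the weight W_o and the noise estimate T_o are reduced to constants, so this approximates the formula above rather than reproducing the patented implementation exactly, and the filter parameters (min_wavelength, mult, sigma_f, sigma_theta) are illustrative defaults.

```python
import numpy as np

def log_gabor_bank(shape, scales=4, orients=(0, np.pi/4, np.pi/2, 3*np.pi/4),
                   min_wavelength=3, mult=2.1, sigma_f=0.55, sigma_theta=0.4):
    """Frequency-domain Log-Gabor filters for each (scale, orientation) pair."""
    rows, cols = shape
    fy = np.fft.fftfreq(rows)[:, None]
    fx = np.fft.fftfreq(cols)[None, :]
    radius = np.sqrt(fx ** 2 + fy ** 2)
    radius[0, 0] = 1.0                                   # avoid log(0) at DC
    theta = np.arctan2(-fy, fx)
    bank = {}
    for o in orients:
        d_theta = np.arctan2(np.sin(theta - o), np.cos(theta - o))
        spread = np.exp(-d_theta ** 2 / (2 * sigma_theta ** 2))
        for n in range(scales):
            f0 = 1.0 / (min_wavelength * mult ** n)      # centre frequency at scale n
            radial = np.exp(-np.log(radius / f0) ** 2 / (2 * np.log(sigma_f) ** 2))
            radial[0, 0] = 0.0
            bank[(n, o)] = radial * spread
    return bank

def directional_pc(img, scales=4, orients=(0, np.pi/4, np.pi/2, 3*np.pi/4),
                   noise_t=0.1, eps=1e-4):
    """Return a dict o -> PC_o map; stacking the maps gives V_p(x, y) per pixel."""
    F = np.fft.fft2(img.astype(np.float64))
    bank = log_gabor_bank(img.shape, scales, orients)
    pc = {}
    for o in orients:
        resp = [np.fft.ifft2(F * bank[(n, o)]) for n in range(scales)]  # even + j*odd
        sum_a = sum(np.abs(r) for r in resp) + eps       # sum_n A_no + eps
        sum_e = sum(r.real for r in resp)
        sum_o = sum(r.imag for r in resp)
        energy = np.sqrt(sum_e ** 2 + sum_o ** 2).clip(1e-12)
        phi_e, phi_o = sum_e / energy, sum_o / energy    # mean phase direction
        # sum_n A_no * ΔΦ_no written in Kovesi's energy form
        delta = sum((r.real * phi_e + r.imag * phi_o)
                    - np.abs(r.real * phi_o - r.imag * phi_e) for r in resp)
        pc[o] = np.maximum(delta - noise_t, 0.0) / sum_a
    return pc
```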
Performing image matching with the combined features of the phase consistency energy maps in multiple directions fully exploits the geometric structure of the image in different directions, while the feature dimension remains low and the computation efficient.
Step 4, constructing the feature description vector of each feature point. Each feature point extracted from the reference image is taken in turn as the point to be described and processed as follows:
taking the point to be described (x_0, y_0) as the center, points are selected at intervals of D in the row and column directions within a range whose radius is a preset value Rad (see FIG. 2); s points are obtained in total (including the point to be described itself), and this group of points is recorded as [(x_0, y_0), (x_1, y_1), …, (x_{s-1}, y_{s-1})].
taking each point of the group as a center, an accumulation domain of size t × t is defined; the directional phase consistency vectors of all pixels within each t × t range are accumulated, giving s accumulated vectors in total; finally the s vectors are concatenated into a one-dimensional vector V_f, which serves as the feature description vector of the point (x_0, y_0) and whose length is s × the number of directions.
In specific implementation, the values of Rad, D and t can be preset. Too large a radius reduces matching efficiency, while too small a radius cannot guarantee matching precision; Rad is set according to the image type and generally lies between 30 and 60 pixels. The suggested values of D and t are 2 and 3, respectively.
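A sketch of this descriptor construction is shown below, assuming the PC maps produced by the sketch in step 3 and the suggested values Rad = 30, D = 2, t = 3; border handling is omitted, so the point is assumed to lie at least Rad + t pixels from the image edge, and the square-grid-within-a-circle sampling is one plausible reading of the point selection described above.

```python
import numpy as np

def describe_point(pc_maps, x0, y0, rad=30, d=2, t=3):
    """Concatenate accumulated PC vectors sampled every d pixels within radius rad
    of (x0, y0); each sample accumulates its t x t neighbourhood (V_f of the text)."""
    pc = np.stack([pc_maps[o] for o in sorted(pc_maps)], axis=-1)  # H x W x n_orient
    half = t // 2
    parts = []
    for dy in range(-rad, rad + 1, d):
        for dx in range(-rad, rad + 1, d):
            if dx * dx + dy * dy > rad * rad:
                continue                                   # keep samples inside the radius
            y, x = y0 + dy, x0 + dx
            patch = pc[y - half:y + half + 1, x - half:x + half + 1, :]
            parts.append(patch.sum(axis=(0, 1)))           # accumulate V_p over t x t
    return np.concatenate(parts)                           # length s * n_orient
```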
Representing the feature points by dense combined vectors of directional phase consistency energy makes full use of the geometric structure information within the matching region; it can largely overcome the feature changes and slight geometric distortions caused by the imaging differences of heterogeneous remote sensing images, and can greatly improve matching accuracy.
Step 5, image matching: the homonymous point of each feature point is determined by sliding-window search and the correlation coefficients of the feature description vectors, and matching point pairs are extracted. The coordinates of each feature point on the pre-corrected target image are determined directly from the longitude-latitude relationship of the two pre-corrected images, and an approximate search radius is then determined from the initial positioning accuracy of the images; in specific implementation the user can set the search radius, which in principle should be as small as possible while still ensuring that the possible homonymous point falls within the search range it defines. The normalized cross-correlation coefficient between the feature description vector of the feature point and the feature description vector of each point within the search range on the target image is calculated, and the point with the maximum correlation coefficient is taken as the homonymous point; let X_0 denote the row-column coordinates on the target image of the homonymous point obtained by matching a certain feature point on the reference image. The feature description vectors of the points within the search range on the target image are constructed in the same way as in step 4, each such point being taken as the point to be described, which is not repeated here.
Let the feature description vector of a certain feature point on the reference image be V_f, and let the feature description vector of a point within the corresponding search range on the target image be V_f'. The normalized cross-correlation coefficient between the two feature description vectors V_f and V_f' is determined by the following equation:
\rho(V_f, V_f') = \frac{s_{No}\,V_f V_f'^{\,T} - \big(V_f V_e^{T}\big)\big(V_f' V_e^{T}\big)}{\sqrt{s_{No}\,V_f V_f^{T} - \big(V_f V_e^{T}\big)^2}\;\sqrt{s_{No}\,V_f' V_f'^{\,T} - \big(V_f' V_e^{T}\big)^2}}
where s_{No} is the length of the feature description vector V_f, and V_e is a row vector of the same length as the feature description vector with all values equal to 1. The normalized cross-correlation coefficient ranges over [-1, 1]; a larger value indicates a higher similarity between the two feature description vectors.
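A sketch of the sliding-window search and the correlation computation follows; the ncc function is the usual zero-mean (Pearson) form of the normalized cross-correlation, consistent with the definitions of s_No and V_e above, and describe_point refers to the descriptor sketch in step 4. The search radius value is an assumption for illustration.

```python
import numpy as np

def ncc(v1, v2):
    """Zero-mean normalized cross-correlation between two descriptor vectors."""
    v1 = v1 - v1.mean()
    v2 = v2 - v2.mean()
    denom = np.linalg.norm(v1) * np.linalg.norm(v2)
    return float(v1 @ v2 / denom) if denom > 0 else -1.0

def match_point(pc_ref, pc_tgt, x0, y0, search_rad=10, **desc_kw):
    """Slide over the search window on the target image and keep the NCC peak."""
    v_f = describe_point(pc_ref, x0, y0, **desc_kw)
    best_score, best_xy = -2.0, None
    for dy in range(-search_rad, search_rad + 1):
        for dx in range(-search_rad, search_rad + 1):
            v_f2 = describe_point(pc_tgt, x0 + dx, y0 + dy, **desc_kw)
            score = ncc(v_f, v_f2)
            if score > best_score:
                best_score, best_xy = score, (x0 + dx, y0 + dy)
    return best_xy, best_score        # (x, y) of the homonymous point and its NCC score
```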
To reach sub-pixel positioning accuracy, the invention further performs peak fitting on the correlation-coefficient matching result, using a Taylor series expansion as the fitting function. A second-order Taylor expansion is performed at X_0 (higher-order terms are omitted):
f(X) \approx f(X_0) + f'(X_0)^{T}\,(X - X_0) + \tfrac{1}{2}\,(X - X_0)^{T} f''(X_0)\,(X - X_0)
where X_0 is the coordinate of the interpolation center, i.e., the row-column coordinates on the target image of the homonymous point obtained by matching, and f(X) is the similarity measure function, i.e., the normalized cross-correlation coefficient function. The derivative at the X corresponding to the maximum point should be 0 (X is a column vector of length 2 consisting of the image coordinates), so let

\Delta X = X - X_0

Then, taking the derivative of

f(X_0 + \Delta X) = f(X_0) + f'(X_0)^{T}\,\Delta X + \tfrac{1}{2}\,\Delta X^{T} f''(X_0)\,\Delta X

with respect to \Delta X and setting the derivative equal to 0 yields:

f'(X_0) + f''(X_0)\,\Delta X = 0

hence

\Delta X = -f''(X_0)^{-1} f'(X_0)

which is the offset of the peak point relative to the interpolation center X_0. The finally obtained matching point coordinates are:

X' = X_0 + \Delta X = X_0 - f''(X_0)^{-1} f'(X_0)
thus, X' is the matching point coordinates at the sub-pixel level on the target image. And determining a matching point pair according to the matching point coordinates, and entering the step 6 for further processing.
Step 6, removing gross error points. The embodiment rejects mismatched points by a least squares adjustment method.
For each feature point on the reference image obtained in step 2, the corresponding matching point on the target image has been obtained through steps 4 and 5, and an affine transformation model can be constructed between the two images from the matching point pairs, as shown in the following formula:
x'=ax+by+c
y'=dx+ey+f
where (x, y) and (x', y') are the coordinates of a matching point pair on the reference image and the target image, respectively, and a, b, c, d, e, f are the model coefficients. The matching points whose residuals exceed the gross error threshold are then removed by a least squares adjustment (the gross error threshold can be preset in specific implementation and is generally set to 3 pixels), and the remaining matching points are finally regarded as correct matching points for subsequent adjustment processing.
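A sketch of this gross-error removal is shown below: the affine model is refitted by least squares and matches whose residuals exceed the threshold (3 pixels, following the text) are dropped, iterating until the set of kept matches stabilizes; this iterative re-fitting loop is one common way to realize the least squares adjustment described above, not necessarily the patent's exact procedure.

```python
import numpy as np

def fit_affine(src, dst):
    """Least-squares fit of x' = ax + by + c, y' = dx + ey + f.
    src, dst: N x 2 arrays of matched (x, y) on the reference / target image."""
    a_mat = np.hstack([src, np.ones((len(src), 1))])          # columns [x, y, 1]
    coef_x, *_ = np.linalg.lstsq(a_mat, dst[:, 0], rcond=None)
    coef_y, *_ = np.linalg.lstsq(a_mat, dst[:, 1], rcond=None)
    return coef_x, coef_y

def remove_gross_errors(src, dst, thresh=3.0, max_iter=10):
    """Iteratively refit the affine model and drop matches with residual > thresh."""
    keep = np.ones(len(src), dtype=bool)
    for _ in range(max_iter):
        coef_x, coef_y = fit_affine(src[keep], dst[keep])
        a_mat = np.hstack([src, np.ones((len(src), 1))])
        res = np.hypot(a_mat @ coef_x - dst[:, 0], a_mat @ coef_y - dst[:, 1])
        new_keep = keep & (res <= thresh)                     # only ever remove points
        if new_keep.sum() == keep.sum() or new_keep.sum() < 3:
            break
        keep = new_keep
    return keep                                               # mask of correct matches
```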
In specific implementation, automatic operation of the above processes can be realized using computer software.
The specific embodiments described herein merely illustrate the spirit of the invention. Those skilled in the art may make various modifications or additions to the described embodiments, or substitute them in similar ways, without departing from the spirit of the invention or exceeding the scope defined in the appended claims.

Claims (6)

1. A multisource remote sensing image feature matching method based on directional phase consistency, characterized in that high-precision matching of multisource remote sensing images is realized based on directional phase consistency, comprising the following steps:
step 1, performing image orthorectification on a reference image and a target image;
step 2, performing block partitioning and feature point extraction only on the reference image;
step 3, calculating directional phase consistency energy maps of the reference image and the target image in multiple directions, respectively, to obtain the directional phase consistency vector of each pixel; when calculating the directional phase consistency, the phase difference function adopted at scale n in direction o is expressed as follows,
\Delta\Phi_{no}(x, y) = \cos\!\big(\phi_{no}(x, y) - \bar{\phi}_o(x, y)\big) - \big|\sin\!\big(\phi_{no}(x, y) - \bar{\phi}_o(x, y)\big)\big|

where φ_{no}(x, y) is the phase value at image coordinates (x, y) at scale n in direction o, and \bar{\phi}_o(x, y) is the weighted average phase value over all scales in direction o at image coordinates (x, y);
step 4, constructing the feature description vectors of the feature points based on a dense combination of the directional phase consistency vectors;
step 5, image matching, including determining the homonymous point of each feature point through sliding-window search and the correlation coefficient of the feature description vectors, and extracting matching point pairs; this is implemented as follows:
directly determining the coordinates of each feature point on the pre-corrected target image according to the longitude-latitude relationship of the two pre-corrected images, and then determining a search radius according to the initial positioning accuracy of the images; calculating the normalized cross-correlation coefficient between the feature description vector of the feature point and the feature description vector of each point within the search range on the target image, taking the point with the maximum correlation coefficient as the homonymous point, and letting X_0 denote the row-column coordinates on the target image of the homonymous point obtained by matching a certain feature point on the reference image;
after the homonymous point of each feature point is determined, peak fitting is performed to obtain the sub-pixel matching point coordinates on the target image, wherein a Taylor series expansion is used as the fitting function and a second-order Taylor expansion is performed at X_0:

f(X) \approx f(X_0) + f'(X_0)^{T}\,(X - X_0) + \tfrac{1}{2}\,(X - X_0)^{T} f''(X_0)\,(X - X_0)
wherein X_0 is the coordinate of the interpolation center, i.e., the row-column coordinates on the target image of the homonymous point obtained by matching, and f(X) is the similarity measure function; the derivative at the X corresponding to the maximum point should be 0, X being a column vector of length 2 consisting of the image coordinates, so let

\Delta X = X - X_0

then, taking the derivative of

f(X_0 + \Delta X) = f(X_0) + f'(X_0)^{T}\,\Delta X + \tfrac{1}{2}\,\Delta X^{T} f''(X_0)\,\Delta X

and setting the derivative equal to 0:

f'(X_0) + f''(X_0)\,\Delta X = 0

hence

\Delta X = -f''(X_0)^{-1} f'(X_0)

which is the offset of the peak point with respect to the interpolation center X_0; the finally obtained matching point coordinates are:

X' = X_0 + \Delta X = X_0 - f''(X_0)^{-1} f'(X_0)
wherein X' is the sub-pixel matching point coordinate on the target image; matching point pairs are determined according to the matching point coordinates;
step 6, performing gross error elimination to remove mismatched points.
2. The multisource remote sensing image feature matching method based on directional phase consistency as claimed in claim 1, characterized in that: step 2 is implemented by first dividing the reference image into grid regions of equal size, then performing non-maximum suppression within each grid region, retaining several points whose response values exceed a preset threshold and are local maxima as feature corners, and taking these feature corners as the feature points to be matched.
3. The multi-source remote sensing image feature matching method based on directional phase consistency as claimed in claim 1, characterized in that: in step 3, the directional phase consistency vector of each pixel point is a one-dimensional vector composed of phase consistency energy in each direction.
4. The multi-source remote sensing image feature matching method based on directional phase consistency as claimed in claim 1, characterized in that: the implementation of constructing the feature description vector is as follows,
taking the point to be described (x_0, y_0) as the center, selecting points at intervals of D in the row and column directions within a preset radius, obtaining s points in total, and recording this group of points as [(x_0, y_0), (x_1, y_1), …, (x_{s-1}, y_{s-1})];
taking each point of the group as a center, determining an accumulation domain of size t × t, accumulating the directional phase consistency vectors of all pixels within each accumulation range to obtain s accumulated vectors in total, and finally concatenating the s vectors into a one-dimensional vector V_f as the feature description vector of the point (x_0, y_0).
5. The multisource remote sensing image feature matching method based on directional phase consistency as claimed in claim 1, 2, 3 or 4, characterized in that: in step 5, after the homonymous point of each feature point is determined, peak fitting is performed to obtain the sub-pixel matching point coordinates on the target image.
6. The multisource remote sensing image feature matching method based on directional phase consistency as claimed in claim 1, 2, 3 or 4, characterized in that: in step 6, the mismatched points are eliminated by a least squares adjustment method.
CN201811375054.0A 2018-11-19 2018-11-19 Multisource remote sensing image feature matching method based on direction phase consistency Active CN109523585B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811375054.0A CN109523585B (en) 2018-11-19 2018-11-19 Multisource remote sensing image feature matching method based on direction phase consistency

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811375054.0A CN109523585B (en) 2018-11-19 2018-11-19 Multisource remote sensing image feature matching method based on direction phase consistency

Publications (2)

Publication Number Publication Date
CN109523585A CN109523585A (en) 2019-03-26
CN109523585B true CN109523585B (en) 2021-10-22

Family

ID=65777948

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811375054.0A Active CN109523585B (en) 2018-11-19 2018-11-19 Multisource remote sensing image feature matching method based on direction phase consistency

Country Status (1)

Country Link
CN (1) CN109523585B (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110163896B (en) * 2019-03-29 2023-02-03 西南电子技术研究所(中国电子科技集团公司第十研究所) SAR image matching method
CN111967365B (en) * 2020-08-11 2023-09-15 中国人民解放军国防科技大学 Image connection point extraction method and device
CN112669297B (en) * 2020-12-31 2022-05-27 中国科学院长春光学精密机械与物理研究所 Target detection method
CN113096163B (en) * 2021-04-16 2022-09-27 湘潭大学 Satellite-borne SAR image high-precision matching method without priori lifting rail information
CN113409369A (en) * 2021-05-25 2021-09-17 西安电子科技大学 Multi-mode remote sensing image registration method based on improved RIFT
CN114863145B (en) * 2022-04-21 2023-03-17 武汉大学 Sparse light connection point matching method for luminous images
CN117237680B (en) * 2023-08-18 2024-03-12 暨南大学 Heterogeneous model fitting-based multi-source image matching method and system

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102842137A (en) * 2012-08-14 2012-12-26 中山大学 Automatic registration method for multi-temporal empty spectrum remote sensing image based on space comprehensive mutual information
CN103218787A (en) * 2013-04-23 2013-07-24 国家测绘地理信息局卫星测绘应用中心 Multi-source heterogeneous remote-sensing image control point automatic collecting method
CN103679714A (en) * 2013-12-04 2014-03-26 中国资源卫星应用中心 Method for automatic registration of optical image and SAR image based on gradient cross-correlation
CN104021556A (en) * 2014-06-13 2014-09-03 西南交通大学 Heterological remote-sensing image registration method based on geometric structure similarity
CN104732532A (en) * 2015-03-11 2015-06-24 中国空间技术研究院 Remote sensing satellite multispectral image registration method
CN105261014A (en) * 2015-09-30 2016-01-20 西南交通大学 Multi-sensor remote sensing image matching method
CN105631872A (en) * 2015-12-28 2016-06-01 西安电子科技大学 Remote sensing image registration method based on multiple feature points

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8994784B2 (en) * 2010-12-24 2015-03-31 Lockheed Martin Corporation Wide field image distortion correction
CN103077527A (en) * 2013-02-05 2013-05-01 湖北工业大学 Robust multi-source satellite remote sensing image registration method
CN103914847B (en) * 2014-04-10 2017-03-29 西安电子科技大学 Based on phase equalization and the SAR image registration method of SIFT
CN107909620A (en) * 2017-10-24 2018-04-13 天津大学 Discontinuous fringe projected phase method of deploying based on orientation consistency

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102842137A (en) * 2012-08-14 2012-12-26 中山大学 Automatic registration method for multi-temporal empty spectrum remote sensing image based on space comprehensive mutual information
CN103218787A (en) * 2013-04-23 2013-07-24 国家测绘地理信息局卫星测绘应用中心 Multi-source heterogeneous remote-sensing image control point automatic collecting method
CN103679714A (en) * 2013-12-04 2014-03-26 中国资源卫星应用中心 Method for automatic registration of optical image and SAR image based on gradient cross-correlation
CN104021556A (en) * 2014-06-13 2014-09-03 西南交通大学 Heterological remote-sensing image registration method based on geometric structure similarity
CN104732532A (en) * 2015-03-11 2015-06-24 中国空间技术研究院 Remote sensing satellite multispectral image registration method
CN105261014A (en) * 2015-09-30 2016-01-20 西南交通大学 Multi-sensor remote sensing image matching method
CN105631872A (en) * 2015-12-28 2016-06-01 西安电子科技大学 Remote sensing image registration method based on multiple feature points

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
A Multi-sensor Remote Sensing Image Matching Method Based on SIFT Operator and CRA Similarity Measure;WU Yingdan等;《2011 International Conference on Intelligence Science and Information Engineering》;20110830;115-118 *
Multi-Modal Remote Sensing Image Registration Based on Multi-Scale Phase Congruency;Song Cui等;《2018 10th IAPR Workshop on Pattern Recognition in Remote Sensing (PRRS)》;20181011;1-5 *
Automatic registration of multispectral remote sensing images using local self-similarity; 叶沅鑫 et al.; Acta Geodaetica et Cartographica Sinica (测绘学报); 2014-03-15; Vol. 43, No. 3; 268-275 *
Automatic matching of multi-source remote sensing images based on local phase feature description; 叶沅鑫 et al.; Geomatics and Information Science of Wuhan University (武汉大学学报(信息科学版)); 2017-09-05; Vol. 42, No. 9; 1278-1284 *

Also Published As

Publication number Publication date
CN109523585A (en) 2019-03-26

Similar Documents

Publication Publication Date Title
CN109523585B (en) Multisource remote sensing image feature matching method based on direction phase consistency
KR101404640B1 (en) Method and system for image registration
WO2019042232A1 (en) Fast and robust multimodal remote sensing image matching method and system
CN112017224B (en) SAR data area network adjustment processing method and system
CN104574347A (en) On-orbit satellite image geometric positioning accuracy evaluation method on basis of multi-source remote sensing data
KR100986809B1 (en) The Method of Automatic Geometric Correction for Multi-resolution Satellite Images using Scale Invariant Feature Transform
CN110569861B (en) Image matching positioning method based on point feature and contour feature fusion
CN108364279B (en) Method for determining pointing deviation of stationary orbit remote sensing satellite
CN109376641B (en) Moving vehicle detection method based on unmanned aerial vehicle aerial video
CN111044037B (en) Geometric positioning method and device for optical satellite image
CN110136177B (en) Image registration method, device and storage medium
CN108830808B (en) On-satellite infrared image stripe noise removing method based on similar line window mean value compensation
CN108564532B (en) Large-scale ground distance satellite-borne SAR image mosaic method
CN112419380B (en) Cloud mask-based high-precision registration method for stationary orbit satellite sequence images
CN111583119B (en) Orthoimage splicing method and equipment and computer readable medium
CN112907493A (en) Multi-source battlefield image rapid mosaic fusion algorithm under unmanned aerial vehicle swarm cooperative reconnaissance
CN108562900B (en) SAR image geometric registration method based on elevation correction
CN110738693B (en) Multi-angle image registration method for ground-based imaging radar
Dalmiya et al. A survey of registration techniques in remote sensing images
Huang et al. SAR and optical images registration using shape context
CN117058008A (en) Remote sensing image geometry and radiation integrated correction method, device, equipment and medium
CN115690473A (en) SAR image feature matching method based on inertial navigation error correction
CN113592744B (en) Geometric fine correction method suitable for high-resolution remote sensing image
CN114565653B (en) Heterologous remote sensing image matching method with rotation change and scale difference
CN115330619A (en) Local geometric fine correction method suitable for high-resolution remote sensing image

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant