CN113160288A - SAR image registration method based on feature points - Google Patents

SAR image registration method based on feature points

Info

Publication number
CN113160288A
Authority
CN
China
Prior art keywords
image
value
point
offset
main image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110303757.8A
Other languages
Chinese (zh)
Inventor
刘智勇
祁宏昌
刘泽楷
张滔
来立永
黄海生
袁俊健
冉倩
雷超平
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangzhou Power Supply Bureau of Guangdong Power Grid Co Ltd
Original Assignee
Guangzhou Power Supply Bureau of Guangdong Power Grid Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangzhou Power Supply Bureau of Guangdong Power Grid Co Ltd
Priority to CN202110303757.8A
Publication of CN113160288A
Legal status: Pending (Current)

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/30 Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T7/33 Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88 Radar or analogous systems specially adapted for specific applications
    • G01S13/89 Radar or analogous systems specially adapted for specific applications for mapping or imaging
    • G01S13/90 Radar or analogous systems specially adapted for specific applications for mapping or imaging using synthetic aperture techniques, e.g. synthetic aperture radar [SAR] techniques
    • G01S13/9021 SAR image post-processing techniques
    • G01S13/9023 SAR image post-processing techniques combined with interferometric techniques
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/40 Extraction of image or video features
    • G06V10/44 Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10032 Satellite or aerial image; Remote sensing
    • G06T2207/10044 Radar image

Landscapes

  • Engineering & Computer Science (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Electromagnetism (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Image Processing (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

The invention provides a SAR image registration method based on feature points. The gray values of the SAR images are preprocessed and feature points are detected with a SURF operator; the SAR images comprise a main image and an auxiliary image. The correlation coefficient between the main image and the auxiliary image and the offsets in the azimuth and range directions are calculated at each point, and the point at which the correlation coefficient reaches its maximum is taken as the best matching point. A first offset value is obtained by adding the azimuth and range offsets to the coordinates of the best matching point. Geometric transformation parameters between the main image and the auxiliary image are then computed from the coordinates of the best matching point and the first offset value. A second offset value is computed from the coordinates of the best matching point in the main image and the transformation parameters; when the second offset value minus the first offset value is larger than a preset threshold, the best matching point exceeding the threshold is deleted. The calculation is iterated on the coordinates of the remaining best matching points and the transformation parameters, and when no best matching point exceeds the threshold the final transformation parameters are obtained and the registration is complete.

Description

SAR image registration method based on feature points
Technical Field
The invention relates to the technical field of SAR image processing, and in particular to a SAR image registration method based on feature points.
Background
Synthetic aperture radar (SAR) image registration is a key step in interferometric SAR (InSAR) data processing and a precondition for generating a good interferogram, but accurate registration is difficult because of various factors. The conventional registration method selects control points, i.e. regular points (RP), on the main intensity map according to a fixed spacing rule. Because the selection of regular points does not take the scattering characteristics of the ground into account, it works reasonably well in areas of high coherence; in practice, however, regular points often fall in low-coherence or incoherent areas (such as large water bodies, densely vegetated forests, or agricultural areas with seasonal variation), so that the registration suffers from a large amount of computation, a high mismatching rate and low reliability.
Disclosure of Invention
The invention aims to overcome the defects in the prior art and provide a SAR image registration method based on feature points.
One embodiment of the present invention provides a method for registering an SAR image based on feature points, including:
preprocessing the gray values of the SAR images, and detecting feature points with a SURF operator after the gray-value preprocessing; the SAR images comprise a main image and an auxiliary image;
selecting a reference window in the main image and a search window in the auxiliary image, and calculating the correlation coefficient of the reference window at each point of the search window and the offsets in the azimuth and range directions from the gray value of each point of the main image, the gray value of each point of the auxiliary image, the average gray value of the main image and the average gray value of the auxiliary image; determining the point at which the correlation coefficient reaches its maximum as the best matching point;
obtaining a first offset value by adding the azimuth and range offsets to the coordinates of the best matching point in the main image;
calculating geometric transformation parameters between the main image and the auxiliary image by a least-squares polynomial fit using the coordinates of the best matching point in the main image and the first offset value;
substituting the coordinates of the best matching point in the main image and the transformation parameters into the least-squares polynomial to obtain a second offset value, and deleting any best matching point for which the second offset value minus the first offset value is larger than a preset threshold;
and repeating the iterative calculation, re-fitting the least-squares polynomial on the coordinates of the best matching points that do not exceed the threshold in the main image to update the transformation parameters, and obtaining the final transformation parameters to complete the registration when no best matching point exceeds the threshold.
Compared with the prior art, the SAR image registration method based on feature points first performs gray-value processing on the main image and the auxiliary image of the SAR data and extracts feature points from the processed images with the SURF operator. A reference window is then selected in the main image and a search window in the auxiliary image, and the correlation coefficient of the reference window at each point of the search window, together with the offsets in the azimuth and range directions, is calculated from the gray values of the main and auxiliary images to obtain the best matching points. A least-squares polynomial fit on the coordinates of the best matching points and the first offset values gives the transformation parameters between the main and auxiliary images, and the transformation parameters and the coordinates of each best matching point are then substituted into the least-squares polynomial to obtain a second offset value. The first and second offset values are compared, and if their difference is larger than a preset threshold the best matching points exceeding the threshold are deleted; the calculation is then iterated, and when no best matching point exceeds the threshold the final transformation parameters are obtained and the registration is complete. Because the SAR images are registered with feature points, the correlation coefficients obtained are much higher than those of regular points under the same conditions, both at the peak and over the whole distribution, the robustness is stronger, the distribution of the offsets is more concentrated around 0, and the offset estimates are more stable and reliable.
In one embodiment, the calculating of the correlation coefficient of the reference window at each point of the search window further comprises:
calculating a signal-to-noise ratio from the correlation coefficient and the offsets in the azimuth and range directions;
after the point at which the correlation coefficient reaches its maximum is determined as the best matching point, the method further comprises:
eliminating the best matching points whose signal-to-noise ratio is smaller than a first preset value or whose correlation coefficient is smaller than a second preset value.
In an embodiment, after the gray values of the SAR images are preprocessed and the feature points are detected with the SURF operator, the method further comprises: converting the coordinate system of water-body data extracted in advance from a database into the radar coordinate system used by the main image and the auxiliary image, and removing the feature points in the main image and the auxiliary image that fall on water.
In one embodiment, the gray-value preprocessing comprises linearly stretching to 0-255 the pixels of the SAR main intensity map whose gray values lie between the 0.5th and the 99.5th percentile, and setting the pixels whose gray values lie above the 99.5th percentile and below the 0.5th percentile to 255 and 0, respectively.
In one embodiment, the water-body data is a binary data file; the binary values comprise 0 and -1, 0 representing land and -1 representing a water body; the water-body data is given in a geographic coordinate system.
In one embodiment, the main image is any image selected from the SAR images, and the auxiliary image is an image in the SAR images other than the main image.
In one embodiment, the correlation coefficient of the reference window at each point of the search window and the offsets in the azimuth and range directions are calculated by:
[Formula image in the original publication (BDA0002987300800000021): correlation coefficient ρ]
where M(i, j) denotes the gray value of each point of the main image, S(i, j) denotes the gray value of each point of the auxiliary image, i and j denote the abscissa and ordinate of the point, a and b denote the length and width of the reference window, E(M(i, j)) denotes the average gray value of the main image, and E(S(i, j)) denotes the average gray value of the auxiliary image.
In one embodiment, the signal-to-noise ratio is calculated by:
[Formula image in the original publication (BDA0002987300800000031): signal-to-noise ratio]
where ρ denotes the correlation coefficient of the reference window at each point of the search window, A and B are the length and width of the search window, and a and b are the length and width of the reference window, with a ≤ A and b ≤ B.
In one embodiment, the least squares polynomial is calculated by:
[Formula image in the original publication (BDA0002987300800000032): least-squares polynomial]
where x1 and y1 are the abscissa and ordinate of the main image, x2 and y2 are the abscissa and ordinate of the auxiliary image, and m1, m2, m3, m4, m5 and m6 are the transformation parameters; the transformation parameters comprise rotation parameters and translation parameters, m1, m2, m3 and m4 being the rotation parameters and m5 and m6 the translation parameters.
In one embodiment, the first preset value is 5 and the second preset value is 0.45.
In order that the invention may be more clearly understood, specific embodiments thereof will be described hereinafter with reference to the accompanying drawings.
Drawings
Fig. 1 is a flowchart of an SAR image registration method based on feature points according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Please refer to fig. 1, which is a flowchart illustrating an exemplary embodiment of a method for registering SAR images based on feature points, comprising:
step 1: preprocessing the gray value of the SAR image, and detecting the characteristic points by using a SURF operator after preprocessing the gray value of the SAR image; the SAR image map comprises a main image and an auxiliary image.
A synthetic aperture radar (SAR) image is an image acquired by a synthetic aperture radar; the synthetic aperture radar can be mounted on flight platforms such as aircraft, satellites and spacecraft, and observes target objects around the clock and in all weather conditions.
In one embodiment, the main image is any image selected from the SAR images, and the auxiliary image is an image in the SAR images other than the main image.
Step 2: selecting a reference window in the main image and a search window in the auxiliary image, and calculating the correlation coefficient of the reference window at each point of the search window and the offsets in the azimuth and range directions from the gray value of each point of the main image, the gray value of each point of the auxiliary image, the average gray value of the main image and the average gray value of the auxiliary image; and determining the point at which the correlation coefficient reaches its maximum as the best matching point.
Step 3: obtaining a first offset value by adding the azimuth and range offsets to the coordinates of the best matching point in the main image.
Step 4: calculating the geometric transformation parameters between the main image and the auxiliary image by a least-squares polynomial fit using the coordinates of the best matching point in the main image and the first offset value.
Step 5: substituting the coordinates of the best matching point in the main image and the transformation parameters into the least-squares polynomial to obtain a second offset value, and deleting any best matching point for which the second offset value minus the first offset value is larger than a preset threshold.
Step 6: repeating the iterative calculation, re-fitting the least-squares polynomial on the coordinates of the best matching points that do not exceed the threshold in the main image to update the transformation parameters, and obtaining the final transformation parameters to complete the registration when no best matching point exceeds the threshold.
Specifically, in this embodiment, gray-value preprocessing is performed on the SAR images, and after the preprocessing the SURF operator is used to extract the positions of the feature points in the images; here SAR denotes the synthetic aperture radar image.
A reference window is defined centered on a point of the main image, and a search window is defined in the auxiliary image; within the search window, a region of the same size as the reference window is defined centered on the homologous point, a homologous point being the image point formed by the same ground point in different images. For a homologous point whose coordinates are (x1, y1) in the main image and (x2, y2) in the auxiliary image, the offset is (Δx, Δy) = (x2, y2) - (x1, y1). The region is moved pixel by pixel within the search window so that it corresponds to the reference window, the cross-correlation is computed, and the correlation coefficient and the offset are recorded; in total (A - a + 1) × (B - b + 1) computations are needed, where A and B are the length and width of the search window and a and b are the length and width of the reference window, a and b generally being set to powers of 2 such as 64 and 128, with A ≥ a and B ≥ b. All correlation coefficients are then examined: |ρ| = 1 indicates complete correlation and |ρ| = 0 indicates complete decorrelation; the maximum of ρ is the extreme value, its position is the best matching position, its offset is recorded, and sub-pixel registration is performed.
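As an illustration of this search procedure only (not the patent's implementation; the function names correlation_surface and best_match, the use of NumPy and the use of the standard normalized cross-correlation are assumptions made for this sketch), the following Python code slides an a × b reference window over an A × B search window, computes a correlation coefficient at every admissible position, and returns the peak and its offset:

    import numpy as np

    def correlation_surface(ref, search):
        """Slide the reference window over the search window and return the
        normalized cross-correlation coefficient at every admissible position."""
        a, b = ref.shape            # reference window size (a x b)
        A, B = search.shape         # search window size (A x B), A >= a, B >= b
        ref_z = ref - ref.mean()
        out = np.zeros((A - a + 1, B - b + 1))
        for p in range(A - a + 1):
            for q in range(B - b + 1):
                patch = search[p:p + a, q:q + b]
                patch_z = patch - patch.mean()
                denom = np.sqrt((ref_z ** 2).sum() * (patch_z ** 2).sum())
                out[p, q] = (ref_z * patch_z).sum() / denom if denom > 0 else 0.0
        return out

    def best_match(ref, search):
        """Return the peak correlation coefficient, its (row, column) position in
        the search window, and the full correlation surface."""
        rho = correlation_surface(ref, search)
        p, q = np.unravel_index(np.argmax(rho), rho.shape)
        return rho[p, q], (p, q), rho

The (A - a + 1) × (B - b + 1) evaluations mirror the count given above; sub-pixel refinement of the peak position is not shown.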
After the offsets of all homologous points have been calculated, gross errors can first be removed by setting a threshold. The geometric transformation parameters between the main image and the auxiliary image are then computed by least squares: after the cross-correlation calculation, (x1, y1) is a known quantity and (x2, y2) can be approximated as (x1, y1) plus the corresponding offset, so a least-squares polynomial fit yields the transformation parameters m. Substituting (x1, y1) and the transformation parameters m into the formula below gives a predicted (x2, y2), the second offset value; if the second offset value minus the first offset value is larger than the threshold, the point is deleted and the transformation parameters m are recomputed from the remaining points, i.e. the polynomial fit is repeated with the remaining points, iterating until all remaining points satisfy the condition. For an auxiliary image there is only one set of transformation parameters, namely the rotation parameters (m1, m2, m3, m4) and the translation parameters (m5, m6).
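A minimal sketch of this iterative estimation is given below, assuming the first-order (affine) model x2 = m1·x1 + m2·y1 + m5 and y2 = m3·x1 + m4·y1 + m6 implied by the rotation parameters (m1-m4) and translation parameters (m5, m6); the function names, the NumPy-based solver and the default threshold of 1 pixel are illustrative assumptions, not values fixed by the patent:

    import numpy as np

    def fit_affine(src, dst):
        """Least-squares fit of x2 = m1*x1 + m2*y1 + m5 and y2 = m3*x1 + m4*y1 + m6.
        src, dst: (N, 2) arrays of (x, y) coordinates in the main / auxiliary image."""
        A = np.hstack([src, np.ones((src.shape[0], 1))])       # columns [x1, y1, 1]
        mx = np.linalg.lstsq(A, dst[:, 0], rcond=None)[0]      # m1, m2, m5
        my = np.linalg.lstsq(A, dst[:, 1], rcond=None)[0]      # m3, m4, m6
        return mx, my

    def fit_transform_iteratively(src, dst, threshold=1.0):
        """Refit while dropping points whose predicted position (second offset value)
        differs from the measured one (first offset value) by more than the threshold."""
        keep = np.ones(len(src), dtype=bool)
        while True:
            mx, my = fit_affine(src[keep], dst[keep])
            pred = np.column_stack([src @ mx[:2] + mx[2], src @ my[:2] + my[2]])
            err = np.linalg.norm(pred - dst, axis=1)
            if not (keep & (err > threshold)).any():
                return mx, my, keep                            # final parameters and inliers
            keep &= err <= threshold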
In one embodiment, the calculating of the correlation coefficient of the reference window at each point of the search window further comprises: calculating a signal-to-noise ratio from the correlation coefficient and the offsets in the azimuth and range directions;
after the point at which the correlation coefficient reaches its maximum is determined as the best matching point, the method further comprises: eliminating the best matching points whose signal-to-noise ratio is smaller than the first preset value or whose correlation coefficient is smaller than the second preset value.
In this embodiment, the correlation coefficients or signal-to-noise ratios of some feature points are very low; if the signal-to-noise ratio is smaller than the first preset value or the correlation coefficient is smaller than the second preset value, the point can be considered unreliable and its offset is generally an outlier, so such feature points can be rejected.
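A simple filter of this kind might look as follows (a sketch only; the thresholds 5 and 0.45 are those given in the embodiment described further below, and the function and array names are assumed):

    import numpy as np

    def reject_unreliable(points, snr, rho, snr_min=5.0, rho_min=0.45):
        """Keep only the matching points whose signal-to-noise ratio and
        correlation coefficient both reach the preset values."""
        keep = (snr >= snr_min) & (rho >= rho_min)
        return points[keep]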
In an embodiment, after the gray values of the SAR images are preprocessed and the feature points are detected with the SURF operator, the method further comprises: converting the coordinate system of the water-body data extracted in advance from a database into the radar coordinate system used by the main image and the auxiliary image, and removing the feature points in the main image and the auxiliary image that fall on water.
In this embodiment the water-body data can be acquired from a global-system scientific data centre. Over large water areas the cross-correlation of the ground data is extremely weak and the offsets are extremely unstable, which greatly reduces the efficiency of the registration and leads to erroneous offset estimates, so that the transformation model finally computed between the main and auxiliary images is unreliable; the feature points detected in the main intensity map that fall on water are therefore removed using the water-body data.
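As a sketch of this masking step (an illustration under assumptions: the water-body data is presumed to have already been resampled onto the radar coordinate grid of the main image, and the helper name remove_water_points is hypothetical):

    import numpy as np

    def remove_water_points(points, water_mask):
        """Drop feature points that fall on water.
        points:     (N, 2) integer array of (row, column) positions in radar geometry.
        water_mask: 2-D array in the same geometry, 0 for land and -1 for water."""
        rows, cols = points[:, 0], points[:, 1]
        return points[water_mask[rows, cols] == 0]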
In one embodiment, the gray-value preprocessing comprises linearly stretching to 0-255 the pixels of the SAR main intensity map whose gray values lie between the 0.5th and the 99.5th percentile, and setting the pixels whose gray values lie above the 99.5th percentile and below the 0.5th percentile to 255 and 0, respectively.
Because the span of gray values of a SAR image is generally large, the whole image appears dark, which is unfavourable for feature-point extraction. Therefore, before the feature points are detected with SURF, the gray value of each pixel is first processed so that the image as a whole is neither too bright nor too dim, which facilitates the subsequent extraction of feature points with the SURF operator.
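The percentile stretch described above could be sketched as follows (an assumed implementation; the helper name stretch_0_255 is hypothetical):

    import numpy as np

    def stretch_0_255(intensity):
        """Linearly stretch gray values between the 0.5th and 99.5th percentiles to
        0-255 and clip values outside that range to 0 and 255 respectively."""
        lo, hi = np.percentile(intensity, [0.5, 99.5])
        out = (intensity.astype(np.float64) - lo) / (hi - lo) * 255.0
        return np.clip(out, 0, 255).astype(np.uint8)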
In one embodiment, the water-body data is a binary data file; the binary values comprise 0 and -1, 0 representing land and -1 representing a water body; the water-body data is given in a geographic coordinate system.
The water-body data is a file representing water areas; in this embodiment land is therefore represented by 0 and water by -1 in the binary water-body data, and the feature points falling on water are removed by removing the points marked -1.
In one embodiment, the correlation coefficient of the reference window at each point of the search window and the offsets in the azimuth and range directions are calculated by:
[Formula image in the original publication (BDA0002987300800000061): correlation coefficient ρ]
where M(i, j) denotes the gray value of each point of the main image, S(i, j) denotes the gray value of each point of the auxiliary image, i and j denote the abscissa and ordinate of the point, a and b denote the length and width of the reference window, E(M(i, j)) denotes the average gray value of the main image, and E(S(i, j)) denotes the average gray value of the auxiliary image.
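The formula itself appears only as an image in this publication; given the quantities defined above it is presumably the standard normalized cross-correlation, which in LaTeX form would read (a reconstruction consistent with the description, not the verbatim formula):

    \rho = \frac{\sum_{i=1}^{a}\sum_{j=1}^{b}\bigl[M(i,j)-E(M(i,j))\bigr]\,\bigl[S(i,j)-E(S(i,j))\bigr]}
                {\sqrt{\sum_{i=1}^{a}\sum_{j=1}^{b}\bigl[M(i,j)-E(M(i,j))\bigr]^{2}\;\sum_{i=1}^{a}\sum_{j=1}^{b}\bigl[S(i,j)-E(S(i,j))\bigr]^{2}}}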
Keeping the reference window fixed, the region is moved through the search window row by row and column by column; each time the region moves, the correlation coefficient ρ between the reference window and the region is computed in the manner above, and the correlation coefficient, the signal-to-noise ratio and the offsets in the azimuth and range directions are recorded. The whole search window is traversed in this way, and finally the position of the extreme value of the correlation coefficient is taken as the best matching point.
In one embodiment, the signal-to-noise ratio is calculated by:
[Formula image in the original publication (BDA0002987300800000062): signal-to-noise ratio]
where ρ denotes the correlation coefficient of the reference window at each point of the search window, A and B are the length and width of the search window, and a and b are the length and width of the reference window, with a ≤ A and b ≤ B.
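The SNR formula is likewise reproduced only as an image. One definition commonly used for cross-correlation peaks, and consistent with the variables above, compares the correlation peak with the mean correlation level over the remaining N - 1 of the N = (A - a + 1)(B - b + 1) search positions; it is offered here only as a plausible reading, not as the patent's exact formula:

    \mathrm{SNR} = \frac{\rho_{\max}}{\dfrac{1}{N-1}\left(\sum_{p=1}^{A-a+1}\sum_{q=1}^{B-b+1}\rho(p,q)-\rho_{\max}\right)},
    \qquad N = (A-a+1)(B-b+1)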
In this embodiment, feature points whose signal-to-noise ratio is lower than the first preset value are unreliable; if such feature points take part in the polynomial fitting they affect the calculation of the transformation parameters and produce registration errors.
In one embodiment, the least squares polynomial is calculated by:
[Formula image in the original publication (BDA0002987300800000063): least-squares polynomial]
where x1 and y1 are the abscissa and ordinate of the main image, x2 and y2 are the abscissa and ordinate of the auxiliary image, and m1, m2, m3, m4, m5 and m6 are the transformation parameters; the transformation parameters comprise rotation parameters and translation parameters, m1, m2, m3 and m4 being the rotation parameters and m5 and m6 the translation parameters.
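Given the rotation parameters (m1-m4) and translation parameters (m5, m6), the least-squares polynomial is presumably the first-order (affine) transformation below; this is a reconstruction consistent with the description in this specification, not the verbatim formula of the original image:

    x_2 = m_1 x_1 + m_2 y_1 + m_5
    y_2 = m_3 x_1 + m_4 y_1 + m_6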
In one embodiment, the first preset value is 5 and the second preset value is 0.45.
In this embodiment, points whose signal-to-noise ratio is smaller than 5 or whose correlation coefficient is smaller than 0.45 are eliminated; such feature points would otherwise affect the calculation of the transformation parameters when taking part in the polynomial fitting and produce registration errors.
Although embodiments of the present invention have been shown and described, it will be appreciated by those skilled in the art that changes, modifications, substitutions and alterations can be made in these embodiments without departing from the principles and spirit of the invention, the scope of which is defined in the appended claims and their equivalents.

Claims (10)

1. A SAR image registration method based on feature points is characterized by comprising the following steps:
preprocessing the gray values of the SAR images, and detecting feature points with a SURF operator after the gray-value preprocessing; the SAR images comprise a main image and an auxiliary image;
selecting a reference window in the main image and a search window in the auxiliary image, and calculating the correlation coefficient of the reference window at each point of the search window and the offsets in the azimuth and range directions from the gray value of each point of the main image, the gray value of each point of the auxiliary image, the average gray value of the main image and the average gray value of the auxiliary image; determining the point at which the correlation coefficient reaches its maximum as the best matching point;
obtaining a first offset value by adding the azimuth and range offsets to the coordinates of the best matching point in the main image;
calculating geometric transformation parameters between the main image and the auxiliary image by a least-squares polynomial fit using the coordinates of the best matching point in the main image and the first offset value;
substituting the coordinates of the best matching point in the main image and the transformation parameters into the least-squares polynomial to obtain a second offset value, and deleting any best matching point for which the second offset value minus the first offset value is larger than a preset threshold;
and repeating the iterative calculation, re-fitting the least-squares polynomial on the coordinates of the best matching points that do not exceed the threshold in the main image to update the transformation parameters, and obtaining the final transformation parameters to complete the registration when no best matching point exceeds the threshold.
2. The method as claimed in claim 1, wherein the calculating of the correlation coefficient of the reference window at each point of the search window further comprises:
calculating a signal-to-noise ratio from the correlation coefficient and the offsets in the azimuth and range directions;
after the point at which the correlation coefficient reaches its maximum is determined as the best matching point, the method further comprises:
eliminating the best matching points whose signal-to-noise ratio is smaller than a first preset value or whose correlation coefficient is smaller than a second preset value.
3. The method of claim 1, wherein after the gray values of the SAR images are preprocessed and the feature points are detected with the SURF operator, the method further comprises: converting the coordinate system of water-body data extracted in advance from a database into the radar coordinate system used by the main image and the auxiliary image, and removing the feature points in the main image and the auxiliary image that fall on water.
4. The method as claimed in claim 1, wherein the gray-value preprocessing comprises linearly stretching to 0-255 the pixels of the SAR main intensity map whose gray values lie between the 0.5th and the 99.5th percentile, and setting the pixels whose gray values lie above the 99.5th percentile and below the 0.5th percentile to 255 and 0, respectively.
5. The SAR image registration method based on feature points as claimed in claim 3, wherein the water-body data is a binary data file; the binary values comprise 0 and -1, 0 representing land and -1 representing a water body; and the water-body data is given in a geographic coordinate system.
6. The SAR image registration method based on feature points as claimed in claim 1, wherein the main image is any image selected from the SAR images, and the auxiliary image is an image in the SAR images other than the main image.
7. The method according to claim 1, wherein the correlation coefficient of the reference window at each point of the search window and the offsets in the azimuth and range directions are calculated in the following manner:
[Formula image in the original publication (FDA0002987300790000021): correlation coefficient ρ]
where M(i, j) denotes the gray value of each point of the main image, S(i, j) denotes the gray value of each point of the auxiliary image, i and j denote the abscissa and ordinate of the point, a and b denote the length and width of the reference window, E(M(i, j)) denotes the average gray value of the main image, and E(S(i, j)) denotes the average gray value of the auxiliary image.
8. The SAR image registration method based on feature points as claimed in claim 1, wherein the signal-to-noise ratio is calculated as follows:
[Formula image in the original publication (FDA0002987300790000022): signal-to-noise ratio]
where ρ denotes the correlation coefficient of the reference window at each point of the search window, A and B are the length and width of the search window, and a and b are the length and width of the reference window, with a ≤ A and b ≤ B.
9. The SAR image registration method based on feature points as claimed in claim 1, wherein the least-squares polynomial is calculated as follows:
[Formula image in the original publication (FDA0002987300790000023): least-squares polynomial]
where x1 and y1 are the abscissa and ordinate of the main image, x2 and y2 are the abscissa and ordinate of the auxiliary image, and m1, m2, m3, m4, m5 and m6 are the transformation parameters; the transformation parameters comprise rotation parameters and translation parameters, m1, m2, m3 and m4 being the rotation parameters and m5 and m6 the translation parameters.
10. The SAR image registration method based on feature points as claimed in claim 2, wherein the first preset value is 5 and the second preset value is 0.45.
CN202110303757.8A 2021-03-22 2021-03-22 SAR image registration method based on feature points Pending CN113160288A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110303757.8A CN113160288A (en) 2021-03-22 2021-03-22 SAR image registration method based on feature points

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110303757.8A CN113160288A (en) 2021-03-22 2021-03-22 SAR image registration method based on feature points

Publications (1)

Publication Number Publication Date
CN113160288A true CN113160288A (en) 2021-07-23

Family

ID=76887879

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110303757.8A Pending CN113160288A (en) 2021-03-22 2021-03-22 SAR image registration method based on feature points

Country Status (1)

Country Link
CN (1) CN113160288A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115712118A (en) * 2022-11-07 2023-02-24 江苏省水利科学研究院 Pixel offset tracking monitoring and correcting method

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050078881A1 (en) * 2003-09-22 2005-04-14 Chenyang Xu Method and system for hybrid rigid registration based on joint correspondences between scale-invariant salient region features
CN107909606A (en) * 2017-12-25 2018-04-13 南京市测绘勘察研究院股份有限公司 A kind of SAR image registration communication center elimination of rough difference method
CN109035312A (en) * 2018-07-17 2018-12-18 中国人民解放军国防科技大学 DEM (digital elevation model) -assisted SAR (synthetic aperture radar) image high-precision registration method
CN112068136A (en) * 2020-09-14 2020-12-11 广东省核工业地质局测绘院 Azimuth deformation monitoring method based on amplitude offset
CN112305510A (en) * 2020-09-22 2021-02-02 江苏师范大学 DEM matching-based synthetic aperture radar image geometric calibration method

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050078881A1 (en) * 2003-09-22 2005-04-14 Chenyang Xu Method and system for hybrid rigid registration based on joint correspondences between scale-invariant salient region features
CN107909606A (en) * 2017-12-25 2018-04-13 南京市测绘勘察研究院股份有限公司 A kind of SAR image registration communication center elimination of rough difference method
CN109035312A (en) * 2018-07-17 2018-12-18 中国人民解放军国防科技大学 DEM (digital elevation model) -assisted SAR (synthetic aperture radar) image high-precision registration method
CN112068136A (en) * 2020-09-14 2020-12-11 广东省核工业地质局测绘院 Azimuth deformation monitoring method based on amplitude offset
CN112305510A (en) * 2020-09-22 2021-02-02 江苏师范大学 DEM matching-based synthetic aperture radar image geometric calibration method

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
彭林才: "基于特征点的SAR影像偏移跟踪方法研究", 《城市勘测》 *
彭林才: "特征点的偏移跟踪算法及其应用", 《万方数据库》 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115712118A (en) * 2022-11-07 2023-02-24 江苏省水利科学研究院 Pixel offset tracking monitoring and correcting method
CN115712118B (en) * 2022-11-07 2023-08-11 江苏省水利科学研究院 Pixel offset tracking monitoring and correcting method

Similar Documents

Publication Publication Date Title
Paul et al. The glaciers climate change initiative: Methods for creating glacier area, elevation change and velocity products
Eisenbeiss et al. Potential of IKONOS and QUICKBIRD imagery for accurate 3D-Point positioning, orthoimage and DSM generation
KR102127405B1 (en) Method and appartus for estimating stream flow discharge using satellite images at streams
CN107610164B (en) High-resolution four-number image registration method based on multi-feature mixing
CN108428220B (en) Automatic geometric correction method for ocean island reef area of remote sensing image of geostationary orbit satellite sequence
CN109579872B (en) Star equivalent estimation method for star sensor instrument
CN109523585B (en) Multisource remote sensing image feature matching method based on direction phase consistency
CN112307901B (en) SAR and optical image fusion method and system for landslide detection
CN111856459B (en) Improved DEM maximum likelihood constraint multi-baseline InSAR phase unwrapping method
CN112013822A (en) Multispectral remote sensing water depth inversion method based on improved GWR model
CN108171647B (en) Landsat7 strip image restoration method considering surface deformation
CN108919319B (en) Method and system for positioning island reef satellite image without ground control point
CN112419380B (en) Cloud mask-based high-precision registration method for stationary orbit satellite sequence images
CN110673109A (en) Full waveform data decomposition method for satellite-borne large-light-spot laser radar
CN114022783A (en) Satellite image-based water and soil conservation ecological function remote sensing monitoring method and device
CN113610729B (en) Method, system and storage medium for correcting hyperspectral remote sensing image satellite-ground cooperative atmosphere
Paillou et al. Relief reconstruction from SAR stereo pairs: the" optimal gradient" matching method
CN111144350B (en) Remote sensing image positioning accuracy evaluation method based on reference base map
CN113065467A (en) Satellite image low-coherence region identification method and device based on deep learning
CN114627087A (en) Method and system for automatically detecting ground object change of multi-temporal satellite remote sensing image
CN110738693B (en) Multi-angle image registration method for ground-based imaging radar
CN113160288A (en) SAR image registration method based on feature points
CN106204596B (en) Panchromatic waveband remote sensing image cloud detection method based on Gaussian fitting function and fuzzy mixed estimation
CN114565653B (en) Heterologous remote sensing image matching method with rotation change and scale difference
CN117058008A (en) Remote sensing image geometry and radiation integrated correction method, device, equipment and medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20210723)