CN115540908A - InSAR interference fringe matching method based on wavelet transformation


Info

Publication number: CN115540908A
Application number: CN202211239103.4A
Authority: CN (China)
Legal status: Pending
Prior art keywords: interference, matching, wavelet, image, fringe
Other languages: Chinese (zh)
Inventors: 汪丙南, 李岚玉, 丁满来, 向茂生
Current assignee: Aerospace Information Research Institute of CAS
Original assignee: Aerospace Information Research Institute of CAS
Application filed by: Aerospace Information Research Institute of CAS
Priority / filing date: 2022-10-11
Publication date: 2022-12-30

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C25/00 - Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass
    • G01C25/005 - Initial alignment, calibration or starting-up of inertial devices
    • G01S - RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 - Systems using the reflection or reradiation of radio waves, e.g. radar systems; analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88 - Radar or analogous systems specially adapted for specific applications
    • G01S13/89 - Radar or analogous systems specially adapted for specific applications for mapping or imaging
    • G01S13/90 - Radar or analogous systems specially adapted for specific applications for mapping or imaging using synthetic aperture techniques, e.g. synthetic aperture radar [SAR] techniques
    • G01S13/9021 - SAR image post-processing techniques
    • G01S13/9023 - SAR image post-processing techniques combined with interferometric techniques

Landscapes

  • Engineering & Computer Science (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Manufacturing & Machinery (AREA)
  • Electromagnetism (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses an InSAR interference fringe matching method based on wavelet transformation, an interference fringe matching method for use in InSAR matching navigation: the InSAR fringes acquired by an aircraft in real time are matched against InSAR fringes inverted from a digital elevation model stored on board, yielding the relevant state parameters such as the range-direction and azimuth-direction positioning offsets. The systematic error of the INS is corrected by interferometric synthetic aperture radar image fringe matching, achieving autonomous, long-duration, accurate positioning and navigation.

Description

InSAR interference fringe matching method based on wavelet transformation
Technical Field
The invention relates to the technical field of navigation, in particular to an InSAR interference fringe matching method based on wavelet transformation.
Background
An Inertial Navigation System (INS) is an autonomous navigation system that does not depend on the external environment and is widely used in military and civil fields. However, INS drift errors accumulate over time. To eliminate the positioning error of the INS and achieve long-duration autonomous navigation, navigation systems generally use other navigation modes to assist the INS; the autonomous integrated navigation modes currently applied in practice are mainly terrain-matching navigation and Synthetic Aperture Radar (SAR) scene-matching navigation.
In terrain-matching navigation, a radar altimeter measures the terrain profile height along the flight path while the aircraft is flying; the measured real-time height data are correlated with the reference terrain profile height data pre-stored on board, and the geographic position of the aircraft is determined from the best match. The accurate geographic position output by terrain matching is then fused with the INS to eliminate the INS error and achieve accurate navigation.
SAR scene-matching navigation exploits the all-day, all-weather observation capability and the two-dimensional high resolution of SAR, which can provide images comparable to those of optical cameras even in poor-visibility environments. The image acquired by the SAR in real time is matched against the map data of the corresponding mapping zone in the digital map database stored in the on-board computer to obtain the accurate geographic position at the current moment, from which the positioning deviation of the INS is computed. The positioning deviation is used as an observation in a Kalman filter that outputs the estimated INS error, and the INS is corrected with this estimate, so accurate geographic position information is maintained over a long time. The same information can also be used for SAR motion compensation and for computing the imaged-area positioning parameters, thereby determining the state of the aircraft and improving navigation accuracy.
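The fusion step described above can be illustrated with a minimal Kalman measurement update. This sketch is not taken from the patent: the two-element error state, the observation model and the noise covariances are illustrative assumptions only.

```python
import numpy as np

def kalman_update(x, P, z, H, R):
    """One Kalman measurement update: returns the corrected state and covariance."""
    y = z - H @ x                        # innovation: matching offset minus predicted error
    S = H @ P @ H.T + R                  # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)       # Kalman gain
    x_new = x + K @ y
    P_new = (np.eye(len(x)) - K @ H) @ P
    return x_new, P_new

# Assumed two-element state: INS position error in range and azimuth (metres).
x = np.zeros(2)                          # prior error estimate
P = np.eye(2) * 100.0                    # prior covariance (assumed)
H = np.eye(2)                            # the matching offsets observe the error directly
R = np.eye(2) * 25.0                     # matching measurement noise (assumed)
z = np.array([12.0, -8.0])               # offsets delivered by scene or fringe matching (example values)

x, P = kalman_update(x, P, z, H, R)      # corrected INS error estimate, fed back to the INS
```

The same update applies when the observation comes from InSAR fringe matching rather than SAR scene matching.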
Existing terrain-matching navigation mainly performs a correlation search of the terrain profile measured by the radar altimeter against a reference terrain map stored on board. Mismatches are easily produced in flat areas, so regions with pronounced terrain-slope features must be selected for matching; moreover, terrain matching is one-dimensional and has no two-dimensional resolution capability, so its matching accuracy is limited.
SAR scene matching offers all-day, all-weather operation and two-dimensional high resolution and has been applied successfully in the military field, but problems remain: SAR image matching is easily affected by seasonal changes of ground features, so typical, stable surfaces must be selected to guarantee matching stability. In addition, a single-channel SAR cannot sense attitude changes of the aircraft, so effective motion compensation cannot be performed, which degrades imaging accuracy and reduces matching accuracy.
Disclosure of Invention
To solve the above technical problems, the invention provides an Interferometric Synthetic Aperture Radar (InSAR) interference fringe matching method based on wavelet transformation, which corrects the systematic error of the INS through InSAR image fringe matching and achieves autonomous, long-duration, accurate positioning and navigation.
To achieve this purpose, the invention adopts the following technical scheme:
an InSAR interference fringe matching method based on wavelet transformation specifically comprises the following steps:
step 1, obtaining interference fringes, which specifically comprises the following steps:
(1) First, the single-look complex images s_1 and s_2 are registered according to the actual InSAR echo data: the range and azimuth offsets between s_1 and s_2 are computed and the secondary image is resampled onto the primary image;
(2) Interference processing is applied to the two registered images: one image is multiplied by the conjugate of the other, i.e. s_1 · s_2^*, and the phase is extracted;
(3) The flat-earth phase is removed to obtain the interference fringes to be matched;
step 2, interference fringe feature separation, which specifically comprises the following steps:
(1) Decomposing and reconstructing the interference fringes;
(2) Filtering the interference fringe texture;
(3) Extracting fringe profile-line features;
(4) Rapidly extracting interference fringe features of shadow regions;
(5) Rapidly extracting interference fringe features of layover regions;
step 3, multi-scale correlation matching, which specifically comprises the following steps:
setting a standard interference fringe image or a characteristic I (x, y) under a certain scale, taking actually obtained interference fringes and extracted characteristics as a template picture J (x, y), and overlapping the template picture J (x, y) on the standard interference fringe image or the characteristic I (x, y) under the certain scale to search a translation sliding window; the similarity measurement between the reference interference fringes and the actual interference fringes is realized by adopting a matching operator of normalized cross correlation; the normalized correlation coefficient is:
ρ(I, J) = (1/N) Σ_{x,y} [I(x, y) - μ_I] [J(x, y) - μ_J] / (σ_I σ_J)
where μ_I and μ_J are the means of the two images, σ_I and σ_J are their standard deviations, N is the number of pixels in the images, and the normalized cross-correlation takes values in the range [0, 1].
Further, step (1) of step 2 includes:
The interference fringes at scale j-1 are decomposed by the two-dimensional Mallat wavelet decomposition algorithm into the following components:
C_j = H_az H_rg C_{j-1}
D_j^1 = G_az H_rg C_{j-1}
D_j^2 = H_az G_rg C_{j-1}
D_j^3 = G_az G_rg C_{j-1}
where H_az is the azimuth low-pass decomposition filter, H_rg is the range low-pass decomposition filter, G_az is the azimuth high-pass decomposition filter, and G_rg is the range high-pass decomposition filter; C_j, D_j^1, D_j^2 and D_j^3 are, respectively, the low-frequency component, the vertical high-frequency component, the horizontal high-frequency component and the diagonal high-frequency component of the fringe image C_{j-1};
the interference fringes at the original resolution are reconstructed with the Mallat wavelet reconstruction algorithm:
C_{j-1} = H_az^* H_rg^* C_j + G_az^* H_rg^* D_j^1 + H_az^* G_rg^* D_j^2 + G_az^* G_rg^* D_j^3
in which a weighting coefficient is applied to each of the four j-th-layer components; when the weighting coefficients all equal 1, the original interference fringe image is completely recovered. H_az^*, H_rg^*, G_az^* and G_rg^* are the conjugates of H_az, H_rg, G_az and G_rg, respectively.
Further, step (2) of step 2 specifically includes:
The distance between the coherence coefficients of the central pixel and a neighborhood pixel is defined as:
D = |γ_i - γ_j|
where γ_i and γ_j are the coherence coefficients of the central pixel i and the neighbouring pixel j, respectively;
based on the relationship between the coherence-coefficient distance and the neighborhood energy value, the improved neighborhood energy takes the following form:
[improved neighborhood energy V(x_i, x_j): a function of the coherence-coefficient distance D, parameterized by α and β]
where x_i denotes the segmentation label of a pixel and x_j the segmentation label of its neighbour; α controls the shape of the curve, and β is a parameter greater than zero that adjusts the weights of the likelihood energy and the neighborhood energy.
Further, step (3) of step 2 specifically includes:
Mallat wavelet decomposition and reconstruction are applied while retaining the vertical high-frequency component D_j^1 and the horizontal high-frequency component D_j^2; the line edge structure can be completely reconstructed from these two orthogonal components, and the low-frequency component is retained for extracting the equiphase lines and the maximum-gradient profile lines. The line-feature wavelet reconstruction equation is expressed as:
C_{j-1} = H_az^* H_rg^* C_j + G_az^* H_rg^* D_j^1 + H_az^* G_rg^* D_j^2  (the diagonal component D_j^3 is discarded)
After wavelet decomposition and reconstruction, the high-frequency noise and high-frequency edges in the interference phase map are removed, yielding a smoothed interference phase map; then, according to the definitions of the equiphase line and the maximum-gradient profile line, the equiphase lines and maximum-gradient profile lines are extracted from both the actual interference fringes and the reference simulated interference fringes and feature-matched to obtain the matching result.
Further, step (4) of step 2 specifically includes:
Within the wavelet multi-scale analysis framework, the Mallat wavelet decomposition and reconstruction should retain, as far as possible, the diagonal high-frequency component D_j^3; shadow regions are detected rapidly using this diagonal component, and the shadow-region wavelet reconstruction equation is:
C_{j-1} = G_az^* G_rg^* D_j^3  (only the diagonal component is retained)
After wavelet-domain decomposition and reconstruction, threshold segmentation is further performed on the image grey levels in combination with the intensity information, and morphological erosion and dilation are used to remove interference-phase noise and small-area shadow detections, giving the detection result for the larger shadow regions; the shadow-region edges are then extracted with an edge detection algorithm, and the shadow boundary information is matched against the shadow edge vectors stored with the reference image to obtain the matching result for the shadow regions.
Further, step (5) of step 2 specifically includes:
Threshold segmentation is performed on the image grey levels in combination with the intensity information, and morphological erosion and dilation are used to remove layover false-alarm regions; since the interference fringes of a layover region retain certain texture features, they are matched directly, as area features, against the layover region of the reference simulated interference fringes to obtain the layover matching result; at the same time, the layover boundary and the geographic positions of the pixels in the reference library map to each other, so the layover boundary is also used as a matching result for the layover region.
Further, step 3 specifically includes:
step (1), performing j-level wavelet decomposition on the actual interference fringes and the reference simulated interference fringes, respectively, to obtain the interference phase maps at each decomposition level;
step (2), for the typical features of the interference phase maps (texture features, line features, shadow interferogram features and layover interferogram features), processing the interference fringe images within the wavelet multi-scale decomposition framework according to the wavelet decomposition and reconstruction method of step (1) of step 2, and extracting the interference-phase area features, equiphase-line features, maximum-gradient profile-line features, shadow line and area features, and layover area features from the interference fringes according to the fringe feature definitions;
step (3), at level j, coarse-matching the features extracted from the two images at the bottom of the scale space with a cross-correlation similarity-measure template matching method to obtain the optimal matching area at that scale;
step (4), at level j-1, within the optimal matching area of the previous step, performing a further higher-resolution cross-correlation matching computation at scale j-1 and screening out mismatched features;
step (5), repeating steps (3) and (4) until the recursion reaches the top of the scale space, obtaining the final matching result.
Beneficial effects:
(1) InSAR fringe matching differs markedly from conventional SAR scene matching in image characteristics and in the motion parameters that can be observed. SAR scene matching is easily affected by changes of ground features, and a single-channel SAR cannot sense the attitude of the aircraft, whereas the InSAR interference fringes after flat-earth removal are sensitive to the long-term stable terrain-slope features, so InSAR has a stronger capability for position and attitude inversion.
(2) Existing scene-matching navigation uses only the matched point position to correct the inertial navigation error. Compared with SAR image matching, InSAR fringe matching not only obtains position information through fringe matching but can also invert the platform attitude parameters with high accuracy. The InSAR interferometric phase contains roll-angle information, the Doppler frequency contains yaw-angle and pitch-angle attitude information, and the point-target phase history contains relative motion information. The motion parameters include the absolute three-dimensional position, the radar slant-range error, the time-varying baseline error and the Doppler frequency error; fusing this information with the INS allows the platform attitude to be inverted with high accuracy and improves the positioning and navigation accuracy.
Drawings
FIG. 1 is a flow chart of an InSAR interference fringe matching method based on wavelet transform according to the present invention;
FIG. 2 is a schematic diagram of fringe texture information;
FIG. 3 is an interference fringe equiphase line;
FIG. 4 is a section line of maximum gradient of interference fringes;
FIG. 5 is a schematic diagram of shadow region effects and interference fringe characteristics;
FIG. 6 is a schematic diagram of layover-region imaging and its interference fringe characteristics.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is further described in detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention. In addition, the technical features involved in the embodiments of the present invention described below may be combined with each other as long as they do not conflict with each other.
The invention discloses an InSAR interference fringe matching method based on wavelet transformation, an interference fringe matching method used in InSAR matching navigation: the InSAR fringes acquired by the aircraft in real time are matched against InSAR fringes inverted from the digital elevation model stored on board, yielding the relevant state parameters such as the range-direction and azimuth-direction positioning offsets.
As shown in fig. 1, the method for matching InSAR interference fringes based on wavelet transform specifically includes the following steps:
step 1, obtaining interference fringes, which specifically comprises the following steps:
(1) First, the single-look complex images s_1 and s_2 are registered according to the actual InSAR echo data: the range and azimuth offsets between s_1 and s_2 are computed and the secondary image is resampled onto the primary image;
(2) Interference processing is applied to the two registered images: one image is multiplied by the conjugate of the other, i.e. s_1 · s_2^*, and the phase is extracted;
(3) The flat-earth phase is removed to obtain the interference fringes to be matched (a minimal sketch of this step is given below).
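A minimal sketch of step 1 using NumPy follows. The registration itself and the flat-earth phase model are assumed to be available; the array and function names are illustrative, not taken from the patent.

```python
import numpy as np

def form_fringes(s1, s2, flat_earth_phase):
    """Form the interference fringes to be matched from two co-registered SLC images.

    s1, s2           : complex 2-D arrays (the secondary already resampled onto the primary grid)
    flat_earth_phase : real 2-D array of the flat-earth phase simulated from the imaging geometry
    """
    interferogram = s1 * np.conj(s2)                 # interference processing: s_1 . s_2^*
    phase = np.angle(interferogram)                  # extract the interferometric phase
    # remove the flat-earth phase and re-wrap to (-pi, pi]
    fringes = np.angle(np.exp(1j * (phase - flat_earth_phase)))
    return fringes
```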
Step 2, interference fringe characteristic separation, which specifically comprises the following steps:
(1) Decomposition and reconstruction of interference fringes:
The interference fringes at scale j-1 are decomposed by the two-dimensional Mallat wavelet decomposition algorithm into the following components:
C_j = H_az H_rg C_{j-1}
D_j^1 = G_az H_rg C_{j-1}
D_j^2 = H_az G_rg C_{j-1}
D_j^3 = G_az G_rg C_{j-1}
where H_az is the azimuth low-pass decomposition filter, H_rg is the range low-pass decomposition filter, G_az is the azimuth high-pass decomposition filter, and G_rg is the range high-pass decomposition filter; D_j^1, D_j^2 and D_j^3 are, respectively, the vertical, horizontal and diagonal high-frequency components of the fringe image C_{j-1}. The interference fringes at the original resolution can be reconstructed with the Mallat wavelet reconstruction algorithm:
C_{j-1} = H_az^* H_rg^* C_j + G_az^* H_rg^* D_j^1 + H_az^* G_rg^* D_j^2 + G_az^* G_rg^* D_j^3
in which a weighting coefficient is applied to each of the four j-th-layer components; when the weighting coefficients all equal 1, the original interference fringe image is completely recovered. H_az^*, H_rg^*, G_az^* and G_rg^* are the conjugates of H_az, H_rg, G_az and G_rg, respectively.
The multi-resolution property of the wavelet allows a signal to be decomposed at different scales, separating a mixture of interwoven frequencies into sub-signals in different frequency bands. Denoising by wavelet decomposition and reconstruction proceeds as follows: the noisy signal is decomposed into different frequency bands at a certain scale, the band in which the noise lies is set to zero, and the wavelet reconstruction is performed, thereby separating the different features.
The slowly varying terrain information in the interferometric phase remains in the low-frequency component C_j, while the edge information of the fringe wrapping lines is concentrated mainly in the horizontal and vertical high-frequency components D_j^1 and D_j^2; shadow areas of the fringe image behave as high-frequency noise, present mainly in the diagonal high-frequency component D_j^3. By controlling the weighting coefficients, the low-frequency interferometric phase, the line edge features, the high-frequency random noise of shadow areas, the layover edge features and so on can be separated from the original interference fringes.
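A minimal sketch of this selective decomposition and weighted reconstruction, using PyWavelets' single-level 2-D DWT as a stand-in for the Mallat filter bank. The wavelet choice ('db2') and the weight tuples are assumptions, and the mapping between PyWavelets' horizontal/vertical details and D_j^1/D_j^2 depends on the image orientation convention.

```python
import pywt

def weighted_reconstruction(fringes, weights=(1.0, 1.0, 1.0, 1.0), wavelet="db2"):
    """One-level decomposition of a fringe image and weighted reconstruction.

    weights = (approximation, horizontal detail, vertical detail, diagonal detail).
    Setting a weight to 0 removes that component; all ones recovers the input image.
    """
    C, (DH, DV, DD) = pywt.dwt2(fringes, wavelet)          # low-frequency + three detail components
    wC, wH, wV, wD = weights
    return pywt.idwt2((wC * C, (wH * DH, wV * DV, wD * DD)), wavelet)

# Feature separations used later in the method (illustrative weight choices):
# smoothed  = weighted_reconstruction(f, (1, 1, 1, 0))   # drop the diagonal detail (line features, smoothing)
# diag_only = weighted_reconstruction(f, (0, 0, 0, 1))   # keep only the diagonal detail (shadow detection)
```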
(2) Interference fringe texture filtering; the specific method is as follows:
In areas of large topographic relief, such as high mountains and intersecting ridges, 2π phase wrapping still occurs in the interference fringes after flat-earth removal, and the fringes contain a large number of fringe texture features, as shown in fig. 2. These texture features are in fact the periodic wrapping of the interferometric phase, with the phase within each fringe varying between 0 and 2π.
In mountainous areas with large relief, fringe wrapping is common, and the wrapping features change periodically from mountain top to mountain foot over [0, 2π]. Compared with the reference simulated interference fringes, the actual interference fringe image contains a large amount of ground-object texture information that must be removed. In the invention, under the wavelet-domain decomposition framework, the interferometric phase is denoised first: a Bayesian denoising approach based on a wavelet-domain coherence-Markov model is proposed, which exploits the fact that the true interferometric phase values in the fringes are strongly correlated and that the wavelet coefficients after the wavelet transform are likewise strongly correlated, and filters the fringe noise while preserving the texture information by processing the local interaction between wavelet coefficients.
In the conventional Markov random field model, the neighborhood energy is a fixed value when the labels of the central pixel and a neighborhood pixel are the same and zero when they differ, so all neighbouring pixels influence the central pixel equally and the contextual spatial information cannot be fully exploited. The Markov neighborhood energy model is therefore redefined on the basis of the interferometric coherence coefficient, and a detection method based on a coherence-coefficient-Markov random field model is proposed to make full use of the contextual interferometric information.
The distance between the coherence coefficients of the central pixel and a neighborhood pixel is defined as:
D = |γ_i - γ_j|
where γ_i and γ_j are the coherence coefficients of the central pixel i and the neighbouring pixel j, respectively.
When the segmentation labels of the central pixel and a neighbouring pixel are the same, the closer their coherence coefficients, the more similar they are, so the corresponding neighborhood energy value should be lower and the probability of sharing the same label greater. When the segmentation labels differ, the closer the coherence-coefficient distance, the more similar they are, so the corresponding neighborhood energy value should be higher and the probability of carrying different labels lower. Based on this analysis, the improved neighborhood energy takes the following form:
[improved neighborhood energy V(x_i, x_j): a function of the coherence-coefficient distance D, parameterized by α and β]
where x_i denotes the segmentation label of a pixel and x_j the segmentation label of its neighbour; α controls the shape of the curve, and β is a parameter greater than zero that adjusts the weights of the likelihood energy and the neighborhood energy. In the wavelet domain, a coherence-Markov prior model is constructed to reflect the slowly varying edge characteristics of the interference fringes and serves as the criterion for wavelet-coefficient reconstruction, so that interferometric phase noise is reduced while the large-scale fringe texture is preserved.
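The exact energy expression is given as an image in the original text and is not reproduced here. Purely as an illustration of the behaviour described above (same labels plus close coherence gives low energy, different labels plus close coherence gives high energy), a sketch with an assumed exponential form follows; the functional form and the default α, β are assumptions, not the patent's formula.

```python
import numpy as np

def coherence_distance(gamma_i, gamma_j):
    """D = |gamma_i - gamma_j|: coherence distance between the centre pixel and a neighbour."""
    return abs(gamma_i - gamma_j)

def neighborhood_energy(x_i, x_j, gamma_i, gamma_j, alpha=0.1, beta=1.0):
    """Assumed improved neighborhood energy driven by the coherence distance D."""
    D = coherence_distance(gamma_i, gamma_j)
    if x_i == x_j:
        return beta * (1.0 - np.exp(-D / alpha))   # similar coherence, same label -> low energy
    return beta * np.exp(-D / alpha)               # similar coherence, different label -> high energy
```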
(3) Extraction of fringe profile-line features; the specific method is as follows:
Extracting the interference-fringe texture features is an area-matching approach, but its computational load is large; a simpler line-feature representation is needed to reduce the data volume of the images to be matched and improve matching efficiency. Based on the variation characteristics of the interference fringes, the invention proposes an interference fringe matching method based on two line features: the equiphase line and the maximum-gradient profile line.
Within the wavelet multi-scale analysis framework, line features are edge features. Mallat wavelet decomposition and reconstruction are applied while retaining the vertical high-frequency component D_j^1 and the horizontal high-frequency component D_j^2: the line edge structure can be completely reconstructed from these two orthogonal components, while the low-frequency component is retained for extracting the equiphase lines and the maximum-gradient profile lines. The line-feature wavelet reconstruction equation can be expressed as:
C_{j-1} = H_az^* H_rg^* C_j + G_az^* H_rg^* D_j^1 + H_az^* G_rg^* D_j^2  (the diagonal component D_j^3 is discarded)
the equiphase line is defined as the curve formed by connecting adjacent points on the interference fringe pattern with equal interference phase values. The contour lines represent contour lines in a 2 pi interference phase period, and generally represent closed curves in the fringe pattern, as shown in fig. 3. The equiphase line is also an expression form of the terrain gradient, and the denser the equiphase line is, the larger the ground gradient is; the larger the equal phase line plain space is, the more sparse the arrangement is, which shows that the ground gradient is smaller, and particularly the direction of the equal phase line can be changed when the equal phase line passes through ridges or valleys, which shows the obvious characteristic of the change of the terrain gradient.
The maximum gradient profile line is defined as the curve formed by connecting the points in the interference fringe gradient map in the two directions of azimuth and distance and in the adjacent maximum gradient direction, as shown in FIG. 4. The maximum gradient profile represents an area with the maximum gradient change of the terrain in the two orthogonal directions, is the steepest place of the terrain of the fringe pattern, further simplifies the characteristics of the equiphase line compared with the equiphase line, and further revises the most obvious characteristics of the interference fringe pattern.
After wavelet decomposition and reconstruction processing, high-frequency noise information and high-frequency change edges in the interference phase diagram are removed, and the smoothed interference phase diagram is obtained. And then according to the definition of the equiphase line and the maximum gradient profile, further extracting the equiphase line and the maximum gradient profile line in the actual interference fringe and the reference simulation interference fringe, and performing characteristic matching to obtain a matching result.
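A minimal sketch of extracting the two line features from a smoothed interference phase map; the chosen phase levels, the gradient percentile and the use of scikit-image's contour tracing are assumptions.

```python
import numpy as np
from skimage.measure import find_contours

def equiphase_lines(smoothed_phase, levels=(-np.pi / 2, 0.0, np.pi / 2)):
    """Equiphase lines: curves joining neighbouring pixels of equal interferometric phase."""
    return {level: find_contours(smoothed_phase, level) for level in levels}

def max_gradient_profile(smoothed_phase, percentile=95.0):
    """Crude maximum-gradient profile: pixels whose phase-gradient magnitude is in the top few percent."""
    g_az, g_rg = np.gradient(smoothed_phase)        # gradients along the two image axes
    magnitude = np.hypot(g_az, g_rg)
    return magnitude >= np.percentile(magnitude, percentile)
```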
(4) Rapid extraction of shadow-region interference fringe features; the specific method is as follows:
Because the radar line of sight is blocked, the backscattered echo cannot reach the receiver, so the interferometric phase in a shadow region is pure random noise, as shown in fig. 5. Owing to the InSAR side-looking imaging geometry, the shadow edge characteristics are closely related to the radar incidence angle, and the same mountain observed at different angles presents different shadow characteristics. Shadow-region interference fringe matching is achieved by combining the random-phase characteristic and the edge characteristic of the shadow, which increases the number of fringe matching points. Conventional shadow-detection methods usually determine shadow from the image brightness and the statistical coherence coefficient. In fact, the shadow regions of the interference phase map behave as disordered high-frequency white noise; within the wavelet multi-scale analysis framework, the Mallat wavelet decomposition and reconstruction should retain, as far as possible, the diagonal high-frequency component D_j^3. Shadow regions can be detected rapidly using this diagonal component, and the shadow-region wavelet reconstruction equation is:
C_{j-1} = G_az^* G_rg^* D_j^3  (only the diagonal component is retained)
After wavelet-domain decomposition and reconstruction, threshold segmentation is further performed on the image grey levels in combination with the intensity information, and morphological erosion and dilation are used to remove interference-phase noise and small-area shadow detections, giving the detection result for the larger shadow regions. A shadow region contains random noise, so point-by-point matching is impossible inside it, but the shadow boundary can be mapped to the geographic positions of the pixels in the reference library; the shadow edges are therefore extracted with an edge detection algorithm, and the shadow boundary information is matched against the shadow edge vectors stored with the reference map to obtain the matching result for the shadow region, as shown in fig. 5.
(5) Rapid extraction of layover-region interference fringe features; the specific method is as follows:
Layover occurs when the radar echoes from the top and the foot of a mountain arrive at the radar receiver at the same time and are superimposed in the same pixel of the image, so the interferometric phase jumps and is reverse-biased, and the region appears as a highlighted area in the SAR image, as shown in fig. 6. Because the coherence of a layover region is not low, coherence alone cannot be used to identify layover; a semi-automatic method of image-brightness threshold segmentation with manual intervention is therefore also commonly used.
Since the phase of the layover region in the interference fringe image is reverse-biased, layover regions are detected automatically from the equiphase-line and maximum-gradient profile-line extraction results within the overall wavelet multi-scale analysis framework. A layover region has the largest terrain gradient and intersects the maximum-gradient profile line; at the same time, the equiphase lines jump and break off inside it.
In the actual feature extraction, threshold segmentation is performed on the image grey levels in combination with the intensity information, and morphological erosion and dilation are used to remove layover false-alarm regions. Because the interference fringes of a layover region retain certain texture features, they are matched directly, as area features, against the layover region of the reference simulated interference fringes to obtain the layover matching result. At the same time, the layover boundary can be mapped to the geographic positions of the pixels in the reference library, so the layover boundary is also used as a matching result for the layover region.
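A minimal sketch of the layover pre-processing described above (bright-area threshold followed by morphological clean-up); the percentile threshold and iteration counts are assumptions, and the texture matching itself reuses the normalized cross-correlation of step 3.

```python
import numpy as np
from scipy import ndimage

def detect_layover(intensity, high_percentile=98.0, iterations=2):
    """Candidate layover mask: unusually bright pixels, with small false alarms removed."""
    mask = intensity > np.percentile(intensity, high_percentile)
    mask = ndimage.binary_erosion(mask, iterations=iterations)
    mask = ndimage.binary_dilation(mask, iterations=iterations)
    boundary = mask ^ ndimage.binary_erosion(mask)   # layover boundary used as an additional matching feature
    return mask, boundary
```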
Step 3, multi-scale correlation matching, which specifically comprises the following steps:
in consideration of the real-time performance of navigation application and improvement of matching efficiency, the invention provides a wavelet multi-scale interference fringe correlation matching method. After wavelet decomposition, a target feature vector on a scale space is established, in order to reduce the matching search times, matching is carried out from low resolution, the low resolution features represent the overall features of the texture, high resolution feature matching is further carried out on the basis of the matching, and the low resolution features are utilized to carry out layer-by-layer screening on a search strategy, so that the stripe matching search efficiency is improved.
And (3) setting a reference interference fringe image or a feature I (x, y) under a certain scale, taking the actually acquired interference fringe and the extracted feature as a template picture J (x, y), and overlapping the template picture J (x, y) on the reference interference fringe image or the feature I (x, y) under the certain scale to perform translation sliding window search. Because the interference fringe characteristic form has the characteristics of line, surface, interference phase value and the like, the similarity measurement between the reference interference fringe and the actual interference fringe is realized by adopting the matching operator of normalized cross correlation. Normalized correlation coefficient:
Figure BDA0003884319980000101
wherein, mu I And mu J Are the mean values, σ, of the two images, respectively I And σ J Are respectively the standard deviation of two images, N represents the number of pixel points in the images, and the value range of normalized cross-correlation [0,1 ]]。
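A minimal sketch of the normalized cross-correlation operator and a brute-force translational sliding-window search; the array and function names are illustrative, and no sub-pixel refinement is included.

```python
import numpy as np

def ncc(patch_i, patch_j):
    """Normalized cross-correlation between two equally sized patches."""
    a = patch_i.astype(float) - patch_i.mean()
    b = patch_j.astype(float) - patch_j.mean()
    denominator = a.std() * b.std() * a.size
    return 0.0 if denominator == 0 else float((a * b).sum() / denominator)

def match_template(reference, template):
    """Slide the template over the reference fringes and return the best offset and score."""
    H, W = reference.shape
    h, w = template.shape
    best_score, best_offset = -np.inf, (0, 0)
    for row in range(H - h + 1):
        for col in range(W - w + 1):
            score = ncc(reference[row:row + h, col:col + w], template)
            if score > best_score:
                best_score, best_offset = score, (row, col)
    return best_offset, best_score
```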
Specifically, the wavelet multi-scale interference fringe matching algorithm comprises the following steps:
and (1) respectively carrying out j-layer wavelet decomposition on the actual interference fringes and the reference simulation interference fringes to obtain interference phase images after decomposition at all levels.
And (2) aiming at typical characteristics of texture characteristics, line characteristics, shadow interferograms and overlay interferograms of the interference phase images, processing the interference fringe images according to the wavelet decomposition reconstruction method under a wavelet pair scale decomposition framework, and extracting interference phase surface characteristics, equal phase line characteristics, maximum gradient section line characteristics, shadow line and surface characteristics and overlay surface characteristics in the interference fringes according to fringe characteristic definition.
And (3) on the j layer, performing coarse matching on the extracted features of the two images at the bottom layer of the scale space by adopting a cross-correlation similarity measurement template matching method to obtain the optimal matching area on the scale.
And (4) on the j-1 layer, further performing high-resolution cross-correlation matching calculation on the j-1 scale in the optimal matching area of the previous step, and screening mismatching features.
And (5) repeating the steps (3) and (4) until recursion to the topmost position of the scale space is carried out, and obtaining a final matching result.
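As referenced in step (5), a minimal sketch of the coarse-to-fine loop follows. It uses the wavelet approximation images as the scale pyramid and reuses the hypothetical ncc / match_template helpers sketched earlier; the refinement window size and the wavelet are assumptions.

```python
import numpy as np
import pywt

def wavelet_pyramid(image, levels, wavelet="db2"):
    """Approximation images from full resolution (index 0) up to the coarsest level (index `levels`)."""
    pyramid = [image]
    for _ in range(levels):
        approximation, _ = pywt.dwt2(pyramid[-1], wavelet)
        pyramid.append(approximation)
    return pyramid

def coarse_to_fine_match(reference, template, levels=3, window=4):
    """Coarse match at the deepest level, then refine the offset within a small window at each finer level."""
    ref_pyr = wavelet_pyramid(reference, levels)
    tpl_pyr = wavelet_pyramid(template, levels)
    (row, col), _ = match_template(ref_pyr[levels], tpl_pyr[levels])        # coarse match at level j
    for level in range(levels - 1, -1, -1):                                 # refine on j-1, ..., 0
        row, col = 2 * row, 2 * col                                         # scale the offset up one level
        ref, tpl = ref_pyr[level], tpl_pyr[level]
        h, w = tpl.shape
        best_score, best_offset = -np.inf, (row, col)
        for d_row in range(-window, window + 1):
            for d_col in range(-window, window + 1):
                r, c = row + d_row, col + d_col
                if 0 <= r <= ref.shape[0] - h and 0 <= c <= ref.shape[1] - w:
                    score = ncc(ref[r:r + h, c:c + w], tpl)
                    if score > best_score:
                        best_score, best_offset = score, (r, c)
        row, col = best_offset
    return row, col
```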
It will be understood by those skilled in the art that the foregoing is only an exemplary embodiment of the present invention, and is not intended to limit the invention to the particular forms disclosed, since various modifications, substitutions and improvements within the spirit and scope of the invention are possible and within the scope of the appended claims.

Claims (7)

1. An InSAR interference fringe matching method based on wavelet transformation is characterized by comprising the following steps:
step 1, obtaining interference fringes, which specifically comprises the following steps:
(1) Firstly, registering the single-look complex images s_1 and s_2 according to the actual InSAR echo data: computing the range and azimuth offsets between s_1 and s_2 and resampling the secondary image onto the primary image;
(2) Performing interference processing on the two registered images: multiplying one image by the conjugate of the other, i.e. s_1 · s_2^*, and extracting the phase;
(3) Removing the flat-earth phase to obtain the interference fringes to be matched;
step 2, separating interference fringe characteristics, which specifically comprises the following steps:
(1) Decomposing and reconstructing interference fringes;
(2) Carrying out interference fringe texture filtering;
(3) Extracting fringe profile-line features;
(4) Rapidly extracting interference fringe features of shadow regions;
(5) Rapidly extracting interference fringe features of layover regions;
step 3, multi-scale correlation matching, which specifically comprises the following steps:
Setting a reference interference fringe image or feature I(x, y) at a certain scale, taking the actually acquired interference fringes and extracted features as a template image J(x, y), and sliding the template J(x, y) over the reference interference fringe image or feature I(x, y) in a translational sliding-window search; measuring the similarity between the reference interference fringes and the actual interference fringes with a normalized cross-correlation matching operator; the normalized correlation coefficient being:
ρ(I, J) = (1/N) Σ_{x,y} [I(x, y) - μ_I] [J(x, y) - μ_J] / (σ_I σ_J)
where μ_I and μ_J are the means of the two images, σ_I and σ_J are their standard deviations, N is the number of pixels in the images, and the normalized cross-correlation takes values in the range [0, 1].
2. The InSAR interference fringe matching method based on wavelet transformation as claimed in claim 1, characterized in that step (1) of step 2 comprises:
decomposing the interference fringes at scale j-1 by the two-dimensional Mallat wavelet decomposition algorithm into the following components:
C_j = H_az H_rg C_{j-1}
D_j^1 = G_az H_rg C_{j-1}
D_j^2 = H_az G_rg C_{j-1}
D_j^3 = G_az G_rg C_{j-1}
wherein H_az is the azimuth low-pass decomposition filter, H_rg is the range low-pass decomposition filter, G_az is the azimuth high-pass decomposition filter, and G_rg is the range high-pass decomposition filter; C_j, D_j^1, D_j^2 and D_j^3 are, respectively, the low-frequency component, the vertical high-frequency component, the horizontal high-frequency component and the diagonal high-frequency component of the fringe image C_{j-1};
reconstructing the interference fringes at the original resolution with the Mallat wavelet reconstruction algorithm:
C_{j-1} = H_az^* H_rg^* C_j + G_az^* H_rg^* D_j^1 + H_az^* G_rg^* D_j^2 + G_az^* G_rg^* D_j^3
in which a weighting coefficient is applied to each of the four j-th-layer components, and the original interference fringe image is completely recovered when the weighting coefficients all equal 1; H_az^*, H_rg^*, G_az^* and G_rg^* are the conjugates of H_az, H_rg, G_az and G_rg, respectively.
3. The InSAR interference fringe matching method based on wavelet transformation as claimed in claim 2, characterized in that step (2) of step 2 specifically comprises:
defining the distance between the coherence coefficients of the central pixel and a neighborhood pixel as:
D = |γ_i - γ_j|
wherein γ_i and γ_j are the coherence coefficients of the central pixel i and the neighbouring pixel j, respectively;
based on the relationship between the coherence-coefficient distance and the neighborhood energy value, the improved neighborhood energy taking the following form:
[improved neighborhood energy V(x_i, x_j): a function of the coherence-coefficient distance D, parameterized by α and β]
wherein x_i denotes the segmentation label of a pixel and x_j the segmentation label of its neighbour; α controls the shape of the curve, and β is a parameter greater than zero that adjusts the weights of the likelihood energy and the neighborhood energy.
4. The InSAR interference fringe matching method based on wavelet transformation as claimed in claim 3, characterized in that step (3) of step 2 specifically comprises:
applying Mallat wavelet decomposition and reconstruction while retaining the vertical high-frequency component D_j^1 and the horizontal high-frequency component D_j^2, the line edge structure being completely reconstructable from these two orthogonal components while the low-frequency component is retained for extracting the equiphase lines and the maximum-gradient profile lines, the line-feature wavelet reconstruction equation being expressed as:
C_{j-1} = H_az^* H_rg^* C_j + G_az^* H_rg^* D_j^1 + H_az^* G_rg^* D_j^2  (the diagonal component D_j^3 is discarded)
after wavelet decomposition and reconstruction, removing the high-frequency noise and high-frequency edges in the interference phase map to obtain a smoothed interference phase map; then, according to the definitions of the equiphase line and the maximum-gradient profile line, further extracting the equiphase lines and maximum-gradient profile lines from the actual interference fringes and the reference simulated interference fringes and performing feature matching to obtain the matching result.
5. The InSAR interference fringe matching method based on wavelet transformation as claimed in claim 4, characterized in that step (4) of step 2 specifically comprises:
within the wavelet multi-scale analysis framework, the Mallat wavelet decomposition and reconstruction retaining, as far as possible, the diagonal high-frequency component D_j^3; detecting shadow regions rapidly using this diagonal component, the shadow-region wavelet reconstruction equation being:
C_{j-1} = G_az^* G_rg^* D_j^3  (only the diagonal component is retained)
after wavelet-domain decomposition and reconstruction, further performing threshold segmentation on the image grey levels in combination with the intensity information, and removing interference-phase noise and small-area shadow detections with morphological erosion and dilation to obtain the detection result for the larger shadow regions; extracting the shadow-region edges with an edge detection algorithm, and matching the shadow boundary information against the shadow edge vectors stored with the reference image to obtain the matching result for the shadow regions.
6. The InSAR interference fringe matching method based on wavelet transformation as claimed in claim 5, characterized in that step (5) of step 2 specifically comprises:
performing threshold segmentation on the image grey levels in combination with the intensity information, and removing layover false-alarm regions with morphological erosion and dilation; the interference fringes of a layover region retaining certain texture features, matching them directly, as area features, against the layover region of the reference simulated interference fringes to obtain the layover matching result; at the same time, the layover boundary and the geographic positions of the pixels in the reference library mapping to each other, so that the layover boundary is also used as a matching result for the layover region.
7. The InSAR interference fringe matching method based on wavelet transformation as claimed in claim 6, characterized in that step 3 specifically comprises:
step (1), performing j-level wavelet decomposition on the actual interference fringes and the reference simulated interference fringes, respectively, to obtain the interference phase maps at each decomposition level;
step (2), for the typical features of the interference phase maps (texture features, line features, shadow interferogram features and layover interferogram features), processing the interference fringe images within the wavelet multi-scale decomposition framework according to the wavelet decomposition and reconstruction method of step (1) of step 2, and extracting the interference-phase area features, equiphase-line features, maximum-gradient profile-line features, shadow line and area features, and layover area features from the interference fringes according to the fringe feature definitions;
step (3), at level j, coarse-matching the features extracted from the two images at the bottom of the scale space with a cross-correlation similarity-measure template matching method to obtain the optimal matching area at that scale;
step (4), at level j-1, within the optimal matching area of the previous step, performing a further higher-resolution cross-correlation matching computation at scale j-1 and screening out mismatched features;
step (5), repeating steps (3) and (4) until the recursion reaches the top of the scale space, obtaining the final matching result.
CN202211239103.4A 2022-10-11 2022-10-11 InSAR interference fringe matching method based on wavelet transformation Pending CN115540908A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211239103.4A CN115540908A (en) 2022-10-11 2022-10-11 InSAR interference fringe matching method based on wavelet transformation

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211239103.4A CN115540908A (en) 2022-10-11 2022-10-11 InSAR interference fringe matching method based on wavelet transformation

Publications (1)

Publication Number Publication Date
CN115540908A true CN115540908A (en) 2022-12-30

Family

ID=84734029

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211239103.4A Pending CN115540908A (en) 2022-10-11 2022-10-11 InSAR interference fringe matching method based on wavelet transformation

Country Status (1)

Country Link
CN (1) CN115540908A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116608922A (en) * 2023-05-17 2023-08-18 小儒技术(深圳)有限公司 Radar-based water level and flow velocity measurement method and system
CN116608922B (en) * 2023-05-17 2024-04-05 小儒技术(深圳)有限公司 Radar-based water level and flow velocity measurement method and system

Similar Documents

Publication Publication Date Title
Kulkarni et al. Pixel level fusion techniques for SAR and optical images: A review
Goel et al. A distributed scatterer interferometry approach for precision monitoring of known surface deformation phenomena
Pipaud et al. Evaluation of TanDEM-X elevation data for geomorphological mapping and interpretation in high mountain environments—A case study from SE Tibet, China
CN109212522B (en) High-precision DEM inversion method and device based on double-base satellite-borne SAR
CN113960595A (en) Surface deformation monitoring method and system
CN113393497B (en) Ship target tracking method, device and equipment of sequence remote sensing image under condition of broken clouds
Paillou et al. Relief reconstruction from SAR stereo pairs: the "optimal gradient" matching method
Dong et al. Radargrammetric DSM generation in mountainous areas through adaptive-window least squares matching constrained by enhanced epipolar geometry
CN110703252B (en) Digital elevation model correction method for interferometric synthetic aperture radar shadow area
CN115540908A (en) InSAR interference fringe matching method based on wavelet transformation
Han et al. A method for classifying land and ocean area by removing Sentinel-1 speckle noise
CN114820552A (en) Method for detecting landslide displacement field by using optical satellite stereo image
CN114325706A (en) Distributed scatterer filtering method
Soergel et al. Segmentation of interferometric SAR data for building detection
US20190311461A1 (en) Method and apparatus for enhancing 3d model resolution
Schmitt et al. Towards airborne single pass decimeter resolution SAR interferometry over urban areas
Biamino et al. A “dynamic” land masking algorithm for synthetic aperture radar images
Recla et al. From Relative to Absolute Heights in SAR-based Single-Image Height Prediction
CN109839635B (en) Method for extracting elevation of height measurement foot points through Cryosat-2 SARIn mode L1 b-level waveform data
CN113311432A (en) InSAR long and short baseline fusion phase estimation method based on phase derivative variance
Saba et al. Application of sub-pixel-based technique “orthorectification of optically sensed images and its correlation” for co-seismic landslide detection and its accuracy modification through the integration of various masks
Cheng et al. Generation of pixel-level SAR image time series using a locally adaptive matching technique
Nascetti et al. Radargrammetric digital surface models generation from high resolution satellite SAR imagery: Methodology and case studies
Lombardi et al. Accuracy of high resolution CSK interferometric Digital Elevation Models
Wei et al. 3D digital elevation model generation

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination