CN113189551A - GB-InSAR repeat-track error compensation method based on scene DEM

GB-InSAR repeat-track error compensation method based on scene DEM

Info

Publication number
CN113189551A
Authority
CN
China
Prior art keywords
radar
point
scene
dimensional
repeat-track
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110279364.8A
Other languages
Chinese (zh)
Other versions
CN113189551B (en)
Inventor
田卫明
朱嘉鑫
邓云开
胡程
姚迪
董锡超
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Institute of Technology BIT
Beijing Institute of Technology Chongqing Innovation Center
Original Assignee
Beijing Institute of Technology BIT
Beijing Institute of Technology Chongqing Innovation Center
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Institute of Technology BIT and Beijing Institute of Technology Chongqing Innovation Center
Priority to CN202110279364.8A
Publication of CN113189551A
Application granted
Publication of CN113189551B
Legal status: Active

Links

Images

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/02Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
    • G01S7/40Means for monitoring or calibrating
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88Radar or analogous systems specially adapted for specific applications
    • G01S13/89Radar or analogous systems specially adapted for specific applications for mapping or imaging
    • G01S13/90Radar or analogous systems specially adapted for specific applications for mapping or imaging using synthetic aperture techniques, e.g. synthetic aperture radar [SAR] techniques
    • G01S13/9021SAR image post-processing techniques
    • G01S13/9023SAR image post-processing techniques combined with interferometric techniques

Landscapes

  • Engineering & Computer Science (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Electromagnetism (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

The invention discloses a GB-InSAR repeat-track error compensation method based on a scene DEM, which reasonably derives the relation between the repeat-track error and the slant range R and corrects, by parameter modeling, the phase error caused by the track offset between repeated measurements. A scene to be monitored is observed twice, at different times, with one observation radar to obtain two radar images. PS (persistent scatterer) points are selected with the coherence coefficient method. A laser scanner scans the scene to be monitored in the horizontal and pitch dimensions to obtain a discrete point cloud reflecting the surface relief of the scene, i.e. the scene DEM. The correspondence between the two-dimensional imaging plane of the radar and the three-dimensional coordinate system of the scene DEM is constructed, and the PS points are back-mapped into the three-dimensional coordinate system of the scene DEM to obtain their three-dimensional coordinates. A polynomial model of the repeat-track error phase is constructed, an estimate of the repeat-track error phase is calculated, and this estimate is subtracted from the interferometric phase of the PS points, compensating the repeat-track error phase.

Description

GB-InSAR repeat-track error compensation method based on scene DEM
Technical Field
The invention relates to the technical field of ground-based interferometric radar, and in particular to a GB-InSAR repeat-track error compensation method based on a scene DEM.
Background
Among the various types of geological disasters, landslides are the most harmful and occur most frequently. Over the last decade, landslides and debris flows have killed or left missing nearly 10,000 people. Since 1949, collapses, landslides and debris flows in the Beijing area have caused more than 600 deaths and direct economic losses of hundreds of millions of yuan, so systematically carrying out landslide disaster prevention and control is an urgent current need.
Ground-Based Interferometric Synthetic Aperture Radar (GB-InSAR) has the advantages of all-day, all-weather, high-precision, non-contact measurement and can measure sub-millimeter weak deformation of a target, so it is widely applied in the field of deformation monitoring. The GB-InSAR measurement principle is based on differential interferometry: radar complex images of the same area acquired at different times are processed by differential interference, and high-precision deformation information is obtained from the interferometric phase of a target point. In campaign-mode radar measurements the equipment must be dismounted and reinstalled, with the acquisition cycle set according to the kinematics of the deformation area. In an actual measurement environment it is difficult to guarantee that the equipment occupies exactly the same position on repeated installation; the radar center and the synthetic-aperture direction shift, which degrades the coherence between images. The influence of the repeat-track error on the interferometric phase must therefore be considered.
From the radar echo phase φ = 4πR/λ, the repeat-track error changes the slant range R between a target point and the radar center; at sub-millimeter measurement accuracy, even a slight track offset distorts the interferometric phase and affects the final deformation measurement result.
Therefore, how to reasonably derive the repeat-track error, so as to correct the phase error caused by the track offset between repeated measurements, is a problem urgently needing a solution.
Disclosure of Invention
In view of the above, the invention provides a GB-InSAR repeat-track error compensation method based on a scene DEM, which can reasonably derive the relation between the repeat-track error and the slant range R and correct the phase error caused by the track offset between repeated measurements by means of parameter modeling.
To achieve this purpose, the technical scheme of the invention comprises the following steps:
S1, observing the scene to be monitored twice, at different times, with one observation radar to obtain two radar images.
S2, selecting PS points with the coherence coefficient method based on the two radar images.
S3, scanning the scene to be monitored in the horizontal and pitch dimensions with a laser scanner, acquiring a discrete point cloud that reflects the surface relief of the scene to be monitored, i.e. the scene DEM.
S4, obtaining the two-dimensional plane coordinates of the PS points from the coordinate-axis information of the radar image, constructing the correspondence between the two-dimensional imaging plane of the radar and the three-dimensional coordinate system of the scene DEM based on the geometry of ground-based SAR imaging, and back-mapping the PS points into the three-dimensional coordinate system of the scene DEM to obtain their three-dimensional coordinates.
S5, analyzing the relation between the repeat-track error phase and the three-dimensional coordinates of the PS points, and constructing a polynomial model of the repeat-track error phase.
S6, calculating the parameters of the polynomial model of the repeat-track error phase by the least-squares method to obtain an estimate of the repeat-track error phase, and subtracting this estimate from the interferometric phase of the PS points to compensate the repeat-track error phase.
Further, a ground-based radar observes the scene to be monitored twice at different times; specifically, the interval between the two observations exceeds one week.
Further, the PS points are selected with the coherence coefficient method based on the two radar images, specifically as follows:
Based on the two radar images, one is set as the master image and the other as the slave image.
The rectangular window sizes d1 and d2 are set and a d1 × d2 rectangular window is constructed; for the k-th pixel, its coherence coefficient γ_k is calculated as

γ_k = |Σ_{i∈W_k} M_i·S_i*| / √(Σ_{i∈W_k} |M_i|² · Σ_{i∈W_k} |S_i|²)

where M_i and S_i denote the complex values of the pixels inside the rectangular window W_k centered on pixel k in the master and slave images, respectively, and * denotes complex conjugation.
The coherence coefficients of all pixels in the master and slave images are calculated in this way.
A coherence threshold γ_T is set, and the pixels whose coherence coefficient exceeds γ_T are selected as PS points.
Further, S4 specifically includes the following steps:
A reference radar A is set and used to image the scene to be monitored. The aperture center C^(A) of reference radar A is taken as the coordinate origin, and the aperture direction C1C2 of radar A, where C1 and C2 are the two end points of radar A's rail, is taken as the x-axis direction; the direction perpendicular to the x-axis and pointing towards the scene to be monitored in the horizontal plane is the y-axis direction. The x-y plane so formed is the two-dimensional imaging plane of the reference radar, the x-axis being defined as the azimuth axis and the y-axis as the range axis. Following the right-hand rule, the direction perpendicular to the imaging plane is defined as the height direction, i.e. the z-axis, which yields the three-dimensional coordinate system C^(A)-xyz of reference radar A, i.e. the three-dimensional coordinate system of the scene DEM.
For any target point P in the scene to be monitored with coordinates (x_p, y_p, z_p) in C^(A)-xyz, its projection onto the two-dimensional imaging plane of the reference radar is P^(A) with coordinates (x_p^(A), y_p^(A)). According to the back-projection (BP) imaging principle, the range migration of the target point P and that of the projection point P^(A) remain exactly consistent while the radar moves along the rail. Constructing the range-migration triangle △C1PC2 and rotating it about the x-axis into the x-y plane, the imaging geometry gives the coordinates of the projection point P^(A) as

x_p^(A) = x_p,  y_p^(A) = √(y_p² + z_p²)

where R_p^(A) = √(x_p² + y_p² + z_p²) is the distance from the target point P to the aperture center C^(A) of reference radar A.
The three-dimensional coordinate system C^(A)-xyz of reference radar A is then converted into the coordinate system of the observation radar B.
The aperture center of observation radar B is C^(B), with coordinates (x_B, y_B, z_B) in C^(A)-xyz; the angle between the aperture direction of observation radar B and the x-y plane is φ, and the angle between the projection of that aperture direction onto the x-y plane and the x-axis is θ. Taking the aperture direction of observation radar B as the x′-axis and the direction perpendicular to the x′-axis and pointing towards the scene to be monitored in the horizontal plane as the y′-axis, the x′-y′ plane so formed is the two-dimensional imaging plane of observation radar B; the z′-axis is determined by the right-hand rule, constructing the three-dimensional rectangular coordinate system C^(B)-x′y′z′ of observation radar B.
The coordinates of the target point P(x_p, y_p, z_p) in C^(B)-x′y′z′, denoted (x′_p, y′_p, z′_p), are

[x′_p]   [ cos φ cos θ    cos φ sin θ    sin φ ] [x_p − x_B]
[y′_p] = [ −sin θ         cos θ          0     ] [y_p − y_B]
[z′_p]   [ −sin φ cos θ   −sin φ sin θ   cos φ ] [z_p − z_B]

and the projection point P^(B) of the target point P on the two-dimensional imaging plane of observation radar B has coordinates

x_p^(B) = x′_p,  y_p^(B) = √(y′_p² + z′_p²)

where R_p^(B) = √(x′_p² + y′_p² + z′_p²) is the distance from the target point P to the aperture center C^(B) of observation radar B.
Accordingly, a number of terrain points are selected as target points P in the scene to be monitored, the projection point P^(B) of each target point P on the two-dimensional imaging plane of observation radar B is correspondingly obtained, the functional relation between the three-dimensional coordinates of the target points P and the plane coordinates of the projection points P^(B) is fitted, and the PS points are back-mapped into the three-dimensional coordinate system of the scene DEM by two-dimensional linear interpolation, obtaining the three-dimensional coordinates of the PS points.
Further, S5 specifically includes: the three-dimensional coordinates of a PS point are the three-dimensional coordinates (x_p, y_p, z_p) of its corresponding target point P. The polynomial model of the repeat-track error phase is then constructed as

φ_error = (A·x_p + B·y_p + C·z_p)/R_p + e

where φ_error is the repeat-track error phase, A, B and C are the three parameters to be estimated, R_p is the distance from the PS point to the aperture center of the observation radar, and e is the unmodeled error phase.
Beneficial effects:
The invention provides a scene-DEM-assisted repeat-track error compensation method. The same area is observed twice, discontinuously, with one radar to obtain two radar images, and PS points are selected as reference points based on the coherence coefficient of the radar images; the PS points are back-mapped onto the three-dimensional scene DEM according to the imaging geometry of the synthetic aperture radar to obtain their three-dimensional coordinates; a phase model of the repeat-track error is constructed through a multivariate Taylor expansion, the model parameters are estimated by the least-squares method, and the error phase is compensated. The method solves the repeat-track error problem of GB-InSAR, can accurately compensate the phase error caused by the track offset at sub-millimeter measurement accuracy, and helps improve the precision of deformation measurement.
Drawings
FIG. 1 is a schematic diagram of a ground-based SAR imaging geometry in an embodiment of the present invention;
FIG. 2 is a schematic diagram of two radar imaging coordinate systems of a reference radar A and an observation radar B in the embodiment of the present invention;
FIG. 3 is a schematic diagram of two-dimensional linear interpolation according to an embodiment of the present invention;
FIG. 4 is a flow chart of an embodiment of the present invention.
Detailed Description
The invention is described in detail below by way of example with reference to the accompanying drawings.
The invention provides a GB-InSAR repeat-track error compensation method based on a scene DEM, the flow of which is shown in FIG. 4 and which specifically comprises the following steps:
and S1, observing the scene to be monitored twice in different time intervals by using one observation radar to obtain two radar images. A ground-based SAR imaging geometry diagram is shown in fig. 1.
S2, selecting PS points with the coherence coefficient method based on the two radar images.
In the embodiment of the invention, selecting the PS points specifically comprises the following steps:
S201, based on the two radar images, one is set as the master image and the other as the slave image. The assignment may be made arbitrarily; specifically, the earlier radar image may be selected as the master image and the later one as the slave image.
S202, based on the two radar images, a d1 × d2 rectangular window is constructed and the coherence coefficient is calculated according to formula (1):

γ_k = |Σ_{i∈W_k} M_i·S_i*| / √(Σ_{i∈W_k} |M_i|² · Σ_{i∈W_k} |S_i|²)    (1)

In formula (1), γ_k is the coherence coefficient of the k-th pixel, and M_i and S_i denote the complex values of the pixels inside the rectangular window W_k centered on pixel k in the master and slave images, respectively (* denotes complex conjugation). The coherence coefficients of all pixels in the master and slave images are calculated as above.
A coherence threshold γ_T is set and the pixels satisfying γ > γ_T are taken as PS points, where γ_T is obtained empirically.
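As an illustrative sketch (not part of the patent text), formula (1) can be evaluated at every pixel with a sliding boxcar average in Python/NumPy; the array names master/slave, the window sizes and the threshold value below are assumptions:

```python
import numpy as np
from scipy.ndimage import uniform_filter

def coherence_map(master, slave, d1=5, d2=5):
    """Coherence coefficient of formula (1) at every pixel, using a
    d1 x d2 sliding rectangular window (boxcar averaging)."""
    cross = master * np.conj(slave)
    # uniform_filter handles real arrays only, so filter re/im separately
    num_re = uniform_filter(cross.real, size=(d1, d2))
    num_im = uniform_filter(cross.imag, size=(d1, d2))
    p_m = uniform_filter(np.abs(master) ** 2, size=(d1, d2))
    p_s = uniform_filter(np.abs(slave) ** 2, size=(d1, d2))
    return np.hypot(num_re, num_im) / np.sqrt(p_m * p_s + 1e-12)

# gamma = coherence_map(master_img, slave_img)
# ps_mask = gamma > 0.9   # gamma_T is chosen empirically, per the text
```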
S3, scanning the scene to be monitored in the horizontal and pitch dimensions with a laser scanner. The instrument measures the line-of-sight distance from its fixed center by pulse ranging; then, from the distance, azimuth angle and pitch angle of each measured point, the conversion from a spherical coordinate system to a three-dimensional rectangular coordinate system yields the discrete point cloud reflecting the surface relief of the scene, i.e. the scene DEM.
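As a small illustrative sketch, this spherical-to-Cartesian conversion can be written as follows; the angle conventions (azimuth measured from the y-axis, elevation above the horizontal plane) are assumptions and must be matched to the actual instrument:

```python
import numpy as np

def scan_to_dem(r, az, el):
    """Convert scanner measurements (line-of-sight distance r, azimuth
    angle az, pitch/elevation angle el, angles in radians) into an
    (N, 3) discrete point cloud, i.e. the scene DEM."""
    x = r * np.cos(el) * np.sin(az)
    y = r * np.cos(el) * np.cos(az)
    z = r * np.sin(el)
    return np.column_stack([x, y, z])
```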
S4, obtaining the two-dimensional plane coordinates of the PS points from the coordinate-axis information of the radar image, constructing the correspondence between the two-dimensional imaging plane of the radar and the three-dimensional coordinate system of the scene DEM based on the geometry of ground-based SAR imaging, and back-mapping the PS points into the three-dimensional coordinate system of the scene DEM to obtain their three-dimensional coordinates.
During ground-based SAR imaging, the three-dimensional terrain of the observation area is projected onto a two-dimensional imaging plane; the imaging geometry is shown in FIG. 2. A reference radar A is set and used to image the scene to be monitored. The aperture center C^(A) of reference radar A is taken as the coordinate origin, and the aperture direction C1C2 of radar A, where C1 and C2 are the two end points of radar A's rail, is taken as the x-axis direction; the direction perpendicular to the x-axis and pointing towards the scene to be monitored in the horizontal plane is the y-axis direction. The x-y plane so formed is the two-dimensional imaging plane of the reference radar, the x-axis being defined as the azimuth axis and the y-axis as the range axis. Following the right-hand rule, the direction perpendicular to the imaging plane is defined as the height direction, i.e. the z-axis, which yields the three-dimensional coordinate system C^(A)-xyz of reference radar A, i.e. the three-dimensional coordinate system of the scene DEM.
For any target point P in the scene to be monitored with coordinates (x_p, y_p, z_p) in C^(A)-xyz, its projection onto the two-dimensional imaging plane of the reference radar is P^(A) with coordinates (x_p^(A), y_p^(A)). According to the back-projection (BP) imaging principle, the range migration of the target point P and that of the projection point P^(A) remain exactly consistent while the radar moves along the rail. Constructing the range-migration triangle △C1PC2 and rotating it about the x-axis into the x-y plane, the imaging geometry gives the coordinates of the projection point P^(A) as

x_p^(A) = x_p,  y_p^(A) = √(y_p² + z_p²)    (2)

where R_p^(A) = √(x_p² + y_p² + z_p²) is the distance from the target point P to the aperture center C^(A) of reference radar A.
The three-dimensional coordinate system C^(A)-xyz of reference radar A is converted into the coordinate system of the observation radar B. When the radar center is not at the origin, i.e. for the observation radar B in FIG. 2, the aperture center of observation radar B is C^(B), with coordinates (x_B, y_B, z_B) in C^(A)-xyz; the angle between the aperture direction of observation radar B and the x-y plane is φ, and the angle between the projection of that aperture direction onto the x-y plane and the x-axis is θ. Taking the aperture direction of observation radar B as the x′-axis and the direction perpendicular to the x′-axis and pointing towards the scene to be monitored in the horizontal plane as the y′-axis, the x′-y′ plane so formed is the two-dimensional imaging plane of observation radar B; the z′-axis is determined by the right-hand rule, constructing the three-dimensional rectangular coordinate system C^(B)-x′y′z′ of observation radar B.
The coordinates (x′_p, y′_p, z′_p) of the target point P(x_p, y_p, z_p) in the C^(B)-x′y′z′ coordinate system are analyzed as follows. Assume that in the reference coordinate system C^(A)-xyz the unit vectors along the x-, y- and z-axes are i, j and k. According to the geometrical relation between the two coordinate systems, the unit vectors u, v and w along the x′-, y′- and z′-axes can be expressed as

u = (u_x, u_y, u_z) = (cos φ cos θ, cos φ sin θ, sin φ)
v = (v_x, v_y, v_z) = (−sin θ, cos θ, 0)                       (3)
w = (w_x, w_y, w_z) = (−sin φ cos θ, −sin φ sin θ, cos φ)

The conversion relation between the coordinate systems C^(A)-xyz and C^(B)-x′y′z′ can then be expressed as

[x′]   [ u_x  u_y  u_z ] [x − x_B]
[y′] = [ v_x  v_y  v_z ] [y − y_B]    (4)
[z′]   [ w_x  w_y  w_z ] [z − z_B]

where (x, y, z) and (x′, y′, z′) are the coordinates of the same point in the two coordinate systems. The coordinates of the target point P(x_p, y_p, z_p) in C^(B)-x′y′z′ can therefore be expressed as

[x′_p]   [ u_x  u_y  u_z ] [x_p − x_B]
[y′_p] = [ v_x  v_y  v_z ] [y_p − y_B]    (5)
[z′_p]   [ w_x  w_y  w_z ] [z_p − z_B]

According to formula (2), the projection coordinates of the target point P on the imaging plane of observation radar B can be expressed as

x_p^(B) = x′_p,  y_p^(B) = √(y′_p² + z′_p²)    (6)
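Formulas (3) to (6) can be sketched as follows; center_b, theta and phi stand for the position and attitude parameters of observation radar B, which in practice must be measured:

```python
import numpy as np

def rotation_b(theta, phi):
    """Formula (3): rows are the unit vectors of the x'-, y'- and
    z'-axes of C(B)-x'y'z' expressed in C(A)-xyz."""
    u = np.array([np.cos(phi) * np.cos(theta),
                  np.cos(phi) * np.sin(theta),
                  np.sin(phi)])                       # aperture direction x'
    v = np.array([-np.sin(theta), np.cos(theta), 0.0])  # y' in the horizontal plane
    w = np.cross(u, v)                                # right-hand rule gives z'
    return np.vstack([u, v, w])

def project_to_radar_b(points, center_b, theta, phi):
    """Formulas (5)-(6): express scene points in radar B's frame and
    project them onto radar B's 2-D imaging plane."""
    local = (points - center_b) @ rotation_b(theta, phi).T  # (x', y', z')
    return np.column_stack([local[:, 0],
                            np.hypot(local[:, 1], local[:, 2])])
```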
accordingly, a plurality of topographic points are selected from the scene to be monitored as target points P, and the projection point P of each target point P on the two-dimensional imaging plane of the observation radar B is correspondingly obtained(B)Fitting to obtain the three-dimensional coordinates of the target point P and the projection point P(B)And reversely mapping the PS point to a three-dimensional coordinate system of the scene DEM by combining a two-dimensional linear interpolation method according to the functional relation between the PS point and the scene DEM to obtain the three-dimensional coordinate of the PS point.
Assume there are M terrain points in space, whose three-dimensional coordinates can be expressed as (x_k, y_k, z_k), with k ∈ {1, 2, ..., M}; projecting the terrain points onto the radar plane gives (x_k^(B), y_k^(B)). The functional relation between the coordinates can be expressed as

(x_k, y_k, z_k) = F(x_k^(B), y_k^(B))    (7)

Through PS selection by the coherence coefficient method, N PS points are obtained, whose plane coordinates are (x_i^(PS), y_i^(PS)), with i ∈ {1, 2, ..., N}. Using two-dimensional linear interpolation, the three-dimensional coordinates corresponding to the PS points can be calculated from the projected coordinates of the terrain points; the principle is shown in FIG. 3, where a point P_0 lies inside the triangle formed by three projected terrain points P_k, k ∈ {1, 2, 3}, and its function value is obtained with formula (8), the triangle-based linear (barycentric) interpolation

f(P_0) = Σ_{k=1}^{3} λ_k·f(P_k),  Σ_{k=1}^{3} λ_k = 1    (8)

where λ_k are the barycentric coordinates (area ratios) of P_0 with respect to the triangle P_1P_2P_3.
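A sketch of this back-mapping step (formulas (7) and (8)); SciPy's LinearNDInterpolator triangulates the projected terrain points and interpolates linearly inside each triangle, which matches the barycentric scheme of FIG. 3 (project_to_radar_b is the sketch given above):

```python
import numpy as np
from scipy.interpolate import LinearNDInterpolator

def back_map_ps(dem_points, ps_plane, center_b, theta, phi):
    """Interpolate the 3-D DEM coordinates at the imaging-plane
    position of each PS point."""
    proj = project_to_radar_b(dem_points, center_b, theta, phi)  # (M, 2)
    interp = LinearNDInterpolator(proj, dem_points)  # vector-valued (M, 3)
    return interp(ps_plane)  # (N, 3); NaN outside the projected DEM hull
```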
S5, assume the coordinates of the target point P are (x_p, y_p, z_p) and the coordinates of the radar aperture center O are (x, y, z); the slant range between the radar center and the target point P can be expressed as

R(x, y, z) = √((x_p − x)² + (y_p − y)² + (z_p − z)²)    (9)

The slant range R(x, y, z) is subjected to a multivariate Taylor expansion at the origin (0, 0, 0), giving to first order

R(x, y, z) ≈ R_p − (x_p·x + y_p·y + z_p·z)/R_p,  R_p = √(x_p² + y_p² + z_p²)    (10)

When the radar center shifts from the origin to O′(ε_x, ε_y, ε_z), where ε_x, ε_y and ε_z respectively denote the components of the repeat-track error in the azimuth, range and height directions, the slant-range change ΔR caused by the repeat-track error can be expressed as

ΔR = R(ε_x, ε_y, ε_z) − R(0, 0, 0) ≈ −(x_p·ε_x + y_p·ε_y + z_p·ε_z)/R_p    (11)

As can be seen from formula (11), the repeat-track error phase at a given PS point can be modeled as

φ_error = (4π/λ)·ΔR = (A·x_p + B·y_p + C·z_p)/R_p + e    (12)

where φ_error is the repeat-track error phase, λ is the radar wavelength, A, B and C are the three parameters to be estimated (absorbing the factor −4π/λ and the offset components ε_x, ε_y, ε_z), R_p is the distance from the PS point to the aperture center of the observation radar, and e is the unmodeled error phase.
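A quick numerical check, with purely illustrative numbers, that the first-order model of formula (11) reproduces the exact slant-range change for millimeter-level track offsets:

```python
import numpy as np

p = np.array([120.0, 850.0, 60.0])     # illustrative target coordinates (m)
eps = np.array([2e-3, -1e-3, 1.5e-3])  # illustrative track offset (m)

Rp = np.linalg.norm(p)                 # R_p = R(0, 0, 0)
exact = np.linalg.norm(p - eps) - Rp   # exact slant-range change
linear = -(p @ eps) / Rp               # formula (11)
print(exact, linear)  # both ~ 6.0e-4 m; they differ by far less than 1 um
```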
S6, calculating the parameters of the polynomial model of the repeat-track error phase by the least-squares method to obtain an estimate of the repeat-track error phase, and subtracting this estimate from the interferometric phase of the PS points to compensate the repeat-track error phase.
The polynomial parameters are estimated by the least-squares method and the repeat-track error phase is compensated. First, a linear system of equations is established:

ΔΦ = Xβ + E    (13)

where ΔΦ is the N × 1 vector formed by the interferometric phases of the N PS points, X is the N × 3 matrix formed from the three-dimensional coordinates and slant ranges of the PS points, whose i-th row is (x_i/R_i, y_i/R_i, z_i/R_i), β = (A, B, C)ᵀ is the 3 × 1 vector to be estimated, and E is the N × 1 error vector.
According to the principle of the least-squares algorithm, the parameter estimate β̂ is obtained as

β̂ = (XᵀX)⁻¹XᵀΔΦ    (14)

where ᵀ denotes the matrix transpose. The estimate of the repeat-track error phase can then be expressed as

ΔΦ_error = Xβ̂    (15)

Subtracting ΔΦ_error from ΔΦ realizes the compensation of the repeat-track error in the two discontinuous GB-InSAR measurements.
In summary, the above description is only a preferred embodiment of the present invention, and is not intended to limit the scope of the present invention. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (5)

1. A GB-InSAR repeat-track error compensation method based on a scene DEM, characterized by comprising the following steps:
S1, observing the scene to be monitored twice, at different times, with one observation radar to obtain two radar images;
S2, selecting PS points with the coherence coefficient method based on the two radar images;
S3, scanning the scene to be monitored in the horizontal and pitch dimensions with a laser scanner to obtain a discrete point cloud reflecting the surface relief of the scene to be monitored, i.e. the scene DEM;
S4, obtaining the two-dimensional plane coordinates of the PS points from the coordinate-axis information of the radar image, constructing the correspondence between the two-dimensional imaging plane of the radar and the three-dimensional coordinate system of the scene DEM based on the geometry of ground-based SAR imaging, and back-mapping the PS points into the three-dimensional coordinate system of the scene DEM to obtain their three-dimensional coordinates;
S5, analyzing the relation between the repeat-track error phase and the three-dimensional coordinates of the PS points, and constructing a polynomial model of the repeat-track error phase;
S6, calculating the parameters of the polynomial model of the repeat-track error phase by the least-squares method to obtain an estimate of the repeat-track error phase, and subtracting this estimate from the interferometric phase of the PS points to compensate the repeat-track error phase.
2. The method of claim 1, characterized in that the scene to be monitored is observed twice with a ground-based radar at different times; specifically, the interval between the two observations exceeds one week.
3. The method according to claim 1 or 2, characterized in that the PS points are selected with the coherence coefficient method based on the two radar images, specifically:
setting one of the two radar images as the master image and the other as the slave image;
setting the rectangular window sizes d1 and d2, constructing a d1 × d2 rectangular window, and calculating, for the k-th pixel, its coherence coefficient γ_k as

γ_k = |Σ_{i∈W_k} M_i·S_i*| / √(Σ_{i∈W_k} |M_i|² · Σ_{i∈W_k} |S_i|²)

where M_i and S_i denote the complex values of the pixels inside the rectangular window W_k centered on pixel k in the master and slave images, respectively;
calculating the coherence coefficients of all pixels in the master and slave images accordingly;
setting a coherence threshold γ_T and selecting the pixels whose coherence coefficient exceeds γ_T as PS points.
4. The method according to any one of claims 1 to 3, characterized in that S4 specifically comprises the following steps:
setting a reference radar A and imaging the scene to be monitored with it, taking the aperture center C^(A) of reference radar A as the coordinate origin, taking the aperture direction C1C2 of radar A, where C1 and C2 are the two end points of radar A's rail, as the x-axis direction, and taking the direction perpendicular to the x-axis and pointing towards the scene to be monitored in the horizontal plane as the y-axis direction; the x-y plane so formed is the two-dimensional imaging plane of the reference radar, the x-axis being defined as the azimuth axis and the y-axis as the range axis; following the right-hand rule, the direction perpendicular to the imaging plane is defined as the height direction, i.e. the z-axis, yielding the three-dimensional coordinate system C^(A)-xyz of reference radar A, which is the three-dimensional coordinate system of the scene DEM;
for any target point P in the scene to be monitored with coordinates (x_p, y_p, z_p) in C^(A)-xyz, its projection onto the two-dimensional imaging plane of the reference radar is P^(A) with coordinates (x_p^(A), y_p^(A)); according to the back-projection (BP) imaging principle, the range migration of the target point P and that of the projection point P^(A) remain exactly consistent while the radar moves along the rail; constructing the range-migration triangle △C1PC2 and rotating it about the x-axis into the x-y plane, the imaging geometry gives the coordinates of the projection point P^(A) as

x_p^(A) = x_p,  y_p^(A) = √(y_p² + z_p²)

where R_p^(A) = √(x_p² + y_p² + z_p²) is the distance from the target point P to the aperture center C^(A) of reference radar A;
converting the three-dimensional coordinate system C^(A)-xyz of reference radar A into the coordinate system of the observation radar B: the aperture center of observation radar B is C^(B), with coordinates (x_B, y_B, z_B) in C^(A)-xyz; the angle between the aperture direction of observation radar B and the x-y plane is φ, and the angle between the projection of that aperture direction onto the x-y plane and the x-axis is θ; taking the aperture direction of observation radar B as the x′-axis and the direction perpendicular to the x′-axis and pointing towards the scene to be monitored in the horizontal plane as the y′-axis, the x′-y′ plane so formed is the two-dimensional imaging plane of observation radar B; the z′-axis is determined by the right-hand rule, constructing the three-dimensional rectangular coordinate system C^(B)-x′y′z′ of observation radar B;
the coordinates of the target point P(x_p, y_p, z_p) in C^(B)-x′y′z′, denoted (x′_p, y′_p, z′_p), are

[x′_p]   [ cos φ cos θ    cos φ sin θ    sin φ ] [x_p − x_B]
[y′_p] = [ −sin θ         cos θ          0     ] [y_p − y_B]
[z′_p]   [ −sin φ cos θ   −sin φ sin θ   cos φ ] [z_p − z_B]

and the projection point P^(B) of the target point P on the two-dimensional imaging plane of observation radar B has coordinates

x_p^(B) = x′_p,  y_p^(B) = √(y′_p² + z′_p²)

where R_p^(B) = √(x′_p² + y′_p² + z′_p²) is the distance from the target point P to the aperture center C^(B) of observation radar B;
accordingly, selecting a number of terrain points in the scene to be monitored as target points P, correspondingly obtaining the projection point P^(B) of each target point P on the two-dimensional imaging plane of observation radar B, fitting the functional relation between the three-dimensional coordinates of the target points P and the plane coordinates of the projection points P^(B), and back-mapping the PS points into the three-dimensional coordinate system of the scene DEM by two-dimensional linear interpolation, obtaining the three-dimensional coordinates of the PS points.
5. The method according to claim 4, characterized in that S5 is specifically:
the three-dimensional coordinates of a PS point are the three-dimensional coordinates (x_p, y_p, z_p) of its corresponding target point P;
the polynomial model of the repeat-track error phase is then constructed as

φ_error = (A·x_p + B·y_p + C·z_p)/R_p + e

where φ_error is the repeat-track error phase, A, B and C are the three parameters to be estimated, R_p is the distance from the PS point to the aperture center of the observation radar, and e is the unmodeled error phase.
CN202110279364.8A 2021-03-16 2021-03-16 GB-InSAR repeat-track error compensation method based on scene DEM Active CN113189551B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110279364.8A CN113189551B (en) GB-InSAR repeat-track error compensation method based on scene DEM

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110279364.8A CN113189551B (en) GB-InSAR repeat-track error compensation method based on scene DEM

Publications (2)

Publication Number Publication Date
CN113189551A 2021-07-30
CN113189551B CN113189551B (en) 2023-05-05

Family

ID=76973401

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110279364.8A Active CN113189551B (en) GB-InSAR repeat-track error compensation method based on scene DEM

Country Status (1)

Country Link
CN (1) CN113189551B (en)


Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109782276A * 2017-11-13 2019-05-21 三亚中科遥感研究所 Long-baseline airborne repeat-pass interferometric SAR registration method
CN109242872A (en) * 2018-08-27 2019-01-18 西安电子科技大学 Interference baseline estimation method based on SRTM DEM
CN110703245A * 2019-10-15 2020-01-17 北京理工大学 Ground-based SAR multi-angle image registration method based on homonymous-point matching and DEM assistance

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
WEIMING TIAN ET AL.: "GB-InSAR-Based DEM Generation Method and Precision Analysis", Remote Sensing *
方东生 et al.: "Analysis of the influence of motion compensation on airborne repeat-pass InSAR imaging", 《雷达科学与技术》 (Radar Science and Technology) *
陶秋香 et al.: "Identification and selection of PS points in SAR images", 《应用科学学报》 (Journal of Applied Sciences) *

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113702972A (en) * 2021-08-30 2021-11-26 中国科学院空天信息创新研究院 Airborne multi-channel radar amplitude-phase error estimation method based on terrain prior
CN114035188A (en) * 2022-01-11 2022-02-11 西南交通大学 Ground-based radar glacier flow speed high-precision monitoring algorithm and system
CN114035188B (en) * 2022-01-11 2022-04-01 西南交通大学 High-precision monitoring method and system for glacier flow velocity of ground-based radar
CN116559866A (en) * 2023-07-11 2023-08-08 南京天辰礼达电子科技有限公司 Ground-based synthetic aperture radar atmosphere compensation method
CN116559866B (en) * 2023-07-11 2023-09-29 南京天辰礼达电子科技有限公司 Ground-based synthetic aperture radar atmosphere compensation method

Also Published As

Publication number Publication date
CN113189551B (en) 2023-05-05

Similar Documents

Publication Publication Date Title
CN113189551B (en) GB-InSAR repeat-track error compensation method based on scene DEM
JP6421395B2 (en) 3D topographic map formation method from SAR map
CN107102333B (en) Satellite-borne InSAR long and short baseline fusion unwrapping method
Catalão et al. Merging GPS and atmospherically corrected InSAR data to map 3-D terrain displacement velocity
CN111208512B (en) Interferometric measurement method based on video synthetic aperture radar
CN103885059B (en) A kind of multi-baseline interference synthetic aperture radar three-dimensional rebuilding method
CN113740844B (en) Dual ground-based radar joint observation method for dam three-dimensional deformation monitoring
CN113340191B (en) Time series interference SAR deformation quantity measuring method and SAR system
CN112050725A (en) Surface deformation monitoring method fusing InSAR and GPS
Wang et al. Planar block adjustment and orthorectification of ZY-3 satellite images
CN111721241A (en) GNSS-InBSAR and GB-InSAR cross-system fusion three-dimensional deformation measurement method
CN109239710B (en) Method and device for acquiring radar elevation information and computer-readable storage medium
KR101712084B1 (en) Method and Apparatus for Correcting Ionospheric Distortion based on multiple aperture interferometry
CN115097450A (en) Cross-track high-resolution three-SAR (synthetic aperture radar) offset large-gradient landslide deformation estimation method
Baffelli et al. Geostatistical Analysis and Mitigation of the Atmospheric Phase Screens in Ku-Band Terrestrial Radar Interferometric Observations of an Alpine Glacier.
CN115930800A (en) Tunnel face displacement field monitoring method based on three-dimensional laser point cloud
CN115712095A (en) SAR satellite three-dimensional positioning error correction method and system based on single angular reflection
CN110133653B (en) Satellite-borne SAR image rapid indirect positioning method based on DSM data
Scheiber A three-step phase correction approach for airborne repeat-pass interferometric SAR data
CN111681299B (en) Method and device for generating digital surface model based on InSAR unwrapping phase
Insfran et al. Accurate 3D Measurement from Two SAR Images Without Prior Knowledge of Scene
Jaafar et al. Health monitoring of historic ruins and heritage buildings using terrestrial laser scanning and Generalised Procrustes Analysis
CN117830543A (en) Method, device, equipment and medium for estimating DEM (digital elevation model) based on satellite-borne double-station InSAR (interferometric synthetic aperture radar) and laser radar data
Tian et al. 3D laser odometry for a mobile robot platform
Matsuba et al. Stereo Reconstruction of SURF Zone Waves Using UAV

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant