CN113671500B - Unmanned aerial vehicle-mounted bistatic SAR high-frequency motion error compensation method - Google Patents
Unmanned aerial vehicle-mounted bistatic SAR high-frequency motion error compensation method
- Publication number
- CN113671500B (application CN202110917945.XA)
- Authority
- CN
- China
- Prior art keywords
- amplitude
- scene
- space
- initial phase
- motion error
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/89—Radar or analogous systems specially adapted for specific applications for mapping or imaging
- G01S13/90—Radar or analogous systems specially adapted for specific applications for mapping or imaging using synthetic aperture techniques, e.g. synthetic aperture radar [SAR] techniques
- G01S13/904—SAR modes
- G01S13/9058—Bistatic or multistatic SAR
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/89—Radar or analogous systems specially adapted for specific applications for mapping or imaging
- G01S13/90—Radar or analogous systems specially adapted for specific applications for mapping or imaging using synthetic aperture techniques, e.g. synthetic aperture radar [SAR] techniques
- G01S13/904—SAR modes
- G01S13/9052—Spotlight mode
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/89—Radar or analogous systems specially adapted for specific applications for mapping or imaging
- G01S13/90—Radar or analogous systems specially adapted for specific applications for mapping or imaging using synthetic aperture techniques, e.g. synthetic aperture radar [SAR] techniques
- G01S13/9094—Theoretical aspects
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/02—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
- G01S7/40—Means for monitoring or calibrating
- G01S7/4052—Means for monitoring or calibrating by simulation of echoes
Landscapes
- Engineering & Computer Science (AREA)
- Remote Sensing (AREA)
- Radar, Positioning & Navigation (AREA)
- Physics & Mathematics (AREA)
- Computer Networks & Wireless Communication (AREA)
- General Physics & Mathematics (AREA)
- Electromagnetism (AREA)
- Radar Systems Or Details Thereof (AREA)
Abstract
The invention discloses a high-frequency motion error compensation method for an unmanned aerial vehicle-mounted bistatic SAR. The method constructs a high-frequency motion error model of the unmanned aerial vehicle-mounted bistatic SAR system and determines the spatial variation of its amplitude and initial phase together with the space-variant gradient direction; it then estimates, with a conventional parameter estimation method, the amplitude and initial phase information of the high signal-to-noise-ratio regions of the full scene, projects this information onto the space-variant gradient direction, and obtains the amplitude and initial phase information at all positions along the space-variant gradient direction by fitting; finally, the full scene is divided into sub-scenes and the high-frequency motion error of each sub-scene is compensated, thereby realizing high-frequency motion error compensation of the full scene. By using projection and fitting, the method only needs to estimate the amplitude and initial phase of the high-frequency motion error in the high signal-to-noise-ratio regions, which reduces the amount of computation and improves the operational efficiency; the sub-scene division is used to compensate the motion error of the full scene directly, improving the compensation efficiency and accuracy.
Description
Technical Field
The invention relates to the technical field of bistatic synthetic aperture radars, in particular to an unmanned aerial vehicle-mounted bistatic SAR high-frequency motion error compensation method.
Background
Compared with a traditional manned-aircraft-borne or satellite-borne system, the unmanned aerial vehicle-borne bistatic synthetic aperture radar (UAV-BiSAR, Unmanned Aerial Vehicle Bistatic Synthetic Aperture Radar) system is more susceptible to environmental disturbance during flight, so jitter introduces errors into the trajectory. Compared with a monostatic system, the UAV-BiSAR system has twice as many motion error sources, and the trajectory errors of both the transmitter and the receiver must be considered during motion compensation. When the system operates in a higher frequency band, simultaneous jitter of the transmitter and the receiver introduces high-frequency errors of similar frequency with an obvious space-variant character; the two error contributions are strongly coupled, the spatial variation becomes more complex, and the difficulty of subsequent motion error compensation is greatly increased.
Traditional motion error compensation relies on iterative, parameter-tuning autofocus algorithms that require strong-point information in the scene and mainly handle a single space-variant direction, which makes the computational load large. In a bistatic system, the UAV trajectories are strongly affected by the environment during flight and the bistatic geometry couples the motion errors; when the space-variant direction changes, the compensation efficiency of these methods is low and the compensation effect is poor.
Disclosure of Invention
In view of the above, the invention provides an unmanned aerial vehicle-mounted bistatic SAR high-frequency motion error compensation method, which can efficiently realize high-precision compensation of high-frequency motion errors under conditions of complex spatial variation and large computational load.
The specific scheme of the invention is as follows:
an unmanned aerial vehicle-mounted bistatic SAR high-frequency motion error compensation method comprises the following steps:
step one, establishing a sinusoidal motion error space-variant model of an unmanned aerial vehicle-mounted bistatic SAR system to obtain the space-variant gradient direction of the amplitude and the initial phase in a full scene;
step two, estimating the amplitude and initial phase information of a high signal-to-noise-ratio region in the full scene, and projecting the estimation result onto the space-variant gradient direction to obtain the amplitude and initial phase information in the space-variant gradient direction;
step three, fitting the amplitude and initial phase information in the space-variant gradient direction to obtain a fitting result, namely the amplitude and initial phase information at all positions in the space-variant gradient direction;
step four, dividing the full scene into n sub-scenes, projecting the center position of each sub-scene onto the space-variant gradient direction, and finding, from the fitting result, the amplitude and initial phase information corresponding to the projections of the n sub-scene centers, which serve as the high-frequency motion error parameters of the n sub-scene centers; wherein n is a positive integer;
and step five, constructing a phase compensation function for each sub-scene according to its high-frequency motion error parameters, completing the high-frequency motion error compensation of each sub-scene, and thereby realizing the high-frequency motion error compensation of the full scene.
Further, in step one, the space-variant gradient direction of the amplitude and the initial phase in the full scene is obtained as follows: the jitter states of the transmitter and the receiver are obtained from the inertial navigation system, and the amplitude and initial phase information over the full scene is calculated from them, from which the space-variant gradient direction of the amplitude and the initial phase in the full scene is obtained.
Further, in step two, projecting the estimation result onto the space-variant gradient direction of the amplitude and the initial phase obtained in step one specifically comprises: drawing a contour diagram of the amplitude and the initial phase of the full scene, wherein the intersection of the contour line through the high signal-to-noise-ratio region and the space-variant gradient direction is the projection position of the center of the high signal-to-noise-ratio region on the space-variant gradient direction.
Further, in step three, a linear least-squares second-order fitting method is used to fit the amplitude and initial phase information in the space-variant gradient direction, obtaining the amplitude and initial phase information at all positions in the space-variant gradient direction.
Further, in step four, the criterion for dividing the full scene into n sub-scenes is that the phase error within each sub-scene is less than π/4.
Further, the phase compensation function is:
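(The compensation function itself is reproduced only as an image in the published text; a plausible form consistent with the symbol definitions below, written with the assumed notation $\hat{A}_n$, $\hat{f}_n$, $\hat{\varphi}_n$ for the estimated amplitude, frequency and initial phase of the n-th sub-scene, is

$$H_n(u) = \exp\!\left\{ j\,\frac{2\pi}{\lambda}\,\hat{A}_n \sin\!\left(2\pi \hat{f}_n u + \hat{\varphi}_n\right) \right\},$$

where the positive sign and the $2\pi/\lambda$ scaling assume that the echo carries the total bistatic slant-range error as $\exp\{-j2\pi\Delta R/\lambda\}$.)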
where the parameter estimation results of the n-th sub-scene denote the estimated amplitude, estimated frequency, and estimated initial phase, respectively; j is the imaginary unit, λ is the wavelength of the transmitted signal, and u is the azimuth slow time.
The beneficial effects are that:
(1) During the calculation, the projection approach requires estimating the amplitude and initial phase of the high-frequency motion error only in the high signal-to-noise-ratio regions, which reduces the amount of computation; the amplitude and initial phase information corresponding to these regions is then fitted, so that the amplitude and initial phase of the high-frequency motion error at all positions can be obtained directly, further improving the computational efficiency. In addition, the full scene is divided into n sub-scenes and the motion error is compensated sub-scene by sub-scene, thereby realizing motion error compensation of the full scene; compared with compensating the full scene directly, this improves the compensation efficiency and accuracy.
(2) In a preferred embodiment, the projection position on the space-variant gradient direction corresponding to the center of the high signal-to-noise-ratio region is obtained by drawing a contour diagram, which simplifies the calculation of the projection position while keeping its accuracy sufficient.
Drawings
Fig. 1 is a schematic view of a body attitude angle of an unmanned aerial vehicle according to an exemplary embodiment of the present invention;
FIG. 2 is a schematic diagram of the space-variant gradient of scene amplitude and phase according to an exemplary embodiment of the present invention;
FIG. 3 is a schematic diagram of a space-variant gradient line model according to an exemplary embodiment of the present invention;
FIG. 4 is a simulated target point distribution diagram according to an exemplary embodiment of the present invention;
FIG. 5 is the amplitude space-variance simulation result according to an exemplary embodiment of the present invention;
FIG. 6 is the initial-phase space-variance simulation result according to an exemplary embodiment of the present invention;
FIG. 7 is the fitting result of the phase-error amplitude according to an exemplary embodiment of the present invention;
FIG. 8 is the fitting result of the initial phase according to an exemplary embodiment of the present invention;
FIG. 9 is the imaging result of target 1 before motion compensation in accordance with an exemplary embodiment of the present invention;
FIG. 10 is the imaging result of target 1 after range motion compensation in accordance with an exemplary embodiment of the present invention;
FIG. 11 is the imaging result of target 1 after azimuth motion compensation in accordance with an exemplary embodiment of the present invention;
FIG. 12 is the imaging result of target 25 before motion compensation in accordance with an exemplary embodiment of the present invention;
FIG. 13 is the imaging result of target 25 after range motion compensation in accordance with an exemplary embodiment of the present invention;
FIG. 14 is the imaging result of target 25 after azimuth motion compensation in accordance with an exemplary embodiment of the present invention.
Detailed Description
The invention will now be described in detail by way of example with reference to the accompanying drawings.
The high-frequency motion error compensation method for the unmanned aerial vehicle-mounted bistatic SAR comprises: constructing a high-frequency motion error model of the UAV-mounted bistatic SAR system and, based on the jitter information of the unmanned aerial vehicles during operation provided by the inertial navigation system, determining the spatial variation of the amplitude and the initial phase and their space-variant gradient direction; estimating, with a conventional parameter estimation method, the amplitude and initial phase information of the high signal-to-noise-ratio regions in the full scene, projecting this information onto the space-variant gradient direction, and obtaining, by fitting, the amplitude and initial phase information at all positions along the space-variant gradient direction; then dividing the full scene into sub-scenes, finding the projection position of each sub-scene on the space-variant gradient direction to obtain the estimated amplitude and initial phase of that sub-scene, and constructing the compensation signal, i.e. the phase compensation function, based on the linear-frequency-modulation echo signal model, so as to complete the high-frequency motion error compensation of each sub-scene and thereby of the full scene.
The method specifically comprises the following steps:
step one, establishing a sinusoidal motion error space-variant model of the unmanned aerial vehicle-mounted bistatic SAR system to obtain the space-variant gradient direction of the amplitude and the initial phase in the whole scene.
According to the approximate jitter states of the unmanned aerial vehicles obtained from the inertial navigation system, the amplitude and initial-phase space-variant diagrams of the scene are obtained, and from them the space-variant gradient directions of the amplitude and the initial phase in the scene.
the method comprises the following steps:
the attitude angle of the machine body, including yaw angle α, pitch angle β, and roll angle θ, is represented using static euler angles, as shown in fig. 1. The origin of the carrier coordinate system is the centroid, represented by O, the right, front and upper sides of the carrier respectively form the X, Y, Z axis positive direction of the carrier coordinate system, ENU is the scene local coordinate system, and the plane S is the plumb plane passing through the Y axis and is perpendicular to the EON plane. The yaw angle alpha is defined as the included angle between the horizontal projection of the Y axis of the carrier and the N direction, north-west is positive, and the range is (-180 degrees, 180 degrees); the pitch angle beta is defined as the included angle between the Y axis of the carrier and the horizontal projection of the carrier, the carrier is raised to be positive, and the range is (-90 degrees, 90 degrees); the roll angle θ is defined as the angle between the vertical plane of the carrier Y axis and its Z axis, and is positive when the carrier is tilted to the right, with a range of (-180 °,180 °).
After the attitude angles of the unmanned aerial vehicle are obtained, the position of the antenna phase center (APC) in the scene coordinate system at each instant is calculated by formula (1) from the position of the antenna aperture center in the carrier coordinate system. In formula (1), the amplitude of the high-frequency motion error caused by airframe jitter appears as a parameter, and under the initial condition the antenna phase center has coordinates [p_x, p_y, p_z]^T in the carrier coordinate system.
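Since formula (1) is available only as an image, the following sketch merely illustrates the type of computation it describes, namely rotating the antenna aperture-center coordinates from the carrier frame into the scene frame with the measured attitude angles and adding the platform position; the rotation order, sign conventions, and function names used here are assumptions, not taken from the patent.

```python
import numpy as np

def apc_position_scene(p_body, platform_pos_enu, yaw, pitch, roll):
    """Antenna phase center (APC) position in the scene frame at one slow-time instant.

    p_body           : APC coordinates [p_x, p_y, p_z] in the carrier frame (m)
    platform_pos_enu : platform centroid position in the scene (ENU) frame (m)
    yaw, pitch, roll : attitude angles alpha, beta, theta in radians; the
                       yaw-pitch-roll composition below is an assumed convention
    """
    ca, sa = np.cos(yaw), np.sin(yaw)
    cb, sb = np.cos(pitch), np.sin(pitch)
    ct, st = np.cos(roll), np.sin(roll)

    Rz = np.array([[ca, -sa, 0.0], [sa, ca, 0.0], [0.0, 0.0, 1.0]])   # yaw about Z
    Rx = np.array([[1.0, 0.0, 0.0], [0.0, cb, -sb], [0.0, sb, cb]])   # pitch about X
    Ry = np.array([[ct, 0.0, st], [0.0, 1.0, 0.0], [-st, 0.0, ct]])   # roll about Y

    R_body_to_scene = Rz @ Rx @ Ry      # assumed composition order
    return np.asarray(platform_pos_enu) + R_body_to_scene @ np.asarray(p_body)
```

Evaluating such a relation at every azimuth slow time u, with the sinusoidal jitter superimposed on the attitude angles, yields the APC trajectory from which the high-frequency slant-range error below is obtained.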
By formula (1), the high-frequency motion error caused by airframe jitter at each azimuth slow time u can be expressed as:
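(Formula (2) is reproduced only as an image; based on the amplitude, jitter frequency and initial phase named in the surrounding text, it presumably has the sinusoidal form

$$\Delta R_{high}(u) \approx a \sin\!\left(2\pi f_e u + \varphi_e\right),$$

where $a$ is the jitter amplitude introduced above and the symbol $\varphi_e$ is assumed notation.)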
where f_e denotes the frequency of the sinusoidal jitter and φ_e denotes the initial phase of the error.
The slant-range error of the system can be expressed as:
ΔR(u,P) = ΔR_low(u,P) + ΔR_high(u,P)    (3)
where u is the azimuth slow time, P denotes the three-dimensional coordinates of the target, ΔR_low is the slant-range error introduced by the low-frequency motion error, and ΔR_high is the slant-range error introduced by the high-frequency motion error, which is the error the invention aims to compensate. In the bistatic configuration the high-frequency errors of both the transmitter and the receiver must be considered, so the slant-range error introduced by the high-frequency motion error can finally be expressed as:
where φ_TP denotes the unit vector from the transmitter to the target P along the line-of-sight direction, and φ_RP denotes the unit vector from the receiver to the target P along the line-of-sight direction.
Because the external environmental conditions of the transmitter and the receiver during operation are similar, only high-frequency motion errors of the same frequency are considered, i.e. the high-frequency motion error frequencies of the transmitter and the receiver are taken to satisfy f_eT ≈ f_eR = f_e. Substituting this into formula (4) gives:
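(Formulas (5) and (6) appear only as images; since the result is the sum of two sinusoids with the common frequency f_e, the standard phasor-addition identity suggests the form they take, with the symbols $\varphi_T$ and $\varphi_R$ for the individual initial phases being assumed notation:

$$\Delta R_{high}(u,P) \approx a_T \sin\!\left(2\pi f_e u + \varphi_T\right) + a_R \sin\!\left(2\pi f_e u + \varphi_R\right) = A \sin\!\left(2\pi f_e u + \varphi\right),$$

$$A = \sqrt{a_T^2 + a_R^2 + 2 a_T a_R \cos\!\left(\varphi_T - \varphi_R\right)}, \qquad
\varphi = \arctan\!\left(\frac{a_T \sin\varphi_T + a_R \sin\varphi_R}{a_T \cos\varphi_T + a_R \cos\varphi_R}\right).)$$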
where A is the amplitude of the high-frequency motion error in the bistatic slant-range history, a_T and a_R are the amplitudes of the high-frequency motion errors in the slant-range histories of the transmitter and the receiver respectively, and φ is the initial phase of the high-frequency motion error in the bistatic slant-range history. Using the amplitude and initial phase information of the full scene, space-variant diagrams of these two parameters can be drawn to obtain the space-variant gradient directions of the amplitude and the initial phase; a schematic is shown in FIG. 2, where FIG. 2(a) shows the gradient direction of the amplitude and FIG. 2(b) shows the gradient direction of the initial phase.
Step two: estimating the amplitude and initial phase information of the high signal-to-noise-ratio regions in the full scene, and projecting the estimation results onto the space-variant gradient direction of the amplitude and the initial phase obtained in step one, to obtain the amplitude and initial phase information in the space-variant gradient direction.
Using a conventional parameter estimation method, the parameter (i.e. amplitude and initial phase) estimates of the sinusoidally frequency-modulated signal in the high signal-to-noise-ratio regions of the scene are obtained and projected onto the space-variant gradient direction of the amplitude and the initial phase in the full scene, giving several groups of amplitude and initial phase information along the space-variant gradient direction of the full scene.
First, the original imaging result without motion error compensation is coarsely partitioned, and the signal-to-noise ratio of each region is assessed according to whether scene information distinguishable by the human eye is present; a conventional parameter estimation method is then used to obtain the parameter estimates of the sinusoidally frequency-modulated signal for the high signal-to-noise-ratio regions. Next, the space-variant gradient directions of the amplitude and the initial phase of the full scene are calculated from the amplitude and initial phase information of each point in the scene. Finally, contour diagrams of the amplitude and the initial phase are drawn; the contour line through the center of a high signal-to-noise-ratio region intersects the space-variant gradient line at the projection position of that region's center on the gradient line, and the motion error estimate at that point on the gradient line is taken to be the sinusoidally frequency-modulated signal parameter estimate of the high signal-to-noise-ratio region. In this embodiment, parameters are estimated only for the high signal-to-noise-ratio regions; the error parameters of the low signal-to-noise-ratio regions are not estimated directly but are obtained by projection and fitting, which improves the computational efficiency.
In this embodiment, the contour method is adopted to project the estimation results of the high signal-to-noise-ratio regions; the manner of projection is not limited in actual operation.
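A minimal sketch of this projection, assuming the gradient line passes through the scene center and the amplitude (or initial-phase) map is available on a regular grid; the function and variable names are illustrative:

```python
import numpy as np
from scipy.interpolate import RegularGridInterpolator

def project_to_gradient_line(value_map, x, y, region_center, line_extent=500.0):
    """Project a high-SNR region center onto the space-variant gradient line.

    value_map     : 2-D amplitude (or initial-phase) map over the full scene
    x, y          : 1-D coordinate axes of the map in meters
    region_center : (x0, y0) of the high-SNR region center in meters
    Returns the intersection of the region center's contour with the gradient line.
    """
    interp = RegularGridInterpolator((y, x), value_map,
                                     bounds_error=False, fill_value=np.nan)

    # gradient direction at the scene center; the gradient line is assumed to
    # pass through the scene center along this direction
    gy, gx = np.gradient(value_map, y, x)
    ci, cj = len(y) // 2, len(x) // 2
    d = np.array([gx[ci, cj], gy[ci, cj]])
    d = d / np.linalg.norm(d)
    c0 = np.array([x[cj], y[ci]])

    # sample the map along the gradient line and pick the sample whose value
    # matches the contour value at the region center (contour/line intersection)
    s = np.linspace(-line_extent, line_extent, 2001)
    pts = c0 + s[:, None] * d
    vals = interp(pts[:, ::-1])                       # interpolator expects (y, x)
    target = interp([region_center[1], region_center[0]])
    k = int(np.nanargmin(np.abs(vals - target)))
    return pts[k], s[k]                               # position and abscissa on the line
```

The amplitude and initial-phase estimates of the region are then attached to the returned abscissa on the gradient line and used as one fitting sample in step three.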
Step three: fitting the amplitude and initial phase information in the space-variant gradient direction corresponding to the high signal-to-noise-ratio regions from step two to obtain the fitting result, namely the amplitude and initial phase information at all positions in the space-variant gradient direction.
In this embodiment, a linear least-squares second-order fitting method is used to fit the amplitude and initial phase information in the space-variant gradient direction corresponding to the high signal-to-noise-ratio regions, obtaining the amplitude and initial phase information at all positions in the space-variant gradient direction. In actual operation the fitting algorithm is not limited, provided the fitting accuracy is guaranteed.
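A minimal numpy sketch of this fitting step, assuming the projected samples are given as abscissas along the gradient line together with the estimated amplitudes and (unwrapped) initial phases; names are illustrative:

```python
import numpy as np

def fit_along_gradient_line(s, amp, phi, deg=2):
    """Second-order linear least-squares fit of amplitude and initial phase
    versus position s along the space-variant gradient line.

    s, amp, phi : 1-D arrays of projected sample positions, estimated
                  amplitudes and (unwrapped) initial phases from step two
    Returns two polynomials giving amplitude and initial phase at any position.
    """
    amp_poly = np.poly1d(np.polyfit(s, amp, deg))   # quadratic amplitude model
    phi_poly = np.poly1d(np.polyfit(s, phi, deg))   # quadratic initial-phase model
    return amp_poly, phi_poly

# usage: evaluate the fitted models at the projected position of a sub-scene center
# amp_poly, phi_poly = fit_along_gradient_line(s_samples, amp_samples, phi_samples)
# A_hat, phi_hat = amp_poly(s_subscene_center), phi_poly(s_subscene_center)
```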
Step four: dividing the full scene into n sub-scenes (n is a positive integer), projecting the center position of each sub-scene onto the space-variant gradient direction, and finding, from the fitting result, the amplitude and initial phase information corresponding to the projections of the n sub-scene centers as the high-frequency motion error parameters of the n sub-scene centers.
The space-variant direction of the motion error of the system results from the coupling of the transmitter and receiver motion errors, so the space-variant direction varies across the scene in a complex way, and estimating the error parameters of every block after sub-scene division would require a very large amount of computation. In this step, the full scene is first divided into sub-scenes based on the criterion that the phase error caused by the spatial variation of the amplitude and initial phase within a sub-scene is less than π/4; the center position of each sub-scene is then projected onto the space-variant gradient direction obtained in step three, and the amplitude and initial phase estimates at the projected position are taken as the high-frequency motion error parameters of that sub-scene, as shown in FIG. 3, where P_1, P_2 and P_3 are sub-scene centers and the points where the corresponding dashed lines meet the gradient line are their respective projection positions.
Step five: constructing a phase compensation function for each sub-scene according to its high-frequency motion error parameters, completing the high-frequency motion error compensation of each sub-scene, and thereby realizing the high-frequency motion error compensation of the full scene.
Using the linear-frequency-modulation signal as the signal model of the algorithm, the compensation signal constructed from the high-frequency error parameters of each sub-scene center, i.e. the phase compensation function, is applied to complete the high-frequency motion error compensation of each sub-scene, thereby realizing the high-frequency motion error compensation of the full scene.
First, the slant-range history is expressed as:
R(u,P) = R_0(u,P) + ΔR(u,P)    (7)
where R_0(u,P) is the slant-range history calculated from the GPS and INS data and ΔR(u,P) is the slant-range history error. Substituting this into the linear-frequency-modulation signal model, the echo signal is obtained as:
where j is the imaginary unit, T_int is the synthetic aperture integration time, K_r is the chirp (frequency modulation) rate, T_p is the transmitted pulse width, t is the fast time, u is the azimuth slow time, c is the speed of light, and λ is the wavelength of the transmitted signal. For a UAV platform the jitter displacement of the platform is very small, on the order of millimeters, so its influence on range cell migration is not considered, and the pulse-compressed signal is expressed as:
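(As a rough order-of-magnitude check of this approximation using the Table 1 parameters, taking the monostatic-equivalent range resolution; the bistatic geometry changes it only by a factor of order one:

$$\rho_r \approx \frac{c}{2B} = \frac{3\times 10^{8}\ \text{m/s}}{2\times 50\ \text{MHz}} = 3\ \text{m},$$

so a millimeter-level jitter displacement is roughly three orders of magnitude smaller than a range cell, and the jitter-induced range cell migration can indeed be neglected.)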
substituting the formula (5) into the above formula results in the following signal expression before azimuth focusing:
let the parameter of the nth sub-sceneThe number estimation result isDescribing the amplitude, frequency and initial phase of the motion error of the nth sub-scene respectively, the corresponding phase compensation function is:
substituting the parameter estimation result into the formula (6) to further determine the phase compensation function of each sub-scene. The resulting high frequency motion error compensated signal is expressed as:
where Δφ is the residual high-frequency phase error, which does not affect the overall focusing of the image. The residual low-frequency phase error is then compensated with a conventional motion error compensation algorithm, realizing both high- and low-frequency motion compensation for the UAV bistatic SAR.
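A minimal sketch of the per-sub-scene compensation step (formulas (11) and (12) appear only as images): the positive sign and the 2π/λ factor assume the echo carries the total bistatic slant-range error as exp(-j2πΔR/λ), and the function name and interface are illustrative rather than taken from the patent.

```python
import numpy as np

def compensate_subscene(sig, u, wavelength, A_hat, f_hat, phi_hat):
    """Apply the estimated high-frequency phase compensation to one sub-scene.

    sig     : pulse-compressed sub-scene data, shape (num_range_bins, num_pulses)
    u       : azimuth slow-time axis in seconds, length num_pulses
    A_hat, f_hat, phi_hat : estimated amplitude (m), frequency (Hz) and initial
              phase (rad) of the sub-scene-center high-frequency error
    """
    # conjugate of the assumed error phase exp(-j*2*pi*DeltaR_high/lambda)
    H = np.exp(1j * 2.0 * np.pi / wavelength
               * A_hat * np.sin(2.0 * np.pi * f_hat * u + phi_hat))
    return sig * H[np.newaxis, :]
```

Applying this to every sub-scene with its own parameters, followed by a conventional low-frequency compensation, completes the motion compensation described above.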
In this embodiment, according to the parameters in Table 1, a simulation experiment is performed on a scene in which 5×5 point targets are uniformly distributed over a 1600×1600 grid; their positions and numbering are shown in FIG. 4.
Table 1 transceiver parameter table
| Parameter | Value | Parameter | Value |
| --- | --- | --- | --- |
| Transmitter distance | 1.90 km | Transmitter height | 500 m |
| Transmitter squint angle | 45° | Transmitter speed | 20 m/s |
| Transmitter acceleration | 0.34 m/s² | Receiver distance | 1.12 km |
| Receiver height | 500 m | Receiver squint angle | 85° |
| Receiver speed | 15 m/s | Receiver acceleration | 0.34 m/s² |
| Wavelength | 0.019 m | Bandwidth | 50 MHz |
| Synthetic aperture time | 2.5 s | Pulse repetition frequency | 600 Hz |
Table 2 track high frequency motion error table for transmitters and receivers
First, the amplitude and initial-phase maps over the full scene are simulated and analyzed, as shown in FIG. 5 and FIG. 6, where FIG. 5 is the amplitude space-variance simulation result and FIG. 6 is the initial-phase space-variance simulation result. Both figures are contour diagrams; the dashed lines are the gradient lines, and the values are normalized so that the minimum is 0. The simulation results show that the spatial variation over the full scene is very complicated, and it is difficult to obtain an accurately descriptive space-variant model by modeling along a single direction.
Then, repeated experimental estimation of the sinusoidal motion parameters is carried out for several strong points; taking the four experimental sample points 3, 8, 14 and 17 as examples, the parameter estimation results are shown in Table 3:
TABLE 3 parameter estimation results
Then, the amplitude and the initial phase of the phase error are fitted along the gradient line using the method described herein; the fitted error curves are shown in FIG. 7 and FIG. 8, where FIG. 7 is the fitting result of the phase-error amplitude and FIG. 8 is the fitting result of the initial phase.
The amplitude and initial-phase fitting results show that the spatial variation of the motion error can be compensated. Taking targets 1 and 25 as examples, the results before and after compensation are shown in FIGS. 9, 10, 11, 12, 13 and 14. The false-target suppression effect is shown in Table 4 for the four most widely separated points; the values in the table represent the amplitude difference relative to the main lobe.
TABLE 4 false target Compensation Effect
|  | Target 1 | Target 5 | Target 21 | Target 25 |
| --- | --- | --- | --- | --- |
| 5 Hz | -28.03 dB | -27.33 dB | -27.63 dB | -26.58 dB |
The results in Table 4 show that, after motion compensation, the false targets in the image are effectively suppressed, with a suppression of more than 26 dB. At the same time, the motion errors over the whole scene are effectively compensated, and the spatial variation of the errors is effectively handled.
The above specific embodiments merely describe the design principle of the present invention; the shapes and names of the components in the description are not limiting. Therefore, those skilled in the art may modify or equivalently replace the technical solutions described in the foregoing embodiments; such modifications and substitutions do not depart from the spirit and technical scope of the invention and shall all fall within the protection scope of the invention.
Claims (6)
1. The unmanned aerial vehicle-mounted bistatic SAR high-frequency motion error compensation method is characterized by comprising the following steps of:
step one, establishing a sinusoidal motion error space-variant model of an unmanned aerial vehicle-mounted bistatic SAR system to obtain the space-variant gradient direction of the amplitude and the initial phase in a full scene;
step two, estimating the amplitude and initial phase information of a high signal-to-noise-ratio region in the full scene, and projecting the estimation result onto the space-variant gradient direction to obtain the amplitude and initial phase information in the space-variant gradient direction;
step three, fitting the amplitude and initial phase information in the space-variant gradient direction to obtain a fitting result, namely the amplitude and initial phase information at all positions in the space-variant gradient direction;
step four, dividing the full scene into n sub-scenes, projecting the center position of each sub-scene onto the space-variant gradient direction, and finding, from the fitting result, the amplitude and initial phase information corresponding to the projections of the n sub-scene centers, which serve as the high-frequency motion error parameters of the n sub-scene centers; wherein n is a positive integer;
and step five, constructing a phase compensation function for each sub-scene according to its high-frequency motion error parameters, completing the high-frequency motion error compensation of each sub-scene, and thereby realizing the high-frequency motion error compensation of the full scene.
2. The high-frequency motion error compensation method according to claim 1, wherein in step one the space-variant gradient direction of the amplitude and the initial phase in the full scene is obtained as follows: the jitter states of the transmitter and the receiver are obtained from the inertial navigation system, and the amplitude and initial phase information over the full scene is calculated from them, thereby obtaining the space-variant gradient direction of the amplitude and the initial phase in the full scene.
3. The high-frequency motion error compensation method according to claim 1, wherein in step two, projecting the estimation result onto the space-variant gradient direction of the amplitude and the initial phase obtained in step one specifically comprises: drawing a contour diagram of the amplitude and the initial phase of the full scene, wherein the intersection of the contour line through the high signal-to-noise-ratio region and the space-variant gradient direction is the projection position of the center of the high signal-to-noise-ratio region on the space-variant gradient direction.
4. The high-frequency motion error compensation method according to claim 1, wherein in step three a linear least-squares second-order fitting method is used to fit the amplitude and initial phase information in the space-variant gradient direction, thereby obtaining the amplitude and initial phase information at all positions in the space-variant gradient direction.
5. The high-frequency motion error compensation method according to claim 1, wherein in step four the criterion for dividing the full scene into n sub-scenes is that the phase error within each sub-scene is less than π/4.
6. The high-frequency motion error compensation method according to claim 1, wherein the phase compensation function is:
where the parameter estimation results of the n-th sub-scene denote the estimated amplitude, estimated frequency, and estimated initial phase, respectively; j is the imaginary unit, λ is the wavelength of the transmitted signal, and u is the azimuth slow time.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110917945.XA CN113671500B (en) | 2021-08-11 | 2021-08-11 | Unmanned aerial vehicle-mounted bistatic SAR high-frequency motion error compensation method |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110917945.XA CN113671500B (en) | 2021-08-11 | 2021-08-11 | Unmanned aerial vehicle-mounted bistatic SAR high-frequency motion error compensation method |
Publications (2)
Publication Number | Publication Date |
---|---|
CN113671500A CN113671500A (en) | 2021-11-19 |
CN113671500B true CN113671500B (en) | 2023-07-25 |
Family
ID=78542270
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110917945.XA Active CN113671500B (en) | 2021-08-11 | 2021-08-11 | Unmanned aerial vehicle-mounted bistatic SAR high-frequency motion error compensation method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113671500B (en) |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108020836A (en) * | 2018-01-10 | 2018-05-11 | 电子科技大学 | Double-base synthetic aperture radar moving target localization method |
CN113030899A (en) * | 2021-03-17 | 2021-06-25 | 内蒙古工业大学 | Radar signal motion compensation method and device, and radar image de-jittering method and device |
CN113189585A (en) * | 2021-06-23 | 2021-07-30 | 北京理工大学 | Motion error compensation algorithm based on unmanned aerial vehicle bistatic SAR system |
CN113221062A (en) * | 2021-04-07 | 2021-08-06 | 北京理工大学 | High-frequency motion error compensation algorithm of small unmanned aerial vehicle-mounted BiSAR system |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8207887B2 (en) * | 2009-06-19 | 2012-06-26 | The United States Of America As Represented By The Secretary Of The Army | Computationally efficent radar processing method and sytem for SAR and GMTI on a slow moving platform |
-
2021
- 2021-08-11 CN CN202110917945.XA patent/CN113671500B/en active Active
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108020836A (en) * | 2018-01-10 | 2018-05-11 | 电子科技大学 | Double-base synthetic aperture radar moving target localization method |
CN113030899A (en) * | 2021-03-17 | 2021-06-25 | 内蒙古工业大学 | Radar signal motion compensation method and device, and radar image de-jittering method and device |
CN113221062A (en) * | 2021-04-07 | 2021-08-06 | 北京理工大学 | High-frequency motion error compensation algorithm of small unmanned aerial vehicle-mounted BiSAR system |
CN113189585A (en) * | 2021-06-23 | 2021-07-30 | 北京理工大学 | Motion error compensation algorithm based on unmanned aerial vehicle bistatic SAR system |
Also Published As
Publication number | Publication date |
---|---|
CN113671500A (en) | 2021-11-19 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN112098964B (en) | Calibration method, device, equipment and storage medium of road-end radar | |
CN107092014B (en) | Optimization method for missile-borne double-base forward-looking SAR ship target positioning | |
CN112070894B (en) | Real environment navigation multipath real-time simulation method, device, medium and electronic equipment | |
US10656259B2 (en) | Method for determining trajectories of moving physical objects in a space on the basis of sensor data of a plurality of sensors | |
KR102028324B1 (en) | Synthetic Aperture Radar Image Enhancement Method and Calculating Coordinates Method | |
CN112083387B (en) | Radar calibration method and device | |
JP2017508978A (en) | Method and apparatus for determining angle of arrival (AOA) in a radar warning receiver | |
CN107918115B (en) | Radar target positioning method based on multipath utilization | |
CN115144825A (en) | External parameter calibration method and device for vehicle-mounted radar | |
CN112689775B (en) | Radar point cloud clustering method and device | |
CN108896957A (en) | The positioning system and method in a kind of unmanned plane control signal source | |
Iqbal et al. | Imaging radar for automated driving functions | |
CN111551934A (en) | Motion compensation self-focusing method and device for unmanned aerial vehicle SAR imaging | |
CN116430310A (en) | UWB technology-based high-precision unmanned ship positioning system, method and storage medium | |
CN109917373B (en) | Dynamic planning track-before-detect method for motion compensation search of moving platform radar | |
CN113376625B (en) | Method and device for obtaining deviation angle of target object, electronic equipment and storage medium | |
CN118259285A (en) | Missile-borne multi-baseline interference height measurement method based on ground distance matching | |
CN113671500B (en) | Unmanned aerial vehicle-mounted bistatic SAR high-frequency motion error compensation method | |
CN112560295A (en) | Satellite equivalent velocity calculation method for passive synthetic aperture positioning | |
KR102028323B1 (en) | Synthetic Aperture Radar Image Enhancement Apparatus and System | |
CN116879892A (en) | Radar target decomposition method based on electromagnetic scattering characteristics of parameterized component | |
Kauffman et al. | Simulation study of UWB-OFDM SAR for navigation with INS integration | |
CN113189585A (en) | Motion error compensation algorithm based on unmanned aerial vehicle bistatic SAR system | |
CN109738890A (en) | A method of distance figure is generated based on missile-borne Bistatic SAR range Doppler image | |
CN111856464B (en) | DEM extraction method of vehicle-mounted SAR (synthetic aperture radar) based on single control point information |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
GR01 | Patent grant |