CN109799502A - A kind of bidimensional self-focusing method suitable for filter back-projection algorithm - Google Patents


Info

Publication number
CN109799502A
CN109799502A
Authority
CN
China
Prior art keywords
dimensional
phase error
image
azimuth
frequency
Prior art date
Legal status
Granted
Application number
CN201910078771.5A
Other languages
Chinese (zh)
Other versions
CN109799502B (en)
Inventor
毛新华
李丹琪
施天玥
鲍悦
Current Assignee
Nanjing University of Aeronautics and Astronautics
Original Assignee
Nanjing University of Aeronautics and Astronautics
Priority date
Filing date
Publication date
Application filed by Nanjing University of Aeronautics and Astronautics filed Critical Nanjing University of Aeronautics and Astronautics
Priority to CN201910078771.5A priority Critical patent/CN109799502B/en
Publication of CN109799502A publication Critical patent/CN109799502A/en
Application granted granted Critical
Publication of CN109799502B publication Critical patent/CN109799502B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Radar Systems Or Details Thereof (AREA)

Abstract

The invention discloses a two-dimensional self-focusing method suitable for the filtered back-projection algorithm, comprising four steps. The first two are preprocessing steps: one suppresses aliasing of the image spectrum, and the other aligns the spectral support regions of different targets so that the subsequent phase estimation and correction can be carried out uniformly. The third step is the estimation of the one-dimensional azimuth phase error. The final step uses the prior analytic-structure information of the two-dimensional phase error to map the estimated azimuth phase error directly to a two-dimensional phase error and applies the correction. When the radar motion track is measured inaccurately, images produced by the filtered back-projection algorithm remain defocused; existing remedies are all based on image-quality-optimization criteria, and their common problem is low computational efficiency. The present invention makes full use of the prior information of the residual phase error of the filtered back-projection image and estimates the phase-error parameters through dimension-reduction processing, efficiently and effectively refocusing the defocused filtered back-projection image.

Description

Two-dimensional self-focusing method suitable for filtering back projection algorithm
Technical Field
The invention relates to a two-dimensional self-focusing method suitable for the filtered back-projection algorithm, in particular to a two-dimensional self-focusing method for conditions where the radar flight path is unstable or no high-precision motion sensor is available, and belongs to the technical field of radar imaging.
Background
Synthetic Aperture Radar (SAR) exploits the motion of an antenna to achieve high-resolution imaging. When the radar platform flies along a linear track and acquires data at a constant Pulse Repetition Frequency (PRF), a frequency-domain imaging algorithm (such as the range-Doppler algorithm, the chirp scaling algorithm, or the range migration algorithm) can perform the coherent processing efficiently and produce a high-precision image. The linear-flight-trajectory assumption is generally valid when the synthetic aperture time is not too long. However, as the resolution increases, the required synthetic aperture length becomes very long; likewise, when the radar is mounted on a highly maneuverable platform, such as a miniature multi-rotor drone, a nonlinear radar flight path becomes unavoidable. In these cases, if a frequency-domain algorithm is still used, a complex motion-compensation process is required to correct the phase errors caused by non-ideal radar motion. Time-domain imaging algorithms, such as the filtered back-projection (FBP) algorithm, have shown their advantages here owing to their inherent nonlinear motion-compensation capability.
However, FBP imaging also requires precise knowledge of the relative geometry between the radar flight path and the imaged scene. Modern SAR sensors measure radar motion with a motion-sensing system combining an Inertial Measurement Unit (IMU) and a Global Positioning System (GPS) navigator. These sensors, however, may be too expensive or may not provide satisfactory accuracy for high-resolution SAR imaging. Signal-based motion compensation, i.e., autofocus, is therefore often an essential step in SAR processing; it provides a necessary supplement to the GPS/IMU devices and reduces system cost.
Conventional self-focusing algorithms assume that a target's residual range-cell migration can be ignored after the imaging algorithm has been applied, so self-focusing only needs to estimate and correct a one-dimensional azimuth phase error. With increasing resolution this assumption often no longer holds, especially when high-precision motion sensors are unavailable; two-dimensional auto-focusing then becomes an indispensable step for accurately focused imaging. In the published literature, existing two-dimensional auto-focusing methods fall into two categories. The first estimates the two-dimensional phase error blindly: the error is assumed to be completely unknown, and all two-dimensional phase-error parameters are estimated directly. Because of the large number of unknown phase parameters, both the computational efficiency and the parameter-estimation accuracy of this strategy are poor. The second strategy estimates the two-dimensional phase error in a semi-blind manner, using prior knowledge of the phase-error structure together with dimension-reduction processing to estimate the error in a lower-dimensional parameter space. Thanks to the reduced phase-parameter dimension, it offers great advantages in computational efficiency and estimation accuracy over blind estimation.
For such dimension-reducing autofocus methods, prior structural information about the phase error is a prerequisite. For frequency-domain imaging algorithms, the spectral characteristics of two-dimensional phase errors have been studied extensively in recent years. For time-domain imaging algorithms, however, the image spectrum characteristics are less clear, so the properties of the two-dimensional phase error after time-domain processing, such as the aliasing of the residual two-dimensional phase error and its analytic structure, remain poorly understood, and no published literature has reported on them to date. How to effectively refocus a two-dimensionally defocused FBP image with such a prior-information-based two-dimensional auto-focusing strategy therefore remains a challenging problem.
Disclosure of Invention
The technical problem to be solved by the invention is as follows: to provide a two-dimensional self-focusing method suitable for the filtered back-projection algorithm that performs phase-error parameter estimation in a dimension-reduced subspace by exploiting the prior structure of the two-dimensional phase error, and thereby achieves higher accuracy and efficiency than conventional two-dimensional self-focusing methods.
The invention adopts the following technical scheme for solving the technical problems:
a two-dimensional self-focusing method suitable for use in a filtered back-projection algorithm, comprising the steps of:
step 1, multiplying an image obtained by a filtering back projection algorithm by a correction function in an image domain to eliminate spectrum blurring;
step 2, performing a Fourier transform on the image obtained in step 1 in the range direction to the (x, k_y) domain, and multiplying in the (x, k_y) domain by a phase correction function to unify the image spectrum, where x is the azimuth coordinate and k_y is the range spatial-frequency coordinate;
step 3, performing an inverse Fourier transform in the range direction on the spectrum-unified image of step 2, back to the image domain, and estimating the one-dimensional azimuth phase error with a one-dimensional self-focusing algorithm;
step 4, calculating the two-dimensional phase error from the one-dimensional azimuth phase error according to the prior analytic-structure information of the phase error, i.e., the mapping relation between the one-dimensional azimuth phase error and the two-dimensional phase error, and performing compensated imaging with the calculated two-dimensional phase error;
and 5, repeating the steps 3 to 4 until the image meets the precision requirement.
As a preferred embodiment of the present invention, the correction function in step 1 is given by:
f_cor1(x, y) = exp{j·y·k_yc}
where f_cor1(x, y) denotes the correction function, x and y are the azimuth and range coordinates of the image pixel, respectively, j is the imaginary unit, k_yc = 4πf_c/c is the range frequency offset, f_c is the carrier frequency of the radar transmitted signal, and c is the speed of light.
As a preferred embodiment of the present invention, the phase correction function of step 2 is defined in terms of the following quantities: j is the imaginary unit, k_y is the range spatial-frequency coordinate, x is the azimuth coordinate of the image pixel, and y_a(0) is the radar range coordinate at the instant when the azimuth time is zero.
As a preferred solution of the present invention, the one-dimensional autofocus algorithm in step 3 is selected as a phase gradient autofocus algorithm.
As a preferred embodiment of the present invention, the mapping relation between the one-dimensional azimuth phase error and the two-dimensional phase error in step 4 is expressed in terms of the following quantities: the two-dimensional phase-error estimate; k_x, the azimuth spatial-frequency coordinate; k_y, the range spatial-frequency coordinate; k_yc = 4πf_c/c, the range frequency offset; f_c, the carrier frequency of the radar transmitted signal; c, the speed of light; and the one-dimensional azimuth phase-error estimate.
Compared with the prior art, the invention, by adopting the above technical scheme, has the following technical effects:
the method fully utilizes the prior information of the residual error of the filtered back projection image, estimates the phase error parameter through dimension reduction processing, and has higher calculation efficiency and parameter estimation precision compared with the traditional method.
Drawings
FIG. 1 is a diagram of the geometric relationship between the radar and the imaged scene.
FIG. 2 is a flow chart of a two-dimensional auto-focusing method of the present invention suitable for use in a filtered back-projection algorithm.
Fig. 3 shows the FBP-processed image, where (a) is the spatial-domain image and (b) is the range-Doppler-domain image.
Fig. 4 shows two-dimensional amplitude spectra, where (a) is the two-dimensional amplitude spectrum after FBP processing and (b), (c), and (d) are the two-dimensional spectra of targets A, B, and C of Fig. 1, respectively.
Fig. 5 shows the two-dimensional amplitude spectra after spectrum de-aliasing and alignment, where (a) is the overall two-dimensional amplitude spectrum and (b), (c), and (d) are the two-dimensional spectra of targets A, B, and C of Fig. 1, respectively.
Fig. 6 is the phase spectrum of target A after spectrum de-aliasing and alignment.
Fig. 7 shows one-dimensional azimuth cross-sections of Fig. 6 measured at four different range spatial frequencies.
FIG. 8 shows the one-dimensional azimuth cross-sections at the same four range spatial frequencies, calculated using the method of the present invention.
FIG. 9 shows the difference between the measured values of Fig. 7 and the calculated values of Fig. 8.
Fig. 10 is a spatial domain image processed by the self-focusing method of the present invention.
FIG. 11 shows the enlarged target responses after applying the proposed self-focusing method, where (a), (b), and (c) are the target responses at point A, point B, and point C, respectively.
Detailed Description
Reference will now be made in detail to embodiments of the present invention, examples of which are illustrated in the accompanying drawings. The embodiments described below with reference to the accompanying drawings are illustrative only for the purpose of explaining the present invention, and are not to be construed as limiting the present invention.
The prior structural information of the two-dimensional phase error after filtered back-projection processing is analyzed as follows:
In spotlight mode, the filtered back-projection image f(x, y) can be expressed as:
where (x, y) are the pixel coordinates in the imaging grid, r(t) is the instantaneous distance between the radar and the pixel at (x, y), r_p(t) is the instantaneous distance from the radar to the target, j is the imaginary unit, T is the synthetic aperture time, f_c is the carrier frequency, f_τ is the range frequency, B is the bandwidth of the frequency-modulated signal, and c is the speed of light.
The situation without position measurement error is first analyzed.
Under error-free conditions, the differential distance r(t) − r_p(t) can be approximated as:
r(t) − r_p(t) ≈ (x_p − x)sinθ + (y_p − y)cosθ (2)
where (x_p, y_p) is the position of the point target and θ is the radar azimuth angle.
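As a numerical sanity check (not part of the patent), the far-field approximation of Eq. (2) can be verified for an illustrative geometry; all coordinates below are invented for the example, with the scene-center range matching the 15 km used in the simulation section.

```python
import numpy as np

# Illustrative check of Eq. (2): r(t) - r_p(t) ~ (x_p - x)sin(theta) + (y_p - y)cos(theta)
theta = np.deg2rad(3.0)                       # radar azimuth angle
R = 15e3                                      # radar-to-scene-center range (m)
radar = R * np.array([np.sin(theta), np.cos(theta)])  # radar position (x, y)

xp, yp = 40.0, -25.0                          # point-target coordinates (m)
x, y = 10.0, 5.0                              # image-pixel coordinates (m)

r = np.linalg.norm(radar - np.array([x, y]))      # radar-to-pixel distance
rp = np.linalg.norm(radar - np.array([xp, yp]))   # radar-to-target distance

exact = r - rp
approx = (xp - x) * np.sin(theta) + (yp - y) * np.cos(theta)
# For targets tens of metres from scene centre at 15 km range, the two
# agree to within a few centimetres.
```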
Substituting this into formula (1) gives
Considering the one-to-one correspondence between azimuth time and azimuth angle, a change of variables gives
where [θ_start, θ_end] is the range of radar observation angles toward the target over the synthetic aperture time, θ_start being the radar azimuth angle at the synthetic-aperture start time and θ_end the radar azimuth angle at the synthetic-aperture end time.
Finally, converting from polar coordinates (k_r, θ) to Cartesian coordinates (k_x, k_y), the filtered back-projection image f(x, y) can be rewritten as:
where k_x = k_r·sinθ, k_y = k_r·cosθ, and D is the integration region, determined by the supports of k_r and θ.
Next, the case where a distance measurement error exists is considered.
For each pixel in the imaging grid, assume that the actual and measured instantaneous distances from the radar to that pixel are r(t) and r(t) + r_e(t), respectively, where r_e(t) is the measurement error. Imaging is performed with the measured instantaneous distance, so the imaging result is
Its two-dimensional spectral error can then be expressed as Φ_e(t, k_r) = k_r·r_e(t). Converting from polar coordinates (k_r, θ) to Cartesian coordinates (k_x, k_y), the filtered back-projection image f(x, y) can be rewritten as:
where the function ξ(·) is the function r_e(·) with its argument converted from the azimuth time t to the polar angle θ. The spectral error can then be written as a function of (k_x, k_y), and its Taylor expansion about the range frequency center k_yc is:
Φ_e(k_x, k_y) = φ_0(k_x) + φ_1(k_x)(k_y − k_yc) + φ_2(k_x)(k_y − k_yc)² + … (8)
where φ_0(k_x) represents the azimuth phase error, which can be estimated with a conventional self-focusing method (such as phase gradient autofocus); comparing terms of the above expansion yields the relation between the azimuth phase error and the two-dimensional phase error.
from the above analysis, it can be known that the two-dimensional phase error of the spatial frequency domain is Φe(kx,ky) It is noted that there is a constant offset k from the spatial frequencyycI.e. byΔkyIs the width of the range-wise spatial spectrum. However, when the image spectrum domain is returned by the fast Fourier algorithm, the observation spectrum range is the baseband, that is, the actual spectrum has an offset in the distance frequency domain, and the FFT is used for observationThe measured spectral domain is limited to the baseband, which will produce spectral ambiguity whose effect needs to be removed before the estimation error.
Suppose there are two targets (A and B) whose phase error is space-invariant in the phase-history domain. After processing and transformation to the spatial-frequency domain, however, the phase error relationship becomes:
where Δθ = x_p/y_a(0). Since Δθ is very small, the above equation can be approximated as:
That is, the support domains of the image spectra of different targets are offset along the azimuth spatial-frequency axis, so the spectral support domains must be aligned before the error is estimated.
Based on the prior information of the residual two-dimensional phase error of the filtered back projection algorithm image, the invention provides a two-dimensional self-focusing method based on prior knowledge.
The technical idea of the invention is as follows: first perform preprocessing, comprising spectrum de-aliasing and spectrum alignment; then estimate the azimuth phase error with a conventional one-dimensional self-focusing method; and finally calculate the two-dimensional phase error from the prior analytic-structure information of the residual two-dimensional phase error and compensate for it. As shown in Fig. 2, the implementation comprises the following steps:
and step 1, eliminating spectrum ambiguity.
For the two-dimensional self-focusing method with input as defocused FBP image, due to the estimation and correction of two-dimensional phase errorIs done in the spectral domain and it is therefore necessary to return to the spectral domain. Because fast fourier transform has high computational efficiency, it is often implemented by fast fourier transform. From the previous analysis, it can be known that the frequency spectrum of the FBP image has an offset k in the distance frequency domainyc. But since the FFT does not take into account the spectral offset, if we return from the FBP image domain to the spectral domain through one FFT, the spectrum will be aliased to baseband.
Filtered Back Projection (FBP) algorithms have image spectra that shift in range frequency and therefore require multiplication of the image domain by a phase correction function to convert the data to baseband to avoid blurring of the image spectrum samples. In one-dimensional azimuth autofocusing, this spectral ambiguity is negligible, because a constant ambiguity in the range dimension does not affect the estimation and correction of the azimuth phase error. However, in our proposed two-dimensional auto-focusing method, this ambiguity must be resolved because the two-dimensional auto-focusing process involves a mapping that is related to the range frequency. To remove spectral blurring, we multiply in the image domain by a rectification function, which is fcor1(x,y)=exp{jykycWhere x, y are the coordinates of the image pixel in the azimuth and distance directions, respectively, kyc=4πfc/c,fcIs the carrier frequency and c is the speed of light.
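The de-aliasing multiplication can be sketched in a few lines of NumPy. The toy image below carries exactly the k_yc range-phase offset predicted by the analysis, so the correction reduces it to a constant; grid sizes and the range extent are illustrative, not taken from the patent.

```python
import numpy as np

c = 3.0e8                      # speed of light (m/s)
fc = 10.0e9                    # carrier frequency, as in the simulation (Hz)
kyc = 4.0 * np.pi * fc / c     # range spatial-frequency offset k_yc = 4*pi*f_c/c

ny, nx = 256, 4
y = np.linspace(-0.5, 0.5, ny)                 # range coordinates of pixels (m)
# Toy "FBP image": every range line carries the offset phase exp{-j*k_yc*y}
img = np.exp(-1j * kyc * y)[:, None] * np.ones((1, nx))

# Step 1: multiply by f_cor1(x, y) = exp{j*y*k_yc} to move the range spectrum
# to baseband before any FFT is applied.
img_debl = img * np.exp(1j * kyc * y)[:, None]
```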
Step 2: unifying the image spectrum.
The two-dimensional phase-error spectra at different locations in the FBP image have the same shape but different support regions; the phase error is therefore spatially variant. To facilitate its estimation and correction, the phase-error spectrum requires preprocessing. From the preceding analysis, the spatial variation can be approximated as a spectral shift that exists only in the azimuth dimension, with a magnitude linear in the azimuth position of the spatial-domain pixel. It can therefore be removed by multiplying in the (x, k_y) domain by a phase correction function, where x is the azimuth position, k_y is the range spatial frequency, and y_a(0), which appears in the correction function, is the radar range coordinate at the instant when the azimuth time is zero.
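One candidate closed form for the spectrum-alignment function can be reconstructed from the shift analysis above; this is an assumption for illustration only, not a formula quoted from the patent.

```latex
% The spectrum of a pixel at azimuth x is offset in k_x by roughly
% k_y \Delta\theta = k_y x / y_a(0).  A chirp in x whose instantaneous
% azimuth frequency is -k_y x / y_a(0) would cancel that offset:
f_{\mathrm{cor2}}(x, k_y) = \exp\!\left\{-\,j\,\frac{k_y\,x^{2}}{2\,y_a(0)}\right\}
```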
Step 3: estimating the azimuth phase error.
After the two preprocessing steps, the spectra of all image pixels are alias-free and their support domains coincide, so the remaining two-dimensional phase error can be estimated and corrected in batch. By combining the prior information of the phase-error structure, only a one-dimensional phase error needs to be estimated directly; it can be either the azimuth phase error or the residual range migration. Since a variety of self-focusing techniques exist for estimating the azimuth phase error, that choice is generally adopted.
The azimuth error can be estimated with a conventional one-dimensional self-focusing algorithm such as phase gradient autofocus (PGA). Note that conventional one-dimensional self-focusing assumes that a scattering point's energy remains within one range cell; in the two-dimensional case this can be ensured by reducing the range resolution or by sub-aperture division during data preprocessing.
Step 4: calculating and compensating the two-dimensional phase error.
Let the azimuth phase error estimated in step 3 be given; the two-dimensional phase error is calculated directly from the prior analytic-structure information of the phase error, where k_x denotes the azimuth spatial frequency in the resulting two-dimensional phase-error estimate. The mapping can be carried out in two steps: first a scale transformation, realized by interpolation or a chirp-scaling technique, and then multiplication by a range-spatial-frequency-dependent coefficient. The data are then compensated with the calculated two-dimensional phase error and re-imaged.
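The two-step mapping (scale transformation, then a range-frequency-dependent coefficient) can be sketched as below. The functional form (k_y/k_yc)·φ̂₀(k_yc·k_x/k_y) is an assumption consistent with the two steps described in the text, not a formula quoted from the patent; interpolation realizes the scale transform.

```python
import numpy as np

def map_1d_to_2d(phi_az, kx, ky, kyc):
    """Hypothetical mapping of the 1-D azimuth phase-error estimate phi_az,
    sampled on the kx grid, to a 2-D phase error: for each range frequency
    ky[i], (i) rescale the azimuth-frequency axis (interpolation implements
    the scale transformation), then (ii) multiply by the range-frequency-
    dependent coefficient ky[i]/kyc."""
    phi2d = np.empty((ky.size, kx.size))
    for i, kyi in enumerate(ky):
        scaled_kx = kx * kyc / kyi                                 # step (i)
        phi2d[i] = (kyi / kyc) * np.interp(scaled_kx, kx, phi_az)  # step (ii)
    return phi2d
```

At k_y = k_yc the mapping must reduce to the azimuth phase error itself, which gives a quick self-check.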
Step 5: iterating.
In this method, the two-dimensional phase-error estimate is computed from the estimated azimuth phase error, so the accuracy of the two-dimensional self-focusing correction depends entirely on the accuracy of the azimuth phase-error estimate. Residual range migration, however, limits accurate measurement of the azimuth phase error, because the error energy is spread over multiple range resolution cells. In that case, the estimation and correction may need to be performed iteratively; that is, steps 3 and 4 are repeated until the image meets the accuracy requirement.
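The stopping rule of step 5 is stated only as "until the image meets the accuracy requirement"; a common concrete choice, shown here as an assumption rather than the patent's criterion, is to monitor an image-sharpness metric across passes. The toy loop below stands in for the repeated estimate-and-correct passes by shrinking a known residual error geometrically.

```python
import numpy as np

def sharpness(img):
    """Sum of |pixel|^4: a standard focus metric (an illustrative choice;
    the patent does not prescribe a particular accuracy criterion)."""
    return np.sum(np.abs(img) ** 4)

n = 128
k = np.arange(n) - n // 2
phi_res = 3.0 * (k / (n / 2.0)) ** 2     # residual quadratic azimuth phase error
spec = np.exp(1j * phi_res)              # spectrum of a single point target
history = []
for _ in range(8):                       # stand-ins for repeated steps 3 and 4
    img = np.fft.ifft(np.fft.ifftshift(spec))
    history.append(sharpness(img))
    spec = spec * np.exp(-1j * 0.5 * phi_res)   # each pass removes half the error
    phi_res = 0.5 * phi_res
# `history` rises toward the fully focused value as the residual error shrinks
```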
The invention can be further illustrated by Matlab simulation experiment results:
Simulation environment: the radar signal is a linear frequency-modulated signal with a carrier frequency of 10 GHz, a range resolution of 0.12 m, an azimuth resolution of 0.12 m, a scene-center range of 15 km, and a radar altitude of 5000 m. The IMU and GPS report linear radar motion at a speed of 120 m/s, while the actual radar motion contains some maneuvering. Fig. 1 shows the geometric relationship between the radar and the imaging scene.
Experimental results and analysis: Figs. 3(a) and (b) show the results after FBP processing; significant defocusing is visible. Figs. 4(a)-(d) show the two-dimensional spectra after FBP imaging; spectral aliasing is present, and the spectral support domains of targets A, B, and C differ. Figs. 5(a)-(d) show that, after the de-aliasing and spectrum alignment proposed by the invention, the spectrum of each target is alias-free and the support domains are almost identical. Fig. 6 is the two-dimensional phase spectrum of target A; at several range spatial frequencies, its directly measured azimuth profiles (Fig. 7) are compared with the profiles calculated using the prior analytic-structure information (Fig. 8), and the difference between the two is computed (Fig. 9). As Fig. 9 shows, the two agree to within measurement error. Fig. 10 shows the image processed by the proposed self-focusing algorithm; the three point targets are clearly visible. Figs. 11(a)-(c) are enlarged views of targets A, B, and C; all the point targets are well focused, demonstrating the effectiveness of the method of the invention.
The above embodiments are only for illustrating the technical idea of the present invention, and the protection scope of the present invention is not limited thereby, and any modifications made on the basis of the technical scheme according to the technical idea of the present invention fall within the protection scope of the present invention.

Claims (5)

1. A two-dimensional self-focusing method suitable for use in a filtered back-projection algorithm, comprising the steps of:
step 1, multiplying an image obtained by a filtering back projection algorithm by a correction function in an image domain to eliminate spectrum blurring;
step 2, performing a Fourier transform on the image obtained in step 1 in the range direction to the (x, k_y) domain, and multiplying in the (x, k_y) domain by a phase correction function to unify the image spectrum, where x is the azimuth coordinate and k_y is the range spatial-frequency coordinate;
step 3, performing an inverse Fourier transform in the range direction on the spectrum-unified image of step 2, back to the image domain, and estimating the one-dimensional azimuth phase error with a one-dimensional self-focusing algorithm;
step 4, calculating the two-dimensional phase error from the one-dimensional azimuth phase error according to the prior analytic-structure information of the phase error, i.e., the mapping relation between the one-dimensional azimuth phase error and the two-dimensional phase error, and performing compensated imaging with the calculated two-dimensional phase error;
and 5, repeating the steps 3 to 4 until the image meets the precision requirement.
2. A two-dimensional auto-focusing method suitable for use in a filtered back-projection algorithm as claimed in claim 1, wherein the correction function of step 1 is given by:
f_cor1(x, y) = exp{j·y·k_yc}
where f_cor1(x, y) represents the correction function, x and y are the azimuth and range coordinates of the image pixel, respectively, j is the imaginary unit, k_yc = 4πf_c/c is the range frequency offset, f_c is the carrier frequency of the radar transmitted signal, and c is the speed of light.
3. A two-dimensional auto-focusing method suitable for use in a filtered back-projection algorithm as claimed in claim 1, wherein the phase correction function of step 2 is defined in terms of the following quantities: j is the imaginary unit, k_y is the range spatial-frequency coordinate, x is the azimuth coordinate of the image pixel, and y_a(0) is the radar range coordinate at the instant when the azimuth time is zero.
4. The two-dimensional auto-focusing method suitable for the filtered back-projection algorithm of claim 1, wherein the one-dimensional auto-focusing algorithm of step 3 is selected to be a phase gradient auto-focusing algorithm.
5. The two-dimensional self-focusing method suitable for the filtered back-projection algorithm as claimed in claim 1, wherein the mapping relation between the one-dimensional azimuth phase error and the two-dimensional phase error in step 4 is expressed in terms of the following quantities: the two-dimensional phase-error estimate; k_x, the azimuth spatial-frequency coordinate; k_y, the range spatial-frequency coordinate; k_yc = 4πf_c/c, the range frequency offset; f_c, the carrier frequency of the radar transmitted signal; c, the speed of light; and the one-dimensional azimuth phase-error estimate.
CN201910078771.5A 2019-01-28 2019-01-28 Two-dimensional self-focusing method suitable for filtering back projection algorithm Active CN109799502B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910078771.5A CN109799502B (en) 2019-01-28 2019-01-28 Two-dimensional self-focusing method suitable for filtering back projection algorithm

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910078771.5A CN109799502B (en) 2019-01-28 2019-01-28 Two-dimensional self-focusing method suitable for filtering back projection algorithm

Publications (2)

Publication Number Publication Date
CN109799502A true CN109799502A (en) 2019-05-24
CN109799502B CN109799502B (en) 2023-05-16

Family

ID=66560427

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910078771.5A Active CN109799502B (en) 2019-01-28 2019-01-28 Two-dimensional self-focusing method suitable for filtering back projection algorithm

Country Status (1)

Country Link
CN (1) CN109799502B (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111537999A (en) * 2020-03-04 2020-08-14 云南电网有限责任公司电力科学研究院 Robust and efficient decomposition projection automatic focusing method
CN113376632A (en) * 2021-05-18 2021-09-10 南京航空航天大学 Large squint airborne SAR imaging method based on pretreatment and improved PFA
CN115453530A (en) * 2022-08-11 2022-12-09 南京航空航天大学 Bistatic SAR (synthetic aperture radar) filtering back-projection two-dimensional self-focusing method based on parameterized model

Citations (4)

Publication number Priority date Publication date Assignee Title
EP2650695A1 (en) * 2012-08-02 2013-10-16 Institute of Electronics, Chinese Academy of Sciences Imaging method for synthetic aperture radar in high squint mode
CN103941241A (en) * 2014-05-14 2014-07-23 中国人民解放军国防科学技术大学 Radiation correction method suitable for non-linear track SAR imaging
CN104316924A (en) * 2014-10-15 2015-01-28 南京邮电大学 Autofocus motion compensation method of airborne ultra-high resolution SAR (Synthetic Aperture Radar) back projection image
CN105116411A (en) * 2015-08-17 2015-12-02 南京航空航天大学 A two-dimensional self-focusing method applicable to a range migration algorithm

Patent Citations (4)

Publication number Priority date Publication date Assignee Title
EP2650695A1 (en) * 2012-08-02 2013-10-16 Institute of Electronics, Chinese Academy of Sciences Imaging method for synthetic aperture radar in high squint mode
CN103941241A (en) * 2014-05-14 2014-07-23 中国人民解放军国防科学技术大学 Radiation correction method suitable for non-linear track SAR imaging
CN104316924A (en) * 2014-10-15 2015-01-28 南京邮电大学 Autofocus motion compensation method of airborne ultra-high resolution SAR (Synthetic Aperture Radar) back projection image
CN105116411A (en) * 2015-08-17 2015-12-02 南京航空航天大学 A two-dimensional self-focusing method applicable to a range migration algorithm

Non-Patent Citations (7)

* Cited by examiner, † Cited by third party
Title
HASSAN MANSOUR et al.: "Sparse Blind Deconvolution for Distributed Radar Autofocus Imaging", IEEE Transactions on Computational Imaging *
LEI RAN et al.: "Multiple Local Autofocus Back-Projection Algorithm for Space-Variant Phase-Error Correction in Synthetic Aperture Radar", IEEE Geoscience and Remote Sensing Letters *
OCTAVIO PONCE et al.: "First Airborne Demonstration of Holographic SAR Tomography With Fully Polarimetric Multicircular Acquisitions at L-Band", IEEE Transactions on Geoscience and Remote Sensing *
XINHUA MAO et al.: "Knowledge-Aided 2-D Autofocus for Spotlight SAR Range Migration Algorithm Imagery", IEEE Transactions on Geoscience and Remote Sensing *
MAO Xinhua et al.: "Prior-Knowledge-Based Two-Dimensional SAR Autofocus Algorithm", Acta Electronica Sinica *
KAN Xuechao et al.: "A New Fast Back-Projection Algorithm for Circular SAR", Guidance & Fuze *
CHEN Jiarui et al.: "A Back-Projection Autofocus Method Based on the Contrast Optimization Criterion", Radar Science and Technology *

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111537999A (en) * 2020-03-04 2020-08-14 云南电网有限责任公司电力科学研究院 Robust and efficient decomposition projection automatic focusing method
CN111537999B (en) * 2020-03-04 2023-06-30 云南电网有限责任公司电力科学研究院 Robust and efficient decomposition projection automatic focusing method
CN113376632A (en) * 2021-05-18 2021-09-10 南京航空航天大学 Large squint airborne SAR imaging method based on preprocessing and improved PFA
CN113376632B (en) * 2021-05-18 2023-12-15 南京航空航天大学 Large squint airborne SAR imaging method based on preprocessing and improved PFA
CN115453530A (en) * 2022-08-11 2022-12-09 南京航空航天大学 Bistatic SAR (synthetic aperture radar) filtering back-projection two-dimensional self-focusing method based on parameterized model
CN115453530B (en) * 2022-08-11 2024-03-29 南京航空航天大学 Bistatic SAR filtering back-projection two-dimensional self-focusing method based on parameterized model

Also Published As

Publication number Publication date
CN109799502B (en) 2023-05-16

Similar Documents

Publication Publication Date Title
CN108051809B (en) Moving target imaging method and device based on Radon transformation and electronic equipment
CN109856635B (en) CSAR ground moving target refocusing imaging method
Mao et al. Knowledge-aided 2-D autofocus for spotlight SAR range migration algorithm imagery
CN106324597B (en) Translational compensation and imaging method for large-rotation-angle ISAR based on PFA
CN109799502B (en) Two-dimensional self-focusing method suitable for filtered back-projection algorithm
CN104597447B (en) Improved Omega-K imaging method for large-squint sub-aperture SAR
CN105116411B (en) Two-dimensional self-focusing method suitable for the range migration algorithm
CN110554385B (en) Self-focusing imaging method and device for maneuvering trajectory synthetic aperture radar and radar system
CN114545411B (en) Polar coordinate format multimode high-resolution SAR imaging method based on engineering realization
KR101456185B1 (en) Method and apparatus for yielding radar imaging
CN108828597B (en) Radar echo inversion method and device for sliding bunching mode image
CN110954899B (en) Sea surface ship target imaging method and device under high sea condition
CN110988873B (en) Single-channel SAR ship speed estimation method and system based on energy center extraction
CN105572648B (en) Synthetic aperture radar echo data range migration correction method and apparatus
CN108594196B (en) Method and device for extracting parameters of target scattering center
Dai et al. High accuracy velocity measurement based on keystone transform using entropy minimization
CN111127334B (en) SAR image real-time geometric correction method and system based on RD plane pixel mapping
CN108562898A (en) Range-azimuth two-dimensional space-variant self-focusing method for forward-side-looking SAR
CN112505647A (en) Moving target azimuth speed estimation method based on sequential sub-image sequence
CN116819466A (en) Bistatic ISAR azimuth calibration and geometric correction method based on image minimum entropy
CN112859018B (en) Video SAR imaging method based on image geometric correction
Çetin et al. Handling phase in sparse reconstruction for SAR: Imaging, autofocusing, and moving targets
CN115453530A (en) Bistatic SAR (synthetic aperture radar) filtering back-projection two-dimensional self-focusing method based on parameterized model
CN114879187A (en) Multidimensional synthetic aperture radar registration method and device
CN104166140B (en) Method and device for realizing inverse synthetic aperture radar imaging

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant